The Semicolon Wars
Every programmer knows there is one true programming language. A new one every week.
The Endian Wars
In 1726 Jonathan Swift told of a dispute between the Little-Endians of Lilliput and the Big-Endians of Blefuscu; 41,000 perished in a war fought to decide which end of a boiled egg to crack. This famous tempest in an egg cup was replayed 250 years later by designers of computer hardware and communications protocols. When a block of data is stored or transmitted, either the least-significant bit or the most-significant bit can go first. Which way is better? It hardly matters, although life would be easier if everyone made the same choice. But that's not what has happened, and so quite a lot of hardware and software is needed just to swap ends at boundaries between systems.
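The two byte orders are easy to see from inside a program. As a minimal, self-contained sketch (class and variable names are mine, chosen for illustration), Java's standard ByteBuffer can lay out the same 32-bit integer both ways:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {
    public static void main(String[] args) {
        int value = 0x0A0B0C0D;  // four distinct bytes, so the order is visible

        // Big-endian: most-significant byte first.
        byte[] big = ByteBuffer.allocate(4)
                               .order(ByteOrder.BIG_ENDIAN)
                               .putInt(value).array();

        // Little-endian: least-significant byte first.
        byte[] little = ByteBuffer.allocate(4)
                                  .order(ByteOrder.LITTLE_ENDIAN)
                                  .putInt(value).array();

        System.out.printf("big-endian:    %02x %02x %02x %02x%n",
                big[0], big[1], big[2], big[3]);       // 0a 0b 0c 0d
        System.out.printf("little-endian: %02x %02x %02x %02x%n",
                little[0], little[1], little[2], little[3]); // 0d 0c 0b 0a
    }
}
```

The "end-swapping" hardware and software mentioned above amounts to reversing one of these byte arrays whenever data crosses from a system of one persuasion to a system of the other.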
This modern echo of Swift's Endian wars was first pointed out by Danny Cohen of the University of Southern California in a brilliant 1980 memo, "On holy wars and a plea for peace." The memo, subsequently published in Computer, was widely read and admired; the plea for peace was ignored.
Another feud—largely forgotten, I think, but never settled by truce or treaty—focused on the semicolon. In Algol and Pascal, program statements have to be separated by semicolons. For example, in x:=0; y:=x+1; z:=2 the semicolons tell the compiler where one statement ends and the next begins. C programs are also peppered with semicolons, but in C they are statement terminators, not separators. What's the difference? C needs a semicolon after the last statement, but Pascal doesn't. This discrepancy was one of the gripes cited by Brian W. Kernighan of AT&T Bell Labs in a 1981 diatribe, "Why Pascal is not my favorite programming language." Although Kernighan's paper was never published, it circulated widely in samizdat, and in retrospect it can be seen as the beginning of the end of Pascal as a serious programming tool.
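For what it's worth, C's descendants carried the terminator convention forward. A minimal sketch in Java (which inherited the rule from C) shows it: every statement, including the last one in a block, gets its own semicolon.

```java
public class SemicolonDemo {
    public static void main(String[] args) {
        // In the C tradition, the semicolon *terminates* each statement --
        // note that the final statement needs one too, unlike Pascal,
        // where the semicolon merely *separates* one statement from the next.
        int x = 0; int y = x + 1; int z = 2;
        System.out.println(x + " " + y + " " + z);  // prints 0 1 2
    }
}
```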
Still another perennially contentious issue is how to count. This one brings out the snarling dogmatism in the meekest programmer. Suppose we have a list of three items. Do we number them 1, 2, 3, or should it be 0, 1, 2? Everyone in computerdom knows the answer to that question, and knows it as an eternal truth held with the deepest, visceral conviction. Only one of the alternatives is logically tenable. But which is it? Consider the Java expression Date(2006,1,1); what calendar date do you suppose that specifies? The answer is February 1, 3906. In Java we count months starting with 0, days starting with 1, and years starting with 1900.
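The arithmetic can be checked directly against Java's (long-deprecated) Date constructor, which counts years from 1900, months from 0, and days from 1 — all three conventions in a single API. A self-contained sketch:

```java
import java.util.Calendar;
import java.util.Date;

public class DateDemo {
    public static void main(String[] args) {
        // The deprecated Date(int, int, int) constructor interprets its
        // arguments as (year - 1900, 0-based month, 1-based day).
        @SuppressWarnings("deprecation")
        Date d = new Date(2006, 1, 1);

        Calendar cal = Calendar.getInstance();
        cal.setTime(d);

        // 2006 + 1900 = 3906; month 1 is February; day 1 is day 1.
        System.out.println(cal.get(Calendar.YEAR));                       // prints 3906
        System.out.println(cal.get(Calendar.MONTH) == Calendar.FEBRUARY); // prints true
        System.out.println(cal.get(Calendar.DAY_OF_MONTH));               // prints 1
    }
}
```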
Even the parts of a program that aren't really part of the program can provoke discord. "Comments" are meant for the human reader and have to be marked in some way so that the computer will ignore them. You might think it would be easy to choose some marker that could be reserved for this purpose in all languages. But a compendium of programming-language syntax compiled by Pascal Rigaux—a marvelous resource, by the way—lists some 39 incompatible ways to designate comments: # in awk, \ in Forth, (*...*) in Pascal, /*...*/ in C, and so on. There's also a running debate over whether comments should be "nestable"—whether it's permissible to have comments inside comments.
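Java, true to its lineage, recognizes both of the C family's markers (plus /**...*/ for documentation) and, like C, comes down on the non-nesting side of the debate. A trivial sketch:

```java
public class CommentDemo {
    public static void main(String[] args) {
        // A line comment, in the style C++ popularized.
        /* A block comment, in the style of C. Like C's, Java's block
           comments do not nest, so commenting out code that already
           contains a block comment can end the comment early. */
        System.out.println("comments are invisible to the compiler");
    }
}
```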
Then there's the CamelCase controversy. Most programming languages insist that names of things—variables, procedures, etc.—be single words, without spaces inside them; but runningthewordstogether makes them unreadable. Hence CamelCase, with humps in the middle (also known as BumpyCaps and NerdCaps; but sTuDLy CaPs are something else). To tell the truth, I don't think there's much actual controversy about the use of CamelCase, but the name has occasioned lively and erudite discussions, revisiting old questions about Camelus dromedarius and C. bactrianus, and offering glimpses of such further refinements as sulkingCamelCase (with a droopy head).