Why do utility companies supply power at a frequency of either 50 or 60 Hz?
60 Hz was the result of engineering tradeoffs, I think made by or influenced by Nikola Tesla. He was one of the early proponents of distributing AC, as opposed to Edison, who wanted to distribute DC. The tradeoff had to do with the size of the machines and transformers needed, which get smaller at higher frequency, versus some losses, which go up with frequency. I remember reading that some careful study went into the decision to pick 60 Hz.
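On the transformer half of that tradeoff, the standard transformer EMF equation (E_rms = 4.44 f N A_c B_max) makes the scaling explicit: at a fixed voltage, turns count, and peak flux density, the required core cross-section goes as 1/f. Here's a minimal sketch; the 230 V / 500 turns / 1.2 T values are illustrative assumptions for the example, not from any source here:

```python
# Transformer EMF equation: E_rms = 4.44 * f * N * A_c * B_max.
# For fixed voltage, turns, and peak flux density, the required
# core cross-section A_c scales as 1/f -- higher supply frequency
# means a smaller, lighter core.

def required_core_area_m2(e_rms, freq_hz, turns, b_max_tesla):
    """Core cross-section (m^2) needed to support e_rms volts."""
    return e_rms / (4.44 * freq_hz * turns * b_max_tesla)

# Illustrative values (not from the post): a 230 V winding with
# 500 turns on silicon steel run at 1.2 T peak flux density.
for f in (25, 50, 60, 400):
    a = required_core_area_m2(230, f, 500, 1.2)
    print(f"{f:>3} Hz -> {a * 1e4:5.1f} cm^2 of core")
```

The losses pulling in the other direction are the iron losses: hysteresis loss grows roughly linearly with frequency and eddy-current loss roughly with its square, which is part of why the compromise didn't land at a much higher frequency.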
50 Hz, on the other hand, was due to marketing. A German manufacturer of power grid equipment wanted to distinguish itself and managed to get 50 Hz pushed through as the standard in Germany and then much of Europe, which meant it didn't have to compete with American 60 Hz equipment. The rest of the world ended up with 60 or 50 Hz depending on who they bought their equipment from and whether they were more economically tied to Europe or the US. Since Russia adopted the European 50 Hz standard, the whole Soviet bloc became 50 Hz countries.
This looks like what you're looking for:
http://en.wikipedia.org/wiki/Utility_frequency
In the early days of electrification, so many frequencies were used that no one value prevailed (London in 1918 had 10 different frequencies). As the 20th century continued, more power was produced at 60 Hz (North America) or 50 Hz (Europe and most of Asia). Standardization allowed international trade in electrical equipment. Much later, the use of standard frequencies allowed interconnection of power grids. It wasn't until after World War II with the advent of affordable electrical consumer goods that more uniform standards were enacted.
In Britain, a standard frequency of 50 Hz was declared as early as 1904, but significant development continued at other frequencies.[10] The implementation of the National Grid starting in 1926 compelled the standardization of frequencies among the many interconnected electrical service providers. The 50 Hz standard was completely established only after World War II.
By about 1900, European manufacturers had mostly standardized on 50 Hz for new installations. The German VDE in the first standard for electrical machines and transformers in 1902 recommended 25 Hz and 50 Hz as standard frequencies. VDE did not see much application of 25 Hz, and dropped it from the 1914 edition of the standard. Remnant installations at other frequencies persisted until well after the Second World War.[9]
In addition to the 1997 Edward L. Owen column that Rasmus Faber cites, there's another good article: Paul Nixon, "Technical origins of 60 Hz as the standard AC frequency in North America," IEEE Power Engineering Review, March 1999, pp. 35-37. The full article is behind a paywall, but they post the first page as a PNG image, which I'll link to below.
The particularly interesting section here is:
By late 1889 and early 1890, direct-coupled alternators were coming into the experimental stage. These machines would prove to be much more reliable than the belt-driven generators, but would operate at much lower speeds. The need for lower AC operating frequencies [than 133.3 Hz] was apparent, again driven by constructional and mechanical constraints. For example, an alternator direct driven by a 100 rpm engine would require 160 poles to yield a frequency of 133 1/3 Hz. This type of construction was viewed as prohibitive. Around this time the Westinghouse Co. conducted an engineering study which considered both electrical operating characteristics with regard to the system components of the time and possible engine driven generator construction constraints, and recommended that 7,200 alternations per minute (60 Hz @ 2 poles) was as high a frequency as would be desirable for the engine speeds which were then attainable. 60 Hz was actually a carefully selected compromise. It was thought that higher frequencies would be better for the transformers then in existence, while lower frequencies might be better for engine-type generators. 60 Hz first appeared commercially in 1890. The earliest 60 Hz systems, like the earlier AC systems (140, 133 1/3, 125 Hz), were all single-phase.
By 1892, there were a large number of Westinghouse-designed 60 Hz central stations in existence, and 60 Hz had taken over a share of the AC business from the higher frequencies.
(source: ieee.org)
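To check the arithmetic in that quote: for a synchronous alternator the output frequency is f = poles × rpm / 120 (120 because there are 60 seconds per minute and one electrical cycle per pole pair). A quick sketch reproducing the quote's numbers:

```python
# Synchronous alternator: f (Hz) = poles * rpm / 120
# (60 s/min, and one electrical cycle per pair of poles)
def sync_freq_hz(poles: int, rpm: float) -> float:
    return poles * rpm / 120

print(sync_freq_hz(160, 100))  # 133.33 Hz: the 160-pole, 100 rpm case
print(sync_freq_hz(2, 3600))   # 60.0 Hz: a 2-pole machine at 3600 rpm
# "7,200 alternations per minute" = 7200 / 60 alternations per second,
# and 2 alternations (half-cycles) per cycle -> 120 / 2 = 60 Hz.
```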
So we basically have engineering tradeoffs (transformers better at high frequencies vs. electric machines better at low frequencies, which is largely still true today) leading to a somewhat arbitrary compromise, with network effects then solidifying the choice.
The Owen article also mentions that southern California ran at 50 Hz until its conversion to 60 Hz was completed in 1948. Japan is still split between 50 Hz and 60 Hz: Owen states, "In 1895, AEG sold a 50-Hz generator to the power company in Tokyo and the eastern half of Japan was put on the 50-Hz path. A little over a year later, GE sold a 60-Hz generator to the power company in Osaka, and the western half of Japan was put on the 60-Hz path." In the end it seems like it's mostly inertia/network effects -- it's just too painful to change large infrastructure projects.