Why does public mains power use 50-60 Hz and 100-240 V?
Why is mains frequency 50Hz and not 500 or 5?
Engine efficiency, rotational stress, flicker, the skin effect, and the limitations of 19th century material engineering.
50Hz corresponds to 3000 RPM on a two-pole generator. That is a convenient, efficient speed for the steam turbines that drive most generators, which avoids a lot of extra gearing.
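For reference, the standard relation for a synchronous machine with $P$ magnetic poles spinning at $N$ RPM is $$f = \frac{P \, N}{120} \, ,$$ so a simple two-pole generator at 3000 RPM produces exactly 50 Hz (and a 60 Hz grid correspondingly runs its two-pole machines at 3600 RPM).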
3000 RPM is also fast, but not so fast that it puts excessive mechanical stress on the rotating turbine or the AC generator. 500Hz would be 30,000 RPM, and at that speed your generator would likely tear itself apart. There are slow-motion videos, for funsies shot at 62,000 and 170,000 FPS, of what happens when you spin a CD at that speed.
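A quick way to see why: for a piece of rotor at radius $r$, the centripetal acceleration is $$a = \omega^2 r \, ,$$ so spinning ten times faster means roughly a hundred times the mechanical loading on the same rotor.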
Why not slower? Flicker. Even at 40Hz an incandescent bulb cools slightly on each half cycle, reducing brightness and producing a noticeable flicker. Transformer and motor size is also inversely proportional to frequency: higher frequency means smaller transformers and motors.
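One way to see the size scaling is the standard transformer design relation (assuming a fixed peak flux density $B_{\max}$ that the core material can tolerate): $$V_{\text{rms}} \approx 4.44 \, f \, N \, A_{\text{core}} \, B_{\max} \, ,$$ so for a given voltage and number of turns $N$, the required core cross-section $A_{\text{core}}$ scales like $1/f$.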
Finally there is the skin effect. At higher frequencies, AC current tends to flow near the surface of a conductor. This reduces the effective cross-section of the conductor and increases its resistance, causing more heating and power loss. There are ways to mitigate this effect, and they're used in high-tension lines, but they are more expensive and so are avoided in home wiring.
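To get a feel for the numbers, here is a minimal sketch of the standard good-conductor skin-depth formula $\delta = \sqrt{2\rho / (\omega \mu)}$, using an approximate room-temperature resistivity for copper (the exact values don't matter for the argument):

```python
import math

MU_0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m
RHO_CU = 1.68e-8            # approximate resistivity of copper, ohm*m

def skin_depth(freq_hz, resistivity=RHO_CU, mu=MU_0):
    """Skin depth delta = sqrt(2*rho / (omega*mu)) for a good conductor."""
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * resistivity / (omega * mu))

for f in (50, 60, 400, 500):
    print(f"{f:>4} Hz: skin depth ~ {skin_depth(f) * 1000:.1f} mm")
```

At 50 Hz the current penetrates roughly 9 mm into copper, so ordinary household conductors are barely affected; at a few hundred hertz the usable depth shrinks by about a factor of three, which starts to matter for very thick conductors.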
Could we do it differently today? Probably. But these standards were laid down in the late 19th century and they were convenient and economical for the electrical and material knowledge of the time.
Some systems do run at an order of magnitude higher frequency than 50Hz. Many enclosed systems such as ships, computer server farms, and aircraft use 400 Hz. They have their own generator, so the transmission loss due to the higher frequency is of less consequence. At higher frequencies transformers and motors can be made smaller and lighter, of great consequence in an enclosed space.
Why is mains voltage 110-240V and not 10V or 2000V?
Higher voltage means lower current for the same power. Lower current means less loss due to resistance. So you want to get your voltage as high as possible for efficient power distribution and less heating, which allows thinner (and cheaper) wires. For this reason, power is distributed over long distances at tens to hundreds of kilovolts.
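A toy calculation of the $I^2 R$ loss for delivering the same power at different voltages (the load and the line resistance below are made-up figures, chosen only to show the scaling):

```python
def line_loss(power_w, volts, line_resistance_ohm):
    """I^2 * R loss for delivering power_w at a given voltage over a line."""
    current = power_w / volts            # ignoring power factor
    return current**2 * line_resistance_ohm

P = 100e3   # a 100 kW load (made-up figure)
R = 0.5     # total line resistance in ohms (made-up figure)
for v in (240, 11e3, 110e3):
    loss = line_loss(P, v, R)
    print(f"{v / 1e3:>6.1f} kV: loss ~ {loss:,.1f} W ({100 * loss / P:.4f} % of the load)")
```

Even this crude model shows why a feeder that is hopeless at 240 V becomes essentially lossless at transmission voltages.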
Why isn't it lower? For a given power, lower voltage means higher current. At 10 volts, higher-power household loads like lights, heating, or a refrigerator's compressor motor would need impractically large currents; a 1.5 kW heater, for example, would draw 150 A. At the time this was being developed, the chosen voltage was a compromise among what was needed to run lights, motors and appliances.
Why isn't it higher? Insulation and safety. Higher-voltage AC wiring needs more insulation, both to make it safe to handle and to avoid interference with other wiring or radio receivers. The cost of home wiring was a major concern in the early adoption of electricity, and higher voltages would have made it bulkier, more expensive and more dangerous.
In the end, the choice of a single specific number comes from the necessity to standardize. However, we can make some physical observations to understand why that final choice had to fall in a certain range.
Frequency
Why a standard?
First of all, why do we even need a standard? Can't individual appliances convert the incoming electricity to whatever frequency they want? Well, in principle it's possible, but it's rather difficult. Electromagnetism is fundamentally time invariant and linear; the differential equations we use to describe it, Maxwell's equations, are such that a system driven by a sinusoidal input at frequency $\omega$ responds only at that same frequency. In order to get out a frequency different from $\omega$, the electromagnetic fields have to interact with something else, notably charged matter. This can come in the form of a mechanical gearbox or a nonlinear electrical element such as a transistor. Nonlinear elements such as the transistor can generate harmonics of the input, i.e. frequencies $2 \omega$, $3 \omega$, etc. However, in any case, frequency conversion introduces efficiency loss, cost, and bulk into the system.
In summary, because of the time invariance and linearity of electromagnetism, it is considerably more practical to choose a single frequency and stick to it.
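As a toy numerical illustration of that point (not any particular circuit): a linear element driven at 50 Hz responds only at 50 Hz, while even a mild quadratic nonlinearity produces output at DC and at the second harmonic.

```python
import numpy as np

f0 = 50.0                        # drive frequency, Hz
t = np.arange(0, 1, 1 / 5000)    # 1 s of time sampled at 5 kHz
x = np.cos(2 * np.pi * f0 * t)   # sinusoidal input

responses = {
    "linear element   ": 3.0 * x,           # output only at f0
    "nonlinear element": x + 0.5 * x**2,    # toy quadratic nonlinearity
}

for name, y in responses.items():
    spectrum = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1 / 5000)
    peaks = freqs[spectrum > 0.05 * spectrum.max()]
    print(name, "responds near", np.round(peaks, 1), "Hz")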
Light flicker
In a historical note by E. L. Owen (see references), it is noted that the final decision between 50 and 60 Hz was somewhat arbitrary, but based partially on the consideration of light flicker.
During the lecture, while Bibber recounted Steinmetz’s contributions to technical standards, he briefly repeated the story of the frequencies. By his account, “the choice was between 50- and 60-Hz, and both were equally suited to the needs. When all factors were considered, there was no compelling reason to select either frequency. Finally, the decision was made to standardize on 60-Hz as it was felt to be less likely to produce annoying light flicker.”
The consideration of light flicker comes up elsewhere in historical accounts and explains why very low frequencies could not be used. When we drive a pure resistance with an AC current $I(t) = I_0 \cos(\omega t)$, the instantaneous power dissipation is proportional to $I(t)^2$. This signal oscillates in time at a frequency $2\omega$ (remember your trig identities). Therefore, if $\omega$ is lower than around $40 \, \text{Hz}$$^{[a]}$, the power dissipated varies slowly enough that you could perceive it as a visual flicker. This sets a rough lower limit on the frequency you can use for driving a light source. Note that the arc lamps in use when electrical standards were developed may not have had a purely resistive electrical response (see Schwern's answer, where cooling on each cycle is mentioned), but the source frequency is always present in the output even in nonlinear and filtered systems.
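Explicitly, the identity in question is $$I(t)^2 = I_0^2 \cos^2(\omega t) = \frac{I_0^2}{2} \left( 1 + \cos(2 \omega t) \right) \, ,$$ i.e. the dissipated power is a constant average plus a ripple at $2\omega$.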
Reflections / impedance matching
Alternating current signals travelling on a wire obey wave-like behavior. In a rough sense, the higher the frequency the more wavy the signal. A good rule of thumb is that if the length of the wires is comparable to or longer than the wavelength of the signal, then you have to worry about wave-like phenomena such as reflection. The wavelength $\lambda$ of an electrical signal is roughly $$\lambda = c / f$$ where $c$ is the speed of light and $f$ is the frequency. Suppose we'd like to transmit the electricity from an electrical substation to a house and we want to keep the wavelength large enough that we can ignore reflection physics without having to deal with careful impedance matching. Let's put in a length of $1000 \, \text{m}$ to be conservative. Then we get $$f \leq c / 1000 \, \text{m} = 300 \, \text{kHz} \, .$$
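For comparison, at the mains frequency itself the wavelength is enormous, $$\lambda = \frac{3 \times 10^8 \, \text{m/s}}{50 \, \text{Hz}} = 6000 \, \text{km} \, ,$$ so local distribution wiring is a tiny fraction of a wavelength and reflections are a non-issue at 50 or 60 Hz.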
Voltage
We're talking about the voltage inside the building here. Note that power is transmitted at much higher voltage and then stepped down near the end point. The 120 V choice apparently comes from the fact that electricity was originally used for lighting, and the first lamps back in those early days were most efficient at around 110 V. The value 120 V may have been chosen to offset voltage drop in the wires going to the lighting sources.
Further reading
Detailed document by E. L. Owen with references
$[a]$: I'm not an expert in human flicker perception. This number is a rough guess based on personal experience and some literature.
P.S. I consider this answer a work in progress and will add more as I learn more.
The two other answers address the frequency issue. The voltage issue is much simpler.
If the voltage is too high, you run the risk of arcs between conductors. The clearance you need between conductors to avoid an arc grows with voltage. At 240V, a separation of a few millimeters in air is already needed to stay safe, depending on humidity and contamination. Much higher voltages quickly become impractical...
If the voltage gets lower, on the other hand, you need more current for a given power. But heating of the wires is proportional to the current squared: this means you need thicker wire, with lower resistance. That's cumbersome, expensive and stiff (as an example, 32A-rated wire is barely bendable enough to turn wall corners).
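A back-of-the-envelope sketch of how much copper this costs you (the 2 kW load and the 1 W-per-meter heating budget are arbitrary choices, picked only to show the trend):

```python
RHO_CU = 1.68e-8   # approximate resistivity of copper, ohm*m

def copper_area_mm2(power_w, volts, max_loss_w_per_m=1.0):
    """Cross-section needed so I^2 * R heating in one conductor stays
    under a chosen dissipation budget per meter of wire."""
    current = power_w / volts
    area_m2 = RHO_CU * current**2 / max_loss_w_per_m   # from loss/m = I^2 * rho / A
    return area_m2 * 1e6

for v in (12, 120, 240):
    print(f"{v:>3} V: ~{copper_area_mm2(2000, v):.1f} mm^2 of copper for a 2 kW load")
```

The same 2 kW load that runs comfortably on household-scale wire at 120/240 V would need something like a bus bar at 12 V.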
So the chosen 120/240V reflects this balance between arcing concerns (especially around connections) and wire heating.
I also heard that safety dictates a highish voltage so that muscle spasms give you a chance to drop whatever you're touching before you get burnt to the core. I don't know to what extent this is true...