What is the advantage of optical TOSLINK over RCA coaxial cable?
In addition to TimB's answer, there is another advantage to the optical connection.
With RCA, the two networks connected have to be referenced to each other. With optical, there is galvanic isolation between the two. As a result, there tend to be fewer issues with ground loops, networks can remain isolated, and so on. It also means the grounds can't act as a big antenna, which can make it easier to get low noise in the system as a whole.
An additional disadvantage of RCA connectors lies in the ground connection. If you look at most modern connectors, you will see that the ground connection is made first. As a result, the two circuits being connected are first pulled to the same potential, and only then is the actual data line connected. If the data line were connected first, the same equalization would still happen, but the currents would have to flow through your likely far more sensitive digital receiver circuits. On RCA connectors, the first connection made is the center pin, which carries the data. For this reason I have often been told to always connect the RCA cables first, before connecting the system to mains voltage, or to use the ground lug that some of these devices have to reference the system to mains earth at all times. Needless to say, this issue is not present in the optical versions, which makes them better suited to hot-plugging.
I want to ask: within the scope of digital audio transmission, are there any observable or measurable differences between the two cables?
Actually, yes.
Isolation:
Optical fiber isn't conductive, so it eliminates ground loops and hum/buzz issues, and it is insensitive to RF interference. Coax can also be isolated with a transformer, but this adds cost and is uncommon in consumer equipment. A quick continuity test with a multimeter between the digital RCA ground and any other RCA ground will reveal whether there is transformer isolation.
This really matters for cable TV boxes which are connected to the cable's ground, as this tends to create annoying ground loops.
Bandwidth:
The majority of optical transceivers on the market will have enough bandwidth for 24bits/96kHz, but only a few will pass 24/192k, and none pass 384k. If you want to know which one you got, make a test. That's rather binary: it works or it doesn't. Of course you can buy optical transceivers with much higher bandwidth (for ethernet, among other things), but you won't find these in audio gear.
Coax has no bandwidth problem; it'll pass 384k with no trouble. Whether that sounds better is left as an exercise for the marketing department.
Whether 192k is a marketing gimmick or genuinely useful is an interesting question, but if you want to use it and your optical receiver doesn't support it, then you'll have to use coax.
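To see why receiver bandwidth runs out, it helps to look at the actual line rates. From the S/PDIF frame structure (2 subframes of 32 time slots per stereo frame, with biphase-mark coding using 2 half-cells per slot), the on-the-wire rate scales directly with sample rate:

```python
# S/PDIF line rate vs sample rate. Each stereo frame is 2 subframes of
# 32 time slots; biphase-mark coding uses 2 half-cells (UI) per slot.
SLOTS_PER_FRAME = 2 * 32   # stereo frame
UI_PER_SLOT = 2            # biphase-mark coding

for fs in (44_100, 48_000, 96_000, 192_000, 384_000):
    baud = fs * SLOTS_PER_FRAME * UI_PER_SLOT
    print(f"{fs/1000:>6.1f} kHz -> {baud/1e6:6.3f} Mbaud on the wire")
```

So 192k needs roughly 24.6 Mbaud through the optical link, four times what 48k needs, which is exactly where cheap TOSLINK receivers give up.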
Length:
Plastic optical fiber is cheap. Count on 1dB/m attenuation. This isn't high-quality glass-core telecom fiber with 1-2dB/km loss! This doesn't matter for a 1m long fiber in your home cinema, but if you need a 100 meter run, coax will be the only option. 75R TV antenna coax is fine. Or better fiber, but not plastic. Connectors are, of course, not compatible.
(Note 1dB/m is for the digital signal, not the analog audio. If the digital signal is too attenuated the receiver won't be able to decode it, or errors will occur).
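The 1dB/m figure translates into a maximum run length via a simple link budget. This is a back-of-the-envelope sketch; the 20 dB margin between transmitter power and receiver sensitivity is an assumed illustrative value, not a figure from any specific part:

```python
# Back-of-the-envelope optical link budget for plastic optical fiber.
ATTEN_DB_PER_M = 1.0    # typical POF loss (figure from the text above)
LINK_MARGIN_DB = 20.0   # assumed TX power minus RX sensitivity (hypothetical)

max_length_m = LINK_MARGIN_DB / ATTEN_DB_PER_M
print(f"max POF run with these assumptions: {max_length_m:.0f} m")
```

With telecom-grade glass fiber at ~0.002 dB/m, the same margin would allow kilometers, which is why long runs call for coax or better fiber, not plastic.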
Bit Error Rate:
Barring a major issue, all the bits will be there with both systems (I checked). BER is not an issue in practice. Anyone who talks about bit errors in SPDIF has something to sell, usually an expensive gimmick to solve a non-existent problem. Also SPDIF includes error-checking, so the receiver will mask any errors.
Jitter:
Optical receivers add a lot more jitter (in the ns range) than well-implemented coaxial.
If the coax implementation is botched (not enough bandwidth extension on the low end, violation of 75R impedance, high intersymbol interference, etc) it can also add jitter.
This only matters if your DAC at the receiving end doesn't implement proper clock recovery (e.g., a WM8805, ESS DACs, or other FIFO-based systems). If it does, there will be no measurable difference, and good luck hearing anything in a double-blind test. If the receiver doesn't clean up jitter properly, then you'll have audible differences between cables. This is a "receiver not doing its job" problem, not a cable problem.
EDIT
SPDIF embeds the clock into the signal, so it must be recovered. This is done with a PLL synchronized with the incoming SPDIF transitions. The amount of jitter in the recovered clock depends on how much jitter is in the incoming signal transitions, and the ability of the PLL to reject it.
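The clock embedding works through biphase-mark coding: the line toggles at every bit-cell boundary, and toggles again mid-cell only for a '1', so every bit guarantees at least one transition for the PLL to lock onto. A minimal encoder sketch:

```python
# Minimal biphase-mark coding (BMC) sketch, the scheme SPDIF uses to
# embed the clock in the data stream.
def bmc_encode(bits, level=0):
    out = []
    for b in bits:
        level ^= 1              # mandatory transition at every cell boundary
        out.append(level)
        if b:
            level ^= 1          # extra mid-cell transition encodes a '1'
        out.append(level)
    return out                  # two half-cells per data bit

print(bmc_encode([1, 0, 1, 1]))
```

Because a transition is guaranteed at each cell boundary regardless of the data, the receiver's PLL can recover the bit clock from the transition timing alone.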
When a digital signal transitions, the important moment occurs when it passes through the logic level threshold of the receiver. At this point, the amount of jitter added is equal to the noise (or amount of error added into the signal) divided by the signal slew rate.
For example, if a signal edge has a slew rate of 10ns/V (i.e., 0.1V/ns) and we add 10mV of noise, the logic-level transition will shift in time by 100ps.
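That arithmetic, using the same numbers as the example above:

```python
# Timing error at the threshold crossing = noise amplitude / slew rate,
# which with a 10 ns/V edge is the same as noise * (ns per volt).
rise_ns_per_volt = 10.0   # 10 ns/V edge, i.e. 0.1 V/ns slew rate
noise_v = 0.010           # 10 mV of noise on the signal

jitter_ns = noise_v * rise_ns_per_volt
print(f"timing shift: {jitter_ns * 1000:.0f} ps")
```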
TOSLINK receivers have a lot more random noise than a coax input (the photodiode signal is weak and must be amplified), but this isn't the main cause. The main cause is band-limiting.
Coax SPDIF is usually AC-coupled with a capacitor or a transformer. This adds a high-pass on top of the natural low-pass nature of any transmission medium, so the result is a bandpass filter. If the passband isn't wide enough, past signal values will influence current values. See fig.5 in this article.
Longer runs of constant level (1 or 0) will shift the levels of the following bits and move their transitions around in time. This adds data-dependent jitter. Both the high-pass and low-pass sides matter.
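The high-pass side of this can be simulated numerically. Below is a toy sketch of baseline wander through a first-order high-pass (standing in for the AC-coupling network); the corner frequency and signal levels are made-up illustrative values, not from any real SPDIF hardware:

```python
# Toy baseline-wander demo: a long run of constant level droops through
# an AC-coupling high-pass, so the level (and hence the threshold-crossing
# time on a real band-limited edge) after the next transition is shifted.
import math

fs = 1e9      # simulation rate: 1 ns steps
fc = 100e3    # assumed high-pass corner of the coupling network
alpha = 1 / (1 + 2 * math.pi * fc / fs)  # one-pole high-pass coefficient

def highpass(x):
    y, prev_x, prev_y = [], 0.0, 0.0
    for s in x:
        prev_y = alpha * (prev_y + s - prev_x)  # y[n] = a*(y[n-1]+x[n]-x[n-1])
        prev_x = s
        y.append(prev_y)
    return y

# 2 us at +1 (a long constant run), then a transition to -1
y = highpass([1.0] * 2000 + [-1.0] * 100)

print(f"level just before transition: {y[1999]:+.3f}")  # drooped well below +1
print(f"level just after transition:  {y[2010]:+.3f}")  # overshoots below -1
```

The drooped level before the edge and the overshoot after it are exactly the data-dependent level shifts that turn into data-dependent jitter once the edge has finite slew rate.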
Optical adds more jitter because its noise is higher, and its passband is smaller than a properly implemented coax. For example, see this link. Jitter on 192k is very high (almost 1/3 of a bit time) but jitter on 48k is much lower, because the receiver doesn't have enough bandwidth for the 192k signal, so it acts as a lowpass, and the previous bits smear into the current bit (that's intersymbol interference). This is almost invisible on 48k because receiver bandwidth is sufficient for this sample rate, so intersymbol interference is much lower. I'm not sure the receiver used by this guy actually supports 192k, the waveform really looks bad and I doubt the decoder chip would find it palatable. But this illustrates bandwidth vs intersymbol interference well.
Most optical receiver datasheets will specify a few ns of jitter.
The same can occur with a bad SPDIF coax if it acts like a low-pass filter. The high-pass part of the transfer function also plays a role (read the article linked above). The same applies if the cable is long and impedance discontinuities cause reflections that corrupt the edges.
Note this only matters if the following circuitry doesn't reject it, so the end result is very implementation-dependent. If the receiver is a CS8416 and the DAC chip is very sensitive to jitter, it can be quite audible. With more modern chips that use a digital PLL to reconstruct the clock, good luck hearing any difference! These work very well.
For example, the WM8805 runs the received data through a tiny FIFO and uses a Frac-N clock synthesizer to reconstruct the clock, whose frequency is updated once in a while. It is rather interesting to watch on a scope.
Fiber optic does not radiate electromagnetically, but more importantly it is immune to the electromagnetic interference that can corrupt data on copper in extreme conditions. Such interference may come from the arcing of a switch being opened under load, or may be generated by a motor under heavy load.