Why don't we use low-voltage power sources for high-wattage applications?
You are right in that power is the product of voltage and current. This would suggest that any voltage × current combination would be fine, as long as it comes out to the desired power.
However, back in the real world we have various realities that get in the way. The biggest problem is that at low voltage, the current needs to be high, and that high current is expensive, large, and/or inefficient to deal with. There is also a limit on voltage above which it gets inconvenient, meaning expensive or large. There is therefore a moderate range in the middle that works best with the inconvenient physics we are dealt.
Using your 60 W device as an example, start by considering 120 V and 500 mA. Neither is pushing any limits that result in unusual difficulty or expense. Insulating to 200 V (always leave some margin, particularly for insulation ratings) pretty much happens unless you try not to. 500 mA doesn't require unusually thick or expensive wire.
5 V and 12 A is certainly doable, but already you can't just use normal "hookup" wire. Wire that can handle 12 A needs considerably more copper than wire for 500 mA, and that copper costs real money, makes the wire less flexible, and makes it thicker.
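To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python comparing the two cases. The 1 m cable run, the 1 % loss budget, and the copper resistivity value are my own assumptions for illustration, not part of the original example:

```python
# Copper needed to deliver 60 W over a 1 m cable run (2 m of conductor
# out and back) while losing no more than 1 % of the power in the cable.
# The cable length and loss budget are arbitrary illustrative choices.
RHO_CU = 1.68e-8              # resistivity of copper, ohm*m
LENGTH = 2.0                  # out-and-back conductor length, m
P_LOAD = 60.0                 # power to deliver, W
P_LOSS_MAX = 0.01 * P_LOAD    # 0.6 W allowed in the cable

for volts in (120.0, 5.0):
    amps = P_LOAD / volts
    r_max = P_LOSS_MAX / amps**2        # from P_loss = I^2 * R
    area = RHO_CU * LENGTH / r_max      # from R = rho * L / A
    print(f"{volts:5.0f} V -> {amps:4.1f} A, "
          f"R <= {r_max*1e3:7.2f} mOhm, "
          f"copper ~ {area*1e6:6.3f} mm^2")
```

For the same fractional loss, the 5 V case ends up needing roughly (12 A / 0.5 A)² = 576 times the copper cross-section of the 120 V case.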
At the other end, you haven't gained much by dropping from 120 V to 5 V. One advantage is the safety rating: generally at 48 V and below, things get simpler regulation-wise. Once you're down to 30 V or so, there isn't much further saving in transistors and the like even if they only need to handle 10 V.
Taking this further, 1 V at 60 A would be quite inconvenient. By starting at such a low voltage, small voltage drops in the cable become significant inefficiencies, right when it becomes more difficult to avoid them. Consider a cable with only 100 mΩ of total out-and-back resistance. Even with the full 1 V across it, only 10 A would flow, and that leaves no voltage at all for the device.
Let's say you want at least 900 mV at the device, and therefore need to deliver 67 A to compensate for the power lost in the cable. The cable would need an out-and-back total resistance of (100 mV)/(67 A) = 1.5 mΩ. Even at a total of 1 m of cable, that would require quite a thick conductor. And it would still dissipate 6.7 W.
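The arithmetic above is easy to script. The copper resistivity and the reading of "1 m of cable" as 1 m of total conductor are my assumptions; everything else comes from the example:

```python
import math

# Numbers from the 1 V / 60 W example above. Copper resistivity and the
# reading of "1 m of cable" as 1 m of total conductor are assumptions.
RHO_CU = 1.68e-8          # ohm*m
P_LOAD = 60.0             # W the device must still receive
V_DEVICE = 0.9            # V left at the device
V_DROP = 1.0 - V_DEVICE   # 100 mV allowed across the cable

i = P_LOAD / V_DEVICE               # ~67 A
r_cable = V_DROP / i                # ~1.5 mOhm total
p_cable = V_DROP * i                # ~6.7 W wasted in the cable
area = RHO_CU * 1.0 / r_cable       # cross-section for 1 m of conductor
dia = 2 * math.sqrt(area / math.pi)

print(f"current     : {i:.1f} A")
print(f"cable R     : {r_cable*1e3:.2f} mOhm")
print(f"cable loss  : {p_cable:.1f} W")
print(f"copper area : {area*1e6:.1f} mm^2 (diameter ~ {dia*1e3:.1f} mm)")
```

That works out to roughly 11 mm² of copper, i.e. a conductor almost 4 mm in diameter, just to feed one 60 W device over a metre of cable.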
This difficulty in dealing with high current is the reason that utility-scale power transmission lines are high voltage. These cables can be 100s of miles long, so series resistance adds up. Utilities make the voltage as high as they can to make the 100s of miles of cable cheaper and to waste less power. The high voltage does have costs of its own, mostly the requirement to keep a larger clearance between the cable and any other conductor. Still, these costs aren't as high as using more copper or steel in the cable.
Another problem with AC is that the skin effect gives you diminishing returns in resistance as the conductor diameter grows. This is why, for really long distances, it becomes cheaper to transmit DC and then pay the expense of converting it back to AC at the receiving end.
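For a feel of the scale of the skin effect at mains frequencies, here is a sketch of the standard skin-depth formula applied to copper. It is only an illustration, not a transmission-line design calculation:

```python
import math

# Skin depth delta = sqrt(rho / (pi * f * mu)) for copper (non-magnetic,
# so mu is just mu_0). AC current density falls off with depth, so
# conductor material much deeper than delta carries little current.
RHO_CU = 1.68e-8            # ohm*m
MU_0 = 4 * math.pi * 1e-7   # H/m

for freq in (50.0, 60.0):
    delta = math.sqrt(RHO_CU / (math.pi * freq * MU_0))
    print(f"{freq:.0f} Hz: skin depth ~ {delta*1e3:.1f} mm")
```

At 50–60 Hz the skin depth is around 8–9 mm, so making a solid conductor much thicker than a couple of centimetres buys less and less reduction in AC resistance.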
Combine $$ P = V \cdot I $$ with Ohm's law $$ V = R \cdot I $$ to obtain:
$$ P = I^2 \cdot R $$
where \$P\$ is the power dissipated on the supply wires, \$I\$ is the current flowing through the wires and \$R\$ is the wires' resistance.
For every doubling of the current, the power lost in the wires quadruples. To compensate, one would have to make the resistance four times smaller, i.e. increase the cross-section of the wire by a factor of four (double the wire's diameter), meaning four times more copper.
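A minimal numeric check of that scaling (the 100 mΩ cable value is just an arbitrary example):

```python
# P_loss = I^2 * R: doubling the current quadruples the loss, so the
# wire cross-section must grow fourfold to keep the loss unchanged.
def cable_loss(current_a, resistance_ohm):
    return current_a ** 2 * resistance_ohm

r = 0.1  # 100 mOhm cable, arbitrary example value
print(cable_loss(1.0, r))      # 0.1 W
print(cable_loss(2.0, r))      # 0.4 W: twice the current, four times the loss
print(cable_loss(2.0, r / 4))  # 0.1 W again, but with four times the copper
```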
For the very same reason, the power grid uses up to several hundred kilovolts to transport electricity (transport at household-level voltages would require on the order of a million times more copper to keep the losses the same).
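As a rough check of that "million times more copper" figure, take 400 kV for the grid and 230 V at the household; both voltages are assumed example values:

```python
# For the same transported power, current scales as 1/V and loss as I^2 * R,
# so keeping the loss equal requires R to shrink by (V_grid / V_house)^2,
# i.e. that much more conductor cross-section. Voltages are assumed examples.
V_GRID = 400e3   # V, a typical long-distance transmission voltage
V_HOUSE = 230.0  # V, a typical household voltage

copper_ratio = (V_GRID / V_HOUSE) ** 2
print(f"~{copper_ratio:.1e} times more conductor material")  # ~3e6
```

That comes out at a few million, which is indeed of the order of a million times more copper.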
Achieving really low resistance reliably is a major issue, and it will remain one until room-temperature superconductors exist.
Many PC power supplies will feed high power over low voltages. They have a sense wire on the power rail that is bonded to the far end of the cable. This feeds back to the regulator circuit, which boosts the output voltage to compensate for the drop caused by the high current draw and the resistance of the wire. However, modern motherboards will draw most of their power from the highest-voltage rail to avoid these losses and regulate it down internally.
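As an illustration of what the sense wire buys, here is a sketch with invented numbers (the 12 V rail, 10 A draw, and 20 mΩ of cable resistance are not from any particular supply):

```python
# Remote sensing: the regulator raises its output by the measured cable
# drop so the sense point (at the load) still sees the nominal voltage.
V_NOMINAL = 12.0   # V, desired voltage at the load (assumed)
I_LOAD = 10.0      # A drawn by the load (assumed)
R_CABLE = 0.020    # ohm, total out-and-back cable resistance (assumed)

v_drop = I_LOAD * R_CABLE           # 0.2 V lost in the cable
v_psu_output = V_NOMINAL + v_drop   # what the supply must actually put out
print(f"supply output raised to {v_psu_output:.2f} V "
      f"to keep {V_NOMINAL:.1f} V at the load")
```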
High-current loads also need beefy conductors that won't heat up and melt under that current. If the conductor is damaged in any way, that spot will have higher resistance and heat up even more.