Using CPU heat to generate electricity
The issue with thermoelectric generators is that they are horrendously inefficient.
For a CPU, you HAVE to get rid of the heat it produces or it melts down.
You could hook up a Peltier module and extract a small amount of electricity from that heat, but you would still need to dissipate the remainder via a conventional heat exchanger. The amount of electricity generated would likely not be significant enough to justify the cost of the setup.
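To put a rough number on "not significant", here is a minimal back-of-envelope sketch, assuming a 100 W CPU and a commercial thermoelectric module with about 3% conversion efficiency (both figures are assumptions for illustration):

```python
# Back-of-envelope estimate of electricity recovered from CPU waste heat.
# Every figure here is an assumption chosen for illustration.

cpu_heat_w = 100.0      # heat output of a typical desktop CPU (assumed)
teg_efficiency = 0.03   # ~3% conversion efficiency for a commercial module at a modest delta-T (assumed)

recovered_w = cpu_heat_w * teg_efficiency
remaining_heat_w = cpu_heat_w - recovered_w

print(f"Recovered electrical power:   {recovered_w:.1f} W")      # ~3 W
print(f"Heat still to be dissipated:  {remaining_heat_w:.1f} W")  # ~97 W
```

A few watts back from a 100 W part, while still needing nearly the same heatsink, is why the hardware rarely pays for itself.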
You CAN also use Peltiers as coolers. However, you need to ADD power to pump out the heat, and that added power then has to be dissipated by the heat exchanger along with the heat you are removing. In the end the heat exchanger has to be larger, so your net effect is worse.
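To see why the heat exchanger ends up larger, here is a minimal sketch assuming a coefficient of performance (COP) of 0.7 for the Peltier module, a figure picked purely for illustration (real modules are often well below 1 at a useful temperature difference):

```python
# Illustrative energy balance for Peltier (thermoelectric) cooling.
# The COP value is an assumption for illustration.

cpu_heat_w = 100.0   # heat pumped away from the CPU (assumed)
peltier_cop = 0.7    # watts of heat moved per watt of electrical input (assumed)

peltier_input_w = cpu_heat_w / peltier_cop
total_heat_to_reject_w = cpu_heat_w + peltier_input_w

print(f"Electrical power fed to the Peltier:  {peltier_input_w:.0f} W")         # ~143 W
print(f"Heat the exchanger must now reject:   {total_heat_to_reject_w:.0f} W")  # ~243 W
```

So the heat exchanger now has to move roughly two and a half times the original CPU heat.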
Efficient DIRECT conversion from heat to electricity is a "holy grail" idea and is up there with cold fusion as a theoretical dream.
For generating electricity, you want the hot side (the processor) to be as hot as possible, because efficiency depends on the temperature difference to the cold side. The thermoelectric generator slows down the flow of heat as it extracts energy from it.
For doing computation, you want the processor to be as cold as possible. Higher temperatures increase the electrical resistance of the silicon. This is why you have highly conductive heatsinks, fans, etc.: to move heat away as fast as possible.
These requirements directly contradict one another.
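To put rough numbers on the generation side, here is a minimal sketch of the Carnot ceiling, assuming a 90 °C die and 25 °C ambient air (both temperatures are assumptions for illustration):

```python
# Carnot (ideal) efficiency ceiling for a heat engine running between
# a CPU die and ambient air. Temperatures are assumptions for illustration.

t_hot_k = 90 + 273.15    # hot side: CPU die temperature (assumed)
t_cold_k = 25 + 273.15   # cold side: ambient air (assumed)

carnot_efficiency = 1 - t_cold_k / t_hot_k
print(f"Carnot efficiency ceiling: {carnot_efficiency:.1%}")  # ~17.9%
```

Even that ~18% is a theoretical ceiling; real thermoelectric modules recover only a small fraction of it, and running the die at 90 °C is exactly what the cooling requirement forbids.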
Surprised that nobody else has mentioned this:
Generating electricity from the waste heat of a process that burns fuel can make sense. Generating electricity from the waste heat of a system that is powered by electricity in the first place? That makes no sense. If it's possible for you to save energy that way, then it's possible to save even more energy by building a system that uses electricity more efficiently in the first place.
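A quick numeric comparison, with every figure assumed purely for illustration, makes the point:

```python
# Compare recovering waste heat against simply drawing less power.
# All numbers are assumptions for illustration.

system_power_w = 100.0   # electricity going into the system (assumed)
teg_recovery = 0.03      # fraction of the waste heat a TEG might recover (assumed)
design_saving = 0.03     # same fraction, saved instead by a more efficient design (assumed)

net_with_teg_w = system_power_w - system_power_w * teg_recovery
net_with_better_design_w = system_power_w * (1 - design_saving)

print(f"Net draw with waste-heat recovery: {net_with_teg_w:.0f} W")             # 97 W, plus extra hardware
print(f"Net draw with an efficient design: {net_with_better_design_w:.0f} W")   # 97 W, no extra hardware
```

The same watts saved, but the efficient design needs no extra hardware and leaves less heat to move in the first place.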