Why can a regular infrared camera not show temperature (thermography)?
This is a common confusion, because both thermographic cameras and "normal" cameras with some IR capability are often called IR cameras.
The typical video camera with IR capability has a solid-state semiconductor sensor normally used for capturing visible light, which relies on photons interacting with electrons and electron "holes" inside the semiconductor to convert the incoming light into electric charge, which is subsequently measured. These photons are in the wavelength range of roughly 300-800 nm, but the sensor technology is typically responsive up to 1000 nm or more. As the eye is not sensitive to energy in the 800-1000 nm band, an IR cut filter is normally inserted in cameras to make the resulting photo look similar to what the eye sees.
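As a quick back-of-the-envelope check (using silicon's band gap of about 1.12 eV, a standard value not mentioned above), the sensor cutoff follows from requiring the photon energy hc/λ to exceed the band gap, which lands right around 1100 nm:

```python
# Why a silicon sensor responds out to ~1000 nm: a photon can only create an
# electron-hole pair if its energy h*c/lambda exceeds the band gap.
# Silicon's band gap of ~1.12 eV gives a cutoff wavelength of ~1100 nm.

h = 6.626e-34      # Planck constant, J*s
c = 2.998e8        # speed of light, m/s
eV = 1.602e-19     # joules per electronvolt

band_gap_eV = 1.12                           # approximate silicon band gap
cutoff_nm = h * c / (band_gap_eV * eV) * 1e9
print(f"silicon cutoff wavelength: ~{cutoff_nm:.0f} nm")   # prints ~1107 nm
```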
But if you remove the IR cut filter, you can get some "night vision" capability by bathing the scene with light in the 850-950 nm range, which is invisible to the eye.
Thermal radiation, on the other hand, peaks at a much longer wavelength, typically 8000 nm or longer, and is much more difficult to work with in a direct photon-to-charge process, so the typical thermal camera uses a completely different and more mundane physical process: it actually uses an array of thermometers!
These are nothing more than a grid of small metal squares that are heated by the incoming thermal radiation; their temperature can be read out because their electrical resistance changes with temperature (they are called micro-bolometers).
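To make the "array of thermometers" idea concrete, here is a minimal sketch of a single-pixel readout, assuming a simple linear temperature coefficient of resistance; the resistance and coefficient values are purely illustrative, not taken from any particular sensor:

```python
# Minimal microbolometer-style readout sketch: a pixel's measured resistance
# is converted into a temperature change via an assumed linear temperature
# coefficient of resistance (TCR). All numbers are illustrative.

R0 = 100e3      # pixel resistance at the reference temperature, ohms (assumed)
TCR = -0.02     # fractional resistance change per kelvin (assumed, VOx-like)

def pixel_delta_T(measured_R):
    """Infer how much the pixel warmed up from its measured resistance."""
    return (measured_R / R0 - 1.0) / TCR

# A pixel whose resistance dropped from 100 kOhm to 99.8 kOhm warmed by ~0.1 K.
print(pixel_delta_T(99.8e3))
```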
So very different physical processes are used, and the wavelengths involved differ by an order of magnitude.
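To put a number on that, Wien's displacement law (peak wavelength = b/T for an ideal blackbody) gives the peak emission wavelengths directly; the two temperatures below are just representative values:

```python
# Wien's displacement law: lambda_peak = b / T, with b ~ 2.898e-3 m*K,
# assuming ideal blackbody emitters.

b = 2.898e-3   # Wien's displacement constant, m*K

for label, T in [("sunlight (~5800 K)", 5800.0),
                 ("room-temperature object (~300 K)", 300.0)]:
    peak_nm = b / T * 1e9
    print(f"{label}: peak emission near {peak_nm:.0f} nm")

# Prints ~500 nm for sunlight and ~9700 nm for a 300 K object, i.e. the
# thermal peak lies roughly 20x further into the infrared than visible light.
```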
Thermal cameras also need optics that can refract these longer wavelengths; the lenses are often made of germanium, for example, which is opaque to visible light.
This is partly a matter of definition: thermographic means "shows temperature", so any camera that shows temperature is thermographic and any that doesn't isn't!
To measure temperature a camera needs a couple of features. It needs to be sensitive to a wavelength that the object is actually emitting. A room-temperature object has its peak emission around 10 µm, so an infrared security camera using a silicon sensor that is only sensitive out to about 1 µm isn't going to see much thermal emission from a room-temperature object.
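A rough calculation with the Planck radiance formula (assuming an ideal blackbody) shows just how little a 300 K object emits near the ~1 µm silicon cutoff compared with its ~10 µm peak:

```python
import math

# Planck spectral radiance B(lambda, T) in W / (m^2 * sr * m) for an ideal
# blackbody, used to compare a 300 K object's emission near 1 um (roughly the
# silicon cutoff) with its emission near the 10 um peak.

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(lam, T):
    return (2 * h * c**2 / lam**5) / (math.exp(h * c / (lam * k * T)) - 1)

T = 300.0
ratio = planck(10e-6, T) / planck(1e-6, T)
print(f"radiance at 10 um is ~{ratio:.1e} times that at 1 um")
# -> a factor of more than 1e13, so a ~1 um sensor sees essentially no
#    thermal signal from a room-temperature scene.
```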
To obtain anything like an accurate temperature, you also need to measure the amount of infrared energy (the brightness) at more than one wavelength; by comparing the relative amounts you can estimate where the peak of the blackbody curve lies, and hence the temperature.
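As a sketch of that idea (ideal blackbody, two assumed sampling wavelengths, no emissivity or noise effects), the snippet below generates the brightness ratio an object at some "unknown" temperature would produce in two bands and then recovers the temperature by searching for the best-matching Planck ratio:

```python
import math

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def planck(lam, T):
    """Ideal-blackbody spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * h * c**2 / lam**5) / (math.exp(h * c / (lam * k * T)) - 1)

lam1, lam2 = 8e-6, 12e-6          # two sampling wavelengths, m (assumed)
true_T = 310.0                    # the "unknown" object temperature, K
measured_ratio = planck(lam1, true_T) / planck(lam2, true_T)

# The ratio increases monotonically with T, so a simple scan finds the match.
candidates = [t / 10 for t in range(2000, 6000)]   # 200 K .. 600 K, 0.1 K steps
best_T = min(candidates,
             key=lambda t: abs(planck(lam1, t) / planck(lam2, t) - measured_ratio))
print(f"estimated temperature: {best_T:.1f} K (true value {true_T} K)")
```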
It depends on the sensor. If all that makes a camera "infrared" is that it can detect light of lower frequency than red light, that is not enough to determine temperature. Objects at different temperatures emit different frequencies of infrared light in different amounts, so to determine an object's temperature accurately you need to measure the distribution of frequencies, not just the total amount of light. Thermographic cameras are tuned to multiple infrared frequencies (and possibly a few visible ones), just as ordinary cameras are tuned to red, green, and blue.
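A small sketch of the same point, with illustrative band choices: if the overall brightness scale (emissivity, optics, gain) is unknown, the absolute amount of light in one band says little, but the relative signal across several bands still pins down the temperature.

```python
import math

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def planck(lam, T):
    """Ideal-blackbody spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * h * c**2 / lam**5) / (math.exp(h * c / (lam * k * T)) - 1)

bands = [7e-6, 10e-6, 13e-6]        # three sampling wavelengths, m (assumed)
true_T, true_scale = 320.0, 0.37    # unknown temperature and unknown overall gain
signal = [true_scale * planck(lam, true_T) for lam in bands]

def misfit(T):
    # For a candidate T, fit the unknown scale by least squares, then measure
    # how well the scaled Planck spectrum reproduces the three band signals.
    model = [planck(lam, T) for lam in bands]
    scale = sum(s * m for s, m in zip(signal, model)) / sum(m * m for m in model)
    return sum((s - scale * m) ** 2 for s, m in zip(signal, model))

best_T = min((t / 10 for t in range(2000, 6000)), key=misfit)
print(f"recovered temperature: {best_T:.1f} K (true value {true_T} K)")
```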