Why does a cathode have to be heated to emit electrons?
Matter is held together by the electrical attraction between the electrons and the nuclei. Within the bulk of a solid or liquid, an electron feels these attractions from all directions equally, so on average the force it experiences is zero. But if an electron finds itself at the surface, this isotropy is broken: it feels attractive forces toward the interior that are not canceled by any forces from outside. Normally this causes any electron that impinges on the surface from within to be reflected back in, like a pool ball hitting a cushion. To pull an electron out beyond the surface, you have to do a certain amount of work, called the work function $W$. The work function of a metal is typically about 5 eV.
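As a quick numerical aside, here is a minimal sketch in Python (the 5 eV value is the typical figure quoted above, and the helper name `escapes` is just illustrative) stating the barrier as a simple condition on the electron's kinetic energy:

```python
# Escape condition at a metal surface: an electron gets out only if its
# kinetic energy normal to the surface exceeds the work function W.
EV_TO_JOULES = 1.602176634e-19  # one electron volt in joules (exact, by definition)

W_EV = 5.0  # typical metal work function from the text, in eV

def escapes(kinetic_energy_ev: float, work_function_ev: float = W_EV) -> bool:
    """True if an electron with this much kinetic energy can leave the surface."""
    return kinetic_energy_ev > work_function_ev

print(f"W = {W_EV} eV = {W_EV * EV_TO_JOULES:.2e} J")  # about 8.0e-19 J
print(escapes(0.03))  # thermal-scale energy at room temperature -> False
print(escapes(6.0))   # energy above the barrier -> True
```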
It's not actually true that it has to be heated -- cold-cathode devices do exist, and thermionic emission occurs at all temperatures. However, at room temperature $kT\approx 0.03\ \text{eV}$, which is much smaller than $W$, so only a tiny fraction of the electrons have an energy greater than $W$. The probability of having an energy $W$ at temperature $T$ goes like the Boltzmann factor $e^{-W/kT}$, and Richardson showed in 1901 that the current from a hot cathode, in the absence of an externally applied electric field, grows exponentially in just this way; the law in its modern form makes the current proportional to $T^2e^{-W/kT}$.
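To see how severe this suppression is, here is a short sketch in Python (it assumes the 5 eV work function from above and sample temperatures of 300 K, 1000 K, and 2000 K, the last being a representative hot-cathode operating temperature, not a value from the text) that evaluates the Boltzmann factor and the Richardson temperature dependence:

```python
import math

K_BOLTZMANN_EV = 8.617333262e-5  # Boltzmann constant in eV/K

W_EV = 5.0  # work function from the text, in eV

def boltzmann_factor(t_kelvin: float, w_ev: float = W_EV) -> float:
    """Boltzmann weight e^{-W/kT} for an electron to have energy W at temperature T."""
    return math.exp(-w_ev / (K_BOLTZMANN_EV * t_kelvin))

def richardson(t_kelvin: float, w_ev: float = W_EV) -> float:
    """Richardson temperature dependence T^2 e^{-W/kT} (unnormalized)."""
    return t_kelvin**2 * boltzmann_factor(t_kelvin, w_ev)

for t in (300.0, 1000.0, 2000.0):
    print(f"T = {t:6.0f} K:  kT = {K_BOLTZMANN_EV * t:.3f} eV,  "
          f"e^(-W/kT) = {boltzmann_factor(t):.3e},  T^2 e^(-W/kT) = {richardson(t):.3e}")
```

Between 300 K and 2000 K the factor $e^{-W/kT}$ grows by roughly 70 orders of magnitude, which is why heating the cathode makes the difference between no measurable emission and a useful current.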