What's the most fundamental definition of temperature?
It's the differential relationship between internal energy and entropy: \begin{align} dU &= T\,dS + \cdots \\ \left(\frac{\partial S}{\partial U}\right)_{V,N} &= \frac 1T \end{align} As energy is added to a system (holding volume and particle number fixed), its entropy changes. Remember that the (total) entropy is $$ S = k \ln\Omega, $$ where $\Omega$ is the number of microscopic states available to the system. The second law of thermodynamics is simply probabilistic: entropy tends to increase because there are more ways to have a high-entropy system than a low-entropy one. The logarithm matters here. If you double the entropy of a system (by, say, combining two similar but previously isolated volumes of gas), you have squared $\Omega$.
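You can check the "doubling entropy squares $\Omega$" claim numerically. Here's a minimal sketch using the Einstein solid as a toy model (my choice of model, not anything special): $\Omega(N, q) = \binom{q+N-1}{q}$ counts the ways to distribute $q$ energy quanta among $N$ oscillators.

```python
from math import comb, log

def multiplicity(N, q):
    # Einstein solid toy model: number of ways to distribute
    # q indistinguishable energy quanta among N oscillators
    return comb(q + N - 1, q)

N, q = 50, 100
omega_single = multiplicity(N, q)
S_single = log(omega_single)          # entropy in units of k

# Two identical, isolated solids considered together: microstate
# counts multiply, so entropies add. Doubling S squares Omega.
omega_combined = omega_single ** 2
S_combined = log(omega_combined)

assert abs(S_combined - 2 * S_single) < 1e-9
```

The key step is that independent systems have $\Omega_{\text{total}} = \Omega_1 \Omega_2$, so the logarithm turns multiplication of microstate counts into addition of entropies.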
Consider two systems with different $U,S,T$ that are in contact with each other. One of them has small $\partial S/\partial U$: a little change in internal energy causes only a little change in entropy. The other has a larger $\partial S/\partial U$, and so the same change in energy causes a bigger change in entropy. Because they're in contact, random fluctuations will carry tiny amounts of energy $dU$ from one system to the other. But because of the internal differences that lead to different numbers of internal states, it becomes overwhelmingly more likely that energy will flow out of the system with small $\partial S/\partial U$ (reducing its entropy by a little) and into the system with larger $\partial S/\partial U$ (increasing its entropy by a lot), because that direction increases the total entropy. Since $\partial S/\partial U = 1/T$, the first system is the one with the larger $T$: we call it "hot," and the second one "cold."
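The argument above can be made concrete with the same Einstein-solid toy model (again, an illustrative assumption, not the only choice): fix the total number of quanta shared by two solids and check that moving one quantum from the energy-rich solid to the energy-poor one raises the total entropy.

```python
from math import comb, log

def multiplicity(N, q):
    # Einstein solid: ways to distribute q quanta among N oscillators
    return comb(q + N - 1, q)

# Solid A is energy-rich ("hot", small dS/dU); solid B is
# energy-poor ("cold", large dS/dU). Sizes chosen arbitrarily.
N_A, N_B = 30, 30
q_total = 100
q_A = 80  # so q_B = 20

def total_entropy(qa):
    # total S/k = ln(Omega_A * Omega_B) for a given split of the quanta
    return log(multiplicity(N_A, qa)) + log(multiplicity(N_B, q_total - qa))

S_before = total_entropy(q_A)
S_after = total_entropy(q_A - 1)  # one quantum flows from A to B

# Hot -> cold transfer increases total entropy, so it is the
# overwhelmingly likely direction for random fluctuations.
assert S_after > S_before
```

Here A loses a small amount of entropy (it has many quanta already, so one more or less barely changes $\ln\Omega_A$) while B gains a larger amount, exactly the asymmetry the paragraph describes.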
This definition even extends to the case where entropy decreases as energy is added, in which case the absolute temperature is negative. It also explains why those negative temperatures are "hotter" than any ordinary positive temperature: adding energy to the positive-temperature system increases its entropy, and removing energy from the negative-temperature system also increases its entropy, so energy spontaneously flows from the negative-temperature system into the positive-temperature one.
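The standard setting where this happens is a collection of two-level systems, e.g. $N$ spins where each excited spin carries one unit of energy, so $\Omega(n) = \binom{N}{n}$ for $n$ excited spins (a common textbook example, assumed here for illustration). Below half filling, adding energy increases $\ln\Omega$; above half filling, it decreases it, which is exactly $\partial S/\partial U < 0$, i.e. $T < 0$.

```python
from math import comb, log

N = 100  # number of two-level spins (arbitrary choice)

def S(n):
    # entropy in units of k for n excited spins out of N
    return log(comb(N, n))

# Below half filling: adding one unit of energy raises entropy (T > 0).
assert S(41) > S(40)
# Above half filling: adding one unit of energy LOWERS entropy (T < 0).
assert S(81) < S(80)
```

At exactly half filling $\Omega$ is maximal, $\partial S/\partial U = 0$, and the temperature passes through $\pm\infty$ rather than through zero, which is why negative temperatures sit "above" all positive ones.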