How large does a system have to be before we can talk about thermal equilibrium?
I think the confusion here is that the statement by Kittel does not emphasize that there is a large heat bath at temperature $T$. The single hydrogen atom is in thermal equilibrium with that bath. It is precisely this large, macroscopic bath that gives us a well-defined temperature and lets us keep doing statistical mechanics. In other words, in the canonical ensemble, where heat is allowed to flow, you can talk about "microscopic" things. Perhaps a clearer example is a single H$_2$ molecule in a gas of O$_2$ held at temperature $T$. The average kinetic energy of the H$_2$ molecule is still given by the equipartition theorem, even though we have only a "microscopic" number of H$_2$ molecules, namely one.
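For concreteness, here is a minimal numerical sketch (my own, not from Kittel): assuming the O$_2$ bath keeps the single H$_2$ molecule's velocity Maxwell-Boltzmann distributed at temperature $T$, averaging many thermal snapshots of that one molecule reproduces the equipartition value $\tfrac{3}{2} k_B T$.

```python
import numpy as np

# Minimal sketch (assumption: the O2 bath keeps the single H2 molecule's
# velocity Maxwell-Boltzmann distributed at temperature T).
kB = 1.380649e-23          # Boltzmann constant [J/K]
m_H2 = 3.35e-27            # mass of an H2 molecule [kg], approximate
T = 300.0                  # bath temperature [K]

rng = np.random.default_rng(0)
sigma = np.sqrt(kB * T / m_H2)                    # std. dev. per velocity component
v = rng.normal(0.0, sigma, size=(1_000_000, 3))   # 10^6 snapshots of (vx, vy, vz)

mean_KE = 0.5 * m_H2 * (v**2).sum(axis=1).mean()
print(mean_KE / (1.5 * kB * T))                   # -> ~1.0, as equipartition predicts
```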
Of course, one can be perverse and apply statistical mechanics to exceedingly small physical systems, such as a single particle in the microcanonical ensemble. This is a perfectly valid thing to do, and there is nothing inherently wrong with it, but in many ways it is not useful. For example, a single particle has only one microstate at each energy, so its entropy is identically zero and its temperature is not even well defined. In addition, many key results require the number of particles $N \rightarrow \infty$. But purely in principle, there is nothing wrong with looking at small systems.
To understand statistical mechanics and many-body physics, one often starts with examples that "decouple" in nice ways to yield a simple solution. This "hiding" of the details can be a bit confusing if you actually think about it. I will therefore give a longer mathematical walkthrough and some intuition at the end.
Let's ignore quantum mechanics! Consider any system of non-interacting, identical subsystems. It can be described by a Hamiltonian function:
$$H([p, q]) = \sum_{j=1}^{N} H_0(\vec{p}_j, \vec{q}_j)$$
where $N$ is the number of subsystems (this is what is related to the thermodynamic limit), $\vec{p}_j, \vec{q}_j$ are the variables describing subsystem $j$, and I've written $[p,q]$ symbolically to indicate that the Hamiltonian, of course, depends on all coordinates and momenta! Note that $H_0$ is the same function for all subsystems and that there are no "cross-terms", since the subsystems do not interact.
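To make the decoupled structure concrete, here is a small Python sketch (my own illustration; the choice of $H_0$ as a 1D harmonic oscillator is arbitrary):

```python
import numpy as np

# Small sketch: a decoupled Hamiltonian H([p, q]) that is just a sum of
# identical single-subsystem terms H0, with no cross-terms.
# The choice of H0 (a 1D harmonic oscillator, m = omega = 1) is arbitrary.

def H0(p, q, m=1.0, omega=1.0):
    """Single-subsystem Hamiltonian."""
    return p**2 / (2 * m) + 0.5 * m * omega**2 * q**2

def H_total(p, q):
    """H([p, q]) = sum_j H0(p_j, q_j); depends on all coordinates and momenta,
    but on each subsystem only through its own pair (p_j, q_j)."""
    return sum(H0(pj, qj) for pj, qj in zip(p, q))

# Example: N = 4 identical, non-interacting subsystems
p = np.array([0.1, -0.3, 0.0, 0.7])
q = np.array([1.0, 0.2, -0.5, 0.0])
print(H_total(p, q))
```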
The thermal partition function (in the canonical ensemble) can then be written as
$$ Z(T) = \mathrm{Tr}\, e^{-H/(k_B T)} = \sum_{\text{all config.}} e^{-H[p,q]/(k_B T)} $$
You may not have seen this expression in the general form of a trace, but to be concrete: the sum over configurations could, for example, be multiple integrals over all $\vec{p}_j, \vec{q}_j$ for an ideal gas:
$$Z(T) \propto \dfrac{1}{h^{3N}N!} \int \prod_j d\vec{p}_j\, d\vec{q}_j \,\exp\left(-H[p,q]/(k_B T)\right)$$
Then we do a trick! We rewrite the sum over all possible configurations as a product, using the fact that $H$ is a sum of independent subsystem terms, so that the Boltzmann factor factorizes:
$$ \dfrac{1}{h^{3N}N!} \int \prod_j d\vec{p}_j\, d\vec{q}_j\, e^{-H[p,q]/(k_B T)} = \dfrac{1}{h^{3N}N!} \prod_{j=1}^{N} \left[ \int d\vec{p}_j\, d\vec{q}_j \,\exp\left(-H_0(\vec{p}_j, \vec{q}_j)/(k_B T)\right)\right]$$
Here I just split up the exponential of the total Hamiltonian into a product of exponentials of the subsystem Hamiltonians:
$$\exp(-H/(k_B T)) = \prod_j \exp(-H_0(\vec{p}_j, \vec{q}_j)/(k_B T))$$
and shuffled the order of the multiple integrals around a bit. But then we see that this is nothing but the product of $N$ identical integrals! Up to the indistinguishability factor $1/N!$, which does not depend on the details of the subsystems and so does not affect the argument below, we have
$$Z(T) = Z_0(T)^N$$
where the single-subsystem partition function is:
$$Z_0(T) = h^{-3} \int d\vec{p}\, d\vec{q}\, \exp\left(-H_0(\vec{p}, \vec{q})/(k_B T)\right)$$
Compare the form of this to the partition function of a single harmonic oscillator written down in the image you included!
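If you want to see the factorization in action, here is a rough numerical sketch (mine, using a classical 1D harmonic oscillator as $H_0$ and convenient units): it computes $Z_0$ on a phase-space grid, then the full $N=2$ integral by brute force, and checks both $Z = Z_0^2$ and the textbook value $Z_0 = k_B T/(\hbar\omega)$.

```python
import numpy as np

# Rough numerical sketch: verify Z = Z0^N for N = 2 non-interacting classical
# 1D harmonic oscillators. Convenient units: m = omega = hbar = kB = T = 1,
# so the analytic result is Z0 = kB*T/(hbar*omega) = 1.
m = omega = hbar = kB = T = 1.0
h = 2 * np.pi * hbar
beta = 1.0 / (kB * T)

def H0(p, q):
    """Single-oscillator Hamiltonian."""
    return p**2 / (2 * m) + 0.5 * m * omega**2 * q**2

x = np.linspace(-7, 7, 41)   # phase-space grid, wide enough for the Gaussian
d = x[1] - x[0]

# Single-subsystem partition function: Z0 = (1/h) * int dp dq exp(-beta H0)
P, Q = np.meshgrid(x, x, indexing="ij")
Z0 = np.exp(-beta * H0(P, Q)).sum() * d**2 / h

# Full N = 2 partition function: brute-force sum over (p1, q1, p2, q2)
P1, Q1, P2, Q2 = np.meshgrid(x, x, x, x, indexing="ij")
H = H0(P1, Q1) + H0(P2, Q2)          # no cross-terms between subsystems
Z2 = np.exp(-beta * H).sum() * d**4 / h**2

print(Z0, kB * T / (hbar * omega))   # numerical Z0 vs analytic kB*T/(hbar*omega)
print(Z2, Z0**2)                     # full integral vs the factorized product
```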
With just one more step of the calculation, one sees that the equilibrium distribution $\rho \propto e^{-H/(k_B T)}$ (the density matrix, in the quantum case), from which all expectation values are calculated, also factorizes into a product. This means that any averages for one subsystem are statistically independent of the others. We might as well ignore the other subsystems in the description, which is intuitive, since the whole system is just $N$ copies of the same subsystem!
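As a quick illustration of that independence (again my own sketch, with harmonic oscillators), one can sample the factorized Boltzmann distribution, which is exactly what the factorization of $\rho$ licenses, and check that the two subsystems' energies are uncorrelated and that $\langle E_1\rangle = k_B T$ regardless of what subsystem 2 is doing:

```python
import numpy as np

# Quick illustration: sample the factorized Boltzmann distribution for two
# non-interacting 1D harmonic oscillators (units m = omega = kB = T = 1).
# Because rho factorizes, the subsystems' energies are statistically
# independent, and <E1> = kB*T no matter what subsystem 2 is doing.
m = omega = kB = T = 1.0
rng = np.random.default_rng(1)
n = 1_000_000

p1, p2 = rng.normal(0.0, np.sqrt(m * kB * T), size=(2, n))
q1, q2 = rng.normal(0.0, np.sqrt(kB * T / (m * omega**2)), size=(2, n))

E1 = p1**2 / (2 * m) + 0.5 * m * omega**2 * q1**2
E2 = p2**2 / (2 * m) + 0.5 * m * omega**2 * q2**2

print(np.corrcoef(E1, E2)[0, 1])   # ~ 0: the two subsystem energies are uncorrelated
print(E1.mean(), kB * T)           # <E1> matches kB*T, independent of subsystem 2
```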
Thermodynamic equilibrium is the result of a competition between mechanical energy, pressure and sheer combinatorics (which state is statistically the most common). Often this is summarized in a free energy (Helmholtz for the canonical ensemble, Gibbs if the pressure is held fixed instead). Using $Z = Z_0^N$, it can be written as:
$$F = - k_B T \log Z = - k_B T N \log Z_0$$
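As a concrete check (my addition), take the classical oscillator from above, for which $Z_0 = k_B T/(\hbar\omega)$:

$$F = -N k_B T \log\frac{k_B T}{\hbar\omega}, \qquad S = -\frac{\partial F}{\partial T} = N k_B\left(\log\frac{k_B T}{\hbar\omega} + 1\right), \qquad U = F + TS = N k_B T,$$

so every thermodynamic quantity is just $N$ times a single-oscillator contribution, and the equipartition result ($k_B T$ per classical oscillator) drops out immediately.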
Thermal equilibrium means that the free energy is minimized with respect to variations. From the expression above, it is clear that it is enough to work at the level of a single subsystem, i.e. with $\log Z_0$: the free energy per subsystem, $F/N = -k_B T \log Z_0$, does not depend on $N$, so the thermodynamic limit changes nothing! Such perfect decoupling is of course highly idealized: any physical system has at least some communication between its components. Other people are probably better at explaining exactly when it is meaningful to talk about thermal equilibrium and when it is not.
So there it is: this is why we can sometimes talk about statistical physics and ensembles even for single hydrogen atoms or harmonic oscillators.
Sources:
- L. D. Landau & E. M. Lifshitz, "Statistical Physics" (Course of Theoretical Physics, Vol. 5)
- J. P. Sethna, "Entropy, Order Parameters, and Complexity" (2011), http://pages.physics.cornell.edu/~sethna/StatMech/EntropyOrderParametersComplexity.pdf