Chemistry - charge density as a measure of lattice enthalpy & polarizing power?
Solution 1:
Atoms are not static distributions of charge, and the models we use are often heuristic approximations. It can nevertheless be useful, if approximate, to describe an atom in terms of a distribution of static charges, and to compare two different atoms by saying they differ by evenly "stretching" the charge distribution in the radial dimension.
Yes, Gauss's law applies (assuming a static charge distribution). Assume we are comparing two ions that differ only in their radial charge distribution (one being a radially stretched version of the other). Gauss's law simplifies the comparison: we need only consider the change in the distance from the ion center to a charged test particle placed at the surface of the ionic sphere, not the magnitude of the charge within the sphere (which remains constant and, for a uniform charge distribution, can also be taken as centered in the sphere). The Coulombic interaction energy falls as $1/r$ (the inverse distance between charges) and the electric field as $1/r^2$. The effect of radially scaling the distribution ($r\rightarrow a\cdot r$) is therefore to multiply the energy and the field by constants ($a^{-1}$ and $a^{-2}$, respectively). If you make an ion larger, the distance of a charged particle (outside the ionic sphere) from the center of the ion increases, and both the interaction energy and the electric field decrease in magnitude.
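As a minimal numerical sketch of this scaling (assuming SI units, a $+e$ test charge at the surface of the ionic sphere, and, per Gauss's law, treating the ion as a point charge at its center; the radius and stretch factor below are illustrative values only):

```python
import math

EPS0 = 8.8541878128e-12         # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19      # elementary charge, C
K_E = 1 / (4 * math.pi * EPS0)  # Coulomb constant, N·m²/C²

def coulomb_energy(q1, q2, r):
    """Interaction energy (J) of two point charges q1, q2 (C) separated by r (m)."""
    return K_E * q1 * q2 / r

def coulomb_field(q, r):
    """Electric field magnitude (V/m) a distance r (m) from a point charge q (C)."""
    return K_E * q / r**2

r_ion = 0.76e-10                # illustrative ionic radius, m
a = 2.0                         # radial "stretch" factor

for label, r in [("original ", r_ion), ("stretched", a * r_ion)]:
    print(f"{label} r = {r:.2e} m  "
          f"U = {coulomb_energy(E_CHARGE, E_CHARGE, r):.3e} J  "
          f"|E| = {coulomb_field(E_CHARGE, r):.3e} V/m")
# The stretched values are smaller by factors of 1/a and 1/a**2 respectively.
```

Doubling the radius ($a = 2$) halves the interaction energy and quarters the field, exactly as the $a^{-1}$ and $a^{-2}$ factors predict.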
Electric polarizability $\alpha$ (here assumed isotropic and written as a scalar - it is generally a tensor) is a measure of the extent to which charges can be separated by an applied electric field, resulting in an induced dipole moment:
$$\vec{\mu} = \alpha \vec{ E }$$
To compare atoms, consider the magnitude of the electric field required to induce an equivalent dipole moment. This is determined by the polarizability (or rather its inverse): a more polarizable atom requires a weaker field to generate the same dipole moment.
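A small sketch of that comparison (the polarizability volumes below are hypothetical, chosen only to stand in for a "small" and a "large" atom; $\alpha = 4\pi\epsilon_0\,\alpha'$ converts a polarizability volume $\alpha'$ in Å³ to SI units):

```python
import math

EPS0 = 8.8541878128e-12        # vacuum permittivity, F/m
DEBYE = 3.33564e-30            # 1 debye in C·m

def field_for_dipole(mu, alpha_volume_A3):
    """Field (V/m) needed to induce a dipole mu (C·m) in an atom whose
    polarizability volume is alpha_volume_A3 (Å^3)."""
    alpha = 4 * math.pi * EPS0 * alpha_volume_A3 * 1e-30   # convert to SI (C·m²/V)
    return mu / alpha

mu_target = 0.1 * DEBYE        # target induced dipole (illustrative)
# Hypothetical polarizability volumes for a "small" and a "large" atom:
for label, alpha_vol in [("small atom", 1.0), ("large atom", 7.0)]:
    print(f"{label}: E = {field_for_dipole(mu_target, alpha_vol):.2e} V/m")
# The more polarizable (larger) atom needs a weaker field for the same dipole.
```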
Finally, as explained on Wikipedia:
Generally, polarizability increases as the volume occupied by electrons increases.[7] In atoms, this occurs because larger atoms have more loosely held electrons in contrast to smaller atoms with tightly bound electrons.[7][8]
Solution 2:
For both of these questions, we can still understand the results by thinking about point charges and applying Gauss's law.
Charge Density and Lattice Enthalpy
Comparing two ions with the same charge but different radii - e.g. $\ce{Li+}$, which is smaller than $\ce{Rb+}$, or $\ce{F-}$, which is smaller than $\ce{I-}$ - the ionic radii will determine the effective distance between the charges (modeled as points or otherwise). The larger the radii, the further apart the charges will be, and hence the less strongly they will be electrostatically attracted to each other by Coulomb's law, with energy $E$: $$E = - \frac{k q_1 q_2}{r}$$
Hence, the smaller ions of $\ce{LiF}$ are more tightly bound, with a lattice energy of $\pu{1049 kJ/mol}$, versus the larger ions of $\ce{RbI}$, which has the same crystal structure but a lattice energy of only $\pu{632 kJ/mol}$.[1]
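A rough sketch of the $1/r$ argument, using approximate six-coordinate ionic radii (illustrative values only; this point-charge estimate ignores the Madelung constant and Born repulsion, which are comparable for the two salts since, as noted above, they share a crystal structure):

```python
# Approximate six-coordinate ionic radii in Å (illustrative values only)
radii = {"Li+": 0.76, "F-": 1.33, "Rb+": 1.52, "I-": 2.20}

def interionic_distance(cation, anion):
    """Nearest-neighbour cation-anion distance as the sum of ionic radii (Å)."""
    return radii[cation] + radii[anion]

r_LiF = interionic_distance("Li+", "F-")
r_RbI = interionic_distance("Rb+", "I-")

# In a simple point-charge model with identical charges and the same structure,
# the lattice energy scales as 1/r:
print(f"r(LiF) = {r_LiF:.2f} Å   r(RbI) = {r_RbI:.2f} Å")
print(f"predicted ratio U(LiF)/U(RbI) ~ {r_RbI / r_LiF:.2f}")
print(f"observed ratio  1049/632      ~ {1049 / 632:.2f}")
```

With these radii the simple $1/r$ model predicts $U(\ce{LiF})/U(\ce{RbI}) \approx 1.8$, in reasonable agreement with the observed $1049/632 \approx 1.7$.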
Polarising Power of Cations
As pointed out by user Buck Thorn, the induced dipole - a measure of polarisation - is proportional to the applied electric field: $$ \vec{\mu} = \alpha \vec{ E } $$ The electric field generated by the polarising cation is determined by its charge and its distance from the anion being polarised. The electric field, $\vec{E}$, at displacement $\vec r$ from a point charge $q$ is given by: $$ \vec{E} = \frac{1}{4\pi\epsilon} \frac{q \,\hat{r}}{\left|\vec r\right|^2}, $$ where $\hat{r} = \vec r / \left|\vec r\right|$.
The larger the cation, the further its centre is from the anion, the weaker the electric field it applies to the anion, and so the smaller the dipole induced in the anion.
So overall, a larger cation has a lower "polarising power", or polarising effect.
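A rough numerical sketch (treating each cation as a point charge $+e$ at its centre and evaluating its field at the centre of a touching $\ce{I-}$ anion; the ionic radii are illustrative values only):

```python
import math

EPS0 = 8.8541878128e-12        # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19     # elementary charge, C

# Approximate ionic radii in metres (illustrative six-coordinate values)
r_cation = {"Li+": 0.76e-10, "Rb+": 1.52e-10}
r_iodide = 2.20e-10

def field_at_anion_centre(cation):
    """Point-charge field (V/m) of a +1 cation at the centre of a touching I- ion."""
    r = r_cation[cation] + r_iodide        # cation-anion contact distance
    return E_CHARGE / (4 * math.pi * EPS0 * r**2)

for cation in ("Li+", "Rb+"):
    print(f"{cation}: |E| at the I- centre ~ {field_at_anion_centre(cation):.2e} V/m")
# The smaller Li+ sits closer, so its field at the anion - and hence the induced
# dipole, mu = alpha * E - is larger: higher polarising power.
```

With these radii the field from $\ce{Li+}$ at the contact distance comes out roughly 1.6 times that from $\ce{Rb+}$.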
[1] Handbook of Chemistry and Physics, 101st Edition.