Chemistry - Why do the first and second laws of thermodynamics not contradict each other?
Solution 1:
It is something of a historical accident that entropy has units of J/K. It came out of the fact that the connection between heat, temperature, and energy was not obvious to early scientists, and so they effectively picked different units for measuring temperature and for measuring energy.
In the more modern statistical interpretation of entropy, the entropy of a system is simply a number. Specifically, if the number of microstates associated with a given macrostate is $\Omega$, then $S = k \ln \Omega$. The number of microstates ($\Omega$) is just a number, without any units, and therefore so is $\ln \Omega$. You can see that we actually have to insert Boltzmann's constant, with its units of J/K, to make the units "come out right".
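To make this concrete, here is a minimal sketch (the two-state toy system is my own illustrative choice, not part of the answer): $\Omega$ is a pure count, so $\ln \Omega$ is dimensionless, and the J/K enter only through the factor of $k$.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant (exact SI value), J/K

# Toy macrostate: N distinguishable two-state particles with n "up".
# The number of microstates is the binomial coefficient, a pure number.
N, n = 100, 50
omega = math.comb(N, n)

S_dimensionless = math.log(omega)       # entropy as a plain number
S_conventional = k_B * S_dimensionless  # conventional entropy, J/K

print(f"Omega         = {omega:.3e}")
print(f"ln(Omega)     = {S_dimensionless:.2f}  (dimensionless)")
print(f"k * ln(Omega) = {S_conventional:.3e} J/K")
```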
Arguably, a more natural way to define entropy would be to just make it a dimensionless quantity: $S = \ln \Omega$, without the factor of $k$.1 This would be equivalent to measuring temperature in units where $k$ is equal to 1 exactly, rather than defining our unit of temperature such that $k = 1.380649 \times 10^{-23}$ J/K exactly. If we did this, we'd effectively be measuring temperature in units of energy as well; for example, in an ideal monatomic gas with a "temperature of 1 J", the average kinetic energy of each molecule would be $\frac{3}{2}$ J. Quantities such as the Helmholtz free energy would still have units of energy, since we'd still define $F = U - TS$, with $T$ having units of energy and $S$ being dimensionless.
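A short sketch of the unit conversion implied here (the 300 K example is mine):

```python
k_B = 1.380649e-23  # J/K

# Conventional units: temperature in kelvin, k_B converts to energy.
T_kelvin = 300.0
print(f"<KE> per atom at 300 K: {1.5 * k_B * T_kelvin:.3e} J")

# "Natural" units where k = 1: temperature is itself an energy.
T_energy = k_B * T_kelvin  # 300 K expressed in joules
print(f"300 K as an energy:     {T_energy:.3e} J")
print(f"<KE> in these units:    {1.5 * T_energy:.3e} J")  # no k needed

# Conversely, a "temperature of 1 J" is an enormous ~7.24e22 K.
print(f"1 J of temperature is   {1 / k_B:.3e} K")
```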
Of course, in this parallel universe where entropy is defined as a dimensionless number, another gen. chem. student would be asking why temperature is not the same thing as energy, even though they're measured in the same units. But that's another question and another answer.
1 In fact, entropy is defined exactly this way in information theory, since there's not really a notion of energy (or temperature) to speak of in such contexts.
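For instance, a minimal sketch of that information-theoretic definition (the helper function is mine, purely for illustration): for $\Omega$ equally likely outcomes the Shannon entropy reduces to exactly the dimensionless $\ln \Omega$ above.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p ln p), a dimensionless number (in nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over Omega outcomes: H collapses to ln(Omega).
omega = 1024
print(shannon_entropy([1 / omega] * omega))  # ~6.931
print(math.log(omega))                       # ln(1024), the same value
```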
Solution 2:
Incorrect assumptions
[OP] we see that entropy can be transferred between a system and its surroundings, with $\Delta S_\mathrm{system}=-\Delta S_\mathrm{surroundings}$
This equation is usually not correct; it holds only for a reversible process (an idealized limit in which change occurs while the system remains arbitrarily close to equilibrium). In that reversible limit you are right: the total entropy does not increase.
[OP] If entropy is a form of energy, then how can universal entropy tend to increase?
Entropy is not a form of energy. It does not even have the same dimensions. Also, there are forms of energy that increase without breaking the first law. You could have a space heater turning electrical energy into thermal energy. The first law can't be applied separately to electrical energy ("the electrical energy in the universe is constant" is not true).
[OP] If entropy is not a form of energy, then how can it be compared with actual forms of energy and measured in energy units?
It is not measured in energy units. The term $T \Delta S$ is measured in energy units. Compare speed and distance: speed multiplied by time has units of distance, but that does not mean speed is the same thing as distance, and laws about distances do not automatically apply to speeds.
Simple counterexample
If two bodies of different temperature are brought into thermal contact, they will reach thermal equilibrium (same temperature). The thermal energy lost by the hotter body is equal to the thermal energy gained by the colder body (first law). The entropy lost by the hotter body is less than the entropy gained by the colder body (entropy increases, second law).
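A quick numeric sketch of this (assuming, for simplicity, two bodies of equal and constant heat capacity; that choice is mine, not part of the answer):

```python
import math

C = 10.0                        # heat capacity of each body, J/K
T_hot, T_cold = 400.0, 200.0    # initial temperatures, K
T_final = (T_hot + T_cold) / 2  # equal capacities meet in the middle

# First law: the heat flows cancel, so total energy is conserved.
Q_hot = C * (T_final - T_hot)    # -1000 J (energy leaves the hot body)
Q_cold = C * (T_final - T_cold)  # +1000 J (energy enters the cold body)
print(f"Total energy change:  {Q_hot + Q_cold:.1f} J")

# Second law: dS = C dT/T integrates to C ln(T_final / T_initial).
dS_hot = C * math.log(T_final / T_hot)    # about -2.88 J/K
dS_cold = C * math.log(T_final / T_cold)  # about +4.05 J/K
print(f"Total entropy change: {dS_hot + dS_cold:+.2f} J/K")  # > 0
```

The heat flows sum to zero while the entropy changes do not: energy is conserved and entropy still increases, with no contradiction.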
Why there is no contradiction
Lots of quantities have the same dimensions as energy (work, heat), and there are many distinct forms of energy. The first law does not apply to any one of them individually, only to all energy combined. So applying the first law directly to entropy makes no sense, and in any case entropy is not a form of energy.
Solution 3:
Entropy is not energy. Entropy times temperature has energy units. Entropy can be regarded as a statistical property of thermodynamic systems. Although such a definition is not necessary to apply the concept, it is difficult otherwise to grasp what entropy represents. It is a measure of the number of ways you can configure a system of fixed volume, energy and composition.
It is worth repeating: entropy is a statistical property. So why can you transform it into a property with energy units through simple multiplication by temperature? Temperature is a measure of the ease with which new configurations become available through an increase in energy, or rather, the inverse temperature is:
$$\left(\frac{\partial S}{\partial U} \right)_V=\frac{1}{T}$$
At low $T$ it takes just a bit of energy to expand the number of possible configurations of the system (on a logarithmic scale). On the other hand, when a system is "hot", you have to add a lot of additional energy in order to significantly expand the number of available configurations (again on a logarithmic scale). Borrowing the definition $S \propto \log W$ provided by another answer, and noting that $\partial \ln W / \partial U = W^{-1} \, (\partial W / \partial U)$ by the chain rule, we can rewrite the last equation as
$$\frac{1}{W} \left(\frac{\partial W}{\partial U} \right)_V\propto\frac{1}{T}$$
to emphasize that $T^{-1}$ measures the relative scale of the change, that is, the rate of change of the logarithm of the number of configurations.
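A small numeric sketch of this scaling (the Einstein-solid toy model is my own illustration, not part of the answer): adding one quantum of energy raises $\ln W$ a lot when the solid holds few quanta (cold, large $T^{-1}$) and only a little when it holds many (hot, small $T^{-1}$).

```python
import math

# Einstein solid: N oscillators sharing q energy quanta have
# W(N, q) = C(q + N - 1, q) microstates.
def ln_W(N, q):
    return math.log(math.comb(q + N - 1, q))

N = 50
for q in (5, 50, 500):
    # Change in ln W per added quantum ~ (1/W) dW/dU, proportional to 1/T.
    d_lnW = ln_W(N, q + 1) - ln_W(N, q)
    print(f"q = {q:3d}: d(ln W) per quantum = {d_lnW:.3f}")
```

The printed values fall steadily as $q$ grows, which is exactly the statement that $T^{-1}$, the relative growth rate of $W$ with energy, is large for cold systems and small for hot ones.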