What is meant by "spontaneous" reactions?
Thermodynamics is in many ways a calculus of symmetries, and today we justify these nice properties by an underlying statistical mechanics, so I suppose my answer is partly the lame "more will be revealed in your statistical mechanics course, when you take it." Still, it is well worth giving the argument in a very quick way that requires only a proficiency with calculus.
Macrostates and microstates
Suppose that a system has $N$ different lowest-level configurations for some big $N$ -- a space of possibilities that we call the system's "phase space." To give an example, a cubic meter of air at room temperature and standard atmospheric pressure contains something on the order of $2.5~\cdot~ 10^{25}$ diatomic molecules, but the number of configurations that they can occupy -- each molecule having a different position, momentum, and angular momentum, limited only by the Heisenberg uncertainty principle -- is closer to $10^{2.6~\cdot~10^{26}}$ different possible configurations. So when I say that $N$ is big, I mean it is so big that we have to study it and quantify our uncertainty about it in log-space.
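To see concretely why only log-space is workable here, consider this minimal sketch. It uses a toy stand-in for the air example -- $N$ two-state "molecules," so the multiplicity is $2^N$ rather than the quoted figure -- but the point survives: the count itself cannot be represented as a number, while its logarithm is perfectly ordinary.

```python
import math

# Toy stand-in for the air example: N two-state "molecules" give a
# multiplicity of 2**N.  For N ~ 2.5e25 that number is far too large
# for any float to hold, but its logarithm is an ordinary number --
# which is why we quantify our uncertainty in log-space.
N = 2.5e25
log10_multiplicity = N * math.log10(2)      # log10(2**N) = N * log10(2)
print(f"2**N is a number with about {log10_multiplicity:.2e} digits")
```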
Of these, there are maybe $n$ different observable configurations. (This is usually a human-scale number like $10^9$ or so; our dials only have finite precision for all of the things that we'd like to measure.) So each "observable" configuration is actually a really large set of these "lowest-level" configurations, and it makes sense to ask how "big" each set is -- the number of "microstates" in each "macrostate". In what is a vast understatement, by analogy with crystals -- which are generally made up of connected "grains" of perfect lattice structure -- we call this a "coarse graining" of the phase space.
We almost always study this number of microstates per macrostate by taking its logarithm; I call this looking at it in "log-space". We take the logarithm for two reasons. First, the practical reason: as you've seen, these numbers are usually so big that only log-space gives a proper estimate of our uncertainty. Our uncertainty is way, way above even the "one significant figure" level, where we have some idea of the number of zeros following a single uncertain digit -- at these scales we are uncertain about the number of zeros themselves. Second, more theoretically: if we sit two non-interacting systems side by side and think about a macrostate of the combined system where the first is in macrostate $A$ and the second is in macrostate $B$, the overall system microstate $(a, b)$ lives in the Cartesian product $A\times B$, which has size $|A||B|.$ So the naive measure of "set bigness" -- usually called the multiplicity of the set -- is multiplicative, and taking the logarithm turns it into an additive measure.
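The multiplicative/additive bookkeeping can be sketched in a few lines, with made-up set sizes standing in for the two systems:

```python
import math

# Hypothetical multiplicities for two non-interacting systems (the
# sizes are made up purely for illustration).
size_A, size_B = 10**6, 10**9

# Microstates of the combined system are pairs (a, b), so the
# multiplicities multiply...
combined = size_A * size_B

# ...while the log-multiplicities simply add.
sigma_A, sigma_B = math.log(size_A), math.log(size_B)
print(math.log(combined), sigma_A + sigma_B)   # the same number
```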
Spontaneity
Now that we understand this, imagine that we just leave the system alone: what "spontaneously" happens? Well, our uncertainty spreads over the state of the system until we have no real idea what microstate it's in, and we can imagine that if we give it the right conditions to "isolate" it from the world (no energy transfer, no momentum transfer -- hence rigid fixed walls -- no particle transfers, etc.) then it basically chooses a microstate uniformly at random. However, this does not mean that a macrostate is chosen uniformly: macrostates are chosen in proportion to their multiplicity, $p_i = e^{\sigma_i} / \sum_j e^{\sigma_j},$ where $\sigma_i$ is the log-multiplicity of macrostate $i$. When I said above that we "almost always" use log-space, this is the one respect in which we don't. In practice the gradients on these numbers are huge and the probability is generally clustered on a small collection of macrostates.
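As a sketch of how sharply these probabilities cluster, here are made-up log-multiplicities whose gaps are minuscule compared to real thermodynamic ones, yet which already pin essentially all of the probability on the biggest macrostate. (Subtracting the maximum before exponentiating is just the standard trick to keep the exponentials from overflowing.)

```python
import math

# Made-up log-multiplicities sigma_i for a handful of macrostates.
# Real gaps are astronomically larger than these.
sigmas = [100.0, 150.0, 200.0]

# p_i = e^{sigma_i} / sum_j e^{sigma_j}, computed stably by pulling
# the largest exponent out of every term first.
m = max(sigmas)
weights = [math.exp(s - m) for s in sigmas]
probs = [w / sum(weights) for w in weights]
print(probs)   # essentially [0, 0, 1]
```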
Now if the system does not start off in this cluster of most probable macrostates, then as our uncertainties multiply it will appear to change towards that cluster. This is what we call a spontaneous transition between macrostates; it is driven purely by our becoming less certain about the underlying state. So think of, say, a box with a divider in it, red-colored water on the left and blue-colored water on the right: when we remove the divider, the colors begin to churn and mix because we are getting more uncertain about the underlying state of the world. This is precisely what we mean when we use this word "spontaneous."
For a cleaner example: we put two systems in thermal contact, but otherwise we isolate them from each other and from the rest of the world. What happens? Well, energy can flow between them, but what is the criterion for whether it does? Each system has a log-multiplicity $\sigma_{1,2},$ and because log-multiplicities are additive, a packet of energy $\delta E$ flowing from system 1 to system 2 changes the overall log-multiplicity by $$\delta \sigma_{12} = \frac{\partial \sigma_1}{\partial E_1} (-\delta E) + \frac{\partial \sigma_2}{\partial E_2} \delta E.$$ This flow will happen spontaneously if it increases the log-multiplicity, that is, if the quantity $\frac{\partial \sigma_2}{\partial E_2} - \frac{\partial \sigma_1}{\partial E_1}$ is positive. Therefore we see that there is a spontaneous energy flow from a system of lower $\frac{\partial\sigma}{\partial E}$ to a system of higher $\frac{\partial\sigma}{\partial E}$, so $\frac{\partial\sigma}{\partial E}$ is a generally-positive number measuring "coldness". Since this mirrors our experience with temperature, but exactly inverted (temperature is a positive number such that heat flows from higher temperature to lower), we define a "thermodynamic temperature" $\tau=\left(\frac{\partial\sigma}{\partial E}\right)^{-1}.$ Then energy spontaneously flows from higher temperatures to lower ones, ceasing to flow spontaneously when the two temperatures are the same.
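A toy relaxation makes this criterion concrete. Everything below is invented for illustration: I assume ideal-gas-like log-multiplicities $\sigma_i(E_i) = C_i \ln E_i$, so that $\partial\sigma_i/\partial E_i = C_i/E_i$ and $\tau_i = E_i/C_i$, and I move energy packet by packet in whichever direction raises the total log-multiplicity.

```python
C1, C2 = 3.0, 1.0            # made-up capacity-like constants
E1, E2 = 1.0, 9.0            # made-up initial energies; total is conserved
dE = 1e-4                    # size of one energy packet

for _ in range(200_000):
    # Send each packet in the direction that raises the total
    # log-multiplicity: from low d(sigma)/dE (hot) to high (cold).
    if C2 / E2 > C1 / E1:
        E1, E2 = E1 - dE, E2 + dE
    elif C1 / E1 > C2 / E2:
        E1, E2 = E1 + dE, E2 - dE

tau1, tau2 = E1 / C1, E2 / C2
print(tau1, tau2)            # the two temperatures settle together
```

The flow stops (up to one packet of jitter) exactly when the two values of $\partial\sigma/\partial E$ -- equivalently, the two temperatures -- meet.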
We sometimes have the courage of our convictions and use $\tau$ directly, which measures this thermodynamic temperature in units of energy, since $\sigma$ is a pure number and the derivative adds units of inverse energy, which we then invert again. More often we try to approximate our liquid-thermometer scales by defining $T = \tau/k_\text B$ for some constant $k_\text B,$ dividing this energy into chunks called "kelvins" so that there are 100 chunks between the freezing and boiling points of water at atmospheric pressure, because that's what the Celsius scale used for its degrees. We then find that this requires the freezing point of water to be at 273.15 kelvin, connecting the two scales completely. To simplify our equations we also usually define the "entropy" $S = k_\text B~\sigma.$
It turns out that when we do the statistical mechanics for the monatomic ideal gas, we find the Sackur–Tetrode equation, which gives the ideal gas law in the form $P~V=N~k_\text B~T.$ An ideal gas in a piston at constant pressure therefore has $P, N$ constant while $V\propto T,$ and since the volume is proportional to the piston length $L,$ the piston works as a thermometer for the above temperature $T$.
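As a numerical sanity check (not part of the original argument), one can write down the Sackur–Tetrode entropy and verify that the temperature and pressure defined above really do satisfy $PV = Nk_\text B T$. The mass, particle number, energy, and volume below are arbitrary made-up values; the derivatives are taken numerically.

```python
import math

kB = 1.380649e-23      # J/K
h  = 6.62607015e-34    # J*s
m  = 6.6e-26           # particle mass in kg (argon-like; an assumption)
N  = 1e22              # number of particles (made up)

def S(E, V):
    """Sackur-Tetrode entropy of a monatomic ideal gas."""
    return N * kB * (math.log((V / N) * (4 * math.pi * m * E
                                         / (3 * N * h**2))**1.5) + 2.5)

E, V = 50.0, 1e-3      # joules and cubic meters, made up
eps = 1e-6             # relative step for the numerical derivatives

T = 1.0 / ((S(E * (1 + eps), V) - S(E, V)) / (E * eps))  # 1/T = dS/dE
P = T * (S(E, V * (1 + eps)) - S(E, V)) / (V * eps)      # P/T = dS/dV

print(P * V / (N * kB * T))   # ~1: the ideal gas law holds
```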
The equation of state, connecting back to thermodynamics
It's also worthwhile to state explicitly what's being held constant in this partial derivative $\frac{\partial S}{\partial E}$: the boxes are not exchanging particles, and the walls are still rigid, so they are not exchanging volume or momentum; they are just exchanging energy. This gives a nice way to write the equation in terms of "differentials"; since entropy is a state function, we find that our fundamental relationship for the state of the system is $$dE = T~dS - P~dV + \sum_i \mu_i ~dN_i,$$ where the system's state is defined by its energy, its entropy, its volume, and the various numbers of particles $N_i$ that inhabit it. My labeling of the (negative of the) change of energy per unit volume at constant entropy as the "pressure" $P$ above is for obvious reasons; my labeling of the change in energy per molecule of species $i$ is known as the "chemical potential" of that molecule-species, $\mu_i.$ We see that this definition of entropy actually connects directly to those $\int \delta Q/T$ operations that prior generations were thinking about before they knew the statistical mechanics that justifies them.
Boltzmann factors and free energies
If we now suppose that one of these systems $1,2$ is much, much larger than the other -- let's say the first -- then the larger system dominates: the combined entropy $S_{12}$ is dominated by the first system's entropy $S_1,$ and its temperature $T_1$ barely changes when the smaller system draws energy from it. If we think about the smaller system transitioning from a state with energy $E_A$ to a state with energy $E_B$, this energy comes from the bigger system, whose log-multiplicity $\sigma_1$ therefore changes to $\sigma_1^{(0)} + (E_A - E_B)/(k_\text B~T_1).$ Converting back to multiplicities, the ratio of the multiplicities is just the ratio of the probabilities, and we find $$\frac{p_B}{p_A} \approx \frac{\exp\left(\sigma_1^{(0)} + (E_A - E_B)/(k_\text B~T_1)\right)}{\exp\left(\sigma_1^{(0)}\right)} = \exp\left(\frac{E_A - E_B}{k_\text B~T_1}\right).$$ We see that we should allocate probabilities for the smaller system according to $p_S = e^{-E_S/(k_\text B T_1)}/Z,$ for the normalizing constant $Z = \sum_S e^{-E_S/(k_\text B T_1)}.$ There's a lot that can be done with this so-called "partition function" $Z,$ but this post is very long already. Still, this is what it means to hold a system "at constant temperature," and it is remarkable that in such a case we can get the probabilities of the various states directly from their energy levels.
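A minimal sketch of allocating probabilities by these Boltzmann factors (the two energy levels are invented for illustration, and the helper name `boltzmann_probs` is mine, not a library function):

```python
import math

kB = 1.380649e-23   # J/K

def boltzmann_probs(energies, T):
    """p_S = exp(-E_S/(kB*T)) / Z for a small system held at temperature T."""
    weights = [math.exp(-E / (kB * T)) for E in energies]
    Z = sum(weights)                      # the partition function
    return [w / Z for w in weights]

# Made-up two-level system whose gap is exactly one thermal unit kB*T:
T = 300.0
probs = boltzmann_probs([0.0, kB * T], T)
print(probs)   # the ratio p_1/p_0 is e^{-1}
```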
What this changes most significantly is the equation of state. The equation of state allows us to specify a bunch of partial derivatives "holding $S$ constant" because of the nice appearance of $dS$ in it, which we can set to $0$. We now need to exchange this for a $dT$ so that we can get derivatives "at constant temperature" by examining the case $dT=0.$ There is an exceptionally easy way to do that: define the "free energy" $F=E - T~S.$ When we do this we find that $$dF = -S~dT - P~dV + \sum_i \mu_i~dN_i.$$ So now, for example, $-\left({\partial F\over\partial V}\right)_T$ is a valid expression for the pressure. The physical meaning of the free energy therefore becomes "the energy change, but also including the energy that flows in/out of the big system in order to keep the temperature constant."
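One can check this claim about $-\left(\partial F/\partial V\right)_T$ numerically. The sketch below uses the standard ideal-gas free energy $F = -Nk_\text B T\left[\ln\!\left(V/(N\lambda^3)\right)+1\right]$ with $\lambda$ the thermal de Broglie wavelength; the particular mass, temperature, and volume are made up.

```python
import math

kB, h = 1.380649e-23, 6.62607015e-34
m, N, T = 6.6e-26, 1e22, 300.0         # made-up argon-like parameters

lam = h / math.sqrt(2 * math.pi * m * kB * T)   # thermal wavelength

def F(V):
    # Ideal-gas Helmholtz free energy at fixed T and N.
    return -N * kB * T * (math.log(V / (N * lam**3)) + 1)

V, eps = 1e-3, 1e-6
P = -(F(V * (1 + eps)) - F(V)) / (V * eps)   # -(dF/dV) at constant T

print(P * V / (N * kB * T))   # ~1: we recover P = N kB T / V
```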
Similarly there is a free energy $H = E + P~V,$ which we call the enthalpy. It exists because $dH = T~dS + V~dP + \sum_i \mu_i~dN_i,$ trading the condition of constant volume for a condition of constant pressure. In a real sense, enthalpy differences are energy differences which include the work done on a system by a constant external pressure as the two come to the same pressure. And there is a free energy $E + P~V - T~S$ which handles constant temperature and constant pressure at the same time; that is called the Gibbs free energy. We can even subtract terms $\mu_i N_i$ to indicate that the box we're thinking about is free to exchange particles with its much larger surroundings. Each of these free energies has a name, and each of these situations has a name ending in "ensemble": we started out in the "microcanonical ensemble"; when we thermalized with a much bigger system we transitioned to the "canonical ensemble"; and if we then allow the system to exchange particles with its surroundings we get the "grand canonical ensemble," where the probabilities are now $p_S=\exp(-(E_S - \mu N_S)/(k_\text B T))/\mathcal Z$ with $\mathcal Z$ the "grand canonical" partition function, and so on.
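The grand-canonical weights from the last sentence can be sketched for a single site that can hold zero or one particles. All the numbers here are invented for illustration, and I work in units where $k_\text B T = 1$.

```python
import math

def grand_probs(states, mu, kT=1.0):
    """p_S = exp(-(E_S - mu*N_S)/(kB*T)) / Z_grand over (E_S, N_S) pairs."""
    weights = [math.exp(-(E - mu * n) / kT) for E, n in states]
    Z_grand = sum(weights)          # the grand canonical partition function
    return [w / Z_grand for w in weights]

# Empty site (E=0, n=0) versus occupied site (E=0.5, n=1), with mu=0.2:
probs = grand_probs([(0.0, 0), (0.5, 1)], mu=0.2)
print(probs)   # the occupied/empty odds are e^{-(0.5 - 0.2)} = e^{-0.3}
```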
I hope that gives you a good basis to start learning about these things.
The meaning of a spontaneous process/reaction in thermodynamics is quite straightforward. It means that, apart from the bodies taking part in the process, there are no permanent changes of any sort, and there is no need for input work.
The term spontaneous is not a synonym of possible. It is rather a synonym of natural. It relates to tendency: a spontaneous process has the tendency to happen naturally -- it does not need to be driven by work. So, for example, a gas spontaneously expands into a vacuum, meaning that it does not require any input work to do so. On the other hand, the gas will not return to its original container unless we spend some work in the process, so that is not spontaneous. As another example, heat flows spontaneously from a hotter to a colder body. On the other hand, heat flow from a colder to a hotter body requires work and therefore is not spontaneous.