What is an intuitive explanation of Gouy phase?
Several sources link to this paper: S. Feng, H. G. Winful, Physical origin of the Gouy phase shift, Optics Letters, 26, 485 (2001), which tries to give an intuitive explanation of the Gouy phase. Briefly, the point is that a convergent wave passing through a focus has finite spatial extent in the transverse plane. By the uncertainty relation, this induces a spread in the transverse wave-vector components, and consequently in the longitudinal one. The claim is that the net effect of this spread of wave vectors is an overall phase shift, which is larger for higher-order modes. However, to see that, one really needs to look at the formulas.
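This picture can be checked numerically: build the beam from its plane-wave (angular) spectrum, and the on-axis phase falls behind the plane-wave term $kz$ by exactly the arctangent Gouy term. Here is a minimal sketch along those lines (the waist size, grid, and units are arbitrary choices of mine, not from the paper):

```python
import numpy as np

lam = 1.0                      # wavelength (arbitrary units)
k = 2 * np.pi / lam
w0 = 10 * lam                  # waist radius, chosen >> lambda (paraxial regime)
zR = np.pi * w0**2 / lam       # Rayleigh range

# Angular spectrum of a Gaussian waist E(r, 0) = exp(-r^2 / w0^2):
# amplitude ~ exp(-kt^2 w0^2 / 4) over transverse wavenumbers kt < k.
kt = np.linspace(0.0, 0.999 * k, 4000)
dkt = kt[1] - kt[0]
kz = np.sqrt(k**2 - kt**2)     # each component advances with its own kz < k
spec = np.exp(-kt**2 * w0**2 / 4)

def on_axis_field(z):
    # Inverse Hankel transform of the spectrum, evaluated on axis (r = 0)
    return np.sum(spec * np.exp(1j * kz * z) * kt) * dkt

zs = np.linspace(0.0, 3 * zR, 200)
fields = np.array([on_axis_field(z) for z in zs])

# Residual phase after removing the plane-wave term k*z: the Gouy shift
phase = np.unwrap(np.angle(fields * np.exp(-1j * k * zs)))
phase -= phase[0]              # reference the phase to the waist
gouy = -np.arctan(zs / zR)     # paraxial prediction

print(np.max(np.abs(phase - gouy)))  # small compared to the pi/2 asymptote
```

The spread of $k_z = \sqrt{k^2 - k_t^2}$ over the spectrum is what slows the on-axis phase accumulation relative to a plane wave, which is Feng and Winful's point.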
For several years, I have periodically tried to wrap my head around the Gouy phase.
The Gouy phase does not occur only in laser resonators, but it actually comes into play whenever a beam of light is focused.
Addressing your second question (how did Gouy find this crazy phenomenon way back in 1890?), there is a nice discussion of his experiment in this "Progress in Optics" book. Basically Gouy took the same light source (presumably emerging from a pinhole, to give it some degree of spatial coherence) and reflected it with both a curved mirror and a flat mirror. The focusing beam overlapped with the non-focusing beam in a region near the focus and created a circular diffraction pattern. Gouy then looked at the circular diffraction pattern at several different locations, both before and after the focus. He saw that the central region of the diffraction pattern changed from light to dark, indicating a phase shift in the focusing beam - the Gouy phase shift.
So, observing the Gouy phase shift is a relatively easy experiment. Explaining it is not quite so simple.
In a 1980 article titled "Intuitive explanation of the phase anomaly of focused light beams", R. Boyd explains the Gouy phase shift in terms of the difference in propagation between a Gaussian beam and a plane wave, very similar in spirit to Gouy's experiment. He shows that the Gouy phase shift can be derived by comparing the path length along the "true" path of the light (path BCD in his figure) with the path that geometrical optics would predict (path BE).
Gaussian beam optics fundamentally incorporates the idea that the more tightly light is focused, the larger the divergence of the beam. Since the divergence cannot be infinite, there is a minimum spot size for a given wavelength of light. This behavior of light is a consequence of Heisenberg's uncertainty principle. So, in my understanding, the Gouy phase shift occurs when we compare the "quantum" behavior of light with what we would expect from geometrical optics.
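In standard Gaussian-beam notation this trade-off is explicit: a waist of radius $w_0$ has a far-field half-divergence angle

$$\theta = \frac{\lambda}{\pi w_0},$$

so the product of the transverse position spread ($\sim w_0$) and the transverse wave-vector spread ($\sim k\theta = 2/w_0$) is of order one, which is exactly the form of an uncertainty relation.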
To take this explanation one step further, we can consider higher-order modes. These modes don't reach as tight a focus as a simple Gaussian beam, so the difference between paths BCD and BE is even larger.
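The standard paraxial result makes this quantitative: a Hermite-Gaussian TEM$_{nm}$ mode picks up a Gouy phase $\psi_{nm}(z) = (n+m+1)\arctan(z/z_R)$, so the total shift through the focus is $(n+m+1)\pi$. A quick check (the mode indices here are just examples):

```python
import numpy as np

z_R = 1.0  # Rayleigh range (its value is irrelevant for the total shift)

def gouy_phase(z, n=0, m=0):
    # Gouy phase of a Hermite-Gaussian TEM_nm mode (standard paraxial result)
    return (n + m + 1) * np.arctan(z / z_R)

for n, m in [(0, 0), (1, 0), (1, 1), (2, 1)]:
    total = gouy_phase(np.inf, n, m) - gouy_phase(-np.inf, n, m)
    print(f"TEM_{n}{m}: total Gouy shift = {total / np.pi:.0f} * pi")
```

The fundamental mode accumulates $\pi$; each extra unit of mode order adds another $\pi$.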
A naive explanation says that a beam past its focal point is inverted, not only in its spatial distribution but also in the direction of the electric-field vector (a minus sign = adding $\pi$ to the phase). This is perfectly consistent with the fact that even beam profiles acquire a net phase shift of $\pi$ while odd ones do not.
However, this explanation says nothing about the behaviour of the phase near the focal point.
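For what it's worth, the full paraxial result does fill that gap: for the fundamental mode the phase anomaly is

$$\psi(z) = \arctan\left(\frac{z}{z_R}\right),$$

so the shift is not a jump at the focus but accumulates smoothly, with half of the total $\pi$ picked up within one Rayleigh range $|z| \le z_R$ of the waist.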