What is the amplitude of a light wave?
A light wave is a traveling disturbance in the electric and magnetic fields. In the far field these two components are coupled and in phase, so one can define the amplitude entirely in terms of either the electric field strength or the magnetic field strength; by convention we quote the electric field strength.
So, in SI units, the amplitude of a light wave is some number of newtons per coulomb (or, equivalently, volts per meter), and it represents the magnitude of the largest deviation of the electric field strength from its ambient value as the wave passes.
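To make this concrete, the standard plane-wave relation between time-averaged intensity and peak field amplitude, $I = \tfrac{1}{2}\,\varepsilon_0 c\,E_0^2$, lets you convert a familiar intensity figure into volts per meter. A minimal sketch (the sunlight figure of roughly $1000\,\mathrm{W/m^2}$ is an illustrative assumption, not a precise value):

```python
import math

# Vacuum constants (CODATA values)
EPS0 = 8.8541878128e-12  # permittivity of free space, F/m
C = 2.99792458e8         # speed of light, m/s

def e_field_amplitude(intensity_w_per_m2):
    """Peak electric-field amplitude E0 (V/m) of a plane wave with
    time-averaged intensity I (W/m^2), from I = (1/2) * eps0 * c * E0^2."""
    return math.sqrt(2.0 * intensity_w_per_m2 / (EPS0 * C))

# Bright sunlight at roughly 1000 W/m^2 (assumed round number)
# comes out to an amplitude of a few hundred volts per meter.
print(f"{e_field_amplitude(1000.0):.0f} V/m")
```

Running this gives an amplitude of about $870\,\mathrm{V/m}$ for full sunlight, which is a handy sanity check on the scale of the numbers involved.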
It is not practical to make such measurements in the optical range (the frequencies are around $10^{14}$–$10^{15}\,\mathrm{Hz}$ after all), but this can be (and has been) measured explicitly in the radio regime.
The electric field is of course a vector, but again in the far field and in free space it is always perpendicular to the direction of travel, so the wave is transverse. We define the direction of polarization to agree with the direction of the electric field oscillation (the ambiguity of sign on either side of zero is of no consequence).
You cannot immobilize light, but you can absorb it and convert it to heat. A first step in measuring light intensity (watts per square meter) is to make a black body to absorb the light. Then, by conservation of energy, the heating of that black body tells you how much light is being absorbed, in watts. The illuminated area of the black body supplies the "square meters" part.
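The calorimetric recipe above is just a division, but it is worth writing out; the numbers here (50 mW absorbed over a 1 cm × 1 cm patch) are assumed for illustration:

```python
def irradiance(absorbed_power_w, illuminated_area_m2):
    """Intensity (irradiance) in W/m^2 from the calorimetrically
    measured absorbed power and the illuminated area of the absorber."""
    return absorbed_power_w / illuminated_area_m2

# Assumed example: 0.05 W of heating measured on a 1 cm x 1 cm
# (1e-4 m^2) blackened patch.
print(irradiance(0.05, 1e-4), "W/m^2")  # 500.0 W/m^2
```

Note that this works only because a black body absorbs essentially everything regardless of wavelength or polarization, which is exactly why it serves as the calibration-free reference in the next paragraph.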
There are many other kinds of light sensors (electronic, acoustic, chemical) suitable for special cases, but they require different calibration curves for different light sources. Color, beam divergence, polarization, and other variables besides light intensity will affect those sensors.