What is the difference in meaning between a voltage signal and a current signal?
For a signal, generally either the voltage or the current is the quantity being controlled, and the other is a by-product that depends on the load.
Consider a normal digital logic signal running between two CMOS chips on the same board. That's a voltage signal. Only the voltage is specified. Not only is the current not specified, but it can vary hugely and isn't known by the designers of the transmitting chip since it depends on the load the receiver presents.
If the only receiver in the above example is a CMOS chip, then very little current will flow in steady state. The load is almost purely capacitive, so current will flow in short blips when the logic level changes. If instead this signal drives an LED and resistor so as to light the LED when high, then the current will be very different from the previous case. It will again be very different if the LED is wired between power and this signal (lit on logic low) instead of ground and this signal (lit on logic high).
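To put rough numbers on that, here is a minimal sketch. The values (3.3 V logic, ~2 V LED forward drop, a 330 Ω series resistor, 10 pF of gate capacitance) are assumed purely for illustration, not taken from any particular part:

```
# Rough illustration: the load, not the driver, sets the current on a voltage signal.
# All component values below are assumed for illustration only.

V_HIGH = 3.3        # logic-high voltage, volts
V_LED = 2.0         # assumed LED forward voltage, volts
R_SERIES = 330.0    # assumed series resistor, ohms
C_GATE = 10e-12     # assumed CMOS input capacitance, farads

# CMOS receiver: essentially no steady-state current, just a brief charge blip per edge.
charge_per_edge = C_GATE * V_HIGH
print(f"CMOS input: {charge_per_edge*1e12:.0f} pC moved per rising edge, ~0 A in steady state")

# LED to ground (lit on logic high): the resistor sets the current.
i_led = (V_HIGH - V_LED) / R_SERIES
print(f"LED load: about {i_led*1e3:.1f} mA, flowing while the signal is high")

# LED to the supply rail instead (lit on logic low): same magnitude, opposite logic state.
print(f"LED-to-rail load: about {i_led*1e3:.1f} mA, flowing while the signal is low")
```

Same voltage signal in every case; only the load decides how much current flows.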
Sometimes the signal value is encoded in the current, in which case the voltage ends up whatever it ends up. A common example is the industrial 4-20 mA sensor standard. The data is encoded by allowing 4 to 20 mA to flow, while the voltage can vary over a range. That allowable voltage range is called the compliance range, and a larger compliance range allows more flexible use of the device. In this case, you can't look at the voltage to get the value being transmitted.
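Reading such a loop is just a linear mapping from current to the measured quantity. A minimal sketch, assuming a hypothetical sensor whose 4-20 mA range represents 0-10 (say, bar of pressure):

```
def loop_current_to_value(i_ma, lo=0.0, hi=10.0):
    """Map a 4-20 mA loop current to an engineering value.

    4 mA maps to `lo`, 20 mA maps to `hi`. The 0-10 span is an assumed
    example range for a hypothetical pressure sensor.
    """
    if not 4.0 <= i_ma <= 20.0:
        raise ValueError("current outside 4-20 mA (broken loop or fault?)")
    return lo + (i_ma - 4.0) / 16.0 * (hi - lo)

print(loop_current_to_value(4.0))    # 0.0  -> bottom of range
print(loop_current_to_value(12.0))   # 5.0  -> mid-scale
print(loop_current_to_value(20.0))   # 10.0 -> full scale
```

A side benefit of starting at 4 mA rather than 0 mA is that a reading of 0 mA unambiguously means a broken loop rather than a zero measurement.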
Picture two things: an electrical source, and a load that connects to the (two pins of the) source.
The relation between voltage, current, and the load is fixed by Ohm's law: current = voltage / resistance.
Consequently, when the load is unknown, the source can choose between 'fixing' (setting, determining) the voltage OR fixing the current, but not both.
Hence when we want the source to convey information, it can do so either by its output voltage, OR by its output current. And that is how we name the source.
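A small numeric sketch of that point, with two arbitrary example loads:

```
# Ohm's law sketch of "fix the voltage OR fix the current, not both".
loads = [100.0, 10_000.0]   # ohms, arbitrary example loads

V_FIXED = 5.0               # a 5 V voltage-signal source
for r in loads:
    print(f"Voltage source into {r:>7.0f} ohm: V = 5.00 V, I = {V_FIXED / r * 1e3:6.2f} mA")

I_FIXED = 0.010             # a 10 mA current-signal source
for r in loads:
    print(f"Current source into {r:>7.0f} ohm: I = 10.0 mA, V = {I_FIXED * r:6.1f} V")
```

With the voltage fixed, the current swings by a factor of 100 between the two loads; with the current fixed, the voltage does. The source gets to pin down one of the two, and the load determines the other.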
Basically, a voltage signal can have any load across it and it will still output approximately the same voltage. Conversely, a current signal will give approximately the same current regardless of load.
An example of a voltage signal would be a bench-top power supply. You set it to a voltage and it tries to output as much current as necessary to maintain that voltage.
An example of a current signal would be something like an inductor with a collapsing magnetic field. It results in a temporary, pseudo-constant current. If you put a high-resistance load across it, you'll get a very high voltage.
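To see why the voltage gets so high, treat the inductor briefly as a current source and apply Ohm's law. The 10 mA coil current and the load values below are assumed for illustration:

```
# An interrupted inductor current briefly acts like a current source,
# so the voltage it develops scales with whatever resistance it sees.
i_coil = 0.010   # amps flowing in the coil just before the circuit opens
for r_load in (100.0, 10_000.0, 1_000_000.0):
    print(f"{r_load:>10.0f} ohm load -> roughly {i_coil * r_load:9.1f} V")
```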
No signal is entirely a voltage or a current signal. We simply name them that way because a given signal behaves closer to an ideal voltage source or an ideal current source.
EDIT: You'll know based on the input of the amplifier. If the input is high impedance (the gate of a MOSFET), then you're likely amplifying a voltage, because a large current isn't meant to pass through a high impedance. If the amplifier is set up with a low-impedance input, then you're amplifying a current. An example of this would be something like a current mirror/multiplier, where the input is the drain of a MOSFET or the collector of a BJT, which is low impedance.
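A rough loading-effect sketch of why input impedance is the tell. The source output impedance and the two input impedances below are assumed numbers, not from any real part:

```
# A source with output impedance R_S drives an amplifier input R_in.
V_OPEN = 1.0     # open-circuit source voltage, volts
R_S = 1_000.0    # assumed source output impedance, ohms

for r_in in (10.0, 10_000_000.0):               # low-Z input vs MOSFET-gate-like high-Z input
    v_at_input = V_OPEN * r_in / (R_S + r_in)   # simple voltage divider
    i_into_input = V_OPEN / (R_S + r_in)
    print(f"R_in = {r_in:>12,.0f} ohm: sees {v_at_input:.4f} V, draws {i_into_input*1e3:.4f} mA")

# High-Z input: nearly the full source voltage, almost no current -> a voltage amplifier.
# Low-Z input: nearly the full short-circuit current, almost no voltage -> a current amplifier.
```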
Something to note is that most amplifiers amplify voltage, so if you want to measure a current and amplify it, the current is sometimes converted to a voltage through a resistor (or other means) and then amplified with an op-amp, as sketched below.
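A minimal sketch of that resistor-then-gain approach; the 0.1 Ω sense resistor and the gain of 50 are assumed example values:

```
# "Convert the current to a voltage, then amplify the voltage."
R_SENSE = 0.1   # ohms, kept small so it barely disturbs the measured circuit
GAIN = 50.0     # assumed op-amp stage voltage gain

def measured_output(i_signal):
    """Output voltage for a given signal current, ignoring offsets and noise."""
    v_sense = i_signal * R_SENSE   # current -> small voltage across the sense resistor
    return v_sense * GAIN          # then an ordinary voltage amplifier scales it up

print(measured_output(0.5))   # 0.5 A  -> 2.5 V out
print(measured_output(1.0))   # 1.0 A  -> 5.0 V out
```

The sense resistor is deliberately small so that inserting it changes the measured current as little as possible, which is exactly the "low-impedance input" idea from the edit above.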