Voltage regulator from first principles - why is power dumped in the transistor?
I put this together from first principles without referring to any kind of reference on how voltage regulators are usually designed.
Not a good start, but you've actually ended up with almost the exact design of most linear regulators. The "first principle" you've forgotten about, though, is the MOSFET's linear region. Have you tried this thing in a simulator? The system will settle at a point where the transistor is half-on, dissipating power like a resistor.
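To put a number on that: the pass transistor carries the whole load current while dropping the input-to-output difference across itself. A rough Python sketch (the 0.5 A load is just an assumed example, not read from your schematic):

```python
# Back-of-envelope figure for the heat in the pass transistor of a linear
# regulator. The 0.5 A load is an assumed example, not taken from the schematic.
def pass_device_power(v_in, v_out, i_load):
    """The pass transistor carries the full load current while dropping
    the whole input-to-output difference across itself."""
    return (v_in - v_out) * i_load

for v_in in (7.5, 9.0, 12.0):
    p = pass_device_power(v_in, v_out=5.0, i_load=0.5)
    print(f"Vin = {v_in:4.1f} V -> about {p:.2f} W dissipated in the transistor")
```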
When V1 is less than about 7.5V, the output voltage never hits the 5V threshold, but instead hovers around 4V. I have tried this with varying loads but it simply does not function below that input voltage. What is the cause of this?
This is called the "dropout voltage". It's due to limits on how close to its supply rails the op-amp can drive its output; you lose approximately 0.7V in the op-amp's output transistor and another 0.7V to the threshold voltage of the MOSFET.
You might be able to do better with a better op-amp than the ancient, obsolete 741. In any case, what you're trying to design is called an LDO: a low-dropout regulator.
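Here's the arithmetic behind that, as a minimal sketch assuming the ~0.7 V figures above (real parts vary, and a power MOSFET usually wants more gate drive than its bare threshold):

```python
# Rough dropout estimate using the ~0.7 V figures quoted above; real parts vary,
# and a power MOSFET often needs considerably more gate drive than its threshold.
opamp_headroom = 0.7   # how far below the positive rail the 741's output tops out
mosfet_vth     = 0.7   # gate-source voltage assumed to start turning on the MOSFET
v_out          = 5.0

v_in_min = v_out + opamp_headroom + mosfet_vth
print(f"Minimum input for 5 V out ≈ {v_in_min:.1f} V")   # ≈ 6.4 V with these figures
```

With these optimistic figures the floor is around 6.4 V; the ~7.5 V you actually observe fits the later point that a real MOSFET wants a couple of volts of gate drive, and more still under load.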
Why is so much power dissipated in Q2, considering it's just switching on and off?
Because it isn't a switching regulator circuit - what you have designed is a linear regulator, so Q2 is never simply "on" or "off".
The current from V1 (input) roughly equals the current at R2 (output), despite differing voltages. This seems to match the behaviour of linear voltage regulators (is that what I just created?)
Yes, that is exactly what you've created.
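That observation is exactly why a linear regulator runs warm: with the input and output currents roughly equal, efficiency is about Vout/Vin and the rest becomes heat in Q2. A quick sketch with assumed voltages:

```python
# Sketch of why a linear regulator wastes power: with Iin ≈ Iout, efficiency
# is roughly Vout/Vin and the rest is heat in Q2. Voltages are example values.
def linear_efficiency(v_in, v_out):
    return v_out / v_in

v_out = 5.0
for v_in in (7.5, 9.0, 12.0):
    eta = linear_efficiency(v_in, v_out)
    print(f"Vin = {v_in:4.1f} V: efficiency ≈ {eta:.0%}, "
          f"about {1 - eta:.0%} of the input power heats Q2")
```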
When V1 is less than about 7.5V, the output voltage never hits the 5V threshold
You need a couple of volts on the gate (with respect to the source) to begin turning on the MOSFET. This has to come from the op-amp, and the op-amp probably "loses" about a volt on its output compared to the incoming power rail. So, if you want an output of 5 volts, you need an input supply of about 8 volts, and that's with light loads.
Under heavy loads, the gate-source voltage might need to be 3 or 4 volts. Now you will probably need an incoming supply of about 10 volts to keep the regulator output at 5 volts.
Have respect for the simple regulator, especially the low-dropout types!
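Putting those rough figures into numbers (the 1 V op-amp headroom and the Vgs values are the estimates above, not datasheet values):

```python
# Rough minimum-input arithmetic using the gate-drive figures from this answer;
# the 1 V op-amp headroom and the Vgs values are rough estimates, not measurements.
def min_input(v_out, v_gs_needed, opamp_headroom=1.0):
    """Input must cover the output, the MOSFET's gate-source voltage,
    and how far below its supply the op-amp output tops out."""
    return v_out + v_gs_needed + opamp_headroom

print("Light load:", min_input(5.0, v_gs_needed=2.0), "V")   # about 8 V
print("Heavy load:", min_input(5.0, v_gs_needed=3.5), "V")   # roughly 9-10 V
```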
The design is OK. The dropout of a FET-based LDO can be lower than that of a BJT-based one, but compensating the FET stage may demand an output capacitor with ESR in a limited range for stability, and you may have to tolerate some ripple on the feedback.
You can make it up to 98% efficient with a good choice of components: a switch with low RDS(on) and a choke with low DCR. Now you have a buck regulator.
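For a feel of where a figure like 98% comes from, here's a conduction-loss-only sketch with assumed component values (switching and gate-drive losses are ignored, so treat it as an optimistic upper bound):

```python
# Conduction-loss-only estimate for a synchronous buck converter. Component
# values are illustrative assumptions; switching, gate-drive and core losses
# are ignored, so this is an optimistic upper bound.
def buck_efficiency(v_out, i_out, r_ds_on, dcr):
    """With equal high- and low-side RDS(on), one FET is always conducting,
    so the duty cycle drops out of the switch conduction loss."""
    p_out = v_out * i_out
    p_switch = i_out**2 * r_ds_on   # loss in whichever FET is on
    p_choke = i_out**2 * dcr        # loss in the inductor's DC resistance (DCR)
    return p_out / (p_out + p_switch + p_choke)

eta = buck_efficiency(v_out=5.0, i_out=3.0, r_ds_on=0.010, dcr=0.015)
print(f"Estimated efficiency ≈ {eta:.1%}")   # ≈ 98.5% with 10 mΩ FETs, 15 mΩ choke
```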