How to properly set up a multimeter to measure the power consumption of a computer?
A computer or other appliance power supply is not a resistive load; it is a reactive load, so the current it draws has a phase relationship to the incoming alternating (AC) voltage. An AC voltage inherently averages out to essentially zero, so what is measured for power computation is the "effective" or Root Mean Square (RMS) voltage across, and current through, the appliance's power feed.
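To make the RMS idea concrete, here is a minimal sketch in plain Python, assuming a nominal 230 V / 50 Hz sine wave as an illustrative input (not a real measurement). It shows that the plain average of one AC cycle is near zero while the RMS value recovers the useful figure:

```python
import math

# Assumed example waveform: a 230 V RMS sine, sampled over one full cycle.
V_RMS_NOMINAL = 230.0
V_PEAK = V_RMS_NOMINAL * math.sqrt(2)   # ~325 V peak
SAMPLES = 1000

samples = [V_PEAK * math.sin(2 * math.pi * n / SAMPLES) for n in range(SAMPLES)]

average = sum(samples) / len(samples)                        # ~0 V: tells you nothing
rms = math.sqrt(sum(v * v for v in samples) / len(samples))  # ~230 V: what a true-RMS meter reports

print(f"average: {average:.3f} V, RMS: {rms:.1f} V")
```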
Therefore measuring the resistance across its power supply leads will not provide meaningful results.
At a simplistic level, voltage measurement could be done with an RMS voltmeter across the supply leads. See this EE.SE answer for more details.
Current measurement would need an RMS current meter, either inserted in series with the power line or used as a non-invasive clamp-type current sensor.
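As a rough illustration of what you could do with those two readings (the numbers below are made up for the example, not measurements from any particular machine), multiplying the RMS voltage and RMS current gives only the apparent power in volt-amperes:

```python
# Hypothetical RMS readings from the voltmeter and current meter above.
v_rms = 230.0   # volts
i_rms = 0.9     # amperes

apparent_power_va = v_rms * i_rms   # 207 VA: apparent power, not yet real watts
print(f"Apparent power: {apparent_power_va:.0f} VA")
```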
Low-cost AC power line meters use a basic rectifier circuit and internal computation to indicate power consumption. These are designed for specific power line types (e.g. 110 V 60 Hz, or 230 V 50 Hz) and will lose accuracy if used on a different line frequency, if they work at all.
The above does not take power factor into account, another element that affects the actual power consumption.
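To show how power factor changes the picture, here is a small continuation of the sketch above; the 0.6 power factor is an assumed illustrative value, not a figure for any real PC supply:

```python
# Continuing the hypothetical readings above.
v_rms = 230.0          # volts
i_rms = 0.9            # amperes
power_factor = 0.6     # assumed for illustration; varies widely between supplies

apparent_power_va = v_rms * i_rms                  # 207 VA
real_power_w = apparent_power_va * power_factor    # ~124 W actually consumed

print(f"Apparent: {apparent_power_va:.0f} VA, real: {real_power_w:.0f} W")
```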
The proposed multimeter approach will yield nothing useful, except possibly a damaged multimeter and a risk of electrocution if you are not qualified to work with mains voltages.
There are commercially available power meters that plug into your wall socket; you plug the appliance into the meter, and it logs or displays power consumption. That would be the recommended way to go.
I'm surprised nobody on this page has mentioned the Kill A Watt yet. You plug it into the wall, you plug your computer into the Kill A Watt, and it displays volts, amps, and wattage within 0.2 percent accuracy.