Can LEDs or optocouplers be used without a resistor if PWM is used?
The current drawn by an LED (or any diode) rises exponentially as a function of voltage, with the typical forward voltage being the onset of exponential growth. Because of this, it's more important to think of your LEDs or optocouplers as devices that require a constant current, rather than a specific voltage. You want to stay away from that exponential curve, not just to protect the LED, but also to protect your microcontroller from sourcing or sinking too much current.
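For reference, that exponential behaviour is the ideal-diode (Shockley) law, where $I_S$ is the reverse saturation current, $n$ the ideality factor, and $V_T \approx 26\ \text{mV}$ the thermal voltage at room temperature:

$$ I = I_S \left( e^{V / (n V_T)} - 1 \right) $$

Every additional $n V_T$ (a few tens of millivolts) of forward voltage multiplies the current by roughly a factor of $e$, so a small overshoot past the nominal forward voltage produces a large jump in current.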
Sometimes you get lucky and the microcontroller's internal output-pin resistance is just enough to limit the current through a particular LED. Sometimes it works out better in a current-sinking configuration. Check your datasheets.
Resistors are used to set a current limit. Higher-power LEDs, particularly those used for illumination, may be driven by a constant-current regulator to avoid flicker due to slight variations in the supply voltage; the exponential curve magnifies small changes.
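As a rough illustration of the usual sizing calculation (the 5 V supply, 2 V forward drop, and 10 mA target here are assumed example values, not from your circuit), it's just Ohm's law across the resistor:

```python
# Series-resistor sizing sketch: Ohm's law across the resistor.
# All values are assumed examples; substitute your own supply,
# LED forward voltage (from the datasheet), and target current.
V_SUPPLY = 5.0    # supply / GPIO high-level voltage (V)
V_FORWARD = 2.0   # LED forward voltage at the target current (V)
I_TARGET = 0.010  # desired LED current (A)

r = (V_SUPPLY - V_FORWARD) / I_TARGET  # ideal resistance (ohms)
p = (V_SUPPLY - V_FORWARD) * I_TARGET  # power dissipated in the resistor (W)
print(f"R = {r:.0f} ohm (pick the next standard value up), P = {p*1000:.0f} mW")
```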
Resistors are cheaper than dirt, so it's easier to put them in where they are needed than to find a workaround. If your time is worth anything, you could buy several hundred resistors in the time it takes to set up your PWM test. The experiment you reference is not really thought out. Rather than guessing at arbitrary PWM values, it would seem reasonable to conjecture that if the average power delivered to the LED over a PWM cycle is less than the forward voltage times the maximum sustained LED current,
$$ \frac{1}{T}\int_0^T V_{\text{pwm}}(t)\,I_{\text{pwm}}(t)\,\mathrm{d}t < V_{\text{forward}}\,I_{\text{max}} $$
it's probably safe(-ish), but there may still be problems from the brief high-current peaks.
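For an ideal square-wave drive the integral collapses to duty cycle times on-state power, so the check is a one-liner. A sketch with assumed example numbers (not measurements):

```python
# Average-power check for ideal square-wave PWM: the integral above
# reduces to duty * V_on * I_on. Example numbers are assumptions.
V_ON = 3.3    # voltage across the LED while the pin is high (V)
I_ON = 0.080  # uncontrolled peak current while on (A) -- the dangerous unknown
DUTY = 0.10   # PWM duty cycle (0..1)

V_FORWARD = 2.0  # datasheet forward voltage (V)
I_MAX = 0.020    # maximum sustained current (A)

avg_power = DUTY * V_ON * I_ON
limit = V_FORWARD * I_MAX
print(f"{avg_power*1000:.1f} mW vs limit {limit*1000:.1f} mW ->",
      "maybe OK on average" if avg_power < limit else "over budget")
# Even if the average passes, the 80 mA peaks can still exceed the
# LED's peak-current rating -- which is exactly the caveat above.
```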
If someone told me in a job interview that they had done that experiment, I'd show them the door. It's not worth the trouble.
By the way, neither PWM nor its frequency lowers the voltage. The duty cycle reduces the average power transmitted. Increasing the frequency allows more precise control, but it's really the duty cycle that does the work.
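Concretely, with duty cycle $D$ the LED still sees the full on-state voltage and current while the pin is high; only the time average scales:

$$ P_{\text{avg}} = D \, V_{\text{on}} I_{\text{on}}, \qquad I_{\text{peak}} = I_{\text{on}} \ \text{for any } D \text{ or frequency} $$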
In general, NO. The current without a resistor (or other current limiting) is 'out of control', and some PWM-ed fraction of 'out of control' is still 'out of control'.
Side note: the peak current allowed for normal LEDs is often only a little above the rated current (for instance 30 mA versus 20 mA), so even when you do PWM with a controlled current, do check the LED datasheet, both for the allowed average current and for the allowed maximum (peak) current.
And DO NOT use the absolute maxima section! The only section valid for normal operation is the 'normal operating conditions' section, or something similarly named.
Not easily. The resistor gives immediate and automatic control of the LED current. PWM effectively controls the mean voltage, so it would require you to measure the actual current (how? by measuring the voltage across a ... ah, resistor!) and to adjust the PWM ratio to set the current. Much more complex!
You could use a low-value resistor to limit the current to a reasonable value - say 20 mA (the upper limit for a typical LED) - and PWM down from that...
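A sketch of that approach, with assumed component values: the resistor fixes the safe peak current, and the duty cycle then sets the average (perceived brightness):

```python
# Resistor sets the safe peak current; PWM duty then scales the average.
# All values are example assumptions.
V_SUPPLY = 5.0
V_FORWARD = 2.0
I_PEAK = 0.020                        # resistor-limited peak current (A)

r = (V_SUPPLY - V_FORWARD) / I_PEAK   # = 150 ohm for these numbers
I_AVG_TARGET = 0.005                  # desired average current (A)
duty = I_AVG_TARGET / I_PEAK          # required duty cycle, 0..1
print(f"R = {r:.0f} ohm, duty = {duty:.0%}")
# On an 8-bit PWM peripheral that duty would correspond to a
# compare value of roughly duty * 255.
```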