The current drawn by an LED (or any diode) rises exponentially with the applied voltage; the quoted forward voltage is roughly where that exponential growth takes off. Because of this, it's better to think of your LEDs or optocouplers as devices that require a constant current rather than a specific voltage. You want to stay off the steep part of that exponential curve, not just to protect the LED, but also to protect your microcontroller from sourcing or sinking too much current.
Sometimes you get lucky and the microcontroller's internal output-pin resistance is just enough to limit the current through a particular LED. Sometimes it works out better in a current-sinking configuration. Check your datasheets.
Resistors are used to set a current limit. Higher-power LEDs, particularly those used for illumination, may instead be driven by a constant-current regulator to avoid flicker caused by slight variations in the supply voltage; the exponential curve magnifies those small changes into large current swings.
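As a rough illustration, the standard Shockley diode equation (where $I_S$ is the saturation current and $n$ the ideality factor, both device-dependent) makes the sensitivity explicit:

$$
I = I_S\left(e^{V/(n V_T)} - 1\right),\qquad V_T \approx 25\,\mathrm{mV}\ \text{at room temperature}
$$

For $n = 1$, every extra ~60 mV of forward voltage multiplies the current roughly tenfold, which is why even a small supply wobble produces visible brightness changes without current regulation.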
Resistors are cheaper than dirt, so it's easier to put them in where they are needed than to find a workaround; if your time is worth anything, you could buy several hundred resistors in the time it takes to set up your PWM test. The experiment you reference is not really thought out. Rather than guessing at arbitrary PWM values, it would seem reasonable to conjecture that if the average power delivered to the LED over a PWM cycle is less than the forward voltage times the maximum sustained LED current,
$$
{1 \over T}\int_0^T V_{pwm}(t)I_{pwm}(t)\,\mathrm{d}t < {V_{forward} I_{max}}
$$
it's probably safe(-ish), but there may still be problems from the brief high-current peaks.
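To see how the peaks bite, note that for an ideal square wave with duty cycle $D$, on-state voltage $V_{on}$ and on-state current $I_{on}$, the integral collapses to

$$
{1 \over T}\int_0^T V_{pwm}(t)I_{pwm}(t)\,\mathrm{d}t = D\, V_{on} I_{on}
$$

With hypothetical numbers, $D = 25\%$ at $V_{on} = 5\,\mathrm{V}$ and $I_{on} = 80\,\mathrm{mA}$ averages to $0.1\,\mathrm{W}$, already well over the $0.04\,\mathrm{W}$ limit of a $2\,\mathrm{V}$, $20\,\mathrm{mA}$ LED, even though the LED is off three quarters of the time.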
If a candidate told me in a job interview that they had run that experiment, I'd show them the door. It's not worth the trouble.
By the way, neither PWM nor its frequency lowers the voltage. The duty cycle reduces the average power delivered. Increasing the frequency allows finer control, but it's really the duty cycle that does the work.
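A minimal sketch of duty-cycle dimming, assuming an Arduino-class board with a current-limiting resistor already in series (the pin number is an arbitrary choice for illustration):

```cpp
// The LED's average power is set by the duty cycle passed to analogWrite(),
// not by the PWM frequency. The peak current on every "on" interval is
// still the full, resistor-limited current, so the resistor stays.
const int LED_PIN = 9;  // any PWM-capable pin; assumption for illustration

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  analogWrite(LED_PIN, 64);   // 64/255 ≈ 25% duty cycle: dim
  delay(1000);
  analogWrite(LED_PIN, 255);  // 100% duty cycle: full brightness
  delay(1000);
}
```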
They definitely won't work from USB if connected in series: even two LEDs in series would need more forward voltage than the 5 V a USB port provides (assuming a typical white-LED drop of about 3 V each).
If you connect a ~100 ohm resistor in series with each LED, you may be able to run about five LED/resistor sets connected in parallel from a USB port. Without negotiation, a USB port is only guaranteed to supply up to 100 mA (although many USB ports have no current control, and may supply 500 mA or more).
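The ~100 ohm figure follows from the usual series-resistor formula, assuming a 5 V USB rail, a ~3 V forward drop (typical for white or blue LEDs), and a 20 mA target:

$$
R = {V_{supply} - V_{forward} \over I_{LED}} = {5\,\mathrm{V} - 3\,\mathrm{V} \over 20\,\mathrm{mA}} = 100\,\Omega
$$

Five such branches in parallel then draw $5 \times 20\,\mathrm{mA} = 100\,\mathrm{mA}$, exactly the unnegotiated USB budget.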
According to the comments on Amazon, there is no recommended maximum current data for these LEDs, so I'm guessing 20 mA per LED would be acceptable.
Best Answer
LEDs are current-controlled devices. Either you will surely fry them sooner or later (if you don't control the current), or the LEDs won't light at all (if the voltage is insufficient).