Using a transistor to get 100 mA on an IR LED
Increasing the power to the LED isn't the best way of increasing the range. In fact, it's the worst way.
The 100 mA rating applies only at a small duty cycle: you can drive the LED at 100 mA, but only for a fraction of the time. The overall average light output won't be any higher than at a continuous 30 mA.
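As a quick sanity check, the average forward current is just the peak current scaled by the duty cycle (the 30 % figure below is purely illustrative):

$$I_\text{avg} = D \cdot I_\text{peak} = 0.3 \times 100\ \text{mA} = 30\ \text{mA}$$

So pulsing at 100 mA with a 30 % duty cycle delivers no more average light than running continuously at 30 mA.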
The problem you are actually suffering from is that your receiver can't distinguish the LED's light from the background light. That is simple to overcome, though: modulate your LED at a high frequency (typically 30 kHz or higher) and apply high-pass filtering and amplification at the receiver.
Exactly the same way that remote controls do it.
That way you can distinguish the LED (flashing at a known frequency) from the background light (static brightness, or changing only slowly by comparison).
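As a concrete illustration, here is a minimal Arduino-style sketch that gates a 38 kHz carrier on and off; the pin number and burst timing are assumptions for illustration, not part of the original circuit:

```cpp
// Drive the transistor base (through its base resistor) with a 38 kHz
// carrier, switched on and off in bursts. The receiver's high-pass/
// band-pass stage then picks these bursts out of the ambient light.

const int IR_PIN = 3;  // assumed pin feeding the base resistor

void setup() {
  pinMode(IR_PIN, OUTPUT);
}

void loop() {
  tone(IR_PIN, 38000);  // carrier on: LED flashes at 38 kHz
  delay(1);             // ~1 ms burst
  noTone(IR_PIN);       // carrier off
  delay(1);             // ~1 ms gap; the burst/gap pattern carries the data
}
```

On the receiving side, a 38 kHz receiver module (TSOP-style) does the filtering, amplification, and demodulation for you and outputs a clean logic-level signal.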
Here is a diagram from Peter Bennett's answer to an Electronics Stack Exchange question, "How much voltage to give my IR LED?":
As you can see, the current-limiting resistor is in series with the IR LED and the transistor's collector. The base resistor is large relative to the current-limiting resistor because the transistor amplifies its base current by a factor on the order of 100 (its hFE).
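For example, with assumed round numbers (5 V supply, 1.5 V LED forward drop, 0.2 V collector-emitter saturation voltage; none of these come from the diagram itself), the series resistor for 100 mA works out to:

$$R_\text{series} = \frac{V_{CC} - V_F - V_{CE(\text{sat})}}{I_{LED}} = \frac{5\ \text{V} - 1.5\ \text{V} - 0.2\ \text{V}}{100\ \text{mA}} = 33\ \Omega$$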
The 1000 Ω base resistor is fine for loads up to around 100 mA; if you plan to drive a larger load, use a smaller base resistor. As noted in the 2N3904 datasheet, hFE may be as low as 30 when IC = 100 mA. To ensure the transistor saturates, arrange that hFE · (VCC − VBE)/RB exceeds the desired collector current.
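Plugging in assumed round numbers (5 V drive on the base resistor, 0.7 V base-emitter drop):

$$I_B = \frac{5\ \text{V} - 0.7\ \text{V}}{1000\ \Omega} \approx 4.3\ \text{mA}, \qquad h_{FE} \cdot I_B = 30 \times 4.3\ \text{mA} \approx 130\ \text{mA}$$

Even at the worst-case hFE of 30, that comfortably exceeds the 100 mA target, so the transistor stays saturated.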
The 2N2222's specifications are similar to those of the 2N3904 mentioned in the question, but its hFE falls off more slowly as IC increases.
See the Electronics Stackexchange question linked above for more discussion and further links.