The 2N4401 will do fine in your circuit. However, I would ditch the capacitor across the relay. Just about any regular diode will do, but the one you have is inappropriate. That is a zener diode meant to act as a voltage reference. You want an ordinary silicon or possibly Schottky diode.
To calculate the minimum specs required, you first have to identify the relay you want to use. The relay datasheet will tell you the current it requires at 12 V. That tells you directly the current the transistor must be able to handle. Most small 12 V relays require 15-50 mA, which is well within the capability of a 2N4401.
You also have to think about how much collector current the transistor can support for the base current you are giving it. Your sketch shows a 1 kΩ base resistor. Figure the B-E drop as 700 mV, which leaves 4.3 V across the resistor when the relay is supposed to be on. That means there will be 4.3 mA into the base. Figure a 2N4401 can be relied on to have a gain of at least 50. Your specs say 100-300 for Hfe, which is another way to say gain, but what you posted is a dumbed-down snapshot of the datasheet. In any case, 4.3 mA times 50 is 215 mA, which is lots more than any reasonable "small" 12 V relay is going to require, so all is fine there.
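To make the arithmetic above explicit, here is a quick sketch of the drive calculation. The 5 V control voltage is my assumption based on the 4.3 V left across the resistor; the gain of 50 is a deliberately conservative figure, not a datasheet guarantee.

```python
# Base-drive calculation for the relay driver (values from this answer).
V_CTRL = 5.0      # assumed control voltage driving the base resistor, V
V_BE = 0.7        # transistor B-E drop, V
R_BASE = 1000.0   # base resistor from the sketch, ohms
GAIN_MIN = 50     # conservative guaranteed gain for a 2N4401

i_base = (V_CTRL - V_BE) / R_BASE        # current into the base, A
i_collector = i_base * GAIN_MIN          # collector current we can count on, A

print(f"base current: {i_base * 1000:.1f} mA")        # 4.3 mA
print(f"supported collector current: {i_collector * 1000:.0f} mA")  # 215 mA
```

As long as the relay coil draws less than that 215 mA figure, the transistor will be driven well into saturation.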
The diode has to be able to take up to the relay current in forward mode, and block at least the power voltage in reverse mode. A 1N4148 is a common small signal diode that can do this. Those usually top out at around 50-75 mA (there are a lot of variants out there), but again, that is still more than a reasonable relay will require.
You may be in luck. On eBay you can find 3 W IR LEDs quite cheaply, such as "10pcs 3w 850nm infrared IR LED for night vision camera with 20mm Star PCB". Your AA cells must be long-life alkalines or NiMH rechargeables. Two AAs in series should drive a single LED through a 1.5 to 2 Ω limiting resistor for a couple of hours, and the resistor must be rated 2 W or higher. Yes, I know, it's grossly inefficient, but it's simple and ought to work without requiring more electronics experience than I suspect you have.
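Here is a rough check of where the 2 W resistor rating comes from. The LED forward voltage of about 1.5 V is my assumption for an 850 nm emitter at high current; check the actual part's datasheet before trusting these numbers.

```python
# Rough LED current and resistor dissipation for two AA cells in series.
V_BATT = 3.0   # two AA cells in series, V (fresh alkaline or NiMH under load)
V_F = 1.5      # assumed LED forward voltage at high current, V
R = 1.75       # limiting resistor, ohms (middle of the 1.5-2 ohm range)

i_led = (V_BATT - V_F) / R     # LED current, roughly 0.86 A
p_resistor = i_led**2 * R      # resistor dissipation, roughly 1.3 W

print(f"LED current: {i_led:.2f} A")
print(f"resistor dissipation: {p_resistor:.2f} W")
```

About 1.3 W burned in the resistor is why a 2 W or larger part is needed, and also why this approach is so inefficient.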
Let's run a few numbers.
Let's say your camera's field of view is 30 degrees. A single pixel then covers 30 / 2048, or about 0.015 degrees. At a range of 50 meters this corresponds to about 13 mm, assuming perfect focus. Sunlight has a maximum intensity of about 1 kW per square meter, so the light reflected from a very white surface onto one pixel will have a maximum power of about 0.17 W total. IR makes up about half of sunlight's power, and let's say the camera response for a red pixel includes 20% of the visible and 20% of the IR. Then the overall red-pixel power will be roughly 0.07 W per pixel. Since the datasheet for the sensor you gave has no data at all about the response in IR, this is purely speculation, but it's the best I can do. Certainly the camera can't see all that far into the IR; if it can, it will accept more IR power, which is bad for you.
Now think about the LED. Let's say the LED has an efficiency of 20%, that is, it puts out 0.6 W of IR. This is probably close. The emitter size is clearly less than a pixel, so you can compare it directly to the reflected sunlight. It will actually be somewhat higher than that if you are within the LED's 135-degree field of view.
So the LED puts out about 9 times more power than the camera ought to see from reflected sunlight, assuming no specular surfaces. Like I say, you may be in luck.
If you go with a wider field of view, the pixel size increases, so the sunlight power/pixel increases, but the emitted LED power does not. So, for instance, going to a 60 degree field of view should cut your LED/sunlight ratio to about 2 to 1, which may or may not be adequate for reliable detection.
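The ratio argument above can be sketched the same way. The 0.07 W per pixel and 0.6 W figures are the rough estimates from this answer, not measurements, so treat the output as an order-of-magnitude check only.

```python
# LED-to-sunlight ratio and how it degrades with a wider field of view.
SUN_PER_PIXEL_30 = 0.07   # estimated red-pixel sunlight power at 30 deg FOV, W
LED_IR = 0.6              # assumed IR output of the 3 W LED at 20% efficiency, W

ratio_30 = LED_IR / SUN_PER_PIXEL_30   # roughly 9:1 at a 30 degree FOV

# Doubling the FOV doubles the pixel's linear footprint, so the patch area
# (and the sunlight power per pixel) goes up 4x; the LED power does not.
ratio_60 = LED_IR / (SUN_PER_PIXEL_30 * 4)   # roughly 2:1 at 60 degrees

print(f"30 deg FOV: about {ratio_30:.0f}:1")
print(f"60 deg FOV: about {ratio_60:.1f}:1")
```

That 4x area penalty is why the wider field of view eats most of the margin.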
This is all very rough, and some of my assumptions may be off, but the numbers look close enough to warrant spending a few bucks on some LEDs. But don't go promising that you can make it work until you've actually tried it.
ETA - I thought I'd mention that the specific IR LED I linked to may not be acceptable. Without knowing the response of the camera, I'd recommend getting an LED as close to visible as possible.
The answer depends on the type and condition of the diode you measure. Typically, a good, forward-biased silicon diode will show a voltage drop of 0.5 V to 0.8 V.
A good, forward-biased germanium diode will have a voltage drop measurement of 0.2V to 0.3V.
It's usually best to test the diode out of circuit.
You should also test the diode in both directions to ensure that it blocks current in one direction: when the diode is reverse-biased, a good diode conducts essentially no current, and the meter will typically read over-range ("OL") rather than a voltage drop.