For independent accurate clocks, you can use GPS receivers or long-wave time-signal receivers, such as those for WWVB.
Another option is to put reasonable crystal oscillators on each unit and sync them up as part of the data upload process. The system receiving the data knows the current absolute time and extrapolates backwards using the data logger's current time and time stamps. If it knows the absolute time of the last upload, then it could try to spread whatever error it finds evenly over the last interval.
More detail on the second solution:
You can put reasonably decent relative time measurement into each data logger. For example, 32768 Hz "watch" crystals are cheap, low power, and available with good accuracy. You can easily get one good to 20 ppm.
Let's say data is uploaded from each unit about once a month. In between uploads, the unit timestamps each data record from its local clock. These timestamps are not synchronized to real time, but accumulate error at a maximum of 20 ppm with respect to each other. 20 ppm over one month is about one minute.
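As a quick sanity check on that figure, a couple of lines of Python:

```python
# Worst-case drift of a 20 ppm crystal over one month (30 days).
ppm = 20e-6
seconds_per_month = 30 * 24 * 3600
print(ppm * seconds_per_month)   # 51.84 seconds, i.e. about one minute
```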
The computer receiving the uploaded data knows the real time and the data logger's current time, and can therefore work backwards to find the real time of each data record, with up to 20 ppm of accumulated error going backwards. For example, the oldest data from one month ago is only known to within about 1 minute, the data from half a month ago to within half a minute, and so on. If 1 minute of error is acceptable, then little more needs to be done.
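A minimal sketch of that backward correction, assuming the host reads the logger's current clock at upload time (the names here are hypothetical, not from any particular logger API):

```python
def correct_timestamps(records, logger_now, real_now):
    """Shift raw logger timestamps onto real time.

    records    -- list of (logger_time, data) pairs from the log
    logger_now -- the logger's local clock read at upload
    real_now   -- the host's absolute time at the same moment

    Residual error grows going backwards from the upload, at up to
    20 microseconds per second of record age for a 20 ppm crystal.
    """
    offset = real_now - logger_now
    return [(t + offset, data) for t, data in records]
```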
However, by storing the real time of the upload event, and of course timestamping it as usual, the worst case error can be significantly reduced. This only requires writing a single event into the log after each upload. That event will be the first one in the log at the next upload. At each upload, then, you know the absolute time at both the start and the end of the uploaded data. The worst case drift from a known time is now in the middle of the upload period, which is only half a month from a reference, or only about half a minute off.
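To picture what the log looks like under this scheme, here is a hypothetical layout (the record format is made up for illustration):

```python
# Logger times are seconds on the local 32768 Hz clock. The "sync"
# record written right after the previous upload carries the real
# time the host reported, giving a known absolute time at the start
# of this month's data. The upload itself supplies the one at the end.
log = [
    (1000.0,   ("sync", 1717200000.0)),  # real time of last upload
    (86400.0,  ("sample", 23.4)),
    (172800.0, ("sample", 23.1)),
    # ... a month of samples ...
]
```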
Even better, the error is now independent of the absolute error of the local clock. Time errors depend only on shifts in the local clock frequency. Since you know the real time at the beginning and end of the uploaded data, you can determine how fast or slow the local clock ran during that time and account for it. For example, if the local oscillator was exactly 20 ppm fast the whole time, then the timestamps will show a period 1 minute longer than the known time between the start and end of data. If you linearly proportion the timestamp error so as to make the ends fit (since they are known), the data in the middle will be exact as long as the local oscillator's frequency didn't change. In the middle of the month, you subtract about half a minute, and so on.
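A sketch of that two-point correction, again with hypothetical names: t0 and t1 are the logger clock readings at the previous and current sync points, and r0 and r1 the corresponding known real times:

```python
def linear_correct(records, t0, r0, t1, r1):
    """Map logger timestamps onto real time through the two known
    (logger_time, real_time) sync points. A constant frequency error
    in the crystal cancels out; only frequency *changes* during the
    interval remain as timestamp error."""
    rate = (r1 - r0) / (t1 - t0)   # real seconds per logger second
    return [(r0 + (t - t0) * rate, data) for t, data in records]
```

If the crystal ran exactly 20 ppm fast the whole month, rate comes out to about 0.99998 and the half-minute mid-month error discussed above is corrected away.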
Crystals are often specified with an absolute frequency error and a drift over time. With the second scheme, the absolute error is cancelled out. Any remaining time error is only a function of how much the crystal's frequency changed during the last month, which can be significantly less than its average absolute error.
Often, cameras can pick up infrared, so if a visible-light LED is out of the question (I think it could be quite dim and unobtrusive), an IR LED might be another option.