I'm building my own laser tag system and have been looking at existing open-source solutions.
The MilesTag protocol uses 600us as the basic pulse length. This, however, limits the fire rate to under 2000 rounds per minute (RPM), while real miniguns reach 6000 RPM (100 rounds per second). Are there any reasons I'm not aware of for the 600us pulse?
Will my own solution be less reliable if I go with, say, 300us pulses? Any lower won't work with the IR receiver I'm working with, because it needs at least 10 cycles in a burst at a 40kHz modulation frequency (10 cycles at 40kHz is 250us), but a 300us pulse would let me double my RPM.
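For reference, my back-of-envelope math. The packet layout is my reading of MilesTag (2400us header, "0" = one base pulse, "1" = two base pulses, a one-base-pulse gap after each bit, 14 data bits worst case all ones); the 2400us inter-packet quiet period is my own assumption, not from the spec:

```python
def packet_time_us(base_us, n_bits=14):
    """Worst-case on-air time for one shot packet, in microseconds."""
    header = 4 * base_us            # 2400us header when base_us = 600
    worst_bit = 2 * base_us + base_us  # "1" mark plus inter-bit gap
    gap = 4 * base_us               # assumed quiet period between packets
    return header + n_bits * worst_bit + gap

def max_rpm(base_us):
    return 60_000_000 / packet_time_us(base_us)

print(max_rpm(600))  # 2000.0 -- matches the "under 2000 RPM" ceiling
print(max_rpm(300))  # 4000.0 -- halving the base pulse doubles the rate
```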
Best Answer
Some relevant background is given on the MilesTag Data Protocol page (covering the original version of the protocol, which uses the same timing).
So I suspect the original design decision was to pick a low-level encoding method known to be compatible with a wide range of IR receivers. You seem to have determined that the receiver you're using can demodulate a signal in ten cycles, so I don't see any reason in principle why the higher data rate won't work. Bear in mind, though, that typical IR receiver modules lose some sensitivity (and therefore range) as the burst length approaches their rated minimum, and their AGC needs a minimum gap between bursts.
It may just be one of those things you have to try and see what difference it makes in practice. If it were me, for a start I'd make the transmitter pulse duration easily adjustable using a few buttons and an LCD, and do the same on the receiver end. Then you could also use the LCD to display the number of packets received over a known period of time, to measure the packet-loss percentage at various data rates and under different conditions.
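The receiver side of that test rig boils down to classifying each measured mark duration against the currently configured base pulse. A minimal sketch; the +/-25% tolerance window and the symbol names are my assumptions, not part of MilesTag:

```python
def classify(measured_us, base_us, tol=0.25):
    """Map a measured mark duration to a symbol for the configured base
    pulse width: header = 4x base, "one" = 2x base, "zero" = 1x base.
    Anything outside every window counts toward the packet-loss stat."""
    for symbol, nominal in (("header", 4 * base_us),
                            ("one", 2 * base_us),
                            ("zero", base_us)):
        if abs(measured_us - nominal) <= nominal * tol:
            return symbol
    return None

print(classify(610, 600))   # 'zero'
print(classify(1250, 600))  # 'one'
print(classify(330, 300))   # 'zero' -- same code works at the faster timing
```

At 25% tolerance the three windows don't overlap, so the match order doesn't matter; widen the tolerance much past that and they start to.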
Another idea, somewhat "outside the box": if you're coding your own protocol, you could add a rolling sequence number to each packet. If you received two packets with consecutive sequence numbers while in "minigun" mode, you could assume they represent, say, three hits. Of course, depending on the length of the sequence number, there's a certain probability that a later, unrelated packet lands on the next sequence number by chance rather than being a consecutive hit.
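A sketch of that sequence-number trick, assuming a 4-bit rolling counter and "3 hits per consecutive pair" (both numbers are arbitrary choices for illustration, not from any spec):

```python
SEQ_BITS = 4
SEQ_MOD = 1 << SEQ_BITS  # counter wraps 15 -> 0

def hits_implied(prev_seq, new_seq, minigun_mode=True):
    """If two received packets carry consecutive sequence numbers while in
    minigun mode, credit the shots lost in between (3 hits per pair here)."""
    if minigun_mode and new_seq == (prev_seq + 1) % SEQ_MOD:
        return 3
    return 1

print(hits_implied(7, 8))    # 3 -- consecutive, so infer the missed shots
print(hits_implied(15, 0))   # 3 -- wraparound still counts as consecutive
print(hits_implied(7, 11))   # 1 -- a gap, so only the packet we actually saw
```

With 4 bits, an unrelated later packet hits the "next" number purely by chance 1 time in 16 (1 / 2**SEQ_BITS); more bits shrink that false-positive rate at the cost of packet length.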