The point of a (hard) real-time system is that it must do certain things within a fixed time. If it does them later, even by a little bit, the system is useless. Think of a car airbag that deploys after your head has hit the steering wheel.
A real-time OS must give hard guarantees on the maximum time its own operations take: interrupt latency, task-switch time, and so on. Likewise, the libraries that come with it must provide such guarantees.
Non-real-time OSes generally try to maximize average throughput. Consider three disk reads that can be done either in 1 ms, 1 ms, and 5 ms, or in 3 ms, 3 ms, and 3 ms. A non-real-time OS will tend to prefer the first option, because the total work is done in 7 ms rather than 9 ms. A real-time OS will probably prefer the second, because the worst-case disk-read delay is 3 ms instead of 5 ms.
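If it helps, the same comparison in a few lines of C, with the latency figures taken straight from the example above:

```c
#include <stdio.h>

int main(void)
{
    /* Disk-read latencies in ms, from the example above. */
    int option1[] = {1, 1, 5};  /* what a non-real-time OS prefers */
    int option2[] = {3, 3, 3};  /* what a real-time OS prefers     */

    int total1 = 0, max1 = 0, total2 = 0, max2 = 0;
    for (int i = 0; i < 3; i++) {
        total1 += option1[i];
        if (option1[i] > max1) max1 = option1[i];
        total2 += option2[i];
        if (option2[i] > max2) max2 = option2[i];
    }

    /* Non-real-time: minimize the total (7 ms beats 9 ms).
       Real-time: minimize the worst single read (3 ms beats 5 ms). */
    printf("option 1: total %d ms, worst read %d ms\n", total1, max1);
    printf("option 2: total %d ms, worst read %d ms\n", total2, max2);
    return 0;
}
```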
As far as I know, Linux is not a real-time operating system, and it is not likely it ever will be, because its general use is as a desktop or server OS. Both uses benefit more from high throughput than from an upper bound on response time.
The answer to both of your questions depends on the operating system.
Choosing the operating system is a whole question in itself, and that is probably where you should start.
As your application seems very simple and may have real-time constraints, I suggest you analyse the possibility of implementing it bare-metal.
But if an operating system is required, take a look at FreeRTOS; it might interest you!
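As an illustration only, here is roughly what a periodic task could look like in FreeRTOS; the task name, period, stack size, priority, and doWork() are placeholder assumptions, not anything from your question:

```c
#include "FreeRTOS.h"
#include "task.h"

/* Placeholder for the application's real job. */
static void doWork(void) { }

/* Hypothetical periodic task: wakes every 10 ms regardless of how long
   the work itself takes. */
static void vSensorTask(void *pvParameters)
{
    (void)pvParameters;
    TickType_t xLastWake = xTaskGetTickCount();
    const TickType_t xPeriod = pdMS_TO_TICKS(10);

    for (;;) {
        doWork();
        /* Block until exactly one period after the previous wake-up, so
           timing error does not accumulate as it would with vTaskDelay(). */
        vTaskDelayUntil(&xLastWake, xPeriod);
    }
}

int main(void)
{
    /* 256 words of stack and priority 2 are arbitrary example values. */
    xTaskCreate(vSensorTask, "sensor", 256, NULL, 2, NULL);
    vTaskStartScheduler();  /* does not return if the scheduler starts */
    for (;;) { }
}
```

The choice of vTaskDelayUntil() over vTaskDelay() is exactly the real-time concern discussed above: it keeps the period fixed instead of adding the task's own run time on top of the delay.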
Best Answer
A real-time system has a constraint set such as: the system must respond to an event within 10 ms. Or: the system must run a PID control loop at a fixed rate of 10 kHz, which implicitly sets a period of 100 µs.
Both are well-defined maximum time limits. However, they are not exactly deterministic: one PID loop update may take 10 µs while the next takes 15 µs. Nor do they say anything about the elapsed time between updates, i.e. the "phase".
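A common way to meet such a constraint on bare metal is to drive the update from a hardware timer interrupt, so the 100 µs period comes from the timer rather than from how long the computation took. A rough sketch, where the timer configuration, the hardware hooks, and the gains are all assumptions for illustration:

```c
#include <stdint.h>

/* Hypothetical PID state; the gains are placeholder values. */
static float integral, prev_error;
static const float kp = 1.0f, ki = 0.1f, kd = 0.01f;
static const float dt = 100e-6f;   /* 100 us period -> 10 kHz rate */

/* Assumed hardware hooks, to be provided by the platform code. */
extern float read_sensor(void);
extern void  write_actuator(float u);
extern float setpoint;

/* ISR fired every 100 us by a hardware timer (setup not shown). The
   period is fixed by the timer, so even if one update takes 10 us and
   the next takes 15 us, the *start* of each update stays on the grid. */
void timer_isr(void)
{
    float error = setpoint - read_sensor();
    integral += error * dt;
    float derivative = (error - prev_error) / dt;
    prev_error = error;
    write_actuator(kp * error + ki * integral + kd * derivative);
}
```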
This "jitter" can be a problem in some systems. Deterministic describes that the "noise" is very low; i.e. it is very predictable how long an algorithm will run, it will not vary a lot (or it can be determined and compensated for) given varying inputs or states of the program.
Determinism can be important when producing timed signals, for example video. In that case you want to know exactly how long (intermediate) operations take to complete, even when the algorithm has to take branches that execute in more or less time.
Some high-performance systems are challenging because their "accelerated" CPUs incorporate instruction pipelines and caches that can stall code execution under some conditions. Predicting or determining those conditions may be nearly impossible, which is why determinism is very hard to achieve on complex platforms.