delay(time); vs if(millis()-previous>time); and drift
There's one important thing that you need to remember when working with time on an Arduino of any form:
- Every operation takes time.
Your `foo()` function will take an amount of time. What that time is, we cannot say.
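If you're curious how long a given operation actually takes, you can measure it yourself. Here's a minimal sketch that times a call with `micros()`; the `foo()` body and the 9600 baud rate are just placeholders for illustration:

```
void foo() {
    delay(3);  // stand-in for whatever work you want to measure
}

void setup() {
    Serial.begin(9600);
}

void loop() {
    unsigned long start = micros();
    foo();
    Serial.print("foo() took ");
    Serial.print(micros() - start);
    Serial.println(" microseconds");
    delay(1000);  // don't flood the serial monitor
}
```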
The most reliable way of dealing with time is to rely on the time only for triggering, not for working out when the next triggering should be.
For instance, take the following:
```
if (millis() - last > interval) {
    doSomething();
    last = millis();
}
```
The variable `last` will be the time the routine triggered *plus* the time `doSomething()` took to run. So say `interval` is 100 and `doSomething()` takes 10ms to run, you will get triggerings at 101ms, 212ms, 323ms, etc. Not the 100ms you were expecting.
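To make the drift concrete, here's a complete sketch you can run; simulating `doSomething()` with a 10ms `delay()` is an assumption purely for illustration:

```
unsigned long last = 0;
const unsigned long interval = 100;

void doSomething() {
    delay(10);  // simulate 10ms of work
}

void setup() {
    Serial.begin(9600);
}

void loop() {
    if (millis() - last > interval) {
        Serial.println(millis());  // prints roughly 101, 212, 323, ...
        doSomething();
        last = millis();
    }
}
```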
So one thing you can do is to always use the same time value throughout by remembering it at one specific point (as Juraj suggests):
```
uint32_t time = millis();
if (time - last > interval) {
    doSomething();
    last = time;
}
```
Now the time that `doSomething()` takes will have no effect on anything, so you will get triggerings at 101ms, 202ms, 303ms, etc. Still not quite the 100ms you wanted, because you're looking for more *than* 100ms having passed, and that means 101ms or more. Instead you should use `>=`:
```
uint32_t time = millis();
if (time - last >= interval) {
    doSomething();
    last = time;
}
```
Now, assuming that nothing else happens in your loop, you get triggerings at 100ms, 200ms, 300ms, etc. But note that caveat: "assuming that nothing else happens in your loop"...
What happens if an operation that takes 5ms happens to occur at 99ms? Your next triggering will be delayed until 104ms, and every triggering after it shifts by the same amount. That's drift. But it's easy to combat: instead of saying "the recorded time is now", you say "the recorded time is 100ms later than it was". That means that no matter what delays you get in your code, your triggerings stay locked to the 100ms schedule; an individual trigger may land late within its 100ms tick, but the error never accumulates.
```
if (millis() - last >= interval) {
    doSomething();
    last += interval;
}
```
Now you will get triggerings at 100ms, 200ms, 300ms, etc. Or if there are delays in other bits of code you may get 100ms, 204ms, 300ms, 408ms, 503ms, 600ms, etc. It always runs your routine as close to the interval boundary as possible regardless of delays. And if a delay is longer than the interval, it will automatically run your routine enough times in quick succession to catch up with the current time.
Before you had drift. Now you have jitter.
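If you want to watch both the jitter and the catch-up behaviour, here's a minimal runnable sketch; the once-a-second `delay(250)` is purely a stand-in for unpredictable blocking work elsewhere in your code:

```
unsigned long last = 0;
const unsigned long interval = 100;
unsigned long lastStall = 0;

void setup() {
    Serial.begin(9600);
}

void loop() {
    if (millis() - last >= interval) {
        Serial.println(millis());  // locked to the 100ms grid, give or take jitter
        last += interval;
    }

    // Stand-in for unrelated code that blocks for 250ms once a second.
    if (millis() - lastStall >= 1000) {
        lastStall += 1000;
        delay(250);
    }
}
```

After each stall the `if` fires on consecutive passes through `loop()` until `last` catches back up to the current time, so you see a burst of closely spaced prints and then the 100ms rhythm resumes on the original grid.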