- [Instructor] It's in the docs, but you might not know how inaccurate timer objects can be if you're trying to show the time. Let's look at that inaccuracy and how to work around it. You'll find in the exercise files on GitHub a starter project where I've put together a demo app with three labels and a button. We'll compare two ways of showing a time on a timer. I've also set up a basic timer for you. At top, I declare a timer and an interval, and in an action for the button I set up a timer.
If you're not familiar with timers, when running they're in a state called valid. If the timer is valid, I'll invalidate it, shutting the timer off. If the timer isn't valid, I'll start a new one. There are several ways of using timer objects. Since I like to start them immediately, I tend to use the scheduledTimer class method, adding the interval here and making a repeating timer. The timer has a closure on it. This closure fires once every interval, with a parameter of timer, and it's where we're gonna do our work.
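A minimal sketch of the setup just described. The names `timer`, `interval`, and `startStopTapped` are my stand-ins for the starter project's properties and button action, not its actual identifiers:

```swift
import Foundation

// Stand-ins for the starter project's properties (names assumed).
var timer: Timer? = nil
let interval: TimeInterval = 0.005

// Button action: toggle between a valid (running) timer and a stopped one.
func startStopTapped() {
    if let runningTimer = timer, runningTimer.isValid {
        runningTimer.invalidate()  // shuts the timer off
    } else {
        // scheduledTimer adds the timer to the current run loop immediately.
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
            // the work for each tick goes here
        }
    }
}
```

Tapping the button once starts a repeating timer; tapping again invalidates it.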
Now above the timer, make some space, and I'm going to add the initialization for a count of the timer, so I'm gonna put in var timerCount equals zero point zero. Inside the closure I'll use a very simple way of tallying the seconds elapsed. Increment this count by the interval, so I'm gonna do timerCount plus equals self dot interval.
And then I'm gonna send that to one of my labels: self dot countLabel dot text equals, and I'll do a formatted string here. We'll make it timer, colon, space, percent zero six point four f. There we go, and I'll put the timerCount after that. Very good.
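The counting and formatting steps can be sketched like this. I'm reading the spoken format string as `%06.4f` (zero-padded, width 6, four decimal places), and `countLabel` is my stand-in for the label outlet:

```swift
import Foundation

let interval: TimeInterval = 0.005
var timerCount = 0.0

// Each time the closure fires, add one interval to the running count...
timerCount += interval
// ...and format it zero-padded to a width of 6 with 4 decimal places.
let text = String(format: "Timer: %06.4f", timerCount)
// countLabel.text = text   // in the app, this goes to the label
print(text)  // prints "Timer: 0.0050"
```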
Now let's compare this to another method: checking the interval between a start date and the system clock date. I'll make another constant up here for the start date, so let's do let startDate equal Date. And then I'm gonna use a DateInterval object to get a duration between that start date and now. So underneath this timerCount thing here I'm going to put let duration equal DateInterval, start, and we'll use the start-end one here, and the start is going to be the startDate.
And then end is gonna be what the date is right now, which is Date. Now that DateInterval has a property, duration, which gives me a time in seconds, so that's a sneaky way of getting my duration. And again I'm gonna send all of that to a label, so I'm actually just gonna copy this and make the little changes that I need. So I'm just gonna command C, command V. This time I'm gonna send it to dateLabel, change this to date, and make this duration.
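A sketch of the clock-based measurement. `startDate` mirrors the constant declared above; `dateLabel` is my stand-in for the second label outlet:

```swift
import Foundation

let startDate = Date()  // captured once, when timing begins

// Inside the timer's closure: measure elapsed wall-clock time.
let duration = DateInterval(start: startDate, end: Date()).duration
let text = String(format: "Date: %06.4f", duration)
// dateLabel.text = text   // in the app, this goes to the label
_ = text
```

Because `duration` comes from the system clock rather than from accumulated timer fires, it doesn't drift when the timer waits on system resources.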
And finally, I'll show the difference between the two times in seconds in another label. So I'm gonna do another command V since I already have that handy, and differenceLabel, so the self dot differenceLabel dot text. I'll change timer to difference, and over here I'm gonna just put duration minus timerCount, so I'll do duration minus timerCount. And we've set the interval, come back up here, to five thousandths of a second.
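Putting it all together, the closure might look like this. The label names are my stand-ins for the project's outlets, so I've left those lines as comments:

```swift
import Foundation

let interval: TimeInterval = 0.005
var timerCount = 0.0
let startDate = Date()

// The whole closure: timer count, clock-based duration, and the drift between them.
let timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
    timerCount += interval
    let duration = DateInterval(start: startDate, end: Date()).duration
    let drift = duration - timerCount
    // countLabel.text      = String(format: "Timer: %06.4f", timerCount)
    // dateLabel.text       = String(format: "Date: %06.4f", duration)
    // differenceLabel.text = String(format: "Difference: %06.4f", drift)
    _ = drift
}
```

The difference label shows how far the accumulated timer count has fallen behind the real clock.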
So go ahead and run this, and I have this set for an iPhone 8 Plus. And we get a timer. We can start it, and notice that we're getting a pretty big difference quite quickly: we're off by seven hundredths of a second and growing. While you'd expect a smaller interval to give a more accurate time, the opposite happens to be true. Timers have to wait for system resources. At bigger intervals, that is close to invisible.
But at smaller intervals, you can see the waiting delay creeping up. If you need an accurate time, as in a stopwatch app, the best solution is to poll the clock and use a duration, as we did here. For most uses of a timer for telling users the duration, you don't need this kind of resolution. Go ahead and try something else. I'm gonna stop the app and change my interval to zero point zero five. Run that, and you see the difference is much, much smaller. It's still growing, but it's still very small.
As you get to larger and larger intervals, this becomes less and less of a problem. I'm gonna stop again, and I'm gonna try one other number here, and that's gonna be one point zero divided by thirty point zero. That's one thirtieth of a second, which is a standard animation speed. Go ahead and run that. And you can see we got a small difference, but the important thing here is it really doesn't matter.
For things like animation, you don't have to worry about this. You're only worried about the time of the interval and the movement between them. As far as a visual thing is concerned, no one will ever be able to see it. It's only when you're keeping track of numbers that you have to worry about this accuracy issue. So for accurate timekeeping, poll the clock every interval, while for delays and other system timing uses such as animation, you'll find just using the timer works perfectly well.