Most efficient way to implement delta time

Posted by Starkers on Game Development, 2014-05-25

Here's one way to implement delta time:

/// init ///
var duration = 5000,
    currentTime = Date.now();

// and create cube, scene, camera etc.

//////

function animate() {
/// determine delta ///
    var now = Date.now(),
        deltat = now - currentTime,
        scalar = deltat / duration,
        angle = (Math.PI * 2) * scalar;
    currentTime = now; // no `var` here, or it shadows the outer currentTime
//////

/// animate ///
    cube.rotation.y += angle;
//////

/// update ///
    requestAnimationFrame(animate); // queue up the next frame of this same function
//////
}
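
For what it's worth, I think the same loop could also be written with the delta converted to seconds and an explicit speed, so the magic duration number becomes a speed constant. Just a sketch (rotationsPerSecond is a number I've made up):

// Same idea, but with the delta in seconds and an explicit speed.
var rotationsPerSecond = 0.2, // i.e. one full turn every 5 seconds
    lastTime = Date.now();

function animateSeconds() {
    var now = Date.now(),
        dt = (now - lastTime) / 1000; // milliseconds -> seconds
    lastTime = now;

    cube.rotation.y += rotationsPerSecond * (Math.PI * 2) * dt;

    requestAnimationFrame(animateSeconds);
}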

Could someone confirm I know how it works? Here's what I think is going on:

Firstly, we set duration to 5000, which is how long the loop will take to complete in an ideal world.

With a computer that is slow/busy, let's say the animation loop takes twice as long as it should, so 10000 ms:

When this happens, the scalar is set to 2.0:

scalar = deltat / duration
scalar = 10000 / 5000
scalar = 2.0

We now multiply all animation by two:

angle = (Math.PI * 2) * scalar;
angle = (Math.PI * 2) * 2.0;
angle = (Math.PI * 4) // which is 2 rotations

When we do this, the cube rotation will appear to 'jump', but this is good because the animation remains real-time.

With a computer that is going too quickly, let's say the animation loop takes half as long as it should, so 2500 ms:

When this happens, the scalar is set to 0.5:

scalar = deltat / duration
scalar = 2500 / 5000
scalar = 0.5

We now multiply all animation by a half:

angle = (Math.PI * 2) * scalar;
angle = (Math.PI * 2) * 0.5;
angle = (Math.PI * 1) // which is half a rotation

When we do this, the cube won't jump at all, the animation remains real-time, and it doesn't speed up. However, would I be right in thinking this doesn't alter how hard the computer is working? I mean, it still goes through the loop as fast as it can, and it still has to render the whole scene, just with smaller angles! So this is a bad way to implement delta time, right?
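
By "loop less often" I mean something like this hypothetical throttle (just a sketch of the idea; TARGET_FRAME_MS is a number I made up, and I'm not sure it's actually a good approach):

// Hypothetical throttle: delay each requestAnimationFrame request so the
// loop runs at roughly 30 FPS instead of as fast as the browser allows.
var TARGET_FRAME_MS = 1000 / 30;

function animateThrottled() {
    setTimeout(function () {
        requestAnimationFrame(animateThrottled);
    }, TARGET_FRAME_MS);

    // ...the same delta-time animation code as above...
}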

Now let's pretend the computer is taking exactly as long as it should, so 5000 ms. When this happens, the scalar is set to 1.0:

scalar = deltat / duration
scalar = 5000 / 5000
scalar = 1.0

We now multiply all animation by 1:

angle = (Math.PI * 2) * scalar;
angle = (Math.PI * 2) * 1;
angle = (Math.PI * 2) // which is 1 rotation

When we do this, everything is multiplied by 1, so nothing is changed. We'd get the same result if we weren't using delta time at all!

My questions are as follows:

  1. Most importantly, have I got the right end of the stick here?
  2. How do we know to set the duration to 5000? Or can it be any number?
  3. I'm a bit vague about the "computer going too quickly". Is there a way to loop less often rather than reduce the animation steps? That seems like a better idea.
  4. Using this method, do all of our animations need to be multiplied by the scalar? Do we have to hunt down every last one and multiply it?
  5. Is this the best way to implement delta time? I think not, given that the computer can go nuts and all we do is divide each animation step, and given that we need to hunt down every step and multiply it by the scalar. Not a very nice DSL, as it were.
  6. So what is the best way to implement delta time?
  7. Below is one way that I don't really understand but may be a better way to implement delta time. Could someone explain, please?

// Globals
INV_MAX_FPS = 1 / 60;
frameDelta = 0;
clock = new THREE.Clock();

// In the animation loop (the requestAnimationFrame callback)…
frameDelta += clock.getDelta(); // API: "Get the seconds passed since the last call to this method."

while (frameDelta >= INV_MAX_FPS) {
    update(INV_MAX_FPS); // calculate physics
    frameDelta -= INV_MAX_FPS;
}
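
To have something concrete to poke at, here's my attempt at fleshing that snippet out into a complete loop. update() is the stand-in from the snippet, and the render objects (renderer, scene, camera) are the usual Three.js ones; I'm only assuming this is roughly what was intended:

var INV_MAX_FPS = 1 / 60, // one fixed physics step = 1/60th of a second
    frameDelta = 0,
    clock = new THREE.Clock();

function animate() {
    requestAnimationFrame(animate);

    frameDelta += clock.getDelta(); // real time elapsed since the last frame

    // Consume the accumulated time in fixed-size slices.
    while (frameDelta >= INV_MAX_FPS) {
        update(INV_MAX_FPS); // advance the simulation by exactly 1/60 s
        frameDelta -= INV_MAX_FPS;
    }

    renderer.render(scene, camera); // draw the latest state once per frame
}

animate();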

How I think this works:

Firstly, we set INV_MAX_FPS to 0.01666666666. How we will use this number does not jump out at me. We then initialize a frameDelta, which stores how long the last loop took to run.

Come the first loop, frameDelta is not greater than or equal to INV_MAX_FPS, so the while loop is not run (0 >= 0.01666666666 is false). So nothing happens.

Now I really don't know what would cause this to happen, but let's pretend that the loop we just went through took 2 seconds to complete:

We set frameDelta to 2:

frameDelta += clock.getDelta();
frameDelta = 0 + 2.00
frameDelta = 2.00

Now we run an animation thanks to update(0.01666666666). Again, what is the relevance of 0.01666666666? And then we take away 0.01666666666 from frameDelta:

frameDelta -= INV_MAX_FPS;
frameDelta = frameDelta - INV_MAX_FPS;
frameDelta = 2 - 0.01666666666
frameDelta = 1.98333333334

So let's go into the second loop. Let's say it took 2 seconds again (why 2? Or 12? I am a bit confused):

frameDelta += clock.getDelta();
frameDelta = frameDelta + clock.getDelta();
frameDelta = 1.98333333334 + 2
frameDelta = 3.98333333334

This time we enter the while loop, because 3.98333333334 >= 0.01666666666.

We run update.

We take away 0.01666666666 from frameDelta again:

frameDelta -= INV_MAX_FPS;
frameDelta = frameDelta - INV_MAX_FPS;
frameDelta = 3.98333333334 - 0.01666666666
frameDelta = 3.96666666668

Now let's pretend the loop is super quick and runs in just 0.1 seconds, and continues to do this (because the computer isn't busy any more). Basically, the update function will be run, and every pass through the while loop we take away 0.01666666666 from frameDelta, until frameDelta is less than 0.01666666666.
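
Here's a quick standalone check of what I think happens with that 2-second backlog (plain JavaScript, no Three.js; the numbers are just my example):

// Simulate draining a 2-second backlog in fixed 1/60 s steps.
var acc = 2.0,
    step = 1 / 60,
    steps = 0;

while (acc >= step) {
    steps++;     // stands in for update(step)
    acc -= step;
}

console.log(steps, acc); // ~120 steps; the small remainder waits for the next frame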

And then nothing happens until the computer runs slowly again? Could someone shed some light, please? Does update() update the scalar or something like that, so we still have to multiply everything by the scalar like in the first example?
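
For instance, is update() meant to look something like this? (Totally guessing here; rotationsPerSecond is a constant I made up.)

// My guess: dt is always INV_MAX_FPS, so each call advances the
// simulation by a fixed 1/60 s slice -- no per-frame scalar needed.
function update(dt) {
    var rotationsPerSecond = 0.2; // one full turn every 5 seconds
    cube.rotation.y += rotationsPerSecond * (Math.PI * 2) * dt;
}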
