Constant game speed independent of variable FPS in OpenGL with GLUT?

Posted by Nazgulled on Game Development

I've been reading Koen Witters' detailed article about different game loop solutions, but I'm having some problems implementing the last one, which is the recommended approach, with GLUT.

After reading a couple of articles, tutorials and code from other people on how to achieve a constant game speed, I think that what I currently have implemented (I'll post the code below) is what Koen Witters called Game Speed dependent on Variable FPS, the second one in his article.

First, from my searching experience, there are a couple of people out there who probably have the knowledge to help with this but don't know what GLUT is, so I'm going to try to explain (feel free to correct me) the functions of this OpenGL toolkit that are relevant to my problem. Skip this section if you already know what GLUT is and how to work with it.

GLUT Toolkit:

  • GLUT is an OpenGL toolkit and helps with common tasks in OpenGL.
  • The glutDisplayFunc(renderScene) call takes a pointer to a renderScene() callback, which will be responsible for rendering everything. renderScene() is not called continuously; it runs once after registration and then again whenever GLUT decides the window needs to be redrawn.
  • The glutTimerFunc(TIMER_MILLISECONDS, processAnimationTimer, 0) call takes the number of milliseconds to pass before calling the processAnimationTimer() callback. The last argument is just a value to pass to the timer callback. Note that this is a one-shot timer: processAnimationTimer() will be called just once, not every TIMER_MILLISECONDS, so it has to re-register itself to keep firing.
  • The glutPostRedisplay() function requests that GLUT render a new frame, so we need to call it every time we change something in the scene.
  • The glutIdleFunc(renderScene) call could be used to register renderScene() as an idle callback (this does not make glutDisplayFunc() irrelevant), but it should be avoided because the idle callback is called continuously whenever events are not being received, increasing the CPU load.
  • The glutGet(GLUT_ELAPSED_TIME) function returns the number of milliseconds since glutInit() was called (or since the first call to glutGet(GLUT_ELAPSED_TIME)). That's the timer we have with GLUT. I know there are better alternatives for high-resolution timers, but let's stick with this one for now.

I think this is enough information on how GLUT renders frames, so that people who didn't know about it can also pitch in on this question and try to help if they feel like it.
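
To make those bullets concrete, here's a minimal skeleton (my own illustration, not code from my project) showing how these callbacks are typically wired together:

#include <GL/glut.h>

// One-shot timer: it must re-register itself to keep firing
void onTimer(int value) {
    glutTimerFunc(33, onTimer, 0);  // re-arm for roughly 30 ticks per second
    // ... update the game state here ...
    glutPostRedisplay();            // ask GLUT to call the display callback
}

void onDisplay(void) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // ... draw the scene here ...
    glutSwapBuffers();
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("GLUT skeleton");
    glutDisplayFunc(onDisplay);     // called when the window needs redrawing
    glutTimerFunc(33, onTimer, 0);  // arm the timer a first time
    glutMainLoop();                 // hands control to GLUT; never returns
}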

Current Implementation:

Now, I'm not sure I have correctly implemented the second solution proposed by Koen, Game Speed dependent on Variable FPS. The relevant code for that goes like this:

#define TICKS_PER_SECOND 30
#define MOVEMENT_SPEED 2.0f

const int TIMER_MILLISECONDS = 1000 / TICKS_PER_SECOND;

int previousTime;
int currentTime;
int elapsedTime;

void renderScene(void) {
    (...)

    // Setup the camera position and looking point
    SceneCamera.LookAt();

    // Do all drawing below...

    (...)
}

void processAnimationTimer(int value) {
    // Set the one-shot timer up to be called again
    glutTimerFunc(TIMER_MILLISECONDS, processAnimationTimer, 0);

    // Get the time when the previous frame was rendered
    previousTime = currentTime;

    // Get the current time (in milliseconds) and calculate the elapsed time
    currentTime = glutGet(GLUT_ELAPSED_TIME);
    elapsedTime = currentTime - previousTime;

    /* Multiply the camera direction vector by constant speed then by the
       elapsed time (in seconds) and then move the camera */
    SceneCamera.Move(cameraDirection * MOVEMENT_SPEED * (elapsedTime / 1000.0f));

    // Requests to render a new frame (this will call my renderScene() once)
    glutPostRedisplay();
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);

    (...)

    glutDisplayFunc(renderScene);

    (...)

    // Set up the timer to be called a first time
    glutTimerFunc(TIMER_MILLISECONDS, processAnimationTimer, 0);
    // Read the current time since glutInit was called
    currentTime = glutGet(GLUT_ELAPSED_TIME);

    glutMainLoop();
}

This implementation doesn't feel right. It works, in the sense that it keeps the game speed constant regardless of the FPS, so moving from point A to point B takes the same time no matter how high or low the framerate is. However, I believe I'm limiting the game framerate with this approach. Each frame will only be rendered when the timer callback is called, which means the framerate will be roughly TICKS_PER_SECOND frames per second. (With integer division it's not even exact: 1000 / 30 truncates to 33 ms, so the timer actually fires at about 30.3 Hz.) This doesn't feel right; you shouldn't limit your powerful hardware like that, it's wrong. It's my understanding, though, that I still need to calculate elapsedTime: just because I'm telling GLUT to call the timer callback every TIMER_MILLISECONDS doesn't mean it will always do so exactly on time.

I'm not sure how I can fix this and, to be completely honest, I have no idea what the game loop in GLUT is, you know, the while( game_is_running ) loop in Koen's article. But it's my understanding that GLUT is event-driven and that the game loop starts when I call glutMainLoop() (which never returns), yes?

I thought I could register an idle callback with glutIdleFunc() and use it as a replacement for glutTimerFunc(), rendering only when necessary (instead of all the time as usual). But when I tested this with an empty callback (like void gameLoop() {}) that was basically doing nothing, only showing a black screen, the CPU spiked to 25% and stayed there until I killed the game, at which point it went back to normal. So I don't think that's the path to follow.

Using glutTimerFunc() to drive all movements/animations is definitely not a good approach, as it limits my game to a constant FPS, which is not cool. Or maybe I'm using it wrong and my implementation is not right?

How exactly can I achieve a constant game speed with variable FPS? More specifically, how do I correctly implement Koen's Constant Game Speed with Maximum FPS solution (the fourth one in his article) with GLUT? Maybe this is not possible at all with GLUT? If not, what are my alternatives? What is the best approach to this problem (constant game speed) with GLUT?
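
For reference, here is the loop structure of that fourth solution, adapted from the code in Koen's article with glutGet(GLUT_ELAPSED_TIME) standing in for his GetTickCount(). This is just a sketch of the pattern I'd like to reproduce; game_is_running, updateGame() and displayGame() are placeholders, and of course GLUT owns the real loop inside glutMainLoop(), which is exactly my problem:

#include <GL/glut.h>

// Hypothetical stand-ins for the actual game code; only the loop
// structure matters here.
extern bool game_is_running;
extern void updateGame(void);                  // fixed-timestep logic update
extern void displayGame(float interpolation);  // render between logic ticks

void constantSpeedMaxFPSLoop(void) {
    const int TICKS_PER_SECOND = 25;
    const int SKIP_TICKS = 1000 / TICKS_PER_SECOND;
    const int MAX_FRAMESKIP = 5;

    int next_game_tick = glutGet(GLUT_ELAPSED_TIME);

    while (game_is_running) {
        int loops = 0;
        // Catch the game logic up with real time, at a fixed tick rate
        while (glutGet(GLUT_ELAPSED_TIME) > next_game_tick && loops < MAX_FRAMESKIP) {
            updateGame();
            next_game_tick += SKIP_TICKS;
            loops++;
        }
        // Render as often as possible, passing how far we are between ticks
        float interpolation = float(glutGet(GLUT_ELAPSED_TIME) + SKIP_TICKS - next_game_tick)
                            / float(SKIP_TICKS);
        displayGame(interpolation);
    }
}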

I originally posted this question on Stack Overflow before being pointed to this site. The following is a different approach I tried after creating the question on SO, so I'm posting it here too.

Another Approach:

I've been experimenting, and here's what I've been able to achieve. Instead of calculating the elapsed time in a timed function (which limits my game's framerate), I'm now doing it in renderScene(). Whenever a change to the scene happens (i.e. the camera moves, some object animates, etc.), I call glutPostRedisplay(), which triggers a call to renderScene(). I can then use the elapsed time in this function to move my camera, for instance.

My code has now turned into this:

int previousTime;
int currentTime;
int elapsedTime;

void renderScene(void) {
    (...)

    // Get the time when the previous frame was rendered
    previousTime = currentTime;

    // Get the current time (in milliseconds) and calculate the elapsed time
    currentTime = glutGet(GLUT_ELAPSED_TIME);
    elapsedTime = currentTime - previousTime;

    /* Multiply the camera direction vector by constant speed then by the
       elapsed time (in seconds) and then move the camera */
    SceneCamera.Move(cameraDirection * MOVEMENT_SPEED * (elapsedTime / 1000.0f));

    // Setup the camera position and looking point
    SceneCamera.LookAt();

    // All drawing code goes inside this function
    drawCompleteScene();

    glutSwapBuffers();

    /* Redraw the frame ONLY if the user is moving the camera
       (similar code will be needed to redraw the frame for other events) */
    if(!IsTupleEmpty(cameraDirection)) {
        glutPostRedisplay();
    }
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);

    (...)

    glutDisplayFunc(renderScene);

    (...)

    currentTime = glutGet(GLUT_ELAPSED_TIME);

    glutMainLoop();
}

In conclusion, it's working, or so it seems. If I don't move the camera, CPU usage is low and nothing is being rendered (for testing purposes I only have a grid extending to 4000.0f, while zFar is set to 1000.0f). When I start moving the camera, the scene starts redrawing itself. If I keep pressing the movement keys, CPU usage increases; this is normal behavior. It drops back when I stop moving.

Unless I'm missing something, it seems like a good approach for now. I did find this interesting article on iDevGames, and this implementation is probably affected by the problem described in that article. What are your thoughts on that?

Please note that I'm just doing this for fun; I have no intention of creating a game to distribute or anything like that, not in the near future at least. If I did, I would probably go with something other than GLUT. But since I am using GLUT, and other than the problem described in the iDevGames article, do you think this latest implementation is sufficient for GLUT? The only real issue I can think of right now is that I'll need to keep calling glutPostRedisplay() every time the scene changes something, and keep calling it until there's nothing new to redraw. A little complexity added to the code for a better cause, I think.
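
For example, what I have in mind for that is something like this (a sketch; sceneChanged() and onKeyDown() are just hypothetical names for the pattern):

#include <GL/glut.h>

// Hypothetical helper: every piece of code that changes the scene calls
// this instead of calling glutPostRedisplay() directly, so there is a
// single place that decides when a new frame is needed.
void sceneChanged(void) {
    glutPostRedisplay();
}

// Example: a keyboard handler registered with glutKeyboardFunc()
void onKeyDown(unsigned char key, int x, int y) {
    // ... update cameraDirection based on the pressed key ...
    sceneChanged();  // the camera state changed, so request a frame
}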

What do you think?
