Newcomers DBPro Corner / timer based movement and sync rate

Indie Rock 13 (joined 1st Sep 2004)
Posted: 9th Apr 2005 17:59
I finally got timer based movement working in my game, or so I thought. I had the sync rate set to 60 and I got all the variables to where I wanted them. Then I tried taking out the sync rate to see if it worked, and the program ran at like 450 fps, but my character also ran really fast. At first I thought I had messed up the timer functions somehow, but then I got to thinking about what the sync rate 60 was really doing.

here's the code btw:


So my question is: in order to use timer based code, do I have to leave out the sync rate xx command for it to work properly? It also seems to me that setting the sync rate changes the variable move# that my timer function sets. Does this mean that if the frame rate without a sync rate set is 30 on one computer and 55 on another, then move# will be different on each computer? If that's the case then the timer code I'm using would be superfluous...

Neeeeeeeewom!
Hamish McHaggis (joined 13th Dec 2002, Modgnik Detinu)
Posted: 9th Apr 2005 20:13 Edited at: 9th Apr 2005 20:15
Ooh er. I don't see what your code is exactly doing, but here is a much simpler way of doing it...



The first two lines of the loop measure the time it takes to complete one loop, simple enough. The third divides that time by the frame time at the standard fps, so that you can keep using the same values you would use if the program were running at a constant 60fps (or whatever rate you want). This is clear if you note that when dt# is 1/60 of a second (ie. the loop is running at exactly 60 fps), k# equals 1.

The other line is the implementation, and uses the rule...

displacement = startVel*time + 0.5*acceleration*time^2

Draw a velocity-time graph with constant acceleration, and find a formula for the area underneath (ie. the displacement), if you don't get it.
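The code box above didn't survive the archive, so here is a rough sketch of the idea in Python rather than DBPro (the 60 fps reference rate, the names `step`/`dt`/`k`, and the per-frame units are my assumptions, not Hamish's actual code):

```python
import time

REF_FPS = 60.0  # reference rate the movement values are tuned for (assumption)

def step(pos, vel, accel, dt):
    """Advance one frame lasting dt seconds, frame-rate independently.

    vel and accel are expressed per reference frame, and the position
    update uses the rule from the post:
        displacement = startVel*time + 0.5*acceleration*time^2
    with time measured in reference frames (k).
    """
    k = dt / (1.0 / REF_FPS)            # k == 1.0 at exactly 60 fps
    pos += vel * k + 0.5 * accel * k * k
    vel += accel * k
    return pos, vel

# sketch of the game loop
pos, vel = 0.0, 0.0
last = time.time()
for _ in range(3):
    now = time.time()
    dt, last = now - last, now          # seconds this frame took
    pos, vel = step(pos, vel, 0.5, dt)
```

With this rule, one slow frame moves the object exactly as far as two fast frames covering the same real time, which is the whole point of timer based movement.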

Isn't it? Wasn't it? Marvellous!
Indie Rock 13
Posted: 9th Apr 2005 22:28
I know the code I have, which I got from the codebase, does indeed work, and I understand everything it's doing after looking over it. My main question is: does using the sync rate x command throw the timer code off? Even with your code I get different movement speeds when I set the sync rate to 60 compared to setting it to 0, so that does seem to be the case.

The reason I want to set the sync rate to 60 is so my animations play at the proper speed. I can't figure out a way to have the animations play at the same speed regardless of the framerate. I had thought to come up with an equation that divides the framerate value by a certain number and then sets the animation speed to the result, but the set object speed command doesn't seem to be working properly...

meh

Neeeeeeeewom!
Cryptoman (joined 24th Nov 2003, Utah Mountains)
Posted: 9th Apr 2005 23:21
You need to wait about 16.7 ms per frame to get 60 fps, which you can't do accurately with the standard timer. Gotta go deeper and use your processor's high-resolution timer if you want to control framerate with time.


Hamish McHaggis
Posted: 9th Apr 2005 23:42
Timer() returns the processor time, so it's completely unrelated to the inner workings of DB.

Isn't it? Wasn't it? Marvellous!
Cryptoman
Posted: 9th Apr 2005 23:58
No, timer() returns Windows time, which could be anything really, because Windows only updates it when it gets time. Not very accurate at all. Close, but not accurate enough for fps control.


spooky (joined 30th Aug 2002, United Kingdom)
Posted: 10th Apr 2005 08:15
I wrote that code years ago but never put it in the codebase. I did post it in code snippets though in October 2003!

Link: http://forum.thegamecreators.com/?m=forum_view&t=18986&b=6

Example of how to use it:



It works differently from most people's as it calculates the average frame rate over the last 't9' milliseconds and uses that for the next 't9' milliseconds. During the first 't9' ms it uses the average so far. The idea is that the difference between frames can sometimes be so small that you get a dif of zero, and for a split second your objects won't move. I made it easy to use, so you simply multiply object movement by move#. The comments in the above code should explain things.
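spooky's actual snippet is in the linked thread; purely to illustrate the averaging scheme he describes, here is a sketch in Python (the class name, the 1000 ms default window, and the 60 fps reference are my guesses, not his code):

```python
class MoveScaler:
    """Average the frame rate over the last window_ms milliseconds and
    use that average to scale movement during the next window.

    This avoids the problem of a per-frame time difference of zero,
    at the cost of reacting to frame-rate changes one window late.
    """

    def __init__(self, ref_fps=60.0, window_ms=1000):
        self.ref_fps = ref_fps
        self.window_ms = window_ms
        self.frames = 0
        self.window_start = None
        self.move = 1.0               # multiply object movement by this

    def tick(self, now_ms):
        """Call once per frame with the current time in ms."""
        if self.window_start is None:
            self.window_start = now_ms
            return self.move
        self.frames += 1
        elapsed = now_ms - self.window_start
        if elapsed >= self.window_ms:
            avg_fps = self.frames * 1000.0 / elapsed
            self.move = self.ref_fps / avg_fps    # scale relative to 60 fps
            self.frames = 0
            self.window_start = now_ms
        return self.move
```

So at a steady 30 fps the scaler settles on move = 2.0, and multiplying movement by it restores the 60 fps speeds.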

If you want proper timer based movement, you should always have sync rate at 0. This gives people with superfast pcs nice silky smooth games, and those with low spec pcs a playable game.

Ideally you would decrease the t9 value from the default of 1 second to get less noticeable frame rate jumps.

Remember that code is years old and needs tidying up a bit. As you see, my naming of variables is a bit dodgy!

Boo!
RiiDii (joined 20th Jan 2005, Inatincan)
Posted: 10th Apr 2005 15:47
My understanding of setting the Sync Rate is similar to writing code like this (avoiding the Timer() accuracy issue):

So, if you have a fast PC, it sits and waits around in the timer Do/Loop for the timer to catch up.
If you have a slow PC, then the timer Do/Loop is skipped through quickly (or instantly).
Same thing with Sync. Setting the Sync Rate to 0 would be the same as deleting the timer Do/Loop.
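RiiDii's code box was lost in the archive; here is a rough Python equivalent of the wait loop he is describing (the 60 fps target and the function name are my assumptions):

```python
import time

def frame_cap(last_frame_start, target_fps=60.0):
    """Busy-wait until the current frame has lasted at least
    1/target_fps seconds, mimicking what a fixed sync rate does.
    Returns the start time of the next frame."""
    frame_time = 1.0 / target_fps
    while time.perf_counter() - last_frame_start < frame_time:
        pass          # fast PC sits here; slow PC falls straight through
    return time.perf_counter()

# sketch of a capped loop
start = time.perf_counter()
for _ in range(3):
    # ...update and draw the frame here...
    start = frame_cap(start)
```

A real game would sleep instead of spinning, but the effect is the same: the fast machine burns its spare time waiting, so everyone runs at the same speed.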
Since the human eye can't really do much better than 30 FPS (the standard for movies and TV), it seems that setting the Sync Rate and fitting the code to it is far more important than chasing 450 FPS, which is fine for bragging rights but not useful for much else. You're still not using the processing power of the faster PC, just running the frames much faster than needed.
At the very least, if you can get 450 FPS, let the user jack up the poly count, increase the texture resolution, add extra graphics, more/better sound, a whole bunch of stuff to optimize the game (notice these settings in professional games?). Then set the sync rate at 30 to 60 and everyone plays at the same speed, but with different quality based on PC performance. More FPS isn't really better compared to those things that really can enhance a game.

I do concede that higher FPS is "smoother", but honestly, past 60 FPS the human eye really won't be able to tell the difference. The game appears faster, but the "smoothness" isn't better. And one last thing: what's the refresh rate of your monitor? 60 Hz? 70? So even if your graphics card is refreshing faster, it's all wasted.

"Droids don't rip your arms off when they lose." -H. Solo
REALITY II
Indie Rock 13
Posted: 10th Apr 2005 18:20 Edited at: 11th Apr 2005 11:18
Ok, I figured out how to get the animations to play at the same speed no matter the fps. It took me a bit of time and a sheet of notebook paper, but I figured out the rather simple equation that had been eluding me...

fps#=screen FPS()
speedtest#=1/(fps#/60)
Set Object Speed 500, (speedtest#*50)

I did that because running at 60 fps it was at the correct speed if I set the speed to 50.
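For what it's worth, that expression simplifies to scaling the 60 fps value by 60/fps#: speedtest#*50 = (1/(fps#/60))*50 = 50*60/fps#. A quick Python check of the same arithmetic (the function name is mine; 50 is the base speed from the post):

```python
def anim_speed(fps, base_speed=50.0, ref_fps=60.0):
    """Animation speed that plays at the same real-time rate at any fps.
    Same formula as speedtest#*50 above: (1/(fps/60)) * base_speed."""
    return (1.0 / (fps / ref_fps)) * base_speed
```

At 60 fps the speed is unchanged; at 30 fps it doubles so each (longer) frame advances the animation twice as far.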


EDIT: Aha! A bit of fiddling and I figured it out. I used to be doing this:


And that didn't work. Then I tried adding gameSpeed (or move#, or k#... I've tried so many different versions of the timer code) to the positioning code as well, like so:



and it works! I guess it has to be applied to both... maybe I could have just squared the gameSpeed value, kinda like Hamish's suggestion. Oh well, it works, so I'm not thinking about it any more right now!
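Since the before/after code boxes didn't survive, here is a guess at the difference being described, sketched in Python (update_broken/update_fixed and the variable names are mine, not the original DBPro code): scaling only the velocity change leaves the position step frame-rate dependent, while scaling both fixes it.

```python
def update_broken(pos, vel, accel, k):
    """Scale only the velocity change. Movement still depends on the
    frame rate, because each frame adds a full unscaled vel to pos."""
    vel += accel * k
    pos += vel              # bug: no k here
    return pos, vel

def update_fixed(pos, vel, accel, k):
    """Scale both the velocity change and the position step, so the
    distance covered per real second no longer depends on the fps."""
    vel += accel * k
    pos += vel * k
    return pos, vel
```

With constant velocity you can see it directly: two fast frames (k=1) and one slow frame (k=2) cover the same ground with update_fixed, but not with update_broken.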

Neeeeeeeewom!
