Subject: Re: game framerates
From: twistedhammer (twistedhammer@subdimension.com)
Date: Tue Feb 26 2002 - 19:54:39 AKST


>> This is something I've been curious about -- does it really matter
>> if a game can produce 150 frames per second?  Can your monitor even
>> display 150 fps?  Certainly not if it has less than 150 Hz vertical
>
> 30 fps is generally considered good enough to be smooth.  Certainly it
> doesn't make a difference if you are doing 60fps or 150fps.

Ahem, actually it can make a difference.  Justin brings up below that
the NTSC (National Television System Committee) standard is roughly
30 fps.  True enough, but the primary difference between a video signal
captured by a camera and one output by a video card is motion blur.
Ever notice that screenshots of games look crystal clear no matter what
is going on on-screen?  Try pausing one of your action-movie DVDs while
the hero is swinging a fist at the bad guy, or a bazooka shell flies
from one side of the screen to the other.  You'll see that anything
moving has a minor-to-major amount of blurring, leaving a trail behind
it.  In film that blur effectively smooths the motion, as if the
framerate were higher (without actually raising it).  Since a digitally
authored video source lacks that blurring, you need a higher framerate
to get a smooth-looking picture.  That being said, it's widely accepted
that anything more than 60 is overkill, except insofar as the headroom
issue mentioned below.
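
To see what that blur buys you, here's a rough sketch (entirely
hypothetical, not from any real engine) of the usual trick: render
several sub-frames per displayed frame and average them, so anything
moving fast leaves the same kind of trail a camera exposure would.
The "scene" is just a dot sweeping across a 1-D scanline, and
render_subframe() is a made-up stand-in for a real renderer.

/* Hypothetical sketch: fake a camera-style "exposure" by averaging
 * several sub-frames into one displayed frame. */
#include <stdio.h>

#define WIDTH     64
#define SUBFRAMES 8          /* temporal samples per displayed frame */

/* "render" the scene at time t: a bright dot sweeping across the line */
static void render_subframe(float line[WIDTH], double t)
{
    int x;
    int dot = (int)(t * WIDTH) % WIDTH;
    for (x = 0; x < WIDTH; x++)
        line[x] = (x == dot) ? 1.0f : 0.0f;
}

int main(void)
{
    float frame[WIDTH] = {0};
    float sub[WIDTH];
    double frame_start = 0.0, frame_dt = 1.0 / 30.0;  /* one 30fps frame */
    int s, x;

    for (s = 0; s < SUBFRAMES; s++) {
        /* sample evenly across the frame's "exposure" interval */
        double t = frame_start + frame_dt * (s + 0.5) / SUBFRAMES;
        render_subframe(sub, t);
        for (x = 0; x < WIDTH; x++)
            frame[x] += sub[x] / SUBFRAMES;
    }

    /* pixels the dot passed through keep a partial value: the trail */
    for (x = 0; x < WIDTH; x++)
        if (frame[x] > 0.0f)
            printf("x=%2d  intensity=%.3f\n", x, frame[x]);
    return 0;
}

Run it and the dot shows up as partial intensities spread across a few
pixels -- the trail -- instead of one hard pixel per frame.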

>
> But, the main key is not how fast it can go, but how low it drops when
> you have lots of stuff going on.  If you run a game at 30 fps, but
> when you get a bunch of stuff going on it drops down to 15 fps, it
> makes the game just as unplayable as if it were going at 15 fps all
> the time.  The key is to have it never drop below 30fps or
> thereabouts.
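
True.  It's easier to see why the dips hurt if you turn the rates into
per-frame time budgets (a quick throwaway calculation of mine, nothing
official):

/* quick check: frame rate vs. per-frame time budget */
#include <stdio.h>

int main(void)
{
    double rates[] = { 150.0, 60.0, 30.0, 15.0 };
    int i;

    for (i = 0; i < (int)(sizeof rates / sizeof rates[0]); i++)
        printf("%6.1f fps  ->  %6.2f ms per frame\n",
               rates[i], 1000.0 / rates[i]);
    return 0;
}

A drop from 30 to 15 fps doubles how long every frame hangs on screen,
and that's exactly the stutter you feel.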

>> refresh rate.  Can your eyes even *see* 150 fps?  I very much doubt
>> it.

> I've heard somewhere that fluorescent lights typically flicker at 120
> times per second, but those appear to be solidly lit, so no.  But I
> have heard somewhere that insects (flies) are able to see the
> flickering of fluorescent lights.  (Don't ask me how they know that
> one!)  In any case, it depends on the person.  I can detect flickering
> on a monitor at 60 Hz, but many people cannot.  I can't detect
> anything at 70 Hz or higher though.

>> 30 fps is the same framerate as American television.  Is that not
>> enough for gaming?
See above.

> i believe tv is 24fps, iirc..
Cinema film is 24 fps.
NTSC video is ~29.97 fps (technically 30000/1001).

There are also a couple of variants:
NTSC film (many DVDs are encoded this way) is ~23.976 fps (24000/1001).
NTSC drop-frame is a timecode convention rather than a different picture
rate: the video is still ~29.97 fps, but frame numbers are periodically
skipped so the timecode keeps step with real time.
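
Those odd-looking rates are just exact fractions; a throwaway check
(mine, nothing but arithmetic):

/* print the exact NTSC rates as decimals */
#include <stdio.h>

int main(void)
{
    printf("NTSC video: 30000/1001 = %.5f fps\n", 30000.0 / 1001.0);
    printf("NTSC film : 24000/1001 = %.5f fps\n", 24000.0 / 1001.0);
    return 0;
}

which prints ~29.97003 and ~23.97602.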

Also, since I'm on a roll: NTSC isn't really ~30 progressive frames,
it's ~60 interlaced fields per second (two fields per frame).
PAL (the European standard) is 25 fps, or 50 fields per second
interlaced.  Why, you might ask?

Please take notice:
US standard electrical outlet       - 110-120VAC @ 60Hz (don't remember
                                      the exact voltage)
European standard electrical outlet - 220VAC @ 50Hz

The field rates were chosen to match the local mains frequency, which
kept early TV circuitry simple and avoided visible hum and beat patterns
on screen.

Alright, this all has nothing to do with Linux, so I'll shut up now.

James Gibson.


