
Framerates Matter

CmdrTaco posted more than 4 years ago | from the how-refreshing dept.

First Person Shooters (Games) 521

An anonymous reader writes "As more and more games move away from 60fps, the myth of the human eye only being able to detect 30fps keeps popping up. What's more, most people don't seem to realize the numerous advantages of a high framerate, and there's plenty of those."


Motion blur and bloom effects (3, Interesting)

sopssa (1498795) | more than 4 years ago | (#30671832)

The article notes the use of motion blurring and links to NVidia's page about its technology [nvidia.com] . The last figure [nvidia.com] shows a terrain with a full-screen motion blur effect, which in my opinion is pretty important in games to create that feeling of speed. People usually object to this and bloom effects and just want a sharp picture, and maybe some games have taken it too far. It's important nonetheless, even if the picture isn't all sharp, because what your eye perceives isn't all that sharp either; you experience the same blur.

Re:Motion blur and bloom effects (1)

Locke2005 (849178) | more than 4 years ago | (#30671872)

Doom always appeared to draw textures with much lower resolution while you were moving, and only display the full texture when you stopped and looked directly at an object, as a way of speeding up rendering. This gave the appearance of "motion blur" without a lot of additional processing required.

Doom 1? (2, Interesting)

tepples (727027) | more than 4 years ago | (#30672014)

By "Doom" do you mean Doom (1993) or Doom 3? If the former, I never saw this effect while playing the game on MS-DOS (vanilla version), Mac (Ultimate Doom), or GBA.

Re:Doom 1? (0)

Anonymous Coward | more than 4 years ago | (#30672158)

Maybe Doom II: Hell on Earth [wikipedia.org] (1994) ;)

Re:Doom 1? (1)

hardburn (141468) | more than 4 years ago | (#30672206)

Doom 2 was the same engine, just with new levels. If the engine was changed at all, I doubt it was to put in a poor man's motion blur.

Re:Motion blur and bloom effects (1)

Chris Pimlott (16212) | more than 4 years ago | (#30672346)

Perhaps you're thinking of mipmapping [wikipedia.org] , which was implemented at least as early as Quake 1.

Re:Motion blur and bloom effects (2, Insightful)

Hatta (162192) | more than 4 years ago | (#30671874)

It's important nonetheless, even if the picture isn't all sharp, because what your eye perceives isn't all that sharp either; you experience the same blur.

If my eye creates the blur, why do I need artificial motion blur?

Re:Motion blur and bloom effects (0)

Anonymous Coward | more than 4 years ago | (#30671930)

Because you aren't moving past the computer screen as fast as the scenery in the game is.

Re:Motion blur and bloom effects (1)

sopssa (1498795) | more than 4 years ago | (#30671968)

Because you're still looking at a single object, your monitor, and the picture and movement in it are artificially created. If you look at real objects moving, or move or shake your head, you'll notice there's a huge motion blur effect. If you do it in a game that has no motion blur effect, you notice how the view instantly jumps to where you want to look.

Re:Motion blur and bloom effects (2, Informative)

Shadow of Eternity (795165) | more than 4 years ago | (#30672178)

That just means we should strive for a higher framerate until our eyes blur things on their own. Reality is not inherently blurry (unless you need glasses...); our eyes and brain do that internally.

Making movement in a game inherently blurry when your head is already going to blur it for you internally is just a shortcut to motion sickness for a whole lot of people.

Re:Motion blur and bloom effects (1)

poetmatt (793785) | more than 4 years ago | (#30672300)

What are you talking about?

Ever waved your hand back and forth so fast that it creates a blur? Ever seen those little things that sweep back and forth fast enough to display an image?

Reality is indeed inherently blurry. It's just hard to accurately portray blur when you're staring at something that's not moving.

Re:Motion blur and bloom effects (2, Interesting)

sopssa (1498795) | more than 4 years ago | (#30672420)

I don't think we will get to a point where the framerate is fast enough. 3D monitors only generate up to 120fps, and there are still lots of hardware limits on generating framerates over that with current games at good resolutions. And there is no framerate in the real world; you're taking in images in realtime. Some argue (as in the battle between 30fps and 60fps) that the human eye can't process more than a certain number of "frames" per second. The natural motion blurring effect, and its absence in video games, shows perfectly well that it can. While you see smooth movement, you're still missing extras like the blur generated by the brain.

Re:Motion blur and bloom effects (2, Interesting)

spun (1352) | more than 4 years ago | (#30671970)

Just a guess, but perhaps because the frame rate isn't high enough for your eye to generate the blur? That is to say, if the scene were real, the frame rate would be well-nigh infinite, and your eye, capable of only a certain frame rate, would blur together all the frames. With discrete frames, you need to put in the blur the eye would generate from the frames in-between.

Or something like that.
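
(What the parent describes is essentially accumulation blur, the brute-force approach: sample the scene at several in-between instants across one frame interval and average them, approximating the blur the eye would produce if those intermediate frames had actually been shown. A toy sketch with a hypothetical 1-D "scene":)

```c
#include <math.h>

/* Average k samples of a time-varying scene across one frame interval
 * [t0, t1). With k = 1 you get a sharp frame; larger k approaches the
 * blur of the "well-nigh infinite" frame rate of a real scene. */
static double accumulate_blur(double (*scene)(double),
                              double t0, double t1, int k)
{
    double sum = 0.0;
    for (int i = 0; i < k; i++)
        sum += scene(t0 + (t1 - t0) * i / k);
    return sum / k;
}

/* Sample scene: a dot whose position equals time (moves 1 unit/s). */
static double dot_position(double t) { return t; }
```

With four sub-samples over a one-second frame, the dot smears across the first three quarter-positions instead of sitting sharply at its start point.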

Re:Motion blur and bloom effects (1, Interesting)

Anonymous Coward | more than 4 years ago | (#30672406)

The human eye does not work with frames. It is a continuous, dynamic system.

The only way to achieve the blurring effect without artificially rendering it would be to make the object move on the screen at the same velocity it would in real life. Needless to say, making a car move at 300 km/h in the virtual world and showing it at that real speed on the screen would simply make the game unplayable, even excluding the practical difficulty of driving at that speed, because today's screens simply don't have the necessary response times.

Re:Motion blur and bloom effects (4, Informative)

Shin-LaC (1333529) | more than 4 years ago | (#30672132)

Your eyes introduce blur due to the reaction time of the light-sensitive cells in the retina. Fortunately, the image processing area in your brain treats blur introduced by the eyes and blur built into the frame more or less the same, so you can use blur to give the impression of smooth motion with a lower frame rate than would otherwise be necessary. This is used to good effect in cinema, where the camera's exposure time naturally introduces blur that is quite similar to the one introduced by your eye.

In the case of video games, however, it is not so clear that rendering effective artificial motion blur saves much processing time compared to simply rendering more frames. Then again, there is a limit to how fast your monitor can update its image, so rendering more frames is no longer an option past that point.

Re:Motion blur and bloom effects (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#30672192)

Essentially, people want these effects to be done by their eyes, though, not the game. Why can't the game/computer/monitor produce fast enough frame rates that it's my eyes that are creating the blur, not the post-rendering effects?

Don't get me wrong, I like the realism that these effects give, but some people see them as kind of fake and it draws away from their experience. Perhaps some people's eyes can perceive frame rates slightly faster than others' and thus don't actually see as much blur when moving fast as other people do.

Re:Motion blur and bloom effects (2, Interesting)

LordKazan (558383) | more than 4 years ago | (#30672278)

Why can't the game/computer/monitor produce fast enough frame rates that it's my eyes that are creating the blur, not the post-rendering effects?

Physics. Monitors cannot change fast enough, and in the right way, to do this; they simply don't work that way.

Speaking of physics: a game's physics engine has the properties of a Riemann sum where n = fps, so the higher your FPS, the more accurate your physics simulation, even if your monitor cannot discretely display all those frames.

[note: this only applies in games where physics ticks/sec are tied to framerate... which is almost all games]
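
(The Riemann-sum point can be made concrete with explicit Euler integration of free fall, the simplest case: each frame's update is one rectangle of the sum, so more frames per simulated second means less error. A self-contained sketch, not taken from any particular engine:)

```c
#include <math.h>

/* Simulate 1 second of free fall from rest, one explicit Euler step per
 * frame. The exact answer is g/2 = 4.9 m; the error shrinks as fps grows,
 * exactly like a Riemann sum with n = fps rectangles. */
static double fall_distance(int fps)
{
    const double g = 9.8;              /* m/s^2 */
    double dt = 1.0 / fps, pos = 0.0, vel = 0.0;
    for (int i = 0; i < fps; i++) {
        pos += vel * dt;               /* position advances with old velocity */
        vel += g * dt;
    }
    return pos;
}
```

At 30 fps this underestimates the fall by about 16 cm; at 120 fps by about 4 cm, which is the kind of gap that made framerate-dependent jump heights possible.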

Re:Motion blur and bloom effects (1)

sopssa (1498795) | more than 4 years ago | (#30672464)

Bloom [imageshack.us] and lighting effects [imageshack.us] still have to be done in-game, because they rely on the game world (they can hide objects behind the bloom or leave other objects dark), and because the monitor just shows the color data you give it.

Really? (5, Informative)

non0score (890022) | more than 4 years ago | (#30671876)

Re:Really? (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30671936)

Tagged epenis

Re:Really? (2, Insightful)

Sockatume (732728) | more than 4 years ago | (#30672280)

Graphics are sold by screenshots and by box shots. YouTube and so on might make a difference, but ultimately you'll get more players to swoon with half the framerate and twice the geometry, than vice versa.

Where it matters most. (1)

Icegryphon (715550) | more than 4 years ago | (#30671906)

In fighting games you need 30FPS, period.
There are books for Tekken and the like that have frame data for every move.
Introduce any lag into the equation and what might be safe on block might not be, costing you the game.

Re:Where it matters most. (0)

Anonymous Coward | more than 4 years ago | (#30671932)

Huh. And here I thought that those fighting games just randomly picked a move as the players sat there randomly bashing the controller buttons as fast as they could. Leave it to Koreans to prove me wrong.

Re:Where it matters most. (1)

Icegryphon (715550) | more than 4 years ago | (#30671980)

No, you are thinking of Smash Brothers.
Real fightans are a song and dance of multiple moves and mind games against another person.
CPUs are too stupid to play with your head; they typically follow a pattern.

Re:Where it matters most. (1)

bigstrat2003 (1058574) | more than 4 years ago | (#30672204)

If you think smash bros is nothing but button mashing, you've obviously never played against someone really good (and I happen to have some friends who are really good, much to my dismay as I'm not all that great). At lower skill levels, smash bros is a lot of button mashing, just like any fighting game is at low skill levels. With highly skilled players, though, smash bros is very much a "song and dance of multiple moves and mind games".

Re:Where it matters most. (1)

Icegryphon (715550) | more than 4 years ago | (#30672396)

I was joking with the old meme.
Smash Brothers gets no respect on the fightan scene,
probably because of all the casuals/button mashers who play it as well.
There are skilled players in any game who would whoop my butt, even good old Pacman!

Re:Where it matters most. (1)

thetoadwarrior (1268702) | more than 4 years ago | (#30672362)

Smash Bros is just as much a real fighter as any other, and likewise it has button mashers, just as you'll find for Street Fighter, Tekken, etc.

Re:Where it matters most. (4, Funny)

SendBot (29932) | more than 4 years ago | (#30672062)

I thought I was super badass at street fighter 2 in middle school, then I went to the arcade and saw older kids getting the insane combos on killer instinct. First thing I thought was... wow, you really have to study this stuff to know what you're doing. If only there was some sort of global information network where I could quickly and easily find out what all those moves are.

Re:Where it matters most. (1)

asills (230118) | more than 4 years ago | (#30672292)

When I was 13 (1992) I manually compiled moves learned while playing and discussing Mortal Kombat and sold them for $3-$5 a piece. Who needs the internet when an enterprising little kid is destroying you in MK and then offers to sell you the list of moves he knows?

Re:Where it matters most. (1)

flows (1075083) | more than 4 years ago | (#30672430)

Killer Instinct's FAQ was the first big file (over 60k, IIRC) I ever got from the internet, in November 1995. And yes, I know what you're thinking: that was BEFORE searching for porn. Really. It was.

Re:Where it matters most. (1)

91degrees (207121) | more than 4 years ago | (#30671964)

It will work just as well at 25fps, 60fps, even 42.77654fps. Maybe you'll have trouble below 24fps, but the main thing is consistency.

Re:Where it matters most. (0)

Anonymous Coward | more than 4 years ago | (#30672006)

For games where timing and targeting precision is required, you need AT LEAST 30 fps. Less than that leads to noticeably worse performance. The open question is whether more than 30 fps makes any difference. It probably won't improve accuracy by much, but if you reduce your monitor rate to 60 Hz, you'll pretty soon notice that the eye can detect a lot more than 30 pictures per second. It just does not affect our sense of motion by much.

Re:Where it matters most. (0)

Anonymous Coward | more than 4 years ago | (#30672010)

Tekken? 30fps? Real fighting games run on 60fps.

Re:Where it matters most. (1)

MozeeToby (1163751) | more than 4 years ago | (#30672018)

I think what you're getting at is that consistency matters more than maximum frame rate. For different reasons than the one you state, I'd rather play a game at a constant 20 fps than at 30 (or even 60) fps most of the time but dropping down to 15 during the most intense moments. It's the large changes in framerate that are noticeable; your brain can fill in the missing pieces if the framerate is constant.

Re:Where it matters most. (1)

spun (1352) | more than 4 years ago | (#30672050)

You don't understand how frame rate works, do you? The pictures drawn on the screen aren't the real model the game uses. Adding frames in between other frames won't generate lag (if the processing speed is high enough). So, if activating a block at a given frame works at 30fps, it will work at 15fps, 60fps, or 300fps. The frames aren't the 'real thing'; the game's unseen internal model is the real thing. The frames are drawn as often as possible, given the hardware, and are drawn to match the current state of the internal model.

HTH.
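
(The decoupling described here is easy to sketch: the model advances on a fixed tick, and the render rate only decides how often the current state gets drawn. Hypothetical numbers throughout:)

```c
#include <math.h>

/* Fixed 100 Hz model tick; render_hz is deliberately unused by the
 * simulation, mirroring the claim that frames are drawn *from* the
 * internal model and never feed back into it. */
typedef struct { double x; } Model;

static void tick(Model *m, double dt) { m->x += 5.0 * dt; } /* 5 m/s */

static double simulate(double seconds, int render_hz)
{
    Model m = { 0.0 };
    const double tick_dt = 0.01;                 /* 100 Hz model update */
    int ticks = (int)(seconds / tick_dt + 0.5);
    for (int i = 0; i < ticks; i++)
        tick(&m, tick_dt);
    (void)render_hz;   /* would only control how often we draw m.x */
    return m.x;
}
```

Two seconds of simulation lands the model at the same place whether the renderer manages 30 or 300 frames per second.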

Re:Where it matters most. (1)

Icegryphon (715550) | more than 4 years ago | (#30672106)

Sorry but, you are dead wrong. Input will lag behind with any other lag.

Re:Where it matters most. (1)

spun (1352) | more than 4 years ago | (#30672294)

How so? What lag are you talking about? What, in your theory, does 'lag' mean?

Heck, could you rephrase the sentence, "Input will lag behind with any other lag," so it actually makes sense?

Re:Where it matters most. (3, Interesting)

maxume (22995) | more than 4 years ago | (#30672122)

That isn't always the case; I recall a game in the past where gravity had less effect on players who had faster hardware. Or something like that. Anyway, the logic was mixed in with the rendering, so frame rate had an impact on what the player could do.

Re:Where it matters most. (1)

hardburn (141468) | more than 4 years ago | (#30672282)

Yes, that's entirely possible if it's programmed so that you fall x meters for each frame rendered. What should be done is to say you fall x meters per second, taking into account how long it's been since you last calculated the value.

(I'm simplifying the effect of acceleration above--many games could get along without it and produce a decent result, though it's not hard to factor in if you want.)
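
(That distinction in toy form, with hypothetical constants and constant velocity only, matching the simplification above:)

```c
#include <math.h>

/* Buggy style: move a fixed amount every frame, so total distance
 * scales with fps. */
static double fall_per_frame(int fps, double seconds)
{
    double pos = 0.0;
    for (int i = 0; i < (int)(fps * seconds); i++)
        pos += 0.2;                    /* 0.2 m per *frame* */
    return pos;
}

/* Fixed style: scale the per-second rate by dt, so the result is
 * (nearly) independent of frame rate. */
static double fall_per_second(int fps, double seconds)
{
    double pos = 0.0, dt = 1.0 / fps;
    for (int i = 0; i < (int)(fps * seconds); i++)
        pos += 6.0 * dt;               /* 6 m/s, scaled by dt */
    return pos;
}
```

Doubling the frame rate doubles the buggy version's fall distance while leaving the dt-based version unchanged.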

Re:Where it matters most. (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#30672432)

Which is what the parent was getting at: a lot of fighting games go by frames, not by seconds. Sounds ridiculous, but it makes for easier programming and it's a lot less resource intensive.

Re:Where it matters most. (4, Insightful)

Kreigaffe (765218) | more than 4 years ago | (#30672506)

It's way, way way more than that.

In the old HL engine -- at least in Natural Selection, but most likely any game on that engine -- your framerate didn't just affect your gravity (which meant that at certain framerates you could literally jump further, which made BHopping sicker)..

it also changed the DPS of weapons. Yep. Weapon firing rate was tied to FPS in a very very odd way. Some dudes did too much testing. Insane.

And you can, visually, tell a difference between 100fps and 50fps and 25fps. Very easily. Takes a few minutes of playing, but there's a clear difference and anybody saying otherwise eats paint chips.

Graphics don't make games good. Graphics can cripple good games. Graphics never make bad games good.

Re:Where it matters most. (1)

Sir_Lewk (967686) | more than 4 years ago | (#30672514)

I know for a fact that Quake 3 did that (125fps is ideal for trickjumping mods like Defragged). The previous Quakes no doubt did it too.

Re:Where it matters most. (2, Interesting)

Speare (84249) | more than 4 years ago | (#30672230)

In many embedded apps, like coin-op arcade games, the "model" is indeed tied to the frame rate. The main loop assumes a fixed dt, and pipelines the input, update, render tasks. Often this is done without threading, just while (!dead) { do_input(); do_update(); do_render(); } in the main function. Even with threads or co-processors, they often tie the rates 1:1:1. Some have no room for adjustment, and some will at least update their dt if the render took too long.
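
(The last variant mentioned, updating dt when the render ran long, is often done with a time accumulator: bank the real elapsed time and run however many fixed-dt updates fit, so a slow frame makes the model tick more, not slower. A generic sketch, not tied to any specific game:)

```c
/* Consume elapsed real time in fixed-dt chunks; returns how many
 * do_update() calls this frame should run. Leftover time stays in
 * the accumulator for the next frame. */
static int ticks_this_frame(double *acc, double elapsed, double dt)
{
    int ticks = 0;
    *acc += elapsed;
    while (*acc >= dt) {
        *acc -= dt;
        ticks++;                       /* do_update(dt) would run here */
    }
    return ticks;
}

/* Run a sequence of per-frame elapsed times through one accumulator
 * and total the model ticks they produce. */
static int total_ticks(const double *elapsed, int n, double dt)
{
    double acc = 0.0;
    int total = 0;
    for (int i = 0; i < n; i++)
        total += ticks_this_frame(&acc, elapsed[i], dt);
    return total;
}
```

A 25 ms frame followed by a 16 ms frame at a 10 ms tick yields 2 then 2 updates, with the spare 1 ms carried forward rather than dropped.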

Re:Where it matters most. (1)

spun (1352) | more than 4 years ago | (#30672388)

Ah, so faster hardware will actually update the model more quickly. But does this change the way the model acts? In some physics models, I guess it would. More frames would, in fact, be more accurate. But in most simple models, would calculating more time-slices actually change anything? I kind of doubt it, so even though you are right, and visual frame rate (in a non-threaded game) is tied to model frame rate, more frames would not change the outcome.

Basically, the original poster was making it sound as if the internal time of fighting games was tied to frames, so that, if a move took 12 frames to complete at standard speed, adding more frames per second would make it complete faster. I just don't see modern games being made this way.

Re:Where it matters most. (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#30672318)

You don't understand the games he's talking about.

For something like Street Fighter at EVO, they take extra steps to make sure that the framerate is consistent across all play: times when the players are just standing there, and times when players are attempting to break blocks for their Hypercombofinishes.

Like many flash games, there is code that is actually executed ON THE FRAME; it runs as the frame is being rendered. In intensive moments, with people putting in a lot of input, lots of stuff to draw on screen, and whatever else, there is always the chance that latency will show up, slowing the frame rate, which ultimately changes the rest of play: normally you would let go of your block at a precise moment, but because the game is slightly slower, your opponent's initial attack is still flinging at you, forcing you to hold your block a bit longer.

Consistency is what they are getting at. It needs to remain at the same FPS at all times for games where code is executed on rendering.

Re:Where it matters most. (0)

Anonymous Coward | more than 4 years ago | (#30672312)

A higher framerate is better -- your moves will be more responsive and render faster.

More important to have a robust webserver (0)

Anonymous Coward | more than 4 years ago | (#30671942)

./'d already...

LCDs = need even higher FPS (0)

Anonymous Coward | more than 4 years ago | (#30671952)

LCDs are hold-type displays which create motion blur when you follow a moving object on the screen. This can be avoided by modulating the backlight or by increasing the number of frames per second, if the LCD can keep up with the frame rate. Some high-end TVs already interpolate video frames four-fold, i.e. they create 3 interpolated frames for every actual frame delivered by the video source. This technique combined with backlight modulation creates very noticeably smoother motion. Unfortunately this technique is not suitable for interactive sources due to the unavoidable delay created by the interpolation. In conclusion: 60fps? Give me 120fps and we can start talking about finally replacing the CRT on my desktop.
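
(The four-fold interpolation those TVs do can be illustrated on a 1-D signal: three synthesized values between each pair of real samples. Linear interpolation here for simplicity; real sets use motion estimation, and the latency comes from the fact that frame N+1 must arrive before the in-betweens can be built:)

```c
/* One interpolated quarter-step between samples a and b; j = 0..3. */
static double lerp4(double a, double b, int j)
{
    return a + (b - a) * (j / 4.0);
}

/* Expand n source samples into (n-1)*4 + 1 output samples by
 * inserting 3 linearly interpolated values between each real pair. */
static void interpolate_4x(const double *src, int n, double *dst)
{
    for (int i = 0; i < n - 1; i++)
        for (int j = 0; j < 4; j++)
            dst[i * 4 + j] = lerp4(src[i], src[i + 1], j);
    dst[(n - 1) * 4] = src[n - 1];     /* last real sample passes through */
}
```

Two source samples 0 and 4 become the sequence 0, 1, 2, 3, 4: one real frame interval rendered as four display intervals.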

What he (she?) said (1)

SendBot (29932) | more than 4 years ago | (#30672028)

It bugs me that 10 years ago I could play serious fps's and hit 100fps and actually see that on my monitor. It made a huge difference for the kind of competitive precision I was hitting.

My fancy new(ish) wuxga monitor has plenty of pixels, but 60fps feels real choppy to me.

I see some of these future 3d lcd's claiming 480Hz... is there a good inexpensive desktop monitor that can do 120Hz?

Re:What he (she?) said (1)

TheKidWho (705796) | more than 4 years ago | (#30672272)

Re:LCDs = need even higher FPS (1)

IndieKid (1061106) | more than 4 years ago | (#30672078)

...and some high-end TVs have a 'game mode' that amongst other things switches the interpolation off to avoid the delay you speak of. Specifically, I think some Samsung models have this feature.

There is a related point though which is the fact that a number of TVs/LCD Displays claim to be 100Hz or even 120Hz but can't actually accept a 100/120Hz input. Supposedly the coming generation of '3D ready' displays will rectify this since for a comfortable 3D viewing experience 60 FPS to each eye is required.

Counter-Strike... (2, Informative)

Manip (656104) | more than 4 years ago | (#30671962)

I myself used to play Counter-Strike (classic), and I can tell you both FPS and Ping made a HUGE difference in that game to the point that my score would increase as I connected to servers closer to home and used OpenGL instead of DirectX (since OpenGL almost doubled the FPS at the time).

Now, I wasn't an expert, but I did play a whole lot. I think if you ask most serious players, they would agree on the impact of both...

Re:Counter-Strike... (0)

Anonymous Coward | more than 4 years ago | (#30672156)

yeah I always seemed to kick more ass with 60 fps than 30. Also, Avatar at 24fps didn't cut it for me with scenes featuring a lot of motion.

Re:Counter-Strike... (1)

hitmark (640295) | more than 4 years ago | (#30672180)

I wonder how much that had to do with the engine design, as in having the render engine and the game logic joined at the hip, so that higher fps meant more repeats of the game logic per second.

Re:Counter-Strike... (0)

Anonymous Coward | more than 4 years ago | (#30672190)

+1. I'm no -phile (I'm the kind of person who listens to 96kbps and never bothered with PowerStrip) but I would notice if the game wasn't running near 100FPS!

Re:Counter-Strike... (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#30672366)

Agreed. Most players would notice a few milliseconds of network latency more than a dozen frames per second, but it's undeniable that extra frames per second give you a distinct advantage.

If I see you and you see me, and you're running at twice my frames per second, you will have a smoother "turn and shoot" motion than me, which means you'll either notice your reticle over my head slightly faster than I do, or you won't make the mistake of over- or under-compensating your aim, since your motion was that much more sensitive/responsive.

Apparently web servers also matter (3, Funny)

ForestHill (958172) | more than 4 years ago | (#30671982)

she's dead, Jim

Grammar?? (0, Troll)

jmvbxx (1074458) | more than 4 years ago | (#30671992)

There ARE many! Not, there's many.

Cached Version (5, Informative)

sabre86 (730704) | more than 4 years ago | (#30671994)

Looks like it's Slashdotted already. Here's the cached page: http://74.125.47.132/search?hl=en&q=cache%3Awww.significant-bits.com%2Fframerates-do-matter&aq=f&oq=&aqi= [74.125.47.132]

Re:Cached Version (1)

Zocalo (252965) | more than 4 years ago | (#30672330)

Nah, that's not Slashdotted. It's proving the point by showing the importance of a higher framerate.

Than zero.

The human eye can detect 30 (5, Insightful)

gurps_npc (621217) | more than 4 years ago | (#30672020)

The human eye can clearly detect frame rates far greater than 30. So can the human brain.

HOWEVER

The human mind is evolutionary designed to make instant assumptions. Cat in mid air facing us = DANGER. No "Is it dead and being thrown at us?" No "Is it a picture?" As such, video games can quite easily take advantage of these evolutionary assumptions and trick the MIND, if not the brain, into thinking something is real.

So while a higher frame rate will increase the quality of the game, it is not essential. It's like getting gold-plated controls on your car's dashboard. Yes, it is a real increase in quality, but most people would rather spend the money on a GPS device, real leather, or a plug-in hybrid engine before getting around to putting gold in the car.

Re:The human eye can detect 30 (-1, Offtopic)

CannonballHead (842625) | more than 4 years ago | (#30672220)

The human mind is evolutionary designed to make instant assumptions.

I feel nitpicky, today. Either call the human mind evolutionary or designed. Evolution has no design and cannot design. If you do believe in evolutionary theory, then at best, evolution perhaps "fashioned" it, but it did not "design" it. To try to logically squish together evolutionary theory and some form of design will either elevate "evolution" to what it isn't and shouldn't be or degrade the word "design" to mean what it doesn't. Well, all the definitions I've seen so far, anyways. Accidents are not designed, and natural processes do not design. Nature does not design. Yes, nature does "fashion" things, but not design.

I guess I'm trying to say that if you think the human mind is amazing enough to be "designed" and yet refuse to recognize there was any form of intelligence behind that design, I'd say that something strange is happening in your logic somewhere... and if you're not using "design" as in something that was planned out before it happened, then choose a different word :)

[/offtopic rant]

Re:The human eye can detect 30 (1)

royallthefourth (1564389) | more than 4 years ago | (#30672328)

Clearly you have never had to work on software that was designed by someone with no intelligence!

Re:The human eye can detect 30 (1)

GeoSanDiego (703197) | more than 4 years ago | (#30672412)

"Designed by process of evolution"

Re:The human eye can detect 30 (1)

skylerweaver (997332) | more than 4 years ago | (#30672422)

How about "optimize" instead of "fashion"?
After doing some programming work with genetic algorithm optimization, I realized that although evolution may choose one gene path or another, the end result is always the same: the most optimized solution for survival, tool making, etc.
So I agree: not designed; absolutely optimized.

Re:The human eye can detect 30 (1)

wgaryhas (872268) | more than 4 years ago | (#30672334)

In what way does gold plating increase the quality of the dashboard controls?

Re:The human eye can detect 30 (0)

Anonymous Coward | more than 4 years ago | (#30672376)

What the article and most people fail to note is that your TV is only displaying 30 fps. So any game running at more than 30 fps is dropping frames.

This has nothing to do with what the eye can detect, it is just the TV standard. You are either running 60 half frames (interlaced) or 30 full frames in progressive.

Only PCs actually allow you to have real 60fps. So anytime you hear a console gamer talking about playing a game at a silky smooth 60fps, you have to realize that they have no idea what they are talking about, just like the article writer.

Re:The human eye can detect 30 (3, Insightful)

TheCarp (96830) | more than 4 years ago | (#30672408)

> The human mind is evolutionary designed to make instant assumptions. Cat in mid air facing us = DANGER. No "Is it dead
> and being thrown at us?" No "Is it a picture?" As such, video games can quite easily take advantage of these evolutionary
> assumptions and trick the MIND, if not the brain, into thinking something is real.

Sort of. It's actually less "cat in mid air" and more "this sets off a trigger based on something that happened before and hurt me".

Most adults, if you chuck a rock at their face, will toss up their arms to block, or move their head/body to dodge. This is completely learned. Do the same trick with a young child who has never played "catch" before, and your rock is going to bean him right off his skull.

From my own experience, my first motorcycle accident, I was on the ground so fast, I had to think afterwards about what happened. First two spills actually.

The one after those.... whole different story. The adrenalin hit as soon as I felt the bike start to turn sideways, by the time the bike was fully 90 degrees to my momentum vector, and the wheels were sliding out from under me, I was already calmly kicking my legs backwards and positioning myself for the impact. I hit the ground and slid 150 feet while watching my bike spark and slide away. I thought "shit I am in traffic" jumped to my feet and ran to the bike, picked it up and pushed it into a parking lot.

All I am saying is, it's more complicated than that. The memory of such things and the whole "fight or flight" response is an evolving, learning response. It's more than just visual; it encompasses all the senses. I doubt "cat facing us in mid air" is going to trigger much beyond anything in mid air moving towards us.

Re:The human eye can detect 30 (1)

EvilBudMan (588716) | more than 4 years ago | (#30672468)

I would say it stops at 120fps, but I haven't been able to run tests past that point so far.

Important in movies as well (1, Interesting)

Anonymous Coward | more than 4 years ago | (#30672026)

Avatar had a lot of flickering because of the frame rate. The flicker gets more obvious with 3D and Imax. Apparently there is talk of going to 60 frames for projected movies but I wouldn't hold my breath since theaters are already squealing about switching to digital projection and 3D. The technology is becoming available but I'll be surprised if they try to deploy it before the 2020s. Too bad because it would make a massive difference for action films especially 3D. With talking head pictures you'd never notice the difference.

Absolutely (5, Funny)

occamsarmyknife (673159) | more than 4 years ago | (#30672036)

I couldn't agree more. That Internal Server Error looks way better at 120 Hz on my 45" HD display.

Re:Absolutely (0)

Anonymous Coward | more than 4 years ago | (#30672470)

I couldn't agree more. That Internal Server Error looks way better at 120 Hz on my 45" HD display.

And those animated gifs of the flashing siren, flaming torch, and spinning globe look so much better too!

Headroom... (2, Insightful)

Anonymous Coward | more than 4 years ago | (#30672056)

The biggest reason to go for the highest frame rate possible is headroom. If your framerate is 30 at best, it'll dip down to 10 sometimes. If it's 120 at best, it can dip down to 30 and still be playable.

Any animator knows... (5, Interesting)

Monkeedude1212 (1560403) | more than 4 years ago | (#30672070)

You can tell the difference between 30 FPS and 60 FPS.

The way I tested this: I made a 2-second video in Flash, a circle moving from the left side of the screen to the right. 60 frames. Ran it at 30 FPS.

Then I made a second 2-second video with the exact same positions. 12 Frames. Ran it at 60 FPS. Then I asked myself and all of my surrounding classmates, which was about 24 students IIRC.

100% of us noticed a visible difference in smoothness. Whether our eyes were making out each individual frame perfectly or blurring some together to create a smoother effect was irrelevant, since there WAS a noticeable difference. I was going to slowly bump the 30 and 60 FPS up higher and higher to see at what point the difference stops being distinguishable, but I got lazy (high school student at the time).

The point I think most gamers would agree on is that more frames per second are nice, but that 30 frames per second are necessary. You can occasionally dip down to 24 and be alright (24 is supposedly the speed most movie theatres play at), but when you get down around 20 or so it really does take away from the experience.
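The experiment described above is easy to reproduce without Flash. A rough sketch (the frame counts and 2-second sweep come from the comment, with the author's "120 frames" correction applied; screen width and everything else are made up):

```python
def positions(n_frames, width=800):
    """x-coordinate of the circle on each frame of a 2-second left-to-right sweep."""
    return [i * width / (n_frames - 1) for i in range(n_frames)]

low = positions(60)     # 60 frames over 2 s  -> shown at 30 FPS
high = positions(120)   # 120 frames over 2 s -> shown at 60 FPS

# Per-frame jump in pixels: the 30 FPS version moves roughly twice as far
# between consecutive frames, which is what reads as "less smooth".
step_low = low[1] - low[0]
step_high = high[1] - high[0]
print(round(step_low, 2), round(step_high, 2))
```

The eye is sensitive to the size of those per-frame jumps, not to the frames themselves, which is why the difference is visible even when individual frames are not.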

Re:Any animator knows... (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#30672086)

120 Frames*, I mean. Sheesh. Not proofreading even though there's a preview button.

Re:Any animator knows... (1)

Sockatume (732728) | more than 4 years ago | (#30672244)

TimeSplitters was the first game I played that was locked at 60fps: it was quite a remarkable transition, even from games which were locked at 30fps, never mind games that fluctuated (I'll take 30fps locked over 30-60fps any day). Gran Turismo had a "Hi-Spec" mode which doubled the resolution and framerate too, albeit at an obvious graphical cost, and it looked like The Future.

On the subject of movie theatres, 24fps was chosen because it's pretty much as low as you can go before people notice problems. Roger Ebert and various others have been arguing for a doubling of movie framerates for years to no avail. Studios don't want to pay the film and (these days) CGI cost of double the frames (which is why they went with the lowest possible framerate to begin with).

Re:Any animator knows... (1)

StripedCow (776465) | more than 4 years ago | (#30672364)

If you use, instead of a circle, a turning wheel with spokes, you can "see" the wheel suddenly going backward as you drop the framerate.

Re:Any animator knows... (5, Interesting)

jeffmeden (135043) | more than 4 years ago | (#30672434)

You can occasionally dip down to 24 and be alright (24 is supposedly the speed most movie theatres play at), but when you get down around 20 or so it really does take away from the experience.

If by 'supposedly' you mean 'definitely', and if by 'most movie theaters' you mean 'all theaters, and even all motion picture production processes in recent years', then yes. The difference is lost on most people, but the reason 24fps is acceptable in movies is that the frame you see isn't what happened at the instant it's displayed; it's everything that happened in the last 1/24th of a second, since it's recorded on film that was exposed for that 24th of a second. When a computer renders a frame, it only cares about what is happening at that exact 24th of a second, so the difference between a series of exact instants of motion and a series of frames that include the blur of what happens between them is HUGE.

However, this nuance is lost on pretty much everyone who fires up a computer game, notes the FPS indicator, and goes "OMG I CAN TOTALLY TELL ITS ONLY 30FPSZZZZ!!!! HOW INFERIOR!!!". Whine about framerates all you want, but they are only a small part of the experience.
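The exposure argument can be illustrated with a toy render: a bright dot crossing a 1-D "screen". A film-style frame averages many sub-instants of the exposure window; a game-style frame samples a single instant. (Entirely synthetic sizes and positions, chosen only to show the shape of the difference.)

```python
import numpy as np

WIDTH = 24

def dot_at(x):
    """A one-pixel bright dot at integer position x on a 1-D 'screen'."""
    frame = np.zeros(WIDTH)
    frame[x] = 1.0
    return frame

# The dot moves 6 pixels during one 1/24 s frame interval.
subpositions = [0, 1, 2, 3, 4, 5]

# Game-style frame: sample only the instant the frame is due.
game_frame = dot_at(subpositions[-1])

# Film-style frame: average everything that happened during the exposure.
film_frame = np.mean([dot_at(x) for x in subpositions], axis=0)

print(np.count_nonzero(game_frame))   # 1: a single sharp dot
print(np.count_nonzero(film_frame))   # 6: a streak, each pixel at 1/6 brightness
```

The film frame encodes the motion itself as a streak, which is why 24 film frames can feel smoother than 24 perfectly sharp game frames.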

Re:Any animator knows... (0)

Anonymous Coward | more than 4 years ago | (#30672520)

You can tell the difference between a rock-steady 60 fps and a rock-steady 30 fps; it feels better. But it is also true that what matters most is that the frame rate doesn't vary. For some reason the brain doesn't like frame drops. You will get a better "feeling" of motion from a movie at 24 fps than from a game that usually runs at 60fps but drops to 30fps in complex areas. It feels awkward, even if the framerate never drops below the 24 fps of the movie.

I've noticed that quite a lot while playing games... I won't be able to tell you whether a given game is running at 30fps or 20fps if the frame rate doesn't change. However, I would notice a framerate drop from 60 to 30 and back to 60. Of course, if you put the game running at 30fps beside the same one running at 20fps, the former will "feel" better to me.

The motion blur picture (1)

Lord Lode (1290856) | more than 4 years ago | (#30672072)

Whoa, the motion blur image with the birds and the mountain is nice, what game is that screenshot from??!!1

Re:The motion blur picture (1)

H0p313ss (811249) | more than 4 years ago | (#30672234)

Whoa, the motion blur image with the birds and the mountain is nice, what game is that screenshot from??!!1

Reality 1.0, very popular MMO with billions of users. Excellent graphics but the developers have been very unresponsive to bug reports.

I notice a difference from 30fps vs 100? (1)

Kungpaoshizi (1660615) | more than 4 years ago | (#30672128)

The difference is minute, but I've noticed that if you play an FPS with no vsync, 30fps is OK, but around 100fps you notice the tearing. I'm sure everyone knows what I'm talking about, but enable vsync and look at the difference. Of course your monitor's refresh rate comes into play, but there are slight fractions of a second you notice: at 30fps, a guy is looking away from you, then all of a sudden he's looking at you. At the max fps your monitor can support, plus whatever your video card is kicking out, you don't just see the guy looking away and then looking at you; you see the fraction of a second it takes him to TURN and look at you. You can see that turn. With no vsync it was probably a tear, but it is still there, and having won an FPS tournament, I can tell you it does matter.

Friend could see 57 vs 60. (1)

Maxo-Texas (864189) | more than 4 years ago | (#30672146)

I had a friend who was bothered by anything less than 60fps.

The screen looked "stuttery". He would take a lower resolution to maintain 60fps.

We could verify this in one game with a built in frame rate command.

This is like the "myth of the g spot" post a few days ago. sheesh.

Re:Friend could see 57 vs 60. (1)

Icegryphon (715550) | more than 4 years ago | (#30672264)

I would not doubt it for a second. Many people can see the difference, maybe not that small an increment. There are a few people like your friend who can probably see even more than just a 3 FPS difference.

Re:Friend could see 57 vs 60. (0)

Anonymous Coward | more than 4 years ago | (#30672370)

That's not strange: a game running at 57fps on a typical monitor with a 60Hz refresh rate will suffer from image tearing, as the frame being displayed changes partway through the monitor's refresh cycle. This is quite noticeable even at high frame rates, and is the reason for using vertical sync in gaming.

Now if your friend could tell the difference between 57hz and 60hz, that would be quite special.
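The 57-on-60 case can be modelled directly: with vsync off, each frame swap lands partway through a scanout, and because the two rates differ slightly, the tear line cycles through a fixed set of screen positions at the beat frequency (60 − 57 = 3 sweeps per second). A toy sketch with idealised timing, no real display involved:

```python
REFRESH_HZ = 60
RENDER_FPS = 57

# With vsync off, frame swap i happens at time i/RENDER_FPS seconds.  Its
# position within the current scanout (0 = top of screen, 1 = bottom) is the
# fractional part of (swap time * REFRESH_HZ).  Integer math keeps it exact:
# frac(i * 60 / 57) = ((i * 60) % 57) / 57.
positions = [((i * REFRESH_HZ) % RENDER_FPS) / RENDER_FPS
             for i in range(RENDER_FPS)]   # one second's worth of swaps

distinct = sorted(set(positions))
print(len(distinct))             # 19 distinct tear-line positions...
print(REFRESH_HZ - RENDER_FPS)   # ...swept 3 times a second (the beat frequency)
```

A tear line marching down the screen three times a second is exactly the kind of artifact that reads as "stuttery" without the viewer being able to say why.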

120 and 240Hz TVs would like to have a word... (0)

Anonymous Coward | more than 4 years ago | (#30672246)

I mean, all you have to do is convert a regular 24FPS movie to 120Hz and you can see the massive difference that framerate makes in the smoothness of playback. That said, I am generally comfortable with 30FPS when it's consistent. Little in gaming is more annoying than seeing framerates adjust from scene to scene. Try watching a movie like that...

In addition... (0)

Anonymous Coward | more than 4 years ago | (#30672258)

Can we edit the post to tell people that, no matter what their FPS meter says, they cannot see more frames per second than their monitor can draw? It seems to be a widespread myth that a game showing '500FPS' is actually drawing all of those to a 60Hz monitor.

Also... isn't the minimum FPS for fluid motion something like 23.4? Not 30.
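The cap the parent describes can be stated in one line. A simplified sketch (it ignores vsync-off tearing, where slices of more than one rendered frame can share a single refresh):

```python
def frames_reaching_screen(rendered_fps, refresh_hz):
    """Complete new frames per second the display can actually show.

    With vsync on, each refresh shows at most one new frame, so anything
    the GPU renders beyond the refresh rate is simply discarded.
    """
    return min(rendered_fps, refresh_hz)

print(frames_reaching_screen(500, 60))  # the "500 FPS" meter: only 60 frames shown
print(frames_reaching_screen(45, 60))   # below refresh: all 45 frames shown
```

The meter still measures something useful (rendering headroom), it just doesn't measure what reaches the eye.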

Movies at only 24/25 FPS are horrible (1)

nmg196 (184961) | more than 4 years ago | (#30672296)

Personally I get annoyed by the fact that although they've invented HD (woohoo) they're still shoving it out at only 24 or 25 FPS. To me, this looks really jittery! I wish they'd go up to 50FPS for HD.

Watching Avatar in 3D seemed to accentuate that problem. I'm not sure how they do the whole left/right thing in terms of projection, but it seemed to me that the left/right image was shown alternately and at nothing like a high enough speed for me to perceive it as fluid motion. Did anyone else notice this?

Age-old confusion. (1)

SD-Arcadia (1146999) | more than 4 years ago | (#30672306)

The 30-fps-is-all-you-can-see myth was probably born of the notion that the illusion of continuous movement starts to set in around 25-30fps (in film, for example). So 30fps is actually the minimum you need rather than the maximum you can perceive.
I could tell at a glance the difference between 72fps and 100fps (both common refresh rates that translate to the max fps when v-sync is on) in Counter-Strike, just by briefly moving the mouse to pan the scene.
This site has had the definitive explanation on this issue for a long time, along with many other useful faqs: http://www.100fps.com/how_many_frames_can_humans_see.htm [100fps.com]

Plenty numerous all right (2)

1u3hr (530656) | more than 4 years ago | (#30672308)

the numerous advantages of a high framerate, and there's plenty of those.

Brought to you by the Department of Redundancy Department.

Psychic pain. (1)

hyperion2010 (1587241) | more than 4 years ago | (#30672386)

From my own experience it's not so much a specific constant frame rate as the fact that the frame rate fluctuates. If all game developers made sure that their games NEVER dipped below 30fps during major action (FPSes especially) that would be fine, but when 30fps is the average between 45 and 15 fps we have a big problem. Heck, even a drop from 60fps to 30fps is noticeable and disconcerting. I bet if you hooked players up to a machine that could measure emotional responses, you would find that sudden drops in framerate elicit strong negative responses which negatively affect performance.

Thanks... (1)

rickb928 (945187) | more than 4 years ago | (#30672392)

Now I can justify another $1000 worth of hardware to my wife, to play the same game I can get on a $300 console.

She gets it. I'm the Computer Guy. I know how it works. I know what is needed. I know how to keep her from being unable to play Farmtown. Or is it Fishville? Hard to keep up with the Facebook privacy violations/games.

Ya gotta have priorities.

You can see the difference between 72 and 100 (0)

Anonymous Coward | more than 4 years ago | (#30672394)

LCD framerates being capped have confused the issue, but on a CRT there was more than one reason the minimum should be 72. Then one day a friend suggested bumping it to 100 for better gameplay (Quake). I thought he was nuts, but I tried it, and damned if I didn't start making shots that I had been missing previously. Or at least thought I was missing! The difference was quite notable.

While this is probably on the upper end, it does happen. Also this is after many years of intensely competitive play, which will develop specific skills. [No, I don't strafe around corners in real life.]

That said, games have been getting slower and slower over the years. Modern titles just aren't at that warp 12 pace now, which is probably for the best given the limiting factor of LCD panels.

3d (1)

StripedCow (776465) | more than 4 years ago | (#30672400)

If you use one half of the 60fps for each eye, you have 30fps in 3D... that's probably one reason for the drift back to 30fps.

The 30fps myth (1)

asdf7890 (1518587) | more than 4 years ago | (#30672424)

The 30fps myth is simply an over simplification. The eye+brain starts to naturally perceive movement at around 10fps, usually a little lower. Motion usually starts to appear smooth somewhere between 15 and 25fps though it depends on many factors other than just the framerate (smoothness of the frame rate, relative change velocities of objects (or parts thereof) in the image, absolute colour and tone, colour and tone contrasts within the image, the existence or not of dropped frames and other inconsistencies, ...).

People often take this (the "15 to 25fps" bit, ignoring the "depending on..." complications) as meaning there is no need to go above 30fps, and in many cases there probably is no need, but in a number of conditions a higher framerate can affect the perception of movement quite significantly especially for fast moving objects/scenes.

I never understood why... (1)

JustNiz (692889) | more than 4 years ago | (#30672446)

Nearly everyone these days uses LCD monitors with a pathetic maximum of 60Hz at HD resolutions (I think because of DVI spec/bandwidth limitations; whatever moron invented DVI needs to be shot for that).
I still have an analog CRT monitor that supports much higher refresh rates at HD resolutions, which gives a very noticeable edge when playing twitch games like Unreal Tournament.
I never understood why people claim framerates above 60fps are better when their monitor is only capable of displaying 60Hz at the resolution they play at. The only difference you get from framerates above 60fps (i.e. vsync turned off) is obvious tearing. You're still getting an actual 60Hz framerate because of the monitor, regardless of what the PC is doing.

Brightness (1)

ChrisMaple (607946) | more than 4 years ago | (#30672460)

The brightness of the image and ambient lighting makes a difference. The more light that goes into your eye, the faster it responds. I run 1600x1200 @ 62 Hz interlaced, and sometimes I notice flicker. When that happens I close the shades, and the flicker goes away.

30 Fps myth (2, Interesting)

ggendel (1061214) | more than 4 years ago | (#30672462)

There were a lot of studies done a long time ago, and there are some very accurate psycho-visual computer models of the human visual system. I had the pleasure of working with the Jeff Lubin model when I worked at Sarnoff Corp, which won an Emmy Award back in 2000.

The 30 fps requirement is not a fixed point; it depends upon a lot of other factors, including viewing distance, field of view, and lighting conditions. The reason film operates at 24 fps is that it is expected to be viewed in a darkened room. When film is transcoded for TV, the gamma has to be modified for a normally lit viewing area or it will look bad. NTSC TVs are interlaced, displaying 60 fields per second even though the frame rate is 30 frames per second.

Bottom line is that this article should include the environmental factors under which this point was made.
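The interlacing point above can be sketched: each 1/60 s NTSC field carries every other scanline, and a "weave" deinterlacer interleaves an even and an odd field back into one 30 fps frame. (Toy array sizes, assuming numpy.)

```python
import numpy as np

HEIGHT, WIDTH = 6, 4
frame = np.arange(HEIGHT * WIDTH).reshape(HEIGHT, WIDTH)

# NTSC-style interlacing: each 1/60 s field carries half the scanlines.
even_field = frame[0::2]   # lines 0, 2, 4
odd_field = frame[1::2]    # lines 1, 3, 5

# "Weave" deinterlacing: interleave the two fields back into one full frame.
woven = np.empty_like(frame)
woven[0::2] = even_field
woven[1::2] = odd_field

print(np.array_equal(woven, frame))   # True: two 60 Hz fields = one 30 Hz frame
```

This is why "60 fields per second" and "30 frames per second" describe the same signal, and why quoting either number alone muddies the 30fps argument.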

The difference in framerate (5, Interesting)

DeskLazer (699263) | more than 4 years ago | (#30672478)

15 FPS vs 30 FPS vs 60 FPS [boallen.com]. This is a visual representation. There are claims, however, that when you watch a movie, the image is "softened" and runs at a lower framerate (something like 24 or 25 FPS) because your brain helps "fill in the gaps", or something of that sort. Pretty interesting stuff.

Sorry, you lost me (3, Insightful)

nobodyman (90587) | more than 4 years ago | (#30672484)

As more and more games move away from 60fps *snip*

Hmm... I don't accept that premise, either on the PC (where midrange graphics cards can easily pull 60fps with any game on the market now) or on the consoles (where framerates are only going up as PS3 and 360 development matures).

I think that this article (or at least the summary) is a bit of a strawman. Most of the gamers I know recognize that good framerates are important.
