60 FPS or 30 FPS. Can you see the Difference?

Too easy. 11/11. Found them all almost immediately, the moment my eyes saw the 60fps one, except for the tunnel one where my laptop was bogged down. The 60fps effect was most evident when the first-person camera was turned/rotated, and in third-person games regardless of a changing camera.
 
I got all of them right, but I gotta say that in some games it didn't really matter. Especially Max Payne 3. The most recent case is Shadow of Mordor: I'm playing that game on PC at a constant 60fps, and I really can't see any difference from the console versions.
 
11/11

There's something about 60fps that just adds smoothness to the picture. I spotted the 60fps ones in seconds.
 
6/11, so I guess most of the time, I can tell what's happening. Although it's probably more 50/50 for me.
To me, it was pretty easy, but not as apparent as having games actually running on a TV in your face. The 60fps feed definitely was not as crisp as playing for real. It felt more like 30 fps vs 40 or 45 fps.
 
11/11. Too easy.
But according to some people, nobody can see more than 30fps. And then others say the human eye can't see past 24fps! lol Who knew that cinema's typical 24fps recording perfectly matches the limitations of the human eye! That's scientific genius on the part of the dude who invented film.
 
But according to some people, nobody can see more than 30fps. And then others say the human eye can't see past 24fps! lol Who knew that cinema's typical 24fps recording perfectly matches the limitations of the human eye! That's scientific genius on the part of the dude who invented film.
So true. It irks me when people tell me what I can and can't see with my own eyes, just because some bs study somewhere says so, so IT MUST BE FACT! A complete lesson in trusting your own judgement.

Also, I do believe that the correct motion blur certainly DOES help close the gap. Even so, I think the jump is obvious to me. I think 60 is really a sweet spot for the human eye/brain to be fed light in motion, yet not be so wasteful with regards to content storage or graphics processing.

That said, FH2 still looks pretty darn amazing at 30 fps somehow. There's always a tradeoff, I guess.
 
So true. It irks me when people tell me what I can and can't see with my own eyes, just because some bs study somewhere says so, so IT MUST BE FACT! A complete lesson in trusting your own judgement.

Also, I do believe that the correct motion blur certainly DOES help close the gap. Even so, I think the jump is obvious to me. I think 60 is really a sweet spot for the human eye/brain to be fed light in motion, yet not be so wasteful with regards to content storage or graphics processing.

That said, FH2 still looks pretty darn amazing at 30 fps somehow. There's always a tradeoff, I guess.
I'm not 100% sure, but I think the study says we only really need 24 fps; anything above is simply the gravy on mashed potatoes. Yet if it goes under, it's a very noticeable distortion of reality, if you will. Of course, it's such an old study that people keep bringing up. It's really more about the bare minimum. It also depends on which medium the argument is being applied to. Film and games are so very different in their approach to visuals.
 
I'm not 100% sure, but I think the study says we only really need 24 fps; anything above is simply the gravy on mashed potatoes. Yet if it goes under, it's a very noticeable distortion of reality, if you will. Of course, it's such an old study that people keep bringing up. It's really more about the bare minimum. It also depends on which medium the argument is being applied to. Film and games are so very different in their approach to visuals.
That makes a lot more sense and actually seems familiar to me now. This is why there is such a disconnect with silent films that are under 24 fps, even if they do have their charm. (Love Buster and Charlie) It's just that people have been twisting that to mean you don't need over 24 or 30 Hz because you can't distinguish the difference. I actually had this argument with someone at TeamXbox back in the day. Can't remember who that stubborn fella was...
 
...also, these tests aren't accurate unless your hardware runs them at a minimum of 60Hz. If your device can't cope, the 30fps clip SHOULD actually appear smoother. Just saying.
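To put that in rough terms, here's a quick sketch (Python, purely illustrative; it assumes a fixed refresh rate with no adaptive sync) of which source frames actually make it to the screen when the display can't keep up:

# Which 60fps source frames actually land on screen for a given refresh rate.
# Purely illustrative; assumes a fixed refresh rate and no adaptive sync.
def frames_shown(content_fps, display_hz, seconds=1):
    shown = []
    for r in range(int(display_hz * seconds)):
        refresh_time = r / display_hz                    # when this refresh happens
        shown.append(int(refresh_time * content_fps))    # latest completed source frame
    return shown

print(frames_shown(60, 60)[:12])  # [0, 1, 2, 3, ...] every frame appears -> smooth
print(frames_shown(60, 45)[:12])  # [0, 1, 2, 4, 5, 6, 8, ...] frames skipped unevenly -> judder

Every third-ish frame just never appears, which can look choppier than an honest 30fps.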

On a related note, I just found out Youtube has a 60fps option now. I didn't even realize that.
 
10/11. The 2nd one is a poopy mess. I feel bad for those who can't tell the difference.
 
I'm not 100% sure, but I think the study says we only really need 24 fps; anything above is simply the gravy on mashed potatoes. Yet if it goes under, it's a very noticeable distortion of reality, if you will. Of course, it's such an old study that people keep bringing up. It's really more about the bare minimum. It also depends on which medium the argument is being applied to. Film and games are so very different in their approach to visuals.
Er, um, no bro. Film and video games are very similar in their approach to visuals. No two art forms share such identical values when it comes to filling a frame.
 
Er, um, no bro. Film and video games are very similar in their approach to visuals. No two art forms share such identical values when it comes to filling a frame.
Well, they are the same in one way, in that they are both composed of still images shown at a certain cycle per second (Hz), but there is one strong inherent difference. In film, the shutter speed directly relates to the amount of motion blur. At 24 fps, a filmmaker has to decide if a camera pan or fast action scene should contain either sharper detail or smoother motion. This IS tricky at 24 fps, no matter what anyone says. In, say, a 30 fps video game, you can try to recreate that as a post-processing technique.

What DOES remain the same is that in both mediums, the need for longer shutter speeds in film and for video game motion blur becomes less relevant at higher frame rates. You will find many cinephiles who are downright against 48 or 60 fps (HFR) movies. I personally welcome change and would rather let the director decide on a per-movie basis. Of course, this issue is highly tied to 3D films (Hobbit), which is kind of outside the scope of this thread, except to say that one's sensitivity to lower frame rates seems to be compounded with stereoscopic 3D.
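To put some numbers on that shutter tradeoff, here's a back-of-the-envelope sketch (Python; it just applies the common 180-degree shutter rule of thumb, so treat the figures as ballpark):

# Exposure time per frame for a given frame rate and shutter angle.
# Longer exposure = more motion blur captured in each frame.
def exposure_seconds(fps, shutter_angle_degrees=180.0):
    frame_time = 1.0 / fps
    return frame_time * (shutter_angle_degrees / 360.0)

for fps in (24, 30, 48, 60):
    print(f"{fps} fps: {exposure_seconds(fps) * 1000:.1f} ms of exposure per frame")

At 24 fps with a 180-degree shutter, each frame carries about 20.8 ms of smear; at 60 fps it drops to roughly 8.3 ms, which is exactly why motion blur (real or post-processed) matters less as the frame rate climbs.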
 
Er, um, no bro. Film and video games are very similar in their approach to visuals. No two art forms share such identical values when it comes to filling a frame.
Well, the differences I'm referring to are more akin to the finished product. A game runs on hardware, but film is rather static in comparison. You don't need new hardware to watch the Hobbit on your computer, but you do if you want to play Ryse. Films need new tech to create larger-scope movies, but not necessarily to play them. In my experience, there's no concern about a film's functionality in the same sense as a game. There's no fear of a movie crashing because of what's happening in the frames. Keep in mind, I'm not trying to say there aren't any similarities, but there are plenty of key differences. Video games will always be way more subject to interaction and functionality, because without either, they just end up being movies when it's all said and done.
 
Well, they are the same in one way, in that they are both composed of still images shown at a certain cycle per second (Hz), but there is one strong inherent difference. In film, the shutter speed directly relates to the amount of motion blur. At 24 fps, a filmmaker has to decide if a camera pan or fast action scene should contain either sharper detail or smoother motion. This IS tricky at 24 fps, no matter what anyone says. In, say, a 30 fps video game, you can try to recreate that as a post-processing technique.

What DOES remain the same is that in both mediums, the need for longer shutter speeds in film and for video game motion blur becomes less relevant at higher frame rates. You will find many cinephiles who are downright against 48 or 60 fps (HFR) movies. I personally welcome change and would rather let the director decide on a per-movie basis. Of course, this issue is highly tied to 3D films (Hobbit), which is kind of outside the scope of this thread, except to say that one's sensitivity to lower frame rates seems to be compounded with stereoscopic 3D.
Great response. We can discuss this in another thread, mate. This would be a great discussion to have with you, though. I've been a huge fan of the science and magic of video games all my life.
 
But according to some people, nobody can see more than 30fps. And then others say the human eye can't see past 24fps! lol Who knew that cinema's typical 24fps recording perfectly matches the limitations of the human eye! That's scientific genius on the part of the dude who invented film.
Those "people" are making s**t up.
 
The site is down now, but there's definitely a massive difference between 30 and 60 fps. 30fps is not smooth. While I am enjoying Forza Horizon 2, I can definitely feel the fps loss.
 
Didn't get a chance to try it out before the site went down, but I'd probably get about a 6/11. If it were a resolution thing I'd probably get 0/11, but I can sometimes get lucky and tell the difference between fps, especially when I'm actually trying to and not actually playing a game.
 
Site is down, but the whole 30 fps vs 60fps argument isn't about seeing, it's about feeling.
 
Site is down, but the whole 30 fps vs 60fps argument isn't about seeing, it's about feeling.


This. I've always said before that it's not about what you see, but rather what you feel, and that's why fps means even more to pc gamers. With console gamers, controllers are not 100% accurate and have a delay of sorts (whether it's input delays or stick deadzones). Also, large TVs tend to have more delay than pc monitors. Because of this, some console gamers claim they don't notice a difference.
But the difference isn't in what you see most of the time (although you can notice it), but rather how it feels when you're controlling the action, and that's why you notice the difference more in fast-paced games and not so much in slower-paced games. The speed at which you control the action using a controller or mouse doesn't translate 100% to what you're seeing on screen, so it appears choppy. Watch a movie at a lower fps and everything appears smooth, but you're not controlling what you see on screen.

That's my very un-scientific take on it.
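If you want to hang some rough numbers on the "feel" side, here's a toy calculation (Python; every figure is a made-up placeholder, not a measurement of any real controller or TV):

# Toy input-to-screen latency budget. All numbers are made-up placeholders.
def worst_case_latency_ms(fps, input_delay_ms, display_delay_ms):
    frame_time_ms = 1000.0 / fps
    # Worst case: your input arrives just after a frame started, so it waits
    # up to one frame to be sampled and another frame to be rendered and shown.
    return input_delay_ms + 2 * frame_time_ms + display_delay_ms

print(worst_case_latency_ms(60, input_delay_ms=8, display_delay_ms=20))  # ~61 ms on a quick monitor
print(worst_case_latency_ms(30, input_delay_ms=8, display_delay_ms=40))  # ~115 ms on a laggier TV

Roughly double the lag, which you feel in your hands long before you consciously "see" it.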
 
Er, um, no bro. Film and video games are very similar in their approach to visuals. No two art forms share such identical values when it comes to filling a frame.
I don't think he was serious. But just for the sake of a history lesson, film wasn't 24 frames, 16 frames, or 20 frames because of anything science-related. The low frame rates of early cinema were because movie makers back in the day had to save on film to help lower their budgets, and the standard just happened to never go away. And the push for 48p is a tough one because "it looks too real," as shown by the reception of the 48p theatrical release of The Hobbit.
 
Also, large TVs tend to have more delay than pc monitors.
You are correct in that they may not see the difference, but for a different reason than delay. Pixel response delay is a shortcoming of LCD technology (remember, most "LED TVs" are really LCD panels with an LED backlight... and OLED is very different). To compensate for this, a good portion of LCD displays these days have at least a 120 Hz refresh rate, so they may not get that lag anymore... Here's the thing, though: a lot of these sets also have frame interpolation (often marketed as motion smoothing), which inserts new approximated frames up to 120 fps... Anyone with this feature turned on likely won't see a difference between 30 and 60 fps.
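For anyone curious what those interpolation modes are doing, here's a deliberately oversimplified sketch (Python/NumPy; real TVs estimate motion vectors between frames rather than doing a plain blend):

import numpy as np

# Crude "frame interpolation": invent an in-between frame by blending two real ones.
# Real TV motion smoothing is far smarter (motion-vector based), but the idea is similar.
def interpolate(frame_a, frame_b, t=0.5):
    return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

frame_a = np.zeros((4, 4), dtype=np.uint8)      # a dark frame
frame_b = np.full((4, 4), 200, dtype=np.uint8)  # a bright frame
middle = interpolate(frame_a, frame_b)          # the invented in-between frame
print(middle)  # all 100s: halfway between the two real frames

With a mode like that inserting frames up to the panel's refresh rate, a 30fps and a 60fps source can end up looking a lot more alike than they otherwise would.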