Eurogamer Face-Off: Ryse on PC: the new benchmark for PC gaming hardware?

When you do a little math and compare what it takes for the 290X to hit its target resolution and framerate against what the Xbox One achieves with only a fifth of the power, it takes roughly a PC with 200% the power of the Xbox One to run the game at a similar resolution and framerate. I don't really see what is 'not much more' about that. It shouldn't ever have to be argued that rendering purely in hardware has a major advantage over having loads of additional software in the way, like DirectX.
In terms of raw specs and power, PCs have always been much better than any console. However, the game quality consoles output is much better than any comparably specced PC. A PC has so much power wasted on all the background stuff it's absurd. That's why a PC typically has to have so much more RAM, CPU power, etc. to be on an even keel with a console game. At some point, though, raw power beats out a console no matter how streamlined it is.

Looking back, it was amazing how the Genesis/SNES were able to blow past my 286/386. For text-heavy games, the PC was much better. For anything fast and flashy, with neat effects and sound, those consoles with 64 or 128 KB of RAM and puny CPUs ran games much better, despite my PC having 1 or 2 MB of RAM and Intel CPUs at 16 or 33 MHz. And that's excluding needing to buy a sound card and speakers. The Genesis/SNES still sounded better, especially the SNES's sound chip.

If PCs could somehow harness the raw power they have without all that background stuff eating into it, PC games would be much better.
 
When you do a little math and compare what it takes for the 290X to hit its target resolution and framerate against what the Xbox One achieves with only a fifth of the power, it takes roughly a PC with 200% the power of the Xbox One to run the game at a similar resolution and framerate. I don't really see what is 'not much more' about that. It shouldn't ever have to be argued that rendering purely in hardware has a major advantage over having loads of additional software in the way, like DirectX.


What are you talking about? An R9 280 isn't 200% more powerful than the XB1 GPU equivalent. Sure, a 280 is better... but not 200% better.


Did you read the article? An R9 280 ran the game at 1080p at anywhere from the mid-40s to mid-50s fps.



It's time people stopped pushing this idea that it takes a 200% more powerful PC to match just what the XB1 pumps out. Utter nonsense.
 
An XB1-equivalent GPU running Ryse on a low-end PC at 1080p.
 
What are you talking about? An R9 280 isn't 200% more powerful than the XB1 GPU equivalent. Sure, a 280 is better... but not 200% better.


Did you read the article? An R9 280 ran the game at 1080p at anywhere from the mid-40s to mid-50s fps.



It's time people stopped pushing this idea that it takes a 200% more powerful PC to match just what the XB1 pumps out. Utter nonsense.
I don't know where you got 200% from. A fifth is only 20%.
 
You.

It takes roughly a PC with 200% the power of the Xbox One to run the game at similar resolution
So how does "a PC" imply the 290X? Why are you dodging the point?

Edit: if it actually was confusing or difficult to read, then here it is in more layman's terms: the 290X is achieving a resolution roughly one and a half times the size of 900p, and doing so at only twice the frame rate the original X1 version achieved, despite being five times the power of the Xbox One.

1080p at 60 fps requires roughly three times more power to render than 900p at 30 fps. And with the Xbox One being one fifth the power of the 290X, the 290X is only delivering about three times the performance, meaning that a lot of performance was lost in the transition to DirectX.
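The pixel-throughput arithmetic above can be checked with a quick calculation (a sketch using the resolutions and frame rates quoted in this thread; pixels per second is only a rough proxy for rendering work, since AA, shadows and detail settings also cost performance):

```python
# Pixels rendered per second as a rough proxy for rendering work.
def throughput(width, height, fps):
    return width * height * fps

xb1 = throughput(1600, 900, 30)   # Xbox One version: 900p at 30 fps
pc = throughput(1920, 1080, 60)   # PC target: 1080p at 60 fps

print(pc / xb1)  # 2.88, i.e. roughly three times the rendering work
```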
 
So how does "a PC" imply the 290X? Why are you dodging the point?

Edit: if it actually was confusing or difficult to read, then here it is in more layman's terms: the 290X is achieving a resolution roughly one and a half times the size of 900p, and doing so at only twice the frame rate the original X1 version achieved, despite being five times the power of the Xbox One.


You keep harping on the 290, ignoring that the 280 also ran at 1080p.
Also ignoring the 7850 running it at 1080p.

So your base comment that it takes a PC that is 200% more powerful to run at similar XB1 specs is false. 1.5x the resolution at double the performance is a far cry from "similar".
 
You keep harping on the 290, ignoring that the 280 also ran at 1080p.
Also ignoring the 7850 running it at 1080p.

So your base comment that it takes a PC that is 200% more powerful to run at similar XB1 specs is false.
The 280 ran at 1080p and 45 fps. Why don't you do the math and show me how it figures differently? And after that, please tell me: why be in such denial about hardware being seriously faster than software?
 
The 280 ran at 1080p and 45 fps. Why don't you do the math and show me how it figures differently? And after that, please tell me: why be in such denial about hardware being seriously faster than software?

How does it not figure differently? The 280 is nowhere near 200% more powerful than the XB1.
What about the 7850, which also runs Ryse at 1080p... and which is about on par with the XB1 GPU?

A claim was made that you need a much more powerful PC to run Ryse at about the same level as the XB1. It is simply not true. Your so-called math doesn't support your claim.

It's a nonsense claim. It's like someone saying the XB1 is "x" times more powerful than the 360... yet it only runs Ghosts at a slightly higher resolution at the same exact fps.


And for the record, on PC, that mid-range 280X sees 1080p, better AA, 45-55 fps, better shadows and better details at further distances. All in a card that is not 200% more powerful. Your math is simply fuzzy.

If you stop looking at just the "1.5x resolution difference" and see that the PC version actually improves in several areas, you can see quickly where the power is coming into play. All in a $200 GPU.
 
How does it not figure differently? The 280 is nowhere near 200% more powerful than the XB1.
What about the 7850, which also runs Ryse at 1080p... and which is about on par with the XB1 GPU?

A claim was made that you need a much more powerful PC to run Ryse at about the same level as the XB1. It is simply not true. Your so-called math doesn't support your claim.

It's a nonsense claim. It's like someone saying the XB1 is "x" times more powerful than the 360... yet it only runs Ghosts at a slightly higher resolution at the same exact fps.


And for the record, on PC, that mid-range 280X sees 1080p, better AA, 45-55 fps, and better details at further distances. All in a card that is not 200% more powerful. Your math is simply fuzzy.
Oh. So the 290X is five times more powerful than the Xbox One, but the 280X is not even twice as powerful as the Xbox. I guess I forgot about the 280X being a low-end card. I guess you were right all along.
 
Oh. So the 290X is five times more powerful than the Xbox One, but the 280X is not even twice as powerful as the Xbox. I guess I forgot about the 280X being a low-end card. I guess you were right all along.


It's nonsensical for anyone to believe that a GPU being "x" times more powerful is supposed to mean that games should run "x" times faster. In the real world, that's not how it works. Hence why I gave you the XB1 versus 360 power difference as an example. There you have an incredibly large difference in power, but you don't see that difference translated into game performance.
It's short-sighted to look at resolution and performance as the only measures of difference. Again, in Ryse, you're not only seeing 1.5x higher res and nearly double the performance, but you get better AA, better details and better shadows.


In the end, you simply don't need a "much more" powerful PC to run Ryse at 1080p and 30 fps. Turns out, a PC near XB1 specs will do it.
 
It's nonsensical for anyone to believe that a GPU being "x" times more powerful is supposed to mean that games should run "x" times faster. In the real world, that's not how it works. Hence why I gave you the XB1 versus 360 power difference as an example. There you have an incredibly large difference in power, but you don't see that difference translated into game performance.
It's short-sighted to look at resolution and performance as the only measures of difference. Again, in Ryse, you're not only seeing 1.5x higher res and nearly double the performance, but you get better AA, better details and better shadows.


In the end, you simply don't need a "much more" powerful PC to run Ryse at 1080p and 30 fps. Turns out, a PC near XB1 specs will do it.

This.

I'd say a 7850 is more on par with the PS4; the XB1 is more on par with a 7790.
 
It's nonsensical for anyone to believe that a GPU being "x" times more powerful is supposed to mean that games should run "x" times faster. In the real world, that's not how it works. Hence why I gave you the XB1 versus 360 power difference as an example. There you have an incredibly large difference in power, but you don't see that difference translated into game performance.

In the end, you simply don't need a "much more" powerful PC to run Ryse at 1080p and 30 fps. Turns out, a PC near XB1 specs will do it.
What is your basis for all this?
 
What is your basis for all this?


Real-world benchmarks of various PC GPU/CPU components throughout the years.

"If" real world performances played out as your fuzzy math dictates, then a gaming pc today should be running Black Ops at well over 300 fps.




Again, you're limiting your knowledge of how PC hardware works to just what you see in terms of resolution and fps... and trust me, you are limiting it by a lot.

A 280X is running the game at 1080p at 45-55 fps, AND that's with better shadows, details and AA. Just imagine the performance jump if the game was at the XB1 resolution AND had toned-down AA, shadows and details (you might not know this, but shadows are system hogs).


Now, you say that the 290X is 5x better than the XB1. In real-world application, that doesn't mean the 290X would pump out 30 fps x 5 (150 fps). That's simply not how it works. The more you cling to that idea with your fuzzy math, the more you show us that you have no idea how it works.

Take the below as an example. These are GPU benchmarks for Crysis 3. Notice that the very high-end card in the list is not even a full 3x faster in fps when compared to the lowest card in the list, despite the differences between the cards on paper. AND, that's with all things being equal (same game settings for all cards). In the case of Ryse, the 280X is not leaps and bounds better than the XB1's GPU, and you're getting nearly double the fps, higher resolution, better shadows, details and AA (I can't stress that enough). You're not even comparing apples to apples, and you're coming to insane conclusions as to what you "think" the performance should be.

[Crysis 3 GPU benchmark chart]
 
Real-world benchmarks of various PC GPU/CPU components throughout the years.

"If" real-world performance played out as your fuzzy math dictates, then a gaming PC today should be running Black Ops at well over 300 fps.




Again, you're limiting your knowledge of how PC hardware works to just what you see in terms of resolution and fps... and trust me, you are limiting it by a lot.

A 280X is running the game at 1080p at 45-55 fps, AND that's with better shadows, details and AA. Just imagine the performance jump if the game was at the XB1 resolution AND had toned-down AA, shadows and details (you might not know this, but shadows are system hogs).


Now, you say that the 290X is 5x better than the XB1. In real-world application, that doesn't mean the 290X would pump out 30 fps x 5 (150 fps). That's simply not how it works. The more you cling to that idea with your fuzzy math, the more you show us that you have no idea how it works.

Take the below as an example. These are GPU benchmarks for Crysis 3. Notice that the very high-end card in the list is not even a full 3x faster in fps when compared to the lowest card in the list, despite the differences between the cards on paper. AND, that's with all things being equal (same game settings for all cards). In the case of Ryse, the 280X is not leaps and bounds better than the XB1's GPU, and you're getting nearly double the fps, higher resolution, better shadows, details and AA (I can't stress that enough). You're not even comparing apples to apples, and you're coming to insane conclusions as to what you "think" the performance should be.

[Crysis 3 GPU benchmark chart]
That benchmark pic shows the GTX 680 as being 25% more powerful than the 660 Ti. That sounds about right when comparing it to PC hardware performance graphs for other games.
 
That benchmark pic shows the GTX 680 as being 25% more powerful than the 660 Ti. That sounds about right when comparing it to PC hardware performance graphs for other games.

How can you continue to be so wrong? Did you miss where I said that chart had all else being equal? When comparing PC Ryse to the console version, all things are NOT equal. You have higher resolution, better shadows, better details and better AA. You can't just throw some arbitrary % out there and say, well, if the 280X is "x" times faster than the XB1, then Ryse on PC with a 280X should be "x" times faster. In your world, using your fuzzy math, you think it should be (XB1 fps) x ("x", where x is how much faster the 280X is) = PC fps. I can't begin to tell you how wrong that is. All things are not equal, so your calculations are way off base.


THAT is why I used XB1 CoD performance compared to 360 CoD performance. Using your jacked-up logic, Ghosts on XB1 should be running at well over 300 fps.
 
Take the below as an example. These are GPU benchmarks for Crysis 3. Notice that the very high-end card in the list is not even a full 3x faster in fps when compared to the lowest card in the list, despite the differences between the cards on paper.
Paper specs do not matter, especially when we have benchmark tools that show accurate performance analysis. And I don't get your comparison with Call of Duty BO2 on the 360 and X1 at all. I thought tessellation wasn't possible on the 360. But ignoring all of that, I really want to know what convinced you to believe software rendering is as fast as hardware rendering.
 
Paper specs do not matter, especially when we have benchmark tools that show accurate performance analysis. And I don't get your comparison with Call of Duty BO2 on the 360 and X1 at all. I thought tessellation wasn't possible on the 360. But ignoring all of that, I really want to know what convinced you to believe software rendering is as fast as hardware rendering.


Dude, it's painfully obvious that you simply don't get why the PC version of a console game does not run at "x" times the speed of the faster GPU.

You are ignoring over and over again the fact that you are NOT comparing apples to apples. A PC GPU being 5x better than the XB1 GPU doesn't translate to 5x faster fps when all other things are NOT equal.
I keep spelling it out for you, and you keep ignoring it.

I used the CoD game as an example... to show how stupid the argument is. Why did you bring up tessellation? (Hint: because that makes all things NOT equal... the point you've been ignoring.)

If we were comparing Ryse on PC at 900p with less AA, lesser shadows and less detail, then the comparison would be better... but we're not. We're looking at 1.5 times the resolution, better shadows, better details and better AA, all at about 80% better performance.
 
Dude, it's painfully obvious that you simply don't get why the PC version of a console game does not run at "x" times the speed of the faster GPU.

You are ignoring over and over again the fact that you are NOT comparing apples to apples. A PC GPU being 5x better than the XB1 GPU doesn't translate to 5x faster fps when all other things are NOT equal.
I keep spelling it out for you, and you keep ignoring it.

I used the CoD game as an example... to show how stupid the argument is. Why did you bring up tessellation? (Hint: because that makes all things NOT equal... the point you've been ignoring.)

If we were comparing Ryse on PC at 900p with less AA, lesser shadows and less detail, then the comparison would be better... but we're not. We're looking at 1.5 times the resolution, better shadows, better details and better AA, all at about 80% better performance.
I see. The difference in graphics between the 360 version of Black Ops 2 and the Xbox One version of the same game is comparable to the extreme difference in visual quality between Ryse on Xbox One and the PC version. I guess you are right about the whole software-being-as-fast-as-hardware thing. Let us see these extreme differences in quality in a well-made comparison video:

 
Now he's wrong on 2 points.

1. He continues to say BO2 on 360 vs XB1.
2. He now uses a youtube vid to show the differences.



Buddy, whether or not you see the differences in a youtube vid doesn't mean the game isn't pushing those differences.
 
lol Qbert, you have a lot of patience. It really seems like Kassen just argues because he likes to argue.
 
Now he's wrong on 2 points.

1. He continues to say BO2 on 360 vs XB1.
2. He now uses a youtube vid to show the differences.



Buddy, whether or not you see the differences in a youtube vid doesn't mean the game isn't pushing those differences.
I guess that makes sense. A youtube video can't show the vast difference between the PC version and the Xbox One version of Ryse. Hell, not even screenshots can show that vast difference. It's an invisible difference that no one can understand. Software is as fast as hardware and always has been. There is no benefit to having a closed API designed for a single hardware platform.
 
I think qbert wins this argument because he is correct when he says Kassen isn't comparing apples to apples. Perhaps instead of saying better AA, shadows, etc., he would understand better if you said what he's doing is like comparing a game at 1080p on low details vs 1080p at max details and calling that a valid comparison of fps performance.
 
I just ran through all of Ryse at max settings on PC at 1080p. My god, what a looker; the best-looking game I've ever seen, bar none. The gameplay is extremely repetitive and does get old, but the story is pretty decent.
 
I think qbert wins this argument because he is correct when he says Kassen isn't comparing apples to apples. Perhaps instead of saying better AA, shadows, etc., he would understand better if you said what he's doing is like comparing a game at 1080p on low details vs 1080p at max details and calling that a valid comparison of fps performance.


Exactly. Whether or not Kassen thinks there's an invisible difference doesn't mean there isn't a difference. Fact is, he is simply not comparing apples to apples. The guys that actually, you know, do these analyses (versus fanboys that watch youtube vids) even said so. Fact is, the PC version is pushing more pixels, better shadows, etc. That's not up for debate.

Kassen could repeat over and over again until he's blue in the face that he doesn't see a difference based off of youtube vids... that doesn't mean there is no difference.