Eurogamer: Face-Off: Ryse on PC - The new benchmark for PC hardware?

Plainview

via Eurogamer/Digital Foundry

Hot on the heels of the disappointing Dead Rising 3 port, Xbox One's other temporary launch exclusive, Ryse, is now primed and ready for release on PC. We had a chance to take a look at an unfinished demo build a couple weeks back, but now that we've spent some time with the release code we can finally explore it in greater detail.

Crytek's games have always been known for pushing the visual envelope and Ryse represents the latest iteration of its impressive CryEngine technology. Crysis 3 was a demanding but highly scalable game on the PC and, from what we've experienced, Crytek's latest release follows suit. Perhaps more than any other game to date, Ryse comes close to delivering the kind of experience you'd expect from a pre-rendered CG film. The game ships as a DX11 64-bit only release, unlike Crysis 3, no doubt as a result of the memory requirements and its Xbox One origins. There was talk of support for AMD's Mantle but this has not materialised and Crytek has stated that it will not be added at a later date.





Almost one year on from its original release Ryse still stands as one of the most visually impressive games on the market. The quality of the materials, lighting, and post-processing is remarkable and it comes close to achieving the pre-rendered look Crytek is clearly striving for. Indeed, Ryse just might be the most impressive-looking game available today on the PC.

It's an impressive effort, due in no small part to the excellent CryEngine powering the game. It's a demanding release with plenty of room to grow but it runs very well on hardware available now while scaling down to lower-end machines pretty smoothly. Compared to Dead Rising 3, this is something of a revelation and a very skilled porting effort on the part of Crytek - with the quality of its visuals and its high system requirements for the absolute top-end experience, it seems likely that Ryse will become a new benchmark for those looking to test the limits of their high-end PC hardware.
 
Can't wait to see it on PC. I still have it on Xbox and can't believe how incredible it looks even there. Still nothing on consoles looks as good; can't imagine a real gaming machine pushing it.
 
It actually makes sense that this is a good port. Crytek love the PC platform.

I actually do not understand all the "new hardware benchmark" stuff. The recommended specs are nothing special at all and sit pretty much in the mid-range tier for gaming PCs. I see games like Metro: Last Light and Crysis 3 still being the benchmarks for putting your PC through its paces.

Oh, and I read a review from somebody who played a build with an i3 and a 7870, and they hit a consistent 40-50fps. No doubt because the i3 held them back. Not much of a rig tester if you ask me.
 
I'll get it when it's about $20 or so. Can't believe it wasn't on GMG, wtf?
 
Runs pretty well maxed out at about 40-45fps on my 7950 Boost/junk/joke/console squasher/HTPC at 1920x1080
 
That's crazy that you really do need a high end card to hit 60fps even at just 1080p. This is why I love Crytek. They don't like limiting PC.
 
Just a heads up: I'm about 48 percent through the game with it locked at 30fps, and it's a VERY smooth and responsive 30fps at max settings and 1080p; Crytek's frame pacing is astounding. This is on a 7950 Boost. So do not be afraid to play it at 30fps.
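If you want to set the cap without an external limiter, CryEngine games normally expose a max-framerate console variable, so something like this in the config should do it (a sketch on my part: I'm assuming Ryse still honours the stock CryEngine sys_MaxFPS and r_VSync cvars, which I haven't confirmed):

; assumption: Ryse still reads the standard CryEngine cvars below
; cap the renderer at 30fps
sys_MaxFPS=30
; keep vsync on so the capped output stays tear-free
r_VSync=1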
 
Just got the game this morning and played the beginning just to get a feel for it. No frame drops whatsoever on my GTX 670: a consistent 30fps, only because I locked the frame rate. Everything is maxed out at 1080p. My only complaint is the temporal AA, which blurs the image and so affects the perceived quality of the textures. Crytek should have added MSAA or, at the very least, SMAA, which is my preferred post-processing AA method and has a negligible effect on IQ. I'm sure the modding community will release a tool that allows for more configurable options, as they did with Crysis 2 and 3.

Nevertheless, the game is a solid port.
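If the forced temporal AA bothers you before any mod tools turn up, it may be worth poking at the AA console variable CryEngine normally ships with. Treat this as a guess rather than a confirmed tweak: I'm assuming Ryse keeps the engine's r_AntialiasingMode cvar, and the value-to-mode mapping in the comment is unverified, so check the cvar's own help text in the console first.

; assumption: Ryse still exposes CryEngine's r_AntialiasingMode cvar
; value mapping is a guess (something like 0 = off, 1 = FXAA, 2 = SMAA without the temporal pass)
r_AntialiasingMode=2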
 

The temporal AA they use is actually called SMAA T1x. It was also in Crysis 3. I finally got a chance to get Ryse onto my main gaming machine with a heavily OC'd 780 Ti and played for about 15 minutes; I never saw it drop below 60fps at 1080p. I experimented with SSAA 1.5x1.5 and was in the high 30s to mid 40s, and it never dropped below 30 in the section I played.
 
Temporal AA and SMAA are two different types of AA.

http://www.tweakguides.com/Crysis3_6.html

The term Antialiasing (AA) refers to any method that can help smooth out jagged lines, and reduce the distracting shimmering and crawling of those lines when in motion. Crytek has chosen to build several AA methods into Crysis 3: FXAA (Fast Approximate Anti-Aliasing), SMAA (Subpixel Morphological Anti-Aliasing), MSAA (MultiSample Anti-Aliasing), and TXAA (Temporal Anti-Aliasing). The reason for this choice of AA methods is that each has its benefits and drawbacks, both in terms of image quality and performance. There is no single recommended optimal AA method, as it largely depends on a combination of your system's capabilities and personal taste.


TXAA: This option is only available if you are using an NVIDIA GTX 600 series or GTX Titan graphics card. TXAA is based on MSAA, but places greater emphasis on reducing shimmering when in motion, known as temporal aliasing. This distracting shimmering can be particularly disastrous in Crysis 3 multiplayer when you're on the lookout for cloaked enemy nanosuits, which also shimmer. The available options are TXAA Medium (2xT) and TXAA High (4xT).
 
Runs pretty well maxed out at about 40-45fps on my 7950 Boost/junk/joke/console squasher/HTPC at 1920x1080
Is that with supersampling or without?

I can run it pretty well with everything maxed at 1080p but supersampling off. Once I turn it on, sh!t gets real slow, like fps in the 20s. I'm running it with a GTX 680.
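For what it's worth, older CryEngine builds also let you force supersampling from the config rather than the menu. This is only a guess for Ryse: I'm assuming the stock r_SuperSampling cvar is still hooked up, and the menu's 1.5x1.5 factor may well be driven by a different setting entirely.

; assumption: Ryse still reads CryEngine's r_SuperSampling cvar
; 1 = off, 2 = 2x2 supersampling (effectively rendering 4x the pixels)
r_SuperSampling=1

On a GTX 680 I'd leave it off and let the post-process AA handle the edges; 2x2 at 1080p is basically rendering at 4K internally, which lines up with the fps tanking into the 20s.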
 
Temporal AA and SMAA are two different types of AA.

http://www.tweakguides.com/Crysis3_6.html

Not when they're combined, like TXAA: TXAA is MSAA with a temporal filter, and SMAA T1x is SMAA with a 1x temporal filter. The temporal filter does a great job on pixel shimmering.

EDIT: Ryse uses SMAA T2x, not T1x like I originally thought; same thing, though, just with 2x temporal samples on top of SMAA.
 
Is that with supersampling or without?

I can run it pretty well with everything maxed at 1080p but supersampling off. Once I turn it on, sh!t gets real slow, like fps in the 20s. I'm running it with a GTX 680.

No SSAA.
 
I dunno. Game is pretty average. Especially coming straight from Shadow of Mordor! The gameplay just isn't that interesting... pretty generic. At least the finishers are sometimes satisfying.
 
Some tweaks for some of you guys.

Go to the root directory and open System.cfg.

Just copy and paste this and save. This will enable "Very High" textures for 2GB video cards. Make sure you go into your Nvidia control panel and enable Adaptive Vsync. Then disable full screen and Vsync in the game's menu. The game will look better and performance will run smoothly. This is on my GTX 670.

; Ryse - System.cfg

; game, user and localisation folder paths plus the game DLL
sys_game_folder=GameRyse
sys_dll_game=CryGameRyse.dll
sys_user_folder=GameRyse
sys_localization_folder=Localization

; texture detail spec level: 4 = "Very High", which this unlocks for 2GB cards
sys_spec_TextureResolution=4
; run in a borderless fullscreen window (pair with Adaptive Vsync in the driver, vsync off in-game)
r_fullscreenwindow=1

;RSC servers 8core5 and build11

; timestamped, low-verbosity logging and the available language list
log_IncludeTime=1
sys_languages="english,italian,french,spanish,german,russian"
log_verbosity=0
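One caveat: System.cfg can get overwritten when the game updates, so if Ryse still reads a user.cfg from the same folder like other CryEngine releases do (an assumption on my part, I haven't verified it), it may be safer to drop just the overrides you care about, such as sys_spec_TextureResolution=4 and r_fullscreenwindow=1, into that file instead.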
 
So an Xbox One game is the new benchmark for PC. Yet people are worried about how powerful the X1 is? That's great.
 
That is funny... but bear in mind, they were playing at 4K resolution.

Hell, with a 1GB card, they were running 1080p at 30fps.

As for how powerful the XB1 is, that's a nonsense argument. Both the XB1 and PS4 are pretty powerful machines. One is just a tad more powerful than the other, but in the end, that really doesn't mean much.
 

Neither of which the X1 does. It runs at 1600x900 and is a lot of the time below 30fps.
 
So an Xbox One game is the new benchmark for PC. Yet people are worried about how powerful the X1 is? That's great.
This is actually pretty true. Just pushing 1080p/30fps on PC requires much more hardware than you would expect, considering how well the Xbox One handled 900p/30fps with such low-specced hardware. The benefit of a closed console architecture is as evident as ever.
 
DX12 is supposed to address the overhead issue on PC.
 
This is actually pretty true. Just pushing 1080p/30fps on PC requires much more hardware than you would expect, considering how well the Xbox One handled 900p/30fps with such low-specced hardware. The benefit of a closed console architecture is as evident as ever.



Um... did you NOT read the DF article? An R9 280 (a card that can be had for about $200) ran the game at 1080p at an average of about 45fps.

So, um, no... it doesn't take "much more" hardware.

And the XB1 and PS4 really aren't so "low" specced, just lower than PC. However, the problem is, you're stuck with those specs for the rest of this generation, and they were already behind PC right out of the gate. Meaning, the further into this gen we get (and face it, we're pretty early in it), the larger the gap between PC and console will grow.
 
This is actually pretty true. Just pushing 1080p/30fps on PC requires much more hardware than you would expect, considering how well the Xbox One handled 900p/30fps with such low-specced hardware. The benefit of a closed console architecture is as evident as ever.

It didn't handle it well at all; the game frequently dipped into the 20s. Running the game at 1080p and a rock-solid 30fps does not take much at all. My 7950, which is an almost two-and-a-half-year-old graphics card that can be had on eBay for like 120 dollars, runs the game at 1080p in the mid 40s.
 
Um... did you NOT read the DF article? An R9 280 (a card that can be had for about $200) ran the game at 1080p at an average of about 45fps.

So, um, no... it doesn't take "much more" hardware.

And the XB1 and PS4 really aren't so "low" specced, just lower than PC. However, the problem is, you're stuck with those specs for the rest of this generation, and they were already behind PC right out of the gate. Meaning, the further into this gen we get (and face it, we're pretty early in it), the larger the gap between PC and console will grow.
The X1 is definitely low-specced. Here's an idea of how weak the Xbox One GPU is in raw specs, taken from that virtual benchmark for PC.
[Image: PS4-Vs-Xbox-One-Vs-PS4-Benchmarks.jpg (GPU spec comparison chart)]
 


Regardless. Those are just numbers in comparison to PC, and obviously, the pc will be ahead.

In raw power, the XB1 and PS4 are no slouches, they just can't compete with PCs.



With that said, the comment that you'd need a much more powerful pc to run at just 30 fps is flat out false.
And "much more powerful" doesn't mean much when the cost doesn't translate to "much more".


See what I mean? Way too often, console gamers try to discredit the pc as a gaming platform by citing cost as being a negative, when the truth is, pc gaming is not that expensive.

Real-world numbers... do you know the true cost of the XB1 without Kinect if we consider an 8-year life before next gen? For over 90% of gamers that own an XB1, that would be at least $800: roughly $400 for the console plus a LIVE subscription at around $50 a year over those 8 years.
That's not even factoring in cheaper pc game costs. Ryse was $60 at release. On pc, it was $40 at release.
 
Yup, the Xbox One has the power of the cloud so it's a bit more powerful than the PS4. Both are very powerful though.


Bwhahahahahahahahaha.


Just imagine how much better Ryse would run or look on the PS4.
 
Regardless. Those are just numbers in comparison to PC, and obviously, the pc will be ahead.

In raw power, the XB1 and PS4 are no slouches, they just can't compete with PCs.



With that said, the comment that you'd need a much more powerful pc to run at just 30 fps is flat out false.
And "much more powerful" doesn't mean much when the cost doesn't translate to "much more".


See what I mean? Way too often, console gamers try to discredit the pc as a gaming platform by citing cost as being a negative, when the truth is, pc gaming is not that expensive.

Real-world numbers... do you know the true cost of the XB1 without Kinect if we consider an 8-year life before next gen? For over 90% of gamers that own an XB1, that would be at least $800: roughly $400 for the console plus a LIVE subscription at around $50 a year over those 8 years.
That's not even factoring in cheaper pc game costs. Ryse was $60 at release. On pc, it was $40 at release.
When you do a little math and compare what it takes for the 290X to hit its target resolution and framerate against what the Xbox One achieves with only a fifth of the power, you end up needing a PC with roughly 200% of the Xbox One's power to run the game at a similar resolution and framerate. I don't really see what is 'not much more' about that. It shouldn't even have to be argued that rendering close to the hardware has a major advantage over having loads of additional software in the way, like DirectX.