Project Scorpio v4.0

Status
Not open for further replies.
LOL haha the spin is real
When it doesn't happen, it's because it won't be able to in those circumstances.

How is it spin?

Of course, when a game is rendered at FauxK on Scorpio it'll be because it can't run the same scene at native 4K at the same frame rate. Obviously.

But how is that a problem or an issue?
 
Might as well rename this thread "the #@*% measuring thread".

Get over it people. Let's wait until we have more info, and see actual gameplay running. The proof is in the pudding...
 
Might as well rename this thread "the #@*% measuring thread".

Get over it people. Let's wait until we have more info, and see actual gameplay running. The proof is in the pudding...

I'm actually totally with you.

The Scorpio specs beat PS4 pro on every single level, so there's nothing to debate there.

What's interesting to me, now, is what the obvious/clear/undebatable superiority in specs will translate to in real world game scenarios.
 
Not all PC games get 4K texture packs either; those take up a decent amount of extra room on a HDD as well, and they can get pretty expensive for the dev to make. I know it's not the norm, but the high-res texture pack for the last Fallout game was over 50 GB and it doesn't even look that good IMO. Well, compared to the ugly base game, yeah lol, but not compared to most others.

Why would you use a Bethesda game as an example for a 4K texture pack? That's an odd choice; even the regular textures don't look great. Battlefront 2, for example, is coming this year, and the first one looked incredible with its textures. Plus, the first-party games will all likely have them.
 
Here's to hoping developers start offering checkerboard option to go from 30 to 60 FPS. I've joined the stable frame rate camp. I think a lot of hardcore gamers are there too.

Ironically for MS to capture the hardcore PS4 crowd on multiplats, they'd gain more of them with checkerboard and 60 FPS than true 4K and 30 FPS. Will be nice to have both options.
I think you will be disappointed if you think there is going to be a jump from 30 to 60.

Also, MS doesn't need that checkerboard crap; they have the muscle to go native without it.
 
Why would you use a Bethesda game as an example for a 4K texture pack? That's an odd choice; even the regular textures don't look great. Battlefront 2, for example, is coming this year, and the first one looked incredible with its textures. Plus, the first-party games will all likely have them.
Well, he could be talking about the graphics mods on Xbox.
 
Why would you use a Bethesda game as an example for a 4K texture pack? That's an odd choice; even the regular textures don't look great. Battlefront 2, for example, is coming this year, and the first one looked incredible with its textures. Plus, the first-party games will all likely have them.

I used them as an example simply because of how much extra HDD space their crap pack took; it was bigger than most games. BF2 already looks awesome without using 4K textures, though. That's the thing: you are going from great to really great, which could be a hard sell if there is a big price difference. Nothing on either console is going to look blurry unless it's done by a bad developer.
 
I think you will be disappointed if you think there is going to be a jump from 30 to 60.

Also, MS doesn't need that checkerboard crap; they have the muscle to go native without it.

A 200 MHz CPU boost isn't going to double the frame rate.
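Back-of-envelope math backs this up. A minimal sketch, assuming a ~2.1 GHz Jaguar baseline and a 2.3 GHz Scorpio clock (treat both figures as assumptions, not confirmed specs):

```python
# Best-case gain from a 200 MHz clock bump, assuming a fully
# CPU-bound frame rate that scales linearly with clock speed.
# The 2.1/2.3 GHz figures are assumptions, not confirmed specs.

BASE_GHZ = 2.1       # assumed Jaguar clock on the competing console
SCORPIO_GHZ = 2.3    # assumed Scorpio CPU clock (base + 200 MHz)

gain = SCORPIO_GHZ / BASE_GHZ - 1
print(f"best-case CPU-bound gain: {gain:.1%}")  # ~9.5%

# Going from a locked 30 FPS to a locked 60 FPS requires a 100% gain,
# so a clock bump alone gets nowhere close.
```

Even under the most generous linear-scaling assumption, the bump buys under 10%, not the 100% a 30-to-60 jump would need.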
 
Christ, this thread has turned really salty due to one dude's insistence on troll posting.

Like others have said, Scorpio is the most powerful console on the market when it releases. It makes its nearest rival look embarrassingly underpowered.

If the information contained in the DF reveal is correct, and let me be clear, I think it's 100% correct (their reputation is on the line), then consumers have a clear choice: pay for the Scorpio, with the best 1st- and 3rd-party versions of 4K gaming (native); the one-game-gimmick Nintendo Switch; or a couple of PS4 SKUs that only have 'volume sales' as a reason to buy them.

I am predicting Scorpio alone will push Microsoft past PS4 sales this generation. Because at the end of the day, you want to play games on the very best system possible.

Also notice how nothing has been said about the VR support yet!
 
Of course I don't. I'm not making any claims.
I would just like to know who he is and who he works for.

DF has their rep on the line. This guy is fairly anonymous.
And perhaps... what does he do as a dev? Is he an artist? A sound engineer? Even if he were a programmer, that doesn't mean he knows how the thing was customized.

Just being a developer doesn't mean much. Dude could make online casino games for all we know.

I can't imagine MS would go through all this work only to hamstring themselves. I was initially disappointed there was no Zen, then I realized that the end results are what matter.

If that Forza shot is indicative of the quality we will get, I couldn't care less about the name of the chip. I'm ready for some games that can utilize my TV!
 
Show me a game where PS4's raw GPU performance is measurably 50% higher, as the raw GPU math says it should be.

Look at any comparison - the perf is usually in the ballpark of 30% better.

Unity is the only game we know of that's CPU bound, and it shows a 15% improvement over PS4.

Therefore, even with ZERO additional data, you're wrong. My assessment is not unsubstantiated. You were incorrect.

As far as trends go, yeah - the trend is that PS4 outperforms X1 by about 30%, not 50%.

That's further substantiation.

One data point doesn't make what you say fact, buddy. It's also odd that you keep spouting off about how no one knows what customisations Microsoft made to the CPUs (if any), but don't seem to afford the same assumption to Sony; sounds like you might have a bit of bias there.

Yeah, KB and JinCA are really salty right now... this must be hard for them.

I admitted it's more powerful in every way; I'm just countering your terrible assertions and your inability to understand statistics or the possibility that you may be wrong.
 
One data point doesn't make what you say fact, buddy. It's also odd that you keep spouting off about how no one knows what customisations Microsoft made to the CPUs (if any), but don't seem to afford the same assumption to Sony; sounds like you might have a bit of bias there.



I admitted it's more powerful in every way; I'm just countering your terrible assertions and your inability to understand statistics or the possibility that you may be wrong.

So, you have no examples of a PS4 title hitting 50% better real-world performance in a game?

Got it.

Thanks for proving me right. Again.

...and my, you are salty!
 
  • Like
Reactions: Mcmasters
Here's the article I referenced earlier, where Anandtech projects Scorpio's real-world bandwidth usage to be in the same class as the Pro's despite the significant theoretical difference.

"What makes things especially interesting though is that Microsoft didn’t just switch out DDR3 for GDDR5, but they’re using a wider memory bus as well; expanding it by 50% to 384-bits wide. Not only does this even further expand the console’s memory bandwidth – now to a total of 326GB/sec, or 4.8x the XB1’s DDR3 – but it means we have an odd mismatch between the ROP backends and the memory bus. Briefly, the ROP backends and memory bus are typically balanced 1-to-1 in a GPU, so a single memory controller will feed 1 or two ROP partitions. However in this case, we have a 384-bit bus feeding 32 ROPs, which is not a compatible mapping.

What this means is that at some level, Microsoft is running an additional memory crossbar in the SoC, which would be very similar to what AMD did back in 2012 with the Radeon HD 7970. Because the console SoC needs to split its memory bandwidth between the CPU and the GPU, things aren’t as cut and dry here as they are with discrete GPUs. But, at a high level, what we saw from the 7970 is that the extra bandwidth + crossbar setup did not offer much of a benefit over a straight-connected, lower bandwidth configuration. Accordingly, AMD has never done it again in their dGPUs. So I think it will be very interesting to see if developers can consistently consume more than 218GB/sec or so of bandwidth using the GPU."
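As an aside, the 326 GB/s headline number itself falls straight out of the bus width and the GDDR5 data rate. A quick sketch; the 6.8 Gb/s-per-pin rate is the figure from the reveal coverage, so treat it as an assumption:

```python
# Peak theoretical bandwidth from bus width and per-pin data rate.
# The 6.8 Gb/s GDDR5 rate is the reported figure; treat it as an
# assumption rather than a confirmed spec.

BUS_WIDTH_BITS = 384
DATA_RATE_GBPS = 6.8  # effective Gb/s per pin

# (bus width / 8 bits per byte) * per-pin rate = bytes per second
peak_gbs = BUS_WIDTH_BITS / 8 * DATA_RATE_GBPS
print(f"{peak_gbs:.1f} GB/s")  # 326.4 GB/s
```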

So... 326 vs 218. I feel like we're missing something; MS's engineers are not idiots.
The ROP count isn't just a solid number; clock speed affects the ROPs too. So let's break this down:
32 ROPs at 911 MHz = 218 GB/s
32 ROPs at 1172 MHz = X GB/s
This is the theoretical maximum memory bandwidth the ROP units can consume. It doesn't actually work that way, but since AnandTech assumes it does, we'll do it too. There is overhead, and you'd never want to use your entire bandwidth for the rasterizer output pipeline alone; the CPU also wants some of that juicy bandwidth, you know ;). But let's say, for the sake of AnandTech's argument, that the ROPs do fill it up completely, and we do know the ROPs' speed scales with the GPU clock.

The 32s we can strip away, as they are the same for either system, so we can just solve for X:
X = (1172 / 911) × 218 = 280.46, or roughly 281 GB/s

So if we keep the constants, we arrive at about 281 GB/s of ROP bandwidth for Scorpio, which checks out considering it is clocked roughly 29% faster.
Now subtract this value from the 326 GB/s total:

326 GB/s − 281 GB/s = 45 GB/s

So the "crossbar" is 45gb/s and probably reserved for the CPU to access the memory without bothering the rasterizer operations pipeline. It can also probably be used by the GPU itself (As it goes through the GPU) for other tasks than graphical (for instance physics calculations. This actually is a smart design, as it would allow the CPU to access and manage it's memory via it's own dedicated controller/path while having plenty of bandwith for itself to keep it fed. 45gb/s is plenty even more than enough for a Jaguar. In fact AMD chips are notorious for performing better with higher clocked memory/more bandwith, unlike Intel where it matters less usually, on am AMD it can give you a nice boost. So if it is indeed a dedicated 45gb/s for the Jaguar + (as the power management stuff is something I really want to learn more about, it seems very interesting and novel and unlike the normal Jaguar/Puma chips), then we might have found the reason why it punches above it's weight.


So yes, by just going "OMG it's 218, just like the Pro", Anandtech doesn't do itself any favours and is actively spreading FUD. They of all people should know that ROPs are more than just their count alone and that clock speed affects ROP throughput. That is assuming MS put 32 ROPs on there (which isn't even certain yet); it could be more, and it could be less. I expect MS to cheap out on these things :(, they've done that too much as of late (OneDrive and Windows phones are recent examples).

PS: With ROPs we are talking about fillrate, not bandwidth ;). But since AnandTech chose to go this route, I found it funny to stick with it. The ROP fillrate should align with the memory controller's bandwidth, so you can't just say it has X bandwidth and, since both have 32 ROPs, they are the same speed. No: with a higher clock speed the fillrate increases, but the memory controller's speed increases too. It just isn't as simple as "32 = 32"; there are a lot more factors to compare.
 
How is it spin?

Of course, when a game is rendered at FauxK on Scorpio it'll be because it can't run the same scene at native 4K at the same frame rate. Obviously.

But how is that a problem or an issue?
Just Val attempting to show he is correct and that Scorpio's performance bump over the Pro is negligible. If 100% of Scorpio titles aren't 4K, he wins something in life.
 
 
New DF vid



Some great quotes in there:

"This is unprecedented."

"The memory bandwidth in Scorpio ensures the additional overhead of these (4K) assets only hits performance by 1%."


"4K makes you a better gamer"

..oh wait. That's me.
 
Their opinion counts as much now as it always has, meaning it means more than yours or mine, because he has a better understanding of those things. That being said, someone who actually has access to the hardware, or knows people in his field who do, can give more real-world impressions. What we are hearing from DF is what the specs are and what MS tells them those specs can do. That's cool, but at the same time, you know as well as I do that a lot of the time these "we've made this great technical improvement" lines from companies end up not living up to the hype, at least not the over-hype some people here are exhibiting.

Do we know he has access to the hardware? We know Leadbetter has a production Scorpio Engine in his possession. ;)
 
That poster made a lazy argument:

1. Jaguar is a bottleneck on the Pro.

2. Therefore it must be a bottleneck on Scorpio.

It may end up being one, but I imagine if you throw enough at it, then of course it eventually bottlenecks. I'm sure the same can be said of other aspects of the box. It can only do so much in the end.

Another bit of information from the DF article that speaks to this (to those with their ears open): MS had a hardware emulator and ran a profiler over shipped games to identify where the bottlenecks were, and then made their CPU and GPU customizations based on that.
 
Another bit of information from the DF article that speaks to this (to those with their ears open): MS had a hardware emulator and ran a profiler over shipped games to identify where the bottlenecks were, and then made their CPU and GPU customizations based on that.

That PIX tool you describe is coming up quite a bit. It was initially built to help developers maximize their efforts on Xbox One in relation to its architecture. Now it sounds like it was basically used as a stress test for a bunch of games, and Scorpio was built around that.
 
50% better in what way?

GPU performance (higher frame rate and/or pixel count).

Examples:

A game which runs at 900p (X1) vs 1080p (PS4) with the same frame rate is getting about 44% more pixels out of the GPU.

A game which runs at 40 FPS vs. 30 FPS in the same scenario is getting about 33% better perf from the GPU, if the perf gap is due to being GPU bound.
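Those two ratios can be checked directly; the resolutions and frame rates below are the ones from the examples above:

```python
# Pixel-count and frame-rate ratios for the two worked examples.

def pixel_ratio(res_a, res_b):
    """How many times more pixels res_b pushes than res_a."""
    return (res_b[0] * res_b[1]) / (res_a[0] * res_a[1])

# 900p (1600x900) vs 1080p (1920x1080) at a matched frame rate:
print(f"{pixel_ratio((1600, 900), (1920, 1080)):.2f}x")  # 1.44x

# 40 FPS vs 30 FPS at a matched resolution:
print(f"{40 / 30:.2f}x")  # 1.33x
```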

For context, kb said that my claim, that X1's CPU and GPU performance is proportionately better than PS4's, was unsubstantiated, but it's demonstrably not:

Raw CPU power is 10% better on X1.
The only game we know of which is CPU bound runs on average 15% better on X1 under CPU bound conditions.
Therefore, X1's CPU real-world performance (+15%) is BETTER than the raw hardware gap (+10%). This means the software runs better/more efficiently on X1.

This tells us that X1 gets better than 10% CPU performance in the only case we know that's measurably CPU bound in a shipped title.

Raw GPU power is 50% better on PS4.
Across the gamut of comparables, however, the general performance gap is only around 30%-35%.
Therefore, X1's GPU real-world performance (-30%) is BETTER than the raw hardware gap (-50%). This means the software runs better/more efficiently on X1 with regards to the GPU.

In the VAST majority of DF comparisons I've seen, PS4 regularly gets about 30%-35% better performance than Xbox One. This tells me that, even though the PS4 has a superior GPU (roughly 50% better TF performance), PS4 games aren't seeing that full advantage over Xbox One. That suggests Xbox One games get better performance per clock than PS4 titles, which would make sense given the heavy investment in DX12.

Therefore, kb was wrong/incorrect. My statement was very well substantiated, if not demonstrably provable.
 