Official Thread Pillow Fight that nobody wins with MOAR Jackie Chan and guys comfortable with STRETCHING their sexuality!

Status
Not open for further replies.
Actually, it doesn't mean that, Jinca. You know better than that. If the CPU and GPU were running at full speed at the same time, it wouldn't be variable. They'd just lock the PS5 at 10.2 and call it a day. The fact that devs have to throttle back the CPU to peak the GPU means they most definitely aren't both maxed most of the time. In fact, 10.2 is sounding more and more theoretical in nature the more we learn. I'm almost positive the PS5's guaranteed performance is really 9.4 at this point.

Know better than what? You have it all wrong. The frequency isn't the issue, it's the workload: when the CPU or GPU is being asked to do something that draws more power, that's when the other one is asked to give up some of its power, so its frequency will drop a bit. It seems that you aren't understanding (most people aren't, so it's not just you) that workload and frequency aren't the same thing.

Again, when the lead architect of a system says both the CPU and GPU will likely be running at their top frequency most of the time, that means they both can run at full frequency at the same time. Also, you are 100% wrong about the original performance being 9.4. Matt at ResetEra said this was always the plan, but for some reason some of you guys want to pretend that Sony got so scared of MS and their 12 TF that in a month they went out and decided to do something with their hardware that it wasn't originally designed to do, and that's laughable.
 
How can he even call it when it comes down to the devs? What does he mean by "most of the time"? That means it fluctuates, rising and falling with neither being locked in at any point. How much performance can devs rely on 100% of the time? It's not 10.2. That's my only point here. All these vague answers for something that should be communicated clearly. It shouldn't be this ambiguous.

Here's a quote from the DF article that may interest you. They don't even seem to completely understand what Sony is doing, so it'd be hard to blame anyone else for being confused.


But what if developers aren't going to optimise specifically to PlayStation 5's power ceiling? I wondered whether there were 'worst case scenario' frequencies that developers could work around - an equivalent to the base clocks PC components have. "Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing," Mark Cerny counters. "I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."

Also, Cerny said that when he talks about "most of the time" he isn't talking about a best-case scenario:

"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."
 
Here is a chunk from the DF article that I think will make some people understand what the PS5 is doing a little bit better:


Feedback from developers saw two areas where developers had issues - the concept that not all PS5s will run in the same way, something that the Model SoC concept addresses. The second area was the nature of the boost. Would frequencies hit a peak for a set amount of time before throttling back? This is how smartphone boost tends to operate.

"The time constant, which is to say the amount of time that the CPU and GPU take to achieve a frequency that matches their activity, is critical to developers," adds Cerny. "It's quite short, if the game is doing power-intensive processing for a few frames, then it gets throttled. There isn't a lag where extra performance is available for several seconds or several minutes and then the system gets throttled; that isn't the world that developers want to live in - we make sure that the PS5 is very responsive to power consumed. In addition to that the developers have feedback on exactly how much power is being used by the CPU and GPU."

Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level. "Power plays a role when optimising. If you optimise and keep the power the same you see all of the benefit of the optimisation. If you optimise and increase the power then you're giving a bit of the performance back. What's most interesting here is optimisation for power consumption, if you can modify your code so that it has the same absolute performance but reduced power then that is a win. "

In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

"There's another phenomenon here, which is called 'race to idle'. Let's imagine we are running at 30Hz, and we're using 28 milliseconds out of our 33 millisecond budget, so the GPU is idle for five milliseconds. The power control logic will detect that low power is being consumed - after all, the GPU is not doing much for that five milliseconds - and conclude that the frequency should be increased. But that's a pointless bump in frequency," explains Mark Cerny.


At this point, the clocks may be faster, but the GPU has no work to do. Any frequency bump is totally pointless. "The net result is that the GPU doesn't do any more work, instead it processes its assigned work more quickly and then is idle for longer, just waiting for v-sync or the like. We use 'race to idle' to describe this pointless increase in a GPU's frequency," explains Cerny. "If you construct a variable frequency system, what you're going to see based on this phenomenon (and there's an equivalent on the CPU side) is that the frequencies are usually just pegged at the maximum! That's not meaningful, though; in order to make a meaningful statement about the GPU frequency, we need to find a location in the game where the GPU is fully utilised for 33.3 milliseconds out of a 33.3 millisecond frame.

"So, when I made the statement that the GPU will spend most of its time at or near its top frequency, that is with 'race to idle' taken out of the equation - we were looking at PlayStation 5 games in situations where the whole frame was being used productively. The same is true for the CPU, based on examination of situations where it has high utilisation throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency."

Put simply, with race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time. Cerny also stresses that power consumption and clock speeds don't have a linear relationship. Dropping frequency by 10 per cent reduces power consumption by around 27 per cent. "In general, a 10 per cent power reduction is just a few per cent reduction in frequency," Cerny emphasises.

It's an innovative approach, and while the engineering effort that went into it is likely significant, Mark Cerny sums it up succinctly: "One of our breakthroughs was finding a set of frequencies where the hotspot - meaning the thermal density of the CPU and the GPU - is the same. And that's what we've done. They're equivalently easy to cool or difficult to cool - whatever you want to call it."

There's likely more to discover about how boost will influence game design. Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
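To make the power-budget idea above a bit more concrete, here's a rough toy sketch. It is not from the article, the wattage numbers are completely made up, and the linear clock scaling is a simplification; only the 3.5GHz and 2.23GHz caps come from the article. It just shows how a SmartShift-style scheme can hand unused CPU budget to the GPU, which is why variable clocks don't imply that one part has to run slower whenever the other is busy.

```python
# Toy sketch of the power-budget idea described above. Nothing here comes from
# Sony or AMD; the wattages and the linear clock scaling are made up purely to
# illustrate how a fixed budget can be shared between the CPU and GPU.

CPU_BUDGET_W = 60.0      # hypothetical CPU power budget
GPU_BUDGET_W = 140.0     # hypothetical GPU power budget (the larger of the two)

CPU_MAX_GHZ = 3.5        # PS5 CPU frequency cap (from the article)
GPU_MAX_GHZ = 2.23       # PS5 GPU frequency cap (from the article)


def allocate(cpu_demand_w, gpu_demand_w):
    """Return (cpu_clock, gpu_clock) for a given power demand on each side.

    Any CPU budget the workload doesn't use is handed to the GPU (the
    SmartShift idea). Clocks only drop when a side wants more power than
    it was ultimately granted.
    """
    cpu_power = min(cpu_demand_w, CPU_BUDGET_W)
    spare = CPU_BUDGET_W - cpu_power                      # unused CPU budget
    gpu_power = min(gpu_demand_w, GPU_BUDGET_W + spare)   # GPU gets the leftovers

    # Crude linear scaling; the real frequency/power curve is non-linear.
    cpu_clock = CPU_MAX_GHZ * min(1.0, cpu_power / max(cpu_demand_w, 1e-9))
    gpu_clock = GPU_MAX_GHZ * min(1.0, gpu_power / max(gpu_demand_w, 1e-9))
    return round(cpu_clock, 2), round(gpu_clock, 2)


# Typical frame: neither side exceeds its budget, so both sit at their caps.
print(allocate(cpu_demand_w=45.0, gpu_demand_w=150.0))   # (3.5, 2.23)

# Power-virus style load on both sides: each gives up a little frequency.
print(allocate(cpu_demand_w=70.0, gpu_demand_w=170.0))   # roughly (3.0, 1.84)
```

Obviously the real system reacts on much finer timescales and uses a model SoC rather than anything this simple, but it hopefully shows why "variable clocks" doesn't mean the CPU and GPU can't both sit at their caps at the same time.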
 
Weren't heating issues brought up in that recent DF video?

I don't know if they ever said anything about it. I know a few rumor mongers have been trying to spread that around, but they did that before the PS4 launch as well.
 
If the console specs were reversed, a couple of guys here would laugh at MS for having less power potential lol

Nope, but I'm sure a bigger chunk of people would be saying that MS outsmarted Sony by doing something new lol. This isn't an issue of someone trying to say the PS5 is more powerful, because it's not. To me it's just an issue of people not understanding what Sony designed their console to do and making silly, uninformed assumptions based on an old way of thinking. I also don't mean to say that I'm much more informed or standing a step above everyone else in understanding this; I just know that a lot of the conclusions some people are coming to, like Sony's boost being a response to XSX or the CPU and GPU not being able to run at their full frequency at the same time, are just not true.
 
XSX is more powerful, then.

Has anyone said it wasn't? If so, I haven't seen it, but there are a couple of people in here making it sound like there is some big fight going on about which one is stronger overall, and I don't think anyone has said that the PS5 is the more powerful of the two. There are a couple of things it may be better at, but it's not the more powerful machine; the XSX is the more powerful machine overall.
 
A post from the DF PS5 thread by Albert Penello at ResetEra.

Well the article clearly disproved the idea that the CPU and GPU would have to tradeoff performance to run at the higher clocks which was something that seemed to be the case from the presentation.

I also thought the test between the two GPU's normalizing the TFLOPS with a 36cu and 40cu GPU was pretty interesting. I would not have thought you would see any material performance change when the FLOPS were the same but that was an interesting result.

Also nice to clear-up what appeared to be conflicting information where both sides turned out to be right (hearing from developers the need to "lock" clocks being a feature of the Devkit vs. final HW)

It looks like they have done something super interesting here which I admit I'm still wrapping my head around. Nice to get a little more detail and clarity on some of these points from DF.
 
Are we listening to what the architect of the PS5 said or are we calling him a liar?

I don't think he's lying. He said they expect it to be at or near those frequencies most of the time.

He also told DF that a 10% drop in frequency yields about a 27% drop in power, which gives more of an idea of the power curve beyond the "few per cent" frequency reduction that yields 10% power savings. That lines up with 2.23GHz being right in the range where a lot of extra voltage has to be thrown in for minor gains.
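For what it's worth, those two figures are consistent with the usual rule of thumb that dynamic power scales with frequency times voltage squared, and voltage has to climb roughly in step with frequency near the top of the curve, so power goes roughly with the cube of the clock. A quick sanity check (my approximation, not anything Cerny stated):

```python
# Rough check of the 10% frequency / ~27% power figure, assuming a crude
# P ~ f^3 model (dynamic power ~ f * V^2, with V rising roughly with f near
# the top of the curve). My approximation, not anything Sony has published.

f_ratio = 0.90                # run the clock at 90% (a 10% drop)
p_ratio = f_ratio ** 3        # ~0.729 of the original power

print(f"remaining power: {p_ratio:.1%}")      # ~72.9%
print(f"power saved:     {1 - p_ratio:.1%}")  # ~27.1%, matching the quote
```

Going the other way under the same model, a 10% power cut only needs about a 3.5% clock drop, which lines up with the "few per cent" comment.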
 
...the developers have feedback on exactly how much power is being used by the CPU and GPU."

Mark Cerny sees a time where developers will begin to optimise their game engines in a different way - to achieve optimal performance for the given power level.
Optimizing for power draw rather than having fixed performance is definitely not going to be easier to develop for, which would explain why the devs they mentioned later in the article are choosing to drop CPU power to get a "locked" GPU clock. This sounds like, if there are any additional benefits to be wrung from this optimization, we'll see them from first-party studios.

It's an innovative approach, and while the engineering effort that went into it is likely significant, Mark Cerny sums it up succinctly: "One of our breakthroughs was finding a set of frequencies where the hotspot - meaning the thermal density of the CPU and the GPU - is the same. And that's what we've done. They're equivalently easy to cool or difficult to cool - whatever you want to call it."

Wait, what are they doing on the CPU that makes it as hard to cool as a GPU pushed near its voltage limits? And this doesn't address their earlier point that workloads, not just frequency, affect power draw, which in turn affects thermals.

It's going to be interesting once the systems can be disassembled and we can see real games. I'm really curious what effects the decision to apparently base the CPU/GPU balance on equivalent heat output will have. I also know they're a damn fine engineering team, but it just seems like there's no world where that CPU should be as difficult to cool as the GPU.
 
The problem with the variable clocks is that performance will have a lower limit. Another worrying factor is the heat from having the GPU clocked so high.
My guess as to why we haven't seen an actual PS5 unit is that they might still be trying to get their cooling system firing on all cylinders.
 
The problem with the variable clocks is that performance will have a lower limit. Another worrying factor is the heat from having the GPU clocked so high.
My guess as to why we haven't seen an actual PS5 unit is that they might still be trying to get their cooling system firing on all cylinders.

I thought Cerny said they focused more on the cooling solution for the PS5 but I may have remembered incorrectly.
 
Sony using AMD SmartShift is ANOTHER dead-on clue that they aren't using a desktop-grade Zen 2 CPU and that it's closer to the laptop iteration. SmartShift is a feature of their laptop chips.
 

I wouldn't be surprised if it is true. The system, with those clocks and that SSD speed, is going to generate more heat than the Series X. Sony needs to get it right or they're going to suffer their own RROD situation.
In fact, I plan on buying both, but depending on what the PS5's final design looks like I might wait it out initially. I don't want to be one of the early adopters plagued by a loud AND overheated system.
 
Optimizing for power draw rather than having fixed performance is definitely not going to be easier to develop for, which would explain why the devs they mentioned later in the article are choosing to drop CPU power to get a "locked" GPU clock. This sounds like, if there are any additional benefits to be wrung from this optimization, we'll see them from first-party studios.



Wait, what are they doing on the CPU that makes it as hard to cool as a GPU pushed near its voltage limits? And this doesn't address their earlier point that workloads, not just frequency, affect power draw, which in turn affects thermals.

It's going to be interesting once the systems can be disassembled and we can see real games. I'm really curious what effects the decision to apparently base the CPU/GPU balance on equivalent heat output will have. I also know they're a damn fine engineering team, but it just seems like there's no world where that CPU should be as difficult to cool as the GPU.

Once again, devs pushing everything to the GPU sounds more like they just don't need the extra CPU power right now, because most of their games are cross-gen and built around Jaguar. I think you are seeing a problem in that quote when there really isn't one, especially when the next line in the article says they don't need that power on the CPU. I don't think Sony would have taken this route just to make things harder for devs; clearly they see a benefit in doing things this way even though it's different from the traditional method. Cerny is still involved in making games, so he does know what would make things easier and more efficient, and I'm sure that experience is what he based his decisions on when designing the console.
 
I wouldn't be surprised if it is true. The system, with those clocks and that SSD speed, is going to generate more heat than the Series X. Sony needs to get it right or they're going to suffer their own RROD situation.
In fact, I plan on buying both, but depending on what the PS5's final design looks like I might wait it out initially. I don't want to be one of the early adopters plagued by a loud AND overheated system.

They wouldn't be taking this approach if they didn't have a viable cooling solution; you guys are taking this garbage rumor stuff way too seriously. If they couldn't have found a way to cool this thing properly, they would have taken a more traditional route in its design, it's just common sense. It's not like they didn't know they wanted to run this thing at higher clocks than normal and got caught off guard; it's been the plan from the beginning.
 
This will be a problem for the PS5. If they were so confident in the system, Cerny wouldn't be tiptoeing around with these strange words.

Strange words? Is English not your first language? I don't mean that as an insult, but he's been using pretty basic words. It's just that people don't seem to understand what they are doing, because most of us don't have a technical background.

I don't know if you have read that Eurogamer article, but I suggest you do that instead of watching the video. For me it's easier to take the stuff in when I read it versus watching a video and listening to someone talk forever.
 
The power crown is basically a wash

Wrong. That's false. That's not at all true.

Xbox Series X has a sizable advantage in every way regarding rendering prowess. It's not close. Xbox Series X wins, hands down.

Xbox Series X wears the power crown clearly and comfortably. The "power" war is won. It's over.
 
Are we listening to what the architect of the PS5 said or are we calling him a liar?
There is a difference between what he is feeding us and what we are seeing. His wording is the problem, and he's not going to come out and flat out say it's a weaker system.

If that wasn't the case, Cerny would have said so from day one.
 
They wouldn't be taking this approach if they didn't have a viable cooling solution; you guys are taking this garbage rumor stuff way too seriously. If they couldn't have found a way to cool this thing properly, they would have taken a more traditional route in its design, it's just common sense. It's not like they didn't know they wanted to run this thing at higher clocks than normal and got caught off guard; it's been the plan from the beginning.

No one is saying it is certain. All I'm saying is that I had two 360s RROD on me, and I personally am wondering how the cooling on the PS5 is going to work, and work well. Running with rumors is one thing, but being genuinely concerned is another.
 
I wouldn't be surprised if it is true. The system, with those clocks and that SSD speed, is going to generate more heat than the Series X. Sony needs to get it right or they're going to suffer their own RROD situation.
In fact, I plan on buying both, but depending on what the PS5's final design looks like I might wait it out initially. I don't want to be one of the early adopters plagued by a loud AND overheated system.

Neither would I. 2.2GHz is an astronomically high clock speed; the heatsink has to cool a GPU at 2.2GHz AND the CPU, and possibly even the RAM chips. Good luck, Sony. I will not buy another jet airplane of a console like the launch PS4 and PS4 Pro.
 
The argument over how often the PS5 runs at max GPU and CPU clocks doesn't seem that important in terms of overall performance. 1TF worth of power is a lot of nothing. For multiplats, the Xbox advantage is CUs for ray tracing and possibly physics. Much like resolution, those things can scale, and developers will be scaling them higher on PC than on consoles. Nothing the PS5 does with boosts overcomes the CU advantage. The PS5's advantage is load times. The Joe Schmo consumer will notice the difference in load times in Destiny.

All the rest of this bull, arguing over whether the PS5 will be able to maintain that overclocked TFLOP figure or whether the Xbox Series X has some imaginary bottleneck that a fanboy on the internet discovered ahead of all the tech experts who've been doing this for years, has gotten boring.

What is real is that the PS5 runs hot. I am interested in its cooling system. Those frequencies legit scare me, and I'm not confident about being in the first batch of those consoles. I might be in the minority, but I would rather the PS5 have been an 8.5-9 TF console that ran quietly and reliably, especially if these boost clocks push up the price due to yields. I will be very interested in seeing what the PS5 runs like, and it would be nice for DF to get their hands on the real machine.
 