EDGE backs up PS4 "50% faster" story

Nonsense, lol. You guys are extremely biased. So it's difficult for me to tell whether you are being genuine or not.



:laugh: Yeah, all the stuff they are saying isn't really adding up.

They say the games look "the same", but there hasn't been a single multiplatform game shown on either console. So how would you know if the games "look the same"?

Yes, biased against idiotic nonsense. There's enough FUD going around almost everywhere casual as it is, and you can call me out for not accepting that stuff happening here.
 
Nonsense, lol. You guys are extremely biased. So it's difficult for me to tell whether you are being genuine or not.



:laugh: Yeah, all the stuff they are saying isn't really adding up.

They say the games look "the same", but there hasn't been a single multiplatform game shown on either console. So how would you know if the games "look the same"?

I'm beginning to realize, as perhaps you are, that you can't even discuss these things with people that are in full scale denial mode.
 
Nobody has EVER said the eSRAM was 'utilized in the same way as GDDR5'. What I did say was that GDDR5 has literally zero advantages over the X1's memory subsystem. It has less bandwidth to both the GPU and CPU. It has WAY more latency. It costs more. It has far fewer prospects for future cost reductions. And the pool is 4.5GB, or 5GB if you work closely with Sony, compared to 5GB for X1. In literally every possible area that matters to games or platform considerations it is worse.

Zero advantage if developers take the additional time needed to take advantage of it.

And why am I supposed to care about some random quote from a guy who knows nothing at B3d? Why are you repeating quotes that I already tore apart? You want quotes from ppl who actually know wtf is going on. Not Brad Grenz or Betanumerical (Sony fanboys). Look for stuff from ERP (a Sony first party dev) or sebbbi or DaveBauman or 3dcgi or, best of all, bkilian. Those are game/graphics programmers, AMD engineers, and the audio engineer on SHAPE/X1. You want facts? Go to the ppl who work on this stuff for a living. Stop picking any ole random post that tells you what you WANT to hear as if we are supposed to be impressed.

I don't know if they are Sony fanboys or not. I can only take your word for it.

Who? No. It's just EDGE's no name anonymous source. Literally nobody else has suggested as much. lhuerre is a dev but not Matt, who iirc isn't a dev at all and just spreads FUD on GAF. lhuerre confirmed the 14/4 usage info and said the two platforms are basically the same in performance. He specifically said it was much closer than 360/PS3. But no, those two had nothing to do with anything.

It's funny you never link out to any sources and when I go to look up what you've said on Google I only find another post by you on a different forum. Can't find anything about "lhuerre" talking about the PS4.

I'm beginning to realize, as perhaps you are, that you can't even discuss these things with people that are in full scale denial mode.

They can't concede a single point against the XB1.

Talking to Astro he argues me down on the point that the PS4 is balanced to use 1.4 tflops, but that's not good enough for him. He has to make up that final 90 gflops and claim that the PS4 only has 1.31 tflops just like the XB1. He can't admit that there is a small 90 gflop difference.

Then you talk about the 400 gflops left for animation, physics, lighting etc. etc. And that MUST be used for audio and nothing else. So basically Sony just wasted 400 gflops according to him.

Then there is the claim that the eSRAM is just as good as the GDDR5 RAM in the PS4. Yet Ryse and Dead Rising 3 are running sub-1080p, just like Ketto, Bunz, and a lot of other people were predicting. Those are exclusive games too.
 
Ketto,

As I have been saying, the audio solutions are not a hard computation problem; frankly, moving it onto the GPU is a waste and only makes the problem worse. The issue with doing audio in a non-DSP solution is LATENCY. There will be significant audio and video (from the camera and mic) latency in interactive video applications on the PS4. So much so that the console will be unsuitable for these types of applications for all but the youngest gamers. (5 year olds won't care about the latency on a PSEyeToy version of Sesame Street.)
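To put rough numbers on the latency point, here's a napkin sketch; the buffer sizes and queue depths are illustrative assumptions on my part, not measured figures:

# Audio latency is roughly buffered samples / sample rate, summed per stage.
SAMPLE_RATE = 48_000  # Hz

def buffer_latency_ms(samples):
    return samples / SAMPLE_RATE * 1000

# A dedicated DSP can chew on small blocks as they arrive:
print(buffer_latency_ms(128))               # ~2.7 ms per block
# A CPU/GPU path that batches bigger buffers and queues a frame or two
# behind the renderer pays far more before anything is audible:
print(buffer_latency_ms(1024) + 2 * 33.3)   # ~88 ms with a 30 fps queue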

It'll work just as well as Kinect did on 360, and since so many of you guys swear that was a great experience, I guess PS4 will suffice, eh? ;)

So in your karaoke games and dance games, the X1 is set to be king; the PS4 is DEAD ON ARRIVAL in these types of interactive games. It won't be relevant in this area. I suspect they will also kill off all the voice and gesture commands on the PS4 executive layer as well. The X1 will probably also add virtual avatars and voice changers to Skype to enhance privacy and anonymity.

Actually, karaoke/dance games would be the ones that make minimal use of advanced audio. You should have said a shooter like The Last of Us, in which audio occlusion and bounce played a big part of the experience, instead of the genre that only makes use of a few track layers. Game audio in general really hasn't advanced much at all; only the quality has changed for the most part. Even the "3D audio" that people are championing as "new" isn't new at all, even within the environment. Ever since the rise of 5.1, and even before, many PC developers have experimented with it, some with success, others with much less.

Where this will really hit the PS4 is gestures and voice commands in core games, which I believe are going to be a large part of the X1 experience, and the next gen core game experience as well. In cross-plat games, these types of activities will work great on the X1 hardware, while they will suffer greatly with audio and video streams processed on either the CPU or GPU on the PS4. And again, the reason is LATENCY.

Obviously this will be a problem for core games since not every PS4 owner is even going to have a camera to make use of these features...thus developers won't even bother. And the audio streams themselves are handled on PS4's ACP. I think you mean to say filtering and mixing are handled on the CPU. It's going to be years before audio is handled by CUs.
 
I believe (I could be wrong) that the SHAPE audio chip is mainly used for Kinect, but it could also be used to offload CPU tasks on the audio front. And yes, the PS3 does have better sound capabilities.

No, the audio block itself is mainly for Kinect; within the audio block is SHAPE, which is for gaming and offloads CPU tasks. As I said, for multiplatform games they'll do "good enough" audio that runs on PS4's CPU, and then for the XBO they can move it to SHAPE and use more expensive, complex algorithms if they're so inclined. This has been echoed by developers/programmers on Beyond3D who actually know what they're talking about. So either we argue that ERP, Shifty, Bkilian and others are fanboys who are also spreading FUD, or maybe, just maybe, they're right and Astro is... wrong, as shocking as that may be.

I'm personally going to believe that they know a little bit more about this than Astro.
 
And the audio streams themselves are handled on PS4's ACP. I think you mean to say filtering and mixing are handled on the CPU. It's going to be years before audio is handled by CUs.

SMH, we had people here claiming that 4 CUs on the PS4 would have to be used mainly for Audio. Turns out it's not even possible to do audio on CUs. :laugh:

No, the audio block itself is mainly for Kinect; within the audio block is SHAPE, which is for gaming and offloads CPU tasks. As I said, for multiplatform games they'll do "good enough" audio that runs on PS4's CPU, and then for the XBO they can move it to SHAPE and use more expensive, complex algorithms if they're so inclined. This has been echoed by developers/programmers on Beyond3D who actually know what they're talking about. So either we argue that ERP, Shifty, Bkilian and others are fanboys who are also spreading FUD, or maybe, just maybe, they're right and Astro is... wrong, as shocking as that may be.

I'm personally going to believe that they know a little bit more about this than Astro.

I'm going to make sure I keep this post on hand for Astro when he tries to argue that the PS4 MUST reserve 400 gflops for audio.
 
SMH, we had people here claiming that 4 CUs on the PS4 would have to be used mainly for Audio. Turns out it's not even possible to do audio on CUs. :laugh:

Audio being handled by CUs is possible (I'm not sure what would remain on the CPU versus what would be done on CUs), just not probable any time soon; we probably won't see advances like that till year 3. Which both Sony and AMD are banking on, since AMD really wants to push the whole GPGPU thing.

Like I said, this singular quote pretty much sums up the entire thing.

SHAPE, if it were utilized 100% at all times, would be hard to equal in CPU, yes. But they don't need to equal it. For one, it's using better but more expensive algorithms, for which the cheaper versions work fine, and have been used for the last generation without complaint. And second, it's highly doubtful that it will be utilized 100% for most games. I'd be surprised if developers used even 50% of its capabilities for most titles.
By the time developers are looking to push the capabilities of the audio block, I suspect using GPU compute audio will be well understood and a reasonable solution.
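For scale, some napkin math on what game audio actually costs in compute; the voice counts and ops-per-sample here are illustrative guesses on my part, not measurements:

# Rough audio cost: voices x sample rate x ops per sample
SAMPLE_RATE = 48_000  # Hz

def audio_gflops(voices, ops_per_sample):
    return voices * SAMPLE_RATE * ops_per_sample / 1e9

print(audio_gflops(256, 500))    # ~6 GFLOPS: heavy mixing and filtering
print(audio_gflops(512, 4000))   # ~98 GFLOPS: convolution reverb on everything

Even a worst case lands around 100 gflops, nowhere near 400.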
 
SMH, we had people here claiming that 4 CUs on the PS4 would have to be used mainly for Audio. Turns out it's not even possible to do audio on CUs. :laugh:



I'm going to make sure I keep this post on hand for Astro when he tries to argue that the PS4 MUST reserve 400 gflops for audio.

Nobody claimed that GPGPU would be concentrated on advanced audio, but that the CPU and GPU would need to be used for many activities down the line, with advanced audio as one example in next gen gaming. The argument got sidetracked onto audio exclusively, but that didn't mean audio exclusively used up the extra CPU and GPGPU.

The point was that X1 has dedicated hardware for many things that normally, and in the future, would require CU usage; they are handled outside the CUs, thus allowing the CPU and GPU to be directed towards the more traditional requirements of rendering graphics. Again, as developers progress in technology and engine requirements, there is more headroom for growth in the X1 system, since the GPU and CPU are more focused on base requirements as opposed to having everything chip into them down the line.

Again, context. The thread is about 50% more power, the context of the discussion is that it is not true.
 
Audio being handled by CUs is possible (I'm not sure what would remain on the CPU versus what would be done on CUs), just not probable any time soon; we probably won't see advances like that till year 3. Which both Sony and AMD are banking on, since AMD really wants to push the whole GPGPU thing.

Like I said, this singular quote pretty much sums up the entire thing.

3 years from now AMD will have much more powerful GPUs for GPGPU stuff.

Nobody claimed that GPGPU would be concentrated on advanced audio, but that the CPU and GPU would need to be used for many activities down the line, with advanced audio as one example in next gen gaming. The argument got sidetracked onto audio exclusively, but that didn't mean audio exclusively used up the extra CPU and GPGPU.

The point was that X1 has dedicated hardware for many things that normally, and in the future, would require CU usage; they are handled outside the CUs, thus allowing the CPU and GPU to be directed towards the more traditional requirements of rendering graphics. Again, as developers progress in technology and engine requirements, there is more headroom for growth in the X1 system, since the GPU and CPU are more focused on base requirements as opposed to having everything chip into them down the line.

Again, context. The thread is about 50% more power, the context of the discussion is that it is not true.

It's not 50% more powerful. The GTX 660 is 1.8 tflops and the GTX 560 Ti is 1.31, which is about a 37% gap, not 50%. But the 660 can run Crysis maxed at 1080p while the 560 Ti can barely run it at 1080p 60fps on low settings. The gap isn't 50%, but it's very significant.

The example they gave, of a PS4 multiplatform game running at 1080p 30fps while the XB1 version runs at 900p in the low 20s, would be significant if true.
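For what it's worth, the napkin math on those ratios, using the TFLOPS figures as cited above (whether raw flops is the right yardstick is its own argument):

# Percentage gaps from the cited TFLOPS numbers
print(1.8 / 1.31 - 1)    # ~0.37 -> the 660 is ~37% ahead, not 50%
print(1.84 / 1.31 - 1)   # ~0.40 -> PS4 (1.84) vs XB1 (1.31), same ballpark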
 
3 years from now AMD will have much more powerful GPUs for GPGPU stuff.

That goes without saying, but the real advances will come from the console space and roll over to the PC space.

It's not 50% more powerful. The GTX 660 is 1.8 tflops and the GTX 560 Ti is 1.31, which is about a 37% gap, not 50%. But the 660 can run Crysis maxed at 1080p while the 560 Ti can barely run it at 1080p 60fps on low settings. The gap isn't 50%, but it's very significant.

The example they gave, of a PS4 multiplatform game running at 1080p 30fps while the XB1 version runs at 900p in the low 20s, would be significant if true.

Well a 660Ti is Kepler while 560Ti is Fermi, different architectures.
 
Well a 660Ti is Kepler while 560Ti is Fermi, different architectures.

Just plain 660, not 660 Ti. But I could say a 650 Ti and a GTX 660 (non-Ti); that's the same difference in flops as between the XB1/PS4.
 
Deleted a few posts between Smilebit and Astro. Was getting a little trollish and personal.
 
No, I think you're the one who's conflating here. There are several games that were talked about on B3D by ERP and others in which the very process of doing audio itself was extensively different on each platform, not just the channels or the compression. Do a Google search with the modifier "site:beyond3d.com" and search for Battlefield (or DICE) audio; you should find some talks on exactly how different the two systems handled audio that extend further than merely quality.

...SNIP

You aren't addressing what I've said at all. You are just taking random forum posts by bkilian and acting as if they stand in for some sort of counterpoint you're unable to provide when these forum posts don't speak against my argument at all.

Here is what you seem to imagine I've argued: "SHAPE is so great devs will use it to its fullest and therefore will cancel out the 4 CU's PS4 has for GPGPU tasks. YAY Xbox!"

Here is what I actually am arguing: "Devs will build towards CPU parity with SHAPE in mind and as such will design their audio to run on the 4 CU's PS4 has for GPGPU tasks. Audio won't take up all of those CU's, but it will need to dedicate 1 or 2 to it in high end AAA games which do things like complex reverb and elaborate filtering (CoD, BF4, etc). The reason devs will prioritize CPU load parity is those tasks (AI, physics) are vital to gameplay balancing and bug testing (aka notable adjustments to them cost lots of overhead $$$).

Hence, I expect real world usage comparisons will come down to something like 14 vs 12 CU's+7% clock advantage+display planes+DME's+eSRAM for graphics rendering, SHAPE vs 1 or 2 CU's for audio, maybe 1CU vs 150MHz clock advantage for AI/physics. In other words, the two will be exceedingly close in what is put on screen, how it sounds, and how it reacts. Evidence to support general conclusion: (i) various devs outright asserting they are equal or on par; (ii) Exclusives look on par with one another in tech across the board; (iii) Multiplatform games like MGSV look the same on both."

You are operating under the assumption that devs will be porting from PS4 to X1. History shows that devs tend to do the opposite (PS3 was lead platform for most titles mid-late cycle because it's easier to code to particulars than to port to them). Devs are highly unlikely to code their audio to the PS4's CPU only and then port to X1 where they suddenly have massive headroom that in your scenario they just ignore and do nothing with. Devs will take advantage of both console's designs and there is an obvious, natural, and straightforward way to do that, even if you refuse to admit it.
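For reference, the clock edges I keep citing, worked out from the published clocks (GPU 853 vs 800 MHz, CPU 1.75 vs 1.6 GHz):

# X1 clock advantages as percentages
print(853 / 800 - 1)     # ~0.066 -> the "~7%" GPU clock edge
print(1.75 / 1.60 - 1)   # ~0.094 -> the "150 MHz" CPU clock edge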
 
Zero advantage if developers take the additional time needed to take advantage of it.

What? The eSRAM's latency is going to be leveraged for GPGPU compute at the very least. DF is doing an article on it "soon" to talk about MS's approach to that. The additional bandwidth isn't something devs need to spend yrs discovering... it's being measured at over 200GB/s ALREADY (and that was really back in late May, fyi) on X1. That's real world usage, not theoretical peak. Actual games on X1 use that kinda bandwidth. On PS4 you are looking at 172GB/s or so if you're lucky. Not sure how you can conclude this isn't an advantage. RAM amounts for gaming and bandwidth are the only talking points that have ever existed for GDDR5. Suddenly the entire operational reason for RAM's existence is not important, or what?
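The headline bandwidth numbers are simple bus math, for anyone who wants to check them (the 200GB/s+ measured figure is MS's claim, not mine):

# PS4 GDDR5: 256-bit bus x 5.5 Gbps per pin
print(256 * 5.5 / 8)       # 176.0 GB/s theoretical peak

# X1 eSRAM: a 1024-bit path at 853 MHz moves ~109 GB/s each direction;
# the 200GB/s+ figures count concurrent reads and writes.
print(1024 / 8 * 0.853)    # ~109.2 GB/s one way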

I don't know if they are Sony fanboys or not. I can only take your word for it.

You can keep reading the thread instead of simply copying random posts that tell you what you want to hear. Ya know...learn stuff.

It's funny you never link out to any sources and when I go to look up what you've said on Google I only find another post by you on a different forum. Can't find anything about "lhuerre" talking about the PS4.

Neat story. Don't feel like digging through GAF or B3d for you. I've seen the posts he made at B3d and GAF where he says they are closer than PS3/360 were and are definitely on par. Don't take his word for it; ERP, sebbbi, Kojima, Marcus Nilsson and Carmack all agree too. They are reliable ppl. Anonymous sources from DF agreed. Likewise, anonymous sources from EDGE's articles earlier in the year also agreed! So did the Avalanche guy. And then after EDGE's editors decide to declare PS4 the console war winner, all of a sudden they have anonymous sources telling them different, and we are all supposed to take that at face value? Hell, even their source says they aren't using the eSRAM or the X1's hardware in any specific fashion, and their comparison was seemingly just for platform agnostic code, which means precisely nothing worthy of discussion.

They can't concede a single point against the XB1.

Nonsense. It's about facts. Fact is that there are areas each console has an edge over the other. Sony's dev tools are simpler and they were able to get devs up and running faster because of that. That's a result of their simplistic design goals, which is good and exactly what they needed as a company from a fiscal pov. MS has better performance in the memory sub-system but they also have to get the dev tools caught up to help devs take advantage of that. PS4 has more CU's for graphics rendering balance (14vs12). MS has display planes to help adjust framerates, resolutions, etc and other small helper hardware. Again, their software has the burden of helping devs leverage this stuff. That's the downside of a more elaborate design. PS4 also has more CU's for GPGPU stuff. Some of that will likely be soaked up with audio tasks, but not all of them. PS4 is also cheaper.

It's not about "conceding" anything. It's about acknowledging reality even when it's inconvenient. You are WAY too washed up in the prevailing internet narrative about how these consoles have been designed and how they will operate. That narrative isn't even in the ballpark of being realistic.

Talking to Astro he argues me down on the point that the PS4 is balanced to use 1.4 tflops, but that's not good enough for him. He has to make up that final 90 gflops and claim that the PS4 only has 1.31 tflops just like the XB1. He can't admit that there is a small 90 gflop difference.

14CU's on PS4 is actually 1.435Tflops. And I've never said anything about that being the same as X1's 12CU's. I've seen ppl at B3d make that mistake though. It's not true because X1 gets all the computing power it needs from those 12 (it's balanced at 12) while PS4 gets its computing power for graphics from its 14. It's not a 12CU's on X1 + clock boost > 14CU's on PS4 sorta thing. That's not what MS said. MS said for THEIR system, balanced at 12, the boost was more helpful than 2 more CU's. If Sony wanted to use 16CU's it would likely be similar in terms of improving rendering in a relatively negligible manner over the 14CU alternative.
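The underlying GCN math, for anyone who wants to check these numbers (64 lanes per CU, 2 flops per lane per cycle via FMA):

# Peak single-precision throughput: CUs x 64 x 2 x clock (GHz) = TFLOPS
def gcn_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(gcn_tflops(18, 0.800))   # ~1.843 -> PS4 full GPU
print(gcn_tflops(14, 0.800))   # ~1.43  -> PS4's graphics-balanced 14 CU's
print(gcn_tflops(4, 0.800))    # ~0.410 -> the "400 gflops" from 4 CU's
print(gcn_tflops(12, 0.853))   # ~1.310 -> X1's 12 CU's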

Then you talk about the 400 gflops left for animation, physics, lighting etc. etc. And that MUST be used for audio and nothing else. So basically Sony just wasted 400 gflops according to him.

Those CU's aren't doing GPGPU lighting. Nor animation. And I never suggested they are wasted at all. They will likely house audio processing and some physics to help the CPU out (which is weaker than X1's). Stop making s*** up and lying compulsively about what I said. If ya wanna know what I think, ask me and I will tell you. In the meantime, stop putting words in my mouth.

Then there is the claim that the ESRAM is just as good as the GDDR5 ram in the PS4. Yet, Ryse and Dead Rising 3 are running sub-1080p just like Ketto, Bunz, and a lot of other people were predicting. Those are exclusive games too.

So are The Order and BF4 on PS4. Games get res and framerates prioritized differently and optimized late in the dev cycle. And DR3 isn't sub-1080p at all times btw, it's dynamic res. And nobody even noticed either of those games being "sub-1080p" at all until it was made known. From what I've heard, lots of next gen games are sub-1080p... just nobody has directly asked devs about those specifics, so nobody talks about it.
 
Oh hey guys, look what just got posted on VGLeaks, confirming that audio tasks are expected (by Sony) to run on those CU's at some point:

[image: audio1-600x360.jpg (VGLeaks slide on the PS4 audio processor)]


:|
 
I've not been on the forums for about a week or so but I heard Ryse is not even 1080p, 900p?

I wonder what offloading audio will do for that...
ruh roh! Well, when you start development on the 360 and scramble to port it to the XBO... you get a game that looks like a 360 game running at a higher resolution.
Let me say it again: GOW 3/A > Ryse. YUP, as ALL of GOW3/A ARE REALTIME in-engine graphix and looking better than Ryse.
 
ruh roh! Well, when you start development on the 360 and scramble to port it to the XBO... you get a game that looks like a 360 game running at a higher resolution.
Let me say it again: GOW 3/A > Ryse. YUP, as ALL of GOW3/A ARE REALTIME in-engine graphix and looking better than Ryse.
I'm just wondering if Idle is going to come in here now and call you out for saying sh1t like this.
 
Who? No. It's just EDGE's no name anonymous source. Literally nobody else has suggested as much. lhuerre is a dev but not Matt, who iirc isn't a dev at all and just spreads FUD on GAF. lhuerre confirmed the 14/4 usage info and said the two platforms are basically the same in performance. He specifically said it was much closer than 360/PS3. But no, those two had nothing to do with anything.

Here is exactly what he said word for word.

lherre said:
About the CU's ...

All are the same, no differences between them.

About the "balance". As Cerny said in the interview you can use them as you wish, render or GPGPU. They simply said that those "extra" 400 Gflops are "ideal" for gpgpu tasks instead of rendering, but it is up to you how to use them. Not that you can't gain anything using the 18 CU's for rendering, obviously you will see a difference using them only for rendering versus 14 CU's but they want to encourage devs to use gpgpu compute in ps4.

So it's a misunderstood that the gpu is unbalanced in any way. - Source

Multiplatform developers probably won't use it for GPGPU at all.

ruh roh! Well, when you start development on the 360 and scramble to port it to the XBO... you get a game that looks like a 360 game running at a higher resolution.
Let me say it again: GOW 3/A > Ryse. YUP, as ALL of GOW3/A ARE REALTIME in-engine graphix and looking better than Ryse.

God Of War 3? That doesn't make any sense.
 
http://www.vgleaks.com/playstation-4-audio-processor-acp/

But that doesn't mean that 400 gflops will be used solely for audio.

Nobody has ever claimed this. Read posts before replying. I don't need you suggesting I claimed things that I most certainly never have.

Also, they are saying it's a possibility in the future. Not that it would be done with games within the launch window.

If devs want to use all 18CU's for rendering their games won't look notably better than the 12CU X1 version. As they use up CPU cores for audio, all you'll get is a gimped AI/physics setup on both consoles. Devs are likely smarter than that. I hope.
 
Here is exactly what he said word for word.

That's not the post(s) I was referring to. He said both on B3d and GAF a while back that the two are on par with each other, closer than PS3/360 were this gen.

Multiplatform developers probably won't use it for GPGPU at all.

Then they likely won't use them for anything meaningful at all. Using them all for rendering won't net you jack s*** for improving visuals. It'd be a very marginal 20% boost at best (going by MS's figures which should be representative), which amounts to very, very little once you consider how that will be utilized (more pixels which aren't noticeable in the first place).

What you are suggesting is that devs will just ignore Sony's advice and waste those CU's. They won't. They will use them in the obvious fashion as I have described: to do audio processing and a bit of physics. The rest of physics and the AI will be done on the CPU on PS4. On X1 all audio will be on the audio block(s) and the AI/physics will be on the CPU with its 150MHz clock advantage.

What you guys don't seem to grasp is that this gen there are very obvious, sensible ways to distribute the computing that still take advantage of each system's unique attributes. Devs will take advantage of SHAPE and the other audio chips as well as the display planes and eSRAM. They will also take advantage of the CU's in PS4. If you are holding on for dear life to the hope that someday down the road devs will use those CU's for audio then your argument about power differences is already dead in the water.
 
It'd be a very marginal 20% boost at best (going by MS's figures which should be representative), which amounts to very, very little once you consider how that will be utilized (more pixels which aren't noticeable in the first place).

Oh, I see. You are stuck on that 50% stuff. I know it doesn't scale like that just based off the Digital Foundry test. A 20% difference is like the XB1 version running at 900p with a framerate in the mid 20s versus the PS4 version running at 1080p solid 30fps. Like what the rumors were pointing to. MS could just do dynamic resolution scaling to lock the framerate at 900p.
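Dynamic resolution scaling like I mentioned is just a feedback loop on frame time; a minimal sketch, with thresholds and step size made up for illustration:

# Drop render height when over budget, claw it back when comfortably under
TARGET_MS = 33.3           # 30 fps frame budget
MIN_H, MAX_H = 720, 900    # illustrative resolution bounds

def adjust_height(height, last_frame_ms):
    if last_frame_ms > TARGET_MS:
        return max(MIN_H, height - 36)
    if last_frame_ms < TARGET_MS * 0.9:
        return min(MAX_H, height + 36)
    return height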

Yes, I'm suggesting that devs won't waste that 400 gflops on audio. Not even half of that. Maybe 100 gflops.
 