Xbox One "Crazy Bandwidth" A Benefit To Developers Says Microsoft

Yes, that was what I said here: http://unionvgf.com/index.php?posts/18033/



The Xbox One's circa 200 GB/s is "real-life" bandwidth. Games can read from both DDR3 and ESRAM at the same time.

272 GB/s is a theoretical peak, just like the PS4's 176 GB/s peak is. We don't know the real-life bandwidth for the PS4, but it's lower than 176 GB/s (unless you are on NeoGAF; I think they figure the PS4 gets a higher "real-life" bandwidth than its highest theoretical peak!!)
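For anyone wondering where those figures come from, here's a rough back-of-the-envelope sketch (my own arithmetic from the publicly quoted bus widths and clocks; treat the results as approximate, not official measurements):

```python
# Rough derivation of the theoretical peaks being argued about.
# Figures are from public spec sheets; treat them as approximate.

# Xbox One DDR3: 2133 MT/s on a 256-bit (32-byte) bus
ddr3_peak = 2133e6 * 32 / 1e9            # ~68 GB/s

# Xbox One ESRAM: 853 MHz on a 1024-bit (128-byte) path, one direction
esram_one_way = 853e6 * 128 / 1e9        # ~109 GB/s
# MS says reads and writes can overlap on roughly 7 of every 8 cycles,
# which is where the ~204 GB/s ESRAM peak comes from
esram_peak = esram_one_way * (1 + 7 / 8) # ~204 GB/s

x1_combined_peak = ddr3_peak + esram_peak  # ~272 GB/s theoretical

# PS4 GDDR5: 5500 MT/s effective on a 256-bit (32-byte) bus
ps4_peak = 5500e6 * 32 / 1e9             # ~176 GB/s theoretical

print(f"X1 combined theoretical peak: ~{x1_combined_peak:.0f} GB/s")
print(f"PS4 theoretical peak: ~{ps4_peak:.0f} GB/s")
```

The circa 200 GB/s "real-life" number is MS quoting what their own code has reportedly sustained (roughly 140-150 GB/s out of the ESRAM plus 50-55 GB/s out of DDR3 at the same time), not another theoretical peak.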



Well, we don't know that; what most people would expect isn't what matters. I'm sure a lot of devs will take advantage of it in practical games.
"Microsoft's point is that game-makers have experience of this already owing to the eDRAM set-up on Xbox 360 - and ESRAM is the natural evolution of the same system."

Just signed up here after being banned from the GAF for trying to explain the X1's bandwidth to them. If you want a laugh: http://m.neogaf.com/showthread.php?t=676989&page=128
I start at #6372... Some of the responses I get are quality... I especially like the one where I'm told I'm Albert Penello...
 
Just signed up here after being banned from the GAF for trying to explain the X1's bandwidth to them. If you want a laugh: http://m.neogaf.com/showthread.php?t=676989&page=128
I start at #6372... Some of the responses I get are quality... I especially like the one where I'm told I'm Albert Penello...
Oh god... I just read a bit of the sh*tstorm after #6372... I just got even prouder of my avatar.
 
Just signed up here after being banned from the GAF for trying to explain the X1's bandwidth to them. If you want a laugh: http://m.neogaf.com/showthread.php?t=676989&page=128
I start at #6372... Some of the responses I get are quality... I especially like the one where I'm told I'm Albert Penello...
Welcome to the forums. You're going to like the users here; most of us have been around for more than 7 years.

We were OGs from TeamXbox. GAF is the last place you want to hang out. Their IQ is like 0.

Oh, by the way, I did see your name on the sh*t list of those who got banned! LOL
 
Welcome to the forums. You're going to like the users here; most of us have been around for more than 7 years.

We were OGs from TeamXbox. GAF is the last place you want to hang out. Their IQ is like 0.

Oh, by the way, I did see your name on the sh*t list of those who got banned! LOL

Cheers, I have been lurking for a while...

It was my first post over there. My motivation for trying to get some facts across to the GAF was the poor sods who have pre-ordered the X1 and are losing sleep over it, because they have been reading all the nonsense spouted over there telling them their console is weak... but it will be OK, because although the multiplats will look a bit worse, they will probably play OK.
Oh the irony.

The way they are dancing around the sh*t list like it's a triumph... it's not a triumph for free speech and balance!
 
Cheers, I have been lurking for a while...

It was my first post over there. My motivation for trying to get some facts across to the GAF was the poor sods who have pre-ordered the X1 and are losing sleep over it, because they have been reading all the nonsense spouted over there telling them their console is weak... but it will be OK, because although the multiplats will look a bit worse, they will probably play OK.
Oh the irony.

The way they are dancing around the sh*t list like it's a triumph... it's not a triumph for free speech and balance!
We have yet to see multiplat real-world performance... I won't be surprised if the X1 actually has a slight edge (especially in CPU-intensive and highly dynamic games).
 
Just signed up here after being banned from the GAF for trying to explain the X1's bandwidth to them. If you want a laugh: http://m.neogaf.com/showthread.php?t=676989&page=128
I start at #6372... Some of the responses I get are quality... I especially like the one where I'm told I'm Albert Penello...

Just disgusting, and honestly they know jack. I love how they keep quoting irrelevant and very old items. And my favorite is that an indie dev got 172 GB/s from the PS4, so that is the established real-world average. An indie that most likely has tons of repetitive code parts, in a setup not balanced to handle true retail game code. Hell, MS could easily write code that keeps the ESRAM bandwidth saturated at peak, but what good would that do?

I ESPECIALLY love the following...

Originally Posted by nib95
GAF will always be more accurate and reliable, because here we have a strict control and check up system where insiders etc are vetted before they're allowed to spout BS claims. After the sifting through the weed, this generally leaves a better cut of actual insiders and industry people left posting. Many of the ones who've posted in here even, despite not being insiders, still have experience in the field or fields similar, and can usually pretty quickly and astutely shoot down FUD. It's a shame so much FUD on B3D is not only propagated, but actually regurgitated and continually given traction.

The majority of secret sauce developments actually originated from B3D, that should tell you what you need to know credibility wise. Luckily they do still have a few genuine insiders and devs etc who do still post, albeit not nearly enough.

Followed by another Einstein...

I've been a member there since 05 I think but the site went to s*** a long time ago. Last time I posted there was when Reiko got banned. I remember the good ol days when PS3 wasn't launched yet and Ninja Theory devs use to post there giving some insight on Heavenly Sword. These days it's comical to see the stuff on there.

And calling the B3D quoters Xbots, Shifty Geezer above all... classic!
 
Just disgusting, and honestly they know jack. I love how they keep quoting irrelevant and very old items. And my favorite is that an indie dev got 172 GB/s from the PS4, so that is the established real-world average. An indie that most likely has tons of repetitive code parts, in a setup not balanced to handle true retail game code. Hell, MS could easily write code that keeps the ESRAM bandwidth saturated at peak, but what good would that do?

I ESPECIALLY love the following...



And calling the B3D quoters Xbots, Shifty Geezer above all... classic!

And the funny thing is that I think B3D is out of touch a lot of the time. Only recently have they really been approaching things more openly, looking at things beyond typical PC knowledge.

But GAF... I go there when I need some laughs.
 
Just signed up here after being banned from the GAF for trying to explain the X1's bandwidth to them. If you want a laugh: http://m.neogaf.com/showthread.php?t=676989&page=128
I start at #6372... Some of the responses I get are quality... I especially like the one where I'm told I'm Albert Penello...

Read the first pages and... wow, just wow!

It doesn't even matter whether you were right or wrong; I saw NOTHING that warrants a ban. Despite personal attacks left and right, you actually kept it civil.

Damn shame - such injustices make me so angry.

Planning a write-up on this one (seriously).
 
If we're taking a look at the PS4's theoretical bandwidth versus actual performance, then the only reference we have for actual performance would be a developer quoted as saying he was hitting 172 GB/s versus the 176 GB/s theoretical. The guy didn't say whether that was a peak or something he actually sustained. So the reason I didn't mention it in my reply was because of that, and because it's not that far off the theoretical.

On the other hand, the Xbox figure is way different in actual scenarios, so it's worth pointing out. And IMO, if we have relevant performance figures to compare, it makes more sense to compare those than to compare figures we know to be useless.

If we know:
a> That the XB1 has hit an actual 200 GB/s in a real scenario, and
b> That the PS4 has a theoretical max of 176 GB/s,
then we also know:
c> That no amount of mystic hoohah will make the PS4's performance hit 200 GB/s.

But that's really just to point out the obvious there, and not a claim that those numbers are worth using as the primary comparison between the consoles.
 
But that's really just to point out the obvious there, and not a claim that those numbers are worth using as the primary comparison between the consoles.

Point for discussion - are those numbers worth using to compare the relative performance?

These systems are, at the end of the day, data processing systems. What they do is process data. The average bandwidth figure is a measure of how much work a system is able to do in a second. For the sake of argument, assume both systems are running the same code (at least at a functional level). Surely the amount of data these systems are able to process while running that code is indicative of the performance level?

I propose that it is a better indicator than the number of CUs or CPUs, or even peak bandwidth, as it is a measure of the actual work done.
 
Point for discussion - are those numbers worth using to compare the relative performance?

These systems are, at the end of the day, data processing systems. What they do is process data. The average bandwidth figure is a measure of how much work a system is able to do in a second. For the sake of argument, assume both systems are running the same code (at least at a functional level). Surely the amount of data these systems are able to process while running that code is indicative of the performance level?

I propose that it is a better indicator than the number of CUs or CPUs, or even peak bandwidth, as it is a measure of the actual work done.

Bandwidth is how much data they can transmit per second, not how much work can be done. If you had a hypothetical system that could transmit 1 TB of data in a single chunk once a second, that bandwidth would be amazing, but it would make for a terrible game system, because you need to be able to update the screen much more frequently and react much more quickly than a one-second cycle would allow. On the other extreme, a one-bit pathway that's updated trillions of times a second would also be problematic, just because it would be a massive headache to try to coordinate sending and receiving that data.

So, yeah, it matters in the sense that too little bandwidth is a killer. More importantly, they need to be able to get stuff from memory quickly (CPU needs) and process and send stuff from memory to the screen quickly (GPU needs) as well. Sony went with trying to solve that balance with a high-power solution, as is tradition, while MS went with a balanced solution, as is tradition.
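To put toy numbers on that (the link characteristics below are just the hypotheticals from this post, not real hardware):

```python
# Why raw GB/s alone isn't the whole story: at 60 fps each frame's data has
# to arrive within its ~16.7 ms budget, not just "some time within the second".

frame_rate = 60
frame_budget_s = 1 / frame_rate                 # ~0.0167 s per frame

# Hypothetical link A: 1 TB delivered once per second, in a single chunk.
# Huge average bandwidth, but 59 of the 60 frames in that second are stuck
# waiting on data that hasn't arrived yet.
link_a_interval_s = 1.0
frames_waiting_per_delivery = link_a_interval_s / frame_budget_s   # 60

# Hypothetical link B: a steady 200 GB/s. Every frame can pull its share
# of data within its own budget.
link_b_bandwidth_gbs = 200
data_per_frame_gb = link_b_bandwidth_gbs * frame_budget_s          # ~3.3 GB

print(f"Frame budget: {frame_budget_s * 1000:.1f} ms")
print(f"Frames waiting on link A per delivery: {frames_waiting_per_delivery:.0f}")
print(f"Data available per frame on link B: ~{data_per_frame_gb:.1f} GB")
```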
 
Bandwidth is how much data they can transmit per second, not how much work can be done. If you had a hypothetical system that could transmit 1 TB of data in a single chunk once a second, that bandwidth would be amazing, but it would make for a terrible game system, because you need to be able to update the screen much more frequently and react much more quickly than a one-second cycle would allow. On the other extreme, a one-bit pathway that's updated trillions of times a second would also be problematic, just because it would be a massive headache to try to coordinate sending and receiving that data.

So, yeah, it matters in the sense that too little bandwidth is a killer. More importantly, they need to be able to get stuff from memory quickly (CPU needs) and process and send stuff from memory to the screen quickly (GPU needs) as well. Sony went with trying to solve that balance with a high-power solution, as is tradition, while MS went with a balanced solution, as is tradition.

I maybe didn't make myself too clear. Probably not going to this time either, but I'll have a go... What I'm trying to say is...

The average measured bandwidth is the amount of data that is actually being moved around the system. E.g. in any given second, 60 game frames have been rendered, moving (in the case of the X1) 200 GB of data around in the process. At times during the generation of a single one of those frames it could be using 50 GB/s; at other times in the same frame it could be using 270 GB/s, depending on what stage of the pipeline is running. But in total, in that measured second, 200 GB of data was read or written.

So what I am suggesting is that that 200 GB is the amount of work that was done, by all the processing elements in the architecture.

Now, if everything else remained equal except that you could only do 140 GB of work in a second (i.e. the maximum average bandwidth you've measured is 140 GB/s), then you are not going to be able to read and write the 200 GB of data needed for 60 frames, so you need to change the code to use less data: e.g. drop the resolution, or drop the frame rate, whatever.

Therefore that average bandwidth utilisation can be a good measure of the system's performance.
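To put rough numbers on the argument above (the per-second totals are the ones from this post; splitting them into per-frame budgets is just arithmetic):

```python
# Per-frame data budget at 60 fps, assuming total data moved per second is
# the limiting factor, as argued above.

FRAMES_PER_SECOND = 60

def per_frame_budget_gb(avg_bandwidth_gbs: float) -> float:
    """Data that can be read/written per frame at a given average bandwidth."""
    return avg_bandwidth_gbs / FRAMES_PER_SECOND

needed_per_frame = per_frame_budget_gb(200)   # ~3.33 GB/frame if the game moves 200 GB/s
limited_per_frame = per_frame_budget_gb(140)  # ~2.33 GB/frame on a 140 GB/s system

# A 140 GB/s system can only hold 60 fps by cutting the per-frame data
# (resolution, assets), or it has to drop to a frame rate it can feed:
sustainable_fps = 140 / needed_per_frame      # ~42 fps at the same data per frame

print(f"200 GB/s -> {needed_per_frame:.2f} GB per frame at 60 fps")
print(f"140 GB/s -> {limited_per_frame:.2f} GB per frame at 60 fps")
print(f"140 GB/s at {needed_per_frame:.2f} GB/frame -> ~{sustainable_fps:.0f} fps")
```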

Am I making any sense at all?
 
I maybe didn't make myself too clear. Probably not going to this time either, but I'll have a go... What I'm trying to say is...

The average measured bandwidth is the amount of data that is actually being moved around the system. E.g. in any given second, 60 game frames have been rendered, moving (in the case of the X1) 200 GB of data around in the process. At times during the generation of a single one of those frames it could be using 50 GB/s; at other times in the same frame it could be using 270 GB/s, depending on what stage of the pipeline is running. But in total, in that measured second, 200 GB of data was read or written.

So what I am suggesting is that that 200 GB is the amount of work that was done, by all the processing elements in the architecture.

Now, if everything else remained equal except that you could only do 140 GB of work in a second (i.e. the maximum average bandwidth you've measured is 140 GB/s), then you are not going to be able to read and write the 200 GB of data needed for 60 frames, so you need to change the code to use less data: e.g. drop the resolution, or drop the frame rate, whatever.

Therefore that average bandwidth utilisation can be a good measure of the system's performance.

Am I making any sense at all?


When I read it...

what-does-that-mean.gif


But that is just because I am not schooled or versed in the tech speak.