Xbox One "Crazy Bandwidth" A Benefit To Developers Says Microsoft

Yeah, June 21st was eons ago. So the building has been built? And it's full of servers, up and running?

Why not head over to the tickle thread to continue our discussion.
This is adding to the huge system they already have... it is an advantage for MS... GET OVER IT.
 
Projection

That's the only way you could possibly think "your blind PS4 worship" fits my posting history; change the word PS4 to X1 and that's you.

Dedicated servers are working out just fine thanks. Devs have the option to RENT them on the PS4 just like they can RENT them from MS for the X1. Of course on both systems the devs can take the option to not RENT them. You're going to be disappointed if you think every game on the X1 will have dedicated servers, that's not what MS said. All they are doing is offering to RENT servers, nothing more.

Why does Ryse look better than the PS4 lineup? Because you want it to; it's a subjective call and you are a 'fan' of the X1. Let's face it, a 900p QTE fest is not going to sell any consoles.


I have to disagree with you here. If Activision can afford to rent Azure servers for Call of Duty: Ghosts (with their large player base) then it must be extremely cheap. If the PS4 version of COD Ghosts doesn't have dedicated servers, then we will know for sure.
 
But to be fair, why should Kojima or Carmack be taken seriously when they can't even get it right that the PS4 is 50% faster, when even anonymous indie devs know that? :smash:

Then a game that can only maintain 40 fps @ 1080p on XB1 had better run at 60 fps @ 1080p on PS4. Otherwise, 50% is an exaggeration, or not relevant to in-game computation.
 
Umm... you know that Azure is already an existing platform, right? You do realize that MS has data centers already in place around the world, right? You do realize that the news in the links you provided only pertains to Iowa, right? As in an expansion of their already existing data center there?

From the links you provided:

The Azure data centers that I know about off the top of my head:

  • North Central US – Chicago, Illinois
  • South Central US – San Antonio, Texas
  • West US – California
  • East US – Virginia
  • North Europe - Dublin, Ireland
  • West Europe - Amsterdam, Netherlands
  • East Asia – Hong Kong
  • Southeast Asia – Singapore
Additionally, from earlier this month, http://www.thewhir.com/web-hosting-news/microsoft-accelerates-its-data-center-expansion.

Snippet:

So no, you are not entirely accurate.
Now that's what I'd call cutting through the BS like a scythe!
 
Projection

That's the only way you could possibly think "your blind PS4 worship" fits my posting history; change the word PS4 to X1 and that's you.

Dedicated servers are working out just fine thanks. Devs have the option to RENT them on the PS4 just like they can RENT them from MS for the X1. Of course on both systems the devs can take the option to not RENT them. You're going to be disappointed if you think every game on the X1 will have dedicated servers, that's not what MS said. All they are doing is offering to RENT servers, nothing more.

Why does Ryse look better than the PS4 lineup? Because you want it to; it's a subjective call and you are a 'fan' of the X1. Let's face it, a 900p QTE fest is not going to sell any consoles.

And they are only QTEs because you want them to be, even though it has been stated all over media sites that they really aren't QTEs.
 
Then a game that can only maintain 40 fps @ 1080p on XB1 had better run at 60 fps @ 1080p on PS4. Otherwise, 50% is an exaggeration, or not relevant to in-game computation.

A poster at Beyond3D who worked on the XB1 (he no longer works at MS) said the PS4 never had a 50 percent advantage over the XB1.
 
Not entirely accurate. MS hasn't even built the building where this server farm is going to live.

http://www.gamerevolution.com/news/...rm-for-xbox-one-and-office-applications-20037

http://blogs.desmoinesregister.com/...ion-data-center-investment-in-west-des-moines

http://www.passfail.com/news/tacoma.../microsoft-plans-iowa-data-center-9696040.htm

The building permits were only issued a month ago, and construction has only just begun. So it doesn't actually exist yet.
Azure is a fully running system, with servers located at various points throughout the world. They're building out capacity. If your argument is that Microsoft's cloud isn't actually built yet, again, you're wrong. And again, Sony does not have anything remotely like this. They could eventually, even within a year. Right now they don't, end of story.
 
I have to disagree with you here. If Activision can afford to rent Azure servers for Call of Duty: Ghosts (with their large player base) then it must be extremely cheap. If the PS4 version of COD Ghosts doesn't have dedicated servers, then we will know for sure.
Agreed. And if Activision wants to provide dedicated servers for PS4, I'm going to go out on a limb and guess that it won't be provisioned through Azure. They'll have to go with another provider, and right now, the only providers out there are traditional, "rent a box" datacenters, where you're paying for hardware. Azure is a virtualized, on-demand system where you pay for resource hours (usage), again like Amazon AWS. There's a huge difference in cost overhead, management, and scalability between the two approaches, which is a major reason why Amazon AWS is now running as much of the web as they are. And it's an ideal scenario for running gaming computational resources (like multiplayer servers) because the usage can vary so wildly over the lifetime of a game.

And again, I'll beat this dead horse: I'm not saying Sony CAN'T take this approach to provisioning multiplayer servers. Just that so far they haven't. And to do so, they'd probably need to partner with another major player like AWS, Rackspace, and/or a third party running on top of one of those systems (the relationship between Sony and Rackspace is not for this kind of thing, though it probably could be and will be eventually). MS is leveraging the fact that Azure is a core part of their overall business in order to differentiate themselves, and it's a great differentiator.
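To make the "paying for hardware" versus "paying for resource hours" distinction concrete, here's a rough back-of-the-envelope sketch in Python. Every price, capacity, and player count in it is an assumption invented for illustration, not a real Azure or datacenter rate; the only point is how differently the two models behave once a game's population tails off after launch.

    # Hypothetical cost comparison: renting dedicated boxes vs. paying for instance-hours.
    # All prices and player counts below are invented for illustration.
    DEDICATED_BOX_MONTHLY = 300.0   # flat cost per rented server per month (assumed)
    ON_DEMAND_HOURLY = 0.20         # cost per virtual server instance-hour (assumed)
    PLAYERS_PER_SERVER = 100        # players one server instance can host (assumed)

    def dedicated_cost(peak_players, months):
        """Rent-a-box model: you provision (and pay) for peak load 24/7."""
        boxes = -(-peak_players // PLAYERS_PER_SERVER)  # ceiling division
        return boxes * DEDICATED_BOX_MONTHLY * months

    def on_demand_cost(hourly_player_counts):
        """On-demand model: you pay only for the instance-hours actually spun up."""
        instance_hours = sum(-(-p // PLAYERS_PER_SERVER) for p in hourly_player_counts)
        return instance_hours * ON_DEMAND_HOURLY

    # Toy demand curve: a big launch month that decays over a year.
    hours_per_month = 24 * 30
    demand = []
    for month in range(12):
        players = int(200_000 * (0.8 ** month) + 20_000)
        demand.extend([players] * hours_per_month)

    print("dedicated, provisioned for peak: $%.0f" % dedicated_cost(max(demand), 12))
    print("on-demand, pay per instance-hour: $%.0f" % on_demand_cost(demand))

With a demand curve that shrinks after launch, the rented boxes sit mostly idle while the on-demand bill shrinks with the player base, which is the scalability point being made above.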
 
My company uses the Azure cloud for data. It's been around for a while; you're just new to hearing about it. So if it's new to you, then they're just starting to build it, right?

Wrong.

As far as Sony goes, that's the last company people want to use servers from after their network was hacked and Rockstar's GTA files were stolen.

Anyone who wants to put money into that company, it would be much easier to just give it to me.
 
This should be all over the internet.

X1 RAM bandwidth is 272 GB/s vs PS4 RAM bandwidth of only 176 GB/s.
X1 RAM bandwidth is 55% faster!!!

(Sony's PR machine would have made a big sensation out of this if it was their console.) ;)
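For what it's worth, the arithmetic behind that percentage checks out if you take the peak figures quoted in this thread at face value; a quick check in Python (both numbers are theoretical maximums, not measurements):

    # Quick check of the percentage, using the peak figures quoted in this thread.
    x1_combined_peak = 272   # GB/s, ESRAM and DDR3 peaks added together
    ps4_gddr5_peak = 176     # GB/s, theoretical GDDR5 peak

    print("%.1f%% higher combined peak" % ((x1_combined_peak / ps4_gddr5_peak - 1) * 100))
    # prints 54.5% -- roughly the "55%" claimed above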
 
But we have devs stating that the XB1 is 50% slower to develop for.

Ha ha ha, yeah, anonymous sources said that. :rolleyes:

Also, the PS4 got The Light Saber, which makes it 9.5% faster to dev for.
[Image: Sony PS4 console, captioned "You Too Can Become a Jedi at Lightsaber School"]
 
This should be all over the internet.

X1 RAM bandwidth is 272 GB/s vs PS4 RAM bandwidth of only 176 GB/s.
X1 RAM bandwidth is 55% faster!!!

(Sony's PR machine would have made a big sensation out of this if it was their console.) ;)

272 is purely theoretical and, from what most would expect, isn't ever really going to be obtained within a practical game. The 150 number is what they were able to achieve in actual games, via ESRAM. The 200 figure is a total system bandwidth figure, which includes the DDR3. It should be mentioned that the DDR3 will also be sharing resources with the CPU, though.
 
272 is purely theoretical and, from what most would expect, isn't ever really going to be obtained within a practical game. The 150 number is what they were able to achieve in actual games, via ESRAM. The 200 figure is a total system bandwidth figure, which includes the DDR3. It should be mentioned that the DDR3 will also be sharing resources with the CPU, though.

Yes, that's what I said here: http://unionvgf.com/index.php?posts/18033/

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects
"That's real code running. That's not some diagnostic or some simulation case or something like that. That is real code that is running at that bandwidth. You can add that to the external memory and say that that probably achieves in similar conditions 50-55GB/s and add those two together you're getting in the order of 200GB/s across the main memory and internally."

The Xbox One's circa 200 GB/s is "real-life" bandwidth. Games can read from both DDR3 and ESRAM at the same time.

272 is a theoretical peak, just like the PS4's 176 GB/s peak is. We don't know the real-life bandwidth for the PS4, but it's lower than 176 GB/s (unless you are on NeoGAF; I think they figure the PS4 gets a higher "real-life" bandwidth than its highest theoretical peak!!).

272 is purely theoretical and, from what most would expect, isn't ever really going to be obtained within a practical game.

Well, we don't know that; what most would expect isn't what matters. I'm sure a lot of devs will take advantage of it in practical games.
"Microsoft's point is that game-makers have experience of this already owing to the eDRAM set-up on Xbox 360 - and ESRAM is the natural evolution of the same system."
 
272 is purely theoretical and, from what most would expect, isn't ever really going to be obtained within a practical game. The 150 number is what they were able to achieve in actual games, via ESRAM. The 200 figure is a total system bandwidth figure, which includes the DDR3. It should be mentioned that the DDR3 will also be sharing resources with the CPU, though.

Ok, let's take 150 and add the 68 of the DDR3. That is 218. Still greater than the 176 of the GDDR5, which by the way, also shares resources with the CPU. Great thing about the ESRAM, it can work independently.

And guess what, the 68 of DDR3 and 176 of GDDR5 are max theoretical as well. You can find many examples at tech sites where you get significantly lower than the peak in real situations. Additionally, the GDDR5 bandwidth needs to be shared and split across functions between the CPU and GPU. ESRAM allows for bypassing those situations in many instances.

So your point?
 
I don't think you understand what Azure is. Yes, anyone can RENT servers. From whom? At what cost? Have you rented dedicated servers in the past? It's a b****. You're paying for the whole server, and scaling and maintenance are a b****. Have you ever been involved in the management of a dedicated server farm? Even worse.

Azure, like Amazon's AWS, is a virtual, ON DEMAND server infrastructure. That's a big deal. That means servers can be spun up and spun down at will, based on demand. There is a reason why AWS and similar services are now becoming the de facto standard for resource management on the web, as opposed to "dedicated servers". And right now, Azure is the only such service offering general computing (including dedicated MP servers) specifically for gaming. Could such a service exist for Sony? Of course. But it doesn't yet. That's the advantage that MS has right now. And judging from how Sony countered Xbox Live, I'd imagine it will remain an advantage.

Sorry, but dismissing Azure as "anyone can rent dedicated servers" is naive at best and shows a complete lack of understanding of what exactly Azure brings to the table: cheap, on-demand, scalable general computing focused on gaming. If I had the resources I'd invest them in a third-party company that wanted to do that, because it's going to be huge. And again, right now, MS has it and Sony doesn't. Your attempt to put Sony and MS on equal footing in this regard is a massive failure.
Sony's flagship game Killzone only uses dedicated servers for stats and matchmaking. It uses P2P for the actual MP gameplay. Sony can't even support their own first party games with dedi servers for gameplay.
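Going back to the "spun up and spun down at will, based on demand" point: here is a minimal sketch of what that scaling decision looks like. It is illustrative only, not Azure's actual API, and the capacity, headroom, and minimum-instance figures are assumptions made up for the example.

    # Minimal autoscaling sketch: how many multiplayer server instances to run for the
    # current demand. Illustrative only -- not Azure's real API; the capacity, headroom,
    # and minimum figures are assumptions.
    import math

    MATCHES_PER_INSTANCE = 50   # game sessions one instance can host (assumed)
    HEADROOM = 1.2              # keep 20% spare capacity so joins aren't queued (assumed)
    MIN_INSTANCES = 2           # never scale to zero while the playlist is live (assumed)

    def desired_instances(active_matches):
        """Target instance count for the current number of active matches."""
        needed = math.ceil(active_matches * HEADROOM / MATCHES_PER_INSTANCE)
        return max(MIN_INSTANCES, needed)

    def scale(current_instances, active_matches):
        """How many instances to add (positive) or retire (negative)."""
        return desired_instances(active_matches) - current_instances

    # Example: launch-night spike, then a quiet weekday morning.
    print(scale(current_instances=10, active_matches=2000))   # +38: spin more up
    print(scale(current_instances=50, active_matches=300))    # -42: spin most down

The instance count tracks demand hour by hour instead of staying fixed at whatever was rented for launch night, which is exactly the difference from a traditional dedicated-server contract.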
 
Ok, let's take 150 and add the 68 of the DDR3. That is 218. Still greater than the 176 of the GDDR5, which by the way, also shares resources with the CPU. Great thing about the ESRAM, it can work independently.

And guess what, the 68 of DDR3 and 176 of GDDR5 are max theoretical as well. You can find many examples at tech sites where you get significantly lower than the peak in real situations. Additionally, the GDDR5 bandwidth needs to be shared and split across functions between the CPU and GPU. ESRAM allows for bypassing those situations in many instances.

So your point?

What's my point? My point would've been that the 272 GB/s rating is theoretical; 200 GB/s is what they have said they got in actual game testing.

I made no mention of PS4 or its bandwidth, so take your fanboy rage to another thread bro.

If we're taking a look at the PS4's theoretical bandwidth versus actual, then the only reference we have for actual performance would be a developer quoted as saying he was hitting 172 GB/s versus the 176 GB/s theoretical. The guy didn't elaborate on whether that was a peak or something he actually sustained. So, the reason I didn't mention it in my reply was because of that, and because it's not that far off the theoretical.

On the other hand, the Xbox figure is way different in actual scenarios, so it's worth pointing out. And IMO, if we have relevant performance figures to compare, it makes more sense to compare those than to compare figures we know to be useless.
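To put that contrast in numbers, using only figures already quoted in this thread (so with all the same caveats attached to them):

    # Achieved vs. theoretical bandwidth, using the numbers quoted in this thread.
    figures = {
        "XB1 (ESRAM + DDR3)": (200, 272),  # game-testing figure vs. combined peak
        "PS4 (GDDR5)":        (172, 176),  # developer-quoted figure vs. theoretical peak
    }
    for name, (achieved, peak) in figures.items():
        print("%s: %d/%d GB/s = %.0f%% of peak" % (name, achieved, peak, 100.0 * achieved / peak))
    # XB1 lands around 74% of its combined peak, the PS4 around 98% of its peak,
    # which is the gap the post above is pointing at.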
 
What's my point? My point would've been that the 272 GB/s rating is theoretical; 200 GB/s is what they have said they got in actual game testing.

I made no mention of PS4 or its bandwidth, so take your fanboy rage to another thread bro.

If we're taking a look at the PS4's theoretical bandwidth versus actual, then the only reference we have for actual performance would be a developer quoted as saying he was hitting 172 GB/s versus the 176 GB/s theoretical. The guy didn't elaborate on whether that was a peak or something he actually sustained. So, the reason I didn't mention it in my reply was because of that, and because it's not that far off the theoretical.

On the other hand, the Xbox figure is way different in actual scenarios, so it's worth pointing out. And IMO, if we have relevant performance figures to compare, it makes more sense to compare those than to compare figures we know to be useless.

Ok, I understand now what you are trying to say. However, be aware that there are situations with the ESRAM in which you can consistently hit the 200+ GB/s mark as well. All peaks are theoretical; MS is just trying to be honest by quoting their average use case. If they wanted to, they could have easily quoted "hitting" near the theoretical peak. Again, the ESRAM provides better sustainable coverage not only for the GPU but alongside the CPU as well: you can code plenty of situations in which the DDR3 bandwidth serves the 30 GB/s from GPU to CPU, or 68 GB/s directly to the CPU, without having to split off bandwidth the way the GDDR5 must when the GPU and CPU share it each cycle.

So if you are just bringing up a particular situation in general, and not arguing about bandwidth issues of the X1 as opposed to another console or context, then I digress, and I'm sorry for implying anything about your intention.

Theoretical peaks were never questioned or deep-dived into until, all of a sudden, one side had better bandwidth than the other side that touted its technology choices. The scrutiny aimed at MS should then turn back on every historical and present bandwidth claim across all tech architectures.
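As an illustration of the contention argument above, here is a toy comparison built only from bandwidth figures already quoted in this thread; the 20 GB/s of CPU demand is an invented number purely for the example, and real memory contention is far messier than a straight subtraction.

    # Toy contention example. The CPU demand figure is invented; the bandwidth
    # numbers are the ones quoted earlier in the thread.
    cpu_demand = 20   # GB/s the CPU happens to need this frame (assumed)

    # PS4: CPU and GPU share one GDDR5 pool, so CPU traffic comes out of the same 176 GB/s.
    ps4_gpu_share = 176 - cpu_demand

    # XB1: CPU traffic can be served from DDR3 (68 GB/s direct, 30 GB/s coherent with the
    # GPU), leaving the ~150 GB/s measured ESRAM bandwidth for GPU render targets.
    x1_gpu_esram = 150
    x1_ddr3_left = 68 - cpu_demand

    print("PS4 GPU share of GDDR5: %d GB/s" % ps4_gpu_share)
    print("XB1 GPU on ESRAM: %d GB/s, with %d GB/s of DDR3 still free" % (x1_gpu_esram, x1_ddr3_left))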
 
I should've said as much in my first reply, but I was Swyping at red lights on my cell phone on my commute to work. But yes, I'm not disagreeing with ya.

I was late, ha
 
I should've said as much in my first reply, but I was Swyping at red lights on my cell phone on my commute to work. But yes, I'm not disagreeing with ya.

I was late, ha

And don't take my last couple of sentences as a diatribe against you, it was just a general "putting it out there" rant.

:)
 
And don't take my last couple of sentences as a diatribe against you, it was just a general "putting it out there" rant.

:)

Honestly, it was cool of MS to put the numbers out there like that, but I don't think it's something their competition will ever do on their own. Hell, I don't think we even have 100% of the PS4's specs confirmed yet.
 
Honestly, it was cool of MS to put the numbers out there like that, but I don't think it's something their competition will ever do on their own. Hell, I don't think we even have 100% of the PS4's specs confirmed yet.
There are tons of specs for both that we do not know about. Or more accurately, overall architectural specifications aside from the processors: CPU, GPU, etc. We have high-level info. MS has shared more of their specifics, but they seem to have a lot more specifics and customizations to share as a baseline anyway.