Random Gaming News and Videos Topic

"The ability to run apps concurrently with the game with zero impact on performance required a significant amount of engineering, but the final result does work very well. What will make or break the system will be the quality of the apps themselves - certainly functions like party set-up and video editing work nicely."

I basically got that their whole system was built around being able to run Snap and Kinect functionality and still get the same game fidelity (to the naked eye) as the PS4, and to have the system running the whole time... I wonder why they wouldn't say how much power they consume.
 
Good read, I wish they'd stated how much of the GPGPU resources were going towards Kinect though. And they basically skipped over latency with regard to the GPU.
 
Maybe next gen they can have decent hair? Needs a new engine badly.
 
So basically we are getting a super fast, super efficient console that will improve as time goes on.

I can live with that.
 
The beta actually worked pretty well this time around (compared to the BF3 beta). It was more enjoyable than my last COD sessions... Players were pushing to cap/win, no sniper spam, no 40-yard C4 tosses, and best of all the game wasn't stopping every 2 minutes for host migration. I love you, Dedis!
 
You guys are focusing on part of the article that was discussed like 2 weeks ago and are completely ignoring the actual new information.
 
I can't even play at the moment. Running a 670 and i7 but it won't load properly. I can hear sound, but it just loads forever.
 
Good read. The Xbox One will be a power house. No doubt about it.

Digital Foundry is the best in the world at what they do.
 
Good read, I wish they'd stated how much of the GPGPU resources were going towards Kinect though. And they basically skipped over latency with regard to the GPU.

I see the adjective 'conservative' describing their 10% time slice for the OS, a subset of which is for Kinect's skeletal tracking. So real-world usage is a good bit less than 10%, and a subset of that is for Kinect. Also bear in mind that the way they actually do the tracking is radically different than on 360 in terms of utilizing data structures of poses. It's likely much more CPU-oriented than GPU, using those 2 CPU cores presumed to be reserved for the OS.
 
Either way, it sounds like the X1 developers put a lot of thought behind the hardware design, effectively creating a superior console experience all around. That must've been a bitch to pull off, mate. But I congratulate them on their efforts. Can't wait to get mine! WOOT!! :bang:

*APPLAUSE*
 
Moving a bunch of posts to the tickle fight thread.
 
You guys are focusing on part of the article that was discussed like 2 weeks ago and are completely ignoring the actual new information.

Specifically the eSRAM: it sounds like it was a solution they had to settle for, as opposed to what they would have liked to have accomplished with more time.

The other thing I gathered was that because they have an upscaler, hitting high definition natively wasn't important to them. And that really blows, because ensuring that the X1 could do 1080p@60fps natively would have given devs more headroom than they are now afforded.

And why do they keep referring to what they've done as an increase? The 7790 was built to run with 14 CUs at 1GHz for 1.79TF. 12 CUs at 853MHz for 1.31TF is a DECREASE, not an upclock. They could at least give the people they're talking to a little credit for being able to do basic math.
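For anyone who wants to sanity-check that math, here's a quick back-of-the-envelope in Python. It assumes the standard GCN figures of 64 shader ALUs per CU and 2 FLOPs per ALU per clock (FMA), which is how those TF numbers are usually derived:

# Theoretical peak for a GCN-style GPU: CUs * 64 ALUs * 2 FLOPs/clock * clock (GHz) / 1000 = TFLOPS
def peak_tflops(cus, clock_ghz, alus_per_cu=64, flops_per_clock=2):
    return cus * alus_per_cu * flops_per_clock * clock_ghz / 1000.0

print(peak_tflops(14, 1.000))  # 7790-style part: ~1.79 TF
print(peak_tflops(12, 0.853))  # Xbox One GPU:    ~1.31 TF
print(peak_tflops(12, 0.800))  # pre-upclock:     ~1.23 TF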

I'm a little confused by what they were trying to say about bandwidth. Perhaps I've misunderstood but it sounds almost like they're "hoping" that their emphasis on bandwidth over CUs is going to pay off. They don't sound entirely convinced of it either as they kind of leave that up in the air.
 
Specifically the eSRAM: it sounds like it was a solution they had to settle for, as opposed to what they would have liked to have accomplished with more time.

No. The eSRAM was in the design prior to the 8GB of RAM or the DDR3. You can check the Yukon leak for proof of this. It's intended specifically as a natural evolution of the 360's eDRAM. It's not, nor was it ever, considered a bandaid. The design went with eSRAM because it gives the same benefits as any other high-bandwidth setup without most of the downsides (cost, manufacturing flexibility).

The other thing I gathered was that because they have an upscaler, hitting high definition natively wasn't important to them. And that really blows, because ensuring that the X1 could do 1080p@60fps natively would have given devs more headroom than they are now afforded.

Again, no. They intelligently realized that we are deep into the HD era now, where pixel counts aren't necessarily important to the end result on screen in terms of visual fidelity to the user. On PC it matters, as you sit inches in front of your monitor. On consoles you sit 6ft+ away from an HDTV, and the angular resolution of the human eye simply isn't good enough to easily see a difference between something like 900p upscaled and 1080p (for example). See RYSE for absolute proof of this fact.
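To put rough numbers on that (the setup and figures here are my own assumptions, not anything from the article): take a 50" 16:9 set viewed from 6ft and the commonly quoted ~1 arcminute limit of normal 20/20 visual acuity, and a single pixel already sits right around that threshold at either resolution:

import math

# Assumed setup for illustration: 50" 16:9 TV viewed from 6 ft (72"), ~1 arcminute acuity
diag_in, distance_in = 50.0, 72.0
width_in = diag_in * 16 / math.hypot(16, 9)       # screen is ~43.6" wide

for horizontal_pixels in (1920, 1600):            # 1080p vs 900p render width
    pixel_pitch = width_in / horizontal_pixels
    arcmin = math.degrees(math.atan2(pixel_pitch, distance_in)) * 60
    print(horizontal_pixels, round(arcmin, 2))    # ~1.08 arcmin vs ~1.30 arcmin per pixel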

There is a massive elephant in the room in graphics rendering that is exacerbated on consoles/TVs: the rendering cost scales with the pixel count, but the visual payoff against human perception falls off non-linearly. In other words, throwing an extra 180 lines of pixels on screen won't even be noticed by users at all, yet it costs you a TON in processing to get that ridiculously diminished return that nobody even notices.
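And the cost side of that trade is easy to quantify: 1080p is roughly 44% more pixels than 900p, so a purely fill-rate- or shading-bound pass costs roughly that much more for those 180 extra lines. Quick arithmetic:

full_hd = 1920 * 1080            # 2,073,600 pixels
nine_hundred_p = 1600 * 900      # 1,440,000 pixels
print(full_hd / nine_hundred_p)  # ~1.44: about 44% more pixels to shade every frame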

What MS did was recognize this and give devs the ability to tightly manage framerates, resolutions and color depths in the display planes. They did extensive testing of this at MSR, where they looked at display planes utilized in a best-case scenario and saw that you waste 5-6 times the GPU processing by going full HD when you don't need to in most cases. Letting devs process foregrounds/backgrounds independently in the GPU, and then using a really high-end scaler for free AA and quality IQ, means devs can get almost the exact same results as 1080p + good AA without the additional GPU processing taking place at all.
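Purely as a toy illustration of the compositing idea (this is my own Python/NumPy sketch with made-up plane sizes, not how the actual display plane hardware is programmed): render the 3D scene at a lower resolution, render the HUD at native 1080p, then upscale the scene plane and blend the two at output:

import numpy as np

# Toy example: two "display planes" composited at scan-out (sizes assumed for illustration)
game_plane = np.random.rand(900, 1600, 3)      # 3D scene rendered at 900p
hud_plane = np.zeros((1080, 1920, 4))          # HUD rendered at native 1080p, with alpha
hud_plane[980:1060, 40:600, :] = 1.0           # pretend some HUD elements live here

# Nearest-neighbour upscale of the game plane to 1080p (a real scaler would filter properly)
ys = np.arange(1080) * 900 // 1080
xs = np.arange(1920) * 1600 // 1920
scaled_game = game_plane[ys][:, xs]

# Alpha-blend the native-resolution HUD over the upscaled scene
alpha = hud_plane[..., 3:4]
frame = hud_plane[..., :3] * alpha + scaled_game * (1 - alpha)
print(frame.shape)                             # (1080, 1920, 3)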

RYSE is a great example, as is DR3. In RYSE's case, the game gets free AA via the display planes by going to 900p and upscaling. At the same time performance gets a boost from this, which they offset by adding more dynamic bits to Marius' armor and more visual fx in general. So you get a better-looking game in areas that people actually notice, without any compromises to the end result as viewed on your TV with the naked eye. For DR3, they locked the framerate using the display planes via a dynamic resolution that fluctuates slightly in barely noticeable ways, even with fully destructible zombies numbering well into the hundreds all on screen at once (plus huge open levels with lots of destructibility in them too). So there you get all of that without having to pare back the destruction or zombie count (gameplay stuff) AND you get your framerate locked down at 30fps, for free relative to the GPU. More control over graphics-related optimizations in areas of highly diminished returns is a GOOD thing.
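A dynamic-resolution scheme like the one described for DR3 boils down to a small feedback loop around the GPU frame time. This is just my own simplified sketch with made-up thresholds, not anyone's actual implementation:

# Hypothetical dynamic-resolution controller: keep GPU time under the 30fps budget (~33.3 ms)
FRAME_BUDGET_MS = 1000.0 / 30
MIN_HEIGHT, MAX_HEIGHT, STEP = 720, 1080, 36

def adjust_render_height(height, gpu_time_ms):
    if gpu_time_ms > FRAME_BUDGET_MS * 0.95:    # close to blowing the budget: drop resolution
        return max(MIN_HEIGHT, height - STEP)
    if gpu_time_ms < FRAME_BUDGET_MS * 0.80:    # plenty of headroom: creep back up
        return min(MAX_HEIGHT, height + STEP)
    return height                               # in the sweet spot: leave it alone

# Fake GPU timings just to show the behaviour
height = 1080
for gpu_time_ms in (30.0, 34.0, 35.0, 31.0, 25.0):
    height = adjust_render_height(height, gpu_time_ms)
    print(gpu_time_ms, height)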

And why do they keep referring to what they've done as an increase? The 7790 was built to run with 14 CUs at 1GHz for 1.79TF. 12 CUs at 853MHz for 1.31TF is a DECREASE, not an upclock. They could at least give the people they're talking to a little credit for being able to do basic math.

They were using an 800MHz GPU. You have to earn credit, and trying to compare their setup to discrete chips isn't a good way to start. ;)

I'm a little confused by what they were trying to say about bandwidth. Perhaps I've misunderstood but it sounds almost like they're "hoping" that their emphasis on bandwidth over CUs is going to pay off. They don't sound entirely convinced of it either as they kind of leave that up in the air.

For GPGPU stuff, both companies have different strategies. Sony has little experience with it compared to MS. Sony wants lots of CUs to help with GPGPU loads (physics, for instance). In my view, this might be problematic, as having lots of CUs is worthless if the loads are stalled by pending fetch requests to memory. GDDR5 has lots of latency, and for physics calculations you usually need low latency to avoid stalling your GPU/CUs (hence physics is usually done on the CPU). That's MS's thinking too, I think. In their setup you can use your CUs to access the eSRAM, which has very low latency in addition to high bandwidth. So you can bring in lots of data at once to run in parallel, and that data can change violently during the frame without causing stalls. That's just my interpretation of their position though. Might be wrong.
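The latency-bound vs. bandwidth-bound distinction is easy to show in miniature. This little Python timing sketch is mine, and obviously nothing like real GPU memory behaviour, but it shows the shape of the problem: dependent, random lookups are limited by how long each access takes, while a straight sequential pass is limited by throughput:

import random, time

N = 2_000_000
data = list(range(N))

# Throughput-bound: independent, sequential accesses
t0 = time.perf_counter()
total = 0
for i in range(N):
    total += data[i]
seq_s = time.perf_counter() - t0

# Latency-bound: dependent pointer-chasing through a shuffled permutation,
# where each lookup must finish before the next index is even known
perm = data[:]
random.shuffle(perm)
idx = 0
t0 = time.perf_counter()
for _ in range(N):
    idx = perm[idx]
chase_s = time.perf_counter() - t0

print(f"sequential: {seq_s:.3f}s  pointer-chase: {chase_s:.3f}s")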

Both companies are guessing with their strategies here. I'd wager MS's experience there (DirectCompute, Kinect) puts them at an advantage in terms of being right over Cerny's expectations, though.
 
The reality is, unless you're gaming on a computer monitor or a display sitting less than 3ft away, the visual difference will be minimal to non-existent, as the system will probably scale the resolution anyway unless it's native 1080p. I'm also expecting that as the XB-1 tech matures we will see more native 1080p games hitting the market. Personally, I'm more excited about the XB-1 SHAPE chip and the sound it's gonna produce. The visual difference is great, but the jump from DD/DTS 5.1 to Dolby TrueHD/DTS-HD MA is huge. If you don't believe me, watch a DVD of, let's say, "Looper" in DD and listen to the audio. Then watch it again on Blu-ray and listen to the DTS-HD MA audio mix.

The difference is night and day.

The difference is pretty big for those used to the 360 - and who have a decent sound system.

The Xbox 360 was always at 48kHz for all sound, so it did lose a lot of quality on audio mastered at higher quality. :)
Even when ripping or playing back normal CD tracks, which use 44.1kHz audio, it lost a lot of sound quality - since 44.1kHz doesn't divide evenly into 48kHz.
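For what it's worth, the awkward part is just that 44.1kHz and 48kHz only share a ratio of 160/147, so converting between them means proper interpolation/filtering rather than simply repeating or dropping every Nth sample. Quick check of the ratio (nothing console-specific here):

from math import gcd

src, dst = 44100, 48000
g = gcd(src, dst)               # 300
print(dst // g, "/", src // g)  # 160 / 147: interpolate by 160, decimate by 147
print(dst / src)                # ~1.0884 - not an integer ratio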

I doubt we'll see a big upgrade on the sound from the PS3 though, since that already had uncompressed sound, a rather wide range of bitrates and formats for both ripping and playback, and support for basically all formats.
But I expect both the Xbox One and PS4 to be at PS3 levels or better, so a lot of Xbox'ers will see a sound revolution, provided they have a decent sound system. :)
 
I hate how the shoulders look on the bigger models; they just look off, and the animations suffer for it. The models I've seen so far for Bundy and Yoko look awful, they really do need to start over. I'm also kind of pissed that the Usos aren't in the main game this time even though they were DLC for the last one; the models were already made, as well as the move sets, so it just seems stupid to exclude them this time around.
 
I read this on GAF and they made it seem like this was horrendous news and all lies. Seriously got to stop going there for the moment. Glad to read some positive stuff about it here.
 
I am really looking forward to the single-player DLC. I'll probably cave on the season pass even though I haven't played the multiplayer, since it's probably going to cost $19.99 alone. The map packs, I think, will be a total of $15 based on the numbers given in the article.
 
No. The eSRAM was in the design prior to the 8GB of RAM or the DDR3. You can check the Yukon leak for proof of this. It's intended specifically as a natural evolution of the 360's eDRAM. It's not, nor was it ever, considered a bandaid. The design went with eSRAM because it gives the same benefits as any other high-bandwidth setup without most of the downsides (cost, manufacturing flexibility).

Natural evolution? THIS is what they actually said, astro:

Digital Foundry: So you didn't want to go for a daughter die as you did with Xbox 360?

Nick Baker: No, we wanted a single processor, like I said. If there'd been a different time frame or technology options we could maybe have had a different technology there but for the product in the timeframe, ESRAM was the best choice.

What MS did was recognize this and give devs the ability to tightly manage framerates, resolutions and color depths in the display planes. They did extensive testing of this at MSR, where they looked at display planes utilized in a best-case scenario and saw that you waste 5-6 times the GPU processing by going full HD when you don't need to in most cases. Letting devs process foregrounds/backgrounds independently in the GPU, and then using a really high-end scaler for free AA and quality IQ, means devs can get almost the exact same results as 1080p + good AA without the additional GPU processing taking place at all.

RYSE is a great example, as is DR3. In RYSE's case, the game gets free AA via the display planes by going to 900p and upscaling. At the same time performance gets a boost from this, which they offset by adding more dynamic bits to Marius' armor and more visual fx in general. So you get a better-looking game in areas that people actually notice, without any compromises to the end result as viewed on your TV with the naked eye. For DR3, they locked the framerate using the display planes via a dynamic resolution that fluctuates slightly in barely noticeable ways, even with fully destructible zombies numbering well into the hundreds all on screen at once (plus huge open levels with lots of destructibility in them too). So there you get all of that without having to pare back the destruction or zombie count (gameplay stuff) AND you get your framerate locked down at 30fps, for free relative to the GPU. More control over graphics-related optimizations in areas of highly diminished returns is a GOOD thing.

It will be interesting to eventually read and hear the unvarnished sentiments of developers on this issue, because so far they have been very "diplomatic" when discussing the Xbox One's performance.
 
Natural evolution? THIS is what they actually said, astro:

Digital Foundry: So you didn't want to go for a daughter die as you did with Xbox 360?

Nick Baker: No, we wanted a single processor, like I said. If there'd been a different time frame or technology options we could maybe have had a different technology there but for the product in the timeframe, ESRAM was the best choice.

What I said was that they weren't compromising anything in terms of the eSRAM. It was in the design from day 1, before the 8GB of RAM and before DDR3. Here, this is from before what they are talking about there. It's from mid-2010 and seems to be the first design they were exploring:

Slide9.jpg


On the next slide, we see they were planning to consider eSRAM instead of eDRAM:

Slide10.jpg


Also, here is what they said in the DF article:

Digital Foundry: Perhaps the most misunderstood area of the processor is the ESRAM and what it means for game developers. Its inclusion sort of suggests that you ruled out GDDR5 pretty early on in favour of ESRAM in combination with DDR3. Is that a fair assumption?

Nick Baker: Yeah, I think that's right. In terms of getting the best possible combination of performance, memory size, power, the GDDR5 takes you into a little bit of an uncomfortable place. Having ESRAM costs very little power and has the opportunity to give you very high bandwidth. You can reduce the bandwidth on external memory - that saves a lot of power consumption as well and the commodity memory is cheaper as well so you can afford more. That's really a driving force behind that. You're right, if you want a high memory capacity, relatively low power and a lot of bandwidth there are not too many ways of solving that.

...
Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM.

Of course with Xbox One we're going with a design where ESRAM has the same natural extension that we had with eDRAM on Xbox 360, to have both going concurrently. It's a nice evolution of the Xbox 360 in that we could clean up a lot of the limitations that we had with the eDRAM. The Xbox 360 was the easiest console platform to develop for, it wasn't that hard for our developers to adapt to eDRAM, but there were a number of places where we said, "Gosh, it would sure be nice if an entire render target didn't have to live in eDRAM," and so we fixed that on Xbox One where we have the ability to overflow from ESRAM into DDR3 so the ESRAM is fully integrated into our page tables and so you can kind of mix and match the ESRAM and the DDR memory as you go.

Sometimes you want to get the GPU texture out of memory and on Xbox 360 that required what's called a "resolve pass" where you had to do a copy into DDR to get the texture out - that was another limitation we removed in ESRAM, as you can now texture out of ESRAM if you want to. From my perspective it's very much an evolution and improvement - a big improvement - over the design we had with the Xbox 360. I'm kind of surprised by all this, quite frankly.

So I'm not sure what you are trying to refute in what I said. The eSRAM isn't there as a bandaid. It's been in the design from jump street, prior to 8GB and prior to DDR3. It was there as a natural evolution of the eDRAM on 360, just as I said and as I've argued for months now.

It will be interesting to eventually read and hear the unvarnished sentiments of developers on this issue, because so far they have been very "diplomatic" when discussing the Xbox One's performance.

You will automatically label *anything* that doesn't tell you what you *want* to hear as a diplomatic answer. Carmack has never been known for diplomacy on this matter. We both know where he stands.
 
What I said was that they weren't compromising anything in terms of the eSRAM. It was in the design from day 1, before the 8GB of RAM and before DDR3. Here, this is from before what they are talking about there. It's from mid-2010 and seems to be the first design they were exploring:

Slide9.jpg


On the next slide, we see they were planning to consider eSRAM instead of eDRAM:

Slide10.jpg


Also, here is what they said in the DF article:



So I'm not sure what you are trying to refute in what I said. The eSRAM isn't there as a bandaid. It's been in the design from jump street, prior to 8GB and prior to DDR3. It was there as a natural evolution of the eDRAM on 360, just as I said and as I've argued for months now.



You will automatically label *anything* that doesn't tell you what you *want* to hear as a diplomatic answer. Carmack has never been known for diplomacy on this matter. We both know where he stands.

What?! You know, astro, sometimes you act like YOU built this console.

I was merely repeating what THEY SAID, and they DID say that they would have preferred one processor but didn't have the timeline, so the eSRAM BECAME the best choice. I didn't even criticize the choice of eSRAM, and you flew into charts-and-graphs mode like Pure. LOL!

And if you can actually divorce yourself from it and be honest for a moment, you should be able to agree with me that developers have indeed been very diplomatic when describing the X1. All I was saying is that it will be interesting to read and hear the sentiments of developers and to see if their views match up with Goossen and Baker's.

What was so wrong about that?

chillpill.jpg
 
Astro, dude, you seriously need to start writing for gaming websites or something

That wouldn't work; most websites want unbiased journalists with integrity.
As long as Microsoft is the one doing it, everything is rosy and dandy in Astro's book.
Everything has its pluses and minuses, but if Astro were to write an article about something regarding Microsoft, it would only be pluses. :-/
 
I can barely tell the difference between 1080p and 900p upscaled to 1080p. If that is the only difference, then Xbox One gamers have nothing to worry about.
 