Official Thread Pillow Fight that nobody wins with MOAR Jackie Chan and guys comfortable with STRETCHING their sexuality!

Status
Not open for further replies.
I've been consistent in saying that the hardware deltas weren't very exciting or interesting outside of the ML possibilities on the Xbox side. When Bill Stillwell left Xbox, he did say ML upscaling was working scarily well, and he had no incentive to advocate for it at that point. Now we're hearing things leak out about it, so keep an eye on this. Microsoft touted hardware-accelerated ML efficiencies when they talked about their version of RDNA2. I don't know PS5's capabilities there since Sony has laid low on specs, but this is really the only interesting area to watch going forward. It could make the Series S a very viable solution long term. The key is whether Microsoft can incorporate it well enough into their tools that 3rd parties will use it. Not sure about that right now.

ML possibilities all around....will be a story this gen.
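To put rough numbers on why ML upscaling is interesting: rendering at a lower internal resolution and reconstructing to 4K shades far fewer pixels per frame. A back-of-the-envelope sketch (the 1440p internal resolution is just an illustrative assumption, not anything Microsoft has confirmed):

```python
# Back-of-the-envelope: shading work saved by rendering at a lower
# internal resolution and ML-upscaling to the display resolution.
native_4k = 3840 * 2160      # pixels shaded per frame at native 4K
internal = 2560 * 1440       # pixels shaded when rendering at 1440p

savings = 1 - internal / native_4k
print(f"native 4K: {native_4k:,} px")
print(f"1440p:     {internal:,} px")
print(f"shading work saved: {savings:.0%}")  # roughly 56%
```

If the upscaler itself is cheap enough on dedicated ML hardware, that saved shading time is what would let a Series S punch above its raw GPU numbers.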
 
They both vary a bit at 120. But here they are with RT on. Yikes is right

View attachment 4408View attachment 4410



A much better, in-depth breakdown without the paid spin and defending.

 
I’ll post the first Tweet, but there are 6 or 7, so I’ll just post the text from the rest.
No foot stomping, please. If any of this is incorrect, just explain why.



“Forget Xbox Series X's more advanced full RDNA2 features (that PS5 does not even have), we are just talking raw rasterization performance. Everybody forgets how these systems are designed. PS5 has 16GB of RAM at 448GB/s. Series X has 10GB at 560GB/s and 6GB at 336GB/s.”

“Nearly all components of the Xbox Series X get identical performance regardless of which part of the asymmetrical memory design a game accesses, as in CPU, Audio & IO. The only thing that would not perform at full performance if it ended up using the slower side is the GPU.”

“So, given the obvious fact that devs had PS5 dev kits earlier than they had Series X dev kits, there's also the fact that the PS5's memory setup is just plain simpler and more straightforward to work with. It's the same 448GB/s across all RAM available to games.”




“Before you go on saying "oh that's impossible, it will never come to fruition" hold your horses, this very effective boost of your available resources happens all over designs such as modern CPUs/GPUs. This kind of benefit, for example, is why things such as CPU or GPU caches exist.”
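One way to sanity-check the quoted bandwidth figures: under the simplifying assumption that a game's memory traffic spreads across the two Series X pools in proportion to their capacity (real games deliberately place GPU-hot data in the fast 10GB pool, so this is only a rough blend, not a measured number), the capacity-weighted bandwidth still lands above the PS5's uniform 448GB/s:

```python
# Blended Series X memory bandwidth under a simplifying assumption:
# traffic is spread across the two pools in proportion to capacity.
# Real workloads bias toward the fast pool, so treat this as a rough
# blend, not a measured figure.
pools = [(10, 560), (6, 336)]          # (capacity GB, bandwidth GB/s)
total_gb = sum(gb for gb, _ in pools)  # 16 GB total

blended = sum(gb * bw for gb, bw in pools) / total_gb
print(f"capacity-weighted bandwidth: {blended:.0f} GB/s")  # 476 GB/s
print("PS5 uniform bandwidth:       448 GB/s")
```

The gap only flips in PS5's favour if a lot of GPU traffic ends up in the slow 336GB/s pool, which is exactly the scenario the quoted tweets are arguing about.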




 


All very nice to read. I think we've already gone over how Sony picked and chose what they wanted from the RDNA2 feature set and then customized the rest, so I won't repeat that here.

Both PS5 and XSX dev tools will improve in the coming months and years, so I'm looking forward to seeing how games turn out in the near future.


Nice read. We will see if more time with the GDK will help the Xbox catch up to PS5 in future titles.

Both PS5 and Xbox dev tools will improve in the future which means good times for either going forward.
 

Sounds like as expected, the Series X will continue to soar past the PS5.
 

stolen from gaf

On Xbox Series X:
-The lowest resolution found was 2112x1188p in that torch section, but it's now locked at 60fps
-2304x1296p instead of 2560x1440p in the hut for stable 60fps
-Runs at a lower resolution than PS5

The quality mode is native 4K on Series X and PS5. No resolution drops were noticed, 30fps capped
-Camera angle change stutters are still present
-Cut-scene camera stutters have not been fixed either on Microsoft consoles & PC

Performance Mode (60fps) on Xbox Series S:
-Resolution generally around 720p, sometimes slightly above 800p
-Frame-rate is better than pre-patch Series X but worse than PS5

  • The 60fps performance mode on XSS was made possible by lowering the resolution; it sits around 800p in game, with drops to 720p at times.
  • Even at 720p, the XSS still sees frame drops in performance mode.
  • The XSS framerate is better than the pre-patch XSX, but still worse than the PS5.
  • At first glance the PS5 version did not change much after the patch; the only noticeable change is in the first cutscenes, where the framerate drops a bit compared to the base version.
  • Solid framerate on XSX; tearing still occurs during sudden camera changes.
  • This framerate and tearing improvement comes at the expense of a lower resolution on XSX, which goes from 1440p to 1188p, unlike the PS5.
  • The quality mode is nice for those who do not need 60fps.
  • The quality mode is native 4K at 30fps on both PS5 and XSX; no framerate drops were noticed in this mode.
  • Camera issues (stutters) have not been fixed on Xbox and PC.
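The resolution behaviour described above is typical of a dynamic resolution scaler: the engine measures GPU frame time each frame and scales the internal resolution to stay inside the 60fps budget. A minimal illustrative sketch of that feedback loop (the constants and function names here are assumptions for illustration, not Ubisoft's actual implementation):

```python
# Minimal dynamic-resolution-scaling feedback loop (illustrative).
# Scales render resolution so GPU frame time tracks a 60fps budget.
TARGET_MS = 1000 / 60          # 16.67 ms budget for 60fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def next_scale(scale: float, gpu_ms: float) -> float:
    """Nudge the per-axis resolution scale toward the frame-time budget."""
    # GPU cost scales roughly with pixel count, i.e. with scale**2,
    # so adjust by the square root of the frame-time ratio.
    adjusted = scale * (TARGET_MS / gpu_ms) ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, adjusted))

scale = 1.0                     # start at native resolution
for gpu_ms in [20.0, 19.0, 17.5, 16.8, 16.2]:   # a GPU-bound stretch
    scale = next_scale(scale, gpu_ms)
print(f"settled scale: {scale:.2f} of native per axis")
```

Under sustained GPU load the scale settles well below native, which is how a "1440p mode" ends up reporting 1188p in heavy scenes.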

DF is trying to help XSX so bad it's pathetic.




Yikes XSS

 
Looks like devs are flexing the XSX/S power. RIP PS5.

View attachment 4443

Flexing as in lowering the dynamic resolution scaling of the XSX to 1188p and the XSS to 720p vs the 1440p that the PS5 is rendering at, sure 😂

PS5 is rendering at approximately 50% more pixels. Besides, according to DF, the drops are mostly in cutscenes for some reason.
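For what it's worth, the pixel math backs that up. Comparing the reported performance-mode resolutions directly:

```python
# Pixel-count comparison of the reported performance-mode resolutions.
ps5 = 2560 * 1440   # 1440p on PS5
xsx = 2112 * 1188   # lowest reported resolution on Series X

print(f"PS5: {ps5:,} px, XSX: {xsx:,} px")
print(f"PS5 renders {ps5 / xsx - 1:.0%} more pixels")  # about 47%
```

So "approximately 50% more pixels" is a fair round number for the worst-case gap, even though the XSX spends much of its time above that floor.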
 
None of that is MS. The "Xbox News" Twitter posts and those images are all from a fan-made account called Xbox News. No affiliation with Microsoft.

Are these the same people that were specifically stating that the Xbox versions would be the only ones to run 4K at 60fps, vs the wording for PS5 to run 4K and 60fps?

Pretty sure I saw a few posters here parading that around too as if it were the truth.
 
I bet the ultra realistic explosions and fire effects are just too much for either of these consoles to handle in 4K.
 


DF doing what DF are supposed to do is bad and pathetic? No.

The game's a technical mess, which is no surprise seeing as it's Ubisoft.
 
So what is the One X doing in these face-offs? I watched one for COD and felt it sat firmly in the middle of the Xbox family: better resolution than XSS but lower than XSX, with all three about equal in frames. I'm sure the One X placed last in loading times and ray tracing, but that is obviously expected.
 
Wait, so DF mattered when PS5 was doing better in ACV, but now when the XSX is doing “substantially” better in ACV than the PS5, DF doesn’t matter?

 