Edge Article

polishwonder

Well-Known Member
Sep 14, 2013
I thought I would share this about the so-called EDGE article, if you can call it that! This was posted on NeoGAF (sorry, Sony GAF) and I thought it was very interesting! It's funny how any time something positive comes to light about the Xbox One, these so-called insiders come out!

http://m.neogaf.com/showpost.php?p=81785777&postcount=2692

Finally got my account approved on NeoGAF. Cheers admins!

I've been watching this thread since its birth, and I've got to say, the amount of misinformation and jumping to conclusions here is immense. For a community that is supposedly so "in the know", I can't believe no one has critically evaluated this with any real technical understanding.

This EDGE article has blown up massively over the past few days, and guess what: that was its sole purpose. They got what they wanted and generated a metric f**k ton of ad revenue and page hits. The article is wrong for many reasons:
  • How can the console's drivers still be unfinished and the hardware not final when the machines have already gone into mass production?
  • DirectX has been on the platform since the start; it isn't buggy or "poor", it just works, thanks to the shared codebase. They also released their "mono" driver during E3, which is the specially optimised version of DirectX for the platform. So saying the drivers have been late is flat-out wrong.
  • They mention "without optimisation". To me, that means someone is working these numbers out without a real kit and is simply speculating for EDGE's page views in the run-up to the next-gen war. The X1 also offloads more work to dedicated processors than the PS4 does.
  • So six more CUs make the whole console 50% faster, do they? Adding CUs is not a linear performance gain: the more you add, the harder it is to keep them all fed, and the pay-off depends on many factors. I'm not saying the PS4's GPU doesn't have more CUs, because it does. But consider that the PS4's GPU will have a lot more to do outside of games than the X1's: video encoding, video decoding and, as Mark Cerny himself said, a lot of the audio work will be offloaded to the GPU, since a parallel processor isn't affected by GDDR latency the way the CPU is. Those extra CUs start to count for less and less without custom architecture to back them up, and developers get extra leg-work managing the threading and task handling on the GPU. (A rough back-of-the-envelope sketch of the scaling point follows this list.)
  • Memory reads are 50% faster? Compared to what? I can tell you for a fact that if it's the CPU doing the read, it will be a heck of a lot slower. Even if it's the GPU doing the read, if the developer doesn't switch to other work while waiting for the GDDR return, it will still be slower. It also depends how deep the OpenGL wrapper goes.
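To put rough numbers on the scaling point above, here is a toy Amdahl-style model in Python. Nothing in it comes from the EDGE article or this post: the 12-vs-18 CU counts are the widely reported specs, and the parallel-fraction values are invented purely for illustration.

```python
# Toy Amdahl-style model of why "+50% compute units (CUs)" does not
# automatically mean "+50% frame throughput". The 12-vs-18 CU counts are
# the widely reported Xbox One / PS4 figures; the parallel-fraction values
# below are invented for illustration, not measurements of either console.

def relative_throughput(base_cus: int, new_cus: int, parallel_fraction: float) -> float:
    """Speed-up from adding CUs when only part of the frame workload scales
    with CU count (the rest is bandwidth, fixed-function or serial cost)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction * base_cus / new_cus)

for p in (1.0, 0.8, 0.6):
    gain = relative_throughput(12, 18, p) - 1.0
    print(f"parallel fraction {p:.0%}: roughly {gain:.0%} faster")

# parallel fraction 100%: roughly 50% faster
# parallel fraction 80%: roughly 36% faster
# parallel fraction 60%: roughly 25% faster
```

Only when the entire frame scales perfectly with CU count does the full 50% come through, which is exactly the non-linearity being pointed at here.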
Don't get me wrong, I'm not saying the PS4 doesn't have the bigger GPU, because it does. The thing is, it needs that GPU when the CPU is hampered by GDDR latency. Audio processing (not to be confused with the PS4's audio encoder) will have to be offloaded to the GPU, and a lot of the physics will be handled there too. Those extra CUs get eaten into more and more, and when you have to think carefully about the CPU because they've put GDDR behind it, you start to see what Albert Penello is getting at.
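To illustrate the "switch tasks while waiting on GDDR" idea from the post, here is a minimal sketch with invented cycle counts (they are not real figures for either console): a single stalling stream pays the full memory latency on every read, while a GPU-style scheduler with enough wavefronts in flight overlaps one stream's compute with another's memory wait.

```python
# Toy model of latency hiding: a single in-order stream eats the full GDDR
# latency on every read, while a GPU-style scheduler with many wavefronts in
# flight overlaps one wavefront's compute with another's memory wait.
# All cycle counts are invented for illustration only.

MEMORY_LATENCY = 300   # assumed cycles per memory request
COMPUTE_CYCLES = 20    # assumed cycles of work unlocked per request
REQUESTS = 1_000       # dependent memory requests per stream

def stalling_stream() -> int:
    """Cycles for one stream that waits out every request before computing."""
    return REQUESTS * (MEMORY_LATENCY + COMPUTE_CYCLES)

def interleaved(wavefronts: int) -> float:
    """Approximate cycles per stream when many streams share one execution
    unit: total time is bounded below by either the summed compute or one
    stream's latency chain, whichever is longer."""
    total_compute = wavefronts * REQUESTS * COMPUTE_CYCLES
    total = max(total_compute, REQUESTS * (MEMORY_LATENCY + COMPUTE_CYCLES))
    return total / wavefronts

print(f"1 stalling stream : {stalling_stream():,} cycles of work")
for n in (4, 16, 64):
    print(f"{n:>2} wavefronts     : ~{interleaved(n):,.0f} cycles per stream")

# 1 stalling stream : 320,000 cycles of work
#  4 wavefronts     : ~80,000 cycles per stream
# 16 wavefronts     : ~20,000 cycles per stream
# 64 wavefronts     : ~20,000 cycles per stream
```

A CPU core has far fewer threads to switch to than a GPU, so high-latency memory hurts it much more; that is the trade-off the post keeps circling back to.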
 


:)
 
This was posted in at least one of the two EDGE article threads already floating around. This is a new forum... there aren't that many threads yet and the search function works, so please make use of it :D
 
I hadn't seen this stuff yet, polishwonder -- at least not in one centralized location. Thanks for posting.