Comparing 360/PS3/XB1/PS4 multiplats/tech stuff

Granite SDK V2.0 Now Available, Reduces PS4, PC & Xbox One Memory Usage by 75% For Texture Streaming

Middleware company Graphine has announced that its texture streaming software Granite has reached version 2.0 as of today. It brings several benefits for real-time graphics, particularly regarding memory usage on next-gen consoles, achieved through smarter texture streaming that allows for enhanced detail without draining memory. The SDK is engine-independent, supports all image formats, and can be integrated into any engine via any art pipeline.

Graphine co-founder and CEO Aljosha Demeulemeester stated that, “With the release of the next-gen consoles (PlayStation 4 & Xbox One, ed.) last month, the door has been opened to a whole new era of video game graphics. As the full potential of the consoles has yet to be unlocked, we provide developers with tools enabling them to bring impressive new experiences to these platforms.

“The hardware support for virtual texturing, the graphics API layer and native texture atlassing we’ve added to Granite in version 2.0 empower developers to keep up with said evolution in computer graphics.”

They also revealed that the new version enables massive amounts of unique texture data while reducing memory usage by 75% and on-disk file size by 67%.

Aimed at PC, PS4 and Xbox One, Granite's benefits can already be seen in games such as Divinity: Dragon Commander. Stay tuned for more details on how this benefits next-gen development.
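For anyone wondering how streaming cuts memory that much: the engine only keeps the texture tiles the camera can currently see resident, so GPU memory cost is bounded by a fixed cache size rather than by the total texture data shipped on disk. A minimal sketch of the idea in C++ below; all names are hypothetical, and this is not Granite's actual API:

```cpp
// Minimal virtual-texture tile cache sketch (hypothetical, NOT Granite's API).
// Only tiles visible this frame stay resident, so GPU memory cost is bounded
// by the cache size rather than by total texture data on disk.
#include <cstdint>
#include <list>
#include <unordered_map>
#include <unordered_set>

using TileId = std::uint64_t;  // packed (textureIndex, mipLevel, tileX, tileY)

class TileCache {
public:
    explicit TileCache(size_t maxResidentTiles) : capacity_(maxResidentTiles) {}

    // Called once per frame with the tile IDs the renderer determined visible
    // (e.g. from a GPU feedback pass). Loads misses, evicts LRU tiles.
    void update(const std::unordered_set<TileId>& visible) {
        for (TileId id : visible) {
            auto it = lookup_.find(id);
            if (it != lookup_.end()) {
                lru_.splice(lru_.begin(), lru_, it->second);  // mark recently used
            } else {
                if (lru_.size() == capacity_) {               // evict coldest tile
                    lookup_.erase(lru_.back());
                    lru_.pop_back();
                }
                streamTileFromDisk(id);                       // async in practice
                lru_.push_front(id);
                lookup_[id] = lru_.begin();
            }
        }
    }

private:
    void streamTileFromDisk(TileId) { /* decode + upload to the tile atlas */ }

    size_t capacity_;
    std::list<TileId> lru_;  // front = most recently used
    std::unordered_map<TileId, std::list<TileId>::iterator> lookup_;
};
```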

So where's that tiled resources secret sauce at, again? Wasn't Granite one of the things shown off in the 40-minute TR presentation that neo and cegar kept linking to?
:(
 
You didn't get one? Sucks for you, some Xbox Ones come with dGPUs and 12GB of RAM coupled to a 16-core CPU.

Dayum. I made the mistake of ordering the low-calorie secret sauce for mine. :(
 
Not as far as I know, but the truth has more or less come out. Both PS4 and Xbox One have AMD GCN GPUs with the same hardware PRT support, which developers access through different software APIs (OpenGL, DirectX, etc.). We don't know whether either GPU is GCN 1.0 or GCN 1.1, but I'm betting they're both GCN 1.1 and the differences are, as Anandtech put it, "extremely minor". Both consoles have access to similar tiled texturing hardware functions.
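For the curious, here's roughly what that hardware PRT support looks like from the API side, using OpenGL's ARB_sparse_texture as the example (D3D11.2 exposes the same GCN feature as "tiled resources"). A sketch assuming a GL 4.x context and an extension loader like GLEW are already set up:

```cpp
// Partially resident texture sketch via ARB_sparse_texture (OpenGL).
// Assumes a GL 4.x context with the extension available (e.g. via GLEW).
#include <GL/glew.h>

GLuint createSparseTexture() {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // Mark the texture sparse BEFORE allocating storage: virtual address
    // space is reserved, but no physical memory is committed yet.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SPARSE_ARB, GL_TRUE);
    glTexStorage2D(GL_TEXTURE_2D, /*levels*/ 8, GL_RGBA8, 16384, 16384);

    // Ask the driver for the hardware page (tile) size for this format.
    GLint pageX = 0, pageY = 0;
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_X_ARB, 1, &pageX);
    glGetInternalformativ(GL_TEXTURE_2D, GL_RGBA8,
                          GL_VIRTUAL_PAGE_SIZE_Y_ARB, 1, &pageY);

    // Commit physical memory for a single tile of mip 0; a streaming
    // system would follow with glTexSubImage2D to fill it, and decommit
    // (GL_FALSE) tiles that fall out of view.
    glTexPageCommitmentARB(GL_TEXTURE_2D, /*level*/ 0,
                           0, 0, 0, pageX, pageY, /*depth*/ 1, GL_TRUE);
    return tex;
}
```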

Pretty much everything neo, astro, and cegar wrote about TR and ESRAM latency is straw grabbing or a complete fabrication.
 
Project CARS Uses Xbox One eSRAM For Deferred Render Targets, Careful Use Mitigates Some of PS4's Unified Memory Advantage

“Our engine uses a light pre-pass style rendering approach and after experimenting with a number of different variations we found it was more efficient to use eSRAM to hold the deferred render targets. Careful use of eSRAM like this for the various render stages mitigates some of the advantage that PS4 has with its faster unified GDDR5 memory,”
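For context, "light pre-pass" means geometry is drawn twice with a small lighting buffer accumulated in between, and it's those short-lived intermediate targets the dev is parking in eSRAM. A conceptual sketch of the budget involved, with entirely hypothetical types (the real console render APIs are under NDA):

```cpp
// Conceptual light pre-pass frame with a 32 MB fast-local budget in mind.
// All types here are hypothetical; real console render APIs are NDA'd.
#include <cstdint>
#include <cstdio>

enum class MemPool { FastLocal /* eSRAM-like, 32 MB */, Main /* DDR3 */ };

struct RenderTarget {
    std::uint32_t width, height, bytesPerPixel;
    MemPool pool;
    std::uint32_t sizeBytes() const { return width * height * bytesPerPixel; }
};

int main() {
    // Pass 1: thin G-buffer (depth + normals), touched every frame.
    RenderTarget depth      {1920, 1080, 4, MemPool::FastLocal};
    RenderTarget normals    {1920, 1080, 4, MemPool::FastLocal};
    // Pass 2: lighting accumulation, the most bandwidth-hungry target.
    RenderTarget lightAccum {1920, 1080, 8, MemPool::FastLocal};
    // Pass 3: final shaded frame can live in main memory.
    RenderTarget backBuffer {1920, 1080, 4, MemPool::Main};
    (void)backBuffer;

    double fastMiB = (depth.sizeBytes() + normals.sizeBytes()
                      + lightAccum.sizeBytes()) / (1024.0 * 1024.0);
    std::printf("transient targets: %.1f MiB of 32\n", fastMiB);  // ~31.6 MiB
    // Barely fits -- which is why format and resolution choices ("careful
    // use") matter so much for what ends up resident in eSRAM.
    return 0;
}
```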

This dev admits PS4 has faster real-world memory bandwidth than they can get out of DDR3+ESRAM, contrary to posters like flynn. And like pretty much every next-gen dev comment on memory, they make no mention of latency, which is a non-issue for graphics performance.
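For reference, the theoretical peaks everyone keeps arguing about fall straight out of bus width times transfer rate. A back-of-envelope from the published specs (real-world throughput is lower on both machines):

```cpp
// Theoretical peak bandwidth = bus width (bytes) x transfer rate.
// Published spec numbers; real-world throughput is lower on both.
#include <cstdio>

int main() {
    auto gbPerSec = [](double busBits, double transfersPerSec) {
        return busBits / 8.0 * transfersPerSec / 1e9;
    };
    double ps4 = gbPerSec(256, 5500e6);   // GDDR5-5500, 256-bit: ~176 GB/s
    double xb1 = gbPerSec(256, 2133e6);   // DDR3-2133,  256-bit: ~68 GB/s
    std::printf("PS4 GDDR5: %.0f GB/s\n", ps4);
    std::printf("XB1 DDR3:  %.0f GB/s\n", xb1);
    // XB1's ESRAM adds ~109 GB/s each direction at 853 MHz (MS quotes up to
    // ~204 GB/s combined read+write in ideal cases), but only over a 32 MB
    // window -- which is why the "68 + 204 vs 176" addition is misleading.
    return 0;
}
```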
 
I don't think Flynn's ever made mention of the eSRAM+DDR3 being faster; Astro on the other hand has repeated it ad nauseam.
 
I don't think Flynn's ever made mention of the eSRAM+DDR3 being faster; Astro on the other hand has repeated it ad nauseam.

Doesn't matter; the template just calls for an argument and someone to pin it on, regardless of whether that person ever made that argument. The wheel's just coming up Flynn right now.
 
I don't think Flynn's ever made mention of the eSRAM+DDR3 being faster; Astro on the other hand has repeated it ad nauseam.
Everyone from flynn, to neo, to cegar, to astro, old guy, and even Albert Penello himself claimed DDR3+ESRAM bandwidth was faster than GDDR5. That's a rogues' gallery of posters making the most straw-grasping, deceptive, and baseless tech claims on this forum (minus Penello). Unfortunately, phony "add the numbers" benchmarks don't equate to real-world game performance, according to the Project CARS dev and other devs working on real games.
Flynn in a message to me said:
Bandwidth to/from the ESRAM is 200+ theoretical, and PS4's does not get 160-170 regularly. That's just false. Besides, the 200+ for the ESRAM is ON TOP OF the primary system's ram.
Doesn't matter; the template just calls for an argument and someone to pin it on, regardless of whether that person ever made that argument. The wheel's just coming up Flynn right now.
And of course, a hanger-on making false accusations that I can't back up what I say with quotes, which I've just provided. Sorry, unlike you and many others here, I don't make false accusations about what other posters said. Another irony-meter explosion: a poster just leveled a false accusation at me that I make false accusations.
 
And of course, a hanger-on making false accusations that I can't back up what I say with quotes, which I've just provided. Sorry, unlike you and many others here, I don't make false accusations about what other posters said. Another irony-meter explosion: a poster just leveled a false accusation at me that I make false accusations.

http://unionvgf.com/index.php?posts/85564/
Intellivision, cody, mcmasters, omega, hrudey, hammerclaw, nervusbreakdown, the gang's all here!

You do realize you already have a "talk FUD, disinfo, and s*** about PS4" thread, right? It's called the "official xbox thread". No need to invade this sub-forum.

One false accusation, as you'll find no proof of me doing such a thing because I, you know, don't do that.

I think your other blatant troll, where you necroed an 11-day-dead thread to claim that I said it wasn't possible to have a CPU with GDDR5 memory, was deleted, so I can't link to it. But I'm sure you remember it, and I'm confident you'll find I never said it was impossible, just that it isn't being done to date. And of course, to this date, there's still no commercially available consumer motherboard that accepts GDDR5 system memory.

So to review, false accusations about what I've said - you haz them. Perhaps you shouldn't stand so close to your irony meter, since you seem to have a deleterious effect on its performance.
 
"The gang" meaning the posters like yourself that spread FUD about PS4 hardware issues and argue against reason, logic, and facts. This entire thread you adamantly argued that a FUD hit piece article based on falsehoods and unsourced tweets about low level customer support was just as credible and reliable as Sony and Amazon official PR statements about unit replacement times. You were of course 100% wrong. I remember you had some other stuff in the PS4 hardware issues thread but they seem to have been deleted.

And here's you acting hysterical, belittling, hyperbolic, illogical, and mocking whenever PS4's advantages are mentioned. Just a small sampling from quickly poking through your post history; I'm sure there's a lot more:
http://unionvgf.com/index.php?posts/62039/
http://unionvgf.com/index.php?posts/89169/
http://unionvgf.com/index.php?posts/77230/
http://unionvgf.com/index.php?posts/85912/
http://unionvgf.com/index.php?posts/86063/
http://unionvgf.com/index.php?posts/89170/
http://unionvgf.com/index.php?posts/44151/
http://unionvgf.com/index.php?posts/43093/


I did not say you claimed it was impossible. I never used the word impossible in that entire thread, you are wrong as usual. The real reason GDDR5 isn't used as main PC RAM is price, not because it's "inferior" for the CPU. Now that developers are literally saying they'd prefer GDDR5 as main PC RAM, you're backpedaling like mad. The fact that you're still clinging to "latency" nonsense puts you in a select group of tech pariahs like neo, cegar, astro, and flynn.

You definitely belong in "the gang".
 
"The gang" meaning the posters like yourself that invaded the PS4 hardware issues thread and related threads to spread FUD and argue against reason, logic, and facts.

Throughout this entire thread you adamantly argued that a FUD hit-piece article based on falsehoods and unsourced tweets about low-level customer support was just as credible and reliable as official Sony and Amazon PR statements about unit replacement times. You were of course 100% wrong.

Show me where I ever said the article was credible. My contention in that case, which should be familiar to you by now, is that I simply hate when people confuse FUD with "things they don't like." I realize you're struggling with that distinction, but we're here for you, though feel free to seek help elsewhere instead. You'll find that if you read my posts, I never once said the original article was credible, and in fact said that I wouldn't be surprised to find out it wasn't. But don't let what I actually said stop you from making up things I, you know, never said.


While you're seeking help with the distinction between FUD and things you dislike, perhaps you could also see if you can develop the capacity to laugh. Not that any of those were notoriously funny, but I suppose if you're locked into the idea that every post is another struggle in the war between glorious Sony and the MS infidels, perhaps you could eventually contort some of those posts to be ... well, not hysterical or hyperbolic, to be sure. And I think *anyone* here who's not a full-time Soldier in the Holy Console Army would recognize the difference between those posts and what you're claiming them to be. In other words, you're wrong again. Enjoy that.


I did not say you claimed it was impossible. I never used the word impossible in that entire thread, you are wrong as usual. The real reason GDDR5 isn't used as main PC RAM is price, not because it's "inferior" for the CPU. Now that developers are literally saying they'd prefer GDDR5 as main PC RAM, you're backpedaling like mad. You definitely belong in "the gang".

I think you said, "Bu-bu hrudey said it can't be done because of the latency." I might be slightly off with the words, but I think that's an accurate representation of your post; feel free to correct me if I'm wrong. But if price were the only issue, don't you think we'd have enthusiast boards with GDDR5 system RAM by now? Surely the people who'll pay for high-end PC equipment for maximum gaming benefit would pay for a GDDR5 mobo, if available, if it were truly that much better. And who knows, maybe it will be; certainly for a dedicated gaming-only system with an APU, it'll keep a graphics card fed faster. Maybe you should go let Asus or Gigabyte know how to do their jobs once you're done telling IGN how to do theirs.
 
I said you claimed GDDR5 latency would hinder CPU performance, which is exactly what you said. The frequency/latency of DDR4 RAM, scheduled for 2014, is even higher than DDR3's and getting close to low-end GDDR5's. Modern memory controllers and CPUs with out-of-order execution and large caches are highly latency-tolerant. See the G.Skill DDR4 sticks that were just announced.

And what types of PC will benefit the most from this high speed main RAM? APU systems with integrated graphics, similar to next gen consoles.
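On the latency point: CAS counted in cycles looks worse on faster memory, but each cycle is shorter, so the absolute latency in nanoseconds barely moves. Quick math below; note the GDDR5 CAS value is an assumed illustrative number, since vendors rarely publish it:

```cpp
// Absolute CAS latency in ns = CAS cycles / command clock.
#include <cstdio>

double casNs(double commandClockMHz, double casCycles) {
    return casCycles / commandClockMHz * 1000.0;
}

int main() {
    // DDR3/DDR4 command clock = data rate / 2; GDDR5 = data rate / 4.
    std::printf("DDR3-1600 CL11: %.1f ns\n", casNs(800, 11));   // ~13.8 ns
    std::printf("DDR4-2800 CL16: %.1f ns\n", casNs(1400, 16));  // ~11.4 ns
    // GDDR5 CAS below is an ASSUMED illustrative value, not a published spec:
    std::printf("GDDR5-5500 CL15 (assumed): %.1f ns\n",
                casNs(1375, 15));                               // ~10.9 ns
    return 0;
}
```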
 
consolewarz should be renamed hyperlinkwarz. I guess Knack and KZ aren't worth spending time on.
 
I think his user name alone told you what to expect from him, so I don't know why you are surprised he is so argumentative.

Calling famousmortimer a "Sony shill trying to spread FUD".

He kind of still is. I don't think being right about COD's resolution, or anything else, changes that. Seeing his blog a while back, all he does is promote PlayStation and trash Xbox.
 
What happened to all of those console idiots who said launch games would all be 1080p/60fps because there isn't anything to learn, since it's a PC architecture?

IIRC that was one poster and his whole shtick was that "XB1 has more games running @ 1080p than PS4". He went on about this for MONTHS.

Here we are, post launch and lo & behold, the PS4 has more games ACTUALLY running @ 1080p.

If that's the case (and it is), then BF4 on my PC runs at the framerate cap of 200fps @ Ultra with 4xAA.

It's funny, they talk about 60fps this and that, but when you need the framerate to be as high as possible, in the heat of battle, it's not; it's closer to 40. So really, what is the point of targeting 60fps if it can't be maintained when it's needed most?

Yup. At this point it would be better to cap it @ 30 so that when you do get into a heated battle you're not being thrown off by the drops and spikes.
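The frame-time math behind that: it's the swing between frames you feel, not the average. Quick numbers:

```cpp
// Frame-time swings are what you feel, not average fps.
#include <cstdio>

int main() {
    auto ms = [](double fps) { return 1000.0 / fps; };
    std::printf("60 fps = %.1f ms/frame\n", ms(60));             // 16.7 ms
    std::printf("40 fps = %.1f ms/frame\n", ms(40));             // 25.0 ms
    std::printf("swing  = %.1f ms\n", ms(40) - ms(60));          // 8.3 ms of judder
    std::printf("30 fps lock = %.1f ms, every frame\n", ms(30)); // steady 33.3 ms
    return 0;
}
```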
 
Substance Engine Increases PS4 & Xbox One Texture Generation Speed To 14 MB/s & 12 MB/s Respectively

Substance-Engine_Texture-Generation.jpg


This is a very interesting benchmark. While both CPUs get clobbered by a Core i7, it looks like the PS4's CPU completed the benchmark with higher throughput. We know the Xbox CPU runs at 1.75GHz; the PS4's clock was never officially confirmed, although VGleaks reported 1.6GHz. There are a few possible explanations for the result (a quick sanity check on the numbers follows the list):

1. PS4 has less CPU reserve than Xbox. Pretty sure they both reserve 2 cores, though.
2. PS4's RAM or other system bandwidth is letting the CPU work more efficiently.
3. PS4's CPU is equally or higher clocked.
4. The devs rigged the benchmark and Xbox One is really secretly more powerful. (flynn/astro/neo/misterx option)
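A quick sanity check on option 3, using only the numbers above (back-of-envelope, assuming per-clock throughput were identical):

```cpp
// If PS4's edge were clock speed alone, what clock would it need?
#include <cstdio>

int main() {
    double xb1PerGHz = 12.0 / 1.75;             // XB1: ~6.86 MB/s per GHz
    double impliedClock = 14.0 / xb1PerGHz;     // clock needed for 14 MB/s
    std::printf("implied PS4 clock: %.2f GHz\n", impliedClock);  // ~2.04 GHz
    // A 2GHz+ Jaguar clock seems unlikely, so clocks alone (option 3)
    // probably don't explain the whole gap -- options 1 or 2 (or a mix)
    // would have to contribute.
    return 0;
}
```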
 
Substance Engine Increases PS4 & Xbox One Texture Generation Speed To 14 MB/s & 12 MB/s Respectively

Substance-Engine_Texture-Generation.jpg


This is a very interesting benchmark. While both CPUs get clobbered by a Core i7, it looks like the PS4's CPU completed the benchmark with higher throughput. We know the Xbox CPU runs at 1.75GHz; the PS4's clock was never officially confirmed, although VGleaks reported 1.6GHz. There are a few possible explanations for the result (a quick sanity check on the numbers follows the list):

1. PS4 has less CPU reserve than Xbox. Pretty sure they both reserve 2 cores, though.
2. PS4's RAM or other system bandwidth is letting the CPU work more efficiently.
3. PS4's CPU is equally or higher clocked.
4. The devs rigged the benchmark and Xbox One is really secretly more powerful. (flynn/astro/neo/misterx option)

So 2 or 3 seems the most plausible?
 
I feel like consolewarz is a conspiracy theorist who thinks everyone is out to get him and the PS4. :laugh:

It's entertaining.
 
Console makes some good points, but he reminds me a lot of Puppeteer. Then again, Puppeteer was funny, and he was sort of a conspiracy theorist.
 
Console makes some good points, but he reminds me a lot of Puppeteer. Then again, Puppeteer was funny, and he was sort of a conspiracy theorist.
Yeah, I know a lot of people don't give console the time of day here, but not everything he says is made-up FUD, and I can say without a doubt he is much more knowledgeable than me about this stuff, so I don't question it. I think it's just the way he presents it that turns some people off.