DX12 Thread v. 2.0

http://1080players.com/index.php/xb...formance-increase-on-pc-even-more-on-xbox-one

DX12 adding a 20% performance increase on PC, even more on Xbox One

We had the privilege of sitting down with Dylan Browne of Incandescent Imaging to talk about their upcoming game Caffeine, a first-person, puzzle-based exploration horror game in development for PC, Xbox One and PS4. Development began in 2014 on Unreal Engine 3, but Mr. Browne has since moved to Unreal Engine 4. During our conversation, we spoke about the game's inspiration, gameplay, background, and any development problems that have arisen so far.

We asked Mr. Browne whether he has seen any changes, performance-wise, on PC and Xbox One with the addition of DX12. The answer we received was surprising and not what we expected. "Oh it should help a lot. I've been in the DirectX 12 developer program for a while. So, I've been playing around with a lot of those builds of Unreal Engine and seeing how they play out. Usually you're seeing a 20-ish% performance increase in most cases. So it definitely boosts the FPS, which would allow us to increase things, especially on Xbox, which is obviously a little more underpowered than your PS4."

Mr. Browne continued: "On Xbox, it might allow me to do things like have the soft chatters that I have on PC, which I didn't originally enable when I started to port to Xbox, mainly because they were cut due to performance, but it performs pretty well. I could actually probably enable them on the current build, but when I get a DirectX 12 build I'll be looking into more ways to up the resolution and things like that."

I clarified and asked, "When you say 20%, you mean a 20% increase in FPS?" His response: "Yes, yeah."

The 20% increase he mentioned was also only for PC, with the Xbox One seeing a higher percentage. The reason he cited was that all Xbox Ones are the same, whereas PCs need to account for different drivers and other internal hardware that may have counterproductive impacts. Caffeine is currently due out for PC in three segments, with part one being released by the end of the year. Console versions for PS4 and Xbox One will most likely arrive in 2016. The full interview will be heard on the show on Tuesday, so be sure to tune in at 7PM Eastern. We will also have the full transcription of the interview up soon. Either way, whether you are a fan of PS4 or Xbox One, it's good to see the industry in general improving. In the meantime, take a look at the newest Caffeine trailer below.
 
Great things are coming with DX12, boys. Not greatness, if ya know what I mean, but great things.
 
http://arstechnica.co.uk/gaming/201...ly-win-for-amd-and-disappointment-for-nvidia/

To say these benchmark results are unexpected would be an understatement. While it's true that AMD has been banging the DX12 drum for a while, claiming that there'd be some major performance upticks in store, its performance in Ashes is astonishing. AMD's cheaper, older, and less efficient GPU is able to almost match, and at one point beat Nvidia's top-of-the-line graphics card, with performance boosts of almost 70 percent under DX12. On the flip side, Nvidia's performance is distinctly odd, with its GPU dropping in performance under DX12, and even when more CPU cores are thrown at it. The question is, why?

Did AMD manage to pull off some sort of crazy-optimised driver coup? Perhaps, but it’s unlikely. It's well known that Nvidia has more software development resources at its disposal, and while AMD's work with Mantle and Vulkan will have helped, it's more likely that AMD has the underlying changes behind DX12 to thank. Since the 600-series of GPUs in 2012, Nvidia has been at the top of the GPU performance pile, mostly in games that use DX10 or 11. DX11 is an API that requires a lot of optimisation at the driver level, and clearly Nvidia's work in doing so has paid off over the past few years. Even now, with the Ashes benchmark, you can see just how good its DX11 driver is.

Optimising for DX12 is a trickier beast. It gives developers far more control over how its resources are used and allocated, which may have rendered much of Nvidia's work in DX11 obsolete. Or perhaps this really is the result of earlier hardware decisions, with Nvidia choosing to optimise for DX11 with a focus on serial scheduling and pre-empting, and AMD looking to the future with massively parallel processing.

Alas, without more data to draw from in the form of other DX12 games, it's hard to draw any concrete conclusions from the Ashes benchmark. Yes, AMD's performance gets a dramatic boost, and yes, Nvidia's doesn't. But with only one other major DX12 game on the way—Fable Legends—does it matter all that much right now? While DX12 usage will ramp up, DX11 isn't going anywhere for a long time. And who's to say that Nvidia won't see better performance in Unreal Engine, Unity, and others when they eventually get used in DX12 games?


(there is much more to read in the article)
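As a bit of extra context for the "far more control" point in the quote above: under DX12 the application itself creates the device, command queue, allocators and command lists, records its draws up front, and submits them in batches, instead of the driver doing all of that behind the scenes like in DX11. Here's a rough sketch using the public D3D12 API (the calls are real, but the setup is heavily trimmed and error handling is omitted), just to show where the bookkeeping lives:

#include <d3d12.h>
#include <wrl/client.h>      // link with d3d12.lib
using Microsoft::WRL::ComPtr;

bool InitD3D12Sketch()
{
    // In DX12 the app, not the driver, owns the device, queues and command lists.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    // The queue that GPU work is submitted to is created explicitly.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // Draw calls are recorded into app-owned command lists up front...
    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&allocator));
    ComPtr<ID3D12GraphicsCommandList> cmdList;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&cmdList));

    // ... state changes and draws go here ...

    // ...then submitted in one batch, which is what cuts per-draw driver overhead.
    cmdList->Close();
    ID3D12CommandList* lists[] = { cmdList.Get() };
    queue->ExecuteCommandLists(1, lists);
    return true;
}

All of that work moves from the driver into the engine, which is part of why the gains vary so much between engines and vendors.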
 
This new info is actually something to get excited about.
 
Sweet! So that's why MS went with the bottom-of-the-barrel AMD tech for the XB1: so when DX12 launches we'll have a beast under the hood :D
 
http://arstechnica.co.uk/gaming/201...ly-win-for-amd-and-disappointment-for-nvidia/


I just LMAO.

Nvidia sees no improvement with DX12; it even loses some performance. Yet AMD effectively doubles its performance.

It also should be noted that this game is a 'better with AMD' product. Plus, it is still in Alpha. Plus, MSAA wasn't used.
 
I expect Nvidia will release another driver soon that will give a similar performance boost to the AMD cards, but that doesn't take away from the fact that the 290X sees a massive performance boost using DX12 over DX11. Over 100% in places.
Performance doesn't scale linearly either, so doubling performance would usually take more than double the power.
 
I expect Nvidia will release another driver soon that will give a similar performance boost to the AMD cards, but that doesn't take away from the fact that the 290X sees a massive performance boost using DX12 over DX11. Over 100% in places.
Performance doesn't scale linearly either, so doubling performance would usually take more than double the power.

That's because AMD's DX11 drivers are that bad.
 
That's because AMD's DX11 drivers are that bad.

Nvidia's DX11 drivers are very good, as shown in the benchmarks, but under DX12 the 290X competes with and sometimes beats the 980 Ti's DX11 results. So no matter how bad AMD's DX11 drivers were, using DX12 the 290X is outperforming what a 980 Ti is capable of right here and now with Nvidia's excellent DX11 drivers.
Basically it has proven that software (DX12) can help weaker hardware become as efficient as much more powerful hardware running DX11.
As for Nvidia's poor performance in this particular demo, I'd assume it's some kind of bug and I expect it will be business as usual in future DX12 benchmarks, but no one can deny the extra performance/efficiency DX12 has given the 290X.
 
For those trying to downplay AMD's huge gain: the 290X is a two-year-old card that competed neck and neck with the 780 Ti back then. Calling it "bad drivers" only means that AMD freakin' over-engineered their card two years ago, which is even more unlikely.
 
It's nice to see this benefit PC too. And with this on Xbox, it now covers the weakness of the Xbox One CPU.
 
Just like Mantle, DX12's purpose is to bring console levels of optimization to PC. To a certain degree Xbox already benefits from the same thing. Xbox will keep improving on its own though, as it has been all along.
 
DX12 is the target platform for Xbox One.

Not sure what that means, but it is a fact that DX12 is aimed at the PC platform. MS has said so themselves on numerous occasions.

As for Xbox One, it makes sense for it to come to this platform as well, but it certainly isn't the main focus of DX12, as DX12 is simply bringing to PC what consoles have always had.
 
Not sure what that means, but it is a fact that DX12 is aimed at the PC platform. MS has said so themselves on numerous occasions.

As for Xbox One, it makes sense for it to come to this platform as well, but it certainly isn't the main focus of DX12, as DX12 is simply bringing to PC what consoles have always had.
I was saying they designed the Xbox One around DX12: not only are DX12 features already present, but the console was built around getting DX12 when it is released. The Xbox One was completely designed around DX12. DX12 is the platform for Xbox One.
 
I was saying they designed the Xbox One around DX12: not only are DX12 features already present, but the console was built around getting DX12 when it is released. The Xbox One was completely designed around DX12. DX12 is the platform for Xbox One.

I seriously doubt they actually did. The X1 would have been designed around DX11 and then updated to accept some of the features of DX12. If the Xbox One really was designed around DX12, then why won't it support DX12.1?
 
It's nice to see this benefit PC too. And with this on Xbox, it now covers the weakness of the Xbox One CPU.


Isn't the XB1's CPU already more capable than the PS4's? I mean, either way they are both weak.
 
Not sure what that means, but it is a fact that DX12 is aimed at the PC platform. MS has said so themselves on numerous occasions.

As for Xbox One, it makes sense for it to come to this platform as well, but it certainly isn't the main focus of DX12, as DX12 is simply bringing to PC what consoles have always had.
DX12 is aimed at all platforms. There are certain features that consoles never had.
 
Not sure what that means, but it is a fact that DX12 is aimed at the PC platform. MS has said so themselves on numerous occasions.

As for Xbox One, it makes sense for it to come to this platform as well, but it certainly isn't the main focus of DX12, as DX12 is simply bringing to PC what consoles have always had.

What games are currently using it? None so far is my understanding.
 
I seriously doubt they actually did. The X1 would have been designed around DX11 and then updated to accept some of the features of DX12. If the Xbox One really was designed around DX12, then why won't it support DX12.1?

I'm pretty sure 12.1 is an advanced feature set only found in high-end GPUs. 12.0 is the base feature set.
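If anyone wants to check for themselves, the 12_0 vs 12_1 split is just a feature-level query against the device. A rough sketch using the standard D3D12 call (error handling trimmed, function name is just illustrative):

#include <d3d12.h>

D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    // Ask the device which of these levels the hardware actually supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &levels, sizeof(levels))))
        return levels.MaxSupportedFeatureLevel;  // e.g. 12_0 supported, 12_1 not
    return D3D_FEATURE_LEVEL_11_0;
}

A GPU can run the DX12 API while only reporting the 12_0 (or even 11_x) hardware feature level, which is the distinction being argued about here.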
 
Metroid 'demo' with Unreal Engine 4 [4.9] DX12


Wow. Slick.

Too bad Nintendo will never make a Metroid game looking like that. Even if it's the year 2100 it'll still look like something made in 2004.
 
Any links on that? You have the tendency to lie about these kinds of things.

Yeah, bro. I lie about everything...or should I say everything you disagree with:txbrolleyes:

If you need a link to what I said then you are a dumbass. Look at what DX12 is doing: it is giving the platform the same level of access that consoles have had for donkey's years. Why would a console be the target platform for something it already benefits from?
 
I don't think menace-uk- is lying about DX12. He's just stating what we have been reading for the past 12 months regarding the benefits of DX12 for PC and Xbox One. Phil Spencer has said that parts of DX12 are already used in the Xbox One SDK.
Personally I think the Xbox One will see benefits, and I'm pretty sure menace-uk hasn't refuted that, just that the PC will see more benefits because PC hasn't had the benefit of low-level access up until now.
Where I would maybe disagree with that consensus is that although Xbox One and PS4 give devs low-level access, which could give similar benefits to DX12, the devs would still have to painstakingly code for it step by step. DX12 does all that work for the dev, so we are much more likely to see devs implement it. That is what Brad Wardell has been banging on about with regard to low-level access and Xbox One anyway.


"It’s not about getting close to the hardware

Every time I hear someone say “but X allows you to get close to the hardware” I want to shake them. None of this has to do with getting close to the hardware. It’s all about the cores. Getting “closer” to the hardware is relatively meaningless at this point. It’s almost as bad as those people who think we should be injecting assembly language into our source code. We’re way beyond that."

http://www.littletinyfrogs.com/article/460524/DirectX_11_vs_DirectX_12_oversimplified

Edit: Brad Wardell speaks on this podcast about how developers have low-level access on consoles but still have to painstakingly code for it to get the benefits. DX12 takes the time and effort out, so we'll see these kinds of benefits because, frankly, a lot of devs have neither the time nor the resources to implement it through DX11.
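To make Wardell's "it's all about the cores" point concrete: under DX12 each CPU core can record its own command list in parallel and everything gets submitted together, whereas DX11 effectively funnels submission through a single driver thread. A rough sketch (the function and parameter names are just illustrative; it assumes one command list/allocator pair per worker thread was created at startup):

#include <d3d12.h>
#include <thread>
#include <vector>

// Illustrative only: one worker per core records its slice of the frame into
// its own command list, then everything is submitted in one go.
void RecordFrameInParallel(ID3D12CommandQueue* queue,
                           std::vector<ID3D12GraphicsCommandList*>& cmdLists,
                           std::vector<ID3D12CommandAllocator*>& allocators)
{
    std::vector<std::thread> workers;
    for (size_t i = 0; i < cmdLists.size(); ++i)
    {
        workers.emplace_back([&, i]
        {
            allocators[i]->Reset();
            cmdLists[i]->Reset(allocators[i], nullptr);
            // ... record this thread's share of the draw calls ...
            cmdLists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Single submission: the GPU sees one stream of work, but every CPU core
    // helped build it, which is where the "it's all about the cores" gain comes from.
    std::vector<ID3D12CommandList*> submit(cmdLists.begin(), cmdLists.end());
    queue->ExecuteCommandLists(static_cast<UINT>(submit.size()), submit.data());
}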

 