Sorry if this has been brought up. I was doing my Bing points, and all of a sudden DX12 pops up, and cloud, as NEWS. So basically the articles say DX12 is bringing a major upgrade for the X1 and MS is figuring out the latency issue for cloud.
So is this stuff legit?
First DX12
DirectX 12 For Xbox One Could Help CPU Utilization By 50%, Arrives Holiday 2015
AUTHOR: WILLIAM USHER
So DirectX 12 was showcased during this year's Game Developers Conference. New information has indicated that the API is aiming to work like a low-level access tool for developers, possibly as Microsoft's attempt to compete with AMD's Mantle.
Gaming Bolt did a nice rundown of the new info that came out of Microsoft's DirectX presentation, where they revealed that the new API will be better optimized, bringing CPU utilization down by more than 50%, even on the Xbox One. That's great news for CPU-intensive games. It's too bad the Xbox One's problems aren't CPU-intensive.
I'll get back to the Xbox One utilization in a bit.
DirectX 12 will also be fine-tuned for mobile and multi-GPU use. This means that the Glorious PC Master Race members running two or three different graphics processing units, either in SLI or CrossFire, will be able to see massive performance gains from DirectX 12 (theoretically). Some of you might know that when running a lot of DX-reliant games either in Xfire or SLI, you sometimes get worse performance than when running on a single card. Supposedly, DirectX 12 will fix this issue and finally stabilize compatibility for those of us who like to pump out twice the amount of power with multiple GPUs.
One of the things that really bothers me, though, is that despite DX12 being designed for mobile processing units and the latest tablet-based APUs, there's no word that DX12 will be made to support Windows 7, very much like DirectX 11.2. It makes absolutely zero sense that you would have an API designed for mobile devices – many of them running Android operating systems – but not make it available for Windows 7. Seriously? The only reason I can see this being done is to force adoption of Windows 8, which will be the primary platform for DirectX 12, alongside other Metro-based operating systems, like the Xbox One and devices running SmartGlass.
Microsoft also demoed Forza 5 (a game that's already running 1080p at 60fps) on a GTX Titan Black at 60fps to show the power and compatibility of DX12, as reported by WCCF Tech, but that seems like a moot point. Unless they reversed the effects of The Forzaning, I don't think anyone cares.
NOW CLOUD.
Although Microsoft has been pushing the power of the cloud ever since the Xbox One's announcement, we have yet to see anything that will help the console push barriers. Granted, Microsoft has showcased a single tech demonstration that gives us a good taste of the cloud's potential, but players need to see something in action to have faith in this technology.
Even if Microsoft is able to implement cloud-powered games on the Xbox One, they still have to beat the biggest beast of them all: latency. Simply put, the majority of locations across the world, including some places in the USA, do not have low-latency connections. Even a round-trip latency of 100 milliseconds will create issues for seamless cloud gameplay. In short, if you are a cloud gaming developer, latency is your nightmare, and this is perhaps one of the reasons why we may never see cloud-powered games on the Xbox One. However, Microsoft is working to resolve this issue.
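To put those latency figures in perspective, here's a quick back-of-the-envelope sketch (my own illustration, not from the article) of how many rendered frames go by at 60fps before a round trip to a cloud server completes:

```python
# How many whole frames of input lag does a given round-trip time (RTT)
# add for a cloud-streamed game running at 60 fps?
# The 100 ms and 250 ms figures come from the article; 30 ms is an
# illustrative "good connection" number.

FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

def frames_of_lag(rtt_ms: float) -> int:
    """Frames (rounded) that elapse before the server's response arrives."""
    return round(rtt_ms / FRAME_MS)

for rtt in (30, 100, 250):
    print(f"{rtt} ms RTT \u2248 {frames_of_lag(rtt)} frames of lag at 60 fps")
```

At a 100 ms round trip, the player is already six frames behind the server, which is why masking up to 250 milliseconds (about fifteen frames) is such an ambitious target.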
In a research paper published by Microsoft, the development team describes its work on DeLorean.
DeLorean is an execution system that is able to cover up, or mask, latency of up to 250 milliseconds. The system's algorithm is based on predicting future inputs, state space subsampling and time shifting, and misprediction compensation. It also includes algorithms for bandwidth compression, which is crucial as it will reduce download times and improve response times.
DeLorean does not rely on buffering the games but instead mitigates latency via speculative execution. The system considers the user's historical actions and, based on that accumulated data, derives a list of actions that are highly predictable. But what about actions that cannot be predicted? For that purpose, state space subsampling and time shifting are used to predict a longer list of user actions. State space subsampling also helps narrow transmission down to only the important data, saving bandwidth in the process.
Now, there is always the possibility that a prediction was not correct, and in this case the system corrects that frame using interpolation. This process is called misprediction compensation, and it employs a checkpoint system to recover the data in case of an error. Lastly, the system uses a custom video encoding scheme to save bandwidth.
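The predict-then-correct loop described above can be sketched in miniature. This is my own toy simplification, not Microsoft's implementation: the predictor here is a simple order-1 Markov model over past inputs (an assumed stand-in for the paper's prediction machinery), the "frames" are just strings, and `render` is a hypothetical callback.

```python
from collections import Counter, defaultdict

class InputPredictor:
    """Toy order-1 Markov model: predicts the next input from the last one."""
    def __init__(self):
        self.transitions = defaultdict(Counter)
        self.last = None

    def observe(self, action):
        # Record the transition last -> action to refine future predictions.
        if self.last is not None:
            self.transitions[self.last][action] += 1
        self.last = action

    def predict(self, k=2):
        """Return up to k most likely next actions given the last input."""
        counts = self.transitions.get(self.last)
        if not counts:
            return []
        return [a for a, _ in counts.most_common(k)]

def serve_frame(predictor, actual_input, render):
    """Serve a pre-rendered speculative frame when the prediction hits;
    fall back to rendering a corrected frame on a misprediction."""
    # Speculatively render frames for the most likely inputs ahead of time.
    speculative = {a: render(a) for a in predictor.predict()}
    frame = speculative.get(actual_input)
    mispredicted = frame is None
    if mispredicted:
        frame = render(actual_input)  # misprediction compensation
    predictor.observe(actual_input)
    return frame, mispredicted
```

For example, a player who has been alternating left/right will get a speculative hit on the next "right", while an unexpected "jump" forces the correction path. The real system speculates across many frames of a 250 ms window and blends corrections via interpolation rather than re-rendering from scratch.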
The developers have already tested DeLorean with games like Doom 3 and Fable 3, and according to the research paper the results have been satisfying. We encourage you to read through the research paper, as it has additional information regarding the underlying architecture of DeLorean.