Nvidia VP on Next-Gen Consoles: “There’s no way a 200W Xbox is going to beat a 1,000W PC”

cyrus_atx

Well-Known Member
Sep 14, 2013
144
256
72


Nvidia is in the limelight these days. First it announced that it is working with Valve on SteamOS, while boasting about how much more powerful PCs are compared to consoles. Given that Nvidia was passed over by both console titans, who opted to go with AMD this time, it is understandable that it is siding with PC gamers and reassuring them at every turn.

During an interview with PC PowerPlay, Nvidia’s senior vice president of technology Tony Tamasi stated that PCs will leave the consoles behind, again.

By the time of the Xbox 360 and PlayStation 3, the consoles were on par with the PC…

If you look inside those boxes, they’re both powered by graphics technology by AMD or NVIDIA, because by that time all the graphics innovation was being done by PC graphics companies.




Of course he’s Nvidia’s senior VP; he’s supposed to say things like this for the benefit of the company, isn’t he? Well, he’s not far off here. The current generation of consoles ran for seven to eight years, whereas the generation before that lasted five to six. Sony and Microsoft have both stated that their upcoming next-generation consoles may run for ten years. By the time those consoles reach their prime, most PCs will have been upgraded enough to allow far more visually impressive gaming, and the consoles will look like power-starved PCs.

“The consoles have power budgets of only 200 or 300 Watts, so they can put them in the living room, using small fans for cooling, yet run quietly and cool…

And that’s always going to be less capable than a PC, where we spend 250W just on the GPU. There’s no way a 200W Xbox is going to beat a 1000W PC.”

Nvidia has been partnering with Valve on the new SteamOS by making its drivers available for Linux, something PC gamers who prefer more independence have been asking for a long, long time. Recently AMD launched its new lineup of GPUs, which directly rival Nvidia’s top-end offerings; alongside the release came an API called ‘Mantle’ that gives developers low-level, high-performance access to the GCN architecture, making it easier to bring console-style optimizations to the PC.

http://gearnuke.com/nvidia-vp-xbox-theres-way-200-watt-xbox-going-beat-1000-watt-pc/

:)
 

astrograd

Well-Known Member
Sep 11, 2013
476
241
72
And that will become irrelevant in 5 yrs or less when Sony and MS move entirely to rendering their games in their cloud infrastructures and streaming to thin clients.
 

Kassen

Well-Known Member
Sep 11, 2013
6,700
1,062
2,929
NVIDIA is really pushing hard for PC now that AMD is hogging all the consoles. And 1000W PCs beat 200W consoles. Glad to see NVIDIA's putting the "new" in "news." :rolleyes:
 

Kassen

Well-Known Member
Sep 11, 2013
6,700
1,062
2,929
I wish that guy would stop crying about not being involved with any of the new consoles.
Yup. Now that they got the short end, they have to s*** talk instead of ass kiss these consoles. It's a desperate cry for attention. NVIDIA is getting themselves nowhere with stating the obvious.
 

hrudey

Purveyor and appraiser of bovine excrement
Sep 11, 2013
7,384
4,176
2,630
But ... but... spearman defeats tank!
 

Hazard71

Gone but never forgotten
Sep 11, 2013
2,141
771
1,380
50
OMG I would have never known this if they hadn't said it!:confused:
 

Jarrod

Nudie Bar
Sep 11, 2013
630
234
710
My fresh new jug of milk is better than my 5-week-past-expiration-date milk.
 

eVo7

Well-Known Member
Sep 13, 2013
4,403
1,584
2,230
AMD can build a great APU. I believe nvidia is just feelin salty.
 

BunzHoles

Well-Known Member
Sep 11, 2013
1,351
348
779
98133
www.2guys1carp.com
And that will become irrelevant in 5 yrs or less when Sony and MS move entirely to rendering their games in their cloud infrastructures and streaming to thin clients.

Heat and power consumption are issues for servers too.. you can't feasibly dedicate 1000 watts of power to each user for multiple reasons, including just pure costs of running them.. let alone the cost of building such an architecture.

Not that they'd need 1,000 watts in 5 years to beat the current 200 watt consoles, but power consumption and heat are not irrelevant with cloud based rendering.

I have serious doubts we'll see cloud based rendering used to "upgrade" games beyond what PS4/XBO put out natively this gen.
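The cost argument above can be made concrete with a quick back-of-envelope sketch. All figures here are illustrative assumptions (per-user wattage, daily play time, and a cheap bulk electricity rate), not numbers from the thread:

```python
# Back-of-envelope: electricity cost of dedicating a 1,000 W render box
# to one cloud-gaming user. Every input is an assumed, illustrative figure.

POWER_KW = 1.0        # assumed 1,000 W drawn per active user session
HOURS_PER_DAY = 4     # assumed average daily play time per user
PRICE_PER_KWH = 0.07  # assumed bulk/hydro electricity rate, USD per kWh

annual_kwh = POWER_KW * HOURS_PER_DAY * 365
annual_cost = annual_kwh * PRICE_PER_KWH

print(f"{annual_kwh:.0f} kWh per user per year")
print(f"${annual_cost:.2f} per user per year in electricity alone")
```

Even before hardware, cooling, and bandwidth, that is a nontrivial recurring cost per subscriber, which is the point being made about cloud rendering economics.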
 

BunzHoles

Well-Known Member
Sep 11, 2013
1,351
348
779
98133
www.2guys1carp.com
And LOL at "nVidia is butthurt" comments.

They were asked the questions in an interview.. try clicking the links.. the interviewer asked specific questions, and the nVidia guy answered honestly and truthfully.. and it is the honest truth.

AMD's GPU's out on PC now crush the Xbox One and PS4 if you spend the dough/have the power supply to run them.

The nVidia guy could refuse to answer such interview questions.. but I don't see the harm in them. And he's right.. and it's something that had to be explained a LOT in the last 7 years to console-only faithful... and it was denied by tons of console-only faithful as well.

Now all of a sudden what was controversial for people like Ketto and I to "predict" only a year ago is being passed off as "Well obvious dude! You are just butthurt!"
 

Kassen

Well-Known Member
Sep 11, 2013
6,700
1,062
2,929
And LOL at "nVidia is butthurt" comments.

They were asked the questions in an interview.. try clicking the links.. the interviewer asked specific questions, and the nVidia guy answered honestly and truthfully.. and it is the honest truth.

AMD's GPU's out on PC now crush the Xbox One and PS4 if you spend the dough/have the power supply to run them.

The nVidia guy could refuse to answer such interview questions.. but I don't see the harm in them. And he's right.. and it's something that had to be explained a LOT in the last 7 years to console-only faithful... and it was denied by tons of console-only faithful as well.

Now all of a sudden what was controversial for people like Ketto and I to "predict" only a year ago is being passed off as "Well obvious dude! You are just butthurt!"

Six months ago, NVIDIA of their own accord went out of their way to make it clear that these next-gen consoles are the weakest ever conceived. NVIDIA is definitely butthurt. They even went out of their way to make console vs. PC comparison charts. Yup yup.
 

BunzHoles

Well-Known Member
Sep 11, 2013
1,351
348
779
98133
www.2guys1carp.com
Well all reports/statements indicate nVidia didn't want to be involved this generation.

What exactly are they butthurt about?

What they are saying/displaying is true... what they are saying/displaying has to do with the products they sell...

They are showing a product they were involved with on that same slide being easily surpassed by PC cards too.

If AMD's profit margins are low for the next gen consoles.. which they almost certainly are.. all they've done is release a low-profit competitor to their higher profit margin PC cards. Which is exactly the business nVidia DECIDED to no longer be involved with.
 

astrograd

Well-Known Member
Sep 11, 2013
476
241
72
Heat and power consumption are issues for servers too.. you can't feasibly dedicate 1000 watts of power to each user for multiple reasons, including just pure costs of running them.. let alone the cost of building such an architecture.

Not that they'd need 1,000 watts in 5 years to beat the current 200 watt consoles, but power consumption and heat are not irrelevant with cloud based rendering.

I have serious doubts we'll see cloud based rendering used to "upgrade" games beyond what PS4/XBO put out natively this gen.

The biggest constraints on X1/PS4 are due to power consumption limits. Those limits, relative to performance, won't be the same in 5 yrs by a long shot. Also, when consumers have to pay for the power it's an entirely different economic scenario than MS/Sony footing that bill. MS's data centers, for instance, run off of hydroelectric power that's extremely cheap to them. Noise is another factor that isn't problematic when housing these machines in data centers. I didn't say power was irrelevant, I said 'power as the limiting factor in performance' was irrelevant. There's a difference.
 

BunzHoles

Well-Known Member
Sep 11, 2013
1,351
348
779
98133
www.2guys1carp.com
astrograd said:
I didn't say power was irrelevant, I said 'power as the limiting factor in performance' was irrelevant.

You simply said "that will be irrelevant".. the topic of the thread is basically "PCs have unlimited power per user, consoles don't."

This is not made irrelevant by going to the cloud in any way, shape or form. They can't feasibly dedicate a high-end PC's worth of power per user in a cloud scenario. 5 years from now MS won't be able to afford to render via the cloud what a high-end PC can do 5 years from now.

It might change their graph, but again.. I doubt they'll even do what you and others believe they'll do this gen.

OnLive wasn't really beating the rendering of 7-year-old console hardware, and their business model completely failed.. they bled money barely keeping up with old hardware. I don't know why anyone is so sold on the idea that this is the future.. that they'll want to manage incredibly massive GPU farms for 10s or 100s of millions of people.
 

elitewolverine

New Member
Sep 18, 2013
23
4
2
You simply said "that will be irrelevant".. the topic of the thread is basically "PCs have unlimited power per user, consoles don't."

This is not made irrelevant by going to the cloud in any way, shape or form. They can't feasibly dedicate a high-end PC's worth of power per user in a cloud scenario. 5 years from now MS won't be able to afford to render via the cloud what a high-end PC can do 5 years from now.

It might change their graph, but again.. I doubt they'll even do what you and others believe they'll do this gen.

OnLive wasn't really beating the rendering of 7-year-old console hardware, and their business model completely failed.. they bled money barely keeping up with old hardware. I don't know why anyone is so sold on the idea that this is the future.. that they'll want to manage incredibly massive GPU farms for 10s or 100s of millions of people.

And why not? Steam is about to invade the living room doing just that, streaming (local of course but still lag and competing with internet traffic). They are going to want people to buy dedicated gaming machines that will be hooked up to the living room tv. They want the user to invest in the pc parts, they want the user to invest in the games, and the user to invest in the time and effort to 'learn' linux and use their custom OS.

MS, on the other hand, has streamed Halo 4 to a Windows phone that has the graphics capacity of a PS2 from 2000, putting it nearly 15 years behind modern PCs. And they are doing this with a 45 ms delay.

What is feasible is for MS to charge $200 a year, which would equal a $1,000 gaming machine every 5 yrs, and take a cut from games like Valve does on Steam. Gaming companies now only have to target one system, not hundreds (a problem Steam still faces, and why PC is not the dominant platform). The player can now play on a phone, a tablet, a PC, an X1, etc., matching the performance of that $2,000, upgraded-every-2-yrs machine. At 45-100 ms (which is console latency).

Let's say they are able to get 50 million people to fork over $200 for the service, every year. That's 10 billion, not to mention cuts from game sales (how Valve works).

You're telling me that MS, with the power of hundreds of thousands of servers, couldn't pull it off? Would it require being online? Sure... but then again, it is still cheaper than a gaming rig on a 2-yr upgrade cycle.
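The figures in this post can be sanity-checked with a quick sketch; the subscriber count, price, and comparison PC budgets are the post's own hypotheticals, not real data:

```python
# Sanity-checking the subscription arithmetic. All inputs are the
# hypothetical figures from the post above, not real market numbers.

price_per_year = 200          # hypothetical annual fee, USD
years = 5                     # hypothetical console-generation span
subscribers = 50_000_000      # hypothetical subscriber count

pc_equivalent_budget = price_per_year * years   # what 5 yrs of fees buys
annual_revenue = subscribers * price_per_year   # gross, before costs

print(pc_equivalent_budget)   # 1000 -> matches the "$1,000 gaming machine"
print(annual_revenue)         # 10000000000 -> the $10 billion per year figure
```

Note this is gross revenue; the counterargument in the thread is that per-user rendering costs scale with the same subscriber count.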

Meh, I am a pc gamer and even i find this nVidia post to be 'derp derp derp' and yes butthurt.

They are left out of the console race, so instead they buddy up with steam to get into the living room? If they were not hurt, they wouldn't be trying so hard to get back into the living room via steam.
 

BunzHoles

Well-Known Member
Sep 11, 2013
1,351
348
779
98133
www.2guys1carp.com
And why not? Steam is about to invade the living room doing just that

Completely unrelated. Not a great way to start a post, claiming something that in no way relates to the cost of running a global network of game-rendering servers is "doing just that."

We also aren't discussing the feasibility of the technology. I've used OnLive.. I have several locally streaming devices in my home (Wii U, Vita, nVidia Shield).. we are discussing the cost and the "power" (literally) behind rendering games.

MS, on the other hand, has streamed Halo 4 to a Windows phone that has the graphics capacity of a PS2 from 2000, putting it nearly 15 years behind modern PCs. And they are doing this with a 45 ms delay.

Nobody here is questioning the tech behind streaming and whether it works or not.. and your locally rendered examples have pretty much nothing to do with the conversation, which is about power consumption, and how it magically becomes a non issue because a company is doing so in "the cloud."

Let's say they are able to get 50 million people to fork over $200 for the service, every year. That's 10 billion, not to mention cuts from game sales (how Valve works).

You're telling me that MS, with the power of hundreds of thousands of servers, couldn't pull it off? Would it require being online? Sure... but then again, it is still cheaper than a gaming rig on a 2-yr upgrade cycle.

Their current cloud is not built for DirectX 3D graphics rendering; it doesn't contain a single GPU.

And what it does "per user" is a tiny tiny fraction of what it would have to do "per user" if it were rendering games better than an Xbox One can.

It would be an unprecedented feat in the history of server power to have a userbase of 50 million avid gamers out-pacing next-gen system graphics. The type of gamers who easily could have 5-10 million people playing at once at the "normal peak times."

Just think about what you are saying.

We are supposed to be impressed that MS has a cloud that can host dedicated servers for many games at once, or calculate AI for games. Game servers that were previously.. running in the background of our 360's while we were playing the game on that same hardware.

Because even an entire instance of a game server is NOTHING compared to Xbox 360 level graphical rendering.. same with AI requests or anything else mentioned with MS's current, already impressive, already huge server farm.

Now increase the power/performance need by a factor of 100's or 1,000's... and it's constant, like every millisecond constant.. and has to provide 1080p video rendering on the level of a Netflix (not to mention encoding.)

That's why I find it questionable.
 

pravus

Well-Known Member
Cornerstone Member
Sep 11, 2013
6,516
2,881
12,630
NJ
“The consoles have power budgets of only 200 or 300 Watts, so they can put them in the living room, using small fans for cooling, yet run quietly and cool…

And that’s always going to be less capable than a PC, where we spend 250W just on the GPU. There’s no way a 200W Xbox is going to beat a 1000W PC.”
:)