Major Nelson speaks about Native 1080p 60fps...

Status
Not open for further replies.
s***. You mates won't care once you get either BF4 or CoD at home. This res pixel-counting crap is ridiculous. Can't wait until these consoles release. There's no telling what r******d nuance will be up for debate next week. :rolleyes:

I did the pixel thing, amongst others, with the 360 and PS3, and I do it now; I never changed my method. No one had an issue with me preferring the 360 to the PS3, and no one can or will now.

BF4 on PC and COD on PS4 for now (soon). I'm getting FIFA 14 (free) and Dead Rising 3 on X1. I'm content.
 
720p may not be a big deal today, as much of the media and many of the games we play today are 720p or below, and our eyes are tuned to this resolution. At one time I never thought standard-definition TV needed to improve, or that we needed a higher-def DVD format. How things change.

In a few years, when most media are 1080p (similar to the current gen, when HD became pretty much a standard only a few years after the Xbox 360 launched), that's when 720p will start to look 'low' res. Yes, the jump from 720p to 1080p is smaller than standard def to HD, but you will certainly notice it once your visual memory is tuned to 1080p (or higher) resolution.

If we calm ourselves down, I think most of us can accept that some launch-window or early titles will be 720p, but 720p cannot become the standard on X1 in later games, as the resolution will show its age.
 
Yes, but they will update their hardware... they did it with the current gen.
Both consoles will continue to evolve over their life cycles.
 
Maybe these launch games are not taking advantage of the CPU/GPU overclock. They may be direct PC ports to PS4 and XB1, with the latter not touching the ESRAM. I think we have to wait till wave 2 of games before judging what the XB1 is capable of. Ryse and Quantum Break already have the best visuals I have ever seen on any console.
 
Maybe these launch games are not taking advantage of the CPU/GPU overclock. They may be direct PC ports to PS4 and XB1, with the latter not touching the ESRAM. I think we have to wait till wave 2 of games before judging what the XB1 is capable of. Ryse and Quantum Break already have the best visuals I have ever seen on any console.
I hope so too. If so, MS must make a public statement so people are informed.
 
Jesus Christ!:confused: What the hell are you talking about? So now the machines aren't meant to last a decade? Microsoft and Sony are lying to us now. Pfft. This stuff gets worse by the day...

Sorry Mcmasters, wasn't trying to frustrate you. I don't know if they're lying to us, but anyone can recognize they won't last 8-10 years this time round.
 
Definitely not lasting 10 years. This cycle (360) was already WAY too long. I got a 360 when I was 23 years old. Now I'm 31.

That said, I find the "X1 has weaker specs and it is showing" comments humorous, considering the two best-looking launch titles on either system are on the X1. But what I find even more humorous is the inability of the gaming "community" to simply enjoy... anything. Anything at all. Instead of celebrating the launch of a new console generation, it's endless pissing contests about irrelevant minutiae.

Two games launching across every system in existence ran into snags on the more complicated architecture with late development tools. SHOCK. Yet somehow, arguably, one of them still ended up looking better on the "weak" machine.
 
Maybe these launch games are not taking advantage of the CPU/GPU overclock. They may be direct PC ports to PS4 and XB1, with the latter not touching the ESRAM. I think we have to wait till wave 2 of games before judging what the XB1 is capable of. Ryse and Quantum Break already have the best visuals I have ever seen on any console.
QFT... It is a big deal right now to kids that don't have patience. God forbid an Xbox console launches that isn't firing on all cylinders day one. If we're seeing the same thing a year from now, then MS should be worried.

But what I find even more humorous is the inability of the gaming "community" to simply enjoy... anything. Anything at all. Instead of celebrating the launch of a new console generation, it's endless pissing contests about irrelevant minutiae.
Two games launching across every system in existence ran into snags on the more complicated architecture with late development tools. SHOCK. Yet somehow, arguably, one of them still ended up looking better on the "weak" machine.
Bang on!!
 
And before someone jumps all over that with the "PS4 POWERZ" comments, I fully understand that in raw performance numbers the PS4 has the upper hand. I'm excited about the PS4 too, like... really excited. I can't wait to buy one and see what Naughty Dog does with it.
 
Anything from his blog is entirely rubbish. This is hard to say, but I would believe Astrograd before I believed him.

His posts are really just made-up crap with him agreeing with himself; if you look at some of his earlier posts you'll see incoherent ranting with regard to 3D-stacked dGPUs, which is obviously rubbish.
Looks like the Day One update will fix some things, try this one:
https://mobile.twitter.com/repi/status/395138203075633152
 
The devs could make their games 1080p/60fps, but the graphics would be dumbed down and it would not be the game the devs wanted... not matching closely to the other platforms.

You have to blame MS because they have more hands in this than the devs. The devs made the game, but MS selected or developed all the hardware, developed the drivers with AMD and developed the other tools. The devs took what MS gave them and what Sony and the PC platform gave them, and could not make the console games on par in the equivalent time period. Something is wrong: the consoles have similar hardware now and the difference is quantifiable. Sony screwed up with the PS2, then the PS3 to an extent; now it seems MS is taking a crack at it.
Wrong, it's up to the developers and their skills whether they can do it. The hardware is there and it can do those things. Look at some of Sony's games.
 
Wish MS would just get it over with and drop Major Nelson like a bad habit. He's annoying all around, and I only get to read the tweets he posts here and there whenever they are posted by someone.

Can they just replace him with someone... better?
 
One word... "Titanfall!!!" Maybe it's because I'm from a different generation, but when I saw the comparisons between the Xbox One and the PS4, I was like, damn, they look good. I guess I just don't get what all the fuss is about. (shrug)
 
Dude, you need to chill on the apologist tip. If I take a CD track and rip it to MP3 at 320k, then downsample it to 160k, then upsample it to 320k again, I've lost bits of data. If the original image was at 1080p, then compressed down to 900p, then upscaled to 1080p again, you don't get the compressed bits back. They're lost. This is a silly argument you're making. They're gone. Everything looks fine at 900p, but you don't magically regain the data that was lost as part of reducing resolution or compressing. It's not like a freaking Slinky that just bounces back into shape. That's not how computer data works. If you squeeze down to 900p from 1080p, data is lost and it will never be regained.

A good upscaler can make things look nice, but it can't recover lost data. I honestly have no idea how you can argue what you're arguing.
Because that wasn't what I was arguing. I pointed out that the dev said the framebuffer was native 1080p, and kb was saying the dev was wrong, posting the article from DF with their assumption.

I know data is lost, I never said it wasn't, nor was I saying it was rendered in 1080p first. But a 1080p image compressed to 900p and then upscaled again will look better than an image rendered natively at 900p and upscaled to 1080p; data is lost, but quality isn't deleted during compression. I was never saying that a native 1080p image compressed to 900p and upscaled would look like a native 1080p image.
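
To make the round trip being argued about concrete, here is a minimal sketch of the downscale-then-upscale path (assuming the Pillow library and a hypothetical local frame called frame_1080p.png; neither comes from the thread):

```python
# Minimal sketch (my own example, not from the thread): downscale a 1080p frame
# to 900p, upscale it back to 1080p, and measure what the round trip destroyed.
# Assumes Pillow is installed and a local file "frame_1080p.png" exists.
from PIL import Image, ImageChops

original = Image.open("frame_1080p.png")                   # 1920x1080 source frame
downscaled = original.resize((1600, 900), Image.LANCZOS)   # the "900p" step
upscaled = downscaled.resize((1920, 1080), Image.LANCZOS)  # back up to "1080p"

# Wherever this difference is non-zero, detail was thrown away by the
# downscale and could not be reconstructed by the upscale.
diff = ImageChops.difference(original, upscaled)
print("per-band (min, max) error:", diff.getextrema())
```

The sketch only illustrates the point already made above: the upscaler interpolates from what is left, it does not recover what was discarded.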
 
Wish MS would just get it over with and drop Major Nelson like a bad habit. He's annoying all around, and I only get to read the tweets he posts here and there whenever they are posted by someone.

Can they just replace him with someone... better?
Considering how many people listen to his podcasts, follow him on Twitter and line up to see him at events, I'd say you're wrong.
 
Nothing wrong with Nelson. He has a hard job right now, first with the DRM and now with the performance gaps in third-party titles. If I were him, and could get away with it, I'd kick the family jewels of the guy who made the decision not to upgrade the specs when the information on the hardware was known and people were praying hard for them to bite the bullet and match, or at least close, the gap in specs.
 
Definitely not lasting 10 years. This cycle (360) was already WAY too long. I got a 360 when I was 23 years old. Now I'm 31.

That said, I find the "X1 has weaker specs and it is showing" comments humorous, considering the two best-looking launch titles on either system are on the X1. But what I find even more humorous is the inability of the gaming "community" to simply enjoy... anything. Anything at all. Instead of celebrating the launch of a new console generation, it's endless pissing contests about irrelevant minutiae.

Two games launching across every system in existence ran into snags on the more complicated architecture with late development tools. SHOCK. Yet somehow, arguably, one of them still ended up looking better on the "weak" machine.


Very well said... gaming HAS ceased to be fun for people. Often I find that most of these self-proclaimed gamers will get a title based solely on looks and still shots (mostly bullshots), and once the game is in their hands, they get no enjoyment out of it. It sure looks good, tho!

One of the best games I played on Xbox 360 was so overlooked that some don't even know it by name, but I had a blast. Not the prettiest, not the tightest controls, just fun. Like Lollipop Chainsaw on PS3. Games do not have to be technical masterpieces; they have to be fun.
 
I thought it was pretty common to use odd output resolutions that aren't 16:9. They generate, say, a 10:9 image that looks squished, and then when it's scaled it naturally "looks like" a 16:9 image should. So a square object would be rendered as a "tall" rectangle; when scaled, it looks like a square. The rendering engine puts out fewer pixels, but you get decreased accuracy in how things look on the X-axis. Something that "should be" 11 pixels wide ends up 12 pixels wide, etc.
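
A rough sketch of the horizontal-accuracy point in that post (the 1200x1080 internal resolution and the simple rounding model are my own assumptions for illustration, not anything confirmed about a specific game):

```python
# Anamorphic-style scaling sketch: render into a narrower (10:9) framebuffer,
# then stretch horizontally to 16:9 output. Horizontal positions get quantised
# to whole internal pixels, so widths drift slightly on the X-axis.
INTERNAL_W, OUTPUT_W = 1200, 1920      # 1200x1080 stretched to 1920x1080
scale = OUTPUT_W / INTERNAL_W          # horizontal stretch factor (1.6)

def drawn_width(intended_px: int) -> int:
    internal = round(intended_px / scale)   # snapped to the internal pixel grid
    return round(internal * scale)          # width after the horizontal stretch

for w in (10, 11, 12, 13):
    print(f"intended {w}px wide -> drawn roughly {drawn_width(w)}px wide")
```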
 
Because that wasn't what I was arguing. I pointed out that the dev said the framebuffer was native 1080p, and kb was saying the dev was wrong, posting the article from DF with their assumption.

I know data is lost, I never said it wasn't, nor was I saying it was rendered in 1080p first. But a 1080p image compressed to 900p and then upscaled again will look better than an image rendered natively at 900p and upscaled to 1080p; data is lost, but quality isn't deleted during compression. I was never saying that a native 1080p image compressed to 900p and upscaled would look like a native 1080p image.
No, a 1080P image compressed to 900P will look exactly the same as a 900P image when upscaled. Because they're both 900P images being upscaled at that point. Once you've lost resolution on the compression, there is no "quality" that is there on one image versus the other. At that point, they're both 900P images, and they are treated exactly the same by a 1080P upscale. "Quality" isn't some magical thing you get when the source for your 900P image is a 1080P image. At the end of the day, you still have a 900P image. If anything, a native 900P image will probably look better than a 1080P image compressed to 900P, because it won't have any compression artifacts.
 
No, a 1080P image compressed to 900P will look exactly the same as a 900P image when upscaled. Because they're both 900P images being upscaled at that point. Once you've lost resolution on the compression, there is no "quality" that is there on one image versus the other. At that point, they're both 900P images, and they are treated exactly the same by a 1080P upscale. "Quality" isn't some magical thing you get when the source for your 900P image is a 1080P image. At the end of the day, you still have a 900P image. If anything, a native 900P image will probably look better than a 1080P image compressed to 900P, because it won't have any compression artifacts.

Ok... so if you take an 8-megapixel photo and resize it down to the same size as a 2-megapixel photo, will they look the same? No, there will be more quality in the resized one; compression doesn't delete quality.
 
Ok... so if you take an 8-megapixel photo and resize it down to the same size as a 2-megapixel photo, will they look the same? No, there will be more quality in the resized one; compression doesn't delete quality.
What is quality? It isn't some magical thing that exists. Define what you mean by that. And from a technical standpoint, no, the 8MP photo and the 2MP photo won't look the same, but the native 2MP photo will be more accurate. When you compress the 8MP down to 2MP, decisions on how to fit that data into 2MP (i.e., what data gets lost) are made by the compression algorithm without any notion of quality, etc. That's why there are always compression artifacts, and that's the difference between compression algorithms and settings. For example, JPEG compression has certain settings you can make related to "quality", but those settings essentially say, "how much data are you willing to lose?" If you're going down to exactly 2MP, you have already determined the "quality". Photoshop, for example, allows you to set high, medium, and low quality on JPEG compression. That quality determines nothing more than how aggressive the algorithm is in removing data. Low-quality compression will remove more data than high quality, and the file sizes will be different. You don't have high-quality compression and low-quality compression both compressing to the same size, because the definition of quality in image compression means how much data will be removed in the compression.

So, by saying you're going from 8MP to 2MP, in image compression you've essentially stated the quality you want. There is no other meaning of "quality" in this sense outside of that: how much data you're comfortable having removed. Same with 1080p down to 900p. Sure, a 1080p image compressed to 900p won't look exactly the same as a native 900p image. But it won't be better looking, just different, because instead of rendering out directly to 900p, you've rendered to 1080p, then compressed down to 900p, and the compression will remove data according to a compression algorithm, which may actually end up looking worse.
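
For anyone curious, the "quality is just how much data you let the encoder throw away" point maps directly onto the JPEG quality knob. A small sketch (Pillow and the filenames are my own assumptions, not from the thread):

```python
# Same source image saved at different JPEG quality settings: the setting only
# controls how aggressively data is discarded, which shows up as file size.
# Assumes Pillow is installed and "photo_8mp.png" exists locally.
import os
from PIL import Image

src = Image.open("photo_8mp.png").convert("RGB")
for quality in (95, 75, 40):
    out = f"photo_q{quality}.jpg"
    src.save(out, "JPEG", quality=quality)
    print(f"quality={quality}: {os.path.getsize(out) / 1024:.0f} KiB")
```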
 
What is quality? It isn't some magical thing that exists. Define what you mean by that. And from a technical standpoint, no, the 8MP photo and the 2MP photo won't look the same, but the native 2MP photo will be more accurate. When you compress the 8MP down to 2MP, decisions on how to fit that data into 2MP (i.e., what data gets lost) are made by the compression algorithm without any notion of quality, etc. That's why there are always compression artifacts, and that's the difference between compression algorithms and settings. For example, JPEG compression has certain settings you can make related to "quality", but those settings essentially say, "how much data are you willing to lose?" If you're going down to exactly 2MP, you have already determined the "quality". Photoshop, for example, allows you to set high, medium, and low quality on JPEG compression. That quality determines nothing more than how aggressive the algorithm is in removing data. Low-quality compression will remove more data than high quality, and the file sizes will be different. You don't have high-quality compression and low-quality compression both compressing to the same size, because the definition of quality in image compression means how much data will be removed in the compression.

So, by saying you're going from 8MP to 2MP, in image compression you've essentially stated the quality you want. There is no other meaning of "quality" in this sense outside of that: how much data you're comfortable having removed. Same with 1080p down to 900p. Sure, a 1080p image compressed to 900p won't look exactly the same as a native 900p image. But it won't be better looking, just different, because instead of rendering out directly to 900p, you've rendered to 1080p, then compressed down to 900p, and the compression will remove data according to a compression algorithm, which may actually end up looking worse.
By quality I guess I mean level of detail. The level of detail in an 8MP photo is higher than that of a 2MP photo.
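
As a side note on the resolutions actually being argued about in this thread, the raw pixel counts work out as follows (standard figures, nothing game-specific):

```python
# Pixel-count comparison for the resolutions discussed in the thread.
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.0%} of 1080p)")
```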
 
720p may not be a biog deal today, as many media, games we played today ar 720p or below, & our mnd are tuned for this resolution. last time I never thought standard defination for TV need to improve, or we need a more Hight def DVD format. How things change.

In a few years time, when most media become 1080p (similar to current gane when HD become pretty much a standard only a few years after xbox 360 launched), its when 720 started to look 'low' res. Yes, the jump from 720 to 1080p is less than standard to HD, but you will certainly notice when you visual memory are tuned for 1080p (or higher) resolution.

If we calm ourselves down, I think most of us can digest that some launch windows or early titles will be 720p, but 720p cannot become the standard on x1 in later games, as resolution will show its age.

Your post broke my brain. What is your native language?
 
Microsoft was talking about cross-platform play between X1 and PC. Pft, yeah right! X1 players would get smoked; they'd be playing at 720p while PC players play at 1080p or higher with keyboard and mouse.
 