by Jedah » 15 Apr 2008 11:06
by icycalm » 15 Apr 2008 11:28
Anid Maro wrote:Good article. My only criticism (which has been touched on earlier in this thread) is that the opening example is a poor one.
Anid Maro wrote:Not just the comparison of Hero to DoA, but any comparison between a movie and a video game in general as regards graphics. They're two different animals.
Anid Maro wrote:A movie, as the end product, consists of a series of pre-rendered images, whereas a video game must render its image output on the fly. Furthermore, movies are essentially 2D in that the only graphical concern is the predetermined camera angle, while a video game must (typically) be able to render from multiple angles.
As such, in the graphics department a video game requires far more resources than a movie to output the same graphical quality, simply because it has to consider a helluva lot more.
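To put rough numbers on that gap, here is a minimal Python sketch; the two-hour offline render time per film frame is an assumed figure for illustration, not something from the thread:

    # Compare the time budget a real-time game gets per frame with the
    # time an offline film renderer can spend on one. The film figure is
    # an assumption; render-farm times vary from minutes to many hours.
    GAME_FPS = 60
    game_budget_ms = 1000.0 / GAME_FPS          # ~16.7 ms per frame

    film_frame_hours = 2.0                      # assumed offline render time
    film_budget_ms = film_frame_hours * 3600 * 1000

    print(f"Game frame budget: {game_budget_ms:.1f} ms")
    print(f"Film frame budget: {film_budget_ms:,.0f} ms")
    print(f"The film frame gets ~{film_budget_ms / game_budget_ms:,.0f}x more time")

Under those assumptions the film frame gets several hundred thousand times the rendering time of the game frame, which is the sense in which the two are "different animals".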
Anid Maro wrote:Mind you, I understand the point you are attempting to illustrate with the example
Anid Maro wrote:I just think it was a poor example to use.
Jedah wrote:While I agree with you that comparing movies to video game graphics is absurd
by Jedah » 15 Apr 2008 21:00
by icycalm » 16 Apr 2008 10:30
Jedah wrote:Why are you focusing on a single sentence?
Jedah wrote:Comparing movies to video game visuals is not a common thing to do; I'm not saying it's crazy!
by icycalm » 07 Jun 2009 17:12
by Beakman » 03 Sep 2009 16:47
David Jefferies wrote:...So, in order to display the 1280x720 output image from our game, the LCD must first use its hardware scaler to scale the image by 106.66 per cent in both axes. The lovingly crafted 1280x720 image has gone, to be replaced by something dependent on the quality of the LCD scaler. But no matter how competent the scaler is, it's difficult to imagine how scaling by 106.66% could do any good to the image.
So what is it about 1366x768? There are millions of these LCDs out there but the resolution doesn’t look familiar. It’s not a power-of-two, it’s not divisible by 640 and it’s not even an exact 16:9 aspect ratio (although it is pretty close).
...
The best I’ve managed to come up with is that it’s the highest resolution at nearly 16:9 that will fit into a megabyte boundary.
Multiply 1366 and 768 together and you get very nearly 1024x1024, or 1 megabyte. This means that the LCD manufacturers can use off-the-shelf video memory with very little wastage. On the other hand, if they were to use true 720p then they’d waste 13 per cent of the video memory and possibly lose sales to their rivals who could claim slightly higher resolutions.
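The arithmetic is easy to check with a minimal Python sketch (the "megabyte boundary" being the 1024x1024 block of video memory the article describes):

    # Verify the scale factors from 1280x720 to 1366x768, and how close
    # 1366x768 comes to filling a 1024x1024 block of video memory.
    src_w, src_h = 1280, 720
    dst_w, dst_h = 1366, 768

    print(f"Horizontal scale: {dst_w / src_w * 100:.2f}%")   # ~106.72%
    print(f"Vertical scale:   {dst_h / src_h * 100:.2f}%")   # ~106.67%

    pixels = dst_w * dst_h        # 1,049,088
    block = 1024 * 1024           # 1,048,576
    print(f"1366x768 = {pixels:,} pixels vs 1024x1024 = {block:,}")

    p720 = 1280 * 720             # 921,600
    print(f"720p leaves {(1 - p720 / block) * 100:.1f}% of the block unused")

The unused fraction comes out around 12-14 per cent depending on whether you measure against the memory block or the 720p image, in line with the 13 per cent figure the article cites.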
...
The first thing to note is that we all make trade-offs between image resolution and quality. Probably the most noticeable example of this is multi-sample anti-aliasing. Both hi-def consoles support either 4xMSAA, where the scene is anti-aliased vertically and horizontally, or 2xMSAA, where the scene is only anti-aliased vertically or horizontally. However, if you're throwing a lot of graphics around then you won't be able to afford 4xMSAA at 1280x720 because the GPU hit is just too great.
At this point most games, including our games here at Black Rock, drop down to 2xMSAA at 1280x720. We are making a trade-off and saying that the screen resolution is more important to us than the quality of the anti-aliasing. This isn’t necessarily an entirely voluntary move because, until recently, Microsoft had a TCR insisting that games run at 1280x720 – providing you weren’t one of the lucky ones like Halo, who got it waived and ran at 1152x640, that is.
By asserting that screen resolution is more important than anti-aliasing we’re leaving ourselves vulnerable when the customer’s LCD decides it’s going to rescale the image to a new resolution anyway. If we instead assume that the LCD is going to rescale then, for some games, it might be more sensible to present it with a better anti-aliased but lower resolution image in the first place.
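To make the trade-off concrete, here is a minimal Python sketch of raw sample counts per frame; the resolutions and MSAA levels are the ones mentioned above, and sample count is only a crude proxy for the real GPU cost:

    # Each pixel in a multi-sampled framebuffer stores `msaa` colour and
    # depth samples, so the sample count scales linearly with the MSAA level.
    def samples(width, height, msaa):
        return width * height * msaa

    configs = [
        ("1280x720 with 2xMSAA", 1280, 720, 2),
        ("1280x720 with 4xMSAA", 1280, 720, 4),
        ("1152x640 with 4xMSAA", 1152, 640, 4),
    ]

    for name, w, h, m in configs:
        print(f"{name}: {samples(w, h, m):,} samples per frame")

Dropping from 4x to 2x at 720p halves the sample count, and a sub-HD buffer like Halo's 1152x640 wins back another 20 per cent at the same MSAA level.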
...
It is for this reason that Microsoft recently retired the TCR insisting on 1280x720. Now we are free to make the trade-off between resolution and image quality as we see fit.
by icycalm » 07 Oct 2009 13:00
ChaosAngelZero wrote:Great, Euroidiot is only a whopping three years late to your party, icy.
http://www.eurogamer.net/articles/digit ... -blog-post

In Saturday's Digital Foundry "not so high definition" feature, we talked about the technical reasons why some console games don't actually appear to be running at the lowest HD standard: 720p. We revealed that Namco-Bandai's forthcoming Tekken 6 is one of those games, but also stated that the additional graphics processing introduced at the lower resolution, surprisingly, produced a higher overall image quality than the game's in-built HD mode, which actually runs in excess of 720p.
by icycalm » 05 Apr 2017 16:38
SeeNoWeevil wrote:The TV industry decides what's good for gaming.
Whatever the TV industry decides it needs to sell TVs is what decides the direction of gaming. If more pixels is the easiest way to get a bunch of people who don't want to replace their TV to replace their TV, then you can be sure gaming is going to trend towards those increased pixel rates. No one in gaming has made the conscious decision that 4K is good for gaming; this is all the TV industry and what makes sense for movie watchers.
- TV industry decide 4K is the easiest thing to market to consumers to push TV sales
- Consumers buy 4K sets
- Consumers demand 4K from content providers/console makers to justify their purchases
- Console makers toe the line with boxes capable of 4K at only 30fps (weak CPUs)
- Console games scaled back to hit those high pixel counts
- PC gamers with high end systems have nothing left to put increasing GPU power into besides 4K
Combine this with the obsession with how games look while stationary/in screenshots, and with the atrocious motion clarity of modern displays, and it doesn't look good. Huge static resolution, awful motion resolution. What's next, 8K at 30fps?
Oh well, at least there's still VR. Can't ruin motion there without making people vomit on themselves.
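For what it's worth, raw pixel throughput bears out the post's complaint; a minimal Python sketch (the modes are just the usual marketing targets):

    # Pixels pushed per second for common resolution/frame-rate targets.
    # 4K at 30fps moves more pixels per second than 1440p at 60fps, which
    # is where the hardware budget goes instead of into frame rate.
    modes = [
        ("3840x2160 @ 30fps", 3840, 2160, 30),
        ("2560x1440 @ 60fps", 2560, 1440, 60),
        ("1920x1080 @ 60fps", 1920, 1080, 60),
    ]

    for name, w, h, fps in modes:
        print(f"{name}: {w * h * fps / 1e6:,.1f} megapixels/second")

At roughly 249 megapixels per second, 4K at 30fps out-consumes even 1440p at 60fps, so a fixed hardware budget spent on 4K is a budget taken away from frame rate.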