
Hardware

PS3 and 360: Not Nearly As Powerful As They Should Be

Moderator: JC Denton

Unread post by Jedah » 15 Apr 2008 11:06

While I agree with you that comparing movies to video game graphics is absurd, the goal of the article is to prove that resolution is not the main factor in video quality/photorealism. And there's nothing rendered in movies; they're just analog prints of a scene onto a medium (film) running at 24 images per second. Rendering is the process whereby scene data (shapes, textures, lights, fog, etc.) are transformed mathematically into a 2D viewport, which in turn is displayed as a two-dimensional array of color cells on screen.
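
As a rough sketch of what that transformation looks like in code (illustrative Python only, not any particular engine's pipeline): a point in 3D camera space is perspective-projected onto the viewport and then mapped to a cell in the pixel array.

    # Perspective-project a point from 3D camera space onto a 2D pixel grid.
    # Purely illustrative; real renderers do this with 4x4 matrices, per vertex/fragment.
    def project_to_pixel(x, y, z, focal_length=1.0, width=1280, height=720):
        ndc_x = (focal_length * x) / z          # normalized device coordinates, roughly -1..1
        ndc_y = (focal_length * y) / z          # farther points land closer to the centre
        px = int((ndc_x + 1.0) * 0.5 * width)   # map the viewport onto discrete color cells
        py = int((1.0 - ndc_y) * 0.5 * height)  # screen y grows downwards
        return px, py

    print(project_to_pixel(0.5, 0.25, 2.0))     # -> (800, 315) on a 1280x720 grid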
Jedah
 
Joined: 30 May 2006 12:48
Location: Greece

Unread post by icycalm » 15 Apr 2008 11:28

Anid Maro wrote:Good article. My only criticism (which has been touched on earlier in this thread) is that the opening example is a poor one.


The opening example is perfect, thankyouverymuch.

Anid Maro wrote:Not even just a comparison of Hero to DoA but just a comparison between a movie and a video game in general as regards graphics. They're two different animals.


The above statement is just another way of saying that "I know fuck-all about computer graphics, but I still want to post my stupid, ill-informed opinion about them". Because you are obviously too dense to realize that there is ABSOLUTELY NOTHING fundamentally different about visuals in a game and a movie. If you had just spent 5 SECONDS thinking about this, you would have considered that modern action movies are 50% computer-generated graphics, and practically no one can tell the difference anyway. In a few more years, no one (without the 'practically') will indeed be able to tell the difference.

Anid Maro wrote:A movie as the end product consists of a series of pre-rendered images whereas a video game must render its image output on the fly. Furthermore movies are essentially 2D in that the only concern graphically is the predetermined camera angle while in a video game it must (typically) consider multiple angles to render.

As such, in the graphics department a video game requires far more resources than a movie to output the same graphical quality simply because it has to consider a helluva lot more.


Translation: "blah blah blah I am an ill-informed fuckwit"

Anid Maro wrote:Mind you, I understand the point you are attempting to illustrate with the example


You understand fuck-all.

Anid Maro wrote:I just think it was a poor example to use.


Your mom is poor, and I used her last night.

The rest of your comments are okayish blabbing. Please do not post any more of your opinions in this thread, as I want to leave it free of clutter on the off chance that anyone might come along who might have something actually interesting to say.

Oh, and welcome to the forum.

Jedah wrote:While I agree with you that comparing movies to video game graphics is absurd


Et tu, Brute?

There is nothing absurd about the comparison. See my replies above.
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread post by Jedah » 15 Apr 2008 21:00

Why are you focusing on a single sentence? Comparing movies to video game visuals is not a common thing to do; I'm not saying it's crazy! I think this is clear in the following sentences of my post. Oh well, forums sometimes work like a bad telephone line...
Jedah
 
Joined: 30 May 2006 12:48
Location: Greece

Unread post by icycalm » 16 Apr 2008 10:30

Jedah wrote:Why are you focusing on a single sentence?


I am focusing on a single sentence because it's there, in front of me, staring back at me from the page, and it's wrong. And because this forum is read by many readers of this website, I cannot afford to let wrong comments go unrefuted. I realize this may seem extremely pedantic to some people, but I have the website's quality and reputation to think of. And the forum is an integral part of the website. If someone says something factually wrong in it, I am OBLIGED to correct them.

The rest of your post is correct, it's beautiful, so I had nothing to say about that.

Jedah wrote:Comparing movies to video game visuals is not a common thing to do, I'm not saying it's crazy!


Yes, you did not say it's crazy, you said it's absurd. Which it isn't.

I realize this whole thing was just lazy phrasing on your part. So please take a bit more care with your phrasing from now on. Children are reading this website, and they have trouble enough understanding it as it is, even without misleading phrasing.
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread post by James T » 07 Jun 2009 07:13

Haaaang on --

However, for gameplay reasons, especially in multiplayer situations, you are sometimes forced to trade off image quality for higher resolutions


-- what word was that, there?!
James T
 
Joined: 07 Jun 2009 07:11

Unread post by icycalm » 07 Jun 2009 17:12

icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread post by Beakman » 03 Sep 2009 16:47

There's a new development regarding Microsoft's rules when it comes to game resolution. Black Rock developer David Jefferies wrote about it, explaining the futility of rendering at 1280x720 when most displays are 1366x768 or 1920x1080, and the resulting loss of quality due to scaling:

http://www.develop-online.net/blog/44/Microsofts-new-resolution

David Jefferies wrote:...So, in order to display the 1280x720 output image from our game, the LCD must first use its hardware scaler to scale the image by 106.66 per cent in both axes. The lovingly crafted 1280x720 image has gone, to be replaced by something dependent on the quality of the LCD scaler. But no matter how competent the scaler is, it's difficult to imagine how scaling by 106.66% could do any good to the image.

So what is it about 1366x768? There are millions of these LCDs out there but the resolution doesn’t look familiar. It’s not a power-of-two, it’s not divisible by 640 and it’s not even an exact 16:9 aspect ratio (although it is pretty close).

...

The best I’ve managed to come up with is that it’s the highest resolution at nearly 16:9 that will fit into a megabyte boundary.

Multiply 1366 and 768 together and you get very nearly 1024x1024, or 1 megabyte. This means that the LCD manufacturers can use off-the-shelf video memory with very little wastage. On the other hand, if they were to use true 720p then they’d waste 13 per cent of the video memory and possibly lose sales to their rivals who could claim slightly higher resolutions.

...

The first thing to note is that we all make trade-offs between image resolution and quality. Probably the most noticeable example of this is multi-sample anti-aliasing. Both hi-def consoles support either 4xMSAA, where the scene is anti-aliased vertically and horizontally, or 2xMSAA, where the scene is only anti-aliased vertically or horizontally. However, if you’re throwing a lot of graphics around then you won’t be able to afford 4xMSAA at 1280x720 because the GPU hit is just too great.

At this point most games, including our games here at Black Rock, drop down to 2xMSAA at 1280x720. We are making a trade-off and saying that the screen resolution is more important to us than the quality of the anti-aliasing. This isn’t necessarily an entirely voluntary move because, until recently, Microsoft had a TCR insisting that games run at 1280x720 – providing you weren’t one of the lucky ones like Halo, who got it waived and ran at 1152x640, that is.

By asserting that screen resolution is more important than anti-aliasing we’re leaving ourselves vulnerable when the customer’s LCD decides it’s going to rescale the image to a new resolution anyway. If we instead assume that the LCD is going to rescale then, for some games, it might be more sensible to present it with a better anti-aliased but lower resolution image in the first place.

...

It is for this reason that Microsoft recently retired the TCR insisting on 1280x720. Now we are free to make the trade-off between resolution and image quality as we see fit.



If Sony follows suit, it'll be reasonable to expect some kickass-looking 480p games.
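
For what it's worth, the arithmetic in the quote holds up. A quick back-of-the-envelope check (illustrative Python; the numbers are just the ones Jefferies uses):

    # Rough check of the figures in the quote above.
    native  = (1280, 720)            # what the game renders
    panel   = (1366, 768)            # what the typical LCD panel actually is
    aligned = 1024 * 1024            # the "megabyte boundary" the panel nearly fills

    scale_x = panel[0] / native[0]   # ~1.0672, the ~106.7% scaling he mentions
    scale_y = panel[1] / native[1]   # ~1.0667

    panel_pixels  = panel[0] * panel[1]     # 1,049,088
    native_pixels = native[0] * native[1]   # 921,600

    print(f"scale factor: {scale_x:.4f} x {scale_y:.4f}")
    print(f"panel pixels vs 1024x1024: {panel_pixels} vs {aligned}")
    print(f"unused share when storing only 720p: {1 - native_pixels / aligned:.1%}")
    # ...which is in the ballpark of the 13 per cent of wasted video memory he cites.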
Beakman
 
Joined: 26 Apr 2009 17:30
Location: Mexico City

Unread post by icycalm » 07 Oct 2009 13:00

Posted by Chaos in another thread, and I am moving it here because this one is more relevant:

ChaosAngelZero wrote:Great, Euroidiot is only a whopping three years late to your party, icy.

http://www.eurogamer.net/articles/digit ... -blog-post

In Saturday's Digital Foundry "not so high definition" feature, we talked about the technical reasons why some console games don't actually appear to be running at the lowest HD standard: 720p. We revealed that Namco-Bandai's forthcoming Tekken 6 is one of those games, but also stated that the additional graphics processing introduced at the lower resolution, surprisingly, produced a higher overall image quality than the game's in-built HD mode which actually runs in excess of 720p.


Just over three years indeed. I actually did not think any journ-lol-list would ever realize this, so I guess I underestimated some of them. Having said that, the above quote makes fuck-all sense (if Tekken 6 runs at a sub-720p resolution how can its "in-game HD mode" "actually run in excess of 720p"?), but I can't be bothered to look into it right now.
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread post by Gnarf » 07 Oct 2009 14:58

According to the article, the resolution is sub-720p first and foremost because of some motion blurring thing. If you turn off motion blur in the game, it'll switch to 1365x768, but use lower resolution textures.
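
Which would also explain the "in excess of 720p" bit: 1365x768 carries slightly more pixels than a true 1280x720 buffer, while the default motion-blur mode sits below it. Quick check (illustrative):

    print(1365 * 768)   # 1,048,320 pixels with motion blur off
    print(1280 * 720)   #   921,600 pixels in a true 720p framebuffer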
Gnarf
 
Joined: 27 Aug 2008 18:31

Unread post by icycalm » 05 Apr 2017 16:38

http://www.neogaf.com/forum/showthread.php?t=1358119

SeeNoWeevil wrote:The TV industry decides what's good for gaming.

Whatever the TV industry decide they need to sell TVs is what decides the direction of gaming. If more pixels is the easiest way to get a bunch of people who don't want to replace their TV to replace their TV, then you can be sure gaming is going to trend towards those increased pixel rates. No one in gaming has made the conscious decision that 4K is good for gaming; this is all the TV industry and what makes sense for movie watchers.

  • TV industry decide 4K is the easiest thing to market to consumers to push TV sales
  • Consumers buy 4K sets
  • Consumers demand 4K from content providers/console makers to justify their purchases
  • Console makers toe the line with boxes capable of 4K at only 30fps (weak CPUs)
  • Console games scaled back to hit those high pixel counts
  • PC gamers with high end systems have nothing left to put increasing GPU power into besides 4K


Combine this with the obsession with how games look while stationary/in screenshots and the atrocious motion clarity of modern displays, and it doesn't look good. Huge static resolution, awful motion resolution. What's next, 8K at 30fps?

Oh well, at least there's still VR. Can't ruin motion there without making people vomit on themselves.
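
To put the resolution-versus-framerate trade-off above into rough numbers, here's an illustrative pixel-throughput comparison (Python; the modes are just examples):

    # Pixels pushed to the display per second for a few common output modes.
    modes = {
        "1080p @ 60fps": (1920, 1080, 60),
        "1440p @ 60fps": (2560, 1440, 60),
        "4K    @ 30fps": (3840, 2160, 30),
        "4K    @ 60fps": (3840, 2160, 60),
    }

    for name, (w, h, fps) in modes.items():
        print(f"{name}: {w * h * fps / 1e6:,.1f} Mpix/s")

    # 4K at 30fps pushes roughly twice the pixels per second of 1080p at 60fps,
    # and 4K at 60fps roughly four times as many.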
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands
