Hardware

PS3 and 360: Not Nearly As Powerful As They Should Be

Moderator: JC Denton

Unread postby Chuplayer » 07 Sep 2006 03:12

http://insomnia.ac/commentary/not_powerful_enough/

I read that article, and all I could think was CONCUR'D.

I think I'll actually prefer 480i on RCA A/V connectors for the new generation. I watch the MGS4 trailer on the Metal Gear Saga DVD in component 480i resolution, and it looks really good. However, I watch that same trailer either on that DVD or on disc 2 of MGS3 Subsistence through my PS2's RCA A/V, and I'm more impressed. Why? The lack of definition makes everything look more realistic. With the component hookups, I can see flaws and some plastic-like effects that shouldn't be seen. Thanks to the color bleed and all the shit that goes on with RCA jacks, Snake looks like a real person. More realistic. He looks more convincing.

Like you said, if they weren't trying to render all this stuff at ultra high resolution, they could focus on harnessing the system's power to make the lower resolutions look more convincing.
Chuplayer
 
Joined: 14 Apr 2006 16:05

Unread postby Jedah » 07 Sep 2006 11:25

Nice article. I don't agree with the numbers stated; computer performance depends on many more parameters than I can even list right now. In general I agree that Sony and Microsoft overhyped "HD gaming" to hide their hardware's inability to produce innovative graphics effects. It's a fact that higher resolution does justice to all the hard work game artists put in, but it's also true that the PS3 and Xbox 360 produce graphics effects that expensive PC hardware already implements. Right now we see Prey running at half the fps on the 360 compared to a high-end PC. It is depressing, no matter the price difference. John Carmack stated clearly at the last QuakeCon (http://www.gametrailers.com/gamepage.php?fs=1&id=3119) that although next-gen consoles are very powerful and interesting to develop for, an expensive PC is already more powerful.

Are we sold on hype, or are we just impatient? Maybe developers haven't yet found the "key" to harnessing the power? If they do after two years, will people take notice? What about AI, physics, mechanical features?

All these questions will be answered in the near or distant future. The thing is that for the first time in my gaming life, I'm not sold on next-gen. I really love my DS. The next time a bunch of extra money lands in my hands (very unlikely) I will be spending it on GBA and DS titles, and maybe I'll finally purchase a PS2 with a couple of nice Cave shooters from the net. I hate trailers, FMV, prerendered cutscenes, high resolutions and shit. The 360 has been out almost a year and not a single f*****g game has persuaded me to purchase one. With the PS3, everything about the games is very foggy. Small gameplay sections or FMV. Nothing to drop my jaw. What the f**k is going on? Maybe the Wii will save the day? Well, if you don't touch the thing, who knows?
User avatar
Jedah
 
Joined: 30 May 2006 12:48
Location: Greece

Unread postby piyo » 07 Sep 2006 14:46

I'm not sure that your tech numbers are right, but it doesn't detract from your argument too much.

I believe that the PS3 and the Xbox 360 are not powerful enough to handle true HDTV resolutions, and at the same time deliver the large variety of new effects necessary to approach photorealism. The quest for photorealistic graphics is, after all, the main reason for designing new consoles every four or five years.


What!? The main reason consoles are developed is to MAKE MONEY. The tech is secondary. The graphics are made to lure the consumers. Here is where you make those PR guys' arguments for them.

The extra power should be used for higher polygon counts and more complex effects, so that standard resolution games can become indiscernible from standard resolution movies. Only when we have achieved this should we move to vastly more powerful hardware and higher resolutions.


I don't believe people require all games to have the same visual quality as a movie. See any puzzle game or cartoony/anime game. Or perhaps developers can bank on a gamer's visual forgiveness, being trained to suspend disbelief at anything pixelated.

Besides, the production costs would increase, and that puts the pressure on the marketers to polish the turd even harder.

Progress in visual quality is being held back because of a policy cooked up by marketing people to suit their, and not the developers', needs. On the one hand this is not such a big deal; graphics aren't everything, and we'll eventually get there one way or another. But what upsets me more is all the gibberish "ZOMG HDTV!" nonsense that is being poured down people's ears, while no one goes on the record to refute it.

Okay, sure, the PR about HDTV has been off the wall and fantastic. But I don't expect to see photorealism or even better-than-uncanny-valley in my games in this generation. I think it is unrealistic to assume that it's coming now. Maybe next generation (2012?), because there won't be any more freakin' video standards to jump to.

I guess all we can do is lament how graphics could be better. So what should we do, vote with our wallets? Buy games that don't strive for the realism? Or knock down our expectations a notch?

---
piyo
User avatar
piyo
 
Joined: 17 Aug 2006 11:45

Unread postby icycalm » 07 Sep 2006 15:27

Chuplayer wrote:I think I'll actually prefer 480i on RCA A/V connectors for the new generation. Thanks to the color bleed and all the shit that goes on with RCA jacks, Snake looks like a real person. More realistic. He looks more convincing.


I understand what you are saying, though I always prefer using the best possible connection type myself. What composite video gives you is basically the Photoshop Filter effect. If you have a very low quality image, it's sometimes better to smear it a little with a filter, to hide its rough edges.

As for the numbers, which Jedah and piyo doubt, the following statement is 100 per cent correct, as I have checked it with two experts in this field.

Now the way graphics cards work is that they have to perform all necessary calculations for every single pixel shown on screen. This means that a 1080i image requires, in the best case scenario, approximately seven times more calculations -- i.e. processing power -- than a 480i one (because it has approximately seven times more pixels). Accordingly, a 1080p image requires almost fourteen times more processing power.
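
For anyone who wants to check the arithmetic themselves, here is a rough back-of-the-envelope sketch in Python (assuming 640x480 for 480i and 1920x1080 for the HD modes, with an interlaced mode filling half its lines per refresh):

    # Rough pixel counts behind the "seven times" and "fourteen times" figures.
    sd_field = 640 * 480 // 2      # 480i: 153,600 pixels per field
    hd_field = 1920 * 1080 // 2    # 1080i: 1,036,800 pixels per field
    hd_frame = 1920 * 1080         # 1080p: 2,073,600 pixels per frame

    print(hd_field / sd_field)     # 6.75 -> "approximately seven times"
    print(hd_frame / sd_field)     # 13.5 -> "almost fourteen times"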


On the other hand, the next statement was simply a guess I made in order to make a point. I used the word "say" to signify this:

If you want to double the amount of polygons and upgrade the graphics with newer, fancier effects, that will cost you extra. Say twice as much for a double improvement in overall visual quality.




piyo wrote:I don't believe people require all games to have the same visual quality as a movie. See any puzzle game or cartoony/anime game.


Of course. It would be absurd to believe otherwise.

But in this article I am referring specifically to the quest for photorealistic graphics. I am not discussing puzzle games, or stylized games like Killer 7, for example. The objective is to make a console that has limitless power, so we can then toss it to the designers and say: "Here, this thing will run any kind of game you can dream up."

All the games do NOT have to be photorealistic. Perfect photorealistic graphics will simply be the proof that we have reached the end of the processing-power journey. And this power can be used equally well to create even prettier versions of Okami or Killer 7, or anime-style action games, for example.


piyo wrote:I guess all we can do is lament how graphics could be better. So what should we do, vote with our wallets? Buy games that don't strive for the realism? Or knock down our expectations a notch?


No matter what our expectations, I trust developers will find ways to surprise us in terms of graphics and otherwise. I could not have possibly been prepared for the first time I saw Viewtiful Joe running, nor Okami.

I would be extremely happy if Sony and MS allowed developers to use any resolution they want. That was the point I had in mind when writing the article. But it will take at least a year for this to happen, if indeed it ever does. The battle between MS and Sony over which system is more powerful is so intense that neither would want gamers to see pics of a low-res game running on its hardware.

Too bad for 2D game-makers, who will struggle to design 720p shooting or fighting games, if indeed they even try...
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby Pongism » 12 Sep 2006 18:37

DPS wrote:LASTLY! Alex Kierkegaard over at insomnia (see the link to the right?) wrote a little piece insulting, basically, graphics. Unfortunately, I can't defend most of what he says, mostly becasue I haven't read the whole thing. I mean, I did up until the part where he compared Dead or Alive 4 to the 2002 movie Hero. Not the upcoming crap-fest Dead or Alive movie starring Jaime Pressly and Devon Aoki, but the Oscar-nominated Hero that won like 40 awards worldwide. Also, Alex used scaled screenshots to compare the two side-by-side. If yellow journalism was in dire need of a comeback then Alex is, by far, the most likely asshole to do it.

(I bought my 360 specifically to play DoA4. Not watch it. Play it, jackass. FUCK GRAPHICS, and fuck you, too!)

Looks like DonMarco didn't like it.

I think the consoles are and will be plenty powerful. Sharper images aren't selling points to true gamers, after all. They're for the consumers who wander around Best Buy and the salespeople there.
Pongism
 
Joined: 03 Apr 2006 06:11

Unread postby icycalm » 13 Sep 2006 19:57

Pongism, tell Marco that he missed the point, which is not surprising since he only read three paragraphs of a 20+ paragraph article. Most articles tend to reveal their purpose around the halfway mark, and this one is no exception.

As for DOA 4 and Hero, the choice of examples was arbitrary -- I could have picked any two examples without taking away from the points I wanted to raise.

In fact, on second thought maybe I should have picked a movie that has an Xbox 360 adaptation. I am vaguely aware of a couple of movie tie-ins for the system... X-Men perhaps, or LOTR?

Also, the images are not scaled. Only the thumbnails are, which is why they are called thumbnails.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby Chuplayer » 14 Sep 2006 03:32

Pongism wrote:
DPS wrote:LASTLY! Alex Kierkegaard over at insomnia (see the link to the right?) wrote a little piece insulting, basically, graphics. Unfortunately, I can't defend most of what he says, mostly becasue I haven't read the whole thing. I mean, I did up until the part where he compared Dead or Alive 4 to the 2002 movie Hero. Not the upcoming crap-fest Dead or Alive movie starring Jaime Pressly and Devon Aoki, but the Oscar-nominated Hero that won like 40 awards worldwide. Also, Alex used scaled screenshots to compare the two side-by-side. If yellow journalism was in dire need of a comeback then Alex is, by far, the most likely asshole to do it.

(I bought my 360 specifically to play DoA4. Not watch it. Play it, jackass. FUCK GRAPHICS, and fuck you, too!)

Looks like DonMarco didn't like it.

I think the consoles are and will be plenty powerful. Sharper images aren't selling points to true gamers, after all. They're for the consumers who wander around Best Buy and the salespeople there.


Chops: busted.

At least, I hope that was just a chop busting.
Chuplayer
 
Joined: 14 Apr 2006 16:05

Unread postby Pongism » 14 Sep 2006 18:04

Marco may not have read the whole thing (as I did) but I can't honestly say he missed out on all that much.

HD movies will be about the same resolution as the versions shown in movie theaters. That's the real push behind HD TVs and HD movie players. The resolution isn't important; the picture clarity and overall detail and presentation are what they (Nintendo, Sony, Microsoft) want to step up. Every generation they beef up the machines and churn out better-looking games with more complicated character designs and backgrounds and explosions and so on. They aren't really looking to be better-looking than a movie, but better-looking than other games of the same generation.

Look back at when Hollywood went from black and white to color. It was roughly the same time industry photographers could afford to buy and sell color photographs. Which was long after books and magazines could use inks to show color illustrations. Which was long after artists could smear blood and feces on cave walls. It's all a pattern, Alex.

See, the next generation of consoles in 2012 or so will still be using 1080p as the max. If anything, they will have more processors in smaller sizes. They will be more complex and more affordable. They will be more impressive when compared to the last generation.

Seriously, fuck the tech specs and Cell processors and Emotion Engines and all that jazz.

Graphics are an opiate for the masses, icycalm.
Pongism
 
Joined: 03 Apr 2006 06:11

Unread postby icycalm » 15 Sep 2006 21:37

Pongism, imagine a guy who reads a detailed article on the best equipment to use for fishing, and who writes in to say that he thinks the article is wrong because he hates fishing.

I understand that you, and Marco, and quite a few other people these days hate nice graphics, for some reason. However, you should know by now that I don't. Hate them, that is.

If you have some arguments to make on why you think the PS3 and the 360 are capable of delivering photorealistic graphics at HD resolutions, then go ahead and make them. As you have read, I am convinced that they don't have the horsepower for that.

If you want to make graphics vs. gameplay arguments or whatever, then start a thread in the games forum and I'll be sure to reply at length.

As for this:

Graphics are an opiate for the masses, icycalm.


That's why I love them.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby lock » 11 Nov 2006 13:37

Really good article. The graphics cost almost 50% of the machine resources.

That makes me think about the MGS4 graphic evolution.

Two years ago the models looked almost the same as in the latest pics, but the filters and scenery textures have gotten worse. Is it possible that the game could keep the look it had two years ago at 720p instead of 1080p? Only Kojima knows...
User avatar
lock
 
Joined: 04 Nov 2006 12:31
Location: Kyoto

Unread postby icycalm » 14 Nov 2006 11:12

I think so. 720p seems to be nearer the sweet spot of the PS3 and the 360.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby zak » 14 Nov 2006 19:16

Icy are you getting a PS3 anytime soon?
zak
 
Joined: 07 Jun 2006 12:45
Location: Bucharest

Unread postby icycalm » 15 Nov 2006 17:02

Nope. The earliest I might get one is March, I think. I am writing about this actually; I should have something on the frontpage in a day or so.

And you?
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby zak » 16 Nov 2006 21:10

Well yeah, since Euroland got screwed over I can only get one in March.
But I don't think I will, maybe towards the end of next year.
zak
 
Joined: 07 Jun 2006 12:45
Location: Bucharest

Unread postby GnaM » 10 Jun 2007 06:08

LOL, that was an extremely compelling article. I actually feel mildly ashamed that that stuff had not already occurred to me, considering I've been playing PC FPS for about 8 years now...and just the other day I was tweaking my settings for Quake 4 and found anything beyond 640x480 to be an absolute waste of processing power.

That said, I don't agree that "primitive" games like HL1 or N64 games receive little benefit from additional resolution.

Coincidentally, I just tried N64 emulation for the first time in the past two weeks, and was blown away to a small degree by how good the games look running at 1600x1200. It might just be my skewed memories, but last time I played Ocarina on an actual N64 it hurt my eyes and I quit after a few hours. In contrast, I've been rather pleased by the look of Ocarina: Master Quest on my PC and have been putting a few hours into it here and there throughout the past week.

Of course, that's not completely without additional effects; aside from the added resolution I use Anti-aliasing at 4x, Anisotropic Filtering at 3x, and SuperSAL texture filtering.

Of course, it doesn't make the game look like Twilight Princess, and the textures all still look like diarrhea up close, but it's still a hell of a lot easier to look at, for me at least.

Anyway, I'm completely convinced of your argument that the higher resolutions of the new consoles are likely just completely wasting processing power that could be used on more effects to create an overall superior image. And the sad thing is that the decision to use these resolutions obviously had nothing to do with offering better gaming, it was just a matter of Sony and MS promoting their respective home video formats.

Good blog.
User avatar
GnaM
 
Joined: 10 Jun 2007 05:22

Unread postby eae » 04 Apr 2008 23:37

I hope it's ok to post a comment to such an old article :)

Let me be the first one to disagree on your point. The reason why a movie looks very good even in a low resolution is that it was first "rendered" at a very high resolution (i.e. the "real world" scenes that happened in front of the camera), and then the process of capturing it on film and digitizing it kinda "downscaled" it to a lower resolution, where every pixel in the end result is obtained by averaging a large number of "real world pixels" from the original image. This is the reason why it presents no aliasing, despite no anti-aliasing being applied to the image.

So if you want to produce low-res graphics that look photorealistic, you should follow the following process: first render the image at a very high resolution, then downscale it by averaging the pixels (or maybe some more refined method) until you get the resolution you need. This is not the same thing as calculating the image in low-res and then applying anti-aliasing to it, since AA can't guess what's really meant to be around every pixel, so the end result will always be worse, with artifacts etc.
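
If it helps, here is a minimal sketch of the averaging step I mean, in Python with NumPy (just an illustration, assuming an image whose dimensions divide evenly by the downscale factor):

    import numpy as np

    def downscale(image, factor):
        # Average each factor x factor block of pixels into one output pixel.
        h, w = image.shape[:2]
        blocks = image.reshape(h // factor, factor, w // factor, factor, -1)
        return blocks.mean(axis=(1, 3))

    # e.g. render at 4x the target resolution, then average down:
    hi_res = np.random.rand(1920, 2560, 3)   # stand-in for a 4x-supersampled frame
    lo_res = downscale(hi_res, 4)            # 480x640 result, edges smoothed out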

So as you can guess, if you have to render a high-res image anyways before downscaling it, it's much more reasonable to directly feed the high-res image to the screen.

A similar thing goes for framerate: you don't need 60 Hz to give the impression of a fluid motion, and indeed movies appear fluid despite running at 25 Hz. But if you want to replicate this kind of fluidity on a low framerate videogame, you'll need to add an accurate motion blur effect, and to obtain this you'll need to first render the images at a high framerate, and then downrate them so that every resulting "motion-blurred" frame contains information taken from several frames. Again, it's much easier to just render the thing directly at 60 fps (or 120 fps for demanding PC gamers).
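
And the same averaging trick, again just a sketch, for the motion-blur point (blending several internally rendered frames into one output frame):

    import numpy as np

    def motion_blur(frames):
        # Blend a list of consecutive same-size frames into one output frame.
        return np.mean(np.stack(frames), axis=0)

    # e.g. render internally at 120 fps and emit blended frames at 30 fps:
    # blurred = [motion_blur(frames[i:i + 4]) for i in range(0, len(frames), 4)]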

Hope this made sense to you :)

edit: let me add a wikipedia reference http://en.wikipedia.org/wiki/Supersampling
eae
 
Joined: 03 Apr 2008 14:46

Unread postby icycalm » 04 Apr 2008 23:57

eae wrote:I hope it's ok to post a comment to such an old article


There is no "sell-by" date on anything written on this site. If I didn't want people bumping threads I would have locked them.

eae wrote:The reason why a movie looks very good even in a low resolution is that it was first "rendered" at a very high resolution (i.e. the "real world" scenes that happened in front of the camera), and then the process of capturing it on film and digitizing it kinda "downscaled" it to a lower resolution


No, this is NOT "the reason why a movie looks very good even in a low resolution". Because a movie would look photorealistic (i.e. good, according to the definition of 'good' in the article) at ANY resolution. This is because of how we have defined photorealism. A movie is photorealistic by definition, and therefore looks good by definition.

eae wrote:So if you want to produce low-res graphics that look photorealistic, you should follow the following process: first render the image at a very high resolution, then downscale it by averaging the pixels (or maybe some more refined method)


First off, the method you propose may in fact be a valid one, but it seems to me inefficient. Regardless, the point here is that the exact method by which a low-res game can be made to look photorealistic is irrelevant to the subject at hand. The point is that these methods require more polygons and additional effects, which means they require additional processing power. By increasing the game's resolution you are simply exponentially increasing these power requirements, while getting further away from photorealism, not closer to it.

eae wrote:A similar thing goes for framerate: you don't need 60 Hz to give the impression of a fluid motion, and indeed movies appear fluid despite running at 25 Hz. But if you want to replicate this kind of fluidity on a low framerate videogame, you'll need to add an accurate motion blur effect


One of the section headings in my article reads "It's the effects, stupid".

You are not saying anything new.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby icycalm » 04 Apr 2008 23:58

And your Wikipedia reference is completely irrelevant to anything that has to do with this subject.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby raphael » 05 Apr 2008 01:45

eae wrote:Let me be the first one to disagree on your point. The reason why a movie looks very good even in a low resolution is that it was first "rendered" at a very high resolution

Help!
Another guy who doesn't know anything about cinema and video and tries to teach it to you.

Please stop repeating what you've heard here or there.
You got almost everything wrong.
User avatar
raphael
 
Joined: 04 Mar 2008 19:31
Location: Paris

Unread postby icycalm » 05 Apr 2008 02:01

Thanks, Raphael, but I think I covered the issues sufficiently well. This is not some random "gamerz" forum where people gang up on new posters just because they are wrong. If you don't have anything to add to the discussion there's no point in posting.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby Jedah » 05 Apr 2008 02:02

Scenes of real life are rendered? That's more imaginative than The Matrix. In what resolution? Infinite x Infinite x double the eye retina capture frequency? That's a huge frame buffer if you ask me!!
User avatar
Jedah
 
Joined: 30 May 2006 12:48
Location: Greece

Unread postby icycalm » 05 Apr 2008 02:19

He used quotation marks ("rendered"), so he was not saying anything wrong there. I can understand what he is saying, but a game does not need to be rendered first in higher resolution and then displayed in a lower one in order to look good. This would be a waste of effort. All it takes is more polygons and more complex effects.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby rob » 10 Apr 2008 07:33

my problem with this article is that you begin by comparing a movie still with a game shot and make the point that the movie still looks better despite the lower res. But this isn't what the shift is about. There's more to making graphics than the simple numbers that the particular machine can kick around.
For a start, as you've said, there's the effects. Effects which need to be invented and probably included in the hardware so they can be done fast enough. So the current gen may not have been able to have them anyway. If they could we're talking increased costs/sacrificing other stuff for something only a selection of games that want it may use.
Next, there's the sheer time it would take to make things look that good. If you compared a shot of any two games, one at 480i and one at 720p, the 720p would look better. So at no cost in terms of dev time you've made every single game look better (2d aside).
Next, I don't agree that increasing res exponentially increases the power required. The AI, physics and audio are still the same for a start, so that side of things doesn't change. Geometry doesn't automatically increase so there's no more polygons to move around. Assuming textures all increase in proportion then there is a proportional hit on memory and fill rate, and similarly for most special effects, but that's only one part of it.

Finally, what makes 480i so perfect anyway? Surely by the writer's arguments a movie at 400x300 would look more realistic than a 360 game, and require less power still - why not drop down further? It'd be nice to have the option for all games of choosing any res to fit, but with Blu-ray and HDTV pushing people to buy HD screens, unless the source is the right res, stuff looks crap. I'm still on a CRT by the way though, so I don't care!

Note that I'm not actually that much of a techie so apologies if any of the terms I've used are wrong, but hopefully the point is clear...
rob
 
Joined: 10 Apr 2008 07:16

Unread postby icycalm » 10 Apr 2008 08:17

rob wrote:my problem with this article is that you begin by comparing a movie still with a game shot and make the point that the movie still looks better despite the lower res.


Well, it does. You have a problem with that? What are you, blind?

rob wrote:But this isn't what the shift is about.


Yes it is. The shift is, to a large extent, about improving the graphics.

rob wrote:There's more to making graphics than the simple numbers that the particular machine can kick around.


This sentence is meaningless. Graphics ARE numbers. Everything a computer does is numbers.

rob wrote:For a start, as you've said, there's the effects. Effects which need to be invented and probably included in the hardware so they can be done fast enough. So the current gen may not have been able to have them anyway.


That's nonsense. Yes, effects are invented all the time. The point is that the 360 and the PS3 are not capable of exploiting to the full even the CURRENTLY-AVAILABLE effects at 720p or 1080p. Not to mention polygon counts.

rob wrote:If they could we're talking increased costs/sacrificing other stuff for something only a selection of games that want it may use.


I've no idea what this sentence is supposed to tell me. Yes, graphics aren't everything, we know. This article however was about graphics.

rob wrote:Next, there's the sheer time it would take to make things look that good. If you compared a shot of any two games, one at 480i and one at 720p, the 720p would look better.


Again, nonsense. It depends on the game in question. I know thousands of games that would look worse in 720p. THOUSANDS.

rob wrote:So at no cost in terms of dev time you've made every single game look better (2d aside).


Okay, now we are getting somewhere (with the parenthesis).

Yet, I still know many 3D games that would not look better in high res. A game with low texture quality would look like shit in a higher res, for example, because you could make out its imperfections more easily.

rob wrote:Next, I don't agree that increasing res exponentially increases the power required.


There's nothing to discuss here. You obviously lack the technical knowledge required to understand this. So just take my word for it.

rob wrote:Finally, what makes 480i so perfect anyway?


No one said 480i is perfect. You just need to learn to read.

rob wrote:Surely by the writer's arguments a movie at 400x300 would look more realistic than a 360 game, and require less power still - why not drop down further?


Yeah, we could. But it would not be necessary. Obviously if you keep dropping down the resolution you get to the point where you only have one pixel, which can only be one color at a time. But we don't have to resort to such absurdities. Movies look amazing at 480i, and until we manage to make a game which looks just as good at 480i there's no reason to jack up the resolution (with the exception of strategy games, etc., as I explain in the footnotes).

rob wrote:Note that I'm not actually that much of a techie so apologies if any of the terms I've used are wrong, but hopefully the point is clear...


I am sorry to have to tell you that you don't really have a point...
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Unread postby Anid Maro » 15 Apr 2008 05:33

Good article. My only criticism (which has been touched on earlier in this thread), is that the opening example is a poor one.

Not just the specific comparison of Hero to DoA, but any comparison between a movie and a video game as regards graphics. They're two different animals.

A movie as the end product consists of a series of pre-rendered images, whereas a video game must render its image output on the fly. Furthermore, movies are essentially 2D in that the only graphical concern is the predetermined camera angle, while a video game must (typically) consider multiple angles when rendering.

As such, in the graphics department a video game requires far more resources than a movie to output the same graphical quality simply because it has to consider a helluva lot more.

Mind you, I understand the point you are attempting to illustrate with the example (that hardware resources are being squandered on higher resolutions), I just think it was a poor example to use.

That aside, I fully agree. This high resolution BS is a total waste and only exists (in my opinion) to drive HD sales. Sony has an HD movie format (Blu-ray), as does MS (HD DVD), not to mention that Sony also happens to manufacture HD TVs. I think it no coincidence that HD quality is such a huge part of their campaigns.

As said in the article, it would be better to have a lower resolution with more effects than the reverse. Why increase the resolution when there are few details to see? Furthermore an increase in resolution limits the polygon output which essentially means that the game models are less complex than they could be.

In the case of 99 Nights you have detailed characters with bland backgrounds; at a lower resolution you could have good quality in both (presuming, of course, an appropriate budget in the art department).

Amusingly enough, a high resolution is most relevant as regards 2D graphics (i.e. Sprites)... which neither Sony nor MS would like to touch with a 10 foot pole. :P
Anid Maro
 
Joined: 15 Apr 2008 05:11
