
Hardware

Graphics Cards

Moderator: JC Denton

Post by icycalm » 05 Jan 2010 21:51

Aaaaand this is the most exciting component of all. Basically, at this point in time, ATI seems to be ahead of NVIDIA, from what I can gather. And not only that, but their 5xxx-range cards are the only ones that currently support DirectX 11.

Out of these, the top model is the HD5970, which in various tests I've seen outperforms every other single card, and even various inferior SLI and CrossFire configurations. And two of these cards hooked up together are untouchable.

[Image: ATrHD5970_3-4_lg.png]


So that's what I am going with. I'll get one of these cards to start with, and in a few months, when the price goes down a bit, I'll grab a second one. However, getting the first one will probably be a bit of a chore, since I haven't actually seen it in stock anywhere -- neither in Europe nor in the US. So I'll be looking out for it in the next few weeks. They go for about $600 in the US and 550 euros in Europe.
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Spain

Post by Adjudicator » 21 Feb 2010 03:56

icycalm, I have a suggestion for an additional video card to add alongside one or two ATI Radeon HD 5970s: an NVIDIA video card capable of running PhysX.

Here is a list of NVIDIA video cards which can support NVIDIA PhysX:

http://www.nvidia.com/object/physx_gpus.html

Which NVIDIA GeForce GPUs support PhysX?

GeForce 8-series GPUs and later (with a minimum of 32 cores and a minimum of 256MB dedicated graphics memory) support PhysX.


Here is a link which explains NVIDIA PhysX in detail:

http://www.nvidia.com/object/physx_faq.html

Basically, it is a proprietary physics engine which can make use of appropriate hardware to accelerate physics calculations that would normally be performed by the CPU.

There are several games which make use of this proprietary technology to render complex physics effects.

The most prominent example which comes to my mind is Batman: Arkham Asylum.

I must emphasise that PhysX in this game is optional, and all it changes is the rendering of certain effects, which does not affect how the game is played (about the only exception I can think of is the volumetric fog effect: when PhysX is disabled, the fog is not rendered at all).

This article from HardOCP gives a detailed review of PhysX, comparing its settings in the game and their impact on performance.

I have linked to it just to illustrate PhysX's potential and its effect on performance.

http://hardocp.com/article/2009/10/19/b ... ay_review/

This particular page shows how significantly PhysX changes the way certain scenes are rendered:

http://hardocp.com/article/2009/10/19/b ... _review/10

However, the big issue with PhysX is its proprietary nature, which means only NVIDIA video cards or the outdated AGEIA PhysX physics processing unit (PPU) can accelerate PhysX.

According to this review, running PhysX on an NVIDIA video card gives better performance than using the AGEIA PPU:

http://www.firingsquad.com/hardware/phy ... ce_update/

The CPU can also run PhysX in the absence of an NVIDIA video card or the AGEIA PPU, but doing so incurs a very significant performance hit.

Another point to take note of is that PhysX is disabled if the driver detects the presence of any non-NVIDIA video card in the system.

http://enthusiast.hardocp.com/news/2009 ... d_present/

http://www.ngohq.com/graphic-cards/1622 ... esent.html

However, there is a "workaround" which circumvents this issue:

http://www.hardforum.com/showpost.php?p ... ostcount=1

http://www.ngohq.com/news/16560-patch-r ... esent.html

In summary, this is the absolute best configuration of video cards as of this date:

Two ATI Radeon HD 5970s in CrossFire for DirectX 11 effects and unsurpassed performance.

One GTS 250 or better NVIDIA video card dedicated to PhysX.

One concern I have is that availability of GTX 260s, GTX 275s, GTX 285s and GTX 295s could be a problem.

http://www.semiaccurate.com/2009/10/06/ ... nd-market/
Adjudicator
 
Joined: 12 Mar 2009 13:42
Location: Singapore

Post by ray » 02 Jun 2011 18:37

The 5970 has been outdone by the Radeon HD 6990, which is an absolute monster.
ray
 
Joined: 22 Feb 2009 19:33

Post by icycalm » 01 Oct 2012 15:32

Really great article on GTX 680 vs. HD 7970 by those fagots at the Verge/Polygon. They had me from the opening lines:

Vlad Savov wrote:And yet, the stalwart PC continues to defy exaggerated reports of its demise with hugely popular series like Battlefield, Crysis, Diablo, StarCraft, and Total War offering gaming experiences that simply cannot be touched by consoles.


http://www.theverge.com/2012/7/3/307619 ... rd-upgrade

I don't know how good their benchmarking process is, but the article seems, to my relatively untrained eyes at least, quite thorough and well-written. Not as thorough as the 10-page analyses the dedicated hardware sites are putting out perhaps, but good enough for a general gaming site and with a layout 100 times prettier and better. The mere fact that they are using a single page is already a huge plus, and look at all those cool pics and insets! Really a joy to behold, and they absolutely sold me on the 3-monitor setup.


Vlad Savov wrote:Triple-monitor gaming is simply amazing. A sensory revelation. You still spend the majority of the time with your eyes anchored on the middle screen, but the sense of atmosphere that comes from the two auxiliary displays is spine-tinglingly good. After a while, you may even learn to look at things with your own neck rather than the mouse. In truth, this is how any sort of visual simulation is supposed to operate. The tight horizontal field of view of a typical monitor, no matter how resplendent its color reproduction may be, is just unnatural — human beings have peripheral vision which has gone (mostly) neglected, and it’s only once you move to this sort of surround view that you’ll understand what you’ve been missing.


Before, I had thought it'd be merely a nice luxury, and had not prioritized it at all, but now I realize it's nothing short of a necessity, unless the helmets mothman talks about in the other thread prove an even better solution in the short term (because in the long term they will of course be better). At any rate, I've got a stack of cutting-edge PC games right here, and I'm wondering whether I should go ahead and play them on my single 42" 1080p screen when my new GTX 680 arrives (the winner of the test), or wait until I get the still very hard to find 690 and three new monitors. The problem is that I don't currently have the desk space for three 42" monitors, and the desks I like take a couple of months to be delivered once ordered, on top of the fact that I am leaving for France at the beginning of December, so if I want to do any PC gaming before the spring (when I return home) I'll have to do it on one screen.

Vlad Savov wrote:CAVEAT EMPTOR
The new generation of multi-monitor-gaming-on-a-single-card GPUs is indeed impressive, but if you want to make that $499 (and above) leap, bear in mind that you'll have to upgrade more than just the card itself. You’ll likely need new monitors and don't assume you'll be able to get away with the same old bureau, either. Multi-display gaming requires a space closer to a dining table than a desk. Monitors need to be identical in terms of screen size and resolution — ideally identical, full stop, so that bezel gaps and position will match — and have the thinnest possible bezels to maximize immersiveness.


Anyway, the Verge/Polygon just jumped up quite a few notches in my estimation, and I'll be clicking around their site to see if I can find anything else that's good on there. In the meantime, share your graphics card knowledge here if you have any news. It currently seems to me that the 690 is the king of the hill, if you can afford $1000 for it and can find one. Two of those seems to be the way to go for ultimate triple-screen gaming -- though if you can't afford them a single one might do. One step down still is the 680 at $500, which can also power a 3-screen setup in a pinch, at least according to the Polygon fags. So there you have it.

Post by icycalm » 22 Jan 2013 20:39

Looks like Big Kepler may finally be coming:

http://www.legitreviews.com/news/14972/

Mike S. wrote:NVIDIA's current flagship graphics card, the GTX 680, is based on the Kepler GPU architecture. However, being the GK104 variant, it's not the top model, or "Big Kepler" as it's been dubbed. That chip already found its way into the super-expensive K20 Tesla HPC card late last year, with a huge 18,700 of those cards finding their way into the Cray Titan supercomputer! However, so far, no GeForce card based on this GPU has been announced, and crucially, there was no announcement from NVIDIA at CES 2013 earlier this month, where they unveiled their Project SHIELD portable gaming platform running on Android.

Now, rumors (have a pinch of salt at the ready) have leaked out that a variant of this chip is finally to be at the heart of a mega-powerful single-chip graphics card. This will be the GK110 GPU, but unfortunately not the full version, since it will only have 14 SMX units, rather than the full 15 present in the K20 card. This means that it will have 2688 CUDA cores rather than the 2880 of the uncut K20 variant. The GPU will be clocked at a conservative 732MHz, and it will feature a massive 6GB of GDDR5 memory with an effective memory clock of 5200MHz attached via a 384-bit bus. Performance is reported to be around 85% that of a GTX 690, and the card will sell for about $900. Finally, the card is expected to be called the "GeForce Titan", fitting in nicely with the name of Cray's supercomputer.


Sucks it still won't be the full thing, but the 6GB RAM, if true, should utterly fix the 680/690's bottleneck at higher resolutions, so sticking 4 of these in a PC should show drastic improvement. Nvidia has finally learned this goddamn lesson.

Post by icycalm » 22 Feb 2013 16:27

Reviews of Titan are in, and I quickly skimmed through these ones:

NVIDIA's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute
http://www.anandtech.com/show/6760/nvid ... n-part-1/4

NVIDIA’s GeForce GTX Titan Review, Part 2: Titan's Performance Unveiled
http://www.anandtech.com/show/6774/nvid ... e-unveiled

NVIDIA GeForce GTX TITAN SLI & Tri-SLI
http://www.techpowerup.com/reviews/NVID ... Titan_SLI/

[Image: TitanNew.jpg]


As predicted, it's significantly faster than the 680, but significantly slower than the 690. Its main advantages are that you can SLI up to four of them, while with the 690 you can only SLI two, and the huge 6GB of RAM, which does away with all hi-res bottleneck issues. However, you'll need your PC to be cutting-edge in all other areas (i.e. CPU and bus speed) to avoid any other bottlenecks and achieve an SLI configuration's max potential, and since mine isn't, I have decided to stick with my one 690, and perhaps throw in another one if its price falls a bit at some point due to the Titan's release. So I'll skip this card and wait for the 790, whenever that ends up hitting.

Still, tri-SLI Titan seems to be the only way to play Far Cry 3 smoothly on 3 1080p screens, so it looks like I'll be putting off that game for at least a few more months until the new cards hit.

http://www.techpowerup.com/reviews/NVID ... LI/12.html

[Image: farcry3_5760_1080.gif]

Post by icycalm » 02 Mar 2013 16:34

Now that's what I am talking about:

[Image: titans_3970x.jpg]


http://www.neogaf.com/forum/showthread. ... st48700149

Though if he's spending $4,000 on video cards just to watch them on those 3 shitty monitors he must be an imbecile. Probably has another screen or projector setup too, though.

It's the 1 Percent guy, by the way, and he says he'll have benchmarks up at some point.

Post by icycalm » 02 Mar 2013 21:22

Scalpers are selling them for around $1,400 right now on Amazon:

http://www.amazon.com/s/ref=nb_sb_noss_ ... idia+titan

The 690 is still steady at $1,000, unfortunately. I am getting a second one asap if wide availability of the Titan ends up bringing it down significantly.

Post by movie » 24 Apr 2013 01:28

http://wccftech.com/nvidia-geforce-700- ... -ti-gk104/

It has been reported by Brightsideofnews that NVIDIA would be launching their GeForce 700 series, including the flagship GeForce GTX 780, as soon as May 2013. According to the information from their Asia-based sources, the company is planning to launch three new GeForce 700 series SKUs which will feature a refined version of the 28nm Kepler architecture.

The most interesting part of the report is that we have already been hearing about the GeForce GTX Titan LE for the past few months. The card, a consumer version of the already released Tesla K20C, would not be another Titan GPU but rather the next flagship GeForce GTX 780 graphics card featuring the same specifications. The GeForce GTX 780 would be powered by the GK110 core, boasting 2496 CUDA cores and 5 GB of GDDR5 VRAM, as indicated in a leaked picture of the board itself.

The leak could now be considered credible, since it was mentioned that Titan LE was just an internal codename; the retail name would be something else, with a different cooling scheme compared to the Titan. It was speculated that the Titan LE, now known as the GTX 780, would be priced around the $499 – $599 mark.

[Image: p2r80JM[1].png]

The other two boards, the GeForce GTX 770 and GeForce GTX 760 Ti, would be based on the GK104-425, featuring higher clock speeds and much better memory interfaces. The GeForce GTX 770, for instance, is said to feature 4GB of GDDR5 memory, 1536 cores (same as a GTX 680) and around 20-25% better performance than the GeForce GTX 670. The GeForce 760 Ti, on the other hand, would also be based on the GK104-425 core architecture and would get 2 GB of VRAM with a 256-bit interface and higher clock speeds. The card would be around 20-23% faster than the GeForce GTX 660 Ti and would give a hard time to the Radeon HD 7800 series GPUs.

[Image: NVIDIA-GPU-Roadmap[1].png]

movie
 
Joined: 28 Nov 2009 11:54

Post by icycalm » 31 May 2013 02:42

The 780 is out and apparently widely available at around $650 to $700. Here's a review:

http://www.anandtech.com/show/6973/nvid ... 780-review

[Image: GTX780_678x452.jpg]


I am severely disappointed by it. Having not read much about it prior to this, I was under the impression it would be based on entirely new technology, but it's simply a gimped, cheaper version of the Titan. To make a long story short, this is how things stand:

If you are only going to have one card, the 690 is the card to go for at $1,000.

If you are only going to have two cards, 2 690s is what you want for $2,000 (this is what I got, because my current motherboard/processor/cooling setup does not allow for a third card without running into throttling and cooling issues).

If you are going to have either three or four cards, you'll want 3 or 4 Titans at $3,000 or $4,000.

The only reason to purchase a 780 is if you are going for a single card but don't have the extra $350 to spend on the 690. The 780 is essentially a budget solution, and that's why it's so annoying to see all those sites trumpeting it as "the new top-end single-GPU card" or some shit (though the review I linked, at least, is very well written and doesn't make that mistake).

So basically, I am sticking with my current setup until the 800 family rolls around, whenever that may be, at which point I'll need a whole new system to support it.

Post by icycalm » 26 Sep 2013 22:33

AMD's new GPU lineup was unveiled in Hawaii yesterday. Of particular interest are the following:

http://www.pcper.com/news/General-Tech/ ... -R7-Series

[Image: amd-gpu14-06.png]


Scott Michaud wrote:Not a lot is known about the top end, R9 290X, except that it will be the first gaming GPU to cross 5 TeraFLOPs of compute performance. To put that in perspective, the GeForce Titan has a theoretical maximum of 4.5 TeraFLOPs.


http://www.pcper.com/news/General-Tech/ ... -4K-Autumn

Scott Michaud wrote:The "Hawaii" powered Radeon R9 290 and R9 290X graphics cards are expected to handle CrossFire pacing acceptably at launch. Clearly, if there is ever a time to fix the problem, it would be in new hardware. Still, this is good news for interested customers; if all goes to plan, you are likely going to have a good experience out of the box.

Current owners of GCN-based video cards, along with potential buyers of the R9 280X and lower upcoming cards, will apparently need to wait for AMD to release a driver to fix these issues. However, this driver is not far off: Koduri, unclear whether on or off the record, intends for an autumn release. This driver is expected to cover frame pacing issues for CrossFire, Eyefinity, and 4K.

Koduri does believe the CrossFire issues were unfortunate and expresses a desire to fix the issue for his customers.


No launch dates or prices have been announced yet. I am looking forward to those, mostly from an academic perspective, because I don't plan to buy any new GPUs before the 20nm generation arrives.

Post by El Chaos » 29 Mar 2014 22:08

Nvidia announces $3,000 Titan Z graphics card, two Titan Black GPUs in a single, monstrous product: http://www.eurogamer.net/articles/digit ... es-titan-z
El Chaos
Insomnia Staff
 
Joined: 26 Jan 2009 20:34
Location: Buenos Aires, Argentina

Post by icycalm » 29 Mar 2014 23:48

I saw that. The thing is it costs as much as three Titan Blacks, and will also apparently be considerably underclocked compared to them, so if I had the cash to spend I would simply buy three Titan Blacks. I guess this is the long-awaited "790", but I just don't see the value in it for the power gamer.

Maybe if the buyer's motherboard doesn't have enough x16 PCIe slots available, I can see this board's usefulness. That's why I got two 690s instead of four 680s, too. But if you'll be spending this amount of money on graphics, a cutting-edge motherboard is chump change in comparison. Then again, upgrading the motherboard sometimes entails upgrading the CPU and memory too... which again is why I stuck with two 690s instead of four 680s, but really, the $1,000 you'll be saving by buying vanilla Titan Blacks should almost cover that. And if you buy four Blacks instead of two Titan Zs, you will be saving $2,000... which is practically enough for the best processor/motherboard/memory combination out there.
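
[Editor's note: the cost comparison above can be sanity-checked with a throwaway sketch, using only the list prices quoted in this thread (Titan Z at $3,000, Titan Black at $1,000; both are the posters' figures, not independently verified):]

```python
# Prices as quoted in the thread (USD).
TITAN_Z_PRICE = 3000
TITAN_BLACK_PRICE = 1000

# One Titan Z costs as much as three Titan Blacks.
assert TITAN_Z_PRICE == 3 * TITAN_BLACK_PRICE

# Four Blacks instead of two Titan Zs: same four-GPU count, $2,000 cheaper.
two_titan_zs = 2 * TITAN_Z_PRICE      # 6000
four_blacks = 4 * TITAN_BLACK_PRICE   # 4000
savings = two_titan_zs - four_blacks
print(savings)  # 2000
```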

Conclusion: they priced this for compute users, not for gamers. Even if you are the richest gamer ever, this is a bad option, in my view. The Titan Black remains the best card on the market for now.

Post by icycalm » 30 Mar 2014 00:06

http://www.anandtech.com/show/7897/nvid ... 0-for-3000

[Image: GTX_Titan_Z_678x452.jpg]


AnandTech gives a core clock speed of "700MHz?" and a boost clock speed of "?" for the Titan Z, compared to 889 and 980 for the regular Titan Black. It just wouldn't sit well with me to spend 50% more for something that performs slower, even if it's only a little slower. The 690 was also a little slower than two 680s, but at least it cost the same as them.

Post by icycalm » 03 Apr 2014 17:09

Just noticed this:

http://www.pcper.com/news/General-Tech/ ... phics-Card

Tim Verry wrote:the GTX TITAN Z is a triple slot graphics card


One more serious negative.

