
Hardware

Graphics Cards

Post by icycalm » 05 Jan 2010 21:51

Aaaaand this is the most exciting component of all. Basically, at this point in time, ATI seems to be ahead of NVIDIA, from what I can gather. And not only that, but their 5xxx-range cards are the only ones that currently support DirectX 11.

Out of these, the top model is the HD5970, which in various tests I've seen outperforms every other single card, and even various inferior SLI and CrossFire configurations. And two of these cards hooked up together are untouchable.

ATrHD5970_3-4_lg.png


So that's what I am going with. I'll get one of these cards to start with, and in a few months, when the price goes down a bit, I'll grab a second one. However, getting the first one will probably be a bit of a chore, since I haven't actually seen it in stock anywhere -- neither in Europe nor in the US. So I'll be looking out for it in the next few weeks. They go for about $600 in the US and 550 euros in Europe.
icycalm
Hyperborean
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Post by Adjudicator » 21 Feb 2010 03:56

icycalm, I have a suggestion for an additional video card to run alongside one or two ATI Radeon HD 5970s: an NVIDIA video card capable of running PhysX.

Here is a list of NVIDIA video cards which can support NVIDIA PhysX:

http://www.nvidia.com/object/physx_gpus.html

Which NVIDIA GeForce GPUs support PhysX?

GeForce 8-series GPUs and later (with a minimum of 32 cores and a minimum of 256MB dedicated graphics memory) support PhysX.
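Nvidia's stated requirement boils down to a simple three-way check. As a minimal sketch (the function name and the example card specs are my own illustration, not anything from Nvidia's page):

```python
# Nvidia's stated PhysX hardware floor: GeForce 8-series or later,
# at least 32 CUDA cores, and at least 256MB of dedicated VRAM.
def supports_physx(series: int, cuda_cores: int, vram_mb: int) -> bool:
    return series >= 8 and cuda_cores >= 32 and vram_mb >= 256

# Illustrative specs (from memory, so treat them as approximate):
print(supports_physx(series=8, cuda_cores=16, vram_mb=256))     # False: too few cores
print(supports_physx(series=200, cuda_cores=128, vram_mb=512))  # True: GTS 250-class card
```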


Here is a link which explains in detail about NVIDIA PhysX:

http://www.nvidia.com/object/physx_faq.html

Basically, it is a proprietary physics engine which can use appropriate hardware to accelerate physics calculations that would normally be performed by the CPU.

There are several games which make use of this proprietary technology to render complex physics effects.

The most prominent example which comes to my mind is Batman: Arkham Asylum.

I must emphasise that PhysX in this game is optional, and all it changes is the rendering of certain effects, without changing how the game is played (about the only exception I can think of is the volumetric fog effect: when PhysX is disabled, the fog is not rendered at all).

This article from HardOCP gives a detailed review of PhysX, comparing its settings in the game and their impact on performance.

I have linked to this article just to illustrate its potential and its effect on performance.

http://hardocp.com/article/2009/10/19/b ... ay_review/

This particular page shows the significant impact PhysX has on the way certain scenes are rendered:

http://hardocp.com/article/2009/10/19/b ... _review/10

However, the big issue with PhysX is its proprietary nature, which means only NVIDIA video cards or the outdated AGEIA PhysX physics processing unit (PPU) can accelerate PhysX physics.

According to this review, running PhysX on an NVIDIA video card gives better performance than using the AGEIA PPU:

http://www.firingsquad.com/hardware/phy ... ce_update/

The CPU can also run PhysX in the absence of an NVIDIA video card or the AGEIA PPU, but it incurs a very significant performance hit.

Another point to take note of is that PhysX is disabled if the driver detects any non-NVIDIA video card in the system.

http://enthusiast.hardocp.com/news/2009 ... d_present/

http://www.ngohq.com/graphic-cards/1622 ... esent.html

However, there is a "workaround" which circumvents this issue:

http://www.hardforum.com/showpost.php?p ... ostcount=1

http://www.ngohq.com/news/16560-patch-r ... esent.html

In summary, this is the absolute best configuration of video cards as of this date:

Two ATI Radeon HD 5970s in CrossFire for DirectX 11 effects and unsurpassed performance.

One GTS 250 or better NVIDIA video card dedicated to PhysX.

One concern I have is that availability of GTX 260s, GTX 275s, GTX 285s and GTX 295s could be a problem.

http://www.semiaccurate.com/2009/10/06/ ... nd-market/
Adjudicator
Joined: 12 Mar 2009 13:42
Location: Singapore

Post by ray » 02 Jun 2011 18:37

The 5970 has been outdone by the Radeon HD 6990, which is an absolute monster.
ray
Joined: 22 Feb 2009 19:33

Post by icycalm » 01 Oct 2012 15:32

Really great article on GTX 680 vs. HD 7970 by those fagots at the Verge/Polygon. They had me from the opening lines:

Vlad Savov wrote:And yet, the stalwart PC continues to defy exaggerated reports of its demise with hugely popular series like Battlefield, Crysis, Diablo, StarCraft, and Total War offering gaming experiences that simply cannot be touched by consoles.


http://www.theverge.com/2012/7/3/307619 ... rd-upgrade

I don't know how good their benchmarking process is, but the article seems, to my relatively untrained eyes at least, quite thorough and well-written. Not as thorough as the 10-page analyses the dedicated hardware sites are putting out perhaps, but good enough for a general gaming site and with a layout 100 times prettier and better. The mere fact that they are using a single page is already a huge plus, and look at all those cool pics and insets! Really a joy to behold, and they absolutely sold me on the 3-monitor setup.


Vlad Savov wrote:Triple-monitor gaming is simply amazing. A sensory revelation. You still spend the majority of the time with your eyes anchored on the middle screen, but the sense of atmosphere that comes from the two auxiliary displays is spine-tinglingly good. After a while, you may even learn to look at things with your own neck rather than the mouse. In truth, this is how any sort of visual simulation is supposed to operate. The tight horizontal field of view of a typical monitor, no matter how resplendent its color reproduction may be, is just unnatural — human beings have peripheral vision which has gone (mostly) neglected, and it’s only once you move to this sort of surround view that you’ll understand what you’ve been missing.


Before, I had thought it would be merely a nice luxury, and had not prioritized it at all, but now I realize it's nothing short of a necessity, unless the helmets mothman talks about in the other thread prove an even better solution in the short term (in the long term they will of course be better). At any rate, I've got a stack of cutting-edge PC games right here, and I'm wondering whether I should go ahead and play them on my single 42" 1080p screen when my new GTX 680 arrives (the winner of the test), or wait until I get the still very hard to find 690 and three new monitors. The problem is that I don't currently have the desk space for three 42" monitors, and the desks I like take a couple of months to be delivered once ordered, on top of which I am leaving for France at the beginning of December, so if I want to do any PC gaming before the spring (when I return home) I'll have to do it on one screen.

Vlad Savov wrote:CAVEAT EMPTOR
The new generation of multi-monitor-gaming-on-a-single-card GPUs is indeed impressive, but if you want to make that $499 (and above) leap, bear in mind that you'll have to upgrade more than just the card itself. You’ll likely need new monitors and don't assume you'll be able to get away with the same old bureau, either. Multi-display gaming requires a space closer to a dining table than a desk. Monitors need to be identical in terms of screen size and resolution — ideally identical, full stop, so that bezel gaps and position will match — and have the thinnest possible bezels to maximize immersiveness.


Anyway, the Verge/Polygon just jumped up quite a few notches in my estimation, and I'll be clicking around their site to see if I can find anything else that's good on there. In the meantime, share your graphics card knowledge here if you have any news. It currently seems to me that the 690 is the king of the hill, if you can afford $1000 for it and can find one. Two of those seems to be the way to go for ultimate triple-screen gaming -- though if you can't afford them a single one might do. One step down still is the 680 at $500, which can also power a 3-screen setup in a pinch, at least according to the Polygon fags. So there you have it.

Post by icycalm » 22 Jan 2013 20:39

Looks like Big Kepler may finally be coming:

http://www.legitreviews.com/news/14972/

Mike S. wrote:NVIDIA's current flagship graphics card, the GTX 680, is based on the Kepler GPU architecture. However, being the GK104 variant, it's not the top model, or "Big Kepler" as it's been dubbed. That chip has already found its way into the super-expensive K20 Tesla HPC card late last year, with a huge 18700 of those cards finding their way into the Cray Titan supercomputer! However, so far, no GeForce card based on this GPU has been announced and crucially, no announcement from NVIDIA at CES 2013 earlier this month where they unveiled their Project SHIELD portable gaming platform running on Android.

Now, rumors (have pinch of salt at the ready) have leaked out that a variant of this chip is to finally be at the heart of a mega-powerful single-chip graphics card. This will be the GK110 GPU, but unfortunately not the full version, since it will only have 14 SMX units, rather than the full 15 present in the K20 card. This means that it will have 2688 Cuda cores rather than the 2880 of the uncut K20 variant. The GPU will be clocked at a conservative 732MHz and it will feature a massive 6GB of GDDR5 memory with an effective memory clock of 5200MHz attached via a 384-bit bus. Performance is reported to be around 85% that of a GTX 690 and the card will sell for about $900. Finally the name is expected to be called the "GeForce Titan", fitting in nicely with the name of Cray's supercomputer.
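For what it's worth, the quoted specs can be sanity-checked with the standard formulas (FP32 throughput = cores x clock x 2 FLOPs per FMA; bandwidth = bus width x effective memory clock). A quick sketch:

```python
# Theoretical FP32 throughput: each CUDA core retires 2 FLOPs per clock (one FMA).
cuda_cores = 2688
core_clock_hz = 732e6
tflops = cuda_cores * core_clock_hz * 2 / 1e12
print(f"{tflops:.2f} TFLOPs")  # 3.94 TFLOPs

# Memory bandwidth: 384-bit bus at 5200MHz effective GDDR5.
bus_width_bytes = 384 // 8
bandwidth_gbs = bus_width_bytes * 5200e6 / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 249.6 GB/s
```

Both figures follow directly from the rumored numbers in the quote, so take them with the same pinch of salt.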


Sucks it still won't be the full thing, but the 6GB RAM, if true, should utterly fix the 680/690's bottleneck at higher resolutions, so sticking 4 of these in a PC should show drastic improvement. Nvidia has finally learned this goddamn lesson.

Post by icycalm » 22 Feb 2013 16:27

Reviews of Titan are in, and I quickly skimmed through these ones:

NVIDIA's GeForce GTX Titan, Part 1: Titan For Gaming, Titan For Compute
http://www.anandtech.com/show/6760/nvid ... n-part-1/4

NVIDIA’s GeForce GTX Titan Review, Part 2: Titan's Performance Unveiled
http://www.anandtech.com/show/6774/nvid ... e-unveiled

NVIDIA GeForce GTX TITAN SLI & Tri-SLI
http://www.techpowerup.com/reviews/NVID ... Titan_SLI/

TitanNew.jpg


As predicted, it's significantly faster than the 680, but significantly slower than the 690. Its main advantages are that you can SLI up to four of them, while with the 690 you can only SLI two, and the huge 6GB of RAM, which does away with all hi-res bottleneck issues. However, you'll need your PC to be cutting-edge in all other areas (i.e. CPU and bus speed) to avoid any other bottlenecks and achieve the SLI configurations' max potential, and since mine isn't, I have decided to stick with my one 690, and perhaps throw in another one if its price falls a bit at some point due to Titan's release. So I'll skip this card and wait for the 790, whenever that ends up hitting.

Still, tri-SLI Titan seems to be the only way to play Far Cry 3 smoothly on 3 1080p screens, so it looks like I'll be putting off that game for at least a few more months until the new cards hit.

http://www.techpowerup.com/reviews/NVID ... LI/12.html

farcry3_5760_1080.gif

Post by icycalm » 02 Mar 2013 16:34

Now that's what I am talking about:

titans_3970x.jpg


http://www.neogaf.com/forum/showthread. ... st48700149

Though if he's spending $4,000 on video cards just to play on those 3 shitty monitors, he must be an imbecile. He probably has another screen or projector setup too, though.

It's the 1 Percent guy, by the way, and he says he'll have benchmarks up at some point.

Post by icycalm » 02 Mar 2013 21:22

Scalpers are selling them for around $1,400 right now on Amazon:

http://www.amazon.com/s/ref=nb_sb_noss_ ... idia+titan

The 690 is still steady at $1,000, unfortunately. I am getting a second one asap if wide availability of the Titan ends up bringing it down significantly.

Post by movie » 24 Apr 2013 01:28

http://wccftech.com/nvidia-geforce-700- ... -ti-gk104/

It has been reported by Brightsideofnews that NVIDIA would be launching their GeForce 700 series, including the flagship GeForce GTX 780, as soon as May 2013. According to the information from their Asia-based sources, the company is planning to launch three new GeForce 700 series SKUs which will be featuring a refined version of the 28nm Kepler architecture.

The most interesting part about the report is that we have already been hearing about the GeForce GTX Titan LE for the past few months. The card, which is a consumer version of the already released Tesla K20C, would not be another Titan GPU but rather the next flagship GeForce GTX 780 graphics card featuring the same specifications. The GeForce GTX 780 would be powered by the GK110 core boasting 2496 Cuda cores and 5 GB of GDDR5 VRAM, which was indicated in a leaked picture of the board itself.

The leak could now be considered true since it was mentioned that Titan LE was just an internal codename of the card, retail name would be something else and with a different cooling scheme compared to the Titan. It was speculated that the Titan LE, now known as the GTX 780 would be priced around the $499 – $599 mark.

p2r80JM[1].png

The other two boards, the GeForce GTX 770 and GeForce GTX 760 Ti, would be based on the GK104-425, featuring higher clock speeds and much better memory interfaces. The GeForce GTX 770, for instance, is said to feature 4GB of GDDR5 memory, 1536 cores (same as a GTX 680) and around 20-25% better performance than the GeForce GTX 670. The GeForce 760 Ti, on the other hand, would also be based on the GK104-425 core architecture and would get 2 GB of VRAM with a 256-bit interface and higher clock speeds. The card would be around 20-23% faster than the GeForce GTX 660 Ti and would give a hard time to the Radeon HD 7800 series GPUs.

NVIDIA-GPU-Roadmap[1].png

movie
Joined: 28 Nov 2009 11:54

Post by icycalm » 31 May 2013 02:42

The 780 is out and apparently widely available at around $650 to $700. Here's a review:

http://www.anandtech.com/show/6973/nvid ... 780-review

GTX780_678x452.jpg


I am severely disappointed by it. Having not read much about it prior to this, I was under the impression it would be based on entirely new technology, but it's simply a gimped, cheaper version of the Titan. To make a long story short, this is how things stand:

If you are only going to have one card, the 690 is the card to go for at $1,000.

If you are only going to have two cards, 2 690s is what you want for $2,000 (this is what I got, because my current motherboard/processor/cooling setup does not allow for a third card without running into throttling and cooling issues).

If you are going to have either three or four cards, you'll want 3 or 4 Titans at $3,000 or $4,000.

The only reason to purchase a 780 is if you are going for a single card but don't have the extra $350 to spend on the 690. The 780 is essentially a budget solution, and that's why it's so annoying to see all those sites trumpeting it as "the new top-end single-GPU card" or some shit (though the review I linked, at least, is very well written and doesn't make that mistake).

So basically, I am sticking with my current setup until the 800 family rolls around, whenever that may be, at which point I'll need a whole new system to support it.

Post by icycalm » 26 Sep 2013 22:33

AMD's new GPU lineup was unveiled in Hawaii yesterday. Of particular interest are the following:

http://www.pcper.com/news/General-Tech/ ... -R7-Series

amd-gpu14-06.png


Scott Michaud wrote:Not a lot is known about the top end, R9 290X, except that it will be the first gaming GPU to cross 5 TeraFLOPs of compute performance. To put that into comparison, the GeForce Titan has a theoretical maximum of 4.5 TeraFLOPs.
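The 4.5 TFLOPs figure for the Titan quoted above is easy to reproduce with the usual cores x clock x 2 formula, assuming the Titan's 837MHz base clock (my figure, not from the article):

```python
# GeForce GTX Titan: 2688 CUDA cores at an assumed 837MHz base clock,
# 2 FP32 FLOPs per core per clock (FMA).
titan_tflops = 2688 * 837e6 * 2 / 1e12
print(f"{titan_tflops:.1f} TFLOPs")  # 4.5 TFLOPs
```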


http://www.pcper.com/news/General-Tech/ ... -4K-Autumn

Scott Michaud wrote:The "Hawaii" powered Radeon R9 290 and R9 290X graphics cards are expected to handle CrossFire pacing acceptably at launch. Clearly, if there is ever a time to fix the problem, it would be in new hardware. Still, this is good news for interested customers; if all goes to plan, you are likely going to have a good experience out of the box.

Current owners of GCN-based video cards, along with potential buyers of the R9 280X and lower upcoming cards, will apparently need to wait for AMD to release a driver to fix these issues. However, this driver is not far off: Koduri, unclear whether on or off the record, intends for an autumn release. This driver is expected to cover frame pacing issues for CrossFire, Eyefinity, and 4K.

Koduri does believe the CrossFire issues were unfortunate and expresses a desire to fix the issue for his customers.


No launch dates or prices have been announced yet. I am looking forward to those, mostly from an academic perspective, because I don't plan to buy any new GPUs before the 20nm generation arrives.

Post by El Chaos » 29 Mar 2014 22:08

Nvidia announces $3,000 Titan Z graphics card, two Titan Black GPUs in a single, monstrous product: http://www.eurogamer.net/articles/digit ... es-titan-z
El Chaos
Insomnia Staff
Joined: 26 Jan 2009 20:34
Location: Buenos Aires, Argentina

Post by icycalm » 29 Mar 2014 23:48

I saw that. The thing is it costs as much as three Titan Blacks, and will also apparently be considerably underclocked compared to them, so if I had the cash to spend I would simply buy three Titan Blacks. I guess this is the long-awaited "790", but I just don't see the value in it for the power gamer.

If the buyer's motherboard doesn't have enough x16 PCIe slots available, I can see this board's usefulness. That's why I got two 690s instead of four 680s too. But if you'll be spending this amount of money on graphics, a cutting-edge motherboard is chump change in comparison. Then again, upgrading the motherboard sometimes entails upgrading the CPU and memory too... which again is why I stuck with two 690s instead of four 680s, but really, the $1,000 you'll be saving from buying vanilla Blacks should almost cover that. And if you buy four Blacks instead of two Titan Zs, you will be saving $2,000... which is practically enough for the best processor/motherboard/memory combination out there.
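To lay the price arithmetic out explicitly (taking $1,000 as the rough street price of a Titan Black, which is my assumption):

```python
titan_z = 3000       # Nvidia's announced price (2 GPUs per card)
titan_black = 1000   # approximate street price per card (assumption)

# One Titan Z vs. two Titan Blacks (same GPU count):
print(titan_z - 2 * titan_black)      # 1000 saved by going with vanilla Blacks
# Two Titan Zs vs. four Titan Blacks:
print(2 * titan_z - 4 * titan_black)  # 2000 saved
```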

Conclusion: they priced this for compute users, not for gamers. Even if you are the richest gamer ever, this is a bad option, in my view. The Titan Black remains the best card on the market for now.

Post by icycalm » 30 Mar 2014 00:06

http://www.anandtech.com/show/7897/nvid ... 0-for-3000

GTX_Titan_Z_678x452.jpg


AnandTech gives a core clock speed of "700MHz?" and a boost clock speed of "?" for the Titan Z, compared to 889 and 980 for the regular Black. It just wouldn't sit well with me to spend 50% more for something that performs slower, even if only a little slower. The 690 was also a little slower than two 680s, but at least it cost the same as them.

Post by icycalm » 03 Apr 2014 17:09

Just noticed this:

http://www.pcper.com/news/General-Tech/ ... phics-Card

Tim Verry wrote:the GTX TITAN Z is a triple slot graphics card


One more serious negative.

Post by El Chaos » 05 Feb 2015 23:20

The Digital Foundry 2015 graphics card upgrade guide, the best GPUs for every budget, with performance testing and tuning tips: http://www.eurogamer.net/articles/digit ... rade-guide

Post by El Chaos » 26 Jun 2015 23:55

Radeon R9 390X 8GB Digital Foundry review: http://www.eurogamer.net/articles/digit ... 90x-review

Richard Leadbetter wrote:The case for the Radeon R9 390X is crystal clear: it's significantly cheaper than the GTX 980, it has twice the amount of on-board GDDR5, and following the upclock and RAM revision, it is comparable from a performance perspective - as long as you're not a 1080p gamer, where the GTX 980 clearly remains the best choice. However, if your display operates at 1440p or an even higher resolution, the performance credentials of the card bring it into line with the competition, surpassing the GTX 980 in several of our test games.

The case against is equally clear: there's little overclocking headroom left, and the 390X's power consumption is quite staggering. But let's be clear: a power-hungry card doesn't necessarily mean it's a nuisance from a noise perspective: in fact, MSI's cooler here is quite remarkable in terms of both its lack of noise and thermal dissipation. However, the fact remains that a lot of heat is being pumped into your case and you'll need to ensure that your chassis is capable of expelling it effectively - and the room will definitely get noticeably warmer as a consequence! Additionally, we find the 390X hard to recommend for multi-GPU scenarios, whereas Nvidia's GTX 970 and 980 both perform well here thanks to their enviable power efficiency, plus better driver support.

Overall, there's perhaps a sense that AMD has dragged the 290X kicking and screaming into a performance envelope comparable with GTX 980 but in the final analysis, there's no denying that AMD has achieved what it set out to do: R9 390X offers a gameplay experience similar to its Nvidia counterpart, though the decent overclocking of the GTX 980 may well give the green team final bragging rights.

But it's the practical value of the R9 390X's 8GB of GDDR5 that remains something of a mystery. Actual use-case scenarios beyond 4GB are limited right now, and the decision to kit out 390X with twice the memory of the new Fury X flagship is curious to say the least. But what's clear is that games' appetite for VRAM is only going to move in one direction, and as developers have been telling us for years now, the more GPU memory you have, the better - and in R9 390X, it's kind of reassuring just to know that it's there.


AMD's Radeon Fury X: the new leader in graphics tech? Brute force GPU power and cutting-edge RAM take the fight to Nvidia: http://www.eurogamer.net/articles/digit ... phics-tech

Richard Leadbetter wrote:If the performance numbers between the products are within a few percentage points, potential buyers may well find themselves looking beyond the numbers in choosing which GPU to purchase - do they go for the higher level of VRAM with Nvidia and its lower power consumption, or do they opt for the smaller form factor and quiet liquid cooling of the Radeon Fury? One thing's for sure, there's a fascinating face-off between the green and red teams coming up, and the fact that there absolutely is competition at the top level of GPU performance can only be a good thing for the market. We'll be reviewing the Radeon Fury X as soon as we can.

Post by El Chaos » 27 Jun 2015 23:10

Radeon R9 Fury X review: http://www.eurogamer.net/articles/digit ... y-x-review

Richard Leadbetter wrote:There have been some excellent deep dives on the make-up of the Fiji processor at the heart of the Radeon R9 Fury X - one of the best we've seen comes from The Tech Report, breaking down performance from individual components of the GPU and analysing their strengths and weaknesses. It shows that while a great many elements of the chip are best in class, others show little improvement from R9 290X - which may explain some of the results seen at lower resolutions. But the overall takeaway we have is that the Fiji hardware is a strong technological achievement - and as seen in the 4K results, where GPU capabilities are the absolute focus of attention, Fury X is competitive - it's a serious rival, and that's exactly what the market needs. The fact that AMD's new card is a good deal cheaper than current GTX 980 Ti prices (in the UK at least) is also a factor worth taking into account.

On the flip side, if you're pushing for balance between frame-rates and visual refinement, even with this new wave of uber-cards, you need to drop down from 4K to a more reasonable resolution - and the further down you drop, the more dominant GTX 980 Ti becomes, especially so when overclocked. But on a more general level, we do have to wonder whether 4K may well be something of a blind alley for PC gaming, bearing in mind that the physical size of 4K PC displays is remaining static (which is why many PC gamers are considering large 4K UHD TVs as monitor replacements). Right now, our gut feeling remains that the new wave of 34-inch 3440x1440 monitors with the super-wide 21:9 aspect ratio may well be the more natural home for the likes of the Fury X and the GTX 980 Ti - there's just 60 per cent of the resolution to drive, and there's a generally more immersive experience owing to the expanded field of view. LG, Asus and others are also incorporating FreeSync and G-Sync technologies into these panels too, making them even more compelling - we hope to review one of these displays soon.

But that's a discussion for another time. In the here and now, it's fair to say that AMD has produced an innovative, powerful piece of hardware - and if 4K gaming is your bag, it's a very serious contender. But there's the sense that the complete package hasn't quite been delivered - the hardware is there, but perhaps the accompanying software has fallen short. We already know that DX12 and Vulkan will solve AMD's API overhead issues (which may have something to do with the sub-par 1080p results), but that's going to take a while to proliferate into the games we actually play. With HBM, we are already seeing sleek, smaller form factors, but right now there's no knock-out blow in terms of gaming performance vs old-school GDDR5 VRAM. And in terms of flexibility - whether we're talking about running at sub-4K resolutions, or even support for HDMI 2.0, we're surprised to see AMD concede ground to Nvidia in any area on what is a flagship product.

In conclusion, a straight comparison of these high-end GPUs is fascinating. AMD and Nvidia have invested in innovation in completely different areas, the red team banking on the remarkable HBM, its counterpart relying more heavily on the power efficiency and performance of the second generation Maxwell architecture (and leaving its own version of HBM in reserve until next year's hardware). At 4K, these entirely different approaches have ended up yielding remarkably similar performance with accomplished levels of efficiency - but what we really need from both sides is brand new core technology, combined with smaller 16nm FinFET manufacturing, due next year. Factor in HBM and DX12 and within a couple of years, the next-gen Fury and its Titan equivalent should leave both of these current cards in the dust. In the here and now, AMD may not have bested Nvidia - but as we move away from the 28nm era, it's clear that the appetite and the technology are there for us to see some serious competition between the two.
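The "60 per cent of the resolution" figure in the quote checks out if you compare raw pixel counts:

```python
# 34-inch 21:9 ultrawide vs. 4K UHD.
ultrawide_pixels = 3440 * 1440  # 4,953,600
uhd_pixels = 3840 * 2160        # 8,294,400
print(f"{ultrawide_pixels / uhd_pixels:.0%}")  # 60%
```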

Post by El Chaos » 05 Jan 2016 18:32

The Digital Foundry 2016 graphics card upgrade guide, the best enthusiast GPU upgrades for all budgets: http://www.eurogamer.net/articles/digit ... rade-guide

Richard Leadbetter wrote:Best graphics card under £100 / $130: GeForce GTX 750 Ti (Honourable mention: Radeon R7 360)

Under £130 / $160: GeForce GTX 950 (Honourable mention: Radeon R7 370)

Under £200 / $250: Radeon R9 380/380X (Honourable mention: GeForce GTX 960)

Under £250 / $330: GeForce GTX 970 (Honourable mention: Radeon R9 390)

Under £400 / $500: GeForce GTX 980 (Honourable mention: Radeon R9 390X)

Over £400/$500: GeForce GTX 980 Ti (Honourable mention: Radeon R9 Nano)


Post by El Chaos » 05 Jul 2016 23:26

Radeon RX 480 review: http://www.eurogamer.net/articles/digit ... 480-review

Can AMD's new strategy restore its fortunes? The Radeon RX 480 offers excellent value - but what about the rest of the range? http://www.eurogamer.net/articles/digit ... am-success

Post by El Chaos » 03 Sep 2016 14:32

4K PC gaming is finally viable - and it's stunning. And also very costly - for now: http://www.eurogamer.net/articles/digit ... s-stunning

Post by icycalm » 27 Jul 2018 10:04

GTX 1180, 1170 & 1160 Release Dates Leaked!
https://www.youtube.com/watch?v=TNksuz1Qz3U

Via https://www.neogaf.com/threads/nvidia-g ... -253355085

IbizaPocholo wrote:
  • GeForce GTX 1180: 30/08
  • GeForce GTX 1170 and 1180+: 30/09
  • GeForce GTX 1160: 30/10


It's pretty awesome having a Shadow and not having to worry about the card I just "got" being superseded. Though I doubt Blade will update the Shadow with 1180s anytime soon, at least I won't have paid $1000 for my 1080 by the time they do upgrade them.

Post by icycalm » 10 Aug 2018 17:26

Nvidia Turing Possible Leak?
https://www.neogaf.com/threads/nvidia-t ... k.1464554/

gspat wrote:New cards:

Titan RTX
Geforce RTX 2080
Geforce RTX 2070
Geforce GTX 2060
Geforce GTX 2050

Emphasis on ray tracing in higher end cards.

Post by icycalm » 20 Aug 2018 17:28

nVidia RTX 2080 Ti/2080/2070/2060 Official Announcement - Gamescom Conference Live Today 6PM CEST/12PM EDT/9AM PDT
https://www.neogaf.com/threads/nvidia-r ... t.1464802/

Boss Moogle wrote:RTX 2080 Ti FP32 performance is expected to be somewhere around 13 Tflops.


Time to go back to planning that heist.
