

Graphics cards

Moderator: JC Denton

Unread postby icycalm » 21 Aug 2018 11:36

Two opposed views: ... -253401496

longdi wrote:Considering that the RTX 2080 Ti is almost the full Quadro GPU, with tensor, RT, and other specialized cores, that is a beast: 700mm2 in size.

AMD doesn't seem to have engineered anything close to this, as far as leaks go. Doesn't bode well for next gen console gaming.

vs. ... -253401651

Dontero wrote:I don't think AMD cares what Nvidia does at this point. They basically hold 90% of the gaming market in their hands, and whatever custom silicon they come up with has the potential to really fucking hurt Nvidia in the gaming sphere.

Nvidia really made a fucking huge mistake when they lost both major consoles to AMD.

Imagine for a second that AMD gets their own PhysX and dedicates part of the silicon to it. Since they control both consoles, it means 90% of developers in the world will make their games use it completely. Which in turn means that almost every PC game later will look like shit on Nvidia GPUs because they can't run it, and some will not even be playable on Nvidia, because developers chose physics as one of the main designs behind the game, or various gameplay elements rely on it.

That is why I said Nvidia made a huge fucking mistake: the company that has both best-selling consoles will decide what games will use or not. Any custom silicon that a manufacturer puts into consoles will instantly become the norm among game creators. If Nvidia held both consoles, then we could have the raytracing additions of RTX as the standard for next gen.

We just don't know what AMD is cooking yet. IMHO it will be physics.

Unread postby recoil » 05 Mar 2019 14:35

... 38730.html

Matthew Connatser wrote:Nvidia and AMD GPU Oversupply to Last Until Summer - Report


While we've known for some time that the crypto bust left AMD and Nvidia with an excess amount of GPUs, the Jon Peddie Research group says this will continue for almost half a year (until this summer).

Jon Peddie Research is a third-party analyst firm that estimates GPU trends like market share between AMD, Nvidia, and Intel (which includes iGPUs, making Intel the largest graphics vendor). According to its findings, AMD and Nvidia overestimated cryptocurrency-derived demand throughout 2018. That led to such an excess supply of GPUs that the entire graphics market suffered a 2.65 percent decline in shipments going into Q4 last year. But this also includes Intel, which didn't suffer as much as AMD and Nvidia, whose sales declined 6.8 percent and 7.6 percent, respectively, from Q3 to Q4.

AMD's 2018 Q4 was still better than its 2017 Q4, and while the company predicts a poor Q1 for this year (likely due to excess GPU inventory), it seems AMD has weathered the crypto bust rather well. AMD predicts it will still grow overall in 2019, likely thanks to AMD's upcoming Zen 2 CPUs.

Meanwhile, Nvidia suffered extreme losses in Q4 gaming revenue (despite having a solid Q3) and launched its new Turing GPUs. Nvidia is also predicting a poor Q1 like AMD, but with an even sharper decline in revenue. That's also likely due to excess inventory, which seems to have impacted Nvidia much harder than AMD. AMD might be somewhat protected from the volatility because it manufactures both GPUs and CPUs, granting it the ability to fall back to one market when the other fizzles.

For consumers, however, this is all very good news. Most people shopping for new PC parts have probably noticed how low prices are for last-gen GPUs, especially in the mid-range. You can buy GTX 1060s and RX 580s comfortably within the low-$200 range, or even cheaper sometimes, and RX 570s are going for the mid-$100s during seemingly regular sales.

Although cards such as the GTX 1080 and 1080Ti are in short supply and thus command high prices, cards like the GTX 1070 and 1060, and the RX Vega 56, 580, and 570, are still in high supply and are consequently much cheaper than they were during the cryptomining craze. AMD's GPUs even come with free games, further increasing the value. Now is a great time to buy a last-gen GPU, as GPU prices could slowly increase as inventory is shuffled out this year.

Sounds like a great time to upgrade or build a new entry-level gaming PC. I bought a 1060 Ti last year for 100 USD more than the low-$200 range shown above. It's been performing well in my desktop, providing enough power to play current cutting-edge games on low settings, and it lets me use Shadow and stream at the same time, so it's not bad if you have a tight budget.

Unread postby icycalm » 21 Aug 2020 16:28

Nvidia confirms GeForce event for September 1: it looks like Ampere is on its way ... on-its-way

Carly Page wrote:Company promises to showcase 'biggest breakthroughs in PC gaming since 1999'

Nvidia RTX 3090 could come with a ridiculously high price tag ... -price-tag

Matt Hanson wrote:A new rumor suggests that Nvidia’s upcoming flagship Ampere graphics card, which some people are calling the Nvidia GeForce RTX 3090 (though we’re still not convinced by that name), could come with a ridiculously high price tag – of around $2,000.

The rumor comes from what appears to be an internal memo, which suggests a price of ¥13,999 in Chinese yuan, about $2,000/£1,500/AU$2,800. If true, this would make the RTX 3090 (or whatever it ends up being called) one of the most expensive consumer graphics cards ever made.

It would also mean this new flagship is almost twice the price of Nvidia’s current flagship, the RTX 2080 Ti, which launched for $1,199 (£1,099, AU$1,899).

If this rumor is true – and that’s a big ‘if’ at the moment – it means you’re going to have to shell out a huge amount of money for the most powerful RTX 3000 series GPU.

What I am really interested in is whether Cyberpunk will be somehow playable in VR (via those third-party programs that exist), or at least with three screens/projectors. I think at 1080p this should be doable with current tech, but not 4K unless you have two 2080 Tis or something, and even then probably not with all the bells and whistles on. But if the 3000 cards arrive before November, as it looks like they will, a more surround experience of Cyberpunk at 4K and full detail might be possible. If there's one fucking game that should be in VR in 2020, it sure as hell is a game called "Cyberpunk" for crying out loud. But check this out: ... h-and-more

Brad Stephenson wrote:We tried. We were thinking about VR but, yeah, we’re not doing anything with VR. We got the VR dev kits but...

Wasn’t a good fit?

Some things would work in VR but, I think, it’s not really viable yet. You’re not making a lot of money in VR yet.


And of course the idea never enters the interviewer's head to say this. Gotta be a genius to ask simple questions like this! For more on this, read my essay, VR Is Stalling For No Reason.

Having said/screamed that, there might be hope on the horizon.

Rumor: Cyberpunk 2077 Might Have Been Delayed To Add VR Mode ... d-vr-mode/


Kai Powell wrote:According to Reddit user Server16Ark, the picture is from the recent Chinese press event for Cyberpunk 2077. Press were allowed to play Cyberpunk 2077 for four hours after finishing up the tutorial (and presumably character creation mode). In the background of the picture are a pair of KAT Walk Mini VR treadmills.

I think that, EVENTUALLY, we'll be able to play Cyberpunk in VR, at some point over the next few years. Unofficially for sure, but officially too, I am sure. The question is if we'll be able to do it at launch, when most people will play the game, and when the first playthrough will have its maximum impact on them. That's what I'll be looking out for between now and November.
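The 1080p-vs-4K feasibility guess above can be roughed out with simple pixel-throughput arithmetic. A minimal Python sketch, where the 60fps target is my own assumption for illustration, not a benchmark:

```python
# Back-of-envelope pixel throughput for the display setups discussed above.
# The 60fps target is an assumption for illustration, not a measurement.

def pixels_per_second(width, height, fps):
    """Raw pixels the GPU must render per second for one display."""
    return width * height * fps

single_1080p = pixels_per_second(1920, 1080, 60)
triple_1080p = 3 * single_1080p               # three screens/projectors
single_4k    = pixels_per_second(3840, 2160, 60)

# 4K is exactly 4x the pixel load of 1080p at the same frame rate,
# and a triple-1080p surround rig sits at 3x.
print(single_4k / single_1080p)      # 4.0
print(triple_1080p / single_1080p)   # 3.0
```

Note these are lower bounds: VR adds its own overhead on top (two viewpoints plus higher refresh rates), which is why 1080p surround looks reachable on current cards while 4K with everything on does not.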

Unread postby shubn » 02 May 2021 18:47

GeForce RTX 30 Series Performance Accelerates With Resizable BAR Support: ... r-support/

Note that you'll probably have to update your motherboard BIOS for the option to appear.
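Once the BIOS option is enabled, it's worth verifying the card actually picked it up. On Linux, `lspci -vv` prints a "Physical Resizable BAR" capability block; a minimal Python sketch that checks it (the lspci excerpt below is a canned sample, not live output, and the exact sizes vary by card):

```python
def rebar_enabled(lspci_output: str) -> bool:
    """Return True if any Resizable BAR line reports a current size in GB,
    i.e. above the legacy 256MB aperture."""
    for line in lspci_output.splitlines():
        line = line.strip()
        # lspci prints lines like: "BAR 1: current size: 8GB, supported: ..."
        if line.startswith("BAR") and "current size:" in line:
            size = line.split("current size:")[1].split(",")[0].strip()
            if size.endswith("GB"):
                return True
    return False

# Canned sample of the capability block lspci -vv prints for the GPU:
sample = """\
Capabilities: [bb0 v1] Physical Resizable BAR
    BAR 0: current size: 16MB, supported: 16MB
    BAR 1: current size: 8GB, supported: 64MB 128MB 256MB 512MB 1GB 2GB 4GB 8GB
    BAR 3: current size: 32MB, supported: 32MB
"""
print(rebar_enabled(sample))  # True
```

If BAR 1 still reports 256MB after the BIOS update, the option is present but not enabled (or CSM/legacy boot is blocking it).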

Re: Graphics cards

Unread postby icycalm » 16 Jul 2021 12:44

When/if you get in the market for a high-end 30-series Nvidia card, you want to research memory cooling, as only a few of them do it well, and apparently it matters:


Nvidia GeForce RTX 3080 Ti Review: Great Performance at a High Price ... -ti-review

For better or worse, Nvidia hasn't altered the card design relative to the RTX 3080 Founders Edition. The cooler mostly worked well before and ran reasonably quiet… except when the GDDR6X memory got hot and the fans had to kick up to full speed. We hoped Nvidia would use better thermal pads for the GDDR6X memory this round, especially with the addition of two more GDDR6X chips, but that doesn't appear to have happened.

In testing, games generally didn't have major problems, with memory temperatures peaking at anywhere from 94C–104C. The higher end of that range is a concern, however, as we're dealing with a brand-new card. How will it perform after a year or two of use? We've been there, countless times, and the answer will inevitably be higher temperatures and fan speeds over time.

Testing mining algorithms was a different story, with multiple algorithms pushing the GDDR6X temps to 110C, max fan speeds, and throttling of GPU clocks to try to compensate. Some future game, or game that we haven't tested, might behave in a similar fashion, but we haven't encountered any specific games that match the intensity of mining. That's the good news. The bad news is that a $10 thermal pad upgrade for the Founders Edition is still a good idea, but it would void your warranty. Nvidia should have done the 'upgrade' itself and used higher-quality pads.

We'll be looking at some third party cards in the coming days as well, and hopefully those will pay more attention to memory cooling. This has been a sore spot on RTX 3080 and 3090 cards since launch, with only about 25% of the cards we've looked at using better GDDR6X cooling methods. Considering the extreme nature of these parts, we expect manufacturers to pay more attention to this important aspect of cooling going forward.
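Memory temperatures are worth watching yourself; tools like HWiNFO can export sensor readings to CSV. A minimal sketch that flags GDDR6X samples against the thresholds from the review above (the column name is an assumption modeled on HWiNFO's export, so check what your tool actually writes):

```python
import csv
import io

THROTTLE_C = 110  # GDDR6X throttle point seen in the mining tests above
WARN_C = 104      # top of the range seen in games

def flag_hot_samples(csv_text, column="GPU Memory Junction Temperature [C]"):
    """Yield (row_index, temp, level) for every sample at or above WARN_C."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for i, row in enumerate(reader):
        temp = float(row[column])
        if temp >= THROTTLE_C:
            yield i, temp, "throttle"
        elif temp >= WARN_C:
            yield i, temp, "warn"

# Hypothetical log excerpt (column name mimics HWiNFO's CSV export):
log = """GPU Memory Junction Temperature [C]
92
104
110
96
"""
print(list(flag_hot_samples(log)))
# [(1, 104.0, 'warn'), (2, 110.0, 'throttle')]
```

Anything sitting at the warn level in games on a new card is the situation the review describes, and a card that hits the throttle point outside of mining is a candidate for the thermal-pad swap (warranty caveats noted above).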

