
Graphics cards

Moderator: JC Denton

Post by icycalm » 21 Aug 2018 11:36

Two opposed views:

https://www.neogaf.com/threads/nvidia-r ... -253401496

longdi wrote:Considering that the RTX 2080 Ti is almost the full Quadro GPU, with tensor and RT cores, that is a beast. 700mm² in size.

AMD doesn't seem to have engineered anything close to this, as far as leaks go. Doesn't bode well for next gen console gaming.


vs.

https://www.neogaf.com/threads/nvidia-r ... -253401651

Dontero wrote:I don't think AMD cares what Nvidia does at this point. They basically hold 90% of the gaming market in their hands, and whatever custom silicon they come up with has the possibility of really fucking hurting Nvidia in the gaming sphere.

Nvidia really made a fucking huge mistake when they lost both major consoles to AMD.

Imagine for a second that AMD gets their own PhysX and dedicates part of the silicon to it. Since they control both consoles, it means 90% of developers in the world will make their games use it fully. Which in turn means that almost every PC game later will look like shit on Nvidia GPUs because they can't run it, and some will not even be playable on Nvidia, because developers chose physics as one of the main designs behind the game or various gameplay elements rely on it.

That is why I said Nvidia made a huge fucking mistake: the company that has both best-selling consoles will decide what games will use or not. Any custom silicon that a manufacturer puts into consoles will instantly become the norm among game creators. If Nvidia held both consoles, we could have the ray-tracing additions of RTX as the standard for next gen.

We just don't know what AMD is cooking yet. IMHO it will be physics.
User avatar
icycalm
Hyperborean
 
Joined: 28 Mar 2006 00:08
Location: Tenerife, Canary Islands

Post by recoil » 05 Mar 2019 14:35

https://www.tomshardware.com/news/nvidi ... 38730.html

Matthew Connatser wrote:Nvidia and AMD GPU Oversupply to Last Until Summer - Report


While we've known for some time that the crypto bust left AMD and Nvidia with an excess of GPUs, the Jon Peddie Research group says this will continue for almost half a year (into this summer).

Jon Peddie Research is a third-party analyst firm that estimates GPU trends like market share between AMD, Nvidia, and Intel (which includes iGPUs, making Intel the largest graphics vendor). According to its findings, AMD and Nvidia overestimated cryptocurrency-derived demand throughout 2018. That led to such an excess supply of GPUs that the entire graphics market suffered a 2.65 percent decline in shipments going into Q4 last year. But this also includes Intel, which didn't suffer as much as AMD and Nvidia, whose sales declined 6.8 percent and 7.6 percent, respectively, from Q3 to Q4.

AMD's 2018 Q4 was still better than its 2017 Q4, and while the company predicts a poor Q1 for this year (likely due to excess GPU inventory), it seems AMD has weathered the crypto bust rather well. AMD predicts it will still grow overall in 2019, likely thanks to AMD's upcoming Zen 2 CPUs.

Meanwhile, Nvidia suffered extreme losses in Q4 gaming revenue (despite having a solid Q3) and launched its new Turing GPUs. Nvidia is also predicting a poor Q1 like AMD, but with an even sharper decline in revenue. That's also likely due to excess inventory, which seems to have impacted Nvidia much harder than AMD. AMD might be somewhat protected from the volatility because it manufactures both GPUs and CPUs, granting it the ability to fall back to one market when the other fizzles.

For consumers, however, this is all very good news. Most people shopping for new PC parts have probably noticed how low prices are for last-gen GPUs, especially in the mid-range. You can buy GTX 1060s and RX 580s comfortably within the low-$200 range, or even cheaper sometimes, and RX 570s are going for the mid-$100s during seemingly regular sales.

Although cards such as the GTX 1080 and 1080Ti are in short supply and thus command high prices, cards like the GTX 1070 and 1060, and the RX Vega 56, 580, and 570, are still in high supply and are consequently much cheaper than they were during the cryptomining craze. AMD's GPUs even come with free games, further increasing the value. Now is a great time to buy a last-gen GPU, as GPU prices could slowly increase as inventory is shuffled out this year.


Sounds like a great time to upgrade or build a new entry-level gaming PC. I bought a 1060 Ti last year for 100 USD more than the low-$200 range shown above. It's been performing well on my desktop, providing enough power to play current cutting-edge games on low settings, and it lets me use Shadow and stream at the same time, so it's not bad if you're on a tight budget.
User avatar
recoil
 
Joined: 26 Feb 2010 22:35
Location: California, USA

Post by icycalm » 21 Aug 2020 16:28

Nvidia confirms GeForce event for September 1: it looks like Ampere is on its way
https://www.techradar.com/news/nvidia-c ... on-its-way

Carly Page wrote:Company promises to showcase 'biggest breakthroughs in PC gaming since 1999'


Nvidia RTX 3090 could come with a ridiculously high price tag
https://www.techradar.com/news/nvidia-r ... -price-tag

Matt Hanson wrote:A new rumor suggests that Nvidia’s upcoming flagship Ampere graphics card, which some people are calling the Nvidia GeForce RTX 3090 (though we’re still not convinced by that name), could come with a ridiculously high price tag – of around $2,000.

The rumor comes from what appears to be an internal memo, which suggests a price of ¥13,999 in Chinese yuan, about $2,000/£1,500/AU$2,800. If true, this would make the RTX 3090 (or whatever it ends up being called) one of the most expensive consumer graphics cards ever made.

It would also mean this new flagship is almost twice the price of Nvidia’s current flagship, the RTX 2080 Ti, which launched for $1,199 (£1,099, AU$1,899).

If this rumor is true – and that’s a big ‘if’ at the moment – it means you’re going to have to shell out a huge amount of money for the most powerful RTX 3000 series GPU.
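Taking the memo's figure as ¥13,999 and assuming an exchange rate of roughly 7 yuan to the dollar (an assumption; the 2020 rate hovered around there), a quick sanity check of the quoted conversions:

```python
# Sanity check on the rumored RTX 3090 pricing quoted above.
# ASSUMPTION: ~7.0 CNY per USD, roughly the 2020 exchange rate.
rumored_cny = 13999          # figure from the leaked memo
cny_per_usd = 7.0

rumored_usd = rumored_cny / cny_per_usd
print(f"~${rumored_usd:,.0f}")                 # roughly $2,000

# Versus the RTX 2080 Ti's $1,199 launch price:
rtx_2080ti_usd = 1199
print(f"{rumored_usd / rtx_2080ti_usd:.2f}x")  # ~1.67x, i.e. "almost twice"
```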


What I am really interested in is whether Cyberpunk will be somehow playable in VR (via those third-party programs that exist), or at least with three screens/projectors. I think at 1080p this should be doable with current tech, but not 4K unless you have two 2080 Tis or something, and even then probably not with all the bells and whistles on. But if the 3000 cards arrive before November, as it looks like they will, a more surround experience of Cyberpunk at 4K and full detail might be possible. If there's one fucking game that should be in VR in 2020, it sure as hell is a game called "Cyberpunk" for crying out loud. But check this out: https://www.onmsft.com/feature/intervie ... h-and-more

Brad Stephenson wrote:We tried. We were thinking about VR but, yeah, we’re not doing anything with VR. We got the VR dev kits but...

Wasn’t a good fit?

Some things would work in VR but, I think, it’s not really viable yet. You’re not making a lot of money in VR yet.


YOU DON'T HAVE TO MAKE IT VR-EXCLUSIVE YOU DUMBASS POLISH FAGGOT.

And of course the idea never enters the interviewer's head to say this. Gotta be a genius to ask simple questions like this! For more on this, read my essay, VR Is Stalling For No Reason.

Having said/screamed that, there might be hope on the horizon.

Rumor: Cyberpunk 2077 Might Have Been Delayed To Add VR Mode
https://wccftech.com/rumor-cyberpunk-20 ... d-vr-mode/


Kai Powell wrote:According to Reddit user Server16Ark, the picture is from the recent Chinese press event for Cyberpunk 2077. Press were allowed to play Cyberpunk 2077 for four hours after finishing up the tutorial (and presumably character creation mode). In the background of the picture are a pair of KAT Walk Mini VR treadmills.


I think that, EVENTUALLY, we'll be able to play Cyberpunk in VR, at some point over the next few years. Unofficially for sure, but officially too, I am sure. The question is if we'll be able to do it at launch, when most people will play the game, and when the first playthrough will have its maximum impact on them. That's what I'll be looking out for between now and November.

Post by shubn » 02 May 2021 18:47

GeForce RTX 30 Series Performance Accelerates With Resizable BAR Support: https://www.nvidia.com/en-us/geforce/ne ... r-support/

Note that you'll probably have to update your motherboard BIOS for the option to appear.
User avatar
shubn
 
Joined: 10 Jan 2012 03:17
Location: France

Post by icycalm » 16 Jul 2021 12:44

When/if you get in the market for a high-end 30-series Nvidia card, you want to research memory cooling, as only a few of them do it well, and apparently it matters:


Nvidia GeForce RTX 3080 Ti Review: Great Performance at a High Price
https://www.tomshardware.com/news/nvidi ... -ti-review

For better or worse, Nvidia hasn't altered the card design relative to the RTX 3080 Founders Edition. The cooler mostly worked well before and ran reasonably quiet… except when the GDDR6X memory got hot and the fans had to kick up to full speed. We hoped Nvidia would use better thermal pads for the GDDR6X memory this round, especially with the addition of two more GDDR6X chips, but that doesn't appear to have happened.

In testing, games generally didn't have major problems, with memory temperatures peaking anywhere from 94C to 104C. The higher end of that range is a concern, however, as we're dealing with a brand-new card. How will it perform after a year or two of use? We've been there, countless times, and the answer will inevitably be higher temperatures and fan speeds over time.

Testing mining algorithms was a different story, with multiple algorithms pushing the GDDR6X temps to 110C, max fan speeds, and throttling of GPU clocks to try to compensate. Some future game, or game that we haven't tested, might behave in a similar fashion, but we haven't encountered any specific games that match the intensity of mining. That's the good news. The bad news is that a $10 thermal pad upgrade for the Founders Edition is still a good idea, but it would void your warranty. Nvidia should have done the 'upgrade' itself and used higher-quality pads.


We'll be looking at some third party cards in the coming days as well, and hopefully those will pay more attention to memory cooling. This has been a sore spot on RTX 3080 and 3090 cards since launch, with only about 25% of the cards we've looked at using better GDDR6X cooling methods. Considering the extreme nature of these parts, we expect manufacturers to pay more attention to this important aspect of cooling going forward.

Post by icycalm » 18 Aug 2021 21:55

Intel Arc Announced. High Performance Graphics. DX12 Ultimate. Hardware Ray Tracing. AI Super Sampling. Launches Q1 2022.
https://www.neogaf.com/threads/intel-ar ... 2.1615783/

I skimmed the thread. Some funny takes in there. It seems like a mid-power option, so I am not interested in investigating further, but I thought it was worth reporting here. Maybe eventually they'll move into premium stuff.

Post by icycalm » 15 Nov 2021 10:18

NVIDIA GeForce RTX 4090: 5nm GPU, 24GB GDDR6X, 3x faster than RTX 3090
https://www.tweaktown.com/news/82748/nv ... index.html

3x faster means that triple-screen 4K gaming becomes viable. I mean viable even for the most demanding games; it is already viable for less demanding ones if you have a decent card, e.g. there are videos of 3x4K Elite Dangerous on YT.

8K cutting-edge gaming also becomes viable.

Also 8K VR headsets.

Three 4K TVs are becoming quite affordable. They are about €1,200 right now, and may drop sub-€1,000 next year. One 8K TV should drop to around €1,500. I haven’t been keeping up with VR headsets but last time I checked I remember an 8K one for about €2,500?
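For a rough sense of the pixel counts behind these claims (simple arithmetic, nothing card-specific):

```python
# Pixel counts for the display setups discussed above.
single_4k = 3840 * 2160        # 8,294,400 pixels
triple_4k = 3 * single_4k      # 24,883,200 pixels
single_8k = 7680 * 4320        # 33,177,600 pixels

# A card "3x faster" than one that drives a single 4K screen
# covers triple-4K at the same frame rate; 8K needs 4x the pixels.
print(triple_4k / single_4k)   # 3.0
print(single_8k / single_4k)   # 4.0
```

So "3x faster" maps neatly onto triple-4K, while full-detail 8K is an even taller order at 4x the pixels of a single 4K screen.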

Post by icycalm » 23 May 2022 15:18

NVIDIA's RTX 4090 Could Be Arriving as Early as July
https://hypebeast.com/2022/5/nvidia-rtx ... ase-rumors

https://twitter.com/3DCenter_org/status ... 8730454016

3DCenter.org @3DCenter_org wrote:Interesting: GeForce RTX 4090 with "only" 126 SM out of 144 physically available SM (87.5%). nVidia thus still has a lot of room for a GeForce RTX 4090 Ti - or an "Ada Titan".
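The tweet's 87.5% is just the ratio of enabled to physically present SMs:

```python
# Enabled vs. physical Streaming Multiprocessors (SMs), per the tweet above.
enabled_sm = 126
physical_sm = 144

print(f"{enabled_sm / physical_sm:.1%}")   # 87.5%
print(physical_sm - enabled_sm)            # 18 SMs of headroom for a Ti/Titan
```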

Post by icycalm » 30 Jan 2023 13:08

NVIDIA's RTX 4080 Problem: They're Not Selling & MSRP Doesn't Exist
https://www.youtube.com/watch?v=OCJYDJXDRHw


Gamers Nexus wrote:The NVIDIA RTX 4080 has a problem: No one is buying it -- or at least, not enough people that board partners are feeling relief from their investment. They're on the hook for millions upon millions of dollars in inventory, and it's because NVIDIA's pricing structure is so egregiously bad that sales have finally slowed. NVIDIA got comfortable with a market where literally anything would sell, no exaggeration, and just existing would guarantee a sale. But that market is over, and now NVIDIA has an RTX 4080 problem. The other issue is that MSRP cards don't exist -- partners launched some, then stopped stocking them.


I still don't see any price advantage from this...

RTX 5090 Rumors

Post by icycalm » 10 Nov 2023 20:54

The Nvidia RTX 5090 is rumored to be nearly twice as fast as RTX 4090, so we should just call it the Titan RTX at this point
https://www.techradar.com/computing/gpu ... this-point

John Loeffler wrote:The Nvidia RTX 5090, which is expected to lead the Nvidia 5000-series launch lineup in 2024, is rumored to be about 70% faster than its predecessor—currently the best graphics card for performance on the consumer market—thanks to some major spec upgrades over the 4090, including significantly more CUDA cores, faster clocks, and wider memory bandwidth.


This sounds like the card that will deliver the "8K or triple-4K" Supreme Commander standard that I have set in the Star Citizen Cult Spec, or ultimately 4K-level VR. Of course after CIG applies enough performance improvements so that the game is no longer CPU-bound. The other major application, for me at least, would be triple-4K Cyberpunk 2077.

Note that it doesn't look like there'll be a 4090 Ti, even though there's room for it on the chip/die/whatever it is. So 5090 might be the next upgrade we get. Hopefully my cryptos will be doing well by then and I'll be able to afford it.
