Advice on Upgrading a Computer to an Nvidia RTX 3090?

I have a Dell XPS 8930 Special Edition. It has an 850 watt power supply. It appears to have 2 PCIe 8-pin cables supporting the existing RTX 2070 Super card that came with the machine (but only 6 pins of each of the 2 cables seem to be used for the 2070S).

I called Dell. I think I got a canned response. The support person who helped me said the case will only take up to an RTX 2080 card. I have measured the available inside dimensions, and I seem to be able to afford the extra inch of length and height of the 3090 over the 2070S. The 3090 takes 3 PCIe slots (the 2070S takes 2).

I have 3 potential problems. 1) What looks like a small capacitor sticks up a little bit just beyond the 3rd empty 1x PCIe slot the 3090 would cover. 2) How to determine if the 850 watt supply is big enough (I have a 512 GB SSD C: drive, a 2 TB 7200 rpm SATA D: drive, and a 1 TB 970 EVO Plus non-system SSD, plus a Blu-ray RE drive that I won’t be using with MSFS). And 3) there is a great mounting bracket that holds the 2070S card firmly in place, but how do I come by a bracket that would hold the 3090 in place? (The bracket for the 2070S appears to be a fixed, custom piece of plastic made specifically for the Nvidia card that locks into slots provided in the Dell case.)

I imagine that if the capacitor stands in the way, the power supply is really not big enough, and there’s no way I can get an adequate new support bracket for my case, my best bet is to see about moving my computer to a new case, motherboard(?), and power supply after my new computer warranty expires in May 2021.

Thanks for any advice on sites to go to or stuff to start reading to get started.

Are we talking Founders Edition here?

  1. Hard to say without seeing it, but I find it unlikely a mobo capacitor would interfere with the card.

  2. An 850 W supply should be sufficient in general, but it does depend on the unit - are you able to find out how the rails are arranged? I have seen people struggle with a power supply that should be adequate in terms of headline wattage because the 12 V rails are split, and others due to digital protections kicking in during spikes. BTW, an undervolt on the 3090 will also help quell the beast somewhat at very little performance cost.

  3. Again hard to comment without seeing, can you post a picture of exactly what we are looking at?

Upgrading a prebuilt like a Dell is often a bit more challenging, but usually achievable with a little planning.
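To put rough numbers on the wattage question (point 2), here is a minimal power-budget sketch. All the wattages are my assumed nominal figures, not measured values for this particular Dell, and transient spikes (especially on the 3090) can run well above board power:

```python
# Rough power-budget check for the 850 W PSU. All wattages are assumed
# nominal figures, not measurements; transient spikes (especially on
# the 3090) can be substantially higher than board power.

COMPONENT_WATTS = {
    "i9-9900K under boost": 150,   # assumption; rated TDP is 95 W
    "RTX 3090 board power": 350,
    "motherboard + RAM":     50,
    "SSDs + HDD + Blu-ray":  30,
    "fans / USB / misc":     20,
}

def headroom(psu_watts, parts):
    """Spare capacity after summing the estimated component draw."""
    return psu_watts - sum(parts.values())

total = sum(COMPONENT_WATTS.values())
print(f"estimated draw {total} W, headroom {headroom(850, COMPONENT_WATTS)} W")
# Common rule of thumb: keep sustained draw under ~80% of rated capacity.
print("within 80% rule:", total <= 0.8 * 850)
```

On these numbers the estimated draw is around 600 W against 850 W, comfortably inside the usual 80% guideline - which is why the rail arrangement and transient spikes, not headline wattage, are the real question.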


Actually, the capacitor problem appears to be solved by looking at the following 3090 card image.

The width of the card appears to end decisively right in line with the 3rd PCIe slot, as judged by looking at the insert tabs on the bottom of the mounting bracket. So the capacitor sticking up 10 to 15 mm beyond that should not be a problem and it’s so small, barely above the height of the 3rd PCIe slot, it shouldn’t restrict airflow into the 3090 fan. (BUT see Edit_Update! beyond 1st pic)

Edit_Update: I neglected to allow for the fact that the mounting bracket for a card slot is offset from the slot itself, so the capacitor beyond the 3rd PCIe slot is smack dab in the middle of where the width of the 3090 card ends. See the picture with the middle of the 3rd slot bracket and the capacitor labeled.

The fan intake for the 3090 would be very close to the Samsung 970 EVO PLUS SSD, which is the card in the slot just below the capacitor circled in red.

Here’s what the RTX 2070S mounting bracket looks like in a Dell XPS 8930 SE computer. It’s a tower computer, so without the support the card would be hanging horizontally from its PCIe slot, badly stressing the slot and its pins with its weight.

I haven’t torn my computer down enough to find the equivalent PSU label for my very own unit, but it ought to be equivalent to this image captured from an eBay page: https://www.ebay.com/itm/USED-Genuine-Dell-80-Gold-850w-Switching-Power-Supply-HU850EF-01-9XG5C-/224280229061
(the previous Dell Community image I posted was a wrong lower power unit)

Well, I can tell you I have the 3090 FE and I don’t have an issue with droop. If you’re getting this one, you should just be able to cut out the space needed if that bracket is needed for something other than the card.

A card mounted horizontally like that is a very common setup; the only real concern would be during transport. That said, it certainly doesn’t hurt to support the card either, and it should be pretty easy to either modify the bracket you have or come up with something suitable if you want to support it.

One thing I notice there, though, is that it looks like the back of the card would be right up against the PSU. The backplate of the 3090 gets extremely hot, largely due to the GDDR6X on the rear of the card, and it looks like it would get even worse in your case there - you may end up throttling on VRAM junction temps. You might be able to compensate with a slightly more aggressive GPU fan curve, especially on fan 2. The rear of the card is also blow-through, so you need to make sure the path is unobstructed.

The power supply label is for a 480 W unit, so it’s hard to draw any conclusions, as your 850 W could be a completely different design. Like I said, as a headline figure 850 W should be fine, but split rails are less than ideal, and I know of at least two cases where they caused issues on the 3090.

Yes, I initially posted a lower-power unit face plate but have replaced it with what should be the correct PSU specs from an eBay posting. Good point on the PSU being close to the bracket side of the card. The PSU is pretty compact, though, and only descends about 1.38 in (35 mm) along the side of the card. On that side of the card there is the i9-9900K on the motherboard with a very large air-cooling fan over the processor and lots of vents out the back of the computer. Otherwise, there is empty space on that upper side of the horizontal RTX 2070S card. But the Dell XPS 8930 tower case gets a lot of critical comments in the Dell Community Forum for being a cramped case with poor airflow. My RTX 2070S runs hot, 83 to 84 deg C (spec limit is 88 deg C), and that holds winter or summer, whether the indoor temperature is 15.6 deg C or 29.4 deg C (60 F or 85 F). Since the RTX 3090 temp spec limit is even higher (93 deg C for the Founders Edition) but it’s supposed to run at least 10 deg C cooler than RTX 20-series cards, I’m hopeful that if the card fits, I could get by with the case I have. But the 3090 does put out an extra ~130 watts of power (>50% more heat).
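That last figure is easy to sanity-check against the published board-power ratings (215 W for the RTX 2070 Super, 350 W for the RTX 3090 FE):

```python
# Checking the "extra ~130 W (>50% more heat)" claim using published
# board-power ratings: 215 W (RTX 2070 Super) vs 350 W (RTX 3090 FE).
# Essentially all board power ends up as heat in the case.

RTX_2070S_WATTS = 215
RTX_3090_WATTS = 350

extra_watts = RTX_3090_WATTS - RTX_2070S_WATTS
relative_increase = extra_watts / RTX_2070S_WATTS

print(f"extra heat into the case: {extra_watts} W "
      f"(about {relative_increase:.0%} more than the 2070 Super)")
```

So roughly 135 W extra, about a 63% increase - consistent with the ">50% more heat" estimate above.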

Don’t waste your money on a 3090 for this… trust me, I have a 3090 and this sim is still a stuttering mess that runs at 30 FPS, especially with airliners.

A flight sim HAS to be smooth or else it ruins the entire experience. Yes, it’s great for stuttering your way up to FL350 for that awesome screenshot…

Not so much when you’re stuttering along at 20 FPS approaching LAX.

We will be lucky if this sim runs halfway decent by 2023. My advice is to get X-Plane.

Well, I’ll certainly have to take your results into consideration, but from what samplitude and Anarchae1 described, it sounds like a 3090 would considerably up my HP Reverb G2 experience. I’m not into flying airliners or takeoffs and landings, but more spawned-in-the-air VFR scenery gawking out in the country, in the daytime, in cloudless skies, going slow, straight, and level in something like a ZLIN Shock Ultra or an Icon A5. With an RTX 2070S and an Icon A5 or ZLIN Shock Ultra, I easily hit 45 to 55 fps of very smooth flying in 2D at ~ULTRA settings. My problem is that 8 GB of VRAM is nowhere near enough for VR with the G2, and I’m hoping, based on others’ descriptions of how much of their VRAM is being consumed, that the 24 GB of a 3090 will solve that problem, which arises because of the double stereoscopic image and the resolution of the G2.

This was a thread I made where I benchmarked the differences between the 3080 and 3090. At the end of the day, the CPU will still be the limiting factor.

I glanced through your referenced thread. It didn’t seem like you had tested VR with a Reverb G2 yet. I forget which analysis I was looking at, but the person was consuming 16 GB of VRAM with a 3090 in VR. So an RTX 3080 wouldn’t do so well for that, but a 3090 would. That’s why I mentioned I’m looking for VRAM, not pure CPU or GPU processing power. I don’t know if the two separate stereoscopic images for each eye are computed entirely separately, but when 11 GB of VRAM is supposed to be the optimum for 2D ULTRA performance, you’re not going to get that far in 3D with the 10 GB at your disposal in a 3080, IMHO. An i9-9900K at 4.7 GHz is still within a few percent of the single-threaded CPU performance of some of the newer processors out there. My system memory at 32 GB of 2666 MHz RAM might be limiting, though.

Allocated VRAM is not used VRAM. What you see in most, if not all, monitoring software is allocated VRAM, not used VRAM; actual usage is generally much lower and doesn’t affect performance the way you might think.

@JALxml I’m wondering the same for my test system (an Alienware by the way with the same metal chassis inside - or close enough to yours).

My 2070S is the Gigabyte WindForce OC, which is longer than reference and won’t fit as-is. I’ve replaced the front chassis fan with a slimmer Noctua (14 mm IIRC) to free enough space to fit the card (only about 3 mm between the fan and the card!), and it is also much quieter than stock. I’ve no thermal problems with the card, but I’m also using liquid cooling on the CPU, which helps reduce the overall temperature buildup in the chassis.

As for supporting the card horizontally? I’m using the power connector cables themselves, which I’ve routed behind the PSU’s removable cable cover so that they don’t move. This helps relieve the weight on the PCIe connector alone (I don’t have the plastic mounting bracket in my setup because stock was a small 1660).


@CptLucky8 Actually, my biggest problem is likely to be my wife! Since she had no problems with my talking about upgrading to an upcoming iPhone 13(?-unlucky #?) Pro Max, hopefully, if I tell her I’d rather get an Nvidia 3090 instead of a new iPhone, she won’t get too steamed!


haha, I won’t tell you what would happen if I even mentioned to mine that I’ll add to the G2 + Index a yoke and throttles à la Honeycomb… :joy:


I think the following post is one of the ones that tilted me towards wanting a 3090 as opposed to a 3080, 5800XT or anything else. Also, 3090’ers seem to be happier reporting VR results than 3080’ers, etc. But my reading recollection may be faulty …


I will be getting the Reverb G2 shortly, so fingers crossed I can test. That said, I do still think it’s the case that it’s allocated VRAM rather than actual use. In theory you should see similar VRAM usage to using a 4K monitor since pixel count is roughly the same.
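That "pixel count is roughly the same" comparison is easy to put numbers on (native panel resolutions only; supersampling is a separate question):

```python
# Raw pixel counts: a 4K UHD monitor vs the Reverb G2's two native panels.

uhd_pixels = 3840 * 2160       # single 4K monitor
g2_pixels = 2160 * 2160 * 2    # G2: one 2160x2160 panel per eye

print(f"4K: {uhd_pixels:,} px, G2 native: {g2_pixels:,} px, "
      f"ratio {g2_pixels / uhd_pixels:.3f}x")
```

Natively the G2 pushes exactly 1.125x the pixels of a 4K monitor, so the "roughly the same" intuition holds - before any supersampling is applied.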
I got the 3090 for multiple reasons - and these could justify yours:

  • I felt the 3080’s VRAM was too low for when MSFS goes to DX12
  • I didn’t want to wait until the 3080 Ti
  • the 3090 I found was reasonably priced
  • it has some good longevity to it

If you want good performance now, I would still say the 3080 is the way to go. If you want to be future-proof for DX12, the 10 GB of VRAM isn’t enough imo.


The 3090 FE core runs very cool with even moderate fan speed due to the enormous cooler… mine can easily maintain under 60 °C at 400 W with middling fan settings. Where the thermals actually become an issue is the GDDR6X on the back, and many people observe temp-limit throttling even when the core looks fine; it recently became clear it’s the GDDR6X junction temps hitting 110 °C and throttling. It’s actually exacerbated by the cooler-running core, as the fans don’t ramp up much on the stock curve, leaving arguably inadequate cooling for the VRAM.

EKWB have actually recently announced a more complex water block that includes an active backplate element, probably due to exactly that issue.

For gaming, unless you are overclocking the memory, I’m not sure you’ll hit the throttling, but I’m not entirely convinced of the longevity at 100 °C or more, even though it doesn’t throttle until 110 °C.

Short answer on the PSU: it could be fine, but being split into three 12 V rails isn’t ideal. If you experience shutdowns or crashes during gaming, look there as a potential culprit.


I see what you mean about it being similar to 4K resolution, but you’re missing the supersampling, which taxes the VRAM more. Of course the G2 can only display 2160x2160 per eye, but the settings you need to run at to get the default supersampled resolution are somewhere in the 3400x3400 range (not in front of MSFS right now, but it’s around that kind of range). For that reason your VRAM usage shoots up compared to a non-VR panel with a similar pixel count.
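To put rough numbers on that effect: if each eye renders at ~3400x3400 instead of the native 2160x2160, the render targets alone grow by about 2.5x. The 8 bytes/pixel below (an HDR RGBA16F-style target) and the 3400x3400 figure are assumptions for illustration, and render targets are only one slice of total VRAM use:

```python
# Rough estimate of how supersampling inflates VR render-target memory.
# Assumptions: ~3400x3400 per-eye supersampled resolution and 8 bytes
# per pixel (an HDR-style RGBA16F target); real engines keep many more
# buffers than this, so treat it as a lower bound on the effect.

BYTES_PER_PIXEL = 8

def eye_buffers_mib(width, height, eyes=2):
    """Memory for one set of per-eye color buffers, in MiB."""
    return width * height * eyes * BYTES_PER_PIXEL / 2**20

native = eye_buffers_mib(2160, 2160)
supersampled = eye_buffers_mib(3400, 3400)
print(f"native: {native:.0f} MiB, supersampled: {supersampled:.0f} MiB, "
      f"{supersampled / native:.2f}x")
```

The ratio is just (3400/2160)^2 ≈ 2.48, and it applies to every resolution-dependent buffer the renderer keeps, which is why VR VRAM use climbs much faster than the displayed panel resolution alone would suggest.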

I think MSFS still uses around 70% of its allocated VRAM, which is just about what the 3080 can achieve, but since it can’t allocate what it wants, you’ll suffer a bit of a performance hit. See this older video on dedicated vs allocated VRAM:
