Help Selecting a Video Card

I’m considering an eGPU for my 2018 Razer Blade 15 Advanced Model. Razer makes a nice enclosure that plugs into my Thunderbolt 3 port and provides a means to run a full-sized graphics card.

I’ve got both a nice 27” Samsung 1280P display and a 28” Asus 4K gaming display. I’d love to be able to take advantage of 4K.

I know the sim is very CPU dependent, but I’ve got what I’ve got and can’t upgrade what’s in the laptop. What I don’t know is how to pair what I have with an appropriate video card.

Here are my specs:

Processor: 8th Gen Intel® Core™ i7-8750H processor, 6 Cores / 12 Threads, 2.2GHz / 4.1GHz (Base / Max Turbo), 9MB Cache

Chipset: Mobile Intel® HM370 Chipset

Graphics: NVIDIA® GeForce® GTX 1060 Max-Q (6GB GDDR5 VRAM, Optimus™ Technology)

Memory: 32GB dual-channel DDR4-2667MHz

Storage: 512GB SSD (NVMe PCIe 3.0 x4)

If it were me, I’d match the throughput of the Thunderbolt 3 link with the video card. But what kind of power can the enclosure supply to the video card? It would be a shame to buy a card only to find out that the enclosure can’t supply the wattage the card needs.

What does that entail? I’m not sure how I’d go about doing that.

700 W power supply, 100 W of which is available to power the laptop, apparently, via Thunderbolt. It’s also got 4 USB ports, so a bit of power goes to whatever you’ve plugged into those.

Okay, I just looked up the speed of Thunderbolt 3 and it’s 40 Gbps. So you’re not going to want to pick a video card that exceeds that value for throughput.
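
For a rough sense of scale, here’s a back-of-the-envelope sketch using nominal link rates (real-world throughput is lower, and my understanding is that these enclosures tunnel roughly a PCIe 3.0 x4 link over that 40 Gbps connection):

```python
# Rough bandwidth comparison using nominal link rates (real-world throughput is lower).
TB3_LINK_GBPS = 40       # Thunderbolt 3 total link rate, gigabits per second
PCIE3_LANE_GBPS = 8      # PCIe 3.0 raw rate per lane, gigabits per second

tb3_gb_per_s = TB3_LINK_GBPS / 8                # ~5 GB/s across the whole TB3 link
pcie3_x4_gb_per_s = PCIE3_LANE_GBPS * 4 / 8     # ~4 GB/s, roughly what eGPU enclosures tunnel
pcie3_x16_gb_per_s = PCIE3_LANE_GBPS * 16 / 8   # ~16 GB/s, what the same card gets in a desktop slot

print(f"Thunderbolt 3 link: ~{tb3_gb_per_s:.0f} GB/s")
print(f"PCIe 3.0 x4:        ~{pcie3_x4_gb_per_s:.0f} GB/s")
print(f"PCIe 3.0 x16:       ~{pcie3_x16_gb_per_s:.0f} GB/s")
```

So the bottleneck isn’t really the card’s own spec; it’s that any card sits behind a link much narrower than a desktop slot.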

So in theory, what you’re saying is you have 600 watts extra, give or take, so the card has to be in the realm of not drawing any more than 600 watts. Of course, I don’t know how this would translate to the real world.
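
The arithmetic is simple enough to sanity-check (a sketch using the figures quoted above; the USB allotment is a guess on my part, and the enclosure may cap GPU power lower than the raw PSU headroom):

```python
# Rough power budget using the figures quoted above (assumed, not measured).
psu_watts = 700            # enclosure PSU rating
laptop_charge_watts = 100  # delivered to the laptop over Thunderbolt
usb_budget_watts = 20      # rough guess for the 4 USB ports; actual allotment unknown

gpu_headroom_watts = psu_watts - laptop_charge_watts - usb_budget_watts
print(f"GPU headroom on paper: ~{gpu_headroom_watts} W")  # ~580 W, before any enclosure-imposed cap
```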

If the Razer enclosure people have a web site, go to the tech support page; they should have one of those silly online chat reps that may or may not be a real person. I would ask them if they have any idea what it’s capable of in reality. A sales droid might lie to you, but a tech could give you a better idea of whether what you’re trying to accomplish is a foolhardy endeavor.

Well, Nvidia says their latest 4000-series cards will draw up to 600 W of board power. (Not that I’m even considering one of these, but for the sake of specifications.)

I’d certainly think anything in the 3000 series would be in the clear.

But that’s only if you can actually get that 600 watts out of the enclosure. There might be something that prevents this.

If I bought a 2080 or 3080, I would definitely upgrade my PSU (it’s a 750 watt unit) for one of those cards.

Looks like “GPU Max Power Support 500w” from their detailed technical specs page.

I’ve been trying to search for throughput data on the various cards, but am coming up short.

Well, I just looked at the pictures of the thing (I don’t know if it was a Chroma or the lesser model) and they had an RTX 2080 Ti running in it.

So now all you have to do is figure out which one is pictured and that should give you a clue.

I did think it was drawing power from the laptop, but I see it has its own PSU. So yeah, as long as it doesn’t pass the 500 watt mark, you should be good to go.

Razer Core X - Thunderbolt™ 3 eGPU

GeForce RTX 2080:

CUDA cores: 2,944
Clock speed: 1515MHz base, 1710MHz boost, 1800MHz OC Founders Edition
Memory capacity: 8GB GDDR6
Memory bus: 256-bit
Memory bandwidth: 448 GB/s
Ports: VirtualLink/USB-C, DisplayPort 1.4, HDMI 2.0b
Power: One 6-pin, one 8-pin
Release date: September 20, 2018
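
If it helps, that 448 GB/s memory bandwidth figure falls straight out of the bus width and the memory data rate (a quick sketch, assuming the 2080’s 14 Gbps GDDR6):

```python
# Derive the RTX 2080's memory bandwidth from bus width and per-pin data rate.
bus_width_bits = 256       # memory bus width
gddr6_gbps_per_pin = 14    # effective GDDR6 data rate per pin on the RTX 2080

bandwidth_gb_per_s = bus_width_bits * gddr6_gbps_per_pin / 8
print(f"Memory bandwidth: {bandwidth_gb_per_s:.0f} GB/s")  # 448 GB/s
```

Note that’s the card’s on-board memory bandwidth, not the Thunderbolt link throughput discussed above.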

Yeah, I was going to get the Chroma version with the larger supply and the USB ports.

So, getting back to a card… how do I determine the throughput on them? (We’re talking about bus throughput, no?) I’m looking at Nvidia’s site, but they give no such spec for that.

For power, I see the 3090 Ti is a 450 W draw and the 3090 and 3080 Ti are 350 W, for example.
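
Purely as a sanity check against that 500 W figure from Razer’s spec page (these are the reference board-power numbers; partner cards can draw more):

```python
# Compare published reference board power against the enclosure's stated 500 W GPU limit.
ENCLOSURE_GPU_LIMIT_W = 500       # from the Core X Chroma spec page, per this thread

reference_board_power_w = {       # Nvidia reference figures; AIB cards may exceed these
    "RTX 3080": 320,
    "RTX 3080 Ti": 350,
    "RTX 3090": 350,
    "RTX 3090 Ti": 450,
}

for card, watts in reference_board_power_w.items():
    verdict = "fits" if watts <= ENCLOSURE_GPU_LIMIT_W else "exceeds limit"
    print(f"{card}: {watts} W -> {verdict} ({ENCLOSURE_GPU_LIMIT_W - watts} W headroom)")
```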

Yeah, I can’t find the throughput number either. I just looked; you can try digging here:

They have an extensive database of card specs; maybe you can find something relevant there. I believe they have a forum as well, and someone in the forums could answer the question; they’re all geeks over there ;p

The fact that they show a 2080 Ti in one of these would lead me to believe that it should work, but it’s not my $600-800 that’s getting spent.

I hope this video helps you:

27:19 - Recommended High End (4k, VR) PC Configuration

This just in: I looked up your card and it will do 4K as it sits, without all the hubbub. On my 1080 Ti, when the game/network isn’t griefing me, I can get 20 fps in MSFS on ultra (sometimes better). So if you have a 4K HDMI cable (a good one, it makes a difference) and a 4K monitor or TV, you should be able to try 4K before you even invest a ton of money in this project.

Who knows, it might be good enough if you’re not an FPS chaser. Then you could save for a ‘real’ desktop computer and still enjoy 4K in the meantime. The way prices are falling, I’ve seen whole 3080 machines going for $1,200 US from time to time at the local Micro Center.

I can definitely drive my 4K display, but performance with MSFS is pretty abysmal. I didn’t try choking down the settings to medium or low-end, because what’s the point of doing that with 4K?

It just manages 30 FPS with the built-in 1080p display (set to 60Hz rather than 144Hz) with Vsync at 50% on the default high-end settings. It suffers at larger airports and with more complex planes, although that seems to subside once in the air.

I run mine all on ultra. I have the post-processing features turned off, not because they lag but because I think they’re just bad. At 20 fps the game runs smooth with no stutters, though since this SU10 fiasco I’ve been seeing the low fps slowly climb as I get into the air ;p

If I look at the devmode graph, the CPU runs at 40% and the GPU runs at 98-100%, and it only bounces into the red on mainthread / GPU limited every 30 seconds or so. But I don’t chase FPS, and for me 20 fps is more than adequate, unlike what a lot of other people on these boards think ;p

Remember, it’s a flight sim and not a shooter; what matters is a smooth display.

I may plug it back into the 4K display here and mess about with these latest changes and DX12 with this Studio driver, just for shiggles.