Intel A770 Graphics Card (GPU) Discussion Thread

Great card, exposes limits of MSFS

Picked up a second 770LE Friday, plus 2 more 4k screens (this time 65") to add to my 100" video wall, and it turns out the sim can’t handle 16k x 2k (same as it can’t handle 8192x4320). Pity, because the Arcs don’t have an issue with it.
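
Rough numbers, for anyone curious: assuming “16k x 2k” here means four 3840-wide panels side by side, both of the failing combinations are roughly double the pixel count of the 8k x 2k setup that does work. To me that smells like a sim-side render-target limit rather than anything the cards are doing, but that’s just a guess.

```python
# Back-of-envelope pixel counts (assuming "16k x 2k" = 4 x 3840 wide, 2160 tall)
configs = {
    "8k x 2k (works)":      7680 * 2160,
    "16k x 2k (fails)":    15360 * 2160,
    "8192 x 4320 (fails)":  8192 * 4320,
}
for name, pixels in configs.items():
    print(f"{name:22s} {pixels / 1e6:5.1f} Mpixels")
# 8k x 2k (works)        16.6 Mpixels
# 16k x 2k (fails)       33.2 Mpixels
# 8192 x 4320 (fails)    35.4 Mpixels
```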

I shrunk the images to 1/4

Pictures of 8kx2k and 12kx2k. Both are awesome, but I’ll probably stick with 8k - the stuff off to the sides is just not in view.


Really not the dual Arcs’ fault. I guess nobody figures there’d be people who would try to fly at 16k.


Intel ARC Driver 4146 is out.
This is the 11th driver released to date.

I pick the .exe version and run it.

Intel® Arc™ & Iris® Xe Graphics - WHQL - Windows*?


Seems Intel is determined to blast their way into the graphics card market. Drivers are obviously a high priority.

Thanks for letting us know - been too busy to check the icons :slight_smile:

Replying to my own post instead of editing might be bad form, but it’s better to keep the original post as is and add some extra info.

  1. The problems with stretching the main display were also there with NVidia, so it’s not specific to the Arc.

  2. I guess it was a bit “ambitious” (what I really mean is “stupid me”) to expect that stretching the main window to a width of 16k would work flawlessly, since I had already run into the same issue before (see point #1). So I’ll have to create 2 more 4k display windows, one on the left 65", one on the right 65". Not really a problem, just gotta find the time. But I probably won’t go beyond 8k x 2k on the center display, because the taller you make the display, the less of a panoramic view you get - the sim “zooms in” the whole display. Even at 8k x 3k there’s a significant loss of “panavision.” And as you can see from the screenshots, the horizontal field of view at 8k is double what it is at 4k, and triple at 12k (a rough sketch of the geometry follows below). When I get a chance, I’ll reconfigure and do a post with 16k x 2k.
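
For what it’s worth, here’s a quick sketch of the geometry I think is at play. This is purely an assumption - I don’t know exactly how MSFS derives its FOV - but if the renderer holds the vertical FOV fixed and scales the horizontal FOV with the window’s aspect ratio (the usual “hor+” behavior), then extra width buys panorama while extra height just shrinks the horizontal view:

```python
import math

def horizontal_fov(vertical_fov_deg, width, height):
    """Horizontal FOV of a symmetric perspective frustum when the renderer
    keeps the vertical FOV fixed and derives the horizontal FOV from the
    window's aspect ratio ("hor+" scaling - an assumption, not MSFS fact)."""
    vfov = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(vfov / 2) * width / height))

# Same (assumed) 40-degree vertical FOV, different window shapes:
for w, h in [(3840, 2160), (7680, 2160), (7680, 3072), (11520, 2160)]:
    print(f"{w:>5} x {h}: ~{horizontal_fov(40, w, h):.0f} deg horizontal")

#  3840 x 2160:  ~66 deg   (single 4k)
#  7680 x 2160: ~105 deg   (8k x 2k - wider window, more panorama)
#  7680 x 3072:  ~85 deg   (8k x 3k - taller window, horizontal view shrinks)
# 11520 x 2160: ~125 deg   (12k x 2k)
```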

Which brings up a question - where are all the simmers bragging about their new hotness NVidia 4090s in 8k? Maybe they can’t install dual “firestarter” cards? /snark

But seriously, is it even possible to run dual 4090s on most PCs in the average home? After all, you’d need to supply power to the room lighting and screens as well as the PC, and nobody wants extension cords to outlets in other rooms all over the place.

Current power considerations:
I know that for my own setup I can go to a 13900 and still be within the room’s wall-socket power limits. The 4 50" 4k screens draw an average of 66 watts each and the 2 65" 4k screens draw an average of 115 watts each (in both cases add 20 watts if you use the built-in speakers), so the screens alone draw 494 + 20 watts, or 514 watts - only the 65" speakers count, because the sound from the 65" screens in stand-alone (not wall-mount) mode made me drop plans for a separate sound system. The Corsair HX1000’s fan only spins for a couple of seconds at start (power-saving - the fan only runs when drawing more than 400 watts), and the Arcs just aren’t working all that hard, even at 12k x 2k. Room lighting is 200 watts max (12 x 14-watt daylight LEDs - people with low vision need LOTS of light - though I normally leave the 7-bulb fixture off).

If I had to, I could tap into the 220 (power supplies work just fine on both 110 and 220, and this particular one is even more efficient at 220). And if I ever added a 3rd card, I’d need a second power supply anyway - I’ve used up every single plug on the current one. So since 4090 users would have to buy a 1500-1600 watt power supply just to have enough connectors, how many are actually running two cards unless they’re mining - in which case they’re probably running headless?
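
The arithmetic behind those numbers, spelled out. The wall-circuit figures at the end are my own assumption for a typical North American 15 A / 120 V circuit, not anything measured here:

```python
# Screen power budget, using the average draws quoted above
screens_50in = 4 * 66        # four 50" 4k panels, ~66 W each
screens_65in = 2 * 115       # two 65" 4k panels, ~115 W each
speakers     = 20            # built-in speakers on the 65" pair only
lighting     = 200           # 12 x 14 W daylight LEDs, rounded up

screens_total = screens_50in + screens_65in + speakers   # 494 + 20 = 514 W
room_total    = screens_total + lighting                 # 714 W before the PC

# Assumption: a standard 15 A / 120 V circuit is nominally 1800 W,
# ~1440 W under the usual 80% continuous-load rule.
circuit_limit = 0.8 * 15 * 120
pc_headroom   = circuit_limit - room_total               # ~726 W left for the PC

print(f"screens: {screens_total} W, room: {room_total} W, "
      f"headroom for the PC: ~{pc_headroom:.0f} W")
```

Plenty of headroom for a 13900 plus a pair of Arcs; a lot less comfortable once you start talking about dual 450-watt-class cards and their transient spikes.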

Success.
DP to HDMI cable - getting HDR10 to work on my 4K HDR/Dolby Vision TV, which has only 3 HDMI inputs.

4K and HDR worked fine via an HDMI cable.

I expected the HDR to improve with the DP output from the A770 GPU, and it did: the graphics show more detail and the colors are more vivid.

DP to HDMI Cables

1st HD 1080p → SDR
2nd DP 1.2 to HDMI 2.0 → 4K @ 30 Hz, SDR
3rd DP 1.4 to HDMI 2.0 → 4K @ 60 Hz, SDR
4th DP 1.4 to HDMI 2.1 → 4K @ 60 Hz, HDR

On the DP side, DP 1.4 is required for HDR.
On the HDMI side, HDMI 2.0 or 2.1 is required for HDR.

The third cable should have worked but it did not.
I don’t know why.
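
My guess at the “why”, for whatever it’s worth: a back-of-envelope bandwidth check, assuming the standard CTA timing for 3840x2160 @ 60 Hz (4400 x 2250 total pixels, so a 594 MHz pixel clock) and full RGB output:

```python
# Rough link-bandwidth check for 4K @ 60 Hz
pixel_clock = 4400 * 2250 * 60            # 594 MHz (assumed CTA-861 timing)

def gbps(bits_per_component, components=3):
    return pixel_clock * bits_per_component * components / 1e9

sdr_8bit  = gbps(8)     # ~14.3 Gbps - 4K60 SDR, full RGB
hdr_10bit = gbps(10)    # ~17.8 Gbps - 4K60 HDR, full RGB

HDMI_2_0 = 14.4         # Gbps usable (18 Gbps raw, 8b/10b encoding)
HDMI_2_1 = 42.6         # Gbps usable (48 Gbps FRL)
DP_1_4   = 25.9         # Gbps usable (HBR3 x 4 lanes)

print(f"4K60  8-bit RGB: {sdr_8bit:.1f} Gbps  (just fits HDMI 2.0)")
print(f"4K60 10-bit RGB: {hdr_10bit:.1f} Gbps  (over HDMI 2.0, fine on HDMI 2.1 / DP 1.4)")
```

So the DP 1.4 side of the third cable has plenty of headroom; it’s the HDMI 2.0 side that runs out of room at 10-bit full RGB. In principle 4K60 HDR can still fit on HDMI 2.0 with 4:2:2 or 4:2:0 chroma subsampling, but whether a given DP-to-HDMI converter chip falls back to that mode is up to the adapter - which may be exactly why that cable stayed SDR.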

Edit:
I should add that I opened a Trouble Report to Intel with a Case assigned.

Intel Support emailed me back saying "HDR is not possible with a DP to HDMI cable."

I replied that there are a lot of cable vendors on the Internet selling DP to HDMI cables advertised as HDR-capable.


At lower resolutions, not 4k. Bandwidth requirements are bandwidth requirements. The bargain dp2hdmi cables can’t carry the bandwidth at 4k, but will work at lower resolutions.

Then again, HDR is a wheeze. Go outside to “that room with the big blue ceiling, the super-bright lightbulb, and the roof that occasionally leaks”, and you’ll notice the human eye can’t see very bright objects and very dim objects at the same time. Sure, the eye has huge dynamic range in terms of light, but not at the same time for different parts of an area. I don’t want “the sun and clouds to really pop” at the expense of the rest of the image I’m looking at - especially at 12k or 16k. HDR on ruins most of the image.


As promised, 16k x 2k rendering on a pair of Arc 770LEs. I need to adjust the side window offset a bit, but it gives you an idea - and it’s perfectly usable. I’ll use the upper 2k on my 8k x 4k video wall for MFD and PFD displays for learners.

Screen capture downscaled (a LOT) because it would not be polite to upload a 40 MB image.

Boy, can’t wait until MSFS supports XeSS! It can only get better (though honestly, I don’t think there’s really any need for improvement, but I’ll take whatever I can get, right?).

I think this pretty much settles that the Arc is more than good enough.


I am not sure what you are saying.

For DP HDR, you must have a DP 1.4 cable per the specifications.
For HDMI HDR, you must have an HDMI 2.0 cable per the specs.

There are no bargain cables.
They are spec’d from 1080p up to 8K, and from SDR to HDR; the bandwidth increases along with the spec.
Price increases as the spec increases.

A user with a 1080p display (SDR) does not need an expensive cable.

For my purchases:
DP to HDMI, HD 1080p, SDR - $12.00 USD
DP 1.2 to HDMI 2.0, 4K @ 30 Hz - $14.00
DP 1.4 to HDMI 2.0, 4K @ 60 Hz - $17.00
DP 1.4 to HDMI 2.1, 4K @ 60 Hz - $22.00 (spec’d for 8K @ 60 Hz)

For a DP to HDMI cable, I am not sure which spec controls.
I would guess it is the lower one, i.e. the HDMI side.

But the improved detail and colors show that the DP 1.4 signal is getting through.

And the Windows HDR Calibration app behaves differently over DP than over HDMI, and the results are better with DP.

So you can get improved graphics from a DP to HDMI cable on an HDMI display, as long as the display itself has the capability.

I created an item for the wishlist:

Please vote for it!


Well, not all dp2hdmi cables are created equal. Plenty of people here have found that out the hard way.

Also, HDR ignores millions of years of evolution in animals, long before humans. While our eyes have a wide dynamic range, that range is limited by the ambient lighting. For example, a car shining its high beams at you at night makes it impossible to see anything but the car - you have to look away. Same as looking into a bright light - which is why, over 100 years ago, WW1 pilots would fly high and then come out of the sun to attack the planes below them. Pilots looking into the sun to try to spot them couldn’t, because the human eye’s dynamic range is situationally dependent.

Now on a small screen running at 1080 or 1440 it really doesn’t matter - there’s not much of an option to “look away.” But on setups like mine, I would be royally cheesed off if the side screens’ details were nearly invisible because the software decided to dim the less-bright areas to increase the “pop” of the brighter ones. That’s totally contrary to real life, and with more people running multiple 4k displays, it’s just dumb.

I should be able to look to the sides, and see a normally lit scene, same as real life.

So why waste resources (CPU, GFX, etc.) on an effect that will be irrelevant to most multi-big-screen simmers in the near future? There are already people showing off their 3 x 75" displays. And then there are cases like mine, where I had to develop the biggest display possible to help low-vision users decide what’s best for them.

Think of taking a sheet of plywood, and a second sheet. Cut off 2’ from one end of the second sheet, and stand both sheets on your desk. That’s my display area. HDR doesn’t work properly in such cases - it detracts from the true-to-life experience.

If I wanted HDR, I would enable it in the screens themselves, not MSFS software. The HDR in each screen is calibrated by the manufacturer to give optimal performance for that screen.

But I don’t because (1) it’s not how vision works in real life, and (2) it’s a waste of resources. Right now I’m running at an effective resolution of 12kx4k (center 8k, sides 4k) - I’ve posted pics of what 16kx2k looks like, and it’s spectacular enough that I’m not going to waste my time on stuff like HDR. Especially since no two companies implement the 10 different standards the same way. I’d rather just enjoy the larger work surface for regular work, and the spectacular views when flying. HDR wouldn’t really add anything to the experience, so why bother?

Now if anyone else wants to post 16k screen shots that prove different, let them. I’ve got no problem with that - except that nobody seems to have that capability right now, so it’s irrelevant.

An in-cockpit view running in “panoramic vision” using 2 Arc 770LEs. 16k x 4k, and it runs nicely. You have to imagine that the side windows are folded inwards to get the side views - in real life it’s great! Again, the image is greatly down-sized.


New Beta driver is available. 4148 Beta

Works fine for me in Windows 11 and MSFS.

This is the 12th driver.


Hello guys.
I am trying to switch from my Xbox Series S to a PC to run the sim. I have a 1440p screen and a 4k TV (which I currently run at 1440p Dolby Vision) and I’m looking at getting this (770) card first to see if it is enough of an upgrade compared to the Xbox. I have a 10-year-old laptop, so I have to buy everything for this PC.
What CPU do you recommend from the 13th-gen Intels, or maybe a Ryzen 5800X3D? I am torn between those. I mostly fly GA and gliders, so I don’t need the performance for fighter jets.

I recommend staying away from the 5800X3D, as you would be tied to the AM4 platform, which has now been superseded by AM5 and the 7xxxX3D parts.

i7-13700K (with integrated UHD Graphics 770 iGPU)
i7-13700KF (without an iGPU)

Why would I need an integrated GPU? I only have 3 screens and don’t plan to do streaming or things like that.

The i7-13700K CPU cannot compete with the 5800X3D.
You have to go to the i9-13900K to beat the 5800X3D’s performance in MSFS.
Every CPU is tied to a platform, and there will always be a newer platform in the future.
If switching to a new platform, I would recommend AM5 with a 7xxxX3D.

But why? :slight_smile:
I could buy the 5800X3D with a relatively budget AM4 motherboard now, and when I decide to upgrade the CPU I’ll upgrade the motherboard and RAM too, maybe in 5-7 years. The AM4 board and 32GB of DDR4 RAM are very cheap now; AM5, a 7xxx CPU, and DDR5 would add a lot to my current cost, and they would also be outdated 5 years from now.


I don’t actually disagree. If you think you will be happy with it for the next 5 years, then it’s not a bad approach. However, it’s not like you’re spending $2000 on a graphics card, and I personally would rather pay a little more now, knowing I won’t have to upgrade my whole system next time.
