Could foveated rendering theoretically work in 2D / Pancake with Tobii Eye Tracking?

Pretty much what the title says. I'm not sure this is the right forum, but I believe foveated rendering (for VR at least) is only available in the SU2 beta.

Surely it isn't impossible to add some similar form of foveated rendering in the future for 2D (non-VR) use, paired with Tobii's eye-tracking feature?

While I'd still like to get back into VR (I succumbed to the Windows 11 24H2 update, so my Reverb G2 is now a paperweight), I don't want to fork out for a pricey headset, and I 'only' have an 11900K and an RTX 3080. I am, however, quite pleased with the fluidity of the SU2 beta (I play at 4K with frame generation) and wonder whether a foveated rendering feature would allow better visuals…


“The primary application of foveated rendering is in display technologies, like VR headsets and AR glasses, where resource optimization is essential.”
What is foveated rendering? - Tobii


I’m intrigued by your suggestion!

From the articles below, I assume the effectiveness of foveated rendering on a 2D screen depends on how much of the screen falls in your peripheral vision: anything outside the 70-degree central-vision field of view (FoV) can be rendered at a lower resolution.

What part of the display is covered by central vision? There's a formula for that. Note the difference between the physical FoV of your monitor and the FoV that Flight Simulator displays: the physical FoV is determined by the distance between pilot and monitor and by the monitor's width.

I sit about 1 metre from my 710 mm wide display, which at that distance gives a physical FoV of about 40 degrees. So the entire screen sits within my central vision, and any region that foveated rendering would degrade falls off the edge of the screen entirely. However, with three or more screens (as I have), the outer monitors do reach into peripheral vision, so there might be a slight performance advantage.
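The geometry above is just the angle subtended by a flat screen. Here's a quick sketch of the calculation (the 710 mm / 1 m numbers are from my setup; only screen width and viewing distance matter):

```python
import math

def physical_fov_deg(screen_width_m: float, distance_m: float) -> float:
    """Horizontal FoV subtended by a flat screen viewed head-on, in degrees."""
    return math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))

fov = physical_fov_deg(0.71, 1.0)  # 710 mm wide screen at 1 m
print(f"{fov:.1f} degrees")  # -> 39.1 degrees, well inside ~70 degree central vision
```

A triple-screen setup roughly triples the width term, which is why the outer monitors land in the peripheral zone where lower-resolution rendering would be tolerable.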

## What Does “Foveated Rendering” Mean?

Foveated rendering is a term that describes a reduction in rendering quality in the wearer’s peripheral vision. It works by tracking or predicting the position of the eye so that the portion of the scene that the wearer is looking at is prioritized for high-quality rendering.

This means that rather than rendering an entire scene at a fixed or even dynamic resolution, the rendering budget can be better spent on just the part of the image that a wearer is looking at. The very edge of the wearer’s field of view may see a reduction in resolution or other image-enhancing techniques like anti-aliasing since they aren’t in focus.
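The budget split described above can be sketched as a simple gaze-distance test. This is purely illustrative, assuming an eye tracker supplies a gaze point in screen pixels; the ring radii and quality tiers are made-up numbers, not from any real implementation:

```python
import math

def shading_rate(pixel: tuple, gaze: tuple,
                 fovea_px: float = 300, mid_px: float = 800) -> float:
    """Pick a resolution scale for a pixel based on its distance from the
    gaze point. Radii are hypothetical values for illustration only."""
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if dist <= fovea_px:
        return 1.0   # full resolution where the eye is actually looking
    if dist <= mid_px:
        return 0.5   # half resolution in the near periphery
    return 0.25      # quarter resolution at the far edge of the view

print(shading_rate((960, 540), (1000, 500)))  # near the gaze point -> 1.0
print(shading_rate((0, 0), (1920, 1080)))     # opposite corner -> 0.25
```

Real implementations work per-tile on the GPU (e.g. via variable-rate shading) rather than per-pixel on the CPU, but the decision logic is the same: quality follows the gaze.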

## What is peripheral vision?

Peripheral vision is what many refer to as “seeing out of the corner of your eye.” It is your ability to see objects outside of your direct line of sight without turning your head or shifting your eyes. This allows you to do things like walk without bumping into things, drive and play sports.

Peripheral vision is also sometimes referred to as "side vision" or "indirect vision." The periphery is the edge of your visual field, the full range you can see at a given moment.

The typical visual field for humans is 170 degrees: 70 degrees for central vision and 100 degrees for peripheral vision.
