Continuing the discussion from HP Reverb G2 standard or HP Reverb G2 Omnicept Edition:
If it is true dynamic foveated rendering, then the sweet spot should move around the MSFS screen… or is that hard-coded into the HP Omnicept? Can you turn on the Tobii “bubble” to see where you are looking, as in their non-VR head tracker?
Thanks.. sending protective and healing vibes to you and your loved ones and all who are viewing this.
Chas.
It depends what you call “sweet spot” here.
The “rendering sweet spot”, where we render at the highest resolution (the Inner ring, in OpenXR Toolkit terminology), will indeed follow the eye gaze.
The “optical lens sweet spot”, meaning where the optics have the least distortion, will not move. Most people find it quite narrow (which is relatively true), and that limits the gains of dynamic foveated rendering a little.
In OpenXR Toolkit, yes there is a debug mode that shows where you are looking (but it isn’t used for anything else than visualization).
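To make “the inner ring follows the eye gaze” concrete, here is a minimal C++ sketch of the kind of mapping involved: a normalized gaze sample is converted into the pixel center of the high-resolution inner ring. All names, signatures, and numbers here are hypothetical illustrations, not the OpenXR Toolkit API.

```cpp
// Minimal sketch (hypothetical names, not the OpenXR Toolkit API):
// map a normalized eye-gaze point to the pixel center of the
// high-resolution "inner ring", clamped so the ring stays on screen.
#include <algorithm>
#include <cstdio>

struct GazeSample {
    float x; // normalized gaze, 0..1 across the eye buffer
    float y;
};

struct RingCenter {
    int px;
    int py;
};

RingCenter innerRingCenter(const GazeSample& gaze,
                           int bufferWidth, int bufferHeight,
                           int ringRadiusPx) {
    // Convert normalized gaze to pixels.
    int px = static_cast<int>(gaze.x * bufferWidth);
    int py = static_cast<int>(gaze.y * bufferHeight);
    // Clamp so the full ring remains inside the render target.
    px = std::clamp(px, ringRadiusPx, bufferWidth - ringRadiusPx);
    py = std::clamp(py, ringRadiusPx, bufferHeight - ringRadiusPx);
    return {px, py};
}

int main() {
    // Example: 2160x2160 per-eye buffer (Reverb G2), gaze slightly up-right.
    RingCenter c = innerRingCenter({0.62f, 0.41f}, 2160, 2160, 300);
    std::printf("inner ring center: (%d, %d)\n", c.px, c.py);
}
```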
Well, specifically, does its (OpenXR’s) use of the eye tracker work like this (a rough functional chain; see the sketch after the footnote below):

- It down-renders everything to, say, 1/2 the full resolution, or 1080, except that OpenXR center circle.
- One of the eye sensor’s “eye state” routines, the “Detector function”, is triggered by a change in position of the eye’s chosen dominant marker (the pupil?). The Detector function not only detects each movement, it records each change in x,y (the pupil’s center x,y) and passes the recorded pupil x,y vectors on to a different function, called, say, “CircleMover”.
- “CircleMover” simply does what it was told to do: it takes the Detector function’s new pupil x,y data, moves the center of the OpenXR rendering circle to whatever spot the eyes are focusing on (the latest x,y position), and then calls the “Render function”, which takes the Detector function’s x,y info, and:
- renders a middle, largest focus circle (defined by the new x,y center with an adjustable radius) to some resolution between 0 and 2160, say 1620;
- then, at the next instruction, the Render function jumps to the latest center, shrinks the radius to some adjustable value, say 1/2 to 2/3 the radius of its containing lower-resolution mother circle, and renders that inner circle at 2160x2160. Everywhere else, the screen is rendered at some lowering resolution out to the edge.
- Job done, control goes back to “CircleMover”, which at this millisecond has nothing to do and thus relinquishes control to the “Detector function”, which, if no movement is detected within 1/200th of a second, continues to scan until movement IS detected, and we all know what happens… grin.

If that functional chain works in MSFS (vis-à-vis the Nvidia graphics card’s own functions *1), OpenXR makes a real dent in the GPU/CPU compute requirements, yes? Thus enabling us:

- to fly at the same FPS, but at Ultra, moving a bunch of graphics sliders to the right in the VR settings, or;
- to fly at a higher FPS, with the graphics sliders left where they are, unaltered in the VR settings. True?

Please help me understand the error of my thinking.
Regards,
Chas
*1. It may well be that OpenXR has no idea what the Nvidia card is rendering, only where it needs to apply its rules to the Nvidia rendering engine. If dynamic foveated rendering is implemented, and proper protocols exist between, ironically, Nvidia, Microsoft, MSFS, and OpenXR, the above scenario could plausibly play out.
Now the next step, I suppose, is to find someone with an Omnicept, or beg, borrow, or steal one, yack at all four of ’em, and see what finger-pointing they do at one another… or, heaven forbid, what cooperation we get outta them.
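For what it’s worth, here is a minimal C++ sketch of the functional chain described above, using the post’s hypothetical names (Detector -> CircleMover -> Render). This is a toy simulation under those assumptions, not the real OpenXR, Omnicept, or Nvidia API; the resolutions and radii follow the example numbers in the post.

```cpp
// A toy simulation of the hypothetical chain described in the post:
// Detector -> CircleMover -> Render. Names, thresholds, and resolutions
// are the post's illustrative assumptions, not any real API.
#include <cstdio>

struct Pupil { float x, y; };          // normalized pupil center, 0..1

struct FoveationState {
    Pupil center{0.5f, 0.5f};          // where the circles are centered
    float middleRadius = 0.30f;        // middle circle, fraction of screen
    float innerScale   = 0.60f;        // inner radius = 0.6 * middle radius
};

// "Render function": pick a resolution per region around the current center.
void render(const FoveationState& s) {
    float innerRadius = s.middleRadius * s.innerScale;
    std::printf("render @ (%.2f, %.2f): inner r=%.2f at 2160x2160, "
                "middle r=%.2f at 1620, periphery at 1080\n",
                s.center.x, s.center.y, innerRadius, s.middleRadius);
}

// "CircleMover": recenter the circles on the latest pupil x,y, then render.
void circleMover(FoveationState& s, Pupil latest) {
    s.center = latest;
    render(s);
}

// "Detector function": compare successive samples (~200 Hz in the post's
// terms) and hand the new x,y on only when the pupil actually moved.
void detector(FoveationState& s, Pupil previous, Pupil current) {
    const float threshold = 0.01f;     // ignore sub-threshold jitter
    float dx = current.x - previous.x;
    float dy = current.y - previous.y;
    if (dx * dx + dy * dy > threshold * threshold) {
        circleMover(s, current);       // movement detected: pass x,y on
    }
}

int main() {
    FoveationState state;
    render(state);                                    // initial frame
    detector(state, {0.50f, 0.50f}, {0.50f, 0.50f});  // no movement: no-op
    detector(state, {0.50f, 0.50f}, {0.70f, 0.35f});  // saccade: recenter
}
```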
A lot of what you wrote is quite unclear (translated?).
I have a G2 Omnicept and OpenXR Toolkit’s eye tracked foveated rendering works without issue.
This isn’t a feature of the base game, it is done via OpenXR Toolkit, which is a 3rd party mod.
It doesn’t make a big difference in performance compared to fixed foveated rendering (no eye tracking); it just gives you the same gain as fixed foveated rendering with a less visible impact on image quality.
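For intuition about why the gain is the same in both cases, here is a back-of-the-envelope C++ sketch: the shading saving depends only on the ring areas and their rates, which are identical whether the rings sit at screen center (fixed) or follow the gaze (eye tracked). The radii and rates are illustrative assumptions, not OpenXR Toolkit’s actual defaults.

```cpp
// Back-of-the-envelope sketch: pixel-shading cost of a two-ring foveated
// layout as a fraction of full-rate rendering. Numbers are illustrative
// assumptions, not OpenXR Toolkit's actual defaults.
#include <cstdio>

int main() {
    const double pi = 3.14159265358979;
    const double screenArea = 2160.0 * 2160.0;   // per-eye buffer

    // Illustrative layout: inner circle at full shading rate,
    // middle annulus at 1/2 rate, periphery at 1/4 rate.
    double innerR  = 400.0, middleR = 700.0;     // radii in pixels
    double innerA  = pi * innerR * innerR;
    double middleA = pi * middleR * middleR - innerA;
    double outerA  = screenArea - innerA - middleA;

    double effective = innerA * 1.0 + middleA * 0.5 + outerA * 0.25;
    std::printf("shading cost: %.0f%% of full-rate rendering\n",
                100.0 * effective / screenArea);
    // The same arithmetic holds wherever the rings sit, which is why
    // moving them with the gaze changes image quality, not the gain.
}
```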
Are there diagnostics to indicate where you’re looking in whatever program you’re running?
Where are you located? I am in southeastern Kentucky, at the juncture of Virginia, West Virginia, Tennessee, and the Daniel Boone Trail.
I’m sending protective and healing vibes to you and your loved ones.
Regards,
Chas.
Yes,
In OpenXR Toolkit, there is a raw coordinates display, and there is a mode that draws a little square at the center of the eye gaze.
For other apps, you can also run the Omnicept test program to get the raw coordinates.
Before I order it from Walmart: that is indeed great news… and considering their refund policy, I’d be stupid not to try it… I’ll let y’all know… ordering now.
NOT… went through a Quest Pro instead… now happily on my Crystal.