Hey all,
It’s your favorite OpenXR developer MattB here, bearing gifts!
I’ve spent a few hours trying to get NIS to work with VR, and I have an alpha-quality piece of software for you to experiment with! I’m hoping the community will find this useful and that it improves your gaming experience!
(updated link to reflect the latest version)
Q: What is this?
It’s a piece of software sitting between an application (here MSFS) and the operating system (the OpenXR runtime) and intercepting what’s rendered by the application, running the NVIDIA Image Scaling algorithm, then submitting the upscaled output to the operating system. Effectively it’s another render scale, but the NIS algorithm also performs sharpening in an attempt to make the visuals more crisp despite the lower resolution.
Q: Does it work?
I think it does, but I’m counting on this forum’s community to test it out and tell me what it’s really worth. I’m able to see an increased FPS when turning it on (+10-15 FPS on my machine), but I am not too sure if the visual quality is on par (y’all have a much better eye than me, I’m sure). What I can say for sure is that between a cheap bilinear upscaling and NIS, the quality is definitely superior with NIS.
Q: Should I try it?
It’s a very early prototype that I’ve spent only a few hours working on. So it could be rough.
If you’re not too experienced, you probably shouldn’t try it until we figure out whether it’s worth it and I work out the quirks.
If you expect to install it and get a jump in FPS for the same quality out of the box, this is also not for you and you should check back later after it’s been tweaked.
Q: What if it doesn’t work?
If it doesn’t work for you specifically, we can try to debug as much as possible, but every PC setup is different so I can’t guarantee I will be able to resolve all problems…
If it doesn’t work at all (performance isn’t as good, quality is too low), then I’ll consider just killing the project and calling it a failed experiment.
Remember, I am delivering this as an experiment, no guarantees!
Q: What are the best settings for it?
I don’t know, and since this community has a history of being really good at figuring out combinations of settings, I’d like to crowd-source the answer to this question.
Q: Does it work with DX11 and DX12?
Only DX11 at this time.
Q: How does the scaling value fit in with the OpenXR render scale and the game’s render scale?
The software uses the resolution returned by OpenXR (i.e. the one after OpenXR scaling, which you can see in the OpenXR Developer Tools if using Windows Mixed Reality, under System Status → View configuration). It applies its scaling to that resolution. The actual resolution can be observed in the log file (see README).
You’ll see something like:
Scaled resolution is: 2206x2161 (70% of 3152x3088)
The NIS scaler should replace any in-game render scale setting that was being used before. So if you used 80% as the in-game render scale, I suggest you set that back to 100%, and use 80% with the NIS scaler instead.
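If you want to double-check the math, the scaled resolution is simply the resolution reported by OpenXR multiplied by the scaling value. A minimal sketch of that computation (the variable names and rounding are mine, not necessarily what the layer does internally):

    #include <cstdint>
    #include <cstdio>

    int main() {
        // Resolution reported by OpenXR (after any OpenXR render scale).
        const uint32_t fullWidth = 3152, fullHeight = 3088;
        const float scaling = 0.7f;  // the "scaling" value from the config file

        // The application renders at the downscaled resolution; NIS upscales it back.
        const uint32_t scaledWidth = static_cast<uint32_t>(fullWidth * scaling);
        const uint32_t scaledHeight = static_cast<uint32_t>(fullHeight * scaling);

        std::printf("Scaled resolution is: %ux%u (%u%% of %ux%u)\n",
                    scaledWidth, scaledHeight,
                    static_cast<uint32_t>(scaling * 100.0f + 0.5f),
                    fullWidth, fullHeight);
    }

With the default 70% on a 3152x3088 render target, that gives the 2206x2161 shown in the log line above.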
Q: How do I experiment with the various settings?
There are only two settings, mapping directly to the NIS settings: scaling (between 0 and 1) and sharpness (between 0 and 1).
Changing the scaling requires you to exit VR, modify the config file, then re-enter VR. The config file is named FS2020.cfg and it must be in the same folder you copied the software (DLL) into. By default I’ve set it to 70%, but I have no idea if that’s the best value.
Changing the sharpness can be done in increments of 5% by pressing Ctrl + Down arrow (or Ctrl + F2) to decrease and Ctrl + Up arrow (or Ctrl + F3) to increase. By default, I put 50% but it may not be the best value. The new sharpness value can be observed in the log file (see README). When satisfied with your tweaking, you may then modify the config file to make it permanent.
You can use Ctrl + Left arrow (or Ctrl + F1) to enable/disable NIS and switch to a bilinear scaler (cheap scaler) instead, so you can see the improvements that NIS provides.
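For reference, with the defaults above the config file would contain something along these lines (the exact key names and format are documented in the README, so treat this as a hypothetical illustration rather than a copy-paste template):

    scaling=0.7
    sharpness=0.5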
Q: What headset does it work with?
It should work with any VR headset thanks to OpenXR. We’ve seen success with Windows Mixed Reality (e.g. HP Reverb), any headset going through the SteamVR runtime (Valve Index), Pimax, Oculus Quest…
Q: What GPUs does it work with?
I think the NIS shader works on any modern GPU, even non-NVIDIA ones. I have only tried it on a GT 1030 (not with MSFS, obviously) and a GTX 1080.
Q: Will it work with other games like the ones from Steam?
This only works with OpenXR applications, not OpenVR or other APIs.
Even with OpenXR applications, I can’t guarantee it will work as I’ve only implemented the bare minimum for MSFS.
Q: Why NIS and not DLSS?
I don’t have an RTX card.
Q: Why NIS and not FSR?
I have looked at both SDKs, and the NIS shader was super easy to integrate with Direct3D 11: literally copy a few files, then make 4 function calls. The FSR code was more complex, so I didn’t want to invest time in it.
Q: Why are you developing this?
Mostly for fun and learning, and to see if I can improve this community’s experience.
Q: Is it open source? Can I contribute?
It’s 100% open source, and if you’d like to contribute, I’d be happy for you to get in touch with me!
Q: Is this affiliated with Microsoft?
While I am a Microsoft employee working on OpenXR, please note that this is a personal project not affiliated with Microsoft.
Q: Why does this have a weird name?
It’s following the OpenXR naming convention:
XR_
→ This is for Khronos’ OpenXR (as opposed to GL for OpenGL or VK for Vulkan for example).
APILAYER_
→ This is an API layer, i.e. it intercepts existing APIs, as opposed to an EXT, which would add new ones.
NOVENDOR_
→ This is not owned by any company, I am developing this independently.
nis_scaler
→ That’s what it does, duh!
Q: How does it work?
NIS is a post-processing algorithm that NVIDIA integrated into its drivers so that, in theory, any application rendering to a monitor can benefit from the upscaling. But that’s not how VR works: a VR headset is a very special kind of monitor, so the driver just doesn’t seem to handle it.
Instead, what this software does is intercept the OpenXR calls from the application, perform the NIS processing pass, then forward the output to the real OpenXR runtime.
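For the curious, this is how OpenXR API layers work in general: the layer inserts itself into the function-lookup chain, returns its own versions of the few calls it cares about, and passes everything else straight through to the next layer or the real runtime. A rough C++ sketch (function names are illustrative, and all the loader negotiation and error handling is omitted):

    #include <cstring>
    #include <openxr/openxr.h>

    // Hooked entry points implemented by the layer (sketched in the steps below).
    XrResult XRAPI_CALL hookedEnumerateViewConfigurationViews(
        XrInstance, XrSystemId, XrViewConfigurationType, uint32_t, uint32_t*, XrViewConfigurationView*);
    XrResult XRAPI_CALL hookedCreateSwapchain(XrSession, const XrSwapchainCreateInfo*, XrSwapchain*);
    XrResult XRAPI_CALL hookedEndFrame(XrSession, const XrFrameEndInfo*);

    // Pointer to the next xrGetInstanceProcAddr in the chain (ultimately the real
    // runtime), handed to the layer by the OpenXR loader when the layer is activated.
    static PFN_xrGetInstanceProcAddr nextGetInstanceProcAddr = nullptr;

    // The layer's xrGetInstanceProcAddr: substitute our functions for the calls we
    // intercept, and forward everything else untouched.
    XrResult XRAPI_CALL layerGetInstanceProcAddr(XrInstance instance, const char* name,
                                                 PFN_xrVoidFunction* function) {
        const XrResult result = nextGetInstanceProcAddr(instance, name, function);
        if (XR_SUCCEEDED(result)) {
            if (std::strcmp(name, "xrEnumerateViewConfigurationViews") == 0) {
                *function = reinterpret_cast<PFN_xrVoidFunction>(hookedEnumerateViewConfigurationViews);
            } else if (std::strcmp(name, "xrCreateSwapchain") == 0) {
                *function = reinterpret_cast<PFN_xrVoidFunction>(hookedCreateSwapchain);
            } else if (std::strcmp(name, "xrEndFrame") == 0) {
                *function = reinterpret_cast<PFN_xrVoidFunction>(hookedEndFrame);
            }
        }
        return result;
    }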
Step 1 is to make the application believe that it must use a lower resolution. When the application calls OpenXR to query the resolution, we just return the downscaled (lower) resolution.
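Continuing the sketch above, step 1 boils down to shrinking the recommended resolution returned by xrEnumerateViewConfigurationViews (again illustrative, not the exact code from the project):

    // g_scaling is the value read from the config file (e.g. 0.7 for 70%).
    static float g_scaling = 0.7f;
    static PFN_xrEnumerateViewConfigurationViews nextEnumerateViewConfigurationViews = nullptr;

    XrResult XRAPI_CALL hookedEnumerateViewConfigurationViews(
        XrInstance instance, XrSystemId systemId, XrViewConfigurationType viewConfigurationType,
        uint32_t viewCapacityInput, uint32_t* viewCountOutput, XrViewConfigurationView* views) {
        // Ask the real runtime for the full resolution first...
        const XrResult result = nextEnumerateViewConfigurationViews(
            instance, systemId, viewConfigurationType, viewCapacityInput, viewCountOutput, views);
        // ...then shrink what the application gets to see.
        if (XR_SUCCEEDED(result) && views != nullptr) {
            for (uint32_t i = 0; i < *viewCountOutput; i++) {
                views[i].recommendedImageRectWidth =
                    static_cast<uint32_t>(views[i].recommendedImageRectWidth * g_scaling);
                views[i].recommendedImageRectHeight =
                    static_cast<uint32_t>(views[i].recommendedImageRectHeight * g_scaling);
            }
        }
        return result;
    }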
Step 2 is to intercept the textures that OpenXR passes to the application for rendering, and instead create smaller textures for the application to use. We keep a hold of the real texture that OpenXR needs (the one at full resolution).
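Step 2 might look something like this, heavily simplified (a real layer tracks state per swapchain, remembers the original full resolution instead of recomputing it, and hooks xrEnumerateSwapchainImages to hand the smaller textures out):

    #include <vector>
    #include <d3d11.h>
    #include <wrl/client.h>

    static PFN_xrCreateSwapchain nextCreateSwapchain = nullptr;

    // Per-swapchain bookkeeping: the runtime's full-resolution swapchain, plus the
    // smaller textures we create for the application to render into.
    struct SwapchainState {
        XrSwapchain runtimeSwapchain;
        std::vector<Microsoft::WRL::ComPtr<ID3D11Texture2D>> appTextures;
    };

    XrResult XRAPI_CALL hookedCreateSwapchain(XrSession session,
                                              const XrSwapchainCreateInfo* createInfo,
                                              XrSwapchain* swapchain) {
        // The application asks for the downscaled size it was told about in step 1.
        // Request the full size from the real runtime instead, so the compositor
        // ends up with a full-resolution texture.
        XrSwapchainCreateInfo fullResInfo = *createInfo;
        fullResInfo.width = static_cast<uint32_t>(createInfo->width / g_scaling);
        fullResInfo.height = static_cast<uint32_t>(createInfo->height / g_scaling);
        const XrResult result = nextCreateSwapchain(session, &fullResInfo, swapchain);

        // Separately create ID3D11Texture2D objects at the application's (smaller)
        // size, and return those from a hooked xrEnumerateSwapchainImages (not shown).
        // ...
        return result;
    }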
Step 3 is to intercept when the application submits its rendered texture. When this happens, we invoke the NIS shader to upscale the texture (the smaller one) and write its output to the real OpenXR texture (the one at full resolution). Unlike what your driver typically does with NIS, note that we must do the upscaling twice: once for each eye! We can then tell the OpenXR runtime to take it from there (i.e. use the upscaled texture).
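Still in sketch form, step 3 hooks xrEndFrame, runs one NIS pass per eye, then forwards the frame to the real runtime. RunNISShader, smallTextureFor and fullResTextureFor are placeholders for the layer's actual plumbing (the D3D11 compute dispatch of the NIS shader and the swapchain bookkeeping from step 2):

    // Hypothetical helpers standing in for the real implementation.
    void RunNISShader(ID3D11Texture2D* input, ID3D11Texture2D* output, float sharpness);
    ID3D11Texture2D* smallTextureFor(XrSwapchain swapchain);
    ID3D11Texture2D* fullResTextureFor(XrSwapchain swapchain);

    static PFN_xrEndFrame nextEndFrame = nullptr;
    static float g_sharpness = 0.5f;  // adjustable at runtime with Ctrl+Up/Down

    XrResult XRAPI_CALL hookedEndFrame(XrSession session, const XrFrameEndInfo* frameEndInfo) {
        for (uint32_t i = 0; i < frameEndInfo->layerCount; i++) {
            if (frameEndInfo->layers[i]->type == XR_TYPE_COMPOSITION_LAYER_PROJECTION) {
                const auto* projection =
                    reinterpret_cast<const XrCompositionLayerProjection*>(frameEndInfo->layers[i]);
                // One NIS pass per eye (viewCount is 2 for a stereo headset).
                for (uint32_t eye = 0; eye < projection->viewCount; eye++) {
                    const XrSwapchainSubImage& subImage = projection->views[eye].subImage;
                    RunNISShader(smallTextureFor(subImage.swapchain),   // what the app rendered
                                 fullResTextureFor(subImage.swapchain), // what the runtime will display
                                 g_sharpness);
                }
                // (A real layer also patches the submitted image rectangles so they
                // cover the full resolution before forwarding.)
            }
        }
        // Tell the real OpenXR runtime to take it from there.
        return nextEndFrame(session, frameEndInfo);
    }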