Contrary to what most are saying: yes, perceived size depends on the IPD, but not only on it.
For example, some find 3D movies smaller than real life and others find them bigger: this is because of the difference between your eyes' IPD and the film's ICD.
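To see why, here is a back-of-the-envelope sketch (my own simplification, ignoring screen distance and other factors): with stereo triangulation, a scene captured with camera separation ICD and viewed with eye separation IPD is perceived scaled by roughly IPD/ICD, so an ICD larger than your IPD miniaturizes the world and a smaller one enlarges it.

#include <cstdio>

// Simplified triangulation sketch: a point at distance z captured with
// camera separation icd produces an angular disparity of about icd / z
// (small-angle approximation). Viewed with eye separation ipd, that same
// disparity is perceived at distance ipd / disparity = z * ipd / icd.
double perceivedDistance(double z, double ipd, double icd) {
    double disparity = icd / z;   // radians
    return ipd / disparity;       // = z * (ipd / icd)
}

int main() {
    const double ipd = 0.064;     // viewer's eyes: 64 mm
    const double icd = 0.080;     // hyperstereo film rig: 80 mm (example value)
    // Everything is perceived at 64/80 = 0.8x its true distance and size,
    // i.e. the scene looks miniaturized; icd < ipd would do the opposite.
    std::printf("an object 10 m away is perceived at %.1f m\n",
                perceivedDistance(10.0, ipd, icd));
    return 0;
}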
If the IPD (inter-eye) matches the ICD (inter-camera), then any 3D object which is properly modelled and projected onto the VR screens should appear at its correct dimensions. Here is an easy test to see this in action with WMR headsets (at least with the Reverb G1 and the G2):
- In the Mixed Reality House, go in front of the Halo helmet, up close (really close).
- Open the WMR settings popup window in VR where you can adjust IPD in software.
- Grab the slider with the VR controller and, while keeping the trigger pressed, look toward the helmet.
- Move the controller to the left fully and wait 1 sec or so, then move to the right fully and wait 1 sec or so.
In doing this you should perceive the helmet changing size in front of your eyes (for me, with an IPD close to 64 mm, there seem to be about three distinct sizes corresponding to the slider's left, center and right positions).
What is the problem within the simulator then?
This mostly depends on the API used and its implementation.
Native WMR with a fixed-IPD headset (G1) will adjust where on the screens the image gets displayed, as well as the projection of the 3D world onto the screens, whereas OpenVR with a Valve Index will only change the projection matrices, because the screens physically move with the hardware IPD slider.
When using the Reverb G1 via SteamVR you're also dependent on the OpenVR API and on Microsoft's implementation of the WMR for SteamVR driver. When using a WMR HMD in OpenXR you're dependent on the OpenXR runtime implementation, etc. Nearly all VR APIs give per-eye projection matrices back to the application, so that whatever the HMD, and whether it has a fixed or variable hardware IPD, the application shouldn't have to care. However, applications might still get it wrong.*
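As a concrete illustration, here is a minimal OpenVR sketch (my own, not taken from any simulator) of what the application actually receives: it simply asks the runtime for the per-eye projection matrices and renders with whatever comes back.

#include <openvr.h>
#include <cstdio>

// Minimal OpenVR sketch: the application does not compute IPD or ICD itself,
// it just asks the runtime for the per-eye matrices and uses them as-is.
int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem* sys = vr::VR_Init(&err, vr::VRApplication_Scene);
    if (err != vr::VRInitError_None) { std::printf("no HMD available\n"); return 1; }

    // Per-eye projection matrices (off-axis frusta), built by the driver
    // from the physical and logical HMD specifications.
    vr::HmdMatrix44_t projLeft  = sys->GetProjectionMatrix(vr::Eye_Left,  0.1f, 1000.0f);
    vr::HmdMatrix44_t projRight = sys->GetProjectionMatrix(vr::Eye_Right, 0.1f, 1000.0f);

    // The per-eye offsets (half the software IPD on each side) come from
    // GetEyeToHeadTransform(); see the sketch further down.
    (void)projLeft; (void)projRight;  // a real renderer would feed these to its pipeline

    vr::VR_Shutdown();
    return 0;
}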
So in the end this all depends on a chain of parameters which must all match:
- your average IPD (distance between your pupils)
- hardware IPD (distance between the lenses)
- software IPD (distance between images rendered to each eye)
- software ICD (distance and direction between the virtual cameras looking at the 3D scene)
NB: your physical IPD is different when you're looking far away than when you're converging up close; without eye tracking and automatic IPD adjustment (like in the Varjo), you'll have to settle on an average value.
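If you want to check where two of those links sit on your own system, here is a small OpenVR sketch (again my own illustration, assuming an IVRSystem* obtained as in the sketch above): it reads back the software IPD from the per-eye transforms and the user IPD the runtime has stored, so you can compare both with your measured value.

#include <openvr.h>
#include <cmath>
#include <cstdio>

// Assumes `sys` was obtained with vr::VR_Init() as in the previous sketch.
void printIpdChain(vr::IVRSystem* sys) {
    // Software IPD: horizontal distance between the two eye-to-head translations.
    vr::HmdMatrix34_t eyeL = sys->GetEyeToHeadTransform(vr::Eye_Left);
    vr::HmdMatrix34_t eyeR = sys->GetEyeToHeadTransform(vr::Eye_Right);
    float softwareIpd = std::fabs(eyeR.m[0][3] - eyeL.m[0][3]);
    std::printf("software IPD: %.1f mm\n", softwareIpd * 1000.0f);

    // User IPD as stored by the runtime (what the settings/slider report).
    vr::ETrackedPropertyError perr = vr::TrackedProp_Success;
    float userIpd = sys->GetFloatTrackedDeviceProperty(
        vr::k_unTrackedDeviceIndex_Hmd, vr::Prop_UserIpdMeters_Float, &perr);
    if (perr == vr::TrackedProp_Success)
        std::printf("runtime user IPD: %.1f mm\n", userIpd * 1000.0f);
}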
Therefore a VR application shouldn't have to bother with any of these: it just gets the correct projection matrices from the VR driver, which does its job of building them according to the physical and logical HMD specifications. For example, I've tested Half-Life: Alyx with both the Reverb G1 and the Valve Index, and only with the Index were the object sizes spot on. In particular I tested these two HMDs against the metro wagon in the game, matching my seated position with the virtual seat in the wagon. Likewise, in XP11 the Reverb G1 made everything look smaller to me, whereas the Valve Index was always spot on.
Having said this, not everything works like this in practice, and there is plenty of VR software not doing it right (whether the application, the driver, or both).
The other reason you could get a different perception of scale, even with the ICD matching the IPD, has to do with the focal point the two eyes' projection matrices converge to, and how this differs from your own vergence. It is possible some implementations only change the ICD without changing the focal point location (say, 2 meters in front of you), while others just move the cameras apart without changing the projection angles, thus putting the focal point at a different location and at a different perceived angle for each eye.
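To make the two cases concrete, here is a simplified sketch of my own (not taken from any particular driver) contrasting the two classic ways of building a stereo rig: toed-in cameras, whose convergence distance is fixed by the inward rotation angle, and parallel cameras with off-axis (asymmetric) frusta, whose zero-parallax plane is fixed by how much each frustum is shifted.

#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// 1) Toed-in rig: each camera is rotated inward by toeInRad, so the optical
//    axes cross at a distance of (icd / 2) / tan(toeInRad).
double toedInConvergence(double icd, double toeInRad) {
    return (icd * 0.5) / std::tan(toeInRad);
}

// 2) Parallel rig with off-axis frusta: the cameras stay parallel and each
//    frustum is shifted horizontally; a point on the plane at zConv then
//    lands at the same image position for both eyes (zero parallax). At the
//    near plane zNear the required shift is (icd / 2) * zNear / zConv.
double offAxisShift(double icd, double zNear, double zConv) {
    return (icd * 0.5) * zNear / zConv;
}

int main() {
    const double icd = 0.064;  // example camera separation, 64 mm
    std::printf("toed in by 1 degree  -> converges at %.2f m\n",
                toedInConvergence(icd, 1.0 * kPi / 180.0));
    std::printf("off-axis, 2 m target -> frustum shift of %.4f m at a 0.1 m near plane\n",
                offAxisShift(icd, 0.1, 2.0));
    return 0;
}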
This is why I'm suggesting FS2020 implement a user setting to override the actual ICD (either as a fixed value or as a scale factor).
[update]
In practice, there is a need for both an absolute and a relative adjustment, so that the software can derive the ICD with this formula: ICD = (IPD + Bias) * Scale
Bias adjusts for the difference between the lenses' IPD and your eyes' IPD, whereas Scale adjusts for the difference between the software IPD and the software ICD.
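In code this would be a one-liner; here is a sketch with hypothetical parameter names (this is not an existing FS2020 or XP11 option):

// Hypothetical override: derive the rendering camera separation from the IPD
// reported by the VR runtime. biasMeters compensates lens-vs-eye differences,
// scale compensates software IPD vs ICD differences.
double cameraSeparation(double runtimeIpdMeters, double biasMeters, double scale) {
    return (runtimeIpdMeters + biasMeters) * scale;  // ICD = (IPD + Bias) * Scale
}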
PS: you can override the XP11.50 ICD using the following command line:
--override_ipd 0.01
This will set the IPD to ± 0.01m and you can check log.txt for the following confirmation:
I/VR: Interaxial set to -#.##,#.##
* It is a little more complex than this: there are, for example, two sets of APIs related to the eye projection matrices in OpenVR, and I suspect one just takes the IPD set in the SteamVR config file while the other adjusts dynamically at runtime.