In practice, there is a need for both absolute and relative adjustments, so that the software can transform the ICD with this formula: ICD = (IPD + Bias) * Scale
Bias adjusts for the difference between the lenses' IPD and your eyes' IPD, whereas Scale adjusts for the difference between the software IPD and the software ICD.
NB: in X-Plane and DCS you can adjust only the ICD (5)
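To make the formula above concrete, here is a minimal sketch of the ICD = (IPD + Bias) * Scale transform. The function name and default values are my own illustration, not from any specific simulator or VR runtime:

```python
def adjusted_icd(ipd_m: float, bias_m: float = 0.0, scale: float = 1.0) -> float:
    """Return the inter-camera distance derived from the measured IPD.

    bias_m -- absolute offset (meters), e.g. lens IPD vs. eye IPD mismatch
    scale  -- relative factor, e.g. software IPD vs. software ICD mismatch
    """
    return (ipd_m + bias_m) * scale

# With no bias and a scale of 1, the ICD simply equals the IPD:
print(adjusted_icd(0.064))  # 0.064
# A +1 mm bias and a 5% scale-up widen the camera separation:
print(adjusted_icd(0.064, bias_m=0.001, scale=1.05))
```
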
Some people find 3D movies smaller than real life and others find them bigger: this is because there is a difference between your eye distance (IPD) and the film cameras' distance (ICD) *(1)
To illustrate this, look at the pictures below, cross your eyes until the two squares align, and focus on the middle image
Do they look life-size to you? Bigger? Smaller?
Is the perceived depth life-size, bigger, smaller than expected?
If the IPD (inter-eye) matches the ICD (inter-camera), then any 3D object which is properly modelled and projected on the VR screens should appear in its correct dimensions. The difference between ICD and IPD is what makes the image above not life-size to me.
This mostly depends on the API used and its implementation.
Native WMR with fixed-IPD headsets (G1) will adjust where on the screens the image gets displayed, as well as the projection of the 3D world to the screen, whereas OpenVR on the Valve Index will just change the projection matrices, because the screens physically move with the hardware IPD slider* (2).
When using the Reverb G1 via SteamVR you're also dependent on the OpenVR API and on Microsoft's implementation of the WMR-for-OpenVR driver. When using a WMR HMD in OpenXR you're dependent on the OpenXR driver implementation, etc… Nearly all VR APIs hand per-eye projection matrices back to the application, so that whatever the HMD, and whether it has a fixed or variable hardware IPD, the application shouldn't care. However, applications might get it wrong* (3)
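As an illustration of what "per-eye matrices" means for the application: most runtimes expose each eye's pose relative to the head, and the effective ICD is simply the distance between the two eye origins. This sketch uses plain 3x4 row-major transforms (in the spirit of OpenVR's eye-to-head matrices, but with made-up values):

```python
import math

# Hypothetical eye-to-head transforms (3x4 row-major, translation in the
# last column). The ±0.032 m offsets are illustrative, not from a real HMD.
left_eye_to_head  = [[1, 0, 0, -0.032],
                     [0, 1, 0,  0.0],
                     [0, 0, 1,  0.0]]
right_eye_to_head = [[1, 0, 0,  0.032],
                     [0, 1, 0,  0.0],
                     [0, 0, 1,  0.0]]

def icd_from_eye_transforms(left, right):
    """Distance between the two eye origins = the effective ICD (meters)."""
    dx, dy, dz = (right[r][3] - left[r][3] for r in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz)

print(icd_from_eye_transforms(left_eye_to_head, right_eye_to_head))  # ~0.064
```

This is why a well-behaved application never needs to know the slider position: it just uses whatever transforms the driver reports each frame.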
So in turn this all depends on this chain of parameters which must all match:
- your average IPD (distance between pupils)
- hardware IPD (distance between the lenses)
- software IPD (distance between images rendered to each eye)
- software ICD (distance and direction between the virtual cameras looking at the 3D scene)
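As a rough rule of thumb (my own shorthand, consistent with the hyperstereo effect described above): perceived world scale goes inversely with the camera separation, so when the links in the chain don't match, a camera separation wider than your IPD miniaturizes the scene and a narrower one enlarges it.

```python
def perceived_scale(ipd_m: float, icd_m: float) -> float:
    """Rule-of-thumb perceived size factor: > 1 means the world looks
    bigger than life, < 1 smaller. Purely illustrative, not from any SDK."""
    return ipd_m / icd_m

# Eyes 64 mm apart viewing content shot with cameras 70 mm apart (hyperstereo):
print(perceived_scale(0.064, 0.070))  # < 1: the scene looks smaller than life
# Cameras closer together than the eyes (hypostereo):
print(perceived_scale(0.064, 0.058))  # > 1: the scene looks bigger than life
```
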
NB: your physical IPD is different when you're looking far away versus converging up close; without eye tracking with auto-IPD (as in the Varjo), you'll have to take an average value.
Therefore VR applications shouldn't bother with any of these: they just get the correct projection matrices from the VR driver, whose job is to build and interpret those matrices according to the physical and logical HMD specifications* (4)
Having said this, in practice not everything works like this, and there is plenty of VR software not doing it right (whether application and/or driver).
(2) A live test with WMR Headsets (at least with the Reverb G1 and the G2):
- In the Mixed Reality House, go in front of the Halo helmet, up close (really close).
- Adjust the IPD hardware slider or…
- Open the WMR settings popup window in VR where you can adjust IPD in software.
- Grab the slider with the VR controller and, while holding the trigger, look toward the helmet.
- Move the controller to the left fully and wait 1 sec or so, then move to the right fully and wait 1 sec or so.
In doing this you should perceive the helmet changing size in front of your eyes (for me, with an IPD close to 64 mm, there seem to be about 3 distinct sizes corresponding to the slider's left, center and right positions).
(3) it is a little bit more complex than this, but there are, for example, 2 sets of APIs related to the eye projection matrices in OpenVR, and I suspect one just takes the IPD set in the SteamVR config file while the other adjusts dynamically at runtime.*
(4) However, if the 3D object is wrongly projected on the VR screens, you'll have to adjust/zoom/scale both the projection parameters and the ICD in order to restore matching dimensions. I can observe the G2 projecting images smaller than the Index, and this makes the world as seen through the G2 smaller than it should be to me, whereas I find the Index projecting images at the expected focal length and size, which makes the world as seen through the Index life-like.
For example, I tested Half-Life: Alyx with both the Reverb G1 and the Valve Index, and only with the Index were the object sizes spot-on. I particularly tested these 2 HMDs against the metro wagon in the game by matching my seating position with the virtual seat in the wagon. In XP11 the Reverb G1 was making everything smaller to me, whereas the Valve Index was always spot-on as well.
Another possible source of conflict in size perception could be the focal point both eyes' projection matrices converge to, and its difference from our own vergence. Some implementations only change the ICD without changing the focal point location (say 2 meters in front of you), while others just move the cameras apart without changing the projection angles, thus putting the focal point at a different location and therefore inducing a different perceived angle between both eyes.
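To make the vergence point concrete, here is a small sketch (my own illustration, not taken from any implementation) of the total convergence angle two cameras would need in order to aim at a focal point straight ahead. Moving the cameras apart without re-aiming them changes this angle, and therefore the angle perceived by each eye:

```python
import math

def convergence_angle_deg(icd_m: float, focal_distance_m: float) -> float:
    """Total convergence angle (degrees) for two cameras icd_m apart,
    both aimed at a point focal_distance_m straight ahead."""
    return math.degrees(2.0 * math.atan((icd_m / 2.0) / focal_distance_m))

# Same 2 m focal point, two different ICDs: the convergence angle (and
# hence the perceived depth geometry) changes with the camera separation.
print(convergence_angle_deg(0.064, 2.0))  # ~1.83 degrees
print(convergence_angle_deg(0.080, 2.0))  # ~2.29 degrees
```
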
(5) For example, you can override the X-Plane 11.50 ICD in VR using the following command line:
This will set the ICD to ± 0.01 m, and you can check log.txt for the following confirmation:
I/VR: Interaxial set to -#.##,#.##