Games are harder for PCs to run in VR than on a flat monitor, partly because the image for each eye is rendered separately, from a slightly different perspective. Luckily, there are many ways to improve VR performance, such as GPU upscaling, multi-threading, and foveated rendering, where only part of the image is rendered in full detail, saving computing resources.
And with eye-tracking, foveated rendering becomes dynamic (DFR): not only do you never see the pixelated areas, but the focus area can also be made smaller, creating an even bigger performance gain. This requires hardware and software to work together.
Hardware: 20 infrared lights, 120 FPS camera inside the lens barrel
So the Crystal Super comes with glass aspheric lenses. These lenses have several benefits, such as extremely high clarity and high brightness. Pancake lenses, because they use polarizers and reflectors, only let through around 15% of the light from the panel. Aspheric lenses, being single-element, let through around 99%.
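To get a feel for what those transmission figures mean, here is a quick sketch of the brightness reaching the eye for each lens type. The 15% and 99% figures come from the text above; the 200-nit panel brightness is an illustrative assumption, not a spec.

```python
# Effective brightness reaching the eye for the two lens types,
# using the transmission figures from the text. The panel brightness
# of 200 nits is an illustrative assumption, not a Crystal Super spec.
panel_nits = 200

pancake_transmission = 0.15   # polarizers/reflectors absorb most light
aspheric_transmission = 0.99  # single-element glass lens

pancake_nits = panel_nits * pancake_transmission    # 30.0 nits
aspheric_nits = panel_nits * aspheric_transmission  # 198.0 nits

# To match the aspheric lens's perceived brightness, a pancake-lens
# panel would have to be driven roughly 6.6x harder:
drive_factor = aspheric_transmission / pancake_transmission
print(round(pancake_nits, 1), round(aspheric_nits, 1), round(drive_factor, 1))
```

In other words, under these assumed numbers the pancake-lens panel would have to work several times harder just to look as bright.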
We also got more space to work with. We've put ten infrared lights on the edge of each lens, which illuminate your eye in a way the camera can see, but you cannot. The distance between the lens and the panel is also larger, so unlike with pancake lenses, we can put a mirror inside the lens barrel that reflects the infrared light into the eye-tracking camera, which also fits inside the barrel. The result is a very clean shot of the user's eye, with the infrared lights clearly visible, to keep track of the user's eye movement. And we have a really good camera here, tracking the user's eye in high clarity at 120 FPS.
That speed and accuracy are both crucial for DFR. If either is too low, DFR will not work properly, and you'll see pixelated patches of your application in front of you, breaking the immersion. If the eye-tracking is accurate, the focus area can be reduced, so only a small part of the total field of view needs to be rendered in full detail. That accuracy gives enormous FPS boosts, comparable to a GPU upgrade.
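The link between tracking accuracy and performance can be sketched with some rough geometry. If the full-detail region is a square patch of the view, the fraction of the frame rendered at full resolution scales with the square of its angular size. The 120° FOV and the 30°/60° focus sizes below are illustrative assumptions, not Crystal Super specs.

```python
def full_detail_fraction(fov_deg=120.0, focus_deg=30.0):
    """Rough fraction of the view needing full-detail rendering.

    Treats both the total FOV and the focus region as square patches,
    so area scales with angle squared. All angles are illustrative
    assumptions for the sake of the estimate.
    """
    return (focus_deg / fov_deg) ** 2

# Accurate eye-tracking lets the focus region shrink. Halving its
# angular size, e.g. from 60 to 30 degrees, quarters the area that
# must be shaded at full detail:
loose = full_detail_fraction(focus_deg=60.0)  # 0.25
tight = full_detail_fraction(focus_deg=30.0)  # 0.0625
print(loose, tight)
```

This is why accuracy matters so much: a sloppy tracker forces a large safety margin around the gaze point, and the full-detail area (and GPU cost) grows with the square of that margin.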
Software: Client level for VRS and Quad Views
That's one half of the story.
What we also need is software to support it.
The headset feeds all tracking data back to the PC, which processes it extremely fast, so you do not notice any delay.
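As a rough sanity check on why the delay is unnoticeable, the timing budget can be sketched from the camera's frame rate. The 120 FPS figure comes from the text; the 90 Hz display refresh is an assumed example, not a spec.

```python
# Per-frame sampling interval of the 120 FPS eye-tracking camera
# (the 120 FPS figure comes from the text above).
camera_fps = 120
sample_interval_ms = 1000 / camera_fps  # ~8.33 ms between gaze samples

# If the headset displays at 90 Hz (an assumed example rate), each
# rendered frame takes longer than one camera interval, so a fresh
# gaze sample is available for roughly every frame:
display_hz = 90
frame_time_ms = 1000 / display_hz  # ~11.11 ms per displayed frame
print(round(sample_interval_ms, 2), round(frame_time_ms, 2))
```

Under these assumptions the gaze data is sampled faster than frames are displayed, which is what keeps the foveated region pinned to where you are actually looking.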
The Crystal Super works with Pimax Play and supports two DFR techniques:
- Variable Rate Shading (VRS)
- Quad Views
Variable Rate Shading mostly affects shading and edge details. For instance, instead of shading every pixel individually, VRS can shade a 4×4 group of pixels with the same detail, reducing the GPU's workload. This gives a performance boost, typically 10 to 40% extra FPS. And the great thing is that, thanks to Pimax Play, this works natively with many VR games, even when the developers haven't implemented it themselves.
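The 4×4 idea can be illustrated with a toy NumPy sketch: invoke a stand-in "pixel shader" once per 4×4 block instead of once per pixel, then broadcast the result across the block. This is a conceptual model of VRS, not how a GPU driver implements it; the shader function and frame size are made up for the example.

```python
import numpy as np

def shade(u, v):
    """Stand-in pixel shader: any per-pixel function of screen coords."""
    return 0.5 + 0.5 * np.sin(10 * u) * np.cos(10 * v)

h, w, block = 8, 8, 4  # tiny frame; VRS commonly uses 2x2 or 4x4 blocks

# Full-rate shading: one shader invocation per pixel (h*w = 64 calls).
ys, xs = np.mgrid[0:h, 0:w]
full = shade(xs / w, ys / h)

# 4x4 VRS: shade once per block (only 4 calls here), then broadcast
# each result to every pixel in its block -- 1/16th the shading work.
bys, bxs = np.mgrid[0:h:block, 0:w:block]
coarse = shade(bxs / w, bys / h)
vrs = np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)

assert vrs.shape == full.shape  # same image size, far fewer shader calls
```

The output image keeps its resolution; only the shading detail drops, which is why VRS mostly shows up in fine surface detail rather than in geometry or edges of the frame.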
Quad Views works differently. It directly lowers the resolution in the non-focus areas and maintains or increases the pixel density in the focus area. This typically gives an even bigger performance boost, up to maybe 50 to 100%, while increasing the clarity of the focus area.
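A quick pixel-count comparison shows where the Quad Views savings come from: a small full-density focus view plus a low-density peripheral view adds up to far fewer pixels than one full-resolution view. All resolutions below are illustrative assumptions, not Crystal Super render targets.

```python
# Rough per-eye pixel-count comparison for Quad Views rendering.
# All resolutions are illustrative assumptions, not actual specs.
full_w, full_h = 3840, 3840
full_pixels = full_w * full_h  # one view at full density everywhere

# Quad Views: a small high-density focus view plus a low-density
# peripheral view covering the rest of the FOV.
focus_w, focus_h = 1500, 1500    # focus region, full pixel density
periph_w, periph_h = 1920, 1920  # periphery, rendered at reduced density
quad_pixels = focus_w * focus_h + periph_w * periph_h

speedup = full_pixels / quad_pixels
print(f"{speedup:.1f}x fewer pixels shaded")
```

Because pixel-shading cost scales roughly with pixel count, a split like this can more than double throughput, which is consistent with the 50 to 100% gains mentioned above. It also explains why Quad Views can raise focus-area clarity: the pixels saved in the periphery can be spent on a denser focus view instead.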
More and more applications natively support Quad Views, such as DCS, Pavlov, MSFS2024, and iRacing, and with Pimax Play, dozens of other titles are supported, even if the developer didn't implement Quad Views themselves.
So yeah, the Super isn't a small headset. But for clarity, there's nothing like it. Not only does it have 29 million pixels and a massive field of view; thanks to the fast and accurate eye-tracking inside the lens barrel, DFR lets you get way more out of your gaming system, increasing both your render resolution and your frame rates.