A less sophisticated variant called fixed foveated rendering does not utilise eye tracking and instead assumes a fixed focal point.[3][4]
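The idea can be illustrated with a minimal sketch: with no eye tracking, the focal point is assumed to sit at the screen centre, and each tile of the frame is shaded at a rate that falls off with distance from that point. The function name, region thresholds, and rates below are illustrative, not taken from any particular implementation.

```python
def shading_rate(tile_x, tile_y, width, height):
    """Return the fraction of full resolution used to shade a tile.

    Hypothetical fixed foveated rendering: the focal point is assumed
    to be the screen centre, since there is no eye tracking.
    """
    cx, cy = width / 2, height / 2
    # Normalised distance of the tile centre from the assumed focal point.
    dx = (tile_x - cx) / (width / 2)
    dy = (tile_y - cy) / (height / 2)
    r = (dx * dx + dy * dy) ** 0.5
    # Illustrative fall-off: full rate near the centre, coarser outward.
    if r < 0.3:
        return 1.0   # full resolution in the central region
    elif r < 0.7:
        return 0.5   # half resolution in the mid-periphery
    else:
        return 0.25  # quarter resolution at the edges

# The centre of a 1920x1080 frame is shaded at full rate,
# while a corner tile is shaded at a quarter of the rate.
print(shading_rate(960, 540, 1920, 1080))  # → 1.0
print(shading_rate(0, 0, 1920, 1080))      # → 0.25
```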
History
Research into foveated rendering dates back at least to 1991.[5]
At TechCrunch Disrupt SF 2014, Fove unveiled a headset featuring foveated rendering.[6] This was followed by a successful Kickstarter campaign in May 2015.[7]
At CES 2016, SensoMotoric Instruments (SMI) demoed a 250 Hz eye-tracking system and a working foveated rendering solution. It resulted from a partnership with the camera sensor manufacturer Omnivision, which provided the camera hardware for the new system.[8][9]
In July 2016, Nvidia demonstrated at SIGGRAPH a new method of foveated rendering claimed to be invisible to users.[1][10]
In February 2017, Qualcomm announced its Snapdragon 835 Virtual Reality Development Kit (VRDK), which includes support for foveated rendering, called Adreno Foveation.[11][12]
At CES 2019, on January 7, HTC announced an upcoming virtual reality headset called the Vive Pro Eye, featuring eye tracking and support for foveated rendering.[13][14]
In December 2019, Facebook's Oculus Quest SDK gave developers access to dynamic fixed foveated rendering, allowing the level of detail to be changed on the fly via an API.[15]
On June 5, 2023, Apple announced that the Apple Vision Pro extended reality headset includes dynamic foveated rendering.[17]
Use
According to Michael Abrash, chief scientist at Oculus, utilising foveated rendering in conjunction with sparse rendering and deep learning image reconstruction has the potential to require an order of magnitude fewer rendered pixels than a full image.[18] These results were later demonstrated and published.[19]
Eye-tracked foveated rendering was first demonstrated in a consumer product, the Sony PlayStation VR 2 headset.[20]