Ming-Chi Kuo: Apple’s AR Headset Will Use Face ID Technology for Tracking Hand Gestures

Well-connected Apple industry analyst Ming-Chi Kuo says Apple’s much-rumored augmented reality headset will boast multiple highly sensitive 3D sensing modules to power an innovative user interface built around hand-gesture and object detection.

Kuo’s predictions come in a recent note to investors, shared by MacRumors.

We predict that the structured light of the AR/MR headset can detect not only the position change of the user or other people’s hand and object in front of the user’s eyes but also the dynamic detail change of the hand (just like the iPhone’s Face ID/structured light/Animoji can detect user’s dynamic expression change). Capturing the details of hand movement can provide a more intuitive and vivid human-machine UI (for example, detecting the user’s hand from a clenched fist to open and the balloon [image] in hand flying away).

Kuo says the headset will detect both hand gestures and hand movements to create a more immersive experience, such as letting the user open their hand to release a virtual balloon.
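Apple has not published any interface details for the headset, but a rough sense of the fist-to-open interaction Kuo describes can be sketched with Apple’s existing Vision framework, which already offers camera-based hand-pose detection on iPhone and iPad. The sketch below is purely illustrative: the FistToOpenDetector class, the classify helper, and the distance thresholds are hypothetical, and the headset’s structured-light sensing would presumably be far more capable than this 2D camera stand-in.

```swift
import Vision
import CoreGraphics
import CoreVideo

// Possible hand states derived from landmark geometry.
enum HandState {
    case fist
    case open
    case unknown
}

/// Classifies a detected hand as open or clenched by comparing the average
/// fingertip-to-wrist distance in normalized image coordinates.
/// The 0.25 threshold is illustrative, not calibrated.
func classify(_ observation: VNHumanHandPoseObservation) -> HandState {
    guard let points = try? observation.recognizedPoints(.all),
          let wrist = points[.wrist], wrist.confidence > 0.3 else {
        return .unknown
    }

    let tipNames: [VNHumanHandPoseObservation.JointName] = [
        .thumbTip, .indexTip, .middleTip, .ringTip, .littleTip
    ]
    let tips = tipNames.compactMap { points[$0] }.filter { $0.confidence > 0.3 }
    guard !tips.isEmpty else { return .unknown }

    let meanDistance = tips.map { tip -> CGFloat in
        let dx = tip.location.x - wrist.location.x
        let dy = tip.location.y - wrist.location.y
        return (dx * dx + dy * dy).squareRoot()
    }.reduce(0, +) / CGFloat(tips.count)

    return meanDistance > 0.25 ? .open : .fist
}

/// Watches consecutive frames and fires a callback when a clenched fist
/// opens -- the "balloon flies away" interaction from Kuo's note.
final class FistToOpenDetector {
    private var previousState: HandState = .unknown
    var onRelease: (() -> Void)?

    func process(pixelBuffer: CVPixelBuffer) {
        let request = VNDetectHumanHandPoseRequest()
        request.maximumHandCount = 1
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])

        guard (try? handler.perform([request])) != nil,
              let observation = request.results?.first else { return }

        let state = classify(observation)
        if previousState == .fist && state == .open {
            onRelease?()   // e.g. trigger the balloon-release animation
        }
        previousState = state
    }
}
```

In this hypothetical setup, a host app would feed camera frames into process(pixelBuffer:) and attach its balloon-release animation to the onRelease callback.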

Apple is expected to incorporate four sets of Face ID-like 3D sensors with higher quality and specifications than the sensors currently used in iPhones.

Kuo says the headset’s interface abilities include gesture control, object detection, eye-tracking, iris recognition, voice control, skin detection, expression detection, and spatial detection.

Apple is expected to debut its first-generation AR-focused device in 2022.