Apple’s Augmented Reality Headset Rumors Confirmed
According to sources, some Apple employees, including members of the headset team itself, have been skeptical about the device's prospects. Some have even left the project over concerns about its future, these people say.
Kuo claims that the AR headset will have 15 cameras, eight for augmented reality and six for “innovative biometrics.” Other reports have suggested that Apple is taking privacy seriously with its AR device.
The headset will include cameras for eye and hand tracking, letting users select on-screen items by looking at them and activate them with a gesture such as a pinch. It will use short- and long-range LiDAR to map the environment accurately, allowing for precise positioning of virtual objects and environments.
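The gaze-plus-pinch interaction described above can be sketched in a few lines. This is a hypothetical illustration, not Apple's actual API: the `UIElement` type, the normalized gaze coordinates, and the per-frame `handle_frame` function are all assumptions made for the example.

```python
# Illustrative sketch of gaze-plus-pinch selection; not Apple's API.
from dataclasses import dataclass

@dataclass
class UIElement:
    name: str
    x: float       # normalized display coordinates, 0..1
    y: float
    radius: float  # hit-test radius around the element's center

def element_under_gaze(elements, gaze_x, gaze_y):
    """Return the element whose hit radius contains the gaze point, if any."""
    for e in elements:
        if (gaze_x - e.x) ** 2 + (gaze_y - e.y) ** 2 <= e.radius ** 2:
            return e
    return None

def handle_frame(elements, gaze_x, gaze_y, pinch_detected):
    """Highlight whatever the user is looking at; activate it on a pinch."""
    target = element_under_gaze(elements, gaze_x, gaze_y)
    if target and pinch_detected:
        return f"activated {target.name}"
    if target:
        return f"highlighted {target.name}"
    return "no target"
```

In this model the eyes act as the pointer and the pinch acts as the click, so no element is ever activated without both a gaze target and a confirming gesture.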
The new headset will have two 4K displays for a total of 5.6 million pixels, with each display able to render imagery in full resolution where the user’s eyes are focused. This will allow for a much higher level of immersion than is possible with current AR headsets.
Apple will also include internal cameras that can be used for FaceTime video chats with a realistic version of the wearer’s own face and body. This feature is likely to be rudimentary at launch, and Apple is expected to improve it over time.
According to Mark Gurman, the headset will have a custom operating system designed for it that will offer a combination of iOS and macOS features. The name of this system hasn’t been announced, but references to “realityOS” have appeared in App Store upload logs spotted by eagle-eyed developers.
The headset will also support augmented reality apps optimized for the device, including iPad apps adapted to work on its 3D interface. There will be a dedicated Music app for playing Apple’s catalog, along with Fitness+ and Health apps that will let users follow exercise and meditation sessions with trainers in VR.
Apple has a huge team of people working on head-worn AR and VR hardware. These headsets are expected to eventually replace the iPhone, but first they need to be light enough, with the processing power required for proper see-through augmented reality. The company is said to be using its own silicon chips designed specifically for the headset, fabricated on the same 5-nanometer process as the iPhone processor. These will make the headset capable of performing complex tasks independently of an iPhone or Mac.
The headsets are also supposed to have multiple cameras for tracking eye movements and capturing facial features. This will allow the headset to display lifelike avatars and use foveated rendering to conserve power by only showing imagery where a user is looking. They are also supposed to be able to detect the force of a finger touching a surface and supply haptic feedback.
Like other Apple products, the headset will run a custom operating system, dubbed xrOS. It is said to be based on iOS and optimized for the headset, including hand and eye tracking and Siri support. Eagle-eyed developers have found references to the software in App Store upload logs. Earlier it was known as “realityOS,” but that name was recently changed to xrOS.
The headset is said to feature more than a dozen cameras that will track hand movements and gestures, one method of control along with eye tracking. It will also be able to perform foveated rendering, which conserves power by only showing the image in full resolution where the user is looking. The device will also be able to detect the user’s face to automatically render a lifelike avatar.
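Foveated rendering, mentioned above, exploits the fact that the eye only resolves fine detail in a small central (foveal) region. A minimal sketch of the idea follows; the angular thresholds and scale factors are illustrative assumptions, not values from any Apple specification.

```python
# Illustrative foveated-rendering sketch: pick a resolution scale for a
# pixel based on its angular distance from the current gaze direction.
import math

def shading_rate(pixel_angle_deg: float) -> float:
    """Resolution scale factor: full detail in the fovea, coarser outward.
    Thresholds here are made up for illustration."""
    if pixel_angle_deg <= 5.0:    # foveal region: full resolution
        return 1.0
    if pixel_angle_deg <= 20.0:   # near periphery: half resolution
        return 0.5
    return 0.25                   # far periphery: quarter resolution

def angular_distance(gaze, pixel) -> float:
    """Angle in degrees between two unit direction vectors
    (gaze direction and the direction toward a pixel)."""
    dot = sum(g * p for g, p in zip(gaze, pixel))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))
```

Because most of the display falls outside the foveal region at any instant, shading the periphery at a quarter of full resolution can cut rendering work substantially without the user noticing, which is why eye tracking and power savings are so tightly linked in these rumors.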
Apple’s augmented reality team is said to include experts from a variety of hardware and software companies including Oculus, Lytro, Microsoft, 3D animation company Weta Digital, and Lucasfilm. A former Apple hardware engineering chief, Dan Riccio, has recently joined the project as a special advisor to help with development.
According to a report from The Information, the headset will come with two Mac-level M2 processors to provide unprecedented computing power in a wearable device. Combined with the two Sony-made 4K micro OLED displays, the headset will reportedly be capable of true see-through AR functionality without requiring a separate display panel on the back of the device.
The headset’s sensors could include iris scanning technology, which is likely to be used to authenticate purchases. The system would also be able to scan the surrounding environment to identify and recognize objects. A July 2020 patent application describes how the headset can use infrared heat sensing or mmWave technology to sense when it has touched a real-world object and then project its own controls on top of that object for a sort of mixed reality overlay.
In order to be successful, an AR headset will need compelling first-party and third-party software experiences. Apple has reportedly been working with a select group of developers on augmented reality software for the device. The company has also reportedly developed tools that will allow non-developers to create their own AR apps for the headset. Gaming will reportedly be a large focus, as will other top-tier apps available on Apple devices. One rumor even claims that Apple is developing its own version of Google Tilt Brush, which allows users to create 3D paintings.
Another feature that is expected in an AR headset is LiDAR, which uses lasers to scan a room and quickly and accurately measure distances. This technology is already used on the 2020 iPad Pro and iPhone, and it could enhance AR headsets by helping to place virtual objects in a real-world environment more accurately.
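LiDAR ranging rests on simple time-of-flight arithmetic: a laser pulse travels to a surface and back at the speed of light, so the distance is half the round-trip path. A minimal sketch:

```python
# Time-of-flight distance calculation, the principle behind LiDAR ranging.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """The pulse travels out and back, so halve the total path length."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A 20-nanosecond round trip corresponds to a surface roughly 3 meters away.
room_scale = distance_from_round_trip(20e-9)
```

The nanosecond-scale timing this requires is why LiDAR can map a room far more quickly and accurately than camera-only depth estimation.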
In addition to gaming, augmented reality can be used for education, shopping, and other practical purposes. For example, mobile AR lets you point a smartphone camera at an object in the real world and overlays 3D graphical information on the screen. Unlike VR, which requires a headset to experience, mobile AR works with a standard smartphone and is accessible to anyone with a compatible app. Popular examples include Snapchat’s AR lenses, apps built with Apple’s ARKit framework, and Google Lens.