MIT Researchers Develop Augmented Reality Headset That Gives Users X-Ray Vision

Designed to be durable enough for everyday use, the headset is a smart solution for field service workers. It uses computer vision and wireless perception to automatically locate a specific item and then guide the user to retrieve it.

MIT researchers develop X-ray vision headset

MIT researchers have developed an augmented reality headset that gives users a form of X-ray vision. The headset uses computer vision and wireless perception to automatically locate items that are hidden from view, such as under a stack of papers or inside a box, and then guides the wearer to retrieve them.

The system uses a wide-band antenna that tightly matches the shape of the headset visor, allowing it to communicate with RFID-tagged objects. It then combines visual data with a spatial mapping algorithm to pinpoint the location of the object. When the headset finds the object, it shows it as a transparent sphere in an AR interface. The headset also verifies that the user has picked up the object correctly. In tests in a warehouse-like environment, the headset was able to localize hidden objects to within 9.8 centimeters on average and verified that the object was picked up with 96% accuracy.

Augmented reality is a powerful tool for collaboration, and the future is even more promising with the rise of immersive headsets like the Magic Leap 2. It can be used to demonstrate how to turn a screw or explain the wiring in an electrical fixture, and it’s especially useful when providing technical support. However, there are still many barriers to broader adoption of this technology. One important hurdle is the high cost of these devices.

X-AR locates hidden objects

The MIT researchers who developed X-AR have built an experimental system that uses augmented reality to show users where specific items are, even when they’re hidden from view. The headset combines computer vision and wireless perception to automatically locate a specific item, MIT said in a release. The team has modified a Microsoft HoloLens headset with an antenna that communicates with RFID (radio-frequency identification) tags. The flexible, lightweight antenna fits tightly on the headset visor without covering its cameras or obstructing the user’s field of view.

The antenna sends radio frequency signals that can pass through common materials like cardboard boxes and plastic containers, and reflects off the tags on items. The system combines this information with the headset’s graphical displays to direct the user to the location of the object. Once the item is located, it shows up as a transparent sphere in the headset interface. The user can select the item from a menu, and the X-AR system verifies that the correct object has been picked up.
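The guidance step described above can be pictured as simple vector math: given the headset's pose and the estimated tag location, compute a turn direction and a distance cue for the wearer. A minimal sketch in Python (the function name, coordinate convention, and cue format are illustrative assumptions, not details of the X-AR system):

```python
import numpy as np

def guidance_cue(headset_pos, forward, target):
    """Turn a headset pose and a target location into a wearer-facing cue."""
    v = target - headset_pos
    dist = np.linalg.norm(v)
    # Compare headings in the horizontal (x-z) plane for a left/right cue.
    yaw_target = np.arctan2(v[0], v[2])
    yaw_forward = np.arctan2(forward[0], forward[2])
    turn = np.degrees(yaw_target - yaw_forward)
    side = "right" if turn > 0 else "left"
    return f"turn {side} {abs(turn):.0f} deg, {dist:.1f} m ahead"

# Wearer at head height, looking down +z; item on a shelf ahead and to the right.
cue = guidance_cue(np.array([0.0, 1.6, 0.0]),
                   np.array([0.0, 0.0, 1.0]),
                   np.array([1.0, 1.0, 1.0]))
print(cue)  # turn right 45 deg, 1.5 m ahead
```

In the real system this cue is rendered visually (the transparent sphere and on-screen guidance) rather than as text, but the underlying geometry is the same.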

In tests in a warehouse-like environment, the headset could localize objects to within 9.8 centimeters on average and verify that the user had picked up the correct item 96 percent of the time. The team plans to further develop the X-AR system, and explore how it might work with other sensing modalities, including WiFi, mmWave technology, and terahertz waves.

X-AR verifies object pickup

The headset combines several technologies to find the location of the RFID-tagged object. It uses a method similar to synthetic aperture radar, which is used by airplanes to locate objects on the ground. It takes multiple measurements from different vantage points to build up a map of the area and determine where the item is located in that map. It then combines this with visual data from the headset’s eye-tracking system to verify that a user is picking up the right object.
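The synthetic-aperture idea can be sketched in a few lines: predict, for each candidate location, the round-trip phase the antenna would measure from every vantage point, and pick the candidate where predictions and measurements align coherently (backprojection). This is a toy free-space simulation under assumptions of my own (antenna trajectory, grid sizes, and all names are illustrative, not from the X-AR paper):

```python
import numpy as np

WAVELENGTH = 0.33  # metres, roughly the 915 MHz UHF RFID band

def round_trip_phase(antenna_pos, tag_pos):
    """Phase accumulated on the antenna -> tag -> antenna path."""
    d = np.linalg.norm(antenna_pos - tag_pos)
    return (4 * np.pi * d / WAVELENGTH) % (2 * np.pi)

def localize(antenna_positions, measured_phases, candidates):
    """Return the candidate whose predicted phases best match the
    measurements: aligned phases sum constructively (backprojection)."""
    best, best_score = None, -np.inf
    for c in candidates:
        predicted = np.array([round_trip_phase(p, c) for p in antenna_positions])
        score = np.abs(np.sum(np.exp(1j * (measured_phases - predicted))))
        if score > best_score:
            best, best_score = c, score
    return best

# Antenna poses sampled as the wearer walks past a shelf; the slight
# bobbing in height mimics natural head motion and breaks ambiguities.
xs = np.linspace(0.0, 1.0, 20)
positions = np.array([[x, 0.0, 1.5 + 0.1 * np.sin(8 * x)] for x in xs])
tag = np.array([0.6, 1.2, 1.0])  # hidden tagged item
phases = np.array([round_trip_phase(p, tag) for p in positions])

# Coarse 3-D search grid around the shelf.
grid = [np.array([x, y, z])
        for x in np.linspace(0.0, 1.0, 11)
        for y in np.linspace(0.5, 1.5, 11)
        for z in np.linspace(0.5, 1.5, 11)]
estimate = localize(positions, phases, grid)
print(estimate)  # close to the true tag position
```

The real system must also cope with multipath, phase noise, and signals attenuated by the occluding material, which this idealized sketch ignores.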

In tests conducted in a setting that simulates a warehouse, the team found that the system could localize hidden items to within 9.8 centimeters, on average, and correctly verified that users picked up the correct item 96% of the time. They say the technology could be useful in e-commerce warehouses and other industrial settings, where it can help workers quickly locate parts in cluttered bins.
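The pickup check can be illustrated with a toy heuristic: if the tag's measured range changes in lockstep with the wearer's tracked hand, the tagged item is probably the one being moved. The correlation test below and all names are my own illustration under that assumption, not the paper's actual verification method:

```python
import numpy as np

def picked_up_correct_item(tag_ranges, hand_ranges, threshold=0.9):
    """Correlate frame-to-frame changes in the tag's RF-measured range
    with the hand's tracked range; strong agreement suggests the
    tagged item is the one in the wearer's hand."""
    corr = np.corrcoef(np.diff(tag_ranges), np.diff(hand_ranges))[0, 1]
    return bool(corr > threshold)

rng = np.random.default_rng(0)
hand = np.array([2.0, 1.7, 1.3, 0.9, 0.6])          # hand moving toward body
right_tag = hand + rng.normal(0, 0.005, 5)          # tag travels with the hand
wrong_tag = np.array([2.0, 2.01, 1.99, 2.02, 2.0])  # tag barely moves

print(picked_up_correct_item(right_tag, hand))  # True
print(picked_up_correct_item(wrong_tag, hand))  # False
```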

X-AR guides user

A key challenge in AR is figuring out how to deliver information in a way that makes it useful. The web changed how information is collected, stored and delivered, but it’s still limited by a 2-D model and requires users to mentally translate that information for use in a 3-D world. A new MIT system called X-AR addresses this problem by superimposing digital information directly on real objects and environments.

Using a modified Microsoft HoloLens headset, the X-AR system combines RFID (radio frequency identification) tags with augmented reality to create a kind of “X-ray vision” for the physical world. It can locate the tagged items and guide users toward them, and it can verify that users have picked up the right item. In a warehouse setting, X-AR was able to detect the location of tagged items to within 9.8 centimeters (under four inches) and verify that users had picked up the correct item with 96% accuracy.

The MIT team developed a lightweight antenna that’s attached to the headset’s visor and that leverages natural human motion to localize RFID-tagged objects in the environment. The device uses a technique called synthetic aperture radar, similar to how airplanes image objects on the ground, to take measurements with the antenna from different vantage points as a user moves around a room and then combines those measurements.
