Exploring the Future of AR with Vision Pro Technology

There are plenty of signs pointing to where AR glasses, more compact headsets, and new interfaces are headed. But for all of these technologies to work well, they will need better compatibility with our phones and computers.


I recently traveled to Long Beach, California, for the AWE augmented and virtual reality conference, but forgot to bring my mixed reality headsets, the Apple Vision Pro and Meta Quest 3, with me from New Jersey.

Instead, I brought along two pairs of smart glasses: Meta’s Ray-Bans and Xreal’s Air 2 Pro. I used the Ray-Bans for taking photos and making calls during the trip. I enjoyed watching movies on the plane with the Xreal glasses, and surprisingly, I didn’t even miss wearing those bulky VR goggles at all.

These high-tech glasses, by contrast, look a lot like the traditional glasses I wear every day. Both Meta’s and Xreal’s glasses are easy to put on and take off, and fold up to fit into small cases.

The two serve different purposes – the Ray-Bans are glasses with headphones and a camera but no displays, while Xreal’s glasses have built-in displays and speakers but need a wired connection – and neither offers the full mixed reality experience of a Vision Pro or Quest 3.

Still, their growing usefulness points to a future of augmented reality without bulky headsets.

I expect the next iteration of Apple’s Vision Pro to arrive within the next year or two, and it is likely to be smaller, more affordable, and possibly able to connect to a phone, MacBook, or iPad.

Reports suggest that Meta’s move toward AR glasses may take longer, since the company needs to bridge the gap between its self-contained, bulky Quest headsets and the more limited, phone-connected Meta Ray-Bans.

This transition will not happen quickly, and I do not expect the next Vision headset to look anything like glasses, but a gradual shift toward smaller devices is on the horizon.

At the same time, AWE made clear to me that better lenses, displays, and hand tracking are on the way but still face significant obstacles. How will future eyewear handle all of its processing? What about battery life? These gadgets need to evolve, and exactly how they do so will be shaped by Google and Apple.


Vision Pro marks the beginning of a fresh era in interface design

Several developers and startups I spoke with described the Vision Pro as setting the standard for the coming user interface. The Vision Pro’s hand-and-eye-tracking technology may have flaws, but it currently stands as the most advanced attempt at a controller-free interface.

The Meta Quest 3 and Vision Pro were widely showcased throughout the AWE expo floor, featuring numerous peripheral and software demonstrations. This is due to their shared capability of hand tracking and their ability to seamlessly blend camera footage of the physical world with virtual graphics for a truly immersive mixed reality experience.

Glasses have not yet mastered this combination of hand and eye tracking and mixed reality, but they are likely to make significant advances in the coming years. Several companies I spoke with are actively working on these challenges.

For now, the Vision Pro looks like the most advanced version of that mixed reality future, with the Quest 3 as a more affordable alternative. Headsets such as the Vive XR Elite and Magic Leap 2 are also serving as platforms for startups to explore new ideas in the field.

New ways to track hand movements on compact devices

Larger headsets such as the Vision Pro and Quest 3 can already handle advanced hand tracking; glasses, however, currently lack the battery capacity to support such features. I came across several proposals to address this limitation, including new types of cameras and connected wrist devices.

Doublepoint, a touch interface developer, uses a software layer on Samsung watches to add motion control and pinch-and-tap gestures directly on your wrist. Using this technology, I was able to control real lights in a smart home setup, move a cursor, and change music tracks on a smart TV.

I also saw people using motion tracking on a Magic Leap 2 headset while wearing the gesture-enhanced watch. Another startup, TapXR, has developed a motion-sensing wristband equipped with a camera that recognizes tap and pinch gestures.

The company is also exploring tap-to-type systems on tabletops. These new wearables with advanced gesture capabilities seem to be paving the way for the future. Apple’s latest watch already features double-tap gestures, indicating that more innovations may be on the horizon.
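
To make the idea concrete, here is a minimal sketch of how a wrist wearable might infer a tap from raw accelerometer data: watch for brief spikes in acceleration above the resting gravity reading and debounce them. The sample class, thresholds, and function names are all invented for illustration; this is not Doublepoint’s or TapXR’s actual approach, which likely involves far more sophisticated signal processing or machine learning.

```python
# Illustrative only: a toy tap detector over accelerometer samples,
# loosely mimicking how wrist wearables infer tap gestures.
# None of this reflects Doublepoint's or TapXR's real implementations.
from dataclasses import dataclass
from typing import Iterable, List

GRAVITY = 9.81          # m/s^2, roughly cancels the constant gravity component
TAP_THRESHOLD = 6.0     # m/s^2 of excess acceleration treated as a "tap" spike
DEBOUNCE_S = 0.25       # ignore further spikes this soon after a detected tap

@dataclass
class Sample:
    t: float            # timestamp in seconds
    ax: float
    ay: float
    az: float

def detect_taps(samples: Iterable[Sample]) -> List[float]:
    """Return timestamps where the acceleration magnitude spikes above threshold."""
    taps: List[float] = []
    last_tap = float("-inf")
    for s in samples:
        magnitude = (s.ax**2 + s.ay**2 + s.az**2) ** 0.5
        excess = abs(magnitude - GRAVITY)   # distance from the resting gravity reading
        if excess > TAP_THRESHOLD and (s.t - last_tap) > DEBOUNCE_S:
            taps.append(s.t)
            last_tap = s.t
    return taps

if __name__ == "__main__":
    # A still wrist, one sharp finger tap at t=0.50s, then stillness again.
    stream = [Sample(t=i * 0.01, ax=0.0, ay=0.0, az=9.81) for i in range(100)]
    stream[50] = Sample(t=0.50, ax=4.0, ay=3.0, az=18.0)
    print(detect_taps(stream))   # -> [0.5]
```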

Ultraleap, a company known for hand-tracking technology in VR and AR headsets, is experimenting with smaller, more energy-efficient event camera technology. These cameras detect only broad changes in light and motion rather than fine detail, and could potentially run for hours on compact glasses.

The goal is to detect subtle hand movements, similar to Apple’s Vision Pro but without the high power consumption of infrared sensors. I had the opportunity to try out a demo using modified Meta Ray-Bans, which allowed me to control music playback and select songs by moving my fingers and tapping on the glasses.

The aim is to create a user interface that is more intuitive than the current touchpad on Meta’s glasses, making it easy to use while on the go.
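
As a rough illustration of why event cameras are so frugal, here is a toy sketch that works directly on sparse change events rather than full frames, classifying a left-or-right finger sweep by how the centroid of activity shifts over a short window. The event format, thresholds, and gesture names are assumptions made for the example; Ultraleap’s actual pipeline is certainly more involved.

```python
# Illustrative only: a toy gesture classifier over event-camera output.
# Event cameras emit sparse (x, y, timestamp, polarity) events where brightness
# changes, instead of full frames, which is what makes them so power-friendly.
# This is not Ultraleap's pipeline; names and thresholds are invented.
from dataclasses import dataclass
from typing import Iterable, List, Optional

@dataclass
class Event:
    x: int         # pixel column where brightness changed
    y: int         # pixel row
    t: float       # timestamp in seconds
    polarity: int  # +1 brighter, -1 darker

def classify_swipe(events: Iterable[Event], window_s: float = 0.2) -> Optional[str]:
    """Split events into an early and a late half of the window and compare the
    horizontal centroid of activity; a large shift is treated as a swipe."""
    early: List[int] = []
    late: List[int] = []
    for ev in events:
        (early if ev.t < window_s / 2 else late).append(ev.x)
    if not early or not late:
        return None
    shift = sum(late) / len(late) - sum(early) / len(early)
    if shift > 20:
        return "swipe_right"   # e.g. skip to the next song
    if shift < -20:
        return "swipe_left"    # e.g. go back a track
    return None

if __name__ == "__main__":
    # Simulate a finger sweeping left-to-right across the sensor.
    stream = [Event(x=10 + int(300 * t), y=64, t=t, polarity=1)
              for t in [i * 0.01 for i in range(20)]]
    print(classify_swipe(stream))   # -> "swipe_right"
```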

Speculation about Apple creating AirPods equipped with cameras may not be as far-fetched as it sounds, especially if those devices use energy-efficient event cameras like the ones Ultraleap showed off.

Such cameras might not be able to capture photographs, but they could help monitor the surroundings so that hand gestures work with or without glasses.


The screens and optics are nearly ready

We observed demonstrations of lens and display technology that revealed the potential for creating smart glasses with large, clear displays that look normal. However, it remains uncertain how cost-effective or energy-efficient these glasses will be.

Lumus, a startup specializing in lens components, has developed its own high-quality reflective waveguides that can be integrated with prescription lenses. I saw sharp, vivid images in several demonstrations through what looked like ordinary glasses – although those glasses were connected to a nearby laptop that supplied the images.

Avegant, a different supplier, specializes in the display engines that drive these glasses and project 3D visuals onto lenses similar to Lumus’.

I tried a different pair of smart glasses – wireless and almost normal-looking – that projected bright images onto waveguide lenses, images I could still see while gazing out a sunlit hotel window in the middle of the day.

Waveguide technology uses tiny etchings on lenses to bounce images from side-mounted projectors into our eyes. Companies such as Microsoft and Magic Leap use it in the HoloLens and Magic Leap headsets, and the lenses and displays involved are steadily becoming more compact.

Building a processor that can drive these displays and lenses efficiently without adding bulk is still a work in progress: Magic Leap does it with a processor worn on the hip, while the HoloLens 2 is a head-mounted setup nearly as large as a Meta Quest headset.

These glasses will need to connect to some other device, and that device will probably be our phones.

Phones and laptops need to evolve to work with eyewear

The problem with today’s phones is that they struggle to meet the needs of many AR glasses. Apple restricts iPhones to simple display output, which works fine for wearable display glasses such as the Xreal Air 2 Pro that connect over USB-C like any other external monitor. Android phones, on the other hand, vary in what they support, which leads to fragmentation.



Xreal has developed its own augmented reality ecosystem, known as Nebula, which runs as an app on Android devices and turns its glasses into true AR accessories. But not all Android devices can run it smoothly.

To get around this fragmentation, Xreal has created its own device built specifically for its glasses: the Xreal Beam Pro. It points to one possible direction for future smartphones.

Priced at $200, the Beam Pro is an Android device that functions similarly to a phone, but is designed to be used with Xreal’s glasses as part of a seamless multiscreen computer setup.

It offers multitasking, 3D AR modes, and optional 5G wireless connectivity for use away from Wi-Fi. The Beam Pro also has a second charging port so it can stay plugged in while in use, as well as spaced-apart cameras for capturing 3D “spatial” photos and videos.

For years, Qualcomm has been advocating for AR glasses that are driven by phones, but so far Google has not integrated this connection into the core of its Android OS.

However, this could be changing soon as Hugo Swart, who previously worked for Qualcomm as head of XR, has recently joined Google. Additionally, Google is working on an XR headset with Samsung and Qualcomm, which is expected to be announced within the next year. This could be the beginning of a closer link between Google Play, headsets, and phones.

Apple may eventually integrate its next-generation Vision hardware more deeply with iPhones and iPads; for now, the Vision Pro only acts as a secondary monitor for Macs. Laptops are also expected to integrate more closely with glasses, an area that Spacetop, a product shown off at AWE, is exploring.

Spacetop looks like a lidless Chromebook paired with tethered Xreal AR glasses; the glasses serve as the laptop’s display and track your head movement so the virtual screen stays anchored in place, like an ethereal extension of the machine.
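
The underlying idea of a display that stays anchored is essentially a coordinate transform: the virtual screen is pinned to a point in the world, and every frame the glasses’ head pose is used to work out where that point falls in the wearer’s view. Here is a minimal sketch of that math, assuming a rotation-plus-position head pose; it is not Spacetop’s or Xreal’s actual code.

```python
# Illustrative only: the general math behind a world-anchored virtual display.
# Given the glasses' head pose (rotation R, position p in world space), a screen
# pinned at a fixed world point appears at R^T @ (anchor - p) in view space,
# so it stays put while the head moves. Not Spacetop's or Xreal's actual code.
import numpy as np

def world_to_view(anchor_world: np.ndarray,
                  head_rotation: np.ndarray,
                  head_position: np.ndarray) -> np.ndarray:
    """Transform a world-space anchor point into the glasses' view space."""
    return head_rotation.T @ (anchor_world - head_position)

if __name__ == "__main__":
    # Virtual screen anchored 1 m in front of the user's starting position.
    anchor = np.array([0.0, 0.0, -1.0])

    # Pose 1: head at the origin, looking straight ahead (identity rotation).
    print(world_to_view(anchor, np.eye(3), np.zeros(3)))   # [0, 0, -1]: dead center

    # Pose 2: head turned 30 degrees to the left around the vertical (y) axis.
    theta = np.radians(30)
    r_yaw = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                      [0.0,           1.0, 0.0],
                      [-np.sin(theta), 0.0, np.cos(theta)]])
    # The anchor now lands to the right of the view, exactly where the
    # "fixed" screen should be drawn so it doesn't follow the head.
    print(world_to_view(anchor, r_yaw, np.zeros(3)))
```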

Spacetop and the Xreal Beam Pro come across as initial attempts to address the challenge of powering AR glasses with external devices. However, in the future, all our technology should seamlessly integrate with glasses, much like how headphones effortlessly connect with our devices.

Furthermore, in the long run, we can expect various gadgets such as cameras, watches, and other wearables to offer similar hand-tracking capabilities, as hinted at by recent reports about AirPods potentially incorporating cameras.

Expect slightly smarter glasses while the phone problems get sorted out

True augmented reality glasses aren’t widely available yet, largely because today’s smartphones aren’t equipped to support them, and neither is the environment they would be used in.

There are unresolved safety issues for public use, as well as challenges with location-based AR systems that rely on map applications to provide seamless shared experiences in particular locations.

Additionally, there is uncertainty surrounding whether phone batteries and processors have the capability to sufficiently power AR glasses without either overheating or requiring frequent recharging.

For now, the next generation of mixed reality headsets is likely to take inspiration from the Vision Pro and Quest 3, adding more augmented reality features for use around the home.

Expect glasses to gradually gain more artificial intelligence capabilities, along with better cameras, audio, and displays. Somewhere between the Meta Ray-Bans and Xreal’s glasses lies my ideal pair of cutting-edge smart glasses, and improved versions of both are expected to hit the market within the next year.

Those glasses should be able to link with our smartphones, but our phones will need to advance and integrate with them better.

