
Opinion: Why AR is the future despite Apple Vision Pro backlash, and one thing Google Glass did right

Apple just gave what I’d consider our first realistic look at what’s possible and achievable in augmented reality with today’s technology. Believing “AR is the future” might bias me, but I think the backlash that has since emerged is short-sighted, unfairly cynical, and missing the bigger picture. That reaction, I believe, was caused by a miscalculation in Apple’s Vision Pro presentation.

On VR, AR, MR, XR and other terminology

Tim Cook yesterday announced Apple Vision Pro and visionOS as an “entirely new AR platform.” Traditionally, AR means directly seeing the real world with your own eyes through a piece of glass that overlays graphics. By that definition, Apple’s headset is mixed reality (MR) or extended reality (XR), because you’re viewing the world through a camera feed and displays, albeit very high-resolution 4K ones.

That said, I still believe Apple gave the best look yet at what AR could be, since transparent display technology is not yet feasible, at least to the standard of traditional screens. The experiences Apple showed can very much be applied to actual AR glasses once that hardware is ready, whenever that may be.

Why AR for users

In brief, augmented reality overlays contextual information over the real world and your line of sight. For example, there could be floating arrows, street names, and other geographical markers when navigating, such as what Google Maps Live View does through your smartphone. Imagine review scores projected over the fronts of restaurants (again Live View today) when you’re looking for some place to eat or on products when you’re shopping in a store (Google Lens). Similarly, foreign text on menus or signs could be translated and projected over the real world.

I believe this is helpful and the inevitable end goal of personal computing: with the right smart assistant, technology can ambiently aid you throughout your life without having to be explicitly asked. Maybe it’s as simple as enabling a navigation, shopping, or translation mode/filter.

Some of these experiences already exist on smartphones today, but they will be most natural on smart glasses.

This will require a lot of computer vision (image recognition) and contextual awareness of your environment (location, current activity, etc.), which will undoubtedly require cameras that are always on and tracking, hopefully in a privacy-preserving manner.

The promise is that this is less disruptive than today’s screens and that it will actually let technology disappear into the background. That might seem counterintuitive, but the hope is that your interactions with AR glasses are so brief that you get what you wanted to know and then stop using them. Let’s take those three examples again:

  • Directions: You turn on navigation and go on with your trip
  • Shopping: You enter a store, get to the aisle you want, and the information appears
  • Translations: The glasses are aware of which languages you know and step in when you look at one you don’t for an extended period of time

Why AR for Apple, Meta, Google, etc.

More than desktops, laptops, and smartphones, AR can be incredibly immersive and all-encompassing. It places a great deal of control (and trust) in the OS to literally surface what you see and interact with.

You would be correct to be cynical about that. You should have high expectations for privacy. For example, what you look or simply gaze at should not be available to third-party advertisers. 

As seen on mobile, the operating system maker and app store overseer play an outsized role in what third-party developers can offer. Some of the enforced mandates can be good (privacy/security); others are capricious (cuts of revenue).

Google and Apple won mobile computing, and there is an ongoing race to own the next form factor. The stakes are heightened by the fact that AR glasses are really the only device category that could ever displace the smartphone in mass-market adoption and appeal. In fact, glasses can go beyond the phone by also becoming your productivity device (laptop), given floating screens paired with either a physical or a virtual full-sized projected keyboard. They can also be the entertainment device that replaces the simple screen that is your TV for watching movies and shows and playing games, while unlocking other forms of immersive entertainment.

The stakes are so elevated that Facebook made a “bet the company” move with the Meta rename, dedicating huge amounts of resources to Reality Labs and the metaverse.

Why the backlash:

What Apple did wrong

All new technologies are met with criticism at launch, and yesterday was no exception for Apple. People vehemently reacted against wearing a headset that, to them, looks bulky and isolating. Apple’s attempt to counteract that last criticism is an OLED screen on the front of Vision Pro (a feature Apple calls EyeSight) that shows your eyes to others when you’re looking at the real world. Critics simply found that aspect of Apple Vision Pro creepy and unnatural.

Beyond the bulk, some – if not most – people will always react against putting a headset on their face. However, it’s important to keep in mind that all technology has to start somewhere, and these headsets are how the technology improves. Everyone’s end goal here is a pair of glasses that look indistinguishable from what we wear today.

The more specific complaint, I believe, is against the idea of floating virtual screens or objects. At least with a phone or laptop, you can turn the display off or shut the lid. Science fiction, extrapolating from the state of pop-up ads on webpages, has imagined AR as an environment filled with annoying overlays everywhere. As I said earlier, it’s not wrong to be cynical and expect more from this upcoming technology, but I think the companies building it are well aware of what people will tolerate. As consumers, having at least three big platforms competing in this space will provide choice and meaningful competition.

Besides the inherent ridiculousness of wearing a headset, and doing so all day for work, some people yesterday were very against the idea of wearing such a thing to record memories, as Vision Pro is also Apple’s “first 3D camera.” Here, I agree and think Apple went a step too far in trying to make its headset useful. People are used to pointing phones at things, and Apple’s first 3D camera probably should have been reserved for the phone. I don’t think the core idea (a more realistic way to save memories) is bad, but people are not yet conditioned to the format.

Historically, Apple only launches technology that is ready. Its marketing is often truthful, and as such, it does not shy away from showing people wearing the headset all day for work and using it as a camera. For that, Apple has been lambasted, as mentioned above. In building these platforms, it’s good that companies are being frank with marketing, showing the literally unappealing angles as well as the tethered battery pack in every shot.

…and Google did right 11 years ago

However, in Apple’s truthfulness and desire to only show what’s possible today, it left out the future. The company should be commended for only showing what’s ready, but to most people that just looks like floating screens.

As the backlash shows, people do not yet get the potential. Apple should have explicitly telegraphed that the end goal is a pair of normal glasses that help you throughout your day. I realize that’s the antithesis of how the company operates, but it’s clear people need a better idea of what’s coming.

More so, I think Apple should have shown the utility of AR overlays outside the home. What I’m thinking of is Google’s original concept video for Glass, called “One day…,” from 2012. Its AR interactions are visually light and quick, with notifications appearing in the corner until you open them. The UI is minimal and falls into the first camp of AR as a helpful overlay, rather than the floating screens and direct monitor/TV replacement that look bulky today. That’s a vision people can get behind.

That video was very much not how Project Glass ever ended up looking, and Google should have been even more explicit that it represented a long-term vision rather than the product. Yet, 11 years later, I think it best sums up what AR could be like, and a modern take on it from Apple, and later Google, is needed.


You’re reading 9to5Google — experts who break news about Google and its surrounding ecosystem, day after day.


Author

Abner Li

Editor-in-chief. Interested in the minutiae of Google and Alphabet. Tips/talk: abner@9to5g.com