Right Now, Augmented Reality is Only a Feature, Part 1
A VentureBeat article from a few weeks back described augmented reality (AR) as only a feature in an app. The article states that "(AR offers) developers and companies an opportunity to engage users of their existing apps with new AR features. But it might be better to approach this tech as “AR as a feature” — not as the end-all, be-all of the app."
We couldn't agree more.
In a nutshell, AR for the mainstream is only part of a smartphone app, not the "whole user experience," since AR is not (yet) your main interface to your computer. Until a pair of really-smart glasses replaces your phone, we can't see anyone going full "Minority Report." We have a few observations from our past experiences.
Holding Up a Phone to View AR Content is Awkward
Even with all the marketing apps and tests we've built, it's still really awkward to hold your phone up and point your camera at markers, surfaces, or anything else. Because you don't have a big field of view and you don't know the size of the content, users are constantly adjusting their vantage points to see the content in relation to its space. This problem is even worse with markers. We've watched users struggle to stay close enough to a marker to keep the experience triggered while moving around to get the best view of it, and moving too far can kill the experience entirely. See the giphy below:
Even with ARKit/ARCore or whatever markerless system you use, you'll still run into vantage-point adjustment issues. Tracking quality depends on the texture of the surface you're projecting 3D content onto and the number of image features in your environment (the tracker needs enough features to "lock" into place).
In the giphy, you can see me moving back and forth toward the marker for a second or two, which is enough to break the experience.
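To make the "enough image features" idea concrete, here's a minimal, hypothetical sketch. Real trackers like ARKit and ARCore are far more sophisticated, but the core intuition is the same: a featureless surface (a blank wall) gives the tracker nothing to lock onto, while a textured one does. We approximate "features" with a simple gradient-magnitude count on two synthetic images; the `MIN_FEATURES` threshold is made up for illustration.

```python
import random

def feature_count(image, threshold=30):
    """Count pixels whose combined horizontal + vertical gradient
    exceeds a threshold -- a crude stand-in for detectable features."""
    h, w = len(image), len(image[0])
    count = 0
    for y in range(1, h):
        for x in range(1, w):
            gx = abs(image[y][x] - image[y][x - 1])  # horizontal gradient
            gy = abs(image[y][x] - image[y - 1][x])  # vertical gradient
            if gx + gy > threshold:
                count += 1
    return count

random.seed(0)
SIZE = 64
# Uniform brightness: a textureless wall -- no gradients, no features.
flat_wall = [[128] * SIZE for _ in range(SIZE)]
# Random noise: a richly textured surface with features everywhere.
textured = [[random.randint(0, 255) for _ in range(SIZE)]
            for _ in range(SIZE)]

MIN_FEATURES = 100  # hypothetical tracker requirement
print(feature_count(flat_wall) >= MIN_FEATURES)  # False: tracking fails
print(feature_count(textured) >= MIN_FEATURES)   # True: enough to "lock"
```

This is why pointing your phone at a blank ceiling or a glossy floor tends to drop tracking, while a busy carpet or a cluttered desk holds it: the tracker simply has more to grab onto.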
Holding Up a Phone is Tiring, which makes AR Temporary
When we were working on our construction product, we learned some of our biggest lessons about smartphones, because bad experiences are amplified on a job site. Dr. Julien was testing in an underground subway station, and it quickly became apparent that you can't hold your phone up for more than a few minutes at a time. We'd be looking up at the HVAC, and then our necks and arms would get tired. It's as simple as that. We couldn't expect a site superintendent to do this all day.
Holding Up a Phone is a Safety Issue, which makes AR Temporary
Last but not least, on the construction site, danger is everywhere, so worker safety was paramount for us. When testing with a smartphone held out in front of you, you're so focused on the screen that you don't notice things in your periphery. Sometimes a beam was sticking out, a worker was cutting something, or a massive hole was right in your path. These dangers made us always work in pairs: one person would test while the other would guide. Even in the safety of your home, the awkward use of a smartphone for AR means you could walk into your couch or knock something off your bookcase just to get a better vantage point.
So from our perspective, all these constraints mean you end up experiencing AR in a relatively small area for a short amount of time.
In Part 2 of AR as an app feature for smartphones, we'll briefly chat about battery power, smartglasses being a better platform for maintaining tracking (it's on your face, not in your hand), and why gestures for AR (like Minority Report) are still not the way forward. Watching Tom Cruise use his computer looked really tiring.