Augmented reality (AR) is about to hit the mainstream. Gamers would probably argue it’s already huge, but I’m talking about AR breaking out of gaming and into any number of other industries, all of which are currently engaged in extensive testing and trialing of AR-based use cases. For example, Gartner has predicted that within three years, 100 million consumers will shop using augmented reality.
When Apple released ARKit, a new application framework for developers introduced with iOS 11, it enabled apps to interact with the real world in totally new ways. ARKit has made the development of AR applications much easier by providing support for core capabilities such as surface detection and light and shadow mapping. It has opened up possibilities for use cases in industries like retail, healthcare and fintech, to name a few.
AR will be the new interaction paradigm for consumer and business app users.
The promise of AR-enabled apps also brings with it a set of very real challenges that need to be overcome. From both a development and a quality engineering (QE) standpoint, subtle user experience factors must be addressed: a flawless user experience will be essential for the successful adoption of any AR application. That means developers and testers must put the customer right at the heart of their approach, keeping in mind the limitations of the ARKit platform and adapting their applications accordingly to ensure a smooth experience for their users.
Here, we take a closer look at ARKit and at what developers and testers need to consider when building AR applications that are truly customer-centric in design.
Low Lighting Conditions
Consider a shopping application that uses AR: if the application is opened in low lighting conditions, the scene recognition engine will not properly recognize the products, leading to inconsistencies in the user experience. Rather than generating incorrect output, the application should detect the low-light condition and display a warning message.
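ARKit exposes a per-frame light estimate via ARFrame.lightEstimate.ambientIntensity, where a value around 1000 corresponds to a well-lit scene. The check could be modeled as below in plain Swift; the 500-lumen cutoff is an assumed threshold that each application would tune for itself.

```swift
// The minimum ambient intensity below which the app should warn the user
// rather than attempt scene recognition. 500 is an assumed value; ARKit
// reports roughly 1000 for a well-lit environment.
let minimumAmbientIntensity: Double = 500

// Returns a user-facing warning when the scene is too dark, or nil when
// lighting is sufficient for reliable recognition.
func lightingWarning(ambientIntensity: Double) -> String? {
    guard ambientIntensity < minimumAmbientIntensity else { return nil }
    return "The room is too dark for reliable product recognition. Please move to a brighter area."
}
```

In a real app, this check would run in the session's frame update callback, so the warning appears (and disappears) as lighting conditions change.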
Fast or Shaky Motion
In the same shopping application, if the device is shaky or moving very fast, the framework will be unable to process the incoming camera stream. This results in blurred images, incorrect output or the application crashing altogether. Applications need to be smart enough to prompt users to hold their devices steady.
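ARKit surfaces this condition through ARCamera.TrackingState, whose .limited(.excessiveMotion) case signals that the device is moving too fast for tracking. A minimal sketch, using a simplified mirror of that enum, shows how each state can map to a user-facing hint instead of silently producing bad output:

```swift
// Simplified stand-in for ARKit's ARCamera.TrackingState, which the session
// delegate receives via session(_:cameraDidChangeTrackingState:).
enum TrackingState {
    case normal
    case notAvailable
    case limitedExcessiveMotion
    case limitedInsufficientFeatures

    // A message to surface in the UI, or nil when tracking is healthy.
    var userHint: String? {
        switch self {
        case .normal:
            return nil
        case .notAvailable:
            return "Tracking unavailable. Try restarting the AR session."
        case .limitedExcessiveMotion:
            return "You are moving too fast. Hold the device steady."
        case .limitedInsufficientFeatures:
            return "Point the camera at a more detailed surface."
        }
    }
}
```

Pausing object placement while any hint is showing is one way to keep the app from rendering against a blurred or unreliable camera stream.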
AR Content Dimension
Imagine a furniture store has an AR-enabled application that allows customers to “see” the product in their own home before deciding to buy. The application would need to take into account the product dimensions, the room dimensions and so on. It should only “place” the object in the room if there is enough space available.
Similarly, imagine an AR advertising app in which AR content is shown to the viewer upon scanning a QR code. The augmented object should be confined to the space allotted for the advert, not necessarily rendered at its real dimensions.
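For the furniture case, ARKit reports the size of a detected surface via ARPlaneAnchor.extent (width along x, depth along z). A minimal fit check, with both the product footprint and the plane extent modeled as plain values, might look like this; accepting either orientation of the product is an assumed design choice.

```swift
// Footprint of an object or surface on the horizontal plane, in meters.
struct Footprint {
    let width: Double
    let depth: Double
}

// Returns true only if the product fits on the detected plane, in either
// of the two axis-aligned orientations.
func fits(product: Footprint, onPlane plane: Footprint) -> Bool {
    let asIs = product.width <= plane.width && product.depth <= plane.depth
    let rotated = product.depth <= plane.width && product.width <= plane.depth
    return asIs || rotated
}
```

When the check fails, the app should tell the user the room lacks space for the item rather than placing it anyway.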
AR Content Placement
Let’s revisit the furniture store example above. The way the object overlays or otherwise interacts with the existing objects in the room is important. In this case, the application needs to ensure that an object is placed only if a sufficiently large surface is available, and that the object remains stationary relative to the reference surface.
If the customer is using an AR shopping application that shows product details in AR, the augmented data should be displayed at the correct viewing angle, and the application must ensure the AR object does not obstruct the view of the identified product. AR content placement must therefore be contextually relevant to the user, and the position of augmented content should remain steady despite the user’s movement to maintain a consistent experience.
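Keeping a detail label at a readable angle usually means rotating it to face the viewer as they move. SceneKit provides SCNBillboardConstraint for exactly this; the underlying math is a yaw angle from the content's position toward the camera, sketched here in 2D as a plain function (the 2D simplification is an assumption for illustration).

```swift
import Foundation

// Yaw angle (radians, about the vertical axis) that turns content at
// (contentX, contentZ) to face a camera at (cameraX, cameraZ).
func yawTowardCamera(contentX: Double, contentZ: Double,
                     cameraX: Double, cameraZ: Double) -> Double {
    return atan2(cameraX - contentX, cameraZ - contentZ)
}
```

Recomputing this each frame (or delegating it to a billboard constraint) keeps the label oriented toward the customer no matter where they stand.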
Performance
Applications built on the ARKit platform perform multiple compute- and memory-intensive tasks, such as scene recognition (computer vision) and rendering objects into a real-world camera stream. Rendering heavy visual content (large objects, or multiple objects simultaneously) can slow down performance, so it is important to allot sufficient time for loading and rendering three-dimensional objects to keep the user experience seamless. The application can also become unresponsive under heavy processing load, so it is crucial to perform extensive load testing prior to the release of your AR application.
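One common mitigation is to preload heavy 3D assets before they are needed, so placement never stalls the render loop. A minimal cache sketch is below; the `load` closure is a stand-in for real asset decoding (for example, SCNScene(named:)), which is an assumption for illustration.

```swift
import Foundation

// Caches loaded model data by name so each asset is decoded at most once.
final class ModelCache {
    private var models: [String: Data] = [:]
    private let load: (String) -> Data

    init(load: @escaping (String) -> Data) {
        self.load = load
    }

    // Decode assets ahead of time, e.g. while the user browses the catalog.
    func preload(_ names: [String]) {
        for name in names where models[name] == nil {
            models[name] = load(name)
        }
    }

    // Returns instantly if preloaded; falls back to a blocking load otherwise.
    func model(named name: String) -> Data {
        if let cached = models[name] { return cached }
        let loaded = load(name)
        models[name] = loaded
        return loaded
    }
}
```

In practice the preload step would run on a background queue, keeping the main thread free for camera frames and rendering.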
We would describe all these scenarios as non-functional requirements for AR features. It is down to technology solution providers with in-depth technical expertise to “bake in” these intuitive requirements. Apexon Lab-as-a-Service works with enterprises to bring AR applications to life. To find out how AR apps could enhance your customer offering, contact us today.