
WWDC: What’s new for App Clips in ARKit 5

Opinion
Jun 14, 2021 | 5 mins
Apple | Augmented Reality | Enterprise Applications

Apple has updated App Clips with new features that make them far more useful for both B2C and B2B enterprises.


One of Apple’s quietly significant WWDC 2021 announcements must be its planned improvements to ARKit 5’s App Clip Codes feature, which is becoming a powerful tool for any B2B or B2C enterprise that sells products.

Some things just seem to climb off the page

When App Clip Codes were introduced last year, the focus was on offering quick access to tools and services found within apps. Each code is made available as a scannable pattern, sometimes paired with an NFC tag. People scan the code using the camera, or tap the tag, to launch the App Clip.

This year, Apple has enhanced AR support in App Clips and App Clip Codes, which ARKit can now recognize and track within AR experiences, so you can run part of an AR experience without installing the entire app.

In customer experience terms, this means a company can create an augmented reality experience that becomes available when a customer points their camera at an App Clip Code in a product reference manual, on a poster, inside the pages of a magazine, at a trade show, in a store: wherever you need customers to find the asset.

Apple offered up two primary real-world scenarios in which it imagines using these codes:

  • A tile company could use them so a customer can preview different tile patterns on the wall.
  • A seed catalog could show what a grown plant or vegetable will look like, and could let you see virtual examples of that greenery growing in your own garden via AR.

Both implementations seem fairly static, but it’s possible to imagine more ambitious uses. The codes could explain self-assembly furniture, bring car maintenance manuals to life, or provide virtual instructions for a coffeemaker.
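For developers, the hookup is fairly compact. Here’s a minimal sketch of enabling App Clip Code tracking, using ARKit’s documented ARWorldTrackingConfiguration and ARAppClipCodeAnchor types; the view controller around them is illustrative:

    import ARKit
    import RealityKit
    import UIKit

    final class CodeTrackingViewController: UIViewController, ARSessionDelegate {
        private let arView = ARView(frame: .zero)

        override func viewDidLoad() {
            super.viewDidLoad()
            arView.frame = view.bounds
            view.addSubview(arView)
            arView.session.delegate = self

            // App Clip Code tracking needs an A12 Bionic or later, so check first.
            guard ARWorldTrackingConfiguration.supportsAppClipCodeTracking else { return }
            let configuration = ARWorldTrackingConfiguration()
            configuration.appClipCodeTrackingEnabled = true
            arView.session.run(configuration)
        }

        // ARKit surfaces each recognized code as its own anchor type.
        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for case let codeAnchor as ARAppClipCodeAnchor in anchors {
                // Attach AR content at the code's position here.
                print("Found an App Clip Code at \(codeAnchor.transform.columns.3)")
            }
        }
    }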

What is an App Clip?

An App Clip is a small slice of an app that takes people through part of its experience without requiring them to install the whole thing. App Clips save download time and take people directly to a specific part of the app that’s highly relevant to where they are at the time.
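In code, an App Clip receives its invocation URL the same way the full app would, through an NSUserActivity. Here’s a minimal SwiftUI sketch; the clip and its "product" query parameter are hypothetical:

    import SwiftUI

    struct ContentView: View {
        let productID: String?

        var body: some View {
            Text(productID.map { "Showing product \($0)" } ?? "Scan a code to begin")
        }
    }

    @main
    struct TileClip: App {
        @State private var productID: String?

        var body: some Scene {
            WindowGroup {
                ContentView(productID: productID)
                    // App Clips launch with the invocation's web URL attached.
                    .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                        guard let url = activity.webpageURL,
                              let items = URLComponents(url: url, resolvingAgainstBaseURL: true)?.queryItems
                        else { return }
                        productID = items.first(where: { $0.name == "product" })?.value
                    }
            }
        }
    }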

Object Capture

Apple also introduced an essential supporting tool at WWDC 2021: Object Capture in RealityKit 2. It makes it much easier for developers to quickly create photo-realistic 3D models of real-world objects using images captured on an iPhone, iPad, or DSLR.
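Developers drive this through RealityKit 2’s PhotogrammetrySession, which runs on macOS Monterey; the photos themselves can come from any of those cameras. A minimal sketch, with illustrative paths:

    import RealityKit

    // A sketch: feed a folder of photos in, get a .usdz model out.
    func buildModel() async throws {
        let photos = URL(fileURLWithPath: "/Users/me/Captures/Chair", isDirectory: true)
        let model = URL(fileURLWithPath: "/Users/me/Models/chair.usdz")

        let session = try PhotogrammetrySession(input: photos)
        try session.process(requests: [.modelFile(url: model, detail: .reduced)])

        // Progress and results arrive as an async sequence of messages.
        for try await output in session.outputs {
            switch output {
            case .requestProgress(_, let fraction):
                print("Progress: \(Int(fraction * 100))%")
            case .requestComplete(_, .modelFile(let url)):
                print("Model written to \(url.path)")
            case .requestError(_, let error):
                print("Reconstruction failed: \(error)")
            default:
                break
            }
        }
    }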

What this essentially means is that Apple has moved from empowering developers to build AR experiences that exist only within apps to supporting AR experiences that work portably, more or less outside of apps.

That’s significant, as it helps create an ecosystem of AR assets, services, and experiences, which Apple will need as it attempts to push further into this space.

Faster processors required

It’s important to understand the kind of devices capable of running such content. When ARKit was first introduced alongside iOS 11, Apple said it required at least an A9 processor to run. Things have moved on since then, and the most sophisticated features in ARKit 5 require at least an A12 Bionic chip.

In this case, App Clip Code tracking requires devices with an A12 Bionic processor or later, such as the iPhone XS. That these experiences require one of Apple’s more recent processors is noteworthy as the company inexorably drives toward the launch of AR glasses.

It also lends substance to Apple’s strategic decision to invest in chip development. After all, the move from A10 Fusion to A11 processors yielded a 25% performance gain, and Apple seems to be achieving roughly similar gains with each iteration of its chips. We should see another leapfrog in performance per watt once it moves to 3nm chips in 2022, and these advances in capability are now available across its platforms, thanks to M-series Mac chips.

Despite all this power, Apple warns that decoding these codes can take time, so it suggests developers offer a placeholder visualization while the magic happens.
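That state is exposed on the anchor itself: ARAppClipCodeAnchor reports a urlDecodingState, so an app knows when to show the placeholder. A sketch of the branching, with the rendering reduced to print statements:

    import ARKit

    final class CodeDecodingDelegate: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let codeAnchor as ARAppClipCodeAnchor in anchors {
                switch codeAnchor.urlDecodingState {
                case .decoding:
                    // The code's position is known before its URL is.
                    print("Code located; show a placeholder visualization")
                case .decoded:
                    print("Decoded URL: \(codeAnchor.url?.absoluteString ?? "none")")
                case .failed:
                    print("Decoding failed; ask the user to move closer")
                @unknown default:
                    break
                }
            }
        }
    }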

What else is new in ARKit 5?

In addition to App Clip Codes, ARKit 5 benefits from:

Location Anchors

It’s now possible to place AR content at specific geographic locations, tying the experience to latitude/longitude coordinates from Maps. This feature also requires an A12 processor or later and is available in key U.S. cities and in London.
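In ARKit terms this is geotracking: check availability at the user’s location, run an ARGeoTrackingConfiguration, then pin content with ARGeoAnchor objects. A minimal sketch, with illustrative London coordinates:

    import ARKit
    import CoreLocation

    func startGeoSession(on session: ARSession) {
        // Geotracking only works in supported cities, so check first.
        ARGeoTrackingConfiguration.checkAvailability { available, error in
            guard available else {
                print("Geotracking unavailable: \(error?.localizedDescription ?? "unsupported region")")
                return
            }
            session.run(ARGeoTrackingConfiguration())

            // Pin content to a real-world latitude/longitude (illustrative values).
            let coordinate = CLLocationCoordinate2D(latitude: 51.5007, longitude: -0.1246)
            session.add(anchor: ARGeoAnchor(coordinate: coordinate))
        }
    }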

What this means is that you might be able to wander round and grab AR experiences just by pointing your camera at a sign or checking a location in Maps. This kind of overlaid reality has to be a hint at the company’s plans, particularly in line with its improvements in accessibility, person recognition, and walking directions.

Motion capture improvements

ARKit 5 can now track body joints more accurately and at longer distances. Motion capture also more accurately supports a wider range of limb movements and body poses on A12 or later processors. No code change is required, which should mean any app that uses motion capture will benefit from better accuracy once iOS 15 ships.
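The reason no code change is needed is that motion capture sits behind the same ARBodyTrackingConfiguration API; existing sessions like this minimal sketch simply get more accurate under ARKit 5:

    import ARKit

    final class BodyTracker: NSObject, ARSessionDelegate {
        func start(on session: ARSession) {
            guard ARBodyTrackingConfiguration.isSupported else { return } // A12 or later
            session.delegate = self
            session.run(ARBodyTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let body as ARBodyAnchor in anchors {
                // Joint transforms are expressed relative to the body's root joint.
                print("Tracking \(body.skeleton.jointModelTransforms.count) joints")
            }
        }
    }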


Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
