Apple Announces ARKit 4 with Location Anchors, Depth API, and Improved Face Tracking

Apple today announced ARKit 4 alongside iOS 14 and iPadOS 14. The new version of ARKit introduces Location Anchors, a new Depth API, and improved face tracking.

ARKit 4
Location Anchors allow developers to place AR experiences, such as life-size art installations or navigational directions, at a fixed destination. Location Anchoring leverages the higher-resolution data in Apple Maps to pin AR experiences to specific points in the world, such as throughout cities or alongside famous landmarks. Users can move around virtual objects and view them from different perspectives, just as real objects appear through a camera lens.
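For developers, Location Anchors surface in ARKit as geotracking types. The sketch below shows how a geo-anchored experience might be set up with ARGeoTrackingConfiguration and ARGeoAnchor; the `arView` parameter and the coordinate are illustrative assumptions, not part of Apple's announcement:

```swift
import ARKit
import CoreLocation
import RealityKit

// A minimal sketch: run geotracking and pin an anchor to a real-world
// coordinate. `arView` is assumed to be an existing RealityKit ARView;
// the coordinate below is illustrative.
func startGeoTracking(in arView: ARView) {
    // Geotracking needs device support and Apple Maps coverage at the
    // user's current location, so check availability first.
    guard ARGeoTrackingConfiguration.isSupported else { return }

    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else { return }
        DispatchQueue.main.async {
            arView.session.run(ARGeoTrackingConfiguration())

            // Anchor content to a fixed latitude/longitude.
            let coordinate = CLLocationCoordinate2D(latitude: 37.7954,
                                                    longitude: -122.3937)
            let geoAnchor = ARGeoAnchor(coordinate: coordinate)
            arView.session.add(anchor: geoAnchor)
        }
    }
}
```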

ARKit 4 also takes advantage of the iPad Pro's LiDAR Scanner with a brand-new Depth API that provides access to detailed per-pixel depth information. When combined with 3D mesh data, this depth information makes virtual object occlusion more realistic and enables instant placement of virtual objects within their physical surroundings. It also opens up new capabilities for developers, such as taking more precise measurements and applying effects to the environment.
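In code, the Depth API is exposed through ARKit's frame semantics. A minimal sketch of opting into scene depth and reading the per-pixel buffer from each frame might look like this (the class name is a hypothetical placeholder):

```swift
import ARKit

// A minimal sketch: opt into the LiDAR-backed scene depth semantic and
// read per-pixel depth from each frame.
final class DepthReceiver: NSObject, ARSessionDelegate {
    func start(on session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        // .sceneDepth is only available on LiDAR-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        // depthMap holds one 32-bit float distance (in meters) per pixel;
        // confidenceMap rates how reliable each measurement is.
        let depthMap = sceneDepth.depthMap
        let confidenceMap = sceneDepth.confidenceMap
        _ = (depthMap, confidenceMap) // sample these for measurements or effects
    }
}
```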

Finally, face tracking is expanded in ARKit 4 to support the front-facing camera on all devices with the A12 Bionic chip or newer. Up to three faces can now be tracked at once using the TrueDepth camera, powering front-facing camera experiences like Memoji and Snapchat.
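The expanded face tracking uses the same ARFaceTrackingConfiguration as earlier ARKit releases. A short sketch, checking device support and requesting the maximum number of simultaneously tracked faces, might look like this:

```swift
import ARKit

// A minimal sketch: run face tracking with as many simultaneous faces
// as the device supports (up to three in ARKit 4).
func startFaceTracking(on session: ARSession) {
    // isSupported is true on TrueDepth devices and, with ARKit 4,
    // on any device with an A12 Bionic chip or newer.
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```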

Top Rated Comments

Nicky G
39 months ago
It's funny that this "small" piece of WWDC news will go down, I reckon, as some of the most revolutionary stuff Apple announced this year, well beyond switching to ARM. It will take a few more years before it becomes more obvious, but when the "AR kit" that "ARKit" was designed for from the get-go eventually drops, it is going to have some very well-fleshed-out tech baked into it, stuff Apple has been "testing" out in the open for years now. Both via ARKit, and lots of other little things, such as Ultra Wideband, the early embrace of Bluetooth beacon technology, etc. Science fiction has been describing this stuff (in terms of "fully-realized AR") since at least as far back as the early 90s, in Snow Crash. We're getting very close!