At its Worldwide Developer Conference, Apple announced a significant update to RealityKit, its suite of technologies that allow developers to get started building AR (augmented reality) experiences. With the launch of RealityKit 2, Apple says developers will have more visual, audio, and animation control when working on their AR experiences. But the most significant part of the update is how Apple’s new Object Capture technology will allow developers to create 3D models in minutes using only an iPhone.
Apple noted during its developer address that one of the most challenging parts of making great AR apps has been creating 3D models — a process that could take hours and cost thousands of dollars.
With Apple’s new tools, developers will be able to take a series of pictures using just an iPhone (or an iPad or DSLR, if they prefer) to capture 2D images of an object from all angles, including the bottom. Apple explained that the Object Capture API on macOS Monterey then takes only a few lines of code to turn those images into a 3D model.
To begin, developers start a new photogrammetry session in RealityKit that points to the folder where they’ve captured the images. Then, they call the process function to generate the 3D model at the desired level of detail. Object Capture generates USDZ files optimized for AR Quick Look — the system that lets developers add virtual 3D objects to apps or websites on iPhone and iPad. The 3D models can also be added to AR scenes in Reality Composer in Xcode.
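The workflow Apple described might look something like the following sketch in Swift. The folder path, output filename, and detail level here are placeholder assumptions for illustration; error handling is kept minimal.

```swift
import RealityKit

// Point a photogrammetry session at the folder of captured images.
// (The path below is a hypothetical example.)
let inputFolder = URL(fileURLWithPath: "/Users/me/Captures/Sneaker/")
let session = try PhotogrammetrySession(input: inputFolder)

// Request a USDZ model file at a chosen level of detail.
let outputURL = URL(fileURLWithPath: "/Users/me/Models/sneaker.usdz")
let request = PhotogrammetrySession.Request.modelFile(
    url: outputURL,
    detail: .reduced  // e.g. a lighter model suited to AR Quick Look
)

// Kick off processing and watch the session's output stream.
try session.process(requests: [request])
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(fraction)")
        case .requestComplete(_, let result):
            print("Finished: \(result)")
        case .requestError(_, let error):
            print("Failed: \(error)")
        default:
            break
        }
    }
}
```

Because the session reconstructs geometry from overlapping photos, results depend heavily on how thoroughly the object was photographed from all angles, as Apple's guidance emphasizes.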
Apple said developers like Wayfair and Etsy are already using Object Capture to create 3D models of real-world objects, a sign that online shopping is about to get a significant AR upgrade. Wayfair, for example, is using Object Capture to build tools for its manufacturers so they can create virtual representations of their merchandise. This will allow Wayfair customers to preview more products in AR than they can today.