Easily prototype and produce AR experiences directly in AR with no prior 3D experience.

• Drag and drop to place and rotate virtual objects, assembling your scene directly in AR or on the canvas.
• Build your scene by importing your own .usdz files, or take advantage of the ready-to-use virtual objects in Reality Composer’s built-in AR content library so you can get up and running quickly.
• Create high-quality 3D models using photogrammetry and a guided capture process that leverages the advanced LiDAR sensor on your iPhone or iPad (iPhone 12 Pro or later and iPad Pro 5th generation or later).
• Customize a virtual object’s size, style, and more.
• Add animations that move, scale, or add emphasis such as a ‘wiggle’ or ‘spin’ to virtual objects, no animation experience required.
• Add spatial audio. Animations and audio can be set up to play when a user taps an object, comes into close proximity with it, or via other triggers.
• Easily send your project back and forth between your iOS device and your Mac with the Reality Composer app bundled with Xcode.
• Export your completed scene as a .usdz file to view in AR Quick Look, use in your existing 3D projects, or integrate into your Xcode project with Reality Composer on Mac.
• Record sensor and camera data at the actual location you are building for, then replay it on your iOS device while building your app in Xcode. This lets you iterate on your AR experience using the recorded data without needing to physically move your device or go to another location.