How do I load USDZ file into RealityKit scene?

To load a USDZ file into a RealityKit scene, you can use RealityKit's built-in asset-loading API, which supports USD and USDZ files. Here's the general approach:

  1. Add the USDZ file to your project: Drag and drop the USDZ file into your Xcode project so it's included in your app bundle.

  2. Load the USDZ file in RealityKit: Use the Entity.loadModel(named:) method to load the file from your app bundle. This method loads the model synchronously and returns a ModelEntity that you can add to your scene. If you don't want to block the current thread, RealityKit also offers asynchronous loading (see the sketch after the code example below).

  3. Add the model to your scene: Once the model is loaded, attach it to an AnchorEntity (or to another entity already in the scene), then add that anchor to your ARView's scene.

Here's a simple code example (it uses try! for brevity; handle the error properly in production code):

import RealityKit

// Load the USDZ model from the app bundle
// (the resource name omits the .usdz extension)
let modelEntity = try! Entity.loadModel(named: "YourModelName")

// Create an anchor entity to position the model in the scene
let anchorEntity = AnchorEntity()

// Add the model entity to the anchor
anchorEntity.addChild(modelEntity)

// Add the anchor to the scene (arView is an existing ARView)
arView.scene.addAnchor(anchorEntity)
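
If you'd rather not block the current thread while the asset loads, here is a minimal sketch using Swift concurrency. It assumes the same bundled "YourModelName" asset, an existing arView, and the async throwing Entity(named:) initializer available in recent RealityKit releases:

import RealityKit

// Load the model asynchronously so the UI stays responsive
Task { @MainActor in
    do {
        // Async throwing initializer; the name omits the .usdz extension
        let modelEntity = try await Entity(named: "YourModelName")
        let anchorEntity = AnchorEntity()
        anchorEntity.addChild(modelEntity)
        arView.scene.addAnchor(anchorEntity)
    } catch {
        print("Failed to load model: \(error)")
    }
}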

For more detailed information, check out the session "What’s new in USD and MaterialX" from WWDC 2024, which discusses USD support in RealityKit; the relevant discussion starts at 02:31.

Optimize for the spatial web

Discover how to make the most of visionOS capabilities on the web. Explore recent updates like improvements to selection highlighting, and the ability to present spatial photos and panorama images in fullscreen. Learn to take advantage of existing web standards for dictation and text-to-speech with WebSpeech, spatial soundscapes with WebAudio, and immersive experiences with WebXR.

Discover RealityKit APIs for iOS, macOS and visionOS

Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.
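
Tying this back to the original question, here is a rough sketch of loading the same USDZ asset through SwiftUI's RealityView, the cross-platform entry point these APIs build on (RealityView is available on visionOS and on the newer iOS and macOS releases; "YourModelName" is a placeholder resource name):

import SwiftUI
import RealityKit

struct ModelView: View {
    var body: some View {
        // RealityView's make closure runs asynchronously,
        // so the model can be loaded with try await
        RealityView { content in
            if let modelEntity = try? await Entity(named: "YourModelName") {
                content.add(modelEntity)
            }
        }
    }
}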

Explore object tracking for visionOS

Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object in Reality Composer Pro, RealityKit or ARKit APIs.

Break into the RealityKit debugger

Meet the RealityKit debugger and discover how this new tool lets you inspect the entity hierarchy of spatial apps, debug rogue transformations, find missing entities, and detect which parts of your code are causing problems for your systems.

Optimize your 3D assets for spatial computing

Dive into an end-to-end workflow for optimized 3D asset creation. Discover best practices for optimizing meshes, materials, and textures in your digital content creation tool. Learn how to harness shader graph, baking, and material instances to enhance your 3D scene while optimizing performance. Take advantage of native tools to work more effectively with your assets and improve your app’s performance.

Build a spatial drawing app with RealityKit

Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you’ll explore how resources work in RealityKit and how to use features like low-level mesh and texture APIs to achieve fast updates of the users’ brush strokes.

Discover area mode for Object Capture

Discover how area mode for Object Capture enables new 3D capture possibilities on iOS by extending the functionality of Object Capture to support capture and reconstruction of an area. Learn how to optimize the quality of iOS captures using the new macOS sample app for reconstruction, and find out how to view the final results with Quick Look on Apple Vision Pro, iPhone, iPad or Mac. Learn about improvements to 3D reconstruction, including a new API that allows you to create your own custom image processing pipelines.
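
As a rough illustration of how a capture becomes a USDZ you can then load with the code above, here is a minimal PhotogrammetrySession sketch for macOS reconstruction. The input folder, output path, and detail level are placeholder assumptions, and the area-mode-specific options this session covers are not shown:

import Foundation
import RealityKit

// Folder of captured images and destination for the reconstructed USDZ
// (both paths are placeholders)
let inputFolder = URL(fileURLWithPath: "/path/to/images", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")

// Create a reconstruction session with the default configuration
let session = try PhotogrammetrySession(input: inputFolder,
                                        configuration: PhotogrammetrySession.Configuration())

// Observe reconstruction progress and completion
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(fraction)")
        case .processingComplete:
            print("Reconstruction finished: \(outputURL.path)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break
        }
    }
}

// Request a USDZ model file at medium detail
try session.process(requests: [.modelFile(url: outputURL, detail: .medium)])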

Create custom environments for your immersive apps in visionOS

Discover how to create visually rich and performant customized app environments for Apple Vision Pro. Learn design guidelines, get expert recommendations, and explore techniques you can use in any digital content creation tool to begin building your immersive environment.

What’s new in USD and MaterialX

Explore updates to Universal Scene Description and MaterialX support on Apple platforms. Discover how these technologies provide a foundation for 3D content creation and delivery, and learn how they can help streamline your workflows for creating great spatial experiences. Learn about USD and MaterialX support in RealityKit and Storm, advancements in our system-provided tooling, and more.