RealityKit
Asked on 07/22/2024
2 searches
RealityKit is a key framework discussed at WWDC 2024, designed to simplify the process of rendering 3D models and creating spatial experiences. Here are some highlights from the sessions:
- Platforms State of the Union: RealityKit simplifies rendering 3D models with styles such as realistic, cel-shaded, or cartoon. It first shipped with Apple Vision Pro and has gained significant new capabilities; with RealityKit 4 its feature set is aligned across macOS, iOS, and iPadOS, including MaterialX, portals, particles, rich materials, virtual lighting, blend shapes, inverse kinematics, and animation timelines (a short particle sketch appears near the end of this answer). (Platforms State of the Union)
- What’s new in USD and MaterialX: RealityKit supports USD and MaterialX for creating spatial experiences. New features include support for MaterialX shaders in Shader Graph, now available across visionOS, iOS, iPadOS, and macOS; see the mesh-and-material sketch after this list. (What’s new in USD and MaterialX)
- Build a spatial drawing app with RealityKit: This session demonstrates building an interactive spatial drawing app using RealityKit. It covers spatial tracking APIs, SwiftUI for spatial UI, resource updates, low-level APIs for generating meshes and textures, and importing 2D vector graphics to make them spatial; the sketch after this list shows runtime mesh generation. (Build a spatial drawing app with RealityKit)
- Explore object tracking for visionOS: This session shows how to use RealityKit and ARKit APIs for object tracking in visionOS, including creating and anchoring virtual content with Reality Composer Pro. (Explore object tracking for visionOS)
- Break into the RealityKit debugger: This session introduces the RealityKit debugger, which helps track down bugs by taking a 3D snapshot of your running app and loading it in Xcode for exploration. (Break into the RealityKit debugger)
- Discover RealityKit APIs for iOS, macOS, and visionOS: This session covers new RealityKit APIs for developing spatial computing apps, including enhancements such as portal crossing, dynamic lights, and other cross-platform capabilities. (Discover RealityKit APIs for iOS, macOS and visionOS)
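To give a concrete feel for the Shader Graph support and runtime geometry mentioned above, here is a minimal Swift sketch that builds a small mesh on the fly and shades it with a MaterialX-based Shader Graph material. The RealityKitContent package, the "Scene.usda" file, and the "/Root/StrokeMaterial" path are placeholder assumptions standing in for a Reality Composer Pro project of your own.

```swift
import RealityKit
// RealityKitContent is the Swift package that Reality Composer Pro generates
// for a visionOS project; the package name, "Scene.usda", and the
// "/Root/StrokeMaterial" path are placeholder assumptions.
import RealityKitContent

/// Generates a small triangle mesh at runtime and shades it with a
/// MaterialX-based Shader Graph material authored in Reality Composer Pro.
func makeStrokeEntity() async throws -> ModelEntity {
    // Describe a single triangle; a drawing app would keep appending
    // vertices as the user's stroke grows.
    var descriptor = MeshDescriptor(name: "stroke")
    descriptor.positions = MeshBuffer([
        SIMD3<Float>(0.00, 0.0, 0),
        SIMD3<Float>(0.10, 0.0, 0),
        SIMD3<Float>(0.05, 0.1, 0)
    ])
    descriptor.primitives = .triangles([0, 1, 2])
    let mesh = try MeshResource.generate(from: [descriptor])

    // Load a Shader Graph (MaterialX) material from the Reality Composer Pro scene.
    let material = try await ShaderGraphMaterial(named: "/Root/StrokeMaterial",
                                                 from: "Scene.usda",
                                                 in: realityKitContentBundle)
    return ModelEntity(mesh: mesh, materials: [material])
}
```

The drawing session itself goes further with the lower-level mesh and texture APIs; the descriptor-based approach above is simply the shortest path to getting runtime geometry on screen.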
Relevant Sessions:
- Platforms State of the Union
- What’s new in USD and MaterialX
- Build a spatial drawing app with RealityKit
- Explore object tracking for visionOS
- Break into the RealityKit debugger
- Discover RealityKit APIs for iOS, macOS and visionOS
Feel free to ask for more specific details or timestamps if needed!
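As a small illustration of one RealityKit 4 feature called out above, the following sketch attaches a particle emitter to an entity. The property names follow the ParticleEmitterComponent API as I understand it, and the specific values are illustrative rather than taken from the sessions.

```swift
import RealityKit

/// Attaches a particle emitter to an entity. ParticleEmitterComponent first
/// shipped with visionOS and is part of the feature set RealityKit 4 brings
/// to iOS, iPadOS, and macOS. Values below are illustrative.
func makeSparkleEntity() -> Entity {
    let entity = Entity()

    var emitter = ParticleEmitterComponent()
    emitter.emitterShape = .sphere                // emit from a spherical volume
    emitter.emitterShapeSize = [0.05, 0.05, 0.05] // ~5 cm, in meters
    emitter.mainEmitter.birthRate = 200           // particles per second
    emitter.mainEmitter.lifeSpan = 1.5            // seconds each particle lives
    entity.components.set(emitter)

    return entity
}
```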

Explore object tracking for visionOS
Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object in Reality Composer Pro, RealityKit or ARKit APIs.
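For a rough idea of what this flow can look like in code, here is a hedged Swift sketch that loads a reference object, runs ARKit's ObjectTrackingProvider, and keeps RealityKit content aligned with the tracked object. The "Mug.referenceobject" asset name is hypothetical; you would produce the reference object with Create ML.

```swift
import ARKit
import RealityKit

/// Loads a reference object trained in Create ML, runs object tracking, and
/// keeps a RealityKit marker aligned with the tracked object's pose.
/// "Mug.referenceobject" is a hypothetical asset name.
func trackObject(placingContentIn rootEntity: Entity) async throws {
    guard let url = Bundle.main.url(forResource: "Mug",
                                    withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)

    let session = ARKitSession()
    let tracking = ObjectTrackingProvider(referenceObjects: [referenceObject])
    try await session.run([tracking])

    // Simple virtual content to anchor; in practice this would come from
    // Reality Composer Pro. Assumes rootEntity sits at the scene origin
    // (e.g., the content of a RealityView).
    let marker = ModelEntity(mesh: .generateSphere(radius: 0.02),
                             materials: [SimpleMaterial(color: .green, isMetallic: false)])
    rootEntity.addChild(marker)

    // Follow the object as ARKit reports anchor updates.
    for await update in tracking.anchorUpdates {
        marker.isEnabled = update.anchor.isTracked
        marker.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
    }
}
```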

Discover RealityKit APIs for iOS, macOS and visionOS
Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.
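As a rough sketch of two of those features, the snippet below adds a dynamic spot light with shadows and a hover effect to an entity. The property values are illustrative, and the exact shadow setup may differ slightly from the session's sample code.

```swift
import RealityKit

/// Adds a dynamic spot light with shadows plus a hover effect to an entity.
/// Hover effects only activate on entities that are input targets with
/// collision shapes.
func decorate(_ entity: Entity) {
    // Dynamic light with shadow casting.
    var spot = SpotLightComponent()
    spot.intensity = 5_000
    spot.attenuationRadius = 3 // meters
    entity.components.set(spot)
    entity.components.set(SpotLightComponent.Shadow())

    // Hover effect driven by gaze (visionOS) or pointer (iOS/macOS).
    entity.components.set(InputTargetComponent())
    entity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
    entity.components.set(HoverEffectComponent())
}
```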

Platforms State of the Union
Discover the newest advancements on Apple platforms.