RealityKit
Generated on 7/22/2024
RealityKit is a key framework discussed at WWDC 2024, designed to simplify the process of rendering 3D models and creating spatial experiences. Here are some highlights from the sessions:
- Platforms State of the Union: RealityKit simplifies rendering 3D models in a variety of styles, such as realistic, cel-shaded, or cartoon. Many of its newest capabilities first shipped with Apple Vision Pro and are now aligned across macOS, iOS, and iPadOS with RealityKit 4, including MaterialX, portals, particles, rich materials, virtual lighting, blend shapes, inverse kinematics, and animation timelines. (Platforms State of the Union)
- What’s new in USD and MaterialX: RealityKit supports USD and MaterialX, enabling the creation of rich spatial experiences. New features include support for MaterialX shaders in Shader Graph, now available across visionOS, iOS, iPadOS, and macOS. (What’s new in USD and MaterialX)
- Build a spatial drawing app with RealityKit: This session demonstrates building an interactive spatial drawing app using RealityKit. It covers spatial tracking APIs, SwiftUI for spatial UI, resource updates, low-level APIs for generating meshes and textures, and importing 2D vector graphics to make them spatial. (Build a spatial drawing app with RealityKit)
- Explore object tracking for visionOS: This session shows how to use RealityKit and ARKit APIs for object tracking in visionOS, including creating and anchoring virtual content with Reality Composer Pro. (Explore object tracking for visionOS)
- Break into the RealityKit debugger: This session introduces the RealityKit debugger, which helps track down bugs by taking a 3D snapshot of your running app and loading it in Xcode for exploration. (Break into the RealityKit debugger)
- Discover RealityKit APIs for iOS, macOS, and visionOS: This session covers new RealityKit APIs for developing spatial computing apps, including enhancements like portal crossing, dynamic lights, and cross-platform capabilities. (Discover RealityKit APIs for iOS, macOS and visionOS)
Relevant Sessions:
- Platforms State of the Union
- What’s new in USD and MaterialX
- Build a spatial drawing app with RealityKit
- Explore object tracking for visionOS
- Break into the RealityKit debugger
- Discover RealityKit APIs for iOS, macOS and visionOS
Feel free to ask for more specific details or timestamps if needed!

Explore object tracking for visionOS
Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object in Reality Composer Pro, RealityKit or ARKit APIs.
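
A minimal sketch of how this can look with the ARKit object-tracking APIs mentioned above, assuming a reference object file named "Globe.referenceobject" exported from Create ML (the file name and entity setup are placeholders, not code from the session):

```swift
import ARKit
import RealityKit
import SwiftUI

// Sketch: track a real-world object from a Create ML reference object and
// attach RealityKit content to it. "Globe.referenceobject" is a placeholder.
func startObjectTracking(in content: RealityViewContent) async throws {
    // Load the reference object produced by Create ML.
    guard let url = Bundle.main.url(forResource: "Globe", withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)

    // Run an ARKit session with an object-tracking provider.
    let session = ARKitSession()
    let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
    try await session.run([provider])

    // A simple marker entity that follows the tracked object.
    let marker = ModelEntity(mesh: .generateSphere(radius: 0.02))
    content.add(marker)

    for await update in provider.anchorUpdates {
        // Follow the tracked object's transform, hiding the marker when tracking is lost.
        marker.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        marker.isEnabled = update.anchor.isTracked
    }
}
```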

Discover RealityKit APIs for iOS, macOS and visionOS
Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.
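
A rough sketch of two of these features, hover effects and a portal setup, assuming a visionOS 2 / RealityKit 4 target (the entity setup here is illustrative only, not code from the session):

```swift
import RealityKit

// Sketch: give an entity a system hover effect and render a separate "world"
// entity through a portal surface. Entity setup is illustrative only.
func makeInteractiveScene(portalWorld: Entity) -> (sphere: ModelEntity, portal: ModelEntity) {
    // Highlight the sphere when the user looks at or hovers over it.
    let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
    sphere.components.set(HoverEffectComponent())
    sphere.components.set(InputTargetComponent())
    sphere.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))

    // Content inside the portal world is only visible through the portal surface.
    portalWorld.components.set(WorldComponent())

    // A flat surface that acts as the window into that world.
    let portal = ModelEntity(mesh: .generatePlane(width: 0.5, depth: 0.5))
    portal.components.set(PortalComponent(target: portalWorld))
    return (sphere, portal)
}
```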

Platforms State of the Union
Discover the newest advancements on Apple platforms.

What’s new in USD and MaterialX
Explore updates to Universal Scene Description and MaterialX support on Apple platforms. Discover how these technologies provide a foundation for 3D content creation and delivery, and learn how they can help streamline your workflows for creating great spatial experiences. Learn about USD and MaterialX support in RealityKit and Storm, advancements in our system-provided tooling, and more.
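
As a rough illustration of the RealityKit side, a MaterialX-based Shader Graph material authored in Reality Composer Pro can be loaded at runtime with ShaderGraphMaterial; the USD file name and material path below are placeholders for assets in your own content bundle:

```swift
import RealityKit

// Sketch: load a Shader Graph (MaterialX-backed) material from a USD file and
// apply it to a model. "Materials.usda" and "/Root/StripedMaterial" are placeholders.
func applyGraphMaterial(to model: ModelEntity) async throws {
    let material = try await ShaderGraphMaterial(
        named: "/Root/StripedMaterial",   // path of the material inside the scene
        from: "Materials.usda",           // USD file in the app bundle
        in: Bundle.main
    )
    model.model?.materials = [material]
}
```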

Build a spatial drawing app with RealityKit
Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you’ll explore how resources work in RealityKit and how to use features like low-level mesh and texture APIs to achieve fast updates of the user’s brush strokes.
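
The session itself uses RealityKit's new low-level mesh and texture APIs for fast per-frame updates; as a simpler sketch of the general idea, here is runtime stroke-mesh generation using the long-established MeshDescriptor API (the ribbon geometry is an illustrative simplification, not the session's approach):

```swift
import RealityKit

// Sketch: build a thin "ribbon" mesh from a list of stroke points.
// Uses MeshDescriptor rather than the new LowLevelMesh API shown in the session.
func makeRibbonMesh(points: [SIMD3<Float>], width: Float) throws -> MeshResource {
    var positions: [SIMD3<Float>] = []
    var indices: [UInt32] = []

    for (i, point) in points.enumerated() {
        // Extrude each stroke point into a pair of vertices.
        positions.append(point + SIMD3(-width / 2, 0, 0))
        positions.append(point + SIMD3( width / 2, 0, 0))
        if i > 0 {
            // Connect this pair to the previous pair with two triangles.
            let base = UInt32((i - 1) * 2)
            indices += [base, base + 1, base + 2,
                        base + 1, base + 3, base + 2]
        }
    }

    var descriptor = MeshDescriptor(name: "stroke")
    descriptor.positions = MeshBuffer(positions)
    descriptor.primitives = .triangles(indices)
    return try MeshResource.generate(from: [descriptor])
}
```

The resulting MeshResource can be shown with ModelEntity(mesh:materials:) and regenerated as new points arrive; the low-level APIs covered in the session are designed to avoid rebuilding the resource on every update.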

Break into the RealityKit debugger
Meet the RealityKit debugger and discover how this new tool lets you inspect the entity hierarchy of spatial apps, debug rogue transformations, find missing entities, and detect which parts of your code are causing problems for your systems.