RealityKit
Generated on 8/12/2024
RealityKit is Apple's framework for creating 3D and spatial experiences across its platforms, including iOS, macOS, and visionOS. At WWDC 2024, several sessions highlighted RealityKit's advancements and new features:
- Platforms State of the Union: RealityKit 4 was introduced, aligning APIs and tools across macOS, iOS, and iPadOS. It includes features like MaterialX, portals, particles, blend shapes, inverse kinematics, and animation timelines, enhancing character animation capabilities and interactions (Platforms State of the Union). A brief particle-emitter sketch follows this overview.
- What’s new in USD and MaterialX: RealityKit now supports USD and MaterialX for creating rich spatial experiences. It includes support for MaterialX shaders in Shader Graph, the node-based material system in Reality Composer Pro, and brings these features to additional platforms for consistent visuals (What’s new in USD and MaterialX).
- Build a spatial drawing app with RealityKit: This session demonstrated building an interactive spatial drawing app using RealityKit. It covered spatial tracking APIs, SwiftUI integration, and low-level APIs for generating meshes and textures interactively (Build a spatial drawing app with RealityKit).
- Break into the RealityKit debugger: This session focused on using the RealityKit debugger to troubleshoot and enhance 3D apps. It showcased how to visualize invisible entities and debug complex behaviors in RealityKit applications (Break into the RealityKit debugger).
- Explore object tracking for visionOS: This session explored using RealityKit and ARKit APIs for object tracking in visionOS, demonstrating how to anchor virtual content using Reality Composer Pro (Explore object tracking for visionOS).
These sessions provide a comprehensive overview of the capabilities and new features of RealityKit, enabling developers to create dynamic and immersive spatial experiences.
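To make the feature list above a bit more concrete, here is a minimal sketch of attaching one of those features, a particle emitter, to an entity inside a SwiftUI RealityView. The view name, geometry, and values are illustrative assumptions, not code from the sessions, and it assumes a platform where RealityKit 4's RealityView and ParticleEmitterComponent are available.

```swift
import SwiftUI
import RealityKit

// Minimal sketch: a sphere with a default particle emitter inside a RealityView.
// Assumes RealityKit 4 (RealityView and ParticleEmitterComponent available
// on iOS, macOS, iPadOS, or visionOS).
struct ParticleSketchView: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
            )
            // Default emitter settings; tune the component's properties for a real effect.
            sphere.components.set(ParticleEmitterComponent())
            content.add(sphere)
        }
    }
}
```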
Platforms State of the Union
Discover the newest advancements on Apple platforms.
What’s new in USD and MaterialX
Explore updates to Universal Scene Description and MaterialX support on Apple platforms. Discover how these technologies provide a foundation for 3D content creation and delivery, and learn how they can help streamline your workflows for creating great spatial experiences. Learn about USD and MaterialX support in RealityKit and Storm, advancements in our system-provided tooling, and more.
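As a rough sketch of how MaterialX-based materials surface in RealityKit code, the snippet below loads an entity and a Shader Graph material authored in Reality Composer Pro. The RealityKitContent package name, the material path, the file name, and the entity names are placeholder assumptions for a typical project, not values from the session.

```swift
import RealityKit
import RealityKitContent  // hypothetical Reality Composer Pro package name

// Load a scene authored in Reality Composer Pro and swap in a Shader Graph
// (MaterialX-based) material. All names below are placeholders.
func loadSceneWithShaderGraphMaterial() async throws -> Entity {
    let scene = try await Entity(named: "Scene", in: realityKitContentBundle)

    let material = try await ShaderGraphMaterial(
        named: "/Root/GlowMaterial",   // material path inside the .usda
        from: "Materials.usda",        // file in the Reality Composer Pro package
        in: realityKitContentBundle
    )

    // Apply the material to a model entity found by a placeholder name.
    if let model = scene.findEntity(named: "Sphere") as? ModelEntity {
        model.model?.materials = [material]
    }
    return scene
}
```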
Build a spatial drawing app with RealityKit
Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you’ll explore how resources work in RealityKit and how to use features like low-level mesh and texture APIs to achieve fast updates of the user’s brush strokes.
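As an illustrative sketch of regenerating geometry while a stroke grows, the helper below rebuilds a flat ribbon mesh from stroke points using MeshDescriptor. This is a simpler path than the LowLevelMesh and LowLevelTexture APIs the session actually demonstrates; the function name, ribbon layout, and width are assumptions.

```swift
import RealityKit

enum StrokeMeshError: Error { case notEnoughPoints }

// Hypothetical helper: rebuild a flat ribbon mesh from a user's stroke points.
// Uses MeshDescriptor rather than the LowLevelMesh API covered in the session.
func makeStrokeMesh(points: [SIMD3<Float>], width: Float = 0.01) throws -> MeshResource {
    guard points.count >= 2 else { throw StrokeMeshError.notEnoughPoints }

    var positions: [SIMD3<Float>] = []
    var indices: [UInt32] = []

    // Offset each stroke point sideways to form the two edges of a ribbon.
    for point in points {
        positions.append(point + SIMD3(-width / 2, 0, 0))
        positions.append(point + SIMD3( width / 2, 0, 0))
    }

    // Join consecutive pairs of edge vertices with two triangles.
    for segment in 0..<(points.count - 1) {
        let base = UInt32(segment * 2)
        indices += [base, base + 1, base + 2,
                    base + 1, base + 3, base + 2]
    }

    var descriptor = MeshDescriptor(name: "stroke")
    descriptor.positions = MeshBuffer(positions)
    descriptor.primitives = .triangles(indices)
    return try MeshResource.generate(from: [descriptor])
}
```

On each stroke update, a model entity's mesh could then be replaced, for example with `entity.model?.mesh = try makeStrokeMesh(points: strokePoints)`.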
Break into the RealityKit debugger
Meet the RealityKit debugger and discover how this new tool lets you inspect the entity hierarchy of spatial apps, debug rogue transformations, find missing entities, and detect which parts of your code are causing problems for your systems.
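For context on the "systems" the debugger can help with, here is a small custom RealityKit component/system pair of the kind you might inspect; the names and spin behavior are invented for illustration, and naming entities plus keeping systems small makes problems easier to locate in the debugger's entity hierarchy.

```swift
import RealityKit
import simd

// Hypothetical component: entities carrying it spin around the y-axis.
struct SpinComponent: Component {
    var speed: Float = 1.0  // radians per second
}

// Hypothetical system that applies the spin every frame.
struct SpinSystem: System {
    static let query = EntityQuery(where: .has(SpinComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            let speed = entity.components[SpinComponent.self]?.speed ?? 1.0
            let spin = simd_quatf(angle: speed * Float(context.deltaTime), axis: [0, 1, 0])
            entity.transform.rotation = spin * entity.transform.rotation
        }
    }
}

// Registration (once, at app startup):
// SpinComponent.registerComponent()
// SpinSystem.registerSystem()
```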
Explore object tracking for visionOS
Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object using Reality Composer Pro, RealityKit, or ARKit APIs.
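As a rough end-to-end sketch, assuming visionOS 2's ObjectTrackingProvider API in ARKit, the code below loads a reference object file, runs an ARKitSession, and keeps a RealityKit marker entity on the tracked object. The function name, file URL, and marker geometry are placeholders; error handling and authorization prompts are omitted.

```swift
import ARKit
import RealityKit

// Sketch: follow a real-world object with a RealityKit entity on visionOS.
// Assumes a .referenceobject file (trained in Create ML) bundled with the app.
func trackObject(referenceObjectURL: URL, in root: Entity) async throws {
    let referenceObject = try await ReferenceObject(from: referenceObjectURL)
    let objectTracking = ObjectTrackingProvider(referenceObjects: [referenceObject])

    let session = ARKitSession()
    try await session.run([objectTracking])

    // A simple marker that follows the tracked object's pose.
    let marker = ModelEntity(
        mesh: .generateSphere(radius: 0.02),
        materials: [SimpleMaterial(color: .green, isMetallic: false)]
    )
    root.addChild(marker)

    for await update in objectTracking.anchorUpdates {
        marker.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        marker.isEnabled = update.anchor.isTracked
    }
}
```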