What are the latest RealityKit additions?
Generated on 8/8/2024
The latest additions to RealityKit, as presented at WWDC 2024, include several significant updates and new features:
- Cross-Platform Alignment: RealityKit 4 now supports macOS, iOS, and iPadOS alongside visionOS, allowing developers to build for all these platforms simultaneously. This includes features like MaterialX, portals, and particles (Platforms State of the Union).
- Advanced Character Animation: New APIs for blend shapes, inverse kinematics, and animation timelines have been introduced, enhancing character animation capabilities and enabling dynamic interactions (Platforms State of the Union).
- Low-Level Access: New APIs for low-level meshes and textures provide finer control over an app's appearance, working with Metal compute shaders to enable fully dynamic models and textures (Platforms State of the Union).
- Shader Graph: Support for MaterialX shaders in Shader Graph, the shader creation system in Reality Composer Pro, now extends to all platforms, ensuring consistent visuals across visionOS, iOS, iPadOS, and macOS (What’s new in USD and MaterialX).
- Enhanced Debugging: Xcode's view debugging now supports introspecting 3D scene content, allowing developers to investigate scene object hierarchies and inspect properties (Platforms State of the Union).
- New Components and Features:
  - Billboard Component: Ensures entities always face the user.
  - Pixel Cast: Enables pixel-perfect entity selection.
  - Subdivision Surface: Renders smooth surfaces without creating a dense mesh (Discover RealityKit APIs for iOS, macOS and visionOS).
- Hand Tracking: Enhanced hand tracking capabilities allow developers to choose between continuous or predicted hand tracking depending on their needs (Create enhanced spatial computing experiences with ARKit).
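To make the low-level mesh access above concrete, here is a minimal Swift sketch that builds a single-triangle mesh with the RealityKit 4 `LowLevelMesh` API. The type and parameter names (`LowLevelMesh.Descriptor`, `withUnsafeMutableBytes`, `MeshResource(from:)`, and the `Part` initializer) are recalled from the RealityKit 4 announcement rather than taken from this summary, so treat them as assumptions and verify against the current documentation.

```swift
import RealityKit

// Sketch only: a one-triangle mesh via the RealityKit 4 low-level
// mesh API. Signatures are best-effort recollections; verify in docs.
func makeTriangleMesh() throws -> MeshResource {
    var desc = LowLevelMesh.Descriptor()
    desc.vertexAttributes = [
        .init(semantic: .position, format: .float3, offset: 0)
    ]
    desc.vertexLayouts = [
        .init(bufferIndex: 0, bufferStride: MemoryLayout<SIMD3<Float>>.stride)
    ]
    desc.vertexCapacity = 3
    desc.indexCapacity = 3

    let mesh = try LowLevelMesh(descriptor: desc)
    // Write vertex positions directly into the mesh's buffer.
    mesh.withUnsafeMutableBytes(bufferIndex: 0) { buffer in
        let v = buffer.bindMemory(to: SIMD3<Float>.self)
        v[0] = [0, 0, 0]; v[1] = [0.1, 0, 0]; v[2] = [0, 0.1, 0]
    }
    mesh.withUnsafeMutableIndices { buffer in
        let i = buffer.bindMemory(to: UInt32.self)
        i[0] = 0; i[1] = 1; i[2] = 2
    }
    // Part initializer arguments assumed; a bounds argument may
    // also be required in the shipping API.
    mesh.parts.replaceAll([
        .init(indexCount: 3, topology: .triangle)
    ])
    return try MeshResource(from: mesh)
}
```

The same buffers can later be rewritten each frame (or filled by a Metal compute shader), which is what enables the fully dynamic models the session describes.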
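The Billboard Component listed above attaches in a single line. A hedged sketch, assuming the `BillboardComponent` API as announced for RealityKit 4 (availability may vary by OS version):

```swift
import RealityKit

// Sketch: a small plane entity that always rotates to face the user.
let label = ModelEntity(
    mesh: .generatePlane(width: 0.2, height: 0.1),
    materials: [SimpleMaterial(color: .white, isMetallic: false)]
)
label.components.set(BillboardComponent())
```

This is useful for labels and UI panels anchored in 3D space that should remain readable from any viewing angle.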
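The continuous-versus-predicted hand tracking choice can be sketched as follows; the names (`ARKitSession`, `HandTrackingProvider`, `anchorUpdates`, `handAnchors(at:)`) are recalled from the ARKit session referenced above, not stated in this summary, so confirm them in the documentation.

```swift
import ARKit

// Sketch of visionOS hand tracking; API names are assumptions.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func trackHands() async throws {
    try await session.run([handTracking])
    // Continuous tracking: consume the async stream of anchor updates.
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        // Inspect anchor.handSkeleton, anchor.chirality, etc.
        _ = anchor
    }
}

// Predicted tracking: request anchors at a future render time to
// reduce perceived latency (the visionOS 2 addition; name assumed):
// let (left, right) = handTracking.handAnchors(at: predictedTime)
```

Continuous updates suit gesture recognition, while prediction trades a little accuracy for lower latency in rendering-driven interactions.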
For a detailed overview of these features, you can refer to the sessions:
Build a spatial drawing app with RealityKit
Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you’ll explore how resources work in RealityKit and how to use features like low-level mesh and texture APIs to achieve fast updates of the user’s brush strokes.
Platforms State of the Union
Discover the newest advancements on Apple platforms.
Create enhanced spatial computing experiences with ARKit
Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.
What’s new in USD and MaterialX
Explore updates to Universal Scene Description and MaterialX support on Apple platforms. Discover how these technologies provide a foundation for 3D content creation and delivery, and learn how they can help streamline your workflows for creating great spatial experiences. Learn about USD and MaterialX support in RealityKit and Storm, advancements in our system-provided tooling, and more.
Compose interactive 3D content in Reality Composer Pro
Discover how the Timeline view in Reality Composer Pro can bring your 3D content to life. Learn how to create an animated story in which characters and objects interact with each other and the world around them using inverse kinematics, blend shapes, and skeletal poses. We’ll also show you how to use built-in and custom actions, sequence your actions, apply triggers, and implement natural movements.
Discover RealityKit APIs for iOS, macOS and visionOS
Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.