ARKit improvements since 2020?
Generated on 8/5/2024
Since 2020, ARKit has seen several significant improvements, particularly highlighted in the WWDC 2024 sessions. Here are some of the key advancements:
Scene Understanding:
- Room Tracking: A new feature that allows apps to customize experiences based on the room they are in.
- Plane Detection: Updated with slanted plane alignment, so angled surfaces such as ramps can now be detected in addition to horizontal surfaces like tables and vertical ones like walls. (Create enhanced spatial computing experiences with ARKit)
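As a rough sketch of how this looks in code (assuming visionOS 2's ARKit data-provider API; the `.slanted` alignment is the new case, and the surrounding usage here is illustrative):

```swift
import ARKit

// Sketch: detect slanted planes alongside horizontal and vertical ones.
// Assumes visionOS 2's PlaneDetectionProvider.
let session = ARKitSession()
let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical, .slanted])

Task {
    try await session.run([planeDetection])
    for await update in planeDetection.anchorUpdates {
        if update.anchor.alignment == .slanted {
            // Place virtual content on the angled surface
            // using the anchor's transform.
            print("Slanted plane:", update.anchor.originFromAnchorTransform)
        }
    }
}
```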
Object Tracking:
- ARKit can now track real-world objects that are statically placed in the environment, providing their position and orientation to anchor virtual content. This can be used in educational apps to exhibit 3D models of real-world items. (Create enhanced spatial computing experiences with ARKit)
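A minimal sketch of object tracking, assuming visionOS 2's `ObjectTrackingProvider` and `ReferenceObject` APIs (the resource name is hypothetical; reference objects are trained ahead of time, e.g. with Create ML):

```swift
import ARKit

// Sketch: track a statically placed real-world object from a
// reference object file bundled with the app.
Task {
    guard let url = Bundle.main.url(forResource: "Globe",
                                    withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)
    let objectTracking = ObjectTrackingProvider(referenceObjects: [referenceObject])

    let session = ARKitSession()
    try await session.run([objectTracking])
    for await update in objectTracking.anchorUpdates {
        // Anchor virtual content (e.g. labels in an educational app)
        // at the tracked object's position and orientation.
        print(update.anchor.originFromAnchorTransform)
    }
}
```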
World Tracking:
- Improved robustness for various lighting conditions. If the system detects low light, it will switch to orientation-based tracking to prevent complete tracking loss. (Create enhanced spatial computing experiences with ARKit)
Hand Tracking:
- Enhanced hand tracking capabilities, including display-aligned hand anchor updates for gesture detection and hand predictions for attaching content to hands. These advancements are also available in RealityKit for hand anchor entities. (Create enhanced spatial computing experiences with ARKit)
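Hand prediction can be sketched roughly as below (assuming visionOS 2's `HandTrackingProvider.handAnchors(at:)`; the lookahead interval is an illustrative choice, not a documented value):

```swift
import ARKit
import QuartzCore

// Sketch: query predicted hand anchors for a near-future timestamp so
// attached content keeps up with hand motion.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

Task {
    try await session.run([handTracking])
    // Predict slightly ahead of the current time to reduce perceived latency.
    let (left, _) = handTracking.handAnchors(at: CACurrentMediaTime() + 0.033)
    if let left, left.isTracked {
        print("Predicted left hand transform:", left.originFromAnchorTransform)
    }
}
```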
RealityKit Integration:
- RealityKit has been updated to simplify the process of rendering 3D models and now supports macOS, iOS, and iPadOS alongside visionOS. New APIs and tools like blend shapes, inverse kinematics, and animation timelines have been introduced to enhance character animation capabilities. (Platforms State of the Union)
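With the cross-platform alignment, the same `RealityView` code can target iOS, iPadOS, macOS, and visionOS. A minimal sketch (the asset name is hypothetical):

```swift
import SwiftUI
import RealityKit

// Sketch: a RealityView that loads and displays a 3D model, written once
// for iOS, iPadOS, macOS, and visionOS.
struct RobotView: View {
    var body: some View {
        RealityView { content in
            if let robot = try? await ModelEntity(named: "toy_robot") {
                content.add(robot)
            }
        }
    }
}
```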
Area Mode for Object Capture:
- A new iOS API for capturing areas with better quality reconstructions, which is useful for creating detailed 3D models of larger spaces. (Discover area mode for Object Capture)
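Reconstruction of a capture into a 3D model can be sketched with RealityKit's `PhotogrammetrySession` on macOS (the paths are placeholders, and this is a generic Object Capture reconstruction flow rather than anything area-mode-specific):

```swift
import RealityKit

// Sketch: reconstruct captured images into a USDZ model with
// PhotogrammetrySession on macOS. Paths are placeholders.
let images = URL(fileURLWithPath: "/tmp/AreaCapture/Images", isDirectory: true)
let output = URL(fileURLWithPath: "/tmp/AreaCapture/model.usdz")

let session = try PhotogrammetrySession(input: images)
try session.process(requests: [.modelFile(url: output)])

Task {
    for try await result in session.outputs {
        if case .processingComplete = result {
            print("Reconstruction finished:", output.path)
        }
    }
}
```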
These updates collectively enhance the ability of ARKit to create more immersive and interactive spatial computing experiences. For more detailed examples and demonstrations, you can refer to the sessions mentioned above.
Build a spatial drawing app with RealityKit
Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you’ll explore how resources work in RealityKit and how to use features like low-level mesh and texture APIs to achieve fast updates of the users’ brush strokes.
Discover area mode for Object Capture
Discover how area mode for Object Capture enables new 3D capture possibilities on iOS by extending the functionality of Object Capture to support capture and reconstruction of an area. Learn how to optimize the quality of iOS captures using the new macOS sample app for reconstruction, and find out how to view the final results with Quick Look on Apple Vision Pro, iPhone, iPad or Mac. Learn about improvements to 3D reconstruction, including a new API that allows you to create your own custom image processing pipelines.
Platforms State of the Union
Discover the newest advancements on Apple platforms.
Create enhanced spatial computing experiences with ARKit
Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.