What's new in ARKit?

Generated on 3/26/2025

At WWDC 2024, Apple introduced several updates to ARKit, focused primarily on visionOS. Here are the key highlights:

  1. Room Tracking: ARKit can now identify the boundaries of the room you're in and recognize transitions between rooms, letting apps tailor a unique experience to each space.

  2. Plane Detection: A new slanted plane alignment joins the existing horizontal and vertical alignments, so apps can detect angled surfaces and place virtual content on them.

  3. Object Tracking: ARKit can now track real-world objects that are statically placed in your environment, providing their position and orientation so you can anchor virtual content to them. This capability is new to visionOS.

  4. Hand Tracking: Hand tracking now offers a choice between continuous and predicted tracking, depending on your app's latency needs. This is particularly useful for gesture detection and for attaching content to the hands.

All four features are covered in the Create enhanced spatial computing experiences with ARKit session.
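The features above can be combined in a single ARKit session. A minimal sketch, assuming a visionOS 2 app with an immersive space open, and assuming the provider names RoomTrackingProvider, PlaneDetectionProvider, and HandTrackingProvider as described in the session:

```swift
import ARKit

/// Sketch: run the new visionOS 2 data providers in one ARKitSession.
/// Requires a device; providers need an open immersive space to deliver data.
func runNewProviders() async throws {
    let session = ARKitSession()

    // Room tracking: identifies room boundaries and transitions between rooms.
    let roomTracking = RoomTrackingProvider()

    // Plane detection: the new .slanted alignment detects angled surfaces
    // alongside the existing horizontal and vertical ones.
    let planeDetection = PlaneDetectionProvider(
        alignments: [.horizontal, .vertical, .slanted]
    )

    // Hand tracking: anchors can be consumed continuously via anchorUpdates,
    // or predicted for a future timestamp when latency matters.
    let handTracking = HandTrackingProvider()

    try await session.run([roomTracking, planeDetection, handTracking])

    for await update in roomTracking.anchorUpdates {
        // isCurrentRoom indicates whether the user is inside this room.
        print("Room \(update.anchor.id), current: \(update.anchor.isCurrentRoom)")
    }
}
```

This is a sketch of the session setup pattern, not a complete app; in practice you would also check each provider's `isSupported` flag and request the relevant authorizations before calling `run`.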

These updates are designed to enhance spatial computing experiences, allowing developers to create more immersive and interactive applications.

Discover RealityKit APIs for iOS, macOS and visionOS

Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.

Discover area mode for Object Capture

Discover how area mode for Object Capture enables new 3D capture possibilities on iOS by extending the functionality of Object Capture to support capture and reconstruction of an area. Learn how to optimize the quality of iOS captures using the new macOS sample app for reconstruction, and find out how to view the final results with Quick Look on Apple Vision Pro, iPhone, iPad or Mac. Learn about improvements to 3D reconstruction, including a new API that allows you to create your own custom image processing pipelines.

Create enhanced spatial computing experiences with ARKit

Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.

Build a spatial drawing app with RealityKit

Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you’ll explore how resources work in RealityKit and how to use features like low-level mesh and texture APIs to achieve fast updates of the user’s brush strokes.

Compose interactive 3D content in Reality Composer Pro

Discover how the Timeline view in Reality Composer Pro can bring your 3D content to life. Learn how to create an animated story in which characters and objects interact with each other and the world around them using inverse kinematics, blend shapes, and skeletal poses. We’ll also show you how to use built-in and custom actions, sequence your actions, apply triggers, and implement natural movements.

Platforms State of the Union

Discover the newest advancements on Apple platforms.

Explore object tracking for visionOS

Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object in Reality Composer Pro, RealityKit or ARKit APIs.
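The workflow described above (train a reference object in Create ML, then anchor content to it) can be sketched as follows. This assumes the visionOS 2 names ReferenceObject and ObjectTrackingProvider, and "Globe.referenceobject" is a hypothetical bundled file produced by Create ML's object tracking template:

```swift
import ARKit

/// Sketch: load a Create ML-trained reference object and track it.
/// Requires a device and an open immersive space to receive anchor updates.
func trackReferenceObject() async throws {
    // Hypothetical resource name; the .referenceobject file comes from
    // Create ML's object tracking template.
    guard let url = Bundle.main.url(forResource: "Globe",
                                    withExtension: "referenceobject") else {
        return
    }
    let referenceObject = try await ReferenceObject(from: url)

    let session = ARKitSession()
    let objectTracking = ObjectTrackingProvider(referenceObjects: [referenceObject])
    try await session.run([objectTracking])

    for await update in objectTracking.anchorUpdates {
        // Each ObjectAnchor carries the tracked object's transform, which
        // you can use to position virtual content relative to the object.
        print("Tracked \(update.anchor.referenceObject.name): "
              + "\(update.anchor.originFromAnchorTransform)")
    }
}
```

As the session notes, the same reference object can instead be attached to content in Reality Composer Pro or consumed through RealityKit's anchoring APIs, without writing ARKit code directly.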