What's new in VisionKit?
Generated on 7/14/2024

What's New in VisionKit
At WWDC 2024, several updates and enhancements were introduced for VisionKit and related frameworks. Here are the key highlights:
Vision Pro Enhancements:
- Hand Positioning: You can now decide if you want the user's hands to appear in front of or behind the content, providing more creative control in app experiences.
- Scene Understanding: Improved fidelity in scene understanding capabilities, including detection of planes in all orientations and room anchors that consider the user's surroundings on a per-room basis.
- Object Tracking API: A new API allows you to attach content to individual objects found around the user, enabling new dimensions of interactivity (Platforms State of the Union).
Vision Framework Enhancements:
- New Swift API: A new API with streamlined syntax designed for Swift, making it easier to bring computer vision into your apps.
- Swift Concurrency: Full support for Swift concurrency and Swift 6, enabling the creation of more performant apps.
- New Capabilities: Introduction of new capabilities such as hand pose detection, body pose requests, and aesthetic score requests (Discover Swift enhancements in the Vision framework).
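The new Swift-native Vision API can be sketched roughly as follows. Requests are now value types and `perform(on:)` is async; the text-recognition request shown here is one example of the streamlined syntax (a sketch based on the WWDC24 session, not a definitive implementation — property names may differ slightly in the shipping SDK):

```swift
import Vision
import Foundation

// Sketch of the new Swift Vision API (iOS 18 / macOS 15 era):
// construct a request, then await typed observations directly.
func recognizeText(in imageURL: URL) async throws -> [String] {
    var request = RecognizeTextRequest()
    request.recognitionLevel = .accurate

    // perform(on:) runs asynchronously and returns typed observations,
    // replacing the older handler/completion-based VNRequest flow.
    let observations = try await request.perform(on: imageURL)

    // Each observation carries ranked candidate transcriptions.
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```

Because the API supports Swift concurrency end to end, several requests can be run in parallel with structured concurrency (e.g. `async let`) without manual dispatch-queue management.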
ARKit Updates:
- Room Tracking: New room tracking features to customize experiences based on the room.
- Plane Detection: Updates to plane detection to enhance scene understanding.
- Object Tracking: Enhanced object tracking capabilities that open up new creative possibilities (Create enhanced spatial computing experiences with ARKit).
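The room tracking flow above can be sketched with ARKit's data-provider pattern on visionOS. This is a hedged sketch assuming the `RoomTrackingProvider` API from the WWDC24 session; exact member names may differ:

```swift
import ARKit  // visionOS ARKit

// Sketch: run a room-tracking provider and react as room anchors update,
// so the app can customize content on a per-room basis.
func observeRooms() async throws {
    let session = ARKitSession()
    let roomTracking = RoomTrackingProvider()
    try await session.run([roomTracking])

    for await update in roomTracking.anchorUpdates {
        let room = update.anchor  // a RoomAnchor describing the surrounding room
        if room.isCurrentRoom {
            // The user is inside this room: load or reposition
            // room-specific content here.
            print("Active room anchor: \(room.id)")
        }
    }
}
```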
These updates collectively enhance the capabilities of VisionKit and related frameworks, making it easier for developers to create immersive and interactive experiences on Apple devices.

Get started with HealthKit in visionOS
Discover how to use HealthKit to create experiences that take full advantage of the spatial canvas. Learn the capabilities of HealthKit on the platform, find out how to bring an existing iPadOS app to visionOS, and explore the special considerations governing HealthKit during a Guest User session. You’ll also learn ways to use SwiftUI, Swift Charts, and Swift concurrency to craft innovative experiences with HealthKit.
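As a starting point for the session above, HealthKit authorization on visionOS follows the same pattern as iPadOS. A minimal sketch using async/await (the specific quantity types read here are illustrative, not mandated by the session):

```swift
import HealthKit

// Sketch: check availability, then request read access to a few sample types.
// On visionOS, a Guest User session imposes extra restrictions, so apps
// should always handle the unavailable/unauthorized paths gracefully.
func requestHealthAccess() async throws -> HKHealthStore? {
    guard HKHealthStore.isHealthDataAvailable() else { return nil }

    let store = HKHealthStore()
    let readTypes: Set<HKObjectType> = [
        HKQuantityType(.heartRate),   // illustrative
        HKQuantityType(.stepCount)    // illustrative
    ]
    try await store.requestAuthorization(toShare: [], read: readTypes)
    return store
}
```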

Create enhanced spatial computing experiences with ARKit
Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.
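The object tracking mentioned in this session pairs a reference object (trained in Create ML) with an ARKit provider. A hedged sketch, assuming the visionOS 2 `ObjectTrackingProvider` API; the `"Globe.referenceobject"` resource name is purely illustrative:

```swift
import ARKit
import Foundation

// Sketch: load a trained reference object, then attach content as
// tracked instances of that object are discovered around the user.
func trackGlobe() async throws {
    guard let url = Bundle.main.url(forResource: "Globe",  // illustrative asset
                                    withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)

    let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
    let session = ARKitSession()
    try await session.run([provider])

    for await update in provider.anchorUpdates where update.event == .added {
        // update.anchor.originFromAnchorTransform gives the object's pose,
        // which is where app content would be anchored.
        print("Found tracked object: \(update.anchor.referenceObject.name)")
    }
}
```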

Platforms State of the Union
Discover the newest advancements on Apple platforms.

Explore machine learning on Apple platforms
Get started with an overview of machine learning frameworks on Apple platforms. Whether you’re implementing your first ML model, or an ML expert, we’ll offer guidance to help you select the right framework for your app’s needs.

Discover Swift enhancements in the Vision framework
The Vision Framework API has been redesigned to leverage modern Swift features like concurrency, making it easier and faster to integrate a wide array of Vision algorithms into your app. We’ll tour the updated API and share sample code, along with best practices, to help you get the benefits of this framework with less coding effort. We’ll also demonstrate two new features: image aesthetics and holistic body pose.
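The image aesthetics feature demonstrated in this session can be sketched as below. This follows the naming shown at WWDC24 (`CalculateImageAestheticsScoresRequest`) but should be treated as an assumption until checked against the shipping SDK:

```swift
import Vision
import Foundation

// Sketch: score an image's aesthetic quality with the new request type.
func aestheticsScore(for imageURL: URL) async throws -> Float {
    let request = CalculateImageAestheticsScoresRequest()
    let observation = try await request.perform(on: imageURL)

    // overallScore ranges roughly from -1 (low quality) to 1 (high quality);
    // the observation also flags "utility" images such as screenshots
    // and receipts that are useful but not aesthetically interesting.
    return observation.overallScore
}
```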