What is new with Vision Pro?

Apple introduced several new features and updates for Vision Pro at WWDC 2024. Here are some of the highlights:

  1. Scene Understanding and Object Tracking:

    • Enhanced scene understanding capabilities, including the ability to detect planes in all orientations and anchor objects on surfaces (a plane-detection sketch appears after this list).
    • Introduction of room anchors to consider the user's surroundings on a per-room basis and detect user movement across rooms.
    • New object tracking API that allows content to be attached to individual objects around the user, enabling new dimensions of interactivity (Platforms State of the Union); an object-tracking sketch also appears after this list.
  2. Enterprise APIs:

    • Six new APIs to support enterprise environments, enhancing collaboration, communication, and guided work activities.
    • Enhanced known object tracking, with configurable parameters for tuning and optimizing tracking performance (Introducing enterprise APIs for visionOS).
    • Access to the Apple Neural Engine for on-device machine learning tasks, improving the performance of custom models (Introducing enterprise APIs for visionOS); a Core ML sketch appears after this list.
  3. Device Management:

    • Vision Pro now supports MDM (Mobile Device Management) as of visionOS 1.1, allowing IT teams to manage Vision Pro devices much as they manage iPhones and iPads (What’s new in device management).
  4. HealthKit Integration:

    • New immersive experiences for reflecting on health data, such as entering an immersive space to reflect on calendar events and saving those reflections to HealthKit (Get started with HealthKit in visionOS); a HealthKit sketch appears after this list.
  5. Spatial Web Development:

    • Tools for inspecting and debugging web content in visionOS, including DOM content, CSS rules, and the JavaScript console (Optimize for the spatial web).
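
To make the plane-detection capability concrete, here is a minimal Swift sketch using ARKit's data providers on visionOS. The entity wiring is illustrative rather than taken from Apple's sample code, and the alignment list is a simplification: newer releases detect planes in more orientations than the two shown here.

```swift
import ARKit
import RealityKit

// Minimal sketch: detect planes and drop a small marker on each one.
// Assumes a running immersive space; error handling is simplified.
func trackPlanes(root: Entity) async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
    try await session.run([planes])

    for await update in planes.anchorUpdates where update.event == .added {
        // Place an illustrative marker at the detected plane's position.
        let marker = ModelEntity(mesh: .generateBox(size: 0.05))
        marker.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        root.addChild(marker)
    }
}
```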
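
The new object tracking API follows the same provider pattern. A hedged sketch, assuming a pre-captured `.referenceobject` file; the `Globe` resource name is a hypothetical placeholder.

```swift
import ARKit
import RealityKit

// Minimal sketch: track a known real-world object and attach content to it.
// "Globe.referenceobject" is a hypothetical pre-captured reference file.
func trackObject(root: Entity) async throws {
    let url = Bundle.main.url(forResource: "Globe", withExtension: "referenceobject")!
    let referenceObject = try await ReferenceObject(from: url)

    let session = ARKitSession()
    let tracking = ObjectTrackingProvider(referenceObjects: [referenceObject])
    try await session.run([tracking])

    for await update in tracking.anchorUpdates where update.event == .added {
        // Attach any entity (here, a simple sphere) to the tracked object.
        let label = ModelEntity(mesh: .generateSphere(radius: 0.02))
        label.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        root.addChild(label)
    }
}
```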
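
For the enterprise Neural Engine access, the natural surface is Core ML's compute-unit configuration. A minimal sketch; `ObjectClassifier` is a hypothetical model name, and the enterprise entitlement itself lives in the app's provisioning rather than in code.

```swift
import CoreML

// Minimal sketch: request all compute units, which on Vision Pro includes
// the Apple Neural Engine when the app carries the enterprise entitlement.
// "ObjectClassifier" is a hypothetical compiled model name.
func loadModel() throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // CPU, GPU, and Neural Engine where available

    let url = Bundle.main.url(forResource: "ObjectClassifier", withExtension: "mlmodelc")!
    return try MLModel(contentsOf: url, configuration: config)
}
```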
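
And for the HealthKit reflections, a minimal sketch of saving a state-of-mind sample after an immersive reflection; the valence, label, and association values are illustrative, and a real app would derive them from the user's input.

```swift
import HealthKit

// Minimal sketch: after an immersive reflection, save a state-of-mind
// sample to HealthKit. All values below are illustrative.
func saveReflection(store: HKHealthStore) async throws {
    try await store.requestAuthorization(
        toShare: [HKSampleType.stateOfMindType()],
        read: []
    )

    let sample = HKStateOfMind(
        date: .now,
        kind: .momentaryEmotion,
        valence: 0.6,              // -1 (unpleasant) ... +1 (pleasant)
        labels: [.calm],
        associations: [.selfCare]
    )
    try await store.save(sample)
}
```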

These updates and features aim to enhance the capabilities of Vision Pro, making it a more powerful tool for both developers and enterprise users.