Can I get lighting information about the environment on visionOS?

Generated on 7/31/2024

Yes, you can get lighting information about the environment on visionOS. Here are some relevant details:

  1. Virtual Environment Probe: A virtual environment probe describes the lighting at a location in terms of color variation and illumination intensity. RealityKit uses this information to automatically shade any physically based rendering material, closely approximating the way light bounces off objects, which makes it useful for providing environment lighting for objects in your immersive space (Enhance the immersion of media viewing in custom environments).

  2. Environment Lighting Configuration: This component controls how much lighting an entity receives from the environment probe. It is particularly useful for creating smooth lighting transitions, such as when an object crosses a portal (Discover RealityKit APIs for iOS, macOS, and visionOS).

  3. Lighting Adjustments for Tracking: ARKit on visionOS can react to changes in the environment's lighting. If the system detects low-light conditions that affect world tracking, it switches to orientation-based tracking to prevent a complete loss of tracking (Create enhanced spatial computing experiences with ARKit).

  4. Customizing Tint and Brightness: In visionOS, you can customize tint and brightness so that the lighting on the user's hands blends with the lighting of a custom environment, enhancing the immersive experience (Enhance the immersion of media viewing in custom environments).
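As a concrete illustration of point 2, the sketch below uses RealityKit's EnvironmentLightingConfigurationComponent to adjust how much environment lighting an entity receives, for example while it crosses a portal. The helper function and its `portalProgress` parameter are hypothetical; only the component and its `environmentLightingWeight` property come from the API covered in the session.

```swift
import RealityKit

// Sketch (assumed helper, not code from the session): fade the environment
// lighting an entity receives as it moves through a portal.
func updateEnvironmentLighting(of entity: Entity, portalProgress progress: Float) {
    // environmentLightingWeight ranges from 0 (no lighting contribution
    // from the environment probe) to 1 (full environment lighting).
    let weight = max(0, min(1, progress))
    entity.components.set(
        EnvironmentLightingConfigurationComponent(environmentLightingWeight: weight)
    )
}
```

Calling this each frame with a 0-to-1 progress value produces a smooth lighting transition rather than an abrupt change at the portal boundary.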

For more detailed information, you can refer to the sessions mentioned above.

Enhance the immersion of media viewing in custom environments

Extend your media viewing experience using Reality Composer Pro components like Docking Region, Reverb, and Virtual Environment Probe. Find out how to further enhance immersion using Reflections, Tint Surroundings Effect, SharePlay, and the Immersive Environment Picker.

Discover RealityKit APIs for iOS, macOS, and visionOS

Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.

Platforms State of the Union

Discover the newest advancements on Apple platforms.

Create enhanced spatial computing experiences with ARKit

Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.

Keep colors consistent across captures

Meet the Constant Color API and find out how it can help people use your app to determine precise colors. You’ll learn how to adopt the API, explore its scientific and marketing potential, and discover best practices for making the most of the technology.

Get started with HealthKit in visionOS

Discover how to use HealthKit to create experiences that take full advantage of the spatial canvas. Learn the capabilities of HealthKit on the platform, find out how to bring an existing iPadOS app to visionOS, and explore the special considerations governing HealthKit during a Guest User session. You’ll also learn ways to use SwiftUI, Swift Charts, and Swift concurrency to craft innovative experiences with HealthKit.