AVFoundation and working with the camera

Generated on 8/12/2024


To work with the camera and the foundation technologies behind it, you can explore several WWDC sessions that cover different aspects of camera integration and capture experiences:

  1. Build a great Lock Screen camera capture experience: This session introduces the Locked Camera Capture framework in iOS 18, which allows developers to create a camera capture experience directly accessible from the lock screen. It covers the lifecycle of a capture extension and how to handle permissions and integrate captured content back into your app. You can start exploring this session with the chapter on Great capture experiences.

  2. Build compelling spatial photo and video experiences: This session discusses how to use AVFoundation and other APIs to add spatial media capabilities to your apps. It explains how spatial video is recorded using the iPhone 15 Pro's camera setup and provides a deep dive into spatial media formats. You can start with the chapter on Tour of the new APIs.

  3. Introducing enterprise APIs for visionOS: This session covers how to access the main camera feed and embed it in an app using visionOS. It includes setting up a camera frame provider and handling user authorization for camera access. You can begin with the chapter on Enhanced sensor access.

These sessions provide a comprehensive overview of how to leverage camera functionalities in various contexts, from lock screen experiences to spatial media and enterprise applications.

Build a great Lock Screen camera capture experience

Find out how the LockedCameraCapture API can help you bring your capture application’s most useful information directly to the Lock Screen. Examine the API’s features and functionality, learn how to get started creating a capture extension, and find out how that extension behaves when the device is locked.
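Whatever the extension lifecycle looks like, a locked-camera extension still drives the camera through standard AVFoundation. As background, here is a minimal sketch of that session wiring; the `CaptureController` type is illustrative, not from the session, and error handling is elided:

```swift
import AVFoundation

// Minimal AVFoundation capture setup of the kind a capture
// extension would drive. Type and method names are illustrative.
final class CaptureController {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    }

    func start() {
        // Call off the main thread in a real app; startRunning() blocks.
        session.startRunning()
    }
}
```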

Keep colors consistent across captures

Meet the Constant Color API and find out how it can help people use your app to determine precise colors. You’ll learn how to adopt the API, explore its scientific and marketing potential, and discover best practices for making the most of the technology.
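As a rough sketch of adoption, Constant Color is opted into per capture via photo settings. The property names below assume the iOS 18 AVFoundation additions covered in the session; verify them against current headers:

```swift
import AVFoundation

// Hedged sketch: opting a photo capture into Constant Color.
// isConstantColorSupported / isConstantColorEnabled are assumed
// to match the shipped iOS 18 API.
func makeConstantColorSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    if output.isConstantColorSupported {
        settings.isConstantColorEnabled = true
    }
    return settings
}
```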

Build compelling spatial photo and video experiences

Learn how to adopt spatial photos and videos in your apps. Explore the different types of stereoscopic media and find out how to capture spatial videos in your iOS app on iPhone 15 Pro. Discover the various ways to detect and present spatial media, including the new QuickLook Preview Application API in visionOS. And take a deep dive into the metadata and stereo concepts that make a photo or video spatial.
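In outline, spatial video capture is gated on the active device format and enabled on the movie file output. The property names below assume the AVFoundation API shown in the session; verify against current headers:

```swift
import AVFoundation

// Hedged sketch: enable spatial video recording when the active
// capture format supports it (iPhone 15 Pro class hardware).
func enableSpatialVideo(device: AVCaptureDevice,
                        movieOutput: AVCaptureMovieFileOutput) {
    if device.activeFormat.isSpatialVideoCaptureSupported {
        movieOutput.isSpatialVideoCaptureEnabled = true
    }
}
```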

What’s new in DockKit

Discover how intelligent tracking in DockKit allows for smoother transitions between subjects. We will cover what intelligent tracking is, how it uses an ML model to select and track subjects, and how you can use it in your app.

Discover area mode for Object Capture

Discover how area mode for Object Capture enables new 3D capture possibilities on iOS by extending the functionality of Object Capture to support capture and reconstruction of an area. Learn how to optimize the quality of iOS captures using the new macOS sample app for reconstruction, and find out how to view the final results with Quick Look on Apple Vision Pro, iPhone, iPad or Mac. Learn about improvements to 3D reconstruction, including a new API that allows you to create your own custom image processing pipelines.
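The macOS reconstruction path is built on RealityKit's PhotogrammetrySession, which turns a folder of captured images into a model file. A minimal sketch (paths are placeholders):

```swift
import RealityKit

// Reconstruct a 3D model on macOS from a folder of captured images
// using RealityKit's PhotogrammetrySession. Folder and output URLs
// are placeholders.
func reconstruct(imagesFolder: URL, outputModel: URL) async throws {
    let session = try PhotogrammetrySession(input: imagesFolder)
    try session.process(requests: [.modelFile(url: outputModel)])
    for try await output in session.outputs {
        if case .processingComplete = output {
            print("Model written to \(outputModel.path)")
        }
    }
}
```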

Introducing enterprise APIs for visionOS

Find out how you can use new enterprise APIs for visionOS to create spatial experiences that enhance employee and customer productivity on Apple Vision Pro.
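One of these APIs is main-camera access. In outline, an app with the enterprise entitlement runs a `CameraFrameProvider` in an `ARKitSession` and consumes frames from an async sequence. The names below assume the ARKit-for-visionOS API described in the session; verify against current headers:

```swift
import ARKit

// Hedged sketch of the main-camera feed flow on visionOS.
// Requires the Enterprise API entitlement and user authorization.
func streamMainCamera() async throws {
    let provider = CameraFrameProvider()
    let session = ARKitSession()
    try await session.run([provider])

    let formats = CameraVideoFormat.supportedVideoFormats(for: .main,
                                                          cameraPositions: [.left])
    guard let format = formats.first,
          let updates = provider.cameraFrameUpdates(for: format) else { return }

    for await frame in updates {
        if let sample = frame.sample(for: .left) {
            _ = sample.pixelBuffer  // hand off to your processing pipeline
        }
    }
}
```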