How is compositing done on iOS and macOS?
Generated on 7/30/2024

Compositing on iOS and macOS involves several key components and techniques, as discussed in various WWDC sessions. Here are the main points:
**Compositor Services:**
- visionOS: Uses Compositor Services to blend rendered content with the physical surroundings. The Compositor Services API creates a rendering session, Metal APIs render each frame, and the result is composited with the physical environment, with ARKit providing world and hand tracking (Render Metal with passthrough in visionOS).
- CarPlay: The next generation of CarPlay uses a dedicated compositor to handle video streams from the iPhone and locally rendered UI. This compositor ensures seamless transitions and animations by synchronizing frames at a system level (Meet the next generation of CarPlay architecture).
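The visionOS render loop described in that session can be sketched roughly as follows. This is Swift-flavored pseudocode: the type and method names are recalled from the Compositor Services framework and may not match the shipping SDK exactly, so treat it as an outline of the frame lifecycle rather than copy-paste code.

```swift
// Sketch of a Compositor Services render loop on visionOS.
// API names here are approximate; verify against the current SDK.
func renderLoop(layerRenderer: LayerRenderer, commandQueue: MTLCommandQueue) {
    while layerRenderer.state != .invalidated {
        guard let frame = layerRenderer.queryNextFrame() else { continue }

        frame.startUpdate()
        // Update app state and per-frame simulation here.
        frame.endUpdate()

        guard let drawable = frame.queryDrawable() else { continue }
        frame.startSubmission()

        // Encode Metal render passes into the color and depth textures the
        // compositor provides; the depth texture must use reverse-Z.
        let commandBuffer = commandQueue.makeCommandBuffer()!
        // ... draw into drawable.colorTextures / drawable.depthTextures
        //     for each view ...
        drawable.encodePresent(commandBuffer: commandBuffer)
        commandBuffer.commit()

        frame.endSubmission()
    }
}
```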
**Color and Depth Textures:**
- visionOS: The color pipeline operates in the P3 display color space for better consistency. Compositor Services uses both the color and depth textures produced by the renderer for compositing; the depth texture is expected to follow the reverse-Z convention (Render Metal with passthrough in visionOS).
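The reverse-Z convention mentioned above maps the near plane to depth 1.0 and the far plane to 0.0, which distributes floating-point depth precision more evenly across the frustum. A minimal pure-Swift sketch of that remap (the helper is illustrative, not an Apple API):

```swift
// Computes the normalized [0, 1] depth a reverse-Z projection writes for a
// view-space distance `viewZ`. It is the standard depth remap with the
// near/far roles swapped, so near maps to 1.0 and far maps to 0.0.
func reverseZDepth(near: Double, far: Double, viewZ: Double) -> Double {
    return near / (near - far) - (near * far) / ((near - far) * viewZ)
}

// With Metal, this convention is typically paired with a .greater depth
// compare function and a clear depth of 0.0 instead of the usual 1.0.
let atNear = reverseZDepth(near: 0.1, far: 100.0, viewZ: 0.1)   // ≈ 1.0
let atFar = reverseZDepth(near: 0.1, far: 100.0, viewZ: 100.0)  // ≈ 0.0
```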
**Rendering Techniques:**
- Metal APIs: Used extensively for rendering frames that are then composited. This includes handling pre-multiplied alpha values and ensuring correct color and depth values (Render Metal with passthrough in visionOS).
- Projection View Matrix: In visionOS, a projection view matrix is composed using device anchors and ARKit APIs to ensure correct depth and perspective in the rendered content (Render Metal with passthrough in visionOS).
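The view half of that projection view matrix is the inverse of the device anchor's origin-from-device transform. Because a pose is a rigid transform (rotation plus translation), its inverse can be computed cheaply without a general 4x4 inversion. A self-contained sketch, using plain arrays instead of simd; the matrix values and names are illustrative, not ARKit API:

```swift
// View matrix = inverse of the device pose. For a rigid transform
// [R | t], the inverse is [Rᵀ | -Rᵀt]. Row-major 4x4 as [[Double]].
typealias Mat4 = [[Double]]

func multiply(_ a: Mat4, _ b: Mat4) -> Mat4 {
    var out = Array(repeating: Array(repeating: 0.0, count: 4), count: 4)
    for i in 0..<4 {
        for j in 0..<4 {
            for k in 0..<4 { out[i][j] += a[i][k] * b[k][j] }
        }
    }
    return out
}

// Invert a rigid transform: transpose the rotation, re-derive translation.
func invertRigid(_ m: Mat4) -> Mat4 {
    var inv = Array(repeating: Array(repeating: 0.0, count: 4), count: 4)
    for i in 0..<3 {
        for j in 0..<3 { inv[i][j] = m[j][i] }   // Rᵀ
    }
    for i in 0..<3 {                              // -Rᵀt
        inv[i][3] = -(inv[i][0] * m[0][3] + inv[i][1] * m[1][3]
                      + inv[i][2] * m[2][3])
    }
    inv[3][3] = 1.0
    return inv
}

// Example pose: device translated 1 m along x, rotated 90° about y.
let originFromDevice: Mat4 = [
    [ 0, 0, 1, 1],
    [ 0, 1, 0, 0],
    [-1, 0, 0, 0],
    [ 0, 0, 0, 1],
]
let viewMatrix = invertRigid(originFromDevice)
// viewMatrix * originFromDevice is the identity; multiplying the view
// matrix by a projection matrix yields the projection view matrix.
```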
**Synchronization:**
- CarPlay: Uses a low-latency channel called UISync to synchronize UI elements across displays, ensuring smooth transitions and animations regardless of which side rendered the content (Meet the next generation of CarPlay architecture).
These sessions provide a comprehensive overview of how compositing is handled on iOS, macOS, and visionOS, leveraging Metal, ARKit, and Compositor Services to create seamless and immersive experiences.

Relevant Sessions

Render Metal with passthrough in visionOS
Get ready to extend your Metal experiences for visionOS. Learn best practices for integrating your rendered content with people’s physical environments with passthrough. Find out how to position rendered content to match the physical world, reduce latency with trackable anchor prediction, and more.

Platforms State of the Union
Discover the newest advancements on Apple platforms.

Bring your iOS or iPadOS game to visionOS
Discover how to transform your iOS or iPadOS game into a uniquely visionOS experience. Increase the immersion (and fun factor!) with a 3D frame or an immersive background. And invite players further into your world by adding depth to the window with stereoscopy or head tracking.

Meet the next generation of CarPlay architecture
Dive into the architecture for the next generation of CarPlay. Learn how your vehicle system works with iPhone to create a single cohesive experience that showcases the best of your car and iPhone. Learn how UI is rendered and composited, and explore ways to configure and customize a special experience for each vehicle model. This session is intended for automakers and system developers interested in the next generation of CarPlay.