How do I start using RealityKit?

To start using RealityKit, you can follow these steps:
- Create a New Xcode Project: Begin by creating a new Xcode project with the visionOS app template. Xcode generates a default scene that you can open in Reality Composer Pro.
- Use Reality Composer Pro: Open the default scene in Reality Composer Pro. Delete the default sphere and create an empty transform entity, then add an anchoring component to it; this entity serves as the container for your object anchor. You can then import the reference object you generated with Create ML and associate it with the anchoring component.
- Explore RealityKit APIs: RealityKit provides high-performance 3D simulation and rendering for iOS, macOS, and visionOS, letting you create immersive spatial computing apps and games. Use the RealityKit APIs to build your anchored content and other 3D content (see the sketch after this list).
- Build a Spatial Drawing App: For a practical example, follow the session on building a spatial drawing app with RealityKit. It covers setting up spatial tracking, building a user interface, generating brush geometry, and creating a splash screen.
- Debug with the RealityKit Debugger: Use the RealityKit debugger to inspect your running app and track down bugs. It helps you traverse entity hierarchies, address bad behaviors, and find missing content.
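
As a concrete starting point, here is a minimal sketch of a SwiftUI view that loads the default Reality Composer Pro scene and anchors some extra content. It assumes the names Xcode generates with the visionOS app template (the RealityKitContent package and a root entity called "Scene"); adjust them to match your project.

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // Swift package generated by the visionOS app template

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Load the entity hierarchy authored in Reality Composer Pro.
            // "Scene" is the root entity name in the template project.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }

            // Anchoring in code: place a small sphere on a horizontal,
            // table-like surface at least 20 cm on each side.
            let anchor = AnchorEntity(.plane(.horizontal,
                                             classification: .table,
                                             minimumBounds: [0.2, 0.2]))
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05),
                                     materials: [SimpleMaterial(color: .cyan, isMetallic: false)])
            anchor.addChild(sphere)
            content.add(anchor)
        }
    }
}
```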
For more detailed guidance, you can refer to the following sessions:
- Explore object tracking for visionOS
- Discover RealityKit APIs for iOS, macOS and visionOS
- Build a spatial drawing app with RealityKit
- Break into the RealityKit debugger
Relevant Sessions

Build a spatial drawing app with RealityKit
Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you’ll explore how resources work in RealityKit and how to use features like the low-level mesh and texture APIs to achieve fast updates of the user’s brush strokes.
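
The spatial tracking setup that session walks through can be sketched roughly as follows. This is a hedged sketch assuming the SpatialTrackingSession API shown alongside that session; it requests hand tracking only, and the function name is my own.

```swift
import RealityKit

// Sketch: request hand tracking through RealityKit so that hand-target
// AnchorEntity instances receive live poses.
func startHandTracking() async -> AnchorEntity {
    let session = SpatialTrackingSession()
    let configuration = SpatialTrackingSession.Configuration(tracking: [.hand])
    if await session.run(configuration) != nil {
        // run(_:) returns the capabilities that were not authorized;
        // a real app would fall back gracefully here.
        print("Hand tracking not fully authorized")
    }
    // Once the session is running, this entity follows the user's left palm.
    return AnchorEntity(.hand(.left, location: .palm))
}
```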

Explore object tracking for visionOS
Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how to build spatial experiences with object tracking from start to finish: create a reference object using machine learning in Create ML, then attach content relative to your target object using Reality Composer Pro or the RealityKit and ARKit APIs.
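
If you take the ARKit route, the flow can be sketched like this. The names assume the visionOS object-tracking APIs (ARKitSession, ObjectTrackingProvider, ReferenceObject), and "mug.referenceobject" is a hypothetical file exported from Create ML and bundled with the app.

```swift
import ARKit

// Sketch: load a Create ML reference object and stream anchor updates
// for the matching real-world object.
func trackObject() async throws {
    // "mug.referenceobject" is a hypothetical bundled file.
    guard let url = Bundle.main.url(forResource: "mug",
                                    withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)

    let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
    let session = ARKitSession()
    try await session.run([provider])

    for await update in provider.anchorUpdates {
        // originFromAnchorTransform places virtual content on the real object.
        print(update.anchor.isTracked, update.anchor.originFromAnchorTransform)
    }
}
```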

Platforms State of the Union
Discover the newest advancements on Apple platforms.

What’s new in USD and MaterialX
Explore updates to Universal Scene Description and MaterialX support on Apple platforms. Discover how these technologies provide a foundation for 3D content creation and delivery, and learn how they can help streamline your workflows for creating great spatial experiences. Learn about USD and MaterialX support in RealityKit and Storm, advancements in our system-provided tooling, and more.
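
As one concrete touchpoint, a MaterialX shader graph authored in Reality Composer Pro can be loaded at runtime with RealityKit's ShaderGraphMaterial. In this sketch, "Scene.usda", the "/Root/GlowMaterial" path, and the "glow_strength" input are hypothetical names; substitute whatever your project defines.

```swift
import RealityKit
import RealityKitContent

// Sketch: load a MaterialX shader graph from a Reality Composer Pro package
// and set one of its promoted inputs by name.
func applyGraphMaterial(to model: ModelEntity) async throws {
    var material = try await ShaderGraphMaterial(named: "/Root/GlowMaterial",
                                                 from: "Scene.usda",
                                                 in: realityKitContentBundle)
    try material.setParameter(name: "glow_strength", value: .float(0.8))
    model.model?.materials = [material]
}
```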

Break into the RealityKit debugger
Meet the RealityKit debugger and discover how this new tool lets you inspect the entity hierarchy of spatial apps, debug rogue transformations, find missing entities, and detect which parts of your code are causing problems for your systems.

Discover RealityKit APIs for iOS, macOS and visionOS
Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.
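
As a small taste of those APIs, the sketch below gives an entity the system hover effect. Hover and input handling both need a collision shape to hit-test against, so the helper (a name of my own) generates one first.

```swift
import RealityKit

// Sketch: make an entity highlight when the user looks at or points at it.
func makeHoverable(_ entity: ModelEntity) {
    entity.generateCollisionShapes(recursive: false)  // shapes for hit-testing
    entity.components.set(InputTargetComponent())     // receive input/hover
    entity.components.set(HoverEffectComponent())     // system hover effect
}
```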