Apple has announced that its visionOS software development kit (SDK) is now available, allowing third-party developers to build apps for the upcoming Vision Pro headset. The SDK is built on the same foundational frameworks as Apple’s other operating systems and uses familiar developer tools, including Xcode, SwiftUI, RealityKit, ARKit, and TestFlight.
The company is hoping to lower the barrier to entry for existing developers and is counting on developer interest to help drive excitement around the headset, which was met with a lukewarm reception when it was unveiled at WWDC earlier this month.
The SDK includes a variety of features that will allow developers to create immersive and interactive experiences for the Vision Pro. These features include:
- Spatial computing: The SDK provides developers with the tools they need to create spatial experiences that take advantage of the Vision Pro’s built-in sensors.
- Hand tracking: The SDK allows developers to track the user’s hands in real time, enabling them to interact with virtual objects in a natural way.
- Eye tracking: The system tracks the user’s eye movements to drive gaze-based targeting — users look at an interface element and tap their fingers together to select it.
- Audio: The SDK includes APIs for spatial audio, which will allow developers to create immersive audio experiences.
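To give a sense of how these pieces fit together, here is a minimal sketch of a visionOS app built with the SwiftUI and RealityKit tools mentioned above: a window hosting a 3D sphere that the user can select with the system’s gaze-and-pinch interaction. It is illustrative only — it assumes the visionOS SDK and would need to be built in Xcode and run in the Vision Pro simulator.

```swift
import SwiftUI
import RealityKit

// Minimal visionOS app sketch: one SwiftUI window containing a
// RealityKit scene with a single tappable sphere.
@main
struct SphereApp: App {
    var body: some Scene {
        WindowGroup {
            SphereView()
        }
    }
}

struct SphereView: View {
    var body: some View {
        RealityView { content in
            // Create a simple blue sphere and add it to the scene.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            // Opt the entity into system input: gaze targeting plus a
            // collision shape so taps can hit-test against it.
            sphere.components.set(InputTargetComponent())
            sphere.components.set(
                CollisionComponent(shapes: [.generateSphere(radius: 0.1)])
            )
            content.add(sphere)
        }
        // Respond to an indirect tap: look at the sphere, then pinch.
        .gesture(TapGesture().targetedToAnyEntity().onEnded { _ in
            print("Sphere tapped")
        })
    }
}
```

Notably, apps never see raw sensor data here — the system resolves gaze and hand input and delivers it as ordinary gesture events.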
The SDK is available at least half a year before the headset officially goes on sale in the U.S., priced at $3,500. Apple is no doubt aiming for a well-stocked App Store by the time the system arrives in early 2024. Content has been a major sticking point throughout years of VR and AR development, and Apple is hoping to avoid that problem by giving developers the tools they need to create compelling experiences for the Vision Pro.
The SDK release is a significant milestone in the development of the Vision Pro. With it now available, developers can begin creating content for the headset well ahead of launch, which should help build momentum and adoption for the device.