Apple has announced the release of the visionOS SDK, a new software development kit that allows developers to create immersive, interactive applications for its visionOS platform. visionOS is the operating system for Apple Vision Pro, the company's spatial computing headset, which is expected to launch in early 2024. The SDK gives developers access to visionOS capabilities such as spatial audio, hand tracking, and scene understanding, and lets them integrate their applications with other Apple services such as iCloud, Siri, and Apple Pay.
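For a sense of what working with these capabilities looks like, the sketch below shows one way to stream hand-tracking updates through the ARKit APIs on visionOS. The function name and logging are illustrative only, and a real app would also need to request hand-tracking authorization before any data arrives.

```swift
import ARKit

// Minimal sketch: subscribe to hand-tracking anchors from the visionOS ARKit session.
// `trackHands` is a hypothetical helper; production code must also request
// authorization and handle session interruptions.
func trackHands() async {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    do {
        try await session.run([handTracking])
        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            // Each HandAnchor reports which hand it describes and its transform in space.
            print("\(anchor.chirality) hand updated at \(anchor.originFromAnchorTransform)")
        }
    } catch {
        print("Hand tracking is unavailable: \(error)")
    }
}
```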
The Vision Pro runs on visionOS, which Apple describes as the world’s first spatial operating system. Developers can now tap into its possibilities through the visionOS Software Development Kit (SDK), which lets them design, develop, and test apps across categories such as productivity, design, and gaming.
Apple also plans to open developer labs next month in six cities around the world: Cupertino, London, Munich, Shanghai, Singapore, and Tokyo. These labs will give developers hands-on experience testing their apps on Apple Vision Pro hardware with support from Apple engineers. Development teams can also apply for developer kits to speed up building, iterating on, and testing their apps for Apple Vision Pro.
Susan Prescott, Apple’s vice president of Worldwide Developer Relations, stated, “Spatial computing unlocks new opportunities for our developers, enabling them to imagine new ways to help their users connect, be productive, and enjoy new types of entertainment. We can’t wait to see what our developer community dreams up.”
Apple Vision Pro features a brand-new App Store, where users can discover apps and content. Developers can create immersive app experiences using familiar tools and frameworks from other Apple platforms, including Xcode, SwiftUI, RealityKit, ARKit, and TestFlight, to build windows, volumes, and spaces, the three core elements of a spatial computing experience.
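As a rough illustration of how those three components map onto code, the sketch below declares one of each in a SwiftUI app. The app name, scene identifiers, and the "Globe" asset are placeholders rather than anything from Apple's samples.

```swift
import SwiftUI
import RealityKit

@main
struct SpatialSampleApp: App {          // hypothetical app name
    var body: some Scene {
        // A window: familiar 2D SwiftUI content shown in a resizable plane.
        WindowGroup(id: "main") {
            Text("Hello, visionOS")
        }

        // A volume: a bounded 3D window users can place anywhere in their room.
        WindowGroup(id: "globe") {
            Model3D(named: "Globe")     // placeholder asset name
        }
        .windowStyle(.volumetric)

        // A space: fully immersive RealityKit content that surrounds the user.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // Add RealityKit entities to the scene here.
            }
        }
    }
}
```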
In addition, Apple has introduced a new tool, Reality Composer Pro, which is bundled with Xcode. It lets developers preview and optimize 3D models, animations, images, and sounds, making it easier to create engaging, visually rich apps for Vision Pro.
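The snippet below sketches how content authored in Reality Composer Pro might be loaded at runtime, assuming the "RealityKitContent" Swift package and "Scene" entity that Xcode's visionOS app template generates; the names will differ in other projects.

```swift
import SwiftUI
import RealityKit
import RealityKitContent   // package generated by Xcode's visionOS template

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Load the entity authored in Reality Composer Pro and add it to the scene.
            // "Scene" and `realityKitContentBundle` come from the template package.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```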
Developers can also use the visionOS simulator to test their apps under different room layouts and lighting conditions, and they get built-in support for Apple’s accessibility features, helping ensure that visionOS apps in this new era of spatial computing are accessible to a wide range of users.
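As one small example of how that accessibility support surfaces to developers, the standard SwiftUI accessibility modifiers carry over to spatial content; the view and asset names below are illustrative.

```swift
import SwiftUI
import RealityKit

struct GlobeVolume: View {              // hypothetical view
    var body: some View {
        Model3D(named: "Globe")         // placeholder asset name
            // Familiar SwiftUI accessibility modifiers apply to 3D content,
            // giving VoiceOver users a label, a hint, and a direct-interaction trait.
            .accessibilityLabel("Interactive globe")
            .accessibilityHint("Rotate to explore different regions")
            .accessibilityAddTraits(.allowsDirectInteraction)
    }
}
```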
Next month, Unity developers will have the ability to port their 3D apps and games to Apple Vision Pro and fully harness its unique capabilities.
Developers who have had the opportunity to preview the visionOS SDK and APIs have shared their excitement over the platform’s potential. Apps such as Complete HeartX, djay, JigSpace, and Stages have showcased the broad range of applications and immersive experiences that can be created using Apple Vision Pro.
The visionOS SDK, updated Xcode, Simulator, and Reality Composer Pro are now available for Apple Developer Program members on the official developer website. Apple provides an abundance of resources to aid developers in designing, developing, and testing apps for Apple Vision Pro, including extensive technical documentation, new design kits, and updated human interface guidelines for visionOS.