Welcome to Niantic SDK for Unity
SDK for Unity is Niantic's toolkit for creating immersive location-based experiences. It extends Unity's AR Foundation subsystems so developers can seamlessly mix and match Niantic's unique AR features with Unity's existing AR framework. Any existing AR Foundation project can be upgraded with Niantic Spatial Platform. SDK for Unity will override existing systems (such as Occlusion and Meshing) and add new ones that support Niantic's features.
Developers can use Unity's AR Foundation documentation and tutorials for basic AR concepts, then build on them with Niantic's advanced AR features, guides, and samples.
Access the SDK download, sample code, and other resources by visiting our GitHub repositories (opens in new window) and the installation guide.
What's New
Niantic offers:
- Depth, Occlusion, and Meshing that work on any device, regardless of lidar capability, and at greater distances than lidar-based AR. These systems seamlessly override and extend the base AR Foundation managers; if you already have an AR Foundation project, no extra work is needed.
- Semantic Segmentation for more than just people - there are 20 channels available for mask generation and scene queries for AI and gameplay.
- Object Detection for over 200 classes - supercharge your application's contextual awareness!
- Dynamic Navigation in AR using Navigation Mesh - you can have AI creatures move around your scene as it is generated.
- The Visual Positioning System (VPS) provides a way to persistently anchor content to real-world locations with centimeter-level accuracy.
- Shared AR allows up to ten players to join a room and interact in a multiplayer AR space through a process called co-localization. After co-localizing (using either VPS or a QR code), players can see the positions of objects and each other in the same physical space! Each Shared AR room lets players send networked messages to each other and has access to a real-time datastore. Thanks to Shared AR's modular architecture, developers can even swap in other services they would like to use in multiplayer settings!
- World Pose (WPS) is a client-side visual odometry feature that works alongside VPS to improve localization accuracy beyond the traditional GPS-plus-compass technique, with no prior mapping required. Using computer vision through your phone's camera, WPS continuously updates and refines its estimate of your device's position and orientation, resulting in greatly improved localization, even in areas where VPS is not yet supported.
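The first bullet above notes that Niantic's Depth, Occlusion, and Meshing slot in behind the standard AR Foundation managers, so scene setup looks the same as in a plain AR Foundation project. A minimal sketch under that assumption, using only AR Foundation's own components (`AROcclusionManager` and `EnvironmentDepthMode` are standard AR Foundation types; no Niantic-specific calls are shown):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: the same occlusion setup you would write for any AR Foundation
// project. With the Niantic SDK enabled, its depth/occlusion subsystem
// provides the data underneath -- no Niantic-specific API calls are needed.
public class OcclusionSetup : MonoBehaviour
{
    // Assign the scene's AROcclusionManager (usually on the AR Camera).
    [SerializeField] private AROcclusionManager occlusionManager;

    void Start()
    {
        // Request the highest-quality environment depth the active
        // depth subsystem can provide.
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
    }
}
```

Because the override happens at the subsystem level, existing scripts that read from `AROcclusionManager` continue to work unchanged.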
AR Foundation offers:
- Session Management
- Rendering
- Plane Tracking
- Point Clouds
- Face Tracking
- Environment Probes
- Occlusion
- Body Tracking (Apple ARKit only)
- Meshing (Apple ARKit only)
For full details on AR Foundation's offerings, see the Unity documentation (opens in new window).
For a list of supported devices in AR Foundation, see the Google ARCore device list and Apple ARKit device list (links open in new window).
Other Benefits of Upgrading
- Faster to get started; just add the UPM package and enable it!
- More performant rendering; our latest release is optimized to work in tandem with Unity's XR stack, improving framerate.
- Smaller runtime size, reducing the size of your app.
- Compatibility with ARKit and ARCore features such as face tracking, body tracking, and environment probes.
- More available documentation due to compatibility with existing AR Foundation tutorials and workflows.
- Porting or extending AR Foundation-based projects is now as simple as enabling the Niantic SDK in Unity and continuing your work.
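To illustrate the "just add the UPM package" point above: installing a UPM package comes down to one entry in the project's `Packages/manifest.json`. The package name and version below are placeholders, not the real values; see the installation guide for the exact package identifier and install method.

```json
{
  "dependencies": {
    "com.example.niantic-sdk": "x.y.z",
    "com.unity.xr.arfoundation": "x.y.z"
  }
}
```

Unity resolves the dependency the next time the project is opened, after which the SDK can be enabled in the project's XR settings.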
If you are migrating from previous versions of our SDK, please see the Migration Guide.