Whether you want to use DirectX or Unity to develop your mixed reality app, you will use Visual Studio for debugging and deploying. In this section, you will learn how to:
- Deploy applications to your HoloLens or Windows Mixed Reality immersive headset through Visual Studio.
- Use the HoloLens emulator built into Visual Studio.
- Debug mixed reality apps.
Learn one set of tools, build for many devices. The software, APIs, and core building blocks are shared across Windows Mixed Reality, so you can invest in one platform to build experiences for HoloLens, VR headsets, or even experiences that target both.
Prerequisites
- See Install the Tools for installation instructions.
- Create a new Universal Windows app project in Visual Studio. For HoloLens (1st gen), use Visual Studio 2017 or newer. For HoloLens 2, use Visual Studio 2019 16.2 or newer. C# and C++ are supported. (Or follow the instructions to create an app in Unity.)
Enabling Developer Mode
Start by enabling Developer Mode on your device, so Visual Studio can connect to it.
HoloLens
- Turn on your HoloLens and put on the device.
- Perform the start gesture to launch the main menu.
- Select the Settings tile to launch the app in your environment.
- Select the Update menu item.
- Select the For developers menu item.
- Enable Developer Mode. This will allow you to deploy apps from Visual Studio to your HoloLens.
- Optional: Scroll down and also enable Device Portal, which lets you connect to the Windows Device Portal on your HoloLens from a web browser.
Windows PC
If you are working with a Windows Mixed Reality headset connected to your PC, you must enable Developer Mode on the PC.
- Go to Settings
- Select Update and Security
- Select For developers
- Enable Developer Mode, read the disclaimer for the setting you chose, then click Yes to accept the change.
Deploying an app over Wi-Fi - HoloLens (1st gen)
- Select an x86 build configuration for your app
- Select Remote Machine in the deployment target drop-down menu
- For C++ and JavaScript projects, go to Project > Properties > Configuration Properties > Debugging. For C# projects, a dialog will automatically appear to configure your connection.
  a. Enter the IP address of your device in the Address or Machine Name field. Find the IP address on your HoloLens under Settings > Network & Internet > Advanced Options, or ask Cortana 'What is my IP address?'
  b. Set Authentication Mode to Universal (Unencrypted protocol)
- Select Debug > Start debugging to deploy your app and start debugging
- The first time you deploy an app to your HoloLens from your PC, you will be prompted for a PIN. Follow the Pairing your device instructions below.
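Besides the Visual Studio menus above, the Windows 10 SDK also ships a command-line tool, WinAppDeployCmd, which can install an app package on a HoloLens over Wi-Fi. As a minimal, hedged sketch, the helper below only assembles the argument list; the package path, IP address, and PIN are placeholders, not values from this article.

```python
# Minimal sketch: build a WinAppDeployCmd invocation for Wi-Fi deployment.
# WinAppDeployCmd ships with the Windows 10 SDK; the package path, IP
# address, and PIN below are placeholder example values.

def build_deploy_command(package_path, device_ip, pin=None):
    """Return the WinAppDeployCmd argument list for installing a package."""
    cmd = ["WinAppDeployCmd", "install", "-file", package_path, "-ip", device_ip]
    if pin:  # only needed the first time, before the PC is paired with the device
        cmd += ["-pin", pin]
    return cmd

command = build_deploy_command("MyApp_x86.appx", "192.168.1.42", pin="123456")
print(" ".join(command))
```

Running the resulting command (for example via `subprocess.run`) requires the device to be on, unlocked, and reachable over Wi-Fi; over USB the same tool targets the loopback address 127.0.0.1.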
Deploying an app over Wi-Fi - HoloLens 2
- Select an ARM or ARM64 build configuration for your app
- Select Remote Machine in the deployment target drop-down menu
- For C++ and JavaScript projects, go to Project > Properties > Configuration Properties > Debugging. For C# projects, a dialog will automatically appear to configure your connection.
  a. Enter the IP address of your device in the Address or Machine Name field. Find the IP address on your HoloLens under Settings > Network & Internet > Advanced Options, or ask Cortana 'What is my IP address?'
  b. Set the Authentication Mode to Universal (Unencrypted protocol)
- Select Debug > Start debugging to deploy your app and start debugging
- The first time you deploy an app to your HoloLens from your PC, you will be prompted for a PIN. Follow the Pairing your device instructions below.
If your HoloLens IP address changes, you can change the IP address of the target machine by going to Project > Properties > Configuration Properties > Debugging
Deploying an app over USB - HoloLens (1st gen)
- Select an x86 build configuration for your app
- Select Device in the deployment target drop-down menu
- Select Debug > Start debugging to deploy your app and start debugging
- The first time you deploy an app to your HoloLens from your PC, you will be prompted for a PIN. Follow the Pairing your device instructions below.
Deploying an app over USB - HoloLens 2
- Select an ARM or ARM64 build configuration for your app
- Select Device in the deployment target drop-down menu
- Select Debug > Start debugging to deploy your app and start debugging
- The first time you deploy an app to your HoloLens from your PC, you will be prompted for a PIN. Follow the Pairing your device instructions below.
Deploying an app to your Local PC - immersive headset
Follow these instructions when using a Windows Mixed Reality immersive headset that connects to your PC, or when using the Mixed Reality simulator. In these cases, deploy and run your app on the local PC.
- Select an x86 or x64 build configuration for your app
- Select Local Machine in the deployment target drop-down menu
- Select Debug > Start debugging to deploy your app and start debugging
Pairing your device
The first time you deploy an app from Visual Studio to your HoloLens, you will be prompted for a PIN. On the HoloLens, generate a PIN by launching the Settings app, going to Update > For Developers, and tapping Pair. A PIN will be displayed on your HoloLens; type this PIN in Visual Studio. After pairing is complete, tap Done on your HoloLens to dismiss the dialog. This PC is now paired with the HoloLens and you will be able to deploy apps automatically. Repeat these steps for every subsequent PC that is used to deploy apps to your HoloLens.
To un-pair your HoloLens from all computers it was paired with, launch the Settings app, go to Update > For Developers, and tap Clear.
Deploying an app to the HoloLens (1st gen) Emulator
- Make sure you have installed the HoloLens Emulator.
- Select an x86 build configuration for your app.
- Select HoloLens Emulator in the deployment target drop-down menu
- Select Debug > Start debugging to deploy your app and start debugging
Deploying an app to the HoloLens 2 Emulator
- Make sure you have installed the HoloLens Emulator.
- Select an x86 or x64 build configuration for your app.
- Select HoloLens 2 Emulator in the deployment target drop-down menu
- Select Debug > Start debugging to deploy your app and start debugging
Graphics Debugger for HoloLens (1st gen)
The Visual Studio Graphics Diagnostics tools are very helpful when writing and optimizing a Holographic app. See Visual Studio Graphics Diagnostics on MSDN for full details.
To Start the Graphics Debugger
- Follow the instructions above to target a device or emulator
- Go to Debug > Graphics > Start Diagnostics
- The first time you do this with a HoloLens, you may get an 'access denied' error. Reboot your HoloLens to allow updated permissions to take effect and try again.
Profiling
The Visual Studio profiling tools allow you to analyze your app's performance and resource use. This includes tools to optimize CPU, memory, graphics, and network use. See Run diagnostic tools without debugging on MSDN for full details.
To Start the Profiling Tools with HoloLens
- Follow the instructions above to target a device or emulator
- Go to Debug > Start Diagnostic Tools Without Debugging...
- Select the tools you want to use
- Click Start
- The first time you do this with a HoloLens, you may get an 'access denied' error. Reboot your HoloLens to allow updated permissions to take effect and try again.
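The measure-before-optimizing workflow behind these tools can be illustrated in miniature outside Visual Studio. This standalone Python sketch uses the standard library's cProfile as a rough analogy to the CPU Usage tool: it records which functions consume the most time, so you know where optimization effort pays off.

```python
# Standalone illustration of CPU profiling (analogous in spirit to the
# Visual Studio CPU Usage tool): cProfile records per-function time.
import cProfile
import io
import pstats

def simulate_frame():
    # Stand-in for per-frame work in an app; pure Python busywork.
    return sum(i * i for i in range(10_000))

profiler = cProfile.Profile()
profiler.enable()
for _ in range(100):
    simulate_frame()
profiler.disable()

# Render the top entries sorted by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
report = stream.getvalue()
print(report)
```

The report lists each function with its call count and cumulative time; in a real mixed reality app, the Visual Studio tools play this role against the code actually running on the device or emulator.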
Debugging an installed or running app
You can use Visual Studio to debug a Universal Windows app that's installed without deploying from a Visual Studio project. This is useful if you want to debug an installed app package, or if you want to debug an app that's already running.
- Go to Debug > Other Debug Targets > Debug Installed App Package
- Select the Remote Machine target for HoloLens or Local Machine for immersive headsets.
- Enter your device's IP address
- Choose the Universal Authentication Mode
- The window shows both running and inactive apps. Pick the one that you'd like to debug.
- Choose the type of code to debug (Managed, Native, Mixed)
- Click Attach or Start
Next Development Checkpoint
If you're following the Unity development checkpoint journey we've laid out, you're in the midst of the deployment stage. From here, you can proceed to the next topic:
Or jump directly to adding advanced services:
You can always go back to the Unity development checkpoints at any time.
See also
MRTK-Unity is a Microsoft-driven project that provides a set of components and features, used to accelerate cross-platform MR app development in Unity. Here are some of its functions:
- Provides the cross-platform input system and building blocks for spatial interactions and UI.
- Enables rapid prototyping via in-editor simulation that allows you to see changes immediately.
- Operates as an extensible framework that provides developers the ability to swap out core components.
- Supports a wide range of platforms, including:
  - Microsoft HoloLens
  - Microsoft HoloLens 2
  - Windows Mixed Reality headsets
  - OpenVR headsets (HTC Vive / Oculus Rift)
  - Ultraleap Hand Tracking
  - Mobile devices such as iOS and Android
If you're new to MRTK or Mixed Reality development in Unity, we recommend you start at the beginning of our Unity development journey in the Microsoft Docs. The Unity development journey is specifically tailored to walk new developers through the installation, core concepts, and usage of MRTK.
IMPORTANT: The Unity development journey currently uses MRTK version 2.4.0 and Unity 2019.4.
If you're an experienced Mixed Reality or MRTK developer, check the links in the next section for the newest packages and release notes.
Release Notes | MRTK Overview | Feature Guides | API Reference
- To build apps with MRTK v2, you need the Windows 10 May 2019 Update SDK. To run apps for immersive headsets, you need the Windows 10 Fall Creators Update.
- The Unity 3D engine provides support for building mixed reality projects in Windows 10.
- Visual Studio is used for code editing, deploying, and building UWP app packages.
- The emulators allow you to test your app without the device in a simulated environment.
MRTK feature areas include:

- Input System
- Hand Tracking (HoloLens 2)
- Eye Tracking (HoloLens 2)
- Profiles
- Hand Tracking (Ultraleap)
- UI Controls
- Solvers
- Multi-Scene Manager
- Spatial Awareness
- Diagnostic Tool
- MRTK Standard Shader
- Speech & Dictation
- Boundary System
- In-Editor Simulation
- Experimental Features

UX building blocks include:

- A button control which supports various input methods, including HoloLens 2's articulated hand
- Standard UI for manipulating objects in 3D space
- Script for manipulating objects with one or two hands
- 2D style plane which supports scrolling with articulated hand input
- Example script of using the system keyboard in Unity
- A script for making objects interactable with visual states and theme support
- Various object positioning behaviors such as tag-along, body-lock, constant view size, and surface magnetism
- Script for laying out an array of objects in a three-dimensional shape
- Annotation UI with a flexible anchor/pivot system, which can be used for labeling motion controllers and objects
- Slider UI for adjusting values, supporting direct hand tracking interaction
- MRTK's Standard shader, which supports various Fluent design elements with performance
- Hand-locked UI for quick access, using the Hand Constraint Solver
- UI for Bounds Control's manual activation
- Various types of pointers
- Visual affordance on the fingertip which improves confidence in direct interaction
- Voice Command / Dictation
- Floating menu UI for near interactions
- Making your holographic objects interact with the physical environment
- Scripts and examples for integrating speech input
- Visual indicator for communicating data processing or an operation in progress
- UI for asking for the user's confirmation or acknowledgement
- Component that helps guide the user when the gesture has not been taught
- The hand physics service, which enables rigid body collision events and interactions with articulated hands
- An Object Collection that natively scrolls 3D objects
- The Dock, which allows objects to be moved in and out of predetermined positions
- Combining eyes, voice, and hand input to quickly and effortlessly select holograms across your scene
- Auto-scrolling text or fluently zooming into focused content based on what you are looking at
- Examples for logging, loading, and visualizing what users have been looking at in your app
Editor tools include:

- Automate configuration of Mixed Reality projects for performance optimizations
- Analyze dependencies between assets and identify unused assets
- Build Window: configure and execute an end-to-end build process for Mixed Reality applications
- Record and playback head movement and hand tracking data in editor
Explore MRTK's various types of interactions and UI controls in this example scene.
You can find other example scenes under Assets/MixedRealityToolkit.Examples/Demos folder.
With the MRTK Examples Hub, you can try various example scenes in MRTK. You can find pre-built app packages for HoloLens (x86), HoloLens 2 (ARM), and Windows Mixed Reality immersive headsets (x64) under the Release Assets folder. Use the Windows Device Portal to install apps on HoloLens. On HoloLens 2, you can download and install MRTK Examples Hub through the Microsoft Store app.
See Examples Hub README page to learn about the details on creating a multi-scene hub with MRTK's scene system and scene transition service.
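The Device Portal used for side-loading packages also exposes a REST API; for example, its app package manager endpoint lists the packages installed on the device. As a hedged sketch (the endpoint path follows the public Device Portal API reference; the IP address is a placeholder), the request URL can be built like this:

```python
# Sketch: build the Windows Device Portal request URL for listing installed
# app packages. The endpoint (/api/app/packagemanager/packages) is from the
# public Device Portal REST API; the IP address below is a placeholder.
from urllib.parse import urlunsplit

def package_list_url(device_ip, use_https=True, port=None):
    """Return the Device Portal URL that lists installed app packages."""
    scheme = "https" if use_https else "http"
    netloc = device_ip if port is None else f"{device_ip}:{port}"
    return urlunsplit((scheme, netloc, "/api/app/packagemanager/packages", "", ""))

url = package_list_url("192.168.1.42")
print(url)
```

The actual request would be sent with the Device Portal credentials you configured on the device; the sketch stops at URL construction so it stays runnable anywhere.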
- Periodic Table of the Elements is an open-source sample app which demonstrates how to use MRTK's input system and building blocks to create an app experience for HoloLens and immersive headsets. Read the porting story: Bringing the Periodic Table of the Elements app to HoloLens 2 with MRTK v2.
- Galaxy Explorer is an open-source sample app that was originally developed in March 2016 as part of the HoloLens 'Share Your Idea' campaign. Galaxy Explorer has been updated with new features for HoloLens 2, using MRTK v2. Read the story: The Making of Galaxy Explorer for HoloLens 2.
- Surfaces is an open-source sample app for HoloLens 2 which explores how we can create a tactile sensation with visual, audio, and fully articulated hand-tracking. Check out the Microsoft MR Dev Days session Learnings from the Surfaces app for the detailed design and development story.
- Tutorial on how to create a simple MRTK app from start to finish. Learn about interaction concepts and MRTK's multi-platform capabilities.
- Deep dive on MRTK's UX building blocks that help you build beautiful mixed reality experiences.
- An introduction to performance tools, both in MRTK and external, as well as an overview of the MRTK Standard Shader.
See Mixed Reality Dev Days to explore more session videos.
Join the conversation around MRTK on Slack. You can join the Slack community via the automatic invitation sender.
Ask questions about using MRTK on Stack Overflow using the MRTK tag.
Search for known issues or file a new issue if you find something broken in MRTK code.
For questions about contributing to MRTK, go to the mixed-reality-toolkit channel on Slack.
This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
| Discover | Design | Develop | Distribute |
| --- | --- | --- | --- |
| Learn to build mixed reality experiences for HoloLens and immersive headsets (VR). | Get design guides. Build user interface. Learn interactions and input. | Get development guides. Learn the technology. Understand the science. | Get your app ready for others and consider creating a 3D launcher. |
| Spatial Anchors | Speech Services | Vision Services |
| --- | --- | --- |
| Spatial Anchors is a cross-platform service that allows you to create Mixed Reality experiences using objects that persist their location across devices over time. | Discover and integrate Azure-powered speech capabilities like speech to text, speaker recognition, or speech translation into your application. | Identify and analyze your image or video content using Vision Services like computer vision, face detection, emotion recognition, or video indexer. |
You can find our planning material on our wiki under the Project Management Section. You can always see the items the team is actively working on in the Iteration Plan issue.
Learn how you can contribute to MRTK at Contributing.
For details on the different branches used in the Mixed Reality Toolkit repositories, check this Branch Guide here.