
A Quick Start with VIVE Wave™ for Developers



Getting Started with VIVE Wave™ for Unity Developers (updated!)

First, download the Wave SDK:

Legacy: use this if you are not yet on the Unity XR Management system (introduced with Unity 2019.4), e.g. if you are still on Unity 2018.

Note: Porting to the VIVE Wave platform 

General porting guides from Daydream (mobile) or from the VIVE to Wave are available here: https://hub.vive.com/storage/app/doc/en-us/PortingGuide.html  Please note that the quick guides below focus on Unity scenes and code; when porting across platforms and devices, also keep graphics considerations in mind (depending on CPU/GPU/RAM, etc.).

If your existing application used the SteamVR APIs directly, note that most of the Wave APIs have a one-to-one correspondence to SteamVR. So if you have a 3rd-party framework that wraps SteamVR, you should also be able to support the Wave SDK by mapping the APIs as shown in the VIVE porting guide. When porting from other devices, also consider toolkits such as the Unity XR Interaction Toolkit (in preview) for Unity XR plugins like Wave 3.2+, or VIU (Vive Input Utility), which supports both Unity XR Management and the legacy pipeline.


A Quick Start Guide for developing in Unity: 

The following are the steps for setting up a scene with the Wave SDK (legacy), but also see the alternative below it, which uses the VIU toolkit along with the Wave SDK for cross-platform support under either the legacy pipeline or the new Unity XR Management system.

1) For legacy Unity (pre-Wave 3.2): Launch Unity, create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide to set up Android: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html)   Note: for the Wave XR plugin (Unity 2019.4), use the Package Manager; you can also skip Android Studio and use Unity's built-in Android support.

2) For legacy support:  Import wavevr.unitypackage  (Assets->Import Package->Custom Package...)   

3) For legacy support: From the Project Assets window, drag and drop the WaveVR and ControllerLoader prefabs into your Scene Hierarchy window to create objects in your scene (delete or disable the existing Camera object; there's one already included in WaveVR). For Wave XR plugin support, you can use the Unity XR APIs as with any other XR plugin.

4) For legacy: Duplicate the ControllerLoader in your scene (or drag another one in) and select its Type as the left controller in the Inspector window as shown above. At this point it's up to you how to handle a second controller's inputs (or simply handle them the same as the first). For the Wave XR plugin, see the samples included with the packages.

5) From File->Build Settings..., select Build and Run (make sure the device is attached via USB and the Developer settings on the device allow USB debugging). VIU (more below) can use a simulator when developing for all platforms.

Note: if at any time you are prompted by a WaveVR plugin popup window to accept preferred settings, simply accept unless you have a good reason not to. You can safely dismiss the AndroidManifest-related popup for now, until you are ready to publish on Viveport (it indicates 3DOF vs. 6DOF support, or both).

At this point you should be able to see your empty scene with your controller on your Wave device!
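With the legacy setup above in place, controller input is polled through the WaveVR classes imported from wavevr.unitypackage. The following is a minimal sketch of reading the right controller's trigger; the enum and member names shown are from the legacy SDK and may differ slightly between Wave SDK versions, so check the samples shipped with the package:

```csharp
using UnityEngine;
using wvr; // legacy Wave SDK namespace (from wavevr.unitypackage)

// Minimal sketch: poll legacy Wave controller input each frame.
// Attach to any GameObject in the scene.
public class WaveTriggerSample : MonoBehaviour
{
    void Update()
    {
        var rightHand = WaveVR_Controller.Input(WVR_DeviceType.WVR_DeviceType_Controller_Right);

        // Alias1_Trigger is the trigger button on Focus/Focus Plus controllers.
        if (rightHand.GetPressDown(WVR_InputId.WVR_InputId_Alias1_Trigger))
        {
            Debug.Log("Right trigger pressed");
        }
    }
}
```

For a second controller (step 4 above), the same call with WVR_DeviceType_Controller_Left gives you the left hand's state.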


Alternative Quick Start Using the VIU (Vive Input Utility) Unity plugin: 

There is an additional Unity plugin for developing VR projects that can target multiple platforms and devices. It is highly recommended, especially for new projects or projects that don't require extensive use of the Wave SDK APIs (although you can access both APIs at once).

The Vive Input Utility (VIU) is a Unity plugin that supports the Vive, Vive Pro, Rift, Daydream, Go, Quest, and the Wave SDK (e.g. Focus and Focus Plus), in addition to Unity's XR APIs, which in turn support Windows MR and more. It is an abstraction that wraps other SDKs like the Wave SDK, creating a common code base for many platforms. It's available on the Unity Asset Store (search for VIU) or at https://github.com/ViveSoftware


Steps to create the same application but using the VIU plugin: 

1) Launch Unity, create a New Project, and import VIU (Vive Input Utility) from the Unity Asset Store, the Package Manager, or GitHub

2) Drag and drop the ViveCameraRig (or the ViveRig for additional features) into your scene and remove the existing Camera object (there is a camera already included in ViveRig) 

3) Build and Run 

VIU Note: Since these prefabs also support other platforms, you get two-controller support out of the box (in addition to falling back to a single controller). The ViveRig adds controller support for teleporting, grabbing, and toggling controller models, and can easily be modified in the Inspector when ViveRig->ViveControllers is selected in the scene.
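The payoff of the VIU approach is that input code is platform-independent. As a minimal sketch (using VIU's ViveInput API from the HTC.UnityPlugin.Vive namespace), the same script works on Wave devices, SteamVR, Oculus, and the simulator:

```csharp
using UnityEngine;
using HTC.UnityPlugin.Vive; // VIU namespace

// Minimal sketch: cross-platform controller input via VIU.
// Attach to any GameObject in a scene containing a ViveRig/ViveCameraRig.
public class ViuInputSample : MonoBehaviour
{
    void Update()
    {
        // Trigger press-down on the right hand, regardless of platform.
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
        {
            Debug.Log("Right trigger pressed");
        }

        // Grip held on the left hand.
        if (ViveInput.GetPress(HandRole.LeftHand, ControllerButton.Grip))
        {
            Debug.Log("Left grip held");
        }
    }
}
```

HandRole abstracts over left/right controllers, so handling the second controller needs no device-specific code.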

Note: VIU will step you through obtaining the latest Wave 4.x packages via the Package Manager when you select Wave as the target in VIU Settings (in Preferences). These settings wrap selecting the target in the XR Management settings.


Support is provided at the official developer forum for the Wave SDK: 

and VIU:


Recommended Comments

Hello Dario,


Does your plugin allow to change the 3D Models to the 6DoF Models coming with the Focus Plus / from the Developer Kit? I am having issues doing this right now. I just want to change the models but keep all the functionality. 

Thank you


@VRVision The short answer is yes. One of the benefits of using the VIU Unity plugin is that you can select controller models yourself or let the target platform select them; VIU includes models for most controllers on the supported platforms, and you can add a custom material or shader to modify the default white models. You can also select the models the simulator uses, so you can test.

At runtime you have several choices: if you are targeting SteamVR (OpenVR) PC-based controllers, you can let the Steam runtime display the natively supported models with animation; simply include the SteamVR Unity plugin.

To add your own model (or a modified copy of a supplied default model), you can simply child the model to the corresponding controller, or at runtime you can enable your model as shown when you use the ViveRig prefab (press the grip buttons to display a custom model). See the ControllerManagerSample C# script for how that's done (and the hierarchy of the controller objects).
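A minimal sketch of the runtime approach described above, assuming you have childed your own model under the controller object in the hierarchy (the field name customModel is just an illustration, not part of VIU):

```csharp
using UnityEngine;
using HTC.UnityPlugin.Vive; // VIU

// Sketch: toggle a custom controller model on grip press, similar in
// spirit to what the ControllerManagerSample script demonstrates.
public class CustomModelToggle : MonoBehaviour
{
    // Assign in the Inspector: your model, childed under the controller.
    public GameObject customModel;

    void Update()
    {
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Grip))
        {
            customModel.SetActive(!customModel.activeSelf);
        }
    }
}
```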

Again, check out the simulator, which lets you test controller interactions; it allows fine-tuning your controller interactions without the longer cycle of testing in a headset.

If you want to show natively available controller models at runtime (not necessarily using VIU) see:



Porting from Quest (or other devices) to the Focus Plus with Unity: (Intro)

First, please strongly consider using VIU even if you are using another toolkit*

- replace your existing camera and controller rigs with the corresponding ViveRig in all of your scenes.

This should get you most of the way there, unless you did not abstract your code enough from the other toolkits.

* If you used other toolkits, e.g. VRTK, MRTK, the Oculus SDK, the SteamVR Interaction toolkit, or even the new early preview of XRI from Unity, you should be able to replace the main HMD and controller "rigs" with the corresponding VIU prefabs and still port back to many platforms using VIU's platform porting feature (which includes porting back to Quest with the same code base).

If you didn't abstract your code enough from the toolkits, or if you are using components from those toolkits (libraries) that are strongly tied to the toolkit's input layer, then you may have to modify some of those components so they can accept poses from VIU. You can also modify VIU components to pass the pose information on to the components from other toolkits.

  • Interactables (for grabbing) are replaced with the equivalent Grabbables in VIU.
  • For teleporting, add a Teleportable to the floor or to objects you will teleport to (make sure to assign the ViveRig origin and the camera from the ViveRig in the Inspector).
  • Please see the ControllerManagerSample.cs script in the ViveRig prefab for reference on how to manage controller input, hide controllers on grab, etc.
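For the pose-forwarding case mentioned above, VIU exposes tracked poses through its VivePose API. The sketch below reads the right controller's pose so it could be handed to a component from another toolkit; the commented ApplyPose call is a hypothetical stand-in for whatever API that component actually exposes:

```csharp
using UnityEngine;
using HTC.UnityPlugin.Vive; // VIU

// Sketch: read tracked poses from VIU so they can be forwarded to a
// component from another toolkit that expects a position/rotation.
public class PoseBridge : MonoBehaviour
{
    void Update()
    {
        if (VivePose.IsValid(HandRole.RightHand))
        {
            RigidPose pose = VivePose.GetPose(HandRole.RightHand);

            // Hypothetical: otherToolkitController.ApplyPose(pose.pos, pose.rot);
            transform.localPosition = pose.pos;
            transform.localRotation = pose.rot;
        }
    }
}
```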

Note: the complexity of the port depends on how well you abstracted your code from third-party toolkits, as well as on how much you relied on third-party libraries or components strongly linked to those toolkits.

If you have any questions please post on the VIU forum here:



Just to help clarify some open questions re: Wave SDK and VIU:

VIU wraps the calls to the Wave SDK APIs when targeting Wave; likewise for other SDKs like SteamVR or OVR, it wraps those as well, and additionally it falls back to using the Unity XR APIs, which still work.

If you choose to work with Unity's XR Interaction Toolkit, there are no extra Wave SDK APIs you have to code to; just include the interaction support to make sure controller input works, and follow Unity's tutorials. Also see the detailed tutorials on this site for getting started with XRI:
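As a sketch of that approach: once the Wave XR plugin is active via XR Management, plain Unity XR input APIs (UnityEngine.XR) work with no Wave-specific code at all:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: device-agnostic input through Unity's XR APIs; works on Wave
// once the Wave XR plugin is enabled in XR Management.
public class UnityXrInputSample : MonoBehaviour
{
    void Update()
    {
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        if (right.isValid &&
            right.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) &&
            pressed)
        {
            Debug.Log("Right trigger pressed");
        }
    }
}
```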


Also, Essence isn't another SDK/toolkit. There is only one Wave SDK, broken into three packages: the core, the native (low-level Android APIs), and Essence, which includes everything else from previous versions, including tools and integrations for other toolkits (like the XR Interaction integration support) and additional samples.





On 3/22/2019 at 10:44 AM, VRVision said:

Hello Dario,


Does your plugin allow to change the 3D Models to the 6DoF Models coming with the Focus Plus / from the Developer Kit? I am having issues doing this right now. I just want to change the models but keep all the functionality. 

Thank you

Since it's a common question (now with VIVE Flow) - please see this post:

