Showing results for tags 'wave'.

Found 16 results

  1. Hello. Following this post: https://forum.vive.com/topic/12833-user-orientation-recentering-in-unity/ I integrated it into my project, but it doesn't work with hand tracking: the hands don't move, although the controllers do. Is there any workaround for this? @C.T. @chengnay @BabelSW I tried resetting the position of the hands, without success. Thanks in advance for any help.
  2. Hi. I have different applications that launch each other through code: a launcher starts an app, the app then relaunches the launcher, and the launcher starts another app. I am experiencing random crashes and I think it is because I don't quit/launch the apps correctly. Is there a way to do this properly through Wave code? I am using Unity. What I want is to do it properly: I am in the launcher, I want to quit the launcher, then launch the app; quit the app, then launch the launcher again, and so on. I need to completely quit my game: Unity's Application.Quit() doesn't kill the application, so I had to use the code below. Thanks in advance. Here is my code to launch a game:

     public static void NextGame(string gameName)
     {
         AndroidJavaClass up = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
         AndroidJavaObject ca = up.GetStatic<AndroidJavaObject>("currentActivity");
         AndroidJavaObject packageManager = ca.Call<AndroidJavaObject>("getPackageManager");
         AndroidJavaObject launchIntent = packageManager.Call<AndroidJavaObject>("getLaunchIntentForPackage", gameName);
         ca.Call("finishAndRemoveTask");
         ca.Call("startActivity", launchIntent);
         up.Dispose();
         ca.Dispose();
         packageManager.Dispose();
         launchIntent.Dispose();
     }

     And to stop it:

     public static void StopGame()
     {
         AndroidJavaClass up = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
         AndroidJavaObject ca = up.GetStatic<AndroidJavaObject>("currentActivity");
         ca.Call("finishAndRemoveTask");
         up.Dispose();
         ca.Dispose();
     }
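One likely cause of the random crashes above: getLaunchIntentForPackage returns null when the target package is not installed (or the name is misspelled), and passing that null intent to startActivity throws an exception that tears the app down. Here is a defensive sketch of the same launcher code — the using blocks replace the manual Dispose calls, and starting the next activity before finishing the current task is a suggestion, not a confirmed fix:

```csharp
using UnityEngine;

public static class AppLauncher
{
    public static void NextGame(string gameName)
    {
        using (var up = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var ca = up.GetStatic<AndroidJavaObject>("currentActivity"))
        using (var packageManager = ca.Call<AndroidJavaObject>("getPackageManager"))
        using (var launchIntent = packageManager.Call<AndroidJavaObject>(
                   "getLaunchIntentForPackage", gameName))
        {
            if (launchIntent == null)
            {
                // getLaunchIntentForPackage returns null when the package is not
                // installed; calling startActivity with null would throw.
                Debug.LogError("No launch intent for package: " + gameName);
                return;
            }
            // Start the next app while the current activity is still alive,
            // then remove this task.
            ca.Call("startActivity", launchIntent);
            ca.Call("finishAndRemoveTask");
        }
    }
}
```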
  3. Hello. I am using Wrist Trackers to track external elements in my app (the trackers are not placed on my wrist: they are placed on a tracked extinguisher). I am also using hand tracking. It was working fine until recently. However, since the last Wrist Trackers update, whenever I am using them, they also change the position of the hands (the position returned by the hand tracking system) and place them next to the tracker. Please note that I see the same behaviour in Unity using the Wave SDK and in the Focus 3 main menu (the virtual hands are placed next to their corresponding trackers). In the previous version, there was a popup allowing us to choose whether the Wrist Trackers were used to enhance hand tracking or as external trackers (the second option being what we were using). However, this popup is no longer displayed when I pair the trackers, and I couldn't find an option to disable this feature anywhere. Is there any way to go back to the old behaviour? I have a headset with firmware version 3.3.999.446 and the Wrist Trackers are on . Thank you in advance for your help. @C.T.
  4. Getting Started with VIVE Wave™ for Unity Developers (updated!)

     First download the Wave SDK. Legacy: if you are not yet on the XR Management system (Unity 2019.4 and later), e.g. if still on Unity 2018: https://developer.vive.com/resources/vive-wave/sdk/320/

     Note: Porting to the VIVE Wave platform. General porting guides from Daydream (mobile) or from the VIVE to Wave are available here: https://hub.vive.com/storage/app/doc/en-us/PortingGuide.html Please note the quick guides below focus on Unity scenes and code when porting across platforms and devices, but also keep graphics considerations in mind (depending on CPU/GPU/RAM etc.). If your existing application used the SteamVR APIs directly, note that most of the Wave APIs have a one-to-one correspondence to SteamVR, so if you have a 3rd-party framework that wraps SteamVR you should also be able to support the Wave SDK by mapping the APIs as shown in the VIVE porting guide. When porting from other devices, also consider toolkits like Unity XR Interaction (in preview) for Unity XR plugins like Wave 3.2+, or VIU (Vive Input Utility), which supports both Unity XR Management and legacy.

     A Quick Start Guide for developing in Unity: The following are the steps for setting up a scene with the Wave SDK (legacy); also see the alternative below it, using the VIU toolkit along with the Wave SDK, for cross-platform support with either legacy or the new Unity XR Management support.

     1) For legacy Unity (pre Wave 3.2): Launch Unity, create a New Project and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide to set up Android: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html) Note: for the Wave XR plugin (Unity 2019.4), use the Package Manager; you can also avoid Android Studio and use the built-in Android support.

     2) For legacy support: Import wavevr.unitypackage (Assets->Import Package->Custom Package...)

     3) For legacy support: From the Project Assets window, drag and drop the WaveVR and ControllerLoader prefabs into your Scene Hierarchy window to create objects in your scene (delete or disable the existing Camera object; there's one already included in WaveVR). For Wave XR plugin support, you can use Unity XR APIs as when using any other XR plugin.

     4) For legacy: duplicate the ControllerLoader in your scene (or drag another one in) and select its Type as the left controller in the Inspector window as shown above. At this point it's up to you how to handle a second controller's inputs (or simply handle it the same as the first). For the Wave XR plugin, see the samples included with the packages.

     5) From File->Build Settings..., select Build and Run (make sure the device is attached via USB and Developer settings on the device allow USB debugging). VIU (more below) can use a simulator when developing for all platforms.

     Note: if at any time you get prompted by a WaveVR plugin popup window to accept preferred settings, simply accept unless you have a good reason not to. You can safely dismiss the AndroidManifest-related popup for now until you are ready to publish on Viveport (this is for indicating 3DOF vs 6DOF or both). At this point you should be able to see your empty scene with your controller on your Wave device!

     Alternative Quick Start Using the VIU (Vive Input Utility) Unity plugin: There is also an additional Unity plugin for developing VR projects that can target multiple platforms and devices; it is highly recommended, especially for new projects or projects that don't require extensive use of the Wave SDK APIs (although you can access both APIs at once). The Vive Input Utility is a Unity plugin that can support Vive, Vive Pro, Rift, Daydream, Go, Quest and the Wave SDK (e.g. Focus and Focus Plus) in addition to Unity's XR APIs, which in turn can support Windows MR and more. It is an abstraction that wraps other SDKs like the Wave SDK, creating a common code base for many platforms. It's available on the Unity Asset Store (search for VIU) or at https://github.com/ViveSoftware

     Steps to create the same application but using the VIU plugin:

     1) Launch Unity, create a New Project and import VIU (Vive Input Utility) from the Unity Asset Store or Package Manager (or GitHub)

     2) Drag and drop the ViveCameraRig (or the ViveRig for additional features) into your scene and remove the existing Camera object (there is a camera already included in ViveRig)

     3) Build and Run

     VIU Note: Since these prefabs also support other platforms, you already get two-controller support (in addition to falling back to a single controller). The ViveRig adds controller support for teleporting, grabbing and toggling controller models, and can be easily modified in the Inspector when ViveRig->ViveControllers is selected in the scene. Note: VIU will step through obtaining the latest Wave 4.x packages via the Package Manager when you select Wave as the target in VIU Settings (in Preferences). These settings wrap selecting the target in the XR Management settings.

     Support is provided at the official developer forum for the Wave SDK: http://community.viveport.com/t5/Vive-Wave-SDK/bd-p/vive-wave-sdk and VIU: https://forum.vive.com/forum/72-vive-input-utility/
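For the Wave XR plugin path mentioned in step 3 above, controller input goes through Unity's standard XR input APIs rather than Wave-specific classes. A minimal sketch of polling the right-hand trigger with UnityEngine.XR (this is a generic Unity XR example, not Wave-specific code; device availability depends on the runtime):

```csharp
using UnityEngine;
using UnityEngine.XR;

public class TriggerPoll : MonoBehaviour
{
    void Update()
    {
        // Ask the active XR runtime (Wave, in this case) for the right-hand device.
        InputDevice rightHand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (rightHand.isValid &&
            rightHand.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) &&
            pressed)
        {
            Debug.Log("Right trigger pressed");
        }
    }
}
```

Because this goes through Unity's abstraction, the same script works unchanged on other Unity XR plugins, which is the point of preferring the XR Management path for new projects.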
  5. The Wave SDK for Unreal Engine is now also available on GitHub: https://github.com/ViveSoftware/VIVE-Wave-SDK-Unreal

     INTRODUCTION

     So far, the Wave SDK for developers using Unreal Engine has been available only on the Vive Developer Website. We decided to release the WaveVR plugin for Unreal Engine as a public GitHub repository. This will allow developers to report bugs or suggest enhancements using GitHub Issues, allowing us to get feedback from the developer community. Developers can also create pull requests to suggest bug fixes. The Vive team will review the pull requests and follow up with the developers that created them, but the actual merge will temporarily take place internally and not directly on GitHub. The repository will then be updated to include those bug fixes.

     STRUCTURE

     The GitHub repository contains a different branch for each version of Unreal Engine, so developers don't need to pick a specific Wave version, only the version of Unreal Engine that they are using. This should make the process of integrating the WaveVR plugin into Unreal Engine more intuitive. The repository comes with a full Unreal Engine sample project (plugin.uproject), and the WaveVR plugin is already pre-installed in the Plugins folder. This means that if you want to use the WaveVR plugin in your own project you can simply copy the WaveVR folder from the Plugins folder to your own project's Plugins folder. Only official version releases are pushed to GitHub. Developers will still be able to access older Wave versions using tags & releases.
  6. Hi there, we have developed apps for Oculus Quest with Unity and now want to port them to the Vive Focus Plus. We want to find a universal solution that will work on Focus, Quest and maybe Pico. We are a bit confused by the HTC SDKs. Are we right that we should use VIU as the universal SDK solution, plus Wave for Focus and the Oculus tools for Oculus as the respective low-level SDKs? We also can't find any API doc with class descriptions etc. Does it exist? Any task becomes a problem without docs. For example, how can I discover HMD availability? In the Oculus SDK we have the OVRManager class with the OVRManager.isHMDPresent flag. I can't find the equivalent in VIU because I can't understand the SDK's principles without proper docs 😞 Help please :-)
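One toolkit-agnostic option for the HMD-availability question above — a sketch using Unity's built-in XR input API, which works regardless of whether VIU or Wave is the active SDK (this is not the official VIU answer, just a portable fallback):

```csharp
using UnityEngine;
using UnityEngine.XR;

public static class HmdStatus
{
    // True when the XR runtime reports a valid head-mounted device; roughly
    // analogous to OVRManager.isHmdPresent on the Oculus side.
    public static bool IsHmdPresent()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        return head.isValid;
    }
}
```

Because the check goes through Unity's own XR layer rather than a vendor SDK, the same code ports across Focus, Quest and other headsets with Unity XR plugins.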
  7. We realise how important it is for the development community to have easy-to-use tools that improve content performance. We recently released the new Wave SDK 3.1.94 update [Early Access] with several Experimental Features for developers who create content for standalone VR headsets in the Wave ecosystem, such as the Vive Focus Plus and Vive Focus. In this update, we introduced changes to the Adaptive Quality feature, which can automatically adjust the rendering quality of your VR application according to the system workload in order to achieve better performance and improve battery life by up to 15%. This blog post will explain more about Adaptive Quality, why it is important and how to apply it to your own projects. We will provide an overview of the solution and its design/implementation and share a few tips on how developers can get started using it. We also describe how it works in synergy with other features of the Wave SDK, such as Dynamic Fixed Foveated Rendering and Dynamic Resolution, for better results. * Please note that Wave SDK 3.1.94 is an Early Access version that includes new features for developers to experiment with and provide feedback or suggest changes. These features are available only with a specific developer ROM update (and Adaptive Quality requires ROM v3.07.623.336 for Focus Plus), but content developed with this Developer ROM can't be published to Viveport until we release a public ROM update (coming soon!). Please refer to this article for more information on how to get access to the Developer ROM and test Wave SDK 3.1.94.

     Introduction

     Standalone VR headsets may have all the necessary components required to provide VR experiences, but unlike PC-based VR headsets, utilising the full power of their hardware requires an intricate balance for VR apps to run smoothly and with consistent performance. Heat generated from a headset working extra hard to render VR content can result in throttling: the hardware detects the high temperatures and, when a predefined limit is crossed, lowers the clock speed of the CPU/GPU to prevent the system from overheating. When temperature levels return to normal, the system increases the CPU/GPU clock speed and performance bounces back. Unfortunately, this process may repeat, leading to poor battery life and inconsistent performance, as the system is not able to quickly get rid of the generated heat. Although developers can mitigate this issue by trying to make sure games perform at their best at all times, this is not always possible.

     Adaptive Quality

     Adaptive Quality provides a way for developers to balance the performance of their VR applications and power consumption in real time, offering automatic adjustment of CPU/GPU performance according to the workload of the system. Furthermore, it allows defining a set of strategies for how the application should respond to system changes to improve FPS when the rendering performance is insufficient. Adaptive Quality can be combined with the Dynamic Resolution feature to adjust the image quality of the application according to system changes, and with Fixed Foveated Rendering to dynamically change the quality of the peripheral region when improving performance is essential. This results in better battery consumption management and smooth frame rates, as developers have more control and can create and customise their own policies to dynamically handle hardware changes. Especially for GPU fragment-bound apps, this leads to less throttling and a better experience for end users. Adaptive Quality v1.0 was first introduced in Wave SDK 3.1.1, supporting automatic CPU/GPU adjustment and system events according to workload. Starting from Wave SDK 3.1.94 we introduced new features and changes (v2.0), adding Dynamic Resolution and Fixed Foveated Rendering to the mix.
     Auto CPU/GPU Adjustment

     Standalone VR headsets are powered by a battery, so making sure that power is not drained too quickly is important. With Adaptive Quality, we've made the management of CPU and GPU clock rates much simpler by making it almost entirely automatic. If Adaptive Quality is enabled, the system can dynamically change the CPU and GPU performance level to maintain performance based on the system load. When the performance of your application is insufficient, CPU/GPU clock speeds will increase to improve the FPS. Likewise, if the application already runs at high FPS and the complexity of the scene is low, Adaptive Quality can scale down the clock rates to save battery power and prolong the headset's usage. Although we don't provide direct access to the maximum/minimum allowed clock speeds, Adaptive Quality can configure those properties based on its knowledge of the current system load and decide whether the levels should be lowered or raised. Of course, when Adaptive Quality is disabled, developers can still manually increase/decrease the CPU/GPU performance based on practical demands according to this API.

     System Events & Custom Policy

     Although we can change the clock rates to reduce power consumption or improve performance, that may not always be enough to achieve better results. Sometimes, more things need to change within the VR application software itself to get stable performance. Adaptive Quality can be configured to broadcast events whenever there are changes to the system workload. This mechanism is really useful, since developers can create and customise their own policy to reduce GPU load and ensure constant frame rates over a longer period of time. Depending on the situation, the VR application can react differently to those events, and developers may create their own policies and choose whether and how to handle those changes and adjust scene complexity themselves.
     Developers can subscribe to receive performance events that recommend lowering or raising the rendering quality of the application and actively modify the rendering settings accordingly, for example by enabling or disabling MSAA or other rendering settings that can help boost performance.

     Dynamic Resolution

     Dynamic Resolution is a feature that works together with Adaptive Quality and helps adjust the image quality of the application by changing the eye buffer size according to the broadcast events mentioned in the previous section. More specifically, the Resolution Scale will be increased automatically when the received event denotes that the quality can be higher; similarly, when the received event type recommends lowering the quality of the running application, the resolution scale will be decreased. To ensure that the Resolution Scale will not decrease to a point where the application is unusable, Dynamic Resolution comes with built-in functionality that helps determine the lower bound of Resolution Scale for different VR devices to maintain text readability. What's great about this feature is that there is no extra latency introduced with the change in resolution scale.

     Dynamic Fixed Foveated Rendering

     Foveated Rendering is a technique that exploits the anatomy of the human eye: applications can drop the quality of graphics in the peripheral vision by lowering the resolution in that region while focusing the available processing power on a smaller area of the image (the foveated region). The term "Foveated" derives from the word "Fovea", the part of the retina that gives us sharp central vision (foveal vision) and lets us focus on the visual details of what is important in a narrow region right in front of us. Anything outside the fovea region belongs to the peripheral vision, which, despite also being useful, can only detect fast-moving content changes and colour, which is why it feels comparatively less detailed and blurry. It's worth noting that there are two types of Foveated Rendering, and the terms are sometimes confusing: Fixed Foveated Rendering assumes that the foveated region should always be at the centre of the user's field of view and that lower-resolution pixels should be rendered in the distortion region around the lens, as things are already not clearly visible there. Eye-Tracked or Dynamic Foveated Rendering can be used with headsets that support eye-tracking modules (see the Vive Pro Eye) to accurately define the foveated region based on gaze direction. It's called dynamic because as the human eye moves, the foveated and peripheral regions keep changing dynamically. Adaptive Quality can be combined with Fixed Foveated Rendering to help increase the performance of VR applications according to system workload (dynamically): Fixed Foveated Rendering can be automatically enabled by Adaptive Quality to further reduce the GPU load and improve performance whenever required.

     Results

     The benefits of Adaptive Quality become clearer through some examples. The graph below illustrates how Adaptive Quality helps deliver smooth and high frame rates with a native fragment-bound application. In green, you can see the frame rate fluctuating when Adaptive Quality is disabled, with FPS going down as GPU workload increases. In pink, you can see the stable results after enabling Adaptive Quality, running at 75 FPS. The x-axis indicates the time passed since the start of the application and the y-axis shows the FPS on the left and the GPU Level on the right. The yellow letters show the fragment loading values that increase every 5 seconds.
     Notice how the GPU clock rates go down when the FPS are high, to save power (after 11 seconds and 61 seconds). Another graph below shows the results of a test using a Unity application that is GPU fragment-bound. When Adaptive Quality and Dynamic Resolution are both enabled there is an increase of 13 FPS on average. In green you can see the case where Adaptive Quality is disabled, while in yellow you can see how Dynamic Resolution decreases the resolution scale to improve FPS and, when FPS are high enough, increases the resolution scale back up again. We also tested Adaptive Quality with other applications such as Viveport's Sureshot Game and SPGM. As you can see from the results, energy consumption improved by 10% and 15% respectively, indicating how useful this feature can be for extending the battery life of the VR headset.

     How to use Adaptive Quality

     WaveVR SDK 3.1.0 to 3.1.6: Adaptive Quality is not enabled by default. The Wave SDK provides an API function called WVR_EnableAdaptiveQuality that needs to be called manually so that the CPU/GPU performance levels can be adjusted automatically. The system events WVR_EventType_RecommendedQuality_Lower or WVR_EventType_RecommendedQuality_Higher will be broadcast based on system workload and can be monitored to initiate actions accordingly using the Wave System Event. Check System Event to learn how to listen for these system events.

     WaveVR SDK 3.1.94 or later: Adaptive Quality is enabled by default and system events are broadcast to change rendering quality when needed. One or multiple strategies can be used to adjust display quality and improve the FPS when performance is insufficient (WVR_QualityStrategy_Default, WVR_QualityStrategy_SendQualityEvent, WVR_QualityStrategy_AutoFoveation).

     Unity

     The Wave SDK package for Unity supports all the features of Adaptive Quality, including the new features introduced with Adaptive Quality 2.0 (Dynamic Resolution and Dynamic Fixed Foveated Rendering), and it's really easy to use. The WaveVR_AdaptiveQuality script can be used to enable the Adaptive Quality feature. Starting from WaveVR 3.1.94, this component comes pre-attached to the WaveVRAdaptiveQuality GameObject of the WaveVR prefab, so WaveVR_AdaptiveQuality is enabled by default. As you can see in the screenshot below, the WaveVRAdaptiveQuality game object is part of the WaveVR game object that is required to build VR applications for the Wave ecosystem. There are two scripts attached to it: WaveVR_AdaptiveQuality: when this script is enabled, automatic CPU/GPU clock adjustment takes place, and developers can tick the boxes under the Rendering Performance Improve Strategy section to define which strategies should be used (e.g. if you want system events to be broadcast and Fixed Foveated Rendering to be enabled, tick both boxes). WaveVR_Dynamic_Resolution: this script is responsible for the Dynamic Resolution feature that adjusts the resolution scale of the VR application according to workload. A list of Resolution Scale values can be defined that will be used to adjust the resolution scale whenever events are triggered by Adaptive Quality. The Text Size slider can be used to define the smallest size of text used in the application, to avoid Dynamic Resolution making text unreadable.

     Unreal Engine

     Adaptive Quality can also be used with the Wave plugin in Unreal Engine.
     Two functions are provided to enable the Adaptive Quality feature and to query whether Adaptive Quality is enabled or not. Although Auto CPU/GPU Adjustment is automatically supported when Adaptive Quality is enabled, Dynamic Resolution and Dynamic Fixed Foveated Rendering can't be utilised with this plugin at the moment. There will be more updates on this soon. Wave 3.2 is expected to be released soon, adding support for system events in Unreal Engine.

     Summary Table

     Best Practices & Tips

     Now that you know more about Adaptive Quality, here are some tips and advice: Rendering improvements by WaveVR Adaptive Quality have limits; it is still highly recommended to optimise your app as much as possible first (check our Mobile VR performance optimisation tips). Enabling WaveVR Adaptive Quality can help lightweight VR apps, such as photo or video playback apps, be used longer. Enabling WaveVR Adaptive Quality with WVR_QualityStrategy_SendQualityEvent and WVR_QualityStrategy_AutoFoveation can improve the rendering quality by up to 15% if the rendering bottleneck is GPU fragment-processing bound. Always build optimised versions of the application for distribution: even if a debug build performs well, it will draw more power and heat up the device more than a release build. Power management is a crucial consideration for Android VR development. The Wave SDK force-disables Adaptive Quality during map loading to increase performance and restores the Adaptive Quality status after the map is loaded.

     Useful Links

     To implement Adaptive Quality in your own application, check the documentation pages below: Wave Native SDK, Unity Integration, Unreal Engine Integration. Also see: System Event & how to listen to events, Foveated Rendering, Dynamic Resolution. We gave a talk about Adaptive Quality and the new features introduced in the latest Wave SDK during the Virtual Vive Ecosystem Conference back in March 2020. You can watch the presentation below. What do you think?
Feel free to try this feature and provide feedback from your tests in our forums. You can also find the complete list of new features for each Wave SDK update in our release notes.
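     The custom-policy idea from the System Events & Custom Policy section above can be sketched in Unity C#. The event type names come from the post; how the handler is wired up varies by Wave SDK version, so the registration is left out here — consult the Wave System Event documentation for the exact listener API:

```csharp
using UnityEngine;

// Sketch of a quality policy reacting to Adaptive Quality recommendations.
// The handler below would be registered with Wave's system-event listener
// (registration mechanism omitted; it differs between SDK versions).
public static class QualityPolicy
{
    public static void OnQualityEvent(string eventType)
    {
        if (eventType == "WVR_EventType_RecommendedQuality_Lower")
        {
            // System is under load: drop MSAA to relieve the GPU.
            QualitySettings.antiAliasing = 0;
        }
        else if (eventType == "WVR_EventType_RecommendedQuality_Higher")
        {
            // Headroom available: restore 4x MSAA.
            QualitySettings.antiAliasing = 4;
        }
    }
}
```

     Toggling MSAA is just one example of the "change scene complexity yourself" idea; the same hook could swap shader LODs, disable post-processing, or reduce draw distance.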
  8. Hi all, We recently released an Early Access version of Wave SDK 3.1.94. This version comes with several new Experimental Features for content developers, and one of them is Direct Preview for Unreal Engine. While creating content for the Vive Focus / Focus Plus, developers need to test and tweak their project to make sure that everything works properly. However, this process is often time-consuming as developers need to repeatedly build, deploy and install APKs, spending time waiting during the development stage. That's why we introduced the Direct Preview feature, which enables you to skip the building process and directly preview your content on your AIO HMD via Wi-Fi. You can rapidly preview and iterate on your scene using Unreal's VR Preview Play mode while Direct Preview streams the rendered frames to your Vive AIO device. Head pose, controller input, gestures and similar input are sent from the device to the computer. You are able to effectively preview the app without needing to go through a time-consuming build-and-deploy process. Here is a video showing the steps: DirectPReview_UE4 Video.mp4

     INSTRUCTIONS

     You can find detailed instructions HERE about Direct Preview. Here's what you need to do: Integrate the WaveVR plugin of the Wave SDK into your Unreal Engine project according to the instructions here. Connect the HMD to your PC/laptop with a USB cable and turn on the HMD. Make sure that the proximity sensor of the HMD is always covered to keep the HMD awake (otherwise it will go to sleep). Find the IP of your HMD using the adb command "adb shell ifconfig". Copy and paste the IP into the Wave VR Project Settings. Also make sure that the Connect Type is set to Wi-Fi. Install the Direct Preview APK on your HMD using the Wave VR menu option "Install Device APK". This will automatically install the wvr_plugins_directpreview_agent_unreal.apk that lives in the {YourProjectName}\Plugins\WaveVR\Prebuilt\DirectPreview\deviceAPK folder. Verify that the apk was installed successfully by checking the Library of installed apps on your HMD. Start the DPServer using the WaveVR tab menu option "Start DPServer". You will notice a console window opening that shows logs of the server being set up. Now that your server is up and running, we need to connect the HMD to it. To do that we need to start the apk we installed earlier: go ahead and run the "Start Device APK" command from the Unreal menu tab. This will automatically start the apk on your HMD (since the HMD is connected to your PC). Once you start the apk on your HMD you will notice a screen showing the message "Connecting...". You should also see the dpServer console window updating with logs, showing that a client is connected to the server. You can now disconnect the headset from the PC, as the streaming will take place via Wi-Fi. Now start "Play in VR Preview" from the Unreal Engine menu. This will start a preview of your VR application, and if you enabled "Enable Preview Image" in your WaveVR Project settings you should see the same result rendered/streamed on your HMD wirelessly. That's it! If you now try to move the headset, the VR Preview window will update accordingly. You can always stop the VR Preview, modify your scene layout or work on your project, and then start the VR Preview again to see the updated results quickly on your HMD. This way you don't have to deploy an apk and wait for things to be compiled just to preview a simple change, saving you precious development time.

     ISSUES/NOTES

     Make sure that your PC/laptop and your HMD are both connected to the same network domain. If what you see on your HMD during Direct Preview looks blurry, you may need to adjust the image size sent to the HMD by changing the Unreal Engine window size in Editor Preferences > Level Editor - Play in Unreal Editor. So for example, if your Unreal Editor VR Preview window is too small, the image sent to the HMD will need to be upscaled, and this will cause pixelation. At the moment, your Unreal Engine project folder must be on your Windows C drive (and not an external hard drive), otherwise Direct Preview may not work at all (we are working on a fix for this issue). If you notice that only the left or right eye view is rendered on the HMD during Direct Preview mode, or neither of them is rendered at all, that's because Direct Preview needs high bandwidth for streaming, otherwise it is possible to lose frames. However, we provide an update frequency option in the WaveVR Project Settings so developers can adjust the FPS according to their bandwidth and reduce the FPS accordingly. Also, restarting both the apk and the dpServer.exe application can help with the loss of rendering in the HMD during Direct Preview. Keep in mind that the FPS option here has to do with the number of frames sent from the dpServer (PC) to the APK (HMD). If you can't see the dpServer.exe window after trying to start it from the WaveVR menu option button, you can always start it manually by running {YourProjectName}\WaveVR\Prebuilt\DirectPreview\dpServer.exe. Similarly, if the Direct Preview apk is not installed automatically after clicking on the Install Device APK option, you can always install it manually using the adb install command. The apk lives inside the plugin: {YourProjectName}\Plugins\WaveVR\Prebuilt\DirectPreview\deviceAPK\wvr_plugins_directpreview_agent_unreal.apk. If you start the DPServer and the window opens and closes immediately, there is probably an error: try to run the .exe manually with cmd, and if the log is complaining about NVIDIA drivers, update to the latest NVIDIA drivers (and restart your PC if you already have the latest drivers). Remember that you only need to start the dpServer and the Direct Preview apk once and they will keep running in the background.
Of course feel free to restart it if you notice that something went wrong. Please give it a go and let us know about your thoughts in the comments.
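For the manual-install fallback mentioned in the notes above, a typical adb session is sketched below. The path is the plugin default quoted in this post ({YourProjectName} is the post's own placeholder, left as-is), and the `grep` filter is only a guess at a substring of the agent's package name, so adjust it if nothing matches:

```shell
# Install the Direct Preview agent APK manually (HMD connected over USB).
# -r reinstalls over an existing copy.
adb install -r "{YourProjectName}/Plugins/WaveVR/Prebuilt/DirectPreview/deviceAPK/wvr_plugins_directpreview_agent_unreal.apk"

# Confirm something landed on the device (package-name substring is a guess).
adb shell pm list packages | grep -i directpreview

# Check the headset's Wi-Fi address to verify it is on the same network as your PC.
adb shell ip addr show wlan0
```

All three commands are standard adb usage; only the APK path and the grep filter are specific to this setup.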
  9. Back in release 2.1.8 we were told to make sure to handle BOTH the new Alias1_Trigger and Alias1_Digital_Trigger in our code. In Wave SDK 3.1.6 the new WaveVR_ButtonList was introduced, but for some reason the DigitalTrigger is commented out. Our app needs to run on BOTH the Focus and the Focus Plus. Can someone help us figure out which buttons map to which enum values as of 3.1.6? See Assets\WaveVR\Scripts\WaveVR_ButtonList from the latest unitypackage for Wave SDK 3.1.16: @Cotta @Tony PH Lin
  10. @VibrantNebula @JustinVive @Dario and team. It would be really great if you could publish a Best Practices Guide on Player/Build/Preferences settings in Unity. It would be VERY helpful to understand how Wave SDK behavior might be impacted by the version of Unity you are running. For instance, a better understanding of how Unity 2019.x player settings affect the performance of our Wave SDK 3.1.6 based app on a Vive Focus/Plus would be EXTREMELY helpful. Thoughts? Here's ours if it's helpful: we are running Unity 2019.3.6f1 + Wave SDK 3.1.6, we develop for BOTH the Vive Focus and the Vive Focus Plus, and we use the latest build of Visual Studio 2019 Enterprise as our editor. @Tony PH Lin @Cotta
  11. Hello One and All, If you're interested in creating mobile VR content for the Wave platform and want to win cash prizes, check out the Wave Developer Awards - https://developer.vive.com/resources/2019/12/17/viveport-launches-wave-developer-awards/. We are now accepting submissions through April 10, 2020 at 11:59pm (GMT+8). Check out the video for inspiration and see some of the great games and apps that already exist in the Wave ecosystem. Good luck - we can't wait to see the great apps that you create!
  12. In order to build for the Focus Plus you will need to integrate your apps with the Wave SDK. For more information about Focus and Wave development, check out our Vive Developer Resources Site; you can find both the Wave SDK and its documentation linked from that page. If you're a Unity developer, then Getting Started with Unity and Wave SDK will be very helpful. If you're looking for community help, be sure to visit the Wave SDK Forum to find answers to your questions or to ask new ones.

You can find best practices for HMD and controller tracking in Room Setup and Controller Tracking Tips, and the Vive Focus Plus User Guide will get you acquainted with the device. Make sure to update your HMD to the latest ROM (currently version 4.14.623.x). You can update the device by tapping the controller's Vive button and then going to Settings > System Update (note: you must be connected via Wi-Fi in order to receive system updates). If you need to debug your app, check out this post about ADB Logcat. Once you're ready to publish your app on Viveport, visit the Viveport Developer Console.

If you need inspiration, feel free to find some fun right away and get to know some of the Wave experiences. Visit the Viveport M store in the HMD and peruse the titles with the 6+6dof label. This label indicates that the experience is compatible with the 6dof headset and the 6dof controllers. Here are some great suggestions:

- Browse the virtual web with Firefox Reality.
- Download Sureshot: RPO Edition. In this Ready Player One app you can freely move and shoot in full room-scale glory!
- Check out Skyworld Kingdom Brawl, Vive Studios' tactical, Clash Royale-inspired card-battler made in collaboration with Vertigo Games.
- Or try Kingdom of Blades for some sword-swinging, shield-bashing action!

If you're looking to purchase a Focus Plus, they are available at this link: https://enterprise.vive.com/us/focus-plus/. For bulk orders, please reach out to your Vive contact for more information.

If you have any questions, please don't hesitate to reach out to us. We look forward to experiencing what you create. See you in VR, Justin and the Vive Team
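For the ADB Logcat debugging step mentioned above, a minimal session might look like the following sketch. The `-s Unity` filter assumes a Unity-built app, since Unity tags its Android log output with "Unity"; the output filename is just an example:

```shell
# Stream only Unity's log output from the connected headset.
adb logcat -s Unity

# Or dump the current log buffer once and save it for later inspection.
adb logcat -d > focus_log.txt
```

Both flags are standard adb logcat options: `-s` restricts output to the given tag, and `-d` dumps the buffer and exits instead of streaming.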
  13. Hi, I have a question regarding foveated rendering in Unity. Is it a feature that works on all HTC headsets, or only some? I'm trying to use it with the Vive Focus, but no matter what parameters I set, nothing changes in what's displayed, whether in the Unity Editor or inside the headset. Has anyone had any success with it?
  14. Hi there, I'm in a situation where I'm not able to click on UI buttons... 1. I created a look-at function for a UI canvas, which is attached to an empty GameObject (the parent of the canvas) whose rotation is 0, while the child (the canvas) has a rotation of 180. 2. When I teleport to a different position and try to click a UI button, I'm not able to click it; the controller ray passes through the UI object. @Cotta @Tony PH Lin
  15. We are having trouble determining whether the user has taken off the headset. It looks like the WVR_EventType_DeviceSuspend and WVR_EventType_DeviceResume events are not fired. Please advise what the correct way is for the app to know the headset has been taken off, since we need to take additional actions.