Dario

Administrators
  • Content Count: 242
  • Joined
  • Last visited
  • Days Won: 2

Dario last won the day on August 28.
Dario had the most liked content!

Community Reputation: 4 (Neutral)
1 Follower

About Dario
  • Rank: Constructor


  1. Hi @MacBread, the tutorials are dated; there's now support for both cameras not just with SRWorks but with the OpenVR API as well (though I haven't checked whether the SteamVR Unity plugin has been updated). I plan to get back to this via OpenVR for the Vive Pro, but I do recommend SRWorks as the easier route. One approach is to take the textures or the quads from one or both of the eyes and simply move them to the object you want to project on. Currently the best way to learn how the SRWorks framework works is to use the Inspector during runtime to see how it's laid out; then you'll know which parts to use or copy. I do plan to get to new tutorials. Please feel free to ask for help on getting started on the SRWorks SDK forum. Best, Dario
  2. Yes, the sample will be updated for both the latest SRWorks and the latest Unity shortly; sorry for the inconvenience. Support for older versions of SRWorks will probably be dropped.
  3. Hi @Greg Corson, were you able to figure out delaying pose data? I would collect the poses (maybe a circular queue) and simply play them back on a model that's not attached to the live poses.
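The delayed-playback idea above can be sketched as a small Unity script. This is a minimal sketch, not SDK code; the class and field names here are illustrative, and a plain queue stands in for the circular buffer:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Replays the tracked source's pose on this object after a fixed delay.
// Attach to a model that is NOT parented to the live tracked transform.
public class DelayedPoseFollower : MonoBehaviour
{
    public Transform trackedSource;   // e.g. the live camera or controller transform
    public int delayFrames = 30;      // delay, measured in rendered frames

    private readonly Queue<(Vector3 pos, Quaternion rot)> poses =
        new Queue<(Vector3, Quaternion)>();

    void LateUpdate()
    {
        // Record the live pose every frame.
        poses.Enqueue((trackedSource.position, trackedSource.rotation));

        // Once enough frames are buffered, play back the oldest pose.
        if (poses.Count > delayFrames)
        {
            var (pos, rot) = poses.Dequeue();
            transform.SetPositionAndRotation(pos, rot);
        }
    }
}
```

For a time-based rather than frame-based delay, you would timestamp each entry with Time.time and dequeue poses older than the desired interval.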
  4. Hi, Miracast is the supported streaming protocol, so no, Chromecast won't work currently. There are inexpensive Miracast dongles (tested and working), or you can cast to a PC and then cast from there to whichever display the PC can reach. Best, Dario
  5. When they updated the ROM to support Go, they broke all unpublished apps, because you now need to add this to the Android manifest: <uses-feature android:name="android.hardware.vr.headtracking" android:version="1" android:required="true"/>
  6. Please also consider using the easier-to-use VIU, which adds platforms/devices like Cosmos, Wave (including Focus+), Quest, and more.
  7. Have you tried placing a green sphere resized to the depth you want and centered to follow the camera?
  8. @Futuretown, yes, the Valve-recommended way is to use the UI tool in Unity to create the actions.json mapping, especially if you have custom actions. The one we provided has been tested and created with the tool; it covers the basic actions (the example in the SteamVR plugin) and also includes the actions used in our VIU (Vive Input Utility) plugin as well. Both plugins can generate this for you. I'll be posting screenshots shortly.
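For orientation, a trimmed actions.json looks roughly like the sketch below. The action and binding names here are illustrative examples, not the exact contents of either plugin's generated file; let the Unity UI tool generate the real one:

```json
{
  "actions": [
    { "name": "/actions/default/in/InteractUI", "type": "boolean" },
    { "name": "/actions/default/in/Pose", "type": "pose" },
    { "name": "/actions/default/out/Haptic", "type": "vibration" }
  ],
  "action_sets": [
    { "name": "/actions/default", "usage": "leftright" }
  ],
  "default_bindings": [
    { "controller_type": "vive_controller", "binding_url": "bindings_vive_controller.json" }
  ]
}
```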
  9. Today we are announcing to developers an early access release of the Vive Hand Tracking SDK for the Vive, Vive Pro and the Vive Focus (Wave platform). This SDK provides the ability to track your hands, recognize gestures, and, on the Vive and Vive Pro, track your fingers as well (21-point tracking). For more info please attend the sessions at GDC on Vive Developer Day, Monday March 18. Or just try out the SDK, available now here: https://developer.vive.com/resources/
  10. Can you share the Android manifest snippet? Normally adding it to the manifest prevents the dialog from appearing. Note to all: this is more of a reminder than a build error; you can safely ignore (close) it while developing. It is used by the store, and thus is very important to add before publishing so that your application gets listed in the appropriate sections. The recommendation is to do your best to support both 3DOF and 6DOF controller(s) for wider reach.
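As a sketch, building on the uses-feature snippet from the earlier post about Go support: declaring the head-tracking feature with required="false" is the usual way to advertise that a build can run on both 3DOF and 6DOF devices (check the Wave publishing documentation for the exact attributes your store listing expects):

```xml
<!-- Inside the <manifest> element of AndroidManifest.xml.
     required="false" means the app does not strictly need 6DOF head
     tracking, i.e. it supports both 3DOF and 6DOF devices. -->
<uses-feature
    android:name="android.hardware.vr.headtracking"
    android:version="1"
    android:required="false" />
```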
  11. Getting Started with VIVE Wave™ for Developers

First download the Wave SDK: https://developer.vive.com/resources/knowledgebase/wave-sdk/

The five components of the Wave Platform SDK:
  • Wave Native (Android) SDK: https://hub.vive.com/storage/app/doc/en-us/GettingStarted.html
  • Wave Unity SDK (Plugin) -- see the getting started guide below: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html
  • Wave UE4 SDK (Plugin): https://hub.vive.com/storage/app/doc/en-us/UnrealPlugin/UnrealPluginGettingStart.html
  • Wave PluginKit SDK (for 3rd-party accessories like controllers): https://hub.vive.com/storage/app/doc/en-us/Pluginkit_SDK_Tutorial.html
  • Wave OEM SDK (for 3rd-party headsets): https://hub.vive.com/storage/app/doc/en-us/VROEMService_Tutorial.html

Note: Porting to the VIVE Wave platform
A case study of porting from the Vive (PC) to the Wave (mobile), including tips for optimizing for a mobile GPU, is available here. Additional porting guides from Daydream (mobile) or from the VIVE to Wave are available here: https://hub.vive.com/storage/app/doc/en-us/PortingGuide.html
If your existing application used the SteamVR APIs directly, note that most of the Wave APIs have a one-to-one correspondence to SteamVR. So if you have a 3rd-party framework that wraps SteamVR, you should also be able to support the Wave SDK by mapping the APIs as shown in the VIVE porting guide.

A Quick Start Guide for developing in Unity:
The following are the steps for setting up a scene with the Wave SDK; also see the alternative below it, which uses the VIU toolkit along with the Wave SDK for cross-platform support.
1) Launch Unity and create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide to set up Android: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html)
2) Import wavevr.unitypackage (Assets->Import Package->Custom Package...)
3) From the Project Assets window, drag and drop the WaveVR and ControllerLoader prefabs into your Scene Hierarchy window to create objects in your scene (delete the existing Camera object; there's one already included in WaveVR)
4) To future-proof your application for more than one controller, duplicate the ControllerLoader in your scene (or drag another one in) and select its Type as the left controller in the Inspector window as shown above. At this point it's up to you how to handle a second controller's inputs (or simply handle it the same as the first)
5) From File->Build Settings..., select Build and Run (make sure the device is attached via USB and Developer settings on the device allow USB debugging)

Note: if at any time you get prompted by a WaveVR plugin popup window to accept preferred settings, simply accept unless you have a good reason not to. You can safely dismiss the AndroidManifest-related popup for now until you are ready to publish (this is for indicating 3DOF vs. 6DOF support; it is recommended to support both). At this point you should be able to see your empty scene with your controller on your Wave device!

Alternative Quick Start Using the VIU (Vive Input Utility) Unity plugin:
There is also an additional Unity SDK for developing VR projects that can target multiple platforms and devices. It is highly recommended, especially for new projects or projects that don't require extensive use of the Wave SDK APIs (although you can access both).
The Vive Input Utility is a Unity plugin that can support Vive, Vive Pro, Rift, Daydream, Go and the Wave SDK (e.g. Focus), in addition to Unity's UnityXR APIs, which in turn can support Windows MR and more. This is an abstraction that wraps other SDKs like the Wave SDK, creating a common code base for many platforms.
It's available on the Unity Asset Store (search for VIU) or at https://github.com/ViveSoftware

Steps to create the same application but using the VIU plugin:
1) Launch Unity and create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide to set up Android)
2) Import wavevr.unitypackage (Assets->Import Package->Custom Package...) and the Vive Input Utility from the Unity Asset Store.
3) Drag and drop the ViveCameraRig (or the ViveRig for additional features) into your scene and remove the existing Camera object (there is a camera already included in ViveRig)
4) Build and Run

VIU Note: since these prefabs also support other platforms, you already get two-controller support (in addition to falling back to a single controller). The ViveRig adds controller support for teleporting, grabbing and toggling controller models, and it can be easily modified in the Inspector when ViveRig->ViveControllers is selected in the scene.

Here's what you'll see in your Vive Wave HMD after additionally adding a Plane and a Sphere to your scene using GameObject->3D Object. This screenshot is of a Wave build (using the VIU plugin); if you simply switch the build target to Windows, it'll work on a Vive/Vive Pro, and you would see different controllers for the same scene with no changes. You can then develop with a Vive and only switch the build target back to Android to create a build for the Vive Focus.

You can use a simulator in Unity for testing a VIVE Focus in your Unity Editor: https://hub.vive.com/storage/app/doc/en-us/Simulator.html
And if you have a VIVE Focus but are waiting on 6DOF controllers, you can also simulate a 6DOF controller. More info here: https://github.com/ViveSoftware/ViveInputUtility-Unity/wiki/Wave-VR-6-DoF-Controller-Simulator

Support is provided at the official developer forum for the Wave SDK: http://community.viveport.com/t5/Vive-Wave-SDK/bd-p/vive-wave-sdk
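Once a rig prefab is in the scene, reading input through VIU is device-agnostic. A minimal sketch (assuming the VIU plugin is imported; it uses VIU's ViveInput API from the HTC.UnityPlugin.Vive namespace, and the class name here is illustrative):

```csharp
using HTC.UnityPlugin.Vive;
using UnityEngine;

// Logs trigger presses from the right-hand controller.
// The same code runs on Vive, Wave (e.g. Focus), and the other
// platforms VIU wraps, with no platform-specific branches.
public class TriggerLogger : MonoBehaviour
{
    void Update()
    {
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
        {
            Debug.Log("Right trigger pressed");
        }
    }
}
```

This is the cross-platform payoff described above: the hand-role abstraction replaces per-device controller indices.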
  12. It's currently available - please see the update at the top of this thread.
  13. Not anymore in SteamVR as of today... not sure about UE4, but I think 4.21 should have all the latest (like skeletal support and 6DOF controllers for Android).
  14. Well, like I said, it's been fixed in a just-updated OpenVR release (yet to be included in SteamVR or the Unity beta plugin). But regarding "redirecting controller input to tracker via OpenVR Input Emulator, or using it as controller after role change (Vive Tracker Role Changer)": please note that both of these can now be considered EOL due to the new button/role bindings in the new input system in SteamVR. That being said, once you do map a hand role to a tracker (the bindings can be customized by the user via settings within the headset, or at the desktop running SteamVR at http://127.0.0.1:8998/dashboard/controllerbinding.html ), the class id will be changed from generic_tracker to controller, according to the release notes for the latest version of SteamVR. Technically this would break existing apps that used the generic_tracker class id for hand roles; however, a tracker can now role-change to identify as a controller (left/right or both), allowing any shooter game to use a tracker with a Hyperblaster. Moving forward, the SteamVR apps that broke should get updated and specify default action mappings. The beta SteamVR Unity plugin creates templates for you to edit if you need examples.