
hardware is only as good as the software



I'm going to go on a tangent for a second about developing for VR.


As the hardware developer, Vive should make the interaction software easy to use with full functionality.

Your hardware is only as good as the software, and you're relying on a bunch of developers you don't know to make your hardware work well.


You are depending on some random developer to make your controller interact as well as it can, instead of developing the software yourself.


I don't know how many times people I demo to have become frustrated with VR only because the developer wrote bad hand controls. Vive needs to figure out the software that makes its hardware work best.


99.9% of people want to use the controllers the same way. There are only a handful of interactions; it's not hard for you to work out what works best:


1. Locomotion: Walk/teleport

2. Laser point: grab/push/pull/rotate/activate

3. Action: Climb grab/push/pull/rotate/activate

4. Action: car/ship/spaceship/motorcycle

5. Action: Swim/skydive


You should figure out what is easiest for people and write that software. I don't have the time or resources to make case studies, statistics, and prototypes. I can focus on developing the actual game, and you can sell more hardware.


SteamVR 2.0, VRTK, and VR Easy are great directions; no one is going to want to use Vive Input unless you make it more robust with minimal effort required.


Simply figure out the most common configuration and have your team develop that software.



"SteamVR 2.0, VRTK and VR Easy are great directions..." is certainly debatable. VIU has the easiest path for abstracting input, with a clean architecture, and our input role mapping is now also integrated with the new Steam Input binding system. It takes fewer clicks for any newbie to get started than with those toolkits, and those who have started using VIU can attest to this (even our simulator has been cited as the best one).


If what you're asking about is more about VR design and best practices, then I would recommend looking at the Primer PDF we have participated in as part of the XR Association (see http://xra.org).


Let's review:

1. Locomotion: Walk/teleport

  - VIU provides basic examples (out-of-the-box support with the ViveRig prefab); you can swap out the graphics if they're too simple (you should always customize your look and feel anyway; one shouldn't be able to tell which toolkit you used in your app)

2. Laser point: grab/push/pull/rotate/activate

  - VIU also provides basic pointer prefabs and scripts for the basics (please see the examples)

3. Action: Climb grab/push/pull/rotate/activate

  - VIU again provides grab support as part of the ViveRig controller sample script; more advanced examples can be adapted from any library collection that has abstracted the specific physics interactions you need.


4. Action: car/ship/spaceship/motorcycle

5. Action: Swim/skydive


These are more complete use-case requests that can be part of extended examples and libraries, not the core input utilities SDK, which is more than sufficient for software engineers to make use of. That's why VIU is open source and many professionals build upon it; it has always been event driven with callbacks, and the pro studios that use it really prefer it over the others for building their library of apps.
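The "event driven with callbacks" design mentioned above can be sketched in a few lines. This is a language-agnostic illustration in Python (VIU itself is a Unity/C# library); the role and button names are hypothetical, and this is not VIU's actual API:

```python
from collections import defaultdict

class InputEvents:
    """Minimal event dispatcher: listeners register callbacks for
    (role, button) pairs and are invoked when the device fires."""

    def __init__(self):
        self._listeners = defaultdict(list)

    def add_listener(self, role, button, callback):
        self._listeners[(role, button)].append(callback)

    def remove_listener(self, role, button, callback):
        self._listeners[(role, button)].remove(callback)

    def fire(self, role, button):
        # Called by the device-polling layer when a press is detected.
        for callback in self._listeners[(role, button)]:
            callback(role, button)

events = InputEvents()
events.add_listener("RightHand", "Trigger",
                    lambda role, button: print(f"{role} {button} pressed"))
events.fire("RightHand", "Trigger")  # invokes the registered callback
```

The point of the pattern is that game code reacts to presses without polling every frame, and listeners can be added or removed at runtime without touching the dispatch layer.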


If you're looking for out-of-the-box components for non-programmers, i.e. designers, you can look at other tools like Playmaker or even the VR editor tools, among many others (although, yes, not ideal). But ultimately that's the goal: to require no scripting, or as little as possible, and it's on Unity's roadmap for this year (visual scripting similar to UE4 blueprints). Locking users into monolithic, bloated SDKs and libraries built on inheritance does more of a disservice and prevents pivoting to new requirements (requiring rewrites and breaking backward compatibility).


Technically, VIU falls back to the Unity XR APIs, supporting even more devices (VIU doesn't need the SteamVR plugin present), and ideally we shouldn't need any third-party plugins once the Unity XR APIs are complete enough to provide basic access to most devices. Libraries and projects should not get stuck on specific SDKs/toolkits; they should have their own abstraction layer to make it easy to switch toolkits when the time comes, making porting to future devices easier.
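The "own abstraction layer" advice above amounts to the adapter pattern: the app defines one interface, and each toolkit gets a thin wrapper behind it. A minimal sketch in Python, with entirely hypothetical interface and backend names (no real toolkit's API is shown):

```python
from abc import ABC, abstractmethod

class InputBackend(ABC):
    """App-owned interface; only this layer knows about any toolkit."""
    @abstractmethod
    def is_pressed(self, role: str, button: str) -> bool: ...

class ToolkitABackend(InputBackend):
    # Hypothetical wrapper around one toolkit's API.
    def is_pressed(self, role, button):
        return False  # would delegate to the toolkit's calls here

class SimulatorBackend(InputBackend):
    # Test/simulator backend: presses are scripted, no hardware needed.
    def __init__(self):
        self.pressed = set()
    def is_pressed(self, role, button):
        return (role, button) in self.pressed

def grab_if_pressed(backend: InputBackend, role="RightHand"):
    # Game code depends only on InputBackend, so switching toolkits
    # means writing one new adapter, not rewriting game logic.
    return "grab" if backend.is_pressed(role, "Trigger") else "idle"

sim = SimulatorBackend()
sim.pressed.add(("RightHand", "Trigger"))
print(grab_if_pressed(sim))  # grab
```

A side benefit of this layering is that the simulator backend doubles as a test harness, which is one reason simulators are so useful during development.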


I think the confusion here is about the difference between an SDK and a library. SDKs have examples that aren't intended to be a complete library of off-the-shelf components, but building blocks toward one. Many companies develop their own in-house libraries (some even their own SDKs), and again, you should abstract your own interface between your APIs and third-party APIs before building an extensive library for use across your own applications.


Our input mapping to roles alone is the reason you should switch to VIU (in VR or as a 2D overlay, you can select devices and assign them hand or body roles), as well as the additional SteamVR MR camera support and the ability to retarget to different platforms from preferences rather than in your scene.
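Role mapping of the kind described (assigning physical devices to hand or body roles, then querying by role) can be illustrated with a minimal sketch. The names here are hypothetical and this is Python, not VIU's actual C# API:

```python
class RoleMap:
    """Map physical device ids to logical roles so game code can ask
    for 'the right hand' instead of hard-coding device indices."""

    def __init__(self):
        self._role_to_device = {}

    def assign(self, role, device_id):
        # Reassigning a role (e.g. after a controller reconnects with a
        # new id) transparently updates every caller that queries by role.
        self._role_to_device[role] = device_id

    def device_for(self, role):
        return self._role_to_device.get(role)

roles = RoleMap()
roles.assign("RightHand", 3)   # e.g. controller enumerated as device 3
roles.assign("LeftHand", 4)
roles.assign("RightHand", 5)   # device reconnected under a new id
print(roles.device_for("RightHand"))  # 5
```

The indirection is what makes the mapping robust: game logic never sees device indices, so swapping, reconnecting, or remapping hardware doesn't touch scene code.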


Once you get started with our static functions for handling input (especially with autocomplete) you'll understand that it's much easier to write scripts than with other toolkits.



Thank you Dario for the time and thought you put into your message! I appreciate it.


VR games suck solely because of the controller interaction; looking at the games' reviews proves this.


Bottom line: if you have to explain to someone in a VR game how to move or interact, you haven't done a good job. Giving control to the developers has proven to be a bad idea.


I've bought and played about 90% of all VR games to date, and I can tell you Vive has a software/controller problem, and it's not helping VR.


First-time VR users walk away frustrated with the interaction and are put off, never wanting to return; they think it's for hardcore gamers who are willing to put up with the complexity.


It's not the controllers; it's the freedom you give your developers, who don't have well-thought-out concepts. Truthfully, there is only one right way to write the code for your controllers and how they interact.


Out of all the games I've played, there are three with intuitive controls. It is so frustrating to have to figure out, for each game, how to interact. All but those three games I will not play again, only because of the controllers.


"Companies develop their own in house libraries": this is the problem with VR today.

Vive needs to supply a fully thought-out library of interactions that everyone sticks with; I wouldn't even make it an option. Just have the controllers always work the same way; I don't see any situation where you couldn't do this.

I think you should have drag and drop: choose an "Action Set", done. Tell us how to interact and we will change our game to adapt.


Sorry for the fuss and rant ;)

