
jboss


Everything posted by jboss

  1. Hi, I just changed my non-VR project into a VR project to be able to use it on my Vive. I did the following steps:
     • Enabled Virtual Reality Supported in Player Settings (I am using Unity 2019.4.18f1)
     • Installed the OpenVR Desktop package
     • Selected the OpenVR SDK in Player Settings
     • Imported SteamVR from the Asset Store
     • Imported Vive Input Utility from the Asset Store
     Then I got 11 compile errors, all saying 'Event' does not contain a definition for 'current', in various files, such as:
     Assets\HTC.UnityPlugin\ViveInputUtility\Scripts\ControllerTooltip\Editor\TooltipRenderDataAssetBaseEditor.cs(121,19)
     Assets\HTC.UnityPlugin\Utility\Attribute\Editor\FlagsFromEnumAttributeDrawer.cs(128,35)
     I figured I might need to restart Unity to reload the DLLs, but that did not help. What is going wrong? Thanks!
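     In case it helps to pinpoint it, this is the kind of call that fails. My guess is a type-name collision, in which case fully qualifying the type compiles; here is a hypothetical test script of mine, not the actual VIU code:

     using UnityEditor;
     using UnityEngine;

     public class EventCurrentTest : EditorWindow
     {
         private void OnGUI()
         {
             // If any other type named `Event` is in scope (e.g. from another
             // imported asset), the unqualified form no longer resolves and
             // produces: 'Event' does not contain a definition for 'current'.
             // Event e = Event.current;

             // Fully qualifying the type side-steps the collision:
             UnityEngine.Event e = UnityEngine.Event.current;
             if (e != null && e.type == EventType.Repaint)
             {
                 Debug.Log("Repaint event received");
             }
         }
     }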
  2. Thanks for the reactions. @chengnay I am using Unity 2019.4.18f1 and VIU version 1.13.4.0 for an Oculus Android device. The Oculus Integration version I can't find: I checked the folder in the hierarchy for a version log, checked the Oculus item in the editor menu, and checked one of the scripts, but nowhere is there a version number. The only thing I can find is the Oculus XR Package, which is version 1.5.0. @lawwong Great, that works now!
  3. This must be so straightforward, but I just can't get it to work. How can I get the controller velocity on the Oculus Quest 2? I know that for SteamVR it works using the SteamVR_Behaviour_Pose component. I have a standard VR CameraRig with RightHand and LeftHand. Both have a VivePoseTracker (Transform) on them and a Rigidbody. When I print Rigidbody.velocity it always shows (0,0,0). I figured maybe I should use the ViveRigidPoseTracker (Rigidbody), but then my controllers were not tracked at all, so I reverted that. I checked the scripts to see if it might be similar to the problem with button presses, but there are no switches for Oculus_Quest in either PoseTracker script. So I hope someone can point me to the solution. The context is that I want to throw a ball. I made an alternative solution using ViveColliders and BasicGrabbable. That works, but the velocity there is off as well: the trajectory of the ball after releasing the trigger looks like the ball is extremely heavy. It flies only about 2 meters, where it should fly 20-30 meters. So it looks like the velocity transferred to the ball is a lot lower than the velocity of the controller. The mass of the ball is 0.195 kg, so it is not extremely heavy. In the option described above I can use a multiplication factor to increase the velocity (I do that in the SteamVR version as well), but with BasicGrabbable I can't find such a work-around. Thanks!
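     For reference, this is the kind of thing I'm after. A rough sketch of what I would expect to work, assuming VivePose.GetVelocity reports properly on the Quest (BallThrower, ball and speedMultiplier are my own placeholder names):

     using HTC.UnityPlugin.Vive;
     using UnityEngine;

     public class BallThrower : MonoBehaviour
     {
         public Rigidbody ball;                // the ball to be thrown
         public float speedMultiplier = 1.5f;  // same kind of tuning factor I use in the SteamVR version

         private void Update()
         {
             // On trigger release, hand the controller velocity to the ball.
             if (ViveInput.GetPressUp(HandRole.RightHand, ControllerButton.Trigger))
             {
                 // I'm assuming the velocity is reported in rig-local space,
                 // so transform it into world space before applying it.
                 Vector3 v = VivePose.GetVelocity(HandRole.RightHand);
                 ball.velocity = transform.TransformVector(v) * speedMultiplier;
             }
         }
     }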
  4. In case someone finds this with a similar problem: It was related to Quest 2 and the fix is described here: https://github.com/ViveSoftware/ViveInputUtility-Unity/issues/225#issuecomment-896542888
  5. @chengnay Unfortunately not. The only feedback from Oculus/Facebook is that the VRC Functional 3 check failed: the user gets stuck. @Dario Great, thanks! I'll send you a message.
  6. I have ported my Vive baseball training to the Quest, including the Vive Input Utility. Every time I submit it to App Lab it gets rejected because it locks up right at the start scene: the user is not able to select any menu options. The weird (and frustrating) part is that it works fine on my own Quest, so I have no way to reproduce the situation and troubleshoot it. Has anybody seen this as well?
  7. I found a solution, but not the preferred one, I guess 🙂 In the ReticlePoser script I commented out the line:
     targetReticle.rotation = Quaternion.LookRotation(result.worldNormal, raycaster.transform.forward);
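     A less invasive variant might be to keep the surface alignment but replace the roll reference, something like this (an untested sketch on my part):

     // Align the reticle to the hit surface, but use world-up instead of the
     // raycaster's forward as the up reference, so pointing left or right
     // should no longer roll the logo. Note this degenerates when the surface
     // normal is (anti)parallel to Vector3.up, e.g. pointing at the floor.
     targetReticle.rotation = Quaternion.LookRotation(result.worldNormal, Vector3.up);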
  8. I would like to use a logo as a reticle. First I tried to use a sprite for this, but I found that the ReticlePoser script requires Mesh Renderers, so I changed it to a cylinder. But I notice that the reticle rotates depending on the angle between the pointer and the menu. If I point straight ahead the logo is fine, straight up, but when I point to the left it rotates counterclockwise and when I point to the right it rotates clockwise. How can I prevent that? I tried to uncheck the Ease Rotation marks in the Pose Easer, but they were checked automatically again when I hit Play (and remained checked even after ending Play, strangely enough). Thanks.
  9. Thanks. The laptop power is not the problem. The background for the question is the recommendation to minimise the motor activity of the base stations to avoid breakdown. I recently had to replace one of my 2.0 base stations because it was not working anymore (red LED, indicating hardware failure). Before that I never really bothered about the base stations remaining on, but now I want to be a bit more careful. When I manually exit SteamVR the base stations shut down fine, but in the other situations they remain on.
  10. OK, that's in the official Play Store, so that should be safe. And it works fine 🙂 Thanks!
  11. Great, thanks! I found the Android version of the BS Companion at android-apk-app.com. https://android-apk-app.com/android/1533473030/bs-companion Do you know if that's safe to install?
  12. Is there a way to turn off the base stations when my computer goes into hibernation mode, or when I close the lid of my laptop? It works fine when I exit SteamVR, but sometimes I leave my laptop on and walk away; then the laptop goes into sleep mode, but the base stations remain active.
  13. Thanks for the new version. I managed to get it working in the previous version already and I see that the behaviour is the same. But it is still not straightforward in my baseball application. I am logging Unity GameObjects (baseball bat and ball) and the gaze, and I replay them together, with the option to step through the action frame by frame. I have set the frame rate in Unity to 240 fps and logging the GameObjects is done in FixedUpdate. So with eye tracking at 120 Hz I expect the gaze to move every two frames, but it doesn't: sometimes it's after 2 frames, sometimes after 3, and very seldom after 1. I remember from the other thread that it might be related to the moment in the cycle when the gaze data is read, but it just feels awkward. It must be running at 120 Hz, otherwise the 2-frame jump would not be possible, but it does not seem consistent.
     Some feedback about upgrading the runtime: because I start SRanipal together with Windows I got the message that SRanipal was running. I quit SRanipal from the tray, but then SRService was still running. The installer gave the option to continue and then upgrade the locked files at restart, but even when I selected that option the installer reversed later in the process with error 10001 (or 1001, I don't remember exactly): service running. Killing the service using Task Manager made no difference either. I had to uninstall SRanipal and then install it again.
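     To show what I mean by the inconsistent jumps, this is roughly how I check it (a trimmed-down sketch of my logging, assuming the v2 GetGazeRay overload; the class name is my own):

     using UnityEngine;
     using ViveSR.anipal.Eye;

     public class GazeFrameLogger : MonoBehaviour
     {
         private Vector3 lastGazeDirection;
         private int fixedFrame;

         private void Awake()
         {
             // 240 fps physics: each FixedUpdate is ~4.17 ms, so at 120 Hz
             // the gaze direction should change every second fixed frame.
             Time.fixedDeltaTime = 1f / 240f;
         }

         private void FixedUpdate()
         {
             fixedFrame++;
             if (SRanipal_Eye_v2.GetGazeRay(GazeIndex.COMBINE, out Ray gazeRay))
             {
                 bool moved = gazeRay.direction != lastGazeDirection;
                 Debug.Log($"fixed frame {fixedFrame}, gaze changed: {moved}");
                 lastGazeDirection = gazeRay.direction;
             }
         }
     }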
  14. I've seen the posts that a blinking red light is usually a mechanical issue, yet I hope my case is different 😕 as I can not think of any cause for a mechanical failure. The base station was working fine yesterday morning and was blinking red yesterday afternoon. There was no fall, bump or any impact in the meantime; it has just been on the clamp on my doorway as always. Also, when I plug the base station in it starts normally: it goes from white to blinking blue to a green light, and then after about 10 seconds it goes to blinking red. I tried connecting the base station to my laptop and running the firmware recovery tool, but that did not find the base station. Any thoughts? Is it really broken or is there some way to rescue it (it is almost 2 years old, so I'm afraid it's not under warranty anymore)?
  15. @Corvus Thanks for this guide. I'll give it a try after the weekend. One question: in the screenshot you use Eye Version 1. Does that matter? Or does it work with both version 1 and 2?
  16. I've been developing for the Vive for a couple of years now and every now and then I get questions about accuracy and refresh speed. My application is a baseball training and it works with a controller or with a tracker on an actual bat. When people try to hit high-speed pitched balls the controller or tracker moves quickly as well, and when people miss the ball they sometimes wonder if the accuracy of the system is good enough for such high-speed tracking. One of the features in the training is that I show the path of the bat during the swing, and there too I notice strange results when I move the bat at high speed, particularly when using a tracker: the bat position jumps and sometimes the bat is rotated as well. My initial thought was that with the tracker at the knob of the bat it would be shielded from the lighthouses (or at least from one of them), but I tested it with different setups (both lighthouses in front of me, for example) and that made no difference.
     So I'm trying to understand what could be the reason, and for that I would like to know the tracking speed and accuracy of the system. Is tracking a baseball bat moving at 80 mph at all possible using a Vive tracker? 80 mph is about 35 meters per second, so 3.5 cm per millisecond; even with a refresh rate of 1000 Hz the tracker would move 3.5 cm between measurements. I found a couple of answers on reddit, and from those I understand that it is a combination of IMU tracking and lighthouse tracking, but I can't find any clear answer on how many times per second the position and rotation are updated. I also found this paper from NASA, https://ti.arc.nasa.gov/publications/61557/download/, talking about accuracy, but I can't find any clear info on the Vive site. And the reddit posts are a couple of years old, so are they still accurate?
     So I hope someone can point me to specs from HTC or can answer this in this post. Especially about the accuracy, as the paper from NASA shows that for a moving tracker the accuracy decreases drastically (43 mm, i.e. 4.3 cm, off is a lot when you try to hit a fast-moving virtual baseball). And is there a difference between the version 1.0 base stations and trackers and version 2.0? I currently have version 1 trackers (or even developer kit trackers 🙂). Thanks!
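     To make the numbers easy to vary, this is the arithmetic I'm using (my own helper, nothing official):

     // Displacement of the tracker between consecutive samples:
     // convert mph to m/s, then divide by the assumed update rate.
     static float SampleDisplacementMeters(float speedMph, float updateRateHz)
     {
         float speedMs = speedMph * 0.44704f;  // 80 mph ≈ 35.8 m/s
         return speedMs / updateRateHz;        // at 1000 Hz ≈ 0.036 m per sample
     }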
  17. The same bug applies to the combined gaze ray. When I move my gaze to the left the ray goes to the right and vice versa; up/down is fine. If gaze_origin_C is the measured combined origin and gaze_direct_C the measured combined direction, you need to do the calculation below to use them:
     fixed_gaze_origin_C = new Vector3(gaze_origin_C.x * -1f, gaze_origin_C.y, gaze_origin_C.z) / 1000f;
     fixed_gaze_direction_C = new Vector3(gaze_direct_C.x * -1f, gaze_direct_C.y, gaze_direct_C.z);
     Note the division by 1000: I'm using version 2 of the eye tracking API, and there the measured origin values are in mm while Unity units are meters.
  18. One thing I noticed now that I'm using version 2 is that the measured values for the origin are in mm, so to use them in Unity I need to divide by 1000. In version 1 this was not necessary; I could use the origin values right away.
  19. @VibrantNebula I agree that if you start SR-runtime manually before any application it works fine. But as a user (enterprise or consumer, I don't think this is an enterprise problem) you have to remember that. Every time! Because if you forget, it breaks the immersiveness of VR. You have your headset on and you get the message "Next up... your application" and it just stays there, because on screen there is the UAC popup. So you have to take off the headset, click OK and put the headset on again. Not a good VR experience. I changed the properties of SR-runtime.exe to always run as administrator, but that makes no difference, as other users have reported as well. I then had a brainwave: what if I run SR-runtime at startup automatically? Then you can't forget. Well, that didn't work either. I had to copy it to the Startup folder (not link, an actual copy; weird Windows logic...), but when I then started calibration the three blue dots froze and disappeared. And then the Vive Pro Eye option was gone from the SteamVR in-world system menu. So I copied it back to the original location. Since you mention it is related to the Tobii runtime environment, I hope you will take this up with them. I think I will do that as well, but if the request comes from HTC it may have more leverage.
  20. @VibrantNebula My problem is not so much with the calibration itself, but with the UAC popup. If you want me to start a new thread, let me know. Here's what happens: every first time I run calibration after starting up my PC I get the UAC popup asking if it is OK to allow Vive-Super-Reality-SR_Runtime to make changes to this PC. It makes no difference if I start calibration from SteamVR, from the Unity editor or from a Unity application .exe. If I click Allow, then all subsequent starts work right away, also if I first run it from my application .exe and then from SteamVR. Since the popup blocks the whole screen I can not go to the SR-runtime icon and pack-log. This is a new MSI laptop, Windows 10, with a clean install of SteamVR and SRanipal. I created only one account on this PC and according to Windows accounts it has administrator rights. I did not change any access rights settings for SRanipalRuntime.exe yet, because I see mixed reactions. I do see that there are settings for users, administrators and for applications; maybe that is a clue? I attached the settings (I hope the order is the same in all languages 🙂). I hope you can help me with this. I need to install a Vive with eye tracking at a customer's site later this month, and since it is a large organisation I'm sure the IT department will not allow me to disable UAC completely. And I don't want to tell the customer 'Well, if you want to use eye tracking you need to start this manually first', or that they need to take off the headset, click OK in the popup and then continue the application. Thanks!
  21. Hi @imarin18 I realise I forgot to mention something. I made a small adaptation to the script. In the while loops in the Sequence coroutine I added a check for a boolean called taskInterrupted. This boolean is set when I press the application menu button on the Vive controller. When the boolean is true it breaks the while loop showing the targets, and then the last part of the Sequence coroutine is executed, where saccadeEndTime is set. And that actually also breaks the while loop in the EyeCallback method, I see now, even though cnt_callback < maxframe_count.
     // ======================================================================
     // 1. Perform saccadic eye movement assessment: 60 pro-saccade trials.
     // ======================================================================
     Debug.Log("1st pro-saccade test has started.");
     debugWindow.text = "1st pro-saccade test has started.";
     // JB last step for testing, no changes in the rest of the procedure
     for (int i = 0; i < pro_n; i++)
     {
         yield return StartCoroutine(TargetAppear(prosac_time_1[i], prosac_direct_1[i], i));
         if (taskInterrupted)
         {
             break;
         }
     }
     /* Skip rest of the test
     Debug.Log("Take 1 minute break. Press space key to start 4 practice trials of anti-saccade task.");
     So now that I write this answer I realise that the problem must be in how I stop the eye tracking in my application. And I understand better how your application works as well. 🙂 So thanks for your reaction,
  22. I'm sorry that my last post was somewhat rude. I was just taken by surprise that there is a time limit on editing a post. I'd really appreciate some help getting this working. Especially the freezing of the editor is killing me: after every test run I have to use Task Manager to end the editor and then start it again. That is not workable. @imarin18 I checked what happens in your application at the end, because that application ends normally, and copied that to a coroutine in my application (see the code below), and I use Update to call this coroutine when the X key is pressed. I can see in the log that the coroutine is executed, but the editor still freezes.
     IEnumerator EndEyeTracking()
     {
         // Stuff from saccade test project that is done at the end of the test
         //debugWindow.text = "Test finished. Waiting " + Endbuffer.ToString() + " seconds";
         yield return new WaitForSeconds(Endbuffer);
         SaccadeEndTime = DateTime.Now.Ticks;
         // JB changed because original code causes build errors
         if (Application.isPlaying)
         {
             Application.Quit();
         }
         Debug.Log("Eyetracking finished");
     }
     I know that Application.Quit() is ignored in the editor, so the application does not quit by itself; instead I get the debug line below it. In the other forum thread that mentions the freezing of the editor there are several options and I get confused about which is which. One uses new Thread() and then uses OnApplicationQuit and OnDisable to abort the eye tracking, but in the saccade test this is not used. https://forum.vive.com/topic/5897-getting-verbosedata-at-the-fastest-rate-possible/page/2/ So I hope someone can help me with this. I attached the editor log file, maybe that helps. Editor_Freeze_OnePitch.log
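     As far as I understand the pattern from that thread, it boils down to releasing the callback when play mode ends. My rough reconstruction, assuming the v2 callback API (this is a sketch, not the code from the saccade test):

     using System.Runtime.InteropServices;
     using UnityEngine;
     using ViveSR.anipal.Eye;

     public class EyeCallbackLifetime : MonoBehaviour
     {
         private static EyeData_v2 eyeData = new EyeData_v2();

         // Keep one delegate instance alive so register and unregister
         // use the same function pointer (and the GC can't collect it).
         private static readonly SRanipal_Eye_v2.CallbackBasic callback = EyeCallback;
         private bool registered;

         private void Update()
         {
             if (!registered && SRanipal_Eye_Framework.Status == SRanipal_Eye_Framework.FrameworkStatus.WORKING)
             {
                 SRanipal_Eye_v2.WrapperRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate(callback));
                 registered = true;
             }
         }

         // Releasing the callback when play mode ends is what should keep the
         // editor from freezing: otherwise the native thread keeps calling in.
         private void OnDisable() { Release(); }
         private void OnApplicationQuit() { Release(); }

         private void Release()
         {
             if (registered)
             {
                 SRanipal_Eye_v2.WrapperUnRegisterEyeDataCallback(Marshal.GetFunctionPointerForDelegate(callback));
                 registered = false;
             }
         }

         private static void EyeCallback(ref EyeData_v2 data)
         {
             eyeData = data;  // runs on the SRanipal thread, not Unity's main thread
         }
     }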
  23. Well, screwed by technology again. Apparently I pressed some button to post this before it was finished, and now I can not edit it anymore. So here's the full version. Now I'm converting the saccade test project to my own project and I get new errors, so I hope someone can help me with those. First I had to remove the static keyword from the EyeCallback method, because I am logging the data and my CreateNewLogEntry method derives from a non-static base method. I can not judge the impact of that, but at least I got no compile errors. Then I need to link the eye tracking log to my motion log, where I log the ball and baseball bat in Unity's FixedUpdate process at 250 fps. Since the timestamp may have too much variation, I decided to use the frame number. But then I got the error that Time.frameCount can only be used in the main thread, so I cannot read it in the EyeCallback thread. So I figured I should use eyedata.frame_sequence, but that has completely different values. I did a comparison with debug lines and found:
     FixedUpdate in LoggerEyetracking at frame 31
     EyeCallback called at eyedata.frame 276386, loggingActive=False
     How can these values be so far apart? And is there another way I can link the EyeCallback thread to the frames in the main thread? And every time there is an error the Unity editor freezes, so I have to use Task Manager to kill it and restart it. But why does this have to be so complicated? This is costing me so much time that I can not spend on new content for my application. Why can't it be connected to the FixedUpdate process in the main thread? Use 90 fps and 90 Hz as default (like it is now in VIU: link Physics Rate to Refresh Rate), and if I as a developer change fixedDeltaTime to something between 90 and 120 the eye tracking frequency changes accordingly (or you can keep it at 90 Hz, if that's too complicated). And when I set fixedDeltaTime to anything higher than 120 the eye tracking frequency is 120 Hz. If it ran in the FixedUpdate process, I could simply combine the motion log with the eye tracking data available at that moment, even if it may have been recorded 1 or 2 frames earlier.
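     One work-around I can think of for the Time.frameCount restriction: cache the frame number on the main thread and read the cached copy from the callback thread (my own sketch; FrameBridge is a made-up name):

     using UnityEngine;

     public class FrameBridge : MonoBehaviour
     {
         // Written on the main thread, read from the eye tracking callback thread.
         // volatile so the callback sees the latest value without locking.
         private static volatile int frameForCallback;

         private void FixedUpdate()
         {
             frameForCallback = Time.frameCount;  // legal: main thread only
         }

         // Called from the callback thread; Time.frameCount is off limits there,
         // but this cached copy is safe to read.
         public static int CurrentFrame => frameForCallback;
     }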
  24. Now I'm converting the saccade test project to my own project and I get new errors, so I hope someone can help me with those. First I had to remove the static keyword from the EyeCallback method, because I am logging the data and my CreateNewLogEntry method derives from a non-static base method. I can not judge the impact of that, but at least I got no compile errors. Then I need to link the eye tracking log to my motion log, where I log the ball and baseball bat. Since the timestamp may have too much variation, I decided to use the frame number. But then I got the error that Time.frameCount can only be used in the main thread, so I cannot read it in the EyeCallback thread. So I figured I should use eyedata.frame_sequence, but that has completely different values. I did a comparison with debug lines and found:
     FixedUpdate in LoggerEyetracking at frame 31
     EyeCallback called at eyedata.frame 276386, loggingActive=False
     How can these values be so far apart?