MacBread Posted September 17, 2019

I'm trying to show two kinds of video feeds (an external camera / webcam, and the HMD's front camera) on the lenses. I found the WebCamDevice class (https://docs.unity3d.com/2020.1/Documentation/ScriptReference/WebCamDevice.html) and a project that uses it on an Oculus HMD, and I imitated that project to study how it works. I assumed the front cameras are connected to the PC, so they would be captured as webcams, but they aren't. How can I project video from the front camera onto an object as a texture? Is WebCamTexture the right API for this?

This is an example of what I want to do: the two Main Cameras are the two lenses for stereoscopy, and the two quads are the screens for the video (from the webcam or the front camera). I'm attaching the code I found at the bottom of this post; it's too long to put inline.

I'm a beginner in VR and Unity, so I may be thinking about this the wrong way. I'd appreciate your wisdom and know-how. Thanks, developers!

-------------------------------------------------------------------------------------------------------------------------------------------------

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class WebcamTexture : MonoBehaviour
{
    WebCamTexture webcamTexture;
    public int webcamNumber;          // index into WebCamTexture.devices
    float cameraAspect;
    private float margin = 0f;
    public float scaleFactor = 1f;
    public bool rotatePlane = false;
    public bool fitting = false;
    //Transform cameraOVR;
    //public static Vector3 cameraPos;

    // Use this for initialization
    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        webcamTexture = new WebCamTexture();
        if (devices.Length > 0)
        {
            // Select the requested device and project its feed onto this object's material.
            webcamTexture.deviceName = devices[webcamNumber].name;
            Renderer renderer = GetComponent<Renderer>();
            renderer.material.mainTexture = webcamTexture;
            webcamTexture.Play();
        }
        if (rotatePlane) transform.Rotate(Vector3.forward, 180);
        if (fitting) FitScreen();

        // camera position
        //cameraOVR = GameObject.Find("OVRCameraController") as Transform;
    }

    // Scale the quad so it fills the parent orthographic camera's view.
    void FitScreen()
    {
        Camera cam = transform.parent.GetComponent<Camera>();
        float height = cam.orthographicSize * 2.0f;
        float width = height * Screen.width / Screen.height;
        float fix = 0;
        if (width > height) fix = width + margin;
        if (width < height) fix = height + margin;
        transform.localScale = new Vector3((fix / scaleFactor) * 4 / 3, fix / scaleFactor, 0.1f);
    }

    // Update is called once per frame
    void Update()
    {
        //cameraPos = cameraOVR.position;
        //print(cameraPos);
    }
}

@Corvus @Daniel_Y
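ps. Here is the quick check I used to list the devices Unity reports, in case it helps (just a sketch; the names printed depend entirely on your drivers, and on my machine the Pro's front cameras did not show up here):

using UnityEngine;

// Attach to any GameObject and check the Console for the cameras Unity exposes.
public class ListWebCamDevices : MonoBehaviour
{
    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        Debug.Log("Found " + devices.Length + " camera device(s).");
        for (int i = 0; i < devices.Length; i++)
        {
            Debug.Log("Device " + i + ": " + devices[i].name +
                      (devices[i].isFrontFacing ? " (front facing)" : ""));
        }
    }
}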
Corvus Posted September 18, 2019

@MacBread If you are a beginner in Unity and VR, I would suggest checking some of the blogs that have been posted about using the camera for pass-through. The one I'm linking is by @Dario and should help you get started, but there are many others if you search around.
https://medium.com/@dariony/about-mixed-reality-part-2-4a371f03d910
MacBread Posted September 19, 2019 (Author)

Thanks, Corvus! I read Dario's article and managed to project the front camera's video stream onto the HMD with both Dario's tutorial and the ViveSR sample. I think the ViveSR sample code fits my case better; Dario's approach doesn't seem to cover the Pro's two cameras.

ps. Where can I find more detailed examples and how-tos for the Pro / Pro Eye APIs? Vive's API reference just tells me "function A is for B"... Or do I need to learn more C#?
dario Posted September 19, 2019

Hi @MacBread,

The tutorials are dated; there is now support for both cameras, not just through SRWorks but through the OpenVR API as well (though I haven't checked whether the SteamVR Unity plugin has been updated). I plan to get back to this via OpenVR for the Vive Pro, but I recommend SRWorks as the easier route.

One approach is to take the textures (or the quads) from one or both of the eyes and simply move them onto the object you want to project on. Currently the best way to learn how the SRWorks framework works is to use the inspector at runtime to see how the scene is laid out; then you'll know which parts to use or copy.

I do plan to get to new tutorials. In the meantime, please feel free to ask for help getting started on the SRWorks SDK forum.

Best,
Dario
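As a rough illustration of the "move the texture to your own object" idea (just a sketch; the source object name below is a placeholder, so use the runtime inspector to find the actual quad SRWorks creates in your scene):

using UnityEngine;

// Copies the live camera texture from a quad SRWorks spawns at runtime
// onto your own object's material.
public class CopyPassThroughTexture : MonoBehaviour
{
    // Placeholder name -- verify the real object name in the runtime hierarchy.
    public string sourceQuadName = "DualCamera (head)";

    // The object you want to project the camera feed onto.
    public Renderer targetRenderer;

    void Update()
    {
        // Wait until the source quad exists and has a texture, then copy it once.
        if (targetRenderer == null || targetRenderer.material.mainTexture != null)
            return;

        GameObject source = GameObject.Find(sourceQuadName);
        if (source == null)
            return;

        Renderer sourceRenderer = source.GetComponent<Renderer>();
        if (sourceRenderer != null && sourceRenderer.material.mainTexture != null)
        {
            // Sharing the texture keeps the target in sync with the live feed.
            targetRenderer.material.mainTexture = sourceRenderer.material.mainTexture;
        }
    }
}

Attach it to any GameObject, assign your quad's Renderer to targetRenderer, and it will pick up the texture once SRWorks has initialized.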