axel.goris Posted May 1, 2025

Hey! There's nothing more fun than new requirements, and I was wondering whether (and how) it is possible to use the (still in beta...) hand tracking from Varjo directly. They don't provide it through the Varjo SDK, only through OpenXR. But I had the feeling that most of the Unigine Varjo integration was done with the Varjo SDK and doesn't yet have feature parity with the OpenXR integration? If we were to try to implement something with it, do you have any direction for me? Thanks! Axel
silent Posted May 5, 2025

Hi Axel,

Quote: "But I had the feeling that most of the Unigine Varjo integration was done with the Varjo SDK and doesn't yet have feature parity with the OpenXR integration?"

Basic functionality that allows rendering to the Varjo XR-4 through our OpenXR implementation is already available in 2.19.1.2, so you should be able to switch to OpenXR for hand tracking testing. Regarding the XR_EXT_hand_tracking extension: it should also be supported in the engine API (UnigineVRHandTracking.h). However, there is no hand tracking sample available out of the box. We will try to squeeze the sample into the 2.20 release as well, but it may be available a bit later, just after the release itself.

For now, I can share some test code that we used back in 2024 while working on the hand tracking support implementation; I hope you can use it as a reference for development. We used the Valve hand models for reference: https://github.com/ValveSoftware/openxr_engine_plugins/tree/main/assets/valve_hand_models. Bones structure: hand_tracking_research.zip

Since this code is basically from an older version, it may conflict with the existing VR Sample implementation. Also, to enable the hand tracking visualizer you can use the following console command: show_visualizer 1 && vr_hand_tracking_visualizer_enabled 1

Thanks!