CHONG, Wan Ho
January 19, 2022 3:46 pm
Good afternoon, I saw you made use of an optical sensor and an IMU to eliminate the problems of drifting and occlusion. To what extent can these problems be solved, and how are you going to fuse the IMU data with the Kinect data in further processing or calculations to solve them? Moreover, will the future development of this technology apply to VR? Thanks.
Thank you for your question. The fusion algorithm is currently under development. Our approach is to create a skeletal model based on the IMU and Kinect data. At this stage, we have decided to implement a complementary filter that weights each sensor's data at a given time in order to fuse the two sensors. As for whether this fully solves the problem, that requires further evaluation in our game. Our motion tracking device currently seems too clunky to be used as a VR peripheral, but hopefully our work can be expanded upon and contribute to later projects.
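To illustrate the complementary-filter idea mentioned above, here is a rough sketch in Python. The function name, the confidence-based weighting scheme, and the parameter values are illustrative assumptions for this reply, not the project's final implementation:

```python
import numpy as np

def complementary_fuse(imu_angle, kinect_angle, kinect_confidence, alpha=0.98):
    """Blend an IMU joint angle with a Kinect joint angle (degrees).

    The IMU estimate is smooth but drifts over time; the Kinect estimate
    is absolute but drops out under occlusion. A confident Kinect reading
    therefore pulls the fused estimate toward the optical measurement.
    """
    # Weight given to the IMU: high when Kinect confidence is low,
    # so the optical data corrects accumulated IMU drift when available.
    w = alpha * (1.0 - kinect_confidence)
    return w * imu_angle + (1.0 - w) * kinect_angle
```

With full Kinect confidence the fused angle follows the optical measurement; when the Kinect loses the joint, the estimate falls back almost entirely on the IMU.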
Thanks for your reply. Looking forward to your future evaluation.
YAN, Kai Hang
January 19, 2022 2:53 pm
Good afternoon, I saw that you used Madgwick filters during data processing. Can you give more detail on how the filter contributes to better accuracy, and its role in the algorithm?
Thanks
The Madgwick filter fuses the IMU's acceleration measurement relative to gravity with the gyroscope's rate of change. Instead of simply integrating the gyroscope readings, it uses gradient descent to optimize the orientation quaternion [w, qx, qy, qz], so it can better represent the orientation of the sensor.
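As a rough sketch of that gradient-descent step, here is a minimal accelerometer-plus-gyroscope Madgwick update in Python. The gain `beta` and sample period `dt` are placeholder values, and a full implementation would also incorporate a magnetometer term:

```python
import numpy as np

def madgwick_update(q, gyro, accel, beta=0.1, dt=0.01):
    """One accelerometer-plus-gyroscope Madgwick update step.

    q     : orientation quaternion [w, qx, qy, qz]
    gyro  : angular rate [gx, gy, gz] in rad/s
    accel : accelerometer reading (any scale; normalised below)
    """
    w, x, y, z = q
    gx, gy, gz = gyro
    ax, ay, az = np.asarray(accel, dtype=float) / np.linalg.norm(accel)

    # Objective: the quaternion should rotate gravity [0, 0, 1] onto
    # the measured acceleration direction.
    f = np.array([
        2.0 * (x * z - w * y) - ax,
        2.0 * (w * x + y * z) - ay,
        2.0 * (0.5 - x * x - y * y) - az,
    ])
    # Jacobian of the objective with respect to [w, x, y, z].
    J = np.array([
        [-2.0 * y, 2.0 * z, -2.0 * w, 2.0 * x],
        [ 2.0 * x, 2.0 * w,  2.0 * z, 2.0 * y],
        [ 0.0, -4.0 * x, -4.0 * y, 0.0],
    ])
    step = J.T @ f
    norm = np.linalg.norm(step)
    if norm > 0.0:
        step /= norm  # normalised gradient direction

    # Quaternion rate from the gyroscope, 0.5 * q * (0, gx, gy, gz),
    # minus the gradient-descent correction toward the accelerometer.
    q_dot = 0.5 * np.array([
        -x * gx - y * gy - z * gz,
         w * gx + y * gz - z * gy,
         w * gy - x * gz + z * gx,
         w * gz + x * gy - y * gx,
    ]) - beta * step

    q = np.asarray(q, dtype=float) + q_dot * dt
    return q / np.linalg.norm(q)  # keep unit norm
```

The gyroscope term predicts the short-term motion, while the gradient step slowly pulls the quaternion back toward the accelerometer's gravity reference, which is what suppresses the drift of pure integration.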
Nice work! Will you apply this project to the metaverse?
Thank you for your question. Before the device can be applied to the metaverse, we need to improve the design of the IMU tracking suit. At this stage, the suit is wired to the PC, meaning the cable can somewhat obstruct movement. If we can make the suit wireless, it will be more user-friendly and immersive for metaverse/VR applications.