The prototype for the immersive view was built in Unity using the built-in XR plugins, with MockHMD used for HMD emulation. The video material was rendered on a plane and linked to a 2D texture; Unity's VideoPlayer component handled converting video frames into textures. The plane was bound to camera space. There were no performance issues during development and testing. The asset used was stored locally, though an asset fetched via URL could also be used.
The segmentation model from last week kept crashing despite my efforts, so I built a UNet from scratch. The basic model performs well on the sample space; hyper-parameter optimisation is still pending. After 300 epochs it reached a dice coefficient of 0.84. This UNet works with OpenCV, so in the OpenCV segmentation pipeline only the model-inference and drawing portions need to change. Inference takes roughly 1–2 seconds for a 512×512×3 image.
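As a rough sketch of what that swapped-in inference-and-drawing step could look like: the `unet_predict` stub below is a placeholder (not the actual trained model), and the threshold and overlay colour are illustrative choices, with the drawing done as a NumPy alpha blend.

```python
import numpy as np

def unet_predict(frame):
    """Placeholder for the real UNet forward pass.

    Returns a per-pixel probability map in [0, 1]. The real call would
    run the trained model on the frame; here a fake centred blob stands in.
    """
    h, w, _ = frame.shape
    prob = np.zeros((h, w), dtype=np.float32)
    prob[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4] = 0.9  # fake prediction
    return prob

def segment_and_overlay(frame, threshold=0.5, color=(0, 255, 0), alpha=0.4):
    """Binarise the probability map and blend a colour overlay onto the frame."""
    prob = unet_predict(frame)
    mask = prob > threshold  # boolean segmentation mask
    out = frame.astype(np.float32)
    # Alpha-blend the overlay colour into the masked pixels only.
    out[mask] = (1 - alpha) * out[mask] + alpha * np.array(color, dtype=np.float32)
    return out.astype(np.uint8), mask

frame = np.zeros((512, 512, 3), dtype=np.uint8)  # dummy 512x512x3 input
out, mask = segment_and_overlay(frame)
```

Only `unet_predict` would change when wiring in the real model; the rest of the pipeline (thresholding and drawing) stays the same.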
That's all for today!!!
Hope you had a great week!