OpenCVForUnity Contour Tracking AR using the Handpose Estimation example
This demo combines the Handpose Estimation example that ships with OpenCVForUnity with the MarkerBasedAR example found in the Unity Asset Store.
The Handpose Estimation example prompts you to click on the screen to take a color sample of whatever you clicked on. It then uses that color to detect the contours of similarly colored regions, so the contour of whatever you click on gets tracked. The example also computes a bounding box (boundRect) around the contour, which yields four corner points. I took those four corners of the boundRect, already being calculated in the Handpose example, as my 2D image points and fed them into solvePnP. For the 3D object points I used the same points provided in the MarkerBasedAR example.
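The corner pairing described above can be sketched in Python (the actual project is Unity C#; this is just the same logic, not the project's code). The unit-square object points below mirror what MarkerBasedAR-style examples typically use, but that is an assumption — check the 3D corner values (e.g. m_markerCorners3d) in that example.

```python
# Sketch (not the project's C# code): turn a boundRect into the 2D image
# points and 3D object points that a solvePnP call expects.

def rect_to_image_points(x, y, w, h):
    """Four corners of a bounding rect, clockwise from the top-left."""
    return [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]

# 3D object points: a square of side 1 on the Z=0 plane, centered at the
# origin, ordered to match the image points above. These values are an
# assumption standing in for the MarkerBasedAR example's corner list.
OBJECT_POINTS = [(-0.5, -0.5, 0.0), (0.5, -0.5, 0.0),
                 (0.5, 0.5, 0.0), (-0.5, 0.5, 0.0)]

if __name__ == "__main__":
    img_pts = rect_to_image_points(10, 20, 100, 50)
    print(img_pts)
    # These two point sets, plus the camera intrinsics, are what
    # Calib3d.solvePnP (C#) / cv2.solvePnP (Python) takes to recover
    # the pose of the tracked region.
```

The ordering matters: image point i must correspond to object point i, or solvePnP will return a nonsensical pose.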