Colored Object Visual Tracker for Autonomous Underwater Vehicles
By: Mohd Faid Bin Yahya (PhD Student)
Vision sensors are important for underwater vehicles in close-range applications such as structure inspection and navigation. For navigation, a visual tracker can be implemented on an autonomous underwater vehicle (AUV) to guide the vehicle towards a marker. A few important image processing steps are needed to realize visual tracking, given as follows:
1. image acquisition
2. image pre-processing
3. feature extraction
4. marker detection
This article explains each of these processes needed to undertake visual tracking of colored markers.
Initially, an image is acquired using a camera installed in the AUV. The camera should be placed at the front of the AUV and sealed to withstand water pressure. The camera can be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor. Additionally, the camera should capture at least 15 frames per second to realize a real-time system.

After an image has been captured, it needs to be processed to reduce noise; Gaussian smoothing could be implemented for this. Then, feature extraction is required to obtain the important information from the marker that needs to be tracked. A color filter can be used to threshold the specific color range of the marker. The thresholded image, which originally contains colors, becomes a binary image of black and white: black pixels indicate the background, while white pixels indicate the element of interest.

Finally, a simple detection algorithm based on image moments can be implemented to locate the marker. An illustration of the overall process tested in a lab is shown in Figure 1 below.
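The smoothing and color-thresholding steps described above can be sketched in plain NumPy (in practice OpenCV's cv2.GaussianBlur and cv2.inRange provide the same operations; the frame size, marker size, and red color range below are assumptions chosen only for illustration):

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    # 1-D Gaussian kernel, normalized to sum to 1
    ax = np.arange(size) - size // 2
    k = np.exp(-(ax ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def gaussian_smooth(img, size=5, sigma=1.0):
    # Separable Gaussian blur: convolve rows, then columns, per channel
    k = gaussian_kernel(size, sigma)
    out = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, out)

def color_threshold(img, lo, hi):
    # Binary image: True (white) where every channel falls inside [lo, hi]
    return np.all((img >= np.asarray(lo)) & (img <= np.asarray(hi)), axis=-1)

# Synthetic 100x100 RGB frame with a bright red 20x20 marker (illustrative data)
frame = np.zeros((100, 100, 3))
frame[40:60, 40:60] = (200.0, 20.0, 20.0)

blurred = gaussian_smooth(frame)
mask = color_threshold(blurred, lo=(120, 0, 0), hi=(255, 80, 80))
```

The resulting `mask` is the black-and-white binary image described in the text: background pixels are False (black) and marker pixels are True (white).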
Figure 1: Overall process of the developed visual tracker, showing (from left to right) the original image, the blurred image, the thresholded image, and the detected object of interest.
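The final detection step can be sketched with raw image moments. For a binary image, the centroid (cx, cy) = (M10/M00, M01/M00) locates the marker; this mirrors what cv2.moments provides, though the code below is a plain-NumPy illustration with made-up blob coordinates:

```python
import numpy as np

def marker_centroid(mask):
    # Raw moments of a binary image:
    #   M00 = number of white pixels (area)
    #   M10 = sum of x coordinates, M01 = sum of y coordinates
    ys, xs = np.nonzero(mask)
    m00 = xs.size
    if m00 == 0:
        return None  # no marker pixels detected in this frame
    # Centroid (cx, cy) = (M10 / M00, M01 / M00)
    return float(xs.sum()) / m00, float(ys.sum()) / m00

# Synthetic binary image: white 10x10 blob, rows index y and columns index x
mask = np.zeros((64, 64), dtype=bool)
mask[40:50, 20:30] = True

print(marker_centroid(mask))  # -> (24.5, 44.5)
```

In a tracking loop, the AUV controller would steer to bring this centroid towards the center of the camera frame; returning None lets the caller handle frames where the marker is not visible.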