Colored Object Visual Tracker for Autonomous Underwater Vehicle

 


 By: Mohd Faid Bin Yahya (PhD Student)

 

 

The use of vision sensors on underwater vehicles is important for close-range applications such as structure inspection and navigation. For navigation, a visual tracker can be implemented on an autonomous underwater vehicle (AUV) to guide the vehicle towards a marker. Several important image processing steps are required to realize visual tracking, given as follows:

1. image acquisition

2. image pre-processing

3. feature extraction

4. marker detection

This article explains each of these processes for visually tracking colored markers.

Initially, an image is acquired using a camera installed in the AUV. The camera should be placed at the front of the AUV and sealed to withstand water pressure. The camera can be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor. Additionally, the camera should be able to capture images at a rate of at least 15 frames per second to realize a real-time system.

After an image has been captured, it needs to be processed to reduce noise; Gaussian smoothing could be implemented [1]. Next, feature extraction is required to obtain the important information from the marker that needs to be tracked. A color filter can be used to threshold the specific color range of the marker. The thresholded image, which originally contains colors, becomes a binary image of black and white: black pixels indicate the background, while white pixels indicate the element of interest.

Finally, a simple detection algorithm based on image moments can be implemented to locate the marker. An illustration of the overall process, tested in a lab, is shown in Figure 1 below.
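The pipeline above can be sketched as follows. This is a minimal illustration, not the AUV implementation: NumPy/SciPy stand in for the camera feed and the image operations, and the synthetic frame, threshold, and sigma values are assumptions.

```python
import numpy as np
from scipy import ndimage

def locate_marker(image, threshold=128, sigma=1.5):
    """Return the (row, col) centroid of the bright marker, or None."""
    # Pre-processing: Gaussian smoothing suppresses sensor noise.
    blurred = ndimage.gaussian_filter(image.astype(float), sigma=sigma)
    # Feature extraction: threshold to a binary image
    # (white = element of interest, black = background).
    binary = blurred > threshold
    # Detection: the zeroth and first image moments give the centroid.
    m00 = binary.sum()
    if m00 == 0:
        return None
    rows, cols = np.nonzero(binary)
    return rows.mean(), cols.mean()

# Synthetic 100x100 frame with a bright square marker centered at (44.5, 64.5).
frame = np.zeros((100, 100), dtype=np.uint8)
frame[40:50, 60:70] = 255
centroid = locate_marker(frame)
```

In a real system the synthetic frame would be replaced by successive camera frames, and the centroid would be fed to the vehicle's guidance loop.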

 


Figure 1: Overall process of the developed visual tracker, showing the original image, blurred image, thresholded image, and detected object of interest (from left to right).

Reference(s):

  1. E.R. Davies (2010). Computer and Machine Vision: Theory, Algorithms, Practicalities. 4th Edition, Elsevier.

H-infinity Loop-Shaping

 


By: Song Yoong Siang (PhD Student)

 

 

                H-infinity loop-shaping is one of the most important examples of a robust control technique. The method minimizes the sensitivity of a system across its frequency spectrum, which guarantees that the system will not deviate greatly from its expected trajectory when disturbances enter the system.

                H-infinity loop-shaping control combines loop shaping, to obtain a trade-off between performance and robust stability, with a particular H-infinity optimization problem that guarantees closed-loop stability and a level of robust stability at all frequencies.

                In the loop-shaping approach, the designer selects a controller that achieves sufficiently high open-loop gain at low frequencies, so that guarantees about closed-loop performance can be made at those frequencies.

                In H-infinity synthesis, the designer specifies closed-loop objectives in terms of requirements on the singular values of weighted closed-loop transfer functions, and a stabilizing controller is obtained that satisfies these requirements.

                The H-infinity loop-shaping control has two main stages:

  1. Loop shaping is used to shape the nominal plant's singular values to give the desired open-loop properties at frequencies of high and low loop gain.
  2. The normalized coprime factor H-infinity problem is used to robustly stabilize the shaped plant.
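The first stage can be illustrated numerically. The sketch below uses SciPy; the plant G, the shaping weight W, and the test frequencies are illustrative assumptions, not taken from the reference.

```python
import numpy as np
from scipy import signal

# Nominal plant G(s) = 1 / (s + 1).
G = signal.TransferFunction([1.0], [1.0, 1.0])
# Shaping weight W(s) = (s + 10) / (s + 0.01): boosts low-frequency loop
# gain (good tracking and disturbance rejection) while leaving high
# frequencies nearly unchanged.
W = signal.TransferFunction([1.0, 10.0], [1.0, 0.01])

def gain(tf, w):
    """|tf(jw)| at a single frequency w (rad/s)."""
    _, mag = signal.freqresp(tf, w=[w])
    return abs(mag[0])

# Shaped open loop L(s) = W(s) G(s); magnitudes multiply at each frequency.
low = gain(W, 0.001) * gain(G, 0.001)     # high gain at low frequency
high = gain(W, 1000.0) * gain(G, 1000.0)  # low gain at high frequency
```

The second stage, robustly stabilizing this shaped plant via the normalized coprime factor problem, is typically left to dedicated synthesis software, as the benefits list below notes.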

                This control technique has the following benefits:

  1. Easy to apply: commercial software handles the hard mathematics.
  2. Easy to implement: standard transfer-function and state-space methods can be used.
  3. Plug and play: no re-tuning is needed on an installation-by-installation basis.

               

Reference(s):

  1. McFarlane, D. C., and Glover, K. (1992). A loop shaping design procedure using H-infinity synthesis. IEEE Transactions on Automatic Control, Vol. 37, No. 6, pp. 759-769.

Microcontroller versus FPGA


By: Mad Helmi Bin Ab. Majid (PhD Student)

 

Introduction

Controlling a machine requires the basic components of sensors, actuators, and a controller. The controller acts as the brain of the machine, controlling and manipulating its operation to meet operational requirements. The most common controllers include the microprocessor, the microcontroller, and the Field Programmable Gate Array (FPGA). Selecting a suitable controller is necessary for smooth operation and to match processing requirements. In this short article, we compare the microcontroller and the FPGA in terms of operational suitability and capability.

Definition

A microcontroller is a small computer that consists of a processor core, memory, and programmable input/output ports, embedded on a single integrated circuit. See Figure 1(a) for the general microcontroller architecture.

An FPGA is a semiconductor device based around a matrix of configurable logic blocks connected through programmable interconnects. It is an integrated circuit that can be programmed in the field after manufacturing. See Figure 1(b) for the general FPGA architecture.

 


 

Figure 1 General architecture: (a) Microcontroller [1] (b) FPGA [2]

 

Differences

Microcontrollers such as Microchip PIC and Atmel ATmega devices are commonly used in many types of applications, whether in education, research, or industrial commercialization. However, some applications require greater processing power, especially when signal, audio, or video processing is embedded in the application. From this point of view, an FPGA is more capable than a microcontroller. FPGAs also work well for digital design in general. Some differences between the microcontroller and the FPGA that can guide the selection of a suitable controller for a given application are summarized in Table 1.

 

Table 1 Differences between microcontroller and FPGA


Conclusion

In summary, a microcontroller is suitable for small processing applications where serial execution is acceptable, while an FPGA provides a better solution for faster, more complex processing applications. Thus, by exploring the basic differences between the microcontroller and the FPGA, one can decide whether a microcontroller- or FPGA-based controller can fulfill the needs of a given system.

                                                                                          

Reference(s):

  1. www.microe.com
  2. www.cs.ucr.edu


 

Tracking of Objects Based on Color

 


By: Mohd Faid Bin Yahya (PhD Student)

 

This article describes a vision tracking system that emphasizes objects' color. The tracking method consists of image acquisition, color space conversion, color thresholding, morphological operations, target detection, and recognition. The flow of the proposed method is shown in Fig. 1.


Fig. 1. Tracking method overview

 

Initially, an image is acquired from a charge-coupled device (CCD) image sensor, in the form of a webcam, at 15 Hz (15 frames per second). The obtained original image, which is in the Red-Green-Blue (RGB) color space, is converted to the Hue-Saturation-Value (HSV) color space for easier image manipulation. Then, color thresholding is used to separate the light sources (targets) from their surroundings [1]. The selected threshold values were 0 to 15 for the H component, 150 to 255 for the S component, and 246 to 255 for the V component. It is important to note that artificial objects are selected as substitutes for light sources, since both can be thresholded with the same HSV color filter values. Thereafter, morphological opening is used to remove noise while preserving the shape and size of the targets. Subsequently, target detection requires three steps: contour finding, contour moment calculation, and contour filtering. Finally, recognition of the target is based on evaluating the positions of the centroids of the detected contours/targets. Fig. 2 shows target tracking at different distances.
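The chain above can be sketched with standard-library and SciPy stand-ins. This is a minimal illustration, not the system described: colorsys and scipy.ndimage replace the original image-processing calls, connected-component labeling stands in for contour finding, and the synthetic red-square frame is an assumption; only the HSV threshold ranges come from the article (rescaled from 0-179/0-255 units to colorsys's 0-1 units).

```python
import colorsys
import numpy as np
from scipy import ndimage

def track_targets(rgb):
    """Return centroids (row, col) of bright red-ish targets in an RGB image."""
    # Color space conversion: RGB -> HSV per pixel (colorsys uses [0, 1] floats).
    h = np.zeros(rgb.shape[:2]); s = np.zeros_like(h); v = np.zeros_like(h)
    for i in range(rgb.shape[0]):
        for j in range(rgb.shape[1]):
            h[i, j], s[i, j], v[i, j] = colorsys.rgb_to_hsv(*(rgb[i, j] / 255.0))
    # Color thresholding: H in [0, 15], S in [150, 255], V in [246, 255],
    # rescaled to the 0-1 range used by colorsys.
    mask = (h <= 15 / 179) & (s >= 150 / 255) & (v >= 246 / 255)
    # Morphological opening removes small noise blobs while preserving targets.
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    # Connected components stand in for contour finding; their centers of
    # mass serve as the target centroids.
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(mask, labels, range(1, n + 1))

# Synthetic 60x60 frame: two bright red squares on a dark background.
frame = np.zeros((60, 60, 3), dtype=np.uint8)
frame[10:18, 10:18] = (255, 10, 10)
frame[40:48, 40:48] = (255, 10, 10)
centroids = track_targets(frame)
```

The recognition step would then compare the relative positions of these centroids against the known marker layout.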

 


 

Fig. 2. Image captured at different distances

 

Reference(s):

  1. E.R. Davies (2010). Computer and Machine Vision: Theory, Algorithms, Practicalities. 4th Edition, Elsevier.

ABC (Artificial Bee Colony) Algorithm – Part 1

By: Mei Jianhong (PhD Student)

As a swarm intelligence (SI) technique, the ABC (Artificial Bee Colony) model was first proposed by Tereshko and Loengarov in 2005, inspired by the foraging behavior of honey bees. Thereafter, Karaboga (2005) developed the ABC algorithm to optimize numerical problems. The honey bees work according to their division of labor and share food source information to obtain the optimal solution to a problem. There are three components in the ABC model: employed foraging bees, unemployed foraging bees, and food sources.

There are two groups of unemployed bees: onlookers and scouts. At first, the employed bees are associated with specific food sources; onlookers then watch the dancing of the employed bees in the hive to obtain food source information and choose a food source. The scout bees are in charge of searching for food sources randomly. The employed bees and onlookers keep exploiting the nectar of the food sources until the sources are exhausted. An employed bee whose food source is exhausted then becomes a scout bee and searches for a new food source. The positions of the food sources represent possible solutions of the problem, and the nectar amount of a food source represents the fitness of the associated solution. Thus, the number of employed bees is equal to the number of food sources, and they are associated one-to-one. The general flow of the ABC algorithm is shown below:
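The employed/onlooker/scout cycle can be sketched as a short program. This is a minimal illustration under assumptions not in the references: the sphere function as the problem, a non-negative objective (so that 1/(1+f) works as a fitness weight), and illustrative colony parameters.

```python
import random

def abc_minimize(f, dim=2, n_sources=10, limit=20, cycles=200, lo=-5.0, hi=5.0):
    """Minimize a non-negative function f with a basic ABC loop."""
    rng = random.Random(0)
    def new_source():
        return [rng.uniform(lo, hi) for _ in range(dim)]
    # One employed bee per food source (positions = candidate solutions).
    sources = [new_source() for _ in range(n_sources)]
    fits = [f(x) for x in sources]
    trials = [0] * n_sources  # counts failed improvement attempts

    def try_neighbor(i):
        # Perturb one dimension toward/away from a random partner source.
        k = rng.randrange(n_sources)
        d = rng.randrange(dim)
        cand = sources[i][:]
        cand[d] += rng.uniform(-1.0, 1.0) * (cand[d] - sources[k][d])
        fc = f(cand)
        if fc < fits[i]:                       # greedy selection
            sources[i], fits[i], trials[i] = cand, fc, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        # Employed bees: one neighborhood search per food source.
        for i in range(n_sources):
            try_neighbor(i)
        # Onlooker bees: choose sources proportionally to fitness
        # (higher nectar amount = lower objective value).
        weights = [1.0 / (1.0 + ft) for ft in fits]
        for _ in range(n_sources):
            i = rng.choices(range(n_sources), weights=weights)[0]
            try_neighbor(i)
        # Scout bees: abandon exhausted sources (trial counter over limit).
        for i in range(n_sources):
            if trials[i] > limit:
                sources[i], trials[i] = new_source(), 0
                fits[i] = f(sources[i])
    best = min(range(n_sources), key=fits.__getitem__)
    return sources[best], fits[best]

sphere = lambda x: sum(v * v for v in x)
best_x, best_f = abc_minimize(sphere)
```

The roulette-wheel onlooker selection and the trial-counter scout rule are the two mechanisms that balance exploitation of good sources against exploration of new ones.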

Reference(s):

  1. http://www.scholarpedia.org/article/Artificial_bee_colony_algorithm
  2. Karaboga, D., Gorkemli, B., Ozturk, C., & Karaboga, N. (2014). A comprehensive survey: artificial bee colony (ABC) algorithm and applications. Artificial Intelligence Review, 42(1), 21-57.