**5.1 System Implementation**

**5.1.3 Hand Segmentation**

In the hand segmentation stage, the filtered image from the previous stage, which contains only the hand region, is passed to the find contour function *cv2.findContours()*, and the contour with the maximum area in the image is selected. Contour approximation is then performed with *cv2.approxPolyDP()* to approximate the contour shape and smooth the contour edges, using an epsilon derived from the approximated curve. Eventually, the hand contour is drawn with *cv2.drawContours()* and returned to the main function for further processing.

**Figure 5.1.3-F1 Extracted Hand Contour **

CHAPTER 5: IMPLEMENTATION & TESTING

**5.1.4 Feature Extraction **

Feature extraction is one of the core processing stages of the real-time gesture recognition system: it transforms the segmented image into a set of measures that will be used for analysing and determining the meaning of the gesture in the gesture recognition stage. Firstly, the centre of the hand can be found from the moments of the contour, obtained with the function cv2.moments(), and the centre of mass of the hand contour is calculated with the following formula:

(cx, cy) = (m10 / m00, m01 / m00)

**Figure 5.1.4-F1 Centre Mass of Hand **
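As a sanity check on the formula, the centre of mass can also be computed directly from the raw moments of a binary mask with NumPy alone. The helper name `hand_centre` is illustrative; in the pipeline, *cv2.moments()* supplies m00, m10 and m01.

```python
import numpy as np

def hand_centre(mask):
    """Centre of mass of a binary (0/1) mask from raw image moments,
    mirroring what cv2.moments() returns: m00 is the pixel count,
    m10 the sum of x coordinates, m01 the sum of y coordinates."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)        # zeroth-order moment (area of the blob)
    m10 = xs.sum()       # first-order moment about x
    m01 = ys.sum()       # first-order moment about y
    return m10 / m00, m01 / m00   # (cx, cy)

mask = np.zeros((100, 100), dtype=np.uint8)
mask[20:41, 30:51] = 1           # square blob centred at (40, 30)
cx, cy = hand_centre(mask)
```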

Next, the convex hull can be found using *cv2.convexHull()* in order to obtain the palm radius, calculated from the maximum Euclidean distance between the centre of the palm and the most extreme points in the convex hull. Scikit-learn provides the function *pairwise.euclidean_distances()* to find the distances from one point to multiple points. The radius of the palm is then assumed to be 40% of the maximum distance from the hand centre.
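The radius calculation can be sketched in a few lines. The 40% scale factor comes from the text; the helper name is illustrative, and plain NumPy stands in for scikit-learn's *pairwise.euclidean_distances()* to keep the sketch dependency-light.

```python
import numpy as np

def palm_radius(centre, hull_points, scale=0.4):
    """Assumed palm radius: 40% of the maximum Euclidean distance from
    the hand centre to the convex hull points."""
    dists = np.linalg.norm(np.asarray(hull_points, float)
                           - np.asarray(centre, float), axis=1)
    return scale * dists.max()

# Hull of a 100x100 square, with the hand centre in the middle.
r = palm_radius((50, 50), [(0, 0), (100, 0), (100, 100), (0, 100)])
```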

**Figure 5.1.4-F2 Convex Hull and Radius **

Besides that, the positions of the fingertips can be found by applying several steps to the hand centre, palm radius and convex hull points that have been extracted. The first step is to eliminate the convex hull points that are very close to each other.

Secondly, the convex hull points that are too near to or too far from the centre of the hand are eliminated using minimum and maximum finger-length thresholds, to ensure that only the finger parts are detected. Lastly, the fingertip detection result is constrained so that no more than 5 fingers are reported.
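The three filtering steps can be sketched as below; the merge distance and the example threshold values are illustrative assumptions, not values from the text.

```python
import numpy as np

def detect_fingertips(hull_points, centre, min_len, max_len,
                      merge_dist=20, max_fingers=5):
    """Sketch of the three fingertip-filtering steps:
    1) drop hull points that lie very close to an already-kept point,
    2) keep only points whose distance to the hand centre lies between
       the minimum and maximum finger-length thresholds,
    3) cap the result at five fingertips."""
    centre = np.asarray(centre, float)
    kept = []
    for p in (np.asarray(p, float) for p in hull_points):
        if all(np.linalg.norm(p - q) > merge_dist for q in kept):
            kept.append(p)
    tips = [p for p in kept if min_len < np.linalg.norm(p - centre) < max_len]
    return tips[:max_fingers]

# One true fingertip, one near-duplicate, one too close, one too far.
tips = detect_fingertips([(0, 100), (2, 100), (0, 10), (0, 500)],
                         centre=(0, 0), min_len=50, max_len=200)
```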

**Figure 5.1.4-F3 Fingertips Detection **

In addition, the hull area and hand area also need to be acquired using cv2.contourArea() in order to calculate the area ratio, which is the percentage of the convex hull area that is not covered by the hand, by the following formula:

Area Ratio = (Hull Area − Hand Area) / Hand Area × 100
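A direct translation of the formula (in practice, both areas would come from cv2.contourArea()):

```python
def area_ratio(hull_area, hand_area):
    """Percentage of the convex hull not covered by the hand,
    per the formula above."""
    return (hull_area - hand_area) / hand_area * 100

ratio = area_ratio(1500.0, 1000.0)   # hull is 50% larger than the hand
```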

Other than that, a convexity defect is a cavity in the convex hull, formed when there are two or more hull points in the convex hull, i.e. the fingers. The defects can be found using the function *cv2.convexityDefects()*, which returns four values: the start point, the end point, the farthest point and the approximate distance to the farthest point. In this case, only the first three values are used in calculating the convexity defects.

**Figure 5.1.4-F4 Start, End and Farthest Point in the Convexity Defect **


After that, the lengths between the three points, represented by a, b and c, are calculated using the distance formula below:

a = √((start[0] − end[0])² + (start[1] − end[1])²)

b = √((start[0] − far[0])² + (start[1] − far[1])²)

c = √((end[0] − far[0])² + (end[1] − far[1])²)

Once the lengths a, b and c have been found, the angle between the two fingers can be determined using the Cosine rule:

a² = b² + c² − 2bc·cos A

A = cos⁻¹((b² + c² − a²) / (2bc))
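The distance formula and Cosine rule above translate directly into a short helper; the function name is illustrative.

```python
import math

def defect_angle(start, end, far):
    """Angle (degrees) at the farthest defect point via the Cosine
    rule: side a spans the two hull points, b and c join them to the
    far point, and A is the angle opposite a."""
    a = math.dist(start, end)
    b = math.dist(start, far)
    c = math.dist(end, far)
    return math.degrees(math.acos((b * b + c * c - a * a) / (2 * b * c)))

angle = defect_angle((1, 0), (0, 1), (0, 0))   # right angle at origin
```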

The distance between each convexity defect and the convex hull is also taken as a measure when counting the convexity points, in order to eliminate defect points that are too close to the convex hull. It can be computed with the following formulas:

s = (a + b + c) / 2

Area = √(s(s − a)(s − b)(s − c))

d = 2 · Area / a
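These formulas, Heron's formula followed by the triangle-height identity, can be checked with a small example; the function name is illustrative.

```python
import math

def defect_depth(start, end, far):
    """Distance from the farthest point to the hull chord joining
    start and end: Heron's formula gives the triangle area from
    a, b, c, and d = 2*Area/a is the height above side a."""
    a = math.dist(start, end)
    b = math.dist(start, far)
    c = math.dist(end, far)
    s = (a + b + c) / 2                      # semi-perimeter
    area = math.sqrt(s * (s - a) * (s - b) * (s - c))
    return 2 * area / a

d = defect_depth((0, 0), (4, 0), (2, 3))     # far point 3 px above chord
```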

The last step takes the results of the two previous steps as parameters to determine whether a point is a convexity defect point between two fingers. If the results fulfil the following statement, the point is considered a defect point:

If (angle between the two fingers ≤ 90°) and (distance between the convexity defect and the convex hull > 45)
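Combining the angle and depth calculations gives the full defect-point check; the thresholds of 90° and 45 are taken from the statement above, while the function name is illustrative.

```python
import math

def is_finger_valley(start, end, far, max_angle=90.0, min_depth=45.0):
    """A defect counts as the valley between two fingers when the
    angle at the far point is at most 90 degrees AND its depth below
    the hull chord exceeds 45."""
    a = math.dist(start, end)
    b = math.dist(start, far)
    c = math.dist(end, far)
    angle = math.degrees(math.acos((b * b + c * c - a * a) / (2 * b * c)))
    s = (a + b + c) / 2
    depth = 2 * math.sqrt(s * (s - a) * (s - b) * (s - c)) / a
    return angle <= max_angle and depth > min_depth

deep = is_finger_valley((0, 0), (20, 0), (10, 100))   # narrow, deep valley
shallow = is_finger_valley((0, 0), (20, 0), (10, 5))  # wide, shallow dent
```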

**Figure 5.1.4-F5 Convexity Defect Points of Hand **

The last feature to be extracted is the angle of the finger, which involves either one finger or two fingers, each with a different formula. If there is only one finger, the angle is determined from the hand centre coordinate (cx, cy) and the fingertip coordinate (x, y) using the following formula:

A = tan⁻¹((cy − y) / (cx − x))

**Figure 5.1.4-F6 Calculate the Angle of One Finger **

Otherwise, if there are two fingers, the angle is determined from the two fingertip coordinates and the hand centre coordinate using the Cosine rule, as mentioned above.
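The one-finger formula can be sketched as below; atan2 is used in place of a bare arctan so the vertical case (cx == x) and the sign of the angle are handled, which is a small assumed refinement over the formula as written.

```python
import math

def finger_angle(centre, fingertip):
    """One-finger angle from the hand centre (cx, cy) to the
    fingertip (x, y), in degrees."""
    (cx, cy), (x, y) = centre, fingertip
    return math.degrees(math.atan2(cy - y, cx - x))

angle = finger_angle((100, 100), (100, 0))   # fingertip straight above
```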


Eventually, all the extracted features are returned to the main function as a set of measures that will be used for analysing and determining the meaning of the gesture in the gesture recognition stage.

**Figure 5.1.4-F7 Display of All Extracted Features **

**5.1.5 Gesture Recognition **

Gesture recognition is the final stage of the real-time gesture recognition system. It applies a set of rules to the hand features extracted in the previous stage to build the gesture recognition model, in order to determine the gesture and to discriminate recognition errors as much as possible. There are 8 sets of rules for 8 different gestures, described in the table below:

| Gesture | Defects | Fingertips | Hand area | Area ratio (%) | Finger angle (°) | Angle between fingers (°) |
| --- | --- | --- | --- | --- | --- | --- |
| Thumb left – Reject call | 0 | 1 | >2000 | 15–30 | 0–50 | - |
| Three fingers – Volume up | 2 | 3 | >2000 | 55–80 | - | - |
| Thumb with one finger – Volume down | 1 | 2 | >2000 | 28–70 | - | 50–90 |
| Two fingers – Temperature up | 1 | 2 | >2000 | 28–70 | - | 0–50 |
| One finger – Temperature down | 0 | 1 | >2000 | 30–50 | 70–90, −70–−90 | - |

**Table 5.1.5-T1 Gesture recognition model **
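The rule lookup can be sketched as a first-match search over the table rows. This is an illustrative toy matcher, not the full model: it uses only the columns shared by every row (defect count, fingertip count, hand area, area ratio) and omits the finger-angle columns that disambiguate overlapping rows such as Volume down vs Temperature up.

```python
def classify(defects, fingers, hand_area, area_ratio):
    """First-match rule lookup over a subset of Table 5.1.5-T1."""
    rules = [
        # (label, defects, fingertips, min hand area, area-ratio range)
        ("Reject call",      0, 1, 2000, (15, 30)),
        ("Volume up",        2, 3, 2000, (55, 80)),
        ("Volume down",      1, 2, 2000, (28, 70)),
        ("Temperature down", 0, 1, 2000, (30, 50)),
    ]
    for label, d, f, min_area, (lo, hi) in rules:
        if (defects == d and fingers == f
                and hand_area > min_area and lo <= area_ratio <= hi):
            return label
    return None

gesture = classify(defects=2, fingers=3, hand_area=2500, area_ratio=60)
```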

**5.2 System Testing **

System testing is the process of evaluating whether the system fulfils the system requirements, i.e. the expected functionalities to be performed by the system, and of evaluating the system performance using an appropriate standard. For this project, there is functional testing, which evaluates the system functionality, and non-functional testing, which evaluates the system performance in terms of average recognition rate and classification performance.