Matlab Project: Gesture Control of PC using Colored Finger Gesture [ Report+code]



                      MAJOR PROJECT REPORT
                                               
                                    on

Color Control of Laptop using Webcam

Using  

MATLAB





                                                                 CONTENTS

 1. ABSTRACT
 2. INTRODUCTION
 3. NEED OF HAND GESTURE RECOGNITION
 4. SYSTEM DESCRIPTION
 5. BASIC BLOCK DIAGRAM OF THE SYSTEM
 6. CAPTURING THE REAL TIME VIDEO
 7. FLIPPING OF IMAGES
 8. CONVERSION OF FLIPPED IMAGE TO GRAY SCALE IMAGE
 9. COLOR DETECTION
10. CONVERSION OF GRAY SCALE IMAGE TO BINARY
11. TRACKING THE MOUSE POINTER
12. PERFORMING ACTION
13. IMAGE TYPES
14. MATLAB IMPLEMENTATION
15. MATRICES
16. MATLAB CODES
17. PROBLEMS AND DRAWBACKS
18. CONCLUSION




Hello friends! Here I am sharing my best MATLAB project. In this project, I control my laptop with my colored fingers. We use digital image processing concepts to process the information in webcam images and then control Windows functions through the Java Robot class.

You will also like my other Matlab Projects here:) Please check them too.







 

ABSTRACT:


This project presents an approach to developing a real-time hand gesture recognition system enabling human-computer interaction. It is "vision based": it uses only a webcam and computer vision (CV) techniques, such as image processing, to recognize several hand gestures. The applications of real-time hand gesture recognition are numerous, because it can be used almost anywhere we interact with computers, from small everyday applications to domain-specific specialized ones. At its current level our project is useful to society, and it can be expanded further for ready use at the industrial level as well. Gesture recognition is an area of active research in computer vision. Existing systems detect the hand primarily with some type of marker; our system, in contrast, recognizes the hand image in real time.

         
          INTRODUCTION:
       

In our work, we have tried to control mouse cursor movement and click events using a camera, based on a color detection technique. Real-time video is captured using a web camera, and the user wears colored tapes to provide information to the system. Individual frames of the video are processed separately. The processing involves an image subtraction algorithm to detect colors; once the colors are detected, the system tracks the cursor and performs control actions.
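The image-subtraction color detection mentioned above can be sketched in a few lines. This is a minimal, hypothetical pure-Python illustration (the frame layout, function name, and threshold value are invented for the example; the actual project uses MATLAB's imsubtract and im2bw):

```python
def detect_color(frame, channel, thresh):
    """Binary mask of pixels where the chosen color channel dominates.

    frame   -- list of rows; each pixel is an (r, g, b) tuple in [0, 1]
    channel -- 0 = red, 1 = green, 2 = blue
    thresh  -- subtraction threshold, e.g. 0.2
    """
    mask = []
    for row in frame:
        mask_row = []
        for (r, g, b) in row:
            gray = 0.2989 * r + 0.5870 * g + 0.1140 * b  # rgb2gray weights
            diff = (r, g, b)[channel] - gray             # the imsubtract step
            mask_row.append(1 if diff > thresh else 0)   # the im2bw step
        mask.append(mask_row)
    return mask

# A 2x2 frame: top-left pixel is strongly red, the rest are gray shades.
frame = [[(1.0, 0.0, 0.0), (0.5, 0.5, 0.5)],
         [(0.9, 0.9, 0.9), (0.2, 0.2, 0.2)]]
print(detect_color(frame, 0, 0.2))  # → [[1, 0], [0, 0]]
```

Subtracting the grayscale image instead of thresholding the raw channel is what keeps white and gray areas (where all channels are high) out of the mask.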


           In today’s computer age, every individual depends on computers to perform most of their day-to-day tasks. The major input devices used while operating a computer are the keyboard and mouse, but a wide range of health problems caused by constant, continuous work with the computer affect many people nowadays. Direct use of the hands as an input device is an attractive method for providing natural human-computer interaction, which has evolved from text-based interfaces through 2D graphical interfaces and multimedia-supported interfaces to fully fledged multi-participant virtual environment (VE) systems. Since hand gestures are a completely natural form of communication, they do not adversely affect the health of the operator, as excessive use of the keyboard and mouse can. Imagine the human-computer interaction of the future: a 3D application where you can move and rotate objects simply by moving and rotating your hand, all without touching any input device. In this paper, a review of vision-based hand gesture recognition is presented.



         Need of Hand Gesture Recognition:



Generally, hand gesture recognition technology is implemented using "data gloves" or "color pins", which leads to additional cost and lack of availability among the majority of users; using additional devices also involves more maintenance. A webcam, by contrast, is an easily available device, and today every laptop ships with an integrated webcam. In our project, we implement a hand gesture recognizer capable of detecting a moving hand and its gesture in webcam frames. In the future, humans, who have so far communicated with computers through mice, keyboards, various user interfaces, and virtual environments, may use their bare hands to interact with machines without any mediator, wanting to be more natural and comfortable. Given the above, recognition of hand gestures and postures is a satisfactory first step toward replacing keyboards, mice, and joysticks. Repetitive strain injuries, such as carpal tunnel syndrome, are very common nowadays among computer users and are caused by excessive use of the keyboard and mouse. Use of gesture recognition technology will help prevent this in the future.



         SYSTEM DESCRIPTION:


Following are the steps in our approach:
(i)     Capturing real-time video using a web camera.
(ii)    Processing the individual image frames.
(iii)   Flipping each image frame.
(iv)    Converting each frame to a grayscale image.
(v)     Detecting and extracting the different colors (RGB) from the flipped grayscale image.
(vi)    Converting the detected image into a binary image.
(vii)   Finding the region of the image and calculating its centroid.
(viii)  Tracking the mouse pointer using the coordinates obtained from the centroid.
(ix)    Simulating the mouse click and scroll events.
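Steps (vii) and (viii) above can be sketched as follows. This is a minimal, hypothetical pure-Python illustration (all names and the tiny mask are invented for the example); the 1.5x gain mirrors the scaling used in the MATLAB code in this report:

```python
def centroid(mask):
    """Centroid (x, y) of the 'on' pixels in a binary mask."""
    xs, ys = [], []
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no blob detected in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def to_screen(c, frame_w, frame_h, screen_w, screen_h, gain=1.5):
    """Map a frame-space centroid to screen coordinates.

    A gain > 1 lets a small hand movement span the whole screen,
    as in jRobot.mouseMove(1.5 * centroid * screenSize / frameSize).
    """
    cx, cy = c
    return (gain * cx * screen_w / frame_w,
            gain * cy * screen_h / frame_h)

mask = [[0, 1, 1],
        [0, 1, 1],
        [0, 0, 0]]
c = centroid(mask)                     # (1.5, 0.5)
print(to_screen(c, 3, 3, 1920, 1080))  # → (1440.0, 270.0)
```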


                  MATLAB IMPLEMENTATION



  


                                       MATLAB CODE:



function gesturerecognisition(redThresh, greenThresh, blueThresh, numFrame)
warning('off','vision:transition:usesOldCoordinates');

if nargin < 1                % default color thresholds and frame budget
    redThresh = 0.22;        % threshold for red detection
    greenThresh = 0.14;      % threshold for green detection
    blueThresh = 0.18;       % threshold for blue detection
    numFrame = 1000;         % number of frames to process before exiting
end

cam = imaqhwinfo;
cameraName = char(cam.InstalledAdaptors(end));
cameraInfo = imaqhwinfo(cameraName);
cameraId = cameraInfo.DeviceInfo.DeviceID(end);
cameraFormat = char(cameraInfo.DeviceInfo.SupportedFormats(end));

jRobot = java.awt.Robot;     % Java Robot simulates mouse moves, clicks and scrolls
vidDevice = imaq.VideoDevice(cameraName, cameraId, cameraFormat, ...
                    'ReturnedColorSpace', 'RGB');

vidInfo = imaqhwinfo(vidDevice); 
screenSize = get(0,'ScreenSize');
hblob = vision.BlobAnalysis('AreaOutputPort', false, ...
                                'CentroidOutputPort', true, ...
                                'BoundingBoxOutputPort', true, ...
                                'MaximumBlobArea', 3000, ...
                                'MinimumBlobArea', 100, ...
                                'MaximumCount', 3);
hshapeinsBox = vision.ShapeInserter('BorderColorSource', 'Input port', ...
                                    'Fill', true, ...
                                    'FillColorSource', 'Input port', ...
                                    'Opacity', 0.4);
hVideoIn = vision.VideoPlayer('Name', 'Final Video', ...
                                'Position', [100 100 vidInfo.MaxWidth+20 vidInfo.MaxHeight+30]);
nFrame = 0;
lCount = 0; rCount = 0; dCount = 0;   % consecutive-frame counters for 1/2/3 blue blobs
sureEvent = 5;                        % frames a gesture must persist before it fires
iPos = vidInfo.MaxWidth/2;            % previous green-marker position, for scroll direction


while (nFrame < numFrame)
    rgbFrame = step(vidDevice);             % grab one frame from the webcam
    rgbFrame = flipdim(rgbFrame,2);         % mirror the frame horizontally
    % For each color: subtract the grayscale image from that channel,
    % threshold to a binary mask, then find blobs and their centroids
    diffFrameRed = imsubtract(rgbFrame(:,:,1), rgb2gray(rgbFrame));
    binFrameRed = im2bw(diffFrameRed, redThresh);
    [centroidRed, bboxRed] = step(hblob, binFrameRed);

    diffFrameGreen = imsubtract(rgbFrame(:,:,2), rgb2gray(rgbFrame));
    binFrameGreen = im2bw(diffFrameGreen, greenThresh);
    [centroidGreen, bboxGreen] = step(hblob, binFrameGreen);

    diffFrameBlue = imsubtract(rgbFrame(:,:,3), rgb2gray(rgbFrame));
    binFrameBlue = im2bw(diffFrameBlue, blueThresh);
    [~, bboxBlue] = step(hblob, binFrameBlue);
   
    if length(bboxRed(:,1)) == 1                   % one red blob: move the mouse pointer
        jRobot.mouseMove(1.5*centroidRed(:,1)*screenSize(3)/vidInfo.MaxWidth, 1.5*centroidRed(:,2)*screenSize(4)/vidInfo.MaxHeight);
    end
    if ~isempty(bboxBlue(:,1))
        if length(bboxBlue(:,1)) == 1              % one blue blob: left click
            lCount = lCount + 1;
            if lCount == sureEvent                 % gesture held long enough
                jRobot.mousePress(16);             % 16 = InputEvent.BUTTON1_MASK (left)
                pause(0.1);
                jRobot.mouseRelease(16);
            end
        elseif length(bboxBlue(:,1)) == 2          % two blue blobs: right click
            rCount = rCount + 1;
            if rCount == sureEvent
                jRobot.mousePress(4);              % 4 = InputEvent.BUTTON3_MASK (right)
                pause(0.1);
                jRobot.mouseRelease(4);
            end
        elseif length(bboxBlue(:,1)) == 3          % three blue blobs: double click
            dCount = dCount + 1;
            if dCount == sureEvent
                jRobot.mousePress(16);
                pause(0.1);
                jRobot.mouseRelease(16);
                pause(0.2);
                jRobot.mousePress(16);
                pause(0.1);
                jRobot.mouseRelease(16);
            end
        end
    else                                           % no blue blob: reset the counters
        lCount = 0; rCount = 0; dCount = 0;
    end
    if ~isempty(bboxGreen(:,1))                    % green marker: scroll wheel
        if (mean(centroidGreen(:,2)) - iPos) < -2  % marker moved one way: scroll up
            jRobot.mouseWheel(-1);
        elseif (mean(centroidGreen(:,2)) - iPos) > 2
            jRobot.mouseWheel(1);                  % marker moved the other way: scroll down
        end
        iPos = mean(centroidGreen(:,2));
    end
    vidIn = step(hshapeinsBox, rgbFrame, bboxRed,single([1 0 0]));
    vidIn = step(hshapeinsBox, vidIn, bboxGreen,single([0 1 0]));
    vidIn = step(hshapeinsBox, vidIn, bboxBlue,single([0 0 1]));
    step(hVideoIn, vidIn);
    nFrame = nFrame+1;
end

release(hVideoIn);
release(vidDevice);
clc;
end
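The sureEvent counter in the code above acts as a debouncer: a click fires only once the same blue-blob count has persisted for several consecutive frames, which filters out single-frame detection noise. A minimal, hypothetical pure-Python sketch of that idea (the class and all names are invented for illustration):

```python
class GestureDebouncer:
    """Fire an action only after a gesture persists for sure_event frames."""

    def __init__(self, sure_event=5):
        self.sure_event = sure_event
        self.counts = {1: 0, 2: 0, 3: 0}  # blob count -> consecutive frames seen

    def update(self, num_blobs):
        """Feed one frame's blue-blob count; return the action fired, if any."""
        if num_blobs not in self.counts:
            self.counts = {1: 0, 2: 0, 3: 0}  # no gesture: reset all counters
            return None
        self.counts[num_blobs] += 1
        if self.counts[num_blobs] == self.sure_event:
            return {1: "left_click", 2: "right_click", 3: "double_click"}[num_blobs]
        return None

deb = GestureDebouncer(sure_event=3)
print([deb.update(n) for n in [1, 1, 1, 1, 0]])
# → [None, None, 'left_click', None, None]
```

Firing only when the counter equals (not exceeds) the threshold also guarantees each held gesture triggers exactly one click until the counters are reset.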

  PROBLEMS AND DRAWBACKS:


Since the system is based on image capture through a webcam, it is dependent on illumination to a certain extent. Furthermore, the presence of other colored objects in the background might cause the system to give an erroneous response. Although this problem can be reduced by tuning the threshold values and other parameters of the system, it is still advised that the operating background be light and free of bright colored objects. The system might run slower on computers with low computational capabilities, because it performs many complex calculations in a very small amount of time; however, a standard PC or laptop has the computational power required for optimum performance. Another issue is that the system might run slowly if the resolution of the camera is too high. This can be solved by reducing the resolution of the image within the system.
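One simple way to reduce the resolution is to keep only every second pixel in each dimension. A minimal, hypothetical pure-Python sketch (the function name and frame layout are invented for illustration; in MATLAB the same effect can be had by indexing, e.g. frame(1:2:end, 1:2:end, :)):

```python
def downsample(frame, factor=2):
    """Keep every factor-th pixel in each dimension of a 2-D frame."""
    return [row[::factor] for row in frame[::factor]]

frame = [[ 1,  2,  3,  4],
         [ 5,  6,  7,  8],
         [ 9, 10, 11, 12],
         [13, 14, 15, 16]]
print(downsample(frame))  # → [[1, 3], [9, 11]]
```

This quarters the pixel count per 2x step, at the cost of coarser blob centroids.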



 CONCLUSION:



In this paper, an object-tracking-based virtual mouse application has been developed and implemented using a webcam. The system has been implemented in the MATLAB environment using the MATLAB Image Processing Toolbox. This technology has wide applications in the fields of augmented reality, computer graphics, computer gaming, prosthetics, and biomedical instrumentation. Furthermore, a similar technology can be applied to create applications like a digital canvas, which is gaining popularity among artists, and to help patients who do not have control of their limbs. In computer graphics and gaming, this technology has been applied in modern gaming consoles to create interactive games where a person's motions are tracked and interpreted as commands. Most such applications require additional hardware, which is often very costly. Our motive was to create this technology in the cheapest possible way and under a standardized operating system. Various application programs can be written exclusively for this technology to create a wide range of applications with a minimum of resources.





Watch live project performance:)






Comments:

  1. Hello, I was trying to use your code, but I received the following error:

     Error using end
     Incorrect cell or structure reference involving "end".
     Most likely cause is a reference to multiple elements of a cell or structure followed by additional subscript or structure references.

     Error in gesturerecognisition (line 14)
     cameraId = cameraInfo.DeviceInfo.DeviceID(end);

     I don't know why, and I was wondering if you could help me solve it. I don't even know which camera adaptors you use.

     Reply: Maybe you should try installing the camera's device driver support package in MATLAB.

  2. Error in gesturerecognisition (line 12)
     cameraName = char(cam.InstalledAdaptors(end));
     What is the problem I am facing?