
Project Charter

General Information
Project Title: GESTURE RECOGNITION SYSTEM FOR HEARING AND SPEECH IMPAIRED

Brief Project Description: The gesture recognition system software is responsible for detecting sign language gestures and translating them into text and then into speech. The software allows the user to detect gestures of languages already stored in the software's dictionary, as well as add gestures to the dictionary, either in an existing language or in a new user-defined language.

Prepared By: GRS Working Group
Date: 15-Sep-2012
Version: 1.0

Project Objective
The basic goal of the software is to help hearing- or speech-impaired people communicate effectively by translating their sign language gestures. The software allows the user to detect gestures of languages already stored in the software's dictionary, as well as add gestures to the dictionary, either in an existing language or in a new user-defined language.

Assumptions
It is assumed that the Kinect camera never fails to work properly except when the user shuts it down. The system is designed for gesture detection using the Kinect camera; if any other RGB or range camera is used, the software will have to be modified accordingly.

Project Scope
In Scope:
The primary function of the GRS is to provide a mode of communication to hearing and speech impaired people. It translates their gestures into text and then into speech. The following points highlight the functions of the GRS in detail:
- Allows users to choose a sign language already in the dictionary and then perform gestures of words to translate them.
- Allows the user to add new gestures to an existing sign language in the dictionary (user-defined gestures).
- Allows the user to add a new sign language to the sign language dictionary (user-defined sign language).
- Real-time gesture recognition will be provided.
- Gestures involving movement of hands and fingers are both included.

Out of Scope:
Some features not included in the project scope are:
- The GRS software cannot work with any camera other than the Microsoft Kinect.
- The GRS works for only one person at a time; only one person should stand in front of the Kinect.
- Meanings of words in the gesture dictionary will be user-provided; their meanings will not be given by the software.
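The dictionary behavior described in the scope above (translating recognized gestures into text, adding user-defined gestures, and creating user-defined sign languages) can be sketched as follows. This is a minimal illustrative sketch only; all class and method names here are assumptions for illustration, not the project's actual design, and real gesture recognition from Kinect input is out of scope of the snippet.

```python
# Hypothetical sketch of the GRS dictionary model described in the scope.
# Names (SignLanguageDictionary, add_language, etc.) are illustrative
# assumptions, not the charter's actual API.

class SignLanguageDictionary:
    """Maps gesture identifiers to words, per sign language."""

    def __init__(self):
        # language name -> {gesture_id: word}
        self._languages = {}

    def add_language(self, name):
        """Add a new, user-defined sign language to the dictionary."""
        self._languages.setdefault(name, {})

    def add_gesture(self, language, gesture_id, word):
        """Add a user-defined gesture to an existing sign language."""
        if language not in self._languages:
            raise KeyError(f"unknown language: {language}")
        self._languages[language][gesture_id] = word

    def translate(self, language, gesture_id):
        """Translate one recognized gesture into text (None if unknown)."""
        return self._languages[language].get(gesture_id)


# Example: build a small dictionary and translate a gesture stream.
d = SignLanguageDictionary()
d.add_language("ASL")
d.add_gesture("ASL", "g_hello", "hello")
d.add_gesture("ASL", "g_thanks", "thank you")

text = " ".join(d.translate("ASL", g) for g in ["g_hello", "g_thanks"])
print(text)  # hello thank you
```

In the full system, the resulting text would then be passed to a text-to-speech stage; meanings of words remain user-provided, as stated in the scope.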

Project Milestones
Milestone                                    Start        End
1. Software requirement specification        4/11/2013    4/14/2013
2. Architecture Design                       4/12/2013    4/15/2013
3. Resources                                 4/17/2013
4. Cost                                      4/27/2013    4/29/2013
5. Devices Configurations                    5/7/2013     5/9/2013
6. Coding                                    5/17/2013    5/19/2013
7. Testing                                   5/27/2013
8. Making distributable software package     6/6/2013     6/8/2013

Resources
Resource                Constraints
Microsoft Kinect        Version 1.0 or later
Graphics card           2 GB
Computer                Core i7
LCD Monitor             15" at least
Microsoft Kinect SDK    Version 2.0

Project Risks
Risk
Programming skills deficiency
Co-operation within the project team
Understanding of the algorithm to be used
Research-based obstacles
Design failure
Compatibility issues
Skipped functionality test of any module

Success Measurements
The prepared software will be capable of recognizing sign language gestures; the ability to create new sign languages and to add new words will also be included.

Signatures
Advisor: Name
Dr. Zahid Halim

Signature

Date

Dean FCS: Name


Dr. Aftab Ahmad

Signature

Date

Project Members: Name


Hafiz Saad Masood Majid Pervez Qureshi Muhammad Bilal Shah Muhammad Salman Ashraf

Signature

Date