Fig 5: Adaboost
e-ISSN: 2582-5208
International Research Journal of Modernization in Engineering Technology and Science
( Peer-Reviewed, Open Access, Fully Refereed International Journal )
Volume:04/Issue:07/July-2022 Impact Factor- 6.752 www.irjmets.com
Cascading: Consider an image of resolution 640x640. A 24x24 window is moved across the entire image, and in each window all 2500 features would have to be evaluated, which is computationally inefficient. Hence we use the process of cascading. Cascading is the process where the features are subdivided into stages (cascades) arranged in a degenerate binary-tree pattern. The image window passes through each cascade in turn; if the features of a stage are matched, the window continues on to the remaining cascades, otherwise it is rejected immediately.
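The early-rejection idea behind cascading can be sketched as follows. This is a toy illustration under assumed structure, not the actual Viola-Jones feature set: each stage is modeled as a list of hypothetical feature functions plus a stage threshold.

```python
# Toy sketch of cascade evaluation (hypothetical feature functions, not the
# real Haar features): a window is rejected as soon as one stage fails,
# so most windows never reach the expensive later stages.

def evaluate_cascade(window, stages):
    """Return True only if the window passes every stage of the cascade."""
    for feature_fns, stage_threshold in stages:
        score = sum(f(window) for f in feature_fns)
        if score < stage_threshold:
            return False          # early rejection: skip remaining stages
    return True

def sliding_windows(width, height, win=24, step=24):
    """Yield top-left corners of win x win windows across a width x height image."""
    for y in range(0, height - win + 1, step):
        for x in range(0, width - win + 1, step):
            yield (x, y)
```

For a 640x640 image with a non-overlapping 24x24 window this already yields 676 windows; without staged rejection, each would pay the full 2500-feature cost.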
Fig 6: Cascading
Eye Detection: Once the face is detected, we locate the position of the eyes using facial landmarks, a group of 68 points that mark all the features of the face, as shown in the diagram. Using these landmarks, we can slice out the eye regions and use them for iris detection.
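Slicing the eye regions from the landmarks can be sketched as below. The eye indices follow the standard 68-point layout (as used by dlib's `shape_predictor_68_face_landmarks` model: left eye points 36-41, right eye points 42-47); the padding value is an assumption for illustration.

```python
# Indices of the eyes in the standard 68-point facial landmark model.
LEFT_EYE = list(range(36, 42))
RIGHT_EYE = list(range(42, 48))

def eye_bounding_box(landmarks, indices, pad=2):
    """Given the 68 (x, y) landmark points, return the padded bounding box
    (x_min, y_min, x_max, y_max) of one eye for cropping before iris detection."""
    xs = [landmarks[i][0] for i in indices]
    ys = [landmarks[i][1] for i in indices]
    return (min(xs) - pad, min(ys) - pad, max(xs) + pad, max(ys) + pad)
```

The returned box can then be used to crop the eye from the frame before the iris-detection step.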
Fig 8: Block diagram for integration of Iris Detection with Graphical User Interface
Instead of text-based UIs, typed command labels, or text navigation, a graphical user interface enables people to interact with electronic devices through graphical icons and auditory indicators. In the field of human-computer interaction, designing a GUI's visual layout and temporal behavior is a crucial component of software application programming. Initially, the face of the patient is detected, then the eye is captured, and finally the iris is detected and used for eye-gaze tracking. When the patient comes within a certain accessible distance, the screen, which was initially in idle mode, switches on once the IR sensor detects the nearby patient. The first screen displays the patient's information.
To move to the next application screen, the patient gazes towards the right side of the screen, which contains a button that leads to the next screen where the succeeding applications can be accessed. If the patient blinks on any screen, the main application contained in that screen gets executed. For example, if the patient blinks on the screen containing the doctor/nurse function, a message is sent to the concerned professionals. If the patient has to go back to a previous screen, the patient gazes towards the left, which holds the option for switching to the previous page, and the screen shifts back accordingly.
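The navigation logic described above can be summarized as a small state machine. This is a minimal sketch with hypothetical screen names; the actual screens are the Tkinter screens shown later in Fig 9.

```python
# Hypothetical screen order for illustration (matches the screens in Fig 9).
SCREENS = ["patient_info", "doctor_nurse", "washroom", "family"]

class GazeNavigator:
    """Maps eye actions (right gaze, left gaze, blink) to screen transitions."""

    def __init__(self, screens):
        self.screens = screens
        self.index = 0            # the GUI starts on the patient-info screen

    def on_gaze_right(self):
        """Right gaze moves to the next screen (clamped at the last one)."""
        self.index = min(self.index + 1, len(self.screens) - 1)
        return self.screens[self.index]

    def on_gaze_left(self):
        """Left gaze returns to the previous screen (clamped at the first one)."""
        self.index = max(self.index - 1, 0)
        return self.screens[self.index]

    def on_blink(self):
        """A blink executes the current screen's action, e.g. sending a request."""
        return f"execute:{self.screens[self.index]}"
```

For instance, a right gaze from the patient-info screen selects the doctor/nurse screen, and a blink there triggers the message to the staff.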
Tkinter
Tkinter is the Python binding for the Tk GUI toolkit and serves as the standard Python interface to Tk. Using the widgets in the Tk toolkit, this framework gives Python programmers a quick way to build GUI elements. Buttons, menus, data fields, and other widgets can be built with Tk widgets in a Python program. Once created, these graphical elements can be connected to or interact with features, functionality, processes, data, or even other widgets. A button widget, for instance, can both accept mouse clicks and be programmed to carry out an action, such as closing the application. Various GUI communication screens aiding the patients are shown in Fig 9.
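A minimal Tkinter sketch of one such screen is given below. The label text and the callback are hypothetical stand-ins; in the real system the blink-triggered action sends a message to the hospital staff rather than printing.

```python
import tkinter as tk

def send_request():
    """Placeholder action: the real system messages the doctor/nurse instead."""
    print("Request sent to doctor/nurse")

def build_doctor_nurse_screen():
    """Build a screen with one button wired to the request action."""
    root = tk.Tk()
    root.title("Doctor/Nurse Assist")
    # A button widget both displays a label and reacts to activation events.
    tk.Button(root, text="Call Doctor/Nurse", command=send_request).pack(padx=40, pady=40)
    return root

# build_doctor_nurse_screen().mainloop()  # start the event loop when a display is attached
```

In the deployed application, the blink detector would invoke the button's command in place of a mouse click.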
Fig 9a: Patient information screen and index Fig 9b: Doctor/Nurse screen
Fig 9c: Washroom assist screen Fig 9d: Family assist screen
Fig 10a: Patient accessing GUI Fig 10b: Washroom Assist screen Fig 10c: Detection of right gaze
Fig 10d: Confirmation screen Fig 10e: Family assistance screen Fig 10f: Blink Detection
Fig 10g: Request initiated Fig 10h: Message sent to family members Fig 10i: Moving to previous page
V. CONCLUSION
In consideration of the means of communication, we are trying to create an effective environment for differently abled patients to interact with doctors, nurses, or family members in hospitals or rehabilitation centers. The system is a computer interface that provides the functionality of an input device such as a mouse based on eye actions: eye blink, eye gaze, and gaze control. The implementation first detects the facial features of the patient and then captures the iris, which is the integral part of eye-gaze tracking; secondly, it applies various image processing techniques and other algorithms that enhance the clarity and accuracy of the image, thereby giving the desired output. In addition, the project employs a Graphical User Interface to make the application more interactive, creating a user-friendly experience and providing hassle-free access to the eye-gaze system. To conclude, the main motive of the application is to minimize the physical strain a patient has to go through to communicate, thereby creating a convenient and futuristic approach to communication for differently abled patients.