
Use of Information Technology for Monitoring Pain in Patient Care

Multi-disciplinary Work on Video Analysis
Rashid Ansari, Dept. of Electrical & Computer Engineering, University of Illinois at Chicago
Work done with Diana J. Wilkie, Chaired Professor in Bio-behavioral Health Sciences, and Zhanli Chen, ECE PhD student

Outline
- Larger Context
- Motivation for video analysis for pain
- Framework for using bio-behavioral models for automated pain detection
- Past work and available databases
- Feature extraction and tracking using the Active Appearance Model (AAM)
- Rule-based Action Unit recognition
- Results
- Conclusion


Larger Context
- Information technology and healthcare enterprises
- Electronic Health Record (EHR) systems
- EHR data: a treasure trove
- Nature of EHR data
- Need for multidisciplinary skills; goal: training in both IT and health disciplines
- Give clinicians the "big picture" of a patient's condition based on the current EHR data, assisted by the analysis, mining, and visualization of historical EHR data securely accessed in a privacy-preserving manner from disparate EHR systems

Larger Context: Excerpt from Report of IEEE Presidents
http://dl.dropbox.com/u/2397114/IEEE_kam.pdf


Larger Context: EHR Training & Research for Advances in Informatics (EHR-Train)


Outline
- Larger Context
- Motivation for video analysis for pain
- Framework for using bio-behavioral models for automated pain detection
- Past work and available databases
- Feature extraction and tracking using the Active Appearance Model (AAM)
- Rule-based Action Unit recognition
- Results
- Conclusion

Motivation for Automated Pain Recognition
- Behavioral observation is important for assessing pain
- Many populations cannot communicate pain: children, critically ill non-communicative patients, patients undergoing procedures, adults with dementia
- A task force position statement recognized the use of facial expression: "Pain assessment in the nonverbal patient: position statement with clinical practice recommendations"

Motivation for Automated Pain Recognition
- Facial expression is a behavioral indicator of pain, characterized by the Facial Action Coding System (FACS)[1]
- Ekman-Friesen FACS: objective assessment in expression analysis via facial Action Units (AUs)
- AUs: single or combined muscular activity that produces changes in facial appearance
- 44 AUs; 9 of them pain-related
- AUs are detected by FACS experts
- Facial expression coding using FACS is time consuming, making clinical use infeasible

Figure: Pain AUs, injured soldier. Photo source: Kim Komenich, with permission.

[1] Ekman P, Friesen WV, Hager JC. New Version of the Facial Action Coding System: The Manual. Salt Lake City, UT, USA: Research Nexus Division of Network Research Information; 2002.


Outline
- Larger Context
- Motivation for video analysis for pain
- Framework for using bio-behavioral models for automated pain detection
- Past work and available databases
- Feature extraction and tracking using the Active Appearance Model (AAM)
- Rule-based Action Unit recognition
- Results
- Conclusion

Framework for Using FACS: Detecting Pain-Related Action Units

Table 1. Description and Muscular Basis of Selected Action Units for Pain Facial Expression

AU | Description       | Muscular Basis
4  | eyebrow lowerer   | depressor glabellae, depressor supercilii; corrugator supercilii
6  | cheek raiser      | orbicularis oculi, pars orbitalis
7  | eyelid tightener  | orbicularis oculi, pars palpebralis
9  | nose wrinkler     | levator labii superioris alaeque nasi
10 | upper lip raiser  | levator labii superioris; caput infraorbitalis
20 | lip stretcher     | risorius
26 | jaw drop          | masseter; temporal and internal pterygoid relaxed
27 | mouth stretch     | pterygoids, digastric
43 | eyes closed       | relaxation of levator palpebrae superioris

AU Source: Ekman & Friesen, 1978; with permission.

Framework for Using FACS: Feature Points to Detect AUs (Shape Vertices)
- 66 vertices are first identified in the face image
- AUs are linked to these vertices (feature points)

Framework for Using FACS: Key Ingredients for Pain Recognition
- Behavioral model for assessing pain: FACS
- AUs are linked to key facial feature points
- Approach for using FACS in video analysis:
  1. Mark features in key frames and train a model
  2. Use the model to extract feature points and track them
  3. Use rules to map features to pain AUs for detection and recognition
- These three steps define Tasks I-III in AU recognition

Framework for Using FACS: Schematic of Tasks for AU Detection

Training Set Labeling → Feature Extraction and Tracking → Pain Facial Expression Detection and Recognition

Framework for Using FACS: Task I, Semi-automated Labeling
- Some key frames are labeled with 66 vertices along the feature cues of the face
- Marking all vertices by hand is tedious
- A few key vertices are identified first, and the locations of the remaining vertices are constrained by their relationship to the key vertices (see the sketch below)
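As a rough illustration of that constraint idea, the sketch below places intermediate vertices by linear interpolation between hand-marked key vertices along a facial contour. The vertex counts and the interpolation scheme are illustrative assumptions, not the exact labeling procedure used in this work.

import numpy as np

def fill_contour_vertices(key_pts, counts):
    """Given key vertices along a facial contour (e.g., the jawline),
    place the remaining vertices by linear interpolation between
    consecutive key vertices. key_pts is a (k, 2) array of (x, y)
    coordinates; counts[i] is how many intermediate vertices to
    insert between key point i and key point i+1."""
    filled = []
    for i in range(len(key_pts) - 1):
        p0, p1 = key_pts[i], key_pts[i + 1]
        filled.append(p0)
        for t in np.linspace(0, 1, counts[i] + 2)[1:-1]:
            filled.append((1 - t) * p0 + t * p1)  # point on segment p0-p1
    filled.append(key_pts[-1])
    return np.array(filled)

# Example: 4 hand-marked jawline points expanded to 10 labeled vertices
jaw_keys = np.array([[10., 40.], [25., 70.], [50., 80.], [75., 70.]])
print(fill_contour_vertices(jaw_keys, counts=[2, 2, 2]).shape)  # (10, 2)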

Outline
- Larger Context
- Motivation for video analysis for pain
- Framework for using bio-behavioral models for automated pain detection
- Past work and available databases
- Feature extraction and tracking using the Active Appearance Model (AAM)
- Rule-based Action Unit recognition
- Results
- Conclusion

Past Work and Available Databases
- Research on automated expression recognition uses geometric features, referred to as the shapes of the facial components (eyes, mouth, etc.), and appearance features, including textures, wrinkles, bulges, and furrows
- Most expression recognition methods are classifier-based, built from large training sets
- A rule-based approach was developed by Pantic et al. for AU detection using the temporal dynamics of the feature points
- Past work focused on general expression recognition; only recently has some effort been directed at detecting pain-related facial expression
- Lucey et al. use an Active Appearance Model (AAM) based method to extract features and support vector machines (SVMs) for pain recognition from video archives of patients with shoulder pain (acute pain)

Past Work and Available Databases: Existing Databases
- Cohn-Kanade facial expression database: the most widely used database for facial expression recognition
- MMI facial expression database: contains both posed and spontaneous expressions of facial behavior
- BU-3DFE database: 3D range data of six prototypical facial expressions
- UNBC-McMaster Shoulder Pain Archive: spontaneous expressions caused by real pain

Our Video Database of Lung Cancer Patients
- Videos of 43 lung cancer patients: a 10-minute video per patient, with the camera focused on the face
- Each 600-second video is partitioned into 30 equal-sized segments of 20 seconds each
- Segments are reviewed and scored independently by three trained coders
- An AU is scored in a video time slot only if at least two coders agree on its presence
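To make the ground-truth rule above concrete, here is a minimal sketch of the 2-of-3 consensus, assuming per-segment boolean scores from each coder; the data layout is hypothetical:

def consensus_labels(coder_scores):
    """coder_scores: list of three lists of booleans, one list per coder,
    one boolean per 20-second segment. Returns the 2-of-3 consensus."""
    return [sum(votes) >= 2 for votes in zip(*coder_scores)]

# Example: one AU's scores for 5 segments from 3 coders
scores = [
    [True, False, True, False, True],   # coder 1
    [True, False, False, False, True],  # coder 2
    [False, False, True, True, True],   # coder 3
]
print(consensus_labels(scores))  # [True, False, True, False, True]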

Outline
- Larger Context
- Motivation for video analysis for pain
- Framework for using bio-behavioral models for automated pain detection
- Past work and available databases
- Feature extraction and tracking using the Active Appearance Model (AAM)
- Rule-based Action Unit recognition
- Results
- Conclusion

Task II: Feature Extraction and Tracking with the Active Appearance Model (AAM)
- AAM: a parametric model of shape and texture, used in object appearance modeling
- Shape model: the mean shape S_0 plus n eigen-shapes S_i, i.e. S = S_0 + sum_{i=1}^{n} p_i S_i
- Appearance model: the mean texture A_0 plus m eigen-appearances A_i, i.e. A = A_0 + sum_{i=1}^{m} lambda_i A_i
- Goal of AAM fitting: find the parameters that minimize the error between the observed image and the synthesized image
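A minimal sketch of the shape-model side of this, assuming PCA-derived eigen-shapes; the variable names and the plain least-squares fit are illustrative, not the full AAM fitting algorithm (which also optimizes the appearance term):

import numpy as np

def synthesize_shape(S0, S, p):
    """Shape model S = S0 + sum_i p_i S_i. S0: (2v,) mean shape,
    S: (n, 2v) eigen-shapes, p: (n,) shape coefficients."""
    return S0 + p @ S

def fit_shape(S0, S, observed):
    """Least-squares estimate of the coefficients p that best explain
    an observed set of vertices (ignoring the appearance term)."""
    p, *_ = np.linalg.lstsq(S.T, observed - S0, rcond=None)
    return p

# Toy example with 3 eigen-shapes over 66 (x, y) vertices
rng = np.random.default_rng(0)
S0 = rng.standard_normal(132)        # mean shape, 66 vertices * 2
S = rng.standard_normal((3, 132))    # eigen-shapes
target = synthesize_shape(S0, S, np.array([0.5, -1.0, 2.0]))
print(np.round(fit_shape(S0, S, target), 3))  # ~[0.5, -1.0, 2.0]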

Task II: AAM Modification, Coefficient Partitioning Method
- AAM employs a set of parameters to control the shape and texture variation; these define a registration between a target image and a reference template
- In the original model, each coefficient controls a certain motion (represented by one eigen-shape) of all vertices simultaneously in the shape model
- Convergence problems can occur when the algorithm tries to fit different feature components on the face while the facial change is localized
- To extract the expression information accurately, more flexibility is needed in the deformation of the shape model
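One plausible reading of the partitioning idea, sketched below: each shape coefficient is assigned its own group of vertices (brows, eyes, mouth), so a coefficient deforms only its local region. The vertex indices and grouping are assumptions for illustration, not the exact partitioning used in this work.

import numpy as np

# Hypothetical vertex groups over the 66-vertex shape (indices illustrative)
GROUPS = {
    "brows": np.arange(17, 27),
    "eyes": np.arange(36, 48),
    "mouth": np.arange(48, 66),
}

def synthesize_partitioned(S0, S, p, groups):
    """Like the global shape model S0 + sum_i p_i S_i, but each
    eigen-shape i is masked so its coefficient moves only the vertices
    of its assigned facial region, leaving the rest of the face fixed."""
    shape = S0.copy().reshape(-1, 2)
    for (_region, idx), pi, Si in zip(groups.items(), p, S):
        shape[idx] += pi * Si.reshape(-1, 2)[idx]  # local deformation only
    return shape.reshape(-1)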


Task II: AAM Fitting Results
Figure: Labeled training images of patient P16.
Figure: Result of synthesizing the patient's face using AAM fitting for patient P16 in video segment 18.

Task II: AAM Modification, Coefficient Partitioning Method
- After processing, all shape vertices are available
- Rules are then needed that use the vertices to determine AUs

Outline
- Larger Context
- Motivation for video analysis for pain
- Framework for using bio-behavioral models for automated pain detection
- Past work and available databases
- Feature extraction and tracking using the Active Appearance Model (AAM)
- Rule-based Action Unit recognition
- Results
- Conclusion

Task III: Rule-Based Action Unit Recognition
- Feature extraction for the rule-based method uses the shape information from the AAM fitting result
- Main idea: define rules on the feature-point positions that match the AU descriptions in the FACS manual
- Temporal information about the feature points enhances the capacity and robustness of AU recognition
- Distance parameters are extracted from the feature points (see the sketch below)
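A minimal sketch of distance-parameter extraction from the tracked vertices; the specific vertex indices and the chosen distances (mouth opening, brow-to-eyelid gap) are illustrative assumptions, not the talk's exact parameter set:

import numpy as np

# Illustrative indices into the 66-point shape (not the exact map used here)
UPPER_LIP, LOWER_LIP = 51, 57
INNER_BROW_L, UPPER_EYELID_L = 21, 37

def distance_params(shape):
    """shape: (66, 2) array of tracked vertices for one frame.
    Returns distances an AU rule could use, e.g. mouth opening for
    AU 26/27 (jaw drop / mouth stretch) and the brow-to-eyelid gap
    for AU 4 (brow lowerer)."""
    mouth_open = np.linalg.norm(shape[LOWER_LIP] - shape[UPPER_LIP])
    brow_eye = np.linalg.norm(shape[INNER_BROW_L] - shape[UPPER_EYELID_L])
    return {"mouth_open": mouth_open, "brow_eye": brow_eye}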

Task III: Pain-Related Action Units (Recall)

Table 1. Description and Muscular Basis of Selected Action Units for Pain Facial Expression

AU | Description       | Muscular Basis
4  | eyebrow lowerer   | depressor glabellae, depressor supercilii; corrugator supercilii
6  | cheek raiser      | orbicularis oculi, pars orbitalis
7  | eyelid tightener  | orbicularis oculi, pars palpebralis
9  | nose wrinkler     | levator labii superioris alaeque nasi
10 | upper lip raiser  | levator labii superioris; caput infraorbitalis
20 | lip stretcher     | risorius
26 | jaw drop          | masseter; temporal and internal pterygoid relaxed
27 | mouth stretch     | pterygoids, digastric
43 | eyes closed       | relaxation of levator palpebrae superioris

AU Source: Ekman & Friesen, 1978; with permission.

Task III: Recognition Rules

The rules listed in Table 2 use the AU descriptions in the FACS manual (an AU is scored at intensity B).
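To make the rule idea concrete, here is a hedged sketch of what one such rule might look like, built on the hypothetical distance parameters above; the normalization and threshold are assumptions, not the values from Table 2:

def detect_au26(mouth_open, inter_ocular, thresh=0.35):
    """Hypothetical rule for AU 26 (jaw drop): fire when the mouth
    opening, normalized by the inter-ocular distance to cancel face
    scale, exceeds a threshold calibrated to FACS intensity B."""
    return (mouth_open / inter_ocular) > thresh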

Task III: Visual Tracking Results
Figure: Plots of some tracked feature points in segment 18 of patient P16's video.

Task III: Recognition Procedure Using Time Evolution
- A complete AU occurrence has three stages: onset, apex, offset
- Two sets of thresholds are used: a lower threshold for onset and offset, and a higher threshold for the apex (the trajectory must reach an amplitude sufficient to score intensity B)
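A minimal sketch of this two-threshold procedure over a per-frame distance trajectory; the threshold values and the event representation are illustrative assumptions:

import numpy as np

def detect_au_events(traj, low, high):
    """Scan a per-frame distance trajectory for complete AU occurrences:
    onset when the signal rises past `low`, a valid apex only if it then
    exceeds `high` (intensity B amplitude), offset when it falls back
    below `low`. Returns (onset_frame, offset_frame) pairs."""
    events, onset, reached_apex = [], None, False
    for t, x in enumerate(traj):
        if onset is None:
            if x >= low:                 # onset stage begins
                onset, reached_apex = t, False
        else:
            reached_apex |= x >= high    # apex amplitude check
            if x < low:                  # offset stage ends the event
                if reached_apex:
                    events.append((onset, t))
                onset = None
    return events

sig = np.array([0.1, 0.3, 0.6, 0.9, 0.8, 0.4, 0.2, 0.1])
print(detect_au_events(sig, low=0.25, high=0.7))  # [(1, 6)]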

Task III: Design of Decision Trees
- AU combinations can be scored over different periods within the 20-second slot, so multiple decision trees are defined to identify all possible AU combinations (see the sketch below)
- Five simple decision trees are developed for the AU combinations; the decisions check whether the rules involved overlap in time
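A small sketch of the temporal-overlap test such a decision could rest on, reusing the (onset, offset) events from the earlier sketch; treating two AUs as a combination candidate when their events overlap is our reading, not the exact tree logic of this work:

def events_overlap(e1, e2):
    """True if two (onset, offset) frame intervals overlap in time."""
    return e1[0] <= e2[1] and e2[0] <= e1[1]

def combined_aus(detections):
    """detections: dict mapping AU id -> list of (onset, offset) events.
    Yields pairs of AUs with at least one overlapping occurrence,
    i.e. candidates for scoring as an AU combination."""
    aus = sorted(detections)
    for i, a in enumerate(aus):
        for b in aus[i + 1:]:
            if any(events_overlap(x, y)
                   for x in detections[a] for y in detections[b]):
                yield (a, b)

print(list(combined_aus({4: [(10, 60)], 6: [(30, 90)], 26: [(200, 240)]})))
# [(4, 6)]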

Schematic View of the System
Figure: Automated AU recognition method.

Outline
- Larger Context
- Motivation for video analysis for pain
- Framework for using bio-behavioral models for automated pain detection
- Past work and available databases
- Feature extraction and tracking using the Active Appearance Model (AAM)
- Rule-based Action Unit recognition
- Results
- Conclusion

Results
- Because AUs appear only sparsely in the patient videos, performance is evaluated by the agreement between the AUs detected by the algorithm and those scored by experts
- Several video segments for four patients, containing different AU combinations, were tested
- AUs recognized by the computer were compared with those scored by at least two human coding experts
- Full agreement was found for the segments investigated so far

Future Work
- Investigate all 43 patient videos
- Improve the AAM fitting algorithm to make it more robust
- Collect a new multi-view, high-resolution patient video dataset to obtain better-quality video for studying the automated pain recognition system
- Extend the work to 3D to better handle the rigid head motion problem

Conclusion
- Use of information technology for monitoring pain in patient care: multi-disciplinary work on video analysis
- Information technology and healthcare enterprises: Electronic Health Record (EHR) systems
- Need for training a future workforce with multidisciplinary skills
- Behavioral observation is important for assessing pain
- Translational research using FACS to develop an automated pain AU recognition system

Thank You
