Defining and Assessing Professional Competence
Ronald M. Epstein, MD; Edward M. Hundert, MD
Medical schools, postgraduate training programs, and licensing bodies conduct assessments to certify the competence of future practitioners, discriminate among candidates for advanced training, provide motivation and direction for learning, and judge the adequacy of training programs. Standards for professional competence delineate key technical, cognitive, and emotional aspects of practice, including those that may not be measurable.1,2 However, there is no agreed-upon definition of competence that encompasses all important domains of professional medical practice. In response, the Accreditation Council for Graduate Medical Education defined 6 areas of competence and some means of assessing them: patient care (including clinical reasoning), medical knowledge, practice-based learning and improvement (including information management), interpersonal and communication skills, professionalism, and systems-based practice (including health economics and teamwork).3 In this article, we advance a definition of professional competence of physicians and trainees that expands on these 6 areas, perform an evidence-based critique of current methods of assessing these areas of competence, and propose new means for assessing residents and medical students.


Context: Current assessment formats for physicians and trainees reliably test core knowledge and basic skills. However, they may underemphasize some important domains of professional medical practice, including interpersonal skills, lifelong learning, professionalism, and integration of core knowledge into clinical practice.

Objectives: To propose a definition of professional competence, to review current means for assessing it, and to suggest new approaches to assessment.

Data Sources: We searched the MEDLINE database from 1966 to 2001 and reference lists of relevant articles for English-language studies of reliability or validity of measures of competence of physicians, medical students, and residents.

Study Selection: We excluded articles of a purely descriptive nature, duplicate reports, reviews, and opinions and position statements, which yielded 195 relevant citations.

Data Extraction: Data were abstracted by 1 of us (R.M.E.). Quality criteria for inclusion were broad, given the heterogeneity of interventions, complexity of outcome measures, and paucity of randomized or longitudinal study designs.

Data Synthesis: We generated an inclusive definition of competence: the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community being served. Aside from protecting the public and limiting access to advanced training, assessments should foster habits of learning and self-reflection and drive institutional change. Subjective, multiple-choice, and standardized patient assessments, although reliable, underemphasize important domains of professional competence: integration of knowledge and skills, context of care, information management, teamwork, health systems, and patient-physician relationships. Few assessments observe trainees in real-life situations, incorporate the perspectives of peers and patients, or use measures that predict clinical outcomes.

Conclusions: In addition to assessments of basic skills, new formats that assess clinical reasoning, expert judgment, management of ambiguity, professionalism, time management, learning strategies, and teamwork promise a multidimensional assessment while maintaining adequate reliability and validity. Institutional support, reflection, and mentoring must accompany the development of assessment programs.
JAMA. 2002;287:226-235

For editorial comment see p 243.

DEFINING PROFESSIONAL COMPETENCE

Building on prior definitions,1-3 we propose that professional competence is the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and community being served. Competence builds on a foundation of basic clinical skills, scientific knowledge, and moral development. It includes a cognitive function—acquiring and using knowledge to solve
Author Affiliations: Departments of Family Medicine (Dr Epstein), Psychiatry (Drs Epstein and Hundert), and Medical Humanities (Dr Hundert), University of Rochester School of Medicine and Dentistry, Rochester, NY. Corresponding Author and Reprints: Ronald M. Epstein, MD, University of Rochester School of Medicine and Dentistry, 885 South Ave, Rochester, NY 14620 (e-mail: ronald_epstein@urmc.rochester .edu).

JAMA, January 9, 2002—Vol 287, No. 2 (Reprinted)

©2002 American Medical Association. All rights reserved.


real-life problems; an integrative function—using biomedical and psychosocial data in clinical reasoning; a relational function—communicating effectively with patients and colleagues; and an affective/moral function—the willingness, patience, and emotional awareness to use these skills judiciously and humanely (BOX 1). Competence depends on habits of mind, including attentiveness, critical curiosity, self-awareness, and presence. Professional competence is developmental, impermanent, and context-dependent.

Box 1. Dimensions of Professional Competence

Cognitive: core knowledge; basic communication skills; information management; applying knowledge to real-world situations; using tacit knowledge and personal experience; abstract problem-solving; self-directed acquisition of new knowledge; recognizing gaps in knowledge; generating questions; using resources (eg, published evidence, colleagues); learning from experience.

Technical: physical examination skills; surgical/procedural skills.

Integrative: incorporating scientific, clinical, and humanistic judgment; using clinical reasoning strategies appropriately (hypothetico-deductive, pattern-recognition, elaborated knowledge); linking basic and clinical knowledge across disciplines; managing uncertainty.

Context: clinical setting; use of time.

Relationship: communication skills; handling conflict; teamwork; teaching others (eg, students, patients, and colleagues).

Affective/Moral: tolerance of ambiguity and anxiety; emotional intelligence; respect for patients; responsiveness to patients and society; caring.

Habits of Mind: observations of one's own thinking, emotions, and techniques; attentiveness; critical curiosity; recognition of and response to cognitive and emotional biases; willingness to acknowledge and correct errors.

Acquisition and Use of Knowledge

Evidence-based medicine is an explicit means for generating an important answerable question, interpreting new knowledge, and judging how to apply that knowledge in a clinical setting.4 But Polanyi5 argues that competence is defined by tacit rather than explicit knowledge. Tacit knowledge is that which we know but normally do not explain easily, including the informed use of heuristics (rules of thumb), intuition, and pattern recognition. The assessment of evidence-based medicine skills is difficult because many of the heuristics used by novices are replaced by shortcuts in the hands of experts,6 as are other clinical skills. Personal knowledge is usable knowledge gained through experience.7,8 Clinicians use personal knowledge when they observe a patient's demeanor (such as a facial expression) and arrive at a provisional diagnosis (such as Parkinson disease) before eliciting the specific information to confirm it. Because experience does not necessarily lead to learning and competence,9 cognitive and emotional self-awareness is necessary to help physicians question, seek new information, and adjust for their own biases.

Integrative Aspects of Care

Professional competence is more than a demonstration of isolated competencies10; "when we see the whole, we see its parts differently than when we see them in isolation."11 For example, the student who can elicit historical data and physical findings, who can suture well, who knows the anatomy of the gallbladder and the bile ducts, and who can draw the biosynthetic pathway of bilirubin may not accurately diagnose and manage a patient with symptomatic gallstones. A competent clinician possesses the integrative ability to think, feel, and act like a physician.12-15 Schon16 argues that professional competence is more than factual knowledge and the ability to solve problems with clear-cut solutions: it is defined by the ability to manage ambiguous problems, tolerate uncertainty, and make decisions with limited information.17

Competence depends on using expert scientific, clinical, and humanistic judgment to engage in clinical reasoning. Although expert clinicians often use pattern recognition for routine problems19 and hypothetico-deductive reasoning for complex problems outside their areas of expertise, expert clinical reasoning usually involves working interpretations12 that are elaborated into branching networks of concepts.20-22 These networks help professionals initiate a process of problem solving from minimal information and use subsequent information to refine their understanding of the problem. But students tend to use the same cognitive strategy for solving all problems, whereas experts draw on several strategies.

Building Therapeutic Relationships

The quality of the patient-physician relationship affects health and the recovery from illness,23,24 costs,25 and outcomes of chronic diseases26-29 by altering patients' understanding of their illnesses and reducing patient anxiety. Key measurable patient-centered (or relationship-centered)30,31 behaviors include responding to patients' emotions and participatory decision making; participatory decision making, for example, correlates with clinical outcomes.26 Determining how and at what level of training the patient-physician relationship should be assessed is difficult, and it is unclear when trainees should be assessed on this skill.

Affective and Moral Dimensions

Moral and affective domains of practice may be evaluated more accurately by patients and peers than by licensing bodies or superiors.35 Only recently have validated measures captured some of the intangibles in medicine, such as trust36,37 and professionalism.38,39 Recent neurobiological research indicates that the emotions are central to all judgment and decision making,40-42 further emphasizing the importance of assessing emotional intelligence and self-awareness in clinical practice.

Habits of Mind

Competence depends on habits of mind that allow the practitioner to be attentive, curious, self-aware, reflective, and willing to recognize and correct errors.43 Many physicians would consider these habits of mind characteristic of good practice, but they are especially difficult to objectify. A competent physician, for example, should be able to judge his or her level of anxiety when facing an ambiguous clinical presentation and be aware of how the anxiety of uncertainty may be influencing his or her clinical judgment. Reflection allows practitioners to examine their own clinical reasoning strategies and adjust for their own biases; errors in medicine, for example, may result from overcertainty that one's impressions are beyond doubt.

Context

Competence is context-dependent: it is a statement of relationship between an ability (in the person), a task (in the world),44 and the ecology of the health systems and clinical contexts in which those tasks occur.45 Rather than assessing a student's competence in diagnosing and treating heart disease (a disease-specific domain) by dividing it into competencies (physical examination, interpretation of electrocardiogram, and pharmacology of β-blockers),18 our view is that competence is defined by the interaction of the task (the concrete process of diagnosing and treating Mrs Brown, a 52-year-old business executive who is now in the emergency department because of new-onset chest pain), the clinician's abilities (eliciting information, performing diagnostic maneuvers, interpreting, forming a therapeutic relationship, and making judgments about treatment), and the health system (good insurance and ready access to care). Caring for Mrs Brown requires different skills than caring for Ms Hall, a 52-year-old uninsured homeless woman who has similar symptoms and receives episodic care at an inner-city clinic.46,47 This view stands in contrast to an abstract set of attributes that the physician possesses—knowledge, skills, and attitudes—that are assumed to serve the physician well in all the situations that he or she encounters. Changes in medical practice and the context of care invite redefinitions of competence, for example, the use of electronic communication media48 and changes in patient expectations.49,50

Development

Competence is developmental. There is debate about which aspects of competence should be acquired at each stage of training; early clinical experiences and problem-based learning formats, for example, encourage clinical reasoning skills formerly relegated to the final years of medical school. Although a third-year resident might be expected to counsel a fearful diabetic patient about the need to start insulin, a third-year student might be expected only to elicit the patient's preferences. Competence continues to develop in practice,6 which raises the question of whether assessment of practicing physicians should be qualitatively different from the assessment of a student. Because medical errors are often due to the failure of health systems rather than individual deficiencies,32-34 the assessment of teamwork and institutional self-assessment might effectively complement individual assessments.

CURRENT MEANS OF ASSESSMENT

Assessment must take into account what is assessed, how it is assessed, and the assessment's usefulness in fostering future learning.57-60 In discussing validity of measures of competence in an era when reliable assessments of core knowledge, abstract problem solving, and basic clinical skills have been developed,51-56 we must now establish that they encompass the qualities that define a good physician: the cognitive, technical, integrative, contextual, relational, and affective/moral aspects of competence. We distinguish between expert opinion, intermediate outcomes, and the few studies that show associations between results of assessments and actual clinical performance. Good assessment is a form of learning and should provide guidance and support to address learning needs61; too often, practitioners select educational programs that are unlikely to influence clinical practice. Finally, we address concerns that the medical profession still lacks adequate accountability to the public62 and has not done enough to reduce medical errors.63

A Framework for Assessment

Within each domain of assessment, there are 4 levels at which a trainee might be assessed (FIGURE).64 The knows level refers to the recall of facts, principles, and theories. The knows how level involves the ability to solve problems and describe procedures. The shows how level usually involves human (standardized patient), mechanical, video, or computer simulations that involve demonstration of skills in a controlled setting. The does level refers to observations of real practice.65

Figure. A Framework for Assessment. The grid crosses the 4 levels of assessment (knows, knows how, shows how, does) with clinical tasks (self-assessment and reflection; information gathering from patients and families; relationship-building and professionalism; sharing information, behavior change, and patient involvement; physical examination; mental status; patient procedural skills, eg, suturing, drawing blood; interpretation of diagnostic tests, eg, electrocardiogram, imaging; diagnostic reasoning about psychosocial issues, biomedical issues, and diagnostic uncertainty; clinical judgment in planning further diagnostic workup and generating a therapeutic plan; accessing, interpreting, and applying the medical literature; presenting data to colleagues, eg, referral letter, chart note), knowledge content areas (basic mechanisms, eg, anatomy, immunology, microbiology; pathophysiology of disease, eg, dermatology, renal, gastrointestinal; social science, eg, epidemiology, psychology, economics; special topics, eg, spirituality, ethics, culture/diversity), and contexts of care (chronic illness, emergency, preventive, acute hospital, new problem). The grid is filled out according to the type of assessment conducted, ie, a standardized patient or simulation, a type of computer exercise, or a team exercise; each category can be combined with a number designating a specific instance, such as the name of a patient.

METHODS

Using the MEDLINE database for 1966 to 2001, we searched for articles that studied the reliability or validity of measures of clinical or professional competence of physicians, medical students, and residents. An initial search using the following Medical Subject Headings of the National Library of Medicine yielded 2266 references: educational measurement; clinical competence OR professional competence AND reproducibility of results, validity OR research, OR the text word reliability. This set was reduced by including any of 20 text words describing assessment techniques (we used words such as OSCE, standardized patient, oral examination, essay, portfolio, peer assessment, triple jump, postencounter probe, and CEX [clinical evaluation exercise]), yielding 430 references, of which 101 were English-language reports. Articles of a purely descriptive nature, reviews that offered no new data, and opinions and position statements were excluded.

We surveyed the first 200 of the 2165 references excluded and found none that met our search criteria. Because we knew that MEDLINE search strategies would not capture all relevant studies, we searched reference lists in the 101 articles, other review articles, and books and did additional literature searches using the key authors of recent reviews; in this way, we gathered 94 additional relevant references. Quality criteria for inclusion were broad, given the small number of controlled trials of assessment interventions and the complexity of outcome measures. Of the 195 references, 124 presented new data on assessment of physicians, medical students, and residents.

Summary of Studies

The 3 most commonly used assessment methods are subjective assessments by supervising clinicians,66 multiple-choice examinations to evaluate factual knowledge and abstract problem solving,67-69 and standardized patient assessments of physical examination and technical and communication skills. Although curricular designs increasingly integrate core knowledge and clinical skills, most assessment methods evaluate these domains in isolation. Few reliably assess clinical reasoning, systems-based care, technology, and the patient-physician relationship, and few use measures such as participatory decision making70 that predict clinical outcomes in real practice. The literature makes important distinctions between criteria for licensing examinations and program-specific assessments with mixed formative and summative goals; because of lack of expertise and resources, few medical school examinations can claim to achieve the high psychometric standards of the licensing boards.

Evaluation of factual knowledge and problem-solving skills by using multiple-choice questions offers excellent reliability71-75 and assesses some aspects of context and clinical reasoning. Scores on Canadian licensing examinations57,58 and multiple-choice certification examination scores correlated with subsequent faculty76 and peer77 ratings, and licensing examination scores correlated positively with subsequent appropriate prescribing, mammographic screening, and referrals. Many have questioned the validity of multiple-choice examinations, however.78-81 Standardized test scores have been inversely correlated with empathy, responsibility, and tolerance.82 Compared with Florida family physicians who are not board-certified, those who are have nearly twice the risk of being sued.83

Subjective evaluation by residents and attending physicians is the major form of assessment during residency and the clinical clerkships and often includes the tacit elements of professional competence otherwise overlooked by objective assessment instruments. However, evaluators often do not observe trainees directly, and each trainee must be subject to multiple assessments for patterns to emerge. Raters often have different standards122,123 and are subject to halo effects124 and racial and sex bias.125,126 Because of interpatient variability and low interrater reliability, standardized rating forms for direct observation of trainees127-132 and structured oral examination formats have been developed in response to this criticism. Faculty ratings of humanism predicted patient satisfaction in one study.

The Objective Structured Clinical Examination (OSCE) is a timed multistation examination often using standardized patients (SPs) to simulate clinical scenarios. The roles are portrayed accurately56,84 and the simulations are convincing; the detection rate of unannounced SPs in community practice is less than 10%.85-89 Communication, physical examination, counseling, and technical skills can be rated reliably if there is a sufficiently large number of SP cases90-100 and if criteria for competence are based on evidence.101 Although few cases are needed to assess straightforward skills,102 up to 27 cases may be necessary to assess interpersonal skills reliably in high-stakes examinations.103 Although SPs' ratings usually correlate with those of real patients,104 differences have been noted.105-107 Ratings by the SP are generally accurate52 but may be hampered by memory failure.108-111 There is debate about who should rate student performance in an OSCE.112 Checklist scores completed by physician-examiners in some studies improve with the expertise of the examinees113 and with the reputation of the training program.114 But global rating scales of interpersonal skills may be more valid than behavioral checklists,115,116 and external raters, either physicians or other SPs, may be less attuned to affective aspects of the interview and significantly increase the cost of the examination. OSCE scores may not correlate with multiple-choice examinations and academic grades,117 suggesting that these tools measure different skills. Defining pass/fail criteria for OSCEs has been complex,118 short OSCE stations can risk fragmentation and trivialization of isolated elements of what should be a coherent whole,119 and the OSCE also has low test reliability for measuring clinical ethics.120

There are few validated strategies to assess actual clinical practice; clinicians may behave differently in examination settings than in real practice. The Royal College of General Practitioners, dissatisfied with the capability of the OSCE to evaluate competence for the final professional licensing examination, or Miller's does level, developed a format in which candidates for certification present several best-case videotapes of their performance in real clinical settings to a trained examiner who uses specified criteria for evaluation.133,134 Although the face validity of such a measure is high and the format is well accepted by physicians,135,136 the number of cases that should be presented to achieve adequate reliability is unclear. Profiling by managed-care databases is increasingly used as an evaluation measure of clinical competence,137-139 but data abstraction is complex140 and defining competence in terms of cost and value is difficult.
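The dependence of score reliability on the number of cases noted above can be illustrated with the standard Spearman-Brown prophecy formula from classical test theory; the single-case reliability of 0.13 used below is an assumed illustrative value, not a figure reported in the studies cited.

```latex
% Spearman-Brown prophecy formula: reliability \rho_n of an n-case
% examination given single-case reliability \rho_1, and its inversion
% to find the number of cases needed for a target reliability.
\[
\rho_n = \frac{n\,\rho_1}{1 + (n-1)\,\rho_1}
\qquad\Longrightarrow\qquad
n = \frac{\rho_n\,(1-\rho_1)}{\rho_1\,(1-\rho_n)}
\]
% With an assumed single-case reliability \rho_1 = 0.13 (illustrative)
% and a target reliability \rho_n = 0.80 for a high-stakes examination:
\[
n = \frac{0.80 \times 0.87}{0.13 \times 0.20} \approx 27 \text{ cases}
\]
```

Under this assumption, low case-to-case consistency (case specificity) is what drives the large number of stations required for high-stakes interpersonal-skills assessment.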

Peer ratings are accurate and reliable measures of physician performance.141,142 Peers may be in the best position to evaluate professionalism. Anonymous medical student peer assessments of professionalism have raised awareness of professional behavior, helped students identify specific mutable behaviors, fostered further reflection, and been well accepted by students.143 Students should be assessed by at least 8 of their classmates, and the composite results should be edited to protect the confidentiality of the raters.

Self-assessments have been used with some success in standardized patient exercises144 and in programs that offer explicit training in the use of self-assessment instruments.145 Among trainees who did not have such training, self-assessment was neither valid nor accurate; rather, it was more closely linked to the trainee's psychological sense of self-efficacy and self-confidence than to appropriate criteria.

Whereas performance is directly measurable, competence is an inferred quality; people often act differently when not under direct scrutiny. Performance on a multiple-choice test may exceed competence, as in the case of a trainee with a photographic memory but poor clinical judgment. Conversely, competence may exceed test performance, as in the case of a trainee with severe test anxiety.148

COMMENT

Aside from the need to protect the public by denying graduation to those few trainees who are not expected to overcome their deficiencies, the outcomes of assessment should foster learning, inspire confidence in the learner, enhance the learner's ability to self-monitor, and drive institutional self-assessment and curricular change. Assessment serves personal, institutional, and societal goals (BOX 2); distinctions between these goals often are blurred in practice. Formative feedback is intended to foster individual reflection and remediation146 but may be perceived as having evaluative consequences, whereas summative evaluation is a powerful means for driving curricular content and what students learn. Assessment also provides information to allow institutions to choose among candidates for advanced training. Assessment is, moreover, a statement of institutional values; for example, efficiency is highly valued in residents but less so in medical students. The underlying assumptions driving such evaluation systems may not be explicit; cost analyses, for example, may favor physicians caring for more highly educated patients. Devoting valuable curricular time to peer assessment of professionalism, especially if there are corresponding efforts at the institution toward self-assessment and change, can promote those values that are assessed while encouraging curricular coherence and faculty development.

Box 2. Some Purposes of Assessment

For the Trainee: provide useful feedback about individual strengths and weaknesses that guides future learning; foster habits of self-reflection and self-remediation; promote access to advanced training.

For the Curriculum: respond to lack of demonstrated competence (denial of promotion, mandated remediation); certify achievement of curricular goals; foster course or curricular change; create curricular coherence; cross-validate other forms of assessment in the curriculum; establish standards of competence for trainees at different levels.

For the Institution: guide a process of institutional self-reflection and remediation; discriminate among candidates for further training or promotion; express institutional values by determining what is assessed and how assessment is conducted; develop shared educational values among a diverse community of educators; promote faculty development; provide data for educational research.

For the Public: certify competence of graduates.

Given the difficulty in validating tests of basic skills, it is not surprising that there is scant literature on the assessment of learning, communication, professionalism, teamwork, and systems-based care, or on the ability of assessment programs to drive curricular change or reduce medical errors. The public expects greater accountability and teamwork from health care practitioners,147 and the decline of public trust in medicine may reflect a growing concern that physicians are not achieving these goals.

Future Directions

Medical schools in Canada, the Netherlands, Spain, Australia, the United Kingdom, and the United States have made commitments to developing innovative assessments of professional competence, some of which we describe. Correlation with National Board scores and feedback on graduates' performance can be useful in validating some assessment instruments but should be done with caution. These assessments are increasingly multimodal and tailored to the goals and context in which they will be used.

Proponents of the new formats argue that they provide more useful feedback and are more efficient at the medical school or residency level (Box 1 and BOX 3) than traditional formats. Large-scale licensure examinations must use computer-gradable formats, but comprehensive examinations using structured direct observation,107 case-based questions,107 OSCE stations,79 peer assessments, real patient cases, patient assessment, and essay-type questions149 are reliable as well.170 Although it could be argued that licensing boards do not have the mandate to remediate examinees who perform poorly or to modify educational curricula, medical schools and residency programs do.

Validated measures of reflective thinking have been developed153 that use patient vignettes followed by questions that require clinical judgment. These measures reflect students’ capacity to organize and link information.151 They also predict clinical reasoning ability 2 years later. Postencounter probes immediately after SP exercises, using oral, essay, or multiple-choice questions, test pathophysiology and clinical reasoning in context.107 Triple-jump exercises152—consisting of a case presentation, an independent literature search, and then an oral or written postencounter examination—test the use and application of the medical literature.

Comprehensive assessments link content across several formats.150 They target core knowledge and clinical skills in different contexts and at different levels of assessment. Combining formats appears to have added value with no loss in reliability,154 and comprehensive assessments provide a more multidimensional picture of the examinee than the individual unlinked elements. Because of their complexity, a matrix (Figure) can be useful to display the domains assessed. Ongoing educational outcomes research will show whether composite formats help students learn how to learn more effectively and develop habits of mind that characterize exemplary practice.

Well-functioning health systems are characterized by continuity, partnership between physicians and patients, teamwork between health care practitioners, and communication between health care settings. The use of time in a continuity relationship can be assessed with a series of SP or real-patient exercises.157 Communication between health settings could be assessed at the student level, for example, by grading of students’ written referral letters. Several institutions assess teamwork by using peer assessments;163 trainees are graded on teamwork as well as individual problem solving, and statistical adjustments can account for team composition. Others use sophisticated mannequins to simulate acute cardiovascular physiological derangements found in intensive care settings.164-169

Instruments currently used to assess physicians in practice158 are being tested for students and residents. These efforts are guided by data showing that patients’ ratings of communication and satisfaction correlate well with biomedical outcomes,160 health care use,161 emotional distress,29 and malpractice litigation.25 Patient ratings also have the potential to validate other measures of competence.162

Tests that demonstrate students’ strengths or weaknesses may not provide the student with the opportunity to reflect on the actual behaviors and patterns of thought that were used. Similar observations might be made with trainees’ video portfolios of real clinical encounters.

Box 3. Innovations in Assessing Professional Competence
Multimethod assessment
Clinical reasoning in situations that involve clinical uncertainty
Standardized patient exercises linked to postencounter probes of pathophysiology and clinical reasoning
Exercises to assess use of the medical literature
Long-station standardized patient exercises
Simulated continuity
Teamwork exercises
Unannounced standardized patients in clinical settings
Assessments by patients
Peer assessment of professionalism
Portfolios of videotapes
Mentored self-assessment
Remediation based on a learning plan

Two examples of comprehensive assessment formats follow.

Genetics, Evidence-Based Medicine, and Communication. A student is instructed to perform a literature search about genetic screening tests for Alzheimer disease in anticipation of a later SP encounter. Next, the student completes an essay about the ethics of genetic screening and the genetics of Alzheimer disease. Assessment instruments include a structured evaluation of the search strategy and a communication rating scale, completed by an SP, that assesses the clarity of the student’s presentation and the student’s ability to involve the patient in the decision-making process. This exercise assesses the student’s communication skills, clinical reasoning, ability to acquire and use new knowledge, and contextualized use of knowledge of genetics, health economics, screening, and medical ethics.

Cognitive and Affective Challenges of Clinical Uncertainty. A rating scale is used to assess a resident on her ability to agree on a plan of action with an SP who portrays an outpatient demanding a computed tomographic scan for headaches without neurological signs. In a postencounter exercise, the resident creates a rank-order differential diagnosis and then answers a series of script concordance questions153,155 in which the examinee is presented hypothetical additional data (for example, numbness in the left hand) and then asked to judge how her diagnostic hypotheses or therapeutic actions would change. Failure to include a key diagnostic possibility or the overestimation or underestimation of probability are criteria for evaluation. The goal of the exercise is to demonstrate emotional intelligence40 and self-awareness in the context of conflict and ambiguity.
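Script concordance questions of the kind described above are conventionally scored against an expert reference panel: a response earns partial credit in proportion to how many panelists chose it, relative to the panel’s modal answer. The sketch below illustrates that aggregate scoring method under stated assumptions; the panel data, item wording, and function names are hypothetical and not taken from the article.

```python
from collections import Counter

def sct_item_credit(panel_responses, examinee_response):
    """Credit in [0, 1] for one script concordance item: the number of
    panelists choosing the examinee's response, divided by the number
    choosing the panel's modal response."""
    counts = Counter(panel_responses)
    modal_count = max(counts.values())
    return counts.get(examinee_response, 0) / modal_count

def sct_score(items):
    """Mean credit over (panel_responses, examinee_response) pairs."""
    return sum(sct_item_credit(p, e) for p, e in items) / len(items)

# Hypothetical item: "Given new numbness in the left hand, the
# hypothesis of tension headache becomes ..." on a -2..+2 Likert scale.
panel = [-2, -2, -2, -1, 0]        # 5 hypothetical panelists
print(sct_item_credit(panel, -2))  # modal response: full credit (1.0)
print(sct_item_credit(panel, 1))   # response no panelist chose: 0.0
```

The design choice worth noting is that credit is graded rather than all-or-nothing, so an examinee whose judgment matches a minority of experts still earns partial credit, consistent with the article’s emphasis on reasoning under uncertainty.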

The Power of Mindful Learning. Jones A. 22. 1978. Knowing and Being: Essays. Saltzberg D. New York. 1984. and Albert Oriol-Bosch. 1958. Emanuel EJ. San Francisco. Reevaluation of clinical competency. Chicago. NY: Basic Books.acgme. Chopra V. 1994. 44. Experience and Nature. 1992. 4. Asch E. Chicago.282: 833-839. 1990. 18. Linking primary care performance to outcomes of care. and drafting of the manuscript: Epstein. Taira DA. for their critical review of the manuscript. Health Professions Education and Relationship-Centered Care: Report of the Pew-Fetzer Task Force on Advancing Psychosocial Health Education. Woolliscroft JO. 11. Norman GR. Curricular change also can be guided by results of assessments but requires a parallel process of institutional reflection. Validation of professional licensure ex- ©2002 American Medical Association. Epstein R. 1997. Boshuizen HP. Am J Phys Med Rehabil. LaDuca A. 15. Epstein RM. McNaughton N. Taylor DD. Cope DW. A cognitive perspective on medical expertise: theory and implication [published correction appears in Acad Med. Acad Med. Larson EB. 1984. Author Contributions: Study concept and design. Murphy J. Downie RS. 1983. mentoring. Novack DH. In: Spath PL. Bordage G. Kane M. MD. J Gen Intern Med. Hodges B.7:115-140. 49. 1999:97-138. Birk PS. No. 1998. Reason. Hill IK. Mass: Perseus Books. 88:877-882. 1994. 1997. MD. Safran DG. Dugdale DC. PhD. Dubler NN. Reinforcement of self-directed learning and the development of professional attitudes through peer. San Francisco. 1997. Accessed October 1. Montgomery JE. 1999. 47. Feinstein AR. Calif: Addison-Wesley. 1997. Pantilat SZ.120:799-805. Time and the patient-physician relationship. Calif: Jossey Bass. New York. Mandin H. 2000. (Reprinted) JAMA. Philadelphia. Anesthesiology. An inadequate system for feedback.47: 213-220. Acad Med. Starfield B.269:16551660. 1974. Norman GR. OSCE checklists do not capture increasing levels of expertise. 
and several institutions have invested significant time and resources to develop them. Wray C. Preserving the physicianpatient relationship in the era of managed care. Polanyi M. Chang H. 2012 . Influence of context effects on health outcomes: a systematic review.3:25-30. Meredith L. 1995. Acad Med. 27. Pa: American Board of Internal Medicine. 25. From Novice to Expert. MD. Sangster M. Macnaughton J. All rights reserved. JAMA. American Board of Internal Medicine. England: Falmer Press. Lancet. 14(suppl 1):S34-S40. England: Oxford University Press. 43. 34. ACGME Outcome Project. 14. JAMA. 45. Stewart M. Am J Public Health. professional societies. NY: Springer. and Kevin Volkan. NY: Churchill Livingstone. Norman GR. 1932 ed. Koornneef F. Acad Med. Acquisition of data. Kaplan SH. Kosinski M. Model-based practice analysis and test specifications. CMAJ. Learning professional processes: public knowledge and personal experience. Goleman D. 27:S110-S127. Developing Professional Knowledge and Competence.68:13-17. ed. The public’s trust in the medical profession and the ability of medical practitioners to learn from mistakes depends on valid and reliable means of assessment. 1999. Toward creating physician-healers: fostering medical students’ selfawareness. and well-being. Reported significant observations during anaesthesia: a prospective analysis over an 18-month period. Ann Intern Med. 30. Bordage G. New York. Predictors of outcome in headache patients presenting to family physicians—a one year prospective study. Tarlov AR. 12. 1994. Ware JE Jr.74:516-520.71:127-131. 48. Evidence on patient-doctor communication. change medical education. In: Neufeld VR. Epstein RM. Defining competence: a methodological review. 24. Connell KJ. The impact of patient-centered care on outcomes. critical revision of the manuscript for important intellectual content. 1995.9:75-81. American Board of Internal Medicine. Project Professionalism. 1927. The care of the patient.278:502-509. 
1998.and selfassessment. Carline JD. The influence of patient-practitioner agreement on outcome of care. Acad Med. Timothy Quill. CSW. Appl Meas Educ. Suchman AL. Epstein RM. Menlo Park. 1999. expected time of completion. 50. 36. Brian Hodges.ama-assn. 1986. Br J Anaesth.49:399-406. 2. Acknowledgment: We would like to express thanks to Francesc Borrell-Carrio. 1969:123-158. Ernst E. The design of a new physician licensure examination. A strong mentoring system should accompany any comprehensive assessment program. The Headache Study Group of the University of Western Ontario.73(suppl):S19-S21.67:287]. Candib LM. 33. Descartes’ Error: Emotion. Ill: University of Chicago Press. Medical student errors in making a diagnosis. Leake B. Bovill JG. 1981.18:406-416. 37. To foster reflection and action. NY: GP Putnam’s Sons. Hanson M. Polanyi M. and administrative.74:11291134. ed. Tiberius R. LoGerfo JP. 1992. Cancer Prev Control. 1999. Medical educators. Bordage G. In: Eraut M. In: Grene M. 16. New York. MD. 6. Preventable anesthesia mishaps: a study of human factors. and the Pew-Fetzer Task Force. Switching doctors: predictors of voluntary disenrollment from a primary physician’s practice. The logic of tacit inference. Novack DH. 26. feedback. Spierdijk J.79:481-486. Sinacore JM. Acad Med. January 9. 17. Brown JB.49:796-804. Ramsey PG. Available at: http://www. REFERENCES 1. London. Kaiser S. Daniel Klass. 21. 3. Ill: University of Chicago. “Clinical Judgment” revisited: the distraction of quantitative models. 2000. by guest on May 1. Larry Mauksch. These new assessment formats are feasible. Harkness E. Emotional Intelligence.357:757762. Clinical Judgement: Evidence in Practice. and reduce medical errors should be studied in controlled trials. Olthoff AJ. Greenfield S. 46.65:611-621. J Fam Pract. and remediation will subvert even the most wellconceived and validated examination. Headache. 42. Stewart M. Acad Med. and the Human Brain. Schon DA. Hess K.10:18. 
Eval Health Prof. Guide to Evaluation of Residents in Internal Medicine. 51. Daniel Federman. New York. 5. Galajda J. Med Educ. 152:1423-1433. Zacks R. 41. Pa: American Board of Internal Medicine. Peabody F. D’Lugoff BC. Accreditation Council for Graduate Medical Education Web site. Dewey J. MD.16:2226. Assessment of a pattern-recognition examination in a clinical clerkship. Ternov S. eds. Also we would like to acknowledge Anthony LaDuca. Acad Med. Sackett DL. JAMA. 2001. NY: Dover. Oxford. 2001. The human side of medical mistakes. 2000. Andres Sciolla. 1995. J Gen Intern Med. 23. Long CD. Philadelphia. and remediation. 35. 1997. 13. Evidence-Based Medicine: How to Practice and Teach EBM. 20. 38. 40. Regehr G. Inui TS. Error Reduction in Healthcare: A Systems Approach to Improving Patient Safety. 8. Internal medicine patients’ expectations for care during office visits. 1998. Reading. Gale J. LaDuca A. analysis and interpretation of data. Harasym P. Bhrany V. The Reflective Practitioner. NY: Bantam Books. ed. Medicine and the Family: A Feminist Perspective. Brown JB. Damasio AR. Benner P. Tresolini C. 2002—Vol 287. Clark W. PhD.27:679]. Najberg E.273:323-335. 1999. for their contributions to our formulation of professional competence. 1984. Hundert. Rogers WH. 2001. Assessing the effects of physician-patient interactions on the outcomes of chronic disease [published correction appears in Med Care. 32. Med Care.DEFINING AND ASSESSING PROFESSIONAL COMPETENCE should be changed. New York. Randall R. Use of peer ratings to evaluate physician performance. Safran DG. 1985:15-35. Boon H. and means of verification146. Friedman MH. Georgiou A. Personal Knowledge: Towards a PostCritical Philosophy. 1989. 1999. Marsden P. 7. MD. Rogers WH.69:683-684. The structure of medical knowledge in the memories of medical students and general practitioners: categories and prototypes. 1994:100122. the means of achieving them. 
Helping students learn to think like experts when solving clinical problems. 28. Di Blasi Z. JAMA. NY: Basic Books. Woloschuk W. Stewart M. Eraut M. Elaborated knowledge: a key to successful diagnostic thinking. Assessing Clinical Competence. 2000. MD. Clinical problem solving: the beginning of the process. JAMA. 1993. 39. The promise that a more comprehensive assessment of professional competence might improve practice. 1998. Wenrich M. technical. 1994. Ware JE. Kleijnen J.171 as a required outcome of an assessment. Med Educ. et al. 1989. 9.73:575. Calif: Pew Health Professions Commission. Newbower RS. 10. Klass D. J Fam Pract. 1994. Cooper JB. McPeek B.69:883885. Mindful practice. Paulsen RH. Donner A. Dunn MM. 31.72:173-179. Calibrating the physician: personal awareness and effective patient care. some institutions require a learning plan in which trainees chart their learning needs. and licensing boards should view professional competence more comprehensively to improve the process of assessment.26:285-294. Kravitz RL. New York. 19. or material support: Epstein. personal growth. 29. 2 233 Downloaded from jama. Effective physician-patient commu- nication and health outcomes: a review. 1982. J Fam Pract. Schmidt HG. Gross R. Langer EJ. Kaplan C.

Clinical performance-based test sensitivity and specificity in predicting first-year residency performance. Acad Med. 2 (Reprinted) ©2002 American Medical Association. 1996. 1991. 1993. Ann Intern Med. 80. Klass DJ. 1997. Dawson B. Tamblyn RM. synthesis. Med Educ. Eval Health Prof. Carney PA. Factors associated with the accuracy of standardized patient presentation. 1992. 2000. Acad Med. Colliver JA. Swanson DB. Reliability and validity of an objective structured clinical examination for assessing the clinical performance of residents. Kopelow ML.68(suppl):S41-S45. Standardized patients’ accuracy in recording examinees’ behaviors using checklists. Validity of NBME Part I and Part II scores for selection of residents in orthopaedic surgery. Scoring and standard setting with standardized patients. A comparison of knowledge. Rethans JJ. Dietrich AJ. 1997. Barrows HS. Klass DJ. Kopelow ML. Guide to postgraduate exams: multiple-choice questions. Rothman A. Webster GD. Schnabl GK. Dauphinee WD. Tamblyn R. 52. Calhoun JG. Acad Med.7:485-499. Robbs RS. 81.25:100-109. Factors affecting the reliability of ratings of students’ clinical skills in a medicine clerkship. 1984. 1992. Distlehorst LH. Klass DK. Boshuizen HP. Carline JD. The periodic health examination provided to asymptomatic older women: an assessment using standardized patients. 1998. 95. 66. et al. Acad Med. Schnarch B. Badger LW. 72. 2001.71(suppl):S4-S6. van der Vleuten C. 109. Acad Med. Ely JW.280:989-996. 65. Ware JE Jr.48:965-972. Norcini JJ.71(suppl):S84-S86. 1998. and preventive medicine. Swanson DB. 72:619-626. 99.74:539-546. Klass DJ.67(suppl):S19S21. van der Vleuten C.73(suppl):S114-S116. Kassebaum DG. Owen M.65(suppl):S55S56. Colliver JA.25:100-109. Fried ED. 1996. 1990. 71. AJDC. 89. Ramsey PG. 1980. Schuwirth LW. Moroff S. Jones R.65(suppl): S63-S67.278: 1164-1168. Thiede KW. An expert-judgment approach to setting standards for a standardized-patient examination.70:52-58. Acad Med. Kassirer JP. 
by guest on May 1. 102. Acad Med. January 9. 72(suppl 1):S85-S87. Kohn LT. 1992. Freeman DH Jr. Tamblyn RM. 1994. Detecting and correcting for raterinduced differences in standardized patient tests of clinical competence. Psychometric characteristics of the objective structured clinical examination.71(suppl):S31-S33. 111. JAMA. validity and efficiency of multiple choice question and patient management problem item formats in assessment of clinical competence. 1993. Simulated patients and the assessment of medical students’ interpersonal skills.2:58-76. et al. 97. Arch Intern Med. Stillman PL. 1998. Does CME work? an analysis of the effect of educational activities on physician performance or health care outcomes. 59. et al. De Champlain AF. et al.28: 21-39. 110. 96. Assessment of clinical skills of residents utilizing stan- JAMA. Swanson DB. 64. 1997.73(suppl):S112-S113.134:587-590. and construct validity. 2000. 104. Dawson JD. Eliassen MS. Gomez JM. Developing case-specific checklists for standardized-patientbased assessments in internal medicine: a review of the literature. Validity of final examinations in undergraduate medical training. Ripkey DR.48:757761. Matsell DG. et al. 114. Acad Med. Swanson D. Carney PA.6:36-44. Mann KV. Smith BO. 73. Norcini JJ. Psychometric test results associated with high achievement in basic science components of a medical curriculum. Ann Emerg Med.6:213-220. Young PR. Becker DF. Acad Med. 234 Reliability. Colliver JA. Verhulst SJ. Vu NV. 1980. Kopelow ML. Burnstein M.14: 179-183. Newble DI. 1992. Washington. 1997. Shea JA. 74. Mott LA.68(suppl):S51-S56. Brailovsky C. Verheggen MM. 83. 1985.269:16551660. Swanson DB. 100. 57. Wenrich MD. Verhulst SJ.321: 1217-1219. MacRae HM. Democracy and Excellence in American Secondary Education. Setting examination-level standards for a performancebased assessment of physicians’ clinical skills. The assessment of clinical skills/ competence/performance.119:129-135. Surgery. Marcy ML. McLeod PJ. 
The assessment of clinical competence on the emergency medicine specialty certification examination: the validity of clinically relevant multiple-choice items. DC: National Academy Press. Veloski JJ. Carney PA. Guidelines for assessing clinical competence. 145:757-762. Rothman A. Keystone J. 88. 91. Int J Psychiatry Med.17:178-197. 1993. Case SM. 60. Hassard TH. 1990.127:429-438. 106. Acad Med. et al. Joorabchi B. 2000. Teaching Learning Med. Clauser BE.102:520-528. Regan MB. structured clinical examination for licensing family physicians. Miller GE. Med Educ. 1996. 1992. 62. Acad Med. 93. Rainsberry P. Brailovsky CA. Blackwell TA.25:293-299. 1999.146:1735-1740. LoGerfo JP. Dinant GJ. Tamblyn RM. Barrows HS. Mott LA. Tutton PJ.35:348-356. Cusimano MD. 1999. 55. Hsu E. Reliability and validity of the objective structured clinical examination in paediatrics. 2000. 67. Assessing clinical performance with standardized patients. De Melker RA. Davis D. Greenfield S.65(suppl):S25S26. Reznick RK.7:506-510. 1997. Simulated patients in assessing consultation skills of trainees in general practice vocational training: a validity study. 1990. Pseudoaccountability. Stillman P. Kopelow ML. 71:170-175. 103. van der Vleuten C. et al. All rights reserved.71:181-186.67(suppl): S13-S15. Acad Med. 1994. Eaglen RH. 2002—Vol 287. 76. Tamblyn R. 1964. Gorter S. An overview of the uses of standardized patients for teaching and evaluating clinical skills: AAMC. 122:335-343. The relationship between clinical science performance in 20 medical schools and performance on Step 2 of the USMLE licensing examination: 1994-95 Validity Study Group for USMLE Step 1 and 2 Pass/Fail Standards. Use of peer ratings to evaluate physician performance. Abrahamowicz M. Poole AD. Med Educ. JAMA. Defining standards of competent performance on an OSCE. 1991. Acad Med. De Champlain AF. van der Vleuten C. Acad Med. Colliver JA. JAMA. Use of standardized patients in the assessment of medical practice. 56. 
A standardized-patient assessment of a continuing medical education program to improve physicians’ cancer-control clinical skills. Margolis MJ. van der Vleuten C. Med Educ. et al. Vu NV. Young PR. 54. 90. Kaplan S. Ainsworth MA. Tamblyn R. 75. Wass V. Inui TS.20:121123. A comparison of standard-setting procedures for an OSCE in undergraduate medical education. Swartz MH. 1991.75:267-271. 94. 113.16:322-332.158: 205-207. CMAJ. Large-scale use of an objective. Six years of comprehensive.ama-assn. Shortcomings in the evaluation of students’ clinical skills and behaviors in medical school. 79. 84. 82. Sawnson DB. Ann Intern Med. Association between licensing examination scores and resource use and quality of care in primary care practice. Eval Health Prof. Cohen R. Sanson-Fisher RW. 85. and clinical judgment: multiple-choice questions in the assessment of physician competence. 1992. BMJ.150:573-577. Ramsay JO. An examination of the examinations: the reliability of the objective structured clinical examination and clinical examination. Pujol R. To Err Is Human: Building a Safer Health System.14:249-253. Br J Hosp Med. 1994. 70. 68. Wolfish NM. Objectively evaluating student case presentations. 53. Berkson L. Reznick R. Abrahamowicz M. 1990. 1998. 1988. Sutnick AI.28:226-233. Acad Med. 1995. King A. Objective structured clinical examination in a pediatric residency program. clinical. Burnett J. Travis T. Swanson DB. Who should rate candidates in an objective structured clinical examination? Acad Med. Med Educ. 78. Case SM. Case SM. Patients don’t present with five choices: an alternative to multiple-choice tests in assessing physicians’ competence. Touw-Otten FW. Carline JD. 1994. 1998. Assessing practicing physicians in two settings using standardized patients. JAMA. 1991. dermatology.75:1130-1137. Tamblyn RM. Acad Med. 1997. Petrusa ER. Validity of simple approach to scoring and standard setting for standardized-patient cases in an examination of clinical competence. 
Schnabl GK. 92.DEFINING AND ASSESSING PROFESSIONAL COMPETENCE aminations: professions theory. Downloaded from jama. 1993. Do short cases elicit different thinking processes than factual knowledge questions do? Med Educ. 1999. Expanding patient involvement in care: effects on patient outcomes. Ill: Rand McNally. Margolis MJ. Use of standardized patients to assess between-physician variations in resource utilization. Acad Med. 58. Invest Radiol. Lescop J. Can standardized patients predict real-patient satisfaction with the doctor-patient relationship? Teaching Learning Med. Paauw DS. 112.9:554556. Corrigan JM. Robeson MR. CMAJ. 1992.278: 790-791. Recognizing and managing depression in primary care: a standardized patient study. 61. et al. Norcini JJ. 63. Schanbl GK. The accuracy of standardized patient presentation. 2001. Med Educ. Ann Intern Med. 1993. Teaching Learning Med. 87. Travis TA. Acad Med. Latif A. et al. Grosso LJ. 107. Donaldson MS. performance-based assessment using standardized patients at the Southern Illinois University School of Medicine. Margolis MJ.35:321-325. Klass DJ. Grand’Maison P. Med Educ. Larson EB. Assessment of clinical skills with standardized patients: state of the art. Abrahamowicz M. 1999.68:443-451.31:94-98. Freeman DH Jr. Downing SM. A new assessment tool: the patient assessment and management examination. Klass DK. Regehr G.19:238-247. Standardized or real patients to test clinical competence? the long case revisited. 1990. Rabinowitz HK. 108. Clinical skills assessment with standardized patients. 98.48:23-30. Swartz MH. Colliver JA. Klass DJ. Swanson DB. Holsgrove GJ. Eval Health Prof. Scherpbier A. test design. Relationship between scores on NBME basic science subject tests and the first administration of the newly designed NBME Part I examination. Newble D. Dietrich AJ. J Gen Intern Med. 77. Acad Med. Med Educ. Charon R. J Fam Pract. 86. Acad Med. Acad Med. 
Validating the standardizedpatient assessment administered to medical students in the New York City Consortium. 2001. 1985. Schnabl GK. Broudy HS. 1997. 69. Ann Intern Med. Webster GD. Ross LP. Dauphinee D. Acad Med. Pieters HM. Kaufman DM. Regehr G. Grosso LJ. Tamblyn RM. Malpractice claims against family physicians: are the best doctors sued more? J Fam Pract. 2012 .74:842-849.67:42-50. No. Prieto L. Bardes CL. Kopelow ML. Orr NA. et al. Med Educ. 105. Blane CE. Med Teacher. Tamblyn RM. 101. The accuracy of standardized patient presentation. 1996. 1985. Muijtjens AM. Martin JA. Chicago. 1993. Ramsey PG. Allen JD. Gayton D. Unnecessary prescribing of NSAIDs and the management of NSAID-related gastropathy in medical practice. Tamblyn RM.

Ratings of videotaped simulated patient interviews and four other methods of evaluating a psychiatry clerkship. Harvey J. Benbassat J. 1987. Foulkes J. (Reprinted) JAMA. Occasional Paper 46. McGibbon D. Earp JA. January 9. 147. Beeth S. Franks P.7:174-179. Composite undergraduate clinical examinations: how should the components be combined to maximize reliability? Med Educ. 136. 128. Validity and generalizability of global ratings in an objective structured clinical examination. Fraser RC. Rice MM. Arch Intern Med. Berkson L. Assessment of practicing family physicians: comparison of observation in a multiple-station examination using standardized patients with observation of consultations in daily practice. First-visit bias in the measurement of clinical competence with standardized patients. Hinton-Walker P. 115. Grol R. In: Suchman AL. Med Educ.71:495-498.9:321-326. Assessing physicians’ interpersonal skills via videotaped encounters: a new approach for the Royal College of General Practitioners Membership examination. Schlesinger H. Stillman RM. Med Educ. Feasibility and psychometric properties of using peers. van der Vleuten C. Manual located at the University of Toronto.155:1877-1884. Rochester.4:143-152.93:439-442. 1983. Virtual reality in medical training. 161. Rochester. Robb A.66:169-171.159:423-426. 1994. Barnas GP. Tamblyn R. practice.169: 418-420. Christakis D. Cohen R. Roca RP.157:1352-1353. 164. Acad Med. 169. 167. Violato C. Helmreich RL.34:525-529. Frankford DM. Improvement of reliability of an oral examination by a structured evaluation instrument. Helmreich RL. NY: University of Rochester Press. Cohen R. 121.9: 146-152. 1993. Structured single-observer methods of evaluation for the assessment of ward performance on the surgical clerkship. Hamers JG. Grol R. 150. Blank LL. Wang-Cheng RM. 1995. Washington. consulting physicians. 1997. Partnerships in Health Care: Transforming Relational Process. Ross J.274:741-743. 
A structured teamwork system to reduce clinical errors. Schouten B. London M. How well does chart abstraction measure quality? a prospective comparison of standardized patients with the medical record. Hastings AM. 168. Rowland-Morin PA. 1995. Standardized patients: a long-station clinical examination format. Thomas EJ. ©2002 American Medical Association. 1996. Anesthesiology. 1983. The mini-CEX (clinical evaluation exercise): a preliminary investigation. 130. van der Vleuten C. van der Vleuten C.70:898-931. Thompson BL. Error Reduction in Health Care: A Systems Approach to Improving Patient Safety. Colliver JA. Benaroya S. Influence of patient education on profiles of physician practices. Hodges B. Acad Med. Bardes CL. Acad Med. 1991. NY: Springer Publishing Co. 1996. 1998.74:1028-1032. Goulet F. Rothman AI. 119. Rethans JJ. Maximizing the Value of 360-Degree Feedback: A Process for Successful Individual and Organizational Development. How well do faculty evaluate the interviewing skills of medical students? J Gen Intern Med. 155. Lawrence SL. Fiscella K. Scully J. van der Vleuten C. Fulkerson PK. 66:545-548. 1993. 1990. Helmreich RL. 135. Acad Med. Hoogenboom RJ. Noel GL. Swartz MH. Dauphinee WD. Acad Med. A pilot experience with competency-based clinical skills assessment in a surgical clerkship. 129. NY: University of Rochester Press. 1992. 149. Schouten B. 1991. Jaffe BM. Kester A. Cote S. 159. The Examination for Membership in the Royal College of General Practitioners.71:624-642. Verhoeven BH.21: 477-481. Norcini JJ. 1991. 1992. Norman GR. Gonzi A. Measurement of clinical reflective capacity early in training as a predictor of clinical reasoning performance at the end of residency: an experimental study on the script concordance test. 2000. Epstein RM. Formative assessment of the consultation performance of medical students in the setting of general practice using a modified version of the Leicester Assessment Package. Musson DM. Neighbour R. 
Checklist self-evaluation in a standardized patient exercise. 153.ama-assn. Dunnington GL. Snell L. England: Royal College of General Practitioners. 145. The feasibility and value of using patient satisfaction ratings to evaluate internal medicine residents. 74:62-69. Can J Anaesth. Parker S. Partnerships in Healthcare: Transforming Relational Process. Fidler H. 1990. Coe NP. 1999. Teaching Learning Med. Lane KM. 1994. Am J Surg. 127. 170. 1995. Arnold GK. Influence of effective communication by surgery students on their oral examination scores. Risser DT. 151. Surgery as team endeavour. Mumford E. 1991. 171. Marini A. Reisner L.7:499-505. National Office of Everseas Skills Recognition Research Paper # 8. 116. and teamwork in medicine and aviation: cross sectional surveys. Brailovsky C. 137. Effect of student and preceptor gender on clinical grades in an ambulatory care clerkship. Turnbull J. 2001. Med Educ. 1993. Kimball HR. 2000.58:864-872. 152. Irby D. Characteristics of the informal curriculum and trainees’ ethical choices. Hanusa BH. Acad Med. Benaroya S.25:110-118. Salisbury ML. Ferrell BG.35:326-330. J Gen Intern Med. 133. Patient and Physician Peer Assessment and the ABIM Program of Continuous Professional Development. 2002—Vol 287. Lee M. Am J Surg. CMAJ. 124. 156. Institute of Medicine. Med Educ. Teaching and learning in ambulatory care settings: a thematic review of the literature. Campion P. Yang JC. 1999. In: Spath PL. 146. Dunnington G.167: 604-606. Med Educ. Scherpbier AJ. Markakis KM. J Health Commun. 2001. Collaborative Clinical Education: The Foundation of Effective Patient Care. Helmreich RL. The Development of CompetenceBased Assessment Strategies for the Professions. Med Educ. The Script Concordance test: a tool to assess the reflective clinician. Pangaro LN. Surgery. 73:138-145. Improving physicians’ interviewing skills and reducing patients’ emotional distress—a randomized clinical trial. 118. Hall JA. et al. van der Vleuten C. Lockyer J. 
Pitfalls in the pursuit of objectivity: issues of reliability. 132. 2 235 Downloaded from jama. 1999. 134. Can J Anaesth. Cuerdon T. Westburg J. Friedman M. Peabody JW. Communication between primary care physicians and consultants. Botelho RJ. Hobma S. Helmreich RL. Assessment Across Basic Science and Clinical Skills: Using OSCEs With Postencounter Basic Science Probes. Kern DE. Hinton-Walker P. 140. and community in a market-driven era. 2001. Brailovsky C. Error. Rethans JJ. 2000. American Board of Internal Medicine. Ann Intern Med. McLeod PJ. Helmreich RL. 1995. Konrad TR. Crossing the Quality Chasm: A New Health System for the 21st Century. 2001. 158. 1989. 1994. No. Herbers JE Jr. 148. Schnarch B. Singer PA. 1994. Gordon MJ. BMJ.131:745-751. 165. Hundert EM. Weaver MJ. ed. Kaiser S. Acad Med. 1997. 120. 1998. 117. 154. Global ratings of videotaped performance versus global ratings of actions recorded on checklists: a criterion for performance assessment with standardized patients. Keynan A. by guest on May 1. 1999. Reliability of global rating scales in the assessment of clinical competence of medical students. Rethans JJ. Ram P. DC: National Academy Press. The interrater reliability and internal consistency of a clinical evaluation exercise. Med Educ. All rights reserved.123:795-799. Jason H. Norman G. Roy L. Wright K. J Med Educ. Kroboth FJ. Faculty ratings of resident humanism predict patient satisfaction ratings in ambulatory medical clinics. Cooper GS.114: 393-401.320:745-749. Glassman P. Schaefer HG. Acad Med. 166.144:316-322. van der Vleuten C CP. 125. Suchman AL. Dresselhaus TR. Am J Surg. San Francisco. A review of the validity and accuracy of self-assessments in health professions training. Wass V. 108:642-649.34:573-579. Toronto. NY: John Wiley & Sons. Bauer JJ. 1999. 143. Hoffman K. J Gen Intern Med. Snell L.33:197203. Sexton JB. eds. Fried ED. Where do we stand and what might we expect? JAMA. 123. Ontario. van der Vleuten C.66:762-769. 
van der Vleuten C. Calif: Jossey-Bass Inc. 1992. New York. 163. Roter DL.47:391-392. McLeod P. Hafferty F. Ram P. 2000.320:781-785. 142. 1993. 1995. Davies JM. Luck J. 1991.27:376-381. 139. 1995. co-workers. Tate P. The triple-jump examination as an assessment tool in the problem-based medical curriculum at the University of Hawaii. McKinley RK. 2000. Burchard KW. Poldre P. Grol R. Aretz K. Assessment of general practitioners by video observation of communicative and medical performance in daily practice: issues of validity. Anaesthetic simulation and lessons to be learned from aviation. Frankel RM. Tornow W. 2000. 1998. 1998:171-183. Am J Psychiatry. 157. J Gen Intern Med. Beckman HB. 1999. Acad Med. 160.DEFINING AND ASSESSING PROFESSIONAL COMPETENCE dardized patients: a follow-up study and recommendations for application. Abrahamowicz M. Beausoleil S. Fulginiti J. Witzke D. 138. Kowlowitz V. Pa: American Board of Internal Medicine. Tamblyn R. Garb JL.33: 447-454. stress. Acad Med. 131. Kalet A.80:479. Lockie C. Canberra: Australian Government Publishing Service. Ram P. New York. Charlin B. 67(suppl):S22-S24. Assessment in general practice: the predictive value of written-knowledge tests and a multiple-station examination for actual medical performance in daily practice.70:324-326. 2001. The doctor-patient relationship and malpractice: lessons from plaintiff depositions. London. 1987. Acad Med. Barker LR. 162. Evaluation of the student: improving validity of the oral examination. 1994. and patients to assess physicians. Am J Med. The effect on reliability of adding a separate written assessment component to an objective structured clinical examination. Charon R. Cole KA. Smith RM. BMJ. 2000. Philadelphia. Davies JM. 2012 . 144. 122. Med Educ. On error management: lessons from aviation. Charlin B. The importance of human factors in the operating room. et al. Toews J. Arch Intern Med. Laube DW. 126. Suchman AL. Acad Med. 
Responsive medical professionalism: integrating education.72 (suppl 1):S82-S84. 1999:235-277. Abrahamowicz M. Assessing clinical performance.35:430-436. Botelho RJ. reliability and feasibility. Field S. Ann Intern Med. Ann Intern Med. Moroff S. Tamblyn RM. De Graaff E. 68:366-372. 1997.12:189-195. Simon R. van der Vleuten C. Performance-based assessment of clinical ethics using an objective structured clinical examination. 4:202-208.44:907-912. Acad Med.154:1365-1370. How accurate are faculty evaluations of clinical competence? J Gen Intern Med.
