Advances in Developing Human Resources
http://adh.sagepub.com

An Integrative Model of Competency Development, Training Design, Assessment Center, and Multi-Rater Assessment
Hsin-Chih Chen and Sharon S. Naquin
Advances in Developing Human Resources 2006; 8(2); 265-282
DOI: 10.1177/1523422305286156
Copyright 2006 Sage Publications
The online version of this article can be found at: http://adh.sagepub.com/cgi/content/abstract/8/2/265

Published by Sage Publications (http://www.sagepublications.com) on behalf of the Academy of Human Resource Development.


The problem and the solution. Although assessment center has been proven effective in predicting performance, the issue of establishing construct-related validity of assessment center is still unsolved. Woehr and Arthur asserted that the lack of construct-related validity in assessment center literature is primarily due to issues of design and development, resulting in an unmet research challenge. This article focuses on the design aspect of assessment center to develop an integrative competency-based assessment center model that links competency development, training design, assessment center, and multi-rater assessment together. Built around validity (particularly construct-related) issues of assessment center, the model guides scholarly practitioners on how to design a competency-based assessment center that has the potential to improve construct-related validity and the capability to build into training design and assessment and other human resource functions. Nine propositions related to validity were developed in accordance with the model to evoke future research. Practical implications are also provided.

Keywords: assessment center; core competency; competency modeling; training design; performance assessment

A review of related literature indicates that researchers have not reached a clear definition of competency. The term sometimes refers to outputs of competent performers and sometimes refers to underlying characteristics that enable an individual to achieve outstanding performance (Dubois & Rothwell, 2004; Hoffmann, 1999; McLagan, 1997). Most definitions, however, relate to exemplary performers or performance in a specific job or job level (Boyatzis, 1982), whereas a relevant term, core competency, is tied to strategic, future-oriented,

collective functions at the organizational level (Moingeon & Edmondson, 1996; Prahalad & Hamel, 1990). With the generic term just defined, we have adopted an overarching perspective that combines both the performance and strategic aspects associated with the various definitions found in the literature. We consider competency to refer to the underlying individual work-related characteristics (e.g., knowledge, skills, attitudes, beliefs, motives, and traits) that enable successful job performance, where "successful" is understood to be in keeping with the organization's strategic functions (e.g., mission, vision, uniqueness, future orientation, success, or survival). A similar construct, competency development or competency modeling, refers to the process of identifying a set of competencies representative of job proficiency. However, it should be noted that although the terms job analysis and competency modeling are often used interchangeably, the two differ in terms of assessment of reliability, strategic focus, and expected outcome (Shippmann et al., 2000).

Competency development can enhance various human resources (HRs) and organizational development activities including personnel selection, training and development, performance appraisal, compensation, succession planning, HR planning, individual career planning, job promotion, placement, training needs analyses, strategic planning, and recruitment (Byham & Moyer, 1996; Dubois & Rothwell, 2004; Lucia & Lepsinger, 1999). It is understandable that assessment strategies and methodologies are closely related to competency, and a common assessment strategy is the use of assessment center. Assessment center is not a brick-and-mortar research center or building. It is, rather, an abstract concept that exists in practice and refers to standardized procedures used for assessing behavior-based or performance-based dimensions whereby participants are assessed using multiple exercises and/or simulations (Thornton, 1992). Common simulation exercises used in an assessment center setting include oral presentations, in-basket exercises, leaderless group discussions, role-playing, oral fact-finding, business games, and integrated simulations (Thornton & Mueller-Hanson, 2004). Dimensions for assessment (equated to competencies) are usually identified through job analysis.

Research Problems

Research on assessment center has evolved over the past few decades as researchers have moved from focusing on an understanding of what an assessment center is and how it works to establishing some criterion-related validity and generalizability (Howard, 1997). However, the issue of establishing construct-related validity of assessment center is still unsolved, resulting in an unmet research challenge (Robie, Osburn, Morris, Etchegaray, & Adams, 2000). Construct-related validity refers to the degree to which a theoretical concept is operationalized and the degree to which the operational inference exhibits

consistency with what a researcher intends to measure. In current assessment center literature, it fundamentally refers to discrepancies between competencies and the measures that are used to demonstrate such competencies in assessment center activities. Woehr and Arthur (2003) asserted that the lack of construct-related validity in assessment center literature is primarily due to issues of design and development. This challenge also clearly relates to an ongoing debate as to whether the design of assessment center should be based on dimensions and competencies or on tasks and exercises (Byham, 2004; Joyce, Thayer, & Pond, 1994; Lowry, 1995).

Another issue existing in assessment center literature is that changes of individual behavior can be readily observed through assessment center activities, whereas the multi-rater assessment method relies on perceptions and/or memories of behavior. This is primarily due to the fact that assessment center typically involves observation of outcomes or performance behaviors. It is regrettable that the ability to assess implicit behavior (e.g., values, beliefs, emotion, motivation, visions, etc.) through an assessment center is limited. In contrast, multi-rater assessment such as dual-ratings assessment can potentially be more effective in assessing implicit behavioral competencies, but these methods are not able to provide the level of information regarding tangible outcomes that assessment center can. Accordingly, assessment center and multi-rater assessment seem to complement each other perfectly (Howard, 1997).

Meanwhile, although assessment center has received a wide range of applications in HR-related functions, its utilization in human resource development (HRD) practice is relatively sparse (Chen, 2006 [this issue]), and the applications appear to be piecemeal and not systematically connected. Because HRD is deeply rooted in the design and development of learning activities across various levels, integrating the HRD perspective into assessment center has strong potential to contribute to assessment center literature in resolving the construct-related validity issues of assessment center. On the other hand, the application of the assessment center to HRD can also help the HRD field, particularly the design of training assessment, move further away from cognitive or reactive assessments toward behavioral assessment, a more reliable measure. An obviously and immediately useful application of assessment center to HRD is for assessing the effectiveness of competency-based training.

Research Purposes

The purpose of this article is to develop a competency-based assessment center design model that integrates competency development, training design, assessment center activities, and multi-rater assessment strategies. Because of the integration, the competency-based assessment center evidently differs from traditional assessment center in scope.

Conceptual Framework

The model is guided by best practice and research in competency-based development, training design, assessment center strategies, and multi-rater assessment. It is important to note that the following notions are not intended to be comprehensive; only key concepts that underpin the purpose of this article are included. Instead, they provide readers with a generic understanding of how the model is framed.

Competency-Based Development

Common practice of competency development is through quantitative and/or best practice approaches to develop a set of competencies characterized by individual skills, knowledge, behaviors, and traits. The quantitative approach works through the identification of exemplary performers on a specific job and of the characteristics that underlie their successful performance on the job (e.g., Spencer & Spencer, 1993). The best practice approach works through the adoption of an existing competency model (e.g., leadership skills identified by a benchmarked organization or institute) and is often followed by a dynamic customization of competencies for use in a particular organization (e.g., Naquin & Holton, 2003).

Traditional assessment center is developed through job analysis to identify individual work-related characteristics. Such a mechanism is current in nature: it has often overlooked, or has limited ability to appropriately respond to, an organization's strategic, future-oriented functions. Accordingly, we have adopted an overarching definition of competency that includes the organization's strategic functions, so the traditional assessment center, which mainly serves selection and promotion purposes, no longer satisfies the extended scope. To the contrary, a competency-based assessment center rooted in competency development and integrated with multi-rater assessment can overcome or complement this limitation.

This model attempts to serve multiple purposes. First, it provides a design process that has the potential to enhance the construct-related validity of an assessment center. Second, it introduces a systematic approach to linking competency development, training design, assessment center, and multi-rater assessment. Third, the model helps develop a set of propositions for future research.

Competency-Based Training Design

A generic difference between traditional training and competency-based training designs is that the former is learning-focused whereas the latter is based on performance. Competency-based training must tie to work-related performance outcomes such as transfer of learning or behavior change. Indeed, Blank (1982) identified four major characteristics of competency-based

programs that are essential for competency-based training design: outcome driven, trainee centered, task mastering, and a high level of proficiency in a job-related setting.

Assessment Center

As mentioned, the assessment center is a standardized procedure used for assessing behavior-based or performance-based dimensions whereby participants are assessed using multiple exercises and/or simulations. According to Joiner (2000), an assessment center should include 10 key components: (a) job analysis, (b) behavior classification, (c) assessment techniques, (d) multiple assessments, (e) simulations, (f) assessors, (g) assessor training, (h) recording behavior, (i) reports, and (j) data integration. Common errors of assessment centers (Caldwell, Thornton, & Gruys, 2003) were also considered in developing the model. These errors as described by Caldwell et al. (2003) include (a) poor planning, (b) inadequate job analysis, (c) weakly defined dimensions, (d) poor exercises, (e) lack of pretest evaluation, (f) unqualified assessors, (g) inadequate assessor training, (h) inadequate candidate preparation, (i) sloppy behavior documentation and scoring, and (j) misuse of results.

Multi-rater Assessment

Multi-rater assessment is also known as 360-degree feedback assessment or multisource assessment. Multisource assessment or feedback operates through an objective lens and is a dynamic process that provides developmental or evaluative information, such as performance and developmental feedback, about one's performance or behavior. The process involves an individual's self-evaluation against a set of criteria and in comparison to norms drawn from other raters' assessments of that individual. Multi-rater assessments collect information from individuals and from their subordinates, supervisors, peers, and customers with regard to their perceptions of the behaviors of interest. Similar to the assessment center, which has gone beyond its traditional application for selection and promotion, research related to 360-degree feedback has also reached beyond its traditional application for management development to other HR functions such as performance appraisal (Toegel & Conger, 2003). Wimer and Nowack (1998) suggested 13 common mistakes in using 360-degree feedback: (a) unclear purpose, (b) using it as a substitute for managing a poor performer, (c) lack of pilot testing, (d) no key stakeholder involvement, (e) insufficient communication among people involved in the process, (f) compromising confidentiality, (g) lack of clarifying how the feedback will be used, (h) insufficient resources for implementation, (i) lack of clarification of ownership of the data, (j) unfriendly administration and scoring, (k) improper link to existing systems without a pilot, (l) treating it as an end, not a process, and (m) lack of measuring effectiveness.
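As an illustration of the comparison just described, the sketch below (not from the article; every name and rating is invented) computes a simple self-versus-others gap for each subcompetency, the kind of norm comparison a multi-rater process feeds back to the individual:

```python
from statistics import mean

# Hypothetical 1-5 ratings on two subcompetencies, keyed by rater group.
ratings = {
    "listens actively to staff concerns": {
        "self": [4],
        "subordinates": [3, 2, 3],
        "supervisors": [3],
        "peers": [4, 3],
    },
    "communicates decisions clearly": {
        "self": [5],
        "subordinates": [3, 3, 4],
        "supervisors": [4],
        "peers": [4, 4],
    },
}

for subcompetency, by_group in ratings.items():
    self_score = mean(by_group["self"])
    # The norm: all non-self ratings pooled, the comparison point used
    # for developmental feedback in a 360-degree process.
    others = [r for group, rs in by_group.items() if group != "self" for r in rs]
    gap = self_score - mean(others)
    print(f"{subcompetency}: self={self_score:.1f}, "
          f"others={mean(others):.2f}, gap={gap:+.2f}")
```

A positive gap flags a subcompetency on which the individual rates himself or herself more favorably than others do, which is exactly the kind of implicit-behavioral signal the article later routes to multi-rater assessment rather than to assessment center exercises.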

Competency-Based Assessment Center Design Model

The model consists of eight components that serve as a practical guide for designing a competency-based assessment center. The model can be found in Figure 1. Built around the model are nine propositions, presented below and followed by the authors' rationale for each.

Building a Hierarchical Competency System

The first step in developing a competency-based assessment center is to build a hierarchical competency system that breaks a whole into supporting parts. This step is critical because it lays out a framework to guide training design and assessment center measurement. The task of identifying competencies in current competency modeling practice takes various forms. Some identify competencies in specific ways such as performance or behavioral indicators (e.g., respond to a customer's inquiry politely and consistently, adjust equipment in terms of a mechanical manual, etc.). Others describe them in generic or abstract terms (e.g., communication, problem solving, team building, networking, etc.). As Holton and Lynham (2000) pointed out, "competencies are less specific than tasks, but more job related than learning objectives alone" (p. 11).

The number of levels of competencies depends on the complexity of a system. For discussion purposes, we divide the hierarchical competency system into three levels: competencies, subcompetencies, and procedures or steps. The first level, which is in abstract form, can be easily communicated in discussing competency issues with stakeholders. The second-level items (the subcompetencies) are the action statements that support competencies. The third level provides detailed guidelines for achieving the action statements in level two. Competencies are described in collective, abstract form, whereas their supporting subcompetencies are more measurable and specific, but less collective than competencies. Subcompetencies normally consist of a set of observable, specific, behavior-based steps. The three-level hierarchical competency system appears to be effective in communicating with stakeholders and in linking competency to training design and assessment center.

Much of the literature in competency development (not assessment center) addressing validity issues focuses on face or content validity. In other words, validity is examined through a qualitative rather than quantitative lens. Specifically, determining whether the competencies are valid is most often based on subject matter experts' or managers' judgment. The hierarchical competency system can serve as a conceptual framework for quantitative research to enhance the construct-related validity of competencies. Specifically, the competencies can be used as the constructs to be assessed, whereas the subcompetencies are variables that represent the constructs. Researchers can use subcompetencies to develop a questionnaire or survey that can be distributed to a targeted sample.
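To make the three-level system concrete, the following sketch shows one possible representation; the article prescribes no data format, and the competency, statements, and steps here are hypothetical examples in the spirit of those above. The final loop illustrates how the subcompetencies could become the questionnaire items just mentioned:

```python
from dataclasses import dataclass, field

@dataclass
class Subcompetency:
    statement: str                                    # level 2: action statement
    steps: list[str] = field(default_factory=list)    # level 3: behavior-based steps

@dataclass
class Competency:
    name: str                                         # level 1: collective, abstract form
    subcompetencies: list[Subcompetency] = field(default_factory=list)

communication = Competency(
    name="Communication",
    subcompetencies=[
        Subcompetency(
            statement="Respond to customer inquiries politely and consistently",
            steps=["Acknowledge the inquiry within one business day",
                   "Restate the customer's concern before answering"],
        ),
        Subcompetency(
            statement="Deliver oral briefings appropriate to the audience",
            steps=["Open with the decision being requested",
                   "Limit the briefing to three key points"],
        ),
    ],
)

# Subcompetencies become the survey items; the competency is the construct.
for i, sub in enumerate(communication.subcompetencies, start=1):
    print(f"Item {i} (construct: {communication.name}): {sub.statement}")
```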

FIGURE 1: Integrated Competency-Based Assessment Center Model. [The figure pairs a practical track with a research track, summarized below.]

Practical Track (Step-by-Step Competency-Based Assessment Center Design)

Building Hierarchical Competency System
• Develop a three-level competency system.
• Embrace both qualitative and quantitative approaches to develop and refine definitions of competencies.

Linking Subcompetency to Competency-Based Training Design and Competency-Based Assessment Center
• Subcompetencies serve as central links to training design and competency-based assessment center design (see Figure 2 for details).

Developing Subcompetency-Assessment Center Activity Matrix
• Use subcompetency rather than competency to develop the matrix.
• Build multi-rater assessment into the matrix design.

Differentiating Implicit and Explicit Behavior
• Differentiate explicit-behavioral and implicit-behavioral subcompetencies.

Determining Appropriate Competency-Based Assessment Center Activities
• Use a numeric scale rather than a dichotomous scale at the subcompetency level to determine the appropriateness of assessment center activities.

Determining Performance Outcomes for Activities
• Performance outcomes are informed by job outcomes in training design and subcompetencies in competency development (see Figure 2 for details).
• Leverage the number of performance outcomes in an activity.
• Develop action-oriented supporting performance indicators.

Designing Competency-Based Assessment Center Materials
• Use customized materials to enhance fidelity.

Selecting and Developing Assessors
• Select assessors from two levels higher in the organization than the individuals to be assessed.
• Use an experienced, skilled trainer for assessor training.
• Clearly identify training objectives and performance guidelines in the assessor training.

Research Track (Research Propositions)

Proposition 1: Using factor analysis, in addition to qualitative competency development, to examine construct-related validity of competencies will help refine the definitions of the competencies and enhance the validity of competency-based assessment center.

Proposition 2: Linking subcompetency to training program and assessment center will improve the construct-related validity of the competency-based assessment center.

Proposition 3: Using subcompetencies, which collectively represent competencies in a more observable way, to develop the competency-based assessment center activity matrix will enhance the construct-related validity of competency-based assessment center.

Proposition 4: Differentiating between explicit-behavioral and implicit-behavioral subcompetencies will improve the construct-related validity of competency-based assessment center, where explicit-behavioral subcompetencies are measured by traditional assessment center mechanisms and implicit-behavioral subcompetencies are assessed by multi-rater assessments.

Proposition 5: Using a numeric rating scale rather than a dichotomous scale will lead to an appropriate assessment center activity selection. Therefore, the numeric scale will indirectly influence the construct-related validity of competency-based assessment center.

Proposition 6: Measuring no more than 10 subcompetencies in an activity will enable assessors to accurately assess the subcompetencies that are supposed to be measured.

Proposition 7: Using customized assessment center materials which are designed to closely relate to participants' work settings will lead to a stronger predictive validity of competency-based assessment center.

Proposition 8: Developing customized materials for the different individuals (e.g., assessors, administrators, resource persons, role players, etc.) involved in the competency-based assessment center and building extraneous factors (e.g., technology, setting, and level of difficulty of indicators) into the design will lead individuals to better understand the process of assessment center and therefore can indirectly improve the construct-related validity of competency-based assessment center (rating accuracy).

Proposition 9: Well-trained assessors will contribute to the criterion-related validity of competency-based assessment center.

The collected data can be analyzed through factor analysis (e.g., Naquin & Chen, 2006) to examine the relationship between competencies and subcompetencies. Caldwell et al. (2003) pointed out a common error of assessment center practices: weakly defined dimensions or competencies. Factor analysis can easily allow the researcher to determine how well the competencies were defined and can serve as a means to help refine the definitions of the competencies. Therefore, we develop the following proposition:

Proposition 1: Using factor analysis, in addition to qualitative competency development, to examine the construct-related validity of competencies will help refine the definitions of the competencies and enhance the validity of competency-based assessment center.
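A minimal sketch of the analysis behind Proposition 1, under the assumption that questionnaire responses have been collected into a respondents-by-items matrix. The data below are randomly generated for illustration (the article reports no code or data), and the varimax rotation option assumes a recent scikit-learn:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical survey: 200 respondents rate 8 subcompetency items.
# Items 0-3 were written for competency A, items 4-7 for competency B.
latent = rng.normal(size=(200, 2))            # two underlying competencies
loadings = np.zeros((2, 8))
loadings[0, :4] = 0.8                         # A's items load on factor 1
loadings[1, 4:] = 0.8                         # B's items load on factor 2
responses = latent @ loadings + rng.normal(scale=0.5, size=(200, 8))

# Fit a two-factor model: do the items group as their definitions intended?
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(responses)
for j in range(8):
    print(f"item {j}: loadings = {np.round(fa.components_[:, j], 2)}")
# Items with weak or cross-loadings flag subcompetencies whose definitions
# need refinement before the activity matrix is built.
```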

In a competency-based training design. or performance outcomes.272 Advances in Developing Human Resources May 2006 factor analysis can easily allow the researcher to determine how well the competencies were defined and can serve as a means to help refine the definitions of the competencies.g. job outcomes. Linking Subcompetency to Competency-Based Training Design and Competency-Based Assessment Center In the hierarchical competency system that the model depicts. we develop the following proposition: Proposition 2: Linking subcompetency to a training design and assessment center will improve the construct-related (competency) validity of competency-based assessment center. In an assessment center design. learning objectives.. and the assessment center can be found in Figure 2. Halman & Fletcher. 2000). However.com by Dana Opre on October 26.sagepub. using an abstract competency to develop assessment center exercises can potentially jeopardize the validity of selected assessment center activities because such a matrix cannot identify the most appropriate activities to assess the competencies. which is an abstract level (e. the subcompetencies serve as performance outcomes that participants are expected to demonstrate in an assessment center. The statements for these three components will be identical whereas each of their supporting components (e. 2009 . competency-based assessment center is linked through subcompetencies. Therefore. 2000). one may select role-play activities to assess an individual’s communication competency. step or procedures. we develop the following proposition: Proposition 1: Using factor analysis in addition to qualitative competency development to examine construct-related validity of competencies will help refine the definitions of the competencies and enhance the validity of competency-based assessment center. the subcompetencies serve as key links between competency-based training design and the design of the assessment center. the intended measures for each stage are strictly connected. the training program. Developing Subcompetency Assessment Center Activity Matrix Developing a competency exercise matrix is a basic requirement for assessment center development (Joiner. The relationships among competency model. For example. from a generic view. Current practices for developing such a matrix are conducted at competency level. the subcompetencies serve as desired job outcomes.g. or performance indicators) may be different in description. However. the communication competency can Downloaded from http://adh. Therefore. As shown in Figure 2.. representing what training participants are expected to perform when they return to their respective jobs. Through this design.

encompass written and oral skills; hence the role-play does not address all necessary communication skills. Therefore, we develop the following proposition:

Proposition 3: Using subcompetencies, which collectively represent competencies in a more observable way, to develop the assessment center activity matrix will enhance the construct-related validity of competency-based assessment center.

FIGURE 2: Relationship Between Competency Model, Training Program, Assessment Center, and Multi-Rater Feedback Assessment. [The figure links the competency model (competencies, subcompetencies, steps or procedures), the training program (job outcomes, learning objectives), and the competency-based assessment center, in which both the assessment center and the multi-rater assessment carry performance outcomes and performance indicators. Its notes read as follows.]

1. The competency model triggers training and competency-based assessment center designs.
2. The competency-based assessment center includes a traditional assessment center and a multi-rater assessment.
3. Competencies are in collective, abstract form.
4. Subcompetencies are more measurable and specific but less collective than competencies, and are stated in terms that support competencies.
5. Steps or procedures are observable, specific, and behavior-based, described in very specific form in support of subcompetencies.
6. Subcompetencies inform job outcomes in training program design and performance outcomes in assessment center design and multi-rater assessment.
7. Job outcomes are work-related outcomes; the statements of subcompetencies, job outcomes, and performance outcomes are identical.
8. Learning objectives are in support of job outcomes in a training design, and learning objectives are in turn supported by training materials.
9. Performance outcomes are general indicators that assessment center and multi-rater assessments are targeted to measure.
10. Performance indicators are specific indicators in support of performance outcomes.

Differentiating Implicit and Explicit Behavior

As previously mentioned, an assessment center has limited ability to measure individuals' implicit characteristics. This is because implicit behavior is fairly complex and enduring. Although one may argue that implicit behavior can be measured by transferring it to an explicit format, if not appropriately rendered, this can easily jeopardize the validity of the assessment center; it evidently cannot be effectively managed in an assessment center. Consequently, assessing the implicit-behavioral competencies in a traditional assessment center could create more problems than it can solve. It is very likely that the construct-related validity issue of an assessment center results from a lack of differentiation between explicit-behavioral and implicit-behavioral competencies. A multi-rater assessment appears to be a more effective tool for assessing implicit behavior. Integrating multi-rater assessment into the assessment center design also provides the flexibility to reduce complexity and avoid common errors. (See more discussion in a later section.) Therefore, we develop the following proposition:

Proposition 4: Differentiating between explicit-behavioral and implicit-behavioral subcompetencies will improve the construct-related validity of competency-based assessment center, where explicit-behavioral subcompetencies are measured by traditional assessment center mechanisms and implicit-behavioral subcompetencies are assessed by multi-rater assessments.

Determining Appropriate Competency-Based Assessment Center Activities

Current research in developing the competency (or subcompetency) activity matrix uses simple check marks to determine the exercise to be used for a particular competency. However, this approach provides no information on how well the competencies fit the exercises. We suggest using numeric ratings such as a 5-point Likert-type scale to determine the appropriateness of assessment center activities to the subcompetencies by treating the matrix as a questionnaire. This approach requires a group of participants to rate the questionnaire and then calls for calculating aggregated scores on the collected data. It not only helps in alleviating subjective decisions but also provides more valid information on the degree to which subcompetencies fit assessment center activities. To increase the chance of

sagepub. Determining Performance Outcomes for Activities This step requires composing a list of appropriate subcompetencies or performance outcomes (the two top-ranked activities) related to each of the activities. The following strategies are designed to help determine the most appropriate activities to be used in an assessment center: (a) Select two top-ranked activities for each of the subcompetencies. it is reasonable to assert that no more than 10 dimensions are practical for an activity. the numeric scale will indirectly influence the construct-related validity of competency-based assessment center. otherwise. consider a multi-rater questionnaire as a more appropriate approach to assess the subcompetency.com by Dana Opre on October 26. a competency-based assessment center activity designer should use judgment to avoid the problem of measuring too many subcompetencies in a single exercise or activity. In addition. Naquin / AN INTEGRATIVE MODEL 275 obtaining objective data. whereas Thornton and Mueller-Hanson (2004) stated that in practice. consider using all of them. When the number of subcompetencies increases. its applicability to any of the exercises is low. Thornton (1992) suggested 5 to 10 dimensions (subcompetencies in this context) to be assessed for various assessment centers. These performance indicators will be aligned with the activity design. because quantitative data are only meaningful if well interpreted.0. It is important to note that the competency-based assessment center designers should not overrely on quantitative data as presented here to design the assessment center activities. Therefore. (b) If more than two activities are tied as top-ranked. Based on the rationale just discussed. Activity designers should always review these subcompetencies to examine the appropriateness of fit.Chen. Therefore. The numeric scale has merit in assisting a competency-based assessment center designer to select the most appropriate activity to be used. Downloaded from http://adh. research suggests that one activity should not include too many measures. the following proposition is developed: Proposition 5: Using a numeric rating scale (along with appropriate subcompetency selection strategies) rather than a dichotomous scale will lead to an appropriate assessment center activity selection. each dimension or competency should include more than one assessment exercise. 2001). (c) If no rating for a particular subcompetency is greater than 3. Synthesizing the findings and suggestions in these literatures. 2009 . the assessors’ ratings could be biased by intuition due to the limitation of one’s cognitive abilities in differentiating complex situations in time-limited situations (Lievens & Klimoski. consultants only measure 4 or 5 dimensions in an exercise. Our suggestion is to move less congruent subcompetencies in an activity to multi-rater assessment.

On the other hand, for an activity with fewer than 5 subcompetencies to be measured, from a cost-effectiveness perspective it is also reasonable to eliminate the activity and move the subcompetencies classified under it to multi-rater assessment. Based on this rationale, we develop the following proposition:

Proposition 6: Measuring no more than 10 subcompetencies in an activity will enable assessors to accurately assess the subcompetencies that are supposed to be measured.

Determining Performance Outcomes for Activities

This step requires composing a list of the appropriate subcompetencies or performance outcomes (those from the two top-ranked activities) related to each of the activities. In developing multi-rater assessments, the supporting performance indicators should be as action-oriented as possible (e.g., starting with an action verb). An appropriate supporting performance indicator may include the performance to be measured, the condition in which the performance occurs, and the criterion for determining the effectiveness or efficiency of the performance (Mager, 1997). These performance indicators will be aligned with the activity design.

Designing Competency-Based Assessment Center Materials

There are two methods or models for developing assessment center materials: using off-the-shelf materials and using customized materials. Thornton (1992) suggested that "fidelity" of assessment center design would help improve the validity of performance outcomes. The notion of fidelity is essential to designing activities or cases that closely relate to participants' day-in-the-life work situations. Therefore, the use of a customized model adds more value to this systematic competency-based assessment center design, and the following proposition is developed:

Proposition 7: Using customized assessment center materials, which are designed to closely relate to participants' work settings, will lead to a stronger criterion-related (predictive) validity of competency-based assessment center.

In addition to determining the subcompetency (performance outcome) or supporting performance indicators for exercise development, Thornton and Mueller-Hanson (2004) suggested that several sets of exercise materials must be designed for the various individuals involved in the exercises. These individuals include participants, assessors, administrators, resource persons, and role-players. A competency-based assessment center designer should also consider factors such as setting, technology, and level of difficulty of the indicators when designing exercise materials. Therefore, the following proposition is developed:

Proposition 8: Developing customized materials for the different individuals (e.g., assessors, administrators, resource persons, role-players, etc.) involved in the competency-based assessment center and building extraneous factors (e.g., technology, setting, and level of difficulty of indicators) into the design will lead individuals to better understand the process of assessment center and, therefore, can indirectly improve the construct-related validity of competency-based assessment center (rating accuracy).

Selecting and Developing Assessors

Selecting and developing qualified assessors usually go hand in hand. In selecting assessors, research on the effect of assessors' individual backgrounds shows mixed results. The assessment center guidelines suggest considering professional psychologists as assessors, and some assessment center practices have used psychologists as assessors. Gaugler, Rosenthal, Thornton, and Bentson (1987) found that assessment centers that used psychologists as assessors exhibited higher criterion-related validity than those using managerial assessors, whereas Thomson (1970) found no significant differences between the ratings of psychologist and manager assessors. Spychalski, Quinones, Gaugler, and Pohley (1997) found that best practice incorporates line or staff management as assessors and that these assessors are generally two organizational levels higher than the individuals to be assessed. Therefore, from a practical standpoint it is plausible to select assessors from the target organization. The more important point is perhaps to keep these selected assessors (e.g., managers) well trained in how to assess assessees' performance before engaging in an assessment center activity.

The objective of assessor training is to facilitate assessors in gaining reliable and accurate judgments. Contents of the assessor training may include (a) knowledge and understanding of the assessment dimensions, (b) definitions of dimensions, (c) relationship to job performance, (d) examples of effective and ineffective performance, (e) simulations of the exercises to be assessed, (f) rating issues, (g) data integration, and (h) feedback procedures. In addition, according to the guidelines, the assessor training should clearly state training objectives and performance guidelines. Training length should be determined in connection with other considerations, such as the trainer and instructional design, assessor capability, the assessment program, and so on. More detailed issues related to assessor training can be found in the assessment center guidelines (see Joiner, 2000).

Finally, a trainer of assessor training should be familiar with simulation exercises, have a deep understanding of issues related to assessor training, and continually communicate with the competency-based assessment center designers and a program champion. This is because the competency-based assessment center designers are experts in the functions of an assessment center design, whereas program champions have broader insights into how the program works. Both can contribute to the success of assessor training if the communication system is well established and utilized. It is also important to consider establishing a continually improving training system to help assessors maintain skills, knowledge, and attitudes. Based on the rationale just discussed, the following proposition is developed:

Proposition 9: Well-trained assessors will contribute to the criterion-related validity of competency-based assessment center.

Discussion and Implications for HRD Practice

Designing and implementing an assessment center is labor-intensive, time-consuming, expensive, and difficult to manage (Dulewicz, 1991). Perhaps these are the major reasons that assessment centers do not receive enough attention in HRD. This article ties competency development to training program design, assessment center, and multi-rater assessment, resulting in an integrative, competency-based assessment center model that has profound implications for HRD practice. Some training effectiveness practices that focus on participants' reaction and learning add little to how participants can apply what they learned to their jobs; implementing a competency-based assessment center in an organization can appropriately respond to such a question. Moreover, as this article demonstrates, the competency-based assessment center design model can also be easily integrated with other HR functions (e.g., selection, promotion, performance appraisal, etc.) through a common link, the subcompetencies. These facts seem obvious, but many organizational stakeholders do not naturally recognize them. To enhance communication between HRD practitioners and organizational stakeholders with regard to the benefits of adopting a competency-based assessment center, we have developed a set of strategies for practical consideration.

Educating Organizational Stakeholders

The major objective of this strategy is to advocate the various advantages and applications of assessment center to organizational stakeholders. Specific information that can be provided may include (a) the reasons that assessment center is important, (b) how an assessment center can make changes, (c) the differences between traditional assessment center and competency-based assessment center, and (d) the benefits of competency-based assessment center to individual development and organizational effectiveness as a whole.

Collecting and Providing Cost-Effectiveness Information

Cost-effectiveness is always a concern for organizational decision makers. For a competency-based assessment center to be adopted by an organization, it is imperative to collect cost-effectiveness information on existing assessment center practices from other organizations and to provide information regarding the costs associated with the competency-based assessment center program that will be implemented. Providing information regarding the strategies that will be utilized to maintain cost-effectiveness is also necessary.

Therefore. Organizational stakeholders’ involvement and support will be key to its success. culture fit) that may arise when implementing an assessment center in an organization. a project the authors led in designing a competency-based assessment center for assessing Downloaded from http://adh. communicating responsibilities before a center is implemented is as important as the other strategies proposed here. articulating the purposes could be the key to reducing the anxiety and enhancing the motivation for stakeholders to adopt the program. Anxiety and motivation to change are often related to resistance. Therefore. Communicating Responsibilities The implementation of a competency-based assessment center cannot solely rely on designers or implementers. The information allows decision makers to justify how an assessment center can change their organization.. Naquin / AN INTEGRATIVE MODEL 279 Providing Comparative Information This strategy mainly deals with providing information on what other organizations in the same industry have done regarding an assessment center and how well the assessment center has helped the organization improve performance. it is inevitable that resistance will be met. Implications for HRD Research Through the development of competency-based assessment center. this article provided nine propositions to guide future research in HRD. For example. it is important that competency-based assessment center designers and implementers fully articulate the purposes and describe how the data collected from the center will be used. Implementing a Pilot Test If a process of a competency-based assessment center design aims to facilitate a customized assessment center. We advise that the propositions should be examined entirely because the propositions closely link to the model and are all related to validity issues. implementing a pilot test can provide formative feedback for program design.com by Dana Opre on October 26. The pilot test also helps in examining practicality and other issues (e. 2009 . Articulating Purposes of the Competency-Based Assessment Center and the Purposes for Which the Data Are to Be Used Because an assessment center can be used for various purposes. When a new tool such as a competency-based assessment center is implemented for evaluation purposes.sagepub.Chen.g.

the learning outcomes of a statewide leadership and management training program in the state of Louisiana has adopted the concepts and processes proposed in this article (see Melancon & Williams, 2006 [this issue]). The adoption allows researchers to empirically examine all the propositions. On the other hand, these propositions could also be examined individually. For example, one may assess whether a set of subcompetency definitions refined by the results of a factor analysis contributes to the enhancement of a competency-based assessment center. Another example would be to examine whether differentiating explicit-behavioral subcompetencies (as measured by a traditional assessment center) and implicit-behavioral subcompetencies (as measured by multi-rater assessment) leads to an improvement in the construct-related validity of competency-based assessment center.

Conclusions

Traditional assessment centers have been challenged by a lack of strong construct-related validity. This article, through a systemic, integrative perspective, focuses on the design aspects of a competency-based assessment center to address the validity issues of assessment centers. The integrative model not only expands the scope of traditional assessment centers by incorporating multi-rater assessment into the design but also guides HRD practitioners on how to design a competency-based assessment center that has the potential to improve construct-related validity and the capability to build into training design, assessment, and other HR functions. In addition, the model provides a set of research propositions to be examined.

References

Blank, W. E. (1982). Handbook for developing competency-based training programs. Englewood Cliffs, NJ: Prentice Hall.
Boyatzis, R. E. (1982). The competent manager: A model for effective performance. New York: John Wiley.
Byham, W. C. (2004). Developing dimension-competency-based human resource systems. Retrieved March 25, 2004, from http://www.ddiworld.com/research/publications.asp
Byham, W. C., & Moyer, R. P. (1996). Using competencies to build a successful organization. Retrieved March 25, 2004, from http://www.ddiworld.com/research/publications.asp
Caldwell, C., Thornton, G. C., III, & Gruys, M. L. (2003). Ten classic assessment center errors: Challenges to selection validity. Public Personnel Management, 32, 73-88.
Chen, H.-C. (2006). Assessment center: A critical mechanism for assessing HRD effectiveness and accountability. Advances in Developing Human Resources, 8(2), 247-264.
Dubois, D. D., & Rothwell, W. J. (2004). Competency-based human resource management. Palo Alto, CA: Davies-Black.

Dulewicz, V. (1991). Improving assessment centers. Personnel Management, 23, 50-55.
Gaugler, B. B., Rosenthal, D. B., Thornton, G. C., III, & Bentson, C. (1987). Meta-analysis of assessment center validity. Journal of Applied Psychology, 72, 493-511.
Halman, F., & Fletcher, C. (2000). The impact of development centre participation and the role of individual differences in changing self-assessments. Journal of Occupational and Organizational Psychology, 73, 423-442.
Hoffmann, T. (1999). The meanings of competency. Journal of European Industrial Training, 23, 275-285.
Holton, E. F., III, & Lynham, S. A. (2000). Performance-driven leadership development. Advances in Developing Human Resources, 6, 1-17.
Howard, A. (1997). A reassessment of assessment centers: Challenges for the 21st century. Journal of Social Behavior and Personality, 12, 13-52.
Joiner, D. A. (2000). Guidelines and ethical considerations for assessment center operations: International task force on assessment center guidelines. Public Personnel Management, 29, 315-331.
Joyce, L. W., Thayer, P. W., & Pond, S. B., III. (1994). Managerial functions: An alternative to traditional assessment center dimensions? Personnel Psychology, 47, 109-121.
Lievens, F., & Klimoski, R. J. (2001). Understanding the assessment center process: Where are we now? In C. L. Cooper & I. T. Robertson (Eds.), International review of industrial and organizational psychology (Vol. 16). New York: John Wiley.
Lowry, P. E. (1995). The assessment center process: Assessing leadership in the public sector. Public Personnel Management, 24, 443-450.
Lucia, A. D., & Lepsinger, R. (1999). The art and science of competency models: Pinpointing critical success factors in organizations. San Francisco: Jossey-Bass.
Mager, R. F. (1997). Preparing instructional objectives. Atlanta, GA: CEP Press.
McLagan, P. A. (1997). Competencies: The next generation. Training and Development, 51(5), 40-47.
Melancon, S., & Williams, M. (2006). Competency-based assessment center design: A case study. Advances in Developing Human Resources, 8(2), 283-314.
Moingeon, B., & Edmondson, A. (1996). Organizational learning and competitive advantage. Thousand Oaks, CA: Sage.
Naquin, S. S., & Chen, H.-C. (2006). Construct validation of the Louisiana Managerial/Supervisory Survey. In F. M. Nafukho & H.-C. Chen (Eds.), Academy of Human Resource Development conference proceedings (pp. 1400-1407). Bowling Green, OH: Academy of Human Resource Development.
Naquin, S. S., & Holton, E. F., III. (2003). Redefining state government leadership and management development: A process for competency-based development. Public Personnel Management, 32, 23-46.
Prahalad, C. K., & Hamel, G. (1990). The core competence of the corporation. Harvard Business Review, 68(3), 79-91.
Robie, C., Osburn, H. G., Morris, M. A., Etchegaray, J. M., & Adams, K. A. (2000). Effects of the rating process on the construct-related validity of assessment center dimension evaluations. Human Performance, 13, 355-370.
Shippmann, J. S., Ash, R. A., Pattista, M., Carr, L., Eyde, L. D., Hesketh, B., et al. (2000). The practice of competency modeling. Personnel Psychology, 53, 703-740.
Spencer, L. M., & Spencer, S. M. (1993). Competence at work: Models for superior performance. New York: John Wiley.

Spychalski, A. C., Quinones, M. A., Gaugler, B. B., & Pohley, K. (1997). A survey of assessment centre practices in organizations in the United States. Personnel Psychology, 50, 71-90.
Thomson, H. A. (1970). Comparison of predictor and criterion judgments of managerial performance using the multitrait-multimethod approach. Journal of Applied Psychology, 54, 496-502.
Thornton, G. C., III. (1992). Assessment centers in human resource management. Reading, MA: Addison-Wesley.
Thornton, G. C., III, & Mueller-Hanson, R. A. (2004). Developing organizational simulations. Mahwah, NJ: Lawrence Erlbaum.
Toegel, G., & Conger, J. A. (2003). 360-degree assessment: Time for reinvention. Academy of Management Learning and Education, 2, 297-311.
Wimer, S., & Nowack, K. M. (1998, May). 13 common mistakes using 360-degree feedback. Training and Development, 52(5), 69-70.
Woehr, D. J., & Arthur, W., Jr. (2003). The construct-related validity of assessment center ratings: A review and meta-analysis of the role of methodological factors. Journal of Management, 29, 231-258.

Hsin-Chih Chen, PhD, is a statistician/research analyst at Amedisys, Inc., a leading provider of home health care services, where he conducts data-driven research on quality of services, market analyses, and corporate strategies across all levels. Prior to joining Amedisys, Inc., he served as a postdoctoral researcher at Louisiana State University, where he also completed his doctorate in human resource development. His recent research interests include competency-based development, assessment center, transfer of learning, and the philosophy of human resource development. He has published a number of research articles in peer-reviewed human resource development journals and currently serves as associate editor for the 2006 International Conference Proceedings of the Academy of Human Resource Development.

Sharon S. Naquin, PhD, is director of the Louisiana State University (LSU) Division of Workforce Development and an associate professor in the LSU School of Human Resource Education. She has conducted extensive research in managerial and leadership competency development and works with municipal and private agencies on strategic planning and organizational development initiatives.
