Information Systems as an Instrument for Quality Programs

José A. M. Xexéo1, Ana Regina C. da Rocha1, Álvaro Rabelo Alves Júnior2,3, J. R. Blaschek4
1 Programa de Engenharia de Sistemas e Computação, COPPE/UFRJ, Rio de Janeiro, RJ, Brasil. 2 Universidade Federal da Bahia, Salvador, Bahia, Brasil. 3 Fundação Bahiana de Cardiologia, Salvador, Bahia, Brasil. 4 Pontifícia Universidade Católica do Rio de Janeiro, Rio de Janeiro, RJ, Brasil.

Abstract
In the current economy there is a general consensus that the assimilation, dissemination and application of quality management techniques, together with the intensive use of information systems (IS), are critical factors for the success and the survival of organizations. Institutions providing health services are no exception in this scenario: studies in the hospital segment also establish the interdependence between quality, information and competitiveness. The analysis of corporate information reveals trends and establishes cause and effect relationships that support decision-making at the various organizational levels. Information is therefore an agent of change, quality is information intensive, and IS play a crucial role in the process. There are several reports in the literature on attempts to implement quality programs in hospitals. Most of these experiences adapted procedures from methodologies well proven in manufacturing plants, which, nevertheless, seems to fail to meet the specificity of the service area. A methodology is therefore required to guide managers in the implementation of quality programs in health organizations supported by information systems. This article proposes such a methodology and presents the case study that validated it. The objective of the proposed methodology is to guide the implementation of Quality Programs supported by Information Systems in health care organizations. The validation was carried out at Fundação Bahiana de Cardiologia, Cardiology and Cardiovascular Surgery Unit, at the Federal University of Bahia.

Keywords: information, healthcare, quality, methodology, indicators, information systems

1. INTRODUCTION

Fewer barriers to international trade and increased access to technology and to markets have turned quality into a critical success factor. The ISO 9000:2000 standard explicitly justifies quality management systems as instruments to achieve customer satisfaction. The ability to manage information as a strategic resource is also linked to the success of the business and leads organizations to invest in the development and updating of Information Systems. Organizations doing business in the health segment are not immune to these influences. The quality and the productivity of patient care are among the main focuses of investment (Richardson and Gurtner, 1999). When the subject is information, the importance is the same: the practice of medicine and information management are intertwined (Shortliffe, 1990; Berndt et al, 2000; Plummer, 2001). The search for quality supported by Information Systems is, then, a natural outcome of this environment. Bates (1999) argues that Information Systems are low cost alternatives to increase the quality of the care provided. Thatcher and Oliver (2001) refer to


hospitals in which the Information Systems help to improve the quality of the care provided to patients. This article proposes a methodology and presents the case study that validated it. The objective of the proposed methodology is to guide the implementation of Quality Programs supported by Information Systems in health care organizations. The validation was carried out at Fundação Bahiana de Cardiologia, Cardiology and Cardiovascular Surgery Unit, at the Federal University of Bahia.

2. QUALITY PROGRAMS IN HOSPITALS

Several authors report experiences of quality program implementation at hospitals, among whom are Fahey (1992), Tamworth Base Hospital; Materna (1992), Community Memorial Hospital; Matherly (1992), Blount Memorial Hospital; Dasch (1995), Naval Hospital Orlando; Burney (1994), Virginia Beach Surgery Center; Chesney (1993), Barnaby Hospital; Appleman (1995), Naval Medical Center; Shaw (1995), Strong Memorial Hospital; Roland (1997), Chesshire Medical Center; Hart (1997), Leicester General Hospital; Valdivia and Crowe (1997), Truman Memorial Hospital; Stratton (1998), Overlook Hospital; Bates (1999), Brigham and Women's Hospital; Enchaug (2000), Haukeland University Hospital; Potter et al (1994); Huq (1996); Brashier (1996); Nabitz and Walburg (2000); Lim and Tang (2000); and Lim et al (1999). The analysis of these experiences reveals the following facts:

a. the identified objectives reflect a shift of focus over the years. Fahey (1992) and Matherly (1992) pursue control and cost reduction objectives; Roland (1997), Stratton (1998), Hart (1997) and Valdivia and Crowe (1997) are concerned with the focus on process and client satisfaction; Bates (1999) refers to the use of Information Systems to measure and improve quality; Enchaug (2000) furthers the establishment of doctor-patient partnerships.

b. among others, the following issues are deemed important to achieve success: top management involvement (Godfrey, 1992; Potter, 1994; Huq, 1996; Roland, 1997; Sanders, 1997); faculty involvement (Godfrey, 1992; Huq, 1996; Brashier, 1996); minimizing reaction to change (Materna, 1992; Matherly, 1992; Shaw, 1995; Sanders, 1997); development of a culture of concern with the patient (Materna, 1992; Chesney, 1993; Burney, 1994; Huq, 1996); establishment of clear objectives (Shaw, 1995; Brashier, 1996); making the program part of the organizational culture (Brashier, 1996; Sanders, 1997; Roland, 1997); training (Gopalakrishnan, 1992; Dasch, 1995; Shaw, 1995; Brashier, 1996; Roland, 1997; Sanders, 1997; Enchaug, 2000); measuring (Godfrey, 1992; Brashier, 1996; Huq, 1996; Lim et al, 1999); surveying user satisfaction (Brashier, 1996); conducting self-evaluations (Nabitz and Walburg, 2000).

c. the improvement targets represent the vision of top management. The other stakeholders make no significant contribution to establishing the desired quality standards. Everyone is interviewed and the stated expectations are recorded; yet it is the organization that interprets what it hears and decides about the user needs. This behavior impairs the makeup of a consistent vision of quality shared by the majority of the employees and clients of the organization.

d. there are few occurrences of IS as a component of an evaluation system. Bates (1999) relates the integration of the IS of a hospital into a system of quality measurement and improvement.


e. the view of services occurs in an evaluation context, using SERVQUAL1 resources or adapting it for a specific application, but the research is limited to patients' expectations, which results in the established quality view being biased.

f. where used, there is no emphasis on the characteristics of measures. Finison (1992) established that measures in the health area must reflect an explicit relationship between client demands and key process variables.

g. the dimensions of service quality are not referenced for the implementation of quality programs.

h. most experiences have adapted procedures related to methodologies well proven at manufacturing plants, reflecting the influence of Feigenbaum, Deming, Juran, Crosby, Ishikawa and other researchers. This, nevertheless, seems to fail in meeting the specificity of the service area. Davis (1993) advances the hypothesis that faults in quality programs implemented this way are a consequence of the process itself. Sanders (1997) records that quality concepts and tools must not be transferred from manufacturing to health care; he detects that TQM programs lack focus and insists that patient satisfaction requires specific solutions. Lim and Tang (2000) propose a model for TQM implementation in health care, but foresee a methodic and gradual implementation. Brashier (1996) has a proposal for the hospital area.

3. THE METHODOLOGY

3.1 INTRODUCTION

The analysis of experiences of quality program implementation in health organizations has led to the identification of determinants for the definition of the methodology, namely: (i) build an integrated view of quality; (ii) focus on the service provision view; (iii) identify and analyze opinion classes in the surveyed universe; (iv) emphasize the analysis of the current status of the organization; (v) link quality factors to activities and establish priorities; (vi) establish indicators and use information systems for indicator follow-up; (vii) establish a comprehensive methodology; (viii) focus on success factors and barriers to program implementation.

3.2 METHODOLOGY PROCEDURES

The proposed methodology comprises stages interacting through feedback links, which are characteristic of continuous improvement processes, with the ultimate purpose of specifying requirements to enable the Information Systems of organizations to support Quality Programs (figure 3.1).

1 Instrument to evaluate service provision, designed by Parasuraman et al (1985), which since its dissemination has become a broadly used technology to manage and measure service quality.

[Figure 3.1 – the methodology: establish an integrated vision of quality (built from the quality views of patients, family, faculties, nursery and administrative staff, supported by the motivation, adapting and awareness programs) → establish Quality Program objectives → establish quality indicators → establish requirements to Information Systems → Quality Program / Information System]

PHASE 1. ORGANIZE MOTIVATION, ADAPTING AND AWARENESS PROGRAMS

The objective of this first phase is to obtain commitment from the organization's top management. There are several ways of achieving it; an example is the establishment of a 5S Program, because of its ability to mobilize and its easy implementation.

PHASE 2. ESTABLISH AN INTEGRATED VISION OF QUALITY

This five stage phase starts with a study of the environment, and its final product is the set of critical quality factors for the considered universe of individuals. The integration is ensured by the representativeness adopted in the process of eliciting requirements (figure 3.2).

[Figure 3.2 – establishing an integrated vision of quality: preliminary study → identify problem context → elicit quality requirements → define sample characteristics → determine critical quality factors]

1st stage: preliminary study
The purpose of the preliminary study is to examine the universe in which the organization operates and similar situations, to enable the process to be developed to capitalize on positive factors and to neutralize negative factors identified in this study.

2nd stage: identify problem context
The purpose is to study and define the context, with a clear and precise statement of the problem. The Soft Systems Methodology (SSM) approach proposed by Checkland (1981), JAD (Joint Application Design) workshops and unstructured interviews are adequate techniques; the final choice of technique depends on the size of the company, on the level of pre-existing knowledge and on organization specific objectives.

3rd stage: elicit quality requirements
The objective of this stage is to survey the views of the several user categories and to integrate them into a solution that meets collective priority interests. A seven step process is proposed (figure 3.3).

[Figure 3.3 – eliciting quality requirements: choosing the technique to elicit requirements → building the instrument to be applied → determining sample size and composition → selecting and training the team of interviewers → pilot trial → revising the instrument → applying the instrument]

Step 1 – choosing the technique to elicit quality requirements
In this step the technique to elicit requirements is selected. The desired scope of the survey determines the use of the interview technique with a questionnaire. For the context of health institutions, SERVQUAL was chosen, with two changes: measurement of importance instead of expectation, and identification of quality based on perception rather than on the difference between expectation and perception.

Step 2 – building the instrument to be applied
In this step the instrument to be used in the survey is built. The technique takes an instrument traditionally used for the approached problem and adapts it to the context of the new problem. The reference instrument is SERVQUAL, proposed by Parasuraman et al (1988). The adaptation is carried out through interviews with representatives of the various categories of users.

Step 3 – determining sample size and composition
In this step the total number of interviews to be carried out is determined, and the way in which this number will be segmented across the categories selected to form the sample is defined. In this methodology, samples of all categories involved in the process should be interviewed.

Step 4 – selecting and training the team of interviewers
In this step interviewers are selected and trained to use the instrument. Training deals with care in the approach and with the several ways of explaining and exemplifying the content of the questionnaire.

Step 5 – pilot trial
The pilot trial simulates the eliciting of quality requirements with a restricted group. The adequacy, clarity, simplicity and accuracy of the questions are verified, as well as the sufficiency of the questionnaire scope and the level of interviewer training.

Step 6 – revising the instrument
This step makes the changes indicated by the previous step. These two steps are repeated in a feedback loop until the process and the instrument are deemed ready to be applied.

Step 7 – applying the instrument
This step comprises the interviews. The final report must present demographic data on the interviewees, discuss the results of the processing of the interviewees' answers to the questionnaire and state the statistical tests carried out to evaluate the degree of reliability of the results.

4th stage: define sample characteristics
The purpose of this stage is to define sample characteristics and the opinion of its various segments about quality.

5th stage: determine critical quality factors
In this stage critical quality factors are determined. Two questionnaires are required: the first one is intended to rate the priorities of patients, core activity professionals and professionals working on support activities regarding the evaluated functional quality factors; the second one targets the evaluation of the perception of the same agents about these same factors, during or after service provision. Critical quality factors are those quality factors that simultaneously meet two requirements: (i) the average of the ratings assigned to them by the interviewees in the perception questionnaire is below the general average of the perception questionnaire; and (ii) the average of the ratings assigned to them by the interviewees in the importance questionnaire is equal to or above the general average of the importance questionnaire (the general average is calculated by adding the averages of each factor and dividing the result by the number of factors).
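As an illustration of this selection rule, the minimal sketch below applies the two requirements to hypothetical factor averages (the factor names and values are illustrative only; real input comes from the two questionnaires):

```python
# Minimal sketch of the critical-quality-factor selection rule (5th stage).
# Values are hypothetical; real input comes from the two questionnaires.
factors = {
    # factor: (importance average, perception average)
    "P7":  (4.69, 3.47),
    "P19": (4.21, 3.81),
    "P23": (3.50, 4.20),
}

# General average = mean of the per-factor averages.
imp_general = sum(imp for imp, _ in factors.values()) / len(factors)
perc_general = sum(perc for _, perc in factors.values()) / len(factors)

critical = [
    name
    for name, (imp, perc) in factors.items()
    if imp >= imp_general and perc < perc_general  # requirements (ii) and (i)
]
print(critical)  # -> ['P7', 'P19']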

PHASE 3. ESTABLISH QUALITY PROGRAM OBJECTIVES

In this phase the relationships between quality factors and institution activities are determined. From these relationships and from the results of the previous phase, critical activities are identified and, consequently, the Quality Program objectives. This phase is influenced by external factors like financial restrictions, organizational policies, etc. (figure 3.4).

[Figure 3.4 – establishing Quality Program objectives: identify activities which are relevant to quality → establish the relationship between activities and critical quality factors → determine critical activities → Quality Program objectives statements (under the influence of external factors)]

1st stage: identify activities which are relevant to quality
The organization activities which are significant to quality are identified through the following procedures: (i) based on the survey and on the literature, prepare an initial list of activities which are relevant to the quality programs implemented in institutions from the same segment; (ii) prepare and validate the final list of activities in interviews with professionals who are familiar with the institution's procedures and routines.

2nd stage: establish the relationship between activities and critical quality factors
In this stage an ordering matrix is constructed to establish a relationship between activities and critical quality factors, through the following procedures: (i) choose the participants of this stage among the organization staff; (ii) build two similar matrixes for each evaluator – the relationship matrix and the ordering matrix – with n lines and m+1 columns, where n is the quantity of quality factors and m the quantity of activities; (iii) match the matrix lines and columns to the quality factors and the activities, leaving the first column blank; (iv) in the first column, enter the averages achieved by the quality factors in the importance questionnaire; (v) have each evaluator assign to each cell of the relationship matrix the value corresponding to the degree of relationship between the activity (column) and the corresponding quality factor (line); (vi) calculate the average of the evaluations for each matrix cell; (vii) for each cell, multiply the average found by the value of the corresponding quality factor (same line); (viii) enter the result in the corresponding ordering matrix cells.

3rd stage: determine critical activities
This stage is conducted by the quality project team, and its purpose is to determine the critical activities from the ordering matrix. The first step is to establish criteria for the selection of critical activities. The most immediate solution is a simple ordering, based on the sum of the cells in each column (a small numerical sketch of the 2nd and 3rd stage computations is given after this phase description). Nevertheless, there are organizational factors which may point to another solution: this decision depends also on the Program scope, on financial restrictions, on human resources and even on the physical space available.

4th stage: establish Quality Program objectives
In this stage, Quality Program objectives are established from the critical quality factors and the associated critical activities. These objectives result from decisions taken at meetings with the organization management, in which a formal document should be drafted – the Action Plan – which constitutes the base of the Quality Program to be implemented, through continuous improvement goals.
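The ordering computation of the 2nd and 3rd stages can be made concrete with a short sketch. The data below are hypothetical (two factors, three activities, two evaluators); the methodology itself does not prescribe any code:

```python
# Sketch of the ordering-matrix computation (phase 3, 2nd and 3rd stages).
importance = [4.69, 4.21]          # importance averages of factors F1, F2

# relationship matrices, one per evaluator: degree values (e.g. 1, 3, 9)
evaluations = [
    [[9, 3, 1], [3, 9, 3]],        # evaluator 1
    [[9, 1, 3], [1, 9, 3]],        # evaluator 2
]

n, m = len(importance), len(evaluations[0][0])

# average the evaluators' matrices cell by cell, then weight each line
# by the importance average of its quality factor (ordering matrix)
ordering = [
    [importance[i] * sum(ev[i][j] for ev in evaluations) / len(evaluations)
     for j in range(m)]
    for i in range(n)
]

# order activities by the sum of each column; the top-ranked activities
# are the candidates for critical activities
scores = [sum(ordering[i][j] for i in range(n)) for j in range(m)]
ranked = sorted(range(m), key=lambda j: scores[j], reverse=True)
print([f"A{j + 1}" for j in ranked], [round(s, 1) for s in scores])
```

Running the sketch ranks activity A1 first (score 50.6), showing how a strong relationship with a highly important factor dominates the ordering.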

PHASE 4. ESTABLISH QUALITY INDICATORS

This phase, to be carried out by the project team, has the purpose of defining a set of quality indicators capable of measuring up to what point the Quality Program objectives are met, based on the Action Plan. These indicators should be used to follow-up the results as time goes by. To identify the indicators, the Goal Question Metric (GQM) method is proposed. This is a software engineering method developed to support the identification of what is required to follow-up the achievement of objectives (Solingen and Berghout, 1999). GQM has three levels: a conceptual level, to specify objectives; the operating level, to design the statement of the questions which will quantitatively characterize the objectives; and the quantitative level, in which the questions are associated to indicators, that is, metrics, determining the name and the acronym of each indicator and its calculation formula (figure 3.5).

[Figure 3.5 – the cycle used to identify quality indicators: specify objective → formulate questions → specify indicators]

PHASE 5. ESTABLISH REQUIREMENTS TO INFORMATION SYSTEMS

In this phase the requirements enabling managers to follow-up the Quality Program launched by the Action Plan are identified. These requirements comprise two documents: a set of specifications of quality indicators and a suggestion of an architecture for the information system. To that end, there are three stages: establishing the procedures for data collection to calculate the quality indicators, completing the specification of the quality indicators and specifying the Information System architecture.

1st stage: establishing the procedures to collect data to calculate the indicators
Through the steps described below, this stage establishes the procedures to collect the data needed to calculate the indicators.

Step 1: classifying indicator data
In this step the indicators are analyzed regarding the origin of the data required to calculate them. Indicators may be calculated based on data obtained from several sources: opinion polls, observation or the organization information systems. It is necessary to (i) identify the data required to calculate each indicator; (ii) determine where and how to get the data; (iii) build a table containing, for each indicator, the data required to calculate it and the data source.

Step 2: establishing procedures for data collection
This step establishes the procedures for data collection, taking into account the origins identified in the previous step. For data collected by opinion polls, pertinent instruments must be prepared, the sample has to be defined, interviewers ought to be selected and trained, and a pilot trial must also occur; the product is the final tool ready to be applied. For data obtained from observation, a tool has to be prepared to record the observed information and the person in charge of the observations must be selected. For data obtained through the organization information systems, the data already available in the systems must be identified, as well as the developments that must be introduced in the information systems to enable collecting the new data.

2nd stage: specifying identified indicators
The objective of this stage is to complete the specification of the quality indicators by determining the specification standard items. The specification contains 10 items. Three of them are determined in the previous phase (indicator name, indicator acronym and calculation formula); data origin and collection mode are available in the results of the preceding stage; the four remaining ones (collection period, aggregation period, presentation mode and goal) should be suggested by the project team.
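A minimal sketch of the 10-item specification standard as a data structure is given below. The field names paraphrase the items listed above; the example values reproduce the Late_pat indicator used later in the case study:

```python
# Sketch of the quality-indicator specification standard (10 items).
from dataclasses import dataclass

@dataclass
class IndicatorSpec:
    name: str                # determined in the previous phase (GQM)
    acronym: str             # determined in the previous phase (GQM)
    formula: str             # determined in the previous phase (GQM)
    required_data: str       # from step 1 of this phase
    data_origin: str         # from step 1 of this phase
    collection_mode: str     # from step 2 of this phase
    collection_period: str   # suggested by the project team
    aggregation_period: str  # suggested by the project team
    presentation_mode: str   # suggested by the project team
    goal: str                # suggested by the project team

late_pat = IndicatorSpec(
    name="Percentage of late patients", acronym="Late_pat",
    formula="sum(arrival - appointment, if arrival > appointment) / n",
    required_data="arrival and appointment time of each patient",
    data_origin="appointments sector", collection_mode="observation",
    collection_period="daily", aggregation_period="one week",
    presentation_mode="graphic", goal="maximum of 10%",
)
```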

3rd stage: specifying the Information System architecture
In this stage an Information System architecture is suggested. The methodology uses the representation scheme proposed by Hatley et al (2000) and suggested by Pressman (2001); this scheme was selected because of its simplicity. The standard represents the relationships between system elements, distributing them in five processing regions. The central area, which communicates with all other areas, is destined to the representation of the system to be implemented (a rectangle with rounded corners). At the input processing area, rectangles with rounded corners should also be drawn, representing the external entities generating information to the system – for example, a translator of opinion poll results. Similarly, at the output processing area the external entities which receive information from the system are represented – for example, a report generator. The area of user interface processing is appropriate to establish the type of mechanism for interaction with the user – for example, a dialogue sub-system. The review process area hosts, for example, the optimization of the algorithms used to calculate indicators or a review of established goals. It is a hierarchical model that accepts successive refinements. The methodology presents the architecture at its highest representation level, the context diagram, used at all detailing levels, which establishes the information boundary between the system being implemented and the environment in which it will operate. Figure 3.6 depicts the representation scheme for the architecture in question.

[Figure 3.6 – the architecture framework (Pressman, 2001): user interface processing; input processing; process functions and control; output processing; review process]

4. THE CASE STUDY

Starting from the second phase, this section reports the case study carried out in the Cardiology and Cardiovascular Surgery Unit at Fundação Bahiana de Cardiologia (UCCV/FBC).

4.1 ESTABLISH AN INTEGRATED VISION OF QUALITY

The first stage in the second phase – preliminary study – is the analysis, presented in the introduction section, of cases reported in the literature about the implementation of quality programs by health organizations.

As a result of the second stage – identify problem context – the President of UCCV/FBC decided to conduct an opinion poll to check the degree of satisfaction with the services rendered by UCCV/FBC, in which all patient and professional categories would be heard, aiming at guiding an Action Plan for continuous service improvement and increased satisfaction of UCCV/FBC users. To define the context, the FBC Chairman and Head of UCCV was interviewed, as well as the Head Physician of the Nuclear Medicine Department, the FBC Head Nurse and the Administrative Manager.

The third stage, comprising seven steps, is described below.

The first step chose SERVQUAL as the tool to be used to elicit quality requirements, introducing two changes: measuring importance instead of expectation, and identifying quality by perception.

The second step built the tool to be applied, adapting the SERVQUAL. Initial questions were taken from the original Parasuraman (1988) proposal. Transposition to the hospital environment was guided by the works of Babakus and Mangold (1992), Vandamme and Leunis (1992), Youssef et al (1996), Conway and Willcocks (1997), Andaleeb (1998) and Van Der Bij (1999), and by UCCV/FBC suggestions and structure. The relevance of access and security aspects (Babakus and Mangold, 1992; Youssef et al, 1996; Andaleeb, 1998; Van Der Bij, 1999) led to the segregation of these two dimensions from the five original ones, building seven dimensions. The questionnaire has more questions than the original one because of the complexity of the services rendered by UCCV/FBC, which include emergency care and out-patient and in-patient services. Only positively worded statements were used, and a five point Likert scale was adopted. The first versions were tested with 16 people and some questions were reformulated. Several questions were broken down, and eventually a total of 47 items was defined.

In step three, a sample of 384 interviews for each one of the questionnaires was calculated. The calculations for segmentation took as reference the movement of patients during a week at the several UCCV/FBC sectors and wards. To select the patients and professionals to be interviewed, lots were drawn without replacement.

In step four, interviewers were selected among students from the universities existing in Salvador. Two training meetings were scheduled, and each interviewer's first interview was conducted under supervision.

The pilot trial adjusted the process in the fifth step, working with three employees and twelve patients, in which an interview was simulated with each interviewer and each question was analyzed. The results were used in step six – revising the instrument – in which several questions were changed.

Step seven concluded the third stage and dealt with tool application. The questionnaires were applied at two different moments and to different groups, and the full working hours schedule at UCCV/FBC was covered.

Regarding demographic data, in the fourth stage it was observed that the number of women (87.5%) exceeds the number of men, that the average age of women (50.9) is similar to that of men (50.09), and that 48% of the sample is within the 40 to 60 years range. The results of the quality factors processing are presented in table 4.1, which shows the global average of each questionnaire (importance = 4.15 and perception = 4.09) and the average for each quality factor.
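The 384-interview figure is consistent with the usual sample-size formula for estimating a proportion. The sketch below reconstructs it under assumptions the paper does not state explicitly (95% confidence level, maximum variability p = 0.5 and a 5% sampling error):

```python
# Sketch: sample size for estimating a proportion (yields the 384 figure).
z = 1.96   # normal score for 95% confidence (assumed)
p = 0.5    # maximum-variability assumption
e = 0.05   # admitted sampling error (assumed)

n = z**2 * p * (1 - p) / e**2
print(round(n))  # -> 384
```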

06 3. patients are listened about their feeling concerning the approach given to their illness.52 4.37 3.43 4.14 4.11 2.33 4.27 4.50 4.56 3. emergency has 24-hour availability.29 3. in-patient support services are promptly provided.77 4.31 3.57 4. services are carried out right the first time.45 3.10 3.59 2. records about patients are error free. patients are immediately informed about their exams results. emergency facilities have sufficient resources and easy access to support service.47 4.09 12 .15 4.57 4.30 4. staffs’ attitude instills confidence in patients.61 3.76 4.52 4.02 3.40 3.60 4. billing is accurate.30 3. facilities are easily accessible.44 4.05 4. patients are kept informed about their illness and its treatment.57 4.1 – quality factors Factor P1 P2 P3 P4 P5 P6 P7 P8 P9 P10 P11 P12 P13 P14 P15 P16 P17 P18 P19 P20 P21 P22 P23 P24 P25 P26 P27 P28 P29 P30 P31 P32 P33 P34 P35 P36 P37 P38 P39 P40 P41 P42 P43 P44 P45 P46 P47 average It is important that all the physical facilities are permanently sanitized.54 4.49 4.49 4.42 4.34 3. patients are informed about the procedures and complementary exams in schedule.21 4. impatient requests are promptly satisfied.44 4.20 4. compromises take upon with the patients are accomplished =services are provided at the right time.50 4.00 4. hospital distributes folders that explain the services that are provided. records and relationship with patients are classified as confidential.71 3. professionals are easily accessible.27 4.57 3.65 4.14 2.77 3.46 4.17 3. professionals have opportunity for professional updating.02 4.21 4.37 4.Table 4.25 3. professionals have sincere interest in solving patients’ problems.61 4.47 4.16 4. signalization of the corridors helps patients’ circulation. the consult period is sufficient to establish the patient status.62 4.69 4. information provided to patients is correct.77 4.73 4.28 4.57 4.12 4.81 4. facilities and equipment is sufficient to meet the demand.10 3.17 4. professionals always are courteous and respectful with patients.26 4.37 4.74 4.24 4.20 4.15 4.26 4.56 3. parking facilities are compatible with the demand. the staff has the knowledge to answer patients’ questions. doctor’s offices and complementary exams rooms have sufficient resources to support patients’ consults and exams.40 3.70 4.70 4. in-patient facilities have the necessary resources to support service. in-patient support services are timely provided. are enough facilities to set appointments with doctors and for complementary exams. emergency care is promptly.61 4.24 4.40 4.45 4.28 4.32 4.20 4.48 4.71 4. the equipment is up-to-date. nuclear medicine and radiology facilities have controlled access. professionals from the institution guide patients to the sites were they will be assist. schedule for continuity is provided.54 4.49 4. support facilities for professionals are comfortable. professionals give individual attention to patients. the physical facilities are comfortable. professionals are satisfied with their working and payment conditions.13 4. treatment instructions are written Importance Perception 4. equipments are securely operated.15 3.94 3.81 4.08 4.01 3. number of professionals is sufficient to meet the demand. the equipment has high level of availability.95 4. in-patient meal is conforming to best nutritional practices.

Closing the fourth stage, the following statistical tests were performed.

In the first test – correlation between the sets of interviewees' answers to the questionnaires – it was assessed that the absolute value of the correlation coefficient for the pair (perception, difference between importance and perception) exceeds 0.7, and also exceeds the correlation coefficients of the other pairs (importance, difference between importance and perception) and (perception, importance).

The second test – bias assessment – was carried out through the analysis of variance (one-factor experiment using Excel 2000, alpha = 0.05), with the first 100 interviews and with the last 100 interviews. The values of the variations within treatments and between treatments are of the same order for both sets of interviews, for both questionnaires, and no evidence of bias was detected.

The objective of the third test – verification of factor response indexes – showed that the importance questionnaire indexes were high (over 98%). This did not occur with the perception questionnaire: as not all of the factors applied to all groups, the interviewees did not answer questions about the services they did not use. Even though, just the P15 factor was left with a response index below 25%.

The fourth test – tool reliability analysis – evaluated the level of internal consistency of the questionnaires using the Cronbach alpha coefficient. The results for the complete questionnaires were satisfactory. For the importance questionnaire, the coefficient stayed above the lower limit (0.7); in the perception questionnaire some dimensions presented low coefficients, maybe because they were formed by few factors.

The objective of the transverse consistency test – fifth test – was to check the consistency of the results obtained when all interviewees were considered against the results obtained when the several categories represented in the sample and the classes identified by the clustering technique were individually considered. The initial procedure is to identify classes of opinion, using the clustering technique in the considered universe of the questionnaire being examined (Statgraphics Plus 5.0, supporting the k-means method combined with the squared Euclidean distance metric, applied to six classes). The second step is to determine the composition of these classes relatively to three categories: patients, core activity professionals and professionals working on support activities. In the third step, the ordering of quality factors obtained when considering the set of all interviewees is compared to the orderings obtained when considering separately the six identified classes. For the importance questionnaire the results were favorable, because of the uniformity of the composition of the classes relatively to the three interviewed categories (table 4.2) and of the existence of significant transversal consistency (table 4.3). The result showed that factors P1, P2, P41, P34 and P7, related to hygiene and the emergency sector, are the most important factors to the UCCV/FBC community of patients and professionals (table 4.3). In the perception questionnaire it can be observed that the uniformity of the classes is maintained at the same level (table 4.4), and the existing differences do not compromise the uniformity of the composition of the classes relatively to the three interviewed categories; but the transversal consistency does not show the same intensity obtained in the importance questionnaire (table 4.5). Aside from the approximation of expectation by importance, the security and access dimension that had been segregated from the original ones did not stand.
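As an illustration of the fourth test, the sketch below computes the Cronbach alpha coefficient on a hypothetical answer matrix (the case study, of course, used the actual questionnaire data):

```python
# Sketch of Cronbach's alpha for internal-consistency analysis (fourth test).
# rows = interviewees, columns = questionnaire items (hypothetical ratings)
answers = [
    [4, 5, 4, 4],
    [3, 4, 4, 3],
    [5, 5, 4, 5],
    [2, 3, 3, 2],
]

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

k = len(answers[0])                                   # number of items
item_vars = [variance([row[j] for row in answers]) for j in range(k)]
total_var = variance([sum(row) for row in answers])   # variance of totals
alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)
print(round(alpha, 2))  # -> 0.94, above the usual 0.7 lower limit
```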

The results of these statistical tests show that there are no indications that would prevent the acceptance of the questionnaires and of the carried out survey as statistically valid. There is also no indication of relevant discrepancies between the various categories and user classes justifying conciliation procedures.

Table 4.2 – classes' composition for the importance questionnaire

          C1        C2        C3       C4        C5        C6        TOTAL
CORE      2 (7%)    10 (35%)  0        9 (31%)   3 (10%)   5 (17%)   29
SUPPORT   0         21 (91%)  0        1 (4.5%)  1 (4.5%)  0         23
PATIENTS  48 (14%)  55 (17%)  15 (4%)  76 (23%)  69 (21%)  69 (21%)  331
TOTAL     50 (13%)  86 (22%)  15 (4%)  86 (21%)  73 (19%)  74 (20%)  384

note: CORE = core activities professionals; SUPPORT = support activities professionals

Table 4.3 – pertinence of the ten quality factors with the highest levels of importance for each identified class: the adherence of each opinion class (C1–C6), UCCV/FBC sector (hemodynamics, arrhythmia, offices, ecographic, ergometric, nuclear medicine, emergency, in-patients) and general category (patients, core activities professionals, support activities professionals) relatively to the set of the ten quality factors identified for the UCCV/FBC as a whole is examined. Total of occurrences per factor: P1 17 (100%), P2 16 (94%), P41 16 (94%), P34 17 (100%), P7 15 (88%), P12 15 (88%), P3 9 (53%), P39 11 (65%), P14 9 (53%), P23 8 (47%).

Table 4.4 – classes' composition for the perception questionnaire

          C1        C2          C3        C4        C5          C6        TOTAL
CORE      16 (56%)  0           4 (14%)   0         5 (16%)     4 (14%)   29
SUPPORT   6 (29%)   1 (4.5%)    6 (29%)   2 (9%)    5 (24%)     1 (4.5%)  21
PATIENTS  23 (7%)   87 (26%)    45 (14%)  58 (17%)  77 (23%)    44 (13%)  334
TOTAL     45 (12%)  88 (23.2%)  55 (14%)  60 (15%)  87 (22.8%)  49 (13%)  384

note: CORE = core activities professionals; SUPPORT = support activities professionals

Table 4.5 – pertinence of the ten quality factors with the highest levels of perception for each identified class: the adherence of each opinion class, UCCV/FBC sector and general patients' and professionals' category relatively to the set of the ten quality factors identified for the UCCV/FBC as a whole is examined. Total of occurrences per factor: P40 12 (71%), P23 12 (71%), P39 13 (76%), P33 11 (65%), P38 9 (53%), P45 11 (65%), P44 9 (53%), P25 9 (53%), P28 8 (47%), P43 6 (35%).

The fifth stage – determining critical quality factors – concludes the second phase of the methodology. Critical quality factors are those factors with an importance (IMP) level equal to or above the 4.15 general average and, simultaneously, a perception (PERC) level below the 4.09 general average. An analysis of table 4.1 has led to the identification of ten factors with this property, which can be separated into four groups, namely:

Group of factors related to Emergency Care at UCCV/FBC:
− P7: emergency care facilities have sufficient resources and easy access to support services (IMP=4.69, PERC=3.47)
− P41: emergency has 24-hour availability (IMP=4.71, PERC=3.70)
− P34: emergency care is prompt (IMP=4.70, PERC=3.77)

Group of factors related to appointments and exams:
− P19: services are provided at the right time (IMP=4.21, PERC=3.81)

− P42: it is easy to set appointments with doctors and for complementary exams

Group of factors related to infrastructure:
− P3: the number of professionals is sufficient to meet the demand
− P4: facilities and equipment are sufficient to meet the demand
− P8: in-patient facilities have sufficient resources to support service

Group of factors related to professional interests:
− P13: professionals are satisfied with their working and payment conditions
− P14: professionals have opportunities for professional updating

The consistency of this result is checked by examining figure 4.1, which records the pertinence of the critical quality factors of each sector relatively to the set of critical factors identified for the UCCV/FBC as a whole.

[Figure 4.1 – pertinence of the critical quality factors of each sector relatively to the set of critical factors identified for the UCCV/FBC as a whole: rows = UCCV/FBC sectors (hemodynamics, arrhythmia, offices, ecography, ergometry, nuclear medicine, emergency, in-patients); columns = critical factors P3, P4, P7, P8, P13, P14, P19, P34, P41, P42.]

4.2 ESTABLISH THE QUALITY PROGRAM OBJECTIVES

1st stage: identification of the activities which are relevant to the quality program
The initial list was built based on Lim et al (1999), Ahire (1996), Anderson (1997), Black (1996), Camilleri and Callaghan (1998), Capon (1995), Forza (1995), Naveh (1998) and Thiagarajan and Zairi (1998). The Head Physician of the Nuclear Medicine sector, the FBC Head Nurse and the Administrative Manager were interviewed to evaluate, add to or remove activities from this list and give it its final format. The final list comprises 24 activities, such as physician involvement in decision-making, patient education, advance medical research, survey of patients, benchmarking and schedule for continuity.

2nd stage: establishment of the relationship between activities and critical quality factors
The Head Physician of the Nuclear Medicine sector, the FBC Head Nurse and the Administrative Manager were selected to establish this relationship, following the steps defined in the methodology. To assign the degree of relationship between a quality factor (line) and the corresponding activity (column), the values one, three and nine were used to indicate a weak, medium and strong relationship, respectively.

3rd stage: determining critical activities
The objective of this stage is to order the activities and select those that are critical. The criteria for ordering were established in a meeting with the FBC Chairman and Head of UCCV. The first criterion adopted was to respect the principle of limiting project action and focusing on the two first sets, emergency and appointments. The second criterion was to identify the first three activities which are simultaneously among the ten first activities in the specific set and in the set in which all critical factors are considered. This intersection of criteria identified A9, A12 and A20 for emergency activities; for appointment activities, A1, A10, A11 and A14 were identified. The results are presented in tables 4.6, 4.7 and 4.8.

Table 4.6 – first ten ordered activities when all critical quality factors are considered

A14  quality operational management             344
A1   schedule for continuity                    343
A11  involving people with quality              327
A4   clinical treatment activities              319
A20  patient admission                          312
A5   benchmarking                               309
A9   in-service education and training          306
A3   clinical diagnosis activities              303
A10  physician involvement in decision-making   302
A12  nursing operations management              298

Table 4.7 – first five ordered activities when only emergency related critical quality factors are considered

A20  patient admission                          25.3
A12  nursing operations management              25.3
A10  physician involvement in decision-making   25.3
A9   in-service education and training          25.3
A14  quality operational management             23.9

Table 4.8 – first five ordered activities when only appointment related critical quality factors are considered

A14  quality operational management   16.7
A11  involving people with quality    16.7
A1   schedule for continuity          16.3
A15  patient scheduling               15.7
A20  patient admission                15.3

4th stage – Quality Program objectives statements
Quality objectives are determined through the analysis of the survey results and the design of an Action Plan. This plan resulted from a series of meetings with the FBC Chairman and Head of UCCV, which, considering the impossibility of physical enlargement, budget limitations and the urgent need to provide better service under specific aspects, decided to act on the reception activity of the emergency sector and on the follow-up activities and quality operating management of the appointments and complementary exams sector. The following objectives were defined:
− Improve the perception of the readiness of the emergency care service by improving patient reception.
− Increase facilities for setting appointments and scheduling complementary exams.
To achieve these objectives, the following procedures were implemented at the emergency sector:
− Receive emergency patients, in two work shifts, with a nursing student in charge of buffering the arrival to the E.R.
− Collect with these students data related to the times of arrival, care, treatment and referral, besides the final destination of each patient.
− 30 days after procedure implementation, conduct a satisfaction survey for 15 days, using a questionnaire, with another group of students.
At the appointment scheduling service:
− Receive appointment and complementary exam patients, in two working shifts, at an office close to the doctors' offices, to schedule the requested complementary exams.
− Simultaneously, with a group of students, record data associated to the punctuality of doctors and patients, and investigate punctuality indexes in the conduction of appointments.
− After treatment, conduct a satisfaction survey, using a questionnaire, with another group of students.

4.3 ESTABLISH QUALITY INDICATORS

The purpose of this phase was to follow-up the Action Plan using specific indicators. Two sets of indicators were established, one for the Emergency Care sector and the other for the Appointments and Complementary Exams sector, so there are two stages.

1st stage: establish a set of quality indicators for the Emergency Care sector
This stage established a set of quality indicators for the Emergency Care sector to follow up the actions and results of the Action Plan. The Goal Question Metric (GQM) method was used. As defined by GQM, an objective was established: improve the rates of patient satisfaction perception relatively to the quality of the Emergency Room (E.R) services provided. Worked through the GQM cycle, this objective produced four questions and 17 indicators. Three of these questions and seven related indicators are pointed out as examples (calculation formulas excluded):

Question 1: Which is the level of promptness?
Indicator 1.1: User subjective evaluation about promptness in the E.R
Indicator 1.2: Average waiting time in the E.R
Question 2: Which is the availability of services?
Indicator 2.1: Doctors absenteeism in the E.R
Indicator 2.2: Unavailability of critical equipment in the E.R
Question 3: What is the assistance capacity?
Indicator 3.1: Number of assistances provided in the E.R
Indicator 3.2: Distribution of patients relatively to time spent in the E.R
Indicator 3.3: Distribution of patient destination in the E.R

2nd stage: establish a set of quality indicators for the Appointments and Complementary Exams Sector
This stage established a set of quality indicators for the Appointments and Complementary Exams Sector with the same purpose as the first stage, and the technique used was also the GQM. However, two objectives were defined instead of one. The first objective was to reduce the waiting time for appointments with physicians. The GQM cycle produced five questions and 11 indicators. Two of these questions and two related indicators are pointed out as examples (calculation formulas excluded):

Question 4: Which are the characteristics of the demand met?
Indicator 4.1: Percentage of extra services
Question 5: Which are the patients' attributes?
Indicator 5.1: Percentage of late patients

The second objective was to make it easier for patients to set appointments with physicians. Three questions and eight indicators were produced; two questions and three related indicators are examples (calculation formulas excluded):

Question 6: Which are the characteristics of the agenda?
Indicator 6.1: Number of cancellations versus number of confirmed appointments
Indicator 6.2: Number of confirmed appointments versus capacity of the agenda
Question 7: Which are the patients' attributes?
Indicator 7.1: Loyalty of first appointment patients

4.4 ESTABLISH REQUIREMENTS TO INFORMATION SYSTEMS
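To make the goal–question–metric decomposition concrete, the sketch below captures the Emergency Care objective with two of the example questions and their indicators. The data structure is illustrative only and is not part of the original paper:

```python
# Minimal sketch of a GQM tree for the Emergency Care objective.
# The structure mirrors the three GQM levels: goal -> questions -> metrics.
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    indicators: list[str] = field(default_factory=list)

@dataclass
class Goal:
    statement: str
    questions: list[Question] = field(default_factory=list)

er_goal = Goal(
    statement=("Improve the rates of patient satisfaction perception "
               "relatively to the quality of E.R services provided"),
    questions=[
        Question("Which is the level of promptness?",
                 ["User subjective evaluation about promptness in the E.R",
                  "Average waiting time in the E.R"]),
        Question("Which is the availability of services?",
                 ["Doctors absenteeism in the E.R",
                  "Unavailability of critical equipment in the E.R"]),
    ],
)

for q in er_goal.questions:
    print(q.text, q.indicators)
```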

In this phase, the requirements to the Information Systems that enable managers to follow-up the Quality Program launched by the Action Plan were identified.

1st stage: establishing the procedures to collect data to calculate the indicators
This stage established the procedures to obtain the data required for indicator calculation, through the following three steps.

Step 1 – classifying indicator data
The indicators established in the previous phase were analyzed in order to determine the data required to calculate them and to identify where and how this data should be collected. Figure 4.2 points out some examples.

[Figure 4.2 – required data for calculation and origin of indicators:
• User subjective evaluation about promptness in the E.R – subjective evaluation – opinion poll in the E.R
• Average waiting time in the E.R – arrival time and admission time of each patient – observation in the E.R
• Number of services provided in the E.R – number of services provided – observation in the E.R
• Distribution of patients relatively to time spent in the E.R – admission time and liberation time of each patient – observation in the E.R
• Distribution of patient destination in the E.R – destination of each patient – observation in the E.R
• Percentage of extra services – number of extra patients and number of services provided – observation in the appointments sector
• Percentage of late patients – arrival time and appointment time of each patient – observation in the appointments sector
• Number of cancellations versus number of confirmed appointments – number of cancellations and number of confirmed appointments – information system]

Step 2 – establishing procedures for data collection
Two questionnaires were designed to be used in the opinion poll: one for the E.R and the other for the Appointments Sector. These questionnaires were evaluated at a meeting with the FBC Chairman and Head of UCCV. The sample comprises only patients: everyone who came to the E.R or to the appointment and complementary exam sector was potentially part of the sample. For being simpler, the forms on which the observations would be recorded were designed by the project team. The team of interviewers was trained in three meetings, in which the way to approach users and to complete the questionnaires was discussed.

The following data was collected on the form used in the E.R: date, name or identity card, number of the service order, times of arrival, service provided or patient desisting, and final destination. The form used by the Appointments Sector recorded: date, name, times of arrival, appointment time and the time the patient was admitted for the appointment. The E.R questionnaire addressed items related to promptness, 24-hour availability, resources availability, facilities access, and an overall value judgment of the services offered in the E.R. The questionnaire distributed to Appointment Sector patients appraised items related to waiting time, appointments and also an overall value judgment. Content evaluation was a simple process because there was previous experience of more than 900 interviews with questionnaires of very similar content. The pilot trial was conducted with the team of interviewers and led to small changes to the questionnaires. Final approval was obtained at a meeting with the FBC Chairman and Head of UCCV. Regarding the information system, there are several data already available to calculate indicators.

2nd stage: specifying identified indicators
This stage completed the specification of the quality indicators, determining the items of the specification standard (figure 4.3). Three of these items were determined during the prior phase (the name of the indicator, its acronym and the calculation formula); data origin and collection mode are results of the previous stage; the remaining items (collection period, presentation mode, aggregation period and goal) must be suggested by the project team. The final product of the stage is a document containing the specifications of all indicators. Figure 4.3 is an example of the specification standard.

[Figure 4.3 – indicator specification standard]
Indicator: Percentage of late patients
Acronym: Late_pat
Collection mode: observation
Required data: arrival time of each patient; appointment time of each patient
Data origin: appointments sector
Collection period: daily
Presentation mode: graphic
Aggregation period: one week
Goal: maximum of 10%
Formula: Late_pat = Σ(arrival time − appointment time)* / number of observations
(*) only the cases where arrival time > appointment time

3rd stage: specify the Information System architecture
This stage has the purpose of specifying an architecture for the Information System. The methodology uses the framework proposed by Hatley et al (2000) and suggested by Pressman (2001).
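A minimal sketch of the Late_pat calculation as specified in figure 4.3 is shown below. The observation records are hypothetical; following the formula as printed, only arrivals later than the appointment time enter the sum:

```python
# Sketch of the Late_pat indicator from its specification (figure 4.3).
# Each record: (arrival_time, appointment_time) in minutes since midnight.
observations = [(545, 540), (600, 610), (725, 700)]  # hypothetical data

late_sum = sum(arr - appt for arr, appt in observations if arr > appt)
late_pat = late_sum / len(observations)  # formula from the specification
print(late_pat)  # -> 10.0 for this sample
```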

5.comparative data 22 . • all required flows.RESULTS 4. a classification corresponding to a “good” perception of the service. Table 4. The comparative table 4.1 points (three points scale).9 .4 – the proposed architecture 4.5 ACTION PLAN . Dialogue Answer processor Quality module (process and algorithms) Report generator Data extractor Algorithms and goals revision process Figure 4. • four terminators (one for each external item with which the system must communicate) – the answer processor. resulting in a global evaluation.1 Emergency Care The opinion poll was conducted with 434 interviews. The flow context diagram shows the system embedded in its environment (figure 4.4) and has the following elements: • one architecture module representing the system under development – the quality module. of 2. the report generator and the dialogue system.The framework proposed has a top-down hierarchy.9 shows a marked improvement of all surveyed factors with evidence that the way patients are received in the Emergency Room sector influences patient perception of service quality. which uses flow context diagram at the highest level. of which there was no previous record. the data extractor.

Table 4.65 4. 40% of those who come to the E.10 records a small improvement in almost all surveyed factors. facilities Evaluation on resources availability Evaluation on promptness to provide service Evaluation on 24-hour availability After the Action Plan 4.1 Before the Action Plan 3.5.10 .43 points (three points scale). possibly because of the decision to maintain.08 3. There are indications.45 3.81 4.R are referred to other institutions. confirming previous diagnosis.1 4.61 4.69 3. The comparative table 4.77 The analysis of data collected during the observation of 494 patients’ receptions in the E.08 3.satisfaction level with the service provided by appointment scheduling system Level Factor P3 P6 P12 P19 P21 P23 P24 P28 P33 The number of professionals is sufficient to meet the demand.76 4.3 4.67 Before Action Plan 3. • For 50% of patients waiting time exceeds 1 hour and just 29% of patients are seen in less than 30 minutes. The services are provided at the right time. that the action of scheduling appointments close to the offices has influenced the perception of service quality.63 4. Nevertheless. of 2.80 4.30 4.R ranges around 90 minutes for 60% of patients. 4.R revealed: • The time patients remain in the E. The consult period is sufficient to establish the patient status.56 23 .69 4. The staffs’ attitude instills confidence in patients. during the period of experience. The staff has the knowledge to answer patients’ questions.10 4.49 4. The equipment has high level of availability.42 4. an advanced station to schedule appointments and exams close to Doctors offices. for which there was no previous record. The doctor’s offices and complementary exams rooms have sufficient resources to support patients’ consults and exams.76 4.47 3.Level Factor P7a P7b P34 P41 Evaluation on access to E. Both factors deemed to be critical showed the already expected behavior (P19 and P42). resulting in a global evaluation.3 4. • About 50% of patients go back home. The information provided to patients is correct. a classification corresponding to a “upper good” perception of the service. The factor that measures punctuality (P19) remained unchanged because punctuality was only the target of a survey and not an improvement target. therefore.35 4.R.2 Appointments The opinion poll was carried out with 127 interviews. The factor that measures perception of the facility to schedule appointments and exams (P42) has increased. The professionals always are courteous and respectful with patients. After Action Plan 3. Just 15% stayed more than 3 hours.57 4.

The analysis of the data collected during the observation of 1422 patients' receptions in the appointment sector revealed:
• Just 40% of the patients are seen with less than a 30 minute wait, and almost 30% of the patients wait for more than 1 hour.
• The rate of unscheduled extra patients reaches a 20% average, which may account for the waiting time.

5. CONCLUSION

This paper discusses service quality, focusing on the health services area. Information and quality are seen as strategic organization management tools. The priority assigned to service quality and user satisfaction can be easily evidenced in the new version of the ISO 9000:2000 standard, which establishes that the measurement of user satisfaction is a requirement for an ISO 9000 certification. The methodologies available to link these two sources of competitive advantage, emphasizing the use of the organization's Information Systems, do not cover the various aspects which should be emphasized to achieve user satisfaction, particularly in health care organizations. This paper presents a methodology, and its validation, to identify the expectations and the perception of quality by the social body of an organization, with the purpose of defining the objectives of a Quality Program. In the reports studied in the first stage of phase two, the approach focus, main results, success factors, barriers, methodological characteristics, detected objectives and evaluation processes were identified. The methodology design has taken into account the following determining factors: (1) build an integrated view of quality – this factor is ensured by the use of samples representing all social groups within the examined universe and by the way in which critical quality factors are determined, and its purpose is to make priorities a consequence of an interaction between users and the organization; (2) value the view of service rendering – this factor is ensured by the use of techniques which focus on the behavior of both people and organizations at the times in which service is rendered; (3) associate quality factors to organizational activities and establish priorities – this step is also quite clear in the methodology; (4) identify indicators and establish follow-up procedures. Another strength of the methodology is the study of context. It also became quite clear that the methodology supports decision making, as it shows the organization's critical factors and their links with the organization's activities.

The validation occurred through a case study developed in two phases: an opinion poll and an action plan. During the opinion poll, 768 patients and health professionals were interviewed about the relative importance of 47 quality factors and about their perception of the quality level attained for those same factors. The survey results identified critical quality factors and critical activities, which were the basis for the formulation of an action plan. The action plan accomplished a satisfaction survey and an observation activity, both for the E.R and for the appointment sector. The satisfaction survey and the observation activity conducted 434 interviews and observed 494 patients' receptions in the E.R, revealing a "good" overall evaluation of the perception of the service and a marked improvement in all surveyed factors, with evidence that the way patients are received in the Emergency Room has improved. At the appointment sector, the opinion poll was carried out with 127 interviews, resulting in an overall evaluation of an "upper good" perception of the service. The observation activity over 1422 patients detected a small improvement in almost all surveyed factors, and a positive increase in the factor that measures the perceived ease of scheduling appointments and exams, possibly because of the decision to maintain, during the period of the experience, an advanced station to schedule appointments and exams close to the doctors' offices.

Bibliography

AHIRE, S. L., GOLHAR, D. Y., WALLER, M. A., "Development and Validation of TQM Implementation Constructs", Decision Sciences, v. 27, n. 1, pp. 23-56, 1996.
ANDALEEB, S. S., "Determinants of customer satisfaction with hospitals: a managerial model", International Journal of Health Care Quality Assurance, v. 11, n. 6, 1998.
ANDERSON, J. G., "Clearing the Way for Physicians' Use of Clinical Information Systems", Communications of the ACM, v. 40, n. 8, pp. 83-90, August 1997.
APPLEMAN, "Navy Hospital Fights Diseases With a Quality Team", Quality Progress, April 1995.
BABAKUS, E., MANGOLD, W. G., "Adapting the SERVQUAL Scale to Hospital Services: An Empirical Investigation", Health Services Research, v. 26, n. 6, 1992.
BATES, D., PAPPIUS, E., KUPERMAN, G., et al., "Using Information Systems to Measure and Improve Quality", International Journal of Medical Informatics, v. 53, pp. 115-124, 1999.
BERNDT, D., et al., "Introduction to the Minitrack: Databases, Data Warehousing and Data Mining in Health Care", In: Proceedings of the 33rd Hawaii International Conference on System Sciences, 2000.
BLACK, S., PORTER, L., "Identification of the Critical Factors of TQM", Decision Sciences, v. 27, n. 1, pp. 1-21, 1996.
BRASHIER, L. W., et al., "Implementation of TQM/CQI in the health-care industry: A comprehensive model", Benchmarking for Quality Management & Technology, v. 3, n. 2, 1996.
BURNEY, R., "TQM in a Surgery Center", Quality Progress, v. 27, n. 1, pp. 97-100, January 1994.
CAMILLERI, D., O'CALLAGHAN, M., "Comparing public and private hospital care service quality", International Journal of Health Care Quality Assurance, v. 11, n. 4, 1998.
CAPON, N., KAYE, M., WOOD, M., "Measuring the success of a TQM programme", International Journal of Quality and Reliability Management, v. 12, n. 8, pp. 8-22, 1995.

CHECKLAND, P., Systems Thinking, Systems Practice, John Wiley and Sons, Chichester, 1981.
CHESNEY, "A Canadian Hospital Implements Continuous Quality Improvement", Quality Progress, April 1993.
DASCH, B., "Hospital Sets New Standard as Closure Approaches: Quality is Continuous", Quality Progress, October 1995.
DAVIS, G. B., OLSON, M. H., Management Information Systems: Conceptual Foundations, Structure and Development, 2 ed., McGraw-Hill, Singapore, 1985.
DONABEDIAN, A., Explorations in Quality Assessment and Monitoring, Health Administration Press, Ann Arbor, 1980.
ENCHAUG, I. H., "Patient participation requires a change of attitude in health care", International Journal of Health Care Quality Assurance, v. 13, n. 4, pp. 178-181, 2000.
FAHEY, P., RYAN, S., "Quality Begins and Ends with Data", Quality Progress, pp. 75-79, April 1992.
FEIGENBAUM, A. V., Total Quality Control, McGraw-Hill Book Company, New York, 1983.
FINISON, L., "What are Good Health Care Measurements?", Quality Progress, pp. 41-42, April 1992.
FORZA, C., "The impact of information systems on quality performance: an empirical study", International Journal of Operations & Production Management, v. 15, n. 6, pp. 69-83, 1995.
GODFREY, A. B., BERWICK, D. M., ROESSNER, J., "Can Quality Management Really Work in Health Care?", Quality Progress, April 1992.
GOPALAKRISHNAN, K. N., MCINTYRE, B. E., "Hurdles to Quality Health Care", Quality Progress, April 1992.
GRONROOS, C., "Service Quality: The Six Criteria of Good Perceived Service Quality", Review of Business, St. John's University, v. 9, n. 3, 1988.
HART, M., "Monitoring quality in the British health service – a case study and a theoretical critique", International Journal of Health Care Quality Assurance, 1997.
HATLEY, D., HRUSCHKA, P., PIRBHAI, I., Process for System Architecture and Requirements Engineering, Dorset House, New York, 2000.
HUQ, Z., "A TQM evaluation framework for hospitals", International Journal of Quality and Reliability Management, v. 13, n. 6, pp. 59-76, 1996.
IEEE, "IEEE Recommended Practice for Software Requirements Specifications", In: THAYER, R., DORFMAN, M. (eds), Software Requirements Engineering, 2 ed., chapter 3, pp. 176-205, IEEE Computer Society, 1997.
JURAN, J. M., Quality Control Handbook, McGraw-Hill, New York, 1988.
LIM, P. C., TANG, N. K. H., JACKSON, P. M., "An innovative framework for health care performance measurement", Managing Service Quality, v. 9, n. 6, 1999.
LIM, P. C., TANG, N. K. H., "The development of a model for total quality healthcare", Managing Service Quality, v. 10, n. 2, pp. 103-111, 2000.
MATERNA, M. L., et al., "Improving Health Care on a Tight Budget", Quality Progress, April 1992.
MATHERLY, L., LASATER, H., "Implementing TQM in a Hospital", Quality Progress, pp. 81-84, April 1992.

NABITZ, U., WALBURG, J., "Addicted to quality – winning the Dutch Quality Award based on the EFQM Model", International Journal of Health Care Quality Assurance, v. 13, n. 6, pp. 259-265, 2000.
NAVEH, E., ZONNENSHAIN, A., EREZ, M., "Developing a TQM Implementation Model".
PARASURAMAN, A., ZEITHAML, V., BERRY, L., "SERVQUAL: A Multiple-Item Scale for Measuring Consumer Perceptions of Service Quality", Journal of Retailing, v. 64, n. 1, pp. 12-40, Spring 1988.
PLUMMER, A. A., "Information Systems Methodology for Building Theory in Health Informatics: The Argument for a Structured Approach to Case Study Research", In: Proceedings of the 34th Hawaii International Conference on System Sciences, Hawaii, 2001.
POTTER, C., MORGAN, P., THOMPSON, A., "Continuous Quality Improvement in an Acute Hospital: A Report of an Action Research Project in Three Hospital Departments", International Journal of Health Care Quality Assurance, v. 7, n. 1, 1994.
PRESSMAN, R., Software Engineering: A Practitioner's Approach, 5 ed., McGraw-Hill, New York, 2001.
RICHARDSON, W., GURTNER, W., "Contemporary organizational strategies for enhancing value in health care", International Journal of Health Care Quality Assurance, v. 12, n. 5, pp. 183-189, 1999.
ROLAND, "Insights into Improving Organization Performance", Quality Progress, March 1997.
SANDERS, N. R., "Health Care Organizations Can Learn From the Experiences of Others", Quality Progress, February 1997.
SHAW, "Learning from Mistakes", Quality Progress, June 1995.
SHORTLIFFE, E. H., PERREAULT, L. E. (eds), Medical Informatics: Computer Applications in Health Care, Addison-Wesley, 1990.
SOLINGEN, R., BERGHOUT, E., The Goal/Question/Metric Method, McGraw-Hill, 1999.
STRATTON, B., "Overlook Hospital Emergency Department: Meeting the Competition with Quality", Quality Progress, October 1998.
THATCHER, M. E., OLIVER, J. R., "The Impact of Information Technology on Quality Improvement, Productivity and Profits: An Analytical Model of a Monopolist", In: Proceedings of the 34th Hawaii International Conference on System Sciences, Hawaii, 2001.
THIAGARAJAN, T., ZAIRI, M., "An empirical analysis of critical factors of TQM", Benchmarking for Quality Management & Technology, v. 5, n. 4, pp. 291-303, 1998.
VALDIVIA, T., CROWE, "Achieving hospital operating objectives in the light of patient preferences", International Journal of Health Care Quality Assurance, v. 10, n. 5, pp. 208-212, 1997.
VAN DER BIJ, J. D., VISSERS, J. M. H., "Monitoring health care processes: a framework for performance indicators", International Journal of Health Care Quality Assurance, v. 12, n. 5, pp. 214-221, 1999.

VANDAMME, R., LEUNIS, J., "Development of a Multiple-item Scale for Measuring Hospital Service Quality", International Journal of Service Industry Management, v. 4, n. 3, pp. 30-49, 1993.
WIEDERHOLD, G., PERREAULT, L. E., "Hospital Information Systems", chapter 7, In: SHORTLIFFE, E. H., PERREAULT, L. E. (eds), Medical Informatics: Computer Applications in Health Care, Addison-Wesley, 1990.
YOUSSEF, F. N., NEL, D., BOVAIRD, T., "Health care quality in NHS hospitals", International Journal of Health Care Quality Assurance, v. 9, n. 1, 1996.