
Mills Education Evaluation Services, Inc.

Evaluation of Determining Instructional Purposes (DIP)
A Proposal Submitted to Far West Laboratory for Educational Research and Development
By Cynthia Mills, CEO
October 20, 2013

Introduction

The Far West Laboratory for Educational Research and Development (FWL), located in Las Vegas, Nevada, has issued a Request for Proposals and is accepting bids. The company made an initial investment in a training package entitled Determining Instructional Purposes (DIP), which consists of a coordinator's handbook and three training units. First, FWL seeks information and recommendations for use in making decisions regarding the marketing and sale of the instructional package. Second, FWL wishes to provide useful information to school administrators; thus, an evaluation is needed. Finally, FWL needs to determine whether the program is meeting the needs of its stakeholders and whether it is viable to invest more money in the training and market the DIP further. This proposal is submitted by Lead Evaluator Cheryl Macy of Mills Education Evaluation Services, Inc. in accordance with the Request for Proposals put forward by FWL.

Description of Program Being Evaluated

The Determining Instructional Purposes (DIP) training package was developed and designed to train school administrators and graduate students in educational administration. The training incorporates the skills needed for planning and preparing effective administrative programs and practices. The package consists of a Coordinator's Handbook and three training units: Unit 1, Setting Goals; Unit 2, Analyzing Problems; and Unit 3, Deriving Objectives. Each unit contains four to six modules tied to a defined set of learning objectives. Each module contains reading material, hands-on individual and group activities, and feedback or assessments. For example, learners read material that guides them through a particular skill and are then given a hypothetical situation in which that skill must be applied.
Learners provide feedback individually through self-assessment, or they collaborate in groups and critique one another's outcomes. Learners are also organized into planning teams that apply problem-solving skills and determine the solutions necessary for success. The units are flexible in that administrators can focus on the unit of their choice, whether it is setting goals or analyzing problems. The units are also self-contained so that learners can progress step by step. The training time required is 10-25 hours for Units 1 and 3 and 12-18 hours for Unit 2. The program developers' intention is that the coordinator works through all of the materials first and then serves as the trainer; his or her role is to organize, guide, and monitor activities and progress. The Coordinator's Handbook is included in the training package.

Evaluation Method

Because the goal of the evaluation is to determine whether FWL should market the DIP training package, the primary focus of this proposal is the method by which the information and recommendations will be evaluated. The audiences for the evaluation are FWL, administrators, and the other schools across the country to which FWL will market the package if it proves feasible. The Decision-Making Model developed by Daniel Stufflebeam will be implemented to guide decisions regarding the future use of the training. Both

quantitative methods, such as pre-assessments, and qualitative methods, such as interviews, observations, and surveys, will be used. To evaluate this program effectively, Mills Education Evaluation Services (MEES) will address the following initial questions using a survey. The questions are designed for the creators of the program; they focus on the efficiency, effectiveness, and impact the program will have.

Questions of efficiency:
- When will training begin?
- How much will it cost?
- How long will it take?

Questions of effectiveness:
- What do the administrators do?
- How well will the program be implemented?
- What are the desired outcomes?

Questions of impact:
- Does the program influence the learners' reactions to situations?
- Are the skills learned applied to troubleshoot and problem-solve?
- Does the training program have value? If so, what is it?

Next, once the program coordinators have finished working through the materials, MEES will interview them; the coordinators are the Vice Principal and Principal at Rocky Mountain High School in Las Vegas, Nevada. MEES will also interview the four other Vice Principals participating in the training and their graduate students, who are seeking Master's degrees in administration. These interviews will be recorded and analyzed using a holistic rubric that defines the goals and objectives of the training package; this will serve as a summative assessment tool. Finally, MEES will observe the administrators and graduate students implementing the skills they have learned, using a formative assessment tool that breaks down each unit (Setting Goals, Analyzing Problems, Deriving Objectives) by the learning objectives defined in the training package.

Task Schedule: January 5, 2014 to June 10, 2014
Date   | Task                                                                 | Responsible Party
Jan 5  | Evaluator meets with FWL to survey creators of the DIP.              | Cheryl Macy, Lead Evaluator, MEES
Jan 15 | Evaluator interviews program coordinators.                           | Cheryl Macy, Lead Evaluator, MEES
Jan 17 | Evaluator meets with administrators and graduate students at the onset of their training for an informal interview and discussion of the overall process, including the formative and summative assessments. | Cheryl Macy, Lead Evaluator, MEES; Lynn Fouts, Assistant Evaluator, MEES
Jan 30 | Survey data is collected. Evaluators meet with FWL to discuss initial findings and review formative and summative assessment tools; tools are revised if necessary. | Cheryl Macy, Lead Evaluator, MEES; Lynn Fouts, Assistant Evaluator, MEES
Feb 15 | Interviews of administrators begin.                                  | Cheryl Macy, Lead Evaluator, MEES; Dr. Schopen, Expert, Retired Principal
Mar 3  | Interviews of graduate students begin.                               | Cheryl Macy, Lead Evaluator, MEES; Dr. Schopen, Expert, Retired Principal; Lynn Fouts, Assistant Evaluator, MEES
Apr 5  | On-the-job observations begin for administrators.                    | Cheryl Macy, Lead Evaluator, MEES; Dr. Schopen, Expert, Retired Principal; Lynn Fouts, Assistant Evaluator, MEES
May 1  | On-the-job observations begin for graduate students.                 | Cheryl Macy, Lead Evaluator, MEES; Dr. Schopen, Expert, Retired Principal; Lynn Fouts, Assistant Evaluator, MEES
May 9  | Data from interviews is compiled and analyzed.                       | Dr. Schopen, Expert, Retired Principal
May 15 | Meeting between expert and evaluators to review data from interviews. | Cheryl Macy, Lead Evaluator, MEES; Dr. Schopen, Expert, Retired Principal; Lynn Fouts, Assistant Evaluator, MEES
May 20 | Data from observations is compiled and analyzed.                     | Dr. Schopen, Expert, Retired Principal
May 30 | Meeting between expert and evaluators to review data from observations. | Cheryl Macy, Lead Evaluator, MEES; Dr. Schopen, Expert, Retired Principal; Lynn Fouts, Assistant Evaluator, MEES
Jun 2  | Data collection and analysis is completed. MEES meets with FWL.      | Cheryl Macy, Lead Evaluator, MEES
Jun 10 | Final evaluation report completed. Final meeting with FWL.           | Cheryl Macy, Lead Evaluator, MEES; Lynn Fouts, Assistant Evaluator, MEES

Project Personnel

MEES: An educational evaluation and consulting firm that works with foundations, organizations, and corporations whose primary interest is enhancing education at every level. MEES conducts assessments and evaluations to inform the strategic direction of educational training and programs. We pride ourselves on providing high-quality quantitative and qualitative data that enables our clients to maximize their investments and the effectiveness of their educational initiatives.

Cheryl Macy, Lead Evaluator: Mrs. Macy has 10 years' experience as an evaluator, facilitator, researcher, and program specialist. Her specialties include developing measures to determine whether a program is meeting its goals and objectives and designing evaluations to determine whether a program is having its intended impacts. She has a B.S. in Information Technology and a Master's in Educational Technology.

Lynn Fouts, Assistant Evaluator: Mrs. Fouts has 7 years' experience as an assistant evaluator. She specializes in conceptual models that provide written descriptions of how program activities and components relate to each other and to the overall goals and objectives. She has a B.A. in English and a Master's in Curriculum and Instruction.

Dr. Cynthia Schopen, Expert: Dr. Schopen is a retired principal with 25 years of experience. She has a Doctorate in K-12 Education and a Master's in Curriculum and Instruction.

Budget

Personnel
- Lead Evaluator: 80 days @ $250 per day = $20,000
- Assistant Evaluator: 64 days @ $175 per day = $11,200
- Expert: 30 days @ $400 per day = $12,000
Personnel Total = $43,200

Training Supplies
- Coordinator's Handbook: 2 @ $4.50 = $9.00
- Books for Participants: 10 people @ $24.95 per set of books = $249.50
Training Supplies Total = $258.50

Travel and Per Diem
- Travel Per Diem: 20 days x 1 person @ $250 per day = $5,000

Supplies and Communication
- Survey = $300
- Paper, pens, ink, etc. = $200
- Phone calls, local and long distance = $400
Supplies and Communication Total = $900

Total Budget = $49,358.50
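As a check on the figures above, the budget arithmetic can be recomputed from the stated rates and quantities. The short Python sketch below is illustrative only (not part of the deliverables); it reproduces each subtotal and the grand total.

```python
# Budget verification sketch: recomputes the line items and totals
# stated in the Budget section above.

personnel = {
    "Lead Evaluator": 80 * 250,       # 80 days @ $250/day = $20,000
    "Assistant Evaluator": 64 * 175,  # 64 days @ $175/day = $11,200
    "Expert": 30 * 400,               # 30 days @ $400/day = $12,000
}
personnel_total = sum(personnel.values())          # $43,200

training_supplies = 2 * 4.50 + 10 * 24.95          # $9.00 + $249.50 = $258.50
travel_per_diem = 20 * 1 * 250                     # 20 days x 1 person @ $250/day = $5,000
supplies_communication = 300 + 200 + 400           # $900

grand_total = (personnel_total + training_supplies
               + travel_per_diem + supplies_communication)

print(f"Personnel total: ${personnel_total:,}")
print(f"Grand total: ${grand_total:,.2f}")
```

Running the sketch confirms the personnel subtotal of $43,200 and the proposed total budget of $49,358.50.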