
Effectiveness of MoodleRooms Training at Union University

Final Evaluation Report
EDTECH 505-4173
Josh Simmons
August 4, 2013

Table of Contents
Learning Reflection
Executive Summary
Purpose of Evaluation
Background Information
Evaluation Design
Results of Evaluation
Discussion of Results
Conclusions and Recommendations
Appendix A: Survey Form
Appendix B: Email to Faculty Members
Appendix C: Project Timeline


Learning Reflection
Evaluation was a subject I felt familiar with when the course began. However, I quickly learned that my experiences with evaluation had not come from the design side, but typically only from discussions of results. I had a small amount of experience looking at data and determining how it should guide future planning. The textbook examined many aspects of evaluation that I had never considered, such as evaluation models and designing an evaluator's program description. These are 2 critical aspects of designing an evaluation, yet I had never considered all the steps necessary to ensure an evaluation addresses the various aspects of a project.

One project that truly showed me the breadth of an evaluation was the Tangled assignment. In this assignment we began by asking questions, and I soon realized that one question leads to another and then another. These questions often overlap, raising still more questions and expanding the evaluation process. As I began developing my own evaluation, I had grandiose ideas and envisioned an evaluation that would cover the entire implementation of MoodleRooms at my university. Dr. Perkins advised me not to "bite off more than I could chew." I didn't think my project was too big until I began developing the assessment questions, and once again the project grew into a huge undertaking. I realized that evaluations must be defined in attainable terms. We could evaluate the effectiveness of the US government, but it would be a huge undertaking; an evaluation of one small portion is much more attainable.

Evaluation is an enlightening experience. It is sometimes difficult to accept the answers we gain through evaluation. However, to truly grow in any aspect of life, including education, we must learn to take the good with the bad and apply the lessons learned.
I am thankful for the evaluation project, as it gave me a chance to work closely with co-workers to scrutinize an area that has had very little assessment in the past: faculty training. Through the results of the evaluation, we have already begun the improvement process.

AECT Standards Addressed

5.1 Problem Analysis – Evaluation revolves around studying problems. The Tangled assignment proved just how difficult it is to analyze a problem and showed the many angles from which a problem can be viewed. The Evaluation Proposal identified a real problem that needed to be addressed and examined the evaluation in light of all stakeholders involved.

5.2 Criterion-Referenced Measurement – Essential to criterion-referenced measurement is the desired learning outcome, often based on a rubric. A rubric is needed as a standard or guide for evaluation or assessment. In criterion-referenced measurement, a learning task is balanced with the learning assessment, and each item is measured appropriately against the standard.

5.3 Formative and Summative Evaluation – Formative assessment takes place throughout a process and includes various trials and tests. Its purpose is to make changes in a design before it is fully executed. A summative assessment takes place after implementation and often measures the effectiveness of a product or design. My evaluation project was a summative assessment of the effectiveness of MoodleRooms training at my university.


5.4 Long-Range Planning – Once data has been analyzed and an evaluation completed, the impact of the study must be applied to the item being studied. Long-range planning is where implementation and alignment of evaluation results occur. The purpose of evaluation is to better a product or design; it is a tool for planning, making changes now that will affect the product or design tomorrow.


Executive Summary
This evaluation was conducted to examine the effectiveness of MoodleRooms training for Union University faculty members. In October 2011 the university selected a team of 7 faculty, 2 administrators and 2 information technology staff members to choose a new learning management system (LMS). The team conducted research and determined MoodleRooms would be the new LMS. Training was determined to be the best way to gain faculty buy-in for the new LMS, and the team suggested multiple face-to-face training sessions as well as a self-paced online training tool. This evaluation was conducted with the goal of examining current training practices and finding areas for improvement.

The evaluator created a survey consisting of 28 questions to gather demographic information and assess both the face-to-face and online training sessions. The survey also examined communication to faculty members and whether faculty had knowledge of all training opportunities. The survey was emailed to 250 full-time faculty members and was available for completion for 2 weeks. 37 faculty members responded, a response rate of nearly 15%.

Results show that most respondents found the training at least somewhat helpful. Most faculty were aware of at least 2 face-to-face training opportunities, even though 6 were initially offered. About half of the respondents participated in the online training and over half participated in the face-to-face training. The majority found the training materials and training staff helpful. Overall, the training was effective; however, there are several areas that need improvement.
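The "nearly 15%" response rate follows directly from the counts reported above. A quick illustrative sketch (Python is used here purely for demonstration; the numbers are those stated in this summary):

```python
# Illustrative sketch: reproducing the response-rate arithmetic
# from the summary (37 responses out of 250 surveyed).
surveyed = 250
responded = 37

response_rate = responded / surveyed * 100
print(f"Response rate: {response_rate:.1f}%")  # prints "Response rate: 14.8%"
```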


Purpose of the Evaluation
Purpose

The purpose of this evaluation is to examine the effectiveness of training for the MoodleRooms LMS. Knowing that a change in learning management systems is a massive undertaking for a university of any size, the plan has been to provide effective and efficient training in both face-to-face and online settings. Because there are numerous styles of teaching with an LMS and various tools available within it, the evaluation is an assessment of the training materials and staff by faculty who are teaching with MoodleRooms. The evaluation is NOT meant to examine the effectiveness of MoodleRooms as an LMS, the effectiveness of instruction, or the effectiveness of course design. It is simply meant to help improve faculty training procedures.

Evaluation Questions

The main questions of this evaluation are: Were participants made aware of training opportunities? Are training materials helpful and effective? Are training staff members helpful and effective? These main questions were broken into multiple sub-questions to assess various aspects of each concern. The questions examined personal opinions on the helpfulness of materials and staff, whether excitement for online instruction was encouraged, whether communication was effective, and whether faculty would recommend the training opportunities to colleagues.

Impact

The evaluation identified 4 primary stakeholders who could be affected by the results of the report. Since the evaluation is focused on the effectiveness of training materials and staff, the first stakeholders considered were the trainers. The university has 2 full-time trainers, and 2 other faculty representatives also served as trainers during the process. Faculty members are the second group of stakeholders. Faculty need to feel that the training is not a waste of time; the evaluation will therefore greatly influence faculty perceptions and their desire to take part in future training opportunities.
The Information Technology administration was central to the implementation of MoodleRooms and is therefore the third group of stakeholders. The IT administration provided funding for LMS training opportunities and should see a return on investment through faculty support. The final group of stakeholders is the academic administration. These members were instrumental in the adoption of a new LMS and provided much insight into the desires of the greater faculty, as well as capital funding for the project.


Background Information
Origin

Union University is a private, Christian university serving over 5,300 students, with nearly 300 full-time faculty members, over 200 part-time faculty members, 250 full-time staff and an annual budget of over 94 million dollars. The university has 3 campuses in Tennessee: the 270-acre Jackson campus is the primary campus, and satellite campuses in metro Memphis and metro Nashville offer adult and graduate level degrees. The university is currently adopting online delivery for many courses and degree programs.

The university had used its previous learning management system, BlackBoard, to supplement traditional face-to-face courses. A desire to prepare for disaster recovery and to become competitive in the online education arena drove the university to increase the capacity of the former BlackBoard system. The growing cost of supporting BlackBoard and a desire to find a more robust LMS led the university to establish a committee to evaluate a variety of LMS solutions. The committee recommended MoodleRooms to the faculty in February 2012. The LMS was fully adopted during the summer of 2012, and all faculty members were expected to be "somewhat active in the LMS environment," including completion of at least one training session, by spring 2013. The LMS is now beginning to play a significant role in content delivery for many courses and for several adult and graduate degree programs.

Standards and Goals

The training staff set the following goals and standards for training:

Goal 1 – Training materials are considered helpful by faculty members

Many faculty members are inexperienced in using instructional technology and have never taught a course in an online format. The training must give a helpful step-by-step demonstration of creating and maintaining an online course. While the step-by-step approach might seem basic to many, the training team finds it necessary to cover every aspect from a basic level and work toward a more advanced level.
Goal 2 – Training staff is considered helpful and knowledgeable by faculty members

Recognizing that many faculty members are hesitant to create online versions of their courses, the training staff is aware of the need to be engaging, helpful and knowledgeable so that faculty are more willing to complete the training sessions. An unenthusiastic training staff could spell disaster for the entire implementation of MoodleRooms as the new LMS.

Goal 3 – Training opportunities are clearly communicated to faculty members

Having experienced trouble with communication to faculty in the past, the training staff is aware of the challenge of announcing each training opportunity, including the online training course, to all faculty members. Training is to be encouraged by academic administration during greater faculty meetings and by deans or department chairs during smaller college, school or departmental meetings. The training staff also needs to encourage training attendance among faculty members.

Goal 4 – Department chairs, deans and academic leadership actively support training

The training staff realizes that support from department chairs, deans and academic leadership will be necessary to encourage hesitant faculty members to participate in online education and training. Although every course is required to have an online component, some faculty members are still reluctant to attend training sessions and build online components for their courses. Encouragement from leadership is desired.

Previous Products

The university had used its previous learning management systems, WebCT and BlackBoard, to supplement traditional face-to-face courses for over 12 years. BlackBoard replaced WebCT 8 years ago, when BlackBoard purchased WebCT. Faculty members had not been required to use BlackBoard, as in most cases it was used only for optional supplemental materials. Very few courses were delivered strictly through BlackBoard. After the February 5, 2008 tornado devastated much of the main campus, the university decided to take a serious look at disaster recovery preparedness.

© Union University – Aftermath of the Feb. 5, 2008 Tornado

One decision was to create an online version of each course to be used in time of emergency. It was determined that the current BlackBoard system was insufficient for supporting all coursework. The costs associated with upgrading the BlackBoard system were considerable, and the system had other insufficiencies stacked against it. As part of the big picture of disaster preparedness, the IT staff began examining off-site data storage. The cost increased exponentially with the BlackBoard system in place because all files associated with the LMS had to be stored on local servers, and those files would need to be backed up off site as well.

The university also began investigating its ability to compete in the online education arena. The university has been known for providing high-quality education for nearly 200 years, but had not entered the online realm as of 2010. The administration saw a need to expand course offerings to include online and distance education programs and degrees. These 2 issues compounded to increase the need to replace BlackBoard, and MoodleRooms was brought in to replace the outdated system.

Users Involved

The move to MoodleRooms was initiated with an 11-member LMS committee. The university selected a team of 7 faculty, 2 administrators and 2 IT staff members to begin the process of selecting a new LMS in October 2011. The faculty representatives were wide-ranging in subject area, years of instruction and technology proficiency. Two full-time trainers, Tabitha Washburn and Robin Navel, and 2 faculty members, Eric Marvin and Anna Clifford, served as trainers for this implementation. The online training session was adapted from a MoodleRooms training template by our LMS coordinator, Tabitha Washburn. Academic administrators notified all faculty members of training during faculty meetings, and deans and department chairs were to encourage training at smaller group meetings and in passing conversation. Two face-to-face training sessions were offered the week before the Fall 2012 semester began; four other sessions were offered during the fall semester. The online training session was activated in August 2012 and remains active today.

Characteristics

The training was offered in a face-to-face format 6 times throughout the Fall 2012 semester and is continually offered online. Faculty members were required to complete MoodleRooms Training 1, the only required session, by the end of the Spring 2013 semester. As of June 30, 2013, 126 faculty members had completed the face-to-face training session and 277 had completed the online training session. The training session included a hands-on tutorial provided by one of 4 trainers.
The session covered all aspects of MoodleRooms, including:

- Account creation
- Setting up new courses
- Using course builder tools
- Content management
- Assignment grading
- Assessment tools
- Student management
- Analytic tools

The training session was designed as a 2-hour overview held in a 40-seat computer lab, where each participant was able to use a computer in a real-time course development demonstration. The participants were given numerous materials, including handouts, useful websites, links to training videos and long-term support information. The training ended with a review, but no post-training survey was given.


Evaluation Design
Design Model

The decision-making evaluation model was used for this evaluation. This model was chosen for its focus on quality and effectiveness. The evaluation was based on qualitative data and was entirely summative in nature. The goal of the evaluation was to rate the effectiveness of training materials and training staff, as well as to gain a glimpse of potential communication hindrances between the training staff and faculty members. This evaluation is one of several that will be conducted on the effectiveness of the transition from the prior BlackBoard system to MoodleRooms at Union University.

Two full-time IT staff members developed the MoodleRooms training: one is the learning management system administrator and the other is an instructional designer. An IT staff member who was not directly involved in training or the selection of MoodleRooms developed the survey and conducted the evaluation. The data gathered came strictly from participants in MoodleRooms training. While each participant was able to complete the goals of the training session, the evaluator sought to analyze whether the training met the goals listed in the Background Information section. The evaluator worked with trainers to determine the best process for assessing the effectiveness of training. The evaluation was never intended to measure the effectiveness of MoodleRooms course delivery, design or any other aspect; it was only designed to gain feedback from faculty members on the helpfulness of training and the knowledge gained through it.

Sampling

The evaluation was open to all full-time faculty members. The evaluator sent an email request with a link to the evaluation form to the full-time faculty email list, giving a potential for nearly 300 responses.
The group of faculty included many who had never used the BlackBoard system, dozens who had previously used Moodle or MoodleRooms at prior institutions, and a large number who had no experience at all with learning management systems. Participants are at varying levels of technology usage and integration of technology into classroom materials. The sample also included faculty members from various academic backgrounds across the 7 schools or colleges within the university structure. Several faculty members worked in business or medical fields prior to joining the faculty and therefore had no prior knowledge of learning management systems, at this university or any other.

Data Collection Process

On July 11, an invitation to participate in the MoodleRooms training survey was sent to all full-time faculty members. The email indicated that the survey was available for completion until July 25, giving respondents a 2-week window. The survey collected basic demographic indicators as well as feedback regarding training materials, staff and knowledge of training opportunities. It used multiple-choice questions, rating scales and semantic differential scales. Only 2 questions, both demographic, were required. Data was compiled using a Google form, which also populated a spreadsheet so that the data could be analyzed and compared across various demographics. After the cutoff date of July 25, the evaluator analyzed the data, which was available both in spreadsheet form and graphically.

Evaluator's Program Description Graphic Representation


Results of Evaluation

Demographic Questions

Q1) Participants were asked to identify with one school, college or institute of instruction. The College of Arts and Sciences had the most respondents with 9, followed by both the College of Education and Human Studies and the School of Nursing with 7 respondents each. The School of Business registered 6 respondents while the School of Theology and Missions registered 4. The School of Pharmacy registered only 3 participants and the Institute for International and Intercultural Studies registered 1.
[Bar chart: respondents by school, college or institute]

Q2) Participants were asked how many years they had been employed by the university with 9 claiming 1-5 years. Both 6-10 years and 11-15 years had 8 respondents each. The remaining categories (less than 1 year, 16-20 years, 20-30 years and 30+ years) all registered 3 respondents each.
[Bar chart: respondents by years employed, from less than 1 year to 30+ years]

Q3) Participants were asked how they would describe their experience with the previous BlackBoard system. 14 claimed to be very experienced while 13 claimed no experience at all. 10 respondents said they were somewhat experienced. Q4) Participants were asked if they had any experience with MoodleRooms prior to employment at Union. 12 responded yes and 25 responded no.


Q5) Participants were asked whether, prior to the university's selection of MoodleRooms as the new LMS, they had been asked for input by a member of the LMS Committee. The majority of respondents, 29, replied that they had no input on the LMS decision. 7 claimed to have had some input while 1 did not remember.

Q6) Participants were asked whether, prior to the university's selection of MoodleRooms as the new LMS, they had participated in any meetings or surveys about their personal desires for the new LMS. An overwhelming majority of 35 said no, they had not participated in any meetings, while 2 said they had participated in either meetings or surveys.

Communication Questions

Q7) Participants were asked who first told them of the transition to MoodleRooms. 19 respondents said their dean had first mentioned it while 11 claimed to hear from their chair first. 5 claimed to hear it first from the Vice President for Academic Administration and 2 claimed a member of the IT staff first mentioned it.
[Bar chart: source of first announcement (Dean, Chair, VP for AA, IT Staff)]


Q8) Participants were asked how many training opportunities they had been made aware of. 13 respondents said they were aware of 4 training sessions. 11 respondents were aware of 6 sessions and another 11 were aware of 2. 1 respondent was made aware of only 1 session and 1 said they were not aware of any training sessions. No respondents replied with 3 or 5 training sessions.
[Bar chart: number of training sessions respondents were aware of, 0 to 6]

Q9) Participants were asked who announced the training opportunities to them. 17 of the respondents identified their chair as announcing training opportunities while 14 identified their dean. 3 were made aware of training from an IT staff member and 2 from the VP of Academic Administration. 1 claimed to not have been made aware of training.
[Bar chart: who announced training (Dean, Chair, VP of AA, IT Staff, not made aware)]


Q10) Participants were asked if they were aware of online training opportunities. 23 respondents said yes, they were aware of online training opportunities while 8 said they were not aware of online training opportunities. 6 claimed to be somewhat aware of online training opportunities.

Q11) Participants were asked if they attended a training opportunity. 20 respondents said they had attended a training opportunity while 17 said they had not.

Q12) Participants were asked if they had participated in an online training opportunity. 19 respondents said they had not participated in online training while 18 said they had.

Face-to-Face Training Questions

Q13) Participants were asked if training opportunities were offered at a convenient time. 10 respondents said yes and 10 said no, with 15 indicating that training was somewhat convenient.

Q14) Participants were asked how they would describe the training materials, rating them as very helpful, somewhat helpful or not helpful. 7 said the materials were very helpful while 3 said they were not helpful. 11 indicated the training materials were somewhat helpful.
[Bar chart: helpfulness of training materials]


Q15) Participants were asked to describe the training materials as easy to understand, somewhat confusing or very confusing. 14 respondents said the material was easy to understand and only 1 respondent described the materials as very confusing. 6 respondents said the material was somewhat confusing.
[Bar chart: clarity of training materials]

Q16) Participants were asked to rate the training materials on a scale of 1 to 5 with 1 being poor and 5 being excellent. 4 participants rated the materials as a 5 (excellent). 5 rated the materials as a 4 (good) and 9 rated the materials as a 3 (average). Only 3 participants rated the materials as a 2. None of the respondents rated the material as 1 or poor. 43% rated the materials as a 4 or 5 and 43% rated the material as a 3.

[Bar chart: training material ratings, Poor to Excellent]
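The two 43% figures quoted for Q16 can be checked with a small tally. A hypothetical Python sketch (the counts are copied from Q16 above; the variable names are purely illustrative):

```python
# Q16 ratings of the face-to-face training materials (1 = poor ... 5 = excellent).
# Counts are taken from the report; no respondent chose 1.
ratings = {2: 3, 3: 9, 4: 5, 5: 4}

total = sum(ratings.values())                       # 21 raters in all
top_share = (ratings[4] + ratings[5]) / total * 100
mid_share = ratings[3] / total * 100

print(f"Rated 4 or 5: {top_share:.0f}%")  # about 43%
print(f"Rated 3:      {mid_share:.0f}%")  # about 43%
```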


Q17) Participants were asked to describe the training staff using very helpful, somewhat helpful or not helpful. 12 respondents said the training staff was very helpful while 9 said the staff was somewhat helpful. No respondents claimed the staff was not helpful.
[Bar chart: helpfulness of training staff]

Q18) Participants were asked if the trainer made the training session interesting. Only 4 respondents said yes, while 15 said somewhat. 2 respondents thought the trainer did not make the training session interesting.
[Bar chart: did the trainer make the session interesting]

Q19) Participants were asked to rate the training staff's knowledge of MoodleRooms on a scale of 1 (poor) to 5 (excellent). Half of the respondents rated the trainers' knowledge as a 4, while 6 respondents rated it as a 3 and 4 rated it as a 5.
[Bar chart: trainer knowledge ratings, Poor to Excellent]


Q20) Participants were asked if they left the training session excited about using MoodleRooms. 7 respondents said yes while 4 said no. 10 respondents said they were somewhat excited after the training.
[Bar chart: excitement about MoodleRooms after training]

Q21) Participants were asked if they would recommend the training session to colleagues. 18 respondents said they would recommend the training session and 5 said they would NOT recommend the training session.

Online Training Questions

Q22) Participants were asked how they would describe the online training materials using very helpful, somewhat helpful and not helpful. The majority of respondents, 13, found the online materials somewhat helpful, while 3 claimed the materials were very helpful. Only 1 respondent said the online materials were not helpful.
[Bar chart: helpfulness of online training materials]


Q23) Participants were asked to describe the online training materials using the terms easy to understand, somewhat confusing and very confusing. 14 respondents claimed the online materials were easy to understand and 3 said the online materials were somewhat confusing. None of the respondents claimed the online materials to be very confusing.
[Bar chart: clarity of online training materials]

Q24) Participants were asked to rate the online training materials on a scale of 1 (poor) to 5 (excellent). 6 respondents each rated the online materials as a 3 and as a 4. Only 3 respondents rated the material as a 5, and 1 respondent each rated the material as a 1 and as a 2. This shows that 88% of respondents rated the material as average or above.
[Bar chart: online training material ratings, Poor to Excellent]

Q25) Participants were asked if the online training session built excitement about MoodleRooms. 6 respondents said yes while 3 respondents said no. The majority of respondents, 8, said the training had somewhat built excitement about MoodleRooms.
[Bar chart: excitement about MoodleRooms after online training]


Q26) Participants were asked if they would recommend the online training to colleagues. 79% of respondents said yes, while 5 respondents, or 21%, said no.

Final Questions

Q27) Participants were asked to rate their overall training experience. This rating did not differentiate between face-to-face and online training sessions. The participants were asked to rate the training on a scale of 1 (poor) to 5 (excellent). 20 of the respondents rated the training as a 4, or above average, while 3 rated the training as a 2, or below average. 9 respondents rated the training as a 3, or average, with 3 rating the training as a 5, excellent, and 1 rating it as a 1, or poor.
[Bar chart: overall training ratings, Poor to Excellent]

Q28) Participants were asked to rate the transition to MoodleRooms on a scale of 1 (very difficult) to 5 (very easy). 20 respondents rated the transition as a 4, while 1 respondent rated it as a 2. 13 respondents rated the transition as a 3 and 2 rated it as a 5. None rated the transition as very difficult.
[Bar chart: transition ratings, Very Difficult to Very Easy]


Discussion of Results
Demographic Questions Discussed

The highest number of respondents came from the College of Arts and Sciences. However, since this college has the most faculty members, it also had the lowest percentage of respondents per division. The Institute for International and Intercultural Studies had the highest percentage, with 1 respondent out of 3 faculty members. There was a wide range of tenure represented, with the average length of tenure of respondents being between 6 and 15 years. 65% of respondents had previous experience with the BlackBoard system, with 38% claiming to be very experienced. Only 12 respondents had used MoodleRooms prior to their tenure at Union, leaving 68% of respondents with no MoodleRooms experience.

Communication Questions Discussed

Only 19% of respondents were asked for input about the new LMS by a committee member, and 95% of respondents were not involved in any meetings or surveys about the new LMS prior to selection. When asked who first told them about the transition to MoodleRooms, 51% of respondents said they had learned of it from their dean and 30% from their chair, showing that, for the most part, the transition to MoodleRooms was communicated by the proper leadership. Some confusion appears when considering how many training sessions each respondent was aware of. The numbers were scattered across the spectrum, with 35% of respondents aware of 4 training sessions, 30% aware of only 2 and another 30% aware of all 6 sessions. Training opportunities were communicated properly in most cases. Question 7 could show a breakdown in communication, as the dean or chair was supposed to share news of the transition to MoodleRooms. However, since participants were not asked to indicate their rank, it is not clear whether a dean or chair took the survey and thereby skewed this result.
78% of respondents had at least some familiarity with the online training sessions, while 22% were not aware of the opportunity for online training. 20 respondents attended a face-to-face training session and 18 participated in an online training opportunity, with 1 respondent participating in both. One interesting point is that in the School of Nursing, all 7 respondents heard about the training opportunities from their dean. However, all 7 also heard about only 2 of the 6 training sessions offered, and just 3 of the 7 were familiar with the online training opportunities.

Face-to-Face Training Questions Discussed

The majority of respondents said that training was offered at an at least somewhat convenient time, with 71% responding favorably to the training times. 18 of 21 respondents claimed the training materials were at least somewhat helpful, with only 3 stating the materials were not helpful. 14 of 21 respondents thought the training materials were easy to understand; however, 33% of those attending a face-to-face training session found the materials somewhat or very confusing. Overall, the training materials received average or above ratings, with only 14% of ratings below average. The training staff was seen as helpful, with no one rating the staff as not helpful. However, only 4 of 21 respondents thought the trainer made the session interesting. All respondents thought the trainer's knowledge of MoodleRooms was at least average, with 70% rating the trainer's knowledge above average or excellent. 17 of the 21 respondents left the face-to-face training session at least somewhat excited about moving forward with MoodleRooms; only 4 respondents were not excited after the training session. Overall, 78% said they would recommend the face-to-face training session to their colleagues.

Online Training Questions Discussed

Overall, the online training session received good reviews. 16 of 17 respondents described the online training materials as at least somewhat helpful. 14 of the 17 thought the materials were easy to understand, and the other 3 thought the materials were somewhat confusing. 88% of respondents thought the online training materials were at least average quality, and 53% claimed the material was above average or excellent. 14 of the respondents were at least somewhat excited about MoodleRooms after completing the online training, while 3 respondents were not. 79% of respondents said they would recommend the online training to their colleagues.

Overall Training Questions Discussed

Overall, the training experiences, both face-to-face and online, received positive reviews from the faculty members who completed the survey. 89% of respondents rated the training as average or above, with 64% rating the training above average. 97% of respondents felt the transition to MoodleRooms has been mostly easy, with only 3% disagreeing. Both of the final 2 survey questions are good indicators of the effectiveness of the training sessions, in the opinion of faculty. Interestingly, those with no MoodleRooms experience found the training sessions to be more helpful and less confusing than those who had previous experience at another institution.
Of those with prior MoodleRooms experience, 11 found the training materials to be somewhat helpful and 9 claimed them to be somewhat confusing. Of those with previous MoodleRooms experience, 8 said they would NOT recommend the training session (whether online or face-to-face), while only 4 said they would recommend it.


Conclusions and Recommendations
Immediate Conclusions

- MoodleRooms training has been well received by faculty members and is on track to encourage online educational opportunities for our students.
- Those with prior MoodleRooms experience found the training to be somewhat lacking. Perhaps training should not be required for new faculty with prior MoodleRooms experience.
- With 24% of respondents finding the training material somewhat confusing, there needs to be an examination of what could be confusing participants.
- Communication surrounding the MoodleRooms selection and training should have been centralized through campus email rather than distributed through deans and chairs.

Long Range Planning

After analyzing the data from each respondent, it is clear that we are on the right track in the styles of training offered and the materials used. The training is well received overall and should be continued for new hires. Repeat face-to-face training sessions should be considered for those who are still not comfortable designing a course in MoodleRooms. Some of the training material in both versions of the training appears to be a bit confusing for trainees. All material should be reviewed and examined for clarity by a review team consisting of non-MoodleRooms-trained personnel. Future training opportunities would be better communicated through faculty-wide email than through dean-to-faculty or chair-to-faculty means. One other goal should be to raise respondents' opinions from average to above average across the board on training materials and staffing.

Evaluation Insights

I found the training to be well received and organized. Faculty members seemed to enjoy the training sessions and were generally positive about the training opportunities. When designing a survey, it is important not to leave room for error, as I did in question 7. The question asked from whom participants heard about training opportunities. The correct answer should have been the chairs of each department. However, chairs heard about training from their dean, and deans heard about it from the VP for Academic Administration. Therefore, if a dean or chair took the survey, the answer could appear wrong in the results but be perfectly fine according to the structure of communication. One of the biggest challenges for me with this evaluation was determining how to make it attainable. My original idea for the evaluation was massive and received critical feedback from my peers and professor for being too large. The struggle became choosing what to evaluate, knowing that it all needs to be examined.
I chose to focus solely on the training aspect of the MoodleRooms transition and am thankful that was all I chose to evaluate. I feel that the process was rewarding for me and will be rewarding for our department in days to come.


Appendix A: Survey Question Form







Appendix B: Email to Faculty

Hello,

By now you are all aware of our recent transition to MoodleRooms from the former BlackBoard system. Last summer we introduced our faculty members to MoodleRooms and began promoting the transition. We now find ourselves 1 year into the transition and want your feedback. Obviously, a transition such as this one has required a lot of work by many staff and faculty members. We are grateful for their commitment to this project.

We ask that you take a moment and participate in a brief survey (survey link). The survey focuses on the training offered for the new MoodleRooms LMS. Please answer the questions as accurately as possible. The survey will be available from July 11 until July 25.

Thank you in advance for your participation. Your input helps ensure a high quality of service for all faculty and staff.

Josh Simmons Manager, Campus Media Services


Appendix C: Project Timeline

Date       Project Task                                         Notes
June 19    Find an evaluation                                   Evaluation will be on MoodleRooms implementation
June 30    Begin developing questions for assessment
July 5     Determine scope of evaluation
July 7     Final project formal proposal                        Proposal was critiqued as being too much for course evaluation work
July 9     Re-evaluate scope of evaluation                      Evaluation will only be on training
July 10    Email sent to faculty
July 11    Link must be active to Google form
July 18    Check to see that results are coming into the form   Very few results are recorded
July 25    Form closes                                          Only 37 results are recorded; I later realize this is nearly 15%, a good percentage for a voluntary survey
July 28    Begin formatting data collected
July 31    Evaluation due for peer review
August 2   Evaluation returned from peer review
August 4   Final evaluation project due