
Reflections

EDUC 765: Trends and Issues in Instructional Design

By: Lynda L. Rassbach

Submitted June 20, 2016

The successful man will profit from his mistakes and try again in a different way.
~ Dale Carnegie

When I enrolled in EDUC 765 Trends and Issues in Instructional Design, I thought, “I got this!” After
all, I have almost thirty years of teaching experience, and I’ve been involved in curriculum
development projects during my tenure at Wisconsin Indianhead Technical College (WITC). In my
limited experience, instructional design was taking an instructional objective, pulling out a tool box of
teaching strategies, selecting a best practice that ramped up the fun factor, creating a video or
interactive e-book, and, then, making it all look “pretty”. I was both humbled and surprised to see how
far off the mark I was! During the past eight weeks I have learned to justify an instructional need,
identify learner characteristics, and refine desired performances into a finite set of performance-based,
observable objectives. Instructional design is clearly more than selecting teaching strategies, designing
instructional websites and making worksheets.
Instructional design is the systematic process that takes training from its initial conception to its final
delivery. Instructional design is spending time in interviews with subject matter experts, focus groups
and workers identified as exemplars. It is getting into the trenches with workers and managers to learn
what they do and how they do it. It is discovering the “who”, “what”, “why”, “when” and “how” of a
performance. It is understanding how people learn, being able to translate learning theory into practice,
and then, assessing the learning in authentic ways which link directly to the training goals and
objectives. Instructional designers are the quintessential life-long learners who apply the best in
educational research and the latest in new technologies to develop training materials and assessments
that ensure that training actually does what it was designed to do.
My design project, Data and Evidence Analysis, is my first experience in formal instructional design.
During my deliberations, I gained an appreciation for the detail-oriented mind of the instructional
designer and have discovered that I have taken a lot for granted in my own curriculum development
experiences. Coming from within the umbrella of the Wisconsin Technical College System (WTCS), where the Worldwide Instructional Design System (WIDS) reigns supreme and terminal objectives are handed out like popsicles on a hot day, has encouraged a sense of laziness relative to course design. If I
teach a course, there is no front end analysis or goal analysis. I don’t have to stop and identify learner
characteristics. I’m not required to select a learning theory. I just log into the curriculum bank, pull out
a course outcome summary and start creating learning activities.
Within the course outcome summary, the WTCS provides the course goal and terminal objectives. The
objectives are intentionally designed to be “fuzzy” – no clearly identified performance, conditions, or
criteria – because of the variety of programs that may use a course and the variety of instructors that
teach each course across the state. Without access to the front end analysis and the goal/task analysis, it
is easy to misinterpret the intent of the fuzzy terminal objectives. The result is that I often find
myself teaching a course that carries the same course number but differs markedly in
instructional level, delivery mode and assessment from the version taught by another faculty member at my own college.
In fairness to learners, that should not be happening!
One of the most painstaking things I have ever done is to sit alone in my study and attempt to bring my
project, Data and Evidence Analysis, to life. The WTCS gave a rudimentary beginning to the course
with three objectives that can be summarized as “provide faculty with opportunities to demonstrate that
they can use data to make decisions." However, to know and understand a course is to have
participated in its development. I missed integral parts of that development: the needs analysis, the
background interviews with the subject matter experts, managers and faculty, the presentation of
research and the brainstorming sessions. I was forced to make assumptions that may or may not
convey the meaning intended by the development group. For example, when the course outcome
summary states that faculty are to "use statistical measures," does that mean they should calculate those
measures? In my mind, unless you understand how to calculate a statistical measure, you can't
interpret statistical measures. Fuzzy objectives are dangerous!
In the process of reverse engineering the fuzzy course outcomes without access to the “why” or “what”
or “how” of the goal analysis, course content started spontaneously generating. The more I worked to
refine those fuzzy objectives, the more the content emerged from out of the “fuzz.” It reminded me of
the magic black snakes you can buy on the Fourth of July. In fact, I have a “black snake” dilemma!
During the goal analysis, more content emerged than will fit within the confines of an eight-hour
training session. Moral to the story? I overthought the intent of the original course development team.
Some of the content will need to be sacrificed.
As I struggled to de-fuzz-ify performances in my reverse-engineered goal analysis, I discovered some
important take-aways worth sharing. First and foremost, don’t skip the front end analysis; the front end
analysis along with the goal analysis significantly shortens the development time in the long run and
leads to clear intent in instruction. Second, avoid black snakes by writing clearly defined objectives
that leave little room for decoding errors. A third take-away also comes from my goal analysis – good
objectives are difficult to develop when they aren’t shared with another person. Unless the objectives
have a chance to come out of your head and hang in the air where they can be looked at and listened to,
considered by many different sets of ears and eyes, and then, questioned and refined, they are a nesting
ground for “black snakes.” To make it to some form of course goal and terminal objectives that will
make sense to future learners, the objectives need to be volleyed about and polished to the point where
anyone should be able to pick up an objective and understand the performance described there.
Working collaboratively with the group of experts on the Faculty Quality Assurance System design
team that brainstormed the data analysis competencies would have brought clarity to the obfuscatory
objectives in the WTCS course outcome summary.
Another take-away from the goal analysis – sigh of relief – course development is a process. In that
knowledge comes permission to say, “Good enough!” and move along in course development. I finally
pulled back from my attempts at defining and redefining my fuzzy objectives and moved into the
instructional development phase of my design. In that process of moving a step forward in
development, an interesting thing happened! New ideas, new ways of getting at my goal in shorter
amounts of time emerged. It was reassuring to discover that course development is a process with no
right or wrong answer; this first go-round is a work in progress.
As I moved from the goal analysis into designing learning activities, I selected the learning theory and
motivational theory I would use as a framework for instruction. I’m excited to look at instruction
through a new set of eyes, but this too has been a painstaking journey. For me the old adage, “You
teach like you were taught," is true. I went from middle school to high school to college to graduate
school watching mathematics being presented by an instructor standing in front of the classroom, chalk
in hand. Day by day – year after year – the cycle was embedded deep within me, “This is how you
teach mathematics!” Therefore, when I stepped into the classroom for the first time, I mimicked my
favorite sage from the stage. For a time, it worked. Now, thirty years later, as I step onto the stage in a
developmental math class and twenty sets of bright eyes go dim, I realize that my adopted theory of
learning and motivation – mind dump by telling learners everything they need to know, give lots of
examples, and assign homework – isn’t working. The difficult part of identifying a problem is that you
need to dig into the “why’s” and the scary thing about the “why’s” is that I have discovered that I need
to take ownership of the problem!
What does ownership look like? Ownership is acknowledging that I am responsible for teaching with
the best that research has to offer for a particular set of learner characteristics. Therefore, the time I
spent studying learning theories in this course was invaluable. I discovered that I can teach from the
shoulders of giants like Malcolm Knowles, Lev Vygotsky, Mohamed Chatti and John Keller. As luck
would have it, our module on learning theories was complemented by my attendance at a four-day New
Mathways Project (NMP) seminar at the Dana Center at the University of Texas. Both experiences
have changed the way I view my role as an instructional designer and the importance of incorporating
learning theory and educational research into course design. Let me explain.
The NMP is an approach to improving student success and retention in undergraduate
mathematics courses; it is grounded in educational research. The NMP uses a spiral curriculum based on
Vygotsky’s constructivism; learners construct meaning through collaborative activities within the zone
of proximal development and build new knowledge that links to prior knowledge. The built-in
scaffolds inspired my enabling objectives during my goal analysis. The curriculum design of the NMP
also fits well with Knowles’ theory of andragogy – adults learn best in collaborative group settings with
open-ended questions that are grounded in real world problems related to life and employment. These
real world problems require constructive persistence – struggling with minimal guidance to build new
knowledge from old knowledge in a safe, supportive environment (Dana Center, 2016). This message
from the NMP – real problems, collaborative learning, open-ended problems, safe/supportive
environment, and integrating new knowledge with prior knowledge – is the reason I selected social
constructivism as my learning theory for my final project.
Along with elements of social constructivism, the nature of the course, Data and Evidence Analysis,
makes it easy to incorporate elements of Chatti’s Learning as a Network (LaaN) theory. Why? Adults
learn best when new knowledge is linked to old knowledge. However, where that knowledge is stored
is irrelevant. Knowledge can be stored on a computer’s hard drive sitting on a desktop or in the elusive
cloud halfway around the world. Either way, knowledge is accessible with the swipe of a finger.
Knowing how to determine what knowledge is needed should be the heart of instruction. Learning,
then, is about forming networks of knowledge both inside and outside of the brain and organizing that
network for retrieval when information is needed. LaaN theory suggests that identifying the data
network at WITC, introducing learners to key personnel from the Office of Institutional Effectiveness,
introducing data analysis tools to access and analyze the data, and developing skill in breaking a
problem down into measurable outcomes should be the focus of the instructional design in my Data and
Evidence Analysis course.

Keller’s ARCS motivational theory suggests that there are four key elements – attention, relevance,
confidence and satisfaction – which must be present within each learning activity to encourage and
sustain learners’ motivation. In order to get (and keep) each faculty member’s attention in the course I
am designing, content will be taught in the context of real world applications. You can’t get more real
world than to ask faculty to bring problems to the training session from their own teaching and learning
experiences. With problems in hand, faculty will be introduced to the program data profile (PDP) – a
one-stop-shop for most of their data needs. Basing instructional design on real, up-to-date datasets
drawn from actual student performance and allowing learners to select problems linked to their own
teaching and learning experiences will diminish the “You’re wasting my time! I could be getting ready
for class!” comments usually associated with training sessions.
Faculty will come to training with mixed feelings about the subject matter and varied levels of
confidence. For this reason, instructional materials will be broken into smaller “chunks” followed by
practice in small groups for peer-to-peer feedback. Course outcomes and learning objectives will be
provided in advance so that, if they choose to do so, faculty can present documents from their personal
portfolios as evidence that they have already met course outcomes. Approaching statistics with
manageable modules and allowing opportunities for faculty to be rewarded for outcomes they have
already mastered will be an essential boost to learner confidence and satisfaction.
Three activities come to mind when I consider milestones in my development over the past eight
weeks: the teach-back on learning theories, the online discussions and the self-reflection activities.
The learning theory teach-back was particularly rewarding. Just as I was reading about educational
research and learning theories in class, I participated in the NMP training in Texas. I had the
opportunity to read about social constructivism from the viewpoint of Lev Vygotsky and then see it
played out in the Dana Center’s NMP curriculum. Along with the teach-back, I found the network of
resources on the discussion board invaluable. I am not much of a “working in groups” person – at least
I didn’t think so. But, some valuable information came from those interactions. Peers from across the
world, some already working as instructional designers, came together and shared experiences which
allowed me to feel that I was part of a supportive community of learners. They asked questions that I
didn’t realize should be asked. Without the input of my peers, I am certain I would not have learned as
much as I did so quickly. Their input was informative – and oddly enough, reassuring!
Both the instructional design competencies organizer and this project reflection offered
opportunities for self-reflection, which I have found invaluable. Stopping to evaluate where I have been
and where I am now and deciding where I still need to go is a requirement of continuous improvement.
I can appreciate the abilities that I already possess which make me a good candidate for instructional
design -- writing skills, creative vision, a love of learning, technological savvy -- and I can identify those
skills which I still need to develop -- applying learning theory, creating graphic designs for use in my
courses, using Captivate, practicing interdependence. Taking the time to reflect and make adjustments
to underlying assumptions is the component of double-loop learning theory that ensures that I am
focused on continuous improvement and reassures that I am ready to move past the evaluating level to
the creating level of Bloom’s revised taxonomy.

With my focus on using my newfound knowledge to create something from nothing, I have set three
goals relative to instructional design. Over the next two months, I will:


- Complete the next course in the instructional design certificate at UW-Stout
- Create the first learning module in Data and Evidence Analysis with an August 22, 2016, launch date
- Create learning materials for a training need expressed at Red Cedar Church in Rice Lake, Wisconsin

The latter goal comes from a mentorship training I attended through my church. With no overarching
goal and terminal objectives, the training bounced all over the place, and at the end of the training,
everyone involved looked around with the proverbial “What just happened?” look on their faces – more
evidence for the significance of a front end analysis! Called by the challenge to put my newfound skills
into practice, I volunteered to complete a goal analysis and create an online mentorship resource.
Stakeholders are meeting in two weeks to begin the process. I look forward to developing my skill as
an instructional designer by participating in my first-ever front end analysis and goal analysis – no
reverse-engineering required!

References
Argyris, C. (1976). Single-loop and double-loop models in research on decision making.
Administrative Science Quarterly, 21(3), 363-375. Retrieved from
http://www.jstor.org/stable/2391848
Chatti, M. A. (2013, January 11). The LaaN theory [Web log post]. Retrieved from
http://mohamedaminechatti.blogspot.com/2013/01/the-laan-theory.html
Digital Promise. (2016). Designing for adult learners. Retrieved from
http://digitalpromise.org/wp-content/uploads/2016/03/designing-for-adult-learners.pdf
Keller, J. M. (2013). ARCS explained: What are the ARCS categories? Retrieved from
http://www.arcsmodel.com/#!arcs-categories/c1zqp
The Charles A. Dana Center, University of Texas at Austin. (2016). Retrieved from
http://www.utdanacenter.org/higher-education/higher-education-resources/new-mathwaysproject-evaluation-and-research/new-mathways-project-annotated-bibliographies/