
AHE 571: Procedures & Rationale
By Christine Griffin and Kara Fuhrmeister

This project was done on behalf of the Washington State Guide By Your Side Program. This document includes Steps 1-4 of the procedures and rationale project requirements as outlined in the AHE 571 syllabus.


Table of Contents

Step 1: Procedures and Rationale
Step 2: Procedures and Rationale
Step 3: Procedures and Rationale
Step 4: Procedures and Rationale


Step 1: Procedures & Rationale

Briefly describe the site/organization, research population, relevant history/background:
Guide By Your Side (GBYS) is a program of Washington Hands & Voices. This organization provides non-biased information about communication modalities and support to families of children, ages birth to 21, with a suspected or diagnosed hearing loss. This includes children who have hearing loss plus additional disabilities such as blindness, cognitive, developmental, or medical needs.
Trained Parent Guide employees are parents who have the shared experience of raising a child who is deaf or hard of hearing, and who work one on one with families referred to the program. Our mission statement: "What works for your child is what makes the choice right." Each Parent Guide's initial training is extensive. Ongoing conference calls and an annual training are also offered to better prepare Parent Guides in their work with families.
Those who will be surveyed in this evaluation are parents, or family members, who have been supported by a Parent Guide. Once a family's case has been officially closed, a survey will be sent to evaluate the effectiveness of the service received. Since the start of the program six years ago, families using this program have not been surveyed. Because the GBYS coordinator works only 20 hours a month, there was never sufficient time allotted to set up protocols and procedures and to analyze data.
Who are the educators who need the research/evaluation?
As part of our new contract with the Office of the Deaf and Hard of Hearing, a division of DSHS, we are required in our scope of work to provide annual data reporting on performance-based outcomes. Requirements of this reporting include mixed methods of quantitative and qualitative data analysis.
What do the educators wish to learn from the research or evaluation?
Christine Griffin, the program coordinator of Guide By Your Side, will use the information collected to analyze the effectiveness of, and identify gaps in, the staff's professional training.
Why is this information needed?
Collected data will be useful to plan future continuing education training for the Parent Guide staff. In addition, data collected will be useful for strategic planning and to offer stakeholders an understanding of the program, and hopefully show the growing effectiveness of the program's staff of trained Parent Guides.
Having both quantitative and qualitative responses will be very useful in grant-writing efforts to grow the program's capacity in order to offer more families support and information throughout Washington State. With more families utilizing the services from the Guide By Your Side program, parents will be better prepared to make informed decisions based on their child's educational, social and emotional, and safety needs.
The vision of the Guide By Your Side program is to engage families in their child's development, which will aid in the improvement of child and family outcomes.

Step 2: Procedures and Rationale

Brief Explanation of Evaluation Context:
Guide By Your Side is an organization that provides non-biased information about communication modalities and support to families of children, ages birth to 21, with a suspected or diagnosed hearing loss. Our group's purpose is to create a mixed methods instrument for the Guide By Your Side program to use to measure patron satisfaction once individual families are no longer in need of the organization's services. In addition to the exit-survey instrument itself, we will need to design a protocol for administering the survey (to be used by the trained Parent Guide staffers) as well as outline a protocol and method(s) to use to analyze the data once it is collected. We would also like to note here that we understand that the procedures and rationale project differs from the instrument design and testing project; it is just difficult to explain one without the other in this outline, as they are so closely tied.
What We Plan to Submit:
To complete this project, we plan to submit a collectively authored paper that would loosely model an evaluation report. Although the writing of each section has been assigned to one person in particular, we still plan to be very much involved in the selection of content for each area. The components would be as follows:

Background/Context: Christine's Section
o Including a description of the organization, their concerns, as well as their needs for evaluation.

Evaluation Methods: Kara's Section
o This will include a discussion and justification of the selected evaluation type, a description and justification of the data collection plan, as well as a protocol for data analysis.
o If there is not too much overlap between assignments here, we would also like to do a brief overview of the survey instrument itself.

Evaluation Theory: Christine's Section
o This section of our report will discuss how our chosen methodologies align and contrast with various evaluation paradigms.

Discussion of Results: Kara's Section
o At the very least, this section will discuss how we feel our evaluation methodologies have met the needs of the Guide By Your Side organization.
o While we will do our best to mitigate weaknesses, we would also like to discuss and acknowledge any potential issues or biases we see in our evaluation methodologies here.
o Actual results: The Guide By Your Side organization sorely needs this formative evaluation plan and instrument. At the moment, it is unknown whether we would have enough data to discuss in our AHE report, as this survey would only be administered to families upon their exit from the program (families exit on varying time frames). That said, it is also a goal to survey the 25 families that the Guide By Your Side program supported in 2015.

References: Both Kara and Christine
o We will be sure to cite a minimum of four different textbook chapters throughout this report.

Justification of this Outcomes Plan:
We feel that by following this plan, we will be able to produce a well-rounded, quality product for submission at the end of the quarter. Each section of our procedures and rationale plan was carefully crafted to include all of the important concepts outlined in the course syllabus. For example, by including a section in our paper that discusses and justifies our evaluation methodology, we hope to convey our understanding of course concepts and our ability to design an evaluation that is both appropriate and useful. In addition, we added a theories section to our modified evaluation report. It is our hope to use this section to demonstrate our knowledge of evaluation paradigms and to show that we made conscious design decisions that were grounded in established principles.

Step 3: Procedures and Rationale

Overview:
With new funding for Guide By Your Side comes a required annual evaluation summary of the program. As was mentioned before, the Guide By Your Side program provides information and one-on-one support by trained Parent Guides to assist Washington State families of children who are deaf, hard of hearing, or deaf-blind.
Guide By Your Side is a program of Hands & Voices, which is an international organization. Many years ago, headquarters staff created a generic Parent Satisfaction survey for state Guide By Your Side programs to use. Historically, this survey only included questions about how the family would rate their interaction with the Parent Guide. We will keep this component but seek to diversify the survey content into some broader categories as well.
Our updated mixed-methods survey is meant to be completed by every family who works with a Parent Guide. This will be a developmental evaluation (i.e., continuous), but we will divide the survey data that we collect within each fiscal year to use for stakeholder reporting. Since Christine works as both a coordinator and as a Parent Guide to families in northwest WA, the survey is designed to be anonymous and confidential to eliminate a conflict of interest.
Progress Report:
Our work has been to update the Parent Satisfaction survey and make it into a Parent Input survey that would additionally provide family outcome information to stakeholders. In this way it would include more transformational information. The family outcomes we have chosen to focus on in the evaluation survey are:
1. Parents will have an increase in knowledge about resources, current issues, and supports available to them to assist their child with hearing loss.
2. Parents will have a better understanding of whatever system their child is involved in (this will vary).
3. Parents will increase their level of confidence in their ability to make decisions that will best serve their child's communication and developmental needs.

Steps Taken:
o We spoke with leaders at headquarters, asking if we could alter the Parent Satisfaction Survey. They not only replied that it was fine, but they welcomed it. Other states have reported that they have also wanted to update the survey. Unfortunately, no one has had the time to complete this task until now.
o We have reviewed different samplings of parent surveys from various programs to see what different questions are asked. From here, we decided on the questions and how they would be designed.
o Christine attended an online webinar presented by evaluators of the CO-HEAR program in Colorado, which administers biannual surveying of the families they serve. (The CO-HEAR program is an early intervention program that provides direct professional services to families, such as working with families on language development.)
o We have inquired with the other states that offer the Guide By Your Side program for feedback on which ways they have sent the survey and whether they found they had more response using a particular method over another (i.e., electronically or by mail).
o We crafted a test survey and cover letter that have been sent out to 8 pilot study participants. In addition, we asked the participants a few questions about their thoughts when taking the survey that would give us feedback on the time it took to complete the survey, whether they thought the questions were redundant, and any edits they would suggest.
o We have written out the protocol steps for administering this survey. We are willing to edit the protocol steps after information from other programs and the test survey results are in.
o At present we are deciding on the best way to store and analyze our data. As there is a minimal budget, paying for expensive data analysis software is not going to happen. Currently the evaluation population is so small that it may be feasible to run our own descriptive statistical analyses in Excel and perhaps even to code and store our qualitative data in a Word document. Ideally, though, we would use an online survey method to track and analyze data. When a mailed survey comes in, we could enter it manually into the online form so that it would all be stored in the same place.

Questions:
1. Is it crazy to consider doing data analysis manually vs. finding a software program to do it for us? Does anyone have any experience or input here? We have looked at Survey Monkey and Lime Survey. Survey Monkey seems much more user friendly; however, there are limitations with the free version, which is what we would use (i.e., the survey can only be issued 100 times).
2. What potential problems do you see with regard to biases and validity? And why?
3. Are there other suggestions you'd like to offer?

Step 4: Procedures and Rationale

Background/Context:
With the advent of universal newborn hearing screening, 2-4 babies per 1,000 (American Academy of Pediatrics, 2010; Bagatto, Scollie, Hyde, & Seewald, 2010; Watkin & Baldwin, 2011) are now being identified with hearing loss before their first birthday, and most before three months of age. This is a dramatic change from 15 years ago, when this type of testing was not utilized to screen infants. Based on the number of births in Washington State, this translates into approximately 160 babies annually (Washington State Dept. of Health, Early Hearing-loss Detection, Diagnosis and Intervention, 2015). To ensure these families enter early intervention services and supports, the Joint Committee on Infant Hearing (JCIH) determined a system of care that families would benefit from following a hearing loss diagnosis. One portion of this 12-goal system of care is "Goal 9: All Families Will Have Access to Other Families Who Have Children Who Are D/HH and Who Are Appropriately Trained to Provide Culturally and Linguistically Sensitive Support, Mentorship, and Guidance" (JCIH, 2007). This means that families have the right to be referred to a formalized deaf and hard of hearing (D/HH) parent-to-parent support program in the states that provide these services.
Stakeholders have identified Hands & Voices as a credible, unbiased parent support organization that provides one-to-one parent support through their Guide By Your Side (GBYS) program. The GBYS program provides parent-to-parent support across the state by trained individuals called Parent Guides. In August 2010, Washington's GBYS program began supporting families through a pilot project funded by the Washington State Department of Health's Early Hearing-loss Detection, Diagnosis and Intervention Program. Now in our 6th year of operation, the GBYS program is gaining momentum, supporting more families than in previous years. In November 2015, the GBYS program signed a new contract with the Office of the Deaf and Hard of Hearing (ODHH) that will allow GBYS to grow and support more families.
With new funding from ODHH, the Washington State chapter of GBYS has been asked to demonstrate a commitment to consumer input through the use of an annual parent satisfaction survey and is required to submit an annual report of the following information:
1. Percentage of parents who have completed the survey.
2. The average response score for each statement in the survey.
3. A summary of comments written by participants.

4. Contractor or subcontractor action plan to monitor services that need improvement.
A primary aim of this evaluation is to collect information to show whether GBYS is meeting its targets. The Guide By Your Side organization has outlined three target outcomes of their services:
1. Parents will have an increase in knowledge about resources, current issues, and supports available to them to assist their child with hearing loss.
2. Parents will have a better understanding of whatever system their child is involved in (this will vary).
3. Parents will increase their level of confidence (self-efficacy) in their ability to make decisions that will best serve their child's communication and developmental needs.
In addition to measuring how well GBYS is meeting these outcomes, we would also like to collect data about the performance of the individual Parent Guides who serve each family, family demographics, and opinions of the GBYS program itself.
Evaluation Methods:
A General Overview:
Our team decided to utilize a developmental evaluation design model because it is the intention of the Guide By Your Side organization to evaluate and make changes to its program operations on a continuous basis. Mertens and Wilson (2012) explain that developmental evaluation in no way "replace[s] formative and summative [evaluations]," but "[serves] an additional distinct purpose" (Mertens & Wilson, 2012, p. 280). In other words, this evaluation will be neither formative nor summative because data will be collected in a monthly, ongoing manner. Each month, the newly collected information will be added to an existing database. We are not working towards a final outcome but instead seek to continually improve the GBYS program's usefulness for its constituents. Findings will still need to be reported to the Office of the Deaf and Hard of Hearing stakeholder group annually, yet improvements to the program will continue to be made throughout the year, and evaluation efforts will carry on long past each reporting deadline.
To support our developmental evaluation design, we chose to create a census survey testing instrument to track and measure longitudinal trends. The benefit of such an instrument is that it will allow us to collect and examine changes to parents' perceptions of, and experiences with, the Guide By Your Side program over time (Gay, Mills, & Airasian, 2012, p. 185). It was also important for us to ensure that the instrument could be administered as a census survey. According to Gay et al. (2012), census surveys are usually conducted "when the population is relatively small and readily accessible" (p. 184). Although the number of families that GBYS serves at any given time is growing, GBYS currently estimates that the evaluation population will be only 20-30 families per year. By attempting to gather feedback from all families exiting the program, we hope to maximize the data we have available for analysis.
Yet another advantage to conducting a survey is that it enables one to collect data in a relatively short time frame (Gay et al., 2012). As GBYS is under a deadline to get an initial report to stakeholders before summer, this will be especially important for their first few evaluation cycles.
Instrument Development
A lot of thought went into the development of our survey instrument. It was
developed in a series of phases that included a study of existing instruments, some
formal survey design training, a review of pertinent literature, collaboration with the
parent organization, as well as a pilot test. We used a sample survey from the Guide
By Your Side headquarters as a starting point.
To gain an understanding of how program evaluation is done by other organizations, Christine attended an online webinar, "Early Intervention Program Accountability," provided by the National Center for Hearing Assessment and Management on January 21, 2016. In the webinar, presenters shared their experience of administering a parent survey to the families to whom they had provided early intervention services. A 6-point Likert scale was discussed, along with the reasons for its use. The presenters also shared their experience in obtaining an optimal return rate, which they stated was achieved with the mailed version.
The Hands & Voices headquarters staff was also contacted in order to gain permission to adapt their version of the Parent Satisfaction survey. They replied that this was not only all right with them, but that they welcomed our changes, as they plan to revamp their survey within the next year. We asked if Hands & Voices wanted to standardize the survey for all GBYS programs to use in an effort to provide national data among the programs. Unfortunately, this is not something they are prepared to take on, so it is up to the individual programs to administer their own parent surveys. Christine then reached out to other GBYS coordinators for feedback on their own evaluation protocols and to see what was successful for them. Of those who replied, all expressed an interest in developing a new survey, but most had little to share.
We have since made many changes to the survey instrument, as the original survey was almost entirely focused on collecting data about the Parent Guides who work with each family. Parent Guide performance is something that still needs to be evaluated, but as was discussed in the background section of this report, it is evident that GBYS also wanted to collect data that could be more transformative for the program as a whole. It was therefore decided that the survey would include some of these elements of the parent satisfaction survey, but would have added questions to collect demographic information, examine the impact that GBYS services had on families, and evaluate parents' overall feelings about their experience with GBYS.
Once we had a strong draft of this survey, we gathered a group of 9 Parent Guides (GBYS staff members) to administer a pilot test. The pilot test included an emailed cover letter, an informed consent form, as well as a link to the survey itself on Google Forms. Pilot test participants were encouraged to take the survey and then respond to specific feedback questions (e.g., "Did any of the questions seem vague or unclear?"). In addition to the requested written feedback, the pilot test subjects also participated in a conference call to discuss their experiences with the survey. Based on our pilot test findings, our team worked to identify areas in the survey instrument that necessitated improvement in order to roll out the final testing instrument at the end of March.

Glitches with Google Forms were a large issue that became apparent to us through this pilot review process. We have since changed our survey housing to Surveymonkey.com. However, because GBYS does not have the budget to pay for an upgraded account on Survey Monkey, we have had to shorten our survey from 12 questions to 10. The questions that we selected to cut gathered data that could be easily and legally collected using GBYS's existing records, or they were added onto a different question to make it a two-part question.
Instrument Content:
We opted to take a mixed methods approach to this survey. An opinion in support of mixed methods is cited by Mertens and Wilson (2012): "we as evaluators share a moral imperative to conduct mixed methods evaluations because of the complexity of the contexts in which we work." It was clearly important for GBYS to have access to both qualitative and quantitative data. As evaluators, we strove to utilize the strengths of these two methodologies to our advantage. At last edit, the GBYS survey was 10 questions long: 7 of the questions are quantitative, and 3 allow for an open-ended/qualitative response. Two of the quantitative questions utilize a Likert scale, others are multiple choice, and the remaining questions allow for a multiple-answer selection.
Data Collection:
*Note: A complete walk-through of this process was submitted as attachment "Survey Protocol" to our Step 1: Survey Instrument Rationale.
Families are only surveyed once they have officially exited the program and their case files have been formally closed. During the file closing process, it is written into our survey protocol that Parent Guides are to inform the families of the forthcoming survey. It is during this time that Parent Guides will give families the option of receiving either an electronic or mailed version of the GBYS Parent Satisfaction Survey. We hope that by providing this option, we may be able to increase the number of surveys returned. This information will then be reported to the WA GBYS coordinator during the Parent Guides' monthly reporting process. Surveys will then be sent out in the mode selected by each individual family.
Regardless of the administration method, all surveys are strictly anonymous and confidential. A Notice of Informed Consent is also included in each survey cover letter. Using SurveyMonkey.com for electronic survey takers and the postage-paid postcard (see instrument protocol), we will be able to ensure the anonymity of each survey recipient, while still being able to track which families have submitted surveys. This will allow us to follow up with families we have not heard back from, as is outlined in the instrument protocol document.
Data Analysis
Survey responses that are sent in by mail will have to be manually entered into SurveyMonkey.com to unify them with the online data, where descriptive statistical analyses and a qualitative summary will take place. The qualitative data will be analyzed first so as to limit the potential for bias in the analysis process (if we had trends in mind based on earlier quantitative findings, this might inadvertently impact what themes we identify in the qualitative responses). The process of theme identification will take place in order to analyze the compiled qualitative responses. Through this method, we will identify patterns and emergent ideas within the qualitative data collected from the open-ended questions on the survey (Gay, et al., 2012, p. 469). Key phrases from the survey responses will be coded, sorted, and stored in a database throughout the year in order to report commonly stated thoughts and opinions to stakeholders. This monthly analysis approach will also make for a less work-intensive reporting process to the ODHH at the end of each annual reporting cycle. The qualitative data may also be used in brochures and in presentations for the GBYS organization.
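The coding-and-tallying workflow described above could be sketched in a few lines of Python. The theme codes and responses below are purely hypothetical placeholders for illustration, not actual GBYS survey data:

```python
from collections import Counter

# Hypothetical coded responses: each open-ended answer has been manually
# assigned one or more theme codes by the evaluator (codes are illustrative).
coded_responses = [
    ["emotional_support", "resource_knowledge"],
    ["resource_knowledge"],
    ["communication_choices", "emotional_support"],
    ["emotional_support"],
]

# Tally how often each theme appears across all responses.
theme_counts = Counter(code for response in coded_responses for code in response)

# List themes from most to least common for the stakeholder summary.
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

Keeping the codes in a simple structure like this (or equivalently, one column per theme in a spreadsheet) makes the monthly tallying step mechanical once the manual coding is done.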
For all quantitative response questions, we will conduct frequency measurements so that we can clearly see the number of times each variable occurred (Gay, et al., 2012, p. 322). Using the frequency values, percentages can then be calculated for all quantitative data to portray the popularity of certain responses even more clearly. Mean scores will be calculated for all questions that utilize a Likert scale so that the organization can see an average value for these types of questions. Although mean scores are the most common way to measure central tendency, a downfall is that they can be affected by outliers (Gay, et al., 2012, p. 326). Because of this, we will also include standard deviation scores for all values that had a mean score calculated. This will be especially helpful to show how spread out responses were to each question that utilized a Likert scale. Tables that include response numbers, mean scores, and standard deviations will be included in the report shared with the ODHH stakeholder group.
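As a sketch of the descriptive statistics described above, the frequency counts, percentages, mean, and standard deviation for a single Likert-scale question can be computed with Python's standard library alone (the response values below are invented for illustration, not real survey data):

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical responses to one 6-point Likert question
# (1 = strongly disagree ... 6 = strongly agree); values are illustrative only.
responses = [5, 6, 4, 6, 5, 3, 6, 5]

# Frequency of each response value, and the percentage it represents.
freq = Counter(responses)
percentages = {value: 100 * count / len(responses) for value, count in freq.items()}

# Mean shows the average rating; sample standard deviation shows the spread.
avg = mean(responses)
spread = stdev(responses)

print(f"mean = {avg:.2f}, sd = {spread:.2f}")
for value in sorted(freq):
    print(f"rated {value}: {freq[value]} responses ({percentages[value]:.1f}%)")
```

The same calculations are available as the Excel functions COUNTIF, AVERAGE, and STDEV, so this analysis remains feasible under the program's minimal budget either way.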
Evaluation Theory:
The Guide By Your Side Parent Input Survey is framed by the transformative, pragmatic, and constructivist paradigms. Each of these paradigms is metaphysical in nature and is comprised of four assumptions: axiology, ontology, epistemology, and methodology (Mertens & Wilson, 2012, p. 36).
It's important to first identify the transformative paradigm and the Disability and Deaf Rights Theory as an overarching context for our parent survey. Disability and Deaf Rights Theory is part of social justice theory (Mertens & Wilson, 2012, p. 179), and emphasizes equality with regard to communication and the ability to have a voice in the process of determining outcomes.
Though social justice is not directly mentioned in the mission of the Hands & Voices Guide By Your Side program, the goals that the Parent Guides emphasize with families regarding the child's communication, social and emotional, and self-efficacy outcomes are rooted in social justice theory. The goal that is most consistently focused on with families is "Goal 8: Ensure that families' voices are heard in the systems that are serving [the families]" (Hands & Voices, 2008). Essentially, this means that their child's equitable rights to communication must be provided throughout their daily life. In the study "Parents' Needs Following Identification of Childhood Hearing Loss," a parent shares that parent support impacts a parent's decision making for their child's services, and that this in turn will impact the child's outcomes as well (Fitzpatrick, et al., 2008). In this way, parent support addresses the issues of "power inadequacies" (Mertens & Wilson, 2012, p. 163) and is in line with the four primary principles of the transformative paradigm's axiology. These are the importance of being culturally respectful, the promotion of social justice, the dedication to furthering human rights, and the addressing of inequities (Mertens & Wilson, 2012, p. 164).
In keeping with the integrity of the Disability and Deaf Rights Theory, family voice is central. To integrate the transformative paradigm into our parent input survey, we have designed it to include open-ended questions. This allows the family to respond positively or negatively on the Parent Guide's ability to support their family outcomes.
Another component of the Guide By Your Side survey is to determine the effectiveness of the program itself. The pragmatic paradigm is one that focuses primarily on data that are found to be useful by stakeholders (Mertens & Wilson, 2012, p. 88), and it can include mixed methods (qualitative and quantitative) to evaluate the GBYS program. The pragmatic paradigm contains the Use Branch theory. This theory is flexible in nature, allowing us to make the best use of the data collected. The axiological assumption allows us to gather the information most useful to our program. Additionally, the ontological and epistemological assumptions provide for individual interpretation of reality, and freedom for the evaluator to deem what is appropriate to study, respectively (Mertens & Wilson, 2012, p. 91). Due to these aspects, our team can highlight the focus of the GBYS program (Mertens & Wilson, 2012, p. 90), emphasizing the work that the GBYS program values most.
Lastly, the constructivist paradigm focuses primarily on qualitative methods to gain a better understanding of the lived experiences of the participants (Mertens & Wilson, 2012, p. 133). As part of the Values Branch, it seeks to obtain views and beliefs from participants in the evaluation. In this case, GBYS is interested in learning about the perceptions families have of raising a child with hearing loss as well as their experience with the program.
The families that the Parent Guides connect with come from a variety of cultures and socioeconomic backgrounds. The authors, Mertens and Wilson (2012, p. 137), explain, "Constructivists hold that there are multiple, socially constructed realities, ...realities that are constructed by individuals through reflection upon their experience and in interaction with others." Simply put, this means everyone creates their perceptions of the world based on lived experiences. By connecting families with a Parent Guide, the GBYS program is curious whether a parent's initial perceptions have transformed, making them a better advocate for their child, or more knowledgeable about their child's communication needs.
Discussion of Results:
As was discussed in the background section of this report, we took into account the needs of two groups when designing this evaluation. The first is the Guide By Your Side organization itself. The second is the Office of the Deaf and Hard of Hearing (ODHH), GBYS's major stakeholder group, which had made the initial request for a report. This evaluation methodology and the survey instrument are a blended response to the requests of these two groups. With GBYS's 3 program outcomes in mind, we developed a tool to evaluate the variety of content they wanted to assess and went on to collect the type of data requested by the stakeholder group.
Guide By Your Side will now be able to collect more transformative data as opposed to data that is only centered on individual Parent Guides' performance. In addition to this Parent Guide information, they will be able to track and measure the growth of families' knowledge and confidence in navigating the systems that are in place for their Deaf or hard of hearing child. This type of information will allow the organization to develop the action plan required of them by the ODHH. This may mean developing the program in certain areas or perhaps putting on continuing education sessions for their Parent Guides in order to improve areas targeted for growth throughout the evaluation process. Either way, they will now be able to gather the information needed to make these informed decisions.
In addition to this action plan, the Office of the Deaf and Hard of Hearing will now have an annual report sent to them with the accompanying descriptive statistics and summary of qualitative responses. This will allow them to monitor the success and well-being of the GBYS program from afar, while also tracking their plans for growth and improvement.
Limitations
Although we believe that our evaluation design is sound and that it has met the needs of the ODHH and GBYS organizations, we would still like to acknowledge some potential limitations. One such limitation is with the survey instrument itself, because it collects only self-reported data. While quite convenient, this type of evaluation methodology does not involve direct observation of the behaviors in question "to confirm what people say is really what they believe or feel. In addition, an evaluator cannot know for certain whether a participant interpreted the questions in a way they were intended to be read" (Mertens & Wilson, 2012, p. 374). We hope to have mitigated some of these issues, and thus enhanced the validity of our findings, by going back through the survey in step three of our instrument review process to revise any questions with built-in biases or phrasing that might lead survey recipients toward a certain type of response. We also conducted a thorough pilot review in hopes of identifying any areas that might be confusing or repetitive to survey respondents.
To help increase the validity of our data, we have designed this survey to be both anonymous and confidential. We certainly want families to feel comfortable expressing honest thoughts and opinions about their experiences with Guide By Your Side. "The promise of anonymity or confidentiality will increase truthfulness of responses as well as percentage of returns" (Gay et al., 2012, p. 192).
This leads us to another issue we may face: a survey mortality of sorts. Despite the promise of confidentiality and anonymity, if families prove reluctant to respond to our survey, poor response rates would certainly affect the evaluation's validity and usefulness as a whole. With such a small evaluation population and a census survey design, it will be even more important to gather feedback from as many families as possible. We have attempted to mitigate this by having Parent Guides inform families, during the case closing process, that a survey will be arriving. During this time, Parent Guides are also instructed to ask families whether they prefer to receive their survey by mail or by email. From there, we have created a progression of follow-up activities, as described in our survey protocol document. Research suggests that "first mailings will typically result in a 30% to 50% return rate, and a second mailing will increase the percentage by about 20%; mailings beyond a second are generally not cost effective in that they each increase the percentage by about 10% or less" (Gay et al., 2012, p. 193). Our plan is to send out the survey by the chosen method (mail or email) up to two times. If enough time has gone by (as designated in our protocol), then one last attempt to secure a response will be made using the survey delivery option not yet used. According to the previous quote from Gay et al. (2012),
these actions would hopefully result in a minimum return rate of around 60%. "The rule of thumb for a survey response rate, based on good sample, is 50%. Anything above 50% will increase the confidence with which you can speak about your findings" (Gay et al., 2012, p. 193). Hopefully, these precautions can help secure GBYS a functional amount of data; a poor survey response rate has the greatest potential to undermine the validity of our findings.
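The follow-up arithmetic above can be sketched in a few lines of code. This is only an illustration of the rule-of-thumb figures quoted from Gay et al. (2012); the per-contact gains are the conservative ends of their ranges, and the population size in the example is hypothetical, not actual GBYS case data.

```python
# Illustrative sketch of the cumulative return-rate estimate described above,
# using the rule-of-thumb gains quoted from Gay et al. (2012): roughly 30%
# from a first contact, about +20% from a second, and +10% or less from a third.
GAINS = (0.30, 0.20, 0.10)  # conservative per-contact gains (assumption)

def estimated_return_rate(contacts: int) -> float:
    """Conservative cumulative response rate after `contacts` survey attempts."""
    return sum(GAINS[:contacts])

def expected_responses(population: int, contacts: int) -> int:
    """Expected completed surveys for a census of `population` families."""
    return round(population * estimated_return_rate(contacts))

# Two sends by the chosen method plus one final attempt by the alternate
# method = three contacts, or roughly the 60% floor cited in the text.
print(round(estimated_return_rate(3), 2))  # about 0.6
# Hypothetical closed-case population of 40 families:
print(expected_responses(40, 3))  # 24
```

Under these assumptions, three contacts clear the 50% rule of thumb, which is why the protocol caps follow-up at a third attempt rather than continuing with diminishing returns.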
One final issue that should be discussed is that the program coordinator for Guide By Your Side, Christine Griffin, is also the co-creator of this evaluation. Christine will be the individual in charge of evaluating and reporting data findings in the long term, and she also works occasionally as a Parent Guide herself, making her both an evaluator and part of the evaluand. There is clear potential for bias, because Christine would naturally prefer to receive positive feedback, and this in turn could affect the validity of the evaluation. We hope that much of this potential for bias has been mitigated by Christine forming an evaluation partnership with someone who has no stake in or ties to GBYS. Together, we have taken a very objective approach in creating this survey, and we think that having someone from outside the organization work on the evaluation has been helpful in keeping this goal in perspective. The evaluation team was also able to take advantage of Christine's knowledge of the organization as well as her established relationships with stakeholders. As Mertens and Wilson (2012) state, "Key stakeholders might be crucial for facilitating support for [the] evaluation about how they want to communicate with other stakeholders, about the scope and purpose of the evaluation, [as well as] roles and consequences of the evaluation" (p. 554). Overall, we believe that a positive outcome was gained from this evaluation design partnership. From this point forward, the data analysis plan is fairly straightforward, so the margin for accidental bias should be narrow.
References:
American Academy of Pediatrics. (2010). Evaluation of the universal newborn hearing screening and intervention program. Pediatrics, 126(Suppl.), S19–S27. doi:10.1542/peds.2010-0354F
Fitzpatrick, E., Angus, D., Durieux-Smith, A., Graham, I. D., & Coyle, D. (2008). Parents' needs following identification of childhood hearing loss. American Journal of Audiology, 17, 38–49.
Gay, L. R., Mills, G. E., & Airasian, P. (2012). Survey research. In Educational research: Competencies for analysis and applications (pp. 183–199). Upper Saddle River, NJ: Pearson Education.
Gay, L. R., Mills, G. E., & Airasian, P. (2012). Descriptive statistics. In Educational research: Competencies for analysis and applications (pp. 319–338). Upper Saddle River, NJ: Pearson Education.
Gay, L. R., Mills, G. E., & Airasian, P. (2012). Qualitative research: Data analysis and interpretation. In Educational research: Competencies for analysis and applications (pp. 465–479). Upper Saddle River, NJ: Pearson Education.
Hands & Voices. (2008). Guide By Your Side operations manual.
Joint Committee on Infant Hearing. (2007). Year 2007 position statement: Principles and guidelines for early hearing detection and intervention programs. Pediatrics, 120(4), 898–921. doi:10.1542/peds.2007-2333
Mertens, D. M., & Wilson, A. T. (2012). Planning evaluations. In Program evaluation theory and practice: A comprehensive guide (pp. 262–301). New York, NY: Guilford Press.
Mertens, D. M., & Wilson, A. T. (2012). Evaluation designs. In Program evaluation theory and practice: A comprehensive guide (pp. 303–352). New York, NY: Guilford Press.
Mertens, D. M., & Wilson, A. T. (2012). Communication and utilization of findings. In Program evaluation theory and practice: A comprehensive guide (pp. 475–554). New York, NY: Guilford Press.
Washington State Department of Health, Early Hearing-loss Detection, Diagnosis and Intervention (EHDDI) Program. (2014). EHDDI flow chart of newborn hearing screening, diagnosis and intervention. Retrieved from the Washington State Stakeholder EHDI Meeting, March 2015.