
Assignment of:

Educational Assessment

Topic Name:

Method of Interpreting Results

Submitted to:

Farasat Zahra

Submitted by:

Muhammad Ilyas

Roll No:

Bsf1903667 (BS IT 3rd)

(University of Education Lahore)


Method of Interpreting Results:

Data analysis and interpretation have taken center stage with the advent of the digital age, and the sheer amount of data can be daunting. In fact, a Digital Universe study found that the total data supply in 2012 was 2.8 trillion gigabytes. Based on that amount of data alone, it is clear that the calling card of any successful enterprise in today's global market will be the ability to analyze complex data, produce actionable insights, and adapt to new market needs, all at the speed of thought.

Data dashboards are the digital-age tools for big data. Capable of displaying key performance indicators (KPIs) for both quantitative and qualitative data analyses, they are ideal for making the fast-paced, data-driven market decisions that push today's industry leaders to sustainable success. Through streamlined visual communication, data dashboards allow businesses to engage in real-time, informed decision making, and they are key instruments in data interpretation. First, let's establish a definition to understand what data interpretation means.

What Is Data Interpretation?


Data interpretation refers to the process of reviewing data in order to arrive at an informed conclusion. The interpretation of data assigns meaning to the information analyzed and determines its significance and implications.

The importance of data interpretation is evident and this is why it needs to be done
properly. Data is very likely to arrive from multiple sources and has a tendency to enter
the analysis process with haphazard ordering. Data analysis tends to be extremely
subjective. That is to say, the nature and goal of interpretation will vary from business to
business, likely correlating to the type of data being analyzed. While there are several
different types of processes that are implemented based on individual data nature, the
two broadest and most common categories are “quantitative analysis” and “qualitative
analysis”.

Yet, before any serious data interpretation inquiry can begin, a sound decision must be made regarding the scale of measurement, since visual presentations of data findings are meaningless without it and the choice has a long-term impact on data interpretation ROI. The varying scales include:

• Nominal Scale: non-numeric categories that cannot be ranked or compared quantitatively. Variables are mutually exclusive and exhaustive.

• Ordinal Scale: mutually exclusive and exhaustive categories with a logical order. Quality ratings and agreement ratings are examples of ordinal scales (e.g., good, very good, fair, etc., or agree, strongly agree, disagree, etc.).

• Interval Scale: a measurement scale where data is grouped into categories with orderly and equal distances between the categories. The zero point is always arbitrary (temperature in Celsius is a common example).

• Ratio Scale: contains the features of all three scales above, plus a true zero point (weight and height are common examples). A short sketch of the four scales in code follows this list.
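To make the four scales concrete, here is a minimal Python sketch (not part of the original text) showing which summary statistics are typically appropriate at each scale; all variable names and example values are hypothetical.

# A minimal sketch of the four measurement scales and the summary statistics
# that are typically appropriate for each. All values are invented examples.

from statistics import mode, median, mean

# Nominal: categories with no order -- only counts or the mode are meaningful.
blood_groups = ["A", "B", "A", "O", "AB", "A"]
print("Nominal  -> mode:", mode(blood_groups))

# Ordinal: ordered categories -- the median (middle rank) is meaningful.
agreement = ["disagree", "agree", "strongly agree", "agree", "agree"]
rank = {"disagree": 1, "agree": 2, "strongly agree": 3}
print("Ordinal  -> median rank:", median(rank[a] for a in agreement))

# Interval: equal spacing but an arbitrary zero (e.g., temperature in Celsius)
# -- differences and means are meaningful, ratios are not.
temps_c = [18.0, 21.5, 19.0, 23.0]
print("Interval -> mean:", mean(temps_c))

# Ratio: equal spacing plus a true zero (e.g., weight in kilograms)
# -- ratios such as "twice as heavy" are meaningful.
weights_kg = [60.0, 75.0, 90.0]
print("Ratio    -> 90 kg is", 90.0 / 60.0, "times 60 kg")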

For a more in-depth review of scales of measurement, read our article on data analysis questions. Once the scales of measurement have been selected, it is time to decide which of the two broad interpretation processes will best suit your data needs. Let's take a closer look at those specific data interpretation methods.
Analyzing and Interpreting Information

Analyzing quantitative and qualitative data is often the topic of advanced research and evaluation methods courses. However, there are certain basics which can help to make sense of reams of data.

Always start with your research goals

When analyzing data (whether from questionnaires, interviews, focus groups, or any other source), always start from a review of your research goals, i.e., the reason you undertook the research in the first place. This will help you organize your data and focus your analysis. For example, if you wanted to improve a program by identifying its strengths and weaknesses, you can organize the data into program strengths, weaknesses, and suggestions to improve the program (a short sketch of this kind of grouping follows below). If you wanted to fully understand how your program works, you could organize the data in the chronological order in which customers or clients go through your program. If you are conducting a performance improvement study, you can categorize the data according to each measure associated with each overall performance result, e.g., employee learning, productivity, and results.
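As a purely illustrative sketch of the goal-driven grouping described above, the following Python snippet organizes invented feedback comments into strengths, weaknesses, and suggestions; the category labels and comments are assumptions for demonstration only.

# A hypothetical sketch of organizing raw findings around the research goal
# "improve the program by identifying strengths and weaknesses".

from collections import defaultdict

raw_findings = [
    ("strengths", "Instructors were well prepared"),
    ("weaknesses", "Evening sessions started late"),
    ("suggestions", "Offer a weekend session"),
    ("strengths", "Materials were easy to follow"),
]

# Group every finding under the category dictated by the research goal.
by_goal_category = defaultdict(list)
for category, comment in raw_findings:
    by_goal_category[category].append(comment)

for category in ("strengths", "weaknesses", "suggestions"):
    print(category + ":")
    for comment in by_goal_category[category]:
        print("  -", comment)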

Basic analysis of "quantitative" information

(for information other than commentary, e.g., ratings, rankings, yes's, no's,
etc.):

1. Make copies of your data and store the master copy away. Use the
copy for making edits, cutting and pasting, etc.
2. Tabulate the information, i.e., add up the number of ratings, rankings,
yes's, no's for each question.
3. For ratings and rankings, consider computing a mean, or average, for
each question. For example, "For question #1, the average ranking
was 2.4". This is more meaningful than indicating, e.g., how many
respondents ranked 1, 2, or 3.
4. Consider conveying the range of answers, e.g., 20 people ranked "1", 30 ranked "2", and 20 people ranked "3" (see the short sketch after this list).
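The tabulation and averaging in steps 2-4 can be sketched in a few lines of Python; the rankings below are made-up values chosen to mirror the example in step 4.

# A minimal sketch of tabulating responses and computing a mean ranking
# for a single question. The data is purely illustrative.

from collections import Counter
from statistics import mean

# Rankings given by 70 hypothetical respondents to question #1 (scale 1-3).
rankings = [1] * 20 + [2] * 30 + [3] * 20

# Step 2: tabulate -- count how many respondents gave each ranking.
counts = Counter(rankings)
for value in sorted(counts):
    print(f'{counts[value]} people ranked "{value}"')

# Step 3: compute the mean ranking for the question.
print("Average ranking for question #1:", round(mean(rankings), 1))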

Basic analysis of "qualitative" information

(respondents' verbal answers in interviews, focus groups, or written commentary on questionnaires):

1. Read through all the data.
2. Organize comments into similar categories, e.g., concerns, suggestions, strengths, weaknesses, similar experiences, program inputs, recommendations, outputs, outcome indicators, etc.
3. Label the categories or themes, e.g., concerns, suggestions, etc.
4. Attempt to identify patterns, associations, and causal relationships among the themes, e.g., all people who attended programs in the evening had similar concerns, most people came from the same geographic area, most people were in the same salary range, or respondents experienced similar processes or events during the program (see the short sketch after this list).
5. Keep all commentary for several years after completion in case it is needed for future reference.
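A hypothetical Python sketch of steps 2-4 follows: it groups invented comments into themes with simple keyword matching and then checks one possible pattern (whether concerns cluster among evening attendees). The keywords, comments, and attendee details are all assumptions made for illustration.

# Grouping written comments into themes with simple keyword matching,
# then looking for a pattern across a respondent attribute.

from collections import defaultdict

comments = [
    {"text": "I am worried the sessions run too long", "session": "evening"},
    {"text": "Please add more hands-on exercises", "session": "day"},
    {"text": "The room was too small, a real concern", "session": "evening"},
    {"text": "Great instructor, very clear", "session": "day"},
]

theme_keywords = {
    "concerns": ["worried", "concern", "problem"],
    "suggestions": ["please", "add", "should"],
    "strengths": ["great", "clear", "helpful"],
}

# Steps 2-3: assign each comment to the first theme whose keywords it mentions.
themes = defaultdict(list)
for comment in comments:
    text = comment["text"].lower()
    for theme, keywords in theme_keywords.items():
        if any(word in text for word in keywords):
            themes[theme].append(comment)
            break

# Step 4: look for a pattern, e.g., are concerns concentrated in evening sessions?
evening_concerns = sum(1 for c in themes["concerns"] if c["session"] == "evening")
print("Comments per theme:", {t: len(cs) for t, cs in themes.items()})
print(f"{evening_concerns} of {len(themes['concerns'])} concerns came from evening attendees")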

Interpreting information

1. Attempt to put the information in perspective, e.g., compare results to what you expected or promised; to the expectations of management or program staff; to any common standards for your products or services; to original goals (especially if you're conducting a program evaluation); to indications or measures of accomplishing outcomes or results (especially if you're conducting an outcomes or performance evaluation); or to descriptions of the program's experiences, strengths, weaknesses, etc. (especially if you're conducting a process evaluation). A short sketch of comparing results to targets follows this list.
2. Consider recommendations to help employees improve the program,
product or service; conclusions about program operations or meeting
goals, etc.
3. Record conclusions and recommendations in a report, and associate
interpretations to justify your conclusions or recommendations.
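As a rough illustration of step 1, the short Python sketch below compares measured results against expected or promised targets and flags any shortfall; the measure names and numbers are invented.

# Comparing each measured outcome against the result that was expected
# or promised, and noting where the program fell short.

expected = {"participants trained": 100, "satisfaction rating": 4.0, "completion rate": 0.80}
actual = {"participants trained": 85, "satisfaction rating": 4.2, "completion rate": 0.70}

for measure, target in expected.items():
    result = actual[measure]
    status = "met or exceeded" if result >= target else "fell short of"
    print(f"{measure}: {result} vs. target {target} -> {status} the target")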

Reporting Results
1. The level and scope of content depends on the audience for whom the report is intended, e.g., funders / bankers, employees, clients, customers, the public, etc.
2. Be sure employees have a chance to carefully review and discuss the
report. Translate recommendations to action plans, including who is
going to do what about the research results and by when.
3. Funders / bankers will likely require a report that includes an executive
summary (this is a summary of conclusions and recommendations, not
a listing of what sections of information are in the report -- that's a
table of contents); description of the organization and the program,
product, service, etc., under evaluation; explanation of the research
goals, methods, and analysis procedures; listing of conclusions and
recommendations; and any relevant attachments, e.g., inclusion of
research questionnaires, interview guides, etc. The funder may want
the report to be delivered as a presentation, accompanied by an
overview of the report. Or, the funder may want to review the report
alone.
4. Be sure to record the research plans and activities in a research plan
which can be referenced when a similar research effort is needed in
the future.

Who Should Carry Out the Research?

Ideally, the organization's management decides what the research goals should be. Then a research expert helps the organization to determine what the research methods should be, and how the resulting data will be analyzed and reported back to the organization.

If an organization can afford any outside help at all, it should be for identifying the appropriate research methods and how the data can be collected. The organization might find a less expensive resource to apply the methods, e.g., conduct interviews, send out and analyze results of questionnaires, etc.

If no outside help can be obtained, the organization can still learn a great
deal by applying the methods and analyzing results themselves. However,
there is a strong chance that data about the strengths and weaknesses of a
product, service or program will not be interpreted fairly if the data are
analyzed by the people responsible for ensuring the product, service or
program is a good one. These people will be "policing" themselves. This
caution is not to fault these people, but rather to recognize the strong biases
inherent in trying to objectively look at and publicly (at least within the
organization) report about their work. Therefore, if at all possible, have someone other than those responsible for the product, service or program look at and determine the research results.

Contents of a Research Report -- An Example

Ensure your research plan is documented so that you can regularly and
efficiently carry out your research activities. In your plan, record enough
information so that someone outside of the organization can understand
what you're researching and how. For example, consider the following
format:

1. Title Page (name of the organization that is being researched, or that has a product/service/program that is being researched; date)
2. Table of Contents
3. Executive Summary (one-page, concise overview of findings and
recommendations)
4. Purpose of the Report (what type of research was conducted, what decisions are being aided by the findings of the research, who is making the decision, etc.)
5. Background About Organization and Product/Service/Program that is
being researched
1. Organization Description/History
2. Product/Service/Program Description (that is being researched)
1. Problem Statement (in the case of nonprofits, description
of the community need that is being met by the
product/service/program)
2. Overall Goal(s) of Product/Service/Program
3. Outcomes (or client/customer impacts) and Performance
Measures (that can be measured as indicators toward the
outcomes)
4. Activities/Technologies of the Product/Service/Program
(general description of how the product/service/program is
developed and delivered)
5. Staffing (description of the number of personnel and roles
in the organization that are relevant to developing and
delivering the product/service/program)
6. Overall Evaluation Goals (e.g., what questions are being answered by the research)
7. Methodology
1. Types of data/information that were collected
2. How data/information were collected (what instruments were
used, etc.)
3. How data/information were analyzed
4. Limitations of the evaluation (e.g., cautions about findings/conclusions and how to use the findings/conclusions, etc.)
8. Interpretations and Conclusions (from analysis of the
data/information)
9. Recommendations (regarding the decisions that must be made about
the product/service/program)
10. Appendices: content of the appendices depends on the goals of the research report, e.g.:
1. Instruments used to collect data/information
2. Data, e.g., in tabular format, etc.
3. Testimonials, comments made by users of the
product/service/program
4. Case studies of users of the product/service/program
5. Any related literature

Some Pitfalls to Avoid

1. Don't balk at research because it seems far too "scientific." It's not.
Usually the first 20% of effort will generate the first 80% of the plan,
and this is far better than nothing.
2. There is no "perfect" research design. Don't worry about the research
design being perfect. It's far more important to do something than to
wait until every last detail has been tested.
3. Work hard to include some interviews in your research methods.
Questionnaires don't capture "the story," and the story is usually the
most powerful depiction of the benefits of your products, services,
programs, etc.
4. Don't interview just the successes. You'll learn a great deal by understanding the failures, dropouts, etc.
5. Don't throw away research results once a report has been generated.
Results don't take up much room, and they can provide precious
information later when trying to understand changes in the product,
service or program.
