Topic Name: -
Submitted to: Farasat Zahra
Submitted by: Muhammad Ilyas
Roll No: BSF1903667 (BS IT 3rd)
Data dashboards are the digital-age tools for big data. Capable of displaying key
performance indicators (KPIs) for both quantitative and qualitative data analyses, they
are ideal for making the fast-paced, data-driven market decisions that push today's
industry leaders to sustainable success. Through streamlined visual communication,
data dashboards allow businesses to engage in real-time, informed decision making,
and they are key instruments in data interpretation. First of all, let's find a definition to
understand what lies behind data interpretation: it is the process of reviewing data
through established analytical methods and assigning meaning to it in order to arrive
at relevant conclusions.
The importance of data interpretation is evident, and this is why it needs to be done
properly. Data is very likely to arrive from multiple sources and tends to enter the
analysis process in haphazard order. Data analysis is also highly subjective: the nature
and goal of interpretation will vary from business to business, likely correlating with the
type of data being analyzed. While several different processes are implemented
depending on the nature of the data at hand, the two broadest and most common
categories are quantitative analysis and qualitative analysis.
Yet, before any serious data interpretation inquiry can begin, it should be understood
that visual presentations of data findings are irrelevant unless a sound decision has
been made regarding scales of measurement. The scale of measurement must be
decided for the data before analysis starts, as this will have a long-term impact on
data interpretation ROI. The standard scales are:
1. Nominal scale: non-numeric categories that cannot be ranked or compared
quantitatively, e.g., gender or color.
2. Ordinal scale: exclusive categories that can be ranked in order, though the
distance between ranks is not fixed, e.g., satisfaction ratings from "poor" to
"excellent".
3. Interval scale: numeric values with a fixed, meaningful distance between them
but no true zero, e.g., temperature in Celsius.
4. Ratio scale: numeric values with both fixed intervals and a true zero, e.g.,
weight or revenue.
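The choice of scale determines which summary statistics are meaningful. As a minimal
Python sketch (the summarize helper and its sample data are illustrative assumptions,
not part of any library):

import statistics

def summarize(values, scale):
    """Return a summary appropriate to the data's scale of measurement."""
    if scale == "nominal":
        # Categories with no order: only counts and the mode are meaningful.
        return {"mode": statistics.mode(values)}
    if scale == "ordinal":
        # Ordered categories: the median is safe; a mean assumes equal spacing.
        return {"median": statistics.median(values)}
    if scale in ("interval", "ratio"):
        # True numeric data: means and measures of spread are meaningful.
        return {"mean": statistics.mean(values), "stdev": statistics.stdev(values)}
    raise ValueError(f"unknown scale: {scale}")

print(summarize(["red", "blue", "red", "green"], "nominal"))  # {'mode': 'red'}
print(summarize([1, 2, 2, 3, 5], "ordinal"))                  # {'median': 2}
print(summarize([20.5, 21.0, 19.8, 20.1], "interval"))        # mean and stdev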
For a more in-depth review of scales of measurement, read our article on data analysis
questions. Once scales of measurement have been selected, it is time to choose which
of the two broad interpretation processes best suits your data needs. Let's take a
closer look at those specific data interpretation methods.
Analyzing quantitative information (for information other than commentary, e.g.,
ratings, rankings, yes's, no's, etc.):
1. Make copies of your data and store the master copy away. Use the
copy for making edits, cutting and pasting, etc.
2. Tabulate the information, i.e., add up the number of ratings, rankings,
yes's, no's for each question.
3. For ratings and rankings, consider computing a mean, or average, for
each question. For example, "For question #1, the average ranking
was 2.4". This is more meaningful than indicating, e.g., how many
respondents ranked 1, 2, or 3.
4. Consider conveying the range of answers, e.g., 20 people ranked "1",
30 ranked "2", and 20 people ranked "3" (see the sketch after this list).
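Steps 2 through 4 can be carried out in a few lines of Python. The sketch below
assumes a made-up layout with one list of rankings (1-3) per question; both the
names and the numbers are hypothetical:

from collections import Counter
from statistics import mean

# Hypothetical survey data: each list holds one ranking per respondent.
responses = {
    "question_1": [1, 2, 3, 2, 2, 1, 3, 2, 3, 2],
    "question_2": [3, 3, 2, 1, 2, 3, 3, 2, 1, 3],
}

for question, rankings in responses.items():
    counts = Counter(rankings)          # step 2: tabulate the answers
    avg = mean(rankings)                # step 3: mean per question
    spread = ", ".join(f"{n} ranked '{value}'"  # step 4: range of answers
                       for value, n in sorted(counts.items()))
    print(f"{question}: average ranking {avg:.1f}; {spread}")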
Interpreting information
Reporting Results
1. The level and scope of content depend on the audience for whom the
report is intended, e.g., funders / bankers, employees, clients,
customers, the public, etc.
2. Be sure employees have a chance to carefully review and discuss the
report. Translate recommendations to action plans, including who is
going to do what about the research results and by when.
3. Funders / bankers will likely require a report that includes an executive
summary (this is a summary of conclusions and recommendations, not
a listing of what sections of information are in the report -- that's a
table of contents); description of the organization and the program,
product, service, etc., under evaluation; explanation of the research
goals, methods, and analysis procedures; listing of conclusions and
recommendations; and any relevant attachments, e.g., inclusion of
research questionnaires, interview guides, etc. The funder may want
the report to be delivered as a presentation, accompanied by an
overview of the report. Or, the funder may want to review the report
alone.
4. Be sure to record the research plans and activities in a written research
plan that can be referenced when a similar research effort is needed in
the future.
If no outside help can be obtained, the organization can still learn a great
deal by applying the methods and analyzing results itself. However,
there is a strong chance that data about the strengths and weaknesses of a
product, service, or program will not be interpreted fairly if the data are
analyzed by the people responsible for ensuring the product, service, or
program is a good one. These people will be "policing" themselves. This
caution is not meant to fault these people, but rather to recognize the strong
biases inherent in trying to look objectively at, and report publicly (at least
within the organization) on, one's own work. Therefore, if at all possible,
have someone other than those responsible for the product, service, or
program look at and interpret the research results.
Ensure your research plan is documented so that you can regularly and
efficiently carry out your research activities. In your plan, record enough
information so that someone outside of the organization can understand
what you're researching and how. For example, consider the following
format:
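A minimal sketch in Python follows; every field name and sample value here is
hypothetical, chosen only to illustrate the level of detail worth recording:

# Hypothetical research plan record; fields and values are illustrative only.
research_plan = {
    "title": "Customer satisfaction survey, Q3",
    "research_questions": ["How satisfied are customers with support response times?"],
    "audience": "funders / bankers and internal management",
    "methods": ["questionnaire", "follow-up interviews"],
    "data_to_collect": "ratings (1-5) plus open-ended comments",
    "analysis": "tabulate ratings, compute a mean per question, theme the comments",
    "responsible": "program evaluation team",
    "timeline": "survey in March, report by end of April",
}

for field, value in research_plan.items():
    print(f"{field}: {value}")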
Some Pitfalls to Avoid
1. Don't balk at research because it seems far too "scientific." It's not.
Usually the first 20% of effort generates the first 80% of the plan,
and this is far better than nothing.
2. There is no "perfect" research design. Don't worry about the research
design being perfect. It's far more important to do something than to
wait until every last detail has been tested.
3. Work hard to include some interviews in your research methods.
Questionnaires don't capture "the story," and the story is usually the
most powerful depiction of the benefits of your products, services,
programs, etc.
4. Don't interview just the successes. You'll learn a great deal by
understanding the failures, dropouts, etc.
5. Don't throw away research results once a report has been generated.
Results don't take up much room, and they can provide precious
information later when trying to understand changes in the product,
service or program.