
Uncertainties and the Individual Investigation:

CLARIFICATION ON PERCEIVED IA OVERLAP




The issues of errors and uncertainties are clearly
stated in the Physics Guide under sub-topic 1.2.
Similarly, the treatment of a best-fit linear graph
and its uncertainty range is described under the
mathematical requirements on page 22 of the Guide.
These statements express the minimal experimental
skills expected of students; these skills can be
assessed on the written examinations, particularly
Section A of Paper 3, and, where relevant and
appropriate, in the student’s IA.

However, the details of error analysis are not described in the individual investigation assessment criteria.
Rather, the IA criteria address only general issues about errors and uncertainties, from various perspectives.
The door is wide open to a more advanced treatment of errors and uncertainties in the student’s IA report, but
it is equally open to a lesser treatment, perhaps no treatment at all, depending on the individual
investigation’s purpose and methodology.

Issues of errors and uncertainties are mentioned in three IA criteria.

Exploration. “The methodology of the investigation is highly appropriate to address the research
question because it takes into consideration all, or nearly all, of the significant factors that may
influence the relevance, reliability and sufficiency of the collected data.”

Analysis. “The report shows evidence of full and appropriate consideration of the impact of
measurement uncertainty on the analysis.”

Evaluation. “Strengths and weaknesses of the investigation, such as limitations of the data and
sources of error, are discussed and provide evidence of a clear understanding of the
methodological issues involved in establishing the conclusion.”

https://ibpublishing.ibo.org/server2/rest/app/tsm.xql?doc=d_4_physi_tsm_1408_1_e&part=6&chapter=2

Could the appreciation of data quality, reliability and error analysis be assessed in the same way under three
different criteria? Should the student be penalized twice or three times for the same fault, omission or
mistake? The IB’s philosophy tells us ‘no’. Rather, the assessment of errors and uncertainties is taken
from three different perspectives, each focusing on a different aspect of this important issue within the
student’s individual investigation report.

The IA criteria should not be read in a way that overlaps
the focus of expectations. However, being realistic, experienced
teachers know that a fault in one area may well lead to faults
in others. An ill-planned method may well lead to poor-quality
data. But double jeopardy does not apply in assessment. We
overcome the apparent overlap by emphasizing a different
purpose each time the issue of data quality comes up.

Moreover, the extent to which each of the three perspectives
on errors and uncertainties is executed in an individual
investigation is determined by the research question and the
methodology of the investigation. This is the issue of relevance
(covered in another clarification document).


Exploration Clarification

The Exploration criterion 5-6 mark band states: “The methodology of the investigation is highly appropriate to address
the research question because it takes into consideration all, or nearly all, of the significant factors that may
influence the relevance, reliability and sufficiency of the collected data.”

The relevance of data pertains to the research question and the methodology set forth in the investigation.
The reliability of the collected data concerns an awareness of possible issues of quality, again depending on
the investigation. Even for a data-based investigation, the data quality and reliability need to be appreciated;
perhaps the significant figures and the source of the data require attention too. This could include an awareness
of systematic errors related to assumptions in the method as well as to the calibration of the equipment, and it
might concern the precision and accuracy of information obtained from a database. Numerical values of
uncertainty are not expected under Exploration, but an awareness of the types of uncertainty and their
potential issues is. The sufficiency of data would include the scope and range of data, from the smallest to the
largest value being considered, as well as the number of measurements within that range. Richard Feynman said
that the first and the last data points are the first and last for a reason; the student needs to appreciate this.
Factors here might include limitations of equipment, practical issues of a finite range, and management issues
of limited time and resources.
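
Numerical working is not required at Exploration, but the awareness being asked for can be made concrete. The following is a minimal Python sketch, using entirely hypothetical numbers, of the distinction between a systematic (zero or calibration) error, which shifts every reading and should be corrected for, and random scatter, which limits precision:

    readings = [2.13, 2.09, 2.16, 2.11, 2.14]         # repeated timings in seconds (hypothetical)
    zero_offset = 0.05                                # suspected zero/calibration error in seconds (hypothetical)

    mean_t = sum(readings) / len(readings)
    random_unc = (max(readings) - min(readings)) / 2  # crude estimate of the random uncertainty
    corrected_t = mean_t - zero_offset                # a systematic error is corrected, not averaged away

    print(f"mean = {mean_t:.2f} s  (random scatter ~ +/- {random_unc:.2f} s)")
    print(f"corrected for zero offset: {corrected_t:.2f} s")

At Exploration the student only needs to acknowledge that both kinds of issue exist for their particular method; the numerical treatment itself belongs under Analysis.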


Analysis Clarification

The Analysis criterion 5-6 mark band states: “The report shows evidence of full and appropriate consideration
of the impact of measurement uncertainty on the analysis.”

Establishing the absolute or raw data uncertainty in the relevant data would be appropriate here. Similarly,
the uncertainty in any processed data needs to be considered under Analysis (see Core Syllabus sub-topic 1.2).
Plotting relevant error bars and establishing a gradient and its uncertainty for a linear graph would also be
appropriate under Analysis. The method of determining the gradient uncertainty is up to the student,
however, and does not need to follow the mathematical requirements item about “relative accuracy (by eye)
taking into account all the uncertainty bars” stated on page 22 of the Guide.
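
One common choice, offered here only as an illustration and not as a required method, is to take half the difference between the steepest and shallowest lines that still pass through the first and last error bars. A minimal sketch in Python (NumPy assumed available; all data values hypothetical):

    import numpy as np

    # Hypothetical raw data: applied force (N) and measured extension (m),
    # with an assumed absolute uncertainty in each extension reading.
    force = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    extension = np.array([0.021, 0.039, 0.062, 0.080, 0.101])
    d_ext = 0.004                                   # raw (absolute) uncertainty in extension, in m

    # Best-fit gradient and intercept from an ordinary least-squares line.
    m_best, c_best = np.polyfit(force, extension, 1)

    # Steepest and shallowest gradients consistent with the first and last error bars.
    m_max = ((extension[-1] + d_ext) - (extension[0] - d_ext)) / (force[-1] - force[0])
    m_min = ((extension[-1] - d_ext) - (extension[0] + d_ext)) / (force[-1] - force[0])
    d_m = (m_max - m_min) / 2                       # half the spread as the gradient uncertainty

    print(f"gradient = {m_best:.4f} +/- {d_m:.4f} m/N")

Whatever approach is taken, what matters under Analysis is that the uncertainty in the raw data is carried through to the processed data and to the gradient, so that the conclusion drawn from the graph reflects it.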

Overlap of Exploration with Analysis

The sufficiency of relevant data is under scrutiny in both Exploration and Analysis, but from different
perspectives.

EXPLORATION Focus

The methodology of the investigation takes into consideration the significant factors that influence the
relevance, reliability and sufficiency of the collected data. “Takes into consideration” means that the student
acknowledges these issues. The student does not need to follow through with detailed analysis and treatment,
as these issues come under the Analysis criterion.

ANALYSIS Focus

The report includes sufficient relevant data. Under Analysis, the appropriate treatment of sufficient data is
expected.

Overlap of Analysis with Communication

Consider the example where the student has not recorded uncertainties in the raw data. Is this a failing of
Analysis indicator two, of Communication indicator three, or of both? The answer is that it should be treated
under Analysis, because missing uncertainties do not inhibit communication.

Consider a further example. We know that the absence or partial absence of
recorded measurement uncertainties normally comes under Analysis, but if the
precision of the data and the uncertainties are inconsistent or unclear, then this might
also come under Communication if it causes confusion when reading the report.
In the former case the student may have forgotten uncertainties or made only a
partial attempt at recording them, while in the latter case there might be an
inconsistency between the uncertainties and the significant figures of the data
(for example, a reading quoted as 12.345 s with an uncertainty of ±0.1 s), and
related issues that confuse the reader and make a comprehensive understanding of
the report less than it could have been.

Overlap of Evaluation with Analysis

The data and a conclusion are under scrutiny in both the Analysis and the Evaluation criteria, but each with a
different focus, a different perspective.

ANALYSIS Focus

The conclusion is in the context of the analysis of data. The ‘conclusion’ here is the interpretation of the
processed data, not the conclusion of the research question. In the analysis, it may be concluded that there is
a positive correlation between x and y. An appreciation of relevant errors and uncertainties, the possible
range of values for a best-fit graph line, and so on, is part of an interpretative conclusion based on the data
and data processing. The focus is on the variability and the patterns in the data.

EVALUATION Focus

The conclusion is in the context of the original research question. In the evaluation, the student is expected
to put the conclusion into the context of the original aim. Does the conclusion support the original thinking?
If not, a consideration of why it does not will lead into an evaluation of the limitations of the method and
suggestions as to how the method and approach could be adjusted to generate data that could help draw a
firmer conclusion. The focus is on the limitations of the results.

OCC TSM: https://ibpublishing.ibo.org/server2/rest/app/tsm.xql?doc=d_4_physi_tsm_1408_1_e&part=7&chapter

6 April 2015, Dr. Mark Headlee


SUMMARY:
• Interpret a criterion indicator in a way that does not overlap with
another criterion indicator.
• Each indicator of each criterion focuses on a different aspect of a
common element; e.g. “conclusion” under Analysis is not the
same as “conclusion” under Evaluation.
• If an indicator is not relevant, it can be ignored, like an outlier in a
graphing situation. However, it would be wise for the student to
address the omission by giving a reason for it, e.g. for not
addressing safety issues.
