
Physics

2015 Chief Assessor’s Report

Overview
Chief Assessors’ reports give an overview of how students performed in their school
and external assessments in relation to the learning requirements, assessment
design criteria, and performance standards set out in the relevant subject outline.
They provide information and advice regarding the assessment types, the application
of the performance standards in school and external assessments, the quality of
student performance, and any relevant statistical information.

School Assessment

Assessment Type 1: Investigations Folio


It should be noted that the assessment design criteria of investigation, and of
analysis and evaluation, are key aspects of the investigations folio. They should be
central to teachers’ determination of students’ grades in this assessment type. Most
teachers assessed against an appropriate variety of specific features, but many gave
insufficient weight to specific features I1, AE1, and AE2 when determining the
grade for the folio.

A minority of teachers provided adequate evidence of student achievement against
specific features I3 (manipulation of apparatus and technological tools to implement
safe and ethical investigation procedures) and A3 (demonstration of skills in
individual and collaborative work). In these cases, moderators could more readily
consider student achievement against these specific features when confirming
grades.

The best tasks provided a variety of opportunities for students to achieve against the
performance standards by assessing them on more than one occasion. The better
tasks also allowed students to demonstrate their strengths by being flexible so that
different students could approach the task in different ways. Students were limited in
their responses when, for example, all students in a class completed the same
design practical investigation or answered the same issues investigation question.
The best design practicals and issues investigations were not over-structured, but
were designed in a way that allowed students to show initiative.

The vast majority of teachers followed the subject outline by having students focus
on an issue in their issues investigation, rather than an application. These teachers
provided students with the opportunity to address, in particular, specific feature AE1.
When a question is generated and answered by the student, the student can more
fully address the specific feature.

It is helpful when moderators can easily see that assessment conditions are clearly
indicated and followed, with thought given about how to balance the need for
teachers to verify student work with the need to give students adequate time to
produce work at the required standard. Moderators also need to be clear about what
advice and support were provided to students before they commenced the task, especially
in the design practical and issues investigation; this ensures that moderators are
aware of how much of the work presented is the student’s own. Timed tasks provide
a teacher with more certainty that the work presented is that of the student. However,
timed tasks in the investigations folio may limit the detail that a student can provide
when trying to demonstrate work of a high standard.

A variety of strategies were used to successfully enable students to comply with the
word-limit in the issues investigation. Efficient ways of evaluating the bias, credibility,
accuracy, and suitability of the sources enabled students to devote the majority of
allowed word-count to:

• providing sufficient detail about the physics relevant to the issue
• identifying and demonstrating understanding of the significance of the issue
• summarising findings and drawing conclusions.

Use of ICT (information and communications technology) can communicate
information efficiently, but the task should avoid requiring an ICT presentation that
simply repeats information provided elsewhere. Moderators were instructed to
enforce the word-limit by basing their judgment only on the work provided by the
student up to the 1500-word limit.

Although there is no word-limit in the practical investigations, teachers are able to
control the demands of tasks by limiting the length of the practical component of the
task and the length of the report. Practical investigations that used multiple different
pieces of apparatus or that required extensive reports were more onerous than
necessary.

Most teachers provided students with tasks that had an appropriate amount of
scaffolding. This gives students sufficient guidance to meet the performance
standards and also allows them to demonstrate their achievement at higher grade
bands. A task that is too scaffolded removes the opportunity for a capable student to
demonstrate that the student ‘designs logical, coherent, and detailed physics
investigations’, ‘critically and systematically analyses data … to formulate logical and
perceptive conclusions’, and ‘critically and logically evaluates procedures and
suggests a range of appropriate improvements’ (as in the performance standards).
The nature of some tasks and the scaffolding provided limited some classes to a
maximum B grade or even a C grade.

Assessment Type 2: Skills and Applications Tasks


The most significant assessment design criteria assessed in skills and applications
tasks (SATs) were application, and knowledge and understanding. In addition,
specific features I4 (the obtaining, recording, and display of findings of investigations,
using appropriate conventions and formats), AE1 (analysis of data and concepts and
their connections, to formulate conclusions and make relevant predictions), and AE2
(evaluation of procedures, with suggestions for improvement) were assessed
effectively in experimental skills questions.

Sets of skills and applications tasks were almost exclusively topic tests. In most
cases, teachers made good use of a variety of previous examination questions or
questions of a similar style. They generally made it easier for moderators to confirm
their grades by clearly identifying the specific features assessed in each task. This
was done either by annotating each question with the appropriate specific feature or
by providing a summary of student achievement against the features on the front
cover. In a small number of cases, teachers appeared to have simply used marks
and then directly converted a percentage to a grade. This meant that, at times, the
grades allocated were not consistent with the performance standards.

Most tasks included a mixture of questions with a variety of demands. They allowed
all students to access some questions and demonstrate their knowledge,
understanding, and skills. They also allowed some students to demonstrate their
ability to work in the A grade band. Some teachers set tasks that did not assess
experimental skills, limiting student achievement against the investigation, and
analysis and evaluation, assessment design criteria. Some tasks also limited students’
ability to give extended answers describing and explaining physics concepts,
phenomena, and applications, which is necessary to achieve in the high grade bands
for knowledge and understanding.

Although there is no requirement in the subject outline for the SATs to assess all
topics, almost all teachers did so, allowing students to show breadth as well as depth
of knowledge and understanding.

Again this year, some teachers set more than five tasks in this assessment type,
which results in the total number of assessments across the subject being greater
than the maximum of ten permitted in the subject outline. Chief Assessor’s reports
in previous years have advised against setting more than five SATs. It should be
noted that dividing a task that has been described as a single task in an approved
learning and assessment plan (LAP) so that the task is completed at two separate
sittings changes this task from one to two separate tasks. As such, this practice is
likely to lead to the total number of tasks in the LAP exceeding the maximum number
permitted by the subject outline and hence the LAP would not be approved. In effect,
then, these teachers provided student work for moderation that was contrary to their
approved LAP. In much the same way as word-count is dealt with in the issues
investigation, moderators had to consider only the number of SATs present in the
approved LAP. Teachers who follow this practice risk their students having limited
opportunity to show the breadth and depth of their knowledge and understanding, as
well as their ability to use terminology and solve problems.

Most teachers provided their students with topic tests to assess them against the
performance standards. This is preferable to using mid-year or trial examinations as
SATs, where the demands of the task at that time of the year tend to limit student
achievement against the performance standards.

External Assessment

Assessment Type 3: Examination


The questions in the Stage 2 Physics examination can be generally described as
calculating, explaining, and problem-solving. In 2015, questions of the first type were
generally answered successfully, but students’ ability with the latter two types
continued to decline. While the number and quality of answers to the extended-
response questions were an improvement over previous years, the number of poor
answers (or blank pages), particularly in Booklet 2, was concerning. As in previous
years, the communication of physics ideas in a logical sequence challenged the
students. That having been said, the mean mark for the examination was very similar
to previous years, as was the number of students in each grade band.

Student handwriting was a problem in some booklets. Often answers could not be
read, and markers are not required to decipher carelessly written work but are
instructed to mark what they can read. Diagrams can aid a student answer, and are
required in some questions. It is recommended that these are drawn with a dark
sharp pencil, so that corrections can be easily made.

Questions asked in different contexts often showed that students had rote-learned
specific responses and could not adapt them to the given circumstances.

Below is a discussion of individual questions.

Question 1

To obtain the first 2 marks of the examination, the students were required to show
knowledge that the accelerations of the two projectiles are equal in size, and directed
straight downwards. While most students were able to show the required knowledge,
disappointingly, too many students were unable to distinguish between acceleration
and velocity, with many students drawing tangential velocity vectors and their
horizontal and vertical components (rarely with appropriate labels).

The second part of the question used the knowledge that complementary angles
would result in the same range. Students who did not know this often guessed 45° or
attempted to measure the angle from the diagram.
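
The equality of the ranges follows in two lines; a sketch, assuming launch and
landing at the same height (an assumption, as the question’s diagram is not
reproduced here):

R = v² sin(2θ)/g
sin(2(90° − θ)) = sin(180° − 2θ) = sin(2θ)

so launch angles θ and 90° − θ give the same range.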

Question 2

Question 2 was the most successfully answered question in the paper. Neither the
calculation nor the drawing of the vectors provided much challenge. Failure to label
at least one vector, however, resulted in students losing a mark that they should
easily have been able to gain.

Question 3

Both the explanation and calculation were completed well by many students. Often
the best explanations were done by students who used a diagram within their
answer, which showed that the horizontal component of the normal force was
directed towards the centre of the circle — the information most commonly missed
from the explanations.
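
For reference, the key relationship that the best diagrams supported, written as a
sketch for a surface banked at angle θ with friction ignored (assumptions; the exact
situation in the question is not reproduced here):

N sin θ = mv²/r (the horizontal component of the normal force provides the centripetal force)
N cos θ = mg (vertical equilibrium)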

Question 4

This question contained two routine calculations, which students were usually able to
complete well. A common misconception in part (b) was that the speed of the satellite
depends upon its mass rather than the mass of the Earth. Incorrect answers to
part (c) usually had correct physics (such as launching west to east, or that the
centre of orbit corresponds to the centre of the Earth), but that information did not
answer the question.
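
The independence of orbital speed from satellite mass takes two lines to show,
equating gravitational and centripetal forces for a satellite of mass m orbiting the
Earth (mass M_E) at orbital radius r:

GM_E m/r² = mv²/r
v = √(GM_E/r)

The satellite’s mass m cancels, so the speed depends on the Earth’s mass, not the
satellite’s.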

Question 5

A failure to correctly draw the vector diagram led to this question being the worst
answered in the examination. Students who showed additional working before
drawing the vector diagram were able to answer the question correctly with ease.

Question 6

Conservation of momentum questions have often been problematic. The best
answers in part (a) used an approach of equating the total initial momentum with the
total final momentum. The follow-up calculations and interpretation did not seem to
be affected by the problems in part (a).
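
For reference, the approach used in the best answers, written generically for two
bodies (the question’s actual masses and velocities are not reproduced here):

m₁u₁ + m₂u₂ = m₁v₁ + m₂v₂

with one direction taken as positive throughout, then solved for the unknown
velocity.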

Question 7

The rearranging of Coulomb’s law proved problematic for many students (particularly
dealing with the constant), often resulting in unrealistic answers. Answers with three
or more significant figures were penalised in this question — however, many
students used the ‘clue’ in the question (the number of significant figures in the
given distance) to correctly decide how many significant figures to include.

For the second part of this question, many students determined that the force was
one-quarter as large, but failed to properly communicate how they used the
proportionality to obtain this result.
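
Communicating the proportionality takes only two lines; a sketch, assuming the
separation was doubled (the question’s actual values are not reproduced here):

F ∝ 1/r²
F₂/F₁ = (r₁/r₂)² = (1/2)² = 1/4

so the new force is one-quarter of the original.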

Question 8

Most students had knowledge of the direction and shape of the electric field, but
students often lost marks due to a lack of care or through rushing. Markers penalised
answers that did not clearly show an understanding that the vectors touch the
surface of the conductor, and that the vectors should be perpendicular to the surface
of the conductor.

Question 9

Question 9 was a challenging problem-solving question, but one that could be
answered in many ways. There were very good 3-mark answers that used a descriptive
approach, (fewer) very good 3-mark answers that used calculations, and (even
fewer) very good 3-mark answers where students derived expressions to compare
the times. However, these answers were rare — 18% of students did not attempt the
question (more than any other question in the examination) and 35% made an
attempt but achieved no marks. Students considered the different masses, but rarely
accounted for the difference in charge of a proton and an alpha particle.
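
One calculation-based route, as a sketch only — assuming both particles start from
rest and travel the same distance d in the same uniform field E (assumptions, as the
question is not reproduced here):

a = qE/m and d = ½at², so t = √(2dm/(qE)), i.e. t ∝ √(m/q)

For an alpha particle m ≈ 4mₚ and q = 2e, so t_alpha/t_proton = √(4/2) = √2; the
alpha particle takes longer.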

Question 10

Students’ inability in part (a) to describe the uniform circular motion of a charge in a
magnetic field was frustrating, because the three required points are well known and
because this concept has been frequently assessed in recent examinations.
A common misconception was that the particle’s motion changes ‘due to the right
hand rule’. The remaining parts of the question were handled well by the majority of
students.
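
For reference, a standard three-point description runs along these lines (the exam’s
exact marking scheme is not reproduced here):

• the magnetic force (magnitude F = qvB) is always perpendicular to the velocity
• the force therefore does no work, so the speed, and hence the magnitude of the force, remain constant
• a constant-magnitude force perpendicular to the velocity produces uniform circular motion, with qvB = mv²/r, i.e. r = mv/(qB).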

Question 11

The use of a projectile motion problem-solving methodology with charged particles
moving in electric fields was done well by many students, and markers were pleased
with the responses. The greatest challenge came (unsurprisingly) in determining the
flight time of the charges, with incorrect selection from horizontal, vertical, and initial
speeds common.

Question 12

The first two parts of the question were successfully completed by many students.
The vague answer to part (a) of ‘up’ could not be rewarded, and markers commented
that ‘too many students said into the page’. The correct abbreviation of ‘seconds’ is
‘s’, not ‘secs’.

Part (c) showed many students’ inability to communicate their deductions — many
determined that the frequency would increase but were unable to give a satisfactory
reason. Some answers rested on a misconception about the relationship between the
relevant quantities.

Question 13

Given that the loudspeaker is one of the applications prescribed in the course, it was
very surprising that 12% of students did not provide an answer about its operation,
and that 20% provided an answer that received no marks. It was common for students to
discuss the compressions and rarefactions of a sound wave without referring to the
magnetic interactions that cause the movement of the cone. The student responses
to this question were the most disappointing of the examination for the examiners.
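
The missing link in many answers can be stated briefly; a sketch, assuming the
common moving-coil design: the alternating current of the audio signal flows through
a coil sitting in the radial field of a permanent magnet, so the coil experiences a force
of magnitude F = BIl whose direction reverses whenever the current reverses. The
coil and attached cone therefore oscillate at the signal frequency, producing the
compressions and rarefactions of the sound wave.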

Question 14

Three-mark explanation questions are usually very challenging for students, and
Question 14 was no different. The question required knowledge of the production
of light through incandescence, and the concepts of coherence and
monochromaticity. The majority of students were unable to explain and/or link these
concepts (with coherence being surprisingly poorly understood).

Question 15

Question 15 was a positive way for most students to finish Booklet 1. In part (c),
there were many examples of poor communication, particularly the vague use of
terminology such as ‘the waves are in phase’.

Question 16

The calculation in part (a) was not a problem for most students, although some used
5 in their calculations (possibly from seeing five maxima in the image). The failure
by students to correctly round off their answer (taking 1.2266 from the calculator and
writing 1.22) demonstrates a lack of numeracy skills that is unexpected in Stage 2
Physics.

Knowing that blue light has a smaller wavelength than red light allowed many
students to successfully describe how the different wavelength would affect the angle
of the first-order maxima. A misconception about the relationship between
wavelength and angle was evident in a number of student answers. Most students
could describe how a change in wavelength would affect the angle, even if they did
not know how the wavelength was changed.
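
The expected reasoning is compact; a sketch using the standard interference
equation with slit separation d:

d sin θ = mλ, so for the first order sin θ₁ = λ/d

Blue light has a smaller λ than red, so θ₁ is smaller for blue light.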

A number of markers commented that they believe a significant number of students
had never seen the white-light diffraction pattern, so were unable to describe it, let
alone explain it. Very few students were able to satisfactorily describe and explain
the differing angles for the first-order maxima of different colours, despite the lead-in
from part (b). The white central maximum was described more often, but the
explanation was rarely expressed sufficiently well.

Question 17

Question 17 was a routine calculation, which the majority of students did correctly.

Question 18

The most successful way to answer part (a) was by using the terms ‘work function’
and ‘threshold frequency’. Where students attempted to describe these (rather than
just state them), their communication often lacked clarity.

Many answers to part (b) suggested that students had not analysed graphs of data
from photoelectric effect experiments. It was common for students to rearrange the
given equation for h, making no reference to the graph (or its gradient).
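
The expected graphical reasoning, as a sketch — assuming the usual relationship
between maximum kinetic energy and frequency was the given equation:

E_k(max) = hf − W

Plotted as E_k(max) against f, this is a straight line with gradient h and frequency
intercept f₀ = W/h, so Planck’s constant comes from the gradient of the graph, not
from rearranging the equation for a single data point.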

Question 19

The 3-mark explanation for part (a) was fully completed by fewer than 7% of students.
Many struggled to use the law of conservation of energy to discuss the energy
change into X-ray photons when the electrons strike the target, and the link between
frequency and energy was often omitted. Many students made the connection
between the energy of the emitted photon and the electron’s proximity to nuclei in the
target material, but few mentioned that the remaining energy is converted to heat.

The rearrangement and calculation in part (b) was done well by the majority of
students, although many answers were given as 58012.5 V, and this inappropriate
number of significant figures was penalised.

Question 20

Part (a) was considered sufficiently routine that the result could be used in parts (b)
and (c). Most students did the calculation correctly, but often the conversion of
478 nm into metres was omitted or the conversion to eV was done incorrectly,
leading to unrealistic answers. Some students used the wavelength as the frequency
in the equation E = hf, again yielding an unrealistic answer. The knowledge that only
transitions upwards from the ground state of hydrogen will occur at room temperature
was rarely shown or rarely communicated well in part (c). Questions that instruct
students to draw and label require students to draw and label to gain full credit.
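
The conversions that caused trouble are routine; a worked sketch for a 478 nm
photon, using h = 6.63 × 10⁻³⁴ J s and c = 3.00 × 10⁸ m s⁻¹:

E = hc/λ = (6.63 × 10⁻³⁴ × 3.00 × 10⁸)/(478 × 10⁻⁹) ≈ 4.16 × 10⁻¹⁹ J
E ≈ 4.16 × 10⁻¹⁹/1.60 × 10⁻¹⁹ ≈ 2.60 eV

Omitting the nm-to-m conversion inflates the answer by a factor of 10⁹, which should
stand out as unrealistic.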

Question 21

It is clear from this question that many students do not know what fluorescence is, or
do not understand the process to a satisfactory standard. About 14% of students did
not attempt this question, and a further 25% attempted it but received no marks.

Question 22

This question focused upon students’ knowledge of the positron emission
tomography application, and this knowledge was clearly lacking. Only 1% of the students were
able to obtain all 6 marks for this question, with the majority losing marks in part (c).
There was significant confusion about how the detection of photons can allow the
location of the radioisotope to be determined — often the only knowledge shown was
that two photons travel in opposite directions. The better answers often used a
diagram to aid the explanation of using multiple pairs of detectors.

Question 23

Question 23 assessed radioactive decays in the context of leak detection, and
students struggled to demonstrate the necessary knowledge. The first part of the
question required that an electron and an antineutrino be included in the products of
the reaction, and that the atomic number and mass number of the magnesium
nucleus be determined. The antineutrino was often missing or incorrectly denoted.
The balancing of atomic numbers and mass numbers was generally done correctly.
Many students identified that the penetrating power was the key factor in part (b), but
often the comparison between the penetrating power of the alpha and beta decays
was missing or lacking detail.
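
For reference, the generic form of beta-minus decay shows why the antineutrino and
the number balancing follow directly (the specific parent nuclide in the question is
not reproduced here):

parent (A, Z) → daughter (A, Z + 1) + e⁻ + ν̄ (antineutrino)

The mass number A is unchanged and the atomic number increases by one, so a
magnesium daughter (Z = 12) fixes the parent’s atomic number at 11.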

Question 24

As expected, a number of students confused the role of control rods with the role of
moderators. The best answers referred to control rods controlling the rate of
reactions through the absorption of neutrons. Half-life calculations are usually done
well by the majority of students, and that pattern continued in 2015. There was
improvement over previous years in the way that students communicated their
problem-solving method in determining the time for the activity to drop.
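
A clearly communicated method usually starts from the decay relation; a generic
sketch (the question’s actual values are not reproduced here):

A = A₀(½)^(t/T½)

For example, for the activity to fall to one-eighth of its initial value,
(½)^(t/T½) = 1/8 = (½)³, so t = 3T½.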

Question 25

A surprising number of students spelled ‘fusion’ as ‘fussion’, and this mistake was
penalised. The problem-solving required to determine the mass of the products from
the energy released was challenging for most students. In part (c), the misconception
that fusion requires energy to overcome the nuclear force was common. Rarely were
the key properties of the nuclear force stated clearly, so applying the ideas in an
unfamiliar context was very challenging for students.
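
The mass calculation rests on mass–energy equivalence; a sketch with illustrative
numbers (17.6 MeV is an example value, not the value in the question):

Δm = E/c²
E = 17.6 MeV ≈ 17.6 × 1.60 × 10⁻¹³ J ≈ 2.82 × 10⁻¹² J
Δm ≈ 2.82 × 10⁻¹²/(3.00 × 10⁸)² ≈ 3.1 × 10⁻²⁹ kg

mass of products = initial mass − Δm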

Question 26

Many students completed the data table in part (a) with answers given to an
incorrect number of significant figures. The graphs drawn in part (b) were usually
done well, with the most common loss of marks being for unsuitable scales on the
vertical axis and for incorrect plotting (which was often a result of an unsuitable scale
on the vertical axis). The students who calculated a gradient did it well, but the
gradient’s units were often incorrect or omitted. The use of the gradient is a skill that
students find challenging, with only the most capable students being able to
successfully determine an answer and communicate their reasoning. Some students
were able to show their understanding of ‘accuracy’ in part (e) despite having failed
to complete part (d), and were appropriately rewarded. Many students demonstrated
some knowledge of how the magnetic force depends upon the angle between the
current element and the magnetic field, but the link to the results of the experiment
was not communicated well.
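
Using the gradient means matching the physical equation to the form
y = (gradient)x; a sketch only, assuming force was plotted against current at fixed
conductor length L and angle θ (an assumption, as the experiment’s variables are
not reproduced here):

F = (BL sin θ)I

Comparing with y = (gradient)x gives gradient = BL sin θ, with units N A⁻¹, so
B = gradient/(L sin θ).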

Question 27

The marking of both of the dot points for Question 27 focused on seeking three
content points for each. In the first dot point there were two common approaches.
The approach of students describing the acceleration being caused by the force was
generally done better than the approach of discussing the energy gained by the work
done as the ions move across the potential difference; however, both approaches
yielded some very good answers. The best answers were concise, with a logical
progression of ideas. A number of students derived an expression for the
acceleration of the ions (with varying degrees of detail), but these answers rarely
contained sufficient detail or communicated it effectively.
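
Both accepted approaches to the first dot point can be summarised briefly; a sketch
for ions of charge q and mass m accelerated through a potential difference V
(generic symbols, not taken from the paper):

Force approach: the field exerts F = qE on each ion, so a = qE/m and the ions gain speed in the direction of the force.
Energy approach: the work done by the field appears as kinetic energy, qV = ½mv², so v = √(2qV/m).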

The best answers to the second dot point correctly showed how the law of
conservation of momentum applied to the context, rather than just stating it. An
approach of equating the total initial momentum with the total final momentum
yielded the best results. Despite many students failing to link the increased
momentum of the spacecraft with a gain of speed, the marks for the second dot point
were better than for the first.

The average mark for Question 27 was higher than the average mark for most
extended-response questions in the past few years. Also pleasing was that the
number of students presenting an answer (92%) was higher than in recent history.

Question 28

The need for students to identify a variable and then design an experiment was
different from many extended-response questions in the past, but this did not prevent
students from answering it. In the same way as for Question 27, the number of
students presenting an answer for Question 28 was higher than the number of
answers for the final questions of recent examinations.

There were a number of variables that the markers determined to be appropriate,
including (but not limited to) the distance from the slits to the screen, the number of
slits, and the distance between the slits. A number of students suggested that the
wavelength would be a suitable variable, but this did not fit with the instruction that
the experiment was to be conducted with a helium–neon laser. It also showed many
misconceptions about the nature of laser light — some students suggested changing
the wavelength by putting coloured filters in front of the laser, or simply included the
instruction to ‘change the laser’s wavelength’. When a student identified an incorrect
variable, the remainder of the practical was marked to reward correct physics.

When suggesting variables, a significant number of students stated that the two-slit
equation would apply, whereas better answers suggested that it may apply. Other
good answers focused on the equation d sin θ = mλ, known to apply to two slits and
to diffraction gratings, so likely to apply to three slits.
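
That hedged hypothesis leads naturally to a graphical test; a sketch, assuming the
slit separation d is chosen as the independent variable:

If d sin θ = mλ holds, then for the first-order maximum sin θ = λ/d, i.e. sin θ ∝ 1/d.

Plotting sin θ against 1/d should then give a straight line through the origin with
gradient λ, supporting or refuting the hypothesis.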

The marking of this question focused on the students’ ability to devise and
communicate an effective procedure. There were many additional ways that students
could obtain marks, including correctly identifying and justifying independent,
dependent, and controlled variables; stating a hypothesis which contained an
anticipated proportionality; and describing how the data could be analysed
graphically. Discussions of the safe use of lasers, particularly the requirement that the
room be well lit (as described in the Australian Government’s Safety Guide for the
Use of Radiation in Schools), were rewarded. Good practical skills, such as repeating
measurements and averaging or measuring across a number of bright bands (and
dividing appropriately), earned students marks.

Disappointing was the number of students who did not understand a ‘variable that
could affect the pattern’, erroneously discussing things like the ambient light in the
room. Similarly disappointing was the number of students who did not design an
experiment for a variable, but who described an experiment they (probably) had done
during the year that used different diffraction gratings to determine the wavelength of
a laser.

Operational Advice
School assessment tasks are set and marked by teachers. Teachers’ assessment
decisions are reviewed by moderators. Teacher grades/marks should be evident on
all student school assessment work.

Teachers should ensure that there is a copy of the approved LAP, including an
addendum where appropriate. There should also be a Variations — Moderation
Materials form for all instances where student work in the moderation sample differs
from that specified in the LAP (including the addendum).

A teacher pack containing a copy of the approved LAP and copies of each of the
tasks, showing what was provided for students, should be included. Teachers should
also ensure that the process that they used to determine the student grades is clear
to the moderators, allowing them to review the work and confirm the grade.

Originals rather than photocopies of student work should be provided, with teacher
annotations in a different colour of ink from the student’s work. Hard copies of
student work should be provided where possible, rather than electronic copies of
text documents.

Physics

Chief Assessor
