
Cambridge Primary Checkpoint End of Series Report

May 2017

Contents

1. Introduction Page 3

2. Cambridge Primary Checkpoint English as a Second Language 0837 Page 4


2.1 Comments on specific questions – English as a Second Language 0837 component 01 Page 4
2.2 Comments on specific questions – English as a Second Language 0837 component 02 Page 8
2.3 Comments on specific questions – English as a Second Language 0837 component 03 Page 10
2.4 Table and charts of sub-group performances – English as a Second Language 0837 Page 12

3. Cambridge Primary Checkpoint English 0844 Page 36


3.1 Comments on specific questions – English 0844 component 01 Page 36
3.2 Comments on specific questions – English 0844 component 02 Page 40
3.3 Table and charts of sub-group performances – English 0844 Page 44

4. Cambridge Primary Checkpoint Mathematics 0845 Page 63


4.1 Comments on specific questions – Mathematics 0845 component 01 Page 63
4.2 Comments on specific questions – Mathematics 0845 component 02 Page 72
4.3 Table and charts of sub-group performances – Mathematics 0845 Page 79

5. Cambridge Primary Checkpoint Science 0846 Page 98


5.1 Comments on specific questions – Science 0846 component 01 Page 98
5.2 Comments on specific questions – Science 0846 component 02 Page 101
5.3 Table and charts of sub-group performances – Science 0846 Page 104

1. Introduction

This document reports on candidate performance for this exam series. Performances for each syllabus are reported separately; the entries for on-screen and paper-
based syllabuses are not combined.

Overall and sub-group performances can change from series to series. You can use the report to compare sub-group performances for this syllabus in this
series. You should not use the information to compare performance changes over time.

For each syllabus the following information is provided:

• examiner comments on specific questions within each component of the test,


• tables and charts of sub-group performances for the overall assessment and at strand level.

2. Cambridge Primary Checkpoint English as a Second Language 0837

2.1 Comments on specific questions – English as a Second Language 0837 component 01

General Comments
The overall level of difficulty and learner performance appeared similar to equivalent papers (this is the first time this particular paper has been offered), with Part 3
(multiple matching) and Part 5 (comprehension) presenting the most difficulty for learners.

Part 1
(Questions 1-5)
This five-gap, multiple-choice sentence completion task required learners to select an appropriate item to fill each gap and to circle the word of their choice; it tested correct usage of lexical and function words. The majority of learners scored well on this part. Question 3 attracted the most incorrect answers.

Part 2
(Questions 6-15)
For questions 6–15, learners were required to put one word only into the gaps to complete a single email message to a friend. The words needed to fit grammatically and to
carry the intended meaning to complete the text.
Many of the incorrect answers seemed to arise because the word chosen fitted the words on either side of the blank in terms of grammar or meaning, but learners did not take into account the wider context at sentence or discourse level. More focus on this aspect may be needed when teaching reading skills to pupils. This appears to be an important area for improvement, as it is also noted in reports at secondary level.
Spelling needed to be accurate and, apart from Questions 6, 10, 11, 12 and 15, where some appropriate alternative words were accepted, the mark scheme did not allow for alternative answers. This part proved to be a good discriminator, with strong learners occasionally scoring full marks. The majority of learners answered fairly well; answers were mostly correct, though weaker learners answered Questions 6, 8 and 9 incorrectly. Some weaker learners also attempted the task by inserting a variety of question words into the gaps.

Part 3
(Questions 16-20)
In this part, learners were required to complete a short conversation by selecting appropriate responses from those given. Most learners scored well on this task, but
weaker learners found this to be the most difficult task. Errors are often made when learners find an item to match the sentence before or after but not both.
In this part and Part 2 a lot of learners changed their minds several times and, rather than crossing out their original mark, wrote over their first answer or erased it incompletely. This sometimes resulted in an illegible response. It would be preferable for learners to cross out the rejected response completely and write the preferred one next to it. There were also a few ambiguously formed letters: with A and H, sloping sides and a gap at the top could often suggest either letter, and it was sometimes difficult to see whether E or F was the intended final answer where there was overwriting and/or incomplete erasure.

Part 4
(Questions 21-25)
In this section, learners were required to select the correct meaning of the message shown in a picture. Learners needed to circle one of the three choices given. This task was generally well done. The task required careful reading and accurate matching of information to determine the correct response. Learners needed to be good at
making inferences and identifying different ways to convey a message. Most learners scored well, though there were a lot of incorrect answers for Questions 23 and 25.

Part 5
(Questions 26-30)
This task comprised multiple choice questions on a longer text with the title ‘Anna Learns to Surf’, which was about a girl, Anna, learning how to surf.
The need for learners to use a wide range of reading skills, including inference and deduction, makes this a fairly challenging part of the test. On the whole learners
answered well, but weaker learners found it difficult. Questions 28 and 29 were the most successfully answered and Questions 26 and 30 the least.

Question 1
A high proportion of correct answers, C (‘price’); most common incorrect response was B (‘cost’).

Question 2
A high proportion of correct answers A (‘takes’); most common incorrect response was C (‘needs’).

Question 3
Mostly correct, A (‘meets’); most common incorrect response was B (‘invites’).

Question 4
Mostly correct, B (‘whole’); most common incorrect response was A (‘full’).

Question 5
A high proportion of correct answers B (‘wears’); most common incorrect response was C (‘puts’).

Question 6
Mostly correct, ‘of’; most common incorrect response was ‘in’.

Question 7
A high proportion of correct answers ‘up/older’; there was no one most common incorrect answer but an assortment of responses.

Question 8
Mostly correct, ‘they’; most common incorrect responses were ‘lessons’ and ‘Math(s)’.

Question 9
Mostly correct, ‘lot’; most common incorrect response was ‘lots’.

Question 10
A high proportion of correct answers ‘with’; there was no one most common incorrect answer but rather an assortment of responses.

Question 11
Mostly correct, ‘go’; most common incorrect responses were ‘are’ and ‘like’.

Question 12
Mostly correct, ‘ago’; most common incorrect response was ‘old’.

Question 13
A high proportion of correct answers ‘a’; there was no one most common incorrect answer but rather an assortment of responses.

Question 14
Mostly correct, ‘send/write’; most common incorrect response was ‘tell’. There were also a number of instances where spelling errors denied the mark.

Question 15
Mostly correct, ‘about’; most common incorrect responses were ‘what’s’ and ‘that’.

Question 16
Mostly correct, ‘C’; most common incorrect responses were ‘A’ or ‘B’.

Question 17
Mostly correct, ‘G’; most common incorrect responses were ‘D’ or ‘C’.

Question 18
Mostly correct, ‘A’; most common incorrect responses were ‘D’ or ‘E’.

Question 19
Mostly correct, ‘F’; most common incorrect responses were ‘A’ or ‘E’.

Question 20
Mostly correct, ‘D’; most common incorrect responses were ‘B’ or ‘E’.

Question 21
A high proportion of correct answers, ‘A’; most common incorrect response was ‘B’.

Question 22
Mostly correct, ‘B’; most common incorrect response was ‘C’.

Question 23
Mostly correct, ‘B’; most common incorrect response was ‘A’.

Question 24
Mostly correct, ‘C’; most common incorrect response was ‘A’.

Question 25
A lot of incorrect answers (usually B); correct answer was ‘C’.

Question 26
Mostly correct, ‘C’; most common incorrect response was ‘A’.

Question 27
Mostly correct, ‘B’; most common incorrect response was ‘A’.

Question 28
A high proportion of correct answers, ‘A’; most common incorrect response was ‘B’.

Question 29
Mostly correct, ‘B’; most common incorrect response was ‘C’.

Question 30
A lot of incorrect answers (usually C); correct answer was ‘A’.

2.2 Comments on specific questions – English as a Second Language 0837 component 02
General Comments
The paper seemed to work well, although there were comments from examiners about the length of some of the responses. Candidates often wrote too much in Questions 6 and 7, which generated more language to assess but also more errors. When weaker candidates wrote answers that were too long, they often lacked the linguistic resources to develop the narrative through linking words, narrative tenses or cohesive devices.
Examiners also commented on handwriting, which was often difficult to read. However, on the whole, the candidates seemed to cope well with the tasks and the majority produced English of the standard required.
Overall, this was a good paper and it seemed to discriminate well between weak and strong candidates.

Question 1
Most learners answered this correctly but the most common wrong answers were ‘sweet’ and ‘straw’ rather than ‘sugar’. ‘Sweet’ was taken from the definition provided on the
test. There were some problems with the spelling of ‘sugar’ with many candidates writing ‘suger’ or ‘shugr’.

Question 2
In most cases the candidates knew what the word should be, and the majority got it right. However, there were lots of variations in the spelling of ‘breakfast’, including
‘brakfeast’, ‘brakefast’, ‘beafsteak’, ‘brecfeast’, ‘breckfast’, ‘brakfaste’ and ‘breakfirst’, none of which the examiners could accept. A common wrong answer was
‘beginning’, which was copied from the definition provided on the test.

Question 3
Most learners answered this correctly, and the spelling was mostly accurate, although a few wrote ‘tomate’, and occasionally the letters ‘o’ and ‘a’ were not clearly formed, so it was difficult to judge whether the spelling was correct. There were a few incorrect guesses, such as ‘table’ and ‘turnip’, but nothing that could be credited.

Question 4
Many learners found this very difficult, and even those who did get the word right often had problems spelling it. There were variations including ‘frigge’ and ‘frigde’, but the most common wrong answer was ‘freezer’, spelt in a variety of ways. Looking through the scripts marked, this seemed to be the question the candidates got wrong most often.

Question 5
Although the majority of candidates got this correct, a few had problems spelling the word, producing ‘pleat’, ‘plaet’, ‘plait’ and ‘plato’. This question also generated the most alternative ideas from the candidates, including ‘poats’, ‘paper’, ‘paint’, ‘place’, ‘plant’, ‘plats’, ‘panet’, ‘preys’, ‘panes’, ‘party’, ‘pizza’, ‘panne’ and ‘plane’.

Question 6
This question worked well and the vast majority of learners were able to gain high marks, with examiners frequently awarding top marks (five) for content and four out of five marks for communicative achievement. The topic was suitable for the candidates to write about and they generally covered the three content points successfully. Some candidates didn’t respond to the task appropriately and instead replied to the email or re-stated the content points in question form in their response. Many candidates also invited Alex to go
swimming, although the task did not ask them to do this as the plan had already been made. This implied they had not read all the information before attempting the task. Most
candidates specified a clear time to meet at the pool, but some were a bit vague and mentioned ‘the morning’, which would not be enough to inform the friend. The second
point was more problematic as many candidates wrote about a place that was known to them and their friend, so just said where to meet rather than suggesting how to get
there. Others gave directions rather than suggesting a mode of transport, which also was not quite correct. The final point was well covered. The candidates should remember
that although they are given a context of writing to their friend, the reader does not share their knowledge, so they have to explicitly state where they are going and how they
would get there.
One final comment concerns the use of the pronoun ‘you’ in the input task. Candidates approached this in different ways, referring to Alex, themselves or both of them. All of these were acceptable interpretations, but candidates should know that generally ‘you’ would refer to themselves, so they could simply have stated how they were going to travel to the pool.

Question 7
This was a good topic for a story and most candidates wrote about a fun day in the snow. They had a range of vocabulary which was appropriate for the topic, e.g.
‘snowmen’, ‘snowballs’, ‘snowfights’; and most candidates introduced other characters in the story, such as friends or family to play with. The candidates who did this well
used all the elements of the prompt sentence very well. There was excitement which led Ben to run outside. They made reference to the coat by explaining it was cold and
the snow featured heavily in the stronger candidates’ stories. The stories were coherent and made some use of simple narrative tenses to connect the events, and in some
cases developed a dialogue between the characters in the story, which was effective.
The weaker candidates struggled to make a strong link between the prompt and the story. They usually wrote about Ben, but there was little reference to the weather or
any sense of urgency or excitement in the story. These candidates tended to score lower on Content as they hadn’t used the ideas in the prompt to develop their story
appropriately. The weaker candidates also found it difficult to progress their stories and generally wrote in the present tense making the stories a list of events rather than a
connected series of events. There was also a lack of range of vocabulary, meaning that words from the prompt appeared frequently, or unconnected vocabulary from the candidates’ own knowledge was used, which made the stories lack cohesion. Some of the candidates did not understand ‘coat’ and wrote about a lost cat, which made the
stories rather confusing.
The main difficulty the weaker candidates had was punctuation and a lack of awareness of how punctuation helps the reader follow the narrative. There were occasions
where the stories were written over two pages and no commas or full stops had been used at all. Some simple linking words were used but, without punctuation, it was still difficult to follow the narrative.

2.3 Comments on specific questions – English as a Second Language 0837 component 03

General Comments
The great majority of learners attempted all the questions. It was noticeable that the number of incorrect answers increased later on in the test, especially in Parts 3 & 4.
The comments below regarding problems in deciphering some learners' intended answers also apply to Paper 1 Reading and Usage. Learners should remember that each
multiple choice question must have only one answer indicated; in a few cases two answers were circled, so the item was marked as incorrect. If learners wish to change an
answer they should very clearly put lines through the letter or words to cross out. A lot of learners tried to delete by writing a wavy line (resembling crocodile teeth) around a
circle but it was sometimes unclear what the intended answer was and the item was sometimes marked as incorrect.
A fairly frequent problem was that a lot of learners used erasers to try to change answers, and these did not always prove entirely effective in erasing answers
written in pen; unless the correction was very boldly written, the resulting lack of clarity sometimes made it difficult to decide what the intended answer was. The best
practice seen was when learners either put a clear tick next to their preferred response, or wrote clearly ‘Yes’ or ‘No’. The rubric on the Question Paper does not disallow
the use of erasers but centres should be aware of their limitations.
A similar problem found in a number of responses was that in Part 4 learners wrote over an answer to correct it but in a few cases the resulting answer was not clear enough
for it to be marked as correct.
In Part 4 a few alternative spellings were acceptable: Question 16 (22nd/twenty(-)second/22/twenty(-)two), Question 18 (sweetes) and Question 20 (dolfins), but very few learners wrote these alternatives. With the exception of Question 20, the answers were all very common words.

Part 1 (questions 1-5)


Learners identified one of three pictures from short, discrete dialogues. Most learners did well here, especially on Question 4, though answers to Question 5 were often incorrect.
The latter suggests that some extra attention to teaching the time would be useful.

Part 2 (questions 6-10)


This involved multiple-choice questions based on a longer dialogue in which a boy called Tom asked his friend Mandy about making a cake. Question 6 was answered most successfully and Question 9 least successfully.

Part 3 (questions 11-15)


This task comprised five questions based on an interview with a teenage ice-skater. There were a number of incorrect responses due to the increased complexity of
language and the greater skills demanded, though overall most learners were quite successful. The most successfully answered question was Question 11 and the least successful Question 12, though Question 13 also had a number of incorrect answers.

Part 4 (questions 16-20)


This task comprised five questions based on a teacher telling her class about a trip to the Natural History Museum. There were a number of incorrect responses due to the increased complexity of language and the greater skills demanded, though incorrect spelling was also a significant contributor. The most successfully answered question was Question 16 and the least successful Question 17, though Question 18 also had a large number of incorrect answers.

Question 1
Mostly correct, ‘B’; most common incorrect response was ‘A’.

Question 2
Mostly correct, ‘C’; most common incorrect response was ‘A’.

Question 3
Mostly correct, ‘C’; most common incorrect response was ‘B’.

Question 4
A high proportion of correct answers, ‘A’; sometimes the incorrect response chosen was ‘B’.

Question 5
Mainly correct, ‘A’; most common incorrect response was ‘B’.

Question 6
A high proportion of correct answers, ‘C’; sometimes the incorrect response chosen was ‘A’.

Question 7
Mostly correct, ‘B’; most common incorrect response was ‘A’.

Question 8
Mostly correct, ‘A’; most common incorrect response was ‘C’.

Question 9
Mostly correct, ‘B’; most common incorrect response was ‘C’.

Question 10
Mostly correct, ‘A’; most common incorrect response was ‘B’.

Question 11
Mostly correct, ‘B’; most common incorrect response was ‘C’.

Question 12
A number of incorrect responses, mainly ‘A’; the correct response was ‘C’.

Question 13
Mostly correct, ‘A’; most common incorrect response was ‘C’.

Question 14
Mostly correct, ‘B’; most common incorrect response was ‘A’.

Question 15
Mostly correct, ‘A’; most common incorrect response was ‘C’.

Question 16
A high proportion of correct answers, ‘22nd/22’; sometimes the incorrect response chosen was ’20 second(s)’.

Question 17
A number of incorrect responses, mainly ‘sleeping bag’ and spelling errors; the correct response was ‘blanket’.

Question 18
A number of incorrect responses, mainly ‘drinks’ and spelling errors; the correct response was ‘sweets’.

Question 19
Mostly correct, ‘fur’; most common incorrect responses were spelling errors.

Question 20
Mostly correct, ‘dolphins’; most common incorrect responses were spelling errors.

2.4 Table and charts of sub-group performances – English as a Second Language 0837

Performances for each syllabus are reported separately; the entries for on-screen and paper-based syllabuses are not combined.

Overall and sub-group performances can change from series to series. You can use the report to compare sub-group performances for this syllabus in this series. You should
not use the information to compare performance changes over time.

Demographic breakdown of total entry for Cambridge Primary Checkpoint English as a Second
Language
Age in years   First Language   Percentage of   Average       Average           Average         Average       Average
                                total entry     total score   Listening score   Reading score   Usage score   Writing score
10 and under   Not English      16.7            3.1           3.1               3.0             3.2           3.4
10 and under   English          0.1             2.9           4.1               2.2             2.3           3.0
10 and under   All              16.7            3.1           3.1               3.0             3.2           3.4
11             Not English      47.0            3.6           3.6               3.6             3.8           3.6
11             English          1.0             5.1           5.0               4.8             5.2           4.9
11             All              48.0            3.7           3.6               3.7             3.8           3.7
12 and over    Not English      33.9            3.5           3.5               3.6             3.6           3.5
12 and over    English          1.4             4.6           4.3               4.6             4.8           4.6
12 and over    All              35.3            3.6           3.5               3.7             3.6           3.6
All            Not English      97.6            3.5           3.5               3.5             3.6           3.6
All            English          2.4             4.8           4.6               4.6             4.9           4.7
All            All              100.0           3.5           3.5               3.6             3.7           3.6

Please note that in the block charts that follow, the horizontal axis representing Cambridge Primary Checkpoint scores is annotated from 0 to 6.

The value 0 represents the group of scores below 1.0,


the value 1 represents the group of scores from 1.0 to 1.9,
the value 2 represents the group of scores from 2.0 to 2.9,
the value 3 represents the group of scores from 3.0 to 3.9,
the value 4 represents the group of scores from 4.0 to 4.9,
the value 5 represents the group of scores from 5.0 to 5.9,
the value 6 represents the group of scores of 6.0 or more.

For the curve graphs which follow the block charts, the horizontal axis also represents Cambridge Primary Checkpoint scores, but here the scores are continuous rather than grouped. The tick marks along the horizontal axis
therefore represent actual Cambridge Primary Checkpoint scores.
Distribution of Cambridge Primary Checkpoint total score for English as a Second Language classified by student's first language.

Distribution of Cambridge Primary Checkpoint total score for English as a Second Language classified by student's age.

Distribution of Cambridge Primary Checkpoint total score for English as a Second Language by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint total score for English as a Second Language by student's age, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Listening score classified by student's first language.

Distribution of Cambridge Primary Checkpoint Listening score classified by student's age.

Distribution of Cambridge Primary Checkpoint Listening score by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Listening score by student's age, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Reading score classified by student's first language.

Distribution of Cambridge Primary Checkpoint Reading score classified by student's age.

Distribution of Cambridge Primary Checkpoint Reading score by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Reading score by student's age, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Usage score classified by student's first language.

Distribution of Cambridge Primary Checkpoint Usage score classified by student's age.

Distribution of Cambridge Primary Checkpoint Usage score by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Usage score by student's age, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Writing score classified by student's first language.

Distribution of Cambridge Primary Checkpoint Writing score classified by student's age.

Distribution of Cambridge Primary Checkpoint Writing score by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Writing score by student's age, showing the cumulative percentage of the number of students at each score.

3. Cambridge Primary Checkpoint English 0844

3.1 Comments on specific questions – English 0844 component 01

General Comments
Overall, responses to questions which required straightforward answers were good; often just a short phrase extracted directly from the text was enough to gain the marks. Multiple-choice questions were generally answered well, with most learners using the correct number of ticks. For questions which require learners to identify a number of items, learners should remember that if they provide a list of answers each one must be correct, as a wrong answer will negate any correct ones.
In section B learners continued to score well, with spelling in particular, and it wasn’t unusual to see both marks gained for punctuation. With Purpose and Audience and Text Structure, although the vast majority of learners seemed well prepared for this now familiar task, more responses would have benefitted from greater use of the planning box to organise ideas. Simple sentence structures were generally secure; attempts at more ambitious structures were generally less successful.
Some parts of section C showed excellent understanding of grammar and punctuation among a high percentage of learners.

Question 1
Generally this question was well answered by the majority of learners. Unsuccessful answers were characterised by the mention of only one place name, or the generalised ‘England and France’. The vast majority avoided the pitfall of providing a long, unfocused quote instead of simply naming the places mentioned in the sentence.
Question 2
The answer to this question had two parts, the cause and the effect, and hinged on the comparative speed of body heat loss of athletes exposed to the cold. ‘Because athletes lack fat, they get cold more quickly’ is the idea, but it was only fully addressed by about 50% of learners, those who paid attention to the phrase ‘in particular’ in the question.
Question 3
Almost no learners ticked more than two boxes. About half of learners scored both marks, with around 80% achieving at least one mark.
Question 4
This question tested two skills: extracting information and putting it together in a coherent form. Strong responses summarised the four main points and presented the
summary in 25 or fewer words. More learners achieved both marks this year than last, and fewer lost marks by exceeding the word limit or becoming ungrammatical by
trimming out words in an attempt to stay under the limit.

Question 5
Just over 50% answered correctly here. Incorrect answers nearly always ticked box 1 (only facts), suggesting that learners had failed to spot the opinion markers.

Question 6
Learners have been taught how to label a passage as informal or formal, and to include descriptions such as chatty or more adult. The majority gained the mark. However,
quite a large minority ticked the wrong box before providing the explanation for their choice, suggesting that they had learned what to say without being able to spot the language features. Some learners missed the mark by, for instance, ticking the ‘informal’ box and then writing that ‘it was more fun / easier to read’, without identifying what,
about the language, made it more fun.

Question 7
This was the question that the vast majority of learners answered correctly and succinctly, often choosing the salient words ‘invertebrates’ / ‘marine invertebrates’, ‘tentacles’.
Unsuccessful learners in 7a were distracted by the word ‘fish’ in ‘jellyfish’.

Question 8
Most learners scored at least one mark here. About 70% scored both marks. Those who were challenged by this question gave answers where guesswork appeared to have
been used, as there were no trends in either correct or incorrect answers. Much depended on the learners’ understanding of ‘false’.

Question 9
This was a challenging question. Those who were successful linked both elements, a lack of skeletal structure/bones and an inability to hold shape, to connect cause and effect. As in Question 2, learners seemed unclear that there was a lot to do to gain the mark. Most learners thought that jellyfish collapsed because they melted or couldn’t breathe once out of water.

Question 10
(a) This question continues to be an effective way of assessing whether learners understand how non-fiction texts are put together. Overall, the success rate was good,
especially where learners considered the effect of subheadings on a reader’s understanding of what a paragraph will contain, or how the subheadings help a reader to
navigate the text. Less strong responses credited subheadings with dividing up the text (the job of paragraphs) or telling us more about the text, rather than the particular
paragraph. A feature of responses that failed to gain marks was their focus on text details.
(b) This was a straightforward question and highlighted those learners (most of them) who had really understood the text.

Question 11
Nearly all learners made a good attempt to respond to the task stimulus, and with the brief being so comprehensive, very few misunderstood what they needed to write about.
Those who chose sport generally chose football; those who chose an animal usually chose the dog. Fewer learners this year wrote a recount, although there were still a small
minority who wrote an instructional text, thereby limiting their marks for purpose and audience. Most wrote using a basic structure of title, introduction, several points in
paragraphs and a simple conclusion. Overall, it proved to be an accessible task for those learners experienced in reading and writing non-fiction texts.
The scores for learners were limited where detail was not developed. A statement that includes additional information, such as an adjectival phrase or an adverb, greatly
enhances a piece of writing. More ambitious sentence structures often accompany the inclusion of such detail. Some learners demonstrated a depth of subject knowledge, a
range of vocabulary and a grasp of syntax way beyond what might be expected of anyone working under the pressure of an exam, especially such young learners. This
suggests an ability to learn a subject, or a passage about a subject, thoroughly; however, this does not always translate into a piece of writing that acknowledges the needs of
a particular audience.
Purpose and Audience
Stronger learners wrote pieces that were both entertaining and informative, used a consistent point of view and demonstrated knowledge and enthusiasm for their chosen
subject. Less strong learners struggled to sustain their writing throughout the piece, often starting well but lacking the knowledge to continue for long. The majority of learners
gained 2 or 3 marks.

Text Structure
Stronger learners’ writing was characterised by good structural features and they were able to link ideas between sections. It was rare to see an unparagraphed piece, but in
such cases marks for text structure were limited. Most learners achieved a mark of 3, as they provided enough structure within paragraphs and the whole piece, satisfied the
requirement for a balance of coverage, but didn’t provide enough development of ideas before moving to the next paragraph. Weaker learners often failed to link their ideas
logically, offering instead a list of random facts, but even these responses showed some idea of how to present a text with headings and sub-headings.
Sentence Structure
Mostly, simple and compound structures were used, although some learners made a good attempt at using complex structures, linking ideas within a sentence by using, for
example, ‘which’ and ‘although’. Less successful writing was characterised by verb form inconsistencies and disagreement between subject and verb. Where they were used,
simple sentences were secure.
Punctuation
Many learners achieved full marks here. Sentence demarcation was secure. Where learners lost the second mark, there was some good use of commas to mark clause
divisions and pauses, but also an inconsistent approach to using capital letters: either they were missing or were used for words that didn’t require them.
Spelling
Spelling, at this level, was impressive, even among learners who struggled to achieve marks elsewhere in the writing.

Question 12
Fewer learners than last year scored 0. About 75% scored both marks, with about 90% scoring at least 1 mark.

Question 13
(a) Over half of the responses to this question were correct.
(b) Most learners inserted the clause correctly, thereby scoring the first mark. Many failed to earn the second mark by failing to punctuate correctly or, in a few cases, making
    copying errors. It is important that copying errors are avoided so there is no confusion about what has been added or changed. Nevertheless, more learners scored both
    marks for this question than they had in previous sessions.

Question 14
(a) Mastering different verb forms is a challenge for most. Practising verb agreement will help learners to check their responses with more accuracy. Learners who failed to
gain the mark (about 40%) did so either because they didn’t convert ‘saw’ to ‘see’, or made changes or errors in other parts of the sentence.
(b) Changing verbs from active voice to passive represents a challenge for many. Learners took advantage of there being alternative forms for the third verb this year,
though there were many instances of marks being lost through incorrect spelling of the verbs.

Question 15
This was generally well answered, with fewer marks being lost through misspelling the replacement words ‘are’ and ‘them’, and most learners identified exactly what needed
changing in the sentence.

Question 16
Learners continue to gain confidence in being able to identify sentence types. Fewer identified either sentence as simple, perhaps because neither sentence was short.

Question 17
Stronger learners understand the difference between the apostrophe of possession and the apostrophe of omission. A few thought ‘swimmers’ was plural and placed the
apostrophe after the ‘s’; other errors mainly centred on placing the apostrophe between the ‘d’ and the ‘n’ of didn’t, and apostrophising ‘its’.

Question 18
Many learners recognise where speech marks should go. Most learners know where to place accompanying punctuation, such as commas and question marks, in relation to
the speech marks.

Question 19
This is always a challenging question. Successful learners read the sentences in context and chose synonyms that fitted both the meaning and sense of the sentences.

3.2 Comments on specific questions – English 0844 component 02

General Comments
Most learners were engaged by the story, showing some good understanding of the content. In particular, from examples seen in the story writing task, it would seem that
many learners were engaged by the idea of a secret location.
In Section A, questions were generally answered well when learners had paid close attention to the text in inferential ways as well as in literal understanding. Learners who
were able to distinguish questions requiring inference from those requiring explicit understanding scored well. Learners are generally aware of the need to refer to the text, but
sometimes struggle to know when it is appropriate to use direct quotations in their responses. Longer questions, particularly those in two parts, needed answers with an
explanation relating to the main character and her situation, together with selected quotations or evidence from the text. While most learners are aware of the need to explain
their thoughts in their own words and then select the supporting quote or evidence accurately, many still struggle with this way of answering questions – this is discussed in
more detail in the commentary on specific questions below.
In Section B, it was apparent that most learners did respond to the stimulus. Most stories featured Mary and Dickon, and there were very few who did not set their story in the
garden, at least at the outset. The most successful stories invariably focused on a problem and the characters’ reaction to it. These learners managed to create a degree of
suspense and used ambitious vocabulary choices and sentence structures.

Question 1
This was an explicit question, with successful learners understanding that ‘he had untidy hair’ and ‘smelt of natural things’; however, a significant number of learners’ explicit
reading skills were challenged and many gained only one mark here.

Question 2
Good responses to this question recognised the correct detail about Dickon’s eyes being ‘round’ and ‘blue’. Stronger learners understood that this question focused on what
Mary thought was the most unusual thing about Dickon’s appearance. The relevant information was taken from: 'never before had Miss Mary seen ...' Of those that did
correctly choose the ‘round blue eyes’, some still got the answer wrong because they only included one of the two adjectives describing the eyes. Both adjectives needed to
be included to gain the mark as it is both qualities that make the eyes unusual.

Question 3
Question two was about the boy's appearance and question three was about the whole scene the boy was part of. Many learners did not make this distinction, and described
something about Dickon's appearance. Others identified the word 'strange' as a synonym of 'surprising', and others wrote about the sound that Dickon's pipe made. The
strongest learners were able to write about the animals surrounding Dickon and seemingly listening to him. Some did this with a mixture of text and own words. The proximity
of the animals was essential to gain marks here.

Question 4
While a significant number of successful learners answered ‘So as not to “startle” or frighten the wild animals’, there were many who gave ‘A body has to move gently and
speak low when wild things are about’, thus gaining no marks. Learners should be encouraged to focus on the question stem: Why (did Dickon speak to Mary in a low voice
when he first saw her?) and ensure the reason is given in the response.

Question 5
A significant majority ticked ‘Yes’; however, the reasons given were not always adequate. Successful learners mostly recognised that he had brought a bag of garden tools for
her (so he was expecting to meet her); he had received a letter from Martha; he knew who she was (‘I know you’re Miss Mary’).

Question 6
This question asked learners to make inferences about Dickon’s character from the animals’ reaction to him. Many learners correctly identified that: he likes / is kind towards
animals; that he is familiar / at home with nature and wild animals; he has a gentle nature.

Question 7
Questions like this need learners to give an explanation, in their own words, supported by a suitable quote. Quotes need to be both relevant and accurate. Of those who were
successful in giving an answer in their own words, many failed to gain the second mark because the quote was inaccurate or incomplete. Some learners rightly said Dickon
spoke first / immediately to Mary, and so gained the first mark but missed the second mark because they quoted the first words he said (‘Don’t move,’ he said. ‘It’d flight
them.’) A small number of learners referred to Dickon’s appearance i.e. his clean clothes to illustrate his confidence.

Question 8
This question required learners to recognise that it was Martha who had asked Dickon to go to the shop to buy the gardening tools for Mary. While this was successfully
answered by many learners, a few were unsuccessful, answering ‘Did you get Martha’s letter?’ This does not answer an essential aspect of the question ‘What made…?’

Question 9
As with question seven, many learners did not get these marks because they failed to use their own words correctly, even though they did interpret the question as required.
There was a lot of reliance on copying ‘forgot that she had felt shy’ for the explanation as well as for the quotation. A large proportion also described how Mary felt at the
beginning of the meeting or at the end of the text without explaining the change.

Question 10
Many learners understood that the point of view character was Mary. Fewer learners understood that this is the case because the reader knows what she is thinking or how
she is feeling. The majority focused on her being the main character, and ‘it talks about her a lot’. One good answer was ‘because her feelings drive the story’. Many chose
Dickon as the point of view character. Others wrote the narrator, and some wrote ‘the author’. The viewpoint in a text is an area of the curriculum learners need to be clear on
as it is a key learning objective. Establishing viewpoints in different texts and pointing out to learners the clues that show the reader from whose perspective a story is being
told would be useful classroom preparation.

Question 11
(a) This was a straightforward question. Many learners, even those who struggled on the rest of the paper, gained a mark here. Identifying figurative language techniques is
obviously an area of strength for learners.
(b) Only stronger learners were able successfully to link the fact about Mary’s stillness and the comparison in the metaphor about the state of being frozen. This suggests
that while learners are getting better at identifying metaphors, most still struggle to explain how they work. Some learners failed to respond to this question at all. The
second part of the question was misinterpreted by some. The question required an explanation of why the word ‘frozen’ was appropriate when describing someone not
moving. Many learners explained why it was appropriate that Mary wasn’t moving (i.e. Dickon told her not to move, it would frighten the animals). From some of the
answers given to this question, it would seem that some learners interpreted ‘Mary stood frozen, not daring to move a muscle’ as somebody so terrified she didn’t dare
move. A common misconception was to answer the question, ‘Why use a metaphor?’, giving answers such as ‘for added interest’. Others explained what a metaphor is.

Question 12
(a) Most learners gained the mark here by identifying the genre correctly as ‘adventure’.
(b) This question was challenging. Some learners showed a sound knowledge of the features of adventure stories, for example suspense, exciting plot, action. It is the
general features of the genre that are required. As with the point of view question, different genres of stories and their typical features are significant learning objectives
in the primary curriculum. It is advisable that learners become aware of these features linked to specific genres.

Question 13 – Writing task


Overall, the writing prompt appealed to most learners. Most learners were able to write a story that was set in the secret garden and featured Mary and Dickon. There were a
few excellent stories characterised by well-described settings, linked to a lively plot with well-built suspense and action. The content and audience were handled well. Structure
and dialogue were often used to create an interesting narrative and to develop character and mood. Grammatically sound, these were generally well-structured and well-
punctuated whilst exhibiting a good range of expressive and appropriate vocabulary. Learners who responded this way were able to move the narrative forward using more
complex sentence structures to express ideas and add descriptive detail.
Where attempts were made to produce a plan, there was evidence that learners were able to use a range of strategies to organise and set out their ideas. Where planning
was done well, learners wrote a series of well-organised paragraphs with a good balance between action, description and speech. This balance is an important consideration
and it helps to keep learners ‘on track’ as they write.
Content
The best narratives were those in which learners used good description and added detail. Problems and their resolution moved the narrative forward. Less strong learners
struggled to build their narrative around a problem. Successful learners ensured that their narratives progressively revealed something of their characters as the plot developed. The use of
devices to create precise images was seen in some learners’ work. These better stories used a solid connection between character and events. There was evidence of
ambitious structures and vocabulary, and some degree of control in balancing action, dialogue and description.
Most stories had a simple plot and could be described as being well placed in the setting with a clear beginning and end; however, there were many examples where the story
did not develop beyond a simple description of Mary and Dickon playing or tending to the garden. Meeting and talking to a gardener was taken as being ‘action’ where very
little happened.
Audience
There were many attempts to try to engage the reader through the inclusion of detail. Finding a way to ‘hook’ the reader is not easy as it is difficult to control the flow of the
story. Attainment of higher marks depends on being able to control the content so that the reader is fully engaged. Remembering the audience is a key issue here. A few
learners who managed to do this described the characters’ reactions to different events as they unfolded during the telling of the story.
Ideas in third person narratives could be centred on working out how a character might be feeling at different stages of the story. The root of this lies in being able to create
realistic characters. Simple descriptions of how a person looks or of what they are wearing often slow the narrative and give little away about what a character might be
feeling. Exploring characters in fiction stories and identifying features that make them interesting and believable can benefit young writers greatly. Word choices are important
to achieve success here and help to develop and maintain a relationship with the reader.
Text Structure
As indicated in the content section, good, well-structured stories were usually written by learners who understood that a story needs a beginning, a middle and an end.
Stronger stories also showed a developed structure usually comprising an opening, a problem, development, a climax, resolution and an ending.

Using paragraphs and learning about their purpose improve the structure of a piece of writing. Strong learners were able to gain high marks where they showed a good
understanding of paragraphs by linking them together effectively, for example, through contrasts in mood, shifts in time and changes in location. Most learners attempted to
sequence their stories chronologically. This is very effective when paragraph breaks are evident.
Sentence Structure
Simple sentence structures were used by most learners to express their ideas. The control of verb forms was generally good in these sentence structures, so that consistency
was achieved. Many learners were able to link simple sentences using ‘and’, ‘but’ or ‘then’. They were also able to add simple details using adjectives.
Some strong learners extended their stories by using complex sentences, and some of these were excellent, showing the use of a range of phrases and clauses to develop
ideas. A few of these demonstrated the careful use of expanded phrases, particularly adjectival, adverbial and verb phrases, to develop ideas with a wider variety of
connectives to keep the story pace flowing and to develop ideas. Furthermore, very strong learners constructed some quite ambitious sentences, which were enhanced by
carefully chosen adjectives and adverbs and the positioning of clauses for effect. Sometimes these learners showed how the use of short simple sentences, especially in
short one-sentence paragraphs, can help to build suspense.
Punctuation
This is generally an area of strength. Demarcation of basic sentences was frequently secure, and there was clear evidence of some use of the comma to mark out clauses or
separate phrases within a long sentence. When punctuating speech, many learners showed that they know what to do. Speech marks were often placed accurately around
spoken words. Many learners were able to place other speech punctuation correctly.
Vocabulary
Most learners used a simple and appropriate vocabulary, showing a good understanding of how adjectives can develop description. Strong learners developed their use of
vocabulary beyond this level by using precise vocabulary where the choice of a particular word, or words, contributed significantly to the creation of image and mood. The
best examples of writing managed to create atmosphere and describe feelings, and included the effective use of adverbs. These learners also showed that they know how to
improve their writing by choosing and experimenting with ambitious words. Learners who produced writing like this also showed that they know how to deploy metaphors or
similes to good effect.
Spelling
Generally, spelling is a strength. Learners are able to spell a range of common words in everyday use and make good attempts to spell compound words and other polysyllabic
words. Basic spelling rules, for example making plurals and changing verb forms, are well known.

3.3 Table and charts of sub-group performances - English 0844

Performances for each syllabus are reported separately; the entries for on-screen and paper-based syllabuses are not combined.

Overall and sub-group performances can change from series to series. You can use the report to compare sub-group performances for this syllabus in this series. You should
not use the information to compare performance changes over time.

Demographic breakdown of total entry for Cambridge Primary Checkpoint English

Age in years   First Language   Percentage of   Average       Average         Average       Average
                                total entry     total score   Reading score   Usage score   Writing score
10 and under   Not English      14.2            3.4           3.3             3.4           3.4
10 and under   English           7.0            3.9           3.8             3.9           3.7
10 and under   All              21.2            3.5           3.5             3.5           3.5
11             Not English      37.6            3.7           3.7             3.6           3.6
11             English          16.1            4.0           4.0             4.0           3.9
11             All              53.7            3.8           3.8             3.7           3.7
12 and over    Not English      19.6            3.6           3.6             3.6           3.6
12 and over    English           5.6            3.8           3.8             3.8           3.8
12 and over    All              25.2            3.7           3.7             3.6           3.6
All            Not English      71.4            3.6           3.6             3.6           3.6
All            English          28.6            3.9           3.9             3.9           3.8
All            All             100.0            3.7           3.7             3.7           3.6

Please note that in the block charts that follow, the horizontal axis representing Cambridge Primary Checkpoint scores is annotated from 0 to 6.

The value 0 represents the group of scores below 1.0,
the value 1 represents the group of scores from 1.0 to 1.9,
the value 2 represents the group of scores from 2.0 to 2.9,
the value 3 represents the group of scores from 3.0 to 3.9,
the value 4 represents the group of scores from 4.0 to 4.9,
the value 5 represents the group of scores from 5.0 to 5.9,
the value 6 represents the group of scores of 6.0 or more.

For the curve graphs which follow the block charts, the horizontal axis also represents Cambridge Primary Checkpoint scores, but here the scores are continuous rather than grouped. The tick marks along the horizontal axis
therefore represent actual Cambridge Primary Checkpoint scores.

Distribution of Cambridge Primary Checkpoint total score for English
classified by student's first language.

Distribution of Cambridge Primary Checkpoint total score for English
classified by student's age.

Distribution of Cambridge Primary Checkpoint total score for English
by student's first language, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint total score for English
by student's age, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Reading score
classified by student's first language.

Distribution of Cambridge Primary Checkpoint Reading score
classified by student's age.

Distribution of Cambridge Primary Checkpoint Reading score
by student's first language, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Reading score
by student's age, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Usage score
classified by student's first language.

Distribution of Cambridge Primary Checkpoint Usage score
classified by student's age.

Distribution of Cambridge Primary Checkpoint Usage score
by student's first language, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Usage score
by student's age, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Writing score
classified by student's first language.

Distribution of Cambridge Primary Checkpoint Writing score
classified by student's age.

Distribution of Cambridge Primary Checkpoint Writing score
by student's first language, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Writing score
by student's age, showing the cumulative
percentage of the number of students at each score.

4. Cambridge Primary Checkpoint Mathematics 0845

4.1 Comments on specific questions - Mathematics 0845 component 01

General Comments
Topics that were well covered:
• Understanding the meaning of each digit in a four-digit number (Question 1)
• Knowing the multiplication facts for the 6× table and the derived division facts (Question 2)
• Using a Venn diagram to sort data using two criteria (Questions 3a and 3b)
• Organizing data in a tally chart and identifying frequencies (Question 7a)
• Finding the mode of a set of data (Question 7b)
• Answering a question by interpreting data displayed in a bar chart (Questions 8a and 8b)
• Deriving a pair of one-place decimals with a sum of 10 (Question 19a)
• Deriving a pair of two-place decimals with a sum of 1 (Question 19b)

Topics that proved to be more difficult:
• Recognising and identifying a seven-sided polygon drawn on a dotted grid (Question 4a)
• Solving a spatial problem involving the identification of a mixed number (Question 6)
• Counting on one thousand from a four-digit number (Question 9)
• Solving a number puzzle involving multiples of 6 (Question 12)
• Solving a number puzzle involving units, tenths and hundredths in decimal notation (Question 13)
• Using a given calculation and place value to solve a related calculation (Questions 17a and 17b)
• Multiplying a two-place decimal by 10 and dividing a number by 100, giving an answer with two decimal places (Question 26)
• Using a protractor to measure a given acute angle (Question 27)
• Calculating the area of a compound shape that can be split into rectangles (Question 28)

Whilst the majority of scripts were well presented and displayed a broad understanding of the Cambridge Primary Mathematics Curriculum, there were papers where the
numerals were unclear. This was sometimes because of the formation of the numerals and sometimes because numerals had been overwritten. Where possible the benefit of
the doubt was afforded to the learner, but occasionally marks could not be awarded because it was impossible to discern the number intended. This has been an issue in
previous tests.
In two-mark questions learners could be awarded one mark for showing a correct method even if the final answer was incorrect. Marks may have been lost where no working
was shown.
Some specific mathematical vocabulary, such as mixed number, square numbers, multiples, heptagon and trapezium, was not widely understood.

Question 1
Objective: To understand the meaning of each digit in a four-digit number.
Common Errors: Most learners answered this correctly. The two most common errors were:

5 an error with place value.


56 possibly missing the 6 given on the right of the answer box.

Question 2
Objective: To know multiplication facts for 6× table and derived division.
Common Errors: Most learners answered this correctly. The few errors seen tended to be:

324 (km) calculating 54 × 6


48 (km) calculating 54 – 6
A few attempted the correct calculation (54 ÷ 6) but made an arithmetic error.

Question 3
(a) Objective: To use a Venn diagram to sort data using two criteria.
Common Errors: Most learners answered this correctly.
Where 21 was misplaced it tended to be in the intersection i.e.

(b) Objective: To use a Venn diagram to sort data using two criteria.
Common Errors: 30 or 29

Question 4
(a) Objective: To recognise and identify a seven-sided polygon drawn on a dotted grid.
Common Errors: The wrong polygon name selected e.g. pentagon, hexagon or octagon.
An incorrect word used e.g. hectagon.

(b) Objective: To draw a trapezium, having one line of symmetry, on a dotted grid.
Common Errors: A variety of incorrect shapes were drawn including:
A shape with one line of symmetry but not a trapezium
A trapezium without a line of symmetry
A copy of the shape given in part (a)

Question 5
Objective: To solve a number puzzle that involves dividing a two-digit number by a one-digit number.
Common Errors: 5 same amount as Lily
14 calculating 24 – (5 + 5)
19 calculating 24 – 5

Question 6
Objective: To solve a spatial problem involving the identification of a mixed number.
Common Errors: A wide variety of incorrect answers were given including:
12/16 the number of slices left as a fraction of the total number of slices (not as a fraction of the pizzas).
12 the number of slices left.
12/8 the amount of pizza left expressed as a vulgar fraction (not as a mixed number).

Question 7
(a) Objectives: To organize data in a tally chart and identify frequencies.
Common Errors: Most learners answered this correctly. A few gave the frequencies as percentages or fractions.

(b) Objectives: To find the mode of a set of data.


Common Errors: Most learners answered this correctly.
A few learners answered 5: the frequency of the modal colour.

Question 8
(a) Objectives: To answer a question by interpreting data displayed in a bar chart
Common Errors: Most learners answered this correctly.
(b) Objectives: To answer a question by interpreting data displayed in a bar chart.
Common Errors: Most learners answered this correctly.
A few learners included June.
A few learners only listed August: the month when the most money was collected.

Question 9
Objective: To count on one thousand from a four-digit number.
Common Errors: 2250 (ml) subtracting 1000 from 3250 rather than adding it.

Question 10
Objective: To find simple fractions of quantities.
Common Errors:
Equating to

and, less often:

Equating to

Question 11
Objective: To know the square numbers to forty nine.
Common Errors: A variety of incorrect answers were given including:

understanding square numbers but miscalculating 7²

possibly attempting to extend a number sequence based on the difference between 9 and 25
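Secure recall of the square numbers to forty-nine (1² up to 7²) underpins this item. As a quick check (a sketch, not part of the original report):

```python
# The square numbers up to forty-nine: n squared for n = 1 to 7.
squares = [n * n for n in range(1, 8)]
print(squares)   # [1, 4, 9, 16, 25, 36, 49]
```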

Question 12
Objective: To solve a number puzzle involving multiples of 6
Common Errors: Three multiples of 6 whose sum is not 60
Three multiples of 6 whose sum is 60 (with a repeat)
Three non-multiples of 6 whose sum is 60
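A brief enumeration shows how few correct answers this puzzle admits, assuming it asks for three different multiples of 6 with a sum of 60 (the ‘with a repeat’ error quoted above suggests repeats were disallowed). This sketch is illustrative, not part of the original report:

```python
# Enumerate triples of distinct multiples of 6 (below 60) whose sum is 60.
from itertools import combinations

multiples_of_6 = range(6, 60, 6)   # 6, 12, ..., 54

valid = [trio for trio in combinations(multiples_of_6, 3) if sum(trio) == 60]
print(valid)   # four triples, e.g. (6, 24, 30)
```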

Question 13
Objectives: To solve a number puzzle involving the representation of units, tenths and hundredths using decimal notation.
Common Errors: A wide range of incorrect answers were given. In particular:
841 taking hundredths to be hundreds and tenths to be tens.
0.48 a miscalculation of the units digit.

Question 14
Objective: To describe the occurrence of familiar events using the language associated with probability.

Common Errors: Matching to

Question 15
Objective: To identify the remainder when a two-digit number is divided by a one-digit number.
Common Errors: 13 confusing the quotient and remainder.
13.57 giving the answer as a decimal and rounding or truncating

Question 16
Objective: To predict where a polygon will be after one reflection.

Common Errors: drawing the correct shape displaced one square to the right.
possibly matching the spaces on the left and right edges of the grid.

Question 17
(a) Objective: To use a given calculation and place value to solve a related calculation.
Common Errors: 7840 calculating 112 × 70 without explaining how to use 112 × 7 = 784 to get the answer.
Stating that you place a nought after 784 to get 7840 without offering a mathematical justification.
Stating that you add a nought to 784 (n.b. 784 + 0 = 784)

(b) Objective: To use a given calculation and place value to solve a related calculation.
Common Errors: 78.4 calculating 11.2 × 7 without explaining how to use 112 × 7 = 784 to get the answer.
Stating that you move the decimal point or add a decimal point without offering a mathematical justification.

Question 18
Objective: To identify fractions and their decimal equivalents.
Common Errors: Matching to 0.2

Matching to 0.75

Matching to 0.25

Matching to 0.4

Matching to 0.75

Question 19
(a) Objective: To derive a pair of one-place decimals with a sum of 10
Common Errors: 7.3 possibly focusing on the units (3 + 7) without accounting for the tenths
(b) Objective: To derive a pair of two-place decimals with a sum of 1
Common Errors: 76 place value error.
0.86 possibly focusing on the tenths (0.8 + 0.2) without accounting for the hundredths.
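Complements of 10 and of 1 can be checked by subtraction. In the sketch below the ‘given’ values are reconstructed from the errors quoted above (a given 3.7 is consistent with the ‘7.3’ error; a given 0.24 with the ‘0.86’ error) and may not match the paper exactly:

```python
# Checking decimal complements exactly with the decimal module.
from decimal import Decimal

answer_a = Decimal("10") - Decimal("3.7")   # one-place decimals with a sum of 10
answer_b = Decimal("1") - Decimal("0.24")   # two-place decimals with a sum of 1

print(answer_a, answer_b)   # 6.3 0.76
```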

Question 20
(a) Objective: To read a co-ordinate using all four quadrants.
Common Errors: (–3, –4) reversal of co-ordinates
(–4, –4) incorrect location of fourth vertex of a square.
(b) Objective: To plot a co-ordinate using all four quadrants.
Common Errors: Plotting the point at (–4, –4)

Question 21
Objective: To order a set of numbers with one or two places of decimal.

Common Errors: possibly seeing the decimal as 2, 4, 12, 14 and 42 hundredths

Question 22
Objective: To multiply two numbers using a process of doubling one number and halving the other

Common Errors: calculating 35 × 8 possibly seeing the = sign as an instruction and not equating the left and right-hand
sides.

calculating 35 × 8 × 2
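The doubling-and-halving strategy works because doubling one factor and halving the other leaves the product unchanged: a × b = (2a) × (b ÷ 2). The values 35 × 16 = 70 × 8 below are an assumed illustration consistent with the errors quoted (learners who calculated 35 × 8), not necessarily the paper's actual numbers:

```python
# Doubling one factor and halving the other preserves the product.
def double_and_halve(a, b):
    """Return an equivalent product pair: double a, halve b (b must be even)."""
    assert b % 2 == 0
    return 2 * a, b // 2

a, b = 35, 16
a2, b2 = double_and_halve(a, b)
assert a * b == a2 * b2          # 35 x 16 == 70 x 8 == 560
print(a2, b2, a2 * b2)           # 70 8 560
```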

Question 23
Objective: To recognise general statements that correctly describe the sums and products of odd and even numbers.
Common Errors: A variety of incorrect answers were given. Amongst the more common errors were:

The more successful learners appeared to use simple calculations to check each statement.

Question 24
Objective: To find all the factors of a two-digit number.
Common Errors: Some learners attempted to give multiples of 33 rather than factors e.g. 33, 66, 99, 132
Some learners missed one or two of the factors e.g. 3, 11
Some learners used numbers other than whole numbers e.g. 3, 11, 2, 17.5
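The distinction learners missed is that a factor divides a number exactly, whereas a multiple is the number times a whole number. A sketch (not part of the original report) for 33, the two-digit number implied by the factors 3 and 11 quoted above:

```python
# Factors divide n exactly; multiples are n times a whole number.
def factors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

print(factors(33))                      # [1, 3, 11, 33]
print([33 * k for k in range(1, 5)])    # multiples: [33, 66, 99, 132]
```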

Question 25
Objective: To solve a number puzzle involving the addition of one-place decimals.
Common Errors: Using numbers other than those provided
Incorrect placement of the given numbers

Question 26
Objective: To multiply a two-place decimal by 10 and divide a number by 100 giving an answer with two decimal places.
Common Errors: A wide variety of incorrect answers were given, showing a lack of understanding of the effects of multiplying by 10 and dividing by 100. These included:
Multiplication by 10: 269, 0.269, 2690, 26.9
Division by 100: 0.358, 35.8
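
Multiplying by 10 shifts every digit one place to the left, and dividing by 100 shifts every digit two places to the right; a small check using exact decimal arithmetic (the values 2.69 and 35.8 are our own illustrations, since the exam's numbers are not reproduced here):

```python
from decimal import Decimal

# Multiplying by 10: each digit moves one place to the left.
print(Decimal("2.69") * 10)    # 26.90
# Dividing by 100: each digit moves two places to the right.
print(Decimal("35.8") / 100)   # 0.358
```

Decimal is used rather than float so the place-value shift is exact and easy to read.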

Question 27
Objective: To use a protractor to measure a given acute angle.
Common Errors: Inaccurate measuring e.g. 60° or 65°
Using the wrong scale on the protractor and giving the supplementary angle e.g. 118°

Question 28
Objective: To calculate the area of a compound shape that can be split into rectangles.
Common Errors: Attempting to calculate the perimeter rather than the area and ignoring the given units e.g. 12 + 7 + 20 + 4 = 43 (cm²) or 7 + 20 + 7 + 20 = 54 (cm²)
Calculating the area of a rectangle measuring 7 cm × 20 cm i.e. 7 × 20 = 140 (cm²)
Splitting the shape into two overlapping areas and finding the sum of these two composite parts e.g. (20 × 4) + (12 × 7) = 164 (cm²)

4.2 Comments on specific questions – Mathematics 0845 component 02

General Comments
Topics that were well covered:
• Positioning numbers on a number line calibrated with divisions every 100 (Question 1)
• Understanding and applying the fact that multiplication is the inverse of division. (Question 3)
• Interpreting data presented in a pictogram. (Question 5a)
• Calculating the difference between two near multiples of 1000 (Question 7)
• Rounding four-digit numbers to the nearest 10, 100 and 1000 (Question 8)
• Solving a number puzzle involving the multiplication of two, two-digit numbers. (Question 9)
• Comparing two five-digit numbers using the signs < and > (Question 11)
• Knowing and applying the test for divisibility by 5 (Question 12)
• Identifying common multiples of 6 and 8 (Question 15)
• Identifying where a triangle will be after a given translation (one point given) (Question 16)

Topics that proved to be more difficult:
• Understanding the effect of changing the value of the symbols in a pictogram. (Question 5b)
• Solving a worded problem involving simple ratio. (Questions 6b and 28)
• Calculating a time interval in minutes from times given in a digital format. (Question 13)
• Expressing a percentage as a vulgar fraction. (Question 20b)
• Comparing numbers written in figures and words using the signs >, < and = (Question 22)
• Measuring lines to the nearest centimetre and millimetre. (Questions 24a and 24b)
• Appreciating time differences in different countries and calculating equivalent
times across different time zones. (Question 26)
• Solving a worded problem involving the conversion of a mass measured in
kilograms to grams. (Question 29)

As in Paper 1, the majority of scripts were well presented and displayed a broad understanding of the Cambridge Primary Mathematics Curriculum.
In a few papers presentation was still an issue. Some figures were ambiguous and alterations to answers made them difficult to read. Wherever possible the benefit of the doubt
was given to learners, but a few answers were too ambiguous to interpret. Also, some drawings were not drawn with care. Marks may have been lost because a learner's
intentions were not clear.
Whilst number operations were generally carried out well, solving worded problems, especially those involving simple ratio and proportion, proved more difficult.
Some of the specific mathematical vocabulary, such as common multiples, prime numbers, tetrahedron, cuboid, pyramid, vertices and edges, was not widely understood.
Some errors could have been due to carrying out longer calculations on paper rather than making sensible use of the available calculator.

Question 1
Objective: To position numbers on a number line calibrated with divisions every 100

Common Errors: Most learners answered correctly.


A few were out by a factor of ten: e.g.

Question 2
Objective: To compare and order angles less than 180°
Common Errors: Measuring the angles and using these values rather than the letters. Where an error was greater than ± 5° no mark was awarded.
Some learners gave the angles in reverse order: largest to smallest.
Angles A and C were sometimes reversed with the right angle not being recognised.

Question 3
Objective: To understand and apply the fact that multiplication is the inverse of division.
Common Errors: Most learners answered correctly.
A few gave the answer 24 by calculating 96 ÷ 4 rather than 96 × 4

Question 4
Objective: To recognise the equivalence between the decimal fraction and vulgar fraction forms of halves, quarters and hundredths.
Common Errors: Most learners correctly expressed 0.75 as 75/100 or 3/4
63/100 was quite often equated to 0.063 rather than 0.63
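
The equivalences being tested here can be confirmed with Python's exact-fraction arithmetic (a check we have added for illustration):

```python
from fractions import Fraction

print(Fraction(75, 100))           # 3/4 -- Fraction reduces to lowest terms
print(float(Fraction(75, 100)))    # 0.75
print(float(Fraction(63, 100)))    # 0.63, not 0.063
```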

Question 5
(a) Objective: To interpret data presented in a pictogram.
Common Errors: 80 (ants) this is the total number of bugs found by Class 4B rather than the number of ants.
2½ (ants) counting each symbol as 1 ant rather than 5

(b) Objective: To understand the effect of changing the value of the symbols used in a pictogram.
Common Errors: Not evaluating the number of spiders collected by each class but repeating the information given in the question regarding the value of each
symbol.
Saying both classes collected the same number by evaluating the total number of bugs collected by each class and not the number of spiders.

Question 6
(a) Objective: To understand simple ideas relating to ratio and proportion.
Common Errors: 6 (cm) calculating 12 × 1/2
1/6 (cm) possibly a misunderstanding of the scale factor
(b) Objective: To understand simple ideas relating to ratio and proportion.
Common Errors: 12 (cm) repeating the height of the real toy rather than calculating its length.
72 (cm) calculating 12 × 6

Question 7
Objective: To calculate the difference between two near multiples of 1000
Common Errors: Most learners answered correctly.
A few answered 4002 by calculating 2005 + 1997

Question 8
Objective: To round four-digit numbers to the nearest 10, 100 and 1000
Common Errors: Most learners answered correctly.

Question 9
Objective: To solve a number puzzle involving the multiplication of a two-digit number by a two-digit number.
Common Errors: Whilst most learners answered correctly the following errors were seen:

Using digits other than those provided e.g.

Arranging the digits to give the wrong product e.g.

Question 10
Objective: To compare the time of day using digital and analogue formats.

Common Errors: possibly failing to register that the time was in the afternoon not the morning.

possibly knowing 4:35pm was afternoon but not equating 16:25 with 4:25pm.

Question 11
Objective: To compare two five-digit numbers using the signs < and >
Common Errors: Whilst most learners answered correctly, a few appeared to find the five-digit numbers difficult to read and reproduce.

A more common error in the first inequality was:

A more common error in the second inequality was:

Question 12
Objective: To know and apply the test for divisibility by 5
Common Errors: Most learners answered this correctly but a few did not offer sufficient explanation: e.g.
Stating that 342 was not divisible by 5 (or a multiple of 5) without qualifying how this was known.
Stating that 342 ÷ 5 did not result in a whole number but not evaluating this.
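
The test learners were expected to state (a number is divisible by 5 exactly when its final digit is 0 or 5) can be sketched as follows; the function name is our own:

```python
def divisible_by_5(n):
    """Divisibility-by-5 test: the last digit must be 0 or 5."""
    return str(abs(n))[-1] in "05"

print(divisible_by_5(342))   # False -- 342 ends in 2
print(342 % 5)               # 2, so 342 / 5 is not a whole number
```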

Question 13
Objective: To calculate a time interval in minutes from times given in a digital format.
Common Errors: A wide variety of incorrect answers were given including:
30 (minutes) possibly reading the question as: “how much later did Chen read” and calculating the interval from 10:58 to 11:28
50 (minutes) possibly treating the times as decimal numbers i.e. for Pierre 10.58 – 9.15 = 1.43 taking this to be 1 hour 43 mins.
for Chen 11.28 – 9.35 = 1.93 taking this to be 1 hour 93 mins.
and then finding the difference: 1 hr 93 mins − 1 hr 43 mins = 50 mins
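
The decimal-subtraction error described above disappears if the times are treated as hours and minutes rather than as decimal numbers. A sketch using the times quoted in the commentary (that the question asked for the difference between the two readers is our assumption):

```python
from datetime import datetime

FMT = "%H:%M"

def minutes_between(start, end):
    """Whole minutes between two digital clock times on the same day."""
    delta = datetime.strptime(end, FMT) - datetime.strptime(start, FMT)
    return int(delta.total_seconds() // 60)

pierre = minutes_between("09:15", "10:58")   # 103 minutes, not "1.43"
chen = minutes_between("09:35", "11:28")     # 113 minutes, not "1.93"
print(pierre, chen, chen - pierre)
```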

Question 14
Objective: To draw rectangles of a given area on a 1 cm dotted grid.
Common Errors: Giving the correct dimensions but drawing an incorrect rectangle. e.g
This was possibly because the dots were counted rather than the squares.

A number of learners drew shapes other than a rectangle, most commonly a triangle.

Question 15
Objective: To identify numbers which are common multiples of 6 and 8
Common Errors: Most learners answered this correctly

A few selected three numbers possibly selecting multiples of 6 or 8 rather than 6 and 8
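
The and/or distinction in this question (multiples of 6 and 8, not 6 or 8) can be expressed directly: the common multiples of 6 and 8 are precisely the multiples of their lowest common multiple, 24. A minimal sketch:

```python
from math import gcd

# Lowest common multiple of 6 and 8
lcm = 6 * 8 // gcd(6, 8)
print(lcm)   # 24

# Common multiples of 6 AND 8 up to 100 -- not merely multiples of one of them
common = [n for n in range(1, 101) if n % 6 == 0 and n % 8 == 0]
print(common)   # [24, 48, 72, 96]
```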

Question 16
Objective: To identify where a triangle will be after a given translation (one point given).
Common Errors: Most learners answered this correctly although the following errors were seen:
Drawing a non-congruent shape.
Drawing an attempt at a rotation of the triangle.
Drawing an approximate sketch without enough clarity to make the intention clear.

Question 17
Objective: To identify prime numbers up to 20
Common Errors: 11, 13, 15, 17, 19 possibly equating odd numbers with prime numbers
A number of learners included prime numbers outside of the required range. If correct these were discounted, but several answers included 1, which is not prime.
Question 18
Objective: To use the language associated with probability to describe the likelihood of a given event occurring.
Common Errors: All of the incorrect options were chosen separately by a number of learners.

Question 19
Objective: To solve a worded problem that involves multiplying amounts of money by a single and a two-digit number.
Common Errors: A number of learners employed a correct method but made one or several arithmetic errors.
Calculating 6 × $2.75 + 4 × $4.60, confusing the class number with the number in the class

Question 20
(a) Objective: To find a simple percentage of a shape.
Common Errors: A variety of errors were made including:

(b) Objective: To express a percentage as a vulgar fraction.
Common Errors: 3/10, the fraction of the rectangle that was shaded.

Question 21
Objective: To add and subtract numbers with the same and different numbers of decimal places.
Common Errors: Inserting the numbers from the example, ignoring the given rules e.g.

Question 22
Objective: To use the >, < and = sign correctly.
Common Errors: A variety of incorrect answers were given, in particular false, false, true.
Errors occurred when the statements were compared as written rather than being converted to figures or other equivalent formats.

Question 23
Objective: To solve a number puzzle that involves knowing and applying the arithmetic laws as they apply to addition and multiplication.

Common Errors: calculating (5 + 5) × 3 ignoring the requirement to have the same number in each box.

calculating (3.87 + 3.87) × 3.87 an attempt at calculating √15

Question 24
(a) Objective: To measure a line to the nearest centimetre.
Common Errors: 2.8 (cm) correctly measuring the line but not rounding to the nearest centimetre.

(b) Objective: To measure a line to the nearest millimetre.


Common Errors: 50 (mm) measuring inaccurately.
28 (mm) measuring the wrong side.

Question 25
Objective: To visualise and describe the properties of given 3D shapes.
Common Errors: A wide variety of incorrect answers were given.
Tetrahedron appeared to be less well understood than cuboid or square-based pyramid.
Vertices and edges appeared to be less well understood than faces.
Question 26
Objective: To appreciate time differences in different countries and calculate equivalent times across different time zones.
Common Errors: 11:35 seven hours later than 04:35 rather than seven hours earlier.
9:35 not distinguishing 9:35 in the morning from 9:35 in the evening.
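
Working seven hours earlier rather than later is simple modular clock arithmetic; a sketch (the seven-hour offset and the 04:35 time come from the commentary above, the function name is ours):

```python
def shift_hours(hhmm, delta_hours):
    """Shift a 24-hour clock time by whole hours, wrapping past midnight."""
    h, m = map(int, hhmm.split(":"))
    return f"{(h + delta_hours) % 24:02d}:{m:02d}"

print(shift_hours("04:35", -7))   # 21:35 (seven hours earlier)
print(shift_hours("04:35", +7))   # 11:35 (the common error: seven hours later)
```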

Question 27
Objective: To give the answer to a division as a mixed number.
Common Errors: A wide variety of incorrect answers were given including:

Question 28
Objective: To solve a worded problem involving simple ratio.
Common Errors: 6 (cats) calculating 36 ÷ 6
216 (cats) calculating 36 × 6

Question 29
Objective: To solve a worded problem that involves converting a mass measured in kilograms to grams.
Common Errors: ($) 62 incorrect conversion of cents to dollars
($) 7.595 calculating 2.17 × 3.5
($) 5.67 calculating 3.5 + 2.17

4.3 Table and charts of sub-group performances – Mathematics 0845

Performances for each syllabus are reported separately; the entries for on-screen and paper-based syllabuses are not combined.

Overall and sub-group performances can change from series to series. You can use the report to compare sub-group performances for this syllabus in this series. You should
not use the information to compare performance changes over time.

Demographic breakdown of total entry for Cambridge Primary Checkpoint Mathematics

Age in years | First Language | Percentage of total entry | Average total score | Average Geometry and measure score | Average Handling data score | Average Number score
10 and under | Not English | 14.3 | 3.7 | 3.6 | 3.7 | 3.7
10 and under | English | 5.8 | 3.9 | 3.8 | 4.1 | 3.9
10 and under | All | 20.1 | 3.8 | 3.7 | 3.8 | 3.8
11 | Not English | 39.8 | 3.8 | 3.8 | 3.9 | 3.8
11 | English | 13.3 | 3.9 | 3.9 | 4.0 | 3.9
11 | All | 53.1 | 3.9 | 3.8 | 3.9 | 3.9
12 and over | Not English | 22.3 | 3.8 | 3.8 | 3.7 | 3.8
12 and over | English | 4.6 | 3.9 | 3.8 | 3.8 | 3.8
12 and over | All | 26.9 | 3.8 | 3.8 | 3.7 | 3.8
All | Not English | 76.3 | 3.8 | 3.8 | 3.8 | 3.8
All | English | 23.7 | 3.9 | 3.9 | 4.0 | 3.9
All | All | 100.0 | 3.8 | 3.8 | 3.8 | 3.8

Please note that in the block charts that follow, the horizontal axis representing Cambridge Primary Checkpoint scores is annotated from 0 to 6.

The value 0 represents the group of scores below 1.0,


the value 1 represents the group of scores from 1.0 to 1.9,
the value 2 represents the group of scores from 2.0 to 2.9,
the value 3 represents the group of scores from 3.0 to 3.9,
the value 4 represents the group of scores from 4.0 to 4.9,
the value 5 represents the group of scores from 5.0 to 5.9,
the value 6 represents the group of scores of 6.0 or more.

For the curve graphs which follow the block charts, the horizontal axis also represents Cambridge Primary Checkpoint scores, but here the scores are continuous rather than grouped. The tick marks along the horizontal axis
therefore represent actual Cambridge Primary Checkpoint scores.

Distribution of Cambridge Primary Checkpoint total score for Mathematics
classified by student's first language.

Distribution of Cambridge Primary Checkpoint total score for Mathematics
classified by student's age.

Distribution of Cambridge Primary Checkpoint total score for Mathematics
by student's first language, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint total score for Mathematics
by student's age, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Geometry and measure score
classified by student's first language.

Distribution of Cambridge Primary Checkpoint Geometry and measure score
classified by student's age.

Distribution of Cambridge Primary Checkpoint Geometry and measure score
by student's first language, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Geometry and measure score
by student's age, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Handling data score
classified by student's first language.

Distribution of Cambridge Primary Checkpoint Handling data score
classified by student's age.

Distribution of Cambridge Primary Checkpoint Handling data score
by student's first language, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Handling data score
by student's age, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Number score
classified by student's first language.

Distribution of Cambridge Primary Checkpoint Number score
classified by student's age.

Distribution of Cambridge Primary Checkpoint Number score
by student's first language, showing the cumulative
percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Number score
by student's age, showing the cumulative
percentage of the number of students at each score.

5. Cambridge Primary Checkpoint Science 0846

5.1 Comments on specific questions – Science 0846 component 01

General comments
Knowledge-based recall questions were generally answered well, as were many of the questions which required analysis and interpretation. Questions which required open
responses sometimes lacked sufficient detail, with many candidates giving one-word answers which needed to be expanded.
Questions on scientific enquiry were answered well by many candidates who appeared to be familiar with practical investigations, and showed that first-hand experience is
invaluable. The ideas of fair testing, reliability and accuracy continue to be areas where there could be more focus, along with the identification of the correct scientific
equipment for different measurements.
Candidates demonstrated good knowledge of the following areas of the framework: identification of the simpler organs of the human body, caring for the environment and
food chains. The following areas could have greater focus: how pitch and loudness can be altered using musical instruments, the plotting of line graphs and drawing a line to
complete the graph.

Question 1
Many learners answered the first two questions in the human organs quiz correctly, showing that they are familiar with more common organs. Some learners also displayed
knowledge of the kidneys and the liver.

Question 2
Some of the learners thought that they needed to add another lamp to the electrical circuit to make it brighter or that they could also achieve this by making the electrical wires
longer. Some of the learners correctly selected that they needed to add an extra cell rather than add another lamp and that the electrical wires should be made shorter.

Question 3
This question was answered well by many of the learners showing that this area of the framework has been covered well.

Question 4
Many of the learners correctly filled in the gaps about the electrical conductors and insulators in an electrical wire.

Question 5
(a) Some of the learners correctly classified all the substances as a solid, liquid or gas. Many of the learners answered that gasoline is a gas rather than a liquid.
(b) Many of the learners correctly stated the change of state when copper melts.
(c) Many of the learners correctly stated the change of state when copper boils. Some of the learners thought that the copper would be changing from a solid into a gas.
(d) Some of the learners were able to interpret the information provided to determine that at 2000 °C the copper would be a liquid. Some of the learners thought that it would
be a gas.

Question 6
Many of the learners provided sensible suggestions of what Galileo would be able to see through the telescope. Some of the learners thought that they would be able to see
the Earth and provided answers of what Galileo found out using his telescope rather than answering what the question was asking. Some of the learners just said space, and
as space was provided in the stem of the question, it was an insufficient answer.

Question 7
Many of the learners correctly identified the different ways we can care for the environment from the suggestions provided. Some of the learners thought having a bath would
be one way.

Question 8
(a) Some of the learners answered this question well and correctly stated that the light would be blocked or be falling on an opaque object. Some simply said that shadows
would be formed when the light was on rather than providing the detail of how shadows are formed.
(b) Some of the learners answered this question well and correctly stated a variety of different ways that shadows can change during the day.

Question 9
Some of the learners provided good answers about how penguins are adapted to live in the Antarctic.
Some of the learners thought that penguins have fur and discussed adaptations associated with having fur rather than feathers. Most of the learners were able to provide
features and explanations of how these features made them suited to this environment.

Question 10
(a) Some of the learners provided good descriptions of the salt dissolving in the water. Some of the learners thought that it was the sand which dissolved or both the salt
and the sand. Some of the learners discussed the salt mixing with the water when they needed to refer to it dissolving and some of the learners identified that a solution
would be formed but needed to indicate that it would be a salt solution which is formed.
(b) Many of the learners correctly stated that the insoluble substance was the sand and only a few of the learners thought that it was salt.
(c) Only a few of the learners appreciated that the mixture was heated gently at stage 3 to make the salt dissolve quicker. Many of the learners thought that it was to
evaporate the water to obtain salt crystals.
(d) Many of the learners correctly answered this question.

Question 11
This was answered well by some of the learners. Many of the learners knew that he put the inhaler into his mouth and that it travelled down his windpipe to his lungs.

Question 12
(a) Many of the learners correctly stated that the bell made a sound because it vibrated. Some of the learners simply said because it was shaking or moving which was
insufficient as it was given in the stem of the question.
(b) A few of the learners were able to accurately describe that the sound would change as it would now have a higher pitch. Answers were expected to be comparative and
refer to pitch so that they were not ambiguous. This is an area of the framework where there could be greater focus.

(c) A few of the learners were able to accurately describe that the sound would now be softer. Answers were expected to be comparative and they needed to be clear that
volume was being described so that they were not ambiguous. This is an area of the framework where there could be greater focus.

Question 13
(a) Many of the learners identified this as a food chain. Only a few learners thought that it was a food web. Some of the learners described it as a feeding relationship which
was in the stem of the question. A few learners wrote out the food chain again.
(b) Many of the learners answered this correctly.
(c) Some of the learners thought that both the fox and the rabbit were predators. The rabbit is not a predator as it is not chasing and eating another animal.

Question 14
(a) Many of the learners correctly identified the force as being friction.
(b) Many of the learners correctly identified the piece of equipment as a forcemeter or a newton meter.
(c) Some of the learners knew that the unit of force is newtons. Some of the learners thought that it was metre.

Question 15
(a) Many learners correctly used the results provided to predict the result for flower D by looking at the trend.
(b) Many learners made correct predictions about using blue ink rather than red ink. Most of the learners described the flower turning blue.
(c) Many of the learners shaded some of the petals turning red from the red ink. Some of the learners did not attempt this question.

Question 16
(a) Some of the learners were able to accurately plot both of the points. Some of the learners did not attempt this question.
(b) Some of the learners were able to complete the line by joining the last three points with a straight line. Some of the learners wanted to include the anomalous result and
drew a line from the point at 20 mins to the anomalous point at 10 mins and then connected this with a line to the two points they had plotted. This was also a question
that some of the learners did not attempt.
(c) Many of the learners correctly identified that the result that did not fit the pattern was the result at 10 mins, which did not fit onto the line drawn. Some of the learners
thought that it was the result at 0 mins.

Question 17
(a) Some of the learners thought that they should keep the materials the same but this was the variable which was being investigated.
(b) Many of the learners correctly interpreted the results to determine which parachute fell the slowest. Some of the candidates thought that it was parachute C which was
the parachute which fell the quickest.

Question 18
Many of the candidates knew some of the properties of copper. Some of the learners thought that copper had a low melting point.

5.2 Comments on specific questions – Science 0846 component 02

General comments
Knowledge-based recall questions were generally answered well, as were many of the questions which required analysis and interpretation. Questions which required open
response answers were at times answered with insufficient detail, with many giving one-word answers which needed to be expanded.
Questions on scientific enquiry were answered well by many candidates who appeared to be familiar with practical investigations, and showed that first-hand experience is
invaluable. The idea of fair testing, reliability and accuracy continues to be an area where there could be more focus along with the identification of the correct scientific
equipment for different measurements.
Candidates demonstrated good knowledge of the following areas of the framework: parts of the plant, the Earth and the Sun, seed dispersal, evaporation and changes of
state. An area where there could be greater focus is the sound topic, including how to measure sound and the correct units of measurement.

Question 1
(a) Most of the learners correctly circled one of the two pictures of roots for the part of the plant which takes in water.
(b) This was answered correctly by the majority of the learners.

Question 2
(a) Many of the learners answered this correctly showing a good understanding of this area of the framework.
(b) Many of the learners answered this correctly showing a good understanding of this area of the framework.
(c) Many of the learners answered this correctly showing a good understanding of this area of the framework.

Question 3
(a) Most learners demonstrated a good understanding of this area of the framework and correctly stated that they could use a magnet to separate the steel cars from the
aluminium cars. Some of the learners thought that they could do it based on their different strengths or just by looking at the cars.
(b) Some learners answered this well. Good answers to this question explained that a magnet could be used as steel is attracted to magnets and that aluminium is not
attracted to magnets. Some candidates only discussed the steel being attracted but did not make it clear that the aluminium would not be attracted as well.

Question 4
(a) Many of the learners correctly matched all the seeds to their correct method of seed dispersal.
(b) A few of the learners provided answers such as water, which is one of the methods already given in the previous question, when the question was asking for a different
method. Many of the learners gave the answer of explosive or self-dispersal, or good descriptions of explosive dispersal.
(c) Good answers to this question described how the animal would eat the strawberry and then deposit the seeds in their waste (faeces) or that they would spit the seeds
out. Both parts of eating and the dispersal method were needed for the mark.

Question 5
(a) There is a lot of coverage on the framework about evaporation and condensation so learners were expected to display this knowledge and not just simply say because
the water could be cold again. They needed to include the process of the water being cooled or condensed to illustrate why it is a reversible change. For the description
of the rice being an irreversible change learners were expected to describe how they could not get the rice back to its original state once it had been cooked. Some
learners just provided their knowledge of it being a chemical change rather than applying it to the situation illustrated in the question.
(b) Good answers to this question displayed knowledge of the formation of water vapour which would then condense on the cold kitchen walls thus making them wet.

Question 6
(a) Overall this was answered well and many of the learners selected the correct pictures displaying a pull force. Learners answered this question in a variety of ways, all of
which were accepted if the answer was correct.
(b) Overall this was answered well and many of the learners selected the correct pictures displaying a push force. Learners answered this question in a variety of ways, all
of which were accepted if the answer was correct.

Question 7
(a) Many of the learners correctly circled the anther and the filament as being the male parts of the flower.
(b) Some of the learners correctly stated that the pollen fuses with the ovum during fertilisation. Some of the learners thought that it was the anther. The anther makes the
pollen, but pollen was needed for the mark.
(c) Many of the learners correctly identified part X as being the seeds of the zucchini fruit.

Question 8
(a) Many of the learners answered this correctly showing good knowledge of this part of the framework.
(b) Many of the learners answered this correctly showing good knowledge of this part of the framework.

Question 9
This was answered well by some of the learners. Some of the learners drew the line for the mouth too low, labelling the throat. Some of the learners thought that food was
absorbed in the kidneys or in the large intestine.

Question 10
(a) Some of the learners correctly stated that Safia could feel vibrations with her hands once the radio was turned on. Some of the learners only stated that she would feel
the radio moving which was insufficient for the mark.
(b) A few of the learners answered this correctly demonstrating their knowledge of scientific equipment.
(c) There were a variety of responses to this question. This is an area of the framework where there could be greater focus.

Question 11
(a) Many of the learners correctly stated that the water was the solvent.
(b) Many learners showed a good knowledge of this area of the framework and correctly stated that the water had evaporated.
(c) Some of the learners identified the solid as being magnesium sulfate; both parts of the compound's name were required for the mark.

Question 12
(a) This was answered well demonstrating a good understanding of this area of the framework.

(b) Many of the learners thought that the lamp needed to be added next to the other lamp in the electrical circuit rather than appreciating that it could be placed anywhere
within the electrical circuit.

Question 13
(a) Some learners correctly identified two factors which would need to be kept the same for the investigation to be a fair test. Some of the learners did not include sufficient
detail in their answers. Water on its own was insufficient; they needed to include the same amount or volume of water to achieve the mark.
(b) A few of the learners identified the light intensity or a description of this as the variable which was being changed during the investigation. Some of the learners
interpreted the question differently and thought that they may be investigating the effect of different colours on plant growth and this was also accepted.
(c) Some of the learners displayed good ideas of measurements which could be made to compare the plants' growth in the different conditions. Learners were expected to
provide quantitative measurements such as length and size rather than just stating how much the plant has grown.

Question 14
(a) Some of the learners drew straight lines from the monitor into Rajiv's eye. Some of the learners did not draw the line all the way to his eye, so they could not be
    awarded the mark. Lines reflected from the monitor onto the keyboard and then into his eye were also accepted. Some of the learners did not draw arrows to show
    the direction in which the light was travelling.
(b) Some of the learners correctly drew a straight line from the lamp into Rajiv's eyes. Again, some learners drew correct reflection diagrams, which were accepted.

Question 15
Many of the learners thought that they could use a magnet to separate the copper from the mixture, believing that copper was magnetic. Others suggested hand-picking,
which is not good practice. Learners were expected to use the information provided in the table, appreciate the difference in particle size, and therefore select a sieve to
separate the copper lumps from the salt and steel powder. In their explanation they were expected to discuss how sieving would separate the mixture, rather than just
repeating the information given in the table that the copper was in lumps and the salt and steel were powders.

Question 16
(a) This was answered well by many of the learners.
(b) Some of the learners thought that it was the measurement of 4.8 cm. This was incorrect, as that value is more similar to the other results than the measurement of 6.4 cm.
(c) Good answers to this question described how the measurement of 3.2 cm was shorter than the previous results, whereas if the magnet had been stronger it should
    have travelled a greater distance than in the previous results.

Question 17
(a) Most of the learners knew that the volume of water was kept the same so that the investigation would be fair.
(b) Many of the learners provided good predictions, showing that they knew how to formulate their own. Some of the learners did not discuss the effect of temperature
    on the rate of evaporation but instead used the idea of surface area, which was the investigation already given in the question.

Question 18
Many of the learners correctly circled that an animal cannot be at the start of a food chain. Some of the learners thought that it could be plankton.

5.3 Table and charts of sub-group performances – Science 0846

Demographic breakdown of total entry for Cambridge Primary Checkpoint Science

Age in years   First Language   Percentage of   Average total   Average         Average           Average         Average Scientific
                                total entry     score           Biology score   Chemistry score   Physics score   enquiry score
10 and under   Not English       14.2           3.8             3.9             3.8               3.8             3.8
10 and under   English            5.8           4.4             4.4             4.3               4.3             4.2
10 and under   All               20.0           4.0             4.0             3.9               4.0             4.0
11             Not English       39.9           4.1             4.1             4.1               4.1             4.1
11             English           13.5           4.3             4.3             4.2               4.3             4.2
11             All               53.4           4.2             4.1             4.1               4.1             4.1
12 and over    Not English       21.9           4.0             3.9             3.9               3.9             4.0
12 and over    English            4.7           4.2             4.2             4.1               4.1             4.2
12 and over    All               26.6           4.0             4.0             4.0               3.9             4.1
All            Not English       76.0           4.0             4.0             4.0               4.0             4.0
All            English           24.0           4.3             4.3             4.2               4.3             4.2
All            All              100.0           4.1             4.1             4.0               4.0             4.1

Please note that in the block charts that follow, the horizontal axis representing Cambridge Primary Checkpoint scores is annotated from 0 to 6.

The value 0 represents the group of scores below 1.0,
the value 1 represents the group of scores from 1.0 to 1.9,
the value 2 represents the group of scores from 2.0 to 2.9,
the value 3 represents the group of scores from 3.0 to 3.9,
the value 4 represents the group of scores from 4.0 to 4.9,
the value 5 represents the group of scores from 5.0 to 5.9,
the value 6 represents the group of scores of 6.0 or more.

For the curve graphs which follow the block charts, the horizontal axis also represents Cambridge Primary Checkpoint scores, but here the scores are continuous rather than grouped. The tick marks along the horizontal axis
therefore represent actual Cambridge Primary Checkpoint scores.

The charts in this section are as follows:

Distribution of Cambridge Primary Checkpoint total score for Science classified by student's first language.

Distribution of Cambridge Primary Checkpoint total score for Science classified by student's age.

Distribution of Cambridge Primary Checkpoint total score for Science by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint total score for Science by student's age, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Biology score classified by student's first language.

Distribution of Cambridge Primary Checkpoint Biology score classified by student's age.

Distribution of Cambridge Primary Checkpoint Biology score by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Biology score by student's age, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Chemistry score classified by student's first language.

Distribution of Cambridge Primary Checkpoint Chemistry score classified by student's age.

Distribution of Cambridge Primary Checkpoint Chemistry score by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Chemistry score by student's age, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Physics score classified by student's first language.

Distribution of Cambridge Primary Checkpoint Physics score classified by student's age.

Distribution of Cambridge Primary Checkpoint Physics score by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Physics score by student's age, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Scientific enquiry score classified by student's first language.

Distribution of Cambridge Primary Checkpoint Scientific enquiry score classified by student's age.

Distribution of Cambridge Primary Checkpoint Scientific enquiry score by student's first language, showing the cumulative percentage of the number of students at each score.

Distribution of Cambridge Primary Checkpoint Scientific enquiry score by student's age, showing the cumulative percentage of the number of students at each score.
