Micro-Enquiry: something to try?

English year 5 pupils scored pretty well on the TIMSS 2019 science enquiry questions. In this post I will:

  • describe the questions asked;
  • report how English Y5 pupils performed (for what it’s worth);
  • evaluate these questions; and
  • suggest how I will adapt my practice to support pupils’ progress at these sorts of questions.

TIMSS and the Report

TIMSS is an international pupil performance study which takes place every four years (much like PISA). It ranks year 5 and year 9 pupils across different educational jurisdictions on maths and science performance. The most recent cycle was in 2019, but it’s been going for years.

Many different reports are generated from the database. The report I’ve based this post on is Findings from the TIMSS 2019 Problem Solving and Inquiry Tasks (p49 onwards).

The Assessment

In 2019, TIMSS assessed science enquiry and problem solving using an online app. The app set up a scenario and then asked specific questions and set tasks. Pupils responded online.

The Scenario

TIMSS set up a mystery – George lives on a farm. An animal has broken into George’s vegetable garden and chomped some of his vegetables. Which animal was it?

The Questions

1. What clues should George look for in his garden? Pupils type in appropriate clues, e.g. footprints, bite marks, hair etc.

English pupils came 4th internationally with a score of 40 (+/-2.2). Girls scored 44 while boys scored 35. The highest score was Sweden with 45.

New evidence – a picture of farm animals’ foot/hoof prints.

2. In what ways are the prints different (e.g. size, shape, number of parts etc.)? This assesses pupils’ ability to identify relevant differences and categorise.

English pupils came 3rd internationally with a score of 66 (+/-2.1). Girls scored 63 while boys scored 69. The highest score was the Republic of Korea with 82.

3. Measure the footprints. Pupils are asked to measure the prints with an online ruler. Measuring is tricky: pupils have to line up the ruler at both ends and read a scale.

English pupils came 9th internationally with a score of 67 (+/-2.8). Girls and boys both scored 67. The highest score was Taipei with 81.

4. Using a key: pupils are presented with an identification key. Can they identify the animal from its print?

English pupils came 1st internationally with a score of 72 (+/-2.3). Girls scored 73 while boys scored 70.

5. Applying knowledge – the animal ate plants. Which animals are most likely to have raided the garden: cow, chicken, dog or goat? This is a carnivore/herbivore question.

English pupils came 11th internationally with a score of 53 (+/-2.9). Girls and boys both scored 53. The highest score was Singapore with 70.

New evidence – George finds some animal hairs…

6. Which two scientific instruments could George use to examine the hairs? (Answer: a magnifying glass and a microscope.)

English pupils came 9th internationally with a score of 69 (+/-2.5). Girls scored 68 while boys scored 69. The highest score was the Slovak Republic with 76.

George compares the found hair to samples of cow and goat hair.

7. Which animal’s hair is the found hair most similar to?

English pupils came 2nd internationally with a score of 35 (+/-2.5). Girls scored 38 while boys scored 32. The highest score was the Republic of Korea with 46.

CASE SOLVED!

Evaluation

I don’t find the rankings particularly useful, other than to note that our pupils aren’t much better or worse than pupils internationally.

The questions are interesting though. We definitely want our pupils to be able to do this sort of thing: this sort of problem solving using evidence is a key desired outcome of our science teaching. There are plenty of other types of question TIMSS could have asked (e.g. which source of evidence is most reliable) – but let’s just consider these ones for now.

My two big questions are:

  • how can we most efficiently teach pupils to be effective problem solvers, and
  • how much curriculum time should we commit?

A few possible answers might be:

  1. Nothing – kids learn to do this without any specific teaching (a null hypothesis). This would require no additional curriculum time.
  2. Doing lots of lessons like the TIMSS one – with a scenario and guided questions. This would take considerable curriculum time. What would you not teach in its place? (Note: there is substantive knowledge in the TIMSS task – measuring, applying knowledge about appropriate equipment and classification of animals. Is this the most efficient method for teaching those concepts?)
  3. Dropping in individual questions through your lessons – so whenever one of these question types is applicable, spend a minute or two modelling it, asking pupils to discuss or work independently. Little and often.

I think option 3 is likely to be the most effective approach. Pupils get many more opportunities to see knowledge and skills applied across a wider set of contexts, while leaving time to teach key scientific content. Pupils find it easier to transfer knowledge to new contexts if they’ve seen it applied in multiple contexts.

So, in my teaching this half term, I will plan in these questions (year 5 – we’re studying forces). I won’t drop any of the content we’ve planned in the MTP, but I will do my best to incorporate micro-enquiry questions. Any feedback I give will be anecdotal, but I hope interesting.

Ben
