Better than Wrong

Science exams are typically examples of the difficulty model of assessment, in which answers are right or wrong (see Christodoulou, Making Good Progress, chapter 3). Assessors can use a marking rubric to judge whether an answer reaches a threshold of ‘correctness’ and award or deny a mark.

For many answers, this will be uncontroversial. However, some answers are less wrong than others: some answers are better than wrong.

A colleague and I took the answers to a single-sentence question from an end-of-year exam. Many students got it wrong, but their answers weren’t uniformly wrong.

A mark scheme is binary: right or wrong. But we wanted to know whether we could do better than that: could we rank the answers? Could we identify which answers were almost there?

The answer is yes, and pretty reliably, using comparative judgement.

Comparative judgement (CJ) is a method for ranking and scoring writing by comparing two pieces at a time. All the judge needs to do is decide which of the two is better, then repeat with a new pair. Typically this is done with longer texts (we will try using long answers after the summer), but I was particularly interested in seeing what happens when you compare single sentences.
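For the curious, here is roughly how a CJ engine turns a pile of pairwise decisions into scores. Most tools fit a statistical model along the lines of Bradley–Terry; the Python sketch below is a minimal illustration of that idea, with made-up judgement data and a simple gradient-ascent fit – not the method any particular CJ tool actually uses. Answers that keep winning comparisons drift up the scale; answers that keep losing drift down.

```python
import math

# Made-up judgements for illustration: each tuple is (winner, loser),
# i.e. the judge decided the first answer was better than the second.
judgements = [
    ("A", "B"), ("A", "C"), ("B", "C"),
    ("A", "B"), ("C", "B"), ("A", "C"),
]

answers = sorted({a for pair in judgements for a in pair})
scores = {a: 0.0 for a in answers}  # everyone starts level

def p_win(x, y):
    """Bradley-Terry probability that x beats y, given current scores."""
    return 1.0 / (1.0 + math.exp(scores[y] - scores[x]))

# Gradient ascent on the log-likelihood of the observed judgements.
learning_rate = 0.05
for _ in range(200):
    for winner, loser in judgements:
        surprise = 1.0 - p_win(winner, loser)  # how unexpected the win was
        scores[winner] += learning_rate * surprise
        scores[loser] -= learning_rate * surprise

# Print the answers ranked from best to worst.
for answer in sorted(scores, key=scores.get, reverse=True):
    print(f"{answer}: {scores[answer]:+.2f}")
```

Real CJ engines add refinements (adaptive pairing, reliability estimates), but the core mechanism is just this: many small better/worse decisions accumulate into a single scale.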

You can try for yourself here.

Typically, for longer pieces of writing, you get a fairly even distribution of scores. This didn’t happen here.

Results of CJ task #1: Better than Wrong

The results show three populations of answers: right, wrong and better than wrong.

Conclusions

  1. For some questions, students should simply learn the approved answer. Definitions are a good example of this.
  2. Binary right/wrong answers often fail to pinpoint students’ understanding/misunderstanding.
  3. To do this routinely for individual questions would be too time-consuming, but it was a short, instructive activity for teachers before discussing better ways of teaching the concept.

Thank you to all of the additional teachers who helped with the judgements.

The data is here.

Using Physics Questions to Build Problem Solving, Literacy and Knowledge.

A few weeks ago I observed an English teacher pull a sentence apart. A line from Romeo and Juliet was on the board, and the class spent ten minutes together identifying the parts of speech (verb, noun, particle, etc.); the effect of words on the reader (to connote is a verb – I didn’t know that); and language techniques (repetition, alliteration, rhyme, personification, etc.). They marked it all up on the board, and the students practised the same mark-up on paper (board = paper, from Teach Like a Champion 2.0).

I thought it was like looking at a diagram goal-free (the goal-free effect, from Cognitive Load Theory). Without worrying about the question at first, students were laying down all of the information they could about the line.

Then, as a class, they began to add the context. The result was an exploded sentence, with all of the bones exposed.

They answered questions about the line.

I wondered whether I could do something similar with a physics sentence – specifically an exam question. Instead of looking at the language techniques, I would pull apart the vocabulary and knowledge.

So I took some exam papers and chose a couple of questions to try.

The first thing I noticed is that single-sentence questions are rare in GCSE physics (especially at higher tier). Many questions also have diagrams:

So I split the task into several parts. First, we went ‘goal-free’ on the diagram, which took 5 minutes using a think-pair-share.


Goal-Free exploration of the diagram using Think-Pair-Share activity

(AQA Physics Unit 1 June 2016)

We did a mini Control the Game (Reading Reconsidered) on the opening line (you can see the mark-up we did on ‘vacuum’ below).

Finally, we got to the close reading of the question line. I think the student’s marked-up sheet explains what we did better than I could write it. It was a collaborative effort: students worked in pairs, then we shared while I modelled on the board.

Then they answered the question.


A close-read of the question followed by a write/rewrite (see TLaC and Reading Reconsidered by Lemov)

We shared and read the mark scheme and the students did a rewrite.

The whole task took 20 minutes, which is a big investment for one question. But I recommend doing it regularly because it does three things:

  1. It exposes students to both technical and non-specialist vocabulary, in a physics context, over and over again. You don’t need to plan and track the high-mileage non-specialist words – they come up naturally. Technical words are also experienced in context, over and over again, building understanding.
  2. It teaches students how to read a question.
  3. It teaches students how to write a good answer.

This sequence is a simple, effective way to develop literacy in physics lessons, and it does several jobs pretty well. With practice, you might be able to get the whole sequence down to 10–15 minutes – but I’m not there yet. I’d be interested to hear if anyone else tries it or does something similar.

Thanks,

Ben


If you want it here, don’t put it there.

Wall displays, formula sheets and placemats…

If you want students to commit knowledge to long-term memory, you shouldn’t put it on the wall.


Wall displays, formula sheets and placemats as external memory

When you put knowledge on the wall, students will make use of it. Using external memory is a tremendously helpful method of reducing cognitive load. They will learn how to use the information. But they won’t learn the things on the wall.

So if you want them to learn something, help them learn it. Keep the wall for other things.


When knowledge is wrong…

Babies are born knowing physics. They express surprise when an object appears to be suspended in mid-air or to pass through walls (nice article here). These are the primitive physics schemas we are all born with. Onto these, we add experiences from our lives: metals are cold; batteries run out of charge; the sun moves. Then, in physics lessons, we try to supplant this knowledge with formalised knowledge – with mixed results.


A metal cup to keep drinks cold.


What are the Cognitive Loads of reading and how can we reduce them?

Reading is a part of doing physics that doesn’t receive much attention in class. I think it should. Science professionals read a lot:


How many hours of professional STEM reading per week?


It turns out that the people who responded to the survey read a lot. Almost 85% of them read professional texts for more than 5 hours per week and 20% of them read for more than 15 hours per week. And they read to learn…


Why STEM professionals read

But most weren’t taught to do it at school. 


Who teaches scientists and engineers to read professional texts?

This last chart troubles me. I know STEM texts (exams, textbooks, papers) are different from other texts: they use different vocabulary, follow different conventions and serve a different purpose. Either learning to read these texts is so easy that it doesn’t require teaching, or it is hard and we are letting learners down.

How many capable young scientists and engineers are dropping out because they can’t access the information in texts? I worry about this a lot.

Cognitive Load Theory explains why reading is difficult and tells us how to make it easier. All three memories are in use:

  • long-term memory – the knowledge you already have. Commit as much to memory as possible – use quizzes every lesson.
  • working memory – where we compare what we’ve read to what we know and try to make meaning. There isn’t much we can do to boost this, though a good night’s sleep always helps me.
  • external memory – the text, and any scribbles you’ve added to it. Using it well is a skill, and we should teach it.

Comprehension depends most on what you already know. The two most important things for reading live in your long-term memory (or they need to): vocabulary and knowledge. Readers equipped with both are equipped to understand texts.

Vocabulary

Science teachers are good at teaching science vocabulary. We explain clearly; we use example sentences; we revisit; we match words to diagrams. We use every trick we know.

But we ignore key non-specialist vocabulary: words like determine, suggest, establish and system (I took these from a couple of recent GCSE papers).

These words should be taken as seriously as technical vocabulary. Choosing which words to focus on is hard; I tend to teach them as I come across them in textbooks and exam papers (especially if I think they could come up again).

Knowledge

Along with vocabulary, the most important part of understanding is what you already know: your schemata. As we read, the information in the text is held in working memory and presented to knowledge from long-term memory, like a debutante or a novice speed-dater. If sense can be made, great. If not, the reader has work to do.

Skills/Habits

Skills get a tough press, but there are a few reading skills (or habits) which make a difference. These are the four that expert science readers (like us) use most often.


  • I wonder… Expert readers ask questions of the text. Often these questions are about meaning, but they can be “I wonder what that word means?” or “I wonder why the writer said that…”
  • In other words… Paraphrasing (rewording, often making clearer) is a powerful comprehension-checking skill/habit.
  • I predict… Asking readers to predict what comes next in a text draws attention to the structure and conventions of scientific writing. When scanning for particular information, it is extremely useful to be able to predict whether it might be in a nearby section.
  • So far… Summarising is a habit which encourages prioritisation of information.


If these activities are practised enough (several times over a few weeks, with occasional top-ups) they quickly become part of a reader’s schema, increasing your students’ ability to learn from texts.

This post is a development of one I wrote in 2015 for the Royal Society of Chemistry – here. I am reassured to find that I still agree with most of what I wrote then. Thank you if you’ve stuck with me all this time!

Ben