Primary Science: a tale of two practicals

Mr Smith and Mrs Jones both teach a Year 3 class in the same school. Mr Smith is an ECT (first year) and Mrs Jones is now in her 5th year of teaching.

Rocks. I don’t care what you say, they’re boring and should be left to the Geography teachers.

This term the Science topic is rocks. They both have access to the school Science Scheme of Work, which ties in with the National Curriculum Programme of Study, and has been adapted to suit the resources they have available.

In their first lesson pupils are learning about the properties of different rocks and are testing them for “hardness”.

After the lesson I take the opportunity to chat to the pupils about what they’ve learned in Science today.

Mr Smith’s class

They were learning about rocks and were testing them to see which were hardest. They had to give them a hardness score on a scale of 1 to 10. They scraped some rocks with a coin, and if nothing came off it was a low score, and if some did it was a medium score. They had a nail to hit the rocks with, and if some came off with the nail it was a higher score.

The first rock had a low score (2), the second and third rocks were medium (5 and 6), and the last rock had a high score because it was the hardest (8).

Mrs Jones’ class

They were learning about rocks and were testing them to see which were hardest. They were given 4 different types of rock: sandstone, marble, slate and granite. Sandstone is a sedimentary rock because it's made from lots of small bits of sand or sediment pushed together. Marble and slate are metamorphic rocks, which is what happens when other rocks get squashed down and heated up in the earth. Granite is an igneous rock and it's made from magma that cools down.

They tested the hardness of the rocks by scraping them or hitting them with coins or nails. Mrs Jones spoke to them about how they could decide how hard the rocks were. They decided not to use a score, as there was no fair way of giving the rocks a number based on how easy they were to scrape. They decided to put them in order of hardness instead. They found that the igneous rocks were the hardest, then the metamorphic, and the sedimentary was the softest.

What’s going on?

It was clear that both classes enjoyed the lesson, particularly the practical work, and they’d all learnt something, but there’s no denying there’s a difference in outcomes between the classes. So what happened?

Teachers prioritise the need to provide fun, awe and wonder moments that ensure children enjoy science. They are actively seeking to use and find great wow activities from internet ideas, video platforms and social media etc. These activities often stand alone and lack a relevant or appropriate curriculum rationale, with many relevant concepts inaccessible because the scientific explanation would be too abstract or complex. A clear rationale and articulation of why this activity is in this sequence of learning is not evident to the children or articulated by the teacher.

Bianchi et al. The 10 key issues with children’s learning in primary science in England (2021)

Mr Smith, like so many science teachers before him (at all key stages), took a look at the scheme of work, recognised that the practical would be interesting and fun for the pupils, and went into it with gusto. And the pupils enjoyed it. But he'd not stopped to think about the underpinning rationale for doing the practical at all.

Mrs Jones did the same in her first year (and her second year too, actually). But her end of topic assessments showed a real gap in pupil knowledge about the properties of rocks, so she shifted her focus.

Mrs Jones recognised that the purpose of the practical wasn't really about finding out how hard different rocks are. She could just tell them that. For her, the practical was about:

  1. making the different types of rock tangible and real (concrete) rather than abstract concepts;
  2. opening up a discussion about the limits of their quantitative data (scoring hardness) and looking at alternatives (rank ordering).

And for Mrs Jones the main focus of the lesson was recognising the different categories of rocks and their properties. She spent a lot of time focussing on scientific vocabulary (and the pupils articulated this to me beautifully).

Two classes. Same practical. Both classes had fun. But in one class the practical was purposeful, the intent was understood, and the breadth and depth of learning was far greater.

P.S. I know the Mohs hardness scale is a thing. It's just particularly ineffective with children. Most don't have fingernails long or strong enough for scraping, and big burly Harry can scrape a lot more off with a coin than little Larry sat next to him…

Short reflections on ResearchED Blackpool

What a day! Really excellent ResearchED. From a logistical viewpoint: not too far from home (more northern ResearchEDs please!), easy parking, rooms all a good size and close together, plenty of refreshments (no vegan options at lunch, but I'm over it!), space to socialise, and brilliant pupils helping out wherever possible.

The sessions were fantastic. Lots of choice with a good spread of topics. I wanted to quickly jot down some stuff before it leaves my brain, so here goes…

Keynote: Daniel Muijs

Daniel opened by concisely summarising why I became so passionate about evidence-informed practice in the first place. “Being evidence informed is a moral duty”. Ultimately it’s a social justice issue. We are morally obliged to do the best we can for our pupils, and we are best placed to do that if we have a good understanding of the research evidence that’s already available to us. The most disadvantaged pupils are those that need our help most.

Daniel mentioned that despite the big improvement in teacher awareness of research etc., there was still a long way to go. I think he's absolutely right. Despite the great turnout (on a Saturday) to events like ResearchED, the teachers I've met who are aware of, for example, Retrieval Practice are still a small minority. Research evidence needs disseminating effectively, and staff need the time and resources to engage with it properly. Daniel suggests that we need to "invest further in intermediation". This is something that I've considered a big part of my job for a while now, and I'm more convinced than ever that it's vitally important.

Ruth Walker

She’ll probably cringe if she reads this, but I can’t think of anyone who has influenced my teaching more in the last few years than Ruth (sorry Adam Boxer, close second because Physics!). This is partly because she’s a force of nature and her output of quality blogs/resources is phenomenal (when does she sleep?) but also because of our (for lack of a better term) Zone of Proximal Development. She’s a fellow physicist on a similar evidence-informed journey, but she’s way ahead of me. I know enough to just about keep up and understand what she’s doing, but every interaction I have with her (whether it’s reading her blogs, watching her present, or discussing things with her) expands my horizons. Adam Boxer is the same. I’m like a year 7 pupil with brilliant Y11 mentors, and I honestly consider myself very privileged to know them.

Ruth’s session today was as mind-blowing as ever. I’d read her blogs on Legitimation Code Theory and love how it helps pick apart the structure of knowledge in Science (and other subjects). However, Ruth talking about it gave it extra clarity. I’m sure she’ll blog about her talk sometime soon, but the standout point for me was her discussion of how and why we should reclaim and rejuvenate How Science Works (although it may need rebranding).

Craig Barton

So much useful practical advice from Craig! I thought I had a pretty good retrieval practice/low stakes quizzing setup established but Craig has given me lots of ideas for how to tweak/improve.

Key points/things to do:

  1. Print quizzes out. I normally project them, but printing means more heads down concentrating and fewer distractions. I think if I do this in a well-organised way I can get the benefits without too much of a workload increase (although I may get nagged about photocopying).
  2. Get pupils to add confidence scores. I'd read Craig's blog posts on the Hypercorrection Effect, but his session has convinced me I need to start doing this. Errors made with high confidence are more likely to be corrected than those made with low confidence (Carpenter et al., 2018). Craig suggests getting pupils to review wrong answers with the highest confidence scores first. It also has the benefit of making pupils more aware of what they know and don't know.
  3. Get teachers to write each other's quizzes to avoid bias. This makes sense. Too often I've fired up Retrieval Roulette and repeatedly re-randomised until questions appeared that I was confident I'd taught the class well enough to answer. I shouldn't avoid the tricky questions.

Amy Forrester and Rob Petrie

I’d already read Amy’s marvellous blog on the Performance Management system at Cockermouth High School. It’s a refreshingly simple and downright sensible system that removes really poor proxies for teacher quality like formal lesson observations and data targets and places personalised staff development as the ONLY focal point. It was great to hear Rob talk passionately about why they’ve taken this approach and to hear Amy talk so positively about the impact it has had on both staff development and workload. I was already convinced that this was the way forward for school appraisal systems but it was great to hear Amy and Rob talk about it first hand.

Robin Macpherson

Robin's session was on effective questioning. He opened with the classic Ferris Bueller classroom scene ("Anyone? Anyone?"), which was unnervingly familiar. Many of the questions I see teachers ask in the classroom (and I'm guilty of this myself) become just background noise and aren't really effective at getting pupils thinking.

Robin highlighted some key strategies for effective questioning from Doug Lemov and Martin Robinson (as published in the excellent What Does This Look Like in the Classroom?). An area I feel I need to work on is probing deeper with questions after the initial answer (rather than bouncing it on to another pupil), either by getting pupils to apply their answer to another context or by following up with further questions (which could be as simple as a "because" cue to get them to develop their answers). I could also do with expanding my range of question types, planning my questioning more, and thinking more in terms of Ratio (see Lemov, TLAC). Lots to think about! The only thing Robin suggested that I struggled to see much application of in my classroom was the Harkness method. I can see how it would be useful in other subjects, but (at the moment, anyway) it feels like it would have limited usefulness for me (and it would take some training to get pupils to use it effectively).

Deep Ghatura

Ok, Deep is my number one assessment expert/nerd, and there was so much to take away from his session that I can’t yet begin to summarise it here. Biggest things for me are:

  1. So much of the summative assessment we do in school is useless
  2. The way we try to measure progress is (at best) rubbish or (at worst) dangerously misleading, potentially leading to a huge opportunity cost by focussing interventions on the wrong pupils
  3. The Rasch method looks like one of the best ways of effectively measuring progress, but the assessments need to be set up carefully to make that possible (I've sketched the basic idea just after this list)
  4. Understanding of assessment is one of my areas for development, and I need further training on it. It's not just me though. I feel I know as much as most of my colleagues do about assessment, but I think it's something teachers generally don't know anywhere near enough about.
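
On that third point, a quick note to myself (my own gloss, not something Deep presented in this form): as I understand it, the basic Rasch model puts pupil ability and question difficulty on the same scale, so the probability of a pupil with ability $\theta$ answering a question of difficulty $b$ correctly depends only on the gap between them:

$$P(\text{correct} \mid \theta, b) = \frac{e^{\theta - b}}{1 + e^{\theta - b}}$$

Because ability and difficulty share a scale, progress can (in principle) be reported as movement along that scale rather than as raw marks, but only if the question difficulties have been calibrated properly, which is exactly the "set up carefully" caveat above.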

Deep is truly passionate about the effective use of assessment and it’s impossible to come away from his session without a feeling of excitement about a topic that I normally consider really dry. Deep’s session also included an amazing Physics question on temperature (as an analogy for the importance of zero point data) that I’m going to have to try using with pupils. Ruth and I figured out the answer but we had to think about it (although to be fair it was the end of a long day).

Overall another brilliant ResearchED. Thanks to everyone I listened to/conversed with. Now I need to sit down and do some action points so this valuable experience leads to something more tangible.

Already looking forward to ResearchED Rugby!