Short reflections on ResearchED Blackpool

What a day! A really excellent ResearchED. From a logistical viewpoint: not too far from home (more northern ResearchEDs please!), easy parking, rooms all a good size and close together, plenty of refreshments (no vegan options at lunch, but I’m over it!), space to socialise, and brilliant pupils helping out wherever possible.

The sessions were fantastic. Lots of choice with a good spread of topics. I wanted to quickly jot down some stuff before it leaves my brain, so here goes…

Keynote: Daniel Muijs

Daniel opened by concisely summarising why I became so passionate about evidence-informed practice in the first place: “Being evidence informed is a moral duty”. Ultimately it’s a social justice issue. We are morally obliged to do the best we can for our pupils, and we are best placed to do that if we have a good understanding of the research evidence that’s already available to us. The most disadvantaged pupils are those who need our help most.

Daniel mentioned that, despite the big improvement in teacher awareness of research, there was still a long way to go. I think he’s absolutely right. Despite the great turnout (on a Saturday) to events like ResearchED, the teachers I’ve met who are aware of, for example, retrieval practice are still a small minority. Research evidence needs disseminating effectively, and staff need the time and resources to engage with it properly. Daniel suggests that we need to “invest further in intermediation”. This is something I’ve considered a big part of my job for a while now, and I’m more convinced than ever that it’s vitally important.

Ruth Walker

She’ll probably cringe if she reads this, but I can’t think of anyone who has influenced my teaching more in the last few years than Ruth (sorry Adam Boxer, a close second because Physics!). This is partly because she’s a force of nature and her output of quality blogs and resources is phenomenal (when does she sleep?), but also because of our (for lack of a better term) Zone of Proximal Development. She’s a fellow physicist on a similar evidence-informed journey, but she’s way ahead of me. I know enough to just about keep up and understand what she’s doing, but every interaction I have with her (whether it’s reading her blogs, watching her present, or discussing things with her) expands my horizons. Adam Boxer is the same. I’m like a Year 7 pupil with brilliant Year 11 mentors, and I honestly consider myself very privileged to know them.

Ruth’s session today was as mind-blowing as ever. I’d read her blogs on Legitimation Code Theory and love how it helps pick apart the structure of knowledge in Science (and other subjects). However, Ruth talking about it gave it extra clarity. I’m sure she’ll blog about her talk sometime soon, but the standout point for me was her discussion of how and why we should reclaim and rejuvenate How Science Works (although it may need rebranding).

Craig Barton

So much useful practical advice from Craig! I thought I had a pretty good retrieval practice/low-stakes quizzing setup established, but Craig has given me lots of ideas for how to tweak and improve it.

Key points/things to do:

  1. Print quizzes out. I normally project them, but printing means more heads down concentrating and fewer distractions. I think if I do this in a well-organised way I can get the benefits without too much of a workload increase (although I may get nagged about photocopying).
  2. Get pupils to add confidence scores. I’d read Craig’s blog posts on the Hypercorrection Effect, but his session has convinced me I need to start doing this. Errors made with high confidence are more likely to be corrected than those made with low confidence (Carpenter et al., 2018). Craig suggests getting pupils to review wrong answers with the highest confidence scores first. It also has the benefit of making pupils more aware of what they know and don’t know.
  3. Get teachers to write each other’s quizzes to avoid bias. This makes sense. Too often I’ve fired up Retrieval Roulette and repeatedly re-randomised until questions appeared that I was confident I’d taught well enough for pupils to answer. I shouldn’t avoid the tricky questions.

Amy Forrester and Rob Petrie

I’d already read Amy’s marvellous blog on the Performance Management system at Cockermouth School. It’s a refreshingly simple and downright sensible system that removes really poor proxies for teacher quality, like formal lesson observations and data targets, and places personalised staff development as the ONLY focal point. It was great to hear Rob talk passionately about why they’ve taken this approach, and to hear Amy talk so positively about the impact it has had on both staff development and workload. I was already convinced that this is the way forward for school appraisal systems, but hearing them discuss it first hand reinforced that conviction.

Robin Macpherson

Robin’s session was on effective questioning. He opened with the classic Ferris Bueller classroom scene (“Anyone? Anyone?”), which was unnervingly familiar. Many of the questions I see teachers ask in the classroom (and I’m guilty of this myself) become just background noise and aren’t really effective at getting pupils thinking.

Robin highlighted some key strategies for effective questioning from Doug Lemov and Martin Robinson (as published in the excellent What Does This Look Like in the Classroom?). An area I feel I need to work on is probing deeper after a pupil’s initial answer (rather than bouncing the question on to another pupil), either by getting pupils to apply their answer to another context or by following up with further questions (which could be as simple as a “because” cue to get them to develop their answers). I could also do with expanding my range of question types, planning my questioning more, and thinking more in terms of Ratio (see Lemov’s Teach Like a Champion). Lots to think about! The only thing Robin suggested that I struggled to see much application of in my classroom was the Harkness method. I can see how it would be useful in other subjects, but (at the moment, anyway) it feels like it would have limited usefulness for me, and it would take some training to get pupils to use it effectively.

Deep Ghatura

OK, Deep is my number one assessment expert/nerd, and there was so much to take away from his session that I can’t begin to summarise it all here. The biggest things for me are:

  1. So much of the summative assessment we do in school is useless
  2. The way we try to measure progress is (at best) rubbish or (at worst) dangerously misleading, potentially leading to a huge opportunity cost by focussing interventions on the wrong pupils
  3. The Rasch method looks like one of the best ways of effectively measuring progress, but the assessments need to be set up carefully to make that possible
  4. Understanding of assessment is one of my areas for development, and I need further training on it. It’s not just me, though. I feel I know as much about assessment as most of my colleagues, but I think it’s something teachers generally don’t know anywhere near enough about.

Deep is truly passionate about the effective use of assessment, and it’s impossible to come away from his session without a feeling of excitement about a topic that I normally consider really dry. His session also included an amazing Physics question on temperature (as an analogy for the importance of zero-point data) that I’m going to have to try using with pupils. Ruth and I figured out the answer, but we had to think about it (although to be fair it was the end of a long day).

Overall, another brilliant ResearchED. Thanks to everyone I listened to and conversed with. Now I need to sit down and write up some action points so this valuable experience leads to something more tangible.

Already looking forward to ResearchED Rugby!