Amanda Spielman at ResearchEd Birmingham 2020


Introduction

It's good to be here at Nishkam, and I am going to start by telling you that when I heard where I was coming today, it felt as though the fates had come together - and in a good way.

That is because five months ago I went to a New Voices event in London, where I listened to Andy Brown, who is Assistant Principal here, give a fascinating talk on effective CPD and on how Nishkam's model has moved away from predominantly whole-school, more generic CPD towards a much higher proportion of subject-specific CPD.

He explained that some of the thinking originated in an interesting piece of Wellcome Trust research, which found that the weakest schools were the least likely to prioritise subject CPD, and he explained how those insights had been built into a multi-level CPD model here.

And he explained that, as part of that, all Nishkam teachers get two days of individual-level CPD each year, where you can decide what you are going to do. Which means that I know to ask: how many of you here today are Nishkam teachers using one of your individual days?

Andy's talk was so interesting that I made a page and a half of notes. But I won't say any more about it because of course he's speaking here today on his home turf. But I am going to say to all you non-Nishkam people: you won't be disappointed. And if he is talking about the CPD here, I think you might well come away envying Nishkam teachers.

Why I am here

And of course, this is a ResearchEd event! As some of you will know, I have a pretty strong commitment to research, so I'm definitely in my comfort zone today.

Since I came to Ofsted, we have rebuilt the Ofsted research team. Of course we want to make sure that our work is as well founded in evidence as we can make it. And we wanted to build a strong feedback loop between our development, implementation and evaluation work, so that we iterate our frameworks and practice intelligently over time, in the light of experience.

EIF

I did have an interesting comparison point recently. Probably few of you know that there is a European association of school inspectors, with 37 members last time I looked. Yes, that's probably more than there are countries. That's because while there are some countries with no inspection, many do have it, and of those, some have provincial rather than national inspectorates. And by the way, even though the first HMI were appointed in England before 1840, we are far from being the earliest - many countries had school inspections before that, some like the Netherlands and Prussia 100 years earlier.

Anyway, we took our turn hosting the annual conference back in October, and I will also tell you now that an international school inspectors' conference is very far from being a glitzy jolly - the conference dinner took place in a pub. And I can also tell you that school inspectors look like school inspectors, no matter which country they come from.

But the interesting point I want to make was the remarkable level of interest in our new framework, and especially the way it is built on a platform of research evidence. I think I could have turned football team manager and sold some of my team several times over. That would be one way to increase our income! Though perhaps a little hard to fit into the civil service employment model - I suspect the Treasury would instantly confiscate any transfer fees.

And our research team hasn't just been working on the evidence platform and the evaluations we have carried out and will continue to carry out. In the Annual Report we just published, we listed our publications in 2018-19, and it is a considerable list. The research team contributed strongly - on curriculum, knife crime and teacher wellbeing, to name just three.

And we have been endeavouring to get a bit of co-production here. We tweeted out the draft programme to find out what people wanted us to cover. In fact, we did the teacher wellbeing research because teachers told us they wanted it.

EIF implementation

Now you might have noticed that at Ofsted we've been putting a lot of time and effort over the last few years into working out how to make our inspection work as constructive as we can. Though I'm not going to get into discussions about graded judgements today.

We've published quite a lot and talked a lot about the EIF and its underpinnings. So I'm not going to rehash that, except to say that the foundations we laid are justifying the work that went into them.

We've done several thousand EIF inspections now - including nearly 1700 in schools. (Remember it's a common framework for early years and post-16 education as well.) And the feedback from many directions is telling us that the inspections are nearly always working well.

We do know there is a small - and vocal - minority who don't like the new model, or who haven't been happy with their experience of it or with their outcome. But overwhelmingly the schools who have been inspected are positive about it.

Our post-inspection surveys, which have a very good return rate, tell us that schools are finding the process fair, and that they think the feedback is going to help them improve. And yes, we do take account of the fact that people who are happy with the outcome are slightly more likely to fill in the survey. Even the people who are disappointed with the outcome tell us, by a very large majority, that they think the process was fair and the input constructive. And this message is echoed from many other directions. To quote a couple of pieces of the typical feedback we are getting: 'The process was incredibly fair, done with and not to, and inspectors were genuinely looking for the positives.' And: 'It was professionally done, in partnership with me, focused on exactly the right things.'

In fact it seems to be largely win-win, in that inspectors are also finding inspecting under the new framework rewarding, and seem to have renewed enthusiasm for their work. Our HMI recruitment pipeline is the strongest it has been for a very long time, in terms of both quantity and quality, while a suggestion to our contracted Ofsted inspectors that they should resign doesn't seem to have prompted a single resignation that we can find, nor are we noticing people reducing their commitment.

Of course there have been a few wrinkles and teething issues - among several thousand inspections, how could there not be? - but we take all feedback very seriously, and work fast to address issues, as for example we did back in September to sort out a problem that was flagged up for small primary schools. I'll talk a bit more about how we are refining our implementation when I speak at the ASCL conference next week.

Stuck schools

Coming back to our wider research programme, it is of course intended to contribute to our aim of being a force for improvement. Our approach is about looking at what really matters and doing it in a way that helps everyone get from A to B in the most efficient way possible: maximum gain for minimum pain.

So today I want to talk about three pieces of work we have done recently. The first was published a couple of months ago, on what we have called stuck schools.

In some pockets of the country, there are schools that haven't reached the 'good' standard for 13 years or more. When we did the sums six months ago, there were over 400 of them. That's not a large proportion of the schools in England, but it still represents more than 200,000 pupils being educated in stuck schools.

Despite the system of support, intervention and inspection designed to improve schools, nothing has changed for these children. This isn't good for the children, and it actually isn't good for the staff in these schools either.

And these children are more likely to live in deprived areas than children at other schools. We found some common factors among the schools we visited. All were operating in very challenging circumstances, where a mixture of geographical isolation, unstable pupil populations and often poor parental motivation seems to be compounding the issues for children. But poor education is not an inevitability for poor communities: most schools in the most deprived areas do give a good or outstanding education, despite the challenging contexts in which they work.

And some of these good and outstanding schools have not always been so. Some of them have had difficult journeys, with many different forms of intervention and support, and many different leadership strategies, finally coming together to make an impact. The reasons that they have improved have been under-investigated and are therefore far from clear cut.

Research

Our research explored why some consistently weak schools have been able to improve while others have not, so that the whole system can work together to make the right things happen.

It wasn't intended to apportion blame or to set the problem at schools' doors alone. Indeed, the whole school and accountability system - of which inspection is a part - has some responsibility for the lack of progress in these schools.

It drew on research visits to 20 schools, 10 of which have been graded less than good consistently for 13 years or more and are considered 'stuck'. The other 10 were graded good in their last 2 inspections, but previously had 4 full inspections that graded them less than good. These are considered 'unstuck'.

The evidence collected was self-reported through focus groups and interviews. We did not attempt to verify independently the views or facts that were given to us. This means that the evidence reported should be seen as schools' interpretation of their journey, rather than Ofsted's view.

Of course the first thing that was important to understand was why each school was or had been stuck. And the hypothesis that emerged from the work was that there are broadly two types of stuck schools.

The first kind can be characterised as chaotic and change-fatigued. One teacher told us: 'In the last 7 years, we've had 4 headteachers. We've looked like we're joining 3 different MATs.'

The second kind typically has a resistant and embedded culture, which might involve teachers who have been working for the school for decades and a head and senior leadership team in post for five years or more.

But even within those tentative categories, each school will be stuck in its own way.

The next point to make is that we found no substantial differences in the reported contexts of the stuck and unstuck schools we visited. The unstuck schools had very much the same set of contextual problems as the stuck schools: geographic isolation, high pupil mobility and limited parental support. The fact that some schools do well despite these challenges shows that it can be done.

We also found no systematic differences in the level or type of school improvement support that stuck and unstuck schools had been given. All had been involved in some kind of government-funded support programme. Most often that was advice from National Leaders of Education. The programmes have not succeeded in getting these stuck schools to good and they are not perceived to have been transformative in unstuck schools either.

In fact most stuck and unstuck schools said that they had received too much school improvement advice, from too many different quarters of the school system. Of course it was well intended. But it had rarely had the intended impact.

School leaders said that the quality of the advice itself was often lacking. They also commented on a poor match between the problems of the school and the advice on offer.

What these schools have too often received, after a brief inspection that reaches a judgement, is an uncoordinated bunch of interventions.

I've talked about 'a cacophony of consultants of variable quality', and 'a merry-go-round of changing headteachers'. It is hardly surprising that this typically fails to help unstick the school.

Overall, the evidence does suggest that there is enough capacity in the system to support and advise these schools. But too little attention is given to several things:

  • the content of the support, including whether it really helps with getting focused, effective action that responds directly to the issues that have been raised
  • whether the support is best provided internally or externally to the school or MAT
  • and of course the quality of the people and organisations coordinating or delivering the support

If we get these things right, and concentrate on doing just the things that matter most, in the right order, change will happen.

The stuck schools report got a lot of coverage.

The second piece of work I thought it would be good to talk about today is the research we did as part of the development of the new framework for inspecting initial teacher education. Inspecting teacher education gets much less airtime than inspecting schools, but clearly it's a strong lever in the overall education system. And it needs to sit comfortably with a few other things: in particular, the DfE's content standards for ITE, the Early Career Framework, and of course the EIF. So we needed to update our approach to these inspections.

And we have been approaching this in the same way we approached the EIF: building on evidence, carrying out research where it is needed, testing the components of the emerging model.

As with the EIF, we knew that the new framework needed to get to the heart of quality in ITE: what trainees are taught and what they learn.

We developed a model using a literature review we commissioned from Sheffield Hallam University; discussions with current ITE practitioners; a survey of course leaders, trainees and NQTs; our own previous curriculum experience; and of course the experience and knowledge of our own HMI.

From this we developed a set of 22 indicators that the evidence suggested might be associated with ITE curriculum quality, covering partnership working as well as curriculum planning. The partnership working indicators were specific to the ITE context. A detailed rubric on a 5-point scale helped the 17 inspectors involved make consistent assessments of quality. We designed the research visits to align the evidence-collection activities with the indicators and rubric design. They were two-day visits, so that we could collect evidence from partner schools, school-based mentors and trainees themselves. There's quite a lot about methodology in the report.

We visited 46 ITE partnerships: 20 higher education institutions, 24 SCITTs and 2 Teach First partnerships. For some reason very few of these were in the West Midlands, though we did visit Birmingham City University.

And what did it all tell us?

Efficiency is of course really important in ITE. There is a great deal to cover in a year of teacher training so careful thought needs to go into what is taught and learnt when, and in what context.

In the strong programmes, course leaders work with their partnership to plan and deliver a well-sequenced curriculum. This joins up centre-based provision properly with trainee placements, and it allows trainees to practise what they learn in centre-based provision.

By contrast, leaders in weaker partnerships tend to arrange their programmes to meet the practical needs of partner schools and settings, rather than considering how best trainees learn and develop.

And of course the principles of good vocational education for adults are to a large extent the same as the principles of good education in schools. In the stronger partnerships, training was built on a strong understanding of learning and of the fact that, although trainees are already pretty highly educated, they are nevertheless usually novices in the business of teaching. By contrast, in weaker partnerships, sequencing of content was generally ignored in favour of attempting to capture everything in bite-sized chunks, so as to tick off the teachers' standards.

The strongest partnerships did a good job of developing subject knowledge, even though time is limited. They managed (though they couldn't entirely overcome) this time limitation by connecting trainees to subject organisations and to quality curriculum content that they could study themselves. These partnerships also had a strong focus on behaviour management, and they taught their trainees drawing on up-to-date research.

Being thoughtful about what can and cannot be taught during ITE is, again, about the most efficient way to achieve quality.

And the research did a couple of other things as well. It showed us which of that set of 22 indicators added up to the strongest basis for a clear inspection construct in the new framework. It also showed us where the new construct diverges from the old. And it showed us that some of the outcome measures we were using were not good indicators of quality. When teacher supply is tight, nearly all teachers will get jobs, irrespective of course quality, so completion and employment rates are not good signals of quality.

We have now published our draft framework, and it builds on these findings. The plan is that inspection should look at curriculum and partnerships in detail, to see whether trainees are being taught the right things and get to practise them in supportive settings.

The development model is another example of how we use evidence to inform everything we do.

If we want to combine quality and efficiency, we need to draw on high quality evidence and research, in whatever part of education or social care that we work, be it as teachers, leaders, inspectors or civil servants.

Managing behaviour

The third piece of research I wanted to mention is on managing behaviour. We know that behaviour remains a major concern for teachers. This was apparent from the NASUWT Big Question survey, the OECD TALIS study and our own study on teacher wellbeing. They all showed that teachers feel misbehaviour is common, and a major source of teacher stress. Our teacher wellbeing study found that many teachers felt that senior leaders provided insufficient support.

In 2014, we published a report on low-level disruption, 'Below the radar'. It's fair to say the findings were disturbing. We found great concern among teachers and pupils about a lot of low-level disruption. In many cases this disruption wasn't recognised or properly addressed by school leaders.

In 2019, we felt it was time for an update, looking not just at low-level disruption but at more challenging forms of misbehaviour. We wanted to identify the strategies that schools use to pre-empt and manage challenging behaviour and, of course, to promote good behaviour.

Compared with 'Below the radar', we found some positive developments. Teachers and leaders understand the importance of consistency in the implementation of behaviour policies.

Most schools in our study favoured whole-school behaviour management approaches, where a set of consistent routines is put into practice and rigorously and consistently applied. Though that consistency does need to be flexed for the small group of pupils with SEND or issues at home.

In the best schools, staff emphasised the value of teaching desired behaviours and making them routine. And this is especially the case for behaviours repeated regularly throughout the school day - those to do with the safe movement of pupils around the school, the smooth running of lessons and the minimum loss of learning time to low-level disruption.

We're currently scoping the next phase of our behaviour research. We'll be asking:

  • What does good behaviour look like? - can we come to an organisational concept of good in this area?
  • What does this look like in different contexts and for schools on different trajectories?
  • And of course, do we need to refine inspection methodology in this area? If so, how?

We are particularly interested in 'turnaround schools' on behaviour and what their different journeys might be. We're hoping that our research could lead us to a typology of schools that could inform how we look at them on inspection.

Your interest

And it is so great that you are all here today to listen, think and talk about so many aspects of education. Not only is it intellectually satisfying to be part of events like this, it also fits in well with the evidence on the value of CPD.

A week or two ago I read with interest another study carried out for Wellcome, this time by EPI and published just last month, which found that high-quality CPD for teachers is as effective for improving pupil outcomes as having a teacher with a decade's experience in the classroom. And that it has value for teacher retention, especially for early career teachers. And - back to the Nishkam example again - that CPD programmes are more effective when they have sustained support from school leaders.

So I hope that being here, even on a Saturday, really does feel like a great way for you to develop, personally and professionally, as of course it does for me.

Conclusion

And to finish, I'd like to say thank you to Claire Stoneman and the organising team for inviting me, and even more for making this event happen, and to the ResearchEd team behind them, and of course to Nishkam for generously playing host. Watching the development of the ResearchEd movement from three different seats in education has been fascinating and awe-inspiring: seeing more and more talented people exploring, pushing themselves and engaging others.

It's a great programme today, with lots for every kind of interest. For anyone who is allergic to talking about curriculum, I can only apologise for our part in raising the profile of this aspect of education.

I can only be here myself until 12.15 because I need to catch a train to speak at another event in London this afternoon. But I'm hoping to fit in a couple of talks before then and to have a chance to speak to some of you.

I do hope you all have a brilliant day, and come away with your brains fizzing. That will be good news for you, and for all the children you teach.

Thank you very much, for all that you do, and for listening to me. Let's all get going.
