Podcast: Joanne Miles and Paul Tully on professionalism and supported experiments

In the latest episode from the Education and Training Foundation (ETF) podcast, Paul Tully, Strategic Researcher at the ETF, is joined by training expert, Joanne Miles, to discuss how teacher research in the form of supported experiments can strengthen professionalism and raise standards in teaching and training.

Full transcript

Paul: Hello and welcome to the Education and Training Foundation (ETF) podcast. My name is Paul Tully and I'm a researcher at the ETF, with a special interest in promoting professionalism across the FE and Skills sector. In today's episode we'll look at how teacher research in the form of supported experiments can strengthen professionalism and raise standards in teaching and learning. I'm delighted to be joined by the highly esteemed training expert, Joanne Miles, who has been a prominent fixture in professional development circles in the sector for some time now. Hello, Joanne.

Joanne: Hello, Paul, lovely to meet you.

Paul: It's fantastic that you can join us today. Thank you for sparing the time. I'm going to kick off with a starter for you. Joanne, can you tell us a little bit about your career in the FE and Skills sector, and how has the sector changed in this time?

Joanne: I started out 25 years ago as a foreign language teacher, teaching English in Spain and in the UK, and then spent a little bit of time working in FE as an ESOL teacher before I moved on to becoming a teacher-trainer, working on initial teacher education programmes. That then developed into a role around coaching teachers and working on CPD, so I spent about 10-15 years doing that in the FE sector in the UK. Subsequently, I moved out of colleges and started working at the Learning and Skills Network, doing cross-Britain projects around improving teaching and learning, which was a really interesting point in my career. And then for the last 10 years, I've been working freelance, so I'm involved in consulting, training and coaching across the FE sector, supporting institutions with professional development and TLA improvement. I'd say in the time I've been in FE, there have been some major challenges around budget, increasing workloads for staff, lots of pressure on people to respond to an auditing culture, and maybe a swing more towards assurance of quality, and some neglect of quality improvement sometimes in some settings. Encouragingly, in the last 10 years, I've seen a move towards a really exciting culture around ownership of professional development, and a really big focus starting around practitioner-led research. We can see all the outstanding teaching, learning and assessment projects that the ETF has been involved in, we can see the subsequent work recently about digital, we can see the birth of FE Research Meet on Twitter, and all the informal networks that teachers have formed to share their practitioner inquiry. So I think there's a lot of reason for hope in the sector at the moment around teachers owning and developing their own practice, despite the external constraints and pressures of budget and politics.

Paul: Absolutely. And you've just painted a very rich picture of what is happening in the FE sector at the moment and some of the challenges, which we're going to be picking up on a little bit later in the podcast. Let me just set the scene for our listeners. This podcast is a follow-up to an article that I wrote in the summer 2021 edition of inTuition, which is a fabulous publication if you haven't had the chance to read it, in which I suggested professionalism could be strengthened inside FE institutions by three big professional development ideas: teacher research projects; teaching circles, which are problem-solving groups; and 360-degree appraisals. Now, today, we'll be talking about teacher research projects, also known in the business as action research projects or, as we're going to be calling them today, supported experiments. These might be described as inquiry-led investigations by teachers to improve some aspect of their teaching practice. So let's get into some of the detail. Joanne, the sector and the general public expect effective FE teachers to be professional. Now, in your view, how can a focus on professionalism improve teaching and learning?

Joanne: I think professionalism for me is really about considering your professional identity, and the practices that your professional tribe might have. And it's around creating that sense of community and notions for ourselves of how we work well. So there's something here around standards and competencies in our work. And there's something about how we might be better; those aspirational conversations about how we can improve our work to benefit the learners that study with us. And I think professionalism is linked with all of those different things. There are things around focusing on professional skills and knowledge, and that links with qualifications and development projects, but also collaborative work with colleagues to really help us hone our craft. And I think as we hone our craft and skills and think about those areas of our practice in professional conversations, we can then enhance our teaching and learning practice and benefit our students. So for me, there's a virtuous circle there in thinking about our professionalism, acting to develop it, and then reflecting on the impact with our learners.

Paul: Would you say that this is a shared view? Is that a view of professionalism that's shared across the whole sector, in your experience?

Joanne: I think there's an increasing narrative moving in that direction, and a feeling in many colleges where I work that there is value in creating time, space and focus for teachers to think about professional identity. More and more places where I work are looking at the ETF Professional Standards, for example, and thinking about how those can underpin self-assessment of your professionalism, and therefore the areas that you may want to develop linked to those standards, which then trigger practitioner-led research projects. So I think the notion is around, but I wouldn't say it's codified or formalised; it's emerging from the work as research happens and those conversations happen in the sector.

Paul: Sure. That really brings us neatly on to supported experiments. Now, I came across the term supported experiments about 15 years ago, as a CPD manager when there was a huge interest in Geoff Petty's summaries of John Hattie's work. And in 2006, he brought out his book, Evidence-Based Teaching, if you remember, which transformed the professional development conversations happening inside teacher training courses, and staff learning days, and we had lots and lots of time discussing many of those ideas. Now, Geoff, and I know you know Geoff very well, Joanne, defines supported experiments as an approach to professional development that provides an opportunity for teachers to try out new strategies with the support of a colleague, who takes on a coaching role, aiding reflection and providing feedback. Okay, so how would you define supported experiments? And what role can they play in supporting teacher professionalism and improvement?

Joanne: I think it's such an exciting professional development cycle. I've worked with Geoff over the last 15 years and collaborated with him on more than 40 projects in different ways across the sector. There are real opportunities with supported experiments to stand back from your own teaching practice and start to identify one area that you really want to explore and develop in more depth. That might be through engaging with research reading, looking at different research summaries by people like John Hattie and Marzano, looking at the evidence base that's out there in the wider sector. But it can also involve engaging in action research in your own classroom: trying out a new approach, reflecting on the impact on the students, looking at how that affects your own mindset about teaching and learning and your own efficacy, if you like, as a practitioner. It's really a practitioner-led action research cycle: you try something new, and you tap into your colleagues for support at all points in the process. That might be sharing practice and resources at the beginning of the cycle, or troubleshooting challenges as you go through. And it's definitely about reflecting on your own learning, and the impact on yourself and your students, at the latter parts of the process. It's a really generous, reflective, collaborative cycle for growing our own practice with colleagues' support. There are lots of useful stages in the cycle that I think we should draw attention to, and these really help with maintaining our own focus and the momentum of the work in a college or a school. The first stage is around exploring your own context with the students. That involves looking at your present practice and the pedagogy that you have in place in your work, but also the wider research world as well. This is all about situated practice, owned by teachers, so starting to look at the areas of focus really grounded in your own context.
The second stage involves planning experimentation: what you're going to explore, how you're going to do it, and how that's going to fit into your teaching week, your planning process, and your resources and materials. There's a notion here of deliberate practice, which we know from Ericsson, about really trying something with rigour over time, using trial and error to learn about it as we explore it. The third stage can be quite an extensive one, taking quite some time, where you coach in the strategies and have collaborative conversations with a partner, with a teaching team, or through an online forum, to really think about the work that you're doing with the students and to explore the glitches, solve the challenges and move that planning forward. Stages four and five are the bigger-level dissemination. Stage four is about sharing and celebrating what you've done. This may involve an event, a showcase, a learning fair, posters, prizes, depending on what your culture is like. And the final stage is embedding practice, where we look at the experiments that we've done and we put the things that we think have value to take forward into schemes of work, into online resource banks, into protocols and CPD programmes. So one of the benefits for me here is the richness of that cycle, and the fact that it's really plan, do, reflect, which we know from lots of other learning cycles that we use in teacher education. So it's a really rich and interesting model that sustains that professional inquiry, I think.

Paul: That's a very, very detailed overview of supported experiments. So anybody listening will hopefully get a good idea about how it should work, and perhaps the kind of impact, or the kind of areas of work, that they might like to develop further. How long has it been around in the FE sector?

Joanne: I would say that 15- to 20-year window at least. I mean, I've been working with Geoff for that long, so it's been a model that's been around for quite some time. And it emerges, I think, from Kolb's learning cycle and other action research models. It isn't something that was born on its own; it emerged from other thinking about how we learn effectively. There's something in here around the teacher as learner, and the teacher as reflective practitioner. And this cycle really fits with those two pieces of thinking, which I think are really dominant when we think about how we grow and develop as practitioners. People identify a focus, they learn to read research, they look at and select research that might be of interest, they need to develop an experimental timeline, the stages of that cycle, for themselves, and they need to think about a really interesting question, which is: how will we know the experiment has had any impact on the students or ourselves? And so we start to think about what we might see or hear happening to indicate impact, and how far we can separate that out from other factors that are going on in the teaching and learning context. And that's an interesting set of skills to develop as a reflective practitioner growing ourselves across our careers. The cycle also really helps people with persistence. We can see from experimental practice that we need to try things again and again to get good at them, and to really see if they're going to have benefits for ourselves and the students. So we really embed this trial and error mentality that is so important for growing our practice, and is an excellent counterbalance to all the risk-averse feelings that there might be in every sector at the moment, of wanting to keep practice very safe and secure and measured and managed. This is the experimental space, where we explore, where we take risk in a positive way, and we learn from it.
And for me, there's real value in that academically, cognitively, and as a professional.

Paul: And one of the crucial ingredients here, just to pick up again on parts of the cycle, is coaching. Why is coaching so important in this cycle?

Joanne: So many reasons. I think the first aspect is the quality of the conversations that you can have if coaching skills are brought into play. Coaching skills really look at how we listen effectively, how we formulate questions that make us think, how we come through to learning points, and then devise actions to take forward. So there's real intellectual depth, in terms of reflection and action planning, that emerges if the conversations are coming from a coaching place. In many, many settings where I work, it's peer-led coaching conversations that are underpinning this cycle. That can be driven by the advanced practitioner group facilitating those conversations, it might be the teaching and learning coaches, or often I'm training up a peer-led coaching group so that they have the skills to raise the quality and increase the depth of those conversations for their colleagues. It's not about the hierarchy. It's not about the managers leading and driving this. It's about peer-led, collegial conversations, but with quality and depth. And that's where I think the coaching skills really come into play.

Paul: Well, it's certainly my experience that you and Geoff have led this agenda for the last 15 years or more. And I know that you've worked across the entire sector, implementing this system, developing and supporting professionalism, and getting impact from this system of work as well. Let me just ask about impact. You've supported a large number of schools and colleges to introduce this model into classrooms. Can you give us some examples, and their impact?

Joanne: I think the first impact that I really became aware of is the cultural impact, particularly when you're running these cycles as a whole-organisational approach, because in many settings they realise that the vehicle of an experimental cycle could be used to really encourage teachers across the whole college or school to think about their practice, to explore new ideas, to share more widely, to be less stuck in their own silo, if you like. So there's definitely a cultural impact. People talk about wider experimentation in TLA, and more sharing going on. People talk also about a more open approach to classroom visits emerging from this cycle, fewer cliques in departments, people sharing materials more openly and more generously. And lots of people talk about the college feeling like a different place to work: that the excitement of conversations together, and the openness around one's development work, makes it feel like a really different place to work. And there's a knock-on implication there that impacts on staff satisfaction, which is what we see when we start to look at staff satisfaction surveys in the settings where collaborative development practice is really well embedded. There's something here about the buzz: people talk a lot about the buzz and the energy of it, the excitement of exploring something new, getting yourself out of a rut that you can be in. Particularly experienced staff say to me: this actually made me rethink areas of my practice, it challenged my thinking, it made me try new things. And that was exciting, and a bit risky, but I really grew from it. So there's a re-connection there back to professional values that I think is really exciting, too. We've obviously got tangible improvements that people talk about at classroom level.
So an example of that might be a college that said to me they did lots of work around self and peer assessment some years ago. They read the research by Black and Wiliam on self and peer assessment models, and they then introduced quite a rigorous self and peer assessment structure into business courses to help students with their assignment writing, to really get them to grasp what criteria might mean. And what they found was that they had big improvements in the quality of draft one coming in for the teachers to mark, it reduced the marking for the teachers, and the learners were talking about how they understood the criteria better; they knew how to do the task, and there was less of the mystery of 'how do I get the merit?'. This fostered their independence in grappling with criteria more confidently. The ultimate outcome was improved merit and distinction grades when they compared those cohorts with the previous five years, so in terms of the college actually pushing up achievement on those programmes, this experimental cycle had that impact. Also, a couple of colleges talked to me about graphic organiser work: introducing things like mind mapping for students to help with planning their assignment work, teaching them collaboratively in class, modelling for students, getting learners to think about how to plan their assignment and assessment work in sections and adding timings to that, so that they can complete timed assessments within the right time for the task. They found that learners completed tasks better and that there was an improved completion rate. Learners were then using maps and mind mapping in other subjects, in other tasks, on other programmes, and other teachers were reporting suddenly seeing the learners using those in their work in class and in exams, which was interesting. Learners then said, we've actually realised we have a personal organising tool that we can use for study, but also for planning work.
And when we go into work, and we're involved in projects and things in the workplace, we will have that skill. So there's something here about confidence in the learners' thinking, about making thinking explicit, and about having that tool to take away. So it's really interesting to me that we see a real range of impacts from this kind of work. Some of them are big and cultural, some of them are about how staff feel about their professionalism, but lots of them are actually right down at the gritty chalk face of how it affects the practice for the students and, ultimately, their achievement.

Paul: I mean, wow, that's an extraordinary range of ideas that people have clearly been embedding, developing and working towards, with a range of different impacts there, which includes student outcomes. If I were a senior manager, I would be interested; of course, that would be a key metric for me. And what you're saying is that supported experiments have shown genuine impact in that particular area.

Joanne: Absolutely, definitely the case. But I think what's important for leaders to consider is that there really are bigger benefits, which are about the dynamics in your organisation, the emotional state and morale of your staff, about this collaborative practice together, the ways teachers are supporting each other's thinking in those conversations. That actually is incredibly important for being an organisation that can continue to improve, even under the pressures that we're dealing with. So to me, there are definitely improvements for classroom practice and the outcomes for the students, but there are bigger benefits as well that I think we must bear in mind when we're adopting this work. It's interesting to note, if people want to do a bit more research in their own time, a couple of websites: GeoffPetty.com is a great resource with lots of information around supported experiments. My own website, Joannemilesconsulting.wordpress.com, also has articles, research summaries, tips and hints for people to look at. And the Evidence Based Teachers Network is a really rich and interesting resource, with research summaries, tips and suggestions for people to explore, and that's ebtn.org.uk.

Paul: Fabulous, some good resources there. But there is one question that's gnawing away at me really as a practitioner, and a former CPD manager. Who gets to choose the ideas that people work with in supported experiments?

Joanne: Such a great question, and this is quite a contentious one. In some cultures that are quite top-down and quite leadership-led, there's an instinct and a desire to impose. When we look at the research studies, Dylan Wiliam's work on learning communities, Geoff's work on experiments, and my own experience across the sector, they would suggest to us that the most powerful, impactful work is when the teachers own it: when they choose, when they look at their context and something relevant, situated and contextualised emerges that they want to explore. Then we get relevance, we get motivation, we get dynamism. We get so much more growth than if a senior leadership team says numeracy is our hot topic of the year, we're all experimenting with numeracy.

Paul: We've seen a few of those, have we?

Joanne: We have seen that. So I would really encourage that open ownership. When we look at other staff development research, work from CUREE, work from the Teacher Development Trust, work from the Education Endowment Foundation, it all talks about the importance of the teacher owning the development to really harness that motivational journey. So, I think for leaders it is really important to step back and allow it to emerge from the setting. Having said that, I do encourage colleges and schools to look at their own evidence base in order to select things that they feel are powerful for students. People are often looking at their own data around assignments and assessments, course and qualification data, punctuality and attendance, their own observation reports in their own departments, their own self-evaluation reports, their SARs, and their course evaluation data from students, as a really interesting, rich source to help them select something that could be powerful for learning. But I think the ultimate decision has to sit with the practitioner and the team. So it's a time for the leaders to step back.

Paul: And certainly, I think one of the core learning points from what you've been telling us is that supported experiments can support an elevation of the professional culture in ways which empower, develop collegiality, develop ownership, and also stimulate innovation and creativity. I think that's what we've been suggesting here: a whole range of very, very positive impacts, which organisations are constantly striving for anyway, but for which supported experiments perhaps haven't in the past been the first vehicle they've turned to. What we're saying is there's enormous potential. So let me come to our next question. Now, we know that the culture of FE is busy, busy, busy: full-time teachers are on 25 teaching hours a week, with classes of 20 to 30, working across different subjects and levels. So finding the time to build this into their working schedules is always going to be a major challenge. What planning tips can you give to organisations interested in implementing supported experiments?

Joanne: I think that's a really key point: if we don't have the time allocated, protected and committed, we're going to really struggle to do anything collaborative, because actually the workloads now, the level of marking, the level of bureaucracy in administration and assessment, are so massive for staff that there really aren't many spaces. So one of the key things for me is what Matt O'Leary and his team, who wrote the big report for FETL around leadership and prioritising teaching and learning, call structured autonomy. This is the notion that leaders really need to create spaces in the timetable in which teachers can act in a really independent and owned way. If there's no structured space made there, you're going to really struggle to have meetings where you can discuss research, identify your timeline and troubleshoot together; we need to have those spaces. My own learning is that we need allocated time every two to three weeks. And ideally, we need an hour; if we can have an hour and a half, I'm even happier, because we'll get richer, more in-depth conversation. Dylan Wiliam, in his work on professional learning communities, suggests it's monthly. My own feeling is that every two to three weeks gets us more action, gets us more reflection, and drives more change. You can take your pick, two to three weeks or every month, but definitely the allocated time is critical. But within that time, we need the right ethos. There's something for me about thinking around, and communicating really clearly about, the ethos of this work. It's about the leadership teams giving the shape of this work to the teachers, creating safety to take risk, putting the ownership very vocally and explicitly in the teachers' court, not defining the focus for the experiments, and encouraging the risk-taking. And in many cultures where I work, the leaders actually need to openly say: we want you to try new things.
We want innovation, we want experimentation, we want you to feel safe to do things you haven't done before, some of which might work, some of which won't work. And we're going to value the learning on both sides of those things. The importance is the experimentation, in order not to stagnate in your practice, to keep growing, to keep changing, to keep evolving as a practitioner. So, I think there's huge value there, but we need the leaders to message that it has value and importance. A really interesting read for people, I think, is FETL's recent report called The Role of Leadership in Prioritising and Improving the Quality of Teaching and Learning in FE. Professor Matt O'Leary was part of the team writing it. They talk a lot about the importance of structured autonomy, teacher ownership, and the value of things like teacher triangles, teacher squares and communities, so that there's this exploration of practice. A very interesting read if you want to look at the wider leadership level around this work. So I think structured autonomy and leadership tone are important, and I think allocating time is absolutely fundamental. The third point I'd make is around identifying spaces for collaboration. We need safe, reflective spaces that teachers can work in. It could be the tea-and-teacher-talk meeting that people have. It could be the breakfast CPD club space. It could be an online area where people can post resources, thoughts and questions through a discussion board. There could be folders where people are storing information and commenting on each other's schemes of work, each other's resources and experimental case studies. Very importantly, we need physical time when people can get together. So we need team meeting time or CPD slots regularly on the calendar, ring-fenced, allocated and protected for these conversations to happen.
If we haven't got those collaborative spaces alive and online, we're going to struggle to have those collaborative dialogues. The fourth point to consider, I think, is around the quality of the conversations. We need to have trusting, reflective, high-quality conversations. This is where the coaching skills come in, to raise the level and depth of those conversations. We need rules of engagement here, we need ways to talk, we need the notion of listening to each other in thoughtful, open, professional ways, with respect. And we need to be able to challenge each other professionally, to really raise the level of our thinking, to say: how will you know if that works? What will you see happening if it's making a difference for our students? So we need those really peer-led coaching conversations to create those high-quality spaces. Ring-fencing resources is also important. The coaches often need training, and that involves funding, possibly remission to free them up to go on that training. That can operate online at the moment, and I'm doing all my training on Zoom, so it's more flexible for people, but there's still a need to commit professional resources inside your organisation to enable it. Teaching and learning coaches might need paying, and there could be an innovation fund that you need to provide so that people can buy a set of materials, or buy a licence for something to explore using technology. So a small innovation fund that people can bid for is actually a really important thing to ring-fence. And the final thing is around how you plan this work, and how you monitor it. The most successful projects I've seen run as projects: people perceive, structure, think about and implement them using project management techniques. A lot of the support I do is around one-to-one planning with the CPD leader who's going to drive the cycle, and small planning group sessions on Zoom with the cluster of coaches who might be leading those meetings.
We need a clear timeline, we need purpose and focus at each stage, and we need to think about how we're going to engage teachers in each part of that process: using question prompts, using resources that they might want to deploy, using ways to capture the thinking and learning at different parts of the process. We need a really good piece of project management thinking to bring this cycle off the page and into life. So, for me, all of those tips will help people. And almost whenever I see people missing any of those tips in their cycle, you can see problems starting. Without the project management, we don't get the momentum. Without the coaching, we don't get the quality of conversation to drive the improvement. Without the time, we just can't start. And if the leaders don't create the ownership for risk-taking, teachers don't feel safe, and therefore they create an experiment that is something very, very simple, not very challenging, not very exploratory. So for me, we need all six in the picture to really make this happen.

Paul: Very, very strong argument there, very, very powerful. My experience is that, on the one hand, there are always objections on the grounds of: we don't have enough time, we don't have enough resources or funding to manage what might sound like a luxury CPD innovation. But actually, having been a CPD manager, and Joanne, I know that you were managing CPD some time ago as well, you know budgets are not insubstantial. And it doesn't take too much money to put aside what you've just described as an innovation fund, which could support small-level projects, you know, catalyst projects, things that start off. They're real experiments in the sense that perhaps sometimes the destination, or an outcome, is not yet certain, but the process itself is empowering for those staff, and also for students. The fact that students can also take part in this process is an incredible piece of collaboration between those two partners. And it really is a kind of learners' partnership type initiative, isn't it, with supported experiments very much working alongside students in that regard.

Joanne: Absolutely. I mean, I've noticed in lots of settings, the feedback is that teachers are gathering feedback from students much more frequently during these experimental cycles. And it's actually a dialogue about what are we learning together? And how are we learning? And that's being fostered by the fact that we're in this experimental cycle. And I totally agree with your point about making the time and that, in fact, it's about prioritising where you put your resources. Many, many colleges have funds and money that they're using for certain things that actually don't have impact, sending people on expensive short courses that don't feed back into the organisation, taking people out to an expensive event, whereas they could spend that money on releasing peer coaches to spend time working with a whole team of people on a cycle. I think there's something here about deciding whether you're going to incentivise your teaching and learning coaches with remission, with an honorarium payment, or whether it's some vouchers at the end of the year and a drinks event. I think there are many ways to cut it. But actually, if you prioritise the budget to shift the culture, you can get much bigger impact from the fund that you've got.

Paul: And wouldn't it be amazing if, actually, staff and managers got together to talk creatively about how budgets could support that range of initiatives? Considering the amount of money that, as you say, is spent on sending people to a variety of external events, some of which are likely to be very powerful and very informative, but others perhaps less so. Maybe a re-look at the impacts of different CPD initiatives will suddenly make people realise that actually these supported experiments are comparatively low cost, but very powerful in terms of their impact. Can I just ask one last thing? It struck me, as you were talking. Once you, the consultant, leave an organisation, and you leave the staff to it, you know, you've supported them with the process, you've perhaps developed their confidence in managing aspects of that. But then you leave them. How do they take that forward? You know, what do you leave them with to take supported experiments forward, so that it becomes an integrated part of their culture?

Joanne: I try and make sure I leave a really good sense of the skeleton underneath the process, so they have a sense of the planning timeline and the priority areas for each stage. Often I'm leaving templates that they will be using for structuring the planning at each stage. And even some of the question prompts that people are using, I've written for them, so that they can actually have high-level conversations. They just adapt my wording then, but they've got something to go on to structure the conversations at that point at the beginning, when you're planning, that midpoint when you're reviewing, and that final point, when you're looking at the outcomes. I always try and encourage people to think about what's the infrastructure going forward. Have you then got a clear way of operating experimentally at team level, at the whole-culture level, on teacher training programmes? Have you built the repository where these things can be stored? Do you have a good teacher toolkit area, where teachers are encouraged to put case studies and related resources, capturing their learning? And where are the sharing-practice mechanisms across the year, so that teachers can keep talking beyond their team, beyond their clique, about their experimental practice? So show and tell sessions are really powerful; carousel meetings, where people move around tables, talking to colleagues really informally about their work; poster displays, so you can have a gallery walk, can also be another interesting one to explore; the showcase event, where people are showcasing things that they've been exploring in their curriculum area. Those operate across the year now in different settings. And that's a sustainable infrastructure, where you can feed the experimental work to make sure it gets disseminated, to spark more thoughts, to develop more practice. You're absolutely right: if they use the consultant and then just drop all the mechanisms and tools, there's no long-term learning and we haven't built any capacity. But we can build capacity if we keep those mechanisms in place beyond the life of the initial project.

Paul: Some terrific ideas there. Thanks very much, Joanne. We've just got time for one last question. What skills do teachers learn during supported experiments that might prepare them to be successful in other programmes, such as the ETF's Advanced Teacher Status programme, or the Advanced Practitioner Programme?

Joanne: I think the first point to make is the importance of owning your own development and really situating it in your own context. So this is, as we said before, talking about relevance, making it really contextualised; it's therefore more likely to impact your thinking and your practice. So it's really giving those skills of focusing down on your learners and your learning situation. You need to also, in all of those programmes, look at yourself against the bigger picture. So there's a reflective skill of looking at the teaching standards, maybe the ETF ones, or all the criteria on your programme, and then contrasting your own practice with that of your colleagues. Getting that sense of professional identity and fit in the bigger picture, I think, is a skill that people learn on these cycles. Articulating your rationales for your own use of TLA approaches is part of it too. So starting to think about what I'm doing in my practice, why I'm doing it and how I'm doing it, and articulating those things in a really explicit way. We learn to do that in experimental cycles. We definitely do that on all the programmes that you've described. There's a need to plan your own work, plan your own developmental practice, in supported experiments, but also on a development programme like the ones you've described. We have to learn to reflect, so those reflective skills in the conversations that we have with colleagues are also developed in both of those spaces.
Having a colleague challenge you to really think about why you've done things in a certain way, and what you've actually noticed in granular detail, is a really useful skill to develop in an experimental cycle, and you transfer that into growing your practice across assignments and assessments, or any kind of training programme. There's the peer coaching skill of actually learning to listen and reflect together with colleagues as well, I think, on these programmes. And there's starting to share our practice and not be possessive about the work we're doing: we need to share it in supported experiments at the dissemination stages and through the coaching conversations, and on Advanced Teacher Status and Advanced Practitioner programmes we share with others through webinars, through case study work, through putting resources onto Flipgrid. It's that openness to being in a more collegial conversation, I think, that we develop in all of this kind of work.

Paul: So very much, supported experiments is professionalising in every respect, whether it's at a cultural level, at an individual level, or at a whole-institutional level. It enables people to work together far more powerfully, in a far more evidence-based way in terms of the decisions that are made. Working with learners as partners, there seem to be a lot of strengths there. And perhaps, for those who are listening in and thinking about whether supported experiments is for them, they'll take heart from just how clearly you've articulated the steps through which an implementation could take place, with a view that there are plenty of benefits at the end of that cycle. And perhaps they can consult your website for free materials, blogs and advice, and get in contact with you should they want to know a little bit more about how to do this effectively within their own organisation.

Joanne: That would be absolutely great. I mean, one final point to make is the flexibility of the model. It can definitely work as a whole-organisation approach. It works for a team of teachers in a subject area looking to explore things within their curriculum practice together. It works with a coach working with one or two teachers in a tiny cluster, as part of developing their practice. So to me, the other benefit, apart from the things we've already discussed, is the flexibility of the model to operate at small and large scale, which I think makes it a hugely exciting professional development tool. I'd love to hear from people interested in talking about the model a little bit more, and obviously from people who might want some support in putting something into place, maybe ready for the next academic year.

Paul: Well, thank you. We've come to the end of our podcast today on supported experiments. I, for one, have certainly been inspired by the many examples of good practice and positive impact that we've heard from you today, Joanne. May I extend a warm thank you to Joanne for sparing the time in a very busy schedule to come and speak with us today. Supported experiments clearly have enormous potential for improving the culture of teaching. What we need, I suppose, and what we've heard about today, are the conditions to seed these, allow them to prosper, and feed into wider communities of practice across the sector. Now, if you'd like further information on supported experiments, please email me at Paul.Tully@etfoundation.co.uk, or contact Joanne Miles directly. Joanne's website can be found at Joannemilesconsulting.wordpress.com, where there are lots of materials and advice, as well as opportunities to discuss your projects with Joanne herself. I look forward to further podcasts on professionalism in the near future, so please join us for those. And in the meantime, thank you for listening.

Joanne: Thanks a lot for such an interesting conversation together. And if people want to reach out to me via Twitter, I'm @JoanneMiles2, and it would be great to make contact with people there.

Further information 

Twitter

  • #FEResearchMeet 
  • #UKFEChat
  • #FEResearchPodcast    
  • Joanne Miles: @JoanneMiles2
  • Jo Fletcher-Saxon: @JFletcherSaxon
  • @LSRNetwork: Learning and Skills Research Network
  • @RCGResearch: The Research College Group brings together 10 founder organisations, general further education and sixth form colleges that lead practitioner research.
  • Sam Jones: @FE_Lecturer 
  • #APConnect 
  • @touchcons_FE

Websites