Social Mobility in the AI Era: Risks and Opportunities

This month’s blog comes from our Data & Impact Manager, Maddie Kilgarriff. Maddie leads our approach to evidencing and improving the impact of our coaching programmes. Her work ensures that CoachBright remains data-driven and accountable, and that we continually improve in our mission to reduce educational inequality. Before joining CoachBright, Maddie completed an MSc in Philosophy of Science and an MA in Philosophy and Maths. In her spare time, you'll find her reading novels (non-fiction when she's feeling formidable!) or out for long(ish) walks.

"The basic facts of social class performance in school are so well known as hardly to need repeating. As all teachers know, the children who do the best work, are easiest to control and stimulate, make the best prefects, stay at school longest, take part in extra-curricular activities, finish school with the best qualifications and references and get into the best jobs, tend to come from the middle class." (Swift, 2006)

As ChatGPT and other generative AI (GenAI) systems are increasingly used across the UK by young people and educators alike, we face both challenges and opportunities when considering the impact these technologies may have on social mobility. According to a recent survey by the Sutton Trust, 34% of teachers report that their students are already using AI tools like ChatGPT to assist with homework, highlighting both the technology's growing presence in education and the need to assess its implications for social mobility. Many variables will shape this impact, raising key considerations for educators, policymakers, and students. This piece explores how the uptake of GenAI in educational contexts may affect educational equity.

The Attainment Gap 

Discussions on social mobility frequently reference the ‘attainment gap’—the difference in educational outcomes between socioeconomically disadvantaged pupils and their more privileged peers. This gap is significant because educational attainment is a major determinant of future opportunities, including access to higher education, employment prospects, and social mobility. Research from the Education Endowment Foundation (EEF) highlights that the attainment gap has widened in recent years, exacerbated by disruptions such as the COVID-19 pandemic.

The existence of this gap is driven by multiple factors, including unequal access to private tutoring, disparities in school funding, and differences in home learning environments. Students from wealthier backgrounds often have access to one-on-one tutoring, well-resourced schools, and supportive networks that reinforce learning outside of the classroom. In contrast, disadvantaged students may lack these additional educational resources, putting them at a structural disadvantage.

One potential solution to this inequality lies in the use of AI-driven educational tools, which can provide support that disadvantaged students might otherwise lack.

AI as a Tutor

GenAI has the potential to help bridge this gap by offering tailored educational support and feedback, functioning much like a private tutor. AI-powered tools can generate explanations, quizzes, and personalised learning materials, enabling students to receive instant feedback and targeted assistance. This could be particularly transformative for students who may not have access to expensive tutoring services.

Freeing up Teacher Time

Additionally, GenAI could alleviate the administrative burden on teachers, allowing them to dedicate more time to direct student support. As former Education Secretary Gillian Keegan highlighted, teachers today are often overwhelmed by administrative tasks, limiting the time they can spend on hands-on teaching, providing thoughtful feedback, and addressing individual student needs. By automating lesson planning, grading, and paperwork, AI could free up teachers to focus on personalised instruction, particularly for students who need extra help.

The Risk of Overdependence and the Need for Critical Use

While GenAI has the potential to enhance learning, its effectiveness depends on how it is used. AI outputs are not always reliable—models can hallucinate information, meaning that students must be trained to engage critically with AI-generated content rather than accepting it at face value. This is particularly crucial in research and writing tasks, where misinformation could lead to fundamental misunderstandings of a subject.

There is also a risk that overreliance on AI tools could weaken students’ development of critical faculties. If students depend too heavily on AI-generated writing, research, and explanations, they may not fully develop essential academic skills such as independent analysis, logical reasoning, and original thought. These skills are critical not only for higher education but also for long-term career success.

Moreover, the risk of overdependence may not be evenly distributed. Some students, particularly those in environments that strongly value education, may be encouraged to use AI as a supplement rather than a replacement for learning; others may lack this guidance. Research from the EEF suggests that disadvantaged students often face greater challenges in adapting to digital learning tools, as they may have fewer resources at home and less support in learning to use such tools effectively. These students, who may already face barriers to developing strong independent learning habits, could be more vulnerable to passive AI use. If AI is not integrated into education with appropriate safeguards and critical literacy training, it could therefore widen, rather than close, the attainment gap.

Conclusion

While generative AI presents promising opportunities to address educational disparities, it is not a panacea. To truly enhance social mobility, AI must be integrated thoughtfully into the education system, accompanied by digital literacy training and safeguards against overdependence. By doing so, we can harness its potential while mitigating risks, ensuring that technology serves as a tool for empowerment rather than exclusion. Looking ahead, policymakers must focus on ensuring equitable access to these technologies, while researchers should continue to explore how AI can be used to complement, rather than replace, traditional learning methods, particularly for students from disadvantaged backgrounds.