In an era when artificial intelligence increasingly shapes decisions in education, it’s critical to examine how these technologies impact historically marginalized communities.
AI offers both promise and peril, and parents have the power to drive this change. By engaging with schools, collaborating with their communities and advocating for transparency and inclusivity, they can ensure that AI serves as a tool for empowerment rather than exclusion.
As a tool of social capital — the networks and resources that help individuals achieve their goals — AI could be transformative in addressing systemic inequities. However, if not carefully designed and utilized, it risks inadvertently amplifying existing disparities instead of addressing them, and that’s why parents need to pay close attention.
AI models are built on data, much of which reflects historical inequality. For example, an algorithm that prioritizes test scores might inadvertently favor schools in affluent, historically less diverse neighborhoods while marginalizing less-affluent schools that excel in cultural responsiveness, inclusivity and fostering equitable learning environments.
For lower-income Black families, particularly those raising Black sons, algorithms that favor affluent schools can reinforce systemic barriers already present in education — from disproportionate disciplinary actions and underrepresentation in curricula to exclusionary practices that overlook the needs of marginalized students.
But if the algorithms can account for the diverse needs of parents, AI-driven school recommendation systems hold immense potential.
Imagine an AI tool acting as a helpful assistant for finding the perfect kindergarten. You could tell it what’s important to you (like location, play-based learning or a focus on the arts), and it could suggest schools that might be a good fit. It would be like having a super-smart friend who knows a lot about schools and can give you lots of ideas.
Similar to how high school students use platforms like Common App and Naviance for college research, parents could use AI tools like ChatGPT to gather information, compare kindergartens based on their criteria and explore potential options, effectively treating AI as a personalized school recommendation system.
For parents juggling multiple priorities, such as academic quality, cultural representation and safety, AI — by streamlining a process often fraught with complexity — could offer valuable insights.
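To make this concrete, one simple way such a system could work is weighted criteria matching: each school is scored by how well it fits the priorities a parent supplies. The sketch below is purely illustrative — the school data, criteria names and weights are invented for this example and do not describe any real product's algorithm.

```python
# Hypothetical sketch of criteria-weighted school matching.
# All data, criteria and weights are invented; real systems are far more complex.

schools = [
    {"name": "Maple Elementary", "safety": 0.9, "cultural_representation": 0.8,
     "academics": 0.6, "arts": 0.7},
    {"name": "Oak Elementary", "safety": 0.6, "cultural_representation": 0.4,
     "academics": 0.9, "arts": 0.5},
]

def recommend(schools, weights, top_n=1):
    """Rank schools by a weighted sum of the parent's stated priorities."""
    def score(school):
        return sum(weights[criterion] * school[criterion] for criterion in weights)
    return sorted(schools, key=score, reverse=True)[:top_n]

# A parent who prioritizes safety and cultural representation:
prefs = {"safety": 0.5, "cultural_representation": 0.3,
         "academics": 0.1, "arts": 0.1}
print(recommend(schools, prefs)[0]["name"])  # prints "Maple Elementary"
```

Note how everything hinges on which criteria exist and how they are weighted: a system that only scores test-score-style metrics, or that omits fields like cultural representation entirely, will systematically rank schools the way the article warns about.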
Yet, as research has highlighted, current school-search tools are not immune to bias. In our research, we are exploring how Black mothers navigate these challenges in comparison to white mothers, uncovering stark disparities in how AI-generated recommendations align with parental preferences.
While white mothers often benefit from AI tools that prioritize metrics tied to privilege, Black mothers frequently encounter a mismatch between their priorities — such as safety and cultural representation — and the algorithm’s outputs.
This discrepancy underscores how community-led approaches to AI development could bridge the gap. By involving marginalized voices in the design process, AI developers can create tools that prioritize equity and inclusivity alongside traditional metrics. By building diverse development teams, conducting user research with target communities and gathering direct feedback from the people who will actually use these tools, developers can gain crucial insight into a wider range of needs and concerns.
But continuous monitoring and evaluation will also be essential to identify and address potential biases within these systems.
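One concrete form such monitoring could take is a recurring audit of whether recommendations serve different groups of parents equally well — for instance, how often the top recommendation actually matched a parent's stated first priority, broken down by group. The sketch below uses an invented audit log and a deliberately simple disparity measure; it is an assumption about how such an audit might look, not a description of any deployed system.

```python
# Hypothetical audit sketch: how often did the top recommendation match the
# parent's stated first priority, per demographic group? All data invented.

audit_log = [
    {"group": "A", "top_priority_matched": True},
    {"group": "A", "top_priority_matched": True},
    {"group": "A", "top_priority_matched": False},
    {"group": "B", "top_priority_matched": False},
    {"group": "B", "top_priority_matched": False},
    {"group": "B", "top_priority_matched": True},
]

def match_rate_by_group(log):
    """Fraction of recommendations that matched the user's top priority, per group."""
    outcomes = {}
    for entry in log:
        outcomes.setdefault(entry["group"], []).append(entry["top_priority_matched"])
    return {group: sum(hits) / len(hits) for group, hits in outcomes.items()}

rates = match_rate_by_group(audit_log)
# A large gap between groups is exactly the kind of disparity an audit should flag.
disparity = max(rates.values()) - min(rates.values())
print(rates, disparity)
```

Regular audits like this, with results disclosed publicly, are one practical way to operationalize the transparency the article calls for below.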
Parents can and should play a crucial role. To harness AI’s potential as a tool for equity, we must address its limitations. This work includes mitigating biases in training data, designing adaptive algorithms that evolve to meet diverse needs and ensuring accessibility for all families. Parents can help by:
- Engaging with schools by attending meetings and workshops and asking how AI algorithms are designed and whether they consider factors like cultural representation and equity.
- Collaborating with community groups and other parents to share resources and strategies. Community-led advocacy can push for AI systems that reflect diverse needs.
- Advocating for transparency and demanding that developers provide clear information about how AI recommendations are generated.
- Participating in research and volunteering for studies exploring the impact of AI in education.
- Driving policy changes. Policies that require equity-focused design in AI systems might include mandates for diverse training data, community oversight in algorithm development and regular audits to ensure AI systems are fair and inclusive.
As AI continues to influence education, it’s vital that we approach these technologies with both optimism and caution. For Black mothers and other historically marginalized parents, AI’s promise to do right by users should not blind us to its risk of perpetuating inequality.
By viewing AI as a form of capital and leveraging it to address systemic barriers, we can create a more equitable future for all children. Together, schools and parents can reimagine education — not just as a system, but as a shared responsibility to give every child the opportunity to thrive.
Anastasia Proctor is a doctoral student at the University of North Carolina at Charlotte, specializing in multilingual education, equity in policy and chronopolitical influences in education. Charlitta Hatch is a doctoral student at the University of North Carolina at Charlotte, specializing in school choice, family engagement strategies and the intersection of race, gender and power dynamics in educational decision-making.
Contact the opinion editor at [email protected].
This story about AI and inequality was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.