Will AI Shrink Disparities in Schools, or Widen Them?

For the past couple of years, unrelenting change has come fast.

Even as schools are stuck dealing with deep challenges, COVID-19 pandemic relief funding is winding down. Meanwhile, new technologies seem to flow out in an unstoppable stream, and they often have consequences for education, from a rise in cheating on assignments enabled by prose-spewing chatbots to experiments that bring AI into classrooms as teaching assistants or even as students.

For some teachers and school leaders, it can feel like an onslaught.

Some educators connect AI to broader changes that they perceive have been harmful to students, says Robin Lake, director of the Center on Reinventing Public Education. Through interviews, she’s found that some educators link AI to social media and cellphones. So they’re having an understandably emotional response, she adds: “It’s kinda scary if you think about it too long.”

But Lake is among those who believe this ever-shifting stream of new technology can be steered toward a more promising channel: reducing disparities in education in the U.S.

For that to happen, though, education leaders must start pushing AI to transform teaching and learning in ways that benefit students, particularly low-income and historically disadvantaged ones, observers like Lake argue.

If artificial intelligence doesn’t help solve disparities, advocates worry, it will worsen them.

Hazard Lights

AI has been used in education since at least the 1970s. But the recent barrage of technology has coincided with a more intense spotlight on disparities in student outcomes, fueled by the pandemic and social movements such as the protests over the killing of George Floyd. AI has fed hopes of greater equity, thanks to its promise to personalize learning and to make the job more efficient and sustainable for an overworked teaching force.

In late 2022, the White House released a “Blueprint for an AI Bill of Rights,” hoping that it would strengthen privacy rights. And last year, the U.S. Department of Education, along with the nonprofit Digital Promise, weighed in with recommendations for making sure this technology can be used “responsibly” in education to increase equity and support overburdened teachers.

If you ask some researchers, though, it’s not enough.

There have been fears that AI will inadvertently magnify biases, whether by relying on algorithms trained on biased data or by automating assessments that sort students into different learning paths while ignoring their experiences.

Now, some early data suggests that AI could indeed widen disparities. For instance: Lake’s organization, a national research and policy center associated with Arizona State University’s Mary Lou Fulton Teachers College, released a report this spring that looked at K-12 teachers’ use of virtual learning platforms, adaptive learning systems and chatbots. The report, a collaboration with the RAND Corporation, found that educators working in suburban schools already report having more experience with, and training for, AI than those in urban or rural schools.

The report also found that teachers in schools where more than half of students are Black, Hispanic, Asian, Pacific Islander or Native American had more experience using the tools — but less training — than teachers who work in majority-white schools.

If suburban students — on average, wealthier than urban or rural students — are receiving more preparation for the complexities of an AI-influenced world, it opens up really big existential questions, Lake says.

Big Promises — or Problems

So how can advocates push AI to deliver on its promise of serving all students?

It’s all about strategy right now: making smart investments and setting smart policy, Lake says.

Another report from the Center on Reinventing Public Education calls for more work to engage states on effective testing and implementation in their schools, and for the federal government to put more detailed guardrails and guidance in place. The report, “Wicked Opportunities,” also calls for more investment into research and development. From its perspective, the worst outcome would be to leave districts to fend for themselves when it comes to AI.

Part of the reason urban districts are less prepared for AI may be the complexity and sheer number of issues they are facing, observers speculate. Superintendents in urban districts say they are overwhelmed, Lake says. She explains that while they may be excited by the opportunities of AI, superintendents are busy handling immediate problems: pandemic recovery, the end of federal relief funding, enrollment declines and potential school closures, mental health crises among students and absenteeism. What these leaders want is evidence about which tools actually work, as well as help navigating edtech tools and training their teachers, she adds.

But other observers worry about whether AI is truly the answer for solving structural problems in schools broadly.

Introducing more AI to classrooms, at least in the short term, means teaching students through screens and virtual learning, argues Rina Bliss, an associate professor of sociology at Rutgers University. But many students are already getting too much screen and online time at home, she says, which degrades their mental health and their ability to work through assignments, and educators should be cautious about adding more.

Bliss also points to a “print advantage,” a bump in how much is learned from print materials compared to screens, which has to do with factors like engagement with the text and how quickly a student’s eyes can lock onto and stay focused on material. In her view, digital texts, especially when they are connected to the internet, are “pots of distractions,” and increasing screen-based instruction can actually disadvantage students.

Ultimately, she adds, an approach to instruction that overrelies on AI could reinforce inequality. It’s possible that these tools are setting up a tiered system, where affluent students attend schools that emphasize hands-on learning experiences while other schools increasingly depend on screens and virtual learning. These tools shouldn’t replace real-world learning, particularly in under-resourced schools, she adds. She worries that excessive reliance on this technology could create an “underclass of students” who are given artificial stopgaps to big problems like school understaffing and underfunding. It wouldn’t be responsible to lean on AI as the quick fix for all our economic shortages in schooling, Bliss argues.

So how should educators approach AI? Perhaps the correct posture is cautious hope and deliberate planning.

Nobody knows precisely how AI will impact education yet, argues Lake, of CRPE. It is not a panacea, but in her estimation there’s a real opportunity to use it to close learning gaps. So it’s important to craft plans to deliver on the potential: “A lot of people freeze when it comes to AI, and if they can instead think about what they want for their kids, their schools, and whether AI can help, that seems like a productive path to me, and a much more manageable one,” Lake says.

There’s nothing wrong with being hopeful, she adds.




Alexandra Williams
Alexandra Williams is a writer and editor based in Los Angeles. She writes about politics, art, and culture for LinkDaddy News.
