Editor’s note: This is part three in a series. If you missed them, you can read part one and part two.

 

Not long ago, states and school districts were implementing one-to-one initiatives designed to make a computer or iPad available to every child. Then, flipped classrooms and blended learning introduced compelling ways of leveraging digital tools to maximize instructional time and student-teacher interactions.

Now, as my colleagues and I discussed in Navigating the Future of Learning: A Strategy Guide, smart technologies such as artificial intelligence (AI) and neuro-enhancement tools promise to enable new approaches and efficiencies that can help personalize learning and chip away at the inequities lurking within our education system.

But if we fail to deploy smart technologies mindfully, we risk exacerbating inequity and compromising the privacy of learners and teachers in ways that could follow them throughout their lives.

“Your personal digital information is a commodity. It has immense value,” said Temple S. Lovelace, an associate professor of special education at Duquesne University whose research interests include social justice and equity through a disability inquiry lens.

Here I consider some of the promises and pitfalls of using smart technologies in our learning communities.

 

Promises

Imagine a classroom where an English teacher can spend more of her time on the themes and content of a learner’s essay because a computer program has already corrected the grammar, syntax and spelling, with explanations.

As explained by Dr. Charles Fadel, founder and chairman of the Center for Curriculum Redesign, that sort of help from smart tech can free teachers to devote more time to more meaningful interaction with learners.

If interactive math tools can help a student learn the basics of fractions, a teacher can devote face-to-face time to putting that knowledge into action on a real-world project.

Beyond helping with relatively menial tasks, Fadel said, technology can help learners build empathy through, for example, a virtual reality tour of a refugee camp. In addition, cognitive fitness protocols that combine diagnostic and therapeutic activities with vetted tools could help educators support learners’ cognitive health and performance.

As these examples illustrate, smart technologies, if used thoughtfully, could be harnessed along with other teaching tools and personal interaction to help learners develop competence and character, as part of an education that goes beyond superficial learning to imparting tools a learner can use in 50 years, Fadel said.

Further, smart technologies can help learners who have different learning styles, and who work at varying paces, master material. Dr. Lovelace said that smart technologies can help learners avoid being defined negatively because they didn’t learn at the same pace in a class where all were expected to do so. “It’s less about an achievement gap than an opportunity gap,” she said. “A lot of times a teacher doesn’t believe a student can learn, but students working on a screen at their own pace can demonstrate achievement.”

Smart technologies can also help teachers personalize learning for students without tying them to screen-based activities. For example, Lovelace pointed out, smart technologies can connect learners with experiences beyond the classroom. Smart technologies could also help design custom configurations of support and learning experiences for students experiencing educational disruption as a result of family instability, homelessness or climate dislocation.

 

Pitfalls

As we wrote in the strategy guide, “Smart technologies are only as good as the code that powers them and as the practices that guide their use.”

Lovelace points out that a lack of gender and racial diversity within the coding community risks positioning schools to deploy programs that have unintended biases that favor some learners over others. “The people who are creating this code are not diverse,” she said. “We have to have diversity and equity in those who are creating the tools.”

Every few years, Fadel said, new technology is hailed by some as the silver bullet that will fix persistent problems within education – we’ve seen this with laptops, tablets, smartphones and more.

But educators and stakeholders must always ask the same questions when considering the use of cutting-edge tech. “It’s the same challenge: Can you prove you have better results with the technology than without it, delivered at a reasonable cost?” he asked.

Education stakeholders can lead the charge, joining community members, learners and policymakers to formalize the rules of engagement around the use of smart technologies in education. Among the approaches we identified in the strategy guide, a Bill of Data Rights could protect against the misuse of learner data. In addition, pursuing long-term dialogue with developers could help shine an ongoing spotlight on ethics.

Fadel is concerned that predictive technology could be used to “pilot” learners down learning pathways that they may not have taken if their agency had not been manipulated by smart tech. This possibility is not dissimilar to the narrowing of options faced by a learner whose teacher wrongly pigeonholed him by failing to recognize his potential. “That’s the trick, to figure out the monitoring systems. It certainly puts more power in the hands of the publisher rather than the teacher,” he said.

Personal interaction between and among teachers and learners must remain at the center of an effective education rather than being swayed by technology’s allure. “Just screens is a disaster,” Fadel said.

 

Stewardship for the Ethical Application of Smart Technologies

As with so many other challenges we face in our rapidly changing society, we have it in our power to approach the application of smart technologies mindfully and effectively. Absent our attention, smart technologies could subvert our intentions, exacerbate inequity and compromise privacy.

As Lovelace said, “All I can look at is the past to inform the future. It does scare me. I love technology, I love the promise of technology. I think that as long as we don’t have diverse representation [among coders], it will be a pitfall, and we will look back and say ‘Why didn’t we stop this?’”

Despite that significant concern, some early indicators suggest that people with a stake in education are beginning to work through such complexities. Among them, the nonprofit AI4All runs summer education programs in artificial intelligence and computer science for under-represented high school students. These programs aim to cultivate a diverse group of future AI leaders and to promote a humanistic view of the AI field.

In addition, the GovLab project seeks to exercise data stewardship by facilitating collaboration across private-sector organizations, public agencies and researchers. This project promotes responsible data leadership and analytical expertise in the service of the public good.

We need more such efforts, with both widespread and local reach. Educators and other stakeholders need to accept their responsibility for stewarding the ethical use of smart technologies in education. Bringing together coalitions to co-create frameworks for smart technologies’ use in education can provide both initial and ongoing guidance.

The time is now to think through smart technologies’ promises and pitfalls and to set up structures and coalitions for pursuing their ethical use in education.

 

About the Author

Katherine Prince leads KnowledgeWorks’ exploration of the future of learning. As Vice President, Strategic Foresight, she speaks and writes about the trends shaping education over the next decade and helps education stakeholders strategize about how to become active agents of change in shaping the future. She tweets as @katprince using the hashtags #NavigateFutureEd and #FutureEd and can be found on LinkedIn.

 

For more on the opportunities described in this post, see KnowledgeWorks’ Navigating the Future of Learning: A Strategy Guide.