Is AI the Future of Differentiated Instruction?

“Just on the border of your waking mind

There lies another time

Where darkness and light are one,

And as you tread the halls of sanity

You feel so glad to be unable to go beyond

I have a message from another time.”

–Jeffrey Lynne, Electric Light Orchestra

Teachers, the future is upon us in education. Fifty years ago there were no computers in the classroom. Movies were shown on film projectors, and copies were made on mimeograph machines that left the papers with addictive aromas. The only “messages from another time” we had came from popular media: glimpses of a future run by computers in the form of HAL, the heuristically programmed algorithmic computer from 2001: A Space Odyssey (1968), and The Jetsons (1962). In just over half a century, we have come face to face with artificial intelligence that can answer questions, solve problems, and carry on a conversation. We have truly moved into a new era, eerily described in the tagline to an early-60s science fiction series, The Twilight Zone: “It is the middle ground between light and shadow, between science and superstition, and it lies between the pit of man’s fears and the summit of his knowledge.” As teachers, we are afraid. We are afraid that we will not be able to help our students succeed. We are afraid, and then angry, that administrators continue to infringe upon our rights to act on our instincts and draw on our professionalism to do right by students. Now we are afraid that our calling and our livelihood will be stolen by machines and their makers. We are inundated by questions. Will we be replaced by robots? Will artificial intelligence become the norm for guiding instruction? Will machines do a better job than we can? Do we relinquish control or fight for freedom? Yes, this is overdramatized, yet the struggle is real.

Artificial intelligence is an insidious twist of technology that already controls much of what we see, do, and hear, whether we realize it or not. Your phone autocorrects your words. Your email app finishes your sentences. Your web browsers and social media apps are gatekeepers for what you see and hear. Siri tells you where to go. Alexa chooses your music. And for the most part, it is helpful. But will artificial intelligence usurp the role of the teacher? Neil Selwyn highlights “the deeply relational nature of teaching and learning,” describing teaching as an art and craft rather than a science (Selwyn, 2022, p. 626). Selwyn cites Siddarth et al. (2023), who wisely state, “even human motivations and basic reasoning capabilities fundamentally arise out of social interactions rather than as individual decision-making capabilities” (n.p.). And while a computer can “make decisions” based on common practices, or what usually happens, in the end technology is a tool, and artificial intelligence is a tricky bit of technology.

Conditions for Change

At a summit hosted earlier this year by the Stanford Institute for Human-Centered Artificial Intelligence, participants outlined four conditions that have made the education profession ripe for the incursion of artificial intelligence (Stanford HAI, 2023).

  1. Data collection has evolved to the point that student performance can be captured and tracked in real time, and differentiation relies on continual assessment. Rather than waiting for results from once- or twice-a-year standardized tests, teachers can view analytics from educational apps or learning management systems that show how students are faring at any point in time. For example, OTUS is a learning management system able to track data from the big players like PSAT or NWEA as well as from individual standards taught in the classroom.
  2. Educational technology is now scalable. Instead of buying single-use licenses or individual discs for individual computers, educational technology is now cloud-based and can reach billions of students all over the world. For example, PhET, an interactive simulations project founded at the University of Colorado Boulder in 2002, has evolved from a collection of physics simulation videos into a library of 164 interactive simulations across all of the core sciences plus math, published in 116 different languages, with over 1.1 billion simulations delivered to date. Differentiation embraces equitable solutions.
  3. With the rise of empiricism and research, learning and brain science reveal better ways to teach; the way we’ve always done it isn’t necessarily the best way, especially in light of new technologies (Stanford HAI, 2023). For example, word problems have been a staple of mathematics instruction since the ancient Greeks. However, Keith Devlin, an emeritus mathematician at Stanford University, makes a good case against contrived word problems that appear to apply to the “real world” but in actuality present scenarios that do not exist: “I cringe whenever I see an elementary school textbook present a problem such as ‘If a quarter of a pound of ham costs $2, how much will three pounds cost?’ Any child who has accompanied a parent shopping for groceries knows that things are cheaper per pound when you buy a greater quantity. As a result, though the child may eventually learn to solve such problems the way the textbook wants, the real lesson being imparted is that mathematics is a stupid, arbitrary subject having no relevance to the real world” (Devlin, 2010). Rather than arbitrary word problems, educational technology can provide opportunities for students to interact with simulations that portray real events as they unfold. Authenticity is a keystone for differentiation.
  4. Advances in technology have made all of the above solutions possible. The biggest threat to their usefulness in education is the desire to monetize the work and once again ensure that the best methods are available only to the privileged, further marginalizing already marginalized populations. In adopting educational technology, UNESCO admonishes that educational policy makers must “Analyse the potential tension between market rewards and human values, skills, and social well-being in the context of AI technologies that increase productivity” and “Define values that prioritize people and the environment over efficiency, and human interaction over human-machine interaction” (UNESCO, 2019, p. 32).

Modified Roles for Teachers and Students

As mentioned earlier, artificially intelligent programs make many decisions in the background for students and teachers, which leaves users suspicious of manipulation. On more than one occasion in my classroom, students have complained of not being able to access appropriate information to respond to prompts or to use as examples. Yet when they type the same query into my teacher computer, they come up with exactly what they need. I blame this on hidden algorithms that offer users what they always look for, based on frequencies, rather than what they actually want, based on the query itself. Can using other AI technology such as ChatGPT resolve these problems? Perhaps, but any response received from Googling or asking a chatbot must be viewed with a critical eye. Can students find any answer on the internet? Yes, but this process will change what is important for learners (Chen, 2023). In addition to finding possible right answers, student searches may just as often return the wrong answer. Thus, students must be taught to thoroughly vet information received as a result of a query. In this way, students are being moved from roles as information producers to roles as information managers and quality control experts (Stanford HAI, 2023). These are not skills they already have, nor are they roles that teachers have historically needed to teach. Even with the emphasis in past years on vetting websites using tools such as the CRAAP test, the skills to verify information shared with such authority (if it’s on the internet, it must be true) will be hard to teach and harder yet for students to value. So rather than just sending students off to find information on their own in separate stations to fulfill the need for differentiated activities, the teacher must take time to help students understand how to construct discourse with generative AI to ensure that the answers received are actually the answers wanted.

Reconciling this shift toward learning by discourse rather than learning by rote may not be as much of a stretch as first imagined. Schools now envision their graduates as students who know how to learn, to collaborate, and to participate in society. In many ideal graduate portraits presented by school districts, there is no mention of math, science, or reading achievement (Stanford HAI, 2023). For example, the Santa Clara Unified School District in Santa Clara, California lists eight qualities for its ideal graduates: resilient in mind and healthy in body, critical thinkers, collaborative problem solvers, future-ready learners, effective communicators, inclusive empathizers, equity ambassadors, and model global citizens. The district explains that these attributes “describe the knowledge, skills, dispositions, and mindsets that Santa Clara Unified School District’s students need to thrive in life and career” (SCUSD, n.d.). No core subjects are mentioned. The school district I currently work in, a small rural district in Michigan, wants graduates to be ethical, respectful, responsible, goal oriented, critical thinkers, and inquisitive. No core subjects are mentioned there, either. Overall, the metaskill most prevalent across the research is problem-solving (Stanford HAI, 2023). This makes sense, considering that most of our students will be vying for jobs that do not yet exist (World Economic Forum, 2016). But it also suggests that for differentiation to be meaningful, it must embrace these qualities as well. Differentiated instruction must provide students with opportunities for collaboration, critical thinking, problem-solving, and communication. Just knocking off a few practice problems from the bottom of the page won’t meet educational goals for future learners.

What does that mean for teachers and students? It means that there will need to be plenty of opportunities offered for students to not only apply knowledge in authentic scenarios, but also transfer and use the knowledge in other disciplines in order to make sound decisions. 

Generative AI like ChatGPT can help simulate conversations that may lead students to possibilities for application and transference, another keystone for differentiated instruction. The teacher will continue to be the guide that helps students discern what is factual and true and what is not.

Benefits of Artificial Intelligence for Education

In an era when the list of teacher tasks keeps growing (conservatively, I counted 35 different apps and programs I am responsible for keeping up with each day in my current position), teachers can feel isolated. With so many details to track and decisions to make on a daily basis, time for collaboration with others can be difficult to find, much less time to plan for differentiated instruction. And isolation can be an adaptive measure: teachers withdraw from communicating with others because it interferes with their ability to get all of their work done (Ostovar-Nameghi & Sheikhahmadi, 2016). While professional isolation has the potential to lead to burnout, a collaborative atmosphere is conducive to professional growth and job satisfaction. But who will teachers collaborate with? Other teachers are experiencing similar workloads. Experts at the AI+Education Summit (Stanford HAI, 2023) suggest that AI might help by providing enhanced personalized support for teachers at scale. After all, instructional coaches must spread their expertise across entire buildings; they won’t always be available. But generative AI is a few keystrokes away, any time, any place, and is capable of generating a lot of ideas, really fast. How helpful would that be to a teacher who wants to incorporate differentiation strategies but finds themselves short on time?

Stanford experts believe that teachers can use artificial intelligence applications such as ChatGPT to improve their instruction. Teachers can ask ChatGPT to generate curriculum ideas, such as “How can I explain slope in a most engaging way for students?” (Stanford HAI, 2023, n.p.). While a teacher can certainly find resources to answer questions like this on their own, they can’t operate at the speed of an app like ChatGPT. In a classroom where time is already tight, getting timely help could really make a difference. How about this for another useful application of ChatGPT: what if ChatGPT were asked to play the role of a student? A teacher could ask ChatGPT to carry out a set of instructions for an assignment. If the program can do so accurately, then the instructions were probably clear and precise. When generative AI is used to simulate student responses in this way, teachers can test and tweak differentiated learning strategies before implementing them in the classroom, and save days of reteaching over the course of a year. Clarity is essential if students are to work through differentiated activities independently.

Summit experts also suggested that AI could provide an efficient way to generate samples and examples to share with students (Brown et al., 2023). Imagine, for example, that a teacher gives students a prompt to respond to. The best teachers create examples to share with students, talking through those examples to highlight the best strategies for responding. Creating examples takes a long time. But what if the teacher shares the prompt with ChatGPT? Not only will ChatGPT respond to a query like this quickly; with some minor alterations to the prompt, the teacher can generate multiple examples. In fact, ChatGPT can generate responses in real time, and in sharing and evaluating those responses with students, the teacher becomes a reader and editor just as the students are asked to be. In this simulated, AI-generated dialogue, the teacher becomes a participant in the learning process rather than just an authority in the classroom (Brown et al., 2023). An added benefit of a think-aloud session like this is that the teacher can help students evaluate the success of the response, showing them that AI doesn’t always provide right answers. That is powerful teaching that ensures students will be successful when they are released to work independently on similar activities during differentiated instruction.
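To make those “minor alterations to the prompt” concrete, here is a minimal sketch of how one base prompt could be varied into several distinct requests, each sent to a chatbot in turn to collect contrasting examples. The function name, prompt wording, and sample prompt below are my own illustration, not anything prescribed by the summit.

```python
def make_prompt_variants(base_prompt, audiences, quality_levels):
    """Turn one writing prompt into several requests for sample responses,
    varying the intended audience and the quality of the example."""
    variants = []
    for audience in audiences:
        for quality in quality_levels:
            variants.append(
                f"{base_prompt} Please write a {quality} sample response "
                f"appropriate for {audience}."
            )
    return variants

# One prompt becomes six contrasting requests for a chatbot.
examples = make_prompt_variants(
    "Explain why the seasons change.",
    audiences=["a 7th-grade science class", "an honors earth science class"],
    quality_levels=["strong", "average", "weak"],
)
print(len(examples))  # 6
```

Asking for a deliberately weak example alongside a strong one gives the class something concrete to compare during the think-aloud.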

There are also some benefits of using generative AI for students that are not readily apparent. Students are often reluctant to ask or respond to questions in class for fear of judgment from their peers. Teachers also often neglect to extend learning and understanding by pursuing a line of inquiry proposed by a student during whole-group instruction because there just isn’t time. So how does generative AI fit into the picture? AI is a machine. It is a safe entity to pose questions to, because its job is to present information, not judgment (Kahn & Piech, 2023). This also means that students working at higher or lower levels, or who have more or fewer questions to clarify, can do so with anonymity. AI is infinitely patient. A machine will answer as many questions as a student has time to ask. Again, this supports independent, student-centered work, a keystone process in the differentiated classroom. AI is also able to create example responses to writing prompts to help students get started with a writing task. But wait, isn’t that plagiarism? Not if the teacher shows students how to evaluate and edit the response to not only answer the prompt more correctly (ChatGPT answers are often redundant and miss critical points) but to include nuances that reflect their own writing voice. Since when is learning effective editing techniques a bad idea? (Never.) Students are asked to do more at higher levels than ever before; getting a jumpstart on the thinking process can mean the difference between a complete and an incomplete assignment, and can also ensure that all students start on a level playing field. A really intriguing use of this thought-provoking “jumpstart” is role-playing with ChatGPT. Students can converse with ChatGPT to help them understand course content when they need to, and to elaborate on material already known. That’s essential differentiation, too. Imagine the possibilities:

  1. “Hello ChatGPT, I am Thomas Jefferson. I would like to discuss my role in drafting the Declaration of Independence.”
  2. “Can you play the role of Elizabeth Bennet from Pride and Prejudice? I want to know more about your thoughts on love and marriage.”
  3. “Bonjour, ChatGPT. I am Marie Curie. Can we discuss my discoveries in radioactivity and how they have impacted science?” (Daccord, 2023)
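Prompts like these follow a simple pattern: a persona, a topic, and optionally a level to pitch the conversation at. A teacher preparing several role-play stations could template that pattern rather than retype it. This is a hypothetical sketch of my own, not a tool from any of the sources cited here; the function name and phrasing are invented for illustration.

```python
def role_play_prompt(figure, topic, level=None):
    """Compose an opening message that asks a chatbot to stay in
    character as a historical or literary figure."""
    prompt = (
        f"Please play the role of {figure} and stay in character. "
        f"I would like to discuss {topic}."
    )
    if level:
        # Optional: pitch the conversation at the class's level.
        prompt += f" Keep your answers at a level appropriate for {level}."
    return prompt

opener = role_play_prompt(
    "Marie Curie",
    "your discoveries in radioactivity",
    level="a high school chemistry class",
)
print(opener)
```

The same template then produces openers for Jefferson, Elizabeth Bennet, or any figure a differentiated station calls for.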

As with any new technology, it is crucial to remember that the technology is a tool. Generative AI is a tool. Spell check, autofill, measurement tools, online equation solvers, and simulations are tools. Denying access to these technologies won’t make them go away, nor will it make them less intriguing to students. Instead, teachers should use access to and discussion of these tools to create a culture of collaboration and thoughtful evaluation. Teachers have a “professional responsibility to choose and deploy AI in ways that contribute to the healthy development of learners’ extended cognitive ecosystems, as well as provide insight into teachers’ future work and learning practices” (Adams et al., 2023, p. 2). AI can differentiate content and task for students, and for teachers, too.

A Cautious Approach to Artificial Intelligence in Education

Putting the teacher in charge sounds easy, but again, it’s one more thing to add to a plate of responsibilities that is already full. And even though AI can be considered just another educational technology tool, there are some issues inherent to the AI model that need to be acknowledged.

First, AI is goal oriented (Office of Educational Technology, 2023). The decisions AI makes by analyzing and summarizing data do not reflect the nuanced judgment that machines simply cannot exercise. Data is “value-laden” and should be viewed with an understanding of perspective. A struggling student may have a completely different view of earning a 100 on a quiz than a student who routinely makes such scores, and those differences should be validated. A machine can provide students with specific learning pathways, but it is unable to understand that the death of a pet is going to affect progress and results (Moser et al., 2022). Empathetic responses are ensured by the humans in the room.

As cited by Selwyn (2019), Murray Goulden draws a distinction between ‘technologically smart’ and ‘socially stupid’ systems; “the concern persists that there are not enough data points in the world to adequately capture the complexities and nuances of who a student is, or how a school functions” (Selwyn, 2019, p. 12).

At scale, AI can amplify really bad decisions (Stanford HAI, 2023). Humans, including teachers and educational administrators, have always made biased, illogical, and just plain bad decisions. Education is full of occasions when teachers and school leaders might benefit from additional advice or an automated nudge in the right direction. Why wouldn’t one want additional input when making important decisions, even if it is machine-generated? After all, the ultimate decision couldn’t be any worse for having outside input, right? Yet being no worse than a human “does not justify the adoption of flawed AI technology in an educational setting” (Selwyn, 2022, p. 4). What it does mean is that flaws, including those described below, need to be recognized and taken into account.

  1. AI output does not reflect true cultural diversity. If available data is based on information from a predominantly white, middle-class, male population, then answers based on that data will not reflect equity (Chen, 2023). Along similar lines, AI models amplify discrimination found within the data, which can result in low automated scores being attributed to subgroups who often score low, and higher automated grades being given to students who fit the profile of those who have historically scored higher.
  2. AI output isn’t designed solely as a learning tool. When AI generates answers, it does so as quickly as possible. Context isn’t considered, and phrasing at a level appropriate to ensure student understanding just doesn’t happen. Processes delegated to generative AI need to be carefully monitored, especially during independent, differentiated tasks.
  3. Incorrect responses are presented in exactly the same way as correct answers and look legitimate. Why? The machine can’t distinguish things like sarcasm or when to use parentheses in a mathematical equation. Askers and differentiators beware (Chen, 2023).
  4. Use of AI tends to promote a double-sided motivation crisis (Chen, 2023). On one side, students often fall prey to the desire to “outsource productive struggle,” looking for quick answers and easy responses without much thought, and without any evaluation of whether answers and responses are correct or not (Brown et al., 2023, n.p.). Ultimately, the work gets done–fast. On the other side, consider students currently working toward degrees in computational sciences. Many are concerned that they may become obsolete when AI “learns” enough to take over their jobs (Kahn & Piech, 2023).
  5. Algorithmic bias can influence the answers that a student receives. When responses are given based on preferences, rather than on accuracy, wrong or incomplete answers may result (Office of Educational Technology, 2023). Perhaps for the differentiated classroom, this means correct responses should be shared and students encouraged to figure out why the answers didn’t match.
  6. AI may support student-centered educational strategies such as differentiation of assessment and tasks, but may, at the same time, erode a student’s ability to interact with other humans with respect and empathy (Adams et al., 2023). To avoid this erosion, differentiated tasks must include collaborative activities.

Conclusion

Artificial intelligence has the potential to positively impact future teachers as they begin to dialogue with generative AI such as ChatGPT to tell stories, debate, role-play, and serve as thought partners for curriculum development or test subjects for new teaching strategies. Future students will be freed from repetitive rhetoric and tedious practice, skipping ahead to conversations and ideas that lead to deeper, transferable understandings and to making new things in ways that were not possible in the past. “But with great power comes great responsibility” (Spider-Man, 1962). Using generative AI in the classroom means that teachers need to know the concepts, skills, and ethical considerations surrounding the creation and use of AI (UNICEF, 2021). Teachers need to be able to engage with AI as skilled users, but also need to recognize data biases that may pose ethical dilemmas and infringe on user rights (UNICEF, 2021). “Shifting to augmented intelligence leads to an emphasis on developing AI technologies that complement and expand human cognition, suggests ways that humans and AI might work together more effectively, queries how tasks should be divided between humans and machines, and raises the tantalizing possibility that the world’s problems might be addressed by means of a judicious mix of artificial and collective intelligence” (UNESCO, 2021, p. 12).

Check out the video version of this paper. Check out “Twilight,” by ELO.

References

Aas, H. K. (2021). Learning through communication: exploring learning potential in teacher teams lesson study talk. International Journal for Lesson and Learning Studies, 10(1), 47-59.

Adams, C., Pente, P., Lemermeyer, G., & Rockwell, G. (2023). Ethical principles for artificial intelligence in K-12 education. Computers and Education, 4, 1-10.

Biesta, G., Priestly, M., & Robinson, S. (2017). Talking about education: Exploring the significance of teachers’ talk for teacher agency. Journal of Curriculum Studies, 49(1), 38-54.

Brown, B., Brunskill, E., Levine, S., & Thille, C. (Directors). (2023). AI+Education Summit: Envisioning AI Enriched Classrooms [Film]. Stanford University. https://youtu.be/5fWclPBzaRk

California State University. (2020, September 21). CRAAP Test Separates Good Resources from Bad. Chico State Today. Retrieved June 16, 2023, from https://today.csuchico.edu/how-to-craap-test/

Chen, C. (2023, March 9). AI Will Transform Teaching and Learning. Let’s Get it Right. Stanford HAI. Retrieved June 15, 2023, from https://hai.stanford.edu/news/ai-will-transform-teaching-and-learning-lets-get-it-right

Daccord, T. (2023, March 23). ChatGPT Teacher Tips Part 1: Role-Playing Activities. EdTechTeacher. Retrieved June 17, 2023, from https://edtechteacher.org/chatgptroleplaying/

Devlin, K. (2010). The problem with word problems. Devlin’s Angle. Retrieved June 16, 2023, from https://www.maa.org/external_archive/devlin/devlin_05_10.html

Herrmann, B. (n.d.). The Twilight Zone (TV Series 1959–1964). IMDb. Retrieved June 16, 2023, from https://www.imdb.com/title/tt0052520/

Kahn, S., & Piech, C. (Directors). (2023). AI+Education Summit: Realizing the Potential and Mitigating the Risks of AI for Education [Film]. Stanford University. https://youtu.be/8nn442SMzQ0

Liang, P., Goodman, N., Reich, R., & Demzsky, D. (Directors). (2023). AI+Education Summit: Generative AI for Education [Film]. Stanford University. https://youtu.be/Ks7enkKuZIo

Moser, C., den Hond, F., & Lindebaum, D. (2022). Morality in the age of artificially intelligent algorithms. Academy of Management Learning & Education, 21(1), 139-155.

Office of Educational Technology. (2023, May). Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations. U.S. Department of Education.

Ostovar-Nameghi, S. A., & Sheikhahmadi, M. (2016). From teacher Isolation to teacher collaboration: Theoretical perspectives and empirical findings. English Language Teaching, 9(5), 197-205.

SCUSD. (n.d.). District Plans / Graduate Portrait. Santa Clara Unified School District. Retrieved June 17, 2023, from https://www.santaclarausd.org/Page/3469

Selwyn, N. (2019). What’s the problem with learning analytics? Journal of Learning Analytics, 6(3), 11-19.

Selwyn, N. (2022). The future of AI and education: Some cautionary notes. European Journal of Education, 57, 620-631.

Siddarth, D., Acemoglu, D., Allen, D., Crawford, K., Evans, J., Jordan, M., & Weyl, G. (2023, May 9). How AI fails us. Technology & Democracy Discussion Paper. Harvard University. Retrieved June 16, 2023, from https://ethics.harvard.edu/files/center-for-%20ethics/files/howai_fails_us_2.pdf?m=1638369605

Stanford HAI (Director). (2023). AI+Education Summit: Is AI the Future of Education? [Film]. Stanford University. https://youtu.be/xRC9-kZAnBw

United Nations Educational, Scientific and Cultural Organization. (2021). AI and education: Guidance for policy-makers. UNESCO. https://unesdoc.unesco.org/ark:/48223/pf0000376709

United Nations Educational, Scientific and Cultural Organization. (2019). Beijing consensus on artificial intelligence and education. International Conference on Artificial Intelligence and Education, Planning Education in the AI Era: Lead the Leap, Beijing. https://unesdoc.unesco.org/ark:/48223/pf0000368303

World Economic Forum. (2016). The future of jobs: Employment, skills and workforce strategy for the fourth industrial revolution. n.p. https://www3.weforum.org/docs/WEF_Future_of_Jobs.pdf
