August 21, 2025

Bibs & bobs #25

 OMG, OMG, OMG …

I recently listened to Richard Susskind talk about his recent book [1] and I scribbled some of the ideas in a more or less incoherent note which I dropped into GPT-5. The LLM dutifully turned it into a coherent text [2]. It struck me that since my early dabbling in OpenAI’s sandpit (I’m no programming nerd), my history of experiencing AI of the generative variety, GenAI, has been a series of OMGs quickly followed by a ho hum. The availability of GPT-5 is just another of the many AI-triggered interruptions that first appear as OMG moments.


I think all through this time, formal education, if I can crudely collapse the raft of practices that fall under that umbrella, has been waiting for things to settle while at the same time putting a lot of effort into domesticating [3] the various LLMs and their relatives. Given the OMG pattern, it’s easy to see just how difficult a time it has been for schools, teachers and their students.


Susskind drew attention to a number of other patterns and phenomena associated with AI. He uses the singular term to refer to the myriad AI apps available now and into the future. I will do the same. It prompted me to try to elaborate some of them in terms of the implications for formal education.


Education finds itself in a revolving door of “never… until it happens”. Never could a machine produce an essay good enough to fool a teacher. Never could it solve high-school math word problems. Never could it mimic a student’s writing style. Each “never” has collapsed, often as an OMG moment.


The pattern has become predictable: first disbelief, then dismissal. The anthropomorphising card is often played: “It’s not real understanding”. Of course “it” does not understand, but that does not stop students using ChatGPT to spew out reports, essays, analyses, summaries or anything that is text-based. The task changes in ways that are productive for students and unhelpful for time-poor teachers. The assignment mutates for both.


The adjective jagged is commonly used to describe the range of views, practices and assumptions about AI. To return to an analogy I think is useful, the choice a growing proportion of teachers and students struggle with is akin to the choice humans once had to make between horse and horseless carriage [4]. That choice, significant as it was at the time of the invention of the automobile, pales when compared with the complications that students and teachers currently face when deciding whether to use, not use or co-use AI to do tasks.


The unevenness always reflects context. Leadership, culture, resources, intellectual traditions, and professional commitments all shape the endlessly repeating debates that have now become routine. Over the years, I’ve amassed far too many examples from boosters, doomsters, critics, and pragmatists [5], voices that collectively generate the noisy bubble we now call “AI in education.”


Regulation and policy sit at the centre of much of this noise. Yet education policy almost inevitably lags practice. Departments of Education and accreditation bodies continue drafting rules for yesterday’s technologies rather than today’s realities. But how do you legislate for a phenomenon that is always under construction? Savvy students push boundaries, sometimes tormenting their teachers. Less confident teachers retreat behind the walls of “academic integrity.” Administrators issue sweeping pronouncements that rarely change much in practice, while regulators posture like ghostbusters chasing shadows. The result is a jagged landscape of confusion, exploration, adoption, and rejection which becomes more jagged with every cycle.


All of this is only the surface. It dazzles. Users poke around, and often pay, for the privilege of doing unpaid product development for the tech bros. Beneath the polished surface lies a far more uneven and ugly underbelly: the consumption of electricity and water by data centres, the large number of poorly paid data labelers, the relentless mining and extraction of minerals to fuel each new GPU cycle, and the completely opaque financial machinery in which billions are continually invested and ROIs promised but rarely disclosed.


Perhaps the most unsettling feature of AI is not that it takes over tasks once reserved for students, but that it increasingly learns from itself. Large language models, for instance, can already generate plausible synthetic characters and situations (see, e.g., Ronksley-Pavia, Ronksley-Pavia, & Bigum, 2025). It is hardly far-fetched to imagine automated graders being trained on essays written by automated writers, or synthetic student queries refining synthetic tutors. In such recursive loops, education risks becoming a testbed where human learners are reduced to peripheral participants. Tools built to simulate student learning may soon be training each other more effectively than they ever trained students. The precedent exists: machines long ago stopped playing chess and Go with us and began improving without us by playing themselves.


A fallacy related to machines learning from machines is one Susskind and Susskind (2015, p. 45) highlight. Teachers will often concede that machines can handle the routine, mechanical grunt work of education, but insist they can never be creative, empathetic, or as thoughtful as a skilled teacher. The mistaken assumption here is that the only route to machine capability is to mimic how humans operate, when in fact machines may reach comparable or even superior performance by entirely different means.


Each OMG moment in education can spawn unintended outcomes. Students form underground networks trading effective prompts. Teachers offload routine grading to AI while claiming not to. Publishers push AI textbooks that no one wanted. Each shock doesn’t go away; it morphs into new norms, new evasions, new ways of gaming the system.


Finally, there is the matter of frequency. Each “OMG moment” in AI seems to arrive faster than the last. It took two decades to move from chess to Go, only a few years from Go to image generation, months from images to protein folding, and weeks from chatbots to agents. The intervals continue to compress. For education, the challenge is not simply to keep pace, but to ask which forms of knowledge, intellectual habits, and practices remain worth teaching given this acceleration. More directly: given what machines can now do, what knowledge and skills do humans still need in order to delegate work wisely and achieve good outcomes? It is a question education has largely sidestepped, if not bungled, ever since the arrival of the handheld calculator.



Notes


[1] Susskind, R. (2025). How to Think About AI: A Guide for the Perplexed. IRL Press at Oxford University Press.


[2] Text conjuring is a term Carlos Perez uses to describe how he writes blog posts and books. 


[3] I have written about the history of the attempts by formal education to domesticate each new digital development, e.g. Bigum (2002, 2012a, 2012b); Bigum, Johnson & Bulfin (2015); Bigum & Rowan (2015). If of interest, they are available here.


[4] Horseless carriage thinking (Horton, 2000) is a way of describing the new in terms of the old. Automobiles were just like carriages pulled by a horse but without the horse.


[5] A long time ago, on the eve of presenting to principals at a conference in Darwin, I sketched out a simple categorisation to capture the dominant positions people were taking on digital technology in education: the booster, the doomster, the critic, and the anti-schooler. I wrote caricatured scripts for characters like Billy Booster and Doris Doomster. It was a gamble, but I managed to persuade four principals to perform the roles on stage. They did a fabulous job. That playful framework later found a more formal home in the International Handbook of Educational Change (Bigum & Kenway, 1998).




References


Bigum, C. (2002). Design sensibilities, schools and the new computing and communication technologies. In I. Snyder (Ed.), Silicon literacies: communication, innovation and education in the electronic age (pp. 130-140). Routledge.


Bigum, C. (2012a). Schools and computers: Tales of a digital romance. In L. Rowan & C. Bigum (Eds.), Transformative approaches to new technologies and student diversity in futures oriented classrooms: Future Proofing Education (pp. 15-28). Springer. 


Bigum, C. (2012b). Edges, exponentials & education: disenthralling the digital. In L. Rowan & C. Bigum (Eds.), Transformative approaches to new technologies and student diversity in futures oriented classrooms: Future Proofing Education (pp. 29-43). Springer. 


Bigum, C., Johnson, N. F., & Bulfin, S. (2015). Critical is something others (don't) do: mapping the imaginative of educational technology. In S. Bulfin, N. F. Johnson, & C. Bigum (Eds.), Critical perspectives on technology and education (pp. 1-14). Palgrave Macmillan.


Bigum, C., & Kenway, J. (1998). New Information Technologies and the Ambiguous Future of Schooling: some possible scenarios. In A. Hargreaves, A. Lieberman, M. Fullan, & D. Hopkins (Eds.), International Handbook of Educational Change (pp. 375-395). Springer. 


Bigum, C., & Rowan, L. (2015). Gorillas in Their Midst: Rethinking Educational Technology. In S. Bulfin, N. Johnson, & C. Bigum (Eds.), Critical perspectives on technology and education (pp. 14-34). Palgrave Macmillan. 


Horton, W. (2000). Horseless-carriage Thinking. http://web.archive.org/web/20100411190630/http://www.designingwbt.com/content/hct/hct.pdf 


Ronksley-Pavia, M., Ronksley-Pavia, S., & Bigum, C. (2025). Experimenting with Generative AI to Create Personalized Learning Experiences for Twice-Exceptional and Multi-Exceptional Neurodivergent Students. Journal of Advanced Academics. https://doi.org/10.1177/1932202X251346349  


Susskind, R., & Susskind, D. (2015). The future of the professions: how technology will transform the work of human experts. Oxford University Press.

