Domesticating GenAI
I’ve been listening to discussions about GenAI in formal education for too long, and noticing a flood of papers reviewing, analysing and prognosticating on what could, should, or can happen, as if GenAI had somehow settled into something fixed. The underlying logic in all of this is to squish GenAI into existing, albeit ancient, education practices, structures and logics, something I have long called domestication [e.g. 1,2,3].
Given the state of my memory, or rather admitting that I have a good case of what Morgan Freeman once called CRS (can’t remember shit), I returned to a long post I wrote in March 2023. To my horror I realised I had been saying much the same thing about GenAI for a long time. Talk about fixity. OK. I have made a few minor adjustments and probably have a better set of questions to mull than back then, but the tyranny of intellectual path dependence seems to rely on endless repetition. My language model seems to have a shrinking context window.
What to do?
I thought a bit of playfulness might be in order. The bot, GPT 4o, was quite helpful when I asked it to develop five possible posts about education Daleks chanting “domesticate, domesticate”
as they try to stuff unruly GenAI applications into an ancient architecture of practices that constitute the fabric of formal education.
Perhaps for another time. Instead I asked the bot to
rework this post https://chrisbigum.blogspot.com/2023/03/bibs-bobs-14.html in the style of douglas adams
That’s the post from March 2023. What follows is mainly the bot, indicated by ֍, and has been lightly edited.
֍
Bibs, Bobs, and the Ultimate Question of Education
Education, as a concept, is rather like a poorly maintained intergalactic hyperspace bypass. Nobody’s quite sure why it’s there, it doesn’t seem to lead anywhere in particular, and the whole thing was probably put together by a bureaucrat who thought “progress” meant installing a few extra lanes for testing. The galaxy spins on regardless, but occasionally someone pulls over to take a closer look and decides it’s all in desperate need of fixing.
Nowhere is this more apparent than in the curious case of Generative AI, which has recently crash-landed into the educational hyperspace with all the grace of a Vogon poetry recital. Teachers, administrators, and students alike are staring at it, not entirely sure if it’s a marvelous leap forward or a slightly sinister box that might explode if you press the wrong button.
When GenAI first sauntered onto the scene, it did so with an air of confidence usually reserved for door-to-door towel salesmen from Betelgeuse. “Here I am,” it declared, “ready to revolutionize education!” This pronouncement was met with a mix of awe, skepticism, and that peculiar sense of dread that arises when you realize the homework you set for your students has been completed, analyzed, and improved upon by a chatbot before you’ve even finished your morning cup of tea.
The potential for AI in education is staggering. Need an essay? Done. Want a lesson plan? Voilà. Have an existential crisis about your own usefulness as an educator? No problem—it can even help you write your resignation letter in three different tones of regret.
Of course, the educational establishment wasn’t going to let this sort of unregulated brilliance go unchecked. Committees were formed, policies were drafted, and somewhere in the galaxy, a small planet’s worth of paper was sacrificed to ensure that this “AI thing” was brought under control.
The result? A veritable treasure trove of recommendations that included gems like:
1. Ensure AI aligns with current educational standards—as if AI were a mildly rebellious teenager who just needed to buckle down and start respecting authority.
2. Ban its use during exams—because if there’s one thing education prizes above all else, it’s forcing students to pretend they live in a world without tools.
3. Train teachers to work alongside AI—a noble idea, assuming anyone could figure out what “working alongside AI” actually means. (Do you offer it coffee in the staffroom? Do you invite it to union meetings?)
While adults fretted and fumbled, students did what students do best: they ignored the rules and figured things out for themselves. It didn’t take long for them to discover that GenAI was less a tool for regurgitating Wikipedia entries and more a portal into the sorts of creative possibilities that made their teachers’ heads spin.
- One student used AI to compose an opera about the quadratic formula.
- Another created a simulation of medieval Europe where peasants could argue with their AI landlords about feudal rent agreements.
- A third wrote a 200-page manifesto titled Why Exams Are a Blight on Human Civilization—with footnotes that were more insightful than the original curriculum.
The Daleks of the educational bureaucracy, meanwhile, looked on in horror, muttering "Domesticate, domesticate..." under their mechanical breaths.
At this point, one might be tempted to ask: what is the purpose of education? Is it to teach students how to pass tests? To prepare them for jobs that may or may not exist in five years? Or is it, perhaps, something more fundamental—a way of helping them navigate the universe with all its improbabilities, contradictions, and occasional encounters with superintelligent shades of the color blue?
Generative AI, for all its quirks, seems to suggest the latter. It’s not here to replace teachers or turn students into mindless drones (though it’s quite capable of doing both if misused). Instead, it’s a tool—one that can be wielded to amplify creativity, spark curiosity, and yes, even make education a little more fun.
The Final Word (or Not)
Of course, as with all things in life, there are no definitive answers. Education will muddle along as it always has, oscillating between innovation and inertia, brilliance and bureaucracy. GenAI will continue to evolve, as will the ways we choose to use—or ignore—it.
And in the meantime, if you find yourself lost on the educational hyperspace bypass, remember this: the ultimate answer may still be 42, but the questions? The questions are entirely up to us.
֍
Interesting that it picked up on the previous query involving Daleks and the conservative nature of educational bureaucracies.
Coincidentally, I came across a post by Rory Sutherland which may capture the current state of things:
I’ve been influenced in this endeavour by what my friend Paul Bloom calls ‘the Ginger Rogers theory of information’. In any information system, some ideas will propagate much more than others simply because they appeal to people’s existing prejudices. Information which runs counter to one or other world-view will be repressed or discounted unless it reaches a very high standard of proof – which is exactly why it deserves extra credence.
Plus ça change.
[1] Bigum, C. (2012). Edges, exponentials & education: disenthralling the digital. In L. Rowan & C. Bigum (Eds.), Transformative approaches to new technologies and student diversity in futures oriented classrooms: Future Proofing Education (pp. 29-43). Springer.
[2] Bigum, C. (2012). Schools and computers: Tales of a digital romance. In L. Rowan & C. Bigum (Eds.), Transformative approaches to new technologies and student diversity in futures oriented classrooms: Future Proofing Education (pp. 15-28). Springer.
[3] Bigum, C. (2002). Design sensibilities, schools and the new computing and communication technologies. In I. Snyder (Ed.), Silicon literacies: Communication, innovation and education in the electronic era (pp. 130-140). Routledge.