Hmm. Remaining stoical and optimistic means I avoid thinking about some stuff, from US politics to an unnerving dream about a giant thistle in a car. I’ve tried to ignore Artificial Intelligence thus: too technical, potentially boring and located somewhere in the future. But on a train down France last week I read a leader in the FT and, crikey, I’m thinking of little else. In fact, I apologise for not bothering you about this before.
While the Internet, mobile phones and social media bemused schools in their time, generative artificial intelligence knocks them into a cocked hat. Humanity’s response will shape the future and, strewth, schools should be steadily panicking about this without delay.
Forgive me if you’re already expert on this generative AI growing like Japanese knotweed. Building on published data, information, books, journals, wikis and social media, it literally predicts the next word on any subject. This has mega-implications not just for learning, but for the development of thinking citizens. And while folks are happily speculating on the drudge-work it’ll take off us, I’m captivated by two serious problems.
First, AI can fabricate facts and make up links and references. These ‘hallucinations’ or ‘confabulations’ give plausible-sounding falsehoods which are then repeated and become part of what people think is true.
Second, AI builds on what we already know, so stuff we’re trying to eradicate – the colonial past, misogyny, whether Joe Biden won the election or not – is repeated endlessly. It’s not a tool designed to change the world for the better.
A relief, then, that the doughty Department for Education has produced a position statement. A bit bland, maybe, but usefully covering stuff about AI opportunities and challenges, exam malpractice, data protection, cyber security, protecting students from harmful content and potentially freeing up teacher time.
Nonetheless, I enjoyed it. I like a nicely-turned phrase in a formal document and the following gripped me by the throat. The DfE notes that AI:
- produces content that is not always accurate or appropriate, as it has limited regard for truth and can output biased information
- is not a substitute for having knowledge in long-term memory
- can make written tasks quicker and easier but cannot replace the judgement and deep subject knowledge of a human expert. It is more important than ever that our education system ensures pupils acquire knowledge, expertise and intellectual capability.
- can produce fluent and convincing responses to user prompts [but] the content produced can be factually inaccurate. Students need foundational knowledge and skills to discern and judge the accuracy and appropriateness of information, so a knowledge-rich curriculum is all the more important.
- can create believable content of all kinds
- produces content that may seem more authoritative and believable than it is
And there’s an Office for AI squirrelled away within the Department for Science, Innovation and Technology. You’ve got to ask – who’s feeding it? Do they know it’s there?
Thank you, Secretary of State. Here’s what’s on my mind, though.
- Schools have to be committed to the highest standards of knowledge, quality scholarship and truth as AI develops, to avoid misinformation taking root in teaching and learning.
- This is particularly important when a crippling teacher shortage leaves isolated, young or unsupported teachers creating lesson content alone (not at Tallis), especially those misled by DfE trends such as the sponsoring of online lessons through Oak National Academy. If you get used to downloading off-the-peg lessons, you’ll need to check the facts.
- Students of course need to learn about and work with AI, but they also need to understand its dangers and the debilitating, skewing effect that sustained falsehood has on human society. We’re already seeing this.
- The ridiculously named ‘knowledge-rich curriculum’ (what other sort is there?) is insufficient protection. As long as schools are cheaply assessed by exam results and snapshot inspections, some still face a perverse incentive to cover large amounts of knowledge superficially. Combined with a serious teacher shortage, the temptations of AI pose a real risk to the integrity of what children are being taught.
- Young people devote HUGE portions of their waking hours to AI-generated content. Adults, teachers and policy-makers need to understand and work very hard to bridge significant generational differences.
- Exams may be OK while they remain memory tests completed in handwriting in an exam room. Unconsidered absorption of AI information, though, is a risk to independent student learning. Coursework becomes a nightmare and mechanised marking – cheaper for the commercial exam boards – could install hallucinations at the heart of knowledge. However, our examination system’s glued to university entrance, so the academy needs to think about this. Which it is.
So, what to do? At Tallis we’re developing a policy and thinking about the big issues. As the leader in the Financial Times 27.5.23 said:
Every technology opens exciting new frontiers that must be responsibly explored. But as recent history has shown, the excitement must be accompanied by caution over the risk of misinformation and the corruption of the truth.
CR
7.6.23