We can't ban ChatGPT but don't be blind to the real risks it poses
Artificial intelligence offers exciting new ways to work and learn, but there are reasons to be careful.
In nature, sometimes the prey becomes the predator. Frogs eat beetles. But the larva of the Epomis beetle wriggles around to attract frogs, then latches on and sucks the life out of them.
This is how I’m feeling about artificial intelligence and ChatGPT in higher education just now. The positives are blinding us to the risks.
Many of my colleagues argue, correctly, that one cannot ban it. They also reason that since it will be a useful tool, we should embrace it and teach students to use it. They say it is like a calculator, Microsoft Word or Excel. Just another tool.
I guess so, but I’m not excited. Teaching students molecular genetics or history is more attractive to me than teaching students to construct optimal prompts – which I think they’ll work out for themselves fairly quickly, in the same way as I learnt to use a calculator, Word and Excel.
I don’t think this is what a university is for.
And I don’t want us to embrace ChatGPT entirely. I believe that human lives will continue to unfold in two arenas – periods where ChatGPT is available and periods where it is not. We must train our students for both eventualities. It is not a question of embracing it or not. It is a question of how to train students for moments when it is not available.
We need medics who can assist a patient on a bushwalk or after a cyclone. We need leaders who can think on their feet, make decisions, and answer questions. We need teachers who can pass on their knowledge and skills.
So, in every course, we must now make sure we have some assessments in which students can use ChatGPT and some in which they cannot. Because in their lives students will face situations where they can use artificial intelligence and situations where they cannot.
Accordingly, we will teach students how to use ChatGPT but also continue to teach them how to manage without it.
Now here’s the thing. Teaching students how to use ChatGPT entails questioning it and cross-checking it. It requires students to have their own database in their heads, or access to independent databases, so that they can sense-check what it throws up.
What does all this mean for teaching, learning, and assessment? We need to keep teaching some deep material and insist that students have foundational knowledge on the tip of their tongues, so they can use this knowledge for instant decision making, but also for sharing and forming relationships, and for persuading. We need to explain that students should never become dependent on any single technology. In assessment, we need to have tasks where ChatGPT cannot be used.
I hear people saying they will get students to critique or edit a ChatGPT answer – but beware – it can do that. Others say they will ask for clear referencing or use pictures because it can’t handle those things – but beware – the next version, GPT-4, is being released and I think it can.
We cannot keep designing assessments and driving learning into the shrinking window of intellectual endeavours that artificial intelligence cannot master. That’s not a university education!
To enjoy rich lives and cultural exchange we must continue to learn and do. Artificial intelligence offers an exciting new dimension to life, but it should be used with great caution.
Human connections come from human interactions, shared understandings and instantaneous responses, from body language, and often from knowing looks or involuntary mutual laughter triggered by some intricate paradox that we enjoy together, or from tears that we shed together. The more knowledge we have in common the deeper our relationships and the more solid the foundations of our society.
Teaching students how to optimise prompts or to edit is a worthy endeavour, but it is a tiny, tiny fraction of the wonder that a proper education entails. My bet is that the people who worked tirelessly on developing all the artificial intelligence we are now seeing did the hard yards in conventional learning and mastered the basics passed down by giants of the past. They didn’t just reflect back mainstream and conventional verbiage by using calculators, Word, Excel, Wikipedia and chatbots.
If we want a creative future in Australia, we must be careful to ensure we continue to drive knowledge acquisition, rather than surrendering to the death of deep learning.
The real problem is becoming dependent on ChatGPT because it can advise on – seemingly – everything. It isn’t a tool like a calculator. Already digital tools are changing lives in unexpected ways. The challenge for education is to keep ensuring students do the intellectual heavy lifting.
Merlin Crossley is Deputy Vice-Chancellor, Academic Quality at UNSW Sydney.
This article has been republished from The Australian. Read the original article (subscription required).