A new artificial intelligence model will guide novice programming students to the right answer, instead of handing it to them.

Artificial Intelligence (AI) and Large Language Models (LLMs) have firmly made their way into our daily lives. But in education, there is one major concern with generative AI tools like Gemini and ChatGPT: they are just too helpful.

LLMs are trained to deliver answers as quickly as possible, but that often conflicts with what is pedagogically sound.

Addressing this problem, Dr Jake Renzella, Dr Sasha Vassar, Dr Andrew Taylor and Dr Hammond Pearce from UNSW Computer Science and Engineering (CSE) are developing a guidance-focused model for introductory programming.

Unlike other AI chatbots, this model is trained never to give the answer explicitly, but to guide students towards finding the right answer themselves.

“Essentially, we are taking an open-source large language model similar to that which powers AI chatbots, and training it through a process known as fine-tuning to behave more like a tutor rather than providing solutions to questions. We’re using a proprietary dataset to influence how the model responds,” says Dr Renzella.

The goal is to put downward pressure on how inherently helpful these models are.

“In every industry, you want AI to be as helpful as possible. But every industry is about increasing productivity, with one exception: education,” says Dr Renzella.

“Our model will lead the student to finding the solution, rather than just handing it to them, which from a pedagogical point of view is necessary for learning,” says Dr Vassar.

“We’re rolling the tool out slowly and carefully, starting with introductory programming at UNSW, a tricky discipline which struggles with drop-out rates of 30 to 40 per cent,” says Dr Vassar.

“One of the most common difficulties cited when learning programming is confusing compiler error messages. That is one problem we’re hoping to solve with this model: it will provide more information and explain each error in human-like language, to help students learn.”

A second motivation for the model stems from the fact that computing students at UNSW often work between 1am and 4am, when there are no tutors available. Having a tool that functions like a tutor with extra knowledge of the discipline can help them push past small problems and prevent them from becoming stuck and discouraged.

“This project is a great example of how AI should be used in the classroom. Rather than providing the answer to a problem, it guides students and helps them develop their problem-solving skills. They are ultimately better equipped to tackle the problems that they will face in the workplace after they graduate,” says Professor Maurice Pagnucco, Deputy Dean (Education) of the UNSW Faculty of Engineering.

The model is an evolution of tools that have been provided to UNSW CSE students for some time already. The Debugging C Compiler (DCC) was introduced by Dr Taylor in 2017 to improve C error messages. The evolution, called DCC --help, integrates AI models into the compiler to explain errors in human-like language and was introduced in 2023.

“In DCC --help, we prompted ChatGPT to avoid providing blocks of code, but it still did so in about 50 per cent of the cases. It can’t help itself – it’s just how it is designed. We’re hoping that our own model will bring this number down to 10 per cent,” says Dr Vassar.

“Training the model ourselves also means we can use the model on our own servers, ensuring student data remains private, which addresses ethical concerns,” says Dr Renzella.

The team received an AU$100,000 Google Award for Inclusion Research to continue their work. The Award supports academic research that addresses the needs of historically marginalized groups globally.

“Given the current inequities in computing education, it is crucial to support academic research on how AI will impact computing in primary, secondary, and higher education. The research project at UNSW will help build understanding around the factors influencing teachers’ integration of AI-powered technologies into their teaching methods,” said Sidnie Davis, Program Manager, Education for Social Impact at Google.

“As educators and leaders in AI research, our CSE academics are aiming to set the standard in responsible and ethical use of AI in education,” said Professor Arcot Sowmya, Head of School of UNSW Computer Science and Engineering.

While the current focus is on introductory programming at UNSW, the team envisions the model eventually being implemented university-wide, or even globally.

“Our goal is to create a pedagogical model that can be used in any teaching context without having to worry about it handing out solutions. We can then collaborate and help others implement the tool in their fields,” says Dr Renzella.