AI: The world’s smartest study buddy
Everybody is talking about AI – and with good reason. Yet few have considered its potential beyond being a tool that completes tasks for us – for instance, using AI as a mentor that facilitates learning and helps us acquire more knowledge.
That was the starting point when Christian Hendriksen, Assistant Professor at the Department of Operations Management, presented ‘Students and Generative AI: Strategies for using AI tools for learning’.
While some approach AI with scepticism regarding its impact on learning, Christian Hendriksen views it solely as an opportunity. You just need to have the right strategy, and it does not have to be complicated.
“Merely considering what AI should be used for and what we can do ourselves is a great point of departure,” says Hendriksen.
Use these three principles for productive AI use
Once the strategy has been outlined, the work begins. Three principles in particular can guide your use of AI.
Principle no. 1: Prompt as if you are writing to another human being
“First, you must communicate with the model as if it were another human being,” he says, chuckling: “even if that may sound a bit disturbing.”
The logic behind this is that the model is trained and built to interact in natural language, which enhances its ability to understand your intentions and makes it easier to express what you want, he elaborates.
Principle no. 2: Context
“Second, you must provide enough context.”
Christian Hendriksen demonstrated an example on the screen in auditorium SPs05 at Solbjerg Plads. He presented himself as a student at Copenhagen Business School working on a thesis about monopoly in the ice cream industry and planning to carry out qualitative research. Having set the scene, he asked the model: “What does theory entail?”
At first, the imagined scenario and its detailed description made the audience laugh, but then people started taking notes: out came a detailed and useful answer that offered different perspectives on theory within his specific setting.
“Usually, there is no upper limit to how much context you can provide. The more context you give, the more relevant the output.”
Principle no. 3: Feedback
Third, students must iterate on the output to get what they need.
“The model will do its best to fulfil the user’s wishes, but the result is not always spot on. Therefore, I recommend giving it feedback or describing what you need changed,” says Hendriksen, emphasising that the quality of the responses improves as you keep giving feedback.
Don't fall asleep at the wheel
There are, however, also pitfalls to be aware of when using AI to learn, Christian Hendriksen underlines.
For instance, the models hallucinate. No, your AI has not been eating forbidden mushrooms; it just means that the model makes something up when it does not know what you are talking about.
This is completely normal, Hendriksen explains. In fact, language models will always hallucinate; it is simply how they work. But it means that humans must verify the information.
Besides hallucinating, the model can be biased, and leaning on it too heavily can mean falling asleep at the wheel and skipping many of the intermediate steps. If humans delegate tasks to AI that it is not proficient at, the outcome can be subpar.
Finally, Hendriksen underlines that irresponsible use of AI will reduce learning rather than support it, which is why AI should be treated as a mentor with endless knowledge, one you can exchange ideas with.
For this to succeed, however, students need to be well-informed. Hendriksen emphasises the importance of the university supporting its students; many of them are not familiar with the existing guidelines, which can leave them facing questions they should not have to handle alone. “My hope is that AI can enable us to learn more, better and faster, for the benefit of all of us,” he says.