The future is now

ChatGPT, a chatbot prototype developed by US company OpenAI, is on everyone’s lips right now. Among the questions being asked: Does the software pose a challenge for the education sector? Will it be used in the future to write homework, theses and dissertations? Uwe Walz, Professor of Economics at Goethe University Frankfurt and an expert in industrial economics, analyzed the chatbot with his students as part of his teaching this winter semester.

UniReport: Professor Walz, what experiences have you had with ChatGPT in your course?

Uwe Walz: We have been looking at the economic impact of machine learning and artificial intelligence on various product, labor and financial markets as part of one of our courses for a few years now. In this context, ChatGPT gives us an opportunity to teach students how a bot works and how its algorithm is configured. Obviously, we are not talking here about a technical computer science course. Rather, we have tried to show how algorithms use data and the role the training data play in the quality of the predictions. Another focus of the course was the impact this technology will have on our students’ future employment markets.

How do you rate the challenges for educational institutions? Some have even argued it would be best not to tell students about the existence of ChatGPT at all.

To recognize ChatGPT’s tremendous potential, as well as its limitations, you must first of all try it out. One example: If you ask the bot today what the weather will be like tomorrow, it will be unable to give you an answer, because it has been trained with data only up to the end of 2021. Although we are undoubtedly only at the start of its development and some weaknesses remain, as things stand today the bot is already astonishingly good. Once big tech companies such as Microsoft integrate ChatGPT into their programs, it will be fed more and more data, keep learning, and as a result become better and better. In a way, it will be like a race: a race against, but also with, technology. Young people take to this technology very easily. It really would be very naïve to try to keep them away from it; we must assume that by now, people are familiar with the program. As is true of other technological developments: We will have to react to it. Looking the other way is not an option. On the contrary, it is imperative that we think about how we can integrate this technology into teaching in a productive way. The whole development also constitutes a tremendous opportunity.

Is it correct to assume that, unlike plagiarism, the program produces texts whose origin cannot be traced, and that this might pose a challenge for exams?

ChatGPT’s results are still partly generic, with little variation. It is my belief that plagiarism software such as Turnitin could come close to recognizing this. My response to this challenge, however, is that we as teachers, at least in my subject area, should no longer just test students on standardized, repetitive knowledge, but instead set them more complex tasks and train them in what universities stand for: structural knowledge.

Might oral exams become important again?

That could be a conceivable reaction, but one that, as far as resources are concerned, is not really feasible for courses with large numbers of students. Beyond that, such a response might elicit the question: Why should we ask something in an oral exam that the algorithm would also be capable of doing? That is something my faculty will examine more closely, discuss and think about ways of addressing. We should ask for much more critical analysis in theses and dissertations. After all, taking a critical look at an article in a scientific journal is not one of the algorithm’s comparative advantages.

Interviewer: Dirk Frank
