03/10/2026 | News release | Distributed by Public on 03/11/2026 14:46
VUB scientist Tim Brys is a cautious admirer of artificial intelligence, even though he researches it himself. At a time of rapid change, he argues for slowness, wisdom and moral reflection - including within higher education. His recent book, And Then There Was AI, explores these questions. Together with co-author François Levrau, he examines how we can remain human in the midst of machines. "Perhaps students could even take inspiration from the monastic rules of Benedict."
No social media for children? A smartphone ban at school? For Tim Brys, these are no-brainers. He's perfectly content with a basic phone himself - no apps, no 5G. Quite unusual for an AI researcher. Sadly, the 'brick phones' of yesteryear aren't what they used to be.
"Nokias used to be robust and durable, but I went through three of the new ones in a year. I'm trying a different brand now."
Enough nostalgia for faded Finnish glory - let's turn to his book And Then There Was AI. How do we remain human in the midst of machines? Tim co-wrote it with philosopher François Levrau from UAntwerpen. Across ten chapters, they paint a broad picture of the societal shock AI is causing. Here, we focus on the chapter about learning and education.
Students outsource boring, repetitive tasks to AI, freeing time for truly creative thinking.
Tim Brys: "That sounds appealing, but I don't buy it. Almost everything in education can be treated as routine: attending lectures, taking notes, reading texts, translating and summarising, writing papers, analysing data… If AI takes over all of that, what is left of higher education?"
But progress does come faster with AI.
"That's true, but the foundation isn't there. Studying is challenging and involves a lot of repetitive work. It's about cramming, drilling, wrestling with new knowledge. That friction is crucial. It's that 'boring' mental labour that forms your foundation and frames of reference. From there, you can build further and learn to see deeper connections. It's a paradox: creativity and innovation - the very skills AI is supposed to enhance - rely on a lot of prior repetitive work. If AI does it all for you, how do you learn to think for yourself?"
So, what's the solution?
"You first have to become an expert yourself. Only then can you judge whether AI is misleading you and use the technology as it should be: in a supporting role."
Aren't you lumping all students together?
"According to lecturers, some students use AI as it's meant to be used: as an intelligent assistant. These students end up just as capable - or even more so - than those who graduated before the AI era. But a large group doesn't. They simply tick off the boxes required to get their degree. AI does the hard work for them. In my view, this bypasses the true purpose of higher education: developing into a well-rounded, competent individual who can engage with society with a measure of wisdom."
Working less and still passing - that temptation is hard to resist.
"We are evolutionarily programmed to conserve both physical and mental effort. In prehistoric times, those who wasted energy didn't survive. That logic of thrift is still wired into our brains. It automatically chooses the path of least resistance - and today, that path is AI."
But passively coasting isn't a new phenomenon, is it?
"No, but AI exacerbates the trend. Shortcuts are even shorter now with ChatGPT and the like. It's becoming increasingly easy to be a passive student and still graduate."
So what's the solution?
"Many are thinking about new teaching and assessment methods. More classroom work and shorter essays where students demonstrate they've processed the material. With the master's thesis, more attention could be given to interim assessments and the final presentation. That's when students can prove they genuinely understand what they've produced."
And the exams themselves? Everyone back to pen and paper?
"Why not? I studied computer science. We had to take our programming exams on paper. It wasn't a problem if you missed a bracket here or there, but at least you proved you understood the material."
You point out in your book the enormous acceleration of society. Moral, social, and legal systems can no longer keep up.
"American sociologist William Ogburn noted this as early as the twentieth century. He called it 'cultural lag': our technology, machines, and infrastructure evolve at lightning speed, while values, habits, laws, and norms trail behind. Take social media, for example. Its impact on our communication and self-image is immense. Regulations around privacy and mental health only come later, after the damage has been done. AI is now being unleashed on the world with similarly little precaution."
Humans are flexible, though, right?
"We can't endlessly adapt to a competitive, fast-paced, technology-driven society. Harvard Business School researchers studied what happened in a tech company when employees were given AI subscriptions. Productivity increased immediately. Tasks they would normally have delegated, they now took on themselves. Naturally, they started working longer hours, trying out a few prompts during lunch breaks - that sort of thing."
Let me guess: the fun didn't last long?
"That's right. They felt they had even more balls to keep in the air. The constant pressure was unsustainable. The researchers called AI a burnout machine. We're simply not built for that. It's also inefficient: every email and notification disrupts your work and breaks your flow. In Deep Work, Cal Newport advocates for long periods of focused, undistracted concentration. If you maintain that for a few hours a day, you achieve far more than eight hours of shallow, fragmented work."
You suggest we can draw inspiration from the monastic tradition. What can we learn from monks?
"Monasteries are organised around strict rules. They prescribe how the monks live together and how their day is structured. It's an 'intentional' rhythm, balancing work, rest, contemplation, and creation."
How does that translate to modern life in the real world?
"Find a rhythm where periods of focused work alternate with activities like deep reading, writing, cooking, or making music. Perhaps occasionally try a 'smartphone fast'. And make conscious choices about local communities and collaborating with people. For instance, a group of like-minded friends and I are looking for a building in Brussels for a city monastery."
As a bastion against AI?
"Not exactly. It's a small community where people can slow down, support each other, and seek connection with God. AI is allowed - as long as it serves the good life. The question should be whether AI use makes us more loving, wiser, patient, just, and courageous."
Finally: which book or film connects well with your research? How close are fiction and reality?
"In our book, we mention The Matrix. Some tech billionaires believe AI will solve every problem, even death. In that scenario, we'd plug into an eternal simulation while machines keep our bodies young - willing slaves to a superior AI god, so to speak. Do you take the blue pill and stay in the comfortable illusion, or the red pill and face reality? Of course, it's not that binary, but it symbolises the willpower needed to resist losing yourself to the lure of technology."
"I also found Brave New World fascinating. In that story, citizens of a world state take a drug called soma, provided by the government, which keeps them calm, happy, and compliant. Scientists, artists, emotionally complex people, and independent thinkers are a threat to that stability and are sent to islands, where they are free to live together without disturbing society. Aldous Huxley was commenting on totalitarian regimes. The link with AI is that it too can threaten our freedom. AI companies are wealthier than anything or anyone in history; the concentration of power is enormous. The more control they have, the greater the potential pressure on democracy and freedom."