The University of Auckland

04/22/2026 | News release | Distributed by Public on 04/21/2026 18:27

Big tech: too big to fail


Professor Alexandra Andhov's inaugural lecture dove into the existential challenge of big tech's influence on democratic governance.

Professor Alexandra Andhov, Chair in Law and Technology, University of Auckland, Waipapa Taumata Rau

"Aim for a monopoly," influential tech investor Peter Thiel tells founders. Build a business so dominant that it sets the terms and becomes, as Professor Alexandra Andhov said in her inaugural lecture, the architecture.

Big tech companies dominate social media, AI and search and now underpin many of the systems societies depend on.

Microsoft Azure powers a significant share of enterprise computing, Amazon Web Services holds roughly a third of the cloud infrastructure market, and Google handles around 90 percent of search.

"The digital infrastructure of hospitals, courts, schools, universities, militaries and governments runs on their systems," said Andhov, as she opened the University of Auckland's Law, Technology and Government Conference 2026.

"These systems have become too embedded to regulate without disrupting everything that depends on them."

Andhov, the co-director of the Centre for Advancing Law and Technology Responsibly (ALTeR), pointed to the ease with which our data is given and taken, government support for unchecked innovation, and a lack of accountability and critical evaluation in the development and rollout of new technology.

She drew parallels from her doctoral research, which examined issues behind the 2008 Global Financial Crisis (GFC).

The banality of evil

During her PhD, Andhov read Hannah Arendt's Eichmann in Jerusalem. Arendt's account of the 1961 trial of the Nazi bureaucrat who organised the logistics of the Holocaust stuck with her.

"Eichmann was, by every psychiatric assessment, entirely normal. Not fanatical. He had, in the most technical sense, followed the law as it existed in Nazi Germany. His defence: I did what the law and the system required of me."

Arendt called this the 'banality of evil': catastrophic harm does not require evil individuals, only individuals who follow without questioning. Such a lack of critical evaluation was, in part, what led to the GFC, says Andhov, and to today's big tech practices and market domination.

Andhov's friend Ken Singer, director of the Berkeley Center for Entrepreneurship, also spoke at the ALTeR conference, sharing insights from Silicon Valley's ecosystem.

Innovation unchecked

After a master's in international law and business, and a doctorate from the Central European University in Budapest, Andhov gained her cross-EU attorney licence and worked at an international law firm before landing a role at the University of Copenhagen.

At the time, Denmark was focused on fostering innovation and entrepreneurship. Andhov designed and taught a law course where students engaged directly with start-ups and, in some cases, launched their own ventures.

Meeting founders, designers, engineers, and data specialists, she learned something key.

"The most important legal decisions in any technology company are not made in a boardroom or by lawyers reviewing contracts," she says. "They are made on the first days... Because that's where you decide what data you will collect, how you collect it and what you do with it."

This is also when developers and companies decide whether customer consent for data collection will be "meaningful or theatrical".

Extraordinarily talented and well-intentioned computer scientists and product managers made choices with legal and ethical consequences, "often without fully recognising them as legal choices at all, but rather technical".

There was no regulatory framework holding back start-ups.

"There was a consensus among many: technology is innovation. Innovation is growth. Growth is good. Do not constrain it."

Corporate getaway cars

Andhov argues Big Tech companies today are not corrupt in the conventional sense. They file accounts, have auditors and comply formally with applicable frameworks.

But their obligations, she says, are sometimes contracted out through specific corporate vehicles, carefully designed to sit beyond the reach of governance frameworks.

"Liability is distributed. Accountability is architecturally outsourced, so that when the question 'who is responsible' is asked, the answer is always: 'not here. Try the next layer'.

"The law that's supposed to hold these companies accountable as legal entities was built for a world where companies operate within markets. Not for a world where companies are the market."

In 2026, Andhov says, Big Tech is conditioning governments.

"The machine has not just become the infrastructure; it has become the architecture within which governments themselves now operate."


So, what can we do?

Andhov suggests regulating those who design and control technology and ensuring data collection is ethical and accountable.

"If you hold the power that belongs to others, you bear the obligation that comes with it."

Drawing on Arendt, she encourages people to think critically about how they interact with technology.

"Every query you put to an AI assistant trains the model further. Every document you upload, every professional judgement you outsource, this is data entering the loop. Legal data. Medical data. Financial data. Government data is flowing into systems owned by the same companies that already own the infrastructure."

Public procurement is the final lever Andhov points to for accountability and change within the big tech bubble.

"We might not be fully capable of regulating Silicon Valley from Wellington. But we can decide what our hospitals, schools, universities and courts are to run on - under whose law, with what conditions, with what liability when the architecture fails."

Arendt, she points out, said the antidote to evil isn't heroism or genius, but asking what the system is for, who it serves, and who pays when it fails.

"Thinking," says Andhov, "genuine, critical, independent thinking, is not something a model can do for you. It is the one thing that remains
irreducibly human."

Media contact:

Sophie Boladeras, media adviser
M: 022 4600 388
E: [email protected]
