SXSW 2026: AI as systemic technology and our search for the human role

Once again, SXSW in Austin was an intense and inspiring week. As an experiment, I set out to use AI during the event as a thinking partner and to help write this blog afterward. More on that in the notes at the end.

What stood out this year is that the conversation about AI has fundamentally changed. In previous years it was mostly about applications, experiments and (promised) possibilities. AI appeared in every talk, but usually as a component, often in passing, because AI was simply trendy. This year was different. AI was back as the direct subject of conversation. Not as hype or a standalone technology, but as systemic technology: a foundation beneath everything.

John Maeda made this explicit in his talk on the shift from User Experience (UX) to Agentic Experience (AX). Where we once designed for users, we are now seeing a shift toward designing for agents. That may sound like a small step, but it represents a completely different reality. It aligns with what several speakers argued: AI is no longer a tool, but a General-Purpose Technology (GPT). What the steam engine once was for physical labor, AI is now for cognitive work. This time, however, the development is many times faster, exponential even.

At the same time, something very important stayed the same. The human dimension was once again central. Themes like connection, loneliness and meaning came up everywhere, perhaps more strongly than ever. And that combination is precisely what makes SXSW 2026 interesting. Technology is accelerating, but the question remains what that does to us.

In short: SXSW 2026 was about a world in which AI has become systemic technology, and we ourselves are still searching for our role within it.

From trends to convergences

A striking shift this year was in how innovation is discussed. No longer in isolated trends or developments, but in the convergence of technological changes. Amy Webb framed this sharply with her talk on convergences.

According to her, mapping trends in isolation is no longer enough. Real impact only emerges when technologies come together. She compares it to weather data: temperature and wind say little on their own, only together do they signal an approaching storm.

In that analogy, she identifies three major ‘storms’ that are converging and reinforcing one another.

Human augmentation: technology that enhances our physical and cognitive capacities. Think of AI that supports our thinking, exoskeletons that ease physical labor, or interfaces that extend our senses.

Unlimited labor: AI and robots taking over work at scale. Not just routine tasks, but knowledge work too. The scarcity of labor is disappearing; the question is what replaces it.

Emotional outsourcing: delegating emotional needs to technology. Systems that listen, comfort, guide. Webb presents this not as science fiction, but as a description of something already happening: people having meaningful conversations with AI, maintaining relationships via avatars, or seeking grief support from a chatbot.

What struck many was that this is no longer about the future. Many of her examples already exist. The question is no longer whether it’s coming, but how fast, and what it means.

The reason this is being named so explicitly now seems clear. Technologies like AI, biotech and robotics are mature enough to genuinely converge. That makes the impact larger and less predictable.

AI as systemic technology

The shift from AI as a tool to AI as systemic technology was perhaps the most important throughline.

Multiple sessions made clear that we are entering a new phase. LLMs are becoming commodities in many cases: no longer the differentiator, but the baseline. The real change lies in how they are deployed.

You can see this in the idea of the ‘next internet’. In the session “The Internet After Search”, the CEO of Cloudflare sketched how we are moving from searching to agents. We no longer look up information ourselves; we let systems do it for us. AI thus becomes the interface between people and the internet. The ‘next internet’ was not primarily built for us, but for our agents.

Amy Webb went a step further and described a world in which agents, robots and systems together form a new economic reality, a world in which labor is no longer scarce, but abundant. That has major implications: for work, for organizations and for ourselves as human beings.

Augment or automate

How should an organization position itself relative to this shift? John Maeda laid out two fundamentally different approaches: augment or automate. Do you use AI to make people better, or to take over tasks? That is not a technical choice; it is a design question for organizations.

Ian Beacraft argued for collaborating with AI, deliberately focusing on where human value lies rather than automating everything. Neil Redding extended that line into leadership: less steering by fixed plans, more orchestrating within a system that is constantly changing. Rohit Bhargava translated that into the relationship between humans and AI: does it make us bigger or smaller, does it work alongside us or at our expense? His point: how you see that relationship determines what you build.

Together they paint a picture in which the question is no longer what AI can do, but how and where we want to deploy it. At SXSW it became clear that both directions are emerging simultaneously: systems taking over work and a growing emphasis on human qualities. Not a contradiction, but a choice that needs to be made more explicitly.

Three perspectives on AI kept coming up throughout: AI as opportunity, as risk and as a human question. Together they form the complete picture.

Maeda added an important caveat to all of this: “AI can’t give you good taste.” Technology can scale and optimize, but judgment, taste and human insight remain human work. That doesn’t make AI less disruptive, but it does make the human role more explicit. AI becomes something you need to actively take a position on.

What does this mean for people?

Rohit Bhargava put it sharply: “The people who understand people always win.” Value is shifting toward human understanding, toward context, interpretation and empathy. Greg Greenberg agreed. ‘Hands, mind & heart’ are, in his view, essential human qualities that will remain highly valuable. Steven Spielberg in turn illustrated that stories continue to revolve around human experience and emotion. Rana el Kaliouby turned it around and argued that AI must also learn to understand empathy, not as an add-on, but as a prerequisite.

At the same time, another movement is emerging. Esther Perel showed how technology is increasingly taking over human roles, including emotional ones. That is precisely what Amy Webb calls emotional outsourcing.

This creates a paradox: human qualities become more valuable, even as technology simulates them increasingly well.

Sovereignty & autonomy

Another clear thread this year was that of control and autonomy. Even in the US, awareness is growing that dependence on technology companies is a risk. Multiple sessions made visible how power is shifting toward systems and the organizations behind them. Whoever controls the systems determines how they work.

The session with Attaullah Baig, the WhatsApp whistleblower, made this tangible. Problems around security and data are hard to address, not because they are invisible, but because systems and incentives encourage different behavior.

The session with Tristan Harris (creator of the documentary “The AI Doc: Or How I Became an Apocaloptimist”) and sessions on the agentic internet raised the same question: how do we maintain control, and who holds it?

Perhaps that is the core of SXSW 2026. We are past experimenting. We are in the phase of making choices, about how systems work, who steers them and what the role of people is within them. But who makes those choices?

In conclusion

What stays with me most is that the distance between technology and people is shrinking, while the tension is growing.

We are building systems that can do more and more. But with that comes a growing responsibility to choose consciously, to remain sharp about what we want to do ourselves.

Perhaps that was the most important lesson of SXSW 2026. Not whether AI can do more and more, but whether we as human beings remain sharp enough to determine what we want to use it for.

Transparency note

In the spirit of SXSW, I used AI as support this year, during the event itself for analyzing and processing information, and afterward to help write this blog. SXSW seemed like an ideal case: there is a lot of publicly available data.

It was an interesting experiment. Alongside my own thinking and conversations with people on the ground, I used ChatGPT to analyze overarching themes.

For this blog, I used ChatGPT to generate a first draft based on my prompts, notes and transcripts. I determined the analysis, structure and substantive direction myself, after which I read and checked the result thoroughly. I used Claude for final editing and translation into English.

None of this saved time, but it did yield the occasional extra insight and a valuable learning experience.

