
SXSW chronicles: 16 years of field notes from the future

Here I am, in Austin again. Sixteenth time. The brisket smells the same, the heat arrives early, and somewhere on Red River Street a band is already playing too loud for 11am. I came here for the first time in March 2010 with a backpack, a borrowed press pass, and no idea what I was walking into. A few days later I published a post called “Like flying an Apache helicopter” and another one about a stranger who handed me a free ice cream cone and turned out to be one of the sharpest minds I met that entire week. That was the signal. Not the panels. Not the keynotes. The accidents.

Sixteen years later, SXSW is my annual early-warning system. If something is going to matter to boards, to operators, to anyone trying to run an organization through the next 36 months, it usually shows up here first, messy and half-formed, three to five years before it reaches your strategy deck. I come to read the room before the room gets its talking points straight.

What 16 years of SXSW taught me

The first trip felt like trying to drink from a fire hose while someone handed you a map written in a language you’d never seen. There were thousands of sessions running simultaneously, a city full of people who were absolutely certain they were inventing the future, and no obvious way to separate the signal from the noise. I covered Dennis Crowley’s Foursquare badges moment, I watched the first location-based social apps fight for territory, and I wrote notes at midnight that I still sometimes go back to read. The energy was infectious, and I was young enough to think infectious energy was the same as correctness.

I was wrong about that, of course.

By 2012, the honeymoon was genuinely complicated. SXSW had grown past itself. I wrote a piece called “#SxSW: where ideas go to die” and meant some of it. The machine was producing hype on an industrial scale, startups were pitching in hallways like carnival barkers, and the gap between the breathless announcements and reality was wide enough to fall into. Gowalla died that week (three years old, RIP). The keynote slots filled with famous faces who had nothing particular to say. But then Ray Kurzweil walked on stage and talked about reprogramming DNA and I remembered why I kept coming back: every third year, something lands that genuinely changes how you think.

The 2017 and 2018 editions sharpened a different instinct. Vint Cerf and Tim Berners-Lee standing together on the barricades against fake news: two men who had built the infrastructure of the modern world watching it get used as a weapon and saying so plainly. Elon Musk calling Earth a cradle humanity should eventually leave. Amy Webb presenting her 315 tech trends for 2019 with the kind of rigor that makes futurism feel like engineering rather than poetry. The conference was no longer about social media virality or location check-ins.

Something larger was moving underneath.

2020 hit differently. SXSW was canceled two days before I was supposed to fly, the first cancellation in the event’s history. I wrote from my couch in Belgium about what that meant, what was coming, and what a city full of ideas looks like when the lights suddenly go out. COVID didn’t just cancel a conference; it accelerated five years of structural change into eighteen months. Looking back at those notes from “Hotel California” and the scrambled aftermath, I got some things right and some things embarrassingly wrong, which is probably the honest record of anyone paying close attention.

The 2021 virtual edition, the 2022 return, and the years leading to 2026 each added a layer. The question evolved. It was no longer “what’s the new shiny thing?” It was “who is actually serious?” The AI wave arrived at SXSW in 2023 already half-formed and slightly breathless, six months after ChatGPT launched and the entire industry simultaneously discovered it had opinions about large language models. By 2024, with Lisa Su of AMD on the main stage and MIT publishing its 10 breakthrough technologies list, the conversation had depth. By 2026, the convention center was literally demolished and the city itself had become the stage. That structural change was the best metaphor SXSW ever produced for what AI is doing to organizations.

What I look for now

The lens has shifted over the years. In 2010 I was looking for novelty: what’s here that I’ve never seen before? By 2015 I was looking for durability: which of the things from three years ago actually survived? By 2020 I started looking for honesty: who in this room is saying the uncomfortable thing, and who is performing optimism for an audience?

Now I look for weight. Specifically, I look for people who carry the weight of consequences. Amy Klobuchar at SXSW 2026 walked on stage in a room full of tech believers and said things about regulatory capture, about platform power, about what happens to democracy when three companies own the infrastructure of public discourse, that the audience didn’t fully want to hear. She killed it. Not because she was performing, but because she had actually read the legislation, lived with the lobbying, and was willing to be specific.

Specificity is the rarest currency at a conference built on vision statements.

The Stargate announcement at SXSW 2026 was another calibration moment. OpenAI, in partnership with SoftBank and Oracle, is building a $500 billion AI infrastructure project with one early hub in Abilene, Texas: 10 gigawatts of compute, a footprint large enough to reshape the regional grid, water consumption that makes the local drought numbers look quaint. I wrote a piece called “how OpenAI is turning Texas into an AI sacrifice zone” and people thought the title was hyperbolic. The numbers are not hyperbolic.

When a single infrastructure project commands more power than most mid-sized European countries, the conversation about “AI strategy” needs a different vocabulary.

The Zuckerberg/Moltbook story that circulated during the same week was a different kind of signal. Meta buying a social network populated mostly by AI agents sounds absurd until you understand what they were actually buying: a verified registry of AI agent identities, a map of who each agent represents and what it’s authorized to do. In a world where your personal AI negotiates your calendar, screens your email, and trades on your behalf, whoever owns the authentication layer owns something more valuable than any social graph. The absurdity was the camouflage.

What makes a SXSW signal credible rather than hype is whether it has a named cost. The pressure cooker piece I wrote from Austin in 2026 was about exactly that: the United States was running a live experiment in what happens when you remove institutional friction very fast, and the results were visible in the city itself, in the conversations on Sixth Street, in the faces of the researchers and policy people who came to Austin that week carrying a specific kind of tiredness. Hype has no friction. Real change has costs, and the costs are borne by specific people in specific places.

The 2026 dispatch

Forty years of SXSW, and for the first time the convention center was gone. The Austin Convention Center had been demolished, and the festival spread across the city itself: hotel conference rooms, bars, food halls, parking lots fitted with temporary stages. The decentralized experiment produced something unexpected. Without a central hub, the collisions became more honest. You couldn’t avoid the people who disagreed with you; you were all in the same bar anyway.

The concept that stuck from 2026 was cognitive Darwinism, a frame I wrote about in “Charmageddon, cognitive Darwinism and a futurist coming home.” The argument, compressed: as AI handles more of the routine cognitive load (search, summarization, first drafts, pattern recognition), the humans who survive professionally are those with irreducible judgment, embodied experience, or creative friction that machines can’t replicate at cost. Everybody else is in a Darwinian pressure zone whether they know it or not.

Charmageddon is what happens when charm scales: when AI-generated charisma floods every communication channel and the authentic human signal gets lost in the noise. SXSW 2026 had plenty of both.

The “AI’s real problem may be that humans need to matter” session from Crossover Day landed with more force than most. The technical arguments about capabilities and alignment are important, but the psychological argument is more urgent for any organization actually trying to deploy AI: people who don’t feel they matter stop contributing. The most sophisticated AI rollout fails if the humans around it conclude that their presence is decorative. This is a management problem dressed as a technology problem, and most CTOs I met in Austin were still discussing it as a technology problem.

The “You may all go to hell, I will go to Texas” energy of 2026 was different from the Davy Crockett bravado of the phrase’s origins. Texas in 2026 was simultaneously the home of Stargate (with its AI sacrifice zone implications), the proving ground for decentralized urban living, the stage for senators saying uncomfortable things about platform power, and the venue for a thousand conversations about what humans are for once machines can think. That’s a lot for one week in March. It’s exactly why I keep going.

Why this matters for boards and operators

If you’re running an organization larger than 50 people, SXSW earns its cost as reconnaissance. The things that get discussed at the fringe sessions in Austin in March land in your competitor’s product roadmap in 18 months and in your board’s questions in 36.

I’ve been wrong about specific technologies (Foursquare did not eat the world; iBeacons are not watching you from every store corner). The directional accuracy has been better. The themes that mattered at SXSW tend to matter, with a lag.

In 2026, the board-level implication is this: AI as a strategic force has moved past the technology teams and landed squarely on the governance agenda. Who controls the compute? Who owns the agent authentication layer? What happens to your workforce when the cognitive Darwinism pressure hits your middle management tier? These are the questions that came out of Austin, and they deserve a better answer than “we have a working group on it.” The leadership and communication frameworks for navigating structural change matter more now than they did in 2020, because the pace of structural change has compressed again.

The operators in the room at SXSW 2026 who were paying attention left with something specific: AI’s deployment problem is human, not technical. The infrastructure is being built (Stargate, the Azure expansions, the Google TPU farms). The question is what you build on top of it and whether the people inside your organization feel like participants or collateral. That’s a leadership question. It always was.

Austin functions as a specific variable in its own right: politically complex, energy-rich and energy-fragile simultaneously, culturally ambitious, geographically positioned between the Stargate AI infrastructure being built in Abilene and the legislative battles being fought in Washington. When Bruce Sterling stood on a SXSW stage in 2014 and said the future would be cities full of old people scared of the sky, he was being funny and completely serious. Austin keeps proving him right and then surprising him.

The chronicles themselves

Sixteen years of field notes don’t fit on one page. Below is a curated reading list, organized the way I’d walk a newcomer through the archive.

2026 dispatches (start here)

Prior years: the sharpest pieces

Dear Tara: letters from the field (cross-reference with 100,000 Miles)

Frequently asked

Why do you keep going back every year?

Because it keeps surprising me. Sixteen trips and I have been wrong more times than I care to count about specific predictions. But the pattern holds: the things that get discussed in Austin in March tend to be real 36 months later. That’s a better hit rate than most strategy consultants I know, and the brisket is better too. Also, the conversations that happen between sessions, at a bar at 11pm, between people who disagree, are irreplaceable. You can’t stream those.

What’s changed most since 2010?

The stakes. In 2010, we were excited about smartphones, location apps, and the idea that social media might change marketing. The stakes were commercial. By 2026, the conversations are about compute infrastructure large enough to reshape regional power grids, about AI agent authentication layers that will determine who controls the next phase of the internet, about whether democracy can survive the scaling of persuasion technology. The energy is the same (slightly chaotic, very loud). The weight is completely different.

What did 2026 feel like?

Specific and urgent, in a way I hadn’t felt since 2013. The physical decentralization of the festival, with no central convention center, forced a different kind of attention. You chose your sessions deliberately. The political temperature in the US was running high; senators were saying things in public that their staffers would normally edit out. The AI conversations had finally moved past “isn’t this amazing” into “what are we actually doing and who pays for it.” That felt like progress.

Who should go?

Anyone who has to make a call in the next three years about technology, workforce, or market positioning. Specifically: CDOs, CHROs, strategy leads, and board members who want to ask better questions of their management teams. SXSW runs as a collision event for people trying to figure out what happens next, and the technology track is only one part of it. Founders who want to find early adopters and media attention. Anyone who reads sxsw.com and feels a small pull of curiosity.

What’s the AI vs. creators tension that keeps coming up?

At SXSW 2026, the creative community and the AI community were sharing the same festival for the first time without pretending the friction didn’t exist. Musicians, filmmakers, writers, and game designers were in sessions right alongside OpenAI researchers and LLM developers, and the questions were direct: who owns the output trained on my work, who profits, and what’s left for the person who made the original? There were no clean answers. There were some honest conversations, which is rarer than you’d think. ABBA’s Björn Ulvaeus put it plainly back in 2018: data kills creativity when you optimize for pattern replication instead of surprise.

Who is Tara?

My daughter. Since 2021, I’ve been writing letters to her from SXSW (and from other places; she shows up often in 100,000 Miles). The letters are partly dispatches and partly honest attempts to explain what the world looks like from a conference floor when you’re a parent wondering what you’re handing over. The 2026 letter, written with a moon over Lake Travis, is the one I’m most proud of from that trip.

What’s Austin specifically got to do with it?

Austin is a variable in its own right: politically complex, energy-rich and energy-fragile at the same time, culturally ambitious, sitting between the Stargate build-out in Abilene and the legislative fights in Washington. When SXSW loses its convention center and spreads into the streets, you feel the city differently. Bruce Sterling called it in 2014: the future is cities full of old people scared of the sky. Austin keeps testing that hypothesis from multiple angles at once.

What do you actually bring back for board-level conversations?

Specific signals, not general vibes. In 2024 I came back with a clear read on the AI infrastructure investment timeline and the workforce implications. In 2026 I came back with three things: the cognitive Darwinism frame (which I now use in every keynote), the Stargate numbers as a governance conversation starter, and the observation that AI deployment is failing most organizations for human reasons rather than technical ones. Boards that want the technology briefing already have analysts for that. What SXSW gives me is the texture of what serious people are actually worried about, before it gets smoothed into a deck.

What’s the “pressure cooker” reference?

A piece I wrote from Austin in March 2026, Notes from inside the American pressure cooker, about the political and cultural temperature in the United States at a specific moment. When you spend a week in a room with American researchers, journalists, policy people, founders, and artists who are all processing the same set of pressures in real time, the conference becomes a reading of something larger than tech trends. The pressure cooker piece was my attempt to write that reading honestly.

Does SXSW still matter, or has it become a brand festival?

Both, at the same time, which is roughly what it has always been. The brand activation layer (the sponsored lounges, the swag, the parties that run on marketing budgets) coexists with the serious programming, the researcher panels, the policy conversations. You have to choose your SXSW. The one I go to is not the one where you queue for a free tote bag. The sessions on AI governance, on cognitive science, on the economics of creativity: those remain serious. The trick is knowing where to look, and that takes a few years of calibration.

How do you use SXSW notes in your speaking work?

Directly. The cognitive Darwinism frame from 2026 went into my keynote within six weeks of returning from Austin. The Stargate numbers became a board conversation starter in three separate engagements. The pressure cooker observation became a section in a leadership workshop on change and institutional friction. SXSW is fieldwork. The writing here is the field notes. The speaking and consulting is what happens when those notes meet a real organization with a real decision in front of it.

What’s next for the chronicles?

SXSW 2027 will be in a transformed Austin, with a new convention space still being built and the decentralized format either proving itself or collapsing under its own logistics. The AI regulatory environment in the US will have moved. The European AI Act will be further into implementation. Stargate will have broken ground on multiple sites. I’ll be there, taking notes, probably eating too much brisket, and writing another letter to Tara. Find me at heliade.net/about/ if you want to compare notes before then.

Go to Texas. The worst that happens is you come back with a different set of questions, which is usually more valuable than the answers you arrived with.