Well, bollocks. I owe it to myself, and to you, to come up with a 2026 prognosis. Noblesse oblige. You cannot spend a year hopping across stages with a trends deck, half futurologist, half Madame Soleil, without eventually putting your chips (and your cojones) clearly on the table. This is that moment. I look at the coming year with less caffeine optimism than last year and more structural clarity (am I getting older, or just more tired of nonsense?), which is what happens when the noise of change finally exhausts you instead of seducing you. 2025 was a rollercoaster. IPG was bought by Omnicom. My daughter turned 10. The tech cycle has eaten its own confetti. Geopolitics has migrated from think-tank chessboards straight into grocery prices. Ultra-right-wing politicians try their demagogic hocus-pocus, the US tries to buy Greenland, climate has stopped asking politely and started knocking things over, and Trump got a shiny peace medal from… FIFA.
Everywhere I look, systems that used to glide smoothly now scrape and grind, and that scraping is the sound of truth hitting reality at speed. Oddly enough, I am not (too) gloomy about it. Friction is how you find out what actually matters. Smooth stories are cheap, and boring. Resistance is honest. Ask any Star Wars fan who lived through the prequels. 2026 feels like the year excuses burn off and only operation-ready thinking, grounded in broad-range contrarian rationale, survives.
I am not easing into this year. I am bracing for it, the way you brace when a train hits a curve too fast and you suddenly feel weight you forgot was there. The weight of systems. Of consequences. Of decisions not taken. Of promises that sounded elegant on slides and now have to survive contact with physics, law, middle-aged managers and human fatigue. The weight of leaders who are not leading, but quietly hoping all of this will somehow go away (it will not). The weight of refusing to rethink foundations while repainting the lobby. For a decade we optimized for smoothness: frictionless UX, frictionless growth, frictionless narratives. Conflict-less leading. We banned all the colors, even all the shades of grey… we lost soul and leadership. Hell, we lost respect.
It turns out friction is where truth lives. In 2026, everything important resists you a little. Energy costs more. Microsoft and its SaaS nephews will take more money. Trust takes longer. Automation demands governance. Data is a pain. Security a nightmare. Yea-sayers have overstayed their welcome. Survival means redesigning your company from the foundation up, not adding another dashboard on top. Pretending otherwise gets expensive very quickly. Welcome to the sobriety society. And sobriety, when shared, can be strangely energizing.

Societal evolution: from fragmentation to intentional connection
“There can be only one” sounded cool in Highlander. It aged badly. Platforms are still here, but there is no longer one to rule them all, and honestly, we should be very grateful. We are watching in real time what happens when too many precious, vulnerable eggs end up in one basket, held together by vibes and quarterly incentives. The post-platform era does not mean platforms vanish. It means they stop being sufficient, and frankly, stop being desirable as single points of truth, identity, and livelihood. We still use them, because gravity is real, but fewer organizations are willing to bet their entire future, and their kids’ heads, on someone else’s ketamine-induced interface decisions made at three in the morning (yes, I am looking at you, Elon Musk). The shift is already visible. Value gets co-created across partners, suppliers, developers, regulators, and occasionally competitors. Industrial data spaces in Europe, open banking APIs, shared logistics and energy platforms, these are no longer empty buzzwords, they are coping (and survival) mechanisms. Intelligence distributes itself, decision-making fragments, and clever, bullet-proof, future-ready coordination becomes the scarce skill. Ecosystem orchestration quickly replaces platform dominance as the real advantage. Not owning the town square, but knowing how to keep basic traffic flowing without riots. Argh, if only politicians (and some vitrified board members) could keep up with this thinking. And yes, that includes deeply unsexy things like value graphs, shipping lanes, standards bodies, new social contracts, a plethora of unpopular decisions, and who actually picks up the phone when something breaks.
Authenticity hardens into bankable currency because cheap imitation and insincerity have become trivial. When AI can generate relatively competent messaging at industrial scale, the only defensible differentiation left is lived consistency. Leaders who say one thing and optimize another get exposed fast, not through scandal or outrage, but through pattern recognition (they should read William Gibson). Performative culture collapses under repetition fatigue, insane business goals, and the awkward absence of ice-cold ROI proof. Values-driven action gains weight precisely because it produces visible trade-offs: the deal you did not do, the market you exited, the revenue you delayed. Storytelling still matters, maybe more than ever, but only when it is anchored in concrete choices, real constraints, real profit, boardroom-level decisions, and scars you did not Photoshop out. In an AI-saturated world, specificity is trust.
Community-driven culture moves from “nice to have” to operational requirement. Employee activism rises not because people suddenly turned radical (they seldom do), but because they are better informed, better networked, and utterly unwilling to be decorative. Shadow AI helps them draft, analyze, organize, mail, and speak up with less fear and more precision. Workplace democracy grows unevenly and sometimes clumsily, but silence is no longer a stable state. Organizations that ignore this create a highly explosive ticking bomb: they get leaks and burn-out fireworks. Community management becomes a strategic function, internal and external, because belonging reduces volatility better than any engagement survey ever did. Keeping an eye on your perception NPS score is keeping an eye on your canary in the coal mine. Recruitment and retention increasingly hinge on values alignment, especially among younger workers who have watched institutions fail loudly, repeatedly, and with great metronomic PowerPoint discipline. Good people are gold. Who is willing to have your back in the trenches? And yes, the fear of a robotic intelligent army is everywhere, but the quieter reality is more interesting. People are not afraid of machines. They are afraid of their managers. They are afraid of systems that pretend not to care who gets crushed when optimization wins.

Political & geopolitical recalibration: navigating controlled disorder
Multipolarity stops being theoretical and starts shaping invoices, because geopolitics is now baked into procurement like VAT. The US–China competition is more than a thoughtless “headline risk,” it is a constraint on what you can buy, where you can buy it, and whether you’ll be allowed to buy it next quarter. And we know the market hates surprises and pending doom. Export controls on advanced semiconductors keep tightening, the US Commerce Department strengthened controls again in December 2024, and then updated rules in January 2025 (including new angles like controls tied to AI model weights). Trump would love to “update” that even more. I encourage you to read Chip War: The Fight for the World’s Most Critical Technology, a 2022 nonfiction and no-nonsense book by Chris Miller to see where that could lead us.
Friend-shoring becomes normal business language, because people love common ideology, but also because boards love continuity (and… nicer margins). And even tariffs, the bluntest tool in the box, are back in the conversation. The US put a 50% tariff on Chinese semiconductors into effect on January 1, 2025, and the latest noise is a new tariff track being delayed until June 2027 (because of course it is).
Strategic autonomy stops being a Brussels noun and becomes a tangible budget line in energy, defense supply chains, and infrastructure ownership. The Global South (half of the world is in those two words) does not “choose sides” as much as it chooses leverage. Countries hedge, trade, and negotiate in ways that look cynical until you remember cynicism is often just survival with (way) better PR. Supply chains get redesigned around political risk as much as cost. Capital flows become conditional. Compliance requirements follow the flag. You can feel it in boardrooms: “single supplier” stops sounding efficient and starts sounding irresponsible. This is what controlled disorder looks like: nobody wants open conflict, but everyone is building fallback positions (and fallback fallback positions).
Democracy runs under sustained pressure, and the pressure stopped being just elections, it is information in all its forms and appearances. Post-truth politics becomes ambient, a permanent haze where synthetic media and fragmented feeds make trust brittle by design. The liar’s dividend, “that’s fake,” becomes the universal escape hatch, and crisis comms windows shrink to minutes because narratives harden at scroll speed. Climate stress pours fuel on this. Migration pressure, protests, community unrest, and resource conflicts blur the line between environmental and political crises, and “stability” starts to mean “how quickly can we absorb shocks without breaking social glue.” Digital sovereignty rises for the same reason. States and continents reassert control over data, cloud infrastructure, and critical digital services, often accepting higher cost and lower efficiency in exchange for perceived safety. It’s not always elegant. It’s not always wise. But it’s predictable. And… it’s late.
Regulation, meanwhile, thickens rather than clarifies, especially in Europe. NIS2 had a transposition deadline of 17 October 2024 and started applying from 18 October 2024, and the Commission even opened infringement procedures against 23 Member States for missing the deadline. DORA (Digital Operational Resilience Act) started applying on 17 January 2025, and suddenly “operational resilience” stops being a nice phrase and becomes reporting duties, testing expectations, and vendor oversight. Work. Consequences. And the EU AI Act is not an abstract monster in the closet anymore, it has a calendar and a ticking clock: it entered into force August 2024, banned prohibited practices from February 2025, kicked in GPAI obligations from August 2025, and becomes fully applicable August 2026 (with some high-risk areas later).
This is the shift: compliance stops being about avoiding fines and becomes about staying operable and relevant. The companies that treat this as paperwork will drown in paperwork. The ones that treat it as a challenging system design might still have a business when the rules stop being optional reading.

Technology maturation: from hype to hard work
Still with me? Neat. I predict 2026 is the year the shiny stuff stops auditioning and starts doing shifts. Agentic AI marks the real transition, the models suddenly become enlightened (albeit trained on stolen data), and they stop waiting for you to type something. We move swiftly beyond boring and underperforming chat boxes toward systems that pursue goals, coordinate with other agents, and operate across tools, APIs, and time, which sounds like sci-fi (hello Stanislaw Lem) until you realize it mostly behaves like a hyperactive intern with root access. The hard part is not “can it do the task,” it is “who is accountable when it does the wrong task perfectly?” Multi-agent orchestration drags us straight into management problems we tried to ignore: incentives, budgets, escalation paths, handoffs, post-mortems, and the awkward reality that failure modes multiply when you add autonomy on steroids. Human-AI collaboration shifts up the stack. Humans set intent, constraints, and judgment. Machines do the execution and monitoring at a scale that makes your team look like it is moving through mush.
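To make the accountability point concrete, here is a minimal sketch, in Python, of the kind of guardrail layer those escalation paths imply: proposed agent actions pass a policy check, and anything high-risk is routed to a human instead of being executed. Every name here (`Action`, `route`, the risk labels) is illustrative, not any real agent framework's API.

```python
# Hypothetical guardrail between an agent's proposed actions and execution.
# Low-risk actions run automatically; high-risk ones require human sign-off.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    risk: str  # "low" or "high", assigned by a policy layer

def route(action: Action, approved_by_human: bool = False) -> str:
    """Return what happens to the action: executed, or escalated to a human queue."""
    if action.risk == "high" and not approved_by_human:
        return "escalated"  # lands in an audit-logged human review queue
    return "executed"

assert route(Action("summarize_report", "low")) == "executed"
assert route(Action("wire_transfer", "high")) == "escalated"
assert route(Action("wire_transfer", "high"), approved_by_human=True) == "executed"
```

The design choice worth noticing: accountability is decided before execution, not reconstructed in a post-mortem. That is the whole difference between governance and forensics.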
And the money gets loud(er). Global AI spending is on a trajectory to hit roughly $631–$632 billion by 2028, according to IDC forecasts. In Europe, IDC pegs AI spending reaching $144 billion by 2028. We’re way past fancy hobby money, this is now a budget line item that fights for oxygen with headcount and factories. The CFO era arrives quickly. Boards stop applauding shiny pilots and start asking for EBITDA-moving receipts. Demos without measurable outcomes get executed swiftly and quietly. This forces a shift toward self-verification, automated testing, and autonomous quality control. When systems produce outputs at machine speed, humans cannot be the review layer (sadly, we’re way too slow). Meanwhile the giants relentlessly keep pouring concrete. Reports in late 2025 pointed to over $300 billion in AI-related data center spending in 2025 by the big US hyperscalers, with China’s ByteDance (TikTok’s parent) reportedly budgeting around $23 billion for AI infrastructure in 2026. When that much capital is moving this fast, “hype” stops being the right word. It is infrastructure now.
Infrastructure is where operational physics kicks in. The International Energy Agency estimates data centres currently use about 415 TWh, roughly 1.5% of global electricity, and projects they could reach around 945 TWh by 2030, just under 3% of global electricity. That is slightly more than Japan’s electricity consumption today (!). In the EU specifically, an EC explainer citing IEA estimates puts data centre electricity use around 70 TWh in 2024, rising toward 115 TWh by 2030. So yes, privacy-first and on-device AI gain ground for philosophical reasons, but also for brutally practical ones: latency, compliance, and the growing suspicion that shipping every thought to the cloud is the digital equivalent of commuting to print a PDF: it does not make a lot of sense.
Quantum, meanwhile, continues its slow march from “conference magic trick” to “niche but real.” It does not replace classical computing, it piggybacks on it. Hybrid quantum-classical approaches keep showing promise in materials science, optimization, and early drug discovery, the kinds of problems where the search space is a swamp and brute force dies. The more urgent quantum story in 2026 is security. NIST has already finalized its first post-quantum cryptography standards (FIPS 203/204/205), built from CRYSTALS-Kyber, CRYSTALS-Dilithium and SPHINCS+, with FALCON also selected for a forthcoming standard. Quis custodiet ipsos custodes? Indeed, good question.
Serious organizations start inventorying cryptography and planning migration now, because “we’ll deal with it later” is how you end up trying to change the locks while the burglars are already in the hallway, on their way out.
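What "inventorying cryptography" means in practice is unglamorous triage: list every system, list the primitive it depends on, and flag whatever rests on factoring or discrete logarithms, since those are the ones a cryptographically relevant quantum computer breaks. A minimal sketch, with made-up system names and a deliberately tiny algorithm list:

```python
# Illustrative triage step for a cryptographic inventory.
# Algorithms whose security rests on factoring/discrete logs are
# quantum-vulnerable; the NIST PQC families (FIPS 203/204/205) are the
# migration targets. Names and systems below are placeholders.
QUANTUM_VULNERABLE = {"RSA-2048", "ECDSA-P256", "DH", "ECDH"}
POST_QUANTUM = {"ML-KEM (Kyber)", "ML-DSA (Dilithium)", "SLH-DSA (SPHINCS+)"}

def triage(inventory: dict[str, str]) -> list[str]:
    """Return the systems still using quantum-vulnerable primitives, sorted."""
    return sorted(sys for sys, alg in inventory.items()
                  if alg in QUANTUM_VULNERABLE)

systems = {
    "vpn-gateway": "RSA-2048",
    "payments-api": "ML-KEM (Kyber)",
    "code-signing": "ECDSA-P256",
}
assert triage(systems) == ["code-signing", "vpn-gateway"]
```

The real version of this is a discovery problem across certificates, libraries, firmware, and vendors, which is exactly why it takes years and why "later" is too late.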
The maturation theme running through all of this is simple: fewer fireworks, more honest debates on what matters, more plumbing. Green IT stops being a virtue signal and becomes cost containment. Carbon-aware computing becomes procurement language, not branding. Circular hardware and efficiency move up the priority list because electricity, rare earth materials, cooling, and supply chains are now strategic constraints, more than PR-able background noise. Technology in 2026 is still moving fast, but the winners are by no means the loudest (what were you thinking). They are the ones who can integrate autonomy with governance, intelligence with accountability, and ambition with the nasty little limits called operational physics.
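"Carbon-aware computing becomes procurement language" has a very literal mechanical core: deferrable workloads get scheduled into the hours when the grid's forecast carbon intensity is lowest. A hedged sketch, with invented forecast numbers (real deployments pull these from a grid-intensity API):

```python
# Carbon-aware scheduling in miniature: shift a deferrable batch job to
# the hour with the lowest forecast grid carbon intensity (gCO2/kWh).
# The forecast values below are made up for illustration.
def greenest_hour(forecast: dict[int, float]) -> int:
    """Hour of day with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

forecast = {0: 310.0, 3: 280.0, 13: 120.0, 14: 105.0, 20: 260.0}  # gCO2/kWh
assert greenest_hour(forecast) == 14  # the midday solar window wins
```

The same ten lines, wired into a scheduler and a procurement clause, are the difference between carbon-aware as branding and carbon-aware as cost containment.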

The human-AI paradigm: redefining work & skills
Skills re-calibration is no longer a fancy curve on a slide, it is a daily (in)convenience. The World Economic Forum’s estimate that around 39% of core job skills will change by 2030 lands differently when you watch whole task categories evaporate inside a single planning cycle. AI eats the middle first, drafting, summarizing, scheduling, first-pass analysis, junior coding, the kind of work that used to train judgment by repetition. What remains is not “higher level” work so much as different work. Sensemaking. Framing. Knowing which question matters. Strategic thinking, emotional intelligence, and systems literacy rise in value because they do not compress neatly into prompts (yet). At the same time, trades and physical labor resurface as quietly strategic assets. The physical world is stubborn. Pipes leak. Concrete cracks. Logistics breaks. Reality keeps a body, and bodies still matter.
Entry-level pathways fracture in slow, dangerous ways. Junior roles were never just about cheap labor, they were about absorption. They taught context, rhythm, politics, and consequence. Forging mental muscle memory. When AI absorbs that foundational work, organizations risk building pyramids with no base. Seniors get overloaded. Institutional memory thins. The talent pipeline quietly but surely rots. The organizations that survive this do something uncomfortable. They redesign apprenticeship itself. They pair humans with AI systems deliberately, not as shortcuts, but as tutors and mirrors. New roles solidify fast, AI trainers shaping behavior, ethics specialists defining red lines, orchestrators coordinating human and machine effort, agent operators supervising autonomous systems. Work happens in hybrid teams where collaboration extends beyond people, and knowing when to distrust an output becomes as important as knowing how to generate one.
Many skilled workers respond by stepping sideways rather than climbing ladders that no longer lead anywhere. Portfolio careers, micro-entrepreneurship, and independent consulting surge because AI collapses the distance between idea and execution. One person with the right tools now rivals teams that once needed budgets, approvals, and headcount. Power shifts toward individuals who can integrate tools quickly, learn relentlessly, and re-frame their own value every few years without losing their spine. But this flexibility comes at a psychological cost. Constant reinvention erodes safety. Polarization, empathy fatigue, and algorithmic comparison grind people down. When everything feels provisional, experimentation starts to feel risky. Mind: three people in dad’s garage might save the world (or blow it up).
This is where many organizations misstep. They chase capability and forget safety. Innovation does not die because people lack tools. It dies because people stop trusting the ground under their feet. Rebuilding psychological safety means clear expectations, humane pacing, protected learning time, and leaders who tolerate intelligent failure without performative drama. In the human-AI paradigm, speed without safety produces burnout. Capability without belonging produces churn. But capability paired with safety produces something rare and valuable: loyalty and endurance. And in a decade that keeps accelerating, endurance may be the most underrated skill of all.

Leadership transformation: soft skills as hard power
Leadership in 2026 stops being about authority and starts being about architecture. Purpose evolves from a poster on the wall into an operating system that quietly governs decisions when no one is watching. Leaders who treat purpose as a campaign slogan burn credibility fast, because inconsistency now travels at algorithmic speed. Decision architectures increasingly hard-code mission, values, and long-term impact into investment logic, procurement rules, and product roadmaps. Some organizations experiment with roles that look a lot like Chief Value Officers, not to polish reputation (as if that ever worked), but to force ethical, societal, and long-term consequences into boardroom trade-offs instead of outsourcing them to PR after the fact.
Visionary leadership makes an unexpected comeback: leaders who matter think laterally and sometimes heretically. They ask questions that make rooms uncomfortable. They challenge sacred KPIs. They run small, semi-autonomous “special operations” units inside their organizations, teams explicitly designed to rethink how work gets done, how value is created, and which assumptions are no longer valid. These units are protected from day-to-day politics and measured on learning speed and insight, not quarterly output. It is contrarian thinking with guardrails, and it is one of the few ways large organizations avoid calcifying under their own success.
Generalists return to relevance because complexity punishes narrowness. Cross-disciplinary thinkers connect technology, finance, regulation, and culture without needing translation layers. They break silos through systems thinking rather than workshops and Post-its. Fusion teams, blending business, technical, legal, and human expertise by design, become the norm in organizations that want to move fast without breaking themselves. Crisis management hardens into a core competency. Continuous scenario planning replaces the once-a-year tabletop exercise, and transparent communication beats reactive scrambling when things inevitably go sideways.
Authentic influence replaces positional authority. People follow leaders who make sense under pressure, not those who perform confidence. Vulnerability, when grounded and non-performative, builds trust because it signals reality awareness rather than weakness. Leaders who prioritize resilience over optics, coherence over charisma, and long-term health over short-term applause earn followership that survives turbulence. Soft skills stop being soft the moment systems strain. In 2026, they are the load-bearing beams.

Communication reinvention: truth in post-truth times
Every organization becomes a media company whether it asked for the job or not. Silence now communicates as loudly as speech, and so does repetition. Storytelling volume explodes because AI makes production cheap, fast, and endless, but that same abundance flattens everything. What cuts through is not polish and posh buzzwords, it is humanity. Imperfection, simplicity, specificity, and context become differentiators because they are harder to fake at scale. Trust grows from showing your work, how decisions were made, which options were rejected, where things broke, and what you learned. Companies that publish their reasoning, trade-offs, and even their failures build more credibility than those chasing aesthetic perfection and brand-safe nothingness. In a world drowning in generated sameness, texture becomes truth. Look at your neighbor’s Instagram feed and compare that to reality. It hurts.
Search behavior fragments in ways that quietly terrify traditional comms teams. A well-cited internal Google insight from 2022 already showed that around 40% of Gen Z users turn to social platforms like TikTok and Instagram instead of Google Search or Maps for discovery, and that behavior has only normalized since. On top of that, AI-mediated search has arrived. Conversational engines, embedded assistants, and generated answers increasingly replace the blue-link model with direct synthesis. People no longer just search. They ask. They delegate. They accept a contextualized answer, often without ever seeing the original source. Search becomes social, visual, contextual, opinionated, and increasingly filtered through AI systems that summarize, rank, and decide on the user’s behalf. This fundamentally reshapes discoverability. Content that is not structured, credible, and reusable by AI models quietly disappears, even if it technically still exists somewhere on the web. In parallel, newsletters, podcasts, and small community platforms regain power because they offer continuity and a recognizable voice, qualities automated systems still struggle to sustain over time.
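"Structured, credible, and reusable by AI models" is not mystical: one concrete lever is publishing machine-readable metadata alongside the prose, for example schema.org JSON-LD, so synthesizing systems can attribute and reuse the content. A minimal sketch with placeholder values:

```python
import json

# Minimal schema.org Article JSON-LD generator. The point: prose plus a
# machine-readable layer survives AI-mediated search better than prose
# alone. All field values here are placeholders.
def article_jsonld(headline: str, author: str, date: str) -> str:
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date,  # ISO 8601
    })

doc = json.loads(article_jsonld("2026 prognosis", "Jane Doe", "2026-01-01"))
assert doc["@type"] == "Article"
assert doc["author"]["name"] == "Jane Doe"
```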
Serialized content and lore-driven narratives outperform isolated announcements because they reward attention over time and feed both humans and models. Communication stops behaving like a campaign calendar and starts behaving like a relationship: slow to build, fast to break, and impossible to fake consistently when humans and AI are observing the same patterns, again and again. Visual ambition rises sharply as tools lower the barrier to cinematic expression. AI enables micro-cinema, animation, translation, and remixing at a cost that would have been laughable five years ago. Brands experiment with immersive formats, interactive stories, and real-time visual responses. But the same tools flood the ecosystem with low-effort sludge. Information warfare intensifies not through brilliance, but through volume. Signal gets buried under competent noise. As a result, signal detection becomes a professional skill. Platforms like Reddit increasingly act as early warning systems, surfacing contradictions, sentiment shifts, and emerging narratives before they hit mainstream channels. Verification and fact-checking move out of journalism and into everyday corporate competence, not as a moral stance, but as a survival tactic. In post-truth times, credibility is not declared. It is continuously earned, one consistent, verifiable message at a time.

Climate & sustainability: adaptation becomes urgent
Adaptation finally steps out of the moral debate and into the machinery room. For years, climate lived in the future tense, something we would deal with after the next quarter, the next election, the next breakthrough. In 2026 it becomes present tense. The European Union’s integrated adaptation planning pushes resilience straight into the guts of infrastructure, agriculture, water systems, and urban design. This is not about hitting abstract percentages on a slide. It is about roads that do not buckle at 45°C, ports that can operate with higher sea levels, cities that can survive a week-long heat dome without turning into emergency wards. Regional pathways replace universal targets because reality is stubbornly local. Flooding in Belgium, drought in southern Spain, wildfires in Greece are not variations of the same problem. They are different systems failing in different ways.
Climate risk becomes financial risk in ways that cut through ideology. Insurance is the early warning system, and it is blinking red. In flood-prone and wildfire-exposed regions across Europe and North America, premiums spike or coverage simply vanishes, not next decade, now. When a house or factory becomes uninsurable, it becomes unfinanceable. Banks notice. Credit models start pricing physical climate risk into long-term assessments, nudging borrowing costs up and investment flows sideways. Capital does what it always does, it moves away from uncertainty and toward resilience. Suddenly adaptation budgets look less like environmental spending and more like balance-sheet defense.
The 1.5°C story does not disappear, but it loses its comforting neatness. Overshoot is increasingly acknowledged, quietly, in technical briefings rather than press releases. The conversation shifts from prevention alone to damage limitation and adaptive capacity. How much heat can a workforce endure? Which assets do you harden, which do you relocate, which do you abandon? Investment accelerates into flood defenses, heat-resilient buildings, water recycling, cooling infrastructure, and nature-based solutions like wetlands and urban green corridors that absorb shock while buying time. These are not romantic projects. They are shock absorbers for a system under stress. It is plain wrong, of course, that we now spend on disasters we should have prevented in the first place. Only, shareholders and politicians lack their ten minutes of courage.
Sustainability, in this environment, stops being about intent and starts being about proof. Green claims without data decay instantly. What holds value are transparent emissions figures, credible transition plans, and demonstrable resilience under stress. Companies that can show how they operate during heatwaves, droughts, supply disruptions, and energy volatility earn trust from insurers, investors, and employees alike. In 2026, the question is no longer whether you care about climate. It is whether your organization still functions when the climate stops playing nice.

Business model innovation: value creation reimagined
Business models finally let go of the platform fantasy and grow up into ecosystems. Owning the interface is no longer enough, and often not even desirable. Value now emerges through interoperability, shared standards, and deliberate co-creation across networks of partners, suppliers, developers, and customers. You see this in open banking, in industrial data spaces, in energy markets that balance generation, storage, and consumption across many actors. Companies learn to play in multiple ecosystems at once, sometimes even with competitors, without dissolving into strategic mush. The skill is not domination. It is coherence. Knowing where you anchor, where you plug in, and where you refuse to be dependent.
Outcome-based economics spreads because customers are done paying for potential. They pay for results, uptime, efficiency gains, emissions reduction, risk avoided. This is not a philosophical shift, it is very much a hard-negotiated contractual one. Service-level agreements evolve into outcome-level agreements, and suddenly continuous measurement becomes part of the product. If you cannot prove value over time, you do not get renewed. Period. Automation creeps into value accounting itself, tracking impact in near real time instead of quarterly post-mortems. This changes how products are built, how pricing works, and how trust is maintained, because the numbers are always on.
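An outcome-level agreement has a simple arithmetic skeleton: the fee tracks a measured result against a contractual target, with a floor below which you earn nothing. A hedged sketch using uptime as the outcome; the target, floor, and pro-rating rule are illustrative contract terms, not a standard:

```python
# Hypothetical outcome-level agreement: fee scales with measured uptime.
# Full fee at or above target, zero below the floor, linear in between.
# The specific target/floor values are invented for illustration.
def outcome_fee(base_fee: float, uptime: float,
                target: float = 0.999, floor: float = 0.99) -> float:
    """Fee earned for a billing period given measured uptime (0.0-1.0)."""
    if uptime >= target:
        return base_fee
    if uptime < floor:
        return 0.0  # clawback zone: no outcome, no fee
    return base_fee * (uptime - floor) / (target - floor)

assert outcome_fee(10_000, 0.9995) == 10_000
assert outcome_fee(10_000, 0.98) == 0.0
assert round(outcome_fee(10_000, 0.995)) == 5556
```

Once the fee is a function of a measurement, the measurement pipeline becomes part of the product, which is exactly the "the numbers are always on" shift the paragraph above describes.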
Infrastructure choices stop being purely technical and start reflecting sovereignty and survival concerns. Hybrid cloud and edge architectures balance control, latency, regulatory constraints, and resilience. Data that cannot leave a jurisdiction stays close to where it is generated. Real-time intelligence moves to the edge because physics still matters. Multi-cloud strategies become less about shaving costs and more about hedging risk, against vendor lock-in, geopolitical tension, and single points of failure. The business model is no longer just what you sell. It is how robust your value creation remains when the environment stops being friendly.

Organizational culture: the differentiator
Culture stops being the soft stuff once systems run hot. I sincerely hope that in 2026, well-being is no longer framed as kindness, it is framed as capacity. Mental health support, flexible work structures with room (finally) for neurodivergent colleagues, and third spaces move into standard operating models because burnout is expensive and measurable. The World Health Organization already estimates that depression and anxiety cost the global economy over $1 trillion per year in lost productivity, and organizations are finally connecting that number to their own attrition dashboards. Burnout stops being treated as a personal resilience failure and starts being recognized as a system design flaw: too much load, too little recovery, unclear priorities, permanent urgency. Fixing it requires redesigning work, not adding another mindfulness app.
Recognition and belonging quietly outperform perks. Free lunches, the ping-pong table, corporate swag, and inflated compensation packages lose their stickiness when people feel invisible or disposable. Gallup data consistently shows that employees who feel recognized are significantly more engaged and less likely to leave, and in 2026 that correlation turns into strategy. Shared moments, rituals, and visible appreciation create memory, and memory creates loyalty. Community becomes the real engagement engine, internal networks, peer recognition, and collective milestones replacing perk inflation as the glue that holds people through change. Belonging reduces volatility better than any retention bonus.
Forcing people back into the office is rarely a bold cultural stance. More often, it is a management failure wearing a confidence costume. It signals an inability, or unwillingness, to do the harder work of designing hybrid systems that function. The real advantage of distributed work was never “work from home in pajamas,” it was trust, autonomy, and access to wider talent pools, combined with intentional moments of gathering that reinforce the tribe. Culture is not created by mandatory clock-ins or full parking lots. It is created by shared purpose, clear expectations, and environments where individuals are treated like adults with lives, energy cycles, and different ways of doing their best work. Organizations that default to blanket return-to-office mandates usually reveal that they never learned how to lead without line-of-sight control. The managerial gold is elsewhere. It sits in designing socio-technical ecosystems where teams know when to come together, why it matters, and what they are building collectively, while still honoring individual focus, flexibility, and dignity. That kind of culture does not shout. It compounds.
Continuous learning embeds itself into organizational DNA because stasis becomes a liability. Upskilling shifts from a development benefit to a retention strategy as roles evolve faster than job titles. Humans increasingly learn alongside AI systems, not just how to use them, but how to question, correct, and supervise them. Learning becomes part of the workflow, not something scheduled between crises. Organizations that normalize experimentation and adaptation build confidence instead of fear. Adaptation stops being an initiative with a launch date and becomes a habit, quietly practiced, rarely announced, and absolutely decisive when the environment shifts again.

The integration challenge: making it all work
Integration is where strategies go to either become boringly effective or die noisily. By 2026, most large organizations are no longer short on ideas, pilots, or frameworks. They are drowning in them. The average enterprise already runs hundreds of SaaS tools, dozens of AI experiments, multiple regulatory regimes, and parallel transformation programs that barely speak to each other. Strategy stops being about choosing initiatives and starts being about designing an operating system that can survive load. Projects are temporary. Operating systems persist. The competitive advantage moves to those who can stitch AI, legacy IT, regulation, cybersecurity, culture, finance, and incentives into something that actually works on a Tuesday afternoon, not just at the annual offsite.
Governance has to grow up fast. Complexity is no longer exceptional; it is the baseline. The organizations that win are not the most agile in the buzzword sense, but the ones that can make fast decisions without losing control. That means clear ownership, explicit trade-offs, and escalation paths that work before something is on fire. We already have enough cautionary tales. Look at aviation, finance, healthcare. Most failures are not caused by a single bad technology, but by handoffs, gaps, and assumptions between systems that were never designed to coexist. Integration is not a tech problem. It is an organizational one, and it is brutally unforgiving to hand-waving.
Trust becomes the scarcest and most expensive resource in this environment. Edelman’s Trust Barometer has shown for years that trust in institutions is fragile, but in 2026 that fragility becomes operational risk. Customers hesitate. Employees leak. Regulators scrutinize harder. Transparency stops being a communications exercise and turns into behavior under stress. Do your incentives match your values when revenue is on the line? Do your AI systems behave the same in production as they did in the demo? Do your suppliers meet the standards you claim to enforce? Trust is no longer built by saying the right things once. It is built by being boringly consistent across products, people, data, and decisions.
Long-term thinking, paradoxically, becomes a competitive edge precisely because it is so rare. Public markets still reward quarterly performance, but infrastructure, AI capability, energy resilience, and talent pipelines operate on decade-long timelines. The companies that invest early in integration, clean data foundations, regulatory readiness, and human capability look slower at first. Then the environment tightens, and suddenly they are the only ones still moving. This is not idealism. It is compound interest applied to organizational design.
The uncomfortable truth is that most integration work is invisible. It does not demo well. It does not fit neatly into a keynote. It shows up as fewer incidents, faster recovery, lower friction, and quieter nights. In 2026, that quiet becomes a luxury. Building for decades instead of quarters is no longer a moral stance, it is a survival strategy. Legacy, not optics, becomes the real scorecard. And the organizations that accept this early will find that coherence, once achieved, is extraordinarily hard for competitors to copy.

The hidden bill: the cost of new technology
The invoice for progress finally arrives, and it is eye-watering. AI is no longer just a software story, it is a bone-hard infrastructure story, and infrastructure always sends the bill somewhere very physical. Data centers already consume roughly 1–2% of global electricity, and multiple credible forecasts (including from the International Energy Agency) expect that share to double by the end of the decade as AI workloads explode. A single large hyperscale data center can draw as much power as a medium-sized city, and unlike cities, it does not vote. Training frontier models burns through tens of gigawatt-hours of electricity, and inference, the part we pretend is cheap, runs 24/7. Cooling turns electricity into water politics. In drought-prone regions like the US Southwest, parts of Spain, and northern Mexico, water-cooled data centers now compete directly with agriculture and households. We need severe and quick sustainability action over endless talk: zoning hearings, emergency permits, true numbers, real regulation, and yes, local elections lost over insensitively sited server racks.
Energy is only the first line item. The ethical bill grows quietly and compounds. Automation is not just eliminating jobs; it is collapsing career ladders, and it will take out more of them in the very near future. Entry-level roles in marketing, law, finance, and software are being hollowed out faster than reskilling systems can compensate. The World Economic Forum estimates that while automation will create new roles, millions of workers globally will still need to transition roles by 2030, and transitions are where people fall through the cracks. When junior work disappears, mentorship pipelines break, institutional memory thins, and you risk losing an entire generation of skilled professionals before they ever stabilize. Governance struggles to keep up, and the moral hazard is obvious. It is very easy to deploy systems that quietly externalize harm, because the cost does not show up on the P&L this quarter.
Job loss is uneven, and that is what makes it destabilizing. Tech hubs adapt. Peripheral regions hollow out. Highly skilled workers gain leverage while others get stuck with shrinking options. Productivity gains concentrate in firms that own models, compute, and data, while adjustment costs are distributed across communities, schools, and families that never signed up for the experiment. This is how social contracts fray, not with a bang, but with a growing sense that the system optimizes around you, not for you. Ignoring this bill does not make it go away. It just adds interest, and interest, as anyone who has missed a payment knows, is merciless.
The uncomfortable truth is this: we are running one of the most ambitious technological transitions in history without having finished the plumbing. Energy grids lag ambition. Water policy lags deployment. Education systems lag labor markets. Ethics lags incentives. None of this means we should stop. It means we should stop pretending the cost is theoretical. Progress is still worth it. But only if we are adult enough to admit that someone always pays, and to decide, deliberately, who that someone will be.

AI’s next Act: agents, worlds, and the battle of giants
AI's next act is not about sounding smarter. It is about behaving differently. The evolution accelerates toward autonomy, and the quiet enablers matter more than the flashy demos. Retrieval-augmented generation (RAG) becomes boring infrastructure, which is exactly the point. Grounding models in verified, auditable knowledge turns hallucination from an existential risk into a systems problem. The Model Context Protocol (MCP) moves in the same direction, standardizing how models talk to tools, memory, permissions, and each other. This is plumbing, not poetry, but it is the plumbing that allows agents to act coherently across workflows instead of improvising themselves into trouble. Once you combine RAG, MCP, long-term memory, and tool access, you stop building chatbots and start assembling digital actors that persist over time.
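For readers who want the plumbing made concrete, here is a minimal, deliberately naive sketch of the RAG pattern: retrieve auditable sources, assemble a grounded prompt, and keep the citation trail. Every name here (`retrieve`, `grounded_answer`, the toy corpus) is illustrative, not a real API; a production system would use a vector store and an actual model client where the comment indicates.

```python
# A toy sketch of retrieval-augmented generation, under stated assumptions:
# keyword overlap stands in for vector search, and the model call is stubbed.

def retrieve(query: str, corpus: dict, k: int = 2) -> list:
    """Naive retrieval: rank documents by term overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(
        corpus.items(),
        key=lambda kv: len(terms & set(kv[1].lower().split())),
        reverse=True,
    )
    return scored[:k]

def grounded_answer(query: str, corpus: dict) -> dict:
    """Build a prompt from retrieved sources; keep source IDs for auditing."""
    sources = retrieve(query, corpus)
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in sources)
    prompt = f"Answer using ONLY the sources below.\n{context}\n\nQ: {query}"
    # A real system would call a model here; we return the audit trail instead.
    return {"prompt": prompt, "cited": [doc_id for doc_id, _ in sources]}

corpus = {
    "energy-2025": "data centers consume roughly two percent of global electricity",
    "hr-policy": "recognition and belonging outperform perks for retention",
}
result = grounded_answer("how much electricity do data centers consume", corpus)
```

The point of the sketch is the shape, not the ranking algorithm: the answer path is grounded in named, loggable sources, which is what turns hallucination into an auditable systems problem.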
That persistence changes everything. Agents stop being transactional and start being situational. They plan, simulate, revise, and coordinate. This is where world models enter the conversation, systems that do not just predict the next token, but maintain an internal representation of how the world behaves. Cause and effect. Physics. Constraints. Feedback loops.
Yann LeCun, one of the architects of modern deep learning, has been blunt about this. He argues that today’s large language models lack true understanding and that progress toward more general intelligence requires models that can reason about the world, not just language scraped from it. Whether you agree with his framing or not, the direction of travel is clear. Text prediction alone hits a ceiling. Simulation changes the game.
As autonomy increases, organizational boundaries start to blur. We already see the early outlines of hybrid HR-CIO responsibilities emerging, because managing agents starts to look suspiciously like managing a workforce. Onboarding non-human actors. Defining permissions. Setting budgets. Logging actions. Auditing behavior. Reviewing performance. When an agent schedules work, negotiates with vendors, touches customer data, or triggers financial actions, that is not just IT anymore. It is governance, compliance, and people management, minus the people. The organizations that grasp this early stop treating AI as a tool category and start treating it as a new class of organizational actor. The ones that do not end up with shadow agents doing shadow labor outside any meaningful control.
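To make "managing agents like a workforce" less abstract, here is a hypothetical sketch, assuming nothing beyond the paragraph above: a non-human actor with explicit permissions, a budget, and an audit log. All names (`Agent`, `invoice-bot`, the actions) are invented for illustration.

```python
# Illustrative governance sketch: an agent may only act within granted
# permissions and budget, and every attempt is logged, allowed or not.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    permissions: set
    budget: float
    audit_log: list = field(default_factory=list)

    def act(self, action: str, cost: float) -> bool:
        """Permit an action only if granted and affordable; log either way."""
        allowed = action in self.permissions and cost <= self.budget
        self.audit_log.append(f"{self.name}: {action} cost={cost} allowed={allowed}")
        if allowed:
            self.budget -= cost
        return allowed

bot = Agent("invoice-bot", permissions={"schedule", "pay_vendor"}, budget=100.0)
bot.act("pay_vendor", 60.0)   # permitted and within budget
bot.act("pay_vendor", 60.0)   # permitted, but budget is exhausted
bot.act("delete_data", 1.0)   # never granted, refused and logged
```

The design choice worth noticing is that denial is logged too: auditing behavior, not just outcomes, is what makes shadow agents visible.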
Meanwhile, the battle between giants intensifies with very real consequences. OpenAI, Google, Meta, and Microsoft pour tens of billions into compute, talent, and infrastructure, not out of curiosity, but because scale now determines who sets the defaults. Whoever controls the dominant models, protocols, and ecosystems ends up writing the de facto standards the rest of the world adapts to. This concentration ripples outward. Power centralizes. Dependency deepens. Geopolitical entanglement increases as compute, energy, and talent pools become strategic assets rather than neutral inputs.
Artificial general intelligence remains contested, undefined, and heavily mythologized. But focusing on the label misses the point. What matters is not whether a system crosses some philosophical threshold, but how much agency it is given, under what constraints, and with whose oversight. Systems that can plan, simulate, and act at scale do not need to be “general” to be disruptive. They just need to be deployed without brakes. Whether this next act amplifies human capability or deepens dependency depends less on model architecture and more on governance, incentives, and design choices made now. Autonomy without accountability is not intelligence. It is just speed with consequences.
We are not racing toward a single breakthrough moment. We are drifting into a new equilibrium, one protocol, one agent, one delegation at a time. The question for 2026 is not “will AI get smarter?” It will. The question is who decides how much freedom these systems get, how their actions are constrained, and who carries the weight when they get it wrong. That, more than any benchmark score, is where the real battle is being fought.

Friction is the new black (which is the old green)
I keep coming back to the same uncomfortable comfort: friction is honest. It is the sound a system makes when it finally has to carry its own weight. In 2026 we do not get to outsource that weight to “innovation,” or “culture,” or a vendor with a glossy PDF. We have to build things that survive Tuesdays. Things that behave the same in production as they did in the demo. Things that can be audited, throttled, shut down, explained. Real governance. Real energy math. Real human limits. Real accountability.
The good news, if you can call it that, is that this is exactly where grown-up advantage is made. Not by shouting “AI-first” like it’s a war cry, but by designing orchestration, incentives, and trust the way you’d design a bridge. Assume stress. Assume weather. Assume idiots. Build anyway. If your organization cannot tell the truth at scroll speed, cannot measure value without theatre, cannot keep humans intact while machines accelerate, then 2026 will not be a tough year. It will be a sorting hat.
So here’s my bet: the winners are the boring ones. The ones who keep receipts, who invest in plumbing, who treat trust like an asset and energy like a constraint, who stop repainting the lobby and finally fix the foundations. Friction is our frenemy this year.
It is the test, and it is the teacher.
