Beta World

Agents > Apps

Published
October 1, 2025

The year software stopped asking for clicks

In 2025 a design leader at a global airline sat in a conference room, laptop open, trying to plan an employee’s complex relocation. Each step required hopping between six apps: HR, payroll, real-estate partner, compliance tracker, travel booking, and IT provisioning. By the end of the day, that leader had 47 browser tabs open.

Now imagine the same scenario with an AI agent. Instead of logging into six apps, the leader simply says: “Relocate Priya from Bangalore to London. Handle payroll, housing, travel, and compliance.” Within minutes, everything is orchestrated: flights, visas, housing allowances, device provisioning, all in motion. What used to be an exercise in context switching becomes an outcome delivered.

So, what actually changed in that second scenario? The intent stayed the same; the interface collapsed into the background.

This is the tectonic shift: apps lock you into boxes; agents dissolve those boxes.

From rigid apps to fluid agents

Apps are structured around menus, forms, and navigation patterns. They ask users to adapt to their structure. Agents, in contrast, adapt themselves to the user’s intent.

  • Apps: pre-defined workflows, brittle integrations, user clicks drive progress.
  • Agents: dynamic orchestration, context-aware, goal-driven execution.

Do users still click anything? Sometimes, yes. But the number of required actions drops because the agent handles the glue work.

As Jakob Nielsen put it recently, “AI will browse for us.” His point is simple: the traditional UI of menus, tabs, and screens is an artifact of the past. Agents don’t present walls of options; they deliver outcomes.

Why “fluid” matters

Fluidity means a workflow does not break when the context shifts. Consider expense reporting:

  • App model: open expense app → upload receipt → categorize → submit.
  • Agent model: forward receipt → agent recognizes category, fills form, submits, follows up if policy conflict arises.

What about edge cases? The agent asks for a quick confirmation, then moves on.
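As a rough illustration, the agent model above could be sketched as follows. Everything here is a toy assumption: the keyword categorizer, the `POLICY_CAP` threshold, and the function names stand in for what a real agent would do with a trained model and a live expense system.

```python
# Hypothetical sketch of the agent-model expense flow described above.
# The categories, policy cap, and function names are illustrative assumptions.

POLICY_CAP = 75.00  # assumed rule: meals over this amount need a human confirmation


def categorize(description: str) -> str:
    """Toy classifier; a real agent would use an LLM or trained model."""
    keywords = {"flight": "travel", "hotel": "lodging",
                "lunch": "meals", "dinner": "meals"}
    for word, category in keywords.items():
        if word in description.lower():
            return category
    return "other"


def process_receipt(description: str, amount: float) -> dict:
    """Forward a receipt; the agent fills the form and flags policy conflicts."""
    category = categorize(description)
    needs_confirmation = category == "meals" and amount > POLICY_CAP
    return {
        "category": category,
        "amount": amount,
        "status": "awaiting_confirmation" if needs_confirmation else "submitted",
    }


# A $120 dinner exceeds the assumed cap, so the agent pauses for confirmation.
print(process_receipt("Team dinner at conference", 120.00))
```

The point is the shape of the flow, not the rules: routine receipts sail through, and only the policy conflict surfaces back to the user.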

Fluidity also unlocks scale. A report from Master of Code Global forecasts the AI agent market growing from $5.4B in 2024 to $7.9B in 2025, with a 45% CAGR through 2034. See the summary of the projections at AI agent statistics. Adoption is not just technical; it is economic gravity.

Does this mean every workflow becomes invisible? Not quite. Critical steps remain visible so people can approve, pause, or reverse actions.

Diagramming the shift

```mermaid
flowchart LR
    A[Apps\nPre-defined menus & clicks] --> B[Workflows\nManual orchestration]
    B --> C[AI Agents\nFluid intent capture]
    C --> D[Outcomes\nAdaptive, automated execution]
    D -.->|Human oversight| C
```

Is this too neat for messy reality? Of course. It is a sketch to show where attention moves.

The agent-centric stack

Agents are not magic; they sit atop layers of design and infrastructure. The modern agentic stack looks like this:

  1. Intent capture: natural language, voice, or multimodal input.
  2. Context ingestion: data from systems, design tokens, analytics, databases, docs.
  3. Orchestration: reasoning engines, planning, chaining tools.
  4. Execution: API calls, scripts, task completion.
  5. Feedback loop: users oversee, correct, and re-align.

Where should teams start? Nail intent capture and context ingestion, then automate the smallest valuable loop.

Apps, by contrast, stop at step 3: they expose execution as a UI you must drive yourself.
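One way to picture the five layers is as a minimal loop. Every function here is a placeholder of my own naming, not a real framework API; a real stack would wire in models, data stores, and service calls at each step.

```python
# Minimal sketch of the five-layer agentic stack described above.
# All functions are illustrative stubs, not a real agent framework.

def capture_intent(utterance: str) -> dict:
    """1. Intent capture: parse natural language into a goal."""
    return {"goal": utterance.strip().lower()}


def ingest_context(intent: dict) -> dict:
    """2. Context ingestion: attach relevant system data (stubbed here)."""
    return {**intent, "context": {"policy": "standard", "user": "priya"}}


def orchestrate(state: dict) -> list:
    """3. Orchestration: plan a chain of tool calls for the goal."""
    return [f"call:{step}" for step in ("book_travel", "update_payroll")]


def execute(plan: list) -> list:
    """4. Execution: run each planned call and collect results."""
    return [f"done:{call}" for call in plan]


def feedback(results: list) -> bool:
    """5. Feedback loop: a human approves or corrects the outcome."""
    return all(r.startswith("done:") for r in results)


state = ingest_context(capture_intent("Relocate Priya to London"))
approved = feedback(execute(orchestrate(state)))
print(approved)  # True
```

The structural contrast with apps is in the last two lines: the chain runs end to end, and the human enters at the feedback step rather than driving every click in between.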

What UX designers need to re-learn

Luke Wroblewski mapped the evolution of AI products into six stages, culminating in “AI agents” as the system of record. His framing is blunt: “Design morphs to orchestration.”

That means UX design shifts from crafting screens to designing systems of intent. The surfaces are no longer static pages but policy levers, confidence indicators, and trust signals.

As Nielsen Norman Group argues, we will need UX for agents, not interfaces.

So what do designers actually design?

  • Policy surfaces: where users set boundaries, budget caps, approval rules.
  • Confidence conveyors: how the agent signals certainty or doubt.
  • Choreography: managing multiple agents interacting in a workflow.

How do you know when to show the surface and when to hide it? Show it when risk, irreversibility, or ambiguity is high; hide it when the path is routine.

What about trust? Show what changed, why it changed, and how to undo it.
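To make the policy-surface idea concrete, here is a sketch of what such a surface might compile down to. The field names and the act/ask/refuse triage are assumptions for illustration, not an established schema.

```python
# Illustrative policy surface: boundaries a user sets once, which the agent
# then enforces on every action. Field names are assumptions, not a real schema.

policy = {
    "budget_cap_usd": 5000,
    "requires_approval": ["sign_contract", "book_flight_over_cap"],
    "auto_allowed": ["categorize_expense", "draft_email"],
}


def decide(action: str, cost_usd: float = 0.0) -> str:
    """Return how the agent should proceed: act, ask, or refuse."""
    if action in policy["auto_allowed"] and cost_usd <= policy["budget_cap_usd"]:
        return "act"     # routine and in budget: the surface stays hidden
    if action in policy["requires_approval"] or cost_usd > policy["budget_cap_usd"]:
        return "ask"     # risky or expensive: the surface appears
    return "refuse"      # outside the user's stated boundaries


print(decide("categorize_expense", 12.50))  # act
print(decide("sign_contract"))              # ask
```

Designing that dictionary, its defaults, its vocabulary, and when its "ask" branch interrupts the user, is exactly the kind of work that replaces screen layout.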

Business implications

For business owners, the stakes are clear. Agents collapse friction and unlock new economics:

  • Efficiency: workflows complete in minutes, not hours.
  • Consistency: agents apply rules uniformly across teams.
  • Scalability: no need to train 500 employees on six tools; train one agent once.
  • Differentiation: products that embed agents deliver outcomes, not interfaces.

Will this kill training budgets? It shrinks tool training, then shifts investment to policy design and oversight.

A report from Total Design calls it the era of agent-centric design, noting how firms will compete not on feature lists but on the fluidity of outcomes they can deliver.

Is this only for big companies? No. Small teams benefit first because fewer handoffs make automation land quickly.

FAQs

1. Will agents kill apps entirely?

Not overnight. Apps will still exist under the hood. But from the user’s perspective, the app layer recedes. Agents sit on top, orchestrating those apps invisibly.

2. How do agents handle trust and errors?

Designers must surface confidence levels and enable human-in-the-loop oversight. The goal is not blind automation but augmented control.

3. What skills should UX/UI designers develop now?

Move beyond layout craft. Learn orchestration patterns, policy surface design, and intent-driven workflows.

4. Are businesses really adopting agents?

Yes. Surveys show strong executive optimism about AI’s impact, and consumers increasingly expect more personalized experiences from automated systems.

5. How do agents differ from chatbots?

Chatbots answer. Agents act. The distinction is execution: agents do not stop at conversation; they carry tasks across systems until outcomes are delivered.

Closing thought

The basic gist is this: apps made us click; agents let us choose. Designers and businesses that understand this shift will stop thinking in menus and start thinking in outcomes. When workflows adapt themselves, the future of UX will not be screens; it will be trust, clarity, and fluidity.