The weird part is not that product teams are moving faster. It’s that the front door still feels like 2019.
Inside the product org, the pace has changed. A PM and designer can work through a new onboarding flow, test edge cases, and get to something close to a believable prototype before the week is over. In some teams, before lunch. Tools like Figr are a big reason why. Figr helps product teams think through UX flows, edge cases, and product decisions before they start drawing boxes for the sake of drawing boxes, and it can turn product context, design systems, and user journeys into high-fidelity prototypes that reflect actual product logic.
That compresses a lot of work that used to sprawl across docs, Figma files, Slack threads, and somebody’s memory.
But the customer still lands on a homepage that says things like “built for modern teams,” clicks to a features page, gets a three-minute brand film in text form, and then hits a signup wall before they can tell whether the product does the one thing they came for.
That gap is getting wider.
A lot of product teams still put their best UX effort inside the app, which made sense for a long time. The website explained the category. The help center filled in the blanks. The product did the rest. That arrangement gets shakier when the product itself changes every few weeks, and when AI tools make it easier to ship new flows, settings, and use cases before the surrounding language has caught up.
You can see the failure mode in pretty ordinary places. A team ships a smarter permissions model, a new workspace setup, and two AI-assisted features in one quarter. The product is better. The website still describes the version from April. The first person who notices is not usually the design team. It’s the buyer who opens three tabs, searches the docs for “Can this work with contractors?” and leaves because the answer is spread across a pricing page, an FAQ, and a changelog entry written like release notes for existing users.
Most teams do not have a product problem at that point. They have a comprehension problem.
And that problem shows up before signup, which is exactly where many product teams have the least instrumentation and the least patience. They’ll measure activation down to the decimal, then treat the website like a brochure with a calendar link attached. Meanwhile, the hard part for a potential customer is often deciding whether this product fits their situation at all. Not learning one button. Not finding the settings panel. Just getting to a clean, credible “yes, this is for me.”
Here’s the catch: some advice on this topic gets too tidy. You cannot redesign the whole customer journey every time the team ships something new, and you probably shouldn’t. A company releasing weekly changes cannot run a full website strategy project every Friday. That way lies another kind of slowdown, where marketing becomes the bottleneck and every launch drags a tail of copy reviews, page updates, and last-minute docs debt.
So the practical answer is not “make the website perfect.” It’s to make the early journey more adaptive than a static page can be.
This is where I start thinking less about page design and more about support for understanding. A visitor arrives with uneven context. One person wants to know whether the product handles role-based approvals. Another is trying to understand if the workflow is too complex for a six-person ops team. A third already gets the category and just wants to know what happens after signup. Pushing all three through the same page sequence is lazy. It’s also expensive.
Mando is useful in exactly that narrow part of the problem. It sits on the website, learns from a company’s docs, site content, and internal knowledge, and answers visitor questions in real time so people can understand the product and find the right next step before they sign up.
That does not fix bad positioning. It does not remove the need for clear pages. It does help when feature velocity has outrun the company’s ability to explain itself in static copy.
The teams that handle this well usually stop treating pre-product understanding as a marketing-only job. They treat it as part of product adoption. That changes what they look for. Not just bounce rate, but repeated question patterns. Not just trial starts, but where visitors hesitate. Not just whether the docs exist, but whether a first-time visitor can get to relevance in under two minutes without having to read like they're studying for an exam.
And yes, there is a mild irony here. AI is helping create some of this mess by making it easier to ship new features faster than the story around the product can keep up. It may also be one of the few realistic ways to absorb that mess without hiring a small army to rewrite pages every month.
Start smaller than you think. Pick one part of the journey before signup where confusion is obviously costing you momentum: pricing questions, use-case fit, implementation uncertainty, whatever comes up most. Then watch for one success signal. Fewer repeated pre-sales questions. More visitors reaching the right page on the first try. Shorter time between landing and action.
The product team has already sped up. The rest of the journey needs to stop pretending nothing changed.
