How to validate features before writing a single line of code
Every product carries a graveyard inside it. Features that took three months to build and six weeks to remove. Someone was confident. Engineering was committed. Users were not consulted. The feature launched, users shrugged, and the deprecation began.
Three months of developer time, buried.
I have watched this happen at every company I have worked at. The confidence was real. The commitment was genuine. The users just did not care.
What if you could have known? (Known what, exactly? Whether users would care, before you commit.)
The Validation Gap
Most product teams have a validation problem they refuse to articulate. (What is the problem, in plain terms? That validation requires resources they cannot access.)
A/B testing? Engineering needs to build two versions. User research? You need a prototype realistic enough to show. UX review? Designer time is allocated to the next sprint.
So PMs validate with opinions instead of evidence. They present in stakeholder meetings and interpret nodding heads as approval. (Is nodding heads the same as users caring? No; the feature never touched a user's hands.)
A 2022 study in the Journal of Product Innovation Management measured the gap: products with pre-development validation had a 47% higher market success rate than those validated only post-launch.
The Three Validation Layers
Layer 1: Concept Validation. Does this feature solve a real problem? Will users understand what it does without explanation?
This is the cheapest layer and the one most teams skip. They assume the problem exists because a stakeholder said so. (Is that enough? It is not evidence.)
Layer 2: Flow Validation. Users understood the concept. Can they complete the journey? (What are you looking for here? Where users get stuck, where they abandon, and where they ask questions the UI should have answered.)
This layer reveals where users get stuck. Where they abandon. Where they ask questions the UI should have answered.
Layer 3: Edge Case Validation. The happy path works. What about unhappy paths? (What counts as "unhappy" here? Every scenario that breaks the flow.)
When you ask for file upload, you should get a dozen screens, not one: progress, pause, resume, format error, size limit, network interruption, retry, success, failure, and more.
→ See Dropbox edge cases, 14 scenarios mapped
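The edge-case inventory above can be treated as a checklist rather than a vibe. A minimal sketch, in Python, with state names that are illustrative only, not from any real design system:

```python
# Hypothetical checklist of upload states, each deserving its own screen.
UPLOAD_SCREENS = [
    "progress", "pause", "resume",
    "format_error", "size_limit",
    "network_interruption", "retry",
    "success", "failure",
]

def coverage_gaps(designed: set[str]) -> list[str]:
    """Return the edge-case screens a design has not yet covered."""
    return [s for s in UPLOAD_SCREENS if s not in designed]

# A design that only handles the happy path misses most of the list:
missing = coverage_gaps({"progress", "success"})
```

Running the gap check against a happy-path-only design makes the hole concrete: seven screens nobody has drawn yet.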
The Economics
Engineering time runs approximately $75/hour. A feature that takes three months to build and is subsequently removed costs around $30,000 in developer time alone. That figure excludes opportunity cost, designer time, and support handling confused users.
Pre-development validation costs a PM two days, roughly $1,000 in loaded time. What is the bet here? That those two days prevent one failed feature per quarter; the ROI is 30x.
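The back-of-envelope math above can be written down explicitly. All figures are the article's estimates, not measurements; the 400 build hours is an assumed round number consistent with the ~$30,000 figure:

```python
# Back-of-envelope validation economics (article's estimates, not data).
HOURLY_RATE = 75          # engineering cost per hour, USD
FEATURE_HOURS = 400       # roughly three months of build time
VALIDATION_COST = 1_000   # two days of loaded PM time, USD

# Cost of building a feature that gets removed, developer time only:
failed_feature_cost = HOURLY_RATE * FEATURE_HOURS  # $30,000

# Return if two days of validation prevents one such feature per quarter:
roi = failed_feature_cost / VALIDATION_COST  # 30x
```

The real number is fuzzier than this, since the $30,000 excludes opportunity cost and designer time, which only pushes the ROI higher.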
According to IBM's Systems Sciences Institute research, the cost to fix a defect escalates by 6x from design to development, and by 100x from design to post-release maintenance.
What Validation Surfaces
A PM at a logistics company validated a warehouse routing feature. Initial concept: show drivers the optimal path.
During edge case generation, scenarios emerged the PM had not considered. What happens when an item is out of stock? When two routes conflict at the same aisle? When the device loses connectivity mid-route?
Each scenario became a screen. Each screen became a conversation with warehouse operators who confirmed these scenarios happen constantly.
The feature was redesigned before engineering wrote a line of code.
The gist is this: validation layers stack. Skip a layer, and the feature fails at exactly that layer.
In Short
Validation was traditionally gated behind resources PMs did not control. When validation costs days instead of weeks, the gate dissolves.
PMs validate before they pitch. They fail fast without failing publicly.
→ Try Figr for your next validation
