Solution Validation

Testing if your specific solution actually solves the problem, before building the full product.

The Critical Distinction

Problem validation: Is the problem real? Solution validation: Does YOUR solution work?

Many real problems have failed solutions. You need both.

The Solution Validation Mindset

Most founders:

  1. Have a solution in mind
  2. Build it for months
  3. Show it to customers
  4. Realize it doesn't solve the problem

Better approach:

  1. Validate the problem exists
  2. Test multiple solutions cheaply
  3. Build what customers actually want
  4. Launch quickly, iterate fast

The insight: Your first solution idea is probably wrong. Test assumptions before committing.

The Riskiest Assumption Test

Before building anything, identify your riskiest assumptions.

Framework:

  1. List all assumptions about your solution
  2. Rate each by: (risk if wrong) × (how uncertain you are that it's true)
  3. Test the highest-scoring assumptions first

Example: Meal planning app

| Assumption | Risk (1-10) | Uncertainty (1-10) | Score |
|---|---|---|---|
| People will cook recipes we suggest | 9 | 6 | 54 |
| They'll pay $10/month | 8 | 5 | 40 |
| Push notifications drive engagement | 5 | 4 | 20 |
| We can source good recipes | 3 | 1 | 3 |

Test #1: Will people cook our recipes? (highest score = most important to validate)
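The ranking step can be sketched in a few lines of Python. This assumes the score is risk × uncertainty, which surfaces the high-risk, low-certainty assumptions first; the assumptions and ratings are the hypothetical meal-planning examples, not real data:

```python
# Rank assumptions so the riskiest, least-certain ones get tested first.
assumptions = [
    # (assumption, risk if wrong 1-10, uncertainty 1-10)
    ("People will cook recipes we suggest", 9, 6),
    ("They'll pay $10/month",               8, 5),
    ("Push notifications drive engagement", 5, 4),
    ("We can source good recipes",          3, 1),
]

scored = sorted(
    ((name, risk * uncertainty) for name, risk, uncertainty in assumptions),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, score in scored:
    print(f"{score:>3}  {name}")
```

The top of the sorted list is your Test #1.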

Solution Validation Methods

Level 1: Fake It (Cheapest, Fastest)

Goal: Test demand before building anything.

Landing Page Test

What: Create a page describing your solution. Drive traffic. Measure signups.

How:

  1. Write compelling copy (problem + solution)
  2. Add email signup form
  3. Set a success metric (e.g., 5% conversion)
  4. Buy ads or post in communities ($100-500)
  5. Measure results in 1-2 weeks

Example:

Headline: "Meal planning in 2 minutes, not 2 hours"
Subhead: "Weekly meal plans personalized to your taste, dietary needs, and schedule"
CTA: "Join the waitlist"

Success criteria:

  • 5%+ conversion = strong interest
  • 2-5% = medium interest
  • <2% = weak interest

Cost: $200 (domain, hosting, ads)
Time: 1-2 days
Learning: Will people even want this?
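The success criteria above boil down to a conversion-rate calculation; a minimal sketch using those 5% / 2% thresholds (the visitor and signup counts are made up):

```python
def interest_level(visitors: int, signups: int) -> str:
    """Classify landing-page interest using the 5% / 2% conversion thresholds."""
    rate = signups / visitors
    if rate >= 0.05:
        return "strong"
    if rate >= 0.02:
        return "medium"
    return "weak"

print(interest_level(1000, 63))  # 6.3% conversion → strong
```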

Smoke Test

What: Advertise a product that doesn't exist. See who clicks "buy."

How:

  1. Create ad with your value prop
  2. Link to simple page with "buy" button
  3. On click: "Thanks for your interest! We're launching in X weeks."
  4. Collect email
  5. Measure click-through and purchase intent

Example:

  • Facebook ad: "Finally, healthy meal planning that takes 2 minutes"
  • Landing page: $10/month, "Start free trial"
  • After click: Waitlist form

Success criteria:

  • 2%+ click "buy" = validated
  • 1-2% = maybe
  • <1% = not interested

Cost: $300 (ads)
Time: 2-3 days
Learning: Will people pay?

Concierge MVP

What: Manually deliver your solution to a few customers.

How:

  1. Find 5-10 early customers
  2. Charge them (even if small amount)
  3. Do everything manually behind the scenes
  4. Learn what actually solves their problem
  5. Automate only what works

Example: Meal planning

  • Customer signs up for $20
  • You manually create their meal plan
  • You create their grocery list (by hand)
  • You send via email
  • You adjust based on feedback

Success criteria:

  • They use it
  • They're willing to pay
  • They refer others
  • You understand what they need

Cost: $0-100
Time: 1-2 weeks per customer
Learning: What actually solves the problem?

Level 2: Prototype It (Medium Cost and Speed)

Goal: Test if the experience resonates.

Clickable Prototype

What: Fake app/website that looks real but has no backend.

Tools:

  • Figma (design + prototype)
  • InVision (interactive mockups)
  • Marvel (simple prototypes)

How:

  1. Design key screens
  2. Link them together
  3. Share with 20-30 users
  4. Watch them use it (don't help!)
  5. Interview after

What to test:

  • Can they complete core tasks?
  • Is the flow intuitive?
  • Do they understand the value?
  • What confuses them?

Cost: $0 (tools are free)
Time: 3-5 days
Learning: Is the UX right?

Wizard of Oz MVP

What: Looks automated but you're doing it manually.

Example:

  • User thinks AI is generating meal plans
  • You're actually doing it by hand
  • User can't tell the difference
  • You learn if the output is valuable

How:

  1. Build simple frontend
  2. Fake the automation
  3. Fulfill requests manually
  4. Measure satisfaction
  5. Only automate what works

Success criteria:

  • 4+ star rating
  • Users would pay for this
  • Clear what features matter
  • You understand the workflow

Cost: $500-2,000 (simple frontend)
Time: 1-2 weeks
Learning: Is the output valuable?

Level 3: Build Minimum MVP (Higher Cost)

Goal: Test with real, functioning product (minimal features).

The Single-Feature MVP

What: Build one core feature really well.

Rule: If you're not embarrassed by v1, you launched too late.

Example: Project management tool

Don't build:

  • Task management
  • Time tracking
  • File sharing
  • Reporting
  • Integrations
  • Mobile apps
  • Gantt charts

Do build:

  • Create task
  • Assign to person
  • Mark complete
  • That's it

Why it works:

  • Launches in 2-4 weeks
  • Tests core value proposition
  • Gets real feedback fast
  • Cheap to build

When to use: After validating with prototypes/fake tests.

Technical Prototype

What: Prove technical feasibility of the hard part.

When needed:

  • Complex algorithm
  • New technology
  • Integration challenges
  • Performance concerns

Example:

  • "Can we actually match recipes to preferences?"
  • Build the matching algorithm
  • Test with real data
  • Ignore UI, payments, etc.

Cost: 1-4 weeks of dev time
Learning: Is this technically possible?
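For the recipe-matching example, the technical prototype can start as a single scoring function run against real recipe data. A hypothetical sketch: the tag-overlap scoring and the sample recipes are assumptions for illustration, not a prescribed algorithm:

```python
def match_score(recipe_tags: set, preferences: set, exclusions: set) -> float:
    """Score a recipe against user preferences; exclusions (e.g. allergies) are hard filters."""
    if recipe_tags & exclusions:
        return 0.0  # any excluded ingredient disqualifies the recipe outright
    if not preferences:
        return 0.0
    return len(recipe_tags & preferences) / len(preferences)

# Toy stand-in for real recipe data.
recipes = {
    "veggie stir-fry": {"vegetarian", "quick", "asian"},
    "beef chili":      {"beef", "slow-cook", "spicy"},
    "lentil curry":    {"vegetarian", "spicy", "indian"},
}
prefs, avoid = {"vegetarian", "quick"}, {"beef"}

ranked = sorted(recipes, key=lambda r: match_score(recipes[r], prefs, avoid), reverse=True)
print(ranked[0])  # → veggie stir-fry
```

Even a toy version like this, fed real data, answers the feasibility question faster than building UI and payments first.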

The Solution Interview

After showing prototype/MVP:

Structure (45 minutes)

1. Context (5 min)

  • Remind them of the problem
  • "Remember you said [problem was painful]?"

2. Show Solution (10 min)

  • Demo your prototype/MVP
  • Let them try it (if possible)
  • Don't explain too much, see if they get it

3. Open Feedback (20 min)

  • "What do you think?"
  • "Would this solve your problem?"
  • "What's missing?"
  • "What's confusing?"
  • "What would you change?"

4. Prioritization (5 min)

  • "What's most important?"
  • "What could we skip?"
  • "What would make this a must-have?"

5. Commitment Test (5 min)

  • "If we launched this next month at $X, would you buy?"
  • "Would you refer others?"
  • "Can we follow up for beta testing?"

What You're Listening For

Strong signals:

  ✅ "When can I get this?"
  ✅ "How much does it cost?"
  ✅ "Can I sign up now?"
  ✅ "This solves my exact problem"
  ✅ They describe specific use cases
  ✅ They spot problems you didn't think of

Weak signals:

  ⚠️ "This is interesting"
  ⚠️ "I like it" (generic)
  ⚠️ Lots of feature requests
  ⚠️ "I'd need to see more"
  ⚠️ Can't articulate how they'd use it

Red flags:

  🚩 "I'm not sure I'd use this"
  🚩 "Current solution works fine"
  🚩 "Too complicated"
  🚩 "Doesn't really solve my problem"
  🚩 They don't understand it
  🚩 Confused about the value prop

The Five-Second Test

Goal: Can people understand your value proposition instantly?

How:

  1. Show your landing page/product for 5 seconds
  2. Hide it
  3. Ask: "What does this product do?"

Success:

  • They can explain it accurately
  • They mention key benefit
  • They identify target user

Failure:

  • "I'm not sure"
  • Wrong understanding
  • Focus on features, not benefits

Fix: Simplify your message.

Measuring Solution Validation

Qualitative Signals

After 20 solution interviews:

| Metric | Success |
|---|---|
| "When can I get this?" | 40%+ |
| Clear on what it does | 80%+ |
| Would solve their problem | 60%+ |
| Would pay $X | 30%+ |
| Would beta test | 50%+ |
| Gave specific feedback | 70%+ |

Quantitative Signals

For landing page / fake tests:

| Metric | Benchmark |
|---|---|
| Landing page conversion | 5%+ |
| Ad click-through rate | 2%+ |
| Email open rate | 20%+ |
| Prototype completion rate | 70%+ |
| Would recommend (NPS) | 30+ |
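Of these metrics, NPS is the one with a non-obvious formula: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6), passives (7-8) counting toward neither. A quick sketch with made-up survey responses:

```python
def nps(scores: list) -> int:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return round(100 * (promoters - detractors) / len(scores))

scores = [10, 9, 9, 8, 8, 7, 7, 6, 5, 9]  # hypothetical "would recommend" responses
print(nps(scores))  # → 20
```

NPS ranges from -100 (all detractors) to +100 (all promoters), so a 30+ means promoters clearly outnumber detractors.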

Iterating Based on Feedback

Pattern Recognition

After 20+ interviews, look for:

1. Consistent confusion

  • If 40%+ confused about same thing → Fix immediately
  • Example: "I thought it did X not Y"

2. Missing features

  • If 60%+ mention same gap → Add to roadmap
  • Example: "I need to integrate with Slack"

3. Wrong solution

  • If 50%+ say it doesn't solve problem → Pivot
  • Example: "This is nice but doesn't address my real issue"

4. Wrong segment

  • If only subset loves it → Focus on them
  • Example: Freelancers love it, agencies don't care

The Pivot Decision

When to pivot:

  • 50%+ say solution doesn't address the problem
  • You keep explaining and they still don't get it
  • Features they want are completely different product
  • Different segment is way more excited

When to iterate:

  • Core value is clear but needs refinement
  • Missing features that make sense
  • UX issues but concept is right
  • Pricing concerns

When to kill:

  • Can't get anyone excited
  • Problem wasn't as painful as thought
  • Solution doesn't actually work
  • Better alternatives exist

The Iteration Loop

Test → Measure → Learn → Adjust → Test

Week 1: Show prototype to 10 people
Week 2: Fix top 3 issues
Week 3: Test with 10 new people
Week 4: Fix next issues

Repeat until 60%+ would buy.

Common Mistakes

Mistake 1: Building Too Much

❌ "Let's build the full product then test" ✅ "Let's test the core idea with a landing page"

Cost of mistake: 3-6 months wasted, $50K+ burned

Mistake 2: Ignoring Negative Feedback

❌ "They just don't get our vision" ✅ "If they don't get it, we need to change it"

Reality: Customers are always right about their problems, often wrong about solutions.

Mistake 3: Testing With Wrong People

❌ Showing to friends, family, non-customers ✅ Testing with target customers who have the problem

Mistake 4: Feature Creep

❌ "Let's add just one more feature before testing" ✅ "Test with minimum, add based on feedback"

The MVP rule: If you're not embarrassed, you waited too long.

Mistake 5: Explaining Too Much

❌ Walking them through every feature ✅ Watching them struggle (that's the data!)

If they need explanation to understand it, it's not clear enough.

Mistake 6: Taking All Feedback Equally

❌ Treating all feedback the same ✅ Weighting feedback by:

  • How much they have the problem
  • Whether they'd actually pay
  • How well they fit target segment

The Solution Validation Checklist

Before moving to next stage:

  • [ ] Tested riskiest assumptions first
  • [ ] Ran landing page test (if applicable)
  • [ ] Built clickable prototype or concierge MVP
  • [ ] Conducted 20+ solution interviews
  • [ ] 60%+ say solution would solve problem
  • [ ] 30%+ would pay at target price
  • [ ] Clear on what features are must-haves
  • [ ] Iterated based on feedback
  • [ ] Five-second test passes
  • [ ] No major confusion about value prop
  • [ ] Identified segment that loves it most
  • [ ] Got beta tester commitments
  • [ ] Documented findings and learnings
  • [ ] Decided: build, pivot, or kill

Red Flags

Kill or pivot if:

🚩 Solution doesn't solve the problem

  • Persistent feedback that it misses the mark
  • Need to explain it extensively
  • "Nice but not what I need"

🚩 Too complex to build/use

  • Would take 12+ months to build
  • Requires specialized expertise you don't have
  • Users find it confusing

🚩 Wrong approach

  • Different solution would work better
  • Competitor already nails this
  • Market wants different approach

🚩 Can't validate quickly

  • Can't test without building everything
  • Need 12-month sales cycle to validate
  • Dependencies outside your control

Moving to Business Model Validation

If solution validation succeeded:

You now know:

  • Your solution solves the problem
  • Target customers understand it
  • 30%+ would pay
  • What features are essential
  • Where you need to improve

Next step: Business Model Validation (Chapter 5)

  • Can you acquire customers profitably?
  • What pricing actually converts?
  • What's your path to revenue?
  • Do the unit economics work?

Resources

Prototyping tools:

  • Figma (design + prototype)
  • Webflow (functional websites)
  • Bubble (no-code apps)
  • Loom (demo videos)
  • Canva (marketing materials)

Testing tools:

  • Maze (prototype testing)
  • UserTesting (user feedback)
  • Hotjar (heatmaps, recordings)
  • Typeform (surveys)

Books:

  • The Lean Startup by Eric Ries
  • Sprint by Jake Knapp
  • Don't Make Me Think by Steve Krug