
LP conversion from 5% to 65%: What actually worked

A tactical breakdown of the changes that dramatically improved our landing page performance—plus how we automated LP creation afterward.

Niels Kaspers
December 20, 2024
9 min read

TL;DR

We 13x'd landing page conversion through systematic testing: put the tool above the fold, reduce friction, show real results immediately. Watch session recordings before you guess what's wrong. Then we automated the whole thing.

We improved landing page conversion from 5% to 65%. Not with fancy tools or expensive redesigns—with systematic testing and user understanding.

Then we automated LP creation entirely. But I'm getting ahead of myself.

The Starting Point

Our AI tool landing pages were converting at 5%. Users would land, look around, and leave. Classic.

These weren't obscure pages. We're talking about landing pages for Picsart's core AI tools—background remover, image enhancer, AI image generator—pages getting hundreds of thousands of visits per month. At 150M+ monthly active users, even small conversion improvements meant real numbers. A single percentage point on our background remover page was worth thousands of new tool activations per week.

The pages had:

  • Long explanations of features nobody read
  • Multiple CTAs competing for attention ("Try free," "Sign up," "Learn more," "Download app")
  • Generic stock imagery that could've been on any SaaS site
  • Form fields asking for email before showing any value

I'd seen this pattern before. At NU.nl, at Whisbi, at every company I've worked at—the first version of any landing page tries to say too much. It's the "but wait, there's more" impulse baked into a webpage.

The Framework

Instead of random A/B tests, we followed a systematic approach. I'd learned from scaling Quicktools to 10M users that random experimentation wastes time. You need a framework.

1. Understand Why People Leave

We installed Hotjar session recordings and watched 200+ sessions over two weeks. Not a sample—a commitment. You can't watch 15 recordings and call it research.

Patterns emerged fast:

  • Users scrolled past the fold but didn't engage with anything below it. The scroll was exploratory, not intentional.
  • Many hovered over the CTA, moved away, came back, and then left. Classic hesitation—they wanted to try the tool but the "Sign up" requirement killed it.
  • Mobile users struggled with the layout. Buttons were too small, the tool preview was cropped, and the page loaded slowly on 4G.
  • A surprising number of users scrolled all the way to the bottom, found nothing useful, and bounced. They were looking for proof the tool actually worked.

The session recordings told a story that analytics alone couldn't. Our bounce rate said "people leave." The recordings said "people want to stay but we're giving them reasons not to."

2. Prioritize by Impact

Not all changes matter equally. We ranked hypotheses using ICE scoring:

  • Impact (how many users affected)
  • Confidence (how sure are we this will work, based on session data)
  • Ease (how fast can we test it)

Our top-scored hypothesis: "Put the tool above the fold so users can start immediately." High impact (affects every visitor), high confidence (session recordings showed users wanted to use the tool), medium ease (required frontend work but no backend changes).
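As a sketch, ICE ranking is just the product of three 1-10 scores; the hypotheses and numbers below are illustrative, not our actual backlog:

```python
# Illustrative ICE backlog: (hypothesis, impact, confidence, ease), each 1-10.
hypotheses = [
    ("Tool above the fold", 9, 8, 5),
    ("Remove sign-up gate from CTA", 8, 7, 6),
    ("Real before/after examples", 6, 6, 8),
    ("Exit-intent popup", 4, 3, 9),
]

# Rank by the product of the three scores, highest first.
ranked = sorted(hypotheses, key=lambda h: h[1] * h[2] * h[3], reverse=True)
for name, impact, confidence, ease in ranked:
    print(f"{impact * confidence * ease:4d}  {name}")
```

Multiplying (rather than summing) the scores penalizes any hypothesis that is weak on even one dimension, which is the behavior you want from a prioritization filter.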

3. Test One Thing at a Time

No multivariate madness. Change one thing, measure the result, move on. We ran each test for a minimum of two weeks with at least 10,000 visitors per variant before calling it. No peeking at results on day three and declaring victory.

The whole testing cycle took about four months, running tests sequentially on our highest-traffic pages first—background remover, then image enhancer, then AI image generator—and rolling winning patterns across all pages.
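Calling a test also needs a significance check, not just a gut read of the dashboard. A minimal two-proportion z-test in plain Python (the conversion numbers below are made up for illustration):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 5% control vs 6% variant, 10,000 visitors per arm.
z, p = two_proportion_z(500, 10_000, 600, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.01: significant at these volumes
```

At smaller volumes the same one-point lift would not clear significance, which is exactly why the 10,000-visitor floor matters.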

The Changes That Worked

Test 1: Immediate Value Above the Fold

Before: Feature descriptions and company info. A hero image showing a phone with the app. Three paragraphs explaining what background removal is. (As if anyone searching "remove background from image" needs that explained.)

After: The tool itself, ready to use. Upload button front and center. A sample image already loaded showing a before/after preview.

Conversion impact: +15%

Timeline: Ran for 3 weeks on the background remover page. Results were statistically significant after 8 days, but we let it run to confirm stability.

Users came to do a task. Let them start immediately. This sounds obvious in hindsight, but the original page was designed by people who thought about the page. The new version was designed by people who thought about the user.

Test 2: Remove Friction from CTAs

Before: "Sign up to remove background"—requiring email, password, and email verification before the user could touch the tool.

After: "Remove background"—one click, instant result. Account creation moved to after the user saw value.

Conversion impact: +20%

This was the single biggest win. We debated it internally for weeks. The growth team wanted to gate the tool for lead capture. I argued that a user who's seen the tool work is 10x more likely to create an account than one who hasn't. The data proved it—not only did conversion jump 20%, but downstream account creation actually increased because more people reached the "wow" moment.

Let people experience value before asking for commitment. This principle now drives every LP we build.

Test 3: Show Real Results

Before: Stock photos of happy people using laptops. A lifestyle shot of someone "being creative." You know the ones.

After: Before/after examples of the actual tool output. Real images processed by the tool, showing exactly what the user would get.

Conversion impact: +12%

Proof beats promises. We tested this on the AI image enhancer page first. Instead of telling users "enhance your images with AI," we showed a blurry photo next to its enhanced version. The before/after slider became our most-interacted-with element.

Test 4: Mobile-First Design

Before: Shrunk desktop layout. Everything squeezed into a smaller viewport with tiny touch targets.

After: Purpose-built mobile experience. Larger upload button, simplified layout, image previews optimized for portrait orientation.

Conversion impact: +8%

63% of our traffic was mobile. We were optimizing for 37%. The mobile redesign wasn't just "make it responsive"—it was a different layout entirely. The desktop version kept a side-by-side before/after. Mobile got a vertical stack with swipe interactions.

Test 5: Reduce Cognitive Load

Before: 5 different CTAs—"Try free," "Download app," "Watch demo," "See pricing," "Sign up."

After: 1 primary action. Everything else moved to secondary navigation or removed entirely.

Conversion impact: +10%

When everything is important, nothing is. We literally deleted four buttons. The designer was nervous. The PM was nervous. The data was not nervous at all.

What Didn't Work

Not every test wins. Honesty about failures saves others time:

  • Adding social proof badges ("Trusted by 150M users"): No significant impact. Users who search for "remove background" care about the tool, not your user count.
  • Changing button colors: We tested green, orange, and the original blue. Marginal 0.3% difference. Not worth the meeting time spent debating it.
  • Adding live chat support: Increased support load by 40% without measurable conversion lift. Users didn't have questions—they had friction.
  • Exit-intent popups: Annoyed users and hurt brand perception. Our session recordings showed people actively trying to close them, sometimes leaving the site entirely out of frustration.
  • Video backgrounds: Increased page load time by 2.3 seconds. Any conversion benefit was obliterated by the users who left before the page finished loading.

The Final Stack

After 4 months of testing:

  • Hero: Tool ready to use, no gates
  • CTA: Single, clear action
  • Trust: Real output examples with before/after
  • Layout: Mobile-first, desktop-enhanced
  • Copy: Benefit-focused, minimal, no jargon
  • Speed: Sub-2-second load time

Then We Automated It

Here's where it gets interesting. Once we knew the winning formula, we had a new problem: we needed to apply it to 50+ landing pages across different tools and languages. Doing this manually would take months.

So we built a landing page generation system using N8N workflows and LLMs. The system takes a tool name and target keyword, generates LP copy following our proven patterns, and creates the page in our CMS.

The automation handles:

  • Copy generation following our winning structure (tool above fold, single CTA, real examples)
  • SEO meta tags optimized for the target keyword
  • Localization into 10+ languages
  • Internal linking to related tools and pages
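To make the copy-generation step concrete, here is a minimal sketch of a prompt builder encoding the tested patterns. The real system is an N8N workflow; the prompt wording, function names, and output keys here are my assumptions, not the production setup:

```python
# Sketch of the copy-generation step only. The assembled prompt would be
# sent to an LLM, and the returned JSON written into the CMS.
PROMPT_TEMPLATE = """Write landing page copy for "{tool}", targeting the
search keyword "{keyword}". Follow the tested patterns:
- the headline names the user's task, not the feature
- exactly one call to action, labeled "{cta}"
- reference a real before/after example, no stock-photo language
Return JSON with keys: headline, subhead, cta_label, meta_title, meta_description."""

def build_prompt(tool: str, keyword: str, cta: str) -> str:
    return PROMPT_TEMPLATE.format(tool=tool, keyword=keyword, cta=cta)

prompt = build_prompt(
    "Background Remover", "remove background from image", "Remove background"
)
```

The key design point is that the conversion patterns live in the template, so every generated page inherits them by construction rather than by review.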

What used to take a designer + copywriter + developer 2 weeks per page now takes 20 minutes of review time. We went from shipping 2-3 new LPs per month to 15-20. And because every page follows the tested conversion patterns, new pages launch at 40-50% conversion instead of the old 5%.

This is what I mean when I talk about small teams beating big teams—it's not about working harder. It's about building systems that multiply your output. Two people with the right automation outship a team of ten doing everything manually.

Lessons Learned

  1. Watch users, don't survey them. People say one thing and do another. Session recordings don't lie. Budget two weeks of just watching before you touch anything.

  2. Friction is the enemy. Every field, every click, every second of loading time costs conversions. Audit your page with a stopwatch—how many seconds from landing to value?

  3. Mobile isn't an afterthought. Design for the majority of your traffic. Check your analytics before your next redesign and design for whatever device 60%+ of visitors use.

  4. Test with adequate sample sizes. Don't declare winners too early. Two weeks minimum, 10K visitors per variant minimum. Patience pays.

  5. Compound gains matter. 5 changes at +10% each don't add up to +50%. They compound to +61%. That's why systematic testing beats one big redesign.

  6. Then automate the winner. Once you know what works, don't keep doing it manually. Build systems that replicate the pattern. Your time is better spent finding the next insight than executing the last one.
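The compounding arithmetic in lesson 5 is worth checking by hand:

```python
# Five sequential +10% wins multiply, they don't add.
total = 1.0
for lift in [0.10] * 5:
    total *= 1 + lift
print(f"compounded: +{total - 1:.0%}")  # +61%, not +50%
```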


Working on conversion optimization? I'm always happy to chat about what's working. Find me on LinkedIn.


Written by Niels Kaspers

Principal PM, Growth at Picsart
