
    LCA and Stretch AI by Frey

    Featured in
    Tactical Playbook
    Single Tactic Deep Dive
    Bootstrap Story

    Build profitable directories in 4 days using Claude Code and AI web scraping

    Time to build: 4 days
    Data processing: 71,000 to 725 listings
    Starting capital: $250

    Core Founder Skill

    Non-technical

    Available Resources

    <$1K

    Core Insight

    Data moats in boring niches beat fancy tech — a crude directory with Lorem Ipsum text can generate leads worth $20,000+ if it solves a real search problem

    Challenges: Most founders think you need polished products and advanced features to generate revenue

    Applies when: Local service directories where price transparency and comparison shopping are difficult, especially in B2B niches

    Evidence: His original portapottymatch.com with Lorem Ipsum text and AI-generated content generated multiple leads including a $20,000+ order from New Mexico State Fair, proving market need exists regardless of polish

    Summary

    Frey shows how to build profitable online directories in 4 days for under $250 using Claude Code and an open-source web crawler. His luxury restroom trailer directory generated high-value leads including a $20,000+ order from New Mexico State Fair, despite starting with a bare-bones WordPress site with Lorem Ipsum still on the homepage.

    Founder Moves

    Built ugly MVP directory first to validate demand

    What worked
    Context

    Wanted to test if there was demand for porta potty comparison without investing heavily in development

    What they did

    Created portapottymatch.com in WordPress with Lorem Ipsum text, same AI-generated descriptions for all listings, and identical stock photos

    Result

    Generated multiple inbound leads including $20,000+ order from New Mexico State Fair despite poor presentation

    Multiple leads including $20,000+ order
    Why it worked

    Solved a real search problem in a market with limited comparison tools — people needed to find luxury restroom trailers for events and had few options for comparing providers

    "I built this thinking like no one could reasonably trust this directory... But to my surprise, I just built this and left it up and I got like some leads coming in"

    Used OutScraper to get massive raw dataset

    What worked
    Context

    Needed comprehensive data for luxury restroom trailer providers nationwide

    What they did

    Scraped Google Maps using OutScraper to get 71,000 rows of potential porta potty businesses covering the entire US

    Result

    Got massive dataset but 90%+ was irrelevant junk data requiring extensive cleaning

    71,000 initial listings scraped
    Why it worked

    Casting a wide net ensured comprehensive coverage of the market, even though most data would be filtered out later

    "I got 71,000 rows of potential listings and covered the entire US"
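The wide-net scrape above can be sketched as one broad Google Maps query per state. The query-building helper is trivial; the commented SDK call is illustrative of OutScraper's Python client and should be checked against its docs (the state list here is truncated for brevity):

```python
def build_queries(term: str, states: list[str]) -> list[str]:
    """One broad Google Maps search query per state — cast a wide net, filter later."""
    return [f"{term}, {state}, USA" for state in states]

states = ["Texas", "New Mexico", "California", "Florida"]  # extend to all 50 in practice
queries = build_queries("porta potty rental", states)

# Illustrative OutScraper SDK usage (requires `pip install outscraper` and an API key):
# from outscraper import ApiClient
# client = ApiClient(api_key="YOUR_KEY")
# rows = client.google_maps_search(queries, limit=500)
```

Running every query with a high limit is what produced the 71,000 mostly-junk rows — acceptable, because the cleanup passes below are cheap.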

    Used Claude Code for initial data cleaning

    What worked
    Context

    Had 71,000 rows of mostly irrelevant data that needed basic filtering

    What they did

    Wrote prompt telling Claude Code to remove listings with no business name, address, permanently closed ones, and obvious non-relevant businesses like big box retailers

    Result

    Reduced dataset from 71,000 to 20,000 potential listings

    71,000 to 20,000 listings in one pass
    Why it worked

    Automated the most obvious filtering tasks that would have taken hundreds of hours manually while maintaining high accuracy for clear-cut decisions

    "this simple prompt got me from 71K down to 20,000 listings"
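The first-pass filter Claude Code generated from that prompt would look roughly like this. Field names (`name`, `full_address`, `business_status`) follow a typical Google Maps export but are assumptions; the big-box list is illustrative:

```python
# Drop rows with no name/address, permanently closed listings, and obvious
# non-relevant chains — the clear-cut decisions that need no human judgment.
BIG_BOX = {"home depot", "lowe's", "walmart", "costco"}

def keep_listing(row: dict) -> bool:
    name = (row.get("name") or "").strip()
    if not name or not (row.get("full_address") or "").strip():
        return False
    if row.get("business_status") == "CLOSED_PERMANENTLY":
        return False
    if any(chain in name.lower() for chain in BIG_BOX):
        return False
    return True

rows = [
    {"name": "ABC Luxury Trailers", "full_address": "123 Main St", "business_status": "OPERATIONAL"},
    {"name": "", "full_address": "5 Oak Ave", "business_status": "OPERATIONAL"},
    {"name": "Home Depot", "full_address": "1 Box Way", "business_status": "OPERATIONAL"},
]
cleaned = [r for r in rows if keep_listing(r)]
```

Applied to the real export, rules like these are what cut 71,000 rows to roughly 20,000 in a single pass.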

    Installed Crawl4AI for automated website verification

    What worked
    Context

    Still had 20,000 listings that needed manual verification of whether they actually offered luxury restroom trailers

    What they did

    Installed open-source Crawl4AI web crawler locally, connected it to Claude Code, and wrote prompts to automatically visit each website and identify luxury restroom trailer keywords

    Result

    Reduced 20,000 websites down to 725 verified luxury restroom trailer providers in 3 hours

    20,000 to 725 verified listings, saved ~1,000 hours
    Why it worked

    Automated the most time-intensive manual task using AI to read and understand website content at scale, something that would have taken 1,000+ hours manually

    "this is really what would have taken me like a 1,000 hours just a couple years back"
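The verification step reduces to crawling each site and checking for niche keywords. The keyword check is shown as a pure function; the commented crawl uses Crawl4AI's `AsyncWebCrawler` interface, which should be verified against the library's docs before relying on it:

```python
# Keywords that signal a site actually offers luxury restroom trailers
# (the exact list is an assumption — tune it against real sites).
KEYWORDS = ("restroom trailer", "luxury restroom", "bathroom trailer")

def offers_trailers(page_text: str) -> bool:
    """True if the page text mentions any niche keyword."""
    text = page_text.lower()
    return any(kw in text for kw in KEYWORDS)

# Illustrative Crawl4AI usage (requires `pip install crawl4ai` and network access):
# from crawl4ai import AsyncWebCrawler
# async def verify(url: str) -> bool:
#     async with AsyncWebCrawler() as crawler:
#         result = await crawler.arun(url=url)
#         return offers_trailers(result.markdown or "")
```

Looping a check like this over 20,000 URLs is the step that replaced roughly 1,000 hours of manual site review.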

    Iterative data enrichment with single-focus passes

    What worked
    Context

    Needed detailed information about each provider's trailer inventory, amenities, and service areas

    What they did

    Made separate Crawl4AI passes for each data type: trailer inventory (2-stall, 3-stall, etc.), amenities/features, service areas, and images. Examined results after each pass to identify edge cases and reran 2-3 times per category

    Result

    Built comprehensive database with filterable amenities, stall counts, service areas, and high-quality images for each listing

    725 listings fully enriched with 4+ data categories
    Why it worked

    Single-focus passes produced higher quality results than trying to extract everything at once, and iterative refinement caught edge cases that would have degraded data quality

    "the first mistakes that I did when I started enriching my data with Crawl4AI was I would just give it this massive laundry list of things to get... it just didn't work"
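The single-focus approach can be sketched as one narrow prompt per category, each run as its own crawl pass. The prompt texts and the `run_pass` extractor hook are hypothetical stand-ins for whatever Crawl4AI-plus-Claude pipeline does the extraction:

```python
# One tightly scoped prompt per data category — never a laundry list.
PASSES = {
    "inventory": "List only the trailer sizes offered (2-stall, 3-stall, etc.).",
    "amenities": "List only amenities: AC, heating, sinks, lighting.",
    "service_area": "List only the cities/states this provider serves.",
    "images": "Return only URLs of photos showing the trailers themselves.",
}

def enrich(listing_urls, run_pass):
    """Run each focused pass over every listing; run_pass(url, prompt) -> extracted value."""
    enriched = {url: {} for url in listing_urls}
    for field, prompt in PASSES.items():  # separate pass per category
        for url in listing_urls:
            enriched[url][field] = run_pass(url, prompt)
    return enriched

# Fake extractor just to demonstrate the flow:
demo = enrich(["https://example.com"], lambda url, prompt: f"extracted: {prompt[:20]}")
```

After each pass you inspect the results, tighten the prompt for edge cases, and rerun that one category — the 2-3 reruns per category the author describes.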

    Used Claude Vision for image quality filtering

    What worked
    Context

    Scraped images were low quality: logos, favicons, and poor photos mixed in with good trailer images

    What they did

    Sent top 3 image candidates per listing to Claude Vision API with prompts to identify the highest quality trailer images and filter out logos/irrelevant images

    Result

    Got high-quality, relevant images for each listing that enhanced the directory's credibility

    $30 API cost for 700+ listings, significant quality improvement
    Why it worked

    Visual AI could distinguish image quality and relevance better than filename/alt-text analysis alone, crucial for user trust in the directory

    "the first time I did this I got like logos and crappy images and favicons and it was like am I really about to clean this image data?"
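A vision-based filter pairs each candidate image with a yes/no prompt. The request body below follows the Anthropic Messages API's image content-block format; the model name in the commented call is only an example, and the prompt wording is an assumption:

```python
import base64

PROMPT = ("Is this a high-quality photo of a restroom trailer? "
          "Answer YES or NO. Answer NO for logos, favicons, or unrelated images.")

def image_message(jpeg_bytes: bytes) -> dict:
    """Build one Messages API user turn pairing an image with the filtering prompt."""
    return {
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "base64",
                        "media_type": "image/jpeg",
                        "data": base64.b64encode(jpeg_bytes).decode()}},
            {"type": "text", "text": PROMPT},
        ],
    }

# Illustrative call (requires `pip install anthropic` and an API key):
# import anthropic
# client = anthropic.Anthropic()
# reply = client.messages.create(model="claude-3-5-sonnet-latest", max_tokens=5,
#                                messages=[image_message(open("img.jpg", "rb").read())])
# keep = reply.content[0].text.strip().upper().startswith("YES")

msg = image_message(b"\xff\xd8fake-jpeg-bytes")
```

Sending the top 3 candidates per listing keeps the token bill small — roughly the $30 total the author reports for 700+ listings.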

    Built directory targeting ultra-specific niche

    What worked
    Context

    Competitive directories dominate broad categories like 'bathroom contractors' or 'senior living'

    What they did

    Focused on 'luxury restroom trailers' instead of general porta potty rentals, and suggested even more specific niches like 'senior living homes for people with dementia' or 'ADA accessible bathroom contractors'

    Result

    Easier to rank in search results and attract highly qualified leads willing to pay premium prices

    High-value leads in premium niche
    Why it worked

    Ultra-specific niches have less competition and serve users with very specific needs who are past the discovery phase and ready to make decisions

    "you might rank for senior living homes for people with dementia, which if we look on Ahrefs actually gets like over a 1,000 monthly searches"

    Tools Used

    OutScraper (Google Maps scraping)
    Claude Code (data cleaning and scripting)
    Crawl4AI (open-source web crawler)
    Claude Vision API (image quality filtering)
    WordPress (directory front end)