
Crawler.sh


Free Local AEO & SEO Spider and a Markdown content extractor

💡 A lightning-fast, local-first web crawler and AEO & SEO analysis tool. Crawl entire sites in seconds directly from your terminal or a native desktop app. Run automated SEO audits, extract content as clean Markdown, and export data to JSON, CSV, or Sitemap XML with zero hassle.

"Think of it as a Swiss Army knife for the AI era—part high-speed scout, part precision content surgeon."

30-Second Verdict
What is it: A Rust-based local SEO crawler that performs 16 SEO checks and extracts web content into clean Markdown.
Worth attention: Definitely worth watching, especially for developers who want to bypass Screaming Frog's free limits or need to bulk-convert web pages to Markdown for LLMs.
6/10

Hype

7/10

Utility

12

Votes

Product Profile
Full Analysis Report

Crawler.sh: The Developer's "Anti-SaaS" Local SEO Crawler

2026-03-03 | ProductHunt | Official Site

Crawler.sh Dashboard

Interface Breakdown: A sleek dark-themed dashboard with a URL input bar and a 'Start Crawl' button at the top. The card-style layout features a Live Feed of URLs and status codes (green for 200, red for 404), an SEO Issues panel highlighting the 16 common problems it checks for (short titles, missing descriptions, thin content), and a donut chart showing HTTP status distribution. It feels more like a geeky terminal tool than traditional SEO software.


30-Second Quick Judgment

What is it? A local SEO crawler written in Rust that can scan an entire site in seconds via terminal or desktop app. It checks for 16 types of SEO issues and extracts content into clean Markdown. It also claims to support AEO (Answer Engine Optimization) analysis.

Is it worth your attention?

YES, if you:

  • Frequently audit SEO but hate Screaming Frog's £149/year price tag and 500-URL free limit.
  • Need to bulk-convert web pages to Markdown (for LLMs or content libraries).
  • Love CLI tools and want to automate SEO checks in your scripts.
  • Are curious about the AEO trend and want a tool that supports it.

NO, if you:

  • Require Windows or Linux support (currently macOS only).
  • Need JavaScript rendering (it can't crawl SPAs).
  • Need enterprise features (team collaboration, APIs, cloud sync).
  • Already have a mature, paid SEO toolchain that works for you.

How does it stack up?

| vs | Crawler.sh | Screaming Frog | LibreCrawl | Sitebulb |
|---|---|---|---|---|
| Core Difference | Rust local + Markdown + AEO | Industry-standard SEO spider | Open-source Python alternative | Visual SEO auditing |
| Price | CLI free, Desktop Premium | Free (500 URLs) / £149/yr | Free (MIT) | £10-35/mo |
| Pros | Fast, local, Markdown extraction | Most comprehensive features | Completely free & open-source | Best visualizations |
| Cons | Very new, macOS only | Expensive, no Markdown | Requires Python environment | Expensive, steep learning curve |

The Three Big Questions

Is it for me?

Target User Personas:

  • Indie Devs / Technical SEOs: Who prefer working in the terminal.
  • Content Creators: Who need to bulk-extract web content into Markdown.
  • AI App Developers: Who need "LLM-ready" web data.
  • Small Teams: With limited budgets needing a solid SEO audit tool.

The Litmus Test:

  • If you perform at least one SEO check a week → This saves time.
  • If you need to turn competitor blogs into Markdown → This is perfect.
  • If you only use Windows → Skip it for now; it won't run.

Is it useful?

| Dimension | Benefit | Cost |
|---|---|---|
| Time | Rust engine is faster than Screaming Frog; CLI is scriptable | ~15 min learning curve for the new tool |
| Money | CLI is totally free, saving £149/year on Screaming Frog | Desktop Premium pricing is TBD |
| Effort | One tool for crawling + SEO + Markdown extraction | macOS only; ecosystem is still early |

ROI Judgment: If you're hitting the 500-URL wall on Screaming Frog's free version or need Markdown extraction, trying the CLI version costs nothing but a few minutes. If you have a full paid suite, there’s no urgent need to switch.

Does it delight?

Delight Points:

  1. Rust Speed - "Crawling a whole site in seconds" isn't hyperbole; Rust's async concurrency makes it blazingly fast.
  2. Markdown Extraction - Uses readability-rust for content and htmd for Markdown conversion—a feature most SEO crawlers lack.
  3. CLI Automation - crawler crawl → crawler seo → crawler export. Three commands, and you're done.
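The speed claim comes down to concurrency: many pages in flight at once instead of one after another. A toy Python sketch of the same idea, with simulated fetches standing in for HTTP requests (this illustrates the concept only; it is not the tool's Rust engine):

```python
import asyncio
import time

async def fetch(url: str) -> str:
    # Simulate ~100 ms of network latency; a real crawler would
    # issue an HTTP request here.
    await asyncio.sleep(0.1)
    return f"{url}: 200"

async def crawl(urls: list[str]) -> list[str]:
    # All fetches run concurrently, so total wall time is roughly
    # one request's latency, not the sum of all of them.
    return await asyncio.gather(*(fetch(u) for u in urls))

urls = [f"https://example.com/page/{i}" for i in range(50)]
start = time.perf_counter()
results = asyncio.run(crawl(urls))
elapsed = time.perf_counter() - start
print(f"{len(results)} pages in {elapsed:.2f}s")  # ~0.1s, not ~5s sequential
```

Rust's async runtime applies the same principle with far lower per-task overhead, which is where the "whole site in seconds" claim comes from.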

Real User Voice:

"I made a desktop app and a cli tool that you can analyze your sites' SEO issues, broken links etc. Rust based, works faster than screamingfrog." — @mehmetkose (Founder)

"Building crawler.sh week 2. Distributing a desktop app for multiple OS's macos(arm64,x64), linux (*n) and windows hell of a job." — @mehmetkose (Founder, venting about cross-platform distribution)

To be honest, since it launched on March 2, 2026, real user reviews are scarce. The founder is currently the loudest voice.


For the Developers

Tech Stack

| Layer | Technology | Notes |
|---|---|---|
| Crawling Engine | Rust (crawler-core) | Monorepo workspace |
| CLI | Rust (crawler-cli) | Subcommands: crawl / info / export / seo |
| Desktop App | Tauri 2 + React 19 | macOS DMG universal binary |
| Data Format | .crawl (NDJSON) | Line-delimited JSON for streaming |
| Content Extraction | readability-rust + htmd | HTML → Markdown |

Core Implementation Highlights

  1. NDJSON Streaming: The .crawl file is line-delimited JSON. The first line is metadata, followed by one page record per line. This allows real-time data processing without waiting for the crawl to finish—smart for large sites.

  2. Markdown Pipeline: readability-rust (Rust port of Mozilla's algorithm) extracts the main content, which htmd then converts to Markdown. It includes word counts, authors, and excerpts, making the output perfect for RAG or LLM feeding.
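A line-delimited format like this is trivial to consume from any language. A minimal Python sketch of a .crawl-style reader (only the "first line is metadata, one page record per line" layout comes from the description above; the field names in the sample are hypothetical):

```python
import json

def read_crawl(lines):
    """Parse an NDJSON stream in .crawl style: the first line is
    crawl-level metadata, each following line is one page record."""
    it = iter(lines)
    meta = json.loads(next(it))                       # metadata line
    pages = (json.loads(line) for line in it if line.strip())
    return meta, pages

# A fabricated two-page crawl for illustration (field names assumed):
sample = [
    '{"version": 1, "start_url": "https://example.com"}',
    '{"url": "https://example.com/", "status": 200}',
    '{"url": "https://example.com/about", "status": 404}',
]
meta, pages = read_crawl(sample)
broken = [p["url"] for p in pages if p["status"] >= 400]
print(meta["start_url"], broken)
```

Because `pages` is a lazy generator over lines, records can be processed as they arrive, which is exactly the property that makes the streaming format work for large sites.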

Open Source Status

  • Crawler.sh: Closed-source commercial product.
  • Open Source Alternatives: LibreCrawl (MIT), Crawl4AI, and SiteOne Crawler (see the comparison tables above and below).
  • Build Difficulty: Medium. 3-4 weeks for the Rust engine, 2-3 weeks for SEO logic, 2-3 weeks for the Tauri app. A solo dev could hit MVP in 2-3 months.

For the Product Managers

Pain Point Analysis

| Pain Point | Severity | Crawler.sh Solution |
|---|---|---|
| SEO tools are too expensive | High (for small teams) | Free CLI, free tier for Desktop |
| Content extraction requires custom scripts | Medium | Built-in Markdown extraction |
| Enterprise tools are too bloated | Emotional | Single binary, local execution |
| AI era requires AEO | Emerging | Claims to support AEO analysis |

The "Why Now": SEO auditing is a consistent need, but "Markdown extraction" is becoming a high-frequency requirement in the AI era. Crawler.sh sits right at that intersection.

Competitive Differentiation

| vs | Crawler.sh | Screaming Frog | LibreCrawl | Crawl4AI |
|---|---|---|---|---|
| Selling Point | Speed + Markdown | Feature-complete | Free/open source | LLM-optimized |
| Markdown | Built-in | None | None | Built-in |
| AEO | Claimed support | None | None | None |
| Price | CLI free | £149/yr | Free | Free |
| Platform | macOS only | Cross-platform | Cross-platform | Cross-platform |

For the Early Adopters

Pricing Breakdown

| Tier | Price | Features | Is it enough? |
|---|---|---|---|
| CLI | Free | All 4 core commands | Plenty for technical users |
| Desktop Free | Free | Basic dashboard + live feed | Good for a quick look |
| Desktop Premium | TBD | Advanced features | Unknown value proposition |

Hidden Costs: None. No API keys, no cloud fees, no account required.

Quick Start Guide

  1. Install CLI: Run the install script in your terminal to download the macOS binary.
  2. First Crawl: crawler crawl https://your-site.com
  3. View Info: crawler info output.crawl for an overview.
  4. SEO Audit: crawler seo output.crawl to export the issue list.
  5. Export Data: crawler export output.crawl --format json.
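Because everything is a subcommand, the whole audit can be scripted. A sketch wrapping the steps above with Python's subprocess module (it assumes `crawler` is on your PATH and accepts the arguments exactly as documented above):

```python
import subprocess
import sys

def run(args: list[str]) -> None:
    # Run one command, echoing it first and aborting the pipeline on failure.
    print("+", " ".join(args))
    result = subprocess.run(args)
    if result.returncode != 0:
        sys.exit(result.returncode)

def audit(site: str) -> None:
    run(["crawler", "crawl", site])                                 # 1. crawl
    run(["crawler", "info", "output.crawl"])                        # 2. overview
    run(["crawler", "seo", "output.crawl"])                         # 3. SEO issues
    run(["crawler", "export", "output.crawl", "--format", "json"])  # 4. export

# Usage: audit("https://your-site.com")
```

Dropped into a cron job or CI step, this turns the weekly manual check into an unattended one.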

Time to value: 5-10 minutes.

The "Gotchas"

  1. macOS Only: Windows/Linux users are out of luck for now.
  2. Very Early Stage: Launched March 2026; expect bugs.
  3. No JS Rendering: It might miss content on React/Vue-heavy sites.
  4. Vague AEO Features: While AEO is in the name, the specific AEO checks aren't clearly defined yet.

Conclusion

The Bottom Line: Crawler.sh is an intriguing mix of "Rust + AEO + Markdown." However, it's extremely new, platform-restricted, and currently lacks community traction. It’s great for curious tech enthusiasts, but not yet ready to be your primary professional tool.

| User Type | Recommendation |
|---|---|
| Indie Devs | Try the CLI; the Markdown extraction is genuinely useful. |
| PMs | Study the "CLI + Desktop" and "LLM-ready" positioning for your own products. |
| Early Adopters | Grab the CLI for free and play with it for 5 minutes. Wait for cross-platform support before relying on the desktop app. |
| Investors | Too early: solo founder, single platform, low initial launch heat. |

Resource Links

| Resource | Link |
|---|---|
| Official Site | https://crawler.sh/ |
| Documentation | https://crawler.sh/docs/ |
| ProductHunt | https://www.producthunt.com/products/crawler-sh |
| Founder's Twitter | https://twitter.com/mehmetkose |
| Founder's GitHub | https://github.com/mehmetkose |
| Roadmap | https://crawler.sh/roadmap/ |

2026-03-03 | Trend-Tracker v7.3

One-line Verdict

Crawler.sh is a high-potential tool that hits the sweet spot between 'SEO' and 'AI Data' needs. Despite current platform limitations, its Rust-driven performance and Markdown extraction make it a must-try for technical users.

FAQ

Frequently Asked Questions about Crawler.sh

What is Crawler.sh?
A Rust-based local SEO crawler that performs 16 SEO checks and extracts web content into clean Markdown.

What are its main features?
A Rust-based concurrent crawling engine, 16 automated SEO audits, Markdown content extraction, and multi-format export (JSON/CSV/XML).

How much does it cost?
The CLI version is free; the Desktop version is free to download, with Premium pricing yet to be disclosed.

Who is it for?
Indie developers, technical SEO professionals, content creators, and AI application developers.

What are the alternatives?
Alternatives include Screaming Frog, LibreCrawl, Crawl4AI, and SiteOne Crawler.

Data source: ProductHunt, Mar 2, 2026
Last updated: