The SEO Crawler MCP is a self-contained application that crawls and analyzes websites for SEO issues, storing results locally in an SQLite database. It provides tools for both automated and manual analysis of technical SEO health.
From the registry: "Crawl and analyse websites for SEO errors using Crawlee with SQLite storage."
Please install the `seo-crawler-mcp` MCP server into my current AI client (that's you).
Required prerequisites (do these first if not already done):
- **Node.js 18+**: required runtime
Canonical MCP server config (stdio transport):
- command: `npx`
- args: ["-y","@houtini/seo-crawler-mcp"]
- required environment variables:
  - `OUTPUT_DIR`: directory where crawl SQLite databases are saved (example: `/path/to/seo-audits`)
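For clients configured via a JSON file (for example, Claude Desktop's `claude_desktop_config.json`), the entry above would look roughly like the sketch below. The server name key `seo-crawler` and the `OUTPUT_DIR` value are illustrative placeholders; the exact file location and top-level structure depend on your client.

```json
{
  "mcpServers": {
    "seo-crawler": {
      "command": "npx",
      "args": ["-y", "@houtini/seo-crawler-mcp"],
      "env": {
        "OUTPUT_DIR": "/path/to/seo-audits"
      }
    }
  }
}
```

After saving the config, restart the client so it relaunches its MCP servers and picks up the new entry.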
Note: This is a full website crawler that stores its results in SQLite. It exposes 4 tools (run_seo_audit, analyze_seo, query_seo_data, list_seo_queries) and 28 prebuilt SQL queries across 5 categories (critical, content, technical, security, optimisation).
Add this MCP server to my current client's config in the correct format for you. If you need secrets or credentials I haven't provided, ASK me — do not invent values or leave raw placeholders. After adding it, tell me how to verify the server is connected.