This MCP server provides programmatic access to the Screaming Frog SEO Spider, allowing users to crawl websites, export crawl data, and manage crawl storage through an AI assistant.
From the registry: Crawl websites, export SEO data, and manage crawls via Screaming Frog SEO Spider.
Please install the `screaming-frog-mcp` MCP server into my current AI client (that's you).
Required prerequisites (do these first if not already done):
- **Screaming Frog SEO Spider** — Must be installed on your machine (v16+, tested with v23.x). Paid license required for most features. (https://www.screamingfrog.co.uk/seo-spider/)
- **Python 3.10+** — Required to run the MCP server
Canonical MCP server config (stdio transport):
- command: `uvx`
- args: ["screaming-frog-mcp"]
- optional environment variables:
- `SF_CLI_PATH`: Path to the Screaming Frog CLI executable. Optional on macOS, where a sensible default is used; required on Linux and Windows. (macOS default: `/Applications/Screaming Frog SEO Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher`)
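In a JSON-based client config (for example, Claude Desktop's `claude_desktop_config.json`), the entry above might look like the following sketch. The server key name (`screaming-frog`) and the config file location are client-specific assumptions, not fixed by the server:

```json
{
  "mcpServers": {
    "screaming-frog": {
      "command": "uvx",
      "args": ["screaming-frog-mcp"],
      "env": {
        "SF_CLI_PATH": "/Applications/Screaming Frog SEO Spider.app/Contents/MacOS/ScreamingFrogSEOSpiderLauncher"
      }
    }
  }
}
```

The `env` block can be omitted entirely on macOS if the default path applies.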
Note: You must close the Screaming Frog GUI before using the MCP tools — the GUI holds a lock on the crawl database. Typical workflow: crawl in the GUI → close the GUI → use MCP to analyze/export the results.
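The `SF_CLI_PATH` fallback behavior described above can be sketched as a small resolver. This is an illustrative sketch only — the function name and default constant are hypothetical, not part of the server's actual API:

```python
import os

# macOS default launcher path, as documented for this server (assumption: used as fallback)
DEFAULT_MAC_PATH = (
    "/Applications/Screaming Frog SEO Spider.app"
    "/Contents/MacOS/ScreamingFrogSEOSpiderLauncher"
)


def resolve_cli_path(env=None):
    """Return the Screaming Frog CLI path.

    Reads SF_CLI_PATH from the given mapping (defaulting to os.environ)
    and falls back to the macOS default when the variable is unset.
    """
    env = os.environ if env is None else env
    return env.get("SF_CLI_PATH", DEFAULT_MAC_PATH)
```

On Linux or Windows the fallback path will not exist, which is why setting `SF_CLI_PATH` explicitly is required there.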
Add this MCP server to my current client's config in the correct format for you. If you need secrets or credentials I haven't provided, ASK me — do not invent values or leave raw placeholders. After adding it, tell me how to verify the server is connected.