
    SerpClaw

    SEO regression testing for CI/CD pipelines. Catch noindex accidents, broken canonicals, and schema errors before Google does.



    The Problem

    One misplaced noindex tag. A broken canonical. A robots.txt blocking your checkout flow.
    These accidents ship to production daily — and nobody notices until rankings tank weeks later.

    The Fix

    npx serpclaw check https://staging.myapp.com

    SerpClaw runs 40+ SEO checks on any URL and returns clear PASS/FAIL/WARN results — designed to drop into your CI/CD pipeline in under 2 minutes.
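To make the idea concrete, here is a minimal sketch of the kind of indexability check involved, using only Python's standard library. The class and function names are illustrative, not SerpClaw's actual implementation:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())


def has_noindex(html: str) -> bool:
    """Return True if the page carries a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

A page that accidentally ships `<meta name="robots" content="noindex">` to production would fail a check like this, which is exactly the class of mistake the tool is built to catch before deploy.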


    Quick Start

    Requires Python 3.8+ with pip.

    # Install Python dependencies (one-time)
    pip install requests beautifulsoup4 lxml
    
    # Run a check
    npx serpclaw check https://mysite.com

    Output:

    SerpClaw v0.1.0 — https://mysite.com
    
    ── Indexability ──────────────────────────────
      ✔  noindex meta tag              PASS
      ✔  X-Robots-Tag header           PASS
      ✔  robots.txt disallows URL      PASS
    
    ── Canonical ──────────────────────────────
      ✖  Canonical tag                 FAIL   — No <link rel='canonical'> found
    
    ── Meta Tags ──────────────────────────────
      ✔  Title tag                     PASS   — "My Site" (18 chars)
      ⚠  Meta description              WARN   — 172 chars (max 160)
    
    ────────────────────────────────────────────────
    Results: 18 passed · 1 failed · 1 warning
    Status:  FAILED
    → 1 critical issue. Fix before deploying.
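The summary lines above follow a simple aggregation rule: any failed check flips the overall status to FAILED, while warnings alone still pass. A hedged sketch of that aggregation (illustrative logic only, not SerpClaw's source):

```python
from collections import Counter


def summarize(results: dict) -> tuple:
    """Aggregate per-check results ('PASS'/'FAIL'/'WARN') into an overall status.

    Any FAIL makes the run FAILED; warnings alone still pass.
    (Illustrative only; SerpClaw's real aggregation may differ.)
    """
    counts = Counter(results.values())
    status = "FAILED" if counts["FAIL"] > 0 else "PASSED"
    line = (f"Results: {counts['PASS']} passed · "
            f"{counts['FAIL']} failed · {counts['WARN']} warning(s)")
    return status, line
```

In `--ci` mode this is what drives the exit code: a FAILED status exits non-zero so the pipeline stops.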

    CLI Options

    # Basic check
    npx serpclaw check <url>
    
    # CI mode — exits with code 1 on any critical failure (use in pipelines)
    npx serpclaw check <url> --ci
    
    # JSON output (machine-readable, for downstream processing)
    npx serpclaw check <url> --json
    
    # Save current results as a baseline
    npx serpclaw check <url> --save-baseline ./baseline.json
    
    # Compare against baseline (detects regressions)
    npx serpclaw check <url> --baseline ./baseline.json --ci
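The baseline comparison boils down to diffing two snapshots of check results. The README does not document the baseline.json format, so the sketch below assumes a simple mapping of check name to status; the real file layout may differ:

```python
def find_regressions(baseline: dict, current: dict) -> list:
    """Return checks that passed in the baseline but no longer pass.

    Assumes both snapshots map check names to 'PASS'/'WARN'/'FAIL';
    this format is an assumption, not SerpClaw's documented schema.
    """
    return sorted(
        name for name, status in baseline.items()
        if status == "PASS" and current.get(name) != "PASS"
    )
```

A non-empty result here is a regression: something that was healthy at baseline time broke in the current build, which is what `--baseline ... --ci` is designed to surface.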

    GitHub Actions

    Add to .github/workflows/seo.yml:

    name: SerpClaw SEO Audit
    
    on: [push, pull_request]
    
    jobs:
      seo-check:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
    
          - name: Install Python deps
            run: pip install requests beautifulsoup4 lxml
    
          - name: Run SerpClaw
            run: npx serpclaw check ${{ secrets.STAGING_URL }} --ci

    Or use the GitHub Action directly:

          - uses: serpclaw/check@v1
            with:
              url: ${{ secrets.STAGING_URL }}
              fail_on: critical
              slack_webhook: ${{ secrets.SLACK_WEBHOOK }}

    What Gets Checked

    Category      Checks
    ------------  -----------------------------------------------------------------
    Indexability  noindex meta tag, X-Robots-Tag header, robots.txt disallow
    Canonical     Tag present, not empty, not pointing to wrong domain
    Meta Tags     Title (presence + length), description (presence + length), viewport, charset
    Content       H1 presence, single H1, H1 not empty, image alt text
    Schema        JSON-LD present, valid JSON, @type declared
    Open Graph    og:title, og:description, og:image
    Technical     HTTPS, HTTP→HTTPS redirect, HTTP status, redirect chains, hreflang
    Sitemap       sitemap.xml reachable, URL listed in sitemap, sitemap in robots.txt
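Several of the meta-tag checks reduce to length classification, like the 172-character description warning in the sample output. A sketch of that pattern, with illustrative thresholds (the exact limits SerpClaw uses are not documented here):

```python
def check_length(text: str, min_len: int, max_len: int) -> tuple:
    """Classify a tag's text length as PASS, WARN, or FAIL.

    Thresholds are illustrative; e.g. a common meta-description
    convention is roughly 50-160 characters.
    """
    n = len(text)
    if n == 0:
        return "FAIL", "missing"
    if n < min_len or n > max_len:
        return "WARN", f"{n} chars (expected {min_len}-{max_len})"
    return "PASS", f"{n} chars"
```

Under these assumed thresholds, the 172-character description from the sample output would come back as a WARN rather than a hard FAIL, matching the severity shown above.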

    Requirements

    • Node.js 16+
    • Python 3.8+ (python3 or python in PATH)
    • pip packages: requests beautifulsoup4 lxml

    Install Python deps:

    pip install requests beautifulsoup4 lxml


    License

    MIT © SerpClaw