Pagesight
See your site the way search engines and AI see it.
npm install pagesight

Most SEO tools flag "title over 60 characters" and "only one H1 allowed." Google's own engineers say those rules don't exist. Pagesight skips the myths and goes to the sources.
Tools
inspect
Ask Google: is this page indexed? What canonical did you choose? Any crawl errors? Structured data issues?
Returns index status, canonical (yours vs Google's), crawl status, rich results validation, sitemaps, and referring URLs.
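Under the hood this maps to Google's URL Inspection API. As an illustration only (the field names follow the public API; the helper itself is hypothetical, not pagesight's code), the request body can be built like this:

```typescript
// Request body for POST
// https://searchconsole.googleapis.com/v1/urlInspection/index:inspect
// Field names follow the public URL Inspection API.
interface InspectRequest {
  inspectionUrl: string; // the page to inspect
  siteUrl: string; // the Search Console property that owns it
}

function buildInspectRequest(pageUrl: string, property: string): InspectRequest {
  return { inspectionUrl: pageUrl, siteUrl: property };
}
```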
pagespeed
Run Google Lighthouse on any URL:
- Scores: performance, accessibility, best-practices, seo
- Core Web Vitals (lab): FCP, LCP, TBT, CLS, Speed Index, TTI
- CrUX field data: real Chrome user metrics (page + origin)
- Opportunities: ranked by severity with potential savings
- Strategy: mobile or desktop
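A request like this maps to the PageSpeed Insights v5 API. The endpoint and parameter names below come from the public API; the helper itself is a hypothetical sketch, not pagesight's implementation:

```typescript
// Build a PageSpeed Insights v5 request URL
// (GET https://www.googleapis.com/pagespeedonline/v5/runPagespeed).
const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

function buildPsiUrl(
  url: string,
  strategy: "mobile" | "desktop" = "mobile",
  apiKey = "YOUR_API_KEY",
): string {
  const params = new URLSearchParams({ url, strategy, key: apiKey });
  // Request the four score categories listed above.
  for (const c of ["PERFORMANCE", "ACCESSIBILITY", "BEST_PRACTICES", "SEO"]) {
    params.append("category", c);
  }
  return `${PSI_ENDPOINT}?${params}`;
}
```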
crux
Real-world Core Web Vitals from Chrome users (28-day rolling window):
- Metrics: LCP, FCP, INP, CLS, TTFB, RTT, navigation types, form factors
- Granularity: by URL or origin, by device (DESKTOP, PHONE, TABLET)
- Data: p75 values + histogram distributions
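These queries go to the CrUX API's `records:queryRecord` endpoint. The body shape below follows the public API (exactly one of `url` or `origin`, optional `formFactor` and `metrics`); the helper is illustrative, not pagesight's code:

```typescript
// Build a request body for POST
// https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=API_KEY
type FormFactor = "DESKTOP" | "PHONE" | "TABLET";

function buildCruxQuery(
  target: { url: string } | { origin: string },
  formFactor?: FormFactor,
  metrics?: string[],
): object {
  return {
    ...target, // exactly one of url / origin
    ...(formFactor ? { formFactor } : {}), // omit to aggregate all devices
    ...(metrics ? { metrics } : {}), // e.g. ["largest_contentful_paint"]
  };
}
```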
crux_history
Core Web Vitals trends over time — up to 40 weekly data points (~10 months):
- Trend detection (improved/stable/worse) with percentage change
- Recent data points table for core metrics
- Custom period count (1-40)
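Trend classification can be as simple as comparing the oldest and newest weekly p75 values. A minimal sketch, assuming a ±5% threshold (the threshold is invented for illustration, not pagesight's actual rule):

```typescript
// Classify a metric trend from weekly p75 samples, oldest first.
// For timing metrics (LCP, TTFB, ...), lower is better.
function classifyTrend(
  p75s: number[],
  thresholdPct = 5, // assumption for illustration
): { trend: "improved" | "stable" | "worse"; changePct: number } {
  const first = p75s[0];
  const last = p75s[p75s.length - 1];
  const changePct = ((last - first) / first) * 100;
  if (changePct <= -thresholdPct) return { trend: "improved", changePct };
  if (changePct >= thresholdPct) return { trend: "worse", changePct };
  return { trend: "stable", changePct };
}
```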
performance
Google Search Console search analytics:
- Dimensions: query, page, country, device, date, searchAppearance, hour
- Search types: web, image, video, news, discover, googleNews
- Filters: equals, contains, notEquals, notContains, includingRegex, excludingRegex
- Pagination: up to 25,000 rows
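These options map onto the Search Console `searchAnalytics.query` request body (v3 API). As a sketch only (field names follow the public API; the helper and its defaults are illustrative, not pagesight's code):

```typescript
// Build a request body for the Search Console searchAnalytics.query endpoint
// (POST .../webmasters/v3/sites/{siteUrl}/searchAnalytics/query).
function buildSearchAnalyticsQuery(opts: {
  startDate: string; // YYYY-MM-DD
  endDate: string;
  dimensions?: string[]; // e.g. ["query", "page"]
  searchType?: string; // e.g. "web", "discover"
  rowLimit?: number; // the API caps this at 25,000
  startRow?: number; // pagination offset
}): object {
  return {
    startDate: opts.startDate,
    endDate: opts.endDate,
    dimensions: opts.dimensions ?? ["query"],
    type: opts.searchType ?? "web",
    rowLimit: Math.min(opts.rowLimit ?? 1000, 25_000),
    startRow: opts.startRow ?? 0,
  };
}
```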
robots
Analyze any site's robots.txt:
- Syntax validation per RFC 9309
- AI crawler audit — 139+ bots from the ai-robots-txt community registry
- Bot categories: training scrapers, AI search crawlers, AI assistants, AI agents
- Per-bot status: blocked or allowed, with the matched rule
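The per-bot check follows RFC 9309 precedence: the longest matching path rule wins, with ties going to Allow. A minimal sketch of that matching (no wildcard support; a hypothetical helper, not pagesight's implementation):

```typescript
// Decide whether a path is blocked for a bot, given the rules from the
// robots.txt group that applies to it. RFC 9309: longest match wins,
// ties favor Allow, and no matching rule means allowed by default.
interface Rule {
  type: "allow" | "disallow";
  path: string;
}

function checkPath(rules: Rule[], path: string): { allowed: boolean; matched?: Rule } {
  let best: Rule | undefined;
  for (const rule of rules) {
    if (!path.startsWith(rule.path)) continue; // prefix match only in this sketch
    if (
      !best ||
      rule.path.length > best.path.length ||
      (rule.path.length === best.path.length && rule.type === "allow")
    ) {
      best = rule;
    }
  }
  return { allowed: !best || best.type === "allow", matched: best };
}
```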
=== robots.txt: https://www.nytimes.com ===
AI Crawlers: 35 blocked, 104 allowed (of 139 known)
BLOCKED GPTBot (OpenAI) — GPT model training
BLOCKED ClaudeBot (Anthropic) — Claude model training
ALLOWED Claude-User (Anthropic) — User-initiated fetching
sitemaps
Search Console properties and sitemaps (read-only):
- list_sites — all properties with permission level
- get_site — details for a specific property
- list_sitemaps — sitemaps with error/warning counts
- get_sitemap — full details for a specific sitemap
setup
Check auth status or walk through OAuth interactively.
Setup
1. Google Cloud project
- Go to Google Cloud Console
- Create a project (or use existing)
- Enable: Search Console API, PageSpeed Insights API, Chrome UX Report API
- Create OAuth client ID (Desktop app) — for Search Console
- Create API key — for PageSpeed and CrUX
2. Configure
GSC_CLIENT_ID=your-client-id.apps.googleusercontent.com
GSC_CLIENT_SECRET=your-client-secret
GSC_REFRESH_TOKEN=your-refresh-token
GOOGLE_API_KEY=your-api-key

The robots tool works without any credentials.
3. Use with your AI assistant
{
"mcpServers": {
"pagesight": {
"command": "bun",
"args": ["run", "/path/to/pagesight/src/index.ts"],
"env": {
"GSC_CLIENT_ID": "your-client-id",
"GSC_CLIENT_SECRET": "your-secret",
"GSC_REFRESH_TOKEN": "your-token",
"GOOGLE_API_KEY": "your-api-key"
}
}
}
}

Then just ask:
"Is https://mysite.com indexed?"
"Run pagespeed on my homepage"
"Which AI crawlers can access my site?"
"How have my Core Web Vitals changed?"
"Which queries bring traffic to this page?"Why not other SEO tools?
We checked every common SEO "rule" against official Google documentation:
- "Title must be under 60 characters" — Google: "there's no limit." Gary Illyes: "an externally made-up metric."
- "Meta description must be 155 characters" — Google: "there's no limit on how long a meta description can be."
- "Only one H1 per page" — John Mueller: "You can use H1 tags as often as you want. There's no limit."
- "Minimum 300 words per page" — Mueller: "the number of words on a page is not a quality factor."
- "Text-to-HTML ratio matters" — Mueller: "it makes absolutely no sense at all for SEO."
Tools that flag these are reporting their opinions. Pagesight only reports what the sources actually return.
Development
bun install
bun run start # start server
bun run lint # biome check
bun run format # biome format

License
MIT