ax-init
Generate AI Agent Experience (AX) files — companion to ax-audit.
ax-audit tells you what's missing. ax-init generates it.
Usage
```
npx ax-init
```

CLI flags

```
npx ax-init                    # Interactive mode
npx ax-init --config ax.json   # Non-interactive mode
npx ax-init --help             # Show help
npx ax-init --version          # Show version
```

Non-interactive mode
Create an ax.json config file and run without prompts — useful for CI/CD:
```json
{
  "url": "https://example.com",
  "name": "My Site",
  "type": "business",
  "description": "A great website",
  "contactName": "Acme Inc",
  "contactEmail": "hello@example.com",
  "languages": ["en"],
  "crawlerPolicy": "allow",
  "outputDir": "./public",
  "generators": ["llms-txt", "robots-txt", "agent-json", "mcp-json",
    "security-txt", "structured-data", "meta-tags", "http-headers"]
}
```

```
npx ax-init --config ax.json
```

Generated files
The interactive CLI generates:

| File | Description |
|---|---|
| `llms.txt` | LLM-readable site description (llmstxt.org spec) |
| `robots.txt` | AI crawler allow/block rules for 29+ known crawlers |
| `.well-known/agent.json` | A2A Agent Card for protocol compliance |
| `.well-known/mcp.json` | MCP server configuration for AI agent discovery |
| `.well-known/security.txt` | RFC 9116 security contact file |
| `openapi.yaml` | OpenAPI 3.0 stub (API-type sites only) |
| JSON-LD | Structured data `<script>` tag for `<head>` |
| AI Meta Tags | `<meta>` and `<link>` tags for `<head>` |
| HTTP Headers | Server config snippets for Nginx, Apache, Vercel, Netlify |
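As a rough sketch of how the generator IDs used in ax.json might line up with the files in the table above (the map, function name, and the split between file and snippet generators are illustrative assumptions, not ax-init's internal API):

```typescript
// Hypothetical mapping from ax.json generator IDs to output files.
// IDs come from the ax.json example; everything else is illustrative.
const FILE_GENERATORS: Record<string, string> = {
  "llms-txt": "llms.txt",
  "robots-txt": "robots.txt",
  "agent-json": ".well-known/agent.json",
  "mcp-json": ".well-known/mcp.json",
  "security-txt": ".well-known/security.txt",
};

// These generators print console snippets instead of writing files
// (assumed, based on the table above).
const SNIPPET_GENERATORS = new Set(["structured-data", "meta-tags", "http-headers"]);

function outputPathFor(generator: string, outputDir = "./public"): string | null {
  const file = FILE_GENERATORS[generator];
  return file ? `${outputDir}/${file}` : null; // null → snippet-only generator
}
```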
How it works
- Answer questions about your site (URL, name, type, description, contact, languages, crawler policy)
- Select which files to generate
- Files are written to your output directory; snippets are printed to the console
- Run `npx ax-audit` to verify your score
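In CI/CD, the flow above can be driven without prompts. A minimal sketch, assuming the ax.json schema shown earlier (the helper name is hypothetical; only the CLI flags come from the README):

```typescript
import { writeFileSync } from "node:fs";
import { execSync } from "node:child_process";

// Config fields follow the ax.json example above.
const config = {
  url: "https://example.com",
  name: "My Site",
  type: "business",
  description: "A great website",
  outputDir: "./public",
  generators: ["llms-txt", "robots-txt"],
};

// Hypothetical CI helper: write the config, then run ax-init non-interactively.
function writeConfigAndRun(path = "ax.json"): void {
  writeFileSync(path, JSON.stringify(config, null, 2));
  execSync(`npx ax-init --config ${path}`, { stdio: "inherit" });
}
```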
Example
```
$ npx ax-init

ax-init v1.1.0 — Generate AI Agent Experience files

Site URL: https://example.com
Site name: Example
Site type: Personal
Brief description: Personal portfolio and blog
Your name: John Doe
Contact email: john@example.com
Languages: en
AI crawler policy: Allow
Output directory: ./public
Files to generate: all

✓ public/llms.txt
✓ public/robots.txt
✓ public/.well-known/agent.json
✓ public/.well-known/mcp.json
✓ public/.well-known/security.txt

5 files written

Snippets — copy to your config:

── Structured Data (JSON-LD) ──
<script type="application/ld+json">
...
</script>

── AI Meta Tags ──
<meta name="ai:site" content="Example">
...

── HTTP Headers ──
# Nginx / Apache / Vercel / Netlify configs
...
──────────────────────────────────────

Verify your score: npx ax-audit https://example.com
Show only issues:  npx ax-audit https://example.com --only-failures
```

Supported site types
- Personal — generates `Person` schema, personal llms.txt
- Business — generates `Organization` schema, corporate llms.txt
- API / Developer Tool — generates `SoftwareApplication` schema, API-focused llms.txt, OpenAPI stub
- Blog — generates `Blog` schema, content-focused llms.txt
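The site-type-to-schema mapping above could be sketched as follows (the map name and the `WebSite` fallback are assumptions, not ax-init internals; the schema.org types come from the list):

```typescript
// Illustrative lookup from site type to schema.org @type,
// per the "Supported site types" list above.
const SCHEMA_FOR_TYPE: Record<string, string> = {
  personal: "Person",
  business: "Organization",
  api: "SoftwareApplication",
  blog: "Blog",
};

function schemaFor(siteType: string): string {
  // Fallback is an assumption; ax-init may handle unknown types differently.
  return SCHEMA_FOR_TYPE[siteType] ?? "WebSite";
}
```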
AI crawlers configured
robots.txt rules cover 29 known AI crawlers including:
GPTBot, ChatGPT-User, OAI-SearchBot, ClaudeBot, Claude-Web, anthropic-ai, Google-Extended, Gemini, Amazonbot, Grok, xAI-Bot, DeepSeekBot, Meta-ExternalAgent, Meta-ExternalFetcher, PerplexityBot, and more.
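As an illustration of what allow/block rules for these crawlers can look like, here is a hedged sketch (the crawler names come from the list above; the function is not ax-init's actual generator, and the real robots.txt output may differ):

```typescript
// A few user agents from the list above (the real tool covers 29).
const AI_CRAWLERS = ["GPTBot", "ChatGPT-User", "ClaudeBot", "Google-Extended", "PerplexityBot"];

// Emit one User-agent block per crawler with a blanket allow or block rule.
function robotsRules(policy: "allow" | "block", crawlers: string[] = AI_CRAWLERS): string {
  const directive = policy === "allow" ? "Allow: /" : "Disallow: /";
  return crawlers.map((ua) => `User-agent: ${ua}\n${directive}`).join("\n\n");
}
```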
Requirements
- Node.js 18+
Related
- ax-audit — Lighthouse for AI Agents. Audit your AX score.
License
Apache 2.0