Package Exports
- ax-init
- ax-init/dist/index.js
This package does not declare an `exports` field, so the exports above were detected and optimized automatically by JSPM. If a package subpath is missing, consider filing an issue with the original package (ax-init) asking it to support the `exports` field. If that is not possible, create a JSPM override to customize the exports field for this package.
Readme
ax-init
Generate AI Agent Experience (AX) files — companion to ax-audit.
ax-audit tells you what's missing. ax-init generates it.
Usage
```
npx ax-init
```

Interactive CLI that generates:
| File | Description |
|---|---|
| `llms.txt` | LLM-readable site description (llmstxt.org spec) |
| `robots.txt` | AI crawler allow/block rules for 22+ known crawlers |
| `.well-known/agent.json` | A2A Agent Card for protocol compliance |
| `.well-known/security.txt` | RFC 9116 security contact file |
| JSON-LD | Structured data `<script>` tag for `<head>` |
| AI Meta Tags | `<meta>` and `<link>` tags for `<head>` |
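To illustrate the first file in the table: per the llmstxt.org spec, llms.txt is a Markdown file with an H1 title, a blockquote summary, and H2 sections listing links. A generated file might look roughly like this (the contents are illustrative, not the tool's exact template):

```markdown
# Example

> Personal portfolio and blog by John Doe.

## Pages

- [About](https://example.com/about): Background and contact details
- [Blog](https://example.com/blog): Posts and articles
```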
How it works
- Answer 9 questions about your site (URL, name, type, description, contact, languages, crawler policy)
- Select which files to generate
- Files are written to your output directory; HTML snippets are printed to the console
- Run `npx ax-audit` to verify your score
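Among the generated files, `.well-known/agent.json` is an A2A Agent Card that identifies the site to agent-to-agent clients. A minimal card might look like the following sketch; the field values here are illustrative assumptions, so consult the A2A specification and the tool's actual output for the authoritative shape:

```json
{
  "name": "Example",
  "description": "Personal portfolio and blog",
  "url": "https://example.com",
  "version": "1.0.0",
  "capabilities": {
    "streaming": false,
    "pushNotifications": false
  },
  "defaultInputModes": ["text"],
  "defaultOutputModes": ["text"],
  "skills": []
}
```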
Example
```
$ npx ax-init

ax-init — Generate AI Agent Experience files

Site URL: https://example.com
Site name: Example
Site type: Personal
Brief description: Personal portfolio and blog
Your name: John Doe
Contact email: john@example.com
Languages: en
AI crawler policy: Allow
Output directory: ./public
Files to generate: all

✓ public/llms.txt
✓ public/robots.txt
✓ public/.well-known/agent.json
✓ public/.well-known/security.txt

4 files written

HTML snippets — copy to your <head>:

── Structured Data (JSON-LD) ──
<script type="application/ld+json">
...
</script>

── AI Meta Tags ──
<meta name="ai:site" content="Example">
...
──────────────────────────────────────

Verify your score: npx ax-audit https://example.com
```

Supported site types
- Personal — generates `Person` schema, personal llms.txt
- Business — generates `Organization` schema, corporate llms.txt
- API / Developer Tool — generates `SoftwareApplication` schema, API-focused llms.txt
- Blog — generates `Blog` schema, content-focused llms.txt
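For the Personal type, the JSON-LD `<script>` snippet printed to the console carries a schema.org `Person`. A sketch, using values from the example session above (the tool's exact fields may differ):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "John Doe",
  "url": "https://example.com",
  "email": "mailto:john@example.com"
}
</script>
```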
AI crawlers configured
robots.txt rules cover 22 known AI crawlers including:
GPTBot, ChatGPT-User, ClaudeBot, Claude-Web, Google-Extended, Amazonbot, Bytespider, CCBot, PerplexityBot, YouBot, Cohere-ai, anthropic-ai, Meta-ExternalAgent, OAI-SearchBot, and more.
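A robots.txt generated with an Allow policy would typically still list each crawler explicitly, so any individual bot can later be switched to Disallow. A sketch of what the rules might look like (abbreviated; the real output covers all configured crawlers):

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# ...one block per crawler; change Allow to Disallow to block a bot

User-agent: *
Allow: /
```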
Requirements
- Node.js 18+
Related
- ax-audit — Lighthouse for AI Agents. Audit your AX score.
License
Apache 2.0