Package Exports
- @waldheimdev/astro-ai-llms-txt
- @waldheimdev/astro-ai-llms-txt/dist/index.js
This package does not declare an `exports` field, so the exports above were automatically detected and optimized by JSPM. If a package subpath is missing, it is recommended to open an issue on the original package (@waldheimdev/astro-ai-llms-txt) asking it to support the `exports` field. If that is not possible, create a JSPM override to customize the exports field for this package.
Readme
@waldheimdev/astro-ai-llms-txt
✨ Astro Integration: llms.txt Generator ✨
This plugin magically creates an AI-optimized llms.txt in your build output on every Astro build!
Perfect for SEO, AI crawlers, and anyone who loves content.
Features
- Extracts title, description, H1, H2, H3, and all <p> texts from HTML
- AI-powered summarization via Ollama, OpenAI, or Gemini (provider/model/key/endpoint configurable)
- AI response caching (SHA-256, .llms-txt-cache in dist)
- Groups entries by root web section (e.g. /blog/, /services/)
- Robust path normalization (OS-independent)
- Debug logging, error detection, build abort on errors
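The caching step can be illustrated with a short sketch. This is illustrative only, not the package's actual code: `cacheKey` is a hypothetical name. The idea is that each AI response is keyed by the SHA-256 hash of the input text, so an unchanged page reuses its cached summary on the next build.

```javascript
// Hypothetical sketch of SHA-256 response caching, using Node's built-in
// crypto module; the real cache lives in .llms-txt-cache inside dist.
import { createHash } from 'node:crypto';

// Derive a stable cache file name from the text sent to the AI provider.
function cacheKey(text) {
  return createHash('sha256').update(text, 'utf8').digest('hex');
}
```

Because the same input always yields the same key, a rebuild with unchanged content can skip the AI call and read the cached summary instead.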
Installation
# npm
npm install @waldheimdev/astro-ai-llms-txt
# pnpm
pnpm add @waldheimdev/astro-ai-llms-txt
# yarn
yarn add @waldheimdev/astro-ai-llms-txt
# bun
bun add @waldheimdev/astro-ai-llms-txt
Usage
Add the plugin to your astro.config.mjs and let the magic begin:
import llmsTxt from '@waldheimdev/astro-ai-llms-txt';
export default {
integrations: [
llmsTxt({
      projectName: 'My Project',
      description: 'AI-optimized overview for LLMs.',
aiProvider: 'ollama', // 'openai' | 'gemini' | 'ollama'
aiApiKey: '', // API key for OpenAI/Gemini
aiModel: 'llama3', // Model name for provider
site: 'https://my-domain.com', // Base URL for links
maxInputLength: 8000, // Optional: max length for AI input
}),
],
};
All Options
| Option | Type | Default | Description |
|---|---|---|---|
| projectName | string | 'My Project' | Name for the llms.txt header |
| description | string | 'AI-optimized overview for LLMs.' | Description for the llms.txt header |
| aiProvider | string | 'ollama' | AI provider: 'ollama', 'openai', or 'gemini' |
| aiApiKey | string | '' | API key for OpenAI or Gemini (not needed for Ollama) |
| aiModel | string | 'llama3' | Model name for the selected provider |
| aiUrl | string | '' | Custom endpoint for Ollama (optional) |
| site | string | '' | Base URL for links in llms.txt |
| maxInputLength | number | 8000 | Maximum input length for AI summarization |
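For example, switching from the default Ollama to OpenAI only changes a few options. This is a hedged sketch: the model name and environment variable are placeholders, not values prescribed by the package.

```javascript
// astro.config.mjs – OpenAI variant of the earlier example.
import llmsTxt from '@waldheimdev/astro-ai-llms-txt';

export default {
  integrations: [
    llmsTxt({
      projectName: 'My Project',
      aiProvider: 'openai',
      aiApiKey: process.env.OPENAI_API_KEY, // required for OpenAI/Gemini
      aiModel: 'gpt-4o-mini',               // placeholder: any model your account supports
      site: 'https://my-domain.com',
    }),
  ],
};
```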
Output
After every Astro build you'll find in dist/:
- llms.txt – Your AI-optimized overview of all pages
- .llms-txt-cache/ – Cache for AI responses
Extending
- Want more AI providers? Just add them in src/aiProvider.ts!
- Tests & coverage: npm test
- Linting: npm run lint
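The provider contract in src/aiProvider.ts is not documented here, so the following is only a rough idea of what a new provider could look like: a function from extracted page text (plus options) to a summary string. The name and signature are assumptions, not the package's actual interface.

```javascript
// Hypothetical provider shape – NOT the package's real interface.
// A real provider would be async and call Ollama/OpenAI/Gemini over HTTP;
// this stub just truncates the input so the example stays self-contained.
function echoProvider(text, options) {
  const max = options.maxInputLength ?? 8000; // mirrors the plugin option
  return text.slice(0, Math.min(max, 100));
}
```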
Example llms.txt
# My Project
> AI-optimized overview for LLMs.
## Blog
- [/blog/post-1]: Post title summary...
- [/blog/post-2]: Post title summary...
## Services
- [/services/web]: Web service summary...

Note: llms.txt complements existing standards like robots.txt and sitemap.xml, providing LLMs with a curated, AI-optimized overview. Learn more at llmstxt.org
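The generated entry lines follow a simple `- [path]: summary` pattern, so downstream tooling can pick them apart with a few lines of code. This is a hypothetical consumer-side helper, not something shipped with the package:

```javascript
// Hypothetical helper: turn llms.txt entry lines into { path, summary }
// objects, skipping headings and blank lines.
function parseEntries(llmsTxt) {
  const entries = [];
  for (const line of llmsTxt.split('\n')) {
    const m = line.match(/^- \[([^\]]+)\]: (.+)$/);
    if (m) entries.push({ path: m[1], summary: m[2] });
  }
  return entries;
}
```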
Made with ❤️ for Astro & AI enthusiasts!
Need help or find a bug?
Please open an issue on GitLab or submit a merge request. We appreciate your feedback!
Or contact me via my website. Need my services? Let's talk!