# ⚡ LaunchStack
Production-ready project scaffolding for Node.js developers.
LaunchStack is an interactive CLI that generates fully structured, dependency-ready projects in seconds. It detects your environment, asks the right questions (or lets AI answer them for you), and outputs a project wired with the exact stack you chose — with no pinned outdated dependencies.
## Why LaunchStack?
Starting new projects usually means:
- manually creating folders
- installing and configuring dependencies
- wiring databases and authentication
- setting up Docker and environment variables
- writing repetitive boilerplate
LaunchStack removes that friction by generating a production-ready project structure in seconds. Instead of spending time on setup, developers can start building immediately.
## Installation

```bash
# Run without installing
npx launchstack

# Or install globally
npm install -g launchstack
```

Requires Node.js v20 or higher.
## Quick Start

```bash
npx launchstack init my-app
```

LaunchStack will guide you through:
- Selecting a project type
- Choosing your stack (framework, database, ORM, auth, modules)
- Optionally describing your app in plain English for AI-assisted configuration
- Installing the latest stable dependencies automatically
## How LaunchStack Works

```text
Developer
   │
   ▼
launchstack init
   │
   ▼
Environment Detection
(Node • Docker • Redis • Git • Ollama)
   │
   ▼
Interactive Wizard or AI Description
   │
   ▼
Stack Configuration
(Framework • DB • ORM • Auth • Modules)
   │
   ▼
Template Generator
   │
   ▼
Dependency Installer
(latest stable versions)
   │
   ▼
Production-ready project
```

## Commands
### `launchstack init [name]`

Launch the full interactive wizard. Optionally pass a project name.

```bash
launchstack init
launchstack init my-saas-app
launchstack init my-api --no-install
```

### `launchstack create <type>`
Quickly scaffold a project by type without going through the full wizard.
```bash
launchstack create backend
launchstack create frontend
launchstack create fullstack
launchstack create microservice
launchstack create worker
launchstack create cli
```

### `launchstack add <module>`
Add a module to an existing LaunchStack project.
```bash
launchstack add redis
launchstack add logging
launchstack add email
launchstack add rate-limit
launchstack add queue
launchstack add file-storage
launchstack add payments
launchstack add swagger
launchstack add validation
launchstack add socket
launchstack add users
```

Run this from inside a project created with LaunchStack (requires `launchstack.json`).
### `launchstack doctor`
Check your environment and scan all template dependencies for freshness.
```bash
launchstack doctor
```

Output example:

```text
✔ Node detected
✔ npm detected
✔ Git detected
✔ Docker detected
✖ Redis not detected
✖ Ollama not detected

Checking template dependencies...
✔ express v5.0.0
✔ fastify v5.2.0
✔ prisma v6.3.1
⚠ mongoose — last updated >12 months ago
```

### `launchstack templates update`
Pull the latest templates from the remote registry.
```bash
launchstack templates update
```

### `launchstack templates list`

List all available template categories and variants.

```bash
launchstack templates list
```

Output example:
```text
Available templates:
backend/
  express
  fastify
  nest
  django
frontend/
  next
  react-vite
  vue
  svelte
  alpine
  static
modules/
  logging
  rate-limit
  email
  redis
  queue
  payments
  swagger
  validation
  socket
  users
docker/
  node
  python
```

## Project Types
| Type | Description |
|---|---|
| `backend` | API server, SaaS backend, GraphQL API, AI server |
| `frontend` | Next.js, React (Vite), Vue, Svelte, Alpine.js, Static |
| `fullstack` | Combined frontend + backend in a monorepo |
| `microservice` | Lightweight service with optional framework |
| `worker` | Background job processor |
| `cli` | Node.js CLI tool |
## Supported Stack

### Backend Frameworks

- Node: Express, Fastify, NestJS
- Python: Django, Flask
### Frontend Frameworks
- Next.js
- React (Vite)
- Vue (Vite)
- Svelte
- Alpine.js (for lightweight interactive UIs)
- Static HTML
### Databases

PostgreSQL, MySQL, MongoDB, SQLite

### ORMs

Prisma, Drizzle, Sequelize, Mongoose

### Authentication

JWT, OAuth (Passport)

### Optional Modules

Rate limiting, Logging (Winston), Email (Nodemailer), Redis (ioredis), Background Jobs (BullMQ), File Storage (AWS S3), Swagger/OpenAPI, Zod Validation, Socket.IO, User CRUD + RBAC

### Payments

Stripe, PayPal, Paystack, Flutterwave
## AI-Assisted Mode
LaunchStack can automatically analyze your app description and suggest a stack configuration. To enable AI mode, you must have at least one supported AI provider available.
### Enable AI Mode

LaunchStack supports multiple AI providers. It will automatically use the first available provider in this order:

1. **GitHub Models** (recommended) – fast, free tier available, no local setup required
2. **Ollama** – local AI, no API key required, but slower
3. **Manual prompts** – fallback when no AI provider is available
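The selection order above is a simple fallback chain. A minimal sketch of that logic — the function name and inputs are illustrative, not LaunchStack's actual API:

```javascript
// Pick the first available AI provider, mirroring the order described above.
// `env` and `ollamaReachable` are injected so the decision is easy to test.
function pickAiProvider({ env = process.env, ollamaReachable = false } = {}) {
  if (env.GITHUB_TOKEN) return 'github-models'; // fastest, needs a token
  if (ollamaReachable) return 'ollama';         // local, slower, works offline
  return 'manual';                              // interactive prompts fallback
}
```

In practice `ollamaReachable` would come from probing the local Ollama service during the environment check.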
#### Option 1: GitHub Models (Recommended)
GitHub Models is fast and free to use with a personal access token. Tokens are scoped to your session and can be revoked anytime from GitHub — do not store them permanently in your shell profile unless you understand the risk.
**Step 1 — Create a token:**
Go to github.com/settings/tokens → Generate new token (classic) → no special scopes required → set an expiry that suits you.
**Step 2 — Set the token for your current terminal session:**
This is the recommended approach — it lasts only for the session and never touches your disk:
Mac/Linux (VSCode terminal or any shell session):

```bash
export GITHUB_TOKEN=your_token_here
```

Then run LaunchStack immediately in the same terminal window.
Windows (PowerShell session):

```powershell
$env:GITHUB_TOKEN="your_token_here"
```

Optional — Make it permanent (only if you trust your machine is secure):
Mac/Linux:

```bash
echo 'export GITHUB_TOKEN=your_token_here' >> ~/.zshrc
source ~/.zshrc
```

Windows: Add via System Properties → Environment Variables.
Tokens expire. If AI mode stops working, generate a new token and re-export it.
#### Option 2: Ollama (Local AI)
Ollama runs AI models locally with no API key. It is slower than GitHub Models and requires downloading a model (~4GB+), but works fully offline.
Install Ollama:
Mac (Homebrew):

```bash
brew install ollama
```

Mac (MacPorts):

```bash
sudo port install ollama
```

Linux:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Windows: Download and run the installer from ollama.com
Start the service and pull a model:

```bash
ollama serve
ollama run llama3
```

After installation, LaunchStack will automatically detect Ollama during the environment check.
If no AI provider is detected, LaunchStack will fall back to the manual interactive prompts.
Example:

```text
Describe your app in one sentence:
> A SaaS for selling digital downloads with Stripe and user accounts
```

AI suggested configuration:

```json
{
  "projectType": "backend",
  "backendFramework": "express",
  "database": "postgresql",
  "orm": "prisma",
  "auth": "jwt",
  "modules": ["email", "redis"],
  "payments": "stripe"
}
```

## Dependency Philosophy
LaunchStack never pins outdated versions. Dependencies are always installed at their latest stable versions at the time you run the CLI:
```bash
# LaunchStack runs this — no version pinning
npm install express prisma @prisma/client jsonwebtoken
```

Before installation, LaunchStack:

- Resolves the latest version of each package from the npm registry for display
- Runs a freshness guard that warns if any package hasn't been updated in over 12 months
- Checks for and blocks deprecated packages (e.g. `request`, `node-sass`)
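The freshness guard and the deprecated-package block are essentially date and set checks. A hedged sketch of that audit step — names and thresholds are illustrative, not LaunchStack's internals:

```javascript
// Packages LaunchStack refuses to install (example deny list from the docs).
const DEPRECATED = new Set(['request', 'node-sass']);
const TWELVE_MONTHS_MS = 365 * 24 * 60 * 60 * 1000;

// True when a package's last publish date is roughly more than 12 months ago.
function isStale(lastPublishIso, now = new Date()) {
  return now - new Date(lastPublishIso) > TWELVE_MONTHS_MS;
}

// Partition an install plan into packages to block, warn about, or install.
function auditPackages(packages, now = new Date()) {
  const result = { blocked: [], stale: [], ok: [] };
  for (const { name, lastPublish } of packages) {
    if (DEPRECATED.has(name)) result.blocked.push(name);
    else if (isStale(lastPublish, now)) result.stale.push(name);
    else result.ok.push(name);
  }
  return result;
}
```

The real CLI would feed `lastPublish` from npm registry metadata before running `npm install`.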
## Template System
Templates are automatically synced from the remote registry at https://github.com/Ennygabby01/launchstack-templates.
They are cached locally at `~/.launchstack/templates`. On each scaffold, LaunchStack:

- Checks if a local cache exists
- If online, pulls the latest templates (`git pull`)
- If offline, uses the local cache
- If no cache exists and the registry is unreachable, falls back to the bundled templates included with the CLI
Templates are maintained in a separate repository and can be updated independently of CLI releases — no CLI update needed to get the latest template improvements.
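The cache-or-pull decision above is a small piece of branching logic. A sketch under the assumption that connectivity and cache state are probed separately (the function name and return values are illustrative):

```javascript
// Decide where templates come from, per the rules described above:
// online → refresh via git; offline with a cache → use the cache;
// offline with no cache → fall back to the templates bundled with the CLI.
// (Assumption: when online with no cache yet, the registry is cloned fresh.)
function resolveTemplateSource({ cacheExists, online }) {
  if (online) return cacheExists ? 'git-pull' : 'clone';
  if (cacheExists) return 'local-cache';
  return 'bundled';
}
```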
```bash
# Manually update templates
launchstack templates update
```

## launchstack.json
Projects generated by LaunchStack include a small metadata file used by CLI commands such as launchstack add.
```json
{
  "projectType": "backend",
  "framework": "express",
  "database": "postgresql",
  "orm": "prisma",
  "modules": ["redis", "email"]
}
```

This file allows LaunchStack to understand the structure of the current project and safely apply additional modules later.
## Example Generated Backend Project
Express / Fastify:

```text
my-api/
  src/
    config/        ← env validation + config export
    routes/
      v1/          ← versioned API routes
    middleware/
  nodemon.json
  .env.example
  launchstack.json
  README.md
```

NestJS:
```text
my-api/
  src/
    config/          ← typed config factory
    common/
      guards/        ← auth guards placeholder
    modules/
      redis/         ← redis.module.ts + redis.service.ts
      queue/         ← queue.module.ts + queue.service.ts
      logging/       ← logging.module.ts + logger.service.ts
      rate-limit/    ← rate-limit.module.ts + rate-limit.middleware.ts
  prisma/
    schema.prisma    ← generated when Prisma is selected
  .env.example
  launchstack.json
  README.md
```

The exact structure depends on the selected framework, modules, and Docker configuration.
## Environment Variables
After scaffolding, copy .env.example to .env and fill in your values:
```bash
cp .env.example .env
```

Common variables generated based on your stack:
| Variable | When generated |
|---|---|
| `DATABASE_URL` | Any database selected |
| `SHADOW_DATABASE_URL` | PostgreSQL + Prisma |
| `JWT_SECRET` | JWT auth |
| `REDIS_URL` / `REDIS_PASSWORD` | Redis or queue modules |
| `LOG_LEVEL` | Logging module |
| `STRIPE_SECRET_KEY` / `STRIPE_WEBHOOK_SECRET` | Stripe payments |
| `STRIPE_PRICE_BASIC` / `STRIPE_PRICE_PRO` | Stripe payments |
| `SMTP_HOST` / `SMTP_USER` | Email module |
| `STORAGE_BUCKET` | File storage module |
| `SECRET_KEY` / `DEBUG` / `ALLOWED_HOSTS` | Django projects |
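The generated `config/` folder's "env validation" can be approximated by a fail-fast check on required variables at boot. A sketch — the required list here is an example for a JWT + Postgres + Redis stack, not LaunchStack's exact output:

```javascript
// Return the names of required variables that are missing or empty, so the
// app can fail fast at startup instead of crashing mid-request.
function missingEnvVars(env, required) {
  return required.filter((name) => !env[name] || env[name].trim() === '');
}

// Example required set for a JWT + PostgreSQL + Redis project (illustrative).
const REQUIRED = ['DATABASE_URL', 'JWT_SECRET', 'REDIS_URL'];
const missing = missingEnvVars(process.env, REQUIRED);
if (missing.length > 0) {
  console.error(`Missing required environment variables: ${missing.join(', ')}`);
  // A real config module would throw or exit(1) here.
}
```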
## Docker
If Docker is detected, LaunchStack will offer to generate:

- `Dockerfile` — multi-stage Node.js or Python build
- `docker-compose.yml` — app + database + Redis (based on your stack)
- `.dockerignore`
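For an Express + PostgreSQL + Redis selection, the generated compose file might look roughly like this. Service names, images, and ports are illustrative, not LaunchStack's exact output:

```yaml
services:
  app:
    build: .
    ports:
      - "3000:3000"
    env_file: .env
    depends_on:
      - db
      - redis
  db:
    image: postgres:16-alpine
    environment:
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - db-data:/var/lib/postgresql/data
  redis:
    image: redis:7-alpine
volumes:
  db-data:
```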
```bash
docker-compose up -d
```

## Changelog
See CHANGELOG.md for full version history.
## License
MIT © GBT3K