Package Exports
- compose-env
- compose-env/next
- compose-env/testing
compose-env
One schema, every config source. Load from .env, environment variables, CLI arguments, JSON/YAML files, AWS SSM, HashiCorp Vault, Azure Key Vault, or GCP Secret Manager - fully typed, validated, and documented.
Table of Contents
- Features
- Installation
- Quick Start
- Schema Field Types
- Base Field Options
- Sources
- Source Priority
- defineConfig Options
- TypeScript Inference
- Security Model
- Diagnostics
- CLI
- Watch Mode
- Next.js Adapter
- Testing Helpers
- Plugin Interface
- Cloud Adapters
- ESLint Plugin
Features
- Single schema - define every config variable once; all sources read from it
- Full TypeScript inference - config.PORT is typed number, config.DEBUG is boolean, etc.
- Rich field types - string, number, boolean, enum, JSON, array, url, port, email, custom
- Multiple sources - env vars, .env files (with interpolation), CLI args, JSON, YAML, cloud providers
- Secret protection - secret: true fields are redacted in all console output; access via Proxy throws on accidental serialisation
- Diagnostics table - pretty-printed table at startup showing each key, type, value, and source
- CLI tools - generate .env.example, validate config, export Docker env files
- Watch mode - auto-reload config when .env files change in development
- Framework support - Next.js adapter for build-time validation and public var injection
- Cloud adapters - separate packages for AWS SSM, Vault, Azure Key Vault, GCP Secret Manager
- ESLint plugin - catch typos in config.KEY access at lint time
Installation
npm install compose-env
Requires Node.js 18+. No runtime dependencies are required - cloud adapters install peer deps only when needed.
Quick Start
// config.ts
import { defineConfig, source } from 'compose-env'
const config = await defineConfig(
{
PORT: { type: 'port', required: true, description: 'HTTP server port' },
DATABASE_URL: { type: 'url', required: true, secret: true },
NODE_ENV: { type: 'enum', values: ['development', 'production', 'test'], default: 'development' },
LOG_LEVEL: { type: 'enum', values: ['debug', 'info', 'warn', 'error'], default: 'info' },
REDIS_TTL: { type: 'number', default: 3600, min: 0 },
FEATURE_FLAGS: { type: 'array', default: [] },
ALLOWED_HOSTS: { type: 'json', required: false },
},
{
sources: [
source.envFile('.env.local'),
source.envFile('.env'),
source.env(),
],
},
)
// config.PORT → number
// config.DATABASE_URL → string (redacted in logs)
// config.NODE_ENV → 'development' | 'production' | 'test'
// config.REDIS_TTL → number
// config.FEATURE_FLAGS → string[]
export default config
Schema Field Types
Every field requires a type property plus any type-specific options.
string
{ type: 'string', minLength?: number, maxLength?: number, pattern?: string }
| Option | Type | Description |
|---|---|---|
| minLength | number | Minimum character length |
| maxLength | number | Maximum character length |
| pattern | string | Regex pattern the value must match |
number
{ type: 'number', min?: number, max?: number, integer?: boolean }
| Option | Type | Description |
|---|---|---|
| min | number | Minimum allowed value |
| max | number | Maximum allowed value |
| integer | boolean | Reject non-integer values |
boolean
{ type: 'boolean' }
Accepts 'true', '1', 'yes', 'on' (truthy) and 'false', '0', 'no', 'off' (falsy).
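The coercion rules above can be sketched as a small helper (parseBoolean is a hypothetical name, not the library's internal API; whether the library also lowercases input before matching is an assumption here):

```typescript
// Sketch of the boolean coercion described above; throws on unrecognised input.
// Lowercasing before matching is an assumption, not documented behaviour.
const TRUTHY = new Set(['true', '1', 'yes', 'on'])
const FALSY = new Set(['false', '0', 'no', 'off'])

function parseBoolean(raw: string): boolean {
  const v = raw.trim().toLowerCase()
  if (TRUTHY.has(v)) return true
  if (FALSY.has(v)) return false
  throw new Error(`Cannot coerce '${raw}' to boolean`)
}
```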
enum
{ type: 'enum', values: readonly string[] }
{ type: 'enum', values: ['development', 'staging', 'production'] as const }
// Inferred type: 'development' | 'staging' | 'production'
json
{ type: 'json' }
Parses the value with JSON.parse. The inferred type is unknown; narrow it with transform or validate.
array
{ type: 'array', separator?: string }
Splits a delimited string into string[]. The default separator is ','.
{ type: 'array', separator: ':' } // PATH-style
url
{ type: 'url' }
Validates using the WHATWG URL constructor.
port
{ type: 'port' }
Parses the value as an integer and validates that it is in the range 1–65535. Inferred type is number.
email
{ type: 'email' }
Validates with a standard email regex.
custom
{ type: 'custom', parse: (raw: string) => T, serialize?: (val: T) => string }
Full control over parsing. The parse function may throw to signal invalid input. serialize is used for .env.example generation.
{
type: 'custom',
parse: (raw) => new URL(raw),
serialize: (val) => (val as URL).toString(),
}
Base Field Options
These options apply to every field type.
| Option | Type | Description |
|---|---|---|
| required | boolean | Throw ConfigValidationError if the value is missing |
| default | T | Fallback value when no source provides this key |
| secret | boolean | Redact value in logs, diagnostics, and JSON serialisation |
| description | string | Shown in .env.example and diagnostics table |
| deprecated | boolean \| string | Emit a warning; pass a string for a migration hint |
| alias | string[] | Alternative key names (e.g. legacy names) to check in sources |
| transform | (val: T) => T | Transform the parsed value before it is stored |
| validate | (val: T) => boolean \| string | Return false or an error string to fail validation |
{
API_KEY: {
type: 'string',
required: true,
secret: true,
description: 'Third-party API key',
alias: ['LEGACY_API_KEY'],
validate: (v) => v.length >= 32 || 'API_KEY must be at least 32 characters',
transform: (v) => v.trim(),
}
}
Sources
Sources are loaded in order and merged left-to-right: sources listed later override earlier ones.
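The left-to-right merge can be pictured as plain object spread order (a sketch of the semantics only, not the library's actual implementation):

```typescript
// Sketch of the merge semantics: later sources override earlier ones,
// exactly like later properties in an object spread.
type RawValues = Record<string, string>

function mergeSources(loaded: RawValues[]): RawValues {
  return loaded.reduce((acc, next) => ({ ...acc, ...next }), {})
}
```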
source.env()
Reads the current process environment (process.env).
source.env()
source.envFile(path)
Reads a .env-format file from disk. Missing files are silently skipped.
source.envFile('.env')
source.envFile('.env.local')
Supported .env syntax:
- KEY=value and export KEY=value
- Unquoted values (inline # comments stripped)
- Double-quoted ("...") with escape sequences (\n, \t, \\, \")
- Single-quoted ('...') literal values
- ${PREVIOUSLY_DEFINED} interpolation - resolved within the file only; process.env is never consulted during interpolation
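The file-local interpolation rule can be sketched as a pass over already-parsed entries (a simplified model: how the library treats references to keys not yet defined in the file is an assumption here; this sketch leaves them untouched):

```typescript
// Sketch of ${VAR} interpolation as described above: only keys defined
// earlier in the same file are substituted; process.env is never consulted.
// Unknown references are left as-is in this sketch (assumed behaviour).
function interpolate(entries: [string, string][]): Record<string, string> {
  const resolved: Record<string, string> = {}
  for (const [key, value] of entries) {
    resolved[key] = value.replace(
      /\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g,
      (match, name) => (name in resolved ? resolved[name] : match),
    )
  }
  return resolved
}
```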
source.cli(argv?)
Parses CLI arguments. Defaults to process.argv.
source.cli()
source.cli(['--PORT=3000', '--DEBUG'])
Supported argument formats:
| Format | Result |
|---|---|
| --KEY=value | { KEY: 'value' } |
| --key value | { KEY: 'value' } |
| --boolean-flag | { BOOLEAN_FLAG: 'true' } |
| --no-boolean-flag | { BOOLEAN_FLAG: 'false' } |
Keys are normalised: dashes become underscores and the result is uppercased (--db-host → DB_HOST).
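The name normalisation can be sketched as a one-liner (normaliseFlagName is a hypothetical helper name; the special --no- negation handling from the table above is a separate step not shown here):

```typescript
// Sketch of the flag-name normalisation described above:
// strip leading dashes, map '-' to '_', then uppercase.
function normaliseFlagName(flag: string): string {
  return flag.replace(/^--?/, '').replace(/-/g, '_').toUpperCase()
}
```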
source.json(path)
Reads a JSON file and uses its top-level keys as the source.
source.json('./config/settings.json')
source.yaml(path)
Reads a YAML file. Requires js-yaml as a peer dependency.
// npm install js-yaml
source.yaml('./config/settings.yaml')
Source Priority
Sources passed to sources are merged in array order, with later sources taking priority. The recommended pattern:
sources: [
source.envFile('.env'), // base defaults
source.envFile('.env.local'), // local overrides (git-ignored)
source.env(), // process environment (CI, prod)
source.cli(), // runtime flags (highest priority)
]
defineConfig Options
await defineConfig(schema, {
sources?: Source[],
diagnostics?: 'off' | 'summary' | 'verbose',
})
| Option | Default | Description |
|---|---|---|
| sources | [source.env(), source.envFile('.env')] | Source adapters to load from |
| diagnostics | 'verbose' (dev) / 'off' (prod) | Controls startup table output |
When NODE_ENV is not set, diagnostics defaults to 'verbose' and a warning is printed to stderr.
TypeScript Inference
InferConfig<S> maps your schema to exact TypeScript types automatically. Required fields and fields with defaults are non-optional.
import { defineConfig, source } from 'compose-env'
import type { InferConfig } from 'compose-env'
const schema = {
PORT: { type: 'port', required: true } as const,
DEBUG: { type: 'boolean', default: false } as const,
API_URL: { type: 'url', required: false } as const,
} as const
type Config = InferConfig<typeof schema>
// {
// readonly PORT: number - required → always present
// readonly DEBUG: boolean - has default → always present
// readonly API_URL: string | undefined - neither → may be undefined
// }
const config = await defineConfig(schema)
config.PORT.toFixed(0) // ✓ typed as number
config.API_URL?.startsWith // ✓ TypeScript requires a null check
Security Model
Secret redaction
Fields marked secret: true are protected at all levels:
- Diagnostics table - value shown as [secret]
- console.log(config) - prints the object with secrets replaced by [REDACTED]
- JSON.stringify(config) - secrets replaced by "[REDACTED]"
- Direct property access - works normally; protection only applies to serialisation
process.env.DATABASE_URL = 'postgres://user:pass@host/db'
const config = await defineConfig({
DATABASE_URL: { type: 'url', required: true, secret: true }
})
console.log(config) // DATABASE_URL: [REDACTED]
JSON.stringify(config) // {"DATABASE_URL":"[REDACTED]"}
config.DATABASE_URL // 'postgres://user:pass@host/db' ✓
Accessing real values
Use the escape hatch when you intentionally need the raw values (e.g. to pass to a database driver):
const raw = config.toUnsafeObject()
// { DATABASE_URL: 'postgres://user:pass@host/db', ... }
The name is intentionally verbose to prevent casual misuse.
ReDoS protection
Pattern validation for string fields uses the built-in regex engine with a timeout guard to prevent catastrophic backtracking.
Diagnostics
compose-env prints a startup table to process.stdout showing all loaded config values.
┌─────────────────┬─────────┬──────────────────────────────────────┬────────────────┬──────────────┐
│ Key │ Type │ Value │ Source │ Required │
├─────────────────┼─────────┼──────────────────────────────────────┼────────────────┼──────────────┤
│ PORT │ port │ 3000 │ .env.local │ ✓ │
│ DATABASE_URL │ url │ [secret] │ process.env │ ✓ │
│ NODE_ENV │ enum │ development │ default │ │
│ LOG_LEVEL │ enum │ info │ default │ │
└─────────────────┴─────────┴──────────────────────────────────────┴────────────────┴──────────────┘
Control diagnostics mode:
await defineConfig(schema, { diagnostics: 'off' }) // no output
await defineConfig(schema, { diagnostics: 'summary' }) // only warnings/errors
await defineConfig(schema, { diagnostics: 'verbose' }) // full table (default in dev)
Automatic mode - when diagnostics is omitted:
- NODE_ENV=production → 'off'
- Otherwise → 'verbose'
CLI
compose-env ships a CLI for development workflows.
npx compose-env <command> [options]
generate
Generates a .env.example file with all schema keys, descriptions, and default values. Secrets are represented as empty or placeholder values.
npx compose-env generate
npx compose-env generate --config ./src/config.ts --output .env.example
| Flag | Description |
|---|---|
| --config, -c | Path to config file (default: ./config.ts) |
| --output, -o | Output file path (default: print to stdout) |
validate
Validates that all required environment variables are present and valid. Exits with code 1 on failure.
npx compose-env validate
npx compose-env validate --strict --env production
| Flag | Description |
|---|---|
| --config, -c | Path to config file |
| --env, -e | Load .env.<name> in addition to .env |
| --strict | Fail if any unknown variables are present in .env |
Useful in CI pipelines to catch missing secrets before deployment:
# .github/workflows/ci.yml
- name: Validate config
  run: npx compose-env validate --strict
docker-env
Generates a Docker-compatible .env file (one KEY=value per line) suitable for --env-file or docker-compose.
npx compose-env docker-env
npx compose-env docker-env --include-secrets --output .env.docker
| Flag | Description |
|---|---|
| --config, -c | Path to config file |
| --env, -e | Load .env.<name> file |
| --output, -o | Output file path (default: stdout) |
| --include-secrets | Include secret: true fields in the output |
Watch Mode
watchConfig reloads configuration automatically when watched files change. It uses Node.js built-in fs.watch - zero extra dependencies.
import { watchConfig, source } from 'compose-env'
const watcher = watchConfig(schema, {
sources: [source.env(), source.envFile('.env')],
paths: ['.env'],
onReload: (config) => {
console.log('Config reloaded. New PORT:', config.PORT)
restartServer(config)
},
onError: (err) => {
console.error('Config reload failed:', err)
},
debounce: 200, // ms to wait before reloading (default: 200)
})
// Stop watching (e.g. on shutdown)
process.on('SIGTERM', () => watcher.stop())
WatchOptions
| Option | Type | Required | Description |
|---|---|---|---|
| paths | string[] | ✓ | File paths to watch for changes |
| sources | Source[] | | Source adapters for each reload. Defaults to [source.env(), source.envFile('.env')] |
| onReload | (config) => void | ✓ | Called with the new config after each successful reload |
| onError | (err) => void | | Called when a reload fails. Defaults to console.error |
| debounce | number | | Milliseconds to wait before triggering reload. Default: 200 |
The returned ConfigWatcher object has a single stop() method that closes all fs.watch instances and cancels pending timers.
Next.js Adapter
The compose-env/next export provides a Next.js config wrapper that validates environment variables at build time and injects public vars into the bundle.
npm install compose-env
// next.config.js
const { withEnvCompose } = require('compose-env/next')
const schema = require('./config.schema')
module.exports = withEnvCompose({
schema,
publicPrefix: 'NEXT_PUBLIC_', // default
})(nextConfig)
What it does:
- Validates all required schema vars before Next.js starts building - fails fast with a clear error
- Injects variables whose names start with publicPrefix into the Next.js env config, making them available as process.env.NEXT_PUBLIC_* in client bundles
- Never injects secret: true fields, regardless of name
// TypeScript usage
import { withEnvCompose } from 'compose-env/next'
import type { NextConfig } from 'next'
const nextConfig: NextConfig = { /* ... */ }
export default withEnvCompose({
schema: {
NEXT_PUBLIC_API_URL: { type: 'url', required: true },
DATABASE_URL: { type: 'url', required: true, secret: true },
},
})(nextConfig)
Testing Helpers
Import source.object from compose-env/testing to create an in-memory source for unit tests.
import { defineConfig } from 'compose-env'
import { source } from 'compose-env/testing'
it('uses custom PORT from config', async () => {
const config = await defineConfig(
{ PORT: { type: 'port', required: true } },
{
sources: [source.object({ PORT: '4000' })],
diagnostics: 'off',
},
)
expect(config.PORT).toBe(4000)
})
Plugin Interface
Use defineSource to create custom source adapters - for example, to load from a database, remote API, or any proprietary secrets store.
import { defineSource } from 'compose-env'
const mySource = defineSource('My Custom Source', async () => {
const response = await fetch('https://config.internal/api/settings')
const data = await response.json()
return data as Record<string, string>
})
const config = await defineConfig(schema, {
sources: [mySource],
})
The loader must return a Promise<Record<string, string>> - a flat key-value map of raw string values. compose-env handles parsing and validation.
Cloud Adapters
Each cloud adapter is a separate npm package and uses defineSource internally. Install only what you need.
AWS Systems Manager Parameter Store
npm install compose-env-aws-ssm @aws-sdk/client-ssm
import { defineConfig } from 'compose-env'
import { awsSSMSource } from 'compose-env-aws-ssm'
const config = await defineConfig(schema, {
sources: [
awsSSMSource('/myapp/prod/', {
region: 'us-east-1', // defaults to AWS_REGION env var
uppercase: true, // default: true
}),
],
})
Key normalisation: parameter /myapp/prod/db/host with prefix /myapp/prod/ → DB__HOST (slashes become __, result is uppercased).
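The normalisation rule can be sketched as follows (ssmKey is a hypothetical helper name for illustration):

```typescript
// Sketch of the SSM key normalisation described above:
// strip the prefix, replace '/' with '__', then uppercase.
function ssmKey(paramName: string, prefix: string): string {
  return paramName.slice(prefix.length).replace(/\//g, '__').toUpperCase()
}
```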
Credentials: uses the standard AWS SDK credential chain (env vars, ~/.aws/credentials, IAM role, etc.).
HashiCorp Vault
npm install compose-env-vault
No extra dependencies - uses the built-in fetch (Node 18+).
import { defineConfig } from 'compose-env'
import { vaultSource } from 'compose-env-vault'
const config = await defineConfig(schema, {
sources: [
vaultSource('myapp/config', {
address: 'https://vault.example.com', // or VAULT_ADDR env var
token: process.env.VAULT_TOKEN, // or VAULT_TOKEN env var
mount: 'secret', // default KV v2 mount
}),
],
})
Reads GET {address}/v1/{mount}/data/{secretPath} (KV v2 API).
Azure Key Vault
npm install compose-env-azure @azure/keyvault-secrets @azure/identity
import { defineConfig } from 'compose-env'
import { azureKeyVaultSource } from 'compose-env-azure'
const config = await defineConfig(schema, {
sources: [
azureKeyVaultSource('https://my-vault.vault.azure.net', {
secrets: ['db-url', 'api-key'], // omit to load all secrets
}),
],
})
Authenticates via DefaultAzureCredential - supports AZURE_CLIENT_ID / AZURE_TENANT_ID / AZURE_CLIENT_SECRET env vars, managed identity, and the Azure CLI.
Key normalisation: Azure secret names use hyphens (my-db-url) → converted to underscores and uppercased (MY_DB_URL).
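As a sketch of that mapping (azureKey is a hypothetical helper name for illustration):

```typescript
// Sketch of the Azure secret-name normalisation described above:
// hyphens become underscores, result is uppercased.
function azureKey(secretName: string): string {
  return secretName.replace(/-/g, '_').toUpperCase()
}
```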
GCP Secret Manager
npm install compose-env-gcp @google-cloud/secret-manager
import { defineConfig } from 'compose-env'
import { gcpSecretsSource } from 'compose-env-gcp'
const config = await defineConfig(schema, {
sources: [
gcpSecretsSource('my-gcp-project', {
secrets: ['DB_URL', 'API_KEY'], // omit to load all accessible secrets
version: 'latest', // default; pin to '3' for reproducibility
}),
],
})
Authenticates via Application Default Credentials (GOOGLE_APPLICATION_CREDENTIALS, Workload Identity, or gcloud auth application-default login).
ESLint Plugin
eslint-plugin-compose-env catches typos in config.KEY access at lint time.
npm install --save-dev eslint-plugin-compose-env
Flat config (ESLint 9+)
// eslint.config.js
import envCompose from 'eslint-plugin-compose-env'
export default [
...envCompose.configs.recommended,
{
rules: {
'compose-env/no-unknown-key': ['error', {
keys: ['PORT', 'DATABASE_URL', 'NODE_ENV'],
}],
},
},
]
Legacy config (ESLint 8)
// .eslintrc.js
module.exports = {
plugins: ['compose-env'],
rules: {
'compose-env/no-unknown-key': ['error', {
keys: ['PORT', 'DATABASE_URL', 'NODE_ENV'],
}],
},
}
no-unknown-key rule options
| Option | Type | Default | Description |
|---|---|---|---|
| keys | string[] | | Explicit list of known schema keys |
| schemaFile | string | | Path to a JSON file listing schema keys |
| configVariables | string[] | ['config'] | Variable names that hold a config object |
schemaFile formats:
{ "keys": ["PORT", "DATABASE_URL"] }
{ "PORT": { "type": "port" }, "DATABASE_URL": { "type": "url" } }
When schemaFile is used, the rule re-reads the file at lint time, so it stays in sync without manual keys maintenance.
What it catches
import config from './config'
config.PORT // ✓ known key
config.PROT // ✗ compose-env/no-unknown-key: 'PROT' is not defined in the compose-env schema
config['DB_URL'] // ✗ compose-env/no-unknown-key: 'DB_URL' is not defined in the compose-env schema
License
MIT