# json-stream-lite
A lightweight, memory-efficient streaming JSON parser and stringifier for JavaScript and TypeScript. Process large JSON files without loading them entirely into memory.
## Features
- 🚀 Stream parsing: Parse JSON incrementally as data arrives
- 💾 Memory efficient: Process large JSON files without loading them entirely into memory
- 🔄 Bidirectional: Both parse and stringify JSON in streaming fashion
- 🎯 Type-safe: Full TypeScript support with comprehensive type definitions
- 🔌 Flexible input: Support for sync/async iterables, ReadableStreams, strings, and byte arrays
- 🎨 Key-value extraction: Flatten nested JSON structures into key-value pairs
- ⚡ Zero dependencies: Minimal footprint with no external runtime dependencies
## Installation

```sh
npm install json-stream-lite
# or
pnpm add json-stream-lite
# or
yarn add json-stream-lite
```

## Quick Start
### Parsing JSON

Parse a complete JSON object incrementally:
```ts
import { JsonObject } from 'json-stream-lite'

const json = '{"name": "Alice", "age": 30, "active": true}'
const parser = new JsonObject()

// Feed bytes into the parser
parser.feed(...new TextEncoder().encode(json))

// Read the complete object
const result = parser.read()
console.log(result) // { name: 'Alice', age: 30, active: true }
```

#### Stream through object members
```ts
import { JsonObject } from 'json-stream-lite'

const json = '{"name": "Alice", "age": 30, "city": "NYC"}'
const parser = new JsonObject()
parser.feed(...new TextEncoder().encode(json))

// Iterate through key-value pairs without loading the entire object
for (const [keyEntity, valueEntity] of parser.members()) {
  const key = keyEntity.read()
  const value = valueEntity.read().read()
  console.log(`${key}: ${value}`)
}
// Output:
// name: Alice
// age: 30
// city: NYC
```

#### Parse JSON arrays incrementally
```ts
import { JsonArray } from 'json-stream-lite'

const json = '[1, 2, 3, 4, 5]'
const parser = new JsonArray()
parser.feed(...new TextEncoder().encode(json))

// Process each item individually
for (const item of parser.items()) {
  console.log(item.read())
}
// Output: 1, 2, 3, 4, 5
```

### Async Streaming
Process JSON from async sources like HTTP responses or file streams:
```ts
import { JsonObject } from 'json-stream-lite'

async function processStream(stream: ReadableStream<Uint8Array>) {
  const parser = new JsonObject(stream)

  // Asynchronously iterate through members
  for await (const [keyEntity, valueEntity] of parser.membersAsync()) {
    const key = keyEntity.read()
    const value = await valueEntity.readValueAsync()
    console.log(`${key}: ${value}`)
  }
}

// Example with fetch
const response = await fetch('https://api.example.com/data.json')
await processStream(response.body!)
```

### Key-Value Extraction
Flatten nested JSON structures into dot-notation key-value pairs:
```ts
import { jsonKeyValueParser } from 'json-stream-lite'

const json = '{"user": {"name": "Alice", "scores": [95, 87, 92]}}'
for (const [key, value] of jsonKeyValueParser(json)) {
  console.log(`${key} = ${value}`)
}
// Output:
// user.name = Alice
// user.scores[0] = 95
// user.scores[1] = 87
// user.scores[2] = 92
```

#### Async key-value extraction
```ts
import { jsonKeyValueParserAsync } from 'json-stream-lite'

async function extractKeyValues(stream: ReadableStream) {
  for await (const [key, value] of jsonKeyValueParserAsync(stream)) {
    console.log(`${key} = ${value}`)
  }
}
```

### Stringifying JSON
Convert JavaScript objects to JSON strings in a streaming fashion:
```ts
import { jsonStreamStringify } from 'json-stream-lite'

const data = {
  name: 'Alice',
  scores: [95, 87, 92],
  metadata: { verified: true },
}

// Generate JSON in chunks
for (const chunk of jsonStreamStringify(data, null, 2)) {
  process.stdout.write(chunk)
}
```

#### Stringify to bytes
```ts
import { jsonStreamStringifyBytes } from 'json-stream-lite'

const data = { name: 'Alice', age: 30 }
for (const bytes of jsonStreamStringifyBytes(data)) {
  // bytes is a Uint8Array
  await writeToFile(bytes)
}
```

#### Control chunk size
```ts
import { jsonStreamStringify } from 'json-stream-lite'

const data = { longString: 'x'.repeat(10000) }

// Control how strings are chunked (default: 1024 bytes)
for (const chunk of jsonStreamStringify(data, null, 0, {
  stringChunkSize: 512,
})) {
  console.log(chunk.length) // Chunks will be ~512 bytes
}
```

## API Reference
### Parser Classes

#### JsonValue
Represents any JSON value. Automatically detects the type and returns the appropriate entity.
```ts
import { JsonValue } from 'json-stream-lite'

const parser = new JsonValue()
parser.feed(...bytes)
const valueEntity = parser.read() // Returns JsonString, JsonNumber, etc.
const actualValue = valueEntity.read()
```

#### JsonObject
Parses JSON objects.
```ts
class JsonObject<T = unknown> extends JsonEntity<T>

// Methods
members(): Generator<[JsonString, JsonValue]>
membersAsync(): AsyncGenerator<[JsonString, JsonValue]>
read(): T
readAsync(): Promise<T>
```

#### JsonArray
Parses JSON arrays.
```ts
class JsonArray<T = unknown> extends JsonEntity<T[]>

// Methods
items(): Generator<JsonValueType>
itemsAsync(): AsyncGenerator<JsonValueType>
read(): T[]
readAsync(): Promise<T[]>
```

#### JsonString, JsonNumber, JsonBoolean, JsonNull
Parse primitive JSON values.
```ts
const str = new JsonString()
str.feed(...bytes)
console.log(str.read()) // Returns a string

const num = new JsonNumber()
num.feed(...bytes)
console.log(num.read()) // Returns a number
```

#### JsonKeyValueParser
Flattens nested JSON into key-value pairs.
```ts
class JsonKeyValueParser extends JsonEntity<Generator<JsonKeyValuePair>>

// Methods
parse(): Generator<[string, JsonPrimitive]>
parseAsync(): AsyncGenerator<[string, JsonPrimitive]>
```

### Base Class: JsonEntity
All parser classes extend `JsonEntity<T>`:
```ts
// Properties
consumed: boolean
bufferLength: number
maxBufferSize: number

// Methods
feed(...input: (number | number[])[]): void
read(): T
readAsync(): Promise<T>
consume(): void
consumeAsync(): Promise<void>
tryParse<T>(cb: (entity: this) => T): T | undefined
```

### Generator Functions
#### `jsonKeyValueParser(bytes: Iterable<number> | string)`

Synchronously parse JSON into key-value pairs.

```ts
import { jsonKeyValueParser } from 'json-stream-lite'

for (const [key, value] of jsonKeyValueParser('{"a": 1, "b": [2, 3]}')) {
  console.log(key, value)
}
// a 1
// b[0] 2
// b[1] 3
```

#### `jsonKeyValueParserAsync(bytes: AsyncIterable<number> | ReadableStream | string)`
Asynchronously parse JSON into key-value pairs.
```ts
import { jsonKeyValueParserAsync } from 'json-stream-lite'

for await (const [key, value] of jsonKeyValueParserAsync(stream)) {
  console.log(key, value)
}
```

### Stringify Functions
#### `jsonStreamStringify(value, replacer?, indent?, options?)`
Generate JSON string chunks.
```ts
function jsonStreamStringify(
  value: unknown,
  replacer?: any,
  indent?: number,
  options?: JsonStreamStringifyOptions,
): Generator<string>

interface JsonStreamStringifyOptions {
  stringChunkSize?: number // Default: 1024
}
```

Parameters:

- `value`: The value to stringify
- `replacer`: Optional replacer function called on each value (with empty key) during stringification
- `indent`: Number of spaces for indentation (0 for compact)
- `options.stringChunkSize`: Maximum size of string chunks in bytes
#### `jsonStreamStringifyBytes(value, replacer?, indent?, options?)`
Generate JSON as Uint8Array chunks.
```ts
function jsonStreamStringifyBytes(
  value: unknown,
  replacer?: any,
  indent?: number,
  options?: JsonStreamStringifyOptions,
): Generator<Uint8Array>
```

## Advanced Usage
### Processing Large Files
```ts
import { createReadStream } from 'fs'
import { JsonObject } from 'json-stream-lite'

async function processLargeFile(filePath: string) {
  const stream = createReadStream(filePath)
  const parser = new JsonObject(stream)

  for await (const [keyEntity, valueEntity] of parser) {
    const key = keyEntity.read()
    const value = await valueEntity.readValueAsync()
    // Process each key-value pair without loading the entire file
    await processRecord(key, value)
  }
}
```

### Handling Nested Structures
```ts
import { JsonObject, JsonArray } from 'json-stream-lite'

const json = '{"users": [{"name": "Alice"}, {"name": "Bob"}]}'
const parser = new JsonObject()
parser.feed(...new TextEncoder().encode(json))

for (const [keyEntity, valueEntity] of parser) {
  const key = keyEntity.read()
  const value = valueEntity.read()

  if (key === 'users' && value instanceof JsonArray) {
    for (const userEntity of value.items()) {
      const user = userEntity.read()
      console.log(user) // Each user object
    }
  }
}
```

### Incremental Feeding
```ts
import { JsonObject } from 'json-stream-lite'

const parser = new JsonObject()

// Feed data incrementally as it arrives
parser.feed(123) // {
parser.feed(34, 110, 97, 109, 101, 34) // "name"
parser.feed(58, 34, 65, 108, 105, 99, 101, 34) // :"Alice"
parser.feed(125) // }

const result = parser.read()
console.log(result) // { name: 'Alice' }
```

## Use Cases
### 1. Processing API Responses
```ts
import { JsonObject } from 'json-stream-lite'

async function processApiResponse(url: string) {
  const response = await fetch(url)
  const parser = new JsonObject(response.body!)

  for await (const [keyEntity, valueEntity] of parser.membersAsync()) {
    const key = keyEntity.read()
    const value = await valueEntity.readValueAsync()
    console.log(`Processing ${key}:`, value)
  }
}
```

### 2. Log File Analysis
```ts
import { jsonKeyValueParserAsync } from 'json-stream-lite'

async function analyzeLogFile(stream: ReadableStream) {
  const metrics: Record<string, number> = {}

  for await (const [key, value] of jsonKeyValueParserAsync(stream)) {
    if (typeof value === 'number') {
      metrics[key] = (metrics[key] || 0) + value
    }
  }

  return metrics
}
```

### 3. Generating Large JSON Files
```ts
import { jsonStreamStringifyBytes } from 'json-stream-lite'
import { createWriteStream } from 'fs'

async function generateLargeFile(data: unknown, outputPath: string) {
  const writeStream = createWriteStream(outputPath)

  for (const chunk of jsonStreamStringifyBytes(data, null, 2)) {
    writeStream.write(chunk)
  }

  writeStream.end()
}
```

### 4. Database Export
```ts
import { jsonStreamStringify } from 'json-stream-lite'

async function* exportDatabase(query: string) {
  const records = await db.query(query)

  for (const chunk of jsonStreamStringify(records, null, 2)) {
    yield chunk
  }
}

// Stream to client
app.get('/export', async (req, res) => {
  res.setHeader('Content-Type', 'application/json')

  for await (const chunk of exportDatabase('SELECT * FROM users')) {
    res.write(chunk)
  }

  res.end()
})
```

## Performance Tips
- Use async methods for I/O-bound operations
- Set appropriate buffer limits with `maxBufferSize`
- Stream member-by-member instead of calling `read()` on large objects
- Control chunk size in stringify operations for optimal throughput
- Avoid reading entire objects when you only need specific fields
## Browser Support
Works in all modern browsers and Node.js environments that support:
- ES2015+ features
- Generators and async generators
- `TextEncoder`/`TextDecoder` (for string conversion)
- `ReadableStream` (for stream processing)
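`ReadableStream` is a global in modern browsers and in recent Node.js versions. For quick experiments with the async examples above, a string can be wrapped in a stream using only standard APIs (a small helper sketch, not part of this library):

```ts
// Hypothetical helper: wrap a JSON string in a ReadableStream<Uint8Array>,
// the input shape the async examples expect
function stringToStream(json: string): ReadableStream<Uint8Array> {
  const bytes = new TextEncoder().encode(json)
  return new ReadableStream({
    start(controller) {
      controller.enqueue(bytes) // emit the whole payload as one chunk
      controller.close()
    },
  })
}
```

Any example that accepts a `ReadableStream` can then be tried with `stringToStream('{"a": 1}')`.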
## TypeScript Support
Full TypeScript definitions included. All types are exported:
```ts
import type {
  JsonPrimitive,
  JsonKeyValuePair,
  JsonValueType,
  JsonPrimitiveType,
  JsonStreamStringifyOptions,
} from 'json-stream-lite'
```

## License
MIT
## Contributing

Contributions welcome! Please ensure:

- All tests pass: `pnpm test`
- Code compiles: `pnpm compile`
- Coverage maintained: `pnpm test -- --coverage`
## Related Projects
- json-stream - Alternative streaming JSON parser
- stream-json - Node.js streaming JSON parser
- jsonparse - Pure JavaScript streaming JSON parser