# ExtendedBuffer
ExtendedBuffer is a growable binary buffer built on top of Node.js Buffer.
It keeps an internal read pointer (similar to a stream cursor) and supports appending data at the end or prepending data at the start.
## Install
```sh
npm install extended-buffer
```

## Quick start
```typescript
import { ExtendedBuffer } from 'extended-buffer';

const b = new ExtendedBuffer();

b.writeString("OK");   // append
b.writeUInt16BE(1337); // append

console.log(b.readString(2));  // "OK"
console.log(b.readUInt16BE()); // 1337
```

## Core concepts
### Stored data vs readable data
The buffer stores a contiguous region of bytes. A separate read pointer tracks how many bytes were already consumed.
- `length` — total stored bytes (including already-read bytes).
- `getReadableSize()` — unread bytes remaining.
- `pointer` / `getPointer()` — current read pointer (0…`length`).
- `nativePointer()` — absolute index inside the underlying `Buffer` for the next read.
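The relationship between these values can be sketched with a plain Node.js `Buffer` and a manual cursor. This is only a model of the bookkeeping idea, not the library's actual internals:

```typescript
import { Buffer } from 'node:buffer';

// Model: `stored` holds every byte ever written; `pointer` counts consumed bytes.
const stored = Buffer.from([0x01, 0x02, 0x03, 0x04]);
let pointer = 0;

pointer += 2; // "read" two bytes

const total = stored.length;      // like length: all stored bytes, read or not -> 4
const readable = total - pointer; // like getReadableSize(): unread bytes -> 2
console.log(total, pointer, readable); // 4 2 2
```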
### Views
- `nativeBufferView` — a `Buffer` view of all stored bytes (from the start of stored data to the end).
- If you need only the unread bytes, you can derive them:

```typescript
const unread = b.nativeBufferView.subarray(b.pointer);
```

## Construction and options
```typescript
type ExtendedBufferOptions = {
  capacity?: number;           // initial native buffer size (bytes)
  capacityStep?: number;       // how much to grow when resizing
  nativeAllocSlow?: boolean;   // use Buffer.allocUnsafeSlow() when initializing ExtendedBuffer
  nativeReallocSlow?: boolean; // use Buffer.allocUnsafeSlow() for further reallocations
};
```

Default values:
- `capacity`: `16 * 1024` bytes (16 KiB)
- `capacityStep`: same as `capacity`
- `nativeAllocSlow`: `false`
- `nativeReallocSlow`: `false`
Example:
```typescript
const b = new ExtendedBuffer({
  capacity: 1024 * 1024,
  capacityStep: 1024 * 1024,
  nativeAllocSlow: true,
  nativeReallocSlow: true
});
```

## Writing data
Most write methods accept an optional `unshift?: boolean` parameter:
- `unshift = false` (default): append to the end
- `unshift = true`: prepend to the start
### Buffers and strings
```typescript
b.writeNativeBuffer(Buffer.from([1, 2, 3]));
b.writeBuffer(Buffer.from([4, 5, 6])); // alias that also accepts ExtendedBuffer
b.writeString("hello", "utf8");
```

Prepend example:
```typescript
b.writeString("payload");
b.writeUInt16BE(7, true); // prepend length/header
```

### Integers
Variable-width (size must be 1…6 bytes):
```typescript
b.writeIntBE(-10, 3);
b.writeUIntLE(5000, 4);
```

Fixed-width helpers:
- `writeInt8`, `writeUInt8`
- `writeInt16BE`, `writeInt16LE`, `writeUInt16BE`, `writeUInt16LE`
- `writeInt32BE`, `writeInt32LE`, `writeUInt32BE`, `writeUInt32LE`
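These helpers use the same wire formats as the corresponding Node.js `Buffer` methods, so the byte layout can be illustrated with the native API; for example, a signed 16-bit big-endian write stores the familiar two's-complement bytes:

```typescript
import { Buffer } from 'node:buffer';

// -2 as a signed 16-bit big-endian value is 0xFF 0xFE (two's complement).
const native = Buffer.alloc(2);
native.writeInt16BE(-2, 0);

console.log(native.toString('hex')); // "fffe"
console.log(native.readInt16BE(0));  // -2
console.log(native.readUInt16BE(0)); // 65534 — same bytes, unsigned view
```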
### BigInt (64-bit integers)
If your runtime supports BigInt and Node's `Buffer.readBig*` / `Buffer.writeBig*` APIs, you can read/write 64-bit integers as `bigint` values (always 8 bytes):
- `writeBigInt64BE`, `writeBigInt64LE` — signed 64-bit
- `writeBigUInt64BE`, `writeBigUInt64LE` — unsigned 64-bit
```typescript
import { ExtendedBuffer } from 'extended-buffer';

const b = new ExtendedBuffer();
b.writeBigUInt64BE(2n ** 63n); // 9223372036854775808n
b.writeBigInt64LE(-42n);

b.setPointer(0);
console.log(b.readBigUInt64BE()); // 9223372036854775808n
console.log(b.readBigInt64LE());  // -42n
```

If BigInt is not supported, these methods throw `ExtendedBufferUnsupportedError('EXECUTION_ENVIRONMENT_NOT_SUPPORT_BIG_INT')`.
### Floating point
- `writeFloatBE`, `writeFloatLE` (4 bytes)
- `writeDoubleBE`, `writeDoubleLE` (8 bytes)
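As with Node's native `Buffer`, these use IEEE 754 encoding, so a 4-byte float can lose precision that an 8-byte double keeps. Illustrated here with the native API:

```typescript
import { Buffer } from 'node:buffer';

const buf = Buffer.alloc(12);
buf.writeFloatBE(0.1, 0);  // 32-bit single: 0.1 is rounded to the nearest float
buf.writeDoubleBE(0.1, 4); // 64-bit double: 0.1 round-trips exactly

console.log(buf.readFloatBE(0) === 0.1);  // false — came back as ~0.10000000149
console.log(buf.readDoubleBE(4) === 0.1); // true
```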
## Reading data
All `read*` methods advance the internal read pointer (consume bytes).
If there aren't enough readable bytes, they throw `ExtendedBufferRangeError('SIZE_OUT_OF_RANGE')`.
### Checking before reading
```typescript
if (b.isReadable(4)) {
  const x = b.readUInt32BE();
}
```

### Read a native Buffer or another ExtendedBuffer
```typescript
// Copy out as a native Buffer
const chunk: Buffer = b.readBuffer(10, true);

// Copy out as a new ExtendedBuffer (same capacity/capacityStep/nativeAllocSlow/nativeReallocSlow by default)
const eb: ExtendedBuffer = b.readBuffer(10);
```

### Strings
```typescript
const s = b.readString(5, "utf8");
```

### Integers
Variable-width (size 1…6 bytes):
```typescript
const a = b.readIntBE(3);
const u = b.readUIntLE(4);
```

Fixed-width helpers:
- `readInt8`, `readUInt8`
- `readInt16BE`, `readInt16LE`, `readUInt16BE`, `readUInt16LE`
- `readInt32BE`, `readInt32LE`, `readUInt32BE`, `readUInt32LE`
### BigInt (64-bit integers)
- `readBigInt64BE`, `readBigInt64LE` — signed 64-bit
- `readBigUInt64BE`, `readBigUInt64LE` — unsigned 64-bit
```typescript
const b = new ExtendedBuffer();
b.writeBigInt64BE(-1n);
b.writeBigUInt64BE(18446744073709551615n); // 2^64 - 1

b.setPointer(0);
console.log(b.readBigInt64BE());  // -1n
console.log(b.readBigUInt64BE()); // 18446744073709551615n
```

Note: Node's `Buffer` will throw a native `RangeError` if the value doesn't fit into the signed/unsigned 64-bit range.
### Floating point
- `readFloatBE`, `readFloatLE`
- `readDoubleBE`, `readDoubleLE`
## Pointer control (peeking / rewinding)
### Save pointer, read, then restore (peek)
```typescript
const p = b.pointer;
const header = b.readUInt16BE();
// decide what to do...
b.setPointer(p); // rewind back to before the header
```

### Move relative to the current position

```typescript
b.offset(4);  // skip 4 bytes
b.offset(-2); // go back 2 bytes (must stay within 0…length)
```

If you try to set the pointer outside `[0, length]`, it throws
`ExtendedBufferRangeError('POINTER_OUT_OF_RANGE')`.
## Transactions (atomic changes)
Sometimes you want to perform a multi-step read/write and either:
- commit everything if it succeeds, or
- rollback the buffer to the exact previous state if something fails.
`ExtendedBuffer.transaction()` wraps your code in a transaction:
```typescript
const result = b.transaction(() => {
  // any reads/writes/offsets/etc.
  return 123;
});
```

Rules:
- If the callback returns normally, changes are kept (committed).
- If the callback throws, the buffer is restored (rolled back) and the error is re-thrown.
- Transactions are re-entrant: nested `transaction()` calls do not create extra snapshots.
What gets rolled back:
- stored payload bytes
- `pointer` (the read pointer)
- internal start/end offsets and the original native `Buffer` (even if the buffer was reallocated during the callback)
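The commit/rollback behaviour can be sketched generically with a snapshot-and-restore helper. This is a hypothetical model of the pattern, not the library's implementation:

```typescript
// Snapshot-based transaction: copy state up front, restore it on failure,
// and re-throw the original error so the caller still sees it.
type State = { bytes: number[]; pointer: number };

function runTransaction<T>(state: State, fn: () => T): T {
  const snapshot: State = { bytes: [...state.bytes], pointer: state.pointer };
  try {
    return fn(); // normal return => changes are kept
  } catch (err) {
    state.bytes = snapshot.bytes;     // rollback payload
    state.pointer = snapshot.pointer; // rollback read pointer
    throw err;
  }
}

const state: State = { bytes: [1, 2, 3], pointer: 0 };
try {
  runTransaction(state, () => {
    state.bytes.push(4); // partial write...
    state.pointer = 2;   // ...and a partial read
    throw new Error('BAD_INPUT');
  });
} catch {
  // swallowed for the demo
}
console.log(state); // { bytes: [ 1, 2, 3 ], pointer: 0 } — fully rolled back
```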
### Example: "try parse" without consuming bytes
This is useful for protocols where you might receive partial data and want to retry later.
```typescript
import { ExtendedBuffer } from 'extended-buffer';

function tryReadFrame(b: ExtendedBuffer): Buffer | null {
  try {
    return b.transaction(() => {
      // (1) read header
      const len = b.readUInt16BE();
      // (2) not enough bytes yet -> rollback and let the caller wait for more data
      if (!b.isReadable(len)) {
        throw new Error('INCOMPLETE_FRAME');
      }
      // (3) success -> commit
      return b.readBuffer(len, true);
    });
  } catch {
    return null;
  }
}
```

### Example: rollback on validation error
```typescript
b.transaction(() => {
  const magic = b.readUInt32BE();
  if (magic !== 0xdeadbeef) {
    throw new Error('BAD_MAGIC');
  }
  const version = b.readUInt8();
  if (version !== 1) {
    throw new Error('UNSUPPORTED_VERSION');
  }
});
```

### Performance note
`transaction()` snapshots the current payload (it copies the stored bytes) before running the callback.
That makes rollbacks safe, but it can be expensive for very large buffers. Use it for small or medium payloads,
or when the safety and ergonomics are worth the extra copy.
## Memory management
### Discard already-read data
If you continuously read from the buffer, you can drop the consumed prefix:
```typescript
b.discardReadData();
```

This moves the internal start forward by the number of read bytes and resets `pointer` to 0.
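Conceptually, discarding is equivalent to slicing off the consumed prefix and zeroing the cursor. A sketch of the idea (the real method adjusts internal offsets rather than copying):

```typescript
import { Buffer } from 'node:buffer';

// Model: after reading 3 of 4 bytes, drop the consumed prefix and reset the cursor.
let stored = Buffer.from([10, 20, 30, 40]);
let pointer = 3;

stored = stored.subarray(pointer); // consumed prefix is gone
pointer = 0;                       // next read starts at the new beginning

console.log([...stored], pointer); // [ 40 ] 0
```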
### Shrink free capacity (`gc()`)
```typescript
b.gc();
```

`gc()` first discards read data, then may shrink the underlying native `Buffer`
when free space exceeds `capacityStep`.
### Reset everything
```typescript
b.clean(); // alias for initExtendedBuffer()
```

## Errors
The library defines these error classes:
- `ExtendedBufferError`
- `ExtendedBufferTypeError`
- `ExtendedBufferRangeError`
- `ExtendedBufferUnsupportedError`
Common error codes you may see:
- `SIZE_OUT_OF_RANGE`: reading more bytes than available
- `POINTER_OUT_OF_RANGE`: setting the pointer outside `0…length`
- `INVALID_INTEGER_SIZE_VALUE_TYPE`: size is not a safe integer
- `INVALID_INTEGER_SIZE_VALUE_RANGE`: integer size not in `1…6`
- `INVALID_INSTANCE_STATE`: internal invariant check failed
- `INVALID_BUFFER_TYPE`: attempt to write an invalid buffer type
- `VALUE_MUST_BE_AN_INTEGER`: value is not a safe integer
- `VALUE_MUST_BE_AN_UNSIGNED_INTEGER`: value is not a safe integer or is less than 0
- `VALUE_MUST_BE_AN_BIG_INTEGER`: value is not a `bigint`
- `VALUE_MUST_BE_AN_UNSIGNED_BIG_INTEGER`: value is not a `bigint` or is less than 0
- `EXECUTION_ENVIRONMENT_NOT_SUPPORT_BIG_INT`: BigInt methods are not supported in the current runtime
- `EXCEEDING_MAXIMUM_BUFFER_SIZE`: allocation exceeds Node's `kMaxLength` or `os.totalmem()`
## Caveats
### Prepending (`unshift`) after reading
`unshift = true` prepends bytes by moving the internal start pointer, but the read pointer is not adjusted automatically.
If you prepend after consuming bytes, you may get surprising results (e.g., some previously read bytes can become readable again, or newly prepended bytes may be skipped).
A safe pattern is:
```typescript
b.discardReadData();
b.writeUInt16BE(123, true);
```

### `nodeGc()` is Node-specific
`nodeGc()` calls `global.gc()` if it exists. In Node.js this requires starting the process with `--expose-gc`.
In non-Node runtimes, `global` may not exist.
## Reference: full public API (names)
Properties:
- `length`, `capacity`, `pointer`, `nativeBufferView`

Core:
- `initExtendedBuffer()`, `assertInstanceState()`, `clean()`
- `nativePointer()`, `getWritableSizeStart()`, `getWritableSizeEnd()`, `getWritableSize()`, `getReadableSize()`
- `transaction(callback)`
- `allocStart(size)`, `allocEnd(size)`
- `writeNativeBuffer(buf, unshift?)`, `writeBuffer(bufOrEB, unshift?)`, `writeString(str, enc?, unshift?)`
- Pointer: `setPointer(p)`, `getPointer()`, `offset(n)`, `isReadable(size)`
- Maintenance: `discardReadData()`, `gc()`, `nodeGc()`
Numbers:
- Write: `writeIntBE/LE`, `writeUIntBE/LE`, `writeInt8`, `writeUInt8`, `writeInt16BE/LE`, `writeUInt16BE/LE`, `writeInt32BE/LE`, `writeUInt32BE/LE`, `writeBigInt64BE/LE`, `writeBigUInt64BE/LE`, `writeFloatBE/LE`, `writeDoubleBE/LE`
- Read: `readBuffer`, `readString`, `readIntBE/LE`, `readUIntBE/LE`, `readInt8`, `readUInt8`, `readInt16BE/LE`, `readUInt16BE/LE`, `readInt32BE/LE`, `readUInt32BE/LE`, `readBigInt64BE/LE`, `readBigUInt64BE/LE`, `readFloatBE/LE`, `readDoubleBE/LE`
## License
MIT