# @trap_stevo/cynq

Automation Evolved.

The next-generation, event-driven, real-time CI/CD and pipeline orchestration engine.

Unifying pipelines, triggers, deployments, metrics, and real-time observability into a single, composable core, powering ultra-dynamic continuous integration and delivery workflows across any environment.
## Features

- Pipeline-as-Code - define fully declarative, composable pipelines that run anywhere
- Real-Time Engine - react instantly to events, webhooks, and remote triggers
- Dynamic Drivers - register sources, runners, triggers, and deployers with simple interfaces
- Cynq Route Engine - secure HTTP layer for remote enqueues and external control
- Remote Execution Ready - dispatch builds or deployments to other instances
- Queue & Retry Logic - resilient queue system with per-tenant isolation and idempotency
- Security-First Architecture - HMAC / JWT auth, replay protection, allowlists, and TTL enforcement
- Integrated Metrics - seamless telemetry for build time, deploy time, and success ratios
- Vault Integration - secret and credential management for pipelines
- Multi-Target Sync Loops - orchestrate multiple pipelines in parallel
- Graceful Shutdowns - unified `close()` method to stop routes, triggers, and sync loops safely
## System Requirements

| Requirement | Version |
|---|---|
| Node.js | ≥ 19.x |
| npm | ≥ 9.x |
| OS | Windows, macOS, Linux |
## Built-In Drivers

| Kind | Driver | Description |
|---|---|---|
| source | `git` | Clone repositories (HTTPS / SSH) |
| runner | `shell` | Execute local shell scripts or commands |
| deployer | `copy-folder` | Copy build artifacts between directories |
| deployer | `ssh-rsync` | Deploy files over SSH using rsync |
| deployer | `remote-cynq-enqueue` | Trigger another instance remotely |
| trigger | `webhook` | Listen for GitHub, Gitea, or GitLab webhooks |
| trigger | `poll` | Poll remote HTTP endpoints or files for changes |
| trigger | `fs-watch` | React to filesystem events using Vigilite |
Register a custom driver:
```javascript
engine.registerDriver("runner", "python", (ctx) => ({
  exec : async ({ with : w }) => {
    const { spawn } = require("child_process");
    await new Promise((resolve) =>
      spawn("python", [w.script], { stdio : "inherit" })
        .on("exit", resolve)
    );
  }
}));
```

## Driver Authoring
Cynq enables fully modular extensions through drivers: small, isolated units that define custom logic for different stages of the pipeline lifecycle.
Drivers transform how builds run, deploy, trigger, and source artifacts across environments.
## Driver Fundamentals

| Concept | Description |
|---|---|
| Driver | A pluggable logic unit that executes during a specific lifecycle phase (runner, deployer, trigger, source). |
| Registration | Drivers register with `registerDriver(category, name, factoryFn)` and expose lifecycle hooks. |
| Factory Function | Returns an object defining async lifecycle methods such as `start`, `activate`, or `fetch`. |
| Schema Contract | Each driver consumes a `with` object: user-defined parameters validated before execution. |
| Sandbox | Drivers run in isolated async contexts; timeouts and I/O guards ensure safe parallel execution. |
| Context Object (`ctx`) | Provides runtime utilities such as logging, storage, event emission, and vault access. |
## Shared Context Reference

| Property | Type | Description |
|---|---|---|
| `ctx.logger` | Function | Structured log emitter; supports `{ level, msg, data }`. |
| `ctx.emit(event, data)` | Function | Emits custom runtime events. |
| `ctx.enqueue(target, payload, opts?)` | Function | Adds new jobs dynamically to a target queue. |
| `ctx.storage` | CynqStorageEngine | Persistent store for queue, attempts, secrets, KV, and approvals. |
| `ctx.vault` | Vault | Abstracted key-value backend (may represent local, S3, SQL, etc.). |
| `ctx.abortSignal` | AbortSignal | Enables cooperative cancellation. |
| `ctx.meta` | Object | Contains metadata for the current project, tenant, and pipeline. |
| `ctx.env` | Object | Environment variables scoped to the current job. |
## Runner Drivers

Runner drivers execute the actual work: builds, tests, packaging, or orchestration steps.

| Lifecycle Method | Description | Async |
|---|---|---|
| `start({ with : w }, ctx)` | Begin job execution using configuration `w`. | ✅ |
| `stop(ctx)` | Optional cleanup logic or abort-signal handling. | ✅ |
### Example

```javascript
cynq.registerDriver("runner", "shell-task", (ctx) => ({
  start : async ({ with : w }) => {
    const { exec } = require("child_process");

    if (!w?.cmd) { throw new Error("Missing cmd parameter"); }

    await new Promise((resolve, reject) => {
      exec(w.cmd, { cwd : w.cwd || process.cwd(), env : { ...process.env, ...w.env } }, (err, stdout, stderr) => {
        if (err) reject(stderr || err);
        else { ctx.logger?.(`stdout: ${stdout}`); resolve(); }
      });
    });
  }
}));
```

### Common `with` Parameters
| Key | Type | Description |
|---|---|---|
| `cmd` | string | Shell command to execute. |
| `cwd` | string | Optional working directory. |
| `env` | object | Optional environment overrides. |
## Deployer Drivers

Deployer drivers publish or distribute artifacts: pushing to registries, cloud storage, or remote systems.

| Lifecycle Method | Description | Async |
|---|---|---|
| `activate({ with : w }, ctx)` | Deploy artifacts or assets to a remote target. | ✅ |
| `rollback(ctx)` | Optional rollback or cleanup handler. | ✅ |
### Example

```javascript
cynq.registerDriver("deployer", "artifact-uploader", (ctx) => ({
  activate : async ({ with : w }) => {
    const fs = require("fs");
    const axios = require("axios");

    if (!w?.url || !w?.filePath) {
      throw new Error("artifact-uploader requires url and filePath");
    }

    const data = fs.createReadStream(w.filePath);

    await axios.post(w.url, data, {
      headers : {
        "Content-Type" : "application/octet-stream",
        Authorization : w.auth || ""
      }
    });

    ctx.logger?.(`Uploaded ${w.filePath} to ${w.url}`);
  }
}));
```

## Trigger Drivers
Trigger drivers listen for external or scheduled events and enqueue new jobs dynamically.
| Lifecycle Method | Description | Async |
|---|---|---|
| `start({ with : w }, enqueue)` | Start event listener or schedule; call `enqueue()` when triggered. | ✅ |
| `stop(ctx)` | Optional stop handler to clean up the listener. | ✅ |
### Example

```javascript
cynq.registerDriver("trigger", "webhook", (ctx) => ({
  start : async ({ with : w }, enqueue) => {
    const express = require("express");
    const app = express();

    app.use(express.json());

    const route = w.path || "/hook";
    const port = w.port || 3100;

    app.post(route, async (req, res) => {
      await enqueue({ event : "webhook", payload : req.body });
      res.json({ ok : true });
    });

    app.listen(port, () => ctx.logger?.(`Webhook listening on :${port}${route}`));
  }
}));
```

## Source Drivers
Source drivers fetch repositories, packages, or other input materials before a build or deployment begins.
| Lifecycle Method | Description | Async |
|---|---|---|
| `fetch({ with : w }, ctx)` | Acquire source content and return a local path reference. | ✅ |
### Example

```javascript
cynq.registerDriver("source", "git-clone", (ctx) => ({
  fetch : async ({ with : w }) => {
    const { execSync } = require("child_process");

    if (!w?.repo) throw new Error("Missing repo URL");

    const dir = w.dir || `./workspace-${Date.now()}`;
    execSync(`git clone ${w.repo} ${dir}`, { stdio : "inherit" });
    ctx.logger?.(`Cloned ${w.repo} into ${dir}`);
    return dir;
  }
}));
```

## Driver Safety Guidelines
| Guideline | Purpose |
|---|---|
| Validate the `with` object early | Prevent undefined behavior. |
| Handle exceptions cleanly | Throw structured errors for proper reporting. |
| Use context utilities | For logging, metrics, vault, and queue operations. |
| Avoid global state | Drivers run in parallel; isolation avoids conflicts. |
| Respect abort signals | Check `ctx.abortSignal.aborted` for cooperative termination. |
| Secure external calls | Validate URLs, use HTTPS, enforce small payloads. |
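The "Respect abort signals" guideline can be sketched as follows. This is a minimal, standalone illustration assuming `ctx.abortSignal` is a standard `AbortSignal`; `runChunks` and `processChunk` are hypothetical names for this sketch, not part of the library.

```javascript
// Sketch: cooperative cancellation inside a long-running driver step.
// Check the signal between units of work instead of being killed mid-write.
async function runChunks(chunks, signal, processChunk) {
  const done = [];
  for (const chunk of chunks) {
    if (signal.aborted) {
      return { aborted : true, done };
    }
    done.push(await processChunk(chunk));
  }
  return { aborted : false, done };
}

// Standalone demo: an AbortController stands in for the runtime's signal.
const controller = new AbortController();
controller.abort();

runChunks([1, 2, 3], controller.signal, async (c) => c * 2)
  .then((result) => console.log(result)); // { aborted: true, done: [] }
```

The same pattern applies inside a `start` hook: poll `ctx.abortSignal.aborted` at safe checkpoints and return early rather than relying on a hard kill.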
## Advanced Driver Patterns

### 1. Composable Drivers

Use drivers that delegate to other registered drivers internally.

```javascript
cynq.registerDriver("runner", "composite", (ctx) => ({
  start : async ({ with : w }) => {
    for (const step of w.steps) {
      await ctx.enqueue(step.target, step.payload);
    }
  }
}));
```

### 2. Stateful Deployers

Maintain incremental state via `ctx.storage.kvFacade()`.

```javascript
const kv = ctx.storage.kvFacade("deploy", "artifact");
await kv.put("lastVersion", w.version);
```

## Pipeline Spec Reference
The Pipeline Spec defines how a project builds, tests, deploys, and reacts to events.
Every spec describes sources, steps, environments, and follow-up actions under a unified JSON structure.
## Top-Level Schema

| Key | Type | Description |
|---|---|---|
| `name` | string | Logical pipeline identifier. |
| `triggers` | array<object> | List of trigger definitions (e.g., webhook, cron, manual). |
| `pipeline` | object | Core job structure including `source`, `steps`, and `on`. |
| `env` | object | Static environment variables injected into every driver. |
| `matrix` | object | Optional parameter expansion to run multiple variants. |
| `secrets` | object | Vault-backed secret reference map. |
| `description` | string | Optional pipeline documentation text. |
### Example

```json
{
  "name": "backend-prod",
  "triggers": [
    { "driver": "webhook", "with": { "path": "/github", "port": 3100, "secret": "shared" } }
  ],
  "pipeline": {
    "source": { "driver": "git-clone", "with": { "repo": "https://github.com/org/app.git", "branch": "main" } },
    "steps": [
      { "kind": "run", "name": "build", "runner": "shell-task", "with": { "cmd": "npm ci && npm run build" } },
      { "kind": "deploy", "name": "publish", "deployer": "artifact-uploader", "with": { "url": "https://cdn.example.com/upload", "filePath": "./dist.zip" } }
    ],
    "on": {
      "success": [
        { "kind": "deploy", "deployer": "cynq-enqueue", "with": { "project": "web", "target": "smoke-tests" } }
      ],
      "failure": [
        { "kind": "run", "runner": "shell-task", "with": { "cmd": "bash scripts/rollback.sh" } }
      ]
    }
  },
  "env": {
    "NODE_ENV": "production",
    "REGION": "us-east-1"
  },
  "matrix": {
    "node": ["18", "20"],
    "region": ["us-east-1", "eu-west-1"]
  },
  "secrets": {
    "GITHUB_TOKEN": "vault:deploy.github"
  }
}
```

## triggers[]
Defines what initiates the pipeline.
Each trigger uses a registered driver and optional configuration.
| Key | Type | Description |
|---|---|---|
| `driver` | string | Trigger driver name (e.g., `"webhook"`, `"cron"`, `"manual"`). |
| `with` | object | Parameters specific to the driver. |
| `filter` | object | Optional condition (branch, event type). |
### Example

```json
{ "driver": "webhook", "with": { "path": "/hook", "secret": "abc123" } }
```

## pipeline.source
Defines how to retrieve or prepare the source materials before the build starts.
| Key | Type | Description |
|---|---|---|
| `driver` | string | Source driver name (`"git-clone"`, `"fetch-archive"`, etc.). |
| `with` | object | Source configuration (repository URL, branch, token, etc.). |
| `cache` | boolean | Enables reuse of previous checkouts if unchanged. |
### Example

```json
{ "driver": "git-clone", "with": { "repo": "https://github.com/org/app.git", "branch": "main" } }
```

## pipeline.steps[]
Describes ordered tasks inside the pipeline.
Each step specifies what to run, deploy, or trigger next.
| Key | Type | Description |
|---|---|---|
| `kind` | string | `"run"`, `"deploy"`, `"fetch"`, or `"custom"`. |
| `name` | string | Human-readable identifier for the step. |
| `runner` / `deployer` | string | Driver name used for the step. |
| `with` | object | Configuration passed to the driver. |
| `continueOnError` | boolean | Whether subsequent steps execute after failure. |
| `timeoutMs` | number | Optional timeout per step. |
### Example

```json
{ "kind": "run", "name": "build", "runner": "shell-task", "with": { "cmd": "npm run build" } }
```

## pipeline.on
Defines follow-up actions based on the result of the main pipeline execution.
| Key | Type | Description |
|---|---|---|
| `success` | array<object> | Steps to execute if the pipeline completes successfully. |
| `failure` | array<object> | Steps to execute on any failure. |
| `always` | array<object> | Steps that always run at the end, regardless of outcome. |
### Example

```json
"on": {
  "success": [
    { "kind": "deploy", "deployer": "cynq-enqueue", "with": { "project": "web", "target": "smoke-tests" } }
  ],
  "failure": [
    { "kind": "run", "runner": "shell-task", "with": { "cmd": "bash rollback.sh" } }
  ]
}
```

## env
Defines environment variables that apply globally to all steps.
Values can be overridden per step via the step's own `with.env`.
| Key | Type | Description |
|---|---|---|
| (any) | string | Environment variable name/value pairs. |
### Example

```json
"env": {
  "NODE_ENV": "production",
  "LOG_LEVEL": "debug"
}
```

## matrix
Generates multiple parallel pipeline runs for parameter combinations.
Each key defines an axis with possible values.
| Key | Type | Description |
|---|---|---|
| (axis) | array<string> | Each array defines possible values for that variable. |
### Example

```json
"matrix": {
  "node": ["18", "20"],
  "region": ["us-east-1", "eu-west-1"]
}
```

This expands into four runs:
(18, us-east-1), (18, eu-west-1), (20, us-east-1), (20, eu-west-1).
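The expansion above is a Cartesian product over the matrix axes. A small standalone sketch of that logic (`expandMatrix` is a hypothetical helper for illustration; the engine's internal expansion may differ):

```javascript
// Sketch: expand a matrix like { node: [...], region: [...] }
// into one run object per combination of axis values.
function expandMatrix(matrix) {
  return Object.entries(matrix).reduce(
    (runs, [axis, values]) =>
      runs.flatMap((run) => values.map((value) => ({ ...run, [axis]: value }))),
    [{}]
  );
}

const runs = expandMatrix({ node : ["18", "20"], region : ["us-east-1", "eu-west-1"] });
console.log(runs.length); // 4
```

Each resulting object (e.g. `{ node: "18", region: "us-east-1" }`) parameterizes one parallel pipeline run.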
## secrets

Lists vault references used in the pipeline.
Each secret resolves securely at runtime through the configured vault backend.

| Key | Type | Description |
|---|---|---|
| (secretName) | string | Vault reference in the form `vault:key.path`. |
### Example

```json
"secrets": {
  "GITHUB_TOKEN": "vault:deploy.github",
  "DOCKER_PASSWORD": "vault:docker.pass"
}
```

When the pipeline runs, these values resolve securely at runtime.
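Conceptually, resolution walks the map and swaps each `vault:key.path` reference for the value behind it. A hedged sketch of that idea (`resolveSecrets` and `vaultGet` are illustrative names, not library API; the engine performs this internally):

```javascript
// Sketch: resolve "vault:key.path" references from a secrets map.
function isVaultRef(value) {
  return typeof value === "string" && value.startsWith("vault:");
}

async function resolveSecrets(secrets, vaultGet) {
  const out = {};
  for (const [name, ref] of Object.entries(secrets)) {
    // Non-vault values pass through unchanged.
    out[name] = isVaultRef(ref) ? await vaultGet(ref.slice("vault:".length)) : ref;
  }
  return out;
}

// Demo with an in-memory stand-in for the vault backend.
const fakeVault = async (path) => ({ "deploy.github" : "ghp_demo" }[path]);

resolveSecrets({ GITHUB_TOKEN : "vault:deploy.github", REGION : "us-east-1" }, fakeVault)
  .then((resolved) => console.log(resolved)); // { GITHUB_TOKEN: 'ghp_demo', REGION: 'us-east-1' }
```

Resolved values are injected into the job environment and never written back into the spec.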
## Execution Flow Summary

1. Trigger fires (webhook, manual, cron, etc.)
2. Source driver fetches code or assets
3. Steps execute sequentially or in matrix form
4. Environment and secrets inject automatically
5. On-success/failure handlers run
6. Metrics, logs, and states persist securely through the configured storage backend
Every source, step, and secret becomes structured, reproducible, and composable.
## Cynq Core Methods

| Method | Description | Async |
|---|---|---|
| `deploy(project, target, spec, ctx)` | Execute a pipeline immediately | ✅ |
| `planPipeline(project, spec, ctx)` | Preview a pipeline plan before execution | ✅ |
| `previewPipeline(project, spec, ctx)` | Produce a summarized execution preview | ✅ |
| `validatePipeline(spec)` | Validate a pipeline definition | ✅ |
| `startTriggers(project, spec, ctx)` | Start trigger listeners | ✅ |
| `runOnce(project, target, spec, ctx)` | Run a single queued job manually | ✅ |
| `sync(project, target, spec, ctx)` | Continuously process jobs for a target | ✅ |
| `registerDriver(kind, name, factory)` | Register a custom driver | ✅ |
| `resolve(kind, name, ctx)` | Retrieve a driver instance by name | ✅ |
| `startRoutes(override?)` | Start the route engine | ✅ |
| `stopRoutes()` | Stop the route engine | ✅ |
| `close()` | Stop all routes, triggers, and loops safely | ✅ |
## Graceful Shutdown

```javascript
process.on("SIGINT", async () => {
  await engine.close();
  process.exit(0);
});
```

Stops all routes, trigger listeners, and active synchronization loops cleanly.
## Cynq Route Engine

### Overview

Cynq Route Engine exposes a minimal, secure HTTP interface for remote interaction and job enqueueing.

| Route | Description |
|---|---|
| `POST /enqueue` | Enqueue a new pipeline job remotely |
### Configuration Example

```javascript
engine : {
  routes : {
    enabled : true,
    autoStart : true,
    port : 3333,
    hmacSecret : "super-secret",
    ipAllowlist : ["10.0.0.0/24"],
    maxBytes : "5mb",
    rate : { capacity : 100, refillPerSec : 5 }
  }
}
```

### Security Checklist

- ✅ HMAC or JWT authentication
- ✅ Replay protection (`X-Timestamp`, nonce, short TTL)
- ✅ IP allowlist and CIDR support
- ✅ Content-type and size limits
- ✅ Rate limiting and quotas
- ✅ Idempotency via `Idempotency-Key` header
## Remote Communication

### Remote Enqueue Example

```json
{
  "steps": [
    {
      "kind": "deploy",
      "name": "trigger-next",
      "deployer": "remote-cynq-enqueue",
      "with": {
        "url": "https://cynq-node-b.example.com",
        "project": "frontend",
        "target": "production",
        "auth": "Bearer xyz123",
        "payload": { "version": "1.2.0" }
      }
    }
  ]
}
```

## Cynq Event Reference
Cynq emits structured events for real-time dashboards, external integrations, or telemetry pipelines.
Events follow a consistent envelope:

```json
{
  "event": "cynq:step:ok",
  "project": "frontend",
  "target": "production",
  "timestamp": 1734819100000,
  "data": { /* event-specific payload */ }
}
```

## Core Lifecycle Events
| Event | Description | Payload Fields |
|---|---|---|
| `cynq:trigger:received` | Trigger signal enters queue | `{ trigger, headers, source }` |
| `cynq:pipeline:start` | Pipeline begins execution | `{ name, project, target }` |
| `cynq:pipeline:ok` | Pipeline completes successfully | `{ name, project, target, durationMs }` |
| `cynq:pipeline:fail` | Pipeline fails at any step | `{ name, project, target, reason }` |
| `cynq:sync:start` | Sync loop starts for target | `{ project, target }` |
| `cynq:sync:stop` | Sync loop stops for target | `{ project, target }` |
| `cynq:queue:enqueue` | Job enters queue | `{ project, target, jobId }` |
| `cynq:queue:dequeue` | Job leaves queue for execution | `{ project, target, jobId }` |
| `cynq:queue:done` | Queued job finishes | `{ project, target, jobId, status }` |
## Step-Level Events

| Event | Description | Payload Fields |
|---|---|---|
| `cynq:step:start` | Step execution begins | `{ step, kind, name, driver }` |
| `cynq:step:ok` | Step completes successfully | `{ step, durationMs, driver }` |
| `cynq:step:fail` | Step throws an error or exits non-zero | `{ step, reason, driver }` |
| `cynq:step:retry` | Step retries after transient failure | `{ step, attempt, reason }` |
| `cynq:step:skipped` | Step bypassed due to condition | `{ step, reason }` |
## Deployment Events

| Event | Description | Payload Fields |
|---|---|---|
| `cynq:deploy:start` | Deployment begins | `{ deployer, target, project }` |
| `cynq:deploy:ok` | Deployment completes successfully | `{ deployer, target, durationMs }` |
| `cynq:deploy:fail` | Deployment fails | `{ deployer, target, reason }` |
| `cynq:deploy:chain` | Remote enqueue triggered | `{ deployer, nextProject, nextTarget }` |
## Diagnostic & Audit Events

| Event | Description | Payload Fields |
|---|---|---|
| `cynq:driver:load` | Driver registers successfully | `{ kind, name }` |
| `cynq:driver:error` | Driver registration fails | `{ kind, name, error }` |
| `cynq:storage:lock:acquire` | Lock acquired | `{ key, namespace }` |
| `cynq:storage:lock:release` | Lock released | `{ key, namespace }` |
| `cynq:storage:kv:set` | Key written to store | `{ key, namespace }` |
| `cynq:storage:kv:delete` | Key removed from store | `{ key, namespace }` |
## Metrics Integration

All metric events flow through the same channel.
Metrics describe runtime timing, counts, and performance distribution across pipelines.

| Metric Key | Description |
|---|---|
| `attempt.start` | Marks job start |
| `attempt.ok` | Marks successful completion |
| `attempt.fail` | Marks job failure |
| `step.time.ms` | Measures duration of each step |
| `deploy.time.ms` | Measures deployment duration |
### Example Metric Stream

```json
{
  "event": "metric",
  "metric": "step.time.ms",
  "labels": { "project": "frontend", "step": "build" },
  "value": 5234
}
```

## Realtime Consumption
Consume events from the runtime emitter or integrate IoTide for networked streams:

```javascript
engine.on("event", (evt, data) => {
  console.log("[event]", evt, data);
});

// or with IoTide
engine.realtime.emit("cynq:step:ok", { step : "build", durationMs : 5123 });
```

All events preserve deterministic naming (`cynq:*`) for consistent filtering across observability systems such as Grafana, Prometheus, or custom dashboards.
## Realtime Hook Patterns

Cynq exposes an event stream for live monitoring, logging, and notification systems.
Hooks can run inline, forward to third-party services, or store in custom observability backends.

### Basic Listener

Attach directly to the realtime emitter to react to all events.

```javascript
const { Cynq } = require("@trap_stevo/cynq");

const engine = new Cynq({
  realtime : { emit : (evt, data) => console.log("[event]", evt, data) }
});
```

For targeted subscriptions, filter by prefix:

```javascript
engine.on("cynq:step:ok", (data) => {
  console.log(`[ok] ${data.step} → ${data.durationMs}ms`);
});
```

## Scoped Hook Helpers
Define scoped handlers to simplify observability integration.

```javascript
engine.onPipelineStart = (fn) => engine.on("cynq:pipeline:start", fn);
engine.onPipelineEnd = (fn) => engine.on("cynq:pipeline:ok", fn);
engine.onPipelineFail = (fn) => engine.on("cynq:pipeline:fail", fn);
engine.onStepStart = (fn) => engine.on("cynq:step:start", fn);
engine.onStepEnd = (fn) => engine.on("cynq:step:ok", fn);
engine.onStepFail = (fn) => engine.on("cynq:step:fail", fn);
engine.onDeployStart = (fn) => engine.on("cynq:deploy:start", fn);
engine.onDeployEnd = (fn) => engine.on("cynq:deploy:ok", fn);
engine.onDeployFail = (fn) => engine.on("cynq:deploy:fail", fn);
```

### Example

```javascript
engine.onStepEnd((data) => {
  console.log(`✅ Step "${data.step}" completed in ${data.durationMs}ms`);
});
```

## Slack and Discord Hooks
Integrate with messaging platforms by posting from realtime events.

```javascript
const axios = require("axios");

engine.on("cynq:pipeline:ok", async (data) => {
  await axios.post(process.env.SLACK_WEBHOOK_URL, {
    text : `✅ Pipeline "${data.name}" succeeded in ${data.durationMs}ms`
  });
});

engine.on("cynq:pipeline:fail", async (data) => {
  await axios.post(process.env.DISCORD_WEBHOOK_URL, {
    content : `❌ Pipeline "${data.name}" failed - reason: ${data.reason}`
  });
});
```

## Custom Telemetry Collector
Aggregate step times or pipeline durations for dashboards.

```javascript
const metrics = [];

engine.on("cynq:step:ok", (data) => {
  metrics.push({
    step : data.step,
    duration : data.durationMs,
    ts : Date.now()
  });
});
```

Send data periodically:

```javascript
setInterval(() => {
  if (metrics.length === 0) { return; }
  console.table(metrics);
  metrics.length = 0;
}, 5000);
```

## Chained Reactivity
Forward events into another Cynq instance or any remote listener.

```javascript
engine.on("cynq:deploy:ok", async (data) => {
  await axios.post("https://remote-node.example.com/enqueue", {
    project : "mirror",
    target : "sync",
    payload : data
  }, { headers : { Authorization : "Bearer xyz" } });
});
```

## Pattern Reference
| Pattern | Description | Example |
|---|---|---|
| `engine.on(event, fn)` | Subscribe to a single event | `engine.on("cynq:step:ok", fn)` |
| `engine.once(event, fn)` | Subscribe once, auto-unsubscribe | `engine.once("cynq:pipeline:ok", fn)` |
| `engine.off(event, fn)` | Remove a specific listener | `engine.off("cynq:deploy:start", fn)` |
| `engine.emit(event, data)` | Emit a custom event manually | `engine.emit("custom:event", {...})` |
| `engine.realtime.emit(event, data)` | Broadcast an event across IoTide or a networked backplane | `engine.realtime.emit("cynq:step:ok", data)` |
## Example: Unified Realtime Dashboard Feed

```javascript
engine.on("*", (evt, data) => {
  console.log(`[${new Date().toISOString()}] ${evt}`, data);
});
```

Combined with IoTide, events propagate instantly between distributed nodes for visualization, alerting, and streaming dashboards.
## Production Tip
Always sanitize payloads before broadcasting externally.
Avoid exposing secret paths, credentials, or internal error stacks in event handlers.
Forward only relevant metadata for telemetry or logs.
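The sanitization advice can be sketched as a small recursive filter. The key list and the `sanitize` helper below are illustrative assumptions, not library API; adapt the denylist to your own payloads:

```javascript
// Sketch: strip sensitive-looking fields from an event payload
// before forwarding it to Slack, dashboards, or remote listeners.
const SENSITIVE = ["token", "secret", "password", "auth", "stack"];

function sanitize(payload) {
  if (Array.isArray(payload)) return payload.map(sanitize);
  if (payload && typeof payload === "object") {
    return Object.fromEntries(
      Object.entries(payload)
        .filter(([key]) => !SENSITIVE.some((s) => key.toLowerCase().includes(s)))
        .map(([key, value]) => [key, sanitize(value)])
    );
  }
  return payload;
}

console.log(sanitize({ user : "kay", token : "abc", meta : { stack : "..." } }));
// → { user: 'kay', meta: {} }
```

Run every outbound payload through a filter like this inside your event handlers, keeping only the metadata the downstream consumer actually needs.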
## Example: End-to-End Flow

1. Push event triggers webhook
2. Job enters queue and runs build
3. Deploy step copies files to target path
4. Remote instance receives follow-up trigger
5. Metrics update in real time
6. Route engine awaits next event
## Distributed Mesh Mode
Cynq operates in mesh topology when IoTide connects multiple instances together.
Each node participates in a shared event fabric that synchronizes jobs, telemetry, and triggers across regions or data centers.
### Mesh Overview
| Role | Description |
|---|---|
| Primary Node | Publishes build, deploy, and event data to the mesh |
| Replica Node | Receives mirrored events, executes delegated pipelines |
| Observer Node | Listens only, no execution; ideal for dashboards or metrics aggregation |
All nodes authenticate using IoTide peer configuration and auto-discover each other through shared topics.
### Configuration Example
```javascript
const { Cynq } = require("@trap_stevo/cynq");
const IoTide = require("@trap_stevo/iotide");

// Launch IoTide on each node
const tide = new IoTide(
  3101,
  { useCors : true, useHTTPS : false, tidalCoreOptions : { namespace : "cynq-mesh" } },
  true,
  (socket) => {
    console.log("[mesh] node connected:", socket.tideID);

    // Automatically join the shared mesh channel
    tide.joinChannel(socket.tideID, { roomName : "cynq-mesh", userID : socket.tideID });
  }
);

const engine = new Cynq({
  realtime : {
    emit : (event, data) => tide.emitOnChannel("cynq-mesh", event, data)
  },
  logger : (msg) => console.log("[mesh]", msg)
});

await engine.sync("backend", "prod", spec);
```

Each Cynq instance emits and listens on the same IoTide channel.
When one node triggers a job, others react instantly through IoTide propagation.
### Mesh Synchronization Events

| Event | Description |
|---|---|
| `mesh:node:join` | A new node connects to the mesh |
| `mesh:node:leave` | A node disconnects from the cluster |
| `mesh:job:forward` | Job forwarded to a remote instance |
| `mesh:job:ack` | Remote instance acknowledges receipt |
| `mesh:sync:status` | Status update shared across peers |
### Use Cases
| Scenario | Description |
|---|---|
| Geo-Redundant Deployments | Mirror builds and releases across multiple regions |
| Multi-Tenant Builds | Partition tenants by node for load balancing |
| Failover CI/CD | Standby nodes automatically resume pipelines on downtime |
| Global Observability | Aggregate step and deploy metrics across all Cynq instances |
| Event Broadcasts | Real-time alerts or dashboards updated via distributed emitters |
### Security & Isolation

- ✅ Peer authentication via signed tokens
- ✅ Namespace-scoped topic isolation
- ✅ HMAC validation for message integrity
- ✅ Configurable mesh-level ACLs
- ✅ Built-in replay and tamper protection
### Example: Mirrored Event Relay

```javascript
tide.on("cynq:deploy:ok", (data) => {
  console.log("[replica] received deploy confirmation", data);

  // Optionally enqueue follow-up job on another node
});
```

### Combined Architecture
```
┌─────────────────────────────┐
│         Cynq Node A         │
│ • Receives webhook          │
│ • Runs build & deploy       │
│ • Emits events via IoTide   │
└───────────▲─────────────────┘
            │
    Mesh Fabric (IoTide)
            │
┌───────────▼─────────────────┐
│         Cynq Node B         │
│ • Receives event mirror     │
│ • Performs post-deploy      │
│ • Sends metrics upstream    │
└─────────────────────────────┘
```

### Cluster-Wide Behavior
- Shared event namespace (`cynq:*`)
- Job delegation through mesh forwarding
- Metrics and locks synchronized across nodes
- Vault data kept local per node (no secret broadcast)
- RouteEngine remains independent per instance
### Example: Cross-Node Job Flow

1. Node A receives webhook → triggers build
2. Build completes → emits `cynq:deploy:ok`
3. Node B listens → forwards `remote-cynq-enqueue` job
4. Node B deploys to its environment
5. Cluster updates metrics in real time
### Production Tip

Keep mesh namespaces separate for staging vs production.
Always rotate mesh authentication tokens periodically and monitor node join events.

### Outcome

Distributed Mesh Mode turns independent Cynq instances into a single cooperative CI/CD fabric, enabling horizontally scalable pipelines, real-time telemetry, and autonomous recovery across global environments.
## Secret Management

```javascript
const secrets = engine.storage.secretsFacade("project", "target", { tenantId : "demo" });

await secrets.put("deploy.token", "ghp_ABC123");
const token = await secrets.get("deploy.token");
```

Use vault references inside pipeline specs:

```json
{
  "with": {
    "repo": "https://github.com/org/repo.git",
    "token": "vault:deploy.token"
  }
}
```

## Security Principles
- Immutable deployment logs
- HMAC-protected enqueue endpoints
- Optional JWT-based authentication
- Replay-safe timestamp verification
- Strict content-type enforcement
- Tenant and project isolation
## Installation

```shell
npm install @trap_stevo/cynq
```

## Quick Start
### Minimal Example

```javascript
const { Cynq } = require("@trap_stevo/cynq");

const engine = new Cynq({
  engine : {
    routes : { enabled : true, autoStart : true, port : 3333 }
  }
});

const spec = {
  name : "web-deploy",
  pipeline : {
    source : { driver : "git", with : { repo : "https://github.com/user/app" } },
    steps : [
      { kind : "run", name : "build", runner : "shell", with : { script : "npm run build" } },
      { kind : "deploy", name : "copy", deployer : "copy-folder", with : { from : "./dist", to : "/srv/app" } }
    ]
  }
};

await engine.deploy("myProject", "production", spec, { clientAuth : { token : "xyz" } });
```

## License
See LICENSE.md for full license terms.
Automation Evolved.

From commits to clouds, from triggers to telemetry: one intelligent engine unites it all.
Automate intelligently. Deploy infinitely.