Node Connector

A visual workflow automation platform. Build chains of executable plugins, trigger them via cron, webhook, or terminal, and monitor execution in real-time.

This project started from a single script file that automated repetitive tasks — manual deployments, server health checks, file operations — because I had no access to tools like n8n or a proper CI/CD pipeline. Over time that script grew too large to maintain, so I took inspiration from n8n and rebuilt it as a visual node-based editor. Node Connector is a personal project: although the codebase has been hardened with production-grade security practices, it is not intended as a production platform.

Express.js 5 · React 19 · SQLite · Docker

[Screenshot: the Node Connector editor]

Overview

Node Connector lets you create visual workflows by connecting plugin nodes on a canvas. Each node represents an action (run a script, SSH into a server, upload via FTP, rename files, etc.). Nodes pass data from one to the next, forming an execution chain.

Key Features

Architecture

+---------------------------------------------------+
|                 Docker Container                  |
|                                                   |
|  +----------+    +-------------+   +-----------+  |
|  |  Nginx   |    |     API     |   | Scheduler |  |
|  |   :80    |--->|    :3001    |<--|           |  |
|  |          |    | Express.js  |   | node-cron |  |
|  +----------+    +------+------+   +-----------+  |
|       |                 |                         |
|  Static Files    +------+------+                  |
|  (React build)   |   SQLite    |                  |
|                  |   + JSON    |                  |
|                  +-------------+                  |
+---------------------------------------------------+
        |
   Port 80 (HTTP)

Project Structure

node-connector/
+-- api/                        # Express.js REST API
|   +-- index.js                # Server entry point
|   +-- cli.js                  # CLI execution tool
|   +-- plugins/                # Plugin files (auto-loaded)
|   |   +-- custom-script.js    # Custom Node.js scripts
|   |   +-- ssh.js              # SSH remote commands
|   |   +-- ftp.js              # FTP file upload
|   |   +-- http-request.js     # HTTP/REST API calls
|   |   +-- download-file.js    # Download files from URLs
|   |   +-- send-email.js       # Send emails via SMTP
|   |   +-- read-write-file.js  # Read/write/append files
|   |   +-- rename-file.js      # Move/rename files
|   |   +-- zip.js              # Zip/unzip archives
|   |   +-- json-transform.js   # Transform JSON data
|   |   +-- database-query.js   # SQLite database queries
|   |   +-- if-condition.js     # Conditional branching
|   |   +-- loop.js             # Loop over downstream nodes
|   |   +-- loop-end.js         # Loop body boundary marker
|   |   +-- delay.js            # Wait/delay timer
|   |   +-- linux-terminal.js   # Bash commands
|   |   +-- windows-cmd.js      # CMD commands
|   |   +-- powershell.js       # PowerShell scripts
|   |   +-- example.js          # Template plugin
|   +-- db/                     # SQLite database
|   +-- sheets/                 # Sheet JSON files
|   +-- src/
|       +-- routes/             # auth.js, sheet.js
|       +-- models/             # user.js, plugin.js, execution-history.js
|       +-- middleware/         # JWT, internal key auth
|       +-- executer.js         # Execution engine (multi-input support)
|       +-- node-executer.js    # Node data wrapper
|       +-- sheet-manager.js    # Sheet CRUD + persistence
|       +-- sqlite-manager.js   # Database abstraction
|       +-- tool-loader.js      # Plugin discovery
+-- scheduler/                  # Cron scheduler service
|   +-- index.js                # Entry point
|   +-- src/
|       +-- executor.js         # Job sync + execution
|       +-- logger.js           # Logging
+-- front/                      # React 19 + Vite 6
|   +-- src/
|       +-- App.jsx             # Router
|       +-- components/         # UI components
|       |   +-- sheet/          # SVG editor + node edit modal
|       |   +-- list/           # Sheet list + settings
|       |   +-- log-modal/      # Execution log viewer
|       |   +-- history-modal/  # Execution history viewer
|       |   +-- messagebox/     # Confirmation dialogs
|       +-- services/           # API clients
|       +-- models/             # node.model.js (SVG node rendering)
+-- start.js                    # Cross-platform launcher (no Docker)
+-- start.bat                   # Windows launcher shortcut
+-- start.sh                    # Linux/macOS launcher shortcut
+-- service-install.bat         # Windows auto-start installer
+-- service-uninstall.bat       # Windows auto-start uninstaller
+-- service-install.sh          # Linux auto-start installer (systemd)
+-- service-uninstall.sh        # Linux auto-start uninstaller
+-- docs.html                   # Documentation (this file)
+-- Dockerfile                  # Single-image build
+-- entrypoint.sh               # Container startup
+-- nginx.conf                  # Reverse proxy config
+-- run.bat                     # Windows Docker build/run script

Local Setup (no Docker)

Run Node Connector directly on your machine for full access to host tools (Python, CMD, bash, and any installed program). This is the recommended approach when you need terminal plugins to interact with your system.

Prerequisites

One-Command Start

# Windows — double-click start.bat or run:
start.bat

# Linux / macOS
./start.sh

# Or directly (any platform)
node start.js

The start.js launcher automatically:

  1. Installs dependencies for API, frontend, and scheduler (if node_modules/ is missing)
  2. Builds the frontend (if front/build/ doesn't exist)
  3. Writes config.js pointing the frontend to the API
  4. Starts the API server (which also serves the frontend static files)
  5. Starts the scheduler (after a 3-second delay for API readiness)

Tip: Open http://localhost:3001 in your browser once all services are running. Press Ctrl+C in the terminal to stop everything.

How It Works

Without Docker, there is no Nginx. Instead, the Express API serves double duty — it handles both API routes (/auth, /sheet) and the frontend static files from front/build/. Everything runs on a single port.

+---------------------------------------------+
|                Host Machine                 |
|                                             |
|  +------------------+    +---------------+  |
|  |  API (Express)   |    |   Scheduler   |  |
|  |      :3001       |    |               |  |
|  |  - API routes    |    |   node-cron   |  |
|  |  - Static files  |    +---------------+  |
|  +------------------+                       |
|          |                                  |
|   +------+------+                           |
|   |   SQLite    |  Full access to host:     |
|   |   + JSON    |  Python, CMD, bash, etc.  |
|   +-------------+                           |
+---------------------------------------------+
        |
   Port 3001 (HTTP)

Environment Variables

Set these before running start.js, or use the defaults:

| Variable | Default | Description |
|---|---|---|
| PORT | 3001 | HTTP port for API + frontend |
| JWT_SECRET | random per session | JWT signing secret (required in production) |
| REFRESH_TOKEN_SECRET | random per session | Refresh token secret (required in production) |
| INTERNAL_API_KEY | random per session | Scheduler/CLI-to-API key (required in production) |
| ENCRYPTION_KEY | random per session | AES-256 encryption key for secret params (required in production) |
| CORS_ORIGIN | http://localhost:PORT | Allowed CORS origin (set to your domain in production) |
| DATA_DIR | /data | Base directory for file-operation plugins (path traversal sandbox) |

Rebuilding the Frontend

The launcher builds the frontend only once. To rebuild after frontend code changes:

# Delete the build folder and restart
rm -rf front/build
node start.js

Auto-Start on Boot

To have Node Connector start automatically when the system boots, use the provided service scripts. They require elevated privileges.

Windows (Task Scheduler)

# Install — run as Administrator
service-install.bat

# Uninstall — run as Administrator
service-uninstall.bat

This creates a Windows Task Scheduler task named NodeConnector that runs node start.js at system startup under the SYSTEM account. To start immediately without rebooting:

schtasks /run /tn "NodeConnector"

Linux (systemd)

# Install
sudo ./service-install.sh

# Uninstall
sudo ./service-uninstall.sh

This creates a systemd service named node-connector that starts on boot with automatic restart on failure. Useful commands:

sudo systemctl status node-connector     # Check status
sudo systemctl stop node-connector       # Stop service
sudo systemctl restart node-connector    # Restart service
sudo journalctl -u node-connector -f     # View logs

Docker vs Local: Use Docker for isolated, reproducible deployments. Use the local setup when you need plugins to access host-installed tools (Python, compilers, CLIs) or prefer a simpler development workflow.

Docker Deployment

Build & Run

docker build -t node-connector .
docker run -d -p 80:80 node-connector

With Required Secrets

All secret environment variables are required in production. The container will exit on startup if any are missing.

docker run -d -p 80:80 \
  -e JWT_SECRET=$(openssl rand -hex 32) \
  -e REFRESH_TOKEN_SECRET=$(openssl rand -hex 32) \
  -e INTERNAL_API_KEY=$(openssl rand -hex 32) \
  -e ENCRYPTION_KEY=$(openssl rand -hex 32) \
  node-connector

With Host Volume (for file operations)

# Linux / macOS
docker run -d -p 80:80 -v /path/on/host:/data node-connector

# Windows
docker run -d -p 80:80 -v C:\Users\You\Desktop:/data node-connector
Tip: The Custom Script plugin defaults its working directory to /data. Mount your host directory there so scripts can read/write host files.

Windows Quick Script (run.bat)

@echo off
REM Stops existing containers, rebuilds, and starts fresh
for /f "tokens=*" %%i in ('docker ps -q --filter "ancestor=node-connector"') do docker stop %%i
for /f "tokens=*" %%i in ('docker ps -aq --filter "ancestor=node-connector"') do docker rm %%i
docker build -t node-connector .
docker run -d -p 80:80 --name node-connector node-connector

Container Services

| Service | Port | Description |
|---|---|---|
| Nginx | 80 | Serves frontend, proxies /api/ to API |
| API | 3001 (internal) | Express.js REST API |
| Scheduler | - | Cron job manager, syncs every 30s |

Authentication

The API uses JWT-based authentication with httpOnly cookies. Access tokens expire in 15 minutes and are automatically refreshed using a refresh token cookie (7-day expiry). Tokens are never exposed to JavaScript, preventing XSS-based token theft.

POST /auth/register
Create a new user account.
Body: { "username": "...", "password": "..." }
Response: 201 — { "message": "User created" }
POST /auth/login
Login and receive authentication cookies.
Body: { "username": "...", "password": "..." }
Response: 200 — { "message": "Login successful" }
Cookies Set:
  • token — httpOnly, secure (production), sameSite=strict, 15 min expiry
  • refreshToken — httpOnly, secure (production), sameSite=strict, path=/auth/refresh-token, 7 day expiry
POST /auth/refresh-token
Refresh the access token. The refresh token is read from the cookie automatically.
Auth: refreshToken cookie (sent automatically)
Response: 200 — { "message": "Token refreshed" } (new token cookie set)
POST /auth/logout
Logout and clear authentication cookies.
Auth: token cookie
Response: 200 — { "message": "Logged out" }
GET /auth/verify
Verify the current access token is valid.
Auth: token cookie
Response: 200 — { "valid": true }
GET /auth/profile
Get the authenticated user's profile.
Auth: token cookie
Response: 200 — User object (without password)
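The refresh flow above can be exercised from a client with a small retry wrapper. This is an illustrative sketch, not code from the repository: `fetchWithRefresh` and the injected `fetchImpl` parameter are names invented for the example (in a browser you would pass the global `fetch`).

```javascript
// Sketch of a client helper for the cookie-based auth flow: on a 401,
// hit /auth/refresh-token once, then retry the original request.
// `fetchImpl` is injected so the logic can be tested without a server;
// cookies are httpOnly, so `credentials: "include"` is all the client needs.
async function fetchWithRefresh(url, options, fetchImpl) {
  const res = await fetchImpl(url, { credentials: "include", ...options });
  if (res.status !== 401) return res;

  // Access token expired: refresh once, then retry.
  const refresh = await fetchImpl("/auth/refresh-token", {
    method: "POST",
    credentials: "include",
  });
  if (!refresh.ok) return res; // refresh failed -> surface the original 401
  return fetchImpl(url, { credentials: "include", ...options });
}
```

Because the access token lives in an httpOnly cookie, the wrapper never reads or stores a token itself — it only reacts to status codes.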

Authentication Method

MethodUsed ByDescription
token cookie (httpOnly)FrontendJWT access token, sent automatically with every request
refreshToken cookie (httpOnly)FrontendRefresh token, scoped to /auth/refresh-token path only
X-Internal-Key: <key>Scheduler, CLIInternal API key for service-to-service calls

Sheets

A sheet is a workflow definition containing a graph of connected nodes. Sheet metadata is stored in SQLite; node data is stored as JSON files.

GET /sheet/list
List all sheets for the authenticated user.
Auth: JWT
Response: Array of { id, name, uid, slug, is_active, trigger_type, cron_schedule }
POST /sheet/create
Create a new sheet.
Body: { "name": "My Workflow" }
Response: { uid, name, slug, data: { nodes: [] } }
GET /sheet/get?id=<uid>
Get full sheet with all nodes and connections.
Auth: JWT
PUT /sheet/update
Update sheet name and node structure.
Body: { "sheet": { "uid": "...", "name": "...", "data": { "nodes": [...] } } }
GET /sheet/plugins/list
List all available plugins with metadata, params, and tags.
Auth: JWT

Nodes

Nodes are the building blocks of a workflow. Each node uses a plugin and has configurable parameters, inputs (from upstream nodes), and outputs (to downstream nodes).

Node Structure

{
  "id": "uuid",
  "title": "Node Title",
  "className": "CustomScript",   // Plugin class name
  "position": { "x": 100, "y": 200 },
  "params": {
    "0": { "name": "Script", "alias": "script", "type": "big_string", "default": "...", "value": "..." }
  },
  "inputs": ["node-id-1"],       // IDs of upstream nodes
  "outputs": ["node-id-2"]       // IDs of downstream nodes
}
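Given this structure, connecting two nodes means keeping the `inputs`/`outputs` arrays symmetric. A minimal sketch (the `connectNodes` helper is invented for illustration and is not part of the API):

```javascript
// Connecting nodes: the upstream node lists the downstream id in
// `outputs`, and the downstream node lists the upstream id in `inputs`.
function connectNodes(upstream, downstream) {
  if (!upstream.outputs.includes(downstream.id)) upstream.outputs.push(downstream.id);
  if (!downstream.inputs.includes(upstream.id)) downstream.inputs.push(upstream.id);
}

const a = { id: "node-a", inputs: [], outputs: [] };
const b = { id: "node-b", inputs: [], outputs: [] };
connectNodes(a, b);
// a.outputs now contains "node-b"; b.inputs now contains "node-a"
```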
POST /sheet/node
Add a node to a sheet.
Body: { "sheetId": "...", "title": "...", "pluginId": "CustomScript", "position": {x, y}, "params": {} }
Response: { "id": "new-node-uuid", "message": "created" }
PUT /sheet/node
Update a node's properties (title, params, connections, position).
Body: { "sheetId": "...", "node": { "id": "...", ... } }
PUT /sheet/node/delete
Delete a node and clean up its connections.
Body: { "sheetId": "...", "node": { "id": "..." } }

Execution

Single Node (Interactive)

GET /sheet/node/execute?sheetId=...&nodeId=...&token=...
Execute a node and all its downstream nodes with real-time streaming via SSE.
Auth: JWT via token query param (required for SSE)
Response: text/event-stream with progress events

SSE events have this shape:

{
  "id": "node-uuid",
  "result": { ... },          // Plugin output (after execution)
  "error": false,
  "message": "Done",
  "stage": "executed"          // "executing" | "executed"
}
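On the wire, each event arrives as a `data:` frame terminated by a blank line. A real client would use `EventSource`; this stand-alone parser is only a sketch of the event-stream format:

```javascript
// Parse a text/event-stream buffer into the JSON event objects shown
// above. Events are separated by blank lines; payload lines start with
// "data:". (Illustrative only — browsers handle this via EventSource.)
function parseSSE(buffer) {
  return buffer
    .split("\n\n")
    .filter((frame) => frame.trim())
    .map((frame) =>
      JSON.parse(
        frame
          .split("\n")
          .filter((line) => line.startsWith("data:"))
          .map((line) => line.slice(5).trim())
          .join("")
      )
    );
}
```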

Batch (All Root Nodes)

POST /sheet/execute-batch
Execute all root nodes (nodes with no inputs) and their downstream chains. Used by scheduler, CLI, and webhooks.
Auth: X-Internal-Key
Body: { "sheetUid": "..." }
// Response
{
  "sheetUid": "...",
  "sheetName": "My Workflow",
  "results": [
    {
      "rootNodeId": "...",
      "rootNodeTitle": "Start",
      "status": "success",
      "nodes": [
        { "nodeId": "...", "title": "Custom Script", "result": { ... } }
      ]
    }
  ]
}
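"Root nodes" can be derived directly from a sheet's node list. A sketch of that selection (mirroring what execute-batch does, though the engine's actual code may differ):

```javascript
// A root node is simply a node with no upstream connections.
function findRootNodes(nodes) {
  return nodes.filter((n) => !n.inputs || n.inputs.length === 0);
}

const nodes = [
  { id: "a", inputs: [], outputs: ["b"] },
  { id: "b", inputs: ["a"], outputs: [] },
  { id: "c", inputs: [], outputs: [] },
];
// findRootNodes(nodes) -> the "a" and "c" nodes; each starts its own chain
```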

Webhook Trigger

POST /sheet/webhook/:uid
Trigger execution for a webhook-type sheet. Sheet must be active with trigger_type: "webhook".
Auth: X-Internal-Key

Settings & Triggers

PUT /sheet/settings
Update sheet settings (name, active status, trigger type, cron schedule).
Auth: JWT
Body: { "uid": "...", "name": "...", "is_active": 1, "trigger_type": "cron", "cron_schedule": "*/5 * * * *" }

Trigger Types

| Type | Value | How It Works |
|---|---|---|
| Cron | "cron" | The scheduler automatically executes the sheet on the configured schedule. The cron_schedule field defines timing. |
| Webhook | "webhook" | The sheet is executed when an HTTP POST is sent to /sheet/webhook/{uid}. |
| Terminal | "terminal" | The sheet is executed manually via the CLI tool or from the UI's execute button. |

Cron Schedule Options

| Label | Expression |
|---|---|
| Every 15 seconds | */15 * * * * * |
| Every 30 seconds | */30 * * * * * |
| Every minute | * * * * * |
| Every 3 minutes | */3 * * * * |
| Every 5 minutes | */5 * * * * |
| Every 15 minutes | */15 * * * * |
| Every 30 minutes | */30 * * * * |
| Every 45 minutes | */45 * * * * |
| Every hour | 0 * * * * |
| Daily at 6 AM | 0 6 * * * |
| Daily at 9 AM | 0 9 * * * |
| Daily at 12 PM | 0 12 * * * |
| Daily at 3 PM | 0 15 * * * |
| Daily at 6 PM | 0 18 * * * |
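Note that the two second-level entries use six fields (a leading seconds field, which node-cron supports), while the rest are standard five-field expressions. A quick sketch for telling the two forms apart:

```javascript
// Distinguish node-cron's optional six-field form (leading seconds
// field) from a standard five-field cron expression.
function cronGranularity(expr) {
  const fields = expr.trim().split(/\s+/).length;
  if (fields === 6) return "second-level";
  if (fields === 5) return "minute-level";
  throw new Error(`Not a cron expression: ${expr}`);
}
```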

Plugin System

Plugins are auto-discovered from the api/plugins/ directory. Any .js file exporting a class that extends Plugin is loaded automatically.

Base Plugin Class

const Plugin = require("./../src/models/plugin");

class MyPlugin extends Plugin {
  name()            { return "My Plugin"; }
  description()     { return "What this plugin does"; }
  icon()            { return "🔧"; }
  iconBase64()      { return null; }   // Optional: return "data:image/png;base64,..." for a custom image icon
  tags()            { return ["category"]; }

  paramsDefinition() {
    return [
      {
        name: "Param Name",      // Display label
        alias: "param_alias",    // Key in params object
        type: "string",          // "string" | "big_string" | "number" | "boolean" | "select" | "radio"
        secret: false,           // true = rendered as password field, encrypted at rest
        default: "default value",
        value: undefined
      }
    ];
  }

  async logic(params = {}) {
    // params.param_alias  = this node's configured value
    // params.input        = output from upstream nodes

    return {
      status: { error: false, message: "Success" },
      output: { key: "value" }  // Passed to downstream nodes
    };
  }
}

module.exports = MyPlugin;

Plugin Icons

Plugins support two icon modes: an emoji returned from icon(), or a custom image icon returned from iconBase64() as a "data:image/png;base64,..." string (return null from iconBase64() to use the emoji).

Parameter Types

| Type | UI Rendered As |
|---|---|
| string | Single-line text input |
| big_string | Multi-line textarea (code editor) |
| number | Numeric input |
| boolean | Checkbox |
| select | Dropdown menu. Requires an options array: [{label, value}, ...] |
| radio | Radio button group. Requires an options array: [{label, value}, ...] |

Secret Parameters

Add secret: true to any parameter definition to mark it as sensitive. Secret parameters are:

{ name: "Password", alias: "password", type: "string", secret: true, default: "", value: undefined }

Return Format

Every plugin's logic() must return:

{
  status: {
    error: boolean,    // true if execution failed
    message: string    // Human-readable status message
  },
  output: { ... }      // Data passed to downstream nodes via params.input
}
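A small contract check can catch malformed results when unit-testing a plugin. The validator below is a sketch written for this document, not part of the codebase:

```javascript
// Sanity-check a plugin's logic() return value against the contract
// above: status.error boolean, status.message string, output object.
function validatePluginResult(result) {
  if (typeof result !== "object" || result === null) return ["result must be an object"];
  const errors = [];
  if (typeof result.status?.error !== "boolean") errors.push("status.error must be a boolean");
  if (typeof result.status?.message !== "string") errors.push("status.message must be a string");
  if (typeof result.output !== "object" || result.output === null) errors.push("output must be an object");
  return errors; // empty array = valid
}
```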

Available Plugins

📜 Custom Script script
api/plugins/custom-script.js
Execute custom Node.js code. The script must define an async function main().
| Parameter | Type | Default | Description |
|---|---|---|---|
| script | big_string | Template code | JavaScript code with async function main() |
| working_directory | string | /data | Working directory for script execution (CWD) |

Built-in Helpers

| Name | Type | Description |
|---|---|---|
| input | Object | Output from the previous node, injected as a variable. Access keys directly: input.filepath |
| output(data) | Function | Pass structured data to the next node. Equivalent to console.log(JSON.stringify(data)) |

Example

async function main() {
  const fs = require('fs');

  // Access data from the previous node
  console.log("Received:", input);

  // Read files from mounted volume
  const files = fs.readdirSync('.');

  // Pass structured data to the next node
  output({ files, count: files.length });
}
🔐 SSH network
api/plugins/ssh.js
Execute commands on a remote server via SSH.
| Parameter | Type | Description |
|---|---|---|
| ssh_host | string | Server hostname or IP |
| ssh_username | string | SSH username |
| ssh_password | string (secret) | SSH password (encrypted at rest) |
| ssh_cmd | string | Commands to run (comma-separated for multiple) |
📤 FTP network
api/plugins/ftp.js
Upload files to an FTP server.
| Parameter | Type | Description |
|---|---|---|
| host | string | FTP server hostname |
| username | string | FTP username |
| password | string (secret) | FTP password (encrypted at rest) |
| local_file_path | string | Path to the local file to upload |
| remote_folder_path | string | Remote destination folder (default: /) |

Output: { size: "1.50 MB" }

📝 Rename File io
api/plugins/rename-file.js
Move or rename files on the filesystem. Creates destination directories if needed.
| Parameter | Type | Description |
|---|---|---|
| source_path | string | Current file path (also accepts from input) |
| dest_path | string | New file path (also accepts from input) |

Output: { source_path, dest_path, filename }

🐧 Linux Terminal terminal linux
api/plugins/linux-terminal.js
Execute bash/shell commands on a Linux system. Auto-detects the platform and uses the appropriate shell.
| Parameter | Type | Default | Description |
|---|---|---|---|
| command | big_string | ls -la | Shell command to execute |
| working_directory | string | /data | Working directory (defaults to process.cwd() on Windows) |
| timeout | number | 30000 | Maximum execution time in milliseconds |

Output: { result: "stdout content", exitCode: 0 } — If stdout is valid JSON, it is parsed into the output object.

🪟 Windows CMD terminal windows
api/plugins/windows-cmd.js
Execute CMD commands on a Windows system. Falls back to /bin/bash when running inside a Linux Docker container.
| Parameter | Type | Default | Description |
|---|---|---|---|
| command | big_string | dir | CMD command to execute |
| working_directory | string | /data | Working directory (defaults to process.cwd() on Windows) |
| timeout | number | 30000 | Maximum execution time in milliseconds |

Output: { result: "stdout content", exitCode: 0 } — If stdout is valid JSON, it is parsed into the output object.

🌐 HTTP Request network
api/plugins/http-request.js
Make HTTP/REST API calls (GET, POST, PUT, DELETE) with custom headers, body, and timeout. Automatically parses JSON responses.
| Parameter | Type | Default | Description |
|---|---|---|---|
| url | string | https://httpbin.org/get | The URL to request |
| method | select | GET | HTTP method: GET, POST, PUT, DELETE |
| headers | big_string | {"Content-Type": "application/json"} | Request headers as JSON |
| body | big_string | | Request body (for POST/PUT) |
| timeout | number | 30000 | Timeout in milliseconds |

Output: { statusCode, statusText, body }

⬇️ Download File network io
api/plugins/download-file.js
Download a file from a URL and save it to the local filesystem.
| Parameter | Type | Default | Description |
|---|---|---|---|
| url | string | | URL of the file to download |
| destination | string | | Local path to save the file |
| timeout | number | 60000 | Timeout in milliseconds |

Output: { path, size, statusCode, contentType }

✉️ Send Email notification
api/plugins/send-email.js
Send emails via SMTP. Supports plain text and HTML bodies. Requires nodemailer: run npm install nodemailer in api/.
| Parameter | Type | Default | Description |
|---|---|---|---|
| smtp_host | string | smtp.gmail.com | SMTP server hostname |
| smtp_port | number | 587 | SMTP port (587 for TLS, 465 for SSL) |
| username | string | | SMTP username |
| password | string (secret) | | SMTP password or app password (encrypted at rest) |
| from | string | | Sender email address |
| to | string | | Recipient email address |
| subject | string | | Email subject |
| body | big_string | | Email body content |
| is_html | boolean | false | Send body as HTML |

Output: { messageId, response }

📄 Read/Write File io
api/plugins/read-write-file.js
Read, write, or append content to files on the filesystem. Automatically creates directories for write operations.
| Parameter | Type | Default | Description |
|---|---|---|---|
| operation | select | read | Operation: read, write, or append |
| file_path | string | | Path to the file |
| content | big_string | | Content to write/append |
| encoding | string | utf-8 | File encoding |

Output (read): { content, size, path } | Output (write): { path, size }

📦 Zip / Unzip io
api/plugins/zip.js
Compress files/folders into ZIP archives or extract them. Requires adm-zip: run npm install adm-zip in api/.
| Parameter | Type | Default | Description |
|---|---|---|---|
| operation | radio | zip | Operation: zip or unzip |
| source | string | | Source file or folder path |
| destination | string | | Destination path for the archive or extraction folder |

Output (zip): { path, size } | Output (unzip): { path, entries }

🔄 JSON Transform data
api/plugins/json-transform.js
Parse, transform, and reshape JSON data using JavaScript expressions. The input variable contains data from previous nodes.
| Parameter | Type | Default | Description |
|---|---|---|---|
| transform | big_string | return input; | JavaScript code that receives input and returns transformed data |

Example: return { name: input.first + " " + input.last, count: input.items.length };

🗄️ Database Query data
api/plugins/database-query.js
Execute SQL queries against a SQLite database. Requires better-sqlite3: run npm install better-sqlite3 in api/.
| Parameter | Type | Default | Description |
|---|---|---|---|
| db_path | string | | Path to the SQLite database file |
| query | big_string | SELECT 1 AS result | SQL query to execute |
| params | big_string | [] | Query parameters as a JSON array |

Output (SELECT): { rows: [...], count } | Output (INSERT/UPDATE/DELETE): { changes, lastInsertRowid }

🔀 If Condition flow
api/plugins/if-condition.js
Evaluate a condition and pass or block downstream execution. If the condition is not met, the node turns gray and downstream execution stops silently — downstream nodes are not executed or marked as failed.
| Parameter | Type | Default | Description |
|---|---|---|---|
| value | string | | The value to evaluate (supports {{variable}} syntax) |
| operator | select | equals | Operator: equals, not_equals, contains, not_contains, greater_than, less_than, is_empty, is_not_empty |
| compare_to | string | | Value to compare against |

Output: { passed, value, operator, compareTo }
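The listed operators can be sketched as a single dispatch function. This is an illustration of the semantics, not the plugin's actual code — in particular, the string/number coercion rules here are assumptions:

```javascript
// Evaluate one of the If Condition operators. String comparisons use
// strict equality; greater_than/less_than coerce both sides to numbers.
function evaluate(value, operator, compareTo) {
  switch (operator) {
    case "equals":        return value === compareTo;
    case "not_equals":    return value !== compareTo;
    case "contains":      return String(value).includes(String(compareTo));
    case "not_contains":  return !String(value).includes(String(compareTo));
    case "greater_than":  return Number(value) > Number(compareTo);
    case "less_than":     return Number(value) < Number(compareTo);
    case "is_empty":      return value == null || value === "";
    case "is_not_empty":  return value != null && value !== "";
    default: throw new Error(`Unknown operator: ${operator}`);
  }
}
```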

🔁 Loop flow
api/plugins/loop.js
Re-execute all downstream nodes for each item in a JSON array or a numeric range. Downstream nodes can access {{item}}, {{index}}, and {{total}} via the template variable system.
| Parameter | Type | Default | Description |
|---|---|---|---|
| items | big_string | | A JSON array of items to iterate over (e.g. ["a","b","c"]) |
| count | number | 0 | Number of iterations if no items are provided (creates range 0..N-1) |

Output: { iterations: [...], count } — aggregated results from all downstream nodes across all iterations.
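The per-iteration variables ({{item}}, {{index}}, {{total}}) can be derived from the two params as sketched below. This is a simplified model of the behavior described above, not the plugin's source; the precedence of items over count is an assumption:

```javascript
// Build the iteration contexts for the loop: either one per JSON item,
// or a numeric range 0..count-1 when no items are provided.
function buildIterations(itemsJson, count) {
  const items = itemsJson ? JSON.parse(itemsJson) : null;
  const list = Array.isArray(items) && items.length > 0
    ? items
    : Array.from({ length: count }, (_, i) => i);
  return list.map((item, index) => ({ item, index, total: list.length }));
}
```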

🔚 Loop End flow
api/plugins/loop-end.js
Marks the end of a loop body. Place this node between the looped nodes and the post-loop nodes. Nodes connected after Loop End run only once when all iterations are complete, and receive the aggregated loop results ({{iterations}}, {{count}}).

Parameters: None

Output: { iterations: [...], count } — the aggregated results from the loop, forwarded to post-loop nodes.

⏱️ Delay flow
api/plugins/delay.js
Wait for a specified duration before continuing to the next node. Useful for rate limiting or waiting between API calls.
| Parameter | Type | Default | Description |
|---|---|---|---|
| duration | number | 1000 | Duration to wait in milliseconds |

Output: { duration } — passes through input data unchanged.

💠 PowerShell terminal windows
api/plugins/powershell.js
Execute PowerShell scripts. Uses -EncodedCommand for reliable multi-line script support. Uses powershell.exe on Windows, pwsh on Linux.
| Parameter | Type | Default | Description |
|---|---|---|---|
| script | big_string | Write-Output "Hello World" | PowerShell script to execute |
| working_directory | string | | Working directory for the script |
| timeout | number | 30000 | Timeout in milliseconds |

Output: { result: "stdout content", exitCode: 0 }

🌟 Example Plugin example
api/plugins/example.js
A template/demonstration plugin. Use as a starting point for creating new plugins.

Output: { example: 0 }

Creating a Plugin

To create a new plugin, add a .js file in api/plugins/. It will be auto-discovered on the next API restart.

Step-by-step

  1. Create a new file: api/plugins/my-plugin.js
  2. Extend the Plugin base class
  3. Implement all required methods
  4. Export the class
  5. Restart the API — the plugin appears automatically

Template

const Plugin = require("./../src/models/plugin");

class MyPlugin extends Plugin {
  name()        { return "My Plugin"; }
  description() { return "Describe what it does"; }
  icon()        { return "🚀"; }
  // iconBase64()  { return "data:image/png;base64,..."; }  // Optional: custom image icon
  tags()        { return ["custom"]; }

  paramsDefinition() {
    return [
      { name: "My Param", alias: "my_param", type: "string", default: "", value: undefined },
      { name: "API Key", alias: "api_key", type: "string", secret: true, default: "", value: undefined },  // encrypted at rest, shown as password field
      {
        name: "Method",
        alias: "method",
        type: "select",           // dropdown menu
        default: "GET",
        options: [
          { label: "GET", value: "GET" },
          { label: "POST", value: "POST" }
        ],
        value: undefined
      }
    ];
  }

  async logic(params = {}) {
    const myParam = params.my_param;
    const previousOutput = params.input || {};

    // Your logic here...
    this.log("Processing...");

    return {
      status: { error: false, message: "Done" },
      output: { result: "my output" }
    };
  }
}

module.exports = MyPlugin;

Data Flow Between Nodes

When nodes are connected, the output of one node becomes the input of the next. The execution engine handles this automatically.

Node A executes → result.output stored → merged into params.input → {{variables}} resolved → Node B receives it

Template Variables {{variable}}

Any node parameter can reference output from the previous node using {{variable}} syntax. The execution engine resolves these placeholders before the plugin runs.

Syntax: Use {{key}} in any parameter field. It will be replaced with the value of key from the previous node's output. If the key doesn't exist, the placeholder is left as-is.

Example: Custom Script → Rename File

| Step | Node | Details |
|---|---|---|
| 1 | Custom Script | Outputs: { filepath: "/data/report.txt", newpath: "/data/archive/report.txt" } |
| 2 | Rename File | Source Path: {{filepath}} → resolved to /data/report.txt; Dest Path: {{newpath}} → resolved to /data/archive/report.txt |

How It Works Internally

  1. Node A's logic() returns { status, output: { filepath: "...", newpath: "..." } }
  2. The executer merges output into params.input for Node B
  3. Before calling Node B's logic(), all string params are scanned for {{key}} patterns
  4. Each {{key}} is replaced with the matching value from params.input
  5. Node B receives the resolved values in its params

Variable Resolution Rules

| Input Value Type | Resolution |
|---|---|
| String, Number, Boolean | Converted to string directly |
| Object or Array | JSON stringified |
| Key not found | Placeholder left unchanged: {{unknown}} |
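These resolution rules fit in a few lines. A sketch of the substitution step (the engine's real implementation may differ, e.g. in the allowed key characters):

```javascript
// Resolve {{key}} placeholders against the merged params.input object:
// scalars become strings, objects/arrays are JSON-stringified, and
// unknown keys are left untouched.
function resolveTemplates(str, input) {
  return str.replace(/\{\{(\w+)\}\}/g, (placeholder, key) => {
    if (!(key in input)) return placeholder;           // key not found
    const value = input[key];
    return typeof value === "object" && value !== null
      ? JSON.stringify(value)                          // object / array
      : String(value);                                 // string/number/boolean
  });
}
```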

Multi-Input Nodes

When a node has multiple input connections, the execution engine enforces these rules:

  1. Wait for all inputs — The node will not execute until all upstream input nodes have completed.
  2. Stop on failure — If any input node fails (error or exception), the dependent node is skipped with a "Skipped: a previous node failed" message. This failure propagates to all downstream nodes.
  3. Merge all outputs — Outputs from all input nodes are merged into a single params.input object. If multiple inputs have the same output key, the last one wins (in connection order).
Node A completes → Node B completes → both outputs merged → Node C executes

Important: If Node A fails, Node C (and all its downstream nodes) will be skipped — even if Node B succeeded.
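The fail-fast and merge rules above can be sketched as a single step. This helper is illustrative only — the executer's real code handles the "wait for all inputs" scheduling too, which is omitted here:

```javascript
// Combine the results of all upstream nodes: any failure skips the
// dependent node; otherwise outputs are merged in connection order,
// with later inputs overwriting earlier ones on key collisions.
function mergeInputs(inputResults) {
  if (inputResults.some((r) => r.status.error)) {
    return { skipped: true, reason: "Skipped: a previous node failed" };
  }
  return {
    skipped: false,
    input: Object.assign({}, ...inputResults.map((r) => r.output)),
  };
}
```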

Accessing Input Programmatically

Built-in Plugins (SSH, FTP, Rename File)

Two ways to receive data from previous nodes:

// Option 1: Use {{variable}} in the parameter field in the UI
// Source Path: {{filepath}}
// The value is resolved before logic() is called

// Option 2: Access params.input directly in plugin code
async logic(params = {}) {
  const filePath = params.source_path || params.input.filepath;
}

Custom Script Plugin

The input variable is injected automatically into the script scope:

async function main() {
  // `input` contains the output from the previous node
  console.log("File:", input.filepath);   // "/data/report.txt"
  console.log("Count:", input.count);     // 42

  // Pass structured data to the next node
  output({ processed: true, path: input.filepath });
}

Full Chain Example

// ── Node 1: Custom Script ── "List files"
async function main() {
  const fs = require('fs');
  const files = fs.readdirSync('.');
  output({
    source: "/data/" + files[0],
    dest: "/data/archive/" + files[0]
  });
}
// Output: { source: "/data/report.txt", dest: "/data/archive/report.txt" }


// ── Node 2: Rename File ──
// Source Path field:  {{source}}   →  resolves to "/data/report.txt"
// Dest Path field:    {{dest}}     →  resolves to "/data/archive/report.txt"
// Output: { source_path, dest_path, filename }


// ── Node 3: Custom Script ── "Log result"
async function main() {
  console.log("Moved:", input.filename, "from", input.source_path);
  output({ status: "done" });
}

Execution History

Every non-test execution (cron, webhook, or terminal) is recorded in the execution history. Test executions from the UI editor are not recorded.

GET /sheet/history/:uid
Get execution history for a sheet, ordered by most recent first (limit 100).
Auth: JWT
Response: Array of history records
// History record
{
  "id": 1,
  "sheet_uid": "abc-123",
  "sheet_name": "My Workflow",
  "trigger_type": "cron",         // "cron" | "webhook" | "terminal"
  "status": "success",            // "success" | "error"
  "started_at": "2025-05-01T12:00:00.000Z",
  "duration_ms": 1523,
  "node_count": 4,
  "error_count": 0,
  "results_summary": "{...}"      // JSON string of full results
}

Trigger Type Tracking

| Source | Trigger Type | How It's Set |
|---|---|---|
| Scheduler (cron) | "cron" | Included in the execute-batch request body |
| Webhook endpoint | "webhook" | Set automatically in the webhook handler |
| CLI tool | "terminal" | Included in the execute-batch request body |
| UI Execute button | Not recorded (test execution) | |

UI Access: Click the blue History button in the sheet editor toolbar to view the history modal, with expandable rows showing status, trigger type, timestamp, duration, and node count.

Scheduler

The scheduler is a background service that manages per-sheet cron jobs. It runs inside the Docker container alongside the API.

How It Works

  1. Starts with a 5-second delay (waits for API to be ready)
  2. Every 30 seconds, syncs with the API:
    • Fetches all sheets via /sheet/list-internal
    • Filters for active sheets with trigger_type: "cron"
    • Creates new cron jobs for newly added sheets
    • Removes jobs for deleted or deactivated sheets
    • Updates jobs if the cron schedule changed
  3. Each cron job calls /sheet/execute-batch on its schedule
Note: The scheduler only processes sheets with trigger_type: "cron". Webhook and terminal sheets are ignored.
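The sync step above boils down to a diff between the jobs currently scheduled and the sheets the API reports. A minimal sketch of that diff as a pure function (the scheduler's real code in scheduler/src/executor.js is not shown here; `diffJobs` is an illustrative name):

```javascript
// Illustrative helper for the 30-second sync step: given the currently
// scheduled jobs (uid -> cron schedule) and the latest sheet list from
// /sheet/list-internal, decide which cron jobs to create, remove, or update.
function diffJobs(currentJobs, sheets) {
  // Only active sheets with trigger_type "cron" are eligible.
  const wanted = new Map(
    sheets
      .filter((s) => s.is_active === 1 && s.trigger_type === "cron")
      .map((s) => [s.uid, s.cron_schedule])
  );

  const add = [];     // newly added sheets
  const remove = [];  // deleted or deactivated sheets
  const update = [];  // cron schedule changed

  for (const [uid, schedule] of wanted) {
    if (!currentJobs.has(uid)) add.push(uid);
    else if (currentJobs.get(uid) !== schedule) update.push(uid);
  }
  for (const uid of currentJobs.keys()) {
    if (!wanted.has(uid)) remove.push(uid);
  }
  return { add, remove, update };
}
```

Each uid in `add` and `update` would then get a fresh node-cron job whose callback POSTs to /sheet/execute-batch.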

CLI Tool

Execute sheets manually from the command line. The INTERNAL_API_KEY environment variable must be set.

Usage

# From inside the container
INTERNAL_API_KEY=your_key node cli.js <sheet-uid>

# From host (via docker exec)
docker exec node-connector node cli.js <sheet-uid>

# From host (via curl)
curl -X POST http://localhost/api/sheet/execute-batch \
  -H "Content-Type: application/json" \
  -H "X-Internal-Key: $INTERNAL_API_KEY" \
  -d '{"sheetUid":"<sheet-uid>"}'

Output

Sheet "My Workflow" executed successfully.

Root: Start Node [success]
  -> Custom Script: {"files":["a.txt","b.txt"]}
  -> Rename File: {"source_path":"/data/a.txt","dest_path":"/data/archive/a.txt"}

Frontend

The frontend is a React 19 single-page application built with Vite 6. It provides the visual workflow editor and sheet management interface.

Routes

Path             Component             Access
/login           Login                 Guest only
/register        Register              Guest only
/editor          Editor (sheet list)   Authenticated
/editor?id=uid   Editor (canvas)       Authenticated

Key Components

Component        Description
ListComponent    Displays sheets as cards with status badge, trigger summary, and settings gear button. Includes settings and create modals.
SheetComponent   SVG-based visual editor. Drag plugins from the palette, connect nodes, configure parameters, execute workflows. Supports zoom and pan.
Plugin Palette   Left sidebar listing all plugins, filterable by tag. Supports base64 image icons and emoji icons. Drag to canvas to add nodes.
Node Edit Modal  Opened by double-clicking a node. Shows the node's icon and title in the header with an "Editing" tag. Contains the parameter form and Save, Execute, and Delete buttons.
LogModal         Displays real-time execution logs streamed via SSE.
HistoryModal     Shows execution history for the current sheet. Expandable rows with status, trigger type, timestamp, duration, and node count.
MessageBox       Styled modal dialogs with overlay, type-based icons (info, warning, error, success), and button variants. Used for confirmations, alerts, and notifications; replaces native alert() and confirm().

Sheet Editor Layout

Area             Position       Contents
Left Toolbar     Top-left       Back button (return to sheet list), Play All button (execute all root nodes with live progress), History button (view execution history)
Right Toolbar    Top-right      Grid toggle checkbox, zoom in (+) and zoom out (-) buttons
Plugin Palette   Left sidebar   Tag filter buttons and draggable plugin cards
Canvas           Center         SVG workspace with nodes, connections, pan and zoom
Play All: Finds all root nodes (nodes with no inputs) and executes each chain in parallel. Progress is streamed live in the log modal — each node lights up yellow while executing, then green (success) or red (error).
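The root-finding step maps directly onto the sheet JSON node shape (nodes with `inputs` and `outputs` arrays). A minimal sketch, where `executeChain` is a hypothetical stand-in for the real per-chain execution call:

```javascript
// Root nodes are those with no incoming connections; each one starts an
// independent execution chain.
function findRootNodes(nodes) {
  return nodes.filter((n) => n.inputs.length === 0);
}

// Play All: run every root's chain concurrently, as the editor does.
async function playAll(nodes, executeChain) {
  const roots = findRootNodes(nodes);
  return Promise.all(roots.map((root) => executeChain(root)));
}
```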

Database Schema

The application uses SQLite for metadata and JSON files for sheet node data.

Users Table

CREATE TABLE users (
  id            INTEGER PRIMARY KEY AUTOINCREMENT,
  username      TEXT UNIQUE NOT NULL,
  password      TEXT NOT NULL,       -- bcrypt hashed
  refreshToken  TEXT
);

Sheets Table

CREATE TABLE sheets (
  id             INTEGER PRIMARY KEY AUTOINCREMENT,
  name           TEXT,
  uid            TEXT,                -- UUID
  slug           TEXT,                -- URL-safe name
  is_active      INTEGER DEFAULT 1,  -- 1 = active, 0 = inactive
  trigger_type   TEXT DEFAULT 'cron', -- 'cron' | 'webhook' | 'terminal'
  cron_schedule  TEXT DEFAULT '0 * * * *'
);

Execution History Table

CREATE TABLE execution_history (
  id              INTEGER PRIMARY KEY AUTOINCREMENT,
  sheet_uid       TEXT NOT NULL,
  sheet_name      TEXT,
  trigger_type    TEXT,              -- 'cron' | 'webhook' | 'terminal'
  status          TEXT,              -- 'success' | 'error'
  started_at      TEXT,              -- ISO 8601 timestamp
  duration_ms     INTEGER,
  node_count      INTEGER,
  error_count     INTEGER DEFAULT 0,
  results_summary TEXT               -- JSON string of full results
);
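Writing a row into this table is a straightforward parameterized insert. A hedged sketch, where `db.run` stands in for whatever SQLite wrapper the API uses (the real logic lives in execution-history.js and sqlite-manager.js, not shown here):

```javascript
// Illustrative: record one execution as a history row. `results` is assumed
// to be an array of per-node outcomes, each with a `status` field.
function recordExecution(db, sheet, trigger, results, startedAt) {
  const errorCount = results.filter((r) => r.status === "error").length;
  return db.run(
    `INSERT INTO execution_history
       (sheet_uid, sheet_name, trigger_type, status, started_at,
        duration_ms, node_count, error_count, results_summary)
     VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)`,
    [
      sheet.uid,
      sheet.name,
      trigger,                                   // 'cron' | 'webhook' | 'terminal'
      errorCount > 0 ? "error" : "success",
      startedAt.toISOString(),                   // ISO 8601 timestamp
      Date.now() - startedAt.getTime(),          // duration_ms
      results.length,                            // node_count
      errorCount,
      JSON.stringify(results),                   // results_summary
    ]
  );
}
```

Using placeholders (`?`) rather than string concatenation is also what keeps this insert safe from SQL injection.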

Sheet JSON Files

Stored at api/sheets/{uid}.json:

{
  "id": -1,
  "uid": "uuid",
  "name": "Sheet Name",
  "slug": "sheet-name",
  "data": {
    "nodes": [
      { "id": "...", "title": "...", "className": "...", "params": {...}, "inputs": [], "outputs": [] }
    ]
  }
}

Environment Variables

Variable              Default                Description
PORT                  3001                   API server port (internal)
DB_PATH               ./db/sheets.db         SQLite database file path
JWT_SECRET            (required)             Secret for signing JWT access tokens (see note below)
REFRESH_TOKEN_SECRET  (required)             Secret for signing refresh tokens (see note below)
INTERNAL_API_KEY      (required)             Key for scheduler/CLI-to-API communication (see note below)
ENCRYPTION_KEY        (required)             Key for AES-256-GCM encryption of secret plugin parameters (see note below)
CORS_ORIGIN           http://localhost:PORT  Allowed CORS origin. Set to your domain in production (e.g. https://example.com)
API_BASE_URL          http://127.0.0.1:3001  API URL used by the scheduler
LOG_LEVEL             info                   Logging verbosity
DATA_DIR              /data                  Base directory for file-operation plugins. All file paths are confined to this directory to prevent path traversal
Important: All secret variables (JWT_SECRET, REFRESH_TOKEN_SECRET, INTERNAL_API_KEY, ENCRYPTION_KEY) are required in production (NODE_ENV=production). The server will refuse to start if any are missing. In development, random secrets are generated per session with a console warning.

Generate strong secrets with:

node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"

Security

Authentication

HTTP Security

Rate Limiting

Scope                Window       Max Requests   Description
Global (all routes)  15 minutes   100            General protection against abuse
/auth/login          15 minutes   10             Brute-force login protection
/auth/register       1 hour       5              Prevents mass account creation

Rate limit status is returned via standard headers (RateLimit-Policy, RateLimit-Remaining, RateLimit-Reset). When exceeded, the API responds with 429 Too Many Requests.
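The limits in the table can be expressed with the express-rate-limit middleware; the sketch below is illustrative only, the project's actual middleware wiring is not shown:

```javascript
const express = require("express");
const rateLimit = require("express-rate-limit");

const app = express();

// Global limiter: 100 requests per 15-minute window, with standard
// RateLimit-* response headers enabled.
app.use(rateLimit({ windowMs: 15 * 60 * 1000, max: 100, standardHeaders: true }));

// Stricter per-route limiters for the authentication endpoints.
app.use("/auth/login", rateLimit({ windowMs: 15 * 60 * 1000, max: 10 }));
app.use("/auth/register", rateLimit({ windowMs: 60 * 60 * 1000, max: 5 }));
```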

Credential Encryption

SSH Command Safety

Custom Script Sandboxing

Secure Temp Files

Path Traversal Protection

SQL Injection Protection

Input Validation

Password Policy

Content Security Policy

Error Message Sanitization

Client-Side Form Validation

Audit Logging