Compare commits

...

178 Commits

Author SHA1 Message Date
n8n-assistant[bot]
acb9bab175
🚀 Release 1.123.41 (#30009)
Co-authored-by: konstantintieber <46342664+konstantintieber@users.noreply.github.com>
2026-05-07 13:21:47 +00:00
n8n-assistant[bot]
73539f4740
fix(core): Simple-git update broke https connection (backport to 1.x) (#30005)
Co-authored-by: Konstantin Tieber <46342664+konstantintieber@users.noreply.github.com>
2026-05-07 12:48:34 +00:00
n8n-assistant[bot]
2b425cd612
🚀 Release 1.123.40 (#29949)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-05-07 08:29:26 +03:00
n8n-assistant[bot]
1bb7d110e5
fix(core): Allow GIT_SSH_COMMAND in simple-git after 3.36.0 upgrade (backport to 1.x) (#29947)
Co-authored-by: Daria <daria.staferova@n8n.io>
2026-05-07 07:56:47 +03:00
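For context on the simple-git fixes above: the 3.36.0 upgrade apparently restricted which environment variables reach the spawned `git` process, so variables like `GIT_SSH_COMMAND` had to be explicitly re-allowed. A minimal sketch of the env-allowlisting idea — `buildGitEnv` and the allowlist contents are illustrative assumptions, not n8n's actual code:

```typescript
// Hypothetical sketch: forward only an allowlist of environment variables
// to a child git process, instead of inheriting the full parent environment.
const GIT_ENV_ALLOWLIST = ["PATH", "HOME", "GIT_SSH_COMMAND"];

function buildGitEnv(parentEnv: NodeJS.ProcessEnv): NodeJS.ProcessEnv {
  const env: NodeJS.ProcessEnv = {};
  for (const key of GIT_ENV_ALLOWLIST) {
    // Copy only allowlisted variables that are actually set.
    if (parentEnv[key] !== undefined) env[key] = parentEnv[key];
  }
  return env;
}
```

The allowlist shape keeps secrets out of the child process by default while still letting SSH configuration through.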
n8n-assistant[bot]
aec110f198
fix(Snowflake Node): Fix issue with Insert and Update operations not working (backport to 1.x) (#29812)
Co-authored-by: Jon <jonathan.bennetts@gmail.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-06 16:03:40 +00:00
n8n-assistant[bot]
6c8536ecf3
🚀 Release 1.123.39 (#29867)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-05-06 10:16:35 +00:00
n8n-assistant[bot]
0d62a137eb
chore: Bump simple-git to 3.36.0 (backport to 1.x) (#29837)
Co-authored-by: Matsu <huhta.matias@gmail.com>
2026-05-06 06:44:14 +00:00
Konstantin Tieber
db3b57b040
feat(core): Add flag to import workflow cli to activate workflow on import (#29341)
2026-05-05 11:25:41 +03:00
Ali Elkhateeb
77eb53363d
fix(core): Add timeout to external secrets provider update to prevent startup hang (#29682)
2026-05-04 15:10:34 +02:00
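The fix above bounds the external-secrets provider update so a hung provider cannot stall startup. A generic sketch of that pattern via `Promise.race` — the helper name and timeout value are illustrative, not n8n's implementation:

```typescript
// Hypothetical sketch: race a promise against a timer so a hung call fails fast
// instead of blocking startup indefinitely.
async function withTimeout<T>(promise: Promise<T>, ms: number, label: string): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`${label} timed out after ${ms}ms`)), ms);
  });
  try {
    return await Promise.race([promise, timeout]);
  } finally {
    clearTimeout(timer); // always release the timer, win or lose
  }
}
```

A caller would wrap the provider call, e.g. `await withTimeout(provider.update(), 10_000, "external secrets update")`, and treat the timeout error as a non-fatal startup warning.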
n8n-assistant[bot]
f8845745a6
🚀 Release 1.123.38 (#29514)
Co-authored-by: mfsiega <93014743+mfsiega@users.noreply.github.com>
2026-04-29 13:55:01 +00:00
n8n-assistant[bot]
6aaf436435
feat(core): Add --include and --exclude flags to import:credentials command (backport to 1.x) (#29465)
Co-authored-by: Ali Elkhateeb <ali.elkhateeb@n8n.io>
2026-04-29 12:18:57 +00:00
n8n-assistant[bot]
4af49f1d9e
fix(core): Fix code node executions hanging when idle timer overlaps with task acceptance (backport to 1.x) (#29393)
Co-authored-by: Tomi Turtiainen <10324676+tomi@users.noreply.github.com>
2026-04-29 13:53:00 +03:00
n8n-assistant[bot]
271af23ef3
fix(core): Fix task runner hanging when connection attempt fails (backport to 1.x) (#29441)
Co-authored-by: Tomi Turtiainen <10324676+tomi@users.noreply.github.com>
2026-04-29 12:39:32 +03:00
Matsu
d67c7144f9
chore: Remove unused create-patch workflow and clean up test-e2e-reusable (#29439) 2026-04-29 09:05:08 +01:00
n8n-assistant[bot]
55cad3babb
feat(core): Add --projectId filter to export:workflow and export:credentials commands (backport to 1.x) (#29373)
Co-authored-by: Ali Elkhateeb <ali.elkhateeb@n8n.io>
2026-04-28 15:39:56 +03:00
n8n-assistant[bot]
880e40cde6
ci: Clean up Template Injection surface in Actions (backport to 1.x) (#29367)
Co-authored-by: Matsu <huhta.matias@gmail.com>
2026-04-28 09:10:37 +00:00
n8n-assistant[bot]
eb752033f2
🚀 Release 1.123.37 (#29104)
Co-authored-by: konstantintieber <46342664+konstantintieber@users.noreply.github.com>
2026-04-24 15:25:02 +00:00
Konstantin Tieber
31f55085e2
fix(core): Fix InstanceSettings.isMultiMain still returning false for multi-main cli command (#29101) 2026-04-24 14:54:11 +00:00
n8n-assistant[bot]
f4e941d394
🚀 Release 1.123.36 (#29083)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-24 11:45:21 +00:00
Konstantin Tieber
1d6791179f
fix(core): Workflow import cli doesn't deregister crons for deactivated workflows (multi-main only) (#29079) 2026-04-24 13:21:21 +02:00
n8n-assistant[bot]
f193f3133d
🚀 Release 1.123.35 (#29040)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-24 07:00:56 +00:00
Declan Carroll
5ce4b5d46c
fix: Fix critical dependency vulnerabilities and build errors on 1.x (#29026)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-24 07:51:11 +03:00
n8n-assistant[bot]
3d5cde8579
🚀 Release 1.123.34 (#28956)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-23 12:04:16 +03:00
aikido-autofix[bot]
caa65d8e9b
fix: Fix 50 critical issues in handlebars, lodash, @microsoft/api-extractor and 20 more (#28927)
Co-authored-by: aikido-autofix[bot] <119856028+aikido-autofix[bot]@users.noreply.github.com>
Co-authored-by: Matsuuu <huhta.matias@gmail.com>
Co-authored-by: Declan Carroll <declan@n8n.io>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-23 10:22:46 +03:00
n8n-assistant[bot]
f07ca0d5a0
🚀 Release 1.123.33 (#28879)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-22 09:55:06 +00:00
n8n-assistant[bot]
e67b44b0e0
fix(core): Enforce credential access checks in dynamic node parameter requests (backport to 1.x) (#28862)
Co-authored-by: Stephen Wright <sjw948@gmail.com>
2026-04-22 10:33:07 +01:00
n8n-assistant[bot]
90cb7227cf
🚀 Release 1.123.32 (#28847)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-22 06:16:48 +00:00
n8n-assistant[bot]
0e626768ed
chore: Bundle 1.x (#28845)
Co-authored-by: Matsu <matias.huhta@n8n.io>
Co-authored-by: n8n-assistant[bot] <100856346+n8n-assistant[bot]@users.noreply.github.com>
Co-authored-by: Dawid Myslak <dawid.myslak@gmail.com>
Co-authored-by: Dimitri Lavrenük <20122620+dlavrenuek@users.noreply.github.com>
Co-authored-by: Danny Martini <danny@n8n.io>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: RomanDavydchuk <roman.davydchuk@n8n.io>
Co-authored-by: Milorad FIlipović <milorad@n8n.io>
Co-authored-by: Iván Ovejero <ivov.src@gmail.com>
2026-04-22 08:42:55 +03:00
n8n-assistant[bot]
a6b3e819bb
fix(core): Preserve NODE_PATH for globally installed npm packages in Docker (backport to 1.x) (#28781)
Co-authored-by: Declan Carroll <declan@n8n.io>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Matsuuu <huhta.matias@gmail.com>
2026-04-21 14:13:40 +03:00
n8n-assistant[bot]
c4b79637b7
ci: Allow only bundles to 1.x (backport to 1.x) (#28696)
Co-authored-by: Matsu <huhta.matias@gmail.com>
2026-04-20 16:08:16 +03:00
n8n-assistant[bot]
e7d95055d1
🚀 Release 1.123.31 (#28508)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-15 07:32:55 +00:00
n8n-assistant[bot]
6bd24636ee
ci: Account for pnpm-workspace changes in bump-versions.mjs (backport to 1.x) (#28506)
Co-authored-by: Matsu <huhta.matias@gmail.com>
2026-04-15 10:08:52 +03:00
n8n-assistant[bot]
808bc6a469
chore: Bump axios to 1.15.0 (backport to 1.x) (#28466)
Co-authored-by: Matsu <huhta.matias@gmail.com>
2026-04-14 14:48:28 +03:00
n8n-assistant[bot]
bf646761eb
ci: Add security publish fix workflow for 1.x branch (backport to 1.x) (#28403)
Co-authored-by: Matsu <huhta.matias@gmail.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-13 14:21:26 +03:00
n8n-assistant[bot]
e633500f18
🚀 Release 1.123.30 (#28237)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-09 09:04:09 +00:00
n8n-assistant[bot]
506cd5858e
feat: Environment var to disable forms pages sandboxing (backport to 1.x) (#28158)
Co-authored-by: Michael Kret <88898367+michael-radency@users.noreply.github.com>
Co-authored-by: Michael Kret <michael.k@radency.com>
2026-04-08 13:04:49 +03:00
n8n-assistant[bot]
302186db5d
🚀 Release 1.123.29 (#28154)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-08 06:05:58 +00:00
n8n-assistant[bot]
27cbfbfc92
ci: Install script dependencies before detecting new packages (backport to 1.x) (#28114)
Co-authored-by: Matsu <huhta.matias@gmail.com>
2026-04-07 16:18:01 +03:00
n8n-assistant[bot]
6e66377074
fix(core): Support reconnecting on Redis failover (backport to 1.x) (#28106)
Co-authored-by: mfsiega <93014743+mfsiega@users.noreply.github.com>
2026-04-07 15:22:04 +03:00
Craig McElroy
1fe07e97b0
fix(core): Restore missing axios request interceptor dropped during 1.x backport (#27842)
2026-04-07 09:59:14 +02:00
Matsu
ea12d022be
ci: Bring .github/scripts up to date in 1.x (#27965)
2026-04-02 18:30:57 +02:00
n8n-assistant[bot]
22d02e5ad6
🚀 Release 1.123.28 (#27960)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-04-02 13:39:30 +03:00
Matsu
581a955e1f
ci: Backport security folder to 1.x (#27958) 2026-04-02 13:35:06 +03:00
Matsu
bf86a98163
ci: Backport .github/ to 1.x (#27154) 2026-04-02 12:40:06 +03:00
n8n-assistant[bot]
f3c0b2c0cb
chore: Update ssh2-sftp-client to 12.1.0 (backport to 1.x) (#27891)
Co-authored-by: Jon <jonathan.bennetts@gmail.com>
2026-04-01 14:12:05 +03:00
n8n-assistant[bot]
48fdd9e947
ci: Pin action to commit SHA and pass secrets via env vars (backport to 1.x) (#27658)
Co-authored-by: Matsu <huhta.matias@gmail.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 13:16:41 +02:00
n8n-assistant[bot]
a4d6a6d2f2
ci: Use track-specific npm dist-tags on publish (backport to 1.x) (#27616)
Co-authored-by: Matsu <huhta.matias@gmail.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-27 13:03:31 +02:00
n8n-assistant[bot]
d9924ab38f
🚀 Release 1.123.27 (#27558)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-03-25 14:58:58 +02:00
n8n-assistant[bot]
88264ed350
chore: Bundle 2026-W9 (backport to 1.x) (#27538)
Co-authored-by: n8n-assistant[bot] <100856346+n8n-assistant[bot]@users.noreply.github.com>
Co-authored-by: Matsu <matias.huhta@n8n.io>
Co-authored-by: Dimitri Lavrenük <20122620+dlavrenuek@users.noreply.github.com>
Co-authored-by: Charlie Kolb <charlie@n8n.io>
Co-authored-by: RomanDavydchuk <roman.davydchuk@n8n.io>
Co-authored-by: Jaakko Husso <jaakko@n8n.io>
Co-authored-by: Dawid Myslak <dawid.myslak@gmail.com>
Co-authored-by: Svetoslav Dekov <svetoslav.dekov@n8n.io>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Guillaume Jacquart <jacquart.guillaume@gmail.com>
Co-authored-by: Sandra Zollner <sandra.zollner@n8n.io>
Co-authored-by: Milorad FIlipović <milorad@n8n.io>
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
Co-authored-by: Ricardo Espinoza <ricardo@n8n.io>
2026-03-25 14:27:48 +02:00
n8n-assistant[bot]
a486719f15
🚀 Release 1.123.26 (#27254)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-03-19 09:31:36 +02:00
Konstantin Tieber
13de8dfccc
fix(core): Old version of workflow still active after pulling [1.x] (#27017) 2026-03-18 12:15:14 +01:00
aikido-autofix[bot]
5504844633
fix: Fix 16 security issues in hono, simple-git, multer and 5 more (#27025)
Co-authored-by: aikido-autofix[bot] <119856028+aikido-autofix[bot]@users.noreply.github.com>
2026-03-14 12:07:26 +00:00
n8n-assistant[bot]
7c8ff45509
🚀 Release 1.123.25 (#27002)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-03-13 12:54:51 +00:00
n8n-assistant[bot]
86191fee1c
fix(Form Node): Improve custom CSS sanitization (backport to 1.x) (#26668)
Co-authored-by: Dawid Myslak <dawid.myslak@gmail.com>
2026-03-13 14:33:33 +02:00
n8n-assistant[bot]
fe6b0a8b36
🚀 Release 1.123.24 (#26879)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-03-11 10:14:28 +00:00
n8n-assistant[bot]
ebf7721a59
fix(core): Fix entity import failing in Kubernetes due to ZIP self-inclusion and local header size placeholders (backport to 1.x) (#26823)
Co-authored-by: Ahsan Virani <ahsan.virani@gmail.com>
2026-03-10 13:50:43 +01:00
n8n-assistant[bot]
4849d95b4b
fix(Form Node): Improve form rendering consistency (backport to 1.x) (#26656)
Co-authored-by: Dawid Myslak <dawid.myslak@gmail.com>
2026-03-06 15:09:59 +01:00
Matsuuu
643e50524d
Merge tag 'n8n@1.123.23' into 1.x 2026-03-04 14:30:59 +02:00
Matsu
d60437662b
ci: Pin Trivy binary version to fix yanked release (#26431) (#26519)
Co-authored-by: Declan Carroll <declan@n8n.io>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-04 09:13:34 +00:00
n8n-assistant[bot]
cd3bdce623
🚀 Release 1.123.23 (#26514)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-03-04 10:19:35 +02:00
Declan Carroll
10aa98fceb
fix(editor): Replace jsonpath with jsonpath-plus to resolve CVE (#26408)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 11:34:26 +00:00
aikido-autofix[bot]
9b1d4564e7
fix: Fix 14 security issues in jsonpath, mailparser, mysql2 and 9 more (#26363)
Co-authored-by: aikido-autofix[bot] <119856028+aikido-autofix[bot]@users.noreply.github.com>
2026-03-01 16:28:07 +00:00
Declan Carroll
aef8c80491
fix: Backport transitive dependency bumps to 1.x (#26260)
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 16:56:54 +00:00
Matsuuu
b8accfedbb
Merge tag 'n8n@1.123.22' into 1.x 2026-02-25 13:55:01 +02:00
n8n-assistant[bot]
49d7e16028
🚀 Release 1.123.22 (#26235)
Co-authored-by: Matsuuu <16068444+Matsuuu@users.noreply.github.com>
2026-02-25 13:16:27 +02:00
Irénée
0a4d66685a
chore(core): Add more tests (#26237) 2026-02-25 10:55:30 +00:00
Matsu
1479aab2d3
chore: Backport Bundle (#26218)
Signed-off-by: Oleg Ivaniv <me@olegivaniv.com>
Co-authored-by: n8n-assistant[bot] <100856346+n8n-assistant[bot]@users.noreply.github.com>
Co-authored-by: Tomi Turtiainen <10324676+tomi@users.noreply.github.com>
Co-authored-by: yehorkardash <yehor.kardash@n8n.io>
Co-authored-by: James Gee <1285296+geemanjs@users.noreply.github.com>
Co-authored-by: Iván Ovejero <ivov.src@gmail.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Stephen Wright <sjw948@gmail.com>
Co-authored-by: oleg <me@olegivaniv.com>
Co-authored-by: Albert Alises <albert.alises@gmail.com>
Co-authored-by: Danny Martini <danny@n8n.io>
2026-02-25 12:25:54 +02:00
n8n-assistant[bot]
7762bef712
🚀 Release 1.123.21 (#26029)
Co-authored-by: ireneea <20843309+ireneea@users.noreply.github.com>
2026-02-20 08:55:55 +00:00
Tomi Turtiainen
b30ed4c662
fix: Use task runners for AI Transform node (#25917) 2026-02-19 11:34:29 +00:00
Matsu
cebe811fe2
chore: Disable flaky tests on codemirror-lang-sql (#25976) 2026-02-19 12:30:45 +02:00
Tomi Turtiainen
b045eb9b5f
fix(core): Remove --tunnel option after hooks.n8n.cloud shutdown (#25944) 2026-02-18 20:59:35 +02:00
Tomi Turtiainen
e0315d396f
chore: Upgrade vm2 to 3.10.5 (#25942) 2026-02-18 17:07:51 +02:00
Declan Carroll
d1061826e9
fix: Dependency bump backport (#25788) 2026-02-16 08:05:45 +00:00
n8n-assistant[bot]
300f429d9d
🚀 Release 1.123.20 (#25444)
Co-authored-by: CharlieKolb <13814565+CharlieKolb@users.noreply.github.com>
2026-02-06 14:28:55 +01:00
Iván Ovejero
9ccc1888f2
refactor(core): Improve expressions handling (#25436) 2026-02-06 11:10:20 +00:00
Michael Kret
20c4ba9c1a
feat(Kafka Trigger Node): Refactoring and fixes (backport 1.x) (#25424) 2026-02-06 12:42:34 +02:00
github-actions[bot]
9e417b9eaa
fix: Fix status overwrite for donePromise (backport 1.x) (#25416)
Co-authored-by: Michael Kret <88898367+michael-radency@users.noreply.github.com>
2026-02-06 10:26:30 +00:00
n8n-assistant[bot]
db31c46f2a
🚀 Release 1.123.19 (#25423)
Co-authored-by: CharlieKolb <13814565+CharlieKolb@users.noreply.github.com>
2026-02-06 09:16:14 +01:00
Tomi Turtiainen
b5138c9c98 Merge tag 'n8n@1.123.18' into 1.x 2026-02-05 20:40:13 +02:00
github-actions[bot]
c39496eda9
fix(core): Use stricter flags when starting python runner (backport 1.x) (#25157)
Co-authored-by: Tomi Turtiainen <10324676+tomi@users.noreply.github.com>
2026-02-04 13:43:50 +02:00
Dimitri Lavrenük
dba9864e00
fix: Update mime-types to fixed version 3.0.2 (#25148) 2026-02-02 15:01:37 +01:00
n8n-assistant[bot]
ad1023b57e
🚀 Release 1.123.18 (#25047)
Co-authored-by: seemewalkin <38620398+seemewalkin@users.noreply.github.com>
2026-01-29 16:37:40 +00:00
Artem Sorokin
4e5e9ff133
chore: Bump security dependencies (#25039) 2026-01-29 14:36:08 +00:00
Dawid Myslak
c6520e4e87
feat(Zendesk Trigger Node): Add webhook signature verification (#25011) 2026-01-29 10:19:48 +01:00
Ricardo Espinoza
7f36e8e6d8
fix(core): Posthog proxy (no-changelog) (#25002)
Co-authored-by: Milorad FIlipović <milorad@n8n.io>
2026-01-28 17:34:35 +00:00
Benjamin Schroth
6a9eccbfb9
fix(AI Agent Node): Fix toolInput field in intermediateSteps output (#24925) 2026-01-28 15:02:33 +01:00
Iván Ovejero
b00dcd9221
refactor(core): Improve Python runner analyzer (#24980) 2026-01-28 11:48:09 +00:00
mfsiega
70c573c882
fix(Merge Node): Prevent writing files from merge node sql (no-changelog) (#24907)
Co-authored-by: Michael Kret <michael.k@radency.com>
2026-01-27 11:26:48 +00:00
mfsiega
46dd25439c
fix(core): Use fsRealpath instead of resolve to get the real path (no-changelog) (#24905) 2026-01-27 11:01:11 +00:00
mfsiega
8d8681403c
fix(core): Stronger allowed path enforcement for read/write Node (no-changelog) (#24887) 2026-01-27 10:59:19 +01:00
RomanDavydchuk
13ec09b159
fix(Eventbrite Trigger Node): Validate received URL (#24874) 2026-01-26 14:26:33 +00:00
n8n-assistant[bot]
911d3771ce
🚀 Release 1.123.17 (#24780)
Co-authored-by: seemewalkin <38620398+seemewalkin@users.noreply.github.com>
2026-01-23 14:45:43 +01:00
Tomi Turtiainen
740a518bf7 Merge tag 'n8n@1.123.16' into 1.x 2026-01-23 15:05:41 +02:00
Artem Sorokin
264db125ea
ci: Add empty vex.openvex.json for 1.x branch (#24760) 2026-01-23 12:54:17 +01:00
RomanDavydchuk
7860896909
fix(Git Node): Clean up URLs returned from config (#24754) 2026-01-23 11:37:53 +02:00
Artem Sorokin
298c673bcb
ci: Backport .github folder changes into 1.x (#24741) 2026-01-23 08:38:14 +00:00
Artem Sorokin
a8ddcea5f5
ci: Backport .github folder changes into 1.x (#24725)
Co-authored-by: Declan Carroll <declan@n8n.io>
2026-01-22 17:12:03 +01:00
Iván Ovejero
30383d8613
refactor(core): Improve expressions handling (#24688) 2026-01-22 11:06:36 +00:00
Declan Carroll
8ab4492e8c
fix: Unfork @n8n/vm2 (backport to 1.x) (#24597)
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-20 17:21:15 +00:00
n8n-assistant[bot]
61fd8625d7
🚀 Release 1.123.16 (#24468)
Co-authored-by: ireneea <20843309+ireneea@users.noreply.github.com>
2026-01-16 16:57:18 +00:00
Iván Ovejero
d05fc24fc3
refactor(core): Improve expressions handling (#24454) 2026-01-16 17:12:48 +01:00
n8n-assistant[bot]
7c81ee3152
🚀 Release 1.123.15 (#24400)
Co-authored-by: ireneea <20843309+ireneea@users.noreply.github.com>
2026-01-15 16:23:12 +00:00
Irénée
e6737d24a8
fix: Apply source control configuration changes to all multi main instances (#24391) 2026-01-15 15:35:23 +00:00
Dawid Myslak
afe3223255
feat(GitHub Trigger Node): Add automatic webhook signature verification (#24389) 2026-01-15 14:57:20 +00:00
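GitHub signs webhook deliveries with an `X-Hub-Signature-256` header: `sha256=` followed by the hex HMAC-SHA256 of the raw request body, keyed by the webhook secret. A self-contained sketch of that check (the function name is illustrative; this is not n8n's trigger code):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Verify a GitHub-style "X-Hub-Signature-256" header against the raw body.
function verifyGithubSignature(secret: string, rawBody: string, header: string): boolean {
  const expected = "sha256=" + createHmac("sha256", secret).update(rawBody).digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(header);
  // timingSafeEqual throws on length mismatch, so check length first;
  // the constant-time compare prevents timing side channels.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

The same HMAC-over-raw-body pattern underlies the Zendesk and Stripe signature-verification commits elsewhere in this range, differing mainly in header names and encoding.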
n8n-assistant[bot]
019b462d2c
🚀 Release 1.123.14 (#24339)
Co-authored-by: ireneea <20843309+ireneea@users.noreply.github.com>
2026-01-14 16:34:46 +00:00
Dimitri Lavrenük
465209a377
fix: Form trigger and Wait Form mode basic authentication fix for form POST request (#24329)
Co-authored-by: Michael Kret <88898367+michael-radency@users.noreply.github.com>
2026-01-14 16:09:05 +00:00
Irénée
95173c5ecf
fix: Harden Git node parameter handling (#24323)
Co-authored-by: Elias Meire <elias@meire.dev>
Co-authored-by: Michael Kret <88898367+michael-radency@users.noreply.github.com>
2026-01-14 14:43:28 +00:00
Irénée
25f644f7f3
fix(McpClientTool Node): Filter out tool arguments unless explicitly … (#24321)
Co-authored-by: Dimitri Lavrenük <20122620+dlavrenuek@users.noreply.github.com>
2026-01-14 15:23:29 +01:00
Irénée
512f50fa61
fix: Regenerate form webhook ids when pasting workflow data (#24320)
Co-authored-by: Dimitri Lavrenük <20122620+dlavrenuek@users.noreply.github.com>
2026-01-14 14:22:30 +00:00
aikido-autofix[bot]
59ca0a2d9b
fix: Fix security issue in @rudderstack/rudder-sdk-node via major version upgrade from 2.1.4 to 3.0.0 (#24312)
Co-authored-by: aikido-autofix[bot] <119856028+aikido-autofix[bot]@users.noreply.github.com>
Co-authored-by: Nikhil Kuriakose <nikhil.kuriakose@n8n.io>
2026-01-14 15:20:41 +01:00
Iván Ovejero
1d5372ff93
refactor(core): Normalize exception attribute access in Python task runner (#24310) 2026-01-14 12:00:05 +00:00
n8n-assistant[bot]
a49067d6ba
🚀 Release 1.123.13 (#24246)
Co-authored-by: ireneea <20843309+ireneea@users.noreply.github.com>
2026-01-13 15:10:08 +00:00
Tomi Turtiainen
918bdcc286 Merge tag 'n8n@1.123.12' into 1.x 2026-01-13 16:09:13 +02:00
Charlie Kolb
b1b39bee74
fix: Fix CLI import command (#24239)
Co-authored-by: Daria <daria.staferova@n8n.io>
2026-01-13 15:04:34 +01:00
n8n-assistant[bot]
3a3e4c6cc2
🚀 Release 1.123.12 (#24228)
Co-authored-by: ireneea <20843309+ireneea@users.noreply.github.com>
2026-01-13 11:53:32 +00:00
Dawid Myslak
528ad6b982
fix(core): Sanitize filenames for file operations (#24221)
Co-authored-by: Michael Kret <michael.k@radency.com>
2026-01-13 12:18:29 +01:00
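The filename-sanitization fix above guards file operations against path traversal. A common shape for that guard is to resolve the user-supplied path against a base directory and reject anything that escapes it — a hypothetical sketch, not n8n's actual code:

```typescript
import { resolve, sep } from "node:path";

// Hypothetical sketch: resolve a user-supplied path inside a base directory
// and reject results that escape it (e.g. via ".." segments or absolute paths).
function resolveInside(baseDir: string, userPath: string): string {
  const base = resolve(baseDir);
  const target = resolve(base, userPath);
  if (target !== base && !target.startsWith(base + sep)) {
    throw new Error(`Path escapes allowed directory: ${userPath}`);
  }
  return target;
}
```

Comparing resolved absolute paths (rather than scanning the input for `..`) also catches mixed separators and redundant segments; the related `fsRealpath` commit in this range goes a step further by resolving symlinks before the check.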
Iván Ovejero
27383c6d24
refactor(core): Improve expressions handling (#24220) 2026-01-13 10:04:14 +00:00
Tomi Turtiainen
9262607282 Merge tag 'n8n@1.123.11' into 1.x 2026-01-13 11:43:13 +02:00
mfsiega
7c2eb8cbdd
fix(Webhook Node): Use CIDR matching for IP whitelist check (no-changelog) (#24047) 2026-01-08 16:49:24 +01:00
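The CIDR-matching fix above replaces exact-string IP comparison with subnet matching, so entries like `192.168.1.0/24` admit the whole range. Node's built-in `net.BlockList` supports this directly; a sketch (the helper is illustrative and assumes IPv4 `addr/prefix` entries):

```typescript
import { BlockList } from "node:net";

// Sketch: build an IPv4 allowlist checker from CIDR strings like "10.0.0.0/8".
function makeAllowlist(cidrs: string[]): (ip: string) => boolean {
  const list = new BlockList();
  for (const cidr of cidrs) {
    const [net, prefix] = cidr.split("/");
    list.addSubnet(net, Number(prefix), "ipv4"); // register each subnet
  }
  return (ip) => list.check(ip, "ipv4"); // true if ip falls in any subnet
}
```

Despite the name, `BlockList` is just a rule container with a membership test, so it works equally well as an allowlist.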
n8n-assistant[bot]
148236390b
🚀 Release 1.123.11 (#24046)
Co-authored-by: seemewalkin <38620398+seemewalkin@users.noreply.github.com>
2026-01-08 15:33:39 +00:00
Artem Sorokin
45179a2c6f
ci: Sync .github folder from master with v1 tagging (#24023) 2026-01-08 14:35:05 +00:00
Declan Carroll
5dc3e4171b
fix: Bump Validator dependency 1.x (#24027) 2026-01-08 10:41:38 +00:00
Declan Carroll
b1460c7cc1
fix: Backport CVE fixes from master (#23984) 2026-01-07 18:50:14 +00:00
Iván Ovejero
dc3706ae55
refactor(core): Replay expressions handling onto 1.x (#23968)
Co-authored-by: eilonc-pillar <eilon@pillar.security>
2026-01-07 16:33:28 +01:00
Dimitri Lavrenük
b6059a120b
fix(McpClientTool Node): Sanitize MCP tool arguments based on schema (#23979)
Co-authored-by: Thomas B. <thobra@gmail.com>
2026-01-07 14:31:59 +00:00
Artem Sorokin
f632578f8b
ci: Backport GH workflow renames to 1.x branch (#23959)
Co-authored-by: Charlie Kolb <charlie@n8n.io>
2026-01-07 11:41:53 +01:00
Shireen Missi
8ea741a2e3
fix(core): Fix CORS issue in waiting webhook responses (#23861)
Co-authored-by: Shashwat <work.shashwatojha@gmail.com>
Co-authored-by: Michael Kret <michael.k@radency.com>
Co-authored-by: Michael Kret <88898367+michael-radency@users.noreply.github.com>
2026-01-05 16:41:14 +02:00
Cornelius Suermann
abd04226c1
Merge tag 'n8n@1.123.10' into 1.x 2026-01-05 12:28:26 +01:00
Charlie Kolb
57aad1e856
ci: Use App token for release (#23788) 2026-01-02 14:10:54 +01:00
github-actions[bot]
e0cfbdc48b
🚀 Release 1.123.10 (#23753)
Co-authored-by: CharlieKolb <13814565+CharlieKolb@users.noreply.github.com>
2026-01-02 09:18:48 +01:00
Shireen Missi
f7cf22f92c
fix(core): Modify path validation to work cross platforms (#23740) 2025-12-30 16:32:11 +00:00
Shireen Missi
5a3d556ce2
fix(Stripe Trigger Node): Add Stripe signature verification (#23741) 2025-12-30 16:09:00 +00:00
Shireen Missi
cbbd64f0eb
fix: Sharepoint file selection correctly applies filter (#23742)
Co-authored-by: Dimitri Lavrenük <20122620+dlavrenuek@users.noreply.github.com>
2025-12-30 15:48:35 +00:00
github-actions[bot]
6eb2bac670
🚀 Release 1.123.9 (#23564)
Co-authored-by: tomi <10324676+tomi@users.noreply.github.com>
2025-12-23 13:02:00 +02:00
Jaakko Husso
e6313f6364
fix: Improve markdown rendering (#23561) 2025-12-23 12:49:35 +02:00
Iván Ovejero
8a5d4d5746
fix: Improve expression handling (#23560) 2025-12-23 12:38:05 +02:00
github-actions[bot]
aed5416484
🚀 Release 1.123.8 (#23523)
Co-authored-by: ShireenMissi <94372015+ShireenMissi@users.noreply.github.com>
2025-12-22 14:14:12 +00:00
Shireen Missi
97365caf25
fix: Limit access to files based on regex pattern (#23528)
Co-authored-by: Dimitri Lavrenük <20122620+dlavrenuek@users.noreply.github.com>
2025-12-22 13:37:00 +00:00
Declan Carroll
1b5ccd8dee
ci: Sync release workflows and actions from master (#23517)
Co-authored-by: Tomi Turtiainen <10324676+tomi@users.noreply.github.com>
2025-12-22 08:39:47 +00:00
Tomi Turtiainen
ae8097e60e
fix(core): Only resolve the filepath once (#23466)
Co-authored-by: mfsiega <93014743+mfsiega@users.noreply.github.com>
Co-authored-by: cubic-dev-ai[bot] <191113872+cubic-dev-ai[bot]@users.noreply.github.com>
2025-12-19 15:47:53 +02:00
Shireen Missi
00b2b3b463
fix(n8n Form Node): Restores executions status check for waiting forms (#23459)
Co-authored-by: Michael Kret <88898367+michael-radency@users.noreply.github.com>
2025-12-19 12:30:20 +00:00
Iván Ovejero
4900d89650
fix(core): Fix verified community packages reinstall (#23455) 2025-12-19 12:51:27 +01:00
Charlie Kolb
7c66e72450
test: Unflake WorkflowPublish history tests (#23367) 2025-12-18 09:54:45 +01:00
Tomi Turtiainen
1448293d48
🚀 Release 1.123.7 (#23352) 2025-12-17 14:00:00 +01:00
Shireen Missi
5102991310
fix: Only support specified git config keys in Git node (#23346)
Co-authored-by: Dimitri Lavrenük <20122620+dlavrenuek@users.noreply.github.com>
2025-12-17 11:45:32 +00:00
Eugene
e6fe97cb31
fix(core): Error running evaluations in queue mode (#23341) 2025-12-17 12:00:08 +01:00
Artem Sorokin
a03d4efa3b
ci: Auto-tag Dockerhub releases with 1 docker tag (#23247) 2025-12-15 22:26:13 +01:00
Tomi Turtiainen
6f8dccf537 Merge tag 'n8n@1.123.6' into 1.x 2025-12-15 16:27:36 +02:00
Tomi Turtiainen
64bcbf450b
ci: Run tests against release branches in 1.x branch (#23233) 2025-12-15 16:16:41 +02:00
github-actions[bot]
8be76b3c5a
🚀 Release 1.123.6 (#23231)
Co-authored-by: tomi <10324676+tomi@users.noreply.github.com>
2025-12-15 16:03:37 +02:00
Tomi Turtiainen
7f53cbbc6c test: Fix workflows controller tests 2025-12-13 23:12:27 +02:00
Tomi Turtiainen
3730172e36 chore: Add missing field to IWorkflowDb interface 2025-12-13 23:09:33 +02:00
Elias Meire
84fb27aa1d fix: Harden form and trigger response handling (#23061) 2025-12-12 22:37:56 +02:00
Raúl Gómez Morales
bf19e8d9a0 fix(editor): Fix project selector scroll (#22728) 2025-12-12 22:37:56 +02:00
Ricardo Espinoza
db20ecfe51 feat(core): Add breaking change rule for start node deprecation (#23097)
Co-authored-by: Tomi Turtiainen <10324676+tomi@users.noreply.github.com>
2025-12-12 22:37:56 +02:00
Daria
883c409be9 fix: Make sure duplicating workflows creates them as unpublished (no-changelog) (#23113) 2025-12-12 22:37:56 +02:00
Daria
e924f07e62 fix: Backfill missing workflow history records (#23070) 2025-12-12 22:37:56 +02:00
github-actions[bot]
14f70d3416 🚀 Release 1.123.5 (#23037)
Co-authored-by: ivov <44588767+ivov@users.noreply.github.com>
2025-12-12 21:49:23 +02:00
Iván Ovejero
39f17f5fb3 Remove unused import 2025-12-12 21:47:08 +02:00
Declan Carroll
177000bc89 chore: Upgrade launcher to 1.4.2 (#22995) 2025-12-12 21:47:08 +02:00
Daria
3d2193278c fix: Add version history records when importing workflows (#22974) 2025-12-12 21:47:08 +02:00
Iván Ovejero
158a3c35d3 fix(core): Add missing env vars to internal mode (#22965) 2025-12-12 21:47:08 +02:00
github-actions[bot]
d0e3d69c13 🚀 Release 1.123.4 (#22921)
Co-authored-by: ivov <44588767+ivov@users.noreply.github.com>
2025-12-12 21:15:17 +02:00
Iván Ovejero
7bd2b8d617 fix(core): Allowlist HOME env var in JS runner config (#22839) 2025-12-12 21:15:17 +02:00
Iván Ovejero
f191116594 refactor(core): Make Sentry init non-fatal for JS runner (#22800) 2025-12-12 21:15:17 +02:00
github-actions[bot]
098fc046b4 🚀 Release 1.123.3 (#22815)
Co-authored-by: tomi <10324676+tomi@users.noreply.github.com>
2025-12-12 21:15:17 +02:00
Iván Ovejero
173fa0868a fix: Add HOME env var to distroless runners image (#22796) 2025-12-12 21:15:17 +02:00
Guillaume Jacquart
56b43c8b73 fix(core): Hide migration rule issues not relevant to cloud (#22749) 2025-12-12 21:15:17 +02:00
github-actions[bot]
4560a305dd 🚀 Release 1.123.2 (#22733)
Co-authored-by: tomi <10324676+tomi@users.noreply.github.com>
2025-12-12 21:14:51 +02:00
Guillaume Jacquart
9cdc03e049 fix(core): Do not prevent credential save if property has default value (#22720) 2025-12-12 21:14:51 +02:00
Tomi Turtiainen
158afd1d15 fix(core): Fix html header check (#22713) 2025-12-12 21:14:51 +02:00
Guillaume Jacquart
cc380559d3 feat(core): Add toolCode nodes to the pyodide check for v2 migration (#22659) 2025-12-12 21:14:51 +02:00
github-actions[bot]
d51b779ed3 🚀 Release 1.123.1 (#22674)
Co-authored-by: tomi <10324676+tomi@users.noreply.github.com>
2025-12-12 21:14:25 +02:00
Danny Martini
7f27e06b22 fix(core): Prevent execution data from being overwritten on manual workflow resume (#22665)
Co-authored-by: Claude <noreply@anthropic.com>
2025-12-12 21:14:25 +02:00
RomanDavydchuk
0fe86822ef fix(MCP Client Node): Make "Use Dynamic Client Registration" toggle not required (#22645) 2025-12-12 21:14:25 +02:00
Iván Ovejero
86d9ce1ca2
feat(core): Introduce native Python code tool for AI agent (#22657) 2025-12-03 16:25:38 +01:00
Tomi Turtiainen
573a0a34aa chore: Revert "chore: Initial V2 changes (#22553)"
This reverts commit a4757cf009.
2025-12-03 12:51:33 +02:00
Tomi Turtiainen
7de2a7a33b chore: Revert "feat(core): Make chat hub workflows treat activeWorkflowId correctly (#22546)"
This reverts commit d6b9e7c8fb.
2025-12-03 12:51:19 +02:00
685 changed files with 41555 additions and 10151 deletions

114
.github/CI-TELEMETRY.md vendored Normal file

@ -0,0 +1,114 @@
# CI Telemetry
Pipeline: **GitHub Actions → Webhook → n8n → BigQuery**
## Unified Payload Shape
All telemetry uses the same format:
```json
{
"timestamp": "2026-03-16T12:00:00.000Z",
"benchmark_name": "kafka-throughput-10n-10kb",
"git": { "sha": "abc12345", "branch": "master", "pr": null },
"ci": { "runId": "123", "runUrl": "...", "job": "test", "workflow": "CI", "attempt": 1 },
"runner": { "provider": "blacksmith", "cpuCores": 8, "memoryGb": 16.0 },
"metrics": [
{ "metric_name": "exec-per-sec", "value": 12.4, "unit": "exec/s", "dimensions": { "trigger": "kafka", "nodes": 10 } }
]
}
```
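As a rough sketch, a payload in this shape can be assembled from the standard GitHub Actions environment variables. This is illustrative only — `buildPayload` is a hypothetical helper, not the repo's actual implementation; field names follow the example above.

```javascript
import os from 'node:os';

// Hypothetical helper: build a payload in the unified shape from CI env vars.
function buildPayload(benchmarkName, metrics, env = process.env) {
  return {
    timestamp: new Date().toISOString(),
    benchmark_name: benchmarkName,
    git: {
      sha: (env.GITHUB_SHA ?? '').slice(0, 8), // first 8 chars
      branch: env.GITHUB_HEAD_REF ?? env.GITHUB_REF_NAME ?? null,
      pr: null, // PR-number parsing from GITHUB_REF omitted here
    },
    ci: {
      runId: env.GITHUB_RUN_ID ?? null,
      runUrl: env.GITHUB_REPOSITORY
        ? `https://github.com/${env.GITHUB_REPOSITORY}/actions/runs/${env.GITHUB_RUN_ID}`
        : null,
      job: env.GITHUB_JOB ?? null,
      workflow: env.GITHUB_WORKFLOW ?? null,
      attempt: Number(env.GITHUB_RUN_ATTEMPT ?? 1),
    },
    runner: {
      // Provider detection per the "Runner provider logic" section below.
      provider: !env.CI
        ? 'local'
        : env.RUNNER_ENVIRONMENT === 'github-hosted'
          ? 'github'
          : 'blacksmith',
      cpuCores: os.cpus().length,
      memoryGb: os.totalmem() / 1024 ** 3, // bytes → GB
    },
    metrics,
  };
}
```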
## Standard Context Fields
```typescript
git.sha // GITHUB_SHA (first 8 chars)
git.branch // GITHUB_HEAD_REF ?? GITHUB_REF_NAME
git.pr // PR number from GITHUB_REF
ci.runId // GITHUB_RUN_ID
ci.runUrl // https://github.com/<repo>/actions/runs/<runId>
ci.job // GITHUB_JOB
ci.workflow // GITHUB_WORKFLOW
ci.attempt // GITHUB_RUN_ATTEMPT
runner.provider // 'github' | 'blacksmith' | 'local'
runner.cpuCores // os.cpus().length
runner.memoryGb  // os.totalmem() converted from bytes to GB
```
**Runner provider logic:**
```typescript
if (!process.env.CI) return 'local';
if (process.env.RUNNER_ENVIRONMENT === 'github-hosted') return 'github';
return 'blacksmith';
```
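Wrapped as a runnable function (the function name is ours, the logic is as above):

```javascript
// Detect which runner provider the process is on, per the logic above.
function detectRunnerProvider(env = process.env) {
  if (!env.CI) return 'local';
  if (env.RUNNER_ENVIRONMENT === 'github-hosted') return 'github';
  return 'blacksmith';
}
```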
## Implementations
| Telemetry | Source | Metrics |
|-----------|--------|---------|
| Playwright perf/benchmark | `packages/testing/playwright/reporters/metrics-reporter.ts` | Any metric attached via `attachMetric()` |
| Build stats | `.github/scripts/send-build-stats.mjs` | Per-package build duration, cache hit/miss, run total |
| Docker stats | `.github/scripts/send-docker-stats.mjs` | Image size per platform, docker build duration |
| Container stack | `packages/testing/containers/telemetry.ts` | E2E stack startup times per service |
## Secrets
```
QA_METRICS_WEBHOOK_URL
QA_METRICS_WEBHOOK_USER
QA_METRICS_WEBHOOK_PASSWORD
```
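The user/password pair suggests the webhook expects HTTP basic auth. The sketch below only shows how a request could be constructed from these secrets — it is an assumption, not the actual request shape used by `send-metrics.mjs`, and `buildWebhookRequest` is a hypothetical name.

```javascript
// Hypothetical sketch: construct a POST request to the metrics webhook,
// assuming (unverified) that it authenticates via HTTP basic auth.
function buildWebhookRequest(payload, env = process.env) {
  const auth = Buffer.from(
    `${env.QA_METRICS_WEBHOOK_USER}:${env.QA_METRICS_WEBHOOK_PASSWORD}`,
  ).toString('base64');
  return {
    url: env.QA_METRICS_WEBHOOK_URL,
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Basic ${auth}`,
      },
      body: JSON.stringify(payload),
    },
  };
}
```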
## BigQuery Table
`qa_performance_metrics` — schema:
```sql
timestamp TIMESTAMP NOT NULL
benchmark_name STRING
metric_name STRING NOT NULL
value FLOAT64 NOT NULL
unit STRING
dimensions JSON -- {"nodes": 10, "trigger": "kafka", "package": "@n8n/cli"}
git_sha STRING
git_branch STRING
git_pr INT64
ci_run_id STRING
ci_run_url STRING
ci_job STRING
ci_workflow STRING
ci_attempt INT64
runner_provider STRING
runner_cpu_cores INT64
runner_memory_gb FLOAT64
```
Query example:
```sql
-- Build duration trend by package (cache misses only)
SELECT DATE(timestamp), JSON_VALUE(dimensions, '$.package'), AVG(value)
FROM qa_performance_metrics
WHERE metric_name = 'build-duration'
AND JSON_VALUE(dimensions, '$.cache') = 'miss'
GROUP BY 1, 2 ORDER BY 1;
```
## Adding New Telemetry
**From a script:**
```javascript
import { sendMetrics, metric } from './send-metrics.mjs';
await sendMetrics([
metric('my-metric', 42.0, 'ms', { context: 'value' }),
]);
```
**From a Playwright test:**
```typescript
import { attachMetric } from '../utils/performance-helper';
await attachMetric(testInfo, 'my-metric', 42.0, 'ms', { context: 'value' });
```

42
.github/CLAUDE.md vendored Normal file


@ -0,0 +1,42 @@
@../AGENTS.md
## .github Quick Reference
This folder contains n8n's GitHub Actions infrastructure.
### Key Files
| File/Folder | Purpose |
|-------------|---------|
| `WORKFLOWS.md` | Complete CI/CD documentation |
| `workflows/` | GitHub Actions workflows |
| `actions/` | Reusable composite actions |
| `scripts/` | Release & Docker automation |
| `CODEOWNERS` | Team review ownership |
### Workflow Naming
| Prefix | Purpose |
|--------|---------|
| `test-` | Testing (unit, E2E, visual) |
| `ci-` | Continuous integration |
| `util-` | Utilities (notifications) |
| `build-` | Build processes |
| `release-` | Release automation |
| `sec-` | Security scanning |
Reusable workflows: add `-reusable` or `-callable` suffix.
### Common Tasks
**Add workflow:** Create in `workflows/`, document in `WORKFLOWS.md`
**Add script:** Create `.mjs` in `scripts/`, document in `WORKFLOWS.md`
### Reference
See `WORKFLOWS.md` for:
- Architecture diagrams
- Workflow call graph
- Scheduled jobs & triggers
- Runners & secrets

6
.github/CODEOWNERS vendored

@ -1,4 +1,6 @@
packages/@n8n/db/src/migrations/ @n8n-io/migrations-review
.github/workflows @n8n-io/ci-admins
.github/scripts @n8n-io/ci-admins
.github/actions @n8n-io/ci-admins
.github/poutine-rules @n8n-io/ci-admins
# Node popularity data updates
packages/frontend/editor-ui/data/node-popularity.json @n8n-io/catalysts


@ -66,7 +66,7 @@ body:
id: nodejs-version
attributes:
label: Node.js Version
placeholder: ex. 22.16.0
placeholder: ex. 24.0.0
validations:
required: true
- type: dropdown
@ -76,8 +76,6 @@ body:
options:
- SQLite (default)
- PostgreSQL
- MySQL
- MariaDB
default: 0
validations:
required: true

670
.github/WORKFLOWS.md vendored Normal file

@ -0,0 +1,670 @@
# GitHub Actions & CI/CD Documentation
Complete reference for n8n's `.github/` folder.
---
## Folder Structure
```
.github/
├── WORKFLOWS.md # This document
├── CI-TELEMETRY.md # Telemetry & metrics guide
├── CODEOWNERS # Team ownership for PR reviews
├── pull_request_template.md # PR description template
├── pull_request_title_conventions.md # Title format rules (Angular)
├── actionlint.yml # Workflow linter config
├── docker-compose.yml # DB services for local testing
├── test-metrics/
│ └── playwright.json # E2E performance baselines
├── ISSUE_TEMPLATE/
│ ├── config.yml # Routes to community/security
│ └── 01-bug.yml # Structured bug report form
├── scripts/ # Automation scripts
│ ├── bump-versions.mjs # Calculate next version
│ ├── update-changelog.mjs # Generate CHANGELOG
│ ├── trim-fe-packageJson.js # Strip frontend devDeps
│ ├── ensure-provenance-fields.mjs # Add license/author fields
│ ├── validate-docs-links.js # Check documentation URLs
│ ├── send-build-stats.mjs # Turbo build telemetry → webhook
│ └── docker/
│ ├── docker-tags.mjs # Generate image tags
│ └── docker-config.mjs # Build context config
├── actions/ # Custom composite actions
│ ├── setup-nodejs/ # pnpm + Node + Turbo cache
│ └── docker-registry-login/ # GHCR + DockerHub auth
└── workflows/ # GitHub Actions workflows
```
---
## Architecture Overview
```
┌────────────────────────────────────────────────────────────────────────────┐
│ n8n CI/CD ARCHITECTURE │
├────────────────────────────────────────────────────────────────────────────┤
│ │
│ TRIGGERS PIPELINES OUTPUTS │
│ ──────── ───────── ─────── │
│ │
│ ┌──────────┐ ┌──────────────────────────────────┐ ┌────────────┐ │
│ │ PR │───▶│ ci-pull-requests.yml │───▶│ Checks │ │
│ └──────────┘ │ ├─ build + paths-filter │ │ Gate │ │
│ │ ├─ unit-test (reusable) │ └────────────┘ │
│ ┌──────────┐ │ ├─ typecheck │ │
│ │ Push │───▶│ ├─ lint (reusable) │ ┌────────────┐ │
│ │ master │ │ ├─ e2e-tests (reusable) │───▶│ Coverage │ │
│ └──────────┘ │ └─ security (if .github/**) │ └────────────┘ │
│ └──────────────────────────────────┘ │
│ │
│ ┌──────────┐ ┌──────────────────────────────────┐ ┌────────────┐ │
│ │ Merge │───▶│ release-publish.yml │───▶│ NPM │ │
│ │release/* │ │ ├─ publish-to-npm │ ├────────────┤ │
│ └──────────┘ │ ├─ publish-to-docker-hub │───▶│ Docker │ │
│ │ ├─ create-github-release │ ├────────────┤ │
│ │ ├─ create-sentry-release │───▶│ Sentry │ │
│ │ └─ generate-sbom │ ├────────────┤ │
│ └──────────────────────────────────┘───▶│ SBOM │ │
│ └────────────┘ │
│ ┌──────────┐ ┌──────────────────────────────────┐ │
│ │ Schedule │───▶│ Nightly/Weekly Jobs │ ┌────────────┐ │
│ │ (cron) │ │ ├─ docker-build-push (nightly) │───▶│ Images │ │
│ └──────────┘ │ ├─ test-benchmark-nightly │───▶│ Metrics │ │
│ │ ├─ test-workflows-nightly │ └────────────┘ │
│ │ └─ test-e2e-coverage-weekly │ │
│ └──────────────────────────────────┘ │
│ │
└────────────────────────────────────────────────────────────────────────────┘
```
---
## Quick Reference
| Prefix | Purpose |
|------------|-----------------------------------------|
| `test-` | Testing (E2E, unit, visual, benchmarks) |
| `ci-` | Continuous integration |
| `util-` | Utilities (notifications, sync, Claude) |
| `build-` | Build processes |
| `release-` | Release automation |
| `sec-` | Security scanning |
| Other | Docker, SBOM, patch releases |
---
## PR Title Conventions
Commits drive changelog generation. Follow Angular convention:
```
Format: <type>(<scope>): <summary>
Types: feat | fix | perf | test | docs | refactor | build | ci | chore
Scopes: API | benchmark | core | editor | * Node (optional)
Examples:
feat(editor): Add dark mode toggle
fix(Slack Node): Handle rate limiting correctly
perf(core): Optimize workflow execution by 20%
refactor: Migrate to TypeScript strict mode (no-changelog)
Breaking Changes: Add "BREAKING CHANGE:" footer with migration guide
Deprecations: Add "DEPRECATED:" footer with update path
Skip Changelog: Add "(no-changelog)" to PR title
```
See `pull_request_title_conventions.md` for full spec.
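As a rough illustration, the format can be approximated with a regex like the one below. This is only a sketch — the authoritative validation lives in `ci-check-pr-title.yml` and `pull_request_title_conventions.md` and may be stricter (e.g., about allowed scopes).

```javascript
// Illustrative approximation of the Angular-style title format.
// Types are the ones listed above; scope is optional.
const TITLE_RE =
  /^(feat|fix|perf|test|docs|refactor|build|ci|chore)(\([^)]+\))?: .+/;

function looksLikeValidTitle(title) {
  return TITLE_RE.test(title);
}
```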
---
## What Runs When You Open a PR
### Flow Diagram
```
┌──────────────────────────────────────────────────────────────────────────────┐
│ PR OPENED / UPDATED │
└─────────────────────────────────────┬────────────────────────────────────────┘
┌───────────────────────────┴───────────────────────┐
▼ ▼
┌───────────────────────────┐ ┌───────────────────────────┐
│ ci-pull-requests.yml │ │ ci-check-pr-title.yml │
│ (main orchestrator) │ │ (validates title format) │
└─────────────┬─────────────┘ └───────────────────────────┘
┌───────────────────────────┐
│ install-and-build │
│ └─ paths-filter │──────────────────────────────────────────┐
└─────────────┬─────────────┘ │
│ │
│ [if non-Python files changed] │ [if .github/** changed]
│ │
┌─────────┼─────────┬─────────────┬─────────────┐ │
│ │ │ │ │ │
▼ ▼ ▼ ▼ ▼ ▼
┌───────┐ ┌───────┐ ┌───────┐ ┌────────────┐ ┌────────────┐ ┌────────────┐
│ unit │ │ type │ │ lint │ │ e2e-tests │ │ security │ │ security │
│ test │ │ check │ │ │ │ │ │ checks │ │ checks │
└───┬───┘ └───┬───┘ └───┬───┘ └─────┬──────┘ └─────┬──────┘ └─────┬──────┘
│ │ │ │ │ │
│ │ │ ┌─────┴─────┐ │ │
│ │ │ ▼ ▼ │ │
│ │ │ Internal Fork PR │ │
│ │ │ 14 shards 6 shards │ │
│ │ │ Docker SQLite │ │
│ │ │ │ │
└─────────┴─────────┴──────────┬───────────────┴────────────────┘
┌──────────────────────────────┐
│ required-checks │
│ (merge gate) │
└──────────────────────────────┘
```
### Path-Filtered Workflows
These only run if specific files changed:
| Files Changed | Workflow | Branch |
|------------------------------------------------------------------------|-----------------------------|------------|
| `packages/@n8n/task-runner-python/**` | `ci-python.yml` | any |
| `packages/cli/src/databases/**`, `*.entity.ts`, `*.repository.ts` | `test-db.yml` | any |
| `packages/frontend/@n8n/storybook/**`, design-system, chat | `test-visual-storybook.yml` | master |
| `docker/images/n8n-base/Dockerfile` | `build-base-image.yml` | any |
| `**/package.json`, `**/turbo.json` | `build-windows.yml` | master |
| `packages/@n8n/ai-workflow-builder.ee/evaluations/programmatic/python/**` | `test-evals-python.yml` | any |
| `packages/@n8n/benchmark/**` | `build-benchmark-image.yml` | master |
| `packages/cli/src/public-api/**/*.{css,yaml,yml}` | `util-sync-api-docs.yml` | master |
### On PR Review
| Event | Workflow | Condition |
|----------------------------|-----------------------------|------------------------------|
| Review approved | `test-visual-chromatic.yml` | + design files changed |
| Comment with `@claude` | `util-claude.yml` | mention in any comment |
| Any review | `util-notify-pr-status.yml` | not community-labeled |
### On PR Close/Merge
| Event | Workflow |
|----------------------------|-----------------------------|
| PR closed (any) | `util-notify-pr-status.yml` |
| PR merged to `release/*` | `release-publish.yml` |
### Manual Triggers (PR Comments)
| Command | Workflow | Permissions |
|--------------------|------------------------------|---------------------|
| `/test-workflows` | `test-workflows-callable.yml`| admin/write/maintain|
**Why:** Re-run tests without pushing commits. Useful for flaky test investigation.
### Other Manual Workflows
| Workflow | Purpose |
|---------------------------|---------------------------------------------------------|
| `util-claude-task.yml` | Run Claude Code to complete a task and create a PR |
| `util-data-tooling.yml` | SQLite/PostgreSQL export/import validation (manual) |
#### Claude Task Runner (`util-claude-task.yml`)
Runs Claude Code to complete a task, then creates a PR with the changes. Use for well-specced tasks or simple fixes. Can be triggered via GitHub UI or API.
Claude reads templates from `.github/claude-templates/` for task-specific guidance. Add new templates as needed for recurring task types.
**Inputs:**
- `task` - Description of what Claude should do
- `user_token` - GitHub PAT (PR will be authored by the token owner)
**Token requirements** (fine-grained PAT):
- Repository: `n8n-io/n8n`
- Contents: `Read and write`
- Pull requests: `Read and write`
**Governance:** If you provide your personal PAT, you cannot approve the resulting PR. For automated/bot use cases (e.g., dependabot-style updates via n8n workflows), an app token can be used instead.
---
## Workflow Call Graph
Shows which workflows call which reusable workflows:
```
CALLER REUSABLE WORKFLOW
───────────────────────────────────────────────────────────────────────────────
ci-pull-requests.yml
├──────────────────────────▶ test-unit-reusable.yml
├──────────────────────────▶ test-linting-reusable.yml
├──────────────────────────▶ test-e2e-ci-reusable.yml
│ └──────────▶ test-e2e-reusable.yml
└──────────────────────────▶ sec-ci-reusable.yml
└──────────▶ sec-poutine-reusable.yml
ci-master.yml
├──────────────────────────▶ test-unit-reusable.yml
└──────────────────────────▶ test-linting-reusable.yml
release-publish.yml
├──────────────────────────▶ docker-build-push.yml
│ └──────────▶ security-trivy-scan-callable.yml
└──────────────────────────▶ sbom-generation-callable.yml
test-workflows-nightly.yml
└──────────────────────────▶ test-workflows-callable.yml
PR Comment Dispatchers (triggered by /command in PR comments):
test-workflows-pr-comment.yml
└──────────────────────────▶ test-workflows-callable.yml
```
---
## Release Lifecycle
```
┌────────────────────────────────────────────────────────────────────────────┐
│ RELEASE LIFECYCLE │
├────────────────────────────────────────────────────────────────────────────┤
│ │
│ STAGE 1: Create Release PR │
│ ─────────────────────────── │
│ Trigger: Manual workflow_dispatch │
│ │
│ release-create-pr.yml │
│ ├─ bump-versions.mjs ────────▶ Calculate X.Y.Z │
│ ├─ update-changelog.mjs ─────▶ Generate CHANGELOG │
│ └─ Create PR: release-pr/X.Y.Z → release/X.Y.Z │
│ │
│ Inputs: │
│ ├─ release-type: patch │ minor │ major │ experimental │ premajor │
│ └─ base-branch: default master │
│ │ │
│ ▼ │
│ STAGE 2: CI Validation │
│ ─────────────────────── │
│ ci-pull-requests.yml runs full suite │
│ ├─ NO ci-check-pr-title.yml (skipped for release branches) │
│ └─ NO test-visual-chromatic.yml (skipped) │
│ │ │
│ ▼ [Merge PR] │
│ STAGE 3: Publish │
│ ─────────────── │
│ release-publish.yml (triggered on merge to release/*) │
│ ├─ publish-to-npm │
│ │ ├─ trim-fe-packageJson.js ───▶ Strip devDeps │
│ │ ├─ ensure-provenance-fields.mjs ───▶ Add license fields │
│ │ └─ npm publish (tag: rc or latest) │
│ ├─ publish-to-docker-hub ────────▶ docker-build-push.yml │
│ │ └─ Multi-arch: amd64 + arm64 │
│ ├─ create-github-release │
│ ├─ create-sentry-release (sourcemaps) │
│ ├─ generate-sbom ────────────────▶ sbom-generation-callable.yml │
│ │ └─ CycloneDX + Cosign signing │
│ └─ trigger-release-note (stable only) │
│ │ │
│ ▼ │
│ STAGE 4: Channel Promotion (optional) │
│ ────────────────────────────────────── │
│ Trigger: Manual release-push-to-channel.yml │
│ ├─ beta ─────▶ npm tags: next, beta │
│ └─ stable ───▶ npm tags: latest, stable │
│ │
└────────────────────────────────────────────────────────────────────────────┘
```
### Other Release Workflows
| Workflow | Trigger | Purpose |
|----------------------------------|--------------------|------------------------------------------------|
| `release-standalone-package.yml` | Manual dispatch | Release individual packages (@n8n/codemirror-lang, @n8n/create-node, etc.) |
| `create-patch-release-branch.yml`| Manual dispatch | Cherry-pick commits for patch releases |
---
## Fork vs Internal PR
| Aspect | Internal PR | Fork PR |
|--------------------|----------------------------------|-------------------------|
| E2E Runner | `blacksmith-2vcpu-ubuntu-2204` | `ubuntu-latest` |
| E2E Mode | `docker-build` (multi-main) | `local` (SQLite) |
| E2E Shards | 14 + 2 | 6 + 2 |
| Test Command | `test:container:multi-main:*` | `test:local:*` |
| Secrets | Full access | None |
| Currents Recording | Yes | No |
| Failure Artifacts | No | Yes |
**Why:** Fork PRs cannot access repository secrets. Local mode with SQLite provides feedback without paid services.
---
## ci-master.yml
Runs on push to `master` or `1.x`:
```
Push to master/1.x
├─ build-github (populate cache)
├─ unit-test (matrix: Node 22.x, 24.13.1, 25.x)
│ └─ Coverage only on 24.13.1
├─ lint
└─ notify-on-failure (Slack #alerts-build)
```
---
## Scheduled Jobs
| Schedule (UTC) | Workflow | Purpose |
|---------------------------|-----------------------------------|--------------------------|
| Daily 00:00 | `docker-build-push.yml` | Nightly Docker images |
| Daily 00:00 | `test-db.yml` | Database compatibility |
| Daily 00:00 | `test-e2e-performance-reusable.yml`| Performance E2E |
| Daily 00:00 | `test-visual-storybook.yml` | Storybook deploy |
| Daily 00:00 | `test-visual-chromatic.yml` | Visual regression |
| Daily 00:00 | `util-check-docs-urls.yml` | Doc link validation |
| Daily 01:30, 02:30, 03:30 | `test-benchmark-nightly.yml` | Performance benchmarks |
| Daily 02:00 | `test-workflows-nightly.yml` | Workflow tests |
| Daily 05:00 | `test-benchmark-destroy-nightly.yml`| Cleanup benchmark env |
| Monday 00:00 | `util-update-node-popularity.yml` | Node usage stats |
| Monday 02:00 | `test-e2e-coverage-weekly.yml` | Weekly E2E coverage |
| Saturday 22:00 | `test-evals-ai.yml` | AI workflow evals |
---
## Custom Actions
Composite actions in `.github/actions/`:
| Action | Purpose | Used By |
|--------------------------|----------------------------------------------|--------------------|
| `setup-nodejs` | pnpm + Node.js + Turbo cache + Docker (opt) | Most CI workflows |
| `docker-registry-login` | GHCR + DockerHub + DHI authentication | Docker workflows |
### setup-nodejs
```yaml
inputs:
node-version: # default: '24.13.1'
enable-docker-cache: # default: 'false' (Blacksmith Buildx)
build-command: # default: 'pnpm build'
```
### docker-registry-login
```yaml
inputs:
login-ghcr: # default: 'true'
login-dockerhub: # default: 'false'
login-dhi: # default: 'false'
```
---
## Reusable Workflows
Workflows with `workflow_call` trigger:
| Workflow | Inputs | Purpose |
|------------------------------------|-----------------------------------------------|-----------------------|
| `test-unit-reusable.yml` | `ref`, `nodeVersion`, `collectCoverage` | Unit tests |
| `test-linting-reusable.yml` | `ref`, `nodeVersion` | ESLint |
| `test-e2e-reusable.yml` | `branch`, `test-mode`, `shards`, `runner` | Core E2E executor |
| `test-e2e-ci-reusable.yml` | `branch` | E2E orchestrator |
| `test-e2e-docker-pull-reusable.yml`| `branch`, `n8n_version` | E2E with pulled image |
| `test-workflows-callable.yml` | `git_ref`, `compare_schemas` | Workflow tests |
| `docker-build-push.yml` | `n8n_version`, `release_type`, `push_enabled` | Docker build |
| `sec-ci-reusable.yml` | `ref` | Security orchestrator |
| `sec-poutine-reusable.yml` | `ref` | Poutine scanner |
| `security-trivy-scan-callable.yml` | `image_ref` | Trivy scan |
| `sbom-generation-callable.yml` | `n8n_version`, `release_tag_ref` | SBOM generation |
---
## Scripts
Scripts in `.github/scripts/`:
### Release Scripts
| Script | Purpose | Called By |
|-------------------------------|----------------------------|-------------------------|
| `bump-versions.mjs` | Calculate next version | `release-create-pr.yml` |
| `update-changelog.mjs` | Generate CHANGELOG | `release-create-pr.yml` |
| `trim-fe-packageJson.js` | Strip frontend devDeps | `release-publish.yml` |
| `ensure-provenance-fields.mjs`| Add license/author fields | `release-publish.yml` |
### Docker Scripts
| Script | Purpose | Called By |
|-------------------------|-------------------|------------------------|
| `docker/docker-config.mjs`| Build context | `docker-build-push.yml`|
| `docker/docker-tags.mjs` | Image tags | `docker-build-push.yml`|
### Validation Scripts
| Script | Purpose | Called By |
|-------------------------|-------------------|---------------------------|
| `validate-docs-links.js`| Check doc URLs | `util-check-docs-urls.yml`|
| `send-build-stats.mjs` | Build telemetry | `setup-nodejs` action |
---
## Telemetry
CI metrics are collected via webhooks to n8n, then stored in BigQuery for analysis.
See **[CI-TELEMETRY.md](CI-TELEMETRY.md)** for:
- Common data points (git, CI context, runner info)
- Existing implementations (build stats, container stack)
- How to add new telemetry
- BigQuery schema patterns and queries
---
## CODEOWNERS
Team ownership mappings in `CODEOWNERS`:
| Path Pattern | Team |
|--------------------------------------------------------------|----------------------------|
| `packages/@n8n/db/src/migrations/` | @n8n-io/migrations-review |
---
## Runner Selection
| Runner | vCPU | Use Case |
|-------------------------------------|------|-----------------------------|
| `ubuntu-slim` | 1 | Gate jobs (required-checks) |
| `ubuntu-latest` | 2 | Simple jobs, fork PR E2E |
| `blacksmith-2vcpu-ubuntu-2204` | 2 | Standard builds, E2E shards |
| `blacksmith-4vcpu-ubuntu-2204` | 4 | Unit tests, typecheck, lint |
| `blacksmith-8vcpu-ubuntu-2204` | 8 | E2E coverage (weekly) |
| `blacksmith-4vcpu-ubuntu-2204-arm` | 4 | ARM64 Docker builds |
### Selection Guidelines
**`ubuntu-slim`** - Status check aggregation, gate/required-check jobs, notifications
**`ubuntu-latest`** - Simple build verification, scheduled maintenance, PR comment handlers, release tagging, Docker manifest creation, any job where speed is not critical
**`blacksmith-2vcpu-ubuntu-2204`** - Initial build/install (benefits from Blacksmith caching), database integration tests (I/O bound), Chromatic/Storybook builds
**`blacksmith-4vcpu-ubuntu-2204`** - Unit tests (parallelized), linting (parallel file processing), typechecking (CPU-intensive), E2E test shards
**`blacksmith-8vcpu-ubuntu-2204`** - Heavy parallel workloads, full E2E coverage runs
### Runner Provider Toggle
The `RUNNER_PROVIDER` repository variable controls runner selection across workflows:
| Value | Behavior |
|-------|----------|
| (unset) | Use Blacksmith runners (default) |
| `github` | Use GitHub-hosted `ubuntu-latest` |
**Note:** When set to `github`, all jobs use `ubuntu-latest` regardless of any runner inputs or defaults specified in reusable workflows. GitHub runners have fewer vCPUs (2 vs 4), so jobs may run slower.
---
## Security
### Why We Do This
Supply chain security ensures artifacts haven't been tampered with. We provide three types of signed attestations:
```
ATTESTATION (signed statement)
┌─────────────────┼─────────────────┐
│ │ │
▼ ▼ ▼
PROVENANCE SBOM VEX
"Trust the "Know the "Understand
build" contents" the risk"
```
| Attestation | Question It Answers |
|-------------|--------------------------------|
| **Provenance** | "Can we trust this artifact came from n8n's CI and wasn't tampered with?" |
| **SBOM** | "What dependencies are inside?" (license compliance, vulnerability scanning) |
| **VEX** | "The scanner found CVE-X - does it actually affect us or is it a false positive?" |
**How they relate:**
- **SBOM** is the ingredients list - input for both license checks AND security scanning
- **VEX** is the security triage output - "we investigated CVE-X, here's our assessment"
- **Provenance** proves the SBOM and VEX came from our CI, not an attacker
---
### Poutine (Supply Chain)
- **Runs on:** PR changes to `.github/**`
- **Detects:** Exposed secrets, insecure workflow configs
- **Output:** SARIF to GitHub Security tab
### Trivy (Container)
- **Runs on:** stable/nightly/rc Docker builds
- **Scans:** n8n image, runners image
- **Output:** Slack `#notify-security-scan-outputs` (all), `#mission-security` (critical)
### SBOM
- **Runs on:** release-publish
- **Format:** CycloneDX JSON
- **Signing:** GitHub Attestation API
- **Attached to:** GitHub Release
### SLSA L3 Provenance
SLSA (Supply-chain Levels for Software Artifacts) Level 3 provides cryptographic proof of build integrity.
| Artifact | Generator | Level |
|----------|-----------|-------|
| Docker images | `slsa-framework/slsa-github-generator` | L3 |
| npm packages | `NPM_CONFIG_PROVENANCE=true` | L3 |
**Docker provenance** uses the SLSA GitHub Generator as a reusable workflow (not an action). This is required for L3 because provenance must be generated in an isolated environment the build can't tamper with.
```yaml
# IMPORTANT: Must use semantic version tags (@vX.Y.Z), NOT commit SHAs.
# The slsa-verifier requires tagged versions to verify authenticity.
uses: slsa-framework/slsa-github-generator/.github/workflows/generator_container_slsa3.yml@v2.1.0
```
**Verify provenance:**
```bash
# Docker
slsa-verifier verify-image ghcr.io/n8n-io/n8n:VERSION \
--source-uri github.com/n8n-io/n8n
# npm
npm audit signatures n8n@VERSION
```
### VEX (Vulnerability Exploitability eXchange)
VEX documents which CVEs actually affect n8n vs false positives from scanners.
- **File:** `security/vex.openvex.json`
- **Format:** OpenVEX (broad scanner compatibility - Trivy, Docker Scout, etc.)
- **Attached to:** GitHub Release, Docker image attestations
- **Used by:** Trivy scans (via `security/trivy.yaml`)
**VEX Status Types:**
| Status | Meaning |
|--------|---------|
| `not_affected` | CVE doesn't impact n8n (code not reachable, etc.) |
| `affected` | CVE impacts n8n, tracking fix |
| `fixed` | CVE was present, now fixed |
| `under_investigation` | Assessing impact |
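A quick way to see the triage state at a glance is to tally statements by status — a minimal sketch (hypothetical `summarizeVexStatuses`, not part of the project's tooling), e.g. for sanity-checking `security/vex.openvex.json`:

```javascript
// Count OpenVEX statements by status. Assumes the document shape shown
// below (a top-level "statements" array with a "status" field each).
function summarizeVexStatuses(vexDoc) {
  const counts = {};
  for (const stmt of vexDoc.statements ?? []) {
    counts[stmt.status] = (counts[stmt.status] ?? 0) + 1;
  }
  return counts;
}

const example = {
  statements: [
    { vulnerability: { name: 'CVE-2024-0001' }, status: 'not_affected' },
    { vulnerability: { name: 'CVE-2024-0002' }, status: 'fixed' },
    { vulnerability: { name: 'CVE-2024-0003' }, status: 'not_affected' },
  ],
};
console.log(summarizeVexStatuses(example)); // { not_affected: 2, fixed: 1 }
```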
**Verify VEX attestation:**
```bash
cosign verify-attestation --type openvex \
--certificate-identity-regexp '.*github.com/n8n-io/n8n.*' \
--certificate-oidc-issuer https://token.actions.githubusercontent.com \
ghcr.io/n8n-io/n8n:VERSION
```
**Adding a CVE statement to security/vex.openvex.json:**
```json
{
"statements": [
{
"vulnerability": { "name": "CVE-2024-XXXXX" },
"products": [{ "@id": "pkg:github/n8n-io/n8n" }],
"status": "not_affected",
"justification": "vulnerable_code_not_in_execute_path",
"statement": "n8n does not use the affected code path in this dependency"
}
]
}
```
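Before committing a new statement, its required fields can be sanity-checked with a sketch like this (hypothetical `validateVexStatement`; the rules here are a simplified subset of OpenVEX, notably that `not_affected` carries a `justification`):

```javascript
// Simplified pre-commit check for a single VEX statement.
// Returns a list of problems; empty means the statement looks complete.
function validateVexStatement(stmt) {
  const errors = [];
  if (!stmt.vulnerability?.name) errors.push('missing vulnerability.name');
  if (!Array.isArray(stmt.products) || stmt.products.length === 0) {
    errors.push('missing products');
  }
  if (!stmt.status) errors.push('missing status');
  if (stmt.status === 'not_affected' && !stmt.justification) {
    errors.push('not_affected requires a justification');
  }
  return errors;
}

const stmt = {
  vulnerability: { name: 'CVE-2024-XXXXX' },
  products: [{ '@id': 'pkg:github/n8n-io/n8n' }],
  status: 'not_affected',
  justification: 'vulnerable_code_not_in_execute_path',
};
console.log(validateVexStatement(stmt)); // []
```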
---
## Secrets
### By Category
| Category | Secrets |
|---------------------|-------------------------------------------------------------|
| Package Publishing | `NPM_TOKEN`, `DOCKER_USERNAME`, `DOCKER_PASSWORD` |
| Notifications | `SLACK_WEBHOOK_URL`, `QBOT_SLACK_TOKEN` |
| Code Quality | `CODECOV_TOKEN`, `CHROMATIC_PROJECT_TOKEN`, `CURRENTS_RECORD_KEY` |
| Error Tracking | `SENTRY_AUTH_TOKEN`, `SENTRY_ORG`, `SENTRY_*_PROJECT` |
| Cloud/CDN | `CLOUDFLARE_API_TOKEN`, `CLOUDFLARE_ACCOUNT_ID` |
| GitHub Automation | `N8N_ASSISTANT_APP_ID`, `N8N_ASSISTANT_PRIVATE_KEY` |
| Benchmarking | `BENCHMARK_ARM_*`, `N8N_BENCHMARK_LICENSE_CERT` |
| AI/Evals | `ANTHROPIC_API_KEY`, `EVALS_LANGSMITH_*` |
### Scoping
- **`secrets: inherit`** - passes all secrets to reusable workflows
- **Explicit passing** - for minimal exposure
- **Environment: `benchmarking`** - Azure OIDC credentials
---
## Future Vision
### Redundancy Review
Comment trigger (`/test-workflows`) is a workaround.
Long-term, the main CI should be reliable enough that such workarounds are not needed.
### Workflow Testability
- Tools like `act` for local testing
- Unit tests for `.github/scripts/*.mjs`
- Validation with `actionlint`


@@ -5,3 +5,4 @@ self-hosted-runner:
- blacksmith-2vcpu-ubuntu-2204-arm
- blacksmith-4vcpu-ubuntu-2204-arm
- blacksmith-8vcpu-ubuntu-2204
- ubuntu-slim


@@ -0,0 +1,235 @@
import { describe, it } from 'node:test';
import assert from 'node:assert/strict';
import { matchGlob, parseFilters, evaluateFilter, runValidate } from '../ci-filter.mjs';
// --- matchGlob ---
describe('matchGlob', () => {
it('** matches dotfiles', () => {
assert.ok(matchGlob('.github/workflows/ci.yml', '**'));
});
it('** matches deeply nested paths', () => {
assert.ok(matchGlob('packages/cli/src/controllers/auth.ts', '**'));
});
it('** matches root-level files', () => {
assert.ok(matchGlob('README.md', '**'));
});
it('.github/** matches workflow files', () => {
assert.ok(matchGlob('.github/workflows/ci.yml', '.github/**'));
});
it('.github/** matches action files', () => {
assert.ok(matchGlob('.github/actions/ci-filter/action.yml', '.github/**'));
});
it('.github/** does not match non-.github paths', () => {
assert.ok(!matchGlob('packages/cli/src/index.ts', '.github/**'));
});
it('scoped package pattern matches files in that package', () => {
assert.ok(
matchGlob(
'packages/@n8n/task-runner-python/src/main.py',
'packages/@n8n/task-runner-python/**',
),
);
});
it('scoped package pattern does not match other packages', () => {
assert.ok(!matchGlob('packages/@n8n/config/src/index.ts', 'packages/@n8n/task-runner-python/**'));
});
it('* matches single-level only', () => {
assert.ok(matchGlob('README.md', '*.md'));
assert.ok(!matchGlob('docs/README.md', '*.md'));
});
it('exact path match', () => {
assert.ok(matchGlob('package.json', 'package.json'));
assert.ok(!matchGlob('packages/cli/package.json', 'package.json'));
});
it('? matches single character', () => {
assert.ok(matchGlob('file1.txt', 'file?.txt'));
assert.ok(!matchGlob('file12.txt', 'file?.txt'));
});
it('**/ at start matches zero or more path segments', () => {
assert.ok(matchGlob('src/index.ts', '**/index.ts'));
assert.ok(matchGlob('packages/cli/src/index.ts', '**/index.ts'));
assert.ok(matchGlob('index.ts', '**/index.ts'));
});
it('**/ in middle matches nested paths', () => {
assert.ok(matchGlob('packages/@n8n/db/src/deep/file.ts', 'packages/@n8n/db/**'));
});
});
// --- parseFilters ---
describe('parseFilters', () => {
it('parses single-line filter', () => {
const filters = parseFilters('workflows: .github/**');
assert.deepEqual(filters.get('workflows'), ['.github/**']);
});
it('parses single-line with multiple patterns', () => {
const filters = parseFilters('db: packages/@n8n/db/** packages/cli/**');
assert.deepEqual(filters.get('db'), ['packages/@n8n/db/**', 'packages/cli/**']);
});
it('parses multi-line filter', () => {
const input = `non-python:
**
!packages/@n8n/task-runner-python/**`;
const filters = parseFilters(input);
assert.deepEqual(filters.get('non-python'), ['**', '!packages/@n8n/task-runner-python/**']);
});
it('parses mixed single and multi-line', () => {
const input = `non-python:
**
!packages/@n8n/task-runner-python/**
workflows: .github/**`;
const filters = parseFilters(input);
assert.equal(filters.size, 2);
assert.deepEqual(filters.get('non-python'), ['**', '!packages/@n8n/task-runner-python/**']);
assert.deepEqual(filters.get('workflows'), ['.github/**']);
});
it('ignores comments and blank lines', () => {
const input = `# This is a comment
workflows: .github/**
# Another comment
db: packages/@n8n/db/**`;
const filters = parseFilters(input);
assert.equal(filters.size, 2);
});
it('throws on malformed input', () => {
assert.throws(() => parseFilters('not a valid filter line'), /Malformed/);
});
it('throws on filter with no patterns', () => {
const input = `empty:
other: .github/**`;
assert.throws(() => parseFilters(input), /no patterns/);
});
});
// --- evaluateFilter ---
describe('evaluateFilter', () => {
it('python-only files with non-python filter returns false', () => {
const files = [
'packages/@n8n/task-runner-python/src/main.py',
'packages/@n8n/task-runner-python/pyproject.toml',
];
const patterns = ['**', '!packages/@n8n/task-runner-python/**'];
assert.equal(evaluateFilter(files, patterns), false);
});
it('mixed python and non-python returns true', () => {
const files = [
'packages/@n8n/task-runner-python/src/main.py',
'packages/cli/src/index.ts',
];
const patterns = ['**', '!packages/@n8n/task-runner-python/**'];
assert.equal(evaluateFilter(files, patterns), true);
});
it('non-python files with non-python filter returns true', () => {
const files = ['packages/cli/src/index.ts', 'packages/core/src/utils.ts'];
const patterns = ['**', '!packages/@n8n/task-runner-python/**'];
assert.equal(evaluateFilter(files, patterns), true);
});
it('.github files with workflows filter returns true', () => {
const files = ['.github/workflows/ci.yml', '.github/actions/setup/action.yml'];
const patterns = ['.github/**'];
assert.equal(evaluateFilter(files, patterns), true);
});
it('non-.github files with workflows filter returns false', () => {
const files = ['packages/cli/src/index.ts'];
const patterns = ['.github/**'];
assert.equal(evaluateFilter(files, patterns), false);
});
it('empty changed files returns false', () => {
assert.equal(evaluateFilter([], ['**']), false);
});
it('last matching pattern wins (gitignore semantics)', () => {
const files = ['packages/@n8n/task-runner-python/src/main.py'];
const patterns = ['**', '!packages/@n8n/task-runner-python/**', 'packages/@n8n/task-runner-python/**'];
assert.equal(evaluateFilter(files, patterns), true);
});
});
// --- runValidate ---
describe('runValidate', () => {
function runWithResults(jobResults: Record<string, { result: string }>): number | null {
const originalEnv = process.env.INPUT_JOB_RESULTS;
const originalExit = process.exit;
let exitCode: number | null = null;
process.env.INPUT_JOB_RESULTS = JSON.stringify(jobResults);
process.exit = ((code: number) => { exitCode = code; }) as never;
try {
runValidate();
} finally {
process.env.INPUT_JOB_RESULTS = originalEnv;
process.exit = originalExit;
}
return exitCode;
}
it('passes when all jobs succeed', () => {
assert.equal(runWithResults({
'install-and-build': { result: 'success' },
'unit-test': { result: 'success' },
typecheck: { result: 'success' },
lint: { result: 'success' },
}), null);
});
it('passes when some jobs are skipped (filtered out)', () => {
assert.equal(runWithResults({
'install-and-build': { result: 'success' },
'unit-test': { result: 'success' },
'security-checks': { result: 'skipped' },
}), null);
});
it('fails when a job fails', () => {
assert.equal(runWithResults({
'install-and-build': { result: 'success' },
'unit-test': { result: 'failure' },
typecheck: { result: 'success' },
}), 1);
});
it('fails when a job is cancelled', () => {
assert.equal(runWithResults({
'install-and-build': { result: 'success' },
'unit-test': { result: 'cancelled' },
}), 1);
});
it('fails when multiple jobs have problems', () => {
assert.equal(runWithResults({
'unit-test': { result: 'failure' },
typecheck: { result: 'cancelled' },
lint: { result: 'success' },
}), 1);
});
});

.github/actions/ci-filter/action.yml

@@ -0,0 +1,39 @@
name: 'CI Filter'
description: |
Filter CI jobs by changed files and validate results.
Modes:
- filter: Determines which jobs to run based on changed files and a provided filter definition.
- validate: Checks the results of required jobs and fails if any of them failed or were cancelled.
inputs:
mode:
description: 'filter or validate'
required: true
filters:
description: 'Filter definitions (gitignore-style DSL)'
required: false
base-ref:
description: 'Base branch for diff. Auto-detected if not specified.'
required: false
job-results:
description: 'Job results from needs context as JSON (mode=validate)'
required: false
outputs:
results:
description: 'JSON object: { "filter-name": true/false }'
value: ${{ steps.run.outputs.results }}
runs:
using: 'composite'
steps:
- name: Run CI Filter
id: run
shell: bash
env:
INPUT_MODE: ${{ inputs.mode }}
INPUT_FILTERS: ${{ inputs.filters }}
INPUT_BASE_REF: ${{ inputs.base-ref || github.event.pull_request.base.ref || github.event.merge_group.base_ref || 'master' }}
INPUT_JOB_RESULTS: ${{ inputs.job-results }}
run: node ${{ github.action_path }}/ci-filter.mjs

.github/actions/ci-filter/ci-filter.mjs

@@ -0,0 +1,216 @@
import { execSync } from 'node:child_process';
import { appendFileSync } from 'node:fs';
import { fileURLToPath } from 'node:url';
import { resolve } from 'node:path';
// --- Glob matching (dotfile-safe) ---
/**
* Match a file path against a glob pattern.
* Unlike path.matchesGlob / standard POSIX globs, `**` matches dotfiles.
*/
export function matchGlob(filePath, pattern) {
let regex = '';
let i = 0;
while (i < pattern.length) {
const ch = pattern[i];
if (ch === '*' && pattern[i + 1] === '*') {
if (pattern[i + 2] === '/') {
regex += '(?:.+/)?';
i += 3;
} else {
regex += '.*';
i += 2;
}
} else if (ch === '*') {
regex += '[^/]*';
i++;
} else if (ch === '?') {
regex += '[^/]';
i++;
} else {
regex += ch.replace(/[.+^${}()|[\]\\]/g, '\\$&');
i++;
}
}
return new RegExp(`^${regex}$`).test(filePath);
}
// --- Filter DSL parser ---
/**
* Parse filter definitions from the input DSL.
*
* Supports two formats:
* Single-line: `name: pattern1 pattern2`
* Multi-line: `name:` followed by indented patterns (one per line)
*
* Lines starting with # and blank lines are ignored.
*/
export function parseFilters(input) {
const filters = new Map();
const lines = input.split('\n');
let currentFilter = null;
for (const rawLine of lines) {
const line = rawLine.trim();
if (!line || line.startsWith('#')) continue;
const headerMatch = line.match(/^([a-zA-Z0-9_-]+):\s*(.*)?$/);
if (headerMatch) {
const name = headerMatch[1];
const rest = (headerMatch[2] || '').trim();
const patterns = [];
currentFilter = name;
filters.set(name, patterns);
if (rest) {
patterns.push(...rest.split(/\s+/));
currentFilter = null;
}
continue;
}
if (currentFilter && rawLine.match(/^\s/)) {
const patterns = filters.get(currentFilter);
if (patterns) patterns.push(line);
continue;
}
throw new Error(`Malformed filter input at: "${rawLine}"`);
}
for (const [name, patterns] of filters) {
if (patterns.length === 0) {
throw new Error(`Filter "${name}" has no patterns`);
}
}
return filters;
}
// --- Git operations ---
const SAFE_REF = /^[a-zA-Z0-9_./-]+$/;
export function getChangedFiles(baseRef) {
if (!SAFE_REF.test(baseRef)) {
throw new Error(`Unsafe base ref: "${baseRef}"`);
}
execSync(`git fetch --depth=1 origin ${baseRef}`, { stdio: 'pipe' });
const output = execSync('git diff --name-only FETCH_HEAD HEAD', { encoding: 'utf-8' });
return output
.split('\n')
.map((f) => f.trim())
.filter(Boolean);
}
// --- Filter evaluation ---
/**
* Evaluate a single filter against changed files using gitignore semantics.
* Patterns evaluated in order, last match wins. ! prefix excludes.
* Filter triggers if ANY changed file passes.
*/
export function evaluateFilter(changedFiles, patterns) {
for (const file of changedFiles) {
let included = false;
for (const pattern of patterns) {
if (pattern.startsWith('!')) {
if (matchGlob(file, pattern.slice(1))) {
included = false;
}
} else {
if (matchGlob(file, pattern)) {
included = true;
}
}
}
if (included) return true;
}
return false;
}
// --- Mode: filter ---
function setOutput(name, value) {
const outputFile = process.env.GITHUB_OUTPUT;
if (outputFile) {
const delimiter = `ghadelimiter_${Date.now()}`;
appendFileSync(outputFile, `${name}<<${delimiter}\n${value}\n${delimiter}\n`);
}
}
export function runFilter() {
const filtersInput = process.env.INPUT_FILTERS;
const baseRef = process.env.INPUT_BASE_REF;
if (!filtersInput) {
throw new Error('INPUT_FILTERS is required in filter mode');
}
if (!baseRef) {
throw new Error('INPUT_BASE_REF is required in filter mode');
}
const filters = parseFilters(filtersInput);
const changedFiles = getChangedFiles(baseRef);
console.log(`Changed files (${changedFiles.length}):`);
for (const f of changedFiles) {
console.log(` ${f}`);
}
const results = {};
for (const [name, patterns] of filters) {
const matched = evaluateFilter(changedFiles, patterns);
results[name] = matched;
console.log(`Filter "${name}": ${matched}`);
}
setOutput('results', JSON.stringify(results));
}
// --- Mode: validate ---
export function runValidate() {
const raw = process.env.INPUT_JOB_RESULTS;
if (!raw) {
throw new Error('INPUT_JOB_RESULTS is required in validate mode');
}
const jobResults = JSON.parse(raw);
const problems = [];
for (const [job, data] of Object.entries(jobResults)) {
if (data.result === 'failure') problems.push(`${job}: failed`);
if (data.result === 'cancelled') problems.push(`${job}: cancelled`);
}
if (problems.length > 0) {
console.error('Required checks failed:');
for (const p of problems) {
console.error(` - ${p}`);
}
process.exit(1);
}
console.log('All required checks passed:');
for (const [job, data] of Object.entries(jobResults)) {
console.log(` ${job}: ${data.result}`);
}
}
// --- Main (only when run directly, not when imported by tests) ---
if (resolve(fileURLToPath(import.meta.url)) === resolve(process.argv[1])) {
const mode = process.env.INPUT_MODE;
if (mode === 'filter') {
runFilter();
} else if (mode === 'validate') {
runValidate();
} else {
throw new Error(`Unknown mode: "${mode}". Expected "filter" or "validate".`);
}
}


@@ -0,0 +1,53 @@
# Composite action for logging into Docker registries (GHCR and/or DockerHub).
# Centralizes the login pattern used across multiple Docker workflows.
name: 'Docker Registry Login'
description: 'Login to GitHub Container Registry and/or DockerHub'
inputs:
login-ghcr:
description: 'Login to GitHub Container Registry'
required: false
default: 'true'
login-dockerhub:
description: 'Login to DockerHub'
required: false
default: 'false'
login-dhi:
description: 'Login to Docker Hardened Images registry (dhi.io)'
required: false
default: 'false'
dockerhub-username:
description: 'DockerHub username (required if login-dockerhub or login-dhi is true)'
required: false
dockerhub-password:
description: 'DockerHub password (required if login-dockerhub or login-dhi is true)'
required: false
runs:
using: 'composite'
steps:
- name: Login to GitHub Container Registry
if: inputs.login-ghcr == 'true'
shell: bash
env:
GHCR_TOKEN: ${{ github.token }}
GHCR_USER: ${{ github.actor }}
run: |
node .github/scripts/retry.mjs --attempts 3 --delay 10 \
'echo "$GHCR_TOKEN" | docker login ghcr.io -u "$GHCR_USER" --password-stdin'
- name: Login to DockerHub
if: inputs.login-dockerhub == 'true'
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
username: ${{ inputs.dockerhub-username }}
password: ${{ inputs.dockerhub-password }}
- name: Login to DHI Registry
if: inputs.login-dhi == 'true'
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: dhi.io
username: ${{ inputs.dockerhub-username }}
password: ${{ inputs.dockerhub-password }}


@@ -1,42 +0,0 @@
name: 'Blacksmith Node.js Build Setup'
description: 'Configures Node.js with pnpm, installs dependencies, enables Turborepo caching, (optional) sets up Docker layer caching, and builds the project or an optional command.'
inputs:
node-version:
description: 'Node.js version to use. Uses latest 22.x by default.'
required: false
default: '22.x'
enable-docker-cache:
description: 'Whether to set up Blacksmith Buildx for Docker layer caching.'
required: false
default: 'false'
type: boolean
build-command:
description: 'Command to execute for building the project or an optional command. Leave empty to skip build step.'
required: false
default: 'pnpm build'
type: string
runs:
using: 'composite'
steps:
- name: Setup Node.js
uses: useblacksmith/setup-node@65c6ca86fdeb0ab3d85e78f57e4f6a7e4780b391 # v5.0.4
with:
node-version: ${{ inputs.node-version }}
- name: Setup pnpm and Install Dependencies
uses: pnpm/action-setup@fe02b34f77f8bc703788d5817da081398fad5dd2 # v4.0.0
with:
run_install: true
- name: Configure Turborepo Cache
uses: useblacksmith/caching-for-turbo@bafb57e7ebdbf1185762286ec94d24648cd3938a # v1
- name: Setup Docker Builder for Docker Cache
if: ${{ inputs.enable-docker-cache == 'true' }}
uses: useblacksmith/setup-docker-builder@0b434dfbb431f4e3a2bcee7a773a56bd363184c5 # v1
- name: Build Project
run: ${{ inputs.build-command }}
shell: bash


@@ -1,33 +0,0 @@
name: 'GitHub Node.js Build Setup for Github Hosted Runners'
description: 'Configures Node.js with pnpm, installs dependencies, enables Turborepo caching, and builds the project or an optional command.'
inputs:
node-version:
description: 'Node.js version to use. Uses latest 22.x by default.'
required: false
default: '22.x'
build-command:
description: 'Command to execute for building the project or an optional command. Leave empty to skip build step.'
required: false
default: 'pnpm build'
type: string
runs:
using: 'composite'
steps:
- name: Setup Node.js
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: ${{ inputs.node-version }}
- name: Setup pnpm and Install Dependencies
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
with:
run_install: true
- name: Configure Turborepo Cache
uses: rharkor/caching-for-turbo@2b4b5b14a8d16b8556a58993a8ac331d56d8906d # v2.3.2
- name: Build Project
run: ${{ inputs.build-command }}
shell: bash

.github/actions/setup-nodejs/action.yml

@@ -0,0 +1,92 @@
# This action works transparently on both Blacksmith and GitHub-hosted runners.
# Blacksmith runners benefit from transparent caching and optional Docker layer caching.
# GitHub-hosted runners use standard GitHub Actions caching.
name: 'Node.js Build Setup'
description: 'Configures Node.js with pnpm, installs Aikido SafeChain for supply chain protection, installs dependencies, enables Turborepo caching, (optional) sets up Docker layer caching, and builds the project or an optional command.'
inputs:
node-version:
description: 'Node.js version to use. Pinned to 24.14.1 by default for reproducible builds.'
required: false
default: '24.14.1'
enable-docker-cache:
description: 'Whether to set up Blacksmith Buildx for Docker layer caching (Blacksmith runners only).'
required: false
default: 'false'
build-command:
description: 'Command to execute for building the project or an optional command. Leave empty to skip build step.'
required: false
default: 'pnpm build'
install-command:
description: 'Command to execute for installing project dependencies. Leave empty to skip install step.'
required: false
default: 'pnpm install --frozen-lockfile'
runs:
using: 'composite'
steps:
- name: Setup pnpm
uses: pnpm/action-setup@b906affcce14559ad1aafd4ab0e942779e9f58b1 # v4.3.0
- name: Setup Node.js
uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: ${{ inputs.node-version }}
cache: 'pnpm'
# To avoid setup-node cache failure.
# see: https://github.com/actions/setup-node/issues/1137
- name: Verify PNPM Cache Directory
shell: bash
run: |
PNPM_STORE_PATH="$( pnpm store path --silent )"
if [ ! -d "$PNPM_STORE_PATH" ]; then
mkdir -p "$PNPM_STORE_PATH"
fi
- name: Install Aikido SafeChain
if: runner.os != 'Windows'
run: |
VERSION="1.4.1"
EXPECTED_SHA256="628235987175072a4255aa3f5f0128f31795b63970f1970ae8a04d07bf8527b0"
node .github/scripts/retry.mjs --attempts 3 --delay 10 \
"curl -fsSL -o install-safe-chain.sh https://github.com/AikidoSec/safe-chain/releases/download/${VERSION}/install-safe-chain.sh"
echo "${EXPECTED_SHA256} install-safe-chain.sh" | sha256sum -c -
sh install-safe-chain.sh --ci
rm install-safe-chain.sh
shell: bash
- name: Install Dependencies
if: ${{ inputs.install-command != '' }}
env:
INSTALL_COMMAND: ${{ inputs.install-command }}
run: |
$INSTALL_COMMAND
shell: bash
- name: Disable safe-chain
if: runner.os != 'Windows'
run: safe-chain teardown
shell: bash
- name: Configure Turborepo Cache
uses: rharkor/caching-for-turbo@0abc2381e688c4d2832f0665a68a01c6e82f0d6c # v2.3.11
- name: Setup Docker Builder for Docker Cache (Blacksmith)
if: ${{ inputs.enable-docker-cache == 'true' && contains(runner.name, 'blacksmith') }}
uses: useblacksmith/setup-docker-builder@ef12d5b165b596e3aa44ea8198d8fde563eab402 # v1.4.0
- name: Setup Docker Builder (GitHub fallback)
if: ${{ inputs.enable-docker-cache == 'true' && !contains(runner.name, 'blacksmith') }}
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Build Project
if: ${{ inputs.build-command != '' }}
env:
BUILD_COMMAND: ${{ inputs.build-command }}
run: |
$BUILD_COMMAND --summarize
node .github/scripts/send-build-stats.mjs || true
node .github/scripts/send-docker-stats.mjs || true
shell: bash

.github/claude-templates/e2e-test.md

@@ -0,0 +1,119 @@
# E2E Test Task Guide
## Required Reading
**Before writing any code**, read these files:
```
packages/testing/playwright/AGENTS.md # Patterns, anti-patterns, entry points
packages/testing/playwright/CONTRIBUTING.md # Detailed architecture (first 200 lines)
```
## Spec Validation
Before starting, verify the spec includes:
| Required | Example |
|----------|---------|
| **File(s) to modify** | `tests/e2e/credentials/crud.spec.ts` |
| **Specific behavior** | "Verify credential renaming updates the list" |
| **Pattern reference** | "Follow existing tests in same file" or "See AGENTS.md" |
**If missing, ask for clarification.** Don't guess at requirements.
## Commands
```bash
# Run single test
pnpm --filter=n8n-playwright test:local tests/e2e/your-test.spec.ts --reporter=list 2>&1 | tail -50
# Run with pattern match
pnpm --filter=n8n-playwright test:local --grep "should do something" --reporter=list 2>&1 | tail -50
# Container tests (requires pnpm build:docker first)
pnpm --filter=n8n-playwright test:container:sqlite --grep @capability:email --reporter=list 2>&1 | tail -50
```
## Test Structure
```typescript
import { test, expect } from '../fixtures/base';
import { nanoid } from 'nanoid';
test('should do something @mode:sqlite', async ({ n8n, api }) => {
// Setup via API (faster, more reliable)
const workflow = await api.workflowApi.createWorkflow(workflowJson);
// UI interaction via entry points
await n8n.start.fromBlankCanvas();
// Assertions
await expect(n8n.workflows.getWorkflowByName(workflow.name)).toBeVisible();
});
```
## Entry Points
Use `n8n.start.*` methods - see `composables/TestEntryComposer.ts`:
- `fromBlankCanvas()` - New workflow
- `fromImportedWorkflow(file)` - Pre-built workflow
- `fromNewProjectBlankCanvas()` - Project-scoped
- `withUser(user)` - Isolated browser context
## Multi-User Tests
```typescript
const member = await api.publicApi.createUser({ role: 'global:member' });
const memberPage = await n8n.start.withUser(member);
await memberPage.navigate.toWorkflows();
```
## Development Process
1. **Validate spec** - Has file, behavior, pattern reference?
2. **Read existing code** - Understand current patterns in the file
3. **Identify helpers needed** - Check `pages/`, `services/`, `composables/`
4. **Add helpers first** if missing
5. **Write test** following 4-layer architecture
6. **Verify iteratively** - Small changes, test frequently
## Mandatory Verification
**Always run before marking complete:**
```bash
# 1. Tests pass (check output for failures - piping loses exit code)
pnpm --filter=n8n-playwright test:local <your-test> --reporter=list 2>&1 | tail -50
# 2. Not flaky (required)
pnpm --filter=n8n-playwright test:local <your-test> --repeat-each 3 --reporter=list 2>&1 | tail -50
# 3. Lint passes
pnpm --filter=n8n-playwright lint 2>&1 | tail -30
# 4. Typecheck passes
pnpm --filter=n8n-playwright typecheck 2>&1 | tail -30
```
**Important:** Piping through `tail` loses the exit code. Always check the output for "failed" or error messages rather than relying on exit codes.
**If any fail, fix before completing.**
## Refactoring Existing Tests
**Always verify tests pass BEFORE making changes:**
```bash
pnpm --filter=n8n-playwright test:local tests/e2e/target-file.spec.ts --reporter=list 2>&1 | tail -50
```
Then make small incremental changes, re-running after each.
## Done Checklist
- [ ] Spec had clear file, behavior, and pattern reference
- [ ] Read `AGENTS.md` and relevant existing code
- [ ] Used `n8n.start.*` entry points
- [ ] Used `nanoid()` for unique IDs (not `Date.now()`)
- [ ] No serial mode, `@db:reset`, or `n8n.api.signin()`
- [ ] Multi-user tests use `n8n.start.withUser()`
- [ ] Tests pass with `--repeat-each 3`
- [ ] Lint and typecheck pass

.github/claude-templates/security-fix.md

@@ -0,0 +1,179 @@
# Security Vulnerability Fix Guidelines
## Overview
This guide covers how to fix security vulnerabilities in the n8n codebase. Follow a systematic approach to identify, fix, and verify vulnerabilities in dependencies or base images.
## Decision Tree
```
Is it a direct dependency?
→ Yes: Update in catalog or package.json
→ No: Is it transitive?
→ Yes: Add pnpm override
→ No: Is it base image?
→ Yes: Update Dockerfile, trigger base image workflow
```
## Process Flow
```
Scan → Investigate → Fix → Verify
↓ ↓ ↓ ↓
pnpm pnpm why Update pnpm
build: (trace) deps build:
docker: or docker:
scan override scan
```
## Step-by-Step Process
### 1. Initial Setup
Start with a clean install:
```bash
pnpm install --frozen-lockfile
```
### 2. Scan for Vulnerabilities
Run the Docker scan to verify if the vulnerability exists:
```bash
pnpm build:docker:scan
```
### 3. Investigate the Source
Use `pnpm why` to trace where the vulnerable package is coming from:
```bash
pnpm why <package-name> -r
```
### 4. Determine Fix Strategy
#### Case A: Direct Dependency
If the vulnerable package is a **direct dependency**:
**Update via Catalog** (preferred for shared dependencies):
```yaml
# pnpm-workspace.yaml
catalog:
'@azure/identity': 4.13.0 # Updated version
```
```json
// packages/cli/package.json
{
"dependencies": {
"@azure/identity": "catalog:"
}
}
```
**Or update directly in package.json:**
```json
{
"dependencies": {
"vulnerable-package": "^1.2.3"
}
}
```
Then: `pnpm install`
#### Case B: Transitive Dependency
If the vulnerable package is a **transitive dependency**:
**Add an override** in the root `package.json`:
```json
{
"pnpm": {
"overrides": {
"vulnerable-package": "^1.2.3"
}
}
}
```
**For multiple versions:**
```json
{
"pnpm": {
"overrides": {
"vulnerable-package@3": "^3.2.1",
"vulnerable-package@4": "^4.0.1"
}
}
}
```
Then: `pnpm install`
#### Case C: Base Image / NPM Issue
If the vulnerability comes from the **base Docker image**:
1. Check `docker/images/n8n-base/Dockerfile`
2. Update Node version or Alpine packages if needed
3. Note: Base image rebuild requires manual workflow trigger
### 5. Verify the Fix
```bash
pnpm install
pnpm why <package-name> # Check version updated
pnpm build:docker:scan # Confirm vulnerability resolved
```
## Commit & PR Standards
### Commit Format
```
{type}({scope}): {neutral description}
{Brief neutral context}
Addresses: CVE-XXXX-XXXXX
Refs: {LINEAR-ID}
```
### Type Selection
| Scenario | Type |
|----------|------|
| Dependency update | `fix(deps)` |
| Code vulnerability fix | `fix` |
| License/compliance | `chore` |
| Docker/build hardening | `build` |
### Title Language - USE NEUTRAL LANGUAGE
Commit/PR titles appear in changelogs. Use neutral language:
| ❌ Avoid | ✅ Use Instead |
|----------|----------------|
| CVE-XXXX-XXXXX | (footer only) |
| vulnerability, exploit | issue, concern |
| critical, security fix | improvement, update |
| patch vulnerability | validate, harden, ensure |
### Example Commit
**Good:**
```
fix(deps): update jws to 4.0.1
Updates jws package to latest stable version.
Addresses: CVE-2025-65945
Refs: SEC-412
```
**Bad:**
```
fix(security): patch critical CVE-2025-65945 in jws
```
## Done Checklist
- [ ] `pnpm build:docker:scan` shows no vulnerability for the CVE
- [ ] `pnpm why <package>` shows updated version
- [ ] Commit follows neutral language format (no CVE in title)
- [ ] PR references Linear ticket if provided
## Common Commands
```bash
pnpm install --frozen-lockfile # Initial setup
pnpm build:docker:scan # Scan for vulnerabilities
pnpm why <package-name> -r # Investigate dependency
pnpm install # Update lockfile after changes
pnpm list <package-name> # Check specific package versions
```


@@ -9,6 +9,11 @@ services:
- 3306:3306
tmpfs:
- /var/lib/mysql
healthcheck:
test: ['CMD', 'mysqladmin', 'ping', '-h', 'localhost', '-u', 'root', '-ppassword']
interval: 5s
timeout: 10s
retries: 10
mysql-8.4:
image: mysql:8.4
@@ -19,6 +24,11 @@ services:
- 3306:3306
tmpfs:
- /var/lib/mysql
healthcheck:
test: ['CMD', 'mysqladmin', 'ping', '-h', 'localhost', '-u', 'root', '-ppassword']
interval: 5s
timeout: 10s
retries: 10
postgres:
image: postgres:16


@@ -0,0 +1,43 @@
# METADATA
# title: Unpinned GitHub Action
# description: |-
# GitHub Action not pinned to full commit SHA.
# Pin actions to SHA for supply chain security.
# custom:
# level: error
package rules.unpinned_action
import data.poutine
import rego.v1
rule := poutine.rule(rego.metadata.chain())
# Match 40-character hex SHA (Git) or 64-character sha256 digest (Docker)
is_sha_pinned(uses) if {
regex.match(`@(sha256:[a-f0-9]{64}|[a-f0-9]{40})`, uses)
}
# Check if it's a local action (starts with ./)
is_local_action(uses) if {
startswith(uses, "./")
}
# Check if it's a reusable workflow call
is_reusable_workflow(uses) if {
contains(uses, ".github/workflows/")
}
results contains poutine.finding(rule, pkg.purl, {
"path": workflow.path,
"job": job.id,
"step": i,
"details": sprintf("Action '%s' should be pinned to a full commit SHA", [step.uses]),
}) if {
pkg := input.packages[_]
workflow := pkg.github_actions_workflows[_]
job := workflow.jobs[_]
step := job.steps[i]
step.uses
not is_sha_pinned(step.uses)
not is_local_action(step.uses)
}
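The pinning check hinges on the regex in `is_sha_pinned`. The same pattern, exercised outside Rego for illustration (the example SHA is arbitrary):

```javascript
// Same pattern the Rego rule matches: a 40-char hex git commit SHA
// or a 64-char sha256 image digest after the '@'.
const SHA_PINNED = /@(sha256:[a-f0-9]{64}|[a-f0-9]{40})/;

console.log(SHA_PINNED.test('actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683')); // true
console.log(SHA_PINNED.test('actions/checkout@v4')); // false — tag, not a SHA
```

Tag references like `@v4` are mutable, which is why the rule requires a full SHA.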


@@ -26,4 +26,4 @@ https://linear.app/n8n/issue/
A bug is not considered fixed, unless a test is added to prevent it from happening again.
A feature is not complete without tests.
-->
- [ ] PR Labeled with `release/backport` (if the PR is an urgent fix that needs to be backported)
- [ ] PR Labeled with `Backport to Beta`, `Backport to Stable`, or `Backport to v1` (if the PR is an urgent fix that needs to be backported)


@@ -1,4 +1,5 @@
import semver from 'semver';
import { parse } from 'yaml';
import { writeFile, readFile } from 'fs/promises';
import { resolve } from 'path';
import child_process from 'child_process';
@@ -7,14 +8,19 @@ import assert from 'assert';
const exec = promisify(child_process.exec);
/**
* @param {string | semver.SemVer} currentVersion
*/
function generateExperimentalVersion(currentVersion) {
const parsed = semver.parse(currentVersion);
if (!parsed) throw new Error(`Invalid version: ${currentVersion}`);
// Check if it's already an experimental version
if (parsed.prerelease.length > 0 && parsed.prerelease[0] === 'exp') {
const minor = parsed.prerelease[1] || 0;
const minorInt = typeof minor === 'string' ? parseInt(minor) : minor;
// Increment the experimental minor version
const expMinor = (parsed.prerelease[1] || 0) + 1;
const expMinor = minorInt + 1;
return `${parsed.major}.${parsed.minor}.${parsed.patch}-exp.${expMinor}`;
}
@@ -22,34 +28,32 @@ function generateExperimentalVersion(currentVersion) {
return `${parsed.major}.${parsed.minor}.${parsed.patch}-exp.0`;
}
function generateRcVersion(currentVersion) {
const parsed = semver.parse(currentVersion);
if (!parsed) throw new Error(`Invalid version: ${currentVersion}`);
// Check if it's already an RC version
if (parsed.prerelease.length > 0 && parsed.prerelease[0] === 'rc') {
// Increment the RC number
const rcNum = (parsed.prerelease[1] || 0) + 1;
return `${parsed.major}.${parsed.minor}.${parsed.patch}-rc.${rcNum}`;
}
// Create new RC version: <major>.<minor>.<patch>-rc.0
return `${parsed.major}.${parsed.minor}.${parsed.patch}-rc.0`;
}
const rootDir = process.cwd();
const releaseType = process.env.RELEASE_TYPE;
assert.match(releaseType, /^(patch|minor|major|experimental|rc)$/, 'Invalid RELEASE_TYPE');
const releaseType = /** @type { import('semver').ReleaseType | "experimental" } */ (
process.env.RELEASE_TYPE
);
assert.match(releaseType, /^(patch|minor|major|experimental|premajor)$/, 'Invalid RELEASE_TYPE');
// TODO: if releaseType is `auto` determine release type based on the changelog
const lastTag = (await exec('git describe --tags --match "n8n@*" --abbrev=0')).stdout.trim();
const packages = JSON.parse((await exec('pnpm ls -r --only-projects --json')).stdout);
const packages = JSON.parse(
(
await exec(
`pnpm ls -r --only-projects --json | jq -r '[.[] | { name: .name, version: .version, path: .path, private: .private}]'`,
)
).stdout,
);
const packageMap = {};
for (let { name, path, version, private: isPrivate, dependencies } of packages) {
if (isPrivate && path !== rootDir) continue;
if (path === rootDir) name = 'monorepo-root';
for (let { name, path, version, private: isPrivate } of packages) {
if (isPrivate && path !== rootDir) {
continue;
}
if (path === rootDir) {
name = 'monorepo-root';
}
const isDirty = await exec(`git diff --quiet HEAD ${lastTag} -- ${path}`)
.then(() => false)
@@ -63,6 +67,111 @@ assert.ok(
'No changes found since the last release',
);
// Propagate isDirty transitively: if a package's dependency will be bumped,
// that package also needs a bump (e.g. design-system → editor-ui → cli).
// Detect root-level changes that affect resolved dep versions without touching individual
// package.json files: pnpm.overrides (applies to all specifiers)
// and pnpm-workspace.yaml catalog entries (applies only to deps using a "catalog:…" specifier).
const rootPkgJson = JSON.parse(await readFile(resolve(rootDir, 'package.json'), 'utf-8'));
const rootPkgJsonAtTag = await exec(`git show ${lastTag}:package.json`)
.then(({ stdout }) => JSON.parse(stdout))
.catch(() => ({}));
const getOverrides = (pkg) => ({ ...pkg.pnpm?.overrides, ...pkg.overrides });
const currentOverrides = getOverrides(rootPkgJson);
const previousOverrides = getOverrides(rootPkgJsonAtTag);
const changedOverrides = new Set(
Object.keys({ ...currentOverrides, ...previousOverrides }).filter(
(k) => currentOverrides[k] !== previousOverrides[k],
),
);
const parseWorkspaceYaml = (content) => {
try {
return /** @type {Record<string, unknown>} */ (parse(content) ?? {});
} catch {
return {};
}
};
const workspaceYaml = parseWorkspaceYaml(
await readFile(resolve(rootDir, 'pnpm-workspace.yaml'), 'utf-8').catch(() => ''),
);
const workspaceYamlAtTag = parseWorkspaceYaml(
await exec(`git show ${lastTag}:pnpm-workspace.yaml`)
.then(({ stdout }) => stdout)
.catch(() => ''),
);
const getCatalogs = (ws) => {
const result = new Map();
if (ws.catalog) {
result.set('default', /** @type {Record<string,string>} */ (ws.catalog));
}
for (const [name, entries] of Object.entries(ws.catalogs ?? {})) {
result.set(name, entries);
}
return result;
};
// changedCatalogEntries: Map<catalogName, Set<depName>>
const currentCatalogs = getCatalogs(workspaceYaml);
const previousCatalogs = getCatalogs(workspaceYamlAtTag);
const changedCatalogEntries = new Map();
for (const catalogName of new Set([...currentCatalogs.keys(), ...previousCatalogs.keys()])) {
const current = currentCatalogs.get(catalogName) ?? {};
const previous = previousCatalogs.get(catalogName) ?? {};
const changedDeps = new Set(
Object.keys({ ...current, ...previous }).filter((dep) => current[dep] !== previous[dep]),
);
if (changedDeps.size > 0) {
changedCatalogEntries.set(catalogName, changedDeps);
}
}
// Store full dep objects (with specifiers) so we can inspect "catalog:…" values below.
const depsByPackage = {};
for (const packageName in packageMap) {
const packageFile = resolve(packageMap[packageName].path, 'package.json');
const packageJson = JSON.parse(await readFile(packageFile, 'utf-8'));
depsByPackage[packageName] = /** @type {Record<string,string>} */ (
packageJson.dependencies ?? {}
);
}
// Mark packages dirty if any dep had a root-level override or catalog version change.
for (const [packageName, deps] of Object.entries(depsByPackage)) {
if (packageMap[packageName].isDirty) continue;
for (const [dep, specifier] of Object.entries(deps)) {
if (changedOverrides.has(dep)) {
packageMap[packageName].isDirty = true;
break;
}
if (typeof specifier === 'string' && specifier.startsWith('catalog:')) {
const catalogName = specifier === 'catalog:' ? 'default' : specifier.slice(8);
if (changedCatalogEntries.get(catalogName)?.has(dep)) {
packageMap[packageName].isDirty = true;
break;
}
}
}
}
let changed = true;
while (changed) {
changed = false;
for (const packageName in packageMap) {
if (packageMap[packageName].isDirty) continue;
if (Object.keys(depsByPackage[packageName]).some((dep) => packageMap[dep]?.isDirty)) {
packageMap[packageName].isDirty = true;
changed = true;
}
}
}
// Keep the monorepo version up to date with the released version
packageMap['monorepo-root'].version = packageMap['n8n'].version;
@@ -71,17 +180,32 @@ for (const packageName in packageMap) {
const packageFile = resolve(path, 'package.json');
const packageJson = JSON.parse(await readFile(packageFile, 'utf-8'));
packageJson.version = packageMap[packageName].nextVersion =
isDirty ||
Object.keys(packageJson.dependencies || {}).some(
(dependencyName) => packageMap[dependencyName]?.isDirty,
)
? releaseType === 'experimental'
? generateExperimentalVersion(version)
: releaseType === 'rc'
? generateRcVersion(version)
: semver.inc(version, releaseType)
: version;
const dependencyIsDirty = Object.keys(packageJson.dependencies || {}).some(
(dependencyName) => packageMap[dependencyName]?.isDirty,
);
let newVersion = version;
if (isDirty || dependencyIsDirty) {
switch (releaseType) {
case 'experimental':
newVersion = generateExperimentalVersion(version);
break;
case 'premajor':
newVersion = semver.inc(
version,
version.includes('-rc.') ? 'prerelease' : 'premajor',
undefined,
'rc',
);
break;
default:
newVersion = semver.inc(version, releaseType);
break;
}
}
packageJson.version = packageMap[packageName].nextVersion = newVersion;
await writeFile(packageFile, JSON.stringify(packageJson, null, 2) + '\n');
}


@@ -0,0 +1,68 @@
#!/usr/bin/env node
/**
* Builds the Claude task prompt and writes it to GITHUB_ENV.
* Uses a random delimiter to prevent heredoc collision with user input.
*
* Usage: node prepare-claude-prompt.mjs
*
* Environment variables:
* INPUT_TASK - The task description (required)
* USE_RAW_PROMPT - "true" to pass task directly without wrapping
* GITHUB_ENV - Path to GitHub env file (set by Actions)
*/
import { randomUUID } from 'node:crypto';
import { appendFileSync, readdirSync } from 'node:fs';
const task = process.env.INPUT_TASK;
const useRaw = process.env.USE_RAW_PROMPT === 'true';
const envFile = process.env.GITHUB_ENV;
if (!task) {
console.error('INPUT_TASK environment variable is required');
process.exit(1);
}
if (!envFile) {
console.error('GITHUB_ENV environment variable is required');
process.exit(1);
}
let prompt;
if (useRaw) {
prompt = task;
} else {
// List available templates so Claude knows what exists (reads them if needed)
const templateDir = '.github/claude-templates';
let templateSection = '';
try {
const files = readdirSync(templateDir).filter((f) => f.endsWith('.md'));
if (files.length > 0) {
const listing = files.map((f) => ` - ${templateDir}/${f}`).join('\n');
templateSection = `\n# Templates\nThese guides are available if relevant to your task. Read any that match before starting:\n${listing}`;
}
} catch {
// No templates directory, skip
}
prompt = `# Task
${task}
${templateSection}
# Instructions
1. Read any relevant templates listed above before starting
2. Complete the task described above
3. Make commits as you work - the last commit message will be used as the PR title
4. IMPORTANT: End every commit message with: Co-authored-by: Claude <noreply@anthropic.com>
5. Ensure code passes linting and type checks before finishing
# Token Optimization
When running lint/typecheck, suppress verbose output:
pnpm lint 2>&1 | tail -30
pnpm typecheck 2>&1 | tail -30`;
}
// Random delimiter guarantees no collision with user content
const delimiter = `CLAUDE_PROMPT_DELIM_${randomUUID().replace(/-/g, '')}`;
appendFileSync(envFile, `CLAUDE_PROMPT<<${delimiter}\n${prompt}\n${delimiter}\n`);


@@ -0,0 +1,59 @@
#!/usr/bin/env node
/**
* Sends a callback to the resume URL with the Claude task result.
* Uses fetch() directly to avoid E2BIG errors from shell argument limits.
*
* Usage: node resume-callback.mjs
*
* Environment variables:
* RESUME_URL - Callback URL to POST to (required)
* EXECUTION_FILE - Path to Claude's execution output JSON (optional)
* CLAUDE_OUTCOME - "success" or "failure" (required)
* CLAUDE_SESSION_ID - Session ID for resuming conversations (optional)
* BRANCH_NAME - Git branch name (optional)
*/
import { existsSync, readFileSync } from 'node:fs';
const resumeUrl = process.env.RESUME_URL;
const executionFile = process.env.EXECUTION_FILE;
const claudeOutcome = process.env.CLAUDE_OUTCOME;
const sessionId = process.env.CLAUDE_SESSION_ID ?? '';
const branchName = process.env.BRANCH_NAME ?? '';
if (!resumeUrl) {
console.error('RESUME_URL environment variable is required');
process.exit(1);
}
const success = claudeOutcome === 'success';
let result = null;
if (executionFile && existsSync(executionFile)) {
try {
const execution = JSON.parse(readFileSync(executionFile, 'utf-8'));
// Extract the last element (Claude's final result message)
result = Array.isArray(execution) ? execution.at(-1) : execution;
} catch (err) {
console.warn(`Failed to parse execution file: ${err.message}`);
}
}
const payload = JSON.stringify({ success, branch: branchName, sessionId, result });
try {
const response = await fetch(resumeUrl, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: payload,
});
if (!response.ok) {
const body = await response.text();
console.error(`Callback failed: ${body}`);
process.exit(1);
}
} catch (err) {
console.error(`Callback error: ${err.message}`);
process.exit(1);
}
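The `result` extraction above assumes the execution file is an array of messages whose final element is Claude's result. A sketch with an assumed message shape (the field names here are illustrative, not the actual output schema):

```javascript
// Illustrative only: the message objects' fields are assumptions.
const execution = [
  { type: 'system', subtype: 'init' },
  { type: 'result', result: 'done' },
];

// Same logic as the script: take the last element if it's an array,
// otherwise use the parsed value as-is.
const result = Array.isArray(execution) ? execution.at(-1) : execution;
console.log(result.type); // 'result'
```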

.github/scripts/cleanup-ghcr-images.mjs

@@ -0,0 +1,157 @@
#!/usr/bin/env node
/**
* Cleanup GHCR images for n8n CI
*
* Modes:
* --tag <tag> Delete exact tag (post-run cleanup)
* --stale <days> Delete ci-* images older than N days (daily scheduled cleanup)
*
* Context:
* - Each CI run tags images as ci-{run_id}
* - Post-run cleanup uses --tag to delete the current run's images
* - Daily cron uses --stale to catch any orphaned images
*/
import { exec } from 'node:child_process';
import { promisify } from 'node:util';
const execAsync = promisify(exec);
const ORG = process.env.GHCR_ORG || 'n8n-io';
const REPO = process.env.GHCR_REPO || 'n8n';
const PACKAGES = [REPO, 'runners'];
const [mode, rawValue] = process.argv.slice(2);
if (!['--tag', '--stale'].includes(mode) || !rawValue) {
console.error('Usage: cleanup-ghcr-images.mjs --tag <tag> | --stale <days>');
process.exit(1);
}
const value = mode === '--stale' ? parseInt(rawValue, 10) : rawValue;
if (mode === '--stale' && (isNaN(value) || value <= 0)) {
console.error('Error: --stale requires a positive number of days');
process.exit(1);
}
async function ghApi(path) {
const { stdout } = await execAsync(
`gh api "/orgs/${ORG}/packages/container/${path}"`,
);
return JSON.parse(stdout);
}
async function ghDelete(path) {
await execAsync(`gh api --method DELETE "/orgs/${ORG}/packages/container/${path}"`);
}
async function fetchPage(pkg, page) {
try {
return await ghApi(`${pkg}/versions?per_page=100&page=${page}`);
} catch (err) {
if (err.code === 1 && err.stderr?.includes('404')) return [];
throw new Error(`Failed to fetch ${pkg} page ${page}: ${err.message}`);
}
}
const isCiImage = (v) => {
const tags = v.metadata?.container?.tags || [];
return tags.some((t) => t.startsWith('ci-') || t.startsWith('pr-'));
};
const isStale = (v, days) => {
const cutoff = Date.now() - days * 86400000;
return isCiImage(v) && new Date(v.created_at) < cutoff;
};
async function getVersionsForTag(pkg, tag) {
const batch = await fetchPage(pkg, 1);
const match = batch.find((v) => v.metadata?.container?.tags?.includes(tag));
return match ? [match] : [];
}
async function getVersionsForStale(pkg, days) {
const versions = [];
const cutoff = Date.now() - days * 86400000;
// Use 2x cutoff as safety window for early termination
const earlyExitCutoff = Date.now() - days * 2 * 86400000;
let pagesWithoutCiImages = 0;
const firstPage = await fetchPage(pkg, 1);
if (!firstPage.length) return [];
for (const v of firstPage) {
if (isStale(v, days)) versions.push(v);
}
if (firstPage.length < 100) return versions;
for (let page = 2; ; page += 10) {
const batches = await Promise.all(
Array.from({ length: 10 }, (_, i) => fetchPage(pkg, page + i)),
);
let done = false;
for (const batch of batches) {
if (!batch.length || batch.length < 100) done = true;
let hasCiImages = false;
for (const v of batch) {
if (isCiImage(v)) {
hasCiImages = true;
if (new Date(v.created_at) < cutoff) versions.push(v);
}
}
// Early termination: if we've gone through pages without finding
// any CI images and all items are older than 2x cutoff, we're past
// the CI image window
if (!hasCiImages) {
pagesWithoutCiImages++;
const oldestInBatch = batch[batch.length - 1];
if (
pagesWithoutCiImages >= 3 &&
oldestInBatch &&
new Date(oldestInBatch.created_at) < earlyExitCutoff
) {
console.log(` Early termination at page ${page + batches.indexOf(batch)}`);
done = true;
}
} else {
pagesWithoutCiImages = 0;
}
if (!batch.length || done) break;
}
if (done) break;
}
return versions;
}
let hasErrors = false;
for (const pkg of PACKAGES) {
console.log(`Processing ${pkg}...`);
let consecutiveErrors = 0;
const toDelete =
mode === '--tag'
? await getVersionsForTag(pkg, value)
: await getVersionsForStale(pkg, value);
if (!toDelete.length) {
console.log(` No matching images found`);
continue;
}
for (const v of toDelete) {
try {
await ghDelete(`${pkg}/versions/${v.id}`);
console.log(` Deleted ${v.metadata.container.tags.join(',')}`);
consecutiveErrors = 0;
} catch (err) {
console.error(` Failed to delete ${v.id}: ${err.message}`);
hasErrors = true;
if (++consecutiveErrors >= 3) {
throw new Error('Too many consecutive delete failures, aborting');
}
}
}
}
if (hasErrors) process.exit(1);


@@ -0,0 +1,123 @@
import fs from 'node:fs/promises';
import { getOctokit } from '@actions/github';
import { ensureEnvVar, readPrLabels } from './github-helpers.mjs';
/**
* @typedef {PullRequestCheckPass | PullRequestCheckFail} PullRequestCheckResult
**/
/**
* @typedef PullRequestCheckPass
* @property {true} pass
* @property {string} baseRef
* */
/**
* @typedef PullRequestCheckFail
* @property {false} pass
* @property {string} reason
* */
/**
* @param {PullRequestCheckResult} pullRequestCheck
*
* @returns { pullRequestCheck is PullRequestCheckFail }
* */
function pullRequestCheckFailed(pullRequestCheck) {
return !pullRequestCheck.pass;
}
/**
* @param {any} pullRequest
* @returns {PullRequestCheckResult}
*/
export function pullRequestIsDismissedRelease(pullRequest) {
if (!pullRequest) {
throw new Error('Missing pullRequest in event payload');
}
const baseRef = pullRequest?.base?.ref ?? '';
const headRef = pullRequest?.head?.ref ?? '';
const merged = Boolean(pullRequest?.merged);
if (merged) {
return { pass: false, reason: 'PR was merged' };
}
// Must match your release PR pattern:
// base: release/<ver>
// head: release-pr/<ver>
if (!baseRef.startsWith('release/')) {
return { pass: false, reason: `Base ref '${baseRef}' is not release/*` };
}
if (!headRef.startsWith('release-pr/')) {
return { pass: false, reason: `Head ref '${headRef}' is not release-pr/*` };
}
const baseVer = baseRef.slice('release/'.length);
const headVer = headRef.slice('release-pr/'.length);
if (!baseVer || baseVer !== headVer) {
return { pass: false, reason: `Version mismatch: base='${baseVer}' head='${headVer}'` };
}
const labelNames = readPrLabels(pullRequest);
if (!labelNames.includes('release')) {
return {
pass: false,
reason: `Missing required label 'release' (labels: ${labelNames.join(', ') || '[none]'})`,
};
}
return { pass: true, baseRef };
}
async function main() {
const token = ensureEnvVar('GITHUB_TOKEN');
const eventPath = ensureEnvVar('GITHUB_EVENT_PATH');
const repoFullName = ensureEnvVar('GITHUB_REPOSITORY');
const [owner, repo] = repoFullName.split('/');
if (!owner || !repo) {
throw new Error(`Invalid GITHUB_REPOSITORY: '${repoFullName}'`);
}
const rawEventData = await fs.readFile(eventPath, 'utf8');
const event = JSON.parse(rawEventData);
const result = pullRequestIsDismissedRelease(event.pull_request);
if (pullRequestCheckFailed(result)) {
console.log(`no-op: ${result.reason}`);
return;
}
const branch = result.baseRef; // e.g. "release/2.11.0"
console.log(`PR qualifies. Deleting branch '${branch}'...`);
const octokit = getOctokit(token);
try {
await octokit.rest.git.deleteRef({
owner,
repo,
// ref must be "heads/<branch>"
ref: `heads/${branch}`,
});
console.log(`Deleted '${branch}'.`);
} catch (err) {
// If it was already deleted, treat as success.
const status = err?.status;
if (status === 404) {
console.log(`Branch '${branch}' not found (already deleted).`);
return;
}
console.error(err);
throw new Error(`Failed to delete '${branch}'.`);
}
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
await main();
}


@@ -0,0 +1,147 @@
import { describe, it, mock, before } from 'node:test';
import assert from 'node:assert/strict';
import { readPrLabels } from './github-helpers.mjs';
/**
* Run these tests by running
*
* node --test --experimental-test-module-mocks ./.github/scripts/cleanup-release-branch.test.mjs
* */
// mock.module must be called before the module under test is imported,
// because static imports are hoisted and resolve before any code runs.
mock.module('./github-helpers.mjs', {
namedExports: {
ensureEnvVar: () => {}, // no-op
readPrLabels: (pr) => {
return readPrLabels(pr);
},
},
});
let pullRequestIsDismissedRelease;
before(async () => {
({ pullRequestIsDismissedRelease } = await import('./cleanup-release-branch.mjs'));
});
describe('pullRequestIsDismissedRelease', () => {
it('Recognizes classic dismissed pull request', () => {
const pullRequest = {
merged: false,
labels: ['release'],
base: {
ref: 'release/2.9.0',
},
head: {
ref: 'release-pr/2.9.0',
},
};
/** @type { import('./cleanup-release-branch.mjs').PullRequestCheckResult } */
const result = pullRequestIsDismissedRelease(pullRequest);
assert.equal(result.pass, true);
assert.equal(result.reason, undefined);
});
it("Doesn't pass PR with malformed head", () => {
const pullRequest = {
merged: false,
labels: ['release'],
base: {
ref: 'release/2.9.0',
},
head: {
ref: 'my-fork-release-pr/2.9.0',
},
};
/** @type { import('./cleanup-release-branch.mjs').PullRequestCheckResult } */
const result = pullRequestIsDismissedRelease(pullRequest);
assert.equal(result.pass, false);
assert.equal(result.reason, `Head ref '${pullRequest.head.ref}' is not release-pr/*`);
});
it("Doesn't pass PR with malformed base", () => {
const pullRequest = {
merged: false,
labels: ['release'],
base: {
ref: 'master',
},
head: {
ref: 'release-pr/2.9.0',
},
};
/** @type { import('./cleanup-release-branch.mjs').PullRequestCheckResult } */
const result = pullRequestIsDismissedRelease(pullRequest);
assert.equal(result.pass, false);
assert.equal(result.reason, `Base ref '${pullRequest.base.ref}' is not release/*`);
});
it("Doesn't pass merged PRs", () => {
const pullRequest = {
merged: true,
labels: ['release'],
base: {
ref: 'release/2.9.0',
},
head: {
ref: 'release-pr/2.9.0',
},
};
/** @type { import('./cleanup-release-branch.mjs').PullRequestCheckResult } */
const result = pullRequestIsDismissedRelease(pullRequest);
assert.equal(result.pass, false);
assert.equal(result.reason, `PR was merged`);
});
it("Doesn't pass on PR version mismatch", () => {
const pullRequest = {
merged: false,
labels: ['release'],
base: {
ref: 'release/2.9.0',
},
head: {
ref: 'release-pr/2.9.1',
},
};
/** @type { import('./cleanup-release-branch.mjs').PullRequestCheckResult } */
const result = pullRequestIsDismissedRelease(pullRequest);
assert.equal(result.pass, false);
assert.equal(
result.reason,
`Version mismatch: base='${pullRequest.base.ref.replace('release/', '')}' head='${pullRequest.head.ref.replace('release-pr/', '')}'`,
);
});
it("Doesn't pass a PR with missing 'release' label", () => {
const pullRequest = {
merged: false,
labels: ['release-pr', 'core-team'],
base: {
ref: 'release/2.9.0',
},
head: {
ref: 'release-pr/2.9.0',
},
};
/** @type { import('./cleanup-release-branch.mjs').PullRequestCheckResult } */
const result = pullRequestIsDismissedRelease(pullRequest);
assert.equal(result.pass, false);
assert.equal(
result.reason,
`Missing required label 'release' (labels: ${pullRequest.labels.join(', ')})`,
);
});
});


@@ -0,0 +1,109 @@
// Creates backport PRs according to labels on the merged PR
import {
getPullRequestById,
readPrLabels,
resolveRcBranchForTrack,
writeGithubOutput,
} from './github-helpers.mjs';
/** @type { Record<string, import('./github-helpers.mjs').ReleaseTrack> } */
const BACKPORT_BY_TAG_MAP = {
'Backport to Beta': 'beta',
'Backport to Stable': 'stable',
};
const BACKPORT_BY_BRANCH_MAP = {
'Backport to v1': '1.x',
};
/**
* @param {Set<string>} labels
*
* @returns { Set<string> }
*/
export function labelsToReleaseCandidateBranches(labels) {
const targets = new Set();
// Backport by tag map includes mapping of label to git tag to resolve
for (const [label, tag] of Object.entries(BACKPORT_BY_TAG_MAP)) {
// Check if backport label is present
if (!labels.has(label)) {
continue;
}
const branch = resolveRcBranchForTrack(tag);
// Make sure our backport branch exists
if (!branch) {
continue;
}
targets.add(branch);
}
// Backport by branch map includes mapping of label to git branch. This is used for
// older versions of n8n. v1, etc.
for (const [label, branch] of Object.entries(BACKPORT_BY_BRANCH_MAP)) {
// Check if backport label is present
if (!labels.has(label)) {
continue;
}
targets.add(branch);
}
return targets;
}
/**
* This script is called in 2 cases:
*
* 1. When a PR is merged, in which case functions like `readPrLabels` reads PR info from GITHUB_EVENT_PATH
* 2. Manually via Workflow Dispatch, where a Pull Request ID is passed as an env parameter
*
* @returns { Promise<undefined | any> } Pull request object, if ID was provided in env params
*/
async function fetchPossiblePullRequestFromEnv() {
const pullRequestEnv = process.env.PULL_REQUEST_ID;
if (!pullRequestEnv) {
// No ID provided, will proceed to read data from GITHUB_EVENT_PATH
return undefined;
}
const pullRequestNumber = parseInt(pullRequestEnv);
if (isNaN(pullRequestNumber)) {
throw new Error(
"PULL_REQUEST_ID must be a number. It shouldn't contain any other symbols (#, PR, etc.)",
);
}
return await getPullRequestById(pullRequestNumber);
}
export async function getLabels() {
const pullRequest = await fetchPossiblePullRequestFromEnv();
return new Set(readPrLabels(pullRequest));
}
async function main() {
const labels = await getLabels();
if (!labels || labels.size === 0) {
console.log('No labels on PR. Exiting...');
return;
}
const backportBranches = labelsToReleaseCandidateBranches(labels);
if (backportBranches.size === 0) {
console.log('No backports needed. Exiting...');
return;
}
const target_branches = [...backportBranches].join(' '); // korthout/backport-action@v4 uses space-delimited branch list
writeGithubOutput({ target_branches });
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
await main();
}


@@ -0,0 +1,104 @@
import { describe, it, mock, before } from 'node:test';
import assert from 'node:assert/strict';
import { readPrLabels } from './github-helpers.mjs';
/**
* Run these tests by running
*
* node --test --experimental-test-module-mocks ./.github/scripts/compute-backport-targets.test.mjs
* */
// mock.module must be called before the module under test is imported,
// because static imports are hoisted and resolve before any code runs.
mock.module('./github-helpers.mjs', {
namedExports: {
ensureEnvVar: () => {}, // no-op
readPrLabels: readPrLabels,
resolveRcBranchForTrack: mockResolveRcBranchForTrack,
writeGithubOutput: () => {}, //no-op
getPullRequestById: () => {
return {
labels: ['n8n team', 'Backport to Beta'],
};
},
},
});
function mockResolveRcBranchForTrack(track) {
switch (track) {
case 'beta':
return 'release-candidate/2.10.1';
case 'stable':
return 'release-candidate/2.9.4';
}
return undefined;
}
let labelsToReleaseCandidateBranches, getLabels;
before(async () => {
({ labelsToReleaseCandidateBranches, getLabels } = await import(
'./compute-backport-targets.mjs'
));
});
describe('Compute backport targets', () => {
it('Finds backport branches for pointer tag labels', () => {
const labels = new Set(['Backport to Beta', 'Backport to Stable']);
/** @type { Set<string> } */
const result = labelsToReleaseCandidateBranches(labels);
assert.equal(result.size, 2);
assert.ok(result.has('release-candidate/2.10.1'));
assert.ok(result.has('release-candidate/2.9.4'));
});
  it("Doesn't parse other labels to backport branches", () => {
const labels = new Set(['n8n team', 'release']);
/** @type { Set<string> } */
const result = labelsToReleaseCandidateBranches(labels);
assert.equal(result.size, 0);
});
it("Doesn't parse malformed backport labels", () => {
const labels = new Set(['Backport to Fork', 'Backport to my Home']);
/** @type { Set<string> } */
const result = labelsToReleaseCandidateBranches(labels);
assert.equal(result.size, 0);
});
it('Should parse labels properly in Pull request context', async () => {
process.env.GITHUB_EVENT_PATH = './fixtures/mock-github-event.json';
/** @type { Set<string> } */
const labels = await getLabels();
assert.equal(labels.size, 2);
assert.ok(labels.has('release'));
assert.ok(labels.has('Backport to Stable'));
});
it('Should parse labels properly in manual workflow context', async () => {
process.env.PULL_REQUEST_ID = '123';
/** @type { Set<string> } */
const labels = await getLabels();
assert.equal(labels.size, 2);
assert.ok(labels.has('n8n team'));
assert.ok(labels.has('Backport to Beta'));
});
it('Should throw when passed pull request id with #', async () => {
process.env.PULL_REQUEST_ID = '#123';
await assert.rejects(getLabels);
});
it('Should not throw when passed pull request id with just a number', async () => {
process.env.PULL_REQUEST_ID = '123';
await assert.doesNotReject(getLabels);
});
it('Should throw when passed pull request id with other than numbers included', async () => {
process.env.PULL_REQUEST_ID = 'abc-123';
await assert.rejects(getLabels);
});
});


@@ -0,0 +1,81 @@
import {
deleteRelease,
ensureEnvVar,
getExistingRelease,
initGithub,
isReleaseTrack,
writeGithubOutput,
} from './github-helpers.mjs';
/**
* Creates release in GitHub.
*
* Required env variables:
* - RELEASE_TAG - Release tag on git e.g. n8n@2.13.0
* - BODY - Body of the release. Contains release notes etc.
* - IS_PRE_RELEASE - If releasing in pre-release. Currently only for beta track.
* - MAKE_LATEST - If released version should be marked as latest on GitHub
* - COMMIT - Commitish for release to point to
*
* Optional env variables:
* - ADDITIONAL_TAGS - Comma-separated list of additional tags to release under e.g. beta
*
* GitHub variables
* - GITHUB_TOKEN - Used to authenticate to octokit - Can be overwritten for privileged access
* - GITHUB_REPOSITORY - Used to determine target repository
* */
async function createGitHubRelease() {
const RELEASE_TAG = ensureEnvVar('RELEASE_TAG');
const ADDITIONAL_TAGS = process.env.ADDITIONAL_TAGS ?? '';
const BODY = ensureEnvVar('BODY');
const IS_PRE_RELEASE = ensureEnvVar('IS_PRE_RELEASE');
const MAKE_LATEST = ensureEnvVar('MAKE_LATEST');
const COMMIT = ensureEnvVar('COMMIT');
const { octokit, owner, repo } = initGithub();
const allTags = [
RELEASE_TAG,
...ADDITIONAL_TAGS.split(',')
.map((t) => t.trim())
.filter(Boolean),
];
const releases = [];
for (const tag of allTags) {
const existingRelease = await getExistingRelease(tag);
const isReleaseTrackTag = isReleaseTrack(tag);
// If we have an existing track release, we want to
// delete the old release before pushing a new one.
if (isReleaseTrackTag && existingRelease) {
await deleteRelease(existingRelease.id);
}
const releaseResponse = await octokit.rest.repos.createRelease({
tag_name: tag,
name: tag,
body: BODY,
draft: false,
prerelease: IS_PRE_RELEASE === 'true',
make_latest: MAKE_LATEST === 'true' ? 'true' : 'false',
target_commitish: COMMIT,
owner,
repo,
});
const release = releaseResponse.data;
releases.push(release);
console.log(`Successfully created release ${release.html_url}`);
}
writeGithubOutput({
release_urls: releases.map((release) => release.html_url).join(', '),
});
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
createGitHubRelease();
}
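For reference, the tag-list assembly in `createGitHubRelease` above (the release tag first, then trimmed comma-separated `ADDITIONAL_TAGS` with blanks dropped) can be sketched as a standalone helper; `buildTagList` is a hypothetical name, not part of the script:

```javascript
// Hypothetical helper mirroring how createGitHubRelease assembles allTags:
// the release tag first, then any comma-separated additional tags, trimmed,
// with empty entries filtered out.
function buildTagList(releaseTag, additionalTags = '') {
  return [
    releaseTag,
    ...additionalTags
      .split(',')
      .map((t) => t.trim())
      .filter(Boolean),
  ];
}

console.log(buildTagList('n8n@2.13.0', ' beta, , stable '));
// → ['n8n@2.13.0', 'beta', 'stable']
```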

.github/scripts/detect-new-packages.mjs

@@ -0,0 +1,104 @@
/**
* Detects packages in the monorepo that have not yet been published to npm.
*
* Packages that are new (never published) cannot be released via OIDC Trusted
* Publishing because Trusted Publishing requires the package to already exist
* on npm with the publisher configured first.
*
* New packages must be handled manually:
* 1. Published once using an NPM token
* 2. Configured with Trusted Publishing on npmjs.com
*
* Exit codes:
* 0 All public packages exist on npm
* 1 One or more public packages have never been published
*/
import child_process from 'child_process';
import { promisify } from 'util';
import { writeGithubOutput } from './github-helpers.mjs';
const exec = promisify(child_process.exec);
const packages = JSON.parse(
(
await exec(
`pnpm ls -r --only-projects --json | jq -r '[.[] | { name:.name, private: .private}]'`,
)
).stdout,
);
const newPackages = [];
for (const { name, private: isPrivate } of packages) {
if (isPrivate) continue;
// Scoped packages must be encoded: @n8n/foo → @n8n%2Ffoo
const encodedName = name.startsWith('@') ? name.replace('/', '%2F') : name;
const url = `https://registry.npmjs.org/${encodedName}`;
try {
console.log(`Checking if ${name} exists...`);
const response = await fetch(url, { method: 'HEAD' });
if (response.status === 404) {
newPackages.push(name);
} else if (!response.ok && response.status !== 405) {
// 405 = Method Not Allowed for HEAD (some registries), not an error
console.log(
`::warning::Unexpected HTTP ${response.status} when checking npm registry for "${name}". Skipping check.`,
);
}
} catch (error) {
console.log(
`::warning::Could not reach npm registry for "${name}": ${error.message}. Skipping check.`,
);
}
}
if (newPackages.length === 0) {
const publicCount = packages.filter((p) => !p.private).length;
console.log(`✅ All ${publicCount} public packages exist on npm.`);
process.exit(0);
}
console.log(`
New unpublished packages detected!
The following packages do not yet exist on npm and cannot be published via
OIDC Trusted Publishing until they have been published at least once manually:
`);
for (const pkg of newPackages) {
console.log(
`::error::Package "${pkg}" has never been published to npm. A manual first-publish with an NPM token is required before it can use OIDC Trusted Publishing.`,
);
}
console.log(`
Steps to unblock the release, for each new package listed above:
1. Publish the package once manually using an NPM token:
cd to/where/package/lives
pnpm login
pnpm publish --access public
2. Configure Trusted Publishing on npmjs.com for each new package:
https://docs.npmjs.com/trusted-publishers
Use the following settings:
Repository owner : n8n-io
Repository name : n8n
Workflow filename: release-publish.yml
3. Re-run the Release: Publish workflow.
`);
const output = {
packages: newPackages.join(','),
};
console.log(` -- Writing to github output: ${JSON.stringify(output)}`);
writeGithubOutput(output);
process.exit(1);
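The registry URL construction above hinges on percent-encoding the slash in scoped package names. A minimal sketch of just that step; `registryUrlFor` is a hypothetical helper, not part of the script:

```javascript
// Hypothetical helper mirroring the URL construction above: scoped packages
// (@scope/name) need their slash encoded as %2F for the npm registry API.
function registryUrlFor(name) {
  const encodedName = name.startsWith('@') ? name.replace('/', '%2F') : name;
  return `https://registry.npmjs.org/${encodedName}`;
}

console.log(registryUrlFor('@n8n/workflow')); // → https://registry.npmjs.org/@n8n%2Fworkflow
console.log(registryUrlFor('n8n')); // → https://registry.npmjs.org/n8n
```

A HEAD request against this URL returning 404 is what the script treats as "never published".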


@@ -0,0 +1,52 @@
import {
ensureEnvVar,
listCommitsBetweenRefs,
resolveRcBranchForTrack,
resolveReleaseTagForTrack,
writeGithubOutput,
} from './github-helpers.mjs';
function main() {
const track = /** @type { import('./github-helpers.mjs').ReleaseTrack } */ (
ensureEnvVar('TRACK')
);
const currentTag = resolveReleaseTagForTrack(track);
const releaseCandidateBranch = resolveRcBranchForTrack(track);
if (!currentTag?.tag || !releaseCandidateBranch) {
throw new Error(
`Couldn't resolve needed parameters. currentTag.tag=${currentTag?.tag}, releaseCandidateBranch=${releaseCandidateBranch}`,
);
}
console.log(`Commits between ${releaseCandidateBranch} and ${currentTag.tag}:`);
console.log(listCommitsBetweenRefs(releaseCandidateBranch, currentTag.tag));
const commitList = listCommitsBetweenRefs(releaseCandidateBranch, currentTag.tag)
.split('\n')
.filter((commit) => commit.trim().length > 0);
const actionableCommitList = filterActionableCommits(commitList);
const output = {
release_candidate_branch: releaseCandidateBranch,
should_update: actionableCommitList.length > 0 ? 'true' : 'false',
};
console.log(output);
writeGithubOutput(output);
}
/**
* @param { string[] } commitList
* */
export function filterActionableCommits(commitList) {
return commitList.filter((commit) => !commit.trimStart().startsWith('ci:'));
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
main();
}
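The `filterActionableCommits` rule above can be demonstrated inline; `filterActionable` below reproduces the same logic under a hypothetical name:

```javascript
// Same rule as filterActionableCommits above: commit subjects starting with
// "ci:" (ignoring leading whitespace) are not actionable for a release update.
function filterActionable(commitList) {
  return commitList.filter((commit) => !commit.trimStart().startsWith('ci:'));
}

const commits = [
  'fix(core): Repair a thing (abc123)',
  '  ci: Bump an action (def456)',
];
console.log(filterActionable(commits)); // → ['fix(core): Repair a thing (abc123)']
```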


@@ -0,0 +1,46 @@
import { ensureEnvVar, sh, writeGithubOutput } from './github-helpers.mjs';
function determineReleaseVersionChanges() {
const previousVersion = ensureEnvVar('PREVIOUS_VERSION_TAG');
const releaseVersion = ensureEnvVar('RELEASE_VERSION_TAG');
const log = sh('git', [
'--no-pager',
'log',
'--format="%s (%h)"',
`${previousVersion}..${releaseVersion}`,
]);
writeGithubOutput({
has_node_enhancements: hasNodeEnhancements(log),
has_core_changes: hasCoreChanges(log),
});
}
/**
* Matches commit messages with
*
* fix(nodes)
* fix(xyz Node)
* feat(nodes)
* feat(xyz Node)
*
* @param {string} log
*/
export function hasNodeEnhancements(log) {
return /(fix|feat)\((.*Node|nodes)\)/.test(log);
}
/**
* Matches commit messages with feat(core) or feat(editor)
*
* @param {string} log
*/
export function hasCoreChanges(log) {
return /feat\((core|editor)\)/.test(log);
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
determineReleaseVersionChanges();
}


@@ -0,0 +1,47 @@
import { describe, it, mock, before } from 'node:test';
import assert from 'node:assert/strict';
/**
* Run these tests by running
*
* node --test --experimental-test-module-mocks ./.github/scripts/determine-release-version-changes.test.mjs
* */
// mock.module must be called before the module under test is imported,
// because static imports are hoisted and resolve before any code runs.
mock.module('./github-helpers.mjs', {
namedExports: {
ensureEnvVar: () => {}, // no-op
sh: () => {}, // no-op
writeGithubOutput: () => {}, // no-op
},
});
let hasNodeEnhancements, hasCoreChanges;
before(async () => {
({ hasNodeEnhancements, hasCoreChanges } = await import(
'./determine-release-version-changes.mjs'
));
});
describe('Determine release version changes', () => {
it('Matches nodes feature', () => {
assert.ok(hasNodeEnhancements('feat(nodes): Added a utility for node'));
});
it('Matches nodes fix', () => {
assert.ok(hasNodeEnhancements('fix(nodes): Fix said utility'));
});
it('Matches named node feature', () => {
assert.ok(hasNodeEnhancements('feat(Github Actions Node): Add ability to call webhooks'));
});
it('Matches named node fix', () => {
assert.ok(hasNodeEnhancements('fix(OpenAI Node): Allow credentials to pass through'));
});
it('Matches core changes', () => {
assert.ok(hasCoreChanges('feat(core): Add cli flag'));
});
it('Matches editor changes', () => {
assert.ok(hasCoreChanges('feat(editor): Add button'));
});
});


@@ -0,0 +1,138 @@
import { readFileSync } from 'node:fs';
import {
RELEASE_TRACKS,
resolveReleaseTagForTrack,
tagVersionInfoToReleaseCandidateBranchName,
writeGithubOutput,
} from './github-helpers.mjs';
import semver from 'semver';
/**
* @param {any} packageVersion
*/
export function determineTrack(packageVersion) {
if (!semver.valid(packageVersion)) {
throw new Error(`Package semver not valid. Got ${packageVersion}`);
}
/** @type { Partial<Record<import('./github-helpers.mjs').ReleaseTrack, import('./github-helpers.mjs').TagVersionInfo>> } */
const trackToReleaseMap = {};
for (const t of RELEASE_TRACKS) {
trackToReleaseMap[t] = resolveReleaseTagForTrack(t);
}
console.log('Current Tracks: ', JSON.stringify(trackToReleaseMap, null, 4));
let track = null;
let newStable = null;
let bump = determineBump(packageVersion);
const releaseType = determineReleaseType(packageVersion);
// Check through our current release versions, if semver matches,
// we inherit the track pointer from them
for (const [releaseTrack, tagVersionInfo] of Object.entries(trackToReleaseMap)) {
if (tagVersionInfo && matchesTrack(tagVersionInfo, packageVersion)) {
track = releaseTrack;
break;
}
}
if (!track) {
if (!trackToReleaseMap.beta?.version) {
throw new Error(
'Likely updating to new beta release, but no existing beta tag was found in git.',
);
}
// If no track was found among the current versions, verify that we're
// building a new beta version and that the input is valid.
assertNewBetaRelease(trackToReleaseMap.beta.version, packageVersion);
track = 'beta';
newStable = trackToReleaseMap.beta.version;
}
if (!track) {
throw new Error('Could not determine track for release. Exiting...');
}
const rc_branch = tagVersionInfoToReleaseCandidateBranchName({
version: packageVersion,
tag: /** @type {import('./github-helpers.mjs').ReleaseVersion} */ (`n8n@${packageVersion}`),
});
const previousVersion = trackToReleaseMap[track]?.version;
const output = {
previous_version: previousVersion,
version: packageVersion,
track,
bump,
new_stable_version: newStable,
release_type: releaseType,
rc_branch,
};
writeGithubOutput(output);
console.log(
`Determined track info: ${Object.entries(output)
.map(([key, val]) => `${key}=${val}`)
.join(', ')}`,
);
return output;
}
/**
 * The current version matches a track if their major and minor versions match.
 *
 * This means we are working with a patch release.
*
* @param {import("./github-helpers.mjs").TagVersionInfo} tagVersionInfo
* @param {any} currentVersion
*/
function matchesTrack(tagVersionInfo, currentVersion) {
if (semver.major(tagVersionInfo.version) !== semver.major(currentVersion)) {
return false;
}
if (semver.minor(tagVersionInfo.version) !== semver.minor(currentVersion)) {
return false;
}
return true;
}
/**
* @param {string} currentBetaVersion
* @param {any} currentVersion
*/
function assertNewBetaRelease(currentBetaVersion, currentVersion) {
if (semver.major(currentBetaVersion) !== semver.major(currentVersion)) {
throw new Error('Major version bumps are not allowed by this pipeline');
}
const bumpedCurrentBeta = semver.inc(currentBetaVersion, 'minor');
if (semver.minor(bumpedCurrentBeta) !== semver.minor(currentVersion)) {
throw new Error(
`Trying to upgrade minor version by more than one increment. Previous: ${bumpedCurrentBeta}, Requested: ${currentVersion}`,
);
}
}
function determineReleaseType(currentVersion) {
if (currentVersion.includes('-rc.')) {
return 'rc';
}
return 'stable';
}
function determineBump(currentVersion) {
if (semver.patch(currentVersion) === 0 && determineReleaseType(currentVersion) !== 'rc') {
return 'minor';
}
return 'patch';
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
const packageJson = JSON.parse(readFileSync('./package.json', 'utf8'));
determineTrack(packageJson.version);
}


@@ -0,0 +1,119 @@
import { describe, it, mock, before } from 'node:test';
import assert from 'node:assert/strict';
import { tagVersionInfoToReleaseCandidateBranchName } from './github-helpers.mjs';
/**
* Run these tests by running
*
* node --test --experimental-test-module-mocks ./.github/scripts/determine-version-info.test.mjs
* */
// mock.module must be called before the module under test is imported,
// because static imports are hoisted and resolve before any code runs.
mock.module('./github-helpers.mjs', {
namedExports: {
RELEASE_TRACKS: ['stable', 'beta', 'v1'],
resolveReleaseTagForTrack: (track) => {
// Always return deterministic data
if (track === 'stable') return { version: '2.9.2', tag: 'n8n@2.9.2' };
if (track === 'beta') return { version: '2.10.1', tag: 'n8n@2.10.1' };
return { version: '1.123.33', tag: 'n8n@1.123.33' };
},
tagVersionInfoToReleaseCandidateBranchName,
writeGithubOutput: () => {}, // no-op in tests
getCommitForRef: () => {}, // no-op
localRefExists: () => {}, // no-op
remoteBranchExists: () => {}, // no-op
sh: () => {}, // no-op
},
});
let determineTrack;
before(async () => {
({ determineTrack } = await import('./determine-version-info.mjs'));
});
describe('determine-tracks', () => {
it('Allow patch releases on stable', () => {
const output = determineTrack('2.9.3');
assert.equal(output.track, 'stable');
assert.equal(output.version, '2.9.3');
assert.equal(output.previous_version, '2.9.2');
assert.equal(output.bump, 'patch');
assert.equal(output.new_stable_version, null);
assert.equal(output.release_type, 'stable');
assert.equal(output.rc_branch, 'release-candidate/2.9.x');
});
it('Allow patch releases on beta', () => {
const output = determineTrack('2.10.2');
assert.equal(output.track, 'beta');
assert.equal(output.version, '2.10.2');
assert.equal(output.previous_version, '2.10.1');
assert.equal(output.bump, 'patch');
assert.equal(output.new_stable_version, null);
assert.equal(output.release_type, 'stable');
assert.equal(output.rc_branch, 'release-candidate/2.10.x');
});
// This use case might happen if a patch release fails and we proceed with rolling over to next release
it('Allow skipping versions in patches', () => {
const output = determineTrack('2.9.4');
assert.equal(output.track, 'stable');
assert.equal(output.version, '2.9.4');
assert.equal(output.previous_version, '2.9.2');
assert.equal(output.bump, 'patch');
assert.equal(output.new_stable_version, null);
assert.equal(output.release_type, 'stable');
assert.equal(output.rc_branch, 'release-candidate/2.9.x');
});
it('Disallow skipping versions in minors', () => {
assert.throws(() => determineTrack('2.12.0'));
});
it('Disallow changing major version', () => {
assert.throws(() => determineTrack('3.0.0'));
});
it('Throw when track is not determinable', () => {
assert.throws(() => determineTrack(''));
});
it('Set track as "beta" when doing a minor bump', () => {
const output = determineTrack('2.11.0');
assert.equal(output.track, 'beta');
assert.equal(output.version, '2.11.0');
assert.equal(output.previous_version, '2.10.1');
assert.equal(output.bump, 'minor');
assert.equal(output.new_stable_version, '2.10.1');
assert.equal(output.release_type, 'stable');
assert.equal(output.rc_branch, 'release-candidate/2.11.x');
});
it('Set release_type accordingly on rc releases', () => {
const output = determineTrack('2.10.2-rc.1');
assert.equal(output.track, 'beta');
assert.equal(output.version, '2.10.2-rc.1');
assert.equal(output.previous_version, '2.10.1');
assert.equal(output.bump, 'patch');
assert.equal(output.new_stable_version, null);
assert.equal(output.release_type, 'rc');
assert.equal(output.rc_branch, 'release-candidate/2.10.x');
});
it('Determines correct branches on 1.x', () => {
const output = determineTrack('1.123.34');
assert.equal(output.track, 'v1');
assert.equal(output.version, '1.123.34');
assert.equal(output.previous_version, '1.123.33');
assert.equal(output.bump, 'patch');
assert.equal(output.new_stable_version, null);
assert.equal(output.release_type, 'stable');
assert.equal(output.rc_branch, '1.x');
});
});


@@ -31,7 +31,7 @@ class BuildContext
case 'pull_request':
context.version = `pr-${pr}`;
context.release_type = 'dev';
context.push_to_ghcr = false;
context.platforms = ['linux/amd64'];
break;
case 'workflow_dispatch':


@@ -0,0 +1,63 @@
#!/usr/bin/env node
/**
* Extracts manifest digests and image names for SLSA provenance and VEX attestation.
*
* Usage:
* N8N_TAG=ghcr.io/n8n-io/n8n:1.0.0 node get-manifest-digests.mjs
*
* Environment variables:
* N8N_TAG - Full image reference for n8n image
* RUNNERS_TAG - Full image reference for runners image
* DISTROLESS_TAG - Full image reference for runners-distroless image
* GITHUB_OUTPUT - Path to GitHub Actions output file (optional)
*/
import { execSync } from 'node:child_process';
import { appendFileSync } from 'node:fs';
const githubOutput = process.env.GITHUB_OUTPUT || null;
function getDigest(imageRef) {
if (!imageRef) return '';
const raw = execSync(`docker buildx imagetools inspect "${imageRef}" --raw`, {
encoding: 'utf8',
});
const hash = execSync('sha256sum', { input: raw, encoding: 'utf8' }).split(' ')[0].trim();
return `sha256:${hash}`;
}
function getImageName(imageRef) {
if (!imageRef) return '';
return imageRef.replace(/:([^:]+)$/, '');
}
function setOutput(name, value) {
if (githubOutput && value) appendFileSync(githubOutput, `${name}=${value}\n`);
}
const n8nTag = process.env.N8N_TAG || '';
const runnersTag = process.env.RUNNERS_TAG || '';
const distrolessTag = process.env.DISTROLESS_TAG || '';
const results = {
n8n: { digest: getDigest(n8nTag), image: getImageName(n8nTag) },
runners: { digest: getDigest(runnersTag), image: getImageName(runnersTag) },
runners_distroless: { digest: getDigest(distrolessTag), image: getImageName(distrolessTag) },
};
setOutput('n8n_digest', results.n8n.digest);
setOutput('n8n_image', results.n8n.image);
setOutput('runners_digest', results.runners.digest);
setOutput('runners_image', results.runners.image);
setOutput('runners_distroless_digest', results.runners_distroless.digest);
setOutput('runners_distroless_image', results.runners_distroless.image);
console.log('=== Manifest Digests ===');
console.log(`n8n: ${results.n8n.digest || 'N/A'}`);
console.log(`runners: ${results.runners.digest || 'N/A'}`);
console.log(`runners-distroless: ${results.runners_distroless.digest || 'N/A'}`);
console.log('');
console.log('=== Image Names ===');
console.log(`n8n: ${results.n8n.image || 'N/A'}`);
console.log(`runners: ${results.runners.image || 'N/A'}`);
console.log(`runners-distroless: ${results.runners_distroless.image || 'N/A'}`);
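The image-name extraction above relies on stripping the last colon-delimited segment. A standalone sketch of that transformation; `imageNameOf` is a hypothetical name for the same logic as `getImageName`:

```javascript
// Same transformation as getImageName above: strip the trailing ":tag" from
// an image reference to get the bare image name.
function imageNameOf(imageRef) {
  if (!imageRef) return '';
  return imageRef.replace(/:([^:]+)$/, '');
}

console.log(imageNameOf('ghcr.io/n8n-io/n8n:1.0.0')); // → ghcr.io/n8n-io/n8n
```

Note that the regex removes whatever follows the last colon, so it assumes the reference always carries a tag; a tagless reference with a registry port, such as `localhost:5000/n8n`, would lose everything after the port's colon.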


@@ -9,7 +9,13 @@ const exec = promisify(child_process.exec);
const commonFiles = ['LICENSE.md', 'LICENSE_EE.md'];
const baseDir = resolve(dirname(fileURLToPath(import.meta.url)), '../..');
const packages = JSON.parse((await exec('pnpm ls -r --only-projects --json')).stdout);
const packages = JSON.parse(
(
await exec(
`pnpm ls -r --only-projects --json | jq -r '[.[] | { name: .name, version: .version, path: .path, private: .private}]'`,
)
).stdout,
);
for (let { name, path, version, private: isPrivate } of packages) {
if (isPrivate) continue;


@@ -0,0 +1,143 @@
import semver from 'semver';
import {
getCommitForRef,
localRefExists,
RELEASE_CANDIDATE_BRANCH_PREFIX,
remoteBranchExists,
resolveReleaseTagForTrack,
sh,
tagVersionInfoToReleaseCandidateBranchName,
trySh,
writeGithubOutput,
} from './github-helpers.mjs';
/**
* @typedef BranchChanges
* @property { import('./github-helpers.mjs').TagVersionInfo[] } branchesToEnsure TagVersionInfo for branches the system needs to make sure exist
* @property { string[] } branchesToDeprecate Branches the system needs to remove as deprecated
* */
/**
* Look into git tags and determine which release candidate branches need to
* exist and which need to be deprecated and removed.
*
* @returns { BranchChanges }
* */
export function determineBranchChanges() {
const branchesToDeprecate = [];
const currentBetaVersion = resolveReleaseTagForTrack('beta');
const currentStableVersion = resolveReleaseTagForTrack('stable');
if (!currentBetaVersion || !currentStableVersion) {
throw new Error(
`Could not find current stable and/or beta tags. Beta: ${currentBetaVersion?.tag ?? 'not found'}, Stable: ${currentStableVersion?.tag ?? 'not found'}`,
);
}
const branchesToEnsure = [currentBetaVersion, currentStableVersion];
const stableVersion = currentStableVersion.version;
// The deprecated branch is the current stable minus two minor versions, e.g. if stable is 2.9.x, 2.7.x is deprecated
const deprecatedMinorVersion = semver.minor(stableVersion) - 2;
if (deprecatedMinorVersion >= 0) {
const deprecatedBranch = `${RELEASE_CANDIDATE_BRANCH_PREFIX}${semver.major(stableVersion)}.${deprecatedMinorVersion}.x`;
branchesToDeprecate.push(deprecatedBranch);
}
return {
branchesToEnsure,
branchesToDeprecate,
};
}
/**
* @param {import("./github-helpers.mjs").TagVersionInfo} tagInfo
*/
function ensureBranch(tagInfo) {
const branch = tagVersionInfoToReleaseCandidateBranchName(tagInfo);
if (remoteBranchExists(branch)) {
console.log(`Branch ${branch} already exists on origin. Skipping.`);
return branch;
}
const commitRef = getCommitForRef(tagInfo.tag);
console.log(`Creating branch ${branch} from ${tagInfo.tag} (${commitRef})`);
// Create local branch (force safe: it shouldn't exist, but keep it robust)
if (localRefExists(`refs/heads/${branch}`)) {
sh('git', ['branch', '-f', branch, commitRef]);
} else {
sh('git', ['switch', '-c', branch, commitRef]);
}
sh('git', ['push', 'origin', branch]);
return branch;
}
/**
* @param {string} branch
*/
function removeBranch(branch) {
if (!remoteBranchExists(branch)) {
console.log(`Couldn't find branch ${branch}. Skipping removal.`);
return null;
}
console.log(`Removing remote branch ${branch} from origin...`);
// Delete remote branch
trySh('git', ['push', 'origin', '--delete', branch]);
// Optional local cleanup (keeps reruns tidy)
if (localRefExists(`refs/heads/${branch}`)) {
console.log(`Removing local branch ${branch}...`);
trySh('git', ['branch', '-D', branch]);
}
return branch;
}
function main() {
const branchChanges = determineBranchChanges();
console.log('💡 Determined branch changes');
console.log('');
console.log(
` Branches to ensure: ${branchChanges.branchesToEnsure.map(tagVersionInfoToReleaseCandidateBranchName).join(', ')}`,
);
console.log(` Branches to deprecate: ${branchChanges.branchesToDeprecate.join(', ')}`);
console.log('');
console.log('Preparing to apply changes...');
let ensuredBranches = [];
for (const tagInfo of branchChanges.branchesToEnsure) {
const branch = ensureBranch(tagInfo);
ensuredBranches.push(branch);
}
console.log('');
console.log('Starting deprecation of branches...');
let removedBranches = [];
for (const branch of branchChanges.branchesToDeprecate) {
const removedBranch = removeBranch(branch);
if (removedBranch) {
removedBranches.push(removedBranch);
}
}
console.log('Done!');
writeGithubOutput({
ensuredBranches: ensuredBranches.join(','),
removedBranches: removedBranches.join(','),
});
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
main();
}


@@ -0,0 +1,69 @@
import { describe, it, mock, before } from 'node:test';
import assert from 'node:assert/strict';
import { RELEASE_CANDIDATE_BRANCH_PREFIX } from './github-helpers.mjs';
/**
* Run these tests by running
*
* node --test --experimental-test-module-mocks ./.github/scripts/ensure-release-candidate-branches.test.mjs
* */
let tagVersionInfoToReleaseCandidateBranchName;
before(async () => {
({ tagVersionInfoToReleaseCandidateBranchName } = await import('./github-helpers.mjs'));
});
// mock.module must be called before the module under test is imported,
// because static imports are hoisted and resolve before any code runs.
mock.module('./github-helpers.mjs', {
namedExports: {
RELEASE_TRACKS: ['stable', 'beta', 'v1'],
RELEASE_PREFIX: 'n8n@',
RELEASE_CANDIDATE_BRANCH_PREFIX: RELEASE_CANDIDATE_BRANCH_PREFIX,
tagVersionInfoToReleaseCandidateBranchName,
resolveReleaseTagForTrack: (track) => {
// Always return deterministic data
if (track === 'stable') return { version: '2.9.2', tag: 'n8n@2.9.2' };
if (track === 'beta') return { version: '2.10.1', tag: 'n8n@2.10.1' };
return { version: '1.123.33', tag: 'n8n@1.123.33' };
},
writeGithubOutput: () => {}, // no-op in tests
sh: () => {}, // no-op in tests
trySh: () => {}, // no-op in tests
getCommitForRef: () => {}, // no-op in tests
remoteBranchExists: () => {}, // no-op in tests
localRefExists: () => {}, // no-op in tests
},
});
let determineBranchChanges;
before(async () => {
({ determineBranchChanges } = await import('./ensure-release-candidate-branches.mjs'));
});
describe('Determine branch changes', () => {
it('Correctly determines ensureable branches', () => {
const output = determineBranchChanges();
const ensureBranches = output.branchesToEnsure.map(tagVersionInfoToReleaseCandidateBranchName);
assert.ok(
ensureBranches.includes('release-candidate/2.10.x'),
"Beta release-candidate branch doesn't exist",
);
assert.ok(
ensureBranches.includes('release-candidate/2.9.x'),
"Stable release-candidate branch doesn't exist",
);
});
it('Correctly determines deprecated branches', () => {
/** @type { import('./ensure-release-candidate-branches.mjs').BranchChanges} */
const output = determineBranchChanges();
assert.ok(
output.branchesToDeprecate.includes('release-candidate/2.7.x'),
'Existing branch release-candidate/2.7.x should be marked for removal',
);
});
});


@@ -0,0 +1,5 @@
{
"pull_request": {
"labels": ["release", "Backport to Stable"]
}
}


@@ -0,0 +1,63 @@
import semver from 'semver';
import {
getCommitForRef,
listTagsPointingAt,
RELEASE_PREFIX,
RELEASE_TRACKS,
stripReleasePrefixes,
writeGithubOutput,
} from './github-helpers.mjs';
/**
 * Given a list of tag names, return the highest semver tag (keeping the original 'n8n@' prefix),
* or "" if none match semver.
*
* @param {string[]} tags
**/
function highestSemverTag(tags) {
const candidates = tags
.filter((t) => t.startsWith(RELEASE_PREFIX))
.map((t) => ({
tag: t,
version: stripReleasePrefixes(t),
}))
.filter(({ version }) => semver.valid(version));
if (candidates.length === 0) return '';
candidates.sort((a, b) => semver.rcompare(a.version, b.version));
return candidates[0]?.tag;
}
/**
* @param {string} track
**/
function getSemverTagForTrack(track) {
const commit = getCommitForRef(track);
if (!commit) return '';
const tags = listTagsPointingAt(commit);
return highestSemverTag(tags);
}
function main() {
/** @type { Record<string, string> } */
const outputs = {};
for (const track of RELEASE_TRACKS) {
outputs[track] = getSemverTagForTrack(track);
}
writeGithubOutput(outputs);
console.log('Current release versions: ');
for (const [k, v] of Object.entries(outputs)) {
console.log(`${k}: ${v || '(not found)'}`);
}
}
try {
main();
} catch (err) {
console.error(String(err?.message ?? err));
process.exit(1);
}

.github/scripts/github-helpers.mjs

@@ -0,0 +1,386 @@
import { getOctokit } from '@actions/github';
import { execFileSync } from 'node:child_process';
import fs from 'node:fs';
import path from 'node:path';
import semver from 'semver';
export const CURRENT_MAJOR_VERSION = 2;
export const RELEASE_CANDIDATE_BRANCH_PREFIX = 'release-candidate/';
export const RELEASE_TRACKS = /** @type { const } */ ([
//
'stable',
'beta',
'v1',
]);
/**
* @typedef { InstanceType<typeof import("@actions/github/lib/utils").GitHub> } GitHubInstance
* */
/**
* @typedef {typeof RELEASE_TRACKS[number]} ReleaseTrack
* */
/**
* @typedef {`${number}.${number}.${number}`} SemVer
* */
/**
* @typedef {`${RELEASE_PREFIX}${SemVer}`} ReleaseVersion
* */
/**
* @typedef {{ tag: ReleaseVersion, version: SemVer }} TagVersionInfo
* */
export const RELEASE_PREFIX = 'n8n@';
/**
* Given a list of tags, return the highest semver for tags like "n8n@2.7.0".
* Returns the *tag string* (e.g. "n8n@2.7.0") or null.
*
* @param {string[]} tags
*
* @returns { ReleaseVersion | null }
* */
export function pickHighestReleaseTag(tags) {
const versions = tags
.filter((t) => t.startsWith(RELEASE_PREFIX))
.map((t) => ({ tag: t, v: stripReleasePrefixes(t) }))
.filter(({ v }) => semver.valid(v))
.sort((a, b) => semver.rcompare(a.v, b.v));
return /** @type { ReleaseVersion } */ (versions[0]?.tag) ?? null;
}
/**
* @param {any} track
*
* @returns { track is ReleaseTrack }
* */
export function isReleaseTrack(track) {
return RELEASE_TRACKS.includes(track);
}
/**
* @param {any} track
*
* @returns { ReleaseTrack }
* */
export function ensureReleaseTrack(track) {
if (!RELEASE_TRACKS.includes(track)) {
throw new Error(`Invalid track ${track}. Available tracks are ${RELEASE_TRACKS.join(', ')}`);
}
return track;
}
/**
* Resolve a release track tag (stable/beta/etc.) to the corresponding
* n8n@x.y.z tag pointing at the same commit.
*
* Returns null if the track tag or release tag is missing.
*
* @param { typeof RELEASE_TRACKS[number] } track
*
* @returns { TagVersionInfo | null }
* */
export function resolveReleaseTagForTrack(track) {
const commit = getCommitForRef(track);
if (!commit) return null;
const tagsAtCommit = listTagsPointingAt(commit);
const releaseTag = pickHighestReleaseTag(tagsAtCommit);
if (!releaseTag) return null;
return {
tag: releaseTag,
version: stripReleasePrefixes(releaseTag),
};
}
/**
* Resolve a release track tag (stable/beta/etc.) to the corresponding
* release-candidate/<major>.<minor>.x branch, based on the n8n@<x.y.z> tag
* pointing at the same commit.
*
* Returns null if the track tag or release tag is missing.
*
* @param { ReleaseTrack } track
* */
export function resolveRcBranchForTrack(track) {
if (track === 'v1') {
return '1.x';
}
const commit = getCommitForRef(track);
if (!commit) return null;
const tagsAtCommit = listTagsPointingAt(commit);
const releaseTag = pickHighestReleaseTag(tagsAtCommit);
if (!releaseTag) return null;
const version = stripReleasePrefixes(releaseTag);
const parsed = semver.parse(version);
if (!parsed) return null;
return `release-candidate/${parsed.major}.${parsed.minor}.x`;
}
/**
* Takes a TagVersionInfo object and returns a rc-branch name.
*
* e.g. release-candidate/2.8.x or 1.x
*
* @param {import('./github-helpers.mjs').TagVersionInfo} tagVersionInfo
*
* @returns { `${RELEASE_CANDIDATE_BRANCH_PREFIX}${number}.${number}.x` | `${number}.x` }
* */
export function tagVersionInfoToReleaseCandidateBranchName(tagVersionInfo) {
const version = tagVersionInfo.version;
const majorVersion = semver.major(version);
if (majorVersion < CURRENT_MAJOR_VERSION) {
return `${majorVersion}.x`;
}
return `${RELEASE_CANDIDATE_BRANCH_PREFIX}${majorVersion}.${semver.minor(version)}.x`;
}
/**
* @param {string} tag
*
* @returns { SemVer }
* */
export function stripReleasePrefixes(tag) {
return /** @type { SemVer } */ (
tag.startsWith(RELEASE_PREFIX) ? tag.slice(RELEASE_PREFIX.length) : tag
);
}
export function getEventFromGithubEventPath() {
let eventPath = ensureEnvVar('GITHUB_EVENT_PATH');
if (!path.isAbsolute(eventPath)) {
eventPath = import.meta.dirname + '/' + eventPath;
}
return JSON.parse(fs.readFileSync(eventPath, 'utf8'));
}
/**
* @param {any} [pullRequest] Optional pull request object. If not provided, reads from GITHUB_EVENT_PATH
*
* @returns {string[]}
*/
export function readPrLabels(pullRequest) {
if (!pullRequest) {
const event = getEventFromGithubEventPath();
pullRequest = event.pull_request;
}
/** @type { string[] | { name: string }[] } */
const labels = pullRequest?.labels ?? [];
return labels.map((l) => (typeof l === 'string' ? l : l?.name)).filter(Boolean);
}
/**
* Ensures git tag exists.
*
* @param {string} tag
* @throws {Error} if no tag was found
*/
export function ensureTagExists(tag) {
sh('git', ['fetch', '--force', '--no-tags', 'origin', `refs/tags/${tag}:refs/tags/${tag}`]);
}
/**
* @param {string} bump
*
* @returns { bump is import("semver").ReleaseType }
* */
export function isReleaseType(bump) {
return ['major', 'minor', 'patch'].includes(bump);
}
/**
* @param {string} variableName
*/
export function ensureEnvVar(variableName) {
const v = process.env[variableName];
if (!v) {
throw new Error(`Missing required env var: ${variableName}`);
}
return v;
}
/**
* @param {string} cmd
* @param {readonly string[]} args
* @param {import("node:child_process").ExecFileOptionsWithStringEncoding} args
*
* @example sh("git", ["tag", "--points-at", commit]);
* */
export function sh(cmd, args, opts = {}) {
return execFileSync(cmd, args, { encoding: 'utf8', ...opts }).trim();
}
/**
* @param {string} cmd
* @param {readonly string[]} args
* @param {import("node:child_process").ExecFileOptionsWithStringEncoding} [opts]
*
* @example trySh("git", ["tag", "--points-at", commit]);
* */
export function trySh(cmd, args, opts = {}) {
try {
return { ok: true, out: sh(cmd, args, opts) };
} catch {
return { ok: false, out: '' };
}
}
/**
* Append outputs to GITHUB_OUTPUT if available.
*
* @param {Record<string, string | boolean>} obj
*/
export function writeGithubOutput(obj) {
const path = process.env.GITHUB_OUTPUT;
if (!path) return;
const lines = Object.entries(obj)
.map(([k, v]) => `${k}=${v ?? ''}`)
.join('\n');
fs.appendFileSync(path, lines + '\n', 'utf8');
}
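The `key=value` serialization that `writeGithubOutput` appends to `GITHUB_OUTPUT` can be shown in isolation — a minimal sketch of just the formatting step, without the file append:

```javascript
// Replicates the key=value line format written to GITHUB_OUTPUT (sketch, no file I/O).
const serializeOutputs = (obj) =>
  Object.entries(obj)
    .map(([k, v]) => `${k}=${v ?? ''}`)
    .join('\n') + '\n';

console.log(serializeOutputs({ new_version: '2.7.0', created: true }));
// → "new_version=2.7.0\ncreated=true\n"
```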
/**
* Resolve a ref (tag/branch/SHA) to the underlying commit SHA.
* Uses ^{} so annotated tags are peeled to the commit.
* Returns null if ref doesn't exist.
*
* @param {string} ref
*/
export function getCommitForRef(ref) {
const res = trySh('git', ['rev-parse', `${ref}^{}`]);
return res.ok && res.out ? res.out : null;
}
/**
* List all tags that point at the given commit SHA.
*
* @param {string} commit
*/
export function listTagsPointingAt(commit) {
const res = trySh('git', ['tag', '--points-at', commit]);
if (!res.ok || !res.out) return [];
return res.out
.split('\n')
.map((s) => s.trim())
.filter(Boolean);
}
/**
* @param {string} from
* @param {string} to
*/
export function listCommitsBetweenRefs(from, to) {
return sh('git', ['--no-pager', 'log', '--format=%s (%h)', `${to}..origin/${from}`]);
}
/**
* @param {string} from
* @param {string} to
*/
export function countCommitsBetweenRefs(from, to) {
const output = sh('git', ['rev-list', '--count', `${to}..origin/${from}`]);
const count = parseInt(output, 10);
return isNaN(count) ? 0 : count;
}
/**
* @param {string} branch
*/
export function remoteBranchExists(branch) {
const res = trySh('git', ['ls-remote', '--heads', 'origin', branch]);
return res.ok && res.out.length > 0;
}
/**
* @param {string} ref
*/
export function localRefExists(ref) {
const res = trySh('git', ['show-ref', '--verify', '--quiet', ref]);
return res.ok;
}
/**
* Initializes octokit with GITHUB_TOKEN from env vars.
*
* Also ensures the existence of useful environment variables.
* */
export function initGithub() {
const token = ensureEnvVar('GITHUB_TOKEN');
const repoFullName = ensureEnvVar('GITHUB_REPOSITORY');
const [owner, repo] = repoFullName.split('/');
const octokit = getOctokit(token);
return {
octokit,
owner,
repo,
};
}
/**
* @param {number} pullRequestId
*/
export async function getPullRequestById(pullRequestId) {
const { octokit, owner, repo } = initGithub();
const pullRequest = await octokit.rest.pulls.get({
owner,
repo,
pull_number: pullRequestId,
});
return pullRequest.data;
}
/**
* @param {string} tag
*/
export async function getExistingRelease(tag) {
const { octokit, owner, repo } = initGithub();
try {
const releaseRequest = await octokit.rest.repos.getReleaseByTag({
owner,
repo,
tag,
});
return releaseRequest.data;
} catch (ex) {
if (ex?.status === 404) {
return undefined;
}
throw ex;
}
}
/**
* @param {number} releaseId
*/
export async function deleteRelease(releaseId) {
const { octokit, owner, repo } = initGithub();
await octokit.rest.repos.deleteRelease({
owner,
repo,
release_id: releaseId,
});
}

.github/scripts/jsconfig.json vendored Normal file
@@ -0,0 +1,10 @@
{
"compilerOptions": {
"module": "esnext",
"target": "esnext",
"checkJs": true,
"moduleResolution": "bundler"
},
"exclude": ["node_modules"]
}

.github/scripts/move-track-tag.mjs vendored Normal file
@@ -0,0 +1,18 @@
import { ensureEnvVar, ensureReleaseTrack, ensureTagExists, sh } from './github-helpers.mjs';
function main() {
const trackEnv = ensureEnvVar('TRACK');
const track = ensureReleaseTrack(trackEnv);
const versionInput = ensureEnvVar('VERSION_TAG'); // e.g. n8n@2.7.0
ensureTagExists(versionInput);
sh('git', ['tag', '-f', track, versionInput]);
sh('git', ['push', 'origin', '-f', `refs/tags/${track}:refs/tags/${track}`]);
console.log(`Moved pointer tag ${track} to point to ${versionInput}`);
}
main();

@@ -1,12 +1,19 @@
{
"name": "workflow-scripts",
"scripts": {
"test": "node --test --experimental-test-module-mocks ./*.test.mjs"
},
"dependencies": {
"cacheable-lookup": "6.1.0",
"conventional-changelog": "^4.0.0",
"debug": "4.3.4",
"glob": "10.5.0",
"p-limit": "3.1.0",
"picocolors": "1.0.1",
"semver": "7.5.4",
"tempfile": "5.0.0"
"@actions/github": "9.0.0",
"@octokit/core": "7.0.6",
"conventional-changelog": "7.2.0",
"debug": "4.4.3",
"glob": "13.0.6",
"semver": "7.7.4",
"tempfile": "6.0.1",
"yaml": "^2.8.3"
},
"devDependencies": {
"conventional-changelog-angular": "8.3.0"
}
}

.github/scripts/plan-release.mjs vendored Normal file
@@ -0,0 +1,62 @@
import semver from 'semver';
import {
ensureEnvVar,
isReleaseType,
RELEASE_PREFIX,
stripReleasePrefixes,
writeGithubOutput,
} from './github-helpers.mjs';
const track = ensureEnvVar('TRACK');
const bump = ensureEnvVar('BUMP');
const stable = process.env['STABLE_VERSION'];
const beta = process.env['BETA_VERSION'];
const v1 = process.env['V1_VERSION'];
let base = null;
switch (track) {
case 'stable':
base = stable;
break;
case 'beta':
base = beta;
break;
case 'v1':
base = v1;
break;
}
if (!base) {
console.error(
`Unknown track or missing base version. track=${track} stable=${stable} beta=${beta} v1=${v1}`,
);
process.exit(1);
}
const cleanedBase = stripReleasePrefixes(base);
if (!cleanedBase) {
console.error(`Invalid base version: ${base}`);
process.exit(1);
}
if (!isReleaseType(bump)) {
console.error(`Invalid release type in $bump: ${bump}`);
process.exit(1);
}
const next = semver.inc(cleanedBase, bump);
if (!next) {
console.error(`Could not bump version. base=${cleanedBase} bump=${bump}`);
process.exit(1);
}
const output = {
base_version: cleanedBase,
new_version: next,
new_version_tag: `${RELEASE_PREFIX}${next}`,
};
writeGithubOutput(output);
console.log(`Releasing track=${track} bump=${bump} base=${cleanedBase} -> new=${next}`);
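The `semver.inc` call that plan-release.mjs relies on can be sketched by hand for the three release types accepted by `isReleaseType` — a simplified re-implementation for illustration, not the `semver` library itself (which also handles prereleases and build metadata):

```javascript
// Simplified sketch of semver.inc for major/minor/patch only (not the real library).
const inc = (base, bump) => {
  const [major, minor, patch] = base.split('.').map(Number);
  if (bump === 'major') return `${major + 1}.0.0`;
  if (bump === 'minor') return `${major}.${minor + 1}.0`;
  return `${major}.${minor}.${patch + 1}`;
};

console.log(inc('1.123.41', 'patch')); // → "1.123.42"
console.log(inc('2.7.0', 'minor'));    // → "2.8.0"
```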

.github/scripts/pnpm-lock.yaml vendored Normal file
@@ -0,0 +1,549 @@
lockfileVersion: '9.0'
settings:
autoInstallPeers: true
excludeLinksFromLockfile: false
importers:
.:
dependencies:
'@actions/github':
specifier: 9.0.0
version: 9.0.0
'@octokit/core':
specifier: 7.0.6
version: 7.0.6
conventional-changelog:
specifier: 7.2.0
version: 7.2.0(conventional-commits-filter@5.0.0)
debug:
specifier: 4.4.3
version: 4.4.3
glob:
specifier: 13.0.6
version: 13.0.6
semver:
specifier: 7.7.4
version: 7.7.4
tempfile:
specifier: 6.0.1
version: 6.0.1
yaml:
specifier: ^2.8.3
version: 2.8.3
devDependencies:
conventional-changelog-angular:
specifier: 8.3.0
version: 8.3.0
packages:
'@actions/github@9.0.0':
resolution: {integrity: sha512-yJ0RoswsAaKcvkmpCE4XxBRiy/whH2SdTBHWzs0gi4wkqTDhXMChjSdqBz/F4AeiDlP28rQqL33iHb+kjAMX6w==}
'@actions/http-client@3.0.2':
resolution: {integrity: sha512-JP38FYYpyqvUsz+Igqlc/JG6YO9PaKuvqjM3iGvaLqFnJ7TFmcLyy2IDrY0bI0qCQug8E9K+elv5ZNfw62ZJzA==}
'@conventional-changelog/git-client@2.6.0':
resolution: {integrity: sha512-T+uPDciKf0/ioNNDpMGc8FDsehJClZP0yR3Q5MN6wE/Y/1QZ7F+80OgznnTCOlMEG4AV0LvH2UJi3C/nBnaBUg==}
engines: {node: '>=18'}
peerDependencies:
conventional-commits-filter: ^5.0.0
conventional-commits-parser: ^6.3.0
peerDependenciesMeta:
conventional-commits-filter:
optional: true
conventional-commits-parser:
optional: true
'@octokit/auth-token@6.0.0':
resolution: {integrity: sha512-P4YJBPdPSpWTQ1NU4XYdvHvXJJDxM6YwpS0FZHRgP7YFkdVxsWcpWGy/NVqlAA7PcPCnMacXlRm1y2PFZRWL/w==}
engines: {node: '>= 20'}
'@octokit/core@7.0.6':
resolution: {integrity: sha512-DhGl4xMVFGVIyMwswXeyzdL4uXD5OGILGX5N8Y+f6W7LhC1Ze2poSNrkF/fedpVDHEEZ+PHFW0vL14I+mm8K3Q==}
engines: {node: '>= 20'}
'@octokit/endpoint@11.0.3':
resolution: {integrity: sha512-FWFlNxghg4HrXkD3ifYbS/IdL/mDHjh9QcsNyhQjN8dplUoZbejsdpmuqdA76nxj2xoWPs7p8uX2SNr9rYu0Ag==}
engines: {node: '>= 20'}
'@octokit/graphql@9.0.3':
resolution: {integrity: sha512-grAEuupr/C1rALFnXTv6ZQhFuL1D8G5y8CN04RgrO4FIPMrtm+mcZzFG7dcBm+nq+1ppNixu+Jd78aeJOYxlGA==}
engines: {node: '>= 20'}
'@octokit/openapi-types@27.0.0':
resolution: {integrity: sha512-whrdktVs1h6gtR+09+QsNk2+FO+49j6ga1c55YZudfEG+oKJVvJLQi3zkOm5JjiUXAagWK2tI2kTGKJ2Ys7MGA==}
'@octokit/plugin-paginate-rest@14.0.0':
resolution: {integrity: sha512-fNVRE7ufJiAA3XUrha2omTA39M6IXIc6GIZLvlbsm8QOQCYvpq/LkMNGyFlB1d8hTDzsAXa3OKtybdMAYsV/fw==}
engines: {node: '>= 20'}
peerDependencies:
'@octokit/core': '>=6'
'@octokit/plugin-rest-endpoint-methods@17.0.0':
resolution: {integrity: sha512-B5yCyIlOJFPqUUeiD0cnBJwWJO8lkJs5d8+ze9QDP6SvfiXSz1BF+91+0MeI1d2yxgOhU/O+CvtiZ9jSkHhFAw==}
engines: {node: '>= 20'}
peerDependencies:
'@octokit/core': '>=6'
'@octokit/request-error@7.1.0':
resolution: {integrity: sha512-KMQIfq5sOPpkQYajXHwnhjCC0slzCNScLHs9JafXc4RAJI+9f+jNDlBNaIMTvazOPLgb4BnlhGJOTbnN0wIjPw==}
engines: {node: '>= 20'}
'@octokit/request@10.0.8':
resolution: {integrity: sha512-SJZNwY9pur9Agf7l87ywFi14W+Hd9Jg6Ifivsd33+/bGUQIjNujdFiXII2/qSlN2ybqUHfp5xpekMEjIBTjlSw==}
engines: {node: '>= 20'}
'@octokit/types@16.0.0':
resolution: {integrity: sha512-sKq+9r1Mm4efXW1FCk7hFSeJo4QKreL/tTbR0rz/qx/r1Oa2VV83LTA/H/MuCOX7uCIJmQVRKBcbmWoySjAnSg==}
'@simple-libs/child-process-utils@1.0.2':
resolution: {integrity: sha512-/4R8QKnd/8agJynkNdJmNw2MBxuFTRcNFnE5Sg/G+jkSsV8/UBgULMzhizWWW42p8L5H7flImV2ATi79Ove2Tw==}
engines: {node: '>=18'}
'@simple-libs/hosted-git-info@1.0.2':
resolution: {integrity: sha512-aAmGQdMH+ZinytKuA2832u0ATeOFNYNk4meBEXtB5xaPotUgggYNhq5tYU/v17wEbmTW5P9iHNqNrFyrhnqBAg==}
engines: {node: '>=18'}
'@simple-libs/stream-utils@1.2.0':
resolution: {integrity: sha512-KxXvfapcixpz6rVEB6HPjOUZT22yN6v0vI0urQSk1L8MlEWPDFCZkhw2xmkyoTGYeFw7tWTZd7e3lVzRZRN/EA==}
engines: {node: '>=18'}
'@types/normalize-package-data@2.4.4':
resolution: {integrity: sha512-37i+OaWTh9qeK4LSHPsyRC7NahnGotNuZvjLSgcPzblpHB3rrCJxAOgI5gCdKm7coonsaX1Of0ILiTcnZjbfxA==}
array-ify@1.0.0:
resolution: {integrity: sha512-c5AMf34bKdvPhQ7tBGhqkgKNUzMr4WUs+WDtC2ZUGOUncbxKMTvqxYctiseW3+L4bA8ec+GcZ6/A/FW4m8ukng==}
balanced-match@4.0.4:
resolution: {integrity: sha512-BLrgEcRTwX2o6gGxGOCNyMvGSp35YofuYzw9h1IMTRmKqttAZZVU67bdb9Pr2vUHA8+j3i2tJfjO6C6+4myGTA==}
engines: {node: 18 || 20 || >=22}
before-after-hook@4.0.0:
resolution: {integrity: sha512-q6tR3RPqIB1pMiTRMFcZwuG5T8vwp+vUvEG0vuI6B+Rikh5BfPp2fQ82c925FOs+b0lcFQ8CFrL+KbilfZFhOQ==}
brace-expansion@5.0.4:
resolution: {integrity: sha512-h+DEnpVvxmfVefa4jFbCf5HdH5YMDXRsmKflpf1pILZWRFlTbJpxeU55nJl4Smt5HQaGzg1o6RHFPJaOqnmBDg==}
engines: {node: 18 || 20 || >=22}
compare-func@2.0.0:
resolution: {integrity: sha512-zHig5N+tPWARooBnb0Zx1MFcdfpyJrfTJ3Y5L+IFvUm8rM74hHz66z0gw0x4tijh5CorKkKUCnW82R2vmpeCRA==}
conventional-changelog-angular@8.3.0:
resolution: {integrity: sha512-DOuBwYSqWzfwuRByY9O4oOIvDlkUCTDzfbOgcSbkY+imXXj+4tmrEFao3K+FxemClYfYnZzsvudbwrhje9VHDA==}
engines: {node: '>=18'}
conventional-changelog-preset-loader@5.0.0:
resolution: {integrity: sha512-SetDSntXLk8Jh1NOAl1Gu5uLiCNSYenB5tm0YVeZKePRIgDW9lQImromTwLa3c/Gae298tsgOM+/CYT9XAl0NA==}
engines: {node: '>=18'}
conventional-changelog-writer@8.4.0:
resolution: {integrity: sha512-HHBFkk1EECxxmCi4CTu091iuDpQv5/OavuCUAuZmrkWpmYfyD816nom1CvtfXJ/uYfAAjavgHvXHX291tSLK8g==}
engines: {node: '>=18'}
hasBin: true
conventional-changelog@7.2.0:
resolution: {integrity: sha512-BEdgG+vPl53EVlTTk9sZ96aagFp0AQ5pw/ggiQMy2SClLbTo1r0l+8dSg79gkLOO5DS1Lswuhp5fWn6RwE+ivg==}
engines: {node: '>=18'}
hasBin: true
conventional-commits-filter@5.0.0:
resolution: {integrity: sha512-tQMagCOC59EVgNZcC5zl7XqO30Wki9i9J3acbUvkaosCT6JX3EeFwJD7Qqp4MCikRnzS18WXV3BLIQ66ytu6+Q==}
engines: {node: '>=18'}
conventional-commits-parser@6.3.0:
resolution: {integrity: sha512-RfOq/Cqy9xV9bOA8N+ZH6DlrDR+5S3Mi0B5kACEjESpE+AviIpAptx9a9cFpWCCvgRtWT+0BbUw+e1BZfts9jg==}
engines: {node: '>=18'}
hasBin: true
debug@4.4.3:
resolution: {integrity: sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA==}
engines: {node: '>=6.0'}
peerDependencies:
supports-color: '*'
peerDependenciesMeta:
supports-color:
optional: true
dot-prop@5.3.0:
resolution: {integrity: sha512-QM8q3zDe58hqUqjraQOmzZ1LIH9SWQJTlEKCH4kJ2oQvLZk7RbQXvtDM2XEq3fwkV9CCvvH4LA0AV+ogFsBM2Q==}
engines: {node: '>=8'}
fast-content-type-parse@3.0.0:
resolution: {integrity: sha512-ZvLdcY8P+N8mGQJahJV5G4U88CSvT1rP8ApL6uETe88MBXrBHAkZlSEySdUlyztF7ccb+Znos3TFqaepHxdhBg==}
fd-package-json@2.0.0:
resolution: {integrity: sha512-jKmm9YtsNXN789RS/0mSzOC1NUq9mkVd65vbSSVsKdjGvYXBuE4oWe2QOEoFeRmJg+lPuZxpmrfFclNhoRMneQ==}
glob@13.0.6:
resolution: {integrity: sha512-Wjlyrolmm8uDpm/ogGyXZXb1Z+Ca2B8NbJwqBVg0axK9GbBeoS7yGV6vjXnYdGm6X53iehEuxxbyiKp8QmN4Vw==}
engines: {node: 18 || 20 || >=22}
handlebars@4.7.8:
resolution: {integrity: sha512-vafaFqs8MZkRrSX7sFVUdo3ap/eNiLnb4IakshzvP56X5Nr1iGKAIqdX6tMlm6HcNRIkr6AxO5jFEoJzzpT8aQ==}
engines: {node: '>=0.4.7'}
hasBin: true
hosted-git-info@8.1.0:
resolution: {integrity: sha512-Rw/B2DNQaPBICNXEm8balFz9a6WpZrkCGpcWFpy7nCj+NyhSdqXipmfvtmWt9xGfp0wZnBxB+iVpLmQMYt47Tw==}
engines: {node: ^18.17.0 || >=20.5.0}
is-obj@2.0.0:
resolution: {integrity: sha512-drqDG3cbczxxEJRoOXcOjtdp1J/lyp1mNn0xaznRs8+muBhgQcrnbspox5X5fOw0HnMnbfDzvnEMEtqDEJEo8w==}
engines: {node: '>=8'}
is-safe-filename@0.1.1:
resolution: {integrity: sha512-4SrR7AdnY11LHfDKTZY1u6Ga3RuxZdl3YKWWShO5iyuG5h8QS4GD2tOb04peBJ5I7pXbR+CGBNEhTcwK+FzN3g==}
engines: {node: '>=20'}
json-with-bigint@3.5.7:
resolution: {integrity: sha512-7ei3MdAI5+fJPVnKlW77TKNKwQ5ppSzWvhPuSuINT/GYW9ZOC1eRKOuhV9yHG5aEsUPj9BBx5JIekkmoLHxZOw==}
lru-cache@10.4.3:
resolution: {integrity: sha512-JNAzZcXrCt42VGLuYz0zfAzDfAvJWW6AfYlDBQyDV5DClI2m5sAmK+OIO7s59XfsRsWHp02jAJrRadPRGTt6SQ==}
lru-cache@11.2.7:
resolution: {integrity: sha512-aY/R+aEsRelme17KGQa/1ZSIpLpNYYrhcrepKTZgE+W3WM16YMCaPwOHLHsmopZHELU0Ojin1lPVxKR0MihncA==}
engines: {node: 20 || >=22}
meow@13.2.0:
resolution: {integrity: sha512-pxQJQzB6djGPXh08dacEloMFopsOqGVRKFPYvPOt9XDZ1HasbgDZA74CJGreSU4G3Ak7EFJGoiH2auq+yXISgA==}
engines: {node: '>=18'}
minimatch@10.2.4:
resolution: {integrity: sha512-oRjTw/97aTBN0RHbYCdtF1MQfvusSIBQM0IZEgzl6426+8jSC0nF1a/GmnVLpfB9yyr6g6FTqWqiZVbxrtaCIg==}
engines: {node: 18 || 20 || >=22}
minimist@1.2.8:
resolution: {integrity: sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==}
minipass@7.1.3:
resolution: {integrity: sha512-tEBHqDnIoM/1rXME1zgka9g6Q2lcoCkxHLuc7ODJ5BxbP5d4c2Z5cGgtXAku59200Cx7diuHTOYfSBD8n6mm8A==}
engines: {node: '>=16 || 14 >=14.17'}
ms@2.1.3:
resolution: {integrity: sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==}
neo-async@2.6.2:
resolution: {integrity: sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw==}
normalize-package-data@7.0.1:
resolution: {integrity: sha512-linxNAT6M0ebEYZOx2tO6vBEFsVgnPpv+AVjk0wJHfaUIbq31Jm3T6vvZaarnOeWDh8ShnwXuaAyM7WT3RzErA==}
engines: {node: ^18.17.0 || >=20.5.0}
path-scurry@2.0.2:
resolution: {integrity: sha512-3O/iVVsJAPsOnpwWIeD+d6z/7PmqApyQePUtCndjatj/9I5LylHvt5qluFaBT3I5h3r1ejfR056c+FCv+NnNXg==}
engines: {node: 18 || 20 || >=22}
semver@7.7.4:
resolution: {integrity: sha512-vFKC2IEtQnVhpT78h1Yp8wzwrf8CM+MzKMHGJZfBtzhZNycRFnXsHk6E5TxIkkMsgNS7mdX3AGB7x2QM2di4lA==}
engines: {node: '>=10'}
hasBin: true
source-map@0.6.1:
resolution: {integrity: sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==}
engines: {node: '>=0.10.0'}
spdx-correct@3.2.0:
resolution: {integrity: sha512-kN9dJbvnySHULIluDHy32WHRUu3Og7B9sbY7tsFLctQkIqnMh3hErYgdMjTYuqmcXX+lK5T1lnUt3G7zNswmZA==}
spdx-exceptions@2.5.0:
resolution: {integrity: sha512-PiU42r+xO4UbUS1buo3LPJkjlO7430Xn5SVAhdpzzsPHsjbYVflnnFdATgabnLude+Cqu25p6N+g2lw/PFsa4w==}
spdx-expression-parse@3.0.1:
resolution: {integrity: sha512-cbqHunsQWnJNE6KhVSMsMeH5H/L9EpymbzqTQ3uLwNCLZ1Q481oWaofqH7nO6V07xlXwY6PhQdQ2IedWx/ZK4Q==}
spdx-license-ids@3.0.23:
resolution: {integrity: sha512-CWLcCCH7VLu13TgOH+r8p1O/Znwhqv/dbb6lqWy67G+pT1kHmeD/+V36AVb/vq8QMIQwVShJ6Ssl5FPh0fuSdw==}
temp-dir@3.0.0:
resolution: {integrity: sha512-nHc6S/bwIilKHNRgK/3jlhDoIHcp45YgyiwcAk46Tr0LfEqGBVpmiAyuiuxeVE44m3mXnEeVhaipLOEWmH+Njw==}
engines: {node: '>=14.16'}
tempfile@6.0.1:
resolution: {integrity: sha512-DE4nURsf7nUqYHJKTgOVdpt0SBY5r4us4kbFXqg7KZFB7ih27NxIk3qXv29FtqTaE45stnLKTECmSc9ICuRbDQ==}
engines: {node: '>=20'}
tunnel@0.0.6:
resolution: {integrity: sha512-1h/Lnq9yajKY2PEbBadPXj3VxsDDu844OnaAo52UVmIzIvwwtBPIuNvkjuzBlTWpfJyUbG3ez0KSBibQkj4ojg==}
engines: {node: '>=0.6.11 <=0.7.0 || >=0.7.3'}
uglify-js@3.19.3:
resolution: {integrity: sha512-v3Xu+yuwBXisp6QYTcH4UbH+xYJXqnq2m/LtQVWKWzYc1iehYnLixoQDN9FH6/j9/oybfd6W9Ghwkl8+UMKTKQ==}
engines: {node: '>=0.8.0'}
hasBin: true
undici@6.24.1:
resolution: {integrity: sha512-sC+b0tB1whOCzbtlx20fx3WgCXwkW627p4EA9uM+/tNNPkSS+eSEld6pAs9nDv7WbY1UUljBMYPtu9BCOrCWKA==}
engines: {node: '>=18.17'}
universal-user-agent@7.0.3:
resolution: {integrity: sha512-TmnEAEAsBJVZM/AADELsK76llnwcf9vMKuPz8JflO1frO8Lchitr0fNaN9d+Ap0BjKtqWqd/J17qeDnXh8CL2A==}
validate-npm-package-license@3.0.4:
resolution: {integrity: sha512-DpKm2Ui/xN7/HQKCtpZxoRWBhZ9Z0kqtygG8XCgNQ8ZlDnxuQmWhj566j8fN4Cu3/JmbhsDo7fcAJq4s9h27Ew==}
walk-up-path@4.0.0:
resolution: {integrity: sha512-3hu+tD8YzSLGuFYtPRb48vdhKMi0KQV5sn+uWr8+7dMEq/2G/dtLrdDinkLjqq5TIbIBjYJ4Ax/n3YiaW7QM8A==}
engines: {node: 20 || >=22}
wordwrap@1.0.0:
resolution: {integrity: sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==}
yaml@2.8.3:
resolution: {integrity: sha512-AvbaCLOO2Otw/lW5bmh9d/WEdcDFdQp2Z2ZUH3pX9U2ihyUY0nvLv7J6TrWowklRGPYbB/IuIMfYgxaCPg5Bpg==}
engines: {node: '>= 14.6'}
hasBin: true
snapshots:
'@actions/github@9.0.0':
dependencies:
'@actions/http-client': 3.0.2
'@octokit/core': 7.0.6
'@octokit/plugin-paginate-rest': 14.0.0(@octokit/core@7.0.6)
'@octokit/plugin-rest-endpoint-methods': 17.0.0(@octokit/core@7.0.6)
'@octokit/request': 10.0.8
'@octokit/request-error': 7.1.0
undici: 6.24.1
'@actions/http-client@3.0.2':
dependencies:
tunnel: 0.0.6
undici: 6.24.1
'@conventional-changelog/git-client@2.6.0(conventional-commits-filter@5.0.0)(conventional-commits-parser@6.3.0)':
dependencies:
'@simple-libs/child-process-utils': 1.0.2
'@simple-libs/stream-utils': 1.2.0
semver: 7.7.4
optionalDependencies:
conventional-commits-filter: 5.0.0
conventional-commits-parser: 6.3.0
'@octokit/auth-token@6.0.0': {}
'@octokit/core@7.0.6':
dependencies:
'@octokit/auth-token': 6.0.0
'@octokit/graphql': 9.0.3
'@octokit/request': 10.0.8
'@octokit/request-error': 7.1.0
'@octokit/types': 16.0.0
before-after-hook: 4.0.0
universal-user-agent: 7.0.3
'@octokit/endpoint@11.0.3':
dependencies:
'@octokit/types': 16.0.0
universal-user-agent: 7.0.3
'@octokit/graphql@9.0.3':
dependencies:
'@octokit/request': 10.0.8
'@octokit/types': 16.0.0
universal-user-agent: 7.0.3
'@octokit/openapi-types@27.0.0': {}
'@octokit/plugin-paginate-rest@14.0.0(@octokit/core@7.0.6)':
dependencies:
'@octokit/core': 7.0.6
'@octokit/types': 16.0.0
'@octokit/plugin-rest-endpoint-methods@17.0.0(@octokit/core@7.0.6)':
dependencies:
'@octokit/core': 7.0.6
'@octokit/types': 16.0.0
'@octokit/request-error@7.1.0':
dependencies:
'@octokit/types': 16.0.0
'@octokit/request@10.0.8':
dependencies:
'@octokit/endpoint': 11.0.3
'@octokit/request-error': 7.1.0
'@octokit/types': 16.0.0
fast-content-type-parse: 3.0.0
json-with-bigint: 3.5.7
universal-user-agent: 7.0.3
'@octokit/types@16.0.0':
dependencies:
'@octokit/openapi-types': 27.0.0
'@simple-libs/child-process-utils@1.0.2':
dependencies:
'@simple-libs/stream-utils': 1.2.0
'@simple-libs/hosted-git-info@1.0.2': {}
'@simple-libs/stream-utils@1.2.0': {}
'@types/normalize-package-data@2.4.4': {}
array-ify@1.0.0: {}
balanced-match@4.0.4: {}
before-after-hook@4.0.0: {}
brace-expansion@5.0.4:
dependencies:
balanced-match: 4.0.4
compare-func@2.0.0:
dependencies:
array-ify: 1.0.0
dot-prop: 5.3.0
conventional-changelog-angular@8.3.0:
dependencies:
compare-func: 2.0.0
conventional-changelog-preset-loader@5.0.0: {}
conventional-changelog-writer@8.4.0:
dependencies:
'@simple-libs/stream-utils': 1.2.0
conventional-commits-filter: 5.0.0
handlebars: 4.7.8
meow: 13.2.0
semver: 7.7.4
conventional-changelog@7.2.0(conventional-commits-filter@5.0.0):
dependencies:
'@conventional-changelog/git-client': 2.6.0(conventional-commits-filter@5.0.0)(conventional-commits-parser@6.3.0)
'@simple-libs/hosted-git-info': 1.0.2
'@types/normalize-package-data': 2.4.4
conventional-changelog-preset-loader: 5.0.0
conventional-changelog-writer: 8.4.0
conventional-commits-parser: 6.3.0
fd-package-json: 2.0.0
meow: 13.2.0
normalize-package-data: 7.0.1
transitivePeerDependencies:
- conventional-commits-filter
conventional-commits-filter@5.0.0: {}
conventional-commits-parser@6.3.0:
dependencies:
'@simple-libs/stream-utils': 1.2.0
meow: 13.2.0
debug@4.4.3:
dependencies:
ms: 2.1.3
dot-prop@5.3.0:
dependencies:
is-obj: 2.0.0
fast-content-type-parse@3.0.0: {}
fd-package-json@2.0.0:
dependencies:
walk-up-path: 4.0.0
glob@13.0.6:
dependencies:
minimatch: 10.2.4
minipass: 7.1.3
path-scurry: 2.0.2
handlebars@4.7.8:
dependencies:
minimist: 1.2.8
neo-async: 2.6.2
source-map: 0.6.1
wordwrap: 1.0.0
optionalDependencies:
uglify-js: 3.19.3
hosted-git-info@8.1.0:
dependencies:
lru-cache: 10.4.3
is-obj@2.0.0: {}
is-safe-filename@0.1.1: {}
json-with-bigint@3.5.7: {}
lru-cache@10.4.3: {}
lru-cache@11.2.7: {}
meow@13.2.0: {}
minimatch@10.2.4:
dependencies:
brace-expansion: 5.0.4
minimist@1.2.8: {}
minipass@7.1.3: {}
ms@2.1.3: {}
neo-async@2.6.2: {}
normalize-package-data@7.0.1:
dependencies:
hosted-git-info: 8.1.0
semver: 7.7.4
validate-npm-package-license: 3.0.4
path-scurry@2.0.2:
dependencies:
lru-cache: 11.2.7
minipass: 7.1.3
semver@7.7.4: {}
source-map@0.6.1: {}
spdx-correct@3.2.0:
dependencies:
spdx-expression-parse: 3.0.1
spdx-license-ids: 3.0.23
spdx-exceptions@2.5.0: {}
spdx-expression-parse@3.0.1:
dependencies:
spdx-exceptions: 2.5.0
spdx-license-ids: 3.0.23
spdx-license-ids@3.0.23: {}
temp-dir@3.0.0: {}
tempfile@6.0.1:
dependencies:
is-safe-filename: 0.1.1
temp-dir: 3.0.0
tunnel@0.0.6: {}
uglify-js@3.19.3:
optional: true
undici@6.24.1: {}
universal-user-agent@7.0.3: {}
validate-npm-package-license@3.0.4:
dependencies:
spdx-correct: 3.2.0
spdx-expression-parse: 3.0.1
walk-up-path@4.0.0: {}
wordwrap@1.0.0: {}
yaml@2.8.3: {}

@@ -0,0 +1,32 @@
import { ensureEnvVar } from './github-helpers.mjs';
async function populateCloudDatabases() {
const payload = ensureEnvVar('PAYLOAD');
const webhookData = ensureEnvVar('N8N_POPULATE_CLOUD_WEBHOOK_DATA');
const { user, secret, url } = JSON.parse(webhookData);
console.log('Payload: ', JSON.parse(payload));
const response = await fetch(url, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: 'Basic ' + Buffer.from(`${user}:${secret}`).toString('base64'),
},
body: payload,
});
const status = response.status;
console.log('Webhook call returned status ' + status);
if (status !== 200) {
const body = await response.text();
throw new Error(`Webhook call failed:\n\n ${body}`);
}
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
await populateCloudDatabases();
}
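The `Authorization` header above is plain HTTP Basic auth. Sketched standalone (the user/secret values are placeholders, not real credentials):

```javascript
// Basic auth header construction, as used in populateCloudDatabases (placeholder values).
const basicAuth = (user, secret) =>
  'Basic ' + Buffer.from(`${user}:${secret}`).toString('base64');

console.log(basicAuth('a', 'b')); // → "Basic YTpi"
```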

@@ -0,0 +1,68 @@
import {
deleteRelease,
ensureEnvVar,
getExistingRelease,
initGithub,
writeGithubOutput,
} from './github-helpers.mjs';
/**
* Promotes a GitHub release to latest
*
* Required env variables:
* - RELEASE_TAG - Release tag on git e.g. n8n@2.13.0
*
* GitHub variables
* - GITHUB_TOKEN - Used to authenticate to octokit - Can be overwritten for privileged access
* - GITHUB_REPOSITORY - Used to determine target repository
* */
async function promoteGitHubRelease() {
const RELEASE_TAG = ensureEnvVar('RELEASE_TAG');
const { octokit, owner, repo } = initGithub();
const existingRelease = await getExistingRelease(RELEASE_TAG);
if (!existingRelease) {
console.warn("Couldn't find release by tag. Exiting...");
process.exit(1);
}
const releaseResponse = await octokit.rest.repos.updateRelease({
owner,
repo,
release_id: existingRelease.id,
prerelease: false,
make_latest: 'true',
});
console.log(`Successfully updated release ${releaseResponse.data.html_url}`);
const existingStableRelease = await getExistingRelease('stable');
if (existingStableRelease) {
await deleteRelease(existingStableRelease.id);
console.log("Deleted previous 'stable' release.");
}
const stableReleaseResponse = await octokit.rest.repos.createRelease({
tag_name: 'stable',
name: 'stable',
body: releaseResponse.data.body,
draft: false,
prerelease: false,
make_latest: 'false',
target_commitish: releaseResponse.data.target_commitish,
owner,
repo,
});
console.log(`Successfully created new stable release ${stableReleaseResponse.data.html_url}`);
writeGithubOutput({
release_url: releaseResponse.data.html_url,
stable_release_url: stableReleaseResponse.data.html_url,
});
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
await promoteGitHubRelease();
}

.github/scripts/retry.mjs vendored Normal file
@@ -0,0 +1,79 @@
#!/usr/bin/env node
/**
* Retry a shell command with configurable attempts and delay.
*
* Usage (safe): node retry.mjs [--attempts N] [--delay N] -- <cmd> [args...]
* Usage (legacy): node retry.mjs [--attempts N] [--delay N] '<shell command>'
*
* Options:
* --attempts N Maximum number of attempts (default: 4)
* --delay N Seconds to wait between retries (default: 15)
*
* The -- form passes args directly to the process (no shell, safe for untrusted input).
* The legacy form executes via shell, so pipes and env-var expansion work but injection is possible.
* Exits 0 on first success, 1 if all attempts fail.
*/
import { execSync, spawnSync } from 'node:child_process';
const args = process.argv.slice(2);
function getFlag(name, defaultValue) {
const index = args.indexOf(`--${name}`);
if (index === -1 || !args[index + 1]) return defaultValue;
const value = parseInt(args[index + 1], 10);
if (Number.isNaN(value) || value <= 0) {
console.error(`Error: --${name} must be a positive integer`);
process.exit(1);
}
return value;
}
const attempts = getFlag('attempts', 4);
const delay = getFlag('delay', 15);
// Preferred form: -- cmd arg1 arg2 ... (no shell, safe for untrusted input)
// Legacy form: '<shell command string>' (uses shell; kept for backwards compat)
const separatorIndex = args.indexOf('--');
let command;
let commandArgs = [];
const isSafeRetry = separatorIndex !== -1;
if (isSafeRetry) {
[command, ...commandArgs] = args.slice(separatorIndex + 1);
} else {
command = args
.filter((a, i) => {
if (a.startsWith('--')) return false;
if (i > 0 && args[i - 1].startsWith('--')) return false;
return true;
})
.pop();
}
if (!command) {
console.error('Usage: node retry.mjs [--attempts N] [--delay N] -- <cmd> [args...]');
process.exit(1);
}
for (let i = 1; i <= attempts; i++) {
try {
if (isSafeRetry) {
const result = spawnSync(command, commandArgs, { stdio: 'inherit' });
if (result.status !== 0) throw new Error(`Exit code ${result.status}`);
} else {
execSync(command, { shell: true, stdio: 'inherit' });
}
process.exit(0);
} catch {
if (i < attempts) {
console.error(`Attempt ${i}/${attempts} failed, retrying in ${delay}s...`);
execSync(`sleep ${delay}`);
} else {
console.error(`Attempt ${i}/${attempts} failed, no more retries.`);
}
}
}
process.exit(1);

.github/scripts/send-build-stats.mjs vendored Normal file
@@ -0,0 +1,67 @@
#!/usr/bin/env node
/**
* Sends Turbo build stats to the unified QA metrics webhook.
*
* Reads the Turbo run summary from .turbo/runs/ and emits per-package
* build-duration metrics with {package, cache, task} dimensions, plus
* a run-level build-total-duration summary.
*
* Usage: node send-build-stats.mjs
*
* Environment variables:
* QA_METRICS_WEBHOOK_URL - Webhook URL (required to send)
* QA_METRICS_WEBHOOK_USER - Basic auth username
* QA_METRICS_WEBHOOK_PASSWORD - Basic auth password
*/
import { existsSync, readFileSync, readdirSync } from 'node:fs';
import { join } from 'node:path';
import { sendMetrics, metric } from './send-metrics.mjs';
const runsDir = '.turbo/runs';
if (!existsSync(runsDir)) {
console.log('No .turbo/runs directory found (turbo --summarize not used), skipping.');
process.exit(0);
}
const files = readdirSync(runsDir)
.filter((f) => f.endsWith('.json'))
.sort();
if (files.length === 0) {
console.error('No summary file found in .turbo/runs/');
process.exit(1);
}
const summary = JSON.parse(readFileSync(join(runsDir, files.at(-1)), 'utf-8'));
const metrics = [];
for (const task of summary.tasks ?? []) {
if (task.execution?.exitCode !== 0) continue;
const durationMs = task.execution.durationMs ?? 0;
const cacheHit = task.cache?.status === 'HIT';
// taskId format: "package-name#task-name"
const [pkg, taskName] = task.taskId?.split('#') ?? [task.package, task.task];
metrics.push(
metric('build-duration', durationMs / 1000, 's', {
package: pkg ?? 'unknown',
task: taskName ?? 'build',
cache: cacheHit ? 'hit' : 'miss',
}),
);
}
const totalMs = summary.durationMs ?? 0;
const totalTasks = summary.tasks?.length ?? 0;
const cachedTasks = summary.tasks?.filter((t) => t.cache?.status === 'HIT').length ?? 0;
metrics.push(
metric('build-total-duration', totalMs / 1000, 's', {
total_tasks: totalTasks,
cached_tasks: cachedTasks,
}),
);
await sendMetrics(metrics, 'build-stats');
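The cache-hit aggregation done above can be shown over a mock Turbo summary — illustrative data only, not the real `.turbo/runs` format in full:

```javascript
// Aggregates total vs cached tasks the same way send-build-stats.mjs does (mock data).
const summarizeCache = (tasks) => ({
  total: tasks.length,
  cached: tasks.filter((t) => t.cache?.status === 'HIT').length,
});

const mockTasks = [{ cache: { status: 'HIT' } }, { cache: { status: 'MISS' } }, {}];
console.log(summarizeCache(mockTasks)); // → { total: 3, cached: 1 }
```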

.github/scripts/send-docker-stats.mjs vendored Normal file
@@ -0,0 +1,74 @@
#!/usr/bin/env node
/**
* Sends Docker build stats to the unified QA metrics webhook.
*
* Reads manifests produced by build-n8n.mjs and dockerize-n8n.mjs and emits
* per-image docker-image-size metrics and build duration metrics with
* {image, platform} dimensions.
*
* Usage: node send-docker-stats.mjs
*
* Environment variables:
* QA_METRICS_WEBHOOK_URL - Webhook URL (required to send)
* QA_METRICS_WEBHOOK_USER - Basic auth username
* QA_METRICS_WEBHOOK_PASSWORD - Basic auth password
*/
import { existsSync, readFileSync } from 'node:fs';
import { sendMetrics, metric } from './send-metrics.mjs';
const buildManifestPath = 'compiled/build-manifest.json';
const dockerManifestPath = 'docker-build-manifest.json';
if (!existsSync(buildManifestPath) && !existsSync(dockerManifestPath)) {
console.log('No build or docker manifests found, skipping.');
process.exit(0);
}
const buildManifest = existsSync(buildManifestPath)
? JSON.parse(readFileSync(buildManifestPath, 'utf-8'))
: null;
const dockerManifest = existsSync(dockerManifestPath)
? JSON.parse(readFileSync(dockerManifestPath, 'utf-8'))
: null;
const metrics = [];
if (buildManifest) {
if (buildManifest.artifactSize != null) {
metrics.push(metric('artifact-size', buildManifest.artifactSize, 'bytes', { artifact: 'compiled' }));
}
if (buildManifest.buildDuration != null) {
metrics.push(metric('build-duration', buildManifest.buildDuration / 1000, 's', { artifact: 'compiled' }));
}
}
if (dockerManifest) {
const platform = dockerManifest.platform ?? 'unknown';
for (const image of dockerManifest.images ?? []) {
if (image.sizeBytes != null) {
metrics.push(
metric('docker-image-size', image.sizeBytes, 'bytes', {
image: image.name ?? 'unknown',
platform,
}),
);
}
}
if (dockerManifest.buildDurationMs != null) {
metrics.push(
metric('docker-build-duration', dockerManifest.buildDurationMs / 1000, 's', { platform }),
);
}
}
if (metrics.length === 0) {
console.log('No metrics to send.');
process.exit(0);
}
await sendMetrics(metrics, 'docker-stats');

.github/scripts/send-metrics.mjs vendored Normal file
@@ -0,0 +1,94 @@
#!/usr/bin/env node
/**
* Shared metrics sender for CI scripts.
* See .github/CI-TELEMETRY.md for payload shape and BigQuery schema.
*
* Usage:
* import { sendMetrics, metric } from './send-metrics.mjs';
* await sendMetrics([metric('build-duration', 45.2, 's', { package: '@n8n/cli' })]);
*
* Env: QA_METRICS_WEBHOOK_URL, QA_METRICS_WEBHOOK_USER, QA_METRICS_WEBHOOK_PASSWORD
*/
import * as os from 'node:os';
/** Build a single metric object. */
export function metric(name, value, unit, dimensions = {}) {
return { metric_name: name, value, unit, dimensions };
}
/** Build git/ci/runner context from environment variables. */
export function buildContext(benchmarkName = null) {
const ref = process.env.GITHUB_REF ?? '';
const prMatch = ref.match(/refs\/pull\/(\d+)/);
const runId = process.env.GITHUB_RUN_ID ?? null;
return {
timestamp: new Date().toISOString(),
benchmark_name: benchmarkName,
git: {
sha: process.env.GITHUB_SHA?.slice(0, 8) ?? null,
branch: process.env.GITHUB_HEAD_REF ?? process.env.GITHUB_REF_NAME ?? null,
pr: prMatch ? parseInt(prMatch[1], 10) : null,
},
ci: {
runId,
runUrl:
runId && process.env.GITHUB_REPOSITORY
? `https://github.com/${process.env.GITHUB_REPOSITORY}/actions/runs/${runId}`
: null,
job: process.env.GITHUB_JOB ?? null,
workflow: process.env.GITHUB_WORKFLOW ?? null,
attempt: process.env.GITHUB_RUN_ATTEMPT
? parseInt(process.env.GITHUB_RUN_ATTEMPT, 10)
: null,
},
runner: {
provider: !process.env.CI
? 'local'
: process.env.RUNNER_ENVIRONMENT === 'github-hosted'
? 'github'
: 'blacksmith',
cpuCores: os.cpus().length,
memoryGb: Math.round((os.totalmem() / (1024 * 1024 * 1024)) * 10) / 10,
},
};
}
export async function sendMetrics(metrics, benchmarkName = null) {
const webhookUrl = process.env.QA_METRICS_WEBHOOK_URL;
const webhookUser = process.env.QA_METRICS_WEBHOOK_USER;
const webhookPassword = process.env.QA_METRICS_WEBHOOK_PASSWORD;
if (!webhookUrl) {
console.log('QA_METRICS_WEBHOOK_URL not set, skipping.');
return;
}
if (!webhookUser || !webhookPassword) {
console.log('QA_METRICS_WEBHOOK_USER/PASSWORD not set, skipping.');
return;
}
const payload = { ...buildContext(benchmarkName), metrics };
const basicAuth = Buffer.from(`${webhookUser}:${webhookPassword}`).toString('base64');
const response = await fetch(webhookUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: `Basic ${basicAuth}`,
},
body: JSON.stringify(payload),
signal: AbortSignal.timeout(30_000),
});
if (!response.ok) {
const body = await response.text().catch(() => '');
throw new Error(
`Webhook failed: ${response.status} ${response.statusText}${body ? `\n${body}` : ''}`,
);
}
console.log(`Sent ${metrics.length} metric(s): ${response.status}`);
}
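The PR-number extraction inside `buildContext` is worth seeing in isolation — the regex below is the one from the script, and the two `GITHUB_REF` values are typical examples:

```javascript
// Extract a PR number from a GITHUB_REF value, as buildContext does.
const prFromRef = (ref) => {
	const prMatch = ref.match(/refs\/pull\/(\d+)/);
	return prMatch ? parseInt(prMatch[1], 10) : null;
};

const prNumber = prFromRef('refs/pull/30009/merge'); // pull_request build
const pushBuild = prFromRef('refs/heads/master');    // push build, no PR
```

On a push build the regex does not match, so the `git.pr` field ends up `null` in the payload.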

View File

@@ -0,0 +1,32 @@
import { ensureEnvVar } from './github-helpers.mjs';
async function sendVersionReleaseNotification() {
const payload = ensureEnvVar('PAYLOAD');
const webhookData = ensureEnvVar('N8N_VERSION_RELEASE_NOTIFICATION_DATA');
const { user, secret, url } = JSON.parse(webhookData);
console.log('Payload: ', JSON.parse(payload));
const response = await fetch(url, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
Authorization: 'Basic ' + Buffer.from(`${user}:${secret}`).toString('base64'),
},
body: payload,
});
const status = response.status;
console.log('Webhook call returned status ' + status);
if (status !== 200) {
const body = await response.text();
throw new Error(`Webhook call failed:\n\n ${body}`);
}
}
// only run when executed directly, not when imported by tests
if (import.meta.url === `file://${process.argv[1]}`) {
sendVersionReleaseNotification();
}
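Both webhook scripts build the same Basic auth header: base64 of `user:secret`. A minimal sketch with placeholder credentials:

```javascript
// Placeholder credentials; real values come from env/secret config.
const user = 'ci-bot';
const secret = 'hunter2';

// "Basic " + base64("user:secret"), as sent in the Authorization header.
const authHeader = 'Basic ' + Buffer.from(`${user}:${secret}`).toString('base64');
```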

View File

@@ -1,5 +1,5 @@
import createTempFile from 'tempfile';
import conventionalChangelog from 'conventional-changelog';
import { ConventionalChangelog } from 'conventional-changelog';
import { resolve } from 'path';
import { createReadStream, createWriteStream } from 'fs';
import { dirname } from 'path';
@@ -12,21 +12,48 @@ const fullChangelogFile = resolve(baseDir, 'CHANGELOG.md');
// Version includes experimental versions (e.g., 1.2.3-exp.0)
const versionChangelogFile = resolve(baseDir, `CHANGELOG-${packageJson.version}.md`);
const changelogStream = conventionalChangelog({
preset: 'angular',
releaseCount: 1,
tagPrefix: 'n8n@',
transform: (commit, callback) => {
const hasNoChangelogInHeader = commit.header.includes('(no-changelog)');
const isBenchmarkScope = commit.scope === 'benchmark';
const changelogStream = new ConventionalChangelog()
.package(packageJson)
.readRepository()
.loadPreset('angular')
.tags({
prefix: 'n8n@',
})
.context({
version: packageJson.version,
repoUrl: 'https://github.com/n8n-io/n8n',
})
.options({
releaseCount: 1,
transformCommit(commit) {
const hasNoChangelogInHeader = commit.header?.includes('(no-changelog)');
const isBenchmarkScope = commit.scope === 'benchmark';
// Ignore commits that have 'benchmark' scope or '(no-changelog)' in the header
callback(null, hasNoChangelogInHeader || isBenchmarkScope ? undefined : commit);
},
}).on('error', (err) => {
console.error(err.stack);
process.exit(1);
});
// Ignore commits that have 'benchmark' scope or '(no-changelog)' in the header
if (hasNoChangelogInHeader || isBenchmarkScope) return null;
// Strip backport information from commit subject, e.g.:
// "Fix something (backport to release-candidate/2.12.x) (#123)" → "Fix something (#123)"
if (commit.subject) {
// The commit.subject is immutable so we need to recreate the commit object
/** @type { import("conventional-changelog").Commit } */
let newCommit = /** @type { any } */ ({
...commit,
subject: commit.subject.replace(/\s*\(backport to [^)]+\)/g, ''),
});
return newCommit;
}
return commit;
},
})
.writeStream()
.on('error', (err) => {
console.error(err.stack);
process.exit(1);
});
// Write the new changelog to a new temporary file, so that the contents can be used in the PR description
await pipeline(changelogStream, createWriteStream(versionChangelogFile));
@@ -36,5 +63,6 @@ await pipeline(changelogStream, createWriteStream(versionChangelogFile));
const tmpFile = createTempFile();
const tmpStream = createWriteStream(tmpFile);
await pipeline(createReadStream(versionChangelogFile), tmpStream, { end: false });
tmpStream.write('\n\n');
await pipeline(createReadStream(fullChangelogFile), tmpStream);
await pipeline(createReadStream(tmpFile), createWriteStream(fullChangelogFile));
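The backport-stripping replace in `transformCommit` can be checked against the sample from its own comment (the subject below is illustrative, not a real commit):

```javascript
// Strip "(backport to <branch>)" annotations from a commit subject,
// using the same regex as the changelog transform.
const subject = 'Fix something (backport to release-candidate/2.12.x) (#123)';
const cleaned = subject.replace(/\s*\(backport to [^)]+\)/g, '');
```

The global flag means a subject carrying several backport annotations would have all of them removed.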

782
.github/test-metrics/playwright.json vendored Normal file
View File

@@ -0,0 +1,782 @@
{
"updatedAt": "2026-03-03T14:06:03.725Z",
"source": "currents",
"projectId": "LRxcNt",
"specs": {
"tests/e2e/projects/projects.spec.ts": {
"avgDuration": 146428,
"testCount": 7,
"flakyRate": 0.0269
},
"tests/e2e/workflows/editor/canvas/actions.spec.ts": {
"avgDuration": 132050,
"testCount": 20,
"flakyRate": 0.0071
},
"tests/e2e/credentials/crud.spec.ts": {
"avgDuration": 120000,
"testCount": 14,
"flakyRate": 0
},
"tests/e2e/data-tables/tables.spec.ts": {
"avgDuration": 117860,
"testCount": 7,
"flakyRate": 0.0054
},
"tests/e2e/workflows/list/workflows.spec.ts": {
"avgDuration": 110286,
"testCount": 9,
"flakyRate": 0.0143
},
"tests/e2e/workflows/editor/canvas/canvas-nodes.spec.ts": {
"avgDuration": 106230,
"testCount": 8,
"flakyRate": 0.2183
},
"tests/e2e/workflows/editor/code/code-node.spec.ts": {
"avgDuration": 104346,
"testCount": 12,
"flakyRate": 0.1071
},
"tests/e2e/ai/assistant-basic.spec.ts": {
"avgDuration": 104278,
"testCount": 11,
"flakyRate": 0.0143
},
"tests/e2e/settings/personal/two-factor-authentication.spec.ts": {
"avgDuration": 103362,
"testCount": 7,
"flakyRate": 0.0036
},
"tests/e2e/data-tables/details.spec.ts": {
"avgDuration": 102518,
"testCount": 11,
"flakyRate": 0.0036
},
"tests/e2e/workflows/editor/canvas/canvas-zoom.spec.ts": {
"avgDuration": 98829,
"testCount": 13,
"flakyRate": 0.0698
},
"tests/e2e/workflows/editor/canvas/undo-redo.spec.ts": {
"avgDuration": 98612,
"testCount": 14,
"flakyRate": 0.0018
},
"tests/e2e/ai/langchain-agents.spec.ts": {
"avgDuration": 97616,
"testCount": 7,
"flakyRate": 0.0215
},
"tests/e2e/workflows/editor/ndv/ndv-data-display.spec.ts": {
"avgDuration": 91228,
"testCount": 11,
"flakyRate": 0.0477
},
"tests/e2e/workflows/editor/ndv/ndv-core.spec.ts": {
"avgDuration": 91044,
"testCount": 14,
"flakyRate": 0.0036
},
"tests/e2e/auth/oidc.spec.ts": {
"avgDuration": 90276,
"testCount": 1,
"flakyRate": 0.0214
},
"tests/e2e/workflows/editor/code/editors.spec.ts": {
"avgDuration": 87228,
"testCount": 11,
"flakyRate": 0.0036
},
"tests/e2e/projects/folders-operations.spec.ts": {
"avgDuration": 80775,
"testCount": 14,
"flakyRate": 0.009
},
"tests/e2e/nodes/webhook.spec.ts": {
"avgDuration": 80112,
"testCount": 9,
"flakyRate": 0.0143
},
"tests/e2e/workflows/templates/credentials-setup.spec.ts": {
"avgDuration": 79929,
"testCount": 8,
"flakyRate": 0.0178
},
"tests/e2e/ai/langchain-chains.spec.ts": {
"avgDuration": 77134,
"testCount": 4,
"flakyRate": 0.0179
},
"tests/e2e/workflows/executions/list.spec.ts": {
"avgDuration": 75367,
"testCount": 9,
"flakyRate": 0.2228
},
"tests/e2e/ai/hitl-for-tools.spec.ts": {
"avgDuration": 75127,
"testCount": 2,
"flakyRate": 0.0215
},
"tests/e2e/workflows/editor/execution/execution.spec.ts": {
"avgDuration": 72856,
"testCount": 14,
"flakyRate": 0.0623
},
"tests/e2e/sharing/credential-visibility.spec.ts": {
"avgDuration": 71778,
"testCount": 5,
"flakyRate": 0.0107
},
"tests/e2e/workflows/editor/ndv/ndv-parameters.spec.ts": {
"avgDuration": 70303,
"testCount": 9,
"flakyRate": 0.0125
},
"tests/e2e/ai/assistant-credential-help.spec.ts": {
"avgDuration": 67106,
"testCount": 4,
"flakyRate": 0.0143
},
"tests/e2e/ai/assistant-code-help.spec.ts": {
"avgDuration": 65391,
"testCount": 2,
"flakyRate": 0.0375
},
"tests/e2e/workflows/editor/viewer-permissions.spec.ts": {
"avgDuration": 63815,
"testCount": 3,
"flakyRate": 0.0071
},
"tests/e2e/workflows/editor/execution/logs.spec.ts": {
"avgDuration": 63194,
"testCount": 8,
"flakyRate": 0.0677
},
"tests/e2e/workflows/editor/execution/debug.spec.ts": {
"avgDuration": 63190,
"testCount": 4,
"flakyRate": 0.0357
},
"tests/e2e/workflows/editor/ndv/pinning.spec.ts": {
"avgDuration": 63033,
"testCount": 10,
"flakyRate": 0.0268
},
"tests/e2e/building-blocks/node-details-configuration.spec.ts": {
"avgDuration": 62896,
"testCount": 7,
"flakyRate": 0.0143
},
"tests/e2e/ai/assistant-support-chat.spec.ts": {
"avgDuration": 62156,
"testCount": 3,
"flakyRate": 0.0196
},
"tests/e2e/workflows/editor/subworkflows/extraction.spec.ts": {
"avgDuration": 60164,
"testCount": 3,
"flakyRate": 0.0018
},
"tests/e2e/building-blocks/canvas-actions.spec.ts": {
"avgDuration": 59268,
"testCount": 9,
"flakyRate": 0.0018
},
"tests/e2e/workflows/editor/expressions/mapping.spec.ts": {
"avgDuration": 58896,
"testCount": 10,
"flakyRate": 0.0036
},
"tests/e2e/chat-hub/chat-hub-basic.spec.ts": {
"avgDuration": 58724,
"testCount": 3,
"flakyRate": 0.0519
},
"tests/e2e/workflows/editor/ndv/paired-item.spec.ts": {
"avgDuration": 58608,
"testCount": 6,
"flakyRate": 0.0555
},
"tests/e2e/capabilities/proxy-server.spec.ts": {
"avgDuration": 57636,
"testCount": 4,
"flakyRate": 0.0179
},
"tests/e2e/projects/project-settings.spec.ts": {
"avgDuration": 57079,
"testCount": 8,
"flakyRate": 0
},
"tests/e2e/building-blocks/credentials.spec.ts": {
"avgDuration": 56988,
"testCount": 6,
"flakyRate": 0.0142
},
"tests/e2e/nodes/form-trigger-node.spec.ts": {
"avgDuration": 55372,
"testCount": 5,
"flakyRate": 0.0143
},
"tests/e2e/workflows/executions/filter.spec.ts": {
"avgDuration": 54860,
"testCount": 2,
"flakyRate": 0.4223
},
"tests/e2e/building-blocks/workflow-entry-points.spec.ts": {
"avgDuration": 54615,
"testCount": 5,
"flakyRate": 0.025
},
"tests/e2e/dynamic-credentials/execution-status.spec.ts": {
"avgDuration": 54611,
"testCount": 2,
"flakyRate": 0.0028
},
"tests/e2e/regression/ADO-4462-template-setup-experiment.spec.ts": {
"avgDuration": 54493,
"testCount": 2,
"flakyRate": 0.0108
},
"tests/e2e/node-creator/categories.spec.ts": {
"avgDuration": 53655,
"testCount": 5,
"flakyRate": 0.6301
},
"tests/e2e/workflows/editor/expressions/modal.spec.ts": {
"avgDuration": 53254,
"testCount": 6,
"flakyRate": 0
},
"tests/e2e/ai/rag-callout.spec.ts": {
"avgDuration": 53211,
"testCount": 2,
"flakyRate": 0.0197
},
"tests/e2e/projects/projects-move-resources.spec.ts": {
"avgDuration": 52538,
"testCount": 2,
"flakyRate": 0.0054
},
"tests/e2e/auth/authenticated.spec.ts": {
"avgDuration": 51796,
"testCount": 5,
"flakyRate": 0.0205
},
"tests/e2e/api/webhook-isolation.spec.ts": {
"avgDuration": 51794,
"testCount": 14,
"flakyRate": 0.0525
},
"tests/e2e/credentials/global.spec.ts": {
"avgDuration": 51699,
"testCount": 5,
"flakyRate": 0.0143
},
"tests/e2e/nodes/kafka-nodes.spec.ts": {
"avgDuration": 51563,
"testCount": 2,
"flakyRate": 0.018
},
"tests/e2e/workflows/editor/routing.spec.ts": {
"avgDuration": 51272,
"testCount": 6,
"flakyRate": 0.0071
},
"tests/e2e/app-config/demo.spec.ts": {
"avgDuration": 50675,
"testCount": 4,
"flakyRate": 0.0198
},
"tests/e2e/workflows/checklist/production-checklist.spec.ts": {
"avgDuration": 50490,
"testCount": 7,
"flakyRate": 0.0107
},
"tests/e2e/workflows/editor/expressions/inline.spec.ts": {
"avgDuration": 50014,
"testCount": 7,
"flakyRate": 0.0323
},
"tests/e2e/workflows/editor/expressions/transformation.spec.ts": {
"avgDuration": 49030,
"testCount": 6,
"flakyRate": 0
},
"tests/e2e/ai/chat-session.spec.ts": {
"avgDuration": 48206,
"testCount": 1,
"flakyRate": 0.0215
},
"tests/e2e/workflows/editor/ndv/resource-locator.spec.ts": {
"avgDuration": 47480,
"testCount": 7,
"flakyRate": 0.0018
},
"tests/e2e/projects/folders-basic.spec.ts": {
"avgDuration": 47403,
"testCount": 11,
"flakyRate": 0.0089
},
"tests/e2e/settings/log-streaming/log-streaming-observability.spec.ts": {
"avgDuration": 46652,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/chat-hub/chat-hub-attachment.spec.ts": {
"avgDuration": 46545,
"testCount": 3,
"flakyRate": 0.0735
},
"tests/e2e/app-config/security-notifications.spec.ts": {
"avgDuration": 44200,
"testCount": 5,
"flakyRate": 0.0071
},
"tests/e2e/workflows/editor/tags.spec.ts": {
"avgDuration": 43658,
"testCount": 7,
"flakyRate": 0.0053
},
"tests/e2e/ai/workflow-builder.spec.ts": {
"avgDuration": 43520,
"testCount": 5,
"flakyRate": 0.0036
},
"tests/e2e/workflows/editor/workflow-actions/publish.spec.ts": {
"avgDuration": 43049,
"testCount": 8,
"flakyRate": 0.0036
},
"tests/e2e/workflows/templates/templates.spec.ts": {
"avgDuration": 41875,
"testCount": 9,
"flakyRate": 0.1299
},
"tests/e2e/projects/folders-advanced.spec.ts": {
"avgDuration": 40967,
"testCount": 6,
"flakyRate": 0.0018
},
"tests/e2e/chat-hub/chat-hub-workflow-agent.spec.ts": {
"avgDuration": 40858,
"testCount": 2,
"flakyRate": 0.0054
},
"tests/e2e/workflows/editor/subworkflows/workflow-selector.spec.ts": {
"avgDuration": 39536,
"testCount": 5,
"flakyRate": 0.0053
},
"tests/e2e/cloud/cloud.spec.ts": {
"avgDuration": 39055,
"testCount": 3,
"flakyRate": 0.0036
},
"tests/e2e/ai/langchain-vectorstores.spec.ts": {
"avgDuration": 36808,
"testCount": 2,
"flakyRate": 0.0894
},
"tests/e2e/workflows/editor/ndv/ndv-floating-nodes.spec.ts": {
"avgDuration": 36193,
"testCount": 4,
"flakyRate": 0
},
"tests/e2e/settings/external-secrets/aws-secrets-manager.spec.ts": {
"avgDuration": 35223,
"testCount": 1,
"flakyRate": 0.0089
},
"tests/e2e/sentry/sentry-baseline.spec.ts": {
"avgDuration": 34042,
"testCount": 3,
"flakyRate": 0.0036
},
"tests/e2e/chat-hub/chat-hub-chat-user.spec.ts": {
"avgDuration": 33473,
"testCount": 1,
"flakyRate": 0.0125
},
"tests/e2e/auth/password-reset.spec.ts": {
"avgDuration": 28646,
"testCount": 1,
"flakyRate": 0.0089
},
"tests/e2e/workflows/editor/ndv/resource-mapper.spec.ts": {
"avgDuration": 28344,
"testCount": 4,
"flakyRate": 0.0107
},
"tests/e2e/settings/personal/personal.spec.ts": {
"avgDuration": 28212,
"testCount": 2,
"flakyRate": 0.0036
},
"tests/e2e/auth/admin-smoke.spec.ts": {
"avgDuration": 26384,
"testCount": 1,
"flakyRate": 0.0089
},
"tests/e2e/nodes/community-nodes.spec.ts": {
"avgDuration": 26234,
"testCount": 3,
"flakyRate": 0.0018
},
"tests/e2e/workflows/list/import.spec.ts": {
"avgDuration": 26160,
"testCount": 5,
"flakyRate": 0.0036
},
"tests/e2e/settings/environments/variables.spec.ts": {
"avgDuration": 25384,
"testCount": 7,
"flakyRate": 0.0036
},
"tests/e2e/workflows/editor/editor-after-route-changes.spec.ts": {
"avgDuration": 25048,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/workflows/editor/subworkflows/debugging.spec.ts": {
"avgDuration": 24663,
"testCount": 4,
"flakyRate": 0.0036
},
"tests/e2e/nodes/if-node.spec.ts": {
"avgDuration": 24306,
"testCount": 2,
"flakyRate": 0.066
},
"tests/e2e/app-config/env-feature-flags.spec.ts": {
"avgDuration": 23869,
"testCount": 2,
"flakyRate": 0.0036
},
"tests/e2e/node-creator/navigation.spec.ts": {
"avgDuration": 23617,
"testCount": 4,
"flakyRate": 0.0036
},
"tests/e2e/settings/log-streaming/log-streaming.spec.ts": {
"avgDuration": 23611,
"testCount": 5,
"flakyRate": 0.0018
},
"tests/e2e/node-creator/actions.spec.ts": {
"avgDuration": 22871,
"testCount": 4,
"flakyRate": 0.0036
},
"tests/e2e/nodes/schedule-trigger-node.spec.ts": {
"avgDuration": 22699,
"testCount": 1,
"flakyRate": 0.0159
},
"tests/e2e/workflows/editor/execution/inject-previous.spec.ts": {
"avgDuration": 22386,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/chat-hub/chat-hub-personal-agent.spec.ts": {
"avgDuration": 21180,
"testCount": 2,
"flakyRate": 0.0072
},
"tests/e2e/building-blocks/user-service.spec.ts": {
"avgDuration": 20964,
"testCount": 8,
"flakyRate": 0.0018
},
"tests/e2e/sharing/access-control.spec.ts": {
"avgDuration": 20638,
"testCount": 5,
"flakyRate": 0.0018
},
"tests/e2e/capabilities/task-runner.spec.ts": {
"avgDuration": 19897,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/chat-hub/chat-hub-settings.spec.ts": {
"avgDuration": 19840,
"testCount": 2,
"flakyRate": 0.0323
},
"tests/e2e/nodes/mcp-trigger.spec.ts": {
"avgDuration": 19374,
"testCount": 23,
"flakyRate": 0.0505
},
"tests/e2e/settings/users/users.spec.ts": {
"avgDuration": 19158,
"testCount": 5,
"flakyRate": 0.0036
},
"tests/e2e/workflows/demo-diff.spec.ts": {
"avgDuration": 19043,
"testCount": 9,
"flakyRate": 0.0036
},
"tests/e2e/api/webhook-external.spec.ts": {
"avgDuration": 18850,
"testCount": 2,
"flakyRate": 0.0125
},
"tests/e2e/regression/PAY-4367-node-shifting-cyclic.spec.ts": {
"avgDuration": 18217,
"testCount": 3,
"flakyRate": 0
},
"tests/e2e/sharing/workflow-sharing.spec.ts": {
"avgDuration": 18037,
"testCount": 4,
"flakyRate": 0.0161
},
"tests/e2e/auth/signin.spec.ts": {
"avgDuration": 17601,
"testCount": 1,
"flakyRate": 0.0036
},
"tests/e2e/nodes/pdf-node.spec.ts": {
"avgDuration": 17482,
"testCount": 1,
"flakyRate": 0.0389
},
"tests/e2e/node-creator/vector-stores.spec.ts": {
"avgDuration": 17319,
"testCount": 3,
"flakyRate": 0.0072
},
"tests/e2e/node-creator/special-nodes.spec.ts": {
"avgDuration": 17174,
"testCount": 3,
"flakyRate": 0
},
"tests/e2e/workflows/editor/ndv/io-filter.spec.ts": {
"avgDuration": 15837,
"testCount": 2,
"flakyRate": 0.0071
},
"tests/e2e/sharing/credential-sharing.spec.ts": {
"avgDuration": 14796,
"testCount": 3,
"flakyRate": 0.0036
},
"tests/e2e/nodes/http-request-node.spec.ts": {
"avgDuration": 14130,
"testCount": 2,
"flakyRate": 0.0089
},
"tests/e2e/workflows/editor/execution/partial.spec.ts": {
"avgDuration": 13393,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/regression/AI-812-partial-execs-broken-when-using-chat-trigger.spec.ts": {
"avgDuration": 12865,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/regression/ADO-2372-prevent-clipping-params.spec.ts": {
"avgDuration": 11308,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/node-creator/workflows.spec.ts": {
"avgDuration": 11144,
"testCount": 2,
"flakyRate": 0.0018
},
"tests/e2e/app-config/versions.spec.ts": {
"avgDuration": 11109,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/settings/workers/workers.spec.ts": {
"avgDuration": 10525,
"testCount": 4,
"flakyRate": 0.0018
},
"tests/e2e/chat-hub/chat-hub-tools.spec.ts": {
"avgDuration": 10235,
"testCount": 1,
"flakyRate": 0.0054
},
"tests/e2e/regression/ADO-1338-ndv-missing-input-panel.spec.ts": {
"avgDuration": 9645,
"testCount": 1,
"flakyRate": 0.0018
},
"tests/e2e/regression/CAT-726-canvas-node-connectors-not-rendered-when-nodes-inserted.spec.ts": {
"avgDuration": 8351,
"testCount": 1,
"flakyRate": 0.0018
},
"tests/e2e/regression/AI-1401-sub-nodes-input-panel.spec.ts": {
"avgDuration": 7922,
"testCount": 1,
"flakyRate": 0.0018
},
"tests/e2e/credentials/api-operations.spec.ts": {
"avgDuration": 7747,
"testCount": 5,
"flakyRate": 0
},
"tests/e2e/workflows/editor/ndv/schema-preview.spec.ts": {
"avgDuration": 7653,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/regression/AI-716-correctly-set-up-agent-model-shows-error.spec.ts": {
"avgDuration": 7322,
"testCount": 1,
"flakyRate": 0.0018
},
"tests/e2e/nodes/email-send-node.spec.ts": {
"avgDuration": 7251,
"testCount": 1,
"flakyRate": 0.0036
},
"tests/e2e/regression/SUG-121-fields-reset-after-closing-ndv.spec.ts": {
"avgDuration": 7203,
"testCount": 1,
"flakyRate": 0.0071
},
"tests/e2e/regression/SUG-38-inline-expression-preview.spec.ts": {
"avgDuration": 7181,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/credentials/oauth.spec.ts": {
"avgDuration": 6886,
"testCount": 1,
"flakyRate": 0.0053
},
"tests/e2e/settings/community-nodes/community-nodes.spec.ts": {
"avgDuration": 6527,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/regression/ADO-2230-ndv-reset-data-pagination.spec.ts": {
"avgDuration": 6230,
"testCount": 1,
"flakyRate": 0.0018
},
"tests/e2e/workflows/editor/canvas/stickies.spec.ts": {
"avgDuration": 5194,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/regression/ADO-2929-can-load-old-switch-node-workflows.spec.ts": {
"avgDuration": 5121,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/workflows/editor/canvas/focus-panel.spec.ts": {
"avgDuration": 5070,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/settings/log-streaming/log-streaming-ui-e2e.spec.ts": {
"avgDuration": 4739,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/mcp/mcp-service.spec.ts": {
"avgDuration": 3442,
"testCount": 23,
"flakyRate": 0
},
"tests/e2e/workflows/editor/subworkflows/wait.spec.ts": {
"avgDuration": 3295,
"testCount": 4,
"flakyRate": 0
},
"tests/e2e/dynamic-credentials/external-user-trigger.spec.ts": {
"avgDuration": 2166,
"testCount": 1,
"flakyRate": 0.0028
},
"tests/e2e/workflows/editor/subworkflows/subworkflow-version-resolution.spec.ts": {
"avgDuration": 1509,
"testCount": 4,
"flakyRate": 0.0036
},
"tests/e2e/nodes/n8n-trigger.spec.ts": {
"avgDuration": 60000,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/settings/external-secrets/secret-providers-connections.spec.ts": {
"avgDuration": 60000,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/source-control/push.spec.ts": {
"avgDuration": 60000,
"testCount": 4,
"flakyRate": 0
},
"tests/e2e/settings/environments/source-control.spec.ts": {
"avgDuration": 60000,
"testCount": 4,
"flakyRate": 0
},
"tests/e2e/workflows/editor/workflow-actions/archive.spec.ts": {
"avgDuration": 60000,
"testCount": 7,
"flakyRate": 0
},
"tests/e2e/workflows/editor/workflow-actions/run.spec.ts": {
"avgDuration": 60000,
"testCount": 4,
"flakyRate": 0
},
"tests/e2e/ai/langchain-tools.spec.ts": {
"avgDuration": 60000,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/source-control/pull.spec.ts": {
"avgDuration": 60000,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/ai/langchain-memory.spec.ts": {
"avgDuration": 60000,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/workflows/editor/workflow-actions/duplicate.spec.ts": {
"avgDuration": 60000,
"testCount": 2,
"flakyRate": 0
},
"tests/e2e/workflows/editor/workflow-actions/copy-paste.spec.ts": {
"avgDuration": 60000,
"testCount": 3,
"flakyRate": 0
},
"tests/e2e/app-config/nps-survey.spec.ts": {
"avgDuration": 60000,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/workflows/editor/workflow-actions/settings.spec.ts": {
"avgDuration": 60000,
"testCount": 3,
"flakyRate": 0
},
"tests/e2e/workflows/editor/execution/previous-nodes.spec.ts": {
"avgDuration": 60000,
"testCount": 1,
"flakyRate": 0
},
"tests/e2e/ai/evaluations.spec.ts": {
"avgDuration": 60000,
"testCount": 1,
"flakyRate": 0
}
}
}
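One way a consumer might use the spec metrics above is to rank specs by flakiness. A sketch, with two entries copied from the JSON and the ranking logic assumed (the file itself does not prescribe it):

```javascript
// Two spec entries taken from .github/test-metrics/playwright.json.
const specs = {
	'tests/e2e/node-creator/categories.spec.ts': { avgDuration: 53655, testCount: 5, flakyRate: 0.6301 },
	'tests/e2e/workflows/executions/filter.spec.ts': { avgDuration: 54860, testCount: 2, flakyRate: 0.4223 },
};

// Rank spec files from flakiest to most stable.
const flakiest = Object.entries(specs)
	.sort(([, a], [, b]) => b.flakyRate - a.flakyRate)
	.map(([file]) => file);
```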

5
.github/trivy.yaml vendored Normal file
View File

@@ -0,0 +1,5 @@
# Trivy configuration for n8n security scans
# See: https://trivy.dev/latest/docs/references/configuration/config-file/
vulnerability:
vex:
- vex.openvex.json

83
.github/workflows/backport.yml vendored Normal file
View File

@@ -0,0 +1,83 @@
name: 'Util: Backport pull request changes'
run-name: Backport pull request ${{ github.event.pull_request.number || inputs.pull-request-id }}
on:
pull_request:
types: [closed]
workflow_dispatch:
inputs:
pull-request-id:
description: 'The ID number of the pull request (e.g. 3342). No #, no extra letters.'
required: true
type: string
permissions:
contents: write
pull-requests: write
jobs:
backport:
if: |
github.event.pull_request.merged == true ||
github.event_name == 'workflow_dispatch'
runs-on: ubuntu-slim
steps:
- name: Generate GitHub App Token
id: generate-token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
with:
app-id: ${{ secrets.N8N_ASSISTANT_APP_ID }}
private-key: ${{ secrets.N8N_ASSISTANT_PRIVATE_KEY }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
token: ${{ steps.generate-token.outputs.token }}
fetch-depth: 0
- name: Setup Node.js
uses: ./.github/actions/setup-nodejs
with:
build-command: ''
install-command: pnpm install --frozen-lockfile --dir ./.github/scripts --ignore-workspace
- name: Compute backport targets
id: targets
env:
PULL_REQUEST_ID: ${{ inputs.pull-request-id }}
GITHUB_TOKEN: ${{ steps.generate-token.outputs.token }}
run: node .github/scripts/compute-backport-targets.mjs
- name: Backport
if: steps.targets.outputs.target_branches != ''
uses: korthout/backport-action@4aaf0e03a94ff0a619c9a511b61aeb42adea5b02 # v4.2.0
with:
github_token: ${{ steps.generate-token.outputs.token }}
source_pr_number: ${{ github.event.pull_request.number || inputs.pull-request-id }}
target_branches: ${{ steps.targets.outputs.target_branches }}
pull_description: |-
# Description
Backport of #${pull_number} to `${target_branch}`.
## Checklist for the author (@${pull_author}) to go through.
- [ ] Review the backport changes
- [ ] Fix possible conflicts
- [ ] Merge to target branch
After this PR has been merged, it will be picked up in the next patch release for that release track.
# Original description
${pull_description}
pull_title: ${pull_title} (backport to ${target_branch})
add_author_as_assignee: true
add_author_as_reviewer: true
copy_assignees: true
copy_requested_reviewers: false
copy_labels_pattern: '^(?!Backport to\b).+' # Copy everything except backport labels
add_labels: 'automation:backport'
experimental: >
{
"conflict_resolution": "draft_commit_conflicts"
}
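The `copy_labels_pattern` above is a negative lookahead: copy every label except ones starting with "Backport to". Shown against sample labels (the labels are illustrative):

```javascript
// Regex from copy_labels_pattern: match any label NOT starting "Backport to".
const pattern = /^(?!Backport to\b).+/;

const labels = ['bug', 'automation:backport', 'Backport to 1.x'];
const copied = labels.filter((label) => pattern.test(label));
// 'Backport to 1.x' is dropped; the other two are copied to the backport PR.
```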

View File

@@ -1,4 +1,4 @@
name: Docker Base Image CI
name: 'Build: Base Image'
on:
push:
@@ -6,9 +6,11 @@ on:
- master
paths:
- 'docker/images/n8n-base/Dockerfile'
- '.github/workflows/build-base-image.yml'
pull_request:
paths:
- 'docker/images/n8n-base/Dockerfile'
- '.github/workflows/build-base-image.yml'
workflow_dispatch:
inputs:
push:
@@ -21,34 +23,37 @@ jobs:
build:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
node_version: ['20', '22.21.0', '24']
node_version: ['22', '24.13.1', '25']
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up QEMU
uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392 # v3.6.0
uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Login to GitHub Container Registry
if: github.event_name == 'push' || (github.event_name == 'workflow_dispatch' && inputs.push == true)
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
- name: Login to DHI Registry (for pulling base images)
uses: ./.github/actions/docker-registry-login
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
login-ghcr: 'false'
login-dhi: 'true'
dockerhub-username: ${{ secrets.DOCKER_USERNAME }}
dockerhub-password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to DockerHub
- name: Login to Docker registries (for pushing)
if: github.event_name == 'push' || (github.event_name == 'workflow_dispatch' && inputs.push == true)
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
uses: ./.github/actions/docker-registry-login
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
login-ghcr: 'true'
login-dockerhub: 'true'
dockerhub-username: ${{ secrets.DOCKER_USERNAME }}
dockerhub-password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
with:
context: .
file: ./docker/images/n8n-base/Dockerfile

View File

@@ -1,4 +1,4 @@
name: Benchmark Docker Image CI
name: 'Build: Benchmark Image'
on:
workflow_dispatch:
@@ -9,30 +9,26 @@ on:
- 'packages/@n8n/benchmark/**'
- 'pnpm-lock.yaml'
- 'pnpm-workspace.yaml'
- '.github/workflows/docker-images-benchmark.yml'
- '.github/workflows/build-benchmark-image.yml'
jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up QEMU
uses: docker/setup-qemu-action@53851d14592bedcffcf25ea515637cff71ef929a # v3.3.0
uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@6524bf65af31da8d45b59e8c27de4bd072b392f5 # v3.8.0
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Login to GitHub Container Registry
uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3.3.0
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
uses: ./.github/actions/docker-registry-login
- name: Build
uses: docker/build-push-action@b32b51a8eda65d6793cd0494a773d4f6bcef32dc # v6.11.0
uses: docker/build-push-action@d08e5c354a6adb9ed34480a06d141179aa583294 # v7.0.0
env:
DOCKER_BUILD_SUMMARY: false
with:

View File

@@ -1,4 +1,4 @@
name: Trigger build/unit tests on PR comment
name: 'Build: Unit Test PR Comment'
on:
issue_comment:
@@ -18,7 +18,7 @@ jobs:
steps:
- name: Validate user permissions and collect PR data
id: check_permissions
uses: actions/github-script@v7
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
@@ -94,7 +94,7 @@ jobs:
HEAD_SHA: ${{ steps.check_permissions.outputs.headSha }}
PR_NUMBER: ${{ steps.check_permissions.outputs.prNumber }}
run: |
gh workflow run ci-manual-build-unit-tests.yml \
gh workflow run ci-manual-unit-tests.yml \
--repo "${{ github.repository }}" \
-f ref="${HEAD_SHA}" \
-f pr_number="${PR_NUMBER}"

View File

@@ -1,4 +1,4 @@
name: Windows CI
name: 'Build: Windows'
on:
workflow_dispatch:
@@ -24,6 +24,7 @@ on:
- '**/package.json'
- '**/turbo.json'
- '.github/workflows/build-windows.yml'
- '.github/actions/setup-nodejs/**'
jobs:
build:
@@ -31,13 +32,27 @@ jobs:
steps:
- name: Checkout code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Setup Node.js and Build
uses: ./.github/actions/setup-nodejs-github
uses: ./.github/actions/setup-nodejs
with:
build-command: pnpm build
- name: Smoke test pnpm start -- -- --version
shell: pwsh
run: |
Write-Host "Running smoke test: pnpm start -- -- --version"
pnpm start -- -- --version
if ($LASTEXITCODE -ne 0) {
Write-Host "`n❌ Smoke test failed (exit code: $LASTEXITCODE)"
exit $LASTEXITCODE
}
Write-Host "`n✓ Smoke test passed"
- name: Send Slack notification on failure
if: failure() && inputs.notify_on_failure == true
uses: slackapi/slack-github-action@91efab103c0de0a537f72a35f6b8cda0ee76bf0a # v2.1.1

View File

@@ -1,83 +0,0 @@
name: Chromatic
on:
schedule:
- cron: '0 0 * * *'
workflow_dispatch:
pull_request_review:
types: [submitted]
concurrency:
group: chromatic-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
changeset:
runs-on: blacksmith-2vcpu-ubuntu-2204
steps:
- name: Determine changed files
uses: tomi/paths-filter-action@v3.0.2
id: changed
if: github.event_name == 'pull_request_review'
with:
filters: |
design_system:
- 'packages/frontend/@n8n/design-system/**'
- '.github/workflows/storybook.yml'
outputs:
has_changes: ${{ steps.changed.outputs.design_system || 'false' }}
chromatic:
needs: [changeset]
if: |
github.event_name == 'schedule' ||
github.event_name == 'workflow_dispatch' ||
(
github.event_name == 'pull_request_review' &&
needs.changeset.outputs.has_changes == 'true' &&
github.event.review.state == 'approved' &&
!startsWith(github.event.pull_request.head.ref, 'release/') &&
!startsWith(github.event.pull_request.head.ref, 'release-pr/')
)
runs-on: blacksmith-2vcpu-ubuntu-2204
steps:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
with:
fetch-depth: 0
- name: Setup Node.js
uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
with:
build-command: pnpm run build --filter=@n8n/utils --filter=@n8n/vitest-config --filter=@n8n/design-system
- name: Publish to Chromatic
uses: chromaui/action@1cfa065cbdab28f6ca3afaeb3d761383076a35aa # v11
id: chromatic_tests
continue-on-error: true
with:
workingDir: packages/frontend/@n8n/design-system
autoAcceptChanges: 'master'
skip: 'release/**'
onlyChanged: true
projectToken: ${{ secrets.CHROMATIC_PROJECT_TOKEN }}
exitZeroOnChanges: false
- name: Success comment
if: steps.chromatic_tests.outcome == 'success' && github.ref != 'refs/heads/master'
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
token: ${{ secrets.GITHUB_TOKEN }}
edit-mode: replace
body: |
:white_check_mark: No visual regressions found.
- name: Fail comment
if: steps.chromatic_tests.outcome != 'success' && github.ref != 'refs/heads/master'
uses: peter-evans/create-or-update-comment@71345be0265236311c031f5c7866368bd1eff043 # v4.0.0
with:
issue-number: ${{ github.event.pull_request.number }}
token: ${{ secrets.GITHUB_TOKEN }}
edit-mode: replace
body: |
[:warning: Visual regressions found](${{steps.chromatic_tests.outputs.url}}): ${{steps.chromatic_tests.outputs.changeCount}}

View File

@@ -13,7 +13,7 @@
#
# It outputs `should_run` as 'true' if ALL conditions pass, 'false' otherwise.
name: PR Eligibility Check
name: 'CI: Check Eligibility'
on:
workflow_call:

View File

@@ -1,4 +1,4 @@
name: Check PR title
name: 'CI: Check PR Title'
on:
pull_request:

View File

@@ -0,0 +1,69 @@
name: 'CI: Block fork PRs to release branches'
on:
pull_request:
branches:
- 'release/**'
types:
- opened
- reopened
- synchronize
- ready_for_review
- edited
jobs:
block-fork-prs:
runs-on: ubuntu-slim
permissions:
pull-requests: write
contents: read
steps:
- name: Check if PR is from a fork
id: check
run: |
if [ "${{ github.event.pull_request.head.repo.fork }}" = "true" ]; then
echo "fork=true" >> "$GITHUB_OUTPUT"
else
echo "fork=false" >> "$GITHUB_OUTPUT"
fi
- name: Comment on PR explaining the block
if: steps.check.outputs.fork == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
const { data: comments } = await github.rest.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.pull_request.number,
});
const alreadyCommented = comments.some(
(c) => c.user.login === 'github-actions[bot]' && c.body.includes('Pull request blocked')
);
if (!alreadyCommented) {
const body = `
🚫 **Pull request blocked**
Pull requests from **forked repositories** are not allowed to target **release branches** in this repository.
**Target branch:** \`${context.payload.pull_request.base.ref}\`
If you believe this was blocked in error, contact the repository maintainers.
`;
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: context.payload.pull_request.number,
body
});
}
- name: Fail workflow if from fork
if: steps.check.outputs.fork == 'true'
run: |
echo "PR from fork targeting a release branch is not allowed."
exit 1
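The fork-detection step above can be reproduced locally as a small shell sketch. `EVENT_FORK` is a stand-in variable for the `github.event.pull_request.head.repo.fork` context value, and the default output path is hypothetical; inside a real runner, `GITHUB_OUTPUT` is provided by GitHub Actions:

```shell
#!/usr/bin/env bash
# Sketch of the fork check: write fork=true/false to a step-output file,
# mirroring the workflow's writes to $GITHUB_OUTPUT.
EVENT_FORK="${EVENT_FORK:-true}"                 # stand-in for github.event.pull_request.head.repo.fork
GITHUB_OUTPUT="${GITHUB_OUTPUT:-/tmp/step_output}"  # hypothetical local path

if [ "$EVENT_FORK" = "true" ]; then
  echo "fork=true" >> "$GITHUB_OUTPUT"
else
  echo "fork=false" >> "$GITHUB_OUTPUT"
fi
cat "$GITHUB_OUTPUT"
```

Later steps then read `steps.check.outputs.fork` to decide whether to comment and fail, so quoting the boolean comparison exactly as above matters.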

View File

@@ -1,57 +0,0 @@
name: Run Workflow Builder Evals
on:
push:
branches:
- master
paths:
- 'packages/@n8n/ai-workflow-builder.ee/**'
- '.github/workflows/ci-evals.yml'
schedule:
- cron: '0 22 * * 6'
workflow_dispatch:
inputs:
branch:
description: 'GitHub branch to test.'
required: false
default: 'master'
dataset:
description: 'LangSmith dataset to use.'
required: false
default: 'workflow-builder-canvas-prompts'
jobs:
evals:
name: Run Evaluations
runs-on: blacksmith-2vcpu-ubuntu-2204
env:
N8N_AI_ANTHROPIC_KEY: ${{ secrets.EVALS_ANTHROPIC_KEY }}
LANGSMITH_TRACING: true
LANGSMITH_ENDPOINT: ${{ secrets.EVALS_LANGSMITH_ENDPOINT }}
LANGSMITH_API_KEY: ${{ secrets.EVALS_LANGSMITH_API_KEY }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
ref: ${{ github.event.inputs.branch || github.ref }}
- name: Select dataset
run: |
DATASET="workflow-builder-canvas-prompts"
if [ "${{ github.event_name }}" = "schedule" ]; then
DATASET="prompts-v2"
elif [ -n "${{ github.event.inputs.dataset }}" ]; then
DATASET="${{ github.event.inputs.dataset }}"
fi
echo "LANGSMITH_DATASET_NAME=$DATASET" >> "$GITHUB_ENV"
- name: Setup and Build
uses: ./.github/actions/setup-nodejs-blacksmith
- name: Export Node Types
run: |
./packages/cli/bin/n8n export:nodes --output ./packages/@n8n/ai-workflow-builder.ee/evaluations/nodes.json
- name: Run Evaluations
working-directory: packages/@n8n/ai-workflow-builder.ee/evaluations
run: |
pnpm eval:langsmith --repetitions 3

View File

@@ -1,4 +1,4 @@
name: Build, unit test and lint (manual trigger)
name: 'CI: Manual Unit Tests'
on:
workflow_dispatch:
@@ -25,7 +25,7 @@ jobs:
steps:
- name: Create pending check run on PR
id: create
uses: actions/github-script@v7
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |
@@ -55,7 +55,7 @@ jobs:
ref: ${{ inputs.ref }}
- name: Setup and Build
uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
uses: ./.github/actions/setup-nodejs
- name: Run format check
run: pnpm format:check
@@ -66,7 +66,7 @@
unit-tests:
name: Unit tests
needs: install-and-build
uses: ./.github/workflows/units-tests-reusable.yml
uses: ./.github/workflows/test-unit-reusable.yml
with:
ref: ${{ inputs.ref }}
collectCoverage: true
@@ -76,7 +76,7 @@
lint:
name: Lint
needs: install-and-build
uses: ./.github/workflows/linting-reusable.yml
uses: ./.github/workflows/test-linting-reusable.yml
with:
ref: ${{ inputs.ref }}
@@ -88,7 +88,7 @@
steps:
- name: Update check run on PR (if triggered from PR comment)
if: inputs.pr_number != ''
uses: actions/github-script@v7
uses: actions/github-script@f28e40c7f34bde8b3046d885e986cb6290c5673b # v7
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
script: |

View File

@@ -1,4 +1,4 @@
name: Test Master
name: 'CI: Master (Build, Test, Lint)'
on:
push:
@@ -12,35 +12,45 @@ jobs:
build-github:
name: Build for Github Cache
runs-on: ubuntu-latest
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
QA_METRICS_WEBHOOK_URL: ${{ secrets.QA_METRICS_WEBHOOK_URL }}
QA_METRICS_WEBHOOK_USER: ${{ secrets.QA_METRICS_WEBHOOK_USER }}
QA_METRICS_WEBHOOK_PASSWORD: ${{ secrets.QA_METRICS_WEBHOOK_PASSWORD }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Setup and Build
uses: ./.github/actions/setup-nodejs-github
uses: ./.github/actions/setup-nodejs
unit-test:
name: Unit tests
uses: ./.github/workflows/units-tests-reusable.yml
uses: ./.github/workflows/test-unit-reusable.yml
strategy:
fail-fast: false
matrix:
node-version: [20.x, 22.x, 24.3.x]
node-version: [22.x, 24.13.1, 25.x]
with:
ref: ${{ github.sha }}
nodeVersion: ${{ matrix.node-version }}
collectCoverage: ${{ matrix.node-version == '22.x' }}
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
collectCoverage: ${{ matrix.node-version == '24.13.1' }}
secrets: inherit
lint:
name: Lint
uses: ./.github/workflows/linting-reusable.yml
uses: ./.github/workflows/test-linting-reusable.yml
with:
ref: ${{ github.sha }}
performance:
name: Performance
uses: ./.github/workflows/test-bench-reusable.yml
with:
ref: ${{ github.sha }}
notify-on-failure:
name: Notify Slack on failure
runs-on: ubuntu-latest
needs: [unit-test, lint, build-github]
needs: [unit-test, lint, performance, build-github]
steps:
- name: Notify Slack on failure
uses: act10ns/slack@44541246747a30eb3102d87f7a4cc5471b0ffb7d # v2.1.0

View File

@@ -1,146 +1,182 @@
name: Build, unit test and lint branch
name: 'CI: Pull Requests (Build, Test, Lint)'
on:
pull_request:
branches:
- '**'
- '!release/*'
merge_group:
concurrency:
group: ci-${{ github.event.pull_request.number || github.ref }}
group: ci-${{ github.event.pull_request.number || github.event.merge_group.head_sha || github.ref }}
cancel-in-progress: true
env:
COVERAGE_ENABLED: 'true' # Set globally for all jobs - ensures Turbo cache consistency
jobs:
install-and-build:
name: Install & Build
runs-on: blacksmith-2vcpu-ubuntu-2204
runs-on: ${{ vars.RUNNER_PROVIDER == 'github' && 'ubuntu-latest' || 'blacksmith-2vcpu-ubuntu-2204' }}
env:
NODE_OPTIONS: '--max-old-space-size=6144'
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
QA_METRICS_WEBHOOK_URL: ${{ secrets.QA_METRICS_WEBHOOK_URL }}
QA_METRICS_WEBHOOK_USER: ${{ secrets.QA_METRICS_WEBHOOK_USER }}
QA_METRICS_WEBHOOK_PASSWORD: ${{ secrets.QA_METRICS_WEBHOOK_PASSWORD }}
outputs:
frontend_changed: ${{ steps.paths-filter.outputs.frontend == 'true' }}
non_python_changed: ${{ steps.paths-filter.outputs.non-python == 'true' }}
ci: ${{ fromJSON(steps.ci-filter.outputs.results).ci == true }}
unit: ${{ fromJSON(steps.ci-filter.outputs.results).unit == true }}
e2e: ${{ fromJSON(steps.ci-filter.outputs.results).e2e == true }}
workflows: ${{ fromJSON(steps.ci-filter.outputs.results).workflows == true }}
workflow_scripts: ${{ fromJSON(steps.ci-filter.outputs.results)['workflow-scripts'] == true }}
db: ${{ fromJSON(steps.ci-filter.outputs.results).db == true }}
performance: ${{ fromJSON(steps.ci-filter.outputs.results).performance == true }}
commit_sha: ${{ steps.commit-sha.outputs.sha }}
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: refs/pull/${{ github.event.pull_request.number }}/merge
# Use merge_group SHA when in merge queue, otherwise PR merge ref
ref: ${{ github.event_name == 'merge_group' && github.event.merge_group.head_sha || format('refs/pull/{0}/merge', github.event.pull_request.number) }}
- name: Check for frontend changes
uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
id: paths-filter
- name: Capture commit SHA for cache consistency
id: commit-sha
run: echo "sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"
- name: Check for relevant changes
uses: ./.github/actions/ci-filter
id: ci-filter
with:
mode: filter
filters: |
frontend:
- packages/frontend/**
- packages/@n8n/design-system/**
- packages/@n8n/chat/**
- packages/@n8n/codemirror-lang/**
- .bundlemonrc.json
- .github/workflows/ci-pull-requests.yml
non-python:
- '**'
- '!packages/@n8n/task-runner-python/**'
ci:
**
!packages/@n8n/task-runner-python/**
!.github/**
unit:
**
!packages/@n8n/task-runner-python/**
!packages/testing/playwright/**
!.github/**
e2e:
.github/workflows/test-e2e-*.yml
.github/scripts/cleanup-ghcr-images.mjs
packages/testing/playwright/**
packages/testing/containers/**
workflows: .github/**
workflow-scripts: .github/scripts/**
design-system:
packages/frontend/@n8n/design-system/**
packages/frontend/@n8n/chat/**
packages/frontend/@n8n/storybook/**
.github/workflows/test-visual-chromatic.yml
db:
packages/cli/src/databases/**
packages/cli/src/modules/*/database/**
packages/cli/src/modules/**/*.entity.ts
packages/cli/src/modules/**/*.repository.ts
packages/cli/test/integration/**
packages/cli/test/migration/**
packages/cli/test/shared/db/**
packages/@n8n/db/**
packages/cli/**/__tests__/**
packages/testing/containers/services/postgres.ts
.github/workflows/test-db-reusable.yml
- name: Setup and Build
if: steps.paths-filter.outputs.non-python == 'true'
uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
if: fromJSON(steps.ci-filter.outputs.results).ci
uses: ./.github/actions/setup-nodejs
- name: Run format check
if: steps.paths-filter.outputs.non-python == 'true'
if: fromJSON(steps.ci-filter.outputs.results).ci
run: pnpm format:check
- name: Upload Frontend Build Artifacts
if: steps.paths-filter.outputs.frontend == 'true'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: editor-ui-dist
path: packages/frontend/editor-ui/dist/
retention-days: 1
bundle-size-check:
name: Bundle Size Check
needs: install-and-build
if: needs.install-and-build.outputs.frontend_changed == 'true'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
ref: refs/pull/${{ github.event.pull_request.number }}/merge
- name: Setup pnpm CLI
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Setup Node.js
uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: '22.x'
cache: pnpm
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Download Frontend Build Artifacts
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
with:
name: editor-ui-dist
path: packages/frontend/editor-ui/dist/
- name: BundleMon
uses: lironer/bundlemon-action@cadbdd58f86faf1900725ef69d455444124b3748 # v1.3.0
unit-test:
name: Unit tests
if: needs.install-and-build.outputs.non_python_changed == 'true'
uses: ./.github/workflows/units-tests-reusable.yml
if: needs.install-and-build.outputs.unit == 'true'
uses: ./.github/workflows/test-unit-reusable.yml
needs: install-and-build
with:
ref: refs/pull/${{ github.event.pull_request.number }}/merge
ref: ${{ needs.install-and-build.outputs.commit_sha }}
collectCoverage: true
secrets:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
secrets: inherit
typecheck:
name: Typecheck
if: needs.install-and-build.outputs.non_python_changed == 'true'
runs-on: blacksmith-4vcpu-ubuntu-2204
if: needs.install-and-build.outputs.ci == 'true'
runs-on: ${{ vars.RUNNER_PROVIDER == 'github' && 'ubuntu-latest' || 'blacksmith-4vcpu-ubuntu-2204' }}
needs: install-and-build
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: refs/pull/${{ github.event.pull_request.number }}/merge
ref: ${{ needs.install-and-build.outputs.commit_sha }}
- name: Setup Node.js
uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
uses: ./.github/actions/setup-nodejs
with:
build-command: pnpm typecheck
lint:
name: Lint
if: needs.install-and-build.outputs.non_python_changed == 'true'
uses: ./.github/workflows/linting-reusable.yml
if: needs.install-and-build.outputs.ci == 'true'
uses: ./.github/workflows/test-linting-reusable.yml
needs: install-and-build
with:
ref: refs/pull/${{ github.event.pull_request.number }}/merge
ref: ${{ needs.install-and-build.outputs.commit_sha }}
e2e-test:
e2e-tests:
name: E2E Tests
needs: [install-and-build, unit-test, typecheck, lint]
if: |
always() &&
needs.install-and-build.result == 'success' &&
needs.unit-test.result != 'failure' &&
needs.typecheck.result != 'failure' &&
needs.lint.result != 'failure'
uses: ./.github/workflows/playwright-test-reusable.yml
needs: install-and-build
if: (needs.install-and-build.outputs.ci == 'true' || needs.install-and-build.outputs.e2e == 'true') && github.repository == 'n8n-io/n8n'
uses: ./.github/workflows/test-e2e-ci-reusable.yml
with:
branch: ${{ needs.install-and-build.outputs.commit_sha }}
secrets: inherit
e2e-checks:
name: E2E - Checks
runs-on: ubuntu-latest
needs: [e2e-test]
db-tests:
name: DB Tests
needs: install-and-build
if: needs.install-and-build.outputs.db == 'true'
uses: ./.github/workflows/test-db-reusable.yml
with:
ref: ${{ needs.install-and-build.outputs.commit_sha }}
security-checks:
name: Security Checks
needs: install-and-build
if: needs.install-and-build.outputs.workflows == 'true'
uses: ./.github/workflows/sec-ci-reusable.yml
with:
ref: ${{ needs.install-and-build.outputs.commit_sha }}
secrets: inherit
workflow-scripts:
name: Workflow scripts
needs: install-and-build
if: needs.install-and-build.outputs.workflow_scripts == 'true'
uses: ./.github/workflows/test-workflow-scripts-reusable.yml
with:
ref: ${{ needs.install-and-build.outputs.commit_sha }}
secrets: inherit
# This job is required by GitHub branch protection rules.
# PRs cannot be merged unless this job passes.
required-checks:
name: Required Checks
needs:
[
install-and-build,
unit-test,
typecheck,
lint,
e2e-tests,
db-tests,
security-checks,
workflow-scripts,
]
if: always()
runs-on: ubuntu-slim
steps:
- name: Fail if E2E tests failed
if: needs.e2e-test.result == 'failure'
run: exit 1
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
sparse-checkout: .github/actions/ci-filter
sparse-checkout-cone-mode: false
- name: Validate required checks
uses: ./.github/actions/ci-filter
with:
mode: validate
job-results: ${{ toJSON(needs) }}

View File

@@ -1,4 +1,4 @@
name: Python CI
name: 'CI: Python'
on:
pull_request:
@@ -18,10 +18,10 @@ jobs:
working-directory: packages/@n8n/task-runner-python
steps:
- name: Check out project
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Install uv
uses: astral-sh/setup-uv@d9e0f98d3fc6adb07d1e3d37f3043649ddad06a1 # 6.5.0
uses: astral-sh/setup-uv@6ee6290f1cbc4156c0bdd66691b2c144ef8df19a # v7.4.0
with:
enable-cache: true
@@ -47,7 +47,7 @@ jobs:
run: uv run pytest --cov=src --cov-report=xml --cov-report=term-missing
- name: Upload coverage to Codecov
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
with:
token: ${{ secrets.CODECOV_TOKEN }}
files: packages/@n8n/task-runner-python/coverage.xml

View File

@@ -0,0 +1,92 @@
name: 'CI: Check merge source and destination'
on:
pull_request:
branches:
- master
- 1.x
permissions:
pull-requests: write
contents: read
jobs:
check_branch:
if: ${{ github.repository == 'n8n-io/n8n-private' }}
name: enforce-bundle-branches-only-in-private
runs-on: ubuntu-latest
steps:
- name: Validate head branch
id: validate
shell: bash
env:
HEAD_REF: ${{ github.head_ref }}
run: |
set -euo pipefail
head="$HEAD_REF"
if [[ "$head" == bundle/* ]]; then
echo "allowed=true" >> "$GITHUB_OUTPUT"
else
echo "allowed=false" >> "$GITHUB_OUTPUT"
fi
- name: Comment on PR (blocked)
if: ${{ steps.validate.outputs.allowed == 'false' }}
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8
with:
script: |
const owner = context.repo.owner;
const repo = context.repo.repo;
const issue_number = context.payload.pull_request.number;
const head = context.payload.pull_request.head.ref;
const base = context.payload.pull_request.base.ref;
const marker = "<!-- bundle-branch-only -->";
const body =
`${marker}\n` +
`🚫 **Merge blocked**: PRs into \`${base}\` are only allowed from branches named \`bundle/*\`.\n\n` +
`Current source branch: \`${head}\`\n\n` +
`Merge your developments into a bundle branch instead of directly merging to master or 1.x.`;
// Find an existing marker comment (to update instead of spamming)
const { data: comments } = await github.rest.issues.listComments({
owner,
repo,
issue_number,
per_page: 100,
});
const existing = comments.find(c => c.body && c.body.includes(marker));
if (existing) {
await github.rest.issues.updateComment({
owner,
repo,
comment_id: existing.id,
body,
});
} else {
await github.rest.issues.createComment({
owner,
repo,
issue_number,
body,
});
}
- name: Fail (blocked)
if: ${{ steps.validate.outputs.allowed == 'false' }}
env:
HEAD_REF: ${{ github.head_ref }}
run: |
echo "::error::You can only merge to master and 1.x from a bundle/* branch. Got '$HEAD_REF'."
exit 1
- name: Allowed
if: ${{ steps.validate.outputs.allowed == 'true' }}
env:
HEAD_REF: ${{ github.head_ref }}
BASE_REF: ${{ github.base_ref }}
run: |
echo "OK: '$HEAD_REF' can merge into '$BASE_REF'"
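The head-branch validation above relies on bash glob matching inside `[[ ... ]]`. A minimal sketch of that check, with `allowed_for` as a hypothetical helper name not present in the workflow:

```shell
#!/usr/bin/env bash
# Sketch of the bundle-branch gate: a bash glob match against bundle/*.
allowed_for() {
  local head="$1"
  if [[ "$head" == bundle/* ]]; then
    echo "allowed=true"
  else
    echo "allowed=false"
  fi
}

allowed_for "bundle/2026-05-batch"   # -> allowed=true
allowed_for "feature/my-change"      # -> allowed=false
```

Note that the unquoted `bundle/*` on the right-hand side of `==` is what makes this a glob match; quoting it would demand a literal `bundle/*` branch name.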

View File

@@ -1,69 +0,0 @@
name: Create Branch For Patch Release
on:
workflow_dispatch:
inputs:
commit_shas:
description: 'Comma-separated commit SHAs'
required: true
old_version:
description: 'Old version to be patched'
required: true
default: '1.0.0'
new_version:
description: 'The new patch version'
required: true
default: '1.0.1'
resumeUrl:
description: 'n8n workflow resume URL'
required: true
jobs:
create-branch:
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Validate inputs
run: |
if ! [[ "${{ inputs.old_version }}" =~ ^[0-9]+\.[0-9]+\.[0-9]+(-[0-9A-Za-z]+(\.[0-9A-Za-z]+)*)?$ ]]; then
echo "Invalid old version format: ${{ inputs.old_version }}"
exit 1
fi
if ! [[ "${{ inputs.new_version }}" =~ ^[0-9]+\.[0-9]+\.[0-9]+(-[0-9A-Za-z]+(\.[0-9A-Za-z]+)*)?$ ]]; then
echo "Invalid new version format: ${{ inputs.new_version }}"
exit 1
fi
- name: Notify if inputs are invalid
if: ${{ failure() }}
run: |
curl -X POST -H "Content-Type: application/json" -d '{ "success": false, "message": "The old or new version you provided is invalid, make sure they both follow the SemVer format" }' ${{ inputs.resumeUrl }}
exit 1
- name: Setup, cherry-pick and push branch
run: |
git config user.name "github-actions[bot]"
git config user.email "github-actions[bot]@users.noreply.github.com"
git switch "n8n@${{ inputs.old_version }}" --detach
BRANCH="patch/${{ inputs.new_version }}"
git checkout -b $BRANCH
IFS=',' read -ra SHAS <<< "${{ inputs.commit_shas }}"
for sha in "${SHAS[@]}"; do
sha=$(echo $sha | xargs)
if ! git merge-base --is-ancestor $sha HEAD; then
echo "Cherry-picking commit $sha"
git cherry-pick $sha
else
echo "Commit $sha is already in the branch, skipping"
fi
done
git push -f origin $BRANCH
- name: Notify if cherry-pick is successful
if: ${{ success() }}
run: |
curl -X POST -H "Content-Type: application/json" -d '{ "success": true }' ${{ inputs.resumeUrl }}
- name: Notify if cherry-pick is not successful
if: ${{ failure() }}
run: |
curl -X POST -H "Content-Type: application/json" -d '{ "success": false, "message": "There was a conflict when trying to create the branch, please do the cherry-pick and resolve the conflicts manually or do not include the PRs that caused the conflict" }' ${{ inputs.resumeUrl }}
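The input handling in the removed workflow above can be exercised outside the runner. This sketch reuses the workflow's SemVer regex and its comma-split-plus-`xargs`-trim loop; `is_semver` and the sample SHA list are illustrative names, not part of the workflow:

```shell
#!/usr/bin/env bash
# Sketch of the patch-release input handling: SemVer validation with the
# same regex as the workflow, then splitting a comma-separated SHA list.
is_semver() {
  [[ "$1" =~ ^[0-9]+\.[0-9]+\.[0-9]+(-[0-9A-Za-z]+(\.[0-9A-Za-z]+)*)?$ ]]
}

is_semver "1.0.1" && echo "1.0.1 is valid"
is_semver "1.0" || echo "1.0 is rejected"

# Split on commas; xargs trims the surrounding whitespace, as in the workflow.
COMMIT_SHAS="abc123, def456 ,789aaa"
IFS=',' read -ra SHAS <<< "$COMMIT_SHAS"
for sha in "${SHAS[@]}"; do
  sha=$(echo "$sha" | xargs)
  echo "would cherry-pick: $sha"
done
```

In the workflow each trimmed SHA is then checked with `git merge-base --is-ancestor "$sha" HEAD` so commits already reachable from the branch are skipped instead of re-picked.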

View File

@@ -7,7 +7,7 @@ name: 'Docker: Build and Push'
env:
NODE_OPTIONS: '--max-old-space-size=7168'
NODE_VERSION: '22.21.0'
NODE_VERSION: '24.13.1'
on:
schedule:
@@ -42,18 +42,6 @@ on:
required: false
type: string
pull_request:
types:
- opened
- ready_for_review
paths:
- '.github/workflows/docker-build-push.yml'
- '.github/scripts/docker/docker-config.mjs'
- '.github/scripts/docker/docker-tags.mjs'
- 'docker/images/n8n/Dockerfile'
- 'docker/images/runners/Dockerfile'
- 'docker/images/runners/Dockerfile.distroless'
jobs:
determine-build-context:
name: Determine Build Context
@@ -66,24 +54,28 @@ jobs:
build_matrix: ${{ steps.context.outputs.build_matrix }}
steps:
- name: Checkout code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Determine build context
id: context
env:
N8N_VERSION: ${{ inputs.n8n_version }}
RELEASE_TYPE: ${{ inputs.release_type }}
PUSH_ENABLED: ${{ inputs.push_enabled }}
run: |
node .github/scripts/docker/docker-config.mjs \
--event "${{ github.event_name }}" \
--pr "${{ github.event.pull_request.number }}" \
--branch "${{ github.ref_name }}" \
--version "${{ inputs.n8n_version }}" \
--release-type "${{ inputs.release_type }}" \
--push-enabled "${{ inputs.push_enabled }}"
--version "$N8N_VERSION" \
--release-type "$RELEASE_TYPE" \
--push-enabled "$PUSH_ENABLED"
build-and-push-docker:
name: Build App, then Build and Push Docker Image (${{ matrix.platform }})
needs: determine-build-context
runs-on: ${{ matrix.runner }}
timeout-minutes: 15
timeout-minutes: 25
strategy:
matrix: ${{ fromJSON(needs.determine-build-context.outputs.build_matrix) }}
outputs:
@@ -93,14 +85,18 @@ jobs:
runners_distroless_primary_ghcr_manifest_tag: ${{ steps.determine-tags.outputs.runners_distroless_primary_tag }}
steps:
- name: Checkout code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
- name: Setup and Build
uses: ./.github/actions/setup-nodejs-blacksmith
uses: ./.github/actions/setup-nodejs
with:
build-command: pnpm build:n8n
enable-docker-cache: 'true'
env:
RELEASE: ${{ needs.determine-build-context.outputs.n8n_version }}
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Determine Docker tags for all images
id: determine-tags
@@ -116,28 +112,18 @@
echo "${key}: ${value%%,*}..." # Show first tag for brevity
done
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@b5ca514318bd6ebac0fb2aedd5d36ec1b5c232a2 # v3.10.0
- name: Login to GitHub Container Registry
- name: Login to Docker registries
if: needs.determine-build-context.outputs.push_enabled == 'true'
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
uses: ./.github/actions/docker-registry-login
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to Docker Hub
if: |
needs.determine-build-context.outputs.push_enabled == 'true' &&
needs.determine-build-context.outputs.push_to_docker == 'true'
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
login-ghcr: true
login-dockerhub: ${{ needs.determine-build-context.outputs.push_to_docker == 'true' }}
dockerhub-username: ${{ secrets.DOCKER_USERNAME }}
dockerhub-password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build and push n8n Docker image
uses: useblacksmith/build-push-action@574eb0ee0b59c6a687ace24192f0727dfb65d6d7 # v1.2
id: build-n8n
uses: useblacksmith/build-push-action@30c71162f16ea2c27c3e21523255d209b8b538c1 # v2
with:
context: .
file: ./docker/images/n8n/Dockerfile
@@ -146,13 +132,14 @@ jobs:
N8N_VERSION=${{ needs.determine-build-context.outputs.n8n_version }}
N8N_RELEASE_TYPE=${{ needs.determine-build-context.outputs.release_type }}
platforms: ${{ matrix.docker_platform }}
provenance: true
provenance: false # Disabled - using SLSA L3 generator for isolated provenance
sbom: true
push: ${{ needs.determine-build-context.outputs.push_enabled == 'true' }}
tags: ${{ steps.determine-tags.outputs.n8n_tags }}
- name: Build and push task runners Docker image (Alpine)
uses: useblacksmith/build-push-action@574eb0ee0b59c6a687ace24192f0727dfb65d6d7 # v1.2
id: build-runners
uses: useblacksmith/build-push-action@30c71162f16ea2c27c3e21523255d209b8b538c1 # v2
with:
context: .
file: ./docker/images/runners/Dockerfile
@@ -161,13 +148,14 @@ jobs:
N8N_VERSION=${{ needs.determine-build-context.outputs.n8n_version }}
N8N_RELEASE_TYPE=${{ needs.determine-build-context.outputs.release_type }}
platforms: ${{ matrix.docker_platform }}
provenance: true
provenance: false # Disabled - using SLSA L3 generator for isolated provenance
sbom: true
push: ${{ needs.determine-build-context.outputs.push_enabled == 'true' }}
tags: ${{ steps.determine-tags.outputs.runners_tags }}
- name: Build and push task runners Docker image (distroless)
uses: useblacksmith/build-push-action@574eb0ee0b59c6a687ace24192f0727dfb65d6d7 # v1.2
id: build-runners-distroless
uses: useblacksmith/build-push-action@30c71162f16ea2c27c3e21523255d209b8b538c1 # v2
with:
context: .
file: ./docker/images/runners/Dockerfile.distroless
@@ -176,7 +164,7 @@ jobs:
N8N_VERSION=${{ needs.determine-build-context.outputs.n8n_version }}
N8N_RELEASE_TYPE=${{ needs.determine-build-context.outputs.release_type }}
platforms: ${{ matrix.docker_platform }}
provenance: true
provenance: false # Disabled - using SLSA L3 generator for isolated provenance
sbom: true
push: ${{ needs.determine-build-context.outputs.push_enabled == 'true' }}
tags: ${{ steps.determine-tags.outputs.runners_distroless_tags }}
@@ -188,23 +176,27 @@ jobs:
if: |
needs.build-and-push-docker.result == 'success' &&
needs.determine-build-context.outputs.push_enabled == 'true'
outputs:
n8n_digest: ${{ steps.get-digests.outputs.n8n_digest }}
n8n_image: ${{ steps.get-digests.outputs.n8n_image }}
runners_digest: ${{ steps.get-digests.outputs.runners_digest }}
runners_image: ${{ steps.get-digests.outputs.runners_image }}
runners_distroless_digest: ${{ steps.get-digests.outputs.runners_distroless_digest }}
runners_distroless_image: ${{ steps.get-digests.outputs.runners_distroless_image }}
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
- name: Login to GitHub Container Registry
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
- name: Login to Docker registries
uses: ./.github/actions/docker-registry-login
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Login to Docker Hub
if: needs.determine-build-context.outputs.push_to_docker == 'true'
uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
login-ghcr: true
login-dockerhub: ${{ needs.determine-build-context.outputs.push_to_docker == 'true' }}
dockerhub-username: ${{ secrets.DOCKER_USERNAME }}
dockerhub-password: ${{ secrets.DOCKER_PASSWORD }}
- name: Create GHCR multi-arch manifests
run: |
@@ -264,6 +256,14 @@ jobs:
"${DOCKER_BASE}/${IMAGE_NAME}:${TAG_SUFFIX}-arm64"
done
- name: Get manifest digests for attestation
id: get-digests
env:
N8N_TAG: ${{ needs.build-and-push-docker.outputs.primary_ghcr_manifest_tag }}
RUNNERS_TAG: ${{ needs.build-and-push-docker.outputs.runners_primary_ghcr_manifest_tag }}
DISTROLESS_TAG: ${{ needs.build-and-push-docker.outputs.runners_distroless_primary_ghcr_manifest_tag }}
run: node .github/scripts/docker/get-manifest-digests.mjs
call-success-url:
name: Call Success URL
needs: [create_multi_arch_manifest]
@@ -279,6 +279,113 @@ jobs:
curl -v "${{ env.SUCCESS_URL }}" || echo "Failed to call success URL"
shell: bash
# SLSA L3 Provenance - Must use version tags (@vX.Y.Z), NOT SHAs
provenance-n8n:
name: SLSA Provenance (n8n)
needs: [determine-build-context, build-and-push-docker, create_multi_arch_manifest]
if: |
needs.create_multi_arch_manifest.result == 'success' &&
needs.create_multi_arch_manifest.outputs.n8n_digest != ''
permissions:
id-token: write
packages: write
actions: read
uses: slsa-framework/slsa-github-generator/.github/workflows/generator_container_slsa3.yml@v2.1.0
with:
image: ${{ needs.create_multi_arch_manifest.outputs.n8n_image }}
digest: ${{ needs.create_multi_arch_manifest.outputs.n8n_digest }}
registry-username: ${{ github.actor }}
secrets:
registry-password: ${{ secrets.GITHUB_TOKEN }}
provenance-runners:
name: SLSA Provenance (runners)
needs: [determine-build-context, build-and-push-docker, create_multi_arch_manifest]
if: |
needs.create_multi_arch_manifest.result == 'success' &&
needs.create_multi_arch_manifest.outputs.runners_digest != ''
permissions:
id-token: write
packages: write
actions: read
uses: slsa-framework/slsa-github-generator/.github/workflows/generator_container_slsa3.yml@v2.1.0
with:
image: ${{ needs.create_multi_arch_manifest.outputs.runners_image }}
digest: ${{ needs.create_multi_arch_manifest.outputs.runners_digest }}
registry-username: ${{ github.actor }}
secrets:
registry-password: ${{ secrets.GITHUB_TOKEN }}
provenance-runners-distroless:
name: SLSA Provenance (runners-distroless)
needs: [determine-build-context, build-and-push-docker, create_multi_arch_manifest]
if: |
needs.create_multi_arch_manifest.result == 'success' &&
needs.create_multi_arch_manifest.outputs.runners_distroless_digest != ''
permissions:
id-token: write
packages: write
actions: read
uses: slsa-framework/slsa-github-generator/.github/workflows/generator_container_slsa3.yml@v2.1.0
with:
image: ${{ needs.create_multi_arch_manifest.outputs.runners_distroless_image }}
digest: ${{ needs.create_multi_arch_manifest.outputs.runners_distroless_digest }}
registry-username: ${{ github.actor }}
secrets:
registry-password: ${{ secrets.GITHUB_TOKEN }}
# VEX Attestation - Documents which CVEs affect us (security/vex.openvex.json)
vex-attestation:
name: VEX Attestation
needs: [determine-build-context, build-and-push-docker, create_multi_arch_manifest, provenance-n8n, provenance-runners, provenance-runners-distroless]
if: |
always() &&
needs.create_multi_arch_manifest.result == 'success' &&
(needs.determine-build-context.outputs.release_type == 'stable' ||
needs.determine-build-context.outputs.release_type == 'rc' ||
needs.determine-build-context.outputs.release_type == 'nightly')
runs-on: ubuntu-latest
permissions:
id-token: write
packages: write
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Install Cosign
uses: sigstore/cosign-installer@7e8b541eb2e61bf99390e1afd4be13a184e9ebc5 # v3.10.1
- name: Login to GHCR
uses: docker/login-action@b45d80f862d83dbcd57f89517bcf500b2ab88fb2 # v4.0.0
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Attest VEX to n8n image
if: needs.create_multi_arch_manifest.outputs.n8n_digest != ''
run: |
cosign attest --yes \
--type openvex \
--predicate security/vex.openvex.json \
${{ needs.create_multi_arch_manifest.outputs.n8n_image }}@${{ needs.create_multi_arch_manifest.outputs.n8n_digest }}
- name: Attest VEX to runners image
if: needs.create_multi_arch_manifest.outputs.runners_digest != ''
run: |
cosign attest --yes \
--type openvex \
--predicate security/vex.openvex.json \
${{ needs.create_multi_arch_manifest.outputs.runners_image }}@${{ needs.create_multi_arch_manifest.outputs.runners_digest }}
- name: Attest VEX to runners-distroless image
if: needs.create_multi_arch_manifest.outputs.runners_distroless_digest != ''
run: |
cosign attest --yes \
--type openvex \
--predicate security/vex.openvex.json \
${{ needs.create_multi_arch_manifest.outputs.runners_distroless_image }}@${{ needs.create_multi_arch_manifest.outputs.runners_distroless_digest }}
security-scan:
name: Security Scan
needs: [determine-build-context, build-and-push-docker, create_multi_arch_manifest]

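The provenance and VEX steps above all operate on `IMAGE@DIGEST` references, and a helper like `get-manifest-digests.mjs` ultimately resolves that digest, which by the OCI spec is the SHA-256 of the raw manifest bytes as served by the registry. A minimal sketch, using an illustrative stand-in payload (not a real n8n manifest):

```shell
# Sketch: an OCI image digest is "sha256:" plus the SHA-256 of the exact
# manifest bytes. The manifest below is a stand-in for illustration only.
manifest='{"schemaVersion":2,"mediaType":"application/vnd.oci.image.index.v1+json","manifests":[]}'

digest="sha256:$(printf '%s' "$manifest" | sha256sum | cut -d' ' -f1)"
echo "$digest"

# Attestation tooling (cosign, slsa-verifier) expects the IMAGE@DIGEST form:
echo "ghcr.io/example/n8n@${digest}"
```

Note that the digest must be computed over the byte-exact manifest; re-serializing the JSON would change it.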

@@ -0,0 +1,64 @@
name: 'Docker Build Smoke Test'
# Verifies the full Docker build chain works without any caching.
# Catches native module compilation failures (e.g., isolated-vm, sqlite3)
# that layer caching can mask in the regular E2E pipeline.
#
# Full chain: pnpm install → pnpm build (no Turbo cache) →
# build base image (no Docker cache) →
# build n8n + runners images (no Docker cache)
on:
schedule:
- cron: '0 3 * * *' # 3:00 AM UTC, after the nightly Docker build at midnight
pull_request:
paths:
- 'docker/images/n8n/**'
- 'docker/images/n8n-base/**'
- 'docker/images/runners/**'
- 'scripts/build-n8n.mjs'
- 'scripts/dockerize-n8n.mjs'
workflow_dispatch:
jobs:
docker-smoke-test:
name: 'Docker Build (no cache)'
runs-on: blacksmith-4vcpu-ubuntu-2204
if: ${{ !github.event.pull_request.head.repo.fork }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Login to DHI Registry (for base image)
uses: ./.github/actions/docker-registry-login
with:
login-ghcr: 'false'
login-dhi: 'true'
dockerhub-username: ${{ secrets.DOCKER_USERNAME }}
dockerhub-password: ${{ secrets.DOCKER_PASSWORD }}
- name: Build full chain (no cache)
uses: ./.github/actions/setup-nodejs
with:
build-command: 'pnpm build:docker:clean'
enable-docker-cache: true
- name: Verify n8n image starts
run: |
docker run --rm -d --name n8n-smoke-test n8nio/n8n:local
sleep 5
docker logs n8n-smoke-test 2>&1 | tail -20
docker stop n8n-smoke-test
notify-on-failure:
name: Notify on nightly smoke test failure
runs-on: ubuntu-slim
needs: [docker-smoke-test]
if: needs.docker-smoke-test.result == 'failure' && github.event_name == 'schedule'
steps:
- uses: slackapi/slack-github-action@91efab103c0de0a537f72a35f6b8cda0ee76bf0a # v2.1.1
with:
method: chat.postMessage
token: ${{ secrets.QBOT_SLACK_TOKEN }}
payload: |
channel: C0A9RLY8Y20
text: "🚨 Nightly Docker smoke test failed (no-cache build) - ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}"

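The verify step in the smoke test starts the container and then relies on a fixed `sleep 5` before checking logs. A hedged alternative is to poll until the service actually answers; a generic sketch follows (the `/healthz` curl in the comment is an assumption about the image's endpoint, not confirmed by the workflow):

```shell
# Sketch: poll a command until it succeeds or a timeout elapses.
# In the smoke test this could replace the fixed sleep, e.g.:
#   wait_for 60 curl -sf http://localhost:5678/healthz   # endpoint is an assumption
wait_for() {
  local timeout="$1"; shift
  local start=$SECONDS
  until "$@"; do
    if (( SECONDS - start >= timeout )); then
      return 1
    fi
    sleep 1
  done
}

wait_for 5 true && echo "ready"
```

The fixed sleep is simpler but races against slow cold starts; polling fails fast on a crash and waits only as long as needed otherwise.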

@@ -1,29 +0,0 @@
name: Playwright Tests - Nightly
on:
schedule:
- cron: '0 4 * * *'
workflow_dispatch:
inputs:
image:
description: 'Docker image to test against'
required: false
default: 'n8nio/n8n:nightly'
type: string
push:
branches:
- ci-containers-nightly
jobs:
test-configurations:
strategy:
fail-fast: false
matrix:
config: [standard, postgres]
name: Test ${{ matrix.config }}
uses: ./.github/workflows/playwright-test-reusable.yml
with:
test-mode: docker-pull
docker-image: ${{ github.event.inputs.image || 'n8nio/n8n:nightly' }}
test-command: pnpm --filter=n8n-playwright test:container:${{ matrix.config }}
secrets: inherit


@@ -1,50 +0,0 @@
name: Weekly Coverage Tests
on:
schedule:
- cron: '0 2 * * 1' # Every Monday at 2 AM
workflow_dispatch: # Allow manual triggering
env:
PLAYWRIGHT_BROWSERS_PATH: packages/testing/playwright/ms-playwright-cache
NODE_OPTIONS: --max-old-space-size=16384
TESTCONTAINERS_RYUK_DISABLED: true
PLAYWRIGHT_WORKERS: 4
jobs:
coverage:
runs-on: blacksmith-8vcpu-ubuntu-2204
name: Coverage Tests
steps:
- name: Checkout
uses: actions/checkout@08eba0b27e820071cde6df949e0beb9ba4906955 # v4.3.0
- name: Setup Environment
uses: ./.github/actions/setup-nodejs-blacksmith
with:
build-command: pnpm turbo build:playwright
- name: Build with Coverage
run: pnpm --filter n8n-editor-ui build:coverage
- name: Run Coverage Tests
run: |
pnpm --filter n8n-playwright test:local \
--workers=${{ env.PLAYWRIGHT_WORKERS }}
env:
BUILD_WITH_COVERAGE: 'true'
CURRENTS_RECORD_KEY: ${{ secrets.CURRENTS_RECORD_KEY }}
QA_PERFORMANCE_METRICS_WEBHOOK_URL: ${{ secrets.QA_PERFORMANCE_METRICS_WEBHOOK_URL }}
QA_PERFORMANCE_METRICS_WEBHOOK_USER: ${{ secrets.QA_PERFORMANCE_METRICS_WEBHOOK_USER }}
QA_PERFORMANCE_METRICS_WEBHOOK_PASSWORD: ${{ secrets.QA_PERFORMANCE_METRICS_WEBHOOK_PASSWORD }}
- name: Generate Coverage Report
run: pnpm --filter n8n-playwright coverage:report
- name: Upload Coverage Report
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: coverage-report
path: packages/testing/playwright/coverage/
retention-days: 14


@@ -1,14 +0,0 @@
name: Run Playwright Tests (Docker Build)
# This workflow is used to run Playwright tests in a Docker container built from the current branch
on:
workflow_call:
workflow_dispatch:
jobs:
build-and-test:
uses: ./.github/workflows/playwright-test-reusable.yml
with:
test-mode: docker-build
test-command: pnpm --filter=n8n-playwright test:container:standard
secrets: inherit


@@ -1,103 +0,0 @@
name: Playwright Tests - Reusable
on:
workflow_call:
inputs:
branch:
description: 'GitHub branch to test.'
required: false
type: string
test-mode:
description: 'Test mode: local (pnpm start from local), docker-build, or docker-pull'
required: false
default: 'local'
type: string
test-command:
description: 'Test command to run'
required: false
default: 'pnpm --filter=n8n-playwright test:local'
type: string
shards:
description: 'Shards for parallel execution'
required: false
default: '[1, 2, 3, 4, 5, 6, 7, 8]'
type: string
docker-image:
description: 'Docker image to use (for docker-pull mode)'
required: false
default: 'n8nio/n8n:nightly'
type: string
workers:
description: 'Number of parallel workers'
required: false
default: ''
type: string
secrets:
CURRENTS_RECORD_KEY:
required: false
QA_PERFORMANCE_METRICS_WEBHOOK_URL:
required: false
QA_PERFORMANCE_METRICS_WEBHOOK_USER:
required: false
QA_PERFORMANCE_METRICS_WEBHOOK_PASSWORD:
required: false
env:
PLAYWRIGHT_BROWSERS_PATH: packages/testing/playwright/ms-playwright-cache
NODE_OPTIONS: --max-old-space-size=3072
# Disable Ryuk because it requires privileged Docker access; containers are cleaned up on teardown anyway
TESTCONTAINERS_RYUK_DISABLED: true
PLAYWRIGHT_WORKERS: ${{ inputs.workers || '2' }} # Configurable workers, defaults to 2 to reduce resource contention
jobs:
test:
runs-on: blacksmith-2vcpu-ubuntu-2204
strategy:
fail-fast: false
matrix:
shard: ${{ fromJSON(inputs.shards || '[1, 2, 3, 4, 5, 6, 7, 8]') }}
name: Test (Shard ${{ matrix.shard }}/${{ strategy.job-total }})
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 1
ref: ${{ inputs.branch || github.ref }}
- name: Setup Environment
uses: ./.github/actions/setup-nodejs-blacksmith
with:
build-command: ${{ inputs.test-mode == 'docker-build' && 'pnpm build:docker' || 'pnpm turbo build:playwright' }}
enable-docker-cache: ${{ inputs.test-mode != 'local' }}
env:
INCLUDE_TEST_CONTROLLER: ${{ inputs.test-mode == 'docker-build' && 'true' || '' }}
- name: Install Browsers
if: inputs.test-mode == 'docker-build'
run: pnpm turbo install-browsers:ci
- name: Pre-pull Test Container Images
if: ${{ !contains(inputs.test-command, 'test:local') }}
run: |
# Pre-pull all test container images to avoid network changes during test execution
npx tsx packages/testing/containers/pull-test-images.ts || true
env:
N8N_DOCKER_IMAGE: ${{ inputs.test-mode == 'docker-build' && 'n8nio/n8n:local' || inputs.docker-image }}
- name: Prepare Test Image
if: inputs.test-mode == 'docker-pull'
run: pnpm --filter=n8n-playwright prepare-test-image ${{ inputs.docker-image }}
- name: Run Tests
run: |
${{ inputs.test-command }} \
--shard=${{ matrix.shard }}/${{ strategy.job-total }} \
--workers=${{ env.PLAYWRIGHT_WORKERS }}
env:
N8N_DOCKER_IMAGE: ${{ inputs.test-mode == 'docker-build' && 'n8nio/n8n:local' || inputs.docker-image }}
CURRENTS_RECORD_KEY: ${{ secrets.CURRENTS_RECORD_KEY }}
QA_PERFORMANCE_METRICS_WEBHOOK_URL: ${{ secrets.QA_PERFORMANCE_METRICS_WEBHOOK_URL }}
QA_PERFORMANCE_METRICS_WEBHOOK_USER: ${{ secrets.QA_PERFORMANCE_METRICS_WEBHOOK_USER }}
QA_PERFORMANCE_METRICS_WEBHOOK_PASSWORD: ${{ secrets.QA_PERFORMANCE_METRICS_WEBHOOK_PASSWORD }}


@@ -0,0 +1,15 @@
name: 'Release: Create Minor Release PR'
on:
workflow_dispatch:
# schedule:
# - cron: 0 13 * * 1
jobs:
create-release-pr:
name: Create release PR
uses: ./.github/workflows/release-create-pr.yml
secrets: inherit
with:
base-branch: master
release-type: minor


@@ -0,0 +1,55 @@
name: 'Release: Create Patch Release PR'
run-name: 'Release: Create Patch Release PR for track ${{ inputs.track }}'
on:
workflow_dispatch:
inputs:
track:
description: 'Release Track'
required: true
type: choice
options: [stable, beta, v1]
jobs:
determine-version-info:
name: Determine publishing track
runs-on: ubuntu-latest
outputs:
release_candidate_branch: ${{ steps.determine-branch.outputs.release_candidate_branch }}
should_update: ${{ steps.determine-branch.outputs.should_update }}
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
- name: Setup Node.js
uses: ./.github/actions/setup-nodejs
with:
build-command: ''
install-command: pnpm install --frozen-lockfile --dir ./.github/scripts --ignore-workspace
- name: Determine release candidate branch from track
id: determine-branch
env:
TRACK: ${{ inputs.track }}
run: node .github/scripts/determine-release-candidate-branch-for-track.mjs
skip-release-pr:
name: Skip release PR (no new commits)
needs: [determine-version-info]
if: needs.determine-version-info.outputs.should_update != 'true'
runs-on: ubuntu-latest
steps:
- name: Log skip reason
run: echo "No new commits found between the release candidate branch and the current release tag. Skipping PR creation."
create-release-pr:
name: Create release PR
needs: [determine-version-info]
if: needs.determine-version-info.outputs.should_update == 'true'
uses: ./.github/workflows/release-create-pr.yml
secrets: inherit
with:
base-branch: ${{ needs.determine-version-info.outputs.release_candidate_branch }}
release-type: patch


@@ -1,6 +1,18 @@
name: 'Release: Create Pull Request'
on:
workflow_call:
inputs:
base-branch:
description: 'The branch, tag, or commit to create this release PR from.'
required: true
type: string
release-type:
description: 'A SemVer release type.'
required: true
type: string
workflow_dispatch:
inputs:
base-branch:
@@ -18,7 +30,11 @@ on:
- minor
- major
- experimental
- rc
- premajor
permissions:
contents: write
pull-requests: write
jobs:
create-release-pr:
@@ -30,18 +46,35 @@ jobs:
timeout-minutes: 5
outputs:
pull-request-number: ${{ steps.create-pr.outputs.pull-request-number }}
steps:
- name: Generate GitHub App Token
id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
with:
app-id: ${{ secrets.N8N_ASSISTANT_APP_ID }}
private-key: ${{ secrets.N8N_ASSISTANT_PRIVATE_KEY }}
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
ref: ${{ github.event.inputs.base-branch }}
token: ${{ steps.generate_token.outputs.token }}
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
# Checkout base branch via separate step to prevent unsafe actions/checkout ref usage.
# poutine: untrusted_checkout_exec
- name: Switch to base branch
env:
BASE_BRANCH: ${{ inputs.base-branch }}
run: git checkout "$BASE_BRANCH"
- name: Setup Node.js
uses: ./.github/actions/setup-nodejs
with:
node-version: 22.x
- run: npm install --prefix=.github/scripts --no-package-lock
build-command: ''
install-command: pnpm install --frozen-lockfile --dir ./.github/scripts --ignore-workspace
- name: Setup corepack and pnpm
run: |
@@ -52,14 +85,14 @@ jobs:
run: |
echo "NEXT_RELEASE=$(node .github/scripts/bump-versions.mjs)" >> "$GITHUB_ENV"
env:
RELEASE_TYPE: ${{ github.event.inputs.release-type }}
RELEASE_TYPE: ${{ inputs.release-type }}
- name: Update Changelog
run: node .github/scripts/update-changelog.mjs
- name: Push the base branch
env:
BASE_BRANCH: ${{ github.event.inputs.base-branch }}
BASE_BRANCH: ${{ inputs.base-branch }}
run: |
git push -f origin "refs/remotes/origin/${{ env.BASE_BRANCH }}:refs/heads/release/${{ env.NEXT_RELEASE }}"
@@ -81,12 +114,23 @@ jobs:
fi
- name: Push the release branch, and Create the PR
uses: peter-evans/create-pull-request@c5a7806660adbe173f04e3e038b0ccdcd758773c # v6
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8.1.0
id: create-pr
with:
token: ${{ steps.generate_token.outputs.token }}
base: 'release/${{ env.NEXT_RELEASE }}'
branch: 'release-pr/${{ env.NEXT_RELEASE }}'
commit-message: ':rocket: Release ${{ env.NEXT_RELEASE }}'
delete-branch: true
labels: release,release:${{ github.event.inputs.release-type }}
labels: release,release:${{ inputs.release-type }}
title: ':rocket: Release ${{ env.NEXT_RELEASE }}'
body: ${{ steps.generate-body.outputs.content }}
approve-and-automerge:
needs: [create-release-pr]
if: |
needs.create-release-pr.outputs.pull-request-number != ''
uses: ./.github/workflows/util-approve-and-set-automerge.yml
secrets: inherit
with:
pull-request-number: ${{ needs.create-release-pr.outputs.pull-request-number }}

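The `git push -f origin "refs/remotes/origin/${{ env.BASE_BRANCH }}:refs/heads/release/${{ env.NEXT_RELEASE }}"` line in the release-PR workflow publishes a release branch directly from the remote-tracking ref, without ever checking it out. The same refspec mechanics can be exercised locally; in this sketch every path, branch, and version is a throwaway demo value:

```shell
# Sketch: create a bare "origin", then publish release/1.2.3 straight from
# the remote-tracking ref, as the release workflow does. Demo values only.
work=$(mktemp -d)
git init -q --bare "$work/origin.git"
git init -q "$work/clone"
cd "$work/clone"
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m init
git remote add origin "$work/origin.git"
git push -q origin HEAD:refs/heads/master
git fetch -q origin

# Push the remote-tracking ref to a new branch on origin; no checkout needed
git push -q -f origin "refs/remotes/origin/master:refs/heads/release/1.2.3"
git ls-remote --heads origin release/1.2.3
```

Pushing from `refs/remotes/origin/<branch>` guarantees the release branch starts from what origin actually has, regardless of local working-tree state.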

@@ -0,0 +1,66 @@
name: 'Release: Merge tag to branch'
run-name: 'Merge n8n@${{ inputs.version }} to ${{ inputs.target-branch }}'
on:
workflow_call:
inputs:
version:
description: 'The release version (e.g. 2.10.2)'
required: true
type: string
target-branch:
description: 'The branch to merge the release tag into (e.g. master or release-candidate/2.10.x)'
required: true
type: string
jobs:
merge-tag-to-branch:
name: Merge release tag to ${{ inputs.target-branch }}
runs-on: ubuntu-latest
environment: minor-release-tag-merge
env:
VERSION: ${{ inputs.version }}
TARGET_BRANCH: ${{ inputs.target-branch }}
steps:
- name: Generate GitHub App Token
id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
with:
app-id: ${{ secrets.RELEASE_TAG_MERGE_APP_ID }}
private-key: ${{ secrets.RELEASE_TAG_MERGE_PRIVATE_KEY }}
skip-token-revoke: false
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ inputs.target-branch }}
fetch-depth: 500
token: ${{ steps.generate_token.outputs.token }}
- name: Verify release tag exists
run: |
if ! git ls-remote --tags origin "refs/tags/n8n@${VERSION}" | grep -q .; then
echo "::error::Tag n8n@${VERSION} not found on remote"
exit 1
fi
- name: Fetch release tag
run: git fetch origin "refs/tags/n8n@${VERSION}:refs/tags/n8n@${VERSION}"
- name: Merge release tag to branch
run: |
git config user.name "n8n-release-tag-merge[bot]"
git config user.email "256767729+n8n-release-tag-merge[bot]@users.noreply.github.com"
git merge --ff "n8n@${VERSION}"
- name: Push to ${{ inputs.target-branch }}
run: git push origin "HEAD:${TARGET_BRANCH}"
- name: Notify Slack on failure
if: failure()
uses: act10ns/slack@44541246747a30eb3102d87f7a4cc5471b0ffb7d # v2.1.0
with:
status: ${{ job.status }}
channel: '#updates-and-product-releases'
webhook-url: ${{ secrets.SLACK_WEBHOOK_URL }}
message: |
<${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|Release tag merge to ${{ inputs.target-branch }} failed for n8n@${{ inputs.version }}>

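The tag-existence guard above (`git ls-remote --tags origin "refs/tags/n8n@${VERSION}" | grep -q .`) can be exercised against a throwaway local repository; versions here are demo values:

```shell
# Sketch: verify a release tag exists before merging, mirroring the
# workflow's guard. Repo and versions are throwaway demo values.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=ci -c user.email=ci@example.com commit -q --allow-empty -m init
git tag "n8n@2.10.2"

check_tag() {
  # ls-remote prints one line per matching ref; grep -q . tests non-empty output
  git ls-remote --tags . "refs/tags/n8n@$1" | grep -q .
}

check_tag 2.10.2 && echo "tag exists"
check_tag 9.9.9 || echo "tag missing"
```

Checking via `ls-remote` against the remote (rather than local tags) ensures the merge only proceeds once the tag has actually been pushed.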

@@ -0,0 +1,72 @@
name: 'Release: Populate cloud with releases'
run-name: 'Populate cloud with version n8n@${{ inputs.version }}'
on:
workflow_dispatch:
inputs:
previous-version:
description: 'The previous release version (e.g. 2.10.2)'
required: true
type: string
version:
description: 'The release version (e.g. 2.11.0)'
required: true
type: string
experimental:
description: 'If publishing experimental version'
type: boolean
default: false
workflow_call:
inputs:
previous-version:
description: 'The previous release version (e.g. 2.10.2)'
required: true
type: string
version:
description: 'The release version (e.g. 2.11.0)'
required: true
type: string
experimental:
description: 'If publishing experimental version'
type: boolean
default: false
jobs:
determine-changes:
runs-on: ubuntu-slim
environment: release
outputs:
has_node_enhancements: ${{ steps.get-changes.outputs.has_node_enhancements }}
has_core_changes: ${{ steps.get-changes.outputs.has_core_changes }}
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
- name: Setup Node.js
uses: ./.github/actions/setup-nodejs
with:
build-command: ''
install-command: pnpm install --frozen-lockfile --dir ./.github/scripts --ignore-workspace
- name: Extract changes
id: get-changes
env:
PREVIOUS_VERSION_TAG: 'n8n@${{ inputs.previous-version }}'
RELEASE_VERSION_TAG: 'n8n@${{ inputs.version }}'
run: node ./.github/scripts/determine-release-version-changes.mjs
- name: Populate databases
id: populate-databases
env:
N8N_POPULATE_CLOUD_WEBHOOK_DATA: ${{ secrets.N8N_POPULATE_CLOUD_WEBHOOK_DATA }}
PAYLOAD: |
{
"target_version_to_update": "${{ inputs.version }}",
"has_node_enhancements": ${{ steps.get-changes.outputs.has_node_enhancements }},
"has_core_changes": ${{ steps.get-changes.outputs.has_core_changes }},
"has_breaking_change": false,
"is_experimental": ${{ inputs.experimental }}
}
run: node ./.github/scripts/populate-cloud-databases.mjs


@@ -0,0 +1,74 @@
name: 'Release: Publish: Post-release'
on:
workflow_call:
inputs:
track:
description: 'Release track acquired from determine-version-info. (e.g. stable, beta)'
required: true
type: string
previous_version:
description: 'Previous release version acquired from determine-version-info. (e.g. 2.9.2, 1.123.22)'
required: true
type: string
version:
description: 'Release version acquired from determine-version-info. (e.g. 2.9.3, 1.123.23)'
required: true
type: string
bump:
description: 'Release bump size acquired from determine-version-info. (e.g. minor, patch)'
required: true
type: string
new_stable_version:
description: 'New stable version acquired from determine-version-info. (e.g. 2.9.3, null (on patch releases))'
required: true
type: string
release_type:
description: 'Release type acquired from determine-version-info. (stable or rc)'
required: true
type: string
jobs:
push-new-release-to-channel:
name: Push new release to channel
if: inputs.release_type != 'rc'
uses: ./.github/workflows/release-push-to-channel.yml
secrets: inherit
with:
version: ${{ inputs.version }}
release-channel: ${{ inputs.track }}
promote-previous-beta-to-stable:
name: Promote previous beta to stable
if: |
inputs.release_type != 'rc' &&
inputs.bump == 'minor'
uses: ./.github/workflows/release-push-to-channel.yml
secrets: inherit
with:
version: ${{ inputs.new_stable_version }}
release-channel: stable
ensure-release-candidate-branches:
name: 'Ensure release candidate branches'
if: |
inputs.release_type != 'rc'
uses: ./.github/workflows/util-ensure-release-candidate-branches.yml
secrets: inherit
populate-cloud-with-releases:
name: 'Populate cloud database with releases'
uses: ./.github/workflows/release-populate-cloud-with-releases.yml
with:
previous-version: ${{ inputs.previous_version }}
version: ${{ inputs.version }}
experimental: ${{ inputs.release_type == 'rc' }}
secrets: inherit
send-version-release-notification:
name: 'Send version release notifications'
uses: ./.github/workflows/release-version-release-notification.yml
with:
previous-version: ${{ inputs.previous_version }}
version: ${{ inputs.version }}
secrets: inherit


@@ -8,54 +8,61 @@ on:
- 'release/*'
jobs:
build-arm64:
runs-on: blacksmith-4vcpu-ubuntu-2204-arm
determine-version-info:
name: Determine publishing track
runs-on: ubuntu-latest
if: github.event.pull_request.merged == true
env:
NODE_OPTIONS: --max-old-space-size=6144
outputs:
track: ${{ steps.determine-info.outputs.track }}
previous_version: ${{ steps.determine-info.outputs.previous_version }}
version: ${{ steps.determine-info.outputs.version }}
bump: ${{ steps.determine-info.outputs.bump }}
new_stable_version: ${{ steps.determine-info.outputs.new_stable_version }}
release_type: ${{ steps.determine-info.outputs.release_type }}
rc_branch: ${{ steps.determine-info.outputs.rc_branch }}
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
- name: Setup and Build ARM64
uses: ./.github/actions/setup-nodejs-blacksmith
env:
N8N_FAIL_ON_POPULARITY_FETCH_ERROR: true
- name: Setup Node.js
uses: ./.github/actions/setup-nodejs
with:
build-command: ''
install-command: pnpm install --frozen-lockfile --dir ./.github/scripts --ignore-workspace
- name: Determine track from package version number
id: determine-info
run: node .github/scripts/determine-version-info.mjs
publish-to-npm:
name: Publish to NPM
needs: [determine-version-info]
runs-on: ubuntu-latest
if: github.event.pull_request.merged == true
timeout-minutes: 20
environment: npm
permissions:
id-token: write
env:
NPM_CONFIG_PROVENANCE: true
outputs:
release: ${{ steps.set-release.outputs.release }}
release_type: ${{ steps.set-release.outputs.release_type }}
RELEASE: ${{ needs.determine-version-info.outputs.version }} # Used by Vite build process
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set release version in env
run: echo "RELEASE=$(node -e 'console.log(require("./package.json").version)')" >> "$GITHUB_ENV"
- name: Determine release type
id: release-type
run: |
VERSION="${{ env.RELEASE }}"
if [[ "$VERSION" == *"-rc."* ]]; then
echo "type=rc" >> "$GITHUB_OUTPUT"
else
echo "type=stable" >> "$GITHUB_OUTPUT"
fi
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Setup and Build
uses: ./.github/actions/setup-nodejs-github
uses: ./.github/actions/setup-nodejs
env:
N8N_FAIL_ON_POPULARITY_FETCH_ERROR: true
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
- name: Install script dependencies
run: pnpm install --frozen-lockfile --dir ./.github/scripts --ignore-workspace
- name: Check for new unpublished packages
run: node .github/scripts/detect-new-packages.mjs
- name: Dry-run publishing
run: |
@@ -64,43 +71,43 @@ jobs:
- name: Pre publishing changes
run: |
echo "//registry.npmjs.org/:_authToken=${{ secrets.NPM_TOKEN }}" > ~/.npmrc
node .github/scripts/trim-fe-packageJson.js
node .github/scripts/ensure-provenance-fields.mjs
cp README.md packages/cli/README.md
sed -i "s/default: 'dev'/default: '${{ steps.release-type.outputs.type }}'/g" packages/cli/dist/config/schema.js
sed -i "s/default: 'dev'/default: '${{ needs.determine-version-info.outputs.release_type }}'/g" packages/cli/dist/config/schema.js
- name: Publish n8n to NPM with rc tag
env:
PUBLISH_BRANCH: ${{ github.event.pull_request.base.ref }}
run: pnpm --filter n8n publish --publish-branch "$PUBLISH_BRANCH" --access public --tag rc --no-git-checks
- name: Publish other packages to NPM with latest tag
- name: Publish other packages to NPM
env:
PUBLISH_BRANCH: ${{ github.event.pull_request.base.ref }}
run: pnpm publish -r --filter '!n8n' --publish-branch "$PUBLISH_BRANCH" --access public --no-git-checks
PUBLISH_TAG: ${{ needs.determine-version-info.outputs.track == 'stable' && 'latest' || needs.determine-version-info.outputs.track }}
run: |
# Prefix version-like track names (e.g. "1", "v1") to avoid npm rejecting them as semver ranges
if [[ "$PUBLISH_TAG" =~ ^v?[0-9] ]]; then
PUBLISH_TAG="release-${PUBLISH_TAG}"
fi
pnpm publish -r --filter '!n8n' --publish-branch "$PUBLISH_BRANCH" --access public --tag "$PUBLISH_TAG" --no-git-checks
- name: Cleanup rc tag
run: npm dist-tag rm n8n rc
continue-on-error: true
- id: set-release
run: |
echo "release=${{ env.RELEASE }}" >> "$GITHUB_OUTPUT"
echo "release_type=${{ steps.release-type.outputs.type }}" >> "$GITHUB_OUTPUT"
publish-to-docker-hub:
name: Publish to DockerHub
needs: [publish-to-npm, build-arm64]
needs: [determine-version-info]
uses: ./.github/workflows/docker-build-push.yml
with:
n8n_version: ${{ needs.publish-to-npm.outputs.release }}
release_type: ${{ needs.publish-to-npm.outputs.release_type }}
n8n_version: ${{ needs.determine-version-info.outputs.version }}
release_type: ${{ needs.determine-version-info.outputs.release_type }}
secrets: inherit
create-github-release:
name: Create a GitHub Release
needs: [publish-to-npm, publish-to-docker-hub]
needs: [determine-version-info, publish-to-npm, publish-to-docker-hub]
runs-on: ubuntu-latest
if: github.event.pull_request.merged == true
timeout-minutes: 5
@@ -110,88 +117,102 @@ jobs:
id-token: write
steps:
- name: Create a Release on GitHub
uses: ncipollo/release-action@1c89adf39833729d8f85a31ccbc451b078733c80 # v1
- name: Generate GitHub App Token
id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
with:
app-id: ${{ secrets.N8N_ASSISTANT_APP_ID }}
private-key: ${{ secrets.N8N_ASSISTANT_PRIVATE_KEY }}
- name: Create a Release on GitHub
uses: ncipollo/release-action@b7eabc95ff50cbeeedec83973935c8f306dfcd0b # v1.20.0
with:
token: ${{ steps.generate_token.outputs.token }}
commit: ${{github.event.pull_request.base.ref}}
tag: 'n8n@${{ needs.publish-to-npm.outputs.release }}'
prerelease: true
makeLatest: false
tag: 'n8n@${{ needs.determine-version-info.outputs.version }}'
prerelease: ${{ needs.determine-version-info.outputs.track == 'beta' }}
makeLatest: ${{ needs.determine-version-info.outputs.track == 'stable' }}
body: ${{github.event.pull_request.body}}
create-sentry-release:
name: Create a Sentry Release
needs: [publish-to-npm, publish-to-docker-hub]
runs-on: ubuntu-latest
move-track-tag:
name: Move track tag
needs: [determine-version-info, create-github-release]
if: github.event.pull_request.merged == true
timeout-minutes: 5
env:
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}
SENTRY_ORG: ${{ secrets.SENTRY_ORG }}
uses: ./.github/workflows/release-update-pointer-tag.yml
with:
track: ${{ needs.determine-version-info.outputs.track }}
version-tag: 'n8n@${{ needs.determine-version-info.outputs.version }}'
secrets: inherit
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Restore Turbo Cache
uses: ./.github/actions/setup-nodejs-github
- name: Create a frontend release
uses: getsentry/action-release@e769183448303de84c5a06aaaddf9da7be26d6c7 # v1.7.0
continue-on-error: true
with:
projects: ${{ secrets.SENTRY_FRONTEND_PROJECT }}
version: n8n@${{ needs.publish-to-npm.outputs.release }}
sourcemaps: packages/frontend/editor-ui/dist
- name: Create a backend release
uses: getsentry/action-release@e769183448303de84c5a06aaaddf9da7be26d6c7 # v1.7.0
continue-on-error: true
with:
projects: ${{ secrets.SENTRY_BACKEND_PROJECT }}
version: n8n@${{ needs.publish-to-npm.outputs.release }}
sourcemaps: packages/cli/dist packages/core/dist packages/nodes-base/dist packages/@n8n/n8n-nodes-langchain/dist
- name: Create a task runner release
uses: getsentry/action-release@e769183448303de84c5a06aaaddf9da7be26d6c7 # v1.7.0
continue-on-error: true
with:
projects: ${{ secrets.SENTRY_TASK_RUNNER_PROJECT }}
version: n8n@${{ needs.publish-to-npm.outputs.release }}
sourcemaps: packages/core/dist packages/workflow/dist/esm packages/@n8n/task-runner/dist
promote-stable-tag:
name: Promote stable tag (minor bump)
needs: [determine-version-info, create-github-release]
if: |
github.event.pull_request.merged == true &&
needs.determine-version-info.outputs.new_stable_version != ''
uses: ./.github/workflows/release-update-pointer-tag.yml
with:
track: stable
version-tag: 'n8n@${{ needs.determine-version-info.outputs.new_stable_version }}'
secrets: inherit
generate-and-attach-sbom:
name: Generate and Attach SBOM to Release
needs: [publish-to-npm, create-github-release]
needs: [determine-version-info, create-github-release]
uses: ./.github/workflows/sbom-generation-callable.yml
with:
n8n_version: ${{ needs.publish-to-npm.outputs.release }}
release_tag_ref: 'n8n@${{ needs.publish-to-npm.outputs.release }}'
n8n_version: ${{ needs.determine-version-info.outputs.version }}
release_tag_ref: 'n8n@${{ needs.determine-version-info.outputs.version }}'
secrets: inherit
trigger-release-note:
name: Trigger a release note
needs: [publish-to-npm, create-github-release]
merge-release-tag-to-master:
name: Merge release tag to master on minor release
needs: [determine-version-info, publish-to-npm, create-github-release]
if: |
github.event.pull_request.merged == true &&
!contains(needs.publish-to-npm.outputs.release, '-rc.')
runs-on: ubuntu-latest
steps:
- name: Trigger a release note
run: curl -u docsWorkflows:${{ secrets.N8N_WEBHOOK_DOCS_PASSWORD }} --request GET 'https://internal.users.n8n.cloud/webhook/trigger-release-note' --header 'Content-Type:application/json' --data '{"version":"${{ needs.publish-to-npm.outputs.release }}"}'
needs.determine-version-info.outputs.bump == 'minor' &&
needs.determine-version-info.outputs.release_type != 'rc'
uses: ./.github/workflows/release-merge-tag-to-branch.yml
with:
version: ${{ needs.determine-version-info.outputs.version }}
target-branch: master
secrets: inherit
# merge-back-into-master:
# name: Merge back into master
# needs: [publish-to-npm, create-github-release]
# if: ${{ github.event.pull_request.merged == true && !contains(github.event.pull_request.labels.*.name, 'release:patch') }}
# runs-on: ubuntu-latest
# steps:
# - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
#   with:
#     fetch-depth: 0
# - run: |
# git checkout --track origin/master
# git config user.name "github-actions[bot]"
# git config user.email 41898282+github-actions[bot]@users.noreply.github.com
# git merge --ff n8n@${{ needs.publish-to-npm.outputs.release }}
# git push origin master
# git push origin :${{github.event.pull_request.base.ref}}
merge-release-tag-to-rc-branch:
name: Merge release tag to RC branch on patch release
needs: [determine-version-info, publish-to-npm, create-github-release]
if: |
github.event.pull_request.merged == true &&
needs.determine-version-info.outputs.bump == 'patch' &&
needs.determine-version-info.outputs.release_type != 'rc'
uses: ./.github/workflows/release-merge-tag-to-branch.yml
with:
version: ${{ needs.determine-version-info.outputs.version }}
target-branch: ${{ needs.determine-version-info.outputs.rc_branch }}
secrets: inherit
post-release:
name: Run Post-release actions
needs:
[
determine-version-info,
publish-to-npm,
create-github-release,
move-track-tag,
promote-stable-tag,
]
if: |
always() &&
needs.publish-to-npm.result == 'success' &&
needs.create-github-release.result == 'success' &&
(needs.move-track-tag.result == 'success' || needs.move-track-tag.result == 'skipped') &&
(needs.promote-stable-tag.result == 'success' || needs.promote-stable-tag.result == 'skipped')
uses: ./.github/workflows/release-publish-post-release.yml
with:
track: ${{ needs.determine-version-info.outputs.track }}
previous_version: ${{ needs.determine-version-info.outputs.previous_version }}
version: ${{ needs.determine-version-info.outputs.version }}
bump: ${{ needs.determine-version-info.outputs.bump }}
new_stable_version: ${{ needs.determine-version-info.outputs.new_stable_version }}
release_type: ${{ needs.determine-version-info.outputs.release_type }}
secrets: inherit


@@ -1,6 +1,18 @@
name: 'Release: Push to Channel'
on:
workflow_call:
inputs:
version:
description: 'n8n Release version to push to a channel (e.g., 1.2.3 or 1.2.3-beta.4)'
required: true
type: string
release-channel:
description: 'Release channel'
required: true
type: string
workflow_dispatch:
inputs:
version:
@@ -23,12 +35,12 @@ jobs:
runs-on: ubuntu-latest
outputs:
version: ${{ steps.check_version.outputs.version }}
release_channel: ${{ github.event.inputs.release-channel }}
release_channel: ${{ inputs.release-channel }}
steps:
- name: Check Version Format
id: check_version
env:
INPUT_VERSION: ${{ github.event.inputs.version }}
INPUT_VERSION: ${{ inputs.version }}
run: |
input_version="${{ env.INPUT_VERSION }}"
version_regex='^[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9.-]+)?$'
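The `check_version` gate above accepts plain `MAJOR.MINOR.PATCH` versions plus pre-release suffixes. A minimal sketch of the same pattern in Python (illustrative only; the workflow itself uses bash with this exact regex):

```python
import re

# Same pattern as the check_version step above: MAJOR.MINOR.PATCH with an
# optional pre-release suffix such as -beta.4 or -rc.1.
VERSION_RE = re.compile(r"^[0-9]+\.[0-9]+\.[0-9]+(-[a-zA-Z0-9.-]+)?$")

def is_valid_version(version: str) -> bool:
    return VERSION_RE.match(version) is not None

print(is_valid_version("1.2.3"))         # plain release: accepted
print(is_valid_version("1.2.3-beta.4"))  # pre-release: accepted
print(is_valid_version("v1.2.3"))        # leading 'v': rejected
```

Note that the pattern anchors both ends, so a leading `v` or a two-part version fails validation.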
@@ -43,8 +55,8 @@ jobs:
- name: Block RC promotion to stable/beta
env:
INPUT_VERSION: ${{ github.event.inputs.version }}
CHANNEL: ${{ github.event.inputs.release-channel }}
INPUT_VERSION: ${{ inputs.version }}
CHANNEL: ${{ inputs.release-channel }}
run: |
if [[ "$INPUT_VERSION" == *"-rc."* ]]; then
echo "::error::RC versions cannot be promoted to '$CHANNEL' channel. Version '$INPUT_VERSION' contains '-rc.'"
@@ -57,12 +69,18 @@ jobs:
runs-on: ubuntu-latest
needs: validate-inputs
timeout-minutes: 5
environment: release
permissions:
id-token: write
steps:
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- uses: actions/setup-node@53b83947a5a98c8d113130e565377fae1a50d02f # v6.3.0
with:
node-version: 22.x
node-version: 24.13.1
- run: echo "//registry.npmjs.org/:_authToken=${{ secrets.NPM_TOKEN }}" > ~/.npmrc
# Remove after https://github.com/npm/cli/issues/8547 gets resolved
- run: echo "//registry.npmjs.org/:_authToken=${NPM_TOKEN}" > ~/.npmrc
env:
NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
- name: Add beta/next tags to NPM
if: needs.validate-inputs.outputs.release_channel == 'beta'
@@ -81,39 +99,53 @@ jobs:
runs-on: ubuntu-latest
needs: validate-inputs
timeout-minutes: 5
environment: release
steps:
- uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3.3.0
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Login to DockerHub
uses: ./.github/actions/docker-registry-login
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
login-ghcr: false
login-dockerhub: true
dockerhub-username: ${{ secrets.DOCKER_USERNAME }}
dockerhub-password: ${{ secrets.DOCKER_PASSWORD }}
- name: Tag stable/latest Docker image
if: needs.validate-inputs.outputs.release_channel == 'stable'
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
run: |
docker buildx imagetools create -t "${{ secrets.DOCKER_USERNAME }}/n8n:stable" "${{ secrets.DOCKER_USERNAME }}/n8n:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${{ secrets.DOCKER_USERNAME }}/n8n:latest" "${{ secrets.DOCKER_USERNAME }}/n8n:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${{ secrets.DOCKER_USERNAME }}/runners:stable" "${{ secrets.DOCKER_USERNAME }}/runners:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${{ secrets.DOCKER_USERNAME }}/runners:latest" "${{ secrets.DOCKER_USERNAME }}/runners:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${DOCKER_USERNAME}/n8n:stable" "${DOCKER_USERNAME}/n8n:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${DOCKER_USERNAME}/n8n:latest" "${DOCKER_USERNAME}/n8n:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${DOCKER_USERNAME}/runners:stable" "${DOCKER_USERNAME}/runners:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${DOCKER_USERNAME}/runners:latest" "${DOCKER_USERNAME}/runners:${{ needs.validate-inputs.outputs.version }}"
- name: Tag beta/next Docker image
if: needs.validate-inputs.outputs.release_channel == 'beta'
env:
DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
run: |
docker buildx imagetools create -t "${{ secrets.DOCKER_USERNAME }}/n8n:beta" "${{ secrets.DOCKER_USERNAME }}/n8n:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${{ secrets.DOCKER_USERNAME }}/n8n:next" "${{ secrets.DOCKER_USERNAME }}/n8n:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${{ secrets.DOCKER_USERNAME }}/runners:beta" "${{ secrets.DOCKER_USERNAME }}/runners:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${{ secrets.DOCKER_USERNAME }}/runners:next" "${{ secrets.DOCKER_USERNAME }}/runners:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${DOCKER_USERNAME}/n8n:beta" "${DOCKER_USERNAME}/n8n:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${DOCKER_USERNAME}/n8n:next" "${DOCKER_USERNAME}/n8n:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${DOCKER_USERNAME}/runners:beta" "${DOCKER_USERNAME}/runners:${{ needs.validate-inputs.outputs.version }}"
docker buildx imagetools create -t "${DOCKER_USERNAME}/runners:next" "${DOCKER_USERNAME}/runners:${{ needs.validate-inputs.outputs.version }}"
release-to-github-container-registry:
name: Release to GitHub Container Registry
runs-on: ubuntu-latest
needs: validate-inputs
timeout-minutes: 5
environment: release
permissions:
packages: write
steps:
- uses: docker/login-action@9780b0c442fbb1117ed29e0efdff1e18412f7567 # v3.3.0
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Login to GitHub Container Registry
uses: ./.github/actions/docker-registry-login
- name: Tag stable/latest GHCR image
if: needs.validate-inputs.outputs.release_channel == 'stable'
@@ -135,6 +167,7 @@ jobs:
name: Update latest and next in the docs
runs-on: ubuntu-latest
needs: [validate-inputs, release-to-npm, release-to-docker-hub]
environment: release
steps:
- continue-on-error: true
run: curl -u docsWorkflows:${{ secrets.N8N_WEBHOOK_DOCS_PASSWORD }} --request GET 'https://internal.users.n8n.cloud/webhook/update-latest-next'


@@ -8,10 +8,14 @@ on:
required: true
type: choice
options:
- '@n8n/node-cli'
- '@n8n/codemirror-lang'
- '@n8n/codemirror-lang-html'
- '@n8n/codemirror-lang-sql'
- '@n8n/create-node'
- '@n8n/scan-community-package'
- '@n8n/eslint-plugin-community-nodes'
- '@n8n/node-cli'
- '@n8n/scan-community-package'
# All packages listed above require OIDC enabled in NPM. https://docs.npmjs.com/trusted-publishers
concurrency:
group: release-package-${{ github.event.inputs.package }}
@@ -25,6 +29,7 @@ jobs:
name: Publish to NPM
runs-on: ubuntu-latest
timeout-minutes: 15
environment: npm
permissions:
id-token: write
env:
@@ -38,20 +43,18 @@ jobs:
exit 1
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Setup and Build
uses: ./.github/actions/setup-nodejs-github
uses: ./.github/actions/setup-nodejs
with:
build-command: 'pnpm turbo build --filter "...${{ github.event.inputs.package }}"'
- name: Pre publishing changes
run: |
echo "//registry.npmjs.org/:_authToken=${{ secrets.NPM_TOKEN }}" > ~/.npmrc
node .github/scripts/ensure-provenance-fields.mjs
- name: Publish package
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
PACKAGE: ${{ github.event.inputs.package }}
run: pnpm --filter "$PACKAGE" publish --access public --no-git-checks --publish-branch master


@@ -0,0 +1,66 @@
name: 'Release: Update pointer tag'
run-name: 'Update pointer tag: ${{ inputs.track }} -> ${{ inputs.version-tag }}'
on:
workflow_call:
inputs:
track:
required: true
type: string
version-tag:
required: true
type: string
workflow_dispatch:
inputs:
track:
description: 'Release Track'
required: true
type: choice
options: [stable, beta, v1]
version-tag:
description: 'Version tag (e.g. n8n@2.7.0). Track tag will point to this version tag.'
required: true
type: string
permissions:
contents: write
jobs:
update-pointer-tags:
name: Update pointer tags
runs-on: ubuntu-slim
environment: minor-release-tag-merge
steps:
- name: Generate GitHub App Token
id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
with:
app-id: ${{ secrets.RELEASE_TAG_MERGE_APP_ID }}
private-key: ${{ secrets.RELEASE_TAG_MERGE_PRIVATE_KEY }}
skip-token-revoke: false
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
token: ${{ steps.generate_token.outputs.token }}
# We can manage with a shallow clone, since `ensureTagExists` in github-helpers.mjs
# does a targeted fetch for the tags it needs.
fetch-depth: 1
- name: Setup NodeJS
uses: ./.github/actions/setup-nodejs
with:
build-command: ''
install-command: pnpm install --frozen-lockfile --dir ./.github/scripts --ignore-workspace
- name: Configure git author
run: |
git config user.name "n8n-release-tag-merge[bot]"
git config user.email "256767729+n8n-release-tag-merge[bot]@users.noreply.github.com"
- name: Move track tag
env:
TRACK: ${{ inputs.track }}
VERSION_TAG: ${{ inputs.version-tag }}
run: node ./.github/scripts/move-track-tag.mjs


@@ -0,0 +1,50 @@
name: 'Release: Send version release notification'
run-name: 'Send version release notification for n8n@${{ inputs.version }}'
on:
workflow_dispatch:
inputs:
previous-version:
description: 'The previous release version (e.g. 2.10.2)'
required: true
type: string
version:
description: 'The release version (e.g. 2.11.0)'
required: true
type: string
workflow_call:
inputs:
previous-version:
description: 'The previous release version (e.g. 2.10.2)'
required: true
type: string
version:
description: 'The release version (e.g. 2.11.0)'
required: true
type: string
jobs:
release-notification:
runs-on: ubuntu-slim
environment: release
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
- name: Setup Node.js
uses: ./.github/actions/setup-nodejs
with:
build-command: ''
install-command: pnpm install --frozen-lockfile --dir ./.github/scripts --ignore-workspace
- name: Send release notification
env:
N8N_VERSION_RELEASE_NOTIFICATION_DATA: ${{ secrets.N8N_VERSION_RELEASE_NOTIFICATION_DATA }}
PAYLOAD: |
{
"previous_version": "${{ inputs.previous-version }}",
"new_version": "${{ inputs.version }}"
}
run: node ./.github/scripts/send-version-release-notification.mjs


@@ -39,64 +39,44 @@ jobs:
continue-on-error: true
steps:
- name: Checkout release tag
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ inputs.release_tag_ref }}
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- name: Setup Node.js and install dependencies
uses: ./.github/actions/setup-nodejs
with:
node-version: 22.x
- name: Setup corepack and pnpm
run: |
npm i -g corepack@0.33
corepack enable
- name: Install dependencies for SBOM generation
run: pnpm install --frozen-lockfile
build-command: ''
- name: Generate CycloneDX SBOM for source code
uses: anchore/sbom-action@f8bdd1d8ac5e901a77a92f111440fdb1b593736b # v0.20.6
uses: anchore/sbom-action@57aae528053a48a3f6235f2d9461b05fbcb7366d # v0.23.1
with:
path: ./
format: cyclonedx-json
output-file: sbom-source.cdx.json
- name: Attest build provenance for source release
uses: actions/attest-build-provenance@977bb373ede98d70efdf65b84cb5f73e068dcc2a0 # v3.0.0
with:
subject-path: './package.json'
- name: Attest SBOM for source release
uses: actions/attest-sbom@4651f806c01d8637787e274ac3bdf724ef169f34 # v3.0.0
uses: actions/attest-sbom@07e74fc4e78d1aad915e867f9a094073a9f71527 # v4.0.0
with:
subject-path: './package.json'
sbom-path: 'sbom-source.cdx.json'
- name: Install Cosign
uses: sigstore/cosign-installer@d7543c93d881b35a8faa02e8e3605f69b7a1ce62 # v3.10.0
- name: Sign SBOM (keyless)
run: |
# Sign SBOM using Cosign keyless signing with GitHub OIDC
# This provides cryptographic proof of authenticity and integrity
cosign sign-blob --yes --output-signature sbom-source.cdx.sig --output-certificate sbom-source.cdx.pem sbom-source.cdx.json
- name: Attach SBOM files to release
- name: Attach SBOM and VEX files to release
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
RELEASE_TAG_REF: ${{ inputs.release_tag_ref }}
run: |
# Upload SBOM files to the existing release
gh release upload "${{ inputs.release_tag_ref }}" \
# Upload SBOM and VEX files to the existing release
gh release upload "$RELEASE_TAG_REF" \
sbom-source.cdx.json \
sbom-source.cdx.sig \
sbom-source.cdx.pem \
security/vex.openvex.json \
--clobber
COMPONENT_COUNT=$(jq '.components | length' sbom-source.cdx.json 2>/dev/null || echo "unknown")
echo "✅ SBOM workflow completed"
echo "📊 SBOM contains $COMPONENT_COUNT components"
echo "🛡️ GitHub attestations created for source release"
VEX_STATEMENTS=$(jq '.statements | length' security/vex.openvex.json 2>/dev/null || echo "0")
echo "SBOM and VEX attached to release"
echo " - SBOM: $COMPONENT_COUNT components"
echo " - VEX: $VEX_STATEMENTS CVE statements"
- name: Notify Slack on failure
if: failure()

.github/workflows/sec-ci-reusable.yml

@@ -0,0 +1,23 @@
name: 'Sec: CI Checks'
on:
workflow_call:
inputs:
ref:
description: GitHub ref to scan.
required: false
type: string
default: ''
jobs:
poutine-scan:
name: Poutine Security Scan
uses: ./.github/workflows/sec-poutine-reusable.yml
with:
ref: ${{ inputs.ref }}
secrets: inherit
# Future security checks can be added here:
# - dependency-scan:
# - secret-detection:
# - container-scan:


@@ -0,0 +1,44 @@
name: 'Sec: Poutine Scan'
on:
workflow_dispatch:
workflow_call:
inputs:
ref:
description: GitHub ref to scan.
required: false
type: string
default: ''
permissions:
contents: read
security-events: write
jobs:
poutine_scan:
name: Poutine Security Scan
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ inputs.ref }}
- name: Run Poutine Security Scanner
uses: boostsecurityio/poutine-action@84c0a0d32e8d57ae12651222be1eb15351429228 # v0.15.2
- name: Fail on error-level findings
run: |
# Check SARIF for error-level findings
if jq -e '.runs[].results[] | select(.level == "error")' results.sarif > /dev/null 2>&1; then
echo "::error::Poutine found error-level security findings:"
jq -r '.runs[].results[] | select(.level == "error") | " - \(.ruleId): \(.message.text)"' results.sarif
exit 1
fi
echo "No error-level findings detected"
- name: Upload SARIF results
uses: github/codeql-action/upload-sarif@48ab28a6f5dbc2a99bf1e0131198dd8f1df78169 # v3.28.0
if: github.repository == 'n8n-io/n8n'
with:
sarif_file: results.sarif


@@ -0,0 +1,61 @@
name: 'Security: Publish fix (1.x)'
on:
pull_request:
types: [closed]
branches: ['1.x']
jobs:
sync-security-fix:
if: github.repository == 'n8n-io/n8n-private' && github.event.pull_request.merged == true
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Generate GitHub App Token
id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
with:
app-id: ${{ secrets.N8N_ASSISTANT_APP_ID }}
private-key: ${{ secrets.N8N_ASSISTANT_PRIVATE_KEY }}
owner: n8n-io
repositories: n8n,n8n-private
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
token: ${{ steps.generate_token.outputs.token }}
- name: Open PR to public repo
run: |
COMMIT_TO_PUBLISH=$(git rev-parse HEAD)
BRANCH_NAME="private-1x-$(date +%Y%m%d-%H%M%S)"
git remote add public-repo https://x-access-token:${{ steps.generate_token.outputs.token }}@github.com/n8n-io/n8n.git
git fetch public-repo 1.x
git checkout -b "$BRANCH_NAME" public-repo/1.x
git config user.name "github-actions[bot]"
git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
git cherry-pick "$COMMIT_TO_PUBLISH"
git push public-repo "$BRANCH_NAME"
gh pr create \
--repo n8n-io/n8n \
--base 1.x \
--head "$BRANCH_NAME" \
--title "$PR_TITLE" \
--body "Cherry-picked from n8n-private. Original PR: $PR_URL"
env:
GH_TOKEN: ${{ steps.generate_token.outputs.token }}
PR_TITLE: ${{ github.event.pull_request.title }}
PR_URL: ${{ github.event.pull_request.html_url }}
- name: Notify on failure
if: failure()
uses: act10ns/slack@44541246747a30eb3102d87f7a4cc5471b0ffb7d # v2.1.0
with:
status: ${{ job.status }}
channel: '#alerts-security'
webhook-url: ${{ secrets.SLACK_WEBHOOK_URL }}
message: 'Security fix PR creation failed (1.x). Run "Security: Sync from Public" workflow, rebase your branch, reopen PR. (${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }})'

.github/workflows/sec-publish-fix.yml

@@ -0,0 +1,61 @@
name: 'Security: Publish fix'
on:
pull_request:
types: [closed]
branches: [master]
jobs:
sync-security-fix:
if: github.repository == 'n8n-io/n8n-private' && github.event.pull_request.merged == true
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Generate GitHub App Token
id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
with:
app-id: ${{ secrets.N8N_ASSISTANT_APP_ID }}
private-key: ${{ secrets.N8N_ASSISTANT_PRIVATE_KEY }}
owner: n8n-io
repositories: n8n,n8n-private
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
token: ${{ steps.generate_token.outputs.token }}
- name: Open PR to public repo
run: |
COMMIT_TO_PUBLISH=$(git rev-parse HEAD)
BRANCH_NAME="private-$(date +%Y%m%d-%H%M%S)"
git remote add public-repo https://x-access-token:${{ steps.generate_token.outputs.token }}@github.com/n8n-io/n8n.git
git fetch public-repo master
git checkout -b "$BRANCH_NAME" public-repo/master
git config user.name "github-actions[bot]"
git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
git cherry-pick "$COMMIT_TO_PUBLISH"
git push public-repo "$BRANCH_NAME"
gh pr create \
--repo n8n-io/n8n \
--base master \
--head "$BRANCH_NAME" \
--title "$PR_TITLE" \
--body "Cherry-picked from n8n-private. Original PR: $PR_URL"
env:
GH_TOKEN: ${{ steps.generate_token.outputs.token }}
PR_TITLE: ${{ github.event.pull_request.title }}
PR_URL: ${{ github.event.pull_request.html_url }}
- name: Notify on failure
if: failure()
uses: act10ns/slack@44541246747a30eb3102d87f7a4cc5471b0ffb7d # v2.1.0
with:
status: ${{ job.status }}
channel: '#alerts-security'
webhook-url: ${{ secrets.SLACK_WEBHOOK_URL }}
message: 'Security fix PR creation failed. Run "Security: Sync from Public" workflow, rebase your branch, reopen PR. (${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }})'


@@ -0,0 +1,113 @@
# Sync n8n-io/n8n to n8n-io/n8n-private
#
# Runs hourly to keep private in sync with public.
# Can also be triggered manually for conflict recovery.
#
# Scheduled runs only sync if private is not ahead of public.
# Manual runs always sync (for conflict recovery after failed cherry-pick).
name: 'Security: Sync from Public'
on:
schedule:
- cron: '0 * * * *'
workflow_dispatch:
inputs:
force:
description: Sync even if private is ahead (for conflict recovery)
type: boolean
default: true
jobs:
sync-from-public:
if: github.repository == 'n8n-io/n8n-private'
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- name: Generate App Token
id: app-token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
with:
app-id: ${{ secrets.N8N_ASSISTANT_APP_ID }}
private-key: ${{ secrets.N8N_ASSISTANT_PRIVATE_KEY }}
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
fetch-depth: 0
token: ${{ steps.app-token.outputs.token }}
- name: Sync master from public
env:
EVENT_NAME: ${{ github.event_name }}
FORCE: ${{ inputs.force }}
run: |
git fetch https://github.com/n8n-io/n8n.git master:public-master
# Check if private is ahead of public, ignore Bundle commits
AHEAD_COUNT=$(git rev-list public-master..HEAD --pretty=oneline --grep="chore: Bundle" --invert-grep --count)
if [ "$AHEAD_COUNT" -gt 0 ]; then
if [ "$EVENT_NAME" = "schedule" ]; then
echo "Private is $AHEAD_COUNT commit(s) ahead of public, skipping scheduled sync"
exit 0
elif [ "$FORCE" != "true" ]; then
echo "Private is $AHEAD_COUNT commit(s) ahead of public, skipping (force not enabled)"
exit 0
else
echo "Private is $AHEAD_COUNT commit(s) ahead of public, force syncing anyway"
fi
fi
git reset --hard public-master
git push origin master --force-with-lease
- name: Sync 1.x from public
env:
EVENT_NAME: ${{ github.event_name }}
FORCE: ${{ inputs.force }}
run: |
git fetch https://github.com/n8n-io/n8n.git 1.x:public-1.x
git checkout 1.x
# Check if private is ahead of public, ignore Bundle commits
AHEAD_COUNT=$(git rev-list public-1.x..HEAD --pretty=oneline --grep="chore: Bundle" --invert-grep --count)
if [ "$AHEAD_COUNT" -gt 0 ]; then
if [ "$EVENT_NAME" = "schedule" ]; then
echo "Private 1.x is $AHEAD_COUNT commit(s) ahead of public, skipping scheduled sync"
exit 0
elif [ "$FORCE" != "true" ]; then
echo "Private 1.x is $AHEAD_COUNT commit(s) ahead of public, skipping (force not enabled)"
exit 0
else
echo "Private 1.x is $AHEAD_COUNT commit(s) ahead of public, force syncing anyway"
fi
fi
git reset --hard public-1.x
git push origin 1.x --force-with-lease
- name: Ensure bundle/2.x exists
run: |
if git ls-remote --exit-code origin refs/heads/bundle/2.x; then
echo "bundle/2.x already exists, skipping"
else
echo "bundle/2.x not found, creating from master"
git checkout master
git checkout -b bundle/2.x
git push origin bundle/2.x
fi
- name: Ensure bundle/1.x exists
run: |
if git ls-remote --exit-code origin refs/heads/bundle/1.x; then
echo "bundle/1.x already exists, skipping"
else
echo "bundle/1.x not found, creating from 1.x"
git checkout 1.x
git checkout -b bundle/1.x
git push origin bundle/1.x
fi
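The skip/force guard implemented twice above (once per branch) reduces to a small decision function. A sketch of that logic (illustrative only; the workflow encodes it directly in bash):

```python
def sync_action(ahead_count: int, event_name: str, force: bool) -> str:
    """Decision logic shared by the master and 1.x sync steps.

    Scheduled runs never overwrite private-only commits; manual runs
    (workflow_dispatch) only do so when `force` is enabled.
    """
    if ahead_count == 0:
        return "sync"
    if event_name == "schedule":
        return "skip"        # private is ahead: never clobber on a timer
    if not force:
        return "skip"        # manual run without force: stay safe
    return "force-sync"      # manual conflict recovery

print(sync_action(0, "schedule", False))          # in sync: safe to reset
print(sync_action(2, "schedule", False))          # ahead on schedule: skip
print(sync_action(2, "workflow_dispatch", True))  # manual + force: overwrite
```

Since the `schedule` branch is checked first, a scheduled run skips whenever private is ahead, regardless of the `force` input's default.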


@@ -22,30 +22,40 @@ permissions:
env:
QBOT_SLACK_TOKEN: ${{ secrets.QBOT_SLACK_TOKEN }}
SLACK_CHANNEL_ID: C042WDXPTEZ #mission-security
SLACK_CHANNEL_ID: C0AHNJU9XFA #updates-security
jobs:
security_scan:
name: Security - Scan Docker Image With Trivy
runs-on: ubuntu-latest
steps:
- name: Checkout for VEX file
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
sparse-checkout: |
security/vex.openvex.json
security/trivy.yaml
security/trivy-ignore-policy.rego
.github/scripts/retry.mjs
sparse-checkout-cone-mode: false
- name: Pull Docker image with retry
run: |
for i in {1..4}; do
docker pull "${{ inputs.image_ref }}" && break
[ "$i" -lt 4 ] && echo "Retry $i failed, waiting..." && sleep 15
done
env:
IMAGE_REF: ${{ inputs.image_ref }}
run: node .github/scripts/retry.mjs --attempts 4 --delay 15 -- docker pull "$IMAGE_REF"
- name: Run Trivy vulnerability scanner
uses: aquasecurity/trivy-action@dc5a429b52fcf669ce959baa2c2dd26090d2a6c4 # v0.32.0
uses: aquasecurity/trivy-action@e368e328979b113139d6f9068e03accaed98a518 # v0.34.1
id: trivy_scan
with:
image-ref: ${{ inputs.image_ref }}
version: 'v0.69.2'
format: 'json'
output: 'trivy-results.json'
severity: 'CRITICAL,HIGH,MEDIUM,LOW'
ignore-unfixed: false
exit-code: '0'
trivy-config: 'security/trivy.yaml'
- name: Calculate vulnerability counts
id: process_results
@@ -82,16 +92,22 @@ jobs:
- name: Generate GitHub Job Summary
if: always()
env:
IMAGE_REF: ${{ inputs.image_ref }}
run: |
{
echo "# 🛡️ Trivy Security Scan Results"
echo ""
echo "**Image:** \`${{ inputs.image_ref }}\`"
echo "**Image:** \`$IMAGE_REF\`"
echo "**Scan Date:** $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
echo ""
} >> "$GITHUB_STEP_SUMMARY"
if [ "${{ steps.process_results.outputs.vulnerabilities_found }}" == "false" ]; then
if [ ! -s trivy-results.json ]; then
{
echo "⚠️ **Scan did not produce results.** Check the 'Run Trivy vulnerability scanner' step for errors."
} >> "$GITHUB_STEP_SUMMARY"
elif [ "${{ steps.process_results.outputs.vulnerabilities_found }}" == "false" ]; then
{
echo "✅ **No vulnerabilities found!**"
} >> "$GITHUB_STEP_SUMMARY"
@@ -113,7 +129,7 @@ jobs:
{
# Generate detailed vulnerability table
jq -r --arg image_ref "${{ inputs.image_ref }}" '
jq -r --arg image_ref "$IMAGE_REF" '
# Collect all vulnerabilities
[.Results[] | select(.Vulnerabilities != null) | .Vulnerabilities[]] |
# Group by CVE ID to avoid duplicates
@@ -153,8 +169,10 @@ jobs:
- name: Generate Slack Blocks JSON
if: steps.process_results.outputs.vulnerabilities_found == 'true'
id: generate_blocks
env:
IMAGE_REF: ${{ inputs.image_ref }}
run: |
BLOCKS_JSON=$(jq -c --arg image_ref "${{ inputs.image_ref }}" \
BLOCKS_JSON=$(jq -c --arg image_ref "$IMAGE_REF" \
--arg repo_url "${{ github.server_url }}/${{ github.repository }}" \
--arg repo_name "${{ github.repository }}" \
--arg run_url "${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}" \
@@ -242,3 +260,4 @@ jobs:
channel: ${{ env.SLACK_CHANNEL_ID }}
text: "🚨 Trivy Scan: ${{ steps.process_results.outputs.critical_count }} Critical, ${{ steps.process_results.outputs.high_count }} High, ${{ steps.process_results.outputs.medium_count }} Medium, ${{ steps.process_results.outputs.low_count }} Low vulnerabilities found in ${{ inputs.image_ref }}"
blocks: ${{ steps.generate_blocks.outputs.slack_blocks }}


@@ -0,0 +1,36 @@
name: 'Test: Benchmarks'
on:
workflow_call:
inputs:
ref:
description: GitHub ref to test.
required: false
type: string
default: ''
workflow_dispatch:
inputs:
ref:
description: Branch or ref to benchmark (defaults to the workflow's branch).
required: false
type: string
default: ''
jobs:
bench:
name: Benchmarks
if: github.repository == 'n8n-io/n8n'
runs-on: ${{ vars.RUNNER_PROVIDER == 'github' && 'ubuntu-latest' || 'blacksmith-2vcpu-ubuntu-2204' }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
with:
ref: ${{ inputs.ref }}
- name: Setup Node.js
uses: ./.github/actions/setup-nodejs
- name: Run benchmarks
uses: CodSpeedHQ/action@281164b0f014a4e7badd2c02cecad9b595b70537 # v4.11.1
with:
mode: simulation
run: CODSPEED=true pnpm --filter=@n8n/performance bench


@@ -1,4 +1,4 @@
name: Destroy Benchmark Env
name: 'Test: Benchmark Destroy Env'
on:
schedule:
@@ -20,26 +20,19 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- name: Azure login
uses: azure/login@6c251865b4e6290e7b78be643ea2d005bc51f69a # v2.1.1
uses: azure/login@a457da9ea143d694b1b9c7c869ebb04ebe844ef5 # v2.3.0
with:
client-id: ${{ secrets.BENCHMARK_ARM_CLIENT_ID }}
tenant-id: ${{ secrets.BENCHMARK_ARM_TENANT_ID }}
subscription-id: ${{ secrets.BENCHMARK_ARM_SUBSCRIPTION_ID }}
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
- name: Setup Node.js and install dependencies
uses: ./.github/actions/setup-nodejs
with:
node-version: 22.x
- name: Setup corepack and pnpm
run: |
npm i -g corepack@0.33
corepack enable
- name: Install dependencies
run: pnpm install --frozen-lockfile
build-command: ''
- name: Destroy cloud env
run: pnpm destroy-cloud-env


@@ -1,4 +1,4 @@
name: Run Nightly Benchmark
name: 'Test: Benchmark Nightly'
run-name: Benchmark ${{ inputs.n8n_tag || 'nightly' }}
on:
@@ -42,26 +42,19 @@ jobs:
steps:
- name: Checkout
uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
- uses: hashicorp/setup-terraform@b9cd54a3c349d3f38e8881555d616ced269862dd # v3
- name: Setup Node.js and install dependencies
uses: ./.github/actions/setup-nodejs
with:
build-command: ''
- uses: hashicorp/setup-terraform@5e8dbf3c6d9deaf4193ca7a8fb23f2ac83bb6c85 # v4.0.0
with:
terraform_version: '1.8.5'
- uses: actions/setup-node@49933ea5288caeca8642d1e84afbd3f7d6820020 # v4.4.0
with:
node-version: 22.x
- name: Setup corepack and pnpm
run: |
npm i -g corepack@0.33
corepack enable
- name: Install dependencies
run: pnpm install --frozen-lockfile
- name: Azure login
uses: azure/login@6c251865b4e6290e7b78be643ea2d005bc51f69a # v2.1.1
uses: azure/login@a457da9ea143d694b1b9c7c869ebb04ebe844ef5 # v2.3.0
with:
client-id: ${{ env.ARM_CLIENT_ID }}
tenant-id: ${{ env.ARM_TENANT_ID }}
@@ -93,7 +86,7 @@ jobs:
# We need to login again because the access token expires
- name: Azure login
if: always()
uses: azure/login@6c251865b4e6290e7b78be643ea2d005bc51f69a # v2.1.1
uses: azure/login@a457da9ea143d694b1b9c7c869ebb04ebe844ef5 # v2.3.0
with:
client-id: ${{ env.ARM_CLIENT_ID }}
tenant-id: ${{ env.ARM_TENANT_ID }}


@@ -1,27 +1,12 @@
name: Test Postgres and MySQL schemas
name: 'Test: DB Postgres MySQL'
on:
schedule:
- cron: '0 0 * * *'
workflow_dispatch:
pull_request:
paths:
- packages/cli/src/databases/**
- packages/cli/src/modules/*/database/**
- packages/cli/src/modules/**/*.entity.ts
- packages/cli/src/modules/**/*.repository.ts
- packages/cli/test/integration/**
- packages/cli/test/shared/db/**
- packages/@n8n/db/**
- packages/cli/**/__tests__/**
- .github/workflows/ci-postgres-mysql.yml
- .github/docker-compose.yml
pull_request_review:
types: [submitted]
concurrency:
group: db-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: false
workflow_call:
inputs:
ref:
required: false
type: string
default: ''
env:
NODE_OPTIONS: '--max-old-space-size=3072'
@@ -35,7 +20,7 @@ jobs:
- uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
- name: Setup and Build
uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
uses: ./.github/actions/setup-nodejs
  sqlite-pooled:
    name: SQLite Pooled
@@ -49,7 +34,7 @@
       - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
       - name: Setup and Build
-        uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
+        uses: ./.github/actions/setup-nodejs
       - name: Test SQLite Pooled
         working-directory: packages/cli
@@ -60,7 +45,6 @@
     needs: build
     runs-on: blacksmith-4vcpu-ubuntu-2204
     timeout-minutes: 30
-    if: false
     env:
       DB_MYSQLDB_PASSWORD: password
       DB_MYSQLDB_POOL_SIZE: 1
@@ -72,7 +56,7 @@
       - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
       - name: Setup and Build
-        uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
+        uses: ./.github/actions/setup-nodejs
       - name: Start MariaDB
         uses: isbang/compose-action@802a148945af6399a338c7906c267331b39a71af # v2.0.0
@@ -80,6 +64,7 @@
           compose-file: ./.github/docker-compose.yml
           services: |
             mariadb
+          up-flags: --wait
       - name: Test MariaDB
         working-directory: packages/cli
@@ -90,7 +75,6 @@
     needs: build
     runs-on: blacksmith-2vcpu-ubuntu-2204
     timeout-minutes: 20
-    if: false
     env:
       DB_MYSQLDB_PASSWORD: password
       DB_MYSQLDB_POOL_SIZE: 1
@@ -101,18 +85,18 @@
       - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
       - name: Setup and Build
-        uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
+        uses: ./.github/actions/setup-nodejs
       - name: Start MySQL
         uses: isbang/compose-action@802a148945af6399a338c7906c267331b39a71af # v2.0.0
         with:
           compose-file: ./.github/docker-compose.yml
           services: mysql-8.4
+          up-flags: --wait
       - name: Test MySQL
         working-directory: packages/cli
-        # We sleep here due to flakiness with DB tests if we connect to the database too soon
-        run: sleep 2s && pnpm test:mysql --testTimeout 120000
+        run: pnpm test:mysql --testTimeout 120000
  postgres:
    name: Postgres
@@ -126,7 +110,7 @@
       - uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
       - name: Setup and Build
-        uses: n8n-io/n8n/.github/actions/setup-nodejs-blacksmith@f5fbbbe0a28a886451c886cac6b49192a39b0eea # v1.104.1
+        uses: ./.github/actions/setup-nodejs
       - name: Start Postgres
         uses: isbang/compose-action@802a148945af6399a338c7906c267331b39a71af # v2.0.0
@@ -142,7 +126,7 @@
   notify-on-failure:
     name: Notify Slack on failure
     runs-on: ubuntu-latest
-    needs: [sqlite-pooled, postgres]
+    needs: [sqlite-pooled, mariadb, postgres, mysql]
     steps:
       - name: Notify Slack on failure
         uses: act10ns/slack@44541246747a30eb3102d87f7a4cc5471b0ffb7d # v2.1.0

Some files were not shown because too many files have changed in this diff.