[core] Ensure open stream flush is await-able in pendingOps #1446

Open
VaguelySerious wants to merge 6 commits into main from
peter/stream-flush-op

Conversation

@VaguelySerious
Member

@VaguelySerious VaguelySerious commented Mar 18, 2026

Stream tests seemed slightly flaky, and I suspect it might be partially because the 10ms buffer on streams wasn't await-able.

VaguelySerious and others added 3 commits March 18, 2026 13:20
WorkflowServerWritableStream buffers writes and flushes via a 10ms
setTimeout for batching. The write() callback returned immediately
after buffering, causing flushablePipe's pendingOps counter to reach 0
before data actually reached the server via the deferred HTTP flush.

Fix: write() now returns a promise that resolves only after the
scheduled flush completes. Multiple writes within the 10ms window
still share a single batched flush — the batching optimization is
preserved. Each write's promise resolves when the batch's HTTP
round-trip finishes, so pendingOps accurately reflects server state.

This is implemented via a flushWaiters array: each write() pushes
a {resolve, reject} pair. When the setTimeout fires and flush()
completes, all waiters are resolved (or rejected on error).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Signed-off-by: Peter Wielander <mittgfu@gmail.com>
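The mechanism described above can be sketched as follows. This is a minimal illustration, not the actual `WorkflowServerWritableStream` code: the `BufferedWriter` name and the `flushed` array (standing in for the HTTP round-trip) are assumptions for the example.

```typescript
type Waiter = { resolve: () => void; reject: (err: unknown) => void };

class BufferedWriter {
  private buffer: string[] = [];
  private flushWaiters: Waiter[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;
  public flushed: string[][] = []; // stand-in for data reaching the server

  // Each write buffers the chunk, schedules (or joins) the shared 10ms
  // flush, and returns a promise that settles only when that flush
  // completes — so pendingOps reflects server state, not buffer state.
  write(chunk: string): Promise<void> {
    this.buffer.push(chunk);
    if (this.timer === null) {
      this.timer = setTimeout(() => this.flush(), 10);
    }
    return new Promise<void>((resolve, reject) => {
      this.flushWaiters.push({ resolve, reject });
    });
  }

  private flush(): void {
    this.timer = null;
    // Capture-and-replace so writes issued during the flush start a new batch.
    const batch = this.buffer;
    const waiters = this.flushWaiters;
    this.buffer = [];
    this.flushWaiters = [];
    try {
      this.flushed.push(batch); // real code performs the HTTP flush here
      waiters.forEach((w) => w.resolve());
    } catch (err) {
      waiters.forEach((w) => w.reject(err)); // flush failure rejects all waiters
    }
  }
}
```

Two writes inside the 10ms window share one batched flush, and both promises resolve when it completes.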
@changeset-bot

changeset-bot bot commented Mar 18, 2026

🦋 Changeset detected

Latest commit: 3b5d85b

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 16 packages
| Name | Type |
| --- | --- |
| @workflow/core | Patch |
| @workflow/builders | Patch |
| @workflow/cli | Patch |
| @workflow/next | Patch |
| @workflow/nitro | Patch |
| @workflow/vitest | Patch |
| @workflow/web-shared | Patch |
| workflow | Patch |
| @workflow/world-testing | Patch |
| @workflow/astro | Patch |
| @workflow/nest | Patch |
| @workflow/rollup | Patch |
| @workflow/sveltekit | Patch |
| @workflow/vite | Patch |
| @workflow/nuxt | Patch |
| @workflow/ai | Patch |



@github-actions
Contributor

github-actions bot commented Mar 18, 2026

🧪 E2E Test Results

Some tests failed

Summary

| Category | Passed | Failed | Skipped | Total |
| --- | --- | --- | --- | --- |
| ✅ ▲ Vercel Production | 758 | 0 | 67 | 825 |
| ✅ 💻 Local Development | 782 | 0 | 118 | 900 |
| ✅ 📦 Local Production | 782 | 0 | 118 | 900 |
| ✅ 🐘 Local Postgres | 782 | 0 | 118 | 900 |
| ✅ 🪟 Windows | 72 | 0 | 3 | 75 |
| ❌ 🌍 Community Worlds | 118 | 56 | 15 | 189 |
| ✅ 📋 Other | 198 | 0 | 27 | 225 |
| **Total** | 3492 | 56 | 466 | 4014 |

❌ Failed Tests

🌍 Community Worlds (56 failed)

mongodb (3 failed):

  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KM1F77E636J4ZGZARVHNRWWV
  • webhookWorkflow | wrun_01KM1F7EWZCMCW1V2YMSWD8V96
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KM1FCTSSPEJD2SKKZJ53YYBR

redis (2 failed):

  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KM1F77E636J4ZGZARVHNRWWV
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KM1FCTSSPEJD2SKKZJ53YYBR

turso (51 failed):

  • addTenWorkflow | wrun_01KM1F63XVVM0KM4QWXXCYF773
  • wellKnownAgentWorkflow (.well-known/agent) | wrun_01KM1F7ASXVNCKT9AWEEKHFD3S
  • should work with react rendering in step
  • promiseAllWorkflow | wrun_01KM1F6ACAA2MPJ5HD9WF3WD68
  • promiseRaceWorkflow | wrun_01KM1F6G384TECJHP7RA1DXQFK
  • promiseAnyWorkflow | wrun_01KM1F6J8BVK34RZTQ45G739KS
  • importedStepOnlyWorkflow | wrun_01KM1F7R76WNMYBRQ5DBHR5KR7
  • hookWorkflow | wrun_01KM1F6Y4V5TKE2BV97R5WHZ2G
  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KM1F77E636J4ZGZARVHNRWWV
  • webhookWorkflow | wrun_01KM1F7EWZCMCW1V2YMSWD8V96
  • sleepingWorkflow | wrun_01KM1F7NYEHX4DFENGB0N1331R
  • parallelSleepWorkflow | wrun_01KM1F824TSCBE9YDZDTWCKTSA
  • nullByteWorkflow | wrun_01KM1F86SNZX7G12SRE8TYC38A
  • workflowAndStepMetadataWorkflow | wrun_01KM1F88TM8D6E2XYTKQRRDAD5
  • fetchWorkflow | wrun_01KM1F9711YQ2TA83439HFRZNS
  • promiseRaceStressTestWorkflow | wrun_01KM1F9AA03X08VWQPKKF6TQJ8
  • error handling error propagation workflow errors nested function calls preserve message and stack trace
  • error handling error propagation workflow errors cross-file imports preserve message and stack trace
  • error handling error propagation step errors basic step error preserves message and stack trace
  • error handling error propagation step errors cross-file step error preserves message and function names in stack
  • error handling retry behavior regular Error retries until success
  • error handling retry behavior FatalError fails immediately without retries
  • error handling retry behavior RetryableError respects custom retryAfter delay
  • error handling retry behavior maxRetries=0 disables retries
  • error handling catchability FatalError can be caught and detected with FatalError.is()
  • hookCleanupTestWorkflow - hook token reuse after workflow completion | wrun_01KM1FC7AWXCN3JM11V0CZXP5R
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KM1FCTSSPEJD2SKKZJ53YYBR
  • hookDisposeTestWorkflow - hook token reuse after explicit disposal while workflow still running | wrun_01KM1FDGPP4EYXPE3ACV2YVFDG
  • stepFunctionPassingWorkflow - step function references can be passed as arguments (without closure vars) | wrun_01KM1FE3V3GBXCVJ5PNPWEVXCZ
  • stepFunctionWithClosureWorkflow - step function with closure variables passed as argument | wrun_01KM1FECTW9X2HA8ACMTVF56BS
  • closureVariableWorkflow - nested step functions with closure variables | wrun_01KM1FEJ6KKXN5CJ0DN6W3PRQ3
  • spawnWorkflowFromStepWorkflow - spawning a child workflow using start() inside a step | wrun_01KM1FEM75R3YPBV8VMGTFBXV4
  • health check (queue-based) - workflow and step endpoints respond to health check messages
  • pathsAliasWorkflow - TypeScript path aliases resolve correctly | wrun_01KM1FF35HXPA1AFJEDXKTNBGD
  • Calculator.calculate - static workflow method using static step methods from another class | wrun_01KM1FF8KWWRW1NDBVNVMGGNEP
  • AllInOneService.processNumber - static workflow method using sibling static step methods | wrun_01KM1FFF5DTEJDZKTTK1T4Y3HS
  • ChainableService.processWithThis - static step methods using this to reference the class | wrun_01KM1FFNFMF3HQPPEVN132V4YF
  • thisSerializationWorkflow - step function invoked with .call() and .apply() | wrun_01KM1FFVX36PDXW01QXCASDQ1T
  • customSerializationWorkflow - custom class serialization with WORKFLOW_SERIALIZE/WORKFLOW_DESERIALIZE | wrun_01KM1FG3BW2JJCMCHWKYHB3WQG
  • instanceMethodStepWorkflow - instance methods with "use step" directive | wrun_01KM1FG9XJJYMDB0G7P4WC5PK0
  • crossContextSerdeWorkflow - classes defined in step code are deserializable in workflow context | wrun_01KM1FGMCF6TGVK6E23WE8R4VT
  • stepFunctionAsStartArgWorkflow - step function reference passed as start() argument | wrun_01KM1FGX5SGC75D1Q77ZD6GMG9
  • cancelRun - cancelling a running workflow | wrun_01KM1FH3PA1Z1BWS5CCV28Q247
  • cancelRun via CLI - cancelling a running workflow | wrun_01KM1FHCPY7HS4NZ7BSBDPY9K0
  • pages router addTenWorkflow via pages router
  • pages router promiseAllWorkflow via pages router
  • pages router sleepingWorkflow via pages router
  • hookWithSleepWorkflow - hook payloads delivered correctly with concurrent sleep | wrun_01KM1FHSAN1ZWV08AKB56471M1
  • sleepInLoopWorkflow - sleep inside loop with steps actually delays each iteration | wrun_01KM1FJCG15DWVNKF1516CEDPJ
  • sleepWithSequentialStepsWorkflow - sequential steps work with concurrent sleep (control) | wrun_01KM1FJR8GHYWEY2JF6N422BXA

Details by Category

✅ ▲ Vercel Production
App Passed Failed Skipped
✅ astro 68 0 7
✅ example 68 0 7
✅ express 68 0 7
✅ fastify 68 0 7
✅ hono 68 0 7
✅ nextjs-turbopack 73 0 2
✅ nextjs-webpack 73 0 2
✅ nitro 68 0 7
✅ nuxt 68 0 7
✅ sveltekit 68 0 7
✅ vite 68 0 7
✅ 💻 Local Development
App Passed Failed Skipped
✅ astro-stable 66 0 9
✅ express-stable 66 0 9
✅ fastify-stable 66 0 9
✅ hono-stable 66 0 9
✅ nextjs-turbopack-canary 55 0 20
✅ nextjs-turbopack-stable 72 0 3
✅ nextjs-webpack-canary 55 0 20
✅ nextjs-webpack-stable 72 0 3
✅ nitro-stable 66 0 9
✅ nuxt-stable 66 0 9
✅ sveltekit-stable 66 0 9
✅ vite-stable 66 0 9
✅ 📦 Local Production
App Passed Failed Skipped
✅ astro-stable 66 0 9
✅ express-stable 66 0 9
✅ fastify-stable 66 0 9
✅ hono-stable 66 0 9
✅ nextjs-turbopack-canary 55 0 20
✅ nextjs-turbopack-stable 72 0 3
✅ nextjs-webpack-canary 55 0 20
✅ nextjs-webpack-stable 72 0 3
✅ nitro-stable 66 0 9
✅ nuxt-stable 66 0 9
✅ sveltekit-stable 66 0 9
✅ vite-stable 66 0 9
✅ 🐘 Local Postgres
App Passed Failed Skipped
✅ astro-stable 66 0 9
✅ express-stable 66 0 9
✅ fastify-stable 66 0 9
✅ hono-stable 66 0 9
✅ nextjs-turbopack-canary 55 0 20
✅ nextjs-turbopack-stable 72 0 3
✅ nextjs-webpack-canary 55 0 20
✅ nextjs-webpack-stable 72 0 3
✅ nitro-stable 66 0 9
✅ nuxt-stable 66 0 9
✅ sveltekit-stable 66 0 9
✅ vite-stable 66 0 9
✅ 🪟 Windows
App Passed Failed Skipped
✅ nextjs-turbopack 72 0 3
❌ 🌍 Community Worlds
App Passed Failed Skipped
✅ mongodb-dev 3 0 2
❌ mongodb 52 3 3
✅ redis-dev 3 0 2
❌ redis 53 2 3
✅ turso-dev 3 0 2
❌ turso 4 51 3
✅ 📋 Other
App Passed Failed Skipped
✅ e2e-local-dev-nest-stable 66 0 9
✅ e2e-local-postgres-nest-stable 66 0 9
✅ e2e-local-prod-nest-stable 66 0 9

📋 View full workflow run

@github-actions
Contributor

github-actions bot commented Mar 18, 2026

📊 Benchmark Results

📈 Comparing against baseline from main branch. Green 🟢 = faster, Red 🔺 = slower.

workflow with no steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Express 0.045s (+12.6% 🔺) 1.006s (~) 0.961s 10 1.00x
💻 Local Nitro 0.046s (+8.1% 🔺) 1.006s (~) 0.959s 10 1.04x
🐘 Postgres Express 0.048s (-23.1% 🟢) 1.011s (~) 0.963s 10 1.07x
💻 Local Next.js (Turbopack) 0.048s 1.006s 0.957s 10 1.08x
🌐 Redis Next.js (Turbopack) 0.054s (-4.6%) 1.005s (~) 0.951s 10 1.22x
🐘 Postgres Next.js (Turbopack) 0.059s 1.011s 0.952s 10 1.33x
🐘 Postgres Nitro 0.060s (-6.1% 🟢) 1.011s (~) 0.951s 10 1.34x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 0.523s (+13.6% 🔺) 2.432s (+15.0% 🔺) 1.909s 10 1.00x
▲ Vercel Next.js (Turbopack) 0.531s (-28.0% 🟢) 2.621s (-3.9%) 2.089s 10 1.02x
▲ Vercel Express 0.621s (+13.0% 🔺) 2.469s (+6.4% 🔺) 1.849s 10 1.19x

🔍 Observability: Nitro | Next.js (Turbopack) | Express

workflow with 1 step

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 1.113s 2.005s 0.892s 10 1.00x
🐘 Postgres Express 1.115s (-2.8%) 2.012s (~) 0.898s 10 1.00x
🌐 Redis Next.js (Turbopack) 1.124s (~) 2.007s (~) 0.883s 10 1.01x
💻 Local Express 1.135s (+3.3%) 2.006s (~) 0.871s 10 1.02x
💻 Local Nitro 1.140s (+0.9%) 2.007s (~) 0.867s 10 1.02x
🐘 Postgres Next.js (Turbopack) 1.146s 2.012s 0.866s 10 1.03x
🐘 Postgres Nitro 1.151s (~) 2.013s (~) 0.863s 10 1.03x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.072s (-3.7%) 3.619s (-4.4%) 1.547s 10 1.00x
▲ Vercel Nitro 2.194s (+6.9% 🔺) 3.819s (+18.7% 🔺) 1.625s 10 1.06x
▲ Vercel Express 2.719s (+28.7% 🔺) 4.291s (+16.6% 🔺) 1.572s 10 1.31x

🔍 Observability: Next.js (Turbopack) | Nitro | Express

workflow with 10 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 10.761s (-1.7%) 11.042s (~) 0.281s 3 1.00x
🌐 Redis Next.js (Turbopack) 10.765s (~) 11.022s (~) 0.257s 3 1.00x
💻 Local Next.js (Turbopack) 10.776s 11.023s 0.247s 3 1.00x
💻 Local Express 10.908s (+2.4%) 11.023s (~) 0.115s 3 1.01x
🐘 Postgres Nitro 10.947s (~) 11.041s (~) 0.094s 3 1.02x
🐘 Postgres Next.js (Turbopack) 10.947s 11.374s 0.426s 3 1.02x
💻 Local Nitro 10.948s (~) 11.024s (~) 0.076s 3 1.02x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 17.128s (-1.0%) 19.146s (+1.9%) 2.019s 2 1.00x
▲ Vercel Next.js (Turbopack) 17.854s (+5.2% 🔺) 19.361s (+3.5%) 1.507s 2 1.04x
▲ Vercel Nitro 17.919s (+1.5%) 20.020s (+8.2% 🔺) 2.101s 2 1.05x

🔍 Observability: Express | Next.js (Turbopack) | Nitro

workflow with 25 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 26.618s (-2.2%) 27.053s (-3.6%) 0.434s 3 1.00x
🌐 Redis Next.js (Turbopack) 26.899s (~) 27.386s (+1.2%) 0.487s 3 1.01x
💻 Local Next.js (Turbopack) 27.178s 28.053s 0.875s 3 1.02x
🐘 Postgres Nitro 27.194s (~) 28.061s (~) 0.867s 3 1.02x
🐘 Postgres Next.js (Turbopack) 27.223s 28.065s 0.842s 3 1.02x
💻 Local Express 27.472s (+2.5%) 28.052s (+3.7%) 0.580s 3 1.03x
💻 Local Nitro 27.562s (~) 28.055s (~) 0.493s 3 1.04x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 44.575s (-1.1%) 45.973s (-0.8%) 1.398s 2 1.00x
▲ Vercel Express 44.762s (~) 46.273s (~) 1.511s 2 1.00x
▲ Vercel Next.js (Turbopack) 45.453s (+1.0%) 47.201s (~) 1.748s 2 1.02x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 50 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 53.172s (-2.1%) 54.095s (-1.8%) 0.924s 2 1.00x
🌐 Redis Next.js (Turbopack) 53.654s (~) 54.095s (~) 0.440s 2 1.01x
🐘 Postgres Next.js (Turbopack) 53.995s 54.104s 0.109s 2 1.02x
🐘 Postgres Nitro 54.279s (~) 55.101s (~) 0.823s 2 1.02x
💻 Local Next.js (Turbopack) 55.916s 56.101s 0.185s 2 1.05x
💻 Local Express 56.606s (+2.9%) 57.100s (+3.6%) 0.494s 2 1.06x
💻 Local Nitro 56.653s (~) 57.100s (~) 0.447s 2 1.07x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 95.438s (~) 96.638s (~) 1.200s 1 1.00x
▲ Vercel Express 95.831s (~) 97.331s (~) 1.500s 1 1.00x
▲ Vercel Nitro 96.484s (-7.4% 🟢) 98.789s (-6.2% 🟢) 2.305s 1 1.01x

🔍 Observability: Next.js (Turbopack) | Express | Nitro

Promise.all with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.219s (-4.6%) 2.011s (~) 0.793s 15 1.00x
🐘 Postgres Next.js (Turbopack) 1.255s 2.012s 0.756s 15 1.03x
🐘 Postgres Nitro 1.271s (-0.5%) 2.011s (~) 0.740s 15 1.04x
🌐 Redis Next.js (Turbopack) 1.377s (~) 2.006s (~) 0.629s 15 1.13x
💻 Local Express 1.501s (+3.0%) 2.006s (~) 0.505s 15 1.23x
💻 Local Nitro 1.504s (-1.3%) 2.006s (~) 0.501s 15 1.23x
💻 Local Next.js (Turbopack) 1.506s 2.005s 0.499s 15 1.24x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.530s (+3.5%) 3.846s (-4.0%) 1.315s 8 1.00x
▲ Vercel Express 2.551s (+6.5% 🔺) 4.030s (+7.1% 🔺) 1.479s 8 1.01x
▲ Vercel Nitro 3.324s (+36.8% 🔺) 4.916s (+46.9% 🔺) 1.591s 7 1.31x

🔍 Observability: Next.js (Turbopack) | Express | Nitro

Promise.all with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 2.355s (-4.0%) 3.012s (~) 0.656s 10 1.00x
🐘 Postgres Nitro 2.448s (-0.6%) 3.013s (~) 0.566s 10 1.04x
🐘 Postgres Next.js (Turbopack) 2.478s 3.013s 0.535s 10 1.05x
🌐 Redis Next.js (Turbopack) 2.566s (~) 3.008s (~) 0.442s 10 1.09x
💻 Local Next.js (Turbopack) 2.704s 3.007s 0.303s 10 1.15x
💻 Local Nitro 2.972s (+3.6%) 3.453s (+14.8% 🔺) 0.481s 9 1.26x
💻 Local Express 3.083s (+18.5% 🔺) 3.759s (+25.0% 🔺) 0.675s 8 1.31x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.515s (-6.4% 🟢) 3.967s (+9.3% 🔺) 1.452s 8 1.00x
▲ Vercel Express 2.639s (+10.8% 🔺) 3.859s (-0.5%) 1.219s 8 1.05x
▲ Vercel Next.js (Turbopack) 2.743s (+8.6% 🔺) 3.982s (-0.7%) 1.239s 8 1.09x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Promise.all with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 3.495s (-2.9%) 4.013s (~) 0.518s 8 1.00x
🐘 Postgres Nitro 3.588s (-0.7%) 4.014s (~) 0.426s 8 1.03x
🐘 Postgres Next.js (Turbopack) 3.755s 4.015s 0.260s 8 1.07x
🌐 Redis Next.js (Turbopack) 4.154s (+1.0%) 5.012s (+2.9%) 0.859s 6 1.19x
💻 Local Next.js (Turbopack) 7.150s 7.619s 0.469s 5 2.05x
💻 Local Nitro 8.043s (-0.7%) 8.519s (-5.6% 🟢) 0.477s 4 2.30x
💻 Local Express 8.175s (+20.4% 🔺) 8.773s (+25.1% 🔺) 0.598s 4 2.34x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 2.846s (-45.3% 🟢) 4.390s (-34.4% 🟢) 1.544s 7 1.00x
▲ Vercel Nitro 3.000s (+10.8% 🔺) 4.414s (+13.3% 🔺) 1.415s 7 1.05x
▲ Vercel Next.js (Turbopack) 3.349s (+8.6% 🔺) 4.966s (+5.9% 🔺) 1.616s 7 1.18x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

Promise.race with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 1.214s (-4.8%) 2.010s (~) 0.796s 15 1.00x
🐘 Postgres Nitro 1.258s (-1.2%) 2.011s (~) 0.753s 15 1.04x
🐘 Postgres Next.js (Turbopack) 1.271s 2.011s 0.740s 15 1.05x
🌐 Redis Next.js (Turbopack) 1.279s (-2.4%) 2.006s (~) 0.728s 15 1.05x
💻 Local Express 1.512s (+3.3%) 2.005s (~) 0.494s 15 1.25x
💻 Local Nitro 1.528s (~) 2.005s (~) 0.477s 15 1.26x
💻 Local Next.js (Turbopack) 1.571s 2.073s 0.502s 15 1.29x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 2.206s (-5.2% 🟢) 3.634s (-4.2%) 1.428s 9 1.00x
▲ Vercel Nitro 2.208s (-18.1% 🟢) 3.876s (-2.4%) 1.669s 8 1.00x
▲ Vercel Express 2.231s (-1.9%) 3.675s (-3.0%) 1.444s 9 1.01x

🔍 Observability: Next.js (Turbopack) | Nitro | Express

Promise.race with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 2.337s (-4.8%) 3.010s (~) 0.673s 10 1.00x
🐘 Postgres Nitro 2.447s (~) 3.012s (~) 0.565s 10 1.05x
🐘 Postgres Next.js (Turbopack) 2.494s 3.012s 0.518s 10 1.07x
🌐 Redis Next.js (Turbopack) 2.569s (+0.7%) 3.008s (~) 0.439s 10 1.10x
💻 Local Next.js (Turbopack) 2.924s 3.678s 0.753s 9 1.25x
💻 Local Nitro 2.981s (~) 3.454s (~) 0.473s 9 1.28x
💻 Local Express 3.046s (+10.0% 🔺) 3.760s (+21.0% 🔺) 0.714s 8 1.30x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.366s (-7.7% 🟢) 4.048s (+11.8% 🔺) 1.683s 8 1.00x
▲ Vercel Express 2.487s (+7.7% 🔺) 3.979s (+8.6% 🔺) 1.492s 8 1.05x
▲ Vercel Next.js (Turbopack) 2.878s (-2.6%) 4.150s (-4.1%) 1.272s 8 1.22x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Promise.race with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 3.502s (-2.5%) 4.014s (~) 0.513s 8 1.00x
🐘 Postgres Nitro 3.580s (-0.6%) 4.015s (~) 0.435s 8 1.02x
🐘 Postgres Next.js (Turbopack) 3.768s 4.015s 0.247s 8 1.08x
🌐 Redis Next.js (Turbopack) 4.191s (+0.7%) 4.725s (-2.9%) 0.534s 7 1.20x
💻 Local Nitro 8.503s (-0.7%) 9.023s (~) 0.520s 4 2.43x
💻 Local Express 8.588s (+18.5% 🔺) 9.023s (+16.2% 🔺) 0.435s 4 2.45x
💻 Local Next.js (Turbopack) 8.697s 9.018s 0.320s 4 2.48x

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 2.942s (+11.3% 🔺) 4.442s (+18.0% 🔺) 1.499s 7 1.00x
▲ Vercel Nitro 3.719s (+1.1%) 5.269s (+7.9% 🔺) 1.550s 6 1.26x
▲ Vercel Next.js (Turbopack) 3.749s (+16.6% 🔺) 5.463s (+18.2% 🔺) 1.714s 6 1.27x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

Stream Benchmarks (includes TTFB metrics)
workflow with stream

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 0.159s (-28.2% 🟢) 1.000s (+0.8%) 0.001s (-18.8% 🟢) 1.011s (~) 0.853s 10 1.00x
🌐 Redis Next.js (Turbopack) 0.187s (+2.9%) 1.000s (~) 0.002s (-11.8% 🟢) 1.007s (~) 0.820s 10 1.18x
💻 Local Next.js (Turbopack) 0.188s 1.000s 0.011s 1.017s 0.830s 10 1.18x
💻 Local Nitro 0.197s (~) 1.003s (~) 0.012s (~) 1.017s (~) 0.820s 10 1.24x
💻 Local Express 0.199s (+43.6% 🔺) 1.003s (~) 0.012s (+11.5% 🔺) 1.017s (~) 0.819s 10 1.25x
🐘 Postgres Next.js (Turbopack) 0.217s 1.002s 0.002s 1.014s 0.797s 10 1.37x
🐘 Postgres Nitro 0.224s (+2.8%) 0.995s (-0.5%) 0.001s (-6.7% 🟢) 1.013s (~) 0.789s 10 1.41x

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.627s (~) 2.396s (~) 0.668s (+12979.3% 🔺) 3.690s (-75.2% 🟢) 2.063s 10 1.00x
▲ Vercel Express 1.636s (+1.0%) 2.438s (-12.3% 🟢) 0.653s (+14413.3% 🔺) 3.686s (+10.6% 🔺) 2.050s 10 1.00x
▲ Vercel Next.js (Turbopack) 1.747s (+8.7% 🔺) 3.071s (+20.1% 🔺) 0.398s (+1897.5% 🔺) 4.149s (+32.7% 🔺) 2.402s 10 1.07x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Summary

Fastest Framework by World

Winner determined by most benchmark wins

| World | 🥇 Fastest Framework | Wins |
| --- | --- | --- |
| 💻 Local | Next.js (Turbopack) | 8/12 |
| 🐘 Postgres | Express | 12/12 |
| ▲ Vercel | Nitro | 5/12 |

Fastest World by Framework

Winner determined by most benchmark wins

| Framework | 🥇 Fastest World | Wins |
| --- | --- | --- |
| Express | 🐘 Postgres | 9/12 |
| Next.js (Turbopack) | 🌐 Redis | 4/12 |
| Nitro | 🐘 Postgres | 7/12 |

Column Definitions
  • Workflow Time: Runtime reported by workflow (completedAt - createdAt) - primary metric
  • TTFB: Time to First Byte - time from workflow start until first stream byte received (stream benchmarks only)
  • Slurp: Time from first byte to complete stream consumption (stream benchmarks only)
  • Wall Time: Total testbench time (trigger workflow + poll for result)
  • Overhead: Testbench overhead (Wall Time - Workflow Time)
  • Samples: Number of benchmark iterations run
  • vs Fastest: How much slower compared to the fastest configuration for this benchmark

Worlds:

  • 💻 Local: In-memory filesystem world (local development)
  • 🐘 Postgres: PostgreSQL database world (local development)
  • ▲ Vercel: Vercel production/preview deployment
  • 🌐 Turso: Community world (local development)
  • 🌐 MongoDB: Community world (local development)
  • 🌐 Redis: Community world (local development)
  • 🌐 Jazz: Community world (local development)

📋 View full workflow run

VaguelySerious and others added 2 commits March 18, 2026 13:26
c
Signed-off-by: Peter Wielander <mittgfu@gmail.com>
Restore and update test coverage that was reduced in the initial change.
The test suite now covers 15 scenarios (up from 9):

Flush-on-write behavior:
- write() resolves only after data reaches server
- Single chunk uses writeToStream (not writeToStreamMulti)
- Falls back to sequential writes when writeToStreamMulti unavailable
- Multiple sequential writes trigger separate flush cycles
- Concurrent writes wait for in-progress flush before buffering

Close behavior:
- closeStream called on close
- Remaining buffer flushed on close
- Empty buffer on close skips write methods

Abort & error handling:
- Abort discards buffer and skips closeStream
- Write errors propagate to caller
- Close errors propagate to caller
- Flush errors during write propagate to caller

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
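The single-chunk vs. multi-chunk dispatch these tests exercise might look roughly like the sketch below. Only the `writeToStream`/`writeToStreamMulti` names come from the tests above; the `World` interface shape and the `flushBatch` helper are illustrative assumptions.

```typescript
// Illustrative only: dispatch a flushed batch to the server, preferring the
// batched writeToStreamMulti when the world provides it.
interface World {
  writeToStream(chunk: string): Promise<void>;
  // Optional batched variant; not every world implements it.
  writeToStreamMulti?(chunks: string[]): Promise<void>;
}

async function flushBatch(world: World, batch: string[]): Promise<void> {
  if (batch.length === 0) return;           // empty buffer: skip write methods
  if (batch.length === 1) {
    await world.writeToStream(batch[0]);    // single chunk: plain write
  } else if (world.writeToStreamMulti) {
    await world.writeToStreamMulti(batch);  // batched write in one call
  } else {
    // Fallback: sequential writes preserve chunk order
    for (const chunk of batch) {
      await world.writeToStream(chunk);
    }
  }
}
```

This keeps the fallback path observable in tests: a world without `writeToStreamMulti` sees one `writeToStream` call per chunk, in order.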
Contributor

@vercel vercel bot left a comment


Additional Suggestion:

The abort() handler in BufferedWritableStream clears the flush timer but never settles the flushWaiters promises, causing the internal write() async function to hang forever on an unsettled promise.


@@ -0,0 +1,15 @@
---
title: Changelog
Member Author


This is a stub so we can add changelog entries for bigger changes later

* @vercel/workflow

packages/next/ @ijjk @vercel/workflow
packages/next/src @ijjk @vercel/workflow
Member Author


This ensures JJ doesn't get accidentally tagged on every release PR when only changesets change

@VaguelySerious VaguelySerious marked this pull request as ready for review March 18, 2026 21:17
@VaguelySerious VaguelySerious requested a review from a team as a code owner March 18, 2026 21:17
…sh timer but never settles the `flushWaiters` promises, causing the internal `write()` async function to hang forever on an unsettled promise.

This commit fixes the issue reported at packages/core/src/serialization.ts:560

**Bug explanation:**

In `BufferedWritableStream`, the `write()` method (line 530) pushes chunks to a buffer, schedules a flush via a timer, and then awaits a promise that is added to `flushWaiters` (line 545):

```js
await new Promise<void>((resolve, reject) => {
  flushWaiters.push({ resolve, reject });
});
```

Normally, when the flush timer fires (line 514), it captures `flushWaiters`, replaces it with a fresh array, calls `flush()`, and then resolves or rejects the captured waiters based on the flush result.

When `abort()` is called (line 560), it:

1.  Clears the flush timer (preventing the timer callback from ever running)
2.  Discards the buffer

But it does NOT settle the promises in `flushWaiters`. Since the timer that would have resolved/rejected those waiters has been cleared, and nothing else references them, those promises will **never settle**. The internal async function of `write()` is suspended at the `await` on line 545 and will remain suspended forever - it becomes a memory leak. The closures captured by those async functions (referencing `buffer`, `flushWaiters`, the world object, etc.) will also be retained in memory.

While the WritableStream infrastructure will reject the external write promise returned to consumers, the *internal* async function body doesn't get cancelled - JavaScript has no mechanism to cancel a suspended async function. It will hang indefinitely on the unsettled promise.

**Fix explanation:**

The fix adds code to the `abort()` handler to reject all pending `flushWaiters` before clearing the array. It:

1.  Captures the current `flushWaiters` into a local variable
2.  Replaces `flushWaiters` with an empty array (same pattern used in the timer callback)
3.  Rejects each waiter with the abort reason (or a default "Stream aborted" error)

This ensures the `write()` async functions' awaited promises settle (with rejection), allowing those async functions to complete and their closures to be garbage collected. The rejection from `w.reject()` will propagate up through the `write()` function, but since the stream is already being aborted, this is the correct behavior.
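A minimal sketch of the fix described above, assuming the same capture-and-replace pattern; the `BufferedWriter` class name is illustrative, and only the parts relevant to `abort()` are shown (the timer's flush body is elided):

```typescript
type Waiter = { resolve: () => void; reject: (err: unknown) => void };

// Illustrative stand-in for BufferedWritableStream, reduced to the abort path.
class BufferedWriter {
  private buffer: string[] = [];
  private flushWaiters: Waiter[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  write(chunk: string): Promise<void> {
    this.buffer.push(chunk);
    if (this.timer === null) {
      // Real code flushes and settles waiters here; elided for brevity.
      this.timer = setTimeout(() => { this.timer = null; }, 10);
    }
    // This is the promise that previously never settled after abort().
    return new Promise<void>((resolve, reject) => {
      this.flushWaiters.push({ resolve, reject });
    });
  }

  abort(reason?: unknown): void {
    if (this.timer !== null) {
      clearTimeout(this.timer); // the flush callback will never run...
      this.timer = null;
    }
    this.buffer = []; // ...and the buffered chunks are discarded.
    // The fix: capture-and-replace the waiter array (same pattern the timer
    // callback uses), then reject each waiter so the suspended write() bodies
    // settle and their closures can be garbage collected.
    const waiters = this.flushWaiters;
    this.flushWaiters = [];
    const err = reason ?? new Error("Stream aborted");
    for (const w of waiters) w.reject(err);
  }
}
```

After `abort()`, every pending `write()` promise rejects with the abort reason instead of hanging forever.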


Co-authored-by: Vercel <vercel[bot]@users.noreply.github.com>
Co-authored-by: VaguelySerious <mittgfu@gmail.com>
