fix(bun): Consume fetch response body to prevent memory leak #19927
Karavil wants to merge 4 commits into getsentry:develop
Conversation
Bun's fetch implementation retains the backing ArrayBuffer of unconsumed response bodies indefinitely. This causes a memory leak when sending many Sentry envelopes, as each response's ArrayBuffer accumulates in memory. This applies the same fix that was made for the Cloudflare transport in getsentry#18545 — consuming the response body with response.text() after extracting the needed headers. In production, this leak manifests as ~8KB ArrayBuffers accumulating at ~8/sec (one per envelope), leading to OOM kills after ~5 hours on containers with 4GB memory limits.
Semver Impact of This PR: 🟢 Patch (bug fixes)
Use `void response.text().catch(() => {})` instead of `await response.text()`
to avoid adding latency to Sentry envelope sends. We don't need the body
content, just need to trigger the drain so Bun can free the ArrayBuffer.
@s1gr1d @JPeer264 — This applies the same response body consumption fix you shipped for the Cloudflare transport in #18545, now for the Bun transport. We ran into this in production: 130K leaked ArrayBuffers (~1GB) causing OOM kills every ~5h. One improvement over the Cloudflare fix: we use `void response.text().catch(() => {})` instead of `await response.text()`, so draining adds no latency to envelope sends.
Cursor Bugbot has reviewed your changes and found 1 potential issue.
The `void response.text().catch(() => {})` pattern only handles async
promise rejections. If `response.text` is not a function, a synchronous
TypeError is thrown before `.catch()` is reached, rejecting the entire
promise chain and preventing the transport from returning status/headers.
Wrap in try/catch to handle both:
- try/catch: synchronous TypeError (response.text not a function)
- .catch(): async rejection (body read fails mid-stream)
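The hardened drain suggested by the review could be sketched as follows (the helper name `drainResponseBody` is illustrative, not from the PR):

```typescript
// Hedged sketch: swallow both failure modes so the transport can always
// return its status/headers.
function drainResponseBody(response: Response): void {
  try {
    // Async rejection: the body read fails mid-stream.
    void response.text().catch(() => {});
  } catch {
    // Synchronous throw: response.text is not a function on an unexpected
    // response shape.
  }
}
```

Both failure modes are intentionally silent: the drain exists only to let the runtime free the body's backing memory, so nothing useful can be done with the error.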

Summary
Bun's `fetch` retains the backing `ArrayBuffer` of unconsumed response bodies indefinitely. The Bun transport's `makeFetchTransport` calls `fetch()` but never consumes the response body, causing a memory leak when sending Sentry envelopes. This is the same class of bug that was fixed for the Cloudflare transport in #18545. We improve on that fix by using `void response.text().catch(() => {})` instead of `await response.text()`: the `void` approach drains the body asynchronously, adding zero latency to Sentry envelope sends. We don't need the response content, just need to trigger the drain so Bun releases the backing `ArrayBuffer`.

Bun fetch memory leak: documented and ongoing
Bun has a well-documented history of fetch response bodies not being garbage collected. Despite multiple fixes across releases, the core problem persists for the simplest case (unconsumed bodies):
Merged fixes (proving the problem exists)
- fix(fetch): allow Response to be GC'd before all request body received
- refactor(Response): isolate body usage (fetch memory leak fix)
- fix(fetch): fix ReadableStream memory leak when using stream body
- fix(http): fix Strong reference leak in server response streaming
- fix: release ReadableStream Strong ref on fetch body cancel (260KB leaked per cancelled request)

Still-open issues
- `cancel()` is not called

Root cause (from #10763): Bun holds a strong reference to the ReadableStream backing the response body. If never consumed, the strong ref prevents GC.
Improvement over the Cloudflare transport fix (#18545)
The Cloudflare transport fix in #18545 used `await response.text()`, which blocks the transport until the body is fully read. This adds unnecessary latency to every Sentry envelope send.

Our fix uses `void response.text().catch(() => {})` instead:

- No `async` needed: the `.then()` callback stays synchronous, matching the original signature
- `.catch(() => {})`: if the drain fails for any reason, it doesn't crash the transport

This is a strictly better pattern for any transport where we don't need the response body content.
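An illustrative (non-Sentry) timing sketch of the difference, using a fake response whose `text()` takes 50ms to stand in for a slow body read:

```typescript
// Fake response standing in for a slow body read (names are illustrative).
const fakeResponse = {
  text: (): Promise<string> =>
    new Promise(resolve => setTimeout(() => resolve('body'), 50)),
};

// `await` variant: the caller is blocked until the body is fully read.
async function sendWithAwait(): Promise<number> {
  const start = Date.now();
  await fakeResponse.text();
  return Date.now() - start; // roughly the body-read time (~50ms here)
}

// `void` variant: the drain is fire-and-forget; the caller returns immediately.
function sendWithVoid(): number {
  const start = Date.now();
  void fakeResponse.text().catch(() => {});
  return Date.now() - start; // ~0ms
}

const voidMs = sendWithVoid();
sendWithAwait().then(awaitMs => {
  console.log(`await: ${awaitMs}ms, void: ${voidMs}ms`);
});
```

The drain still completes in the background in both variants; only the caller's latency differs.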
Production evidence (heap snapshots)
We observed this leak in production on a Bun service running `@sentry/bun`: heap snapshots showed ~8KB `ArrayBuffer`s accumulating, retained via the response body's `get buffer` accessor. After applying this fix, the leak was eliminated.
Reproduction
Expected: RSS stays flat. Observed on Bun 1.3.x: RSS grows linearly.
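A hedged reproduction sketch (the local server, body size, and iteration count are all illustrative). Run with `bun repro.ts` against an affected Bun 1.3.x:

```typescript
import * as http from 'node:http';

// Local server returning ~8KB bodies, roughly the size of an ingest response.
const server = http.createServer((_req, res) => {
  res.end('x'.repeat(8 * 1024));
});

server.listen(0, async () => {
  const { port } = server.address() as { port: number };
  for (let i = 0; i < 100; i++) {
    // The response body is intentionally never consumed. On affected Bun
    // versions each backing ArrayBuffer is retained, so RSS grows linearly.
    await fetch(`http://127.0.0.1:${port}/`);
  }
  const rssMb = process.memoryUsage().rss / 1024 / 1024;
  console.log(`RSS after 100 unconsumed fetches: ${rssMb.toFixed(1)} MB`);
  server.close();
  // Exit explicitly: unconsumed sockets can otherwise keep the event loop alive.
  process.exit(0);
});
```

Adding `void response.text().catch(() => {})` after each `fetch()` call (or raising the iteration count and watching RSS over time) makes the contrast visible.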