
JSZip runs very slowly when called from a Bun.serve or node:http handler, but not when called from a CLI script (Node.js is unaffected) #11054

Closed
fructiferous opened this issue May 13, 2024 · 11 comments
Labels: linux (An issue that occurs on Linux), node.js (Compatibility with Node.js APIs), performance (An issue with performance)

Comments

@fructiferous

What version of Bun is running?

1.1.8+89d25807f

What platform is your computer?

Linux 6.6.30_1 x86_64 unknown

What steps can reproduce the bug?

Using Bun with JSZip (https://www.npmjs.com/package/jszip) to operate on a zip within the context of a Bun.serve or node:http handler yields an enormous performance degradation that is neither present when running the same code with Bun as a CLI script, nor when using Node.js.

In short:

  • Bun on the CLI + JSZip = fast
  • Node.js on the CLI + JSZip = fast
  • Bun + Bun.serve/node:http + JSZip = slow
  • Node.js + node:http + JSZip = very fast

This was tested using the latest bun alpine container. However, the problem also appears in an environment where Bun was installed with curl -fsSL https://bun.sh/install | bash -s "bun-v1.1.8", and I'm fairly sure it also occurs in the context of the official bun lambda runtime (https://github.com/oven-sh/bun/blob/main/packages/bun-lambda/runtime.ts) when deployed to AWS Lambda. I don't have scripts to reproduce the issue in those contexts, but the simple case below should be enough to trigger it; in any case, the problem is not specific to alpine.

The issue can be reproduced by using the following script, which can be invoked with either bun index.js to run the standalone version, or by using bun index.js serve, which will start a server on 127.0.0.1:8080 that can be poked with something like curl or wget to trigger the same code from the request handler:

index.js

import http from "node:http";
import JSZip from "jszip";

const data = "0123456789\n".repeat(2 ** 16);

const test = () =>
  new JSZip()
    .file("test", data)
    .generateInternalStream({ type: "nodebuffer" })
    .accumulate()
    .then((b) => `Buffer size: ${b.length}\n`);

const handler = async (req, res) =>
  res.end(req.method === "HEAD" ? "" : await test());

if (process.argv[2] === "serve") {
  const server = http.createServer(handler).listen(8080, "127.0.0.1");
  console.log("Server is up");
  process.on("SIGHUP", () => server.close());
} else {
  console.log(await test());
}

The script can be benchmarked with the following hyperfine tests:

test.sh

#!/bin/sh -e

hyperfine \
	--export-markdown test-cli.md \
	-L runtime bun,node \
	--warmup 10 \
	'{runtime} index.js'

hyperfine \
	--export-markdown test-server.md \
	-L runtime bun,node \
	--setup '
		nohup {runtime} index.js serve >/dev/null 2>&1 &
		until wget --spider -O- http://127.0.0.1:8080
			do sleep 1
		done
	' \
	--cleanup 'pkill -f {runtime}" index.js serve" || true' \
	--warmup 10 \
	'wget -qO- http://127.0.0.1:8080/{runtime}'

Here's a docker oneliner to run the test script:

docker run --rm -it -v $(pwd):/app -w /app oven/bun:1.1.8-alpine sh -c 'apk add nodejs hyperfine && su bun test.sh'

What is the expected behavior?

I would've expected the code in the request handler to run just as fast as the code in the standalone CLI version, probably faster because of the reduced overhead (since the runtime is already up and running).

What do you see instead?

The Bun request handler (this means both Bun.serve and node:http; the test script uses the latter, but the former performs similarly) appears to run orders of magnitude slower than the CLI version, an issue that isn't present when the code is run with Node.js.
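One way to make the gap visible without hyperfine (a sketch of my own, not part of the original repro; the timed helper name is hypothetical) is to wrap test() so each call logs its own duration directly in the server's console:

```javascript
// Sketch: wrap an async function so every call logs how long it took.
// Calling the wrapped function from both the CLI path and the request
// handler makes the handler-vs-CLI gap visible per call.
const timed = (label, fn) => async (...args) => {
  const start = performance.now();
  const result = await fn(...args);
  console.log(`${label}: ${(performance.now() - start).toFixed(1)} ms`);
  return result; // pass the wrapped function's result through unchanged
};

// e.g. in the repro above: const timedTest = timed("zip", test);
```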

Test results:

test-cli.md

| Command | Mean [ms] | Min [ms] | Max [ms] | Relative |
|:---|---:|---:|---:|---:|
| bun index.js | 53.6 ± 2.6 | 49.7 | 59.3 | 1.00 |
| node index.js | 82.9 ± 3.1 | 77.4 | 90.1 | 1.55 ± 0.09 |

test-server.md

| Command | Mean [ms] | Min [ms] | Max [ms] | Relative |
|:---|---:|---:|---:|---:|
| wget -qO- http://127.0.0.1:8080/bun | 713.3 ± 17.6 | 688.5 | 738.7 | 136.51 ± 15.12 |
| wget -qO- http://127.0.0.1:8080/node | 5.2 ± 0.6 | 3.3 | 7.3 | 1.00 |

Additional information

The larger the zip file, the bigger the slowdown appears to be (~200x for a decently sized ~1MB zip, if I recall correctly). Similar results were achieved when poking the server with cURL, just to eliminate wget as a suspect. I don't have any code prepped to explicitly verify this, but I did run some other tests to confirm JSZip produces the same output in all of these cases/contexts, so it appears to be purely a performance issue.

I'm currently working around this issue by running the affected code in a child process using Bun.spawnSync - it's not pretty, but it works.
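For anyone wanting the same workaround, here is a minimal sketch (mine, not the author's exact code; "zip-worker.js" and the helper name are hypothetical, and node:child_process.spawnSync is used since it behaves equivalently under Bun and Node, where the author used Bun.spawnSync):

```javascript
// Sketch of the child-process workaround: run the zip generation in a
// short-lived child so it executes outside the slow server context.
// "zip-worker.js" would build the zip with JSZip and write the bytes
// to stdout.
import { spawnSync } from "node:child_process";

const runInChild = (cmd, args) => {
  const result = spawnSync(cmd, args, { maxBuffer: 64 * 1024 * 1024 });
  if (result.status !== 0) {
    throw new Error(result.stderr ? result.stderr.toString() : "child failed");
  }
  return result.stdout; // Buffer with whatever the worker wrote to stdout
};

// e.g. in the request handler:
// const zipBytes = runInChild("bun", ["zip-worker.js"]);
```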

@fructiferous fructiferous added the bug label May 13, 2024
@Electroid
Contributor

This is very likely related to node:stream performance, which is also mentioned in #7428
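If node:stream is indeed the culprit, a stripped-down accumulator that mimics what JSZip's accumulate() does (a sketch of mine, not code from JSZip) should reproduce the handler-vs-CLI gap without JSZip in the loop:

```javascript
// Sketch: push many small chunks through a PassThrough stream and collect
// them into one Buffer, roughly what JSZip's accumulate() does internally.
// Running this inside vs outside a request handler should isolate whether
// the slowdown lives in the stream machinery.
import { PassThrough } from "node:stream";

const accumulate = () =>
  new Promise((resolve, reject) => {
    const stream = new PassThrough();
    const chunks = [];
    stream.on("data", (c) => chunks.push(c));
    stream.on("end", () => resolve(Buffer.concat(chunks)));
    stream.on("error", reject);
    // ~720 KB in 11-byte chunks, matching the repro's payload shape.
    for (let i = 0; i < 2 ** 16; i++) stream.write("0123456789\n");
    stream.end();
  });

const buf = await accumulate();
console.log(`Buffer size: ${buf.length}`); // → Buffer size: 720896
```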

@Electroid Electroid added the node.js and performance labels and removed the bug label May 14, 2024
@Fxlr8

Fxlr8 commented Sep 10, 2024

I have the same issue. Using Bun + Elysia and zipping a file in request handler. Bun 1.1.27

@Jarred-Sumner
Collaborator

Fixed in Bun v1.2.8.

| Runtime | Requests per second |
|:---|---:|
| Bun v1.2.8 | 620 |
| Node v23.10.0 | 416 |
| Bun v1.2.7 | 55 |

@dylan-conway
Member

Fixed by @cirospaciari in #18599

@fructiferous
Author

@Jarred-Sumner, @dylan-conway, I just ran the original test again and I'm still seeing a ~200x slowdown with Bun 1.2.8 (also with 1.2.9) compared to Node.js. I'm running docker run --rm -it -v $(pwd):/app -w /app oven/bun:1.2.8-alpine sh -c 'apk add nodejs hyperfine && su bun test.sh' with the index.js and test.sh from the original post.

As before, the CLI version is unaffected:

test-cli.md:

| Command | Mean [ms] | Min [ms] | Max [ms] | Relative |
|:---|---:|---:|---:|---:|
| bun index.js | 56.8 ± 2.0 | 53.4 | 62.4 | 1.00 |
| node index.js | 77.6 ± 2.1 | 73.7 | 82.8 | 1.37 ± 0.06 |

Whereas the version behind an HTTP server shows the discrepancy:

test-server.md:

| Command | Mean [ms] | Min [ms] | Max [ms] | Relative |
|:---|---:|---:|---:|---:|
| wget -qO- http://127.0.0.1:8080/bun | 704.1 ± 16.0 | 688.8 | 722.5 | 193.53 ± 24.43 |
| wget -qO- http://127.0.0.1:8080/node | 3.6 ± 0.5 | 2.7 | 6.0 | 1.00 |

@Jarred-Sumner What test did you run to obtain those numbers?

@dylan-conway
Member

Looks like this was fixed on macOS in v1.2.8, but not Linux. Reopening

@dylan-conway dylan-conway reopened this Apr 9, 2025
@dylan-conway dylan-conway added the linux label Apr 9, 2025
@cabralpinto

cabralpinto commented Apr 13, 2025

Having the same issue in a SvelteKit project running on Windows. Running zip.generateAsync takes a few seconds with Bun and only a couple of milliseconds with Vite.

@Jarred-Sumner
Collaborator

I can confirm this is now fixed on Linux in Bun v1.2.10. It should also be fixed on Windows, but I have not manually verified that.

| Command | Mean [ms] | Min [ms] | Max [ms] | Relative |
|:---|---:|---:|---:|---:|
| wget -qO- http://127.0.0.1:8080/bun-1.2.8 | 709.3 ± 11.4 | 687.4 | 725.1 | 218.82 ± 29.34 |
| wget -qO- http://127.0.0.1:8080/bun | 3.2 ± 0.4 | 2.2 | 4.9 | 1.00 |
| wget -qO- http://127.0.0.1:8080/node | 3.6 ± 0.5 | 2.6 | 5.2 | 1.12 ± 0.21 |

@cabralpinto

@Jarred-Sumner Thanks for answering, but running bun upgrade only takes me to v1.2.9, and I don't see v1.2.10 on the releases page. What am I missing?

@Jarred-Sumner
Collaborator

@cabralpinto 1.2.10 is currently in canary and not released yet, but you can run bun upgrade --canary

@cabralpinto

@Jarred-Sumner Thanks! I can confirm the issue is fixed on Windows in the canary version.
