
serving memory stored static files? #964

Open
Uzlopak opened this issue Nov 19, 2023 · 8 comments
Labels
help wanted (Extra attention is needed)

Comments

@Uzlopak

Uzlopak commented Nov 19, 2023

Multiple times I have felt that I wanted to serve a handful of static files, like favicon.ico, a CSS file, an HTML file, and a frontend JS file. Together they amount to maybe 1-2 MB. But using fastify-static feels wrong: I just want to serve these files over and over, and they never change.

fastify-static reads the file from disk on every request, and it regenerates the ETag every time.

What I could do is explicitly load the files via fs.readFileSync, keep them as Buffers, and send them directly from the handler.

Of course I am simplifying my case a lot. E.g. @fastify/send also handles ETags and the like, and compression is not implemented either. But well... it's a PoC.

For comparison, the example/server.js from fastify-static, extended with a route that serves the pre-buffered file:

'use strict'

const fs = require('node:fs')
const path = require('node:path')
const fastify = require('fastify')({ logger: false })

// Read index.html once at startup and keep it in memory as a Buffer.
const ondex = fs.readFileSync(path.join(__dirname, '/public/index.html'))

fastify
  .register(require('../'), {
    // An absolute path containing static files to serve.
    root: path.join(__dirname, '/public')
  })
  // Serve the buffered copy at /ondex.html, to compare against /index.html
  // served from disk by fastify-static.
  .get('/ondex.html', (req, reply) => {
    reply
      .type('text/html')
      .send(ondex)
  })
  .listen({ port: 3000 }, err => {
    if (err) throw err
  })

index.html (standard fastify-static)

aras@aras-Lenovo-Legion-5-17ARH05H:~/workspace/fastify-static$ autocannon http://localhost:3000/index.html
Running 10s test @ http://localhost:3000/index.html
10 connections


┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 3 ms  │ 3 ms │ 0.28 ms │ 0.71 ms │ 24 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬────────┬─────────┬─────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%    │ 97.5%   │ Avg     │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼────────┼─────────┼─────────┼─────────┼─────────┤
│ Req/Sec   │ 6103    │ 6103    │ 9991   │ 10831   │ 9738.6  │ 1271.02 │ 6101    │
├───────────┼─────────┼─────────┼────────┼─────────┼─────────┼─────────┼─────────┤
│ Bytes/Sec │ 3.11 MB │ 3.11 MB │ 5.1 MB │ 5.53 MB │ 4.97 MB │ 648 kB  │ 3.11 MB │
└───────────┴─────────┴─────────┴────────┴─────────┴─────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 10

97k requests in 10.02s, 49.7 MB read

ondex.html (just buffered)

aras@aras-Lenovo-Legion-5-17ARH05H:~/workspace/fastify-static$ autocannon http://localhost:3000/ondex.html
Running 10s test @ http://localhost:3000/ondex.html
10 connections


┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 0 ms  │ 0 ms │ 0.01 ms │ 0.12 ms │ 28 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┼─────────┤
│ Req/Sec   │ 28239   │ 28239   │ 39263   │ 39519   │ 38208.73 │ 3170.32 │ 28237   │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼─────────┼─────────┤
│ Bytes/Sec │ 10.4 MB │ 10.4 MB │ 14.4 MB │ 14.5 MB │ 14 MB    │ 1.16 MB │ 10.4 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 11

420k requests in 11.01s, 154 MB read
Uzlopak added the help wanted label Nov 19, 2023
@jsumners
Member

What is the question?

@Uzlopak
Author

Uzlopak commented Nov 19, 2023

Is there an existing plugin?
Should I/we invest in such a plugin?

@jsumners
Member

No such plugin exists that I know of. My opinion is that static assets are better served by a tool designed for serving them, e.g. nginx.

@Uzlopak
Author

Uzlopak commented Nov 19, 2023

That is the obvious answer, and I thought of it too. But in the end, sometimes you just want to serve a bunch of files.

@mcollina
Member

This can be a huge footgun, as the content of the files might cause the process to exit with an out-of-memory error.

My guess is that we could increase the perf of @fastify/static by dropping this PassThrough: https://github.com/fastify/fastify-static/blob/37056ce4a277004bb118a4ef0b92bdbaf4bf4d20/index.js#L232-L240. Each "stage" you add to a streaming pipeline halves the throughput, so I think we could get a significant boost.
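For illustration only (this is not the actual fastify-static code, and the handler names are made up), a minimal sketch of the two pipeline shapes:

'use strict'

const fs = require('node:fs')
const { PassThrough } = require('node:stream')

// Current shape: the file stream is piped through an extra PassThrough,
// which adds one more stage (and its buffering) to every request.
function sendViaPassThrough (reply, filePath) {
  const wrap = new PassThrough()
  fs.createReadStream(filePath).pipe(wrap)
  reply.type('text/html').send(wrap)
}

// Leaner shape: hand the readable stream to the reply directly,
// removing the intermediate stage.
function sendDirect (reply, filePath) {
  reply.type('text/html').send(fs.createReadStream(filePath))
}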

A slightly better approach would be to overhaul @fastify/send to include a full cache for files under 5 MB. This kind of cache is what WeakRef and FinalizationRegistry excel at.
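A minimal sketch of such a cache, assuming WeakRef and FinalizationRegistry are available (Node.js 14.6+); the names and the 5 MB limit are illustrative, not the @fastify/send implementation:

'use strict'

const fs = require('node:fs')

const MAX_CACHEABLE_BYTES = 5 * 1024 * 1024

// path -> WeakRef<Buffer>: entries do not keep the Buffers alive,
// so the cache never forces the process to retain memory it cannot spare.
const cache = new Map()

// When a cached Buffer is garbage collected, drop its stale WeakRef entry.
const registry = new FinalizationRegistry((path) => cache.delete(path))

function readCached (path) {
  const hit = cache.get(path)?.deref()
  if (hit !== undefined) return hit

  const buf = fs.readFileSync(path)
  if (buf.byteLength <= MAX_CACHEABLE_BYTES) {
    cache.set(path, new WeakRef(buf))
    registry.register(buf, path)
  }
  return buf
}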

@barelyhuman

@mcollina could you give a little more detail about the caching logic you expect?

Draft PR: barelyhuman/send#1

If that's correct, then the cache wouldn't work because the reference to the complete buffer would be lost the moment the stream ends.

If what's implemented is wrong, then I'd like to understand what the expectation of the cache was.

@mcollina
Member

The store would need to be passed in to send() and not constructed on demand.
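As a rough sketch of that pattern (function and option names are illustrative, not the @fastify/send API): the store is created once by the caller and handed to the sending routine, so cached entries survive across requests instead of dying with a store that is constructed on demand.

'use strict'

const fs = require('node:fs')

// Created once, outside any handler, and shared by every call.
const store = new Map()

// The sending routine receives the store instead of building its own,
// so the second request for the same path can be served from memory.
function sendFile (reply, filePath, cacheStore) {
  let buf = cacheStore.get(filePath)
  if (buf === undefined) {
    buf = fs.readFileSync(filePath)
    cacheStore.set(filePath, buf)
  }
  reply.type('text/html').send(buf)
}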

@barelyhuman

I've made changes to it; please confirm whether it matches the expectation.

I also wanted to point out that there's an issue in the tests where the reference stays alive when using a stream. I'm unaware of the reason, so I'm still looking into it.
