Jupyterlab too slow to load on slow networks #3561
Comments
Newer lab releases (after 0.29) suffered some slow-down in my experience as well. It was bad enough that I was forced to stay on 0.29 for "production" until this is resolved. So here is an additional data point.
Hi @AbdealiJK and @jochym, as of JupyterLab 0.31 (available as …
I'll report any changes when I test on my staging install.
I still need to use the earlier versions for my current setup for a bunch of reasons. I understand uglify will reduce the size, but the data usage can be reduced even further by gzipping the file.
I have checked 0.30.6 with 0.9.0 and it is still unusably slow on my local system with a Google Drive notebook. At the same time I am working remotely on 0.29 + 0.8.0, so this is a definite regression.
We could use the webpack compression plugin, but I'm not sure how to configure Tornado to serve compressed files.
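For what it's worth, Tornado itself can gzip responses on the fly via the `compress_response` application setting (Tornado >= 4.0; older versions called it `gzip`). A minimal sketch, not the actual notebook server wiring:

```python
# Minimal Tornado app sketch showing on-the-fly gzip compression.
# compress_response=True installs Tornado's GZipContentEncoding
# transform, which gzips any response (including static files) for
# clients that send "Accept-Encoding: gzip".
import tornado.ioloop
import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        # Repetitive JS-like payload; compresses very well.
        self.write("console.log('hello');\n" * 1000)

def make_app():
    return tornado.web.Application(
        [(r"/", MainHandler)],
        compress_response=True,  # Tornado >= 4.0; was gzip=True before
    )

if __name__ == "__main__":
    make_app().listen(8888)
    tornado.ioloop.IOLoop.current().start()
```

This compresses dynamically on every request, so pre-compressed static files (or a fronting proxy) would still be cheaper CPU-wise.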
Additional data point: 0.31.2 has the same issue, slow loading from a remote server. It is slightly better but definitely not fixed. This is particularly bad with Google Drive hosted notebooks (jupyterlab/jupyterlab-google-drive#113), but it is bad with server-hosted notebooks as well.
https://stackoverflow.com/questions/16124908/python-tornado-compressing-static-files 😄 Serving compressed static files is supposed to be very simple to implement, although I've only done it for Django and Node.js, not Tornado.
Just to report: 0.31.5 got usable again. I would not say it is fixed, but it is much better. We can use JupyterLab with Google Drive again without frustration.
The nginx gzip module supports gzipping proxied responses: http://nginx.org/en/docs/http/ngx_http_gzip_module.html#gzip_proxied
I have just set up Jupyter on a server, behind an nginx reverse proxy as a gateway.
It is faster: the two initial .js files, each >1 MB, add up to <1 MB when served through nginx. It's also possible to limit the concurrent request rate and bandwidth.
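For reference, a minimal nginx config fragment along those lines (the upstream port and MIME types are illustrative; adjust for your setup):

```nginx
# Compress responses proxied from the Jupyter server.
gzip on;
gzip_proxied any;          # also compress responses to proxied requests
gzip_min_length 1024;      # skip tiny responses
gzip_comp_level 5;
gzip_types application/javascript text/css application/json;

location / {
    proxy_pass http://127.0.0.1:8888;
    proxy_http_version 1.1;
    proxy_set_header Host $host;
    proxy_set_header Upgrade $http_upgrade;   # websockets for kernels
    proxy_set_header Connection "upgrade";
}
```

Note that `gzip_types` must list `application/javascript` explicitly; by default nginx only compresses `text/html`.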
Our recent upgrade to webpack 4 split our huge JS bundle into several smaller bundles. Is this still an issue for anyone who was having problems above?
Hi, I use the latest Docker image of the datascience notebook, but there is still a huge JS file, and it takes about 20 seconds to load in a remote Chrome.
My vendors~main.xxxx.js is larger than 100 MB and is not even cached, see #5558.
@flixr - what is your jlab version, and what plugins do you have installed?
Using JupyterLab 1.0.1 with quite a few plugins; see the attached Dockerfile (uses https://github.com/jupyter/docker-stacks/blob/7a3e968dd21268c4b7a6746458ac34e5c3fc17b9/scipy-notebook/Dockerfile as base).
That is a lot of javascript you are bundling up in jlab with all those plugins. Thanks for being more specific about it.
Yeah, I was actually quite surprised that it amounts to so much...
I've also been seeing some very large JS bundles, though not quite 100 MB. Part of this could be mitigated by doing a production build.
@ian-r-rose thanks for the info, will try with …
I didn't have any luck with …
In my experience, when you have bundle sizes / build times like this, it is due to a large amount of duplication (the same dependency tree included multiple times because of incompatible version requirements). No matter the cause, this should help you troubleshoot:
This should allow you to see what is taking up space in your bundle (and/or share it with us). Try to identify any repeating patterns (e.g. a large package included in multiple places). If you find any such packages, please run …
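To make the "look for repeated packages" step concrete, here is a small, hypothetical stdlib-only Python helper (not part of JupyterLab or webpack) that walks a node_modules tree and reports packages vendored more than once:

```python
# Hypothetical helper: count how many copies of each package live in a
# node_modules tree. Multiple copies of a large package are the usual
# cause of bloated webpack bundles.
import json
import os
from collections import Counter

def find_duplicate_packages(root):
    """Return {package_name: count} for packages appearing more than once.

    Note: this simple version skips @scoped packages, whose directories
    sit one level deeper (node_modules/@scope/pkg).
    """
    counts = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        # A package root is a directory that sits directly inside some
        # node_modules/ directory and contains a package.json.
        if ("package.json" in filenames
                and os.path.basename(os.path.dirname(dirpath)) == "node_modules"):
            try:
                with open(os.path.join(dirpath, "package.json")) as f:
                    name = json.load(f).get("name")
            except (OSError, json.JSONDecodeError):
                continue
            if name:
                counts[name] += 1
    return {name: n for name, n in counts.items() if n > 1}
```

Running it over the app directory (e.g. `find_duplicate_packages(os.path.expanduser("~/.jupyter/lab/staging/node_modules"))`) gives a quick duplication summary without any webpack tooling.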
Unfortunately I'm not really familiar with node and the like...
And the size it reports for … As far as I can see there is not much duplication; plotly and beakerx are the biggest chunks...
Ah, got it: …
@flixr @ian-r-rose the solution is to use --dev-build=False everytime you install something. the javascript went down from 55 to 8 Megabyte. The developers should make this default... |
@juliusvonkohout as I already mentioned, this didn't work for me:
While a minimizing prod build still times out, I got the size of the vendors~main.xxx.js down by turning off …
before your …
@flixr Try `jupyter lab build --minimize=False --dev-build=False`. If you don't have any locally linked/installed labextensions, then you can also drop the …
Thanks @telamonian and @flixr for sharing your solutions!
I am using JupyterLab deployed on a machine in the cloud. It seems to be too slow to open the main /lab page when I try to open it in my browser.
My network speed is very slow, so I am OK with that, but I am trying to see how I can optimize it.
I can see that the bundle.js file is the biggest file being loaded, and it takes about 5 minutes to download. When I manually gzipped the file, I noticed that its size dropped to about a quarter of the original.
I was under the impression that the notebook server / JupyterLab would auto-handle this sort of optimization and send the file gzipped, since Chrome accepts gzipped files.
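The one-quarter figure is plausible for JavaScript. A quick stdlib check of how well text-heavy content gzips (the payload here is a repetitive stand-in, so the exact ratio is illustrative only):

```python
# Demonstrates gzip's effect on repetitive JS-like text.
import gzip

payload = ("function f(){return 42;}\n" * 10000).encode()
compressed = gzip.compress(payload)
print(f"{len(payload)} bytes -> {len(compressed)} bytes "
      f"({len(compressed) / len(payload):.1%} of original)")
```

Real minified bundles are less repetitive than this, which is where ratios like 4:1 come from.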
Do I need to enable any config for this sort of compression, or what else can I do to enable it?
Any other suggestions to speed it up would be great too! The target connection I want to cater to is 50 KB/s.
PS: I am using Jupyterhub+Notebook+Lab