
Session requests are ~70% slower in version 2.26.0 #5891

Closed
omermizr opened this issue Jul 28, 2021 · 5 comments

When using a session, requests are slower than in previous versions.
My metrics indicated a performance degradation, and after some testing I suspect the upgrade to requests 2.26.0 is the root cause.

Expected Result

Same performance as in version 2.25

Actual Result

When I use a session, requests are about 70% slower than in previous versions.
This seems to be the PR that caused the regression: #5681
I removed the rebuild_proxies call locally and performance shot back up.
I ran a benchmark of 1000 real-world requests using the same session.
In version 2.25.1 it took ~13s; in version 2.26.0 it takes ~23s.
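
A minimal sketch of that kind of benchmark (the URL is a placeholder, not the real-world endpoint behind the numbers above):

```python
import time

import requests

URL = "https://example.com/"  # placeholder; the original benchmark hit a real-world endpoint

session = requests.Session()
start = time.perf_counter()
for _ in range(1000):
    session.get(URL)
elapsed = time.perf_counter() - start
print(f"1000 session requests took {elapsed:.1f}s")
```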

System Information

$ python -m requests.help
{
  "chardet": {
    "version": "3.0.4"
  },
  "charset_normalizer": {
    "version": "2.0.3"
  },
  "cryptography": {
    "version": "3.4.7"
  },
  "idna": {
    "version": "2.10"
  },
  "implementation": {
    "name": "CPython",
    "version": "3.7.10"
  },
  "platform": {
    "release": "5.4.0-1055-azure",
    "system": "Linux"
  },
  "pyOpenSSL": {
    "openssl_version": "101010bf",
    "version": "19.1.0"
  },
  "requests": {
    "version": "2.26.0"
  },
  "system_ssl": {
    "version": "101000cf"
  },
  "urllib3": {
    "version": "1.26.6"
  },
  "using_charset_normalizer": false,
  "using_pyopenssl": true
}


@omermizr (Author)

I dug a little deeper and it looks like all the extra time is spent inside getproxies_environment, called from requests.utils.proxy_bypass (utils.py line 781).
We are deployed in a Kubernetes cluster and have a lot of environment variables (around 530 at the moment), so it makes sense that iterating over them on every request takes time.
I cached getproxies_environment's result and performance is back up.

Note: My OS is Ubuntu 18.04.
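
A minimal sketch of one way to cache that lookup (it assumes the proxy-related environment variables never change while the process runs, and the attribute patched on requests.utils is an assumption about how the installed version imports the helper, so treat it as illustrative):

```python
import functools
import urllib.request

import requests.utils

# Wrap the stdlib helper so the environment is only scanned once.
# Assumes proxy-related environment variables do not change at runtime.
_cached = functools.lru_cache(maxsize=1)(urllib.request.getproxies_environment)

urllib.request.getproxies_environment = _cached
# requests may import the helper by name, so patch its copy as well
# (attribute name is an assumption about the installed requests version).
if hasattr(requests.utils, "getproxies_environment"):
    requests.utils.getproxies_environment = _cached
```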

dbaxa added a commit to dbaxa/requests that referenced this issue Jul 29, 2021
…building proxies if proxies have been supplied.

Signed-off-by: David Black <dblack@atlassian.com>

bityob commented Jul 29, 2021

Thanks @omermizr for finding this issue.

In this gist I have a full repro of the latency issue, comparing the requests 2.26 and 2.25.1 packages.

We can see that when there are 500 environment variables, there is a huge difference between the versions (screenshots: requests-2.26-env-count-500-result.png, requests-2.25.1-env-count-500-result.png).

But without any extra environment variables, there is no difference at all (screenshots: requests-2.26-env-count-0-result.png, requests-2.25.1-env-count-0-result.png).

See the gist for the full details and instructions for running it yourself:
https://gist.github.com/bityob/02b1a823b478ad74a8440b86f95e13a6/
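
A rough standalone illustration of the same effect (no network involved), timing just the standard-library environment scan with and without extra variables; the variable count and names are arbitrary:

```python
import os
import time
from urllib.request import getproxies_environment


def time_lookups(n_calls=1000):
    """Time n_calls invocations of the stdlib proxy environment scan."""
    start = time.perf_counter()
    for _ in range(n_calls):
        getproxies_environment()
    return time.perf_counter() - start


print(f"small environment:        {time_lookups():.3f}s")

# Simulate a large Kubernetes-style environment with many unrelated variables.
for i in range(500):
    os.environ[f"DUMMY_VAR_{i}"] = "x"

print(f"with 500 extra variables: {time_lookups():.3f}s")
```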


olka commented Oct 30, 2021

Any updates on this one?
Any hints on how to work around it?
Our application spends 80% of its time decoding huge JSON payloads and rebuilding the list of proxies 😅
(Screenshot: profiler trace, 2021-10-31.)


ykharko commented Nov 17, 2021

Hi. Do you know when this issue might be fixed and when a new version of the library might be released?
Are there any plans for that?

@nateprewitt (Member)

We expect the current fix to go out with the 2.27.0 release, which is currently scheduled for this coming Monday. There is still an edge case where we may see a slowdown in environments with a large number of proxies available. The long-term recommendation is to pass the proxies parameter directly with the call to circumvent this lookup. We unfortunately don't have a better way to handle this, since it's a fundamental problem with the search functionality in the standard library.
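
For example (a minimal sketch of that recommendation; the proxy and target URLs are placeholders):

```python
import requests

session = requests.Session()
proxies = {
    "http": "http://proxy.example.com:8080",   # placeholder proxy URL
    "https": "http://proxy.example.com:8080",
}

# Supplying proxies explicitly on the call avoids the per-request
# environment lookup, per the recommendation above.
response = session.get("https://example.com/", proxies=proxies)
```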

Resolving now that the fixes have been merged for release.

github-actions bot locked as resolved and limited conversation to collaborators Mar 30, 2022