
Investigate CI performance #7318

Open
ulope opened this issue Aug 24, 2021 · 3 comments
Labels
Component / CI, Component / Tooling, Flag / Tech Debt, Flag / Testing, Type / Optimization

Comments

ulope (Collaborator) commented Aug 24, 2021

Problem Definition

Over the last couple of months our CI performance has noticeably degraded.
The current average workflow runtime seems to be 20+ minutes.

Ideally we want to return to the 10-15 minute range.

Things to investigate:

  • Where is the time being spent?
    • Workspace upload/download
      • This seems relatively slow, especially compared to the cache
      • Maybe move some files (e.g. the Python venv) to the cache and keep only config / settings in the workspace
    • Caching (is it working as intended?)
  • Are we being throttled by concurrency limits? (Conversation with CircleCI @karlb @czepluch)
  • Can we improve test run-time through further parallelization? (diminishing returns due to spin-up overhead)
  • Can the integration tests be optimized? For example (see the fixture sketch after this list):
    • Possibility of not restarting the eth node / synapse / etc. for every test
    • Do the contracts have to be deployed for every test?
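
To illustrate the last two points, here is a minimal sketch (not Raiden's actual test code) of how session-scoped pytest fixtures could keep a single eth node and a single contract deployment alive for the whole integration test run instead of restarting and redeploying per test. The start_eth_node and deploy_contracts helpers are hypothetical placeholders for whatever the suite actually uses.

```python
import pytest


def start_eth_node():
    """Hypothetical stand-in: start a dev chain and return a handle with a stop() method."""
    raise NotImplementedError("replace with the suite's real node startup")


def deploy_contracts(node):
    """Hypothetical stand-in: deploy the test contracts once and return their addresses."""
    raise NotImplementedError("replace with the suite's real deployment step")


@pytest.fixture(scope="session")
def eth_node():
    # scope="session": the node is started once per pytest run instead of once per test.
    node = start_eth_node()
    yield node
    node.stop()


@pytest.fixture(scope="session")
def deployed_contracts(eth_node):
    # Deploy once and reuse the addresses in every test; per-test isolation would
    # then have to come from e.g. chain snapshot/revert rather than fresh deployments.
    return deploy_contracts(eth_node)
```

Whether this is viable depends on how much per-test isolation the existing tests assume; some snapshot/revert mechanism in the eth backend would likely be needed to keep tests independent.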
ulope added the Component / CI, Type / Optimization, Flag / Testing, Flag / Tech Debt, and Component / Tooling labels on Aug 24, 2021
ulope added this to Backlog in Raiden Berlin Sprint on Aug 24, 2021
czepluch (Contributor) commented:

I can set up a meeting with CircleCI if we believe that would be helpful.

karlb (Contributor) commented Aug 27, 2021

Are we being throttled by concurrency limits? (Conversation with CircleCI @karlb @czepluch)

Yes, we are. I have been contacted by a CircleCI employee who would like to talk to us about it. I'll forward the contact details to both of you again.

istankovic (Contributor) commented:

Today we discussed the possibility of bumping the minimum required Python version to 3.8 (@palango, @ezdac).
If we decide to go that way, we can simply remove all of the Python 3.7-specific parts of our CI, which I guess would help somewhat.
