
Screenshots can be very slow #3069

Closed · lukeapage opened this issue Jan 6, 2019 · 5 comments · Fixed by #4038


lukeapage commented Jan 6, 2019

Current behavior:

Taking a screenshot takes a long time. I have a high-DPI monitor and it can take ~21 seconds per screenshot.

I suspect the extreme duration is down to the high DPI, which roughly quadruples the pixel data, but even at normal DPI it takes several seconds.
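For scale, a back-of-envelope calculation using the capture dimensions from the logs below:

```js
// Decoded RGBA size of one high-DPI capture (3841 x 1913, 4 bytes per pixel).
const bytes = 3841 * 1913 * 4;
console.log((bytes / 1024 / 1024).toFixed(1) + ' MB'); // ~28.0 MB per screenshot
```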

If you take 2-3 screenshots per test and have 200 tests, that adds up to a significant amount of time.


Logs:

Sat, 05 Jan 2019 15:20:09 GMT cypress:server:screenshot (s2) capture and check { tries: 1, totalDuration: 0 }
Sat, 05 Jan 2019 15:20:11 GMT cypress:server:screenshot (s2) received screenshot data from automation layer data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAADwEAAAd5CAIAAAAhWRY6AAAgAElEQVR4nOzde3Bc5Z0n/Kdbat3llt
Sat, 05 Jan 2019 15:20:11 GMT cypress:server:screenshot parsing png
Sat, 05 Jan 2019 15:20:18 GMT cypress:server:screenshot got png
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) read buffer to image 3841 x 1913
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) helper pixels 
 { topLeft: { r: 68, g: 68, b: 68, a: 255, isNotWhite: true },
  topLeftRight: { r: 68, g: 68, b: 68, a: 255, isWhite: false },
  topLeftDown: { r: 68, g: 68, b: 68, a: 255, isWhite: false },
  bottomLeft: { r: 0, g: 0, b: 0, a: 255, isWhite: false },
  topRight: { r: 0, g: 0, b: 0, a: 255, isWhite: false },
  bottomRight: { r: 0, g: 0, b: 0, a: 255, isBlack: true } }
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) pixelConditionFn { pixelRatio: 2.2501464557703574, subject: 'app', hasPixels: false, expectedPixels: false }
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) no previous image to compare
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) resolving with image { tries: 1, totalDuration: 0 }
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) pixel ratio is 2.2501464557703574
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) multi-part 1/1
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) dimensions before are { x: 0, y: 0, width: 1024, height: 600 }
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) dimensions for cropping are { x: 0, y: 0, width: 2304.149970708846, height: 1350.0878734622145 }
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2) crop: from 0, 0
Sat, 05 Jan 2019 15:20:19 GMT cypress:server:screenshot (s2)         to 2304.149970708846 x 1350.0878734622145
Sat, 05 Jan 2019 15:20:23 GMT cypress:server:screenshot (s2) save filename.png
Sat, 05 Jan 2019 15:20:33 GMT cypress:server:screenshot (s2) saving..
Sat, 05 Jan 2019 15:20:33 GMT cypress:server:screenshot (s2) saved

I tried using pngjs directly to load the image: it took ~1s, which doesn't explain the 7s above. Then I added some logging... and suddenly it was quick.
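For reference, this is roughly how I timed pngjs in isolation (a minimal sketch; the file path is just an example):

```js
const fs = require('fs');
const { PNG } = require('pngjs');

// Load a previously saved screenshot and time the decode on its own.
const buffer = fs.readFileSync('cypress/screenshots/filename.png'); // example path

console.time('pngjs parse');
const png = PNG.sync.read(buffer); // the inflate + filter pass the logs refer to
console.timeEnd('pngjs parse');    // ~1s here for a 3841 x 1913 capture

console.log(`decoded ${png.width} x ${png.height}`);
```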

Sat, 05 Jan 2019 18:49:07 GMT cypress:server:screenshot (s2) capture and check { tries: 1, totalDuration: 0 }
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot (s2) received screenshot data from automation layer data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAADwEAAAd5CAIAAAAhWRY6AAAgAElEQVR4nOzde3Bc5Z0n/Kdbat3llt
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot starting inflate
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot completed inflate and filter
Sat, 05 Jan 2019 18:49:08 GMT cypress:server:screenshot (s2) read buffer to image 3841 x 1913

I ran it again with full logs and this time it was slow:

Sun, 06 Jan 2019 05:27:48 GMT cypress:server:screenshot starting inflate
...
Sun, 06 Jan 2019 05:27:54 GMT cypress:server:screenshot completed inflate and filter

In addition, I noticed that the Cypress server process sits at basically 100% CPU.

What logs are in between?

Sun, 06 Jan 2019 05:27:48 GMT cypress:server:screenshot starting inflate
Sun, 06 Jan 2019 05:27:48 GMT cypress:server:timers clearing timer id 5 from queue { '5': { args: [], ms: 2500, cb: [Function: cb] }, '67': { args: [], ms: 85000, cb: [Function] }, '68': { args: [], ms: 85000, cb: [Function] } }
Sun, 06 Jan 2019 05:27:48 GMT cypress:server:timers queuing timer id 5 after 2500 ms
Sun, 06 Jan 2019 05:27:48 GMT cypress:server:specs looking for test specs in the folder: C:\git\app
Sun, 06 Jan 2019 05:27:48 GMT cypress:server:events got request for event: get:project:status, { id: 'XYZ', path: 'C:\\git\\app' }
Sun, 06 Jan 2019 05:27:48 GMT cypress:server:project get project status for XYZ C:\git\app
Sun, 06 Jan 2019 05:27:48 GMT cypress:server:file get values from C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 05:27:48 GMT cypress:server:file attempt to get lock on C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 05:27:50 GMT cypress:server:file gettin lock succeeded or failed for C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 05:27:50 GMT cypress:server:file read C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:timers clearing timer id 5 from queue { '5': { args: [], ms: 2500, cb: [Function: cb] }, '67': { args: [], ms: 85000, cb: [Function] }, '68': { args: [], ms: 85000, cb: [Function] } }
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:timers queuing timer id 5 after 2500 ms
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:specs looking for test specs in the folder: C:\git\app
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:file read succeeded or failed for C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:file attempt to unlock C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:timers queuing timer id 69 after 2000 ms
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:timers clearing timer id 69 from queue { '5': { args: [], ms: 2500, cb: [Function: cb] }, '67': { args: [], ms: 85000, cb: [Function] }, '68': { args: [], ms: 85000, cb: [Function] }, '69': { args: [], ms: 2000, cb: [Function: timeoutTimeout] } }
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:file unlock succeeded or failed for C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:project got auth token ZZZ
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:project get project from api XYZ C:\git\app
Sun, 06 Jan 2019 05:27:51 GMT cypress:server:api request to url: GET https://api.cypress.io/projects/XYZ with params: {"headers":{"x-route-version":"2","x-os-name":"win32","x-cypress-version":"3.1.4"}}
Sun, 06 Jan 2019 05:27:52 GMT cypress:server:specs found spec file C:/git/app/integration.spec.js
...
Sun, 06 Jan 2019 05:27:52 GMT cypress:server:specs found 43 spec files: [ ... ]
Sun, 06 Jan 2019 05:27:54 GMT cypress:server:screenshot completed inflate and filter

So it seems that the above intermittently runs round in a loop.

There are a number of things happening here:

  1. get:project:status
  2. looking for test specs
  3. attempt to get lock on

The spec file globbing runs every 2.5 seconds. Because of our setup I initially assumed this was why the screenshot was slow; it certainly seems to be the cause of the constant CPU usage. Ideally it would listen for file system changes instead.

A couple of problems led to this:

  1. The testFiles option is not documented (https://docs.cypress.io/guides/references/configuration.html#Global), which led my colleague to specify all files and then use ignoreTestFiles.
  2. ignoreTestFiles is less efficient because it runs minimatch per file, instead of a pre-compiled matcher combined with a leading negation, so converting the glob to a regex happens for every file instead of once (see the sketch below).
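A sketch of the difference (the pattern and file list are hypothetical; Minimatch is the pre-compiled matcher class from the minimatch package):

```js
const minimatch = require('minimatch');
const { Minimatch } = minimatch;

const files = ['a.spec.js', 'b.hot-update.js']; // example inputs

// Per-file matching: the glob is parsed again on every call.
const kept = files.filter((f) => !minimatch(f, '**/*.hot-update.js'));

// Compile once, reuse: the leading "!" negates the match, and the
// glob-to-regex conversion happens a single time.
const mm = new Minimatch('!**/*.hot-update.js');
const keptFast = files.filter((f) => mm.match(f));
```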

However, this wasn't the solution to the problem, as I then reproduced the situation where screenshots were fast. The only difference is that there are no big time gaps in the "attempt to get lock" log messages:

Sun, 06 Jan 2019 17:11:08 GMT cypress:server:file get values from C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 17:11:08 GMT cypress:server:file attempt to get lock on C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 17:11:08 GMT cypress:server:file gettin lock succeeded or failed for C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 17:11:08 GMT cypress:server:file read C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 17:11:08 GMT cypress:server:file read succeeded or failed for C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 17:11:08 GMT cypress:server:file attempt to unlock C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache
Sun, 06 Jan 2019 17:11:08 GMT cypress:server:timers queuing timer id 70 after 2000 ms
Sun, 06 Jan 2019 17:11:08 GMT cypress:server:timers clearing timer id 70 from queue { '5': { args: [], ms: 2500, cb: [Function: cb] }, '66': { args: [], ms: 85000, cb: [Function] }, '69': { args: [], ms: 85000, cb: [Function] }, '70': { args: [], ms: 2000, cb: [Function: timeoutTimeout] } }
Sun, 06 Jan 2019 17:11:08 GMT cypress:server:file unlock succeeded or failed for C:\Users\luke\AppData\Roaming\Cypress\cy\production\cache

And right now I can't reproduce the massive delays any more. Looking at the code, I suspect something to do with the lock file ends up synchronously locking the system for a couple of seconds every time it is called.

I wonder if it's connected to getting the lock and reading the file going over the debounce time, so that multiple attempts to lock/unlock run at once and you get some kind of deadlock. I need to do more debugging if I reproduce this again.

I don't think it's connected to the slow glob, as after going back to the changes from the beginning of the investigation I still can't reproduce it.

lukeapage commented:

I eventually got it to reproduce with logs. I think it's related to the file globbing; I didn't see anything fishy with the unlocking, so please ignore that.

I think that when the timing lines up badly, the globbing happens during the screenshot's many async calls, and the globbing taking time and the screenshot taking time compound, prolonging the screenshot dramatically.

If it's lucky, the globbing occurs just before the screenshot, which is then able to complete with only a single globbing run in between.
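To illustrate the suspected mechanism, a standalone sketch (not Cypress code) of CPU-bound work on the Node event loop delaying an unrelated async step:

```js
const start = Date.now();

// Stand-in for an async screenshot step that should run after ~100ms.
setTimeout(() => {
  console.log(`async step ran after ${Date.now() - start}ms`); // ~1100ms, not 100ms
}, 100);

// Stand-in for a long synchronous glob / inflate pass hogging the event loop.
const busyUntil = Date.now() + 1000;
while (Date.now() < busyUntil) { /* busy-wait ~1s */ }
```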

lukeapage commented:

@jennifer-shehane @chrisbreiding
Maybe I should open a new issue that simplifies this down to a single bug.

server/open_project polls server/utils/specs every 2500ms. Because we keep our Cypress spec files inline with the rest of our code, we have thousands of directories. On my top-of-the-range machine with an SSD, the glob takes 950ms to complete. On my colleagues' machines it takes 2-3 times that.
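A quick way to measure the cost of a single scan (the pattern and ignore list are examples, not our actual config):

```js
const glob = require('glob');

console.time('spec glob');
glob('**/*.spec.js', { ignore: ['**/node_modules/**'] }, (err, files) => {
  if (err) throw err;
  console.timeEnd('spec glob'); // ~950ms on my machine, 2-3x that on others
  console.log(`found ${files.length} spec files`);
});
```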

The solution would be to watch for file changes and re-run the spec glob only when the watcher reports a change.

I see you already use chokidar for watching files, but the poll option is set to true. I'm not sure whether that negates the performance benefit of watching (I don't know why you have it, since the default now is to rely on OS events).
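Something along these lines (a sketch; refreshSpecs is a hypothetical callback standing in for whatever re-runs the spec glob):

```js
const chokidar = require('chokidar');

const refreshSpecs = () => { /* re-run the spec glob here */ };

// Watch the project via native OS events instead of polling every 2500ms.
const watcher = chokidar.watch('C:/git/app', {
  ignored: ['**/node_modules/**'],
  ignoreInitial: true,
  usePolling: false, // rely on fs events (the chokidar default)
});

// Only re-glob when files appear or disappear.
watcher.on('add', refreshSpecs);
watcher.on('unlink', refreshSpecs);
```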

If the code were in JavaScript rather than CoffeeScript I'd happily look into making a PR, but I find CoffeeScript too annoying as someone who has never written it.

pauledwin pushed commits to pauledwin/cypress that referenced this issue (Apr 26, 2019)
cypress-bot added the "stage: needs review" label and removed "stage: needs investigating" (Apr 26, 2019)
pauledwin pushed commits to pauledwin/cypress that referenced this issue (Apr 29, 2019)
cypress-bot added the "stage: pending release" label and removed "stage: needs review" (May 7, 2019)

cypress-bot bot commented May 7, 2019

The code for this is done in cypress-io/cypress#4038, but has yet to be released.
We'll update this issue and reference the changelog when it's released.


chrisbreiding pushed a commit that referenced this issue May 7, 2019
* address #3069 by using chokidar fs events

* use chokidar directly for watching specs and add tests

the watchers lib watches for file contents changing and doesn’t watch for files being added or removed

cypress-bot bot commented May 17, 2019

Released in 3.3.0.
