
Using FFmpeg to record a test that goes through multiple browsers causes test to hang when running multiple tests #6288

Closed
adlantz opened this issue Jun 7, 2021 · 10 comments
Labels
FREQUENCY: level 2 STATE: Need clarification An issue lacks information for further research. TYPE: bug The described behavior is considered as wrong (bug).

Comments

@adlantz

adlantz commented Jun 7, 2021

What is your Test Scenario?

I want to run multiple tests, some of which open multiple windows using the t.openWindow() function, and be able to record them with FFmpeg.

What is the Current behavior?

Say we have a test A that opens multiple windows using t.openWindow(), followed by a test B that may or may not open multiple windows. We run testcafe chrome A.js B.js --video artifacts/videos. Test A completes as expected, but when the run moves on to test B, it gets stuck and runs until it fails with FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory. This doesn't happen every time I run it, but it happens most of the time. Removing either the --video option or the t.openWindow() calls prevents it. My guess is that FFmpeg has some problem dealing with multiple windows being opened during a test.

What is the Expected behavior?

The tests all run and are recorded regardless of whether they open multiple windows or not.

What is your web application and your TestCafe test code?

The code is proprietary and the website requires authentication, so I made a GitHub repo that reproduces the issue.

Steps to Reproduce:

Check out this repo I made to easily reproduce it.
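For illustration, here is a minimal sketch of the kind of test pair that can trigger the hang; the page URL and selectors are placeholders, not the contents of the linked repo:

    // mptest1.js — test A opens a second window via t.openWindow()
    fixture('Multi-window test A')
        .page('https://devexpress.github.io/testcafe/example');

    test('opens a second window', async t => {
        await t.openWindow('https://devexpress.github.io/testcafe/example');
        await t.typeText('#developer-name', 'Test A');
    });

    // mptest2.js — test B is an ordinary single-window test that runs after test A
    fixture('Single-window test B')
        .page('https://devexpress.github.io/testcafe/example');

    test('types a name', async t => {
        await t.typeText('#developer-name', 'Test B');
    });

Running both files with video recording enabled (testcafe chrome mptest1.js mptest2.js --video artifacts/videos) is what intermittently hangs on test B.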

Your Environment details:

  • testcafe version: 1.14.2
  • node.js version: v14.16.1
  • command-line arguments: testcafe chrome mptest1.js mptest2.js --video artifacts/videos
  • browser name and version: chrome 89
  • platform and version: macOS 10.15.7
  • other: Sometimes it works! Please run it a few times in a row until it fails.
@need-response-app need-response-app bot added the STATE: Need response An issue that requires a response or attention from the team. label Jun 7, 2021
@felis2803 felis2803 added FREQUENCY: level 1 TYPE: bug The described behavior is considered as wrong (bug). and removed STATE: Need response An issue that requires a response or attention from the team. labels Jun 9, 2021
@felis2803
Contributor

Thank you for your excellent report. I reproduced the bug. Please watch this issue for updates on our progress.

For team:
This issue looks similar to #6037.

@alejandro-serrano

Hi TestCafe Team,

Is there any update on this issue?

We are facing it, and it is very easy to reproduce even with the Getting Started example.

Environment details:

TestCafe version: 1.16.1
Node.js version: v16.13.0
Command-line: testcafe chrome tests/getting_started.js
Browser name and version: Chrome 95.0.4638.69
Platform and version: macOS 10.15.7
Other: .testcaferc.json

{
    "skipJsErrors": true,
    "stopOnFirstFail": false,
    "disableMultipleWindows": true,
    "videoPath": "./videos",
    "videoOptions": {
        "pathPattern": "${DATE}_${TIME}/${TEST}/${USERAGENT}/${TEST_INDEX}.mp4"
    },
    "reporter": [
        { "name": "spec" },
        { "name": "json", "output": "./reports/report.json" },
        { "name": "xunit", "output": "./reports/report.xml" },
        { "name": "html", "output": "./reports/report.html" }
    ]
}

As a workaround, we disable video recording by removing those options from the TestCafe config file so that our tests can run in Chrome.

Thank you!

@need-response-app need-response-app bot added the STATE: Need response An issue that requires a response or attention from the team. label Nov 12, 2021
@felis2803
Contributor

felis2803 commented Nov 16, 2021

Hello,

Thank you for your report. This issue is not scheduled for the next sprint, so we cannot give a timeline for a fix.

@need-response-app need-response-app bot removed the STATE: Need response An issue that requires a response or attention from the team. label Nov 16, 2021
@Matenko

Matenko commented Dec 8, 2021

This is currently affecting our test runs as well. We'd really like to be able to reliably record our test runs to debug failures in our GitHub Workflows, but as of right now, the run often hangs until the process throws an out-of-memory error.

@need-response-app need-response-app bot added the STATE: Need response An issue that requires a response or attention from the team. label Dec 8, 2021
@felis2803 felis2803 removed the STATE: Need response An issue that requires a response or attention from the team. label Dec 10, 2021
@alejandro-serrano

Hi TestCafe Team,

Is there any update on this issue? Or, is there any workaround?

Thank you!

@need-response-app need-response-app bot added the STATE: Need response An issue that requires a response or attention from the team. label May 23, 2022
@miherlosev miherlosev added the STATE: No updates No updates are available at this point. label May 25, 2022
@github-actions

No updates yet. Once we get any results, we will post them in this thread.

@github-actions github-actions bot removed STATE: Need response An issue that requires a response or attention from the team. STATE: No updates No updates are available at this point. labels May 25, 2022
@andrzej-woof

I think I've bumped into the same issue. Stepping through with the debugger, I see a loop here:

    async _capture () {
        while (this.active) {
            try {
                const frame = await this.connection.provider.getVideoFrameData(this.connection.id);

                if (frame) {
                    await this.emit('frame');
                    await this._addFrame(frame);
                }
            }
            catch (error) {
                // The error is only logged here; this.active is never cleared,
                // so the loop keeps retrying indefinitely.
                this.debugLogger(error);
            }
        }
    }

caused by the captureScreenshot call in BrowserClient.getScreenshotData (shown as a screenshot in the original comment) throwing an exception:

Error: WebSocket is not open: readyState 3 (CLOSED)
    at sendAfterClose (/xxxx/node_modules/ws/lib/websocket.js:967:17)
    at WebSocket.send (/xxxx/node_modules/ws/lib/websocket.js:405:7)
    at Chrome._enqueueCommand (/xxxx/node_modules/chrome-remote-interface/lib/chrome.js:286:18)
    at /xxxx/node_modules/chrome-remote-interface/lib/chrome.js:88:22
    at new Promise (<anonymous>:null:null)
    at Chrome.send (/xxxx/node_modules/chrome-remote-interface/lib/chrome.js:87:20)
    at Object.handler [as captureScreenshot] (/xxxx/node_modules/chrome-remote-interface/lib/api.js:32:23)
    at BrowserClient.getScreenshotData (/xxxx/node_modules/testcafe/lib/browser/provider/built-in/dedicated/chrome/cdp-client/index.js:239:50)

and then the loop runs again and again.

I'm not sure what the root cause is, but if such an error happens, maybe it's worth changing the browser's active state?
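For illustration, here is a minimal sketch of what that suggestion could look like; the error-message check is an assumption based on the stack trace above, not TestCafe's actual fix:

    async _capture () {
        while (this.active) {
            try {
                const frame = await this.connection.provider.getVideoFrameData(this.connection.id);

                if (frame) {
                    await this.emit('frame');
                    await this._addFrame(frame);
                }
            }
            catch (error) {
                this.debugLogger(error);

                // Assumed heuristic: a closed CDP WebSocket means further capture
                // attempts cannot succeed, so stop the loop instead of retrying forever.
                if (error && /WebSocket is not open/.test(String(error.message)))
                    this.active = false;
            }
        }
    }

Transient errors would still be logged and retried; only a permanently closed browser connection would end the capture loop.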

@need-response-app need-response-app bot added the STATE: Need response An issue that requires a response or attention from the team. label May 25, 2022
@Aleksey28
Collaborator

Thank you for your research. We will take your results into account when fixing this issue.

@need-response-app need-response-app bot removed the STATE: Need response An issue that requires a response or attention from the team. label May 27, 2022
@miherlosev
Collaborator

Hi folks,

I tried reproducing the issue with testcafe@2.6.0, and everything works fine. Could you please run your tests with this version and let us know your results?

@miherlosev miherlosev added the STATE: Need clarification An issue lacks information for further research. label May 19, 2023
@github-actions

This issue was automatically closed because there was no response to our request for more information from the original author. Currently, we don't have enough information to take action. Please reach out to us if you find the necessary information and are able to share it. We are also eager to know if you resolved the issue on your own and can share your findings with everyone.
