
Output format JPG not compatible with Lemur fallback images #1438

Open · gruberro opened this issue Sep 29, 2022 · 1 comment · May be fixed by #1456

gruberro commented Sep 29, 2022

Hello everybody!

We're heavily using the outputFormat: "jpg" option in large projects, as it helps us keep the repository smaller (reference files are stored in LFS). I think we might have discovered a problem with the fallback images (e.g. '/capture/resources/unexpectedErrorSm.png'): these are PNGs regardless of the outputFormat option.
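For context, a minimal sketch of how we enable this (shown as a backstop.js module; backstop.json works the same, and everything apart from outputFormat is a placeholder):

```js
// Hypothetical BackstopJS config excerpt; only outputFormat matters here.
module.exports = {
  id: 'example_project',
  outputFormat: 'jpg', // store reference/test bitmaps as JPEG instead of the default PNG
  scenarios: [
    { label: 'Homepage', url: 'https://example.com' }
  ],
  paths: {
    bitmaps_reference: 'backstop_data/bitmaps_reference',
    bitmaps_test: 'backstop_data/bitmaps_test'
  }
};
```

So whenever one of these fallback images is stored as a reference, the next test run fails with an exception like: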

```
See: /src/tests/backstop/bitmaps_test/20220929-064330/failed_diff_xxx.jpg
/usr/local/lib/node_modules/backstopjs/core/util/compare/store-failed-diff.js:20
    fs.writeFileSync(failedDiffFilename, data.getDiffImageAsJPEG(85));
                                              ^

TypeError: data.getDiffImageAsJPEG is not a function
    at module.exports (/usr/local/lib/node_modules/backstopjs/core/util/compare/store-failed-diff.js:20:47)
    at /usr/local/lib/node_modules/backstopjs/core/util/compare/compare.js:21:14
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
```

Even if this might be acceptable in the end, right now the error is caught by the test runner and test execution continues and ends with success. When running tests as part of a CI pipeline everything seems perfectly fine and no errors are reported. The error can only be discovered by reading the entire test output, which (at least I think) usually doesn't happen in an automated CI environment.

IMHO the error should cause the test run to exit with an error code. Having different fallback/error images depending on the outputFormat could help too 😄. I'm interested in your opinion on it before trying to contribute a possible solution.
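For illustration, an untested sketch of both ideas. First, a defensive guard in core/util/compare/store-failed-diff.js; the exported signature and the availability of getDiffImage() on the failing result are assumptions on my side, based on the node-resemble-js API:

```js
const fs = require('fs');

// Hypothetical rewrite of core/util/compare/store-failed-diff.js: fall back
// to PNG output when the resemble result cannot encode JPEG, instead of
// crashing with "data.getDiffImageAsJPEG is not a function".
module.exports = function storeFailedDiff (failedDiffFilename, data) {
  if (typeof data.getDiffImageAsJPEG === 'function') {
    fs.writeFileSync(failedDiffFilename, data.getDiffImageAsJPEG(85));
  } else {
    // node-resemble-js documents getDiffImage() as returning a PNG object
    // that can be pack()ed and piped to disk.
    const pngFilename = failedDiffFilename.replace(/\.jpe?g$/i, '.png');
    data.getDiffImage().pack().pipe(fs.createWriteStream(pngFilename));
  }
};
```

And second, surfacing the swallowed error at the call site so CI fails; the surrounding compare.js structure here is assumed, not the actual BackstopJS code:

```js
// Hypothetical error handling around the diff-storing step in
// core/util/compare/compare.js.
try {
  storeFailedDiff(failedDiffFilename, data);
} catch (err) {
  console.error(`Failed to store diff image: ${err.message}`);
  // process.exitCode (rather than process.exit()) lets the remaining
  // comparisons finish while still failing the run for CI.
  process.exitCode = 1;
}
```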

gruberro (Author) commented
@garris any feedback on this?

vntw linked a pull request (#1456) on Jan 9, 2023 that will close this issue.