
Coverage formats very different when testing the same code via Jest and when converting from v8 #56

Closed
dvail opened this issue Aug 8, 2019 · 3 comments

@dvail

dvail commented Aug 8, 2019

Hello, over the past week or so I have been doing a bit of experimenting with coverage reports generated from puppeteer-to-istanbul (and in turn this library) and comparing them to coverage reports generated from jest --coverage.

The final goal of all this is to generate coverage reports for unit/integration tests via Jest and coverage reports for e2e tests via Puppeteer, then merge them into one final coverage report that gets shipped off to SonarQube. The idea is that code missed by the unit tests might be covered by the e2e tests, giving me a more realistic view of total coverage.
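For reference, the merge step I have in mind would look roughly like this, using istanbul-lib-coverage's `createCoverageMap` and `merge` (a sketch only; the file paths here are made up):

```js
const fs = require('fs');
const { createCoverageMap } = require('istanbul-lib-coverage');

// Load the Istanbul-format JSON produced by each test run.
// (Hypothetical paths; Jest writes coverage/coverage-final.json by default.)
const map = createCoverageMap(
  JSON.parse(fs.readFileSync('./coverage-unit/coverage-final.json', 'utf8'))
);

// merge() combines hit counts for files that appear in both maps.
map.merge(
  JSON.parse(fs.readFileSync('./coverage-e2e/coverage-final.json', 'utf8'))
);

// Write the combined map back out for reporting (e.g. via nyc report).
fs.writeFileSync('./coverage-merged/coverage-final.json', JSON.stringify(map));
```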

The issue I'm having is that the branch/statement/function/line counts and coverage are completely different when running the same code in each test suite. I've got a minimal repo demonstrating what I'm seeing here: https://github.com/dvail/istanbul-report-example, along with some screenshots of the HTML coverage reports each test produces.
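(For context, the Puppeteer side of that repo follows the usual puppeteer-to-istanbul flow, roughly like this sketch; the page URL is a placeholder:)

```js
const puppeteer = require('puppeteer');
const pti = require('puppeteer-to-istanbul');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  // Ask Chromium to record raw V8 coverage for all scripts on the page.
  await page.coverage.startJSCoverage();
  await page.goto('http://localhost:8080'); // placeholder test page

  // ... exercise the page here ...

  const jsCoverage = await page.coverage.stopJSCoverage();

  // puppeteer-to-istanbul converts the V8 ranges and writes
  // .nyc_output/out.json, which nyc can then report on.
  pti.write(jsCoverage);

  await browser.close();
})();
```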

When running with Jest:
[screenshot: Jest HTML coverage report]

When running with Puppeteer:
[screenshot: Puppeteer HTML coverage report]

When running the Jest tests with c8 for good measure:
[screenshot: c8 HTML coverage report]

In the first image, coverage looks perfect. All hit and missed lines/branches/etc. are detected, and the counts at the top logically make sense.

In the second image, the coverage does seem technically correct: lines that are not hit are red, but everything else is considered covered, even the comments and the extraneous blank lines at the end of the file.

It seems to me that this is likely due to the differences in the way coverage is collected by the Istanbul library versus by V8 (described here), as the V8 coverage report looks almost identical to what is visible in Chrome when opening the sample page and viewing the Coverage tab in the developer tools:
[screenshot: Chrome DevTools Coverage tab]
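(For reference, the conversion step follows the pattern from the v8-to-istanbul README; this is only a sketch, with a placeholder script path:)

```js
const v8toIstanbul = require('v8-to-istanbul');

(async () => {
  // `functions` would be the "functions" array from a raw V8
  // ScriptCoverage entry, e.g. as captured by Puppeteer above.
  const functions = [];

  const converter = v8toIstanbul('./bundle.js'); // placeholder script path
  await converter.load(); // reads the source so byte offsets can map to lines
  converter.applyCoverage(functions);

  // V8 reports covered byte ranges, which is presumably why comments and
  // trailing blank lines end up marked as covered.
  console.log(converter.toIstanbul());
})();
```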

Is this a known issue, or just a limitation in the way coverage is reported from V8? Is there any reasonable way to combine coverage reports generated from the two tools, or is that something I should just not do? :)

Thanks!

(The code in question ties together istanbul, v8-to-istanbul, puppeteer-to-istanbul, nyc, and jest, but I figured this would be the most appropriate project to post in.)

@bcoe
Member

bcoe commented Aug 20, 2019

I believe the issue with Jest is most likely source-map related; there's an outstanding PR to add V8-based coverage to Jest:

jestjs/jest#8596

c8 does not currently have access to any source maps injected by Jest, so it is likely unable to remap coverage appropriately.

Jest's built-in coverage uses Istanbul, which is a completely different parser from V8, so I'm not shocked that there are discrepancies.
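To illustrate the difference: V8 reports execution counts over byte-offset ranges collected by the engine itself, rather than walking an AST the way Istanbul's instrumenter does. A minimal sketch of capturing that raw data with Node's built-in inspector module:

```js
const inspector = require('inspector');

const session = new inspector.Session();
session.connect();

session.post('Profiler.enable');
session.post('Profiler.startPreciseCoverage', { callCount: true, detailed: true });

// ... run the code under test here ...

session.post('Profiler.takePreciseCoverage', (err, { result }) => {
  // Each entry is a ScriptCoverage object: { scriptId, url, functions },
  // where each function carries byte-offset ranges with execution counts.
  console.log(JSON.stringify(result, null, 2));
});
```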

@dvail
Author

dvail commented Aug 22, 2019

OK, thanks, that definitely makes sense. I'll track the Jest issue and see if we can switch to all-V8 coverage sometime in the future.

@dvail dvail closed this as completed Aug 22, 2019
@sijakret

sijakret commented Oct 8, 2021

Did you manage to resolve this? I am seeing the same issue with web-test-runner.

Please excuse the cropped views, but I think it gets the point across.

Karma + Babel + Istanbul reporter:
[screenshot: Karma/Istanbul HTML coverage report]

web-test-runner + Chromium V8 + v8-to-istanbul:
[screenshot: web-test-runner/v8-to-istanbul HTML coverage report]

Any tips on how to ignore the irrelevant lines (i.e. imports, comments, etc.)?

Edit: the V8 report does not seem to distinguish between statements and lines.
Edit 2: maybe this is just a reporter misconfiguration?
[screenshot]
