
Option for comparing coverage reports #92

Open

lucavallin opened this issue May 3, 2023 · 13 comments

@lucavallin
lucavallin commented May 3, 2023

At the moment, tparse shows the coverage percentage for the current run. It would be useful to show, next to that percentage, the difference from another given test report, for example one containing the coverage information for the main branch.

I am thinking of something like:

go test -json -cover -v ./... | tparse -all -format-markdown -compare=main-coverage.json

Which results in a table like:

| STATUS | ELAPSED | PACKAGE | COVER | PASS | FAIL | SKIP |
|--------|---------|---------|-------|------|------|------|
| 🟢 PASS | 0.02s | github.com/user/package1 | 100.0% (+20.3%) | 3 | 0 | 0 |
| 🟢 PASS | 1.16s | github.com/user/package2 | 35.2% (-3.2%) | 1 | 0 | 0 |
| 🟢 PASS | 0.20s | github.com/user/package3 | 70.5% (-) | 4 | 1 | 0 |
| 🟢 PASS | 0.78s | github.com/user/package4 | 83.0% (+13.8%) | 6 | 2 | 0 |
| 🟢 PASS | 0.25s | github.com/user/package5 | 48.8% (-) | 2 | 0 | 1 |

This would make it easy and quick to see if a pull request is increasing or decreasing a project's test coverage.

@mfridman
Owner

mfridman commented May 3, 2023

Yes, this is a great idea. I've also wanted something similar for elapsed time, to understand whether there's a trend over time where tests get slower or faster.

Something I haven't thought deeply about is where things like "main-coverage.json" come from. Are they checked in?

@lucavallin
Author

Glad to hear! My first guess would be to have "main-coverage.json" checked in; that file could, for example, be automatically created and pushed to the main branch by Actions. A possible alternative would be to store and retrieve it from a storage bucket, or any other place that can provide a static URL to the file.

However, I think "where does the file come from?" shouldn't be a concern of tparse.
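For what it's worth, a hypothetical Actions step on the main branch could look roughly like this (step names and the commit-back approach are illustrative, not an existing workflow):

```yaml
# Hypothetical workflow fragment: on pushes to main, regenerate the
# baseline report that later PR runs would pass to `tparse -compare`.
- name: Generate baseline coverage report
  run: go test -json -cover ./... > main-coverage.json
- name: Commit baseline
  run: |
    git add main-coverage.json
    git commit -m "chore: update main coverage baseline" || true
    git push
```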

@lucavallin lucavallin changed the title Diff option for coverage percentage Option for comparing coverage reports May 3, 2023
@mfridman
Owner

mfridman commented May 3, 2023

Curious, when tparse reads -compare=main-coverage.json .. what are the contents?

Sorry, I might be missing something 😅 , but is this a file tparse creates or would this be a json blob from go test ..?

I ask because I'm failing to recall what .json file the go toolchain produces.

EDIT: IIRC there are tools like https://github.com/axw/gocov that can produce a .json file

@lucavallin
Author

lucavallin commented May 3, 2023

The first option that comes to mind is writing the output of go test to a file, like go test -json -cover -v ./... > test-output.json; this would be handy given that tparse already understands that format. I am not particularly fond of tools like gocov, since each is an extra dependency that does something extremely specific.

The most idiomatic way of doing it would be to parse the coverage file that you can get with -coverprofile, as in go test -json -cover -coverprofile=coverage.out -v ./.... This would require some extra work, but personally, I think tparse would benefit from being able to parse coverprofiles anyway: sometimes I'd like to just run the Go tests, and then use tparse to present the results in a human-friendly way (that would remove the need for the set -o pipefail hack too).

I have been searching for a tool for presenting test results, and I picked tparse because no other tool can:

  • Present results in a human-friendly way, especially markdown for comments in pull requests and (Actions) job summaries
  • Show coverage percentage (junit-based tools somehow don't pick that up)
  • Show the coverage diff with previous runs of the tests (this feature we're talking about in the issue)
  • Bonus points: have a GitHub Action in the marketplace to go with the tool, maybe it could even annotate PRs

@mfridman
Owner

mfridman commented May 3, 2023

This is great feedback!

At one point I thought about adding a tparse-specific JSON blob that could be checked in but could also be shipped to a service. I started working on https://gotest.io to track go tests over time but paused those efforts due to a full-time job.

I know I've been saying this for a while, but I do intend to find some time this summer to get this tool to a stable v1 and address a few of the open issues. (I still write a lot of Go code, and use this tool quite a bit).

@lucavallin
Author

lucavallin commented May 4, 2023

@mfridman Great to hear. I'll be happy to help if/when I have some free time, but no promises. This issue is the most interesting for me, so if we agree on the (rough) design, I will give it a try.

@lucavallin
Author

I had a look at the source code, reading and parsing a file to compare against would be quite simple. Besides adding the -compare flag (or something similar), I added the following to app.go on line 57:

```go
if option.Compare != "" {
	compareReader, err := os.Open(option.Compare)
	if err != nil {
		return 1, err
	}
	// Close immediately after opening, so the file is released even if
	// parsing fails below.
	defer compareReader.Close()

	compare, err := parse.Process(compareReader)
	if err != nil {
		return 1, err
	}
	// Temporary: dump the parsed compare report for inspection.
	spew.Dump(compare)
}
```
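Once both reports are parsed, the remaining work is rendering the per-package delta next to the coverage cell. A minimal sketch of that formatting, using a hypothetical map of baseline coverage rather than tparse's real types:

```go
package main

import "fmt"

// formatDelta renders a coverage cell like "93.4% (+0.8%)". When the
// package has no baseline entry, it falls back to "(-)" as in the
// proposed table. The map shape is illustrative, not tparse's real model.
func formatDelta(current float64, baseline map[string]float64, pkg string) string {
	prev, ok := baseline[pkg]
	if !ok {
		return fmt.Sprintf("%.1f%% (-)", current)
	}
	// %+.1f always emits a sign, so increases show as "+0.8".
	return fmt.Sprintf("%.1f%% (%+.1f%%)", current, current-prev)
}

func main() {
	baseline := map[string]float64{"bufio": 92.6}
	fmt.Println(formatDelta(93.4, baseline, "bufio"))  // prints "93.4% (+0.8%)"
	fmt.Println(formatDelta(48.8, baseline, "newpkg")) // prints "48.8% (-)"
}
```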

@mfridman
Owner

Was hacking on tparse this weekend and was thinking about this issue.

Presumably when running tparse in CI with a -compare flag, you'd need to have an "against" (main-coverage.json) file .. would this be something that's already checked into vcs? Or would the against file be stored in cache or some blob storage? I guess this doesn't matter, since it's just a bag of bytes we read, more a curiosity.

There are at least 2 ways we can go .. have tparse store some custom .json format (maybe in a specific directory with some timestamp) or have the caller store the raw go test output (which tparse -file already knows how to parse).

Fwiw I really like this idea, and with a bit of guidance can help get something in place.

@lucavallin
Author

Hey @mfridman! Thanks for looking into this.
I think the easiest would be to have the file in git, but if the -compare flag could take any URI, then it might be easy to point it to either a local file, a cache, or a file in a blob storage.

My preference for the format would be the raw test output. It is not ideal, because Go already has a coverage format that is more "semantically" correct; however, tparse does not understand that yet. Having a custom format instead would take the tool further away from Go's defaults... if that makes sense.

@mfridman mfridman self-assigned this Aug 4, 2023
@mfridman
Owner

mfridman commented Aug 4, 2023

Alright, made some progress on this today.

happy path

# go<version> test -count=1 fmt strings bytes bufio crypto log mime sort time -json -cover > go<version>.json

$ tparse -file go1.20.json -compare go1.17.json
┌───────────────────────────────────────────────────────────────────┐
│  STATUS │ ELAPSED │ PACKAGE │     COVER     │ PASS │ FAIL │ SKIP  │
│─────────┼─────────┼─────────┼───────────────┼──────┼──────┼───────│
│  PASS   │  0.62s  │ bufio   │ 93.4% (+0.8%) │  87  │  0   │  0    │
│  PASS   │  1.56s  │ bytes   │ 95.5% (+0.2%) │ 138  │  0   │  0    │
│  PASS   │  0.75s  │ crypto  │ 5.9% (+5.9%)  │  5   │  0   │  0    │
│  PASS   │  0.69s  │ fmt     │ 95.2% (0.0%)  │  79  │  0   │  1    │
│  PASS   │  0.87s  │ log     │ 68.0% (+1.1%) │  10  │  0   │  0    │
│  PASS   │  1.02s  │ mime    │ 94.0% (+0.3%) │  24  │  0   │  0    │
│  PASS   │  1.72s  │ sort    │ 58.7% (-2.1%) │  76  │  0   │  1    │
│  PASS   │  1.27s  │ strings │ 97.9% (-0.1%) │ 117  │  0   │  0    │
│  PASS   │  7.80s  │ time    │ 92.5% (+0.9%) │ 349  │  0   │  1    │
└───────────────────────────────────────────────────────────────────┘

error state

Still display the package summary, but output a warning below the table if there is an error opening the compare file or parsing its contents.

This might be an opportunity to add a -fail-on-any-error flag, but I think tparse should strive to ALWAYS print the summary table unless something catastrophic happens, since users are trusting this tool with their piped test output.

go run main.go -file go1.20-strings.json -compare invalid.json

┌───────────────────────────────────────────────────────────┐
│  STATUS │ ELAPSED │ PACKAGE │ COVER │ PASS │ FAIL │ SKIP  │
│─────────┼─────────┼─────────┼───────┼──────┼──────┼───────│
│  PASS   │  0.94s  │ strings │ 97.9% │ 117  │  0   │  0    │
└───────────────────────────────────────────────────────────┘
warning: failed to open compare file: invalid.json

Questions

Is it confusing to show a green cell in Cover even though the overall coverage went down when comparing it against another test output?

[screenshot: CleanShot 2023-08-04 at 17 15 20@2x]

The colours for the Cover column in the package summary are based on some arbitrary logic from many years ago:

  • > 0 && <= 50 → red
  • > 50 && < 80 → yellow
  • >= 80 → green
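Those thresholds can be expressed directly. A sketch of the same logic, with the colour names standing in for whatever the rendering layer actually emits:

```go
package main

import "fmt"

// coverColor reproduces the threshold logic described above: red up to
// 50%, yellow up to 80%, green from 80%. Return values are placeholders
// for the real terminal/markdown styling.
func coverColor(pct float64) string {
	switch {
	case pct >= 80:
		return "green"
	case pct > 50:
		return "yellow"
	case pct > 0:
		return "red"
	default:
		return "none" // 0% or no coverage data
	}
}

func main() {
	fmt.Println(coverColor(35.2)) // prints "red"
	fmt.Println(coverColor(70.5)) // prints "yellow"
	fmt.Println(coverColor(97.9)) // prints "green"
}
```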

@mfridman mfridman pinned this issue Aug 4, 2023
@lucavallin
Author

Hey @mfridman, awesome work! The happy path looks great, and I agree with you on having tparse always print the summary unless a cosmic cataclysm hits us (or an alien invasion). It's useful to give users the option to decide the behaviour, but I don't think this is something that needs to happen right away.

About the question, I think I would like a decrease in coverage to be obvious at a glance. Do you think it'd be possible to have the (-0.1%) part in orange/yellow if negative, and maybe a different shade of green if positive? I wouldn't make 97.9% yellow, it's still a great level of coverage to have!

@mfridman
Owner

Merged a crude implementation in #101

-compare       Compare against a previous test output file. (experimental)

Thanks for the great suggestion. There's still a bit of cleanup to do, but overall I think it's a good first step..

@lucavallin
Author

🎉 Great work, glad I could be of help!
