Assumption violation is not reflected on text UI #1525

Open
davido opened this issue May 27, 2018 · 18 comments

Comments

@davido

davido commented May 27, 2018

It would be helpful if assumption violations were reflected in the text UI. For this test class with three test methods:

package org.gerritcon.mv2016;

import static org.junit.Assert.assertEquals;
import static org.junit.Assume.assumeTrue;

import org.junit.Test;

public class PrintyTest {
  @Test
  public void test_passed() {
    assertEquals("foo", "foo");
  }

  @Test
  public void test_failed() {
    assertEquals("foo", "bar");
  }

  @Test
  public void test_assumption_violation() {
    assumeTrue(false);
  }
}

JUnit text UI produces this result:

$ java -classpath buck-out/gen/lib__printy_tests#testsjar__output/printy_tests#testsjar.jar:/home/davido/Downloads/hamcrest-core-1.3.jar:/home/davido/Downloads/junit-4.12.jar org.junit.runner.JUnitCore org.gerritcon.mv2016.PrintyTest
JUnit version 4.12
..E.
Time: 0.008
There was 1 failure:
1) test_failed(org.gerritcon.mv2016.PrintyTest)
org.junit.ComparisonFailure: expected:<[foo]> but was:<[bar]>
	at org.junit.Assert.assertEquals(Assert.java:115)
[...]
	at org.junit.runner.JUnitCore.main(JUnitCore.java:36)

FAILURES!!!
Tests run: 3,  Failures: 1

The impression is that 2 tests were successful, even though only a single test was successful and another one ended with an assumption violation. This information is lost.

Compared to that, the output from the Buck build tool is:

$ buck test :printy_tests
FAILURE org.gerritcon.mv2016.PrintyTest test_failed: expected:<[foo]> but was:<[bar]>
[-] PROCESSING BUCK FILES...FINISHED 0.0s 🏖  (Watchman reported no changes)
[-] DOWNLOADING... (0.00 B/S AVG, TOTAL: 0.00 B, 0 Artifacts)
[-] BUILDING...FINISHED 0.0s [100%] (6/6 JOBS, 0 UPDATED, 0 [0.0%] CACHE MISS)
[-] TESTING...FINISHED 0.3s (1 PASS/1 SKIP/1 FAIL)
RESULTS FOR //:printy_tests
FAIL    <100ms  1 Passed   1 Skipped   1 Failed   org.gerritcon.mv2016.PrintyTest
FAILURE org.gerritcon.mv2016.PrintyTest test_failed: expected:<[foo]> but was:<[bar]>
org.junit.ComparisonFailure: expected:<[foo]> but was:<[bar]>
	at org.junit.Assert.assertEquals(Assert.java:115)
[...]
	at java.lang.Thread.run(Thread.java:748)

TESTS FAILED: 1 FAILURE
Failed target: //:printy_tests

This line reflects more accurately what happened:

FAIL    <100ms  1 Passed   1 Skipped   1 Failed   org.gerritcon.mv2016.PrintyTest

You may wonder: why would I care about improving the JUnit text UI when, in the end, devs use dedicated build tools and IDE integrations and not java -classpath sut.jar:junit.jar org.junit.runner.JUnitCore <test class>?

It turns out that other build tools rely on the JUnit text UI to report the test outcome. Given that the assumption violation is lost here, those build tools also cannot offer a more accurate test outcome.

@JxJxJxMxNxS

Hello, I want to contribute to JUnit. I can take a look at this if it's OK.

@kcooney
Member

kcooney commented May 30, 2018

If the concern is tools that are consuming the output, those tools would need to be updated to understand the changes proposed here. The proposed changes could break those tools in the interim.

Given all that, the potential costs of the change may outweigh the benefits.

An assumption failure isn't a failure; the test was skipped because the assumptions of the test were not true with the current parameters. So showing it as passed, while confusing to some users, might be correct for others.

@kcooney
Member

kcooney commented May 30, 2018

Of course we could have the output for assumption failures be 'I' like it is for ignored tests. That would match what Buck apparently does...

@davido
Author

davido commented May 31, 2018

The proposed changes could break those tools in the interim.

I don't see how this is possible. A tool foo uses version bar of JUnit, where the assumption violation is not reflected in the text UI. JUnit fixes that bug and releases version baz. Tool foo bumps its JUnit version to baz and adapts its code to the changes in the JUnit text UI.

Of course we could have the output for assumption failures be 'I' like it is for ignored tests. That would match what Buck apparently does...

+1. Still, the preferred solution would be to add a unique char, say 'A', to differentiate assumption violations from ignored tests.

@kcooney
Member

kcooney commented May 31, 2018

I don't see how this is possible. A tool foo uses version bar of JUnit, where the assumption violation is not reflected in the text UI.

@davido Users compile against a particular version of JUnit. Tools that run tests are often developed using a different version of JUnit, but are designed to allow developers to upgrade to a newer version of JUnit without forcing them to upgrade the tool.

In any case, tools that read the output of the text UI might break if the output changes. No idea how likely that is.

Let's just have assumption failures print out an "I"

@JxJxJxMxNxS we would be glad to have your help.
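
For illustration, a minimal sketch of that idea, assuming JUnit 4.12. This is not the actual patch; it simply registers an extra RunListener next to the stock TextListener so assumption failures show up as 'I' in the dot output:

import org.junit.internal.TextListener;
import org.junit.runner.JUnitCore;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

// Sketch only: emits an 'I' marker whenever an assumption fails, next to
// the dots and E's already printed by the standard TextListener.
public class AssumptionMarkingListener extends RunListener {
  @Override
  public void testAssumptionFailure(Failure failure) {
    System.out.print('I');
  }

  public static void main(String[] args) {
    JUnitCore core = new JUnitCore();
    core.addListener(new TextListener(System.out));
    core.addListener(new AssumptionMarkingListener());
    core.run(org.gerritcon.mv2016.PrintyTest.class);
  }
}

With this registered, the marker appears next to the dot of the skipped test, e.g. something like ".I.E." for the class above (the exact order depends on the order in which the test methods run).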

@davido
Author

davido commented May 31, 2018

Let's just have assumption failures print out an "I"

Fine, I will look into it.

@JxJxJxMxNxS

Great!
Should this be the output for the test above?

rest-core-1.3.jar org.junit.runner.JUnitCore PrintyTest
JUnit version 4.13-SNAPSHOT
.I.E.
Time: 0.015
There was 1 failure:

  1. test_failed(PrintyTest)
    org.junit.ComparisonFailure: expected:<[foo]> but was:<[bar]>
    at org.junit.Assert.assertEquals(Assert.java:117)
    at org.junit.Assert.assertEquals(Assert.java:146)
    at PrintyTest.test_failed(PrintyTest.java:13)

FAILURES!!!
Tests run: 3, Failures: 1

Or should the Tests run field show just 2 tests?

@stefanbirkner
Contributor

@davido Can you give us an example of a build tool that relies on the JUnit text UI to report the test outcome?

@davido
Author

davido commented Jun 1, 2018

Can you give us an example of a build tool that relies on the JUnit text UI to report the test outcome?

Bazel, see my analysis in this issue: bazelbuild/bazel#3476.

@kcooney
Member

kcooney commented Jun 1, 2018

@davido last I checked Bazel installs a custom listener which produces Ant-style XML.

@davido
Author

davido commented Jun 1, 2018

My analysis was that, for Bazel's own output rendering, the standard log file produced by the JUnit text UI is used. The Ant-style XML is produced for consumption by/integration with external tools.

@kcooney
Member

kcooney commented Jun 1, 2018

@davido I will respond on the Bazel bug

@kcooney
Member

kcooney commented Jun 1, 2018

For the record, I don't think this change would have any effect on that Bazel issue.

@kcooney
Member

kcooney commented Jun 1, 2018

@stefanbirkner So given that we no longer have examples of tools that parse the output, should we print something different for assumption failures?

@stefanbirkner
Contributor

I feel it is a good idea. For me, your proposal that we handle it like a normal ignore makes sense, because I consider failed assumptions to be dynamic ignores.

@kcooney
Member

kcooney commented Jun 1, 2018

@stefanbirkner given the comments in Result.Listener, the original authors apparently decided that it made more sense to treat assumption failures as passed tests instead of ignored tests. I think I agree; ignored tests should eventually be either fixed or removed. In contrast, assumption failures happen when a parameterized test or theory applies to some possible data values but not all.

In any case, I don't think we should change the meaning of Result.getIgnoredCount() so that it starts including assumption failures.

@davido
Author

davido commented Jun 2, 2018

In any case, I don't think we should change the meaning of Result.getIgnoredCount() so that it starts including assumption failures.

The whole assumption violation outcome was completely forgotten. That is why my suggestion is to get it right and extend JUnit to make assumption violations first-class citizens. By that I mean: an 'A' char and a new method, Result.getAssumptionViolationCount().
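
For illustration, a rough sketch of the counting half of that proposal, assuming the tally lives in a RunListener rather than inside Result itself (the class and method names here are made up for the example):

import java.util.concurrent.atomic.AtomicInteger;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

// Sketch only: tallies assumption violations so that outcome is no longer lost.
// A real Result.getAssumptionViolationCount() would keep a similar counter.
public class AssumptionCountingListener extends RunListener {
  private final AtomicInteger assumptionViolations = new AtomicInteger();

  @Override
  public void testAssumptionFailure(Failure failure) {
    assumptionViolations.incrementAndGet();
  }

  public int getAssumptionViolationCount() {
    return assumptionViolations.get();
  }
}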

@kcooney
Member

kcooney commented Jun 2, 2018

The whole assumption violation outcome was completely forgotten.

Based on the comments, I think the authors assumed developers wouldn't want to think of assumption failures as any different than a passed test. Perhaps they didn't want to repeat the failures vs errors confusion of JUnit 3?

add new method: Result.getAssumptionViolationCount()

There is an abandoned pull request for this; see #1294 ... would you be able to help get it across the finish line?

As for the text UI, I don't personally look at the output of periods and F's (it gets lost with all the other test output) but if you do, what about something simple like "-"? And wouldn't summary text at the end of the run be infinitely more useful?
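
For illustration, a minimal sketch of such a summary, again assuming the count comes from a listener (SummaryListener is a hypothetical name for this example, not an existing JUnit class):

import java.util.concurrent.atomic.AtomicInteger;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;
import org.junit.runner.notification.RunListener;

// Sketch only: prints a closing summary line such as
// "Tests run: 3, Failures: 1, Assumption violations: 1".
public class SummaryListener extends RunListener {
  private final AtomicInteger assumptionViolations = new AtomicInteger();

  @Override
  public void testAssumptionFailure(Failure failure) {
    assumptionViolations.incrementAndGet();
  }

  @Override
  public void testRunFinished(Result result) {
    System.out.printf("Tests run: %d, Failures: %d, Assumption violations: %d%n",
        result.getRunCount(), result.getFailureCount(), assumptionViolations.get());
  }
}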
