Some Flaky tests #3170

Open
Agorguy opened this issue Sep 8, 2023 · 2 comments · May be fixed by #3171
Labels
status: waiting for feedback We need additional information before we can continue

Comments

@Agorguy

Agorguy commented Sep 8, 2023

Describe the bug
We found flaky tests failing more frequently when running in a resource-constrained environment.

  • AssertJ Core version: commit SHA 9b653c5
  • Java version: 17
  • test framework version: ?
  • Docker image: maven:3.8.3-openjdk-17

Test case reproducing the bug

The test scenario is shown below.

Comment

Hello, we tried running your project and discovered that it contains some flaky tests (i.e., tests that nondeterministically pass and fail). We found that these tests fail more frequently when running them on certain machines of ours.

To prevent others from running this project's tests on machines where they may be flaky, we suggest documenting in the README.md the minimum resource configuration for running the tests, so that test flakiness is not observed.

When we run this project on a machine with 1 CPU and 1 GB RAM, we observe flaky tests. We found no flaky tests when we ran it on machines with 2 CPUs and 2 GB RAM.

Here is a list of the tests we have identified and their likelihood of failure on a system with less than the recommended 2 CPUs and 2 GB RAM.

  1. CompletableFutureAssert-succeedsWithin#should-allow-assertion-on-future-result-when-completed-normally-within-timeout
  2. FutureAssert-succeedsWithin#should-allow-assertion-on-future-result-when-completed-normally-within-timeout
  3. Iterables-assertDoesNotHaveDuplicates#should-pass-within-time-constraints

Thank you for your attention to this matter. We hope that our recommendations will be helpful in improving the quality and performance of your project, especially for others to use.
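The two `succeedsWithin` failures reported above are `TimeoutException`s thrown by `CompletableFuture.get(timeout, unit)`. A minimal stdlib-only sketch of that failure mode (the class `TimedGetDemo` and its parameters are illustrative, not the actual test code): when completion of the future is delayed, e.g. by CPU contention on a 1-CPU container, a tight timeout budget is exceeded even though the future would eventually succeed.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TimedGetDemo {
    // Returns true if get(timeoutMs) times out for a future that completes
    // after completeAfterMs (simulating work delayed by CPU contention).
    static boolean timesOut(long completeAfterMs, long timeoutMs) {
        CompletableFuture<String> future = CompletableFuture.supplyAsync(() -> {
            try {
                Thread.sleep(completeAfterMs);
            } catch (InterruptedException e) {
                throw new RuntimeException(e);
            }
            return "done";
        });
        try {
            future.get(timeoutMs, TimeUnit.MILLISECONDS);
            return false;
        } catch (TimeoutException e) {
            return true;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // Generous budget: the future completes in time.
        System.out.println(timesOut(0, 5000));  // false
        // Delayed completion with a tight budget: times out.
        System.out.println(timesOut(500, 50));  // true
    }
}
```

On a constrained machine the scheduling delay plays the role of `completeAfterMs`, which is why timeouts in the low hundreds of milliseconds become flaky.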

Reproducing

FROM maven:3.8.3-openjdk-17

WORKDIR /home/

RUN git clone https://github.com/assertj/assertj && \
  cd assertj && \
  git checkout 9b653c5
  
WORKDIR /home/assertj

RUN mvn install -DskipTests

ENTRYPOINT ["mvn", "test", "-fn"]

Build the image:

$> mkdir tmp

$> cp Dockerfile tmp

$> cd tmp

$> docker build -t assertj . # estimated time of build 3m

Running:
This configuration likely prevents flakiness (no flakiness observed in 10 runs):

$> docker run --rm --memory=2g --cpus=2 --memory-swap=-1 assertj | tee output.txt
$> grep "Failures:"  output.txt # checking results

This other configuration, similar to the previous one, does not prevent flaky tests (flakiness observed in 10 runs):

$> docker run --rm --memory=1g --cpus=1 --memory-swap=-1 assertj | tee output2.txt
$> grep "Failures:"  output2.txt # checking results
@scordio
Member

scordio commented Sep 8, 2023

We tried running your project and discovered that it contains some flaky tests (i.e., tests that nondeterministically pass and fail).

Could you please share details about the failures you got?

@scordio scordio added the `status: waiting for feedback` label Sep 8, 2023
@Agorguy
Author

Agorguy commented Sep 11, 2023

@scordio here are the stack traces of each flaky test at the time of its failure. Please let me know if there's anything else I can share with you.

test Iterables assertDoesNotHaveDuplicates#should pass within time constraints

<testcase name="should pass within time constraints" classname="Iterables assertDoesNotHaveDuplicates" time="3.597">
<failure message="&#10;Expecting actual:&#10; 2107L&#10;to be less than:&#10; 2000L " type="java.lang.AssertionError"><![CDATA[java.lang.AssertionError:


Expecting actual:
2107L
to be less than:
2000L
at org.assertj.core.error.AssertionErrorCreator.assertionError(AssertionErrorCreator.java:93)
at org.assertj.core.internal.Failures.failure(Failures.java:126)
at org.assertj.core.internal.Comparables.assertLessThan(Comparables.java:228)
at org.assertj.core.internal.Comparables.assertLessThan(Comparables.java:193)
at org.assertj.core.api.AbstractLongAssert.isLessThan(AbstractLongAssert.java:205)
at org.assertj.core.internal.iterables.Iterables_assertDoesNotHaveDuplicates_Test.should_pass_within_time_constraints(Iterables_assertDoesNotHaveDuplicates_Test.java:86)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
]]></failure>
</testcase>
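The failure above is an elapsed-time assertion: a measured duration (2107 ms) exceeded a fixed 2000 ms budget. A stdlib-only sketch of that measure-and-bound pattern (the class `ElapsedDemo` is illustrative, not the actual test body): any fixed wall-clock upper bound is sensitive to how long the task is actually scheduled for, so it can flake under CPU contention.

```java
public class ElapsedDemo {
    // Measures the wall-clock duration of a task in milliseconds.
    static long elapsedMs(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long ms = elapsedMs(() -> {
            try {
                Thread.sleep(100);
            } catch (InterruptedException e) {
                throw new RuntimeException(e);
            }
        });
        // The failing test asserts `elapsed < 2000`; on a loaded 1-CPU
        // container the same workload can easily exceed a fixed bound.
        System.out.println(ms >= 90);  // true
    }
}
```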

test FutureAssert succeedsWithin#should allow assertion on future result when completed normally within timeout

<testcase name="should allow assertion on future result when completed normally within timeout" classname="FutureAssert succeedsWithin" time="0.496">
<failure message="&#10;Expecting&#10; &lt;CompletableFuture[Incomplete]&gt;&#10;to be completed within 110L Millis.&#10;&#10;exception caught while trying to get the future result: java.util.concurrent.TimeoutException&#10; 
at java.base/java.util.concurrent.CompletableFuture.timedGet(CompletableFuture.java:1960)&#10; 
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2095)&#10; 
at org.assertj.core.internal.Futures.assertSucceededWithin(Futures.java:100)&#10; 
at org.assertj.core.api.AbstractFutureAssert.internalSucceedsWithin(AbstractFutureAssert.java:461)&#10; 
at org.assertj.core.api.AbstractFutureAssert.succeedsWithin(AbstractFutureAssert.java:262)&#10; 
at org.assertj.core.api.future.FutureAssert_succeedsWithin_Test.should_allow_assertion_on_future_result_when_completed_normally_within_timeout(FutureAssert_succeedsWithin_Test.java:51)&#10; 
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&#10; 
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)&#10; 
at ...

test CompletableFutureAssert succeedsWithin#should allow assertion on future result when completed normally within timeout

<testcase name="should allow assertion on future result when completed normally within timeout" classname="CompletableFutureAssert succeedsWithin" time="0.397">
<failure message="&#10;Expecting&#10; &lt;CompletableFuture[Completed: &quot;done&quot;]&gt;&#10;to be completed within 110L Millis.&#10;&#10;exception caught while trying to get the future result: java.util.concurrent.TimeoutException&#10; 
at java.base/java.util.concurrent.CompletableFuture.timedGet(CompletableFuture.java:1960)&#10; 
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2095)&#10; 
at org.assertj.core.internal.Futures.assertSucceededWithin(Futures.java:100)&#10; 
at org.assertj.core.api.AbstractCompletableFutureAssert.internalSucceedsWithin(AbstractCompletableFutureAssert.java:443)&#10; 
at org.assertj.core.api.AbstractCompletableFutureAssert.succeedsWithin(AbstractCompletableFutureAssert.java:439)&#10; 
at org.assertj.core.api.future.CompletableFutureAssert_succeedsWithin_Test.should_allow_assertion_on_future_result_when_completed_normally_within_timeout(CompletableFutureAssert_succeedsWithin_Test.java:50)&#10; 
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&#10; 
at 

nicokosi added a commit to nicokosi/assertj that referenced this issue Oct 25, 2023
Use `flaky` and `performance` JUnit `@Tag` annotations.
Cf. https://junit.org/junit5/docs/current/user-guide/#writing-tests-tagging-and-filtering

Use `@Nested` annotation when mixing common tests with performance tests.
Cf. https://junit.org/junit5/docs/current/user-guide/#writing-tests-nested

Fixes assertj#3170.
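With the JUnit 5 `@Tag("flaky")` and `@Tag("performance")` annotations the commit describes, the timing-sensitive tests can be skipped on constrained machines via Maven Surefire's tag filtering. A sketch of the relevant `pom.xml` fragment, assuming a Surefire version with JUnit Platform support (the tag names are taken from the commit message):

```xml
<!-- Sketch: exclude timing-sensitive tests, e.g. on constrained CI machines. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <excludedGroups>flaky,performance</excludedGroups>
  </configuration>
</plugin>
```

The same filter can be applied ad hoc on the command line with `mvn test -DexcludedGroups=flaky,performance`.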