Slowest test times from pytest are incorrect #286

Open
goodspark opened this issue Feb 7, 2019 · 9 comments · May be fixed by #288

Comments

@goodspark

  • Python 3.7
  • pytest==4.2.0
  • freezegun==0.3.11
import pytest
from freezegun import freeze_time
from datetime import timedelta


@pytest.fixture
def freezer():
    with freeze_time(
        # ignore=['_pytest.runner']
    ) as f:
        yield f


def test(freezer):
    freezer.tick(delta=timedelta(seconds=60))
pytest --durations 1
============= slowest 1 test durations ==============
60.00s call     test_blah.py::test
============= 1 passed in 0.06 seconds ==============

If I uncomment the ignore line, the durations match what I expect:

============= slowest 1 test durations ==============
0.03s setup    test_blah.py::test
============= 1 passed in 0.08 seconds ==============

I also see that a pytest-related module was added to the default ignore list on the master branch, but that doesn't fix this for me. Perhaps we should also add _pytest.runner?
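
As an aside, for projects that want this behaviour suite-wide rather than per fixture: much newer freezegun releases than the 0.3.11 used above expose a configure() helper for extending the default ignore list. A minimal sketch, assuming such a version is installed:

# conftest.py -- sketch of a suite-wide workaround; assumes a freezegun
# release that ships freezegun.configure (not available in 0.3.11).
import freezegun

# Let pytest's runner keep reading the real clock so per-test durations
# stay meaningful, while test code still sees frozen time.
freezegun.configure(extend_ignore_list=["_pytest.runner"])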

@goodspark
Author

Actually, I tried this on a fork and it broke several tests. I'd like your input!

@boxed
Contributor

boxed commented Feb 7, 2019

If we add _pytest.runner I think you'll end up with the time not actually frozen inside your tests!

@goodspark
Author

goodspark commented Feb 7, 2019

Hm. Well, when I add it manually in my fixture, freezegun seems to work fine:

from datetime import timedelta

import pytest
from freezegun import freeze_time
import time
import random


@pytest.fixture
def freezer():
    with freeze_time(ignore=['_pytest.runner']) as f:
        yield f


def test_asd(freezer):
    a = time.time()
    tick_time = random.randint(1, 1000)
    freezer.tick(delta=timedelta(seconds=tick_time))
    b = time.time()

    assert b - a == tick_time

    time.sleep(1)

    assert time.time() == b
============================== test session starts ==============================
platform linux -- Python 3.7.1, pytest-4.2.0, py-1.7.0, pluggy-0.8.1
rootdir: /home/sam/tmp/py37, inifile: setup.cfg
collected 1 item                                                                

test_blah.py .                                                            [100%]

=========================== 1 passed in 1.07 seconds ============================

goodspark pushed a commit to goodspark/freezegun that referenced this issue Feb 8, 2019
This will fix pytest duration calculation. Fixes spulec#286
goodspark linked a pull request Feb 8, 2019 that will close this issue
@jerry-git

jerry-git commented Mar 7, 2020

Hello! Any update on #288? I ran into this with pytest-split, which utilises duration-related information for splitting the full test suite into "optimal sub suites". I basically needed to add some ugly hacks to cope with incorrect durations due to freezegun (which is an awesome library btw 👍).
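
For reference, one possible shape for such a hack (purely hypothetical, not jerry-git's actual code) is to sanitise the stored durations before pytest-split consumes them, dropping anything that cannot plausibly be a real runtime. The sketch assumes a JSON file mapping test ids to durations in seconds, roughly the shape pytest-split keeps in its .test_durations file:

# sanitize_durations.py -- hypothetical helper, not jerry-git's actual hack.
import json
import sys

MAX_PLAUSIBLE_SECONDS = 3600  # anything larger is assumed to be a frozen-clock artefact

def sanitize(path):
    with open(path) as fh:
        durations = json.load(fh)
    cleaned = {test_id: seconds for test_id, seconds in durations.items()
               if 0 <= seconds <= MAX_PLAUSIBLE_SECONDS}
    with open(path, "w") as fh:
        json.dump(cleaned, fh, indent=2, sort_keys=True)

if __name__ == "__main__":
    sanitize(sys.argv[1])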

@BenQuigley

BenQuigley commented Sep 20, 2022

This is an issue for us as well. Our pytest output is:

============================= slowest 10 durations =============================
1660129378.00s setup    code_base/test_name_removed.py
1660129376.24s setup    code_base/test_name_removed.py
1660129373.84s setup    code_base/test_name_removed.py
1660128993.35s setup    code_base/test_name_removed.py
1633128865.69s setup    code_base/test_name_removed.py
1586904688.27s setup    code_base/test_name_removed.py
1581677793.94s setup    code_base/test_name_removed.py
1581677793.14s setup    code_base/test_name_removed.py
1581634593.66s setup    code_base/test_name_removed.py
1474325275.86s setup    code_base/test_name_removed.py

Removing the freezegun.freeze_time decorators fixes the issue and causes the tests to report real runtimes.
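
The magnitude of those numbers is the clue: they are on the scale of Unix epoch timestamps, which is what you get when one endpoint of the measured interval comes from freezegun's fake clock and the other from the real one. A minimal illustration of that mechanism (not pytest's actual timing code):

# Illustration of the mechanism only -- not pytest internals.
import time
from freezegun import freeze_time

with freeze_time("2000-01-01"):
    start = time.time()   # fake clock: 946684800.0
stop = time.time()         # real clock: current epoch seconds
print(stop - start)        # epoch-scale "duration", like the report above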

@BenQuigley

Does the issue only occur during setup of unittest.TestCase-based tests?

from unittest import TestCase
from freezegun import freeze_time

@freeze_time("2000-01-01")
def test__freezegun_decorator__duration():
    pass

def test__freezegun_context_manager__duration():
    with freeze_time("2000-01-01"):
        pass

@freeze_time("2000-01-01")
class ClassTests:
    def test__freezegun_decorator__duration(self):
        pass

@freeze_time("2000-01-01")
class UnitTestClassTests(TestCase):
    def test__freezegun_decorator__duration(self):
        pass

Result:

========================================= slowest 10 durations ==========================================
946518590.17s setup    freezer.py::UnitTestClassTests::test__freezegun_decorator__duration
1.10s call     freezer.py::test__freezegun_decorator__duration
1.02s call     freezer.py::test__freezegun_context_manager__duration
0.03s setup    freezer.py::test__freezegun_decorator__duration

(5 durations < 0.005s hidden.  Use -vv to show these durations.)

Versions:

freezegun==1.2.1
pytest-freezegun==0.4.2

@blueyed

blueyed commented Dec 1, 2022

related: #463

@yulia-che

So we ended up overriding freeze_time in our test helpers for now. It feels quite hacky though. Is this what everybody is doing?
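
One possible shape for such a helper (a hypothetical sketch, not yulia-che's actual code) is a thin wrapper that always adds the ignore entry discussed earlier in this thread, so pytest's runner keeps seeing the real clock:

# test_helpers.py -- hypothetical wrapper centralising the workaround above.
from freezegun import freeze_time as _freeze_time

def freeze_time(*args, **kwargs):
    # Always let pytest's runner read the real clock so durations stay sane.
    ignore = list(kwargs.pop("ignore", []))
    if "_pytest.runner" not in ignore:
        ignore.append("_pytest.runner")
    return _freeze_time(*args, ignore=ignore, **kwargs)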

@boxed
Contributor

boxed commented Nov 30, 2023

@yulia-che I think most people migrate to time-machine.
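
For anyone considering that migration, a minimal sketch of the time-machine equivalent of a freeze_time decorator (an illustration only; details of time-machine are outside this thread):

import datetime as dt
import time_machine

@time_machine.travel(dt.datetime(2000, 1, 1, tzinfo=dt.timezone.utc), tick=False)
def test_frozen_time():
    assert dt.datetime.now(dt.timezone.utc) == dt.datetime(2000, 1, 1, tzinfo=dt.timezone.utc)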
