Add performance helpers #122

Open
JamesHutchison opened this issue Dec 2, 2023 · 1 comment
Labels
enhancement New feature or request

Comments

@JamesHutchison (Owner)

JamesHutchison commented Dec 2, 2023

When optimizing performance in tests, one thing you can do is take things that are deterministic but expensive (such as encryption-related functions) and cache them.

For example:

@pytest.fixture(scope="session", autouse=True)
def cache_derive_key() -> Iterable[None]:
    orig_derive_key = derive_key

    @functools.lru_cache()
    def derive_key_with_memoize(password: str, salt: str) -> bytes:
        return orig_derive_key(password, salt)

    MegaPatch.it(derive_key, new=derive_key_with_memoize)

    yield

To leverage this, you probably also have to make random values deterministic. Example:

uuids_stable = [uuid.uuid4() for _ in range(500)]


@pytest.fixture(autouse=True)
def deterministic_uuids() -> None:
    uuids = copy.copy(uuids_stable)
    MegaPatch.it(uuid.uuid4, new=lambda: uuids.pop())

This issue is to add generic helper functions that create these kinds of patches. I would imagine unique values could be created on demand rather than pre-generated. Then, in every test, uuid4 would return the same values in the same order (similar to a side_effect).
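
A generic helper along these lines might look roughly like the sketch below. The patch_with_stable_values name and its interface are hypothetical, not an existing MegaMock API; the sketch assumes it is called from an autouse fixture, so each test gets a fresh replay cursor while reusing the shared value list:

import uuid
from typing import Any, Callable

import pytest
from megamock import MegaPatch

# Values generated so far, keyed per patched target and shared across tests
_stable_values: dict[int, list[Any]] = {}


def patch_with_stable_values(target: Any, factory: Callable[[], Any]) -> None:
    values = _stable_values.setdefault(id(target), [])
    index = 0

    def replay() -> Any:
        nonlocal index
        # Create a new value on demand the first time this position is reached
        if index >= len(values):
            values.append(factory())
        value = values[index]
        index += 1
        return value

    MegaPatch.it(target, new=replay)


# Usage - would replace the hand-rolled deterministic_uuids fixture above
@pytest.fixture(autouse=True)
def stable_uuids() -> None:
    patch_with_stable_values(uuid.uuid4, uuid.uuid4)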

Another example. This one keeps a cached list for each unique set of arguments.

pk_cache: defaultdict[
    tuple[tuple, tuple[tuple[str, Any], ...]], list[RSAPrivateKey]
] = defaultdict(list)


@pytest.fixture(autouse=True)
def deterministic_private_key():
    orig_generate_private_key = rsa.generate_private_key
    # shallow-copy each cached list so pops during this test don't consume values needed by later tests
    cache = {k: copy.copy(v) for k, v in pk_cache.items()}

    def generate_private_key_with_per_test_cache(*args, **kwargs) -> rsa.RSAPrivateKey:
        key = (args, tuple(sorted(kwargs.items())))
        if result := cache.get(key):
            return result.pop(0)
        pk = orig_generate_private_key(*args, **kwargs)
        pk_cache[key].append(pk)
        return pk

    MegaPatch.it(rsa.generate_private_key, new=generate_private_key_with_per_test_cache)
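
A generic version of this per-argument cache might look roughly like the sketch below. The patch_with_arg_cache name is hypothetical, not an existing MegaMock API, and the sketch assumes the patched function's arguments are hashable:

import copy
from collections import defaultdict
from typing import Any, Callable

import pytest
from cryptography.hazmat.primitives.asymmetric import rsa
from megamock import MegaPatch

# Persistent per-argument caches, keyed per patched target and shared across tests
_arg_caches: dict[int, defaultdict[tuple, list[Any]]] = {}


def patch_with_arg_cache(target: Callable[..., Any]) -> None:
    persistent = _arg_caches.setdefault(id(target), defaultdict(list))
    # Shallow-copy each cached list so pops during this test don't consume
    # values needed by later tests
    replay = {key: copy.copy(values) for key, values in persistent.items()}

    def cached(*args: Any, **kwargs: Any) -> Any:
        key = (args, tuple(sorted(kwargs.items())))
        if values := replay.get(key):
            return values.pop(0)
        result = target(*args, **kwargs)
        persistent[key].append(result)
        return result

    MegaPatch.it(target, new=cached)


# Usage - would replace the hand-rolled deterministic_private_key fixture above
@pytest.fixture(autouse=True)
def stable_private_keys() -> None:
    patch_with_arg_cache(rsa.generate_private_key)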

JamesHutchison added the enhancement (New feature or request) label Dec 3, 2023
@JamesHutchison (Owner, Author)

JamesHutchison commented Dec 8, 2023

Testing with the pytest hot reloader, it looks like there's a compatibility issue between re-runs and using the lru cache. Restructuring so the memoized function isn't defined inside a nested scope fixed the issue.

orig_encrypt = encrypt_fernet
orig_decrypt = decrypt_fernet


@functools.lru_cache()
def encrypt_with_memoize(data: bytes, key: bytes) -> bytes:
    return orig_encrypt(data, key)


@functools.lru_cache()
def decrypt_with_memoize(encrypted_data: bytes, key: bytes) -> bytes:
    return orig_decrypt(encrypted_data, key)


@pytest.fixture(scope="session", autouse=True)
def cache_fernet() -> Iterable[None]:
    MegaPatch.it(encrypt_fernet, new=encrypt_with_memoize)
    MegaPatch.it(decrypt_fernet, new=decrypt_with_memoize)

    yield


orig_derive_key = derive_key


@functools.lru_cache()
def derive_key_with_memoize(password: str, salt: str) -> bytes:
    return orig_derive_key(password, salt)


@pytest.fixture(scope="session", autouse=True)
def cache_derive_key() -> Iterable[None]:
    MegaPatch.it(derive_key, new=derive_key_with_memoize)

    yield
