dsp tests #310

Merged: 33 commits merged into legend-exp:main from dsp-tests on Aug 1, 2022
Conversation

@gipert (Member) commented Jul 6, 2022

  • added some first tests
  • fix/add type hints
  • fix docstrings
  • make overwrite mode = false in build_raw and build_dsp

@gipert added the tests (Testing the package) and dsp (Digital Signal Processing) labels Jul 6, 2022
@gipert requested a review from iguinn July 6, 2022 12:52
@codecov-commenter commented Jul 6, 2022

Codecov Report

Merging #310 (7d67619) into main (9ec0884) will increase coverage by 4.53%.
The diff coverage is 65.77%.

@@            Coverage Diff             @@
##             main     #310      +/-   ##
==========================================
+ Coverage   44.88%   49.42%   +4.53%     
==========================================
  Files          74       74              
  Lines        6372     6396      +24     
==========================================
+ Hits         2860     3161     +301     
+ Misses       3512     3235     -277     
Impacted Files Coverage Δ
src/pygama/dsp/__init__.py 100.00% <ø> (ø)
src/pygama/math/peak_fitting.py 26.61% <ø> (ø)
src/pygama/dsp/processors/fftw.py 20.00% <20.00%> (ø)
src/pygama/dsp/processors/pulse_injector.py 33.33% <33.33%> (ø)
src/pygama/dsp/processors/param_lookup.py 36.36% <36.36%> (ø)
src/pygama/dsp/processors/moving_windows.py 18.75% <45.45%> (ø)
src/pygama/dsp/processors/wiener_filter.py 13.04% <66.66%> (ø)
src/pygama/dsp/processors/pole_zero.py 25.80% <75.00%> (ø)
src/pygama/dsp/processors/multi_t_filter.py 26.47% <80.00%> (ø)
src/pygama/dsp/build_dsp.py 81.81% <100.00%> (+65.32%) ⬆️
... and 28 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9ec0884...7d67619.

@gipert marked this pull request as ready for review July 6, 2022 15:54
@gipert added the docs (Package documentation) label Jul 8, 2022
@jasondet mentioned this pull request Jul 19, 2022
@jasondet changed the base branch from refactor to main July 21, 2022 00:17
@jasondet (Collaborator)

@iguinn can you review this when you get a chance?

@iguinn (Collaborator) left a comment

As someone fairly inexperienced in writing tests for Python, I want to make sure that I understand the goals for each test:

  • test_processing_chain.py: this seems to contain both integration tests, making sure that it can run the kinds of processors we want it to run (like running a scipy processor), and unit tests making sure that the parsing of the different cases works
  • test_build_dsp.py: this seems to be a functional test designed to make sure that we can process a real file
  • test_waveform_browser.py: this seems to be a functional test that draws waveforms

For test_build_dsp, as I commented above, do we want this to use a full-fledged LEGEND DSP config file, or do we want to pare it down to more of a minimal working file? My (possibly flawed) understanding is that we would prefer to use unit testing for the individual processors and design this test to be simpler?

For test_waveform_browser.py, this involves actually drawing waveforms. How does this work for automatic testing? Is it more or less just going to look for if any errors are thrown, or is this more meant for someone to run on their own to visually confirm that waveforms are drawn correctly?

from pygama.lgdo.lh5_store import LH5Store, ls
from pygama.raw import build_raw

config_dir = Path(__file__).parent/'configs'
@iguinn (Collaborator)

Are we sure this is how we want to handle paths to JSON config files for testing? Another option would be to add the files to our setup.cfg (https://setuptools.pypa.io/en/latest/userguide/datafiles.html) and use importlib.resources (https://docs.python.org/3/library/importlib.html#module-importlib.resources) to access them in cases like this. This could also be useful if we wanted to have a default config file that we use for tutorials and such.
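For illustration, the importlib.resources approach suggested here could look roughly like this. This is a sketch only: the package and file names in the usage comment are hypothetical, and it assumes the configs have been declared as package data in setup.cfg.

```python
from importlib import resources


def read_package_config(package: str, filename: str) -> str:
    """Read a text data file bundled inside an installed package."""
    return resources.files(package).joinpath(filename).read_text()


# hypothetical usage, assuming a dsp_config.json was shipped as package data:
# import json
# cfg = json.loads(read_package_config("pygama.dsp", "dsp_config.json"))
```

Compared to `Path(__file__).parent`, this keeps working after installation, since the files are resolved relative to the installed package rather than the source tree.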

@gipert (Member Author)

I don't think it's useful to install test data/configs. By design, the test suite is meant to be run by developers on the pygama source (not on the installed code!).

For tutorials/docs the story is different. I agree that we should include them as installed data files.



def test_build_dsp_basics(lgnd_test_data):
build_dsp(lgnd_test_data.get_path('lh5/LDQTA_r117_20200110T105115Z_cal_geds_raw.lh5'),
@iguinn (Collaborator)

I think I'm missing something here... What kind of object is lgnd_test_data and how is it getting passed here? I assume this is pointing to the test-data repository?

@gipert (Member Author)

It's defined here as a pytest fixture. Fixtures are useful for sharing objects between tests.
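For readers unfamiliar with the pattern, a minimal sketch of a session-scoped fixture follows. The `LegendTestData` class and its path are hypothetical stand-ins for the real legend-testdata accessor, not the actual implementation.

```python
import pytest


class LegendTestData:
    """Hypothetical stand-in for the legend-testdata accessor object."""

    def __init__(self, base_dir: str = "/tmp/legend-testdata"):
        self.base_dir = base_dir

    def get_path(self, relpath: str) -> str:
        # resolve a file inside the (cached) test-data checkout
        return f"{self.base_dir}/{relpath}"


# session scope: the object is built once per test session and shared by
# every test that lists the fixture name as an argument
@pytest.fixture(scope="session")
def lgnd_test_data():
    return LegendTestData()


def test_get_path(lgnd_test_data):
    assert lgnd_test_data.get_path("lh5/f.lh5").endswith("f.lh5")
```

When pytest collects `test_get_path`, it sees the `lgnd_test_data` argument, builds (or reuses) the fixture, and injects it automatically.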

@@ -0,0 +1,327 @@
{
@iguinn (Collaborator)

Are we sure we want to use a full-fledged DSP file like this, rather than a more minimal example? It seems like the job of testing the individual processors should go to unit tests, and for testing build_dsp and processing_chain we may want a more minimal file.

@gipert (Member Author)

I agree that we want unit tests for the processors folder. My plan (not part of this PR) is to have a test for each single processor. Still, testing something similar to what we have in the L200 data production seems useful to me...
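A pared-down config in the spirit of this suggestion might look roughly like the sketch below. This follows the general shape of pygama DSP JSON configs (an "outputs" list plus a "processors" map), but the processor names and arguments are illustrative only and are not taken from this PR.

```json
{
  "outputs": ["bl_mean", "wf_blsub"],
  "processors": {
    "bl_mean, bl_std": {
      "function": "mean_stdev",
      "module": "pygama.dsp.processors",
      "args": ["waveform[0:1000]", "bl_mean", "bl_std"],
      "unit": ["ADC", "ADC"]
    },
    "wf_blsub": {
      "function": "bl_subtract",
      "module": "pygama.dsp.processors",
      "args": ["waveform", "bl_mean", "wf_blsub"],
      "unit": "ADC"
    }
  }
}
```

A file of this size exercises the parsing, dependency ordering, and I/O of build_dsp while leaving per-processor correctness to dedicated unit tests.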

@jasondet (Collaborator)

@iguinn can you check these conflicts when you get a chance?

@gipert (Member Author) commented Aug 1, 2022

As someone fairly inexperienced in writing tests for Python, I want to make sure that I understand the goals for each test:

  • test_processing_chain.py: this seems to contain both integration tests, making sure that it can run the kinds of processors we want it to run (like running a scipy processor), and unit tests making sure that the parsing of the different cases works
  • test_build_dsp.py: this seems to be a functional test designed to make sure that we can process a real file
  • test_waveform_browser.py: this seems to be a functional test that draws waveforms

Correct.

For test_build_dsp, as I commented above, do we want this to use a full-fledged LEGEND DSP config file, or do we want to pare it down to more of a minimal working file? My (possibly flawed) understanding is that we would prefer to use unit testing for the individual processors and design this test to be simpler?

Correct, see my comment above.

For test_waveform_browser.py, this involves actually drawing waveforms. How does this work for automatic testing? Is it more or less just going to look for if any errors are thrown, or is this more meant for someone to run on their own to visually confirm that waveforms are drawn correctly?

Yes, it is more or less just checking whether any errors are thrown. Testing image outputs is fairly complicated, obviously, and I'm not sure we'll ever have time to work on it...

@gipert (Member Author) commented Aug 1, 2022

@jasondet the Sphinx bug has been fixed: sphinx-doc/sphinx#10701

I'm going to merge this, I'll continue working on tests in future PRs.

@gipert merged commit 5ae84bc into legend-exp:main Aug 1, 2022
@gipert deleted the dsp-tests branch August 10, 2022 12:03
Labels: docs (Package documentation), dsp (Digital Signal Processing), tests (Testing the package)
4 participants