Feat/custom resampler #419

Open
wants to merge 4 commits into base: master
Conversation

@wreise wreise commented Jun 7, 2020

Reference issues/PRs
None

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Description
Currently, the (-i)-th entry of SlidingWindow().resample(y) is the last element of the (-i)-th full window.
This PR offers a generalization: SlidingWindow.target_resampler is applied to SlidingWindow().transform(y)[i] to obtain SlidingWindow().resample(y)[i].
The default, SlidingWindow.target_resampler = lambda x: x[-1], leaves the current behavior unchanged.
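A minimal usage sketch of the proposed behavior, assuming the new target_resampler keyword from this PR on top of the existing width and stride parameters (illustrative only; exact input-shape and fitting conventions follow the SlidingWindow docs):

    import numpy as np
    from gtda.time_series import SlidingWindow

    y = np.arange(20)

    # Default behaviour (unchanged): the i-th resampled entry is the last
    # element of the i-th full window, i.e. target_resampler=lambda x: x[-1].
    sw_last = SlidingWindow(width=5, stride=2)
    sw_last.fit(y)
    y_last = sw_last.resample(y)

    # Proposed generalization: apply a custom function to each window
    # produced by transform, e.g. the window mean.
    sw_mean = SlidingWindow(width=5, stride=2,
                            target_resampler=lambda x: x.mean())
    sw_mean.fit(y)
    y_mean = sw_mean.resample(y)  # y_mean[i] == sw_mean.transform(y)[i].mean()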

Screenshots (if appropriate)

Any other comments?
I am not sure this is the right place to introduce it. Maybe a separate transformer inheriting from SlidingWindow would be better?

Checklist

  • I have read the guidelines for contributing.
  • My code follows the code style of this project. I used flake8 to check my Python changes.
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.
  • I have added tests to cover my changes.
  • All new and existing tests passed. I used pytest to check this on Python tests.

@wreise wreise added the enhancement New feature or request label Jun 7, 2020
@wreise wreise self-assigned this Jun 7, 2020
@wreise wreise requested a review from ulupo June 7, 2020 09:01

@gtauzin gtauzin left a comment

I made a technical comment, but I am not sure this is the proper way of doing it. It's quite unsafe: depending on target_resampler, the resampling may not be consistent with the way the input is transformed in transform.

self.width = width
self.stride = stride

if target_resampler is None:

Parameter logic should not be in the __init__ of a scikit-learn estimator. Move the logic to fit and use it to define an attribute, as we do for metric_params in diagrams.PairwiseDistances, for example.
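A minimal sketch of the suggested pattern, assuming the resolved callable is stored in a fitted attribute (the name effective_resampler_ below is hypothetical): __init__ only stores parameters verbatim, and the None default is resolved in fit.

    from sklearn.base import BaseEstimator, TransformerMixin

    class SlidingWindow(BaseEstimator, TransformerMixin):
        """Sketch: __init__ stores parameters as passed; the None default
        for target_resampler is resolved in fit into a fitted attribute."""

        def __init__(self, width=10, stride=1, target_resampler=None):
            self.width = width
            self.stride = stride
            # No parameter logic here: store the argument unchanged.
            self.target_resampler = target_resampler

        def fit(self, X, y=None):
            # Resolve the default at fit time, following the suggestion above.
            self.effective_resampler_ = (
                (lambda x: x[-1]) if self.target_resampler is None
                else self.target_resampler)
            return self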
