Doctests for RMSE and MeanPairwiseDistance #2307

Merged: 3 commits into pytorch:master on Nov 1, 2021

Conversation

DevPranjal (Contributor) opened this pull request:

Addresses #2265

Description: Added doctests for RMSE and MeanPairwiseDistance
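
For context, the added doctests follow roughly this pattern in the docstrings (a sketch only: the exact merged snippets and expected output values may differ, and default_evaluator comes from the shared doctest setup discussed below):

.. testcode::

    metric = RootMeanSquaredError()
    metric.attach(default_evaluator, 'rmse')
    preds = torch.tensor([1.0, 2.0, 4.0, 1.0])
    target = preds * 0.75
    state = default_evaluator.run([[preds, target]])
    print(state.metrics['rmse'])

Each testcode block is paired with a matching .. testoutput:: block holding the expected printed value; those are omitted from this sketch.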

Checklist:

  • New tests are added (if a new feature is added)
  • New doc strings: description and/or example code are in RST format
  • Documentation is updated (if required)

github-actions bot added the docs and module: metrics labels on Oct 31, 2021
Comment on lines +344 to +351
from ignite.engine import Engine

# create default evaluator for doctests
def process_function(engine, batch):
    y_pred, y = batch
    return y_pred, y

default_evaluator = Engine(process_function)

DevPranjal (Contributor, PR author) commented:

I have added the global for the default evaluator that @vfdev-5 was referring to in this comment.
I also felt we could mention it somewhere. What would be the right place to mention it, if necessary?
Thanks :)

sdesrozis (Contributor) commented:

@DevPranjal Thank you. I think we already discussed that point; the idea is to have simple, copy/paste-able pieces of code, even if a lot of code could be reused.

@vfdev-5 what do you think?

vfdev-5 (Collaborator) commented on Nov 1, 2021:

@sdesrozis I do not see the point in cluttering API examples with other "unrealistic" code just to make them executable.
We can, however, publicly expose doctest_global_setup to users so that they can reuse it if they need reproducible code.
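
For reference, doctest_global_setup is the sphinx.ext.doctest option that runs a string of setup code before every doctest during the docs build. A minimal sketch of exposing the shared evaluator through it (illustrative only; the exact file and contents in this repository may differ):

# docs/source/conf.py (assumes "sphinx.ext.doctest" is listed in extensions)
doctest_global_setup = """
import torch
from ignite.engine import *
from ignite.metrics import *
from ignite.utils import *

manual_seed(666)

# shared evaluator reused by the metric doctests
def process_function(engine, batch):
    y_pred, y = batch
    return y_pred, y

default_evaluator = Engine(process_function)
"""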

sdesrozis (Contributor) replied:

The point is that users can't copy/paste the code and run it as-is. I thought we wanted to provide users with easily reproducible examples.

Anyway, let's decide whether we hide things, and let's deploy 😊

vfdev-5 (Collaborator) replied:

I know, but it means for this metric we'll have the following:

import torch
from torch import nn, optim
from ignite.engine import *
from ignite.handlers import *
from ignite.metrics import *
from ignite.utils import *

manual_seed(666)

def process_function(engine, batch):
    y_pred, y = batch
    return y_pred, y

default_evaluator = Engine(process_function)

# --- API example 
metric = MeanPairwiseDistance(p=4)
metric.attach(default_evaluator, 'mpd')
preds = torch.Tensor([
    [1, 2, 4, 1],
    [2, 3, 1, 5],
    [1, 3, 5, 1],
    [1, 5, 1, 11],
])
target = preds * 0.75
state = default_evaluator.run([[preds, target]])
print(state.metrics['mpd'])

Half of the code snippet is just prep.

sdesrozis (Contributor) replied:

We are talking about roughly 30 lines of code, and by using a lambda function we could reduce that. I'm not sure it's worth hiding rather than exposing.

Anyway, that's a matter of taste.
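
For illustration, the lambda reduction mentioned above could shrink the prep to something like this (a sketch, not what was merged):

from ignite.engine import Engine

# each batch is already a (y_pred, y) pair, so the process function can just return it
default_evaluator = Engine(lambda engine, batch: batch)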

vfdev-5 (Collaborator) left a review comment:

LGTM, thanks @DevPranjal !

@vfdev-5 vfdev-5 merged commit a93e481 into pytorch:master Nov 1, 2021
ydcjeff mentioned this pull request on Nov 5, 2021
fco-dv pushed a commit to fco-dv/ignite that referenced this pull request Nov 23, 2021

* Add doctests for RMSE and MeanPairwiseDistance

* Add doctests for RMSE and MeanPairwiseDistance

* Make metric name consistent

Labels: docs, module: metrics
Projects: None yet
3 participants