Evaluate XRAI's label sensitivity #2

Open · 2 tasks
aboggust opened this issue Jun 7, 2023 · 0 comments

Labels: good first issue (Good first issue to address) · new evaluation results (Add results from an existing evaluation to an untested saliency method)

aboggust (Collaborator) commented Jun 7, 2023

Existing label sensitivity evaluations have not yet characterized XRAI, so its saliency card has no results for label sensitivity tests.

Evaluate XRAI's label sensitivity and record the results in its saliency card. The label sensitivity tests are:
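Label sensitivity tests check whether a saliency method's output actually depends on the target label. As a rough illustration of the idea (not the evaluation protocol this issue asks for), the sketch below compares gradient-times-input attributions for two different output labels of a toy linear softmax model and measures their rank agreement; a real evaluation would run XRAI on a trained image model instead. The model, attribution, and correlation helper here are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a classifier: a linear model with one weight row per class.
# (Hypothetical setup -- the actual task would evaluate XRAI on a real model.)
num_features, num_classes = 16, 3
W = rng.normal(size=(num_classes, num_features))
x = rng.normal(size=num_features)

def attribution(x, label):
    """Gradient-times-input attribution for one output label.

    For a linear model, d(logit_label)/dx is just the weight row W[label].
    """
    return W[label] * x

def spearman(a, b):
    """Spearman rank correlation between two attribution maps."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float(ra @ rb / (np.linalg.norm(ra) * np.linalg.norm(rb)))

# Same label twice -> correlation 1.0; different labels -> lower correlation
# for a label-sensitive attribution method.
same = spearman(attribution(x, 0), attribution(x, 0))
diff = spearman(attribution(x, 0), attribution(x, 1))
print(same, diff)
```

A label-insensitive method would produce nearly identical maps for any label (correlation near 1.0 in both cases); the recorded result on the saliency card would summarize how much XRAI's maps change under label swaps.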

aboggust added the "new evaluation results" and "good first issue" labels on Jun 7, 2023
Projects: none yet
Development: no branches or pull requests
1 participant