
[Grounding DINO] Add support for cross-attention in GroundingDinoMultiHeadAttention #30364

Merged

Conversation

EduardoPach
Contributor

What does this PR do?

This PR fixes #30176 by adding support for cross-attention masking, along with a corresponding integration test.

Batched inference with texts of different lengths now works, for example:

import torch
from transformers import AutoModelForZeroShotObjectDetection, AutoProcessor

model_id = "IDEA-Research/grounding-dino-tiny"

model = AutoModelForZeroShotObjectDetection.from_pretrained(model_id)
processor = AutoProcessor.from_pretrained(model_id)

images = [...]
text = ["a cat. a remote control.", "a person. a cat. a remote control. a couch."]

inputs = processor(images=images, text=text, padding="longest", return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)
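Conceptually, the fix amounts to applying the text attention mask inside the cross-attention, so that padded text tokens receive no attention weight. The sketch below is illustrative only and is not the actual `GroundingDinoMultiHeadAttention` implementation; the function name and shapes are assumptions for the example:

```python
import torch

def masked_cross_attention(queries, keys, values, text_mask):
    # queries: (batch, num_queries, dim) - e.g. image-side queries
    # keys/values: (batch, text_len, dim) - text features, padded to a common length
    # text_mask: (batch, text_len) - 1 for real tokens, 0 for padding
    scores = torch.matmul(queries, keys.transpose(-1, -2)) / keys.shape[-1] ** 0.5
    # Mask out padded text positions before the softmax so they get ~zero weight
    scores = scores.masked_fill(text_mask[:, None, :] == 0, torch.finfo(scores.dtype).min)
    weights = scores.softmax(dim=-1)
    return torch.matmul(weights, values), weights

# Batch of 2 where the second text is shorter and padded to length 4
q = torch.randn(2, 3, 8)
k = torch.randn(2, 4, 8)
v = torch.randn(2, 4, 8)
mask = torch.tensor([[1, 1, 1, 1], [1, 1, 0, 0]])
out, weights = masked_cross_attention(q, k, v, mask)
# weights[1, :, 2:] is ~0: padded tokens in the second sample are ignored
```

Without such masking, batching texts of different lengths would let queries attend to padding tokens, changing the outputs relative to unbatched inference.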

@amyeroberts
Collaborator

Thanks for adding @EduardoPach! Let us know when it's ready for review!

@EduardoPach
Contributor Author

> Thanks for adding @EduardoPach! Let us know when it's ready for review!

It's already ready for review; it was easier than I expected.

Collaborator

@amyeroberts amyeroberts left a comment


Thanks for adding this and a corresponding test! ❤️

Main comments are for robustness when using the model in transformers

@EduardoPach EduardoPach requested a review from amyeroberts April 22, 2024 19:14
Collaborator

@amyeroberts amyeroberts left a comment


Thanks for adding!

@amyeroberts amyeroberts merged commit c651ea9 into huggingface:main Apr 23, 2024
17 checks passed

Successfully merging this pull request may close these issues.

Add Cross Attention Masking for Grounding DINO