
Input images are stretched when using test #1644

Open
KreesoAuriga opened this issue Apr 19, 2024 · 1 comment

Comments

@KreesoAuriga

I trained a model on my own data set of 22,000 256x256 jpg image pairs. But when I use test.py to generate B images from A images that are also 256x256, the result contains only the left 128x256 pixels of the input, stretched out to 256x256.
During training, the output in visdom looked correct, with no stretching. I can't figure out what would cause this. I haven't passed any extra command-line options.

My command for test.py is:
python test.py --dataroot ./datasets/biosphereNormalMaps --name biosphereNormalMaps --model pix2pix --direction AtoB

I've tried saving the test images as both jpg and png, with the same result.

@KreesoAuriga
Author

Oh, I see what it is now. The test script assumes that each input file contains an A-domain image on the left and a B-domain image on the right. So when I give it a 256x256 image that is only an A image, it stretches that image to 512x256 and treats the left half as A and the right half as B.
Is there a command line option, or a dedicated script, for generating images from data that is just a single A or B image?
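To illustrate the behavior described above, here is a minimal sketch (not the repository's actual loader code) of how an aligned dataset loader that expects side-by-side A|B pairs ends up stretching a lone 256x256 A image. The function name `split_aligned_pair` and the default sizes are hypothetical, chosen only to mirror the symptom: the whole file is resized to the expected pair width first, then split down the middle.

```python
from PIL import Image

def split_aligned_pair(path, load_width=512, load_height=256):
    """Sketch of an aligned A|B loader: resize the whole file to the
    expected pair size, then split it into left (A) and right (B) halves.

    If `path` points to a lone 256x256 A image rather than a 512x256
    A|B pair, the resize stretches it to double width, so the returned
    "A" is just the left 128 columns of the original stretched to 256
    wide -- exactly the artifact described in this issue.
    """
    img = Image.open(path).convert("RGB")
    # Assumes a side-by-side pair; a single image gets stretched here.
    img = img.resize((load_width, load_height), Image.BICUBIC)
    half = load_width // 2
    a = img.crop((0, 0, half, load_height))            # left half -> domain A
    b = img.crop((half, 0, load_width, load_height))   # right half -> domain B
    return a, b
```

Running this on a 256x256 image whose left half is red and right half is blue yields an "A" that is almost entirely red and a "B" that is almost entirely blue, even though the file contained no B image at all.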
