Make sure DECTRIS compound HDF5 files work #1544
Just tested this with h5py versions 3.0 through current, and all of them seem to fail to "see" / visit the dataset correctly.
It seems that the master file just contains a list of separate dataset entries for each of the compound files:
The total scan is 1024x1024, and it looks like stacking all these separate entries together would give you that. I couldn't find an entry that combines the files. I'll try a few things to see if we can make it work somehow.
Following this example, I got it to work by creating a proper virtual HDF5 dataset for the whole thing. The code to create the master entry and the resulting master entry (master.h5) are attached. To open in LiberTEM, use a 1024x1024 nav shape.
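A hedged sketch of how such a virtual dataset could be assembled with h5py (this is not the attached notebook code; the file names, dataset path `entry/data/data`, frame shape, and number of compound files are all assumptions for illustration):

```python
import h5py

# Assumed layout: N_FILES compound data files, each holding a contiguous
# stack of detector frames under entry/data/data. All names and shapes
# below are placeholders, not taken from the actual acquisition.
N_FILES = 4
FRAMES_PER_FILE = 262144          # 1024 * 1024 frames total, assumed split
FRAME_SHAPE = (96, 96)            # detector frame size, an assumption

layout = h5py.VirtualLayout(
    shape=(N_FILES * FRAMES_PER_FILE,) + FRAME_SHAPE, dtype="uint16"
)
for i in range(N_FILES):
    # Map each per-file frame stack into its slot in the combined stack.
    source = h5py.VirtualSource(
        f"data_{i + 1:06d}.h5", "entry/data/data",
        shape=(FRAMES_PER_FILE,) + FRAME_SHAPE,
    )
    layout[i * FRAMES_PER_FILE:(i + 1) * FRAMES_PER_FILE] = source

with h5py.File("master_vds.h5", "w") as f:
    # Regions backed by missing source files read back as the fill value.
    f.create_virtual_dataset("entry/data/data", layout, fillvalue=0)
```

The VDS file itself only stores the mapping, so it stays small regardless of the data volume, and it can be created even before the source files are present.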
It seems that the file also has a few other interesting entries for our corrections, e.g. a pixel mask to get rid of the 65535 max uint16 value for dead pixels.
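A minimal sketch of applying such a pixel mask, assuming the NeXus-style convention where 0 marks a good pixel and any nonzero value marks a defective one (the mask convention and dtypes here are assumptions, not read from the actual file):

```python
import numpy as np

def apply_pixel_mask(frame: np.ndarray, pixel_mask: np.ndarray) -> np.ndarray:
    """Return the frame as float32 with defective pixels set to NaN.

    Assumes the NeXus pixel_mask convention: 0 = good pixel,
    nonzero = defective (e.g. dead pixels saturated at 65535).
    """
    out = frame.astype(np.float32)
    out[pixel_mask != 0] = np.nan
    return out

# Tiny illustrative example: one dead pixel saturated at the uint16 max.
frame = np.array([[1, 65535], [3, 4]], dtype=np.uint16)
mask = np.array([[0, 1], [0, 0]], dtype=np.uint32)
corrected = apply_pixel_mask(frame, mask)
```

Converting to float and using NaN lets downstream reductions (e.g. `np.nanmean`) ignore the masked pixels instead of summing the saturated values.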
In conclusion, it could be worth exploring an Eiger-NeXus dataset derived from the HDF5 one that creates the HDF5 virtual dataset in a temporary file and interprets the pixel mask and flatfield correction. For the time being, we can build these virtual datasets with the code in the notebook, in a separate file for each acquisition.
Make sure https://er-c-data-mgmt.fz-juelich.de/browse/?path=/adhoc/arina_cryo4d_stem/maxleotest_master.h5 works in the web GUI without having to specify the dataset path.
Also, we could consider adding hdf5plugin as a dependency and not only an extra, since datasets acquired with DECTRIS camera default settings require it.