
Make sure DECTRIS compound HDF5 files work #1544

Open
uellue opened this issue Nov 8, 2023 · 7 comments

uellue commented Nov 8, 2023

Make sure https://er-c-data-mgmt.fz-juelich.de/browse/?path=/adhoc/arina_cryo4d_stem/maxleotest_master.h5 works in the web GUI without having to specify the dataset path.

Also, we could consider adding hdf5plugin as a hard dependency rather than only an extra, since datasets acquired with DECTRIS camera default settings require it.

uellue added this to the 0.14 milestone Nov 8, 2023

sk1p commented Nov 8, 2023

Just tested this with h5py versions 3.0 to current, and all of them seem to fail to "see" / visit the dataset correctly.


uellue commented Dec 18, 2023

It seems that the master file just contains a separate dataset entry for each of the component data files:

 - entry : <HDF5 group "/entry" (4 members)>
		 - data : <HDF5 group "/entry/data" (105 members)>
			 - data_000001 : <HDF5 dataset "data": shape (10000, 96, 96), type "<u2">
					 - image_nr_high : 10000
					 - image_nr_low : 1
			 - data_000002 : <HDF5 dataset "data": shape (10000, 96, 96), type "<u2">
					 - image_nr_high : 20000
					 - image_nr_low : 10001
			 - data_000003 : <HDF5 dataset "data": shape (10000, 96, 96), type "<u2">
					 - image_nr_high : 30000
					 - image_nr_low : 20001
			 - data_000004 : <HDF5 dataset "data": shape (10000, 96, 96), type "<u2">
					 - image_nr_high : 40000
					 - image_nr_low : 30001

The total scan is 1024x1024, and it looks like stacking all these separate entries together would give you that. I couldn't find an entry that combines the files. I'll try a few things to see if we can make it work somehow.

uellue commented Dec 18, 2023

Following this example, I got it to work by creating a proper virtual HDF5 dataset for the whole thing.

Code to create the master entry: create_true_master.ipynb

Master entry: master.h5

To open in LiberTEM, open with 1024x1024 nav shape.


uellue commented Dec 18, 2023

It seems that the file also has a few other interesting entries for our corrections, to get rid of the 65535 max int value for dead pixels:

import h5py
import numpy as np

with h5py.File('maxleotest_master.h5') as f:
    pixel_mask = f['entry/instrument/detector/detectorSpecific/pixel_mask']
    print(pixel_mask, np.min(pixel_mask), np.max(pixel_mask))

    # Read the scalar value; bool() of the Dataset object itself is always True
    print(bool(f['entry/instrument/detector/flatfield_correction_applied'][()]))

    flatfield = f['entry/instrument/detector/detectorSpecific/flatfield']
    print(flatfield, np.min(flatfield), np.max(flatfield))

<HDF5 dataset "pixel_mask": shape (96, 96), type "<u4"> 0 4
True
<HDF5 dataset "flatfield": shape (96, 96), type "<f4"> 1.0 1.0
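A possible correction pass based on these entries (a minimal numpy sketch, not LiberTEM's corrections API; the mask values and zeroing strategy are assumptions for illustration):

```python
import numpy as np

# Pixels flagged in pixel_mask (nonzero entries) read out as the
# uint16 maximum (65535); exclude them before further processing.
pixel_mask = np.zeros((96, 96), dtype=np.uint32)
pixel_mask[10, 20] = 4                 # an example flagged (defective) pixel

frame = np.full((96, 96), 100, dtype=np.uint16)
frame[10, 20] = 65535                  # defective pixel saturates

# Zero out masked pixels; a real correction might interpolate instead
repaired = np.where(pixel_mask != 0, 0, frame).astype(np.uint16)
assert repaired.max() == 100
```

Since the flatfield here is all 1.0, multiplying it in would currently be a no-op, but files from other detectors may carry a nontrivial one.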


uellue commented Dec 18, 2023

In conclusion, it could be worth exploring an Eiger/NeXus dataset type, derived from the HDF5 one, that creates the virtual HDF5 dataset in a temporary file and interprets the pixel mask and flatfield correction.

For the time being, we can build these virtual datasets with the code in the notebook, in a separate file for each acquisition.
