Instant DexNerf

Depth/3D shape estimation of transparent objects from multiview posed RGB images. The project was inspired by Dex-NeRF: Using a Neural Radiance Field to Grasp Transparent Objects and Instant Neural Graphics Primitives. Combining the two methods provides both fast training/rendering and accurate depth map estimation.

How to install

For installation steps please refer to Instant NGP.

How to run

The repository includes an example scene with a transparent object. Run it from the command line:

./build/testbed --scene data/nerf/canister/transforms.json

In the GUI you can adjust the sigma parameter and switch between normal depth rendering and Dex depth rendering. By default, sigma = 15.
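To make the role of sigma concrete, here is a minimal NumPy sketch of the Dex-NeRF depth rule: the depth along a ray is the distance of the first sample whose density exceeds the threshold, instead of the usual alpha-composited expected depth. This is not the testbed's implementation; the names ts and sigmas are hypothetical per-ray arrays used only for illustration.

```python
import numpy as np

def dex_depth(ts, sigmas, sigma_thrsh=15.0):
    """Dex-NeRF style depth for one ray.

    ts     -- sample distances along the ray, ascending (N,)
    sigmas -- predicted densities at those samples (N,)
    """
    hit = np.nonzero(sigmas > sigma_thrsh)[0]
    # First sample whose density crosses the threshold defines the surface;
    # if no sample crosses it, the ray sees no surface.
    return ts[hit[0]] if hit.size else np.inf
```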

How to create depth maps of captured scenes

The /scripts folder contains a main.py script for depth map generation.

  1. Fill the scene_dirs list with paths to your folders with RGB images. Each folder must have the following structure (example groundtruth_handeye.txt and intrinsics.txt files are in /data/nerf/canister; a sketch of converting these poses follows this list):
├── scene_folder
│   ├── img_dir (folder with RGB images; "rgb" by default, but you can change the name in main.py)
│   ├── groundtruth_handeye.txt (c2w extrinsics as quaternions for each image, in x,y,z,w format)
│   ├── intrinsics.txt (w h fx 0 cx 0 fy cy 0 0 1 [360 or 180]; 180 for a forward-facing scene, 360 for a 360° scene)
  2. Set the parameters for training and rendering: depth_dir, sigma_thrsh, aabb_scale, train_steps.
  3. Run main.py. The rendered depth maps are written to the scene_folder/depth_dir folder.
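As a sketch of how a pose from groundtruth_handeye.txt turns into the 4x4 c2w matrix that transforms.json needs: the snippet below assumes one line holds a translation followed by an x,y,z,w quaternion (that layout is an assumption here; check main.py/ours2nerf.py for the exact parsing).

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def pose_to_c2w(tx, ty, tz, qx, qy, qz, qw):
    """Build a 4x4 camera-to-world matrix from a translation and an
    x,y,z,w quaternion (layout assumed, see groundtruth_handeye.txt)."""
    c2w = np.eye(4)
    c2w[:3, :3] = R.from_quat([qx, qy, qz, qw]).as_matrix()  # scipy expects x,y,z,w
    c2w[:3, 3] = [tx, ty, tz]
    return c2w
```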

Note: if you use a world coordinate system different from ours, please adapt transform_matrix in ours2nerf.py. The c2w matrices are multiplied by transform_matrix before they are written to the transforms.json file. Otherwise, poses are expected to be in OpenCV format.
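For orientation, a common choice when going from OpenCV-convention poses to the NeRF/instant-ngp convention (camera looking down -z, +y up, versus OpenCV's +z forward, +y down) is to flip the camera's y and z axes. The matrix and the side it multiplies on below are only this common choice, not necessarily what ours2nerf.py does for your rig.

```python
import numpy as np

# Flip the camera's y and z axes (OpenCV -> OpenGL/NeRF camera convention).
transform_matrix = np.diag([1.0, -1.0, -1.0, 1.0])

def adapt_pose(c2w_opencv):
    """Apply the convention change before writing the pose to transforms.json."""
    return c2w_opencv @ transform_matrix
```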

Depending on your scene geometry, you may also need to tune the scale and offset parameters.
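instant-ngp reads optional "scale" and "offset" keys from transforms.json; they rescale and recenter the cameras so the object of interest sits inside the unit cube the network models. The values below are placeholders, not recommended settings.

```python
# Placeholder transforms.json skeleton; tune "scale"/"offset" to your scene.
transforms = {
    "aabb_scale": 1,
    "scale": 0.33,
    "offset": [0.5, 0.5, 0.5],
    "frames": [],  # per-image file_path + transform_matrix entries go here
}
```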

Results

Example 1

[images: RGB frame 000039 and rendered depth map rendered_depth1]

Example 2

[images: RGB frame 000050 and rendered depth map rendered_depth2]

Depth error

Example 1

[images: depth error map error_depth1 with colorbar]

Example 2

[images: depth error map error_depth2 with colorbar]

Citation

Kudos to the authors for their great work:

@inproceedings{IchnowskiAvigal2021DexNeRF,
  title={{Dex-NeRF}: Using a Neural Radiance field to Grasp Transparent Objects},
  author={Ichnowski*, Jeffrey and Avigal*, Yahav and Kerr, Justin and Goldberg, Ken},
  booktitle={Conference on Robot Learning (CoRL)},
  year={2021}
}
@article{mueller2022instant,
  author = {Thomas M\"uller and Alex Evans and Christoph Schied and Alexander Keller},
  title = {Instant Neural Graphics Primitives with a Multiresolution Hash Encoding},
  journal = {ACM Trans. Graph.},
  issue_date = {July 2022},
  volume = {41},
  number = {4},
  month = jul,
  year = {2022},
  pages = {102:1--102:15},
  articleno = {102},
  numpages = {15},
  url = {https://doi.org/10.1145/3528223.3530127},
  doi = {10.1145/3528223.3530127},
  publisher = {ACM},
  address = {New York, NY, USA},
}