This repository provides a dataset generator that converts AI2-THOR environments into the structured JSON format of the Blender dataset (also known as the NeRF-Synthetic dataset), which is widely used in Novel View Synthesis (NVS) research such as Neural Radiance Fields (NeRF) and 3D Gaussian Splatting.
- 1. Installation
- 2. How to Use
- 3. Render Custom Scenes
- 4. Depth Map and Point Cloud
- 5. Optional Parameters
## 1. Installation

The recommended way to install this project is via uv.
```shell
uv tool install git+https://github.com/Tomoya-Matsubara/thor2blender.git
```

This will install the `thor2blender` command-line tool.
> [!NOTE]
> If you prefer not to install the tool in your environment, you can run it
> through uv without installation. For this, clone the repository and navigate
> into the project directory:
>
> ```shell
> git clone https://github.com/Tomoya-Matsubara/thor2blender.git
> cd thor2blender
> ```

## 2. How to Use

The simplest way to use the tool is to run the following command:
```shell
thor2blender
```

This will generate a Blender (NeRF-Synthetic) style dataset from the first training scene in ProcTHOR-10K.
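The generated dataset follows the NeRF-Synthetic layout, whose core file is a `transforms_*.json` listing per-frame camera poses. As a rough illustration (the field names follow the original Blender/NeRF-Synthetic convention; the values below are made up and are not the exact output of this tool):

```python
import json

# Illustrative sketch of the NeRF-Synthetic (Blender) transforms file.
# Values here are placeholders for demonstration only.
transforms = {
    "camera_angle_x": 0.6911,  # horizontal field of view in radians
    "frames": [
        {
            "file_path": "./train/r_0",  # image path without extension
            "transform_matrix": [        # 4x4 camera-to-world matrix
                [1.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 2.0],
                [0.0, 0.0, 0.0, 1.0],
            ],
        }
    ],
}

print(json.dumps(transforms, indent=2))
```

NeRF and 3D Gaussian Splatting codebases that accept the Blender dataset format read this file to recover the camera intrinsics and extrinsics for each rendered image.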
> [!NOTE]
> If you did not install the `thor2blender` tool, you can run it using uv as
> follows:
>
> ```shell
> uv run thor2blender
> ```

## 3. Render Custom Scenes

In AI2-THOR, scenes are managed by JSON files. You can render custom scenes by specifying the path to your scene JSON file as follows:
```shell
thor2blender scene.source=/path/to/your/scene.json
```

## 4. Depth Map and Point Cloud

Methods like 3D Gaussian Splatting initialize their 3D Gaussians from a point cloud. If no point cloud is provided, the Gaussians are randomly initialized, which may delay convergence.
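Reconstructing a point cloud from a depth map amounts to back-projecting each pixel through a camera model. A minimal NumPy sketch of the idea, assuming a simple pinhole camera (this is an illustration of the principle, not the tool's actual reconstruction code):

```python
import numpy as np

def depth_to_points(depth: np.ndarray, fov_deg: float = 90.0) -> np.ndarray:
    """Back-project a depth map (H, W) into an (H*W, 3) point cloud.

    Assumes a pinhole camera with the principal point at the image
    center; thor2blender's own reconstruction may differ in detail.
    """
    h, w = depth.shape
    f = 0.5 * w / np.tan(0.5 * np.radians(fov_deg))  # focal length in pixels
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0            # principal point
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / f
    y = (v - cy) * depth / f
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# A flat 4x4 depth map 2 m away yields 16 points, all at depth 2.0.
points = depth_to_points(np.full((4, 4), 2.0))
print(points.shape)  # (16, 3)
```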
thor2blender supports depth rendering from AI2-THOR and point cloud reconstruction using the obtained depth maps. You can enable depth rendering and point cloud reconstruction as follows:
```shell
thor2blender rendering.ai2thor.render_depth_image=true
```

## 5. Optional Parameters

A number of optional parameters are available to customize the rendering process. You can find the full list of parameters and their descriptions by running:
```shell
thor2blender --help
```

The configuration is managed by Hydra, allowing you to override any parameter directly from the command line.
