A collaboration-friendly studio for NeRFs
nerfstudio-project/nerfstudio
About
Nerfstudio provides a simple API that allows for a simplified end-to-end process of creating, training, and testing NeRFs. The library supports a more interpretable implementation of NeRFs by modularizing each component. With more modular NeRFs, we hope to create a more user-friendly experience in exploring the technology. Nerfstudio is a contributor-friendly repo with the goal of building a community where users can more easily build upon each other's contributions.
It’s as simple as plug and play with nerfstudio!
On top of our API, we are committed to providing learning resources to help you understand the basics of (if you're just getting started), and keep up-to-date with (if you're a seasoned veteran) all things NeRF. As researchers, we know just how hard it is to get onboarded with this next-gen technology. So we're here to help with tutorials, documentation, and more!
Finally, have feature requests? Want to add your brand-spankin'-new NeRF model? Have a new dataset? We welcome any and all contributions! Please do not hesitate to reach out to the nerfstudio team with any questions via Discord.
We hope nerfstudio enables you to build faster and contribute to our NeRF community.
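To make the modularity idea above concrete, here is a toy sketch of a pipeline built from swappable components. This is purely illustrative and is not the actual nerfstudio API; all names (`TinyPipeline`, `uniform_sampler`, etc.) are hypothetical.

```python
# Toy illustration of a modular NeRF-style pipeline (NOT the nerfstudio API):
# each stage is a swappable component, so e.g. the sampler or the field can
# be replaced without touching the rest.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TinyPipeline:
    sampler: Callable[[int], List[float]]     # ray -> sample positions in [0, 1)
    field: Callable[[float], float]           # position -> density
    renderer: Callable[[List[float]], float]  # densities -> pixel value

    def render_ray(self, n_samples: int) -> float:
        points = self.sampler(n_samples)
        densities = [self.field(p) for p in points]
        return self.renderer(densities)

# Plug-and-play: each component can be swapped independently.
uniform_sampler = lambda n: [i / n for i in range(n)]
toy_field = lambda t: 1.0 - t                 # density falls off along the ray
average_renderer = lambda ds: sum(ds) / len(ds)

pipe = TinyPipeline(uniform_sampler, toy_field, average_renderer)
print(pipe.render_ray(4))  # 0.625
```

Swapping `uniform_sampler` for a stratified or importance sampler would change behavior without touching `field` or `renderer`; that is the kind of decoupling the real library aims for.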
Quickstart
The quickstart will help you get started with the default vanilla NeRF trained on the classic Blender lego scene. For more complex changes (e.g., running with your own data or setting up a new NeRF graph), please refer to our references.
1. Installation: Set up the environment
Create environment
We recommend using conda to manage dependencies. Make sure to install Conda before proceeding.
```bash
conda create --name nerfstudio -y python=3.8
conda activate nerfstudio
python -m pip install --upgrade pip
```
Dependencies
Install PyTorch with CUDA (this repo has been tested with CUDA 11.3) and tiny-cuda-nn:
```bash
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 -f https://download.pytorch.org/whl/torch_stable.html
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
```
Installing nerfstudio
If you want the latest and greatest:
```bash
git clone git@github.com:nerfstudio-project/nerfstudio.git
cd nerfstudio
pip install -e .
```
2. Setting up the data
Download the original NeRF Blender dataset. We support the major datasets and allow users to create their own datasets, described in detail here.
```bash
ns-download-data --dataset=blender
ns-download-data --dataset=nerfstudio --capture=poster
```
2.x Using custom data
If you have custom data in the form of a video or folder of images, we've provided some COLMAP and FFmpeg scripts to help you process your data so it is compatible with nerfstudio.
After installing both tools, you can process your data via:
```bash
ns-process-data --data FOLDER_OR_VIDEO --output-dir {PROCESSED_DATA_DIR}
```
3. Training a model
To run with all the defaults, e.g. the vanilla NeRF method on the Blender lego scene:
```bash
# To see what models are available.
ns-train --help
# To see what model-specific cli arguments are available.
ns-train nerfacto --help
# Run with nerfacto model.
ns-train nerfacto
# We provide support for other models. E.g. to run instant-ngp.
ns-train instant-ngp
# To train on your custom data.
ns-train nerfacto --data {PROCESSED_DATA_DIR}
```
3.x Training a model with the viewer
You can visualize training in real-time using our web-based viewer.
Make sure to forward a port for the websocket to localhost. The default port is 7007, which you should expose to localhost:7007.
```bash
# with the default port
ns-train nerfacto --vis viewer
# with a specified websocket port
ns-train nerfacto --vis viewer --viewer.websocket-port=7008
# port forward if running on remote
ssh -L localhost:7008:localhost:7008 {REMOTE HOST}
```
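If the viewer page doesn't load, a quick sanity check is to confirm the websocket port is actually reachable on localhost. This is a generic stdlib Python snippet (`port_open` is a hypothetical helper, not part of nerfstudio):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check whether the viewer websocket (default port 7007) is reachable.
print(port_open("localhost", 7007))
```

If this prints `False` on the machine where your browser runs, the port forward (or the training process) is likely not up yet.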
For more details on how to interact with the visualizer, please visit our viewer walk-through.
4. Rendering a trajectory during inference
After your model has trained, you can headlessly render out a video of the scene with a pre-defined trajectory.
```bash
# assuming previously ran `ns-train nerfacto`
ns-render --load-config=outputs/data-nerfstudio-poster/nerfacto/{TIMESTAMP}/config.yml --traj=spiral --output-path=output.mp4
```
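For intuition, a spiral trajectory like the one `--traj=spiral` selects is just a sequence of camera positions orbiting the scene while drifting vertically. The sketch below is purely illustrative math (nerfstudio computes its own trajectory internally; `spiral_positions` and its parameters are hypothetical):

```python
import math

def spiral_positions(n_frames: int = 60, radius: float = 3.0,
                     height: float = 0.5, turns: float = 2.0):
    """Toy spiral of camera positions around the z-axis.

    Illustrative only -- not how nerfstudio generates its spiral.
    """
    positions = []
    for i in range(n_frames):
        t = i / max(n_frames - 1, 1)          # normalized progress in [0, 1]
        angle = 2.0 * math.pi * turns * t     # orbit around the scene
        positions.append((radius * math.cos(angle),
                          radius * math.sin(angle),
                          height * math.sin(2.0 * math.pi * t)))  # vertical drift
    return positions

cams = spiral_positions()
print(len(cams))  # 60 camera positions
```

Each position would then be paired with a look-at rotation toward the scene center to form a full camera pose.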
Learn More
And that's it for getting started with the basics of nerfstudio.
If you're interested in learning more on how to create your own pipelines, develop with the viewer, run benchmarks, and more, please check out some of the quick links below or visit our documentation directly.