We optimize all implementations for 600,000 iterations with a batch size of 1024 on four A100 GPUs; under this setup Ha-NeRF and NeRF-W take about 20 and 18 hours, respectively.
Both models have the same inference time once the appearance vector is given.
Ha-NeRF takes only a few milliseconds to obtain the appearance vector from its CNN-based module, whereas NeRF-W needs minutes of test-time optimization to recover the vector for each image (see the sketch below).
Both methods require around 20 GB of GPU memory for training and about 5 GB for inference.
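For concreteness, here is a minimal PyTorch sketch of that inference-time difference. The names (`encoder`, `render_fn`, `appearance_dim`, `get_appearance_*`) are placeholders for illustration, not the actual identifiers in this repository, and the NeRF-W loop is only a schematic of per-image embedding optimization.

```python
import torch

# Ha-NeRF style: a single feed-forward pass through the CNN-based appearance
# module, so obtaining the vector takes milliseconds.
def get_appearance_hanerf(encoder: torch.nn.Module, image: torch.Tensor) -> torch.Tensor:
    with torch.no_grad():
        return encoder(image.unsqueeze(0))  # (1, appearance_dim)

# NeRF-W style: optimize a per-image appearance embedding against the target
# pixels by gradient descent, which takes minutes per test image.
def get_appearance_nerfw(render_fn, rays, target_rgb,
                         appearance_dim=48, steps=1000, lr=1e-2) -> torch.Tensor:
    embedding = torch.zeros(1, appearance_dim, requires_grad=True)
    optimizer = torch.optim.Adam([embedding], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        rgb_pred = render_fn(rays, embedding)          # render with current embedding
        loss = torch.nn.functional.mse_loss(rgb_pred, target_rgb)
        loss.backward()
        optimizer.step()
    return embedding.detach()
```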
Hello, how long does it take you to train a model (such as brandenburg_gate), and what hardware configuration do you use for training?