Generating Partial Maps #9
Hi Quest2GM,
As far as I can remember, the warning messages are expected after each exploration. About the dataset size, I think it is going to be large, but I don't remember the actual size.
Best,
I was referring to the error "Tried to advertise a service already advertised in the node", not the warning messages. Let me know whether this is expected. Also, you had to run the hector exploration simulator through the entire dataset 50 times, correct? The reason I ask is that it seems like this would create a lot of duplicates in the training set. Perhaps I'm not understanding it correctly, but I'm just trying to make sense of that part of the paper.
Thanks again, and your support is very much appreciated.
Siddarth
Hi Siddarth,
The "Tried to advertise a service already advertised in the node" error message is also not a problem as long as the next exploration runs fine after the previous one ends. Running the simulation multiple times on one map will not create duplicate data if the start positions of the robot are different (which should be the case; you can double-check that).
Best,
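As an illustration of why different start positions avoid duplicates, here is a minimal sketch of sampling a random start pose from the free cells of an occupancy grid. The `sample_start_pose` helper and its cell-value conventions are hypothetical, not part of this repo:

```python
import numpy as np

def sample_start_pose(occupancy_grid, resolution, origin, rng=None):
    """Sample a random start pose from the free cells of an occupancy grid.

    occupancy_grid: 2D array with 0 = free, 100 = occupied, -1 = unknown.
    resolution: meters per cell. origin: world (x, y) of cell (0, 0).
    """
    rng = rng or np.random.default_rng()
    free_cells = np.argwhere(occupancy_grid == 0)
    row, col = free_cells[rng.integers(len(free_cells))]
    # Grid indices -> world coordinates (cell centers), plus a random heading.
    x = origin[0] + (col + 0.5) * resolution
    y = origin[1] + (row + 0.5) * resolution
    theta = rng.uniform(-np.pi, np.pi)
    return x, y, theta
```

A run seeded with a different pose explores the map in a different order, so the recorded partial maps differ between runs even on the same floorplan.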
I was wondering if you remember the number of epochs, the learning rate, and the weight decay you used to get the training to converge. These are not mentioned in the paper. I am having a little trouble converging myself, as the training and test loss curves are very noisy. Thanks again.
Regards,
I don't remember the numbers, but they should be the same as or similar to the ones in the current training script: kth_partial_dataset_train.py
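For reference, this is roughly what such settings look like in a PyTorch training loop. The hyperparameter values below are illustrative placeholders, not the actual contents of kth_partial_dataset_train.py, and the linear model and random data are toy stand-ins so the sketch runs. Smoothing the logged loss with an exponential moving average can also make noisy curves easier to judge:

```python
import torch
from torch import nn

# Illustrative hyperparameters only -- consult kth_partial_dataset_train.py
# for the values actually used in this repo.
EPOCHS = 5
LEARNING_RATE = 1e-4
WEIGHT_DECAY = 1e-5

# Toy stand-ins so the sketch is runnable: a linear model on random data.
model = nn.Linear(16, 1)
data = [(torch.randn(8, 16), torch.randn(8, 1)) for _ in range(100)]

optimizer = torch.optim.Adam(model.parameters(),
                             lr=LEARNING_RATE, weight_decay=WEIGHT_DECAY)
criterion = nn.MSELoss()

ema_loss, beta = None, 0.98  # exponential moving average of the loss
for epoch in range(EPOCHS):
    for inputs, targets in data:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
        # The smoothed curve is much easier to read than raw batch losses.
        ema_loss = loss.item() if ema_loss is None else (
            beta * ema_loss + (1 - beta) * loss.item())
    print(f"epoch {epoch}: smoothed loss {ema_loss:.4f}")
```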
Hi @rakeshshrestha31,
After running the stage simulator on the KTH floorplans (~150 unique floorplans) once, I was able to produce a dataset of ~100,000 total partial maps (determined through https://github.com/rakeshshrestha31/online_map_completion/blob/fbc6b06de3a991329f7aa82680e5fbaa058622ac/training_scripts/kth_partial_dataset_train.py#L154). This takes a whopping 18 GB of storage on my computer. However, according to the paper, you had 1.5 million partial maps. Would that mean I need to run the simulator through the dataset about 14 more times, for 15 passes in total (I can't imagine how much storage that would take; see the estimate sketched below)? Note that running the simulator through the dataset once takes about a day, so I just wanted to confirm that I am doing things right before proceeding.
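As a rough sanity check on the storage question, simple arithmetic from the figures above, assuming storage scales linearly with the number of simulator passes:

```python
maps_per_pass = 100_000      # partial maps produced by one pass over the floorplans
storage_per_pass_gb = 18     # observed storage for that one pass
target_maps = 1_500_000      # dataset size reported in the paper

passes_total = target_maps / maps_per_pass             # 15 passes in total
storage_total_gb = passes_total * storage_per_pass_gb  # ~270 GB
print(f"{passes_total:.0f} passes, ~{storage_total_gb:.0f} GB")
```

So, under these assumptions, 15 passes in total (14 more after the first) and on the order of 270 GB of raw data.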
Also, I am getting this error when running the simulation. It happens at the end of each exploration, and I'm not sure if it's expected or not:

"Tried to advertise a service already advertised in the node"
Thank you, and any help is much appreciated,
Quest2GM