- Pre-install: git, catkin, ROS
- Workspace
$ mkdir -p project_VO_ws/src
$ cd project_VO_ws
$ catkin init
- This repo: our work
$ cd project_VO_ws/src
$ git clone https://github.com/TienPoly/VO_duckiebot.git
- Direct-method VO: viso2 package
$ cd project_VO_ws/src
$ git clone https://github.com/TienPoly/viso2.git
- Ground projection: used to get the relative pose for viso2: https://github.com/duckietown/Software/tree/master18/catkin_ws/src/10-lane-control/ground_projection
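The idea behind ground projection can be sketched as follows: a per-robot calibrated 3x3 homography H maps an image pixel (u, v) to a point on the road plane. The function and matrices below are an illustrative plain-Python sketch, not the actual Duckietown calibration or API:

```python
def project_to_ground(H, u, v):
    """Map image pixel (u, v) to a road-plane point via a 3x3 homography H.

    H is a plain 3x3 nested list; in practice it comes from the robot's
    extrinsic calibration (this sketch just shows the projective divide).
    """
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)
```

With the identity homography a pixel maps to itself; a real calibration matrix instead lands the point in metric ground coordinates.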
- Deep learning: to do!
- Vicon (optional)
  - Vicon motion capture system overview
  - ROS interface (configured at MRASL of Polytechnique Montreal) from: https://github.com/MRASL/ros_vrpn_client.git
  - Dependencies
    - vrpn_catkin package from: https://github.com/ethz-asl/vrpn_catkin
    - catkin_simple package from: https://github.com/catkin/catkin_simple.git
    - glog_catkin package from: https://github.com/ethz-asl/glog_catkin.git
- Build
$ cd project_VO_ws
$ catkin build
- Run camera_node on your duckiebot (1st terminal)
$ docker -H hostname.local run -it --net host --privileged --name base -v /data:/data duckietown/rpi-duckiebot-base:master18 /bin/bash
$ roslaunch duckietown camera.launch veh:="hostname" raw:="false"
- Check camera (2nd terminal)
$ cd project_VO_ws && source devel/setup.bash
$ export ROS_MASTER_URI=http://hostname.local:11311/
$ rqt
- Run joystick container (2nd terminal or using portainer)
$ docker -H hostname.local run -dit --privileged --name joystick --network=host -v /data:/data duckietown/rpi-duckiebot-joystick-demo:master18
- Run Vicon on your desktop (2nd terminal): the Vicon object name is 'duckiebot_hostname'
$ cd project_VO_ws && source devel/setup.bash
$ export ROS_MASTER_URI=http://hostname.local:11311/
$ roslaunch ros_vrpn_client mrasl_vicon_duckiebot.launch object_name:=duckiebot_hostname
- Make your Duckiebot move and record data on your desktop (3rd terminal)
$ export ROS_MASTER_URI=http://hostname.local:11311/
$ rosbag record /hostname/camera_node/camera_info /hostname/camera_node/image/compressed /duckiebot_hostname/vrpn_client/estimated_odometry
An example of this bag file: razor_3.bag
NOTE: by default, decoder_node runs on the Duckiebot at a very low frequency (2Hz) because of its limited computing power. To get more images for deep learning, we run this node on a local desktop instead.
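Running the decoder at a reduced rate essentially means dropping frames that arrive too soon after the last kept one. A minimal plain-Python sketch of that throttling logic (no ROS; names are illustrative, not the actual decoder_node implementation):

```python
def throttle(timestamps, target_hz):
    """Keep only timestamps at least 1/target_hz seconds apart; drop the rest.

    Mimics rate-limiting an incoming stream: a frame is republished only if
    enough time has elapsed since the previously republished frame.
    """
    period = 1.0 / target_hz
    kept, last = [], None
    for t in timestamps:
        if last is None or t - last >= period:
            kept.append(t)
            last = t
    return kept
```

For example, throttling a stream of ticks at 1Hz down to 0.5Hz keeps every second tick.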
- Run roscore (1st terminal)
$ roscore
- Run decoder_node at 10Hz (maximum 30Hz) on your desktop (2nd & 3rd terminals)
$ rosbag play bag_file.bag --topics /hostname/camera_node/image/compressed /duckiebot_hostname/vrpn_client/estimated_odometry
$ cd project_VO_ws && source devel/setup.bash
$ roslaunch vo_duckiebot decoder_node.launch veh:="hostname" param_file_name:="decoder_10Hz"
- Run synchronization_node (3rd terminal): synchronizes image/raw with the Vicon data
$ cd project_VO_ws && source devel/setup.bash
$ roslaunch vo_duckiebot data_syn.launch veh:="hostname" veh_vicon:="duckiebot_hostname"
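Synchronizing the two streams amounts to pairing each image timestamp with the Vicon sample closest in time, within some tolerance. A plain-Python sketch of that nearest-timestamp matching (the names and tolerance scheme are illustrative assumptions; the actual node may differ):

```python
def match_nearest(image_stamps, vicon_stamps, tol):
    """Pair each image timestamp with the nearest Vicon timestamp.

    A pair is kept only if the two stamps differ by at most tol seconds;
    images with no Vicon sample close enough are dropped.
    """
    pairs = []
    for ts in image_stamps:
        best = min(vicon_stamps, key=lambda tv: abs(tv - ts))
        if abs(best - ts) <= tol:
            pairs.append((ts, best))
    return pairs
```

With a ~10Hz image stream and a much faster Vicon stream, a tolerance of a few tens of milliseconds is the kind of value one would try first.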
- Record new data (4th terminal)
$ rosbag record /hostname/camera_node/image/raw /hostname/vicon_republish/pose
- Verify the camera info and check that image/raw is published at 10Hz
$ rostopic echo /hostname/camera_node/camera_info
$ rostopic hz /hostname/camera_node/image/raw
Even though we run this node at 10Hz, the topic is actually published at only about 8Hz!
An example of the new bag file: razor_3_syn.bag
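The rate that rostopic hz reports is essentially the inverse of the mean gap between consecutive message arrival times, so the ~8Hz observation can be reproduced offline from recorded timestamps. A plain-Python sketch (illustrative, not the rostopic implementation):

```python
def measured_hz(stamps):
    """Average publish rate (Hz) from a sorted list of arrival times in seconds."""
    if len(stamps) < 2:
        return 0.0
    gaps = [b - a for a, b in zip(stamps, stamps[1:])]
    return len(gaps) / sum(gaps)
```

For instance, messages spaced 0.125 s apart measure as 8Hz even if the node was configured for 10Hz.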
- ground_projection cannot be run locally
- when run on the Duckiebot, the segment topic is not published (00-infrastructure/duckietown_msgs/msg/Segment.msg)
- to be run at Duckietown (A222)
- txt file from the bag using MATLAB: run script_to_run.m with your new bag file
- png images from image/raw: create a new folder, e.g. images_10Hz
$ ./bag2img.py bag_file_syn.bag images_10Hz/ /hostname/camera_node/image/raw
An example of the text file and png images: Duckiebot
- Offline
- Online
- Ground projection => relative pose
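For the last step, the relative pose between two planar poses (ground-projected or from Vicon) is obtained by expressing the second pose in the frame of the first. A minimal SE(2) sketch in plain Python (the (x, y, theta) convention is an assumption, not taken from the repo):

```python
import math

def relative_pose(p1, p2):
    """Express pose p2 = (x, y, theta) in the body frame of pose p1.

    The world-frame displacement is rotated by -theta1, and headings
    are subtracted; this is the standard SE(2) relative transform.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    c, s = math.cos(p1[2]), math.sin(p1[2])
    return (c * dx + s * dy, -s * dx + c * dy, p2[2] - p1[2])
```

For example, a robot at (1, 0) facing +y (theta = pi/2) that ends up at (1, 1) with the same heading has moved 1 unit straight ahead in its own frame.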