
Visual Odometry for Duckiebot

Installation

Data collection

  • Run camera_node on your duckiebot (1st terminal)
    $ docker -H hostname.local run -it --net host --privileged --name base -v /data:/data duckietown/rpi-duckiebot-base:master18 /bin/bash
    $ roslaunch duckietown camera.launch veh:="hostname" raw:="false" 
  • Check camera (2nd terminal)
    $ cd project_VO_ws && source devel/setup.bash
    $ export ROS_MASTER_URI=http://hostname.local:11311/
    $ rqt 
  • Run the joystick container (2nd terminal, or using Portainer)
    $ docker -H hostname.local run -dit --privileged --name joystick --network=host -v /data:/data duckietown/rpi-duckiebot-joystick-demo:master18 
  • Run Vicon on your desktop (2nd terminal): the Vicon object name is 'duckiebot_hostname'
    $ cd project_VO_ws && source devel/setup.bash
    $ export ROS_MASTER_URI=http://hostname.local:11311/
    $ roslaunch ros_vrpn_client mrasl_vicon_duckiebot.launch object_name:=duckiebot_hostname 
  • Make your Duckiebot move (with the joystick) and record data on your desktop (3rd terminal); a quick check of the recorded bag is sketched after this list
    $ export ROS_MASTER_URI=http://hostname.local:11311/      
    $ rosbag record /hostname/camera_node/camera_info /hostname/camera_node/image/compressed /duckiebot_hostname/vrpn_client/estimated_odometry 

An example of this bag file: razor_3.bag
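
Before decoding, you can sanity-check the recording by listing its topics and message counts with the rosbag Python API. This is a minimal sketch; the bag and topic names are the ones used above, with "hostname" standing in for your robot's name.

    #!/usr/bin/env python
    # Minimal sketch: list topics and message counts in a recorded bag.
    import rosbag

    bag = rosbag.Bag('razor_3.bag')
    for topic, meta in bag.get_type_and_topic_info().topics.items():
        print('%s: %d msgs of type %s' % (topic, meta.message_count, meta.msg_type))
    bag.close()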

Decoder and Synchronization (on your desktop)

NOTE: by default, decoder_node runs on the Duckiebot at a very low frequency (2Hz) due to its limited computation. To get more images for deep learning, we run this node on a local desktop instead.
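
For reference, the core of such a decoder is just decompressing sensor_msgs/CompressedImage into sensor_msgs/Image with cv_bridge. The sketch below is illustrative only (node and topic names are placeholders); the actual decoder_node is configured through its launch file and parameter file.

    #!/usr/bin/env python
    # Illustrative sketch of a decoder: CompressedImage -> Image republisher.
    import rospy
    from cv_bridge import CvBridge
    from sensor_msgs.msg import CompressedImage, Image

    bridge = CvBridge()

    def callback(msg):
        # Decompress the JPEG payload and republish it as a raw image,
        # keeping the original timestamp for later synchronization.
        cv_img = bridge.compressed_imgmsg_to_cv2(msg, desired_encoding='bgr8')
        out = bridge.cv2_to_imgmsg(cv_img, encoding='bgr8')
        out.header = msg.header
        pub.publish(out)

    rospy.init_node('decoder_sketch')
    pub = rospy.Publisher('camera_node/image/raw', Image, queue_size=1)
    rospy.Subscriber('camera_node/image/compressed', CompressedImage, callback, queue_size=1)
    rospy.spin()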

  • Run roscore (1st terminal)

  • Run decoder_node at 10Hz (maximum 30Hz) on your desktop (2nd & 3rd terminals)

    $ rosbag play bag_file.bag --topics /hostname/camera_node/image/compressed /duckiebot_hostname/vrpn_client/estimated_odometry
    $ cd project_VO_ws && source devel/setup.bash
    $ roslaunch vo_duckiebot decoder_node.launch veh:="hostname" param_file_name:="decoder_10Hz" 
  • Run synchronization_node (3rd terminal): synchronizes image/raw with the Vicon data (a sketch of this step is given after this list)

    $ cd project_VO_ws && source devel/setup.bash
    $ roslaunch vo_duckiebot data_syn.launch veh:="hostname" veh_vicon:="duckiebot_hostname" 
  • Record new data (4th terminal)

    $ rosbag record /hostname/camera_node/image/raw /hostname/vicon_republish/pose 
  • Verify the camera info and check that image/raw is published at 10Hz

    $ rostopic echo /hostname/camera_node/camera_info
    $ rostopic hz /hostname/camera_node/image/raw 

    Even though we run this node at 10Hz, this topic is published at only about 8Hz.

An example of the new bag file: razor_3_syn.bag
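
For reference, the synchronization can be expressed with message_filters as below. This is a sketch under assumptions: the Vicon topic is taken to be nav_msgs/Odometry and the republished topic geometry_msgs/PoseStamped; check the actual message types with rostopic info before reusing it.

    #!/usr/bin/env python
    # Sketch of image/Vicon synchronization (assumed message types, see note above).
    import rospy
    import message_filters
    from sensor_msgs.msg import Image
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import PoseStamped

    def callback(img_msg, odom_msg):
        # Restamp the Vicon pose with the image timestamp so the two streams line up.
        pose = PoseStamped()
        pose.header = img_msg.header
        pose.pose = odom_msg.pose.pose
        pose_pub.publish(pose)

    rospy.init_node('data_syn_sketch')
    pose_pub = rospy.Publisher('vicon_republish/pose', PoseStamped, queue_size=1)
    img_sub = message_filters.Subscriber('camera_node/image/raw', Image)
    vicon_sub = message_filters.Subscriber('/duckiebot_hostname/vrpn_client/estimated_odometry', Odometry)
    sync = message_filters.ApproximateTimeSynchronizer([img_sub, vicon_sub], queue_size=10, slop=0.05)
    sync.registerCallback(callback)
    rospy.spin()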

Ground projection: to do

  • cannot run ground_projection locally
  • when run on the Duckiebot, the segment topic is not published (00-infrastructure/duckietown_msgs/msg/Segment.msg)
  • to be run in Duckietown (A222)

Data export

  • Export a txt file from the bag using MATLAB: run script_to_run.m with your new bag file

  • Export png images from image/raw: create a new folder, e.g. images_10Hz (a sketch of the export script is given after this list)

    $ ./bag2img.py bag_file_syn.bag images_10Hz/ /hostname/camera_node/image/raw 

    An example of the text file and png images: Duckiebot
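
If the export needs to be adapted, the core of a bag-to-png script in the spirit of bag2img.py is just a loop over the image messages in the bag. The sketch below is illustrative; the real bag2img.py may handle its arguments and file naming differently.

    #!/usr/bin/env python
    # Illustrative bag-to-png export, e.g.:
    #   ./bag2img_sketch.py bag_file_syn.bag images_10Hz/ /hostname/camera_node/image/raw
    import os
    import sys
    import cv2
    import rosbag
    from cv_bridge import CvBridge

    bag_path, out_dir, topic = sys.argv[1], sys.argv[2], sys.argv[3]
    bridge = CvBridge()

    with rosbag.Bag(bag_path) as bag:
        for _, msg, t in bag.read_messages(topics=[topic]):
            img = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
            # Name each frame by its ROS timestamp so it can be matched with the exported poses.
            cv2.imwrite(os.path.join(out_dir, '%d.%09d.png' % (t.secs, t.nsecs)), img)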

VISO2: TO DO

  • Offline
  • Online
  • Ground projection => relative pose

DEEP LEARNING 1

DEEP LEARNING n

TO DO: presentation, new video with camera calibration, VISO2, other direct methods, ground projection, deep learning
