diff --git a/README.md b/README.md index 963f8128..c7a930a3 100644 --- a/README.md +++ b/README.md @@ -12,7 +12,7 @@ Our docs use extended markdown as implemented by MkDocs. * create a python3 environment `python -m venv env` * activate the python environment `source env/bin/activate` * install MkDocs `pip install mkdocs` -* install MkDocs redirect `pin install mkdocs-redirects` +* install MkDocs redirect `pip install mkdocs-redirects` * `mkdocs serve` starts a local webserver at localhost:8000. This is a live server; it will be updated when you save any of the .md files in the docs folder. So you should be running this as you make changes so you can see their effects. * `mkdocs build` Builds a static site in `./site` directory * config docs by editing `./mkdocs.yml` diff --git a/docs/assets/Hokuyo_Lidar_Wiring.png b/docs/assets/Hokuyo_Lidar_Wiring.png new file mode 100644 index 00000000..ef8c0f82 Binary files /dev/null and b/docs/assets/Hokuyo_Lidar_Wiring.png differ diff --git a/docs/parts/lidar.md b/docs/parts/lidar.md index 098c9008..252dd1c2 100644 --- a/docs/parts/lidar.md +++ b/docs/parts/lidar.md @@ -1,22 +1,22 @@ # Lidar -A Lidar sensor can be used with Donkeycar to provide obstacle avoidance or to help navigate on tracks with walls. It records data along with the camera during training and this can be used for training +A Lidar sensor can be used with Donkeycar to provide obstacle avoidance or to help navigate on tracks with walls. It records data along with the camera during training. However, there are no deep learning autopilots which currently use recorded lidar information. -NOTE: Lidar is currently only supported in the Dev branch. To use it, after you git clone donkeycar, do a `git checkout dev` ![Donkey lidar](../assets/lidar.jpg) ## Supported Lidars -We currently only support the RPLidar series of sensors, but will be adding support for the similar YDLidar series soon. 
+We currently only support the RPLidar and Hokuyo LX series of sensors, but will be adding support for the YDLidar series soon. 
We recommend the [$99 A1M8](https://amzn.to/3vCabyN) (12m range)
+## RPLidar Setup
-## Hardware Setup
+### Hardware Setup
Mount the Lidar underneath the camera canopy as shown above (the RPLidar A2M8 is used there, but the A1M8 mounting is the same). You can velcro the USB adapter under the Donkey plate and use a short USB cable to connect to one of your RPi or Nano USB ports. It can be powered by the USB port so there's no need for an additional power supply.
-## Software Setup
+### Software Setup
Lidar requires the glob library to be installed. If you don't already have that, install it with `pip3 install glob2`
@@ -34,6 +34,25 @@ LIDAR_UPPER_LIMIT = 270 ``` ![Lidar limits](../assets/lidar_angle.png)
+## Hokuyo Setup
+
+### Hardware Setup
+
+![Hokuyo lidar](../assets/Hokuyo_Lidar_Wiring.png)
+
+A Hokuyo Lidar must be mounted from the bottom using M3 screws. The bottom of the lidar generates a significant amount of heat, so it is recommended to either raise it on standoffs or fit it with a [heatsink](https://racecarj.com/products/aluminum-heat-sink-for-hokuyo-ust-10lx).
+
+LX-series lidars (e.g. the Hokuyo UST-10LX, used in F1Tenth cars) accept a voltage range of 10-30 V, with 12 V or 24 V recommended. The most straightforward way to power the lidar is to cut off the JST connector at the end of the power cables and solder them (see the [datasheet](https://autonomoustuff.com/-/media/Images/Hexagon/Hexagon%20Core/autonomousstuff/pdf/hokuyo-ust-10lx-datasheet.ashx?la=en&hash=95B57270899F50608C18BF48CC1AD043)) directly to a boost converter connected to a battery. Use a multimeter to verify the boost converter's output voltage before wiring it up, so you don't fry this very expensive piece of hardware.
+
+The Ethernet cable should also be connected to your central computer (Pi, Jetson, etc.). 
+
+### Software Setup
+You will need to assign an IP address to the ethernet interface on your Raspberry Pi (or Jetson, etc.). Refer to the [f1tenth docs](https://f1tenth.readthedocs.io/en/stable/getting_started/firmware/firmware_hokuyo10.html) for how to do this on Ubuntu. On Raspbian, you can create a new file in `/etc/network/interfaces.d/` called `eth0` (or the name of your ethernet interface) with a static-IP stanza that puts the interface on the same subnet as the lidar. Once this is done, you can verify the setup by pinging the lidar at its default address, `192.168.0.10`.
+
+To get the Hokuyo working with Donkey, install the Hokuyo python package: `pip install git+https://github.com/mgagvani/hokuyolx.git`. From there, set the Lidar type in the Donkey configuration as detailed above.
+
+If you want to visualize the Lidar, you can create a "virtual camera" that renders the lidar measurements as an image. To do this, set the camera type in `myconfig.py` to `LIDAR_PLOT`.
+
 ## Template support
 
 Neither the [deep learning template](/guide/train_autopilot/#deep-earning-autopilot) nor the [path follow template](/guide/path_follow/path_follow/) supports Lidar data directly. There is an issue to [add Lidar data to the deep learning template](https://github.com/autorope/donkeycar/issues/910). Lidar would also be very useful in the [path follow template](/guide/path_follow/path_follow/) for obstacle detection and avoidance. If you are interested in working on such projects, please [join the discord community](https://www.donkeycar.com/community.html) and let us know; we will be happy to provide you with support.
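The static-IP step for Raspbian described in the Hokuyo software setup could be sketched as an ifupdown stanza like the following (the host address `192.168.0.15` and the interface name `eth0` are illustrative assumptions; the lidar's default `192.168.0.10` is from the text — pick any free host address on that subnet):

```text
# /etc/network/interfaces.d/eth0  (Raspbian/Debian ifupdown syntax)
auto eth0
iface eth0 inet static
    address 192.168.0.15
    netmask 255.255.255.0

# then bring the interface up and verify:
#   sudo ifup eth0
#   ping 192.168.0.10
```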
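The `LIDAR_PLOT` virtual camera mentioned above renders the scan as an image. The underlying conversion from polar range readings to Cartesian points can be sketched as follows (the function name, the 270-degree angle span, and the millimetre units are illustrative assumptions matching the UST-10LX's field of view, not Donkeycar's actual implementation):

```python
import numpy as np

def lidar_to_points(distances_mm, start_deg=-135.0, end_deg=135.0):
    """Convert a 1-D array of lidar range readings (mm) to x/y points (m).

    The default -135..+135 degree span matches a 270-degree field of view,
    as on the Hokuyo UST-10LX; check your sensor's spec sheet.
    """
    distances_m = np.asarray(distances_mm, dtype=float) / 1000.0
    # Evenly spread the beam angles across the field of view.
    angles = np.radians(np.linspace(start_deg, end_deg, distances_m.size))
    x = distances_m * np.cos(angles)
    y = distances_m * np.sin(angles)
    return np.column_stack((x, y))

# Example: three beams, all reading 1000 mm; the middle beam points
# straight ahead, so it maps to (1.0, 0.0).
pts = lidar_to_points([1000, 1000, 1000])
```

Plotting these points with matplotlib (or any 2-D plotting library) and grabbing the figure as an image is essentially what a lidar "virtual camera" does.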