Baxter simulator API
This page serves as a lookup reference for all the hardware and functionality on the Baxter Research SDK robot's simulator. The main interface of the Baxter RSDK is via ROS Topics and Services, which you will find listed and described below along with other core information needed to interface with Baxter.
Contents: Robot | Movement | Sensors+ | I/O
For the `baxter_interface` Python classes (built on top of the ROS API), please see the Code API Reference page at: http://api.rethinkrobotics.com
Be sure that you 'Enable' the robot before attempting to control any of the joints. The easiest method for enabling the robot is to use the `enable_robot.py` ROS executable, e.g. `rosrun baxter_tools enable_robot.py -e`.
A useful configuration for Baxter is the 'shipping pose' (the pose of Baxter's arms upon arrival). This configuration describes Baxter's arms tucked into a compact form for ease of movement through doorways, etc. The easiest method for tucking/untucking the arms to/from the shipping pose is the example program 'tuck_arms.py' ROS executable.
Baxter has 7 joints (DoF) in each of its two arms and two more joints in its head (side-to-side panning, and binary, up-down nodding). The control for the head is done separately from the arms; however, you can read the current joint states (position, velocity, and effort) for all the joints on both arms and head by subscribing to one topic:
`/robot/joint_states` (`sensor_msgs/JointState`)
where the position of each joint is given in radians (rad), velocity in radians per second (rad/s), and effort in Newton-meters (Nm).
The following sections cover the individual joint sensing and control in more detail:
- [Arm Joints](#arm-joints)
- [Head Joints](#head-joints)
Component IDs: `left_e0`, `left_e1`, `left_s0`, `left_s1`, `left_w0`, `left_w1`, `left_w2`, `right_e0`, `right_e1`, `right_s0`, `right_s1`, `right_w0`, `right_w1`, `right_w2`
`/robot/joint_states` (`sensor_msgs/JointState`)
- `name[i]`: '<component_id>' of the i-th joint in the message value arrays.
- `position[i]`: position of joint i, in rad.
- `velocity[i]`: velocity of joint i, in rad/s.
- `effort[i]`: effort applied at joint i, in Nm.
Joint states are published by the `robot_state_publisher` and are updated with information from the sensors every cycle.
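As a minimal illustration (assuming a running simulator and a sourced ROS environment), these index-aligned arrays can be read with a plain rospy subscriber:

```python
#!/usr/bin/env python
# Minimal sketch: print the state of one arm joint from /robot/joint_states.
import rospy
from sensor_msgs.msg import JointState

def callback(msg):
    # name[i], position[i], velocity[i] and effort[i] are index-aligned arrays.
    if 'left_e0' in msg.name:
        i = msg.name.index('left_e0')
        rospy.loginfo("left_e0: pos=%.3f rad vel=%.3f rad/s effort=%.3f Nm",
                      msg.position[i], msg.velocity[i], msg.effort[i])

if __name__ == '__main__':
    rospy.init_node('joint_state_listener')
    rospy.Subscriber('/robot/joint_states', JointState, callback)
    rospy.spin()
```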
Set Joint State Publishing Rate
`/robot/joint_state_publish_rate` (`std_msgs/UInt16`)
- The rate at which the joints are published can be controlled by publishing a frequency on this topic.
- Default rate is 100Hz; Maximum is 1000Hz.
- Note: In the current version of the SDK (v0.6.0), there is a known bug where high rates will cause many topics to lag and become unresponsive.
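For example, a minimal sketch of lowering the publish rate to 50 Hz (topic and message type as listed above):

```python
#!/usr/bin/env python
# Sketch: request a 50 Hz joint state publish rate.
import rospy
from std_msgs.msg import UInt16

rospy.init_node('set_joint_state_rate')
pub = rospy.Publisher('/robot/joint_state_publish_rate', UInt16, queue_size=10)
rospy.sleep(1.0)          # give the publisher time to connect
pub.publish(UInt16(50))   # rate in Hz (default 100, maximum 1000)
```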
There are currently three modes for controlling the arms: Position Mode, Velocity Mode, and Torque Mode.
Alternatively, a joint trajectory action server has been created in support of timestamped joint position trajectories using the ROS standard joint trajectory action.
For more information on joint control, see the following movement example programs.
Each arm can be in independent control modes by publishing the desired control mode to the topic for the appropriate arm.
The first is Position Mode, in which users publish target joint angles for given joints and the internal controller drives the arm to the published angles.
Warning: Advanced Control Mode.
In Velocity Control Mode, users publish velocities for given joints and the joints will move at the specified velocity.
Warning: Advanced Control Mode.
In Torque Control Mode, users publish torques for given joints and the joints will apply the specified torques.
Switching Modes
`/robot/limb/<side>/joint_command` ([baxter_core_msgs/JointCommand][baxter_core_msgs-JointCommand])
- Mode is set implicitly by specifying the mode in the command message. Publish a `JointCommand` message to the `joint_command` topic for a given arm to set the arm into the desired control mode.
- Constants for each mode are defined in the `JointCommand` message type.
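A minimal sketch of commanding the left arm in Position Mode follows (the target joint angles below are arbitrary illustration values):

```python
#!/usr/bin/env python
# Sketch: stream a position-mode JointCommand to the left arm.
import rospy
from baxter_core_msgs.msg import JointCommand

rospy.init_node('position_mode_example')
pub = rospy.Publisher('/robot/limb/left/joint_command', JointCommand, queue_size=10)

cmd = JointCommand()
cmd.mode = JointCommand.POSITION_MODE          # mode constants are defined in the message
cmd.names = ['left_s0', 'left_s1', 'left_e0', 'left_e1',
             'left_w0', 'left_w1', 'left_w2']
cmd.command = [0.0, -0.55, 0.0, 0.75, 0.0, 1.26, 0.0]   # target angles in radians

rate = rospy.Rate(100)                         # commands are streamed continuously
while not rospy.is_shutdown():
    pub.publish(cmd)
    rate.sleep()
```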
The Joint Trajectory Action provides a ROS action interface for tracking trajectory execution.
The joint trajectory action server provides an action interface for execution of trajectories requested by the client, known as a Goal request. The joint trajectory action server then executes the requested trajectory and communicates the Result response. The actionlib package (an action server/client interface) differs from ROS services (a simple request/response interface) in that actions allow mid-execution cancellation; they can also provide feedback during execution as to the progress of the Goal request.
Joint Trajectory Action Server
/robot/limb/<limb>/follow_joint_trajectory/cancel
/robot/limb/<limb>/follow_joint_trajectory/feedback
/robot/limb/<limb>/follow_joint_trajectory/goal
/robot/limb/<limb>/follow_joint_trajectory/result
/robot/limb/<limb>/follow_joint_trajectory/status
# Verify that the robot is enabled:
$ rosrun baxter_tools enable_robot.py
# Start the joint trajectory action server:
$ rosrun baxter_interface joint_trajectory_action_server.py
Please see the simple joint trajectory example or joint trajectory playback example for examples of creating a client of this action server and requesting a joint trajectory.
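Those examples are the reference implementations; the following is only a rough sketch of a client for this action interface (the joint names and waypoint values are illustrative):

```python
#!/usr/bin/env python
# Sketch: send a two-point trajectory for the left arm to the action server.
import rospy
import actionlib
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint

rospy.init_node('trajectory_client_example')
client = actionlib.SimpleActionClient(
    '/robot/limb/left/follow_joint_trajectory', FollowJointTrajectoryAction)
client.wait_for_server()

goal = FollowJointTrajectoryGoal()
goal.trajectory.joint_names = ['left_s0', 'left_s1', 'left_e0', 'left_e1',
                               'left_w0', 'left_w1', 'left_w2']
# Two timestamped waypoints: reach the first at t=2s, the second at t=4s.
for t, positions in [(2.0, [0.0, -0.55, 0.0, 0.75, 0.0, 1.26, 0.0]),
                     (4.0, [0.2, -0.55, 0.0, 0.75, 0.0, 1.26, 0.0])]:
    point = JointTrajectoryPoint()
    point.positions = positions
    point.time_from_start = rospy.Duration(t)
    goal.trajectory.points.append(point)

goal.trajectory.header.stamp = rospy.Time.now()
client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("Result: %s", client.get_result())
```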
The joint trajectory action server provides a number of parameters which describe its behavior during trajectory execution. These were largely designed to follow these standards.
Note: All of these parameters will be initialized on startup of the trajectory_controller.py if they were not previously specified.
/rethink_rsdk_joint_trajectory_controller/<joint_name>__kp
The proportional gain with which the joint trajectory controller will track the commanded trajectory for the specified joint.
/rethink_rsdk_joint_trajectory_controller/<joint_name>__kd
The derivative gain with which the joint trajectory controller will track the commanded trajectory for the specified joint.
/rethink_rsdk_joint_trajectory_controller/<joint_name>__ki
The integral gain with which the joint trajectory controller will track the commanded trajectory for the specified joint.
/rethink_rsdk_joint_trajectory_controller/goal_time (double, default: 0.0)
The amount of time (in seconds) that the controller is permitted to be late to the goal. If goal_time has passed and the controller still has not reached the final position (within the tolerance described by /rethink_rsdk_joint_trajectory_controller/<joint_name>_goal), then the goal is aborted.
/rethink_rsdk_joint_trajectory_controller/<joint>_goal (double, default: -1.0)
The maximum final error for the trajectory goal to be considered successful. Negative numbers indicate that there is no constraint. Given in units of joint position (radians). If this constraint is violated, the goal is aborted.
/rethink_rsdk_joint_trajectory_controller/trajectory (double, default: -1.0)
The maximum error at any point during execution for the trajectory goal to be considered successful. Negative numbers indicate that there is no constraint. Given in units of joint position (radians). If this constraint is violated, the goal is aborted.
The Dynamic Reconfigure GUI is suggested for setting these parameters with ROS distributions >= Groovy.
# Start the dynamic reconfigure GUI:
$ rosrun rqt_reconfigure rqt_reconfigure
Expand the joint trajectory controller's parameters by choosing rethink_rsdk_joint_trajectory_controller
from the left menu. Use the sliders/input fields to specify these parameters dynamically.
Alternatively, these parameters can be set via a YAML file, command line, or programmatically (rospy, roscpp).
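For instance, a minimal sketch of setting a couple of these parameters programmatically with rospy (parameter names follow the patterns listed above; adjust the joint name to the one you want to tune):

```python
#!/usr/bin/env python
# Sketch: set trajectory controller parameters from code instead of the GUI/YAML.
import rospy

rospy.init_node('set_trajectory_params')
# Allow the controller to be up to 0.5 s late to the goal before aborting.
rospy.set_param('/rethink_rsdk_joint_trajectory_controller/goal_time', 0.5)
# Example per-joint gain, following the <joint_name>__kp pattern listed above.
rospy.set_param('/rethink_rsdk_joint_trajectory_controller/left_s0__kp', 5.0)
```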
The head state topic will give you the current pan angle (side-to-side) of the head and report boolean status flags indicating whether the robot is currently moving its head or nodding. Note: Flags may not report 'true' values until after the first respective movement command is sent. Even though head nodding cannot be visualized in Gazebo, the head state topic will indicate whether the command was received.
Component IDs: `head_nod`, `head_pan`
Head State
`/robot/head/head_state` ([baxter_core_msgs/HeadState][baxter_core_msgs-HeadState])
- The `pan` field gives you the current angle (radians) of the head. 0 is forward, `-pi/2` is to Baxter's right, and `+pi/2` is to Baxter's left.
- `isPanning` and `isNodding` are boolean fields that will switch to True while the robot is executing a command. Note: The `isPanning` field is initialized to True upon startup and will update thereafter.
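A minimal sketch of reading these fields:

```python
#!/usr/bin/env python
# Sketch: print the head pan angle and motion flags.
import rospy
from baxter_core_msgs.msg import HeadState

def callback(msg):
    rospy.loginfo("pan=%.3f rad  isPanning=%s  isNodding=%s",
                  msg.pan, msg.isPanning, msg.isNodding)

rospy.init_node('head_state_listener')
rospy.Subscriber('/robot/head/head_state', HeadState, callback)
rospy.spin()
```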
Head (Joint) State
`/robot/joint_states` (`sensor_msgs/JointState`)
- The position of the head may also be determined from the `joint_state` message. Note: The 'nod' joint will never update, as it is only a binary state.
Pan Head
`/robot/head/command_head_pan` ([baxter_core_msgs/HeadPanCommand][baxter_core_msgs-HeadPanCommand])
- `target` sets the target angle. 0.0 is straight ahead.
- `speed` is an integer in [0-100], where 100 = max speed.
- Setting an angle on the command_head_pan topic does not guarantee the head will reach that position. There is a small deadband around the reference angle, on the order of +/- 0.12 radians. Note: The speed of the head movement cannot be controlled at this point in time.
Nod Head
`/robot/head/command_head_nod` (`std_msgs/Bool`)
- Send True to nod!
# Check head position/state:
$ rostopic echo /robot/head/head_state
# Move (pan) head side-to-side:
$ rostopic pub /robot/head/command_head_pan baxter_core_msgs/HeadPanCommand -- 10.0 100
# Make head nod up-down:
$ rostopic pub /robot/head/command_head_nod std_msgs/Bool True
Published at 100 Hz, the endpoint state topic provides the current Cartesian Position, Velocity and Effort at the endpoint for either limb.
Endpoint State
`/robot/limb/<side>/endpoint_state` ([baxter_core_msgs/EndpointState][baxter_core_msgs-EndpointState])
- The endpoint state message provides the current `position/orientation pose`, `linear/angular velocity`, and `force/torque effort` of the robot end-effector at 100 Hz. Pose is in meters, velocity in m/s, effort in Nm. Note: The wrench and twist values will be 0 for the simulated Baxter, since they are not implemented.
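A minimal sketch of reading the endpoint pose for the left limb:

```python
#!/usr/bin/env python
# Sketch: print the Cartesian endpoint position of the left limb.
import rospy
from baxter_core_msgs.msg import EndpointState

def callback(msg):
    p = msg.pose.position
    rospy.loginfo("left endpoint: x=%.3f y=%.3f z=%.3f (m)", p.x, p.y, p.z)
    # msg.twist and msg.wrench are also available, but are zero in simulation.

rospy.init_node('endpoint_listener')
rospy.Subscriber('/robot/limb/left/endpoint_state', EndpointState, callback)
rospy.spin()
```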
There is an IR range sensor on each hand (two in total). The sensor data is published on the tf frame `<side>_hand_range`, and points down the +x axis, as is the convention for rviz.
Component IDs: `left_hand_range`, `right_hand_range`
Read Range Sensor Measurements
`/robot/range/<component_id>` (`sensor_msgs/Range`)
- Range is published in meters in the `range` field.
- When the sensor goes beyond its max range, the invalid value of 65.5350036621 (m) is published instead. Users should check for valid measurements using the `min_range` and `max_range` fields.
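A minimal sketch of reading the left hand range sensor and filtering out the invalid 'beyond max range' value as described above:

```python
#!/usr/bin/env python
# Sketch: read the left hand IR range sensor and ignore out-of-range readings.
import rospy
from sensor_msgs.msg import Range

def callback(msg):
    # Only readings inside [min_range, max_range] are valid measurements.
    if msg.min_range <= msg.range <= msg.max_range:
        rospy.loginfo("range: %.3f m", msg.range)
    else:
        rospy.loginfo("no object in range")

rospy.init_node('hand_range_listener')
rospy.Subscriber('/robot/range/left_hand_range', Range, callback)
rospy.spin()
```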
Mounted in a ring around the head are 12 sonar distance sensors.
Component IDs: `head_sonar`
The sonar range measurements are published as a set of 3D coordinate points in space around Baxter, called a PointCloud. This gives a "mapping" of detections in the workspace and is well-suited for occupancy-oriented reasoning in the robot's world.
Read Sonar Point Cloud
`/robot/sonar/head_sonar/state` (`sensor_msgs/PointCloud`)
- The PointCloud message takes all the individual sonar range measurements and, using the known locations of the sonar sensors on the robot, maps the readings into 'points' in 3D space that mark the locations of detected objects.
- The points for all incoming, valid, current readings are collected each time and published as an array in the PointCloud message.
- Note: the result of this is that each published message contains measurements for only a subset of the sensors, not readings for all 12 sensors every time.
- There will be a Point for each currently active measurement - this means a message published at one instant could contain points for only 4 of the sensors, while at another instant the published message could contain two points for the same sensor, if that sensor had two valid measurements in the last time interval.
- The `points` field contains the array of the actual 3D Points - one for each current measurement - in terms of the (x,y,z) coordinates for each point, relative to Baxter's `base` tf frame.
- The `channels` field contains arrays with additional information about each point. The values in each array map 1:1 to the values at the same indices in the `points` array.
  - The array with the name `SensorId` lists which physical sensor each point came from.
  - The array with the name `Distance` lists the original range measurement from that sensor.
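A minimal sketch of reading the sonar PointCloud and its SensorId/Distance channels:

```python
#!/usr/bin/env python
# Sketch: list the sonar detections contained in each PointCloud message.
import rospy
from sensor_msgs.msg import PointCloud

def callback(msg):
    # Channel values are index-aligned with the points array.
    channels = {c.name: c.values for c in msg.channels}
    sensor_ids = channels.get('SensorId', [])
    distances = channels.get('Distance', [])
    for i, pt in enumerate(msg.points):
        if i < len(sensor_ids) and i < len(distances):
            rospy.loginfo("sensor %d: %.2f m at (%.2f, %.2f, %.2f)",
                          int(sensor_ids[i]), distances[i], pt.x, pt.y, pt.z)

rospy.init_node('sonar_listener')
rospy.Subscriber('/robot/sonar/head_sonar/state', PointCloud, callback)
rospy.spin()
```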
There are four Navigators, simulated as a Qt plugin in the simulated Baxter: one on each side of the torso and one on each arm. Each Navigator is comprised of three push buttons and an indexing scroll wheel.
Component IDs: `left_itb`, `right_itb`, `torso_left_itb`, `torso_right_itb`
Read Button States
`/robot/itb/<component_id>/state` ([baxter_core_msgs/ITBState][baxter_core_msgs-ITBState])
- The states of the three push buttons are the first three values in the boolean `buttons[]` array. A value of 'True' indicates the button is currently pushed down. The order is as follows:
  - OK_BUTTON (`buttons[0]`): The small rectangular button with 'Ok' text.
  - CANCEL_BUTTON (`buttons[1]`): The button 'above' the OK_BUTTON, typically with 'Cancel' text.
  - SHOW_BUTTON (`buttons[2]`): The SHOW_BUTTON, or "Rethink Button", is 'below' the OK_BUTTON and is typically labeled 'Show'.
Read Wheel Index
`/robot/itb/<component_id>/state` ([baxter_core_msgs/ITBState][baxter_core_msgs-ITBState])
- The `wheel` field returns an integer between [0-255]. Each physical 'click' of the wheel corresponds to a +/-1 increment. The value wraps around when it goes above or below those bounds.
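A minimal sketch of watching the left arm Navigator's buttons and wheel:

```python
#!/usr/bin/env python
# Sketch: print the left arm Navigator's button states and wheel index.
import rospy
from baxter_core_msgs.msg import ITBState

def callback(msg):
    rospy.loginfo("ok=%s cancel=%s show=%s wheel=%d",
                  msg.buttons[0], msg.buttons[1], msg.buttons[2], msg.wheel)

rospy.init_node('navigator_listener')
rospy.Subscriber('/robot/itb/left_itb/state', ITBState, callback)
rospy.spin()
```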
There are two sets of lights on each Navigator: the 'inner' light ring around the circular OK Button, and the 'outer' oval light ring around the entire Navigator. Each light is identified by the 'Component ID' of its Navigator followed by `_light_inner` or `_light_outer`.
Component IDs:
Inner Lights: `left_itb_light_inner`, `right_itb_light_inner`, `torso_left_itb_light_inner`, `torso_right_itb_light_inner`
Outer Lights: `left_itb_light_outer`, `right_itb_light_outer`, `torso_left_itb_light_outer`, `torso_right_itb_light_outer`
Turn LEDs On/Off
`/robot/digital_io/command` ([baxter_core_msgs/DigitalOutputCommand][baxter_core_msgs-DigitalOutputCommand])
- `name`: `<navigator_component_id>_light_inner`, `<navigator_component_id>_light_outer`
- `value`: {True, False}
- Publish a DigitalOutputCommand message with the component id of the light as the `name` and a `value` of `True` or `False` to turn the LEDs On or Off, respectively.
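A minimal sketch of turning the left arm Navigator's inner light on:

```python
#!/usr/bin/env python
# Sketch: turn the left arm Navigator's inner light on.
import rospy
from baxter_core_msgs.msg import DigitalOutputCommand

rospy.init_node('navigator_light_example')
pub = rospy.Publisher('/robot/digital_io/command', DigitalOutputCommand, queue_size=10)
rospy.sleep(1.0)                       # give the publisher time to connect

cmd = DigitalOutputCommand()
cmd.name = 'left_itb_light_inner'      # light component id, as listed above
cmd.value = True                       # True = on, False = off
pub.publish(cmd)
rospy.sleep(0.5)                       # keep the node alive long enough to send
```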
Check State of LEDs
`/robot/digital_io/<light_component_id>/state` ([baxter_core_msgs/DigitalIOState][baxter_core_msgs-DigitalIOState])
- The `state` field will give you the current setting of the LED, ON (1) or OFF (0). Note: The Navigator LEDs are not simulated in Gazebo, so the lights cannot be seen turning on and off; however, the topics mentioned above indicate the appropriate states of these lights.
There are two shoulder buttons on the back of the torso, one on each side. The state of each button is published in a DigitalIOState message under its own topic (DigitalIOState constants: PRESSED==1, UNPRESSED==0).
Component IDs: `left_shoulder_button`, `right_shoulder_button`
Read Button Pressed
`/robot/digital_io/<side>_shoulder_button/state` ([baxter_core_msgs/DigitalIOState][baxter_core_msgs-DigitalIOState])
- The integer field `state` will read PRESSED (1) when the button is pressed down, and UNPRESSED (0) otherwise.
The two buttons and one touch sensor in the cuff of each hand are simulated as Qt push buttons in Gazebo. The state of each button is published in a DigitalIOState message under its own topic (DigitalIOState constants: PRESSED==1, UNPRESSED==0).
Component IDs: `left_lower_cuff`, `right_lower_cuff`
Read Cuff Squeezed
`/robot/digital_io/<side>_lower_cuff/state` ([baxter_core_msgs/DigitalIOState][baxter_core_msgs-DigitalIOState])
- The integer field `state` will read PRESSED (1) when the cuff sensor is squeezed, and UNPRESSED (0) otherwise.
This is the rectangular button under the 'CUFF' section with the label 'Ok'.
Component IDs: `left_lower_button`, `right_lower_button`
Read OK Button Pressed
`/robot/digital_io/<side>_lower_button/state` ([baxter_core_msgs/DigitalIOState][baxter_core_msgs-DigitalIOState])
- The integer field `state` will read PRESSED (1) when the button is pressed, and UNPRESSED (0) otherwise.
This is the rectangular button under the 'CUFF' section with the label 'Grasp'.
Component IDs: `left_upper_button`, `right_upper_button`
Read Grasp Button Pressed
`/robot/digital_io/<side>_upper_button/state` ([baxter_core_msgs/DigitalIOState][baxter_core_msgs-DigitalIOState])
- The integer field `state` will read PRESSED (1) when the button is pressed, and UNPRESSED (0) otherwise.
Read Digital Input State
`/robot/digital_io/<component_id>/state` ([baxter_core_msgs/DigitalIOState][baxter_core_msgs-DigitalIOState])
- `state`: this field will be 1 (for True, Pressed, On) or 0 (for False, Released, Off). If the component is an output, then the `state` field will be the current setting of the output.
- `isInputOnly`: this field tells you if the component is an input (sensor) only and cannot output (it is not a light, actuator, or indicator).
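A minimal sketch of reading any digital component's state, here using the left shoulder button:

```python
#!/usr/bin/env python
# Sketch: monitor a digital input, in this case the left shoulder button.
import rospy
from baxter_core_msgs.msg import DigitalIOState

def callback(msg):
    pressed = (msg.state == DigitalIOState.PRESSED)
    rospy.loginfo("pressed=%s  isInputOnly=%s", pressed, msg.isInputOnly)

rospy.init_node('digital_io_listener')
rospy.Subscriber('/robot/digital_io/left_shoulder_button/state',
                 DigitalIOState, callback)
rospy.spin()
```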
Control Digital Output
`/robot/digital_io/command` ([baxter_core_msgs/DigitalOutputCommand][baxter_core_msgs-DigitalOutputCommand])
- `name`: `<component_id>`
- `value`: {True, False}
- Publish a DigitalOutputCommand message with the component id of the Output as the `name` and a `value` of `True` or `False` to turn the Output On or Off, respectively.
All Digital Component IDs:
Outputs:
Inner Lights: `left_itb_light_inner`, `right_itb_light_inner`, `torso_left_itb_light_inner`, `torso_right_itb_light_inner`
Outer Lights: `left_itb_light_outer`, `right_itb_light_outer`, `torso_left_itb_light_outer`, `torso_right_itb_light_outer`
Inputs:
Back Shoulder Buttons: `left_shoulder_button`, `right_shoulder_button`
Cuff (Squeeze) Sensor: `left_lower_cuff`, `right_lower_cuff`
Cuff OK Button: `left_lower_button`, `right_lower_button`
Cuff Grasp Button: `left_upper_button`, `right_upper_button`
Note: The /robot/digital_io_states topic is not yet simulated to publish the compiled states of all digital IO; the individual state topic of each component furnishes that information instead.
You can access Baxter's two hand cameras and the head camera using the standard ROS image types and image_transport mechanism listed below. These cameras show rendered images of the Gazebo environment. Useful tools for using cameras in ROS include rviz and the image_view program.
IMPORTANT: Unlike the real Baxter, there is no limitation on the number of cameras that can be used simultaneously in simulation.
Component IDs: `left_hand_camera`, `right_hand_camera`, `head_camera`
Raw Image
`/cameras/<component_id>/image` (`sensor_msgs/Image`)
Camera Intrinsics
`/cameras/<component_id>/camera_info` (`sensor_msgs/CameraInfo`)
# View currently open camera topics
$ rostopic list /cameras
/cameras/head_camera/camera_info
/cameras/head_camera/image
/cameras/right_hand_camera/camera_info
/cameras/right_hand_camera/image
/cameras/left_hand_camera/camera_info
/cameras/left_hand_camera/image
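A minimal sketch of subscribing to one of these image streams (printing only the image dimensions to keep the example dependency-free; cv_bridge can convert the message to an OpenCV image if needed):

```python
#!/usr/bin/env python
# Sketch: report the resolution and encoding of the head camera stream.
import rospy
from sensor_msgs.msg import Image

def callback(msg):
    rospy.loginfo("head_camera image: %dx%d, encoding=%s",
                  msg.width, msg.height, msg.encoding)

rospy.init_node('camera_listener')
rospy.Subscriber('/cameras/head_camera/image', Image, callback)
rospy.spin()
```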
Images can be displayed on Baxter's LCD screen by publishing the image data as a ROS `sensor_msgs/Image`.
Display Image
`/robot/xdisplay` (`sensor_msgs/Image`)
- Publish image data as a ROS Image message to update the display.
- The screen resolution is 1024 x 600. Images smaller than this will appear in the top-left corner.
- There are dedicated ROS packages for working with and sending ROS Image messages, including image_transport and image_pipeline.
- Useful tools for working with images in ROS include image_view and republish. Also see camera_drivers for assistance working with your own cameras.
For more information on displaying images to Baxter's LCD screen, see the Display Image Example.
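For instance, a minimal sketch of sending a local image file to the screen with cv_bridge (the file path is a placeholder):

```python
#!/usr/bin/env python
# Sketch: publish a local image file to Baxter's LCD screen (1024 x 600).
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

rospy.init_node('xdisplay_example')
pub = rospy.Publisher('/robot/xdisplay', Image, latch=True, queue_size=1)

img = cv2.imread('my_image.png')                     # placeholder path
msg = CvBridge().cv2_to_imgmsg(img, encoding='bgr8') # convert OpenCV image to ROS Image
pub.publish(msg)
rospy.sleep(1.0)                                     # keep the node alive long enough to send
```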
For more information on how each example works with the simulator, refer here.
- Launching the Gazebo simulator fails with error messages.
- Make sure that you are within the simulator mode and re-launch the simulator.
$ ./baxter.sh sim
- Make sure that the your_ip variable in the baxter.sh script is set correctly to your IP address, or manually set your IP address:
$ export ROS_IP=<ip address>
- Make sure that you are within the simulator mode and re-launch the simulator.
- None of the example programs are running or the topics are not being published with any values.
- Wait until the ROS_INFO message 'Simulator is loaded and started successfully' appears in your terminal before using the simulator.
- Something else?
- Please let us know
[baxter_core_msgs-AnalogIOState]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/AnalogIOState.msg
[baxter_core_msgs-AnalogIOStates]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/AnalogIOStates.msg
[baxter_core_msgs-AnalogOutputCommand]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/AnalogOutputCommand.msg
[baxter_core_msgs-AssemblyState]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/AssemblyState.msg
[baxter_maintenance_msgs-CalibrateArmEnable]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_maintenance_msgs/msg/CalibrateArmEnable.msg
[baxter_core_msgs-DigitalIOState]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/DigitalIOState.msg
[baxter_core_msgs-DigitalIOStates]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/DigitalIOStates.msg
[baxter_core_msgs-DigitalOutputCommand]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/DigitalOutputCommand.msg
[baxter_core_msgs-EndEffectorCommand]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/EndEffectorCommand.msg
[baxter_core_msgs-EndEffectorProperties]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/EndEffectorProperties.msg
[baxter_core_msgs-EndEffectorState]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/EndEffectorState.msg
[baxter_core_msgs-HeadPanCommand]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/HeadPanCommand.msg
[baxter_core_msgs-HeadState]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/HeadState.msg
[baxter_core_msgs-ITB]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/ITB.msg
[baxter_core_msgs-ITBStates]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/ITBStates.msg
[baxter_core_msgs-JointCommand]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/JointCommand.msg
[baxter_maintenance_msgs-LSCores]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_maintenance_msgs/srv/LSCores.srv
[baxter_maintenance_msgs-RMCores]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_maintenance_msgs/srv/RMCores.srv
[baxter_core_msgs-RobustControllerStatus]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/RobustControllerStatus.msg
[baxter_maintenance_msgs-TareData]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_maintenance_msgs/msg/TareData.msg
[baxter_maintenance_msgs-TareEnable]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_maintenance_msgs/msg/TareEnable.msg
[baxter_core_msgs-EndpointState]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/msg/EndpointState.msg
[baxter_core_msgs-ListCameras]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/srv/ListCameras.srv
[baxter_core_msgs-CloseCamera]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/srv/CloseCamera.srv
[baxter_core_msgs-OpenCamera]: http://github.com/RethinkRobotics/baxter_common/blob/release-0.7.0/baxter_core_msgs/srv/OpenCamera.srv
© 2008-2013 Rethink Robotics. All rights reserved.