
ROS interconnections

In this post I will cover the current state of my ROS setup for controlling the Bioloid’s motors and custom hardware.

The rqt_graph package provides a way of visualising the interconnections between all running ROS nodes. The following graphs show the current setup:

Here is a breakdown of the running nodes:

  • bioloid_sensors (MCU Arduino code): The MCU runs an Arduino program built with the rosserial_arduino library, and publishes the IMU’s data as ROS messages over the serial-to-USB connection. The rosserial package provides an easy way of sending ROS messages from the MCU to the controlling PC (a minimal example sketch is shown after this list).
  • serial_node: This node, from the rosserial_python package, reads the serial data coming from the MCU and makes its topics available on the PC side.
  • imu_tf_broadcaster: This is a custom node which uses the IMU’s data to calculate the robot’s orientation, as detailed in this previous post (the tf broadcast step is sketched after this list).
  • dummy_odom_frame_broadcaster: This is a static_transform_publisher node from tf, which transforms from the map to the odom frame without any actual change in orientation or position. This is simply to keep in line with the ROS conventions, as explained in this previous post.
  • ax_joint_controller: This is another custom node, for communicating with the AX-12 servos. It is a work in progress, but currently acts as a ROS wrapper around the USB2Dynamixel/USB2AX library. It publishes the servo positions (in radians) as a ROS sensor_msgs/JointState message on the ax_joint_states topic (a minimal publishing sketch also follows this list). It also runs a number of ROS services for controlling the servos from other ROS nodes: these are either higher-level commands (such as setting all motors to the home position), or low-level commands for directly reading from or writing to addresses in the AX-12 Control Table.
  • joint_state_publisher: The joint_state_publisher is a tool for setting and publishing joint state values for a given URDF file. The node here reads the values from the ax_joint_states topic (by setting the source_list parameter) and republishes them on the joint_states topic. This node could currently be bypassed entirely; however, it provides a nice GUI window for visualising the joint positions as sliders, and is also useful for moving the robot’s joints in RViz when ax_joint_controller is offline.
  • robot_state_publisher: The robot_state_publisher works in tandem with the joint_state_publisher and publishes the state of the robot to tf. It reads the configuration of the kinematic chain from the URDF file (set by the robot_description parameter); the setup of the URDF file has been covered in a previous post. The virtual model of the robot is then displayed in RViz, with joint positions reflecting those of the physical robot.
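
To give an idea of how the MCU side works, here is a minimal rosserial_arduino sketch that publishes a single accelerometer vector. The topic name, message type and the readAccel*() helpers are illustrative placeholders rather than the actual bioloid_sensors code, which publishes the full IMU data.

    #include <ros.h>
    #include <geometry_msgs/Vector3.h>

    ros::NodeHandle nh;

    geometry_msgs::Vector3 accel_msg;
    ros::Publisher accel_pub("accel", &accel_msg);

    // Hypothetical stand-ins for the actual IMU driver calls.
    float readAccelX() { return 0.0f; }
    float readAccelY() { return 0.0f; }
    float readAccelZ() { return 0.0f; }

    void setup()
    {
      nh.initNode();
      nh.advertise(accel_pub);
    }

    void loop()
    {
      accel_msg.x = readAccelX();
      accel_msg.y = readAccelY();
      accel_msg.z = readAccelZ();
      accel_pub.publish(&accel_msg);

      nh.spinOnce();
      delay(10);  // roughly 100 Hz
    }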
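
The core of imu_tf_broadcaster is a tf broadcast of the computed orientation. Below is a minimal sketch of just that broadcast step, with an identity quaternion standing in for the fused IMU orientation and assumed odom → base_link frame names; the actual node derives the quaternion from the IMU data as described in the earlier post.

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "imu_tf_broadcaster");
      ros::NodeHandle nh;
      tf::TransformBroadcaster br;

      ros::Rate rate(50);  // broadcast at 50 Hz
      while (ros::ok())
      {
        // Placeholder: the real node sets this quaternion from the fused IMU data.
        tf::Quaternion q(0.0, 0.0, 0.0, 1.0);
        tf::Transform transform(q, tf::Vector3(0.0, 0.0, 0.0));
        br.sendTransform(tf::StampedTransform(transform, ros::Time::now(),
                                              "odom", "base_link"));
        rate.sleep();
      }
      return 0;
    }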
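
Finally, here is a minimal sketch of the JointState publishing side of ax_joint_controller. The joint names and the readPositionRad() helper are placeholders; in the actual node the positions are read through the USB2Dynamixel/USB2AX library, and the ROS services are omitted entirely.

    #include <ros/ros.h>
    #include <sensor_msgs/JointState.h>
    #include <string>
    #include <vector>

    // Placeholder for reading one servo's present position (in radians) via USB2AX.
    double readPositionRad(const std::string& /*joint_name*/) { return 0.0; }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "ax_joint_controller");
      ros::NodeHandle nh;
      ros::Publisher pub =
          nh.advertise<sensor_msgs::JointState>("ax_joint_states", 10);

      // Illustrative joint names; the real ones come from the robot's URDF.
      std::vector<std::string> joints = {"joint_1", "joint_2", "joint_3"};

      ros::Rate rate(20);  // publish at 20 Hz
      while (ros::ok())
      {
        sensor_msgs::JointState msg;
        msg.header.stamp = ros::Time::now();
        for (const std::string& joint : joints)
        {
          msg.name.push_back(joint);
          msg.position.push_back(readPositionRad(joint));
        }
        pub.publish(msg);
        ros::spinOnce();
        rate.sleep();
      }
      return 0;
    }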

That is about it for the current setup. I have some work-in-progress nodes which interface with the above framework and test some of the robot’s motions, so in a following post I will discuss these and show some new videos of the physical robot in action!

Wiring up the electronics

It’s been a while since the last update! I have mostly been working on the communication between ROS and the servos, and now have a working read-write ROS servo interface, but I will post about this later.

In the meantime, I added a bar display made up of 10 green LEDs for information feedback, as well as 4 blue decorative LEDs. I then moved the breadboarded electronics to a more permanent prototype board. I was initially hoping to make a box-shaped board which could fit nicely inside the Bioloid’s hollow “groin”, with the LED bar display at the front. However, the USB cable connector protruded too much, so the LED bar will have to sit to one side of the board. I also managed to squeeze the IMU onto the underside of the board. There is still some room above the board, so I may be able to fit even more of the wiring/electronics in there.
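
As a rough illustration of the kind of feedback the LED bar can give, here is a hypothetical Arduino snippet that lights a number of the 10 LEDs proportional to a value; the pin numbers and the 0-100 input range are assumptions rather than the actual wiring.

    // Assumed pin assignments for the 10-LED bar (not the actual wiring).
    const int BAR_PINS[10] = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11};

    void setup()
    {
      for (int i = 0; i < 10; ++i)
        pinMode(BAR_PINS[i], OUTPUT);
    }

    // Light the first n LEDs for a value in the assumed 0-100 range.
    void showLevel(int value)
    {
      int lit = map(value, 0, 100, 0, 10);
      for (int i = 0; i < 10; ++i)
        digitalWrite(BAR_PINS[i], i < lit ? HIGH : LOW);
    }

    void loop()
    {
      showLevel(50);  // placeholder value; real feedback would come from the MCU program
      delay(100);
    }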

Once the board was ready, I temporarily put all the hardware parts in roughly the right locations to gauge how much space would be taken up and how long the wiring should be. I have removed the AX-S1 sensor, as I probably won’t have much use for it. I think the Raspberry Pi will sit much better on top of the torso rather than on the back, where the old CM-5 used to be. It’s very tempting to turn the screen into a VR face! As you can see, it will be a struggle to mount the massive 5500 mAh battery on the back, so I am planning on downsizing.

Here are some work-in-progress pictures plus a video: