
Balancing ideas

A while back I briefly tried a simple balancing function using a PID controller, which aims to balance the Bioloid with the ankle motors by trying to keep the IMU (accelerometer/gyro) vertical. The results were mixed, but it was only an early test. I am considering revisiting this balancing test, but this time using a number of PID controllers to control multiple groups of leg motors (e.g. hips, knees and ankles), while also using the GUI to make testing faster and easier.
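As a rough sketch of the idea, a generic PID loop driving the ankle goal positions from the IMU's pitch error might look something like the following (the gains and the setGoalPosition() helper are hypothetical, not part of the current code):

// Minimal PID sketch: drive the ankles to keep the torso pitch at zero.
struct Pid
{
    double kp, ki, kd;
    double integral, prevError;

    double update(double error, double dt)
    {
        integral += error * dt;
        const double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

// In the balancing loop, e.g. at 50 Hz:
// Pid anklePid = {1.0, 0.0, 0.1, 0.0, 0.0};  // illustrative gains
// double correction = anklePid.update(0.0 - imuPitch, dt);
// setGoalPosition(RIGHT_ANKLE_ID, restPosition + correction);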

 

On another note, I recently came across the Nengo Neural Simulator, a framework for building networks of leaky integrate-and-fire (LIF) neurons to create complex computational models. It has been used to create Spaun, the world’s largest functional brain model, which is able to perform a number of functions such as vision, memory and counting, as well as drawing what it sees by controlling a simple arm.

What stands out is how easy it is to use the Nengo GUI to build neural networks. The interface runs in the browser, and visualisations of neuron spiking activity and other metrics are easily shown for each graphical object. There is also support for scripting in Python. Installing it and trying it out for yourself is pretty simple: just follow the Getting Started guide here.

 

It would be really interesting to see if some form of PID controller could be built using Nengo, and then used to control the Bioloid’s balancing!

Qt Style Sheets and C++

This is a quick post to show some more updates to the styling of the GUI.

I have been experimenting with customising the look-and-feel of my GUI using Qt Style Sheets (QSS), which are closely modelled on the Cascading Style Sheets (CSS) used for HTML.

Customised widgets

I have so far customised most of the widgets which appear in the GUI, as shown in the following examples. I have chosen a blue/grey theme throughout, with some exceptions for specialised widgets and items.

Code

Initially the style sheets were embedded in QStrings inside the code, but they were dotted around various classes, which meant a lot of code was duplicated as I kept adding content. I then moved the style sheets (QSS) for each widget into separate text files, set by each class individually. Finally I found a way of greatly simplifying this, by putting all the QSS into one single file, which is applied through the program’s main window (QMainWindow). Widgets that need to deviate from the main theme use a custom QSS ID selector, which is set with setObjectName(). This is a nice way of applying specialised styles to specific widgets, such as a button that needs to stand out.
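For illustration, loading the single QSS file and applying it through the main window might look like this (the resource path and widget names are made up for the example):

#include <QApplication>
#include <QFile>
#include <QMainWindow>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QMainWindow window;  // in practice, the GUI's QMainWindow subclass

    QFile qss(":/qss/main.qss");  // illustrative path to the single style sheet
    if (qss.open(QFile::ReadOnly | QFile::Text))
        window.setStyleSheet(QString::fromUtf8(qss.readAll()));

    window.show();
    return app.exec();
}

// An "exception" widget is tagged so that an ID selector can match it:
// stopButton->setObjectName("redPushButton");  // matches QPushButton#redPushButton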

Here are two QSS examples of the standard and special button shown earlier:

QPushButton {
    background-color: qlineargradient(x1: 0, y1: 0, x2: 0, y2: 1, stop: 0 lightsteelblue, stop: 1 steelblue);
    border-color: #8F8F91;
    border-style: outset;
    border-width: 4px;
    border-radius: 10px;
}
QPushButton:flat {
    border: none; /* no border for a flat push button */
}
QPushButton:default {
    border-color: royalblue; /* make the default button prominent */
}
QPushButton:pressed {
    background-color: qlineargradient(x1: 0, y1: 0, x2: 0, y2: 1, stop: 0 royalblue, stop: 1 dodgerblue);
    border-style: inset;
}
QPushButton#redPushButton {
    color: white;
    background-color: red;
    border: solid lightgrey;
    border-style: outset;
    border-width: 4px;
    border-radius: 4px;
}
QPushButton#redPushButton:flat {
    border: none;  /* no border for a flat push button */
}
QPushButton#redPushButton:default {
    border-color: grey;  /* make the default button prominent */
}
QPushButton#redPushButton:pressed {
    color: red;
    background-color: darkred;
    border: solid red;
    border-style: inset;
    border-width: 4px;
    border-radius: 4px;
}

As usual, the latest source code is available on GitHub if you want to have a look.

Motor dials updated

I have made some updates to the motor dials which control the motor positions. They can now change mode and control motor speed and load. Also, the GUI is regularly updated with some important feedback from each motor: motor voltage and temperature, LED and torque on/off state, and feedback on all the alarm states.

At this point I’m starting to think that an internal model of all the motor control table data would be useful! Rather than classes making direct requests to the motor controller to receive motor information, all the state data could be kept by the controller and updated regularly; classes would then simply get the latest data from this model when needed. This is partially how the controller works already, as it has a model of the ROS-style joint states, which holds present positions, present speeds and present torques (loads), as well as goal positions, moving speeds and torque limits. The joint states are published continuously by a ROS publisher. Present and goal positions are the most important data, as the AX-12 by default only performs position control. Moving speed is simply the speed that the motor will use to move between positions, so it cannot be used for e.g. a velocity feedback loop. “Torque” is a bit of a misleading term here, as there is no torque sensing in the motors; torque sensors are only found in much more expensive motors than these. The load values reported by the AX-12 are related to the motor current, but cannot be read while the motor is actually moving. Two notable sources with more detailed information on this somewhat unclear measurement can be found here on the RoboSavvy forum and here.
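As a rough sketch of what such a cached model might look like (the field names are hypothetical, loosely mirroring the AX-12 control table):

// One entry per servo, refreshed continuously by the controller's read loop.
struct JointModel
{
    // Feedback values:
    double presentPosition;  // rad
    double presentSpeed;     // rad/s
    double presentLoad;      // current-related "load", not true torque
    double voltage;          // V
    int    temperature;      // deg C
    bool   ledOn;
    bool   torqueEnabled;
    unsigned char alarmFlags;
    // Targets:
    double goalPosition;     // rad
    double movingSpeed;      // rad/s
    double torqueLimit;      // fraction of max
};
// GUI classes would read from this cache rather than issuing one-off bus reads.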

 

I think I’m done with updating these graphical widgets for the time being, as it is detracting from the main goals of exploring MoveIt! and getting the robot walking.

Sensor grapher

A sensor plotting window has now been added to the GUI, which shows all data from the IMU and pressure sensors. The accelerometer, magnetometer, gyroscope, heading data and Force Sensing Resistors’ (FSRs) data are all published as ROS messages as shown in this post, so reading them in the Qt GUI is fairly straightforward, in a similar way to how the joint states are being read. The graphs are made using a third party library for Qt called QCustomPlot.

Each graph shows a scrolling 10 second window of buffered data, which can be paused/played. With QCustomPlot it’s easy to enable user interaction with graphs (dragging axis ranges with the mouse, zooming with the mouse wheel, etc.), so I enabled this option whenever a graph is in the paused state.
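As an illustration of the scrolling-window logic, here is a sketch using the QCustomPlot 1.x API (the plot member and slot name are my own):

// Called whenever a new sensor sample arrives (t in seconds).
void SensorGraph::onNewSample(double t, double value)
{
    plot->graph(0)->addData(t, value);
    plot->graph(0)->removeDataBefore(t - 10.0);  // keep a 10 second buffer

    if (!paused)
    {
        plot->xAxis->setRange(t, 10.0, Qt::AlignRight);  // scroll with the data
        plot->setInteractions(QCP::Interactions());      // no dragging while live
        plot->replot();
    }
    else
    {
        // Free inspection of the frozen buffer:
        plot->setInteractions(QCP::iRangeDrag | QCP::iRangeZoom);
    }
}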

The y-axis units are currently showing raw data, which I will probably update to show standard values.


A useful thing I found in Qt is that QDockWidget, which is used to create dockable/floating sub-windows, also allows these widgets to be tabbed to save space on the screen. How can this be done in code, you may ask? The useful functions I found were setDockNestingEnabled() (or setDockOptions()) and tabifyDockWidget() on QMainWindow, and raise() to select which tabbed dock widget is displayed by default.
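In sketch form, inside the QMainWindow subclass (the dock names are illustrative):

#include <QDockWidget>
#include <QMainWindow>

void MainWindow::setupDocks()
{
    setDockNestingEnabled(true);  // or setDockOptions(AllowNestedDocks | AllowTabbedDocks)

    QDockWidget *motorDock  = new QDockWidget(tr("Motor Dials"), this);
    QDockWidget *sensorDock = new QDockWidget(tr("Sensor Graphs"), this);
    addDockWidget(Qt::RightDockWidgetArea, motorDock);
    addDockWidget(Qt::RightDockWidgetArea, sensorDock);

    tabifyDockWidget(motorDock, sensorDock);  // stack the two docks as tabs
    motorDock->raise();                       // choose which tab is shown by default
}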

That’s pretty much all there is to the sensor grapher. I might add more features to it in the future, but for now it does the basics!

List of ideas

As I have many ideas floating about on what to work on next, I made a list to try and see which ones are worth prioritising:

  • GUI updates
    • Graph for visualising sensor data
    • Improved motion editing
  • Static balancing
  • Implement MoveIt! trajectory following via GUI
  • Walking routines
  • Advanced movement: trajectories generated from MoveIt!, combined with active balancing as the robot moves

This is just an initial list, which I hope to expand on in the future.

My immediate plans right now are to continue tidying up some GUI graphical details, implement a simple sensor data plotter (I know rqt_plot does a fairly good job of this already, but I’d like to have sensor data integrated in my GUI) and get back to trajectory generation tests using MoveIt!

More software updates

With the new PC’s development tools up and running, I’ve made a number of updates to the GUI side of things, as well as the background joint controller node. Here are some of the most important changes from this month.


Joint controller updates

An issue I was having with the joint controller node – a main function of which is to act as a ROS wrapper class around the Dynamixel API – was that I seemed to be getting erroneous values when executing a simple one-off read from the motors. I suspected this was related to the fact that the node also constantly reads the motors’ feedback values as fast as possible in its main loop. Even though the Dynamixel library seems to have checks for a busy comms bus, there was likely some issue with multiple threads trying to access the bus. This was easily fixed by using ros::spinOnce() in the main loop instead of an AsyncSpinner, which starts up a separate thread.
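In sketch form, the single-threaded structure looks like this (readMotorFeedback() stands in for the real feedback-polling code):

#include <ros/ros.h>

void readMotorFeedback();  // hypothetical: polls the AX-12 bus for feedback values

int main(int argc, char **argv)
{
    ros::init(argc, argv, "ax_joint_controller");
    ros::NodeHandle nh;
    // ... advertise services/publishers, open the Dynamixel comms ...

    ros::Rate rate(100);
    while (ros::ok())
    {
        readMotorFeedback();  // continuous feedback reads
        ros::spinOnce();      // service/subscriber callbacks run here, on the same
                              // thread, so they can never race for the comms bus
        rate.sleep();
    }
    return 0;
}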

There is still some problem where positional values for some of the motors seem to be getting constantly corrupted, but only when all the torques are enabled and the motors are struggling to achieve all their goal positions (making the typical whining noise that servos have). I’ve yet to narrow down whether this is a software issue or not.

Update to read/write services

The ROS read/write services have been simplified now, and two of the most useful commands are the ones that perform a sync read and write across a number of motors. As these are ROS services they can also be called from a terminal command line. For example, to receive the current position, current speed and current torque of motors 1, 3 and 5 all at once, you can run:

rosservice call /ReceiveSyncFromAX '[1, 3, 5]' 36 3

36 is the start address for the low byte of the present position, and 3 indicates the number of consecutive control table values, starting at that address, to read from each motor. This is calling the sync_read function that the USB2AX offers, via a ROS service.

In a similar way you can write goal position (100), goal speed (300) and max torque (512) to motors 1, 3 and 5 all at once by running:

rosservice call /SendSyncToAX '[1, 3, 5]' 30 '[100, 300, 512, 100, 300, 512, 100, 300, 512]'

One restriction is that the sync read and write functions can only read/write consecutive control table addresses, as explained in the USB2AX link above, as well as here in the Robotis documentation.
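The same sync read can of course be called from C++. The following is only a sketch: the request/response field names are guessed from the command-line calls above, not taken from the real .srv definitions in the usb2ax_controller package:

#include <ros/ros.h>
#include "usb2ax_controller/ReceiveSyncFromAX.h"  // assumed service header

usb2ax_controller::ReceiveSyncFromAX srv;
srv.request.dxlIDs.push_back(1);
srv.request.dxlIDs.push_back(3);
srv.request.dxlIDs.push_back(5);
srv.request.startAddress = 36;        // present position, low byte
srv.request.numOfValuesPerMotor = 3;  // position, speed, load

if (ros::service::call("/ReceiveSyncFromAX", srv))
    ROS_INFO("Motor 1 position (raw): %d", (int)srv.response.values[0]);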


New motor control table value editor

On the GUI side of things, I’ve made a new “motor address editor” as an improvement on the previous “motor value editor”. The new editor allows reading/writing of all control table addresses of the AX-12. You don’t have to worry about writing low/high bytes to the two-byte values, as that is handled by the editor; simply write any valid value to the parameter of interest.

I have kept the old editor, as it has two useful features: it lets me easily write the same value to all motors, and it can send position/speed/torque in “standard” units (rad, rad/s and torque %) instead of raw values.
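The unit conversion itself is simple, since the AX-12 maps its 0–1023 raw position range onto 300 degrees, centred at 512. A sketch of the helpers involved (the function names are mine):

#include <cmath>

const double AX12_RANGE_RAD = 300.0 * M_PI / 180.0;  // full scale, ~5.24 rad

int radToAxPosition(double rad)  // 0 rad -> 512 (centre)
{
    return static_cast<int>(std::lround(512 + rad * 1023.0 / AX12_RANGE_RAD));
}

double axPositionToRad(int raw)
{
    return (raw - 512) * AX12_RANGE_RAD / 1023.0;
}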


Recording and executing poses

I added a new test function which moves the robot through a sequence of queued poses that are saved in the GUI. The robot pauses between poses, based on each pose’s specified delay time.

Because the function’s execution has to pause and wait for each pose’s specified dwell time, I moved the code into a separate thread. Originally I did this by subclassing QThread, but I later updated it to use a “worker object” which is moved to a thread with moveToThread(), as this seems to be the preferred approach in Qt.
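The pattern in sketch form (the worker class and its signals are illustrative):

QThread *thread = new QThread(this);
PoseSequenceWorker *worker = new PoseSequenceWorker(queuedPoses);
worker->moveToThread(thread);

connect(thread, SIGNAL(started()), worker, SLOT(run()));           // begin the sequence
connect(worker, SIGNAL(finished()), thread, SLOT(quit()));         // stop the event loop
connect(worker, SIGNAL(finished()), worker, SLOT(deleteLater()));  // clean up the worker
connect(thread, SIGNAL(finished()), thread, SLOT(deleteLater()));  // clean up the thread
thread->start();

// Inside PoseSequenceWorker::run(): move to a pose, QThread::msleep(delayMs),
// then continue to the next queued pose.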

Although this is very simplistic motion editor functionality which has been done many times before by other tools, I thought it was worth adding to my GUI, as it could prove useful in the future as a complement to, or even a replacement of, the MoveIt! ideas if they don’t work out. This GUI is now essentially a one-stop shop for easily testing various ROS functionality and other ideas as I keep developing along the way!

New development PC

Progress has been slow as my laptop is having overheating problems, so running ROS for any extended periods was getting difficult. For this reason I have almost finished setting up a Linux development environment on my main PC which is more than capable of running ROS and all the graphical tools. I will keep using the laptop for less intensive tasks and for development away from home.

I am using Linux Mint 17.3 Rosa 64-bit on the PC vs Linux Mint 17 Qiana 32-bit on the laptop (the laptop had Wi-Fi driver issues with the 64-bit version), but still sticking with ROS Indigo instead of upgrading to the more recent Jade, to avoid any surprises.

Installation of ROS was straightforward, as with the laptop, and follows the Ubuntu installation instructions, since Mint is built on Ubuntu. The only edit I had to make was to change the repository entry to use the Ubuntu Trusty codename instead of the Mint Qiana codename, so it looks like this:

deb http://packages.ros.org/ros/ubuntu trusty main

I use Dropbox to keep everything in sync and share the same catkin workspace between the two machines. The only minor annoyance is having to delete the build and devel folders and rebuild my packages when switching between machines, because of the 32-bit/64-bit difference.

Further integration with MoveIt!

The integration of the Bioloid with MoveIt! has reached an important milestone, with the real robot now being able to respond to MoveIt! planned trajectories! There’s still a lot to do to improve performance, but the basic framework is ready. In the next few paragraphs I will try to describe the MoveIt! configuration process up to now.

In a previous post I mentioned how the SRDF file was created as the first step in configuring the robot to use MoveIt! The joint_state_publisher was already set up to publish the values of the joint states, which it read from my AX-12+ motor controller, so RViz was able to display the current state of the robot’s joints live when connected to the robot. Commands could also be sent to move the robot’s servos (such as those sent by the control GUI), but this was done using my own custom ROS services. The missing link in closing the loop between MoveIt! and the robot was making MoveIt! able to command motor movement.


The joint controllers

Some background reading quickly led to the tutorial page on controller configuration. In trying to figure out how to make my AX-12+ motor controller provide the required FollowJointTrajectory action, I first looked at the wiki of the actionlib stack and used a C++ SimpleActionServer. However, this would mean having to implement the trajectory following myself, something which is already done by the ROS joint_trajectory_controller package. So instead I looked into how I could get the robot to use this controller, which led to reading about ros_control and a very good tutorial from Gazebo. It took some time to understand how all these packages work together, and I finally found that I had to implement the hardware_interface for my AX-12+ motor controller. The biggest challenge was figuring out how to configure the controller YAML files and actually get the controller_manager to spawn the controllers correctly.
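For reference, here is a stripped-down sketch of a hardware_interface implementation of this kind, with a single illustrative joint; the actual Dynamixel read/write calls are placeholders, not my real controller code:

#include <controller_manager/controller_manager.h>
#include <hardware_interface/joint_command_interface.h>
#include <hardware_interface/joint_state_interface.h>
#include <hardware_interface/robot_hw.h>
#include <ros/ros.h>

class BioloidHW : public hardware_interface::RobotHW
{
public:
    BioloidHW() : cmd_(0), pos_(0), vel_(0), eff_(0)
    {
        // Expose the joint's state for reading...
        hardware_interface::JointStateHandle stateHandle("right_ankle_joint", &pos_, &vel_, &eff_);
        stateInterface_.registerHandle(stateHandle);
        registerInterface(&stateInterface_);

        // ...and a position command for the trajectory controller to write.
        hardware_interface::JointHandle posHandle(stateHandle, &cmd_);
        posInterface_.registerHandle(posHandle);
        registerInterface(&posInterface_);
    }
    void read()  { /* fill pos_/vel_/eff_ from the AX-12 feedback */ }
    void write() { /* send cmd_ to the AX-12 as a goal position */ }

private:
    hardware_interface::JointStateInterface stateInterface_;
    hardware_interface::PositionJointInterface posInterface_;
    double cmd_, pos_, vel_, eff_;
};

int main(int argc, char **argv)
{
    ros::init(argc, argv, "bioloid_hw");
    BioloidHW robot;
    controller_manager::ControllerManager cm(&robot);

    ros::Rate rate(100);
    ros::Time prev = ros::Time::now();
    while (ros::ok())
    {
        const ros::Time now = ros::Time::now();
        robot.read();
        cm.update(now, now - prev);  // runs the spawned controllers
        robot.write();
        prev = now;
        ros::spinOnce();
        rate.sleep();
    }
    return 0;
}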

I have used various roslaunch files and configuration files that set up and run the joint controllers, which I will upload shortly onto my GitHub page, along with updates to the C++ ROS AX-12+ motor controller.

The connections between the various ROS nodes and topics have now grown quite a bit in complexity, compared to the initial ROS interconnections, and now look something like this (captured using rqt_graph):



Useful links

I have put together a list of links, mostly from the ROS answers site, which were very helpful in getting the controllers to finally work:

Control GUI updates

Progress is going well with my Qt control GUI. The main updates are:

  • New button to cut torque on all motors.
  • New motor value editor, used for writing to the servo control table addresses. The address of choice can be selected, and a value can be set for an individual servo or for all servos at once, using the broadcast ID (254). Also included are three custom functions which write position, speed or torque using more intuitive values: radians, rad/s or % of max torque respectively.
  • New position dials for directly controlling the position of each motor.
  • In order to control and receive feedback from the motors, all these GUI functions interface with the ROS publish/subscribe mechanism or ROS services, which my custom usb2ax_controller ROS package is providing in its own separate ROS node.
  • I have been playing around with Qt style sheets to customise the GUI’s main elements, such as the background, buttons etc. This was fairly time consuming, but I think it improves the overall look and feel rather than just having standard grey boxes! I have also used QDockWidget to make various parts of the GUI easy to hide/show, pop out or dock, and resize. This seems much more flexible than putting all widgets on one form, or using multiple standard windows or even tabbed windows.

Now that the foundations are laid, it’s time to delve deeper into MoveIt!

Robot control GUI first steps

I have switched focus to software at the moment, and am writing a basic user interface that will let me interact with the servo controller and the ROS packages, specifically the MoveIt! Move Group Interface/C++ API.

The interface is written in Qt using the Qt Creator IDE. As this is very ROS focused, I am building the GUI as a ROS package using catkin_make, which was tricky to set up inside Qt Creator, but worked in the end!


Currently the GUI can start a ROS node and receive current and goal position/velocity for each servo. It also enables the creation of a list of joint poses from the robot’s current state, and the ability to add them to a second list, which will act as a queue for moving through sequences of poses later on. The lists can be manipulated and also saved/loaded to CSV text files. A group of slider widgets shows the robot’s current joint positions, with added markers to show ‘target’ positions.

A number of buttons are not yet implemented, but the idea is to make this a quick and easy interface to a number of motion plans using the MoveIt! API, as I found it cumbersome to test with just a simple C++ test file. Hopefully this is the first step to a GUI with many features to come!

All source code is available on GitHub if you want to have a look.

Robot VR model updates

The robot model has got a small update to make it better resemble its real counterpart rather than the default Bioloid humanoid.

I have also had some success testing the Move Group Interface/C++ API with the robot, as well as managed to start a Qt widgets project using ROS’s CMake build system. This is in its very early stages right now, but will form the basis for further tests and advanced motions with the MoveIt! package.

Playing around with Gazebo

I thought it would be interesting to see if I could get my current Bioloid URDF file into Gazebo and try out some simulations. It turned out not to be too difficult; in order to load the model in Gazebo, all I had to add to the URDF were inertial elements for each of the links.

I set up a couple of tables and beer cans in the scene, and dropped the poor Bioloid from a height!

If you already have a URDF description of your robot, Gazebo is fairly easy to set up and has great potential for any simulation-based project. However, as I am focusing more on the physical robot, I probably won’t be using it much for the time being. That, plus the fact that my laptop can barely handle the program without overheating!

Git repo created

I have just set up a GitHub account and added the ROS setup code I am currently working on. You can find it all at https://github.com/dxydas/ros-bioloid.

As I am a Git noob, here are some useful links which helped me along the way, and which may be of use to others:

Heads up!

The Raspberry Pi has now been fitted to the Bioloid using some spare brackets and sponge padding.

A GUI for the Pi’s screen has also been made using Python, Tkinter and the rospy library. This is actually my first Python program from scratch, so it’s far from perfect but it’s simple and does the job for now!

The program visualises the sensor data from the force sensors and IMU, which the Arduino is publishing on ROS topics. The slider values are unfiltered data, the same data used by another ROS node to fuse the IMU readings and provide a good estimate of the torso’s 3D orientation. The roll/pitch/yaw widgets are used to visualise a simple transformation of the accelerometer/magnetometer data, as was performed previously here. The force sensing resistor (FSR) widget is not active, as the resistors have not been wired in yet, but eventually each FSR’s value will be visualised with a coloured square box ranging from yellow (unpressed) to red (fully pressed).

ROS interconnections

In this post I will cover the current state of my ROS setup for controlling the Bioloid’s motors and custom hardware.

The rqt_graph package provides a way of visualising the interconnections between all running ROS nodes. The following graphs show the current setup:

Here is a breakdown of the running nodes:

  • bioloid_sensors (MCU Arduino code): The MCU is running an Arduino program using the rosserial_arduino library, and publishes the IMU’s data as ROS messages over serial-to-USB. The rosserial package was useful as an easy way of sending ROS messages from the MCU to the controlling PC.
  • serial_node: This is a node included in the rosserial_python package, which reads serial data from the MCU.
  • imu_tf_broadcaster: This is a custom node which uses the IMU’s data to calculate the robot’s orientation, as detailed in this previous post.
  • dummy_odom_frame_broadcaster: This is a static_transform_publisher node from tf, which transforms from the map to the odom frame without any actual change in orientation or position. This is simply to keep in line with the ROS conventions, as explained in this previous post.
  • ax_joint_controller: This is another custom node for communicating with the AX-12 servos. It is a work-in-progress, but currently acts as a ROS wrapper around the USB2Dynamixel/USB2AX library. It publishes the servo positions (in radians) as a ROS sensor_msgs/JointState message on the ax_joint_states topic (a minimal sketch of this publishing loop is shown after this list). It also runs a number of ROS services for controlling the servos from other ROS nodes; these are either higher-level commands (such as setting all motors to the home position), or low-level commands for directly reading from or writing to the AX-12 Control Table’s addresses.
  • joint_state_publisher: The joint_state_publisher is a tool for setting and publishing joint state values for a given URDF file. The node here reads the values from the ax_joint_states topic (by setting the source_list parameter) and publishes on the joint_states topic. I suppose this node could currently be bypassed entirely; however, it does provide a nice GUI window for visualising the joint positions as sliders, and is also useful for moving the robot’s joints in RViz when ax_joint_controller is offline.
  • robot_state_publisher: The robot_state_publisher works in tandem with the joint_state_publisher and publishes the state of the robot to tf. The robot_state_publisher reads the configuration of the kinematic chain from the URDF file (set by the robot_description parameter). The set up of the URDF file has been covered in a previous post. The virtual model of the robot is then displayed in RViz, with joint positions reflecting those of the physical robot.
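As mentioned in the ax_joint_controller entry above, the servo positions end up on the ax_joint_states topic; a minimal sketch of that publishing loop looks like this (readPositionRad() is a placeholder for the real Dynamixel read, and the joint name and servo ID are illustrative):

#include <ros/ros.h>
#include <sensor_msgs/JointState.h>

double readPositionRad(int dxlId);  // hypothetical AX-12 bus read

int main(int argc, char **argv)
{
    ros::init(argc, argv, "ax_joint_controller");
    ros::NodeHandle nh;
    ros::Publisher pub = nh.advertise<sensor_msgs::JointState>("ax_joint_states", 10);

    ros::Rate rate(50);
    while (ros::ok())
    {
        sensor_msgs::JointState msg;
        msg.header.stamp = ros::Time::now();
        msg.name.push_back("right_ankle_joint");      // one entry per servo
        msg.position.push_back(readPositionRad(15));  // illustrative servo ID
        pub.publish(msg);
        ros::spinOnce();
        rate.sleep();
    }
    return 0;
}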

That is about it for the current setup. I have some work-in-progress nodes which interface with the above framework and test some of the robot’s motions, so in a following post I will discuss these and show some new videos of the physical robot in action!

Representing the robot in ROS

Building a virtual robot

In this post, I’ll be describing the first stage in getting a working representation of the as-yet-unnamed robot into the ROS world. I will also be briefly exploring the MoveIt! library, as this might be a useful tool for the future.

The virtual representation of the Bioloid will be built using CAD models of all the individual servos and support brackets. The robot will be loaded into RViz, where its links and joints will be manipulated. Inverse kinematics will hopefully be calculated by the MoveIt! package later on. But first, a definition is needed of how the joints and links are all connected, and of how to move between their frames of reference. The individual CAD components will then be oriented correctly onto these frames.

On a side note, I had the chance to photograph the Bioloid in its current state, and also had a Raspberry Pi 2 delivered!

After some background reading, I found out that the best way to create this representation in ROS is by using a standardised XML format file called the Unified Robot Description Format (URDF).


CAD models

In the past I had made a start on drawing a CAD model of the Robotis servo, but since then Robotis has released CAD drawings of all their robot kits online here (old link).

I tidied up the .stp files using FreeCAD, by removing some placeholder parts or sub-parts which would be of no use (e.g. screws, gears and electronics on the inside of the servos and CM-5). I then converted the files into .dae (Collada) format, and imported them into Blender to add some colour textures. For some reason, saving again as .dae in Blender shrunk the model dimensions by a factor of 100. I haven’t worried too much about the actual component size at the moment, as only their relative scales have to be correct for now. There is also a bug in RViz which replaces the ambient colour of Collada materials with light grey if at least one component of the specified ambient colour is 0. To fix this, I manually edited the file and replaced all 0’s with 0.1 in the “ambient” tags. Below are some of the main components, as rendered in Blender.


The Unified Robot Description Format

The URDF file itself is written in a plain markup language. As the definition of various links is written more conveniently with some basic maths, and because many bits of code will inevitably be repeated or recycled (e.g. the mirroring of the left-hand side based on the right-hand side), ROS has a macro language called xacro, which makes it much easier to create and maintain URDF files. A URDF is generated from a xacro file using:

rosrun xacro xacro.py model.xacro > model.urdf

Creating the file for the Bioloid took a fair while, as I created all the translations by eye, without knowing the actual distances between the various links, instead relying on the CAD components as each one was placed in the chain. I started with the right side of the model, first with the easier arms, then with the legs.

The validity of the URDF file can be checked with the check_urdf tool. Another great tool for visualising the final result is urdf_to_graphiz, which generates a diagram of the joint and link tree. The tree of my model is shown below. Each joint (blue circles) is positioned with respect to its parent link (black rectangles), and the child link shares the same origin as the joint, as shown here. The xyz and rpy labels next to the arrows show the translation and rotation required to get from the parent frame to the child frame; in other words, they represent the child’s frame with respect to the parent’s frame. You will also notice the addition of the IMU link, as well as an additional camera link, although the latter is just a placeholder for a possible camera in the future and won’t be used at the moment.

Graphviz diagram of URDF file


The robot in RViz

The current state of the robot is shown in the screenshots below. The robot is displayed in RViz with the help of the ROS joint_state_publisher and robot_state_publisher. It is fully articulated, and the individual joints can be moved with the help of the GUI which joint_state_publisher provides. The joint states will later be published from the joint values read by the real Bioloid servos. In addition to the kinematic model, I created bounding boxes around the components to act as the collision boundaries, which are shown in red. This was after I realised that without them, the MoveIt! plugin would use the full CAD geometry in its collision detection routines, which made it almost grind to a halt!


Integration with MoveIt!

I have only played around with MoveIt! briefly so far, but the results seem very promising. The library has a useful graphical setup assistant, which essentially enhances the URDF with a Semantic Robot Description Format (SRDF) file. The URDF only contains information about how the joints and links are arranged, as well as some other information such as joint limits, and the visual and collision data. The SRDF doesn’t replace the URDF, but exists separately and contains other information, such as further self-collision checking, auxiliary joints, groups of joints, links and kinematic chains, end effectors and poses. So far I haven’t found any need to edit the SRDF directly, as it can be generated and edited by the setup assistant.

The assistant generates a new ROS package with various templates for path planning and visualisation, which is done via an RViz plugin. So far I have managed to interact with the virtual Bioloid’s arms and legs, in a similar way to that shown in this PR2 robot tutorial. The aim will be to later create some poses and walking gaits which I can try out on the real robot, but that is all for now!



Robot Revival

The Bioloid story

Many moons ago, I purchased my first humanoid robot, an 18-servo Bioloid Comprehensive Kit. At the time, humanoid robotics for hobbyists was at its early stages, and I chose the Bioloid after much deliberation and comparison with its then main competitors, the Hitec Robonova and the Kondo KHR-2HV. I went for the Bioloid mainly because of the generous parts list, which doesn’t limit the design to just a humanoid robot, as well as the powerful AX-12+ Dynamixel servos. These have a number of advantages over the more traditional simple servos, such as multiple feedback options (position, temperature, load, input voltage), powerful torque, upgradeable firmware, internal PID control, a continuous rotation option, a comms bus that enables the servos to be daisy-chained … and the list goes on!

After building some of the standard variants, trying out the demos, attempting a custom walker, and playing around with Embedded C on its CM-5 controller board, I never got around to doing the kit any real justice. I have finally decided to explore the potential of this impressive robot, and make all that money worthwhile!

This post is hopefully the first of many in which I will detail my progress with the Bioloid robot.


Initial hardware ideas

The general plan for hardware is to extend the base platform with various components, avoiding the need for custom electronic boards as far as possible, as I want my main focus to be on software.
The Bioloid’s servos are powered and controlled by the CM-5 controller, which has an AVR ATMega128 at its core. I have played around with downloading custom Embedded C to the CM-5, but in terms of what I have in mind, it is much more convenient to be able to control the servos directly from a PC. The standard solution is the USB2Dynamixel; however, much of this chunky adaptor is taken up by an unnecessary serial port, so I went for a functionally identical alternative, the USB2AX by a company called Xevelabs. The PC/laptop control will hopefully be replaced by a Raspberry Pi 2 Model B (on back order!) for a more mobile solution. I have not thought about mobile power yet!

Despite the impressive servos, the stock Bioloid offers little in terms of sensors. The provided AX-S1 sensor module has IR sensors/receiver, a microphone and a buzzer, all built into a single package which resembles one of the servos and acts as the Bioloid’s head. Although updated controllers by Robotis have emerged over the years, the original CM-5 had no way of directly integrating sensors, and was limited to the AX-S1.

Since a bunch of servos without any real-world feedback does not really make a robot, I am going to add a number of sensors to the base robot. The current plan is to use a MinIMU-9 v3 for tilt/orientation sensing, and a number of Interlink FSR 400 Short force sensing resistors on the feet. Very conveniently, the undersides of the Bioloid’s feet have indents in their four corners which perfectly match the shape of the FSRs! A Pololu A-Star 32U4 Mini SV (essentially an Arduino board) will perform the data collection and pass it over to the PC via serial-to-USB.

That is as far as my current considerations go in terms of hardware. At some point I will look into vision, which may be as simple as a normal webcam. I originally thought that an Xbox Kinect would be ruled out because of size, but apparently it can be done!


Initial software ideas

I plan on using ROS (Robot Operating System) as the main development platform, with code in C++. As well as having played around with ROS in the past, its popularity in a large number of robotics projects and its wealth of libraries make it a very attractive development platform. Also, I recently discovered the MoveIt! package, which would be great to try out for the Bioloid’s walking gait.

For simplicity, the A-Star will be programmed using the Arduino IDE. I was pleasantly surprised that I wouldn’t have to write any serial comms code to get the sensor data into the ROS environment, as a serial library for the Arduino already exists. ROS is already looking like a good choice!
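To give an idea of how little code this needs, a minimal rosserial publisher on the A-Star might look like the following sketch (the topic name, message choice and IMU driver calls are illustrative):

#include <ros.h>
#include <geometry_msgs/Vector3.h>

float readAccelX();  // hypothetical IMU driver functions
float readAccelY();
float readAccelZ();

ros::NodeHandle nh;
geometry_msgs::Vector3 accelMsg;
ros::Publisher accelPub("accel", &accelMsg);

void setup()
{
    nh.initNode();
    nh.advertise(accelPub);
}

void loop()
{
    accelMsg.x = readAccelX();
    accelMsg.y = readAccelY();
    accelMsg.z = readAccelZ();
    accelPub.publish(&accelMsg);
    nh.spinOnce();  // service the rosserial comms
    delay(10);
}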

The A-Star will initially just serve ROS messages to the PC. It may potentially perform other functions if it has the processing power to spare, but for now there is no need. A ROS node running on the PC will be needed to interface with the Dynamixel servos, instructing the servos to move, reading their feedback and publishing the robot’s joint states to various other ROS nodes.

My next post will focus on the new sensor hardware. Until then, please let me know your thoughts and suggestions, all feedback is welcome!