# Body Moving

New video!

I have been testing the movement of the robot’s base in the world, while keeping the legs fixed to the ground, as a test of the robot’s stability and flexibility.

The robot base can now be controlled via the GUI, keyboard or gamepad, in the following ways:

• Translation in XYZ
• Roll/pitch/yaw
• Movement of the two spine joints – Front of robot remains still, while rear adjusts
• Movement of the two spine joints – Front of robot attempts to counteract the motion of the rear

You may notice the real robot can’t move its upper leg all the way to the horizontal, as the IK might suggest is possible. This is due to a small clash between the AX-12 and the metal bracket, but it should be fixable by filing or bending the curved metal tabs.

I have recently written an OpenCM sketch to control the robot servos, similar to how it was previously done with the ArbotiX-M, but this time using the Robotis libraries for communicating with the motors.

I have also been making various updates to the Python test code, with the main changes being:

• Improved the code for positioning the base and base target in world
• Updated base/spine transforms – Front legs now move with base, not first spine joint
• Fixed the leg IK – Legs now remain in line with ground when the base moves
• Added new keyboard/joystick input modes for controlling base position, base orientation, spine joints
• Updated the serial string sending function and fixed some minor issues
• Moved a load of script configuration variables to a separate Params module
• Added a combo box to the GUI front-end for loading a selection of CSV files (as an update to the previous two fixed buttons)

All the latest code can be found on the Quadbot17 GitHub project page as usual.

# 5DOF leg test rig build and gamepad control

The hardware for the test rig has arrived, and the first leg has been set up!

The metal brackets are from Trossen Robotics and largely match the dimensions of their equivalent plastic parts from Robotis. Some parts do not have metal counterparts, so I used spare plastic parts from my Bioloid kit. I also ordered an ArbotiX-M controller (Arduino-based), which communicates with the PC via a spare SparkFun FT232RL I had. The test rig frame is made out of MakerBeam XL aluminium profiles.
Unfortunately, I thought I had more 3-pin cables than I actually did, so I can’t control all the motors just yet. However, I’ve got an Xbox One controller controlling the IK’s foot target position and a simple serial program on the ArbotiX-M which receives the resulting position commands for each motor.

The code for both the Python applet and the Arduino can be found on GitHub here.

Some of the more useful snippets of code are shown below:

The following code handles gamepad inputs and converts them to a natural-feeling movement, based on equations of motion. The gamepad input gets converted into a virtual force, which is opposed by a drag proportional to the current velocity. The angles resulting from the IK are sent to the controller via serial port:


def __init__(self, master):
    self.master = master
    self.daemon = True  # OK for main to exit even if instance is still running
    self.paused = False
    self.triggerPolling = True
    self.cond = threading.Condition()  # (Restored) guards the paused flag below
    # Input vars
    devices = DeviceManager()
    self.target = targetHome[:]
    self.speed = [0, 0, 0]
    self.inputLJSX = 0
    self.inputLJSY = 0
    self.inputRJSX = 0
    self.inputRJSY = 0
    self.inputLJSXNormed = 0
    self.inputLJSYNormed = 0
    self.inputRJSXNormed = 0
    self.inputRJSYNormed = 0
    self.dt = 0.005

def run(self):
    while 1:
        with self.cond:
            if self.paused:
                self.cond.wait()  # Block until notified
                self.triggerPolling = True
            else:
                if self.triggerPolling:
                    self.pollInputs()
                    self.pollIK()
                    self.pollSerial()
                    self.triggerPolling = False
        # Get joystick input
        try:
            events = get_gamepad()  # (Restored) blocking read from the "inputs" library
            for event in events:
                self.processEvent(event)
        except:
            pass

def pause(self):
    with self.cond:
        self.paused = True

def resume(self):
    with self.cond:
        self.paused = False
        self.cond.notify()  # Unblock self if waiting

def processEvent(self, event):
    #print(event.ev_type, event.code, event.state)
    if event.code == 'ABS_X':
        self.inputLJSX = event.state
    elif event.code == 'ABS_Y':
        self.inputLJSY = event.state
    elif event.code == 'ABS_RX':
        self.inputRJSX = event.state
    elif event.code == 'ABS_RY':
        self.inputRJSY = event.state

def pollInputs(self):
    # World X
    self.inputLJSYNormed = self.filterInput(-self.inputLJSY)
    self.target[0], self.speed[0] = self.updateMotion(self.inputLJSYNormed, self.target[0], self.speed[0])
    # World Y
    self.inputLJSXNormed = self.filterInput(-self.inputLJSX)
    self.target[1], self.speed[1] = self.updateMotion(self.inputLJSXNormed, self.target[1], self.speed[1])
    # World Z
    self.inputRJSYNormed = self.filterInput(-self.inputRJSY)
    self.target[2], self.speed[2] = self.updateMotion(self.inputRJSYNormed, self.target[2], self.speed[2])
    with self.cond:
        if not self.paused:
            self.master.after(int(self.dt*1000), self.pollInputs)

def pollIK(self):
    global target
    target = self.target[:]
    runIK(target)
    with self.cond:
        if not self.paused:
            self.master.after(int(self.dt*1000), self.pollIK)

def pollSerial(self):
    if 'ser' in globals():
        global ser
        global angles
        writeStr = ""
        for i in range(len(angles)):
            x = int( rescale(angles[i], -180.0, 180.0, 0, 1023) )
            writeStr += str(i+1) + "," + str(x)
            # (Restored) separate joint entries with commas, terminate the line
            if i < (len(angles) - 1):
                writeStr += ","
            else:
                writeStr += "\n"
        ser.write(writeStr.encode('ascii'))
    with self.cond:
        if not self.paused:
            self.master.after(int(self.dt*1000), self.pollSerial)

def filterInput(self, i):
    # (Restored) ignore a ~10% deadband around the stick centre, then normalise to +-1
    if (i > 3277) or (i < -3277):
        if i > 3277:
            oldMax = 32767
        elif i < -3277:
            oldMax = 32768
        inputNormed = math.copysign(1.0, i) * rescale(abs(i), 0, oldMax, 0, 1.0)
    else:
        inputNormed = 0
    return inputNormed

def updateMotion(self, i, target, speed):
    c1 = 10000.0
    c2 = 10.0
    mu = 1.0
    m = 1.0
    u0 = speed
    F = c1*i - c2*u0  # Force minus linear drag
    a = F/m
    t = self.dt
    x0 = target
    # Equations of motion
    u = u0 + a*t
    x = x0 + u0*t + 0.5*a*pow(t, 2)
    # Update self
    target = x
    speed = u
    return target, speed

def rescale(old, oldMin, oldMax, newMin, newMax):
    oldRange = (oldMax - oldMin)
    newRange = (newMax - newMin)
    return (old - oldMin) * newRange / oldRange + newMin
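As a quick sanity check of the string format sent in `pollSerial`, the `rescale` helper (repeated here so the snippet runs standalone) maps an IK angle in degrees onto the AX-12's 10-bit position range:

```python
def rescale(old, oldMin, oldMax, newMin, newMax):
    # Linear mapping from one range to another, as used in pollSerial
    oldRange = (oldMax - oldMin)
    newRange = (newMax - newMin)
    return (old - oldMin) * newRange / oldRange + newMin

# An IK output of 0 deg maps to the AX-12 mid position
print(int(rescale(0.0, -180.0, 180.0, 0, 1023)))     # 511
print(int(rescale(-180.0, -180.0, 180.0, 0, 1023)))  # 0
print(int(rescale(180.0, -180.0, 180.0, 0, 1023)))   # 1023
```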



The following Python code is the leg’s Forward Kinematics:


def runFK(angles):
    global a
    global footOffset
    global T_1_in_W
    global T_2_in_W
    global T_3_in_W
    global T_4_in_W
    global T_5_in_W
    global T_F_in_W

    a = [0, 0, 29.05, 76.919, 72.96, 45.032]  # Link lengths "a-1"

    footOffset = 33.596

    s = [0, 0, 0, 0, 0, 0]
    c = [0, 0, 0, 0, 0, 0]
    for i in range(1, 6):
        # (Restored) sines/cosines of the joint angles, used in the transforms below
        s[i] = math.sin( math.radians(angles[i-1]) )
        c[i] = math.cos( math.radians(angles[i-1]) )

    # +90 around Y
    T_0_in_W = np.matrix( [ [  0,  0,  1,  0],
                            [  0,  1,  0,  0],
                            [ -1,  0,  0,  0],
                            [  0,  0,  0,  1] ] )

    T_1_in_0 = np.matrix( [ [ c[1], -s[1],  0, a[1]],
                            [ s[1],  c[1],  0,    0],
                            [    0,     0,  1,    0],
                            [    0,     0,  0,    1] ] )

    T_2_in_1 = np.matrix( [ [ c[2], -s[2],  0, a[2]],
                            [    0,     0, -1,    0],
                            [ s[2],  c[2],  0,    0],
                            [    0,     0,  0,    1] ] )

    T_3_in_2 = np.matrix( [ [ c[3], -s[3],  0, a[3]],
                            [ s[3],  c[3],  0,    0],
                            [    0,     0,  1,    0],
                            [    0,     0,  0,    1] ] )

    T_4_in_3 = np.matrix( [ [ c[4], -s[4],  0, a[4]],
                            [ s[4],  c[4],  0,    0],
                            [    0,     0,  1,    0],
                            [    0,     0,  0,    1] ] )

    T_5_in_4 = np.matrix( [ [ c[5], -s[5],  0, a[5]],
                            [    0,     0,  1,    0],
                            [-s[5], -c[5],  0,    0],
                            [    0,     0,  0,    1] ] )

    T_F_in_5 = np.matrix( [ [  1,  0,  0,  footOffset],
                            [  0,  1,  0,  0],
                            [  0,  0,  1,  0],
                            [  0,  0,  0,  1] ] )

    T_1_in_W = T_0_in_W * T_1_in_0
    T_2_in_W = T_1_in_W * T_2_in_1
    T_3_in_W = T_2_in_W * T_3_in_2
    T_4_in_W = T_3_in_W * T_4_in_3
    T_5_in_W = T_4_in_W * T_5_in_4
    T_F_in_W = T_5_in_W * T_F_in_5



The following Python code is the leg’s Inverse Kinematics:


def runIK(target):
    # Solve Joint 1
    num = target[1]
    den = abs(target[2]) - footOffset
    angles[0] = math.degrees( math.atan2(num, den) )

    # (Restored) cosine of Joint 1, used to project the link lengths below
    c0 = math.cos( math.radians(angles[0]) )

    # Lengths projected onto z-plane
    a2p = a[2]*c0
    a3p = a[3]*c0
    a4p = a[4]*c0
    a5p = a[5]*c0

    j4Height = abs(target[2]) - a2p - a5p - footOffset

    j2j4DistSquared = math.pow(j4Height, 2) + math.pow(target[0], 2)
    j2j4Dist = math.sqrt(j2j4DistSquared)

    # Solve Joint 2
    num = target[0]
    den = j4Height
    psi = math.degrees( math.atan2(num, den) )
    num = pow(a3p, 2) + j2j4DistSquared - pow(a4p, 2)
    den = 2.0*a3p*j2j4Dist
    if abs(num) <= abs(den):
        phi = math.degrees( math.acos(num/den) )
        angles[1] = - (phi - psi)

    # Solve Joint 3
    num = pow(a3p, 2) + pow(a4p, 2) - j2j4DistSquared
    den = 2.0*a3p*a4p
    if abs(num) <= abs(den):
        angles[2] = 180.0 - math.degrees( math.acos(num/den) )

    # Solve Joint 4
    num = pow(a4p, 2) + j2j4DistSquared - pow(a3p, 2)
    den = 2.0*a4p*j2j4Dist
    if abs(num) <= abs(den):
        omega = math.degrees( math.acos(num/den) )
        angles[3] = - (psi + omega)

    # Solve Joint 5
    angles[4] = - angles[0]

    runFK(angles)



The following Arduino code is the serial read code for the ArbotiX-M:


#include <ax12.h>  // ArbotiX AX-12 library, provides SetPosition() (header name assumed)

const int numOfJoints = 5;

void setup()
{
    Serial.begin(38400);
}

void loop()
{
}

void serialEvent()
{
    int id[numOfJoints];
    int pos[numOfJoints];
    while (Serial.available())
    {
        for (int i = 0; i < numOfJoints; ++i)
        {
            id[i] = Serial.parseInt();
            pos[i] = Serial.parseInt();
        }
        for (int i = 0; i < numOfJoints; ++i)
        {
            // Only command valid IDs and positions within the AX-12's 10-bit range
            if ( (id[i] > 0) && (0 <= pos[i]) && (pos[i] <= 1023) )
                SetPosition(id[i], pos[i]);
        }
    }
}



# Fitting the foot force sensors

I finally got round to installing the force sensing resistors to the underside of the Bioloid’s feet. I am using the FSR 400 Short model, with male end connectors, which are small and flat and didn’t fit in the square female header pins which often come at the end of breadboard wires. I didn’t want to solder directly as I’ve damaged these sensors in the past, so I followed a suggestion on the Adafruit website and ended up using these PCB terminal blocks.

The force sensing element of the FSR fits nicely in the pre-existing cut-out on the underside of the Bioloid foot plate, and the pins are fed through a hole in the plate so they can be connected on the topside. However, in the standard Bioloid humanoid configuration, the foot plates are attached at an offset from the ankle angle brackets, which means there is not enough space between the footplate and the bracket to connect the pins. To fix this, I moved the footplates so that the ankle brackets sit along their centres. There is still some free space between the footplates when the Bioloid sits in its resting position, so hopefully the feet will not collide with each other after this change.

For completeness, I also made changes to the CAD model and added FSR CAD models to the feet, although that is just a cosmetic addition to the model (on a related side note, I also found a CAD model of the USB2AX, which adds to the realism)! The robot’s URDF files and CAD models have been updated on GitHub.

Here are some pictures of the results:

The Raspberry Pi has now been fitted to the Bioloid using some spare brackets and sponge padding.

A GUI for the Pi’s screen has also been made using Python, Tkinter and the rospy library. This is actually my first Python program from scratch, so it’s far from perfect but it’s simple and does the job for now!

The program visualises the sensor data from the force sensors and IMU, which the Arduino is publishing on ROS topics. The slider values are unfiltered data, the same which is used by another ROS node to fuse the IMU data and provide a good estimate of the torso’s 3D orientation. The roll/pitch/yaw widgets are used to visualise a simple transformation of the accelerometer/magnetometer data, as was performed previously here. The force sensing resistor (FSR) widget is not active, as the resistors have not been wired in yet, but eventually each FSR’s value will be visualised with a coloured square box ranging from yellow (un-pressed) to red (fully pressed).
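As a sketch of that colour mapping (a hypothetical helper, since the FSR widget isn't wired up yet), a linear interpolation of the green channel gives a Tkinter-style hex colour, assuming 10-bit ADC readings:

```python
def fsr_to_colour(value, maxValue=1023):
    # Hypothetical helper: interpolate from yellow (#ffff00, un-pressed)
    # to red (#ff0000, fully pressed) for a widget fill colour
    value = max(0, min(value, maxValue))
    green = int(255 * (1.0 - value / maxValue))
    return "#ff{:02x}00".format(green)

print(fsr_to_colour(0))     # #ffff00 (yellow)
print(fsr_to_colour(1023))  # #ff0000 (red)
```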

# Using quaternions to create a better IMU complementary filter

## A better alternative to the RPY approach

After realising in my previous post that solving the gimbal lock problem for the complementary filter requires fiddly and inelegant fixes, I decided to dive into the world of quaternions. Luckily, the theory behind quaternions doesn’t need to be fully understood in order to use them to represent 3D rotations. Particularly useful are the conversions between the quaternion, rotation matrix and axis-angle representations.
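For example, the axis-angle to quaternion conversion (which tf provides via its `tf::Quaternion(axis, angle)` constructor) is only a few lines in plain Python; a standalone sketch:

```python
import math

def quat_from_axis_angle(axis, angle):
    # Unit quaternion (x, y, z, w) for a rotation of `angle` rads about `axis`
    # (axis must be non-zero; it is normalised here)
    n = math.sqrt(sum(a*a for a in axis))
    s = math.sin(angle / 2.0) / n
    return (axis[0]*s, axis[1]*s, axis[2]*s, math.cos(angle / 2.0))

# 90 deg about z
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(q)  # approx (0, 0, 0.7071, 0.7071)
```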

I initially tried to make the Arduino MCU (A-Star) perform the filtering process and pass the IMU frame transform to the PC, however this presented a couple of issues, the first being that there is no standard vector and matrix library in Arduino. Second, although I could write a very simple library for vector cross and dot products etc., the MCU’s 32 KB of flash memory was already almost full with all the rosserial and IMU libraries, leaving little space for anything else.
Hence I opted for letting the MCU simply pass a number of ROS message streams over to the PC, which could then do the required transformations using the ROS tf library. The Arduino code is listed at the end of this post (bioloid_sensors.ino).

The MCU reads the MinIMU-9 v3 IMU data and publishes it as ROS messages on the following topics:

• LSM303 accelerometer x, y, z values (“accel”)
• LSM303 magnetometer x, y, z values (“magnet”)
• L3G gyroscope x, y, z values (“gyro”)
• LSM303 magnetometer heading value (“heading”), derived from the LSM303’s library, as the angular difference in the horizontal plane between the x axis and magnetic North
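On the PC side the heading later gets converted to radians with the range shifted to ±π; the conversion (mirroring the one in the Arduino comments further down) amounts to:

```python
import math

def heading_to_yaw(headingDegrees):
    # Convert the LSM303 heading (degrees from North) to a yaw in rads,
    # range shifted to +-pi, sign flipped to match the robot frame
    return -(headingDegrees * math.pi / 180.0 - math.pi)

print(heading_to_yaw(180.0))  # 0.0
print(heading_to_yaw(90.0))   # ~1.5708 (pi/2)
```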

ROS has a very handy plotting tool for topics, called rqt_plot. Below is an example dump of all the IMU data.

IMU messages plotted in rqt_plot

## Quaternion-based filter

Having spent too much time on the RPY approach already, I wanted to find a simple way to achieve a relatively stable orientation from the IMU readings. I ended up implementing the method from this AHRS maths tutorial from the Firetail UAV system. The method is explained very well in that link, so there is no need to go into the details. The only change I have made is in the calculation of the filter coefficient, based on a set time constant as was done previously. My version using a ROS subscriber on the PC side is again listed at the end of this post (imu_tf_broadcaster.cpp and .h files).

## Videos

Here are two videos comparing the original RPY approach against the improved quaternion approach. The resulting IMU transform is published by tf, which is used by the robot model you see in RViz, the ROS visualisation tool. The model is generated using a URDF file; I will explain this in detail in a following post.

Although in both cases the rotation is fairly smooth, you can see the problems that the RPY filtering encounters when it nears the gimbal lock position (when the robot lies horizontally). For my purposes of orientating the robot, I think the current quaternion approach is perfectly suited. I doubt I will be needing to play around with Kalman filters and the likes, as I currently don’t need the precision that UAVs etc. may need!

So that’s it for IMUs and orientation for the time being. In my next post I will start detailing the current progress with the virtual bioloid in ROS, which was seen in the above videos.

Code:


// Hardware:
// Pololu A-Star 32U4 Mini SV
// Pololu MinIMU-9 v3 (L3GD20H and LSM303D)
// Interlink FSR 400 Short (x6)

// Important! Define this before #include <ros.h>
#define USE_USBCON

#include <Wire.h>
#include <ros.h>
#include <std_msgs/Int16MultiArray.h>
#include <geometry_msgs/Vector3.h>
#include <std_msgs/Float32.h>
#include <AStar32U4Prime.h>
#include <LSM303.h>
#include <L3G.h>

// Set up the ros node and publishers
ros::NodeHandle nh;
std_msgs::Int16MultiArray msg_fsrs;
std_msgs::MultiArrayDimension fsrsDim;
ros::Publisher pub_fsrs("fsrs", &msg_fsrs);
geometry_msgs::Vector3 msg_accel;
ros::Publisher pub_accel("accel", &msg_accel);
geometry_msgs::Vector3 msg_magnet;
ros::Publisher pub_magnet("magnet", &msg_magnet);
std_msgs::Float32 msg_heading;  // (Restored) matches the "heading" topic and the Float32 include
ros::Publisher pub_heading("heading", &msg_heading);
geometry_msgs::Vector3 msg_gyro;
ros::Publisher pub_gyro("gyro", &msg_gyro);
unsigned long pubTimer = 0;

const int numOfFSRs = 6;
const int FSRPins[] = {A0, A1, A2, A3, A4, A5};
int FSRValue = 0;
LSM303 compass;
L3G gyro;
const int yellowLEDPin = IO_C7;  // 13
int LEDBright = 0;
int LEDDim = 5;
unsigned long LEDTimer = 0;

void setup()
{
    // Array for FSRs
    msg_fsrs.layout.dim = &fsrsDim;
    msg_fsrs.layout.dim[0].label = "fsrs";
    msg_fsrs.layout.dim[0].size = numOfFSRs;
    msg_fsrs.layout.dim[0].stride = 1*numOfFSRs;
    msg_fsrs.layout.dim_length = 1;
    msg_fsrs.layout.data_offset = 0;
    msg_fsrs.data_length = numOfFSRs;
    msg_fsrs.data = (int16_t *)malloc(sizeof(int16_t)*numOfFSRs);

    nh.initNode();

    // Wait until connected
    while (!nh.connected())
        nh.spinOnce();

    Wire.begin();

    // Enable pullup resistors
    for (int i=0; i<numOfFSRs; ++i)
        pinMode(FSRPins[i], INPUT_PULLUP);

    if (!compass.init())
    {
        nh.logerror("Failed to autodetect compass type!");
    }
    compass.enableDefault();

    // Compass calibration values
    compass.m_min = (LSM303::vector<int16_t>){-3441, -3292, -2594};
    compass.m_max = (LSM303::vector<int16_t>){+2371, +2361, +2328};

    if (!gyro.init())
    {
        nh.logerror("Failed to autodetect gyro type!");
    }
    gyro.enableDefault();

    pubTimer = millis();
}

void loop()
{
    if (millis() > pubTimer)
    {
        for (int i=0; i<numOfFSRs; ++i)
        {
            FSRValue = analogRead(FSRPins[i]);  // (Restored) read each FSR pin
            msg_fsrs.data[i] = FSRValue;
            delay(2);  // Delay to allow ADC VRef to settle
        }

        // Compass - accelerometer:
        // 16-bit, default range +-2 g, sensitivity 0.061 mg/digit
        // 1 g = 9.80665 m/s/s
        // e.g. value for z axis in m/s/s will be: compass.a.z * 0.061 / 1000.0 * 9.80665
        //      value for z axis in g will be: compass.a.z * 0.061 / 1000.0
        // Gravity is measured as an upward acceleration:
        // Stationary accel. shows +1 g value on axis facing directly "upwards"
        // Convert values to g
        msg_accel.x = (float)(compass.a.x)*0.061/1000.0;
        msg_accel.y = (float)(compass.a.y)*0.061/1000.0;
        msg_accel.z = (float)(compass.a.z)*0.061/1000.0;

        // Compass - magnetometer:
        // 16-bit, default range +-2 gauss, sensitivity 0.080 mgauss/digit
        msg_magnet.x = (float)(compass.m.x);
        msg_magnet.y = (float)(compass.m.y);
        msg_magnet.z = (float)(compass.m.z);
        // Heading from the LSM303D library is the angular difference in
        // the horizontal plane between the x axis and North, in degrees.
        // Convert value to rads, and change range to +-pi
        msg_heading.data = - ( (float)(compass.heading())*M_PI/180.0 - M_PI );  // (Restored)

        // Gyro:
        // 16-bit, default range +-245 dps (deg/sec), sensitivity 8.75 mdps/digit
        msg_gyro.x = (float)(gyro.g.x)*0.00875*M_PI/180.0;
        msg_gyro.y = (float)(gyro.g.y)*0.00875*M_PI/180.0;
        msg_gyro.z = (float)(gyro.g.z)*0.00875*M_PI/180.0;

        pub_fsrs.publish(&msg_fsrs);
        pub_accel.publish(&msg_accel);
        pub_magnet.publish(&msg_magnet);
        pub_heading.publish(&msg_heading);
        pub_gyro.publish(&msg_gyro);

        pubTimer = millis() + 10;  // wait at least 10 msecs between publishing
    }

    // Pulse the LED
    if (millis() > LEDTimer)
    {
        LEDBright += LEDDim;
        analogWrite(yellowLEDPin, LEDBright);
        if (LEDBright == 0 || LEDBright == 255)
            LEDDim = -LEDDim;

        // 50 msec increments, 2 sec wait after each full cycle
        if (LEDBright != 0)
            LEDTimer = millis() + 50;
        else
            LEDTimer = millis() + 2000;
    }

    nh.spinOnce();
}




// imu_tf_broadcaster.cpp
// The class and method signature lines were missing from this listing;
// the class name "ImuTfBroadcaster" and the main() setup lines are assumed
#include "imu_tf_broadcaster.h"

int main(int argc, char **argv)
{
    ros::init(argc, argv, "imu_tf_broadcaster");
    ros::NodeHandle n;
    ros::Rate loop_rate(1000);  // Hz

    ImuTfBroadcaster broadcaster;
    tf::TransformBroadcaster tfBroadcaster;
    ros::Subscriber accelSub  = n.subscribe("accel",  50, &ImuTfBroadcaster::accelCallback,  &broadcaster);
    ros::Subscriber magnetSub = n.subscribe("magnet", 50, &ImuTfBroadcaster::magnetCallback, &broadcaster);
    ros::Subscriber gyroSub   = n.subscribe("gyro",   50, &ImuTfBroadcaster::gyroCallback,   &broadcaster);

    while (n.ok())
    {
        // Broadcast the current orientation estimate as the odom -> imu_link transform
        tfBroadcaster.sendTransform(
            tf::StampedTransform(
                tf::Transform(broadcaster.getQ(), tf::Vector3(0.0, 0.0, 0.0)),
                ros::Time::now(), "odom", "imu_link"));
        loop_rate.sleep();
        ros::spinOnce();

        //std::cout << "dt: " << broadcaster.getDt() << std::endl;
        //std::cout << "q.x: " << broadcaster.getQ().x() << std::endl;
        //std::cout << "q.y: " << broadcaster.getQ().y() << std::endl;
        //std::cout << "q.z: " << broadcaster.getQ().z() << std::endl;
        //std::cout << "q.w: " << broadcaster.getQ().w() << std::endl;
        //std::cout << "----" << std::endl;
    }

    return 0;
}

ImuTfBroadcaster::ImuTfBroadcaster() :
    prevt(ros::Time::now().toSec()),
    dt(0.0),
    timeConst(1.0),
    filterCoeff(0.0)
{
    q.setRPY(0.0, 0.0, 0.0);
}

ImuTfBroadcaster::~ImuTfBroadcaster()
{
}

void ImuTfBroadcaster::accelCallback(const geometry_msgs::Vector3::ConstPtr& msg)
{
    accel.setX(msg->x);
    accel.setY(msg->y);
    accel.setZ(msg->z);
}

void ImuTfBroadcaster::magnetCallback(const geometry_msgs::Vector3::ConstPtr& msg)
{
    magnet.setX(msg->x);
    magnet.setY(msg->y);
    magnet.setZ(msg->z);
}

void ImuTfBroadcaster::headingCallback(const std_msgs::Float32::ConstPtr& msg)
{
    // Heading currently unused
}

void ImuTfBroadcaster::gyroCallback(const geometry_msgs::Vector3::ConstPtr& msg)
{
    dt = ros::Time::now().toSec() - prevt;

    gyro.setX(msg->x);
    gyro.setY(msg->y);
    gyro.setZ(msg->z);

    angularVel.setX(gyro.x());
    angularVel.setY(gyro.y());
    angularVel.setZ(gyro.z());

    filterCoeff = timeConst / (timeConst + dt);

    // Use accelerometer and magnetometer data to correct gyro drift
    correctOrientation();

    updateRotation();

    prevt = ros::Time::now().toSec();

    //std::cout << "angularVel x: " << angularVel.x() << std::endl;
    //std::cout << "angularVel y: " << angularVel.y() << std::endl;
    //std::cout << "angularVel z: " << angularVel.z() << std::endl;
    //std::cout << "----" << std::endl;
}

void ImuTfBroadcaster::updateRotation()
{
    // New quaternion, from axis-angle notation for gyro
    tf::Quaternion qNew(angularVel.normalized(), angularVel.length()*dt);

    // Update previous value
    q *= qNew;
    q.normalize();
}

void ImuTfBroadcaster::correctOrientation()
{
    // Use acceleration data only if vector is close to 1 g
    if ( fabs(accel.length() - 1) <= 0.1 )
    {
        // These vectors have to be perpendicular.
        // As there is no guarantee that North is perpendicular to Down,
        // set North to the cross product of East and Down.
        // Gravity is measured as an upward acceleration.
        tf::Vector3 Down(accel);  // Should be -accel, but not sure why this produces inverted axes!?
        tf::Vector3 E = Down.cross(magnet);
        tf::Vector3 N = E.cross(Down);

        Down.normalize();
        E.normalize();
        N.normalize();

        // The rows of the rotation matrix represent the coordinates in the original
        // space of unit vectors along the coordinate axes of the rotated space
        tf::Matrix3x3 gyroRotMat = tf::Matrix3x3(q);

        // Correction vector
        tf::Vector3 cv = ( N.cross(gyroRotMat.getRow(0)) +
                           E.cross(gyroRotMat.getRow(1)) +
                           Down.cross(gyroRotMat.getRow(2)) ) * filterCoeff;

        angularVel += cv;
    }
}




// imu_tf_broadcaster.h
#include "ros/ros.h"
#include "geometry_msgs/Vector3.h"
#include "std_msgs/Float32.h"
#include "tf/transform_broadcaster.h"

// The class declaration line was missing from this listing; the name is assumed
class ImuTfBroadcaster
{
public:
    ImuTfBroadcaster();
    ~ImuTfBroadcaster();
    void accelCallback(const geometry_msgs::Vector3::ConstPtr& msg);
    void magnetCallback(const geometry_msgs::Vector3::ConstPtr& msg);
    void headingCallback(const std_msgs::Float32::ConstPtr& msg);
    void gyroCallback(const geometry_msgs::Vector3::ConstPtr& msg);
    //void updateTimers();
    void updateRotation();
    void correctOrientation();
    tf::Quaternion getQ() const {return q;}
    double getPrevt() const {return prevt;}
    double getDt() const {return dt;}
    float getTimeConst() const {return timeConst;}
    void setTimeConst(float value) {timeConst = value;}

private:
    tf::Vector3 accel;
    tf::Vector3 magnet;
    tf::Vector3 gyro;

    tf::Vector3 angularVel;
    tf::Quaternion q;
    double prevt;
    double dt;
    float timeConst;
    float filterCoeff;
};



# Using the Arduino to read sensor data

## Force Sensing Resistors and Inertial Measurement Unit

The first hardware parts to be tested were the inertial measurement unit (IMU) and the force sensing resistors (FSRs). The Arduino IDE was used for programming. The FSRs are straightforward analogue inputs, so I will concentrate on the IMU.

In order to be in line with the ROS coordinate frame conventions, the robot’s base frame will be in this orientation (right-hand rule):

• x axis: forward
• y axis: left
• z axis: up

For now I am assuming the IMU is in the same orientation, but as it is not yet mounted to the robot, it might change later on. Again, in keeping with the ROS conventions on relationships between coordinate frames, the following transforms have to be defined:

$\text{map} \rightarrow \text{odom} \rightarrow \text{base\_link}$

All these frames and their relationships can be easily managed using the ROS tf package.
I have added the imu_link to this chain, which represents the IMU’s orientation with respect to the odom frame. An additional link is also added to transform the standard base_link to the robot’s torso orientation. So the transform chain actually looks like this:
$\text{map} \rightarrow \text{odom} \rightarrow \text{imu\_link} \rightarrow \text{base\_link} \rightarrow \text{torso\_link}$
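The tf library manages this chain for you, but under the hood each link is just a homogeneous transform composed with the next. A minimal numpy sketch, with made-up example offsets (the real values come from the robot's geometry):

```python
import numpy as np

def transform(translation):
    # 4x4 homogeneous transform with identity rotation (illustrative values only)
    T = np.eye(4)
    T[:3, 3] = translation
    return T

T_odom_in_map = transform([1.0, 0.0, 0.0])   # hypothetical odometry offset
T_imu_in_odom = transform([0.0, 0.0, 0.3])   # hypothetical IMU height
T_base_in_imu = transform([0.0, 0.0, 0.0])
T_torso_in_base = transform([0.0, 0.0, 0.1])

# Chain: map -> odom -> imu_link -> base_link -> torso_link
T_torso_in_map = T_odom_in_map @ T_imu_in_odom @ T_base_in_imu @ T_torso_in_base
print(T_torso_in_map[:3, 3])  # translation [1.0, 0.0, 0.4]
```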

## Calculating roll/pitch/yaw

I initially decided to let the A-Star (Arduino-compatible) board perform the IMU calculations and pass the roll/pitch/yaw (RPY) values to the PC as a ROS message. A ROS node will read these values and broadcast the transform from odom to imu_link using tf.

The IMU I am using, the MinIMU-9 v3 from Pololu, consists of an LSM303D accelerometer-magnetometer chip and an L3GD20H gyro chip in one single package. The accelerometer, magnetometer and gyro are all three-axis types, providing 9 degrees of freedom and allowing for a full representation of orientation in space. The data can be easily read using the provided Arduino libraries.

Rotations in Cartesian space will be represented in terms of intrinsic Euler angles, around the three orthogonal axes x, y and z of the IMU. As the order in which rotations are applied is important, I decided to use the standard “aerospace rotation sequence”, which has an easy mnemonic: An aeroplane yaws along the runway, then pitches during takeoff, and finally rolls to make a turn once it is airborne. The rotation sequence is thus:

• Rotation about z: yaw (±180°)
• Rotation about y: pitch (±90°)
• Rotation about x: roll (±180°)

Pitch has to be restricted to ±90° in order to represent any rotation with a single set of RPY values. The accelerometer alone can extract roll and pitch estimates, but not yaw, as the measured gravity vector doesn’t change when the IMU is rotated about the vertical (z) axis. This is where the magnetometer (compass) comes into play.

Hence, using these accelerometer and magnetometer values, the IMU’s orientation can be fully described in space. However, these measurements are very noisy and should only be used as long-term estimates of orientation. Also, any accelerations due to movement, and not just gravity, add disturbances to the estimate. To compensate for this, the gyro readings can be used as a short-term and more accurate representation of orientation, as they are more precise and not affected by external forces.
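The accelerometer tilt estimates boil down to two atan2 calls on the measured gravity vector; these are essentially the formulas that appear in the Arduino listing at the end of this post:

```python
import math

def accel_roll_pitch(aX, aY, aZ):
    # Long-term tilt estimate from the measured gravity vector (in g)
    roll = math.atan2(aY, aZ)
    pitch = math.atan2(-aX, math.sqrt(aY*aY + aZ*aZ))
    return roll, pitch

# IMU flat and stationary: +1 g on z, so no roll or pitch
print(accel_roll_pitch(0.0, 0.0, 1.0))  # (0.0, 0.0)
# Tilted so gravity appears along +y: 90 deg roll
print(accel_roll_pitch(0.0, 1.0, 0.0))  # (~1.5708, 0.0)
```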

## The complementary filter

Gyro measurements alone suffer from drift, so they have to be combined with the accelerometer/magnetometer readings to get a clean signal. A fairly simple way to do this is using what is known as a complementary filter. It can be implemented in code fairly simply, in the following way:

$Roll_{curr} = C_f \times (Roll_{prev} + X_{gyro} \times dt) + (1 - C_f) \times Roll_{accel}$
$Pitch_{curr} = C_f \times (Pitch_{prev} + Y_{gyro} \times dt) + (1 - C_f) \times Pitch_{accel}$
$Yaw_{curr} = C_f \times (Yaw_{prev} + Z_{gyro} \times dt) + (1 - C_f) \times Yaw_{magnet}$

On every loop, the previous RPY values are updated with the new IMU readings. The complementary filter is essentially a high-pass filter on the gyro readings, and a low-pass filter on the accelerometer/magnetometer readings. $C_f$ is a filtering coefficient, which determines the time boundary between trusting the gyroscope vs trusting the accelerometer/magnetometer, defined as:
$C_f = \frac{ t_{const} }{ t_{const} + dt }$
$dt$ is the loop time, and $t_{const}$ is a time constant, which determines the length of recent history for which the gyro data takes precedence over the accelerometer/magnetometer data, in calculating the new values. I have set this to 1 second, so the accelerometer data from over a second ago will take precedence and correct any gyro drift. A good source of information on the complementary filter can be found here.
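A minimal sketch of one filter step in code (a standalone toy, not the robot code), with the coefficient computed from the time constant as above:

```python
def complementary_step(prev, gyroRate, absolute, dt, timeConst=1.0):
    # One filter update: integrate the gyro rate, then pull the result
    # towards the absolute (accelerometer/magnetometer) estimate
    cf = timeConst / (timeConst + dt)
    return cf * (prev + gyroRate * dt) + (1 - cf) * absolute

# With dt = 0.01 s and timeConst = 1 s, cf ~= 0.99: the gyro dominates
# short-term, while the absolute reading corrects drift long-term
angle = 0.0
for _ in range(1000):  # 10 s with no gyro motion, absolute reading at 0.5 rad
    angle = complementary_step(angle, 0.0, 0.5, 0.01)
print(round(angle, 3))  # converges towards 0.5
```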

## Issues with filtering roll/pitch/yaw values

However, when using RPY calculations, applying this filter is not so straightforward. Firstly, when the roll/yaw values cross over the ±180° point, the jump has to be taken into account, otherwise it causes large erroneous jumps in the averaged value. This can be corrected fairly simply. A bigger problem is gimbal lock, which in this case occurs when the pitch crosses the ±90° points (the IMU looking directly up or down). This causes the roll/yaw values to jump abruptly, and as a result the IMU estimation “jumps” around the problem points. My attempt at solving this issue by resetting the estimated values whenever the IMU was near the gimbal lock position was not very effective.
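The gimbal lock degeneracy is easy to demonstrate numerically with the aerospace sequence $R_{xyz} = R_x(\phi) R_y(\theta) R_z(\psi)$ described above: at 90° pitch, only the sum of roll and yaw matters, so distinct RPY triples collapse onto the same rotation:

```python
import numpy as np

def rot(roll, pitch, yaw):
    # Aerospace sequence: yaw about z, then pitch about y, then roll about x
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

# At pitch = 90 deg, (roll=0.3, yaw=0.1) and (roll=0.1, yaw=0.3)
# produce the identical rotation matrix (same roll + yaw sum)
A = rot(0.3, np.pi / 2, 0.1)
B = rot(0.1, np.pi / 2, 0.3)
print(np.allclose(A, B))  # True
```

Away from the singularity (e.g. pitch = 0.5 rad) the same two triples give different rotations, which is why the filter only misbehaves near ±90° pitch.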

After a lot of frustration, I decided that learning how to reliably estimate position using quaternions was the way to go, although they may at first seem daunting. So at this point I abandoned any further experimentation using RPY calculations, however the work was a good introduction to the basics of the A-Star board and representing orientations using an IMU. The Arduino code is posted here for reference. Links to tutorials on using accelerometer and gyro measurements can be found in the comments.

Code:


// Hardware:
// Pololu A-Star 32U4 Mini SV
// Pololu MinIMU-9 v3 (L3GD20H and LSM303D)
// Interlink FSR 400 Short (x6)

// Important! Define this before #include <ros.h>
#define USE_USBCON

#include <Wire.h>
#include <ros.h>
#include <std_msgs/Int16MultiArray.h>
#include <geometry_msgs/Vector3.h>
#include <AStar32U4Prime.h>
#include <LSM303.h>
#include <L3G.h>
#include <math.h>

// Set up the ros node and publishers
ros::NodeHandle nh;
std_msgs::Int16MultiArray msg_fsrs;
std_msgs::MultiArrayDimension fsrsDim;
ros::Publisher pub_fsrs("fsrs", &msg_fsrs);
geometry_msgs::Vector3 msg_rpy;
ros::Publisher pub_rpy("rpy", &msg_rpy);
unsigned long pubTimer = 0;

const int numOfFSRs = 6;
const int FSRPins[] = {A0, A1, A2, A3, A4, A5};
int FSRValue = 0;
LSM303 compass;
L3G gyro;
const int yellowLEDPin = IO_C7;  // 13
int LEDBright = 0;
int LEDDim = 5;
unsigned long LEDTimer = 0;

float aX = 0.0;
float aY = 0.0;
float aZ = 0.0;
float gX = 0.0;
float gY = 0.0;
float gZ = 0.0;
float accelRoll = 0.0;
float accelPitch = 0.0;
float magnetYaw = 0.0;
float roll = 0.0;
float pitch = 0.0;
float yaw = 0.0;
float accelNorm = 0.0;
int signOfz = 1;
int prevSignOfz = 1;
float dt = 0.0;
float prevt = 0.0;
float filterCoeff = 1.0;
float mu = 0.01;
float timeConst = 1.0;

void setup()
{
    // Array for FSRs
    msg_fsrs.layout.dim = &fsrsDim;
    msg_fsrs.layout.dim[0].label = "fsrs";
    msg_fsrs.layout.dim[0].size = numOfFSRs;
    msg_fsrs.layout.dim[0].stride = 1*numOfFSRs;
    msg_fsrs.layout.dim_length = 1;
    msg_fsrs.layout.data_offset = 0;
    msg_fsrs.data_length = numOfFSRs;
    msg_fsrs.data = (int16_t *)malloc(sizeof(int16_t)*numOfFSRs);

    nh.initNode();

    // Wait until connected
    while (!nh.connected())
        nh.spinOnce();

    Wire.begin();

    // Enable pullup resistors
    for (int i=0; i<numOfFSRs; ++i)
        pinMode(FSRPins[i], INPUT_PULLUP);

    if (!compass.init())
    {
        nh.logerror("Failed to autodetect compass type!");
    }
    compass.enableDefault();

    // Compass calibration values
    compass.m_min = (LSM303::vector<int16_t>){-3441, -3292, -2594};
    compass.m_max = (LSM303::vector<int16_t>){+2371, +2361, +2328};

    if (!gyro.init())
    {
        nh.logerror("Failed to autodetect gyro type!");
    }
    gyro.enableDefault();

    pubTimer = millis();
    prevt = millis();
}

void loop()
{
  if (millis() > pubTimer)
  {
    for (int i=0; i<numOfFSRs; ++i)
    {
      msg_fsrs.data[i] = FSRValue;
      delay(2);  // Delay to allow ADC VRef to settle
    }

    if (compass.a.z < 0)
      signOfz = -1;
    else
      signOfz = 1;

    // Compass - accelerometer:
    // 16-bit, default range +-2 g, sensitivity 0.061 mg/digit
    // 1 g = 9.80665 m/s/s
    // e.g. value for z axis in m/s/s will be: compass.a.z * 0.061 / 1000.0 * 9.80665
    //      value for z axis in g will be: compass.a.z * 0.061 / 1000.0
    //
    // http://www.analog.com/media/en/technical-documentation/application-notes/AN-1057.pdf
    //accelRoll = atan2( compass.a.y, sqrt(compass.a.x*compass.a.x + compass.a.z*compass.a.z) );
    //accelPitch = atan2( compass.a.x, sqrt(compass.a.y*compass.a.y + compass.a.z*compass.a.z) );
    // Angle between gravity vector and z axis:
    //float accelFi = atan2( sqrt(compass.a.x*compass.a.x + compass.a.y*compass.a.y), compass.a.z );
    //
    // Alternative way:
    // http://www.freescale.com/files/sensors/doc/app_note/AN3461.pdf
    // Angles - Intrinsic rotations (rotating coordinate system)
    //
    // "Aerospace rotation sequence"
    // yaw, then pitch, then roll (matrices multiplied from right-to-left)
    // R_xyz = R_x(fi) * R_y(theta) * R_z(psi)
    //
    // Convert values to g
    aX = (float)(compass.a.x)*0.061/1000.0;
    aY = (float)(compass.a.y)*0.061/1000.0;
    aZ = (float)(compass.a.z)*0.061/1000.0;
    //accelRoll = atan2( aY, aZ );
    // Singularity workaround for roll:
    accelRoll  = atan2( aY, signOfz*sqrt(aZ*aZ + mu*aX*aX) );  // +-pi
    accelPitch = atan2( -aX, sqrt(aY*aY + aZ*aZ) );            // +-pi/2

    // Compass - magnetometer:
    // 16-bit, default range +-2 gauss, sensitivity 0.080 mgauss/digit
    // Heading from the LSM303D library is the angular difference in
    // the horizontal plane between the x axis and North, in degrees.
    // Convert value to rads, and change range to +-pi
    magnetYaw = - ( (float)(compass.heading())*M_PI/180.0 - M_PI );

    // Gyro:
    // 16-bit, default range +-245 dps (deg/sec), sensitivity 8.75 mdps/digit
    gX = (float)(gyro.g.x)*0.00875*M_PI/180.0;
    gY = (float)(gyro.g.y)*0.00875*M_PI/180.0 * signOfz;
    gZ = (float)(gyro.g.z)*0.00875*M_PI/180.0;

    dt = (millis() - prevt)/1000.0;

    filterCoeff = 1.0;
    // If accel. vector magnitude seems correct, e.g. between 0.5 and 2 g, use the complementary filter
    // Else, only use gyro readings
    accelNorm = sqrt(aX*aX + aY*aY + aZ*aZ);
    if ( (0.5 < accelNorm) && (accelNorm < 2) )
      filterCoeff = timeConst / (timeConst + dt);

    // Gimbal lock: If pitch crosses the +-pi/2 points (z axis crosses the horizontal plane), reset the filter
    // Doesn't seem to work reliably!
    if ( (fabs(fabs(pitch) - M_PI/2.0) < 0.1) && (signOfz != prevSignOfz) )
    {
      roll += signOfz*M_PI;
      yaw += signOfz*M_PI;
    }
    else
    {
      // Update roll/yaw values to deal with +-pi crossover point:
      if ( (roll - accelRoll) > M_PI )
        roll -= 2*M_PI;
      else if ( (roll - accelRoll) < -M_PI )
        roll += 2*M_PI;
      if ( (yaw - magnetYaw) > M_PI )
        yaw -= 2*M_PI;
      else if ( (yaw - magnetYaw) < -M_PI )
        yaw += 2*M_PI;

      // Complementary filter
      roll  = filterCoeff*(roll  + gX*dt) + (1 - filterCoeff)*accelRoll;
      pitch = filterCoeff*(pitch + gY*dt) + (1 - filterCoeff)*accelPitch;
      yaw   = filterCoeff*(yaw   + gZ*dt) + (1 - filterCoeff)*magnetYaw;
    }

    roll  = fmod(roll,  2*M_PI);
    pitch = fmod(pitch,   M_PI);
    yaw   = fmod(yaw,   2*M_PI);

    msg_rpy.x = roll;
    msg_rpy.y = pitch;
    msg_rpy.z = yaw;

    pub_fsrs.publish(&msg_fsrs);
    pub_rpy.publish(&msg_rpy);

    prevSignOfz = signOfz;

    prevt = millis();

    pubTimer = millis() + 10;  // Wait at least 10 msecs between publishing
  }

  // Pulse the LED
  if (millis() > LEDTimer)
  {
    LEDBright += LEDDim;
    analogWrite(yellowLEDPin, LEDBright);
    if (LEDBright == 0 || LEDBright == 255)
      LEDDim = -LEDDim;

    // 50 msec increments, 2 sec wait after each full cycle
    if (LEDBright != 0)
      LEDTimer = millis() + 50;
    else
      LEDTimer = millis() + 2000;
  }

  nh.spinOnce();
}



# Robot Revival

## The Bioloid story

Many moons ago, I purchased my first humanoid robot, an 18-servo Bioloid Comprehensive Kit. At the time, humanoid robotics for hobbyists was in its early stages, and I chose the Bioloid after much deliberation and comparison with its then main competitors, the Hitec Robonova and Kondo KHR-2HV. I went for the Bioloid mainly because of the generous parts list, which doesn’t limit the design to just a humanoid robot, as well as the powerful AX-12+ Dynamixel servos. These have a number of advantages over more traditional hobby servos, such as multiple feedback options (position, temperature, load, input voltage), high torque, upgradeable firmware, internal PID control, a continuous rotation option, a comms bus that enables the servos to be daisy-chained … and the list goes on!

After building some of the standard variants, trying out the demos, attempting a custom walker, and playing around with Embedded C on its CM-5 controller board, I never got around to actually doing the kit any real justice. I have finally decided to explore the potential of this impressive robot, and make all that money worthwhile!

This post is hopefully the first of many in which I will detail my progress with the Bioloid robot.

## Initial hardware ideas

The general plan for hardware is to extend the base platform with various components, avoiding the need for custom electronic boards as far as possible, as I want my main focus to be on software.
The Bioloid’s servos are powered and controlled by the CM-5 controller, which has an AVR ATmega128 at its core. I have played around with downloading custom Embedded C to the CM-5, but for what I have in mind it is much more convenient to be able to control the servos directly from a PC. The standard solution is the USB2Dynamixel; however, much of this chunky adaptor is taken up by an unnecessary serial port, so I went for a functionally identical alternative, the USB2AX by a company called Xevelabs. The PC/laptop control will hopefully be replaced by a Raspberry Pi 2 Model B (on back order!) for a more mobile solution. I have not thought about mobile power yet!

Despite the impressive servos, the stock Bioloid offers little in terms of sensors. The provided AX-S1 sensor module has IR sensors/receiver, a microphone and a buzzer, all built into a single package, which resembles one of the servos and acts as the Bioloid’s head. Although updated controllers from Robotis have emerged over the years, the original CM-5 had no way of directly integrating sensors, and was limited to the AX-S1.

Since a bunch of servos without any real-world feedback does not really make a robot, I am going to add a number of sensors to the base robot. The current plan is to use a MinIMU-9 v3 for tilt/orientation sensing, and a number of Interlink FSR 400 Short force sensing resistors on the feet. Very conveniently, the undersides of the Bioloid’s feet have indents in their four corners which perfectly match the shape of the FSRs! A Pololu A-Star 32U4 Mini SV (essentially an Arduino board) will perform the data collection and pass it over to the PC via serial-to-USB.

That is as far as my current considerations go in terms of hardware. At some point I will look into vision, which may be as simple as a normal webcam. I originally thought that an Xbox Kinect would be ruled out because of size, but apparently it can be done!

## Initial software ideas

I plan on using ROS (Robot Operating System) as the main development platform, with code in C++. I have played around with ROS in the past, and its popularity across a large number of robotics projects, together with its wealth of libraries, makes it a very appealing development platform. Also, I recently discovered the MoveIt! package, which would be great to try out for the Bioloid’s walking gait.

For simplicity, the A-Star will be programmed using the Arduino IDE. I was pleasantly surprised that I wouldn’t have to write any serial comms code to get the sensor data into the ROS environment, as a serial library for the Arduino already exists. ROS is already looking like a good choice!

The A-Star will initially just serve ROS messages to the PC. It may potentially perform other functions if it has the processing power to spare, but for now there is no need. A ROS service running on the PC will be needed to interface with the Dynamixel servos, instructing the servos to move, reading their feedback and publishing the robot’s joint states to various other ROS nodes.

My next post will focus on the new sensor hardware. Until then, please let me know your thoughts and suggestions, all feedback is welcome!