
Further software and hardware updates

Over the last few weeks I have been thinking about the overall controller architecture, 3D printing some test Raspberry Pi cases, and updating the Python test software.

Plotter updates

Timed operations

After refactoring the giant single-file script from the previous post, I noticed that it was actually running very inefficiently. This was partly because the 2D canvas had to redraw the whole robot on every frame, but also because some of the threaded classes (e.g. the input handler and serial handler) relied on the root window’s after() method and were inadvertently and constantly interrupting the main thread. The timing loops are now cleaned up and the code runs much more efficiently.

Here is an example I figured out of how to create a class that runs a function in a separate thread at a defined time interval, and which can be paused or stopped. It subclasses the Thread class and uses an Event object to handle the timing. After creating an object of this class, you call its start() method, which invokes the run() method in a separate thread; the pause(), resume() and stop() methods perform the respective actions. The example is a stripped-down version of an input handler, which applies the equations of motion to user inputs, as shown in a past post, with a short usage example afterwards.

import threading
from time import time
import math


class InputHandlerExample(threading.Thread):
    def __init__(self):
        # Threading/timing vars
        threading.Thread.__init__(self)
        self.event = threading.Event()
        self.dt = 0.05  # 50 ms
        self.paused = False
        self.prevT = time()
        self.currT = time()
        self.target = 0.0
        self.speed = 0.0

        # Input vars
        self.inputNormalised = 0.0

    def run(self):
        while not self.event.is_set():
            if not self.paused:
                self.pollInputs()
            self.event.wait(self.dt)

    def pause(self):
        self.paused = True

    def resume(self):
        self.paused = False

    def stop(self):
        self.event.set()

    def pollInputs(self):
        # Update current time
        self.currT = time()

        # Get user input somehow here, normalise value
        #self.inputNormalised = getInput()

        # Update target and speed
        self.target, self.speed = self.updateMotion(
            self.inputNormalised, self.target, self.speed)

        # Make use of the returned target and speed values
        #useResults(target, speed)

        # Update previous time
        self.prevT = self.currT

    def updateMotion(self, i, target, speed):
        m = 1.0
        u0 = speed

        # Force minus linear drag
        inputForceMax = 1000
        dragForceCoef = 5
        F = inputForceMax*i - dragForceCoef*u0
        a = F/m
        t = self.currT - self.prevT

        # Zero t if it's too large
        if t > 0.5:
            t = 0.0
        x0 = target

        # Equations of motion
        u = u0 + a*t
        x = x0 + u0*t + 0.5*a*math.pow(t, 2)

        return x, u
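
For completeness, here is a rough sketch of how an object of this class might be used (the sleep calls are just there to illustrate the sequence):

from time import sleep

inputHandler = InputHandlerExample()
inputHandler.start()   # invokes run() in a new thread

sleep(1)
inputHandler.pause()   # polling is temporarily suspended
sleep(1)
inputHandler.resume()  # polling continues
sleep(1)
inputHandler.stop()    # signal the thread to exit
inputHandler.join()    # wait for the thread to finish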

3D plot

So far the 2D representation of the robot has worked OK, but it has its limitations, so I finally decided to put some effort into updating it. I went with matplotlib and its 3D extension, the mplot3d toolkit.

The joints are represented using the scatter() function, while the links are drawn as lines with plot(). The text() function overlays the joint numbers on the joint positions. The leg target widgets use a combination of the above functions. The 3D plot is updated using the FuncAnimation class.
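
To illustrate the general approach, here is a simplified sketch rather than the actual test program, using made-up joint positions for a single leg:

import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # registers the 3D projection
from matplotlib.animation import FuncAnimation

fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.set_xlim(-1, 1)
ax.set_ylim(-1, 1)
ax.set_zlim(0, 1)

# Dummy joint positions for a single leg (hip, knee, foot)
joints = np.array([[0.0, 0.0, 0.8],
                   [0.2, 0.0, 0.4],
                   [0.4, 0.0, 0.0]])

# Joints as points, the leg as a line, joint numbers as text overlays
jointMarkers = ax.scatter(joints[:, 0], joints[:, 1], joints[:, 2], c='r')
legLine, = ax.plot(joints[:, 0], joints[:, 1], joints[:, 2], 'b-')
jointLabels = [ax.text(x, y, z, str(i)) for i, (x, y, z) in enumerate(joints)]

def redraw(frame):
    # The in-place updates of these handles are covered in the
    # "Canvas redrawing" section below
    return jointMarkers, legLine

anim = FuncAnimation(fig, redraw, frames=100, interval=50)
plt.show()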

Since matplotlib is designed for MATLAB-style plotting rather than for use as a simulator, performance is still somewhat lacking (not as fast as the 2D canvas), but I wanted to get something working fairly quickly rather than spend too much time on something like OpenGL, since my aim is to have a test program, not to build a 3D simulator from scratch. There’s always ROS and Gazebo if I need to take this further!

As a side note, I have added an option in the main program for selecting between the 2D and 3D drawing modes, as well as no drawing at all, which should be useful for speeding up execution when running on the Raspberry Pi in headless mode (no screen).

Here is a screenshot and a video of the results:

Quadbot 17 3D Drawing 002


Canvas redrawing

Another efficiency issue was that the canvas was previously cleared and then redrawn from scratch on every frame update. To improve efficiency in both the 2D and the 3D cases, the drawing classes now keep track of all the element handles, and the redraw functions simply move the elements to their updated positions. For the 2D case (TkInter canvas), this is done with the coords() method. For the 3D case (matplotlib) it was a bit trickier, but I managed it with a combination of methods (see the sketch after this list):

  • For lines (plot): set_data() and set_3d_properties()
  • For points (scatter): ._offsets3d = juggle_axes()
  • For text overlay (text): set_position() and set_3d_properties()
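
Here is a rough sketch of how those calls fit together, assuming handles like the ones created in the 3D plot sketch earlier; the 2D version assumes the joints are drawn as small ovals on a TkInter canvas (all the handle and position variables are placeholders):

from mpl_toolkits.mplot3d.art3d import juggle_axes

def redraw3d(jointMarkers, legLine, jointLabels, newJoints):
    # newJoints is an N x 3 array of updated joint positions
    xs, ys, zs = newJoints[:, 0], newJoints[:, 1], newJoints[:, 2]

    # Lines (plot handles)
    legLine.set_data(xs, ys)
    legLine.set_3d_properties(zs)

    # Points (scatter handle)
    jointMarkers._offsets3d = juggle_axes(xs, ys, zs, 'z')

    # Text overlays
    for label, (x, y, z) in zip(jointLabels, newJoints):
        label.set_position((x, y))
        label.set_3d_properties(z)

def redraw2d(canvas, jointHandles, newPoints2d, radius=5):
    # TkInter canvas: move the existing oval items with coords()
    # instead of deleting and recreating them
    for handle, (x, y) in zip(jointHandles, newPoints2d):
        canvas.coords(handle, x - radius, y - radius, x + radius, y + radius)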

Updated class diagram

The interaction between the different elements has been cleaned up and now looks like this:

QB17QuadKinematics Class Diagram 02


Hardware architecture

Now, to switch over to the hardware side, here is the current plan of how all the components will probably fit together:

Hardware Components Diagram 01

I have in fact got two Raspberry Pis now, a Pi 3 Model B as well as a Pi Zero W. Both now have built-in WiFi, but I am going with the larger Pi 3 as the current choice for main controller, favouring the quad-core CPU over the Zero’s single core.

I tried 3D printing some cases from Thingiverse for each Pi, and here are the results. A great plus point of the Pi 3 case is that it has mounting points which will be easy to locate on the main robot body.

You can find my builds and their sources on Thingiverse:

Robot VR model updates

The robot model has received a small update to make it better resemble its real counterpart, rather than the default Bioloid humanoid.

I have also had some success testing the Move Group Interface C++ API with the robot, and have managed to start a Qt widgets project using ROS’s CMake build system. This is in its very early stages right now, but it will form the basis for further tests and advanced motions with the MoveIt! package.

Representing the robot in ROS

Building a virtual robot

In this post, I’ll be describing the first stage in getting a working representation of the, as yet unnamed, robot into the ROS world. I will also briefly explore the MoveIt! library, as it might be a useful tool for the future.

The virtual representation of the Bioloid will be built using CAD models of all the individual servos and support brackets. The robot will be loaded up in RViz, where its links and joints will be manipulated. Inverse kinematics will hopefully be calculated by the MoveIt! package later on. But first, a definition is needed of how the joints and links are all connected, and of how to move between their frames of reference. The individual CAD components will then be oriented correctly onto these frames.

On a side note, I had the chance to photograph the Bioloid in its current state, and also had a Raspberry Pi 2 delivered!

After some background reading, I found out that the best way to create this representation in ROS is by using a standardised XML format file called the Unified Robot Description Format (URDF).


CAD models

In the past I had made a start on drawing a CAD model of the Robotis servo, but since then Robotis has released CAD drawings of all their robot kits online here (old link).

I tidied up the .stp files using FreeCAD, removing some placeholder parts or sub-parts which would be of no use (e.g. screws, gears and electronics on the inside of the servos and CM-5). I then converted the files into .dae (Collada) format and imported them into Blender to add some colour textures. For some reason, saving again as .dae in Blender shrank the model dimensions by a factor of 100. I haven’t worried too much about the actual component size at the moment, as only their relative scales have to be correct for now. There is also a bug in RViz which replaces the ambient colour of Collada materials with light grey if at least one component of the specified ambient colour is 0. To fix this, I manually edited each file and replaced all 0’s with 0.1 in the “ambient” tags. Below are some of the main components, as rendered in Blender.
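
As an aside, this kind of fix can also be scripted. Here is a rough sketch, assuming the ambient colours appear in the .dae files as <color sid="ambient">r g b a</color> elements (the file names are just placeholders):

import re

def fixAmbient(pathIn, pathOut):
    # Raise any zero ambient RGB components to 0.1, leaving alpha untouched
    with open(pathIn) as f:
        dae = f.read()

    def bump(match):
        values = [float(v) for v in match.group(2).split()]
        rgb = [0.1 if v == 0.0 else v for v in values[:3]]
        return match.group(1) + " ".join(str(v) for v in rgb + values[3:]) + match.group(3)

    pattern = r'(<color sid="ambient">)([^<]+)(</color>)'
    with open(pathOut, "w") as f:
        f.write(re.sub(pattern, bump, dae))

fixAmbient("servo.dae", "servo_fixed.dae")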


The Unified Robot Description Format

The URDF file itself is written in plain XML. As the definitions of the various links are written more conveniently with some basic maths, and because many bits of code will inevitably be repeated or recycled (e.g. mirroring the left-hand side based on the right-hand side), ROS provides a macro language called xacro, which makes it much easier to create and maintain URDF files. A URDF is generated from a xacro file using:

rosrun xacro xacro.py model.xacro > model.urdf

Creating the file for the Bioloid took a fair while, as I worked out all the translations by eye, without knowing the actual distances between the various links, instead relying on the CAD components as each one was placed in the chain. I started with the right side of the model, first with the easier arms, then with the legs.

The validity of the URDF file can be checked with the check_urdf tool. Another great tool for visualising the final result is urdf_to_graphiz, which generates a diagram of the joint and link tree. The tree of my model is shown below. Each joint (blue circles) is positioned with respect to its parent link (black rectangles), and the following/child link has the same origin as the joint, as shown here. The xyz and rpy labels next to the arrows show the translation and rotation required to get from the parent frame to the child frame; in other words, they are the representation of the child’s frame with respect to the parent’s frame. You will also notice the addition of the IMU link, as well as an additional camera link, although the latter is just a placeholder for a possible camera in the future and won’t be used for the moment.
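
For reference, both tools are run directly on the generated file, e.g.:

check_urdf model.urdf
urdf_to_graphiz model.urdf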

Graphviz diagram of URDF file


The robot in RViz

The current state of the robot is shown in the screenshots below. The robot is displayed in RViz with the help of the ROS joint_state_publisher and robot_state_publisher. It is fully articulated, and the individual joints can be moved using the GUI which joint_state_publisher provides. The joint states will later be published from the joint values read by the real Bioloid’s servos. In addition to the kinematic model, I created bounding boxes around the components to act as collision boundaries, shown in red. This was after I realised that, without them, the MoveIt! plugin would use the full CAD geometry in its collision detection routines, which made it almost grind to a halt!


Integration with MoveIt!

I have only played around with MoveIt! briefly so far, but the results seem very promising. The library has a useful graphical setup assistant, which essentially enhances the URDF with a Semantic Robot Description Format (SRDF) file. The URDF only contains information about how the joints and links are arranged, as well as some other information such as joint limits, and the visual and collision data. The SRDF doesn’t replace the URDF, but exists separately and contains other information, such as further self-collision checking, auxiliary joints, groups of joints, links and kinematic chains, end effectors and poses. So far I haven’t found any need to edit the SRDF directly, as it can be generated and edited by the setup assistant.

The assistant generates a new ROS package with various templates for path planning and visualisation, which is done via an RViz plugin. So far I have managed to interact with the virtual Bioloid’s arms and legs, in a similar way to that shown in this PR2 robot tutorial. The aim will be to later create some poses and walking gaits which I can try out on the real robot, but that is all for now!


Useful links

URDF:

MoveIt!:

RViz scaling and ambient colour issues: