DeepVac

Overview

This page documents the creation of a vacuum cleaning robot.

Although this is a fundamentally "solved" problem (e.g. Roomba), this project presents an opportunity to return to creating robots whose control systems are complex neural networks. The intent is to make use of "deep" learning tools - particularly TensorFlow - for the control system in order to get more intelligent behaviour than would be expected from a Roomba-like device.

Mechanical and Electrical

Introduction

This section describes the mechanical and electrical assembly. After a number of kludgey attempts to retrofit one of my other platforms with a vacuum, I decided to outsource the problem by starting with an existing vacuum robot and bending it to my needs.

I performed a small amount of research to see what hacks others had done to existing platforms. iRobot produced a hacker-friendly version of the Roomba (Create2) which I thought was great! Except that, to make room for all the hacks, iRobot took out the vacuum... The robot I ultimately chose for the task is the Rollibot, which was available on the shelf at Canada Computers. In addition to the reasonable price, I also do a fair amount of shopping at CC so when it comes time to replace cleaning parts and vacuum consumables, they are within easy reach.

I noticed that there were literally no hacks for this vacuum. After a cursory investigation under the hood, I noted that the controller board is ultimately unhackable - no abusable interfaces were obvious. So this will have to be a wholesale, warranty-busting hack that swaps the existing board with something more amenable.

Overview of Rollibot Construction

Platform Construction

The figure below shows a production version of a Rollibot. This is a more expensive (and possibly more capable) version of the one I'm tearing apart.

Figure M1: A Stock Photo of the Rollibot from the Website

The figure below shows the top of the robot I purchased (less a plastic cover - I couldn't wait to start ripping....)

Figure M2: The Top of the Rollibot

The figure below shows the bottom of the Rollibot. Two wheels on the centre axis and a caster forward had me wondering why this robot doesn't fall back on itself. You can see the battery compartment just forward of the vacuum hole (and just behind the caster). The weight of the battery is significantly more than the vacuum motor so the robot stays upright.

Figure M3: The Bottom of the Rollibot
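The balance argument above can be sanity-checked with a quick moment calculation. All of the masses and lever arms below are made up for illustration; only the reasoning mirrors the text.

```python
import math

def net_moment(parts):
    """Sum of moments about the wheel axle (N*m).
    Positive distance = forward of the axle; a positive result means the
    robot rests forward on the caster, a negative one means it tips back."""
    g = 9.81  # m/s^2
    return sum(mass * g * dist for mass, dist in parts)

# (mass kg, distance from wheel axle m) - hypothetical values:
parts = [(0.9, +0.05),   # battery pack, forward of the axle
         (0.3, -0.06)]   # vacuum motor, behind the axle

m = net_moment(parts)
print(f"net moment about axle: {m:+.3f} N*m -> "
      f"{'rests on caster' if m > 0 else 'tips backward'}")
```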

The figure below shows the Rollibot split open. Opening it up was not difficult at all. The bumper shield must be taken off first. Once open, I had to be careful because the top and bottom remain connected through the wires shown in the figure. I didn't want to pull any of the connectors out before I had a chance to label them. But once that was done, the top and bottom can be fully separated by pulling out a few connectors.

Figure M4: Rollibot Guts

Once I pulled the top and bottom apart, I set to work on the top. This is where the robot's new brain is going to sit (there is very little actual room inside). So I need to make the surface there amenable to drilling holes and attaching nuts and bolts. I started this process by splitting the pop-up lid into its component parts as shown in the figure below. There is a black plastic cover that is slightly curved covering the pop-up lid. This cover serves as both a display and a keyboard interface for the user. I popped it off and tossed it.

Figure M5: Rollibot Display and Keyboard Cover Removal

Underneath the black cover, the pop-up lid is nice and flat (prime real estate for a new brain). The keyboard/display system is embedded in the pop-up lid as shown below.

Figure M6: Rollibot Display and Keyboard (Mounted)

I removed the keyboard/display circuit. It was held in place by two screws and connected to the rest of the robot through a convenient connector just underneath. I found the circuit to be kinda neat in that it used surface mount LEDs for the digital readout - relying on the layout of the plastic to channel the light to look like the bars that you would see in a normal LED display. Of course, I couldn't reuse either of these so I tossed them in my "to examine later" bin.

Figure M7: Rollibot Display and Keyboard Circuit

Controller

The Rollibot controller circuit board is shown in the figure below. The main controller appears at the center. I numbered the male and female headers and also mapped them to the components that they control in the unlikely event that I would need to return the board back to operation.

Figure M8: Rollibot Brains

The Rollibot is controlled by an ARM-based microcontroller (STM32F103-VB) as shown in the figure below. A cursory exam of the remainder of the board indicates two locations that might be used to interface to it and possibly hack new code. One of these is outlined in red in the figure. The other is an unpopulated, four-pin through-hole header that sits some distance away from the controller. If there had been an easily identified interface that had some hope of being bent to my needs, I would have considered keeping the board in place and modifying it to act as a slave to a new controller. This approach would allow me to keep the sensorimotor interface intact. Unfortunately, it also involves really getting to know the innards of the existing circuit board and resident code - which might have been easy if it exposed an API.

Figure M9: Rollibot Brains

Motors

As shown in the figure below, the motors were surprisingly easy to extract from the frame. They are self-contained and held in place by three screws. Each motor has a circuit board containing a shaft encoder circuit (the black circle on the blue circuit board). Each motor also has a spring that forces the wheel downwards. When the wheel is at its maximum downward reach, there is an embedded switch that opens to tell the Rollibot that it is no longer touching ground. The switch appears above the shaft encoder board (two blue wires) in the figure.

Figure M10: Rollibot Motors
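Assuming the shaft encoder is a simple single-channel (pulses-per-revolution) device, wheel speed falls out of a pulse count over a fixed window. The sketch below illustrates the idea; the encoder resolution and wheel diameter are made-up figures, not measured Rollibot values.

```python
import math

PULSES_PER_REV = 20     # hypothetical encoder resolution
WHEEL_DIAM_M = 0.065    # hypothetical wheel diameter in metres

def wheel_speed(pulse_count, window_s):
    """Linear wheel speed in m/s from pulses counted over `window_s` seconds."""
    revs_per_s = (pulse_count / PULSES_PER_REV) / window_s
    return revs_per_s * math.pi * WHEEL_DIAM_M

# 40 pulses in one second = 2 rev/s on a 65 mm wheel
print(f"{wheel_speed(40, 1.0):.3f} m/s")
```

Note that a single-channel encoder gives speed only, not direction - direction would have to be inferred from the commanded motor drive.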

Sensors

The Rollibot is equipped with five downward facing sensor ports for detecting the absence of floor - an upsetting situation that occurs when crossing over an edge. The locations of these sensors are outlined in red in the figure below.

Figure M11: Rollibot Edge Sensor Locations

I removed the edge sensors for some closer examination. These are shown in the figure below. You will note that all of the edge sensors are interfaced to connectors 6 and ?? which can be seen in Figure M8.

Figure M12: Edge Sensors

I wanted to know if the edge sensors had any built-in "smarts" - i.e. did they do any processing locally. Given that they were only two-wire interfaces, I didn't hold out any hope. The figure below shows one of the sensors cracked open for a better look. As I suspected, it's just a transmitter-receiver LED pair - most likely working in the IR band. Signal modulation and conditioning would have to be done on the circuit board (hence all of the surface mount resistors).

Figure M13: Interior of an Edge Sensor
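One plausible reason for modulating the emitter is ambient-light rejection: drive the IR LED with a square wave and correlate the receiver samples against that carrier, so steady sunlight averages out. The sketch below simulates that synchronous-detection idea; all signal values are invented.

```python
def synchronous_detect(samples, carrier):
    """Correlate receiver samples against the emitter drive pattern.
    A large positive score means the receiver sees the emitter's reflection
    (floor present); uncorrelated ambient light sums to roughly zero."""
    return sum(s * (1 if c else -1) for s, c in zip(samples, carrier))

carrier = [i % 2 == 0 for i in range(16)]                   # emitter on/off pattern
ambient = [5.0] * 16                                        # constant sunlight
reflection = [5.0 + (3.0 if c else 0.0) for c in carrier]   # carrier bounced off floor

print(synchronous_detect(ambient, carrier))     # ~0: no modulated return
print(synchronous_detect(reflection, carrier))  # strongly positive: floor present
```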

The Rollibot also comes with four forward facing IR sensors. These are mounted into dedicated slots along the front edge of the Rollibot similar to the one shown in the figure below. To get access to these sensors, it is necessary to remove a semicircular plastic mount and split it in half. The two pieces can be seen sandwiched together in the figure.

Figure M14: Forward Facing IR Sensor Slot

I opened the plastic mount and removed the sensors. They are all connected to headers XXX and YYY as shown in the figure below.

Figure M15: Forward Facing Sensors

The forward facing sensors transmit through a tinted, translucent lens attached to the front of the Rollibot. This lens also acts as a bumper and is used to activate bump switches. The bumper reminds me of RoboCop.

Figure M16: Rollibot RoboCop Bumper

There is a final set of four IR sensors on the Rollibot. These are shown in the figure below. They appear to be IR receivers only. There are three on the front and one facing backward. The very front sensor is contained in a plastic box that very narrowly constrains the line-of-sight of the sensor. I haven't confirmed this but my working hypothesis is that these are the "go home" sensors that are used to find the recharging base. The base itself is tinted and translucent so I suspect it contains a modulated IR beacon. The algorithm used by the Rollibot is likely to turn in the direction that maximizes the IR signal being received on the front, center sensor and then to move in that direction. Because of the shadow being cast by the shell containing the center sensor, it is able to very precisely wiggle onto the recharging slots. But that's just a hypothesis....

Figure M17: IR Homing Sensors
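The homing behaviour hypothesized above can be sketched as a simple maximize-then-drive loop. This is only an illustration of my guess at the algorithm - the sensor model and step sizes are invented, and nothing here reflects the actual Rollibot firmware.

```python
def best_heading(read_front_center, headings):
    """Return the heading (degrees) at which the front-center sensor
    reports the strongest modulated-IR beacon signal."""
    return max(headings, key=read_front_center)

def homing_step(read_front_center, current_heading):
    """One control step: re-aim at the beacon, then command forward motion."""
    target = best_heading(read_front_center, range(0, 360, 10))
    if target != current_heading:
        return ("turn", target)
    return ("forward", current_heading)

# Simulated beacon: strongest when the robot points at 90 degrees.
signal = lambda h: max(0.0, 100.0 - abs(h - 90))
print(homing_step(signal, 0))    # ('turn', 90)
print(homing_step(signal, 90))   # ('forward', 90)
```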

There are six switch sensors. As noted previously, two are attached to the wheel mechanism to determine if the robot has left the ground. As shown in the figure below, three more are attached to the front of the Rollibot (outlined in red) and are activated by depressing the RoboCop bumper. The bumper is pushed back into place after hitting something by the aluminum spring attachment outlined in purple.

Figure M18: Bumper Switches and Return Spring
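When the spring snaps the bumper back, the switches will chatter, so whatever controller ends up reading them will likely need debounce logic. A minimal sketch with simulated samples (the threshold is an arbitrary choice, not a measured value):

```python
def debounce(samples, threshold=3):
    """Report a press only after `threshold` consecutive closed samples."""
    events, run = [], 0
    for s in samples:
        run = run + 1 if s else 0
        events.append(run >= threshold)
    return events

noisy = [1, 0, 1, 1, 1, 1, 0, 0]   # contact bounce followed by a solid press
print(debounce(noisy))
# [False, False, False, False, True, True, False, False]
```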

There is one final switch sensor used to detect whether the vacuum container is seated in its slot. There is also a final IR sensor that shines across the mouth of the vacuum container and is used to detect garbage obstructing the opening.



Modifications

Sensors Modification

Every one of the IR sensors needs to have a custom circuit on the back end to generate and modulate the detection signal. That is why the circuit board is overrun by resistors. From a cost perspective, I can see why the designers chose this path. The sensors and their circuits are dirt cheap. However, I decided that integrating a custom conditioning circuit would be harder than just replacing the sensors with something having more intelligence at the front end. This section describes the results of that effort.

Edge Sensors

I shuffled around in my box of parts and found an unopened Sharp distance sensor that I had purchased from RobotShop.com.

Figure M19: RobotShop's Sharp "GP2Y0D810Z0F Distance Sensor - 2cm to 10cm" (Product Code : RB-Pol-157)

It turns out that this sensor has the same width and almost the same height as the original Rollibot edge sensor. As shown in the figure below, the two are remarkably similar.

Figure M20: Sharp Sensor and Original Edge Sensor

In fact, the Sharp sensor not only fits well, it is also held in place by the original plastic hook when it is seated flush with the top of the holder. All that is needed to get the sensor to sit flush is to clip a small plastic extrusion on the top lip of the holder as shown in the figure below.

Figure M21: Inserting a Sharp Sensor Into Edge Sensor Slot
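Unlike the original two-wire sensors, the GP2Y0D810Z0F has its conditioning built in and presents a digital output - driven low while a surface sits inside its 2-10 cm window. For an edge sensor that inverts nicely: "nothing in range" means "no floor". A sketch of a five-port scan the new controller might run (pin reads are simulated; port numbering is arbitrary):

```python
FLOOR_PRESENT = 0   # Sharp output is active-low when a surface is in range

def edges_detected(read_pin, ports=range(5)):
    """Return the indices of edge-sensor ports that no longer see floor."""
    return [p for p in ports if read_pin(p) != FLOOR_PRESENT]

# Simulated reads: port 3 is hanging over a stair edge.
reads = {0: 0, 1: 0, 2: 0, 3: 1, 4: 0}
print(edges_detected(reads.get))  # [3]
```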

Distance Sensors

TBD

Wiring Harness

TBD

Controller Board

TBD

Control System

Introduction

Sensorimotor Interface Controller

For this project, I am swapping out the original controller with something that can be customized to my needs. However, since I plan to use a neural network or some other CPU intensive approach for learning and adaptation, I want to create a standard interface to the sensors and motors that I know will be supported by any device I choose (e.g. laptop, Raspberry Pi, Edison).
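One way to keep the interface host-agnostic is a fixed binary sensor frame over serial, which any of those hosts can parse. The layout below is only a sketch of the idea - the field names, counts, and sizes are my assumptions, not a format the project has committed to.

```python
import struct

# One frame: 5 edge sensors (u16 each), 4 distance sensors (u16),
# bumper switch bits (u8), two wheel encoder counts (i32), little-endian.
FRAME = "<5H4HB2i"

def pack_sensor_frame(edges, dists, bumpers, enc_l, enc_r):
    """Serialize one sensor snapshot into a fixed-size byte string."""
    return struct.pack(FRAME, *edges, *dists, bumpers, enc_l, enc_r)

def unpack_sensor_frame(data):
    """Decode a frame back into named fields."""
    vals = struct.unpack(FRAME, data)
    return {"edges": vals[0:5], "dists": vals[5:9],
            "bumpers": vals[9], "encoders": vals[10:12]}

frame = pack_sensor_frame([1, 1, 1, 0, 1], [120, 80, 95, 200], 0b010, -42, 37)
print(unpack_sensor_frame(frame))
```

A fixed-size frame keeps the microcontroller side trivial (no parsing state machine) at the cost of sending every field every time.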

I am going to base my interface on the ever-capable Teensy 3.2. Why this device? I have it in a drawer (along with a bunch of others). More importantly, I want to try implementing the sensorimotor interface with the following embedded operating systems:

  • MBED OS
  • eChronos RTOS
  • FreeRTOS

In the case of MBED OS, there are a large number of supported platforms identified in the hardware section. Only one device in my drawer of microcontrollers is supported on that list - Teensy 3.1 (3.1 and 3.2 are essentially the same device). Indeed, it is only supported because of community efforts. A quick look around at the forums indicates that the more capable Teensy versions are not supported and will not likely attain that status in the future. So Teensy 3.2 it is!

In the case of eChronos, the requirement is that the microcontroller be an ARMv7m (ARM Cortex M3, M4, M7). The Teensy is based on the MK20DX - a Kinetis K20 Cortex-M4.

In the case of FreeRTOS, Cortex-M4 devices are supported as well.

MBED OS

Overview

There are a large number of embedded communities out there. Barriers to entry vary widely - although not as much as they did even ten years ago. If you are looking to get something up and running in a bare minimum of time then a good choice would be something like Arduino. It has a large community and there are plenty of examples to help guide the learning curve. But the single most important reason for its success is that the coding environment and build cycle are as close to "plug and play" as you can get. You download and install one executable, select a target board, select the "Hello World" example (blinking light), hit download and you've overcome the entry barrier. Coding in Arduino is not trivial but it is also not mind-bending and there are thousands of examples available to learn from.

Unfortunately, Arduino trades depth of control for breadth of access. If you need the features of an operating system (in particular multi-tasking) and you need real-time response, then off-the-shelf Arduino falls short. Note: if you are feeling like a jerk, copy this statement into a forum to trigger a holy war and watch the world burn.

There are a number of embedded operating systems that you can choose from. Most of these are cross-platform and require compiler tools to be downloaded and installed. In most cases, getting the compiled code onto your board is your problem. However, if you are a fan of ARM, then MBED is a good choice. The biggest advantage of MBED is that you can create and manage your project using just a browser - including compiling the code to a hex file (however, if you prefer, you can also download the compiler tools and execute them directly). Like Arduino, you can make a target selection in the browser window and have your code compiled directly for that device. And the device pages are well designed. Combined, these create inviting conditions for relative newbies.

Getting Started

Getting started with MBED OS requires you to sign up for an account at the MBED OS site. Once you have an account, your default interface will look similar to the one in the figure below.

Figure: MBED OS Intro

There are two places you want to visit from this page, as indicated in red in the figure. The first is to select your board and the second is to go to the compiler (if you plan to use the online version of the compiler, as I am).

Clicking on the "Hardware" dropdown and selecting "Boards" takes you to a location similar to the one in the figure below.

This page identifies all board types compatible with MBED OS. Scrolling down, I found the Teensy 3.1 board.

As I indicated above, it is a community contribution rather than a vendor manufactured board (denoted by CC).

Clicking on the Teensy board brings us to its product page as shown in the figure below. This page contains an overview of the platform and a schematic of the board. This schematic is critically important when coding - the MCU references in BLUE on the far left and right are the names you will use when you want to refer to an MCU pin.

The other item to note in the product page is the button that allows you to add it as a target to the online compiler (it is circled in red in the figure above). When you click this button, you get the confirmation shown in the figure below.

Now if you click on the "Compiler" button at the top, you will arrive at a coding interface that targets the Teensy by default. This coding environment is shown in the figure below. The current hardware target is identified in the top right corner.

If you click on the "New" button in the top right corner, you can add and work on a new program. You are asked for a target and program name in a new dialogue as shown in the figure below.

In this example, I named the new program "Myfirst" and when I clicked "OK", the compiler added it to the tree as shown in the figure below.

If you right click on the program name (Myfirst in this case), you are given a number of options as shown in the figure below. In order to get stuff done without re-inventing every wheel, it is helpful to have a library or two available to tap into. So we click "Import Library -> From Import Wizard".

Selecting "Import Library -> From Import Wizard" brings us to a library search page as shown in the figure below. Initially, the tab is empty. I typed the word "mbed" in the search window and the results shown appeared after a few seconds. After doing a bit of research on the options available, I determined that a very useful library would be the top one (the one which is also the most imported and which is authored by the MBED project as opposed to being a user contribution). However, as you can see, there are a large number of other libraries available which are somehow related to my initial search.

Double-clicking on "mbed" adds it to my project as shown in the figure below. Because I am not sure of what's available in the library, I spent some time browsing the classes.

If you right click on the program name again and select "New File", the new file window dialogue shown below opens up.

I gave my new file the name main.cpp. This file is added to Myfirst and a coding window opens as shown in the figure below. In theory, everything is ready to go!!!

Hello World Blinkenlight

In order to validate the code/compile/run/fix cycle, I created a simple "hello world" program as shown below. The objective here is to blink a light and send a message out a serial port.

A few notable things about this program:

  • In order to know how to manage pins - in particular analog in and digital in/out - I spent some time scrolling through the MBED library. The DigitalOut module requires a pin name (for program purposes) and a pin ID. The pin name I chose is "led". The pin ID is physically pin 15 of the Teensy board which is also connected to an actual onboard LED. It's also known as D13 of the MK20DX chip. However, to access it through MBED, its identifier is PTC5.
  • There are a number of serial ports on the Teensy. Once again, the Serial module requires a name and pin numbers for Tx and Rx, respectively. I chose the serial port attached to pins D0 and D1 of the chip and these correspond to PTB17 and PTB16 respectively.
  • The Serial class has a method called printf. I use this to print out the "Hello World" message.

If you hit the "Compile" button and it succeeds without error, you will receive feedback similar to the figure shown below.

Take note that when it finishes compiling, it downloads a HEX file for you. This is identified with the arrow. To move this hex file to the Teensy, you will need to get the Teensy Loader Application over at PJRC. Once you have it installed, you will need to point the loader at the HEX file, hit the load button on the Teensy, and then click on the download and reboot buttons in the loader. In my case, I had success.


Initial Sensorimotor Interface Code

The code below shows a polling loop I wrote to test signal connections and operation.

This code failed to operate. That is, the motors did not turn and the sensor lights did not activate. I dropped in some cookie crumbs (i.e. I put in a delay and activated the on-board LED). I eventually managed to make it partially work when I disabled activating PWM on PTC4. After toggling the PWM on that pin a few more times, I had a fortuitous accident. I had connected a two-line serial LCD to the default serial port on the Teensy for a previous project - which I never bothered to disconnect. I noticed that when I enabled PWM on PTC4, the LCD displayed a message: "Pinmap not found". After doing some googling, it appears that not all errors will be caught by the MBED compiler and, when this happens, the runtime will push a message out the serial port.

In this case, it appeared that the MBED library might be broken in some way. In order to confirm and fix a problem under the hood, I would need to get access to the sources. This page was useful in that regard. Instead of adding the compiled MBED library into the project, it recommends adding MBED-DEV, which provides all of the source.

Take note: MBED OS 2 is called MBED and its sources are called MBED-DEV. If you want to work with MBED OS 5, you need to install MBED-OS. Not confusing at all....

Fixing the problem required locating the files containing the pinmap. As shown in the figure below, I found a promising file in the library structure under Targets. Freescale makes the K20XX, which is the device on the Teensy. It turns out that there was a Teensy-specific target directory containing the file peripheralpins.c. The original PWM pin mapping within this file is also shown in the figure below.

Notice that PTC4 is not represented in the mapping. The pinout for the Teensy indicates that PWM5 is connected to PTC4.

Also note that the pin name PWM_5 is missing and that the comments for PTD5 and PTD6 map both ports to the same FTM and channel.

To figure out what required changing, it was necessary to dive into the pinout for the K20DX256 chip. I made a very focussed summary of pin-to-function mapping shown in the table below. Each pin can have "Alternative" functionality. In the peripheralpins.c file, the alternate function number is the last number in each mapping triplet.

I noted some confusion between the PWM naming convention in the MBED pin mapping figure and the PWM naming within the peripheralpins.c file. Specifically, peripheralpins.c refers to PWM using the convention PWM_nn whereas the mapping figure uses PWM nn (where it is assumed that the absence of a number is PWM 0).
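To make the triplet format concrete, here is a toy model of the PWM pinmap and the lookup that fails at runtime when an entry is missing. The peripheral/channel values are illustrative assumptions, not copied from the K20DX256 datasheet, and the real table lives in C, not Python.

```python
PWM_PINMAP = {
    # pin  : (FTM peripheral/channel, alternate-function number selecting it)
    "PTC1" : ("FTM0_CH0", 4),
    "PTC2" : ("FTM0_CH1", 4),
    "PTC4" : ("FTM0_CH3", 4),   # the kind of entry missing from the shipped file
}

def pwm_lookup(pin):
    """Mimic the runtime pinmap search: an absent entry fails at runtime,
    much like the "Pinmap not found" message seen on the serial port."""
    if pin not in PWM_PINMAP:
        raise LookupError(f"Pinmap not found: {pin}")
    return PWM_PINMAP[pin]

print(pwm_lookup("PTC4"))  # succeeds once the PTC4 entry exists
```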

Referring to the PeripheralNames.h file provides the actual mappings between ports, pins, and FTM channels. I took this file as ground truth. The mapping is shown below.

Using all of this information together, I was able to correct the peripheralpins.c file as follows:

I dropped the code into the board and it solved the problem.

eChronos RTOS

TBD

FreeRTOS

TBD



Adaptive Learning Control System

It is my intent in this section to revisit neural networks and "deep learning" in the context of robotics control systems and to apply these to DeepVac.

My preliminary research list is identified in the references below. It will grow as I become re-acquainted with the field. Rough notes shall appear below as I document my progress. Eventually, all of this will lead to the creation of sections on DeepVac's brain.

Notes

Tutorials and Instructables

  • Bcomposes - Simple end-to-end TensorFlow examples. Provides an end-to-end example of using TensorFlow and Python and makes reference to WildML. In addition, scikit-learn can be used to automatically generate linear, moon and saturn test data.
  • Pybrain - A modular machine learning library for Python.
  • Pygame - A library for making games in Python. Could be useful for illustration of learning agents.

References

  1. "The Slump and Resurgence of AI", New York Times Magazine.
  2. Comparison of "Deep Learning" Software.
  3. G. Lewis-Kraus, "The Great A.I. Awakening", New York Times, 14 Dec 2016.
  4. Y. LeCun, L. Bottou, Y. Bengio, P. Haffner, "Gradient-Based Learning Applied to Document Recognition", Proc. of the IEEE, November 1998.
  5. J. Schmidhuber, "Deep Learning in Neural Networks: An Overview", Neural Networks, Vol. 61, pp. 85-117, Elsevier, Jan 2015.
  6. Google TensorFlow.
  7. Steven Dufresne, "Introduction to TensorFlow".