The Ardberry Buggy

[Image: IMG_0396]

Back in the day when I was experimenting with the Lego NXT components in 2011 (part 1, part 2 and part 3), the intention was to use these components in an internet-controlled buggy that would serve a webpage and take commands over it. That was 3 years ago now, during which time what little work had been done on it – designing a PCB that ultimately wasn’t of much use after I realised that using an Arduino to host a dynamic site probably wasn’t the best of ideas, fiddling with a Raspberry Pi streaming a webcam and buying the Lego wheels in 2012 – has been sitting in a small metal box in my room just waiting to be built. Projects have come and gone, exams have been taken and secondary (high) school has been finished, yet that box remained (more or less) unopened as one of the few constants in the ever-changing life of a student about to go to university.
Today, dear readers, I would like to announce that this is no longer the case, and I present to you the result of three years of procrastination followed by two months of desperately working out how the simplest of web protocols work. I present: the Ardberry (Arduino-Raspberry Pi) Buggy!

[Image: Ardberry Buggy Schematic]

It is a web-enabled vehicle that can be commanded via a webpage served by the Raspberry Pi. The webpage also shows a live video feed from the Raspberry Pi camera, which is mounted on two servo motors, allowing 180 degrees of camera movement in each of two axes. The web interface also lets the user measure the distance to the closest object in front of the buggy using the NXT ultrasonic rangefinder sensor.

This post is an instruction manual for building the buggy, written for my own documentation purposes. You will need to be comfortable using SSH (or other means) to get the required files onto the Pi and be able to change Apache configuration settings from the command line. You will also have to be able to upload programs to an Arduino and have the necessary kit to wire up the power supplies and the servos in any way you see fit. Certain bits of knowledge are assumed, like what static IPs are, but anything you don’t understand can be gleaned from a few well-worded Google searches. I’m not spoon-feeding you here.

The hardware required is listed below and the necessary files are freely available on GitHub.

 

Electronic and Mechanical Hardware

  • Any mobile phone rechargeable external battery pack + micro USB cable
  • 9V battery + battery pack clips
  • Raspberry Pi 2 Model B
  • Raspberry Pi Camera module
  • Raspberry Pi WiPi wireless module
  • USB A/B cable
  • 100pF capacitor
  • Genuino (Arduino) UNO microcontroller
  • Lego NXT Shield (available here; EAGLE design files are available in the Git repository)
  • Lego NXT connector cables (available here)
  • Lego NXT Motors (x2)
  • Lego NXT ultrasonic rangefinder
  • Hextronic HXT900 servo motors (available here) (x2)
  • Lego wheels (x4)
  • Lego x-bar length 8

The Raspberry Pi

[Image: IMG_0398]

The Pi is the brains of the entire thing. It hosts the webpage, commands the Arduino (to which all the motors and sensors are connected) and streams live video from the camera. On it, you will need an Apache web server running with CGI installed and configured to use /var/www as the location of index.html for the webpage and /var/www/cgi-bin as the folder for the scripts. This article describes how to set up the CGI aspect nicely. I would recommend Raspbian Wheezy as the Linux distribution: most software is readily compatible with it, it comes with Python and it is easy to get up and running. In addition, you will need to set up networking using the WiPi module (and reconnection in case the Pi loses the signal for some reason). This is best left until the end so that you can more easily debug errors over SSH on a wired (and more reliable) Ethernet connection.
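
If you want to sanity-check the Apache + CGI setup before copying over the real scripts, a tiny test script like the sketch below will do. Note that the file name (hello.py) and the query parameter are made up for this example; the actual scripts used by the buggy live in the Git repository.

```python
#!/usr/bin/env python
# Hypothetical CGI test script (not part of the repo) to confirm Apache + CGI is working.
# Save it as /var/www/cgi-bin/hello.py, make it executable (chmod +x hello.py) and visit
# http://[IP here]/cgi-bin/hello.py?name=world in a browser.
import cgi

form = cgi.FieldStorage()              # parses the query string of the GET request
name = form.getfirst("name", "buggy")  # default used if no ?name=... was supplied

# A CGI response is just headers, a blank line, then the body, all written to stdout.
print("Content-Type: text/plain")
print("")
print("Hello, %s! CGI is working." % name)
```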

Upload the files in the “Raspberry_Pi_code” folder on the Git to /var/www.

Congratulations! You have a working webpage. Navigate to the Pi’s internal IP (http://[IP here]) and check whether you see something that resembles a control interface. If not, something has gone wrong.

You will also need to set up and configure the camera streaming software. Refer to this for details; note that you will also need to configure it to start automatically on boot. When prompted, install the files in /var/www/camera.

Your Network Configuration

For this part I cannot give you a step-by-step guide, as everyone’s home network is different. Essentially, you need to run ifconfig to find your Pi’s MAC address (the hardware address) and use that in your router configuration page (e.g. AirPort Utility for Apple routers) to assign a static internal IP address to your Pi. Then you need to set up port forwarding so that incoming connections on port 80 (the HTTP port) are forwarded to the Pi’s internal IP on port 80 (unless you have changed the default port in the Apache settings).

Next, you need to somehow get yourself a static external IP, or a dynamic DNS service that can be updated automatically by an app running on a computer (e.g. a script running on the Pi) or by the router itself. I used duiadns.net, as they allow for automatic IP updating using an Apple AirPort router. There are several other options that you may want to explore. Regardless of which you choose, bear in mind that the buggy itself can still only be driven within range of your home network’s WiFi.
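
If you go down the “script running on the Pi” route, the idea is just to hit your provider’s update URL periodically. The sketch below is purely illustrative: the update URL, hostname and key are hypothetical placeholders, so substitute whatever duiadns.net (or your chosen provider) actually documents.

```python
#!/usr/bin/env python
# Minimal dynamic DNS updater sketch. The URL below is a hypothetical placeholder -
# replace it with the update URL your DDNS provider gives you.
import time

try:
    from urllib.request import urlopen   # Python 3
except ImportError:
    from urllib2 import urlopen          # Python 2 (default on Raspbian Wheezy)

UPDATE_URL = "https://ddns.example.com/update?host=buggy.example.com&key=SECRET"

while True:
    try:
        reply = urlopen(UPDATE_URL, timeout=10).read()
        print("DDNS update response: %s" % reply)
    except Exception as err:
        print("DDNS update failed: %s" % err)
    time.sleep(600)  # check in again every ten minutes
```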

To access the webpage you can now type in the address you set up with duiadns.net. I used my domain’s DNS management to set up a CNAME record to point a subdomain of thecompblog.com to my duiadns address.

 

The Genuino (Arduino) UNO

[Image: IMG_0399]

This is quite simple: just upload the sketch in the “Arduino code” folder to the UNO! You will also have to plug the NXT Shield onto the board; this shield, if bought from TKJ Electronics, will need to be assembled. Before you upload the sketch, make sure to set the car diameter and the wheel diameter, which are used for the distance and turning calculations.
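
The actual calculations happen in the Arduino sketch, but the geometry they rest on is roughly the following. This is only a sketch in Python with constant names and example values of my own invention, and it assumes “car diameter” means the distance between the driven wheels; check the Arduino sketch for the real constants.

```python
import math

# Illustrative values only - set these to match your own buggy and the constants in the sketch.
WHEEL_DIAMETER_CM = 4.3   # diameter of the Lego wheels
CAR_DIAMETER_CM = 12.0    # distance between the driven wheels (the buggy's turning circle)

def motor_degrees_for_distance(distance_cm):
    """Degrees of wheel rotation needed to drive the buggy distance_cm in a straight line."""
    wheel_circumference_cm = math.pi * WHEEL_DIAMETER_CM
    return 360.0 * distance_cm / wheel_circumference_cm

def motor_degrees_for_turn(turn_degrees):
    """Degrees of wheel rotation (wheels driven in opposite directions) to spin the buggy on the spot."""
    # Each wheel travels along an arc of the circle whose diameter is the wheel separation.
    arc_length_cm = math.pi * CAR_DIAMETER_CM * (turn_degrees / 360.0)
    return motor_degrees_for_distance(arc_length_cm)

print(motor_degrees_for_distance(50))  # motor rotation for a 50 cm drive
print(motor_degrees_for_turn(90))      # motor rotation for a 90 degree turn on the spot
```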

The capacitor is connected to the 3V3 and RST (reset) pins. This ensures that when a new serial connection is opened by the script running on the Pi for every new command, the Arduino does not auto-reset and initialise everything again (and take ages to actuate any command). The UNO is connected via a USB A/B cable to the Pi. It is worth noting that any board with UNO shield compatibility (like the Leonardo) can be used in place of the UNO.

The servo motors are powered from the 5V and GND pins of the UNO – they draw too much current for the Pi to power them; running them both simultaneously from the Pi actually forces it to restart due to a lack of power. The servo signal pins are connected to pins A0 and A1.

 

The Camera

[Image: IMG_0402]

This is simply connected via the included ribbon cable to the Pi’s camera connector, the one closest to the Ethernet port, as in the photo. You should see a red light on the camera when the streaming software is active.

 

The Servo Assembly

[Image: IMG_0401]

You will need access to a 3D printer to print the servo motor holders in the Git repository, unless you come up with a better (and probably cooler) solution. The entire assembly relies on superglue to hold it together, which isn’t the most elegant solution but worked for the timeframe of this project. Quick tip: blu-tac is a great way to make sure that the motors fit snugly into their holders.

Set the servos to 90 degrees (by powering them up and driving them from an Arduino sending the servo command on a pin, or by some other means) so that everything is properly aligned when you mount the holders and glue them in place (90, 90 is the default position of the camera mount). Pop the camera into its holder and bingo, that’s the servo assembly complete!

 

The Web Interface

[Image: Screen Shot 2015-12-07 at 00.27.33]

As you can see, it is quite simple. The left-hand arrows control the buggy wheels and the right-hand arrows control the servo assembly. In the middle you can see the live stream, a timestamp so you know that the stream is still functional, the distance last read by the ultrasonic rangefinder, a status indicator and some buttons.

To update the rangefinder reading, click “get distance”. “Ping” checks that the serial line is still live. “Camera Reset” sets the servo assembly to its neutral position. The text boxes allow you to enter centimetre/degree values for moving the buggy should the arrows not provide enough flexibility.

 

Everything Else

[Image: IMG_0403]

Again, you will need a 3D printer to print out the remaining STL files. However, please be aware that my designs resulted in the buggy not being able to turn properly, as the wheels were too close together. This was partly down to the limitations of my 3D printer, which has a print area of only 140x140mm, so I had to stack the buggy into two layers and couldn’t move the wheels much further apart. The code on the Git reflects this limitation by staggering the turn moves rather than performing them in what should be one smooth motor movement. I have tried to fix this hardware fault in the design files on the Git.

Once the parts are printed, mount the NXT motors however you like. Again, because I apparently love shortcuts, I just used cable ties through the mounting holes on the motors and the 3D-printed lower chassis. Mount the ultrasonic rangefinder in the bracket so that it faces outwards. Then use some glue (again…) to secure the top chassis to the lower chassis, and use blu-tac (again) to secure the Raspberry Pi and UNO into place. Use (you guessed it) glue to connect the servo assembly to the top chassis. Blu-tac down a 9V battery next to the servo assembly and find a place for your 5V Pi power supply (maybe on the bottom chassis).

Mount the wheels and connect up everything. With that, you should have a functional buggy!

Clearly, I have glossed over the details of some steps that have been documented comprehensively elsewhere online.

 

The Software – An Explanation

[Image: Ardberry Buggy Schematic]

I cannot write good software. I am a self-taught programmer and I re-use bits of code from the internet that seem to do what I want without fully understanding what on earth is going on. The code I have written is likely inefficient (like sending strings down the serial line and getting the Arduino to do time-consuming string operations and string-to-int conversions, because I didn’t have the will to work out how to do it another way) and rather convoluted, but I have made every effort to document it well, in such a way that it can be modified for use in other configurations. The above schematic is hopefully fairly easy to understand, and below I have included some info on how it all works.

The buggy receives two numbers, the “command selector” (sel) and the “command parameter” (cmd). The first tells the buggy which action to perform; the second is the parameter for that action. It receives this data in the form of a string “x y”, where x is sel and y is cmd, separated by a space.
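
As a rough illustration of what the Pi-side script does with this protocol, here is a stripped-down sketch. The device path, baud rate and line ending are assumptions on my part, so check the actual script in the Git repository for the real values; the point is simply that each command is a short “sel cmd” string written down the serial line.

```python
import serial  # pyserial

def send_command(sel, cmd):
    """Send 'sel cmd' down the serial line (e.g. '1 50' = move forward 50 cm) and return the reply."""
    # Device path, baud rate and timeout here are placeholders - the real values live in the repo script.
    connection = serial.Serial("/dev/ttyACM0", 9600, timeout=5)
    try:
        connection.write(("%d %d\n" % (sel, cmd)).encode())
        return connection.readline().strip()
    finally:
        connection.close()

print(send_command(1, 50))  # move forward 50 cm
print(send_command(5, 0))   # read the ultrasonic rangefinder
```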

The front-end JavaScript sends GET requests to the script. For the commands that can be issued both by clicking on arrows and by entering text into the fields, the function that handles that event has two “modes”: 0 for the arrows (the default cmd values) and 1 for the text fields (the user-entered cmd values). Each button has an event handler associated with it to handle its case.
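
Because everything goes over plain GET requests, you can also poke the Pi-side script directly, which is handy for testing without the web page. The script name and query parameter names below are assumptions for the sake of the example; look in the cgi-bin folder of the repository for the real ones.

```python
# Mimic the kind of GET request the front-end JavaScript sends, from the command line.
try:
    from urllib.request import urlopen   # Python 3
except ImportError:
    from urllib2 import urlopen          # Python 2

# Hypothetical script name and parameter names - substitute your Pi's IP and the real script.
url = "http://192.168.0.42/cgi-bin/control.py?sel=1&cmd=50"  # move forward 50 cm
print(urlopen(url).read())
```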

I am very happy to receive emails asking for clarification if necessary!


 

Command Selector | Command Description | Command Parameter

1 | move forward | centimetres to move, up to threshold
2 | turn right | degrees to turn, up to threshold
3 | move backwards | centimetres to move, up to threshold
4 | turn left | degrees to turn, up to threshold
5 | get ultrasonic rangefinder data | None (0)
6 | move vertical servo | degrees to change position (±x)
7 | move horizontal servo | degrees to change position (±x)
8 | set servo position to neutral | None (0)
9 | ping serial line and toggle pin 13 LED | None (0)

 

Python Error Code | Problem

A | invalid selector number
B | parameter entered is out of the threshold range set in the Python script
C | an invalid (non-numeric) character was entered into the text field
D | serial connection timed out
E | serial connection could not be established

 

The Story

I came across these NXT components way back in 2011 and decided to do something with them. This was the year I was due to take my GCSEs, and alongside those exams there were other competitions and miscellaneous things to be done. That summer I was off at Stanford, and then A-Levels started, along with CanSat and a host of other responsibilities. Any progress I made on the Ardberry project would quickly be undone by a prolonged period of inactivity, so not much progress was actually made! Additionally, I was constantly modifying the hardware of the project. Initially I was going to host the site on the Arduino and use the Pi only for video streaming. I even designed and had made a custom PCB, effectively a clone of the NXT Shield for the Arduino MEGA, that would have allowed a WiFi shield to be used alongside it. Even though this is assembled and ready to go, I never used it, and I have since lost the modified NXT Shield libraries, so re-commissioning it would take some reverse engineering. Such is the case with badly documented electronics projects!

After I discovered that I would be taking a gap year, I decided to use the months I had before my travels to finally get this thing made. I took a quick online course in web development to refresh my rusty skills and, using Brackets, marked up a simple HTML/CSS website. Thoroughly pleased with myself for accomplishing this daft task, I began work on the Arduino code, which was done within a few days. As mentioned, the protocol I came up with is neither very efficient nor very expandable – it allows for 9 commands and no more at the moment – as it was the simplest thing for me to test at the time.

With that done I began work on the Python script for the Pi. I maybe should have used PHP or some other language, granted, but I wasn’t prepared to learn a new language (yet) for a task I knew could be accomplished with Python, which I was comfortable with. Again, Python isn’t the most efficient thing to use but it works. The next challenge that kept me up for days was trying to figure out how to pass data from the web interface to the script. I tried a variety of convoluted and silly ideas until I realised that it could be accomplished really easily with CGI libraries available for Python.

Then came the debugging. I was happy to find that Apache has quite a good habit of logging everything, including the error messages that the Python interpreter spits out. The developer tools in Chrome were also monumentally helpful in debugging the front-end JavaScript, and the Arduino code was debugged using the Arduino IDE’s serial monitor. Problems like hardware not handshaking properly, serial connections not opening, wonderfully vague “premature end of script headers” errors and a whole host of other similarly irritating issues plagued my thoughts at night until I had found a solution.

Then the 3D printing component of the project came along. I was forced to use my cheaper PLA, as the nicer stuff had run out and I wasn’t particularly inclined to spend £15 on another spool when I already had a 100m spool of the cheap stuff lying around. The usual issues with parts not sticking to the bed, warping and misaligned axes popped up, until all of a sudden there was an issue with the hot-end not heating up properly. There was a loose connection somewhere and, too lazy to investigate properly, I tried forcing the pin into the offending connector, which shorted 12V to 5V. A small “pop” later and it was perfectly evident that the ATMEGA chip had blown. As it turned out, it was cheaper to get a whole new controller board from AliExpress than to buy the ATMEGA by itself from Farnell along with the kit needed for a clean desoldering job. So a week later, after a very long night fiddling with irritating wires, I had a functioning 3D printer again!

I printed out the parts. To my dismay, the holes for the idle wheel axles were too small. I didn’t have access to any tools to enlarge them, nor the will to re-print a large part for a small issue, so I found another solution. I have a set of Allen keys, and one of them was large enough to force into the holes to widen them. After several minutes of twisting and pushing an Allen key through each plastic hole, it emerged with a good deal of plastic dust on the other end. It was tiring, but the axles now fit!

I cable-tied the motors down and superglued the necessary bits to other bits until I had the buggy! Some small stuff naturally still needed to be debugged, but now that is done, and at long last I can call the project complete.

Three years of procrastination and two months of learning a good deal later, it is finally over!
