Thursday, February 27, 2014

48. Robotics with PiBot. III - Running the Test Programs

12,000 page views!!
We're nearly at the point where we can start doing some exciting robotics!  There are currently four test programs for the PiBot - python programs which put the current system through its paces, to make sure all the hardware is working as it should. 

Firstly here is a listing of the firmware, which has to be running on the Bot Board's Arduino:

Notice that I have embedded these code listings from my Gist account on GitHub - GitHub is also where the PiBot Team publishes its open source software.  These are my versions of the PiBot Team's software, and hopefully the versions in future blog posts will be more advanced still.  Because the listings are embedded, any changes I make on GitHub are reflected here, so I must be careful to create new Gists for new versions.

The four python programs are:
1. drive_in_square.py,
2. monitor_ultrasonic.py,
3. robot_teleop.py and
4. test_pibot_hardware.py.
The code listings that I will give here contain some small adjustments which I made to suit my particular needs.

drive_in_square.py

This test program makes the PiBot drive in a straight line for a fixed time, make a right-angled turn, then another straight line and another turn, and so on - tracing out a square-shaped track.  The program continues indefinitely, so it has to be interrupted by doing a Ctrl-Z on the Pi's remote terminal on the PC, followed by touching together the two flying leads on the PiBot to reset the Arduino.

You've seen this program in action in the last post, but here's the version with a shortened square:

OK - it's not quite a square - when I changed the speed, I should have adjusted the delay times too - anyway you get the gist! (where have I heard that word before?).
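If the turn rate scales roughly linearly with motor speed (a fairly crude assumption for these little geared DC motors), the fix is just to scale the turn delay down by the same factor the speed went up.  The numbers below are made up for illustration:

# rough rule of thumb - assumes turn rate is proportional to motor speed
old_speed, old_turn_delay = 128, 0.8       # hypothetical values from the original run
new_speed = 255
new_turn_delay = old_turn_delay * old_speed / float(new_speed)
print("try a turn delay of about %.2f s" % new_turn_delay)   # roughly 0.40 s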

Here's the code:

Notice that the speed can be specified on the command line that runs the program.  Lines 11 and 12 pick up the argument if you supply one; otherwise a default value is assigned.  For example, if you enter the command:

python drive_in_square.py
the default value of speed set at line 9 will be used, while the command
python drive_in_square.py 255
passes 255 in as the command-line argument, and that value overrides the default at line 9.
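In other words, the argument handling amounts to something like this (a minimal sketch, not the exact listing - the default value here is just a placeholder):

import sys

DEFAULT_SPEED = 128                  # placeholder for whatever line 9 actually sets

if len(sys.argv) > 1:                # lines 11 and 12: use the argument if one is given...
    speed = int(sys.argv[1])
else:                                # ...otherwise fall back to the default
    speed = DEFAULT_SPEED

print("running with speed %d" % speed)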

monitor_ultrasonic.py
This program causes the ultrasonic transceiver to transmit and receive, so that the estimated distance to a solid object can be printed on the remote terminal.  The PiBot doesn't move.

Here it is running:


It may be a little difficult to see, but the screen gives a continuous readout of the distance from the ultrasound detector to a solid object - for example, my hand.
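The full listing is embedded below, but the idea is simple: the sensor measures how long an ultrasonic ping takes to bounce back, and half the round-trip time multiplied by the speed of sound gives the range.  Here's a minimal sketch of that conversion - get_echo_time_s() is a made-up stand-in for however the bot-command library actually returns the measurement (it may well hand back centimetres directly):

import time

SPEED_OF_SOUND_CM_PER_S = 34300              # in air, at about 20 degrees C

def get_echo_time_s():
    return 0.0058                            # placeholder: roughly a 1 m round trip

def echo_time_to_cm(echo_time_s):
    return echo_time_s * SPEED_OF_SOUND_CM_PER_S / 2.0   # halve the round trip

while True:
    print("distance: %.1f cm" % echo_time_to_cm(get_echo_time_s()))
    time.sleep(0.5)                          # roughly two readings per second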

Here's the code:


robot_teleop.py
The teleop program makes the PiBot respond to single-character keyboard commands - f for forward, b for backward, l for left turn, r for right turn and s for stop, each followed by Enter.

Here it is running:


It's not very slick, but remember that these are only test programs for checking that the hardware works properly.  The glamour comes later!
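The embedded Gist below has the real listing; the heart of such a teleop program is just a loop that reads a character and dispatches to a drive command.  Here's a rough sketch, with print statements standing in for the real bot-command motor calls (written for the Python 2 these scripts run under):

# the drive functions are hypothetical stand-ins for the real motor commands
def forward():  print("driving forward")
def backward(): print("driving backward")
def left():     print("turning left")
def right():    print("turning right")
def stop():     print("stopping")

commands = {"f": forward, "b": backward, "l": left, "r": right, "s": stop}

while True:
    key = raw_input("command (f/b/l/r/s, q to quit): ").strip().lower()
    if key == "q":
        stop()
        break
    action = commands.get(key)
    if action:
        action()
    else:
        print("unknown command - use f, b, l, r or s")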

Here's the code:


test_pibot_hardware.py
This program tests all the hardware connected to the PiBot.  The final kit parts - the panning stepper motor and the tilting servo for aiming the PiCam - have not been fitted yet, but the code for some of these functions is already there, ready for when they are connected.

Here it is running:


The sound at the start is of bongos, followed by a 'drum-roll', and at the end of the routine there are four sounds - a klaxon, a 'ding' and two cat meows.  The distance estimated by the ultrasound transducer is displayed on the computer terminal, as in the previous program, monitor_ultrasonic.py.  All the attached hardware appears to be working properly, including the 8-neopixel strip.

The stepper motor and servo software is included, but this hardware hasn't been attached yet.

Here's the code:

Notice that I am using a couple of mp3 sound bites downloaded from the internet, including one of a cat meowing, hoping to stimulate some interest in our cat - but it doesn't fool her; she just ignores it!
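If you fancy playing your own sound bites from a python script on the Pi, one common approach (not necessarily the one the PiBot code uses) is pygame's mixer - the filename here is just a placeholder for whatever mp3 you download:

import pygame

pygame.mixer.init()                          # set up the audio output
pygame.mixer.music.load("cat_meow.mp3")      # placeholder filename
pygame.mixer.music.play()
while pygame.mixer.music.get_busy():         # wait for the clip to finish
    pygame.time.wait(100)                    # check every 100 ms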

What's next - I hear you say - well, some tidying up of the above, and dreaming up what I could make it do etc.

Some observations so far - the 4 x AA rechargeable batteries work a treat, and for a surprisingly long time.  I have two sets of four 2400 mAh AAs, so one set can be charging while I'm running the PiBot on the other.  I haven't timed how long a charge lasts, but so far I have only had one episode (the Pi wouldn't boot) where the only remedy seemed to be to swap in fresh batteries, and that worked.

Presumably by the time I start running the PiCam, the system will start to eat up batteries.  I'm happy that the Pi I'm using is a Model A because the power consumption is less (1.5W) than that of the Model B (3.5W).
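For a very rough feel for the numbers (ignoring the motors, the regulator's losses and the WiFi dongle's bursts), a back-of-envelope estimate for the Pi alone goes like this:

# back-of-envelope battery-life estimate - very approximate
capacity_mah = 2400                          # one set of 4 x AA NiMH cells
pack_voltage = 4.8                           # 4 x 1.2 V nominal
pi_power_w = 1.5                             # Model A, roughly
energy_wh = capacity_mah / 1000.0 * pack_voltage     # about 11.5 Wh stored
print("roughly %.1f hours for the Pi alone" % (energy_wh / pi_power_w))

In practice the motors and the regulator will eat into that considerably, but it does suggest why a set of AAs lasts a surprisingly long time.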

Things I would like to do?

  1. I assume that I will be able to use the ultrasound distance system to prevent the PiBot from colliding with things - my SD card is getting an awful battering!
  2. Experimenting with combinations of colours of the neopixels.
  3. Of course, mounting the camera and transmitting images would be great. (ref HERE).
  4. I also have a successful XBee pair system working away (ref HERE), so I will try to think how that could be included in the PiBot.
  5. An IR receiver would be great, to smarten up the remote control. (ref HERE).
  6. I saw some code for a thermistor, and I would like to incorporate my previously-built thermopile system (ref HERE), which is still working away on my desk, telling me how cold a bag of ice is, or how hot my tea is.
  7. Another project which is working away on my desk very successfully, is the recently built light follower (ref HERE) - it works a treat and I would like to somehow incorporate that into the PiBot.
  8. I would like to put a powerful headlamp system on.  This could be self-powering like a high-power LED flashlight (or two?) (ref HERE).
  9. A talking system would be good.
  10. Get it to make my tea, wash the car.........

But first, I will try to fully understand and maybe explain, the current software.

Wednesday, February 19, 2014

47. Robotics with PiBot. II - Setting up the Software

11,000 page views!!
Here's what the hardware currently looks like:

It's powered up!  There are electrons flowing through its veins, but it's still asleep so it doesn't do anything yet.  It needs some software!  You may be able to see the blue LED lit on the Bot Board, the red PWR LED lit up on the Pi, and the flashing blue LED on the WiFi dongle.
It's on a raised platform just in case the wheels should start turning and it takes off!  The voltage regulator can be seen wrapped in green heat-shrink (I had to change the order of the regulated orange lead and the unregulated red lead before connecting to the Bot Board's power header). 

The wires to the neo-pixel header have been connected, and there are two flying leads - a purple and a green, which are attached to the two pins nearest to the GPIO pins on the Bot Board's 6-pin ISP header near the blue power LED.  These are for when the Bot Board's Arduino needs to be re-set - they just need to be momentarily touched together to do a micro-controller re-set.

The rainbow-coloured 5-wire strip is running from the Bot Board's Serial interface to a USB to CP2102 TTL UART Module which I purchased for £3.70 from ATelecs on ebay.  This module has a USB connection for inserting into the PC's USB socket.  

Be careful not to connect both the Pi's power supply (through the Pi's micro USB connector) and the UART module to the PC at the same time.  There are currently no batteries in the PiBot, and in any case its main switch is in the off position - but the battery pack is a third potential power source, so pay particular attention to which power sources are connected!

I thought I would start with a clean sheet - a new SD card, for the PiBot.  Actually, it's an 8 GB micro SD card slotted into a micro SD to SD adapter.  Here are the steps I took:
  1. Download SDFormatterv4 from https://www.sdcard.org/downloads/formatter_4/eula_windows/
  2. Unzip it, put the SD card (in its adapter) into the PC's USB/SD card reader, and run SDFormatterv4
  3. Download NOOBS from www.raspberrypi.org/downloads, unzip and copy the files onto the SD card.  The NOOBS_v1_3_4.zip was only 1.27GB - much more compact than the images I used to download.
  4. Put the SD card into a Model B Raspberry Pi (because it has an Ethernet connector), connect to the WiFi Router via Ethernet, and fire up the Pi
  5. Insert the WiFi dongle (the Edimax appears to be the most recommended)
  6. Start the WiFi Config program on the Pi's desktop, click on the Scan button, and double-click the required network from the listed networks which have been found.
  7. Choose TKIP encryption, enter the WiFi Wireless Key code into PSK, then Add - that's it!
  8. Move both the Edimax dongle and the SD card to the PiBot's Pi (mine has a Model A Pi).
  9. Download PuTTY (I used this before in an earlier post) to the PC
  10. Download TightVNCViewer to the PC (again I used this before in earlier posts)
  11. On the Pi, do a sudo apt-get update, then sudo apt-get install tightvncserver
  12. Run tightvncserver by typing vncserver :1 (I used the same command as before - vncserver :1 -geometry 800x600 -depth 24).  The -geometry 800x600 option sets the size of the desktop to be created (the default is 1024 x 768), and -depth 24 sets the pixel depth in bits (the default is 16; other possible values are 8, 15 and 24 - don't use anything else).
  13. Run PuTTY and TightVNCViewer on the PC (you can save the VNC session for later convenience) and hey presto - we have the PiBot's Pi desktop running on the PC, through WiFi!!
  14. Download the driver for the USB to CP2102 TTL UART Converter from http://www.geeetech.com/Documents/CP2102%20USB%20Driver.rar, unzip, and run the installer file.
  15. Download the Arduino sketch for the Bot Board's Arduino from https://github.com/pi-oneers/bot-firmware and upload to the Bot Board via the UART Converter.
     
    Okay -  this brings us to the point where we can communicate with the PiBot from the PC over the WiFi network.  But we still need to fill the PiBot's brain with thoughts and intentions which make it do some proper robot stuff!!
      Even before we do this, there's some further preparation to be done - the system needs to be configured:
       
  16. Enable the SPI interface on the Raspberry Pi:
    a.  first do a sudo raspi-config, choose Advanced Options and then choose Enable SPI interface by default
    b.  do a sudo nano /etc/modprobe.d/raspi-blacklist.conf to open that file for editing, and make sure that the spi-bcm2708 line is NOT blacklisted, i.e. that it is commented out with a '#' at the start
    c.  restart with a sudo reboot
    d.  check that the module has been loaded with a lsmod | grep spi_bcm2708
  17. Install the SPIdev python module (a quick check that SPI and the python bindings are working is sketched after this list):
    sudo apt-get install python-dev
    mkdir python-spi
    cd python-spi
    wget https://raw.github.com/doceme/py-spidev/master/setup.py
    wget https://raw.github.com/doceme/py-spidev/master/spidev_module.c
    sudo python setup.py install
  18. Download the PiBot code repository from Github
    The following command downloads (clones) the repository; running git pull inside the bot-command folder later will fetch updates:
    git clone https://github.com/pi-oneers/bot-command
  19. Test the hardware:
    There are some test programs in the python-spi/bot-command/examples folder:
    test_pibot_hardware.py,
    drive_in_square.py,
    monitor_ultrasonic.py and
    robot_teleop.py
    which should put the hardware through its paces.  These can be executed by using, for example:
    cd python-spi/bot-command/examples
    sudo python test_pibot_hardware.py

    or sudo python drive_in_square.py
    or sudo python monitor_ultrasonic.py
    or sudo python robot_teleop.py
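As mentioned at step 17, here's a quick, generic check (not part of the PiBot software) that the SPI driver and the spidev python bindings are both working - it opens bus 0, chip-select 0 and clocks out a dummy byte:

import spidev

spi = spidev.SpiDev()
spi.open(0, 0)                    # SPI bus 0, chip-select 0
reply = spi.xfer2([0x00])         # send one dummy byte, read one back
print("SPI transfer OK, got back: %s" % reply)
spi.close()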
Here's a shot of the PiBot on its pedestal, connected to the laptop, after uploading the Arduino code:

And here's a screen shot showing the headless Pi's desktop and the Arduino IDE with the Bot Board's sketch after loading.

And I got it to move!  Have a look at the following video - it's battery-powered, WiFi-enabled, Arduino-firmwared and controlled by python on the Raspberry Pi!


This python script - drive_in_square.py - simply sends the Bot around the sides of a square (for ever - or until the batteries run out, or, in this case, it collides with something solid!)  I also got the script monitor_ultrasonic.py to run successfully - it gives a continuous reading of distance from the ultrasound transducer to an obstacle, in centimeters.  There are a couple of other test scripts which I have to get going yet - but it's so exciting that the Bot has come to life!!

Next time I will try to explain the software and develop some more python programs.






Friday, February 7, 2014

46. Robotics with PiBot. I - Assembling the PiBot

10,000 page views!!
This is intended to be the first in a set of posts called Robotics with PiBot.  I have no idea how many of these posts there will be - or whether this one will also be the last, should I blow the whole thing up as soon as I switch it on!

The PiBot is a kit for a 2-wheels-at-the-front, one-ball-bearing-behind robot (a tri-bot?), with the Raspberry Pi as its brain.  The kit can be purchased from Team PiBot at http://pibot.org/, where all the current details about the PiBot can be found.  

This is, of course, open source hardware and software, and I get the feeling it will be a long-term project.  Included with the pieces for assembly of the structure of the PiBot is a Bot Board, which allows the PiBot's systems to interact with the Raspberry Pi (which is not supplied).

Hopefully, it will look something like this when it's finished:
The kit currently consists of: a chassis; the Bot Board interface mentioned above; an audio capability (there's a speaker - the round object at the front); independent left and right motors with gearing assemblies; a PiCam control system on the long arm at the top (Raspberry Pi camera not supplied), consisting of a stepper motor and a servo (so that's why I was doing all that stuff with motors!); an 8-neopixel LED strip (the one illustrated above shows a 16-neopixel ring surrounding the speaker instead); an ultrasound distance sensor (described as its 'eyes'); and of course the potential to add almost anything you can imagine!

This post has no circuits, no code, only pictures - but watch this space!

Below are some pictures of the kit and its assembly:
Here are the 8 parts of the kit - straight out of the box, before assembly.
Part 1 before assembly - you can see the speaker near the middle, and the red O-rings used for holding the plastic pieces of the chassis together.
Part 1 assembled
Part 2 before assembly.  The yellow items are the left and right DC motors.
After assembling part 2
Part 3 before assembly.  The small red board is the audio module.
Part 3 assembled and mounted on the PiBot - top view

Part 3 assembled and mounted on the PiBot - there are some 'spare' parts
Part 3 has been assembled and mounted on the other side of the PiBot.
Part 4 before assembly - 4 x AA battery holder and ball bearing castor to act as a back wheel.
Part 4 - the battery holder (sans batteries) and the ball bearing mounted. Underside view.
So that's what the spare parts - perspex plate and 2 O-rings - are for! (a lid to stop the battery holder falling out).
Top view
Top view.  Note the on-off switch at the bottom left.
Underside view.  Contact with the ground is at three points: each wheel, and the ball bearing castor.
Underside view.
Top view
Top view
Part 5 - the interface board (Bot Board) with 2 spare wires (I didn't need them).
Part 5 - underneath the Bot Board. Note the small gel foot at the bottom right, to provide a soft separation from the Pi's capacitor.
The Raspberry Pi (Model A) installed - it has to be a Rev 2 board with the mounting holes.  On the left you can see the SD card protruding from the Pi at the front of the PiBot: 
Top view - before the Bot Board (on the right) has been attached to the Pi.  The audio plug has been connected.
Top view - the Bot Board has been mounted on top of the Pi using the Pi's GPIO pins.
I didn't use my Model B as it is a Rev 1 which doesn't have mounting holes. I'm not sure if the Model A will have enough USB sockets etc - I might have to buy yet another Raspberry Pi!
Top view - with some of the wiring connected.
Top view
The total weight at this stage, without batteries and camera assembly etc is 361g.  

I was advised to skip the pan and tilt parts, so I skipped Parts 6 and 7 and went on to Part 8, the ultrasound transducer and neo-pixels:
Part 8. The ultrasound transducer (transmitter and detector) - the distance sensor (topside) - and the LED strip (neo-pixels). My strip goes at the rear rather than at the front.
Part 8. The ultrasound transducer (underside) and the underside of the LED strip (neo-pixels).
Again there was only one place to mount the ultrasound transducer, so this is what the project looks like now:
From the front.
From behind. This is the only place that I could find to place the neo-pixels.
Note that the ultrasound transducer board is upside-down compared with the first picture above, showing how it should look when finished.

But what's this:
This is the voltage regulator, but at the moment I don't know where it goes.  One other thing - the wires go in the order Black, Orange, Red, but apparently they should be Black, Red, Orange - so they may need to be swapped.

At this point I will pause while waiting for more details about what to do!!

To do:
Part 6. The stepper motor (to pan the PiCam)
Part 7. The servo (to tilt the PiCam)
Do come back later - there's lots more to come!

Tuesday, February 4, 2014

45. Improved Light Follower with Wide Field of View

I have made an improvement to the collimation system, which happens to be the same design that Geo Bruce had in his original instructables.com post at http://www.instructables.com/id/Arduino-Solar-Tracker/ !  My previous design of an 'eye' restricted the field seen, so the tracking system only made adjustments within a narrow field of view.  Geo's design exposes the light-dependent resistors (LDRs) to light from a much bigger solid angle, and therefore allows the tracker to 'watch' the light levels almost all around it.  (Strictly speaking, the whole detection system sees a solid angle of almost 2π steradians [a hemisphere], and since the collimator's leaves divide that hemisphere into four quadrants, each LDR sees about π/2 steradians.)  This makes the system much more sensitive to light variations in its vicinity.

Here is what the collimation system looks like now:





I made the collimator's leaves from some cardboard, hot-glued them to a black plastic cap, and screwed the cap onto the pan and tilt bracket.

The system is now very sensitive - a result of this new collimator, replacing the 10kΩ resistors by 1kΩ resistors in series with the LDRs, and fine adjustment of the two trimmer potentiometers, 'delay' and 'tolerance'.

Here's a video of it working - watch how it follows the beam of the torch - even in a brightly daylight-lit room.  It reacts when I shadow it with my hand.  It even twitches when a tall truck passes by!  I'm watching it slowly pan across the sky as the time passes.  I'm looking forward to seeing it track the sun across the sky (if the sun ever comes out!).


I have also tidied up the software, and here's the code:
// Originally written by Geo Bruce at:
// http://www.instructables.com/id/Arduino-Solar-Tracker/
// then slightly modified by S & S Feb 2014

#include <Servo.h>                     // include Servo library

Servo horizontal;                      // horizontal servo
int servoh = 90;                       // initialise the position of the horizontal servo

Servo vertical;                        // vertical servo
int servov = 90;                       // initialise the position of the vertical servo

int ldrlt = 0;                         // LDR left top = analog pin 0
int ldrrt = 1;                         // LDR right top = analog pin 1
int ldrlb = 2;                         // LDR left bottom = analog pin 2
int ldrrb = 3;                         // LDR right bottom = analog pin 3

void setup()
{
  Serial.begin(9600);
  horizontal.attach(9);                // servo connections - attach digital pins
  vertical.attach(10);
}

void loop()
{
  int lt = analogRead(ldrlt);          // left top
  int rt = analogRead(ldrrt);          // right top
  int lb = analogRead(ldrlb);          // left bottom
  int rb = analogRead(ldrrb);          // right bottom

  int dtime = analogRead(4) / 20;      // read delay potentiometer
  int tol = analogRead(5) / 4;         // read tolerance potentiometer

  int avt = (lt + rt) / 2;             // average value top
  int avd = (lb + rb) / 2;             // average value bottom
  int avl = (lt + lb) / 2;             // average value left
  int avr = (rt + rb) / 2;             // average value right

  int dvert = avt - avd;               // difference between top and bottom
  int dhoriz = avl - avr;              // difference between left and right

  if (-1 * tol > dvert || dvert > tol) // outside tolerance, so change the vertical angle
  {
    if (avt > avd)
    {
      servov++;
      if (servov > 180)
      {
        servov = 180;
      }
    }
    else if (avt < avd)
    {
      servov--;
      if (servov < 0)
      {
        servov = 0;
      }
    }
    vertical.write(servov);
  }

  if (-1 * tol > dhoriz || dhoriz > tol) // outside tolerance, so change the horizontal angle
  {
    if (avl > avr)
    {
      servoh--;
      if (servoh < 0)
      {
        servoh = 0;
      }
    }
    else if (avl < avr)
    {
      servoh++;
      if (servoh > 180)
      {
        servoh = 180;
      }
    }
    horizontal.write(servoh);
  }

  delay(dtime);                        // pause set by the delay potentiometer
}