NOTE: THIS IS A WORK IN PROGRESS, PUBLISHED HERE ONLY IN THE SPIRIT OF CROWD-SOURCED DEVELOPMENT SO OUR FELLOW STAKEHOLDERS CAN CONTRIBUTE TO A CURRICULUM "OF THE STUDENTS, BY THE STUDENTS, FOR THE STUDENTS!"
So you want to build robots?
Robotics is more than just a hobby, and more than just “the next big thing”. Robotics is a game changer for human civilization and for the natural world.
But you’ll face so many choices.
One of the things you are going to have to master is organizing information into the correct folders on your computer. Just as it is impossible to find things in a messy room, the computer won’t be able to do its work if everything isn’t put neatly into the drawers where it belongs. When installing software for robotics this is particularly important!
In this curriculum we favor the use of Ubuntu as our computer operating system. We use "Oneiric Ocelot," Ubuntu version 11.10. This makes a difference because each system has its own nuances, bugs and libraries, so for our instructions and experiences to be relevant to all, we recommend that all participants download and install the free Ubuntu 11.10 operating system.
Ubuntu is a Linux distribution, meaning that it is an operating system built around the Linux kernel which has been made to act and feel similar to the Windows and Macintosh operating environments.
Staying open source is important to us because it offers the possibility of universalizing robotics education. Virtually all computer systems can run a Linux distribution, and everybody who has access to a computer can afford an operating system like Ubuntu because it is free.
But installing most of the robotics packages developed for Linux can be daunting until you know how to do it. Ubuntu, like most Linux distributions, doesn’t come ready to help you install packages that come as source code. You need to prepare it.
The best tutorial that I have found is here:
Fortunately, most of the software we will use in this program can be installed automatically from the "Ubuntu Software Center" by simply clicking on "install". And all of it is free.
For those who want to go "under the hood" and try to install software from source in the terminal, keep the following advice in mind, gleaned from a website: “Unix/Linux command shells don’t search the current directory for commands unless the current directory is listed in the $PATH variable. So unless the command is a shell built-in like ‘cd’ or ‘type’, or the command’s location is listed in $PATH, the command shell won’t find the command, unless you specify the current directory by typing ‘./[command]’ at the command line.
The single dot stands for the current directory. So when you type “./cool_prog” the OS knows to run “cool_prog” from the current directory.”
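This $PATH lookup rule is easy to demonstrate in Python (the language we use later in the program): the standard-library function shutil.which performs the same search of the $PATH directories that the shell does. This is just an illustrative sketch; the file name cool_prog is taken from the quote above and created in a temporary directory for the demonstration.

```python
# Why the shell needs "./": command lookup searches only the directories
# listed in $PATH, never the current directory by default.
import os
import shutil
import tempfile

with tempfile.TemporaryDirectory() as d:
    prog = os.path.join(d, "cool_prog")
    with open(prog, "w") as f:
        f.write("#!/bin/sh\necho hello\n")
    os.chmod(prog, 0o755)  # mark the file as executable

    # Not found: the program's directory is not on $PATH
    print(shutil.which("cool_prog"))           # None

    # Found once its directory is included in the search path
    print(shutil.which("cool_prog", path=d))   # the full path to cool_prog
```

Typing “./cool_prog” in the shell is the manual version of that second lookup: you are supplying the directory yourself.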
Fortunately, most of you won't have to worry about these nuances for most of the program. Later on you may really desire to get deep into the command line interface. But don't sweat it if now isn't the time.
We will be using Arduino boards as our microcontrollers because they are “open-source” in both their architecture and programming, meaning that you can build your own from scratch and that the software is available for free. The Arduino programming language is a simplified variant of C/C++ that does much of the heavy lifting for you. There are many variants and clones of the Arduino. We will rely mainly on the Arduino Uno Rev 3 boards and the SparkFun Arduino Uno clones (until the Uno is obsolete, and then we will migrate up the path with them!)
We will also be using a program such as Minibloq to help you get into coding. Minibloq is a graphical development environment for Arduino and other platforms. Its main objective is to help in teaching programming, and it is especially used in robotics at the high school level.
Goal: Getting students confident to go under the hood.
We feel that too many robotics education programs, in an effort to make engineering “friendly” and “fun”, sugar-coat the curriculum by using very simple commercial interfaces which provide simple entry points into the world of controlling robots. This is laudable in some respects (low threshold), but many students come away with the impression that robots are toys and that there will always be user-friendly apps out there to do all the heavy lifting. Familiar with using gamepads and joysticks to control video games and radio-controlled vehicles, and now able to create simple “programs” to control easy-to-assemble robots using simple graphical user interfaces, many students may be attracted to robotics initially. But they may never feel the desire to peek, much less dive, “under the hood” and develop an interest in applying their intelligence to the underlying logical structures and creative processes that a successful career in applied robotics engineering will demand.
We want our program to be "low-threshold/high-ceiling", i.e. easy to get into but with the potential for unlimited expansion and wide vistas.
Our goals are two-pronged. On the one hand we want to invite young people who never thought they could develop an interest in computer science and robotics engineering to get excited about the field and get involved in some of the activities that may make them develop a more enduring interest. Therefore we want our program to provide for maritime robotics the same fun and functionality that LEGO and other commercial offerings do for terrestrial robotics. On the other hand we are designing our program to provide a gateway into the pipeline for a comprehensive understanding of STEM topics that can help build a new generation of scientist-engineers who are mission-ready for solving real-world problems. Since these problems cannot be solved by a mere cursory interest in robotics or an ability to create cool toys, we are seeking to create a curriculum that makes it not just easy to make robots, but easier to understand the tough subject material that normally drives kids away from robotics as a rigorous discipline.
Why do some students shy away from robotics?
Most of the intimidation and frustration when working with computers comes from the alien linear interface. We have to be taught languages, each with its own peculiar grammar and syntax, and we have to be taught how to read and write, channeling our brains into sequential and (in our case) left-to-right thinking. For organisms that evolved in a 3D non-linear environment this can feel unnatural.
Computers and robots, because they are digital and listen only to long strings of “on-off” switches, represented by 1s and 0s, make this problem acute. Nobody can speak “binary” so we have invented languages that interpret what we want and translate them into a very precise code that only machines can understand. In fact “under the hood” it is called “machine language”.
The most famous motto of computers and their physical counterparts - robots - is that “they always do precisely what you tell them to but rarely what you want”.
Our brains use sophisticated forms of ‘fuzzy logic’ and statistical inference to ‘guess’ at meaning and interpret what people say and figure out what they might mean (they are not always the same thing!). That is what makes us appear so “intelligent”. Artificial intelligence is unable to achieve such flexibility and nuance in inference (yet!) so unless you tell the computer or robot precisely what to do, without error, the confusion can cause the entire system to hang. An errant comma or the use of a semi-colon instead of a colon can screw things up, and correct spelling is essential.
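Python, the language we use later in the program, makes this strictness easy to see for yourself: its built-in compile() function rejects a statement the moment one character is out of place. This is purely an illustration; the variable names are made up.

```python
# One wrong character is enough to stop the machine cold: here a semicolon
# is used where Python requires a colon.
good = "if ready:\n    print('go')"
bad = "if ready;\n    print('go')"   # semicolon instead of colon

compile(good, "<demo>", "exec")       # compiles without complaint
try:
    compile(bad, "<demo>", "exec")
except SyntaxError as err:
    print("SyntaxError:", err.msg)    # the machine refuses to guess
```

A human reader would shrug off the stray semicolon; the compiler will not, which is exactly the inflexibility described above.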
Figuring out where things went wrong, i.e. “debugging” can be a demoralizing experience.
So much of what keeps students away from this exciting field can be the difficulty in learning to “talk” to computers.
I thought we were talking about robots, and now you are talking about computers. What are robots anyway, and how do they relate to my computer?
One thing we need to point out here before we go any further: for our purposes in this course, computers are robots and robots are computers. They both “compute”, i.e. engage in logical operations that are ultimately mathematics based and that turn inputs into outputs. And they both do “work” that most humans consider drudgery. (Remember that “robot” comes from robota, a Slavic word for drudgery or forced labor. And the Slavs didn’t care to be slaves... which may partially explain why a “server” these days is not a human servant but a computer-based robot that sends information to client computers; historically the Slavs didn’t like being serfs, led a revolution and well, ... you get the point!)
The fact that the output of most computers -- the work they do -- occurs on a screen or through audio speakers, while most of what we think of as “robots” include motion in their output, is trivial. The mechanics of moving objects in space do tend to complicate things, but moving a pixel or a cursor on a screen, moving air by vibrating a speaker, and moving a wheel or an “actuator” arm in our tangible time-space are all very similar in principle.
This implies that when you master the ability to create an effect on the computer screen you are most of the way there in terms of having the same effect in the real world. In other words, a simulation program running in a model of the world can often be a good predictor of what that program would do in the real world, though obviously much is left out of the virtual world that could affect your robot in the real world.
In both the simulation and in the real-world trial you are using electric signals to give commands to turn things on or off. But in the case of what we tend to call “robots” those things can be more than dots of colored light on a screen or the frequency and amplitude of vibrations in a speaker. They can include motors and servos (servos are basically motors that can only spin to a certain point). Once you understand this point it becomes clear that the drawer of your CD/DVD player, the laser diode that moves back and forth reading your DVD, and the inkjet printer head that moves back and forth over the blank page to create your term paper are all robots. And here is where the fun kicks in -- many robots can be constructed from parts already found in things like discarded DVD players, disk drives, printers and scanners. These were all robots themselves that the computer talked to and told what to do. And what do computers say to these “peripheral devices”?
Mostly computers tell motors and servos and relays “start spinning now” or “open now” and then “okay, stop spinning now” or “close now”. They can also, by reversing the polarity of the signal (in other words reversing the direction of the current), tell them “now reverse your direction of spin”. The motors themselves can spin wheels on surface vehicles, or spin propellers for boats and submarines, or move gears and control lever arms or wing-flaps and ailerons. The rotary motion of the motor or servo, or the creation and cessation of a magnetic field (such as is used to control relays), can move legs, fingers, cameras, eyeballs, you name it.
Basically any device that can be controlled by turning an electric current off and on can be controlled by a computer program and if it does some kind of work, saving us time and labor, it can be considered a “robot”.
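Those three messages -- start, stop, and “reverse your direction of spin” -- can be modeled in a few lines. The Python sketch below is hypothetical (it is not a real motor-driver API); it only illustrates that a motor's state boils down to an on/off flag plus a sign for the direction of the current.

```python
# A toy model of the commands a computer sends a motor: start, stop,
# and "reverse your direction of spin" by flipping the sign of the current.
class Motor:
    def __init__(self, name):
        self.name = name
        self.running = False
        self.direction = +1   # +1 = forward current, -1 = reversed polarity

    def start(self):
        self.running = True

    def stop(self):
        self.running = False

    def reverse(self):
        self.direction *= -1  # reversing polarity flips the spin

left = Motor("left propeller")
left.start()
left.reverse()
print(left.name, "running:", left.running, "direction:", left.direction)
# left propeller running: True direction: -1
```

Everything a robot does with wheels, propellers, or lever arms ultimately decomposes into sequences of these on/off/reverse signals.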
Do you need to program computers to make robots?
Well, no. The on-off switches and the logic used to make things move can be created by hand. In the old days of computing, people flipped huge numbers of switches by hand to turn electrical current on and off to tell the computer what to do. But electricity isn’t really necessary to make a robot. Technically, robots could be run by water wheels or old Spanish windmills or horses going around treadmills, or wind-up springs powering a series of mechanical switches and gates. As long as the logic of on-off or stop-start enables work to be done, one could technically call such a contraption a robot. Mechanical puppets and cuckoo clocks were kinds of robots.
Sea Perch is an example of an entry-level maritime robot that is controlled in real time by a human operator. Using a controller with simple on-off buttons (one controlling the forward motion and one the backward motion of each of three motors with a propeller), the Sea Perch can be steered in any direction. But this limits its options. And since in the case of an underwater robot it is difficult to get the on-off signals to the motors through water, the Sea Perch has to be “tethered”, i.e. it has to be connected by long wires (Sea Perch uses Ethernet cables because they are inexpensive, easy to find, and can carry many on-off signals; the three motors are controlled by six of the eight wires in the Ethernet cable, and two of them are not used).
For water-surface, terrestrial and aerial robots like boats, cars and airplanes or helicopters, getting the signal to the craft isn’t so hard because we can transmit on-off signals wirelessly through the air using infra-red light (IR), radio frequencies (RF), or even sound waves (sonar). We can use newer technologies like Bluetooth (short-range RF) and WiFi (longer-range RF) for protocols like TCP/IP, but the physics is essentially the same.
Remember that light, whether visible or infra-red, is a form of electromagnetic radiation, as are radio waves (on the long end of the spectrum) and X-rays (on the short end). They don’t penetrate water very well, but they travel through air fine, depending on the signal strength (a function of amplitude and frequency -- i.e. how energetic they are). So robots moving in air can be driven by a wireless remote control (think of radio-controlled toy cars, airplanes and helicopters). Alternately, you can use wireless controllers with on-off switches to control the movement of pixels on a computer screen or sound from speakers (think of a wireless Xbox, PS3 or Wii controller).
But what about when you want to talk to your robot when it goes out of range, or when you want a single button to control several motors to create one coordinated movement (for example to adjust a rudder and a propeller at the same time to keep a craft on course)? For that you need to chain commands together and store them so that you can make them play with the push of a single button or call (an analogy would be if you told a humanoid robot “walk!” and that single command coordinated the motion of both legs). This is where computer programming comes in.
Just like Hollywood when we want to tell an actor what to do, we write what we call a “script”. A script can move the body or control the voice of an actor in a certain way. A script can also move an ACTuatOR. An actuator is a type of actor; it is an object that acts according to the commands in the script.
We can write a script that tells a robot to “turn left” but this action could involve several motions. If, for example, the robot has two wheels (or two propellers in the case of a boat) we could say “the right propeller needs to turn on while the left one stays off” or we could say “the right motor turns on spinning forward and the left motor turns on spinning backward”. In Sea Perch this can be done by hand: one uses one’s right hand to push the right toggle to make the right motor spin forward and one’s left hand to push the left toggle in reverse to spin that motor backward. But if you wanted to make the robot go around in circles all day without having to keep your hands on the buttons, you would either need to make the switches stay pressed, or you could write a script telling one motor to spin forward and the other backward. But then you would need a way to store that script and run it.
The simplest way to do that is to use your computer. You would learn (and it isn’t that hard) to tell your computer in a language it understands that it should send the “on” signal for forward motion to one motor and an “on” signal for reverse motion to the other motor and save it in a script and then “run” or “execute” the script.
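Here is what “storing the script and running it” might look like in Python. Everything in this sketch is hypothetical -- the motor names and the send() stand-in (which just prints instead of signaling real hardware) -- but it shows the key idea: the commands become data you can save and replay with a single call.

```python
# The circle-turning script from the text, stored as data and replayed
# by a runner; send() is a stand-in for real motor signaling.
def send(motor, signal):
    print(f"{motor} <- {signal}")

circle_script = [
    ("right motor", "on, forward"),
    ("left motor", "on, reverse"),
]

def run(script):
    for motor, signal in script:
        send(motor, signal)

run(circle_script)
# right motor <- on, forward
# left motor <- on, reverse
```

Once the commands live in a stored script, “executing” the script is one action, no matter how many motors it coordinates.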
The way you might do it for the LEGO Mindstorms NXT controller is described as follows:
You could create two objects to start a driving robot (where the wheels are attached to ports B and C, and Motor B is called mB and Motor C is mC):
mB = NXTMotor('B', 'Power', 100);
mC = NXTMotor('C', 'Power', -100);
This script tells motor B to move forward at full power (100%) and motor C to move backward at full power (negative 100%). You can see how various combinations could result in different turning speeds, and that if mB and mC were both positive 25 the robot would move forward at a quarter speed.
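The “various combinations” can be summarized with a little arithmetic. The Python sketch below (not real NXT code; the function name is our own) uses the usual differential-drive picture: the average of the two power values gives forward speed, and half their difference gives turning rate.

```python
# Differential drive: two side-by-side wheels (or propellers) mix into
# forward motion and turning.
def mix(power_b, power_c):
    forward = (power_b + power_c) / 2  # average -> forward speed
    turn = (power_b - power_c) / 2     # difference -> turning rate
    return forward, turn

print(mix(100, -100))  # (0.0, 100.0)  spin in place
print(mix(25, 25))     # (25.0, 0.0)   quarter speed, straight ahead
print(mix(100, 50))    # (75.0, 25.0)  forward while turning
```

Equal powers drive straight, equal-and-opposite powers spin in place, and everything in between traces a curve.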
In our PORPOISE program we will be using Python, a higher-level language that is not so far from English that you wouldn’t understand it, but close enough to machine language that it can be translated efficiently. (It is almost impossible for human beings to write programs in machine language; people used to write in assembly code to get closer to an ability to read and write at that level.) To use an example from Wikipedia, in machine code an instruction might look like this:

10110000 01100001
Meanwhile, the same instruction in hexadecimal notation, which is easier for humans to read, would look like this:

B0 61
In assembly language the same instruction, commented, would look like this:
MOV AL, 61h ; Load AL with 97 decimal (61 hex)
And in plain English, like this:
“Move a copy of the following value into AL”
In between the assembly code and the plain English there are a whole bunch of “higher level languages”.
What computer programming language should I learn?
We are going to start with Arduino's variant of the C language because it gets us up and running fast and it is the one that will enable us to use the Arduino microcontrollers we are basing the curriculum on right out of the box. And let's face it -- serious stuff is done with C all over the world, so it is a great skill to have under your belt.
But we will also encourage explorations of a higher level and intuitive language called Python.
The best parts of Python from our perspective are that 1) it is open source and platform independent, meaning that you can run it on any computer using any operating system, and it is free, so everybody can afford it, and 2) it is “extensible”, meaning that it will work with and talk to the other major languages, like C, C++ and C# (which are standard but are considered awkward and hard to learn) and Java, which is used for a lot of web applications.
Python is a great bridge to learning other computer languages, but it is also a powerful language in its own right for serious programs (the open source 3D modeling program Blender uses Python as its scripting language).
Don't worry about that for now though -- as I said, we start with the simplified Arduino version of C, for which there are huge libraries and resources already written, so most of what you will be doing is changing parameters to suit your particular application, cobbling together already written modules, and hacking stuff to fit your task.
Ultimately any computer programming language can be used to run robots, and you will probably enjoy learning quite a few, just as it is fun to travel to different countries and experience the flavor of different human languages. And computer languages all get translated into the same binary code anyway!
Anything you write in a computer language that is not binary (1s and 0s, or “on-off” commands) has to be translated into 1s and 0s, and we call that act of translating “compiling” (or “interpreting”, when it happens line by line as the program runs). So we will be using translation programs called “compilers” to do that.
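You can even watch a translation happen. Python normally compiles source code to an intermediate “bytecode” rather than straight to machine code, but the standard-library dis module shows you the translated instructions, which makes the idea of compilation concrete (the function here is just a made-up example):

```python
# Peek at the instructions Python's compiler produces for a tiny function.
import dis

def blink(on):
    return not on

dis.dis(blink)  # prints the bytecode instruction listing for blink()
```

Every line of friendly, English-like source ends up as a handful of these low-level instructions before anything actually runs.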
But where should we “compile” the commands we want to give our robot? It would seem simplest to do it straight on your computer, but then you would need to bring your computer with you wherever you want to run your robot, and the robot has to be in range of the computer – either via a cable (like a USB cable or Ethernet cable) or WiFi, Bluetooth or IR.
Easier would be if you could put the computer on the robot, and many people do, but then you need a pretty big robot, and bigger usually means more expensive.
To get around this people have developed “micro-controllers” which are basically tiny computers. Because they are small they usually aren’t that powerful and can’t hold that many scripts or programs in their memory. But they don’t have to be very expensive either, and that is nice if you intend to put your computer on a robot vehicle that is in or on the water where there is a risk of water damage (or on a vehicle that could drive off a cliff or crash into a tree!).
The micro-controller we will be using in this program is called an Arduino board. The Arduino was invented in Italy and named after a bar in Ivrea (itself named after King Arduin); the name is sometimes said to derive from roots meaning “strong friend”, and you will find the board is exactly that because of how easy it makes robot programming.
The Arduino Uno, which is the one we recommend, can be purchased for about $30 on-line, but like all the software we use in this program, the Arduino hardware board is “open-source” meaning that the plans or “schematics” to build it from scratch are available for free on-line. Therefore, those who want to go further and learn to manufacture their own microcontrollers, often from spare parts and recovered junk, are welcome to do so.
(In the case of MIT’s MOOS-IvP surface vehicle robotics (IvP stands for “interval programming”; MOOS-IvP serves as the “helm” of the craft), they use a Clearpath Robotics Kingfisher (http://www.clearpathrobotics.com/kingfisher) controlled by a microcomputer called a Gumstix COM (“computer on module”, see http://www.gumstix.com/ ). These are fairly powerful, but also expensive, running about $115 for the entry module. But they enable MIT to do more elaborate things with their robots than can be accomplished with the basic Arduino; see http://188.8.131.52/moos-dawg11/pmwiki/pmwiki.php?n=Talk.28-Gariepy)
Arduinos are made to be extensible too, through add-on modules called “shields” so the capabilities of your robot can grow and grow. In addition, besides outputting the control signals to the motors or servos moving your robot, microcontrollers can have INPUTS so the world can “talk” to your robot.
We call the device that translates the input into electrical signals “sensors”. Of course you are already familiar with the concept: your mouse is an example of an input device that senses motion and sends direction information to the CPU (Central Processing Unit – a type of microcontroller on your computer motherboard!) that controls the movement of the cursor on the screen. A joystick is another input device that senses motion. A keyboard is an input device that senses pressure or “touch”. A webcam is an input device that senses light. All of these common input devices can be thought of as “sensors”. In a very real sense your computer is a robot!
To recap: the joystick and mouse are motion sensors, turning physical movement into electrical signals; a keyboard is a touch sensor; a camera is a light sensor. The Arduino board and other microcontrollers, like any computer, can be configured to input electrical signals as well as to output them.
To make things easiest we might recommend the Teagueduino board, which is basically an Arduino with easy plug-and-play jacks mounted on it. This is also the approach used by the LEGO NXT microcontroller (often called a “brick”), by VEX, and by many others. But those microcontrollers are very expensive and you can’t build them yourself, much less easily take them apart to modify them. The brick becomes effectively that – a “black box” whose insides are unknown to the student or hobbyist. Matthew Francis Landau, a robotics student from Los Angeles who has been building robots since he was twelve, said, “I wouldn’t put my LEGO NXT controller on a boat. I’d be afraid I would damage it. I want something cheap enough that I can afford to lose it but good enough that I wouldn’t want to.” Arduinos serve that function, and Teagueduino has all the ease of the LEGO brick. It can also be self-built from free plans available online.
PROGRAMMING YOUR MICROCONTROLLER
LEGO sells an expensive proprietary programming environment for Mindstorms that is based on LabVIEW. It is a visual programming language (VPL) that lets students program by moving graphical icons around on a computer screen and connecting them with virtual wires. It is fairly intuitive and lets you compile the code and then send it by USB to the microcontroller. The Mindstorms VPL, like other VPLs, is considered very simple for basic robotics tasks. The problem is that it is expensive and it doesn’t show you how the neat picture-like (icon) programming blocks translate into a computer code language. When you need to get under the hood and debug, you can find it very frustrating.
With Arduino you have more options. The code itself is written in a free software environment that compiles it and sends it to the microcontroller. The language Arduino uses is a variant of C that is considered simpler than standard C, but you can also program the board in standard C/C++, and you can talk to it from languages like Python, so it gives maximum flexibility. Also, because it is open source and there are a huge number of robot enthusiasts using Arduino boards (including the Sea Perch group at MIT’s Sea Grant division), you can find free example code for almost everything you want to do; you can cut and paste and mash up code snippets to control your robot without writing any code, while at the same time learning how to program by observing what those blocks of code do.
Furthermore, the open source world has created its own VPL for the Arduino so that one can have the best of both worlds – a simple graphical programming interface like LEGO’s and an easy “under the hood” environment for trouble shooting and debugging.
But given that a code that can turn on and off motors can also turn on and off pixels on a screen and thereby simulate movement you may be tempted to ask “is there a way to test my program on a virtual robot on a computer screen before testing the program on my real robot and risking that it crash or sink because I sent it into a tailspin?”
The answer is yes.
This concept, now available for LEGO at additional cost, is also available in the now free Microsoft Robotics Developer Studio (MSRDS). In MSRDS you can use a VPL to program and compile, then test out your program on a virtual robot and then, when you are satisfied, compile and download it to a real physical robot. And with a free module written by Korean university students at helloapps.com, you can “look under the hood” and program with line code too. Helloapps even lets you choose languages like Python and Java. So the combination of MSRDS and Helloapps SPL (Simulation Programming Language) would seem ideal. But there are two problems with it. The first is that you can’t model your own virtual robot in Microsoft’s environment, so you are stuck running what the commercial vendors have supplied (there is a virtual LEGO NXT robot in there, for example). The second is that the program, which Microsoft started making available for free in 2008 to encourage more users, only runs on Windows, and the new version that allows the Kinect sensor from Xbox to be used only runs on Windows 7. Microsoft’s strategy of giving MSRDS away for free is a great way to get more people to buy Windows and develop Microsoft-based robots, but it doesn’t help schools and students on tight budgets or those that use different platforms.
We recommend the use of the open source operating system Ubuntu (or another Linux distribution; MIT uses Linux for all its robots), open-source-architecture microcontrollers like Arduino, also used by MIT, and open source programming languages like Python. Is there an open-source equivalent to MSRDS for doing simulation?
Thanks to a project out of the LAAS laboratory in France, there is. It is called MORSE, the “Modular OpenRobots Simulation Engine”. It uses Python as its scripting language and for its simulations is integrated with “Blender”, a very powerful open source 3D modeling program and game engine that is scriptable in Python. Because it is coupled with Blender, you can prototype your own virtual robot (Blender enables mesh creation) or download and insert any other 3D object (boats, cars, planes, trains, etc.; there are thousands of them in free 3D libraries on the internet), place sensors on them, and operate your robot in the virtual world of your choosing (MORSE even comes with a simulated marine surface environment).
To make a roboboat you don’t need to start from scratch; for example, you could download a hull or boat model that is somewhat like what you have in mind and then “mod” the mesh in Blender. In this way you can learn the principles of mesh modeling.
WHY DO WE HAVE TO LEARN THIS?
I know what some of you may be thinking: “Why do we gotta learn this?” or “What am I ever going to do with this? How is this useful in my life?”. Often it isn’t enough to say “one day you can use all these skills and knowledge to get a job”. With the world changing as it is, few of us feel that we can afford to wait that long, or that there is any guarantee that today’s skills will lead to a high-paying job – or any job – tomorrow. Particularly when it comes to robotics, we realize that what we are learning and creating may actually put many people OUT OF JOBS. And isn’t that the point? When Czech author Karel Čapek coined the term “robot” in his 1920 play R.U.R. (Rossum’s Universal Robots), the world was eager for machines that could replace human drudgery. Throughout the industrial revolution, machine technology did take much of the human burden away from lifting and pushing and digging, and increasingly removed the operator from hazardous materials and dangerous situations.
But those machines, whether tractors, trucks, airplanes, weapons systems or submarines, still had to be operated by a human being. Humans had to control the steering wheel of a car, truck, airplane or boat, or the knobs and levers of a mill or digger or camera, and while there is considerable expertise and art to these skills at times, many of them are tiring and operator exhaustion can be dangerous.
Certainly many machines can now be run from a distance by remote control, removing the operator from many risks (crashing, drowning, being exposed to toxins). But more and more robotics is moving toward true “auto-mobiles” or “self-moving” machines.
We call them Autonomous Vehicles, or Unmanned Vehicle Systems – the “UVS” in “AUVSI”, the Association for Unmanned Vehicle Systems International, an organization championing today’s and tomorrow’s robotics education. Naturally, the more autonomous we make our machines, the more unmanned they become, whether they are vehicles, call centers or supermarket check-out lines. More and more cameras adjust their f-stops, shutter speeds and focus automatically, and this too is a kind of robotics.
In fact there are very few repetitive tasks in society, very few fabrication tasks, and even fewer design tasks that are not being roboticized. What you and your friends need to prepare for is a future where human creativity and compassion become the only tradeable commodities that the robots you help create won’t replace. With this in mind, the study of robotics becomes ever more urgent. American society alone needs at least 400,000 new mechatronics engineers in the next half decade to solve urgent problems and keep our economies going. But the need for other jobs will keep decreasing at the same time.
Human workers will most likely end up being robot designers, prototype builders, programmers, operators and technicians; when it comes to production runs robots will mostly be making themselves.
Robots are now what we call “embedded devices” – they aren’t clunky humanoids from some science fiction film. Rather than build a robot that walks like a person to push a broom or mop or vacuum, the iRobot Roomba concept was simply to make an autonomously moving vacuum. Rather than build a robot secretary or maid like in the Jetsons, your Android phone app does your scheduling and organizing while Dragon Voice Type takes memos so you don’t have to tire your fingers on a keyboard. Robot surgical arms and lasers are doing the jobs that doctors once did, with much greater safety and precision. All of these trends will continue, with training in the control and use of robots being the paramount human endeavor.
This program is designed to give you an easy and fun – and relevant – entry into the world of robotics. By using only open source operating systems, software and architecture and emphasizing off-the-shelf common components (what the 4-H Robotics Program calls “Junk Drawer Robotics”) we ensure that nobody is left behind. The curriculum is designed around a basic PORPOISE kit that we’ve assembled to make your job easier (no need to hunt around for parts or software) but everything in it can also be sourced separately (some from stuff you might recycle).
The basic kit enables the creation of a roboboat made from many of the same common materials found in the Sea Perch kit – ½” PVC plumbing tubes, foam flotation, motors and propellers, wax and old film canisters, PCB circuit boards, boxes and electronic components – and it can be built with most of the same tools as Sea Perch. What differs is that the PORPOISE concept turns these materials and a few others into a boat rather than a sub (although you can mod them into anything you like!), and the Arduino board that we recommend can be used to operate the boat autonomously – i.e. without human operational labor. Our idea is that you start out with 4 sensors – much like the LEGO Mindstorms kit model, but using sensors relevant to maritime missions:
1) An RGB/IR light sensor and emitter
2) An ultrasonic distance sensor. (IR doesn't work well in bright sunlight, as anyone who has tried to fly a small RC helicopter outdoors can attest. The ultrasonic ranger emits 40 kHz chirps and receives the echoes to compute distances between 3 cm and 3 meters.)
3) A navigational compass
4) A touch sensor for prow collision detection
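Converting the ultrasonic ranger's echo timing into a distance is simple arithmetic that you will use in almost every program. Here is a minimal sketch in Python (the function names are ours, purely for illustration; on a real Arduino you would do the same arithmetic in the sketch's C code after timing the echo pulse):

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 degrees C

def echo_to_distance_cm(round_trip_s):
    """Convert a round-trip echo time (seconds) to a one-way distance in cm."""
    one_way_m = round_trip_s * SPEED_OF_SOUND_M_S / 2.0  # echo travels out and back
    return one_way_m * 100.0

def in_sensor_range(distance_cm):
    """The ranger described above resolves roughly 3 cm to 3 m."""
    return 3.0 <= distance_cm <= 300.0
```

For example, an echo that returns after 1 millisecond corresponds to a distance of about 17 cm, comfortably inside the sensor's range.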
We also suggest using two motor/propeller units for propulsion and steering; the same kinds of motors can also be used to control lever arms and grips.
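With two fixed propellers, steering is usually done by differential thrust: spin one propeller faster than the other and the boat turns. A sketch of the mixing arithmetic (our own illustration, not code supplied with the kit):

```python
def mix_thrust(throttle, steering):
    """Mix throttle and steering (each -1.0 to 1.0) into (left, right) motor commands.

    Positive steering turns the boat to starboard (right).
    """
    left = throttle + steering
    right = throttle - steering
    # If either command exceeds full power, scale both down so the
    # turning ratio is preserved while staying within motor limits.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale
```

Full throttle with no steering drives both motors at full power; full throttle with full right steering drives only the left motor, pivoting the boat.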
One can go further and add an Arduino shield with an LCD screen to display the status of uploaded programs. When mounted in the supplied waterproof housing, the resulting box has all the functionality of a LEGO kit, but you will have built it yourselves and understood how it is put together. As SERVO magazine author Anthony Cherone stated in the 08-2011 issue, “The potential for expansion is infinite, creating opportunities beyond immediate imagination.”
Your curriculum will contain instructions for adding several more sensors that are not included in the base kit (to keep the price down) but which can be added at any time. One of these is a GPS sensor ($60) that tells the operator where the robot is; the curriculum includes instructions for uploading your experiments to the MIT Sea Perch/Google Digital Ocean Project.
Another is a ping sensor that can be positioned behind the robot to detect objects approaching from the rear, useful in some competitions. Another is a vision system that uses a Logitech USB webcam for real-time views of the surroundings and for communications, which can be sent to remote laptops as live webcam feeds. This requires, however, a more powerful microcontroller; the curriculum explains how to replace the $30 Arduino with a $60 32-bit Netduino board (but bear in mind that the Netduino can currently be programmed only in the Microsoft .NET environment).
Because of the important mission of marine robotics in environmental monitoring, instructions will also be given on how to use temperature, humidity and pressure sensors, as well as sonar depth sensing for ocean terrain imaging (the moon has been more extensively mapped than our underwater world) and for indicating underwater features and objects such as schools of fish, rocks, reefs and shipwrecks.
Besides the pushbutton touch sensor provided, instructions will be provided to help you use accelerometers (which can be used not only to measure acceleration but to determine which hull designs are more efficient), pressure transducers, flexible resistors, vibration sensors, and even sensitive limit switches made with piano-wire “cat whiskers”, enabling your roboboat to be more intimately aware of its environment and report back to you so you can modify code on the fly to save your mission.
It was this kind of sensor reporting that enabled Rutgers University's RU27 (“Scarlet”) to autonomously cross the Atlantic Ocean in 2009, with only minor code adjustments during the mission and only one intervention, in the Azores, to clean the craft of barnacles and fouling. While the kit we provide doesn't give you those capabilities out of the box, we want you to be thinking outside of the box, and this curriculum will give you suggestions and example code to help you develop a robotic marine craft with real possibilities.
Even the possibility of controlling your roboboat with simple voice commands (start, stop, left, right), via microphone input to ADC ports with a preamp and amplifier, processed by a microcontroller and DSP, will be covered. Processing of visual information (video digitization, image processing), like that being done for the MIT Kingfisher surface navigation project, is also covered, though this is an advanced area that the kit cannot accommodate.
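Once the DSP stage has classified a spoken word, the remaining software is straightforward: map each recognized command to a change in the motor state. A hypothetical dispatcher (the command names match the four above; the state dictionary is our own illustration):

```python
def dispatch(word, state):
    """Return a new motor-state dict updated by a recognized command word.

    Unrecognized words leave the state unchanged, so noise is harmless.
    """
    commands = {
        "start": {"throttle": 1.0},
        "stop": {"throttle": 0.0, "steering": 0.0},
        "left": {"steering": -1.0},
        "right": {"steering": 1.0},
    }
    if word not in commands:
        return state  # ignore anything the recognizer mis-hears
    updated = dict(state)  # copy, so the old state is not mutated
    updated.update(commands[word])
    return updated
```

A control loop would call `dispatch` on each recognized word and feed the resulting throttle and steering values to the motors.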
Your curriculum will explain where many of the sensors needed to enhance your robot's capabilities can be found in everyday life, and how some can even be salvaged. Relays, actuators, servos, solenoids, accelerometers and motors can be pulled out of old cars (accelerometers, solenoids), printers and scanners (servos, motors) and satellite dishes (actuators).
Novel and simple propulsion techniques will also be discussed (solenoids controlling air balloons, chemical reactions, etc.) so that the mechatronics you are learning can be put to practical use at many scales and on many budgets.
What about skills becoming obsolete? “Things are moving so fast – if I learn these programs, will I still have a job?” One of the reasons for using open-source hardware and software, and staying away from “black box” proprietary technologies, is precisely so that you can continue to evolve with the art and science of this technology without incurring extra cost.
Some young people around the world have already taken their love of robotics engineering and developed ideas to share so that more people can get into this exciting field. We highlight their work in this curriculum.
One of these young people is South African Luke Taylor who, while in the 9th grade, was a finalist at the 2011 Google Science Fair. Luke wrote:
“Programming robots can be slow and challenging. In trying to assist a beginner with the NXT Mindstorms set, I asked myself whether it would not be possible to design an application that could translate English instructions directly into compilable code that the robot could execute. If successful, it should be of great assistance to those struggling with existing graphic and text-based programming languages. It could possibly also be of use to those wanting to spend less time on writing computer code. With this in mind, I embarked on some research, with the aim of designing a tool that could help the robot understand commands written in natural human language. To prove that it was possible to develop such functionality, I decided to limit design and testing to a prototype robot called Tribot and use only a basic set of instructions (including default values and a limited number of variables). The resulting application called SIMPLE not only manages to analyse and translate English sentences into C-code and compile and download these, but it also assists users via prompts that request required information to program the robot effectively.”
Another Google Science Fair finalist whose work we feature is Skanda Koppula, a sophomore at North Allegheny Intermediate School, who used an Arduino board to create a surface craft capable of hydrographic underwater mapping. In his words:
I used the following materials in my research:
- Arduino Duemilanove
- 1 USB Cable
- Sensors/Data Collection Devices:
- Ultrasound Transducer Rangefinder
- GPS Arduino Module
- Bluetooth™ Gold-mate Module
- Vessel Mobility Devices:
- Two 3 Volt DC Motors
- Hi-Tec Full Rotation Servo Motor
- General Electronics Materials:
- Solder-less Breadboard
- Lots and lots of wires...
- Two 1.5 ohm resistors
- General Construction Materials:
- Foam board
- Acrylic Sheet
- Multiple Lego Technic Parts
- Balsa Wood
- Laptop with the Processing IDE
- Specifications: Windows 7, 2 GB, 1.86 GHz dual-core
Skanda modeled his surface craft using Google SketchUp, as shown here. He used the open-source programming language Processing to communicate with the Arduino board.
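Whatever language runs on the laptop side, the pattern is the same: the Arduino streams sensor readings as text lines over the USB serial link, and the host program's main job is to parse them. A sketch of that parsing step in Python (the `lat,lon,depth_cm` line format is our invention for illustration, not Skanda's actual protocol):

```python
def parse_telemetry(line):
    """Parse a comma-separated 'lat,lon,depth_cm' line read from the serial port.

    Returns a dict of floats, or None if the line is malformed
    (serial links routinely deliver truncated or garbled lines).
    """
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None
    try:
        lat, lon, depth_cm = (float(p) for p in parts)
    except ValueError:
        return None
    return {"lat": lat, "lon": lon, "depth_cm": depth_cm}
```

Rejecting bad lines instead of crashing matters in practice: a mapping run can last hours, and one corrupted line should not end the mission log.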
Note to teachers: As the makers of Pyro state, “a robot experimenter need only concentrate on the behavior-level details of the robot. Robots may feature a disparate set of sensors and movements, and yet, depending on the specific robots or simulators used, Pyro provides a uniform way of accessing those features without getting bogged down by low-level details.”
“Pyro has been used in many different courses at both the undergraduate and graduate level. Courses include
- Androids: Design & Practice (Bryn Mawr College)
- Artificial Intelligence (Bryn Mawr College, Swarthmore College)
- Cognitive Science (Bryn Mawr College)
- Developmental Robotics (Bryn Mawr College, Swarthmore College)
- Mobile Robotics (UMass Lowell, graduate)
- Robotics (UMass Lowell, undergraduate)”
Morphic Tile Scripts
Séverin, from Planète Sciences in France, recommended that we use the same software they use with kids aged 10 to 16, along with the Arduino board:
“At Planete Sciences, we mainly use (for kids between 10 and 16) Squeak + the Arduino board (check for instance Physical Etoys, here: http://tecnodacta.com.ar/gira/projects/physical-etoys/). Not only these projects are open source and platform independants, but Squeak/Etoys was build from the beginning with a strong educational vision (check http://en.wikipedia.org/wiki/Constructionist_learning for instance). I've a lot of resources readily available in French for robotic workshops. I wanted to translate it to English since a while, but it would be quite a task. If you can at least read French, I'd be more than happy to send you support material.”
There is also an Arduino simulator called Virtual Breadboard that costs $60.
It enables students to test code before building the circuit around the Arduino board and risking components or time spent soldering. It has merit, but the price may deter teachers from adopting it, and it runs only on Windows.
In PORPOISE we have adopted the philosophy behind the MORSE project: “The point when we want to simplify things, is that we often have to limit the freedom of the user. The complexity in simplification is to let users be free enough to make what they want without asking them to know every concept behind our technologies...”
There will always be students who find robotics and its associated core areas (mechatronics, programming) intimidating, alienating or simply uninteresting. We feel, however, that this should not keep them from being involved in the robotics education experience, because one can never predict how people's minds change after exposure. To ensure that we bring a wider variety of students to the table, PORPOISE proposes that part of the curriculum involve having members of the robotics team serve as “scribes” and “documentary videographers”. By having “photojournalists” and “film-makers” on the robotics team, more students can take an active role and interest.

Our experiences teaching science through video production in the inner-city schools of Los Angeles in the 1990s, and in an environmental science center in Egypt in the 2000s, showed that when students are tasked with documenting a process they train themselves to pay deep attention to what is going on; they actively listen, ask questions and become enthusiastic about the topic they are covering. In writing the script, doing the camera work and doing the editing (even in composing music for a video), they quickly master many of the conceptual areas of the project.
Because video production software can be prohibitively expensive, PORPOISE, which intends to keep costs as low as possible, has been working with an open-source, Linux-based video production program called OpenShot. In many regards it is as easy to use as Adobe Premiere or Final Cut Pro (though it doesn't show audio waveforms yet), and it offers animated titling and a host of transitions. It also outputs to web-ready formats such as those used by YouTube.
Robot Simulation Software
AUV Workbench - Freeware simulation software designed for expensive AUV platforms, but easy to use, with nice graphics. Developed by the Center for AUV Research at the Naval Postgraduate School (NPS).
Player/Stage - Freeware robot simulation software, but not user friendly.
Microsoft Robotics Studio Visual Simulation Environment - Less-than-user-friendly robot simulation software built on physics technology acquired from AGEIA.
PHUN - 2D Physics Sandbox - Freeware, a 2D physics simulator that is *very* simple to learn and can even simulate water.
3D Graphic Robot Simulation - A graphical Java-based 3D robot arm simulator.
DMOZ Open Directory list - list of robot simulators
Simbad - A Java 3D robot simulator for scientific and educational purposes. It is mainly aimed at researchers and programmers who want a simple basis for studying Situated Artificial Intelligence, Machine Learning, and more generally AI algorithms, in the context of Autonomous Robotics and Autonomous Agents.
Webots - very expensive professional robot simulation software
(In California we are working this semester, in the alpha-testing phase, with Washington Preparatory High School (with the students of Principal Dr. Todd Ullah, technology coordinator Chris Brandon and parent sponsor Roy Harper) and with Venice High School (with the students of Mr. Azadi, sponsor of the after-school Venice High Robotics Club, and Ms. Annette Mercer, the parent sponsor of the club). Venice also has an SOC (Servicemembers Opportunity Colleges) program, and parties interested in STEM education through hands-on activities should also explore working with SOCNAV, which gives credit for non-traditional learning.)