Robotics and Mechatronics
In grad school, I started out going to lectures about robots and demos with robots. Then I decided I wanted to make some robots. Like any engineering project, I thought it was best to start simple and then keep building on what I know. I know I like to code in C++ and Python, I know some electronics and machine learning and want to learn more, and I love making puppets. So why not mash them all together?
First I needed to brush up on my mechatronics skills. I worked my way through an intro Arduino kit, and reminded myself of what I learned in electronics lab. Although I didn’t make up these projects, I’m posting them here because in doing them, I learned how to put some cool pieces together and then started applying the same concepts to more advanced projects.
Basic Arduino learning projects
… and then came Raspberry Pi
Arduinos are great for timing and pretty simple to use, but I wanted to try Python projects with optical character recognition, text-to-speech, and speech recognition. So I dove into the world of Raspberry Pi. Here is my first robot with a Raspberry Pi. It is very simple and made out of cardboard. The mouth moves with a servo motor. I used a Raspberry Pi 3 B+ and programmed in Python to get the mouth to move, the eyes to light up, and the robot to talk. I chose the 3 B+ based on other users' recommendations and because it has wifi and most of the other functionality of a laptop or desktop computer.

Most of my struggles getting this project up and running came from the fact that the Raspberry Pi's processor and functionality are still very different from what I was used to on a desktop. Even something as basic as installing Python packages from conda channels doesn't work on the Pi: Anaconda isn't supported. I found workarounds for this and other issues, but I kept having to re-learn that different processors have different constraints, and you have to know the constraints in order to make the product you want.
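For anyone curious how the servo mouth works: a hobby servo listens to a roughly 50 Hz PWM signal, and the pulse width (typically about 1–2 ms) sets the angle. Here's a minimal sketch of the angle-to-duty-cycle math; the 1 ms and 2 ms endpoints are typical assumptions and need tuning for a particular servo, and the RPi.GPIO call in the comment is just illustrative:

```python
# Map a servo angle (0-180 degrees) to a PWM duty-cycle percentage.
# Assumes a typical hobby servo: 50 Hz PWM, ~1 ms pulse at 0 degrees
# and ~2 ms pulse at 180 degrees. Tune these endpoints for your servo.

PWM_FREQ_HZ = 50       # standard hobby-servo frame rate
MIN_PULSE_MS = 1.0     # pulse width at 0 degrees (assumed)
MAX_PULSE_MS = 2.0     # pulse width at 180 degrees (assumed)

def angle_to_duty(angle):
    """Convert an angle in degrees to a duty-cycle percentage."""
    angle = max(0.0, min(180.0, angle))  # clamp to the servo's range
    pulse_ms = MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle / 180.0
    period_ms = 1000.0 / PWM_FREQ_HZ     # 20 ms per frame at 50 Hz
    return 100.0 * pulse_ms / period_ms

# On the Pi itself, this duty cycle would be fed to a PWM pin, e.g.:
#   pwm = GPIO.PWM(servo_pin, PWM_FREQ_HZ)
#   pwm.start(angle_to_duty(90))
```

Opening and closing the mouth is then just alternating between two angles with a short sleep in between.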
Just for fun, the robot introduces itself in three languages: English, Swahili, and Spanish. eSpeak supports many languages, including these, though the voice sounds pretty similar in all of the languages I tried. Originally, I wanted the robot to ask the user who they are and have the user respond verbally. However, after much consternation with ALSA, I realized that the Raspberry Pi does not easily support audio input: it has no sound card for recording, and the audio jack is output-only, for speakers and headphones. So I changed the input: the user types their name on the keyboard. This works fine for a basic project. In fact, this project is so basic that a Raspberry Pi 3 B+ was overkill, and on the next project I tried an even more basic board: the Raspberry Pi Zero. I planned to experiment with a USB microphone and/or a microphone connected directly to the GPIO pins.
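The multilingual speech is simple to wire up because eSpeak is a command-line tool: Python just builds a command and shells out to it. A small sketch of the command builder (the voice codes `en`, `sw`, and `es` come from eSpeak's voice list; the speed value is a made-up default):

```python
# Build an espeak command line for a given voice code.
# eSpeak selects languages with -v ("en" English, "sw" Swahili,
# "es" Spanish) and speaking speed in words per minute with -s.
def espeak_cmd(text, voice="en", speed=140):
    return ["espeak", "-v", voice, "-s", str(speed), text]

# On the Pi this list would be handed to subprocess, e.g.:
#   subprocess.run(espeak_cmd("Hello, my name is Robot.", "en"))
```

Keeping the command as a list (rather than one big string) avoids shell-quoting problems when the spoken text contains apostrophes or spaces.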
Challenges with Raspberry Pi Zero
I tried using the Pi Zero W for my next project because it is so cheap and small. The problem is it doesn’t have an audio jack. I thought I would be able to easily surmount this by wiring up some speakers to a circuit and making sound using pulse width modulation. A variety of instructions make this look easy. How hard could it be to follow directions?
Well, I got a circuit to play very nice sound through speakers, but I could not get it to work from the Raspberry Pi, despite a variety of forays into the audio configuration files to force the GPIO pins to do PWM. The code seemed fine, and still no sound.
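For reference, this is the kind of configuration I was attempting. The Pi's firmware can reroute its PWM audio to GPIO pins via a device-tree overlay in /boot/config.txt, with a low-pass filter circuit on those pins feeding the speaker. The overlay name below is the one documented in the Raspberry Pi firmware's overlay README; many older tutorials use a different overlay (`pwm-2chan` with hand-written parameters), and that mismatch may be part of why the forays failed:

```
# /boot/config.txt -- route PWM audio out to GPIO 12 and 13
# (overlay name per the Raspberry Pi firmware overlay README;
#  older guides instead use pwm-2chan with custom parameters)
dtoverlay=audremap,pins_12_13
```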
Blubbo with Raspberry Pi 3B+
After spending quite a long time wondering how to get sound to work correctly with the Pi Zero, I went back to the 3B+ to focus on the interesting part of getting Blubbo to talk, sing, fart, and blink his eyes. Blubbo is a farting fish who likes music. He is a character that my Dad invented for bedtime stories when my sister and I were children. Since Dad turned 65 this year, I decided to make a physical Blubbo who would attend Dad’s birthday party.
The specifications I set were: a fish puppet named Blubbo that talks, sings, and occasionally farts. Talking and singing are both coupled with mouth movement. Talking happens when the Blubbo program starts and when it ends. Singing is initiated by user input through capacitive sensors on the fins, and farting is coupled with crazy eye blinking and occurs at random times.
I am still finalizing a program that properly waits for user input between songs; for the moment, a scaled-down version can greet, sing, and then say goodbye (as seen in the video below). Blubbo has a repertoire of several songs, as well as farts and eye movements.
As part of this project I learned more about multithreading, and about the importance of making (and following) an overall system architecture diagram and thinking through what can cause a wait or an interrupt.
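The core of that architecture is one background thread that farts at random intervals while the main thread works through the songs. Here's a sketch of the pattern, with print statements standing in for the real servo, LED, and audio calls (the function names and timings are placeholders, not the actual Blubbo code):

```python
import random
import threading
import time

stop_event = threading.Event()  # lets the main thread shut everything down

def fart_loop():
    # Background thread: fart (with crazy eye blinking) at random
    # intervals until the main thread signals shutdown.
    while True:
        # wait() doubles as an interruptible sleep: it returns True
        # immediately if stop_event gets set while we're waiting.
        if stop_event.wait(timeout=random.uniform(0.1, 0.3)):
            break
        print("fart + crazy eye blink")  # stand-in for servo/LED/audio calls

def sing(song):
    print(f"singing {song} with mouth movement")  # stand-in for the real routine

farter = threading.Thread(target=fart_loop, daemon=True)
farter.start()

for song in ["song A", "song B"]:  # placeholder repertoire
    sing(song)
    time.sleep(0.2)                # stand-in for the song's duration

stop_event.set()                   # tell the fart thread to finish
farter.join()
print("goodbye")
```

Using an Event instead of a plain flag means the fart thread wakes up immediately at shutdown rather than sleeping out its full random interval.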
I also had an inordinate amount of trouble with the RGB LEDs: it turned out that the two common-anode RGB LEDs for the two eyes had their green and blue legs in a different order, despite appearing identical in every other way! I learned the importance of checking the spec sheet closely, but also of checking parts individually rather than assuming the spec sheet is accurate. In retrospect, it would have been easier to use three separate LEDs, but at the time color mixing with RGB LEDs seemed both fun and easy.
My next steps in this project are to get the capacitive sensors to trigger a song if Blubbo isn't already singing, to finalize the farts occurring at random times throughout the program, to give the user a way to exit the program before timeout, and to couple the mouth movement with the beat of the song. Once I'm done with these tasks, which are all coding challenges at this point, I'll put the speakers, Raspberry Pi, and a power bank back inside Blubbo and control him remotely from a laptop via SSH. (I did this when he attended the birthday party, but the video above depicts the Pi outside of his body and connected to peripherals for fine-tuning the program.)
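The mouth-to-beat coupling is simple in principle: given a song's tempo, compute the times at which the mouth should open and sleep between them. A sketch of the timing math (the BPM and duration in the example are made up, not tied to any particular song):

```python
def beat_times(bpm, duration_s):
    """Return the times in seconds of each beat over duration_s."""
    interval = 60.0 / bpm  # seconds between beats
    times = []
    t = 0.0
    while t < duration_s:
        times.append(t)
        t += interval
    return times

# e.g. a 10-second clip at 120 BPM has a beat every 0.5 seconds;
# the mouth servo would open at each of these times while the song plays.
```

In the real program, each of these times would become a sleep followed by a servo open/close, run in its own thread alongside the audio playback.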
Of course a fish puppet needs friends, and the next project is already in the works.