Pneumatic Robotic Arm Workshop

This pneumatic robotic arm workshop is designed to introduce basic concepts of robotics and making to grade-school students. The design is based on ones used in middle school and high school robotics competitions. We have created a simplified version for one-time workshops with kids of all ages.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

The two main science concepts are:

  • Leverage: A lever is a simple machine consisting of a bar that pivots on a fixed point (the fulcrum). Levers are used to amplify input force. The robotic arm requires placing the syringes in positions that exploit leverage. You can find out more here.
  • Pneumatics: Pneumatic power uses compressed air as an energy source. The basic components of a pneumatic engine are a reservoir, pump, valve, and cylinder. In this workshop the syringe is the pneumatic engine. Pneumatic power is widely used in robotics and industry. Here is a link for other project ideas. (A worked example of both concepts follows below.)
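Here is a small Python sketch that puts numbers on both concepts. All of the dimensions are made-up examples, not measurements from our build, so treat it as a teaching aid rather than a spec.

```python
# Rough force estimates for a syringe-driven arm (all dimensions are
# hypothetical examples, not measurements from our build).
import math

def plunger_area(diameter_mm):
    return math.pi * (diameter_mm / 2) ** 2

# Pneumatics: two connected syringes share the same air pressure,
# so output force scales with the ratio of the plunger areas.
input_force_n = 10.0             # push from the student's thumb
drive_area = plunger_area(15.0)  # hand-held syringe plunger
load_area = plunger_area(20.0)   # plunger of the syringe on the arm
syringe_force_n = input_force_n * load_area / drive_area
print(f"Force at the arm syringe: {syringe_force_n:.1f} N")

# Leverage: mounting the syringe close to the pivot trades force for a
# larger range of movement (force at tip = force x effort arm / load arm).
effort_arm_cm = 4.0              # pivot to syringe attachment
load_arm_cm = 12.0               # pivot to the tip of the arm
tip_force_n = syringe_force_n * effort_arm_cm / load_arm_cm
print(f"Force at the arm tip: {tip_force_n:.1f} N")
```

Playing with the numbers shows the trade-off students discover by hand: a larger load syringe or a longer effort arm moves the arm with less effort, but the tip pushes with less force the farther it is from the pivot.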

Required Supplies:

Each student will require:

  1. 4 syringes
  2. 2 four-inch pieces of tubing
  3. 1 4×4 piece of wood
  4. 5 popsicle sticks
  5. 2 nuts and bolts

The photo below shows the full-scale model used in high school competitions. It requires 2-3 students to control. One of the Hip Monster sisters built it at a Sacred Heart Robotics Camp in San Francisco, CA.

Here is a side view with the arm down.

The competition involves stacking blocks, and the score is based on the time it takes to move all the blocks and the height of the stack. Controlling the arm is a true team effort, with 2-3 students working together to move it. The winning design requires not only good engineering but perfect teamwork. Engineering competitions are ideal ways for kids to develop technical as well as social skills. Below is a video of the arm in action:

<video of it working>

For our grade school workshop we chose a smaller and simpler design that only requires one student to control the robotic arm. You can still have a team competition with two students per robot (one controlling each syringe) if desired.

Below are several views of our simplified design. Instead of zip ties we use rubber bands and tape.

Here is a view from above. This design does not use hot glue and is suited for all ages.

This is another design suited for more advanced students.

Here is a side view showing the placement of the syringe in the middle of the base to provide better range of movement.

In the video below, one of the Hip Monster sisters' teams does a quick build of an arm.


YouTube player

Here are the step-by-step instructions:

  • Drill a hole in the center of the square plywood which will be the base for your robotic arm.
  • Now push a bolt through the hole and secure it using a nut. The bolt will be the support for your arm.
  • Drill a hole in one end of four popsicle sticks.
  • Place two of the drilled popsicle sticks on either side of the bolt with the holes at the top.
  • Secure them using rubber bands, making sure the sticks can still pivot.
  • Secure a syringe to a popsicle stick. This popsicle stick provides leverage, helping move the arm.
  • Use rubber bands instead of tape or glue; rubber bands let the mechanism flex as the pump extends, pushing the arm.
  • Attach the tubing and connect another syringe.
  • Adjust the two syringes so that when you depress one, the other extends.
  • Attach one end to the popsicle stick using a rubber band.
  • Next, secure the other end to the edge of the base using tape.
  • Slowly depress the syringe pump and your arm will move!
  • Now attach two popsicle sticks to the top of the arm.
  • Secure with a bolt and nut.
  • Secure the syringe pump to the forearm with rubber bands.
  • Now attach the syringe base to the arm using tape.
  • Connect the other syringe.
Now your pneumatic robotic arm is complete!

To improve performance you can turn your pneumatic robot into a hydraulic-powered one by just adding water! You can get more information here.

YouTube player

Happy Creating! 

Updates to RobotFreedom.AI

Since our last update at Maker Faire, we’ve made significant improvements to our robot lineup, focusing on increasing autonomy, human interaction, and emotional expression. Core to this is our AI framework, RobotFreedom.AI, now available on GitHub.

The design principles for our autonomous AIs are that each one:

  1. Runs completely locally on a RaspberryPi (no calls out to third-party APIs)
  2. Has a distinctive personality based on the Big Five personality traits
  3. Learns from interactions (initiated when in sleep mode)
  4. Makes informed decisions based on a knowledge graph
  5. Can carry on a conversation with the other robots and with a human

To achieve this we used an agentic AI framework rather than just tapping directly into a chat bot. By having a core AI designed to meet our specific goals, we had more control over its behavior and could directly see how the AI worked in real time, which proved to be a great educational tool.
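As a rough illustration of what we mean by an agentic loop, here is a minimal Python sketch. The names and thresholds are hypothetical, not the actual RobotFreedom.AI code (which is on GitHub), but the shape of the loop is the same: sense, decide, act.

```python
# Minimal sense-decide-act sketch of an agentic robot AI (illustrative only).
import random
import time

PERSONALITY = {"openness": 0.8, "conscientiousness": 0.6,
               "extraversion": 0.9, "agreeableness": 0.7, "neuroticism": 0.2}

def sense():
    # Stand-in for real sensor reads (microphone, range finder, touch).
    return random.choice(["speech", "motion", "touch", None])

def decide(stimulus, personality):
    # The real system consults a knowledge graph; a lookup table stands in here.
    if stimulus is None:
        return "idle"
    if stimulus == "speech" and personality["extraversion"] > 0.5:
        return "reply"
    return "observe"

while True:
    print("robot action:", decide(sense(), PERSONALITY))
    time.sleep(1.0)
```

Because every decision passes through one explicit loop, you can watch the AI work in real time, which is exactly what makes it a good teaching tool.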

One of the key upgrades is the addition of machine learning algorithms, which enable the robots to learn from their interactions and quickly adapt to new situations. This allows them to become even more autonomous in their decision-making, making them more efficient and effective in completing tasks.

We’ve also made notable strides in expanding the robots’ interactive capabilities, incorporating features such as voice recognition, gesture control, and tactile feedback. These enhancements enable users to engage with the robots on a deeper level, fostering a more immersive and engaging experience.

Some of the specific updates include:

  • Advanced sensor arrays for improved navigation and obstacle detection
  • Enhanced machine learning algorithms for adaptive decision-making
  • Voice recognition and speech-to-text capabilities
  • Tactile feedback mechanisms for haptic interaction

These updates have significantly enhanced the robots’ autonomy, interactivity, and overall user experience. We’re excited to continue refining our designs and pushing the boundaries of what’s possible with robotics and AI.

We have been busy working on our next release of our robot software platform.

Major features:

  • Robots can coordinate actions using web socket communication (sketched below)
  • Dedicated http server for each robot
  • Added Piper TTS for voice
  • Added Vosk for speech recognition
  • Added Ollama and LangChain for the chat bot
  • Improved random movement generator
  • Tons of bug fixes
  • Improved debug mode
  • Low memory mode for RaspberryPis 1-3
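For the curious, here is a hedged sketch of what the web socket coordination can look like. It uses the third-party websockets package, and the port and message format are placeholders rather than our actual protocol.

```python
# Each robot runs a small web socket server; peers connect and exchange
# one-line action messages. Illustrative only (pip install websockets).
import asyncio
import websockets

async def handler(ws, path=None):  # older websockets versions also pass a path
    async for message in ws:
        print("peer says:", message)
        await ws.send("ack: " + message)  # acknowledge so peers stay in step

async def main():
    async with websockets.serve(handler, "0.0.0.0", 8765):
        await asyncio.Future()  # serve forever

asyncio.run(main())
```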

Tested on OSX and RaspberryPi 1-5.

You can see our Robotic AI platform in action here.

Happy creating!

Arduino Robotic Controller Software Update

The RobotFreedom robots are controlled by two code bases. The first runs on a RaspberryPi and is written in Python. You can read more about it here. The second code base controls the movements and lights on the robot. It is written in C and runs on an Arduino. This article will get you started on developing with that code base. You can download it from GitHub.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

The movement controller is designed to be light and simple compared to the main AI code base, making it ideal for a beginner. The focus is to provide an interface for a human or AI to control a robot’s moving components (arms, legs, and wheels). We use an Arduino Mega board because it has plenty of digital pins to attach our components to. Below is an image of an Arduino Mega board.

Arduinos can be controlled via serial communication through a USB port, or you can code them to run independently. Our robotic walkers are controlled only by an Arduino. This project is intended to be controlled by an AI installed on a RaspberryPi.

The purpose of the program is to receive incoming messages and perform the requested action. For example, ‘a’ means raise the left arm. When the program receives an ‘a’ command, it sends a command to an H-bridge, which then sends power to a linear actuator to raise the left arm.
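On the RaspberryPi side, sending those commands takes only a few lines. Below is a hedged Python sketch using pyserial; the port name and baud rate are common defaults, so match them to your own setup and sketch.

```python
# Send one-character commands to the Arduino over USB serial.
# Requires pyserial (pip install pyserial); port and baud are examples.
import time
import serial

arduino = serial.Serial("/dev/ttyACM0", 9600, timeout=1)
time.sleep(2)        # opening the port resets the Arduino; give it a moment

arduino.write(b"a")  # 'a' asks the controller to raise the left arm
print("Arduino replied:", arduino.readline().decode(errors="ignore").strip())
arduino.close()
```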

To start, install the Arduino IDE on your preferred development OS. Linux, OSX, and Windows are supported. You can get the code here.

Next, download the required libraries and copy them to your development folder.

Adafruit_BusIO
Adafruit_I2CDevice
Adafruit-GFX-Library
FastLED

Launch the Arduino IDE and from the menu bar select:
Sketch>Include Library>Add Zip Library…

Then point to one of the zip files you have downloaded. Repeat for each required library.

To open the project, double-click on the movement_controller.ino file and the Arduino IDE should automatically launch. If it does not, you can load the sketch by selecting File>Open and navigating to your project file.

Now choose a board type. When you connect your Arduino board it should be auto-detected by the IDE. For some brands you may have to manually select it from the combo box. For off-brand Arduinos we recommend searching forums for the best match, as many specify incorrect boards in their descriptions.

Next select Sketch>Verify/Compile. At the bottom of the IDE a console will appear and provide a detailed log. If you missed any libraries you will receive an error message. Load any missing libraries and try again.

Once the code is compiled, select Sketch>Upload to deploy your code.

Below is a list of the components the code can control:

H-Bridge
FastLED
Linear Actuator

The image below shows the wiring for the Arduino:

To test select Tools>Serial Monitor from the main menu. At the bottom of the IDE a new tab titled “Serial Monitor” will appear. From this console you can directly communicate with the Arduino.

In the console window type “5”. You should see a response from the Arduino in the console, and if the LED is connected to the Arduino it should turn white and flash in a circular pattern.

Now you can control your own robot!
Happy Creating!

 

Number Nine is Rewired

We are learning that weight is everything when it comes to good performance from our robots. One of our best jumpers, Number Nine, used splicing connectors that had very useful push handles but were way too heavy for continued use.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

The old connectors were perfect when we were prototyping designs, but once we settled on a wiring diagram it was time to move on to the much lighter push-in designs. The video below is a sped-up recording of one of the Hip Monster sisters (age 13) rewiring Number Nine with the new connectors:

YouTube player

And now for testing! Here is a video of Number Nine back in action and ready for more upgrades:

YouTube player

Happy creating!

Gunnerkrigg Court S13

When people visit our workshop, the first thing they see is a big box of parts labeled S13. The HipMonsters sister team uses that box for all the leftover pieces when we upgrade our robots (mostly parts from Number Two and Number Three).

The idea for the box came from the online graphic novel series Gunnerkrigg Court, which is one of our all-time favorite works of fiction. This is the mysterious S13 box in Gunnerkrigg Court waiting to be assembled.

This is the one page that sent the Hip Monsters team on a four-year journey to build a robot that could carry on a conversation.

During the Covid pandemic, being able to build your own robot to play with was very appealing to the Hip Monster sister team. Gunnerkrigg Court and Girl Genius Online made building robots seem easy. Years later, the whole team now knows that building robots is fun, but also hard and tedious. Our robots can now talk and move on their own, but are still not as good as S13. Given we lack etheric powers (what the supernatural force is called in Gunnerkrigg Court), we think we did fairly well.

It was raining over the weekend and we were tired of working on real robots (some of which now talk back at us), so we decided to rebuild our first non-work robot from the scraps.

Above is our real-life replication of the assembly of S13. Here in the top left photo we have laid out all of the pieces we found in the box. In the top right photo we are assembling the legs.

The rebranded S13, almost complete.

Gunnerkrigg Court was probably the work of fiction most influential in our decision to build robots. During the pandemic, the adventurous spirit of the two central characters (Annie and Kat) challenged us to push ourselves.

Our emotional AI, which controls all our robots, is loosely based on S13’s conversation with another robot later in the series about having an ocean of feelings to swim in. When we designed the AI we made sure that, at a high level, the code held true to the ocean analogy. Our robots swim in emotions, stimuli, and personality. An algorithm that runs deep in the code lets the robot adjust its behavior given what it experiences.
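As a toy illustration of that idea (ours alone, and simplified well past the real implementation), the mood update can be sketched as a value that drifts toward each new stimulus and slowly settles back to a personality baseline:

```python
# Toy "ocean of feelings" update: mood drifts toward what the robot
# experiences and decays back toward its personality baseline.
def update_mood(mood, stimulus, baseline=0.5, drift=0.2, decay=0.05):
    """mood and stimulus are floats in [0, 1]; higher is happier."""
    mood += drift * (stimulus - mood)  # swim toward the new stimulus
    mood += decay * (baseline - mood)  # settle back toward the baseline
    return max(0.0, min(1.0, mood))

mood = 0.5
for stimulus in [0.9, 0.9, 0.1, 0.5]:  # praise, praise, loud noise, quiet
    mood = update_mood(mood, stimulus)
    print(f"mood: {mood:.2f}")
```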

Here is our very much overused copy of the first volume of Gunnerkrigg Court. We are saving up to buy new hardcover editions.

We hope you find your inspiration.

Fully Autonomous Robots

This video is the first time we were able to record two of our robots talking autonomously. While we were building them, they talked to each other all the time, but capturing it on film proved harder than we thought. In this video, both robots are listening to what the other robot says and responding with replies generated by a chat bot based on what they hear.


The robots are completely offline and only use open-source software. They are powered by a RaspberryPi and have a local LangChain chat bot (TinyLlama LLM). They use Vosk for speech recognition and Piper to synthesize speech. Vosk does a fairly good job converting the Piper voice (it did not recognize anything spoken using eSpeak). Piper works well most of the time but can miss a few words and freeze up unexpectedly. The pause mid-video is due to one of the robots briefly not being able to speak because of a buffer overflow issue.
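The listen-think-speak loop behind this is conceptually simple. Here is a hedged Python sketch of a single conversational turn; the model names and file paths are placeholders, and the real code also has to handle audio buffering and the freeze-ups mentioned above.

```python
# One turn: Vosk transcribes, the local LLM replies, Piper speaks.
# Assumes the vosk and langchain-community packages, an Ollama server
# with TinyLlama pulled, and the piper CLI; names/paths are placeholders.
import json
import subprocess
import wave

from vosk import Model, KaldiRecognizer
from langchain_community.llms import Ollama

stt = KaldiRecognizer(Model("vosk-model-small-en-us"), 16000)
llm = Ollama(model="tinyllama")

def transcribe(wav_path):
    # Expects 16 kHz mono PCM audio to match the recognizer above.
    with wave.open(wav_path, "rb") as wav:
        while True:
            data = wav.readframes(4000)
            if not data:
                break
            stt.AcceptWaveform(data)
    return json.loads(stt.FinalResult()).get("text", "")

heard = transcribe("other_robot.wav")  # audio captured from the peer robot
reply = llm.invoke(f"You are a friendly robot. Reply briefly to: {heard}")
subprocess.run(["piper", "--model", "en_US-lessac-medium.onnx",
                "--output_file", "reply.wav"], input=reply.encode())
```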


Each of our robots also has a distinct personality and LLM prompt, although in this clip they are hard to distinguish. The only noticeable difference is that one robot moves its arms much more than the other.

We have four modes:

  • Puppet: A human controls the robot in real-time.
  • Scripted: The robot follows a script with minimal autonomous actions.
  • Autonomous: The robot responds to outside stimuli on its own.
  • Blended AI: The robot has a script but improvises what it says and how it moves.

Moving forward we will have two types of videos: scripted and fully autonomous. The scripted videos will use a human-created script to control the robots. The fully autonomous films will be the robots talking on their own “off camera”.

YouTube player

We are working on releasing the code base used in this video, but it is a bit too rough at this stage.

Happy creating! 

Maker Faire Bay Area Robot’s View

Thanks to everyone who helped make this year’s Maker Faire Bay Area so special! We are looking forward to seeing everyone next year and are already improving our show. Below is a photo of our booth before the event started. It is hard to believe over one thousand people visited us over the course of three days!

Maker Faire Bay Area

Want to see how our autonomous robots experienced Maker Faire Bay Area? Check out the video below, generated based on the stimuli, emotions, and actions of HipMonsters’ two robots over the course of three days at the Maker Faire.

The robots recorded the following sensory data:

💙 Noise: A sudden, loud noise. Represented by the color Blue.

💚 Distance: Motion within 1 foot. Represented by the color Green.

🧡 Movement: Motion within 6 feet. Represented by the color Orange.

💛 Speech: The spoken word “robotics”. Represented by the color Gold.

💗 Touch: Contact on the touch sensor. Represented by the color Pink.

🤖 Frequency of Stimuli: How often or rarely the robots received stimuli. Captured by the Movement of the cube.

🔉 Mood: Happy or overstimulated. Reflected in the choice of Sound.

Turn up the volume of the video! It’s not music you’re hearing, but the robots’ moods given the stimuli.
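For anyone wondering how the video was made: each stimulus was logged as it happened and the visualization replays that log. Here is a hypothetical sketch of that kind of logging, not our actual code:

```python
# Log each stimulus with its timestamp and display color for later replay.
import time

STIMULUS_COLORS = {"noise": "blue", "distance": "green",
                   "movement": "orange", "speech": "gold", "touch": "pink"}
log = []

def record(stimulus):
    log.append({"time": time.time(), "stimulus": stimulus,
                "color": STIMULUS_COLORS[stimulus]})

record("speech")
record("touch")  # a touch marks the end of one demo
demos = sum(1 for event in log if event["stimulus"] == "touch")
print("demos run:", demos)
```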

Since we engaged the Touch sensor at the end of each demo, the touch count tells us we ran 420 complete demos over 3 days, or 140 a day. Our robots have been well socialized!

YouTube player

Happy Creating!

Getting Started with Raspberry Pi

Originally, we set up this site to focus on woodcrafting and painting, but as our interests grew we have increasingly used Raspberry Pis to add motion and life to our work. This post will get you started using Raspberry Pis in your creations.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

Why Raspberry Pi?

  1. Powerful computing platform with easy-to-use languages.
  2. Low energy consumption; runs quietly and coolly.
  3. Rich online support and user base.
  4. Has 26 built-in pins, enabling rapid integration with Internet of Things (IoT) technology.

RaspberryPi 5

Peripherals

Today, most people develop on a laptop or tablet, but Raspberry Pis require old-fashioned peripherals: a power cable, screen, keyboard, and mouse. You need to set up a physical development environment and make sure you have all the necessary peripherals. Newer Raspberry Pis use a Micro HDMI port, so you will need a converter. We do a lot of coding on the couch, so we built a makeshift laptop, as seen below.

DIY RaspberryPi Laptop

A side view of our Raspberry Pi laptop.

DIY RaspberryPi Laptop

A front view of our laptop.

A mouse can take some getting used to, so we recommend a wireless keyboard (seen above) with a built-in trackpad. One plus is that the keyboard + trackpad combination only uses up one USB port.

The Hard Drive

A Raspberry Pi’s OS is stored on a Micro SD card. To start, we recommend getting two with at least 64 GB each. If you work with images or sound, the drive fills up fast. You will also need at least two readers: a USB-A one for the Raspberry Pi when you transfer code, and one for the other machine you build the OS image from.

SD card and reader

Building the OS Image

You can buy Micro SD cards with the OS pre-installed. If you do not have a laptop or desktop, that is your only real option. Otherwise, you can build your own OS image using the tool provided by Raspberry Pi. You can download it here: raspberrypi.com/software.

We recommend modifying the advanced settings to pre-configure your login and Wi-Fi password.

Booting the Device

Make sure to use the appropriate power supply as specified by RaspberryPi. Depending on the version, booting can take a while. Once booting has completed you should see a screen that looks like most standard desktop environments.

Linux Desktop

Raspberry Pi’s OS is an ARM version of Linux. If you have used Linux, most of the standard tools will be available. If you have only used Windows or OSX, the environment should still seem familiar, since all desktop environments follow the same basic principles. If you have never used a desktop environment, this is a great place to start!

Configuring Your Environment

The keyboard layout defaults to UK. If you are not in the UK, many of the keys will not work as expected. In Preferences, open the Mouse and Keyboard Settings, then click the Keyboard Layout button at the bottom. In the combo box, choose the appropriate country.

We also recommend a smaller background image, or none at all, to use less memory.

Developing Your Next Big Thing!

We started out using Scratch as a development tool. If that works for you and makes sense, keep using it! Here is a link on how to install it on a Raspberry Pi.

We have since migrated to using Python and C++. To write code we use the Geany Programmer’s Editor. It lacks some features of Visual Studio Code (what we develop with on Windows and OSX) but has a light footprint.

Typically, we write code for a Raspberry Pi on both a MacBook and the Raspberry Pi itself. We find the MacBook environment is similar enough that we do not need to change our code much. If you look at our code on GitHub, you will see we often have different logic based on which environment the code runs on. Note: some packages, such as interfaces to sensors, only work on the Raspberry Pi. In those sections of the code we have a non-functioning stub for when the platform is OSX, as sketched below.
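Here is a minimal sketch of that platform guard, assuming the gpiozero package and made-up pin numbers; the point is that the same file runs unchanged on both machines.

```python
# Use the real sensor on the Pi, a non-functioning stub everywhere else.
import platform

ON_PI = (platform.system() == "Linux"
         and platform.machine().startswith(("arm", "aarch")))

if ON_PI:
    from gpiozero import DistanceSensor  # only works on the Pi
    _sensor = DistanceSensor(echo=24, trigger=23)  # placeholder pins
    def read_distance():
        return _sensor.distance
else:
    def read_distance():
        return 0.0  # stub so the code still imports and runs on OSX

print(f"distance reading: {read_distance():.2f}")
```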

We transfer code using the SD reader. Both OSX and Linux auto-detect SD cards when attached, but with Linux it can take a bit, so be patient. Also, sometimes Linux cannot write to large SD cards, so try a smaller one first.

Our next post will dive deeper into the basics of programming Python on a Raspberry Pi. For now, if you have never used Linux or a desktop environment, we recommend just browsing the Web using Chromium (the open-source base of Chrome) to familiarize yourself.

Happy Creating!


Number One On Its Own

Number One looks very simple: it’s just a burnt-out hair drier with wheels. As our first design we opted for a wheeled robot with a more traditional form, but it has been repeatedly updated over the years and is now completely autonomous with a mind of its own, making it one of our most complex robots. Powered by a RaspberryPi, the new Number One is an Edge AI mobile sensor.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

DIY wheeled robot

The handle of the blow drier serves as a functional hub for the electronic components. The two batteries (one for the RaspberryPi and one for the motors) are attached to the back to allow for quick replacement. The camera is mounted at the top to provide a good overall view. The display, which is mostly for show, is forward facing. We added “bumpers” to each corner of the screen to help protect it from falls and bumps. The first screen hit an end table and developed a crack, which convinced us that it needed some armor.

DIY Wheeled Robot RaspberryPi

To protect the range finder, we added a wooden bumper. Originally the range sensor had no protection, but after a few good hits we decided a bumper was a good idea. The range finder has proven to be sturdy, but the wires tend to fall off.

DIY Wheeled Robot RaspberryPi

Above is a back view. When we first built Number One, the components were attached entirely with electrical tape. While this worked surprisingly well, it did not look good. Most components are now bolted on or attached using leather to give the robot a cleaner look.

DIY Wheeled Robot RaspberryPi

The RaspberryPi is attached at the front for easy access. The USB and other ports are easily accessible, allowing for quick repairs. We use a wireless keyboard to control the RaspberryPi. While the robot is autonomous (it makes decisions on its own), the AI part of the robot does not turn on when it first gets power. The robot can only become active after we execute a command. The original model turned on automatically, but that proved to be a bit of a headache when something went wrong.

Robot layout

The above image is the layout design, drawn using software from Fritzing.org. This is a far simpler layout than what we made for Number Two and Number Three. We may add more sensors over time, but to enable a fast response and to reduce power needs we decided to keep the number of sensors to a minimum. Another difference is that we are not using an Arduino to control the movement, which makes this a better design for beginners to learn with.
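Because the motors hang directly off the Pi (no Arduino in between), driving them can be as simple as the sketch below. It assumes gpiozero with an H-bridge wired to hypothetical GPIO pins, so treat the pin numbers as placeholders.

```python
# Drive two motors straight from the Pi's GPIO through an H-bridge.
# Requires the gpiozero package; pin numbers are placeholders.
from time import sleep
from gpiozero import Motor

left = Motor(forward=4, backward=14)
right = Motor(forward=17, backward=18)

left.forward(0.5)    # roll forward at half speed
right.forward(0.5)
sleep(2)
left.backward(0.5)   # spin left in place
right.forward(0.5)
sleep(1)
left.stop()
right.stop()
```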

Here is Number One in action! Come see it live at this year’s Bay Area Maker Faire! 

You can download the code from our GitHub.

Happy Creating!

Bay Area Maker Faire Update

The HipMonsters team was quiet online over the summer but was working hard in our workshop finishing up our educational presentation on robotics, Robot Freedom. Here is a quick preview, which you can see in person at this year’s Bay Area Maker Faire.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

DIY pneumatic robot with bell.

Here is our pneumatic robot designed to put a ring into robotics! Learn how to power a robot by just using your own strength and coordinating with a friend. See how many times you can ring the bell!

DIY Wheeled robot.

Our DIY robotic car is completely controlled by our emotional AI platform. It uses sensors to learn from its surroundings and go in the right direction. See it navigate the world with emotions and learn how you can build one too.

DIY steampunk Leibniz Calculator

Add, subtract, multiply, and divide using our DIY Leibniz calculator, a steampunk computer that you can build at home. This calculator can do amazing math with a relatively simple design. Before there were electronics, there were gears!

Steampunk autonomous robot

See the updated Number Three, now a fully autonomous android with emotions. It takes in information from a variety of sensors and processes the information to change its mood. Help it learn to not be afraid of humans!

Steampunk autonomous robot (centaur)

And Number Two (our centaur robot) has gotten updated as well. The AI platform will soon be available on GitHub so you can build your own emotional AI.

Number Three and Number Two also have a hidden feature when you activate a certain sensor.

We are looking forward to seeing all of you at this year’s Maker Faire!

Happy Creating!