Pneumatic Robotic Arm Workshop

This pneumatic robotic arm workshop is designed to introduce basic concepts of robotics and making to grade-school students. The design is based on ones used in middle school and high school robotics competitions. We have created a simplified version for one-time workshops with kids of all ages.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

The two main science concepts are:

  • Leverage: A lever is a simple machine consisting of a bar that pivots on a fixed point (the fulcrum). Levers are used to amplify input force. The robotic arm requires placing the syringes in positions that exploit leverage. You can find out more here.
  • Pneumatics: Pneumatic power uses compressed air as an energy source. The basic components of a pneumatic engine are a reservoir, pump, valve, and cylinder. In this workshop the syringe is the pneumatic engine. Pneumatic power is widely used in robotics and industry. Here is a link for other project ideas.
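
Both concepts can be checked with a little arithmetic. The sketch below uses made-up dimensions (not measurements from the workshop arm) to compute the ideal output force of a lever and the force ratio between two syringes using Pascal's law:

```python
import math

def lever_output_force(input_force, effort_arm, load_arm):
    """Ideal lever: input_force * effort_arm = output_force * load_arm."""
    return input_force * effort_arm / load_arm

def syringe_force_ratio(driver_diameter, driven_diameter):
    """Pascal's law: pressure is shared, so force scales with piston area."""
    driver_area = math.pi * (driver_diameter / 2) ** 2
    driven_area = math.pi * (driven_diameter / 2) ** 2
    return driven_area / driver_area

# A 2 N push on a stick gripped 8 cm from the pivot, lifting a load 4 cm away
print(lever_output_force(2.0, 8.0, 4.0))   # 4.0 N

# Two identical 10 mm syringes transmit force roughly 1:1
print(syringe_force_ratio(10.0, 10.0))     # 1.0
```

Doubling the effort arm doubles the output force, which is why the syringe placement on the arm matters so much.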

Required Supplies:

Each student will require:

  1. 4 syringes
  2. 2 4-inch pieces of tubing
  3. 1 4×4 piece of wood
  4. 5 popsicle sticks
  5. 2 nuts and bolts

The photo below is the full-scale model used in high school competitions. It requires 2-3 students to control. One of the Hip Monster sisters built it at a Sacred Heart Robotics Camp in San Francisco, CA.

Here is a side view with the arm down.

The competition involves stacking blocks, and the score is based on the time it takes to move all the blocks and the height of the stack. Controlling the arm is a true team effort, with 2-3 students working together to move the arm. The winning design requires not only good engineering but perfect teamwork. Engineering competitions are ideal ways for kids to develop technical as well as social skills. Below is a video of the arm in action:

<video of it working>

For our grade school workshop we chose a smaller and simpler design that only requires one student to control the robotic arm. You can still have a team competition with two students per robot (one controlling each syringe) if desired.

Below are several views of our simplified design. Instead of zip ties we use rubber bands and tape.

Here is a view from above. This design does not use hot glue and is suited for all ages.

This is another design suited for more advanced students.

Here is a side view showing the placement of the syringe in the middle of the base to provide better range of movement.

In the video below one of the Hip Monster’s sister team does a quick build of an arm.

 

YouTube player

Here are the step by step instructions:

  • Drill a hole in the center of the square plywood which will be the base for your robotic arm.
  • Now push a bolt through the hole and secure it using a nut. The bolt will be the support for your arm.
  • Drill a hole on one end of four popsicle sticks.
  • Place two popsicle sticks on either side of the bolt with the holes at the top.
  • Secure them using rubber bands, making sure they can still pivot.
  • Secure a syringe to a popsicle stick. This popsicle stick provides leverage, helping move the arm.
  • Use rubber bands instead of tape or glue. Rubber bands let the mechanism flex as the pump extends, pushing the arm.
  • Attach the piping and connect another syringe.
  • Adjust the two syringes so when you depress one the other extends.
  • Attach one end to the popsicle stick using a rubber band.
  • Next secure the other end to the edge of the base using tape.
  • Slowly depress the syringe pump and your arm will move!
  • Now attach two popsicle sticks to the top of the arm.
  • Secure with a bolt and nut.
  • Secure the syringe pump to the forearm with rubber bands.
  • Now attach the syringe base to the arm using tape.
  • Connect the other syringe.
Now your pneumatic robotic arm is complete!

To improve performance you can turn your pneumatic robot into a hydraulic-powered one by just adding water! You can get more information here.

YouTube player

Happy Creating! 

Number 10 Gets A Screen

We have started upgrading all our robots to run the new RobotFreedom.AI framework. For Number Ten, the main missing piece was a screen. Some of our early designs were not built with a screen in mind, and adapting the design has taken a few iterations.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

Here is a list of some materials and components used:

Number Ten was a tricky design to fit a screen on while keeping to its original design. After many attempts we settled on using an L-bracket placed at the front of the body to mount the screen.

Now we reassembled the legs onto the body. Number Ten was one of several experiments the Hip Monster’s sister team built to come up with the most unusual way to move a robot. The robot moves forward, left, and right by sliding one foot forward. On the bottom of each foot are edge-shaped grippers that provide traction when pushed against but slide when pushed forward.

The screen is light enough to only need a few attachments to hold it in place. For added support we used a wire at the top of the screen to keep it secure while moving. Number Ten has never fallen forward, so we need less protection for the electronics and screen than some of our other designs.

Next we assemble the various components Number Ten will need. We recommend using a USB hub for the wireless keyboard dongle. If you have several robots you will want to reuse the keyboard and will need quick access to the dongle. Typically, once we settle on a final layout, the RaspberryPi ends up in a secure but difficult-to-reach place, making removing the dongle difficult. For people with less than perfect eyesight we recommend using a magnifying glass and bright lights when connecting the GPIO pins to the RaspberryPi.

YouTube player

And here is a quick video of Number Ten’s display screen working. It is a lightweight version of our main display, better suited for older RaspberryPis.

YouTube player

Happy creating!

A Selection of Wands

We make wands to relax and to practice our woodworking skills. The process usually involves spokeshaving, sanding, staining, and waxing the wood. We always use wood from our yard to make the wands extra unique. Here is a selection of some of our wands!

DIY Wooden Wands
This piece of wood had a bent handle that was very difficult to spokeshave, but it turned out very well. We stained it red and brown, then sanded the wood for the marbled finish.
DIY Wood Wands

Unusual for us, this wand is made from bamboo. Instead of spokeshaving, we just sanded this wand to maintain the classic look of the bamboo. The black color of the wood is natural, not stained.

DIY Wooden wands

This wand is nice and straight. We spokeshaved it smooth and stained it mahogany to add a pop of color. This wand was also sanded to have a smooth finish.

Happy creating!

Updates to RobotFreedom.AI

Since our last update at Maker Faire, we’ve made significant improvements to our robot lineup, focusing on increasing autonomy, human interaction, and emotional expression. Core to this is our AI framework, RobotFreedom.AI, now available on GitHub.

The design principles for our autonomous AIs:

  1. Run completely locally on a RaspberryPi (no third-party API calls)
  2. Have distinctive personalities based on the Big Five personality traits
  3. Learn from interactions (initiated when in sleep mode)
  4. Make informed decisions based on a knowledge graph
  5. Can carry on a conversation between themselves and with a human
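
As a sketch of how Big Five traits might drive behavior, the Python below gives each robot a reproducible trait profile and derives a simple behavior from it. The trait names are the standard Big Five; the scoring and weights are illustrative, not the actual RobotFreedom.AI code:

```python
import random

# The standard Big Five trait names; scores in [0, 1] are made up here.
BIG_FIVE = ["openness", "conscientiousness", "extraversion",
            "agreeableness", "neuroticism"]

def make_personality(seed):
    """Give a robot a reproducible set of trait scores in [0, 1]."""
    rng = random.Random(seed)
    return {trait: round(rng.random(), 2) for trait in BIG_FIVE}

def speak_probability(personality):
    """Extraverted robots volunteer to talk more; neurotic ones hold back."""
    p = 0.5 + 0.4 * personality["extraversion"] - 0.2 * personality["neuroticism"]
    return max(0.0, min(1.0, p))  # clamp to a valid probability

robot = make_personality(9)
print(robot)
print(speak_probability(robot))
```

Seeding the generator means a robot keeps the same personality across reboots, which matters when each robot is supposed to feel like a distinct character.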

To achieve this we used an agentic AI framework rather than just tapping directly into a chatbot. By having a core AI designed to meet our specific goals, we had more control over its behavior and could also directly see how the AI worked in real time, which proved to be a great educational tool.

One of the key upgrades is the addition of machine learning algorithms, which enable the robots to learn from their interactions and quickly adapt to new situations. This allows them to become even more autonomous in their decision-making processes, making them more efficient and effective in completing tasks.

We’ve also made notable strides in expanding the robots’ interactive capabilities, incorporating features such as voice recognition, gesture control, and tactile feedback. These enhancements enable users to engage with the robots on a deeper level, fostering a more immersive and engaging experience.

Some of the specific updates include:

  • Advanced sensor arrays for improved navigation and obstacle detection
  • Enhanced machine learning algorithms for adaptive decision-making
  • Voice recognition and speech-to-text capabilities
  • Tactile feedback mechanisms for haptic interaction

These updates have significantly enhanced the robots’ autonomy, interactivity, and overall user experience. We’re excited to continue refining our designs and pushing the boundaries of what’s possible with robotics and AI.

We have been busy working on our next release of our robot software platform.

Major features:

  • Robots can coordinate actions using WebSocket communication
  • Dedicated HTTP server for each robot
  • Added Piper TTS for voice
  • Added Vosk for speech recognition
  • Added Ollama and LangChain for the chat bot
  • Improved random movement generator
  • Tons of bug fixes
  • Improved debug mode
  • Low memory mode for RaspberryPis 1-3
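
As a rough illustration of the coordination feature, the sketch below simulates two robots exchanging messages. This is not the actual RobotFreedom.AI wire protocol: asyncio queues stand in for the WebSocket connections, and the robot names and message fields are just examples:

```python
import asyncio

# Each robot's inbox queue stands in for its WebSocket connection;
# messages are small dicts like {"sender": ..., "action": ...}.
async def robot(name, inbox, outbox, log, turns=1):
    for _ in range(turns):
        msg = await inbox.get()                     # wait for a message
        log.append(f"{name} received {msg['action']}")
        await outbox.put({"sender": name, "action": f"ack:{msg['action']}"})

async def main():
    log = []
    nine_in, ten_in = asyncio.Queue(), asyncio.Queue()
    await nine_in.put({"sender": "human", "action": "wave"})  # kick things off
    await asyncio.gather(robot("Nine", nine_in, ten_in, log),
                         robot("Ten", ten_in, nine_in, log))
    return log

for line in asyncio.run(main()):
    print(line)
```

The same request/acknowledge pattern works over real WebSockets; the queue version just keeps the example self-contained and runnable on any machine.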

Tested on macOS and RaspberryPi 1-5.

You can see our Robotic AI platform in action here.

Happy creating!

Arduino Robotic Controller Software Update

The RobotFreedom robots are controlled by two code bases. The first runs on a RaspberryPi and is written in Python. You can read more about it here. The second code base controls the movements and lights on the robot. It is written in C and runs on an Arduino. This article will get you started on developing with that code base. You can download it from GitHub.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

The movement controller is designed to be light and simple compared to the main AI code base and is ideal for a beginner. The focus is to provide an interface for a human or AI to control a robot’s moving components (arms, legs, and wheels). We use an Arduino Mega board because it has plenty of digital pins to attach our components to. Below is an image of an Arduino Mega board.

Arduinos can be controlled via serial communication through a USB port, or you can code them to run independently. Our robotic walkers are controlled only by an Arduino. This project is intended to be controlled by an AI installed on a RaspberryPi.

The purpose of the program is to receive incoming messages and perform the requested action. For example, ‘a’ means raise the left arm. When the program receives an ‘a’ command it sends a command to an H-bridge, which then sends power to a linear actuator to raise the left arm.
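
The dispatch idea can be sketched in a few lines. The real firmware is C running on the Arduino; this is a Python simulation of the same pattern, and only the ‘a’ command comes from this article — the other command and the action strings are hypothetical placeholders:

```python
# Single-character command dispatch, simulated in Python. In the firmware,
# each handler would drive an H-bridge pin instead of returning a string.
def raise_left_arm():
    return "H-bridge: extend left actuator"

def lower_left_arm():  # hypothetical second command for illustration
    return "H-bridge: retract left actuator"

COMMANDS = {"a": raise_left_arm, "b": lower_left_arm}

def handle(byte):
    """Look up the incoming serial byte and run the matching action."""
    action = COMMANDS.get(byte)
    return action() if action else f"unknown command: {byte!r}"

print(handle("a"))
print(handle("z"))
```

A lookup table like this keeps the serial loop tiny: new actions are added by registering another handler rather than growing a chain of if/else branches.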

To start, install the Arduino IDE on your preferred development OS. Linux, macOS, and Windows are supported. You can get the code here.

Next, download the required libraries and copy them to your development folder.

Adafruit_BusIO
Adafruit_I2CDevice
Adafruit-GFX-Library
FastLED

Launch the Arduino IDE and from the menu bar select:
Sketch>Include Library>Add Zip Library…

Then point to one of the zip files you have downloaded. Repeat for each required library.

To open the project, double click on the movement_controller.ino file and the Arduino IDE should automatically launch. If it does not launch you can load the sketch by selecting File>Open, then navigating to your project file.

Now choose a board type. When you connect your Arduino board it should be auto-detected by the IDE. For some brands you may have to manually select it from the combo box. For off-brand Arduinos we recommend searching forums for the best match. Many specify incorrect boards in their descriptions.

Next select Sketch>Verify/Compile. At the bottom of the IDE a console will appear and provide a detailed log. If you missed any libraries you will receive an error message. Load any missing libraries and try again.

Once the code is compiled, select Sketch>Upload to deploy your code.

Below is a list of the components the code can control:

H-Bridge
FastLED
Linear Actuator

The image below is the wiring for the Arduino:

To test select Tools>Serial Monitor from the main menu. At the bottom of the IDE a new tab titled “Serial Monitor” will appear. From this console you can directly communicate with the Arduino.

In the console window type “5”. You should see a response from the Arduino in the console, and if the LED is connected to the Arduino it should turn white and flash in a circular pattern.

Now you can control your own robot!
Happy Creating!

 

Number Nine is Rewired

We are learning that weight is everything when it comes to getting good performance from our robots. One of our best jumpers, Number Nine, used splicing connectors that had very useful push handles but were way too heavy for continued use.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

The old connectors were perfect when we were prototyping designs, but once we settled on a wiring diagram it was time to move on to the much lighter push-in design. Below is a sped-up video of one of the Hip Monster’s sister team (age 13) rewiring Number Nine with the new connectors:

YouTube player

And now for testing! Here is a video of Number Nine back in action and ready for more upgrades:

YouTube player

Happy creating!

Gunnerkrigg Court S13

When people visit our workshop, the first thing they see is a big box of parts labeled S13. The HipMonsters sister team uses that box for all the leftover pieces when we upgrade our robots (mostly parts from Number Two and Number Three).

The idea for the box came from the online graphic novel series Gunnerkrigg Court, which is one of our all-time favorite works of fiction. This is the mysterious S13 box in Gunnerkrigg Court waiting to be assembled.

This is the one page that sent the Hip Monster’s team on a four year journey to build a robot that could carry on a conversation.

During the Covid pandemic, being able to build your own robot to play with was very appealing to the Hip Monster’s sister team. Gunnerkrigg Court and Girl Genius Online made building robots seem easy. Years later, the whole team now knows that building robots is fun, but also hard and tedious. Our robots can now talk and move on their own, but are still not as good as S13. Given we lack etheric powers (what the supernatural force is called in Gunnerkrigg Court), we think we did fairly well.

It was raining over the weekend and we were tired of working on real robots (some of which now talk back at us), so we decided to rebuild our first non-work robot from the scraps.

Above is our real-life replication of the assembly of S13. Here in the top left photo we have laid out all of the pieces we found in the box. In the top right photo we are assembling the legs.

The rebranded S13 almost complete.

Gunnerkrigg Court was probably the work of fiction most influential in our decision to build robots. During the pandemic, the adventurous spirit of the two central characters (Annie and Kat) challenged us to push ourselves.

Our emotional AI, which controls all our robots, is loosely based on S13’s conversation with another robot later in the series about having an ocean of feelings to swim in. When we designed the AI we made sure that, at a high level, the code held true to the ocean analogy. Our robots swim in emotions, stimuli, and personality. An algorithm that runs deep in the code lets the robot adjust its behavior given what it experiences.
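
Loosely, the ocean analogy can be pictured as an emotion vector that drifts toward incoming stimuli. Everything below is an illustrative toy, not the actual RobotFreedom.AI code: the feeling names, the sensitivity weight, and the update rule are all made up for this sketch:

```python
# Toy "ocean of feelings": each update blends the current state with a
# stimulus, so feelings drift toward experience and decay without it.
def update_emotions(state, stimulus, sensitivity=0.3):
    return {feeling: round((1 - sensitivity) * state[feeling]
                           + sensitivity * stimulus.get(feeling, 0.0), 3)
            for feeling in state}

state = {"joy": 0.5, "fear": 0.1}
state = update_emotions(state, {"joy": 1.0})  # a friendly interaction
print(state)  # joy drifts up toward 1.0, fear decays toward 0
```

Repeated friendly interactions keep pulling joy upward, while a feeling that receives no stimulus fades a little on every update.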

Here is our very much overused copy of the first volume of Gunnerkrigg Court. We are saving up to buy new hardcover editions.

We hope you find your inspiration.

Carved Wooden Seals

These are old carvings that we forgot to post. We were inspired by the Chinese wax seals and wanted to practice wood carving, so we decided to make our own unique wooden seals.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

Chinese wax seals

Here are some of our favorite seals. Most were purchased at a little store in the middle of San Francisco’s Chinatown, right off of Post Street.

Crafted WaxSeals Alligator

Inspired by Claude the Alligator, we decided to make an alligator seal. Its tail curves behind it and its jaws open wide. The stamp goes on the bottom.

Crafted WaxSeals Alligator

This is the side view, where it shows off its tiny feet. There is some texture on the tail to replicate an alligator’s scales. At first the tail was indicated with a shallow groove, but the carving was hard to make out, so we opted to cut a slot to highlight the tail more clearly.

Crafted WaxSeals, Cat

This is the other wooden seal, which is a cat. The cat is perched on top of a wooden column, with its tail curved down the side.

Crafted WaxSeals, Cat

This carving captures the shape of a cat, with its pointed ears and curved body. We tied a red string around its neck like a collar to add a pop of color.

At the bottom of the seal, we added a stamp. We made the stamp out of an eraser and carved it with the ancient Chinese character for ocean.

Here are the two wooden seals together!

Happy Creating!

Intro to Our Workshop!

In this video Ted from the HipMonster’s team shows our workshop and describes how we train our robots. We have fifteen DIY robots throughout the workshop that listen in on our conversations to learn from us while we work. The robots are completely autonomous and learn on their own. If you are interested in building your own, our website has instructions. The designs are meant for all ages, and even K-12 kids can get started building their own robots.

YouTube player

The robots have their own site, RobotFreedom.com. Watch them as they recap the week’s events between themselves.

Please like and subscribe to this channel and follow us on BlueSky or Instagram!

Fully Autonomous Robots

This video is the first time we were able to record two of our robots talking autonomously. While we were building them, they talked to each other all the time, but capturing it on film proved harder than we thought. In this video, both robots are listening to what the other robot says and responding with replies generated by a chat bot based on what they hear.

 

The robots are completely offline and only use open-source software. They are powered by a RaspberryPi and have a local LangChain chat bot (TinyLlama LLM). They use Vosk for speech recognition and Piper to synthesize speech. Vosk does a fairly good job converting the Piper voice (it did not recognize anything spoken using eSpeak). Piper works well most of the time but can miss a few words and freeze up unexpectedly. The pause mid-video is due to one of the robots briefly not being able to speak due to a buffer overflow issue.

 

We also have distinct personalities and LLM prompts for all our robots, although in this clip they are hard to distinguish. The only noticeable difference is that one robot moves its arms much more than the other.

We have four modes:

  • Puppet: A human controls the robot in real time
  • Scripted: The robot follows a script with minimal autonomous actions
  • Autonomous: The robot responds to outside stimuli on its own
  • Blended AI: The robot has a script but improvises what it says and how it moves
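
The autonomous listen-and-respond loop from the video can be sketched in a few lines. Here a canned `reply` function stands in for the real Vosk → LangChain → Piper pipeline, and the robot names are placeholders:

```python
# Two-robot turn-taking: each robot "hears" the other's last utterance
# and generates a reply from it, alternating speakers each turn.
def reply(robot_name, heard):
    """Stand-in for speech recognition + chat bot + speech synthesis."""
    return f"{robot_name} responding to: {heard}"

def converse(opening, turns=4):
    transcript, utterance = [], opening
    speakers = ["Robot A", "Robot B"]
    for turn in range(turns):
        speaker = speakers[turn % 2]     # alternate who talks
        utterance = reply(speaker, utterance)
        transcript.append(utterance)
    return transcript

for line in converse("Hello!"):
    print(line)
```

Because each reply feeds the next turn, the loop never needs a central script: swap the canned `reply` for a real recognizer and chat bot and the robots keep the conversation going on their own.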

Moving forward we will have two types of videos: scripted and fully autonomous. The scripted videos will use a human-created script to control the robots. The fully autonomous videos will be the robots talking on their own “off camera”.

YouTube player

We are working on releasing the code base used in this video, but it is a bit too rough at this stage.

Happy creating!