Number 10 Gets A Screen

We have started upgrading all our robots to run the new RobotFreedom.AI framework. For Number Ten, the main missing piece was a screen. Some of our early designs were not built with a screen in mind, and adapting the design has taken a few iterations.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

Here is a list of some materials and components used:

Number Ten was a tricky robot to fit a screen onto while keeping its original design. After many attempts we settled on an L-bracket placed at the front of the body to mount the screen.

Next, we reassembled the legs onto the body. Number Ten was one of several experiments the Hip Monster’s sister team built to come up with the most unusual way to move a robot. The robot moves forward, left, and right by sliding one foot forward. On the bottom of each foot are edge-shaped grippers that provide traction when pushed against but slide when pushed forward.

The screen is light enough to need only a few attachments to hold it in place. For added support, we used a wire at the top of the screen to keep it secure while moving. Number Ten has never fallen forward, so we need less protection for the electronics and screen than some of our other designs.

Next, we assemble the various components Number Ten will need. We recommend using a USB hub for the wireless keyboard dongle. If you have several robots you will want to reuse the keyboard, so you will need quick access to the dongle. Typically, once we settle on a final layout, the RaspberryPi ends up in a secure but difficult-to-reach place, making removing the dongle awkward. For people with less than perfect eyesight, we recommend a magnifying glass and bright lights when connecting the GPIO pins to the RaspberryPi.


And here is a quick video of Number Ten’s display screen working. It is a lightweight version of our main display, better suited for older RaspberryPis.


Happy creating!

Updates to RobotFreedom.AI

Since our last update at Maker Faire, we’ve made significant improvements to our robot lineup, focusing on increasing autonomy, human interaction, and emotional expression. Core to this is our AI framework, RobotFreedom.AI, now available on GitHub.

The design principles for our autonomous AIs:

  1. Runs completely locally on a RaspberryPi (no third-party API calls)
  2. Has distinctive personalities based on the Big Five personality traits (see the sketch after this list)
  3. Learns from interactions (initiated when in sleep mode)
  4. Makes informed decisions based on a knowledge graph
  5. Can carry on a conversation among themselves and with a human
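
To make the personality principle concrete, here is a minimal sketch of how a Big Five profile might be represented in Python. The trait names are the standard Big Five; the class, sample values, and derived `chattiness` score are our illustrative assumptions, not the actual RobotFreedom.AI data structures.

```python
# Illustrative sketch: a Big Five personality profile a robot could
# load at startup. The values and derived trait are assumptions.
from dataclasses import dataclass


@dataclass
class Personality:
    openness: float          # all traits range from 0.0 to 1.0
    conscientiousness: float
    extraversion: float
    agreeableness: float
    neuroticism: float

    def chattiness(self) -> float:
        """Example derived behavior: extraverted, open robots talk more."""
        return 0.5 * (self.extraversion + self.openness)


number_ten = Personality(0.8, 0.4, 0.9, 0.7, 0.2)
print(f"chattiness: {number_ten.chattiness():.2f}")
```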

To achieve this we used an agentic AI framework rather than just tapping directly into a chatbot. By having a core AI designed to meet our specific goals, we had more control over its behavior and could also watch how the AI worked in real time, which proved to be a great educational tool.

One of the key upgrades is the addition of machine learning algorithms, which enable the robots to learn from their interactions and quickly adapt to new situations. This allows them to become even more autonomous in their decision-making, making them more efficient and effective at completing tasks.

We’ve also made notable strides in expanding the robots’ interactive capabilities, incorporating features such as voice recognition, gesture control, and tactile feedback. These enhancements enable users to engage with the robots on a deeper level, fostering a more immersive and engaging experience.

Some of the specific updates include:

* Advanced sensor arrays for improved navigation and obstacle detection

* Enhanced machine learning algorithms for adaptive decision-making

* Voice recognition and speech-to-text capabilities

* Tactile feedback mechanisms for haptic interaction

These updates have significantly enhanced the robots’ autonomy, interactivity, and overall user experience. We’re excited to continue refining our designs and pushing the boundaries of what’s possible with robotics and AI.

We have been busy working on the next release of our robot software platform.

Major features:

  • Robots can coordinate actions using WebSocket communication (see the sketch after this list)
  • Dedicated HTTP server for each robot
  • Added Piper TTS for voice
  • Added Vosk for speech recognition
  • Added Ollama and LangChain for the chatbot
  • Improved random movement generator
  • Tons of bug fixes
  • Improved debug mode
  • Low memory mode for RaspberryPi 1-3
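
To give a feel for the coordination feature, here is a minimal sketch of a relay server using the Python `websockets` package: each robot connects, and any action one robot announces is forwarded to the others. The port and the JSON message shape are illustrative assumptions, not the platform’s actual protocol.

```python
# Minimal sketch: relay actions between connected robots over WebSockets.
# Requires the "websockets" package; port and message format are assumed.
import asyncio
import json

import websockets

CONNECTED = set()


async def relay(ws):
    """Register a robot and forward its messages to all the others."""
    CONNECTED.add(ws)
    try:
        async for message in ws:
            action = json.loads(message)  # e.g. {"robot": 3, "say": "hello"}
            for peer in CONNECTED - {ws}:
                await peer.send(json.dumps(action))
    finally:
        CONNECTED.remove(ws)


async def main():
    async with websockets.serve(relay, "0.0.0.0", 8765):
        await asyncio.Future()  # run forever


if __name__ == "__main__":
    asyncio.run(main())
```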

Tested on OSX and RaspberryPi 1-5.

You can see our Robotic AI platform in action here.

Happy creating!

Arduino Robotic Controller Software Update

The RobotFreedom robots are controlled by two code bases. The first runs on a RaspberryPi and is written in Python. You can read more about it here. The second code base controls the movements and lights on the robot. It is written in C and runs on an Arduino. This article will get you started on developing with that code base. You can download it from GitHub.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

The movement controller is designed to be light and simple compared to the main AI code base, making it ideal for a beginner. The focus is to provide an interface for a human or AI to control a robot’s moving components (arms, legs, and wheels). We use an Arduino Mega board because it has plenty of digital pins to attach our components to. Below is an image of an Arduino Mega board.

Arduinos can be controlled via serial communication through a USB port, or you can code them to run independently. Our robotic walkers are controlled only by an Arduino. This project is intended to be controlled by an AI installed on a RaspberryPi.

The purpose of the program is to receive incoming messages and perform the requested action. For example, ‘a’ means raise the left arm. When the program receives an ‘a’ command, it signals an H-bridge, which then sends power to a linear actuator to raise the left arm.
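
For reference, here is a hedged sketch of the RaspberryPi side of that exchange using the pyserial package. The device path and baud rate are typical defaults and may differ on your setup; only the single-character command idea comes from the code base itself.

```python
# Sketch: sending one-character commands to the Arduino over USB serial.
# "/dev/ttyACM0" and 9600 baud are common defaults, not guaranteed.
import serial

with serial.Serial("/dev/ttyACM0", 9600, timeout=2) as arduino:
    arduino.write(b"a")           # 'a' = raise the left arm
    reply = arduino.readline()    # read the Arduino's status response
    print(reply.decode(errors="replace").strip())
```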

To start, install the Arduino IDE on your preferred development OS. Linux, OSX, and Windows are supported. You can get the code here.

Next, download the required libraries and copy them to your development folder.

Adafruit_BusIO
Adafruit_I2CDevice
Adafruit-GFX-Library
FastLED

Launch the Arduino IDE and from the menu bar select:
Sketch>Include Library>Add Zip Library…

Then point to one of the zip files you have downloaded. Repeat for each required library.

To open the project, double-click the movement_controller.ino file and the Arduino IDE should launch automatically. If it does not, you can load the sketch by selecting File>Open and navigating to your project file.
Now choose a board type. When you connect your Arduino board it should be auto-detected by the IDE. For some brands you may have to manually select it from the combo box. For off-brand Arduinos we recommend searching forums for the best match, as many specify incorrect boards in their descriptions.

Next select Sketch>Verify/Compile. At the bottom of the IDE a console will appear and provide a detailed log. If you missed any libraries you will receive an error message; load the missing libraries and try again.

Once the code compiles, select Sketch>Upload to deploy your code.

Below is a list of the components the code can control:

H-Bridge
FastLED
Linear Actuator

The image below is the wiring for the Arduino:

To test, select Tools>Serial Monitor from the main menu. At the bottom of the IDE a new tab titled “Serial Monitor” will appear. From this console you can communicate directly with the Arduino.

In the console window, type “5”. You should see a response from the Arduino in the console, and if the LED is connected to the Arduino it should turn white and flash in a circular pattern.

Now you can control your own robot!
Happy Creating!

 

Number Nine is Rewired

We are learning that weight is everything when it comes to good performance from our robots. One of our best jumpers, Number Nine, used splicing connectors that had very useful push handles but were far too heavy for continued use.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

The old connectors were perfect when we were prototyping designs, but once we settled on a wiring diagram it was time to move to the much lighter push-in design. The video below is a sped-up recording of one member of the Hip Monster’s sister team (age 13) rewiring Number Nine with the new connectors:


And now for testing! Here is a video of Number Nine back in action and ready for more upgrades:


Happy creating!

Gunnerkrigg Court S13

When people visit our workshop, the first thing they see is a big box of parts labeled S13. The HipMonsters sister team uses that box for all the leftover pieces from upgrading our robots (mostly parts from Number Two and Number Three).

The idea for the box came from the online graphic novel series Gunnerkrigg Court, which is one of our all-time favorite works of fiction. This is the mysterious S13 box in Gunnerkrigg Court waiting to be assembled.

This is the one page that sent the Hip Monster’s team on a four-year journey to build a robot that could carry on a conversation.

During the Covid pandemic, being able to build your own robot to play with was very appealing to the Hip Monster’s sister team. Gunnerkrigg Court and Girl Genius Online made building robots seem easy. Years later, the whole team knows that building robots is fun, but also hard and tedious. Our robots can now talk and move on their own, but they are still not as good as S13. Given we lack etheric powers (what the supernatural force is called in Gunnerkrigg Court), we think we did fairly well.

It was raining over the weekend and we were tired of working on real robots (some of which now talk back at us), so we decided to rebuild our first non-working robot from the scraps.

Above is our real-life replication of the assembly of S13. Here in the top left photo we have laid out all of the pieces we found in the box. In the top right photo we are assembling the legs.

The rebranded S13 almost complete.

Gunnerkrigg Court was probably the work of fiction most influential in our decision to build robots. During the pandemic, the adventurous spirit of the two central characters (Annie and Kat) challenged us to push ourselves.

Our emotional AI, which controls all our robots, is loosely based on S13’s conversation with another robot later in the series about having an ocean of feelings to swim in. When we designed the AI, we made sure that, at a high level, the code held true to the ocean analogy. Our robots swim in emotions, stimuli, and personality. An algorithm deep in the code lets each robot adjust its behavior based on what it experiences.
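
As a loose illustration of that analogy, here is a toy sketch of how a stimulus might nudge an emotional state that is filtered through personality. Every name and number here is an illustrative assumption; it is not the actual RobotFreedom.AI algorithm.

```python
# Toy sketch: emotions drift with stimuli, scaled by a personality filter.
# Emotion names, sensitivities, and the decay rate are all assumptions.
EMOTIONS = {"joy": 0.5, "fear": 0.1, "curiosity": 0.6}
SENSITIVITY = {"joy": 0.8, "fear": 0.3, "curiosity": 0.9}  # personality
DECAY = 0.95  # emotions slowly drift back toward neutral


def feel(stimulus: str, strength: float) -> None:
    """Blend one stimulus into the ocean of emotions."""
    for name in EMOTIONS:
        EMOTIONS[name] *= DECAY
    EMOTIONS[stimulus] = min(1.0, EMOTIONS[stimulus] + strength * SENSITIVITY[stimulus])


feel("curiosity", 0.4)  # e.g. the motion sensor fired
print(max(EMOTIONS, key=EMOTIONS.get))  # dominant emotion drives behavior
```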

Here is our well-worn copy of the first volume of Gunnerkrigg Court. We are saving up to buy new hardcover editions.

We hope you find your inspiration.

Intro to Our Workshop!

In this video, Ted from the HipMonster’s team shows our workshop and describes how we train our robots. We have fifteen DIY robots throughout the workshop that listen in on our conversations to learn from us while we work. The robots are completely autonomous and learn on their own. If you are interested in building your own, our website has instructions. These designs are meant for all ages; even K-12 kids can get started building their own robots.


The robots have their own site, RobotFreedom.com. Watch them as they recap the week’s events between themselves.

Please like and subscribe to this channel and follow us on BlueSky or Instagram!

Fully Autonomous Robots

This video is the first time we were able to record two of our robots talking autonomously. While we were building them, they talked to each other all the time, but capturing it on film proved harder than we thought. In this video, both robots are listening to what the other robot says and responding with replies generated by a chatbot based on what they hear.

The robots are completely offline and only use open-source software. They are powered by a RaspberryPi and run a local LangChain chatbot (TinyLlama LLM). They use Vosk for speech recognition and Piper to synthesize speech. Vosk does a fairly good job converting the Piper voice (it did not recognize anything spoken using eSpeak). Piper works well most of the time but can miss a few words and freeze up unexpectedly. The pause mid-video is due to one of the robots briefly not being able to speak because of a buffer overflow issue.
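
For the curious, here is a rough sketch of the listen-think-speak loop those robots run, wired together from the same open-source pieces (Vosk, LangChain with Ollama, Piper). The model names, paths, and audio plumbing are illustrative assumptions; our actual code base handles personalities, buffering, and error recovery on top of this.

```python
# Rough sketch of one robot's conversation loop: Vosk hears, a local
# LLM thinks, Piper speaks. Model names and paths are assumptions.
import json
import subprocess

import pyaudio
from vosk import Model, KaldiRecognizer
from langchain_community.llms import Ollama

recognizer = KaldiRecognizer(Model("vosk-model-small-en-us-0.15"), 16000)
llm = Ollama(model="tinyllama")  # served by a local Ollama instance


def listen() -> str:
    """Capture microphone audio until Vosk returns a final phrase."""
    pa = pyaudio.PyAudio()
    stream = pa.open(format=pyaudio.paInt16, channels=1, rate=16000,
                     input=True, frames_per_buffer=4000)
    try:
        while True:
            data = stream.read(4000, exception_on_overflow=False)
            if recognizer.AcceptWaveform(data):
                return json.loads(recognizer.Result()).get("text", "")
    finally:
        stream.close()
        pa.terminate()


def speak(text: str) -> None:
    """Synthesize a reply with the piper CLI, then play it."""
    subprocess.run(["piper", "--model", "en_US-lessac-medium.onnx",
                    "--output_file", "reply.wav"], input=text.encode())
    subprocess.run(["aplay", "reply.wav"])


while True:
    heard = listen()
    if heard:
        speak(llm.invoke(heard))
```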

We have also given our robots distinct personalities and LLM prompts, although in this clip they are hard to distinguish. The only noticeable difference is that one robot moves its arms much more than the other.

We have four modes:

  • Puppet: a human controls the robot in real time
  • Scripted: the robot follows a script with minimal autonomous actions
  • Autonomous: the robot responds to outside stimuli on its own
  • Blended AI: the robot has a script but improvises what it says and how it moves

Moving forward we will have two types of videos: scripted and fully autonomous. The scripted videos will use a human-created script to control the robots. The fully autonomous films will be the robots talking on their own “off camera.”


We are working on releasing the code base used in this video, but it is a bit too rough at this stage.

Happy creating! 

Getting Started with Raspberry Pi

Originally, we set up this site to focus on woodcrafting and painting, but as our interests grew we have increasingly used Raspberry Pis to add motion and life to our work. This post will get you started using Raspberry Pis in your creations.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

Why Raspberry Pi?

  1. Powerful computing platform with easy-to-use languages.
  2. Low energy consumption; runs quietly and coolly.
  3. Rich online support and user base.
  4. Has 26 GPIO pins built in, enabling rapid integration with Internet of Things (IoT) technology.

RaspberryPi 5

Peripherals

Today, most people develop on a laptop or tablet, but a Raspberry Pi requires old-fashioned peripherals: a power cable, screen, keyboard, and mouse. You need to set up a physical development environment and make sure you have all the necessary peripherals. Newer Raspberry Pis use a micro HDMI port, so you will need an adapter. We do a lot of coding on the couch, so we built a makeshift laptop, as seen below.

DIY RaspberryPi Laptop

A side view of our Raspberry Pi laptop.

DIY RaspberryPi Laptop

A front view of our laptop.

A separate mouse can take some getting used to, so we recommend a wireless keyboard (seen above) with a built-in trackpad. One plus is that the keyboard and trackpad combination only uses up one USB port.

The Hard Drive

A Raspberry Pi’s OS is stored on a microSD card. To start, we recommend getting two cards with at least 64 GB each; if you work with images or sound, the drive fills up fast. You will also need at least two card readers: a USB-A reader for the Raspberry Pi when you transfer code, and one for the machine you build the OS image on.

SD card and reader

Building the OS Image

You can buy microSD cards with the OS preinstalled. If you do not have a laptop or desktop, that is your only real option. You can also build your own OS image using the Raspberry Pi Imager tool. You can download it here: raspberrypi.com/software.

We recommend modifying the advanced settings to pre-configure your login and Wi-Fi password.

Booting the Device

Make sure to use the appropriate power supply as specified by Raspberry Pi. Depending on the version, booting can take a while. Once booting has completed, you should see a screen that looks like most standard desktop environments.

Linux Desktop

Raspberry Pi’s OS is an ARM version of Linux. If you have used Linux, most of the standard tools will be available. If you have only used Windows or OSX, the environment should still seem familiar; all desktop environments follow the same basic principles. If you have never used a desktop environment, this is a great place to start!

Configuring Your Environment

The keyboard layout defaults to UK. If you are not in the UK, many of the keys will not work as expected. In Preferences, open the Mouse and Keyboard Settings, then click the Keyboard Layout button at the bottom. In the combo box, choose the appropriate country.

We also recommend using a smaller background image, or none at all, to use less memory.

Developing Your Next Big Thing!

We started using Scratch as a development tool. If that works for you and makes sense, keep using it! Here is a link on how to install it on a Raspberry Pi.

We have now migrated to using Python and C++. To write code we use the Geany Programmer’s Editor. It lacks some features of Visual Studio Code (what we develop with on Windows and OSX) but has a light footprint.

Typically, we write code for a Raspberry Pi on both a MacBook and the Raspberry Pi itself. We find the MacBook environment is similar enough that we do not need to change our code much. If you look at our code on GitHub, you will see we often have different logic based on which environment the code runs on. Note: some packages, such as interfaces to sensors, only work on a Raspberry Pi. In those sections of the code, we have non-functioning stubs if the platform is OSX.
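
Here is a minimal sketch of that platform-check pattern. The sensor, pin number, and helper name are illustrative; the idea is simply that Pi-only imports and calls are fenced off so the same file also runs on OSX.

```python
# Sketch of the platform-stub pattern: Pi-only code is fenced off so the
# same file runs everywhere. Pin number and helper name are illustrative.
import platform

ON_PI = platform.system() == "Linux" and \
    platform.machine().startswith(("arm", "aarch64"))

if ON_PI:
    import RPi.GPIO as GPIO  # only available on the Raspberry Pi


def read_motion_sensor(pin: int = 4) -> bool:
    """Return True when the motion sensor fires; stubbed off the Pi."""
    if not ON_PI:
        return False  # non-functioning stub for OSX development
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(pin, GPIO.IN)
    return GPIO.input(pin) == GPIO.HIGH
```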

We transfer code using the SD reader. Both OSX and Linux auto-detect SD cards when attached, but with Linux it can take a bit, so be patient. Also, sometimes Linux cannot write to large SD cards, so try a smaller one first.

Our next post will dive deeper into the basics of programming Python on a Raspberry Pi. For now, if you have never used Linux or a desktop environment, we recommend just browsing the web with Chromium (the open-source base of Chrome) to familiarize yourself.

Happy Creating!

Number One On Its Own

Number One looks very simple: it’s just a burnt-out hair dryer with wheels. As our first design, we opted for a wheeled robot with a more traditional form, but it has been repeatedly updated over the years and is now completely autonomous with a mind of its own, making it one of our most complex robots. Powered by a RaspberryPi, the new Number One is an Edge AI mobile sensor.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

DIY wheeled robot

The handle of the blow dryer serves as a functional hub for the electronic components. The two batteries (one for the RaspberryPi and one for the motors) are attached to the back to allow for quick replacement. The camera is mounted at the top to provide a good overall view. The display, which is mostly for show, faces forward. We added “bumpers” to each corner of the screen to help protect it from falls and collisions. The first screen hit an end table and developed a crack, which convinced us that it needed some armor.

DIY Wheeled Robot RaspberryPi

To protect the range finder, we added a wooden bumper. Originally the range sensor had no protection, but after a few good hits we decided a bumper was a good idea. The range finder has proven to be sturdy, but the wires tend to fall off.

DIY Wheeled Robot RaspberryPi

Above is a back view. When we first built Number One, the components were attached entirely with electrical tape. While this worked surprisingly well, it did not look good. Most components are now bolted on or attached with leather to give the robot a more finished look.

DIY Wheeled Robot RaspberryPi

The RaspberryPi is attached at the front for easy access. The USB and other ports are easily reached, allowing for quick repairs. We use a wireless keyboard to control the RaspberryPi. While the robot is autonomous (it makes decisions on its own), the AI does not turn on when the robot first gets power; the robot only becomes active after we execute a command. The original model turned on automatically, but that proved to be a bit of a headache when something went wrong.

Robot layout

The above image is the layout design, created using software from Fritzing.org. This is a far simpler layout than what we made for Number Two and Number Three. We may add more sensors over time, but to enable a fast response and reduce power needs, we decided to keep the number of sensors to a minimum. Another difference is that we are not using an Arduino to control the movement (see the sketch below). For beginners this is a better design to learn with.
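
To show how little code direct motor control needs, here is a minimal sketch using the gpiozero library. The BCM pin numbers are illustrative assumptions, and Number One’s actual wiring may differ.

```python
# Minimal sketch: driving two wheel motors straight from the Pi with
# gpiozero. The BCM pin numbers below are illustrative assumptions.
from time import sleep

from gpiozero import Motor

left = Motor(forward=4, backward=14)
right = Motor(forward=17, backward=18)

left.forward(0.5)    # both wheels forward at half speed
right.forward(0.5)
sleep(2)
left.stop()
right.stop()
```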

Here is Number One in action! Come see it live at this year’s Bay Area Maker Faire! 

You can download the code from our GitHub.

Happy Creating!

Wiring of Number Two and Three

The HipMonster’s sister team decided to push our robotics to the next level. They were dissatisfied with remote-controlled robots that had no personality and pre-programmed robots that were predictable. What they wanted was a more independent android that could interact with and learn from its environment. While AI would drive this vision, just as important would be the sensors and mechanics that enable the robots to come to life.

To start upgrading Number Two and Number Three, we explored different wiring layouts using Fritzing. Fritzing is an open source software program that lets you design and prototype component layouts virtually. This is a great tool for experts and beginners alike and can save you time and money in developing your next electronic project. The images below are exported from Fritzing and show layouts for our improved robots.

Please note, this material is provided for informational purposes only and is not a guide on how to create the designs. Please take a look at our disclaimer.

Fritzing diagram of steampunk robots

The above image is the layout for the Arduino and the motors that allow the robots to move, as well as a decorative LED light. The linear actuators are controlled by H-bridges and the motors by relays. We use a 12-volt battery for power. The Arduino receives commands from a RaspberryPi, which controls the LED light and brings everything together. Written in C++, the code for the Arduino is based on our Walker code.

Sensor diagram for steampunk robot

The above image is the layout for the RaspberryPi and the sensors. The signal processing and AI, written in Python, live on the RaspberryPi. After much experimenting, we found it was best to have most sensors connected directly to the RaspberryPi and to dedicate the Arduino completely to movement. Here is a good tutorial on using a motion sensor with a RaspberryPi.
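
As a hedged example of that sensor-to-Pi approach, here is a minimal motion-sensor sketch using gpiozero. The pin number is an assumption, and on the real robots these events feed the AI rather than printing to the console.

```python
# Minimal sketch: a PIR motion sensor wired directly to the Pi.
# BCM pin 4 is an illustrative assumption; adjust to your wiring.
from signal import pause

from gpiozero import MotionSensor

pir = MotionSensor(4)

pir.when_motion = lambda: print("stimulus: motion detected")
pir.when_no_motion = lambda: print("stimulus ended")

pause()  # wait for sensor events
```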

While we wanted a robot with modern AI and technology, we still wanted a steampunk feel. So we decided to use wood for the baseboard, use vintage wiring techniques, and use leather to secure components and wires.

Computer parts for a robot

Once the layouts were finalized and the components acquired, we started exploring different arrangements for the baseboard. The baseboard is the most critical piece of our robots’ design. Not only does it secure all the electronics, it also provides structural support for the arm movements. While wiring the board, finding the right layout proved to be more of an art than a science. The electronics, power, wiring, and the robot’s skeleton all needed to fit together seamlessly, but often one or two components would refuse to play well with the others. The biggest issue was arranging the cabling to minimize stress on the connectors. For example, the HDMI slot needs to point downward or the stress will bend it over time. Number Two and Number Three also needed slightly different boards to work well with their different designs.

Wooden computer baseboard

Above is the final form of the baseboard with the mounting screws attached. Remember to test the sizing of the mounting screws on each component before attaching them to the board. Also make sure to double-check your measurements before drilling holes.

Wiring robot components together

Here we are wiring the board for Number Two. We found it was good to test each connection after it was attached to make sure the wires had a clean connection and would not come off. Wiring two or three wires is easy, but after wiring a larger number, mistakes creep in. If just one wire is in the wrong place or stripped incorrectly, you can spend hours tracking it down. Thankfully both the Arduino and RaspberryPi are forgiving, but the sensors are not; if you wire a sensor incorrectly it will overheat and burn out.

Here is another view of us wiring the board. Before attaching it to the robots, we tested everything repeatedly. Even our cat helped in the testing by batting the wires as the motors kicked in.

And here is Number Three with its new board in action! The colored circle indicates which sensor is receiving input. When the robot receives stimuli, it responds by either moving or speaking to try to encourage more stimuli.

Come see Number Three, Number Two, and more at this year’s Bay Area Maker Faire.

Happy Creating!