Animals like dogs, cats, raccoons, rhinoceroses, and many more get around on four legs. To help imitate this natural phenomenon, maker “Technovation” decided to create a low-cost quadruped robot using 12 servo motors and a variety of 3D-printed and laser-cut parts.
Each leg features two servos that move inline with the body, as well as one arranged with its rotation axis at 90 degrees. This enables it to walk forward, scoot side-to-side, and perform a variety of twisting motions.
The robot is powered by an Arduino Uno, along with a sensor shield for easy motor connections. Inverse kinematics, integrated into the device’s control sketch, calculates the proper servo movements for each leg.
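The actual math lives in the robot’s Arduino sketch, but the core inverse kinematics idea is easy to sketch. Below is a minimal, hypothetical Python version for a two-servo leg using the law of cosines; the 50 mm link lengths are placeholder values, not Technovation’s real dimensions.

```python
import math

def leg_ik(x, y, l1=50.0, l2=50.0):
    """Two-link planar inverse kinematics via the law of cosines.

    Returns (hip, knee) angles in degrees for a foot target (x, y)
    measured from the hip joint. Link lengths l1/l2 (mm) are purely
    illustrative -- substitute your robot's real dimensions.
    """
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Interior knee angle from the law of cosines
    knee = math.acos((l1 * l1 + l2 * l2 - d2) / (2 * l1 * l2))
    # Hip angle = direction to target minus the offset from the knee bend
    hip = math.atan2(y, x) - math.asin(l2 * math.sin(knee) / d)
    return math.degrees(hip), math.degrees(knee)
```

With the leg pointing straight down at full extension, `leg_ik(0, 100)` gives a 90° hip and a 180° (fully straight) knee.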
After studying the way a worm wiggles, Nicholas Lauer decided to create his own soft robotic version. What he came up with uses an Arduino Uno for control, inflating six 3D-printed segments sequentially in order to generate peristaltic motion for forward movement.
The robotic worm uses a 12V mini diaphragm pump to provide inflation air, while a series of transistors and solenoid valves directly regulate the airflow into the chambers.
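The sequencing itself is simple to reason about: inflate each chamber in turn while deflating the one behind it, producing a wave that travels down the body. Here’s a hedged Python sketch of that schedule; the six-segment count comes from the build, but the inflate-one/deflate-previous pattern and any timing are my assumptions to tune, per Lauer’s own advice to experiment.

```python
SEGMENTS = 6  # number of inflatable chambers in the build

def peristaltic_schedule(cycles=1):
    """Yield (segment_index, action) pairs producing a front-to-back
    peristaltic wave: inflate each chamber in turn while deflating
    the previous one. On real hardware each action would toggle a
    transistor-driven solenoid valve."""
    for _ in range(cycles):
        for seg in range(SEGMENTS):
            yield seg, "inflate"
            yield (seg - 1) % SEGMENTS, "deflate"

schedule = list(peristaltic_schedule())
```

One full cycle is 12 valve events; varying the dwell time between them is exactly the kind of timing experiment the write-up encourages.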
The build looks pretty wild in the video below, and per Lauer’s write-up, you’re encouraged to experiment to see what kind of timing produces the most expedient motion. Code, STLs, and a detailed BOM are available on GitHub.
Sourino — which comes from the French word for mouse, “souris,” plus Arduino — is a small robot by 11-year-old maker Electrocat, meant to entertain kitties and kids alike.
The device features a 3D-printed body roughly shaped like a mouse, controlled by a Nano along with three HC-SR04 ultrasonic sensors poking out for autonomous navigation. An IR sensor is implemented for remote operation, and two small gearmotors with a driver board enable it to move around on the floor.
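Sourino’s navigation runs in the Nano’s own sketch, but the decision logic for three forward-facing range sensors has a familiar shape. This is a minimal, hypothetical Python rendering of that idea; the 20 cm threshold and the left/right preference are assumptions, not Electrocat’s actual code.

```python
def steer(left_cm, center_cm, right_cm, threshold=20.0):
    """Pick a motion command from three HC-SR04 range readings (cm).

    If the path ahead is clear, drive forward; otherwise turn toward
    whichever side reports more free space. The threshold value is
    illustrative -- tune it for your robot and floor plan."""
    if center_cm > threshold:
        return "forward"
    if left_cm > right_cm:
        return "turn_left"
    return "turn_right"
```

Each command would then map to a direction on the two gearmotors via the driver board.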
As seen in the video below, Sourino is able to travel a path made out of books and interact with (more like drive crazy!) the house cat. Full build instructions are found here, including a parts list, Arduino code, and CAD files.
MABEL has individually articulated legs to enhance off-road stability, prevent it from tipping, and even make it jump (if you use some really fast servos). Harry is certain that anyone with a 3D printer and a “few bits” can build one.
MABEL builds on the open-source YABR project for its PID controller, adding servos and a Raspberry Pi to interface with them and control everything.
Thanks to a program based on the open-source YABR firmware, an Arduino handles all of the PID calculations using data from an MPU-6050 accelerometer/gyro. Raspberry Pi, using Python code, manages Bluetooth and servo control, running an inverse kinematics algorithm to translate the robot’s legs precisely in two axes.
IKSolve is the class that handles the inverse kinematics functionality for MABEL (IKSolve.py) and allows the legs to be translated using (x, y) coordinates. It’s really simple to use: all you need to specify are the home values of each servo (these are the angles that, when passed to your servos, make the legs point straight down, at 90 degrees).
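The home-angle idea can be sketched without the trig: the IK solution is computed relative to the straight-down reference pose, then re-centered on each servo’s calibrated home angle. This is a minimal stand-in for that calibration step, not the actual IKSolve.py code, whose conventions may differ.

```python
class LegHome:
    """Mirrors the calibration idea behind MABEL's IKSolve: you supply
    each servo's 'home' angle (the angle at which the leg points
    straight down), and solved angles are expressed relative to it."""

    def __init__(self, hip_home, knee_home):
        self.hip_home = hip_home
        self.knee_home = knee_home

    def to_servo(self, hip_ik, knee_ik):
        # A two-link IK solve returns 90 deg (hip) / 180 deg (knee)
        # for the straight-down pose, so subtract that reference and
        # re-center on the calibrated home angles.
        return (self.hip_home + (hip_ik - 90.0),
                self.knee_home + (knee_ik - 180.0))
```

So a leg whose home pose happens to sit at 75°/110° on its particular servos still reports exactly those angles when the IK asks for “straight down.”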
MABEL is designed to work by listening to commands on the Arduino (PID controller) end that are sent to it by Raspberry Pi over serial using pySerial. Joystick data is sent to Raspberry Pi using the Input Python library. Harry first tried to get the joystick data from an old PlayStation 3 controller, but went with the PiHut’s Raspberry Pi Compatible Wireless Gamepad in the end for ease.
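The Pi-to-Arduino link is just line-delimited text over a serial port. Here’s a hedged sketch of that plumbing: the `P,x,y` framing is an invented example (MABEL’s real protocol lives in Harry’s code), but the pySerial usage pattern is standard.

```python
def encode_pose(x, y):
    """Pack an (x, y) leg target into a newline-terminated ASCII
    message. The 'P,' command framing is illustrative only -- match
    whatever the Arduino sketch actually parses."""
    return f"P,{x:.1f},{y:.1f}\n".encode("ascii")

# On the Pi side you would push each frame out with pySerial:
#   import serial
#   with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as link:
#       link.write(encode_pose(20.0, -10.0))
```

Keeping the encoder separate from the port handling makes the protocol easy to unit-test without hardware attached.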
Keep up with Harry’s blog or give Raspibotics a follow on Twitter, as part 3 of his build write-up should be dropping imminently, featuring updates that will hopefully get MABEL jumping!
Donald wanted users to get as much interaction and feedback as possible, rather than simply pressing a button and receiving a random drink. So with this machine, the interaction comes in four ways: instructions provided on the screen, using a key card to bypass security, placing and removing a cup on the tray, and entering an order number on the keypad.
In addition to that, feedback is provided by way of lighting changes, music, video dialogue, pump motors whirring, and even the clicks of relays at each stage of the cocktail making process.
Ordering on the keypad
The keypad allows people to punch in a number to trigger their order, like on a vending machine. The drink order is sent to the Hello Drinkbot software running on the Raspberry Pi 3B that controls the pumps.
Getting your cup filled
So that the machine can tell when a vessel is placed under the dispenser spout, and when it’s removed, Donald built a switch into a 3D-printed tray. Provided the vessel has at least one ice cube in it, even the lightest plastic cup is heavy enough to trigger the switch.
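A mechanical switch under a tray will chatter as the cup settles, so in practice you’d debounce it before trusting a “cup present” reading. This is a generic debounce sketch, not Donald’s code; the three-sample window is an arbitrary illustrative value.

```python
def cup_present(samples, n=3):
    """Treat the cup as present only after n consecutive 'pressed'
    reads from the tray switch, filtering out contact chatter as the
    cup (and its ice cube!) settles. n=3 is illustrative -- tune it
    against your polling rate."""
    run = 0
    for pressed in samples:
        run = run + 1 if pressed else 0
        if run >= n:
            return True
    return False
```

The same logic inverted (n consecutive released reads) tells the machine when the drink has been taken away.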
The RFID card reader
Cocktail machine customers are asked to scan a special ID card to start. To make this work, Donald adapted a sample script that blinks the card reader’s internal LED when any RFID card is detected.
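The gatekeeping step reduces to comparing a scanned UID against the cards the machine knows. This hypothetical snippet shows that check in isolation; the UID value is invented, and in the real build the UID would come from an RFID reader library, with the reader’s LED blinked as feedback.

```python
AUTHORIZED_UIDS = {"04A2B6C1"}  # example card UID -- not a real one

def card_accepted(uid):
    """Return True if the scanned card UID unlocks the machine.
    UIDs are compared case-insensitively since readers report hex
    in varying cases."""
    return uid.upper() in AUTHORIZED_UIDS
```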
Interactive video screen
This bit is made possible by MP4Museum, a “bare-bones” kiosk video player software that the second Raspberry Pi inside the machine runs on boot. By connecting a switch to the Raspberry Pi’s GPIO, Donald enabled customers to advance through the videos one by one. And yes, that’s an official Raspberry Pi Touch Display.
The Hello Drinkbot ‘bartender’
Donald used the Python-based Hello Drinkbot software as the brains of the machine. With it, you can configure which liquors or juices are connected to which pumps, and send instructions on exactly how much to pour of each ingredient. Everything is configured via a web interface.
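The core of any Drinkbot-style pour is converting a recipe into per-pump run times from a calibrated flow rate. This sketch is an illustration of that calculation, not Hello Drinkbot’s actual code; the 1.9 ml/s rate, recipe, and pump channels are all made-up example values.

```python
PUMP_RATE_ML_S = 1.9  # assumed peristaltic pump flow rate -- calibrate!

RECIPE = {"gin": 45, "tonic": 90}   # ml per ingredient (example)
PUMPS = {"gin": 1, "tonic": 2}      # ingredient -> pump channel (example)

def pour_plan(recipe, pumps, rate=PUMP_RATE_ML_S):
    """Translate a recipe (ml per ingredient) into (pump channel,
    seconds-on) pairs -- the step a Drinkbot-style controller performs
    before firing each pump's relay."""
    return [(pumps[name], round(ml / rate, 2)) for name, ml in recipe.items()]
```

With a calibrated rate, “exactly how much to pour” becomes “exactly how long to run each relay.”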
Via a bank of relays, microcontrollers connect all the signals from the Touch Display, keypad, RFID card reader, and switch under the spout.
Donald shared an exhaustive kit list on his original post, but basically, what you’re looking at is…
This is a ‘Spot Micro’ walking quadruped robot running on Raspberry Pi 3B. By building this project, redditor /thetrueonion (aka Mike) wanted to teach themselves robotic software development in C++ and Python, get the robot walking, and master velocity and directional control.
Mike was inspired by Spot, one of Boston Dynamics’ robots developed for industry to perform remote operation and autonomous sensing.
The mini ‘Spot Micro’ bot rocks a three-axis angle command/body pose control mode via keyboard and can achieve ‘trot gait’ or ‘walk gait’. The former is a four-phase gait with symmetric motion of two legs at a time (like a horse trotting). The latter is an eight-phase gait with one leg swinging at a time and a body shift in between for balance (like humans walking).
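The two gaits can be written down as phase tables: which legs swing in each phase, with stance or body-shift phases in between. This is my reading of the description above rendered as a hypothetical Python sketch, with the walk’s leg order as an assumption, not Mike’s actual gait code.

```python
# Leg labels: FL/FR = front left/right, RL/RR = rear left/right.

# Trot: four phases, with diagonal leg pairs swinging together
# (like a horse trotting); empty tuples are stance phases.
TROT_PHASES = [("FL", "RR"), (), ("FR", "RL"), ()]

# Walk: eight phases, one leg swinging at a time with a balancing
# body shift in between (like humans walking). The specific leg
# order here is an assumption.
WALK_PHASES = []
for leg in ("RL", "FL", "RR", "FR"):
    WALK_PHASES.append((leg,))  # swing one leg forward
    WALK_PHASES.append(())      # shift the body for balance
```

A gait controller would then step through the active table at a fixed rate, feeding each phase’s foot targets into the inverse kinematics.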
Mike breaks down how they got the robot walking, right down to the order the servos need to be connected to the PCA9685 control board, in this extensive walkthrough.
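The walkthrough is the authority on the real channel ordering; the sketch below only illustrates the two pieces involved: a servo-to-channel map for the PCA9685’s 16 channels, and the standard angle-to-pulse-width conversion. The naming scheme and 500–2500 µs endpoints are assumptions, so calibrate against your own servos.

```python
# Example channel assignment for 12 servos (3 joints x 4 legs).
# The actual order matters and is spelled out in Mike's walkthrough;
# this mapping is illustrative only.
LEG_NAMES = ("FL", "FR", "RL", "RR")
JOINT_NAMES = ("hip", "upper", "lower")
SERVO_CHANNEL = {f"{leg}_{joint}": i
                 for i, (leg, joint) in enumerate(
                     (l, j) for l in LEG_NAMES for j in JOINT_NAMES)}

def angle_to_pulse_us(angle, min_us=500.0, max_us=2500.0):
    """Map 0-180 degrees to a hobby-servo pulse width in microseconds.
    The 500/2500 us endpoints are typical but servo-specific."""
    return min_us + (max_us - min_us) * angle / 180.0
```

A driver loop would then write `angle_to_pulse_us(angle)` to `SERVO_CHANNEL[name]` on the PCA9685 for each joint.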
Here’s the code
And yes, this is one of those magical projects with all the code you need stored on GitHub. The software is implemented on a Raspberry Pi 3B running Ubuntu 16.04. It’s composed of C++ and Python nodes in a ROS framework.
Mike isn’t finished yet: they are looking to improve their yellow beast by incorporating a lidar to achieve simple 2D mapping of a room. Also on the list is developing an autonomous motion-planning module to guide the robot to execute a simple task around a sensed 2D environment. And finally, adding a camera or webcam to conduct basic image classification would finesse their creation.