This is a ‘Spot Micro’ walking quadruped robot running on Raspberry Pi 3B. By building this project, redditor /thetrueonion (aka Mike) wanted to teach themselves robotic software development in C++ and Python, get the robot walking, and master velocity and directional control.
Mike was inspired by Spot, one of Boston Dynamics’ robots developed for industry to perform remote operation and autonomous sensing.
The mini ‘Spot Micro’ bot rocks a three-axis angle command/body pose control mode via keyboard and can achieve ‘trot gait’ or ‘walk gait’. The former is a four-phase gait with symmetric motion of two legs at a time (like a horse trotting). The latter is an eight-phase gait with one leg swinging at a time and a body shift in between for balance (like humans walking).
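The two gaits can be pictured as sequences of leg-swing phases. Here is an illustrative encoding in Python; the leg names, phase ordering, and diagonal pairing are assumptions for illustration, not taken from Mike's code.

```python
# Illustrative gait phase tables for a quadruped.
# Legs: RF (right front), LF (left front), RH (right hind), LH (left hind).
# The exact ordering is an assumption, not Mike's actual implementation.

# Trot gait: four phases, two legs swinging together (here as diagonal
# pairs), alternating with full-support phases.
TROT_PHASES = [
    {"swing": ("RF", "LH")},   # one diagonal pair swings
    {"swing": ()},             # all four feet down (support)
    {"swing": ("LF", "RH")},   # opposite diagonal pair swings
    {"swing": ()},             # support
]

# Walk gait: eight phases, one leg swinging at a time with a body
# shift (no swinging leg) in between for balance.
WALK_PHASES = [
    {"swing": ("RH",)}, {"swing": ()},   # swing one leg, then shift body
    {"swing": ("RF",)}, {"swing": ()},
    {"swing": ("LH",)}, {"swing": ()},
    {"swing": ("LF",)}, {"swing": ()},
]

def legs_in_stance(phase, legs=("RF", "LF", "RH", "LH")):
    """Legs supporting the body during a given phase."""
    return tuple(l for l in legs if l not in phase["swing"])
```

During trot, the robot always has at least two feet down; during walk, at least three — which is why the walk gait needs the extra body-shift phases but is more stable.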
Mike breaks down how they got the robot walking, right down to the order the servos need to be connected to the PCA9685 control board, in this extensive walkthrough.
Here’s the code
And yes, this is one of those magical projects with all the code you need stored on GitHub. The software is implemented on a Raspberry Pi 3B running Ubuntu 16.04. It’s composed of C++ and Python nodes in a ROS framework.
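The PCA9685 board mentioned above is a 16-channel, 12-bit PWM driver, so commanding a servo comes down to converting an angle into the chip's tick count. A minimal sketch of that conversion, assuming a 50 Hz servo frame and the typical 1.0–2.0 ms hobby-servo pulse range (real servos need per-joint calibration, and these constants are not from Mike's walkthrough):

```python
# Convert a servo angle to a PCA9685 12-bit on-count.
# The PCA9685 divides each PWM period into 4096 ticks; at the usual
# 50 Hz servo frame rate one period is 20 ms. Typical hobby servos map
# roughly 1.0-2.0 ms pulses to 0-180 degrees -- these endpoints are an
# assumption for illustration.

FRAME_MS = 20.0      # 50 Hz PWM period
RESOLUTION = 4096    # 12-bit counter
MIN_PULSE_MS = 1.0   # assumed pulse width at 0 degrees
MAX_PULSE_MS = 2.0   # assumed pulse width at 180 degrees

def angle_to_ticks(angle_deg: float) -> int:
    """Map a 0-180 degree command to the PCA9685 compare value."""
    angle = max(0.0, min(180.0, angle_deg))
    pulse_ms = MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle / 180.0
    return round(pulse_ms / FRAME_MS * RESOLUTION)
```

With these constants, 0° lands near tick 205, 90° near 307, and 180° near 410 — a small slice of the 4096-tick period, which is why the 12-bit resolution matters for smooth leg motion.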
Mike isn’t finished yet: they are looking to improve their yellow beast by incorporating a lidar to achieve simple 2D mapping of a room. Also on the list is developing an autonomous motion-planning module to guide the robot to execute a simple task around a sensed 2D environment. And finally, adding a camera or webcam to conduct basic image classification would finesse their creation.
As a part of their master’s program at the University of Stuttgart, Jan Ingo Haller and Lorin Samija created a robotic pet that moves in a manner that may not be immediately evident. With the internals obscured by a cloth covering, the moving OLOID, or mOLOID, seems to roll from one vague lobe section to another like some sort of claymation creature.
The mOLOID’s unique locomotion is due to an internal “oloid” structure, an arrangement of two circles at 90°. Two servos move weights around the perimeter of each circle to vary its center of gravity, causing it to flop back and forth.
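The flopping can be understood by tracking the combined centre of gravity of the two moving weights. A simplified sketch, assuming an oloid frame of two radius-r circles in perpendicular planes (each passing through the other's centre), equal point-mass weights, and a massless frame — all illustrative assumptions, not the students' model:

```python
import math

# Combined centre of gravity of two equal weights riding the two
# circles of an oloid frame. Geometry: circle A in the x-y plane
# centred at the origin, circle B in the x-z plane centred at (R, 0, 0),
# so each circle passes through the other's centre.

R = 1.0  # circle radius (arbitrary units)

def weight_on_circle_a(theta):
    """Position of the weight driven around circle A."""
    return (R * math.cos(theta), R * math.sin(theta), 0.0)

def weight_on_circle_b(phi):
    """Position of the weight driven around circle B."""
    return (R + R * math.cos(phi), 0.0, R * math.sin(phi))

def combined_cog(theta, phi):
    """Midpoint of the two equal weights. Steering this point around
    is what makes the shell tip and 'flop' from lobe to lobe."""
    a, b = weight_on_circle_a(theta), weight_on_circle_b(phi)
    return tuple((ai + bi) / 2.0 for ai, bi in zip(a, b))
```

Each servo only changes one angle, yet together they can place the centre of gravity anywhere in a 2D region — enough to tip the body in a chosen direction.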
An Arduino Uno controls the mOLOID, which features a passive infrared sensor that allows it to react to the environment and an HC-05 Bluetooth module for user interface. A small speaker also provides audible feedback.
Corona has changed our lives: it requires us to physically distance, which in turn leads to social distancing. So what could be a solution? Maybe a pet? But no, Corona comes from animals. Let’s save ourselves from another Corona 2.0. But if we have to keep away from humans (to not infect and not be infected) and animals but remain the social beings we are, what should we do?
Don’t despair! We have found a solution: the moving OLOID a.k.a. mOLOID. It combines interesting geometry (a bit nerdy but nerdy is trendy!) with many aspects of pets: it can make you smile, moves on its own, makes cute sounds and listens to you — at least most of the time.
While 2020 may seem like a very futuristic year, we still don’t have robotic maids like the Jetsons’ Rosie the Robot. For his latest element14 Presents project, DJ Harrigan decided to create such a bot as a sort of animatronic character, using an ESP8266 board for interface and overall control, and a MKR ZERO to play stored audio effects.
The device features a moveable head, arms and eyes, and even has a very clever single-servo gear setup to open and close its mouth.
UI is via smartphone running a Blynk app, and Rosie’s antennas can light up along with a “beep beep” sound to let you know it needs your attention!
The Physical Twin travels on a three-wheeled chassis and mounts a four-axis arm with a brush. An operator controls the arm to dip the brush into an onboard paint container, and can then manipulate it for application.
The controller consists of a joystick for movement as well as a mini version of the arm. Four potentiometers measure arm input angles, which are duplicated on four corresponding servos on the robot. A pair of Arduino Mega boards are used for the setup — one on the mobile robot and another in the remote unit.
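The core of that mirroring is rescaling each pot's ADC reading to a servo angle. A minimal sketch in Python for illustration (the actual firmware runs on the two Arduino Megas; the 10-bit ADC range matches the Mega, the rest is assumed):

```python
# Mirror potentiometer readings onto servo angles, as in the mini
# controller arm: each pot's 10-bit ADC value (0-1023 on an Arduino
# Mega) is rescaled to a 0-180 degree servo command.

ADC_MAX = 1023   # 10-bit ADC full scale
SERVO_MAX = 180  # degrees

def pot_to_servo_angle(adc_value: int) -> int:
    """Rescale one joint's pot reading to its servo command."""
    adc_value = max(0, min(ADC_MAX, adc_value))
    return round(adc_value * SERVO_MAX / ADC_MAX)

def mirror_arm(adc_readings):
    """Map all four joint pots to their four servo commands."""
    return [pot_to_servo_angle(v) for v in adc_readings]
```

Recording a sequence of these four-angle frames over time is also all that's needed to play back prerecorded movements.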
You can see the device in action in the videos below, showing off direct operation and the ability to play back prerecorded movements.
One of the simplest ways to make a mobile robot involves differential steering, where two wheels move at different speeds as needed to turn, and a roller on the back keeps it from tipping over. The MrK_Blockvader is an excellent take on this type of bot, demonstrated in the first clip below. It features a nice blocky body composed of 3D-printed parts, wheels driven by tiny gear motors, and an integrated roller ball on the back.
The MrK_Blockvader is controlled via an Arduino Nano, along with an nRF24 breakout that allows it to receive signals from a radio transmitter unit. The build includes LED lighting as well as a piezo buzzer for all the beeps and boops. It can also take advantage of various sensors if necessary.
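Differential steering boils down to mixing a forward command and a turn command into two motor speeds. A common mixing scheme, sketched in Python for illustration (names and clamping are assumptions, not the MrK_Blockvader firmware):

```python
# Differential drive mixing: turn by running the two wheels at
# different speeds. Inputs and outputs are normalised to [-1, 1].

def mix(throttle: float, turn: float):
    """Mix throttle and turn into (left, right) motor speeds."""
    left = throttle + turn
    right = throttle - turn
    # Rescale so neither wheel is commanded beyond full speed.
    peak = max(1.0, abs(left), abs(right))
    return left / peak, right / peak
```

Full throttle with no turn drives both wheels equally; a pure turn command spins the wheels in opposite directions, pivoting the robot in place on its roller ball.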
The eventual goal is to use the MrK_Blockvader in a network of robots, hinted at in the second video with a worker at its side.
Mechanical table hockey games, where players are moved back and forth and swing their sticks with a series of knobs, can be a lot of fun; however, could one be automated? As Andrew Khorkin’s robotic build demonstrates, the answer is a definite yes — using an Arduino Mega and a dozen stepper motors to score goals on a human opponent.
The project utilizes an overhead webcam to track the position of the players and puck on the rink, with a computer used for object detection and gameplay. Each player is moved with two steppers, one of which pushes the control rod in and out, while the other twists the player to take shots.
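The two-stepper drive means each player command is just a pair of step targets: one for rod translation, one for rotation. A toy sketch of that conversion, with made-up resolution constants purely for illustration (Khorkin's actual step ratios are not given):

```python
# Each table-hockey player is driven by two stepper motors: one
# translates the control rod, one rotates the player. Convert a desired
# rod position and stick angle into step targets. The steps-per-mm and
# steps-per-degree figures below are assumptions for illustration.

STEPS_PER_MM = 10.0    # assumed linear drive resolution
STEPS_PER_DEG = 2.0    # assumed rotary drive resolution

def player_targets(rod_mm: float, stick_deg: float):
    """Return (translation_steps, rotation_steps) for one player."""
    return round(rod_mm * STEPS_PER_MM), round(stick_deg * STEPS_PER_DEG)
```

The gameplay logic then only has to decide where each player should be and how hard to swing — the vision system supplies the puck position, and the steppers take care of the rest.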
Training the system took six months of work, which really shows in the impressive gameplay seen below.