Until now, if you needed butter, you simply asked another human to “pass the butter,” leading to minor inconvenience and awkwardness. Engineering students in Brussels have come up with a novel solution: a robot that brings the butter to you!
The robot, inspired by Rick and Morty’s Butter Bot, is powered by an Arduino Uno and summoned to hungry humans via an infrared remote control.
When the signal is detected by onboard IR sensors, the robot drives over using servos modified for continuous rotation, then flips its cap-like lid to reveal the butter inside.
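Continuous-rotation servos don’t seek an angle the way standard hobby servos do; the pulse width instead sets speed and direction, with roughly 1500 µs as the stop point. Here’s a minimal sketch of that mapping in plain C++ (the pulse range and the `speedToPulseUs` helper are illustrative assumptions, not code from the project):

```cpp
#include <algorithm>

// Map a signed speed in [-100, 100] to a servo pulse width in microseconds.
// Continuous-rotation servos typically stop near 1500 us; shorter pulses
// spin one direction, longer pulses the other. Exact endpoints vary per servo.
int speedToPulseUs(int speed) {
    speed = std::clamp(speed, -100, 100);
    // 1000 us = full reverse, 1500 us = stop, 2000 us = full forward.
    return 1500 + speed * 5;
}
```

On an actual Arduino the resulting value would be handed to something like the Servo library’s `writeMicroseconds()`; clamping the input first keeps an out-of-range command from producing a pulse the servo can’t handle.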
Building robots can be difficult, and if you want to construct something humanoid, designing the mechanics alone can be a significant task. ASPIR, which stands just over four feet tall, looks like a great place to start.
John Choi’s 3D-printed robot can move its arms, legs, and head via 33 servo motors, all controlled by an Arduino Mega, along with a servo shield.
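With 33 servos on one Mega, the wiring and software both benefit from a central table mapping each joint to its shield channel, plus a guard that keeps commanded angles inside each joint’s mechanical range. A hedged sketch of that idea in plain C++ (the joint names, channel numbers, and limits below are invented for illustration, not taken from the ASPIR files):

```cpp
#include <map>
#include <string>

// Hypothetical joint-to-channel table for a servo shield.
std::map<std::string, int> jointChannel = {
    {"head_pan", 0}, {"left_shoulder", 1}, {"right_shoulder", 2},
    {"left_elbow", 3}, {"right_elbow", 4},
};

// Clamp a requested angle to a joint's safe mechanical range before
// sending it to the shield, so a bad command can't strain the frame.
int safeAngle(int requested, int minDeg, int maxDeg) {
    if (requested < minDeg) return minDeg;
    if (requested > maxDeg) return maxDeg;
    return requested;
}
```

Keeping this lookup in one place means re-wiring a servo only requires changing one table entry rather than hunting through the motion code.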
The documentation found here is excellent; however, it comes with a warning that this is a very advanced project, taking several months to build along with $2,500 in parts. Even if you’re not willing to make that commitment, it’s worth checking out for inspiration; perhaps parts of the ASPIR could be adapted to your own design!
With Halloween around the corner, hackers are gearing up for festivals and trick-or-treaters, hoping to spook visitors or simply impress others with their automation prowess. DIY bloggers Ash and Eileen are no different, and decided to enter a local scarecrow contest in the “Out of this World” category. Their entry? Moo-Bot, an Arduino-powered sheet metal cow that looks like it came straight off the set of a 1950s sci-fi flick.
Not that that is a bad thing; somehow this retro-futuristic bovine looks quite interesting. Making it even better is that the robotic cow’s eyes are made out of two OLED displays, and that it can interact with observers through an internal speaker.
When someone presses a button on its nose, the onboard Uno powers up and plays a pre-recorded series of cow jokes via an MP3 player module. Power is supplied by eight D batteries, which is enough to keep the Moo-Bot going for a few months.
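Since Moo-Bot works through a series of jokes one button press at a time, the core of the playback logic is just selecting the next MP3 track and wrapping back to the first when the list runs out. A minimal sketch of that selection logic (the function name and track count are assumptions for illustration):

```cpp
// Advance to the next pre-recorded joke track, wrapping around at the
// end so the robot never runs out of material. Tracks are 0-indexed.
int nextJokeTrack(int currentTrack, int trackCount) {
    return (currentTrack + 1) % trackCount;
}
```

On the real hardware, the returned index would be passed to whatever play-track command the MP3 module’s library provides.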
While Halloween might seem a long way off to most people, if you’re looking to make an amazing automated display, it’s time to start planning! One idea would be an automated skeleton robot like SKELLY.
This particular robot was built using an Arduino Mega, a Cytron PS2 Shield, a modified sensor shield, and a wireless PS2 controller. SKELLY is equipped with a total of eight servos: six for bending his shoulders, elbows and wrists, one for running his mouth, and another for turning his head. There is also a pair of LEDs for eyes, and a small motor in his head with a counterweight that allows him to shake.
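SKELLY’s servos follow the wireless PS2 controller, which means rescaling the controller’s analog stick readings (0–255, with 128 at center) into servo angles (0–180 degrees). A minimal sketch of that linear rescaling, mirroring what Arduino’s `map()` function does (the helper name is an assumption; SKELLY itself is built in Visuino rather than hand-written code):

```cpp
// Rescale a PS2 analog stick reading (0-255, center at 128) to a
// standard servo angle (0-180 degrees) with a simple linear map.
int stickToAngle(int stickValue) {
    // 0 -> 0 degrees, 255 -> 180 degrees.
    return stickValue * 180 / 255;
}
```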
SKELLY is programmed using the Visuino visual programming environment. As seen in the videos below, the robot–which is the author’s first–is quite nimble, waving and moving along with an automatic piano!
Given the shortage of people in Belgium capable of turning written or spoken words into sign language, University of Antwerp master’s students Guy Fierens, Stijn Huys, and Jasper Slaets have decided to do something about it. They built a robot known as Aslan, or Antwerp’s Sign Language Actuating Node, that can translate text into finger-spelled letters and numbers.
Project Aslan–now in the form of a single robotic arm and hand–is made from 25 3D-printed parts and uses an Arduino Due, 16 servos, and three motor controllers. Because of its 3D-printed parts and readily available components, the low-cost design can be produced locally.
The robot works by receiving information from a local network, and checking for updated sign languages from all over the world. Users connected to the network can send messages, which then activate the hand, elbow, and finger joints to spell out the messages.
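Since Aslan finger-spells text one character at a time, the heart of such a system is a lookup from each letter to a set of joint targets for the hand’s servos. A hedged sketch of that data structure in plain C++ (the angles below are invented placeholders, not the project’s real calibration; 0 means a finger fully extended, 90 fully curled):

```cpp
#include <array>
#include <map>

using HandPose = std::array<int, 5>; // thumb, index, middle, ring, pinky

// Illustrative poses for two finger-spelled letters.
std::map<char, HandPose> fingerspell = {
    {'a', {0, 90, 90, 90, 90}},  // fist with thumb out
    {'b', {90, 0, 0, 0, 0}},     // flat hand, thumb tucked
};

// Look up the pose for a letter, falling back to a neutral open hand
// for characters the table doesn't cover.
HandPose poseFor(char letter) {
    auto it = fingerspell.find(letter);
    if (it == fingerspell.end()) return {0, 0, 0, 0, 0};
    return it->second;
}
```

Iterating `poseFor` over an incoming message, with a pause between letters, gives the basic spell-out loop; the elbow and wrist joints would get similar per-letter targets.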
We present a computational design system that allows novices and experts alike to easily create custom robotic devices. The core of our work consists of a design abstraction that models the way in which electromechanical components can be combined to form complex robotic systems. We use this abstraction to develop a visual design environment that enables an intuitive exploration of the space of robots that can be created using a given set of actuators, mounting brackets and 3D-printable components. Our computational system also provides support for design auto-completion operations, which further simplifies the task of creating robotic devices. Once robot designs are finished, they can be tested in physically simulated environments and iteratively improved until they meet the individual needs of their users.