Tag Archives: robots

GoodBoy is a robot dog that runs on Arduino

via Arduino Blog

Daniel Hingston had wanted to build a four-legged walking robot for several years, and with current coronavirus restrictions he finally got his chance. His 3D-printed robodog, dubbed “GoodBoy,” is reminiscent of a miniature version of Boston Dynamics’ Spot, which helped inspire the project.

It’s extremely clean, with wiring integrated into the legs mid-print. Two micro servos per leg move it in a forward direction, controlled by an Arduino Uno.

Obstacle avoidance is provided by a pair of ultrasonic sensor “eyes,” allowing it to stop when something is in its path. An LDR sensor is also implemented; when its human minder covers it, the robot presents its paw for shaking.
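To give a flavour of how that logic might hang together, here is a minimal Arduino-style sketch: read the ultrasonic sensor, stop when something is close, and offer a paw when the LDR goes dark. The pin numbers, thresholds, and HC-SR04-style sensor are assumptions for illustration; this is not Hingston's actual firmware.

```cpp
// Illustrative sketch only: pins, thresholds, and the HC-SR04-style sensor
// are assumptions, not taken from the GoodBoy build.
#include <Servo.h>

const int TRIG_PIN = 2;      // ultrasonic "eyes"
const int ECHO_PIN = 3;
const int LDR_PIN  = A0;     // light-dependent resistor for the paw trick
Servo frontLeftShoulder;     // one of the eight micro servos (two per leg)

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // microseconds, 30 ms timeout
  return duration / 58;                            // rough conversion to cm
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  frontLeftShoulder.attach(9);
}

void loop() {
  long distance = readDistanceCm();
  int light = analogRead(LDR_PIN);
  if (distance > 0 && distance < 15) {
    // Obstacle ahead: stop walking.
  } else if (light < 200) {
    // LDR covered by a hand: raise a front leg to "shake".
    frontLeftShoulder.write(150);
  } else {
    // Otherwise step through the walking gait (omitted here).
  }
  delay(50);
}
```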

Be sure to check out a short demo of GoodBoy below! 

Meet your new robotic best friend: the MiRo-E dog

via Raspberry Pi

When you’re learning a new language, it’s easier the younger you are. But how can we show very young students that learning to speak code is fun? Consequential Robotics has an answer…

The MiRo-E is an ‘emotionally engaging’ robot platform that was created on a custom PCB and has since moved onto Raspberry Pi. The creators made the change because they saw that schools were more familiar with Raspberry Pi and realised the potential in being able to upgrade the robotic learning tools with new Raspberry Pi boards.

The MiRo-E was born from a collaboration between Sheffield Robotics, London-based SCA design studio, and Bristol Robotics Lab. The cute robo-doggo has been shipping with a Raspberry Pi 3B+ (it works well with the Raspberry Pi 4 too) for over a year now.

While the robot started as a developers’ tool (MiRo-B), the creators completely re-engineered MiRo’s mechatronics and software to turn it into an educational tool purely for the classroom environment.

MiRo-E with students at a school in North London, UK

MiRo-E can see, hear, and interact with its environment, providing endless programming possibilities. It responds to human interaction, making it a fun, engaging way for students to learn coding skills. If you stroke it, it purrs, lights up, moves its ears, and wags its tail. Making a sound or clapping makes MiRo move towards you, or away if it is alarmed. And it especially likes movement, following you around like a real, loyal canine friend. These functionalities are just the basic starting point, however: students can make MiRo do much more once they start tinkering with their programmable pet.

These opportunities are provided on MiRoCode, a user-friendly web-based coding interface, where students can run through lesson plans and experiment with new ideas. They can test code on a virtual MiRo-E to create new skills that can be applied to a real-life MiRo-E.

What’s inside?

The full technical specs are available online. But basically, MiRo-E comprises a Raspberry Pi 3B+ at its core, light sensors, cliff sensors, an HD camera, and a variety of connectivity options.

How does it interact?

MiRo reacts to sound, touch, and movement in a variety of ways. Twenty-eight capacitive touch sensors tell it when it is being petted or stroked. Six independent RGB LEDs allow it to show emotion, and multiple degrees of freedom (DOF) let it move its eyes, tail, and ears. Its ears also house four 16-bit microphones and a loudspeaker. And two differential drive wheels with opto-sensors help MiRo move around.

What else can it do?

The ‘E’ bit of MiRo-E means it’s emotionally engaging, and the intelligent pet’s potential in healthcare has already been explored. Interaction with animals has proven to be positive for patients of all ages, but sometimes it’s not possible for ‘real’ animals to comfort people. MiRo-E can fill the gap for young children who would benefit from animal comfort, but where healthcare or animal welfare risks are barriers.

The same researchers who created this emotionally engaging robo-dog for young people are also working with project partners in Japan to develop ‘telepresence robots’ for older patients to interact with their families over video calls.

This robot looks like a ball and transforms itself into a quadruped to move

via Arduino Blog

Gregory Leveque has created an adorable 3D-printed robot that not only walks on four legs, but folds up into a ball when not in use. 

To accomplish this, the round quadruped utilizes one servo to deploy each leg via a parallelogram linkage system and another to move it forwards and backwards. A clever single-servo assembly is also implemented on the bottom to fill gaps left by the legs.

The device is controlled by an Arduino Nano, along with a 16-channel servo driver board. Obstacle avoidance is handled via an ultrasonic sensor, which sticks out of the top half of the sphere and rotates side to side using yet another servo. 
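For a rough idea of how an Arduino Nano might drive that many servos through a 16-channel board, here is a minimal sketch assuming a common PCA9685-based driver and Adafruit's PWMServoDriver library; the board choice, channel assignments, and angles are assumptions for illustration and are not taken from Leveque's build.

```cpp
// Illustrative sketch only: a PCA9685-based 16-channel driver and these
// channel numbers are assumptions; the actual firmware may differ.
#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver();  // default I2C address 0x40

const int SERVOMIN = 150;   // PWM tick count near 0 degrees (Adafruit example values)
const int SERVOMAX = 600;   // PWM tick count near 180 degrees

// Per-leg channels: one servo folds/deploys the leg, one swings it fore/aft.
const int DEPLOY_CH[4] = {0, 2, 4, 6};
const int SWING_CH[4]  = {1, 3, 5, 7};

void setServoAngle(int channel, int degrees) {
  int ticks = map(degrees, 0, 180, SERVOMIN, SERVOMAX);
  pwm.setPWM(channel, 0, ticks);
}

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);  // standard 50 Hz servo update rate
}

void loop() {
  // Unfold from the "ball" pose...
  for (int leg = 0; leg < 4; leg++) setServoAngle(DEPLOY_CH[leg], 90);
  delay(1000);
  // ...then rock each leg back and forth as a stand-in for a real gait.
  for (int leg = 0; leg < 4; leg++) setServoAngle(SWING_CH[leg], 60);
  delay(500);
  for (int leg = 0; leg < 4; leg++) setServoAngle(SWING_CH[leg], 120);
  delay(500);
}
```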

It’s an impressive mechanical build, especially considering its diminutive size of 130 mm (5.12 in) in diameter.

Make it rain chocolate with a Raspberry Pi-powered dispenser

via Raspberry Pi

This fully automated M&M’s-launching machine delivers chocolate on voice command, wherever you are in the room.

A quick lesson in physics

To get our heads around Harrison McIntyre‘s project, first we need to understand parabolas. Harrison explains: “If we ignore air resistance, a parabola can be defined as the arc an object describes when launching through space. The shape of a parabolic arc is determined by three variables: the object’s departure angle; initial velocity; and acceleration due to gravity.”

Harrison uses a basketball shooter to illustrate parabolas

Lucky for us, gravity is always the same, so you really only have to worry about angle and velocity. You could also get away with only changing one variable and still be able to determine where a launched object will land. But adjusting both the angle and the velocity grants much greater precision, which is why Harrison’s machine controls both exit angle and velocity of the M&M’s.
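As a quick worked example of that trade-off, the snippet below plugs angle and velocity into the textbook range formula R = v²·sin(2θ)/g, ignoring air resistance and assuming the sweet lands at the same height it was launched from. Harrison's real calculation is more involved, but the basic relationship between angle, velocity, and landing point is the same.

```cpp
// Textbook projectile range, ignoring air resistance and assuming launch and
// landing heights are equal (a simplification of what the machine really does).
#include <cmath>
#include <cstdio>

double rangeMeters(double velocityMs, double angleDeg) {
  const double g = 9.81;                        // m/s^2
  double angleRad = angleDeg * M_PI / 180.0;
  return velocityMs * velocityMs * std::sin(2.0 * angleRad) / g;
}

int main() {
  // 23 mph (the machine's tamer top speed) is roughly 10.3 m/s.
  std::printf("45 deg: %.2f m\n", rangeMeters(10.3, 45.0));  // maximum range
  std::printf("30 deg: %.2f m\n", rangeMeters(10.3, 30.0));  // shorter, flatter arc
  return 0;
}
```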

Kit list

The M&M’s launcher comprises:

  • 2 Arduino Nanos
  • 1 Raspberry Pi 3
  • 3 servo motors
  • 2 motor drivers
  • 1 DC motor
  • 1 Hall effect limit switch
  • 2 voltage converters
  • 1 USB camera
  • “Lots” of 3D printed parts
  • 1 Amazon Echo Dot

A cordless drill battery is the primary power source.

The project relies on the same principles as a baseball pitching machine. A compliant wheel is attached to a shaft sitting a few millimetres above a feeder chute that can hold up to ten M&M’s. To launch a sweet, the machine spins the shaft up to around 1500 rpm, pushes an M&M’s piece into the wheel using a servo, and whoosh, it takes flight.
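To put that spin speed in perspective, the wheel's surface speed is 2πr × rpm/60, and the sweet's exit speed can be at most roughly that. The radius below is an assumed value (the video doesn't give one), and the real exit speed also depends on slip and how the sweet is pinched against the chute, so treat this as a back-of-the-envelope sketch.

```cpp
// Back-of-the-envelope only: the wheel radius is an ASSUMED value, and real
// exit speed depends on slip and geometry, not just surface speed.
#include <cmath>
#include <cstdio>

int main() {
  const double rpm = 1500.0;        // spin-up speed mentioned in the article
  const double radiusM = 0.065;     // ASSUMED compliant-wheel radius, metres
  double surfaceSpeed = 2.0 * M_PI * radiusM * rpm / 60.0;   // m/s
  std::printf("Wheel surface speed: %.1f m/s (%.1f mph)\n",
              surfaceSpeed, surfaceSpeed * 2.23694);
  return 0;
}
```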

Controlling velocity, angle and direction

To measure the velocity of the flywheel in the machine, Harrison installed a Hall effect magnetic limit switch, which is triggered every time a magnet passes it.

Two magnets were placed on opposite sides of the shaft, and these pass the switch as the shaft spins. By measuring the time between pulses from the limit switch, the launcher determines how fast the flywheel is spinning. In response, the microcontroller adjusts the motor output until this makeshift encoder reports the desired rpm. This is how the machine controls the speed at which the M&M’s pieces are fired.
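A minimal Arduino-style sketch of that pulse-timing approach might look like the following: an interrupt timestamps each Hall pulse, two pulses make one revolution, and a crude proportional correction nudges the motor PWM toward the target speed. The pins, gain, and target rpm are assumptions, not Harrison's firmware.

```cpp
// Illustrative only: pins, target rpm, and the proportional gain are
// assumptions; the actual control code may differ.
const int HALL_PIN  = 2;     // Hall effect limit switch (interrupt-capable pin)
const int MOTOR_PWM = 9;     // PWM output to the motor driver

volatile unsigned long lastPulseUs = 0;
volatile unsigned long pulseIntervalUs = 0;

void onHallPulse() {
  unsigned long now = micros();
  pulseIntervalUs = now - lastPulseUs;   // time between magnet passes
  lastPulseUs = now;
}

void setup() {
  pinMode(HALL_PIN, INPUT_PULLUP);
  pinMode(MOTOR_PWM, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(HALL_PIN), onHallPulse, FALLING);
}

void loop() {
  noInterrupts();
  unsigned long intervalUs = pulseIntervalUs;
  interrupts();

  // Two magnets -> two pulses per revolution, so one revolution = 2 * interval.
  float rpm = (intervalUs > 0) ? 60.0e6 / (2.0 * intervalUs) : 0.0;

  // Crude proportional correction toward the target speed.
  const float targetRpm = 1500.0;
  static int pwmOut = 128;
  pwmOut += (int)(0.01 * (targetRpm - rpm));
  pwmOut = constrain(pwmOut, 0, 255);
  analogWrite(MOTOR_PWM, pwmOut);

  delay(20);
}
```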

Now, to control the angle at which the M&M’s pieces fly out of the machine, Harrison mounted the flywheel assembly onto a turret with two degrees of freedom, each driven by a servo. The turret controls the angle at which the sweets are ‘pitched’, as well as the direction of the ‘pitch’.

So how does it know where I am?

With the angle, velocity, and direction at which the M&M’s pieces fly out of the machine taken care of, the last thing to determine is the expectant snack-eater’s location. For this, Harrison harnessed vision processing.


Harrison used a USB camera and a Python script running on Raspberry Pi 3 to determine when a human face comes into view of the machine, and to calculate how far away it is. The turret then rotates towards the face, the appropriate parabola is calculated, and an M&M’s piece is fired at the right angle and velocity to reach your mouth. Harrison even added facial recognition functionality so the machine only fires M&M’s pieces at his face. No one is stealing this guy’s candy!
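Harrison's vision code is a Python script, but the underlying idea is easy to sketch: detect a face, estimate its distance from its apparent width using the pinhole-camera relation, and note how far off-centre it sits so the turret can turn towards it. Purely for illustration, the example below shows that idea with OpenCV's C++ API using a Haar cascade; the focal-length and face-width constants are assumed calibration values, not figures from the project.

```cpp
// Illustration only: Harrison's actual code is a Python script. This C++
// sketch shows the same idea; the constants are assumed calibration values.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
  cv::VideoCapture cam(0);                       // USB camera
  cv::CascadeClassifier faces("haarcascade_frontalface_default.xml");
  const double FACE_WIDTH_M = 0.16;              // assumed average face width
  const double FOCAL_LEN_PX = 600.0;             // assumed, from a one-off calibration

  cv::Mat frame, gray;
  while (cam.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    std::vector<cv::Rect> found;
    faces.detectMultiScale(gray, found, 1.1, 4);
    if (!found.empty()) {
      const cv::Rect& f = found.front();
      // Pinhole-camera estimate: distance = real width * focal length / pixel width
      double distanceM = FACE_WIDTH_M * FOCAL_LEN_PX / f.width;
      double offsetX = (f.x + f.width / 2.0) - frame.cols / 2.0;  // px from centre
      std::printf("Face at ~%.2f m, %.0f px off-centre\n", distanceM, offsetX);
      // From here: rotate the turret towards offsetX and solve the parabola
      // for the launch angle and velocity that cover distanceM.
    }
  }
  return 0;
}
```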

So what’s Alexa for?

This project is topped off with a voice-activation element, courtesy of an Amazon Echo Dot, and a Python library called Sinric. This allowed Harrison to disguise his Raspberry Pi as a smart TV named ‘Chocolate’ and command Alexa to “increase the volume of ‘Chocolate’ by two” in order to get his machine to fire two M&M’s pieces at him.


Drawbacks

In his video, Harrison explains that other snack-launching machines involve a spring-loaded throwing mechanism, which doesn’t let you determine the snack’s exit velocity. That means you have less control over how fast your snack goes and where it lands. The only drawback to Harrison’s model? His machine needs objects that are uniform in shape and size, which means no oddly shaped peanut M&M’s pieces for him.

He’s created quite the monster here: at first, the machine’s maximum firing speed was 40 mph. And no one wants crispy-shelled chocolate firing at their face at that speed. To keep his teeth safe, Harrison switched out the original motor for one with a lower rpm, which reduced the maximum exit velocity to a much more sensible 23 mph… Please make sure you test your own snack-firing machine outdoors before aiming it at someone’s face.

Go subscribe

Check out the end of Harrison’s video for some more testing to see what his machine is capable of: he takes out an entire toy army and a LEGO Star Wars squad by firing M&M’s pieces at them. And remember to subscribe to his channel and like the video if you enjoyed what you saw, because that’s just a nice thing to do.

mechDOG, a 12-servo robotic pup

via Arduino Blog

Mech-Dickel Robotics has designed a beautiful quadruped robot dubbed mechDOG, which utilizes a dozen servos for motion. This gives each leg three degrees of freedom, allowing the cat-sized beast to travel a meter in 8.46 seconds. While it won’t break any speed records, creating a walking motion on this sort of unstable platform is an impressive feat in itself.

mechDOG is controlled by an Arduino Uno, while a Lynxmotion Smart Servo Adapter Board interfaces with the servos themselves. The device is remote-controlled via an RF unit, though it does have a pair of ultrasonic sensors that presumably could be used for obstacle avoidance. 

You can check it out in action in the videos below, looking sharp in its yellow-finished aluminum sheet metal chassis.

This mouth mechanism is controlled by your typing

via Arduino Blog

Will Cogley, known for his awesome animatronics, has created a robotic mouth that’s already a work of art and could form the basis of something even more amazing. 

The device features an array of servo mechanisms to actuate its jaw, lips, cheeks, and tongue. The cheek assemblies are particularly interesting, employing two servos each and a linkage system that allows them to move into a variety of positions.

For control, the project uses a Python program to break typed sentences up into individual sounds. It then sends these to an Arduino, which poses the mouth in sequence. Cogley has also experimented with microphone input and hopes to explore motion capture with it in the future.
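On the Arduino side, one plausible arrangement is a small lookup table of servo poses indexed by the sound codes arriving over serial from the Python program. The sketch below is a guess at that structure; the one-byte serial protocol, pose values, and servo pins are all assumptions rather than Cogley's actual implementation.

```cpp
// Illustrative Arduino-side sketch only: the serial protocol, pose table,
// and servo pins are assumptions, not Cogley's actual code.
#include <Servo.h>

Servo jaw, tongue;

// A tiny viseme table: each row is {jaw angle, tongue angle} for one sound.
// Index 0 = closed/rest, 1 = "ah", 2 = "ee", 3 = "oo".
const int POSES[4][2] = {
  { 10,  90 },
  { 70,  90 },
  { 35, 120 },
  { 25,  60 },
};

void setup() {
  Serial.begin(115200);      // the host program sends one pose index per sound
  jaw.attach(9);
  tongue.attach(10);
}

void loop() {
  if (Serial.available() > 0) {
    int pose = Serial.read() - '0';           // e.g. the host sends '0'..'3'
    if (pose >= 0 && pose < 4) {
      jaw.write(POSES[pose][0]);
      tongue.write(POSES[pose][1]);
    }
  }
}
```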