ClearCrawler Strandbeest walks under Arduino control

via Arduino Blog

Maker Jeremy S. Cook has been building Theo Jansen-style walkers for years, and after several iterations has arrived at what he calls the “ClearCrawler.”

This little guy stands at just over 15 inches tall — including its comparatively large clear cylindrical head — and travels around via a pair of motors that move four legs on either side like tank treads.

For control, Cook is using an Arduino Nano onboard, along with a motor driver, plus an Uno and joystick shield as the remote unit. Communication between the two is accomplished by a pair of nRF24L01+ radio modules. 

Code for the project is available on GitHub, and the build is split into electronics and mechanical sections in the videos below.

2D-RFID input at the tip of your fingers

via Arduino Blog

Researchers at the University of Waterloo in Canada have developed a novel hand-based input technique called Tip-Tap that amazingly requires no batteries. 

The wearable device uses a series of three custom RFID tags on both the thumb and index finger with half an antenna on each digit. When the fingertips are touched together, a signal is sent to the computer indicating where the thumb and index finger intersect, which is mapped as a position on a 2D grid.

Usability experiments were carried out using an Arduino Mega, both with and without on-screen visual feedback. Possible applications include the medical field, where Tip-Tap could be added to disposable gloves, letting surgeons access a laptop without dictating inputs to an assistant or risking sterilization issues.

We describe Tip-Tap, a wearable input technique that can be implemented without batteries using a custom RFID tag. It recognizes 2-dimensional discrete touch events by sensing the intersection between two arrays of contact points: one array along the index fingertip and the other along the thumb tip. A formative study identifies locations on the index finger that are reachable by different parts of the thumb tip, and the results determine the pattern of contact points used for the technique. Using a reconfigurable 3×3 evaluation device, a second study shows eyes-free accuracy is 86% after a very short period, and adding bumpy or magnetic passive haptic feedback to contacts is not necessary. Finally, two battery-free prototypes using a new RFID tag design demonstrate how Tip-Tap can be implemented in a glove or tattoo form factor.

Meet Aster, the 3D-printed humanoid robot

via Arduino Blog

If you’d like to build your own vaguely humanoid robot, but don’t care about it getting around, then look no further than Aster.

The 3D-printed bot is controlled by an Arduino Uno, with a servo shield to actuate its 16 servo motors. This enables it to move its arms quite dramatically, as seen in the video below, along with its head. The legs also appear capable of movement, though the robot isn’t meant to walk; it’s supported by a column in the middle of its structure.

Aster’s head display is made out of an old smartphone, and in the demo it shows its eyes as green geometric objects, an animated sketch, and then, somewhat shockingly, as different humans. Print files for the project are available here and the design is actually based on the more expensive Poppy Humanoid.

Awesome dad builds an Arduino-powered button box for his toddler son

via Arduino Blog

Like most one-year-olds, CodePanda’s son really likes pushing buttons. Rather than purchasing a so-called busy board that might teach him skills like unlocking doors or plugging in electrical outlets, CodePanda decided to build his own custom device controlled by an Arduino Uno.

The resulting toy features a wide variety of lights, buttons and switches, and makes sounds to keep the little guy entertained. In the center, a big green button activates an analog voltmeter, which not only looks cool, but actually indicates the battery level of the unit.

While you probably won’t want to build this exact interactive box, CodePanda’s project is available on GitHub for inspiration and/or modification!

This YouTuber recreated the D-O droid from Star Wars: Episode 9 with Arduino

via Arduino Blog

While the film has yet to premiere, Matt Denton has already built the D-O droid from Star Wars: The Rise of Skywalker using a MKR WiFi 1010 for control, along with a MKR IMU Shield and a MKR Motor Carrier.

The droid scoots around on what appears to be one large wheel, which conceals the Arduino boards as well as other electronics, batteries, and mechanical components. Denton’s wheel design is a bit more complicated mechanically than it first appears, as it’s split into a center section with thin drive wheels on the sides that enable differential steering.

On top, a cone-shaped head provides sounds and movement, giving the little RC D-O a ton of personality. The droid isn’t quite finished as of the video below, but given how well it works there, the end product should be amazing!

FaceWidgets blends on-face switches with the VR world

via Arduino Blog

When using a virtual reality (VR) system, you may need to flip a switch, touch a button, etc., which can be represented by a carefully coordinated series of pixels in front of your eyes. As a physical alternative — or augmentation — researchers at the National Chiao Tung University in Hsinchu, Taiwan have developed a system of interchangeable physical control panels, called FaceWidgets, that reside on the backside of the head-mounted unit itself.

When a wearer places their palm near their face (and headset), this is sensed and an on-screen canvas appears depending on the application. They can then manipulate these widgets both physically and in the virtual world to control the experience. 

Physical interactions are detected with the help of an Arduino Mega, and the facial control pad even extends and retracts via a motor shield and stepper motors for optimal positioning.

We present FaceWidgets, a device integrated with the backside of a head-mounted display (HMD) that enables tangible interactions using physical controls. To allow for near range-to-eye interactions, our first study suggested displaying the virtual widgets at 20 cm from the eye positions, which is 9 cm from the HMD backside. We propose two novel interactions, widget canvas and palm-facing gesture, that can help users avoid double vision and allow them to access the interface as needed. Our second study showed that displaying a hand reference improved performance of face widget interactions. We developed two applications of FaceWidgets, a fixed-layout 360 video player and a contextual input for smart home control. Finally, we compared four hand visualizations in the two applications in an exploratory study. Participants considered the transparent hand the most suitable and responded positively to our system.