Each of the four legs is driven using 9g micro servos, controlled by an Arduino Nano. A human operates the spider-inspired robot with a remote consisting of an Arduino Uno and a small joystick module, while a pair of NRF24L01 radio transceivers provides the link between the robot and controller.
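The controller side of a build like this typically boils down to reading the joystick's two analog axes and sending them as a tiny packet over the NRF24L01. The sketch below shows that logic in plain C++; the packet layout and the ±100 speed range are illustrative assumptions, not details from the original project, and the actual RF24 radio call is shown only as a comment.

```cpp
#include <cstdint>

// Hypothetical packet sent from the Uno remote to the Nano on the robot.
struct JoyPacket {
    int8_t x;  // -100..100, left/right
    int8_t y;  // -100..100, forward/back
};

// Map a 10-bit ADC reading (0..1023, centered near 512) to -100..100.
int mapAxis(int raw) {
    return (raw - 512) * 100 / 512;
}

JoyPacket makePacket(int rawX, int rawY) {
    JoyPacket p;
    p.x = (int8_t)mapAxis(rawX);
    p.y = (int8_t)mapAxis(rawY);
    return p;
}

// On the real hardware this would run each loop() iteration:
//   JoyPacket p = makePacket(analogRead(A0), analogRead(A1));
//   radio.write(&p, sizeof(p));   // RF24 library call over the NRF24L01
```

Keeping the packet to two signed bytes keeps the radio traffic minimal, which matters on the NRF24L01's small payloads.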
Despite its simple construction, the quadruped moves around impressively well…
For the most part, the next generation of wearable technology development has been focused on your wrist, arms, ears, and even your face. Hair, however, remains a unique and much less explored material… until now, at least.
That’s because the team of Sarah Sterman, Molly Nicholas, Christine Dierk, and Professor Eric Paulos at UC Berkeley’s Hybrid Ecologies Lab has created interactive hair extensions capable of changing shape and color, sensing touch, and communicating over Bluetooth. The aptly named “HairIO” conceals a skeleton of nitinol wire, a shape memory alloy (SMA) that morphs into different forms when exposed to heat. An Arduino Nano handles control, enabling it to respond to stimuli such as messages from your phone via an Adafruit Bluefruit board.
That’s not the only trick of these fibers, as they can use thermochromic pigments to change color along with the SMA action, and respond to touch via capacitive sensing.
Human hair is a cultural material, with a rich history displaying individuality, cultural expression and group identity. It is malleable in length, color and style, highly visible, and embedded in a range of personal and group interactions. As wearable technologies move ever closer to the body, and embodied interactions become more common and desirable, hair presents a unique and little-explored site for novel interactions. In this paper, we present an exploration and working prototype of hair as a site for novel interaction, leveraging its position as something both public and private, social and personal, malleable and permanent. We develop applications and interactions around this new material in HairIO: a novel integration of hair-based technologies and braids that combine capacitive touch input and dynamic output through color and shape change. Finally, we evaluate this hair-based interactive technology with users, including the integration of HairIO within the landscape of existing wearable and mobile technologies.
Performing an instrument well is hard enough, but flipping through sheet music while playing can delay things at best, or break your concentration altogether. Music displayed on a computer poses a similar problem; however, Maxime Boudreau has a great solution using an Arduino Nano inside a 3D-printed pedal assembly.
When set up with software found here, Boudreau’s DIY device allows you to control PDF sheet music on your laptop with the tap of a foot. While designed to work with a macOS app, there’s no reason something similar couldn’t be worked out under Windows or Linux as needed.
“ChrisN219” is the proud owner of an antique Coke machine that he uses to store his favorite beverages. While it makes a very cool decoration, it doesn’t have a way to reveal how many cans are left.
To add this functionality, he turned to an Arduino Nano along with an ultrasonic sensor embedded inside the machine to measure how high the cans are stacked. This lets the user know when it’s time to stock up again, and after adding another ultrasonic sensor to the display unit on top, an OLED screen automatically shows the number of sodas (or beers) available as someone approaches.
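Turning an ultrasonic distance reading into a can count is a simple bit of arithmetic: the sensor reports the distance to the top of the stack, so subtracting that from the depth of the empty column gives the stack height, which divides by the height of one can. The numbers below are illustrative assumptions, not measurements from ChrisN219’s machine (a standard 12 oz can is roughly 122 mm tall).

```cpp
// Hypothetical conversion from an ultrasonic reading to a can count.
// emptyDepth_mm: sensor-to-bottom distance when the machine is empty.
int cansRemaining(int distance_mm, int emptyDepth_mm, int canHeight_mm) {
    int stackHeight = emptyDepth_mm - distance_mm;  // height of the can stack
    if (stackHeight < 0) stackHeight = 0;           // guard against noisy readings
    return stackHeight / canHeight_mm;              // whole cans only
}

// Example: a 600 mm column reading 234 mm to the top of the stack
// holds (600 - 234) / 122 = 3 cans.
```

On the Arduino, `distance_mm` would come from timing the HC-SR04-style echo pulse; the math stays the same.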
In order to separate their office and shop areas, NYC CNC installed a rubber strip assembly that had to be pushed out of the way every time someone wanted to walk through. Although functional, it was also quite annoying, so they installed a system that uses a pneumatic cylinder to automatically move the rubber strips out of the way.
The device uses an Arduino Nano for control and VL53L0X time-of-flight sensors for presence detection. In addition, it features a clever gear and belt assembly to mirror one side of the door with the other.
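Presence detection with a time-of-flight sensor usually wants a little debouncing so a single noisy reading doesn’t flap the cylinder. The state machine below is a minimal sketch of that idea, assuming a trigger distance and confirmation count that are illustrative, not taken from NYC CNC’s code.

```cpp
// Hypothetical debounce for a ToF presence sensor: open only after several
// consecutive "near" readings, close after several consecutive "far" ones.
struct DoorLogic {
    static const int OPEN_THRESHOLD_MM = 800;  // assumed trigger distance
    static const int CONFIRM_READS = 3;        // readings needed to switch state
    int nearCount = 0;
    int farCount = 0;
    bool open = false;

    // Feed one VL53L0X range reading (mm); returns the desired door state.
    bool update(int range_mm) {
        if (range_mm < OPEN_THRESHOLD_MM) { nearCount++; farCount = 0; }
        else                              { farCount++;  nearCount = 0; }
        if (nearCount >= CONFIRM_READS) open = true;
        if (farCount  >= CONFIRM_READS) open = false;
        return open;
    }
};
```

In the sketch’s `loop()`, the returned state would drive the solenoid valve feeding the pneumatic cylinder.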
You can find more details of the build in the video below and check out the project’s components, Fusion 360 design files, and Arduino code here.
When dealing with robotics and other electronics projects, it can be important to know how many revolutions a motor is making. From here, you can infer the distance that your device has traveled, or any number of other important variables.
If you’d like to get started with this type of sensing, this electronoobs tutorial will show you how to get things hooked up using an Arduino and a computer, along with an oscilloscope to verify measurements up to 10,000 RPM.
In his setup, an IR emitter/receiver pair bounces light off the spinning object. Each time light reflects back, the phototransistor conducts and toggles the output line, which a pulldown resistor otherwise holds low — so every pulse marks one revolution. The 3D-printed device also features an OLED screen.
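With one reflective mark per revolution, the time between pulses is one full period, and the RPM math is a single division. A minimal sketch of that conversion, assuming the period is measured in microseconds (as Arduino’s `micros()` reports):

```cpp
// RPM from the interval between successive pulses, one pulse per revolution.
// 60,000,000 microseconds per minute / period in microseconds = RPM.
long rpmFromPeriod(unsigned long period_us) {
    if (period_us == 0) return 0;  // no pulse seen yet; avoid divide-by-zero
    return (long)(60000000UL / period_us);
}

// On the Arduino, an interrupt handler would timestamp each pulse:
//   void onPulse() { period = micros() - lastPulse; lastPulse = micros(); }
// and loop() would display rpmFromPeriod(period) on the OLED.
```

At the tutorial’s 10,000 RPM upper limit the pulses arrive 6,000 µs apart, comfortably within what a pin-change interrupt can keep up with.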
To emit infrared light we need an IR LED, and to detect it an IR-sensitive phototransistor; these are often sold combined as a single module. To amplify the signal I’ve used the LM324 amplifier. You will also need a 100 ohm resistor and a 4.7k ohm one. To power the system we need a basic 9V battery and connector, an Arduino Nano, and an OLED screen. The case is 3D printed…