Our modern societies create a lot of garbage, which we can fortunately remove from our homes thanks to local waste management services. But the garbage collectors won’t come sift through your house for refuse, so you have to rely on trash bins. Those bins never seem to be nearby when you need them, which is why James Bruton built the Binbot 9000.
The Binbot 9000 is exactly what it sounds like: a robotic trash can. No longer must the bin remain stationed in some out-of-the-way location. Instead, Binbot 9000 can drive around a home in search of people who need to throw things away.
Bruton started by placing a standard trash can on a robotic frame built using aluminum extrusion and 3D-printed parts. It has two drive wheels with encoders, which an Arduino Mega 2560 controls. To navigate through the home while avoiding collisions, Bruton added an NVIDIA Jetson Nano single-board computer and a Raspberry Pi Camera. The Jetson runs computer vision software and feeds commands to the Arduino via serial.
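The article doesn’t spell out the serial protocol between the Jetson and the Mega, but a common approach is short ASCII lines carrying signed wheel speeds. Here is a minimal sketch of such a parser; the `M <left> <right>` command format and the function name are assumptions, not Bruton’s actual code:

```cpp
#include <cassert>
#include <cstdio>

// Hypothetical command format from the Jetson: "M <left> <right>",
// e.g. "M 120 -120" for signed wheel speeds. Returns true on a clean parse.
bool parseDriveCommand(const char* line, int* left, int* right) {
    return std::sscanf(line, "M %d %d", left, right) == 2;
}
```

On the Arduino side, a loop would read characters from `Serial` into a buffer until a newline arrives, then hand the buffer to a parser like this one.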
The computer vision software looks for simple targets printed on sheets of paper. The robot rotates until it sees a target and centers it in the video frame. It then drives forward until it reaches the target, rotates 90 degrees, and repeats the process. If it collides with something (ideally someone’s foot), the wheel encoders will detect the stall and the robot will open its lid with a servo. After someone deposits trash and closes the lid, the robot goes back into its target-seeking cycle.
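A stall check like this can be as simple as comparing the encoder ticks actually counted over a sample window against the count the commanded speed should produce. The following is a hypothetical sketch of that idea, not Bruton’s code; the 25% threshold is an assumed value:

```cpp
#include <cassert>

// Hypothetical stall detector: if the robot is commanding motion but the
// encoders counted under a quarter of the ticks expected in this window,
// assume it has bumped into something (ideally a foot).
bool isStalled(int commandedSpeed, long measuredTicks, long expectedTicks) {
    if (commandedSpeed == 0) return false;  // not trying to move
    return measuredTicks < expectedTicks / 4;
}
```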
By placing targets in strategic locations around his home, Bruton gave Binbot 9000 the ability to drive around his home in efficient paths. Whenever he needs to throw something away, he can nudge the robot to stop it and deposit his garbage. It also responds to voice commands, so Bruton can summon it or send it home as needed.
Automated weaving machines are one of the most important (and underappreciated) advancements to come from the industrial revolution. Prior to their invention, most people only owned a few garments that were woven and maintained by the family. With the introduction of machines able to churn out textiles, affordable clothing suddenly became available. As an expert in the industry, Roger de Meester was able to construct a fully automated weaving machine controlled by Arduino boards.
Unlike the early weaving machines of the industrial revolution that could only produce patterns inherent to their construction, de Meester’s desktop weaving machine utilizes sophisticated computer control to produce a huge range of patterns on demand. A new pattern can be completely different from the preceding pattern, and the machine can even adjust the pattern on the fly during the weaving process, meaning it can create rich tapestries.
This machine is incredibly complex, in part because it doesn’t rely on any mechanical coupling: every facet of the machine’s operation is instead adjustable via a stepper motor, DC motor, or servo motor. There are a lot of motors to drive, so de Meester needed multiple Arduino boards: an Arduino Mega 2560 and two Arduino Nanos. The mechanical components are 3D-printed (like the shuttles) or made from aluminum extrusion and wood (like the frame).
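With nothing mechanically geared together, axes like these are typically kept in sync in software, an approach often called "electronic gearing": each follower motor’s target position is derived from a master position and a ratio. A minimal illustration of the idea; the function name and integer-ratio scheme are assumptions, not de Meester’s firmware:

```cpp
#include <cassert>

// "Electronic gearing": with no mechanical coupling, a follower motor's
// target position is computed from the master position and a gear ratio.
// Integer math avoids drift from accumulated floating-point error.
long followerTarget(long masterSteps, long ratioNum, long ratioDen) {
    return masterSteps * ratioNum / ratioDen;
}
```

The control loop would call something like this every cycle and step each follower toward its computed target, so changing a "gear ratio" in software instantly changes the woven pattern.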
None of our descriptions can do this project justice, so be sure to watch the video to see de Meester’s machine in action.
As part of his ongoing PorscheKart project, YouTuber Wesley Kagan wanted a better way to steer his custom-built V12 race car, as the previous wheel was simply a mechanical linkage to the front steering. Instead, this new version would closely mimic the layout and functionality of an actual Formula 1 wheel, complete with all of the buttons, dials, switches, and the central screen.
The base of the wheel was formed from a laser-cut sheet of aluminum while the surrounding grips were painstakingly 3D-printed out of TPU filament. For the electronics, Kagan decided to use a pair of Arduino Micros, which were split between handling button inputs and driving the display, while an Arduino Mega 2560 gathers sensor data and sends it as a string to the two boards. Because of the limited number of pins, he wired each output pin of the three rotary switches through a differently valued resistor, letting the Micro determine which position is selected from the voltage arriving at its analog input.
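Decoding a resistor ladder like that usually means comparing the ADC reading against the expected level for each switch position and picking the closest match. Here is a hedged sketch of that scheme; the function and the expected-level table in the usage example are hypothetical, not Kagan’s values:

```cpp
#include <cassert>
#include <cstdlib>

// Hypothetical decoder: each switch position feeds the analog pin through a
// different resistor, so each position yields a distinct 10-bit ADC level.
// Return the index of the expected level closest to the actual reading.
int decodeSwitchPosition(int adcReading, const int* levels, int nPositions) {
    int best = 0;
    int bestDiff = std::abs(adcReading - levels[0]);
    for (int i = 1; i < nPositions; i++) {
        int d = std::abs(adcReading - levels[i]);
        if (d < bestDiff) { bestDiff = d; best = i; }
    }
    return best;
}
```

Matching to the nearest level, rather than exact thresholds, makes the decode tolerant of resistor tolerance and ADC noise.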
The final steps of building this upgraded steering wheel included connecting the 3.5” LCD screen to one of the Arduino Micro boards and wiring everything together with the help of a couple of harnesses to minimize the mess. However, creating the graphics program proved to be a challenge due to the limited ROM space for storing all of the draw function calls, which is why Kagan plans to eventually swap it out for a static image with the values filled in. To see more about the project, you can watch his build log video below and read this blog post.
We’re currently seeing something of a technological blitz as corporations and engineers attempt to solve the problem of tactility in virtual reality (VR). Modern VR headsets provide quite realistic visual and auditory immersion, but that immersion falls apart when users find themselves unable to physically interact with virtual objects. Developed by a team of National Chengchi University researchers, ELAXO is an Arduino-controlled exoskeleton glove that enables complex force feedback for VR applications.
ELAXO looks unwieldy, as you might expect of an exoskeleton glove made up of 3D-printed struts and joints. In the demonstrated setup, ELAXO mounts to the user’s wrist and has force feedback structures attached to the thumb and first two fingers. Each finger receives four servo motors, four small DC motors, and one larger DC motor. Those motors attach to joints to create on-demand physical resistance to movement.
For two fingers and a thumb, ELAXO requires a total of 12 servos, 12 small DC motors, and three large DC motors. Each finger also needs an infrared (IR) sensor, for a total of three. In addition, the large DC motors contain encoders that use two wires each. Controlling those takes a lot of I/O pins, which is why the ELAXO team chose an Arduino Mega board for their prototype. It controls the motors through eight dual TB6612FNG drivers.
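The TB6612FNG drives each motor channel with two direction inputs plus a PWM pin: IN1 high with IN2 low spins one way, the reverse pair spins the other, and both low lets the motor coast. A small sketch of how a signed speed might map onto those signals; the struct and function are illustrative, not the team’s firmware:

```cpp
#include <cassert>

struct MotorPins { bool in1; bool in2; int pwm; };  // pwm duty 0-255

// TB6612FNG channel logic: IN1 high / IN2 low drives forward, the reverse
// pair drives backward, and both low lets the motor coast. A signed speed
// in -255..255 maps onto those three signals.
MotorPins driveSignals(int speed) {
    MotorPins p{false, false, 0};
    if (speed > 0)      { p.in1 = true;  p.in2 = false; p.pwm = speed; }
    else if (speed < 0) { p.in1 = false; p.in2 = true;  p.pwm = -speed; }
    return p;  // speed == 0: coast
}
```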
The Arduino powers the motors according to what happens in the VR world. For example, if a user tries to touch a stationary object, the motors on that finger might get full power to keep the joints from bending and to provide a feeling of solid resistance. Other actions, like rotating a knob, result in less resistance. By gaining granular control over the resistance of each joint, ELAXO can produce convincing force feedback.
When you use a “gyroscope” in Arduino and robotics projects, this generally means a small IMU that combines several sensing methods to tell how a device is moving. A physical gyroscope, however, can use a spinning disk to stay upright mechanically. Could one be combined with advanced electronics to stabilize a robot or other craft?
James Bruton answers this question in the video below, going from a “bare” gyroscope, to an unpowered gimbal, and finally to a simulated boat. This utilizes a powered gimbal for stabilization that’s tilted in one axis by a DYNAMIXEL servo. Angle is measured using an Arduino Pro Mini along with an MPU-6050 IMU, and the gyroscope is controlled by an Arduino Mega.
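A common way to get a stable tilt angle from an MPU-6050 is a complementary filter, which blends the gyro’s integrated rate (accurate over short timescales) with the accelerometer’s gravity angle (drift-free over long timescales). Here is a sketch of that standard technique; the 0.98 blend factor is a typical assumed value, and none of this is taken from Bruton’s code:

```cpp
#include <cassert>
#include <cmath>

// Complementary filter: trust the gyro's integrated rate for fast changes
// and the accelerometer's gravity angle as the long-term reference.
// alpha = 0.98 is a typical (assumed) blend factor.
float fuseAngle(float prevAngleDeg, float gyroRateDps, float accelAngleDeg,
                float dtSeconds, float alpha = 0.98f) {
    return alpha * (prevAngleDeg + gyroRateDps * dtSeconds)
         + (1.0f - alpha) * accelAngleDeg;
}
```

Called once per control loop iteration, the filtered angle can then feed whatever stabilization output the project uses, such as the DYNAMIXEL servo tilting the gimbal.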
Virtual reality (VR) technology has improved dramatically in recent years and there are now a number of VR headsets on the market that provide high-quality visual immersion. But VR systems still struggle to stimulate our other senses. When you can’t feel the virtual objects that you can see, the immersion falls apart. That’s why an international team of researchers has developed GuideBand, which is an arm-mounted contraption that physically guides players within VR.
This device looks a bit like an external fixation apparatus for securing broken bones. It straps onto the user’s arm and has three motors controlled by an Arduino Mega via TB6612FNG motor drivers. The first motor moves the device’s gantry radially around the user’s arm. The second motor adjusts the angle of attack, offset perpendicularly from the forearm. The third motor acts as a winch and pulls a cable attached to a strap on the user’s arm.
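One plausible way to command those three actuators is to convert a desired pull vector, expressed in the forearm’s frame, into a ring azimuth, an attack elevation, and a winch tension via spherical coordinates. This mapping is purely illustrative; the researchers’ actual control scheme may differ:

```cpp
#include <cassert>
#include <cmath>

struct PullCommand { float ringDeg; float attackDeg; float tension; };

// Hypothetical mapping from a desired pull vector in the forearm's frame
// (x along the arm, y/z around it) to the three actuators: the ring's
// rotation around the arm, the attack angle away from it, and winch tension.
PullCommand toActuators(float x, float y, float z) {
    const float RAD2DEG = 57.29578f;
    PullCommand c;
    c.ringDeg   = std::atan2(z, y) * RAD2DEG;                       // motor 1
    c.attackDeg = std::atan2(x, std::sqrt(y * y + z * z)) * RAD2DEG; // motor 2
    c.tension   = std::sqrt(x * x + y * y + z * z);                  // motor 3
    return c;
}
```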
The unique layout of GuideBand lets it impart a pulling sensation on the user’s forearm, like a parent tugging their child through a grocery store. That guidance could correspond directly to action in the virtual world, such as an NPC (non-player character) pulling the player out of the way of danger. Or it can provide more subtle direction, like a game tutorial demonstrating how the player should move to interact with a virtual object.
As with many other VR haptic feedback systems, GuideBand is highly experimental and we don’t expect to see it on the market anytime soon. But it is still an interesting solution to a specific problem with virtual reality.