Daniel Hingston had wanted to build a four-legged walking robot for several years, and with current coronavirus restrictions he finally got his chance. His 3D-printed robodog, dubbed “GoodBoy,” resembles a miniature version of Boston Dynamics’ Spot, which helped inspire the project.
It’s an extremely clean build, with wiring integrated into the legs mid-print. Two micro servos per leg propel it forward, controlled by an Arduino Uno.
Obstacle avoidance is provided by a pair of ultrasonic sensor “eyes,” allowing it to stop when something is in its path. An LDR sensor is also included; when its human minder covers it, the robot presents its paw for a shake.
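The behavior described above boils down to simple decision logic. The sketch below is a hedged illustration, not the project's actual code: the function names, thresholds, and the common 58 µs-per-centimeter ultrasonic conversion are all assumptions.

```cpp
#include <cassert>

// Possible actions for a GoodBoy-style robot (illustrative only).
enum class Action { Walk, Stop, ShakePaw };

// HC-SR04-style conversion: echo pulse width in microseconds to
// distance in centimeters, using the common 58 us/cm approximation.
inline long pulseToCm(long pulseUs) { return pulseUs / 58; }

// Decide what to do each loop iteration. A low LDR reading means the
// sensor is covered (paw time); a close obstacle means stop; otherwise
// keep walking. Thresholds are placeholder assumptions.
inline Action decideAction(long echoPulseUs, int ldrReading,
                           long stopDistanceCm = 15, int darkThreshold = 200) {
    if (ldrReading < darkThreshold) return Action::ShakePaw;
    if (pulseToCm(echoPulseUs) < stopDistanceCm) return Action::Stop;
    return Action::Walk;
}
```

In a real sketch this decision would run once per loop, with the result driving the leg servos.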
Be sure to check out a short demo of GoodBoy below!
What is one to do when stuck indoors due to bad weather or other circumstances, unable to ride a beloved bicycle? If you’re game designer Jelle Vermandere, you build your own cycling simulator, as seen in the clip below.
Vermandere not only created a computer simulation in Unity, but also built a custom Arduino Uno rig that lets him use his actual bike as the controller.
The game features procedurally generated maps, along with competitors modeled on a 3D scan of Vermandere’s own likeness. When the racing begins, wheel speed is sensed via a magnetic window sensor and steering is handled by a LEGO potentiometer rig.
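Measuring speed with a once-per-revolution magnetic sensor comes down to timing the interval between pulses. A minimal sketch of that conversion, assuming a hypothetical 0.7 m wheel diameter (not a detail from the project):

```cpp
#include <cassert>
#include <cmath>

// Convert the interval between reed-switch pulses (one per wheel
// revolution) into road speed. The wheel diameter is an assumption.
inline double wheelSpeedKmh(double revIntervalSeconds,
                            double wheelDiameterM = 0.7) {
    const double kPi = 3.141592653589793;
    double circumferenceM = kPi * wheelDiameterM;  // distance per revolution
    double metersPerSecond = circumferenceM / revIntervalSeconds;
    return metersPerSecond * 3.6;                  // m/s -> km/h
}
```

On the Arduino side, the pulse interval would typically be captured with an interrupt and `millis()` or `micros()` timestamps.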
This post is written by Jan Jongboom and Dominic Pajak.
Running machine learning (ML) on microcontrollers is one of the most exciting developments of the past few years, allowing small battery-powered devices to detect complex motions, recognize sounds, or find anomalies in sensor data. To make building and deploying these models accessible to every embedded developer, we’re launching first-class support for the Arduino Nano 33 BLE Sense and other 32-bit Arduino boards in Edge Impulse.
The trend of running ML on microcontrollers is called embedded ML or TinyML. It means devices can make smart decisions without needing to send data to the cloud – great from an efficiency and privacy perspective. Even powerful deep learning models (based on artificial neural networks) are now reaching microcontrollers. This past year great strides were made in making deep learning models smaller, faster, and runnable on embedded hardware through projects like TensorFlow Lite Micro, uTensor, and Arm’s CMSIS-NN; but building a quality dataset, extracting the right features, and training and deploying these models is still complicated.
Using Edge Impulse you can now quickly collect real-world sensor data, train ML models on this data in the cloud, and then deploy the model back to your Arduino device. From there you can integrate the model into your Arduino sketches with a single function call. Your sensors are then a whole lot smarter, being able to make sense of complex events in the real world. The built-in examples allow you to collect data from the accelerometer and the microphone, but it’s easy to integrate other sensors with a few lines of code.
Download the Arduino Nano 33 BLE Sense firmware — this is a special firmware package (source code) that contains all code to quickly gather data from its sensors. Launch the flash script for your platform to flash the firmware.
Launch the Edge Impulse daemon to connect your board to Edge Impulse. Open a terminal or command prompt and run `edge-impulse-daemon`.
Your device will now show up in the Edge Impulse Studio on the Devices tab, ready for you to collect some data and build a model.
Once you’re done, you can deploy your model back to the Arduino Nano 33 BLE Sense, either as a binary that includes your full ML model or as an Arduino library you can integrate into any sketch.
Your machine learning model is now running on the Arduino board. Open the serial monitor and run `AT+RUNIMPULSE` to start classifying real world data!
Integrates with your favorite Arduino platform
We’ve launched with the Arduino Nano 33 BLE Sense, but you can also integrate Edge Impulse with your favorite Arduino platform. You can easily collect data from any sensor and development board using the Data forwarder. This is a small application that reads data over serial and sends it to Edge Impulse. All you need is a few lines of code in your sketch (here’s an example).
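The data forwarder's wire format is simple: each sample is one line of separator-delimited sensor values printed over serial. As a rough illustration (the helper below is hypothetical, not part of the Edge Impulse SDK), a three-axis accelerometer sample could be framed like this:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Format one accelerometer sample as a comma-separated line, the
// plain-text shape the data forwarder reads over serial. The function
// name and two-decimal precision are illustrative assumptions.
inline std::string forwarderLine(float ax, float ay, float az) {
    char buf[64];
    std::snprintf(buf, sizeof(buf), "%.2f,%.2f,%.2f", ax, ay, az);
    return std::string(buf);
}
```

On an Arduino, the equivalent would be a few `Serial.print()` calls per sample, emitted at a fixed sampling rate.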
After you’ve built a model you can easily export it as an Arduino library. This library will run on any Arm-based Arduino platform, including the Arduino MKR family or Arduino Nano 33 IoT, provided it has enough RAM to run your model. You can now include your ML model in any Arduino sketch with just a few lines of code. After you’ve added the library to the Arduino IDE, you can find an example of integrating the model under Files > Examples > Your project – Edge Impulse > static_buffer.
To run your models as fast and energy-efficiently as possible, we automatically leverage the hardware capabilities of your Arduino board – for example, the signal processing extensions available on the Arm Cortex-M4 based Arduino Nano 33 BLE Sense or the more powerful Arm Cortex-M7 based Arduino Portenta H7. We also leverage the optimized neural network kernels that Arm provides in CMSIS-NN.
A path to production
This release is the first step in a really exciting collaboration. We believe that many embedded applications can benefit from ML today, whether it’s for predictive maintenance (‘this machine is starting to behave abnormally’), to help with worker safety (‘fall detected’), or in health care (‘detected early signs of a potential infection’). Using Edge Impulse with the Arduino MKR family, you can already quickly deploy simple ML-based applications combined with LoRa, NB-IoT cellular, or WiFi connectivity. Over the coming months we’ll also add integrations for the Arduino Portenta H7 on Edge Impulse, making higher performance industrial applications possible.
Researchers across several universities have developed a controller that provides tangible interaction for 3D augmented reality data spaces.
The device consists of three orthogonal arms, embodying the X, Y, and Z axes, which extend from a central point. These form an interactive space for 3D objects, with linear potentiometers and a rotary button on each axis as a user interface.
At the heart of it all is an Arduino Mega, which takes in data from the sliders to section a model. This enables users to peer inside of a representation with an AR headset, “slicing off” anything that gets in the way by defining a maximum and minimum view plane. The sliders are each motorized to allow them to move together and to provide force feedback.
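A hedged sketch of how slider readings could map to a pair of clipping planes per axis, with a point-visibility test against them. All names, ranges, and the 10-bit ADC assumption are illustrative, not taken from the paper:

```cpp
#include <cassert>

// A min/max view plane pair along one axis.
struct Range { float min, max; };

// Map a raw 10-bit potentiometer reading (0..1023) onto a
// model-space axis interval.
inline float sliderToAxis(int raw, float axisMin, float axisMax) {
    return axisMin + (axisMax - axisMin) * (raw / 1023.0f);
}

// A point of the model is shown only if it lies inside the slider-set
// ranges on all three axes; everything else is "sliced off".
inline bool visible(float x, float y, float z,
                    Range rx, Range ry, Range rz) {
    return x >= rx.min && x <= rx.max &&
           y >= ry.min && y <= ry.max &&
           z >= rz.min && z <= rz.max;
}
```

In practice the same test would run per vertex (or be handed to the renderer as clip planes) each time a slider moves.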
To accomplish this, the round quadruped utilizes one servo to deploy each leg via a parallelogram linkage system and another to move it forwards and backwards. A clever single-servo assembly is also implemented on the bottom to fill gaps left by the legs.
The device is controlled by an Arduino Nano, along with a 16-channel servo driver board. Obstacle avoidance is handled via an ultrasonic sensor, which sticks out of the top half of the sphere and rotates side to side using yet another servo.
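Driver boards like this typically take a per-channel tick count rather than an angle, so sketches convert degrees to pulse ticks. A minimal sketch of that mapping, using common default values that are assumptions rather than details from this build:

```cpp
#include <cassert>

// Map a servo angle (0-180 degrees) onto PWM tick counts for a
// 16-channel driver. 150 and 600 are typical pulse bounds for a
// 12-bit counter at 50 Hz, used here only as placeholders.
inline int angleToTicks(int angleDeg, int minTicks = 150, int maxTicks = 600) {
    if (angleDeg < 0) angleDeg = 0;       // clamp out-of-range input
    if (angleDeg > 180) angleDeg = 180;
    return minTicks + (maxTicks - minTicks) * angleDeg / 180;
}
```

The Nano would compute a tick value like this per servo and send it to the driver board over I2C.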
It’s an impressive mechanical build, especially considering its diminutive size of 130 mm (5.12 in) in diameter.
3D printing allows us to make a wide variety of shapes, but adding interactive features generally means somehow strapping various electronics to them. The AirTouch project, however, presents an alternative option by enabling a fabricated object to sense up to a dozen different touch points with no components or complex calibration necessary.
Instead, compressed air is pumped into the 3D-printed item and escapes through up to 12 tiny holes. When a hole is touched, a barometric sensor picks up the pressure response, which an Arduino Uno board then interprets as user input.
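One plausible way to turn that pressure response into a touch classification (entirely an assumption for illustration; the paper's actual method may differ) is to calibrate a distinct pressure rise for each hole and match the measured delta against that table:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Identify which hole is covered by matching the measured pressure
// rise (deltaPa) against a calibrated per-hole table. Returns the
// hole index, or -1 if no entry is within tolerance. All values and
// names here are illustrative assumptions.
inline int classifyTouch(float deltaPa, const std::vector<float>& holeDeltasPa,
                         float tolerancePa = 10.0f) {
    int best = -1;               // -1 means "no touch recognized"
    float bestErr = tolerancePa;
    for (std::size_t i = 0; i < holeDeltasPa.size(); ++i) {
        float err = std::fabs(deltaPa - holeDeltasPa[i]);
        if (err < bestErr) { bestErr = err; best = static_cast<int>(i); }
    }
    return best;
}
```

The per-hole table would come from a one-time calibration pass: cover each hole in turn and record the settled pressure delta.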
The system has been tested on a variety of interactive figures, from a model rabbit to a bar graph. A short demo can be seen below, while the project’s research paper is found here.