Monthly Archives: January 2020

LoRa made easy: Connect your devices to the Arduino IoT Cloud

via Arduino Blog

An important new feature is now available in the Arduino IoT Cloud — full support for LoRa® devices!

LoRa® is one of our favorite emerging technologies for IoT because it enables long-range, low-power data transmission without cellular or WiFi connections. It’s a very powerful and promising technology, but it comes with its own complexity. In our pursuit to make IoT easier, we’ve already released a few products that enable anyone to build a LoRa® device (or a fleet of LoRa® devices!). Thanks to the Arduino MKR WAN 1310 board combined with the Arduino Pro Gateway, you can create your own LoRaWAN™ network. But we decided to do more than that, and it’s time to release one more important piece…

The Arduino IoT Cloud now provides an incredibly easy way to collect data sent by your LoRa® devices. With a few clicks, the IoT Cloud will generate a sketch template for your boards that you can adapt to read data from your sensors, pre-process it as you want, and then send it to the IoT Cloud. With a few more clicks (no coding required), you’ll be able to create a graphical dashboard that displays the collected data in real time and lets users browse its history through charts and other widgets. You won’t need to worry about coding your own compression, serialization, and queueing algorithms, as it’s all done under the hood in a smart way — you’ll be able to transmit multiple properties (more than five), pushing past the packet size limits of LoRaWAN™.
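For a sense of why that packing matters: LoRaWAN application payloads are tiny, limited to 51 bytes at the slowest EU868 data rate. Here is a minimal Python sketch of the idea, using a hypothetical fixed-width binary encoding and example sensor values, not the Cloud's actual serialization format:

```python
import struct

# At EU868's slowest data rate (DR0), the application payload
# is limited to 51 bytes.
MAX_PAYLOAD_DR0 = 51

def pack_properties(temp_c, humidity_pct, lux, door_open, lat, lon):
    """Pack six properties into a compact binary payload.

    Hypothetical encoding: int16 centi-degrees, uint8 percent,
    uint16 lux, uint8 flag, and two int32 micro-degrees for the
    coordinates.
    """
    return struct.pack(
        ">hBHBii",
        int(temp_c * 100),
        int(humidity_pct),
        int(lux),
        1 if door_open else 0,
        int(lat * 1_000_000),
        int(lon * 1_000_000),
    )

# Six properties fit in 14 bytes, well under the DR0 limit.
payload = pack_properties(21.37, 54, 800, True, 45.070339, 7.686864)
print(len(payload))  # 14
```

Sending the same values as, say, JSON text would blow past the limit immediately, which is why a serialization layer like the Cloud's matters.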

This is our take on edge computing – you program the device to collect and prepare your data locally, and then we take care of shipping that data to a centralized place.

Such a simplified tool for data collection is already quite innovative, but we decided to take it an important step further. The available solutions for LoRa® currently focus on collecting data, but they don’t address the reverse direction: sending data from a centralized application to the LoRa® device(s). Arduino IoT Cloud now lets you do this — you’ll be able to control actuators connected to your device by sending messages via LoRa®, with no coding needed.

Build and control your own LoRaWAN™ network with the Arduino IoT Cloud, the Pro Gateway, and the new, improved MKR WAN 1310 board, which features the latest low-power architecture to extend battery life and bring power consumption as low as 104 µA.

Vulkan is coming to Raspberry Pi: first triangle

via Raspberry Pi

Following on from our recent announcement that Raspberry Pi 4 is OpenGL ES 3.1 conformant, we have some more news to share on the graphics front. We have started work on a much requested feature: an open-source Vulkan driver!


Standards body Khronos describes Vulkan as “a new generation graphics and compute API that provides high-efficiency, cross-platform access to modern GPUs”. The Vulkan API has been designed to better accommodate modern GPUs and address common performance bottlenecks in OpenGL, providing graphics developers with new means to squeeze the best performance out of the hardware.

First triangle

The “first triangle” image is something of a VideoCore graphics tradition: while I arrived at Broadcom too late to witness the VideoCore III version, I still remember the first time James and Gary were able to get a flawless, single-tile, RGB triangle out of VideoCore IV in simulation. So, without further ado, here’s the VideoCore VI Vulkan version.

First triangle out of Vulkan

Before you get too excited, remember that this is just the start of the development process for Vulkan on Raspberry Pi. Igalia has only been working on this new driver for a few weeks, and we still have a very long development roadmap ahead of us before we can put an actual driver in the hands of our users. So don’t hold your breath, and instead look forward to more news from us and Igalia as they make further development progress.


Friday Product Post: Agents of Qwiic Shields

via SparkFun: Commerce Blog

Hello everyone! This week we have a load of new products, starting with two new Qwiic shields - one for the Arduino Nano and one for our popular Thing Plus line. We also have new versions of our micro:climate kit and weather:bit Carrier Board, both for micro:bit. Last, and certainly not least, we have two brand new LIDAR Mappers from SLAMTEC that we know a lot of you have been asking about! Let's jump in and take a closer look.

Shields up!

SparkFun Qwiic Shield for Arduino Nano

SparkFun Qwiic Shield for Thing Plus

The SparkFun Qwiic Shield for Arduino Nano and Qwiic Shield for Thing Plus provide a quick and easy way into SparkFun's Qwiic ecosystem with your Arduino Nano or Thing Plus boards. The Qwiic Shields connect your board's I2C bus (GND, 3.3V, SDA and SCL) to four SparkFun Qwiic connectors (two horizontally mounted, two vertically mounted). The Qwiic connect system allows for easy daisy-chaining: as long as your devices are on different addresses, you can connect as many Qwiic devices as you'd like. The Thing Plus Shield is also compatible with the Feather footprint!
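The one rule behind that daisy-chaining claim is I2C address uniqueness. A small illustrative check in Python (the device list below is just a sample chain, not anything specific to these shields):

```python
def check_qwiic_chain(addresses):
    """Return the set of conflicting 7-bit I2C addresses in a chain.

    Devices sharing an address can't coexist on one bus, which is the
    one rule to respect when daisy-chaining Qwiic boards.
    """
    seen, conflicts = set(), set()
    for addr in addresses:
        if not 0x08 <= addr <= 0x77:
            raise ValueError(f"0x{addr:02X} is outside the valid 7-bit range")
        if addr in seen:
            conflicts.add(addr)
        seen.add(addr)
    return conflicts

# Example chain with two sensors that share address 0x58: the
# duplicate is flagged, so one of them would need an address jumper
# or an I2C mux.
print(check_qwiic_chain([0x77, 0x58, 0x58]))  # {88} i.e. 0x58
```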

Be your own meteorologist!

SparkFun micro:climate kit for micro:bit - v3.0

SparkFun weather:bit - micro:bit Carrier Board (Qwiic)

The SparkFun micro:climate kit is a full weather station kit built on top of the weather:bit Carrier Board. Unlike previous weather kits we've carried, this micro:climate kit is Qwiic-enabled and includes our tried-and-true Weather Meters and Soil Moisture Sensor, so whether you're an agriculturalist, a professional meteorologist, or a hobbyist, you'll be able to build a high-grade weather station powered by the micro:bit. You can even talk via wireless communication between two micro:bits with this kit, so you can monitor the weather without being exposed to it! And if you're just looking for the weather:bit Carrier Board inside the kit, we have that available on its own as well. Please be aware that neither of these products comes with a micro:bit; it will need to be purchased separately.

SLAMTEC Mapper Developer Kit - Laser Mapping Sensor (M1M1)

SLAMTEC Mapper Pro Kit - Laser Mapping Sensor (M2M1)

The SLAMTEC Mapper Developer and Pro Kits are a new type of laser sensor introduced by (you guessed it) SLAMTEC, different from traditional LIDAR. Each version has built-in simultaneous localization and mapping (SLAM) and is suitable for applications such as robot navigation and positioning, environmental mapping, and hand-held measurement.

That's it for this week! As always, we can't wait to see what you make! Shoot us a tweet @sparkfun, or let us know on Instagram or Facebook. We’d love to see what projects you’ve made!


The Synthfonio is a guitar-shaped MIDI instrument

via Arduino Blog

Learning to play an instrument well takes a lot of time, which many people don’t have. To address this, Franco Molina — who enjoys MIDI controllers and writing music, but describes himself as being terrible at playing the keyboard — created the Synthfonio.

Molina’s DIY device is vaguely reminiscent of a guitar, with a series of keys on the neck that indicate the chords and key signatures, and another set roughly positioned where you’d strum a guitar to play the notes.

The Synthfonio is assembled from laser-cut MDF sections, and utilizes a MKR WiFi 1010 to take care of I/O and MIDI functions. A second microcontroller in the form of an ATmega328 on a breadboard is used to produce the actual synth sounds, though most Arduino boards would be suitable for either function.

The Synthfonio features two sets of keys: one to define chords and key signatures, and another to actually play the notes. Whatever chord is pressed on the instrument’s neck keys defines the pitch of the keys on the instrument’s handle, similar to a guitar, violin, or other string instrument — with the added advantage that the Synthfonio is a smart device that can deduce the chords being played from a single set of notes. This way, for example, the musician can use the handle keys to play chords, melodies, and arpeggios in the key of A just by pressing the A key on the neck. Likewise, pressing the A key on the neck in conjunction with the C key (a minor third above A) will activate an A minor tonality for the handle keys.
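That minor-third rule can be sketched in a few lines of Python. This is a simplified reimplementation of the idea, not Molina's actual firmware, and it assumes the first pressed neck key is treated as the root:

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def deduce_chord(pressed):
    """Deduce a major/minor tonality from the neck keys.

    `pressed` is a list of note names; the first is treated as the
    root. A note 3 semitones above the root (a minor third) selects a
    minor tonality; otherwise (e.g. a major third, or a lone root)
    the tonality defaults to major.
    """
    root = NOTE_NAMES.index(pressed[0])
    for note in pressed[1:]:
        interval = (NOTE_NAMES.index(note) - root) % 12
        if interval == 3:  # minor third
            return f"{pressed[0]} minor"
    return f"{pressed[0]} major"

print(deduce_chord(["A"]))       # A major
print(deduce_chord(["A", "C"]))  # A minor: C is 3 semitones above A
```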

This allows any player to execute a four-chord melody, accompaniment, or even improvisation with no more than one or two fingers in position.

Enginursday: Light Suit Update

via SparkFun: Commerce Blog

The Wonderful Mistakes I've Made to Get to this Point

Pretty much as long as I've been playing with electronics, I've been working on this wacky Light Suit project to wear out to festivals and light up the night. I've learned so much along the journey. I've made so many Enginursday posts about it that at this point you may be getting sick of the project, but I feel an obligation to keep folks updated all the way through to the end.

When I arrived at SparkFun, I was on the second iteration of the project and needed to fix a few 3D-printed parts. I wore this updated version to a few shows and quickly broke the gesture sensing that I had soldered into my hands. I came back into the office on Monday and bang! The SparkFun Qwiic Flex Glove Controller was born, in order to isolate the exceedingly fragile solder points on the flex sensor. From there, I decided to scrap the second version, as the 1 mm thick fibers I was using were brittle and hard to work with.

The Current Light Suit

In the third version I got a lot closer to the vision in my head. I designed some custom PCBs to mount addressable LEDs on, instead of individually PWM-ing each common-cathode LED (I still have unplaced anger at whoever let me live so long without knowing about the WS2812). I was also able to use the SparkFun Qwiic Flex Glove Controller that I had designed after the failure of the second iteration, and I sourced new fiber to test some neat audio-reactive and gesture-reactive code.

However, the design had one major flaw: the LED modules were connected via Qwiic cables, as I had this strange obsession with making the system solderless for easy repair out in the wild. This worked for a little while, but some of the cables had a bit less friction with their receptacles, so exuberant movements would promptly unplug things. The cables would also occasionally break off the connector after a lot of movement.

The moral of the story here is that you'll need some pretty sturdy plugs if you want to use them with any sort of wearable. Qwiic cables, while useful and readily available, were not necessarily the right choice for this application. I was also having problems figuring out how to create multiple I2C ports to listen to all of my gesture sensing equipment. However, I solved most of these issues in the fourth iteration of the suit.

The Brain

To control the whole operation I made a custom board with an ESP32-WROOM, starting from SparkFun's ESP32 Thing Plus design. From there it was a matter of changing a few things to suit the board to my purposes. I replaced the CP2102 serial converter with a CH340G, as it's easier to hand place/rework. I also added breakouts for I2C and LED data lines in each corner of the board to control each limb of the suit, and tossed on a MEMS microphone for audio reaction. For power, it gets two 18650 batteries in series and a 5V 3A LDO to supply the LEDs. I admit that this could've been done far better (at full charge, the LDO has to drop 3.4 volts, gross) but give me a break, I've got a strip of these LDOs and they're not gonna place themselves in designs.
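Those numbers check out against the basic linear-regulator math, P = (Vin − Vout) × Iload, and they show why "gross" is the right word. A quick Python sanity check using the figures from the post (two 4.2 V cells in series, 5 V out, 3 A load):

```python
def ldo_dissipation(v_in, v_out, i_load):
    """Power burned as heat in a linear regulator, in watts."""
    return (v_in - v_out) * i_load

# Two 18650 cells in series at full charge: 2 * 4.2 V = 8.4 V.
v_full = 2 * 4.2
print(f"{v_full - 5.0:.1f} V dropped")                 # 3.4 V
print(f"{ldo_dissipation(v_full, 5.0, 3.0):.1f} W")    # 10.2 W of heat
```

At the full 3 A load, over 10 W goes into the regulator as heat, which is why a switching boost or buck regulator is the usual fix.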

In the future, my plan is to put the batteries in parallel with some sort of charger IC that I haven't had time to source, as well as the beefiest boost regulator I can find. There's always room for improvement, but check out the board below.

Light Suit Brain Board File

The Brain for the Light Suit

This guy then sits in a small 3D-printed box right where a belt buckle would go; I thread my belt through part of the box to anchor it to my body.

Box for the Brain

3D Printed Brain Box

LEDs and Fiber Optic

Now that we have control figured out, we need to mate some LEDs up to fiber optic. For this, we'll use an itty-bitty custom PCB with two APA102 LEDs on each side.
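APA102s are clocked over plain SPI rather than the WS2812's timing-critical one-wire protocol, which is part of their appeal here. As a sketch of what drives those boards, here is the byte stream for a short chain, built in Python from the commonly documented APA102 framing; this is an illustration, not SparkFun's or the suit's actual code:

```python
def apa102_frame(pixels, brightness=31):
    """Build the SPI byte stream for a chain of APA102 LEDs.

    Each LED frame is a header byte (0b111 + 5-bit global
    brightness) followed by B, G, R. The stream starts with 32 zero
    bits and ends with at least n/2 extra clock bits to push data
    through the whole chain.
    """
    if not 0 <= brightness <= 31:
        raise ValueError("brightness is 5-bit (0-31)")
    out = bytearray(4)  # start frame: 32 zero bits
    for r, g, b in pixels:
        out += bytes([0b11100000 | brightness, b, g, r])
    out += b"\xff" * ((len(pixels) + 15) // 16)  # end frame
    return bytes(out)

# Two LEDs, as on one side of the light suit's boards: red and blue.
stream = apa102_frame([(255, 0, 0), (0, 0, 255)])
print(len(stream))  # 13: 4 start + 2*4 LED frames + 1 end byte
```

On a microcontroller the same bytes would simply be shifted out over a hardware SPI peripheral.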

LED Board

This fits between two identical 3D-printed parts, which align the PCB and LEDs with four strands of 5 mm diameter fiber optic. Screwing these together clamps down on the PCB, fiber, and control/power lines all at once for a secure fit.

Fiber Grip

Fiber-LED Grip

This 3D part is then sewn onto some custom adjustable armbands (shout out to Mom for letting me use her sewing machine). We cut up a ton of fiber optic to the proper lengths and connected it all up.

Single Fiber Grip Module

I go through a detailed analysis of my gesture controls in this blog post about using multiple I2C ports on ESP32. However, I basically have two SparkFun Qwiic Flex Glove Controllers sewn onto each hand. In each shoe I have a custom board, based on the same ADS1015 analog-to-I2C conversion chip as the gloves, that senses how my weight is balanced on each foot. I have two I2C buses broken out from my ESP32: one for the hand controls and another for the legs. As I was monkeying around last night, I realized that I'd bent a few of the sensors in travel (the demo always works, right?) and they don't sense so well anymore. Because of this, all I really got looking good for video was the audio buffer portion. Check it out below and stay tuned for a final code update.
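As an illustration of what those foot boards can report, here is a toy Python model that turns two pressure readings into a single toe/heel balance figure. This is my own simplification, not the suit's firmware, and `toe_raw`/`heel_raw` are hypothetical ADC counts:

```python
def balance(toe_raw, heel_raw):
    """Weight balance from two pressure readings on one foot.

    Returns a value from -1.0 (all weight on the heel) to +1.0
    (all weight on the toes); 0.0 means centered.
    """
    total = toe_raw + heel_raw
    if total == 0:
        return 0.0  # foot off the ground entirely
    return (toe_raw - heel_raw) / total

print(balance(900, 300))  # 0.5: leaning toward the toes
print(balance(0, 0))      # 0.0: airborne
```

Mapping that single number onto an LED animation parameter is then straightforward.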


Arduino Pro IDE v0.0.4-alpha is here!

via Arduino Blog

Our dev team has some more exciting news to share: Arduino Pro IDE v0.0.4-alpha has been released.

Highlights include:

  • Automatic Arduino language server (LS) recovery. From now on, if the LS process terminates, it restarts automatically.
  • Updated the bundled Clang version to 9.0.0. Bundled Clangd into the application for all supported platforms.
  • Better keybinding support for the upload, verify, and serial monitor commands.

You can download the latest version here.