Monthly Archives: March 2021

BERT Machine Learning Model and the RP2040 Thing Plus

via SparkFun: Commerce Blog

The SparkFun RP2040 Thing Plus is awfully enticing to use for machine learning...not only does it have 16 MB of flash memory, but the RP2040 SoC is well suited to low-power machine learning inference, thanks to its energy-efficient dual Arm Cortex-M0+ cores running at a comparatively high 133 MHz.

SparkFun Thing Plus - RP2040 (DEV-17745, $17.95)

Since a version of the TensorFlow Lite Micro library has been ported to the Raspberry Pi Pico, we can try running machine learning models on RP2040 boards that detect people in images, or recognize gestures and voices. But beyond that, TensorFlow has documentation for building really useful text recognition machine learning models, including the BERT Question Answer model.

Even if you haven't heard of the BERT Question Answer model, it's likely that you've interacted with systems that follow the same principles. It's what allows machines to read and comprehend human language and interact with us in return. Developed by Google, BERT stands for Bidirectional Encoder Representations from Transformers, which means it uses a stack of Transformer encoders to read text input and produce predictions. The bidirectional part means that it reads text both left to right and right to left, so that it can understand the context of a word within its text. Basically, it's the key to building machines that can actually communicate with us, like the chatbots you interact with on the web. For example, you can see how the model might be able to pick out an answer from the passage below.

[Image: example passage and question, with the answer the model picks out highlighted]

The question is, can it be converted to run on a microcontroller like the RP2040 Thing Plus, whose limited RAM and storage place tight constraints on the size of the machine learning model? We attempted exactly that: training a model through TensorFlow, converting it to C files that could be loaded onto the RP2040 Thing Plus, and then feeding it new data.

Training the Model

TensorFlow has extensive documentation that leads you through training the model, and the amount of code required is surprisingly small. It comes down to five main steps: choosing the model (in this case MobileBERT, since it’s thin and compact enough for resource-limited microcontroller use), loading in data, retraining the model with the given data, evaluating it, and exporting it to TensorFlow Lite format (.tflite).

# Imports from the TensorFlow Lite Model Maker library (the exact module
# layout may vary between versions).
from tflite_model_maker import model_spec
from tflite_model_maker import question_answer
from tflite_model_maker import QuestionAnswerDataLoader

# Chooses a model specification that represents the model.
spec = model_spec.get('mobilebert_qa')

# Gets the training data and validation data.
train_data = QuestionAnswerDataLoader.from_squad(train_data_path, spec, is_training=True)
validation_data = QuestionAnswerDataLoader.from_squad(validation_data_path, spec, is_training=False)

# Fine-tunes the model.
model = question_answer.create(train_data, model_spec=spec)

# Gets the evaluation result.
metric = model.evaluate(validation_data)

# Exports the model to the TensorFlow Lite format with metadata in the export directory.
model.export(export_dir)

The data we'll be giving it comes from TriviaQA, a large-scale dataset built to train machine reading comprehension.

Once we export the model to a .tflite file, we'll need to use xxd to convert it to a C array so that it can be compiled into a program for the microcontroller.

xxd -i converted_model.tflite > model_data.cc
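If you're curious what xxd is doing here (or don't have it installed), the conversion can be sketched in a few lines of Python. The file names are just the placeholders from the command above, and the output format mirrors xxd -i:

```python
# Sketch of what `xxd -i` produces: a C unsigned char array plus a
# length constant, built from the raw bytes of the .tflite file.

def to_c_array(data: bytes, name: str) -> str:
    lines = []
    for i in range(0, len(data), 12):  # 12 bytes per line, like xxd -i
        chunk = data[i:i + 12]
        lines.append("  " + ", ".join(f"0x{b:02x}" for b in chunk))
    body = ",\n".join(lines)
    return (f"unsigned char {name}[] = {{\n{body}\n}};\n"
            f"unsigned int {name}_len = {len(data)};\n")

# Usage (file names as in the xxd command above):
#   with open("converted_model.tflite", "rb") as f:
#       open("model_data.cc", "w").write(
#           to_c_array(f.read(), "converted_model_tflite"))
```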

Running Inference on the RP2040 Thing Plus

Running inference on a device essentially means loading the model onto the microcontroller and testing it there. TensorFlow's documentation for running inference on microcontrollers walks through several steps: including the library and model headers, loading the model, allocating memory for its tensors, and running the model on new data it hasn't seen before.

#include "pico/stdlib.h"
#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"
#include "tensorflow/lite/version.h"
#include "conv_bert_quant.h"

Lastly, we can run the model with an input, like the passage below, and see in the console what answers the model produces from the text.

[Image: console output showing the model's answers for a sample passage]

Final Thoughts

It's quite amazing that a microcontroller can run a machine learning model like BERT, but thanks to TensorFlow, it's possible to run all sorts of machine learning models. What kind of machine learning applications interest you? I highly recommend giving them a try on one of the RP2040 boards, because they are so well equipped for this kind of heavy lifting. Comment below what you want to try, and happy hacking!


Custom machine stand ‘lets you know’ if drill bits aren’t stored properly

via Arduino Blog

YouTuber Cranktown City recently acquired a new milling machine/drill press, and needed somewhere sturdy to place it. Rather than buying something, he went to work making a nice custom stand with a drawer on top and space for a toolbox below that.

To help keep things organized, this top drawer features a 3D-printed drill index with an interesting trick. In addition to providing storage for the drill bits, it “encourages” you to put them back. Each drill cavity has a small switch, all of which are daisy-chained together. The switch signal is fed to an Arduino Nano, which reads high when all drills are present, and low if one or more is missing. If one is missing for too long, it triggers a sound module that insults him into proper organization, and lights up a strip of LEDs as an extra reminder.

Code and CAD for the project are available on GitHub if you’d like to try something similar!

The post Custom machine stand ‘lets you know’ if drill bits aren’t stored properly appeared first on Arduino Blog.

“Let’s Invent the Future Together”

via SparkFun: Commerce Blog

We're excited to participate in this year’s GTC21 conference, with two great sessions for attendees to watch! While we can't give away too many details right now, stay tuned for more info and links to our sessions. We can't wait to share what we've been working on!


Haven't heard of GTC21? It's a conference held by NVIDIA that focuses on artificial intelligence. This year’s online format allows you to explore the latest in AI from the comfort of your desk, couch, backyard, etc. With more than 1,400 sessions free to attend, the conference covers a wide range of industries and interests. Topics range from autonomous machines, to high-performance computing, to graphics and game design, to data science and deep learning, to IoT! The week of AI exploration and innovation starts with a keynote from NVIDIA's Founder and CEO, Jensen Huang. See more details below!

Conference details:

  • April 12-16th
  • Watch online demos, live and recorded sessions, panels and more!
  • Register for free over on GTC21's site

While there are more than a thousand(!) sessions to watch for free, NVIDIA is also offering opportunities for paid, hands-on training with their Deep Learning Institute (DLI). Attendees can spend a full day further exploring topics such as AI, accelerated computing or accelerated data science, and earn an NVIDIA DLI certificate! Explore training available on the conference's site.

Keynote details:

Keynote talk details. Image of NVIDIA Founder and CEO Jensen Huang

This year’s conference kicks off with a keynote from NVIDIA’s Founder and CEO, Jensen Huang. Expect exciting announcements and more about NVIDIA’s company vision for computing. Keynote begins Monday, April 12th, at 8:30 a.m. PDT/ 9:30 a.m. MDT/ 11:30 a.m. EDT. More information about the keynote is available on GTC21's site.


Drag-n-drop coding for Raspberry Pi Pico

via Raspberry Pi

Introducing Piper Make: a Raspberry Pi Pico-friendly drag-n-drop coding tool that’s free for anyone to use.

piper make screenshot
The ‘Digital View’ option displays a dynamic view of Raspberry Pi Pico showing GPIO states

Edtech startup Piper, Inc. launched this brand new browser-based coding tool on #PiDay. If you already have a Raspberry Pi Pico, head to make.playpiper.com and start playing with the coding tool for free.

Pico in front of Piper Make screen
If you already have a Raspberry Pi Pico, you can get started right away

Complete coding challenges with Pico

The block coding environment invites you to try a series of challenges. When you succeed in blinking an LED, the next challenge is opened up to you. New challenges are released every month, and it’s a great way to guide your learning and give you a sense of achievement as you check off each task.

But I don’t have a Pico or the components I need!

You’re going to need some kit to complete these challenges. The components you’ll need are easy to get hold of, and they’re things you probably already have lying around if you like to tinker, but if you’re a coding newbie and don’t have a workshop full of trinkets, Piper makes it easy for you. You can join their Makers Club and receive a one-off Starter Kit containing a Raspberry Pi Pico, LEDs, resistors, switches, and wires.

Piper Make starter kit
The Starter Kit contains everything you need to complete the first challenges

If you sign up to Piper’s Monthly Makers Club you’ll receive the Starter Kit, plus new hardware each month to help you complete the latest challenge. Each Raspberry Pi Pico board ships with Piper Make firmware already loaded, so you can plug and play.

Piper Make starter kit in action
Trying out the traffic light challenge with the Starter Kit

If you already have things like a breadboard, LEDs, and so on, then you don’t need to sign up at all. Dive straight in and get started on the challenges.

I have a Raspberry Pi Pico. How do I play?

A quick tip before we go: when you hit the Piper Make landing page for the first time, don’t click ‘Getting Started’ just yet. You need to set up your Pico first of all, so scroll down and select ‘Setup my Pico’. Once you’ve done that, you’re good to go.

Scroll down on the landing page to set up your Pico before hitting ‘Getting Started’

The post Drag-n-drop coding for Raspberry Pi Pico appeared first on Raspberry Pi.

Graphic routines for Raspberry Pi Pico screens

via Raspberry Pi

Pimoroni has brought out two add‑ons with screens: Pico Display and Pico Explorer. A very basic set of methods is provided in the Pimoroni UF2 file. In this article, we aim to explain how the screens are controlled with these low-level instructions, and provide a library of extra routines and example code to help you produce stunning displays.

You don’t have to get creative with your text placement, but you can

You will need to install the Pimoroni MicroPython UF2 file on your Pico and Thonny on your computer.

All graphical programs need the following ‘boilerplate’ code at the beginning to initialise the display and create the essential buffer. (We’re using a Pico Explorer – just change the first line for a Pico Display board.)

import picoexplorer as display
# import picodisplay as display
#Screen essentials
width = display.get_width()
height = display.get_height()
display_buffer = bytearray(width * height * 2)
display.init(display_buffer)

The four buttons give you a way of getting data back from the user as well as displaying information

This creates a buffer with a 16-bit colour element for each pixel of the 240×240 pixel screen. The code invisibly stores colour values in the buffer, which are then revealed with a display.update() instruction.

The top-left corner of the screen is the origin (0,0) and the bottom-right pixel is (239,239).
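As a sketch of what that buffer layout implies: each pixel occupies two bytes, and these ST7789-based screens typically pack red, green, and blue into 5, 6, and 5 bits (the RGB565 layout below is our assumption, not something stated in the Pimoroni docs):

```python
# Buffer addressing and 16-bit colour packing for a 240x240 screen.
# The RGB565 bit layout is assumed here - check your firmware's docs.
WIDTH = HEIGHT = 240

def rgb565(r, g, b):
    # Keep the top 5/6/5 bits of each 8-bit channel.
    return ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3)

def buffer_offset(x, y):
    # Byte offset of pixel (x, y): row-major order, 2 bytes per pixel.
    return (y * WIDTH + x) * 2

# Pure red packs to 0xF800; the whole buffer is 240*240*2 = 115200 bytes,
# matching the bytearray(width * height * 2) in the boilerplate above.
```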

Supplied methods

display.set_pen(r, g, b)

Sets the current colour (red, green, blue) with values in the range 0 to 255.

grey = display.create_pen(100,100,100)

Allows naming of a colour for later use.

display.clear()

Fills all elements in the buffer with the current colour.

display.update()

Makes the current values stored in the buffer visible. (Shows what has been written.)

display.pixel(x, y)

Draws a single pixel with the current colour at
point(x, y).

display.rectangle(x, y, w, h)

Draws a filled rectangle from point(x, y), w pixels wide and h pixels high.

display.circle(x, y, r)

Draws a filled circle with centre (x, y) and radius r.

display.character(78, 112, 5, 2)

Draws character number 78 (ASCII = ‘N’) at point (112,5) in size 2. Size 1 is very small, while 6 is rather blocky.

display.text("Pixels", 63, 25, 200, 4)

Draws the text on the screen from (63,25) in size 4, wrapping to the next line at a ‘space’ if the text is longer than 200 pixels. (Complicated but very useful.)
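To illustrate the wrap-at-space rule, here is a simplified sketch. It assumes a fixed character width of 6 × size pixels, which is only an approximation, since the built-in font is actually proportional:

```python
# Simplified model of display.text() word wrapping: break at spaces
# once a line would exceed wrap_px pixels. The 6-pixels-per-character
# cell is an assumption; the real font is proportional.

def wrap_words(text, wrap_px, size):
    char_px = 6 * size
    lines, current = [], ""
    for word in text.split(" "):
        candidate = word if not current else current + " " + word
        if len(candidate) * char_px <= wrap_px:
            current = candidate
        else:
            if current:
                lines.append(current)
            current = word
    if current:
        lines.append(current)
    return lines

# With the call above, "Pixels" at size 4 fits comfortably in 200 pixels.
```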

display.pixel_span(30,190,180)

Draws a horizontal line 180 pixels long from point (30,190).

display.set_clip(20, 135, 200, 100)

While the screens are quite small in size, they have plenty of pixels for display

After this instruction, which sets a rectangular area from (20,135), 200 pixels wide and 100 pixels high, only pixels drawn within the set area are put into the buffer. Drawing outside the area is ignored. So only those parts of a large circle intersecting with the clip are effective. We used this method to create the red segment.
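The clip test itself is just a rectangle containment check. Here is a minimal sketch of the decision made for each drawn pixel (whether the edges are inclusive or exclusive is our assumption about the firmware):

```python
# Sketch of the per-pixel test a clip region implies: pixels outside
# the rectangle are simply ignored. Inclusive left/top and exclusive
# right/bottom edges are an assumption.

def in_clip(x, y, cx, cy, cw, ch):
    return cx <= x < cx + cw and cy <= y < cy + ch

# With set_clip(20, 135, 200, 100): the centre of the red circle at
# (119, 119) is rejected, while its lower arc around (119, 160) is kept.
```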

display.remove_clip()

This removes the clip.

display.update()

This makes the current state of the buffer visible on the screen. Often forgotten.

if display.is_pressed(3): # Y button is pressed ?

Read a button, numbered 0 to 3.

You can get more creative with the colours if you wish

This code demonstrates the built-in methods and can be downloaded here.

# Pico Explorer - Basics
# Tony Goodhew - 20th Feb 2021
import picoexplorer as display
import utime, random
#Screen essentials
width = display.get_width()
height = display.get_height()
display_buffer = bytearray(width * height * 2)
display.init(display_buffer)

def blk():
    display.set_pen(0,0,0)
    display.clear()
    display.update()

def show(tt):
    display.update()
    utime.sleep(tt)
   
def title(msg,r,g,b):
    blk()
    display.set_pen(r,g,b)
    display.text(msg, 20, 70, 200, 4)
    show(2)
    blk()

# Named pen colour
grey = display.create_pen(100,100,100)
# ==== Main ======
blk()
title("Pico Explorer Graphics",200,200,0)
display.set_pen(255,0,0)
display.clear()
display.set_pen(0,0,0)
display.rectangle(2,2,235,235)
show(1)
# Blue rectangles
display.set_pen(0,0,255)
display.rectangle(3,107,20,20)
display.rectangle(216,107,20,20)
display.rectangle(107,3,20,20)
display.rectangle(107,216,20,20)
display.set_pen(200,200,200)
#Compass  points
display.character(78,112,5,2)   # N
display.character(83,113,218,2) # S
display.character(87,7,110,2)   # W
display.character(69,222,110,2) # E
show(1)
# Pixels
display.set_pen(255,255,0)
display.text("Pixels", 63, 25, 200, 4)
display.set_pen(0,200,0)
display.rectangle(58,58,124,124)
display.set_pen(30,30,30)
display.rectangle(60,60,120,120)
display.update()
display.set_pen(0,255,0)
for i in range(500):
    xp = random.randint(0,119) + 60
    yp = random.randint(0,119) + 60
    display.pixel(xp,yp)
    display.update()
show(1)
# Horizontal line
display.set_pen(0,180,0)
display.pixel_span(30,190,180)
show(1)
# Circle
display.circle(119,119,50)
show(1.5)
display.set_clip(20,135, 200, 100)
display.set_pen(200,0,0)
display.circle(119,119,50)
display.remove_clip()

display.set_pen(0,0,0)
display.text("Circle", 76, 110, 194, 3)
display.text("Clipped", 85, 138, 194, 2)
display.set_pen(grey) # Previously saved colour
# Button Y
display.text("Press button y", 47, 195, 208, 2)
show(0)
running = True
while running:
    if display.is_pressed(3): # Y button is pressed ?
        running = False
blk()

# Tidy up
title("Done",200,0,0)
show(2)
blk()

Straight lines can give the appearance of curves

We’ve included three short procedures to help reduce code repetition:

def blk() 

This clears the screen to black – the normal background colour.

def show(tt)

This updates the screen, making the buffer visible and then waits tt seconds.

def title(msg,r,g,b)

This is used to display the msg string in size 4 text in the specified colour for two seconds, and then clears the display.

As you can see from the demonstration, we can accomplish a great deal using just these built-in methods. However, it would be useful to be able to draw vertical lines, lines from point A to point B, hollow circles, and rectangles. If these are written as procedures, we can easily copy and paste them into new projects to save time and effort.

You don’t need much to create interesting graphics

In our second demonstration, we’ve included these ‘helper’ procedures. They use the parameters (t, l, r, b) to represent the (top, left) and the (right, bottom) corners of rectangles or lines.

def horiz(l,t,r):    # left, top, right

Draws a horizontal line.

def vert(l,t,b):   # left, top, bottom

Draws a vertical line.

def box(l,t,r,b):  # left, top, right, bottom

Draws an outline rectangular box.

def line(x,y,xx,yy): 

Draws a line from (x,y) to (xx,yy).

def ring(cx,cy,rr,rim): # Centre, radius, thickness

Draws a circle, centred on (cx,cy), of outer radius rr and pixel thickness of rim. This is easy and fast, but has the disadvantage that it wipes out anything inside the ring.

def ring2(cx,cy,r):   # Centre (x,y), radius

Draws a circle centred on (cx,cy), of radius r, with a single-pixel width. It can be used to flash a ring around something already drawn on the screen. You need to import math as it uses trigonometry.

def align(n, max_chars):

This returns a string version of int(n), right aligned in a string of max_chars length. Unfortunately, the font supplied by Pimoroni in its UF2 is not monospaced.
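The full implementations are in the downloadable demo; as an illustration, here is a hedged sketch of two of them. The bodies are hypothetical and written to run without a display attached: circle_points returns the outline points that ring2 would hand to display.pixel(), and align follows the description above:

```python
import math

def circle_points(cx, cy, r):
    # One point per degree around the centre; the real ring2 helper
    # would pass each of these to display.pixel() instead of
    # collecting them in a set.
    pts = set()
    for angle in range(360):
        x = cx + round(r * math.cos(math.radians(angle)))
        y = cy + round(r * math.sin(math.radians(angle)))
        pts.add((x, y))
    return pts

def align(n, max_chars):
    # Right-align int(n) in a field of max_chars characters, so the
    # units digit stays in a fixed column as the value changes.
    return str(int(n)).rjust(max_chars)
```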

What will you create with your Pico display?

The second demonstration is too long to print, but can be downloaded here.

It illustrates the character set; drawing lines, circles, and boxes; plotting graphs; writing text at an angle or following a curved path; scrolling text along a sine curve; controlling an interactive bar graph with the buttons; updating a numeric value; changing the size and brightness of disks; and changing the colour of a rectangle.

The program is fully commented, so it should be quite easy to follow.

The most common coding mistake is to forget the display.update() instruction after drawing something. The second is putting it in the wrong place.

When overwriting text on the screen to update a changing value, you should first overwrite the value with a small rectangle in the background colour. Notice that the percentage value is right-aligned to lock the ‘units’ position. 

It’s probably not a good idea to leave your display brightly lit for hours at a time. Several people have reported the appearance of ‘burn’ on a dark background, or ‘ghost’ marks after very bright items against a dark background have been displayed for some time. We’ve seen them on our display, but no long-term harm is evident. Blanking the screen in the ‘tidy-up’ sequence at the end of your program may help.

We hope you have found this tutorial useful and that it encourages you to start sending your output to a display. This is so much more rewarding than just printing to the REPL.

If you have a Pimoroni Pico Display (240×135 pixels), all of these routines will work on your board.

Issue 41 of HackSpace magazine is on sale NOW!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents. As always, every issue is free to download from the HackSpace magazine website.

The post Graphic routines for Raspberry Pi Pico screens appeared first on Raspberry Pi.

Arduino World Gathering 2021: the official community conference you can’t miss

via Arduino Blog

We’re proud to announce the Arduino World Gathering, taking place everywhere in October 2021. Multiple days packed with workshops, lightning talks and project demos; a virtual event for everyone to enjoy.

This is a conference made by you. Whether you built a cool project with Arduino for fun or profit, you want to share a neat hack with novice users, or you want to host a workshop on a particular skill, technique, or special know-how you’ve acquired – we want you.

Hackers, creators, designers, engineers, educators. Stop what you’re doing and start putting your ideas together now. A call for proposals will open soon.

We’ll talk about hardware, software, open source, creative technology, interactive art, smart products, professional applications, education, home automation, Internet of Things, artificial intelligence and more. All things Arduino!

Enter your email to get notified about the call for proposals and any other AWG updates:

Interested in sponsoring the conference? Contact us.

The post Arduino World Gathering 2021: the official community conference you can’t miss appeared first on Arduino Blog.