Tag Archives: Raspberry Pi

The Arduino Nano RP2040 Connect is here

via Arduino Blog

It was back in January that we first introduced you to the Arduino Nano RP2040 Connect, the first Arduino board to include Raspberry Pi silicon. It’s been a roller coaster ride getting it to you, and enthusiasm during the wait has been incredibly encouraging. The wait, you’ll be glad to hear, is over.

The Arduino Nano RP2040 Connect mountain climbing

The RP2040 Processor

Working with the Raspberry Pi Foundation is nothing short of a pleasure. The teams there make some incredible devices, and their first in-house silicon is no exception. These guys get it.

This system-on-a-chip is a 32-bit dual-core ARM Cortex-M0+ microcontroller clocked at 133MHz, and it’s powerful enough to run TensorFlow Lite. It’s young, but proving to be incredibly popular with makers, as well as electronics manufacturers. It’s going to be incredibly exciting to see how the Arduino community reacts to it. We can only imagine what you guys can achieve with the extra features of the Nano RP2040 Connect board.

Welcome the Arduino Nano RP2040 Connect

So it was an easy choice for Arduino to put an RP2040 at the core of a new board. We felt so strongly about the excellence of this new chip that we knew it deserved a powerful, premium Nano board that is unrivalled in terms of features.

First and foremost is the inclusion of the u-blox NINA-W102 WiFi and Bluetooth radio module. Nano users are probably quite familiar with this excellent module already.

Coupled with a six-axis machine learning-capable IMU motion sensor, on-board microphone for sound and voice activation, an RGB LED and loads of multi-function GPIO pins, this is the project maker’s dream come true. And all on such a tiny board.

Nano RP2040 Connect in the Cloud

Just like everything Arduino, the hardware of the Nano RP2040 Connect is only half the story.

Right off the bat this device is fully compatible with the Arduino Cloud. It landed at just the right moment, as Arduino Cloud plans were given an overhaul. These offer a lot more on the free tier, while bringing in a new Entry Plan that really unlocks the power of the cloud.

Nano RP2040 Connect to Arduino Cloud

Because the Nano RP2040 Connect is a connected device, this opens up all kinds of possibilities. Not least of all over-the-air updates and programming. This alone can make a Cloud accompaniment to the board worthwhile. It gives you full, incredibly easy access to the hardware. This is true even after it’s been deployed, installed or buried in the guts of a project. If it’s got a WiFi signal, you can do everything as if it was plugged in by USB. Furthermore, it has the added bonus of smartphone control through the Arduino IoT Remote app.

The Cloud even makes it super easy for your Nano RP2040 Connect to communicate wirelessly with other boards. Any devices connected to your Arduino Cloud can communicate, and we’re not just talking about official Arduino boards.

So Much Software

A couple of weeks ago we updated the official Arduino Mbed Core to provide native RP2040 support.

The plug-and-play nature of the Arduino Core means you can use existing sketches you made for, say, a Nano 33 BLE Sense on your brand new Nano RP2040 Connect. So you can have this little workhorse up and running within minutes, if you’ve already been working on some project sketches. Plus, it’s compatible with the entire RP2040 software ecosystem, so if this is an upgrade for an existing RP2040 board, you’re good to go.

If you’re just getting started on sketches for the device, it offers full support for MicroPython. There’s even a free OpenMV license bundled in, for any machine vision projects you might have planned.

Arduino Nano RP2040 Connect

Go Get Your Arduino Nano RP2040 Connect

Yes, there’s a limited supply at launch. We built as many as possible for the first run. But a lot have been sent out to our reseller partners. So head on over to the store right now if you want to be one of the first to get this premium RP2040 board.

If you want to stay up to date on all things Arduino Nano RP2040 Connect, make sure you’re signed up to our email list. From there we’ll keep you advised on restocking, new updates, special offers and everything else to do with this tiny, but mighty, board.

The post The Arduino Nano RP2040 Connect is here appeared first on Arduino Blog.

Welcome Raspberry Pi to the world of microcontrollers

via Arduino Blog

‘Raspberry and chips,’ not something you’d like to eat but in the world of silicon it’s actually a great combination. Eben Upton recently shared with us Raspberry Pi’s exciting vision for a revolutionary product that they were working on: a microcontroller, the RP2040, based on Raspberry Pi silicon.

The news was both disruptive and exciting at the same time. At Arduino, we love to put our hands on innovative technologies, micros, sensors and all the building blocks that allow us to fulfill our mission of making technology simple to use for everyone. The curiosity was growing and a few weeks later we were already tinkering with the initial development tools. The processor is a very intriguing beast — it’s a dual-core Cortex-M0+ microcontroller with fairly sophisticated architecture.

Since we have been experimenting quite a bit with multi-core processors with our Pro product, the “Portenta,” we decided to build an Arduino board based on this new silicon.

We started from the Nano format with its own tiny footprint, leveraging some of the existing key features of other Nanos, like the versatile u-blox NINA WiFi and Bluetooth module. The goal is to enable people to develop connected products leveraging our hardware powered by Raspberry silicon, a solid radio module with exceptional performance, and the Arduino Create IoT Cloud.

The new board will come packed with some high-quality MEMS sensors from STM (namely a 9-axis IMU and a microphone), a very efficient power section, and a bunch of other innovations that you can already spot from the design. 

Whereas the majority of microcontrollers use embedded flash, the new RP2040 chip uses external flash. To provide plenty of space for all your code and storage we’ve included 16MB flash memory — this is also particularly useful to allow OTA (over-the-air) updates.

But there’s more! We are going to port the Arduino core to this new architecture in order to enable everyone to use the RP2040 chip with the Arduino ecosystem (IDE, command line tool, and thousands of libraries). Although the RP2040 chip is fresh from the plant, our team is already working on the porting effort… stay tuned.

While we consider what other products to develop to leverage the RP2040 architecture, we’d love to hear what you’d like us to build with this exciting new processor.

Join us in welcoming the new Raspberry Pi RP2040 and the newborn Arduino Nano RP2040 Connect, which will be available for pre-order in the next few weeks!

– Massimo Banzi (co-founder & chairman) and Fabio Violante (CEO)

Raspberry Pi robot prompts proper handwashing

via Raspberry Pi

Amol Deshmukh from the University of Glasgow got in touch with us about a social robot designed to influence young people’s handwashing behaviour, which the design team piloted in a rural school in Kerala, India.

In the pilot study, the hand-shaped Pepe robot motivated a 40% increase in the quality and levels of handwashing. It was designed by AMMACHI Labs and University of Glasgow researchers, with a Raspberry Pi serving as its brain and powering the screens that make up its mouth and eyes.

How does Pepe do it?

The robot is very easy to attach to the wall next to a handwashing station and automatically detects approaching people. Using AI software, it encourages, monitors, and gives verbal feedback to children on their handwashing, all in a fun and engaging way.

Amol thinks the success of the robot was due to its eye movements, as people change their behaviour when they know they are being observed. A screen displaying a graphical mouth also meant the robot could show it was happy when the children washed their hands correctly; positive feedback such as this promotes learning new skills.

Amol’s team started work on this idea last year, and they were keen to test the Pepe robot with a group of people who had never been exposed to social robots before. They presented their smiling hand-face hybrid creation at the IEEE International Conference on Robot & Human Interactive Communication (see photo below). And now that hand washing has become more important than ever due to coronavirus, the project is getting mainstream media attention as well.

Photo borrowed from the official conference gallery

What’s next?

The team is now planning to improve Pepe’s autonomous intelligence and scale up the intervention across more schools through the Embracing the World network.

Pepe had a promising trial run, as shown by these stats from the University of Glasgow’s story on the pilot study:

  • More than 90% of the students liked the robot and said they would like to see Pepe again after school vacation.
  • 67% of the respondents thought the robot was male, while 33% thought it was female, mostly citing the robot’s voice as the reason.
  • 60% said it was younger than them, feeling Pepe was like a younger brother or sister, while 33% thought it was older, and 7% perceived the robot to be the same age.
  • 72% of the students thought Pepe was alive, largely due to its ability to talk.

The post Raspberry Pi robot prompts proper handwashing appeared first on Raspberry Pi.

Raspberry Pi listening posts ‘hear’ the Borneo rainforest

via Raspberry Pi

These award-winning, solar-powered audio recorders, built on Raspberry Pi, have been installed in the Borneo rainforest so researchers can listen to the local ecosystem 24/7. The health of a forest ecosystem can often be gauged by how much noise it creates, as this signals how many species are around.

And you can listen to the rainforest too! The SAFE Acoustics website, funded by the World Wide Fund for Nature (WWF), streams audio from recorders placed around a region of the Bornean rainforest in Southeast Asia. Visitors can listen to live audio or skip back through the day’s recording, for example to listen to the dawn chorus.

Listen in on the Imperial College podcast

What’s inside?

We borrowed this image of the flux tower from Sarab Sethi’s site

The device records data in the field and uploads it to a central server continuously and robustly over long time periods. And it was built for around $305.

Here’s all the code for the platform, on GitHub.

The 12V-to-5V micro USB converter connects to the power socket of the Anker USB hub, which is connected to Raspberry Pi.

The Imperial College London team behind the project has provided really good step-by-step photo instructions for anyone interested in the fine details.

Here’s the full set up in the field. The Raspberry Pi-powered brains of the kit are safely inside the green box

The recorders have been installed by Imperial College London researchers as part of the SAFE Project – one of the largest ecological experiments in the world.

Screenshot of the SAFE Project website

Dr Sarab Sethi designed the audio recorders with Dr Lorenzo Picinali. They wanted to quantify the changes in rainforest soundscape as land use changes, for example when forests are logged. Sarab is currently working on algorithms to analyse the gathered data with Dr Nick Jones from the Department of Mathematics.

The lovely cross-disciplinary research team based at Imperial College London

Let the creators of the project tell you more on the Imperial College London website.

The post Raspberry Pi listening posts ‘hear’ the Borneo rainforest appeared first on Raspberry Pi.

Rotary encoders: Raise a Glitch Storm | Hackspace 34

via Raspberry Pi

A Glitch Storm is an explosive torrent of musical rhythms and sound, all generated from a single line of code. In theory, you can’t do this with a Raspberry Pi running Python – in this month’s new issue, out now, the HackSpace magazine team lovingly acquired a tutorial from The MagPi team to throw theory out the window and show you how.

What is a Glitch Storm?

A Glitch Storm is a user-influenceable version of bytebeat music. We love definitions like that here at the Bakery: something you have never heard of is simply a development of something else you have never heard of. Bytebeat music was at the heart of the old Commodore 64 demo scene, a competition to see who could produce the most impressive graphics and music in a very limited number of bytes. This was revived/rediscovered and christened by Viznut, aka Ville-Matias Heikkilä, in 2011. And then JC Ureña of the ‘spherical sound society’ converted the concept into the interactive Glitch Storm.

Figure 1: Schematic for the sound-generating circuit

So what is it?

Most random music generators work on the level of notes; that is, notes are chosen one at a time and then played, like our Fractal Music project in The MagPi #66. However, with bytebeat music, an algorithm generates the actual sample levels that make up the sound. This algorithm performs bitwise operations on a tick variable that increments with each sample. Depending on the algorithm used, this may or may not produce something musically interesting. Often, the samples produced exhibit a fractal structure, which is self-similar on many levels, thus providing both the notes and the structure.
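As a concrete illustration (not code from the tutorial itself), here is a minimal pure-Python sketch of the idea: a well-known bytebeat formula from the scene is evaluated for successive values of the tick, and masking to eight bits turns each result into a sample level. The 8 kHz rate is an assumption for the example.

```python
def bytebeat_sample(t):
    # A classic bytebeat-style formula: bitwise operations on the
    # incrementing tick t produce the sample level directly.
    return (t * (t >> 5 | t >> 8)) & 0xFF  # mask down to an 8-bit sample

# Generate one second of audio at an assumed 8 kHz sample rate
samples = [bytebeat_sample(t) for t in range(8000)]
```

Feeding these values to a D/A converter at a steady rate is all it takes to hear the result.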

Enter the ‘Glitch Storm’

With a Glitch Storm, three user-controlled variables – a, b, and c – can be added to this algorithm, allowing the results to be fine-tuned. In the ‘Algorithms’ box, you can see that the bytebeat algorithms simply run; they all repeat after a certain time, but this time can be long, in the order of hours for some. A Glitch Storm algorithm, on the other hand, contains variables that a user can change in real-time while the sample is playing. This is exactly what we can do with rotary encoders, without constantly interrupting the algorithm to poll their state.
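To see the difference, here is a hypothetical Glitch Storm-style variant of the sketch above (the formula is illustrative, not one of the project’s 24 algorithms): the same bitwise recipe, but with a, b, and c mixed in so that turning a knob reshapes the sound while the tick keeps running.

```python
def glitch_storm_sample(t, a, b, c):
    # Hypothetical Glitch Storm-style formula: user variables a, b, c
    # parameterise the shifts and an offset, so changing them in
    # real-time alters the sound without stopping the sample loop.
    return (t * (t >> a | t >> b) + c) & 0xFF

# The encoders simply update a, b, c between samples; the loop never pauses.
a, b, c = 5, 8, 0
out = [glitch_storm_sample(t, a, b, c) for t in range(1000)]
```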

Figure 2: Schematic for the control box

What hardware?

In order to produce music like this on the Raspberry Pi, we need some extra hardware to generate the sound samples, and also a bunch of rotary encoders to control things. The samples are produced using a 12-bit D/A converter connected to one of the SPI ports. The schematic of this is shown in Figure 1. The clock rate for the transfer of data to this can be controlled and provides a simple way of controlling, to some extent, the sample rate of the sound. Figure 2 shows the wiring diagram of the five rotary encoders we used.
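Because each sample goes out as one fixed-length SPI frame, the sample rate follows directly from the bus clock. A back-of-the-envelope sketch (the 16-bit frame length is an assumption, based on the two-byte transfer used for this kind of 12-bit converter):

```python
def sample_rate(spi_clock_hz, bits_per_frame=16):
    # Each sample occupies one SPI frame, so the achievable sample rate
    # is roughly the bus clock divided by the frame length in bits.
    return spi_clock_hz / bits_per_frame

print(sample_rate(1_000_000))  # 1 MHz SPI clock -> 62500.0 samples/s
```

So raising or lowering the SPI clock is a crude but effective pitch/speed control for the whole storm.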

Making the hardware

The hardware comes as two parts: the D/A converter and associated audio components. These are built on a board that hangs off Raspberry Pi’s GPIO pins. Also on this board is a socket that carries the wires to the control box. We used an IDC (insulation displacement connector) to connect between the board and the box, as we wanted the D/A connection wires to be as short as possible because they carry a high frequency signal. We used a pentagonal box just for fun, with a control in each corner, but the box shape is not important here.

Figure 3: Front physical layout of the interface board


The board is built on a 20-row by 24-hole piece of stripboard. Figure 3 and Figure 4 show the physical layout for the front and back of the board. The hole number 5 on row 4 is enlarged to 2.5mm and a new hole is drilled between rows 1 and 2 to accommodate the audio jack socket. A 40-way surface-mount socket connector is soldered to the back of the board, and a 20-way socket is soldered to the front. You could miss this out and wire the 20-way ribbon cable direct to the holes in these positions if you want to economise.

Figure 4: Rear physical layout of the interface board

Further construction notes

Note: as always, the physical layout diagram shows where the wires go, not necessarily the route they will take. Here, we don’t want wires crossing the 20-way connector, so the upper four wires use 30AWG Kynar wire to pop under the connector and out through a track hole, without soldering, on the other side. When putting the 20-way IDC pin connector on the ribbon cable, make sure the red end connector wire is connected to the pin next to the downward-pointing triangle on the pin connector. Figure 5 shows a photograph of the control box wiring.

Figure 5: Wiring of the control board

Testing the D/A

The live_byte_beat.py listing on GitHub is a minimal program for trying out a bytebeat algorithm. It will play until stopped by pressing CTRL+C. The variable v holds the value of the sample, which is then transferred to the D/A over SPI in two bytes. The format of these two bytes is shown in Figure 6, along with how we have to manipulate v to achieve an 8-bit or 12-bit sample output. Note that all algorithms were designed for an 8-bit sample size, and using 12 bits is a free bonus here: it does sound radically different, and not always in a good way.
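As a sketch of the two-byte transfer (the exact register bits depend on the converter – Figure 6 shows the real layout; the MCP4921-style packing below is an assumption for illustration), the sample is split into a config-plus-high-nibble byte and a low byte:

```python
def pack_sample(v):
    # Pack a 12-bit sample into the two SPI bytes expected by a typical
    # 12-bit DAC such as the MCP4921 (this chip choice is an assumption;
    # the article's Figure 6 shows the actual register layout).
    v &= 0x0FFF                # keep 12 bits of sample data
    hi = 0x30 | (v >> 8)       # config nibble: 1x gain, output active
    lo = v & 0xFF              # low 8 bits of the sample
    return [hi, lo]

def to_12bit(sample_8bit):
    # The algorithms were designed for 8-bit samples; shifting left by
    # four maps them onto the DAC's full 12-bit range.
    return sample_8bit << 4

# spi.xfer2(pack_sample(to_12bit(v)))  # how it would go out via spidev
```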

The main software

The main software for this project is on our GitHub page, and contains 24 Pythonised algorithms. The knobs control the user variables, as well as the sample rate and which algorithm to use. You can add extra algorithms, but if you are searching online for them, you will find they are written in C. There are two major differences to note when converting from C to Python. The first is the ternary operator, written with a question mark in C; the second is the modulus operator, which is a percent sign in both languages but behaves differently on negative numbers. See the notes that accompany the main code about these.
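For example, a C ternary like `v = t % 32 ? x : y;` becomes `v = x if t % 32 else y` in Python. The modulus difference is subtler: C truncates towards zero, while Python floors, so they disagree whenever the operands have opposite signs. A small helper (illustrative, not from the project code) emulates the C behaviour:

```python
def c_mod(a, b):
    # C's % truncates toward zero, so (-7) % 3 is -1 in C but 2 in
    # Python. Ported algorithms that rely on C semantics need this:
    return a - int(a / b) * b

print(c_mod(-7, 3))   # -1, matching C
print(-7 % 3)         #  2, Python's floored result
```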

Figure 6: How to program the registers in the D/A converter

Why does this work?

There are a few reasons why you would not expect this to work on a Raspberry Pi in Python. The most obvious is the operating system regularly interrupting the flow of output samples. Well, it turns out that this is not as bad as you might fear: the extra ‘noise’ it causes is at a low level and is masked by the glitchy nature of the sound. And although Python is an interpreted language, it is just about fast enough to give an adequate sample rate on a Raspberry Pi 4.

Make some noise

You can now explore the wide range of algorithms for generating a Glitch Storm and interact with the sound. On our GitHub page there’s a list of useful links allowing you to explore what others have done so far. For a sneak preview of the bytebeat type of sound, visit magpi.cc/bytebeatdemo; you can even add your own algorithms here. For interaction, however, there’s no substitute for having your own hardware. The best settings are often found by making small adjustments and listening to the long-term effects – some algorithms surprise you about a minute or two into a sequence by changing dramatically.

Get HackSpace magazine issue 34 — out today

HackSpace magazine issue 34: on sale now!

HackSpace magazine is out now, available in print from the Raspberry Pi Press online store, your local newsagents, and the Raspberry Pi Store, Cambridge.

You can also download the PDF directly from the HackSpace magazine website.

Subscribe to HackSpace for 12 months to get a free Adafruit Circuit Playground, or choose from one of our other subscription offers, including this amazing limited-time offer of three issues and a book for only £10!

If you liked this project, it was first featured in The MagPi Magazine. Download the latest issue for free or subscribe here.

The post Rotary encoders: Raise a Glitch Storm | Hackspace 34 appeared first on Raspberry Pi.

International Space Station Tracker | The MagPi 96

via Raspberry Pi

Fancy tracking the ISS’s trajectory? All you need is a Raspberry Pi, an e-paper display, an enclosure, and a little Python code. Nicola King looks to the skies

The e-paper display mid-refresh. It takes about three seconds to refresh, but it’s fast enough for this kind of project

Standing on his balcony one sunny evening, the perfect conditions enabled California-based astronomy enthusiast Sridhar Rajagopal to spot the International Space Station speeding by, and the seeds of an idea were duly sown. Having worked on several projects using tri-colour e-paper (aka e-ink) displays, which he likes for their “aesthetics and low-to-no-power consumption”, he thought that developing a way of tracking the ISS using such a display would be a perfect project to undertake.

“After a bit of searching, I was able to find an open API to get the ISS location at any given point in time,” explains Sridhar. “I also knew I wouldn’t have to worry about the data changing several times per second or even per minute. Even though the ISS is wicked fast (16 orbits in a day!), this would still be well within the refresh capabilities of the e-paper display.”

The ISS location data is obtained using the Open Notify API – visit magpi.cc/isslocation to see its current position

Station location

His ISS Tracker works by obtaining the ISS location from the Open Notify API every 30 seconds. It appends this data point to a list, so older data is available. “I don’t currently log the data to file, but it would be very easy to add this functionality,” says Sridhar. “Once I have appended the data to the list, I call the drawISS method of my Display class with the positions array, to render the world map and ISS trajectory and current location. The world map gets rendered to one PIL image, and the ISS location and trajectory get rendered to another PIL image.”
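A rough sketch of that polling loop (not Sridhar’s actual code – the function names and loop wrapper here are illustrative) looks like this; Open Notify returns the latitude and longitude as strings, so they need converting to floats before plotting:

```python
import json
import time
import urllib.request

ISS_URL = "http://api.open-notify.org/iss-now.json"

def parse_position(payload):
    # Open Notify returns coordinates as strings; convert to floats.
    pos = payload["iss_position"]
    return float(pos["latitude"]), float(pos["longitude"])

def track(poll_seconds=30, samples=3):
    # Append each fix to a list so older data points remain available
    # for drawing the trajectory, as described above.
    positions = []
    for _ in range(samples):
        with urllib.request.urlopen(ISS_URL) as resp:
            positions.append(parse_position(json.load(resp)))
        time.sleep(poll_seconds)
    return positions
```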

The project code is written in Python and can be found on Sridhar’s GitHub page: magpi.cc/isstrackercode

Each latitude/longitude position is mapped to the corresponding XY co-ordinate. The last position in the array (the latest position) gets rendered as the ISS icon to show its current position. “Every 30th data point gets rendered as a rectangle, and every other data point gets rendered as a tiny circle,” adds Sridhar.
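The mapping itself is a straightforward equirectangular projection; this sketch is illustrative (the 264×176 panel size is an assumption, not necessarily the display Sridhar used):

```python
def latlon_to_xy(lat, lon, width, height):
    # Equirectangular mapping: longitude spreads linearly across the
    # width, latitude down the height, so it lines up with any world-map
    # image drawn in the same projection.
    x = int((lon + 180.0) / 360.0 * width)
    y = int((90.0 - lat) / 180.0 * height)
    return x, y

print(latlon_to_xy(0, 0, 264, 176))  # (132, 88): map centre on a 264x176 panel
```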

From there, the images are then simply passed into the e-paper library’s display method; one image is rendered in black, and the other image in red.

Track… star

Little wonder that the response received from friends, family, and the wider maker community has been extremely positive, as Sridhar shares: “The first feedback was from my non-techie wife who love-love-loved the idea of displaying the ISS location and trajectory on the e-paper display. She gave valuable input on the aesthetics of the data visualisation.”

Software engineer turned hardware-hacking enthusiast and entrepreneur, Sridhar Rajagopal is the founder of Upbeat Labs and creator of ProtoStax – a maker-friendly stackable, modular, and extensible enclosure system.

In addition, he tells us that other makers have contributed suggestions for improvements. “JP, a Hackster community user […] added information to make the Python code a service and have it launch on bootup. I had him contribute his changes to my GitHub repository – I was thrilled about the community involvement!”

Housed in a versatile, transparent ProtoStax enclosure designed by Sridhar, the end result is an elegant way of showing the current position and trajectory of the ISS as it hurtles around the Earth at 7.6 km/s. Why not have a go at making your own display so you know when to look out for the space station whizzing across the night sky? It really is an awesome sight.

Get The MagPi magazine issue 96 — out today

The MagPi magazine is out now, available in print from the Raspberry Pi Press online store, your local newsagents, and the Raspberry Pi Store, Cambridge.

You can also download the PDF directly from The MagPi magazine website.

Subscribe to The MagPi for 12 months to get a free Adafruit Circuit Playground, or choose from one of our other subscription offers, including this amazing limited-time offer of three issues and a book for only £10!

The post International Space Station Tracker | The MagPi 96 appeared first on Raspberry Pi.