Creating an online robot fighting game using Arduino MKR1000 WiFi

via Arduino Blog

This is a guest post from Surrogate, a team of developers building games that people play in real-life over the internet.

We introduced this concept last year, and have launched three games so far. Our final game of 2019 was SumoBots Battle Royale, where players from anywhere in the world can fight real robots in a battle royale-style arena. The aim of the project was to have the game run semi-autonomously, meaning that the bots could reset themselves between games and the arena could run by itself with no human interaction. This was our most complex project to date, and we wanted to share some parts of the build process in more detail: specifically, how we built these robots and hooked them online for people to control remotely.

Robot selection

We’ve started our process by choosing which robots we’d want to use for the game. There were a couple of requirements for the robots when making the evaluation:

  • Able to withstand 24/7 collisions
  • Easy to modify and repair
  • Can rotate on the spot
  • Has enough space to fit the electronics

After looking at a lot of different consumer robots, maker projects, and competitive fighting bots, we decided to use the JSUMO BB1 robots for this game. We liked that these bots have a metal casing, which makes them very durable, that all parts are easily replaceable and can be bought separately, and that they have four independent motors (motor shields included), one for each wheel, which allows them to rotate on the spot.

We were pretty skeptical about being able to fit all the electronics into the original casing, but we decided to go with this robot anyway, as it had the best overall characteristics. Since the robot is easily modifiable, we could always 3D print an extra casing to fit all the parts.

Which board?

Now that we had decided on the robot, it was time to define which electronics we should use in this build. As usual, it all starts with the requirements. Here’s what we need for the game to run smoothly:

  • Can recover from any position
  • Can stay online while charging
  • Supports WiFi network connection and offers reliable connectivity
  • Easily programmable and supports OTA updates
  • Can control four motors simultaneously

Based on these requirements we had the following electronics layout in mind:

We had to find a board that is energy efficient, can send commands to the motors, supports charging while in use, and takes up little space on the robot. With so many requirements, finding the perfect board can be a challenge.

Arduino to the rescue

Fortunately, Arduino was there to help us out. They offer a rich selection of boards to fit every possible robotics project out there and have very detailed documentation for each of the boards. 

More importantly, Arduino is known for its high quality, something that is crucial for semi-autonomous applications like this. Coming from an embedded software background and having worked with all sorts of hardware, we often see features or board functionalities that are not fully finished, which can lead to all sorts of unpleasant situations.

After looking at Arduino’s collection of boards, we quickly found the perfect candidate for our project: the Arduino MKR1000 WiFi. This board fits all of our main requirements for the motor controls, is easily programmable via the Arduino IDE, and, thanks to its low-power design, is extremely power efficient, allowing us to use a lower-capacity battery. Additionally, it has a separate WiFi chip onboard that focuses solely on providing a reliable WiFi connection, something that is very important in our use case.

Now that we’ve decided on the “brain” of our robot, it was time to choose the rest of the components.

Robust hardware means working software

Something to keep in mind when working with hardware is that you should always try to avoid any possible risks. This means you should exceed your minimum hardware requirements where possible. The reason is that if your hardware doesn’t work as intended, your whole software stack becomes unusable too. Always choose reliable hardware components for mission-critical applications.

Some of our electrical components might look like overkill, but due to the nature of our projects, they are a critical requirement.

Avoiding battery explosions

As the game involves a lot of robot collisions, we decided to go with a battery solution built to a high safety standard. After evaluating multiple options on the market, we chose the RRC2040 from RRC (Germany). Its 2950 mAh capacity allows us to run the robots for up to five hours on a single charge. It has internal circuitry for power management and protection, supports SMBus communication (very similar to I2C), and is certified to all of the relevant consumer electronics battery standards. For charging, we used RRC’s charging solution designed specifically for this battery, which can also feed power to the application while the battery is being charged.
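Since the pack talks SMBus, reading its telemetry from the Arduino only takes the Wire library. Below is a minimal sketch of the idea, assuming a Smart Battery-compliant pack at the standard 0x0B address; the exact command codes and scaling should be checked against the RRC2040 datasheet.

// Rough sketch: reading battery telemetry over SMBus with the Wire library.
// Assumes a Smart Battery-compliant pack at the standard 0x0B address;
// verify command codes and scaling against the RRC2040 datasheet.
#include <Wire.h>

const uint8_t BATTERY_ADDR = 0x0B;   // standard Smart Battery slave address
const uint8_t CMD_VOLTAGE  = 0x09;   // Voltage(), in mV
const uint8_t CMD_SOC      = 0x0D;   // RelativeStateOfCharge(), in %

// Read a 16-bit little-endian word with an SMBus "read word" transaction
uint16_t readWord(uint8_t command) {
  Wire.beginTransmission(BATTERY_ADDR);
  Wire.write(command);
  Wire.endTransmission(false);              // repeated start, keep the bus
  Wire.requestFrom(BATTERY_ADDR, (uint8_t)2);
  uint16_t low  = Wire.read();
  uint16_t high = Wire.read();
  return (high << 8) | low;
}

void setup() {
  Serial.begin(9600);
  Wire.begin();
}

void loop() {
  Serial.print("Voltage (mV): ");
  Serial.println(readWord(CMD_VOLTAGE));
  Serial.print("Charge (%): ");
  Serial.println(readWord(CMD_SOC));
  delay(5000);
}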

Note: the Arduino MKR1000 has a pretty neat charging solution on the board itself. You can connect a battery directly to the board as the main power source and charge it through the MKR1000’s micro USB port. We really wanted to use it to save space and have a more robust design, but due to the large capacity of our battery, we couldn’t use it to its full potential. In our future projects with smaller-scale robots, we definitely plan to use the board’s internal charging system, as it works perfectly for 700-1800 mAh power packs.

Bot recovery

For the bot to recover from landing on its head, we implemented a flipping servo. We didn’t want to risk having too little torque, so we went with the DS3218, which is rated to lift up to 20 kg. Here’s how it works:
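In firmware terms, the flip is just a short Servo routine. Here is a minimal sketch of the idea; the pin number, angles, and timing are placeholders rather than the values used in the actual robots.

// Minimal flipping-servo sketch. In the real robot the trigger comes from
// the control PC over WiFi; pin, angles, and delay here are placeholders.
#include <Servo.h>

const int FLIP_SERVO_PIN = 2;   // hypothetical PWM-capable pin on the MKR1000
Servo flipServo;

void setup() {
  flipServo.attach(FLIP_SERVO_PIN);
  flipServo.write(0);           // arm tucked away
}

// Swing the arm out and back to push the robot upright
void flip() {
  flipServo.write(180);
  delay(700);                   // give the arm time to complete the swing
  flipServo.write(0);
}

void loop() {
  // flip() would be called when a "recover" command arrives
}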

Hooking everything together

Now that we had decided on all of the crucial elements of this setup, it was time to connect everything together. As a first step, we figured out the best way to position all the pieces within the bot. We then 3D-printed a casing to protect the electronics. With the preliminary steps completed, we wired all of the components together and mounted them inside the casing. Here’s how it looks:

It was really convenient that all the pins on the board could be connected just by plugging them in. This saved a lot of time that would otherwise have been spent soldering cables for 12 robots and, more importantly, eliminated the risk of bad solder joints, which usually can’t be easily identified.

Arduino = Quick code

The Arduino MKR1000 offered us the connectivity we needed for the project. Each sumo robot hosts its own UDP server, built on the MKR1000 WiFi libraries, to receive control commands from a central control PC and to broadcast its battery charge status. The user commands are translated into three PWM signals with the Arduino Servo library: one for the flipping servo and one each for the left and right side motor controllers. The board’s hardware PWM output support was useful here. Overall, we managed to keep the whole Arduino sketch to a few hundred lines of code thanks to the Servo and WiFi libraries.
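To give an idea of how little code this takes, here is a stripped-down sketch along those lines. The three-byte packet format, pin numbers, and network credentials are illustrative assumptions, not our actual protocol.

// Sketch of the control loop: receive UDP commands, drive three PWM outputs.
// Packet format ([left, right, flip], one byte each) and pins are made up.
#include <WiFi101.h>
#include <WiFiUdp.h>
#include <Servo.h>

const char SSID[] = "arena-network";     // placeholder credentials
const char PASS[] = "secret";
const unsigned int LOCAL_PORT = 2390;

WiFiUDP udp;
Servo leftMotors, rightMotors, flipServo;

void setup() {
  leftMotors.attach(2);                  // left-side motor controller
  rightMotors.attach(3);                 // right-side motor controller
  flipServo.attach(4);                   // flipping servo
  while (WiFi.begin(SSID, PASS) != WL_CONNECTED) {
    delay(1000);                         // retry until the arena WiFi is up
  }
  udp.begin(LOCAL_PORT);                 // listen for control packets
}

void loop() {
  if (udp.parsePacket() >= 3) {
    uint8_t cmd[3];
    udp.read(cmd, 3);                    // [left, right, flip], 0..180 each
    leftMotors.write(cmd[0]);
    rightMotors.write(cmd[1]);
    flipServo.write(cmd[2]);
  }
}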

The out-of-the-box ArduinoOTA support for updating code over WiFi came in handy during the development phase, and also whenever we update the firmware on multiple robots at the same time. No need to open the covers and attach a USB cable! We created a simple Bash script around the OTA update tool bundled with the Arduino IDE to send firmware updates to every robot at once.
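On the sketch side, enabling OTA only takes a couple of calls. The snippet below assumes the WiFi101OTA library that targets MKR boards, with placeholder credentials and device name.

// OTA hooks on the sketch side, assuming the WiFi101OTA library for MKR
// boards; SSID, password, device name, and OTA password are placeholders.
#include <SPI.h>
#include <WiFi101.h>
#include <WiFi101OTA.h>

const char SSID[] = "arena-network";
const char PASS[] = "secret";

void setup() {
  while (WiFi.begin(SSID, PASS) != WL_CONNECTED) {
    delay(1000);
  }
  // Start listening for OTA uploads, storing the new image in internal flash
  ArduinoOTA.begin(WiFi.localIP(), "sumobot-01", "ota-password", InternalStorage);
}

void loop() {
  ArduinoOTA.poll();   // apply a new firmware image if one is being sent
  // ... normal game control loop ...
}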

To summarize

It’s pretty amazing that we live in an age where you can use a mass-market, small form factor board like the Arduino MKR1000 and get so much functionality. We’ve had a great experience developing our SumoBots Battle Royale game using the board. It made the whole process smooth and streamlined, the documentation was right on point, and we never hit a bottleneck where the hardware didn’t work as expected.

More importantly, the boards have proven to be very robust over time. These SumoBots have been used for more than 3,000 games already, and we haven’t seen a single failure from the MKR1000. For a game where you literally slam robots into each other at high speed, that’s pretty impressive, to say the least.

We look forward to working with Arduino on our future games, and we can’t wait to see what they will be announcing in 2020!

The Lightwaves is a participatory audio-visual installation

via Arduino Blog

Music and synchronized lighting can be a beautiful combination, as evidenced by panGenerator’s recent installation commissioned by the Męskie Granie concert tour in Poland.

The interactive sculpture comprised 15 drums that trigger waves of light traveling toward a huge helium-filled sphere floating above the area, appearing to charge it with sound and light energy as the instruments are played.

“The audience was invited to drum collectively and together create an audio-visual spectacle – intensity of which depended on the speed and intensity of the drumming. That fulfilled the main goal of creating interactive art experience in which the audience can actively participate in the event rather than just passively enjoy the music, gathering and playing together.”

The project incorporated 200 meters of addressable RGB LEDs and measured roughly 300 square meters, making it likely the biggest build of its kind seen there. According to the designers, each of the drums featured a custom PCB equipped with an Arduino Nano and a microphone, and the drums communicated over an MCP2515-based CAN bus.
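panGenerator hasn’t published the firmware, but a drum node along those lines could look roughly like the sketch below, assuming an analog microphone module and the widely used mcp_can library for the MCP2515; the pins, threshold, and CAN ID are illustrative guesses, not the installation’s actual code.

// Hypothetical drum node: detect a hit on the analog microphone and announce
// it on the CAN bus via an MCP2515 (mcp_can library). All values are guesses.
#include <SPI.h>
#include <mcp_can.h>

const int MIC_PIN = A0;
const int CAN_CS_PIN = 10;
const int HIT_THRESHOLD = 600;            // tune to the microphone and drum
const unsigned long HIT_CAN_ID = 0x101;   // one ID per drum

MCP_CAN can(CAN_CS_PIN);

void setup() {
  while (can.begin(MCP_ANY, CAN_500KBPS, MCP_8MHZ) != CAN_OK) {
    delay(100);                           // wait for the controller to come up
  }
  can.setMode(MCP_NORMAL);
}

void loop() {
  int level = analogRead(MIC_PIN);
  if (level > HIT_THRESHOLD) {
    byte payload[2] = { highByte(level), lowByte(level) };
    can.sendMsgBuf(HIT_CAN_ID, 0, 2, payload);  // broadcast the hit intensity
    delay(100);                           // crude debounce between hits
  }
}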

All of this was assembled and taken down seven times over two months in cities around the country. Be sure to check out this dazzling display in action in the video below! 

How to deal with API clients, the lazy way — from code generation to release management

via Arduino Blog

This post is from Massimiliano Pippi, Senior Software Engineer at Arduino.

The Arduino IoT Cloud platform aims to make it very simple for anyone to develop and manage IoT applications, and its REST API plays a key role in this search for simplicity. The IoT Cloud API at its core consists of a set of endpoints exposed by a backend service, but this alone is not enough to provide a full-fledged product to your users. On top of your API service, you need:

  • Good documentation explaining how to use the service.
  • A number of plug-and-play API clients that can be used to abstract the API from different programming languages.

Both of those features are difficult to maintain because they get outdated pretty easily as your API evolves, but clients are particularly challenging: they’re written in different programming languages, and for each of those you should provide idiomatic code that works and is distributed according to the best practices defined by each language’s ecosystem.

Depending on how many languages you want to support, your engineering team might not have the resources needed to cover them all, and borrowing engineers from other teams just to release a specific client doesn’t scale well.

Being in this exact situation, the IoT Cloud team at Arduino had no choice but to streamline the entire process and automate as much of it as we could. This article describes how we provide documentation and clients for the IoT Cloud API.

Clients generation workflow

When the API changes, a number of steps must be taken to ship an updated version of the clients, as summarized in the following drawing.

As you can see, what happens after an engineer releases an updated version of the API essentially boils down to the following macro steps:

1. Fresh code is generated for each supported client.
2. A new version of the client is released to the public.

The generation process

Part 1: API definition

Every endpoint provided by the IoT Cloud API is listed within a Yaml file in OpenAPI v3 format, something like this (the full API spec is here):

/v2/things/{id}/sketch:
    delete:
      operationId: things_v2#deleteSketch
      parameters:
      - description: The id of the thing
        in: path
        name: id
        required: true
        schema:
          type: string
      responses:
        "200":
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/ArduinoThing'
          description: OK
        "401":
          description: Unauthorized
        "404":
          description: Not Found

The format is designed to be human-readable, which is great because we start from a version automatically generated by our backend software and manually fine-tune it to get better results from the generation process. At this stage, you might need some help from the language experts on your team to perform some trial and error and determine how good the generated code is. Once you’ve found a configuration that works, operating the generator doesn’t require any specific skill, which is why we were able to automate it.

Part 2: Code generation

To generate the API clients in the different programming languages we support, along with the API documentation, we use a CLI tool called openapi-generator. The generator parses the OpenAPI definition file and produces a number of source code modules in a folder of your choice on the filesystem. If you have more than one client to generate, you will notice very soon how cumbersome the process can get: you might need to invoke openapi-generator multiple times, with different parameters, targeting different places in the filesystem, maybe different git repositories; when the generation step is done, you have to go through all the generated code, add it to version control, maybe tag it, push to a remote… You get the gist.

To streamline the process described above we use another CLI tool, called Apigentools, which wraps the execution of openapi-generator according to a configuration you can keep under version control. Once Apigentools is configured, it takes zero knowledge of the toolchain to generate the clients – literally anybody can do it, including an automated pipeline on a CI system.

Part 3: Automation

Whenever the API changes, the OpenAPI definition file hosted in a GitHub repository is updated accordingly, usually by one of the team’s backend engineers. A pull request is opened, reviewed, and finally merged into the master branch. When the team is ready to generate a new version of the clients, we push a special git tag in semver format, and a GitHub workflow immediately starts running Apigentools, using a configuration stored in the same repository. If you look at the main configuration file, you might notice that for each language we want to generate clients for, there’s a parameter called ‘github_repo_name’: this is a killer feature of Apigentools that lets us push the automation further than originally planned. Apigentools can output the generated code to a local git repository, adding the changes in a new branch that’s automatically created and pushed to a remote on GitHub.

The release process

To ease the release process and to better organize the code, each API client has its own repo: you’ll find Python code in https://github.com/arduino/iot-client-py, Go code in https://github.com/arduino/iot-client-go and so on and so forth. Once Apigentools finishes its run, you end up with new branches containing the latest updates pushed to each one of the clients’ repositories on GitHub. As the branch is pushed, another GitHub workflow starts (see the one from the Python client as an example) and opens a Pull Request, asking to merge the changes on the master branch. The maintainers of each client receive a Slack notification and are asked to review those Pull Requests – from now on, the process is mostly manual.

It doesn’t make much sense to automate further, mainly for two reasons:

  1. Each client has its own release mechanism: Python has to be packaged as a wheel and pushed to PyPI, JavaScript has to be pushed to npm, for Go a tag is enough, and docs have to be made publicly accessible.
  2. We want to be sure a human validates the code before it’s generally available through an official release.

Conclusions

We’ve been generating API clients for the IoT Cloud API like this for a few months, performing multiple releases for each supported programming language and we now have a good idea of the pros and cons of this approach.

On the bright side: 

  • The process is straightforward, easy to read, easy to understand.
  • The system requires very little knowledge to be operated.
  • The time between a change in the OpenAPI spec and a client release is within minutes.
  • One engineer spent two weeks setting up the system, and the feeling is that we’re close to paying off that investment, if we haven’t already.

On the not-so-bright side: 

  • While operating the system is trivial, debugging the pipeline when something goes awry requires deep knowledge of the tools described in this article.
  • If you stumble upon a weird bug in openapi-generator and the bug doesn’t get attention, contributing patches upstream can be extremely difficult because the codebase is complex.

Overall we’re happy with the results and we’ll keep building up features on top of the workflow described here. A big shoutout to the folks behind openapi-generator and Apigentools!

Build a Nano-based binary Nixie clock with 18 IN-2 tubes

via Arduino Blog

Nixie tubes are, of course, an elegant display method from a more civilized age, but actually powering and controlling them can be a challenge. That can make for a great project and learning opportunity, but if you’d rather just skip ahead to programming these amazing lights, then Marcin Saj’s IN-2 binary Nixie clock is definitely worth a look.

This retro-style unit features a 6 x 3 array of small IN-2 tubes, each showing a “1” or “0” depending on the time. Reading the result takes a bit of binary math, but it’s good practice for those who would like to improve their skills.
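For instance, assuming one six-tube column each for hours, minutes, and seconds (the actual column layout on the clock may differ), 10:37:45 would read 001010, 100101, and 101101. A tiny sketch that prints the same conversion:

// Prints each time component as a 6-bit column, most significant bit first.
// Example output for 10:37:45 -> 001010 / 100101 / 101101.
void printColumn(const char *label, int value) {
  Serial.print(label);
  for (int bit = 5; bit >= 0; bit--) {
    Serial.print((value >> bit) & 1);
  }
  Serial.println();
}

void setup() {
  Serial.begin(9600);
  printColumn("hours   ", 10);
  printColumn("minutes ", 37);
  printColumn("seconds ", 45);
}

void loop() {}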

The clock is available for purchase and can be driven by a classic Nano, a Nano Every, or a Nano 33 IoT; the last of these lets you connect to an NTP server or the cloud over WiFi.

A wireless monitoring solution for solar power systems in remote locations

via Arduino Blog

Researchers in Thailand have developed a ZigBee-based wireless monitoring solution for off-grid PV installations that tracks the sun across the sky, tilting the panel hourly. The elevation of the setup is adjusted manually once per month for optimum energy collection. The prototype is controlled by a local Arduino Uno board, along with an H-bridge driver to actuate the tilt motor and a 12V battery that’s charged entirely by solar power.

The system features a half-dozen sensors measuring battery terminal voltage, solar voltage, solar current, current into the DC-DC converter, the temperature of the DC-DC converter’s power transistor, and the tilt angle of the solar panel, derived from the voltage across a potentiometer.

Data is transmitted wirelessly via an XBee ZNet 2.5 module to a remote Uno with an XBee shield. The real-time information is then passed on to and analyzed by a computer, which is also used to set the system’s time.
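The paper doesn’t include source code, but a sensing node of this kind can be sketched in a few lines: sample the analog channels and write a CSV line to the serial-connected XBee, which handles the radio link transparently. The pin mapping and scaling below are placeholders, not the authors’ values.

// Illustrative sensing-node sketch for an Uno with an XBee on hardware serial.
// Pins and scaling factors are placeholders; real values depend on the
// voltage dividers and current sensors used in the paper's prototype.
const int BATTERY_V_PIN = A0;
const int SOLAR_V_PIN   = A1;
const int SOLAR_I_PIN   = A2;
const int TILT_POT_PIN  = A3;

float toVolts(int raw) {
  return raw * (5.0 / 1023.0);   // Uno ADC reference, before any divider math
}

void setup() {
  Serial.begin(9600);            // XBee in transparent mode on hardware serial
}

void loop() {
  Serial.print(toVolts(analogRead(BATTERY_V_PIN)));
  Serial.print(',');
  Serial.print(toVolts(analogRead(SOLAR_V_PIN)));
  Serial.print(',');
  Serial.print(toVolts(analogRead(SOLAR_I_PIN)));
  Serial.print(',');
  Serial.println(analogRead(TILT_POT_PIN));  // raw potentiometer reading
  delay(1000);
}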

More details on the project can be found in the team’s paper.

Wireless sensing is an excellent approach for a remotely operated solar power system. Beyond collecting sensor data such as voltage, current, and temperature, the system can also properly control Sun tracking while reporting real-time data from the controller. To absorb the maximum energy, the solar cells need to track the Sun at the proper angles. An Arduino, an H-bridge motor driver circuit, and a DC motor are used to alter the tilt angle of the solar photovoltaic (PV) panel to follow the Sun, while the azimuth and elevation angles are fixed at noon. Unlike the traditional approach, the tilt rotation is stepped hourly: the panel is tilted to the west slightly ahead of the current time so that it produces more output voltage over the following hour. As a result, the system is simple while providing good Sun-tracking results and efficient power output.

This SpaceX fan created a levitating Starship lamp

via Arduino Blog

Although you might not be able to build or house your own SpaceX Starship, YouTuber “Embrace Racing” has created a levitating lamp model that will be much more attainable for non-multi-billionaires. 

The lamp’s landing pad features an Arduino Nano inside, which is used with WS2812 LEDs to simulate the smoke plume of the rocket through a 3D-printed “clear” PLA diffuser.
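The exact animation isn’t documented, but a plume effect of this kind can be approximated with a simple flicker loop. The sketch below assumes the Adafruit_NeoPixel library, with placeholder pin, LED count, and colors.

// A guess at the kind of flicker animation used for the plume, written with
// the Adafruit_NeoPixel library; pin, LED count, and colors are placeholders.
#include <Adafruit_NeoPixel.h>

const int LED_PIN = 6;
const int NUM_LEDS = 24;

Adafruit_NeoPixel plume(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  plume.begin();
  plume.show();                        // start with all pixels off
}

void loop() {
  // Random warm-white flicker to suggest billowing exhaust under the diffuser
  for (int i = 0; i < NUM_LEDS; i++) {
    int level = random(80, 255);
    plume.setPixelColor(i, plume.Color(level, level * 0.8, level * 0.5));
  }
  plume.show();
  delay(random(30, 90));
}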

The base also contains a levitation module capable of supporting up to 400 g to suspend the spacecraft in midair. While the rocket’s height would tend to make it unstable, the onboard levitation magnet lowers its center of gravity, as do the battery and three LEDs that light the rocket from the bottom.

Print files and other project info are available on Thingiverse.