Tag Archives: Video

Improve your programming skills with an oscilloscope

via Arduino Blog


Starting a new project is always a fun yet effective way to hone your skills while exploring circuitry and programming. To help improve his engineering chops, Joop Brokking recently bought an inexpensive oscilloscope (a device for visualizing voltage over time in an x-y graph) and connected it to an Arduino Uno. He then shared his findings in a detailed tutorial on YouTube.

In the video below, Brokking uses a Hantek 6022BE 20MHz dual-channel oscilloscope and walks through three examples that show what can go wrong when building a simple Arduino setup.

Shining Back liveset blows your mind with light and sound

via Arduino Blog


Last year, we featured an awesome audiovisual project from ANGLE that applied videomapping techniques to their livesets. Now, the Florence-based duo is back with their latest A/V system, “Shining Back,” which was designed in collaboration with JoinT Studio’s Stefano Bonifazi.

Essentially, it’s a grid structure consisting of LED lights that pulse in a geometric matrix to the duo’s live rhythms. The installation runs on an Arduino Uno and uses Mad Mapper and Modul8 software.

The immersive atmosphere created by the music is emphasized by new research in the visual realm. Taking the architectural form of a kaleidoscope, the lighting visually weaves and refracts the music into a surreal yet symbiotic form.

Teach your drone what is up and down with an Arduino

via Arduino Blog

Gyroscopes and accelerometers are the primary sensors at the heart of an IMU, or inertial measurement unit: an electronic sensor device that measures the orientation, gravitational forces and velocity of a multicopter, and helps you keep it in the air using an Arduino.

Two videos made by Joop Brokking, a maker with a passion for RC model ’copters, clearly explain how to program your own IMU so that it can be used for self-balancing your drone without Kalman filters, libraries, or complex calculations.

Auto-leveling a multicopter is pretty challenging. It means that when you release the pitch and roll controls on your transmitter, the multicopter levels itself. To get this to work, the flight controller of the multicopter needs to know exactly which way is down, like a spirit level sitting on top of the multicopter for the pitch and roll axes.

Very often people ask me how to make an auto level feature for their multicopter. The answer to a question like this is pretty involved and cannot be explained in one email. And that is why I made this video series.
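Brokking’s own code is linked below, but the core idea he teaches — fusing a fast-but-drifting gyro with a noisy-but-stable accelerometer using a complementary filter instead of a Kalman filter — can be sketched in a few lines. This Python snippet is purely illustrative and not taken from the project:

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle in degrees derived from raw accelerometer components."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Trust the integrated gyro rate short-term, the accelerometer long-term."""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulate 2 seconds at 250 Hz: the craft is level (accelerometer reads 0
# degrees), but the gyro has a constant 0.5 deg/s bias. The filter keeps
# the resulting drift bounded instead of letting it grow forever.
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.5,
                                 accel_angle=accel_pitch(0.0, 0.0, 1.0),
                                 dt=0.004)
```

With pure gyro integration the angle would have drifted a full degree over those two seconds; the accelerometer term pins it close to level instead.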

You can find the bill of materials and code here.

Holopainting with Raspberry Pi

via Raspberry Pi

We’ve covered 2D light-painting here before. This project takes things a step further: meet 3D holopainting.


This project’s an unholy mixture of stop-motion, light-painting and hyperlapse from FilmSpektakel, a time-lapse and film production company in Vienna. It was made as part of a university graduation project. (With Raspberry Pis and Raspberry Pi camera boards, natch.)

Getting this footage out was a very labour-intensive process – but the results are stupendous. The subject was filmed by a ring of 24 networked Raspberry Pi cameras working like a 3D scanner, taking pictures around the ring with a delay of 83 milliseconds between each one so that movement could be captured.
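The post doesn’t say exactly how the ring was triggered over the network, but the timing arithmetic itself is simple: staggering 24 cameras by 83 milliseconds spreads one full sweep of the ring across roughly two seconds. A quick sketch with a hypothetical helper (not the project’s code):

```python
def trigger_offsets(num_cameras=24, delay_ms=83):
    """Trigger time of each camera, in ms after the first one fires."""
    return [i * delay_ms for i in range(num_cameras)]

offsets = trigger_offsets()
sweep_ms = offsets[-1] + 83  # time for one full sweep of the ring
```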

Holopainting rig


They then cut out all of the resulting images – told you it was labour-intensive – and put them on a black background, then fed that data into a commercial light-painting stick. (If you don’t want to fork out a ton of cash for your own light-painting stick, there are instructions on building one with a Raspberry Pi over at Adafruit.)
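The commercial stick’s data format isn’t specified, but the general recipe for any light-painting stick is the same: each pixel column of the cut-out image becomes one frame of LED colors, displayed in sequence as the stick is walked past the camera. A hypothetical sketch of that conversion:

```python
def image_to_stick_frames(pixels, stick_leds):
    """Turn a row-major RGB image (list of rows of (r, g, b) tuples) into
    per-column frames for a vertical LED stick: frame i shows column i."""
    height = len(pixels)
    frames = []
    for x in range(len(pixels[0])):
        # Resample the column's rows onto the stick's LED count
        frames.append([pixels[(y * height) // stick_leds][x]
                       for y in range(stick_leds)])
    return frames

# Tiny 2x3 test image: left column red, middle green, right blue
img = [[(255, 0, 0), (0, 255, 0), (0, 0, 255)],
       [(255, 0, 0), (0, 255, 0), (0, 0, 255)]]
frames = image_to_stick_frames(img, stick_leds=4)
```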

A man dressed as a budget ninja walked the stick in front of a series of cameras set up where the original Raspberry Pi cameras had been, to create 3D images hanging in the air.


Presto: a holopainting – and the results are tremendous. Here’s a making-of video.

The Invention of #HoloPainting

Holopainting is a combination of the Light Painting, Stop Motion and Hyperlapse techniques to create three-dimensional light paintings. We didn’t want to use computer-generated images, so we built a giant 3D scanner out of 24 Raspberry Pis with their webcams. These cameras took photos of the person in the middle from 24 different perspectives, with a delay of 83 milliseconds, so the movement of the person was also recorded.

There’s a comment that often pops up when we describe a project like this: why bother? We’ll head that off right now: because you can. Because nobody’s done it before. Because the end results look phenomenal. We love it, and we’d love to see more projects like this!

The post Holopainting with Raspberry Pi appeared first on Raspberry Pi.

Machine learning for the maker community

via Arduino Blog


At Arduino Day, I talked about a project I and my collaborators have been working on to bring machine learning to the maker community. Machine learning is a technique for teaching software to recognize patterns using data, e.g. for recognizing spam emails or recommending related products. Our ESP (Example-based Sensor Predictions) software recognizes patterns in real-time sensor data, like gestures made with an accelerometer or sounds recorded by a microphone. The machine learning algorithms that power this pattern recognition are specified in Arduino-like code, while the recording and tuning of example sensor data is done in an interactive graphical interface. We’re working on building up a library of code examples for different applications so that Arduino users can easily apply machine learning to a broad range of problems.
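The ESP code itself lives on the project’s GitHub page; to make the idea of example-based prediction concrete, here is a toy 1-nearest-neighbor classifier over recorded sensor windows. The data and helper names are hypothetical, and this is far simpler than the GRT pipeline ESP actually builds on:

```python
import math

def distance(a, b):
    """Euclidean distance between two equal-length sensor windows."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(window, examples):
    """1-nearest-neighbor: return the label of the closest recorded example."""
    return min(examples, key=lambda ex: distance(window, ex[0]))[1]

# Hypothetical recorded accelerometer-magnitude windows and their labels
examples = [
    ([0.0, 0.1, 0.0, 0.1], "still"),
    ([0.9, 1.2, 1.1, 0.8], "shake"),
]
label = classify([1.0, 1.0, 1.0, 1.0], examples)
```

Recording a few labeled example windows and matching live data against them is the “example-based” part; ESP wraps this loop in a graphical interface for recording and tuning the examples.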

The project is a part of my research at the University of California, Berkeley and is being done in collaboration with Ben Zhang, Audrey Leung, and my advisor Björn Hartmann. We’re building on the Gesture Recognition Toolkit (GRT) and openFrameworks. The software is still rough (and Mac-only for now) but we’d welcome your feedback. Installation instructions are on our GitHub project page. Please report issues on GitHub.

Our project is part of a broader wave of projects aimed at helping electronics hobbyists make more sophisticated use of sensors in their interactive projects. Also building on the GRT is ml-lib, a machine learning toolkit for Max and Pure Data. Another project in a similar vein is the Wekinator, which is featured in a free online course on machine learning for musicians and artists. Rebecca Fiebrink, the creator of Wekinator, recently participated in a panel on machine learning in the arts and taught a workshop (with Phoenix Perry) at Resonate ’16. For non-real time applications, many people use scikit-learn, a set of Python tools. There’s also a wide range of related research from the academic community, which we survey on our project wiki.

For a high-level overview, check out this visual introduction to machine learning. For a thorough introduction, there are courses on machine learning from Coursera and Udacity, among others. If you’re interested in a more arts- and design-focused approach, check out alt-AI, happening in NYC next month.

If you’d like to start experimenting with machine learning and sensors, an excellent place to get started is the built-in accelerometer and gyroscope on the Arduino or Genuino 101. With our ESP system, you can use these sensors to detect gestures and incorporate them into your interactive projects!