Tag Archives: art

Sisyphus: the kinetic art table

via Raspberry Pi

Surely if he had been given the opportunity, Sisyphus would have engineered a way out of his eternal punishment of rolling a boulder up a hill. It’s just too bad for him that Raspberry Pi wasn’t around to help. While it’s a far cry from his arduous task, the Pi has been used to power Bruce Shapiro’s Sisyphus, a continuous and ever-changing kinetic art piece that creates unique design patterns in sand using a small metal ball.


Sisyphus is truly mesmerising. We learned this first-hand: at Maker Faire New York earlier this month, it captured the attention of not only the Raspberry Pi crew, but also thousands of attendees throughout the weekend. Sisyphus momentarily drowned out the noise and action of the Faire.

You can think of Sisyphus as a cross between an Etch A Sketch and Spirograph, except this is no toy.

Under the table is a two-motor robot (the “Sisbot”) that moves a magnet which draws a steel ball through the sand. The motors are controlled by a small Raspberry Pi computer which plays a set of path files, much like a music player plays an MP3 file.
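The idea of a controller that "plays" path files the way a music player plays MP3s can be sketched in a few lines. This is purely illustrative: the real Sisbot's path format and motor interface haven't been published, so the plain-text polar-coordinate format and the `move` callback below are assumptions.

```python
# Hypothetical sketch: a path file is assumed to be lines of "theta rho"
# polar coordinates, one point per line, with "#" comments allowed.

def parse_path(text):
    """Parse a path file into a list of (theta, rho) floats."""
    points = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        theta, rho = line.split()
        points.append((float(theta), float(rho)))
    return points

def play(points, move):
    """'Play' a path by sending each point to a motor-move callback,
    much like a music player streams samples to a sound card."""
    for theta, rho in points:
        move(theta, rho)

# Example: collect the moves in a list instead of driving real motors.
moves = []
play(parse_path("0.0 0.0\n0.1 0.5\n0.2 1.0"), lambda t, r: moves.append((t, r)))
print(moves)  # [(0.0, 0.0), (0.1, 0.5), (0.2, 1.0)]
```

Swapping the lambda for a function that steps the two motors is all a real player loop would add.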


Bruce is using Kickstarter in the hope of transitioning Sisyphus from what’s currently a large art installation exhibited around the world into a beautiful piece to be enjoyed in the home, as both furniture and art.

annmarie thomas on Twitter

Sisyphus- Stunning art/furniture kickstarter (fully funded in <a day) by friend Bruce Shapiro. https://t.co/ijxHQ0fYb5

Bruce says:

Of all works I made, Sisyphus stood out – it was my first CNC machine to break out of the studio/shop. No longer tasked with cutting materials to be used in making sculptures, it was the sculpture itself. It was also unique in another way – I wanted to live with it in my home. I’ve spent the last three years perfecting a home version that’s beautiful, user-friendly, near-silent, and that will run for years.

Like most great Maker Faire projects, it’s centred around a wonderful community. The collaboration and access to tools in Shapiro’s local makerspace helped develop the final design seen today. While Shapiro’s original makerspace has since closed its doors, Shapiro and his fellow members opened up what is now Nordeast Makers. It’s where the production for Sisyphus will take place.


The Kickstarter products come in three styles: an end table, and two different coffee tables. You might want to find another place to display your coffee table books, though, so as to keep Sisyphus’s designs visible…


This Kickstarter won’t be running forever, so be sure to pledge if you love the sound of the Sisyphus.

The post Sisyphus: the kinetic art table appeared first on Raspberry Pi.

Human Sensor

via Raspberry Pi

In collaboration with Professor Frank Kelly and the environmental scientists of King’s College London, artist Kasia Molga has created Human Sensor – a collection of hi-tech costumes that react to air pollution within the wearer’s environment.

Commissioned by Invisible Dust, an organisation supporting artists and scientists to create pieces that explore environmental and climate change, Molga took to the streets of Manchester with her army of Human Sensors to highlight the invisible threat of air pollution in the industrial city.


Angry little clouds of air pollution

Each suit is equipped with a small aerosol monitor that links to a Raspberry Pi and GPS watch. These components work together to collect pollution data from their location. Eventually, the suits will relay data back in real time to a publicly accessible website; for now, information is stored and submitted at a later date.
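The store-now, upload-later pattern described above can be sketched as a small logger. The interfaces here are assumptions: a real aerosol monitor would typically be read over serial or USB and the GPS fix would come from the watch, so the plain numeric arguments below are stand-ins.

```python
# Hedged sketch of logging paired sensor/GPS readings for later upload.
import json
import time

def make_record(pm_ug_m3, lat, lon, timestamp=None):
    """Bundle one aerosol reading with its GPS fix into a JSON line."""
    return json.dumps({
        "time": timestamp if timestamp is not None else time.time(),
        "pm": pm_ug_m3,   # particulate reading, micrograms per cubic metre
        "lat": lat,
        "lon": lon,
    })

def log_reading(path, record):
    """Append a record to a local file, to be submitted at a later date."""
    with open(path, "a") as f:
        f.write(record + "\n")

rec = make_record(35.2, 53.4808, -2.2426, timestamp=1469404800)
print(rec)
```

One JSON line per reading keeps the file trivially appendable, and a later upload step just streams the file to the website.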

The Pi also works to control the LEDs within the suit, causing them to flash, pulse, and produce patterns and colours that morph in reaction to air conditions as they are read by the monitor.


All of the lights…

The suit’s LED system responds to the presence of pollutant particles in the air, changing the colour of the white suit to reflect the positive or negative effect of the air around it. Walk past the grassy clearing of a local park, and the suit will turn green to match it. Stand behind the exhaust of a car, and you’ll find yourself pulsating red.
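A minimal sketch of the red/green mapping described above. The thresholds and the linear colour blend below are illustrative assumptions, not the actual values used in the suits.

```python
# Illustrative mapping from a particulate reading to an LED colour:
# green in clean air, red in dirty air, blended in between.

def pollution_colour(pm_ug_m3, clean_below=12.0, dirty_above=35.0):
    """Map a particulate reading to an (r, g, b) tuple."""
    if pm_ug_m3 <= clean_below:
        return (0, 255, 0)
    if pm_ug_m3 >= dirty_above:
        return (255, 0, 0)
    frac = (pm_ug_m3 - clean_below) / (dirty_above - clean_below)
    return (int(255 * frac), int(255 * (1 - frac)), 0)

print(pollution_colour(5.0))   # (0, 255, 0): green, park-like air
print(pollution_colour(50.0))  # (255, 0, 0): red, behind the exhaust
```

A real suit would feed these tuples to its LED driver on every new reading from the monitor.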

It’s unsurprising that the presence of the suits in Manchester was both well received and a shock to the system for the city’s residents. Articles are beginning to surface regarding the impact of air pollution on children’s mental health, and other detrimental health effects of pollution have long been known, yet it’s a constant struggle for scientists to remind society of the importance of this invisible threat. The suits are a physical reminder built on the simple warning colours of red and green, and that makes the threat hard to ignore.

“The big challenge we have is that air pollution is mostly invisible. Art helps to make it visible. We are trying to bring air pollution into the public realm. Scientific papers in journals work on one level, but this is a way to bring it into the street where the public are.” – Andrew Grieve, Senior Air Quality Analyst, King’s College

 

Human Sensor

23-29 July 2016 in Manchester. Performers in hi-tech illuminated costumes reveal changes in urban air pollution. Catch the extraordinary performances created by media artist Kasia Molga with Professor Frank Kelly from King’s College London. The hi-tech illuminated costumes reflect the air pollution you are breathing on your daily commute.

Human Sensor is supported by the Wellcome Trust’s Sustaining Excellence Award and by Arts Council England; Invisible Dust is working in partnership with Manchester, European City of Science.

The post Human Sensor appeared first on Raspberry Pi.

Autocomplete poetry

via Raspberry Pi

Raspberry Pi integrated into the world of art. I hadn’t come across much of this before, and I like it a lot. As a self-proclaimed ‘artist of stuff’, it’s always exciting to see something arty that calls to the maker inside. With Glaciers, NYC-based Zach Gage has achieved exactly that.

Glaciers was an art installation that, like the landforms from which it takes its name, slowly developed over time. I say ‘was’, but with each of its constituent pieces still running and a majority already sold, Glaciers continues indefinitely. Using forty Raspberry Pis attached to forty plainly presented Adafruit e-ink screens, Gage used Google Search’s autocomplete function to create poetry.


We’ve all noticed occasional funny or poignant results of the way Google tries to complete your search query for you based on the vast amount of data that passes through its search engine daily. Gage has programmed the Raspberry Pis to select the top three suggestions that follow various chosen phrases and display them on the screens. The results are striking, often moving, and usually something that most people would acknowledge as poetry, or at least poetic.

The screens refresh daily as the Pis check Google for changes and update accordingly. For some search phrases, the autocompletions can change daily; for others, it could take years. A poem you’ve had upon your wall for months on end could suddenly change unexpectedly, updating to reflect the evolving trends of user queries on the internet.
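The fetch-and-select step could be sketched roughly as follows. Google's suggestion endpoint is unofficial and undocumented (its payload shape can change), so the network fetch is left out and only the pure selection logic, applied to an illustrative payload, is shown.

```python
# Sketch of the select-top-three step: given a seed phrase and a list of
# autocomplete suggestions, keep the first three completions, one per line.

def top_three(seed, suggestions):
    """Strip the seed phrase from each suggestion and keep the first three
    non-empty completions, forming the body of the 'poem'."""
    lines = []
    for s in suggestions:
        if s.lower().startswith(seed.lower()):
            s = s[len(seed):].strip()
        if s:
            lines.append(s)
        if len(lines) == 3:
            break
    return "\n".join(lines)

# Illustrative payload for the seed "why do i" (not a real captured response):
payload = ["why do i always feel tired",
           "why do i dream every night",
           "why do i owe taxes",
           "why do i sneeze in the sun"]
print(top_three("why do i", payload))
```

A daily cron job re-running this selection and redrawing the e-ink screen when the result changes would reproduce the refresh behaviour described above.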

“The best paintings you can look at a thousand times and you keep seeing new things.” – Zach Gage

Glaciers is certainly an intriguing installation: some of its pieces offer pithy observations on the vulnerability of anonymous internet users, while others are (somewhat) more light-hearted.

Zach Gage is an indie video game creator, responsible for titles such as SpellTower and the somewhat fear-inducing Lose/Lose (Space Invaders meets permanent file deletion, with some 17,000 files already lost to the game since launch). He’s previously used Raspberry Pis in other projects, such as his Twitter-fuelled best day ever and Fortune. I bet this isn’t the last time he does something fabulous with a Pi.

The post Autocomplete poetry appeared first on Raspberry Pi.

Make masterpieces with a homemade CNC painting machine

via Arduino Blog

Longtime artist Jeff Leonard has built a pair of Arduino-driven CNC painting machines with the motivation to grow his toolbox and expand the kinds of marks he could make simply by hand. By pairing the formal elements of painting with modern-day computing, the Brooklyn-based Maker now has the ability to create things that otherwise would’ve never been possible.

Machine #1 consists of a 5’ x 7’ table and is capable of producing pieces of art up to 4’ x 5’ in size. The device features a variety of tools, including a Beugler pinstriping paint wheel, a brush with a peristaltic pump syringe feed, an airbrush with a five-color paint feed system and five peristaltic pumps from Adafruit, a squeegee, and pencils, pens, markers and other utensils.

In terms of hardware, it’s equipped with three NEMA 23 stepper motors and three Big Easy Drivers, as well as an Arduino Mega and an Uno. There are two servos and five peristaltic pumps on the carriage: the first servo raises and lowers the tool, while the second presses the trigger on the airbrush. An Adafruit motor shield on the Uno controls the pumps, and the AccelStepper library is used for the Big Easy Drivers.

According to Leonard:

I am coding directly into the Arduino. There are many different codes that I call and overlap and use as a painter overlaps techniques and ideas. There is a lot of random built into the code, I don’t know what the end result will be when I start. Typically on any kind of CNC machining the end result has been made in the computer and the machine executes the instructions. I am building a kind of visual synthesizer that I can control in real-time. There are many buttons and potentiometers that I am controlling while the routines are running. I take any marks or accidents that happen and learn how to incorporate them into a painting.

I am learning Processing now and how to incorporate it into the image making.

Machine #2, however, is a bit different. This one is actually a standup XY unit that was made as a concept project. It paints using water on magic paper that turns black when wet and fades as it dries, used mainly as a way to practice calligraphy or Chinese brush painting. Not only does it look great, but there’s no cleanup either!

In terms of tools, the machine has a brush and an airbrush. Two NEMA 17 stepper motors are tasked with the XY motion. There are also three servos: one lifts and lowers the armature away from the paper (there is no Z-axis), another controls the angle of the brush, and the third presses the trigger of the airbrush. A peristaltic pump helps to refill the water cup, along with a small fan. The system is powered by an Arduino Uno with an Adafruit Motor Shield, using the Adafruit Motor Shield library v2.

As awesome as it all sounds, you really have to see these gadgets in action and their finished works (many of which can be found on Instagram).

Holopainting with Raspberry Pi

via Raspberry Pi

We’ve covered 2D light-painting here before. This project takes things a step further: meet 3D holopainting.


This project’s an unholy mixture of stop-motion, light-painting and hyperlapse from FilmSpektakel, a time-lapse and film production company in Vienna. It was made as part of a university graduation project. (With Raspberry Pis and Raspberry Pi camera boards, natch.)

Getting this footage out was a very labour-intensive process, but the results are stupendous. The subject was filmed by a ring of 24 networked Raspberry Pi cameras working like a 3D scanner, taking pictures around the ring with a delay of 83 milliseconds between each one so that movement could be captured.
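The trigger timing works out neatly: with 24 cameras each firing 83 milliseconds after the previous one, a full sweep of the ring takes just under two seconds. A quick sketch (the staggering scheme here is our reading of the description, not the team's published code):

```python
# Staggered trigger timing for a ring of cameras: camera i fires
# i * delay_ms milliseconds after camera 0.

def trigger_offsets(n_cameras=24, delay_ms=83):
    """Return the firing time (in milliseconds) of each camera in the ring."""
    return [i * delay_ms for i in range(n_cameras)]

offsets = trigger_offsets()
print(len(offsets))  # 24
print(offsets[:3])   # [0, 83, 166]
print(offsets[-1])   # 1909, so a full sweep takes just under two seconds
```

In the real rig each networked Pi would sleep until its own offset before firing its camera, which is what lets the ring capture a moving subject rather than a frozen one.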

Holopainting rig

 

They then cut out all of the resulting images – told you it was labour-intensive – and put them on a black background, then fed that data into a commercial light-painting stick. (If you don’t want to fork out a ton of cash for your own light-painting stick, there are instructions on building one with a Raspberry Pi over at Adafruit.)

A man dressed as a budget ninja walked the stick in front of a series of cameras set up where the original Raspberry Pi cameras had been, to create 3D images hanging in the air.

holopainting ninja

Presto: a holopainting – and the results are tremendous. Here’s a making-of video.

The Invention of #HoloPainting

Holopainting is a combination of the light painting, stop motion, and hyperlapse techniques to create three-dimensional light paintings. We didn’t want to use computer-generated images, so we built a giant 3D scanner out of 24 Raspberry Pis with their cameras. These cameras took photos from 24 different perspectives of the person in the middle with a delay of 83 milliseconds, so the movement of the person was also recorded.

There’s a comment that often pops up when we describe a project like this: why bother? We’ll head that off right now: because you can. Because nobody’s done it before. Because the end results look phenomenal. We love it, and we’d love to see more projects like this!

The post Holopainting with Raspberry Pi appeared first on Raspberry Pi.

Le Myope – a confused camera

via Raspberry Pi

This is very silly indeed.

Salade Tomate Oignon in Paris seems to be making a bit of a habit of doing outlandish things with Raspberry Pi and other people’s photography. You might remember Layer Cam from a couple of years ago, which allows you to point a sandwich box pretending to be a camera at a landmark and serves up somebody else’s picture of the same thing, using GPS coordinates and Google Image Search.

His newest Raspberry Pi hack, Le Myope (for non-Francophones, that’s The Shortsighted), actually includes a camera – but the results are not what you’d expect. Here’s a bit of video to show you more.

Le myope: a similar images Raspberry Pi camera

Short-sighted camera based on a Raspberry Pi and Google similar images. Find instructions and code to build your own: http://saladetomateoignon.com/Wordpress/a-short-sighted-raspberry-pi-camera/ Music: Samuel Belay – Qeresh Endewaza. Logo: Alice (www.alicesawicki.com). Images: Charly (www.nnprod.com).

Salade Tomate Oignon says:

Even more imprecise than a blurry Polaroid picture, or than a filter-abused Instagram shot.
Using the most advanced algorithms based on machine learning and computer vision, here is ‘Le Myope’, a short-sighted camera.
The new iteration of the Layer Cam (‘Why are you taking this picture? It’s already on the Internet!’) is a Raspberry Pi-based camera that takes a picture and returns a similar one from Google similar image search.
Use it in a popular place and chances are that you will get the same picture taken by someone else. (That happened with the mural during one of the tests.)
Use it in a remote place and get random, roughly similar pictures from all over the internet!

This is an extremely daft project which pleased us out of all proportion. You can find code and instructions to build your own at Salade Tomate Oignon’s website. Go forth and take other people’s photographs.

 

The post Le Myope – a confused camera appeared first on Raspberry Pi.