Enginursday: Gesture-Reactive Wearable Fiber Optic

via SparkFun: Commerce Blog

As I wrap up this big old wearable light suit, I’ve decided to add some extra controls to my audio-reactive control system. I’ve always liked the idea of being able to control the colors precisely myself, rather than being at the whim of the aural environment.

Options

I looked over several different sensor options to find an effective form of gesture or motion control for my lights. The first one I checked out was a Qwiic Accelerometer, which I used to create some pretty neat colors, but I found that I couldn’t control them well without forcing my hands into particular positions and orientations. In other words, I was unable to simultaneously dance and control colors. I also checked out a TOF (time-of-flight) distance sensor, which worked fantastically, but with one distance sensor in each hand I could only control two colors at once, and whenever either sensor faced the sky those colors would max out. So again, I was unable to move how I wanted and be the color I wanted to be.

Hardware

To enable myself to move and change colors independently, I created a small Qwiic-enabled breakout for a few flex sensors that go into each glove. (Maybe you’ll even see it on SparkFun’s storefront someday…hint hint, wink wink :-) The board carries an ADC-to-I2C converter that takes the flex readings and turns them into nice, easy I2C data, so we don’t have to use any of the analog pins on our microcontroller. This also saves wiring between the gloves and the microcontroller, since pulling an analog read off of each finger would take quite a bit more cabling.

Software

Now to translate readings from our hands into colors on the suit. This part is relatively simple and uses Arduino’s handy dandy map() function to map the range of values the flex sensors produce (roughly 400–800) onto values that make sense to an LED (0–255). However, map() doesn’t clamp its output: a reading outside the expected sensor range maps to a value outside 0–255, which wraps back around when it’s stuffed into an 8-bit color channel. That makes it extraordinarily difficult to fully maximize or minimize a color, since you’d need to hold your finger at exactly the reading that produces 0 or 255. To mitigate this, we first pass each reading through Arduino’s constrain() function to clamp it to the range we map from, so the output can never loop around. Then we simply write these values to our LEDs, and we’re good to go! A snippet of the example code is shown below, and the library and breakout board used in these gloves will be available soon for all of your gesture-control needs.

#include "FastLED.h"
#include <Wire.h>
#include <SparkFun_ADS1015_Arduino_Library.h>

#define NUM_LEDS 48  // assumed strip length; the original snippet never defines it, and the loops below run to 48

// Data/clock pin pairs for the APA102 strips on each section of the suit
#define DATA_LEFT_ARM 11
#define CLOCK_LEFT_ARM 13
#define DATA_RIGHT_ARM 0
#define CLOCK_RIGHT_ARM 32
#define DATA_LEFT_LEG 7
#define CLOCK_LEFT_LEG 14
#define DATA_RIGHT_LEG 21
#define CLOCK_RIGHT_LEG 20
#define DATA_BELT 28
#define CLOCK_BELT 27

// One CRGB pixel buffer per strip
CRGB leftArm[NUM_LEDS];
CRGB rightArm[NUM_LEDS];
CRGB leftLeg[NUM_LEDS];
CRGB rightLeg[NUM_LEDS];
CRGB belt[NUM_LEDS];

// One ADS1015 ADC per glove, at different addresses on the same I2C bus
ADS1015 glove;
ADS1015 glove1;

// Raw (constrained) flex-sensor readings...
uint16_t red;
uint16_t green;
uint16_t blue;
uint16_t waveFrequency;
// ...and the same readings mapped into 0-255 LED values
uint8_t redMap;
uint8_t greenMap;
uint8_t blueMap;
uint8_t waveFrequencyMap;

void setup() {
    // Register the APA102 strips. Some sections mix segments with different
    // color orderings, so the same buffer is registered more than once; in the
    // three-argument form the second argument is an offset and the third a count.
    FastLED.addLeds<APA102, DATA_LEFT_ARM, CLOCK_LEFT_ARM, RBG, DATA_RATE_MHZ(8)>(leftArm, 48);
    FastLED.addLeds<APA102, DATA_LEFT_ARM, CLOCK_LEFT_ARM, BGR, DATA_RATE_MHZ(8)>(leftArm, 16);

    FastLED.addLeds<APA102, DATA_RIGHT_ARM, CLOCK_RIGHT_ARM, RBG, DATA_RATE_MHZ(2)>(rightArm, 48);
    FastLED.addLeds<APA102, DATA_RIGHT_ARM, CLOCK_RIGHT_ARM, BGR, DATA_RATE_MHZ(2)>(rightArm, 32, 16);

    FastLED.addLeds<APA102, DATA_LEFT_LEG, CLOCK_LEFT_LEG, RBG, DATA_RATE_MHZ(2)>(leftLeg, 40);

    FastLED.addLeds<APA102, DATA_RIGHT_LEG, CLOCK_RIGHT_LEG, BGR, DATA_RATE_MHZ(2)>(rightLeg, 40);
    FastLED.addLeds<APA102, DATA_RIGHT_LEG, CLOCK_RIGHT_LEG, RBG, DATA_RATE_MHZ(2)>(rightLeg, 8);

    FastLED.addLeds<APA102, DATA_BELT, CLOCK_BELT, BGR, DATA_RATE_MHZ(2)>(belt, NUM_LEDS);

    FastLED.setBrightness(100);  // global brightness cap (out of 255)
    Serial.begin(9600);
    Wire.begin();  // join the I2C bus as master

    // First glove's ADC at the library's default address
    if (glove.begin() == false) {
        Serial.println("Device not found. Check wiring.");
        while (1);  // halt so we don't run with a missing sensor
    }
    // Second glove's ADC at alternate address 0x4A
    if (glove1.begin(Wire, 100000, 0x4A, 9600) == false) {
        Serial.println("Device not found. Check wiring.");
        while (1);
    }
}

void loop() {
    basicGlove();  // poll the gloves and repaint the suit every pass
}

// Read all four flex sensors, clamp each to its usable range, then map to 0-255
void gloveRead() {
    red = constrain(glove.getAnalogData(0), 500, 800);
    green = constrain(glove.getAnalogData(1), 470, 780);
    blue = constrain(glove1.getAnalogData(0), 500, 815);
    waveFrequency = constrain(glove1.getAnalogData(1), 550, 850);

    // Constraining first means map() can never push a value outside 0-255
    redMap = map(red, 500, 800, 0, 255);
    greenMap = map(green, 470, 780, 0, 255);
    blueMap = map(blue, 500, 815, 0, 255);
    waveFrequencyMap = map(waveFrequency, 550, 850, 0, 255);
}

// Push the mapped finger readings out to every strip
void basicGlove() {
    gloveRead();
    for (int LED = 0; LED < 16; LED++) {
        rightArm[LED] = CRGB(redMap, redMap, redMap);  // first right-arm segment runs the red reading on all three channels
        leftArm[LED] = CRGB(redMap, greenMap, blueMap);
        rightLeg[LED] = CRGB(redMap, greenMap, blueMap);
        leftLeg[LED] = CRGB(redMap, greenMap, blueMap);
        belt[LED] = CRGB(redMap, greenMap, blueMap);
    }
    for (int LED = 16; LED < 48; LED++) {
        rightArm[LED] = CRGB(redMap, greenMap, blueMap);
        leftArm[LED] = CRGB(redMap, greenMap, blueMap);
        rightLeg[LED] = CRGB(redMap, greenMap, blueMap);
        leftLeg[LED] = CRGB(redMap, greenMap, blueMap);
        belt[LED] = CRGB(redMap, greenMap, blueMap);
    }
    FastLED.show();
}

The finished product, with me in control of the colors, is shown below.



facepunch: the facial recognition punch clock

via Raspberry Pi

Get on board with facial recognition and clock your screen time with facepunch, the facial recognition punch clock from dekuNukem.

image c/o dekuNukem

How it works

dekuNukem uses a Raspberry Pi 3, the Raspberry Pi camera module, and an OLED screen for the build. You don’t strictly need to include the OLED board, but it definitely adds to the overall effect, letting you view your daily and weekly screen time at a glance without having to access your Raspberry Pi for data.

As dekuNukem explains in the GitHub repo for the build, they used a perf board to mount the screen and attached it to the Raspberry Pi. This is a nice, simple means of pulling the whole project together without loose wires or the need for a modified case.

image c/o dekuNukem

The face_recognition library lets the Pi and camera recognise your face. You’ll also need a well-lit 400×400 photograph of yourself to act as a reference for the library. From there, a few commands should get you started.
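
To give a feel for what those commands boil down to (this is a minimal sketch, not dekuNukem’s actual code, and the filenames are placeholders), the library encodes the reference photo once and compares new frames against it:

import face_recognition

# Build an encoding from the well-lit 400x400 reference photo
# ("reference.jpg" and "frame.jpg" are placeholder filenames)
reference = face_recognition.load_image_file("reference.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

# Check every face found in a frame from the Pi camera against the reference
frame = face_recognition.load_image_file("frame.jpg")
for encoding in face_recognition.face_encodings(frame):
    if face_recognition.compare_faces([reference_encoding], encoding)[0]:
        print("Face recognised - clock in/out here")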

Uses for facial recognition

You could simply use facepunch for its intended purpose, but here at Pi Towers we’ve been discussing further uses for the build. We’re all guilty of sitting for too long at our desks, so why not incorporate a “get up and walk around” notification? How about a flashing LED that tells you to “drink some water”? You could even go a little deeper (though possibly a little Big Brother) and set up an “I’m back at my desk” notification on Slack, to let your colleagues know you’re available.
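
That last idea is only a webhook away. Here’s a minimal sketch, assuming you’ve already created a Slack incoming webhook (the URL below is a placeholder):

import requests

# Placeholder - create an incoming webhook in your Slack workspace and paste its URL here
WEBHOOK_URL = "https://hooks.slack.com/services/YOUR/WEBHOOK/URL"

def announce_back_at_desk():
    # Incoming webhooks accept a simple JSON payload
    requests.post(WEBHOOK_URL, json={"text": "I'm back at my desk!"})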

You could also take this foray into facial recognition and incorporate it into home automation projects: a user-identifying Magic Mirror, perhaps, or a doorbell that recognises friends and family.

What would you do with facial recognition on a Raspberry Pi?


A Look Back: 15 Years of SparkFun and the Maker Movement

via SparkFun: Commerce Blog

As we continue to celebrate SparkFun’s 15th Anniversary, we are taking a look back not just at our own history, but at that of the maker community beyond SparkFun’s walls. Our supremely talented graphic designer, Pete, made this beautiful timeline illustrating how major events at SparkFun align with major events in the maker movement and pop culture.* Check it out below!

Remember something that isn't on the timeline? Let us know in the comments below!

Read more about SparkFun’s history in Nate’s 15th anniversary retrospective.


*Information about the maker movement used in this timeline was originally published by Maker Media.


Build a Binary Clock with engineerish

via Raspberry Pi

Standard clocks with easily recognisable numbers are so last season. Who wants to save valuable seconds simply telling the time, when a series of LEDs and numerical notation can turn every time query into an adventure in mathematics?

Build a Binary Clock with Raspberry Pi – And how to tell the time

In this video I’ll be showing how I built a binary clock using a Raspberry Pi, NeoPixels and a few lines of Python. I also take a stab at explaining how the binary number system works so that we can decipher what said clock is trying to tell us.

How to read binary

I’ll be honest: I have to think pretty hard to read binary. It stretches my brain quite vigorously. But I am a fan of flashy lights and pretty builds, so YouTube and Instagram rising star Mattias Jähnke, aka engineerish, had my full attention from the off.

“If you have a problem with your friends being able to tell the time way too easily while in your house, this is your answer.”

Mattias offers a beginners’ guide to binary in his video and then explains how his clock displays values in binary, before moving on to the actual clock build process. So make some tea, pull up a chair, and jump right in.

Binary clock

To build the clock, Mattias used a Raspberry Pi and NeoPixel strips, fitted snugly within a simple 3D-printed case. With a few lines of Python, he coded his clock to display the current time using the binary system, with columns for seconds, minutes, and hours.
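
Mattias’s own code isn’t public (more on that below), but the column logic is simple enough to sketch. Here’s a minimal, display-agnostic Python version; on the real clock, each bit would switch one NeoPixel on or off instead of printing:

from datetime import datetime
import time

BITS = 6  # six bits per column covers 0-59 for minutes and seconds

def column(value, bits=BITS):
    # Bits of one clock column, least significant first
    return [(value >> i) & 1 for i in range(bits)]

while True:
    now = datetime.now()
    columns = [column(now.hour), column(now.minute), column(now.second)]
    # Print the most significant row first; '#' marks a lit pixel
    for row in reversed(range(BITS)):
        print(" ".join("#" if col[row] else "." for col in columns))
    print("--- %02d:%02d:%02d" % (now.hour, now.minute, now.second))
    time.sleep(1)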

“The real kicker with a binary clock is that by the time you’ve deciphered what time it is – you’re probably already late.”

The Python code isn’t currently available on Mattias’s GitHub account, but if you’re keen to see how he did it, and you ask politely, and he’s not too busy, you never know.

Make your own

In the meantime, while we bat our eyelashes in the general direction of Stockholm and hope for a response, I challenge any one of you to code a binary display project for the Raspberry Pi. It doesn’t have to be a clock. And it doesn’t have to use NeoPixels. Maybe it could use an LED matrix such as the Sense HAT, or a series of independently controlled LEDs on a breadboard. Maybe there’s something to be done with servo motors that flip discs with different-coloured sides to display a binary number.

Whatever you decide to build, the standard reward applies: ten imaginary house points (of absolutely no practical use, but immense emotional value) and a great sense of achievement to all who give it a go.


A journey into Capcom’s CPS2 silicon – Part 2

via Dangerous Prototypes


Here’s an informative part 2 of the Capcom CPS2 reverse engineering series by Eduardo Cruz:

Capcom’s Play System 2, also known as CPS2, was a new arcade platform introduced in 1993 and a firm stand against bootlegging. Featuring similar but improved specs compared to its predecessor, the CPS1, the system introduced a new security architecture that gave Capcom a piracy-free platform for the first time. That fact remained true for its main commercial lifespan, and even prevented projects like MAME from gaining proper emulation of the system for years.

See the full post on the Arcade Hacker blog. Be sure to see Part 1 here.