Author Archives: Avra Saslow

DIY Contactless Temperature Monitor System with RFID

via SparkFun: Commerce Blog

Here at SparkFun, we’ve done our best to find ways to safely show up to the office and warehouse throughout the pandemic, to ensure that our products are built to our high standards. And really, everybody has their hand on a product at some point in the process, whether that’s engineering, marketing, shipping, kitting, or any of the other half-dozen departments working hard to make the best possible version of a product.

Working in the office also means that we've had to take extra precautions to stay safe throughout the workplace. One of those precautions is a daily check-in, in which everybody entering the building has their temperature taken and documented, to catch anybody who might unknowingly have a fever. This entails one of our employees spending a few hours each morning just taking temperatures and recording employees' names and associated readings. While I'm a strong proponent of the human presence each morning, as it's an outlet for me to talk to somebody outside my house, I was also curious whether I could build a system to automate the process so they don't have to sit there for so long.

We already use RFID cards as a means to enter the building, so I thought the easiest system would be to scan our employee cards, trigger an infrared temperature sensor to take a body temperature reading, and record the information in an Excel spreadsheet.

I picked up the RFID kit so I could experiment with multiple cards, not just my own employee card.

SparkFun RFID Qwiic Kit (KIT-15209), $44.95

I also wanted to use the IR Thermometer Evaluation Board because it comes with the MLX90614-ABB, a single-zone infrared thermometer capable of sensing object temperatures between -70 and 380°C. It communicates over SMBus, an I2C-like interface, which means you really only need to devote two wires from your microcontroller to talk to it.

SparkFun IR Thermometer Evaluation Board - MLX90614 (SEN-10740), $34.95

Using these two boards means we'll use three different libraries:

#include <Wire.h>                 // I2C library
#include <SparkFunMLX90614.h>     // SparkFun MLX90614 IR thermometer library
#include "SparkFun_Qwiic_Rfid.h"  // SparkFun Qwiic RFID library

There are three discrete parts of the code that ultimately make up this kind of system. First, there's a function that scans RFID cards and saves the tag. Second, that function triggers another function to record the temperature of the body in front of the RFID scanner. And third, all of that data is recorded and sent to an Excel spreadsheet.

Both the IR temperature function and the RFID scanning function initialize Serial and their respective sensors, then print the readings to Serial. If you are interested in this code, let us know in the comments and we'll share it! What becomes really helpful, though, is exporting the serial monitor data to Excel so we can track temperature trends over time.

With PLX-DAQ, we can send the real-time data collected by Arduino into Excel, where it's much easier to process data. First, make sure to download the PLX-DAQ software here.

In the functions below, the LABEL line defines the spreadsheet's column headers, and the DATA keyword specifies that the rest of that serial line will be recorded as a row of data in those columns.

void Initialize_streamer(){
    Serial.println("CLEARDATA");  // Clear any old data in the spreadsheet
    Serial.println("LABEL,Date,Time,Name,Temperature");  // Column headers, in the order the data is printed below
}

void Write_streamer(){
    Serial.print("DATA");      // Tell PLX-DAQ this line is a row of data
    Serial.print(",");         // Move to next column using a comma
    Serial.print("DATE");      // PLX-DAQ fills in the current date
    Serial.print(",");
    Serial.print("TIME");      // PLX-DAQ fills in the current time
    Serial.print(",");
    Serial.print(RfidReading); // Employee tag from the RFID scan
    Serial.print(",");
    Serial.print(TempReading); // Reading from the IR temp sensor
    Serial.println();          // End of the row, move to next row
}

When you open the PLX-DAQ spreadsheet, you'll have to choose the port your Arduino is on as well as the matching baud rate, click "connect," and the data should start showing up in the Excel file.


This is a simple example of combining an RFID scanner with an IR temperature sensor, but it ultimately provides a very useful, real-world tool. What other ideas do you have for implementing RFID in creative ways? Start experimenting yourself with the RFID Starter Kit, and happy temperature tracking and hacking!


Perfecting Coffee Roasting with the Qwiic Photodetector Breakout

via SparkFun: Commerce Blog

Zach Halvorson has always been tenacious about combining his passion for coffee with his academic interests in computer vision and photonics. He's built a variety of projects that pair coffee with technology, including a home roaster made out of a popcorn air popper, coupled with a remote switch, Arduino, AC dimmer and thermocouples.

He is fascinated with the intersection of coffee and tech. “As an engineering student with a passion for coffee, finding ways to design new tools and apply my skill set and background is something I’ve enjoyed doing and have found to be incredibly self-motivating," he said. "The coffee industry has a long history, yet still there are endless avenues for innovation and new applications of technology, so it’s very easy to stay excited about working in this area, trying to make an impact and creating new tools in the future!”

He had the chance to build on the intersection of coffee and computer science during a product tech innovation fellowship at Boston University, where he's currently studying Biomedical Engineering. Ultimately, the project he worked on there developed into a company, Espresso Vision Inc., which develops technological tools, products and services for the coffee community.

“The original project idea was to develop a computer vision algorithm that converts, in real-time, an analog pressure gauge value into digital values for data logging," Zach said. "This was specifically for use in the espresso-making process, in which the applied pressure to the coffee grounds over time can have drastic effects on the taste and flavor of the brewed espresso. The algorithm was first built in R, and was converted into OpenCV using C++, to be deployed in an iOS app. The algorithm works and is fully developed, and we continue to work on the accompanying iOS app, with a goal of potentially releasing the app in 2021.”

As Zach developed Espresso Vision Inc., he realized that many products within the coffee roasting community weren’t at all affordable. He decided to build an affordable roast level sensor, called Roast Vision, that quantifies the degree to which coffee beans have been roasted using a sample of finely ground coffee, and was able to bring the cost down from well over $1,000 to $299.


Zach was kind enough to send us a Roast Vision and some samples of coffee so that we could test it out for ourselves.


After about a teaspoon of coffee grounds is loaded into the sample cup, Roast Vision measures the level of roast based on detected near-infrared light. It maps the roast level to a number from 0 to 35 (very dark to very light). The whole measurement process takes just a few seconds.

SparkFun Photodetector Breakout - MAX30101 (Qwiic) (SEN-16474), $19.95

Zach explains the meticulous design process he underwent in creating Roast Vision, and the countless iterations he went through:

“Once I settled in on the SparkFun NIR sensor, I began testing different arrangements, from top-loaded designs to bottom-loaded, square and circle sample cups, ultimately settling in on a top-loaded design with an integrated sample cup and acrylic windows. However, once I saw the SparkFun Qwiic Photodetector Breakout on a Product Showcase, I immediately knew that I had to test that sensor. The previous NIR sensor was very sensitive, but the aperture is open, and I had concerns that even if I made it a sealed device, over time coffee would get in, and if it ended up inside the sensor it would ruin the unit. Being able to use the photodetector that has a glass cover over the LEDs and sensor was a huge improvement in the long term accuracy and reliability of Roast Vision. That was the biggest sudden change, due to a challenge of preventing coffee from contaminating the sensor.”


For Zach, Espresso Vision Inc. will continue to be a place for developing affordable tools for the coffee community, at many points in the process. From brewing espresso or other types of coffee to roasting, and potentially even further up the supply chain to crop monitoring and other types of projects, he hopes to push the boundaries and create new demand and enthusiasm for technical projects and innovation in the field.

If you want to read more about Roast Vision, the Daily Coffee News just featured the product on their blog, and you can always check out the product for yourself at Espresso Vision Inc.


Tracking Trail Usage with the OpenMV Cam H7 Plus

via SparkFun: Commerce Blog

Happy first day of fall, SparkFans!

This summer, amidst the pandemic, I've tried to spend as much time as I can outdoors on trails. I’m from a small town in Colorado, where I grew up riding and racing mountain bikes on the local trails. The youth bike program that I rode with often partnered with another local organization that built and maintained trails, so as riders, we could give back and build more trails for the greater community.

Over the years, as trails in my hometown and throughout Colorado have become more popular, I've noticed that some local governments are funding efforts to determine which type of trail user is most prominent on which trails. By understanding whether one trail sees more bikers than hikers, or whether another sees very little traffic at all, trail building organizations can focus their efforts to ensure that existing trails fit the needs of their users. Usually, this research is conducted by paying someone to sit at a trailhead and manually tally what kinds of users pass by.

This could be a perfect place for technology to survey and compile data about trail users instead of doing it manually. Specifically, it's a prime example of where the OpenMV Cam H7 Plus could run some object recognition and tally up each time a person is detected.


The basic starting point in the OpenMV IDE is setting up the sensor settings. You can change the contrast, windowing and other parameters for whatever you might be viewing.

import sensor

sensor.reset()                         # Initialize the camera sensor
sensor.set_pixformat(sensor.GRAYSCALE) # Person detection runs on grayscale images
sensor.set_contrast(3)
sensor.set_gainceiling(16)
sensor.set_framesize(sensor.QVGA)      # 320x240
sensor.set_windowing((240, 240))       # Crop to a square window
sensor.skip_frames(time=2000)          # Let the camera settle

The other important piece in the script is the frame rate (FPS) clock - it determines how many snapshots the camera takes and over what time frame.
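Setting it up only takes a couple of lines. Here's a minimal sketch using OpenMV's standard time module:

import time

clock = time.clock()            # Tracks the frame rate between tick() calls

while(True):
    clock.tick()                # Start the frame timer
    img = sensor.snapshot()     # Take a picture
    print(clock.fps())          # Frames per second since the last tick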

For this specific use case, I turned to TensorFlow to load Google's Person Detection Model to see if a person is in view. The person detection network is built into the OpenMV Cam's firmware, so it has already been trained to classify images as containing a person, not containing a person, or unsure whether there is a person.

net = tf.load('person_detection')
labels = ['unsure', 'person', 'no_person']
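Putting the clock together with the loaded network, the detection loop looks roughly like the sketch below. This is modeled on OpenMV's TensorFlow examples rather than this project's exact code, and it assumes the sensor setup and clock from the snippets above; each classification result exposes an output() list with one confidence score per label:

while(True):
    clock.tick()
    img = sensor.snapshot()
    for obj in net.classify(img):           # Run the network over the frame
        scores = obj.output()               # One confidence score per label
        best = scores.index(max(scores))    # Index of the most likely label
        print("%s (%.2f) - %.1f fps" % (labels[best], scores[best], clock.fps()))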


I found the biggest issue to be that the model produces so much data when it's running; it's overwhelming to sift through everything saved to the SD card. The camera needs to be running to determine whether there's a person in the frame, but it'd be optimal if the camera sat in a low-power sleep mode and only started the frame rate clock once it detected a person in view. Otherwise, it returns thousands of data points.

However, since OpenMV is built on Python, it's easy to develop a visualization in an open source notebook environment like Jupyter that queries those data points and only returns values where a person was detected above a certain threshold (e.g., 80 percent certainty).
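A minimal sketch of that query, assuming the detections were logged to a CSV with hypothetical timestamp and person_score columns (match these to however you actually write to the SD card):

import pandas as pd

# Hypothetical file and column names - adjust to match your SD card log
df = pd.read_csv("trail_log.csv")
people = df[df["person_score"] > 0.8]   # Keep only rows above 80% certainty
print(people[["timestamp", "person_score"]])
print("Detections:", len(people))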

The other issue is that the person detection model is based on what the Google machine learning library has already been trained to see, so tracking a person on a bicycle isn't completely accurate, because a cyclist is a different object than just a person. To ensure accuracy for different trail users, you'd have to retrain the model to include every kind of trail user and classify them individually. So for the purposes of this project, it's most useful to simply count how many people are on a trail at a time.

However, the model as it stands is really quite accurate, and is ideal for collecting data over a short time span. If you're interested in a further explanation of the ML model, would like to see the code for the project, or just want to see the visualization that displays how many people were detected, let me know in the comments and we can post that!

Happy fall - go out and build some machine vision projects with the OpenMV Cam H7 Plus! I can't even express how fun this module is, so happy hacking with it!


Streaming Artemis DK Sensor Data to Web App

via SparkFun: Commerce Blog

One of the aspects I was most excited about on the new SparkFun Artemis Development Kit is its expanded BLE functionality. With Bluetooth, the scope of my projects opens up to the entire world instead of just my local one, and with a plethora of sensors on board, this module is just begging to stream real-time data to the world wide web! So that’s just what we did.

The first piece is using the ArduinoBLE library to listen for devices to connect to the Dev Kit.


With the Artemis line, we just have to compile the Arduino firmware, drop the .bin file onto the Artemis' USB drive, and from there we can run the program. If you open the serial monitor, it should say “starting accelerometer…Artemis Dev Kit BLE Sensor Updater.”


We can start the web app using yarn, which serves it on localhost and displays the front end we've built with React.


The web page needs to be explicitly connected to the Artemis board; once it is, it immediately loads real-time sensor data from the module, visualized with a JavaScript library called Nivo.


The Arduino firmware reads the real-time values for the x, y and z axes from the on-board accelerometer, as well as the LED state. The LED values jump so drastically because the LED is either on or off (1 or 0).


The Arduino firmware can be altered to read in real-time values from the camera and microphone as well, if you want to display autoexposure or frequency data on the visualization. You'd need to change the driver code for each respective sensor to read that specific device register.

Nivo is an incredibly robust visualization library, so you can start filtering noise or customizing visualizations easily once you start streaming data. This is really just a starting point in displaying all of the data from the on-board sensors - you can develop the firmware and application to visualize whatever sensor you might need, and design it however you like with CSS and JavaScript. Another avenue for improving the web app would be to push the application to the cloud and host it on a public server.

So whether you want to alter the design aesthetic, read in values from the camera and microphone, push to the cloud, or incorporate machine learning, the Artemis Dev Kit and this project are great starting points to connect to the world wide web with Bluetooth.

Happy sensor streaming, and happy hacking with the Artemis Dev Kit!


Drifting Downhill with the SparkFun OpenLog Artemis

via SparkFun: Commerce Blog

There are an infinite number of questions to be asked about the world around us. Why is my mail so banged up when it's delivered? Where did the banging around occur, and how large was the force? What is the ambient light when lightning bugs decide to come out and flicker? How do wind and/or speed determine when a moving vehicle will tip on a banked corner? How much does doing 360s on a drift trike slow you down on your ride, and what is the optimal speed at which to enter a 360?

We decided to tackle that last question and, using the new SparkFun OpenLog Artemis (“OLA”), quickly gather data about how changes in acceleration affect Cassy’s drift triking. The OLA was the ideal module for this project because it was so simple to configure and, ultimately, to read data from.

All we did was load in a blank FAT32 microSD card, hook it up to a power source, ensure the IMU was enabled, and send it speeding down a hill with Cassy. The data is written to a CSV file on the SD card, already formatted with column headers, so you won’t have to do anything to read the data… not even write one line of code!

SparkFun OpenLog Artemis (DEV-16832), $49.95

If you do want to dive into analyzing the data, the fact that it is already organized in a CSV makes it easy to play around with in any code editor or language. We specifically used a Jupyter Notebook with Python and its associated libraries to visualize how speed changes when doing 360s on a trike.
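As a minimal sketch of that kind of analysis, you can compute and plot the acceleration magnitude from the IMU columns. The file and column names here are hypothetical; check the header row of your own OLA log for the real ones:

import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical file and column names - match them to your OLA log's header row
df = pd.read_csv("datalog.csv")
accel = (df["aX"]**2 + df["aY"]**2 + df["aZ"]**2) ** 0.5  # Acceleration magnitude

plt.plot(accel)
plt.xlabel("Sample")
plt.ylabel("Acceleration magnitude")
plt.title("Drift trike run")
plt.show()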

The SparkFun OpenLog Artemis is the ultimate plug-and-play partner; it enables you to be an everyday scientist, asking questions about the world around you and getting fast feedback through its datalogging and sensing capabilities.

What questions will you start answering with it in your pocket? Let us know, and be sure to check out the project video below, as well as Rob's product video that goes over all the specs. Happy drifting and hacking!


Developing Web Applications on a Raspberry Pi with Flask

via SparkFun: Commerce Blog

So, you've got yourself a Raspberry Pi, and now you want to connect it to the internet to display all the data you’ve collected. Exciting stuff! But how in the heck do you actually do that?

There are a few popular methods to create web servers and web applications using your small single-board computer, and today we're going to cover one of the most commonly used - Flask!

First, let's cover what a web framework is. A web application framework is a collection of libraries and modules that enable web application developers to write applications without worrying about low-level details such as protocol and thread management.

Each framework has a different way of building routes, models, views, database interactions and the overall application configuration. All of this comes down to a compact set of API endpoints that the back end must implement, along with the HTTP methods each endpoint allows:

  • GET: retrieves data
  • POST: sends HTML form data to the server
  • HEAD: same as GET method, but just returns the HTTP headers, and no response body
  • PUT: replaces the resource at the current URL with the resource contained within the request
  • DELETE: deletes the specified resource

Flask is one of the most popular Python web application frameworks, because it's easy to get started with and is incredibly readable.

It's built on Jinja2, a Python template engine that renders dynamic web pages; this is what ultimately allows you to pass Python variables into Flask's HTML templates. It's also built on Werkzeug, a Web Server Gateway Interface (WSGI) toolkit that connects your Python code to the web server running the application.

Starting your first application in Flask takes just a few lines of code. Make sure you have Python 3 installed on your Pi, then use pip to install Flask, like this: sudo pip3 install flask

Here's the web app:

from flask import Flask
app = Flask(__name__)

@app.route('/')
def index():
    return 'Hello SparkFun World!'

if __name__ == '__main__':
    app.run()
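Routes can also declare which of the HTTP methods listed earlier they accept. Here's a minimal, hypothetical sketch (the /reading endpoint and form field are made up for illustration):

from flask import Flask, request

app = Flask(__name__)

@app.route('/reading', methods=['GET', 'POST'])
def reading():
    if request.method == 'POST':
        # Store the form data the client sent
        return 'Stored: ' + request.form['value']
    return 'Latest reading'  # A plain GET just retrieves data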

However, just because Flask is a microframework doesn't mean the whole app exists inside a single Python file. You'll ultimately have many other files, including HTML and CSS, to design your webpage how you'd like. One of the best parts of Flask is that it lets you create templates so your HTML stays consistent throughout your entire website.

First, make a templates directory within your app (mkdir app/templates) to hold your template. Mine will be called index.html, and will display the air quality at different coordinates (thanks to my pHAT and Qwiic Environmental Sensors!).

<html>
    <head>
        <title>{{ title }}</title>
    </head>
    <body>
        <h1> GPS Coordinates: {{ coordinates }} </h1>
        <h1> Air Quality Index: {{ AQI }} </h1>
    </body>
</html>

The {{ and }} symbols are Jinja2 placeholders; Flask substitutes the Python variables you pass in when it renders the page.

Now, to use our template we have to render it, and Flask has a function that does exactly that: from flask import render_template. We'll also change a bit of the code in our initial web app so this new index file is served.

@app.route('/')
@app.route('/index')
def index():
    coordinates_Boulder = '40.0150° N, 105.2705° W'
    AQI_Boulder = '126'
    return render_template('index.html', title='Air Quality in Colorado', coordinates=coordinates_Boulder, AQI=AQI_Boulder)

Instead of having a static website, it'd be nice to continuously monitor the air quality, right? Here's a version that reads the sensor through the Pi's GPIO each time the page is requested:

import RPi.GPIO as GPIO
from flask import Flask, render_template

app = Flask(__name__)

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
sen = 16

# Set the air quality sensor pin as an input
GPIO.setup(sen, GPIO.IN)

@app.route("/")
def index():
    # Read the sensor's status on every page request
    senStats = GPIO.input(sen)
    templateData = {
        'title' : 'Air Quality Index',
        'AQI' : senStats  # Key matches the {{ AQI }} placeholder in index.html
    }
    return render_template('index.html', **templateData)

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=80, debug=True)

Flask allows us to read in data through Python and display it on our webpage.

The opportunities are endless with this. You can connect SQLAlchemy to add a database abstraction layer, customize your webpage with CSS, or push it to the cloud on Google Cloud, Heroku or AWS. Let us know what kind of web dev you want to see more of, whether it's design and user experience, cloud technologies or database integration!
