Tag Archives: raspberrypi

The $3 Arduino

via Wolf Paulus » Embedded

Buying and using an official Arduino Board like the standard Arduino Uno is the perfect way to get started with the Arduino language, common electronic components, or your own embedded prototyping project. However, once you have mastered the initial challenges and have built some projects, the Arduino Board can get in the way.

For instance, many enthusiasts who started out hacking and extending the Arduino hardware and software have now moved on to the Raspberry Pi, which is equally or even less expensive, but more powerful and complete. The Raspberry Pi comes with Ethernet, USB, HDMI, analog audio out, a lot of memory, and much more; adding all these capabilities to the Arduino with Arduino shields would probably cost several hundred dollars. However, the Raspberry Pi lacks some of the Arduino’s I/O capabilities, like analog inputs.
Combining an Arduino with a Raspberry Pi may make sense for a lot of projects; but we don’t need or want to integrate an entire Arduino board – the Arduino’s core, the ATmega microcontroller, is all that’s needed.

We still want to program the chip with the familiar Arduino IDE, but the bootloader doesn’t really help once the ATmega micro is programmed and ready to communicate with the Raspi or any other platform.

This is probably the most minimal ATmega168-20PU based Arduino you can come up with. The ATmega168 (available for about $3) was the default Arduino chip for quite some time, before being replaced by the ATmega328, which doubled the available memory. The chip is powered with +5V on pin 7 and grounded via pin 8; the LED sits between pins 19 and 22.

Here you see it processing this rather simple blinky program:

int p = 13;                // LED connected to digital pin 13
void setup() {
  pinMode(p, OUTPUT);      // sets the digital pin as output
}
  
void loop() {
  digitalWrite(p, HIGH);   // sets the LED on
  delay(100);              // .. for 10th of a sec 
  digitalWrite(p, LOW);    // sets the LED off again
  delay(1000);             //  waits for a second
  digitalWrite(p, HIGH);   // sets the LED on
  delay(500);              // .. for 1/2 a sec 
  digitalWrite(p, LOW);    // sets the LED off again
  delay(500);              // .. for 1/2 a second
}
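The sketch drives Arduino digital pin 13, which on the DIP-28 ATmega168 is port pin PB5, physical pin 19 – which is why the LED sits between pin 19 and pin 22 (GND). A small sketch of that mapping for port B (my own helper for illustration, not part of the Arduino core):

```python
# Arduino digital pins D8..D13 live on port B of the ATmega168;
# on the DIP-28 package, PB0..PB5 are physical pins 14..19.
def digital_pin_to_dip28(d: int) -> int:
    """Map an Arduino digital pin number (port B only) to its DIP-28 leg."""
    if not 8 <= d <= 13:
        raise ValueError("only port B pins (D8-D13) are handled here")
    return 14 + (d - 8)
```

So digital_pin_to_dip28(13) gives 19, matching both the LED placement and the SCK wiring in the ISP table further down.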

Since we don’t want any bootloader (waiting for a serial upload of new software) on this chip but rather have it immediately start executing the program, we need to tell the Arduino IDE about our bare-bones board and the fact that it doesn’t have a boot loader.

Bare-Bones ATmega168 Board Definition

To add a new hardware definition to the Arduino IDE, create a hardware/BareBones folder inside your Arduino folder (the place where all your sketches are stored). Create a boards.txt file with the following content.
On the Mac for instance, I ended up with this:
/Users/wolf/Documents/Arduino/hardware/BareBones/boards.txt

minimal168.name=ATmega168 bare bone (internal 8 MHz clock)
minimal168.upload.speed=115200
minimal168.bootloader.low_fuses=0xE2
minimal168.bootloader.high_fuses=0xDD
minimal168.bootloader.extended_fuses=0x00
minimal168.upload.maximum_size=16384
minimal168.build.mcu=atmega168
minimal168.build.f_cpu=8000000L
minimal168.build.core=arduino:arduino
minimal168.build.variant=arduino:standard

Wiring up the ATmega-168

To flash the chip, we use the SPI (MOSI/MISO/SCK) pins, as shown here:

Connections from an AT-AVR-ISP2 to the ATmega-168 are as follows:

  1. GND -> ATmega168-Pin 8
  2. +5V -> ATmega168-Pin 7
  3. MISO -> ATmega168-Pin 18
  4. SCK -> ATmega168-Pin 19
  5. RESET -> ATmega168-Pin 1
  6. MOSI -> ATmega168-Pin 17

Burning without a Boot-Loader

Instead of an Arduino Board, I connected one of those shiny blue AT-AVR-ISP2 programmers from ATMEL (available for about $30) to the Mac. In the Arduino IDE’s Tools menu, under Boards, I selected ‘ATmega168 bare bone (internal 8 MHz clock)’, and under Programmers, ‘AVRISP mkII’.
Hitting ‘Upload’ will now use the AT-AVR-ISP2 to reset the chip, flash the program, and verify the new memory content. All in all, it takes about 75 seconds.
Once the chip is powered, it will immediately start executing the program.

Switching the Internal Clock to 8MHz

Using the Fuse Calculator, we can find the proper ATmega168 fuse settings to use the internal RC oscillator and set it to 8 MHz.
The avrdude arguments look something like this: -U lfuse:w:0xe2:m -U hfuse:w:0xdf:m -U efuse:w:0x00:m.
Avrdude is one of the tools that the Arduino IDE deploys on your computer. You can either execute Avrdude with those arguments directly, like so:

avrdude -p m168 -b 115200 -P usb -c avrispmkII -V -e -U lfuse:w:0xe2:m -U hfuse:w:0xdf:m -U efuse:w:0x00:m

or just execute the ‘Burn Bootloader’ command in the Arduino IDE’s Tools menu.
While this will NOT burn a bootloader on the ATmega168 chip, it will set the fuses appropriately. Either way, this step needs to be performed only once.
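For the curious, the low fuse byte can be decoded by hand. Per the ATmega168 datasheet, fuse bits are active-low (a 0 bit means “programmed”): bit 7 is CKDIV8, bit 6 is CKOUT, bits 5:4 are SUT, and bits 3:0 are CKSEL. A quick sketch (assuming that bit layout) confirming that 0xE2 selects the calibrated internal RC oscillator without the divide-by-8 prescaler:

```python
def decode_lfuse(lfuse: int) -> dict:
    """Decode an ATmega168 low fuse byte (a 0 bit means 'programmed')."""
    return {
        "ckdiv8": not (lfuse & 0x80),  # divide system clock by 8 when programmed
        "ckout":  not (lfuse & 0x40),  # output system clock on a pin when programmed
        "sut":    (lfuse >> 4) & 0x3,  # start-up time selection
        "cksel":  lfuse & 0xF,         # clock source selection
    }

fuse = decode_lfuse(0xE2)
# CKSEL=0b0010 selects the calibrated internal RC oscillator (8 MHz);
# CKDIV8 unprogrammed means the clock is not divided down to 1 MHz.
```

With the factory default low fuse of 0x62, CKDIV8 is programmed and the very same oscillator runs the chip at only 1 MHz.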

ATtiny85

Let’s minimize the ‘Minimal Arduino’ even more, for instance by using the tiny ATtiny85 Microcontroller.
Read more: http://wolfpaulus.com/journal/embedded/avrmc

Streaming Your Webcam w/ Raspberry Pi

via Wolf Paulus » Embedded

[Last updated on Feb. 2. 2013 for (2012-12-16-wheezy-raspbian) Kernel Version 3.2.27+]

Three years ago, we bought two small webcams, and since we wanted to use them on Linux and OS X, we went with the UVC and Mac compatible Creative LIVE! CAM Video IM Ultra. This webcam (Model VF0415) has a high-resolution sensor that lets you take 5.0-megapixel pictures and record videos at up to 1.3 megapixels; supported resolutions include 640×480, 1280×720, and 1280×960. If you like, you can go back and read what I was thinking about the IM Ultra back in 2009. Today, it isn’t used much anymore, but it may be just the right accessory for a Raspberry Pi.

With the USB Camera attached to the Raspi, lsusb returns something like this:

lsusb

Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 002: ID 0424:9512 Standard Microsystems Corp.
Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp.
Bus 001 Device 004: ID 7392:7811 Edimax Technology Co., Ltd EW-7811Un 802.11n Wireless Adapter [Realtek RTL8188CUS]
Bus 001 Device 005: ID 041e:4071 Creative Technology, Ltd

Using the current Raspbian “wheezy” distribution (Kernel 3.2.27+), one can find the following related packages, ready for deployment:

  • luvcview, a camera viewer for UVC based webcams, which includes an mjpeg decoder and is able to save the video stream as an AVI file.
  • uvccapture, which can capture an image (JPEG) from a USB webcam at a specified interval

While these might be great tools, mjpg-streamer looks like a more complete, one-stop-shop kind of solution.

Get the mjpg-streamer source code

Either install Subversion (svn) on the Raspberry Pi, or use svn installed on your Mac or PC to get the source code before using Secure Copy (scp) to copy it over to your Raspi.

Here, I’m using svn, which is already installed on the Mac, before copying the files over to my Raspi (username pi, hostname phobos):

cd ~
mkdir tmp
cd tmp
svn co https://mjpg-streamer.svn.sourceforge.net/svnroot/mjpg-streamer mjpg-streamer
scp -r ./mjpg-streamer pi@phobos:mjpg-streamer

Please note: It looks like the repository was recently moved. Try this to check out the code, if the previous step does not work:

svn co https://svn.code.sf.net/p/mjpg-streamer/code/mjpg-streamer/ mjpg-streamer

Over on the Raspi, I tried to make the project, but quickly ran into error messages, hinting at a missing library.

ssh pi@phobos
cd mjpg-streamer/mjpg-streamer
make
...
jpeg_utils.c:27:21: fatal error: jpeglib.h: No such file or directory, compilation terminated.
make[1]: *** [jpeg_utils.lo] Error 1

After finding out which libraries were available (apt-cache search libjpeg), I installed libjpeg8-dev like so: sudo apt-get install libjpeg8-dev. This time, I got a lot further before hitting the next build error:


make
...
make[1]: *** [pictures/640x480_1.jpg] Error 127
make[1]: Leaving directory `/home/pi/mjpg-streamer/mjpg-streamer/plugins/input_testpicture'

After some googling, which resulted in installing ImageMagick like so: sudo apt-get install imagemagick, the next build attempt looked much more promising:

make
..

and ls -lt shows the newly built files on top:

-rwxr-xr-x 1 pi pi 13909 Sep 8 07:51 input_file.so
-rwxr-xr-x 1 pi pi 168454 Sep 8 07:51 input_testpicture.so
-rwxr-xr-x 1 pi pi 31840 Sep 8 07:50 output_http.so
-rwxr-xr-x 1 pi pi 14196 Sep 8 07:50 output_udp.so
-rwxr-xr-x 1 pi pi 19747 Sep 8 07:50 output_file.so
-rwxr-xr-x 1 pi pi 29729 Sep 8 07:50 input_uvc.so
-rwxr-xr-x 1 pi pi 15287 Sep 8 07:50 mjpg_streamer
-rw-r--r-- 1 pi pi 1764 Sep 8 07:50 utils.o
-rw-r--r-- 1 pi pi 9904 Sep 8 07:50 mjpg_streamer.o

MJPG-streamer

MJPG-streamer is a command line tool to stream JPEG files over an IP-based network. It relies on input and output plugins: an input plugin copies JPEG images to a globally accessible memory location, while an output plugin like output_http.so processes the images, e.g. serving a single JPEG file (provided by the input plugin), or streaming them according to the M-JPEG standard.

Therefore, the important files that were built in the previous step are:

  • mjpg_streamer – command line tool that copies JPGs from a single input plugin to one or more output plugins.
  • input_uvc.so – captures JPEG frames from a connected webcam; it can stream images up to 960×720 pixels at a high frame rate (>= 15 fps) with little CPU load.
  • output_http.so – HTTP 1.0 webserver, serves a single JPEG file of the input plugin, or streams them according to M-JPEG standard.

Starting the Webcam Server

A simple launch command would look like this:
./mjpg_streamer -i "./input_uvc.so" -o "./output_http.so -w ./www"

MJPG Streamer Version: svn rev:
i: Using V4L2 device.: /dev/video0
i: Desired Resolution: 640 x 480
i: Frames Per Second.: 5
i: Format............: MJPEG
o: HTTP TCP port.....: 8080
o: username:password.: disabled
o: commands..........: enabled

Open a web browser on another computer on the LAN and open this URL: http://{name or IP-address of the Raspi}:8080

However, experimenting with the resolution and frame rate parameters is well worth it and can improve the outcome.

UVC Webcam Grabber Parameters

The following parameters can be passed to this plugin:

-d video device to open (your camera)
-r resolution of the video device,
   can be one of the following strings:
   QSIF QCIF CGA QVGA CIF VGA SVGA XGA SXGA
   or a custom value like: 640x480
-f frames per second
-y enable YUYV format and disable MJPEG mode
-q JPEG compression quality in percent
   (activates YUYV format, disables MJPEG)
-m drop frames smaller than this limit, useful
   if the webcam produces small-sized garbage frames
   (may happen under low light conditions)
-n do not initialize dynctrls of Linux-UVC driver
-l switch the LED “on”, “off”, let it “blink” or leave
   it up to the driver using the value “auto”
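For reference, the named modes accepted by -r map to conventional frame sizes. The table below is my assumption based on the standard video-mode definitions (the plugin’s own source is the authority if a value looks off), together with a tiny parser mirroring how such an argument could be resolved:

```python
# Conventional meanings of the named resolutions (assumed from the
# standard video-mode definitions; check the plugin source to be sure).
RESOLUTIONS = {
    "QSIF": (160, 120),
    "QCIF": (176, 144),
    "CGA":  (320, 200),
    "QVGA": (320, 240),
    "CIF":  (352, 288),
    "VGA":  (640, 480),
    "SVGA": (800, 600),
    "XGA":  (1024, 768),
    "SXGA": (1280, 1024),
}

def parse_resolution(arg: str) -> tuple:
    """Resolve a -r argument: either a named mode or 'WIDTHxHEIGHT'."""
    if arg.upper() in RESOLUTIONS:
        return RESOLUTIONS[arg.upper()]
    w, h = arg.lower().split("x")
    return (int(w), int(h))
```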

HTTP Output Parameters

The following parameters can be passed to this plugin:

-w folder that contains webpages in flat hierarchy (no subfolders)
-p TCP port for this HTTP server
-c ask for “username:password” on connect
-n disable execution of commands

I have seen some good results with this:
./mjpg_streamer -i "./input_uvc.so -n -f 15 -r 640x480" -o "./output_http.so -n -w ./www"
but even a much higher resolution didn’t impact the actually observed frame rate all that much:
./mjpg_streamer -i "./input_uvc.so -n -f 15 -r 1280x960" -o "./output_http.so -n -w ./www"

MJPG Streamer Version: svn rev:
i: Using V4L2 device.: /dev/video0
i: Desired Resolution: 1280 x 960
i: Frames Per Second.: 15
i: Format............: MJPEG
o: www-folder-path...: ./www/
o: HTTP TCP port.....: 8080
o: username:password.: disabled
o: commands..........: disabled

Webcam Stream Clients

The included website (http://{name or IP-address of the Raspi}:8080) shows examples of how to connect a client to the webcam stream. The easiest way is obviously a simple HTML page, which works great with Google Chrome and Firefox, but not so much with Safari. Anyway, it’s important to specify in the HTML the same width and height that were configured with output_http.so:


  <img alt="" src="http://phobos:8080/?action=stream" width="1280" height="960" />
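Under the hood, the stream URL serves a multipart/x-mixed-replace response in which each part is a complete JPEG. As a rough sketch of what a client does with those bytes (the boundary name and helper below are made up for illustration; a real client reads the boundary from the Content-Type header), one can split the stream on the boundary and keep the JPEG payloads:

```python
def iter_mjpeg_frames(stream: bytes, boundary: bytes = b"--boundary"):
    """Yield JPEG payloads from a multipart/x-mixed-replace byte stream;
    each part's headers are separated from its body by a blank line (CRLF CRLF)."""
    for part in stream.split(boundary)[1:]:
        head, sep, body = part.partition(b"\r\n\r\n")
        if sep and body.startswith(b"\xff\xd8"):  # JPEG start-of-image marker
            yield body.rstrip(b"\r\n")
```

If you only need a still image rather than a stream, output_http.so also answers ?action=snapshot with a single JPEG.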

Raspberry Pi Webcam Streamer

Taking the Raspberry Pi Web Stream Server Outside

This is the Raspberry Pi powered by a 5VDC, 700mA battery, with an (Edimax EW-7811Un) USB-WiFi Adapter and the Creative LIVE! CAM Video IM Ultra connected.

VideoLAN Client for Viewing and Recording

Using VLC (VideoLAN Client), you can view and also record the video stream served by the Raspi.

Recorded Webcam Streamer

Movie, streamed from a Raspberry Pi

Raspberry Pi Webcam from Tech Casita Productions on Vimeo.

Let me know what Webcam software you found that works well on the Raspberry Pi.

Raspberry Pi, Battle of the Enclosures

via Wolf Paulus » Embedded

Overturned Basket with Raspberries and White Currants, 1882
By Eloise Harriet Stannard (1829 – 1915)

Eventually, you will start looking for an enclosure for the Raspberry Pi. Even during the early hardware development phase of your project, you can put the Raspberry Pi into an enclosure, given that the mainboard doesn’t have any on/off switches and that a Cobbler Breakout Kit provides easy access to the Raspi’s GPIO Pins (on a neighboring solderless breadboard).
Unlike for many other popular embedded development platforms, there are already many enclosures for the Raspberry Pi to choose from; many of them are listed over here at elinux.org.

We have bought two Adafruit Pi Boxes and two Raspberry Pi Cases from Barch Designs.

Adafruit Pi Box

  • Crystal-clear Acrylic (6 pieces)
  • Engraved Labels on all Connector Slots
  • Long Slot to connect a 26-pin IDC cable (e.g. Cobbler Breakout Kit)
  • No additional vents or cooling required
  • $14.95

Evaluation

The case has no screws or standoffs, and the little feet have to be squeezed to make the pieces snap together.
It’s a very elegant design; however (probably accelerated by the Raspberry Pi’s heat emission), after a few days of use the acrylic became extremely brittle and started to show cracks around the cutouts. One of the feet broke off while we were trying to open the enclosure, rendering the case useless (all feet are needed to snap the enclosure parts together again).
Despite operating extremely carefully, the same happened to the second case only a few days later. Kudos to Adafruit though: once we mentioned our experience with the enclosure, a refund was issued.

While this could have been a temporary issue with the acrylic used for our cases, we would not recommend the enclosure for longer use, or if you need to open and close the enclosure at all, even rarely.

Raspberry Pi Case by Barch Designs

  • CNC Machined from Billet Aluminum
  • Customizable Engraving
  • Long Slot to connect a 26-pin IDC cable (e.g. Cobbler Breakout Kit)
  • Acts as a Heat Sink
  • LED Fiber Optics
  • $69.99 (incl. shipping)

Evaluation

This enclosure is precisely milled from solid 6061-T6 aircraft grade billet aluminum in the USA. Fiber optic cables are manufactured into the case, each positioned directly above an LED.
The whole enclosure acts as a heat sink; a small package of thermal paste is even included.
While the price is about four times that of the acrylic enclosure, if you need an enclosure that lasts, you may want to consider this one. It is the Mercedes among the Pi cases, but money well spent.

Raspberry Pi / Case by Barch Designs / with EW-7811Un USB Wireless Adapter

Accessing Raspberry Pi via Serial

via Wolf Paulus » Embedded

Using a serial connection to connect to a Raspberry Pi has many advantages. The boot process (kernel boot messages go to the UART at 115,200 bit/s) can be monitored without the need to hook up an HDMI monitor. Once booted, you can of course log in through a serial terminal as well, i.e. the serial connection allows logging in from a remote computer without running an SSH daemon on the Raspi.

UART TXD and RXD pins are easily accessible (GPIO 14 and 15), however, like for all GPIO pins, the voltage levels are 3.3 V and are not 5 V tolerant!
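Because the pins are not 5V tolerant, a 5V TTL TX line must be dropped before it reaches the Pi’s RXD. For a TTL-level signal (not true RS-232 ±12V swings, which need a proper transceiver such as the MAX3232 used below), the simplest approach is a two-resistor divider; a sketch of the arithmetic, with assumed example values:

```python
def divider_out(vin: float, r1: float, r2: float) -> float:
    """Voltage at the junction of a resistive divider: Vout = Vin * R2 / (R1 + R2).
    R1 sits between the 5V TX line and the Pi's RXD pin, R2 from RXD to GND."""
    return vin * r2 / (r1 + r2)

# Example (assumed values): 1.8k over 3.3k drops a 5V TTL signal to about 3.2V,
# safely below the Pi's 3.3V limit.
vout = divider_out(5.0, 1.8e3, 3.3e3)
```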

Since most desktop and laptop computers don’t come equipped with a serial port anymore, accessing the Raspberry Pi via a serial connection requires some extra hardware. I have recently connected to the Raspberry Pi using three different hardware setups:

1. USB to Serial Adapter

There are many USB-to-serial adapters available, and while not all of them are capable of handling the highest data transfer speeds, the Keyspan HS-19 (OS X drivers are available here) is certainly one of the best.
However, adapters based on the Prolific 2303 chip, like the Zonet ZUC3100, seem to be a little less expensive, well supported in Linux, and much more widespread. Drivers for 2303-based devices can be found here; if required, use GUEST as user name and password to gain access.
E.g. I’m currently using the Mac OS X Universal Binary Driver v1.4.0 on OS X 10.8.1 without any issues.

1.1. Level Shifter

Very few of those USB-to-Serial adapters have the standard RS-232 +/- 12V voltage levels on the serial ports (the Zonet ZUC3100 w/ the pl2303 chip however does!) and using a level shifter is certainly a very good idea. Since the Raspi wants no more than 3.3V, TTL-RS-232 level converters based on the Maxim MAX3232 are your best choice.

This photo shows the blue Zonet ZUC3100 Usb-to-Serial adapter, with a Maxim MAX3232 based level shifter. Since the level shifter needs to be powered, the Raspi’s 3.3V pin (red) and Ground (black) are connected to the Level-Shifter. Yellow and Orange are used for the Transmit and Receive lines.

On OS X, a simple Terminal with the screen /dev/tty.usbserial 115200 command issued is all that is needed to connect to the Raspberry Pi. A more dedicated application like CoolTerm may come in handy as well.

2. FTDI Basic 3.3V – USB to Serial.

I have a basic breakout board for the FTDI FT232RL USB-to-serial IC, which is mainly used to program those Arduino boards that don’t have a USB connector. It can of course also be used for general serial applications. The big benefit here is that the FTDI Basic 3.3V already provides the 3.3V levels that the Raspi requires. The Virtual COM Port drivers (VCP drivers) for the host computer are available here.
Since the FTDI Basic doesn’t need to be powered, only the TXD and RXD pins need to be connected.

This photo shows the FTDI Basic 3.3V Usb-to-Serial adapter, with only two (TXD and RXD) pins connected to the Raspberry Pi. Again, the FTDI Basic is powered through the USB connection coming from your host PC or Laptop. Still, the Raspberry Pi needs to be powered through its micro-usb port.

3. USB-to-Serial Cable, 3.3V I/O

If you look hard and long enough, you will find a USB-to-serial cable with six female header wires and 3.3V I/O, like this one over here at Micro Controller Shop. Adafruit has one as well, here.

Cables like these are the easiest way ever to connect to the Raspberry Pi’s serial console port, since they can also power the Raspi.

The cable (which uses an FTDI FT232RQ) is a USB-to-serial (3.3V TTL level) converter that provides a simple way to connect 3.3V TTL interface devices to USB.

The 3.3V TTL signals are color coded as follows:

  • Black – GND
  • Brown – CTS
  • Red – +5V DC supply from USB
  • Orange – TXD
  • Yellow – RXD
  • Green – RTS

This photo shows the Micro Controller Shop’s FTDI based 3.3V Usb-to-Serial adapter cable, powering the Raspberry Pi, as well as connecting to its TXD and RXD pins.

Open Source Hardware Camp 2012

via OSHUG

Open Source Hardware Camp 2012 will take place in the north of England in the Pennine town of Hebden Bridge. Building on the success of last year's OSHCamp, it will be a weekend-long event with ten talks on the Saturday and four parallel workshops on the Sunday.

Hebden Bridge is approximately 1 hour by rail from Leeds and Manchester. Budget accommodation is available at the Hebden Bridge Hostel which adjoins the venue, with private rooms available and discounts for group bookings. Details of other local accommodation can be found at www.hebdenbridge.co.uk.

There will be a social event on the Saturday evening from 8PM, and those interested in pre-event drinks on the Friday should join the discussion list.

Practical Experiences with the Google Android Accessory Development Kit (ADK)

The ADK is an exciting development platform that makes it possible to easily combine Android applications with custom hardware built around Arduino. Such combinations have the best of both worlds by enabling the creation of a mobile phone application with access to peripheral devices that is only limited by your imagination.

This talk will cover two projects that extend what the phone can do by integrating both input and output devices. It will also cover some of the dos and don'ts of using the ADK and associated IDEs. If time permits, there will also be a demonstration with a quick run through of the code.

Paul Tanner is a consultant, developer and maker in wood, metal, plastic, electronics and software. His day job is IT-based business improvement for SMEs. By night he turns energy nut, creating tools to optimise energy use. Paul graduated in electronics and was responsible for hardware and software product development and customer services in several product and service start-ups, switching to consulting in 2000.

If you can't wait to get your hands on the ADK software browse to http://developer.android.com/tools/adk.

The Internet of Things and Arduino

As connecting hardware to the network becomes cheaper and cheaper we're seeing the rise of what is being called the Internet of Things, or “IoT” for short.

This talk will give an introduction to the Internet of Things and explain how open hardware platforms such as Arduino are helping it grow, with plenty of examples of IoT projects, from using sensors to map global radiation levels to bakeries that tweet when the bread is fresh out of the oven.

Adrian McEwen has been connecting odd things to the Internet since the mid-90s. Starting with cash registers, and then as part of the team who were first to put a web browser onto a mobile phone. As the mobile phone and set-top box work became more mainstream he dropped down a level to Arduino which led to Internet-enabled bubble machines and chicken-food silos...

Adrian has been working with Arduino since 2008 — which is when Bubblino, the aforementioned bubble machine which watches twitter, was created — and is in charge of the Arduino Ethernet library. He is based in Liverpool, where he runs MCQN Ltd, a company that builds IoT devices and products.

Developing Linux on Embedded Devices

This talk will provide an introduction to developing Linux on embedded devices. First we will look at the capabilities of popular boards such as the BeagleBone and the Raspberry Pi. Then, using the example of a BeagleBone controller for a 3D printer, the talk will explain how to develop for an embedded device. It will consider what comprises an embedded Linux software stack, discussing boot loaders, kernels and root filesystems, and what minimum software packages are required in a root file system. The talk will then go on to consider the tools available to help the embedded developer and speed up the development process. Once you have developed your software you need to debug it, so the talk will also look at what debugging tools are available for embedded devices.

Melanie Rhianna Lewis started a lifelong love of electronics as a child when her Dad helped her make a "crystal" radio with an ear piece, a coil of wire, a diode and a radiator! At the same time, the home computer revolution started and she would lust after the "build your own" computers advertised in the electronics magazines of the time. She never got one, but did end up the proud owner of a BBC Micro. Melanie learnt everything she could about the machine, including assembler, operating systems, drivers, interrupts, and, thanks to the circuit diagram in the Advanced User Guide, digital electronics. After the BBC Micro came the Acorn Archimedes, and so started a long relationship with ARM processors. In the 90s Melanie became interested in Linux and then developed one of the first ARM Linux distributions, running on an Acorn RISC PC. The hobby became a job, and Melanie currently works for an embedded device consultancy near Bradford, where a lot of her work is still with ARM processors.

Interfacing the Raspberry Pi to the World — Everything you need to know about P1

You've received your Pi, set up a web server on it and maybe played a few rounds of Quake. You're looking for a new challenge, and suddenly the header on the corner of the board catches your eye. A quick Google search for "P1 Raspberry Pi" gets you to the eLinux wiki page on Low level peripherals, and you suddenly realise that you can do all sorts of fun stuff by adding extra bits to your Raspberry Pi using this magical expansion port. Where do you start? Is it safe to connect a motor directly to the pins? What sort of interesting components are out there?

In this talk we will look at the ways we can communicate with the outside world using the GPIO pins on the Raspberry Pi. We will explore the mechanical, electrical and software side of things and talk about a few example projects you can try at home; hardware limitations will also be covered and workarounds provided.

Omer Kilic is theoretically still a research student at the University of Kent, although he intends to submit his thesis (which is about a reconfigurable heterogeneous computing framework) pretty soon. He likes tiny computers, things that 'just work' and beer. He currently works for Erlang Solutions in London, exploring the use of Erlang programming language in the Embedded Systems domain and develops tools and support material to help the adoption of this technology.

This talk will also serve as an introduction for the Raspberry Pi workshop on the Sunday, where we will explore the example projects covered in more detail.

Sensing Wearable Technology

An introduction to wearable technology that will include examples which incorporate sensors, plus work which makes use of the LilyPad Arduino, an open source, sewable microcontroller.

Rain Ashford creates wearable technology & electronic art; her most recent work involves investigating physiological sensing technologies and how they can be applied to wearable artworks to measure and interpret moods, health and lifestyle data. Rain also creates fun, interactive and aesthetically pleasing works that include gaming and musical elements. She is keen to demonstrate that electronics, components and circuitry don't have to be regarded as cold, boring, hard and boxy, and instead can be fun, colourful and elegant, plus be integrated into the overall design of a work.

Rain’s background is in developing online activities for the BBC, as a Senior Producer at BBC Learning and also as Technologist at BBC R&D, co-running BBC Backstage. She currently works as a freelance consultant for the Open University and for Technocamps, designing and leading workshops in coding and electronics in the form of wearable technology for 11-19 year-olds, and is also a PhD researcher looking into wearable electronics & art.

Running OpenBTS in the Real World

This talk will explore the OpenBTS project and describe how it uses software-defined radio and open source Internet telephony to create a small but complete GSM mobile phone network.

Experiences of operating OpenBTS installations on the Pacific island of Niue and at the Burning Man festival in the Nevada desert will be covered, along with how OpenBTS has been integrated with other systems for use in disaster relief. Licensing permitting there will also be a live demonstration.

Tim Panton is a software engineer with a particular interest in projects that blend web applications and person-to-person speech into an integrated user experience. He has many years’ hands-on experience with the OpenBTS project, working closely with the core development team on numerous installations.

Tim is currently working on the Phono.com, Tropo.com and Rayo.org products at VoxeoLabs, producing web developer-friendly APIs by using XMPP protocols to drive innovative telephony applications that can be used anywhere by anyone.

Developing a Heavy Lift UAV — Pitfalls, Problems and Opportunities

Unmanned aerial vehicles (UAVs) are suitable for replacing dull, dirty and dangerous airborne tasks. The next developments in UAV use are in heavy lift and vertical take-off and landing (VTOL). The ability to place a useful load in a geographic location of choice is becoming pressing in many applications. The problem is that helicopters are excellent heavy lift machines but are limited by range and payload, while aeroplanes don’t provide VTOL unless heavy engines and complex gearboxes are utilised.

The development of the conventional take-off and landing (CTOL) UAV is the beginning of a utilitarian UAV which is modular and low cost. The future will involve VTOL and higher payloads (Euro-pallet sized). This presentation will show a path of development from CTOL, through to VTOL Olecopter and ultimately a heavy lift (pallet container) UAV.

Edward Strickland is a Chartered Engineer with a background in aerospace and a degree in Aeronautical Engineering. He was the project manager for the Empire Test Pilot School, has lived and worked in Tanzania as a VSO volunteer, and has produced a CTOL airframe for the OpenRelief project which has been designed so that it can be constructed in developing countries using local resources.

The 3D Printed Revolution

Over recent years Open Source 3D printers have quickly developed alongside their commercial counterparts offering affordable and accessible alternatives. This talk will cover experiences using commercial printers and how the speaker's interests have moved to open source designs and how the two compare. Examples will be shown of projects using these technologies, such as "Fable", a clock manufactured by Selective Laser Sintering, and a wrist watch designed to be printed on a RepRap. There will also be a run through of the design considerations and how files were created, fixed and sliced in preparation to print on a RepRap.

Mark Gilbert graduated in 2000 from Sheffield Hallam University with a degree in Industrial Design Innovation. After several years working as a design engineer, Mark started working as a freelance industrial designer for several companies in the Northwest. Over the last 6 years he has also worked closely with the Bolton Science and Technology Centre as the "Designer in Residence" where he has developed workshops around the centre's 3D printing and CAD facilities.

In 2008 Mark set up the design studio Gilbert13 with his wife Angela, where they design and develop products inspired by experimentation with digital manufacturing processes, 3D printing and additive manufacturing. Recent projects have taken their experience from rapid prototyping to using 3D printing as a manufacturing tool that can change the way people design, co-create and distribute objects.

The Bots are Coming

In the last two decades we have seen software and data change the fabric of economics, and the advent of personal computing and the Internet enable many new business models. However, the next two decades will be even more radical as that wave of innovation shifts from the virtual domain to a physical manifestation. Atoms are the new bits, and the open sourcing and democratisation of bot technology is allowing us to enter an era of personal production. This talk will explore how 3D printing and additive manufacturing are revolutionising production as we know it.

Alan Wood originally trained in systems engineering, got lost in software engineering and open source for a decade, before returning back to his hardware roots via the open source hardware and makers movement that has gathered momentum over the last few years.

DIYBIO - The Next Frontier

DIYBIOMCR is an public group based at MadLab dedicated to making biology an accessible pursuit for citizen scientists, amateur biologists and biological engineers who value openness and safety. This talk will give an overview of the movement, and what is going on at MadLab involving not only biology but also diverse fields such as hardware-hackers, artists, journalists and the open-source movement.

Hwa Young Jung is a co-founder and a director of MadLab, a community centre for creative, tech and science based the Manchester. Over 50 user groups meet once a month, including DIYBIOMCR, initially a joint funded project with MMU and the Wellcome Trust.

Sunday Workshops

Workshops will be reasonably informal and shaped by the participants, and details are subject to change depending upon the level of interest expressed.

Please feel free to bring along equipment and components provided that you are able to take full responsibility for your own personal safety and that of others. Common sense should be exercised!

Practical IoT Applications with the Google ADK and Arduino

Hands on IoT building sessions that follow on from Saturday's ADK and Arduino talks.

Run by: Paul Tanner & Adrian McEwen.

Bring an Arduino with Ethernet and/or a Google ADK if you have one, along with sensors, LEDs and displays etc.

Interfacing the Raspberry Pi to the World

Here you will learn how to connect a selection of devices to your Raspberry Pi utilising the methods discussed during Saturday's talk.

Run by: Omer Kilic & Melanie Rhianna Lewis.

We will have a few Raspberry Pi boards available for the workshop but please bring your own if you were one of the lucky ones to have received one, along with breadboard and any useful components if you have these.

Building GSM Networks with Open Source

A look at the practical steps involved in creating a low power GSM network using open source technology.

Run by: Tim Panton & Andrew Back.

Note: this workshop will be subject to a spectrum licence being granted.

Practical 3D Printing

In this workshop we will work with simple models that will be printed out using a RepRap.

Run by: Alan Wood, Mark Gilbert & Mike Beardmore.

Note:

  • Please aim to arrive for 09:00 on the Saturday as the event will start at 09:30 prompt.
  • A light lunch and refreshments will be provided on the Saturday. Please ensure that you make any dietary requirements clear when registering.

Sponsored by:

OSHCamp kit bags provided by:

Open Source Hardware Camp 2012

via OSHUG

Open Source Hardware Camp 2012 will take place in the north of England in the Pennine town of Hebden Bridge. Building on the success of last year's OSHCamp, it will be a weekend long event with ten talks on the Saturday and four parallel workshops on the Sunday.

Hebden Bridge is approximately 1 hour by rail from Leeds and Manchester. Budget accommodation is available at the Hebden Bridge Hostel which adjoins the venue, with private rooms available and discounts for group bookings. Details of other local accommodation can be found at www.hebdenbridge.co.uk.

There will be a social event on the Saturday evening from 8PM, and those interested in pre-event drinks on the Friday should join the discussion list.

Practical Experiences with the Google Android Accessory Development Kit (ADK)

The ADK is an exciting development platform that makes it possible to easily combine Android applications with custom hardware built around Arduino. Such combinations offer the best of both worlds, enabling the creation of a mobile phone application with access to peripheral devices that is limited only by your imagination.

This talk will cover two projects that extend what the phone can do by integrating both input and output devices, and will run through some of the dos and don'ts of using the ADK and associated IDEs. If time permits there will also be a demonstration with a quick run through of the code.

Paul Tanner is a consultant, developer and maker in wood, metal, plastic, electronics and software. His day job is IT-based business improvement for SMEs. By night he turns energy nut, creating tools to optimise energy use. Paul graduated in electronics and was responsible for hardware and software product development and customer services in several product and service start-ups, switching to consulting in 2000.

If you can't wait to get your hands on the ADK software browse to http://developer.android.com/tools/adk.

The Internet of Things and Arduino

As connecting hardware to the network becomes cheaper and cheaper we're seeing the rise of what is being called the Internet of Things, or “IoT” for short.

This talk will give an introduction to the Internet of Things and explain how open hardware platforms such as Arduino are helping it grow, with plenty of examples of IoT projects, from using sensors to map global radiation levels to bakeries that tweet when the bread is fresh out of the oven.

Adrian McEwen has been connecting odd things to the Internet since the mid-90s. Starting with cash registers, and then as part of the team who were first to put a web browser onto a mobile phone. As the mobile phone and set-top box work became more mainstream he dropped down a level to Arduino which led to Internet-enabled bubble machines and chicken-food silos...

Adrian has been working with Arduino since 2008 — which is when Bubblino, the aforementioned bubble machine which watches twitter, was created — and is in charge of the Arduino Ethernet library. He is based in Liverpool, where he runs MCQN Ltd, a company that builds IoT devices and products.

Developing Linux on Embedded Devices

This talk will provide an introduction to developing Linux on embedded devices. Firstly we will look at the capabilities of popular boards such as the BeagleBone and the Raspberry Pi. Then, using the example of a BeagleBone controller for a 3D printer, the talk will explain how to develop for an embedded device. It will consider what comprises an embedded Linux software stack, discussing boot loaders, kernels and root filesystems, and what the minimum software packages required in a root file system are. The talk will then go on to consider the tools required to develop for an embedded target, and look at what is available to help the embedded developer and speed up the development process. Once you have developed your software you need to debug it, so we will also look at the debugging tools available for embedded devices.

Melanie Rhianna Lewis started a lifelong love of electronics as a child when her Dad helped her make a "crystal" radio with an ear piece, a coil of wire, a diode and a radiator! At the same time the home computer revolution started and she would lust after the "build your own computers" advertised in the electronics magazines of the time. She never got one but did end up the proud owner of a BBC Micro. Melanie learnt everything she could about the machine, including assembler, operating systems, drivers, interrupts, and, thanks to the circuit diagram in the Advanced User Guide, digital electronics. After the BBC Micro came the Acorn Archimedes and so started a long relationship with ARM processors. In the 90s Melanie became interested in Linux and then developed one of the first ARM Linux distributions running on an Acorn RISC PC. The hobby became a job and Melanie currently works for an embedded device consultancy near Bradford where a lot of her work is still with ARM processors.

Interfacing the Raspberry Pi to the World — Everything you need to know about P1

You've received your Pi, set up a web server on it and maybe played a few rounds of Quake. You're looking for a new challenge and suddenly the header on the corner of the board catches your eye. A quick Google search for "P1 Raspberry Pi" gets you to the eLinux wiki page on Low level peripherals, and you suddenly realise that you can do all sorts of fun stuff by adding extra bits to your Raspberry Pi using this magical expansion port. Where do you start? Is it safe to connect a motor directly to the pins? What sort of interesting components are out there?

In this talk we will look at the ways we can communicate with the outside world using the GPIO pins on the Raspberry Pi. We will explore the mechanical, electrical and software side of things and talk about a few example projects you can try at home; hardware limitations will also be covered and workarounds provided.

Omer Kilic is theoretically still a research student at the University of Kent, although he intends to submit his thesis (which is about a reconfigurable heterogeneous computing framework) pretty soon. He likes tiny computers, things that 'just work' and beer. He currently works for Erlang Solutions in London, exploring the use of Erlang programming language in the Embedded Systems domain and develops tools and support material to help the adoption of this technology.

This talk will also serve as an introduction for the Raspberry Pi workshop on the Sunday, where we will explore the example projects covered in more detail.

Sensing Wearable Technology

An introduction to wearable technology that will include examples which incorporate sensors, plus work which makes use of the LilyPad Arduino, an open source, sewable microcontroller.

Rain Ashford creates wearable technology & electronic art, her most recent work involves investigating physiological sensing technologies and how they can be applied to wearable artworks to measure and interpret moods, health and lifestyle data. Rain also creates fun, interactive and aesthetically pleasing works that include gaming and musical elements. She is keen to demonstrate that electronics, components and circuitry don't have to be regarded as cold, boring, hard and boxy and instead can be fun, colourful and elegant, plus be integrated into the overall design of a work.

Rain’s background is in developing online activities for the BBC as a Senior Producer at BBC Learning and also as Technologist at BBC R&D, co-running BBC Backstage. She currently works as a freelance consultant for the Open University and for Technocamps designing and leading workshops in coding and electronics in the form of wearable technology for 11-19 year-olds, plus is a PhD researcher, peering into wearable electronics & art.

Running OpenBTS in the Real World

This talk will explore the OpenBTS project and describe how it uses software-defined radio and open source Internet telephony to create a small but complete GSM mobile phone network.

Experiences of operating OpenBTS installations on the Pacific island of Niue and at the Burning Man festival in the Nevada desert will be covered, along with how OpenBTS has been integrated with other systems for use in disaster relief. Licensing permitting there will also be a live demonstration.

Tim Panton is a software engineer with a particular interest in projects that blend web applications and person-to-person speech into an integrated user experience. He has many years' hands-on experience with the OpenBTS project, working closely with the core development team on numerous installations.

Tim is currently working on the Phono.com, Tropo.com and Rayo.org products at VoxeoLabs, producing web developer-friendly APIs by using XMPP protocols to drive innovative telephony applications that can be used anywhere by anyone.

Developing a Heavy Lift UAV — Pitfalls, Problems and Opportunities

Unmanned aerial vehicles (UAVs) are suitable for replacing dull, dirty and dangerous airborne tasks. The next developments in UAV use are in heavy lift and vertical take-off and landing (VTOL). The ability to place a useful load in a geographic location of choice is becoming pressing in many applications. The problem is that helicopters are excellent heavy lift machines but are limited by range and payload, while aeroplanes don't provide VTOL unless heavy engines and complex gearboxes are utilised.

The development of the conventional take-off and landing (CTOL) UAV is the beginning of a utilitarian UAV which is modular and low cost. The future will involve VTOL and higher payloads (Euro-pallet sized). This presentation will show a path of development from CTOL, through to VTOL Olecopter and ultimately a heavy lift (pallet container) UAV.

Edward Strickland is a Chartered Engineer with a background in aerospace and a degree in Aeronautical Engineering. He was the project manager for the Empire Test Pilot School, has lived and worked in Tanzania as a VSO volunteer, and has produced a CTOL airframe for the OpenRelief project which has been designed so that it can be constructed in developing countries using local resources.

The 3D Printed Revolution

Over recent years Open Source 3D printers have quickly developed alongside their commercial counterparts offering affordable and accessible alternatives. This talk will cover experiences using commercial printers and how the speaker's interests have moved to open source designs and how the two compare. Examples will be shown of projects using these technologies, such as "Fable", a clock manufactured by Selective Laser Sintering, and a wrist watch designed to be printed on a RepRap. There will also be a run through of the design considerations and how files were created, fixed and sliced in preparation to print on a RepRap.

Mark Gilbert graduated in 2000 from Sheffield Hallam University with a degree in Industrial Design Innovation. After several years working as a design engineer, Mark started working as a freelance industrial designer for several companies in the Northwest. Over the last 6 years he has also worked closely with the Bolton Science and Technology Centre as the "Designer in Residence" where he has developed workshops around the centre's 3D printing and CAD facilities.

In 2008 Mark set up the design studio Gilbert13 with his wife Angela where they design and develop products inspired by experimentation into digital manufacturing processes, 3D printing and additive manufacturing. Recent projects have taken their experience from rapid prototyping to use 3D printing as a manufacturing tool that can change the way people design, co-create and distribute objects.

The Bots are Coming

In the last two decades we have seen software and data change the fabric of economics, and the advent of personal computing and the Internet enable many new business models. However, the next two decades will be even more radical, as that wave of innovation shifts from the virtual domain to a physical manifestation. Atoms are the new bits, and the open sourcing and democratisation of bot technology is allowing us to enter an era of personal production. This talk will explore how 3D printing and additive manufacturing are revolutionising production as we know it.

Alan Wood originally trained in systems engineering, got lost in software engineering and open source for a decade, before returning to his hardware roots via the open source hardware and maker movement that has gathered momentum over the last few years.

DIYBIO - The Next Frontier

DIYBIOMCR is a public group based at MadLab dedicated to making biology an accessible pursuit for citizen scientists, amateur biologists and biological engineers who value openness and safety. This talk will give an overview of the movement, and what is going on at MadLab involving not only biology but also diverse fields such as hardware-hackers, artists, journalists and the open-source movement.

Hwa Young Jung is a co-founder and a director of MadLab, a community centre for creative, tech and science based in Manchester. Over 50 user groups meet once a month, including DIYBIOMCR, initially a jointly funded project with MMU and the Wellcome Trust.

Sunday Workshops

Workshops will be reasonably informal and shaped by the participants, and details are subject to change depending upon the level of interest expressed.

Please feel free to bring along equipment and components provided that you are able to take full responsibility for your own personal safety and that of others. Common sense should be exercised!

Practical IoT Applications with the Google ADK and Arduino

Hands on IoT building sessions that follow on from Saturday's ADK and Arduino talks.

Run by: Paul Tanner & Adrian McEwen.

Bring an Arduino with Ethernet and/or a Google ADK if you have one, along with sensors, LEDs and displays etc.

Interfacing the Raspberry Pi to the World

Here you will learn how to connect a selection of devices to your Raspberry Pi utilising the methods discussed during Saturday's talk.

Run by: Omer Kilic & Melanie Rhianna Lewis.

We will have a few Raspberry Pi boards available for the workshop but please bring your own if you were one of the lucky ones to have received one, along with breadboard and any useful components if you have these.

Building GSM Networks with Open Source

A look at the practical steps involved in creating a low power GSM network using open source technology.

Run by: Tim Panton & Andrew Back.

Note: this workshop will be subject to a spectrum licence being granted.

Practical 3D Printing

In this workshop we will work with simple models that will be printed out using a RepRap.

Run by: Alan Wood, Mark Gilbert & Mike Beardmore.

Note:

  • Please aim to arrive for 09:00 on the Saturday as the event will start at 09:30 prompt.
  • A light lunch and refreshments will be provided on the Saturday. Please ensure that you make any dietary requirements clear when registering.

Sponsored by:

OSHCamp kit bags provided by:

Raspberry Pi – Where to start?

via Wolf Paulus » Embedded

At its core, the Raspberry Pi uses the Broadcom BCM2835 System-on-a-chip. This single chip contains

  • an ARM1176 CPU (normally clocked at 700MHz)
  • a VideoCore 4 GPU, i.e. a low-power mobile multimedia processor (also used in the Roku-2)
  • 256 MByte SDRAM
  • in addition to the ARM’s MMU, a second coarse-grained Memory Management Unit for mapping ARM physical addresses onto system bus addresses.

The memory needs to be divided into ARM and GPU memory (this happens by including one of the supplied start*.elf files in the boot partition). The minimum amount of memory which can be given to the GPU is 32MB. However, that will restrict the multimedia performance, and 32MB does not provide enough buffering for the GPU to do 1080p30 video decoding.

The second, slightly smaller chip on the Raspberry Pi board is a LAN9512, which combines a USB 2.0 hub and a 10/100 MBit Ethernet controller. The LAN9512 is a low-cost, power-efficient, small-footprint USB-to-Ethernet and multi-port USB connectivity solution in a single package: it contains a Hi-Speed USB 2.0 hub with two fully-integrated downstream USB 2.0 PHYs, an integrated upstream USB 2.0 PHY, a 10/100 Ethernet MAC/PHY controller, and an EEPROM controller.

Single-Chip, Hi-Speed USB 2.0 Hub and High-Performance 10/100 Ethernet Controllers

Boot Process

Besides the hardware board itself, starting with the boot process seems to be as good an idea as any… When the Raspberry Pi powers up, it is the GPU that is active, looking for bootcode.bin, loader.bin and start.elf in the root directory of the first partition of the (FAT-formatted) SDCard. I.e., booting is hardcoded to happen from the SDCard.
The GPU reads and executes bootcode.bin, which then loads loader.bin, which in turn loads start.elf.
Again in the root directory of the first partition, it looks for config.txt, which contains information like the ARM speed (defaults to 700MHz), the address where kernel.img should be loaded, etc.
Finally, kernel.img (the ARM boot binary) is copied to memory and the ARM11 is released from reset, so that it runs from the address where kernel.img was loaded (default kernel_address 0x8000).

Memory Split

The memory needs to be divided into ARM and GPU memory and currently, we have three start.elf files to choose from (see below for details).

  • arm128_start.elf: 1:1, 128MBytes for the ARM11 and 128MBytes for the GPU
  • arm192_start.elf: 3:1, 192MBytes for the ARM11 and 64MBytes for the GPU
  • arm224_start.elf: 7:1, 224MBytes for the ARM11 and 32MBytes for the GPU

Broadcom states in its BCM2835 documentation that 32MBytes might not be enough memory for the GPU, so until you reach the point where 128MBytes aren't quite enough for the ARM, you may want to go with the 1:1 split.

Minimal Boot Image and Blinky Program

Let’s put the boot process assumptions made above to the test.

  • Prepare an SDCard (a 1 GByte Class-2 card works just fine) by formatting it with the MS-DOS (FAT) file system.
  • Download a Raspberry Pi Distribution (currently wheezy-raspbian vers.2012-07-15), uncompress the zip file and open the resulting image file 2012-07-15-wheezy-raspbian.img, for instance with DiskImageMounter, if you are using Mac OS X.
  • Copy bootcode.bin from the wheezy-raspbian.img to the root directory of the SDCard.
  • Copy loader.bin from the wheezy-raspbian.img to the root directory of the SDCard.
  • Copy arm128_start.elf from the wheezy-raspbian.img to the root directory of the SDCard and rename it to start.elf.
  • Copy config.txt from the wheezy-raspbian.img to the root directory of the SDCard.
  • Add the following two lines to your config.txt:
    kernel=blinky.bin
    kernel_address=0x8000
  • Uncompress and copy blinky.bin to the root directory of the SDCard.

Now insert the SDCard into your Raspberry Pi and power it up. If all goes well, you should see the Raspberry Pi’s OK LED blink.
The five files, which total just over 2MBytes, are probably the closest and smallest you can get to a Hello_World-style program for the Raspberry Pi.

Stay tuned for how to create your own Raspberry Pi Tool Chain and how to make blinky.
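Until then, here is a rough, unofficial sketch of the kind of logic blinky boils down to. It assumes the BCM2835's GPIO registers are memory-mapped starting at physical address 0x20200000 and that the board's OK LED hangs, active low, off GPIO16; the function names are illustrative, not from the actual blinky.bin:

```c
#include <stdint.h>

/* BCM2835 GPIO register offsets, counted in 32-bit words from the
 * GPIO base (physical address 0x20200000 on the Raspberry Pi). */
#define GPFSEL1 1   /* 0x04: function select for GPIO 10..19 */
#define GPSET0  7   /* 0x1C: drive pins 0..31 high */
#define GPCLR0  10  /* 0x28: drive pins 0..31 low  */

#define LED_PIN 16  /* the OK LED, wired active low */

/* Each pin gets 3 mode bits in its GPFSEL register; pin 16 is the
 * 7th pin covered by GPFSEL1, i.e. bits 18..20. Mode 001 = output. */
static void led_init(volatile uint32_t *gpio)
{
    uint32_t fsel = gpio[GPFSEL1];
    fsel &= ~(7u << 18);          /* clear the three mode bits */
    fsel |=  (1u << 18);          /* select output mode        */
    gpio[GPFSEL1] = fsel;
}

/* Active low: clearing the pin lights the LED, setting it turns it off. */
static void led_on(volatile uint32_t *gpio)  { gpio[GPCLR0] = 1u << LED_PIN; }
static void led_off(volatile uint32_t *gpio) { gpio[GPSET0] = 1u << LED_PIN; }
```

On the real hardware these functions would be called with `(volatile uint32_t *)0x20200000` as the base pointer, with a crude busy-wait loop between `led_on` and `led_off`; taking the base as a parameter also makes the bit twiddling easy to sanity-check against a plain array off-target.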

Kits (Homesense, Quick2Wire)

via OSHUG

For those that are new to hardware development it can prove a daunting prospect, and kits that address the needs of those with little or no experience in this area have a vital role to play. At the nineteenth OSHUG meeting we will be hearing about two such kits: one that was designed to support user-led smart home innovation and was based around the Arduino platform, and an experimenter's kit for the Raspberry Pi that is currently in development.

The Homesense Project

The Homesense project was a European user-led, smart-home development project employing open source hardware. The project was led by Tinker London and EDF and engaged households supported by local experts in the design and development of smart home concepts.

The project was developed as a reaction to top-down design approaches commonly observed in technological development and home building. Most early research viewed smart homes as a single complex system that is designed and constructed from the ground up, and assumes that most aspects (physical building, digital infrastructure, furniture, appliances) are under the control of a single smart-home developer. (Kortuem et al. 2010)

In the contrasting reality of multi-vendor development and retrofitting, however, this is rarely the case. The project was also inspired by the argument that when smart homes are developed by experts in a top-down approach, subsequently living with the smart home is problematic for non-experts, who lack control over the respective technologies.

The Homesense project was therefore designed to enable user-led innovation within the home environment, building alongside existing environmental and social conditions allowing end-users to address their own concerns in their physical and ‘lived in’ space. Homesense sought to bring the open collaboration methods of online communities to physical infrastructures in the home. Designing a toolkit to support this approach is explored as a topic of this presentation.

Natasha Carolan is a PhD student at HighWire Doctoral Training Centre, Lancaster University where her research considers commodification of design and production processes in the digital economy. A product designer by background, her research explores open and user innovation, service design and value co-creation in areas of NPD and manufacturing. Natasha co-designed the Homesense toolkit, situating it as a cultural probe, a strategy that Natasha believes is important in placing open source hardware in a democratic system as a tool for learning and empowerment.

Quick2Wire

Quick2Wire Limited is a start-up that is developing a range of OSH/OSS add-on products for the Raspberry Pi. The first product is an experimenter's kit, containing an expansion board, a set of components with which to experiment, software to drive the Pi, and an instruction manual. This will be followed by a series of expansion kits, using I2C and SPI to add capabilities like ADC, DAC, PWM and stepper motor drivers.

All the hardware and software will be released under open source licences.

The presentation will conclude with a demonstration using hardware prototypes driven by a Raspberry Pi.

Romilly Cocking spent the ten years before his 'retirement' as an agile software developer, coach and trainer. He spent the first two years of retirement experimenting with robotics. Then Raspberry Pi came along, and now Romilly works full-time running Quick2Wire.

Note: Please aim to arrive for 18:00 - 18:20 as the event will start at 18:30 prompt.

Sponsored by:
