Back in August 2014 we got very excited about one of the first Kickstarter projects to use the Raspberry Pi Compute Module. We’re pleased to announce that after much hard work, many late nights and far too much sugar and caffeine, the FiveNinjas team have started shipping actual real Slices to backers. Here’s a picture of Gordon and Jon working their ninja magic in a cold warehouse somewhere in deepest darkest Sheffield; the racking that can be seen in the picture contains parts for 1500 Kickstarter Slices.
Slices are assembled at a secret location in Sheffield, UK
We believe Slice is the first Kickstarter project using the Compute Module to start shipping to backers, narrowly beating our other favourite project, the OTTO Camera (which also seems to be very close to shipping!).
Slice comes in black, red and silver
One of the things we wanted to see with the Compute Module was people using it to do just this type of thing: leveraging Raspberry Pi technology to create innovative, high-quality products with minimal resources, something that has historically been a difficult challenge.
The FiveNinjas team includes our very own James Adams and Gordon Hollingworth, who have been spending large amounts of their spare time working on the Slice hardware and software and, in doing so, discovering exactly what it’s like to build a product using the Compute Module and mass produce it. In the process a few wrinkles have been found, but mostly it has been a big success, and there’ll be another blog post soon from Gordon on the process used to test and program Slice in mass production (which is a very important and often overlooked part of creating a real product).
Slices being automatically programmed before packing and shipping
We’ve been told that to get the cost of the Slice motherboard down to an affordable level the Ninjas had to make a minimum order of 3000 Slice PCBs, so there are 1500 more Slices that can be built relatively quickly once the Kickstarter units have all been shipped. If you missed the Kickstarter and want to grab one of these extra units, head over to the brand new FiveNinjas store!
On Monday, Matt Timmons-Brown, The Raspberry Pi Guy, took a day out from revising for his GCSEs to come and do some video interviews with Eben and Gordon. We really enjoy working with Matt; he asks difficult questions, and I think that many of you will find this interview particularly interesting, as Eben talks about plans for open-sourcing the Pi’s graphics stack, what’s going on with the display board, what’s up with Windows 10, and much more.
Thanks Matt – come back to Pi Towers when your exams are over! (Next time, we want more Gordon!)
Recently I visited the University of York Computer Science department for the second year running to see the Raspberry Pi being used to great effect on the new intake of students. Last year I visited to judge Blue Pi Thinking, and since it was such a great success they decided to repeat the exercise this year.
The students are sent a Raspberry Pi before they come to university, with a special build and a document describing the challenge. There are three main reasons for doing this. First, it gives the students a common computing platform that they can use throughout the year to base different projects and assignments on. Secondly, the students learn a great deal and kick-start their education. Finally, it becomes a real social event for the students to get together (some actually help each other out) with beer and competition, which of course go hand in hand!
There are two elements to the challenge, and the students can choose either (or both, or neither!). The first is Blue Pi Thinking, where the challenge is to develop something creative; the second is BattlePi.
BattlePi is a game of Battleships played automatically by the Raspberry Pi, with a server through which two Raspberry Pis can communicate. The students are given the (well-commented Python) software, which initially just chooses random positions to shoot at. The student then has to modify the Python code to implement better AI for both firing position and ship placement. At the end of freshers’ week the students all come together (with beer, of course) to test their creations.
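To give a flavour of the starting point, here is a minimal sketch of a random-firing bot in the same spirit; the class name, grid size and method names are my own invention, not the actual BattlePi API:

```python
import random

class RandomBot:
    """A naive Battleships player: fires at random squares it hasn't tried yet."""

    def __init__(self, width=10, height=10):
        # Every untried square on the board is a candidate target.
        self.untried = [(x, y) for x in range(width) for y in range(height)]
        random.shuffle(self.untried)

    def choose_shot(self):
        # Pop a random untried square; a smarter bot would instead hunt
        # around previous hits, or weight squares by how many ships still fit.
        return self.untried.pop()

bot = RandomBot()
shots = [bot.choose_shot() for _ in range(100)]
# 100 shots on a 10x10 board cover every square exactly once
```

A better AI replaces `choose_shot` (and adds a placement strategy), which is exactly the shape of the students' task.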
Blue Pi Thinking, the creative competition (the aim is simply to create something), was run as a judged event. Here are some examples of the projects that were created:
Playing card recognition. Working from first principles, the student captured a picture of a playing card, thresholded it, and segmented it to find the pips and count them (OK, it doesn’t work with face cards yet!). They then applied an edge-following algorithm to each pip to work out whether it’s a heart, club, diamond or spade. I’d just like to say… Awesome!
Ultrasonic theremin. This student took a standard ultrasonic transducer and used it to create a theremin, which was similarly awe-inspiring.
Raspberry Pi mosaic creator. This project was another very complex piece of work: the student takes a picture on his Android phone using his own app and transmits it to the Raspberry Pi. The Pi then calculates, for each small region of the picture, the best match from a set of images, based on a calculated ‘average’ colour for that region. The software then builds the final image as a mosaic of the selected images and transmits it back to the phone!
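The average-colour matching at the heart of the mosaic project can be sketched roughly like this (my own guess at the approach, with a made-up tile library; it is not the student’s actual code):

```python
def avg_colour(pixels):
    """Average (R, G, B) of a flat list of RGB tuples for one image region."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def best_match(region_pixels, library):
    """Pick the library image whose precomputed average colour is closest
    (squared Euclidean distance in RGB) to the region's average colour."""
    target = avg_colour(region_pixels)
    return min(
        library,
        key=lambda name: sum((library[name][i] - target[i]) ** 2 for i in range(3)),
    )

# Toy library: image name -> precomputed average colour
library = {"sky": (100, 150, 230), "grass": (60, 180, 70), "brick": (180, 80, 60)}
region = [(90, 140, 220), (110, 160, 240)]  # a mostly-blue region, abbreviated
print(best_match(region, library))  # → sky
```

Run once per region, the matches tile together into the final mosaic.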
Throughout the year the students will be using their Raspberry Pis to continue their education, and to help develop next year’s BattlePi competition. Will Smith and Emma Hogson, who are in charge of undergraduate admissions, have offered Raspberry Pi their complete BattlePi materials, including all the software and instructions, so we can give it to other universities… something we’re looking into with great interest.
May the worldwide BattlePi commence – watch this space!
At the end of August, Luke Westaway from CNET’s Adventures in Tech came to visit us with a film crew. Here’s the resulting video. We are impressed that somehow the CNET team managed to avoid moiré fringing effects with Gordon’s shirt.
We revealed the Raspberry Pi Compute Module back in April, and released the Compute Module Development Kit in the middle of June. Since then we’ve had a lot of interest and will shortly start shipping the Compute Module in volume to a variety of manufacturers who have already designed it into their products.
One of our goals with the Compute Module was to enable a generation of “Kickstarter consumer electronics” startups to develop commercial-quality products at relatively low volume. We’ve already seen the OTTO point-and-shoot camera, which was the first ever Kickstarter to use the Compute Module, and today marks the launch of another campaign which we hope will be even more successful.
Slice media player and remote
Slice is an XBMC media player built around the Compute Module, with a simple custom skin, a shiny milled-aluminium case, and a cute ring of 25 RGB LEDs for (and I quote) “visual feedback and wow factor”. It’s been developed by Mo Volans, our old friends Paul Beech and Jon Williamson from Pimoroni, and our very own Gordon Hollingworth and James Adams; they’ve been burning the candle at both ends to get Slice to where it is now, and the prototypes are looking pretty drool-worthy.
Check out the video below, and then head on over to Kickstarter to see for yourself why we’re excited about Slice!
I’ve been pointed at a couple of videos which might interest you: you’ll learn something new from both of these.
First up, Eben explains more about the Compute Module to our friends at RS Components:
And a little later on, Gordon, our Head of Software, gave a talk to the Prime Conference at the Royal Institution about the decisions that led us to repatriate manufacture of the Raspberry Pi to the UK:
Liz: Gordon Hollingworth, our Director of Software, has been pointing the camera board at things, looking at dots on a screen, and cackling a lot over the last couple of weeks. We asked him what he was doing, so he wrote this for me. Thanks Gordon!
The Raspberry Pi is based on a BCM2835 System on a Chip (SoC), which was originally developed to do lots of media acceleration for mobile phones. Mobile phone media systems tend to lag behind desktop systems, but are far more energy efficient. You can see this efficiency at work in your Raspberry Pi: decoding H264 video on a standard Intel desktop processor requires gigahertz of processing capability and many (30-40) watts of power, whereas the BCM2835 on your Raspberry Pi can decode full 1080p30 video at a clock rate of 250MHz, and only burn 200mW.
This amazing hardware enables us to do things like video encode and decode in real time without actually doing much work at all on the processor (all the work is done on the GPU, leaving the ARM free to shuffle bits around!). It also means we have access to very interesting parts of the encode pipeline that you’d otherwise not be able to look at.
One of the most interesting of these parts is the motion estimation block in the H264 encoder. To encode video, one of the things the hardware does is to compare the current frame with the previous (or a fixed) reference frame, and work out where the current macroblock (16×16 pixels) best matches the reference frame. It then outputs a set of vectors which tell you where the block came from – i.e. a measure of the motion in the image.
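For a feel of what the motion estimation block is doing, here is a toy version of the search in pure Python over greyscale frames, scored with a sum of absolute differences (SAD). The real encoder uses far cleverer search strategies than this exhaustive window scan, and the function names here are mine:

```python
def sad(frame_a, frame_b, ax, ay, bx, by, size=16):
    """Sum of absolute differences between a size x size block at (ax, ay)
    in frame_a and a block at (bx, by) in frame_b."""
    return sum(
        abs(frame_a[ay + dy][ax + dx] - frame_b[by + dy][bx + dx])
        for dy in range(size)
        for dx in range(size)
    )

def best_vector(cur, ref, mb_x, mb_y, search=4, size=16):
    """Search a small window of the reference frame for the block that best
    matches the current macroblock; return the motion vector (dx, dy)."""
    h, w = len(ref), len(ref[0])
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            rx, ry = mb_x + dx, mb_y + dy
            if 0 <= rx <= w - size and 0 <= ry <= h - size:
                score = sad(cur, ref, mb_x, mb_y, rx, ry, size)
                if best is None or score < best[0]:
                    best = (score, (dx, dy))
    return best[1]

# Reference frame: a bright 16x16 square at (8, 4) on a dark 64x32 background
ref = [[0] * 64 for _ in range(32)]
for y in range(4, 20):
    for x in range(8, 24):
        ref[y][x] = 255
# Current frame: the same square shifted 2 right and 1 down, to (10, 5)
cur = [[0] * 64 for _ in range(32)]
for y in range(5, 21):
    for x in range(10, 26):
        cur[y][x] = 255
print(best_vector(cur, ref, 10, 5))  # → (-2, -1): the block came from 2 left, 1 up
```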
In general, this is the mechanism used by the application motion. It compares the image on the screen with the previous image (or a long-term reference), and uses the information to trigger events, like recording the video, writing an image to disk, or raising an alarm. Unfortunately, at this resolution it takes a huge amount of processing to achieve this in the pixel domain, which is silly if the hardware has already done all the hard work for you!
So over the last few weeks I’ve been trying to get the vectors out of the video encoder for you, and the attached animated GIF shows you the results of that work. What you are seeing is the magnitude of the vector for each 16×16 macroblock, equivalent to the speed at which it is moving! The information comes out of the encoder as side information (it can be enabled in raspivid with the -x flag). It is one 4-byte record per macroblock, or ((mb_width+1) × mb_height) × 4 bytes per frame; for 1080p30 that is 121 × 68 × 4 ≈ 32 kbytes per frame. And here are the results. (If you think you can guess what the movement you’re looking at here represents, let us know in the comments.)
Since this represents such a small amount of data, it can be processed very easily, which should lead to 30fps motion identification and object tracking with very little actual work!
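As a hedged sketch, the per-frame side data could be unpacked along these lines; the exact byte layout assumed here (two signed byte vector components plus a 16-bit SAD score per macroblock) is my assumption, so check the raspivid documentation before relying on it:

```python
import struct

MB_W, MB_H = 121, 68           # (1920/16 + 1) x (1080/16) macroblocks for 1080p
RECORD = struct.Struct("<bbH")  # assumed layout: int8 x, int8 y, uint16 SAD

def magnitudes(frame_bytes):
    """Turn one frame's worth of side data into a 2D grid of squared
    vector magnitudes (skipping the sqrt), one value per macroblock."""
    grid = []
    for row in range(MB_H):
        grid.append([])
        for col in range(MB_W):
            off = (row * MB_W + col) * RECORD.size
            x, y, _sad = RECORD.unpack_from(frame_bytes, off)
            grid[-1].append(x * x + y * y)
    return grid

# Synthetic frame: every block still, except one moving 3 right and 4 down
frame = bytearray(MB_W * MB_H * RECORD.size)
RECORD.pack_into(frame, (10 * MB_W + 20) * RECORD.size, 3, 4, 100)
grid = magnitudes(bytes(frame))
print(grid[10][20])  # → 25
```

Thresholding that grid, rather than raw pixels, is what makes cheap 30fps motion detection plausible.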
Gordon Hollingworth, our Director of Software, has been Googling himself, and mailed me to let me know about this video he found from Richard Ibbotson. Richard came by Pi Towers last month and filmed this little interview with Gordon and Eben – it’s worth a watch if you’re interested in what goes on behind the scenes. Enjoy!
The University of York asked if we could send someone up to judge a Raspberry Pi contest they’d been running for people joining the Computer Science department over the summer break. Our very own Dr Gordon Hollingworth is a York alumnus, so we sent him to revisit his old stomping grounds (in one of his collection of stylish Raspberry Pi t-shirts – you can buy your own at the Swag Store).
Freshers were given a Raspberry Pi when they won a place at the university back in August, and had been spending the time before arriving in York working on two challenges. Blue Pi Thinking challenged them to come up with the most creative use of the Pi they could think of; and BattlePi had them programming their Pis to beat all other entrants in a class-wide Battleships competition.
Gordon mailed me from his phone while he was there:
“I’m amazed by the quantity and quality of entries and the way they’re working together to improve their code as they go through the Battle Pi part of the contest… I sat down with one guy who was trying to find out why his code had crashed in the middle of a game, and then while describing it to his opponent he suddenly saw the problem! (I obviously explained at this point why people do code reviews!)”
Here’s some video from the day. We’re incredibly impressed at what people were producing; we hope that some of the participants will find time to write their projects up and share them with us.
The eagle-eyed will spot Liam Fraser, one of our earliest and most helpful supporters, in one of the shots. Liam moved to Cambridge after his A levels and spent a gap year working for our hosts, Mythic Beasts, on the back of his work on the Pi; he’s been maintaining our downloads server too. He’s now headed off to study at York. We’ll miss him while he’s away – we hope you enjoy your time at university, Liam!
Dr Gordon Hollingworth, our Head of Software, has been in Orlando visiting Familab, one of our favourite hackspaces. (I love it there – unusually, they’re in a big industrial unit, so they’ve got a lot of space for really big hardware. They’ve got cherry pickers, traffic lights, an industrial CNC milling machine and a lot of Lego.) The Orlando Sentinel went along to have a chat with him: here’s some video they took on the day.
Really sorry about the autoplay; we know you all hate it, but the video player used here doesn’t give us the option to turn it off when the video is embedded. Begone, autoplay!
Gordon sent an email to the office mailing list from his phone while he was there, saying that the pinball machine you see featured in the video was the coolest physical project he’s seen done with a Pi so far. Think you can do better? Let us know!
Gordon Hollingworth, our Director of Software, is saddened that I called him out for being camera-shy the other day. So he’s offered to star in a new feature, which we’ll make a regular happening here if readers (that’s you!) like the idea.
If you have a question about the Pi you’d like Gordon and his whiteboard to answer (we’ve hung it up properly and bought in pens of many colours just for the occasion), please leave a comment below. We’ll select a few of the most interesting ones and film Gordon’s responses next week.
Liz: if you haven’t entered our contest to win a pre-production camera board, have a look at the post explaining what you’ll need to do. And if you’re looking for inspiration, here’s a guest post from Gordon, our Head of Software, about a mini-HD camera project he worked on at home using the prototype boards we showed the BBC back in 2011.
I may have mentioned that Gordon does a lot of cycling. He bodged up a 3D helmet cam a couple of years ago: here’s how he did it. (He has also made me include some 2D video because he likes showing off.)
Careful with the last video, which is in 3D – if you’re using bi-coloured 3D glasses to view it, as I did, you are liable to feel VERY motion sick if you’re susceptible to that sort of thing. Over to Gordon!
A few years ago I really wanted to play around with a helmet-mounted camera for my mountain biking. There were quite a few out in the market, but they were quite expensive, and it’s always difficult getting toys past my wife! Because I was working at Broadcom, I was able to get my hands on what we called the MicroDB (the thing David and Eben first showed to the BBC as the Raspberry Pi), and since I had all the software and a bit of competence, I decided to try doing a bit of HD helmet recording.
The hardware I used was based on the same BCM2835 chip that we all know and love. The hardware also had a PMU chip (power supply), which meant you could power it directly from a lithium ion battery and record 720p HD video for about an hour.
So I rigged up some properly engineered mounting: I used a rubber from my daughter’s pencil case (Americans, breathe easy – this is the UK word for what you call an eraser), a couple of cable ties, and a USB socket! I set out on a voyage of discovery… apologies in advance for the lycra-clad arses, but it’s something you’ll just have to put up with!
Liz interjects: that’s not the half of it. Eben and Gordon have a regular date on Wednesdays where they take an hour and a half over lunch to go cycling and have a software meeting at the same time. This means a certain amount of strutting sweatily around the office dressed in lycra at the end of the ride. This week, Jack turned up, tutted and said: “You two do realise there are showers downstairs, don’t you.” The rest of us cheered.
This is an example of the helmet cam being used in a chain gang, which is a fast-moving (we’re doing around 26mph average for the whole of the clip) club ride, where you continuously rotate who’s cycling at the front, making it a very efficient way of travelling at speed!
This is another clip from the helmet cam, at the start of a mountain bike race held by a good friend of mine who’s an elite rider.
When I took these videos, I expected to experience the same feeling of speed as when you’re riding for real, but it doesn’t quite make it. The main issue is that the feeling of speed you get is a product of the full 3D stereoscopic experience that the 2D camera throws away. It’s there and it’s fun, but it doesn’t actually feel real; you don’t quite get the full-force feeling of what it’s like to tear down that trail!
I was missing a dimension, so I had to go and find it again! OK, now you say, surely it’s going to cost a lot of money to buy a proper 3D camera – and you’d be right, unless you happen to have a whole bunch of little camera boards kicking around the office. I realised that all I needed was two of them, and a spot of work to synchronise the pictures: then Bob’s your uncle!
I took two MicroDBs and connected them together (actually I used a USB-to-USB connector, which I then cable-tied to my bike helmet with a rubber/eraser to give it something soft to sink into). What you get out is two videos (each 720p30). To get the images working together, you need to do some processing, which presents a number of problems:
1) The two cameras are not aligned and therefore you have to rotate and translate the images.
2) You also need to invert one of the images.
3) You need to hand-synchronise the two videos (and keep them synchronised during the video).
So I wrote a bit of software based on FFmpeg and SDL, and lots of handcrafted fun code, to take the two videos and output them as one in a number of formats: interleaved line (odd lines are the left image, even lines the right), horizontal half-resolution and vertical half-resolution (because we had a number of different 3D televisions to play with!). Application of Bresenham’s algorithm is so much fun!
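The interleaved-line format, for example, can be sketched in a few lines of Python (my own illustration, not the actual FFmpeg/SDL tool):

```python
def interleave_lines(left, right):
    """Combine two equal-sized frames (lists of rows) into one
    line-interleaved 3D frame: odd lines (1st, 3rd, ...) come from the
    left image, even lines from the right, as some 3D TVs expect."""
    assert len(left) == len(right)
    return [left[y] if y % 2 == 0 else right[y] for y in range(len(left))]

# Tiny 4x4 demonstration frames
left = [["L"] * 4 for _ in range(4)]
right = [["R"] * 4 for _ in range(4)]
for row in interleave_lines(left, right):
    print("".join(row))  # alternating LLLL / RRRR rows
```

The half-resolution formats are the same idea, but with each source frame scaled down first so the combined frame stays the original size.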
I then went and did a 24-hour mountain bike race in a team of five (we came third that year) and recorded the first half of one of the laps in glorious 3D. You are going to either need a proper 3D television to watch this or use some red/green (actually cyan is closer) glasses (the kind you get in breakfast cereals!) – otherwise you can just hold two bits of suitably coloured filters against your face.
Liz again: editing this post, I have realised that the next video gives me motion sickness even without Gordon’s 3D glasses. Proceed with caution. Gordon, I can’t believe you kept this stuff up without sleeping for 24 hours.
Why am I showing you this? Well mostly because I had so much fun doing it, and it really shows how the real 3D helmet cameras can make the experience of home video just so much better if you’re doing something fast and aggressive. I hope you agree!
Finally, of course, the Raspberry Pi camera (now in production and being released next month) is very closely related to this one – although it’s actually higher quality; the images we’ve been seeing in test are looking fantastic. This project gives you an impression of the kind of thing you’ll be able to do with it with a bit of extra coding – and of the sort of extra legwork we’re looking for from people entering the competition to win a pre-production camera board.
I’ve been looking for where I put the video manipulation code; if I can find it, I’ll put it into GitHub somewhere so you can have a play yourself (if anyone is remotely interested)!
Finally – really finally – you have to think about the fact that the Raspberry Pi has two CSI interfaces, meaning there’s a potential to add two camera boards. Does that mean it would be possible to do all this completely on a single Raspberry Pi? We haven’t experimented with the idea yet – only the future can tell…
We’ve sent the first camera boards to production, and we’re expecting to be able to start selling them some time in April. And we’ve now got several pre-production cameras in the office that we’re testing and tweaking and tuning so the software will be absolutely tickety-boo when you come to buy one.
Gordon is in charge of things camera, and he’s got ten boards to give away. There is, however, a catch.
The reason we’re giving these cameras away is that we want you to help us do some extra-hard testing. We want the recipients of these boards to do something computationally difficult and imaginative with them, so that the cameras are pushed hard in the sort of bonkers schemes we’ve seen so many of you come up with here before with your Pis, and so that we can learn how they perform (and make adjustments if necessary). The community here always seems to come up with applications for the stuff we do that we wouldn’t have thought of in a million years; we thought we should take advantage of that.
So we want you to apply for a camera, letting us know what you’re planning to do with it (and if you don’t do the thing you promise, we’ll send Clive around on his motorbike to rough you up). We want you to try to get the camera doing something imaginative. Think about playing around with facial recognition; or hooking two of them up together and modging the images together to create some 3D output; or getting the camera to recognise when something enters the frame that shouldn’t be there and doing something to the image as a result. We are not looking for entries from people who just want to take pictures, however pretty they are. (Dave Akerman: we’ve got one bagged up for you anyway, because the stuff you’re taking pictures of is cool enough to earn an exemption here. Everybody else, see Dave’s latest Pi in Space here. He’s put it in a tiny TARDIS.)
So if you have a magnificent, imaginative, computationally interesting thing you’d like to do with a Raspberry Pi camera board, email email@example.com. In your mail you’ll need to explain exactly what you plan to do; and Gordon, who is old-school, is likely to take your application all the more seriously if you can point to other stuff you’ve done in the past (with or without cameras), GitHub code or other examples of your fierce prowess. (He suggested I ask for your CVs, but I think we’ll draw the line there.) We will also need your postal address. The competition is open worldwide until March 12. We’re looking forward to seeing what you come up with!