Tag Archives: Python

Joker: a Raspberry Pi + Python joke machine

via Raspberry Pi

Today is a public holiday here in the UK, and Pi Towers is silent and still. Clive’s in a field “with no network (not even mobile),” he specifies, just in case someone were tempted to try and make him do something anyway. By the time this post appears, I’ll be pursuing a couple of kids around the Cambridge Museum of Technology. Liz and Eben have one-upped everyone by going to Scandinavia. So, in keeping with the leisurely, end-of-summer vibe of today, we thought we’d share a project that’s designed to amuse. We hope it’ll cheer up all those of you unlucky enough to live in places where you don’t automatically get to bunk off on the last Monday in August.

Raspython, a new project aiming to offer tutorials and learning resources for the Raspberry Pi community and for new makers and programmers in particular, brings us instructions for making Joker, a Raspberry Pi joke machine.

A fact that ought to be more widely known is that our own Ben Nuttall is founder and chairperson of the Pyjokes Society. He and co-founders Alex Savio, Borja Ayerdi and Oier Etxaniz have written pyjokes, a Python module offering lovingly curated one-liners for programmers, and it’s from this that Joker gets its material. Ben and friends encourage you to improve their collection by submitting the best programming jokes you know that can be expressed in 140 characters or fewer; you can propose them on GitHub via pyjokes’ proposal issue or via pull request.
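Pulling a one-liner out of pyjokes takes only a couple of lines of Python. The sketch below is not Joker's actual code: it falls back to a tiny built-in list when the module isn't installed, and the fallback jokes are placeholders rather than part of the curated collection.

```python
import random

# Placeholder fallbacks for when pyjokes isn't installed; with pyjokes
# available, get_joke() draws from the curated collection instead.
FALLBACK_JOKES = [
    "There are 10 kinds of people: those who understand binary and those who don't.",
    "!false -- it's funny because it's true.",
]

def get_joke():
    try:
        import pyjokes  # pip install pyjokes
        return pyjokes.get_joke()
    except ImportError:
        return random.choice(FALLBACK_JOKES)

print(get_joke())
```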

Joker’s display is an affordable Adafruit 16×2 LCD Pi plate; this comes as a kit needing assembly, which Adafruit’s detailed instructions walk you through gently. With the LCD assembled and mounted, getting Joker up and running is just a matter of installing the pyjokes module, LCD drivers and Joker script, together with a little bit of other set-up to allow your Raspberry Pi to talk to the LCD.

Everything you need is in the tutorial, and it makes for a really great self-contained project. Give it a whirl!

The post Joker: a Raspberry Pi + Python joke machine appeared first on Raspberry Pi.

Calculator for audio output transformers

via Dangerous Prototypes


Dilshan Jayakody  writes:

    Audio output transformers are heavily used in vacuum tube and some (older) transistor-based audio power amplifiers, but these days output transformers are quite hard to find and are expensive items. For homebrew projects the best option is to construct these transformers ourselves, and this script helps to calculate the winding parameters for them.

    This “AF output transformer calculator” script is written in Python and works with most commonly available Python interpreters.
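The central winding parameter is the turns ratio, which follows from the impedance ratio, since impedance transforms with the square of the turns ratio. This is a minimal illustration of the physics, not Dilshan's actual script:

```python
import math

def turns_ratio(z_primary, z_secondary):
    # Zp / Zs = (Np / Ns)^2, so Np / Ns = sqrt(Zp / Zs)
    return math.sqrt(z_primary / z_secondary)

# Match a 25k ohm tube plate load to an 8 ohm speaker:
print(round(turns_ratio(25_000, 8), 1))  # 55.9
```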

The script is available at elect.wikispaces.com.

Details at Dilshan’s blog.

Astro Pi: Mission Update 5 – flight safety testing

via Raspberry Pi


The road to space is long and winding, but the two Astro Pi flight units are almost there! The next thing for us after this is to hand over the final payload to the European Space Agency so it can be loaded onto the Soyuz-45S rocket for launch on December 15th with British ESA Astronaut Tim Peake.

To be allowed on the rocket, you need a flight safety certificate for your device, and these can only be obtained by presenting a whole host of measurements and test results to a panel of experts at ESA ESTEC in Holland.

The expertise and equipment needed to carry out many of these tests are well outside the capabilities of the Raspberry Pi Foundation, and without the facilities and personnel made available through our UK Space partners this would not have been possible; we’ve used facilities and partners all over Europe to get the work done.

I’ll list below the tests that were done approximately in chronological order starting from March.

Power integration test

AIRBUS Defence and Space, Bremen, Germany >

Once in orbit, the Astro Pi will have two ways of getting power. It can use an AC inverter (above) that allows the crew to use all kinds of standard domestic appliances (like a normal USB power block); it’s also able to get power from any laptop USB port.

It is likely that when the Astro Pi is deployed in the Columbus module we will run from an AC inverter, but when we’re in the Cupola module we’ll just draw power from one of the laptops which is also there.

To gain permission to draw power from a laptop like this we needed to do a power integration test, to evaluate that the electrical load doesn’t have any adverse effect on the laptop.


The most common laptop on the ISS is the IBM Thinkpad T61P (circa 2007 from before Lenovo acquired them – Eben also uses one of these). Pictured above is an identical ground laptop with a special USB current probe connected to an oscilloscope. Note that this was done before we had the aluminium flight case, so you’re just seeing the Sense HAT, Raspberry Pi and camera parts of the whole Astro Pi unit.

The flight hardware was then powered up through the current probe so the oscilloscope could measure current inrush as well as maximum current when using the Astro Pi at max performance. Some diagnostic software was then used to check that there were no adverse effects experienced by the laptop.

Coin Cell Battery

Surrey Satellite Technology, Guildford, UK >

Since the Astro Pi will not be connected to the LAN on the ISS, the only means it has of keeping the correct time is a Real Time Clock (RTC) with a backup battery.

The flight stack up for Astro Pi is as follows:

  1. Raspberry Pi B+
  2. Custom RTC Board (has coin cell holder and push button contacts)
  3. Sense HAT

Batteries on the ISS have a whole host of possible hazards associated with them, and so any battery flown is subject to a stringent set of batch tests.

Astro Pi has a batch of eight Panasonic BR-1225 coin cells which were all tested together. Here is number 5, which is one of the ones that will fly:


The test procedure involved visually inspecting the coin cells, measuring their width and size with callipers, testing their voltage output during open circuit and under load followed by exposing them to a vacuum of about 0.6 bar (~450 mmHg) for two hours.

Afterwards the measurements were redone to see if the coin cells had leaked, deformed or become unable to provide power.

Conformal Coating

Surrey Satellite Technology, Guildford, UK >

One of the safety requirements for circuit boards in space flight is that they are coated in a protective layer, rather like nail varnish, called conformal coating. This is a space grade silicone-based liquid that dries to form a hard barrier. In microgravity a metallurgical phenomenon called tin whiskers occurs. These are tiny hairs of metal that grow spontaneously from any metallic surface, especially solder joints.

The hazard here is that these little whiskers break off, float off and become lodged somewhere causing a short circuit. So the conformal coat has two purposes. One is to protect the PCB from any invading whiskers, and the other is to arrest any tin whiskers that may grow, and prevent them breaking free.


For the Sense HAT (above) we needed to define a number of keep-out zones for the coating so as not to compromise the pressure and humidity sensors. The surfaces of the LEDs were also left uncoated to avoid dulling their light. If you look closely you can see the shiny coating on the HAT; in particular, see the joystick bottom right.

It’s much easier to see on two camera modules:


Vibration

AIRBUS Defence and Space, Portsmouth, UK >

Vibe testing is not actually required for safety, but we undertook it anyway as insurance that the payload would survive the vibration environment of launch. The testing involved placing an Astro Pi into some flight equivalent packaging and strapping it down onto a vibe table.

The vibe table is then programmed to simulate the severity of launch conditions on a Soyuz rocket.

The tests needed to be done in x, y and z axes. To accomplish this two different vibe tables were employed, one for up and down (z, see above) and one for back and forth (x and y, see below).

After the vibration sequence the Astro Pi was tested to ensure the vibration had not caused any issues; the case was also opened and the interior inspected to ensure no connections had come loose.

Electromagnetic Compatibility (EMC)

AIRBUS Defence and Space, Portsmouth, UK >

EMC is the study and measurement of unintended electromagnetic signals that could interfere with other electronics. Almost all electronic devices these days undergo EMC testing in order to get CE or FCC markings. The Raspberry Pi B+ and Sense HAT both carry these markings; however their test results were obtained in a home-user setup, with a keyboard, mouse, HDMI monitor and Ethernet all connected.

The Astro Pi flight unit will be used without all of those. So these tests were required to ensure that, when used in this way, the Astro Pi doesn’t cause any problems to other systems on board the ISS (like life support).

The tests were conducted in a special EMC test chamber. The walls are lined with super-absorbent foam spikes that prevent electromagnetic signals from entering the room from outside.

That way, any electromagnetic signal measured must have originated inside the room.

A test script was run on the Astro Pi to stress it to maximum performance while a series of antennae were used to examine different ranges of the electromagnetic spectrum.

A set of electromagnetic susceptibility tests was also conducted to evaluate how the Astro Pi would behave when experiencing strong magnetic fields.

No issues were found, and all tests passed.

Off Gassing

ESA ESTEC, Noordwijk, Holland >

The off-gassing test is done to ensure the payload does not give off any dangerous fumes that might be harmful to the crew.

The test involves placing the payload into a bell jar and pumping out all of the air. Synthetic air of known properties is then pumped in, and the whole jar is held at 50 degrees Celsius for 72 hours. Afterwards the synthetic air, plus any gasses released by the payload, are pumped out and analysed using a mass spectrometer.


If you look closely, you can also see some Raspberry Pi SD cards in there. The test needed to be representative of the entire payload, so it’s one of the flight units plus five SD cards. The resulting measurements were then just doubled to account for two Astro Pi units with ten SD cards.

Thermal Capacity

Raspberry Pi, Cambridge, UK

This test needed to demonstrate that no touchable surface of the Astro Pi flight case would ever reach or exceed 45 degrees Celsius.

In microgravity the process of convection doesn’t occur, so the case was designed with thermal conduction in mind. Each of the square pins on the base can dissipate about 0.1 watts of heat. We also wanted to avoid any fans as these cause EMC headaches and other problems for safety (moving parts).

We used five temperature probes connected to another Raspberry Pi for the data logging. Four of the probes were placed in contact with the surface of the aluminium case using small thermal pads and kapton tape (HDMI side, base by the camera, SD card slot side and top side). One was used to monitor ambient temperature some distance away. The Astro Pi was then placed inside a small box to simulate the reduced airflow on board the ISS and was then stressed to maximum performance for four days.
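The post doesn't name the probe type; DS18B20-style 1-Wire sensors are a common choice for this kind of logging on the Raspberry Pi, and the kernel exposes each reading as a small text file that is easy to parse (the sample data below is made up):

```python
def parse_w1_temp(raw):
    """Parse a 1-Wire thermal sensor's w1_slave output into degrees Celsius."""
    lines = raw.strip().splitlines()
    if not lines[0].endswith("YES"):        # CRC check from the kernel driver
        raise ValueError("bad CRC, re-read the sensor")
    millidegrees = lines[1].rsplit("t=", 1)[1]
    return int(millidegrees) / 1000.0

sample = ("73 01 4b 46 7f ff 0d 10 41 : crc=41 YES\n"
          "73 01 4b 46 7f ff 0d 10 41 t=23187")
print(parse_w1_temp(sample))  # 23.187
```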

The results showed that an equilibrium was reached fairly quickly where the only input into the system was the fluctuation of ambient temperature.

Sharp edges inspection

ESA ESTEC, Noordwijk, Holland >

This test was almost a formality, but was done so ESA could verify there were no sharp edges that could cause harm to the crew. The test was done using a special piece of fabric that was dragged over the surface of the flight case. If it snags then the test is failed, but thankfully we passed without issue first time.

The test is important because a crew member with a cut or infected hand is a serious problem on orbit.

Experiment Sequence Test

ESA-EAC, European Astronaut Centre, Cologne, Germany >

The experiment sequence test is a full end-to-end reproduction of everything that Tim Peake will do on orbit. It was done in a replica of the ISS Columbus module on the ground.

On orbit there are step-by-step procedures that the crew follow, and these tests are an opportunity to improve and refine them. There is a procedure for deploying the Astro Pi, one for powering it from the ISS mains, and another for powering it via laptop power. There is one for fault finding and diagnostics, and one for getting files off the Astro Pi for downlink to Earth.

The tests used a surrogate crew to play the role of Tim Peake. Each procedure was run, and any anomalies or problems that caused a deviation from the procedure were noted.

The Astro Pi will run a Python program called the MCP (master control program*) and this oversees the running of the competition winning code from the students. It is designed to monitor how long each has run for, and ensures that each receives the allotted run time, despite the Astro Pi being, potentially, rebooted multiple times from single event upsets due to the radiation environment on the ISS.
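The MCP itself hasn't been published here, but the bookkeeping described can be sketched as a loop that persists each program's accumulated run time to disk after every step, so a reboot can never lose more than one step's worth of accounting. The file name and structure below are assumptions, not the real MCP's format:

```python
import json
import os
import time

STATE_FILE = "runtime_state.json"  # hypothetical; the real MCP's format isn't published

def run_with_budget(name, allotted_seconds, step, state):
    """Call step() repeatedly until `name` has used its allotted run time,
    saving the elapsed total after every step so it survives a reboot."""
    while state.get(name, 0.0) < allotted_seconds:
        start = time.monotonic()
        step()
        state[name] = state.get(name, 0.0) + (time.monotonic() - start)
        with open(STATE_FILE, "w") as f:
            json.dump(state, f)
```

On a restart, the MCP would reload the saved state and resume whichever program still has budget remaining.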

There were a couple of minor issues found, and we’re required to repeat one of the tests again in September. But otherwise everything worked successfully.

All the test reports are then combined into a Flight Safety Data Pack (FSDP). This also includes a flammability assessment which is an examination of all materials used in the payload and their risk of being a flame propagation path on the ISS. The main heavy lifting with the FSDP documentation was done by Surrey Satellite Technology, whom we’re eternally grateful to.

Thanks for reading if you made it this far! Next mission update will be after we’ve handed over the final payload.

The post Astro Pi: Mission Update 5 – flight safety testing appeared first on Raspberry Pi.

Education Summit Videos from EuroPython

via Raspberry Pi

At the end of July, a subset of our Education Team went to Bilbao in Spain for EuroPython. As well as giving a number of talks at the conference, we’d arranged with the organising committee to run an Education Summit and invite teachers along.


On the Thursday of the conference, we had a day of Education talks lined up, starting with Carrie Anne who gave the opening keynote, ‘Education: A Python solution‘:

(see the slides)

We were lucky enough to meet the creator and BDFL of Python, Guido van Rossum, who also gave a keynote.


Watch my talk on ‘Physical Computing with Python and Raspberry Pi‘:

(see the slides)

Watch James’s first talk, ‘Raspberry Pi Weather Station‘:

(see the slides)

Watch James’s second talk, ‘Pycon – A teacher’s perspective‘:

(see the slides)

Alex Bradbury also gave a lightning talk on Pyland – a project he’s working on with a group of interns at the Cambridge Computer Lab. Pyland is a game designed for children to learn Python as a way to progress in the game. Watch out for more on Pyland next month!


The conference had two Raspberry Pi powered arcade machines and all the TV screens showing the talks schedule were running on Pis too!

At the weekend’s sprints we had a team of developers working on PyGame Zero, and as part of the Education Summit we ran an intro session for teachers.

The Pyjokes Society

This year, EuroPython was run by the organisers of PySS, a conference in San Sebastian, with a brilliant team of volunteers who helped make it all happen – they did a great job.


A huge thanks to the EuroPython team!

The post Education Summit Videos from EuroPython appeared first on Raspberry Pi.

Astro Pi: Mission Update 4

via Raspberry Pi


Just over a week ago we closed the Secondary School phase of the Astro Pi competition, after a one-week extension to the deadline. Students from all over the UK have uploaded their code hoping that British ESA Astronaut Tim Peake will run it on the ISS later this year!

Last week folks from the leading UK Space companies, the UK Space Agency and ESERO UK met with us at Pi Towers in Cambridge to do the judging. We used the actual flight Astro Pi units to test run the submitted code. You can see one of them on the table in the picture below:

The standard of entries was incredibly high and we were blown away by how clever some of them were!

Doug Liddle of SSTL said:

“We are delighted that the competition has reached so many school children and we hope that this inspires them to continue coding and look to Space for great career opportunities”

British ESA Astronaut Tim Peake – photo provided by UK Space Agency under CC BY-ND

Jeremy Curtis, Head of Education at the UK Space Agency, said:

“We’re incredibly impressed with the exciting and innovative Astro Pi proposals we’ve received and look forward to seeing them in action aboard the International Space Station.

Not only will these students be learning incredibly useful coding skills, but will get the chance to translate those skills into real experiments that will take place in the unique environment of space.”

When Tim Peake flies to the ISS in December he will have the two Astro Pis in his personal cargo allowance. He’ll also have 10 especially prepared SD cards which will contain the winning applications. Time is booked into his operations schedule to deploy the Astro Pis and set the code running and afterwards he will recover any output files created. These will then be returned to their respective owners and made available online for everyone to see.

Code was received for all secondary school key stages and we even have several from key stage 2 primary schools. These were judged along with the key stage 3 entries. So without further ado, here’s a breakdown of who won and what their code does:

Each of these programs has been assigned an operational code name that will be used when talking about them over the space-to-ground radio. These are essentially arbitrary, so don’t read into them too much!

Ops name: FLAGS

  • School: Thirsk School
  • Team name: Space-Byrds
  • Key stage: 3
  • Teacher: Dan Aldred
  • The judges had a lot of fun with this. Their program uses telemetry data provided by NORAD along with the Real Time Clock on the Astro Pi to computationally predict the location of the ISS (so it doesn’t need to be online). It then works out what country that location is within and shows its flag on the LED matrix along with a short phrase in the local language.

Ops name: MISSION CONTROL

  • School: Cottenham Village College
  • Team name: Kieran Wand
  • Key stage: 3
  • Teacher: Christopher Butcher
  • Kieran’s program is an environmental system monitor and could be used to cross check the ISS’s own life support system. It continually measures the temperature, pressure and humidity and displays these in a cycling split-screen heads up display. It has the ability to raise alarms if these measurements move outside of acceptable parameters. We were especially impressed that code had been written to compensate for thermal transfer between the Pi CPU and Astro Pi sensors.
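The alarm logic can be sketched as a simple range check; the limits below are illustrative guesses, not the values Kieran's program uses:

```python
# Illustrative limits only (degrees C, millibars, % relative humidity).
LIMITS = {"temperature": (18.0, 28.0), "pressure": (979.0, 1027.0), "humidity": (30.0, 70.0)}

def check_alarms(readings):
    """Return the names of any sensors whose reading is out of range."""
    return [sensor for sensor, value in readings.items()
            if not LIMITS[sensor][0] <= value <= LIMITS[sensor][1]]

print(check_alarms({"temperature": 31.2, "pressure": 1008.0, "humidity": 45.0}))  # ['temperature']
```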

Andy Powell of the Knowledge Transfer Network said:

“All of the judges were impressed by the quality of work and the effort that had gone into the winning KS3 projects and they produced useful, well thought through and entertaining results”

Ops name: TREES

  • School: Westminster School
  • Team name: EnviroPi
  • Key stage: 4 (and equivalent)
  • Teacher: Sam Page
  • This entry will be run in the cupola module of the ISS with the Astro Pi NoIR camera pointing out of the window. The aim is to take pictures of the ground and to later analyse them using false colour image processing. This will produce a Normalised Difference Vegetation Index (NDVI) for each image, which is a measure of plant health. They have one piece of code which will run on the ISS to capture the images and another that will run on the ground after the mission to post-process and analyse the images captured. They even tested their code by going up in a light aircraft to take pictures of the ground!
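The NDVI calculation itself is a one-liner per pixel, comparing near-infrared and visible reflectance:

```python
def ndvi(nir, vis):
    """Normalised Difference Vegetation Index: near +1 is lush, near 0 is bare."""
    total = nir + vis
    return (nir - vis) / total if total else 0.0

# Healthy vegetation reflects strongly in near-infrared:
print(round(ndvi(200, 50), 2))  # 0.6
```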

Ops name: REACTION GAMES

  • School: Lincoln UTC
  • Team name: Team Terminal
  • Key stage: 4 (and equivalent)
  • Teacher: Mark Hall
  • These students have made a whole suite of various reaction games complete with a nice little menu system to let the user choose. The games also record your response times with the eventual goal to investigate how crew reaction time changes over the course of a long term space flight. This entry caused all work to cease during the judging for about half an hour!

Lincoln UTC have also won the prize for the best overall submission in the Secondary School competition. This earns them a photograph of their school taken from space by an Airbus or SSTL satellite. Go and make a giant space invader please!

Ops name: RADIATION

  • School: Magdalen College School
  • Team name: Arthur, Alexander and Kiran
  • Key stage: 5 (and equivalent)
  • Teacher: Dr Jesse Petersen
  • This team have successfully made a radiation detector using the Raspberry Pi camera module, the possibility of which was hinted at in our Astro Pi animation video from a few months ago. The camera lens is blanked off to prevent light from getting in, but high-energy space radiation still gets through. Due to the design of the camera, the sensor sees the impacts of these particles as tiny specks of light. The code then uses OpenCV to measure the intensity of these specks and produces an overall measurement of the radiation level.

What blew us away was that they had taken their Astro Pi and camera module along to the Rutherford Appleton Laboratory and fired a neutron cannon at it to test it was working!!!

The code can even compensate for dead pixels in the camera sensor. I am wondering if they killed some pixels with the neutron cannon and then had to add that code out of necessity? Brilliant.
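The core idea can be sketched without OpenCV: with the lens blanked off, any pixel well above the dark-frame noise floor counts as a hit. The students' real code measures speck intensity with OpenCV; this pure-Python sketch just thresholds 8-bit grey values.

```python
def count_specks(frame, threshold=50):
    """Count bright pixels in an otherwise dark frame as a crude proxy
    for radiation hits (threshold is an assumed noise floor)."""
    return sum(1 for row in frame for pixel in row if pixel > threshold)

dark_frame = [[3, 4, 2],
              [5, 220, 3],   # one hit
              [2, 4, 180]]   # another hit
print(count_specks(dark_frame))  # 2
```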

These winning programs will be joined on the ISS by the winners of the Primary School Competition which closed in April:

Ops name: MINECRAFT

  • School: Cumnor House Girl’s School
  • Team name: Hannah Belshaw
  • Key stage: 2
  • Teacher: Peter Kelly
  • Hannah’s entry is to log data from the Astro Pi sensors and to visualise it later using structures in a Minecraft world: columns of blocks represent environmental measurements, and a giant blocky model of the ISS itself (that moves) represents movement and orientation. The code was written, under Hannah’s guidance, by Martin O’Hanlon who runs Stuff About Code. The data logging program that will run on the ISS produces a CSV file that can be consumed later by the visualisation code to play back what happened when Tim Peake was running it in space. The code is already online here.
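A data logger producing a CSV file that a later visualiser can replay is only a few lines; the column layout here is a guess, not Hannah's actual schema:

```python
import csv
import io

FIELDS = ["timestamp", "temperature", "humidity", "pitch", "roll", "yaw"]  # hypothetical schema

def write_log(rows):
    """Serialise sensor readings to CSV text that a visualiser can replay later."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

log = write_log([{"timestamp": 0.0, "temperature": 22.4, "humidity": 41.0,
                  "pitch": 1.2, "roll": 0.3, "yaw": 180.0}])
print(log)
```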

Ops name: SWEATY ASTRONAUT

  • School: Cranmere Primary School
  • Team name: Cranmere Code Club
  • Key stage: 2
  • Teacher: Richard Hayler
  • Although they were entitled to have their entry coded by us at Raspberry Pi, the kids of the Cranmere Code Club are collectively writing their program themselves. The aim is to try to detect the presence of a crew member by monitoring the environmental sensors of the Astro Pi, particularly humidity. If a fluctuation is detected, it will scroll a message asking if someone is there. They even made a Lego replica of the Astro Pi flight case for their testing!
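Detecting a nearby astronaut from humidity alone could be as simple as comparing the latest reading against a running average; the 2 %RH jump threshold below is an assumption, not the Code Club's value:

```python
def detect_presence(history, latest, jump=2.0):
    """Flag a possible crew member if humidity jumps more than `jump` %RH
    above the average of recent readings (threshold is a guess)."""
    if not history:
        return False
    average = sum(history) / len(history)
    return latest - average > jump

recent = [40.1, 40.3, 40.2, 40.4]
print(detect_presence(recent, 44.9))  # True: someone is breathing nearby
print(detect_presence(recent, 40.5))  # False: normal drift
```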

Obviously the main winning prize is to have your code flown and run on the ISS. However, the UK Space companies also offered a number of thematic prizes which were awarded independently of the entries chosen to fly, so some crossover with the other winners was expected here.

  • Space Sensors
    Hannah Belshaw, from Cumnor House Girl’s School with her idea for Minecraft data visualisation.
  • Space Measurements
    Kieran Wand from Cottenham Village College for his ISS environment monitoring system.
  • Imaging and Remote Sensing
    The EnviroPi team from Westminster School with their experiment to measure plant health from space using NDVI images.
  • Space Radiation
    Magdalen College, Oxford with their Space Radiation Detector.
  • Data Fusion
    Nicole Ashworth, from Reading, for her weather reporting system; comparing historical weather data from the UK with the environment on the ISS.
  • Coding Excellence
    Sarah and Charlie Maclean for their multiplayer Labyrinth game.

Pat Norris of CGI said:

“It has been great to see so many schools getting involved in coding and we hope that this competition has inspired the next generation to take up coding, space systems or any of the many other opportunities the UK space sector offers. We were particularly impressed by the way Charlie structured his code, added explanatory comments and used best practice in developing the functionality”

We’re aiming to have all the code that was submitted to the competition on one of the ten SD cards that will fly. So your code will still fly even if it won’t be scheduled to be run in space. The hope is that, during periods of downtime, Tim may have a look through some of the other entries and run them manually. But this depends on a lot of factors outside of our control and so we can’t promise anything.

But wait, there’s more?

There is still opportunity for all schools to get involved with Astro Pi!

There will be an on-orbit activity during the mission (probably in January or February) that you can all do at the same time as Tim. After the competition-winning programs have all finished, the Astro Pi will enter a phase of flight data recording, just like the black box on an aircraft.

This will make the Astro Pi continually record everything from all its sensors and save the data into a file that you can get! If you set your Astro Pi up in the same way (the software will be provided by us) then you can compare his measurements with yours taken on the ground.

There is then a lot of educational value in looking at the differences and understanding why they occur. For instance, you could look at the accelerometer data to find out when ISS reboosts occurred, or study the magnetometer data to find out how the Earth’s magnetic field changes as the station orbits. A number of free educational resources will be provided to help you get the most out of this exercise.
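Spotting a reboost in the downloaded data could be as simple as flagging samples where the acceleration magnitude rises above the microgravity noise floor; the threshold and figures below are illustrative guesses:

```python
def find_reboosts(samples, threshold=0.02):
    """Return the indices of acceleration samples (in g) that exceed a
    threshold, a crude way to spot reboost burns in logged data."""
    return [i for i, a in enumerate(samples) if abs(a) > threshold]

log = [0.001, 0.002, 0.05, 0.06, 0.001]  # made-up magnitudes
print(find_reboosts(log))  # [2, 3]
```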

The general public can also get involved when the Sense HAT goes on general sale in a few weeks time.

Libby Jackson of the UK Space Agency said:

“Although the competition is over, the really exciting part of the project is just beginning. All of the winning entries will get to see their code run in space and thousands more can take part in real life space experiments through the Flight Data phase”


The post Astro Pi: Mission Update 4 appeared first on Raspberry Pi.

Raspberry Pi – Translator

via Wolf Paulus » Embedded | Wolf Paulus

Recently, I described how to perform speech recognition on a Raspberry Pi, using the on-device sphinxbase/pocketsphinx open source speech recognition toolkit. This approach works reasonably well, but achieves high accuracy only for a relatively small dictionary of words.

As that article showed, pocketsphinx works great on a Raspberry Pi for keyword spotting, for instance using your voice to launch an application. General-purpose speech recognition, however, is still best performed using one of the prominent web services.

Speech Recognition via Google Service

Google’s speech recognition and related services used to be accessible and easy to integrate. Recently, however, they have become much more restrictive and (hard to believe, I know) Microsoft is now the place to start when looking for decent speech-related services. Still, let’s start with Google’s Speech Recognition Service, which requires an FLAC (Free Lossless Audio Codec) encoded voice sound file.

Google API Key

Accessing Google’s speech recognition service requires an API key, available through the Google Developers Console.
I followed these instructions for Chromium Developers and while the process is a little involved, even intimidating, it’s manageable.
I created a project, named it TranslatorPi, and selected the following APIs for it:

Google Developers Console: Enabled APIs for this project

The important part is to create an API key for public access. On the left-side menu, select APIs & auth / Credentials. Here you can create the API key, a 40-character alphanumeric string.

Installing tools and required libraries

Back on the Raspberry Pi, only a few more libraries are needed in addition to what was installed for the on-device recognition project mentioned above.

sudo apt-get install flac
sudo apt-get install python-requests
sudo apt-get install python-pycurl

Testing Google’s Recognition Service from a Raspberry Pi

I have the same audio setup as previously described, now allowing me to capture a FLAC encoded test recording like so:
arecord -D plughw:0,0 -f cd -c 1 -t wav -d 0 -q -r 16000 | flac - -s -f --best --sample-rate 16000 -o test.flac
…which makes a high-quality, WAV-type recording and pipes it into the FLAC encoder, which outputs ./test.flac

The following bash script will send the flac encoded voice sound to Google’s recognition service and display the received JSON response:

#!/bin/bash
# parameter 1 : file name, containing the FLAC encoded voice recording
 
echo Sending FLAC encoded Sound File to Google:
key='<TRANSLATORPI API KEY>'
url='https://www.google.com/speech-api/v2/recognize?output=json&lang=en-us&key='$key
curl -i -X POST -H "Content-Type: audio/x-flac; rate=16000" --data-binary @$1 $url
echo '..all done'
Speech Recognition via Google Service – JSON Response

{
  "result": [
    {
      "alternative": [
        {
          "transcript": "today is Sunday",
          "confidence": 0.98650438
        },
        {
          "transcript": "today it's Sunday"
        },
        {
          "transcript": "today is Sundy"
        },
        {
          "transcript": "today it is Sundy"
        },
        {
          "transcript": "today he is Sundy"
        }
      ],
      "final": true
    }
  ],
  "result_index": 0
}

More details about accessing the Google Speech API can be found here: https://github.com/gillesdemey/google-speech-v2
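Rather than scraping the transcript out with string searches, the response can be parsed as JSON. One wrinkle (an observation about the v2 endpoint that's worth verifying yourself): it returns one JSON object per line and the first is often an empty result, so skip ahead to the first non-empty one:

```python
import json

def best_transcript(response_text):
    """Return the top transcript from a Google Speech API v2 response."""
    for line in response_text.splitlines():
        if not line.strip():
            continue
        result = json.loads(line).get("result")
        if result:
            return result[0]["alternative"][0]["transcript"]
    return None

sample = ('{"result":[]}\n'
          '{"result":[{"alternative":[{"transcript":"today is Sunday",'
          '"confidence":0.98650438}],"final":true}],"result_index":0}')
print(best_transcript(sample))  # today is Sunday
```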

Building a Translator

Encoding doesn’t take long, and the Google Speech Recognizer is the fastest in the industry: the transcription is available swiftly, and we can send it for translation to yet another web service.

Microsoft Azure Marketplace

Creating an account at the Azure Marketplace is a little easier, and the My Data section shows that I have subscribed to the free translation service, providing me with 2,000,000 characters/month. Again, I named my project TranslatorPi. On the ‘Developers‘ page, under ‘Registered Applications‘, take note of the Client ID and Client secret; both are required for the next step.

Strategy

With the speech recognition from Google and text translation from Microsoft, the strategy to build the translator looks like this:

  • Record voice sound, FLAC encode it, and send it to Google for transcription
  • Use Google’s Speech Synthesizer and synthesize the recognized utterance.
  • Use Microsoft’s translation service to translate the transcription into the target language.
  • Use Google’s Speech Synthesizer again, to synthesize the translation in the target language.

For my taste, that’s a little too much for a shell script, so I use the following Python program instead:

# -*- coding: utf-8 -*-
import json
import requests
import urllib
import subprocess
import argparse
import pycurl
import StringIO
import os.path
 
 
def speak_text(language, phrase):
    # URL-encode the phrase so spaces and non-ASCII characters survive the query string
    tts_url = "http://translate.google.com/translate_tts?tl=" + language + "&q=" + urllib.quote(phrase)
    subprocess.call(["mplayer", tts_url], shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
 
def transcribe():
    key = '[Google API Key]'
    stt_url = 'https://www.google.com/speech-api/v2/recognize?output=json&lang=en-us&key=' + key
    filename = 'test.flac'
    print "listening .."
    # record 3 seconds of mono 16 kHz audio and pipe it straight into the FLAC encoder
    os.system(
        'arecord -D plughw:0,0 -f S16_LE -c 1 -r 16000 -t wav -d 3 -q | flac - -s -f --best --sample-rate 16000 -o ' + filename)
    print "interpreting .."
    # send the file to google speech api
    c = pycurl.Curl()
    c.setopt(pycurl.VERBOSE, 0)
    c.setopt(pycurl.URL, stt_url)
    fout = StringIO.StringIO()
    c.setopt(pycurl.WRITEFUNCTION, fout.write)
 
    c.setopt(pycurl.POST, 1)
    c.setopt(pycurl.HTTPHEADER, ['Content-Type: audio/x-flac; rate=16000'])
 
    file_size = os.path.getsize(filename)
    c.setopt(pycurl.POSTFIELDSIZE, file_size)
    fin = open(filename, 'rb')
    c.setopt(pycurl.READFUNCTION, fin.read)
    c.perform()
 
    response_data = fout.getvalue()
 
    # crude extraction: pull the first "transcript" value out of the raw JSON
    start_loc = response_data.find("transcript")
    temp_str = response_data[start_loc + 13:]
    end_loc = temp_str.find('"')
    final_result = temp_str[:end_loc]
    c.close()
    return final_result
 
 
class Translator(object):
    oauth_url = 'https://datamarket.accesscontrol.windows.net/v2/OAuth2-13'
    translation_url = 'http://api.microsofttranslator.com/V2/Ajax.svc/Translate?'
 
    def __init__(self):
        oauth_args = {
            'client_id': 'TranslatorPi',
            'client_secret': '[Microsoft Client Secret]',
            'scope': 'http://api.microsofttranslator.com',
            'grant_type': 'client_credentials'
        }
        oauth_junk = json.loads(requests.post(Translator.oauth_url, data=urllib.urlencode(oauth_args)).content)
        self.headers = {'Authorization': 'Bearer ' + oauth_junk['access_token']}
 
    def translate(self, origin_language, destination_language, text):
        # transliterate German umlauts so the TTS URL stays plain ASCII
        german_umlauts = {
            ord(u'ä'): u'ae',
            ord(u'ö'): u'oe',
            ord(u'ü'): u'ue',
            ord(u'ß'): u'ss',
        }
 
        translation_args = {
            'text': text,
            'to': destination_language,
            'from': origin_language
        }
        translation_result = requests.get(Translator.translation_url + urllib.urlencode(translation_args),
                                          headers=self.headers)
        translation = translation_result.text[2:-1]
        if destination_language.lower() == 'de':
            translation = translation.translate(german_umlauts)
        print "Translation: ", translation
        speak_text(origin_language, 'Translating ' + text)
        speak_text(destination_language, translation)
 
 
if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Raspberry Pi - Translator.')
    parser.add_argument('-o', '--origin_language', help='Origin Language', required=True)
    parser.add_argument('-d', '--destination_language', help='Destination Language', required=True)
    args = parser.parse_args()
    while True:
        Translator().translate(args.origin_language, args.destination_language, transcribe())
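Saved under a name of your choice (translator.py is just an assumption here), the program is started with the origin and destination language codes, e.g. `python translator.py -o en -d de`, and then loops, listening and translating until interrupted. The flag handling can be tried in isolation:

```python
import argparse

# Same parser as at the bottom of the translator program
parser = argparse.ArgumentParser(description='Raspberry Pi - Translator.')
parser.add_argument('-o', '--origin_language', help='Origin Language', required=True)
parser.add_argument('-d', '--destination_language', help='Destination Language', required=True)

# Simulate the command line: translate English speech into German
args = parser.parse_args(['-o', 'en', '-d', 'de'])
print(args.origin_language, args.destination_language)   # en de
```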

Testing the $35 Universal Translator

Here are a few test sentences for the translator app, going from English to Spanish or English to German:

  • How are you today?
  • What would you recommend on this menu?
  • Where is the nearest train station?
  • Thanks for listening.

Live Demo

This video shows the Raspberry Pi running the translator, using web services from Google and Microsoft for speech recognition, speech synthesis, and translation.