Monthly Archives: April 2021

Star Wars Arcade Cabinet | The MagPi #105

via Raspberry Pi

Why pay over the odds when you can build an accurate replica, and have fun doing it? For the latest issue of The MagPi Magazine, Rob Zwetsloot switches off his targeting computer to have a look.

Header of the arcade cabinet bearing a Star Wars logo. Art had to be rescaled, but it’s been done faithfully

Getting the arcade machine of your dreams gets a little harder every day, especially the older they are. Making one, however, is always possible if you have the right skills and a Raspberry Pi.

“My project was to build a replica, or as close as I could reasonably manage, of the Atari Star Wars arcade cabinet,” James Milroy tells us. “I really wanted to build a cockpit as that’s what I played on in the eighties, but sadly I didn’t have the room to house it, so the compromise was to build a stand-up cabinet instead.”

The workings were simple when it came down to it: a Raspberry Pi 3B+ with a Pimoroni Picade X HAT. This provides a power switch, audio amp, buttons, and a joystick if necessary. The replica yoke is interfaced with a USB adapter from the same company.

Even then, the standard cabinet has a lot of detail, and James really nailed the look of it. Why build it from scratch, though? “Initially, I had toyed with sourcing an original cabinet and restoring it, but soon gave up on that idea after finding it nigh on impossible to source a cabinet here in the UK,” James explains. “Almost all cabinets for sale were located in the USA, so they were out of the question due to the high cost of shipping. Atari only made just over 12,500 cabinets worldwide, so their rarity meant that they commanded top dollar, effectively putting them out of my price range. It was at this point that I decided that if it was going to happen, then I would have to make it myself.”

Full-length shot of the Star Wars arcade cabinet

Making a cabinet is hard enough, but the control system would have to be an original Atari yoke. “The Atari yoke is considered the ‘holy grail’ of controllers and, again, is very hard to find,” James says. “My prayers were answered in October 2018 when a thread on a forum I was subscribed to popped up with a small Utah-based startup aiming to supply replica yokes at a realistic price to the arcade community. I grabbed two of these (one for my friend) and the project was on.”

Good feeling

When it came to actually emulating the game, for James there was only one choice: “My decision to go with a Raspberry Pi was a no-brainer really. I had previously made a bartop cabinet using a Raspberry Pi 3 and RetroPie/EmulationStation which I was really pleased with. So I had a platform that I already had experience with and knew was more than capable of emulating the one game I needed to run. Besides, the simplicity and low cost of the ecosystem for Raspberry Pi far outweighs the extra expense and effort required going down the PC route.”

The riser was a custom build by James that emulates lights from the film

With a custom build and emulation, authenticity of the gameplay experience could be a bit off. However, that’s not the case here. “I think that it plays just like the real arcade machine mainly due to the inclusion of the replica yoke controller, and adding your credit by pressing the button on the coin door,” says James. “Ideally a vector monitor or a CRT would go a long way to making it look just like the original, but a reasonable representation is possible on an LCD using shaders and anti-aliasing. Gameplay does seem to get really hard really quick, though; this could be due to an imperfect emulation, but is more likely due to my reactions having dulled somewhat in the last 38 years!”

Always in motion

While the current build is amazing as it is, James does have some ideas to improve it. “Overall, I’m really pleased with the way the cabinet has worked out,” he says. “I will be replacing Raspberry Pi 3B+ with a Raspberry Pi 4 to enable me to run a newer version of MAME which will hopefully offer a better emulation, sort some audio glitching I get with my current setup, and hopefully enable some graphical effects (such as bloom and glow) to make it look more like it’s running on a CRT.”

Get your copy of The MagPi #105 now!

You can grab the brand-new issue right now online from the Raspberry Pi Press store, or via our app on Android or iOS. You can also pick it up from supermarkets and newsagents, but make sure you do so safely while following all your local guidelines. There’s also a free PDF you can download.

The post Star Wars Arcade Cabinet | The MagPi #105 appeared first on Raspberry Pi.

Arduino-controlled gas mixing device fills DIY laser tubes

via Arduino Blog

Lasers come in two varieties: solid-state and gas tube. As the name suggests, the latter contain gas: a mixture of gases in precise proportions. To fill his DIY laser tube, Cranktown City built an Arduino-controlled gas mixer.

This device has an Arduino Uno board that drives three relay modules. The first relay switches power to a gas pump, the second relay controls an output valve, and the third relay controls an input valve. A push button starts the pumping process. The pump turns on and the input valve opens. Gas from a storage tank is pumped into an inflatable bag. Once the bag is full, as detected by a limit switch, the two valves flip and the gas pumps into the laser tube.
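To make the sequencing concrete, here is a minimal Arduino sketch of the fill-and-empty cycle described above. The pin assignments, relay polarity, and emptying delay are assumptions for illustration, not Cranktown City’s actual firmware.

```cpp
// Hypothetical pin map: three relays, a start button, and the
// bag-full limit switch. All numbers are invented for this sketch.
const int PUMP_RELAY   = 2;  // relay 1: gas pump power
const int OUT_VALVE    = 3;  // relay 2: output valve (to laser tube)
const int IN_VALVE     = 4;  // relay 3: input valve (from storage tank)
const int START_BUTTON = 5;  // momentary push button, wired to ground
const int BAG_FULL_SW  = 6;  // limit switch tripped by the inflated bag

void setup() {
  pinMode(PUMP_RELAY, OUTPUT);
  pinMode(OUT_VALVE, OUTPUT);
  pinMode(IN_VALVE, OUTPUT);
  pinMode(START_BUTTON, INPUT_PULLUP);
  pinMode(BAG_FULL_SW, INPUT_PULLUP);
}

void loop() {
  if (digitalRead(START_BUTTON) == LOW) {        // button pressed
    // Fill phase: open the input valve and run the pump until the
    // bag trips the limit switch.
    digitalWrite(IN_VALVE, HIGH);
    digitalWrite(PUMP_RELAY, HIGH);
    while (digitalRead(BAG_FULL_SW) == HIGH) {}  // wait for a full bag
    // Empty phase: flip the two valves so the bag's known volume is
    // pushed into the laser tube instead.
    digitalWrite(IN_VALVE, LOW);
    digitalWrite(OUT_VALVE, HIGH);
    delay(5000);                                 // assumed emptying time
    // Shut everything off; one "pour" is complete.
    digitalWrite(PUMP_RELAY, LOW);
    digitalWrite(OUT_VALVE, LOW);
  }
}
```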

Cranktown City knows the exact volume of the inflatable bag, so he knows how much gas has been pumped into the laser tube each time the device runs. Like mixing a cocktail, this lets him “pour” each part of the gas mixture into the laser tube until he ends up with the correct proportions.

The gas pump, Arduino, relays, and inflatable bag are all enclosed within a heavy duty case made from steel sheet cut on a plasma table. The resulting mixer is portable and robust enough to stand up to the abuse of a shop environment. With this device, Cranktown City can continue developing his DIY laser tube — a project we can’t wait to see completed.

The post Arduino-controlled gas mixing device fills DIY laser tubes appeared first on Arduino Blog.

Ped Dead Reckoning II: This Time it’s Inertial

via SparkFun: Commerce Blog

When last I left you on my journey to nail down non-GPS absolute positioning and navigation, you may recall that I was finding some difficulties maintaining accuracy, and I was not alone in my findings on this quest. Engineering reviews and master’s theses made it quite clear that I am not the only one having a rough time making this happen, but that certainly isn’t keeping me from continuing my search. So here in Part II I’ll share what I’ve learned over the past few weeks (spoiler alert: I wander from pedestrian dead reckoning (PDR) down any avenue I can find where I might gain some insight into non-GPS inertial navigation).

I did it for the first article, so I felt it was only right to do it again for this one.

In expanding my search, I came across a few other communities that also continue to work on this dilemma. The first was a group of R/C hobbyists working with submarines. I know that mariners, both above and below the surface, have been using inertial navigation systems with success for quite some time. I found a fair number of semi-autonomous underwater vehicles, including some hobby-level, Arduino-driven submarines, like this scratch-built beauty from John Redearth. However, the thing about R/C hobbyist submariners is that they enjoy controlling their submarines, so while some of them have certain levels of autonomy built in (usually for the ballast system), inertial navigation at the hobby level is fairly rare. Even further up the scale, projects like DARPA’s Hydra project and the ensuing Orca project by Boeing have shown that accurate inertial navigation without GPS is still an incredibly tough nut to crack. DARPA has been working on inertial navigation in unmanned underwater vehicles since 1988, and since the concept is the same whether in an underwater vehicle or on a pedestrian, I can’t feel too bad about not nailing it down in the two months I’ve been working on it.

Boeing's Echo Voyager uses a Kalman-filtered inertial navigation unit supported by Doppler velocity logs.
(Photo courtesy of Boeing Co.)

The real issue continues to be drift. For pedestrian navigation systems, zero velocity updates (ZUPTs) have been used to good effect to help minimize such drift. Basically, a ZUPT zeroes the velocity estimate with each perceived step. This seems to work to a certain degree for walking in a straight line at a regular gait, or while standing still. However if, as in my original foray down this twisted path, we are talking about firefighters in emergency situations, neither of the aforementioned scenarios is likely to occur for very long. Ideally, pedestrian dead reckoning would only be used for brief intervals when GPS lock was lost, but if you’re in a life-threatening emergency situation, hoping that you re-establish GPS lock isn’t a sound option.
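To make the idea concrete, here is a bare-bones C++ sketch of a ZUPT step, assuming a foot-mounted IMU sampled at 100 Hz. The detection thresholds are invented for illustration and would need tuning on real hardware; orientation tracking and gravity compensation are omitted entirely.

```cpp
#include <cmath>

struct ImuSample { float ax, ay, az, gx, gy, gz; };  // m/s^2 and rad/s

const float G        = 9.81f;  // gravity magnitude
const float ACC_TOL  = 0.3f;   // |a| must be within this of gravity
const float GYRO_TOL = 0.1f;   // foot is barely rotating while planted
const float DT       = 0.01f;  // 100 Hz sample period

float vx = 0, vy = 0, vz = 0;  // integrated velocity estimate

// Stance detection: total acceleration near gravity, almost no rotation.
bool isStancePhase(const ImuSample& s) {
  float accMag  = std::sqrt(s.ax * s.ax + s.ay * s.ay + s.az * s.az);
  float gyroMag = std::sqrt(s.gx * s.gx + s.gy * s.gy + s.gz * s.gz);
  return std::fabs(accMag - G) < ACC_TOL && gyroMag < GYRO_TOL;
}

void step(const ImuSample& s) {
  if (isStancePhase(s)) {
    // The foot is planted, so any accumulated velocity is pure drift:
    // zero it out. That reset is the whole trick behind ZUPT.
    vx = vy = vz = 0;
  } else {
    // Swing phase: integrate acceleration as usual.
    vx += s.ax * DT;
    vy += s.ay * DT;
    vz += (s.az - G) * DT;
  }
}
```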

Drifting is great for your trike, but for PDR, not so much.

Another group that has more recently joined the hunt for the perfect inertial navigation solution is the engineers working on VR hardware. If you’ve spent any time in a Sony, HTC, Oculus, or any other decent brand of VR headset, then you know how attuned they are to your movements. Turn your head left or right, and your view pans left or right. Tilt your head up or down, and the view tilts accordingly. Walk forward three steps… well, unless you’re pushing the accompanying “walk” button, your view doesn’t change. Oliver Kreylos, a researcher with the Department of Earth and Planetary Sciences at UC Davis, discusses this in a good video here. An idea he suggests, which has been suggested many times by many people, including me and a number of commenters on my last post, is to add fixed-point transceivers, allowing the module on the user to read signal strength from the multiple fixed points, adding another set of data points to help with accuracy. He also makes it clear that without that, inertial tracking is, in his own words, “a no-go.”
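For the curious, the signal-strength idea usually rests on the log-distance path-loss model: a beacon’s RSSI falls off predictably with range, so each fixed transceiver yields a distance estimate, and three or more estimates can be trilaterated. A toy C++ version follows; the calibration constants are typical textbook values, not measurements from any real beacon.

```cpp
#include <cmath>

// Estimate range from received signal strength using the standard
// log-distance path-loss model: d = 10^((RSSI_1m - RSSI) / (10 * n)).
float rssiToMetres(float rssiDbm) {
  const float rssiAtOneMetre = -50.0f;  // calibrated RSSI at 1 m (assumed)
  const float pathLossExp    = 2.5f;    // ~2 in free space, higher indoors
  return std::pow(10.0f, (rssiAtOneMetre - rssiDbm) / (10.0f * pathLossExp));
}
```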

Okay, so there are only so many options when visualizing inertial navigation drift.

I did find a small group of engineers out of Israel, more specifically from the Faculty of Electrical Engineering at the Israel Institute of Technology and the Department of Marine Technology at the University of Haifa, who are doing what most of us wish we could do when stumbling on a project: throw more sensors at it! Their piece, “Multiple Inertial Measurement Units — An Empirical Study”, is an extremely in-depth look at improving inertial navigation accuracy by increasing the number of sensors used. For this paper, they’re using 32 six-DoF IMUs for a total of 192 inertial sensors! I’ve read one or two other research papers that used multiple IMUs to try to improve accuracy in pedestrian dead reckoning, usually on legs and arms, but nothing on this large a scale. If you’re really interested in utilizing this technology, this paper is a very good read.
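The intuition for why stacking sensors helps: averaging N independent, equally noisy readings cuts the noise standard deviation by roughly a factor of sqrt(N). The toy sketch below shows nothing more than that plain mean; the paper’s actual fusion is far more sophisticated.

```cpp
#include <cstddef>

const std::size_t NUM_IMUS = 32;  // matches the paper's array size

struct Reading { float ax, ay, az; };

// Average the accelerometer readings from all IMUs sampled at the
// same instant. With 32 units, noise drops by roughly sqrt(32) ~ 5.7x.
Reading averageImus(const Reading imus[NUM_IMUS]) {
  Reading avg = {0.0f, 0.0f, 0.0f};
  for (std::size_t i = 0; i < NUM_IMUS; ++i) {
    avg.ax += imus[i].ax;
    avg.ay += imus[i].ay;
    avg.az += imus[i].az;
  }
  avg.ax /= NUM_IMUS;
  avg.ay /= NUM_IMUS;
  avg.az /= NUM_IMUS;
  return avg;
}
```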

Suffice it to say, accurate PDR without the aid of GPS isn’t a problem I’m going to be solving any time soon. I am pleased to see that it is being worked on in an ever-increasing range of technologies, by an ever-enlarging pool of engineers. But even with how far we’ve come, our automotive dead reckoning can only maintain accurate positioning after GNSS loss for about 90 seconds. And really, if I’m to be completely honest, if I’m going to put multiple IMUs on myself, it most likely won’t be for dead reckoning but for a sweet motion capture suit.


The more I learn, the less I realize I know. ~ Socrates (also me, on a daily basis).


Meet SeedGerm: a Raspberry Pi-based platform for automated seed imaging

via Raspberry Pi

Researchers at the John Innes Centre for plant and microbial science were looking for a cost-effective phenotyping platform for automated seed imaging. They figured machine learning-driven image analysis was the quickest way to deliver this essential, yet challenging, aspect of agricultural research. Sounds complicated, but they found that our tiny computers could handle it all.

Two types of SeedGerm hardware with wired and wireless connectivity used for acquiring seed germination image series for different crop species

What is phenotyping?

A phenotype is an organism’s observable characteristics, like growing towards the light, or having a stripy tail, or being one of those people who can make their tongue roll up. An organism’s phenotype is the result of the genetic characteristics it has – its genotype – and the environment in which it lives. For example, a plant’s genotype might mean it can grow quickly and become tall, but if its environment lacks water, it’s likely to have a slow-growing and short phenotype.

Phenotyping means finding out and recording particular aspects of an organism’s phenotype: for example, how fast seeds germinate, or how broad a plant’s leaves are.

Why do seeds need phenotyping?

Phenotyping allows us to guess at a seed’s genotype, based on things we can observe about the seed’s phenotype, such as its size and shape.

We can study which seed phenotypes appear to be linked to desirable crop phenotypes, such as a high germination rate, or the ability to survive in dry conditions; in other words, we can make predictions about which seeds are likely to grow into good crops. And if we have controlled the environment in which we’re doing this research, we can be reasonably confident that these “good” seed phenotypes are mostly due not to variation in environmental conditions, but to properties of the seeds themselves: their genotype.

A close-up of the incubators, each with a Raspberry Pi computer on top, running the show

Growers need seeds that germinate effectively and uniformly to maximise crop productivity, so seed suppliers are interested in making sure their samples meet a certain germination rate.

The phenotypic traits that are used to work out whether seeds are likely to be good for growers are listed in the full research paper. But in general, researchers are looking for things like width, length, roundness, and contour lines in seeds.
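As an illustration only (this is not the SeedGerm team’s code), traits like these fall naturally out of contour analysis. The OpenCV sketch below measures width, length, and roundness per seed, where roundness is the classic 4πA/P² measure: 1.0 for a perfect circle, smaller for elongated shapes.

```cpp
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

// Segment seeds from a greyscale image and print simple shape traits.
void measureSeeds(const cv::Mat& grey) {
  cv::Mat mask;
  cv::threshold(grey, mask, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);

  std::vector<std::vector<cv::Point>> contours;
  cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

  for (const auto& c : contours) {
    double area      = cv::contourArea(c);
    double perimeter = cv::arcLength(c, true);
    if (area < 50.0) continue;             // skip specks of noise
    cv::Rect box = cv::boundingRect(c);    // gives width and length
    double roundness = 4.0 * CV_PI * area / (perimeter * perimeter);
    std::printf("seed %dx%d px, roundness %.2f\n",
                box.width, box.height, roundness);
  }
}
```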

How does Raspberry Pi help?

Gathering observations for phenotyping is a difficult and time-consuming process, and in order to capture high-quality seed images continuously, the team needed to design two types of hardware apparatus. Raspberry Pi computers (Raspberry Pi 2 Model B or Raspberry Pi 3 Model B+) power both SeedGerm hardware designs, with a Raspberry Pi camera also providing image data in the lower-cost design.
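A rough sketch of what the acquisition side might look like, assuming the camera is exposed as a device OpenCV can open; the real SeedGerm software and its capture interval may well differ.

```cpp
#include <opencv2/opencv.hpp>
#include <chrono>
#include <thread>

int main() {
  cv::VideoCapture cam(0);  // Raspberry Pi camera via V4L2 (assumed)
  if (!cam.isOpened()) return 1;

  for (int frame = 0; ; ++frame) {
    cv::Mat img;
    if (cam.read(img)) {
      // Zero-padded names keep the germination series in order.
      cv::imwrite(cv::format("seed_%05d.png", frame), img);
    }
    // One image every 15 minutes is ample for a germination timelapse.
    std::this_thread::sleep_for(std::chrono::minutes(15));
  }
}
```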

The open source software at work next to one of the mini seed incubators

The brilliant team behind this project recognised the limitations of current seed imaging approaches, and looked to explore how automating the analysis of seed germination could scale up their work in an affordable way. The SeedGerm system benefits from the cost-effectiveness of Raspberry Pi hardware and the open source software the team chose, and that makes us super happy.

Read the whole research paper, published in New Phytologist, here.

Raspberry Pi in biological sciences

Dr Jolle Jolles, a behavioural ecologist at the Center for Ecological Research and Forestry Applications (CREAF) near Barcelona, Spain, and a passionate Raspberry Pi user, has recently published a detailed review of the uptake of Raspberry Pi in biological sciences. He found that well over a hundred published studies have made use of Raspberry Pi hardware in some way.

The post Meet SeedGerm: a Raspberry Pi-based platform for automated seed imaging appeared first on Raspberry Pi.

Playing Connect Four against a mini-golfing AI opponent

via Arduino Blog

Have you ever dreamed of combining two incredible activities, mini-golf and Connect Four, into the same game? Well, one daring maker set out to do just that. Bithead’s innovative design involves a mini-golf surface with seven holes at the end, corresponding to the board’s seven columns. The system keeps track of where each golf ball is with an array of 42 color sensors, each connected to one of seven I2C multiplexers, all leading to a single Arduino Uno.
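That sensor fan-out deserves a closer look. A common way to share many I2C devices on one bus is a TCA9548A-style multiplexer, where writing a single byte selects which downstream channel is connected. The sketch below assumes that part at addresses 0x70 to 0x76 with six sensors per mux; Bithead’s actual parts and wiring may differ.

```cpp
#include <Wire.h>

const uint8_t MUX_BASE_ADDR = 0x70;  // assumed first mux address

// Route the bus to one sensor: enable one channel on the chosen mux
// and disable every channel on the other six so addresses don't clash.
void selectSensor(uint8_t mux, uint8_t channel) {
  for (uint8_t m = 0; m < 7; ++m) {
    Wire.beginTransmission(MUX_BASE_ADDR + m);
    Wire.write(m == mux ? (1 << channel) : 0);
    Wire.endTransmission();
  }
}

void setup() {
  Wire.begin();
}

void loop() {
  // Scan all 42 board positions (7 muxes x 6 channels each).
  for (uint8_t mux = 0; mux < 7; ++mux) {
    for (uint8_t ch = 0; ch < 6; ++ch) {
      selectSensor(mux, ch);
      // ...read the color sensor behind this channel here...
    }
  }
}
```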

The player can select from six distinct levels of AI, all the way from random shots in the dark to Q-learning, which records previous game-winning moves to improve how it plays over time. The AI putts by first loading a golf ball into a chamber and then spinning up a pair of high-RPM motors that launch it. For the human player, a pair of dispensers on the left supplies the correct color of ball.
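At its heart, Q-learning boils down to a single update applied after every move: nudge the stored value of a state-action pair toward the reward received plus the discounted value of the best follow-up move. The fragment below shows just that rule; the learning rate, discount factor, and state encoding are assumptions, not details from Bithead’s game.

```cpp
// One Q-learning update: q is the current value of (state, action),
// reward is what the move earned, maxNextQ is the best value
// available from the resulting state.
float qLearnUpdate(float q, float reward, float maxNextQ) {
  const float alpha = 0.1f;  // learning rate (assumed)
  const float gamma = 0.9f;  // discount factor (assumed)
  return q + alpha * (reward + gamma * maxNextQ - q);
}
```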

The entire system runs on an Intel NUC that hosts the game, which was written in C#. There’s a large 22″ touchscreen at the front, mounted at eye level for easy interactions. Although it took Bithead nearly 18 months and $3,500, the end result is spectacular.

Be sure to check out his great write-up, which has a couple of demonstration videos and a build log. 

The post Playing Connect Four against a mini-golfing AI opponent appeared first on Arduino Blog.

Arduino Mbed Core for RP2040 Boards

via Arduino Blog

Arduino support for the Raspberry Pi RP2040 chip is available now using the official Arduino Mbed Core. This is obviously very important, and exciting, for the upcoming Arduino Nano RP2040 Connect. But it goes beyond the Arduino device to also bring support to other boards built around the RP2040 chip.

Arduino Core and Mbed OS

Arduino is no stranger to Mbed OS. It’s a hugely important platform and operating system in the IoT space, due to its support for Cortex-M microcontrollers and its real-time operating system capabilities. So a lot of work was done when Mbed OS was adopted for the Nano 33 BLE and Nano 33 BLE Sense devices. Since all the Mbed infrastructure and drivers were in place, we could easily support new Arduino boards with minimal effort, including the Portenta H7 and Nano RP2040 Connect.

Arduino Mbed Core for RP2040

This makes the Arduino Core plug-and-play, and an easy choice for getting your devices up and running quickly. We provide two cores: one for our Nano RP2040 Connect board, and one for other RP2040-based boards, including the Raspberry Pi Pico. As the core is based on Mbed OS, you can choose between using the Arduino API or Mbed’s.
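As a small illustration of mixing the two API styles, the sketch below blinks the LED with familiar Arduino calls while an Mbed RTOS thread runs alongside. It assumes a board where LED_BUILTIN is defined; it is a sketch of the idea, not official example code.

```cpp
#include <mbed.h>

rtos::Thread worker;  // Mbed API: a second thread of execution

void backgroundTask() {
  while (true) {
    // Housekeeping can run here without ever blocking loop().
    rtos::ThisThread::sleep_for(std::chrono::milliseconds(1000));
  }
}

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);  // Arduino API
  worker.start(backgroundTask);  // Mbed API
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);  // Arduino API blink
  delay(500);
  digitalWrite(LED_BUILTIN, LOW);
  delay(500);
}
```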

If you’d like to support any other RP2040 board and its custom features, you can do so with very little effort by cloning and tweaking the Arduino Mbed Core.

Ready for the Nano RP2040 Connect

All Arduino APIs are standardized, which means they can be used on all boards. If you have a sketch for your Nano 33 BLE, you can now upload it to a Nano RP2040 Connect and run it without making any changes.

In effect, this means you can create sketches for the new Arduino board even if you don’t have your hands on it yet. It also makes project upgrades very easy. Nor does the Arduino Core require a custom bootloader for RP2040 devices, as it uses the ROM-based bootloader from Raspberry Pi.

Check out the Arduino Mbed Core right here. And sign up to our Nano RP2040 Connect contact list for more news as it happens.

The post Arduino Mbed Core for RP2040 Boards appeared first on Arduino Blog.