
Raspberry Pi listening posts ‘hear’ the Borneo rainforest

These award-winning, solar-powered audio recorders, built on Raspberry Pi, have been installed in the Borneo rainforest so researchers can listen to the local ecosystem 24/7. The health of a forest ecosystem can often be gauged according to how much noise it creates, as this signals how many species are around.

And you can listen to the rainforest too! The SAFE Acoustics website, funded by the World Wide Fund for Nature (WWF), streams audio from recorders placed around a region of the Bornean rainforest in Southeast Asia. Visitors can listen to live audio or skip back through the day’s recording, for example to listen to the dawn chorus.

Listen in on the Imperial College podcast

What’s inside?

We borrowed this image of the flux tower from Sarab Sethi’s site

The device records data in the field and uploads it to a central server continuously and robustly over long time periods. And it was built for around $305.

Here’s all the code for the platform, on GitHub.

The 12V-to-5V micro USB converter feeds the power socket of the Anker USB hub, which is connected to Raspberry Pi.

The Imperial College London team behind the project has provided really good step-by-step photo instructions for anyone interested in the fine details.

Here’s the full set up in the field. The Raspberry Pi-powered brains of the kit are safely inside the green box

The recorders have been installed by Imperial College London researchers as part of the SAFE Project – one of the largest ecological experiments in the world.

Screenshot of the SAFE Project website

Dr Sarab Sethi designed the audio recorders with Dr Lorenzo Picinali. They wanted to quantify the changes in rainforest soundscape as land use changes, for example when forests are logged. Sarab is currently working on algorithms to analyse the gathered data with Dr Nick Jones from the Department of Mathematics.

The lovely cross-disciplinary research team based at Imperial College London

Let the creators of the project tell you more on the Imperial College London website.

The post Raspberry Pi listening posts ‘hear’ the Borneo rainforest appeared first on Raspberry Pi.

Try web development with Digital Making at Home

Join us for Digital Making at Home: this week, young people can find out how to create web pages with us! Through Digital Making at Home, we invite kids all over the world to code and make along with us and our new videos every week.

So get ready to contribute to the World Wide Web:

Let’s create web pages this week! Watch our video to get coding now.

And tune in on Wednesday 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session and ask us all your questions about the World Wide Web, the internet, and web development.

The post Try web development with Digital Making at Home appeared first on Raspberry Pi.

Raspberry Pi Off-World Bartender

Three things we like: Blade Runner, robots, and cocktails. That’s why we LOVE Donald Bell‘s Raspberry Pi–packed ‘VK-01 Off-World Bartender‘ cocktail making machine.

This machine was due to be Donald’s entry into the Cocktail Robotics Grand Challenge, an annual event in San Francisco. By the time the event was cancelled, he was too deep into his awesome build to give up, so he decided to share it with the Instructables community instead.

Donald wanted users to get as much interaction and feedback as possible, rather than simply pressing a button and receiving a random drink. So with this machine, the interaction comes in four ways: instructions provided on the screen, using a key card to bypass security, placing and removing a cup on the tray, and entering an order number on the keypad.

In addition to that, feedback is provided by way of lighting changes, music, video dialogue, pump motors whirring, and even the clicks of relays at each stage of the cocktail making process.

Ordering on the keypad

close up of the black keypad

The keypad allows people to punch in a number to trigger their order, like on a vending machine. The drink order is sent to the Hello Drinkbot software running on the Raspberry Pi 3B that controls the pumps.

Getting your cup filled

Inside the cup dispenser sensor showing the switch and LEDs
The switch under the lid and ring of LEDs on the base

In order for the machine to be able to tell when a vessel is placed under the dispenser spout, and when it’s removed, Donald built in a switch under a 3D-printed tray. Provided the vessel has at least one ice cube in it, even the lightest plastic cup is heavy enough to trigger the switch.
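
The cup-detection logic needs very little code. Here’s a minimal gpiozero sketch of the idea; the pin number and pull-up wiring are assumptions, not Donald’s actual values.

from gpiozero import Button
from signal import pause

# Switch under the 3D-printed tray; closes when a cup (plus ice) lands on it.
# GPIO pin 23 is an assumption.
cup_switch = Button(23, pull_up=True)

cup_switch.when_pressed = lambda: print("Cup placed - ready to pour")
cup_switch.when_released = lambda: print("Cup removed - order complete")

pause()  # keep the script alive, waiting for switch events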

The RFID card reader

Cocktail machine customers are asked to scan a special ID card to start. To make this work, Donald adapted a sample script that blinks the card reader’s internal LED when any RFID card is detected.

Interactive video screen

close up of the interactive screen on the machine showing Japanese style script

This bit is made possible by MP4Museum, a “bare-bones” kiosk video player software that the second Raspberry Pi inside the machine runs on boot. By connecting a switch to the Raspberry Pi’s GPIO, Donald enabled customers to advance through the videos one by one. And yes, that’s an official Raspberry Pi Touch Display.

Behind the scenes of the interactive screen with the Raspberry Pi wired up
Behind the scenes of the screen with the Raspberry Pi A+ running the show

The Hello Drinkbot ‘bartender’

screen grab of the hello drinkbot web interface

Donald used the Python-based Hello Drinkbot software as the brains of the machine. With it, you can configure which liquors or juices are connected to which pumps, and send instructions on exactly how much to pour of each ingredient. Everything is configured via a web interface.

Microcontrollers gather the signals from the Touch Display, keypad, RFID card reader, and the switch under the spout, while a bank of relays drives the pumps.

Here’s the Fritzing diagram for this beast

Supplies

Donald shared an exhaustive kit list on his original post, but basically, what you’re looking at is…

Pencil sketches of the machine from different angles
Donald’s friend Jim Burke‘s beautiful concept sketches

And finally, check out the Raspberry Pi–based Hello Drinkbot project by Rich Gibson, which inspired Donald’s build.

The post Raspberry Pi Off-World Bartender appeared first on Raspberry Pi.

Rotary encoders: Raise a Glitch Storm | Hackspace 34

A Glitch Storm is an explosive torrent of musical rhythms and sound, all generated from a single line of code. In theory, you can’t do this with a Raspberry Pi running Python – in this month’s new issue, out now, the HackSpace magazine team lovingly acquired a tutorial from The MagPi team to throw theory out the window and show you how.

What is a Glitch Storm?

A Glitch Storm is a user-influenceable version of bytebeat music. We love definitions like that here at the Bakery: something you have never heard of is simply a development of something else you have never heard of. Bytebeat music was at the heart of the old Commodore 64 demo scene, a competition to see who could produce the most impressive graphics and music in a very limited number of bytes. This was revived/rediscovered and christened by Viznut, aka Ville-Matias Heikkilä, in 2011. And then JC Ureña of the ‘spherical sound society’ converted the concept into the interactive Glitch Storm.

Figure 1: Schematic for the sound-generating circuit

So what is it?

Most random music generators work on the level of notes; that is, notes are chosen one at a time and then played, like our Fractal Music project in The MagPi #66. However, with bytebeat music, an algorithm generates the actual sample levels that make up the sound. This algorithm performs bitwise operations on a tick variable that increments with each sample. Depending on the algorithm used, this may or may not produce something musically interesting. Often, the samples produced exhibit a fractal structure, which is self-similar on many levels, thus providing both the notes and structure.
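
To make this concrete, here is a hedged Python sketch of the idea: a well-known bytebeat-style expression evaluated on an incrementing tick, written out to a WAV file. The formula, sample rate, and filename are illustrative, not taken from the article.

import wave

SAMPLE_RATE = 8000            # classic bytebeat formulas assume roughly 8kHz
N = SAMPLE_RATE * 10          # ten seconds of audio

with wave.open("bytebeat.wav", "w") as f:
    f.setnchannels(1)
    f.setsampwidth(1)         # 8-bit unsigned samples
    f.setframerate(SAMPLE_RATE)
    frames = bytearray()
    for t in range(N):
        # bitwise operations on the tick variable produce each sample level
        frames.append((t * (t >> 5 | t >> 8)) & 0xFF)
    f.writeframes(bytes(frames))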

Enter the ‘Glitch Storm’

With a Glitch Storm, three user-controlled variables – a, b, and c – can be added to this algorithm, allowing the results to be fine-tuned. In the ‘Algorithms’ box, you can see that the bytebeat algorithms simply run; they all repeat after a certain time, but this time can be long, in the order of hours for some. A Glitch Storm algorithm, on the other hand, contains variables that a user can change in real time while the sample is playing. This is exactly what we can do with rotary encoders, without the algorithm being interrupted by constantly checking their state.
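
In code terms, the difference is simply that some constants become variables the encoders can change mid-performance. A hypothetical sketch of such a sample function:

# Hypothetical Glitch Storm-style sample function: a, b, and c are the user
# variables; on the real hardware they would track the rotary encoders.
a, b, c = 5, 8, 100

def sample(t):
    # the bytebeat core, with the constants replaced by user variables
    return (t * (t >> a | t >> b) + c) & 0xFF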

Figure 2: Schematic for the control box

What hardware?

In order to produce music like this on the Raspberry Pi, we need some extra hardware to generate the sound samples, and also a bunch of rotary encoders to control things. The samples are produced by a 12-bit D/A converter connected to one of the SPI ports. The schematic of this is shown in Figure 1. The clock rate for the transfer of data to this can be controlled and provides a simple way of controlling, to some extent, the sample rate of the sound. Figure 2 shows the wiring diagram of the five rotary encoders we used.

Making the hardware

The hardware comes as two parts: the D/A converter and associated audio components. These are built on a board that hangs off Raspberry Pi’s GPIO pins. Also on this board is a socket that carries the wires to the control box. We used an IDC (insulation displacement connector) cable to connect the board and the box, as we wanted the D/A connection wires to be as short as possible because they carry a high-frequency signal. We used a pentagonal box just for fun, with a control in each corner, but the box shape is not important here.

Figure 3: Front physical layout of the interface board

Construction

The board is built on a 20-row by 24-hole piece of stripboard. Figure 3 and Figure 4 show the physical layout for the front and back of the board. Hole number 5 on row 4 is enlarged to 2.5mm, and a new hole is drilled between rows 1 and 2 to accommodate the audio jack socket. A 40-way surface-mount socket connector is soldered to the back of the board, and a 20-way socket is soldered to the front. You could leave this out and wire the 20-way ribbon cable directly to the holes in these positions if you want to economise.

Figure 4: Rear physical layout of the interface board

Further construction notes

Note: as always, the physical layout diagram shows where the wires go, not necessarily the route they will take. Here, we don’t want wires crossing the 20-way connector, so the upper four wires use 30AWG Kynar wire to pop under the connector and out through a track hole, without soldering, on the other side. When putting the 20-way IDC pin connector on the ribbon cable, make sure the red end connector wire is connected to the pin next to the downward-pointing triangle on the pin connector. Figure 5 shows a photograph of the control box wiring.

Figure 5: Wiring of the control board

Testing the D/A

The live_byte_beat.py listing on GitHub is a minimal program for trying out a bytebeat algorithm. It will play until stopped by pressing CTRL+C. The variable v holds the value of the sample, which is then transferred to the D/A over SPI in two bytes. The format of these two bytes is shown in Figure 6, along with how we have to manipulate v to achieve an 8-bit or 12-bit sample output. Note that all algorithms were designed for an 8-bit sample size, and using 12 bits is a free bonus here: it does sound radically different, and not always in a good way.
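A minimal sketch along the lines of live_byte_beat.py might look like this; it assumes an MCP4921-style 12-bit SPI DAC, with control bits per that chip’s datasheet rather than copied from Figure 6.

import spidev

spi = spidev.SpiDev()
spi.open(0, 0)               # SPI bus 0, chip-select 0
spi.max_speed_hz = 1000000   # the clock rate also sets the effective sample rate

t = 0
try:
    while True:
        v = (t * (t >> 5 | t >> 8)) & 0xFF   # 8-bit bytebeat sample
        d = v << 4                           # scale into the 12-bit range
        # first byte: control nibble (DAC A, gain 1x, active) plus top 4 data bits;
        # second byte: the low 8 data bits
        spi.xfer2([0x30 | (d >> 8), d & 0xFF])
        t += 1
except KeyboardInterrupt:
    spi.close()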

The main software

The main software for this project is on our GitHub page, and contains 24 Pythonised algorithms. The knobs control the user variables as well as the sample rate and which algorithm to use. You can add extra algorithms, but if you are searching online for them, you will find they are written in C. There are two major differences you need to note when converting from C to Python: the first is the ternary operation, which in C uses a question mark; the second is the modulus operator, written with a percent sign in both languages but behaving differently for negative numbers. See the notes that accompany the main code about these.
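
For the avoidance of doubt, here is what those two differences look like in practice (a hedged illustration, not code from the project):

# C ternary:         x = cond ? a : b;
# Python equivalent: x = a if cond else b

# C's % truncates toward zero; Python's % floors, so results differ for
# negative operands: -7 % 3 is -1 in C but 2 in Python.
def c_mod(a, b):
    return a - int(a / b) * b   # int() truncates toward zero, like C

assert c_mod(-7, 3) == -1
assert -7 % 3 == 2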

Figure 6: How to program the registers in the D/A converter

Why does this work?

There are a few reasons why you would not expect this to work on a Raspberry Pi in Python, the most obvious being the operating system, which regularly interrupts the flow of output samples. Well, it turns out that this is not as bad as you might fear, and the extra ‘noise’ this causes is at a low level and is masked by the glitchy nature of the sound. As Python is an interpreted language, it is just about fast enough to give an adequate sample rate on a Raspberry Pi 4.

Make some noise

You can now explore the wide range of algorithms for generating a Glitch Storm and interact with the sound. On our GitHub page there’s a list of useful links allowing you to explore what others have done so far. For a sneak preview of the bytebeat type of sound, visit magpi.cc/bytebeatdemo; you can even add your own algorithms here. For interaction, however, there’s no substitute for having your own hardware. The best settings are often found by making small adjustments and listening to the long-term effects – some algorithms surprise you about a minute or two into a sequence by changing dramatically.

Get HackSpace magazine issue 34 — out today

HackSpace magazine issue 34: on sale now!

HackSpace magazine is out now, available in print from the Raspberry Pi Press online store, your local newsagents, and the Raspberry Pi Store, Cambridge.

You can also download the PDF from the HackSpace magazine website.

Subscribers to HackSpace for 12 months get a free Adafruit Circuit Playground, or can choose from one of our other subscription offers, including this amazing limited-time offer of three issues and a book for only £10!

If you liked this project, it was first featured in The MagPi Magazine. Download the latest issue for free or subscribe here.

The post Rotary encoders: Raise a Glitch Storm | Hackspace 34 appeared first on Raspberry Pi.

Steampunk ‘Help is coming’ Raspberry Pi alert system

Tom Lee decided to combine his household with his sister-in-law during lockdown so that she could help him make childcare more manageable. The problem was, Tom’s household was a smidge frantic in the mornings, as the family struggled to be up and ready in time for his sister-in-law’s arrival.

Enter this Raspberry Pi–powered tracking device, which tells Tom when the family car is on its way with childcare support. The DIY appliance helps his household manage childcare routines like clockwork.

The magic is in the wooden box, but the light cage and electrical meter are all part of the show

When the family car is moving, a light turns on, and an antique electrical meter points to 30…20…10 to show the estimated minutes until the driver arrives. The movements of the car come in from a cellular Sinotrack OBD2 dongle pointed at a traccar server running on Raspberry Pi 3.

We see you in there, Raspberry Pi…

Tom explains: “I have not found traccar to be the greatest to work with, but you can make it forward everything it decodes to your own script pretty easily.”
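
As a sketch of what “your own script” might look like on the receiving end, here is a minimal Flask endpoint that accepts forwarded position data. The URL, port, and field names are assumptions, since Traccar’s forwarding format depends on version and configuration.

from flask import Flask, request

app = Flask(__name__)

@app.route("/position", methods=["POST"])
def position():
    # Traccar forwards decoded position reports as it receives them
    data = request.get_json(force=True)
    if data.get("speed", 0) > 0:
        pass  # the car is moving: turn the light on, start the meter countdown
    return "", 204

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5055)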

Materials:

  • Arduino microcontrollers (ATMega328P & ESP8266 based)
  • Raspberry Pi (Model 1 and 3)
  • Dongle device in car (with SIM card and cellular service)
  • Light device with bulb and solid state relay
  • Antique electrical meter (for the steampunks among you – any similar device will do the job!)

The light safety cage was rescued from an old workshop

The case (below) is a lasercut design Tom had made by online laser cutting business Ponoko.

Inside there’s a solid state relay and a first-generation Raspberry Pi (hidden under the black cable in the photo below). This Raspberry Pi model doesn’t have wireless connectivity, and Tom found that getting wireless working was a bit tricky for this project.

Tom produced a nice long webinar to show you exactly how this all works. So if you’d like to give this project a try, watch it for yourself.

You’ll learn how to…

Code resources

Oh, and he’s only gone and uploaded every single bit of code you’ll need on GitHub (what an angel):

The post Steampunk ‘Help is coming’ Raspberry Pi alert system appeared first on Raspberry Pi.

Teaching pigeons with Raspberry Pi

It’s been a long lockdown for one of our favourite makers, Pi & Chips. Like most of us (probably), they have turned their hand to training small animals that wander into their garden to pass the time — in this case, pigeons. I myself enjoy raising my glass to the squirrel that runs along my back fence every evening at 7pm.

  • Mock Frank the pigeon
  • Real Frank the pigeon

Of course, Pi & Chips has taken this one step further and created a food dispenser with a motion-activated camera, powered by a Raspberry Pi 3B+, to test the intelligence of these garden critters and capture their efforts live.

Bird behaviour

Looking into the cognitive behaviour of birds (and finding the brilliantly titled paper Maladaptive gambling by pigeons), Pi & Chips discovered that pigeons can, with practice, recognise objects including buttons and then make the mental leap to realise that touching these buttons actually results in something happening. So they set about building a project to see this in action.

Enter the ‘SmartFrank 3000’, named after the bossiest bird to grace Pi & Chips’s shed roof over the summer.

Steppers and servos

The build itself is a simple combo of a switch and dispenser. But it quickly became apparent that any old servo wasn’t going to be up to the job: it couldn’t open and close a hatch quickly enough, nor hold it shut strongly enough.

The motor setup

Running a few tests with a stepper motor confirmed that this was the perfect choice, as it could move quickly enough, and was strong enough to hold back a fair weight of seed when not in operation.

It took a while to get the timing on the stepper just right to give a pretty consistent delivery of the seed…

A 3D-printed flap for the stepper was also fashioned, plus a nozzle that fits over the neck of a two-litre drinks bottle, and some laser-cut pieces to make a frame to hold it all together.

The switch

Now for the switch that Frank the pigeon was going to have to touch if it wanted any bird seed. Pi & Chips came up with this design made from 3mm ply and some sponge as the spring.

They soldered some wires to a spring clip from an old photo frame and added a bolt and two nuts. The second nut allowed very fine adjustment of the distance to make sure the switch could be triggered by as light a touch as possible.

Behind the scenes

Behind the scenes setup

Behind the scenes there’s a Raspberry Pi 3B+ running the show, together with a motor controller board for the stepper motor. This board runs from its own battery pack, as it needs 12V power, which is too much for Raspberry Pi to provide directly. A Raspberry Pi Camera Module has also been added and runs this motion detection script to start recording whenever a likely bird candidate steps up to the plate for dinner. Hopefully, we can soon get some footage of Frank the pigeon learning and earning!

The post Teaching pigeons with Raspberry Pi appeared first on Raspberry Pi.

Build a Raspberry Pi robot buggy with your kids

Join us for Digital Making at Home: this week, young people can build a Raspberry Pi robot buggy with us! Through Digital Making at Home, we invite kids all over the world to code and make along with us and our new videos every week.

So get your Raspberry Pi, wheels, wires, and breadboards ready! We’re building a robot:

Let’s build a robot together this week!

And tune in on Wednesday 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session with Estefannie from Estefannie Explains it All to ask us your questions about robots and build something cool with Adafruit’s Circuit Playground.

The post Build a Raspberry Pi robot buggy with your kids appeared first on Raspberry Pi.

Remote teams ring office bell with Raspberry Pi and Slack

Bustling offices… remember those? It feels like we’ve all been working from home forever, and it’s going to be a while yet before everyone is back at their desks in the same place. And when that does happen, if your workplace is anything like Raspberry Pi Towers, there will still be lots of people in your team who are based in different countries or have always worked from home.

This office bell, built by a person called Alex, is powered by a Raspberry Pi 3B+ and is linked to Slack, so when a milestone or achievement is announced on the chat platform by a remote team member, they get to experience ringing the office bell for themselves, no matter where in the world they are working from.

Kit list:

Close-up of the servo wired to the Raspberry Pi pins

Integrating with Slack

To get the Raspberry Pi talking to Slack, Alex used the slackclient module (Python 3.6+ only), which makes use of the Slack Real Time Messaging (RTM) API. This is a websocket-based API that allows you to receive events from Slack in real time and send messages as users.

With the Slack RTM API, you create an RTM client and register a callback function that the client executes every time a specific Slack event occurs. When staff tell the @pibot on Slack it’s ‘belltime’, the Raspberry Pi tells the servo to ring the bell in the office.
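
A minimal sketch of that pattern using slackclient’s RTM API might look like this; the trigger word, GPIO pin, and emoji name are illustrative assumptions rather than Alex’s exact code.

import os
import time
from slack import RTMClient      # slackclient v2 (Python 3.6+)
from gpiozero import Servo

servo = Servo(17)                # servo signal pin is an assumption

@RTMClient.run_on(event="message")
def maybe_ring_bell(**payload):
    data = payload["data"]
    if "belltime" in data.get("text", "").lower():
        servo.max()              # swing the striker against the bell
        time.sleep(0.3)
        servo.min()              # and pull it back
        # emoji reaction so the remote ringer knows it worked
        payload["web_client"].reactions_add(
            channel=data["channel"], timestamp=data["ts"], name="bell")

RTMClient(token=os.environ["SLACK_BOT_TOKEN"]).start()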

Alex also configured it to always respond with an emoji reaction when someone successfully rings the bell, so remote employees get some actual feedback that it worked. Here’s the script for that bit.

Alex also figured out how to get around WiFi connectivity drops: they created a cronjob that runs a bash script every 15 minutes to check if the bell ringer is running. If it isn’t running, the bash script starts it.
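
Alex’s watchdog is a bash script, but the idea fits in a few lines; here is a hypothetical Python equivalent you could call from cron every 15 minutes (script names and paths are invented):

import subprocess

def bell_running():
    # pgrep exits non-zero when no matching process is found
    return subprocess.run(["pgrep", "-f", "bell_ringer.py"],
                          capture_output=True).returncode == 0

if not bell_running():
    # relaunch the bell script in the background
    subprocess.Popen(["python3", "/home/pi/bell_ringer.py"])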

At the end of Alex’s original post, they’ve concluded that using a HAT would allow for more control of the servo and avoid frying the Raspberry Pi. They also cleaned up their set-up recently and switched the Raspberry Pi 3B+ out for a Raspberry Pi Zero, which is perfectly capable of this simple job.

The post Remote teams ring office bell with Raspberry Pi and Slack appeared first on Raspberry Pi.

Mini Raspberry Pi Boston Dynamics–inspired robot

This is a ‘Spot Micro’ walking quadruped robot running on Raspberry Pi 3B. By building this project, redditor /thetrueonion (aka Mike) wanted to teach themself robotic software development in C++ and Python, get the robot walking, and master velocity and directional control.

Mike was inspired by Spot, one of Boston Dynamics’ robots developed for industry to perform remote operation and autonomous sensing.

What’s it made of?

  • Raspberry Pi 3B
  • Servo control board: PCA9685, controlled via I2C
  • Servos: 12 × PDI-HV5523MG
  • LCD Panel: 16×2 I2C LCD panel
  • Battery: 2s 4000 mAh LiPo, direct connection to power servos
  • UBEC: HKU5 5V/5A ubec, used as 5V voltage regulator to power Raspberry Pi, LCD panel, PCA9685 control board
  • Thingiverse 3D-printed Spot Micro frame

How does it walk?

The mini ‘Spot Micro’ bot rocks a three-axis angle command/body pose control mode via keyboard and can achieve ‘trot gait’ or ‘walk gait’. The former is a four-phase gait with symmetric motion of two legs at a time (like a horse trotting). The latter is an eight-phase gait with one leg swinging at a time and a body shift in between for balance (like humans walking).

Mike breaks down how they got the robot walking, right down to the order the servos need to be connected to the PCA9685 control board, in this extensive walkthrough.

Here’s the code

And yes, this is one of those magical projects with all the code you need stored on GitHub. The software is implemented on a Raspberry Pi 3B running Ubuntu 16.04. It’s composed of C++ and Python nodes in a ROS framework.

  • Pose
  • Strut

What’s next?

Mike isn’t finished yet: they are looking to improve their yellow beast by incorporating a lidar to achieve simple 2D mapping of a room. Also on the list is developing an autonomous motion-planning module to guide the robot to execute a simple task around a sensed 2D environment. And finally, adding a camera or webcam to conduct basic image classification would finesse their creation.

The post Mini Raspberry Pi Boston Dynamics–inspired robot appeared first on Raspberry Pi.

Track your punches with Raspberry Pi

‘Track-o-punches’ tracks the number of punches thrown during workouts with Raspberry Pi and a Realsense camera, and it also displays your progress and sets challenges on a touchscreen.

In this video, Cisco shows you how to set up the Realsense camera and a Python virtual environment, and how to install dependencies and OpenCV for Python on your Raspberry Pi.

How it works

A Realsense robotic camera tracks the boxing glove as it enters and leaves the frame. Colour segmentation means the camera can more precisely pick up when Cisco’s white boxing glove is in frame. He walks you through how to threshold images for colour segmentation at this point in the video.

Testing the tracking

All this data is then crunched on Raspberry Pi. Cisco’s code counts the consecutive frames that the segmented object is present; if that number is greater than a threshold, the code sees this as a particular action.
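
Putting the two ideas together (colour segmentation plus consecutive-frame counting), a hedged OpenCV sketch might look like this; the HSV thresholds, pixel count, and camera index are all illustrative:

import cv2
import numpy as np

cap = cv2.VideoCapture(0)
LOWER = np.array([0, 0, 180])    # low saturation, high value: roughly white
UPPER = np.array([180, 40, 255])
MIN_FRAMES = 5                   # frames of presence that count as one punch

consecutive = 0
punches = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)     # segment the white glove
    present = cv2.countNonZero(mask) > 2000   # enough glove pixels in frame
    if present:
        consecutive += 1
    else:
        if consecutive >= MIN_FRAMES:
            punches += 1                      # glove entered and left: one punch
        consecutive = 0
cap.release()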

Raspberry Pi 4 being mounted on the Raspberry Pi 7″ Touch Display

Cisco used this data to set punch goals for the user. The Raspberry Pi computer is connected to an official Raspberry Pi 7″ Touch Display in order to display “success” and “fail” messages as well as the countdown clock. Once a goal is reached, the touchscreen tells the boxer that they’ve successfully hit their target. Then the counter resets and a new goal is displayed. You can manipulate the code to set a time limit to reach a punch goal, but setting a countdown timer was the hardest bit to code for Cisco.

Kit list

Jeeeez, it’s hard to get a screen grab of Cisco’s fists of fury

A mobile power source makes it easier to set up a Raspberry Pi wherever you want to work out. Cisco 3D-printed a mount for the Realsense camera and secured it on the ceiling so it could look down on him while he punched.

The post Track your punches with Raspberry Pi appeared first on Raspberry Pi.

New twist on Raspberry Pi experimental resin 3D printer

Element14’s Clem previously built a giant Raspberry Pi-powered resin-based 3D printer and here, he’s flipped the concept upside down.

The new Raspberry Pi 4 8GB reduces slicing times and makes for a more responsive GUI on this experimental 3D printer. Let’s take a look at what Clem changed and how…

The previous iteration of his build was “huge”, mainly because the only suitable screen Clem had to hand was a big 4K monitor. This new build flips the previous concept upside down by reducing the base size and the amount of resin needed.

Breaking out of the axis

To resize the project effectively, Clem came out of an X,Y axis and into Z, reducing the surface area but still allowing for scaling up, well, upwards! The resized, flipped version of this project also reduces the cost (resin is expensive stuff) and makes the whole thing more portable than a traditional, clunky 3D printer.

Look how slim and portable it is!

How it works

Now for the brains of the thing: nanodlp is free (but not open source) software which Clem ran on a Raspberry Pi 4. Using an 8GB Raspberry Pi will get you faster slicing times, so go big if you can.

A 5V/12V switching power supply drives the Nanotec stepper motor. To get the signal from the Raspberry Pi GPIO pins to the stepper driver and to the motor, the pins are configured in nanodlp; Clem has shared his settings if you’d like to copy them (scroll down on this page to find a ‘Resources’ zip file just under the ‘Bill of Materials’ list).

Raspberry Pi working together with the display

For the display, there’s a Midas screen and an official Raspberry Pi 7″ Touchscreen Display, both of which work perfectly with nanodlp.

At 9:15 minutes into the project video, Clem shows you around Fusion 360 and how he designed, printed, assembled, and tested the build’s engineering.

A bit of Fusion 360

Experimental resin

Now for the fancy, groundbreaking bit: Clem chose a very specialised Photocentric high-tensile daylight resin so he can use LEDs with a daylight spectrum. This type of resin also has a lower density, so the liquid does not need to be suspended by surface tension (as in traditional 3D printers); instead, it floats because of its own buoyancy. This way, you’ll need less resin to start with, and you’ll waste less too whenever you make a mistake. At 13:30 minutes into the project video, Clem shares the secret of how to achieve an ‘Oversaturated Solution’ in order to get your resin to float.

Now for the science bit…

Materials

It’s not perfect but, if Clem’s happy, we’re happy.

Join the conversation on YouTube if you’ve got an idea that could improve this unique approach to building 3D printers.

The post New twist on Raspberry Pi experimental resin 3D printer appeared first on Raspberry Pi.

Raspberry Pi calls out your custom workout routine

If you don’t want to be tied to a video screen during home workouts, Llum Acosta, Samreen Islam, and Alfred Gonzalez shared this great Raspberry Pi–powered alternative on hackster.io: their voice-activated project announces each move of your workout routine and how long you need to do it for.

This LED-lit, compact solution means you don’t need to squeeze yourself in front of a TV or crane your neck to see what your video instructor is doing next. Instead you can be out in the garden or at a local park and complete your own, personalised workout on your own terms.

Kit list:

Raspberry Pi and MATRIX Device

The makers shared these setup guides to get MATRIX working with your Raspberry Pi. Our tiny computer doesn’t have a built-in microphone, so here’s where the two need to work together.

MATRIX, meet Raspberry Pi

Once that’s set up, ensure you enable SSH on your Raspberry Pi.

Click, click. Simple

The three sweet Hackster angels shared a four-step guide to running the software of your own customisable workout routine buddy in their original post. Happy hacking!

1. Install MATRIX Libraries and Rhasspy

Follow the steps below in order for Rhasspy to work on your Raspberry Pi.

2. Creating an intent

Access Rhasspy’s web interface by opening a browser and navigating to http://YOUR_PI_IP_HERE:12101. Then click on the Sentences tab. All intents and sentences are defined here.

By default, there are a few example sentences in the text box. Remove the default intents and add the following:

[Workout]
start [my] workout

Once created, click on Save Sentences and wait for Rhasspy to finish training.

Here, Workout is an intent. You can change the wording to anything that works for you as long as you keep [Workout] the same, because this intent name will be used in the code.

3. Catching the intent

Install git on your Raspberry Pi.

sudo apt install git

Download the repository.

git clone https://github.com/matrix-io/rhasspy-workout-timer

Navigate to the folder and install the project dependencies.

cd rhasspy-workout-timer
npm install

Run the program.

node index.js

4. Using and customizing the project

To change the workout to your desired routine, head into the project folder and open workout.txt. There, you’ll see:

jumping jacks 12,plank 15, test 14

To make your own workout routine, type an exercise name followed by the number of seconds to do it for. Repeat that for each exercise you want to do, separating each combo using a comma.
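
The format is simple enough to parse in a few lines. Here’s an illustrative Python sketch (the real project does this in its Node.js code):

# Parse the workout.txt format: "exercise name seconds" combos, comma-separated.
def parse_workout(text):
    routine = []
    for combo in text.split(","):
        *name_parts, seconds = combo.strip().split()
        routine.append((" ".join(name_parts), int(seconds)))
    return routine

print(parse_workout("jumping jacks 12,plank 15, test 14"))
# [('jumping jacks', 12), ('plank', 15), ('test', 14)]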

Whenever you want to use the Rhasspy Assistant, run the file and say “Start my workout” (or whatever it is you have it set to).

And now you’re all done — happy working out. Make sure to visit the makers’ original post on hackster.io and give it a like.

The post Raspberry Pi calls out your custom workout routine appeared first on Raspberry Pi.

Create a stop motion film with Digital Making at Home

Join us for Digital Making at Home: this week, young people can do stop motion and time-lapse animation with us! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.

So get your Raspberry Pi and Camera Module ready! We’re using them to capture life with code this week:

Check out this week’s code-along projects!

And tune in on Wednesday 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session to make a motion-detecting dance game in Scratch!

The post Create a stop motion film with Digital Making at Home appeared first on Raspberry Pi.

Processing raw image files from a Raspberry Pi High Quality Camera

When taking photos, most of us simply like to press the shutter button on our cameras and phones so that a viewable image is produced almost instantaneously, usually encoded in the well-known JPEG format. However, there are some applications where a little more control over the production of that JPEG is desirable. For instance, you may want more or less de-noising, or you may feel that the colours are not being rendered quite right.

This is where raw (sometimes RAW) files come in. A raw image in this context is a direct capture of the pixels output from the image sensor, with no additional processing. Normally this is in a relatively standard format known as a Bayer image, named after Bryce Bayer who pioneered the technique back in 1974 while working for Kodak. The idea is not to let the on-board hardware ISP (Image Signal Processor) turn the raw Bayer image into a viewable picture, but instead to do it offline with an additional piece of software, often referred to as a raw converter.

A Bayer image records only one colour at each pixel location, in the pattern shown

The raw image is sometimes likened to the old photographic negative, and whilst many camera vendors use their own proprietary formats, the most portable form of raw file is the Digital Negative (or DNG) format, defined by Adobe in 2004. The question at hand is how to obtain DNG files from Raspberry Pi, in such a way that we can process them using our favourite raw converters.

Obtaining a raw image from Raspberry Pi

Many readers will be familiar with the raspistill application, which captures JPEG images from the attached camera. raspistill includes the -r option, which appends all the raw image data to the end of the JPEG file. JPEG viewers will still display the file as normal but ignore the (many megabytes of) raw data tacked on the end. Such a “JPEG+RAW” file can be captured using the terminal command:

raspistill -r -o image.jpg

Unfortunately this JPEG+RAW format is merely what comes out of the camera stack and is not supported by any raw converters. So to make use of it we will have to convert it into a DNG file.

PyDNG

This Python utility converts the Raspberry Pi’s native JPEG+RAW files into DNGs. PyDNG can be installed from github.com/schoolpost/PyDNG, where more complete instructions are available. In brief, we need to perform the following steps:

git clone https://github.com/schoolpost/PyDNG
cd PyDNG
pip3 install src/.  # note that PyDNG requires Python3

PyDNG can be used as part of larger Python scripts, or it can be run stand-alone. Continuing the raspistill example from before, we can enter in a terminal window:

python3 examples/utility.py image.jpg

The resulting DNG file can be processed by a variety of raw converters. Some are free (such as RawTherapee or dcraw, though the latter is no longer officially developed or supported), and there are many well-known proprietary options (Adobe Camera Raw or Lightroom, for instance). Perhaps users will post in the comments any that they feel have given them good results.

White balancing and colour matrices

Now, one of the bugbears of processing Raspberry Pi raw files up to this point has been the problem of getting sensible colours. Previously, the images have been rendered with a sickly green cast, simply because no colour balancing is being done and green is normally the most sensitive colour channel. In fact it’s even worse than this, as the RGB values in the raw image merely reflect the sensitivity of the sensor’s photo-sites to different wavelengths, and do not a priori have more than a general correlation with the colours as perceived by our own eyes. This is where we need white balancing and colour matrices.

Correct white balance multipliers are required if neutral parts of the scene are to look, well, neutral. We can use raspistill’s guesstimate of them, found in the JPEG+RAW file (or you can measure your own on a neutral part of the scene, like a grey card). Matrices and look-up tables are then required to convert colour from ‘camera’ space to the final colour space of choice, usually sRGB or Adobe RGB.
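
As a toy illustration of what those multipliers do, here is a hedged numpy sketch that applies channel gains to a Bayer array; the BGGR layout and the gain values are assumptions, not measurements:

import numpy as np

def white_balance_bggr(raw, r_gain, b_gain):
    # Apply white balance gains to a BGGR-pattern Bayer array (assumed layout):
    # blue photo-sites at even rows/columns, red at odd rows/columns.
    out = raw.astype(np.float32)
    out[1::2, 1::2] *= r_gain      # red photo-sites
    out[0::2, 0::2] *= b_gain      # blue photo-sites
    # the two green positions are left at unity gain
    return np.clip(out, 0, 4095)   # stay within the 12-bit range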

My thanks go to forum contributors Jack Hogan for measuring these colour matrices, and to Csaba Nagy for implementing them in the PyDNG tool. The results speak for themselves.

Results

Previous attempts at raw conversion are on the left; the results using the updated PyDNG are on the right.

Images 2 and 3 courtesy of Csaba Nagy; images 4 and 5 courtesy of Jack Hogan

DCP files

For those familiar with DNG files, we include links to DCP (DNG Camera Profile) files (warning: binary format). You can try different ones out in raw converters, and we would encourage users to experiment, to perhaps create their own, and to share their results!

  1. This is a basic colour profile baked into PyDNG, and is the one shown in the results above. It’s sufficiently small that we can view it as a JSON file.
  2. This is an improved (and larger) profile involving look-up tables, and aiming for an overall balanced colour rendition.
  3. This is similar to the previous one, but with some adjustments for skin tones and sky colours.

Note, however, that these files come with a few caveats. Specifically:

  • The calibration is only for a single Raspberry Pi High Quality Camera rather than a known average or “typical” module.
  • The illuminants used for the calibration are merely the ones that we had to hand — the D65 lamp in particular appears to be some way off.
  • The calibration only really works when the colour temperature lies between, or not too far from, the two calibration illuminants, approximately 2900K to 6000K in our case.

So there remains room for improvement. Nevertheless, results across a number of modules have shown these parameters to be a significant step forward.

Acknowledgements

My thanks again to Jack Hogan for performing the colour matrix calibration with DCamProf, and to Csaba Nagy for adding these new features to PyDNG.

Further reading

  1. There are many resources explaining how a raw (Bayer) image is converted into a viewable RGB or YUV image, among them Jack’s blog post.
  2. To understand the role of the colour matrices in a DNG file, please refer to the DNG specification. Chapter 6 in particular describes how they are used.

The post Processing raw image files from a Raspberry Pi High Quality Camera appeared first on Raspberry Pi.

Recreate Time Pilot’s free-scrolling action | Wireframe #41

Fly through the clouds in our re-creation of Konami’s classic 1980s shooter. Mark Vanstone has the code

  • Designed by Yoshiki Okamoto, Konami’s Time Pilot saw an arcade release in 1982.

Arguably one of Konami’s most successful titles, Time Pilot burst into arcades in 1982. Yoshiki Okamoto worked on it secretly, and it proved so successful that a sequel soon followed. In the original, the player flew through five eras, from 1910, 1940, 1970, 1982, and then to the far future: 2001. Aircraft start as biplanes and progress to become UFOs, naturally, by the last level.

Players also rescue other pilots by picking them up as they parachute from their aircraft. The player’s plane stays in the centre of the screen while other game objects move around it. The clouds that give the impression of movement have a parallax style to them, some moving faster than others, offering an illusion of depth.

To make our own version with Pygame Zero, we need eight frames of player aircraft images – one for each direction it can fly. After we create a player Actor object, we can get input from the cursor keys and change the direction the aircraft is pointing with a variable which will be set from zero to 7, zero being the up direction. Before we draw the player to the screen, we set the image of the Actor to the stem image name, plus whatever that direction variable is at the time. That will give us a rotating aircraft.
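
In Pygame Zero terms, the direction trick boils down to something like the following sketch, run with pgzrun; the image names and key handling are illustrative:

player = Actor("plane0", center=(400, 300))   # images plane0..plane7 assumed
player.direction = 0                          # 0 = up, counting clockwise

def update():
    if keyboard.left:
        player.direction = (player.direction - 1) % 8
    if keyboard.right:
        player.direction = (player.direction + 1) % 8
    player.image = "plane" + str(player.direction)  # pick the matching frame

def draw():
    screen.clear()
    player.draw()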

To provide a sense of movement, we add clouds. We can make a set of random clouds on the screen and move them in the opposite direction to the player aircraft. As we only have eight directions, we can use a lookup table to change the x and y coordinates rather than calculating movement values. When they go off the screen, we can make them reappear on the other side so that we end up with an ‘infinite’ playing area. Add a level variable to the clouds, and we can move them at different speeds on each update() call, producing the parallax effect. Then we need enemies. They will need the same eight frames to move in all directions. For this sample, we will just make one biplane, but more could be made and added.
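
The lookup table itself can be two small lists indexed by the direction variable; a hedged sketch, with the step values and level scaling assumed:

WIDTH, HEIGHT = 800, 600   # screen size, as a Pygame Zero game would define

# x and y steps for directions 0..7 (0 = up, counting clockwise)
DX = [0, 1, 1, 1, 0, -1, -1, -1]
DY = [-1, -1, 0, 1, 1, 1, 0, -1]

def move_cloud(cloud, player_direction):
    # clouds drift opposite to the player's heading, scaled by their layer,
    # and wrap around the screen for an 'infinite' playing area
    cloud.x = (cloud.x - DX[player_direction] * cloud.level) % WIDTH
    cloud.y = (cloud.y - DY[player_direction] * cloud.level) % HEIGHT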

Our Python homage to Konami’s arcade classic.

To get the enemy plane to fly towards the player, we need a little maths. We use the math.atan2() function to work out the angle between the enemy and the player. We convert that to a direction which we set in the enemy Actor object, and set its image and movement according to that direction variable. We should now have the enemy swooping around the player, but we will also need some bullets. When we create bullets, we need to put them in a list so that we can update each one individually in our update(). When the player hits the fire button, we just need to make a new bullet Actor and append it to the bullets list. We give it a direction (the same as the player Actor) and send it on its way, updating its position in the same way as we have done with the other game objects.
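
The angle-to-direction step is the only fiddly part. Here’s a sketch of the kind of helper involved, matching the zero-equals-up, clockwise convention above:

import math

def direction_towards(enemy, target):
    # angle measured clockwise from 'up', then snapped to the nearest
    # of the eight direction indices used for the aircraft frames
    angle = math.atan2(target.x - enemy.x, enemy.y - target.y)
    return round(angle / (math.pi / 4)) % 8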

The last thing is to detect bullet hits. We do a quick point collision check and if there’s a match, we create an explosion Actor and respawn the enemy somewhere else. For this sample, we haven’t got any housekeeping code to remove old bullet Actors, which ought to be done if you don’t want the list to get really long, but that’s about all you need: you have yourself a Time Pilot clone!

Here’s Mark’s code for a Time Pilot-style free-scrolling shooter. To get it running on your system, you’ll need to install Pygame Zero. And to download the full code and assets, head here.

Get your copy of Wireframe issue 41

You can read more features like this one in Wireframe issue 41, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 41 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Recreate Time Pilot’s free-scrolling action | Wireframe #41 appeared first on Raspberry Pi.

Raspberry Pi keyboards for Japan are here!

When we announced new keyboards for Portugal and the Nordic countries last month, we promised that you wouldn’t have to wait much longer for a variant for Japan, and now it’s here!

Japanese Raspberry Pi keyboard

The Japan variant of the Raspberry Pi keyboard required a whole new moulding set to cover the 83-key arrangement of the keys. It’s quite a complex keyboard, with three different character sets to deal with. Figuring out how the USB keyboard controller maps to all the special keys on a Japanese keyboard was particularly challenging, with most web searches leading to non-English websites. Since I don’t read Japanese, it all became rather bewildering.

We ended up reverse-engineering generic Japanese keyboards to see how they work, and mapping the keycodes to key matrix locations. We are fortunate that we have a very patient keyboard IC vendor, called Holtek, which produces the custom firmware for the controller.

We then had to get these prototypes to our contacts in Japan, who told us which keys worked and which just produced a strange squiggle that they didn’t understand either. The “Yen” key was particularly difficult because many non-Japanese computers read it as a “/” character, no matter what we tried to make it work.

Special thanks are due to Kuan-Hsi Ho of Holtek, to Satoka Fujita for helping me test the prototypes, and to Matsumoto Seiya for also testing units and checking the translation of the packaging.

Get yours today

You can get the new Japanese keyboard variant in red/white from our Approved Reseller, SwitchScience, based in Japan.

If you’d rather your keyboard in black/grey, you can purchase it from Pimoroni and The Pi Hut in the UK, who both offer international shipping.

The post Raspberry Pi keyboards for Japan are here! appeared first on Raspberry Pi.

DSLR motion detection with Raspberry Pi and OpenCV

One of our favourite makers, Pi & Chips (AKA David Pride), wanted to see if they could trigger a DSLR camera to take pictures by using motion detection with OpenCV on Raspberry Pi.

You could certainly do this with a Raspberry Pi High Quality Camera, but David wanted to try with his swanky new Lumix camera. As well as a Raspberry Pi and whichever camera you’re using, you’ll also need a remote control. David sourced a cheap one from Amazon, since he knew full well he was going to be… breaking it a bit.

Breaking the remote a bit

When it came to the “breaking” part, David explains: “I was hoping to be able to just re-solder some connectors to the button but it was a dual function button depending on depth of press. I therefore got a set of probes out and traced which pins on the chip were responsible for the actual shutter release and then *carefully* managed to add two fine wires.”

Further breaking

Next, David added Dupont cables to the ends of the wires to allow access to the breadboard, holding the cables in place with a blob of hot glue. Then a very simple circuit using an NPN transistor to switch via GPIO gave remote control of the camera from Python.
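
The firing side of that circuit needs only a few lines. A minimal sketch, with the GPIO pin and pulse length assumed:

from gpiozero import DigitalOutputDevice
from time import sleep

# GPIO pin driving the base of the NPN transistor (pin choice assumed)
shutter = DigitalOutputDevice(18)

def take_picture():
    shutter.on()     # transistor conducts, 'pressing' the shutter contacts
    sleep(0.2)       # hold the press long enough for the camera to register
    shutter.off()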

Raspberry Pi on the right, working together with the remote control’s innards on the left

David then added OpenCV to the mix, using this tutorial on PyImageSearch. He took the basic motion detection script and added a tiny hack to trigger the GPIO when motion was detected.

He needed to add a delay to the start of the script so he could position stuff, or himself, in front of the camera with time to spare. Got to think of those angles.

David concludes: “The camera was set to fully manual and to a really nice fast shutter speed. There is almost no delay at all between motion being detected and the Lumix actually taking pictures, I was really surprised how instantaneous it was.”

The whole setup mounted on a tripod ready to play

Here are some of the visuals captured by this Raspberry Pi-powered project…

Take a look at some more of David’s projects over at Pi & Chips.

The post DSLR motion detection with Raspberry Pi and OpenCV appeared first on Raspberry Pi.

Raspberry Pi won’t let your watched pot boil

One of our favourite YouTubers, Harrison McIntyre, decided to make the aphorism “a watched pot never boils” into reality. They modified a tabletop burner with a Raspberry Pi so that it will turn itself off if anyone looks at it.

In this project, the Raspberry Pi runs facial detection using a USB camera. If the Raspberry Pi finds a face, it deactivates the burner, and vice versa.

There’s a snag, in that the burner runs off 120 V AC and the Raspberry Pi runs off 5 V DC, so you can’t just power the burner through the Raspberry Pi. Harrison got round this problem using a relay switch, and at the two-minute mark of this video he beautifully explains how a relay manages to turn a circuit off and on without directly interfacing with that circuit.
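
The heart of the build fits in a short loop: detect faces, then switch the burner accordingly. A hedged sketch, with a relay on a GPIO pin standing in for Harrison’s switchable plug bar, and the pin number, camera index, and detection parameters assumed:

import cv2
from gpiozero import OutputDevice

relay = OutputDevice(17, initial_value=True)   # burner starts on
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) > 0:
        relay.off()   # someone is watching: the pot must not boil
    else:
        relay.on()    # nobody looking: carry on heating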

The Raspberry Pi working through the switchable plug with the burner

Harrison sourced a switchable plug bar which uses a relay to turn its own switches on and off. Plug the burner and the Raspberry Pi into that and, hey presto, you’ve got them working together via a relay.

The six camera setup

Things get jazzy at the four minute 30 second mark. At this point, Harrison decides to upgrade his single camera situation, and rig up six USB cameras to make sure that no matter where you are when you look at the burner, the Raspberry Pi will always see your face and switch it off.

Inside the switchable plug

Harrison’s multiple-camera setup proved a little much for the Raspberry Pi 3B he had to hand for this project, so he goes on to explain how he got a bit of extra processing power using a different desktop and an Arduino. He recommends going for a Raspberry Pi 4 if you want to try this at home.

Kit list:

  • Raspberry Pi 4
  • Tabletop burner
  • USB cameras or rotating camera
  • Switchable plug bar
  • All of this software

It’s not just a saying anymore, thanks to Harrison

And the last great thing about this project is that you could invert the process to create a safety mechanism, meaning you wouldn’t be able to wander away from your cooking and leave things to burn.

We also endorse Harrison’s advice to try this with an electric burner and most definitely not a gas one; those things like to go boom if you don’t play with them properly.

The post Raspberry Pi won’t let your watched pot boil appeared first on Raspberry Pi.

Design game graphics with Digital Making at Home

Join us for Digital Making at Home: this week, young people can explore the graphics side of video game design! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.

So get ready to design video game graphics with us:

Check out this week’s code-along projects!

And tune in on Wednesday 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session to make a Space Invaders–style shooter game in Scratch!

The post Design game graphics with Digital Making at Home appeared first on Raspberry Pi.

International Space Station Tracker | The MagPi 96

Fancy tracking the ISS’s trajectory? All you need is a Raspberry Pi, an e-paper display, an enclosure, and a little Python code. Nicola King looks to the skies

The e-paper display mid-refresh. It takes about three seconds to refresh, but it’s fast enough for this kind of project

Standing on his balcony one sunny evening, California-based astronomy enthusiast Sridhar Rajagopal spotted the International Space Station speeding by in perfect conditions, and the seeds of an idea were duly sown. Having worked on several projects using tri-colour e-paper (aka e-ink) displays, which he likes for their “aesthetics and low-to-no-power consumption”, he thought that developing a way of tracking the ISS using such a display would be a perfect project to undertake.

“After a bit of searching, I was able to find an open API to get the ISS location at any given point in time,” explains Sridhar. “I also knew I wouldn’t have to worry about the data changing several times per second or even per minute. Even though the ISS is wicked fast (16 orbits in a day!), this would still be well within the refresh capabilities of the e-paper display.”

The ISS location data is obtained using the Open Notify API – visit magpi.cc/isslocation to see its current position

Station location

His ISS Tracker works by obtaining the ISS location from the Open Notify API every 30 seconds. It appends this data point to a list, so older data is available. “I don’t currently log the data to file, but it would be very easy to add this functionality,” says Sridhar. “Once I have appended the data to the list, I call the drawISS method of my Display class with the positions array, to render the world map and ISS trajectory and current location. The world map gets rendered to one PIL image, and the ISS location and trajectory get rendered to another PIL image.”

The project code is written in Python and can be found on Sridhar’s GitHub page: magpi.cc/isstrackercode

Each latitude/longitude position is mapped to the corresponding XY co-ordinate. The last position in the array (the latest position) gets rendered as the ISS icon to show its current position. “Every 30th data point gets rendered as a rectangle, and every other data point gets rendered as a tiny circle,” adds Sridhar.
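
The two steps are easy to try for yourself. Here is a hedged sketch using the Open Notify endpoint and an equirectangular mapping; the display size is an assumption:

import requests

WIDTH, HEIGHT = 250, 122   # e-paper resolution, assumed

def iss_position():
    # Open Notify returns latitude/longitude as strings
    data = requests.get("http://api.open-notify.org/iss-now.json").json()
    pos = data["iss_position"]
    return float(pos["latitude"]), float(pos["longitude"])

def to_xy(lat, lon):
    # equirectangular projection: longitude spans the width, latitude the height
    x = (lon + 180.0) / 360.0 * WIDTH
    y = (90.0 - lat) / 180.0 * HEIGHT
    return int(x), int(y)

print(to_xy(*iss_position()))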

From there, the images are then simply passed into the e-paper library’s display method; one image is rendered in black, and the other image in red.

Track… star

Little wonder that the response received from friends, family, and the wider maker community has been extremely positive, as Sridhar shares: “The first feedback was from my non-techie wife who love-love-loved the idea of displaying the ISS location and trajectory on the e-paper display. She gave valuable input on the aesthetics of the data visualisation.”

Software engineer turned hardware-hacking enthusiast and entrepreneur, Sridhar Rajagopal is the founder of Upbeat Labs and creator of ProtoStax – a maker-friendly stackable, modular, and extensible enclosure system.

In addition, he tells us that other makers have contributed suggestions for improvements. “JP, a Hackster community user […] added information to make the Python code a service and have it launch on bootup. I had him contribute his changes to my GitHub repository – I was thrilled about the community involvement!”

Housed in a versatile, transparent ProtoStax enclosure designed by Sridhar, the end result is an elegant way of showing the current position and trajectory of the ISS as it hurtles around the Earth at 7.6 km/s. Why not have a go at making your own display so you know when to look out for the space station whizzing across the night sky? It really is an awesome sight.

Get The MagPi magazine issue 96 — out today

The MagPi magazine is out now, available in print from the Raspberry Pi Press online store, your local newsagents, and the Raspberry Pi Store, Cambridge.

You can also download the PDF directly from the MagPi magazine website.

Subscribers to The MagPi for 12 months get a free Adafruit Circuit Playground, or can choose from one of our other subscription offers, including this amazing limited-time offer of three issues and a book for only £10!

The post International Space Station Tracker | The MagPi 96 appeared first on Raspberry Pi.

Amazing science from the winners of Astro Pi Mission Space Lab 2019–20

The team at Raspberry Pi and our partner ESA Education are pleased to announce the winning and highly commended Mission Space Lab teams of the 2019–20 European Astro Pi Challenge!

Astro Pi Mission Space Lab logo

Mission Space Lab sees teams of young people across Europe design, create, and deploy experiments running on Astro Pi computers aboard the International Space Station. Their final task: analysing the experiments’ results and sending us scientific reports highlighting their methods, results, and conclusions.

One of the Astro Pi computers aboard the International Space Station

The science the teams performed was truly impressive, and the reports they sent us were of outstanding quality. A special round of applause to the teams for making the effort to coordinate the writing of their reports while socially distant!

The Astro Pi jury has now selected the ten winning teams, as well as eight highly commended teams:

And our winners are…

Vidhya’s code from the UK aimed to answer the question of how a compass works on the ISS, using the Astro Pi computer’s magnetometer and data from the World Magnetic Model (WMM).

Unknown from Externato Cooperativo da Benedita, Portugal, aptly investigated whether influenza is transmissible on a spacecraft such as the ISS, using the Astro Pi hardware alongside a deep literature review.

Space Wombats from Institut d’Altafulla, Spain, used normalized difference vegetation index (NDVI) analysis to identify burn scars from forest fires. They even managed to get results over Chernobyl!

Liberté from Catmose College, UK, set out to demonstrate the Coriolis effect by using Sobel filtering methods to identify the movement and direction of clouds.

Pardubice Pi from SPŠE a VOŠ Pardubice, Czech Republic, found areas of enormous vegetation loss by performing NDVI analysis on images taken from the Astro Pi and comparing this with historic images of the location.

NDVI conversion image by Pardubice Pi team – Astro Pi Mission Space Lab experiment

Reforesting Entrepreneurs from Canterbury School of Gran Canaria, Spain, want to help solve the climate crisis by using NDVI analysis to identify locations where reforestation is possible.

1G5-Boys from Lycée Raynouard, France, innovatively conducted spectral analysis using Fast Fourier Transforms to study low-frequency vibrations of the ISS.

Cloud4 from Escola Secundária de Maria, Portugal, masterfully used a simplified static model and Fourier Analysis to detect atmospheric gravity waves (AGWs).

Cloud Wizzards from Primary School no. 48, Poland, scanned the sky to determine what percentage of the seas and oceans are covered by clouds.

Aguere Team 1 from IES Marina Cebrián, Spain, probed the behaviour of the magnetic field, acceleration, and temperature on the ISS by investigating disturbances, variations with latitude, and temporal changes.

Highly commended teams

Creative Coders, from the UK, decided to see how much of the Earth’s water is stored in clouds by analysing the pixels of each image of Earth their experiment collected.

Astro Jaslo from I Liceum Ogólnokształcące króla Stanisława Leszczyńskiego w Jaśle, Poland, used Riemannian geometry to determine the angle between light from the Sun falling perpendicular to the Astro Pi camera and the line segment from the ISS to Earth’s centre.

Jesto from S.M.S Arduino I.C.Ivrea1, Italy, used a multitude of the Astro Pi computers’ capabilities to study NDVI, magnetic fields, and aerosol mapping.

BLOOMERS from Tudor Vianu National Highschool of Computer Science, Romania, investigated how algae blooms are affected by eutrophication in polluted areas.

AstroLorenzini from Liceo Statale C. Lorenzini, Italy, used Kepler’s third law to determine the eccentricity, apogee, perigee, and mean tangential velocity of the ISS.

Photo of Italy, Calabria and Sicilia (notice volcano Etna in the top right-hand corner) captured by the AstroLorenzini team — Astro Pi Mission Space Lab experiment

EasyPeasyCoding Verdala FutureAstronauts from Verdala International School & EasyPeasyCoding, Malta, utilised machine learning to differentiate between cloud types.

BHTeamEL from Branksome Hall, Canada, processed images using the Y (luma) channel of YCbCr colour data to investigate the relationship between cloud type and luminance.

Space Kludgers from Technology Club of Thrace, STETH, Greece, identified how atmospheric emissions correlate to population density, as well as using NDVI, ECCAD, and SEDAC to analyse the correlation of vegetation health and abundance with anthropogenic emissions.

The teams get a Q&A with astronaut Luca Parmitano

The prize for the winners and highly commended teams is the chance to pose their questions to ESA astronaut Luca Parmitano! The teams have been asked to record a question on video, which Luca will answer during a live stream on 3 September.

ESA astronaut Luca Parmitano aboard the International Space Station

This Q&A event for the finalists will conclude this year’s European Astro Pi Challenge. Everyone on the Raspberry Pi and ESA Education teams congratulates this year’s participants on all their efforts.

It’s been a phenomenal year for the Astro Pi challenge: teams performed some great science, and across Mission Space Lab and Mission Zero, an astronomical 16,998 young people took part, from all ESA member states as well as Slovenia, Canada, and Malta.

Congratulations to everyone who took part!

Get excited for your next challenge!

This year’s European Astro Pi Challenge is almost over, and the next edition is just around the corner!

Compilation of photographs of Earth, taken by Astro Pi Izzy aboard the ISS

So we invite school teachers, educators, students, and all young people who love coding and space science to join us from September onwards.

Follow our updates on astro-pi.org and social media to make sure you don’t miss any announcements. We will see you for next year’s European Astro Pi Challenge!

The post Amazing science from the winners of Astro Pi Mission Space Lab 2019–20 appeared first on Raspberry Pi.

Gender balance in computing: current research

We’ve really enjoyed starting a series of seminars on computing education research over the summer, as part of our strategy to develop research at the Raspberry Pi Foundation. We want to deepen our understanding of how young people learn about computing and digital making, in order to increase the impact of our own work and to advance the field of computing education.

Part of deepening our understanding is to hear from and work with experts from around the world. The seminar series, and our online research symposium, are an opportunity to do that. In addition, these events support the global computing education research community by providing relevant content and a forum for discussion. You can see the recordings and slides from all our previous seminar and symposium speakers on our website.

Gender balance in your computing classroom: what the research says

Our seventh seminar presentation was given by Katharine Childs from our own team. She works on our DfE-funded Gender Balance in Computing programme and gave a brilliant summary of some of the recent research around barriers to gender balance in school computing.

Screenshot of a presentation about gender balance in computing. Text says: "Key questions: What are the barriers which prevent girls' participation in computing? Which interventions can support girls to choose computing qualifications and careers?"

In her presentation, Katharine considered belongingness, role models, relevance to real-world contexts, and non-formal learning. She drew out the links between theory and practice and suggested a range of interventions. I recommend watching the video of her presentation and looking through her slides. 

Katharine has also been publishing a number of excellent blog posts summarising her research on gender balance:

You can read more about our Gender Balance in Computing project and sign up to receive regular newsletters about it.

Join our autumn seminar series

From September, our computing education research seminars will take place on the first Tuesday of each month, starting at 17:00 UK time.

We’re excited about the range of topics to be presented, and about our fantastic lineup of speakers: an international group from Australia, the US, Ireland, and Scotland will present a survey of computing education curricula and teaching around the world; Shuchi Grover will talk to us about formative assessment; and David Weintrop will share his work on block-based programming. I’ll be talking about my research on PRIMM and the benefits of language and talk in the programming classroom. And we’re lining up more speakers after that.

Find out more and sign up today at rpf.io/research-seminars!

Thank you

We’d like to thank everyone who has participated in our seminar series, whether as speaker or attendee. We’ve welcomed attendees from 22 countries and speakers from the US, UK, and Spain. You’ve all really helped us to start this important work, and we look forward to working with you in the next academic year!

The post Gender balance in computing: current research appeared first on Raspberry Pi.

Auto-blow bubbles with a Raspberry Pi-powered froggy

8 Bits and a Byte created this automatic bubble machine, which is powered and controlled by a Raspberry Pi and can be switched on via the internet by fans of robots and/or bubbles.

They chose a froggy-shaped bubble machine, but you can repurpose whichever type you desire; it’s just easier to adapt a model running on two AA batteries.

Raspberry Pi connected to the relay module

Before the refurb, 8 Bits and a Byte’s battery-powered bubble machine was controlled by a manual switch, which turned the motor on and off inside the frog. If you wanted to watch the motor make the frog burp out bubbles, you needed to flick this switch yourself.

After dissecting their plastic amphibian friend, 8 Bits and a Byte hooked up its motor to Raspberry Pi using a relay module. They point to this useful walkthrough for help with connecting a relay module to Raspberry Pi’s GPIO pins.

Now the motor inside the frog can be turned on and off with the power of code. And you can become the controller of bubbles by logging in here and commanding the Raspberry Pi to switch on.

A screenshot of the now automated frog in situ as seen on the remo dot tv website

To let the internet’s bubble fans see the fruits of their one-click labour, 8 Bits and a Byte set up a Raspberry Pi Camera Module and connected their build to robot streaming platform remo.tv.

Bubble soap being poured into the plastic frog's mouth
Don’t forget your bubble soap!

Kit list:

The only remaining question is: what’s the best bubble soap recipe?

The post Auto-blow bubbles with a Raspberry Pi-powered froggy appeared first on Raspberry Pi.

This Raspberry Pi–powered setup improves home brewing

We spied this New Orleans–based, Raspberry Pi–powered home brewing analysis project and were interested in how it could help other at-home brewers perfect their craft.

Raspberry Pi in a case with fan, neatly tucked away on a shelf in the Danger Shed

When you’re making beer, you want the yeast to eat up the sugars and leave alcohol behind. To check whether this is happening, you need to be able to track changes in gravity, known as ‘gravity curves’. You also have to do yeast cell counts, and you need to be able to tell when your beer has finished fermenting.
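
To make the gravity idea concrete: the drop from original gravity (OG) to final gravity (FG) over the course of fermentation gives you the alcohol content via a standard brewers’ approximation. This isn’t Aleproof’s code, just the underlying arithmetic in Python:

# Standard brewers' approximation: ABV ≈ (OG − FG) × 131.25
def abv(original_gravity, final_gravity):
    return (original_gravity - final_gravity) * 131.25

# Example: an ale fermenting from 1.050 down to 1.010
print(round(abv(1.050, 1.010), 2))  # ≈ 5.25% ABV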

“We wanted a way to skip the paper and pencil and instead input the data directly into the software. Enter the Raspberry Pi!”

Patrick Murphy

Patrick Murphy and co. created a piece of software called Aleproof which allows you to monitor all of this stuff remotely. But before rolling it out, they needed somewhere to test that it works. Enter the ‘Danger Shed’, where they ran Aleproof on Raspberry Pi.

The Danger Shed benefits from a fancy light-changing fan for the Raspberry Pi

A Raspberry Pi 3 Model B+ runs their Python-based program on Raspberry Pi OS and shares its intel via a mounted monitor.

Here’s what Patrick had to say about what they’re up to in the Danger Shed and why they needed a Raspberry Pi:

The project uses PyCharm to run the Python-based script on Raspberry Pi OS

“I am the founder and owner of Arithmech, a small software company that develops Python applications for brewers. Myself and a few buddies (all of us former Army combat medics) started our own brewing project called Danger Shed Ales & Mead to brew and test out the software on real-world data. We brew in the shed and record data on paper as we go, then enter the data into our software at a later time.”

Look how neat and out of the way our tiny computer is

“We wanted a way to skip the paper and pencil and instead input the data directly into the software. Enter the Raspberry Pi! The shed is small, hot, has leaks, and is generally a hostile place for a full-size desktop computer. Raspberry Pi solves our problem in multiple ways: it’s small, portable, durable (in a case), and easily cooled. But on top of that, we are able to run the code using PyCharm, enter data throughout the brewing process, and fix bugs all from the shed!”

The Raspberry Pi in its case inc. fan

“The Raspberry Pi made it easy for us to set up our software and run it as a stand-alone brewing software station.”

Productivity may have slowed when Patrick, Philip, and John remembered you can play Minecraft on the Raspberry Pi

The post This Raspberry Pi–powered setup improves home brewing appeared first on Raspberry Pi.

Code retro games with Digital Making at Home

Join us for Digital Making at Home: this week, young people can recreate classic* video games with us! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.

So get ready to code some classic retro games with us:

Check out this week’s code-along projects!

And tune in on Wednesday 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session!

* Be warned that we’re using the terms ‘classic/retro’ in line with the age of our young digital makers — a LOT of games are retro for them!

The post Code retro games with Digital Making at Home appeared first on Raspberry Pi.

Maddie and Greg go live with computers!

Par : Alex Bate

This Saturday morning, our friends Maddie Moate and Greg Foot will be live at The Centre for Computing History for a computing- and retro gaming-inspired episode of their show Let’s Go Live, and you can tune in from 10am to join the fun.

Retro gaming and computer funtimes

Saturday’s show will be a retro feast of vintage video games, and will answer questions such as ‘What is a computer?’ and ‘How do computers work?’. As always, Maddie and Greg have a number of activities planned, including designing pixel art and going on a tech safari! They’re also extremely excited to step inside a giant computer and try to put it back together!

Let’s Go Live

Let’s Go Live is a family science show that Maddie and Greg began on day 1 of lockdown to help with the challenge of homeschooling. Since then, Maddie and Greg have hosted 50 live shows from their ‘spare room studio’ and caught the attention of millions of families across the world who enjoy tuning into their daily dose of fun, facts, and science activities.

After a short break, the two are now back for the summer holidays and plan to make Let’s Go Live bigger and better than ever by bringing you live shows from unique locations across the UK — a new venue each week!

Maddie and Greg will be live on Facebook and YouTube, and we recommend subscribing to Maddie’s channel to ensure you never miss an episode.

Did you know: all Maddie’s T-shirt merch is made in a factory run by Raspberry Pis? Here’s her video about her new line of T-shirts, and here’s our video about the factory.

But I want more!

We don’t blame you! If you’ve already been following Maddie and Greg on their Let’s Go Live journey throughout lockdown, and you’re looking for more fun online content to entertain you and your family, look no further than the Raspberry Pi Foundation’s Digital Making at Home:

Digital Making at Home

Each week, we share a themed code-along video and host a live stream to inspire families to have fun with coding and digital making at home! Join Christina, Marc, Mr C and their host of special guests as they work their way through our great coding activities. This week, the Digital Making at Home team has been exploring outer space, and they show you how to use Scratch and Python code to race the International Space Station, animate astronauts, and defy gravity.

And our next theme for Digital Making at Home — out tomorrow just when Let’s Go Live finishes — is retro games!

You’ll find all the episodes of Digital Making at Home on our website — new ones are added every Saturday morning. And on the website, you can also tune into our weekly code-along live stream every Wednesday at 2pm BST!

The post Maddie and Greg go live with computers! appeared first on Raspberry Pi.

Watch wildlife with a Raspberry Pi nature camera | HackSpace 33

The past few months have given us ample opportunity to stare at the creatures that reside outside. In issue 33 of HackSpace magazine, out today, Rosie Hattersley looks at ways to track them.

It’s been a remarkable spring and early summer, and not just because many of us have had more time than usual to be able to appreciate our surroundings. The weather has been mild, the skies clear, and pollution levels low. As a result, it ought to be a bumper year for plants and wildlife. Unfortunately, the lockdown limited opportunities for embracing unexpectedly good weather while simultaneously making us more aware of the wildlife on our doorsteps.

“It’s a great time to take a fresh look at the world around us”

If you’re the outdoorsy type who likes to get out and stare intently at feathered friends from the comfort of a large shed on the edge of a lagoon, you may have spent the past few months getting to know suburban birds during your exercise walks, rather than ticking off unusual species. As things finally open up, it’s a great time to take a fresh look at the world around us, and some of the projects focused on the creatures we share it with.

Make your own nature cam

Equipped with a Raspberry Pi connected to a camera and USB power bank, we are able to spy on the wildlife in our garden. The Raspberry Pi Camera Module V2 is a good option here (it’s less intrusive than the newer High Quality Camera, though that would make a superb critter-cam). It’s important not to disturb wildlife with lighting, so use an infrared module, such as the NoIR Camera Module, if you want to snap evening or night-time wildlife activity. Connect the Camera Module to the Camera port on Raspberry Pi using the cable provided, then gently pull up the edges of the port’s plastic clip and insert the ribbon cable. Push the clip back into place and the Camera Module will remain attached. Try our ‘Getting started with the Raspberry Pi Camera Module‘.

A Raspberry Pi plus camera is a great solution for web-enabled snapping

Set up your Raspberry Pi and let it perform any OS updates needed (the Raspberry Pi Imager tool can help).

You’ll need a keyboard and mouse to set up the Raspberry Pi, but you can disconnect them at the end. Insert the updated microSD card and use a regular power supply to start it up (keep your power bank on charge separately while you set things up). Go through the Raspberry Pi setup, making sure you change the default password (since it will be accessible to anyone), and connect to your wireless network. It helps if you can access this network from the garden.

Turn on the interface for the camera, and enable SSH and VNC so you can access Raspberry Pi OS remotely when it’s sitting out in the garden. To do this, open Menu > Preferences > Raspberry Pi Configuration and click on Interface, then set Camera, SSH, and VNC to Enabled (see this documentation). Click Yes when advised that a reboot is needed. 

Next, test the camera. Open a terminal window and enter:

raspistill -o Desktop/image.jpg

A preview window will appear. After a few moments, it will save an image to the Desktop. Double-click the image.jpg file to open it.

You can use Python to take pictures and shoot video. This is handy if you want to create a time-lapse or video camera. This Raspberry Pi Project guide explains how to control the camera with Python.
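
As a flavour of what that looks like, here’s a minimal time-lapse sketch using the picamera library (the interval and file path are just examples):

from picamera import PiCamera
from time import sleep

camera = PiCamera()
camera.start_preview()
sleep(2)  # give the sensor a moment to settle

# Capture a numbered frame every 60 seconds; stop with Ctrl-C
for filename in camera.capture_continuous('/home/pi/frame{counter:03d}.jpg'):
    print('Captured %s' % filename)
    sleep(60)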

You can use a USB power bank to run your Raspberry Pi wildlife camera

Note that recording video will quickly fill up your storage space and drain the battery. A better idea is to leave the preview running and use VNC to view the camera remotely. A neater option is to hook up your Raspberry Pi to YouTube (as explained in this Raspberry Pi infrared bird-box project).

Open a web page and go to studio.youtube.com. Sign in, or set up a YouTube account. You will need to enable permission to live-stream. This involves providing YouTube with your phone number. Click Settings, Channel, and ‘Feature eligibility’, expand ‘Features that require phone verification’, and click ‘Verify phone number’. Type in your phone number, then enter the code that YouTube sends you as a text message. For security reasons, it will take 24 hours for YouTube to activate this feature on your account.

Get your key and add it to the terminal

On the left-hand side of the screen you should see a menu with the My Channel option available:

In the middle of the screen you should see the Video Manager option. On the left you should see a Live Streaming option. Look for and select the ‘Stream now BETA’ option. 

Scroll down to the bottom of the page and you should see the ENCODER SETUP option.

Here there is a Server URL and a Stream name/key. The key is shown as a line of asterisks, until you click the Reveal button. Keep the key secret and don’t share it online. Copy your Stream Key to a text document (password-protect it, ideally).

Open a terminal window and enter this command (replacing <key goes here> with your own key):

raspivid -o - -t 0 -w 1280 -h 720 -fps 25 -b 4000000 -g 50 | ffmpeg -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/<key goes here>

With this running on Raspberry Pi, you can view the stream from your camera on YouTube on any computer. This infrared bird-box project explains more about the command options. 

You’ll want this script to execute on startup. Create a file for your startup script and add the aforementioned raspivid stream command to it:

sudo nano /etc/init.d/superscript

Make the script executable:

sudo chmod 755 /etc/init.d/superscript

And register the script to run at startup:

sudo update-rc.d superscript defaults

You can see details of scripts running at startup here.

Shut down Raspberry Pi and fit the computer and Camera Module inside a case (if you are using one). Position Raspberry Pi in your garden and power it with the USB power bank. It will connect to your wireless network and start streaming to YouTube using your stream key. 

Navigate to your channel on YouTube at any time to see the action taking place in your garden. 

Get HackSpace magazine issue 33 — out today

HackSpace magazine issue 33: on sale now!

HackSpace magazine is out now, available in print from the Raspberry Pi Press online store, your local newsagents, and the Raspberry Pi Store, Cambridge.

You can also download it directly as a PDF from the HackSpace magazine website.

Subscribe to HackSpace magazine for 12 months to get a free Adafruit Circuit Playground, or choose from one of our other subscription offers, including this amazing limited-time offer of three issues and a book for only £10!

The post Watch wildlife with a Raspberry Pi nature camera | HackSpace 33 appeared first on Raspberry Pi.

Super cool favourites picked by fabulous judges

We’re delighted to announce that our special judges — Eben Upton, Hayaatun Sillem, Limor Fried, Mitch Resnick, and Tim Peake — have chosen their favourite projects from the Coolest Projects online showcase! 

Young tech creators from 39 countries are part of the showcase, including Ireland, Australia, Palestine, the UK, the USA, India, and Indonesia. In total, you’ll find an incredible 560 projects from 775 young creators in the showcase gallery.

Our judges have been amazed and inspired by all the young creators’ projects, and they want to highlight a few as their favourites!

Eben Upton’s favourites

Eben Upton is a founder of our organisation, one of the inventors of the Raspberry Pi computer, and CEO of Raspberry Pi Trading. Watch Eben’s favourites.

  • Haya: Bobby ‘A Platformer’
  • Kaushal: Diabetic Retinopathy Detector
  • Zaahra, Eesa: Easy Sylheti
  • Mahmoud: Fighting Against Coronavirus

  • Oisín: MiniGolf In Python
  • Fiona: TeenBeo
  • Artash, Arushi: The Masked Scales: The Sonification of the Impact

Hayaatun Sillem’s favourite projects

Dr Hayaatun Sillem is the CEO of the Royal Academy of Engineering, which brings together the UK’s leading engineers and technologists to promote engineering excellence for the benefit of society. Watch Hayaatun’s favourites.

  • Radu Matei: Agartha Lore. Rebellion
  • Momoka: AI Trash Can
  • Kian: Cellular Ecosystem: Life in a Petri Dish

  • Sama, Sam, Taima, Nouran, Rama: Five Feet Apart
  • Tucker: Rivers.run
  • Cyrus: School Student ePortal

Limor ‘Ladyada’ Fried’s favourites

Limor Fried is an MIT-trained engineer and the founder and owner of Adafruit Industries. Watch Limor’s favourites.

  • Sara, Batool, Rahaf, Nancy: Children Body Language
  • Lars: Colourbird PicoBello
  • Alisa, Michelle: Green Coins
  • Niamh: MineBlower

  • Marah: My School Website
  • Raluca: Protect the Planet!
  • Rhea: The Amazing Photo Filter
A girl presenting a digital making project

Mitch Resnick’s favourites

Mitch Resnick is Professor of Learning Research at the MIT Media Lab, and his Lifelong Kindergarten research group develops and supports the Scratch programming software and online community! Watch Mitch’s favourites.

  • Oisín, Naoise: AUTISTICALLY AWESOME
  • Elana, Saibh: Exploring Schools
  • Mark: Mark’s Coronavirus Game
  • Adarsh: Raspberry Pi–Based, Low-Cost Contactless Vital Signs Monitor
  • Matteo, Massimo, Jacopo: Sheetcheat.xyz
  • Cathal: Ukelectric
A Coolest Projects participant

Tim Peake’s favourites

Tim Peake is a British ESA astronaut who spent 186 days in space on the International Space Station. He’s also a passionate advocate for STEM education. Watch Tim’s favourites.

  • Abhiy: Burglar Buster
  • Carlos, Blanca, Mario: El ojo que te observa (The All-seeing Eye)
  • Zoe: Find It
  • Oluwadabera Jedidiah: Galaxy
  • Patrick: Pear Pad – Have Fun with Apps
  • Hala, Ranwa: Help Me to Learn

Discover over 500 projects

You can explore all the young tech creators’ projects — games, hardware builds, Scratch projects, mobile apps, and more — in our showcase gallery now.

This year’s Coolest Projects online showcase wouldn’t be possible without the support of our Coolest Projects sponsors — thank you!

The post Super cool favourites picked by fabulous judges appeared first on Raspberry Pi.

Raspberry Pi prayer reminder clock

One of our Approved Resellers in the Netherlands, Daniël from Raspberry Store, shared this Raspberry Pi–powered prayer reminder with us. It’s a useful application one of his customers made using a Raspberry Pi purchased from Daniël’s store.

As a Raspberry Pi Official Reseller, I love to see how customers use Raspberry Pi to create innovative products. Spying on bird nests, streaming audio to several locations, using them as a menu in a restaurant, or in a digital signage solution… just awesome. But a few weeks ago, a customer showed me a new use of Raspberry Pi: a prayer clock for mosques.

Made by Mawaqit, this is a narrowcasting solution with a Raspberry Pi at its heart, and it can be used from any browser or smartphone.

Hardware

This project is simple in hardware terms. You just need Raspberry Pi 3 or Raspberry Pi 4, a TV screen, and an HDMI cable.

If you do not have an internet connection, you’ll also need an RTC (real-time clock) module.

With the HDMI cable, Raspberry Pi can broadcast the clock — plus other useful info like the weather, or a reminder to silence your phone — on a wall in the mosque. Awesome! So simple, and yet I have not seen a solution like this before, despite Mawaqit’s application now being used in 51 countries and over 4,609 mosques. And, last I checked, it has more than 185,000 active users!

How to build it

You’ll need to download the pre-configured mawaqit.xz system image and flash it onto your Raspberry Pi’s SD card.

There are then two options: connected and offline. If you set yourself up using the connected option, you’ll be able to remotely control the app from your smartphone or any computer and tablet, which will be synchronised across all the screens connected to Raspberry Pi. You can also send messages and announcements. The latest updates from Mawaqit will install automatically.

That’s a little RTC on the right

If you need to choose the offline option because you’re not able to use the internet at your mosque, it’s important to equip your Raspberry Pi with an RTC, because Raspberry Pi can’t keep time by itself.

All the software, bits of command line code, and step-by-step guidance you’ll need are available on this web page.

These figures update on the Mawaqit site

Open source for all

The Mawaqit project is free of charge, and the makers actually prohibit harnessing it for any monetary gain. They even created an API for you to create your own extensions — how great is that? So, if you want your own prayer clock for a mosque, a school, or just at home, take a look at Mawaqit.net.

If you have the language skills, please head to YouTube and provide community translations for this walkthrough video

The post Raspberry Pi prayer reminder clock appeared first on Raspberry Pi.

How to never miss a video call with Raspberry Pi and NextEvent

Since lockdown started, I’ve found I often miss video meetings. It’s not intentional, I simply lose track of time. Though my phone is set to remind me of upcoming meetings ten minutes before they begin, I have a habit of trying to fill that time with something productive and before I know it, I have Eben on the phone, fifteen minutes after the meeting’s start, asking where I am.

Fixing the issue using code

Due to this, and because I was interested in playing with the Google API and learning a little more Python, I decided to write a simple application that will get your upcoming events from your Google Calendar and give you notifications as often as you want, visually on screen as well as through sound.

I call it NextEvent

Here’s the video tutorial to show you more:

And here’s the written tutorial too!

Installing NextEvent to your Raspberry Pi

To install NextEvent, open a terminal window (Ctrl-Alt-T) on Raspberry Pi and type:

sudo apt update
sudo apt upgrade
git clone https://github.com/ghollingworth/nextevent

This will get the files from my GitHub repository. Next you’ll need to install some dependencies, and I’ve created a script to make that easy:

cd nextevent
./install.sh

The dependencies are dateutil (a library for manipulating time and dates), the Google API client libraries, and the gi-cairo library (which is for drawing the GUI).

Then fire up NextEvent:

python3 nextevent.py

The next thing you’ll see is NextEvent starting up, and then it’ll open Chromium. It is here that you will give Google permission to share your calendar with the application.

You’ll then need to log into your Google account and give NextEvent access to your calendar. Chromium will tell you when everything is fine and you can close the browser.

Now you can see your next five events along with the time left until each one. When the time gets down to five minutes, the application will turn red and ‘ding’ at you. It’ll ‘ding’ twice at one minute to go, and four times when your meeting is about to start!

In case you want to delve into the code, maybe to create a meeting room ‘now and next’ display, the nextevent.py source contains the GUI and event processing part of NextEvent. You should be able to go here and change the number of lines, the colours, and the notification sounds.
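
For a sense of what the event-fetching part involves, here’s a simplified sketch of the kind of Google Calendar API call at the heart of an app like this. It isn’t NextEvent’s exact code, and it assumes you’ve already completed the OAuth flow and saved your token as token.json:

from datetime import datetime, timezone
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file('token.json')
service = build('calendar', 'v3', credentials=creds)

# Fetch the next five events, expanded into single instances, sorted by start
now = datetime.now(timezone.utc).isoformat()
events = service.events().list(
    calendarId='primary', timeMin=now, maxResults=5,
    singleEvents=True, orderBy='startTime').execute()

for event in events.get('items', []):
    start = event['start'].get('dateTime', event['start'].get('date'))
    print(start, event.get('summary', '(no title)'))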

How does it work?

If you’re the sort that likes to know HOW the code works, here’s a follow-up to the tutorial video where I explain exactly that!

The post How to never miss a video call with Raspberry Pi and NextEvent appeared first on Raspberry Pi.

Galactic coding with Digital Making at Home!

Join us for Digital Making at Home: this week, young people can do out-of-this-world coding with our space-themed projects! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.

So get ready to do some galactic coding with us:

Check out this week’s code-along projects!

And tune in on Wednesday 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session!

The post Galactic coding with Digital Making at Home! appeared first on Raspberry Pi.

Remote humidity detector

We know crawl spaces are creepy, sweaty, and confining, but hear us out…

You need to keep an eye on the humidity level in your crawl space, as it can seriously affect the whole house’s overall health. It’s ideal to be able to do this remotely (given the creepy, sweaty atmosphere of the space), and a Raspberry Pi allows this.

Crawl space humidity monitor dashboard — live version at https://go.init.st/fcpp6ll

Jamie Bailey took to Medium to share his Raspberry Pi setup that allows him to monitor the humidity of the crawl space in his home from a mobile device and/or laptop. His setup lets you check on the current humidity level and also see the historical data over time. You can also set alarms to be sent to you via text or email whenever the humidity level exceeds a certain threshold.

The hardware you need

  • Power outlet or extension cord in your crawl space
  • Raspberry Pi (3 or 4) or Raspberry Pi Zero W (or WH)
  • BME280 temperature/humidity sensor
  • Female-to-female jumper wires

The software you need

Jamie’s walk-through is extensive and includes all the command line code you’ll need too, so make sure to check it out if you attempt this build.

Crawl space humidity mobile dashboard — live version at https://go.init.st/ol4pfy0

Assembly

The BME280 sensor has four pins you need to connect to your Raspberry Pi. This will send the humidity data to your Raspberry Pi, which you’ll have already set up to let you know what’s happening remotely.

  • BME280 VIN pin connects to GPIO pin 1 (3.3V)
  • BME280 GND pin connects to GPIO pin 6 (GND)
  • BME280 SCL pin connects to GPIO pin 5 (SCL)
  • BME280 SDA pin connects to GPIO pin 3 (SDA)
You can see the Raspberry Pi in a black case hanging in the centre against a floor joist.
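
With the wiring done, reading the sensor from Python takes only a few lines. This isn’t Jamie’s exact code, but a minimal sketch using the RPi.bme280 library; 0x76 is the sensor’s common I2C address, though some breakout boards use 0x77:

import smbus2
import bme280  # from the RPi.bme280 package

PORT = 1        # I2C bus 1 on modern Raspberry Pi boards
ADDRESS = 0x76  # common BME280 default; some breakouts use 0x77

bus = smbus2.SMBus(PORT)
calibration = bme280.load_calibration_params(bus, ADDRESS)

# Take a single compensated reading; in practice you'd log this in a loop
sample = bme280.sample(bus, ADDRESS, calibration)
print('Humidity: %.1f%%  Temperature: %.1f°C' % (sample.humidity, sample.temperature))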

Once you have all your software sorted and your hardware connected, turn your Raspberry Pi off and take it down to your crawl space (monitor, keyboard, and mouse are no longer necessary). Jamie advises hanging your Raspberry Pi from the floor joists instead of letting it touch the ground, to avoid contact with any water. He put a nail in one of the floor joists and draped the power cord over the nail (see above). Turn your tiny computer on, make sure data starts flowing into your dashboard, and you’ve got yourself a remote humidity sensor!

PS We’re English so… is a crawl space the same as an attic or what? Asking for a friend!

Never mind, Alex asked her American girlfriend.

The post Remote humidity detector appeared first on Raspberry Pi.

Testing young children’s computational thinking

Computational thinking (CT) comprises a set of skills that are fundamental to computing and is being taught in more and more schools across the world. There has been much debate about the details of what CT is and how it should be approached in education, particularly for younger students.

A girl doing digital making on a tablet

In our research seminar this week, we were joined by María Zapata Cáceres from the Universidad Rey Juan Carlos in Madrid. María shared research she and her colleagues have done around CT. Specifically, she presented work on how we can understand what CT skills young children are developing. Building on existing work on assessing CT, she and her colleagues have developed a reliable test for CT skills that can be used with children as young as 5.

María Zapata Cáceres

Why do we need to test computational thinking?

Until we can assess something, María argues, we don’t know what children have or haven’t learned or what they are capable of. While testing is often associated with the final stages of learning, educators need to understand where their students’ skills currently are in order to teach something well and to know what they are aiming for their students to learn. With CT being taught in increasing numbers of schools and in many different ways, María argues that it is imperative to be able to test learners on it.

Screenshot from an online research seminar about computational thinking with María Zapata Cáceres

How was the test developed?

One of the key challenges for assessing learning is knowing whether the activities or questions you present to learners are actually testing what you intend them to. To make sure this is the case, assessments go through a process of validation: they are tried out with large groups to ensure that the results they give are valid. María’s and her colleagues’ CT test for beginners is based on a CT test developed by researcher Marcos Román González. That test had been validated, but since it is aimed at 10- to 16-year-olds, María and her colleagues needed to adapt it for younger children and then validate the adapted test.

Developing the first version

The new test for beginners consists of 25 questions, each with four possible responses, to be answered within 40 minutes. The questions are of two types: one type involves using instructions to draw on a canvas, and the other involves moving characters through mazes. Since the test is for younger children, María and her colleagues designed it to involve as little text as possible, to reduce the need for reading; instead, the test uses self-explanatory symbols.

Screenshot from an online research seminar about computational thinking with María Zapata Cáceres

Developing a second version based on feedback

To refine the test, the researchers consulted a group of 45 experts about the difficulty of the questions and the length of the test. The general feedback was very positive.

Drawing on the experts’ feedback, María and her colleagues made some very specific improvements to the test to make it more appropriate for younger children:

  • The improved test mandates that a verbal explanation be given to children at the start, to make sure they clearly understand how to take the test and don’t have to rely on reading the instructions.
  • In some areas, the researchers added written explanations where experts had identified that questions contained ambiguity that could cause the children to misinterpret them.
  • A key improvement was to adapt the grids in the original test to include pathways between each box of the maze. It was found that children could misinterpret the maze, for example as allowing diagonal moves between squares; the added pathways are visual cues that make it clear this is not possible.
Screenshot from an online research seminar about computational thinking with María Zapata Cáceres

Validating the test

After these improvements, the test was validated with 299 primary school students aged 5–12. To assess the differences the improvements might make, the students were given different versions of the test. María and her colleagues found that the younger students benefited from the improvements, which made the test more reliable for assessing students’ computational thinking: students made fewer errors due to ambiguity and misinterpretation.

Statistical analysis of the test results showed that the improved version of the test is reliable and can be used with confidence to assess the skills of younger children.

What can you use this test for?

Firstly, the test is a tool for educators who want to assess the skills young people have and develop over time. Secondly, the test is also valuable for researchers: it can be used in projects that evaluate the outcomes of different approaches to teaching computational thinking, as well as in projects investigating the effectiveness of specific learning resources, because the test can be given to children before and after they engage with the resources.

Assessment is one of the many tools educators use to shape their teaching and promote the learning of their students, and tools like this CT test developed by María and her colleagues allow us to better understand what children are learning.

Find out more & join our next seminar

The video and slides of María’s presentation are available on our seminars page. To find out more about this test, and the process used to create and validate it, read the paper by María and her colleagues.

Our final seminar of this series takes place on Tuesday 28 July, before we take a break for the summer. In the session, we will explore gender balance in computing, led by Katharine Childs, who works on the Gender Balance in Computing research project at the Raspberry Pi Foundation. You can find out more and sign up to attend for free on our Computing Education Research Seminars page.

The post Testing young children’s computational thinking appeared first on Raspberry Pi.

Who needs vinyl records when you’ve got Raspberry Pi and NFC?

Redditor Mark Hank missed the tactile experience of vinyl records, so he removed the insides of an old Sonos Boost to turn it into a Raspberry Pi– and NFC-powered music player. Yes, this really works:

The Sonos Boost was purchased for just £3 on eBay. Mark pulled all the original insides out of it and repurposed it as what they call a ‘vinyl emulator’, which replicates the experience of playing records better than a simple touchscreen can.

The Boost now contains a Raspberry Pi 3A+ and an ACR122U NFC reader, and it plays a specific album, playlist, or radio station when you tap the matching NFC tag on it. It’s teamed with Sonos speakers and NTAG213 NFC tags. The maker recommends you go with the largest tags you can find, as this will improve read performance; they went with these massive ones.

One of the album covers printed onto thick card

The tags are inside printouts mounted on 1mm thick card (those album cover artwork squares getting chucked at the Sonos in the video), and they’re “super cheap” according to the maker.

You’ll need to install the node-sonos-http-api package on your Raspberry Pi; it’s the basis of the whole back-end of the project. The maker provides full instructions on their original post, including on how to get Spotify up and running on your Raspberry Pi.

The whole setup neatened up

Rather than manually typing HTTP requests into a web browser, the maker wanted to automate the process so that the Raspberry Pi sends them when presented with a certain stimulus (aka when the NFC reader is triggered). They also walk you through this process in their step-by-step instructions.
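
The overall shape of that automation is simple: poll the NFC reader, look up the tag, and fire the matching HTTP request at node-sonos-http-api. Here’s a rough sketch using the nfcpy library; the tag UID, room name, and endpoint are made-up examples, so follow Mark’s instructions for the real ones:

import nfc       # nfcpy, which supports the ACR122U
import requests

# Hypothetical mapping from tag UIDs to node-sonos-http-api requests
TAG_ACTIONS = {
    '04a2241a2c5e80': 'http://localhost:5005/living%20room/playlist/jazz',
}

def on_connect(tag):
    url = TAG_ACTIONS.get(tag.identifier.hex())
    if url:
        requests.get(url)  # the API is driven by plain HTTP GETs
    return True  # block until the tag is removed, then poll again

clf = nfc.ContactlessFrontend('usb')
while True:
    clf.connect(rdwr={'on-connect': on_connect})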

How the maker hid the mess under the display table

The entire build cost around £50, and the great thing is that it doesn’t need to sit inside an old Sonos Boost if you don’t want it to. The reader works through modest-width wood, so you can mount it under a counter, install it in a ‘now listening’ stand, whatever — it’s really up to you.

Full instructions are available on hackster.io! And here’s all the code you’ll need, handily stored on GitHub.

The post Who needs vinyl records when you’ve got Raspberry Pi and NFC? appeared first on Raspberry Pi.

Go sailing with this stop-motion 3D-printed boat

Shot on a Raspberry Pi Camera Module, this stop-motion sequence is made up of 180 photos that took two hours to shoot and another hour to process.

The trick lies in the Camera Module enabling you to change the alpha transparency of the overlay image, which is the previous frame. It’s all explained in the official documentation, but basically, the Camera Module’s preview permits multiple layers to be rendered simultaneously: text, image, etc. Being able to change the transparency from the command line means this maker could see how the next frame (or the object) should be aligned. In 2D animation, this process is called ‘onion skinning’.
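
In practice, the onion-skinning trick is only a few lines with the picamera library. This sketch follows the overlay example in the official documentation; the previous frame’s filename is an assumption:

from picamera import PiCamera
from PIL import Image
from time import sleep

camera = PiCamera(resolution=(1280, 720))
camera.start_preview()

# Load the previous frame and pad it to the renderer's block size
# (width to a multiple of 32, height to a multiple of 16)
img = Image.open('previous_frame.jpg')
pad = Image.new('RGB', (
    ((img.size[0] + 31) // 32) * 32,
    ((img.size[1] + 15) // 16) * 16))
pad.paste(img, (0, 0))

# Render the old frame semi-transparently above the live preview
overlay = camera.add_overlay(pad.tobytes(), size=img.size)
overlay.layer = 3    # the preview itself sits on layer 2
overlay.alpha = 128  # 0 is invisible, 255 is opaque

sleep(60)  # line up the object against the ghosted frame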

You can see the Raspberry Pi Camera Module on the bottom left in front of Yuksel’s hand

So why the Raspberry Pi Camera Module? Redditor /DIY_Maxwell aka Yuksel Temiz explains: “I make stop-motion animations as a hobby, using either my SLR or phone with a remote shutter. In most cases I didn’t need precision, but some animations like this are very challenging because I need to know the exact position of my object (the boat in this case) in each frame. The Raspberry Pi camera was great because I could overlay the previously captured frame into the live preview, and I could quickly change the transparency of the overlay to see how precise the location and how smooth the motion.”

You can easily make simple, linear stop-motion videos by just capturing your 3D printer while it’s doing its thing. Yuksel created a bolting horse (above) in that way. The boat sequence was more complicated though, because it rotates, and because pieces had to be added and removed.

The official docs are really comprehensive and span basic to advanced skill levels. Yuksel even walks you through getting started with the installation of Raspberry Pi OS.

Yuksel’s Raspberry Pi + Lego microscope

We’ve seen Yuksel’s handiwork before, and this new project was made in part by modifying the code from the open-source microscope (above) they made using Raspberry Pi and LEGO. They’re now planning to make a nice GUI and share the project as an open-source stop-motion animation tool.

The post Go sailing with this stop-motion 3D-printed boat appeared first on Raspberry Pi.

Travel the world with a retro musical phone

This rotary phone features a built-in Raspberry Pi that communicates with radiooooo.com (a musical time machine) and an Arduino working behind the map to control the selection of the country. Just pick up the phone, choose a country and a decade, and listen to some great music!

How does it work?

The Raspberry Pi:

  • Plays music through radiooooo.com
  • Detects when the handset is picked up/put down
  • Detects the numbers that are dialled in

The Arduino:

  • Detects which country is selected on the map (via jack connectors)
  • Sends the info to the Raspberry Pi over serial

We saw this project on hackster.io and loved how maker Caroline Buttet dug into the finer detail of an old-fashioned rotary phone’s pick-up/put-down mechanism, as well as how the phone knows which numbers you’re dialling. She goes into more detail about that aspect in the second build video, above.
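
Rotary dials work by pulsing a switch once per digit as the dial spins back (ten pulses for ‘0’), so detecting a dialled number on the Raspberry Pi comes down to counting falling edges until the line goes quiet. Here’s a minimal sketch with RPi.GPIO, assuming the dial’s pulse switch is wired to BCM pin 17 (check Caroline’s instructions for the real wiring):

import RPi.GPIO as GPIO

PULSE_PIN = 17  # assumed wiring: the dial's pulse switch on BCM pin 17

GPIO.setmode(GPIO.BCM)
GPIO.setup(PULSE_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def count_pulses():
    """Count falling edges until the dial has been quiet for ~300 ms."""
    count = 0
    while True:
        edge = GPIO.wait_for_edge(PULSE_PIN, GPIO.FALLING,
                                  timeout=300, bouncetime=20)
        if edge is None:
            return count  # timeout: the digit is complete
        count += 1

while True:
    pulses = count_pulses()
    if pulses:
        print('Dialled:', pulses % 10)  # ten pulses represent the digit 0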

An audio jack being plugged into a world map mounted on a board

Some countries have a jack pin – this is how you select the music

Other bits you’ll need

As well as a Raspberry Pi 4 and an Arduino UNO, you’ll need a world map (obviously) and something to mount it on that can be drilled into. This is because the jack pins you can see in the image above need to poke out of different countries.

Caroline’s grandma donated the old rotary phone she used for this project. You should be able to pick one up from a second-hand shop, or you can get a new handset made in the retro style online.

The shopping list for this build also includes jumper wires, an audio/video cable assembly, an LED, a breadboard, a 3-pin jack socket, and resistors.

A simplified visual representation of how everything works

In her original post, Caroline explains in detail how to connect the rotary phone’s switches to the pins on your Raspberry Pi, how to build in audio sockets on the board you glue your map to, how to run the necessary Python script from the command line, and which Chrome extension to use to make radiooooo.com work with your Raspberry Pi.

The Raspberry Pi inside the rotary phone

And yes, Caroline is one of those most magical of makers who deposits all the code needed for this build on GitHub!

And here’s the Arduino mounted onto the back of the map, with the audio jacks taped up to the holes drilled into different countries

The post Travel the world with a retro musical phone appeared first on Raspberry Pi.

Let’s make it colourful with Digital Making at Home

Join us for Digital Making at Home: this week, young people can learn about using the Sense HAT — or its emulator — with us! With Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.

So get ready to do some colourful coding with us:

Check out this week’s code-along projects!

And tune in on Wednesday 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session.

The post Let’s make it colourful with Digital Making at Home appeared first on Raspberry Pi.

Deep learning cat prey detector

We’ve all been able to check on our kitties’ outdoor activities for a while now, thanks to motion-activated cameras. And the internet’s favourite cat flap even live-tweets when it senses paws through the door.

A nightvision image of a cat approaching a cat flap with a mouse in its mouth

“Did you already make dinner? I stopped on the way home to pick this up for you.”

But what’s eluded us “owners” of felines up until now is the ability to stop our furry companions from bringing home mauled presents we neither want nor asked for.

A cat flap bouncer powered by deep learning

Now this Raspberry Pi–powered machine learning build, shared by reddit user u/eee_bume, can help us out: at its heart, there’s a convolutional neural network cascade that detects whether a cat is trying to enter a cat flap with something in its maw. (No word on how many half-consumed rodents the makers had to dispose of while training the machine learning model.)

The neural network first detects the whole cat in an image; then it homes in on the cat’s maw. Image classification is performed to detect whether there is anything in or around the maw. If the network thinks the cat is trying to smuggle caught contraband into the house, it’s a ‘no’ from this virtual door bouncer.

The system runs on Raspberry Pi 4 with an infrared camera at an average detection rate of around 1 FPS. The PC-Val value, which represents the certainty of the prey classification, uses a prey/no-prey threshold of 0.5.
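
Pieced together from that description, the cascade’s decision logic looks something like the sketch below. The three model functions are hypothetical stand-ins for the makers’ trained networks, included only to make the control flow concrete:

PREY_THRESHOLD = 0.5  # the prey/no-prey certainty threshold mentioned above

def detect_cat(frame):
    """Hypothetical stage 1: return the cat's bounding box, or None."""
    raise NotImplementedError

def locate_maw(frame, cat_box):
    """Hypothetical stage 2: return a crop around the cat's maw."""
    raise NotImplementedError

def prey_probability(maw_crop):
    """Hypothetical stage 3: classifier confidence that prey is present."""
    raise NotImplementedError

def should_unlock(frame):
    cat_box = detect_cat(frame)
    if cat_box is None:
        return False  # nobody at the flap
    maw_crop = locate_maw(frame, cat_box)
    # It's a "no" from the bouncer if the classifier is confident enough
    return prey_probability(maw_crop) < PREY_THRESHOLD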

The homemade setup, including small camera lights and sensors

The infrared camera setup, powered by Raspberry Pi

How to get enough training data

This project formed Nicolas Baumann’s and Michael Ganz’s spring semester thesis at the Swiss Federal Institute of Technology. One of the problems they ran into while training their device is that cats are only expected to enter the cat flap carrying prey 3% of the time, which leads to a heavily imbalanced classification problem. It would have taken a loooong time if they had just waited for Nicolas and Michael’s pets to bring home enough decomposing gifts.

Lots of different cats’ faces close up, some with prey in their mouths, some without

The cutest mugshots you ever did see

To get around this, they custom-built a scalable image-data-gathering network to simplify and maximise the collection of training data. It features multiple distributed Camera Nodes (CN), a centralised main archive, and a custom labelling tool. As a result of the data-gathering network, 40GB of training data has been amassed.

What is my cat eating?!

The makers also took the time to train their neural network to classify different types of prey. So far, it recognises mice, lizards, slow-worms, and birds.

Infrared shots of one cat while the camera decides if it has prey in its mouth or not

“Come ooooon, it’s not even a *whole* mouse, let me in!”

It’s still being tweaked, but at the moment the machine learning model correctly detects when a cat has prey in its mouth 93% of the time. But it still falsely accuses kitties 28% of the time. We’ll leave it to you to decide whether your feline companion will stand for that kind of false positive rate, or whether it’s more than your job’s worth.

The post Deep learning cat prey detector appeared first on Raspberry Pi.

Distributing Raspberry Pi computers to help families access education

The closure of schools has called attention to the digital divide, which sees poorer families struggling or unable to access education*. The coronavirus pandemic didn’t cause this divide, but it has highlighted it and its impact on many people in our society.

As our Foundation CEO Philip outlined back in April, part of our response to the pandemic and social distancing measures is to send free Raspberry Pi computers to students who currently lack the technology to complete their school work at home. Thanks to generous funding from the Bloomfield Trust, we have started to distribute Raspberry Pis in the UK.

Who is receiving Raspberry Pis?

Our approach for this initiative is to work with partner charities that help us identify the right recipients for the computers; we want them to go to young people who don’t have a suitable device for completing their schoolwork in their home.

The first partner charity we’ve been working with, whose team has been so patient as we’ve learned together how to do this, are the incredible School Home Support, a youth organisation working to improve school attendance, behaviour, and engagement in learning. With their help, we’ve so far distributed more than 120 Raspberry Pi 4 computers (with 2GB RAM), together with all the peripherals, including a screen. School Home Support were also able to secure funding to provide high-speed internet access to the recipients’ homes so students can reliably connect to their schools.

Families receive a Raspberry Pi Desktop Kit and a screen. Our partner charity funds reliable internet access.

How are we helping them set up?

The young people set up their Raspberry Pis themselves, and we provide detailed instructions to guide them through this process. Most of the families have never used a computer like Raspberry Pi, so they need encouragement and support to get up and running. This is being provided both by the excellent School Home Support practitioners, and by Raspberry Pi team members, who answer questions when recipients get stuck.

“My mum was confused by the setup at first, but having a call to explain it really helped, and now we see how easy it is to set up and use.”

Raspberry Pi recipient

Recipients are already benefiting

Before receiving these computers, many of the young people only had occasional access to their parents’ phone to find out what school work had been set for them, and to complete it.

“It’s much easier to do my schoolwork now on the bigger screen. I feel like I’m learning more.”

Raspberry Pi recipient
A young girl sitting at a desk using a Raspberry Pi computer

We’re getting feedback that the Raspberry Pis help recipients focus on their work; they now have their own space to work in and more time to complete schoolwork, as they’re no longer rushing to share a device with other family members.

“I don’t always enjoy doing homework, but it’s better now that I have my own computer to do my work.”

Raspberry Pi recipient

Having a Raspberry Pi has increased the students’ motivation, and it has reduced stress — for parents as well as children:

“The Raspberry Pi kit came at a time when I really needed it. Up until that point, T had to do his homework and access the school’s home learning using my phone, which was not very practical at all. This was made worse by the fact that he had to share my phone with his sister, which ended up causing a lot of arguments. He was so pleased to receive a computer he could use. At first he had a lot of fun playing different games on it, and I was surprised about how well he was able to understand and help me set it up. The only negative was that he enjoyed playing games on it a bit too much! I feel relieved that he has his own computer which he can use. It was very stressful and frustrating having to use my mobile phone. There were times when T would be using my phone to do his work and he would be interrupted if I got a phone call, which meant that he would have to log in again, and sometimes would lose his work.”

Parent of a Raspberry Pi recipient

What are we doing next?

It’s wonderful to hear stories like this of how our computers make a difference in people’s lives. We’re still learning lots: while many families have been able to get up and running easily and quickly, others are still overwhelmed because they are unfamiliar with the device. We know we need to do more to build their confidence.

As we’re learning, we’re also talking to our next charity partners in the UK to help us identify more recipients, and to help the recipients get set up on their new Raspberry Pi devices.

If you are part of an organisation that could partner with us to support families in need of access to technology, please email us at stayconnected@raspberrypi.org. Be aware that your organisation would need to fund the families’ internet access.


* The impact of the digital divide on students has for example been reported on by BCS, the Chartered Institute for IT and by the Institute for Fiscal Studies.

The post Distributing Raspberry Pi computers to help families access education appeared first on Raspberry Pi.

Learn at home #4: All about Scratch

There’s no question that families have faced disruptions and tough challenges over the last few months. For the parents and carers who’ve been supporting their children with learning at home, it can feel overwhelming, stressful, rewarding — or all three! As many children are still carrying on with learning at home, we are supporting them with extra resources, and parents with support tutorials.

In our last blog post for parents, we talked to you about debugging — finding and fixing errors in code. This week we’re covering the amazing things young people can do and learn with Scratch — it’s not just for beginners!

Getting the most out of Scratch

Scratch is a block-based programming tool that lets you create lots of different projects. It’s often one of the first programming tools children use in primary school. We’ve made a video introduction to Scratch in case you’re less familiar with it.

If your child at home is ready to try more challenging coding tasks, Scratch is still a great tool for them, as they can use it to build some truly epic projects.

Joel Bayubasire, CoderDojo

In this video, Mark shows you examples from the Scratch community and signposts useful resources that will support you and your children as they develop their confidence in Scratch.

Scratch is a great tool for building complex, unique, and challenging projects. For example, the Scratch game Fortnite Z involves 13,500 Scratch blocks and took more than four months to develop. People have also built astounding 3D graphic projects in Scratch!

3D model of a glycine molecule
A 3D model of a molecule, built in Scratch

You can find other amazing examples if you explore the Coolest Projects online showcase. Our free annual tech showcase for young people has lots of great Scratch projects: plenty of inspiration for you and your young people at home.

Exploring and learning in the Scratch community 

The Scratch community is a great place for young people to safely share their projects with each other all year round, and to like and comment on them. It’s a real treasure trove they can explore to find inspiration and learning opportunities, and for young people who are spending more time at home, it offers a way to connect to peers around the world.

In this video, Katharine shows you how the team behind Scratch keeps the community safe, where you as a parent can find the information you need, and how your child will engage with the community.

Code along with us! 

To keep young people entertained and learning, we’re running a Digital Making at Home series. You’ll find new, free code-along videos every Monday, with different themes and projects for all levels of experience. We have lots of Scratch code-alongs on offer! We also live-stream a code-along session every Wednesday at 14:00 BST at rpf.io/home.

Digital Making at Home from the Raspberry Pi Foundation V1

We want your feedback

We’ve been asking parents what they’d like to see as part of our initiative to support them and the young people they care for. They’ve sent us some great suggestions so far! If you’d like to share your thoughts too, email us at parents@raspberrypi.org.

Sign up for our bi-weekly emails, tailored to your needs

Sign up now to start receiving free activities suitable to your child’s age and experience level straight to your inbox. And let us know what you as a parent or guardian need help with, and what you’d like more or less of from us.


PS All of our resources are completely free. This is made possible thanks to the generous donations of individuals and organisations. Learn how you can help too!

The post Learn at home #4: All about Scratch appeared first on Raspberry Pi.

Website hosting on Raspberry Pi 4 with Mythic Beasts

By: Alex Bate

Here’s Mythic Beasts’ Pete Stevens to talk about how we run the Raspberry Pi website on Raspberry Pis, and how Mythic Beasts can run your site on Raspberry Pis too!

Rent a Raspberry Pi

In late 2016, Mythic Beasts launched a Raspberry Pi cloud, allowing you to rent a Raspberry Pi 3 as a service.

Raspberry Pi 4 is a much more capable computer, with more than twice the performance and, crucially, four times the memory. We were so excited by it, we bet Eben Upton a beer that we could host the launch site for Raspberry Pi 4 on Raspberry Pi 4. We’d demonstrated that it was just about possible to serve a normal day’s traffic from a cluster of eight Raspberry Pi 3s, but launch day is a bit more exciting — tens of millions of visitors rather than a million.

Eben, supremely confident in the work that his team had done, took the bet. On Thursday 20 June 2019, he dropped off eighteen 4GB RAM Raspberry Pi 4 computers that had previously been used in testing, and we set about configuring them to replace all the web servers that run the Raspberry Pi blog.

  • 14× Dynamic web server (PHP/Apache)
  • 2× Static web server (Apache, flat files)
  • 2× Memcache (in-memory store to accelerate web serving)

We started the build on Friday 21 June and immediately ran into our first ‘chicken and egg’ problem. The Raspberry Pi web servers are configured with Puppet, based (at the time) on Debian Jessie. Raspberry Pi 4’s release OS was a not-yet-released version of Debian Buster, which at the time wasn’t supported by Puppet. In conjunction with Greg Annandale at the Raspberry Pi Foundation, we created a Puppet build that would run on Raspberry Pi 4, updated the configuration from Jessie to Buster (newer Apache/PHP), and did some testing.

A rack of Raspberry Pis and a mess of wires connecting them
The enclosures were built to accommodate a larger PoE HAT, which is why this doesn’t stand quite as neatly as it might.

We have pre-built enclosures from our Raspberry Pi 3 cloud, and we followed the same approach, using Power over Ethernet to provide power and data to each Raspberry Pi 4; this dramatically reduces the cabling and complexity of the setup. Late on Friday 21 June, just over 24 hours after the computers arrived, we moved the hastily constructed Raspberry Pi 4 setup to Sovereign House, a key Mythic Beasts data centre and one of the best-connected buildings in Europe.

Over the course of a few hours, we gradually moved the entire production load from the existing virtual servers to the Raspberry Pi 4 cloud and every page from the blog was being served directly off Raspberry Pi 4. We left it for two days to bed in before the real test: launch day.

The launch was almost perfectly smooth, and the Raspberry Pi cluster coped fine with the tens of millions of users. However, the cluster and website are fronted by Cloudflare, which accelerates static resources and protects the site from denial-of-service attacks. Unfortunately, Cloudflare had a two-hour outage in the middle of the launch thanks to a misconfigured internet optimiser run by a customer of Verizon, so the Raspberry Pi 4 cluster had a long lunch break, wondering where all the users had gone.

We ran the website on the Raspberry Pi 4 cluster for over a month before reverting to the usual virtual server-based environment. We’d proved that Raspberry Pi 4 would make an awesome hosting platform.

Commercialising Raspberry Pi 4 as a service

We were already running Raspberry Pi 3 as a service for many customers (e.g. PiWheels, which builds Python packages for Raspberry Pi), and being able to spin up a Raspberry Pi 3 on demand is incredibly useful.

At launch, Raspberry Pi 4 wasn’t suitable. We rely on network boot to be able to re-image a Raspberry Pi remotely. SD cards just aren’t very reliable; visiting the data centre for manual intervention on every SD card failure is not only expensive in time, but would also mean maintaining physical access to every Raspberry Pi 4 in our cloud. Netboot means that we can build large enclosures of 108 Raspberry Pis and seal them in, as they will never require physical attention. If one fails — and we’ve not yet seen one fail — we can shut it down and take it out of our database.

For Raspberry Pi 4 we had to wait for network booting to be a reality. We had access to beta firmware in November 2019 and built a sample Raspberry Pi 4 network boot setup. We then had to integrate it into our management code, build Raspberry Pi 4–compatible operating system images, and enhance our billing to cope with multiple models and by-the-hour billing. Then we had to do a file server and network upgrade: serving lots of machines with true gigabit needs more ‘oomph’ than the 100Mbps of Raspberry Pi 3. This also all needed to be backward-compatible so as not to break the existing Raspberry Pi 3 users. On 17 June 2020 we launched, and Raspberry Pi 4 is now ready to order in our cloud.

Is it any good?

Yes. Raspberry Pi 4 is twice as fast as similarly sized instances in AWS, for a quarter of the price. Just see for yourself:

                              Raspberry Pi 4      a1.large            m6g.medium
Spec                          4 cores @ 1.5GHz,   2 cores,            1 core,
                              4GB RAM             4GB RAM             4GB RAM
Monthly price                 £8.63               $45.35 (~£36.09)    $34.69 (~£27.61)
Requests per second           107                 52                  57
Mean request time             457ms               978ms               868ms
99th percentile request time  791ms               1247ms              1056ms
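
If you’d like to sanity-check figures like these against your own server, a few lines of Python are enough to collect request timings. Here’s a minimal sketch using only the standard library; the URL and request count are placeholders, not the harness behind the table above.

# Minimal request-timing sketch; URL and RUNS are placeholders, and
# this is not the benchmarking harness behind the table above.
import statistics
import time
import urllib.request

URL = "https://example.com/"  # hypothetical endpoint to test
RUNS = 200

timings = []
for _ in range(RUNS):
    start = time.perf_counter()
    urllib.request.urlopen(URL).read()
    timings.append((time.perf_counter() - start) * 1000)  # milliseconds

timings.sort()
print(f"mean request time: {statistics.mean(timings):.0f}ms")
print(f"99th percentile:   {timings[int(len(timings) * 0.99) - 1]:.0f}ms")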

But what about 8GB and 64-bit Raspberry Pi OS?

That sounds like a jolly nice idea. Keep watching the Mythic Beasts blog for updates.

The post Website hosting on Raspberry Pi 4 with Mythic Beasts appeared first on Raspberry Pi.

Reducing the load: ways to support novice programmers

What’s your experience of learning to program? Have you given up and thought it just wasn’t for you? This has been the case for many people — and it’s the focus of a lot of research. Now that teaching programming is in the curriculum in many countries around the world, it’s even more important that we understand what we can do to make learning to program accessible and achievable for all students.

What is cognitive load for learners?

In education, one of the problems thought to cause students difficulty with learning anything — not just programming — is cognitive load. Cognitive load, a concept introduced in the 1980s by John Sweller, has received a lot of attention in the last few years. It is based on the idea that our working memory (the part of our memory that processes what we are currently doing) can only deal with a limited amount of information at any one time. For example, you can imagine that when you are just starting to learn to program, there is an awful lot going on in your working memory, and this can make the task of assimilating it all very challenging; selection, loops, arrays, and objects are all tricky concepts that you need to get to grips with. Cognitive load is a stress on a learner’s working memory, reducing their ability to process and learn new information.

Dr Briana Morrison (University of Nebraska-Omaha)

Finding ways of teaching programming that reduce cognitive load is really key for all of us engaged in computing education, so we were delighted to welcome Dr Briana Morrison (University of Nebraska-Omaha) as the speaker at our latest research seminar. Briana’s talk was titled ‘Using Subgoal Labels to Reduce Cognitive Load in Introductory Programming’.

The thrust of Briana’s and her colleagues’ research is that, as educators, we can design instructional experiences around computer programming so that they minimise cognitive load. Using worked examples with subgoal labels is one approach that has been shown to help a lot with this. 

Subgoal labels help students memorise and generalise

Think back to the way you may have learned mathematics: in maths, worked examples are often used to demonstrate how to solve a problem step by step. The same can be done when teaching programming. For example, if we want to write a loop in Python, the teacher can show us a step-by-step approach using an example, and we can then apply this approach to our own task. Sounds reasonable, right?

What subgoal labels add is that, rather than just calling the steps of the worked example ‘Step 1’, ‘Step 2’, etc., the teacher uses memorable labels. For example, a subgoal label might be ‘define and initialise variables’. Such labels not only help us to remember, but more importantly, they help us to generalise the teacher’s example and grasp how to use it for many other applications.
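
To make this concrete, here is what a subgoal-labelled worked example might look like for summing a list in Python. It’s our own illustration rather than an example taken from Briana’s study materials.

# An illustrative worked example with subgoal labels written as comments;
# not taken from the study materials discussed in the seminar.

# Subgoal: define and initialise variables
numbers = [4, 8, 15, 16, 23, 42]
total = 0

# Subgoal: loop through the data
for number in numbers:
    # Subgoal: update the running total
    total = total + number

# Subgoal: report the result
print("The total is", total)

Because each label names the purpose of a step rather than its syntax, a learner can reuse the same structure for a different task, such as finding the largest number instead of the total.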

Subgoal labels help students perform better

In her talk, Briana gave us examples of subgoal labels in use and explained how to write subgoal labels, as well as how to work with subject experts to find the best subgoal labels for a particular programming construct or area of teaching. She also shared with us some very impressive results from her team’s research examining the impact of this teaching approach. 

Screenshot of Dr Briana Morrison's research seminar talk

Briana and her colleagues have carried out robust studies comparing students who were taught using subgoals with students who weren’t. The study she discussed in the seminar involved 307 students; those who learned with worked examples containing subgoal labels gave more complete answers to questions and showed a higher-level understanding of the programming constructs than students whose worked examples lacked the labels. The study also found that the impact of subgoal labels was even more marked for students in at-risk groups (i.e. students at risk of performing badly or of dropping out).

It seems that this teaching approach works really well. The study’s participants were students in introductory computer science classes at university, so it would be interesting to see whether these results can be replicated at school level, where arguably cognitive load is even more of an issue.

Briana’s seminar was very well received, with attendees asking lots of questions about the details of the research and how it could be replicated. Her talk even included some audience participation, which got us all tapping our heads and rubbing our bellies!

Screenshot of Dr Briana Morrison's research seminar talk

Very helpfully, Briana shared a list of resources related to subgoal labels, which you can access via her talk slides on our seminars page.

You can also read more about the background and practical application of cognitive load theory and worked examples, including subgoal labels, in the Pedagogy Quick Read series we’re producing as part of our work in the National Centre for Computing Education.

Next up in our series

If you missed the seminar, you can find Briana’s presentation slides on our seminars page, where we’ll also soon upload a recording of her talk.

In our next seminar on Tuesday 14 July at 17:00–18:00 BST / 12:00–13:00 EDT / 9:00–10:00 PDT / 18:00–19:00 CEST, we’ll welcome Maria Zapata, Universidad Rey Juan Carlos, Madrid, who will be talking about computational thinking and how we can assess the computational thinking skills of very young children. To join the seminar, simply sign up with your name and email address and we’ll email you the link and instructions. If you attended Briana’s seminar, the link remains the same.

The post Reducing the load: ways to support novice programmers appeared first on Raspberry Pi.

New keyboards for Portugal, Norway, Sweden, and Denmark

It feels like just yesterday that we released the Raspberry Pi keyboard and hub to the world. Well, it turns out it’s been more than a year, and time really has flown for the next stage of this project, which brings four new language/country options: Portugal, Norway, Sweden, and Denmark. They’re available to buy now from Raspberry Pi Approved Resellers.

Raspberry Pi keyboards

The keyboard and hub has been a great success, with many users adopting our Raspberry Pi red and white colour scheme for their setup. As well as this satisfying uptake of the keyboard on its own, we’ve also sold tens of thousands of Raspberry Pi Desktop Kits, which include a keyboard alongside the official mouse, the Beginner’s Guide, and, of course, a Raspberry Pi.

Raspberry Pi official keyboard
If I say so myself, it’s quite a cool-looking desktop setup, with the boxes and cables all colour-coordinated.

We made the black and grey version for users who own a black and grey Raspberry Pi case but, with four out of five people choosing the red and white variant, it just goes to show what a bit of company branding can do for business!

We’ve found that the US keyboard is the most popular model, with over half our users choosing that option. As a Brit, I prefer the chunkier Enter key of the UK keyboard.

Close-up photo of UK keyboard Enter key
Easy to find

New variants

There is always a demand to support more users with keyboards to match their country and language so, as a second phase, we are announcing keyboards for the following countries:

  • Portugal
  • Norway
  • Sweden
  • Denmark
Photo: Raspberry Pi Portugal keyboard in red and white
The new European Portuguese variant of our keyboard and hub

These new keyboards are available now in red and white, with black and grey options coming soon. They are just print changes from previously released variants, but the devil proved to be in the detail.

For example, we hoped early on that the Portuguese keyboard would suit users in Brazil too, but we learned that Brazilian and European Portuguese keyboard layouts are quite different. Given the differences between UK and US keyboard layouts, this really shouldn’t have surprised us!

There is a very subtle difference between the Norway and Denmark keyboards. I wonder if anyone can spot it?

  • Norway keyboard template
  • Denmark keyboard template
Spot the difference

We also discovered that a Finnish keyboard layout exists, but I couldn’t identify any differences between it and the Sweden keyboard. While I don’t speak Finnish, I do speak Swedish – an awesome language that everyone should learn – so I came to these investigations with a bit of relevant knowledge. I found that there are very small changes between different manufacturers, but no consistent differences between Finnish and Swedish keyboards, and ultimately I was guided by what Raspberry Pi OS expects as the correct function for these keyboards. I do hope I am right about these two keyboards being the same… I expect I’ll soon find out in the comments!

Photo: Raspberry Pi Sweden keyboard in red and white
Our new Swedish keyboard. If you know of a way in which a Finnish keyboard should differ from this, please tell us in the comments

We know that many users are waiting for a Japan keyboard variant. We hardly ever talk about new products before they are released, but we’re breaking our rule in this case to let you know that we hope to have some news about this very soon – so watch this space!

I’d like to give special thanks to Sherman Liu of Gembird for the new key matrix design, and Craig Wightman of Kinneir Dufort for his patience in designing all the key print revisions.

Happy coding, folks!

The post New keyboards for Portugal, Norway, Sweden, and Denmark appeared first on Raspberry Pi.

Code Jetpac’s rocket building action | Wireframe #40

Pick up parts of a spaceship, fuel it up, and take off in Mark Vanstone’s Python and Pygame Zero rendition of a ZX Spectrum classic

The original Jetpac, in all its 8-bit ZX Spectrum glory

For ZX Spectrum owners, there was something special about waiting for a game to load, with the sound of zeros and ones screeching from the cassette tape player next to the computer. When the loading screen – an image of an astronaut and Ultimate Play the Game’s logo – appeared, you knew the wait was going to be worthwhile. Created by brothers Chris and Tim Stamper in 1983, Jetpac was one of the first hits for their studio, Ultimate Play the Game. The game features the hapless astronaut Jetman, who must build and fuel a rocket from the parts dotted around the screen, all the while avoiding or shooting swarms of deadly aliens.

This month’s code snippet will provide the mechanics of collecting the ship parts and fuel to get Jetman’s spaceship to take off.  We can use the in-built Pygame Zero Actor objects for all the screen elements and the Actor collision routines to deal with gravity and picking up items. To start, we need to initialise our Actors. We’ll need our Jetman, the ground, some platforms, the three parts of the rocket, some fire for the rocket engines, and a fuel container. The way each Actor behaves will be determined by a set of lists. We have a list for objects with gravity, objects that are drawn each frame, a list of platforms, a list of collision objects, and the list of items that can be picked up.

Jetman jumps inside the rocket and is away. Hurrah!

Our draw() function is straightforward: it loops through the items in the draw list, then draws a couple of conditional elements afterwards. The update() function is where all the action happens: we check for keyboard input to move Jetman around, apply gravity to everything on the gravity list, check for collisions with the platform list, pick up the next item if Jetman is touching it, apply any thrust to Jetman, and move any items Jetman is holding so that they travel with him. When that’s all done, we can check whether refuelling levels have reached the point where Jetman can enter the rocket and blast off.

If you look at the helper functions checkCollisions() and checkTouching(), you’ll see that they use different methods of collision detection. The first checks for a collision with a specified point, so we can detect contact with the top or bottom of an Actor; the second uses rectangle (bounding box) collision, registering a hit whenever the bounding boxes of two Actors intersect. The other helper function, applyGravity(), makes everything on the gravity list fall downward until the base of the Actor hits something on the collide list.
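
To give a flavour of the approach, here’s a simplified sketch of what a gravity helper along these lines might look like, assuming Pygame Zero Actors stored in gravity and collide lists. It’s an illustration of the technique rather than Mark’s exact code, which you can download below.

# Simplified sketch of a gravity helper in Pygame Zero; an illustration
# of the technique described above, not Mark's exact code.
gravity = []  # Actors that should fall (populated during setup)
collide = []  # solid Actors that falling items can land on
FALL_SPEED = 2

def applyGravity():
    for item in gravity:
        item.y += FALL_SPEED  # let the Actor fall a little each frame
        for surface in collide:
            # stop falling once the Actor's base touches something solid
            if surface.collidepoint(item.midbottom):
                item.bottom = surface.top
                break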

So that’s about it: assemble a rocket, fill it with fuel, and lift off. The only thing that needs adding is a load of pesky aliens and a way to zap them with a laser gun.

Here’s Mark’s Jetpac code. To get it running on your system, you’ll need to install Pygame Zero. And to download the full code and assets, head here.

Get your copy of Wireframe issue 40

You can read more features like this one in Wireframe issue 40, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 40 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code Jetpac’s rocket building action | Wireframe #40 appeared first on Raspberry Pi.

OpenVX API for Raspberry Pi

By: Alex Bate

Raspberry Pi is excited to bring the Khronos OpenVX 1.3 API to our line of single-board computers. Here’s Kiriti Nagesh Gowda, AMD‘s MTS Software Development Engineer, to tell you more.

OpenVX for computer vision

OpenVX™ is an open, royalty-free API standard for cross-platform acceleration of computer vision applications developed by The Khronos Group. The Khronos Group is an open industry consortium of more than 150 leading hardware and software companies creating advanced, royalty-free acceleration standards for 3D graphics, augmented and virtual reality, vision, and machine learning. Khronos standards include Vulkan®, OpenCL™, SYCL™, OpenVX™, NNEF™, and many others.

Now with added Raspberry Pi

The Khronos Group and Raspberry Pi have come together to work on an open-source implementation of OpenVX™ 1.3 that passes conformance on Raspberry Pi, covering the Vision, Enhanced Vision, and Neural Net conformance profiles specified in OpenVX 1.3.

Application developers may always freely use Khronos standards when they are available on the target system. To enable companies to test their products for conformance, Khronos has established an Adopters Program for each standard. This helps to ensure that Khronos standards are consistently implemented by multiple vendors to create a reliable platform for developers. Conformant products also enjoy protection from the Khronos IP Framework, ensuring that Khronos members will not assert their IP essential to the specification against the implementation.

OpenVX enables performance- and power-optimized computer vision processing, which is especially important in embedded and real-time use cases such as face, body, and gesture tracking, smart video surveillance, advanced driver assistance systems (ADAS), object and scene reconstruction, augmented reality, visual inspection, robotics, and more. Developers can take advantage of this robust API in their applications, knowing that the application is portable across all conformant hardware.

Below, we will go over how to build and install the open-source OpenVX 1.3 library on Raspberry Pi 4 Model B. We will run the conformance for the Vision, Enhanced Vision, & Neural Net conformance profiles and create a simple computer vision application to get started with OpenVX on Raspberry Pi.

OpenVX 1.3 implementation for Raspberry Pi

The OpenVX 1.3 implementation is available on GitHub. To build and install the library, follow the instructions below.

Build OpenVX 1.3 on Raspberry Pi

Git clone the project with the recursive flag to get submodules:

git clone --recursive https://github.com/KhronosGroup/OpenVX-sample-impl.git

Note: The API Documents and Conformance Test Suite are set as submodules in the sample implementation project.

Use the Build.py script to build and install OpenVX 1.3:

cd OpenVX-sample-impl/
python Build.py --os=Linux --venum --conf=Debug --conf_vision --enh_vision --conf_nn

Build and run the conformance:

export OPENVX_DIR=$(pwd)/install/Linux/x32/Debug
export VX_TEST_DATA_PATH=$(pwd)/cts/test_data/
mkdir build-cts
cd build-cts
cmake -DOPENVX_INCLUDES=$OPENVX_DIR/include -DOPENVX_LIBRARIES=$OPENVX_DIR/bin/libopenvx.so\;$OPENVX_DIR/bin/libvxu.so\;pthread\;dl\;m\;rt -DOPENVX_CONFORMANCE_VISION=ON -DOPENVX_USE_ENHANCED_VISION=ON -DOPENVX_CONFORMANCE_NEURAL_NETWORKS=ON ../cts/
cmake --build .
LD_LIBRARY_PATH=./lib ./bin/vx_test_conformance

Sample application

Use the open-source samples on GitHub to test the installation.

The post OpenVX API for Raspberry Pi appeared first on Raspberry Pi.

Volunteer your Raspberry Pi to IBM’s World Community Grid

By: Alex Bate

IBM’s World Community Grid is working with scientists at Scripps Research on computational experiments to help find potential COVID-19 treatments. Anyone with a Raspberry Pi and an internet connection can help.

Why is finding potential treatments for COVID-19 so important?

Scientists all over the globe are working hard to create a vaccine that could help prevent the spread of COVID-19. However, this process is likely to take many months — or possibly even years.

In the meantime, scientists are also searching for potential treatments for the symptoms of COVID-19. A project called OpenPandemics – COVID-19 is one such effort. The project is led by researchers in the Forli Lab at Scripps Research, who are enlisting the help of World Community Grid volunteers.

What is World Community Grid and how does it work? 

World Community Grid is an IBM social responsibility initiative that supports humanitarian scientific research. 

Image text reads: Accelerate research with no investment of time or money. When you become a World Community Grid volunteer, you donate your device's spare computing power to help scientists solve the world's biggest problems in health and sustainability.

As a World Community Grid volunteer, you download a secure software program to your Raspberry Pi, macOS or Windows computer, or Android device. This software program (called BOINC) is used to run World Community Grid projects, and is compatible with Raspberry Pi OS and most other operating systems. Then, when your device is not using its full power, it automatically runs a simulated experiment in the background that will help predict the effectiveness of a particular chemical compound as a possible treatment for COVID-19. Finally, your device automatically returns the results of the completed simulation and requests the next one.

Over the course of the project, volunteers’ devices will run millions of simulations of small molecules interacting with portions of the virus that causes COVID-19. This is a process known as molecular docking, which is the study of how two or more molecules fit together. When a simulated chemical compound fits, or ‘docks’, with a simulation of part of the virus that causes COVID-19, that interaction may point to a potential treatment for the disease.

An image of a calendar with the text: Get results that matter. As a World Community Grid volunteer, your device does research calculations when it's idle, so just by using it as you do every day, you can help scientists get results in months instead of decades. With your help, they can identify the most important areas to study in the lab, bringing them one step closer to discoveries that save lives and address global problems.

World Community Grid combines the results from your device along with millions of results from other volunteers all over the world and sends them to the Scripps Research team for analysis. While this process doesn’t happen overnight, it accelerates dramatically what would otherwise take many years, or might even be impossible.

OpenPandemics – COVID-19 is the first World Community Grid project to harness the power of Raspberry Pi devices, but the World Community Grid technical team is already working to make other projects available for Raspberry Pi very soon.

Getting ready for future pandemics

Scientists have learned from past outbreaks that pandemics caused by newly emerging pathogens may become more and more common. That’s why OpenPandemics – COVID-19 was designed to be rapidly deployed to fight future diseases, ideally before they reach a critical stage.

An image of a scientist using a microscope. Text reads: Your device could help search for potential treatments for COVID-19. Scientists are using World Community Grid to accelerate the search for treatments for COVID-19. The tools and techniques the scientists develop to fight COVID-19 could be used in the future by all researchers to help more quickly find treatments for potential pandemics.

To help address future pandemics, researchers need access to swift and effective tools that can be deployed very early, as soon as a threatening disease is identified. So, the researchers behind OpenPandemics – COVID-19 are creating a software infrastructure to streamline the process of finding potential treatments for other diseases. And in keeping with World Community Grid’s open data policy, they will make their findings and these tools freely available to the scientific community. 

Join a global community of science supporters

World Community Grid is thrilled to make OpenPandemics – COVID-19 available to everyone who wants to donate computing power from their Raspberry Pi. Every device can play a part in helping the search for COVID-19 treatments. Please join us!

The post Volunteer your Raspberry Pi to IBM’s World Community Grid appeared first on Raspberry Pi.

Be a better Scrabble player with a Raspberry Pi High Quality Camera

One of our fave makers, Wayne from Devscover, got a bit sick of losing at Scrabble (and his girlfriend was likely raging at being stuck in lockdown with a lesser opponent). So he came up with a Raspberry Pi–powered solution!

Using a Raspberry Pi High Quality Camera and a bit of Python, you can quickly figure out the highest-scoring word your available Scrabble tiles allow you to play.

Hardware

  • Raspberry Pi 3B
  • Compatible touchscreen
  • Raspberry Pi High Quality Camera
  • Power supply for the touchscreen and Raspberry Pi
  • Scrabble board

You don’t have to use a Raspberry Pi 3B, but you do need a model that has both display and camera ports. Wayne also chose to use an official Raspberry Pi Touch Display because it can power the computer, but any screen that can talk to your Raspberry Pi should be fine.

Software

Firstly, the build takes a photo of your Scrabble tiles using raspistill.

Next, a Python script processes the image of your tiles and then relays the highest-scoring word you can play to your touchscreen.

The key bit of code here is twl, a Python script that contains every possible word you can play in Scrabble.

From 4.00 minutes into his build video, Wayne walks you through what each bit of code does and how he made it work for this project, including how he installed and used the Scrabble dictionary.
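
To give you a sense of the word-finding step, here’s a rough sketch of how a rack of tiles can be checked against a word list and the survivors scored. This is our own illustration rather than Wayne’s code, and it reads a plain words.txt file as a stand-in for the twl dictionary.

# Rough sketch of the word-finding step; our own illustration, not
# Wayne's code. 'words.txt' stands in for the twl Scrabble dictionary.
from collections import Counter

LETTER_SCORES = {
    "a": 1, "b": 3, "c": 3, "d": 2, "e": 1, "f": 4, "g": 2, "h": 4,
    "i": 1, "j": 8, "k": 5, "l": 1, "m": 3, "n": 1, "o": 1, "p": 3,
    "q": 10, "r": 1, "s": 1, "t": 1, "u": 1, "v": 4, "w": 4, "x": 8,
    "y": 4, "z": 10,
}

def playable(word, rack):
    # a word is playable if the rack holds at least as many of each letter
    need, have = Counter(word), Counter(rack)
    return all(have[letter] >= count for letter, count in need.items())

def best_word(rack, words):
    candidates = (w for w in words if playable(w, rack))
    return max(candidates,
               key=lambda w: sum(LETTER_SCORES[c] for c in w),
               default=None)

with open("words.txt") as f:
    words = [line.strip().lower() for line in f]

print(best_word("etaoinr", words))  # highest-scoring word for these tiles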

Fellow Scrabble-strugglers have suggested sneaky upgrades in the comments of Wayne’s YouTube video, such as having the build relay answers to a more discreet smartwatch.

No word yet on how the setup deals with the blank Scrabble tiles; those things are like gold dust.

In case you haven’t met the Raspberry Pi High Quality Camera yet, Wayne also did this brilliant unboxing and tutorial video for our newest piece of hardware.

And for more projects from Devscover, check out this great Amazon price tracker using a Raspberry Pi Zero W, and make sure to subscribe to the channel for more content.

The post Be a better Scrabble player with a Raspberry Pi High Quality Camera appeared first on Raspberry Pi.

Let’s learn about encryption with Digital Making at Home!

Join us for Digital Making at Home: this week, young people can learn about encryption and e-safety with us! With Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.

So get ready to decode a secret message with us:

Check out this week’s code-along projects!

And tune in on Wednesday 2pm BST / 9am EDT / 7.30pm IST at rpf.io/home to code along with our live stream session.
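
If you’d like a taste of the sort of thing we’ll be exploring, here’s a tiny Caesar-shift decoder in Python. It’s our own illustration, not this week’s project code.

# Tiny Caesar-shift decoder; our own illustration, not taken from
# this week's Digital Making at Home project code.
def decode(message, shift):
    result = ""
    for ch in message:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            result += chr((ord(ch) - base - shift) % 26 + base)
        else:
            result += ch
    return result

print(decode("Khoor, zruog!", 3))  # prints: Hello, world!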

PS: If you want to learn how to teach students in your classroom about encryption and cybersecurity, we’ve got the perfect free online courses for you!

The post Let’s learn about encryption with Digital Making at Home! appeared first on Raspberry Pi.

Wes’s wonderful Minecraft user notification display

By: Alex Bate

This Minecraft sign uses a Raspberry Pi to notify you when, and how many of, your friends are logged into your dedicated Minecraft server.

Let’s start by pointing out how wonderfully nostalgic many of Wes ‘Geeksmithing’ Swain’s projects are. From his Raspberry Pi–housing cement Thwomp that plays his favourite Mario games to The NES Project, his NES replica unit with a built-in projector — Wes makes the things we wished for as kids.

The NES Project covered in HackSpace magazine

We honestly wouldn’t be surprised if his next project is a remake of Duckhunt with servo-controlled ducks, or a Space Invaders machine somehow housed in a flying space invader that shoots back with lasers. Honestly, at this point, we wouldn’t put it past him.

Making the Minecraft friend notification display

In the video, Wes covers the project in two parts. Firstly, he shows off the physical build of the sign, including the laser-cut acrylic front lit by controllable LED lights, a Raspberry Pi Zero, and the wooden framing.

Secondly, he moves on to the code, in which he uses mcstatus, a Python class created by Minecraft’s Technical Director Nathan Adams that can be used to query servers for information. In this instance, Wes is using mcstatus to check for other players on his group’s dedicated Minecraft server, but the class can also be used to gather mod information. You can find mcstatus on GitHub.
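
Here’s a minimal sketch of the kind of query involved, with a placeholder server address; it isn’t Wes’s actual sign code. Older mcstatus releases expose a MinecraftServer class, which newer versions have renamed JavaServer.

# Minimal mcstatus query sketch; the address is a placeholder and this
# is not Wes's actual sign code. Newer mcstatus versions rename
# MinecraftServer to JavaServer.
from mcstatus import MinecraftServer

server = MinecraftServer.lookup("mc.example.org:25565")  # hypothetical
status = server.status()

print(f"{status.players.online} player(s) online")
if status.players.sample:  # may be None if the server hides player names
    for player in status.players.sample:
        print("-", player.name)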

Each friend is assigned a letter that illuminates if they’re online.

Lucky for Wes, he has the same number of friends on his server as the number of letters in ‘Minecraft’, so for every friend online, he’s programmed the display to illuminate a letter of the Minecraft logo. And while the server is empty, he can also set the display to run through various light displays, including this one, a dedication to the new Minecraft Nether update.

If you’d like to try making this project yourself, you can: Wes goes into great detail in his video, and the code for the project can be found on his GitHub account.

And while we have your attention, be sure to subscribe to Geeksmithing on YouTube and show him some love for such a great project.

The post Wes’s wonderful Minecraft user notification display appeared first on Raspberry Pi.

(Raspberry) Pi Commander | The MagPi 95

Adrien Castel’s idea of converting an old electronic toy into a retro games machine was no flight of fancy, as David Crookes discovers

The 1980s was a golden era for imaginative electronic toys. Children would pester their parents for a Tomytronic 3D or a Nintendo Game & Watch. And they would enviously eye anyone who had a Tomy Turnin’ Turbo Dashboard with its promise of replicating the thrill of driving (albeit without the traffic jams).

All of the buttons, other than the joystick, are original to the toy – as are the seven red LED lights

Two years ago, maker Matt Brailsford turned that amazing toy into a fully working Out Run arcade machine and Adrien Castel was smitten. “I loved the fact that he’d upcycled an old toy and created something that could be enjoyed as a grown-up,” he says. “But I wanted to push the simulation a bit further and I thought a flying sim could do the trick.”

“I didn’t want to modify the look of the toy”

Ideas began flying around Adrien’s mind. “I knew what I wanted to achieve so I made an overall plan in my head,” he recalls. First he found the perfect toy: a battery-powered Sky Fighter F-16 tabletop game made by Dival. He then decided to base his build around a Raspberry Pi 3A+. “It’s the perfect hardware for projects like this because of its flexibility,” Adrien says.

Taking off

The toy needed some work. Its original bright red joystick was missing and Adrien knew he’d have to replace the original screen with a TFT LCD. To do this, he 3D-printed a frame to fit the TFT display and he created a smaller base for the replacement joystick. Adrien also changed the microswitches for greater sensitivity but he didn’t go overboard with the changes.

The games can make use of the full screen. Adrien would have liked a larger screen, but the original ratio oddly lay between 4:3 and 16:9, making a bigger display harder to find

“I knew I would have to adapt some parts for the joystick and for the screen, but I didn’t want to modify the look of the toy,” Adrien explains. “To be honest, modifying the toy would have involved some sanding and painting and I was worried that it would ruin the overall effect of the project if it was badly executed.”

A Raspberry Pi 3A+ sits at the heart of the Pi Commander, alongside a mini audio amplifier, and it’s wired up to components within the toy

As such, a challenge was set. “I had to keep most of the original parts such as throttle levers and LEDs and adapt them to the new build,” he says. “This meant getting them to work together with the system, and it also meant using the original PCB, getting rid of the components, and re-routing the electronics to plug into the GPIOs.”
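
As a rough sketch of what the software side of such rewiring can look like, Python’s gpiozero library makes rerouted buttons and LEDs straightforward to handle. This is our own illustration with arbitrary pin numbers, not Adrien’s code.

# Rough sketch of reading a rewired button and driving an original LED
# over GPIO; our own illustration with arbitrary pins, not Adrien's code.
from gpiozero import Button, LED
from signal import pause

fire_button = Button(17)  # hypothetical pin for a rewired microswitch
status_led = LED(27)      # hypothetical pin for one of the original LEDs

fire_button.when_pressed = status_led.on
fire_button.when_released = status_led.off

pause()  # keep the script running and reacting to button events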

There were some enhancements. Adrien soldered a PAM8403 3W class-D audio amplifier to the Raspberry Pi, and this allowed a basic speaker to replace the original for better sound. But there were some compromises too.

The original PCB was used and the electronics were re-routed. All the components need to work at between 3.3V and 5V with the lowest possible amperage while fitting into a tight space.

“At first I thought the screen could be bigger than the one I used, but the round shape of the cockpit didn’t give much space to fit a screen larger than four inches.” He also believes the project could be improved with a better joystick: “The one I’ve used is a simple two-button arcade stick with a jet fighter look.”

Flying high

By using the retro gaming OS Recalbox (based on EmulationStation and RetroArch), however, he’s been able to perfect the overall feel. “Recalbox allowed me to create a custom front end that matches the look of a jet fighter,” he explains. It also means the Pi Commander plays shoot-’em-up games alongside open-source simulators like FlightGear (flightgear.org). “It’s a lot of fun.”

Read The MagPi for free!

Find more fantastic projects, tutorials, and reviews in The MagPi #95, out now! You can get The MagPi #95 online at our store, or in print from all good newsagents and supermarkets. You can also access The MagPi magazine via our Android and iOS apps.

Don’t forget our super subscription offers, which include a free gift of a Raspberry Pi Zero W when you subscribe for twelve months.

And, as with all our Raspberry Pi Press publications, you can download the free PDF from our website.

The post (Raspberry) Pi Commander | The MagPi 95 appeared first on Raspberry Pi.
