What do we want? Retro gaming, adult beverages, and our favourite Spotify playlist. When do we want them? All at the same time.
The addition of a sneaky hiding spot for your favourite tipple, plus a musical surprise, sets this build apart from the popular barrel arcade projects we’ve seen before, like this one featured a few years back on the blog.
The maker’s top choice is Tetris Attack for the SNES.
What more could you want now you’ve got retro games and an elegantly hidden drinks cabinet at your fingertips? u/breadtangle’s creation has another trick hidden inside its smooth wooden curves.
The Raspberry Pi computer used in this build also runs Raspotify, a Spotify Connect client for Raspberry Pi that allows you to stream your favourite tunes and playlists from your phone while you game.
You can set Raspotify to play via Bluetooth speakers, but if you’re using regular speakers and are after a quick install, whack this command in your Terminal:
curl -sL https://dtcooper.github.io/raspotify/install.sh | sh
u/breadtangle neatly tucked a pair of Logitech z506 speakers on the sides of the barrel, where they could be protected by the overhang of the glass screen cover.
The build’s joysticks and buttons came from Amazon, and they’re set into an off-cut piece of kitchen countertop. The glass screen protector is another Amazon find and sits on a rubber car-door edge protector.
The screen itself is lovingly tilted towards the controls, to keep players’ necks comfortable, and u/breadtangle finished off the build’s look with a barstool to sit on while gaming.
We love it, but we have one very important question left…
Can we come round and play?
Hacking apart a sweet, innocent Raspberry Pi – who would do such a thing? Network Chuck, that’s who. But he has a very cool reason for it, so we’ll let him off the hook.
He’s figured out how to install VMware ESXi on Raspberry Pi, and he’s sharing the step-by-step process with you because he loves you. And us. We think. We hope.
In a nutshell, Chuck “hacks” apart a Raspberry Pi to show you how it can operate as three separate computers, each running different software at the same time. He’s a wizard.
VMware is cool because it’s virtual machine software that big companies use on huge servers, yet you can deploy it on one of our tiny devices and learn how to use it in the comfort of your own home if you follow Chuck’s instructions.
Firstly, you need to make sure you’re running the latest version of Raspberry Pi OS. Chuck uses Raspberry Pi Imager to do this, and the video above shows you how to do the same.
Then you’ll need to format your SD card ready for VMware ESXi. This can be done with Raspberry Pi Imager too. You’ll need to download these two things:
Chuck is the kind of good egg who walks you through how to do this on screen at this point in the project video.
Then you’ll need to create the VMware installer to install the actual software. It’s at this point that your USB flash drive takes centre stage. Here’s everything you’ll need:
And this is the point in the video at which Chuck walks you through the process.
Once that’s all done, stick your USB flash drive into your Raspberry Pi and get going. You need to be quick off the mark for this bit – there’s some urgent Escape key pressing required, but don’t worry, Chuck walks you through everything.
Once you’ve followed all those steps, you will be up, running, and ready to go. The installation process only takes up the first 15 minutes of Chuck’s project video, and he spends the rest of his time walking you through creating your first VM and adding more storage.
Top job, Chuck.
There’s also the brilliant networkchuck.com.
Working with Oak National Academy, we’ve turned the materials from our Teach Computing Curriculum into more than 300 free, curriculum-mapped video lessons for remote learning.
One of our biggest projects for teachers that we’ve worked on over the past two years is the Teach Computing Curriculum: a comprehensive set of free computing classroom materials for key stages 1 to 4 (learners aged 5 to 16). The materials comprise lesson plans, homework, progression mapping, and assessment materials. We’ve created these as part of the National Centre for Computing Education, but they are freely available for educators all over the world to download and use.
In the second half of 2020, in response to school closures, our team of experienced teachers produced over 100 hours of video to transform Teach Computing Curriculum materials into video lessons for learning at home. They are freely available for parents, educators, and learners to continue learning computing at home, wherever you are in the world.
You’ll find our videos for more than 300 hour-long lessons on the Oak National Academy website. The progression of the lessons is mapped out clearly, and the videos cover England’s computing national curriculum. There are video lessons for:
To access the full set of classroom materials for teaching, visit the National Centre for Computing Education website.
The post Supporting teachers and students with remote learning through free video lessons appeared first on Raspberry Pi.
Sam Battle aka LOOK MUM NO COMPUTER couldn’t resist splashing out on a clear Macintosh case for a new project in his ‘Cosmo’ series of builds, which inject new life into retro hardware.
This time around, a Raspberry Pi, running facial recognition software, and one of our Camera Modules enable Furby-style eyes to track movement, detect faces, and follow you around the room.
He loves a good Furby, does Sam. He has a whole YouTube playlist dedicated to projects featuring them. Seriously.
Our favourite bit of the video is when Sam meets Raspberry Pi for the first time, boots it up, and says:
“Wait, I didn’t know it was a computer. It’s an actual computer computer. What?!”
The eyes are ping pong balls cut in half so you can fit a Raspberry Pi Camera Module inside them. (Don’t forget to make a hole in the ‘pupil’ so the lens can peek through).
The Raspberry Pi and display screen are neatly mounted on the side of the Macintosh so they’re easily accessible should you need to make any changes.
All the hacked, repurposed junky bits sit inside or are mounted on swish 3D-printed parts.
Add some joke shop chatterbox teeth, and you’ve got what looks like the innards of a Furby staring at you. See below for a harrowing snapshot of Zach’s ‘Furlexa’ project, featured on our blog last year. We still see it when we sleep.
It wasn’t enough for Furby-mad Sam to have created a Furby-lookalike face-tracking robot; he needed to go further. Inside the clear Macintosh case, you can see a de-furred Furby skeleton atop a 3D-printed plinth, with redundant ribbon cables flowing from its eyes into the back of the face-tracking robot face. This makes it appear as though the Furby is the brains behind this creepy creation that follows your every move.
Eventually, Sam’s Raspberry Pi–powered creation will be on display at the Museum of Everything Else, so you can go visit it and play with all the “obsolete and experimental technology” housed there. The museum is funded by the Look Mum No Computer Patreon page.
The post These Furby-‘controlled’ Raspberry Pi-powered eyes follow you appeared first on Raspberry Pi.
Did you get Raspberry Pi 400 as a home learning or working device? We hope you’ve been getting on well with our affordable all-in-one computing solution.
If you’re a new user, here are some tips for you to get the most out of your brand-new Raspberry Pi 400.
Make sure your Raspberry Pi runs the newest version of the Raspberry Pi OS. Here is how (and here is a video preview of what the process looks like):
Open a terminal window by clicking on the Terminal icon in the top menu bar. Then type this command in the terminal window:
sudo apt update
Press Enter on the keyboard. Once the update is downloaded, type into the window:
sudo apt full-upgrade
Press Enter again. It is safe to just accept the default answer to any questions you are asked during the procedure by typing y and pressing Enter.
Now reboot your Raspberry Pi.
With the newest version of Raspberry Pi OS installed, you can use the following applications in the Chromium browser:
Just log in with your username and password and start working or learning!
Raspberry Pi OS also has LibreOffice installed for working with text files, spreadsheets, and the like.
Go into the Preferences section in the main menu, and open Print Settings. This shows the system-config-printer dialog window, where you can do the usual things you’re familiar with from other operating systems: add new printers, remove old ones, set a printer as the default, and access the print queue for each printer.
Like most things in Linux-based operating systems such as Raspberry Pi OS, whether you can make your printer model work depends on user contributions; not every printer is supported yet. We’ve found that most networked printers work fine, while USB printers are a bit hit-and-miss. The best thing to do is to try it and see, and ask for help on our forums if your particular printer doesn’t seem to work.
Our very own Alasdair Allan wrote a comprehensive guide that covers more aspects of setting up a Raspberry Pi for home working, from getting your audio and video ready to setting up a Citrix workspace. Thanks, Alasdair!
We’ve got a host of completely free resources for young people, parents, and teachers to continue computing school lessons at home and learn about digital making. Discover them all here!
Let us know in the comments if there are any niggles you’re experiencing, or if you have a top tip to help others who are just getting to grips with using Raspberry Pi as a home learning or working device.
This is creepy, and we love it. OK, it’s not REALLY creepy, it’s just that some people have an aversion to dolls that appear to move of their own accord, due to a disturbing childhood experience — but enough about me.
Using a smartphone app, viewers determine which way a ball travels through transparent pipes, and depending on which light barriers the ball interrupts on its journey, various toys are animated to tell different stories.
The installation’s server is a Raspberry Pi 4. Via its GPIO pins, it controls the track switches and releases the ball.
The apparatus is full of toys donated by residents of Wolfsburg, Germany. The artists wanted local people to not only be able to operate the mechanical piece, but also to have a hand in creating it. Each animatronic toy is made as a separate module, controlled by its own Arduino Nano.
Smart Fairy Tale can be remotely controlled by viewers who want to check in on the toys they gifted to the installation, and by any other curious people elsewhere in the world.
Better yet, the stories the toys tell were devised by local school students. The artists showed the gifted toys to a few elementary school classes, and the students drew several stories featuring toys they liked. The makers then programmed the toys to match what the drawings said they could do. A servo here, a couple of LEDs there, and the students’ stories were brought to life.
So what kind of stories did Wolfsburg’s finest come up with? One of the creators explains:
“There were a lot of scenes to interpret, like the blow-up love story, the chemtrail conspiracy, and the fossil fuel disaster, which culminates in a major traffic jam. The latter one even involved a laboratory for breeding synthetic dinosaurs by the use of renewable energies.” – Felix Fisgus
We LOVE it. Don’t tell me this isn’t creepy though…
Create a network of pipes before the water starts to flow in our re-creation of a classic puzzler. Jordi Santonja shows you how.
Pipe Mania, also called Pipe Dream in the US, is a puzzle game developed by The Assembly Line in 1989 for Amiga, Atari ST, and PC, and later ported to other platforms, including arcades. The player must place randomly generated sections of pipe onto a grid. When a counter reaches zero, water starts to flow and must reach the longest possible distance through the connected pipes.
Let’s look at how to recreate Pipe Dream in Python and Pygame Zero. The variable start is decremented at each frame. It begins with a value of 60*30, so it reaches zero after 30 seconds if our monitor runs at 60 frames per second. In that time, the player can place tiles on the grid to build a path. Every time the user clicks on the grid, the last tile from nextTiles is placed on the play area and a new random tile appears at the top of the next tiles. randint(2,8) computes a random value between 2 and 8. nextTiles is a list of tile values, from 0 to 8, which are copied to the screen in the draw function.
grid is a two-dimensional list; its height is given by gridHeight=7. Every pipe piece is placed in grid with a mouse click. This is managed in the Pygame Zero function on_mouse_down, where the variable pos contains the mouse position in the window. panelPosition defines the position of the top-left corner of the grid in the window. To get the grid cell, panelPosition is subtracted from pos, and the result is divided by tileSize using integer division. tileMouse stores the resulting cell, but it is set to (-1,-1) when the mouse lies outside the grid.
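That cell lookup can be sketched in a few lines of Python. The values of panelPosition, tileSize, and gridWidth here are illustrative assumptions, not the article’s exact numbers (only gridHeight=7 is given in the text):

```python
# Sketch of the grid-cell lookup: subtract the panel origin from the mouse
# position, then integer-divide by the tile size.
panelPosition = (100, 60)      # assumed top-left corner of the grid in the window
tileSize = 50                  # assumed tile size in pixels
gridWidth, gridHeight = 9, 7   # gridHeight=7 as in the article; gridWidth assumed

def tile_under_mouse(pos):
    """Return the (column, row) cell under the mouse, or (-1, -1) outside the grid."""
    col = (pos[0] - panelPosition[0]) // tileSize
    row = (pos[1] - panelPosition[1]) // tileSize
    if 0 <= col < gridWidth and 0 <= row < gridHeight:
        return (col, row)
    return (-1, -1)
```

With these numbers, a click at the grid’s top-left pixel maps to cell (0, 0), while a click outside the panel yields (-1, -1).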
The images folder contains the PNGs with the tile images, two for every tile: the graphical image and the path image. The tiles list contains the name of every tile, and appending _path to a name gives the filename of its path image. The values stored in grid are the indexes of the elements in the tiles list.
waterPath isn’t shown to the user, but it stores the path that the water is going to follow, starting from the first point in the starting tile. Once the water starts flowing, update calls the function CheckNextPointDeleteCurrent. That function finds the next point in the water path, erases it, and adds a new point to waterFlow, which is shown to the user in the draw function. pointsToCheck contains a list of relative positions, or offsets, that define a step of two pixels from currentPoint in every direction to find the next point. Why two pixels? To be able to define the ‘cross’ tile, where two lines cross each other. In a ‘cross’ tile the water flow must follow a straight line, and stepping two pixels means the only points found are the next points in the same direction. When no next point is found, the game ends and the score is shown: the number of points in the water path. playState is set to 0, and no more updates are done.
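Here’s a minimal sketch of that step-and-erase search. The set of path pixels and the function signature are simplified assumptions for illustration; the article’s actual code works on the path images described earlier:

```python
# Offsets that step two pixels from currentPoint in every direction.
pointsToCheck = [(2, 0), (-2, 0), (0, 2), (0, -2)]

def check_next_point_delete_current(pathPixels, currentPoint):
    """Find the next path point two pixels away, erase it, and return it.

    Returns None when no next point exists, i.e. the water flow has ended.
    """
    for dx, dy in pointsToCheck:
        candidate = (currentPoint[0] + dx, currentPoint[1] + dy)
        if candidate in pathPixels:
            pathPixels.discard(candidate)  # erased so the flow can't double back
            return candidate
    return None
```

Because each found point is erased, repeatedly calling the function walks the water forward along the path until it returns None and the game ends.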
You can read more features like this one in Wireframe issue 46, available directly from Raspberry Pi Press — we deliver worldwide.
As the UK — like many countries around the world — kicks off the new year with another national lockdown, meaning that millions of young people are unable to attend school, I want to share an update on how the Raspberry Pi Foundation is helping young people to learn at home.
Please help us spread the word to teachers, school leaders, governors, parents, and carers. Everything we are offering here is 100% free and the more people know about it, the more young people will benefit.
Schools and teachers all over the world have been doing a heroic job over the past ten months, managing the transition to emergency remote teaching during the first round of lockdowns, supporting the most vulnerable pupils, dealing with uncertainty, changing the way that schools worked to welcome pupils back safely, helping pupils catch up with lost learning, and much, much more.
Both in my role as Chief Executive of the Raspberry Pi Foundation and as chair of governors at a state school here in Cambridge, I’ve seen first-hand the immense pressure that schools and teachers are under. I’ve also seen them display the most amazing resilience, commitment, and innovation. I want to say a huge thank you to all teachers and school staff for everything you’ve done and continue to do to help young people through this crisis.
Here are some of the resources and tools that we’ve created to help you continue to deliver a world-class computing education:
All of these resources are mapped to the English computing curriculum and produced as part of the National Centre for Computing Education. They are available for everyone, anywhere in the world, for free.
Parents and carers are the other heroes of remote learning during lockdown. I know from personal experience that juggling work and supporting home learning can be really tough, and we’re all trying to find meaningful, fun alternatives to letting our kids binge YouTube or Netflix (other video platforms and streaming services are available).
That’s why we’ve been working really hard to provide parents and carers with easy, accessible ways for you to help your young digital makers to get creative with technology:
One of the harsh lessons we learned last year was that far too many young people don’t have a computer for learning at home. There has always been a digital divide; the pandemic has just put it centre-stage. The good news is that the cost of solving this problem is now trivial compared to the cost of allowing it to persist.
That’s why the Raspberry Pi Foundation has teamed up with UK Youth and a network of grassroots youth and community organisations to get computers into the hands of disadvantaged young people across the UK.
For under £200 we can provide a vulnerable child with everything they need to learn at home, including a Raspberry Pi desktop computer, a monitor, a webcam, free educational software, and ongoing support from a local youth worker and the Foundation team. So far, we have managed to get 2000 Raspberry Pi computers into the hands of the most vulnerable young people in the UK. A drop in the ocean compared to the size of the problem, but a huge impact for every single young person and family.
This has only been possible thanks to the generous support of individuals, foundations, and businesses that have donated to support our work. If you’d like to get involved too, you can find out more here.
Quite possibly the coolest thing we saw Raspberry Pi powering last year was ISS Mimic, a mini version of the International Space Station (ISS). We wanted to learn more about the brains that dreamt up ISS Mimic, which uses data from the ISS to mirror exactly what the real thing is doing in orbit.
The ISS Mimic team’s a diverse, fun-looking bunch of people and they all made their way to NASA via different paths. Maybe you could see yourself there in the future too?
Dallas Kidd currently works at the startup Skylark Wireless, helping to advance the technology to provide affordable high speed internet to rural areas.
Previously, she worked on traffic controllers and sensors, in finance on a live trading platform, on RAID controllers for enterprise storage, and at a startup tackling the problem of alarm fatigue in hospitals.
Before getting her Master’s in computer science with a thesis on automatically classifying stars, she taught English as a second language, Algebra I, geometry, special education, reading, and more.
Her hobbies are scuba diving, learning about astronomy, creative writing, art, and gaming.
Tristan Moody currently works as a spacecraft survivability engineer at Boeing, helping to keep the ISS and other satellites safe from the threat posed by meteoroids and orbital debris.
He has a PhD in mechanical engineering and currently spends much of his free time as playground equipment for his two young kids.
Estefannie spends her time inventing things before thinking, soldering for fun, writing, filming and producing content for her YouTube channel, and public speaking at universities, conferences, and hackathons.
She lives in Houston, Texas and likes tacos.
Douglas Kimble currently works as an electrical/mechanical design engineer at Boeing. He has designed countless wire harness and installation drawings for the ISS.
He assumes the mentor role and interacts well with diverse personalities. He is also the world’s biggest Lakers fan living in Texas.
His favorite pastimes include hanging out with his two dogs, Boomer and Teddy.
Craig’s father worked for the Space Shuttle program, designing the ascent flight trajectory profiles for the early missions. He remembers being on site at Johnson Space Center one evening, in a freezing cold computer terminal room, punching cards for a program his dad wrote in the early 1980s.
Craig grew up with LEGO and majored in Architecture and Space Design at the University of Houston’s Sasakawa International Center for Space Architecture (SICSA).
His day job involves measuring ISS major assemblies on the ground to ensure they’ll fit together on-orbit. Traveling to many countries to measure hardware that will never see each other until on-orbit is really the coolest part of the job.
Sam Treadgold is an aerospace engineer who also works on the Meteoroid and Orbital Debris team, helping to protect the ISS and Space Launch System from hypervelocity impacts. Occasionally they take spaceflight hardware out to the desert and shoot it with a giant gun to see what happens.
In a non-pandemic world he enjoys rock climbing, music festivals, and making sound-reactive LED sunglasses.
Chen Deng is a Systems Engineer working at Boeing with the International Space Station (ISS) program. Her job is to ensure readiness of Payloads, or science experiments, to launch in various spacecraft and operations to conduct research aboard the ISS.
The ISS provides a very unique science laboratory environment, something we can’t get much of on earth: microgravity! The term microgravity means a state of little or very weak gravity. The virtual absence of gravity allows scientists to conduct experiments that are impossible to perform on earth, where gravity affects everything that we do.
In her free time, Chen enjoys hiking, board games, and creative projects alike.
Bryan Murphy is a dynamics and motion control engineer at Boeing, where he gets to create digital physics models of robotic space mechanisms to predict their performance.
His favorite projects include the ISS treadmill vibration isolation system and the shiny new docking system. He grew up on a small farm where his hands-on time with mechanical devices fueled his interest in engineering.
When not at work, he loves to brainstorm and create with his artist/engineer wife and their nerdy kids, or go on long family road trips, especially to hike and kayak or eat ice cream. He’s also vice president of a local makerspace, where he leads STEM outreach and includes excess LEDs in all his builds.
Susan is a mechanical engineer and a 30+-year veteran of manned spaceflight operations. She worked on the Space Shuttle Program for Payloads (middeck experiments and payloads deployed with the shuttle arm), starting with STS-30, and was on the team that deployed the Hubble Space Telescope.
She then transitioned into life sciences experiments, which led to the NASA Mir Program where she was on continuous rotation for three years to Russian Mission Control, supporting the NASA astronaut and science experiments onboard the space station as a predecessor to the ISS.
She currently works on the ISS Program (for over 20 years now), where she used to write procedures for on-orbit assembly of the Space Station and now writes installation procedures for on-orbit modifications like the docking adapter. She is also an artist and makes crosses out of found objects, and even used to play professional women’s football.
Did you know that there are Raspberry Pi computers aboard the real ISS that young people can run their own Python programs on? How cool is that?!
Find out how to participate at astro-pi.org.
The post Meet the team behind the mini Raspberry Pi–powered ISS appeared first on Raspberry Pi.
Why use a regular swear jar to retrain your potty-mouthed brain when you can build a Swear Bear to help you instead?
Swear Bear listens to you. All the time. And Swear Bear can tell when a swear word is used. Swear Bear tells you off and saves all the swear words you said to the cloud to shame you. Swear Bear subscribes to the school of tough love.
The Google AIY kit allows you to build your own natural language recogniser. This page shows you how to assemble the Voice HAT from the kit, and it also includes the code you’ll need to make your project capable of speech-to-text AI.
To teach Swear Bear the art of profanity detection, Swear Bear creators 8 Bits and a Byte turned to the profanity-check Python library. You can find the info to install and use the library on this page, as well as info on how it works and why it’s so accurate.
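If you just want to see the shape of the idea without installing anything, here’s a toy stand-in that flags words from a small blocklist. (The real profanity-check library is smarter: it’s a trained machine learning classifier, not a wordlist.)

```python
# Toy profanity detector: NOT the profanity-check library, just a wordlist sketch.
BLOCKLIST = {"darn", "heck", "dagnabbit"}  # stand-in "swear" words

def is_profane(sentence):
    """Return True if any blocklisted word appears in the sentence."""
    words = {word.strip(".,!?").lower() for word in sentence.split()}
    return bool(words & BLOCKLIST)

def bear_response(sentence):
    """What Swear Bear might say after hearing a sentence."""
    return "Oh dear" if is_profane(sentence) else None
```

In the real project, the speech-to-text output from the Voice HAT is fed to profanity-check instead of a blocklist like this.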
You’ll hear at this point in the video that Swear Bear says “Oh dear” when a swear word is used within earshot.
This project uses the first version of Google’s AIY Voice Kit, which comes with a larger black AIY Voice HAT and is compatible with Raspberry Pi 3 Model B. The kit also includes a little Voice HAT microphone board.
The microphone allows Swear Bear to ‘hear’ your speech, and through its speakers it can then tell you off for swearing.
All of the hardware is squeezed into the stuffing-free bear once the text-to-speech and profanity detection software is working.
8 Bits and a Byte fan Ben Scarboro took to the comments on YouTube to suggest they rework one of our Babbage Bears into a Swear Bear. Babbage is teeny tiny, so maybe you would need to fashion a giant version to accomplish this. Just don’t make us watch while you pull out its stuffing.
The post Raspberry Pi ‘Swear Bear’ keeps your potty mouth in check appeared first on Raspberry Pi.
Note: We’re not *really* here, we just dropped in to point you in the right direction with your new Raspberry Pi toys, then we’re disappearing again to enjoy the rest of the festive season. See you on 4 January 2021!
So… what did you get? We launched a ton of new products this year, so we’ll walk you through what to do with each of them, as well as how to get started if you received a classic Raspberry Pi.
First things first: welcome! You’re one of us now, so why not take a moment to meet your fellow Raspberry Pi folk and join our social communities?
You can hang out with us on Twitter, Facebook, LinkedIn, and Instagram. And we’ve got two YouTube channels: a channel for the tech fans and makers, and a channel for young people, parents, and educators. Subscribe to the one you like best — or even BOTH of them!
Tag us on social media in a photo with your favourite Christmas present and let us know what you plan to do with your new Raspberry Pi!
If you were lucky enough to get a Raspberry Pi 400 Personal Computer Kit, all you have to do is find a monitor (a TV will also do), plug in, and go. It really is that simple. In fact, when we launched it, Eben Upton described it as a “Christmas morning product”. Always thinking ahead, that guy.
If you got a Raspberry Pi 400 unit on its own, you’ll need to find a mouse and power supply as well as a monitor. You also won’t have received the official Raspberry Pi Beginner’s Guide that comes with the kit, so you can pick one up from the Raspberry Pi Press online store, or download a PDF for free, courtesy of The MagPi magazine.
You are going to LOVE playing around with this if you got one in your stocking. The Raspberry Pi High Quality Camera is 12.3 megapixels of fun, and the latest addition to the Raspberry Pi camera family.
This video shows you how to set up your new toy. And you can pick up the Official Raspberry Pi Camera Guide for a more comprehensive walkthrough. You can purchase the book in print today from the Raspberry Pi Press store for £10, or download the PDF for free from The MagPi magazine website.
Share your photos using #ShotOnRaspberryPi. We retweet the really good ones!
If you got one of our classic Raspberry Pi boards, make sure to get the latest version of Raspberry Pi OS, our official supported operating system.
The easiest way to flash the OS onto your SD card is using the Raspberry Pi Imager. Take 40 seconds to watch the video below to learn how to do that.
If you’re a complete newbie, our help pages are the best place to start in case you’re a bit daunted by where to plug everything in on your very first Raspberry Pi. If you want step-by-step help, you can also take our free online course “Getting Started with Your Raspberry Pi”.
Once you’ve got the hang of things, our forum will become your home from home. Gazillions of Raspberry Pi superfans hang out there and can answer pretty much any question you throw at them – try searching first, because many questions have already been asked and answered, and perhaps yours has too.
When you’re feeling comfortable with the basics, why not head over to our projects page and pick something cool to make?
The Raspberry Pi blog is also a great place to find inspiration. We share the best projects from our global community, and things for all abilities pop up every weekday. If you want us to do the heavy lifting for you, just sign up to Raspberry Pi Weekly, and we’ll send you the top blogs and Raspberry Pi-related news each week.
And if you got your very own Babbage Bear: love them, cherish them, and keep them safe. They’re of a nervous disposition so talk quietly to them for the first few days, to let them get used to you.
The post Merry Christmas to all Raspberry Pi recipients — help is here! appeared first on Raspberry Pi.
Just in time for the holidays, we’ve updated Raspberry Pi Imager to add some new functionality.
You can go to our software page to download and install the new version 1.5 release of Raspberry Pi Imager and use it now.
We haven’t done telemetry in Imager before, and since people tend — rightly — to be concerned about applications gathering data, we want to explain exactly what we are doing and why: we’re logging which operating systems and categories people download, so we can make sure the most popular options are easy enough to find in Raspberry Pi Imager’s menu system.
We don’t record any personal data, such as your IP address; the information we collect allows us to see the number of downloads of each operating system over time, and nothing else. You’ll find more detailed information, including how to opt out of telemetry, in the Raspberry Pi Imager GitHub README.md.
You can see which OSes are most often downloaded too, on our stats page.
As you can see, the default recommended Raspberry Pi OS image is indeed the most downloaded option. The recently released Ubuntu Desktop for Raspberry Pi 4 and Raspberry Pi 400 is the most popular third-party operating system.
It’s hard to comprehend how far machine learning has come in the past few years. You can now use a sub-£50 computer to reliably recognise someone’s face with surprising accuracy.
Although this kind of computing power is normally out of reach of microcontrollers, adding a Raspberry Pi computer to your project with the new High Quality Camera opens up a range of possibilities. From simple alerting applications (‘Mum’s arrived home!’), to dynamically adjusting settings based on the person using the project, there’s a lot of fun to be had.
Here’s a beginner’s guide to getting face recognition up and running.
1. Prepare your Raspberry Pi
For face recognition to work well, we’re going to need some horsepower, so we recommend a minimum of Raspberry Pi 3B+, ideally a Raspberry Pi 4. The extra memory will make all the difference. To keep as much resource as possible available for our project, we’ve gone for a Raspberry Pi OS Lite installation with no desktop.
Make sure you’re on the network, have set a new password, enabled SSH if you need to, and updated everything with
sudo apt -y update && sudo apt -y full-upgrade. Finally, go into settings by running
sudo raspi-config and enable the camera in ‘Interfacing Options’.
2. Attach the camera
This project will work well with the original Raspberry Pi Camera, but the new official HQ Camera will give you much better results. Be sure to connect the camera to your Raspberry Pi 4 with the power off. Connect the ribbon cable as instructed in hsmag.cc/HQCameraGetStarted. Once installed, boot up your Raspberry Pi 4 and test the camera is working. From the command line, run the following:
raspivid -o test.h264 -t 10000
This will record ten seconds of video to your microSD card. If you have an HDMI cable plugged in, you’ll see what the camera sees in real time. Take some time to make sure the focus is correct before proceeding.
3. Install dependencies
The facial recognition library we are using is one that has been maintained for many years by Adam Geitgey. It contains many examples, including Python 3 bindings to make it really simple to build your own facial recognition applications. What is not so easy is the number of dependencies that need to be installed first. There are way too many to list here, and you probably won’t want to type them out, so head over to hsmag.cc/FacialRec so that you can cut and paste the commands. This step will take a while to complete on a Raspberry Pi 4, and significantly longer on a Model 3 or earlier.
4. Install the libraries
Now that we have everything in place, we can install Adam’s applications and Python bindings with a simple, single command:
sudo pip3 install face_recognition
Once installed, there are some examples we can download to try everything out:
git clone --single-branch https://github.com/ageitgey/face_recognition.git
cd ./face_recognition/examples
This repository contains a range of examples showing the different ways the software can be used, including live video recognition. Feel free to explore and remix.
5. Example images
The examples come with a training image of Barack Obama. To run the example:
python3 facerec_on_raspberry_pi.py
On your smartphone, find an image of Obama using your favourite search engine and point it at the camera. Providing focus and light are good you will see:
“I see someone named Barack Obama!”
If you see a message saying it can’t recognise the face, then try a different image or try to improve the lighting if you can. Also, check the focus for the camera and make sure the distance between the image and camera is correct.
6. Training time
The final step is to start recognising your own faces. Create a directory and, in it, place some good-quality passport-style photos of yourself or those you want to recognise. You can then edit the facerec_on_raspberry_pi.py script to use those files instead. You’ve now got a robust prototype of face recognition. This is just the beginning: these libraries can also identify ‘generic’ faces, meaning they can detect whether a person is present, and identify features such as the eyes, nose, and mouth. There’s a world of possibilities available, starting with these simple scripts. Have fun!
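Under the hood, the matching step in these libraries is pleasingly simple: each face is boiled down to a 128-number encoding, and two faces ‘match’ when the Euclidean distance between their encodings falls below a tolerance (the face_recognition library defaults to 0.6). Here’s a minimal sketch of that logic in plain Python, using short made-up encodings purely for illustration:

```python
import math

def euclidean_distance(a, b):
    # Distance between two face encodings (sequences of floats)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(known_encoding, candidate_encoding, tolerance=0.6):
    # face_recognition's compare_faces uses the same idea: a candidate
    # matches when its distance to a known face is below the tolerance
    return euclidean_distance(known_encoding, candidate_encoding) < tolerance

# Toy 4-dimensional "encodings" (real ones have 128 dimensions)
known = [0.1, 0.2, 0.3, 0.4]
same_person = [0.12, 0.21, 0.29, 0.41]   # tiny distance -> match
stranger = [0.9, 0.8, 0.1, 0.5]          # large distance -> no match

print(is_match(known, same_person))  # True
print(is_match(known, stranger))     # False
```

Editing facerec_on_raspberry_pi.py largely amounts to swapping your own known encodings into one side of this comparison.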
Each month, HackSpace magazine brings you the best projects, tips, tricks, and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, the Raspberry Pi store in Cambridge, or your local newsagent.
Each issue is free to download from the HackSpace magazine website.
The post Add face recognition with Raspberry Pi | Hackspace 38 appeared first on Raspberry Pi.
Fans of the Stargate SG-1 series, prepare to be inspired: a fellow aficionado has fashioned his own model of the show’s iconic portal. Nicola King takes an interstellar trip in the latest issue of The MagPi Magazine.
When Kristian Tysse began making some projects on his new 3D printer, he soon became aware that the possibility of printing his own ‘working’ Stargate SG-1 model was within his grasp at last. “I suddenly realised I might now have enough knowledge about 3D printing, Raspberry Pi, motors, and programming to actually make a Stargate model of my own,” he tells us. “I wanted people who are familiar with the show to immediately know what it was, and tried to make it work as best I could, while staying as true as possible to the feeling and essence of the TV show.”
Kristian also wanted to use a Raspberry Pi within this fully interactive, light-up, moving-parts project as “it is a powerful device with lots of flexibility. I do like that it functions as a full computer with an operating system with all the possibility that brings.”
You only have to look at the model to see just how much 3D printing was needed to get all of the parts ready to piece together, and Kristian created it in segments. But one of the key parts of his model is the DHD or Dial Home Device which viewers of the series will be familiar with. “The DHD functions as a USB keyboard and, when the keys are used, it sends signals to the (Python) program on Raspberry Pi that engages the different motors and lights in a proper Stargate way,” he enthuses. “If a correct set of keys/symbols are pressed on the DHD, the wormhole is established – illustrated on my Stargate with an infinity mirror effect.”
However, the DHD was a challenge, and Kristian is still tweaking it to improve how it works. He admits that writing the software for the project was also tricky, “but when I think back, the most challenging part was actually making it ‘functional’, and fitting all the wires and motors on it without destroying the look and shape of the Stargate itself.”
Kristian admits to using a little artistic licence along the way, but he is keen to ensure the model replicates the original as far as possible. “I have taken a few liberties here and there. People on the social media channels are quick to point out differences between my Stargate and the one in the series. I have listened to most of those and done some changes. I will implement some more of those changes as the project continues,” he says. He also had to redesign the project several times, and had a number of challenges to overcome, especially in creating the seven lit, moving chevrons: “I tried many different approaches before I landed on the right one.”
The results of Kristian’s time-intensive labours are truly impressive, and show what you can achieve when you are willing to put in the hours and the attention to detail. Take a look at Kristian’s extremely detailed project page to see more on this super-stellar make.
Never want to miss an issue? Subscribe to The MagPi and we’ll deliver every issue straight to your door. Also, if you’re a new subscriber and get the 12-month subscription, you’ll get a completely free Raspberry Pi Zero bundle with a Raspberry Pi Zero W and accessories.
The year is drawing to a close, and we are so excited for 2021!
More than 700 young people from 39 countries shared their tech creations in the free Coolest Projects online showcase this year! We loved seeing so many young people shine with their creative projects, and we can’t wait to see what the world’s next generation of digital makers will present at Coolest Projects in 2021.
Coolest Projects is the world-leading technology fair for young people! It’s our biggest event, and we are running it online again next year so that young people can participate safely and from wherever they are in the world.
Through Coolest Projects, young people are empowered to show the world something they’re making with tech — something THEY are excited about! Anyone up to age 18 can share their creation at Coolest Projects.
On 1 February, we will open registrations for the 2021 online showcase. Mark the date in your calendar! All registered projects will get their very own spot in the Coolest Projects online showcase gallery, where the whole world can discover them.
If a young person in your life — your family, your classroom, your coding club — is making something with tech that they love, we want them to register it for Coolest Projects. It doesn’t matter how small or big their project is, because the Coolest Projects showcase is about celebrating the love we all share for getting creative with tech.
Everyone who registers a project becomes part of a worldwide community of peers who express themselves and their interests with creative tech. We will also have special judges pick their favourite projects! Taking part in Coolest Projects is a wonderful way to connect with others, be inspired, and learn from peers.
So if you know a tech-loving young person, get them excited for taking part in Coolest Projects!
“We are so very happy to have reached people who love to code and are enjoying projects from all over the world… everyone’s contributions have blown our minds… we are so so happy. Thank you to Coolest Projects for hosting the best event EVER!” – mother of a participant in the 2020 online showcase
Want inspiration for projects? You can still explore all the wonderful projects from the 2020 showcase gallery.
Everyone is invited to take part in Coolest Projects — the showcase is for young people with any level of experience. The project they register can be whatever they like, from their very first Scratch animation, to their latest robotics project, website, or phone app. And we invite projects at any stages of the creation process, whether they’re prototypes, finished products, or works-in-progress!
To be the first to know when registration opens, you only have to sign up for our newsletter:
We will send you regular news about Coolest Projects to keep you up to date and help you inspire the young tech creator in your life!
To round off Computer Science Education Week 2020, the Google Code Next team, working with the Raspberry Pi Foundation and some incredible volunteers in the Chicago area, helped over 400 Black and Latinx high school students get coding using Raspberry Pi 400. Here’s Omnia Saed with more.
In partnership with Google Code Next, the Raspberry Pi Foundation curated a computer science activity for over 400 Chicago Public Schools students. Over 1000 kits with the newly released Raspberry Pi 400 were sent to six public schools to mark the end of Computer Science Education Week (7-14 December).
Google Code Next is a free computer science education program for Black and Latinx high school students. Between 2011 and 2018, Black and Hispanic college students each made up only 3 percent of computer science graduates; Code Next works to change that. The program provides students with the skills and inspiration needed for long and rewarding careers in computer science.
“We aim to provide Black and Latinx students with skills and technical social capital — that web of relationships you can tap into,” said Google Diversity STEM Strategist Shameeka Emanuel.
The virtual event brought over 80 Google volunteers, students and teachers together to create their very own “Raspimon”—a virtual monster powered by Raspberry Pi. For many students, it was their first time coding.
Matt Richardson, Executive Director of the Raspberry Pi Foundation North America, opened the event by telling students to share their work with family and friends.
“I hope you find new ways to solve problems or express yourselves creatively. More importantly, be sure to share what you create with someone you know – you might just spark curiosity in someone else,” he said.
In an interview with the Chicago Sun-Times, Troy Williams, Chicago Public Schools interim director of computer science, explains, “Our students being able to have access to these Raspberry Pis and other resources supplements the learning they’re doing in the classrooms, and brings another level of engagement where they can create on their own. It really helps toward closing the digital divide and the learning gap as well.”
Want to join in with the fun? You’ll find a copy of the activity and curriculum on the Code Next website.
And if you’re looking to introduce someone to coding over the holidays, there’s still time to order a Raspberry Pi 400 computer kit from our network of Raspberry Pi Approved Resellers.
The post Raspberry Pi and Google Code Next bring computer science to 1000 Chicago students appeared first on Raspberry Pi.
Raspberry Pi computers have always been used in a huge variety of settings, since the combination of low cost, high performance, and ease of use makes them ideal for almost any application. We’ve seen a large proportion of sales go into the industrial market – businesses using Raspberry Pi, rather than educational settings or individual consumers. Today we’re announcing new support for this group of customers: a dedicated area of our website for industry, and our Raspberry Pi Approved Design Partners programme, connecting businesses that want to integrate Raspberry Pi into their products with hand-picked design partners who can help.
The industrial market for Raspberry Pi has grown over the years, and now represents around 44% of our annual total sales. We’ve seen this borne out with new releases of Raspberry Pi products: typically sales of a consumer product drop off once a new product is released, but we still see incredible sales of older models of Raspberry Pi. Our inference is that these are destined for embedded applications, where changing to the latest model is not practical.
To support Raspberry Pi’s industrial customers, we have developed a new, dedicated area of our website. Our For industry pages are the best place to go for industrial applications of Raspberry Pi. They provide access to the information and support you need when using our products in an industrial setting, with links to datasheets, compliance documents, and more.
As part of our commitment to industrial customers, we guarantee product lifetimes until at least 2026 on all products. We rarely end a product line – in fact, you can still buy the Raspberry Pi 1 Model B+ from 2014. And we’ve made it easy for you to take a product through the necessary regulatory compliance steps, with the Raspberry Pi Integrator Programme.
Along with our online resources for industry, we’re announcing a new programme to help customers who want to integrate Raspberry Pi into their products, and to recognise companies with specialist knowledge and proven expertise in designing with Raspberry Pi. The Raspberry Pi Approved Design Partners programme is a way of connecting trusted design consultancies with customers who need support designing Raspberry Pi computing solutions into their products.
We’re launching with a select set of designers whom we already know and work with, and we hope to grow this group over the coming years. If your company provides hardware, software, or mechanical design services with Raspberry Pi, and you’d like us to promote your offering on our website, you can find out more about applying to become a Raspberry Pi Approved Design Partner.
If you have a product or a piece of work that uses Raspberry Pi, and you need technical assistance, Raspberry Pi Approved Design Partners have the capacity to provide you with effective help. All our Design Partners have been through a rigorous application process, and we will monitor them regularly for quality and ability. You can be confident that Raspberry Pi Approved Design Partners have the backing of Raspberry Pi, and have access to a deep level of technical knowledge and support within Raspberry Pi.
We’re excited to help customers build fantastic products using Raspberry Pi, and we’re looking forward to working with a diverse range of designers across the world.
Hi folks, Ladyada here from Adafruit. The Raspberry Pi folks said we could do a guest post on our Adafruit BrainCraft HAT & Voice Bonnet, so here we go!
I’ve been engineering up a few Machine Learning devices that work with Raspberry Pi: BrainCraft HAT and the Voice Bonnet!
The idea behind the BrainCraft HAT is to enable you to “craft brains” for machine learning on the EDGE, with microcontrollers and microcomputers. On ASK AN ENGINEER, we chatted with Pete Warden, the technical lead of the mobile, embedded TensorFlow Group on Google’s Brain team about what would be ideal for a board like this.
And here’s what we designed! The BrainCraft HAT has a 240×240 TFT IPS display for inference output, slots for camera connector cable for imaging projects, a 5-way joystick, a button for UI input, left and right microphones, stereo headphone out, stereo 1W speaker out, three RGB DotStar LEDs, two 3-pin STEMMA connectors on PWM pins so they can drive NeoPixels or servos, and Grove/STEMMA/Qwiic I2C port.
This will let people build a wide range of audio/video AI projects while also allowing easy plug-in of sensors and robotics!
A controllable mini fan attaches to the bottom and can be used to keep your Raspberry Pi cool while it’s doing intense AI inference calculations. Most importantly, there’s an on/off switch that will completely disable the audio codec, so that when it’s off, there’s no way it’s listening to you.
Next up, the Adafruit Voice Bonnet for Raspberry Pi: two speakers plus two mics. Your Raspberry Pi computer is like an electronic brain — and with the Adafruit Voice Bonnet you can give it a mouth and ears as well! Featuring two microphones and two 1W speaker outputs using a high-quality I2S codec, this Raspberry Pi add-on will work with any Raspberry Pi with a 2×20 GPIO header, from Raspberry Pi Zero up to Raspberry Pi 4 and beyond (basically all models but the very first ones made).
The on-board WM8960 codec uses I2S digital audio for great quality recording and playback, so it sounds a lot better than the headphone jack on Raspberry Pi (or the no-headphone jack on Raspberry Pi Zero). We put ferrite beads and filter capacitors on every input and output to get the best possible performance, and all at a great price.
We specifically designed this bonnet for use when making machine learning projects such as DIY voice assistants. For example, see this guide on creating a DIY Google Assistant.
The post Adafruit guest post: Machine learning add-ons for Raspberry Pi appeared first on Raspberry Pi.
We’re delighted to round off 2020 by welcoming four of the most popular IQaudio products to the Raspberry Pi fold. DAC+, DAC Pro, DigiAMP+, and Codec Zero will all be available to buy via our network of Raspberry Pi Approved Resellers.
We’ve had a busy 2020 here at Raspberry Pi. From the High Quality Camera to 8GB Raspberry Pi 4 to Compute Module 4 and Raspberry Pi 400, this year’s products have been under development for several years, and bringing them to market required us to build new capabilities in the engineering team. Building capabilities, rather than money or engineer time, is the real rate-limiting step for introducing new Raspberry Pi products.
One market we’ve never explored is hi-fi audio; this is a world unto itself, with a very demanding customer base, and we’ve never felt we had the capabilities needed to offer something distinctive. Over time, third parties have stepped in with a variety of audio I/O devices, amplifiers, and other accessories.
Founded by Gordon and Sharon Garrity together with Andrew Rankin in 2015, IQaudio was one of the first companies to recognise the potential of Raspberry Pi as a platform for hi-fi audio. IQaudio products are widely used by hobbyists and businesses (in-store audio streaming being a particularly popular use case). So when the opportunity arose to acquire IQaudio’s brand and product line late last year, we jumped at it.
Today we’re relaunching four of the most popular IQaudio products, at new affordable price points, via our network of Raspberry Pi Approved Resellers.
Priced at just $20, DAC+ is our lowest-cost audio output HAT, supporting 24‑bit 192kHz high-resolution digital audio. It uses a Texas Instruments PCM5122 DAC to deliver stereo analogue audio to a pair of phono connectors, and also provides a dedicated headphone amplifier.
Priced at $25, DAC Pro is our highest-fidelity audio output HAT. It supports the same audio input formats and output connectors as DAC+, but uses a Texas Instruments PCM5242 DAC, providing an even higher signal-to-noise ratio.
In combination with an optional daughter board (due for relaunch in the first quarter of 2021), DAC Pro can support balanced output from a pair of XLR connectors.
Where DAC+ and DAC Pro are designed to be used with an external amplifier, DigiAMP+ integrates a Texas Instruments TAS5756M digital-input amplifier directly onto the HAT, allowing you to drive a pair of passive speakers at up to 35W per channel. Combined with a Raspberry Pi board, it’s a complete hi-fi that’s the size of a deck of cards.
Codec Zero is a $20 audio I/O HAT, designed to fit within the Raspberry Pi Zero footprint. It is built around a Dialog Semiconductor DA7212 codec and supports a range of input and output devices, from the built-in MEMS microphone to external mono electret microphones and 1.2W, 8 ohm mono speakers.
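For readers wondering about software setup: I2S audio boards like these are enabled on Raspberry Pi OS with a device tree overlay in /boot/config.txt. A minimal sketch, assuming the IQaudio overlay names shipped with current Raspberry Pi OS releases (check /boot/overlays/README on your system, as names can change between releases):

```ini
# /boot/config.txt
dtparam=audio=off            # optionally disable the default on-board audio
dtoverlay=iqaudio-dacplus    # DAC+ and DAC Pro
#dtoverlay=iqaudio-codec     # Codec Zero instead
```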
Unlike the other three products, which are in stock with our Approved Resellers now, Codec Zero will ship early in the New Year.
So there you have it. Four (nearly) new Raspberry Pi accessories, just in time for Christmas – hop over and buy yours now. This is the first time we’ve brought third-party products into our line-up like this; we’d like to thank the team at IQaudio for their help in making the transition.
At the Raspberry Pi Foundation, we are continually inspired by young learners in our community: they embrace digital making and computing to build creative projects, supported by our resources, clubs, and volunteers. While creating their projects, they are learning the core programming skills that underlie digital making.
Over the years, many tools and environments have been developed to make programming more accessible to young people. Scratch is one example of a block-based programming environment for young learners, and it’s been shown to make programming more accessible to them; on our projects site we offer many step-by-step Scratch project resources.
But does block-based programming actually help learning? Does it increase motivation and support students? Where is the hard evidence? In our latest research seminar, we were delighted to hear from Dr David Weintrop, an Assistant Professor at the University of Maryland who has done research in this area for several years and published widely on the differences between block-based and text-based programming environments.
The first useful insight David shared was that we should avoid thinking about block-based programming as synonymous with the well-known Scratch environment. There are several other environments, with different affordances, that David referred to in his talk, such as Snap, Pencil Code, Blockly, and more.
Some of these, for example Pencil Code, offer a dual-modality (or hybrid) environment, where learners can write the same program in a text-based and a block-based programming environment side by side. Dual-modality environments provide this side-by-side approach based on the assumption that being able to match a text-based program to its block-based equivalent supports the development of understanding of program syntax in a text-based language.
Another aspect of the research around block-based programming focuses on its usefulness as a transition to a text-based language. David described a 15-week study he conducted in high schools in the USA to investigate differences in student learning caused by use of block-based, text-based, and hybrid (a mixture of both using a dual-modality platform) programming tools.
The 90 students in the study (14 to 16 years old) were divided into three groups, each with a different intervention but taught by the same teacher. In the first phase of the study (5 weeks), the groups were set the same tasks with the same learning objectives, but they used either block-based programming, text-based programming, or the hybrid environment.
After 5 weeks, students were given a test to assess learning outcomes, and they were asked questions about their attitudes to programming (specifically their perception of computing and their confidence). In the second phase (10 weeks), all the students were taught Java (a common language taught in the USA for end-of-school assessment), and then the test and attitudinal questions were repeated.
The results showed that at the 5-week point, the students who had used block-based programming scored higher in their learning outcome assessment, but at the final assessment after 15 weeks, all groups’ scores were roughly equivalent.
In terms of students’ perception of computing and confidence, the responses of the Blocks group were very positive at the 5-week point, while at the 15-week point, the responses were less positive. The responses from the Text group showed a gradual increase in positivity between the 5- and 15-week points. The Hybrid group’s responses weren’t as negative as those of the Text group at the 5-week point, and their positivity didn’t decrease like the Blocks group’s did.
Taking both methods of assessment into account, the Hybrid group showed the best results in the study. The gains associated with the block-based introduction to programming did not translate to those students being further ahead when learning Java, but starting with block-based programming also did not hamper students’ transition to text-based programming.
David completed his talk by recommending dual-modality environments (such as Pencil Code) for teaching programming, as used by the Hybrid group in his study.
The seminar audience raised many questions about David’s study, for example whether the actual teaching (pedagogy) may have differed for the three groups, and whether the results are not just due to the specific tools or environments that were used. This is definitely an area for further research.
It seems that students may benefit from different tools at different times, which is why a dual-modality environment can be very useful. Of course, competence in programming takes a long time to develop, so there is room on the research agenda for longitudinal studies that monitor students’ progress over many months and even years. Such studies could take into account both the teaching approach and the programming environment in order to determine what factors impact a deep understanding of programming concepts, and students’ desire to carry on with their programming journey.
If you missed the seminar, you can find David’s presentation slides and a recording of his talk on our seminars page.
Our next free online seminar takes place on Tuesday 5 January at 17:00–18:00 GMT / 12:00–13:00 EST / 9:00–10:00 PST / 18:00–19:00 CET. We’ll welcome Peter Kemp and Billy Wong, who are going to share insights from their research on computing education for underrepresented groups. To join this free online seminar, simply sign up with your name and email address.
Once you’ve signed up, we’ll email you the seminar meeting link and instructions for joining. If you attended David’s seminar, the link remains the same.
The January seminar will be the first one in our series focusing on diversity and inclusion in computing education, which we’re co-hosting with the Royal Academy of Engineering. We hope to see you there!
The post Block-based programming: does it help students learn? appeared first on Raspberry Pi.
Surprise! We thought we’d squeeze in another shiny new thing for you before the year is out. And it’ll make a cracking Christmas present.
Our brand new book Help! My Computer is Broken (How do I fix it?) takes the most common computer problems and tells you how to fix them.
It’s “the intolerant person’s guide to keeping your computer computing.” If that sounds like you, we recommend you hop straight over to the Raspberry Pi Press online store and pick up a copy for just £10.
It also makes a good, only mildly passive-aggressive, gift. If the above text messages ring a bell, and you’re fed up with being the in-house tech support for your family, then Help! My computer is broken (How do I fix it?) can assist. It shows readers how to fix common computer problems, without having to wade through technical jargon or pester said tech support person.
We had the brilliant Barry Collins, who has been a technology journalist for more than 20 years, write it for you. He’s written for most of the UK’s leading tech publications, and he is a former editor of PC Pro as well as former assistant editor of the Sunday Times‘ technology section.
He’s now co-editor of The Big Tech Question, a site designed to answer people’s tech queries – in a similar vein to this book. Barry also makes regular appearances as a tech pundit on TV and radio.
You can buy Help! My computer is broken (How do I fix it?) now from the Raspberry Pi Press online store, or at the Raspberry Pi store in Cambridge, UK.
Our lovely friends at Wireframe magazine have also made a free PDF version available.
While you’re shopping at the Raspberry Pi Press online store, make sure you check out our Black Friday deal, which we’ve decided to keep rolling until Christmas Eve.
If you buy just one book from the Black Friday range (priced £7 – £10), you get two more completely FREE!
What if you could give the joy of opening a Raspberry Pi–themed gift every single month for a whole year? But what if the thought of wrapping 12 individual things fills you with Scrooge-level dread?
Snap up a magazine subscription for one of your nearest and/or dearest and we’ll take care of the packaging and delivery while you sit back and reap all the credit!
You could end up with a few extra gifts depending on what you sign up for, so read on and take your pick.
The official Raspberry Pi magazine comes with a free Raspberry Pi Zero W kit worth £20 when you sign up for a 12-month subscription. You can use our tiniest computer in tonnes of projects, meaning Raspberry Pi fans can never have enough. That’s a top gift-giving bonus for you right there.
Every issue of The MagPi is packed with computing and electronics tutorials, how-to guides, and the latest news and reviews. They also hit their 100th issue this month, so if someone on your list has been thinking about getting a subscription, now is a great time.
Check out subscription deals on the official Raspberry Pi Press store.
HackSpace magazine is the one to choose for fixers and tinkerers of all abilities. If you’re looking for a gift for someone who is always taking things apart and hacking everyday objects, HackSpace magazine will provide a year of inspiration for them.
12-month subscriptions come with a free Adafruit Circuit Playground Express, which has been specially developed to teach programming novices from scratch and is worth £25.
Check out subscription deals on the official Raspberry Pi Press store.
Custom PC is the magazine for people who are passionate about PC technology and hardware. And they’ve just launched a pretty cool new giveaway with every 12-month subscription: a free Chillblast Aero RGB Gaming mouse worth £40. Look, it lights up, it’s cool.
Check out subscription offers on the official Raspberry Pi Press store.
Wireframe magazine lifts the lid on video games. In every issue, you’ll find out how games are made, who makes them, and how you can code them to play for yourself using detailed guides.
The latest deal gets you three issues for just £10, plus your choice of one of our official books as a gift. By the way, that ‘three for £10 plus a free book’ deal is available across ALL our magazines. Did I not tell you that before? My bad. It’s good though, right?
Check out more subscriptions deals on the official Raspberry Pi Press store.
And as an extra Christmas gift to you all, we’ve decided to keep our Black Friday deal rolling until Christmas Eve, so if you buy just one teeny tiny book from the Raspberry Pi Press store, you get two more completely FREE!
Better still, all of the books in the deal cost only £7 or £10 to start with, so it makes for a good chunky batch of presents at a brilliantly affordable price.
The official Raspberry Pi magazine turned 100 this month! To celebrate, the greatest Raspberry Pi moments, achievements, and events that The MagPi magazine has ever featured came back for a special 100th issue.
100 Raspberry Pi Moments is a cracking bumper feature (starting on page 32 of issue 100, if you’d like to read the whole thing) highlighting some influential projects and educational achievements, as well as how our tiny computers have influenced pop culture. And since ’tis the season, we thought we’d share the How Raspberry Pi made a difference section to bring some extra cheer to your festive season.
The Raspberry Pi Foundation was originally launched to get more UK students into computing. Not only did it succeed at that, but the hardware and the Foundation have also managed to help people in other ways and all over the world. Here are just a few examples!
The Raspberry Pi Foundation provides free learning resources for everyone; however, not everyone has access to a computer to learn at home. Thanks to funding from the Bloomfield Trust and in collaboration with UK Youth and local charities, the Foundation has been able to supply hundreds of Raspberry Pi Desktop Kits to young people most in need. The computers have allowed these children, who wouldn’t have been able to otherwise, to learn from home and stay connected to their schools during lockdown. The Foundation’s work to distribute Raspberry Pi computers to young people in need is ongoing.
Elsewhere, a need for more medical equipment around the world resulted in many proposals and projects for cheap, easy-to-produce machines. Some included Raspberry Pi Zero, with 40,000 of these sold for ventilator designs.
While there’s no global project or standard to say what an offline internet should contain, some educational projects have tried to condense enough online content for specific audiences and load it all onto a Raspberry Pi. RACHEL-Pi is one such solution. The RACHEL-Pi kit acts as a server, hosting a variety of educational materials for all kinds of subjects, as well as an offline version of Wikipedia with 6000 articles. There’s even medical info for helping others, maths lessons from Khan Academy, and much more.
17,000 ft is another great project, which brings computing to schools high up in the Himalayas through a similar method in an attempt to help children stay in their local communities.
The free coding resources available on our projects site are great, and the Raspberry Pi Foundation works to make them accessible to people whose first language isn’t English: we have a dedicated translation team and, thanks to volunteers around the world, provide our free resources translated into up to 32 other languages. From French and Welsh to Korean and Arabic, there’s a ton of projects that learners from all over the world can access in their first language.
That’s not all: several charitable groups have set up Raspberry Pi classrooms to bring computing education to poorer parts of the world. People in African countries and parts of rural India have benefited from these programmes, and work is being done to widen access to ever more people and places.
The ham radio community loves Raspberry Pi for amateur radio projects; however, sometimes people need radio for more urgent purposes. In 2016, German group Media in Cooperation and Transition created Pocket FM, micro radio transmitters with a 4–6km range. These radios allowed Syrians in the middle of a civil war to connect to free media on Syrnet for more reliable news.
Raspberry Pi powered these transmitters, chosen because it is so easy to upgrade and add components to. Each transmitter is solar-powered, and Syrnet is still transmitting through them as the war continues into its tenth year.
Code an explosive homage to Toaplan’s classic blaster. Mark Vanstone has the details.
Tiger-Heli was developed by Toaplan and published in Japan by Taito and by Romstar in North America.
Released in 1985, Tiger-Heli was one of the earliest games from Japanese developer Toaplan: a top-down shoot-’em-up that pitted a lone helicopter against relentless waves of enemy tanks and military installations. Toaplan would go on to refine and evolve the genre through the eighties and nineties with such titles as Truxton and Fire Shark, so Tiger-Heli served as a kind of blueprint for the studio’s legendary blasters.
Tiger-Heli featured a powerful secondary weapon, too: as well as a regular shot, the game’s attack helicopter could also drop a deadly bomb capable of destroying everything within its blast radius. The mechanic was one that first appeared as far back as Williams’ Defender in 1981, but Toaplan quickly made it its own, with variations on the bomb becoming one of the signatures in the studio’s later games.
For our Tiger-Heli-style Pygame Zero code, we’ll concentrate on the unique bomb aspect, but first, we need to get the basic scrolling background and helicopter on the screen. In a game like this, we’d normally make the background out of tiles that can be used to create a varied but continuous scrolling image. For this example, though, we’ll keep things simple and have one long image that we scroll down the screen and then display a copy above it. When the first image goes off the screen, we just reset the co-ordinates to display it above the second image copy. In this way, we can have an infinitely scrolling background.
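A minimal sketch of that wrap-around logic, assuming a 600-pixel-tall screen and illustrative names (this is not the Wireframe listing itself):

```python
# Two-copy infinite scroll: one image drawn at scroll_y, a second copy
# drawn directly above it. HEIGHT, scroll_y and the speed are assumptions.
HEIGHT = 600       # screen height; each background image is also this tall

scroll_y = 0       # y position of the first background copy

def update_scroll(speed=2):
    """Scroll both copies down; wrap when the first copy leaves the screen."""
    global scroll_y
    scroll_y += speed
    if scroll_y >= HEIGHT:     # first copy fully off the bottom...
        scroll_y -= HEIGHT     # ...so place it back above the second copy

# In draw() you would blit the image at y = scroll_y and again at
# y = scroll_y - HEIGHT, giving a seamless, endless scroll.
```

Because the two copies are identical, the reset is invisible to the player.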
Our Tiger-Heli homage in Python. Fly over the military targets, firing missiles and dropping bombs.
The helicopter can be set up as an Actor with just two frames for the movement of the rotors. This should look like it’s hovering above the ground, so we blit a shadow bitmap to the bottom right of the helicopter. We can set up keyboard events to move the Actor left, right, up, and down, making sure we don’t allow it to go off the screen.
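The move-and-clamp logic might look something like this sketch, with booleans standing in for Pygame Zero’s keyboard checks (WIDTH, HEIGHT, and the speed are assumptions):

```python
# Illustrative one-frame movement for the helicopter Actor; in Pygame Zero
# the booleans would come from keyboard.left, keyboard.right and so on.
WIDTH, HEIGHT = 800, 600

def move_heli(x, y, left, right, up, down, speed=4):
    """Apply one frame of movement, keeping the Actor on screen."""
    if left:
        x -= speed
    if right:
        x += speed
    if up:
        y -= speed
    if down:
        y += speed
    x = max(0, min(WIDTH, x))    # clamp to the screen edges
    y = max(0, min(HEIGHT, y))
    return x, y
```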
Now we can go ahead and set up the bombs. We can predefine a list of bomb Actors but only display them while the bombs are active. We’ll trigger a bomb drop with the SPACE bar and set all the bombs to the co-ordinates of the helicopter. Then, frame by frame, we move each bomb outwards in different directions so that they spread out in a pattern. You could try adjusting the number of bombs or their pattern to see what effects can be achieved. When the bombs get to frame 30, we start changing the image so that we get a flashing, expanding circle for each bomb.
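One way to sketch such a spread, assuming a simple radial pattern in which each bomb takes an equal slice of the circle (the function name and values here are illustrative, not the article’s listing):

```python
import math

# Hypothetical radial spread: all bombs start at the drop point, then fan
# outwards along evenly spaced directions as the frame counter advances.
def bomb_offsets(num_bombs, frame, speed=2.0):
    """Offset of each bomb from the drop point after `frame` frames."""
    offsets = []
    for i in range(num_bombs):
        angle = 2 * math.pi * i / num_bombs
        offsets.append((math.cos(angle) * speed * frame,
                        math.sin(angle) * speed * frame))
    return offsets
```

Each frame you would add these offsets to the drop point to position the bomb Actors; changing `num_bombs` or `speed` is an easy way to experiment with different patterns.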
It’s all very well having bombs to fire, but we could really do with something to drop them on, so let’s make some tank Actors waiting on the ground for us to destroy. We can move them with the scrolling background so that they look like they’re static on the ground. Then if one of our bombs has a collision detected with one of the tanks, we can set an animation going by cycling through a set of explosion frames, ending with the tank disappearing.
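A sketch of the hit test and explosion cycle, using a plain distance check in place of Pygame Zero’s Actor.colliderect() (the radius and image names are assumptions):

```python
# Illustrative collision and explosion-animation logic for the tanks.
EXPLOSION_FRAMES = ["explode0", "explode1", "explode2"]  # image names assumed

def bomb_hits_tank(bx, by, tx, ty, radius=24):
    """Simple distance check between a bomb and a tank."""
    return (bx - tx) ** 2 + (by - ty) ** 2 <= radius ** 2

def advance_explosion(frame_index):
    """Step to the next explosion image; None means the tank has gone."""
    frame_index += 1
    if frame_index >= len(EXPLOSION_FRAMES):
        return None
    return frame_index
```

On a hit you would start the tank at frame 0 and call the advance function once per update until it returns None, at which point the tank Actor stops being drawn.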
We can also add in some sound effects as the bombs are dropped, and explosion sounds if the tanks are hit. And with that, there you have it: the beginnings of a Tiger-Heli-style blaster.
You can read more features like this one in Wireframe issue 45, available directly from Raspberry Pi Press — we deliver worldwide.
And if you’d like a handy digital version of the magazine, you can also download issue 45 for free in PDF format.
Baldur’s Gate III: our cover star for Wireframe #45.
The post Recreate Tiger-Heli’s bomb mechanic | Wireframe #45 appeared first on Raspberry Pi.
Well, in a year as disrupted and strange as 2020, it’s nice to know that there are some things you can rely on, such as the traditional end-of-year release of Raspberry Pi OS, which we launch today. Here’s a run-through of the main new features that you’ll find in it.
We’ve updated the Chromium browser to version 84. This has taken us a bit longer than we would have liked, but it’s always quite a lot of work to get our video hardware acceleration integrated with new releases of the browser. That’s done now, so you should see good-quality video playback on sites like YouTube. We’ve also, given events this year, done a lot of testing and tweaking on video conferencing clients such as Google Meet, Microsoft Teams, and Zoom, and they should all now work smoothly on your Raspberry Pi’s Chromium.
There’s one more thing to mention on the subject of web browsers. We’ve been shipping Adobe’s Flash Player as part of our Chromium install for several years now. Flash Player is being retired by Adobe at the end of the year, so this release will be the last that includes it. Most websites have now stopped requiring Flash Player, so this hopefully isn’t something that anyone notices!
From this release onwards, we are switching Raspberry Pi OS to use the PulseAudio sound server.
First, a bit of background. Audio on Linux is really quite complicated. There are multiple different standards for handling audio input and output, and it does sometimes seem that what has happened, historically, is that whenever anyone wanted to use audio in Linux, they looked at the existing libraries and programs and went “Hmmm… I don’t like that, I’ll write something new and better.” This has resulted in a confused mass of competing and conflicting software, none of which quite works the way anyone wants it to!
The most common audio interface, which lies underneath most Linux systems somewhere, is called ALSA, the Advanced Linux Sound Architecture. This is a fairly reliable low-level audio interface — indeed, it is what Raspberry Pi OS has used up until now — but it has quite a lot of limitations and is starting to show its age. For example, it can only handle one input and one output at a time. So if ALSA is being used by your web browser to play sound from a YouTube video to the HDMI output on your Raspberry Pi, nothing else can produce sound at the same time; if you were to try playing a video or an audio file in VLC, you’d hear nothing but the audio from YouTube. Similarly, if you want to switch the sound from your YouTube video from HDMI to a USB sound card, you can’t do it while the video is playing; it won’t change until the sound stops. These aren’t massive problems, but most modern operating systems do handle audio in a more flexible fashion.
More significant is that ALSA doesn’t handle Bluetooth audio at all, so various other extensions and additional bits of software are required to even get audio into and out of Bluetooth devices on an ALSA-based system. We’ve used a third-party library called bluez-alsa for a few years now, but it’s an additional piece of code to maintain and update, so this isn’t ideal.
PulseAudio deals with all of this. It’s a piece of software that sits as a layer between all the audio hardware and all the applications that send and receive audio, and it automatically routes everything to the right places. It can mix the audio from multiple applications together, so you can hear VLC at the same time as YouTube, and it allows the output to be moved around between different devices while it is playing. It knows how to talk to Bluetooth devices, and it greatly simplifies the job of managing default input and output devices, so it makes it much easier to make sure audio ends up where it is supposed to be!
One area where it is particularly helpful is in managing audio input and output streams to web browsers like Chromium; in our testing, the use of PulseAudio made setting up video conferencing sessions much easier and more reliable, particularly with Bluetooth headsets and webcam audio.
The good news for Raspberry Pi users is that, if we’ve got it right, you shouldn’t even notice the change. PulseAudio now runs by default, and while the volume control and audio input/output selector on the taskbar looks almost identical to the one in previous releases of the OS, it is now controlling PulseAudio rather than ALSA. You can use it just as before: select your output and input devices, adjust the volume, and you’re good to go.
There is one small change to the input/output selector, which is the menu option at the bottom for Device Profiles. In PulseAudio, any audio device has one or more profiles, which select which outputs and inputs are used on any device with multiple connections. (For example, some audio HATs and USB sound cards have both analogue and digital outputs — there will usually be a profile for each output to select where the audio actually comes out.)
Profiles are more important for Bluetooth devices. If a Bluetooth device has both an input and an output (such as a headset with both a microphone and an earphone), it usually supports two different profiles. One of these is called HSP (HeadSet Profile), and this allows you to use both the microphone and the earphone, but with relatively low sound quality — equivalent to what you hear on a mobile phone call, so fine for speech but not great for music. The other profile is called A2DP (Advanced Audio Distribution Profile), which gives much better sound quality, but is output-only: it does not allow you to use the microphone. So if you are making a call, you want your Bluetooth device to use HSP, but if you are listening to music, you want it to use A2DP.
We’ve automated some of this, so if you select a Bluetooth device as the default input, then that device is automatically switched to HSP. If you want to switch a device which is in HSP back to A2DP, just reselect it from the output menu. Its microphone will then be deactivated, and it will switch to A2DP. But sometimes you might want to take control of profiles manually, and the Device Profiles dialog allows you to do that.
(Note that if you are only using the Raspberry Pi’s internal sound outputs, you don’t need to worry about profiles at all, as there is only one, and it’s automatically selected for you.)
Some people who have had experience of PulseAudio in the past may be a little concerned by this change, because PulseAudio hasn’t always been the most reliable piece of software. But it has now reached the point where it solves far more problems than it creates, which is why many other Linux distributions, such as Ubuntu, now use it by default. Most users shouldn’t even notice the change; there may be occasional issues with some older applications such as Sonic Pi, but the developers of these applications will hopefully address any issues in the near future.
One thing which has always been missing from Raspberry Pi OS is an easy way to connect to and configure printers. There is a Linux tool for this, called CUPS, the Common Unix Printing System. (It’s actually owned by Apple and is the underlying printing system used by macOS, but it is still free software and available for use by Linux distributions.)
CUPS has always been available in apt, so could be installed on any Raspberry Pi, but the standard web-based interface is a bit unfriendly. Various third-party front-end tools have been written to make CUPS a bit easier to use, and we have decided to use one called system-config-printer. (Like PulseAudio, this is also used as standard by Ubuntu.)
So both CUPS and system-config-printer are now installed as part of Raspberry Pi OS. If you are a glutton for punishment, you can access the CUPS web interface by opening the Chromium browser and going to http://localhost:631, but instead of doing that, we suggest just going into the Preferences section in the main menu and opening Print Settings.
This shows the system-config-printer dialog, from which you can add new printers, remove old ones, set one as the default, and access the print queue for each printer, just as you should be familiar with from other operating systems.
Like most things in Linux, this relies on user contributions, so not every printer is supported. We’ve found that most networked printers work fine, but USB printers are a bit hit-and-miss as to whether there is a suitable driver; in general, the older your printer is, the more likely it is to have a CUPS driver available. The best thing to do is to try it and see, and perhaps ask for help on our forums if your particular printer doesn’t seem to work.
This fills in one of the last things missing in making Raspberry Pi a complete desktop computer, by making it easy to set up a printer and print from applications such as LibreOffice.
One of the areas we have tried to improve in the Desktop this year is to make it more accessible to those with visual impairments. We added support for the Orca screen reader at the start of the year, and the display magnifier plugin over the summer.
While there are no completely new accessibility features this time, we have made some improvements to Orca support in applications like Raspberry Pi Configuration and Appearance Settings, to make them read what they are doing in a more helpful fashion; we’ve also worked with the maintainers of Orca to raise and fix a few bugs. It’s still not perfect, but we’re doing our best!
One of the benefits of switching to PulseAudio is that it now means that screen reader audio can be played through Bluetooth devices; this was not possible using the old ALSA system, so visually-impaired users who wish to use the screen reader with a Bluetooth headset or external speaker can now do so.
One feature we have added is an easy way to install Orca; it is still available through Recommended Software as before, but given that it is not easy to navigate for a visually-impaired person, there is now a keyboard shortcut: just hold down
And if you can’t remember that shortcut, when you first boot a new image, if you don’t do anything for thirty seconds or so, the startup wizard will now speak to you to remind you how to do it…
Finally, we had hoped to be able to say that Chromium was now compatible with Orca; screen reader support was being added to versions 8x. Unfortunately, for now this seems to only have been added for Windows and Mac versions, not the Linux build we use. Hopefully Google will address this in a future release, but for now if you need a web browser compatible with Orca, you’ll need to install Firefox from apt.
We’ve added a couple of options to the Raspberry Pi Configuration tool.
On the System tab, if you are running on a Raspberry Pi with a single status LED (i.e. a Raspberry Pi Zero or the new Raspberry Pi 400), there is now an option to select whether the LED just shows that the power is on, or if it flickers off to show drive activity.
On the Performance tab, there are options to allow you to control the new Raspberry Pi Case Fan: you can select the GPIO pin to which it is connected and set the temperature at which it turns on and off.
To apply the updates to an existing image, you’ll need to enter the usual commands in a terminal window:
sudo apt update
sudo apt full-upgrade
(It is safe to just accept the default answer to any questions you are asked during the update procedure.)
Then, to install the PulseAudio Bluetooth support, you will need to enter the following commands in the terminal window:
sudo apt purge bluealsa
sudo apt install pulseaudio-module-bluetooth
To swap over the volume and input selector on the taskbar from ALSA to PulseAudio, after your Raspberry Pi has restarted, right-click a blank area on the taskbar and choose Add / Remove Panel Items. Find the plugin labelled Volume Control (ALSA/BT) in the list, select it and click Remove; then click the Add button, find the plugin labelled Volume Control (PulseAudio) and click Add. Alternatively, just open the Appearance Settings application from the Preferences section of the Main Menu, go to the Defaults tab and press one of the Set Defaults buttons.
Some people have reported that some applications are ignoring the effect of the PulseAudio output switcher. This is probably caused by an old ALSA configuration file still being on the system. Once you have updated, execute the following in a terminal window to remove the per-user ALSA configuration file, which should fix this:
rm ~/.asoundrc
To remove the old Audio Preferences application, which will not work with PulseAudio, do:
sudo apt purge pimixer
We are also aware that the output from the Raspberry Pi’s internal audio device is in mono in the image; this is now fixed. To get the fix, do an update in a terminal window:
sudo apt update
sudo apt full-upgrade
sudo reboot
Some users have reported that in a dual-monitor setup, switching the audio output to the AV Jack causes problems – this is being investigated at the moment.
As ever, do let us know what you think in the comments.
At the Raspberry Pi Foundation, we host a free online research seminar once a month to explore a wide variety of topics in the area of digital and computing education. This year, we’ve hosted eleven seminars — you can (re)discover slides and recordings on our website.
Now we’re getting ready for new seminars in 2021! In the coming months, our seminars are going to focus on diversity and inclusion in computing education. This topic is extremely important, as we want to make sure that computing is accessible to all, that we understand how to actively remove barriers to participation for learners, and that we understand how to teach computing in an inclusive way.
We are delighted to announce that these seminars focusing on diversity and inclusion will be co-hosted by the Royal Academy of Engineering. The Royal Academy of Engineering is harnessing the power of engineering to build a sustainable society and an inclusive economy that works for everyone.
We’re very excited to be partnering with the Academy because of our shared interest in ensuring that computing and engineering are inclusive and accessible to all.
The seminars take place on the first Tuesday of the month at 17:00–18:30 GMT / 12:00–13:30 EST / 9:00–10:30 PST / 18:00–19:30 CET.
We’d love to welcome you to these seminars so we can learn and discuss together. To get access, simply sign up with your name and email address.
Once you’ve signed up, we’ll email you the seminar meeting link and instructions for joining. If you attended our seminars in the past, the link remains the same.
The post Diversity and inclusion in computing education — new research seminars appeared first on Raspberry Pi.
When I first investigated inserting a fan into the standard Raspberry Pi case, there were two main requirements. The first was to keep the CPU cool in all usage scenarios. The second was to reduce or eliminate any changes to the current case, and therefore avoid costly tool changes.
As I had no experience developing a fan, I did what all good engineers do and had a go anyway. We had already considered opening the space above the Ethernet connector to create a flow of air into the case. So, I developed my first prototype from a used Indian takeaway container (I cleaned it first), but the card version below was easier to recreate.
The duct above is what remains from my first effort. The concept is relatively simple: draw air in over the Ethernet port, then drive it down onto the CPU. But it wasn’t good enough; running CPU-burn on all four cores required a fan which sounded like it was about to take off. So I spoke to a professional, who did some computational fluid dynamics (CFD) analysis for us.
CFD analysis takes a 3D description of the volume and calculates a simulation of fluid flow (the air) through it. The result shows where the air moves fastest (the green and red areas).
What this showed us is that the position of the fan is important, since the fastest-moving bit of air is actually quite far from the centre of the processor. Also:
The picture above shows how most of the moving air (green and red) is mainly spinning around inside the fan. This happens because there is a pressure difference between the input and output sides of the fan (the sucky end and the blowy end). Fans just don’t work well that way; they are most efficient when unrestricted. I needed to go back to the drawing board. My next experiment was to add holes to the case to understand how much the airflow could be changed.
After running the tests with additional holes in both the lid and the base, I concluded the issue wasn’t really getting air in and out of the case unrestricted (although the holes did make a small difference), but the restriction the air duct placed on the flow into the fan itself. Back to the drawing board…
During a long run in the fens, I thought about the airflow over the Ethernet connector and through the narrow duct, wondering how we could open this up to reduce the constriction. I realised it might be possible to use the whole ‘connector end’ of the case as the inlet port.
Suddenly, I had made a big difference… By drawing air from around the USB and Ethernet connectors, the lid is left unmodified but still achieves the cooling effect I was looking for. Next was to reduce the direction changes in the airflow and try to make the duct simpler.
The cardboard bulkhead does exactly what you need and nothing more. It separates the two halves of the case and directs the air straight down at the processor. Using this design and the heatsink, I was able to achieve cooling capable of easily running the cpuburn application, but with an even smaller (and quieter) fan.
The next job was to develop a plastic clip to attach the fan to the lid. That’s where our friends at Kinneir Dufort came in. They designed the injection-moulded polycarbonate part that makes an accurate interface with the Raspberry Pi’s PCB. The ‘bulkhead’ clips neatly into the slots in the lid, almost like it was planned!
The Raspberry Pi Case Fan has been developed with an advanced user in mind. It allows them to use the Raspberry Pi at its limits whilst retaining the unique finished exterior of the Raspberry Pi Case.
For those who love a good graph, here are the temperature results during a quad-core compile of the Linux kernel, as demonstrated in Eben’s launch post on Monday.
From our first prototype way back in 2006, to the very latest Raspberry Pi 400, everything we have built here at Raspberry Pi has been driven by a desire to inspire learning. I hope that each of you who uses our products discovers — or rediscovers — the joy of learning through making. The journey from technology consumer to technology creator can be a transformational one; today, on Giving Tuesday, I’m asking you to help even more young people make that journey.
Too few young people have the chance to learn how technology works and how to harness its power. Pre-existing disparities in access to computing education have been exacerbated by the coronavirus pandemic. At the Raspberry Pi Foundation, we’re on a mission to change this, and we’re working harder than ever to support young people and educators with free learning opportunities. Our partner CanaKit supports the Raspberry Pi Foundation’s mission, and they’ve extended the generous offer to match your donations up to a total of $5,000.
Alongside our low-cost, high-performance computers and free software, you may know that the Raspberry Pi Foundation provides free educational programmes including coding clubs and educator training for millions of people each year in dozens of countries. You might not know that the Raspberry Pi Foundation was founded as, and still remains, a nonprofit organisation. Our education mission is powered by dedicated volunteers, and our programmes are funded in part thanks to our customers who buy Raspberry Pi products, and in part by charitable donations from people like you.
Every donation we receive makes an impact on the young people and educators who rely on the Raspberry Pi Foundation. Ryka, for example, is a 10-year-old who attends one of our CoderDojo clubs. Since March she’s been using our project guides and following our Digital Making at Home code-along live streams. Her parents tell us:
“We were looking at ways to keep Ryka engaged during this lockdown period and came across Digital Making at Home. As a parent I can see that there has been discernible improvement in her abilities. We’ve noticed that she is engaged and takes interest in showing us what she was able to build. It has been a great use of her time.”– Parent of a young person who learns through our programmes
Ryka joins millions of learners in our community around the world, many of whom now rely on us more than ever with schools and extracurricular activities disrupted. Through the ongoing support of our donors and volunteers, we’ve been able to rise to the challenge of the pandemic:
Young coders and digital makers need our help in the year ahead as they take control of their computing education under challenging and uncertain circumstances. As a donor to the Raspberry Pi Foundation, you will be investing in our youngest generation of digital makers and helping to create a spark in a young person’s life. On Giving Tuesday, I am grateful to each of you for the role you play in creating a world where everyone can learn, solve problems, and shape their future through the power of technology.
PS Thank you again to our friends at CanaKit for doubling the impact of every donation, up to $5,000!
Today we’re launching a stocking-filler product to help you squeeze more performance out of your Raspberry Pi 4. The $5 Raspberry Pi 4 Case Fan clips inside the lid of the Official Case, and keeps your Raspberry Pi 4 cool even when running the heaviest workloads, at the most aggressive overclocks.
Like all electronic products, Raspberry Pi generates waste heat as it works. Along with most fanless products – like most mobile phones – Raspberry Pi 4 was originally designed to operate in a “sprint-and-recover” mode: if run at maximum performance for an extended period it would heat up, and eventually throttle back to limit its temperature.
In practice, the power optimisation work that we’ve done over the last eighteen months has largely eliminated throttling for an uncased board, operating at the stock clock frequency of 1.5GHz, and in a typical ambient temperature.
Here’s a graph of temperature during a quad-core compile of the Linux kernel: you can see the temperature barely exceeds 70°C.
But maybe you want to put your Raspberry Pi in a case; or you’ve noticed that your Raspberry Pi will overclock to 1.8GHz or more; or you want to use it in a higher ambient temperature. All of these things can put us back in sprint-and-recover mode.
Here’s the same workload running on a board in a Raspberry Pi official case: now we hit the 80°C throttle point and slow down, and the compile job takes (slightly) longer to complete.
To run indefinitely at full speed under these conditions you’ll need either a passive cooling solution (like the excellent Flirc case), or an active one like the Raspberry Pi 4 Case Fan. It draws air in over the USB and Ethernet connectors, passes it over a small finned heatsink attached to the processor, and exhausts it through the SD card slot. Here’s our workload running with the case fan: now the board remains well below 70°C, and as expected the compile job takes the same amount of time as on the uncased board.
Gordon Hollingworth will be here on Wednesday to talk about how he designed the Raspberry Pi 4 Case Fan ducting with the aid of a stack of Chinese takeout boxes and a glue gun.
As with all our products, the Raspberry Pi Case Fan is available from our Raspberry Pi Approved Resellers. Simply head over to the Case Fan page and select your country from the drop-down menu.
If your country isn’t on the list yet, don’t worry, we’re constantly working to add further countries and resellers to the list. Until then, check out some of our Approved Resellers that offer international shipping.
Hey everyone, come and see, come and see! Here’s a great new bookazine from the makers of the official Raspberry Pi magazine. We do love the folks at The MagPi. Clever, they are.
If, like us, you’re over 2020 already, dive into the pages of The Official Raspberry Pi Handbook 2021, and pretend it never happened. That will totally work.
To help you get the most out of your Raspberry Pi computer, this official Handbook features 200 pages of essential information, inspiring projects, practical tutorials, and definitive reviews.
If you’re an absolute beginner, you can learn from the Handbook how to set up your Raspberry Pi and start using it. Then you can move on to the step-by-step tutorials that will teach you how to code and make with your Raspberry Pi.
You’ll also (re)discover the new Raspberry Pi 400 and High Quality Camera, both released this year. And you’ll find out about the top kits and accessories for your projects.
And finally, we’ve also picked out some incredible Raspberry Pi projects made by people in the community to inspire you to get making and coding.
Personally, we prefer new book smell and the crackle of physical pages, but if you’re less picky and don’t mind on-screen reading, the lovely folks at The MagPi have a PDF version you can download for free.
Hey there, folks, Rob from The MagPi here! I hope you’ve all been doing OK.
Today we celebrate the 100th issue of The MagPi, the official Raspberry Pi magazine!
Most of you probably know that The MagPi didn’t start off official, though: eight and a half years ago, intrepid community members came together to create The MagPi as a fanzine, and it ran as one for 30 issues (plus one special) until early 2015, when it became part of Raspberry Pi and went official.
For 70 issues now, the rest of the team and I have worked hard to bring Raspberry Pi fans a monthly magazine packed full of amazing content from the global Raspberry Pi (and wider maker) community. In the last six-ish years, I’ve built robots with you, stuffed Raspberry Pi Zeros into games controllers, lit up my Christmas tree, written far too many spooky puns, gone stargazing, recorded videos for numerous Raspberry Pi launches, and tried to help everyone who wanted to get their hands on the (in)famous issue 40.
I could go on, but I already have: for issue 100 we’re celebrating 100 incredible moments in Raspberry Pi history, from its humble beginnings to becoming the third best-selling computer ever, and one of the few to be on the International Space Station.
One of those moments was the release of Raspberry Pi 400, an incredibly cool model of Raspberry Pi that elicited a few ‘oohs’ and ‘aahs’ from me when mine arrived in the post. We give it the full MagPi breakdown with benchmarks and interviews, courtesy of our good friend Gareth Halfacree.
But wait, there’s more! We’ve managed to squeeze in our usual array of projects, tutorials, reviews, and community reports as well. Expect cool robots, funky guitars, handheld console building guides, and case reviews.
Never want to miss an issue? Subscribe to The MagPi and we’ll deliver every issue straight to your door. Also, if you’re a new subscriber and get the 12-month subscription, you’ll get a completely free Raspberry Pi Zero bundle with a Raspberry Pi Zero W and accessories.
I really think you’ll like this issue. Here’s to another 100.
The post The MagPi #100: celebrate 100 amazing moments from Raspberry Pi history appeared first on Raspberry Pi.
Mike reports a “substantial difference in sound quality” compared to his previous setup (the aforementioned and reviled Bluetooth and RCA plug options).
This project lets you use a Raspberry Pi as a music player and control it from your mobile phone.
This bundle comes with a nice, sleek case, so your music player can be on display discreetly.
Not sure about spending $80 on this kit? In the project video, Mike says it’s “totally, totally worth it” — partly due to the quality of the case.
You can use an Ethernet cable, but Mike wanted to utilise Raspberry Pi 4’s wireless connectivity to run the Volumio app. This way, the Raspberry Pi music player can be used anywhere in the house, as it creates its own wireless hotspot, called ‘Volumio’, within your home network.
You’ll need a different version of the Volumio app depending on whether you have an Android phone or iPhone. Mike touts the app as “super easy, really robust”. You just select the music app you usually use from the ‘Plugins’ section of the Volumio app, and all your music, playlists, and cover art will be there ready for you once downloaded.
And that’s basically it. Just connect to the Volumio OS via the app and tell your Raspberry Pi what to play.
To get his new music player booming all around the house, Mike used a Starke Sound AD4, which you can watch him unbox and review.
What kind of amplification system have you got paired up with your Raspberry Pi–powered music player?
Today we have a guest post from Igalia’s Iago Toral, who has spent the past year working on the Mesa graphic driver stack for Raspberry Pi 4.
Today we have some very exciting news to share: as of 24 November the V3DV Vulkan Mesa driver for Raspberry Pi 4 has demonstrated Vulkan 1.0 conformance.
Khronos describes the conformance process as a way to ensure that its standards are consistently implemented by multiple vendors, so as to create a reliable platform for application developers. For each standard, Khronos provides a large conformance test suite (CTS) that implementations must pass successfully to be declared conformant; in the case of Vulkan 1.0, the CTS contains over 100,000 tests.
Vulkan 1.0 conformance is a major milestone in bringing Vulkan to Raspberry Pi, but it isn’t the end of the journey. Our team continues to work on all fronts to expand the Vulkan feature set, improve performance, and fix bugs. So stay tuned for future Vulkan updates!
In the latest issue of Wireframe magazine, Mark Vanstone shows you how to turn a 3D shooter into a VR game for a variety of viewers, from Google Cardboard to gaming headsets.
Browser development has really matured of late, with a number of exciting new features coming to the fore. Where WebGL was well supported, the new WebXR (previously WebVR) is now becoming standard.
To begin, we’ll start with the Three.js 3D shooter we made in Wireframe #32 – if you missed it, you can download a copy. We’ll use the same models and much of the same code. The first change, though, is to update the code to run as an ES6 module. The non-module version of Three.js is being phased out at the end of 2020, so it’s probably best to get with the times and use the new stuff. As with our earlier shooter, you’ll need to run this code from a secure web server, which, for mobile phones and gaming headsets, will mean uploading it to somewhere suitable, but if you want to see it running, you can play it at technovisual.co.uk/vr.
Now we need to consider the hardware we’re going to use to run our game. Let’s start at our baseline, Google Cardboard, and work up from there. Available from many outlets online (including Google’s store), it’s a cut-out kit, which you fold up to create a viewer.
There are two lenses to look through, two magnets in a recess on the side, and velcro tabs to hold a mobile phone. The magnets on the side serve as a selection mechanism which we’ll explore later.
Next, we have Gear VR-style viewers. There are many different types, priced from around £12 to £40, and these are essentially a better-built plastic version of the Cardboard but with a button on top to act as a selector. Phones of varying sizes can be used, and as long as the device isn’t more than about four years old, it should be up-to-date enough to run the 3D software.
For example, the six-year-old Samsung S5 is capable of displaying VR, but it’s a bit too slow to make the experience pleasant, whereas a five-year-old iPhone 6 is quite capable of displaying simple VR scenes smoothly. (With iPhones, you may need to switch on Experimental Features in the Safari settings, however.)
Gaming headsets are a bit different, since they have a built-in screen in the headset, and – in the case of the Oculus Go and Quest – an Android computer in there as well. Tethered headsets use the power of a connected computer to generate the display, and all of them use a slightly different Three.js system from the cheaper viewers to generate the 3D display.
As time goes on, it’s likely that more mobile phones will be compatible with the VR software used by the untethered gaming headsets. Gaming headsets also have sensors that track your movement as well as the tilt of the headset, providing six degrees of freedom.
This is just a taste of the comprehensive guide included in the latest issue of Wireframe magazine. If you’re not a subscriber, you can download a PDF copy for free from the Wireframe magazine website. Start at page 50 and work your way through to create your own VR shooter game.
And if you want to take advantage of Wireframe magazine’s latest subscription deal, you can get it for just £10 at the official Raspberry Pi Press online store.
Maker Jen Fox took to hackster.io to share a Raspberry Pi–powered trash classifier that tells you whether the trash in your hand is recyclable, compostable, or just straight-up garbage.
Jen reckons this project is beginner-friendly, as you don’t need any code to train the machine learning model, just a little to load it on Raspberry Pi. It’s also a pretty affordable build, costing less than $70 including a Raspberry Pi 4.
The code-free machine learning model is created using Lobe, a desktop tool that automatically trains a custom image classifier based on what objects you’ve shown it.
Basically, you upload a tonne of photos and tell Lobe what object each of them shows. Jen told the empty classification model which photos were of compostable waste, which were of recyclable items, and which were of garbage or bio-hazardous waste. Of course, as Jen says, “the more photos you have, the more accurate your model is.”
As promised, you only need a little bit of code to load the image classifier onto your Raspberry Pi. The Raspberry Pi Camera Module acts as the image classifier’s “eyes” so Raspberry Pi can find out what kind of trash you hold up for it.
The push button and LEDs are wired up to the Raspberry Pi GPIO pins, and they work together with the camera and light up according to what the image classifier “sees”.
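The glue between the classifier and the LEDs can be sketched in a few lines. Note this is an illustrative outline, not Jen’s actual code: the label names and GPIO pin numbers below are invented, so substitute the labels you trained in Lobe and the pins you actually wired.

```python
# Sketch of the classifier-to-LED glue logic. Label names and GPIO
# pin numbers are invented for illustration; use the labels you
# trained in Lobe and the pins you actually wired.

LED_PINS = {
    "compost": 17,   # green LED (hypothetical pin)
    "recycle": 27,   # blue LED (hypothetical pin)
    "garbage": 22,   # red LED (hypothetical pin)
}

def pin_for_label(label):
    """Map a predicted label to the GPIO pin of the LED to light."""
    return LED_PINS.get(label.lower().strip())

# On the Raspberry Pi you would pair this with a camera capture and
# the Lobe-exported model; with gpiozero, the loop looks roughly like:
#   led = LED(pin_for_label(prediction))
#   led.on()
```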
You’ll want to create a snazzy case so your trash classifier looks good mounted on the wall. Jen cut holes in a cardboard box to make sure that the camera could “see” out, the user can see the LEDs, and the push button is accessible. Remember to leave room for Raspberry Pi’s power supply to plug in.
Add a bit of security to your project, or make things selectable by using different cards. In the latest issue of HackSpace magazine, PJ Evans goes contactless.
NFC (near-field communication) is based on the RFID (radio-frequency identification) standard. Both allow a device to receive data from a passive token or tag (meaning it doesn’t require external power to work). RFID supports a simple ID message that shouts ‘I exist’, whereas NFC allows for both reading and writing of data.
Most people come into contact with these systems every day, whether it’s using contactless payment, or a card to unlock a hotel or office door. In this tutorial we’ll look at the Waveshare NFC HAT, an add-on for Raspberry Pi computers that allows you to interact with NFC and RFID tokens.
We start with the usual step of preparing a Raspberry Pi model for the job. Reading RFID tags is not strenuous work for our diminutive friend, so you can use pretty much any variant of the Raspberry Pi range you like, so long as it has the 40-pin GPIO. We only need Raspberry Pi OS Lite (Buster) for this tutorial; however, you can install any version you wish. Make sure you’ve configured it how you want, have a network connection, and have updated everything by running sudo apt -y update && sudo apt -y upgrade on the command line.
This NFC HAT is capable of communicating over three different interfaces: I2C, SPI, and UART. We’re going with UART as it’s the simplest to demonstrate, but you may wish to use the others. Start by running sudo raspi-config, going to ‘Interfacing options’, and selecting ‘Serial Interface’. When asked if you want to log into the console, say ‘No’. Then, when asked if you want to enable the serial interface, say ‘Yes’. You’ll need to reboot now. This will allow the HAT to talk to our Raspberry Pi over the serial interface.
As mentioned in the previous step, we have a choice of interfaces, and swapping between them means changing some physical settings on the NFC HAT itself. Do not do this while the HAT is powered up in any way. The HAT should be configured for UART/serial by default, but do check on the wiki at hsmag.cc/iHj1XA: the jumpers at I1 and I0 should both be shorting ‘L’, D16 and D20 should be shorted, and on the DIP switch everything should be off except RX and TX. Check, double-check, attach the HAT to the GPIO, and boot up.
You can download some examples directly from Waveshare. First, we need to install some dependencies. Run the following at the command line:
sudo apt install rpi.gpio p7zip-full python3-pip
pip3 install spidev pyserial
Now, download the files and unpack them:
7z x Pn532-nfc-hat-code.7z
Before you try anything out, you need to edit the example file so that we use UART (see the accompanying code listing).
Find the three lines that start with pn532 = and add a # to the top one (to comment it out). Now remove the # from the line starting pn532 = PN532_UART. Save, and exit.
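For reference, the section you’re editing looks something like this in the Waveshare examples (the constructor arguments vary between versions of the files, so treat this as illustrative rather than exact):

```python
# Leave only the UART line uncommented; keep whatever arguments
# your copy of the example file already uses.
# pn532 = PN532_SPI(debug=False, reset=20, cs=4)
# pn532 = PN532_I2C(debug=False, reset=20, req=16)
pn532 = PN532_UART(debug=False, reset=20)
```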
Finally, we get to the fun part: running the example code.
If all is well, the connection to the HAT will be announced. You can now place your RFID token over the area of the HAT marked ‘NFC’. Hexadecimal numbers will start scrolling up the screen; your token has been detected! Each RFID token has a unique number, so it can be used to uniquely identify someone. However, this HAT is capable of much more than that as it also supports NFC and can communicate with common standards like MIFARE Classic, which allows for 1kB of storage on the card. Check out example_dump_mifare.py in the same directory (but make sure you make the same edits as above to use the serial connection).
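If you want to do something with those scrolling hexadecimal numbers, a tiny helper turns the raw UID bytes into a stable string you can compare against a list of known cards. This is a generic sketch, independent of which PN532 library you’re using, and the UID value shown is invented:

```python
# Turn a UID byte string into the familiar colon-separated hex form.
def uid_to_hex(uid):
    """Return 'AA:BB:CC'-style hex for a UID byte string."""
    return ":".join(f"{b:02X}" for b in uid)

# Example: recognise known cards by UID (the value here is invented).
AUTHORISED = {"04:A2:1B:33:5C:88:90"}

def is_authorised(uid):
    return uid_to_hex(uid) in AUTHORISED
```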
You can now read unique identifiers on RFID and NFC tokens. As we just mentioned, if you’re using the MIFARE or NTAG2 standards, you can also write data back to the card. The examples folder contains some C programs that let you do just that. The ability to read and write small amounts of data onto cards can lead to some fun projects. At the Electromagnetic Field festival in 2018, an entire game was based around finding physical locations and registering your presence with a MIFARE card. Even more is possible with smartphones, where NFC can be used to exchange data in any form.
Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store, The Raspberry Pi store in Cambridge, or your local newsagents.
Each issue is free to download from the HackSpace magazine website.
The post Read RFID and NFC tokens with Raspberry Pi | HackSpace 37 appeared first on Raspberry Pi.
The trick with spy devices is to make sure they look as much like the object they’re hidden inside as possible. Where Raspberry Pi comes in is making sure the foam camera can be used as a real photo-taking camera too, to throw the baddies off the scent if they start fiddling with your spyware.
The foam-firing bit of Nathan’s invention was relatively simple to recreate – a modified chef’s squirty cream dispenser, hidden inside a camera-shaped box, gets the job done.
Ruth and Shawn designed a load of 3D-printed panels to mount on the box frame in the image above. One of those cool coffee cups that look like massive camera lenses hides the squirty cream dispenser and gives this build an authentic camera look.
The infrared LED is mounted next to the camera module and switches on when it gets dark, giving you night vision.
The Raspberry Pi computer and its power bank are crammed inside the box-shaped part, with the camera module and infrared LED mounted to peek out of custom-made holes in one of the 3D-printed panels on the front of the box frame.
The foam-firing chef’s thingy is hidden inside the big fake lens, and it’s wedged inside so that when you lift the big fake lens, the lever on the chef’s squirty thing is depressed and foam fires out of a tube near to where the camera lens and infrared LED peek out on the front panel of the build.
Baddies don’t stand a chance!
The post Defeat evil with a Raspberry Pi foam-firing spy camera appeared first on Raspberry Pi.
Did you see the coolest International Space Station (ISS) on Earth on the blog last week? ISS Mimic is powered by Raspberry Pi, mirrors exactly what the real ISS is doing in orbit, and was built by NASA engineers to make the ISS feel more real for Earth-bound STEAM enthusiasts.
The team launched ISS Mimic in celebration of 20 years of continuous human presence in space on the ISS. And they’ve been getting lots of questions since we posted about their creation, so we asked them back to fill you in with a quick Q&A.
Yes, we forced one: “Mechatronic Instantiated Model, Interactively Controlled”
“The second-most complicated International Space Station ever made”. We also like “1/100th scale for 1/100,000,000th cost”
No, it’s a volunteer project, but we do get lots of support. It’s done on our own time and money — though many NASA types and others have kicked in to help buy materials.
Yes — mostly other organisations that we have teamed up with. We partner with a non-profit makerspace near NASA, Creatorspace, for tools, materials, and outreach. And an awesome local 3D printer manufacturer, re:3D, has joined us and printed our (large) solar panels for free, and is helping to refine our models. They are also working towards making a kit of parts for sale for those who don’t have a printer or the time to print all the pieces, with a discount for educators.
Particularly helpful has been Space Center Houston (NASA’s visitor center), who invited us to present to the public and at an educator conference (pre-COVID), and allowed us to spend a full day filming in their beautiful facility. Our earliest supporter was Boeing, who we’ve worked with to facilitate outreach to educators and students from the start.
5 years — a looong time. We spent much effort early on to establish the scale and feasibility and test the capabilities of 3D printing. We maintained a hard push to keep the materials cost down and reduce build time/complexity for busy educators. We always knew we’d use Raspberry Pi for the brain, but were looking for less costly options for the mechatronics. We’d still like to cut the cost down a lot to make the project more attainable for lower-income schools and individuals.
All of the support has allowed us to take our prototype to schools and STEM events locally. But we really want this to be built around the world to reach those who don’t have much connection to space exploration and hands-on STEM. The big build is probably most suitable for teens and adults, while the alternative builds (in-work) would be much more approachable for younger students.
No, not at all. Our focus is to make it viable for schools/educators — in cost and build complexity — but we want any space nerd to be able to build their own and help drive the design.
Gravity. And time to work on the project… and trying to keep the cost down.
It’s on our radar! Another build that’s screaming to be made is hacking the LEGO ISS model (released this year) to rotate its joints and light LEDs.
There are two Raspberry Pi computers aboard the real ISS right now! And even better, young people have the chance to write Python code that will run on them — IN SPACE — as part of the European Astro Pi Challenge.
Tell the young space enthusiast in your life about Astro Pi to inspire them to try coding! All the info lives at astro-pi.org.
The post Q&A with NASA engineers behind Raspberry Pi–powered ISS Mimic appeared first on Raspberry Pi.
The most wonderful time of the year is approaching! “Most wonderful” meaning the time when you have to figure out what gift best expresses your level of affection for various individuals in your life. We’re here to take away some of that stress for you — provided your favourite individuals like Raspberry Pi, of course. Otherwise you’re on your own. Sorry.
We’ve got ideas for the gamers in your life, what to get for the Raspberry Pi “superfan” who has everything, and options that allow you to keep giving all year round.
If keeping up with the Joneses is your thing, why not treat your nearest Raspberry Pi fan to one of our newest products…
This year, we released Raspberry Pi 400: a complete personal computer, built into a compact keyboard, costing just $70. Our community went wild about the possibilities that Raspberry Pi 400 opens up for home learners and for those who don’t have expensive tech options at their fingertips.
You just plug in a mouse, a monitor (any semi-modern TV screen should work), and go. The Raspberry Pi 400 Personal Computer kit costs $100 and comes with a few extras to help get you started. Or you can buy the Raspberry Pi 400 unit on its own.
Depending on where you are in the world, you may need to pre-order or join a waiting list, as Raspberry Pi 400 is in such high demand. But you could give a homemade ‘IOU’ voucher letting the recipient know that they will soon get their hands on one of our newest and most popular bits of kit.
We publish some cool books around these parts. Laura Sach and Martin O’Hanlon, who are both Learning Managers at the Raspberry Pi Foundation, have written the very newest one, which is designed to help you to get more out of your Python projects.
In Create Graphical User Interfaces with Python, you’ll find ten fun Python projects to create, including a painting program, an emoji match game, and a stop-motion animation creator. All for just £10.
So, if you’ve a keen coder in your midst, this book is the best choice to stretch their skills and keep them entertained throughout 2021. Buy it online from the official Raspberry Pi Press store.
The Pi Hut’s Raspberry Pi 4 Retro Gaming Kit costs £88 and includes everything you need to create your very own retro gaming console. All your lucky kit recipient has to find is a screen to plug into, and a keyboard to set up their new Raspberry Pi, which comes as part of the kit along with a case for it. The Pi Hut has also thrown in a 16GB microSD card, plus a reader for it, as well as our official micro HDMI cable. Job done.
How cool does Picade look?! It’s sold by Pimoroni and you can buy an 8″ display set for £165, or a 10″ display version for £225. Show me a self-respecting gamer who doesn’t want a desktop retro arcade machine in their own home.
Picade is a Raspberry Pi–powered mini arcade that you build yourself. All you’ll need to add is your own Raspberry Pi, a power supply, and a micro SD card.
And if the gamer on your gift list prefers to create their own retro video games, send them a copy of Code the Classics, Volume 1. It’s a stunning-looking hardback book packed with 224 pages telling the stories of some of the seminal video games of the 1970s and 1980s, and showing you how to create your own. Putting hours of projects in the hands of your favourite gamer will only set you back £12. Buy it online from the official Raspberry Pi Press store.
For just $10 apiece, you can drop a couple of Raspberry Pi Zero W boards into any tinkerer’s stocking and they’ll be set for their next few projects. They will LOVE you for allowing them to try a new, risky build without having to tear down something else they created to retrieve an old Raspberry Pi.
What to get the superfan who already has a desk full of Raspberry Pi? An official Babbage Bear to oversee the proceedings! Babbage only costs £9 and will arrive wearing their own Raspberry Pi–branded T-shirt. A special Raspberry Pi Towers inhabitant made our Babbage this Christmassy outfit before we photographed them.
If you’ve a superfan on your gift list, then it’s likely they already own a t-shirt with the Raspberry Pi logo on it — so why not get them one of these new designs?
Both costing just £12, the black Raspberry Pi “Pi 4” t-shirt was released to celebrate the launch of Raspberry Pi 4 and features an illustration of the powerful $35 computer. The white Raspberry Pi “Make Cool Stuff” option was created by Raspberry Pi’s own illustrator/animator extraordinaire Sam Alder. Drop that inside fact on the gift tag for extra superfan points.
And if they’re the kind of superfan who would like to make their own Raspberry Pi-–themed clothing, gift them with our Wearable Tech Projects book. This 164-page book gathers up the best bits of wearable technology from HackSpace magazine, with tutorials such as adding lights to your favourite cosplay helmet, and creating a glowing LED skirt. It’s on sale for just £7 and you can buy it online from the official Raspberry Pi Press store.
What if you could give the joy of opening a Raspberry Pi–themed gift every single month for a whole year? Our magazine subscriptions let you do just that, AND they come with a few extra gifts when you sign up.
The official Raspberry Pi magazine comes with a free Raspberry Pi Zero kit worth £20 when you sign up for a 12-month subscription. The magazine is packed with computing and electronics tutorials, how-to guides, and the latest news and reviews.
Check out subscription deals on the official Raspberry Pi Press store.
HackSpace magazine is packed with projects for fixers and tinkerers of all abilities. A 12-month subscription comes with a free Adafruit Circuit Playground Express, which has been specially developed to teach programming novices from scratch and is worth £25.
Check out subscription deals on the official Raspberry Pi Press store.
Wireframe magazine lifts the lid on video games. In every issue, you’ll find out how games are made, who makes them, and how you can make your own using detailed guides. The latest deal gets you three issues for just £10, plus your choice of one of our official books as a gift.
Check out more subscription deals on the official Raspberry Pi Press store.
Custom PC is the magazine for people who are passionate about PC technology and hardware. You can subscribe to receive three issues for just £10, and you’ll also receive a book as a gift.
Check out subscription offers on the official Raspberry Pi Press store.
That’s all folks. Have a holly jolly one. Drop a question in the comments box below if you’re after something Raspberry Pi–themed which isn’t mentioned here. I’m half elf and should be able to help.
Whenever you learn a new subject or skill, at some point you need to pick up the particular language that goes with that domain. And the only way to really feel comfortable with this language is to practice using it. It’s exactly the same when learning programming.
In our latest research seminar, we focused on how we educators and our students can talk about programming. The seminar presentation was given by our Chief Learning Officer, Dr Sue Sentance. She shared the work she and her collaborators have done to develop a research-based approach to teaching programming called PRIMM, and to work with teachers to investigate the effects of PRIMM on students.
As well as providing a structure for programming lessons, Sue’s research on PRIMM helps us think about ways in which learners can investigate programs, start to understand how they work, and then gradually develop the language to talk about them themselves.
Sue began by taking us through the rich history of educational research into language and dialogue. This work has been heavily developed in science and mathematics education, as well as language and literacy.
In particular the work of Neil Mercer and colleagues has shown that students need guidance to develop and practice using language to reason, and that developing high-quality language improves understanding. The role of the teacher in this language development is vital.
Sue’s work draws on these insights to consider how language can be used to develop understanding in programming.
Sue identified shortcomings of some teaching approaches that are common in the computing classroom but may not be suitable for all beginners.
PRIMM was designed by Sue and her collaborators as a language-first approach where students begin not by writing code, but by reading it.
PRIMM stands for ‘Predict, Run, Investigate, Modify, Make’. In this approach, rather than copying code or writing programs from scratch, beginners instead start by focussing on reading working code.
In the Predict stage, the teacher provides learners with example code to read, discuss, and make output predictions about. Next, they run the code to see how the output compares to what they predicted. In the Investigate stage, the teacher sets activities for the learners to trace, annotate, explain, and talk about the code line by line, in order to help them understand what it does in detail.
In the seminar, Sue took us through a mini example of the stages of PRIMM where we predicted the output of Python Turtle code. You can follow along on the recording of the seminar to get the experience of what it feels like to work through this approach.
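To give a flavour of the Predict stage (this example is ours, not one from the seminar): learners read a short program, discuss it, and commit to a prediction before running it.

```python
# Predict: what does this program print? Decide before you run it.
words = ["red", "green", "blue"]
message = ""
for w in words:
    message = w[0] + message
print(message)
```

Many learners predict rgb; running it prints bgr, because each new initial is added to the front of the string, and that surprise is exactly the sort of thing the Investigate stage then unpicks line by line.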
The PRIMM approach is informed by research, and it is also the subject of research by Sue and her collaborators. They’ve conducted two studies to measure the effectiveness of PRIMM: an initial pilot, and a larger mixed-methods study with 13 teachers and 493 students with a control group.
The larger study used a pre and post test, and found that the group who experienced a PRIMM approach performed better on the tests than the control group. The researchers also collected a wealth of qualitative feedback from teachers. The feedback suggested that the approach can help students to develop a language to express their understanding of programming, and that there was much more productive peer conversation in the PRIMM lessons (sometimes this meant less talk, but at a more advanced level).
The PRIMM structure also gave some teachers a greater capacity to talk about the process of teaching programming. It facilitated the discussion of teaching ideas and learning approaches for the teachers, as well as developing language approaches that students used to learn programming concepts.
The research results suggest that learners taught using PRIMM appear to be developing the language skills to talk coherently about their programming. The effectiveness of PRIMM is also evidenced by the number of teachers who have taken up the approach, building in their own activities and in some cases remixing the PRIMM terminology to develop their own take on a language-first approach to teaching programming.
Future research will investigate in detail how PRIMM encourages productive talk in the classroom, and will link the approach to other work on semantic waves. (For more on semantic waves in computing education, see this seminar by Jane Waite and this symposium talk by Paul Curzon.)
If you would like to try out PRIMM with your learners, use our free support materials:
If you missed the seminar, you can find the presentation slides alongside the recording of Sue’s talk on our seminars page.
Our next seminar takes place on Tuesday 1 December at 17:00–18:30 GMT / 12:00–13:30 ET / 9:00–10:30 PT / 18:00–19:30 CET, when Dr David Weintrop from the University of Maryland will present on the role of block-based programming in computer science education. To join, simply sign up with your name and email address.
Once you’ve signed up, we’ll email you the seminar meeting link and instructions for joining. If you attended this past seminar, the link remains the same.
A sci-fi writer wanted to add some realism to his fiction. The result: a Raspberry Pi-based Martian timepiece. Rosie Hattersley clocks in from the latest issue of The MagPi Magazine.
Ever since he first clapped eyes on Mars through the eyepiece of a telescope, Philip Ide has been obsessed with the Red Planet. He’s written several books based there and, many moons ago, set up a webpage showing the weather on Mars. This summer, Phil adapted his weather monitor and created a Raspberry Pi-powered Mars Clock.
After writing several clocks for his Mars Weather page, Phil wanted to make a physical clock: “something that could sit on my desk or such like, and tell the time on Mars.” It was to tell the time at any location on Mars, with presets for interesting locations “plus the sites of all the missions that made it to the surface – whether they pancaked or not.”
Another prerequisite was that the clock had to check for new mission file updates and IERS bulletins to see if a new leap second had been factored into Universal Coordinated Time.
“Martian seconds are longer,” explains Phil, “so everything was pointing at software rather than a mechanical device. Raspberry Pi was a shoo-in for the job.” However, he’d never used one.
“I’d written some software for calculating orbits and one of the target platforms was Raspberry Pi. I’d never actually seen it run on a Raspberry Pi but I knew it worked, so the door was already open.” He was able to check his data against a benchmark NASA provided. Knowing that the clocks on his Mars Weather page were accurate meant that Phil could focus on getting to grips with his new single-board computer.
He chose a 2GB Raspberry Pi 4 and the official 7-inch touchscreen with a SmartiPi Touch 2 case. “Angles are everything,” he reasons. He also added a fan to lower the CPU temperature and extend the hardware’s life. Along with a power lead, the whole setup cost £130 from The Pi Hut.
Since his Mars Clock generates a lot of data, he made it skinnable so the user can choose which pieces of information to view at any one time. It can display two types of map – Viking or MOLA – depending on the co-ordinates for the clock. NASA provides a web map-tile service with many different data sets for Mars, so it should be possible to make the background an interactive map, allowing you to zoom in/out and scroll around. Getting these to work proved rather a headache as he hit incompatibilities with the libraries.
Phil wrote most of the software himself, with the exception of libraries for the keyboard and FTP which he pulled from GitHub. Here’s all the code.
His decades as a computer programmer meant other aspects were straightforward. The hardware is more than capable, he says of his first ever experience of Raspberry Pi, and the SmartiPi case makers had done a brilliant job. Everything fit together and in just a few minutes his Raspberry Pi was working.
Since completing his Mars Clock Phil has added a Pi-hole and a NAS to his Raspberry Pi setup and says his confidence using them is such that he’s now contemplating challenging himself to build an orrery (a mechanical model of the solar system). “I have decades of programming experience, but I was still learning new things as the project progressed,” he says. “The nerd factor of any given object increases exponentially if you make it yourself.”
Check out page 26 in the latest issue of The MagPi Magazine for a step-by-step and to learn more about the maker, Philip. You can read a PDF copy for free on The MagPi Magazine website if you’re not already a subscriber.
A group of us NASA engineers work on the International Space Station (ISS) for our day-jobs but craved something more tangible than computer models and data curves to share with the world. So, in our free time, we built ISS Mimic. It’s still in the works, but we are publishing now to celebrate 20 years of continuous human presence in space on the ISS.
This video was filmed and produced by our friend, new teammate, and Raspberry Pi regular Estefannie of Estefannie Explains it All. Most of the images in this blog are screen grabbed from her wonderful video too.
ISS Mimic is a 1% scale model of the International Space Station, bringing the American football field-sized beauty down to a tabletop-sized build. Most elements in the final version of the build which you see in the video are 3D printed — recently even the solar arrays. It has 12 motors: 10 to control the solar panels and two to turn the thermal radiators. All of these are fed by live data streaming from the ISS, so what you see on ISS Mimic is what’s happening that very moment on the real deal up in space.
Despite the global ISS effort, most people seem to feel disconnected from space exploration and all the STEAM goodness within. Beyond headlines and rocket launches, even space enthusiasts may feel out of touch. Most of what is available is via apps and videos, which are great, but miss the physical aspect.
ISS Mimic is intended to provide an earthbound, tangible connection to that so-close-but-so-far-away orbiting science platform. We want space excitement to fuel STEAM interest.
Users toggle through various touchscreen data displays of things like battery charge states, electrical power generated, joint angles, communication dish status, gyroscope torques, and even airlock air pressure — fun to watch prior to a spacewalk!
The user can also touchscreen-activate the physical model, in which case Raspberry Pi sends the telemetry along to Arduinos, which in turn command motors in the model to do their thing, rotating the solar panels and thermal radiators to the proper angle. The solar panel joints use compact geared DC motors with Hall-effect sensors for feedback. The sensor signals are sent back down to the Arduino, which keeps track of the position of each joint compared to ISS telemetry, and updates the motor command accordingly to stay in sync.
The thermal radiator motors are simpler. Since they only rotate about 180° total, a simple RC micro servo is used, with the desired position passed to it by an Arduino straight from the Raspberry Pi data stream.
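The closed-loop sync described above fits in a few lines. Here is an illustrative Python sketch of that idea — this is not the project’s actual firmware, and the counts-per-revolution, gain, and deadband values are all assumptions:

```python
def motor_command(target_deg, encoder_count,
                  counts_per_rev=2797, deadband_deg=0.5,
                  gain=8.0, max_pwm=255):
    """Compare a telemetry joint angle with the Hall-sensor position
    and return a signed PWM command that drives the error to zero."""
    current_deg = (encoder_count / counts_per_rev) * 360.0
    # Take the shortest signed path around the circle (-180..180 degrees)
    error = (target_deg - current_deg + 180.0) % 360.0 - 180.0
    if abs(error) < deadband_deg:
        return 0  # within tolerance: stop the motor
    # Proportional control, clamped to the PWM range
    return int(max(-max_pwm, min(max_pwm, gain * error)))
```

Each control tick, the Arduino would call something like this with the latest telemetry angle and encoder count, then write the result to the motor driver.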
When MIMIC is in ‘live mode’, the motor commands are the exact data stream coming from ISS. This is a fun mode to leave it in for long durations when it’s in the corner of the room. But it changes slowly, so we also include advanced playback, where prior orbit data stored on Raspberry Pi is played back at 60× speed. A regular 90-minute orbit profile can be played back in 90 seconds.
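The speed-up is just time compression of the stored samples. A minimal sketch, assuming the telemetry is logged with seconds-since-start timestamps (the function name is ours, not the project’s):

```python
def playback_delays(timestamps, speedup=60):
    """Return the real-time delay to wait between consecutive
    stored samples when replaying at `speedup` times real time."""
    return [(b - a) / speedup for a, b in zip(timestamps, timestamps[1:])]
```

At 60×, a 90-minute orbit (5,400 seconds of samples) plays back in 90 seconds of wall-clock time.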
We also have ‘disco mode’, which may have been birthed during lack of sleep, but now we plan to utilise it whenever we want to grab attention — such as to alert users that the ISS is flying overhead.
We may have a mild LED addiction, and we have LEDs embedded where the ISS batteries would live at the base of the solar arrays. They change colour with the charge voltage, so we can tell by watching them when the ISS is going into Earth’s shadow, or when the batteries are fully charged, etc.
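A voltage-to-colour mapping like the one driving those LEDs can be sketched as a simple linear blend from red (discharged) to green (charged). The voltage thresholds below are placeholder assumptions, not the real ISS battery telemetry ranges:

```python
def battery_led_colour(voltage, v_empty=71.0, v_full=82.0):
    """Map a battery voltage to an (r, g, b) colour,
    red at v_empty fading to green at v_full."""
    t = (voltage - v_empty) / (v_full - v_empty)
    t = max(0.0, min(1.0, t))  # clamp to the 0..1 range
    return (int(255 * (1 - t)), int(255 * t), 0)
```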
A few times when we were working on the model and the LEDs suddenly changed, we thought we had bumped something. But it turned out the first array was edging behind Earth. These are fun to watch during spacewalks, and the model gives us advance notice that the crew is about to be in darkness.
We plan to cram more LEDs in to react to other data. The project is open source, so anyone can build one and improve the design — help wanted! After all, the ISS itself is a worldwide collaboration with 19 countries participating by providing components and crew.
The solar panels on the ISS are mounted on what’s known as the ‘outboard truss’ — one each on the Port and Starboard ends of ISS. Everything on the outboard truss rotates together as part of the sun-tracking (in addition to each solar array rotating individually). So, you can’t just run the power/signal wires through the interface or they would twist and break. ISS Mimic has the same issue.
Even though our solar panels don’t generate power, their motors still require power and signals. The ISS has a specialised, unique build; but fortunately we were able to solve our problem with a simple slip ring design sourced from Amazon.
Wire management turned out to be a big issue for us. We had bird nests in several places early on (still present on the Port side solar), so we created some custom PCBs just for wire management, to keep the chaos down. We incorporated HDMI connectors and cables in some places to provide nice shielding and conveniently sized coupling — actually a bit more compact than the Ethernet we’d used before.
Also, those solar panels are huge, and the mechanism that supports the outboard truss (everything on the sides that rotate together) on the ISS includes a massive 10-foot-diameter bull gear called the Solar Alpha Rotary Joint. A pinion gear from a motor interfaces with this gear to turn it as needed.
We were pleasantly surprised that our 3D-printed bull gear held up quite well with a similar pinion-driven design. Overall, our 3D prints have survived better than expected. We are revamping most models to include more detail, and we could certainly use help here.
Our sights are set firmly on educators as our primary area of focus, and we’ve been excited to partner with Space Center Houston to speak at public events and a space exploration educator conference with international attendance earlier this year.
The feedback has been encouraging and enlightening. We want to keep getting feedback from educators, so please provide more insights by either commenting on this blog or via the contact info listed at the bottom.
A highlight for the team was when the ISS Mimic prototype was requested to live for a month in NASA’s Mission Control Center and was synced to live data during an historic spacewalk. Mimic experienced an ‘anomaly’ when a loose wire caused one of the solar panel motors to spin at 100× the normal rate.
You’ll be happy to know that none of the engineering professionals were fooled into thinking the real ISS was doing time-trials. Did I mention it’s still a work in progress? You can’t be scared of failure (for non-critical applications!), particularly when developing something brand-new. It’s part of shaking out problems and learning.
It’s an exciting time in human and robotic spaceflight, with lots of budding projects and new organisations joining the effort. This feels like a great time to deepen our connection to this great progress, and we hope ISS Mimic can help us to do that, as well as encourage more students to play in coding, mechatronics, and STEAM.
One of the best parts of this project has been teaming up with organisations to share the love. We partner with a non-profit makerspace near NASA called Creatorspace, for tools, materials, and outreach. And an awesome local 3D printer manufacturer, re:3D, has joined us to print some of our larger components for free and is helping to refine our models. Space Center Houston (NASA’s visitor centre) invited us to present to public and educator events, and generously allowed us to spend a full day filming in their beautiful facility. Our earliest supporter was Boeing, who we’ve worked with to facilitate outreach to educators and students from the start. And of course we are thankful to NASA for providing the public data stream that makes the project possible.
Did you know that there are Raspberry Pi computers aboard the real ISS that young people can run their own Python programs on? Find out more at astro-pi.org.
The post ISS Mimic: A Raspberry Pi-powered International Space Station model that syncs with the real thing appeared first on Raspberry Pi.
Animator/engineer Ashok Fair has put witch-level finger pointing powers in your hands by sticking a SmartEdge Agile, wirelessly controlled by Raspberry Pi Zero, to a golf glove. You could have really freaked the bejeezus out of Halloween party guests with this (if we were allowed to have Halloween parties that is).
The build uses a SmartEdge Agile IoT device with Brainium, a cloud-based tool for performing machine learning tasks.
The Rapid IoT kit is interfaced with Raspberry Pi Zero and creates a thread network connecting to light, car, and fan controller nodes.
The Brainium app is installed on Raspberry Pi and bridges between the cloud and the SmartEdge device. An MQTT client running in Python processes the Rapid IoT Kit’s data.
The device is mounted onto a golf glove, giving the wearer seemingly magical powers with the wave of a hand.
To get started, the glove wearer draws a pattern above the screen attached to the Raspberry Pi to unlock it and wake up all the controller nodes.
The light controller node is turned on by drawing a clockwise circle, and turned off with a counter-clockwise circle.
The fan is turned on and off in the same way, and you can increase the fan’s speed by moving your hand upwards and reduce the speed by moving your hand down. You know it’s working by the look of the fan’s LEDs: they blink faster as the fan speeds up.
Make a pushing motion in the air above the car to make it move forward, and you can also make it turn and reverse.
If you wear the glove while driving, it collects data in real time and logs it on the Brainium cloud so you can review your driving style.
When we think back to our school days, we can all recall that one teacher who inspired us, believed in us, and made all the difference to how we approached a particular subject. It was someone we maybe took for granted at the time and so we only realised (much) later how amazing they were.
I hope this post makes you think of a teacher or mentor who has made a key difference in your life!
Here computer science student Jonathan Alderson and our team’s Ben Garside talk to me about how Ben supported and inspired Jonathan in his computer science classroom.
Jonathan: My first memories of using a computer were playing 3D Pinball, Club Penguin, and old Disney games, so nothing productive there…or so I thought! I was always good at IT and Maths at school, and Computing seemed to be a cross between the two, so I thought it would be good.
Jonathan: I met Mr Garside at the start of sixth form. Our school didn’t have a computer science course, so a few of us would walk between schools twice a week. Mr Garside really made me feel welcome in a place where I didn’t know anyone.
When learning computer science, it’s difficult to understand the importance of new concepts like recursion, classes, or linked lists when the examples are so small. Mr Garside’s teaching made me see the relevance of them and how they could fit into other projects; it’s easy to go a long time without using concepts because you don’t necessarily need them, even when it would make your life a lot easier.
Ben: It was a real pleasure to teach Jonathan. He stands out as being one of the most inquisitive students that I have taught. If something wasn’t clear to him, he’d certainly let me know and ask relevant questions so that he could fully understand. Jonathan was also constantly working on his own programming projects outside of lessons. During his A level, I remember him taking it upon himself to write a program that played chess. Each week he would demonstrate the progress he had made to the class. It was a perfect example of decomposition as he tackled the project in small sections and had a clear plan as to what he wanted to achieve. By the end of his project, not only did he have a program that played chess, but it was capable of playing against real online users including making the mouse clicks on the screen!
Moving from procedural to object-oriented programming (OOP) can be a sticking point for a lot of learners, and I remember Jonathan finding this difficult at first. I think what helped Jonathan in particular was getting him to understand that this wasn’t as new a concept as he first thought. OOP was just a different paradigm where he could still apply all of the coding structures that he was already confident in using.
That sounds like a very cool project. What other projects did you make, Jonathan? And how did Ben help you?
Jonathan: My final-year project, [a video game] called Vector Venture, ended up becoming quite a mammoth task! I didn’t really have a clue about organising large projects, what an IDE was, or that you could split files apart. Mr Garside helped me spend enough time on the final report and get things finished. He was very supportive of me releasing the game and got me a chance to speak at the Python North East group, which was a great opportunity.
Ben: Vector Venture was a very ambitious project that Jonathan undertook, but I think by then he had learned a lot about how to tackle a project of that size from previous projects such as the chess program. The key to his success was that whilst he was learning, he was picking projects to undertake that he had a genuine interest in and enjoyed developing. I would also tell my A level students to pick as a project something that they will enjoy developing. Jonathan clearly enjoyed developing games, but I also had students who picked projects to develop programs that would solve problems. For example, one of my students developed a system that would take online bookings for food orders and manage table allocation for a local restaurant.
Jonathan: I have just completed my undergraduate degree at the University of Leeds (UoL) with a place on the Dean’s List and am staying to complete a Masters in High Performance Graphics.
During my time at UoL, I’ve had three summer placements creating medical applications and new systems for the university. This helped me understand the social benefits of computer science; it was great to work on something that is now benefitting so many people. My dissertation was on music visualisation, mapping instrument attributes of a currently playing song to control parameters inside shaders on the GPU to produce reactive visualisations. I’ve just completed an OpenGL project to create procedural underwater scenes, with realistic lighting, reflections, and fish simulations. I’m now really looking forward to completing my Game Engine project for my masters and graduating.
Ben: There are lots of things that excite me about teaching computer science. Before I worked for the Raspberry Pi Foundation, there was a phrase I heard Carrie Anne Philbin say when I attended a Picademy: we are teaching young people to be digital makers, logical thinkers, and problem solvers, not just to be consumers of technology. I felt this really summed up how great it is to teach our subject. Teaching computer science means that we’re educating young people about the world around them and how technology plays its part in their lives. By doing this, we are empowering them to solve problems and to make educated choices about how they use technology.
As for my previous in-school experiences, I loved those lightbulb moments when something suddenly made sense to a student and a loud “Yesssss!” would break the silence of a quietly focused classroom. I loved teaching something that regularly sparked their imaginations; give them a single lesson on programming, and they would start to ask questions like: “Now I’ve made it do that…does this mean I could make it do this next?“. It wasn’t uncommon for students to want to do more outside of the classroom that wasn’t a homework activity. That, for me, was the ultimate win!
Who was the teacher who helped shape your future when you were at school? Tell us about them in the comments below.
Design Engineering student Ben Cobley has created a Raspberry Pi–powered sous-chef that automates the easier pan-cooking tasks so the head chef can focus on culinary creativity.
Ben named his invention OnionBot, as the idea came to him when looking for an automated way to perfectly soften onions in a pan while he got on with the rest of his dish. I have yet to manage to retrieve onions from the pan before they blacken so… *need*.
Ben’s affordable solution is much better suited to home cooking than the big, expensive robotic arms used in industry. Using our tiny computer also allowed Ben to create something that fits on a kitchen counter.
A thermal sensor array suspended above the stove detects the pan temperature, and the Raspberry Pi Camera Module helps track the cooking progress. A servo motor controls the dial on the induction stove.
No machine learning expertise was required to train an image classifier, running on Raspberry Pi, for Ben’s robotic creation; you’ll see in the video that the classifier is a really simple drag-and-drop affair.
Ben has only taught his sous-chef one pasta dish so far, and we admire his dedication to carbs.
Ben built a control panel for labelling training images in real time and added labels at key recipe milestones while he cooked under the camera’s eye. This process required 500–1000 images per milestone, so Ben made a LOT of pasta while training his robotic sous-chef’s image classifier.
Ben open-sourced this project so you can collaborate to suggest improvements or teach your own robot sous-chef some more dishes. Here’s OnionBot on GitHub.
He also rates this Auto ML system used in the project as a “great tool for makers.”
The post Hire Raspberry Pi as a robot sous-chef in your kitchen appeared first on Raspberry Pi.
We’re pleased to share that Dr Sue Sentance, our Chief Learning Officer, is receiving a Suffrage Science award for Mathematics and Computing today.
The Suffrage Science award scheme celebrates women in science. Sue is being recognised for her achievements in computer science and computing education research, and for her work promoting computing to the next generation.
Sue is an experienced teacher and teacher educator with an academic background in artificial intelligence, computer science, and education. She has made a substantial contribution to research in computing education in school over the last ten years, publishing widely on the teaching of programming, teacher professional development, physical computing, and curriculum change. In 2017 Sue received the BERA Public Engagement and Impact Award for her services to computing education. Part of Sue’s role at the Raspberry Pi Foundation is leading our Gender Balance in Computing research programme, which investigates ways to increase the number of girls and young women taking up computing at school level.
As Dr Hannah Dee, the previous award recipient who nominated Sue, says: “[…] The work she does is important — researchers need to look at what happens in schools, particularly when we consider gender. Girls are put off computing long before they get to universities, and an understanding of how children learn about computing and the ways in which we can support girls in tech is going to be vital to reverse this trend.”
Sue says, “I’m delighted and honoured that Hannah nominated me for this award, and to share this honour with other women also dedicated to furthering the fields of mathematics, computing, life sciences, and engineering. It’s been great to see research around computing in school start to gather pace (and also rigour) around the world over the last few years, and to play a part in that. There is still so much to do — many countries have now introduced computing or computer science into their school curricula as a mandatory subject, and we need to understand better how to make the subject fully accessible to all, and to inspire and motivate the next generation.”
Aside from her role in the Gender Balance in Computing research programme, Sue has led our work as part of the consortium behind the National Centre for Computing Education and is now our senior adviser on computing subject knowledge, pedagogy, and the Foundation’s computing education research projects. Sue also leads the programme of our ongoing computing education research seminar series, where academics and educators from all over the world come together online to hear about and discuss some of the latest work in the field.
We are currently inviting primary and secondary schools in England to take part in the Gender Balance in Computing project.
Congratulations from all your colleagues at the Foundation, Sue!
The post Sue Sentance recognised with Suffrage Science award appeared first on Raspberry Pi.
Fire artillery shells to blow up the enemy with Mark Vanstone’s take on a classic two-player artillery game
Artillery Duel was an early example of the genre, and appeared on such systems as the Bally Astrocade and Commodore 64 (pictured).
To pick just one artillery game is difficult since it’s a genre in its own right. Artillery simulations and games have been around for almost as long as computers, and most commonly see two players take turns to adjust the trajectory of their tank’s turret and fire a projectile at their opponent. The earliest versions for microcomputers appeared in the mid-seventies, and the genre continued to develop; increasingly complex scenarios appeared involving historical settings or, as we saw from the mid-90s on, even offbeat ideas like battles between factions of worms.
To code the basics of an artillery game, we’ll need two tanks with turrets, a landscape, and some code to work out who shot what, in which direction, and where said shot landed. Let’s start with the landscape. If we create a landscape in two parts – a backdrop and foreground – we can make the foreground destructible so that when a missile explodes it damages part of the landscape. This is a common effect used in artillery games, and sometimes makes the gameplay more complicated as the battle progresses. In our example, we have a grass foreground overlaid on a mountain scene. We then need a cannon for each player. In this case, we’ve used a two-part image, one for the base and one for the turret, which means the latter can be rotated using the up and down keys.
Our homage to the artillery game genre. Fire away at your opponent, and hope they don’t hit back first.
When the SPACE bar is pressed, we call the firing sequence, which places the projectile at the same position as the gun firing it. We then move the missile through the air, reducing the speed as it goes and allowing the effects of gravity to pull it towards the ground.
We can work out whether the bullet has hit anything with two checks. The first is to do a pixel check with the foreground. If this comes back as not transparent, then it has hit the ground, and we can start an explosion. To create a hole in the foreground, we can write transparent pixels randomly around the point of contact and then set off an explosion animation. If we test for a collision with a gun, we may find that the bullet has hit the other player; after blowing up their tank, the game ends. If the impact only hit the landscape, though, we can switch control over to the other player and let them have a go.
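Here is a rough, library-free Python sketch of the flight and ground-collision logic described above — the class, constants, and function names are illustrative, not the listing printed in the magazine:

```python
import math

GRAVITY = 0.05   # downward acceleration per frame (assumed value)
DRAG = 0.999     # per-frame speed reduction (assumed value)

class Shell:
    def __init__(self, x, y, angle_deg, speed):
        self.x, self.y = x, y
        self.vx = math.cos(math.radians(angle_deg)) * speed
        # Screen y grows downward, so firing upwards means negative vy
        self.vy = -math.sin(math.radians(angle_deg)) * speed

    def step(self):
        """Advance one frame: apply drag, then gravity, then move."""
        self.vx *= DRAG
        self.vy = self.vy * DRAG + GRAVITY
        self.x += self.vx
        self.y += self.vy

def hit_ground(shell, foreground_alpha_at):
    """foreground_alpha_at(x, y) should return 0 for transparent
    pixels and a positive alpha where there is solid ground."""
    return foreground_alpha_at(int(shell.x), int(shell.y)) > 0
```

In the real game the alpha test would sample the foreground surface’s pixels each frame, and a positive result would trigger the crater-carving and explosion animation.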
So that’s your basic artillery game. But rest assured there are plenty of things to add – for example, wind direction, power of the shot, variable damage depending on proximity, or making the tanks fall into holes left by the explosions. You could even change the guns into little wiggly creatures and make your own homage to Worms.
You can read more features like this one in Wireframe issue 44, available directly from Raspberry Pi Press — we deliver worldwide.
And if you’d like a handy digital version of the magazine, you can also download issue 44 for free in PDF format.
Wireframe #44, bringing the past and future of Worms to the fore.
The post Code your own Artillery-style tank game | Wireframe #44 appeared first on Raspberry Pi.
We’re proud to show our support for This is Engineering Day, an annual campaign from the Royal Academy of Engineering to bring engineering to life for young people by showcasing its variety and creativity. This year’s #BeTheDifference theme focuses on the positive impact engineering can have on everyday life and on the world we live in. So what better way for us to celebrate than to highlight our community’s young digital makers — future engineers — and their projects created for social good!
We’re also delighted to have special guest Dr Lucy Rogers on our This is Engineering-themed Digital Making at Home live stream today at 5.30pm GMT, where she will share insights into her work as a creative inventor.
In July, we were lucky enough to have Dr Hayaatun Sillem, CEO of the Royal Academy of Engineering (RAEng), as a judge for Coolest Projects, our technology fair for young creators. Dr Hayaatun Sillem says, “Engineering is a fantastic career if you want to make a difference, improve people’s lives, and shape the future.”
In total, the young people taking part in Coolest Projects 2020 online presented 560 projects, of which over 300 projects were made specifically for social good. Here’s a small sample from some future engineers across the world:
“Our project is a virtual big eye doorman that detects if you wear a mask […] we chose this project because we like artificial intelligence and robotics and we wanted to help against the coronavirus.”
“I want people to put trash in the correct place so I made this AI trash can. This AI trash can separates the trash. I used ML2 Scratch. I used a camera to help the computer learn what type of trash it is.”
“As we know, burglary cases are very frequent and it is upsetting for the families whose houses are burglarised and [can] make them feel fearful, sad and helpless. Therefore, I tried to build a system which will help everyone to secure their houses.”
Professor Lucy Rogers PhD is an inventor with a sense of fun! She is a Fellow of the RAEng, and RAEng Visiting Professor of Engineering: Creativity and Communication at Brunel University, London. She’s also a Fellow of the Institution of Mechanical Engineers. Adept at bringing ideas to life, from robot dinosaurs to mini mannequins — and even a fartometer for IBM! — she has developed her creativity and communication skills and shares her tricks and tools with others.
Here Dr Lucy Rogers shares her advice for young people who want to get involved in engineering:
A goal or a useful problem will help you get over the steep learning curve that is inevitable in learning about new pieces of technology. Your goal does not have to be big: my first Internet of Things project was making an LED shine when the International Space Station was overhead.
To me “engineering” is really “problem-solving”. Find problems to solve. You may have to make something, program something, or do something. How can you make your own world a little better?
Learn how to fail safely: break projects into smaller pieces, and try each piece. If it doesn’t work, you can try again. It’s only at the end of a project that you should put all the “working” pieces together (and even then, they may not work nicely together!).
Dr Lucy Rogers will be joining our Digital Making at Home educators on our This is Engineering-themed live stream today at 5.30pm GMT.
This is your young people’s chance to be inspired by this amazing inventor! And we will take live questions via YouTube, Facebook, Twitter, and Twitch, so make sure your young people are able to get Dr Lucy’s live answers to their own questions about digital making, creativity, and all things engineering!
To get inspired about engineering right now, your young people can follow along step by step with Electricity generation, our brand-new, free digital making project on the impact of non-renewable energy on our planet!
While coding this Scratch project, learners input real data about the type and amount of natural resources that countries across the world use to generate electricity, and they then compare the results using an animated data visualisation.
The data we’ve made part of this project was compiled by the International Energy Agency.
To find out more about This is Engineering Day, please visit www.ThisisEngineering.org.uk.
It’s been a journey, but it’s finally here, and I can talk about the secret Raspberry Pi 400 project! I’ll also try to cover some of the questions you asked following Eben’s announcement of Raspberry Pi 400 yesterday.
It’s been over four years since the original idea of a Raspberry Pi inside a keyboard was discussed, before I even started working at Raspberry Pi Towers. Initially, the plan was for a kit with all the parts needed for people simply to open the box and get started by connecting the accessories to a “classic” credit-card sized Raspberry Pi. The challenge was that we needed a mouse and a keyboard: if we could manufacture a mouse and a keyboard, we could make a complete kit. How hard could it be? Then, within a day of our announcing our new keyboard and mouse, we saw a blog from someone who had milled out the keyboard and integrated a Raspberry Pi 3 Model A+ into it.
Our jaws dropped – we were impressed but we couldn’t say a word. Then others did the same with a Raspberry Pi Zero, and by that point we kind of expected that. We knew it was a good idea.
The keyboard and mouse were the big things we needed to sort out: once the quality control and supply chain were in place for those, we could move to fitting keyboard matrices to Raspberry Pi 400s, and achieve final assembly in Sony’s manufacturing facility in Wales. We had first planned to make a Raspberry Pi 3-based version, but it was clear that getting such a complex item into production wouldn’t happen until after we’d launched Raspberry Pi 4, and this would make the new product seem like a runner-up. So, instead, we started work on the Raspberry Pi 4-based version as soon as the design for that was finalised.
The board inside the housing is essentially a Raspberry Pi 4 unit, but with a fresh PCB design. It has the same USB and Ethernet system as Raspberry Pi 4, but one of the USB 2.0 ports is dedicated to the keyboard.
We have already seen a few comments about the USB ports being on the left side of the unit, and the fact that this makes the mouse cable cross over for most right-handed users. The PCB shape had to be defined early on so that the industrial designers could get on with the housing design. I then stared endlessly at the PCB layout, trying to get one of the USB ports to route to the right side without wrecking the signal integrity of the memory or the HDMI; I could not find a way to do this. Left-handed folks and Bluetooth mouse owners will be happy at least!
Raspberry Pi 400 has dual-band 802.11b/g/n/ac wireless LAN and Bluetooth 5.0. Like Raspberry Pi 4, it has dual micro HDMI outputs supporting video resolutions up to 4K. It would have been lovely to have had full-size HDMI connectors, but to achieve this we would have had to remove other functions, or make a bulkier unit. However, the kit does come with a micro HDMI-to-HDMI cable to cheer you all up.
We kept the GPIO connector since it is loved so much by beginners and experts alike, and this is after all a Raspberry Pi – we want people to be able to use it for tinkering and prototyping. The HAT functionality works better with an extender cable, which you can buy from numerous websites.
Raspberry Pi 400 has the same circuit layout for the power management, processor, and memory as Raspberry Pi 4, but with one major difference: we’ve adjusted the operating point to 1.8GHz! And did I mention cooling? We’ve solved the cooling challenge so users don’t have to give this any thought. Raspberry Pi 400 contains a heat spreader that dissipates the heat across the whole unit, front and back, so that no part of it will feel too hot to touch. In fact, there is enough thermal margin to overclock it, if you’re so inclined.
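If you are so inclined, overclocking on Raspberry Pi OS is usually a matter of editing /boot/config.txt. The fragment below is a sketch: the option names (`arm_freq`, `over_voltage`) are the standard ones, but the values shown are illustrative examples rather than tested recommendations for Raspberry Pi 400.

```ini
# /boot/config.txt -- illustrative overclock settings (example values only;
# check the current official documentation before trying these)
over_voltage=6      # raise the core voltage to keep a higher clock stable
arm_freq=2000       # target CPU frequency in MHz (Raspberry Pi 400 stock is 1800)
```

As with any overclock, raise values gradually and keep an eye on temperatures with `vcgencmd measure_temp`.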
Some folks have asked us why we did not fit the Raspberry Pi Compute Module inside. The reason is that above a certain scale, it generally makes more sense to go with a custom PCB rather than a module with a carrier board. With hundreds of thousands of Raspberry Pi 400 units in the first instance, we are above that scale.
We also have a feature that is completely new to Raspberry Pi products: an on/off button! Power off is achieved by holding down Fn+F10 for two seconds. This is a soft control that negotiates with Linux to shut down, so you don’t corrupt your memory card or your USB drive. Power can be restored by pressing F10 (or Fn+F10).
A lot of love went into making this the best possible product we can manufacture, and it has been through extensive alpha testing and compliance testing. I thought I would show you the insides of a very early prototype. There are already some teardown videos online if you want to see how Raspberry Pi 400 is put together; it has not changed much from this:
The official Raspberry Pi mouse has been a lovely product to have available where Raspberry Pi 400 is concerned, because now we can provide a complete kit of official matching Raspberry Pi parts that looks fantastic on your desk. The kit comes with the SD card already programmed and inserted, so on Christmas day, you just need to plug it into the family TV and start coding. No frantic searches for somewhere that sells memory cards!
The kit includes:
Finally, a bit of fun to finish with. On Christmas morning 1985, I opened the polystyrene box of a Commodore 64 computer and the world switched on for me. It had the best games and the best sound, and it was easy to program. We think the combination of gaming and programming still works today, but we’ve come a long way since 1985. Here’s a chart to show how a Commodore 64 and a Raspberry Pi 400 compare.
I particularly like the benchmark increase for less than half the power. This makes Raspberry Pi 400 almost a million times more efficient at processing data.
We do hope this brings smiles to the faces of those fortunate enough to get one by Christmas. The factory has been running flat-out for the last two months building up stock – order yours soon though, since they’ll sell quickly!
Alwyn Roberts, Andy Liu, Anthony Morton, Antti Silventoinen, Austin Su, Ben Stephens, Brendan Moran, Craig Wightman, Daniel Thompsett, David Christie, David John, David Lenton, Dominic Plunkett, Eddie Thorn, Gordon Hollingworth, Helen Marie, Jack Willis, James Adams, Jeremy Wang, Joe Whaley, Keiran Abraham, Keri Norris, Kuanhsi Ho, Laurent Le Mentec, Mandy Oliver, Mark Evans, Michael Howells, Mike Buffham, Mike Unwin, Peter Challis, Phil Elwell, Rhys Polley, Richard Jones, Rob Matthews, Roger Thornton, Sherman Liu, Simon Lewis, Simon Oliver, Tim Gover, Tony Jones, Viktor Lundström, Wu Hairong, and all the alpha testers and resellers who made Raspberry Pi 400 possible.
Raspberry Pi has always been a PC company. Inspired by the home computers of the 1980s, our mission is to put affordable, high-performance, programmable computers into the hands of people all over the world. And inspired by these classic PCs, here is Raspberry Pi 400: a complete personal computer, built into a compact keyboard.
Raspberry Pi 4, which we launched in June last year, is roughly forty times as powerful as the original Raspberry Pi, and offers an experience that is indistinguishable from a legacy PC for the majority of users. Particularly since the start of the COVID-19 pandemic, we’ve seen a rapid increase in the use of Raspberry Pi 4 for home working and studying.
But user friendliness is about more than performance: it can also be about form factor. In particular, having fewer objects on your desk makes for a simpler set-up experience. Classic home computers – BBC Micros, ZX Spectrums, Commodore Amigas, and the rest – integrated the motherboard directly into the keyboard. No separate system unit and case; no keyboard cable. Just a computer, a power supply, a monitor cable, and (sometimes) a mouse.
We’ve never been shy about borrowing a good idea. Which brings us to Raspberry Pi 400: it’s a faster, cooler 4GB Raspberry Pi 4, integrated into a compact keyboard. Priced at just $70 for the computer on its own, or $100 for a ready-to-go kit, if you’re looking for an affordable PC for day-to-day use this is the Raspberry Pi for you.
The Raspberry Pi 400 Personal Computer Kit is the “Christmas morning” product, with the best possible out-of-box experience: a complete PC which plugs into your TV or monitor. The kit comprises:
At launch, we are supporting English (UK and US), French, Italian, German, and Spanish keyboard layouts, with (for the first time) translated versions of the Beginner’s Guide. In the near future, we plan to support the same set of languages as our official keyboard.
Saving money by bringing your own peripherals has always been part of the Raspberry Pi ethos. If you already have the other bits of the kit, you can buy a Raspberry Pi 400 computer on its own for just $70.
To accompany Raspberry Pi 400, we’ve released a fourth edition of our popular Raspberry Pi Beginner’s Guide, packed with updated material to help you get the most out of your new PC.
You can buy a copy of the Beginner’s Guide today from the Raspberry Pi Press store, or download a free PDF.
UK, US, and French Raspberry Pi 400 kits and computers are available to buy right now. Italian, German, and Spanish units are on their way to Raspberry Pi Approved Resellers, who should have them in stock in the next week.
We expect that Approved Resellers in India, Australia, and New Zealand will have kits and computers in stock by the end of the year. We’re rapidly rolling out compliance certification for other territories too, so that Raspberry Pi 400 will be available around the world in the first few months of 2021.
Of course, if you’re anywhere near Cambridge, you can head over to the Raspberry Pi Store to pick up your Raspberry Pi 400 today.
We let a handful of people take an early look at Raspberry Pi 400 so they could try it out and pull together their thoughts to share with you. Here’s what some of them made of it.
Simon Martin, who has spent the last couple of years bringing Raspberry Pi 400 to life, will be here tomorrow to share some of the interesting technical challenges that he encountered along the way. In the meantime, start thinking about what you’ll do with your Raspberry Pi PC.