
Monitor air quality with a Raspberry Pi

Par Andrew Gregory

Add a sensor and some Python 3 to your Raspberry Pi to keep tabs on your local air pollution, in the project taken from Hackspace magazine issue 21.

Air is the very stuff we breathe. It’s about 78% nitrogen, 21% oxygen, and 1% argon, and then there’s the assorted ‘other’ bits and pieces – many of which have been spewed out by humans and our related machinery. Carbon dioxide is obviously an important polluter for climate change, but there are other bits we should be concerned about for our health, including particulate matter. This is just really small bits of stuff, like soot and smog. They’re grouped together based on their size – the most important, from a health perspective, are those that are smaller than 2.5 microns in width (known as PM2.5), and PM10, which are between 10 and 2.5 microns in width. This pollution is linked with respiratory illness, heart disease, and lung cancer.

Obviously, this is something that’s important to know about, but it’s something that – here in the UK – we have relatively little data on. While there are official sensors in most major towns and cities, the effects can be very localised around busy roads and trapped in valleys. How does the particular make-up of your area affect your air quality? We set out to monitor our environment to see how concerned we should be about our local air.

Getting started

We picked the SDS011 sensor for our project (see ‘Picking a sensor’ below for details on why). This sends output via a binary data format on a serial port. You can read this serial connection directly if you’re using a controller with a UART, but the sensors also usually come with a USB-to-serial connector, allowing you to plug it into any modern computer and read the data.

The USB-to-serial connector makes it easy to connect the sensor to a computer

The very simplest way of using this is to connect it to a computer. You can read the sensor values with software such as DustViewerSharp. If you’re just interested in reading data occasionally, this is a perfectly fine way of using the sensor, but we want a continuous monitoring station – and we didn’t want to leave our laptop in one place, running all the time. When it comes to small, low-power boards with USB ports, there’s one that always springs to mind – the Raspberry Pi.

First, you’ll need a Raspberry Pi (any version) that’s set up with the latest version of Raspbian, connected to your local network, and ideally with SSH enabled. If you’re unsure how to do this, there’s guidance on the Raspberry Pi website.

The wiring for this project is just about the simplest we’ll ever do: connect the SDS011 to the Raspberry Pi with the serial adapter, then plug the Raspberry Pi into a power source.

Before getting started on the code, we also need to set up a data repository. You can store your data wherever you like – on the SD card, or upload it to some cloud service. We’ve opted to upload it to Adafruit IO, an online service for storing data and making dashboards. You’ll need a free account, which you can sign up for on the Adafruit IO website – you’ll need to know your Adafruit username and Adafruit IO key in order to run the code below. If you’d rather use a different service, you’ll need to adjust the code to push your data there.

We’ll use Python 3 for our code, and we need two modules – one to read the data from the SDS011 and one to push it to Adafruit IO. You can install them both by entering the following command in a terminal:

pip3 install pyserial adafruit-io

You’ll now need to open a text editor and enter the following code:
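The original listing doesn’t survive in this copy of the article, so here is a minimal sketch of what the paragraphs below describe. It assumes the USB-to-serial adapter appears as /dev/ttyUSB0 and uses the Kingswood feed names mentioned later – swap in your own username, key, port, and feed names:

```python
import time

def decode_reading(data):
    """Convert the ten bytes of an SDS011 packet into (PM2.5, PM10) in µg/m³."""
    # Bytes 2 and 3 hold PM2.5, bytes 4 and 5 hold PM10, low byte first,
    # in units of tenths of a microgram per cubic metre
    pmtwofive = int.from_bytes(b"".join(data[2:4]), byteorder="little") / 10
    pmten = int.from_bytes(b"".join(data[4:6]), byteorder="little") / 10
    return pmtwofive, pmten

def main():
    import serial                    # pip3 install pyserial
    from Adafruit_IO import Client   # pip3 install adafruit-io

    aio = Client("your-username", "your-adafruit-io-key")
    ser = serial.Serial("/dev/ttyUSB0")

    while True:
        # The SDS011 sends its readings in ten-byte packets
        data = [ser.read() for _ in range(10)]
        pmtwofive, pmten = decode_reading(data)
        aio.send("kingswoodtwofive", pmtwofive)
        aio.send("kingswoodten", pmten)
        time.sleep(10)

# On the Raspberry Pi, with the sensor plugged in, start logging with:
# main()
```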

This does a few things. First, it reads ten bytes of data over the serial port – exactly ten because that’s the format that the SDS011 sends data in – and sticks these data points together to form a list of bytes that we call data.

We’re interested in bytes 2 and 3 for PM2.5 and 4 and 5 for PM10. We convert these from bytes to integer numbers with the slightly confusing line:

pmtwofive = int.from_bytes(b''.join(data[2:4]), byteorder='little') / 10

The from_bytes command takes a string of bytes and converts it into an integer. However, we don’t have a string of bytes; we have a list of two bytes, so we first need to convert it into a string. The b'' creates an empty byte string, and its join method takes a list and joins it together using this empty string as a separator. As the empty string contains nothing, this returns a byte string that just contains our two numbers. The byteorder flag tells the command which way around to read the bytes. We divide the result by ten because the SDS011 returns data in units of tenths of a microgram per cubic metre, and we want the result in micrograms per cubic metre.

aio.send is used to push data to Adafruit IO. The first argument is the name of the feed you want the data to go to. We used kingswoodtwofive and kingswoodten, as the sensor is based in Kingswood; you might want to choose a more geographically relevant name. You can now run your sensor with:
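As a worked example, with made-up packet bytes: if the two PM2.5 bytes are 0x9B and 0x00, the conversion goes like this:

```python
# The first two bytes are the packet header, then the PM2.5 low and high bytes
data = [b"\xaa", b"\xc0", b"\x9b", b"\x00"]

joined = b"".join(data[2:4])                       # b"\x9b\x00"
raw = int.from_bytes(joined, byteorder="little")   # 0x009b = 155
pmtwofive = raw / 10
print(pmtwofive)   # 15.5 µg/m³
```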

python3 airquality.py

…assuming you called the Python file airquality.py and saved it in the directory the terminal is in.

At this point, everything should work and you can set about running your sensor, but as one final point, let’s set it up to start automatically when you turn the Raspberry Pi on. Enter the command:

crontab -e

…and add this line to the file:

@reboot python3 /home/pi/airquality.py

With the code and electronic setup working, your sensor will need somewhere to live. If you want it outside, it’ll need a waterproof case (but include some way for air to get in). We used a Tupperware box with a hole cut in the bottom mounted on the wall, with a USB cable carrying power out via a window. How you do it, though, is up to you.

Now let’s democratise air quality data so we can make better decisions about the places we live.

Picking a sensor

There are a variety of particulate sensors on the market. We picked the SDS011 for a couple of reasons. Firstly, it’s cheap enough for many makers to be able to buy and build with. Secondly, it’s been reasonably well studied for accuracy. Both the hackAIR and InfluencAir projects have compared the readings from these sensors with more expensive, better-tested sensors, and the results have come back favourably. You can see more details at hsmag.cc/DiYPfg and hsmag.cc/Luhisr.

The one caveat is that the results are unreliable when the humidity is at the extremes (either very high or very low). The SDS011 is only rated to work up to 70% humidity. If you’re collecting data for a study, then you should discard any readings when the humidity is above this. HackAIR has a formula for attempting to correct for this, but it’s not reliable enough to neutralise the effect completely. See their website for more details: hsmag.cc/DhKaWZ.
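For instance, if you log a humidity reading alongside each particulate reading (hypothetically, from a separate humidity sensor – the SDS011 doesn’t measure it), discarding the out-of-range readings for a study is a one-liner:

```python
def usable(readings):
    """Keep only readings taken at or below the SDS011's rated 70% humidity."""
    return [r for r in readings if r["humidity"] <= 70]

readings = [
    {"pm25": 12.1, "humidity": 55},
    {"pm25": 30.4, "humidity": 88},   # beyond the sensor's rating – discard
    {"pm25": 9.7, "humidity": 70},
]
print(usable(readings))   # the 88% reading is dropped
```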

Safe levels

Once you’re monitoring your PM2.5 data, what should you look out for? The World Health Organisation air quality guidelines stipulate that PM2.5 should not exceed a 10 µg/m³ annual mean, or a 25 µg/m³ 24-hour mean, and that PM10 should not exceed a 20 µg/m³ annual mean, or a 50 µg/m³ 24-hour mean. However, even these might not be safe. In 2013, a large survey published in The Lancet “found a 7% increase in mortality with each 5 micrograms per cubic metre increase in particulate matter with a diameter of 2.5 micrometres (PM2.5).”

Where to locate your sensor

Standard advice for locating your sensor is that it should be outside and four metres above ground level. That’s good advice for general environmental monitoring; however, we’re not necessarily interested in general environmental monitoring – we’re interested in knowing what we’re breathing in.

Locating your monitor near your workbench will give you an idea of what you’re actually inhaling – useless for any environmental study, but useful if you spend a lot of time in there. We found, for example, that the glue gun produced huge amounts of PM2.5, and we’ll be far more careful with ventilation when using this tool in the future.

Adafruit IO

You can use any data platform you like. We chose Adafruit IO because it’s easy to use, lets you share visualisations (in the form of dashboards) with others, and connects with IFTTT to perform actions based on values (ours tweets when the air pollution is above legal limits).

One thing to be aware of is that Adafruit IO only holds data for 30 days (on the free tier at least). If you want historical data, you’ll need to sign up for the Plus option (which stores data for 60 days), or use an alternative storage method. You can use multiple data stores if you like.

Checking accuracy

Now you’ve got your monitoring station up and running, how do you know that it’s running properly? Perhaps there’s an issue with the sensor, or perhaps there’s a problem with the code. The easiest method of calibration is to test it against an accurate sensor, and most cities here in the UK have monitoring stations as part of Defra’s Automatic Urban and Rural Monitoring Network. You can find your local station here. Many other countries have equivalent public networks. Unless there is no other option, we would caution against using crowdsourced data for calibration, as these sensors aren’t themselves calibrated.

With a USB battery pack, you can head to your local monitoring point and see if your monitor is getting similar results to the monitoring network.

HackSpace magazine #21 is out now

You can read the rest of this feature in HackSpace magazine issue 21, out today in Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy HackSpace mag directly from us — worldwide delivery is available. And if you’d like to own a handy digital version of the magazine, you can also download a free PDF.

The post Monitor air quality with a Raspberry Pi appeared first on Raspberry Pi.

Recreate 3D Monster Maze’s 8-bit labyrinth | Wireframe issue 18

Par Ryan Lambie

You too can recreate the techniques behind a pioneering 3D maze game in Python. Mark Vanstone explains how.

3D Monster Maze, released in 1982 by J.K. Greye Software and written by Malcolm Evans.

3D Monster Maze

While 3D games have become more and more realistic, some may forget that 3D games on home computers started in the mists of time on machines like the Sinclair ZX81. One such pioneering game took pride of place in my collection of tapes, took many minutes to load, and required the 16K RAM pack expansion. That game was 3D Monster Maze — perhaps the most popular game released for the ZX81.

The game was released in 1982 by J.K. Greye Software, and written by Malcolm Evans. Although the graphics were incredibly low resolution by today’s standards, it became an instant hit. The idea of the game was to navigate around a randomly generated maze in search of the exit.

The problem was that a Tyrannosaurus rex also inhabited the maze, and would chase you down and have you for dinner if you didn’t escape quickly enough. The maze itself was made of straight corridors on a 16×18 grid, which the player would move around from one block to the next. The shapes of the blocks were displayed using the low-resolution pixels included in the ZX81’s character set, with 2×2 pixels per character on the screen.

The original ZX81 game drew its maze from chunky 2×2 pixel blocks.

Draw imaginary lines

There’s an interesting trick to recreating the original game’s 3D corridor display which, although quite limited, works well for a simplistic rendering of a maze. To do this, we need to draw imaginary lines diagonally from corner to corner in a square viewport: these are our vanishing point perspective guides. Then each corridor block in our view is half the width and half the height of the block nearer to us.

If we draw this out with lines showing the block positions, we get a view that looks like we’re looking down a long corridor with branches leading off left and right. In our Pygame Zero version of the maze, we’re going to use this wireframe as the basis for drawing our block elements. We’ll create graphics for blocks that are near the player, one block away, two, three, and four blocks away. We’ll need to view the blocks from the left-hand side, the right-hand side, and the centre.

The maze display is made by drawing diagonal lines to a central vanishing point.

Once we’ve created our block graphics, we’ll need to make some data to represent the layout of the maze. In this example, the maze is built from a 10×10 list of zeros and ones. We’ll set a starting position for the player and the direction they’re facing (0–3), then we’re all set to render a view of the maze from our player’s perspective.

The display is created from furthest away to nearest, so we look four blocks away from the player (in the direction they’re looking) and draw a block if there’s one indicated by the maze data to the left; we do the same on the right, and finally in the middle. Then we move towards the player by a block and repeat the process (with larger graphics) until we get to the block the player is on.
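Mark’s full code is linked below; the back-to-front drawing order described above can be sketched like this (the maze data, direction encoding, and function names here are illustrative, not Mark’s own):

```python
# A hypothetical 10x10 maze layout: 1 = wall block, 0 = open corridor
maze = [
    [1,1,1,1,1,1,1,1,1,1],
    [1,0,0,0,1,0,0,0,0,1],
    [1,0,1,0,1,0,1,1,0,1],
    [1,0,1,0,0,0,1,0,0,1],
    [1,0,1,1,1,0,1,0,1,1],
    [1,0,0,0,1,0,0,0,0,1],
    [1,1,1,0,1,1,1,1,0,1],
    [1,0,0,0,0,0,0,1,0,1],
    [1,0,1,1,1,1,0,0,0,1],
    [1,1,1,1,1,1,1,1,1,1],
]

# Grid offsets for the four facing directions: 0=north, 1=east, 2=south, 3=west
DIRECTIONS = [(0, -1), (1, 0), (0, 1), (-1, 0)]

def wall_at(maze, x, y):
    """Treat anything outside the grid as solid wall."""
    if 0 <= y < len(maze) and 0 <= x < len(maze[0]):
        return maze[y][x] == 1
    return True

def visible_blocks(maze, px, py, facing, depth=4):
    """List the blocks to draw, furthest first, as (distance, side) pairs."""
    dx, dy = DIRECTIONS[facing]
    lx, ly = DIRECTIONS[(facing - 1) % 4]   # the direction to the player's left
    draw_list = []
    for dist in range(depth, 0, -1):        # back to front
        cx, cy = px + dx * dist, py + dy * dist
        if wall_at(maze, cx + lx, cy + ly): # block to the left?
            draw_list.append((dist, "left"))
        if wall_at(maze, cx - lx, cy - ly): # block to the right?
            draw_list.append((dist, "right"))
        if wall_at(maze, cx, cy):           # block dead ahead?
            draw_list.append((dist, "centre"))
    return draw_list

# Standing at (1, 1) facing east, looking down the top corridor
print(visible_blocks(maze, 1, 1, 1))
```

In the Pygame Zero draw() function, each (distance, side) pair would then be matched to the corresponding pre-drawn block image and blitted in that order, so nearer blocks cover further ones.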

Each visible block is drawn from the back forward to make the player’s view of the corridors.

That’s all there is to it. To move backwards and forwards, just change the position in the grid the player’s standing on and redraw the display. To turn, change the direction the player’s looking and redraw. This technique’s obviously a little limited, and will only work with corridors viewed at 90-degree angles, but it launched a whole genre of games on home computers. It really was a big deal for many twelve-year-olds — as I was at the time — and laid the path for the vibrant, fast-moving 3D games we enjoy today.

Here’s Mark’s code, which recreates 3D Monster Maze’s network of corridors in Python. To get it running on your system, you’ll need to install Pygame Zero. And to download the full code, visit our GitHub repository here.

Get your copy of Wireframe issue 18

You can read more features like this one in Wireframe issue 18, available now at Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 18 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Recreate 3D Monster Maze’s 8-bit labyrinth | Wireframe issue 18 appeared first on Raspberry Pi.

Good Buoy: the Raspberry Pi Smart Buoy

Par Alex Bate

As their new YouTube video shows, the team at T3ch Flicks have been hard at work, designing and prototyping a smart buoy for marine conservation research.

Smart-Buoy Series [Summary]

We all love the seaside, right? Whether that’s the English seaside with ice creams and muddy piers or the Caribbean, with white sand beaches fringed by palm trees, people flock to the coast for a bit of rest and relaxation, to enjoy water sports or to make their livelihood.

What does a smart buoy do?

“The sensors onboard the smart buoy enable it to measure wave height, wave period, wave power, water temperature, air temperature, air pressure, voltage, current usage and GPS location,” explain T3ch Flicks on their project tutorial page. “All the data the buoy collects is sent via radio to a base station, which is a Raspberry Pi. We made a dashboard to display them using Vue JS.”

But why build a smart buoy to begin with? “The coast is a dynamic area at the mercy of waves. Rising sea levels nibble at beaches and powerful extreme events like hurricanes completely decimate them,” they go on to explain. “To understand how to save them, we need to understand the forces driving their change.”

The 3D-printed casing of the smart buoy with tech inside

It’s a pretty big ask of a 3D-printed dome but, with the aid of an on-board Raspberry Pi, Arduino and multiple sensors, their project was a resounding success. So much so that the Grenadian government gave the team approval to set the buoy free along their coast, and even made suggestions of how the project could be improved to aid them in their own research – pretty cool, right?

The smart buoy out at sea along the Grenada coast

The project uses a lot of tech. A lot. So, instead of listing it here, why not head over to the hackster.io project page, where you’ll find all the ingredients you need to build your own smart buoy.

Good luck to the T3ch Flicks team. We look forward to seeing how the project develops.

The post Good Buoy: the Raspberry Pi Smart Buoy appeared first on Raspberry Pi.

Raspberry Pi mineral oil tank with added pizzazz

Par Alex Bate

This isn’t the first mineral oil bath we’ve seen for the Raspberry Pi, but it’s definitely the first we’ve seen with added fish tank decorations.

Using the see-through casing of an old Apple PowerMac G4, Reddit user u/mjh2901 decided to build a mineral oil tank for their Raspberry Pi, and it looks fabulous. Renamed Apple Pi, this use of mineral oil is a technique used by some to manage the heat produced by tech. Oil is able to transfer heat up to five times more efficiently than air, with some mineral oil projects using a separate radiator to dissipate the heat back into the air.

So, how did they do it?

“Started with a PowerMac G4 case I previously used as a fish tank, then a candy dish. I had cut a piece of acrylic and glued it into the bottom.”

They then placed a Raspberry Pi 3, attached to a 2-line, 16-character LCD, into the tank, along with various decorations, and began to fill the tank with store-bought mineral oil. Once full, the project was complete, the Raspberry Pi forever submerged.

You can find more photos here. But, one question still remains…

…who would use an old fish tank as a candy bowl?! 🤢

The post Raspberry Pi mineral oil tank with added pizzazz appeared first on Raspberry Pi.

Hack your old Raspberry Pi case for the Raspberry Pi 4

Par Alex Bate

Hack your existing Raspberry Pi case to fit the layout of your new Raspberry Pi 4, with this handy “How to hack your existing Raspberry Pi case to fit the layout of your new Raspberry Pi 4” video!

Hack your old Raspberry Pi case to fit your Raspberry Pi 4

Hack your existing official Raspberry Pi case to fit the new Raspberry Pi 4, or treat yourself to the new official Raspberry Pi 4 case. The decision is yours!

How to hack your official Raspberry Pi case

  1. Take your old Raspberry Pi out of its case.
  2. Spend a little time reminiscing about all the fun times you had together.
  3. Reassure your old Raspberry Pi that this isn’t the end, and that it’ll always have a special place in your heart.
  4. Remember that one particular time – you know the one; wipe a loving tear from your eye.
  5. Your old Raspberry Pi loves you. It’s always been there for you. Why are you doing this?
  6. Look at the case. Look at it. Look how well it fits your old Raspberry Pi. Those fine, smooth edges; that perfect white and red combination. The three of you – this case, your old Raspberry Pi, and you – you make such a perfect team. You’re brilliant.
  7. Look at your new Raspberry Pi 4. Yes, it’s new, and faster, and stronger, but this isn’t about all that. This is about all you’ve gone through with your old Raspberry Pi. You’re just not ready to say goodbye. Not yet.
  8. Put your buddy, the old Raspberry Pi, back in its case and set it aside. There are still projects you can work on together; this is not the end. No, not at all.
  9. In fact, why do you keep calling it your old Raspberry Pi? There’s nothing old about it. It still works; it still does the job. Sure, your Raspberry Pi 4 can do things that this one can’t, and you’re looking forward to trying them out, but that doesn’t make this one redundant. Heck, if we went around replacing older models with newer ones all the time, Grandma would be 24 years old and you’d not get any of her amazing Sunday dinners, and you do love her honey-glazed parsnips.
  10. Turn to your new Raspberry Pi 4 and introduce yourself. It’s not its fault that you’re having a temporary crisis. It hasn’t done anything wrong. So take some time to really get to know your new friend.
  11. New friendships take time, and fresh beginnings, dare we say it…deserve new cases.
  12. Locate your nearest Raspberry Pi Approved Reseller and purchase the new Raspberry Pi 4 case, designed especially to make your new Raspberry Pi comfortable and secure.
  13. Reflect that this small purchase of a new case will support the charitable work of the Raspberry Pi Foundation. Enjoy a little warm glow inside. You did good today.
  14. Turn to your old keyboard

The post Hack your old Raspberry Pi case for the Raspberry Pi 4 appeared first on Raspberry Pi.

Record the last seven seconds of everything you see

Par Alex Bate

Have you ever witnessed something marvellous but, by the time you get your camera out to record it, the moment has passed? Johan’s Film in the Past hat-mounted camera is here to save the day!

Record the past

As 18-year-old student Johan explains, “Imagine you are walking in the street and you see a meteorite in the sky – obviously you don’t have time to take your phone to film it.” While I haven’t seen many meteorites in the sky, I have found myself wishing I’d had a camera to hand more than once in my life – usually when a friend trips over or says something ridiculous. “Fortunately after the passage of the meteorite, you just have to press a button on the hat and the camera will record the last 7 seconds”, Johan continues. “Then you can download the video from an application on your phone.”

Johan’s project, Film in the Past, consists of a Raspberry Pi 3 with USB camera attached, mounted to the peak of a baseball cap.

The camera is always on, and, at the press of a button, will save the last seven seconds of footage to the Raspberry Pi. You can then access the saved footage from an application on your smartphone. It’s a bit like the video capture function on the Xbox One or, as I like to call it, the option to record hilarious glitches during gameplay. But, unlike the Xbox One, it’s a lot easier to get the footage off the Raspberry Pi and onto your phone.
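Johan’s own code is linked below; the underlying rolling-buffer idea can be sketched with a fixed-length deque that always holds the most recent seven seconds’ worth of frames (the frame rate and function names here are illustrative):

```python
from collections import deque

FPS = 25                      # illustrative capture rate
BUFFER_SECONDS = 7
buffer = deque(maxlen=FPS * BUFFER_SECONDS)   # old frames fall off automatically

def on_new_frame(frame):
    """Called for every frame the camera produces."""
    buffer.append(frame)

def save_last_seven_seconds():
    """Called when the button is pressed: snapshot the buffer's contents."""
    return list(buffer)

# Simulate 30 seconds of capture (frame numbers stand in for images),
# then a button press
for i in range(FPS * 30):
    on_new_frame(i)
clip = save_last_seven_seconds()
print(len(clip), clip[0], clip[-1])   # 175 frames: numbers 575..749
```

Because the deque has a fixed maximum length, appending a new frame silently discards the oldest one, so the buffer never grows beyond seven seconds of footage no matter how long the camera runs.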

Fancy building your own? The full Python code for the project can be downloaded via GitHub, and more information can be found on Instructables and Johan’s website.

The post Record the last seven seconds of everything you see appeared first on Raspberry Pi.

Snazzy photographs of Raspberry Pis #SnazzyRPi

Par Alex Bate

If you don’t follow Raspberry Pi on Instagram, you really should, for there you will find #SnazzyRPi, a collection of snazzy-looking Raspberry Pi photographs taken by our very own Fiacre Muller.

Do you have a Raspberry Pi 3 A+? What have you built with it? And how snazzy is this photo from @fiacremuller?! #RaspberryPi #3A+ #RaspberryPi3A+ #Computing


Here are a few more to whet your appetite. Enjoy.

Join the #SnazzyRPi revolution and share your Raspberry Pi glamour shots on Instagram using #SnazzyRPi.

The post Snazzy photographs of Raspberry Pis #SnazzyRPi appeared first on Raspberry Pi.

NASA, Raspberry Pi and a mini rover

Par Alex Bate

NASA scientist Dr Jamie Molaro plans to conduct potentially ground-breaking research using a Raspberry Pi seismometer and a mini rover.

Jamie has been working on a payload-loaded version of NASA’s Open Source Rover

In the summer of 2018, engineers at NASA’s Jet Propulsion Laboratory built a mini planetary rover with the aim of letting students, hobbyists, and enthusiasts create one for themselves. It uses commercial off-the-shelf parts and has a Raspberry Pi as its brain. But despite costing about $5333 in total, the Open Source Rover Project has proven rather popular, including among people who actually work for the USA’s space agency.

One of those is Dr Jamie Molaro, a research scientist at the Planetary Science Institute. Her main focus is studying the surfaces of rocky and icy airless bodies such as comets, asteroids, and the moons orbiting Earth, Jupiter, and Saturn. So when she decided to create her mini-rover – which she dubbed PARSLEE, or Planetary Analog Remote Sensor and ‘Lil Electronic Explorer – she also sought to shake things up a little.

Brought to life

Constructing the robot itself was, she says, rather straightforward: the instructions were detailed and she was able to draw upon the help of others in a forum. Jamie also built the robot with her husband, a software engineer at Adobe. “My interest in the Open Source Rover Project was driven by my scientific background, but not my ability to build it”, she tells us, of what is essentially a miniature version of the Curiosity rover trundling over the surface of Mars.

After building the rover wheel assembly, Jamie worked on the head assembly and then the main body itself

Jamie’s interest in science led to her considering the rover’s potential payload before the couple had even finished building it. She added a GoPro camera and a Kestrel 833, which measures temperature, pressure, elevation, wind speed, and humidity. In addition, she opted to use a Raspberry Shake seismometer – a device costing a few hundred dollars which comprises a device sensor, circuit board, and digitiser – with a Raspberry Pi board and a preprogrammed microSD card.

With the electronics assembly complete, Jamie and her husband could get on with integrating PARSLEE’s parts

The sensor records activity, converts the analogue signals to digital, and allows the recorded data to be read on Raspberry Shake servers. Jamie hopes to use PARSLEE to study the kinds of processes active at the surface of other planets. “A seismometer helps us understand our physical environment in a very different way than images from a camera,” she says.

Seismic solutions

To that end, with funding, Jamie would like to heat and cool boulders and soils in the lab and in the field and analyse their seismic signature. Thermally driven shallow moonquakes were recorded by instruments used by the Apollo astronauts, she says. “We believe these quakes may reflect signals from a thermal fracturing process that breaks down lunar boulders, or from the boulders and surrounding soil shifting and settling as it changes temperature throughout the day. We can do experiments on Earth that mimic this process and use what we learn to help us understand the lunar seismic data.”

A Raspberry Pi processes the data recorded from the sensor and powers the whole device, with the whole unit forming a payload on PARSLEE

Jamie is also toying with optimum locations for the Shake-fitted rover. The best planetary analogue environments are usually deserts, she reveals, due to the lack of moisture and low vegetation. Places like dry lake beds, lava flows, and sand dunes all provide good challenges for testing the rover’s ability to manoeuvre and collect data, as well as for trying out technology being developed with and for it. One thing’s for sure: it is set to travel and potentially make a scientific breakthrough, as anyone can use the rover for DIY science experiments.

Read more about PARSLEE on Jamie’s website.

The MagPi magazine #83

This article is from the brand-new issue of The MagPi, the official Raspberry Pi magazine. Buy it from all good newsagents, subscribe to pay less per issue and support our work, or download the free PDF to give it a try first.

The post NASA, Raspberry Pi and a mini rover appeared first on Raspberry Pi.

How to build databases using Python and text files | Hello World #9

Par Mac Bowley

In Hello World issue 9, Raspberry Pi’s own Mac Bowley shares a lesson that introduces students to databases using Python and text files.

In this lesson, students create a library app for their books. This will store information about their book collection and allow them to display, manipulate, and search their collection. You will show students how to use text files in their programs that act as a database.

The project will give your students practical examples of database terminology and hands-on experience working with persistent data. It gives opportunities for students to define and gain concrete experience with key database concepts using a language they are familiar with. The script that accompanies this activity can be adapted to suit your students’ experience and competency.

This ready-to-go software project can be used alongside approaches such as PRIMM or pair programming, or as a worked example to engage your students in programming with persistent data.

What makes a database?

Start by asking the students why we need databases and what they are: do they ever feel unorganised? Life can get complicated, and there is so much to keep track of, the raw data required can be overwhelming. How can we use computing to solve this problem? If only there was a way of organising and accessing data that would let us get it out of our head. Databases are a way of organising the data we care about, so that we can easily access it and use it to make our lives easier.

Then explain that in this lesson the students will create a database, using Python and a text file. The example I show students is a personal library app that keeps track of which books I own and where I keep them. I have also run this lesson and allowed the students to pick their own items to keep track of — it just involves a little more planning time at the end. Split the class up into pairs; have each of them discuss and select five pieces of data about a book (or their own item) they would like to track in a database. They should also consider which type of data each of these is. Give them five minutes to discuss and select some data to track.

Databases are organised collections of data, and this allows them to be displayed, maintained, and searched easily. Our database will have one table — effectively just like a spreadsheet table. The headings on each of the columns are the fields: the individual pieces of data we want to store about the books in our collection. The pieces of information about a single book are called its attributes, and they are stored together in one record, which would be a single row in our database table. To make it easier to search and sort our database, we should also select a primary key: one field that will be unique for each book. Sometimes one of the fields we are already storing works for this purpose; if not, then the database will create an ID number that it uses to uniquely identify each record.

Create a library application

Pull the class back together and ask a few groups about the data they selected to track. Make sure they have chosen appropriate data types. Ask some if they can find any of the fields that would be a primary key; the answer will most likely be no. The ISBN could work, but for our simple application, having to type in a 10- or 13-digit number just to use for an ID would be overkill. In our database, we are going to generate our own IDs.

The requirements for our database are that it can do the following things: save data to a file, read data from that file, create new books, display our full database, allow the user to enter a search term, and display a list of relevant results based on that term. We can decompose the problem into the following steps:

  • Set up our structures
  • Create a record
  • Save the data to the database file
  • Read from the database file
  • Display the database to the user
  • Allow the user to search the database
  • Display the results

Have the class log in and power up Python. If they are doing this locally, have them create a new folder to hold this project. We will be interacting with external files, so keeping everything in the same folder avoids confusion with file locations and paths. They should then load up a new Python file. To start, download the starter file from the link provided. Each student should make a copy of this file. At first, I have them examine the code, and then get them to run it. Using concepts from PRIMM, I get them to print certain messages when a menu option is selected. This can be a great exemplar for making a menu in any application they are developing. This will be the skeleton of our database app: giving them a starter file helps ease some of the cognitive load on students.
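The starter file itself lives at the link provided, so it isn't reproduced here, but a minimal sketch of the kind of menu skeleton described might look like this (the option labels and function names are illustrative, not the actual starter file):

```python
# A minimal sketch of a menu skeleton, not the actual starter file.
def handle_choice(choice):
    """Return the feedback message for a menu selection."""
    messages = {
        "1": "You selected: add a book",
        "2": "You selected: display all books",
        "3": "You selected: search the database",
        "4": "Goodbye!",
    }
    return messages.get(choice, "Not a valid option, please try again.")

def main():
    choice = ""
    while choice != "4":
        print("1: Add a book")
        print("2: Display all books")
        print("3: Search the database")
        print("4: Quit")
        choice = input("Choose an option: ")
        print(handle_choice(choice))
```

As the lesson progresses, each placeholder message can be swapped for a call to the real function the students go on to write.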

Have them examine the variables and make guesses about what they are used for.

  • current_ID – a variable to count up as we create records; this will be our primary key
  • new_additions – a list to hold any new records we make while our code is running, before we save them to the file
  • filename – the name of the database file we will be using
  • fields – a list of our fields, so that our dictionaries can be aligned with our text file
  • data – a list that will hold all of the data from the database, so that we can search and display it without having to read the file every time

Create the first record

We are going to use dictionaries to store our records. They reference their elements using keys instead of indices, which fits our database fields nicely. We are going to generate our own IDs. Each of these must be unique, so we need a variable that we can increment as we make our records. This is a user-focused application, so let's have the user input the data for the first book. The strings in quotes on the left of the colons are the keys (the names of our fields), and the data on the right is the stored value, in our case whatever the user inputs in response to our prompts. We finish this part by adding the record to our list of new additions, incrementing the current ID, and then displaying a useful feedback message to say the record has been created successfully. Your students should now save their code and run it to make sure there aren't any syntax errors.

You could make use of pair programming, with carefully selected pairs taking it in turns in the driver and navigator roles. You could also offer differing levels of scaffolding: providing some of the code and asking them to modify it based on given requirements.

How to use the code in your class

To complete the project, your students can add functionality to save their data to a CSV file, read from a database file, and allow users to search the database. The code for the whole project is available at helloworld.cc/database.
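One possible shape for that remaining functionality, assuming the CSV storage mentioned above (the file name and field names here are illustrative, not taken from the finished project):

```python
import csv

filename = "library.csv"   # illustrative database file name
fields = ["ID", "Title", "Author", "Year", "Location"]

def save_data(records):
    """Append new records to the CSV database file."""
    with open(filename, "a", newline="") as f:
        csv.DictWriter(f, fieldnames=fields).writerows(records)

def read_data():
    """Read every record back out of the database file."""
    with open(filename, newline="") as f:
        return list(csv.DictReader(f, fieldnames=fields))

def search(records, term):
    """Return the records in which any attribute contains the term."""
    term = term.lower()
    return [r for r in records
            if any(term in str(value).lower() for value in r.values())]
```

One thing worth pointing out to students: everything read back from a CSV file is a string, so IDs need converting back to integers before sorting by them.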

An example of the code

You may want to give your students the entire piece of code. They can investigate and modify it to their own purpose. You can also lead them through it, having them follow you as you demonstrate how an expert constructs a piece of software. I have done both to great effect. Let me know how your classes get on! Get in touch at contact@helloworld.cc

Hello World issue 9

The brand-new issue of Hello World is out today, and available right now as a free PDF download from the Hello World website.

UK-based educators can also sign up to receive Hello World as printed magazine FOR FREE, direct to their door. And those outside the UK, educator or not, can subscribe to receive new digital issues of Hello World in their inbox on the day of release.

The post How to build databases using Python and text files | Hello World #9 appeared first on Raspberry Pi.

Take the Wizarding World of Harry Potter home with you

By Alex Bate

If you’ve visited the Wizarding World of Harry Potter and found yourself in possession of an interactive magic wand as a souvenir, then you’ll no doubt be wondering by now, “What do I do with it at home though?”

While the wand was great for setting off window displays at the park itself, it now sits dusty and forgotten upon a shelf. But it still has life left in it — let Jasmeet Singh show you how.

Real Working Harry Potter Wand With Computer Vision and ML

A few months back my brother visited Japan and had a real wizarding experience in the Wizarding World of Harry Potter at Universal Studios, made possible through the technology of computer vision. At the Wizarding World of Harry Potter in Universal Studios, tourists can perform “real magic” at certain locations (where the motion capture system is installed) using specially made wands with retro-reflective beads at the tip.

How do Harry Potter interactive wands work?

The interactive displays at Universal Studios’ Wizarding World of Harry Potter have infrared cameras in place, which are ready to read the correct movements of retroreflector-tipped wands. Move your wand in the right way, and the cameras will recognise your spell and set window displays in motion. Oooooo…magic!

How do I know this? Thanks to William Osman and Allen Pan, who used this Wizarding World technology to turn cheap hot dogs into their own unique wands! Those boys…

Hacking Wands at Harry Potter World

How to make your very own mostly-functional interactive wand. Please don’t ban me from Universal Studios. Links on my blog: http://www.williamosman.com/2017/12/hacking-harry-potter-wands.html Allen’s Channel: https://www.youtube.com/channel/UCVS89U86PwqzNkK2qYNbk5A Support us on Patreon: https://www.patreon.com/williamosman Website: http://www.williamosman.com/ Facebook: https://www.facebook.com/williamosmanscience/ InstaHam: https://www.instagram.com/crabsandscience/ CameraManJohn: http://www.johnwillner.com/

For his Raspberry Pi-enabled wand project, Jasmeet took that same Wizarding World concept to create a desktop storage box that opens and closes in response to the correct flicks of a wand.

A simple night vision camera can be used for motion capture, as such cameras also blast out infrared light, which is not visible to humans but can be seen clearly by a camera that has no infrared filter.

So, the video stream from the camera is fed into a Raspberry Pi, which runs a Python program using OpenCV to detect, isolate, and track the wand tip. We then use the SVM (Support Vector Machine) machine learning algorithm to recognise the pattern drawn, and control the GPIOs of the Raspberry Pi accordingly to perform some activities.
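The real project does all of this with OpenCV and a trained SVM classifier, but the first step (isolating the bright, IR-reflecting wand tip in each frame) can be illustrated without any libraries at all. In this much-simplified sketch, a "frame" is just a 2D list of brightness values, and the names and threshold are assumptions for illustration:

```python
# Simplified illustration only: the real project uses OpenCV for
# detection/tracking and an SVM to classify the traced pattern.
def find_wand_tip(frame, threshold=200):
    """Return (row, col) of the brightest pixel above the threshold,
    or None if nothing bright enough is in view."""
    best, best_pos = threshold, None
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if value > best:
                best, best_pos = value, (r, c)
    return best_pos

def trace_path(frames):
    """Collect the tip position from each frame into a drawn path."""
    return [pos for pos in (find_wand_tip(f) for f in frames) if pos]
```

In the full build, the traced path of points would then be rendered to an image and handed to the SVM, whose prediction decides which GPIO pins to drive.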

For more information on the project, including all the code needed to get started, head over to hackster.io to find Jasmeet’s full tutorial.

The post Take the Wizarding World of Harry Potter home with you appeared first on Raspberry Pi.

Win some Raspberry Pi stickers #GimmeRaspberryPiStickers

By Alex Bate

To celebrate the launch of Raspberry Pi 4, and because it’s almost the weekend, we’re giving away some sticker packs!

For your chance to win a pack, all you have to do is leave a comment below, or comment on the Facebook post about this give-away, or tweet us with the hashtag #GimmeRaspberryPiStickers — all before midnight (BST) Monday 8 July.

Each sticker pack will contain the following stickers, plus any others I find between now and Monday, and we have 10 packs to give away.

Winners will be picked at random, and I’ll tweet who these lucky ten are on Tuesday, so keep your eyes peeled.

Good luck!

Oh, if you don’t see your comment on this post, it’s because you’re new to the blog and we haven’t approved it yet. Don’t worry, it’s there, and we’ll see it before the contest ends.

The post Win some Raspberry Pi stickers #GimmeRaspberryPiStickers appeared first on Raspberry Pi.

Code your own path-following Lemmings in Python | Wireframe issue 17

By Ryan Lambie

Learn how to create your own obedient lemmings that follow any path put in front of them. Raspberry Pi’s own Rik Cross explains how.

The original Lemmings, first released for the Amiga, quickly spread like a virus to just about every computer and console of the day.

Lemmings

Lemmings is a puzzle-platformer created at DMA Design, first released for the Amiga in 1991. The aim is to guide a number of small lemming sprites to safety, navigating traps and difficult terrain along the way. Left to their own devices, the lemmings will simply follow the path in front of them, but additional ‘special powers’ given to lemmings allow them to (among other things) dig, climb, build, and block in order to create a path to freedom (or to the next level, anyway).

Code your own lemmings

I’ll show you a simple way (using Python and Pygame) in which lemmings can be made to follow the terrain in front of them. The first step is to store the level’s terrain information, which I’ve achieved by using a two-dimensional list to store the colour of each pixel in the background ‘level’ image. In my example, I’ve used the ‘Lemcraft’ tileset by Matt Hackett (of Lost Decade Games) – taken from opengameart.org – and used the Tiled software to stitch the tiles together into a level.

The algorithm we then use can be summarised as follows: check the pixels immediately below a lemming. If the colour of those pixels is the same as the background colour, then the lemming is falling. In this case, move the lemming down by one pixel on the y-axis. If the lemming isn’t falling, then it’s walking. In this case, we need to see whether there is a non-ground, background-coloured pixel in front of the lemming for it to move onto.

Sprites cling to the ground below them, navigating uneven terrain, and reversing direction when they hit an impassable obstacle.

If a pixel is found in front of the lemming (determined by its direction) that is low enough to get to (i.e. lower than its climbheight), then the lemming moves forward on the x-axis by one pixel, and upwards on the y-axis to the new ground level. However, if no suitable ground is found to move onto, then the lemming reverses its direction.

The algorithm is stored as a lemming’s update() method, which is executed for each lemming, each frame of the game. The sample level.png file can be edited, or swapped for another image altogether. If using a different image, just remember to update the level’s BACKGROUND_COLOUR in your code, stored as a (red, green, blue, alpha) tuple. You may also need to increase your lemming climbheight if you want them to be able to navigate a climb of more than four pixels.
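The update() logic described above can be sketched in plain Python, with the level reduced to a 2D list of colour values (here 0 for background, 1 for ground). This is a simplified stand-in for Rik's actual Pygame code, so the class layout and names are illustrative:

```python
BACKGROUND_COLOUR = 0  # stand-in for the level's background colour

class Lemming:
    def __init__(self, x, y, direction=1, climbheight=4):
        self.x, self.y = x, y
        self.direction = direction      # 1 = right, -1 = left
        self.climbheight = climbheight

    def update(self, level):
        # Falling: background directly below means move down one pixel.
        if level[self.y + 1][self.x] == BACKGROUND_COLOUR:
            self.y += 1
            return
        # Walking: find the first background pixel in front of us,
        # scanning upwards from foot level to at most climbheight.
        new_x = self.x + self.direction
        for climb in range(self.climbheight + 1):
            if level[self.y - climb][new_x] == BACKGROUND_COLOUR:
                self.x, self.y = new_x, self.y - climb
                return
        self.direction *= -1            # no suitable ground: turn around
```

Stepping off a ledge falls out of this naturally: the lemming walks forward onto a background pixel, then the falling branch takes over on the next frame.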

There are other things you can do to make a full Lemmings clone. You could try replacing the yellow-rectangle lemmings in my example with pixel-art sprites with their own walk cycle animation (see my article in issue #14), or you could give your lemmings some of the special powers they’ll need to get to safety, achieved by creating flags that determine how lemmings interact with the terrain around them.

Here’s Rik’s code, which gets those path-following lemmings moving about in Python. To get it running on your system, you’ll first need to install Pygame Zero. And to download the full code, go here.

Get your copy of Wireframe issue 17

You can read more features like this one in Wireframe issue 17, available now at Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 17 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Code your own path-following Lemmings in Python | Wireframe issue 17 appeared first on Raspberry Pi.

Really awesome Raspberry Pi 4 X-ray radiographs

By Alex Bate

“I got my Pi in the post yesterday morning and I was desperately waiting until the end of the workday to get home and play with it! Took a few quick radiographs before I left because I had to.”

And we’re really happy that Reddit user xCP23x did. How cool are these?

“I work for a company that makes microfocus X-ray/CT systems!” xCP23x explained in their Reddit post. “Most of the images are from a 225kV system (good down to 3 microns).”

They used a Nikon XT H 225 ST: 225kV 225W X-ray source for the majority of the images. You can learn more about how the images were produced via the comments on their Reddit user page.

You can see the full Reddit post here, and more radiographs of the Raspberry Pi 4 here.

The post Really awesome Raspberry Pi 4 X-ray radiographs appeared first on Raspberry Pi.

Your Back-to-School Bootcamp with our free online training

By Dan Fisher

Are you ready to FEEL THE BURN…of your heating laptop? And to MAX THOSE REPS…using forever loops? Then get your programming muscles into the best shape possible with our free online training courses.

Pump up your programming skills for free

Today we are excited to announce our new online training course Programming with GUIs — now open for sign-ups on FutureLearn. To celebrate, we’ve also curated a set of courses as your personal Back-to-School Bootcamp. Sign up now to start training from Monday 29 July and throughout August!

Scratch Cat and a Python supervising teachers at an outdoor bootcamp

Your Back-to-School Bootcamp has something for beginner, intermediate, and advanced learners, and all the courses are free, thanks to support from Google.

Also keep in mind that all the courses count towards becoming certified through the National Centre for Computing Education.

Couch to 5k…lines of code

If you’re just beginning to learn about coding, the perfect place to start is Programming 101: An Introduction to Python for Educators. You’ll first get to grips with basic programming concepts by learning about the basics of Python syntax and how to interpret error messages. Then you’ll use your new coding skills to create a chatbot that asks and answers questions!

Scratch Cat and a Python doing a relay race

For Primary teachers, our course Scratch to Python: Moving from Block- to Text-based Programming is ideal. Take this course if you’ve been using Scratch and are wondering how to introduce Python to your older students.

If you’ve been programming for a while, sign up for our brand-new course Programming with GUIs — an intermediate-level course that shows you how to build your own graphical user interface (GUI) in Python. You will learn how to incorporate interactivity in your programs, discover different types of GUI features, and build your confidence to design more complex GUI-based apps in the future.

Or maybe you’d like to try Programming 101’s follow-on course Programming 102: Think Like a Computer Scientist? Take your Python skills further by learning to break down problems into smaller tasks and designing algorithms you can apply to data.

Finally, if you’re an experienced computing educator, dig into Object-oriented Programming in Python, a really fun and challenging course that helps you get to grips with OOP principles by creating a text-based adventure game in Python.

Scratch Cat and a Python supervising an outdoors sports activity

Sign-ups are open until the end of August. Now go get those gains!

Tell us about your workout routine

What will your personal coding regime look like this summer? What online courses have you enjoyed taking this year? (They don’t have to be ours!) Tell us in the comments below.

The post Your Back-to-School Bootcamp with our free online training appeared first on Raspberry Pi.

The world’s first Raspberry Pi-powered Twitter-activated jelly bean-pooping unicorn

By Alex Bate

When eight-year-old Tru challenged the Kids Invent Stuff team to build a sparkly, pooping, rainbow unicorn electric vehicle, they did exactly that. And when Kids Invent Stuff, also known as Ruth and Shawn, got in contact with Estefannie Explains it All, their unicorn ended up getting an IoT upgrade…because obviously.

You tweet and the Unicorn poops candy! | Kids Invent Stuff

We bring kids’ inventions to life and this month we teamed up with fellow YouTuber Estefannie (from Estefannie Explains It All https://www.youtube.com/user/estefanniegg SHE IS EPIC!) to modify Tru’s incredible sweet-pooping unicorn to be activated by the internet! Featuring the AMAZING Allen Pan https://www.youtube.com/channel/UCVS89U86PwqzNkK2qYNbk5A (Thanks Allen for your filming and tweeting!)

Kids Invent Stuff

If you’re looking for an exciting, wholesome, wonderful YouTube channel suitable for the whole family, look no further than Kids Invent Stuff. Challenging kids to imagine wonderful inventions based on monthly themes, channel owners Ruth and Shawn then make these kids’ ideas a reality. Much like the Astro Pi Challenge, Kids Invent Stuff is one of those things we adults wish existed when we were kids. We’re not jealous, we’re just…OK, we’re definitely jealous.

ANYWAY, when eight-year-old Tru’s sparkly, pooping, rainbow unicorn won the channel’s ‘crazy new vehicle’ challenge, the team got to work, and the result is magical.

Riding an ELECTRIC POOPING UNICORN! | Kids Invent Stuff

We built 8-year-old Tru’s sparkly, pooping, rainbow unicorn electric vehicle and here’s what happened when we drove it for the first time and pooped out some jelly beans! A massive THANK YOU to our challenge sponsor The Big Bang Fair: https://www.thebigbangfair.co.uk The Big Bang Fair is the UK’s biggest celebration of STEM for young people!

But could a sparkly, pooping, rainbow unicorn electric vehicle ever be enough? Is anything ever enough if it’s not connected to the internet? Of course not. And that’s where Estefannie came in.

At Maker Central in Birmingham earlier this year, the two YouTube teams got together to realise their shared IoT dream.

They ran out of chairs for their panel, so Allen had to improvise

With the help of a Raspberry Pi Zero W connected to the relay built into the unicorn, the team were able to write code that combs through Twitter, looking for mentions of @mythicalpoops. A positive result triggers the Raspberry Pi to activate the relay, and the unicorn lifts its tail to shoot jelly beans at passers-by.

You can definitely tell this project was invented by an eight-year-old, and we love it!

IoT unicorn

As you can see in the video above, the IoT upgrades to the unicorn allow Twitter users to control when the mythical beast poops its jelly beans. There are rumours that the unicorn may be coming to live with us at Pi Towers, but if these turn out to be true, we’ll ensure that this function is turned off. So no tweeting the unicorn!

You know what to do

Be sure to subscribe to both Kids Invent Stuff and Estefannie Explains It All on YouTube. They’re excellent makers producing wonderful content, and we know you’ll love them.

How to milk a unicorn

The post The world’s first Raspberry Pi-powered Twitter-activated jelly bean-pooping unicorn appeared first on Raspberry Pi.

The NEW Official Raspberry Pi Beginner’s Guide: updated for Raspberry Pi 4

By Phil King

To coincide with the launch of Raspberry Pi 4, Raspberry Pi Press has created a new edition of The Official Raspberry Pi Beginner’s Guide book — as if this week wasn’t exciting enough! Weighing in at 252 pages, the book is even bigger than before, and it’s fully updated for Raspberry Pi 4 and the latest version of the Raspbian operating system, Buster.

The Official Raspberry Pi Beginner’s Guide

We’ve roped in Gareth Halfacree, full-time technology journalist and technical author, and the wonderful Sam Alder, illustrator of our incredible cartoons and animations, to put together the only guide you’ll ever need to get started with Raspberry Pi.

From setting up your Raspberry Pi on day one to taking your first steps into coding, digital making, and computing, The Official Raspberry Pi Beginner’s Guide – 2nd Edition is great for users from age 7 to 107! It’s available now online from the Raspberry Pi Press store, with free international delivery, or from the real-life Raspberry Pi Store in Cambridge, UK.

As always, we have also released the guide as a free PDF, and you’ll soon be seeing physical copies on the shelves of Waterstones, Foyles, and other good bookshops.

The post The NEW Official Raspberry Pi Beginner’s Guide: updated for Raspberry Pi 4 appeared first on Raspberry Pi.

IoT community sprinkler system using Raspberry Pi | The MagPi issue 83

By Alex Bate

Saving water, several thousand lawns at a time: The MagPi magazine takes a look at the award-winning IoT sprinkler system of Coolest Projects USA participant Adarsh Ambati.

At any Coolest Projects event, you’re bound to see incredible things built by young makers. At Coolest Projects USA, we had the chance to talk to Adarsh Ambati about his community sprinkler and we were, frankly, amazed.

“The extreme, record-breaking drought in California inspired me to think of innovative ways to save water,” Adarsh tells us. “While going to school in the rain one day, I saw one of my neighbours with their sprinklers on, creating run-offs. Through research, I found that 25% of the water used in an average American household is wasted each day due to overwatering and inefficient watering methods. Thus, I developed a sprinkler system that is compliant with water regulations, to cost-effectively save water for entire neighbourhoods using a Raspberry Pi, moisture sensors, PyOWM (weather database), and by utilising free social media networks like Twitter.”

Efficient watering

In California, it’s very hot year-round, so if you want a lush, green lawn, you need to keep the grass watered. The record-breaking drought Adarsh was referring to resulted in extreme limitations on how much you could water your grass. The problem is, unless you have a very expensive sprinkler system, it’s easy to water the grass when it doesn’t need to be.

“The goal of my project is to save water wasted during general-purpose landscape irrigation of an entire neighbourhood by building a moisture sensor-based smart sprinkler system that integrates real-time weather forecast data to provide only optimum levels of water required,” Adarsh explains. “It will also have Twitter capabilities that will be able to publish information about when and how long to turn on the sprinklers, through the social networks. The residents in the community will subscribe to this information by following an account on Twitter, and utilise it to prevent water wasted during general-purpose landscaping and stay compliant with water regulations imposed in each area.”

Using the Raspberry Pi, Adarsh was able to build a prototype for about $50 — a lot cheaper than smart sprinklers you can currently buy on the market.

“I piloted it with ten homes, so the cost per home is around $5,” he reveals. “But since it has the potential to serve an entire community, the cost per home can be a few cents. For example, there are about 37000 residents in Almaden Valley, San Jose (where I live). If there is an average of two to four residents per home, there should be 9250 to 18500 homes. If I strategically place ten such prototypes, the cost per house would be five cents or less.”

Massive saving

Adarsh continues, “Based on two months of data, 83% of the water used for outdoor landscape watering can be saved. The average household in northern California uses 100 gallons of water for outdoor landscaping on a daily basis. The ten homes in my pilot had the potential to save roughly 50000 gallons over a two-month period, or 2500 gallons per month per home. At $0.007 per gallon, the savings equate to $209 per year, per home. For Almaden Valley alone, we have the potential to save around $2m to $4m per year!”
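The arithmetic behind those figures checks out, and can be verified in a few lines (the constants are taken from the quote above; a 30-day month and the ten-home, two-month pilot are assumed):

```python
# Checking the savings figures quoted above.
gallons_per_day = 100      # daily outdoor landscaping use per home
saved_fraction = 0.83      # 83% of that water can be saved
price_per_gallon = 0.007   # dollars

saved_per_month = gallons_per_day * saved_fraction * 30   # per home
pilot_total = saved_per_month * 10 * 2                    # 10 homes, 2 months
annual_saving = saved_per_month * 12 * price_per_gallon   # dollars per home
print(round(saved_per_month), round(pilot_total), round(annual_saving))
# prints: 2490 49800 209
```

That is roughly the 2,500 gallons per month per home, ~50,000-gallon pilot total, and $209 per year quoted in the article.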

The results from Adarsh’s test were presented to the San Jose City Council, and they were so impressed they’re now considering putting similar systems in their public grass areas. Oh, and he also won the Hardware project category at Coolest Projects USA.

The MagPi magazine #83

This article is from today’s brand-new issue of The MagPi, the official Raspberry Pi magazine. Buy it from all good newsagents, subscribe to pay less per issue and support our work, or download the free PDF to give it a try first.

The post IoT community sprinkler system using Raspberry Pi | The MagPi issue 83 appeared first on Raspberry Pi.

Raspberry Pi 4: 48 hours later

By Alex Bate

“We’ve never felt more betrayed and excited at the same time,” admitted YouTubers 8 Bits and a Byte when I told them Raspberry Pi 4 would be out in June, going against rumours of the release happening at some point in 2020. Fortunately, everything worked in our favour, and we were able to get our new product out ahead of schedule.

So, while we calm down from the hype of Monday, here’s some great third-party content for you to get your teeth into.

YouTubers

A select few online content creators were allowed to get their hands on Raspberry Pi 4 before its release date, and they published some rather wonderful videos on the big day.

Office favourite Explaining Computers provided viewers with a brilliant explanation of the ins and outs of Raspberry Pi 4, and even broke their usual Sunday-only release schedule to get the video out to fans on launch day. Thanks, Chris!

Raspberry Pi 4 Model B

Raspberry Pi 4B review, including the hardware specs of this new single board computer, and a demo running the latest version of Raspbian. With thanks to the Raspberry Pi Foundation for supplying the board featured in this video.

Blitz City DIY offered viewers a great benchmark test breakdown, delving deeper into the numbers and what they mean, to show the power increase compared to Raspberry Pi 3B+.

A Wild Raspberry Pi 4 Appears: Hardware Specs, Benchmarks & First Impressions

The Raspberry Pi 4 B has been released into the wild much earlier than anticipated. I was able to receive a review sample so here are the hardware specs, some benchmarks comparing it to the Pi 3 B and Pi 3 B+ and finally some first impressions.

Curious about how these creators were able to get their hands on Raspberry Pi 4 prior to its release? This is legitimately how Estefannie bagged herself the computer pre-launch. Honest.

HOW I GOT A RASPBERRY PI 4 BEFORE ITS RELEASE

I needed a new Raspberry Pi. FIND ME HERE: * http://www.estefannie.com * http://instagram.com/estefanniegg * http://www.twitter.com/estefanniegg * https://github.com/estefanniegg * https://facebook.com/estefanniegg

For their launch day video, Dane and Nicole, AKA 8 Bits and a Byte, built a pi-calculating pie that prints pies using a Raspberry Pi 4. Delicious.

The new Raspberry Pi 4 – Highlights & Celebration Project!

There’s a new Raspberry Pi, the Raspberry Pi 4! We give you a quick overview and build a project to welcome the Raspberry Pi 4 to the world!

Alex from Low Spec Gamer took his Raspberry Pi 4 home with him after visiting the office to talk to Eben. Annoyingly, I was away on vacation and didn’t get to meet him :(

Raspberry Pi 4 Hands-on. I got an early unit!

Watch the best documentaries on Curiosity Stream: https://curiositystream.com/lowspecgamer #RaspberryPi4 #HandsOn #Preview A new Raspberry Pi joins the fray. I got an early Raspberry Pi 4 and decided to explore some of its differences with Eben Upton, founder of Raspberry Pi. All benchmarks run on an early version of the new raspbian.

The MagPi magazine managed to collar Raspberry Pi Trading’s COO James Adams for their video, filmed at the Raspberry Pi Store in Cambridge.

Introducing Raspberry Pi 4! + interview with a Raspberry Pi engineer

The brand new Raspberry Pi 4 is here! With up to 4GB of RAM, 4K HDMI video, Gigabit Ethernet, USB 3.0, and USB C, it is the ultimate Raspberry Pi. We talk to Raspberry Pi hardware lead James Adams about its amazing performance.

Some rather lovely articles

If you’re looking to read more about Raspberry Pi 4 and don’t know where to start, here are a few tasty treats to get you going:

Raspberry Pi 4 isn’t the only new thing to arrive this week. Raspbian Buster is now available for Raspberry Pi, and you can read more about it here.

Join the Raspberry Pi 4 conversation by using #RaspberryPi4 across all social platforms, and let us know what you plan to do with your new Raspberry Pi.

The post Raspberry Pi 4: 48 hours later appeared first on Raspberry Pi.

Buster – the new version of Raspbian

By Simon Long

Amid all the furore about the release of a certain new piece of hardware, some people may have missed that we have also released a new version of Raspbian. While this is required for Raspberry Pi 4, we’ve always tried to maintain software backwards-compatibility with older hardware, and so the standard Raspbian image for all models of Raspberry Pi is now based on Buster, the latest version of Debian Linux.

Why Buster?

The first thing to mention about Buster (who was the actual dog in Pixar’s “Toy Story” films, as opposed to the toy one made out of a Slinky…) is that we are actually releasing it slightly in advance of the official Debian release date. The reason for this is that one of the important new features of Raspberry Pi 4 is that the open-source OpenGL video driver is now being used by default, and this was developed using the most recent version of Debian. It would have been a lot of work to port everything required for it back on to Raspbian Stretch, so we decided that we would launch on Raspbian Buster – the only question was whether Buster would be ready before the hardware was!

As it turns out, it wasn’t – not quite. The official launch date for Buster is July 7, so we are a couple of weeks ahead. That said, Buster has been in a “frozen” state for a couple of months now, with only minor changes being made to it, so the version we are releasing is pretty much identical to that which will be officially released by Debian on July 7.

We started using Buster internally in January this year, so it has had a lot of testing on Pi – while we may be releasing it a bit early, you need have no concerns about using it; it’s stable and robust, and you can use apt to update with any changes that do happen between now and July 7 without needing to reinstall everything.

What’s new?

There are no huge differences between Debian Stretch and Debian Buster. In a sad reflection of the way the world is nowadays, most of the differences are security changes designed to make Buster harder to hack. Any other differences are mostly small incremental changes that most people won’t notice, and this got us thinking…

When we moved from Jessie to Stretch, many people commented that they couldn’t actually see any difference between the two – as most of the changes were “under the hood”, the desktop and applications all looked the same. So we told people “you’ve now got Stretch!” and they said “so what?”

The overall appearance of the desktop hasn’t changed significantly for a few years, and was starting to look a bit dated, so we thought it would be nice to give the appearance a mild refresh for Buster. Then people would at least be able to see that their shiny new operating system looked different from the old one!

The new appearance

There has been a definite trend in the design of most computer graphical user interfaces over recent years to simplify and declutter; to reduce the amount of decoration, so that a button becomes a plain box rather than something that resembles a physical button. You can see this in both desktop OSes like Windows, and in mobile OSes like iOS – so we decided it was time to do something similar.

The overall appearance of most of the interface elements has been simplified; we’ve reduced things like the curvature of corners and the shading gradients which were used to give a pseudo-3D effect to things like buttons. This “flatter” design looks cleaner and more modern, but it’s a bit of a juggling act; it’s very easy to go too far and to make things look totally flat and boring, so we’ve tried to avoid that. Eben and I have had a mild tussle over this – he wanted as much flatness as possible, and I wanted to retain at least a bit of curvature, so we’ve met somewhere in the middle and produced something we both like!

We’ve also changed the default desktop picture to one of Greg Annandale’s gorgeous photographs, and we’ve moved to a grey highlight colour.

(If you really don’t like the new appearance, it is easy enough to restore the former appearance – the old desktop picture is still installed, as is the old UI theme.)

Other changes

We’ve been including the excellent Thonny Python development environment in Raspbian for some time now. In this release, it’s now our default Python editor, and to that end, we are no longer including IDLE by default. IDLE has always felt dated and not very pleasant to use, and Thonny is so much nicer that we’d strongly recommend moving to it, if you haven’t already!

(If you’d like an alternative to Thonny, the Mu Python IDE is also still available in Recommended Software.)

We’ve made some small tweaks to the taskbar. The ‘eject’ icon for removing USB devices is now only shown if you have devices to eject; it’s hidden the rest of the time. Similarly, if you are using one of the earlier Pis without Bluetooth support, the Bluetooth icon is now hidden rather than being greyed out. Also, the CPU activity gauge is no longer shown on the taskbar by default, because this has become less necessary on the more powerful recent Raspberry Pi models. If you’d still like to use it, you can add it back – right-click the taskbar and choose ‘Add / Remove Panel Items’. Press the ‘Add’ button and you’ll find it listed as ‘CPU Usage Monitor’. While you are in there, you’ll also find the new ‘CPU Temperature Monitor’, which you can add if you’re interested in knowing more about what the CPU is up to.

One program which is currently missing from Buster is Mathematica. Don’t worry – this is only a temporary removal! Wolfram are working on getting Mathematica to work properly with Buster, and as soon as it is ready, it’ll be available for installation from Recommended Software.

A few features of the old non-OpenGL video driver (such as pixel doubling and underscan) are not currently supported by the new OpenGL driver, so the settings for these are hidden in Raspberry Pi Configuration if the GL driver is in use. (The GL driver is the default on Raspberry Pi 4 – older Pis will still use the non-GL driver by default. Also, if using a Raspberry Pi 4 headless, we recommend switching back to the non-GL driver – choose ‘Legacy’ under the ‘GL Driver’ setting in ‘Advanced Options’ in raspi-config.)

If the GL driver is in use, there’s a new ‘Screen Configuration’ tool – this enables you to set up the arrangement of multiple monitors on a Raspberry Pi 4. It can also be used to set custom monitor resolutions, which can be used to simulate the effect of pixel doubling.

Finally, there are a couple of new buttons in ‘Raspberry Pi Configuration’ which control video output options for Raspberry Pi 4. (These are not shown when running on earlier models of Raspberry Pi.) It is not possible on the Raspberry Pi 4 to have both analogue composite video (over the 3.5mm jack) and HDMI output simultaneously, so the analogue video output is disabled by default. 4Kp60 resolution over HDMI is also disabled by default, as this requires faster clock speeds resulting in a higher operating temperature and greater power consumption. The new buttons enable either of these options to be enabled as desired.

How do I get it?

As ever with major version changes, our recommendation is that you download a new clean image from the usual place on our site – this will ensure that you are starting from a clean, working Buster system.

We do not recommend upgrading an existing Stretch (or earlier) system to Buster – we can’t know what changes everyone has made to their system, and so have no idea what may break when you move to Buster. However, we have tested the following procedure for upgrading, and it works on a clean version of the last Stretch image we released. That does not guarantee it will work on your system, and we cannot provide support (or be held responsible) for any problems that arise if you try it. You have been warned – make a backup!

1. In the files /etc/apt/sources.list and /etc/apt/sources.list.d/raspi.list, change every use of the word “stretch” to “buster”.
2. In a terminal,

sudo apt update

and then

sudo apt dist-upgrade

3. Wait for the upgrade to complete, answering ‘yes’ to any prompt. There may also be a point at which the install pauses while a page of information is shown on the screen – hold the ‘space’ key to scroll through all of this and then hit ‘q’ to continue.
4. The update will take anywhere from half an hour to several hours, depending on your network speed. When it completes, reboot your Raspberry Pi.
5. When the Pi has rebooted, launch ‘Appearance Settings’ from the main menu, go to the ‘Defaults’ tab, and press whichever ‘Set Defaults’ button is appropriate for your screen size in order to load the new UI theme.
6. Buster will have installed several new applications which we do not support. To remove these, open a terminal window and

sudo apt purge timidity lxmusic gnome-disk-utility deluge-gtk evince wicd wicd-gtk clipit usermode gucharmap gnome-system-tools pavucontrol
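The find-and-replace in step 1 can be done by hand in a text editor, or scripted. Here is a minimal Python sketch of that step only (file paths as listed in step 1; the helper name `retarget` is ours, and you would need to run it as root):

```python
import shutil

# The two apt source lists named in step 1.
SOURCES = [
    "/etc/apt/sources.list",
    "/etc/apt/sources.list.d/raspi.list",
]

def retarget(path, old="stretch", new="buster"):
    """Rewrite one apt source list, keeping a .bak backup first."""
    shutil.copy2(path, path + ".bak")   # back up before touching it
    with open(path) as f:
        text = f.read()
    with open(path, "w") as f:
        f.write(text.replace(old, new))

# As root:  for src in SOURCES: retarget(src)
```

This only edits the source lists; you still run the `apt` commands in steps 2 onwards yourself.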

We hope that Buster gives a little hint of shiny newness for those of you who aren’t able to get your hands on a Raspberry Pi 4 immediately! As ever, your feedback is welcome – please leave your comments below.

The post Buster – the new version of Raspbian appeared first on Raspberry Pi.

Raspberry Pi 4 on sale now from $35

Par Eben Upton

We have a surprise for you today: Raspberry Pi 4 is now on sale, starting at $35. This is a comprehensive upgrade, touching almost every element of the platform. For the first time we provide a PC-like level of performance for most users, while retaining the interfacing capabilities and hackability of the classic Raspberry Pi line.

Raspberry Pi 4: your new $35 computer


Get yours today from our Approved Resellers, or from the Raspberry Pi Store in Cambridge, open today 8am–8pm!

Raspberry Pi 4 Model B

Here are the highlights:

  • A 1.5GHz quad-core 64-bit ARM Cortex-A72 CPU (~3× performance)
  • 1GB, 2GB, or 4GB of LPDDR4 SDRAM
  • Full-throughput Gigabit Ethernet
  • Dual-band 802.11ac wireless networking
  • Bluetooth 5.0
  • Two USB 3.0 and two USB 2.0 ports
  • Dual monitor support, at resolutions up to 4K
  • VideoCore VI graphics, supporting OpenGL ES 3.x
  • 4Kp60 hardware decode of HEVC video
  • Complete compatibility with earlier Raspberry Pi products

And here it is in the flesh:

Still a handsome devil

Raspberry Pi 4 memory options

This is the first time we’re offering a choice of memory capacities. We’ve gone for the following price structure, retaining our signature $35 price for the entry-level model:

RAM Retail price
1GB $35
2GB $45
4GB $55

As always these prices exclude sales tax, import duty (where appropriate), and shipping. All three variants are launching today: we have initially built more of the 2GB variant than of the others, and will adjust the mix over time as we discover which one is most popular.

New Raspberry Pi 4, new features

At first glance, the Raspberry Pi 4 board looks very similar to our previous $35 products, all the way back to 2014’s Raspberry Pi 1B+. James worked hard to keep it this way, but for the first time he has made a small number of essential tweaks to the form factor to accommodate new features.

Power

We’ve moved from USB micro-B to USB-C for our power connector. This supports an extra 500mA of current, ensuring we have a full 1.2A for downstream USB devices, even under heavy CPU load.

An extra half amp, and USB OTG to boot

Video

To accommodate dual display output within the existing board footprint, we’ve replaced the type-A (full-size) HDMI connector with a pair of type-D (micro) HDMI connectors.

Seeing double

Ethernet and USB

Our Gigabit Ethernet magjack has moved to the top right of the board, from the bottom right, greatly simplifying PCB routing. The 4-pin Power-over-Ethernet (PoE) connector remains in the same location, so Raspberry Pi 4 remains compatible with the PoE HAT.

Through the looking glass

The Ethernet controller on the main SoC is connected to an external Broadcom PHY over a dedicated RGMII link, providing full throughput. USB is provided via an external VLI controller, connected over a single PCI Express Gen 2 lane, and providing a total of 4Gbps of bandwidth, shared between the four ports.

All three connectors on the right-hand side of the board overhang the edge by an additional millimetre, with the aim of simplifying case design. In all other respects, the connector and mounting hole layout remains the same, ensuring compatibility with existing HATs and other accessories.

New Raspbian software

To support Raspberry Pi 4, we are shipping a radically overhauled operating system, based on the forthcoming Debian 10 Buster release. This brings numerous behind-the-scenes technical improvements, along with an extensively modernised user interface, and updated applications including the Chromium 74 web browser. Simon will take an in-depth look at the changes in tomorrow’s blog post, but for now, here’s a screenshot of it in action.

Raspbian Buster desktop

Some advice for those who are keen to get going with Raspbian Buster right away: we strongly recommend you download a new image, rather than upgrading an existing card. This ensures that you’re starting with a clean, working Buster system. If you really, really want to try upgrading, make a backup first.

One notable step forward is that for Raspberry Pi 4, we are retiring the legacy graphics driver stack used on previous models. Instead, we’re using the Mesa “V3D” driver developed by Eric Anholt at Broadcom over the last five years. This offers many benefits, including OpenGL-accelerated web browsing and desktop composition, and the ability to run 3D applications in a window under X. It also eliminates roughly half of the lines of closed-source code in the platform.

New Raspberry Pi 4 accessories

Connector and form-factor changes bring with them a requirement for new accessories. We’re sensitive to the fact that we’re requiring people to buy these: Mike and Austin have worked hard to source good-quality, cost-effective products for our reseller and licensee partners, and to find low-cost alternatives where possible.

Raspberry Pi 4 Case

Gordon has been working with our design partners Kinneir Dufort and manufacturers T-Zero to develop an all-new two-part case, priced at $5.

New toy, new toy box

We’re very pleased with how this has turned out, but if you’d like to re-use one of our existing cases, you can simply cut away the plastic fins on the right-hand side and omit one of the side panels as shown below.

Quick work with a Dremel

Raspberry Pi 4 Power Supply

Good, low-cost USB-C power supplies (and USB-C cables) are surprisingly hard to find, as we discovered when sending out prototype units to alpha testers. So we worked with Ktec to develop a suitable 5V/3A power supply; this is priced at $8, and is available in UK (type G), European (type C), North American (type A) and Australian (type I) plug formats.

Behold the marvel that is BS 1363

If you’d like to re-use a Raspberry Pi 3 Official Power Supply, our resellers are offering a $1 adapter which converts from USB micro-B to USB-C. The thick wires and good load-step response of the old official supply make this a surprisingly competitive solution if you don’t need a full 3 amps.

Somewhat less marvellous, but still good

Raspberry Pi 4 micro HDMI Cables

Again, low-cost micro HDMI cables which reliably support the 6Gbps data rate needed for 4Kp60 video can be hard to find. We like the Amazon Basics cable, but we’ve also sourced a 1m cable, which will be available from our resellers for $5.

Official micro HDMI to HDMI cable

Updated Raspberry Pi Beginner’s Guide

At the end of last year, Raspberry Pi Press released the Official Raspberry Pi Beginner’s Guide. Gareth Halfacree has produced an updated version, covering the new features of Raspberry Pi 4 and our updated operating system.

Little computer people

Raspberry Pi 4 Desktop Kit

Bringing all of this together, we’re offering a complete Desktop Kit. This is priced at $120, and comprises:

  • A 4GB Raspberry Pi 4
  • An official case
  • An official PSU
  • An official mouse and keyboard
  • A pair of HDMI cables
  • A copy of the updated Beginner’s Guide
  • A pre-installed 32GB microSD card

Raspberry Pi Desktop Kit

Raspberry Pi Store

This is the first product launch following the opening of our store in Cambridge, UK. For the first time, you can come and buy Raspberry Pi 4 directly from us, today. We’ll be open from 8am to 8pm, with units set up for you to play with and a couple of thousand on hand for you to buy. We even have some exclusive launch-day swag.

The Raspberry Pi Store sign

Form an orderly line

If you’re in the bottom right-hand corner of the UK, come on over and check it out!

New Raspberry Pi silicon

Since we launched the original Raspberry Pi in 2012, all our products have been based on 40nm silicon, with performance improvements delivered by adding progressively larger in-order cores (Cortex-A7, Cortex-A53) to the original ARM11-based BCM2835 design. With BCM2837B0 for Raspberry Pi 3B+ we reached the end of that particular road: we could no longer afford to toggle more transistors within our power budget.

Raspberry Pi 4 is built around BCM2711, a complete re-implementation of BCM283X on 28nm. The power savings delivered by the smaller process geometry have allowed us to replace Cortex-A53 with the much more powerful, out-of-order, Cortex-A72 core; this can execute more instructions per clock, yielding performance increases over Raspberry Pi 3B+ of between two and four times, depending on the benchmark.

We’ve taken advantage of the process change to overhaul many other elements of the design. We moved to a more modern memory technology, LPDDR4, tripling available bandwidth; we upgraded the entire display pipeline, including video decode, 3D graphics and display output to support 4Kp60 (or dual 4Kp30) throughput; and we addressed the non-multimedia I/O limitations of previous devices by adding on-board Gigabit Ethernet and PCI Express controllers.

Raspberry Pi 4 FAQs

We’ll keep updating this list over the next couple of days, but here are a few to get you started.

Wait, is it 2020 yet?

In the past, we’ve indicated 2020 as a likely introduction date for Raspberry Pi 4. We budgeted time for four silicon revisions of BCM2711 (A0, B0, C0, and C1); in comparison, we ship BCM2835C2 (the fifth revision of that design) on Raspberry Pi 1 and Zero.

Fortunately, 2711B0 has turned out to be production-ready, which has taken roughly 9–12 months out of the schedule.

Are you discontinuing earlier Raspberry Pi models?

No. We have a lot of industrial customers who will want to stick with the existing products for the time being. We’ll keep building these models for as long as there’s demand. Raspberry Pi 1B+, 2B, 3B, and 3B+ will continue to sell for $25, $35, $35, and $35 respectively.

What about a Model A version?

Historically, we’ve produced cut-down, lower-cost, versions of some of our $35 products, including Model 1A+ in 2014, and Model 3A+ at the end of last year. At present we haven’t identified a sensible set of changes to allow us to do a “Model 4A” product at significantly less than $35. We’ll keep looking though.

What about the Compute Module?

CM1, CM3, and CM3+ will continue to be available. We are evaluating options for producing a Compute Module product based on the Raspberry Pi 4 chipset.

Are you still using VideoCore?

Yes. VideoCore 3D is the only publicly documented 3D graphics core for ARM‑based SoCs, and we want to make Raspberry Pi more open over time, not less.

Credits

A project like Raspberry Pi 4 is the work of many hundreds of people, and we always try to acknowledge some of those people here.

This time round, particular credit is due to James Adams, who designed the board itself (you’ll find his signature under the USB 3.0 socket); to Mike Buffham, who ran the commercial operation, working with suppliers, licensees, and resellers to bring our most complicated product yet to market; and to all those at Raspberry Pi and Broadcom who have worked tirelessly to make this product a reality over the last few years.

A partial list of others who made major direct contributions to the BCM2711 chip program, CYW43455, VL805, and MxL7704 integrations, DRAM qualification, and Raspberry Pi 4 itself follows:

James Adams, Cyrus Afghahi, Umesh Agalgave, Snehil Agrawal, Sam Alder, Kiarash Amiri, Andrew Anderson, Eng Lim Ang, Eric Anholt, Greg Annandale, Satheesh Appukuttan, Vaibhav Ashtikar, Amy Au, Ben Avison, Matt Bace, Neil Bailey, Jock Baird, Scott Baker, Alix Ball, Giles Ballard, Paul Barnes, Russell Barnes, Fiona Batchelor, Alex Bate, Kris Baxter, Paul Beech, Michael Belhazy, Jonathan Bell, John Bellairs, Oguz Benderli, Doug Berger, Ron Berthiaume, Raj Bharadwaj, Udaya Bhaskar, Geoff Blackman, Ed Bleich, Debbie Brandenburg, David Brewer, Daniel Brierton, Adam Brown, Mike Buffham, Dan Caley, Mark Calleja, Rob Canaway, Cindy Cao, Victor Carmon, Ian Carter, Alex Carter, Amy Carter, Mark Castruita, KK Chan, Louis Chan, Nick Chase, Sherman Chen, Henry Chen, Yuliang Cheng, Chun Fai Cheung, Ravi Chhabra, Scott Clark, Tim Clifford, Nigel Clift, Dom Cobley, Steve Cole, Philip Colligan, Stephen Cook, Sheena Coote, Sherry Coutu, John Cowan-Hughes, John Cox, Peter Coyle, Jon Cronk, Darryl Cross, Steve Dalton, Neil Davies, Russell Davis, Tom De Vall, Jason Demas, Todd DeRego, Ellie Dobson, David Doyle, Alex Eames, Nicola Early, Jeff Echtenkamp, Andrew Edwards, Kevin Edwards, Phil Elwell, Dave Emett, Jiin Taur Eng, Gabrielle England, YG Eom, Peggy Escobedo, Andy Evans, Mark Evans, Florian Fainelli, David Ferguson, Ilan Finkelstein, Nick Francis, Liam Fraser, Ian Furlong, Nachiket Galgali, David Gammon, Jan Gaterman, Eric Gavami, Doug Giles, Andrew Goros, Tim Gover, Trevor Gowen, Peter Green, Simon Greening, Tracey Gregory, Efim Gukovsky, Gareth Halfacree, Mark Harris, Lucy Hattersley, James Hay, Richard Hayler, Gordon Henderson, Leon Hesch, Albert Hickey, Kevin Hill, Stefan Ho, Andrew Hoare, Lewis Hodder, William Hollingworth, Gordon Hollingworth, Michael Horne, Wanchen Hsu, David Hsu, Kevin YC Huang, Pei Huang, Peter Huang, Scofield Huang, James Hughes, Andy Hulbert, Carl Hunt, Rami Husni, Steven Hwang, Incognitum, Bruno Izern, Olivier Jacquemart, Mini Jain, Anurag Jain, Anand 
Jain, Geraint James, Dinesh Jayabharathi, Vinit Jayaraj, Nick Jeffery, Mengjie Jiang, David John, Alison Johnston, Lily Jones, Richard Jones, Tony Jones, Gareth Jones, Lijo Jose, Nevin Jose, Gary Kao, Gary Keall, Gerald Kelly, Ian Kersley, Gerard Khoo, Dani Kidouchim, Phil King, Andreas Knobloch, Bahar Kordi-Borojeni, Shuvra Kundu, Claire Kuo, Nicole Kuo, Wayne Kusumo, Koen Lampaert, Wyn Landon, Trever Latham, William Lee, Joon Lee, William Lee, Dave Lee, Simon Lewis, David Lewsey, Sherman Li, Xizhe Li, Jay Li, John CH Lin, Johan Lin, Jonic Linley, Chris Liou, Lestin Liu, Simon Long, Roy Longbottom, Patrick Loo, James Lougheed, Janice Lu, Fu Luo-Larson, Jeff Lussier, Helen Lynn, Terence Mackown, Neil MacLeod, Kevin Malone, Shahin Maloyan, Tim Mamtora, Stuart Martin, Simon Martin, Daniel Mason, Karen Matulis, Andrea Mauri, Scott McGregor, Steven Mcninch, Ben Mercer, Kamal Merchant, James Mills, Vassil Mitov, Ali Syed Mohammed, Brendan Moran, Alan Morgan, Giorgia Muirhead, Fiacre Muller, Aram Nahidipour, Siew Ling Ng, Thinh Nguyen, Lee Nguyen, Steve Noh, Paul Noonan, Keri Norris, Rhian Norris, Ben Nuttall, Brian O’Halloran, Martin O’Hanlon, Yong Oh, Simon Oliver, Mandy Oliver, Emma Ormond, Shiji Pan, Kamlesh Pandey, Christopher Pasqualino, Max Passell, Naush Patuck, Rajesh Perruri, Eric Phiri, Dominic Plunkett, Nutan Raj, Karthik Rajendran, Rajendra Ranmale, Murali Rangapuram, Ashwin Rao, Nick Raptopoulos, Chaitanya Ray, Justin Rees, Hias Reichl, Lorraine Richards, David Richardson, Tim Richardson, Dan Riiff, Peter de Rivaz, Josh Rix, Alwyn Roberts, Andrew Robinson, Kevin Robinson, Nigel Roles, Paul Rolfe, Marcelo Romero, Jonathan Rosenfeld, Sarah Roth, Matt Rowley, Matthew Rowley, Dave Saarinen, Ali Salem, Suzie Sanders, Graham Sanderson, Aniruddha Sane,
Andrew Scheller, Marion Scheuermann, Serge Schneider, Graham Scott, Marc Scott, Saran Kumar Seethapathi, Shawn Shadburn, Abdul Shaik, Mark Skala, Graham Smith, Michael Smith, Martin Sperl, Ajay Srivastava, Nick Steele, Ben Stephens, Dave Stevenson, Mike Stimson, Chee Siong Su, Austin Su, Prem Swaroop, Grant Taylor, Daniel Thompsett, Stuart Thomson, Eddie Thorn, Roger Thornton, Chris Tomlinson, Stephen Toomey, Mohamed Toubella, Frankie Tsai, Richard Tuck, Mike Unwin, Liz Upton, Manoj Vajhallya, Sandeep Venkatadas, Divya Vittal, John Wadsworth, Stefan Wahren, Irene Wang, Jeremy Wang, Rich Wells, Simon West, Joe Whaley, Craig Wightman, Oli Wilkin, Richard Wilkins, Sarah Williams, Jack Willis, Rob Wilson, Luke Wren, Romona Wu, Zheng Xu, Paul Yang, Pawel Zackiewicz, Ling Zhang, Jean Zhou, Ulf Ziemann, Rob Zwetsloot.

If you’re not on this list and think you should be, please let me know, and accept my apologies.

The post Raspberry Pi 4 on sale now from $35 appeared first on Raspberry Pi.

Steampunk-inspired Raspberry Pi enclosure | HackSpace magazine #20

Par Andrew Gregory

Who doesn’t like a good-looking case for their Raspberry Pi?

Exactly.

We’ve seen many homemade cases over the years, from 3D-printed enclosures to LEGO, Altoid tins, and gravity-defying Zelda-themed wonderments. We love them all as much as we love our own — the official case, if you fancy one — and always look forward to seeing more.

Cue this rather fancy steampunk-inspired enclosure made by Erich Styger, as featured in the latest issue of HackSpace magazine.

The magazine states:

This steampunk enclosure for the Raspberry Pi by Erich Styger was laser-cut out of 4 mm birch plywood, and stained to make it look a bit more 1890s. It’s built to fit a Raspberry Pi with an NXP tinyK22 board and a battery backup, and there are ports artfully crafted into it so that the system is fully functional even when the box is closed.

Those gears aren’t just for show: turn the central wheel on the front of the box to open the enclosure and get access to the electronics inside.

Cool, right?

What cases have you made for your Raspberry Pi? Let us know in the comments, or by tagging @Raspberry_Pi and @HackSpaceMag on Twitter.

HackSpace magazine is out now

You can read the rest of this feature in HackSpace magazine issue 20, out today in Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy HackSpace mag directly from us — worldwide delivery is available. And if you’d like to own a handy digital version of the magazine, you can also download a free PDF.

The post Steampunk-inspired Raspberry Pi enclosure | HackSpace magazine #20 appeared first on Raspberry Pi.

Recreate the sprite-following Options from Gradius using Python | Wireframe issue 16

Par Ryan Lambie

Learn how to create game objects that follow the path of the main player sprite. Raspberry Pi’s own Rik Cross explains all.

Options first appeared in 1985’s Gradius, but became a mainstay of numerous sequels and spin-offs, including the Salamander and Parodius series of games.

Gradius

First released by Konami in 1985, Gradius pushed the boundaries of the shoot-’em-up genre with its varied level design, dramatic boss fights, and innovative power-up system.

One of the most memorable of its power-ups was the Option — a small, drone-like blob that followed the player’s ship and effectively doubled its firepower.

By collecting more power-ups, it was possible to gather a cluster of death-dealing Options, which obediently moved wherever the player moved.

Recreate sprite-following in Python

There are a few different ways of recreating Gradius’ sprite-following, but in this article, I’ll show you a simple implementation that uses the player’s ‘position history’ to place other following items on the screen. As always, I’ll be using Python and Pygame to recreate this effect, and I’ll be making use of a spaceship image created by ‘pitrizzo’ from opengameart.org.

The first thing to do is to create a spaceship and a list of ‘power-up’ objects. Storing the power-ups in a list allows us to perform a simple calculation on a power-up to determine its position, as you’ll see later. As we’ll be iterating through the power-ups stored in a list, there’s no need to create a separate variable for each. Instead, we can use list comprehension to create the power-ups:

powerups = [Actor('powerup') for p in range(3)]

The player’s position history will be a list of previous positions, stored as a list of (x,y) tuples. Each time the player’s position changes, the new position is added to the front of the list (as the new first element). We only need to know the spaceship’s recent position history, so the list is also truncated to only contain the 100 most recent positions. Although not necessary, the following code can be added to allow you to see a selection (in this case every fifth) of these previous positions:

for p in previouspositions[::5]:
    screen.draw.filled_circle(p, 2, (255, 0, 0))

Plotting the spaceship’s position history.
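The position history itself is easy to maintain. A sketch of the per-frame update described above (the name `previouspositions` follows the article's code; the helper function is ours):

```python
# Per-frame update of the player's position history: the newest
# position goes at the front, and the list is truncated so that
# only the 100 most recent positions are kept.
HISTORY_LENGTH = 100

def update_history(previouspositions, x, y):
    previouspositions.insert(0, (x, y))      # newest first
    del previouspositions[HISTORY_LENGTH:]   # keep the most recent 100
    return previouspositions
```

In the game itself, this would be called once per frame with the spaceship's current coordinates.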

Each frame of the game, this position list is used to place each of the power-ups. In our Gradius-like example, we need each of these objects to follow the player’s spaceship in a line, as if moving together in a single-file queue. To achieve this effect, a power-up’s position is determined by its position in the power-ups list, with the first power-up in the list taking up a position nearest to the player. In Python, using enumerate when iterating through a list allows us to get the power-up’s position in the list, which can then be used to determine which position in the player’s position history to use.

newposition = previouspositions[(i+1)*20]

So, the first power-up in the list (element 0 in the list) is placed at the coordinates of the twentieth ((0+1)*20) position in the spaceship’s history, the second power-up at the fortieth position, and so on. Using this simple calculation, elements are equally spaced along the spaceship’s previous path. The only thing to be careful of here is that you have enough items in the position history for the number of items you want to follow the player!
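Putting this together, the per-frame placement loop sketched from the description above (using the article's names; the constant `SPACING` and the function wrapper are ours) would look something like:

```python
SPACING = 20  # history entries between consecutive power-ups

def place_powerups(powerups, previouspositions):
    # Each power-up takes the position the ship occupied
    # (i+1)*SPACING frames ago, so they trail in single file.
    for i, powerup in enumerate(powerups):
        powerup.pos = previouspositions[(i + 1) * SPACING]
```

With three power-ups, this reads history entries 20, 40, and 60, matching the spacing described above.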

Power-ups following a player sprite, using the player’s position history.

This leaves one more question to answer: where do we place these power-ups initially, when the spaceship has no position history? There are a few different ways of solving this problem, but the simplest is just to generate a fictitious position history at the beginning of the game. As I want power-ups to be lined up behind the spaceship initially, I again used list comprehension to generate a list of 100 positions with ever-decreasing x-coordinates.

previouspositions = [(spaceship.x - i*spaceship.speed,spaceship.y) for i in range(100)]

With an initial spaceship position of (400,400) and a spaceship.speed of 4, this means the list will initially contain the following coordinates:

previouspositions = [(400,400),(396,400),(392,400),(388,400),...]

Storing our player’s previous position history has allowed us to create path-following power-ups with very little code. The idea of storing an object’s history can have very powerful applications. For example, a paint program could store previous commands that have been executed, and include an ‘undo’ button that can work backwards through the commands.
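As a toy illustration of that same idea outside games, a command history with undo (the class and names here are hypothetical, not from the article) can be kept in exactly the same way:

```python
# A minimal command history: each executed command is stored with an
# action that reverses it, and undo() works backwards through the list.
class CommandHistory:
    def __init__(self):
        self._undo_stack = []

    def execute(self, do, undo):
        do()                          # perform the command now
        self._undo_stack.append(undo) # remember how to reverse it

    def undo(self):
        if self._undo_stack:
            self._undo_stack.pop()()  # run the most recent undo action
```

A paint program would store one entry per drawing command, then pop and run the reversing actions when the user presses ‘undo’.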

Here’s Rik’s code, which recreates those sprite-following Options in Python. To get it running on your system, you’ll first need to install Pygame Zero. And to download the full code, go here.

Get your copy of Wireframe issue 16

You can read more features like this one in Wireframe issue 16, available now at Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 16 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Recreate the sprite-following Options from Gradius using Python | Wireframe issue 16 appeared first on Raspberry Pi.

European Astro Pi Challenge: Mission Space Lab winners 2018–2019!

Par Olympia Brown

This is your periodic reminder that there are two Raspberry Pi computers in space! That’s right — our Astro Pi units Ed and Izzy have called the International Space Station home since 2016, and we are proud to work with ESA Education to run the European Astro Pi Challenge, which allows students to conduct scientific investigations in space, by writing computer programs.

Astro PI IR on ISS

An Astro Pi takes photos of the earth from the window of the International Space Station

The Challenge has two missions: Mission Zero and Mission Space Lab. The more advanced one, Mission Space Lab, invites teams of students and young people under 19 years of age to enter by submitting an idea for a scientific experiment to be run on the Astro Pi units.

ESA and the Raspberry Pi Foundation would like to congratulate all the teams that participated in the European Astro Pi Challenge this year. A record-breaking number of more than 15,000 people, from all 22 ESA Member States as well as Canada, Slovenia, and Malta, took part in this year’s challenge across both Mission Space Lab and Mission Zero!

Eleven teams have won Mission Space Lab 2018–2019

After designing their own scientific investigations and having their programs run aboard the International Space Station, the Mission Space Lab teams spent their time analysing the data they received back from the ISS. To complete the challenge, they had to write a short scientific report discussing their results and highlighting the conclusions of their experiments. We were very impressed by the quality of the reports, which showed a high level of scientific merit.

We are delighted to announce that, while it was a difficult task, the Astro Pi jury has now selected eleven winning teams, as well as highly commending four additional teams. The eleven winning teams won the chance to join an exclusive video call with ESA astronaut Frank De Winne. He is the head of the European Astronaut Centre in Germany, where astronauts train for their missions. Each team had the once-in-a-lifetime chance to ask Frank about his life as an astronaut.

And the winners are…

Firewatchers from Post CERN HSSIP Group, Portugal, used a machine learning method on their images to identify areas that had recently suffered from wildfires.

Go, 3.141592…, Go! from IES Tomás Navarro Tomás, Spain, took pictures of the Yosemite and Lost River forests and analysed them to study the effects of global drought stress. They did this by using indexes of vegetation and moisture to assess whether forests are healthy and well-preserved.

Les Robotiseurs from Ecole Primaire Publique de Saint-André d’Embrun, France, investigated variations in Earth’s magnetic field between the North and South hemispheres, and between day and night.

TheHappy.Pi from I Liceum Ogólnokształcące im. Bolesława Krzywoustego w Słupsku, Poland, successfully processed their images to measure the relative chlorophyll concentrations of vegetation on Earth.

AstroRussell from Liceo Bertrand Russell, Italy, developed a clever image processing algorithm to classify images into sea, cloud, ice, and land categories.

Les Puissants 2.0 from Lycee International de Londres Winston Churchill, United Kingdom, used the Astro Pi’s accelerometer to study the motion of the ISS itself under conditions of normal flight and course correction/reboost manoeuvres.

Torricelli from ITIS “E.Torricelli”, Italy, recorded images and took sensor measurements to calculate the orbital period and flight speed of the ISS, and then the mass of the Earth using Newton’s law of universal gravitation.

ApplePi from I Liceum Ogólnokształcące im. Króla Stanisława Leszczyńskiego w Jaśle, Poland, compared their images from Astro Pi Izzy to historical images from 35 years ago and could show that coastlines have changed slightly due to erosion or human impact.

Spacethon from Saint Joseph La Salle Pruillé Le Chétif, France, tested their image-processing algorithm to identify solid, liquid, and gaseous features of exoplanets.

Stithians Rocket Code Club from Stithians CP School, United Kingdom, performed an experiment comparing the temperature aboard the ISS to the average temperature of the nearest country the space station was flying over.

Vytina Aerospace from Primary School of Vytina, Greece, recorded images of reservoirs and lakes on Earth to compare them with historical images from the last 30 years in order to investigate climate change.

Highly commended teams

We also selected four teams to be highly commended, and they will receive a selection of goodies from ESA Education and the Raspberry Pi Foundation:

Aguere Team from IES Marina Cebrián, Spain, investigated variations in the Earth’s magnetic field due to solar activity and a particular disturbance due to a solar coronal hole.

Astroraga from CoderDojo Trento, Italy, measured the magnetic field to investigate whether astronauts can still use a compass, just like on Earth, to orient themselves on the ISS.

Betlemites from Escoles Betlem, Spain, recorded the temperature on the ISS to find out if the pattern of a convection cell is different in microgravity.

Rovel In The Space from Scuola Secondaria di I grado A. Rosmini, Rovello Porro (Como), Italy, executed a program that monitored the pressure and would warn astronauts in case space debris or micrometeoroids collided with the ISS.

The next edition is not far off!

ESA and the Raspberry Pi Foundation would like to invite all school teachers, students, and young people to join the next edition of the challenge. Make sure to follow updates on the Astro Pi website and Astro Pi Twitter account to look out for the announcement of next year’s Astro Pi Challenge!

The post European Astro Pi Challenge: Mission Space Lab winners 2018–2019! appeared first on Raspberry Pi.

Chat to Ada Lovelace via a Raspberry Pi

By Alex Bate

Our friends, 8 Bits and a Byte, have built a Historic Voicebot, allowing users to chat to their favourite historical figures.

It’s rather marvellous.

The Historic Voicebot

Have a chat with your favourite person from the past with the Historic Voicebot! With this interactive installation, you can talk to a historical figure through both chat and voice. Made using Dialogflow, Node.js, HTML Canvas, an AIY Voice Kit, a Raspberry Pi and a vintage phone.

All the skills

Coding? Check. Woodwork? Check. Tearing apart a Google AIY Kit in order to retrofit it into a vintage telephone while ensuring it can still pick up voice via the handset? Check, check, check – this project has it all.

The concept consists of two parts:

  • A touchscreen with animations of a historical figure. The touchscreen also displays the dialogue and has buttons so people can ask frequently asked questions.
  • A physical phone that captures speech and gives audio output, so it can be used to ask questions and listen to the answer.

While Nicole doesn’t go into full detail in the video, the Ada animation uses Dialogflow, Node.js, and HTML Canvas to work, and pairs up with the existing tech in the Google AIY Kit.

And, if you don’t have an AIY Kit to hand, don’t worry; you can have the same functionality using a standard USB speaker and microphone, and Google Home running on a Raspberry Pi.

You can find a tutorial for the whole project on hackster.io.

Follow 8 Bits and a Byte

There are a lot of YouTube channels out there that don’t have the follow count we reckon they deserve, and 8 Bits and a Byte is one of them. So, head to their channel and click that subscribe button, and be sure to check out their other videos for some more Raspberry Pi goodness.

The post Chat to Ada Lovelace via a Raspberry Pi appeared first on Raspberry Pi.

An in-flight entertainment system that isn’t terrible

By Helen Lynn

No Alex today; she’s tragically germ-ridden and sighing weakly beneath a heap of duvets on her sofa. But, in spite of it all, she’s managed to communicate that I should share Kyle‘s Raspberry Pi in-flight entertainment system with you.

I made my own IN-FLIGHT entertainment system! ft. Raspberry Pi

From poor A/V quality to lackluster content selection, in-flight entertainment centers are full of compromises. Let’s create our own using a Raspberry Pi 3 B+!

Kyle is far from impressed with the in-flight entertainment on most planes: the audio is terrible, the touchscreens are annoyingly temperamental, and the movie selection is often frustratingly limited. So, the night before a morning flight to visit family (congrats on becoming an uncle, Kyle! We trust you’ll use your powers only for good!), he hit upon the idea of building his own in-flight entertainment system, using stuff he already had lying around.

Yes, we know, he could just have taken a tablet with him. But we agree with him that his solution is way more fun. It’s way more customisable too. Kyle’s current rushed prototype features a Raspberry Pi 3B+ neatly cable-tied into a drilled Altoids tin lid, which is fixed flush to the back of a 13.3-inch portable monitor with adhesive Velcro. He’s using VLC Media Player, which comes with Raspbian and supports a lot of media control functions straight out of the box; this made using his mouse and mini keyboard a fairly seamless experience. And a handy magnetic/suction bracket lets him put the back of the seat in front to its best possible use: as a mounting surface for the screen.

As Kyle says, “Is it ridiculous? I mean, yes, obviously it’s ridiculous, but would you ever consider doing something like this?”

The post An in-flight entertainment system that isn’t terrible appeared first on Raspberry Pi.

Remembering Andy Baker

By Helen Lynn

We are immensely sad to learn of the death, on 1 June, of Andy Baker, joint founder and organiser of the brilliant Cotswold Raspberry Jam. Andy had been suffering from brain cancer.

andy baker pistuffing

Together with co-founder Andrew Oakley, Andy worked incredibly hard to make the Cotswold Jam one of the most exciting Jams of all, with over 150 people of all ages attending its most popular events. He started working with Raspberry Pis back in 2012, and developed a seriously impressive degree of technical expertise: among his projects were a series of Pi-powered quadcopters, no less, including an autonomous drone. Many of us will forever associate Andy with a memorably fiery incident at the Raspberry Pi Big Birthday Weekend in 2016, which he handled with a grace and good humour that elude most of us:

Raspberry Pi Party Autonomous drone demo + fire

At the Raspberry Pi IV party and there is a great demo of an Autonomous drone which is very impressive with only using a Pi. However it caught on fire. But i believe it does actually work.

Andy maintained his involvement with the Raspberry Pi community, and especially the Cotswold Jam, for several years while living with a brain tumour, and shared his skills and enthusiasm with hundreds of others. He was at the heart of the Raspberry Pi community. When our patron, His Royal Highness the Duke of York, kindly hosted a reception at St. James’s Palace in October 2016 to recognise the Raspberry Pi community, Andy joined us to celebrate in style:

Cotswold Jam on Twitter

@ben_nuttall @DougGore @PiStuffing @rjam_chat Cheers, Ben! Fab photo of Prince Andrew being ignored by @davejavupride & Andy Baker @PiStuffing who are too busy drinking… “It’s what he would have wanted…” :-) https://t.co/FK7sk1CoDs

Andy suggested that, if people would like to make a donation in his name, they support his local school’s IT department, somewhere else he used to volunteer. The department isn’t able to accept online donations, but cheques in pounds sterling can be made out to “Gloucestershire County Council” and posted to a local funeral director who will collect and forward them:

Andy Baker memorial fund
c/o Blackwells of Cricklade
Thames House
Thames Lane
Cricklade
SN6 6BH

We owe Andy immense gratitude for all his work to help people learn and have a great time with Raspberry Pi. We were very lucky indeed to have him as part of our community. We will miss him.

The post Remembering Andy Baker appeared first on Raspberry Pi.

Playback your favourite records with Plynth

By Alex Bate

Use album artwork to trigger playback of your favourite music with Plynth, the Raspberry Pi–powered, camera-enhanced record stand.

Plynth Demo

This is “Plynth Demo” by Plynth on Vimeo, the home for high quality videos and the people who love them.

Record playback with Plynth

Plynth uses a Raspberry Pi and Pi Camera Module to identify cover artwork and play the respective album on your sound system, via your preferred streaming service or digital library.

As the project’s website explains, using Plynth is pretty simple. Just:

  • Place an LP, CD, tape, VHS, DVD, piece of artwork – anything, really – onto Plynth
  • Plynth uses its built-in camera to scan and identify the work
  • Plynth starts streaming your music on your connected speakers or home stereo system

As for Plynth’s innards? The stand houses a Raspberry Pi 3B+ and Camera Module, and relies on “a combination of the Google Vision API and OpenCV, which is great because there’s a lot of documentation online for both of them”, states the project creator, Jono Matusky, on Reddit.

Other uses

Some of you may wonder why you wouldn’t simply keep your records with your record player and use it to play them. If you are one of these people, then consider, for example, the beautiful Damien Rice LP I own that tragically broke during a recent house move. While I can no longer play the LP, its artwork is still worthy of a place on my record shelf, and with Plynth I can still play the album as well.

In addition, instead of album artwork, you could use photographs, doodles, or type to trigger curated playlists, or, as mentioned on the website, DVDs to play a movie’s soundtrack, or CDs to select the right disc in a disc changer.

Convinced or not, I think what we can all agree on is that Plynth is a good-looking bit of kit, and we at Pi Towers look forward to seeing where the project leads.

The post Playback your favourite records with Plynth appeared first on Raspberry Pi.

Ghost-hunting in schools with Raspberry Pi | Hello World #9

By Alex Bate

In Hello World issue 9, out today, Elliott Hall and Tom Bowtell discuss The Digital Ghost Hunt: an immersive theatre and augmented reality experience that takes a narrative-driven approach in order to make digital education accessible.

The Digital Ghost Hunt combines coding education, augmented reality, and live performance to create an immersive storytelling experience. It begins when a normal school assembly is disrupted by the unscheduled arrival of Deputy Undersecretary Quill of the Ministry of Real Paranormal Hygiene, there to recruit students into the Department’s Ghost Removal Section. She explains that the Ministry needs the students’ help because children have the unique ability to see and interact with ghostly spirits.

The Digital Ghost Hunt - Raspberry Pi Hello World

Under the tutelage of Deputy Undersecretary Quill and Professor Bray (the Ministry’s chief scientist), the young ghost-hunters learn how to program and use their own paranormal detectors. These allow students to discover ghostly traces, translate Morse code using flickering lights, and find messages left in ultraviolet ectoplasm. Meanwhile, the ghost communicates through a mixture of traditional theatrical effects and the poltergeist potential of smart home technology. Together, students uncover the ghost’s identity, discover her reason for haunting the building, unmask a dastardly villain, find a stolen necklace, clear the ghost’s name, right an old wrong, and finally set the ghost free.

The Digital Ghost Hunt - Raspberry Pi Hello World

The project conducted two successful test performances at the Battersea Arts Centre in South London in November 2018, funded by a grant from AHRC’s New Immersive Experiences Programme, led by Mary Krell of Sussex University. Its next outing will be at York Theatre Royal in August.

Adventures in learning

The Digital Ghost Hunt arose out of a shared interest in putting experimentation and play at the centre for learners. We felt that the creative, tinkering spirit of earlier computing — learning how to program BASIC on an Atari 800XL to create a game, for example — was being supplanted by a didactic and prescriptive approach to digital learning. KIT Theatre’s practice — creating classroom adventures that cast pupils as heroes in missions — is also driven by a less trammelled, more experiment-led approach to learning.

We believe that the current Computer Science curriculum isn’t engaging enough for students. We wanted to shift the context of how computer science is perceived, from ‘something techy and boyish’ back to the tool of the imagination that it should be. We did this by de-emphasising the technology itself and, instead, placing it in the larger context of a ghost story. The technology becomes a tool to navigate the narrative world — a means to an end rather than an end in itself. This helps create a more welcoming space for students who are bored or intimidated by the computer lab: a space of performance, experiment, and play.

Ghosts and machines

The device we built for the students was the SEEK Ghost Detector, made from a Raspberry Pi and a micro:bit, which Elliott stapled together. The micro:bit was the device’s interface, which students programmed using the block-based language MakeCode. The Raspberry Pi handled the heavier technical requirements of the show, and communicated them to the micro:bit in a form students could use. The detector had no screen, only the micro:bit’s LEDs. This meant that students’ attention was focused on the environment and what the detector could tell them about it, rather than being pulled to a screen to the exclusion of the ‘real’ world around them.

In addition to the detector, we used a Raspberry Pi to make ordinary smart home technology into our poltergeist. It communicated with the students using effects such as smart bulbs that flashed in Morse code, which the students could then decode on their devices.
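The Morse-code effect can be sketched as a simple encoder: a message becomes a list of on/off durations that a smart bulb (or an LED on a GPIO pin) could play back. The `MORSE` table below is the standard International Morse alphabet, but `flash_pattern()` and its timing units are illustrative, not taken from the project itself.

```python
# Illustrative encoder for the Morse-flashing idea: dot = 1 unit on,
# dash = 3 units on, 1 unit off between symbols, an extra gap between
# letters. The timings and helper name are assumptions for this sketch.

MORSE = {
    'A': '.-', 'B': '-...', 'C': '-.-.', 'D': '-..', 'E': '.',
    'F': '..-.', 'G': '--.', 'H': '....', 'I': '..', 'J': '.---',
    'K': '-.-', 'L': '.-..', 'M': '--', 'N': '-.', 'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.', 'S': '...', 'T': '-',
    'U': '..-', 'V': '...-', 'W': '.--', 'X': '-..-', 'Y': '-.--',
    'Z': '--..',
}

def flash_pattern(message):
    """Turn a message into (state, duration) pairs a bulb could play."""
    pattern = []
    for letter in message.upper():
        for symbol in MORSE.get(letter, ''):
            pattern.append(('on', 1 if symbol == '.' else 3))
            pattern.append(('off', 1))
        pattern.append(('off', 2))  # extra gap between letters
    return pattern

print(flash_pattern('SOS')[:4])  # the first two dots of the 'S'
```

Playing the pattern back is then just a loop that switches the bulb and sleeps for each duration, and decoding it on the student side is the same table read in reverse.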

To program their detectors, students took part in a series of four lessons at school, focused on thinking like a programmer and the logic of computing. Two of the lessons featured significant time spent programming the micro:bit. The first focused on reading code on paper, and students were asked to look out for any bugs. The second had students thinking about what the detector would do, and acting out the steps together, effectively ‘performing’ the algorithm.

We based the process on KIT Theatre’s Adventures in Learning model, and its Theory of Change:

  • Disruption: an unexpected event grabs attention, creating a new learning space
  • Mission: a character directly asks pupils for their help in completing a mission
  • Achievement: pupils receive training and are given agency to successfully complete the mission

The Ghost Hunt

During these lessons, Deputy Undersecretary Quill kept in touch with the students via email, and the chief scientist sent them instructional videos. Their work culminated in their first official assignment: a ghost haunting the Battersea Arts Centre — a 120-year-old former town hall. After arriving, students were split into four teams, working together. Two teams analysed evidence at headquarters, while the others went out into places in the building where we’d hidden ghostly traces that their detectors would discover. The students pooled their findings to learn the ghost’s story, and then the teams swapped roles. The detectors were therefore only one method of exploring the narrative world. But the fact that they’d learned some of the code gave students a confidence in using the detectors — a sense of ownership. During one performance, one of the students pointed to a detector and said: “I made that.”

Future of the project

The project is now adapting the experience into a family show, in partnership with Pilot Theatre, premiering in York in summer 2019. We aim for it to become the core of an ecosystem of lessons, ideas, and activities — to engage audiences in the imaginative possibilities of digital technology.

You can find out more about the Digital Ghost Hunt on their website, which also includes rather lovely videos that Vimeo won’t let me embed here.

Hello World issue 9

The brand-new issue of Hello World is out today, and available right now as a free PDF download from the Hello World website.

Hello World issue 9

UK-based educators can also sign up here to receive Hello World as a printed magazine for free, direct to their door. And those outside the UK, educator or not, can subscribe to receive new issues of Hello World in their inbox on the day of release.

The post Ghost-hunting in schools with Raspberry Pi | Hello World #9 appeared first on Raspberry Pi.

Driverless cars run by Raspberry Pi

By Alex Bate

Could the future of driverless cars be shaped by Raspberry Pi? For undergraduate researchers at the University of Cambridge, the answer is a resounding yes!

Can cars talk to each other?

A fleet of driverless cars working together to keep traffic moving smoothly can improve overall traffic flow by at least 35 percent, researchers have shown. The researchers, from the University of Cambridge, programmed a small fleet of miniature robotic cars to drive on a multi-lane track and observed how the traffic flow changed when one of the cars stopped.

So long, traffic!

By using Raspberry Pis and onboard sensors to program scale-model versions of commercially available cars, undergraduate researchers have built a fleet of driverless cars that ‘talk to each other’. The fleet is part of their research into how driverless technology can help reduce traffic incidents on our roads.

Cambridge University Driverless cars using Raspberry Pi

The researchers investigated how a car stalled on a multi-lane track affects the buildup of traffic, and how communication between driverless cars can prevent these buildups.

Cambridge University Driverless cars using Raspberry Pi

When the cars acted independently of each other, a stalled car caused other vehicles in the same lane to slow or stop in order to merge into the adjacent lane. This soon led to queues forming along the track. But when the cars communicated via Raspberry Pis, they could tell each other about obstacles on the track, and this allowed cars to shift lanes with the cooperation of other road users.

The researchers recently presented their paper on the subject at the International Conference on Robotics and Automation (ICRA 2019) in Montréal, Canada. You can find links to their results, plus more information, on the University of Cambridge blog.

The post Driverless cars run by Raspberry Pi appeared first on Raspberry Pi.

Retrofit a handheld Casio portable TV with a Raspberry Pi

By Alex Bate

What do we say to the god of outdated tech? Not today! Revive an old portable television with a Raspberry Pi 3!

Pocket televisions

In the late 1980s, when I was a gadget-savvy kid, my mother bought me a pocket TV as a joint Christmas and birthday present. The TV’s image clarity was questionable, its sound tinny, and its aerial so long that I often poked myself and others in the eye while trying to find a signal. Despite all this, it was one of the coolest, most futuristic things I’d ever seen, and I treasured it. But, like most tech of its day, the pocket TV is no longer needed: I can watch TV in high definition on my phone — a device half the size, with a screen thrice as large, and no insatiable hunger for AA batteries.

So what do we do with this old tech to save it from the tip?

We put a Raspberry Pi in it, of course!

JaguarWong’s Raspberry Pi 3 pocket TV!

“I picked up a broken Casio TV-400 for the princely sum of ‘free’ a few weeks back. And I knew immediately what I wanted to do with it,” imgur user JaguarWong states in the introduction for the project.

I got the Pi for Christmas a couple of years back and have never really had any plans for it. Not long after I got it, I picked up the little screen from eBay to play with but again, with no real purpose in mind — but when I got the pocket TV everything fell into place.

Isn’t it wonderful when things fall so perfectly into place?

Thanks to an online pinout guide, JW was able to determine how to connect the screen and the Raspberry Pi; fortunately, only a few jumper wires were needed — “which was handy given the limits on space.”

With slots cut into the base of the TV for the USB and Ethernet ports, the whole project fit together like a dream, with little need for modification of the original housing.

The final result is wonderful. And while JW describes the project as “fun, if mostly pointless”, we think it’s great — another brilliant example of retrofitting old tech with Raspberry Pi!

10/10 would recommend to a friend.

The post Retrofit a handheld Casio portable TV with a Raspberry Pi appeared first on Raspberry Pi.

An opportunity to reach thousands with the Raspberry Pi

By Dana Augustin

Dr Bob Brown is a former professor who taught at Kennesaw State University and Southern Polytechnic State University. He holds a doctorate in computer information systems. Bob is also a Raspberry Pi Certified Educator, and continues to provide exceptional classroom experiences for K-12 students. The moment his students have that “Aha!” feeling is something he truly values, and he continues to enjoy that experience in his K-12 classroom visits.

After retiring from teaching computing in 2017, Bob continued his school visits, first on an informal basis, and later as an official representative of KSU’s College of Computing and Software Engineering (CCSE). Keen to learn more about K-12 Computing, Bob applied to the Raspberry Pi Foundation’s Picademy program, and attended Picademy Atlanta in 2018. Here’s his story of how he has since gone on to lead several Raspberry Pi Teachers’ Workshops, inspiring educators and students alike.

“I couldn’t have done this if I had not attended Picademy” — Bob Brown

“I was amazed at the excitement and creativity that Picademy and the Raspberry Pi created among the teachers who attended,” Bob says. “After reading about the number of applicants for limited Picademy positions, I realized there was unmet demand. I began to wonder whether we could do something similar at the CCSE.”

Bob spent over a hundred hours developing instructional material, and raised over $2,000 from Southern Polytechnic alumni. With the money he raised, Bob conducted a pilot workshop for half a dozen teachers in the autumn of 2018. The workshop was free for participants, and covered material similar to Picademy, but in a one-day format. Participants were also given a Raspberry Pi 3B+ and a parts pack. Bob says, “I couldn’t have done this if I had not attended Picademy and been able to start with the Picademy material from the Raspberry Pi Foundation.”

“[The CCSE] helps improve access, awareness, and sustainability to middle and high school students and teachers.” — Jon Preston

The Dean of CCSE at KSU, Dr Jon Preston, was so impressed with the results of the pilot workshop that he authorised a formal fundraising program and two additional workshops in the spring of 2019. Four more workshops have also been scheduled for the summer.

“The College of Computing and Software Engineering at KSU STEM+Computing project helps improve access, awareness, and sustainability to middle and high school students and teachers. CCSE faculty and undergraduate students build learning materials and deliver these materials on-site to schools in an effort to increase the number of students who are energized by computing and want to study computing to help improve their careers and the world. Given the price and power of the Raspberry Pi computers, these devices are a perfect match for our project in the local schools,” says Preston.

The teachers really enjoyed the workshop, and left incredibly inspired.

Teachers came from all over Georgia and from as far away as Mississippi to attend the workshops. For some of the teachers, it was their first time exploring the concept of physical computing, and the hands-on approach to the workshop helped them set their own pace. The teachers really enjoyed the workshop, and left incredibly inspired. “Teacher workshops have a multiplier effect,” says Brown. “If I teach 30 students, I’ve reached 30 students; if I teach 30 teachers, I potentially reach thousands of students over a period of years.”

Another great contribution to the program was the addition of college student facilitators, who provided individual support to the teachers throughout the day, making it easier for everyone to have the assistance they needed.

By the end of the summer, more than 150 K-12 teachers will have participated in a CCSE Raspberry Pi Teachers’ Workshop.

The Raspberry Pi Teachers’ Workshops have become a regular part of the outreach efforts of the CCSE. Grants from State Farm Insurance, 3M Corporation, and a few very generous individual gifts keep the workshops free for K-12 teachers, who also take home a Raspberry Pi and extra components and parts. Participants are also invited to join an online forum where they can exchange ideas and support each other. By the end of the summer, more than 150 K-12 teachers will have participated in a CCSE Raspberry Pi Teachers’ Workshop. You can find more information about the workshops here.

The post An opportunity to reach thousands with the Raspberry Pi appeared first on Raspberry Pi.

Coding an isometric game map | Wireframe issue 15

By Ryan Lambie

Isometric graphics give 2D games the illusion of depth. Mark Vanstone explains how to make an isometric game map of your own.

Published by Quicksilva in 1983, Ant Attack was one of the earliest games to use isometric graphics. And you threw grenades at giant ants. It was brilliant.

Isometric projection

Most early arcade games were 2D, but in 1982, a new dimension emerged: isometric projection. The first isometric game to hit arcades was Sega’s pseudo-3D shooter, Zaxxon. The eye-catching format soon caught on, and other isometric titles followed: Q*bert came out the same year, and in 1983 the first isometric game for home computers was published: Ant Attack, written by Sandy White.

Ant Attack

Ant Attack was first released on the ZX Spectrum, and the aim of the game was for the player to find and rescue a hostage in a city infested with giant ants. The isometric map has since been used by countless titles, including Ultimate Play The Game’s classics Knight Lore and Alien 8, and my own educational history series ArcVenture.

Let’s look at how an isometric display is created, and code a simple example of how this can be done in Pygame Zero — so let’s start with the basics. The isometric view displays objects as if you’re looking down at 45 degrees onto them, so the top of a cube looks like a diamond shape. The scene is made by drawing cubes on a diagonal grid so that the cubes overlap and create solid-looking structures. Additional layers can be used above them to create the illusion of height.

Blocks are drawn from the back forward, one line at a time and then one layer on top of another until the whole map is drawn.

The cubes are actually two-dimensional bitmaps, which we start printing at the top of the display and move along a diagonal line, drawing cubes as we go. The map is defined by a three-dimensional list (or array). The list is the width of the map by the height of the map, and has as many layers as we want to represent in the upward direction. In our example, we’ll represent the floor as the value 0 and a block as value 1. We’ll make a border around the map and create some arches and pyramids, but you could use any method you like — such as a map editor — to create the map data.

To make things a bit easier on the processor, we only need to draw cubes that are visible in the window, so we can do a check of the coordinates before we draw each cube. Once we’ve looped over the x, y, and z axes of the data list, we should have a 3D map displayed. The whole map doesn’t fit in the window, and in a full game, the map is likely to be many times the size of the screen. To see more of the map, we can add some keyboard controls.
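As a rough sketch of the ideas above (the 3D map list, the grid-to-screen projection, and the back-to-front draw order), here is the coordinate maths in plain Python. The constants and names are illustrative rather than Mark's actual listing, and the Pygame Zero blitting is left out so the projection stands alone:

```python
# Sketch of the isometric maths described above: a 3D list holds the
# map (0 = floor, 1 = block), and each cell is projected to screen
# space so cubes drawn back-to-front overlap correctly. TILE_W,
# TILE_H, and BLOCK_H are made-up sizes for this sketch.

TILE_W, TILE_H = 64, 32   # on-screen size of a cube's top face
BLOCK_H = 32              # vertical offset per layer of height

def to_screen(x, y, z, origin=(400, 100)):
    """Project grid cell (x, y, z) to 2D screen coordinates."""
    sx = origin[0] + (x - y) * TILE_W // 2
    sy = origin[1] + (x + y) * TILE_H // 2 - z * BLOCK_H
    return sx, sy

# A tiny 4x4 map with one layer: a border of blocks around open floor.
layer = [[1 if x in (0, 3) or y in (0, 3) else 0 for x in range(4)]
         for y in range(4)]
world = [layer]  # list of layers, bottom layer first

def draw_order(world):
    """Yield cube screen positions from the back forward, one row at
    a time, then one layer on top of another, as the article says."""
    for z, grid in enumerate(world):
        for y, row in enumerate(grid):
            for x, cell in enumerate(row):
                if cell:
                    yield to_screen(x, y, z)

print(list(draw_order(world))[:3])
```

In a real Pygame Zero program, each yielded position would be where a cube bitmap is blitted inside `draw()`, and the window-visibility check is just a comparison of `sx`/`sy` against the screen bounds before drawing.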

Here’s Mark’s isometric map, coded in Python. To get it running on your system, you’ll first need to install Pygame Zero. And to download the full code, visit our GitHub repository here.

If we detect keyboard presses in the update() function, all we need to do to move the map is change the coordinates we start drawing the map from. If we start drawing further to the left, the right-hand side of the map emerges, and if we draw the map higher, the lower part of the map can be seen.

We now have a basic map made of cubes that we can move around the window. If we want to make this into a game, we can expand the way the data represents the display. We could add differently shaped blocks represented by different numbers in the data, and we could include a player block which gets drawn in the draw() function and can be moved around the map. We could also have some enemies moving around — and before we know it, we’ll have a game a bit like Ant Attack.

Tiled

When writing games with large isometric maps, an editor will come in handy. You can write your own, but there are several out there that you can use. One very good one is called Tiled and can be downloaded free from mapeditor.org. Tiled allows you to define your own tilesets and export the data in various formats, including JSON, which can be easily read into Python.
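As a sketch of that last point, Tiled's JSON export can be read with Python's standard json module. A tile layer stores its cells as a flat `data` list in row order, alongside the map's `width` and `height`; the inline `sample` string below stands in for a real exported file, and `load_layer()` is an illustrative helper, not part of Tiled itself:

```python
import json

# Minimal sketch of reading a Tiled JSON export: pick a tile layer by
# name and reshape its flat "data" list into a 2D grid of tile IDs.
# The sample string stands in for a file saved from Tiled.

sample = '''{
  "width": 3, "height": 2,
  "layers": [{"name": "ground", "type": "tilelayer",
              "data": [1, 1, 1, 0, 2, 0]}]
}'''

def load_layer(text, name):
    m = json.loads(text)
    layer = next(l for l in m["layers"] if l["name"] == name)
    w, h = m["width"], m["height"]
    # slice the flat row-order list into h rows of w tiles each
    return [layer["data"][row * w:(row + 1) * w] for row in range(h)]

print(load_layer(sample, "ground"))
```

The resulting grid can slot straight into the map list used by the drawing code, with each tile ID choosing which cube bitmap to draw.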

Get your copy of Wireframe issue 15

You can read more features like this one in Wireframe issue 15, available now at Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 15 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Coding an isometric game map | Wireframe issue 15 appeared first on Raspberry Pi.

Quick Fix — a vending machine for likes and followers

By Liz Upton

Sometimes we come across a project that just scores a perfect 10 on all fronts. This is one of them: an art installation using Raspberry Pi that has something interesting to say, does it elegantly, and is implemented beautifully (nothing presses our buttons like a make that’s got a professionally glossy finish like this).

Quick Fix is a vending machine (and art installation) that sells social media likes and followers. Drop in a coin, enter your social media account name, and an army of fake accounts will like or follow you. I’ll leave the social commentary to you. Here’s a video from the maker, Dries Depoorter:

Quick Fix – the vending machine selling likes and followers

Quick Fix is an interactive installation by Dries Depoorter. The artwork makes it possible to buy followers or likes in just a few seconds; for a few euros, you can already have 200 likes on Instagram. Quick Fix is easy to use: choose your product, pay, and fill in your social media username.

There’s a Raspberry Pi 3B+ in there, along with an Arduino, powering a coin acceptor and some I2C LCD screens. Then there’s a stainless steel heavy-duty keyboard, which we’re lusting after (a spot of Googling unearthed this, which appears to be the same thing, if you’re in the market for a panel-mounted beast of a keyboard).

This piece was commissioned by Pixelache, a cultural association from Helsinki, whose work looks absolutely fascinating if you’ve got a few minutes to browse. Thanks to them and to Dries Depoorter — I have a feeling this won’t be the last of his projects we’re going to feature here.


HeaterMeter, the open-source barbecue controller

By Liz Upton

We spent the weekend knee-deep in marinade. (Top tip: if you’re brining something big, like a particularly plump chicken, buy a cheap kitchen bin. The depth makes it much easier than juggling near-overflowing buckets. And when you’re finished, you have a spare bin.)

meat

If you’re a serious barbecue jockey, you’ll want to know about Bryan Mayland’s HeaterMeter, a rather nifty open-source controller for your barbecue, built around a Raspberry Pi. Controlling the heat of your setup is key in low, slow cooking and smoking; you can get glorious results very inexpensively (an off-the-shelf equivalent will set you back a few hundred pounds) and have the satisfaction of knowing you built your equipment yourself. Bryan says:

Temperature data read from a standard thermistor (ThermoWorks, Maverick) or thermocouple probe is used to adjust the speed of a blower fan motor mounted to the BBQ grill to maintain a specific set temperature point (setpoint). A servo-operated damper may optionally be employed. Additional thermistor probes are used to monitor food and/or ambient temperatures, and these are displayed on a 16×2 LCD attached to the unit. Buttons or serial commands can be used to adjust configuration of the device, including adjustment of the setpoint or manually regulating fan speeds.
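To give a flavour of the closed-loop control Bryan describes, here's a much-simplified, proportional-only sketch with made-up gains — HeaterMeter's real firmware is considerably more sophisticated than this:

```python
def fan_speed(setpoint, temperature, gain=2.0):
    """Proportional-only control sketch: fan output (0-100%) rises with the
    temperature deficit below the setpoint, and cuts off above it."""
    error = setpoint - temperature   # positive when the pit is too cool
    return max(0.0, min(100.0, gain * error))
```

When the pit runs cool the fan feeds the coals more air; once the setpoint is reached, the fan backs off and the fire settles.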

The Raspberry Pi adds a web interface, with graphing, archives, and SMS/email support for alarm notification, which means you can go and splash around in the kids’ paddling pool with a beer rather than spending the day standing over the grill with a temperature probe.

Heatermeter graph output

You can buy a HeaterMeter online, in kit form or pre-assembled. There’s an incredibly comprehensive wiki available to get you going with the HeaterMeter, and a very straightforward Instructable if you’re just looking for a quick setup. If you’re the type who prefers to learn by watching, Bryan also has a few videos on YouTube where he puts the kit together. To start with, see how to assemble the LCD/button board here and the base board here.

We’re hungry.


Toilet Tracker: automated poo-spotting, no cameras

By Liz Upton

It might be that I am unusually particular here, but there is nothing (absolutely NOTHING) that upsets me more than dirty toilets. Yes, I know this is the epitome of a pampered-person’s phobia. But I have nightmares — honest, actual, recurring nightmares — about horrible toilets, and I’ll plan my day around avoiding public toilets which are likely to be dirty. So this project appealed to me enormously.

Obi-Wan and the Worst Toilet in Scotland

Automating spotting that things are awry in a toilet cubicle without breaching privacy is really tricky. You can’t use a camera, for obvious reasons. Over at Hackster.io, Mohammad Khairul Alam has come up with a solution: he uses a Raspberry Pi hooked up to Walabot, a 3D imaging sensor (the same sort of thing you might use to find pipes behind studwork if you’re doing DIY) to detect one thing: whether there are any…objects in the toilet cubicle which weren’t there earlier.

From a privacy point of view, this is perfect. The sensor isn’t a camera, and it doesn’t know exactly what it’s looking at: just that there’s a thing where there shouldn’t be.

The Walabot is programmed to understand when the toilet is occupied by sensing above seat level; it’s also looking closer to the floor when the cubicle is empty, for seat-smudges, full bowls, and nasty stuff on the floor. (Writing this post is making me all shuddery. Like I said, I really, really have a problem with this.) Here’s a nice back-of-an-envelope explanation of the logic:
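That back-of-an-envelope logic might look something like this in Python — the thresholds, units, and function names here are entirely illustrative; the real project reads detection targets through the Walabot SDK:

```python
def cubicle_status(targets, seat_height=0.4):
    """Classify the cubicle from a list of detected (x, y, z) targets,
    where z is height above the floor in metres (illustrative units).

    Anything above seat level counts as an occupant; with the cubicle
    empty, anything detected near the floor is flagged for cleaning."""
    if any(z > seat_height for _, _, z in targets):
        return "occupied"
    if targets:  # objects low down in an otherwise empty cubicle
        return "needs cleaning"
    return "clean"
```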

There’s a simple Android app to accompany the setup so you can roll out your own if you have an office with an upsetting toilet.

Learn (much) more over at Hackster — thanks to Md. Khairul Alam for the build!


Yuri 3 rover | The MagPi #82

By Rob Zwetsloot

In honour of the 50th anniversary of the Apollo moon landing, this year’s Pi Wars was space-themed. Visitors to the two-day event — held at the University of Cambridge in March — were lucky enough to witness a number of competitors and demonstration space-themed robots in action.

Yuri 3 rover

Among the most impressive was the Yuri 3 mini Mars rover, which was designed, lovingly crafted, and operated by Airbus engineer John Chinner. Fascinated by Yuri 3’s accuracy, we got John to give us the inside scoop.

Airbus ambassador

John is on the STEM Ambassador team at Airbus and has previously demonstrated its prototype ExoMars rover, Bridget (you can drool over images of this here: magpi.cc/btQnEw), including at the BBC Stargazing Live event in Leicester. Realising the impressive robot’s practical limitations in terms of taking it out and about to schools, John set out to build a smaller but highly faithful, easily transportable Mars rover. His robot-building experience began in his teens with a six-legged robot he took along to his technical engineering apprenticeship interview and had it walk along the desk. Job deftly bagged, he’s been building robots ever since.

Inside the Yuri 3 Mars rover

Yuri is a combination of an Actobotics chassis based on one created by Beatty Robotics, plus 3D-printed wheels and six 12V DC brushed gear motors. Six Hitec servo motors operate the steering, while the entire rover has an original Raspberry Pi B+ at its heart.

Yuri 3 usually runs in ‘tank steer’ mode. Cannily, the positioning of four of its six wheels at the corners means Yuri 3’s wheels can each be turned so that it spins on the spot. It can also ‘crab’ to the side due to its individually steerable wheels.

Servo motors

The part more challenging for home users is the ‘gold thermal blanket’. The blanket ensures that the rover can maintain working temperature in the extreme conditions found on Mars. “I was very fortunate to have a bespoke blanket made by the team who make them for satellites,” says John. “They used it as a training exercise for the apprentices.”

John has made some bookmarks from the leftover thermal material which he gives away to schools to use as prizes.

Yuri 3 rover thermal blanket samples

Rover design

While designing Yuri 3, it probably helped that John was able to sneak peeks of Airbus’s ExoMars prototypes being tested at the firm’s Mars Yard. (He once snuck Yuri 3 onto the yard and gave it a test run, but that’s supposed to be a secret!) Also, says John, “I get to see the actual flight rover in its interplanetary bio clean room”.

A young girl inspects the Yuri 3 Mars rover

His involvement with all things Raspberry Pi came about when he was part of the Astro Pi programme, in which students send code to two Raspberry Pi devices aboard the International Space Station every year. “I did the shock, vibration, and EMC testing on the actual Astro Pi units in Airbus, Portsmouth,” John proudly tells us.

A very British rover

As part of the European Space Agency mission ExoMars, Airbus is building and integrating the rover in Stevenage. “What a fantastic opportunity for exciting outreach,” says John. “After all the fun with Tim Peake’s Principia mission, why not make the next British astronaut a Mars rover? … It is exciting to be able to go and visit Stevenage and see the prototype rovers testing on the Mars Yard.”

The Yuri 3 Mars rover

John also mentions that he’d love to see Yuri 3 put in an appearance at the Raspberry Pi Store; in the meantime, drooling punters will have to build their own Mars rover from similar kit. Or, we’ll just enjoy John’s footage of Yuri 3 in action and perhaps ask very nicely if he’ll bring Yuri along for a demonstration at an event or school near us.

John wrote about the first year of his experience building Yuri 3 on his blog. And you can follow the adventures of Yuri 3 over on Twitter: @Yuri_3_Rover.

Read the new issue of The MagPi

This article is from today’s brand-new issue of The MagPi, the official Raspberry Pi magazine. Buy it from all good newsagents, subscribe to pay less per issue and support our work, or download the free PDF to give it a try first.

Cover of The MagPi issue 82


Penguin Watch — Pi Zeros and Camera Modules in the Antarctic

By Liz Upton

Long-time readers will remember Penguin Lifelines, one of our very favourite projects from back in the mists of time (which is to say 2014 — we have short memories around here).

Penguins

Click on penguins for fun and conservation

Penguin Lifelines was a programme run by the Zoological Society of London, crowdsourcing the tracking of penguin colonies in Antarctica. It’s since evolved into something called Penguin Watch, now working with the World Wildlife Fund (WWF) and British Antarctic Survey (BAS). It’s citizen science on a big scale: thousands of people from all over the world come together on the internet to…click on penguins. By counting the birds in their colonies, users help penguinologists measure changes in the birds’ behaviour and habitat, and in the larger ecosystem, thus assisting in their conservation.

The penguin people say this about Penguin Watch:

Some of these colonies are so difficult to get to that they haven’t been visited for 50 years! The images contain unprecedented detail, giving us the opportunity to gather new data on the number of penguins in the region. This information will help us understand how they are being affected by climate change, the potential impact of local fisheries, and how we can help conserve these incredible species.

Pis in the coldest, wildest place

And what are those special cameras? The static ones providing time-lapse images are Raspberry Pi Camera Modules, mounted on Raspberry Pi Zeros, and we’re really proud to see just how robust they’ve been in the face of Antarctic winters.

Alasdair Davies on Twitter

Success! The @arribada_i timelapse @Raspberry_Pi Zero cameras built for @penguin_watch survived the Antarctic winter! They captured these fantastic photos of a Gentoo penguin rookery for https://t.co/MEzxbqSyc1 #WorldPenguinDay 🐧@helenlynn @philipcolligan https://t.co/M0TK5NLT6G

These things are incredibly tough. They’re the same cameras that Alasdair and colleagues have been sticking on turtles, at depths of up to 500m; I can’t think of a better set of tests for robustness.

Want to get involved? Head over to Penguin Watch, and get clicking! We warn you, though — it’s a little addictive.


Motion-controlled water fountain…for cats!

By Alex Bate

Tired of the constant trickle of your cat’s water fountain? Set up motion detection and put your cat in control.


Cats are fickle

My cat, Jimmy, loves drinking from running water. Or from the sink. Or from whatever glass I am currently using. Basically, my cat loves drinking out of anything that isn’t his water bowl…because like all cats, he’s fickle and lives to cause his humans as much aggravation as possible.

Here’s a photo of my gorgeous boy, because what cat owner doesn’t like showing off their cat at the slightest opportunity?

Jimmy’s getting better now, thanks to the introduction of a pet water fountain in the kitchen, and we’ve somehow tricked him into using it — but what I don’t like is how the constant trickle of water makes me want to pee all the time.

Thankfully, this motion-controlled water fountain from Hackster.io maker vladimirm is here to save the day, turning on the fountain only when his cat approaches it.

Motion-controlled pet water fountain

So how does it work? Vladimir explains:

When the PIR sensor detects movement, it sends a message to the radio dongle plugged into the Raspberry Pi, which sends the message to the MQTT server. On the other side, the MQTT message is processed by Home Assistant, which then, using the automation, triggers the smart plug and starts the configured countdown.

The build uses an old Raspberry Pi 1 Model B and a BigClown Motion Detector Kit, alongside a TP-Link smart plug and the open-source Home Assistant platform. The Home Assistant smartphone app documents when the smart plug is activated, and for how long, which also means you can track when your pet is drinking and check they’re getting enough water.
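The message flow Vladimir describes could be sketched in Python like this — the topic names are hypothetical, and in the real build Home Assistant's automation engine does this work declaratively rather than in hand-written code:

```python
# A sketch of the MQTT plumbing: translate a motion-sensor payload
# into a smart-plug command. Topic names are invented for illustration.
def on_motion(payload):
    """Map a PIR MQTT payload to a smart-plug command, or None."""
    if payload == "motion":
        return ("home/fountain/plug/set", "ON")
    return None

# With a live broker and the paho-mqtt client library, this handler
# would be wired up roughly like so:
#
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.on_message = lambda c, u, msg: handle(on_motion(msg.payload.decode()))
#   client.connect("localhost")
#   client.subscribe("home/fountain/pir")
#   client.loop_forever()
```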

Vladimir goes into far more detail in the project tutorial. Now go help your cat stay hydrated!


We’re on holiday!

By Liz Upton

It’s a bank holiday here in the UK, so we’re taking the day off to spend some time with our families. If you’re desperate to read some content, I’ve got good news for you: there are thousands of posts about the Raspberry Pi that you can leaf through right here. Head over to the archive and fill your boots!

Normal service will resume tomorrow. In the meantime, here’s Hypnotoad so you can have something to look at.


Liverpool MakeFest | HackSpace magazine #19

By Ben Everard

The news that UK Maker Faire was to shut its doors came as a bit of a surprise to many. This vibrant weekend of makers meeting, sharing, and learning was absolutely brilliant, and left us fizzing with ideas after our visits there. We’re sad that it’s gone.

Makers being makers though, if there’s demand, it will be filled. And that’s exactly what’s happening in Liverpool with Liverpool MakeFest. On 29 June 2019, the MakeFest will hold its fifth iteration. This is the UK’s biggest free maker event, attracting thousands of visitors, and its vision of a free, maker-focused festival is spreading far and wide.

We visited the mid-Victorian splendour of Liverpool Central Library, the home of MakeFest, to talk to the founders — Denise Jones, Mark Feltham, and Caroline Keep — to find out what makes this event special.

Liverpool MakeFest 2019 is taking place at the Central Library, Saturday 29 June 2019, and it’s completely free to attend

HackSpace magazine: Hello! Thanks for having us over here. How did the three of you come together to start Liverpool MakeFest?

Caroline Keep: I was a geotechnical engineer, Mark’s an academic, and Denise is a librarian. We bumped into each other watching a workshop in lantern making. Mark had all the academic experience. When I came to work with Mark on his makerspace, I was the geeky maker — he didn’t even have a smartphone at that time. I got the education bug and then moved into secondary school teaching.

Mark Feltham: It all started over there, as a chance meeting. We bumped into each other and got chatting. Within six weeks, we’d filled the library. We thought it would be a one-off, but since then it’s taken off.

Caroline is the reigning TES New Teacher Of The Year

HS: So no business plan, no franchising fees, no world domination?

CK: We’ve just winged it. We made all the banners, bunting. The first year my PGCE fund paid for MakeFest! This building reopened again in 2013, and in 2014 we were lucky that they were running a programme of events and initiatives to make it a really vibrant building, so it was the right time as well. We thought we’d have a little room off to the side and get maybe six tables. We’d already done a Mini Maker Faire, and we’ve always been good friends with [local makerspace] DoES Liverpool, so we were confident we’d get at least a few people turning up. And in six weeks we were full.

MF: We pulled the first one off — we’re talking the first three floors of the library and 60 makers — for £850. And that included feeding them and making badges as well.

One of the spin-offs that have come out of MakeFest is Little Sandboxes, which takes making out to deprived areas of the city

HS: For context, this building is huge. It’s bigger than most libraries; it’s probably about the same size as the Life Centre in Newcastle, where UK Maker Faire was held until recently. It must have helped to have a librarian on board to negotiate with the powers that be?

Denise Jones: I had to sell it to the people in charge back then, which were the head of service and the manager of this building. The Department for Culture, Media and Sport has a Taskforce for Libraries, which is funded until next year. We’re close to finishing the national guidance now for the Taskforce — the idea is to get one of these [MakeFests] in every library. We wanted the guidance doc to be inclusive of museums and libraries, because we knew that Manchester had opted to put their MakeFest in a museum. We’ve got Chester and Stoke MakeFest, and there’s one in the pipeline in Wrexham. We were having the same conversations over and over again, so we decided to write a document: how to run a MakeFest.

Liverpool Central Library was renovated a few years ago — the precious books went into temporary storage in a salt mine in Cheshire to keep them dry

HS: What have we got to look forward to this year and beyond?

CK: That’s a good question. We’ve got some corking stuff coming this year. We’ve given it the theme ’Space and time – creativity in the making’. We’ve got events planned for the Apollo anniversary, and [just] before MakeFest we’re going to kick off with a music day, showing people how to make music, and making the instruments to make music. That’s another spin-off that’s come out of MakeFest: the MakerNoise Unconference at Edge Hill University.

MF: We’ve always felt that we hold MakeFest in trust for makers. In terms of where it goes long-term, I don’t see it ever becoming more than a one-day event here, because one day is good. It gives people Sunday to get over things, and get home because they have day jobs on a Monday. We’re always very sensitive to that, we don’t want to take up too much of people’s time. The other thing is that I don’t see it spilling out into a bigger building; it’s always going to be in the library. But the way to grow it is to put it in other libraries. Not to make this one, Liverpool, bigger and take over. Then each maker community gets its own feel, and its own vibe — Stoke MakeFest has a very different feel to ours, because their maker scene is different to ours, and their city is different to ours.

The other way to expand it is that, rather than by just expanding to other cities, you can have more events on throughout the year. Rather than being solely a one-day event, you can have all these spin-offs, so once a month there’s something going on. Rather than it just being about tech and digital, we’ve always liked to have some sort of fantasy element. Things like Doctor Who, Star Wars, Darth Vader, K-9 — the kids love that. We have a lot of friends who are into steampunk; they get roped in to do front-of-house duties. You know what the funny thing was at the first one? Not only did the public enjoy it, but also the makers. It’s kind of like a musician playing an acoustic set. We’ve got a get-together on the Thursday before, we’ve got a Friday night party going, we always do an after-party. The public come on the Saturday, but there’s always stuff going on that week for makers.

In addition to always wanting it to be free for the public, and for the makers to not have to pay for their stand, we feel very strongly that we should give something back. We always give them lunch, we always give them a badge, and there’s always a party. We can’t pay them, but it’s our way of showing our appreciation to the makers who come and make it what it is. The celebration and sharing are big parts of the maker ethos.

People like to show [their projects] not to show off, not to say ‘Look at how clever I am’ — it’s more to say ‘Look at this awesome thing, isn’t this cool?’ Trying to explain that to people can be tricky. You can make this: here’s how you do it. That’s the ethos.

CK: I always feel with MakeFest — you said it’s like an acoustic gig. I always envisioned it as Liverpool’s party for makers. It’s our little get-together, and that’s how I like it.

Read the full interview in HackSpace magazine issue 19, out now! This month we’re looking at building a walking robot, laser cutting LED jewellery, the 555 timer chip, and much more. Download the issue for free, or buy it in print on our website.

Get HackSpace magazine issue 19 from all good newsagents

Special subscription offer

To have 132 pages of making delivered to your doorstep every month, subscribe to HackSpace magazine from just £5 for your first three issues.


Make a Donkey Kong–style walk cycle | Wireframe issue 14

By Ryan Lambie

Effective animation gave Donkey Kong barrels of personality. Raspberry Pi’s own Rik Cross explains how to create a similar walk cycle.

Donkey Kong wasn’t the first game to feature an animated character who could walk and jump, but on its release in 1981, it certainly had more personality than the games that came before it. You only have to compare Donkey Kong to another Nintendo arcade game that came out just two years earlier — the half-forgotten top-down shooter Sheriff — to see how quickly both technology and pixel art moved on in that brief period. Although simple by modern standards, Donkey Kong’s hero Jumpman (later known as Mario) packed movement and personality into just a few frames of animation.

In this article, I’ll show you how to use Python and Pygame to create a character with a simple walk cycle animation like Jumpman’s in Donkey Kong. The code can, however, be adapted for any game object that requires animation, and even for multiple game object animations, as I’ll explain later.

Jumpman’s (aka Mario’s) walk cycle comprised just three frames of animation.

Firstly, we’ll need some images to animate. As this article is focused on the animation code and not the theory behind creating walk cycle images, I grabbed some suitable images created by Kenney Vleugels and available at opengameart.org.

Let’s start by animating the player with a simple walk cycle. The two images to be used in the animation are stored in an images list, and an animationindex variable keeps track of the index of the current image in the list to display. So, for a very simple animation with just two different frames, the images list will contain two different images:

images = ['walkleft1', 'walkleft2']

To achieve a looping animation, the animationindex is repeatedly incremented, and is reset to 0 once the end of the images list is reached. Displaying the current image can then be achieved by using the animationindex to reference and draw the appropriate image in the animation cycle:

self.image = self.images[self.animationindex]

A list of images along with an index is used to loop through an animation cycle.

The problem with the code described so far is that the animationindex is incremented once per frame, and so the walk cycle will happen way too quickly, and won’t look natural. To solve this problem, we need to tell the player to update its animation every few frames, rather than every frame. To achieve this, we need another couple of variables; I’ll use animationdelay to store the number of frames to skip between displayed images, and animationtimer to store the number of frames since the last image change.

Therefore, the code needed to animate the player becomes:

self.animationtimer += 1
if self.animationtimer >= self.animationdelay:
    self.animationtimer = 0
    self.animationindex += 1
    if self.animationindex > len(self.images) - 1:
        self.animationindex = 0
    self.image = self.images[self.animationindex]
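Pulled together into a minimal, self-contained class — Pygame-free, so the timing logic can be followed on its own; the image names are placeholders standing in for loaded sprites:

```python
class Walker:
    """Timer-driven frame cycling, as described above."""
    def __init__(self, images, delay):
        self.images = images
        self.animationdelay = delay   # frames to wait between image changes
        self.animationtimer = 0
        self.animationindex = 0
        self.image = images[0]

    def update(self):
        """Call once per game frame; advances the animation every `delay` frames."""
        self.animationtimer += 1
        if self.animationtimer >= self.animationdelay:
            self.animationtimer = 0
            self.animationindex += 1
            if self.animationindex > len(self.images) - 1:
                self.animationindex = 0
            self.image = self.images[self.animationindex]
```

With a delay of 3, the displayed image changes only every third frame, giving a much more natural walking pace.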

So we have a player that appears to be walking, but now the problem is that the player walks constantly, and always in the same direction! The rest of this article will show you how to solve these two related problems.

There are a few different ways to approach this problem, but the method I’ll use is to make use of game object states, and then have different animations for each state. This method is a little more complicated, but it’s very adaptable.

The first thing to do is to decide on what the player’s ‘states’ might be — stand, walkleft, and walkright will do as a start. Just as we did with our previous single animation, we can now define a list of images for each of the possible player’s states. Again, there are lots of ways of structuring this data, but I’ve opted for a Python dictionary linking states and image lists:

self.images = {
    'stand': ['stand1'],
    'walkleft': ['walkleft1', 'walkleft2'],
    'walkright': ['walkright1', 'walkright2']
}

The player’s state can then be stored, and the correct image obtained by using the value of state along with the animationindex:

self.image = self.images[self.state][self.animationindex]

The correct player state can then be set by reading the keyboard input: walkleft if the left arrow key is pressed, or walkright if the right arrow key is pressed. If neither key is pressed, the player is set to the stand state, whose image list contains a single image of the player facing the camera.
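That key-to-state mapping is small enough to sketch directly — the boolean arguments here stand in for Pygame Zero's `keyboard.left` and `keyboard.right` checks:

```python
def choose_state(left_pressed, right_pressed):
    """Pick the player state from the arrow keys, defaulting to standing."""
    if left_pressed:
        return "walkleft"
    if right_pressed:
        return "walkright"
    return "stand"
```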

Animation cycles can be linked to player ‘states’.

For simplicity, a maximum of two images are used for each animation cycle; adding more images would create a smoother or more realistic animation.

Using the code above, it would also be possible to easily add additional states for, say, jumping or fighting enemies. You could even take things further by defining an Animation() object for each player state. This way, you could specify the speed and other properties (such as whether or not to loop) for each animation separately, giving you greater flexibility.
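One possible shape for such an `Animation()` object, as a hedged sketch — the names and defaults are invented for illustration:

```python
class Animation:
    """Per-state animation: its own images, speed, and loop behaviour."""
    def __init__(self, images, delay=5, loop=True):
        self.images = images
        self.delay = delay   # frames between image changes
        self.loop = loop     # restart at the end, or hold the last frame
        self.timer = 0
        self.index = 0

    def update(self):
        self.timer += 1
        if self.timer >= self.delay:
            self.timer = 0
            if self.index < len(self.images) - 1:
                self.index += 1
            elif self.loop:
                self.index = 0

    @property
    def image(self):
        return self.images[self.index]
```

A one-shot jump animation would use `loop=False` so the player holds the final frame until landing, while walk cycles loop as before.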

Here’s Rik’s animated walk cycle, coded in Python. To get it running on your system, you’ll first need to install Pygame Zero. And to download the full code, go here.

Get your copy of Wireframe issue 14

You can read more features like this one in Wireframe issue 14, available now at Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 14 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!


His Royal Highness the Duke of York visits Raspberry Pi HQ

By Philip Colligan

We welcomed a very special guest to Raspberry Pi HQ today.

Our Patron, His Royal Highness Prince Andrew, the Duke of York, visited our central Cambridge HQ to meet our team, learn more about our work, and give his support for our mission to help more young people learn how to create with computers.

Prince Andrew speaking at a lectern

Royalty and Raspberry Pi

Avid readers of this blog will know that this isn’t Raspberry Pi’s first royal encounter. Back in 2014, Raspberry Pi was one of the UK tech startups invited to showcase our product at a reception at Buckingham Palace. At that stage, we had just celebrated the sale of our two millionth credit card–sized computer.

Fast forward to October 2016, when we were celebrating the sale of our ten millionth Raspberry Pi computer with a reception at St James’s Palace attended by 150 members of our community. By this time, not only was our product flying off the shelves, but the Foundation had merged with Code Club, had expanded its teacher training programmes, and was working with thousands of volunteers to bring computing and digital making to tens of thousands of young people all over the world.

Prince Andrew and a woman watching a computer screen

Both of our trips to the royal palaces were hosted by Prince Andrew, who has long been a passionate advocate for technology businesses and digital skills. On top of his incredible advocacy work, he’s also an entrepreneur and innovator in his own right, founding and funding initiatives such as iDEA and Pitch at the Palace, which make a huge impact on digital skills and technology startups.

We are really very fortunate to have him as our Patron.

Leaps and bounds

Today’s visit was an opportunity to update Prince Andrew on the incredible progress we’ve made towards our mission since that first trip to Buckingham Palace.

We now have over 25 million Raspberry Pi computers in the wild, and people use them in education, in industry, and for their hobbies in an astonishing number of ways. Through our networks of Code Clubs and CoderDojos, we have supported more than a million young people to learn how to create with technology while also developing essential life skills such as teamwork, resilience, and creativity. You can read more about what we’ve achieved in our latest Annual Review.

Prince Andrew speaking to two seated people

We talked with Prince Andrew about our work to support computing in the classroom, including the National Centre for Computing Education in England, and our free online teacher training that is being used by tens of thousands of educators all over the world to develop their skills and confidence.

Prince Andrew shares our determination to encourage more girls to learn about computing and digital making, and we discussed our #realrolemodels campaign to get even more girls involved in Code Clubs and CoderDojos, as well as the groundbreaking gender research project that we’ve launched with support from the UK government.

Dream team

One of our rituals at the Raspberry Pi Foundation is the monthly all-staff meetup. On the third Wednesday of every month, colleagues from all over the world congregate in Cambridge to share news and updates, learn from each other, and plan together (and yes, we have a bit of fun too).

Prince Andrew and three other men watching a computer screen

My favourite part of Prince Andrew’s visit is that he organised it to coincide with the all-staff meetup. He spent most of his time speaking to team members and hearing about the work they do every day to bring our mission to life through creating educational resources, supporting our massive community of volunteers, training teachers, building partnerships, and much more.

In his address to the team, he said:

Raspberry Pi is one of those organisations that I have been absolutely enthralled by because of what you have enabled. The fact that there is this piece of hardware that started this, and that has led to educational work that reaches young people everywhere, is just wonderful.

In the 21st century, every single person in the workplace is going to have to use and interact with some form of digital technology. The fact that you are giving the next generation the opportunity to get hands-on is fantastic.

The post His Royal Highness the Duke of York visits Raspberry Pi HQ appeared first on Raspberry Pi.

The NSFW Roomba that screams when it bumps into stuff

Par Alex Bate

Hide yo’ kids, hide yo’ wife — today’s project is NSF(some)W, and definitely not for your kids. LOTS OF SWEARS. You have been warned. We’re not embedding the video here so you can decide for yourself whether or not to watch it — click on the image below to watch a sweary robot on YouTube.

Sweary Roomba

Michael Reeves is best known for his… educational Raspberry Pi projects.

He’s back, this time with yet another NSFW (depending on your W) project that triggers the sensors in a Roomba smart vacuum to scream in pain whenever it bumps into an object.

Because why not?

How it’s made

We have no clue. So very done is Michael with fans asking how his projects are made — “I hate every single one of you!” — that he refuses to say how he did it. But we know this much is true: the build uses optical sensors, relays, a radio receiver, and a Raspberry Pi. How do we know? Because he showed us:

Roomba innards

But as for the rest? We leave it up to you, our plucky community of tinkerers, to figure it out. Share your guesses in the comments.

More Michael Reeves

Michael is one of our Pi Towers guilty pleasures, and if, like us, you want to watch more of his antics, you should subscribe to him on YouTube.

The post The NSFW Roomba that screams when it bumps into stuff appeared first on Raspberry Pi.

Build your own animatronic GLaDOS

Par Liz Upton

It’s 11 years since Steam’s Orange Box came out, which is probably making you feel really elderly. Portal was the highlight of the game bundle for me — cue giant argument in the comments — and it still holds up brilliantly. It’s even in the Museum of Modern Art’s collection; there’s nothing that quite says you’re part of the establishment like being in a museum. Cough.

I bought an inflatable Portal turret to add to the decor in Raspberry Pi’s first office (I’m still not sure why; I just thought it was a good idea at the time, like the real-life Minecraft sword). Objects and sounds from the game have embedded themselves in pop culture; there’s a companion cube paperweight somewhere in my desk at home, and I bet you’ve encountered a cake that looks like this sometime in the last 11 years or so.

A lie

But turrets, cakes, and companion cubes pale into viral insignificance next to the game’s outstanding antagonist, GLaDOS, a psychopathic AI system who just happens to be my favourite video game bad guy of all time. So I was extremely excited to see Element14’s DJ Harrigan make an animatronic GLaDOS, powered, of course, by a Raspberry Pi.

Animatronic GLaDOS head with Raspberry Pi

The Portal franchise is one of the most engaging puzzle games of the last decade and, beyond the mind-bending physics, it is also known for its charming AI antagonist: GLaDOS. Join DJ on his journey to build yet more robotic characters from pop culture as he “brings her to life” with a Raspberry Pi and surely dooms us all.

Want to make your own? You’ll find everything you need here. I’ve been trying awfully hard not to end this post on a total cliche, but I’m failing hard: this was a triumph.

The post Build your own animatronic GLaDOS appeared first on Raspberry Pi.

Play musical chairs with Marvel’s Avengers

Par Alex Bate

You read that title correctly.

I played musical chairs against the Avengers in AR

Playing with the Avengers

Abhishek Singh recently shared his latest Unity creation on Reddit. And when Simon, Righteous Keeper of the Swag at Pi Towers, shared it with us on Slack because it uses a Raspberry Pi, we all went a little doolally.

As Abhishek explains in the video, the game uses a Raspberry Pi to control sensors and lights, bridging the gap between augmented reality and the physical world.

“The physical world communicates with the virtual world through these buttons. So, when I sit down on a physical chair, and press down on it, the virtual characters know that this chair is occupied,” he explains, highlighting that the chairs’ sensors are attached to a Raspberry Pi. To save the physical-world player from accidentally sitting on Thanos’s lap, LEDs, also attached to the Pi, turn on when a chair is occupied in the virtual world.
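Abhishek hasn’t published his code, but the two-way bridge he describes can be sketched as a pair of small mappings. Everything below (pin numbers, chair IDs, function names) is our own assumption for illustration, not the project’s actual code; on a real Pi the pin reads and LED writes would go through a GPIO library such as gpiozero.

```python
# Hypothetical sketch of the physical/virtual chair bridge.
# Physical side: pressed pressure switches (GPIO pins) -> chair IDs
# reported to the Unity game, so virtual characters know a chair is taken.
# Virtual side: virtually occupied chairs -> LED pins to switch on,
# so the human player doesn't sit down on Thanos's lap.

def physical_update(pressed_pins, pin_to_chair):
    """Translate pressed GPIO pins into chair IDs to report to the game."""
    return sorted(pin_to_chair[p] for p in pressed_pins)

def leds_to_light(virtual_occupancy, chair_to_led):
    """Return the LED pins to turn on for chairs occupied in the virtual world."""
    return sorted(led for chair, led in chair_to_led.items()
                  if virtual_occupancy.get(chair))
```

In a real build, `physical_update` would run on each GPIO edge and its result be sent to Unity over the network, while `leds_to_light` would run whenever the game reports a change in virtual occupancy.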

Turning the losing Avenger to dust? Priceless 👌

Why do you recognise Abhishek Singh?

You might be thinking, “Where do I recognise Abhishek Singh from?” I was asking myself this for a solid hour — until I remembered Peeqo, his robot that only communicates through GIF reactions. And Instagif NextStep, his instant camera that prints GIFs!

First GIFs, and now musical chairs with the Avengers? Abhishek, it’s as if you’ve understood the very soul of the folks who work at Pi Towers, and for that, well…

The post Play musical chairs with Marvel’s Avengers appeared first on Raspberry Pi.

Raspberry Pi Press: what’s on our newsstand?

Par Alex Bate

Raspberry Pi Press, the publishing branch of Raspberry Pi Trading, produces a great many magazines and books every month. And in keeping with our mission to make computing and digital making as accessible as possible to everyone across the globe, we make the vast majority of our publications available as free PDFs from the day we release new print versions.

We recently welcomed Custom PC to the Press family, and we’ve just published the new-look issue 190. So this is a perfect time to showcase the full catalogue of Raspberry Pi Press publications, to help you get the most out of what we have on offer.

The MagPi magazine

The MagPi was originally created by a group of Raspberry Pi enthusiasts from the Raspberry Pi forum who wanted to make a magazine that the whole community could enjoy. Packed full of Pi-based projects and tutorials, and Pi-themed news and reviews, The MagPi now sits proudly upon the shelves of Raspberry Pi Press as the official Raspberry Pi magazine.

The MagPi magazine issue 81

Visit The MagPi magazine online, and be sure to follow them on Twitter and subscribe to their YouTube channel.

HackSpace magazine

The maker movement is growing and growing as ever more people take to sheds and makerspaces to hone their skills in woodworking, blacksmithing, crafting, and other creative techniques. HackSpace magazine brings together the incredible builds of makers across the world with how-to guides, tips and advice — and some utterly gorgeous photography.

Visit the HackSpace magazine website, and follow their Twitter account and Instagram account.

Wireframe magazine

“Lifting the lid on video games”, Wireframe is a gaming magazine with a difference. Released every two weeks, Wireframe reveals to readers the inner workings of the video game industry. Have you ever wanted to create your own video game? Wireframe also walks you through how to do it in their ‘Toolbox’ section, which features tutorials from some of the best devs in the business.

Follow Wireframe magazine on Twitter, and learn more on their website.

Hello World magazine

Hello World is our free magazine for educators who teach computing and digital making, and we produce it in association with Computing at School and the BCS Academy of Computing. Full of lesson plans and features from teachers in the field, Hello World is a unique resource for everyone looking to bring computing into the classroom, and for anyone interested in computing and digital making education.

Hello World issue 8

Educators in the UK can subscribe to have Hello World delivered for free to their door; if you’re based somewhere else, you can download the magazine for free from the day of publication, or purchase it via the Raspberry Pi Press online store. Follow Hello World on Twitter and visit the website for more.

Custom PC magazine

New to Raspberry Pi Press, Custom PC is the UK’s best-selling magazine for PC hardware, overclocking, gaming, and modding. With monthly in-depth reviews, special features, and step-by-step guides, Custom PC is the go-to resource for turning your computer up to 11.

Visit the shiny new Custom PC website, and be sure to follow them on Twitter.

Books

Magazines aren’t our only jam: Raspberry Pi Press also publishes a wide variety of books, from introductions to topics like the C programming language and Minecraft on your Pi, to our brand-new Raspberry Pi Beginner’s Guide and the Code Club Book of Scratch.

An Introduction to C and GUI programming by Simon Long

We also bridge the gap between our publications with one-off book/magazine hybrids, such as HackSpace magazine’s Book of Making and Wearable Tech Projects, and The MagPi’s Raspberry Pi Projects Book series.

Getting your copies

If you’d like to support our educational mission at the Raspberry Pi Foundation, you can subscribe to our magazines, and you can purchase copies of all our publications via the Raspberry Pi Press website, from many high street newsagents, or from the Raspberry Pi Store in Cambridge. And most of our publications are available as free PDFs so you can get your hands on our magazines and books instantly.

Whichever of our publications you choose to read, and however you choose to read them, we’d love to hear what you think of our Raspberry Pi Press offerings, and we hope you enjoy them all.

The post Raspberry Pi Press: what’s on our newsstand? appeared first on Raspberry Pi.

Video call with a Raspberry Pi and Google Duo

Par Alex Bate

Use Google Duo and a Raspberry Pi to build a video doorbell for your home so you can always be there to answer your door, even when you’re not actually there to answer your door.

“Martin Mander builds a good build,” I reply to Liz Upton as she shares this project, Martin’s latest one, with me on Slack. We’re pretty familiar with his work here at Raspberry Pi! Previously, we’ve shared his Google AIY retrofit intercom, upcycled 1970s TV with built-in Raspberry Pi TV HAT, and Batinator. We love the extra step that Martin always takes to ensure the final result of each project is clean-cut and gorgeous-looking, with not even a hint of hot glue in sight.

Raspberry Pi video doorbell

“I’ve always fancied making a video doorbell using a Raspberry Pi,” explains Martin in the introduction to his project on Hackster.io. “[B]ut until recently I couldn’t find an easy way to make video calls that would both work in a project and be straightforward for others to recreate.”

By ‘recently’, he means February of this year, when Google released their Duo video chat application for web browsers.

With a Raspberry Pi 3B+ and a webcam in hand, Martin tested the new release, and lo and behold, he was able to video-call his wife with relative ease via Chromium, Raspbian‘s default browser.

“The webcam I tested had a built-in microphone, and even on the first thrown-together test call, the quality was great. This was a very exciting moment, unlocking the potential of the video doorbell project as well as many other possibilities.”

By accident, Martin also discovered that you can run Google Duo as a standalone app window, outside the regular browser, even on the Raspberry Pi. This allowed him to strip away all the unnecessary “Chromium furniture”.

But, if this was to be a video doorbell, how was he to tell the Raspberry Pi to call his mobile phone when the doorbell was activated?

“If Duo were a full app, then command line options might be available, for example to launch the app and immediately call a specific contact. In the absence of this (for now?) I needed to find a way to automatically start a call with a GPIO button press.”

To accomplish this, Martin decided to use PyUserInput, a community-built cross-platform module for Python. “The idea was to set up a script to wait for a button press, then move the mouse to the Contacts textbox, type the name of the contact, press Enter and click Video Call”, Martin explains. And after some trial and error — and calls to the wrong person — his project was a working success.
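The sequence Martin describes can be sketched in Python as a plain list of UI steps. The coordinates, contact name, and helper below are hypothetical placeholders, not Martin’s actual script; on the Pi, each step would be dispatched through PyUserInput’s `PyMouse` and `PyKeyboard` after the GPIO button press is detected.

```python
# Hedged sketch of the doorbell's call automation. Screen coordinates
# are placeholders: they would need measuring on the actual Duo window.

def call_sequence(contact, search_box=(400, 120), video_button=(400, 300)):
    """Build the ordered UI-automation steps that start a Duo call to `contact`."""
    return [
        ("move", search_box),     # move the mouse to the Contacts textbox
        ("click", search_box),    # focus it
        ("type", contact),        # type the contact's name
        ("key", "enter"),         # select the top search result
        ("move", video_button),   # move to the Video Call button
        ("click", video_button),  # start the call
    ]
```

On the Pi, a loop would wait for the doorbell’s GPIO pin, then feed each `("move", …)` to `PyMouse().move`, each `("click", …)` to `PyMouse().click`, and the typing steps to `PyKeyboard()`.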

To complete the build, Martin fitted the doorbell components into a 1980s intercom (see his previous intercom build), wired them through to a base unit inside the home, and then housed it all within an old Sony cassette player.

The final result? A functional video doorbell that is both gorgeous and practical. You can find out more about the project on the Hackster.io project page.

The post Video call with a Raspberry Pi and Google Duo appeared first on Raspberry Pi.

Possibilities of the Raspberry Pi — from Code Club to Coolest Projects USA

Par Alex Bate

Yolanda Payne is a veteran teacher and Raspberry Pi Certified Educator. After discovering a love for computers at an early age (through a RadioShack Tandy computer), Yolanda pursued degrees in Instructional/Educational Technology at Mississippi State University, the University of Florida, and the University of Georgia. She has worked as an instructional designer, webmaster, and teacher, and she loves integrating technology into her lessons. Here’s Yolanda’s story:

My journey to becoming a Raspberry Pi Certified Educator started when an esteemed mentor, Juan Valentin, tweeted about the awesome experience he had while attending Picademy. Having never heard of Picademy or the Raspberry Pi, I decided to check out the website and instantly became intrigued. I applied for a Raspberry Pi STEM kit from the Civil Air Patrol and received a Raspberry Pi and a ton of accessories. My curiosity would not be satisfied until I learned just what I could do with the box of goodies. So I decided to apply to Picademy and was offered a spot after being waitlisted. Thus my obsession with the possibilities of the Raspberry Pi began.

Code Club allows me to provide a variety of lessons, tailored to my students’ interests and skill levels, without me having to be an expert

While at Picademy, I learned about Code Club. Code Club allows me to provide a variety of lessons tailored to my learners’ interests and skill levels, without me having to be an expert in all of the lessons. My students are 6th- to 8th-graders, and there are novice coders as well as intermediate and advanced coders in the group. We work through lessons together, and I get to be a student with them.

I have found a myriad of resources to support their dreams of making

Although I may not have all the answers to their questions, I’m willing to work to secure whatever supplies they need for their project making. Whether through DonorsChoose, grants, student fundraising, or my personal contributions, I have found a myriad of resources to support their dreams of making.

Raspberry Pi group photo!

My district has invested in a one-to-one computer initiative for students, and I am happy to help students become creators of technology and not just consumers. Having worked with Code Club through the Raspberry Pi Foundation, my students and I realize just how achievable this dream can be. I’m able to enhance my Pi skills by teaching a summer hacking camp at our local university, and next year, we have goals to host a Pi Jam! Thankfully, my principal is very supportive of our endeavours.

Students at Coolest Projects USA 2018

This year, a few of my students and my son were able to participate in Coolest Projects USA 2018 to show off their projects, including a home surveillance camera, a RetroPie arcade game, a Smart Mirror, and a photo booth and dash cam. They dedicated a lot of time and effort to bring these projects to life, often on their own and beyond the hours of our Code Club. This adventure has inspired them, and they are already recruiting other students to join them next year! The possibilities of the Raspberry Pi constantly rejuvenate my curiosity and enhance the creativity that I get to bring to my teaching — both inside and outside the classroom.

Learn more

Learn more about the free programmes and resources Yolanda has used on her computer science education journey, such as Picademy, Code Club, and Coolest Projects, by visiting the Education section of our website.

The post Possibilities of the Raspberry Pi — from Code Club to Coolest Projects USA appeared first on Raspberry Pi.

Raspberry Pi captures a Soyuz in space!

Par Alex Bate

So this happened. And we are buzzing!

You’re most likely aware of the Astro Pi Challenge. In case you’re not, it’s a wonderfully exciting programme organised by the European Space Agency (ESA) and us at Raspberry Pi. Astro Pi challenges European young people to write scientific experiments in code, and the best experiments run aboard the International Space Station (ISS) on two Astro Pi units: Raspberry Pi 1 B+ computers with Sense HATs, encased in flight-grade aluminium spacesuits.

It’s very cool. So, so cool. As adults, we’re all extremely jealous that we’re unable to take part. We all love space and, to be honest, we all want to be astronauts. Astronauts are the coolest.

So imagine our excitement at Pi Towers when ESA shared this photo on Friday:

This is a Soyuz vehicle on its way to dock with the International Space Station. And while Soyuz vehicles ferry between Earth and the ISS all the time, what’s so special about this occasion is that this very photo was captured using a Raspberry Pi 1 B+ and a Raspberry Pi Camera Module, together known as Izzy, one of the Astro Pi units!

So if anyone ever asks you whether the Raspberry Pi Camera Module is any good, just show them this photo. We don’t think you’ll need to provide any further evidence after that.

The post Raspberry Pi captures a Soyuz in space! appeared first on Raspberry Pi.

Create an arcade-style zooming starfield effect | Wireframe issue 13

Par Ryan Lambie

Unparalleled depth in a 2D game: Pygame Zero extraordinaire Daniel Pope shows you how to recreate a zooming starfield effect straight out of the eighties arcade classic Gyruss.

The crowded, noisy realm of eighties amusement arcades presented something of a challenge for developers of the time: how can you make your game stand out from all the other ones surrounding it? Gyruss, released by Konami in 1983, came up with one solution. Although it was yet another alien blaster — one of a slew of similar shooters that arrived in the wake of Space Invaders, released in 1978 — it differed in one important respect: its zooming starfield created the illusion that the player’s craft was hurtling through space, and that aliens were emerging from the abyss to attack it.

This made Gyruss an entry in the ‘tube shooter’ genre — one that was first defined by Atari’s classic Tempest in 1981. But where Tempest used a vector display to create a 3D environment where enemies clambered up a series of tunnels, Gyruss used more common hardware and conventional sprites to render its aliens on the screen. Gyruss was designed by Yoshiki Okamoto (who would later go on to produce the hit Street Fighter II, among other games, at Capcom), and was born from his affection for Galaga, a 2D shoot-’em-up created by Namco.

Under the surface, Gyruss is still a 2D game like Galaga, but the cunning use of sprite animation and that zooming star effect created a sense of dynamism that its rivals lacked. The tubular design also meant that the player could move in a circle around the edge of the play area, rather than moving left and right at the bottom of the screen, as in Galaga and other fixed-screen shooters like it. Gyruss was one of the most popular arcade games of its period, probably in part because of its attention-grabbing design.

Here’s Daniel Pope’s example code, which creates a Gyruss-style zooming starfield effect in Python. To get it running on your system, you’ll first need to install Pygame Zero — find installation instructions here, and download the Python code here.

The code sample above, written by Daniel Pope, shows you how a zooming star field can work in Pygame Zero — and how, thanks to modern hardware, we can heighten the sense of movement in a way that Konami’s engineers couldn’t have hoped to achieve about 30 years ago. The code generates a cluster of stars on the screen, and creates the illusion of depth and movement by redrawing them in a new position in a randomly chosen direction each frame.

At the same time, the stars gradually increase their brightness over time, as if they’re getting closer. As a modern twist, Pope has also added an extra warp factor: holding down the Space bar increases the stars’ velocity, making that zoom into space even more exhilarating.
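Since the listing itself isn’t reproduced here, a rough reconstruction of the per-frame update might look like this. The screen size, tuning constants, and function names are our own assumptions, not Daniel Pope’s actual code: each star gets a direction chosen once at spawn, drifts outward from the centre, and brightens as it “approaches”; holding Space would simply raise `warp`.

```python
# Sketch of a Gyruss-style zooming starfield update (our reconstruction).
import math
import random

WIDTH, HEIGHT = 800, 600
CENTRE = (WIDTH / 2, HEIGHT / 2)

def new_star():
    """Spawn a dim star near the centre with a random outward direction."""
    return {
        "angle": random.uniform(0, 2 * math.pi),  # direction fixed at spawn
        "dist": 1.0,                              # distance from centre
        "brightness": 20,                         # starts dim, brightens
    }

def update_star(star, warp=1.0):
    """Advance one frame; return the star's new (x, y) screen position."""
    star["dist"] *= 1.0 + 0.05 * warp             # accelerate outward
    star["brightness"] = min(255, star["brightness"] + 3)
    x = CENTRE[0] + math.cos(star["angle"]) * star["dist"]
    y = CENTRE[1] + math.sin(star["angle"]) * star["dist"]
    if not (0 <= x < WIDTH and 0 <= y < HEIGHT):
        star.update(new_star())                   # recycle off-screen stars
    return x, y
```

In a Pygame Zero script, `update()` would call `update_star` for every star (passing a larger `warp` while `keyboard.space` is held), and `draw()` would plot each star as a grey pixel whose shade comes from `brightness`.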

Get your copy of Wireframe issue 13

You can read the rest of the feature in Wireframe issue 13, available now at Tesco, WHSmith, and all good independent UK newsagents.

Or you can buy Wireframe directly from Raspberry Pi Press — delivery is available worldwide. And if you’d like a handy digital version of the magazine, you can also download issue 13 for free in PDF format.

Make sure to follow Wireframe on Twitter and Facebook for updates and exclusive offers and giveaways. Subscribe on the Wireframe website to save up to 49% compared to newsstand pricing!

The post Create an arcade-style zooming starfield effect | Wireframe issue 13 appeared first on Raspberry Pi.
