When we announced new keyboards for Portugal and the Nordic countries last month, we promised that you wouldn’t have to wait much longer for a variant for Japan, and now it’s here!
Japanese Raspberry Pi keyboard
The Japan variant of the Raspberry Pi keyboard required a whole new moulding set to cover the 83-key arrangement of the keys. It’s quite a complex keyboard, with three different character sets to deal with. Figuring out how the USB keyboard controller maps to all the special keys on a Japanese keyboard was particularly challenging, with most web searches leading to non-English websites. Since I don’t read Japanese, it all became rather bewildering.
We ended up reverse-engineering generic Japanese keyboards to see how they work, and mapping the keycodes to key matrix locations. We are fortunate that we have a very patient keyboard IC vendor, called Holtek, which produces the custom firmware for the controller.
We then had to get these prototypes to our contacts in Japan, who told us which keys worked and which just produced a strange squiggle that they didn’t understand either. The “Yen” key was particularly tricky: no matter what we tried, many non-Japanese computers read it as a “\” character.
Special thanks are due to Kuan-Hsi Ho of Holtek, to Satoka Fujita for helping me test the prototypes, and to Matsumoto Seiya for also testing units and checking the translation of the packaging.
Get yours today
You can get the new Japanese keyboard variant in red/white from our Approved Reseller, SwitchScience, based in Japan.
If you’d rather your keyboard in black/grey, you can purchase it from Pimoroni and The Pi Hut in the UK, who both offer international shipping.
One of our favourite makers, Pi & Chips (AKA David Pride), wanted to see if they could trigger a DSLR camera to take pictures by using motion detection with OpenCV on Raspberry Pi.
You could certainly do this with a Raspberry Pi High Quality Camera, but David wanted to try with his swanky new Lumix camera. As well as a Raspberry Pi and whichever camera you’re using, you’ll also need a remote control. David sourced a cheap one from Amazon, since he knew full well he was going to be… breaking it a bit.
When it came to the “breaking” part, David explains: “I was hoping to be able to just re-solder some connectors to the button but it was a dual function button depending on depth of press. I therefore got a set of probes out and traced which pins on the chip were responsible for the actual shutter release and then *carefully* managed to add two fine wires.”
Next, David added Dupont cables to the ends of the wires to allow access to the breadboard, holding the cables in place with a blob of hot glue. Then a very simple circuit using an NPN transistor to switch via GPIO gave remote control of the camera from Python.
David then added OpenCV to the mix, using this tutorial on PyImageSearch. He took the basic motion detection script and added a tiny hack to trigger the GPIO when motion was detected.
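David’s exact “tiny hack” isn’t reproduced here, so this is a hypothetical sketch of the idea based on the PyImageSearch motion detector: when a large enough moving contour appears, pulse the GPIO pin wired to the remote’s shutter transistor. The pin number and thresholds are assumptions, not taken from his project.

```python
MIN_AREA = 5000   # ignore tiny contours (noise, shadows)
SHUTTER_PIN = 18  # GPIO pin wired to the remote's shutter transistor (assumed)

def motion_detected(contour_areas, min_area=MIN_AREA):
    """Return True if any contour is big enough to count as motion."""
    return any(area >= min_area for area in contour_areas)

def main():
    import time
    import cv2
    import RPi.GPIO as GPIO

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(SHUTTER_PIN, GPIO.OUT, initial=GPIO.LOW)
    time.sleep(10)  # grace period to get yourself in front of the camera

    cap = cv2.VideoCapture(0)
    avg = None
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.GaussianBlur(
                cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
            if avg is None:
                avg = gray.astype("float")
                continue
            # compare each frame against a running average of the scene
            cv2.accumulateWeighted(gray, avg, 0.5)
            delta = cv2.absdiff(gray, cv2.convertScaleAbs(avg))
            thresh = cv2.dilate(
                cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1], None)
            contours, _ = cv2.findContours(
                thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            if motion_detected([cv2.contourArea(c) for c in contours]):
                GPIO.output(SHUTTER_PIN, GPIO.HIGH)  # "press" the shutter
                time.sleep(0.2)
                GPIO.output(SHUTTER_PIN, GPIO.LOW)
    finally:
        cap.release()
        GPIO.cleanup()

if __name__ == "__main__":
    main()
```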
He needed to add a delay to the start of the script so he could position stuff, or himself, in front of the camera with time to spare. Got to think of those angles.
David concludes: “The camera was set to fully manual and to a really nice fast shutter speed. There is almost no delay at all between motion being detected and the Lumix actually taking pictures, I was really surprised how instantaneous it was.”
Here are some of the visuals captured by this Raspberry Pi-powered project…
Take a look at some more of David’s projects over at Pi & Chips.
One of our favourite YouTubers, Harrison McIntyre, decided to make the aphorism “a watched pot never boils” into reality. They modified a tabletop burner with a Raspberry Pi so that it will turn itself off if anyone looks at it.
In this project, the Raspberry Pi runs facial detection using a USB camera. If the Raspberry Pi finds a face, it deactivates the burner, and vice versa.
There’s a snag, in that the burner runs off 120 V AC and the Raspberry Pi runs off 5 V DC, so you can’t just power the burner through the Raspberry Pi. Harrison got round this problem using a relay switch, and beautifully explains how a relay manages to turn a circuit off and on without directly interfacing with the circuit at the two minute mark of this video.
Harrison sourced a switchable plug bar which uses a relay to turn its own switches on and off. Plug the burner and the Raspberry Pi into that and, hey presto, you’ve got them working together via a relay.
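A minimal sketch of the “watched pot” logic, assuming a Haar-cascade face detector and an active-high relay on GPIO 17 (Harrison’s exact wiring and code may differ):

```python
RELAY_PIN = 17  # GPIO pin driving the relay (assumed, active-high)

def burner_enabled(num_faces):
    """The burner may only run while nobody is watching it."""
    return num_faces == 0

def main():
    import cv2
    import RPi.GPIO as GPIO

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.LOW)

    # the cascade file ships with OpenCV
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = cascade.detectMultiScale(gray, 1.1, 5)
            # face seen -> relay (and burner) off; no face -> back on
            GPIO.output(RELAY_PIN,
                        GPIO.HIGH if burner_enabled(len(faces)) else GPIO.LOW)
    finally:
        cap.release()
        GPIO.cleanup()

if __name__ == "__main__":
    main()
```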
Things get jazzy at the four-minute 30-second mark. At this point, Harrison decides to upgrade his single-camera setup, rigging up six USB cameras to make sure that no matter where you are when you look at the burner, the Raspberry Pi will always see your face and switch it off.
Harrison’s multiple-camera setup proved a little much for the Raspberry Pi 3B he had to hand for this project, so he goes on to explain how he got a bit of extra processing power using a different desktop and an Arduino. He recommends going for a Raspberry Pi 4 if you want to try this at home.
Join us for Digital Making at Home: this week, young people can explore the graphics side of video game design! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.
So get ready to design video game graphics with us:
Fancy tracking the ISS’s trajectory? All you need is a Raspberry Pi, an e-paper display, an enclosure, and a little Python code. Nicola King looks to the skies
Standing on his balcony one sunny evening, the perfect conditions enabled California-based astronomy enthusiast Sridhar Rajagopal to spot the International Space Station speeding by, and the seeds of an idea were duly sown. Having worked on several projects using tri-colour e-paper (aka e-ink) displays, which he likes for their “aesthetics and low-to-no-power consumption”, he thought that developing a way of tracking the ISS using such a display would be a perfect project to undertake.
“After a bit of searching, I was able to find an open API to get the ISS location at any given point in time,” explains Sridhar. “I also knew I wouldn’t have to worry about the data changing several times per second or even per minute. Even though the ISS is wicked fast (16 orbits in a day!), this would still be well within the refresh capabilities of the e-paper display.”
His ISS Tracker works by obtaining the ISS location from the Open Notify API every 30 seconds. It appends this data point to a list, so older data is available. “I don’t currently log the data to file, but it would be very easy to add this functionality,” says Sridhar. “Once I have appended the data to the list, I call the drawISS method of my Display class with the positions array, to render the world map and ISS trajectory and current location. The world map gets rendered to one PIL image, and the ISS location and trajectory get rendered to another PIL image.”
Each latitude/longitude position is mapped to the corresponding XY co-ordinate. The last position in the array (the latest position) gets rendered as the ISS icon to show its current position. “Every 30th data point gets rendered as a rectangle, and every other data point gets rendered as a tiny circle,” adds Sridhar.
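The mapping Sridhar describes is essentially an equirectangular projection. Here is a sketch; the panel dimensions and the drawing helper are assumptions for illustration, not code from his repository:

```python
EPD_WIDTH, EPD_HEIGHT = 264, 176  # a common 2.7" tri-colour panel (assumed)

def latlon_to_xy(lat, lon, width=EPD_WIDTH, height=EPD_HEIGHT):
    """Map latitude [-90, 90] / longitude [-180, 180] to a pixel (x, y)
    on an equirectangular world map."""
    x = (lon + 180.0) / 360.0 * width
    y = (90.0 - lat) / 180.0 * height
    return int(x), int(y)

def draw_trajectory(draw, positions):
    """Render the trajectory on a PIL ImageDraw: every 30th fix as a
    rectangle, the rest as tiny circles (the last fix gets the ISS icon)."""
    for i, (lat, lon) in enumerate(positions[:-1]):
        x, y = latlon_to_xy(lat, lon)
        if i % 30 == 0:
            draw.rectangle((x - 2, y - 2, x + 2, y + 2))
        else:
            draw.ellipse((x - 1, y - 1, x + 1, y + 1))
```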
From there, the images are then simply passed into the e-paper library’s display method; one image is rendered in black, and the other image in red.
Little wonder that the response received from friends, family, and the wider maker community has been extremely positive, as Sridhar shares: “The first feedback was from my non-techie wife who love-love-loved the idea of displaying the ISS location and trajectory on the e-paper display. She gave valuable input on the aesthetics of the data visualisation.”
In addition, he tells us that other makers have contributed suggestions for improvements. “JP, a Hackster community user […] added information to make the Python code a service and have it launch on bootup. I had him contribute his changes to my GitHub repository – I was thrilled about the community involvement!”
Housed in a versatile, transparent ProtoStax enclosure designed by Sridhar, the end result is an elegant way of showing the current position and trajectory of the ISS as it hurtles around the Earth at 7.6 km/s. Why not have a go at making your own display so you know when to look out for the space station whizzing across the night sky? It really is an awesome sight.
The team at Raspberry Pi and our partner ESA Education are pleased to announce the winning and highly commended Mission Space Lab teams of the 2019–20 European Astro Pi Challenge!
Mission Space Lab sees teams of young people across Europe design, create, and deploy experiments running on Astro Pi computers aboard the International Space Station. Their final task: analysing the experiments’ results and sending us scientific reports highlighting their methods, results, and conclusions.
The science the teams performed was truly impressive, and the reports they sent us were of outstanding quality. A special round of applause to the teams for coordinating the writing of their reports while socially distanced!
The Astro Pi jury has now selected the ten winning teams, as well as eight highly commended teams:
And our winners are…
Vidhya’s code from the UK aimed to answer the question of how a compass works on the ISS, using the Astro Pi computer’s magnetometer and data from the World Magnetic Model (WMM).
Unknown from Externato Cooperativo da Benedita, Portugal, aptly investigated whether influenza is transmissible on a spacecraft such as the ISS, using the Astro Pi hardware alongside a deep literature review.
Space Wombats from Institut d’Altafulla, Spain, used normalized difference vegetation index (NDVI) analysis to identify burn scars from forest fires. They even managed to get results over Chernobyl!
Liberté from Catmose College, UK, set out to prove the Coriolis Effect by using Sobel filtering methods to identify the movement and direction of clouds.
Pardubice Pi from SPŠE a VOŠ Pardubice, Czech Republic, found areas of enormous vegetation loss by performing NDVI analysis on images taken from the Astro Pi and comparing this with historic images of the location.
Reforesting Entrepreneurs from Canterbury School of Gran Canaria, Spain, want to help solve the climate crisis by using NDVI analysis to identify locations where reforestation is possible.
1G5-Boys from Lycée Raynouard, France, innovatively conducted spectral analysis using Fast Fourier Transforms to study low-frequency vibrations of the ISS.
Cloud4 from Escola Secundária de Maria, Portugal, masterfully used a simplified static model and Fourier Analysis to detect atmospheric gravity waves (AGWs).
Cloud Wizzards from Primary School no. 48, Poland, scanned the sky to determine what percentage of the seas and oceans are covered by clouds.
Aguere Team 1 from IES Marina Cebrián, Spain, probed the behaviour of the magnetic field, acceleration, and temperature on the ISS by investigating disturbances, variations with latitude, and temporal changes.
Highly commended teams
Creative Coders, from the UK, decided to see how much of the Earth’s water is stored in clouds by analysing the pixels of each image of Earth their experiment collected.
Astro Jaslo from I Liceum Ogólnokształcące króla Stanisława Leszczyńskiego w Jaśle, Poland, used Riemannian geometry to determine the angle between light from the sun that is perpendicular to the Astro Pi camera, and the line segment from the ISS to Earth’s centre.
Jesto from S.M.S Arduino I.C.Ivrea1, Italy, used a multitude of the Astro Pi computers’ capabilities to study NDVI, magnetic fields, and aerosol mapping.
BLOOMERS from Tudor Vianu National Highschool of Computer Science, Romania, investigated how algae blooms are affected by eutrophication in polluted areas.
AstroLorenzini from Liceo Statale C. Lorenzini, Italy, used Kepler’s third law to determine the eccentricity, apogee, perigee, and mean tangential velocity of the ISS.
EasyPeasyCoding Verdala FutureAstronauts from Verdala International School & EasyPeasyCoding, Malta, utilised machine learning to differentiate between cloud types.
BHTeamEL from Branksome Hall, Canada, processed images using Y of YCbCr colour mode data to investigate the relationship between cloud type and luminescence.
Space Kludgers from Technology Club of Thrace, STETH, Greece, identified how atmospheric emissions correlate to population density, as well as using NDVI, ECCAD, and SEDAC to analyse the correlation of vegetation health and abundance with anthropogenic emissions.
The teams get a Q&A with astronaut Luca Parmitano
The prize for the winners and highly commended teams is the chance to pose their questions to ESA astronaut Luca Parmitano! The teams have been asked to record a question on video, which Luca will answer during a live stream on 3 September.
This Q&A event for the finalists will conclude this year’s European Astro Pi Challenge. Everyone on the Raspberry Pi and ESA Education teams congratulates this year’s participants on all their efforts.
It’s been a phenomenal year for the Astro Pi challenge: teams performed some great science, and across Mission Space Lab and Mission Zero, an astronomical 16,998 young people took part, from all ESA member states as well as Slovenia, Canada, and Malta.
Congratulations to everyone who took part!
Get excited for your next challenge!
This year’s European Astro Pi Challenge is almost over, and the next edition is just around the corner!
So we invite school teachers, educators, students, and all young people who love coding and space science to join us from September onwards.
Follow our updates on astro-pi.org and social media to make sure you don’t miss any announcements. We will see you for next year’s European Astro Pi Challenge!
We’ve really enjoyed starting a series of seminars on computing education research over the summer, as part of our strategy to develop research at the Raspberry Pi Foundation. We want to deepen our understanding of how young people learn about computing and digital making, in order to increase the impact of our own work and to advance the field of computing education.
Part of deepening our understanding is to hear from and work with experts from around the world. The seminar series, and our online research symposium, are an opportunity to do that. In addition, these events support the global computing education research community by providing relevant content and a forum for discussion. You can find the recordings and slides of all our previous seminar and symposium talks on our website.
Gender balance in your computing classroom: what the research says
Our seventh seminar presentation was given by Katharine Childs from our own team. She works on our DfE-funded Gender Balance in Computing programme and gave a brilliant summary of some of the recent research around barriers to gender balance in school computing.
In her presentation, Katharine considered belongingness, role models, relevance to real-world contexts, and non-formal learning. She drew out the links between theory and practice and suggested a range of interventions. I recommend watching the video of her presentation and looking through her slides.
Katharine has also been publishing a number of excellent blog posts summarising her research on gender balance:
From September, our computing education research seminars will take place on the first Tuesday of each month, starting at 17:00 UK time.
We’re excited about the range of topics to be presented, and about our fantastic lineup of speakers: an international group from Australia, the US, Ireland, and Scotland will present on a survey of computing education curricula and teaching around the world; Shuchi Grover will talk to us about formative assessment; and David Weintrop will share his work on block-based programming. I’ll be talking about my research on PRIMM and the benefits of language and talk in the programming classroom. And we’re lining up more speakers after that.
We’d like to thank everyone who has participated in our seminar series, whether as speaker or attendee. We’ve welcomed attendees from 22 countries and speakers from the US, UK, and Spain. You’ve all really helped us to start this important work, and we look forward to working with you in the next academic year!
8 Bits and a Byte created this automatic bubble machine, which is powered and controlled by a Raspberry Pi and can be switched on via the internet by fans of robots and/or bubbles.
They chose a froggy-shaped bubble machine, but you can repurpose whichever type you desire; it’s just easier to adapt a model running on two AA batteries.
Before the refurb, 8 Bits and a Byte’s battery-powered bubble machine was controlled by a manual switch, which turned the motor on and off inside the frog. If you wanted to watch the motor make the frog burp out bubbles, you needed to flick this switch yourself.
After dissecting their plastic amphibian friend, 8 Bits and a Byte hooked up its motor to Raspberry Pi using a relay module. They point to this useful walkthrough for help with connecting a relay module to Raspberry Pi’s GPIO pins.
We spied New Orleans–based Raspberry Pi–powered home brewing analysis and were interested in how this project could help other at-home brewers perfect their craft.
When you’re making beer, you want the yeast to eat up the sugars and leave alcohol behind. To check whether this is happening, you need to be able to track changes in gravity, known as ‘gravity curves’. You also have to do yeast cell counts, and you need to be able to tell when your beer has finished fermenting.
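For illustration, here is the generic homebrewing arithmetic behind those checks (this is a common rule of thumb, not code from Patrick’s Aleproof software): the drop in specific gravity approximates alcohol content, and a flattened gravity curve suggests fermentation has finished.

```python
def abv(original_gravity, final_gravity):
    """Approximate alcohol by volume (%) from specific gravity readings,
    using the common homebrewing rule of thumb: (OG - FG) * 131.25."""
    return (original_gravity - final_gravity) * 131.25

def fermentation_finished(gravity_curve, window=3, tolerance=0.001):
    """Heuristic: fermentation is done once the last `window` readings
    agree to within `tolerance` (i.e. the gravity curve has flattened)."""
    recent = gravity_curve[-window:]
    return len(recent) == window and max(recent) - min(recent) <= tolerance
```

So a beer that started at 1.050 and finished at 1.010 comes out at roughly 5.25% ABV.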
“We wanted a way to skip the paper and pencil and instead input the data directly into the software. Enter the Raspberry Pi!”
Patrick Murphy and co. created a piece of software called Aleproof which allows you to monitor all of this stuff remotely. But before rolling it out, they needed somewhere to test that it works. Enter the ‘Danger Shed’, where they ran Aleproof on Raspberry Pi.
A Raspberry Pi 3 Model B+ runs their Python-based program on Raspberry Pi OS and shares its intel via a mounted monitor.
Here’s what Patrick had to say about what they’re up to in the Danger Shed and why they needed a Raspberry Pi:
“We wanted a way to skip the paper and pencil and instead input the data directly into the software. Enter the Raspberry Pi! The shed is small, hot, has leaks, and is generally a hostile place for a full-size desktop computer. Raspberry Pi solves our problem in multiple ways: it’s small, portable, durable (in a case), and easily cooled. But on top of that, we are able to run the code using PyCharm, enter data throughout the brewing process, and fix bugs all from the shed!”
“The Raspberry Pi made it easy for us to set up our software and run it as a stand-alone brewing software station.”
Join us for Digital Making at Home: this week, young people can recreate classic* video games with us! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.
So get ready to code some classic retro games with us:
This Saturday morning, our friends Maddie Moate and Greg Foot will be live at The Centre for Computing History for a computing- and retro gaming-inspired episode of their show Let’s Go Live, and you can tune in from 10am to join the fun.
Retro gaming and computer funtimes
Saturday’s show will be a retro feast of vintage video games, and will answer questions such as ‘What is a computer?’ and ‘How do computers work?’. As always, Maddie and Greg have a number of activities planned, including designing pixel art and going on a tech safari! They’re also extremely excited to step inside a giant computer and try to put it back together!
Let’s Go Live
Let’s Go Live is a family science show that Maddie and Greg began on day 1 of lockdown to help with the challenge of homeschooling. Since then, Maddie and Greg have hosted 50 live shows from their ‘spare room studio’ and caught the attention of millions of families across the world who enjoy tuning into their daily dose of fun, facts, and science activities.
After a short break, the two are now back for the summer holidays and plan to make Let’s Go Live bigger and better than ever by bringing you live shows from unique locations across the UK — a new venue each week!
If you’ve already been following Maddie and Greg on their Let’s Go Live journey throughout lockdown, and you’re looking for more fun online content to entertain you and your family, look no further than the Raspberry Pi Foundation’s Digital Making at Home:
Digital Making at Home
Each week, we share a themed code-along video and host a live stream to inspire families to have fun with coding and digital making at home! Join Christina, Marc, Mr C and their host of special guests as they work their way through our great coding activities. This week, the Digital Making at Home team has been exploring outer space, and they show you how to use Scratch and Python code to race the International Space Station, animate astronauts, and defy gravity.
And our next theme for Digital Making at Home — out tomorrow just when Let’s Go Live finishes — is retro games!
You’ll find all the episodes of Digital Making at Home on our website — new ones are added every Saturday morning. And on the website, you can also tune into our weekly code-along live stream every Wednesday at 2pm BST!
The past few months have given us ample opportunity to stare at the creatures that reside outside. In issue 33 of Hackspace Magazine, out today, Rosie Hattersley looks at ways to track them.
It’s been a remarkable spring and early summer, and not just because many of us have had more time than usual to be able to appreciate our surroundings. The weather has been mild, the skies clear, and pollution levels low. As a result, it ought to be a bumper year for plants and wildlife. Unfortunately, the lockdown limited opportunities for embracing unexpectedly good weather while simultaneously making us more aware of the wildlife on our doorsteps.
“It’s a great time to take a fresh look at the world around us”
If you’re the outdoorsy type who likes to get out and stare intently at feathered friends from the comfort of a large shed on the edge of a lagoon, you may have spent the past few months getting to know suburban birds during your exercise walks, rather than ticking off unusual species. As things finally open up, it’s a great time to take a fresh look at the world around us, and at some of the projects focused on the creatures we share it with.
Make your own nature cam
Equipped with a Raspberry Pi connected to a camera and a USB power bank, we are able to spy on the wildlife in our garden. The Raspberry Pi Camera Module V2 is a good option here (it’s less intrusive than the newer High Quality Camera, though that would make a superb critter-cam). It’s important not to disturb wildlife with lighting, so use an infrared module, such as the NoIR Camera Module, if you want to snap evening or night-time wildlife activity. To connect the Camera Module to the Camera port on Raspberry Pi, gently pull up the edges of the port’s plastic clip, insert the ribbon cable, and push the clip back into place; the Camera Module will then remain attached. Try our ‘Getting started with the Raspberry Pi Camera Module’ guide.
You’ll need a keyboard and mouse to set up the Raspberry Pi, but you can disconnect them at the end. Insert the prepared microSD card and use a regular power supply to start it up (keep your power bank on charge separately while you set things up). Go through the Raspberry Pi setup, making sure you change the default password (since it will be accessible to anyone), and connect to your wireless network. It helps if you can access this network from the garden.
Turn on the interface for the camera, and enable SSH and VNC so you can access Raspberry Pi OS remotely when it’s sitting out in the garden. To do this, open Menu > Preferences > Raspberry Pi Configuration and click on Interface, then set Camera, SSH, and VNC to Enabled (see this documentation). Click Yes when advised that a reboot is needed.
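If you prefer the terminal, the same interfaces can usually be enabled non-interactively (this assumes a Raspberry Pi OS release whose raspi-config supports `nonint` mode):

```shell
# In raspi-config's non-interactive mode, 0 means "enable"
sudo raspi-config nonint do_camera 0
sudo raspi-config nonint do_ssh 0
sudo raspi-config nonint do_vnc 0
sudo reboot
```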
Next, test the camera. Open a terminal window and enter:
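Assuming the legacy camera stack that Raspberry Pi OS used at the time, the standard test command is:

```shell
# Show a preview, then save a still to the Desktop
raspistill -o Desktop/image.jpg
```

By default, raspistill shows the preview for about five seconds before capturing.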
A preview window will appear. After a few moments, it will save an image to the Desktop. Double-click the image.jpg file to open it.
You can use Python to take pictures and shoot video. This is handy if you want to create a time-lapse or video camera. This Raspberry Pi Project guide explains how to control the camera with Python.
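As a flavour of what that guide covers, a minimal time-lapse sketch with the picamera library might look like this (the interval, resolution, and filename pattern are illustrative):

```python
from time import sleep

def frames_needed(duration_s, interval_s=60):
    """How many stills a time-lapse of the given length will produce."""
    return duration_s // interval_s

def main():
    from picamera import PiCamera

    camera = PiCamera()
    camera.resolution = (1280, 720)
    sleep(2)  # give the sensor a moment to adjust to the light

    # capture_continuous yields a new numbered file each iteration
    for filename in camera.capture_continuous('image{counter:03d}.jpg'):
        print('Captured %s' % filename)
        sleep(60)  # one frame per minute

if __name__ == "__main__":
    main()
```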
Note that recording video will quickly fill up your storage space and drain the battery. A better idea is to leave the preview running and use VNC to view the camera remotely. A neater option is to hook up your Raspberry Pi to YouTube (as explained in this Raspberry Pi infrared bird-box project).
Open a web browser and go to studio.youtube.com. Sign in, or set up a YouTube account. You will need to enable permission to live-stream; this involves providing YouTube with your phone number. Click Settings, Channel, and ‘Feature eligibility’, expand ‘Features that require phone verification’, and click ‘Verify phone number’. Type in your phone number, then enter the code that YouTube sends you as a text message. For security reasons, it will take 24 hours for YouTube to activate this feature on your account.
Get your key and add to terminal
On the left-hand side of the screen you should see a menu with the My Channel option available:
In the middle of the screen you should see the Video Manager option. On the left you should see a Live Streaming option. Look for and select the ‘Stream now BETA’ option.
Scroll down to the bottom of the page and you should see the ENCODER SETUP option.
Here there is a Server URL and a Stream name/key. The key is shown as a line of asterisks, until you click the Reveal button. Keep the key secret and don’t share it online. Copy your Stream Key to a text document (password-protect it, ideally).
Open a terminal window and enter this command (replacing <key goes here> with your own key):
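The infrared bird-box project mentioned earlier used a raspivid-to-ffmpeg pipeline along these lines; the resolution and bitrate here are illustrative, so adjust them to taste:

```shell
# Stream the camera to YouTube Live: raspivid produces H.264, ffmpeg
# wraps it in FLV with a silent audio track (YouTube requires audio)
raspivid -o - -t 0 -w 1280 -h 720 -fps 25 -b 4000000 -g 50 | \
ffmpeg -re -ar 44100 -ac 2 -acodec pcm_s16le -f s16le -ac 2 -i /dev/zero \
  -f h264 -i - -vcodec copy -acodec aac -ab 128k -g 50 \
  -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/<key goes here>
```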
You can see details of scripts running at startup here.
Shut down Raspberry Pi and fit the computer and Camera Module inside a case (if you are using one). Position Raspberry Pi in your garden and power it with the USB power bank. It will connect to your wireless network and start streaming to YouTube using your stream key.
Navigate to your channel on YouTube at any time to see the action taking place in your garden.
We’re delighted to announce that our special judges — Eben Upton, Hayaatun Sillem, Limor Fried, Mitch Resnick, and Tim Peake — have chosen their favourite projects from the Coolest Projects online showcase!
Young tech creators from 39 countries are part of the showcase, including Ireland, Australia, Palestine, the UK, the USA, India, and Indonesia. In total, you’ll find an incredible 560 projects from 775 young creators in the showcase gallery.
Our judges have been amazed and inspired by all the young creators’ projects, and they want to highlight a few as their favourites!
Eben Upton’s favourites
Eben Upton is a founder of our organisation, one of the inventors of the Raspberry Pi computer, and CEO of Raspberry Pi Trading. Watch Eben’s favourites.
Haya: Bobby ‘A Platformer’
Kaushal: Diabetic Retinopathy Detector
Zaahra, Eesa: Easy Sylheti
Mahmoud: Fighting Against Coronavirus
Oisín: MiniGolf In Python
Artash, Arushi: The Masked Scales: The Sonification of the Impact
Hayaatun Sillem’s favourite projects
Dr Hayaatun Sillem is the CEO of the Royal Academy of Engineering, which brings together the UK’s leading engineers and technologists to promote engineering excellence for the benefit of society. Watch Hayaatun’s favourites.
Sara, Batool, Rahaf, Nancy: Children Body Language
Lars: Colourbird PicoBello
Alisa, Michelle: Green Coins
Marah: My School Website
Raluca: Protect the Planet!
Rhea: The Amazing Photo Filter
Mitch Resnick’s favourites
Mitch Resnick is Professor of Learning Research at the MIT Media Lab, and his Lifelong Kindergarten research group develops and supports the Scratch programming software and online community! Watch Mitch’s favourites.
One of our Approved Resellers in the Netherlands, Daniël from Raspberry Store, shared this Raspberry Pi–powered prayer reminder with us. It’s a useful application one of his customers made using a Raspberry Pi purchased from Daniël’s store.
As a Raspberry Pi Approved Reseller, I love to see how customers use Raspberry Pi to create innovative products. Spying on bird nests, streaming audio to several locations, using them as a menu in a restaurant, or in a digital signage solution… just awesome. But a few weeks ago, a customer showed me a new use of Raspberry Pi: a prayer clock for mosques.
Made by Mawaqit, this narrowcasting solution has a Raspberry Pi at its heart and can be used from any browser or smartphone.
If you do not have an internet connection, you’ll also need an RTC (real-time clock) module
Via an HDMI cable, Raspberry Pi can broadcast the clock — plus other useful info like the weather, or a reminder to silence your phone — on a wall in the mosque. Awesome! So simple, and yet I have not seen a solution like this before, despite Mawaqit’s application now being used in over 4,609 mosques across 51 countries. And, last I checked, it has more than 185,000 active users!
There are then two options: connected and offline. If you set yourself up using the connected option, you’ll be able to remotely control the app from your smartphone or any computer and tablet, which will be synchronised across all the screens connected to Raspberry Pi. You can also send messages and announcements. The latest updates from Mawaqit will install automatically.
If you need to choose the offline option because you’re not able to use the internet at your mosque, it’s important to equip your Raspberry Pi with an RTC module, because Raspberry Pi can’t keep time by itself.
The Mawaqit project is free of charge, and the makers actually prohibit harnessing it for any monetary gain. They have even created an API for you to build your own extensions — how great is that? So, if you want your own prayer clock for a mosque, school, or just your home, take a look at Mawaqit.net.
Since lockdown started, I’ve found I often miss video meetings. It’s not intentional; I simply lose track of time. Though my phone is set to remind me of upcoming meetings ten minutes before they begin, I have a habit of trying to fill that time with something productive, and before I know it, I have Eben on the phone, fifteen minutes after the meeting’s start, asking where I am.
Fixing the issue using code
Due to this, and because I was interested in playing with the Google API and learning a little more Python, I decided to write a simple application that will get your upcoming events from your Google Calendar and give you notifications as often as you want, visually on screen as well as through sound.
I call it NextEvent
Here’s the video tutorial to show you more:
And here’s the written tutorial too!
Installing NextEvent to your Raspberry Pi
To install NextEvent, open a terminal window (Ctrl-Alt-T) on Raspberry Pi and type:
The next thing you’ll see is NextEvent starting up, and then it’ll open Chromium. It is here that you will give Google permission to share your calendar with the application.
You’ll then need to log into your Google account and give NextEvent access to your calendar. Chromium will tell you when everything is fine and you can close the browser.
Now you can see your next five events along with the time left until each one. When the time gets down to five minutes, the application will turn red and ‘ding’ at you. It’ll ‘ding’ twice at one minute to go, and four times when your meeting is about to start!
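Those milestones could be captured by a small helper like the one below; note that this is a guess at the logic described above, not an excerpt from nextevent.py:

```python
def dings_for(seconds_left):
    """Return how many alert sounds to play for a given time-to-meeting."""
    if seconds_left <= 0:
        return 4   # your meeting is starting
    if seconds_left <= 60:
        return 2   # one minute to go
    if seconds_left <= 300:
        return 1   # five-minute warning (the UI also turns red here)
    return 0
```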
In case you want to delve into the code, maybe to create a meeting room ‘now and next’ display, the nextevent.py source contains the GUI and event processing part of NextEvent. You should be able to go here and change the number of lines, the colours, and the notification sounds.
How does it work?
If you’re the sort that likes to know HOW the code works, here’s a follow-up to the tutorial video where I explain exactly that!
Join us for Digital Making at Home: this week, young people can do out-of-this-world coding with our space-themed projects! Through Digital Making at Home, we invite kids all over the world to code along with us and our new videos every week.
We know crawl spaces are creepy, sweaty, and confining but, hear us out…
You need to keep an eye on the humidity level in your crawl space, as it can seriously affect the whole house’s overall health. It’s ideal to be able to do this remotely (given the creepy, sweaty atmosphere of the space), and a Raspberry Pi allows this.
Jamie Bailey took to Medium to share his Raspberry Pi setup that allows him to monitor the humidity of the crawl space in his home from a mobile device and/or laptop. His setup lets you check on the current humidity level and also see the historical data over time. You can also set alarms to be sent to you via text or email whenever the humidity level exceeds a certain threshold.
The hardware you need
Power outlet or extension cord in your crawl space
Raspberry Pi (3 or 4) or Raspberry Pi Zero W (or WH)
BME280 temperature/humidity/pressure sensor
Jamie used Initial State for data streaming and visualisations, but you can choose a free alternative
Jamie’s walk-through is extensive and includes all the command line code you’ll need too, so make sure to check it out if you attempt this build.
The BME280 sensor has four pins you need to connect to your Raspberry Pi. The sensor sends humidity data to the Raspberry Pi, which you’ll have already set up to let you know what’s happening remotely.
BME280 VIN pin connects to GPIO pin 1 (3.3V)
BME280 GND pin connects to GPIO pin 6 (GND)
BME280 SCL pin connects to GPIO pin 5 (SCL)
BME280 SDA pin connects to GPIO pin 3 (SDA)
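The alerting idea — a text or email whenever humidity crosses a threshold — reduces to a simple check. Jamie’s walkthrough uses Initial State’s own trigger feature for this; the sketch below is just an illustration of the logic, with made-up names, including a latch so you aren’t re-alerted on every reading while the level stays high:

```python
class HumidityAlert:
    """Fire an alert once when humidity crosses a threshold, then stay
    quiet until it drops back below the threshold again."""

    def __init__(self, threshold_percent):
        self.threshold = threshold_percent
        self.active = False  # True while we are above the threshold

    def update(self, humidity_percent):
        """Return True exactly when a new alert should be sent."""
        if humidity_percent >= self.threshold and not self.active:
            self.active = True
            return True
        if humidity_percent < self.threshold:
            self.active = False
        return False
```

You would call `update()` with each new reading from the BME280 and send your text or email whenever it returns True.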
Once you have all your software sorted and your hardware connected, turn your Raspberry Pi off and take it down to your crawl space (monitor, keyboard, and mouse are no longer necessary). Jamie advises hanging your Raspberry Pi from the floor joists instead of letting it touch the ground, to avoid contact with any water. He put a nail in one of the floor joists and draped the power cord over the nail (see above). Turn your tiny computer on, make sure data starts flowing into your dashboard, and you’ve got yourself a remote humidity sensor!
PS We’re English so… is a crawl space the same as an attic or what? Asking for a friend!
Computational thinking (CT) comprises a set of skills that are fundamental to computing and that are being taught in more and more schools across the world. There has been much debate about the details of what CT is and how it should be approached in education, particularly for younger students.
In our research seminar this week, we were joined by María Zapata Cáceres from the Universidad Rey Juan Carlos in Madrid. María shared research she and her colleagues have done around CT. Specifically, she presented work on how we can understand what CT skills young children are developing. Building on existing work on assessing CT, she and her colleagues have developed a reliable test for CT skills that can be used with children as young as 5.
Why do we need to test computational thinking?
Until we can assess something, María argues, we don’t know what children have or haven’t learned or what they are capable of. While testing is often associated with the final stages in learning, in order to teach something well, educators need to understand where their students’ skills are to know what they are aiming for them to learn. With CT being taught in increasing numbers of schools and in many different ways, María argues that it is imperative to be able to test learners on it.
How was the test developed?
One of the key challenges for assessing learning is knowing whether the activities or questions you present to learners are actually testing what you intend them to. To make sure this is the case, assessments go through a process of validation: they are tried out with large groups to ensure that the results they give are valid. María’s and her colleagues’ CT test for beginners is based on a CT test developed by another researcher. That test had been validated, but since it is aimed at 10- to 16-year-olds, María and her colleagues needed to adapt it for younger children and then validate the adapted test.
Developing the first version
The new test for beginners consists of 25 questions, each of which has four possible responses, which are to be answered within 40 minutes. The questions are of two types: one that involves using instructions to draw on a canvas, and one that involves moving characters through mazes. Since the test is for younger children, María and her colleagues designed it so it involves as little text as possible to reduce the need for reading; instead the test includes self-explanatory symbols.
Developing a second version based on feedback
To refine the test, the researchers consulted with a group of 45 experts about the difficulty of the questions and the length of the test. The general feedback was very positive.
Drawing on the experts’ feedback, María and her colleagues made some very specific improvements to the test to make it more appropriate for younger children:
The improved test mandates that a verbal explanation be given to children at the start, to make sure they clearly understand how to take the test and don’t have to rely on reading the instructions.
In some areas, the researchers added written explanations where experts had identified that questions contained ambiguity that could cause the children to misinterpret them.
A key improvement was to adapt the grids in the original test to include pathways between each box of the maze. It was found that children could misinterpret the maze, for example as allowing diagonal moves between squares; the added pathways are visual cues that make it clear this is not possible.
Validating the test
After these improvements, the test was validated with 299 primary school students aged 5–12. To assess the differences the improvements might make, the students were given different versions of the test. María and her colleagues found that the younger students benefited from the improvements, and the improvements made the test more reliable for testing students’ computational thinking: students made fewer errors due to ambiguity and misinterpretation.
Statistical analysis of the test results showed that the improved version of the test is reliable and can be used with confidence to assess the skills of younger children.
What can you use this test for?
Firstly, the test is a tool for educators who want to assess the skills young people have and develop over time. Secondly, the test is also valuable for researchers. It can be used in projects that evaluate the outcomes of different approaches to teaching computational thinking, as well as in projects investigating the effectiveness of specific learning resources, because the test can be given to children before and again after they engage with the resources.
Assessment is one of the many tools educators use to shape their teaching and promote the learning of their students, and tools like this CT test developed by María and her colleagues allow us to better understand what children are learning.
Redditor Mark Hank missed the tactile experience of vinyl records so he removed the insides of an old Sonos Boost to turn it into a Raspberry Pi- and NFC-powered music player. Yes, this really works:
The Sonos Boost was purchased for just £3 on eBay. Mark pulled all the original insides out of it and repurposed it as what they call a ‘vinyl emulator’ to better replicate the experience of playing records than what a simple touchscreen offers.
The Boost now contains a Raspberry Pi 3A+ and an ACR122U NFC reader, and it plays a specific album, playlist, or radio station when you tap a specific NFC tag on it. It’s teamed with Sonos speakers and NTAG213 NFC tags. The maker recommends you go with the largest tags you can find, as it will improve read performance; they went with these massive ones.
One of the album covers printed onto thick card
The tags are inside printouts mounted on 1mm thick card (those album cover artwork squares getting chucked at the Sonos in the video), and they’re “super cheap” according to the maker.
You’ll need to install the node-sonos-http-api package on your Raspberry Pi; it’s the basis of the whole back-end of the project. The maker provides full instructions on their original post, including on how to get Spotify up and running on your Raspberry Pi.
The whole setup neatened up
Rather than manually typing HTTP requests into a web browser, the maker wanted to automate the process so that the Raspberry Pi fires them itself when presented with a certain stimulus (i.e. when the NFC reader is triggered). They also walk you through this process in their step-by-step instructions.
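The automation step amounts to turning a tag’s UID into a node-sonos-http-api request. A minimal sketch of that mapping, where the UIDs, room name, and actions are all made-up placeholders rather than values from the original build (check the node-sonos-http-api docs for the exact actions you need):

```python
from urllib.parse import quote

# Map NFC tag UIDs to node-sonos-http-api actions. All values below are
# placeholders for illustration only.
TAG_ACTIONS = {
    "04a3f2b1": "favorite/Jazz",
    "04c9d807": "playlist/Morning",
}

def request_url(tag_uid, room="Living Room", host="http://localhost:5005"):
    """Build the URL to fire when a tag is scanned, following the API's
    room/action URL pattern. Returns None for unknown tags."""
    action = TAG_ACTIONS.get(tag_uid)
    if action is None:
        return None
    return f"{host}/{quote(room)}/{action}"
```

In the real build, the NFC reader’s callback would pass the scanned UID to a function like this and then issue an HTTP GET against the resulting URL.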
How the maker hid the mess under the display table
The entire build cost around £50, and the great thing is that it doesn’t need to sit inside an old Sonos Boost if you don’t want it to. The reader works through a modest thickness of wood, so you can mount it under a counter, install it in a ‘now listening’ stand, whatever — it’s really up to you.
Shot on a Raspberry Pi Camera Module, this stop-motion sequence is made up of 180 photos that took two hours to shoot and another hour to process.
The trick lies in the Camera Module enabling you to change the alpha transparency of the overlay image, which is the previous frame. It’s all explained in the official documentation, but basically, the Camera Module’s preview permits multiple layers to be rendered simultaneously: text, image, etc. Being able to change the transparency from the command line means this maker could see how the next frame (or the object) should be aligned. In 2D animation, this process is called ‘onion skinning’.
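Underneath, the onion-skin preview is just alpha compositing of the previous frame over the live view. This isn’t the picamera overlay API itself — the Camera Module does this blend in its preview hardware — but a small NumPy sketch of the same effect:

```python
import numpy as np

def onion_skin(live_frame, prev_frame, alpha=0.5):
    """Composite the previous frame over the live view at the given
    transparency. Frames are uint8 RGB arrays of equal shape; alpha=0
    shows only the live view, alpha=1 only the last captured frame."""
    blend = (alpha * prev_frame.astype(np.float32)
             + (1.0 - alpha) * live_frame.astype(np.float32))
    return blend.astype(np.uint8)
```

Being able to vary `alpha` on the fly is what lets you judge how closely the object in the live view lines up with its position in the previous frame.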
So why the Raspberry Pi Camera Module? Redditor /DIY_Maxwell aka Yuksel Temiz explains: “I make stop-motion animations as a hobby, using either my SLR or phone with a remote shutter. In most cases I didn’t need precision, but some animations like this are very challenging because I need to know the exact position of my object (the boat in this case) in each frame. The Raspberry Pi camera was great because I could overlay the previously captured frame into the live preview, and I could quickly change the transparency of the overlay to see how precise the location and how smooth the motion.”
You can easily make simple, linear stop-motion videos by just capturing your 3D printer while it’s doing its thing. Yuksel created a bolting horse (above) in that way. The boat sequence was more complicated though, because it rotates, and because pieces had to be added and removed.
The official docs are really comprehensive and span basic to advanced skill levels. Yuksel even walks you through getting started with the installation of Raspberry Pi OS.
We’ve seen Yuksel’s handiwork before, and this new project was made in part by modifying the code from the open-source microscope (above) they made using Raspberry Pi and LEGO. They’re now planning to make a nice GUI and share the project as an open-source stop-motion animation tool.