The all-new Hello World podcast for educators interested in computing and digital making

There is growing momentum behind the idea of putting computing, computer science, and digital making at the heart of modern education. At the Raspberry Pi Foundation, we want to connect with and support computing educators, both inside and outside of the classroom. Hello World magazine, which we started in 2017, is a platform to help educators all over the world to find inspiration, share experiences, and learn from one another. Hello World is free and has proven to be very popular, with subscribers hailing from 172 countries across the globe!

Hello World, coming directly to your ears now

The Hello World community has told us that they’re hungry for more content while they wait for each new magazine issue. So to complement the magazine, we’ve launched a brand-new Hello World podcast to meet this need! That means you can now hear directly from the educators who are writing Hello World articles, dive a little deeper, and have some fun along the way.

Guests Cat Lamin and Neil Rickus speaking to Hello World podcast hosts Carrie Anne Philbin and James Robinson about well-being and technology

In season 1 of the Hello World podcast, you will:

  • Explore the importance of creativity and passion in computing with PBS Digital Innovator and CUE Rock Star Amanda Haughs
  • Dive into the role of ethics in computing with Isaac Computer Science content creator Diane Dowling
  • Discover how to look after our well-being while teaching with technology, with practical tips from computing educator Cat Lamin and senior lecturer in computing education at the University of Hertfordshire Neil Rickus
  • Get answers to the question “Are these the droids you’re looking for to teach algorithms?” with computing teacher Huzaifah Zainon and advanced skills computing teacher Nicki Cooper

Listen and subscribe wherever you get your podcasts

Start listening to our first episodes now, wherever you usually get your podcasts. And make sure to subscribe to never miss an episode!


Let us know if you have a question or a topic you would like us to explore on the Hello World podcast. You can get even more involved by featuring as a guest on a future episode, sharing your top tips and best teaching practices with computing educators around the world. Get in touch with us at podcast@helloworld.cc with your suggestions! 




Source: Raspberry Pi – The all-new Hello World podcast for educators interested in computing and digital making

Recreate Galaxian’s iconic attack patterns | Wireframe #50

Blast dive-bombing aliens in our salute to Namco’s classic. Mark Vanstone has the code

Aliens swoop down towards the player, bombing as they go. Back in 1979, this was a big step forward from Taito’s Space Invaders.

Hot on the heels of the original Space Invaders, Galaxian emerged as a rival space shooter in 1979. Released by Namco, Galaxian brought new colour and unpredictable motion to the alien enemy, who would swoop down on the defending player. Galaxian was so popular in arcades that Namco released a sequel, Galaga, two years later – that game complicated the attack patterns even more. It’s difficult to say how many ports and clones have been made of Galaxian, as there are several versions of similar games for almost every home platform.

The player’s role in Galaxian is similar to that in Space Invaders: they pilot a ship and need to destroy a fleet of aliens. With Galaxian, however, the aliens have a habit of breaking formation, swooping down towards the player’s ship, and dive-bombing it. The aim is to destroy all the enemy ships and move on to the next wave, and each subsequent wave gets more difficult as the player progresses. For this sample, we’re going to look at that swooping mechanic, and make the bare nuts and bolts of a Galaxian game with Pygame Zero.

Our homage to the classic Galaxian, with angry aliens that love to break formation.

First, Galaxian has a portrait display, so we can set the play area’s width and height to be 600 and 800 respectively. Next, we can create a scrolling backdrop of stars using a bitmap that we blit to the screen and move downwards every update. We need a second blit of the stars to fill in the space that the first one leaves as it scrolls down, and we could also have another static background image behind them, which will provide a sense of depth.
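
Here’s a minimal sketch of that scrolling backdrop, assuming Pygame Zero’s update()/draw() loop and images named 'background' and 'stars', each the same size as the screen (the names are illustrative, not from Mark’s code):

WIDTH = 600
HEIGHT = 800

star_y = 0  # vertical offset of the scrolling star layer

def update():
    global star_y
    star_y += 2            # scroll the stars downwards each frame
    if star_y >= HEIGHT:   # wrap once the first copy has scrolled fully off
        star_y = 0

def draw():
    screen.blit("background", (0, 0))            # static layer for a sense of depth
    screen.blit("stars", (0, star_y))            # first copy of the stars
    screen.blit("stars", (0, star_y - HEIGHT))   # second copy fills the gap above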

Next, we set up the player ship as an Actor, and we’ll capture the left and right arrow keys in the update() function to move the ship left and right on the screen. We can also fire off a bullet with the SPACE bar, which will travel up the screen until it hits an alien or goes off the top of the screen. As in the original Galaxian, you can only shoot one bullet at a time, so we only need one Actor for this.
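
As a sketch of that input handling, again in Pygame Zero (the image names and speeds are illustrative):

ship = Actor("ship", (300, 750))
bullet = Actor("bullet", (0, 0))
bullet.active = False   # only one bullet may be in flight at a time

def update():
    if keyboard.left:
        ship.x -= 4
    if keyboard.right:
        ship.x += 4
    if keyboard.space and not bullet.active:
        bullet.pos = (ship.x, ship.y - 20)   # fire from just above the ship
        bullet.active = True
    if bullet.active:
        bullet.y -= 8
        if bullet.y < 0:        # off the top of the screen
            bullet.active = False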

The aliens are arranged in rows and move left and right across the screen together. We’ll stick to just one type of alien for this sample, but draw two rows of them; you could add extra types and any number of rows. When we create the alien Actors, we can also add a status flag, and we need to record which side of the row each alien is on, because when they break formation, the two sides fly in opposite directions. In this case, there’ll be four aliens on the left of each row and four on the right. Once they’re set up in a list, we can iterate through the list on each update and move them backwards and forwards.

While we’re moving our aliens, we can also check whether they’ve collided with a bullet or the player ship. If the collision is with a bullet, the alien cycles through a few frames of an explosion using the status flag, and then, once its status reaches five, it’s no longer drawn. If the collision is with the player, then the player dies and the game’s over. We can also check a random number to see if the alien should start a bombing run; if so, we set its status to one, which starts calls to the flyAlien() function. This function checks which side the alien’s on and starts changing the alien’s angle, depending on the side. It also alters the x and y coordinates, depending on the angle. We’ve written this section in longhand for clarity, but it could be collapsed down a bit with the use of some multiplier variables for the x coordinates and the angles, as in the sketch below.
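
A condensed sketch of that swooping behaviour, using the multiplier-variable compression just mentioned (illustrative, not Mark’s longhand code; side is -1 for aliens on the left of a row and 1 for those on the right, and status 0 means ‘in formation’):

import math
import random

def flyAlien(alien):
    # Curve away from the formation, then down towards the player.
    alien.angle += 2 * alien.side
    alien.x += math.sin(math.radians(alien.angle)) * 4 * alien.side
    alien.y += math.cos(math.radians(alien.angle)) * 4 + 2

def updateAliens(aliens):
    for alien in aliens:
        if alien.status == 0 and random.randint(0, 500) == 0:
            alien.status = 1      # start a bombing run
        if alien.status == 1:
            flyAlien(alien)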

There we have it: the basics of Galaxian. Can you flesh it out into a full game?

Here’s Mark’s code for a Galaxian-style shooter with attacking groups of aliens. To get it working on your system, you’ll need to install Pygame Zero. And to download the full code and assets, head here.

Get your copy of Wireframe issue 50

You can read more features like this one in Wireframe issue 50, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 50 for free in PDF format.




Source: Raspberry Pi – Recreate Galaxian’s iconic attack patterns | Wireframe #50

Custom PC for free! Reviews, guides, retro tech…

Fancy some extra PC hardware, overclocking, gaming, and modding content in your life this summer? You can get your hands on a FREE PDF download of each new Custom PC magazine issue released from May through September.

You’ll find stuff like in-depth hardware reviews and step-by-step photo guides, as well as hard-hitting tech opinion, game reviews, and all manner of computer hobbyism goodness.

Our favourite regular feature is Retro Tech, so we’re sharing this latest one by Stuart Andrews to get you started on your summer of love with Custom PC.

Catacombs game screengrab
Forgive the blocky pixels and 16-colour palette. In Catacombs 3-D and Catacombs: Abyss lay the seeds of Wolfenstein and Doom

Pity the poor PC of 1983-1984. It wasn’t the graphics powerhouse we know today. IBM’s machines and their clones might have been the talk of the business world, but they were stuck with text-only displays or low-definition bitmap graphics. The maximum colour graphics resolution was 320 x 200, with colours limited to four from a hard-wired palette of 16. Worse, three of those colours were cyan, brown and magenta, and half of them were just lighter variations of the other half. 

By this point, IBM’s Color Graphics Adaptor (CGA) standard was looking embarrassing. Even home computers such as the Commodore 64 could display 16-colour graphics, and Apple was about to launch the Apple IIc, which could hit 560 x 192 with 16 colours. IBM had introduced the Monochrome Display Adaptor (MDA) standard, but this couldn’t dish out more pixels, only higher-resolution mono text. 

Meanwhile, add-in cards such as the Hercules or Plantronics Colorplus introduced higher resolutions, but did nothing for colour depth. The PC needed more, which IBM delivered with its updated 286 PC/AT system and the Enhanced Graphics Adaptor (EGA).

The new state of the art

The original Enhanced Graphics Adaptor was a hefty optional add-in card for the IBM PC/AT, using the standard 8-bit ISA bus and with support built into the new model’s motherboard. Previous IBM PCs required a ROM upgrade in order to support it.

IBM EGA card
The original IBM EGA card was a whopper, even without the additional daughtercard and memory module kit.

It was massive, measuring over 13in long and containing dozens of specialist large-scale integration (LSI) chips, memory controllers, memory chips, and crystal timers to keep it all running in sync. It came with 64KB of RAM on board, but could be upgraded to as much as 192KB through a Graphics Memory Expansion Card and an additional Memory Module Kit. Crucially, these first EGA cards were designed to work with IBM’s 5154 Enhanced Color Display Monitor, while still being compatible with existing CGA and MDA displays. IBM managed this by using the same 9-pin D-Sub connector, and by fitting four DIP switches to the back of the card to select your monitor type.

EGA was a significant upgrade from low-res, four-colour CGA. With EGA, you could go up to 640 x 200 or even (gasp) 640 x 350. You could have 16 colours on the screen at once from a palette of 64. Where once even owners of 8-bit home computers would have laughed at the PC’s graphics capabilities, EGA and the 286 processor put the PC/AT back in the game.

Birth of an industry

However, EGA had one big problem; it was prohibitively expensive, even in an era when PCs were already astronomically expensive. The basic card cost over $500 US, and the Memory Expansion Card a further $199. Go for the full 192KB of RAM and you were looking at a total of nearly $1,000 (approximately £2,600 inc VAT in today’s money), making the EGA card the RTX 3090 of its day, and only slightly more readily available. What’s more, the monitor you needed to make the most of it cost a further $850 US. EGA was a rich enthusiast’s toy.

ATI EGA Wonder
Using Chips and Technologies’ EGA chipset, early graphics card manufacturers such as ATi could produce smaller, cheaper boards

However, while the initial card was big and hideously complex, the basic design and all the tricky I/O stuff were relatively easy to work out. Within a year, a smaller company, Chips and Technologies of Milpitas, California, had designed an EGA-compatible graphics chipset. It consolidated and shrank IBM’s extensive line-up of chips into a smaller number, which could fit on a smaller, cheaper board. The first C&T chipset launched in September 1985, and within a further two months, half a dozen companies had introduced EGA-compatible cards.

Other chip manufacturers developed their own clone chipsets and add-in cards too, and by 1986, over two dozen manufacturers were selling EGA clone cards, claiming over 40 per cent of the early graphics add-in card market. One, Array Technology Inc, would become better known as ATI, and was later swallowed up by AMD. If you’re on the red team in the ongoing GPU war, that story starts here.

Changing games

EGA also had a profound impact on PC gaming. Of course, there were PC games before EGA, but many were text-based or built to work around the severe limitations of CGA. With EGA, there was scope to create striking and even beautiful PC games.

Colonel's Bequest
The Colonel’s Bequest is a character-driven graphic adventure game by Sierra On-Line

This didn’t happen overnight. The cost of 286 PCs, EGA cards and monitors meant that it was 1987 before EGA support became common, and 1990 before it hit its stride. Yet EGA helped to spur on the rise and development of the PC RPG, including the legendary SSI ‘Gold Box’ series of Advanced Dungeons and Dragons titles, Wizardry VI: Bane of the Cosmic Forge, Might and Magic II and Ultima II to Ultima V. 

It also powered a new wave of better-looking graphical adventures, such as Roberta Williams’ King’s Quest II and III, plus The Colonel’s Bequest. EGA helped LucasArts to bring us pioneering point-and-click classics such as Maniac Mansion and Loom in 16 colours. And while most games stuck to a 320 x 200 resolution, some, such as SimCity, would make the most of the higher 640 x 350 option.

What’s more, EGA made real action games on the PC a realistic proposition. The likes of the Commander Keen games proved the PC could run scrolling 2D platformers properly. You could port over Apple II games such as Prince of Persia, and they wouldn’t be a hideous, four-colour mess. 

Wizardry VI
Wizardry VI: Bane of the Cosmic Forge is the 6th title in the Wizardry series of role-playing video games. It was the first in the trilogy surrounding the Dark Savant, which was followed by Wizardry VII: Crusaders of the Dark Savant and Wizardry 8

And when the coder behind Commander Keen – a certain John Carmack – started work on a new 3D sequel to the Catacomb series of dungeon crawlers, he created something genuinely transformative. Catacomb 3-D and Catacomb: Abyss gave Carmack his first crack at a texture-mapped 3D engine, and arguably started the FPS genre. 

Sure, EGA had its limitations – looking back, there’s an awful lot of green and purple – but with care and creativity, an artist could do a lot with 16 colours and begin creating more immersive game worlds.

A slow decline

EGA’s time at the top of the graphics tech tree was short. Home computers kept evolving, and in 1985, Commodore launched the Amiga, supporting 64 colours in games and up to 4,096 in its special HAM mode. Even as it launched EGA, IBM was talking about a new, high-end board, the Professional Graphics Controller (PGC), which could run screens at 640 x 480 with 256 colours from a total of 4,096. 

PGC was priced high and aimed at the professional CAD market, but it helped to pave the way for the later VGA standard, introduced with the IBM PS/2 in 1987. VGA supported the same maximum resolution and up to 256 colours at 320 x 200. This turned out to be exactly what was needed for a new generation of operating systems, applications and PC games.

What extended EGA’s lifespan was the fact that VGA remained expensive until the early 1990s, while EGA had developed a reasonable install base. Even once VGA hit the mainstream, many games remained playable in slightly gruesome 16-colour EGA. Much like the 286 processor and the Ad-Lib sound card, EGA came before the golden age of PC gaming, but this standard paved the way for the good stuff that came next.

The retro tech inspiration behind Raspberry Pi 400

Did you know that Raspberry Pi 400 was inspired by retro tech?

Eben Upton explains:

“Raspberry Pi has always been a PC company. Inspired by the home computers of the 1980s, our mission is to put affordable, high-performance, programmable computers into the hands of people all over the world. And inspired by these classic PCs, here is Raspberry Pi 400: a complete personal computer, built into a compact keyboard… Classic home computers – BBC Micros, ZX Spectrums, Commodore Amigas, and the rest – integrated the motherboard directly into the keyboard. No separate system unit and case; no keyboard cable. Just a computer, a power supply, a monitor cable, and (sometimes) a mouse.”

Download Custom PC for free

Yes, you did read that right. Not only have we unleashed an awesome new issue of Custom PC into the wild, but you can even download a PDF of it for the bargain price of absolutely nothing. In fact, you’ll be able to download every issue of Custom PC over the summer for free.

Custom PC Summer special advert




Source: Raspberry Pi – Custom PC for free! Reviews, guides, retro tech…

Stop snoring with Raspberry Pi

How many of you have woken up grumpy from being snored at all night? Or maybe you’re the snorer whose sleep is interrupted by being elbowed in the ribs to get you to stop. Not only does snoring keep your partner awake, it also affects the quality of your own sleep, even though you might not realise it.

A demonstration of the Staleys’ project in action

Bryan and Brayden Staley think they’ve come up with a solution: a wearable hearing support device and a Raspberry Pi work together to send the wearer a haptic signal when they start snoring, which soothes them and disrupts the cycle.

Wristwear stops you snoring

The wearable device that this project hinges on is the Neosensory Buzz. Worn on the wrist, it helps people with hearing difficulties pick up on things like doorbells, alarms, and even their name being called.

neosensory buzz bracelet
The Buzz bracelet is a pricey but neat bit of kit

Working alongside the Buzz bracelet is a sound inference base, which consists of a Raspberry Pi 4 Model B and a Seeed ReSpeaker. The sound inference base picks up and classifies audio, and specifically recognises snoring. Once it detects a certain number of snoring events, it sends a sinusoidal signal to the Buzz bracelet, and continues until the audio level falls below the snoring threshold.
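
As a rough Python sketch of that control loop (illustrative only: classify_audio() and buzz_vibrate() are hypothetical stand-ins for the YAMNet classifier and the Neosensory SDK call, and the thresholds are invented):

import math
import time

SNORE_EVENTS_NEEDED = 3   # consecutive snore classifications before buzzing

def monitor(classify_audio, buzz_vibrate):
    snore_events = 0
    while True:
        # classify_audio() returns the top audio class for the last second
        snore_events = snore_events + 1 if classify_audio() == "Snoring" else 0
        if snore_events >= SNORE_EVENTS_NEEDED:
            # Send a sinusoidal intensity pattern (0.0 to 1.0) to the bracelet.
            for step in range(40):
                buzz_vibrate((math.sin(step * math.pi / 10) + 1) / 2)
                time.sleep(0.05)
        time.sleep(1.0)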

Hardware

GitHub repos

  • ss-app (provides the utilities used to build up a Raspberry Pi from scratch to perform audio classification)
  • neosensory-python-sdk (a Python package for interacting with Neosensory products)
  • YAMNet (a pretrained deep net that predicts audio event classes)
smart snoring device software and hardware table
The major components that make up the SS Buzz architecture

Does it actually stop snoring?

Snoring was down by 56% on the nights this project was tested, even though it’s still in the development stage. We like those figures!

Special shout out to developer Brayden, who is just 13 years old. This is his second auditory project, according to his Hackster profile.




Source: Raspberry Pi – Stop snoring with Raspberry Pi

Five takes on Raspberry Pi 400

Raspberry Pi 400 featured on Channel 5’s Gadget Show on Friday. We love being on the telly, and it reminded us to remind you that our smart, portable PC is just the best. Here are five different takes on our complete personal computer, built into a compact keyboard.

Classic retro gaming, new retro-style gaming

Since 2004 The Gadget Show has been sharing the latest gadget reviews and tech innovations, and now it’s Raspberry Pi 400’s turn in the spotlight. Jordan Erica Webber took it for an emulator gaming spin, and enjoyed some classic ROMs and some new ones.

Ooh, what’s that Jordan Erica Webber is playing?

Her verdict: this is a great way to go about retro gaming.

Fresh out of the box

Join Caroline’s YouTube subscribers

One of the best things about launching a new product is seeing all the unboxing videos our community makes. And this one by Caroline Dunn was one of our favourites from Raspberry Pi 400 launch day. Caroline liked that this is our most powerful and easy-to-use Raspberry Pi computer yet. In her video she walks you through how simple it is to set up, even for complete beginners.

Raspberry Pi 400 goes cyberpunk

Check out Zack’s YouTube channel

The latest cool project we saw was Zack Freedman’s cyberdeck. The Voidstar Data Blaster is a portable cyberdeck that Zack created to help him stay punk in the suburbs. It’s built entirely around Raspberry Pi 400 and it features lots of cool cyberpunk additions like a wearable display and chunky handles.

Amiga classics

Dan Wood is a self-confessed Raspberry Pi addict who also loves retro gaming. So it’s no surprise that he took to his YouTube channel to show you how to play Amiga games on the Raspberry Pi 400. Dan liked the retro-inspired design of our all-in-one machine, and took a trip down memory lane to turn it into the ultimate Commodore Amiga emulator.

Working and learning from home with Raspberry Pi 400

So neat, so unlike my desk

Lots of people use our portable computer to work remotely or access education from home, so we rounded up an FAQ-style bunch of tips and tricks to help you get the most out of it. We cover everything from printing to video conferencing and Google Classroom here.

Oh – and if you’re still wondering what Jordan Erica is playing up there, it’s Morphcat’s Böbl, and you can grab it, and other new-retro ROMs, from itch.io.




Source: Raspberry Pi – Five takes on Raspberry Pi 400

Meet Estefannie Explains it All

Recently listed as one of Instagram’s Top 7 Women in STEM, software engineer and content creator Estefannie talks to Alex Bate about electronics, her online community, and why she can’t stop giving away free tech in her Instagram Live streams.

estefannie at her desk in colourful hoodie
Coming from a software background, Estefannie had to learn electronics

Based in Texas, Mexican-born Estefannie graduated summa cum laude from the University of Houston with a degree in computer science and a passion for helping people discover computing.

Some years later, with an established career as a software engineer under her belt, Estefannie is best-known for her YouTube and Instagram accounts, Estefannie Explains It All, and can often be found with a soldering iron in one hand, a rescue cat in the other, all while sporting the most fabulous pair of circuit board Louboutin heels and laser-cut lightning bolt earrings. Yes, it’s fair to say that we all want to be Estefannie. But how did she get here?

Estefannie wearing her Louboutin heels
Rocking her circuit board Louboutin heels and laser-cut lightning bolt earrings

Alex  You originally made videos on your channel four years ago to make sure that you’d retained the information that you were learning at the time?

Estefannie  Mm-hmm, that’s right.

A  But why did you decide to move away from the early explainers and start making other types of content, such as your Daft Punk helmet, and running weekly live streams and giveaways? Because I’m assuming that when you were making those early Estefannie Explains It All videos, you didn’t plan on becoming an influencer?

E  No. The influencer part? Oh, no. I was studying for an interview with Google and I decided to make explainer videos and put them online because I knew people would correct me if I was wrong. And, if they didn’t, I knew my explanations were correct and I was better prepared for the interview.

The YouTube comments section was the scariest place on earth for me, so that’s why I went for YouTube. Later on, it was close to Halloween, and I was about to have an interview with Microsoft, this time to be a product evangelist. And I knew that IoT, the Internet of Things, was ‘the latest buzzword’, and I already wanted to dabble with that technology. So, I decided I wanted to make an IoT project and put it on my YouTube channel. That way, when the Microsoft interview arrived, I’d also have that video to show.

Halloween happened and I’d made this stupid pumpkin robot thing that wasn’t even IoT, but I put it on YouTube anyway and realised that I’d really liked doing it. I really, really liked it. And that’s when I found out about Simone Giertz and other makers, and this whole world I hadn’t known about. I thought, ‘I really like doing this, so I’m going to keep doing it.’ I didn’t even care about the interview anymore because I had found ‘the thing’, the thing that I wanted to do.

Microsoft actually loved the video and they wanted me to keep doing more of them, but on their platform, and they would own the content, which I didn’t want. So that’s how it transformed from explainers as prep for interviews to wanting to make videos. And the influencer thing happened a little bit differently. It’s a bit more Instagram-my.

A  It’s more personal. You’re creating a brand.

E  A brand, yes, I think that’s the key. So the Instagram thing happened for two reasons. The first one was that, before YouTube, I was going to start a business making little video games and mobile apps. And I decided to make it an ‘umbrella’ business so that anything I made could go under there. Because I thought [she laughs], ‘they’re going to go viral and so I need to be prepared legally.’

And while I was doing all of the business stuff, I realised I also need to learn how to do social media, because I need to promote these video games. So I took the time to understand Instagram, follow the people that I thought were interesting or would be doing the same stuff as me. I started out with my personal account as a test and, again, I really liked it. I started seeing people follow me because they were interested in the lifestyle of a software engineer. And I thought it was cool because I would have liked to see how software engineering was as a career before going for it. It was like a window to that world.

A  Do you think there’s been a change, though, because your brand was that you were a software engineer? And now you’re not in the same job. You’re a full-time creator now. Do you think that’s affected who follows you and how people interact with you?

E  I was very afraid of that when I quit my job. I tried to not talk about it at first. But it didn’t really matter because the people who have followed along, they’ve seen all the changes. And when I quit my job, they congratulated me because I was now able to do this full-time. So it was like the opposite. They were following ‘The Estefannie Experience’, ha ha. For a lot of them, it was like, ‘Oh, that’s another cool path that you can take as an engineer.’

Estefannie with her cats
Cats can provide emotional support while debugging

A  What was it like to make the leap from software, something you can control totally, to hardware, an area where things can go wrong all the time?

E  Oh, well, software can go wrong all the time, too. When I did that first Halloween pumpkin video, I think that really sparked a new interest in me of like, ‘Oh, I should have studied electrical engineering or computer engineering’. Because I am really passionate about the hardware aspect of it. I’d studied a low-level class as part of my computer science degree about gates and how they work. I remember having to draw them out.

And I really liked that class and understanding how electricity goes through those gates. But it didn’t matter because I was there to learn how to do the programming part. With electronics, it was so fun to go back and actually try it, and I was hurting myself, shocking myself, burning myself. It was great; I love it. It was like I was putting everything in my imagination into real, physical things. And I think that helps me. I like seeing things or touching things.

A  You’re a big advocate for celebrating failure and learning from failure. You’ve done talks about it at Coolest Projects and Maker Faire, and you talk about it in your videos. In the earthquake simulator you built for Becky Stern, you showed the first way of making it and how it didn’t work, before showing the final project. Do you think it’s important to share failures on YouTube, instead of editing a perfect project build?

E  I think so. Yes. It comes from a place within me where, when I wasn’t good at something when I tried it for the first time – I’m a nineties kid, I don’t know if this is anything to do with it – but you try, and you fail, and you just assumed ‘OK, I’m not good at it.’ I’m not supposed to be playing piano, or whatever. That’s how I grew up thinking. And so, when I became an actual engineer, and I say ‘engineer’ because studying computer science is one thing, but to become an engineer is something completely different.

And when I actually became an engineer, that’s when it hit me that you have to really just go for it, stop thinking, stop planning, stop analysing, and just do it and see what happens, and learn from that. So that was a great lesson in life for me, and I want to show people like me that I make mistakes all the time and that I struggle sometimes, or that it takes several steps; it takes several tries to get somewhere. And so I want to show it for those people who feel maybe like they can’t do something because they didn’t do it the first time. I want to show them the human side of engineering.

Estefannie's desk
That’s one sweet studio setup

A  That’s cool. I liked when you were making the visor for your Daft Punk helmet and it was just a series of Instagram Live videos of you unsuccessfully melting plastic in your oven as you tried to learn how to vacuum-form.

E  The plastic melting was so fun, and I learned a lot. I would never do that again, ha ha.

A  Of all the projects you’ve made and shared, what has been the thing that you’ve been the proudest of because you managed to overcome an issue?

E  I think with most of my projects, I’ve had to overcome something. Except with the Jurassic Park Goggles. Although it was a pain to do, I already knew what I was doing, and that was because of the Daft Punk helmet. I struggled so much with that one that I knew exactly what to do with the goggles. I’ve been working on a smart litter box project for my cats, Teddy and Luna. That one required me to do a lot of woodwork and play with tools that I had never played with before. And so those days terrified me. But, I try to push myself with every project, so they’re all scary.

Estefannie's giveaway wheel
Giveaways are ruled by the random wheel of fate, like Boethius’ Wheel but nicer

A  You have projects that you’ve put your blood, sweat, and tears into, that you’ve worked hard on, that you’ve written all the code for. Where do you stand on whether you should give that code away for free? Do you provide it all the time? Do you ever think, ‘no, I’m going to keep this for myself’?

E  Oh, I am a true believer in open source. My plan is to continue to give it all away and put it on my website. This morning, I was finishing up a blog post I’m writing about the Daft Punk helmet. A step-by-step on how to do it, because I know people watch the video, but they might not be able to follow it to make their own. So now I’m going ‘here, here’s what I use’. And all those links in the post, Home Depot, etc., all the links I’m using, they’re not even affiliate links. I’m making zero dollars out of that post I’ve been working on.

I know lots of the people who want to recreate my projects are kids, and they have no money. This is the type of education I wish I had had when I was younger. If I had known about this stuff, I would have started when I was very young. So, I can’t charge them. I feel, if they have to buy electronics, there’s no way I can charge extra for the schematic and the code. I cannot do that. It’s about being very conscious of who my audience is. I don’t want to stop them from making it. It’s the opposite. That’s why I do giveaways every week on Instagram Live. I want to give them the boards. I want to give them everything so they can do it. I didn’t have any money growing up, and I know the feeling.

I respect people who want to charge for it. I understand. But I’m not in that boat. Even the smart litter box that I’m currently working on, someone who I respect very much said, ‘oh, that’s a great idea, why don’t you patent it and manufacture it? There’s a market for it.’ And I know there’s a market for it, but that’s not the point. The point is to show that you can do it. Anything that’s in your imagination, you can build it, you can do it, and here are the steps. Yeah, I want more money, but I think I can get there in different ways, through YouTube ads and sponsorships.

Estefannie with soldering iron at her desk
Soldering makes us this happy too

A  There are a million different ways to make an LED blink, and none of them is the wrong way, they’re just the comfortable way you find to do it. Do you get backlash when you release your code from people saying, ‘Well, you should have done it this way’?

E  I have never received backlash on code and, in fact, I would encourage people not to be scared to publish their code. I know people who say they want to open-source their code but they have to ‘clean it up first’, and they’re scared to publish it. But the whole point of open source is that you put it out there, you know it works, and it’s going to be OK. And it gets better because people will contribute. I’m never afraid of showing code.

A  Do you think, when you talk about financial accessibility, that that’s one of the reasons holding you back from starting a Patreon? That you’d be putting a financial wall up against people who can’t afford it.

E  One hundred percent. I don’t want to add to people’s financial strain. In fact, I am starting my new cryptocurrency so that I can send tokens to people around the world and, kinda like arcade tickets, they can spend them on things.

A  How does that work? How can I spend your cryptocurrency?

E  OK, so it has zero monetary value. The idea is that instead of giving out imaginary internet points to people in my live streams, they get actual internet points. And they can exchange them back to me for real items. I’ll have a menu of tech – so many points gets you a Pico, or a Raspberry Pi 400, or some other board – and people exchange their internet points for prizes. It helps me see how active someone has been in the live streams so I can say yes, it’s worth the $200 to ship this item to someone in India.

A  Ah, I get it. It’s like house points in school.

E  This is why it takes me so long to release a video because I’m like, let me do the cryptocurrency and then also that live stream, and then also this video about so and so. I just want to have a voice.

Close up of Estefannie's desk
Nice breadboard

A  How do you decide what content to make? Is it just about creating content you think your audience will like? Or more about content you think is important for people to know?

E  I think I’ve always made videos that I felt were important, but I was always trying to, y’know, ‘play the algorithm’. And that was happening while I was still working and trying to quit my job so, of course, that was a period of my YouTube career where I was trying as much as I could to get views and hop on trends. Not the trends that were just ‘trends’, but trends by people I liked. Back then, I was a big fan of a YouTube baker, so I did a project using her stuff in the hopes she would see it. But I’m not really like that any more. If I see a channel I really like, I’ll try and do a collab, but not just because it would be beneficial for my channel. None of that any more. Just stuff I like.

One piece of advice that a lot of YouTubers have told me – that I’ve decided not to follow – is that you have to stick to one thing so that the audience knows what to expect. The same with Instagram. But I disagree, and I’ve gained more followers by being myself more. I’m Estefannie who also really, really likes crazy fashion. I like make-up and weird earrings, and why should I have to tone that down? Because I’m an engineer? I only post things that I would like. It’s not always me soldering. It’s not always code.

A  You create the content you want to see, not the content you think people want to see.

E  Yes. It would be easy to play that game, but that’s not what I want to do.

A  A lot of content creators would create a separate Instagram account or YouTube channel for their other passion, but all that’s doing is showing that it has two different audiences. I think, especially when you are a woman in tech, if you then separate out the other things that you like, it’s almost like you’re saying, ‘Oh, well, these are two separate things that can’t exist together.’

E  Exactly. You’re saying, ‘I go to work. And I’m a scientist, and I look like this. But then I go home, and I look like this’. And it’s not true. There are some creators who have a million YouTube channels, and I don’t understand why because people really like them for who they are. But it’s following the example of how, if you want to do vlogging, you have to have a separate channel, and I don’t think you necessarily have to.

A  You are the brand, and people subscribe to you. You love fashion, and I couldn’t see you doing a ‘come shopping with me down Melrose Place’ video because that’s not who you are, but I could totally see you trying to make your own lipstick.

E  Oh, yeah. Oh, yeah.

A  You would make that video and your audience would love it because it’s you, and you’re doing something you’re passionate about.

E  Yeah, I mean, it’s like, the best example for me is Colin Furze. He is who he is. He wears his tie, he’s great. That’s very transparent. That’s him. There’s a maker who influenced the way I dressed for a bit, and I see it on all the other maker women in how they dress. And I didn’t even like those clothes. And when I noticed, and I stopped myself, and I was like, ‘this is not the Estefannie Experience’. It’s the other person experience, and I don’t need to replicate that because that’s not me. And if I want to wear my giant heels, I’ll wear my heels. You have to be yourself.

If people want to be creators, it’s OK to be yourself. And if you’re the only one and you don’t have a team like other creators, it’s OK to take your time and not do it for the algorithm. That’s my advice. You don’t have to post every week. I mean, you can, but don’t kill yourself. It’s a one-woman show over here. I do my taxes, I do the website, I do the videos. That’s the advice I want to give here. That’s what I want people to take from this interview.

Subscribe to Estefannie on YouTube, and follow her on Instagram. And make sure to take part in her weekly live streams for a chance to win some exclusive Estefannie Internet Points.

Issue 42 of HackSpace magazine is on sale NOW!

Alex spoke to Estefannie for the latest issue of HackSpace magazine. Each month, HackSpace brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents. As always, every issue is free to download from the HackSpace magazine website.

hackspace issue 42 cover




Source: Raspberry Pi – Meet Estefannie Explains it All

Star Wars Arcade Cabinet | The MagPi #105

Why pay over the odds when you can build an accurate replica, and have fun doing it? For the latest issue of The MagPi Magazine, Rob Zwetsloot switches off his targeting computer to have a look.

header of the arcade cabinet bearing a Star Wars logo
Art had to be rescaled, but it’s been done faithfully

Getting the arcade machine of your dreams gets a little harder every day, especially the older they are. Making one, however, is always possible if you have the right skills and a Raspberry Pi.

“My project was to build a replica, or as close as I could reasonably manage, of the Atari Star Wars arcade cabinet,” James Milroy tells us. “I really wanted to build a cockpit as that’s what I played on in the eighties, but sadly I didn’t have the room to house it, so the compromise was to build a stand-up cabinet instead.”

The workings were simple when it came down to it: Raspberry Pi 3B+ with Pimoroni Picade X HAT. This gives us a power switch, audio amp, buttons, and a joystick if necessary. The replica yoke is interfaced with a USB adapter from the same company.

Even then, the standard cabinet has a lot of detail, and James really nailed the look of it. Why build it from scratch, though? “Initially, I had toyed with sourcing an original cabinet and restoring it, but soon gave up on that idea after finding it nigh on impossible to source a cabinet here in the UK,” James explains. “Almost all cabinets for sale were located in the USA, so they were out of the question due to the high cost of shipping. Atari only made just over 12,500 cabinets worldwide, so their rarity meant that they commanded top dollar, effectively putting them out of my price range. It was at this point that I decided that if it was going to happen, then I would have to make it myself.”

star wars arcade cabinet full length shot

Making a cabinet is hard enough, but the control system would have to be an original Atari yoke. “The Atari yoke is considered the ‘holy grail’ of controllers and, again, is very hard to find,” James says. “My prayers were answered in October 2018 when a thread on a forum I was subscribed to popped up with a small Utah-based startup aiming to supply replica yokes at a realistic price to the arcade community. I grabbed two of these (one for my friend) and the project was on.”

Good feeling

When it came to actually emulating the game, for James there was only one choice: “My decision to go with a Raspberry Pi was a no-brainer really. I had previously made a bartop cabinet using a Raspberry Pi 3 and RetroPie/EmulationStation which I was really pleased with. So I had a platform that I already had experience with and knew was more than capable of emulating the one game I needed to run. Besides, the simplicity and low cost of the ecosystem for Raspberry Pi far outweighs the extra expense and effort required going down the PC route.”

The riser was a custom build by James that emulates lights from the film

With a custom build and emulation, authenticity of the gameplay experience could be a bit off. However, that’s not the case here. “I think that it plays just like the real arcade machine mainly due to the inclusion of the replica yoke controller, and adding your credit by pressing the button on the coin door,” says James. “Ideally a vector monitor or a CRT would go a long way to making it look just like the original, but a reasonable representation is possible on an LCD using shaders and anti-aliasing. Gameplay does seem to get really hard really quick, though; this could be due to an imperfect emulation, but is more likely due to my reactions having dulled somewhat in the last 38 years!”

Always in motion

While the current build is amazing as it is, James does have some ideas to improve it. “Overall, I’m really pleased with the way the cabinet has worked out,” he says. “I will be replacing Raspberry Pi 3B+ with a Raspberry Pi 4 to enable me to run a newer version of MAME, which will hopefully offer a better emulation, sort some audio glitching I get with my current setup, and hopefully enable some graphical effects (such as bloom and glow) to make it look more like it’s running on a CRT.”

Get your copy of The MagPi #105 now!

You can grab the brand-new issue right now online from the Raspberry Pi Press store, or via our app on Android or iOS. You can also pick it up from supermarkets and newsagents, but make sure you do so safely while following all your local guidelines. There’s also a free PDF you can download.




Source: Raspberry Pi – Star Wars Arcade Cabinet | The MagPi #105

Meet SeedGerm: a Raspberry Pi-based platform for automated seed imaging

Researchers at the John Innes Centre for plant and microbial science were looking for a cost-effective phenotyping platform for automated seed imaging. They figured that machine learning-driven image analysis was the quickest way to deliver this essential yet challenging aspect of agricultural research. Sounds complicated, but they found that our tiny computers could handle it all.

Two types of SeedGerm hardware with wired and wireless connectivity used for acquiring seed germination image series for different crop species

What is phenotyping?

A phenotype is an organism’s observable characteristics, like growing towards the light, or having a stripy tail, or being one of those people who can make their tongue roll up. An organism’s phenotype is the result of the genetic characteristics it has – its genotype – and the environment in which it lives. For example, a plant’s genotype might mean it can grow quickly and become tall, but if its environment lacks water, it’s likely to have a slow-growing and short phenotype.

Phenotyping means finding out and recording particular aspects of an organism’s phenotype: for example, how fast seeds germinate, or how broad a plant’s leaves are.

Why do seeds need phenotyping?

Phenotyping allows us to guess at a seed’s genotype, based on things we can observe about the seed’s phenotype, such as its size and shape.

We can study which seed phenotypes appear to be linked to desirable crop phenotypes, such as a high germination rate, or the ability to survive in dry conditions; in other words, we can make predictions about which seeds are likely to grow into good crops. And if we have controlled the environment in which we’re doing this research, we can be reasonably confident that these “good” seed phenotypes are mostly due not to variation in environmental conditions, but to properties of the seeds themselves: their genotype.

Close up of seed germ set up 1
A close up of the incubators, each with Raspberry Pi computers on top, running the show

Growers need seeds that germinate effectively and uniformly to maximise crop productivity, so seed suppliers are interested in making sure their samples meet a certain germination rate.

The phenotypic traits that are used to work out whether seeds are likely to be good for growers are listed in the full research paper. But in general, researchers are looking for things like width, length, roundness, and contour lines in seeds.

How does Raspberry Pi help?

Gathering observations for phenotyping is a difficult and time-consuming process, and in order to capture high-quality seed images continuously, the team needed to design two types of hardware apparatus. Raspberry Pi computers (Raspberry Pi 2 Model B or Raspberry Pi 3 Model B+) power both SeedGerm hardware designs, with a Raspberry Pi camera also providing image data in the lower-cost design.
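
For a flavour of the acquisition side, here is a minimal time-lapse capture sketch of the kind a Raspberry Pi camera rig like this might run (illustrative, not the SeedGerm code; assumes the legacy picamera library):

from time import sleep
from picamera import PiCamera

camera = PiCamera(resolution=(1920, 1080))
sleep(2)  # give the sensor a moment to settle its exposure

# Capture one frame every 15 minutes for 24 hours.
for frame in range(96):
    camera.capture("seeds_{:03d}.jpg".format(frame))
    sleep(15 * 60)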

seed genotyping at a computer
The open source software at work next to one of the mini seed incubators

The brilliant team behind this project recognised the limitations of current seed imaging approaches, and looked to explore how automating the analysis of seed germination could scale up their work in an affordable way. The SeedGerm system benefits from the cost-effectiveness of Raspberry Pi hardware and the open source software the team chose, and that makes us super happy.

Read the whole research paper, published in New Phytologist, here.

Raspberry Pi in biological sciences

Dr Jolle Jolles, a behavioural ecologist at the Center for Ecological Research and Forestry Applications (CREAF) near Barcelona, Spain, and a passionate Raspberry Pi user, has recently published a detailed review of the uptake of Raspberry Pi in biological sciences. He found that well over a hundred published studies have made use of Raspberry Pi hardware in some way.




Source: Raspberry Pi – Meet SeedGerm: a Raspberry Pi-based platform for automated seed imaging

How to add LoRaWAN to Raspberry Pi Pico

Arguably the winner of the standards war among wide-area networking protocols for the Internet of Things, LoRaWAN is a low-power, low-bandwidth, long-range protocol intended to connect battery-powered remote sensors back to the internet via a gateway. On a good day, with a reasonable antenna, you might well get 15km of range from an off-the-shelf LoRa radio. The downside is that the available bandwidth is measured in bytes, not megabytes or even kilobytes.

An Adafruit RFM95W LoRa Radio breakout connected to a Raspberry Pi Pico

Support for LoRa connectivity for Raspberry Pi Pico was put together by Sandeep Mistry, the author of the Arduino LoRa library, who more recently also gave us Ethernet support for Pico. His library adds LoRa support for Pico and other RP2040-based boards using the Semtech SX1276 radio module. That means that breakouts like Adafruit’s RFM95W board, as well as their LoRa FeatherWing, are fully supported.

LoRaWAN coverage?

To make use of a LoRaWAN-enabled Pico you’re going to need to be in range of a LoRa gateway. Fortunately there is The Things Network, an open-source community LoRaWAN network with global coverage.

About The Things Network

Depending on where you are located, it’s quite possible that you’re already in coverage. However, if you aren’t, then you needn’t worry too much.

A map of The Things Network gateways in the United Kingdom

The days when the cost of a LoRaWAN base station was of the order of several thousand dollars are long gone. You can now pick up a LoRa gateway for around £75. Instead of buying one, I actually built my own gateway a couple of years ago. Unsurprisingly, perhaps, it was based around a Raspberry Pi.

Getting the source

If you already have the Raspberry Pi Pico toolchain set up and working, make sure your pico-sdk checkout is up to date, including submodules. If not, you should first set up the C/C++ SDK; afterwards, grab the project from GitHub.

$ git clone --recurse-submodules https://github.com/sandeepmistry/pico-lorawan.git
$ cd pico-lorawan

Make sure you have your PICO_SDK_PATH set before proceeding. For instance, if you’re building things on a Raspberry Pi and you’ve run the pico_setup.sh script, or followed the instructions in our Getting Started guide, you’d point PICO_SDK_PATH to

$ export PICO_SDK_PATH=/home/pi/pico/pico-sdk

Afterwards, you are ready to build both the library and the example applications. But before we do that, we need to do two other things: configure the cloud infrastructure where our data is going to go, and wire up our LoRa radio board to our Raspberry Pi Pico.

Set up an application

The Things Network is currently migrating from the V2 to V3 stack. Since my home gateway was set up a couple of years ago, I’m still using the V2 software and haven’t migrated yet. I’m therefore going to build a V2-style application. However, if you’re using a public gateway, or building your own gateway, you probably should build a V3-style application. The instructions are similar, and you should be able to make your way through based on what’s written below. Just be aware that there is a separate Network Console for the new V3 stack and things might look a little different.

Migration from TTN V2 to V3

While any LoRa device in range of your new gateway will have its packets received and sent upstream to The Things Network, the data packets will be dropped on the ground unless they have somewhere to go. In other words, The Things Network needs to know where to route the packets your gateway is receiving.

In order to give it this information, we first need to create an application inside The Things Network Console. To do this, all you need to do is type in a unique Application ID string — this can be anything — and the console will generate an Application EUI and a default Access Key, which we’ll use to register our devices to our application.

Adding an application

Once we’ve registered an application, all we have to do then is register our individual device — or later perhaps many devices — to that application, so that the backend knows where to route packets from that device.

Registering a device

Registering our device can be done from the application’s page in the console.

Registering a device to an application

The Device ID is a human-readable string to identify our remote device. Since the RFM95W breakout board from Adafruit ships with a sticker in the same bag as the radio, with a unique identifier written on it, we can use that identifier to build a string that uniquely identifies our Pico board, ending up with something like pico-xy-xy-xy-xy-xy-xy as our Device ID.

We’ll also need to generate a Device EUI. This is a 64-bit unique identifier. Here again we can use the unique identifier from the sticker, except this time we can just pad it with two leading zero bytes, 0000XYXYXYXYXYXY, to generate our Device EUI. You could also use pico_get_unique_board_id() to generate the Device EUI.
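
As a trivial illustration of that padding (the sticker value here is hypothetical):

sticker_id = "A1B2C3D4E5F6"              # 48-bit unique ID printed on the sticker
device_eui = sticker_id.rjust(16, "0")   # "0000A1B2C3D4E5F6": a 64-bit Device EUI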

If you take a look at your Device page after registration, you’ll find the Application EUI and Application Key you need to let your board talk to the LoRa network, or more precisely to let the network correctly route packets from your board to your application.

Make a note of your Device EUI, Application EUI, and Application Key.

Wiring things up on a breadboard

Now we’ve got our cloud backend set up, the next thing we need to do is connect our Pico to the LoRa breakout board. Unfortunately the RFM95W breakout isn’t really that breadboard-friendly. At least it’s not breadboard-friendly if you need access to the radio’s pins on both sides of the board like we do for this project — in this case the breakout is just a little bit too wide for a standard breadboard.

Connecting a Raspberry Pi Pico to an Adafruit RFM95W LoRa Radio breakout

Fortunately it’s not really that much of a problem, but you will probably need to grab a bunch of male-to-female jumper wires along with your breadboard. Go ahead and wire up the RFM95W module to your Raspberry Pi Pico. The mapping between the pins on the breakout board and your Pico should be as follows:

Pico         RP2040            SX1276 Module   RFM95W Breakout
3V3 (OUT)    —                 VCC             VIN
GND          GND               GND             GND
Pin 10       GP7               DIO0            G0
Pin 11       GP8               NSS             CS
Pin 12       GP9               RESET           RST
Pin 14       GP10              DIO1            G1
Pin 21       GP16 (SPI0 RX)    MISO            MISO
Pin 24       GP18 (SPI0 SCK)   SCK             SCK
Pin 25       GP19 (SPI0 TX)    MOSI            MOSI
Mapping between physical pins, RP2040 pins, SX1276 module, and RFM95W breakout

These pins are the library default and can be changed in software.

Building and deploying software

Now we have our backend in the cloud set up, and we’ve physically “built” our radio, we can build and deploy our LoRaWAN application. One of the example applications provided by the library will read the temperature from the on-chip sensor on the RP2040 microcontroller

void internal_temperature_init() {
    adc_init();
    // ADC input 4 is the RP2040's on-chip temperature sensor
    adc_select_input(4);
    adc_set_temp_sensor_enabled(true);
}

float internal_temperature_get() {
    float adc_voltage = adc_read() * 3.3f / 4096;
    // Conversion from the RP2040 datasheet: the sensor reads 0.706 V
    // at 27 °C, with a slope of -1.721 mV per °C
    float adc_temperature = 27 - (adc_voltage - 0.706f) / 0.001721f;

    return adc_temperature;
}

…and send it periodically to your Things Network application over the LoRaWAN radio. Go ahead and change directory to the otaa_temperature_led example application in your checkout. This example uses OTAA, so we’ll need the Device EUI, Application EUI, and Application Key we created earlier.

$ cd examples/otaa_temperature_led/

Open the config.h file in your favourite editor and change the REGION, DEVICE_EUI, APP_EUI, and APP_KEY to the values shown in the Network Console. The code is expecting the (default) string format, without spaces between the hexadecimal digits, rather than the byte array representation.

#define LORAWAN_REGION          LORAMAC_REGION_EU868
#define LORAWAN_DEVICE_EUI      "Insert your Device EUI"
#define LORAWAN_APP_EUI         "Insert your Application EUI"
#define LORAWAN_APP_KEY         "Insert your App Key"
#define LORAWAN_CHANNEL_MASK    NULL

I’m located in the United Kingdom, with my LoRa radio broadcasting at 868MHz, so I’m going to set my region to LORAMAC_REGION_EU868. If you’re in the United States, you’ll be using 915MHz, so you’ll need to set your region to LORAMAC_REGION_US915.

Then after you’ve edited the config.h file you can go ahead and build the example applications.

$ cd ../..
$ mkdir build
$ cd build
$ cmake ..
$ make

If everything goes well you should have a UF2 file in build/examples/otaa_temperature_led/ called pico_lorawan_otaa_temperature_led.uf2. You can now load this UF2 file onto your Pico in the normal way.

Grab your Raspberry Pi Pico board and a micro USB cable. Plug the cable into your Raspberry Pi or laptop, then press and hold the BOOTSEL button on your Pico while you plug the other end of the micro USB cable into the board. Then release the button after the board is plugged in.

A disk volume called RPI-RP2 should pop up on your desktop. Double-click to open it, and then drag and drop the UF2 file into it. If you’re having problems, see Chapter 4 of our Getting Started guide for more information.
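
If you’d rather copy the file from the command line, it’s a one-liner once the RPI-RP2 volume has mounted. The mount point below is typical for a Linux desktop — adjust the path for your system:

$ cp build/examples/otaa_temperature_led/pico_lorawan_otaa_temperature_led.uf2 /media/$USER/RPI-RP2/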

Your Pico is now running your LoRaWAN application, and if you want, you can view some debugging information by opening a USB serial connection to your Pico. Open a Terminal window and start minicom.

$ minicom -D /dev/ttyACM0

Sending data

However, you’ll need to turn to the Network console to see the real information. You should see an initial join message, followed by a number of frames. Each frame represents a temperature measurement sent by your Pico via LoRaWAN and the Gateway to The Things Network application.

Data coming via LoRaWAN to The Things Network

The payload value is the temperature measured by the Raspberry Pi Pico’s internal temperature sensor, in hexadecimal. It’s a bit outside the scope of this article, but you can now add a decoder and integrations that allow you to decode the data from hexadecimal into human-readable form and then, amongst various other options, save it to a database.

Sending commands

As well as sending temperature data, the example application will also let you toggle the LED on your Raspberry Pi Pico directly from The Things Network console.

Sending data back to your Raspberry Pi Pico via LoRaWAN

Go to the Device page in the Network Console and type “01” into the Downlink Payload box, and hit the “Send” button. Then flip to the Data tab. You should see a “Download scheduled” line, and if you continue to watch you should see the byte downlinked. When that happens the on-board LED on your Raspberry Pi Pico should turn on! Returning to the Network Console and typing “00” into the Payload box will (eventually) turn the Pico’s LED off.

Remember that LoRaWAN is long-range, but low-bandwidth. You shouldn’t expect an instant response to a downlinked command.
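
Under the hood, the example simply polls the LoRaWAN stack for downlinked bytes and copies the first byte to the LED. The sketch below shows the general shape of that receive loop; the lorawan_* calls are from the pico-lorawan library, but treat the exact signatures as assumptions and check the example source in your checkout:

uint8_t receive_buffer[242];  // 242 bytes is the largest LoRaWAN application payload
uint8_t receive_port = 0;

while (true) {
    // Let the LoRaWAN stack run for up to a second, then check for downlinks
    lorawan_process_timeout_ms(1000);

    int receive_length = lorawan_receive(receive_buffer, sizeof(receive_buffer),
                                         &receive_port);
    if (receive_length > -1) {
        // A "01" downlink from the console arrives here as the byte 0x01
        gpio_put(PICO_DEFAULT_LED_PIN, receive_buffer[0] != 0);
    }
}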

Where now?

The OTAA example application is a really nice skeleton for you to build on that will let you take data and send it to the cloud over LoRa, as well as send commands back from the cloud to your LoRa-enabled Pico.

Arm Innovation Coffee – The Things Network

There will be more discussion around The Things Network, plus a live demo of LoRaWAN from a Raspberry Pi Pico, during this week’s Arm Innovation Coffee on Thursday 29 April at 10:00 PDT (18:00 BST).

Wrapping up

Support for developing for Pico can be found on the Raspberry Pi forums. There is also an (unofficial) Discord server where a lot of people active in the community seem to hang out. Feedback on the documentation should be posted as an Issue to the pico-feedback repository on GitHub, or directly to the repository it concerns.

All of the documentation, along with lots of other help and links, can be found on the Getting Started page. If you lose track of where that is in the future, you can always find it from your Pico: to access the page, just press and hold the BOOTSEL button on your Pico, plug it into your laptop or Raspberry Pi, then release the button. Go ahead and open the RPI-RP2 volume, and then click on the INDEX.HTM file.

That will always take you to the Getting Started page.

The post How to add LoRaWAN to Raspberry Pi Pico appeared first on Raspberry Pi.



Source: Raspberry Pi – How to add LoRaWAN to Raspberry Pi Pico

How can we design inclusive and accessible curricula for computer science?

After a brief hiatus over the Easter period, we are excited to be back with our series of online research seminars focused on diversity and inclusion, where in partnership with the Royal Academy of Engineering, we host researchers from the UK and USA. By diversity, we mean any dimension that can be used to differentiate groups and people from one another. This might be, for example, age, gender, socio-economic status, disability, ethnicity, religion, nationality, or sexuality. The aim of inclusion is to embrace all people irrespective of difference.

Maya Israel

This month we welcomed Dr Maya Israel, who heads the Creative Technology Research Lab at the University of Florida. She spoke to us about designing inclusive learning experiences in computer science (CS) that cater for learners with a wide range of educational needs.

Underrepresentation of computer science students with additional needs

Maya introduced her work by explaining that the primary goal of her research is to “increase access to CS education for students with disabilities and others at risk for academic failure”. To illustrate this, she shared some preliminary findings (paper in preparation) from the analysis of data from one US school district.

A computing classroom filled with learners.
By designing activities that support students with additional educational needs, we can improve the understanding and proficiency of all of our students.

Her results showed that only around 22–25% of elementary school students with additional needs (including students with learning disabilities, speech or language impairments, emotional disturbances, or learners on the autistic spectrum) accessed CS classes. Even more worryingly, by high school only 5–7% of students with additional needs accessed CS classes (for students on the autistic spectrum the decline in access was less steep, to around 12%).

Maya made the important point that many educators and school leaders may ascribe this lack of representation to students’ disabilities being a barrier to success, rather than to the design of curricula and instruction methods being a barrier to these students accessing and succeeding in CS education.

What barriers to inclusion are there for students with additional needs?

Maya detailed the systems approach she uses in her work to think about external barriers to inclusion in CS education:

  • At the classroom level — such as teachers’ understanding of learner variability and instructional approaches
  • At the school level — perhaps CS classes clash with additional classes that the learner requires for extra support with other subjects
  • At the systemic level — whether the tools and curricula in use are accessible

As an example, Maya pointed out that many of the programming platforms used in CS education are not fully accessible to all learners; each platform has unique accessibility issues.

Diagram: the work to increase access to CS education for students with disabilities and others at risk of academic failure cannot happen unless we examine barriers to inclusion systematically — classroom barriers sit within school-level barriers, which in turn sit within systemic barriers, and together they result in limited inclusion.

This is not to say that students with additional needs have no internal barriers to succeeding in CS (these may include difficulties with understanding code, debugging, planning, and dealing with frustration). Maya told us about a study in which the researchers used the Collaborative Computing Observation Instrument (C-COI), which allows analysis of video footage recorded during collaborative programming exercises to identify student challenges and strategies. The study found various strategies for debugging and highlighted a particular need for supporting students in transitioning from a trial-and-error approach to more systematic testing. The C-COI has a lot of potential for understanding student-level barriers to learning, and it will also be able to give insight into the external barriers to inclusion.

Pathways to inclusion

Maya’s work focuses not only on identifying the problems with access, but also on developing solutions, which she terms pathways to inclusion. A standard approach to inclusion might involve designing curricula for the ‘average’ learner and then differentiating work for learners with additional needs. What is new and exciting about Maya’s approach is that it is based on the premise that there is no such person as an average learner, and that instead all learners have jagged profiles of strengths and weaknesses that contribute to their level of academic success.

In the seminar, Maya described ways in which CS curricula can be designed to be flexible and take into account the variability of all learners. To do this, she has been using the Universal Design for Learning (UDL) approach, adapting it specifically for CS and testing it in the classroom.

The three core concepts of Universal Design for Learning according to Maya Israel: 1, barriers exist in the learning environment; 2, variability is the norm, meaning learners have jagged learning profiles; 3, the goal is expert learning.

Why is Universal Design for Learning useful?

The UDL approach helps educators anticipate barriers to learning and plan activities to overcome them by focusing on providing different means of engagement, representation, and expression for learners in each lesson. Different types of activities are suggested to address each of these three areas. Maya and her team have adapted the general principles of UDL to a CS-specific context, providing teachers with clear checkpoints to consider when designing computing lessons; you can read more on this in this recent Hello World article.

Two young children code in Scratch on a laptop.

A practical UDL example Maya shared with us was using a series of scaffolded Scratch projects based on the ‘Use-Modify-Create’ approach. Students begin by playing and remixing code; then they try to debug the same program when it is not working; then they reconstruct code that has been deconstructed for the same program; and then finally, they try to expand the program to make the Scratch sprite do something of their choosing. All four Scratch project versions are available at the same time, so students can toggle between them as they learn. This helps them work more independently by reducing cognitive load and providing a range of scaffolded support.

This example illustrates that, by designing activities that support students with additional educational needs, we can improve the understanding and proficiency of all of our students.

Training teachers to support CS students with additional needs

Maya identified three groups of teachers who can benefit from training in either UDL or in supporting students with additional needs in CS:

  1. Special Education teachers who have knowledge of instructional strategies for students with additional needs but little experience/subject knowledge of computing
  2. Computing teachers who have subject knowledge but little experience of Special Education strategies
  3. Teachers who are new to computing and have little experience of Special Education

Maya and her team conducted research with all three of these teacher groups, providing professional development with the aim of understanding which elements of the training were most useful and important for teachers’ confidence and practice in supporting students with additional needs in CS. In this research project, they found that a key aspect of the training for teachers was having time to identify and discuss the barriers and challenges their students face, as well as potential strategies to overcome these. This process is a core element of the UDL approach, and may be very different from the standard method of planning lessons that teachers are used to.

A teacher attending Picademy teacher training laughs as she works through an activity.
Having time to identify and discuss the barriers/challenges students face, as well as potential strategies to overcome these, is key for teachers to design accessible curricula.

Another study by Maya’s team showed that an understanding of UDL in the context of CS was a key predictor of teacher confidence in teaching CS to students with additional needs (along with the number of years spent teaching CS, and general confidence in teaching CS). Maya therefore believes that focusing on teachers’ understanding of the UDL approach and how they can apply it in CS will be the most important part of their future professional development training.

Final thoughts

Maya talked to us about the importance of intersectionality in supporting students who are learning CS, which aligns with a previous seminar given by Jakita O. Thomas. Specifically, Maya identified that UDL should fit into a wider approach of Intersectional Inclusive Computer Science Education, which encompasses UDL, culturally relevant and sustaining pedagogy, and translanguaging pedagogy/multilingual education. We hope to learn more about this topic area in upcoming seminars in our current series.

Four key takeaways from Maya Israel's research seminar: 1, include students with disabilities in K-12 CS education. They will succeed when given accessible, engaging activities. 2, consider goals, anticipated barriers, and the UDL principles when designing instructions for all learners. 3, disaggregate your data to see who is meeting instructional goals and who is not. 4, share successes of students with disabilities in CS education so we can start shifting the discourse to better inclusion.

You can download Maya’s presentation slides now, and we’ll share the video recording of her seminar on the same page soon. 

Attend the next online research seminar

The next seminar in the diversity and inclusion series will take place on Tuesday 4 May at 17:00–18:30 BST / 12:00–13:30 EDT / 9:00–10:30 PDT / 18:00–19:30 CEST. You’ll hear from Dr Cecily Morrison (Microsoft Research) about her research into computing for learners with visual impairments.

To join this free event, click below and sign up with your name and email address:

We’ll send you the link and instructions. See you there!

This was our 15th research seminar — you can find all the related blog posts here.

The post How can we design inclusive and accessible curricula for computer science? appeared first on Raspberry Pi.



Source: Raspberry Pi – How can we design inclusive and accessible curricula for computer science?

Colin Furze is among our special Coolest Projects judges

Young tech creators from more than 40 countries have already registered to take part in this year’s Coolest Projects online showcase! To help us celebrate this year’s wonderful group of participants, we’re lucky to have brought on board Colin Furze, Melissa Pickering, James Whelton, and Fig O’Reilly as special judges.

“Since the first Coolest Projects in 2012, I’ve been continually inspired seeing thousands of young creators sharing their projects with the world. Building websites, apps, games, and hardware around something they’re passionate about, solving problems they face or just doing something cool, year on year Coolest Projects shows the magic of technology.”

James Whelton

Meet the coolest judges!

Colin Furze is a British YouTube personality, presenter, inventor, and five-time Guinness world record holder from Lincolnshire, England. Colin’s YouTube channel has over 10 million subscribers. Colin left school at 16 to become a plumber, a trade he pursued until joining the Sky1 TV programme Gadget Geeks. He has used his engineering experience to build many unconventional contraptions, including a homemade hoverbike, a jet-powered bicycle made with pulsejet engines, and the world’s fastest dodgem vehicle for Top Gear. Colin has completed three Star Wars–themed challenges in partnership with eBay: in 2016 he built a giant AT-AT garden playhouse, followed in 2017 by a full-size Kylo Ren TIE silencer. In 2019 he completed a moving Landspeeder from Star Wars: A New Hope; the vehicle was auctioned off on eBay, with all of the funds going to BBC Children in Need.

Colin Furze, special judge for Coolest Projects
Colin Furze, YouTuber, inventor, and five-time Guinness world record holder

Melissa Pickering is Head of Product at LEGO Education, leading a cross-functional team that designs and develops learning-through-play experiences for kids globally. She has worked in the field of interactive kids’ products for 15 years, from innovating theme parks as a Disney Imagineer to founding an edtech startup. In her six-year LEGO career she has built up and led design teams that innovate LEGO products through digital experiences, with a key focus on using technology to inspire hands-on play.

Melissa Pickering, Coolest Projects special judge
Melissa Pickering, Head of Product at LEGO Education

Fionnghuala O’Reilly is an Irish-American model, beauty pageant titleholder, and engineer. The 27-year-old recently made history as the first woman of colour to represent Ireland at the international Miss Universe pageant. Since getting her degree in Systems Engineering from the George Washington University, O’Reilly, who goes by Fig, has gone on to become a NASA Datanaut, working within the agency’s Open Innovation programme, made up of engineers and scientists who engage with NASA’s open data to create new thinking, processes, and products. Fig has joined the two-time Emmy-nominated science television series Mission Unstoppable as its newest correspondent. She is also the founder and CEO of Reach Productions, which hosts NASA’s Space Apps Challenge in Washington DC. In 2020, Fig was named an Ambassador for Engineers Ireland, Ireland’s leading governing body for professional engineers. Fig is a passionate advocate for women and diversity in STEM subjects.

Fig O'Reilly, special judge for Coolest Projects
Fig O’Reilly, beauty pageant titleholder, engineer, and CEO

James Whelton is a coder, entrepreneur, and co-founder of CoderDojo. At 16, James gained worldwide recognition for discovering a hack for the iPod Nano. In response to the lack of opportunities to learn computing at school, in 2011 he co-founded CoderDojo, a global community of code clubs where young people can learn to build websites, apps, and games, and explore technology in an informal, creative, and social environment. James has developed apps and systems with over a million users around the world. He is currently developing an online platform that helps its users achieve their personal goals and build healthier, happier habits and behaviours.

James Whelton, special judge for Coolest Projects
James Whelton, coder, entrepreneur, and co-founder of CoderDojo

Register a project today

These four fabulous people will choose their favourites from among all of this year’s projects — a unique honour that the young tech creator in your life could receive if they take part! We hope this will be a big boost of motivation for them to register their project for the Coolest Projects showcase before the 3 May deadline.

We’ll be announcing the special judges’ favourite projects as part of our big live-streamed Coolest Projects celebration on 8 June!

A girl presenting a digital making project at a Coolest Projects event

Everyone up to age 18 can register for Coolest Projects, and we welcome projects of all kinds and all experience levels, made with any programming language or any hardware. Through Coolest Projects, young people can show the world something they’ve made with tech that they love, and the projects are as diverse as the participants!

Discover all the support we offer young people to help them create something with tech that they will be proud of.

The showcase gallery is open for you already

You can explore the projects of the young tech creators who’ve already registered if you visit the Coolest Projects online showcase gallery! Which one is your favourite project so far?

The post Colin Furze is among our special Coolest Projects judges appeared first on Raspberry Pi.



Source: Raspberry Pi – Colin Furze is among our special Coolest Projects judges

Custom USB games controllers with Raspberry Pi Pico | HackSpace 42

Games controllers – like keyboards – are very personal things. What works for one person may not work for another. Why, then, should we all use almost identical off-the-shelf controllers? In the latest issue of HackSpace magazine, we take a look at how to use Raspberry Pi Pico to create a controller that’s just right for you.

home made retro gaming joystick box
Gaming like it’s 1989

We’ll use CircuitPython for this as it has excellent support for USB interfaces. The sort of USB devices that we interact with are called human interface devices (HIDs), and there are standard protocols for common HIDs, including keyboards and mice. This is why, for example, you can plug almost any USB keyboard into almost any computer and it will just work, with no need to install drivers.

We’ll be using the Keyboard type, as that works best with the sorts of games that this author likes to play, but you can use exactly the same technique to simulate a mouse or a gamepad.

Before we get onto this, though, let’s take a look at the buttons and how to wire them up.

We’re going to use eight buttons: four for direction, and four as additional ‘action’ buttons. We’ll connect these between an I/O pin and ground. You can use any I/O pin you like; we’re going to use slightly different ones in our two setups, just because they made sense with the physical layout of the hardware. Let’s take a look at the hardware we’re using. Remember, this is just the hardware we chose — the whole idea is to create a setup that’s right for you, so there’s no need to use the same parts. Think about how you want to interact with your games, take a look at the available input devices, and build what you want.

The connectors should just push onto the buttons and joystick

The first setup we’re creating is an Arcade box. This author would really like an arcade machine in his house. However, space limitations mean that this isn’t going to be possible in the near future. The first setup, then, is an attempt to recreate the control setup of an arcade machine, but use it to play games on a laptop rather than a full-sized cabinet.

Arcade controls are quite standard, and you can get them from a range of sources. We used one of Pimoroni’s Arcade Parts sets, which includes a joystick and ten buttons (we only used four of these). The important thing about the joystick you pick is that it’s a button-based joystick and not an analogue one (sometimes called a dual-axis joystick), as the latter won’t work with a keyboard interface. If you want to use an analogue joystick, you’ll need to switch the code around to use a mouse or gamepad as an input device.

You can solder the pin headers straight onto Pico

As well as the electronics, you’ll need some way of mounting them. We used a wooden craft box. These are available for about £10 from a range of online or bricks and mortar stores. You can use anything that is strong enough to hold the components.

The second setup we’re using is a much simpler button-based system on breadboard-compatible tactile buttons and protoboard. It’s smaller, cheaper, and quicker to put together. The protoboard holds everything together, so there’s nothing extra to add unless you want to. You can personalise it by selecting different-sized buttons, changing the layout, or building a larger chassis around this.

Insert coin to continue

Let’s take a look at the arcade setup first. The joystick has five pins. One is a common ground and the others are up, down, left, and right. When you push the joystick up, a switch closes, linking ground to the up pin. On our joystick the outermost pin is ground, but it’s worth checking on your joystick which pin is which by using a multimeter. Select continuity mode and, if you push the joystick up, you should find a continuous connection between the up pin and ground. A bit of experimentation should confirm which pin is which.

In order to read the pins, we just need to connect the directional output from the joystick to an I/O pin on Pico. We can use one of Pico’s internal pull-up resistors to pull the pin high when the button isn’t pressed. Then, when the button is pressed, it will connect to ground and read low. The joystick should come with a cable that slots onto the joystick. This should have five outputs, and this conveniently slots into the I/O outputs of Pico with a ground on one end.

You can solder the pin headers straight onto Pico

The buttons, similarly, just need to be connected between ground and an I/O pin. These came with cables that pushed onto the button and plugged into adjacent pins. Since Pico has eight grounds available, there are enough that each button can have its own ground, and you don’t have to mess around joining cables together.

Once all the cables are soldered together, it’s just a case of building the chassis. For this, you need five large holes (one for the joystick and four for the buttons). We didn’t have an appropriately sized drill bit and, given how soft the wood on these boxes is, a large drill bit may have split the wood anyway. Instead, we drilled a 20 mm hole and then used a rotary tool with a sanding attachment to enlarge the hole until it was the right size. You have to go quite easy with both the drill and the sanding tool to avoid turning everything into shards of broken wood. Four small holes then allow bolts to keep the joystick in place (we used M5 bolts). The buttons just push into place.

With a combination of small sections of wire and jumpers, you can create whatever pattern of wiring you like on protoboard

The only remaining thing was a 12 mm hole for a micro USB cable to pass through to Pico. If you don’t have a 12 mm drill bit, two overlapping smaller holes may work if you’re careful.

The buttons just push-fit into place, and that’s everything ready to go.

A smaller approach

Our smaller option used protoboard over the back of Pico. Since we didn’t want to block the BOOTSEL button, we only soldered it over part of Pico. However, before soldering it on at all, we soldered the buttons in place.

Tactile switches typically have four connections. Well, really they have two connections, but each connection has two tabs that fit into the protoboard. This means that you have to orientate them correctly. Again, your multimeter’s continuity function will confirm which pins are connected and which are switched.

Protoboard is a PCB that contains lots and lots of holes and nothing else. You solder your components into the holes and then you have to create connections between them.

We placed the buttons in the protoboard in positions we liked before worrying about the wiring. First, we looked to connect one side of each switch to ground. To minimise the wiring, we did this in two groups. We connected one side of each of the direction buttons together and then linked them to ground. Then we did the same to all the action buttons.

There are two ways of connecting things on protoboard. One is to use jumper wire. This works well if the points are more than a couple of holes apart. For holes that are next to each other, or very close, you can bridge them. On some protoboard (which doesn’t have a solder mask), you might simply be able to drag a blob of solder across with your soldering iron so that it joins both holes. On protoboard with solder mask, this doesn’t work quite so well, so you need to add a little strand of wire in a surface-mount position between the two points and solder it in. If you’ve got a pair of tweezers to hold the wire in place while you solder it, it will be much easier.

For longer connections, you’ll need to use jumper wire. Sometimes you’ll be able to poke it through the protoboard and use the leg to join. Other times you’ll have to surface-mount it. This all sounds a bit complicated, but while it can be a bit fiddly, it’s all fairly straightforward once you put solder to iron.

Program it up

Now that we’ve got the hardware ready, let’s code it up. You’ll first need to load CircuitPython onto your Pico. You can download the latest release from circuitpython.org. Press the BOOTSEL button as you plug Pico into your USB port, and then drag and drop the downloaded UF2 file onto the RP2 USB drive that should appear.

We’ll use Mu to program Pico. If you’ve not used CircuitPython before, it’s probably worth having a quick look through the ‘getting started’ guide.

The code to run our games controller is:

import board
import digitalio
import gamepad
import time
import usb_hid
from adafruit_hid.keyboard import Keyboard
from adafruit_hid.keycode import Keycode

kbd = Keyboard(usb_hid.devices)

keycodes = [Keycode.UP_ARROW, Keycode.DOWN_ARROW, Keycode.LEFT_ARROW,
            Keycode.RIGHT_ARROW, Keycode.X, Keycode.Z, Keycode.SPACE,
            Keycode.ENTER]

pad = gamepad.GamePad(
    digitalio.DigitalInOut(board.GP12),
    digitalio.DigitalInOut(board.GP14),
    digitalio.DigitalInOut(board.GP9),
    digitalio.DigitalInOut(board.GP15),
    digitalio.DigitalInOut(board.GP16),
    digitalio.DigitalInOut(board.GP17),
    digitalio.DigitalInOut(board.GP18),
    digitalio.DigitalInOut(board.GP20),
)

last_pressed = 0
while True:
    this_pressed = pad.get_pressed()
    if this_pressed != last_pressed:
        for i in range(8):
            if (this_pressed & 1<<i) and not (last_pressed & 1<<i):
                kbd.press(keycodes[i])
            if (last_pressed & 1<<i) and not (this_pressed & 1<<i):
                kbd.release(keycodes[i])
        last_pressed = this_pressed
    time.sleep(0.01)

This uses the HID keyboard object (called kbd) to send key press and release events for different key codes depending on which buttons are pressed or released. We’ve used the gamepad module, which keeps track of up to eight buttons. When you initialise it, it will automatically add pull-up resistors and set the I/O pins to input. Then, it will keep track of which buttons are pressed. When you call get_pressed(), it returns a byte of data where each bit corresponds to an I/O pin. So, the following number (in binary) means that the first and third buttons have been pressed: 00000101. This is a little confusing, because this is the opposite order to how the I/Os are passed when you initialise the GamePad object.

The while loop may look a little unusual, as it’s not particularly common to use this sort of binary comparison in Python code, but in essence it’s just looking at one bit at a time and checking one of two things: either the button is pressed now but wasn’t last time the loop ran (in which case it’s a new button press and we should send it to the computer), or it isn’t pressed this loop but was in the previous loop (in which case it’s newly released, so we can call the release method).

The << operator shifts a value by a number of bits to the left. So, 1<<2 is 100, and 1<<3 is 1000 (in binary). The & operator is a bitwise AND: it compares two binary numbers and performs a logical AND on each pair of bits in turn. Since the right-hand side of the & is all zeros apart from one bit (at a different position depending on the value of i), the result depends on whether the value of this_pressed or last_pressed is 1 or 0 at position i. When you have an if condition that’s a number, it’s true if the number is anything other than 0. So, (this_pressed & 1<<2) will evaluate to true if there’s a 1 at position 2 in the binary form of this_pressed. In our case, that means the joystick is pushed left.

You can grab this code from the following link – hsmag.cc/USBKeyboard. Obviously, you will need to update the GPIO values to the correct ones for your setup when you initialise GamePad.

We’ve taken a look at two ways to build a gamepad, but it’s up to you how you want to design yours.   

Issue 42 of HackSpace magazine is on sale NOW!

hackspace issue 42 cover

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents. As always, every issue is free to download from the HackSpace magazine website.

The post Custom USB games controllers with Raspberry Pi Pico | HackSpace 42 appeared first on Raspberry Pi.



Source: Raspberry Pi – Custom USB games controllers with Raspberry Pi Pico | HackSpace 42

Raspberry Pi touchscreen music streamer

If you liked the look of yesterday’s Raspberry Pi Roon Endpoint Music Streamer but thought: “Hey, you know what would be great? If it had a touchscreen,” then look no further. Home Theater Fanatics has built something using the same RoPieee software, but with the added feature of a screen, for those who need one.

Subscribe to Home Theater Fanatics on YouTube for more great builds like this one

The build cost for this is a little higher than the $150 estimate to recreate yesterday’s project, given the inclusion of a fancier Digital Audio Decoder and the touchscreen itself.

Hardware

connecting raspberry pi to touchscreen
It really is a super user-friendly walkthrough video

The brilliant Home Theater Fanatics show you how to put all of this together from this point in the build video, before moving on to the software install. They take care to go through all of the basics of the hardware in case you’re not familiar with things like ribbon cables or fans. It’s a really nice bird’s-eye view walkthrough, so beginners aren’t likely to have any problems following along.

ribbon attaching to raspberry pi
See – close-ups of how to connect your ribbon cables and everything

Software

Same as yesterday’s build:

At this point in the build video, Home Theater Fanatics go through the three steps you need to take to get the RoPieee and Roon software sorted out, then connect the DAC. Again, it’s a really clear, comprehensive on-screen walkthrough that beginners can be comfortable with.

Why do I need a touchscreen music streamer?

touchscreen music player
Get all your album track info right in your face

Aside from being able to see the attributed artwork for the music you’re currently listening to, this touchscreen solution provides easy song switching during home workouts. It’s also a much snazzier-looking tabletop alternative to a plugged-in phone spouting a Spotify playlist.

The post Raspberry Pi touchscreen music streamer appeared first on Raspberry Pi.



Source: Raspberry Pi – Raspberry Pi touchscreen music streamer

How to build a Raspberry Pi Roon Endpoint Music Streamer

Our friend Mike Perez at Audio Arkitekts is back to show you how to build PiFi, a Raspberry Pi-powered Roon Endpoint Music Streamer. The whole build costs around $150, which is pretty good going for such a sleek-looking Roon-ready end product.

Roon ready

Roon is a platform for all the music in your home, and Roon Core (which works with this build) manages all your music files and streaming content. The idea behind Roon is to bring all your music together, so you don’t have to worry about where it’s stored, what format it’s in, or where you stream it from. You can start a free trial if you’re not already a user.

Parts list

Sleek HiFiBerry case

Simple to put together

Fix the HiFiBerry DAC2 Pro into the top of the case with the line output and headphone outputs poking out. A Raspberry Pi 4 Model B is the brains of the operation, and slots nicely onto the HiFiBerry. The HiFiBerry HAT is compatible with all Raspberry Pi models with a 40-pin GPIO connector, and just clicks right onto the GPIO pins. It’s also powered directly by the Raspberry Pi, so no additional power supply is needed.

Raspberry Pi 4 connected to HiFiBerry HAT inside the top half of the case (before the bottom half is screwed on)

Next, secure the bottom half of the case, making sure all the Raspberry Pi ports line up with the case’s ready-made holes. Mike did the whole thing by hand with just a little help from a screwdriver right at the end.

Software

Download the latest RoPieee image onto your SD card to make it a Roon Ready End Point, then slot it back into your Raspberry Pi. Now you have a good-looking, affordable audio output ready to connect to your Roon Core.

And that’s it. See – told you it was easy. Don’t forget, Audio Arkitekts’ YouTube channel is a must-follow for all audiophiles.

The post How to build a Raspberry Pi Roon Endpoint Music Streamer appeared first on Raspberry Pi.



Source: Raspberry Pi – How to build a Raspberry Pi Roon Endpoint Music Streamer

Transform Raspberry Pi 400 into a hacker cyberdeck

Resisting the desolate consumerism of the suburbs is a serious business for hardware hacker Zack Freedman. Zack transformed a Raspberry Pi 400 into the Voidstar Data Blaster, a portable cyberdeck to fight against becoming a normie.

The suburbs thing is explained at the beginning of Zack’s build video. Subscribe to his YouTube channel.

Hang on, what is a cyberdeck?

Zack explains:

“A data blaster [cyberdeck] is the trademark battlestation of console cowboy antiheroes running nets through cyberspace.”

There’s a whole subreddit devoted to exploring what does and does not make a real-life cyberdeck, so if you were looking for a rabbit hole to go down, knock yourself out.

view of cyber deck on a desk
Punky

How do you turn a Raspberry Pi 400 into a cyberdeck?

Added features to transform a Raspberry Pi 400 into the Voidstar Data Blaster include:

  • Detachable wearable display
  • Battery handles
  • SDR receiver
  • Antennae
  • 1280×480 touchscreen
data blaster strapped to forearm of maker
Wear your data blaster with pride

Handles make the cyberdeck nice and portable. Console cowboys can also use them to flip the deck up onto their forearm and easily “jack in” to cyberspace.

Rules around which keyboard you can use on a legitimate cyberdeck are pretty tight. It can’t be touchscreen (because that means it’s a tablet); however, it can’t fold away on a hinge either (because that makes it a laptop). Enter Raspberry Pi 400, a computer built into a mechanical keyboard about the length of an adult forearm. Perfect.

The SDR receiver means that users are cyber snooping-ready, while the head-mounted display provides a cyberpunk design flourish. That display acts as a second screen alongside the mini touchscreen. You can drag anything from the main display into sight on the headgear.

Authentic cyberpunk aesthetic

A lot of trial and error with a 3D printer finally yielded a faceplate that allows the screen and headgear to fit in perfectly. Zack also designed and printed all the flair and logos you see stuck around the cyberdeck. LEDs make the decorative filament fluoresce. Integrated pegs keep all the wiring neat – an inspired practical addition.

The underside of the data blaster

Here are all the STL files if you’d like to create your own cyberdeck. And the design files let you take a closer look at a 3D render of Zack’s creation.

We saved the best bit for last: not only can you play Doom on the Voidstar Data Blaster, you can play it on the wearable display. Stay punk.

The post Transform Raspberry Pi 400 into a hacker cyberdeck appeared first on Raspberry Pi.



Source: Raspberry Pi – Transform Raspberry Pi 400 into a hacker cyberdeck

Play Call of Duty with a Raspberry Pi-powered Nerf gun

YouTuber Alfredo Sequeida turned a Nerf gun into a controller for playing Call of Duty: Warzone. This is a fun-looking modification project, but some serious coding went into the process.

Head to the 13-minute mark for an in-game demonstration

Trigger happy

Funnily enough, the Nerf gun that Alfredo chose was a special edition Fortnite model. This irked him as a Call of Duty player, but this model had the most potential to accommodate the modifications he knew he wanted.

mini screen embedded on nerf gun
The screen is an old Android phone which lends its accelerometer to the project

The controller uses the Nerf gun’s original trigger. Alfredo designed extra 3D-printed buttons (white dots on the far right) to let him perform more in-game actions like moving, plating, and jumping.

Software

A Raspberry Pi 4 powers the whole thing, running Python scripts Alfredo wrote for both the Raspberry Pi and his gaming PC. Here’s all the code on GitHub.

Gameplay movement is controlled by getting accelerometer data via the command-line tool ADB logcat from an old Nexus 5 Android phone that’s mounted on the Nerf gun. The data is logged using a custom app Alfredo made on Android Studio.

raspberry pi embedded in nerf gun
A Raspberry Pi 4 wired up to all the buttons on the other side of the Nerf gun

Part of the action

The controller’s design makes players feel part of the action as their Call of Duty operator scouts around locations. It’s a much more immersive experience than holding an ordinary game controller in your lap or tapping away at a PC keyboard. Alfredo even plays standing up now that his Nerf gun controller is in action. He might as well be on a real-life Special Ops mission.

call of duty POV game play
The Nerf gun complements the gameplay view that Call of Duty players have

More Call of Duty mod ideas…

So what’s next, Alfredo? We vote you make some modded night-vision goggles out of an old Viewmaster toy. That’ll totally work, right?

woman holding a view master toy up to her face to look through it
I am 90% sure young Alfredo doesn’t know what a Viewmaster is (even I had to Google it)

The post Play Call of Duty with a Raspberry Pi-powered Nerf gun appeared first on Raspberry Pi.



Source: Raspberry Pi – Play Call of Duty with a Raspberry Pi-powered Nerf gun

Our new SIGCSE Special Project on culturally relevant resources for computing

Over the last 20 years, researchers and educators have increasingly aimed to develop computing curricula that are culturally responsive and relevant. Designing equitable and authentic learning experiences in computing requires conscious effort to take into account the characteristics of learners and their social environments, in order to address topics that are relevant to a diverse range of students. We previously discussed this topic in a research seminar where the invited speakers shared their work on equity-focused teaching of computer science in relation to race and ethnicity.

Educator and student focussed on a computing task
Designing equitable and authentic learning experiences in computing requires conscious effort.

Culturally relevant teaching in the classroom demonstrates a teacher’s deliberate and explicit acknowledgment that they value all students and expect all students will excel. Much of the research on this topic stems from the USA. In the UK, it may be that a lack of cultural responsiveness in the computing curriculum is contributing to the underrepresentation of students from some ethnic backgrounds in formal computing qualifications [1] by negatively affecting the way these young people engage with and learn the subject.

Guidelines for creating culturally relevant learning resources for computing

Addressing this issue of underrepresentation is important to us, so we’re making it part of our work on diversity and inclusion in computing education. That’s why we’re delighted to have received an ACM SIGCSE Special Project Award for a project called ‘Developing criteria for K-12 learning resources in computer science that challenge stereotypes and promote diversity’. Our overarching aim for this project, as with all our work at the Raspberry Pi Foundation, is to broaden participation in computing and address the needs of diverse learners. Through this project, we will support computing educators in understanding culturally responsive pedagogy and how to apply it to their own practice. To this end, we’ve set up a working group that will use research into culturally responsive pedagogy to develop a set of guidelines for creating culturally relevant learning resources for computing. Our primary audience for these guidelines are teachers in the UK, but we are confident the project’s results will have value and application all over the world.

There is increasing awareness across all education, and in computing education in particular, that culturally responsive approaches to curriculum and teaching fosters relevancy, interest, and engagement for student learners. This exciting effort brings together computing classroom teachers and education researchers to identify approaches and resources that England’s educators can leverage to enact culturally responsive approaches to teaching computing.

Joanna Goode, Sommerville Knight Professor at the University of Oregon, member of our Special Project working group

What do we mean by culturally relevant resources?

A learning resource obviously has learning objectives, but it is also always set in a particular context, which may or may not be relevant to young people. It may contain images, video, and other media assets in addition to text. Presenting computing stereotypes, for example in the media assets and language used, or situating resources in an unfamiliar context can cause learners to feel that they do not belong in the subject or that it is not relevant to them and their life. On the other hand, providing resources that allow learners to relate what they are learning to issues or tasks that are personally meaningful to them and/or their culture or community can be empowering and engaging for them. For example, a common scenario used to introduce basic algorithm design to young people is making a cup of tea, but tea preparation and drinking may be culturally specific, and even if tea is drunk in a young person’s home, tea preparation may not be an activity they engage in.

A matcha tea preparation
Preparing a cup of tea — a scenario often used for introducing learners to algorithm design — can be culturally specific: compare matcha and builder’s tea.

Ensuring that a more diverse group of young people feel like they belong in computing

The expected long-term outcome of this project is to remove significant obstacles to young people’s participation in computing by ensuring that a more diverse group of young people feel represented and like they belong in the subject. The working group we have established consists of seven practising computing teachers from a diverse range of UK schools and a panel of four experts and academics (Lynda Chinaka, Mike Deutsch, Joanna Goode, and Yota Dimitriadi) working with young people and their teachers in the UK, USA, and Canada.

A teacher aids children in the classroom
We will support computing educators in understanding culturally responsive pedagogy and how to apply it to their own practice.

Yota Dimitriadi, Associate Professor at the University of Reading and a member of the expert panel, says: “I am delighted to participate in this project that enables conversations and positive action around inclusive and intersectional computing practices. It is more important than ever to enhance a global perspective in our curriculum planning and further our understanding of culturally responsive pedagogies; such an approach can empower all our students and support their skills and understanding of the integral role that computing can play in promoting social justice.”

Such an approach can empower all our students and support their skills and understanding of the integral role that computing can play in promoting social justice.

Yota Dimitriadi, Associate Professor at the University of Reading, member of our Special Project working group

The group has started to meet and discuss the guidelines, and we aim to share early findings and outputs in the summer months. We’re very excited about this project, and we think it is an important starting point for other work. We look forward to updating you in the summer!


[1] Students of Black, non-Chinese Asian, and Mixed ethnicities; Kemp, P.E.J., Berry, M.G., & Wong, B. (2018). The Roehampton Annual Computing Education Report: Data from 2017. University of Roehampton, London.

The post Our new SIGCSE Special Project on culturally relevant resources for computing appeared first on Raspberry Pi.



Source: Raspberry Pi – Our new SIGCSE Special Project on culturally relevant resources for computing

Raspberry Pi: a versatile tool for biological sciences

Over the nine-ish years since the release of our first model, we’ve watched a thriving global community of Raspberry Pi enthusiasts, hobbyists, and educators grow. But did you know that Raspberry Pi is also increasingly used in scientific research?

Thumbnail images of various scientific applications of Raspberry Pi
Some of the scientific applications of Raspberry Pi that Jolle found

Dr Jolle Jolles, a behavioural ecologist at the Center for Ecological Research and Forestry Applications (CREAF) near Barcelona, Spain, and a passionate Raspberry Pi user, has recently published a detailed review of the uptake of Raspberry Pi in biological sciences. He found that well over a hundred published studies have made use of Raspberry Pi hardware in some way.

How can Raspberry Pi help in biological sciences?

The list of applications is almost endless. Here are just a few:

  • Nest-box monitoring (we do love a good nest box)
  • Underwater video surveillance systems (reminds us of this marine conservation camera)
  • Plant phenotyping (These clever people made a ‘Greenotyper’ with Raspberry Pi)
  • Smart bird-feeders (we shared this one, which teaches pigeons, on the blog)
  • High-throughput behavioural recording systems
  • Autonomous ecosystem monitoring (you can listen to the Borneo rainforest with this project)
  • Closed-loop virtual reality (there are just too many VR projects using Raspberry Pi to choose from. Here’s a few)
Doctor Jolle giving a presentation on Raspberry Pi
Dr Jolles spreading the good word about our tiny computers

Onwards and upwards

Jolle’s review shows that use of Raspberry Pi is on the up, with more studies documenting the use of Raspberry Pi hardware every year, but he’s keen to see it employed even more widely.

“It is really great to see the broad range of applications that already exist, with Raspberry Pis helping biologists in the lab, in the field, and in the classroom. However, Raspberry Pi is still not the common research tool that it could be.”

Jolle Jolles

Dr Jolles hard at work
Hard at work

How can I use Raspberry Pi in my research?

To stimulate the uptake of Raspberry Pi and help researchers integrate it into their work, the review paper offers guidelines and recommendations. Jolle also maintains a dedicated website with over 30 tutorials: raspberrypi-guide.github.io

“I believe low-cost micro-computers like the Raspberry Pi are a powerful tool that can help transform and democratize scientific research, and will ultimately help push the boundaries of science.”

Jolle Jolles

The paper, Broad-scale Applications of the Raspberry Pi: A Review and Guide for Biologists, is currently under review, but a preprint is available here.

‘Pirecorder’ for automating image and video capture

Jolle has also previously published a very handy software package designed especially with biological scientists in mind. It’s called pirecorder, and it helps with automated image and video recording using Raspberry Pi. You can check it out here: https://github.com/JolleJolles/pirecorder.
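
To give you a flavour of how it works, here’s a minimal sketch in Python based on the pirecorder documentation. The configuration values are just illustrative, and the exact parameter names are an assumption drawn from the project README, so do check the repository for the current API:

import pirecorder

# Create a recorder instance; pirecorder stores its settings in a
# configuration file, so they persist between sessions.
rec = pirecorder.PiRecorder()

# Configure it to capture single still images with a custom label
# (parameter names follow the project README; treat as assumptions).
rec.settings(label="nestbox", rectype="img")

# Take a recording using the stored settings.
rec.record()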

You can keep up with Jolle on Instagram, where he documents all the dreamy outdoor projects he’s working on.

Drop a comment below if you’ve seen an interesting scientific application of Raspberry Pi, at work, on TV, or maybe just in your imagination while you wait to find the time to build it!

The post Raspberry Pi: a versatile tool for biological sciences appeared first on Raspberry Pi.



Source: Raspberry Pi – Raspberry Pi: a versatile tool for biological sciences

Go down a Raspberry Pi YouTube rabbit hole

We here at Virtual Raspberry Pi Towers are looking forward to our weekends getting warmer, now that we are officially in British Summer Time. But we want to make the most of these last Saturdays and Sundays in which we have no choice but to cosy up against the typically British spring weather with a good old-fashioned YouTube rabbit hole.

Here are a few channels we think you’ll like. Some we’ve known about for a while, others are new friends we’ve made over the last year or so, and one is almost brand new so we’re putting you ahead of the curve there. You’re welcome.

Sophy Wong

Subscribe to Sophy Wong’s channel if you love the idea of wearing the tech you create. She collaborated with HackSpace magazine to publish a book, Wearable Tech Projects, which is currently on sale at the Raspberry Pi Press online store for just £7.

This is one of the projects Sophy shared in her Wearable Tech Projects book

Sophy describes herself as a “maker, designer, geek, addicted to learning how to do new things.” And she even visited NASA to watch a SpaceX launch.

Subscribe to Sophy’s channel here.

Blitz City DIY

Blitz City DIY (aka Liz) is a “DIY-er on a quest to gather and share knowledge” and has already built something cool with our newest baby, Raspberry Pi Pico. Her busy channel features computing, audio, video, coding, and more.

Check out Raspberry Pi Pico in action in this recent video from Blitz City DIY

We love Liz an extra lot because her channel features an entire playlist dedicated to Raspberry Pi Adventures. She also shares a healthy dose of festive content showing you how to Tech the Halls. No, April is NOT too early for Christmas stuff.

Subscribe to Blitz City DIY here.

Electromaker

Our new friends at Electromaker share tutorials, community projects, and contests where subscribers win hardware and massive cash prizes. Flat cap aficionado Ian Buckley also hosts The Electromaker Show – a weekly roundup of all that’s new and interesting in the maker community.

Electromakers assemble!

You can also swing by the super useful online shop where you can buy everything you need to recreate some of the projects featured. If you’re daunted by shopping for every little bit you need to create something awesome, you can choose one of these electro {maker KITS} and get right to it. We especially like the Lightsaber and Daft Punk-esque helmet kits.

Follow Electromaker here.

Estefannie Explains It All

You must have seen an Estefannie Explains It All video by now. But did you know about the weekly livestreams she hosts on Instagram? We know you’ll watch just because she’s cool and sometimes holds her pet cat up to the camera, but you’ll definitely want to tune in to try and win one of her tech giveaways. Some lucky viewers even got their hands on a Raspberry Pi 400.

Fond memories of when Estefannie visited Raspberry Pi Towers

Estefannie is another top collaborator whose channel has a dedicated Raspberry Pi playlist. Some of the things she has created using our tiny computers include Jurassic Park goggles, an automated coffee press, and a smart gingerbread house.

And as if all that wasn’t enough, Estefannie graced the Princesses with Power Tools calendar this year as Rey from Star Wars. You can buy a copy here.

Subscribe to Estefannie Explains It All here.

Kids Invent Stuff

Ruth Amos and Shawn Brown use their channel Kids Invent Stuff to bring kids’ ideas to life by making them into real working inventions. Young people aged 4–11 can submit their ideas or take part in regular invention challenges.

The infamous pooping unicorn

We first gave this channel a shout-out when Ruth and Shawn teamed up with Estefannie Explains It All to build the world’s first Raspberry Pi-powered Twitter-activated jelly bean-pooping unicorn. For real.

The MagPi Magazine got to know Ruth a little better in a recent interview. And Ruth also features in the 2021 Princesses with Power Tools calendar, as a welding Rapunzel. Go on, you know you want to buy one.

Ellora James

We saved the best (and newest) for last. Ellora James is brand new to YouTube. Her first tutorial showing you how to use Pimoroni’s Grow HAT Mini Kit was posted just three weeks ago, and she added a project update this week.

Ellora helps you differentiate between edible pie and Raspberry Pi

We really like her video showing beginners how to set up their first Raspberry Pi. But our favourite is the one above in which she tackles one of the Universe’s big questions.

Subscribe to Ellora James here.

The post Go down a Raspberry Pi YouTube rabbit hole appeared first on Raspberry Pi.



Source: Raspberry Pi – Go down a Raspberry Pi YouTube rabbit hole

Edge Impulse and TinyML on Raspberry Pi

Raspberry Pi is probably the most affordable way to get started with embedded machine learning. The inferencing performance we see with Raspberry Pi 4 is comparable to or better than some of the new accelerator hardware, but your overall hardware cost is just that much lower.

Raspberry Pi 4 Model B

However, training custom models on Raspberry Pi — or any edge platform, come to that — is still problematic. This is why today’s announcement from Edge Impulse is a big step, and makes machine learning at the edge that much more accessible. With full support for Raspberry Pi, you now have the ability to take data, train against your own data in the cloud on the Edge Impulse platform, and then deploy the newly trained model back to your Raspberry Pi.

Today’s announcement includes new SDKs for Python, Node.js, Go, and C++, which allow you to integrate machine learning models directly into your own applications. There is also support for object detection, exclusively on the Raspberry Pi; you can train a custom object detection model using camera data taken on your own Raspberry Pi, and then deploy and use this custom model, rather than relying on a pretrained stock image classification model.

To test it out, we’re going to train a very simple model that can tell the difference between a banana 🍌 and an apple 🍎. Because the importance of bananas to machine learning researchers cannot be overstated.

Getting started

If you don’t already have an Edge Impulse account, you should open up a browser on your laptop and create an account, along with a test project. I’m going to call mine “Object detection”.

Creating a new project in Edge Impulse
Creating a new project in Edge Impulse

We’re going to be building an image classification project, one that can tell the difference between a banana 🍌 and an apple 🍎, but Edge Impulse will also let you build an object detection project, one that will identify multiple objects in an image.

Building an object detection rather than an image classification system? This video is for you!

After creating your project, you should see something like this:

My new object detection project open in Edge Impulse

Now log in to your Raspberry Pi, open up a Terminal window, and type

$ curl -sL https://deb.nodesource.com/setup_12.x | sudo bash -
$ sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
$ sudo npm install edge-impulse-linux -g --unsafe-perm

to install the local toolchain. Then type

$ edge-impulse-linux
Edge Impulse Linux client v1.1.5
? What is your user name or e-mail address (edgeimpulse.com)? alasdair
? What is your password? [hidden]
This is a development preview.
Edge Impulse does not offer support on edge-impulse-linux at the moment.

? To which project do you want to connect this device? Alasdair Allan / Object detection
? Select a microphone USB-Audio - Razer Kiyo
[SER] Using microphone hw:1,0
? Select a camera Razer Kiyo
[SER] Using camera Razer Kiyo starting...
[SER] Connected to camera
[WS ] Connecting to wss://remote-mgmt.edgeimpulse.com
[WS ] Connected to wss://remote-mgmt.edgeimpulse.com
? What name do you want to give this device? raspberrypi
[WS ] Device "raspberrypi" is now connected to project "Object detection"
[WS ] Go to https://studio.edgeimpulse.com/studio/XXXXX/acquisition/training to build your machine learning model!

and log in to your Edge Impulse account. You’ll then be asked to choose a project, and finally to select a microphone and camera to connect to the project. I’ve got a Razer Kiyo connected to my own Raspberry Pi so I’m going to use that.

Raspberry Pi has connected to Edge Impulse
Raspberry Pi has connected to Edge Impulse

If you still have your project open in a browser you might see a notification telling you that your Raspberry Pi is connected. Otherwise you can click on “Devices” in the left-hand menu for a list of devices connected to that project. You should see an entry for your Raspberry Pi.

The list of devices connected to your project

Taking training data

If you look in your Terminal window on your Raspberry Pi you’ll see a URL that will take you to the “Data acquisition” page of your project. Alternatively you can just click on “Data acquisition” in the left-hand menu.

Getting ready to collect training data
Getting ready to collect training data

Go ahead and select your Raspberry Pi if it isn’t already selected, and then select the Camera as the sensor. You should see a live thumbnail from your camera appear on the right-hand side. If you want to follow along, position your fruit (I’m starting with the banana 🍌), add a text label in the “Label” box, and hit the “Start sampling” button. This will take an image and save it to the cloud. Reposition the banana and take ten images. Then do it all again with the apple 🍎.

Ten labelled images each of the banana 🍌 and the apple 🍎

Since we’re building an incredibly simplistic model, and we’re going to leverage transfer learning, we probably now have enough training data with just these twenty images. So let’s go and create a model.

Creating a model

Click on “Impulse design” in the left-hand menu. Start by clicking on the “Add an input block” box and then the “Add” button next to the “Images” entry. Next, click on the “Add a processing block” box and then the “Add” button next to the “Image” block; this adds a processing block that will normalise the image data and reduce colour depth. Finally, click on the “Add a learning block” box and select the “Transfer Learning (images)” block to grab a pretrained image classification model, on which we will perform transfer learning to tune it for our banana 🍌 and apple 🍎 recognition task. You should see the “Output features” block update to show 2 output features. Now hit the “Save Impulse” button.

Our configured Impulse

Next click on the “Images” sub-item under the “Impulse design” menu item, switch to the “Generate features” tab, and then hit the green “Generate features” button.

Generating model features

Finally, click on the “Transfer learning” sub-item under the “Impulse design” menu item, and hit the green “Start training” button at the bottom of the page. Training the model will take some time. Go get some coffee ☕.

A trained model

Testing our model

We can now test our trained model against the world. Click on the “Live classification” entry in the left-hand menu, and then hit the green “Start sampling” button to take a live picture from your camera.

Live classification to test your model
Live classification to test your model

You might want to go fetch a different banana 🍌, just for testing purposes.

A live test of the model

If you want to do multiple tests, just scroll up and hit the “Start sampling” button again to take another image.

Deploying to your Raspberry Pi

Now we’ve (sort of) tested our model, we can deploy it back to our Raspberry Pi. Go to the Terminal window where the edge-impulse-linux command connecting your Raspberry Pi to Edge Impulse is running, and hit Control-C to stop it. Afterwards, we can do a quick evaluation deployment using the edge-impulse-linux-runner command.

$ edge-impulse-linux-runner
This is a development preview.
Edge Impulse does not offer support on edge-impulse-linux-runner at the moment.

Edge Impulse Linux runner v1.1.5

[RUN] Already have model /home/pi/.ei-linux-runner/models/24217/v2/model.eim not downloading...
[RUN] Starting the image classifier for Alasdair Allan / Object detection (v2)
[RUN] Parameters image size 96x96 px (3 channels) classes [ 'apple', 'banana' ]
[RUN] Using camera Razer Kiyo starting...
[RUN] Connected to camera

Want to see a feed of the camera and live classification in your browser? Go to http://XXX.XXX.XXX.XXX:XXXX

classifyRes 31ms. { apple: '0.0097', banana: '0.9903' }
classifyRes 29ms. { apple: '0.0082', banana: '0.9918' }
 .
 .
 .
classifyRes 23ms. { apple: '0.0078', banana: '0.9922' }

This will connect to the Edge Impulse cloud, download your trained model, and start up an application that will take the video stream coming from your camera and look for bananas 🍌 and apples 🍎. The results of the model inferencing will be shown frame by frame in the Terminal window. When the runner application starts up you’ll also see a URL: copy and paste this into a browser, and you’ll see the view from the camera in real time along with the inferencing results.

Deployed model running locally on your Raspberry Pi

Success! We’ve taken our training data and trained a model in the cloud, and we’re now running that model locally on our Raspberry Pi. Because we’re running the model locally, we no longer need network access. No data needs to leave the Raspberry Pi. This is a huge privacy advantage for edge computing compared to cloud-connected devices.

Wrapping up?

While we’re running our model inside Edge Impulse’s “quick look” application here, we can deploy the exact same model into our own applications, as today’s announcement includes new SDKs for Python, Node.js, Go, and C++. These SDKs let us build standalone applications that collect data not just from our camera and microphone, but from other sensors like accelerometers, magnetometers, or anything else you can connect to a Raspberry Pi.
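
To give a sense of what that looks like in practice, here’s a minimal Python sketch modelled on the image-classification examples that ship with the Edge Impulse Linux SDK. The model path is the one the runner downloaded in the walkthrough above; treat it, and the camera index, as assumptions you’ll adjust for your own setup:

import cv2
from edge_impulse_linux.image import ImageImpulseRunner

# Path to the .eim model file downloaded by edge-impulse-linux-runner
# (yours will have a different project ID).
MODEL_PATH = "/home/pi/.ei-linux-runner/models/24217/v2/model.eim"

with ImageImpulseRunner(MODEL_PATH) as runner:
    model_info = runner.init()
    print("Loaded model for project:", model_info["project"]["name"])

    # Grab a single frame from the first attached camera.
    ok, frame = cv2.VideoCapture(0).read()
    if not ok:
        raise RuntimeError("Could not read a frame from the camera")

    # The SDK expects RGB images; OpenCV captures in BGR order.
    frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

    # Crop and scale the frame to the model's input size, then classify.
    features, cropped = runner.get_features_from_image(frame)
    result = runner.classify(features)
    for label, score in result["result"]["classification"].items():
        print(f"{label}: {score:.4f}")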

Performance metrics for Edge Impulse are promising, although still somewhat below what we’ve seen using TensorFlow Lite directly on Raspberry Pi 4 for inferencing with similar models. That said, it’s really hard to compare performance across even very similar models, as it depends so much on the exact situation you’re in and what data you’re dealing with, so your mileage may vary quite a lot here.

However, the new Edge Impulse announcement offers two very vital things: a cradle-to-grave framework for collecting data, training models, and then deploying these custom models at the edge, together with a layer of abstraction. Increasingly we’re seeing deep learning eating software as part of a general trend, sometimes termed lithification, towards increasing abstraction in software. Which sounds intimidating, but means that we can all do more, with less effort. Which isn’t a bad thing at all.

The post Edge Impulse and TinyML on Raspberry Pi appeared first on Raspberry Pi.



Source: Raspberry Pi – Edge Impulse and TinyML on Raspberry Pi