Custom USB games controllers with Raspberry Pi Pico | HackSpace 42

Games controllers – like keyboards – are very personal things. What works for one person may not work for another. Why, then, should we all use almost identical off-the-shelf controllers? In the latest issue of HackSpace magazine, we take a look at how to use Raspberry Pi Pico to create a controller that’s just right for you.

home made retro gaming joystick box
Gaming like it’s 1989

We’ll use CircuitPython for this as it has excellent support for USB interfaces. The sort of USB devices that we interact with are called human interface devices (HIDs), and there are standard protocols for common HIDs, including keyboards and mice. This is why, for example, you can plug almost any USB keyboard into almost any computer and it will just work, with no need to install drivers.

We’ll be using the Keyboard type, as that works best with the sorts of games that this author likes to play, but you can use exactly the same technique to simulate a mouse or a gamepad.

Before we get onto this, though, let’s take a look at the buttons and how to wire them up.

We’re going to use eight buttons: four for direction, and four as additional ‘action’ buttons. We’ll connect these between an I/O pin and ground. You can use any I/O pins you like; we’re using slightly different ones in our two setups, just because they made sense with the physical layout of the hardware. Let’s take a look at the hardware we’re using. Remember, this is just the hardware we want to use. The whole idea is to create a setup that’s right for you, so there’s no need to use the same parts. Think about how you want to interact with your games, take a look at the available input devices, and build what you want.

The connectors should just push onto the buttons and joystick

The first setup we’re creating is an Arcade box. This author would really like an arcade machine in his house. However, space limitations mean that this isn’t going to be possible in the near future. The first setup, then, is an attempt to recreate the control setup of an arcade machine, but use it to play games on a laptop rather than a full-sized cabinet.

Arcade controls are quite standard, and you can get them from a range of sources. We used one of Pimoroni’s Arcade Parts sets, which includes a joystick and ten buttons (we only used four of these). The important thing about the joystick you pick is that it’s a button-based joystick and not an analogue one (sometimes called a dual-axis joystick), as the latter won’t work with a keyboard interface. If you want to use an analogue joystick, you’ll need to switch the code around to use a mouse or gamepad as an input device.

You can solder the pin headers straight onto Pico

As well as the electronics, you’ll need some way of mounting them. We used a wooden craft box. These are available for about £10 from a range of online or bricks and mortar stores. You can use anything that is strong enough to hold the components.

The second setup we’re using is a much simpler button-based system on breadboard-compatible tactile buttons and protoboard. It’s smaller, cheaper, and quicker to put together. The protoboard holds everything together, so there’s nothing extra to add unless you want to. You can personalise it by selecting different-sized buttons, changing the layout, or building a larger chassis around this.

Insert coin to continue

Let’s take a look at the arcade setup first. The joystick has five pins. One is a common ground and the others are up, down, left, and right. When you push the joystick up, a switch closes, linking ground to the up pin. On our joystick the outermost pin is ground, but it’s worth checking on your joystick which pin is which by using a multimeter. Select continuity mode and, if you push the joystick up, you should find a continuous connection between the up pin and ground. A bit of experimentation should confirm which pin is which.

In order to read the pins, we just need to connect each directional output from the joystick to an I/O pin on Pico. We can use Pico’s internal pull-up resistors to pull each pin high when the button isn’t pressed; when the button is pressed, the pin connects to ground and reads low. The joystick should come with a cable that slots onto it. This has five connections, and it conveniently slots onto Pico’s I/O pins with ground at one end.

You can solder the pin headers straight onto Pico

The buttons, similarly, just need to be connected between ground and an I/O pin. These came with cables that pushed onto the button and plugged into adjacent pins. Since Pico has eight grounds available, there are enough that each button can have its own ground, and you don’t have to mess around joining cables together.

Once all the cables are soldered together, it’s just a case of building the chassis. For this, you need five large holes (one for the joystick and four for the buttons). We didn’t have an appropriately sized drill bit and, given how soft the wood on these boxes is, a large drill bit may have split the wood anyway. Instead, we drilled a 20 mm hole and then used a rotary tool with a sanding attachment to enlarge the hole until it was the right size. You have to go quite easy with both the drill and the sanding tool to avoid turning everything into shards of broken wood. Four small holes then allow bolts to keep the joystick in place (we used M5 bolts). The buttons just push into place.

With a combination of small sections of wire and jumpers, you can create whatever pattern of wiring you like on protoboard

The only remaining thing was a 12 mm hole for a micro USB cable to pass through to Pico. If you don’t have a 12 mm drill bit, two overlapping smaller holes may work if you’re careful.

The buttons just push-fit into place, and that’s everything ready to go.

A smaller approach

Our smaller option used protoboard over the back of Pico. Since we didn’t want to block the BOOTSEL button, we only soldered it over part of Pico. However, before soldering it on at all, we soldered the buttons in place.

Tactile switches typically have four connections. Well, really they have two connections, but each connection has two tabs that fit into the protoboard. This means that you have to orientate them correctly. Again, your multimeter’s continuity function will confirm which pins are connected and which are switched.

Protoboard is a PCB that contains lots and lots of holes and nothing else. You solder your components into the holes and then you have to create connections between them.

We placed the buttons in the protoboard in positions we liked before worrying about the wiring. First, we looked to connect one side of each switch to ground. To minimise the wiring, we did this in two groups. We connected one side of each of the direction buttons together and then linked them to ground. Then we did the same to all the action buttons.

There are two ways of connecting things on protoboard. One is to use jumper wire. This works well if the points are more than a couple of holes apart. For holes that are next to each other, or very close, you can bridge them. On some protoboard (which doesn’t have a solder mask), you might simply be able to drag a blob of solder across with your soldering iron so that it joins both holes. On protoboard with solder mask, this doesn’t work quite so well, so you need to add a little strand of wire in a surface-mount position between the two points and solder it in. If you’ve got a pair of tweezers to hold the wire in place while you solder it, it will be much easier.

For longer connections, you’ll need to use jumper wire. Sometimes you’ll be able to poke it through the protoboard and use the leg to join. Other times you’ll have to surface-mount it. This all sounds a bit complicated, but while it can be a bit fiddly, it’s all fairly straightforward once you put solder to iron.

Program it up

Now that we’ve got the hardware ready, let’s code it up. You’ll first need to load CircuitPython onto your Pico. You can download the latest release from circuitpython.org. Press the BOOTSEL button as you plug Pico into your USB port, and then drag and drop the downloaded UF2 file onto the RPI-RP2 USB drive that should appear.

We’ll use Mu to program Pico. If you’ve not used CircuitPython before, it’s probably worth having a quick look through the ‘getting started’ guide.

The code to run our games controller is:

import board
import digitalio
import gamepad
import time
import usb_hid
from adafruit_hid.keyboard import Keyboard
from adafruit_hid.keycode import Keycode

kbd = Keyboard(usb_hid.devices)

# Key to send for each button, in the same order as the pins below
keycodes = [Keycode.UP_ARROW, Keycode.DOWN_ARROW, Keycode.LEFT_ARROW,
            Keycode.RIGHT_ARROW, Keycode.X, Keycode.Z, Keycode.SPACE,
            Keycode.ENTER]

# GamePad sets each pin as an input with a pull-up and tracks its state
pad = gamepad.GamePad(
    digitalio.DigitalInOut(board.GP12),
    digitalio.DigitalInOut(board.GP14),
    digitalio.DigitalInOut(board.GP9),
    digitalio.DigitalInOut(board.GP15),
    digitalio.DigitalInOut(board.GP16),
    digitalio.DigitalInOut(board.GP17),
    digitalio.DigitalInOut(board.GP18),
    digitalio.DigitalInOut(board.GP20),
)

last_pressed = 0
while True:
    this_pressed = pad.get_pressed()
    if this_pressed != last_pressed:
        for i in range(8):
            # Newly pressed: send the key press
            if (this_pressed & 1 << i) and not (last_pressed & 1 << i):
                kbd.press(keycodes[i])
            # Newly released: send the key release
            if (last_pressed & 1 << i) and not (this_pressed & 1 << i):
                kbd.release(keycodes[i])
        last_pressed = this_pressed
    time.sleep(0.01)

This uses the HID keyboard object (called kbd) to send key press and release events for different keycodes, depending on which buttons are pressed or released. We’ve used the gamepad module, which keeps track of up to eight buttons. When you initialise it, it automatically sets the I/O pins as inputs with pull-up resistors enabled, and from then on it tracks which buttons are pressed. When you call get_pressed(), it returns a byte of data in which each bit corresponds to one I/O pin. So the following number (in binary) means that the first and third buttons are pressed: 00000101. This is a little confusing, because it’s the opposite order to the one in which the I/Os are passed when you initialise the GamePad object.
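As a quick illustration of that bitmask (in plain Python, no hardware needed), here’s a hypothetical helper that lists which bit positions are set in a get_pressed()-style byte:

```python
def pressed_indices(bitmask):
    """Return the (0-based) positions of the set bits in an 8-bit bitmask."""
    return [i for i in range(8) if bitmask & (1 << i)]

# 0b00000101 has bits 0 and 2 set: the first and third buttons are pressed.
print(pressed_indices(0b00000101))  # -> [0, 2]
```

Note that bit 0 is the rightmost digit when the number is written out in binary, which is why the order looks reversed compared to the GamePad initialisation.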

The while loop may look a little unusual, as it’s not particularly common to use this sort of binary comparison in Python code. In essence, it looks at one bit at a time and checks for two cases: the button is pressed now but wasn’t the last time the loop ran (in which case it’s a new press that we should send to the computer), or it isn’t pressed this loop but was on the previous one (in which case it’s newly released, so we call the release method).
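You can see the same edge-detection logic in isolation with two made-up bitmask snapshots (names here are illustrative, not from the original code):

```python
def diff_events(last_pressed, this_pressed, n_buttons=8):
    """Compare two bitmask snapshots; return (newly_pressed, newly_released) indices."""
    pressed, released = [], []
    for i in range(n_buttons):
        bit = 1 << i
        if (this_pressed & bit) and not (last_pressed & bit):
            pressed.append(i)   # pressed now, wasn't before
        if (last_pressed & bit) and not (this_pressed & bit):
            released.append(i)  # was pressed before, isn't now
    return pressed, released

# Button 0 held throughout, button 1 released, button 2 newly pressed.
print(diff_events(0b011, 0b101))  # -> ([2], [1])
```

A button that is held down in both snapshots generates no event at all, which is exactly what you want: the computer only hears about changes, just as it would from a real keyboard.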

The << operator shifts a value to the left by a number of bits. So, 1<<2 is 100 in binary, and 1<<3 is 1000. The & operator is a bitwise AND: it performs a logical AND on each pair of bits in turn. Since the right-hand side of the & is all zeros apart from a single 1 (at a position that depends on the value of i), the result depends on whether this_pressed or last_pressed has a 1 or a 0 at position i. When an if condition is a number, it counts as true if the number is anything other than 0. So, (this_pressed & 1<<2) evaluates to true if there’s a 1 at position 2 in the binary form of this_pressed. In our case, that means the joystick is pushed left.

You can grab this code from the following link – hsmag.cc/USBKeyboard. Obviously, you will need to update the GPIO values to the correct ones for your setup when you initialise GamePad.

We’ve taken a look at two ways to build a gamepad, but it’s up to you how you want to design yours.   

Issue 42 of HackSpace magazine is on sale NOW!

hackspace issue 42 cover

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents. As always, every issue is free to download from the HackSpace magazine website.

The post Custom USB games controllers with Raspberry Pi Pico | HackSpace 42 appeared first on Raspberry Pi.



Source: Raspberry Pi – Custom USB games controllers with Raspberry Pi Pico | HackSpace 42

Raspberry Pi touchscreen music streamer

If you liked the look of yesterday’s Raspberry Pi Roon Endpoint Music Streamer but thought: “Hey, you know what would be great? If it had a touchscreen,” then look no further. Home Theater Fanatics has built something using the same RoPieee software, but with the added feature of a screen, for those who need one.

Subscribe to Home Theater Fanatics on YouTube for more great builds like this one

The build cost for this is a little higher than the $150 estimate to recreate yesterday’s project, given the inclusion of a fancier digital-to-analogue converter (DAC) and the touchscreen itself.

Hardware

connecting raspberry pi to touchscreen
It really is a super user-friendly walkthrough video

The brilliant Home Theater Fanatics show you how to put all of this together from this point in the build video, before moving on to the software install. They take care to go through all of the basics of the hardware in case you’re not familiar with things like ribbon cables or fans. It’s a really nice bird’s-eye view walkthrough, so beginners aren’t likely to have any problems following along.

ribbon attaching to raspberry pi
See – close-ups of how to connect your ribbon cables and everything

Software

Same as yesterday’s build:

At this point in the build video, Home Theater Fanatics go through the three steps you need to take to get the RoPieee and Roon software sorted out, then connect the DAC. Again, it’s a really clear, comprehensive on-screen walkthrough that beginners can be comfortable with.

Why do I need a touchscreen music streamer?

touchscreen music player
Get all your album track info right in your face

Aside from being able to see the attributed artwork for the music you’re currently listening to, this touchscreen solution provides easy song switching during home workouts. It’s also a much snazzier-looking tabletop alternative to a plugged-in phone spouting a Spotify playlist.

The post Raspberry Pi touchscreen music streamer appeared first on Raspberry Pi.



Source: Raspberry Pi – Raspberry Pi touchscreen music streamer

How to build a Raspberry Pi Roon Endpoint Music Streamer

Our friend Mike Perez at Audio Arkitekts is back to show you how to build PiFi, a Raspberry Pi-powered Roon Endpoint Music Streamer. The whole build costs around $150, which is pretty good going for such a sleek-looking Roon-ready end product.

Roon ready

Roon is a platform for all the music in your home, and Roon Core (which works with this build) manages all your music files and streaming content. The idea behind Roon is to bring all your music together, so you don’t have to worry about where it’s stored, what format it’s in, or where you stream it from. You can start a free trial if you’re not already a user.

Parts list

Sleek HiFiBerry case

Simple to put together

Fix the HiFiBerry DAC2 Pro into the top of the case with the line output and headphone outputs poking out. A Raspberry Pi 4 Model B is the brains of the operation, and slots nicely onto the HiFiBerry. The HiFiBerry HAT is compatible with all Raspberry Pi models with a 40-pin GPIO connector and just clicks right onto the GPIO pins. It is also powered directly by the Raspberry Pi, so no additional power supply is needed.

Raspberry Pi 4 connected to HiFiBerry HAT inside the top half of the case (before the bottom half is screwed on)

Next, secure the bottom half of the case, making sure all the Raspberry Pi ports line up with the case’s ready-made holes. Mike did the whole thing by hand with just a little help from a screwdriver right at the end.

Software

Flash the latest RoPieee image onto your SD card to turn your Raspberry Pi into a Roon Ready endpoint, then slot the card back into your Raspberry Pi. Now you have a good-looking, affordable audio output ready to connect to your Roon Core.

And that’s it. See – told you it was easy. Don’t forget, Audio Arkitekts’ YouTube channel is a must-follow for all audiophiles.

The post How to build a Raspberry Pi Roon Endpoint Music Streamer appeared first on Raspberry Pi.



Source: Raspberry Pi – How to build a Raspberry Pi Roon Endpoint Music Streamer

Transform Raspberry Pi 400 into a hacker cyberdeck

Resisting the desolate consumerism of the suburbs is a serious business for hardware hacker Zack Freedman. Zack transformed a Raspberry Pi 400 into the Voidstar Data Blaster, a portable cyberdeck to fight against becoming a normie.

The suburbs thing is explained at the beginning of Zack’s build video. Subscribe to his YouTube channel.

Hang on, what is a cyberdeck?

Zack explains:

“A data blaster [cyberdeck] is the trademark battlestation of console cowboy antiheroes running nets through cyberspace.”

There’s a whole subreddit devoted to exploring what does and does not make a real-life cyberdeck, so if you were looking for a rabbit hole to go down, knock yourself out.

view of cyber deck on a desk
Punky

How do you turn a Raspberry Pi 400 into a cyberdeck?

Added features to transform a Raspberry Pi 400 into the Voidstar Data Blaster include:

  • Detachable wearable display
  • Battery handles
  • SDR receiver
  • Antennae
  • 1280×480 touchscreen
data blaster strapped to forearm of maker
Wear your data blaster with pride

Handles make the cyberdeck nice and portable. Console cowboys can also use them to flip the deck up onto their forearm and easily “jack in” to cyberspace.

Rules around which keyboard you can use on a legitimate cyberdeck are pretty tight. It can’t be touchscreen (because that means it’s a tablet); however, it can’t fold away on a hinge either (because that makes it a laptop). Enter Raspberry Pi 400, a computer built into a mechanical keyboard about the length of an adult forearm. Perfect.

The SDR receiver means that users are cyber snooping-ready, while the head-mounted display provides a cyberpunk design flourish. That display acts as a second screen alongside the mini touchscreen. You can drag anything from the main display into sight on the headgear.

Authentic cyberpunk aesthetic

A lot of trial and error with a 3D printer finally yielded a faceplate that allows the screen and headgear to fit in perfectly. Zack also designed and printed all the flair and logos you see stuck around the cyberdeck. LEDs make the decorative filament fluoresce. Integrated pegs keep all the wiring neat – an inspired practical addition.

Rear view of the underside of the data blaster
The underside of the data blaster

Here are all the STL files if you’d like to create your own cyberdeck. And the design files let you take a closer look at a 3D render of Zack’s creation.

We saved the best bit for last: not only can you play Doom on the Voidstar Data Blaster, you can play it on the wearable display. Stay punk.

The post Transform Raspberry Pi 400 into a hacker cyberdeck appeared first on Raspberry Pi.



Source: Raspberry Pi – Transform Raspberry Pi 400 into a hacker cyberdeck

Play Call of Duty with a Raspberry Pi-powered Nerf gun

YouTuber Alfredo Sequeida turned a Nerf gun into a controller for playing Call of Duty: Warzone. This is a fun-looking modification project, but some serious coding went into the process.

Head to the 13-minute mark for an in-game demonstration

Trigger happy

Funnily enough, the Nerf gun that Alfredo chose was a special edition Fortnite model. This irked him as a Call of Duty player, but this model had the most potential to accommodate the modifications he knew he wanted.

mini screen embedded on nerf gun
The screen is an old Android phone which lends its accelerometer to the project

The controller uses the Nerf gun’s original trigger. Alfredo designed extra 3D-printed buttons (white dots on the far right) to let him perform more in-game actions like moving, plating, and jumping.

Software

A Raspberry Pi 4 powers the whole thing, running Python scripts Alfredo wrote for both the Raspberry Pi and his gaming PC. Here’s all the code on GitHub.

Gameplay movement is controlled using accelerometer data from an old Nexus 5 Android phone that’s mounted on the Nerf gun, read via the command-line tool adb logcat. The data is logged by a custom app Alfredo made in Android Studio.
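Alfredo’s actual scripts are on GitHub (linked above); as a rough sketch of the approach, you can read a phone’s log stream over adb logcat with Python’s subprocess module and pick out accelerometer lines. The ACCEL tag and line format below are invented for illustration — the real format is whatever Alfredo’s custom Android app logs:

```python
import re
import subprocess

# Hypothetical log format; the real one depends on Alfredo's custom Android app.
ACCEL_RE = re.compile(r"ACCEL:\s*(-?\d+\.?\d*),\s*(-?\d+\.?\d*),\s*(-?\d+\.?\d*)")

def parse_accel(line):
    """Extract an (x, y, z) accelerometer reading from one logcat line, or None."""
    m = ACCEL_RE.search(line)
    return tuple(float(v) for v in m.groups()) if m else None

def stream_accel():
    """Yield accelerometer readings from a connected phone via `adb logcat`."""
    proc = subprocess.Popen(["adb", "logcat"], stdout=subprocess.PIPE, text=True)
    for line in proc.stdout:
        reading = parse_accel(line)
        if reading is not None:
            yield reading
```

On the PC side, each (x, y, z) reading would then be mapped to in-game movement, for example by injecting key presses when a tilt crosses a threshold.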

raspberry pi embedded in nerf gun
A Raspberry Pi 4 wired up to all the buttons on the other side of the Nerf gun

Part of the action

The controller’s design makes players feel part of the action as their Call of Duty operator scouts around locations. It’s a much more immersive experience than holding an ordinary game controller in your lap or tapping away at a PC keyboard. Alfredo even plays standing up now his NERF gun controller is in action. He might as well be on a real life Special Ops mission.

call of duty POV game play
The Nerf gun complements the gameplay view that Call of Duty players have

More Call of Duty mod ideas…

So what’s next, Alfredo? We vote you make some modded night vision goggles out of an old Viewmaster toy. That’ll totally work, right?

woman holding a view master toy up to her face to look through it
I am 90% sure young Alfredo doesn’t know what a Viewmaster is (even I had to Google it)

The post Play Call of Duty with a Raspberry Pi-powered Nerf gun appeared first on Raspberry Pi.



Source: Raspberry Pi – Play Call of Duty with a Raspberry Pi-powered Nerf gun

Our new SIGCSE Special Project on culturally relevant resources for computing

Over the last 20 years, researchers and educators have increasingly aimed to develop computing curricula that are culturally responsive and relevant. Designing equitable and authentic learning experiences in computing requires conscious effort to take into account the characteristics of learners and their social environments, in order to address topics that are relevant to a diverse range of students. We previously discussed this topic in a research seminar where the invited speakers shared their work on equity-focused teaching of computer science in relation to race and ethnicity.

Educator and student focussed on a computing task
Designing equitable and authentic learning experiences in computing requires conscious effort.

Culturally relevant teaching in the classroom demonstrates a teacher’s deliberate and explicit acknowledgment that they value all students and expect all students will excel. Much of the research on this topic stems from the USA. In the UK, it may be that a lack of cultural responsiveness in the computing curriculum is contributing to the underrepresentation of students from some ethnic backgrounds in formal computing qualifications [1] by negatively affecting the way these young people engage with and learn the subject.

Guidelines for creating culturally relevant learning resources for computing

Addressing this issue of underrepresentation is important to us, so we’re making it part of our work on diversity and inclusion in computing education. That’s why we’re delighted to have received an ACM SIGCSE Special Project Award for a project called ‘Developing criteria for K-12 learning resources in computer science that challenge stereotypes and promote diversity’. Our overarching aim for this project, as with all our work at the Raspberry Pi Foundation, is to broaden participation in computing and address the needs of diverse learners. Through this project, we will support computing educators in understanding culturally responsive pedagogy and how to apply it to their own practice. To this end, we’ve set up a working group that will use research into culturally responsive pedagogy to develop a set of guidelines for creating culturally relevant learning resources for computing. Our primary audience for these guidelines are teachers in the UK, but we are confident the project’s results will have value and application all over the world.

There is increasing awareness across all education, and in computing education in particular, that culturally responsive approaches to curriculum and teaching fosters relevancy, interest, and engagement for student learners. This exciting effort brings together computing classroom teachers and education researchers to identify approaches and resources that England’s educators can leverage to enact culturally responsive approaches to teaching computing.

Joanna Goode, Sommerville Knight Professor at the University of Oregon, member of our Special Project working group

What do we mean by culturally relevant resources?

A learning resource obviously has learning objectives, but it is also always set in a particular context, which may or may not be relevant to young people. It may contain images, video, and other media assets in addition to text. Presenting computing stereotypes, for example in the media assets and language used, or situating resources in an unfamiliar context can cause learners to feel that they do not belong in the subject or that it is not relevant to them and their life. On the other hand, providing resources that allow learners to relate what they are learning to issues or tasks that are personally meaningful to them and/or their culture or community can be empowering and engaging for them. For example, a common scenario used to introduce basic algorithm design to young people is making a cup of tea, but tea preparation and drinking may be culturally specific, and even if tea is drunk in a young person’s home, tea preparation may not be an activity they engage in.

A matcha tea preparation
Preparing a cup of tea — a scenario often used for introducing learners to algorithm design — can be culturally specific: compare matcha and builder’s tea.

Ensuring that a more diverse group of young people feel like they belong in computing

The expected long-term outcome of this project is to remove significant obstacles to young people’s participation in computing by ensuring that a more diverse group of young people feel represented and like they belong in the subject. The working group we have established consists of seven practising computing teachers from a diverse range of UK schools and a panel of four experts and academics (Lynda Chinaka, Mike Deutsch, Joanna Goode, and Yota Dimitriadi) working with young people and their teachers in the UK, USA, and Canada.

A teacher aids children in the classroom
We will support computing educators in understanding culturally responsive pedagogy and how to apply it to their own practice.

Yota Dimitriadi, Associate Professor at the University of Reading and a member of the expert panel, says: “I am delighted to participate in this project that enables conversations and positive action around inclusive and intersectional computing practices. It is more important than ever to enhance a global perspective in our curriculum planning and further our understanding of culturally responsive pedagogies; such an approach can empower all our students and support their skills and understanding of the integral role that computing can play in promoting social justice.”

Such an approach can empower all our students and support their skills and understanding of the integral role that computing can play in promoting social justice.

Yota Dimitriadi, Associate Professor at the University of Reading, member of our Special Project working group

The group has started to meet and discuss the guidelines, and we aim to share early findings and outputs in the summer months. We’re very excited about this project, and we think it is an important starting point for other work. We look forward to updating you in the summer!


[1] Students of Black, non-Chinese Asian, and Mixed ethnicities; Kemp, P.E.J., Berry, M.G., & Wong, B. (2018). The Roehampton Annual Computing Education Report: Data from 2017. University of Roehampton, London.

The post Our new SIGCSE Special Project on culturally relevant resources for computing appeared first on Raspberry Pi.



Source: Raspberry Pi – Our new SIGCSE Special Project on culturally relevant resources for computing

Raspberry Pi: a versatile tool for biological sciences

Over the nine-ish years since the release of our first model, we’ve watched a thriving global community of Raspberry Pi enthusiasts, hobbyists, and educators grow. But did you know that Raspberry Pi is also increasingly used in scientific research?

Thumbnail images of various scientific applications of Raspberry Pi
Some of the scientific applications of Raspberry Pi that Jolle found

Dr Jolle Jolles, a behavioural ecologist at the Center for Ecological Research and Forestry Applications (CREAF) near Barcelona, Spain, and a passionate Raspberry Pi user, has recently published a detailed review of the uptake of Raspberry Pi in biological sciences. He found that well over a hundred published studies have made use of Raspberry Pi hardware in some way.

How can Raspberry Pi help in biological sciences?

The list of applications is almost endless. Here are just a few:

  • Nest-box monitoring (we do love a good nest box)
  • Underwater video surveillance systems (reminds us of this marine conservation camera)
  • Plant phenotyping (These clever people made a ‘Greenotyper’ with Raspberry Pi)
  • Smart bird-feeders (we shared this one, which teaches pigeons, on the blog)
  • High-throughput behavioural recording systems
  • Autonomous ecosystem monitoring (you can listen to the Borneo rainforest with this project)
  • Closed-loop virtual reality (there are just too many VR projects using Raspberry Pi to choose from. Here’s a few)
Doctor Jolle giving a presentation on Raspberry Pi
Dr Jolles spreading the good word about our tiny computers

Onwards and upwards

Jolle’s review shows that use of Raspberry Pi is on the up, with more studies documenting the use of Raspberry Pi hardware every year, but he’s keen to see it employed even more widely.

“It is really great to see the broad range of applications that already exist, with Raspberry Pis helping biologists in the lab, the field, and the classroom. However, Raspberry Pi is still not the common research tool that it could be.”

Jolle Jolles

Dr Jolles hard at work
Hard at work

How can I use Raspberry Pi in my research?

To stimulate the uptake of Raspberry Pi and help researchers integrate it into their work, the review paper offers guidelines and recommendations. Jolle also maintains a dedicated website with over 30 tutorials: raspberrypi-guide.github.io

“I believe low-cost micro-computers like the Raspberry Pi are a powerful tool that can help transform and democratize scientific research, and will ultimately help push the boundaries of science.”

Jolle Jolles

The paper, Broad-scale Applications of the Raspberry Pi: A Review and Guide for Biologists, is currently under review, but a preprint is available here.

‘Pirecorder’ for automating image and video capture

Jolle has also previously published a very handy software package especially with biological scientists in mind. It’s called pirecorder and helps with automated image and video recording using Raspberry Pi. You can check it out here: https://github.com/JolleJolles/pirecorder.

You can keep up with Jolle on Instagram, where he documents all the dreamy outdoor projects he’s working on.

Drop a comment below if you’ve seen an interesting scientific application of Raspberry Pi, at work, on TV, or maybe just in your imagination while you wait to find the time to build it!

The post Raspberry Pi: a versatile tool for biological sciences appeared first on Raspberry Pi.



Source: Raspberry Pi – Raspberry Pi: a versatile tool for biological sciences

Go down a Raspberry Pi YouTube rabbit hole

We here at Virtual Raspberry Pi Towers are looking forward to our weekends getting warmer, now that we are officially in British Summer Time. But we wanted to make the most of these last Saturdays and Sundays in which we have no choice but to cosy up against the typically British spring weather with a good old-fashioned YouTube rabbit hole.

Here are a few channels we think you’ll like. Some we’ve known about for a while, others are new friends we’ve made over the last year or so, and one is almost brand new so we’re putting you ahead of the curve there. You’re welcome.

Sophy Wong

Subscribe to Sophy Wong’s channel if you love the idea of wearing the tech you create. She collaborated with HackSpace magazine to publish a book, Wearable Tech Projects, which is currently on sale at the Raspberry Pi Press online store for just £7.

This is one of the projects Sophy shared in her Wearable Tech Projects book

Sophy describes herself as a “maker, designer, geek, addicted to learning how to do new things.” And she even visited NASA to watch a SpaceX launch.

Subscribe to Sophy’s channel here.

Blitz City DIY

Blitz City DIY (aka Liz) is a “DIY-er on a quest to gather and share knowledge” and has already built something cool with our newest baby, Raspberry Pi Pico. Her busy channel features computing, audio, video, coding, and more.

Check out Raspberry Pi Pico in action in this recent video from Blitz City DIY

We love Liz an extra lot because her channel features an entire playlist dedicated to Raspberry Pi Adventures. She also shares a healthy dose of festive content showing you how to Tech the Halls. No, April is NOT too early for Christmas stuff.

Subscribe to Blitz City DIY here.

Electromaker

Our new friends at Electromaker share tutorials, community projects, and contests where subscribers win hardware and massive cash prizes. Flat cap aficionado Ian Buckley also hosts The Electromaker Show – a weekly roundup of all that’s new and interesting in the maker community.

Electromakers assemble!

You can also swing by the super useful online shop where you can buy everything you need to recreate some of the projects featured. If you’re daunted by shopping for every little bit you need to create something awesome, you can choose one of these electro {maker KITS} and get right to it. We especially like the Lightsaber and Daft Punk-esque helmet kits.

Follow Electromaker here.

Estefannie Explains It All

You must have seen an Estefannie Explains It All video by now. But did you know about the weekly livestreams she hosts on Instagram? We know you’ll watch just because she’s cool and sometimes holds her pet cat up to the camera, but you’ll definitely want to tune in to try and win one of her tech giveaways. Some lucky viewers even got their hands on a Raspberry Pi 400.

Fond memories of when Estefannie visited Raspberry Pi Towers

Estefannie is another top collaborator whose channel has a dedicated Raspberry Pi playlist. Some of the things she has created using our tiny computers include Jurassic Park goggles, an automated coffee press, and a smart gingerbread house.

And as if all that wasn’t enough, Estefannie graced the Princesses with Power Tools calendar this year as Rey from Star Wars. You can buy a copy here.

Subscribe to Estefannie Explains It All here.

Kids Invent Stuff

Ruth Amos and Shawn Brown use their channel Kids Invent Stuff to bring kids’ ideas to life by making them into real working inventions. Young people aged 4–11 can submit their ideas or take part in regular invention challenges.

The infamous pooping unicorn

We first gave this channel a shout-out when Ruth and Shawn teamed up with Estefannie Explains It All to build the world’s first Raspberry Pi-powered Twitter-activated jelly bean-pooping unicorn. For real.

The MagPi Magazine got to know Ruth a little better in a recent interview. And Ruth also features in the 2021 Princesses with Power Tools calendar, as a welding Rapunzel. Go on, you know you want to buy one.

Ellora James

We saved the best (and newest) for last. Ellora James is brand new to YouTube. Her first tutorial showing you how to use Pimoroni’s Grow HAT Mini Kit was posted just three weeks ago, and she added a project update this week.

Ella helps you differentiate between edible pie and Raspberry Pi

We really like her video showing beginners how to set up their first Raspberry Pi. But our favourite is the one above in which she tackles one of the Universe’s big questions.

Subscribe to Ellora James here.

The post Go down a Raspberry Pi YouTube rabbit hole appeared first on Raspberry Pi.



Source: Raspberry Pi – Go down a Raspberry Pi YouTube rabbit hole

Edge Impulse and TinyML on Raspberry Pi

Raspberry Pi is probably the most affordable way to get started with embedded machine learning. The inferencing performance we see with Raspberry Pi 4 is comparable to or better than some of the new accelerator hardware, but your overall hardware cost is just that much lower.

Raspberry Pi 4 Model B

However, training custom models on Raspberry Pi — or any edge platform, come to that — is still problematic. This is why today’s announcement from Edge Impulse is a big step, and makes machine learning at the edge that much more accessible. With full support for Raspberry Pi, you now have the ability to take data, train against your own data in the cloud on the Edge Impulse platform, and then deploy the newly trained model back to your Raspberry Pi.

Today’s announcement includes new SDKs for Python, Node.js, Go, and C++. These allow you to integrate machine learning models directly into your own applications. There is also support for object detection, exclusively on the Raspberry Pi; you can train a custom object detection model using camera data taken on your own Raspberry Pi, and then deploy and use this custom model, rather than relying on a pretrained stock image classification model.

To test it out, we’re going to train a very simple model that can tell the difference between a banana 🍌 and an apple 🍎. Because the importance of bananas to machine learning researchers cannot be overstated.

Getting started

If you don’t already have an Edge Impulse account, open up a browser on your laptop and create an account, along with a test project. I’m going to call mine “Object detection”.

Creating a new project in Edge Impulse
Creating a new project in Edge Impulse

We’re going to be building an image classification project, one that can tell the difference between a banana 🍌 and an apple 🍎, but Edge Impulse will also let you build an object detection project, one that will identify multiple objects in an image.

Building an object detection rather than an image classification system? This video is for you!

After creating your project, you should see something like this:

My new object detection project open in Edge Impulse

Now log in to your Raspberry Pi, open up a Terminal window, and type

$ curl -sL https://deb.nodesource.com/setup_12.x | sudo bash -
$ sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
$ sudo npm install edge-impulse-linux -g --unsafe-perm

to install the local toolchain. Then type

$ edge-impulse-linux
Edge Impulse Linux client v1.1.5
? What is your user name or e-mail address (edgeimpulse.com)? alasdair
? What is your password? [hidden]
This is a development preview.
Edge Impulse does not offer support on edge-impulse-linux at the moment.

? To which project do you want to connect this device? Alasdair Allan / Object detection
? Select a microphone USB-Audio - Razer Kiyo
[SER] Using microphone hw:1,0
? Select a camera Razer Kiyo
[SER] Using camera Razer Kiyo starting...
[SER] Connected to camera
[WS ] Connecting to wss://remote-mgmt.edgeimpulse.com
[WS ] Connected to wss://remote-mgmt.edgeimpulse.com
? What name do you want to give this device? raspberrypi
[WS ] Device "raspberrypi" is now connected to project "Object detection"
[WS ] Go to https://studio.edgeimpulse.com/studio/XXXXX/acquisition/training to build your machine learning model!

and log in to your Edge Impulse account. You’ll then be asked to choose a project, and finally to select a microphone and camera to connect to the project. I’ve got a Razer Kiyo connected to my own Raspberry Pi so I’m going to use that.

Raspberry Pi has connected to Edge Impulse
Raspberry Pi has connected to Edge Impulse

If you still have your project open in a browser you might see a notification telling you that your Raspberry Pi is connected. Otherwise you can click on “Devices” in the left-hand menu for a list of devices connected to that project. You should see an entry for your Raspberry Pi.

The list of devices connected to your project

Taking training data

If you look in your Terminal window on your Raspberry Pi you’ll see a URL that will take you to the “Data acquisition” page of your project. Alternatively you can just click on “Data acquisition” in the left-hand menu.

Getting ready to collect training data
Getting ready to collect training data

Go ahead and select your Raspberry Pi if it isn’t already selected, and then select the Camera as the sensor. You should see a live thumbnail from your camera appear to the right-hand side. If you want to follow along, position your fruit (I’m starting with the banana 🍌), add a text label in the “Label” box, and hit the “start sampling” button. This will take and save an image to the cloud. Reposition the banana and take ten images. Then do it all again with the apple 🍎.

Ten labelled images each of the banana 🍌 and the apple 🍎

Since we’re building an incredibly simplistic model, and we’re going to leverage transfer learning, we probably now have enough training data with just these twenty images. So let’s go and create a model.

Creating a model

Click on “Impulse design” in the left-hand menu. Start by clicking on the “Add an input block” box and click on the “Add” button next to the “Images” entry. Next click on the “Add a processing block” box. Then click on the “Add” button next to the “Image” block to add a processing block that will normalise the image data and reduce colour depth. Then click on the “Add a learning block” box and select the “Transfer Learning (images)” block to grab a pretrained model intended for image classification, on which we will perform transfer learning to tune it for our banana 🍌 and apple 🍎 recognition task. You should see the “Output features” block update to show 2 output features. Now hit the “Save Impulse” button.

Our configured
Our configured Impulse

Next click on the “Images” sub-item under the “Impulse design” menu item, switch to the “Generate features” tab, and then hit the green “Generate features” button.

Generating model features

Finally, click on the “Transfer learning” sub-item under the “Impulse design” menu item, and hit the green “Start training” button at the bottom of the page. Training the model will take some time. Go get some coffee ☕.

A trained model

Testing our model

We can now test our trained model against the world. Click on the “Live classification” entry in the left-hand menu, then hit the green “Start sampling” button to take a live picture from your camera.

Live classification to test your model
Live classification to test your model

You might want to go fetch a different banana 🍌, just for testing purposes.

A live test of the model

If you want to do multiple tests, just scroll up and hit the “Start sampling” button again to take another image.

Deploying to your Raspberry Pi

Now we’ve (sort of) tested our model, we can deploy it back to our Raspberry Pi. Go to the Terminal window where the edge-impulse-linux command connecting your Raspberry Pi to Edge Impulse is running, and hit Control-C to stop it. Afterwards we can do a quick evaluation deployment using the edge-impulse-linux-runner command.

$ edge-impulse-linux-runner
This is a development preview.
Edge Impulse does not offer support on edge-impulse-linux-runner at the moment.

Edge Impulse Linux runner v1.1.5

[RUN] Already have model /home/pi/.ei-linux-runner/models/24217/v2/model.eim not downloading...
[RUN] Starting the image classifier for Alasdair Allan / Object detection (v2)
[RUN] Parameters image size 96x96 px (3 channels) classes [ 'apple', 'banana' ]
[RUN] Using camera Razer Kiyo starting...
[RUN] Connected to camera

Want to see a feed of the camera and live classification in your browser? Go to http://XXX.XXX.XXX.XXX:XXXX

classifyRes 31ms. { apple: '0.0097', banana: '0.9903' }
classifyRes 29ms. { apple: '0.0082', banana: '0.9918' }
 .
 .
 .
classifyRes 23ms. { apple: '0.0078', banana: '0.9922' }

This will connect to the Edge Impulse cloud, download your trained model, and start up an application that will take the video stream coming from your camera and look for bananas 🍌 and apples 🍎. The results of the model inferencing will be shown frame by frame in the Terminal window. When the runner application starts up you’ll also see a URL: copy and paste this into a browser, and you’ll see the view from the camera in real time along with the inferencing results.

Deployed model running locally on your Raspberry Pi

Success! We’ve taken our training data and trained a model in the cloud, and we’re now running that model locally on our Raspberry Pi. Because we’re running the model locally, we no longer need network access. No data needs to leave the Raspberry Pi. This is a huge privacy advantage for edge computing compared to cloud-connected devices.
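The runner reports each frame as a mapping from label to confidence score, as seen in the `classifyRes` lines above. As a minimal, hypothetical sketch (this helper is not part of the Edge Impulse SDK), an application consuming those per-frame scores might pick a winning label only when the model is confident enough:

```python
def top_label(scores, threshold=0.6):
    """Return the highest-confidence label if it clears the threshold, else None.

    `scores` mirrors the shape of the runner's per-frame output,
    e.g. {'apple': 0.0097, 'banana': 0.9903}.
    """
    label, confidence = max(scores.items(), key=lambda kv: kv[1])
    return label if confidence >= threshold else None

print(top_label({'apple': 0.0097, 'banana': 0.9903}))  # banana
print(top_label({'apple': 0.45, 'banana': 0.55}))      # None (too uncertain)
```

Thresholding like this avoids acting on frames where the classifier is essentially guessing between the two fruits.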

Wrapping up?

While we’re running our model inside Edge Impulse’s “quick look” application, we can deploy the exact same model into our own applications, as today’s announcement includes new SDKs for Python, Node.js, Go, and C++. These SDKs let us build standalone applications to collect data not just from our camera and microphone, but from other sensors like accelerometers, magnetometers, or anything else you can connect to a Raspberry Pi.

Performance metrics for Edge Impulse are promising, although still somewhat below what we’ve seen using TensorFlow Lite directly on Raspberry Pi 4, for inferencing using similar models. That said, it’s really hard to compare performance across even very similar models as it depends so much on the exact situation you’re in and what data you’re dealing with, so your mileage may vary quite a lot here.

However, the new Edge Impulse announcement offers two vital things: a cradle-to-grave framework for collecting data, training models, and deploying those custom models at the edge, together with a layer of abstraction. Increasingly we’re seeing deep learning eating software as part of a general trend towards increasing abstraction in software, sometimes termed lithification. Which sounds intimidating, but means that we can all do more, with less effort. Which isn’t a bad thing at all.

The post Edge Impulse and TinyML on Raspberry Pi appeared first on Raspberry Pi.



Source: Raspberry Pi – Edge Impulse and TinyML on Raspberry Pi

Play your retro console on a modern TV

Want to connect your retro console to your modern TV? The latest issue of Wireframe magazine has the only guide you need…

“Get a Raspberry Pi. Done.” It’s probably the most frequently recurring comment we get across all videos on the My Life in Gaming YouTube channel, which often revolve around playing classic games on original hardware. Not everyone has held onto their old consoles through the years, so I get it.

 PS1Digital on a 4K OLED TV
PS1Digital on a 4K OLED TV

Software emulation, whether through a PC, Raspberry Pi, or any other device, is easy on your wallet and solid enough to give most people the experience they’re looking for.

But for me, the core of my gaming experience still tends to revolve around the joy I feel in using authentic cartridges and discs. But as you may have noticed, 2021 isn’t 2001, and using pre-HDMI consoles isn’t so easy these days. A standard CRT television is the most direct route to getting a solid experience with vintage consoles.

 Standard RCA cables with composite video. A direct HDTV connection is a poor experience
Standard RCA cables with composite video. A direct HDTV connection is a poor experience

But let’s face it – not everyone is willing to work a CRT into their setup. Plenty of people are content with just plugging the cables that came with their old systems (usually composite) into their HD or 4K TV – and that’s OK! But whether for the blurry looks or the input lag they feel, this simply isn’t good enough for a lot of people.

Down the rabbit hole

“There has to be a better way,” you say as you browse Amazon’s assortment of analogue-to-HDMI converters, HDMI adapters like Wii2HDMI, or HDMI cables for specific consoles by a variety of brands. You might think these are just what you’re looking for, but remember: your TV has its own internal video processor. Just like your TV, they’re going to treat 240p like 480i. Not only is it unnecessary to deinterlace 240p, but doing so actively degrades the experience – motion-adaptive deinterlacing takes time, adding input lag.
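To put rough numbers on that lag (illustrative figures only; buffer depths vary between processors and TVs): at 60 fields per second each field lasts about 16.7 ms, so a deinterlacer that buffers a few fields before it can output a frame adds tens of milliseconds before your button press ever reaches the screen. A quick back-of-the-envelope calculation:

```python
# Illustrative arithmetic, not measured figures: a motion-adaptive
# deinterlacer holds several fields in a buffer before producing a frame.
FIELD_MS = 1000 / 60  # duration of one field at 60 Hz, in milliseconds

def deinterlace_lag_ms(fields_buffered: int) -> float:
    """Approximate input lag added by buffering this many fields."""
    return fields_buffered * FIELD_MS

for fields in (2, 3, 4):
    print(f"{fields} fields buffered = about {deinterlace_lag_ms(fields):.1f} ms of extra lag")
```

Even three or four buffered fields puts you over the 50 ms mark, which is very noticeable in a twitchy action game.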

RetroTINK-2X MINI (left) and 2X Pro (right). The MINI pairs great with N64
RetroTINK-2X MINI (left) and 2X Pro (right). The MINI pairs great with N64

That Sega Saturn HDMI cable is going to deinterlace your gorgeous 240p sprite-based games so hard that they’ll look like some sort of art restoration disaster in motion. The dark secret of these products is that you’re buying something you already own – a basic video processor designed for video, not video games, and the result will likely not be tangibly better than what your TV could do. The only reason to go this route is if you have no analogue inputs and could not possibly invest more than $30.

So what is the better way? The primary purpose of an external video processor is to send a properly handled signal to your TV that won’t trigger its lag-inducing processes and turn your pixels into sludge – basically any progressive resolution other than 240p. Luckily, there are several devices in various price ranges that are designed to do exactly this.

There is lots more to learn!

This is just a tiny snippet of the mammoth feature in Wireframe magazine issue 49. The main feature includes a ‘jargon cheat sheet’ and ‘cable table’ to make sure any level of user can get their retro console working on a modern TV.

If you’re not a Wireframe magazine subscriber, you can download a PDF copy for free. Head to page 50 to get started.

You can read more features like this one in Wireframe issue 49, available directly from Raspberry Pi Press — we deliver worldwide.

The post Play your retro console on a modern TV appeared first on Raspberry Pi.



Source: Raspberry Pi – Play your retro console on a modern TV

Raspberry Pi automatically refills your water bottle

YouTuber Chris Courses takes hydration seriously, but all those minutes spent filling up water bottles take a toll. 15 hours per year, to be exact. Chris regularly uses three differently sized water bottles and wanted to build something to fill them all to their exact measurements.

(Polite readers may like to be warned of a couple of bleeped swears and a rude whiteboard drawing a few minutes into this video.)

Hardware

  • Raspberry Pi
  • Water filter (Chris uses this one, which you would find in a fridge with a built-in water dispenser)
  • Solenoid valve (which only opens when an electrical signal is sent to it)

How does the hardware work?

The solenoid valve determines when water can and cannot pass through. Mains water comes in through one tube and passes through the water filter, then the solenoid valve releases water via another tube into the bottle.

Diagram of the water bottle filler setup, hand-drawn by the maker
See – simples!

What does the Raspberry Pi do?

The Raspberry Pi sends a signal to the solenoid valve telling it to open for a specific amount of time — the length of time it takes to fill a particular water bottle — and to close when that time expires. Chris set this up to start running when he clicks a physical button.
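The core of that timing logic is simple enough to sketch. The bottle volumes and flow rate below are hypothetical stand-ins (Chris’s actual figures aren’t published here); in practice you’d measure how fast water flows through your own valve and calibrate from that:

```python
# Hypothetical bottle sizes and flow rate -- measure your own valve to calibrate.
BOTTLE_ML = {"small": 500, "medium": 750, "large": 1000}
FLOW_ML_PER_S = 25.0  # millilitres per second through the open valve

def valve_open_seconds(bottle: str) -> float:
    """How long the solenoid valve must stay open to fill a given bottle."""
    return BOTTLE_ML[bottle] / FLOW_ML_PER_S

# On a real Raspberry Pi this duration would drive a GPIO pin: energise the
# valve, sleep for valve_open_seconds(...), then de-energise it.
print(valve_open_seconds("small"))  # 20.0
```

Mapping each physical button to one entry in the table is all the “per-bottle” logic the project needs.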

Chris also programmed lights to indicate when the dispenser is turned on. This manual coding proved to be the most time-consuming part of the project.

But all the wires look so ugly!

Water dispenser mounted onto side of fridge
Sleek and discreet

Chris agreed, so he 3D-printed a beautiful enclosure to house what he dubs the ‘Hydrobot 5000’. It’s a sleek black casing that sits pretty in his kitchen on a wall next to the fridge. It took a fair bit of fridge shuffling and electrical mounting to “sit pretty”, however. This Raspberry Pi-powered creation needed to be connected to a water source, so the tubing had to be snaked from Hydrobot 5000, behind appliances, to the kitchen sink.

Check out those disco lights! Nice work, Chris. Follow Chris on YouTube for loads more coding and dev videos.

The post Raspberry Pi automatically refills your water bottle appeared first on Raspberry Pi.



Source: Raspberry Pi – Raspberry Pi automatically refills your water bottle

Raspberry Pi Zero W turns iPod Classic into Spotify music player

Recreating Apple’s iconic iPod Classic as a Spotify player may seem like sacrilege but it works surprisingly well, finds Rosie Hattersley. Check out the latest issue of The MagPi magazine (pg 8 – 12) for a tutorial to follow if you’d like to create your own.

Replacement Raspberry Pi parts laying inside an empty iPod case to check they will fit
Replacement Raspberry Pi parts laying inside an empty iPod case to check they will fit

When the original iPod was launched, the idea of using it to run anything other than iTunes seemed almost blasphemous. The hardware remains a classic, but our loyalties are elsewhere with music services these days. If you still love the iPod but aren’t wedded to Apple Music, Guy Dupont’s Spotify hack makes a lot of sense. “It’s empowering as a consumer to be able to make things work for me – no compromises,” he says. His iPod Classic Spotify player project cost around $130, but you could cut costs with a different streaming option.

“I wanted to explore what Apple’s (amazing) original iPod user experience would feel like in a world where we have instant access to tens of millions of songs. And, frankly, it was really fun to take products from two competitors and make them interact in an unnatural way.” 

Guy Dupont

Installing the C-based haptic code on Raspberry Pi Zero, and connecting Raspberry Pi, display, headers, and leads
Installing the C-based haptic code on Raspberry Pi Zero, and connecting Raspberry Pi, display, headers, and leads

Guy’s career spans mobile phone app development, software engineering, and time in recording studios in Boston as an audio engineer, so a music tech hack makes sense. He first used Raspberry Pi for its static IP so he could log in remotely to his home network, and later as a means of monitoring his home during a renovation project. Guy likes using Raspberry Pi when planning a specific task because he can “program [it] to do one thing really well… and then I can leave it somewhere forever”, in complete contrast to his day job. 

Mighty micro

Guy seems amazed at having created a Spotify streaming client that lives inside, and can be controlled by, an old iPod case from 2004. He even recreated the iPod’s user interface in software, right down to the font. A ten-year-old article about the click wheel provided some invaluable functionality insights and allowed him to write code to control it in C. Guy was also delighted to discover an Adafruit display that’s the right size for the case, doesn’t expose the bezels, and uses composite video input so he could drive it directly from Raspberry Pi’s composite out pins, using just two wires. “If you’re not looking too closely, it’s not immediately obvious that the device was physically modified,” he grins.

All replacement parts mounted in the iPod case
All replacement parts mounted in the iPod case

Guy’s retro iPod features a Raspberry Pi Zero W. “I’m not sure there’s another single-board computer this powerful that would have fit in this case, let alone one that’s so affordable and readily available,” he comments. “Raspberry Pi did a miraculous amount of work in this project.” The user interface is a Python app, while Raspberry Pi streams music from Spotify via Raspotify, reads user input from the iPod’s click wheel, and drives a haptic motor – all at once. 

Guy managed to use a font for the music library that looks almost exactly the same as Apple’s original
Guy managed to use a font for the music library that looks almost exactly the same as Apple’s original

Most of the hardware for the project came from Guy’s local electronics store, which has a good line in Raspberry Pi and Adafruit components. He had a couple of attempts to get the right size of haptic motor, but most things came together fairly easily after a bit of online research. Help, when he needed it, was freely given by the Raspberry Pi community, which Guy describes as “incredible”.

Things just clicked 

Guy previously used Raspberry Pi to stream albums around his home
Guy previously used Raspberry Pi to stream albums around his home

Part of the fun of this project was getting the iPod to run a non-Apple streaming service, so he’d also love to see versions of the iPod project using different media players. You can follow his instructions on GitHub.

Next, Guy intends to add a DAC (digital to analogue converter) for the headphone jack, but Bluetooth works for now, even connecting from inside his jacket pocket, and he plans to get an external USB DAC in time. 

The post Raspberry Pi Zero W turns iPod Classic into Spotify music player appeared first on Raspberry Pi.



Source: Raspberry Pi – Raspberry Pi Zero W turns iPod Classic into Spotify music player

214 teams granted Flight Status for Astro Pi Mission Space Lab 2020/21!

The Raspberry Pi Foundation and ESA Education are excited to announce that 214 teams participating in Mission Space Lab of this year’s European Astro Pi Challenge have achieved Flight Status. That means they will have their computer programs run on the International Space Station (ISS) later this month!

ESA Astronaut Thomas Pesquet with the Astro Pi computers onboard the ISS.
ESA Astronaut Thomas Pesquet with the Astro Pi computers onboard the ISS

Mission Space Lab gives teams of students and young people up to 19 years of age the amazing opportunity to conduct scientific experiments aboard the ISS, by writing code for the Astro Pi computers — Raspberry Pi computers augmented with Sense HATs. Teams can choose between two themes for their experiments, investigating either life in space or life on Earth.

Life in space

For ‘Life in space’ experiments, teams use the Astro Pi computer known as Ed to investigate life inside the Columbus module of the ISS. For example, past teams have:

  • Used the Astro Pi’s accelerometer sensor to compare the motion of the ISS during normal flight with its motion during course corrections and reboost manoeuvres
  • Investigated whether influenza is transmissible on a spacecraft such as the ISS
  • Monitored pressure inside the Columbus module to be able to warn the astronauts on board of space debris or micrometeoroids colliding with the station
  • And much more
Compilation of photographs of Earth, taken by Astro Pi Izzy aboard the ISS.

Life on Earth

In ‘Life on Earth’ experiments, teams investigate life on our home planet’s surface using the Astro Pi computer known as Izzy. Izzy’s near-infrared camera (with a blue optical filter) faces out of a window in the ISS and is pointed at Earth. For example, past teams have:

  • Investigated variations in Earth’s magnetic field
  • Used machine learning to identify geographical areas that had recently suffered from wildfires
  • Studied climate change based on coastline erosion over the past 30 years
  • And much besides

Phase 1 and 2 of Mission Space Lab

In Phase 1 of Mission Space Lab, teams only have to submit an experiment idea. We then judge the ideas on their originality, feasibility, and use of hardware. This year, 426 teams submitted experiment ideas, with 396 progressing to Phase 2.

Timeline of Mission Space Lab in 2020/2021, part of the European Astro Pi Challenge.
Timeline of Mission Space Lab in 2020/21 — click to enlarge

At the beginning of Phase 2 of the challenge, we send our special Astro Pi kits to the teams to help them write and test their programs. The kits contain hardware that is similar to the Astro Pi computers in space, including a Raspberry Pi 3 Model B, Raspberry Pi Sense HAT, and Raspberry Pi Camera Modules (V2 and NoIR).

Astro Pi kit box.

Mission Space Lab teams then write the programs for their experiments in Python. Once teams are happy with their programs, have tested them on their Astro Pi kits, and submitted them to us for judging, we run a series of tests on them to ensure that they follow experiment rules and can run without errors on the ISS. The experiments that meet the relevant criteria are then awarded Flight Status.

Phase 3: Flight Status achieved

The 214 teams awarded Flight Status this year represent 21 countries and 862 young people, 30% of them female. 137 teams with ‘Life on Earth’ experiments and 77 teams with ‘Life in space’ experiments have successfully made it through to Phase 3.

Spain has the most teams progressing to the next phase (26), closely followed by the UK (25), Romania (21), France (21), and Greece (18).

In the next few weeks, the teams’ experiments will be deployed to the Astro Pi computers on the ISS, and most of them will run under the supervision of ESA astronaut Thomas Pesquet, who flies to the ISS on 22 April for his new mission, Alpha.

In the final phase, we’ll send the teams the data their experiments collect, to analyse and write short reports about their findings. Based on these reports, we and the ESA Education experts will determine the winner of this year’s Mission Space Lab. The winning and highly commended teams will receive special prizes. Last year’s outstanding teams got to take part in a Q&A with ESA astronaut Luca Parmitano!

Well done to everyone who has participated, and congratulations to all the successful teams. We are really looking forward to reading your reports!

Logo of Mission Space Lab, part of the European Astro Pi Challenge.

The post 214 teams granted Flight Status for Astro Pi Mission Space Lab 2020/21! appeared first on Raspberry Pi.



Source: Raspberry Pi – 214 teams granted Flight Status for Astro Pi Mission Space Lab 2020/21!

Remake Manic Miner’s collapsing platforms | Wireframe #49

Traverse a crumbly cavern in our homage to a Spectrum classic. Mark Vanstone has the code

One of the most iconic games on the Sinclair ZX Spectrum featured a little man called Miner Willy, who spent his days walking and jumping from platform to platform collecting the items needed to unlock the door on each screen. Manic Miner’s underground world featured caverns, processing plants, killer telephones, and even a forest featuring little critters that looked suspiciously like Ewoks.

Written by programmer Matthew Smith and released by Bug-Byte in 1983, the game became one of the most successful titles on the Spectrum. Smith was only 16 when he wrote Manic Miner and even constructed his own hardware to speed up the development process, assembling the code on a TRS-80 and then downloading it to the Spectrum with his own hand-built interface. The success of Manic Miner was then closely followed by Jet Set Willy, featuring the same character, and although they were originally written for the Spectrum, the games very soon made it onto just about every home computer of the time.

Miner Willy makes his way to the exit, avoiding those vicious eighties telephones.

Both Manic Miner and Jet Set Willy featured unstable platforms which crumbled in Willy’s wake, and it’s these we’re going to try to recreate this month. In this Pygame Zero example, we need three frames of animation for each of the two directions of movement. As we press the arrow keys we can move the Actor left and right, and in this case, we’ll decide which frame to display based on a count variable, which is incremented each time our update() function runs. We can create platforms from a two-dimensional data list representing positions on the screen with 0 meaning a blank space, 1 being a solid platform, and 2 a collapsible platform. To set these up, we run through the list and make Actor objects for each platform segment.
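
To make the platform setup concrete, here’s a minimal sketch of that idea (not Mark’s actual code): plain dicts stand in for Pygame Zero Actors, and the tile size, level data, and field names are our own choices for illustration.

```python
# 0 = blank space, 1 = solid platform, 2 = collapsible platform
TILE = 20  # assumed tile size in pixels, for illustration only

level = [
    [0, 0, 0, 0, 0],
    [1, 1, 2, 2, 1],
    [0, 0, 0, 0, 0],
]

def build_platforms(data):
    """Walk the 2D list and make one platform object per non-zero cell."""
    platforms = []
    for row, cells in enumerate(data):
        for col, cell in enumerate(cells):
            if cell != 0:
                platforms.append({
                    "x": col * TILE,
                    "y": row * TILE,
                    "collapsible": cell == 2,
                    "timer": -1,  # collapse countdown; idle until stood on
                })
    return platforms
```

In the real game each entry would be an Actor with the appropriate platform image, but the bookkeeping is the same.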

For our draw() function, we can blit a background graphic, then Miner Willy, and then our platform blocks. During our update() function, apart from checking key presses, we also need to do some gravity calculations. This will mean that if Willy isn’t standing on a platform or jumping, he’ll start to fall towards the bottom of the screen. Instead of checking to see if Willy has collided with the whole platform, we only check to see if his feet are in contact with the top. This means he can jump up through the platforms but will then land on the top and stop. We set a variable to indicate that Willy’s standing on the ground so that when the SPACE bar is pressed, we know if he can jump or not. While we’re checking if Willy’s on a platform, we also check to see if it’s a collapsible one, and if so, we start a timer so that the platform moves downwards and eventually disappears. Once it’s gone, Willy will fall through. The reason we have a delayed timer rather than just starting the platform heading straight down is so that Willy can run across many tiles before they collapse, but his way back will quickly disappear. The disappearing platforms are achieved by changing the image of the platform block as it moves downward.
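
The gravity, feet-only collision, and delayed-collapse logic described above could be sketched like this; again dicts stand in for Actors, and every constant and field name is invented for the example rather than taken from Mark’s listing.

```python
GRAVITY = 4            # pixels Willy falls per frame (assumed value)
COLLAPSE_DELAY = 15    # frames before a stood-on platform starts dropping
PLATFORM_W = 20        # assumed platform width in pixels

def update_willy(willy, platforms):
    """Check Willy's feet against platform tops, then apply gravity."""
    willy["on_ground"] = False
    for p in platforms:
        feet_on_top = (abs(willy["feet_y"] - p["y"]) <= GRAVITY
                       and p["x"] <= willy["x"] < p["x"] + PLATFORM_W)
        if feet_on_top:
            willy["feet_y"] = p["y"]   # land exactly on the platform top
            willy["on_ground"] = True
            if p["collapsible"] and p["timer"] < 0:
                p["timer"] = COLLAPSE_DELAY  # start the delayed collapse
    if not willy["on_ground"]:
        willy["feet_y"] += GRAVITY     # fall towards the bottom of the screen

def update_platforms(platforms):
    """Count down collapsing platforms, then move expired ones downwards."""
    for p in platforms:
        if p["timer"] > 0:
            p["timer"] -= 1
        elif p["timer"] == 0:
            p["y"] += 1                # drifts down until off-screen
```

Because the countdown only starts when Willy first stands on a tile, he can sprint across a run of collapsible tiles before any of them give way, which matches the behaviour described above.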

As we’ve seen, there were several other elements to each Manic Miner screen, such as roaming bears that definitely weren’t from Star Wars, and those dastardly killer telephones. We’ll leave you to add those.

Here’s Mark’s code for a Manic Miner-style platformer. To get it working on your system, you’ll need to install Pygame Zero. And to download the full code and assets, head here.

Get your copy of Wireframe issue 49

You can read more features like this one in Wireframe issue 49, available directly from Raspberry Pi Press — we deliver worldwide.

And if you’d like a handy digital version of the magazine, you can also download issue 49 for free in PDF format.

The post Remake Manic Miner’s collapsing platforms | Wireframe #49 appeared first on Raspberry Pi.



Source: Raspberry Pi – Remake Manic Miner’s collapsing platforms | Wireframe #49

Easter fun with Raspberry Pi

Easter is nearly upon us, and we’ll be stepping away from our home-office desks for a few days. Before we go, we thought we’d share some cool Easter-themed projects from the Raspberry Pi community.

Egg-painting robot

Teacher Klaus Rabeder designed, 3D-printed, and built a robot which his students programmed in Python to paint eggs with Easter designs. Each student came up with their own design and then programmed the robot to recreate it. The robot can draw letters and numbers, patterns, and figures (such as an Easter bunny) on an egg, as well as a charming meadow made of randomly calculated blades of grass. Each student took home the egg bearing their unique design.

The machine has three axes of movement: one that rotates the egg, one that moves the pens up and down, and one that makes servo motors put the pen tips onto the egg’s surface. Each servo is connected to two pens. Springs between the servo and pen make sure not too much pressure is applied.

What a cool way to spend your computing lessons!

Digital Easter egg hunt

eggs in foil with jumper wires attached
Go digital this Easter

Why hunt for chocolate eggs in a race against time before they melt, when you can go digital? Our very own Alex made this quick and easy game with a Raspberry Pi, a few wires, and some simple code. Simply unwrap your chocolate eggs and rewrap them with the silver side of the foil facing outwards to make them more conductive. The wires create a circuit, and when the circuit is closed with the foil-wrapped egg, the Raspberry Pi reveals the location of a bigger chocolate egg.

All the code and kit you need to recreate this game yourself is here.

Incubate baby chicks

The second-best thing about this time of year — after all the chocolate — is the cute baby animals. Lambs and bunnies get a special mention, but this project makes sure that chicken eggs are properly incubated to help baby chicks hatch. Maker Dennis Hejselbak added a live-streaming camera so he and other chick fans can keep an eye on things.

We’re sad to report that Emma still hasn’t revised her ‘No office chicks’ policy since we first reported this project back in 2015. Maybe next year?

Happy Easter!

Stand by for a delicious new issue of Wireframe magazine tomorrow. We’ll see you on Tuesday!

The post Easter fun with Raspberry Pi appeared first on Raspberry Pi.



Source: Raspberry Pi – Easter fun with Raspberry Pi

Drag-n-drop coding for Raspberry Pi Pico

Introducing Piper Make: a Raspberry Pi Pico-friendly drag-n-drop coding tool that’s free for anyone to use.

piper make screenshot
The ‘Digital View’ option displays a dynamic view of Raspberry Pi Pico showing GPIO states

Edtech startup Piper, Inc. launched this brand new browser-based coding tool on #PiDay. If you already have a Raspberry Pi Pico, head to make.playpiper.com and start playing with the coding tool for free.

Pico in front of Piper Make screen
If you already have a Raspberry Pi Pico, you can get started right away

Complete coding challenges with Pico

The block coding environment invites you to try a series of challenges. When you succeed in blinking an LED, the next challenge is opened up to you. New challenges are released every month, and it’s a great way to guide your learning and give you a sense of achievement as you check off each task.

But I don’t have a Pico or the components I need!

You’re going to need some kit to complete these challenges. The components you’ll need are easy to get hold of, and they’re things you probably already have lying around if you like to tinker, but if you’re a coding newbie and don’t have a workshop full of trinkets, Piper makes it easy for you. You can join their Makers Club and receive a one-off Starter Kit containing a Raspberry Pi Pico, LEDs, resistors, switches, and wires.

Piper Make starter kit
The Starter Kit contains everything you need to complete the first challenges

If you sign up to Piper’s Monthly Makers Club you’ll receive the Starter Kit, plus new hardware each month to help you complete the latest challenge. Each Raspberry Pi Pico board ships with Piper Make firmware already loaded, so you can plug and play.

Piper Make starter kit in action
Trying out the traffic light challenge with the Starter Kit

If you already have things like a breadboard, LEDs, and so on, then you don’t need to sign up at all. Dive straight in and get started on the challenges.

I have a Raspberry Pi Pico. How do I play?

A quick tip before we go: when you hit the Piper Make landing page for the first time, don’t click ‘Getting Started’ just yet. You need to set up your Pico first of all, so scroll down and select ‘Setup my Pico’. Once you’ve done that, you’re good to go.

The post Drag-n-drop coding for Raspberry Pi Pico appeared first on Raspberry Pi.



Source: Raspberry Pi – Drag-n-drop coding for Raspberry Pi Pico

Graphic routines for Raspberry Pi Pico screens

Pimoroni has brought out two add‑ons with screens: Pico Display and Pico Explorer. A very basic set of methods is provided in the Pimoroni UF2 file. In this article, we aim to explain how the screens are controlled with these low-level instructions, and provide a library of extra routines and example code to help you produce stunning displays.

You don't have to get creative with your text placement, but you can

You will need to install the Pimoroni MicroPython UF2 file on your Pico and Thonny on your computer.

All graphical programs need the following ‘boilerplate’ code at the beginning to initialise the display and create the essential buffer. (We’re using a Pico Explorer – just change the first line for a Pico Display board.)

import picoexplorer as display
# import picodisplay as display
#Screen essentials
width = display.get_width()
height = display.get_height()
display_buffer = bytearray(width * height * 2)
display.init(display_buffer)

The four buttons give you a way of getting data back from the user as well as displaying information

This creates a buffer with a 16-bit colour element for each pixel of the 240×240 pixel screen. The code invisibly stores colour values in the buffer which are then revealed with a display.update() instruction.
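
As a quick sanity check on those numbers, and to show what one 16-bit element might hold, here’s a short plain-Python sketch. The 5-6-5 red/green/blue packing shown is a common convention for these screens and is our assumption here, not something quoted from the Pimoroni driver.

```python
# Each of the 240x240 pixels occupies one 16-bit (two-byte) buffer element
width, height = 240, 240
display_buffer = bytearray(width * height * 2)
print(len(display_buffer))  # 115200 bytes

def rgb565(r, g, b):
    # Pack 8-bit r, g, b into a single 16-bit value (assumed 5-6-5 layout)
    return ((r & 0xF8) << 8) | ((g & 0xFC) << 3) | (b >> 3)
```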

The top-left corner of the screen is the origin (0,0) and the bottom-right pixel is (239,239).

Supplied methods

display.set_pen(r, g, b)

Sets the current colour (red, green, blue) with values in the range 0 to 255.

grey = display.create_pen(100,100,100)

Allows naming of a colour for later use.

display.clear()

Fills all elements in the buffer with the current colour.

display.update()

Makes the current values stored in the buffer visible. (Shows what has been written.)

display.pixel(x, y)

Draws a single pixel with the current colour at point (x, y).

display.rectangle(x, y, w, h)

Draws a filled rectangle from point(x, y), w pixels wide and h pixels high.

display.circle(x, y, r)

Draws a filled circle with centre (x, y) and radius r.

display.character(78, 112, 5, 2)

Draws character number 78 (ASCII = ‘N’) at point (112,5) in size 2. Size 1 is very small, while 6 is rather blocky.

display.text("Pixels", 63, 25, 200, 4)

Draws the text on the screen from (63,25) in size 4, with the text wrapping to the next line at a ‘space’ if it is longer than 200 pixels. (Complicated but very useful.)

display.pixel_span(30,190,180)

Draws a horizontal line 180 pixels long from point (30,190).

display.set_clip(20, 135, 200, 100)

While the screens are quite small in size, they have plenty of pixels for display

After this instruction, which sets a rectangular area from (20,135), 200 pixels wide and 100 pixels high, only pixels drawn within the set area are put into the buffer. Drawing outside the area is ignored. So only those parts of a large circle intersecting with the clip are effective. We used this method to create the red segment.

display.remove_clip()

This removes the clip.

display.update()

This makes the current state of the buffer visible on the screen. Often forgotten.

if display.is_pressed(3): # Y button is pressed ?

Reads a button, numbered 0 to 3.

You can get more creative with the colours if you wish

This code demonstrates the built-in methods and can be downloaded here.

# Pico Explorer - Basics
# Tony Goodhew - 20th Feb 2021
import picoexplorer as display
import utime, random
#Screen essentials
width = display.get_width()
height = display.get_height()
display_buffer = bytearray(width * height * 2)
display.init(display_buffer)

def blk():
    display.set_pen(0,0,0)
    display.clear()
    display.update()

def show(tt):
    display.update()
    utime.sleep(tt)

def title(msg,r,g,b):
    blk()
    display.set_pen(r,g,b)
    display.text(msg, 20, 70, 200, 4)
    show(2)
    blk()

# Named pen colour
grey = display.create_pen(100,100,100)
# ==== Main ======
blk()
title("Pico Explorer Graphics",200,200,0)
display.set_pen(255,0,0)
display.clear()
display.set_pen(0,0,0)
display.rectangle(2,2,235,235)
show(1)
# Blue rectangles
display.set_pen(0,0,255)
display.rectangle(3,107,20,20)
display.rectangle(216,107,20,20)
display.rectangle(107,3,20,20)
display.rectangle(107,216,20,20)
display.set_pen(200,200,200)
# Compass points
display.character(78,112,5,2)   # N
display.character(83,113,218,2) # S
display.character(87,7,110,2)   # W
display.character(69,222,110,2) # E
show(1)
# Pixels
display.set_pen(255,255,0)
display.text("Pixels", 63, 25, 200, 4)
display.set_pen(0,200,0)
display.rectangle(58,58,124,124)
display.set_pen(30,30,30)
display.rectangle(60,60,120,120)
display.update()
display.set_pen(0,255,0)
for i in range(500):
    xp = random.randint(0,119) + 60
    yp = random.randint(0,119) + 60
    display.pixel(xp,yp)
    display.update()
show(1)
# Horizontal line
display.set_pen(0,180,0)
display.pixel_span(30,190,180)
show(1)
# Circle
display.circle(119,119,50)
show(1.5)
display.set_clip(20,135, 200, 100)
display.set_pen(200,0,0)
display.circle(119,119,50)
display.remove_clip()

display.set_pen(0,0,0)
display.text("Circle", 76, 110, 194, 3)
display.text("Clipped", 85, 138, 194, 2)
display.set_pen(grey) # Previously saved colour
# Button Y
display.text("Press button y", 47, 195, 208, 2)
show(0)
running = True
while running:
    if display.is_pressed(3): # Y button is pressed ?
        running = False
blk()

# Tidy up
title("Done",200,0,0)
show(2)
blk()

Straight lines can give the appearance of curves

We’ve included three short procedures to help reduce code repetition:

def blk() 

This clears the screen to black – the normal background colour.

def show(tt)

This updates the screen, making the buffer visible and then waits tt seconds.

def title(msg,r,g,b)

This is used to display the msg string in size 4 text in the specified colour for two seconds, and then clears the display.

As you can see from the demonstration, we can accomplish a great deal using just these built-in methods. However, it would be useful to be able to draw vertical lines, lines from point A to point B, hollow circles, and rectangles. If these are written as procedures, we can easily copy and paste them into new projects to save time and effort.

You don't need much to create interesting graphics

In our second demonstration, we’ve included these ‘helper’ procedures. They use the parameters (t, l, r, b) to represent the (top, left) and the (right, bottom) corners of rectangles or lines.

def horiz(l,t,r):    # left, top, right

Draws a horizontal line.

def vert(l,t,b):   # left, top, bottom

Draws a vertical line.

def box(l,t,r,b):  # left, top, right, bottom

Draws an outline rectangular box.

def line(x,y,xx,yy): 

Draws a line from (x,y) to (xx,yy).

def ring(cx,cy,rr,rim): # Centre, radius, thickness

Draws a circle, centred on (cx,cy), of outer radius rr and pixel thickness of rim. This is easy and fast, but has the disadvantage that it wipes out anything inside the ring.

def ring2(cx,cy,r):   # Centre (x,y), radius

Draws a circle centred on (cx,cy), of radius r, with a single-pixel width. It can be used to flash a ring around something already drawn on the screen. You need to import math as it uses trigonometry.

def align(n, max_chars):

This returns a string version of int(n), right aligned in a string of max_chars length. Unfortunately, the font supplied by Pimoroni in its UF2 is not monospaced.
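
The downloadable demo contains the full implementations. As an illustration of what such helpers can look like, here is a hedged sketch written against the same display.pixel / display.pixel_span calls; the small stub display is ours, not part of the Pimoroni UF2, and exists only so the geometry can be checked without a Pico attached.

```python
class StubDisplay:
    """Records drawn pixels; stands in for picoexplorer off-device."""
    def __init__(self):
        self.pixels = set()
    def pixel(self, x, y):
        self.pixels.add((x, y))
    def pixel_span(self, x, y, length):
        for i in range(length):
            self.pixels.add((x + i, y))

display = StubDisplay()   # on the Pico this would be the picoexplorer module

def horiz(l, t, r):       # left, top, right
    display.pixel_span(l, t, r - l + 1)

def vert(l, t, b):        # left, top, bottom
    for y in range(t, b + 1):
        display.pixel(l, y)

def box(l, t, r, b):      # left, top, right, bottom
    horiz(l, t, r)
    horiz(l, b, r)
    vert(l, t, b)
    vert(r, t, b)

def line(x, y, xx, yy):   # Bresenham-style line from (x, y) to (xx, yy)
    dx, dy = abs(xx - x), -abs(yy - y)
    sx = 1 if x < xx else -1
    sy = 1 if y < yy else -1
    err = dx + dy
    while True:
        display.pixel(x, y)
        if x == xx and y == yy:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x += sx
        if e2 <= dx:
            err += dx
            y += sy

def align(n, max_chars):  # right-align int(n) in a field of max_chars
    s = str(int(n))
    return " " * (max_chars - len(s)) + s
```

Running the same functions against the real display object on a Pico draws the shapes described above, but treat these bodies as a starting point rather than a copy of the downloadable demo.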

What will you create with your Pico display?

The second demonstration is too long to print, but can be downloaded here.

It illustrates the character set; drawing lines, circles, and boxes; plotting graphs; writing text at an angle or following a curved path; scrolling text along a sine curve; controlling an interactive bar graph with the buttons; updating a numeric value; and changing the size and brightness of disks and the colour of a rectangle.

The program is fully commented, so it should be quite easy to follow.

The most common coding mistake is to forget the display.update() instruction after drawing something. The second is putting it in the wrong place.

When overwriting text on the screen to update a changing value, first overwrite the old value with a small rectangle in the background colour. Notice that the percentage value is right-aligned to lock the ‘units’ position.

It’s probably not a good idea to leave your display brightly lit for hours at a time. Several people have reported the appearance of ‘burn’ on a dark background, or ‘ghost’ marks after very bright items against a dark background have been displayed for some time. We’ve seen them on our display, but no long-term harm is evident. Blanking the screen in the ‘tidy-up’ sequence at the end of your program may help.

We hope you have found this tutorial useful and that it encourages you to start sending your output to a display. This is so much more rewarding than just printing to the REPL.

If you have a Pimoroni Pico Display (240×135 pixels), all of these routines will work on your board.

Issue 41 of HackSpace magazine is on sale NOW!

Each month, HackSpace magazine brings you the best projects, tips, tricks and tutorials from the makersphere. You can get it from the Raspberry Pi Press online store or your local newsagents. As always, every issue is free to download from the HackSpace magazine website.

The post Graphic routines for Raspberry Pi Pico screens appeared first on Raspberry Pi.



Source: Raspberry Pi – Graphic routines for Raspberry Pi Pico screens

Raspberry Pi dog detector (and dopamine booster)

You can always rely on Ryder’s YouTube channel to be full of weird and wonderful makes. This latest offering aims to boost dopamine levels with dog spotting. Looking at dogs makes you happier, right? But you can’t spend all day looking out of the window waiting for a dog to pass, right? Well, a Raspberry Pi Camera Module and machine learning can do the dog spotting for you.

What’s the setup?

Ryder’s Raspberry Pi and camera sit on a tripod pointing out of a window looking over a street. Live video of the street is taken by the camera and fed through a machine learning model. Ryder chose the YOLO v3 object detection model, which can already recognise around 80 different things — from dogs to humans, and even umbrellas.
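
The decision layer on top of the model’s output can be very small. The sketch below is our illustration of the idea, not Ryder’s code; the (label, confidence) tuple format and the threshold value are assumptions.

```python
def should_announce(detections, threshold=0.5):
    """Given (label, confidence) pairs from an object detector such as
    YOLO, return True when a sufficiently confident dog is in frame."""
    return any(label == "dog" and conf >= threshold
               for label, conf in detections)
```

A frame containing both a person and a confident dog detection would fire the megaphone; a lone umbrella would not.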

A hand holding a raspberry pi high quality camera pointing out of a window
Camera set up ready for dog spotting

Doggo passing announcements

But how would Ryder know that his Raspberry Pi had detected a dog? They’re so sneaky — they work in silence. A megaphone and some text-to-speech software make sure that Ryder is alerted in time to run to the window and see the passing dog. The megaphone announces: “Attention! There is a cute dog outside.”

A machine learning image with a human and a dog circled in different colours
The machine learning program clearly labels a ‘person’ and a ‘dog’

“Hey! Cute dog!”

Ryder wanted to share the love and show his appreciation to the owners of cute dogs, so he added a feature for when he is out of the house. With the megaphone poking out of a window, the Raspberry Pi does its dog-detecting as usual, but instead of alerting Ryder, it announces: “I like your dog” when a canine is walked past.

Raspberry Pi camera pointing out of a window connected to a megaphone which will announce when a dog passes by
When has a megaphone ever NOT made a project better?

Also, we’d like to learn more about this ‘Heather’ who apparently once scaled a six-foot fence to pet a dog and for whom Ryder built this. Ryder, spill the story in the comments!

The post Raspberry Pi dog detector (and dopamine booster) appeared first on Raspberry Pi.



Source: Raspberry Pi – Raspberry Pi dog detector (and dopamine booster)

Kay-Berlin Food Computer | The MagPi #104

In the latest issue of The MagPi Magazine, out today, Rob Zwetsloot talks to teacher Chris Regini about the incredible project his students are working on.

When we think of garden automation, we often think of basic measures like checking soil moisture and temperature. The Kay-Berlin Food Computer, named after student creators Noah Kay and Noah Berlin, does a lot more than that. A lot more.

At night, an IR LED floodlight allows for infrared camera monitoring via a Raspberry Pi NoIR Camera Module

“It is a fully automated growth chamber that can monitor over a dozen atmospheric and root zone variables and post them to an online dashboard for remote viewing,” Chris Regini tells us. He’s supervising both Noahs in this project. “In addition to collecting data, it is capable of adjusting fan speeds based on air temperature and humidity, dosing hydroponic reservoirs with pH adjustment and nutrient solutions via peristaltic pumps, dosing soil with water based on moisture sensor readings, adjusting light spectra and photoperiods, and capturing real-time and time-lapsed footage using a [Raspberry Pi] Camera Module NoIR in both daylight and night-time growth periods.”

Everything can be controlled manually or set to be autonomous. This isn’t just keeping your garden looking nice, this is the future of automated farming.

All the data is used for automation, but it’s accessible to students for manual control

Seeds of knowledge

“The idea originated from the long standing MIT food computer project and lots of open-source collaboration in both the agriculture and Raspberry Pi communities,” Chris explains. “We’ve always had the hopes of creating an automated growing system that could collect long-term data for use in the ISS during space travel or in terrestrial applications where urbanisation or climate concerns required the growth of food indoors.”

With students doing a lot of learning from home in the past year, having such a system accessible online for interaction was important for Chris: “Adding a layer that could keep students engaged in this endeavour during remote learning was the catalyst that truly spurred on our progress.”

All data is viewable in real time and historically

This level of control and web accessibility is perfect for Raspberry Pi, which Chris, his students, and his Code Club have been using for years.

“The fact that we had access to the GPIOs for sensors and actuators as well as the ability to capture photo and video was great for our application,” Chris says. “Being able to serve the collected data and images to the web, as well as schedule subroutines via systemd, made it the perfect fit for accessing our project remotely and having it run time-sensitive programs.”

There are six plants in the box, allowing for a lot of data collection

The computer has been in development for a while, but the students working on it have a wide range of skills that have made it possible.

“We have had a dedicated nucleus of students that have spent time learning plant science, electronic circuitry, Python, developing UIs, and creating housings in CAD,” Chris explains. “They all started as complete beginners and have benefited greatly from the amazing tutorials available to them through the Raspberry Pi Foundation website as well as the courses offered on FutureLearn.”

Grow beyond

The entire system has a network of sensors which monitor atmospheric variables of air temperature, humidity, CO2, O2, and air pressure

The project is ongoing – although they’re already getting a lot of data that is being used for citizen science.

“The system does a fantastic job collecting data and allowing us to visualise it via our Adafruit IO+ dashboards,” Chris says. “Upgrading our sensors and actuators to more reliable and accurate models has allowed the system to produce research level data that we are currently sharing in a citizen science project called Growing Beyond Earth. It is funded by NASA and is organised through Fairchild Botanical Gardens. We have been guided along the way by industry professionals in the field of hydroponics and have also collaborated with St. Louis-based MARSfarm to upgrade the chamber housing, reflective acrylic panels, and adjustable RGBW LED panel.  Linking our project with scientists, engineers, researchers, and entrepreneurs has allowed it to really take off.”

Get your copy of The Magpi #104 now!

You can grab the brand-new issue right now online from the Raspberry Pi Press store, or via our app on Android or iOS. You can also pick it up from supermarkets and newsagents, but make sure you do so safely while following all your local guidelines. There’s also a free PDF you can download.

MagPi 104 cover

The post Kay-Berlin Food Computer | The MagPi #104 appeared first on Raspberry Pi.



Source: Raspberry Pi – Kay-Berlin Food Computer | The MagPi #104

How to add Ethernet to Raspberry Pi Pico

Raspberry Pi Pico has a lot of interesting and unique features, but it doesn’t have networking. Of course this was only ever going to be a temporary inconvenience, and sure enough, over Pi Day weekend we saw both USB Ethernet and Ethernet PHY support released for Pico and RP2040.

Raspberry Pi Pico and RMII Ethernet PHY

The PHY support was put together by Sandeep Mistry, well known as the author of the noble and bleno Node.js libraries, as well as the Arduino LoRa library, amongst others. Built around the lwIP stack, it leverages the PIO, DMA, and dual-core capabilities of RP2040 to create an Ethernet MAC stack in software. The project currently supports RMII-based Ethernet PHY modules like the Microchip LAN8720.

Breakout boards for the LAN8720 can be found on AliExpress for around $1.50. If you want to pick one up next day on Amazon you should be prepared to pay somewhat more, especially if you want Amazon Prime delivery, although they can still be found fairly cheaply if you’re prepared to wait a while.

What this means is that you can now connect your $4 microcontroller to an Ethernet breakout costing less than $2 and connect it to the internet.

Building from source

If you don’t already have the Raspberry Pi Pico toolchain set up and working, you should first set up the C/C++ SDK. Afterwards, you need to grab the project from GitHub, along with the lwIP stack.

$ git clone git@github.com:sandeepmistry/pico-rmii-ethernet.git
$ cd pico-rmii-ethernet
$ git submodule update --init

Make sure you have your PICO_SDK_PATH set before proceeding. For instance, if you’re building things on a Raspberry Pi and you’ve run the pico_setup.sh script, or followed the instructions in our Getting Started guide, you’d point the PICO_SDK_PATH to

$ export PICO_SDK_PATH=/home/pi/pico/pico-sdk

Then you can go ahead and build both the library and the example application.

$ mkdir build
$ cd build
$ cmake ..
$ make

If everything goes well, you should have a UF2 file called pico_rmii_ethernet_httpd.uf2 in build/examples/httpd. You can now load this UF2 file onto your Pico in the normal way.

Go grab your Raspberry Pi Pico board and a micro USB cable. Plug the cable into your Raspberry Pi or laptop, then press and hold the BOOTSEL button on your Pico while you plug the other end of the micro USB cable into the board. Then release the button after the board is plugged in.

A disk volume called RPI-RP2 should pop up on your desktop. Double-click to open it, and then drag and drop the UF2 file into it. Your Pico is now running a webserver. Unfortunately it’s not going to be much use until we wire it up to our Ethernet breakout board.

Wiring things up on the breadboard

Unfortunately the most common (and cheapest) breakout for the LAN8720 isn’t breadboard-friendly, although you can find some boards that are, so you’ll probably need to grab a bunch of male-to-female jumper wires along with your breadboard.

LAN8720 breakout wired to a Raspberry Pi Pico on a breadboard (with reset button)

Then wire up the breakout board to your Raspberry Pi Pico. Most of these boards seem to be well labelled, with the left-hand labels corresponding to the top row of breakout pins. The mapping between the pins on the RMII-based LAN8720 breakout board and your Pico should be as follows:

Pico RP2040 LAN8720 Breakout
Pin 9 GP6 RX0
Pin 10 GP7 RX1 (RX0 + 1 )
Pin 11 GP8 CRS (RX0 + 2)
Pin 14 GP10 TX0
Pin 15 GP11 TX1 (TX0 + 1)
Pin 16 GP12 TX-EN (TX0 + 2)
Pin 19 GP14 MDIO
Pin 20 GP15 MDC
Pin 26 GP20 nINT / RETCLK
Pin 36 3V3 (OUT) VCC
Pin 38 GND GND
Mapping between physical pin number, RP2040 pin, and LAN8720 breakout