Yo, it's time for John Park's Workshop. It's me, I'm John Park, and here I am in my workshop. We're gonna look at a couple of projects today. I've got a continuation of last week's project. It's in a much more completed state, just about ready to start working on a learn guide for that. That's our Touch Deck project. And then I'm gonna introduce you to the next thing, which is this Raspberry Pi 400, that cool all-in-one keyboard-and-computer Raspberry Pi, as a synthesizer controller using some analog knobs, kind of a continuation of JP's Product Pick of the Week from Tuesday. It's all somehow interconnected. Let's see. I wanna first of all say thank you for joining me. And if you're wondering where all the chatting is happening, look, this is it right here. Oh, geez, I need my glasses. It's in Discord. If you check out the Discord, it's Adafruit's Discord at adafru.it slash discord. Hey, where did I go? That was weird. I clicked a thing and I went to another screen. Hmm, strange things are afoot with my computer. Maybe I should have put on my glasses sooner. Could you see me that whole time? Yeah, the question is, is this the workshop or the virtual workshop? Today, it's the real workshop. Look, there's no retroreflective green screen, blue screen here today. There's a rolled-up one back there. And then I've got a boom arm there. I was thinking of setting it up for today. It would have been good for some of the demo that I'm gonna show you inside of Rhino and Grasshopper, but alas, I didn't have time to get it all set up because I was working on projects. So if you are joining us over in YouTube, hello, Eric and MoondogIT and Connor McCarder. Nice to see you all. If you're over on Facebook or Twitch or any of the other services, I'm sorry, I am not keeping an eye on those chats. I only have so much attention to spread across things. Hey, Melaku, Yosef, nice to see you. And yes, I wanna say hi to these people over here in our Discord chat.
So Andy and Steve and Hugo and Sea Grover and Susan. Mr. Certainly, thanks for coming by. Gary-Z, hello. All right, so let's get into some stuff here. Let me see if I can safely remove that without changing my camera views. Hey, that time it worked, weird. First thing I'll mention is the jobs board. If you are looking to share your resume because you're looking for some contract work or full-time work or part-time work, head on over to jobs.adafruit.com. That's the thing right there. You can see it. If you head to the available-for-hire section, you'll see some people looking to work in Canada, in Iran, in Buffalo, New York, in Spain, Atlanta, Georgia, Colorado, people all over the place who have skills on offer. So if you're looking to hire someone, this is a great place to look. You can see we've got a lot of people with engineering skills, embedded development, inventors, art and design, retail, all kinds of positions on offer there. So go check out jobs.adafruit.com. It's totally free. It's free for you to go and check it out. It's free for you to post a position. It's free for you to post your resume. Free, free, free. So why not use it? Use it, I say. All right, let's see. Did you know I do a show on Tuesdays as well at this same time? It's my Product Pick of the Week show where I take some product, and I've been doing a lot of STEMMA QT boards, and I check it out. I get into a little more depth with it. I go beyond the initial surface level of it and actually dive in and build a project with it, usually some kind of a demo project. And I get to learn a little bit about it, usually how to use it in CircuitPython or in some cases in Arduino, sometimes both. And this is what it looks like. It changes every week. So it doesn't always look like that, but that's what this most recent one was. And that little board there, that PCF8951, actually I got that wrong, didn't I? It's the 8591, isn't it? Yeah, I got that totally wrong.
Ignore that graphic; instead, I'll give you a little recap. It is the PCF8591, and it is an 8-bit ADC and 8-bit DAC on a board with STEMMA QT. I've got a cyberdeck plugged into the GPIO pins of this Raspberry Pi 400. This Raspberry Pi is running a little Python program that uses the CircuitPython library for this board. It reads all four of the analog inputs. What I'm doing is tracking the first knob here for the X position, the vertical line here with this second knob. The third one, we saw, is changing the hue. And then the last one is brightness. So you'll find I can get that pretty zorching bright there, or drop it down way, way low, or all the way off. It is the PCF8591, 8-bit ADC and DAC in STEMMA QT format. Right, so that was the Product Pick of the Week, and if you didn't know, during that time slot when I do the show, you can get 50% off, or sometimes it's a different amount, but this one and a lot of them have been 50% off. So you can get the board for like $3.20 or something like that. We have a limit of 10 at that discount price, no resellers, but we want individuals to grab stuff, and if they have big plans and are thinking of doing a project that has lots of DACs or ADCs, or both, that's a good board to grab. In fact, the board has three address jumpers, so you can have eight different I2C addresses. So you can use them all, you can chain them, which means you can have 32 potentiometers or faders or whatever on your project, even if your project board doesn't have any analog input at all, which is great. Very cool way to set up an 8-bit, only 256 levels, but 8-bit ADC interface. It's a great board. All right, let's see, questions in the chat. Let me bring up the chat. That's not the chat, that's the chat. "Recap, because of the capacitors." Oh, the puns have started. They never ended, really. Hi, Todd Bot, nice to see you. You got a Cappy capacitor, you did.
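Back on the knob demo for a second: the mapping from raw ADC readings to display parameters is just arithmetic. Here's a minimal sketch of that idea (the function name and the 16x16 matrix size are my assumptions for illustration, not the actual demo code) turning four 8-bit readings into an X column, Y row, hue, and brightness:

```python
def map_knobs(ch0, ch1, ch2, ch3, width=16, height=16):
    """Map four 8-bit ADC readings (0-255), e.g. from a PCF8591,
    to display parameters: (x column, y row, hue 0-1, brightness 0-1)."""
    x = ch0 * (width - 1) // 255    # knob 1: horizontal position
    y = ch1 * (height - 1) // 255   # knob 2: vertical position
    hue = ch2 / 255                 # knob 3: color hue
    brightness = ch3 / 255          # knob 4: overall brightness
    return x, y, hue, brightness
```

On the Pi itself, the readings would come from the adafruit_pcf8591 CircuitPython library over I2C; the mapping step stays the same either way.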
Hey, that's kind of a cool segue, actually, into this project I'm working on. So I started this last week, and what I'll do is jump over to this view here. I know we're gonna get some reflections just because I have a lot of lights in the studio here. I'm gonna remove the Discord view there for a moment, but I'm gonna bring it back at some point. And one of the things I wanna show you, actually, let me go to a small view of me there. One of the things I wanna show you with this, so this board is, what are we looking at? It is the Feather RP2040. If you can see it there, there's a little Raspberry Pi logo right there on that chip, and that's 'cause this is the same chip that's on the Pico. We now have this Feather RP2040, and it gives us USB-C, which is kind of nice. It gives us battery charging and a connector for powering over a LiPo or other three-volt-ish battery source. We have a STEMMA QT connector on there, so we can plug it right into something like that analog ADC and DAC board. What else do we have? We have some very nicely marked silkscreen of your pinout on the top of the board, and in fact, we went one better and put pinout information on both the top and the bottom. One just tells you kind of the raw GPIO names of the pins that are available here, and then the other gives you the sort of Feather pin names, so you can tell which your I2C pins are, the battery pin, all these analog pins, and so on and so forth. Got a reset button on there too, and a NeoPixel for display. So really cool board, and I'm using it in this project. We started this project last week by plugging it into a FeatherWing. FeatherWings, as you probably know, are the boards that we can plug Feathers into, or on top of Feathers, to extend their capabilities. So in this case, this Feather RP2040 plugs right into the back of this TFT display.
It's a three-and-a-half-inch, 480 by 320 pixel touchscreen display. I've still got the protective film on this one because I'm still building this one, but that is what's sitting inside of here. And one of the things I'll talk about is the case design. I started working on that last week. I'll show you just a kind of quick step-through of how I built it. This is an earlier iteration. I kind of changed some of these hole sizes a bit, but you can see the board kind of sets in there, and we have space for everything to fit and still get to reset buttons and things like that. And then I have a case top that goes on top. So that's what's sitting here, and then I've got some M2.5 screws running through, and I put some nice little rubber feet on there so it doesn't slide around. And what this is doing is it's plugged in over USB-C, and it is acting as a USB HID device; that's a human interface device. It's acting like a keyboard, and it can do keyboard shortcuts like pressing a key or a sequence of keys, Control-Shift-4, that sort of thing. It can send consumer control, which is like volume up, volume down, system-wide. And it can also, with the CircuitPython HID keyboard layout library, type a whole phrase. I don't know what the limits are, but it's just a quote, a whole bunch of stuff you wanna type, and it'll type out a whole Declaration of Independence if you wanted it to. The reason that's important is because one of the ways I'm gonna use this is as an emoji shortcut machine for Discord, and the way Discord and some other chat services work is they have a format where you type in colon, smile, colon, and that'll give you a smiley face.
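So the actions fall into three buckets: key combos, consumer-control codes, and typed-out strings. A sketch of how a dispatcher for those three kinds might look; this is my own illustrative structure, not the actual Touch Deck code, and the three device arguments stand in for adafruit_hid's Keyboard, ConsumerControl, and KeyboardLayoutUS objects:

```python
def run_action(action, keyboard, consumer, layout):
    """Dispatch one button action. An action is a (kind, value) pair:
    ("keys", [codes...]) - press a key combo and release, e.g. Ctrl+Shift+4
    ("media", code)      - send a consumer-control code, e.g. volume up
    ("text", "string")   - type a whole phrase out as keystrokes
    The keyboard/consumer/layout objects mimic adafruit_hid's
    Keyboard.send, ConsumerControl.send, and KeyboardLayoutUS.write."""
    kind, value = action
    if kind == "keys":
        keyboard.send(*value)   # press all keycodes together, then release
    elif kind == "media":
        consumer.send(value)    # system-wide media control
    elif kind == "text":
        layout.write(value)     # types the string, e.g. ":adabot:"
    else:
        raise ValueError(f"unknown action kind: {kind}")
```

Keeping the actions as plain (kind, value) data is what lets the button definitions live in a separate layers file from the main code.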
Another thing I'm using it for, and you've seen me do a bunch of different iterations of this sort of thing, but this one I may keep around in this use, is to extend what I've done before for camera switching. One of the original ideas behind this was the Stream Deck, which is used by a lot of people who live stream to do camera-switching types of things, as well as other managing of social and chat functions. And what you'll see right here is I've got these Cam A, B, C buttons; those are the picture-in-picture cameras. So let me see here, I'm switching among the three cameras there, and I can also switch, let's go back to that one there. Let me take myself off of there. And then the main cameras. So that's my main workshop camera, that's my down-shooter on the workbench, and that's my down-shooter right here. So you can see, it can get a little confusing working on this stuff while streaming live, because hey, I'm trying to, and I don't have a shortcut yet for that little view right there of just me. So those are actually just typing 1, 2, 3, 4, 5, 6, and I just have those shortcuts set up inside of Wirecast, which is what I'm using to broadcast, as the keys that get hit when I wanna switch camera views. Now, let's go back to the full view of this, and I can get rid of the little mini view there.
So one of the really cool things, and by the way, I wanna give credit where credit's due, I forgot to mention this last week: Matambale in our chat offered a suggestion to me a few weeks ago about doing this type of a touch thing in order to type Greek and math symbols, and that's one aspect of this that I'm gonna be creating; I haven't built that layer set yet. And Todd Bot, also in our chat, suggested doing a Stream Deck type of clone. So those ideas have come together, and we were able to secure the help of FoamyGuy. Our very own FoamyGuy in the CircuitPython development community has created some really great updates to some of the libraries that are making all of this work faster and better than ever, as well as the code and the ability to abstract our button functions and icons and button positions from the main code, so it's fairly easy to maintain and update and even share layer sets with people. So what am I talking about, layer sets? Well, this is not restricted to just this set of 12 buttons that I have here. I created some little icons here so that we can move to the next layer, to the previous layer, or to the home layer. So if I go to the home layer, that's always gonna bring me to whichever one I designate as the first layer set. In this case, I've got it set as some YouTube controls. So I won't touch these because they'll break stuff since I'm not in YouTube right now, but it does things like play/pause, the five-second jump back or ahead with the rewind and fast forward. Next will go to the next video; previous will go to the previous video if you're inside of a playlist which has a previous video; volume up and down, those are media controls, so those will change it system-wide regardless of which software you're using. Full screen, that's some key combo, I forget, maybe just Control-F or Shift-F. Slow and fast, these increase or decrease the playback speed of the YouTube video, and mute will do the mute inside of YouTube but not a system-wide mute.
And then if I go to the next layer set here, I have a set of, as I mentioned before, these Discord icons. So if I bring back my Discord here for a second and then go click in it so that I've given it focus. If I wanted to type in a Blinka, normally I would type colon Blinka colon and it brings up Blinka, or, and you might not see this, but I would click on, there you go, I would click on a little emoji list and go to the Adafruit set and click on one of the emojis there, which is great, but this is really a lot faster with this gizmo here, with our Touch Deck. So if I just click on my Adabot, it typed in an Adabot. I don't have it pressing enter for me because sometimes you might wanna keep typing inline; you might not want it to always hit return. But you could have it do that: you could add the keystroke for the enter key, and that would give you the ability to just fire off emojis instantly. So I'll do one last one before I stop bombarding you with emojis, but you can see it at play there, it works really well, and if you are crazy about emojis, you could set this up with many, many more layers of emojis. I'll be working with FoamyGuy on a guide for this and hopefully get him to do some good explanations of some of the tricks he came up with for making this work efficiently. But one of the things that we have, I believe, is the ability to resize. So we may be able, without too much difficulty, to do different size screens, or even make more, smaller buttons, if you wanted to have more of them on there, as long as your fingers can still touch them. Also, this is a resistive touchscreen. So let me grab, I've got my little pencil here. It's got a little metal blunt tip on there. So if you wanted to use this sort of stylus-like, it would work like this.
Oh, so Siege Grover says there's an 11-second latency on YouTube today, and I just have to ask, is that a latency between when I'm broadcasting and when you're seeing it, Twitch versus YouTube, or is that audio desync? 'Cause I really hope the audio isn't that badly desynced. Yeah, FoamyGuy, he has a cool avatar, but I don't know if we have an emoji for his avatar. Maybe we should make one in his honor. So, oh good, audio's fine, says Andy. Thank you. When you say latency, I start to be concerned. So one thing I wanted to do is follow up a little bit. Last week I started showing some of the build process for this, and I was saying I was kind of building it just straight ahead in Rhino, sort of drafting curves and extruding them and filleting edges and just kind of working off of some dimensions. I decided to rebuild it all using Rhino plus Grasshopper, which, as I mentioned before, is sort of the procedural or parametric flow graph of your build that's available in Rhino. This approximates a workflow more like what you'd see in something like Fusion 360 or SolidWorks, where you're creating a flow of data and you can always go back and update some dimensions in order to update your project without having to redo it manually. So I'll just show you some brief steps of it. What I started with were some curves, and these were the curves where I took dimensions of my board; for example, those are the dimensions of the TFT right there. I drew those out myself. I'm not gonna turn on dimension lines there, but you'll see that they have some very specific dimensions, and then I started doing things like offsetting. So you can see this is an offset version; that's because I'm gonna use it to carve a slightly thicker version of this out of my solid objects. So this is an extrusion of that. I'm just gonna step through some of these things, so then I cap that extrusion. So that's kind of prepared now as a thing that can carve out from other stuff.
This other curve that I showed you is the sort of main outer frame, and essentially I've built this as the frame and the base, sort of as two pieces that then we assemble together. There's an outer section which I can then carve that intersection out of, and then as we go along I'm kind of adjusting some of the, whoops, what have I done? Adjusting some of those dimensions to fit, if I can show you in here. You can see now I'm carving out that, using that little boolean difference to kind of carve out that shape there, rounding out some edges and putting some little screw holes in there. So that's all of the process for the frame. So this is kind of the final frame top that I have, and you can see the same sort of thing happening down here below with the main base. That one's a little more complex. So what I do is I again start with some dimensioned curves and some vectors to extrude things on the angles. I wanted this 30-degree angle for the sort of tilt of the base there. So you'll see I'll go and take some lines and rotate them there and then extrude those up. Put a little cap on it so it's a solid. Carve some little holes out of the bottom for my screws and standoffs to go. And the cool thing is this is a combination of geometry that I made with particular dimensions inside of Rhino, which I'm then using to carve with, as well as things that are just sort of entered in these nodes. If I go and change something like the diameter of those circles that are cutting out those sections, then it'll update automatically in the model, which is great. So this is much more the kind of workflow that I was talking about last week, saying I'm ashamed I'm not doing it this way. So I decided, well, I haven't used Grasshopper in a while, so I had to go back and remember some Grasshopper and then go about building this again. So that's kind of like the final base shape there. And what I'll do, let me give you a shaded render of just that base there.
One second, turn off preview there. Go to a rendered view. Hope I don't crash something. There we go. And oh yeah, I've left the top on there, but that's okay. So that's what the final looks like. And then I basically bake that. So all of this is procedural up until here, where there's this relationship between all the objects, as well as values that I've entered into these little nodes, or formulas, vectors that I've picked. At this point, I have the sort of final model being created through this history of nodes. And then I can go and bake it. When I do that, I think I've already got a baked one here. Let me do this cooking-show style and find, I'm sorry, I've got to make my windows bigger because I can't really work with the interface this small. Okay, yeah, so those are the baked NURBS surfaces, solids and NURBS surfaces. And then I meshed them before turning them into STL files, just so I could kind of control the details of the meshing. So that's the model file right there. That probably looks familiar to you. It's a triangulated mesh, which is what most of your 3D printer software wants for slicing, and then I've sliced those and printed out these guys. So thanks for putting up with that. I just wanted to come back full circle, and since I had talked last week about, hey, I'm not doing it quite the way I want to do it, I wanted to follow up and show you the proper way. So there are still, I think, maybe just a couple of little adjustments I'm going to make, but otherwise I think this is ready. I realized that my dimensions didn't account for the fact that there's a somewhat sizable gap between the edge of this TFT's physical screen, the glass if you will, and where pixels can go. Those aren't all the way up to the edge. So if you center the board, you're never quite centering the display. So I may readjust things just to scoot it over a little bit.
We'll see; the problem with that is that it'll take the screws and offset them from the screen frame, so maybe I'll leave it alone. But look for a learn guide on this. I'm going to be starting work on that soon. And again, like I said, hopefully FoamyGuy, oh, we've just gotten a FoamyGuy icon in the chat. Thanks, Hugo, for grabbing that as a cool avatar. I don't know what that is. Someone tell me, that must be like a character from something that I just don't know, that I'm not up on. But yes, so that'll be coming soon. I'll probably put out the STL and STEP files for that, which might be useful for someone who wants to re-imagine it. I don't think the Rhino files are as useful to people because it's expensive software and it isn't what the maker community tends to use, but I'm happy to share them, that and the Grasshopper node network. All right, so that takes us kind of through. Let me know if there are any other questions about this Touch Deck. That's the main feature set as we've got it. I will show you now the code I had mentioned. Yeah, this is it. So this is the layers file, if you look here. This is not the main code. This is this abstracted, trying to think what the word I'm looking for is, but essentially the control definitions are in a separate file from the main code, and I'm drawing a blank on the word, so I'm sure someone will help me out in the chat. The file here is just what's gonna happen when buttons get pressed. So in the main code it says, hey, I know the second button got pressed. This here tells us, okay, if we're in layer one, this first layer's section is from here to here; here are the buttons, zero through 11. So the second button getting pressed physically on the screen, if I'm in the YouTube mode, is gonna be the pause key.
So all you have to do to change the functionality of these keys, again, sorry for the glare, there we go, prop that up a little bit, stay, is you can give it a label name, so you can see there, it just says pause underneath it. You can change which icon it points to. Right now I have a set of icons I made, and they are all prefixed with PR for purple-ish, although I kind of went bluer as I developed it, and then pause, that's the pause icon. So if I just wanna change that, let's say I just feel like confusing myself and putting the play icon in there, I'll change that right now, and in a second I'll save, and when it restarts it should bring that up as the new icon. And then the actions: we tell it, what are you gonna do? Oh, it's gonna be the HID key and then keycode K. So you can do consumer control and the various consumer controls, or you can do layout and it'll type in a bunch of stuff. By the way, in YouTube, play and pause are identical. They do the same thing and they have the same shortcut, so maybe I should pop something else in there. I don't even know if there's a stop; play, pause and stop kind of tend to be all the same key that just does an automatic toggle. So right now, let's see, if I save this, you'll see it's gonna auto-reload the code, and again, this being a cool CircuitPython display, we get to see the REPL show up there until it takes itself out with the screen blank, and now it's loading up in there. You can see I've replaced my pause key with a play key, so I'm gonna set that back, and I'll hit save on that so I don't confuse myself later. FoamyGuy is in the chat too. So if anyone has questions, particularly over in YouTube, if you have questions and you're not over in the Discord chat, FoamyGuy is there. Let's see, one question: Todd says, can we use Unicode combining characters to combine the Adabot emoji and the eyeball emoji to make a FoamyGuy emoji? That's a tough one, Todd.
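The layers file he's describing boils down to data: each layer has some buttons, and each button has a label, an icon filename, and an action. A hedged sketch of what such a structure and lookup could look like; the field names and file names here are my guesses at the shape being described, not FoamyGuy's actual file format:

```python
LAYERS = [
    {   # layer 0: YouTube controls
        "name": "YouTube",
        "buttons": [
            {"label": "Play",  "icon": "pr_play.bmp",  "action": ("keys", ["K"])},
            {"label": "Pause", "icon": "pr_pause.bmp", "action": ("keys", ["K"])},
            # ... up to 12 buttons per layer
        ],
    },
    {   # layer 1: Discord emoji shortcuts
        "name": "Discord",
        "buttons": [
            {"label": "Adabot", "icon": "pr_adabot.bmp",
             "action": ("text", ":adabot:")},
        ],
    },
]

def button_for(layers, layer_index, button_index):
    """Look up the definition for a physically pressed button,
    or None if that slot is empty on the current layer."""
    buttons = layers[layer_index]["buttons"]
    return buttons[button_index] if button_index < len(buttons) else None
```

Swapping the pause icon for the play icon, like in the demo, is then just a one-field edit in this data, with no change to the main code.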
A question over in the YouTube chat: David Desa says, JP, did you find Rhino difficult to learn when you first used it? Boy, I'll tell you what, I started learning Rhino when it was in beta back in 1998 or 1999, so I've used it a long time, and I think at the time I was already an Alias Studio and Alias PowerAnimator user, so I was used to a NURBS workflow. Originally Rhino was just a NURBS package; it was kind of the first NURBS modeling program on Windows NT at the time, I believe. And so I think that the methodology of working with NURBS curves, NURBS surfaces and those kinds of operations is maybe the bigger learning curve, but there are great tutorials out there, there are great video series, and I've taught it a little bit before when I used to teach graphics software, and I found people took to it very quickly. Someone asked, can one call up websites? The question is, can you call up websites using this little gizmo? So one of the advantages that a device like the Stream Deck has is that it runs some background software on your computer that's always watching the device, and so it has sort of system-level application controls, so it can listen for things to happen and then serve up the results of those. This is just a simple HID keyboard, so you can do the kinds of things I've shown here. If you wanna go one level more sophisticated, you can install some macro software on your computer. An example I'd like to give you is Quicksilver on the Mac, and if you look at a guide, I'll bring this up real quick. If you look at this guide I did, one second. If we go to learn.adafruit.com, I called it the Launch Deck, and it's a similar project to this, but it uses the NeoTrellis M4, and I created an app-launching application using Automator. Has Apple changed Automator? I can't remember if that's what it's called now, but that does system-level stuff on the Mac.
You can use the Windows 10 taskbar to do things like launching a webpage or launching an application, and then Quicksilver, do I show it here? Yeah, Quicksilver, and there are other pieces of software like this. It is pretty in-depth, the kinds of things it can do. It can run actions that are complex, such as launch a web browser, a particular web browser, whichever one the system favors, and go to a particular URL; or launch my email program and start a blank email that's addressed to, you know, Limor. So that's my answer to can you do more sophisticated stuff? Yes, but you're certainly not dealing with the same level of sophistication, as far as I know, as the Stream Deck or its competitors, which have pretty well-developed sort of server software sitting on your computer watching for events. All right. If there aren't any other questions on that. Oh, sorry, one last one. Car Xpava, Karen Randall, asks, is Rhino at all like SketchUp? Yeah, there are definitely ways in which it's similar, because the concept of creating solids, extruding them, creating other solids, using them to carve things out, that Boolean solid workflow is pretty much how SketchUp works. SketchUp is fairly constrained to that, and Rhino certainly works that way, as do a lot of other solids programs. But then Rhino goes way, way, way beyond in the drafting ability and the inferencing and ability to snap things, although SketchUp does great snapping kinds of stuff. I'm not here to bash software; I've used SketchUp quite a bit, many years ago, and it does have a similar workflow to some of the kind of carving types of things that I'm doing here. All right, yeah. So actually, one last thing on the Touch Deck before we move on to preview our upcoming project. FoamyGuy is in the chat talking about the speed of the bitmaps. So I actually pushed us to go to larger bitmaps. We had smaller ones at first, I think 48 by 48 pixels. We've gone up to 80 by 80 pixels, which clearly take up a little more space.
This very morning I updated, actually, this would be kind of a cool experiment. So if you watch, do I have a second USB-C port? No, I don't. Okay, so I'm just gonna unplug this in a second, but watch right now, let me switch to my down-shooter and I'll show you, if you look, how quickly or how slowly the bitmaps load. It's like about, I don't know, half a second per bitmap to load off of disk. Those are all 16-bit RGB BMP files. And then this morning, I unplugged that one, I converted all of those to indexed, 256-color indexed, which makes them less than half the size. We'll see if we can get this one to load up here. My first take on it, I didn't do it side by side, was that it didn't really change the load speed that much. But let's, oh, I may have an old version of CircuitPython in here. I had this issue before where I have to go into the REPL and kind of goose it before it wants to launch. So one second, let me tell it to relaunch from the console here. Those actually always load in faster that first time. Yeah, it's about the same speed. It does not look any faster to me. And those are much, much smaller files, but they're all kind of small anyway. So we'll see. I think Ladyada had some suggestions for some possible solutions, maybe using sprite maps. So we may be able to get that faster, but that's partly an intellectual exercise, because I don't imagine if you're using this kind of thing, you're using it to page really quickly between lots and lots of stuff. You kind of are going to be working in Photoshop for a while and want to use it as a Photoshop launcher, and then you might switch over to YouTube to watch some stuff for a while, and switch it to that. So I don't know that a blazing-fast swap between layers will matter that much, but it'll be nice. So we'll see. Jay Furcian, if I'm getting your name right, says, I can't wait to see this guide.
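The size difference he's describing is easy to sanity-check on paper: 16-bit RGB stores two bytes per pixel, while 8-bit indexed stores one byte per pixel plus a fixed color table. A rough back-of-envelope sketch (uncompressed BMP; this ignores the small fixed header and row padding, which happen to be negligible at 80 by 80):

```python
def bmp_pixel_bytes(width, height, bits_per_pixel):
    """Approximate uncompressed BMP payload: pixel data plus the
    color table an 8-bit indexed image carries (256 entries at
    4 bytes each). The ~54-byte header is ignored here."""
    pixel_data = width * height * bits_per_pixel // 8
    palette = 256 * 4 if bits_per_pixel == 8 else 0
    return pixel_data + palette

rgb565 = bmp_pixel_bytes(80, 80, 16)   # 16-bit RGB pixel data
indexed = bmp_pixel_bytes(80, 80, 8)   # 8-bit indexed data plus palette
```

The pixel data alone exactly halves; the 1 KB palette eats into that a bit, which is why smaller icons benefit proportionally less from indexing. As the demo showed, file size wasn't the bottleneck for load speed anyway.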
I made a much simpler version of this two years ago using a PyPortal and Automator. I've always wanted to revisit it. Oh, very cool. So there's a PyPortal app launcher; take a look at the Discord there, very cool. All right. So let's take a look now at the new thing. So I'm going to switch over to a different camera over at my workbench there, and let me go and put myself into the corner there. So what's going on over here? What's all this, you might ask? As you'll remember, last Tuesday, two days ago, I created the knob controller, let me run this, for the Raspberry Pi 400, that uses our little quad ADC and four knobs. So I built this little gizmo here. I'm holding it in something that'll give it some weight. Let me zoom in for a second. There we go. So this is our STEMMA QT cable connecting it up to the cyberdeck adapter on the Raspberry Pi 400, which just angles and brings out all the GPIO pins. And then I soldered up this board, let me take it out of here for a second, with four potentiometers and lovingly bent wires, and tried to make it look clean and neat on this perfboard. And these are, at the end of the day, acting like voltage dividers, sending varying analog voltages that get interpreted by the chip as 256 values. So what I showed, and you saw this on the little recap video before, is that I'm controlling this Pimoroni Unicorn HAT HD. I just wrote some simple code to move some pixels around, up and down and left and right for these two lines, and then sweep through the different hue values and also change the brightness on that. So I mentioned to Limor, I said, hey, do you want me to write up a little guide on using analog inputs on the Raspberry Pi? Because, as I've mentioned, there aren't any. The Raspberry Pis don't have analog inputs, even though they have all this GPIO.
So if you want to use knobs, you kind of have to add some sort of analog-to-digital converter, an ADC. Zoom back out here. And she said, yes, that would be cool. In fact, what I think would be a really neat demo, try to get that in focus again, is to control a synthesizer on the Raspberry Pi. I said, hey, neat, I like synthesizers. Who doesn't, right? And so what I did was I took a look at Sonic Pi. Sonic Pi is a software synthesizer that's often installed on a Raspberry Pi distribution, if you get one of the official Raspberry Pi distributions. It's often right there, or it lives up in the Add/Remove Software tool they've got in here somewhere, where they have a list of recommended software and you can also kind of pull in anything in the world. So Sonic Pi is a very nice educational coding platform that is coded in Ruby and acts as the front end for a really powerful software synthesizer engine called SuperCollider. And you find SuperCollider in a lot of devices, some really high-end devices. You've seen me show, we've got our friend Steve and Noriko in the chat here, he has made a version of the Monome Norns before. Norns is a super high-end synthesizer coding box. Steve's made one called Fates. And the Norns and Fates typically use SuperCollider as the engine, you can put other things on there, but usually they use SuperCollider, and then they have a little scripting language front end, similar to this sort of idea; theirs is Lua. But this, if I can zoom in here, and you can see this little funky cyber screen I've made here, let me move it down a bit, try to avoid glare. This, not that, this, this is Sonic Pi. It looks like a coding IDE. And in Sonic Pi, we typically do things really simply. You can say play 60 and it's gonna play MIDI note 60, which, I always forget, is that middle A or middle C? It's middle C.
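The "middle A or middle C" question has a tidy formula behind it: in the MIDI convention, note 69 is A4 at 440 Hz, every semitone multiplies the frequency by 2^(1/12), and that puts note 60 at middle C (about 261.63 Hz). A quick sketch:

```python
import math  # not strictly needed; 2 ** x covers it, but handy for variants

# Standard MIDI-note-to-frequency conversion under equal temperament:
# note 69 is A4 at 440 Hz, and each semitone is a factor of 2**(1/12).

def midi_to_hz(note):
    """Convert a MIDI note number to its frequency in Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

print(round(midi_to_hz(69), 2))  # 440.0  -> A4, the tuning reference
print(round(midi_to_hz(60), 2))  # 261.63 -> middle C
```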
But you can get much more sophisticated from there by saying, oh, I want a particular synth engine, and I wanna control parameters such as not just which note is playing, but what the frequency of a filter cutoff is, or the amplitude, how loud it is, or the sustain, how long the note sort of trails off after it's been played. And so I thought, I'll bet that we can control some of the parameters inside of Sonic Pi using this little gizmo and some Python code that sends messages from a Python script over to Sonic Pi. And the way this is often done in software is either with MIDI or with OSC, which stands for Open Sound Control. OSC is like a latter-day, more advanced MIDI sort of idea for sending information from one application to another, generally related to music things. And as it turns out, I'll show you this over on the computer later, it took about one library, three lines of code in Python, and two lines of code over in Sonic Pi to send OSC controls, derived from what I'm doing with my knobs, over UDP to Sonic Pi. So enough talk, let me play something. So what's gonna happen here, and again, I'm sorry, I don't know if you can see that code, I've left my visualizer thing running, so that'll still do its cool stuff, which is just fun. And then I've got the same knobs that are controlling these four visualizer parameters sending four different sets of values over OSC into Sonic Pi. So I have something in here that says A, B, C, and D are synced to the thing that's sending OSC. And what do I interpret those variables as? A will be the note, B will be the cutoff, C will be the sustain, how long the note trails off for, and D will be the amplitude, or how loud it is. So what I'll do now is I'm gonna hit play, and I have a little sort of arpeggio that'll play, and you'll hear changes as I turn the knobs. Here we go. So there's that filter frequency sweep.
I'll change the notes, the sort of root note, and now I'll change the sustain of just the first note. So that first note will start to mush and stay present, see how it's just hanging out there. Now, if I set that amplitude real low, we won't hear it anymore, but it's still playing. And one reason I wanna do that is, I'll show you, hopefully you can see this if I expand this window: this little colored set of text flying in here, these are called the cues. Those are the external events that are coming in from that Python code. If I switch back over to Python, this code here, let me see if I can make that pretty big, and hopefully you'll be able to see it. Let me zoom in on that text. I'll stop that code from running. I can turn on the plotter, that might be useful, and I'll run that code again. It gave one spurious send. So as I change things like the filter, you'll see that blue line moving up and down. So every time this code goes through a loop, let me turn off that plotter, what it does is it's checking... I've installed this python-osc library, and I'm importing the OSC message builder as well as this UDP client. And then I set up this thing called a sender, which is a simple UDP client pointed at the machine itself, 127.0.0.1, port 4560. That's the port that Sonic Pi listens on. And then I send a message. That's actually why we heard it that one first time, because I send a message at the beginning here, before I've checked the dials. And that message has a little format. It has a path, /trigger/prophet — the Prophet synth is one of the synths that it emulates — and then these four values. So that's the note value, 143, the filter cutoff, 110, the sustain is set to one, and the amplitude is set to 0.7. Then the rest of the code is the same as what I did on Tuesday. I'm using that PCF8591 to read these four knobs. And then I'm changing that raw value... actually, the raw value we get is a 16-bit value, so we get zero to 65,535.
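The python-osc UDP client he describes hides a very simple wire format: an OSC message is just the address pattern, a comma-led type-tag string, and the big-endian argument values, each section null-padded to a four-byte boundary. Here's a hand-rolled, stdlib-only sketch of the same send, handling only int and float arguments; the argument values echo the ones described on screen, and in a real project you'd just use python-osc instead:

```python
import socket
import struct

# Minimal OSC message encoder: address pattern, then a type tag string
# (',' plus 'i' for int32 / 'f' for float32 per argument), then the
# arguments, all big-endian and padded to 4-byte boundaries.

def osc_pad(data):
    """Null-pad bytes to a multiple of 4 (strings always get >=1 null)."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address, *args):
    """Build a raw OSC message for int/float arguments only."""
    tags = ","
    payload = b""
    for arg in args:
        if isinstance(arg, float):
            tags += "f"
            payload += struct.pack(">f", arg)
        else:
            tags += "i"
            payload += struct.pack(">i", arg)
    return osc_pad(address.encode()) + osc_pad(tags.encode()) + payload

# Fire-and-forget over UDP to Sonic Pi's default OSC port, 4560:
msg = osc_message("/trigger/prophet", 60, 110, 1, 0.7)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 4560))
sock.close()
```

Because it's UDP, the send succeeds whether or not Sonic Pi is listening, which is why a knob loop can just blast messages every pass without error handling.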
I take that and I switch it into kind of two different sets of ranges. One is the set of ranges that changes, let me run it again, that changes this display. Turn this volume down. So I need to set it to zero to 15 to address those 16 rows and columns. And then, depending on the range of what I wanna do over in the synthesizer, I send things like MIDI notes. So if we look at that MIDI note range, let me hit stop here, and I'll go back over some of this stuff on the main monitor, where you can see it a little more clearly, I think. Oh, that's pretty big, hopefully you can see that. So here are some, make that even a little bigger so I can get there, here are some of the remappings of those values. So for the OSC note value, I take my raw value that comes in, I map it from zero to 255, and I use that in both instances. But then to turn that into a note, I say, okay, we'll make this an integer, and we'll use the CircuitPython map_range function. Even though I'm in regular Python here, I'm using a lot of CircuitPython libraries; they work on the Raspberry Pi. And thanks to Dan Halbert for letting me know that I could do it that way. It makes life easier for me. And then this map_range takes whatever comes in as that OSC zero value, expects it in a zero to 255 range, and remaps it to MIDI note 43 to MIDI note 58, so I've got a little over an octave there, if I've done my math right. And so on, I do that for the different things that I'm reading over in Sonic Pi. And the result is what we've seen. So then, back here in Sonic Pi, let me see if I can make this full screen and hide some stuff. I'll hide those cues: show log, show cue log. That'll help, right? And I can make this bigger and scale that code up a bit. A bit, a bit, a bit. I'll do it, okay. So that's the full code in Sonic Pi. Let me just move this down a little again. And is that in focus? Sorry, I have a hard time seeing if that's better. And I apologize if people in the chat are telling me to focus.
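The map_range helper he mentions (from CircuitPython's simpleio library) is just a clamped linear remap, which is why it runs happily on a Raspberry Pi in regular Python. A few-line sketch of the same idea, re-implemented here for illustration rather than copied from the library:

```python
# A clamped linear remap in the spirit of simpleio.map_range: rescale x
# from [in_min, in_max] to [out_min, out_max] and pin it inside the
# output range so out-of-range knob readings can't escape.

def map_range(x, in_min, in_max, out_min, out_max):
    """Linearly remap x between ranges, clamping to the output range."""
    mapped = (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min
    lo, hi = min(out_min, out_max), max(out_min, out_max)
    return max(lo, min(mapped, hi))

# A 0-255 knob value becomes a MIDI note between 43 and 58, as in the demo:
note = int(map_range(128, 0, 255, 43, 58))
print(note)
```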
So here, all the Sonic Pi code is doing is creating a live loop. It is syncing these four variables to whatever comes in over OSC. So there are four things coming over from OSC, we sync up with those, and then I tell it the synth name, prophet. We could change that, even midstream. So actually, if I play this, let's go back and run our sender. While this is running, it's kind of a live coding environment, so we can go in and say, I'm gonna do hoover, and I'll just hit re-run. Had some buffered in there. That's a different synth. I think we can maybe do a sine. No, that's not a synth, so I just gave it a bad name, and it's like, no you don't. I think supersaw might be one. Now we get this cool buzzing. Okay, I'll stop that. I'll hit stop on this. So this is one of the neat things about this: it's a live coding concept where it's constantly running, you can change things and just tell it to kind of refresh. It doesn't even miss a beat. It keeps the tempo and it adds in your changes. Here I'm just throwing in a little sleep, and then I'm adding two to the note each time I run through, and that's what gives us that little sort of boop boop boop boop arpeggio that runs up. So this right here, where we have the synth, I said, okay, I can just manually change that. We could change that also from our knobs if we wanted to give it some little sort of selection ranges. And then right there, note is A, so that's whatever came from the first knob. Cutoff is B, came from the second knob. Sustain is C, came from the third knob. And amplitude is D. All that running on this little gizmo right here. Let me zoom out a bit. I'm really excited about this Raspberry Pi 400. It's super cool. I built a funky little frame for my monitor there. That's it, that's the gizmo. I'm running out over HDMI to this display, and I'm running out over a little USB audio dongle, a little sound-card-in-a-box kind of thing, which makes life a lot easier. It gives me a line out.
There is no line out on the Raspberry Pi. I also have had problems getting the audio over HDMI and then getting the HDMI resolutions I want, so I skipped all that. This was Ladyada's suggestion. She said, why don't you try one of these little audio cards? Worked great. And then our lovely little analog ADC, giving the knobs the treatment. So let me run back over to my desktop here and see if we've got any questions. So let's see. WagonLoads in the YouTube chat asks, is I2C fast enough for a MIDI interface? Yes. Someone in the chat might be able to remind me of the speeds of those two things, but I believe I2C is quite a lot faster than the, whatever, 31,250 baud or wherever MIDI is. MIDI is pretty slow. Yeah, someone says JP's making the opening song for Stranger Things. It's hard to not make something really reminiscent of that arpeggio from Stranger Things. Let's see, what else? Other questions? Could you use a touch mouse pad for the X and Y? Probably, yeah. In fact, I have one of these here somewhere. We have some little digitizer touch screens. Here's one, I think it has a little controller, and I'm not sure what it outputs, if it's I2C or not, but these types of little... this is from a Nintendo replacement part, but it's a little digitizer screen, and it comes with a little controller board. Things like that, I think you can get to just send out. Maybe that's digital, so maybe it's not a good example. But yeah, X-Y pads that are analog you should be able to read, and joysticks you should be able to read, because that's just two potentiometers. Someone in the chat says MIDI is 31,250 baud, super slow. What is I2C, like a thousand kilohertz or something like that? I just made up numbers. It's fast, fast enough for this. I think it's like between 400 kilohertz and a thousand kilohertz, something like that, is the range on I2C that we can use. Someone tell me if I just made all that up.
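To put rough numbers on that chat exchange: MIDI runs at 31,250 baud with 8N1 framing (10 bits on the wire per byte), while I2C in its common "fast mode" clocks at 400 kHz with roughly 9 clocks per byte (8 data bits plus an ACK). A back-of-the-envelope comparison, ignoring I2C addressing overhead:

```python
# Rough throughput comparison: classic serial MIDI versus I2C fast mode.
# MIDI: 31,250 baud, 8N1 framing -> 10 bits per byte on the wire.
# I2C fast mode: 400 kHz clock, ~9 clocks per byte (8 data + ACK),
# not counting start/stop conditions or address bytes.

MIDI_BAUD = 31_250
midi_bytes_per_sec = MIDI_BAUD // 10

I2C_FAST_HZ = 400_000
i2c_bytes_per_sec = I2C_FAST_HZ // 9

print(midi_bytes_per_sec)                       # 3125
print(i2c_bytes_per_sec)                        # 44444
print(i2c_bytes_per_sec // midi_bytes_per_sec)  # 14
```

So fast-mode I2C moves bytes an order of magnitude faster than MIDI, and 1 MHz fast-mode-plus parts widen that gap further — comfortably enough headroom for relaying MIDI-rate data.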
Oh wait, there may be questions. I was scrolled funny here, let's see. Yeah, I can't take it apart to show it, or I don't want to right now, but the Cyberdeck, man, is it cool. In fact, do you know what I'm gonna do? I'm just gonna pull the plug on this. Sorry, Raspberry Pi, I should have shut you down properly, but I'm doing a show here. I'm just gonna grab this. So here's the Raspberry Pi with that Cyberdeck. Phil B did the super cool silkscreen on it, Johnny Mnemonic style, and then I have that hat plugged into it, but these come, boop, right out of there. That's what we would be interfacing with normally, which means a hat like this, I think, would be facing backwards. So you want something like this to use different hats. We also have a bonnet version that's a half-height size, and this has the JST 3-pin connector for stuff like NeoPixel strips, analog stuff, and the I2C connector there. Or I guess you'd have to use digital things with that. The one thing I will tell you is it's actually possible to be off by one with some things. This one is keyed, that connector has that little keying notch there, so it pretty much goes in correctly. But I managed to put a non-keyed hat in there one pin off and wondered why it wasn't working. That is why it wasn't working, so be careful, you can actually get the pins off by one. There's enough space in there to do that, which is too bad. Yeah, so cool device. Hopefully I didn't make it too angry by pulling the plug on it there without a proper shutdown, but these things happen. Yeah, nunchuck, someone mentioned, you could use the little nunchuck breakout, and there's a little STEMMA QT board for that. I've got one right here somewhere. Will I find it easily? Probably not, too much stuff. All right. And then I did mention I would look a little bit at the code here. So let me open up, are you in here? There, almost, stand by. There it is. So that's the code that I was just showing, or similar.
I think I made a couple of changes for doing the Sonic Pi synth sounds, and this is what the code that's doing both the Unicorn HAT and the OSC messages looks like. So I'm bringing in the message builder and the UDP client. These are the pertinent lines of code here to set up the Python code as the client, and Sonic Pi is set up as the server to receive that with, what, just this line right here? Sync. That's literally all I have to do, is say sync. It's pretty great. And I'll show you, I flashed this up before, if I pop over to this tab here. Oh gosh, that's big, let me make that a little smaller. I think I Googled OSC in Sonic Pi and came up with this page, here's how you receive it. Boom. And it showed a basic listener, mentioned doing it in, I think, Clojure and Python. That's the code there that I grabbed, and I was able to install the library with pip3. So very straightforward, and I'll be doing a guide on that. I may also try something funky with that. We'll see if I can do a mirrored display with one of these little PiTFTs instead of that visualizer. Then we'll be able to do things like see the scopes, maybe even, it's a touch screen, maybe even press play. The play button there, the interface, might be too small. But there are waveform scopes for, I think, the sonic spectrum, frequency, as well as moving waveforms over time, and a Lissajous pattern, and some other stuff. Might be cool, we'll see. All right, anything else? Let's see, that's that code, that's that code. We've looked at all the code. I think that's gonna do it. So thank you all for tuning in. Thanks for being active in both the chats. I'm sorry if I missed some of the chat there, it's a lot going on, but thanks for coming by, both YouTube and Discord people. And I think that's gonna do it for today. I hope you had a good one, I did. Now I'm gonna go off and build some more of this fun stuff and write it up. So, for Adafruit Industries, I'm John Park.
This has been John Park's Workshop. Bye everyone.