Well, it's that time of the week again. It's time for Chitchat Across the Pond. This is episode number 769 for June 10th, 2023, and I'm your host, Allison Sheridan. This week, our guest is Bart Busschots, but this is not a Programming By Stealth episode. It's a Chitchat Across the Pond Light, at least within our personal definition of light. How's that, Bart? It might even be genuinely light, maybe. I don't know. You said you're going to be talking about some developer stuff, so that sounds a little less light than most normal humans would call light, but for us, that's light, right? Right, and it sort of came about for two reasons. So the first is it's been a week where I have had to do a lot of programming for my work hat, and there's only so much my brain can do. And I tried to write the show notes on Friday evening, and my brain was like, nope. For a start, I'm still in PowerShell mode. This bash thing is not happening. There were dollar signs everywhere there shouldn't be. Oh, it was a mess. It was a giant mess is what it was. And I'm also really keen to talk to someone fun and nerdy, and I can't think of anyone more fun and more nerdy than you, about, frankly, a part of Apple's keynote. I'm already thinking ahead to Let's Talk Apple for this one, thinking how on earth can I fit everything I want to say from that amazing keynote into one Let's Talk Apple? That's going to be, I don't know how I'm going to do that. Well, you're going to take the 30 foot level, right, or 100 foot level, 100 meter level? Right. But even when you zoom out, it's been a long time since so much has happened in one keynote. Yeah, definitely. Because they dropped a bunch of cool stuff in a press release two days in, like on the Wednesday, we got a press release, some service updates, little things like sharing passkeys with a group of friends. Basically, they Sherlocked 1Password for Families, just in a press release.
Well, they Sherlocked a feature of 1Password for Families, but yeah, yeah. A friend of mine is waiting for me to answer a question. Actually, a listener of the show was waiting for me to answer the question of, is there any reason why I shouldn't use passkeys? Sorry, why I shouldn't use Keychain and 1Password at the same time. I didn't even get around to responding before. Now, the answer is getting harder and harder to tell. It's probably a good thing for 1Password that they've pivoted to an enterprise product, because their focus has really shifted to the enterprise with single sign-on integration and stuff. Don't say that as though it's shifted away from the home and family, because that is not true. They've added. Correct. I'm saying it from the point of view of the company not being damaged by Apple adding this feature. I'm saying it because I want them to survive and thrive, and if they were still dependent on only families... They're not shifted, they're not pivoted away, which is what you said I was saying. They've added. Okay, yeah, that's fair. They've expanded their, yeah. They're not a one-trick pony. Correct. Yes, that is a good way of putting it. And of course, the other, yeah, it's slightly off topic. But anyway, like I say, right, you've made my point here, there is so much in WWDC, but the bit that everyone else isn't talking about as much... I've listened to lots of podcasts and everyone is obsessed with the "one more thing," obviously, but they're all focusing on the hardware, which is a version one piece of hardware at a very high price that's not going to be available anywhere until January at the earliest, and US only at first. And I'm like, okay, it's interesting. It's more than I was expecting for a version one product. It's more than I thought it would be. But, if I may steal your analogy, that's not what blew my dress up. That sort of ruffled it a bit.
But what blew my dress up was visionOS. I was not expecting visionOS. You blew my mind there. Let me reinforce what you said: everybody I've tried to talk to about it, including Steve, wants to talk about the price, wants to talk about the hardware. And it's like, yeah, yeah, yeah, that's all going to work itself out. Okay, fine. I mean, that's been on a very, very slow trajectory to getting better and better and better, not getting much cheaper. But we could argue about what the price is versus something from 40 years ago, whatever. Sure, let's talk about what we saw, the vision of the future. And I told Steve, for me, it was like the first time Steve and I talked to our friend Ron over CU-SeeMe, which was a video conferencing app. It was a one by one inch picture of him moving at maybe four frames a second in maybe 32 pixels. Yeah, it was just ridiculous. And we all went, whoa, because we could see what it was going to be. And I had kind of that whoa feeling. It sounds like you did too. I did too. I absolutely did too, because to me, this is the next revolution in computing. So you and I probably have this in common. We started computers on the command line. That was the computer interface. I started on a Mac. Oh, okay. Well, I started on a teletype. Oh, okay. You've outdone me now. And punch cards. Okay. Now you've really outdone me. The floppy disks had already shrunk when I started, and they had become not floppy. So they were already an anachronism, and I really thought it was silly that they were called floppy disks. Well, they're floppy on the inside. Crunchy on the outside, floppy on the inside. Armadillo. Anyway. Oh dear. Okay. So, yeah. So I started off life on the command line.
And the closest we got to a visual interface was something called a TUI, which is where you use ASCII characters to draw a sort of a kind of a GUI with text. Right? Like putting borders around things. Yeah. WordPerfect was like this blue screen with white text, and the top row of text was in yellow. Oh, yeah. And every other letter was underlined, and if you held down the Alt key and that letter, then this menu would appear, which would just be a box of characters over the other boxes of characters. And then you could use the arrow keys to pick something from the menu. And that was GUI-ish. Right. And then the first time my mind got a little bit blown was when Windows 3.1 came out. Because you could go on your command line and type WIN and hit Enter, and this app would launch that let you launch other apps. And those little apps were in floaty windows and things, and you had little icons and whatnot. So you saw Windows 3.1 before you saw any Mac operating systems? Because the Mac operating system in some form existed before that, right? They certainly coexisted for sure, and the Mac probably predated it, because the Mac goes back to '82. '84. '83? It can't be 1982, because that was 1984. All right. Yeah. That was the name of that movie. Never mind. I knew it was an Orwell reference. I don't know why I thought it was '82. Anyway, okay. 1984. And yeah, no, Windows 3.1 is later than that. Much later than that. Yeah. Oh, yeah. It's 1992. There we go. I lost a decade. Just poof. Well, anyway, that's okay. We forgive you. You came over to the right side. I did, very, very eventually. But when you saw a windowed operating system, a true GUI, that was a step function. That was a step function. And then Windows 95 blew my mind. Because in Windows 95, you never saw the command line.
You turned the computer on, it made a bloom, bloom, bloom noise, a little field came up, and your icons were just there. And that was it. The whole computer was this graphical user interface. Was that a mind-blowing jump or no? That was pretty substantial, actually, because Windows 3.1 was a place you went when you weren't playing games to do a few things. It was like an app with apps in it, because you had to start it up by typing WIN and hitting Enter. And so, you know, you'd stick in your floppy, go A colon, and, you know, the name of your EXE file for your game or whatever. And then maybe when you were done playing games, you'd type WIN to go play with Paint, or if you wanted to do some homework. But it was like an app that had little apps inside it. It still wasn't quite the "wait a minute, the computer can just be pictures." Which, of course, was the Mac's whole thing, right? Hello, and up it came with pictures, right? But we didn't get that until Windows 95. And I think that's a pretty mind-blowing change. So that was a pretty big change that happened in my early computing life. And the next time my mind was blown was when we replaced indirect manipulation. So the icons and the windows, you manipulated them indirectly. Your hand moved a mouse. The mouse moved the cursor. The cursor moved things. So, yes, your hand was manipulating the icons, but it needed an intermediary, right? You had to use the mouse to control the thing on the screen that was actually manipulating the interface. I was just laughing because you got there so much later. Oh, slide to unlock? Okay. Like, I still remember everyone huddled around me in the office, because in Ireland, we didn't get the first generation iPhone. We got the iPod touch. And I ordered one straight away. And it arrived while I was at work in IT services. And everyone huddled around for the unboxing.
And just the act of sliding your finger and watching zero lag on a touchscreen and having the thing open blew everyone's mind. I was like, oh my God, direct touch, directly manipulating an interface. That was absolutely revolutionary. And what I saw with visionOS (I keep on getting the name wrong because it was rumored to be realityOS for so long, I'm going to say it wrong a few times) was the next time my mind was blown to that same level. To me, what everyone else has been doing is like a little gimmick for adding on to something else, a way to play games, a way to have meetings with silly bodies with no legs. It was like this piece of technology in search of a reason. And what Apple have is a computer operating system. They have a way of manipulating the world. Well, one of the phrases they've used now, and I think this really encapsulates it, is spatial computing. Yes. And that was the dimension that got me. I mean, I'm a productivity hound. I don't game. Doing some exercise is kind of a fun thing, I can see that. And I'm not against anybody else enjoying gaming on a headset. That's just fine, if that's what they want to go do; it just doesn't interest me. But productivity out in space, and having my computing out in the space around me, that was more mind-blowing. Steve specifically asked me, did I feel that kind of leap going to the iPhone from the BlackBerry? And I said, no, that wasn't wow. I mean, it was cool. It's really cool. But it was doing the same things I could do before in a different way. And I guess, technically, some of that is what's true here. It is. That's kind of what I like about the evolutions. It's gotten better at doing what I need to do. I was writing documents in WordPerfect on the TUI. I'm writing documents on macOS in a really shiny GUI app. And I will be writing documents in spatial computing. So to some extent, I'm continuing to do what I do, and that's why it blew my dress up. Interesting. Okay. Yeah. Yeah.
So I was thinking about this a lot, like how best to express why my mind was blown. So everyone was expecting Apple to do a headset that you would wear while you walked around, like Google Glass. That's what everyone thought was going to happen, which to me is mobile computing. It is designed to be used on the move, a bit like an iPhone is a mobile computer. You walk around while you're tapping on your iPhone. That's all perfectly fine. This isn't that. This is the ultimate portable computer. This is a laptop on steroids. This is a laptop with an infinite display that's portable. Portable means you move it around and then you settle in to do something. Oh, okay. When you use a laptop, you bring it with you, and then you settle yourself down and you do something. So we're not going to use it crossing the street, looking down at the ground, about to get hit by a car. Certainly not this product. And Apple may do something else in the future that is also spatial computing, but more of a stripped-down spatial computing, which would be some sort of glasses-like thing, which would basically be a bit like the iPhone is to the Mac. Yes, it's familiar, but it's a very stripped-down, minimalist OS. This wasn't that. To me, this is the next laptop. Only instead of me wondering about, do I get a 12 inch or a 15 inch? I have an infinity inch. It is an infinity-inch laptop. It's small. It's portable. I bring it where I want to go and then I settle down and I do something. So maybe I settle down at a standing desk. Maybe I settle down at a counter. Maybe I settle down on the couch. Maybe I settle down in a lounge somewhere, or on an airplane. But like a laptop, I settle down and I do something. And instead of me being confined to a little screen, I now have the freedom to arrange my work as broadly as possible. And I genuinely find small screens claustrophobic. I hate working on small screens. It's not just irritating.
It's deeper than irritating. You feel like you're in the center seat of a five-seat row in an airplane with your elbows touching and you're trying to eat with a spork? Sorry, was that a little too good a description? But yeah, I know that feeling pretty darn well. Exactly. So this gives you all of your apps, and you could see it immediately from the demo, which didn't start with a game or with floating avatars in a pretend meeting. They started with really ordinary apps. They showed us a music app. They showed us Safari. Yeah, they showed us FaceTime and stuff, but it wasn't a virtual meeting room where no one has any legs and everyone's pretending to be in space, à la Mark Zuckerberg's demo of software he didn't have. They showed us a well-thought-out... Right. But you're also in a window, right? Yeah. It's animated playing cards that you arrange as you wish, not a virtual room you're all sharing, which is way more inclusive, by the way. Yeah. It is a desktop operating system with an extra dimension. And they don't just have a shiny demo like Mark Zuckerberg had. I watched Mark Zuckerberg's keynote thinking I would see the future of computing, and I was like, meh. I couldn't care less. It was like, oh, look, some shiny graphics of an OS that doesn't exist. That's a vague idea in Mark Zuckerberg's head. This isn't. Back to my favorite mantra, it didn't seem to solve any problems. And to me, one of the problems that I see being solved here: my immediate thought was you're in a cubicle with one cruddy low-resolution monitor doing some boring task, whatever it is, in a corporate world, and to suddenly be able to have two 50-inch monitors side by side at a slight angle to each other, and working back and forth and becoming more efficient and not noticing all the cubicle walls around you.
One of the things John Siracusa mentioned on the Accidental Tech Podcast was you really will need to dial up that fake background of trees and rivers and waterfalls or whatever, the forest, because otherwise you'll be looking at the wall in front of you in your cubicle and your display would be on the other side of it. So you need to be able to raise that. I mean, it does depend on where you are. Well, that's what I'm saying. If you're doing it in a cubicle or on an airplane, you need to get rid of that wall in front of you. Yeah. And so there are a few things that struck me. So the first thing is, even from just the segment in the main keynote, so leaving aside the fact that I went nerdier, which I will get to, but just the bit they showed in the main keynote, you could already tell that there was a design language. There was a consistent concept of how a UI would behave. They showed us, for example, the fact that every app, every window, and they still call them windows, has this little drag bar underneath. So it's a little horizontal bar that looks like the bar on our iPhones, but it sits just a little bit below each window. And that's the bar that you look at. And then you can use your fingers to pinch and zoom to make the window bigger or smaller. And you can also push it away, because you have a third dimension now. So you can push the window away, move it left or right. And you do that with that drag bar. So you can literally just... I missed the drag bar. I saw the little curved thing in the rounded rectangle in the bottom right to drag the size of the window, but I didn't notice that you did anything with the bar across the center bottom. Yeah. So that center bottom is basically... So on the Mac, we move a window by grabbing its toolbar, right? The top of the window next to the traffic lights; you grab anywhere there and you can move the window around.
Well, in visionOS, it's that little bar that looks like the iPhone screen's little bar that's just underneath the window. You grab that and that lets you move the window in all three dimensions. So you can move it left or right to get it out of your way, but also push and pull. So if you're working in a document, you can pull it right up to the front. And then if you need to glance over at a graph or at a spreadsheet or something for some numbers, you can put it off to the side, and even into the background for anything you particularly don't care about right now. You don't make it go away. You just... I mean, it's like Stage Manager on absolute steroids, right? Because you have your little windows, but you can push them back. And it anchors your stuff, not in your field of view. So that's the cool thing compared to Google Glass. This interface isn't anchored relative to where you're looking. It's anchored on the world. And so when you turn your head, your windows stay put. And according to everyone who I've read, the thing they all say from their demos, because some of the lucky people got to have real demos of this thing at WWDC, they say that, unlike if you use AR on your iPhone, this thing is rock solid. The stuff does not move in the world. If you put a window over your coffee table, it stays over your coffee table. Okay, perfect. So it's utterly believable, which is what you need. I hadn't even thought about that being a really important feature of this, but you're right. You need it to stay where it is. If I'm going to turn to the right and I want to see Excel, and I want to turn to the left and look at my photos, well, my photos had better be there when I get back with my head. Right, yeah. Otherwise, it's not spatial computing, right? You're supposed to be able to arrange your digital universe around you. So it has to be rock solid. And actually, you press and hold the Digital Crown to basically re-center the universe.
So if you have your windows spread out where the coffee table is the middle, and you change your mind, the light has moved or something and you want somewhere else to become the middle, you turn to where you want the new middle and you press and hold the Digital Crown, and then the virtual universe shifts and now it's anchored there. So I'm sitting up on my couch and then I start feeling lazy and I want to turn 90 degrees and relax back. I might want to rotate the universe. Interesting. I hadn't heard about that. So one of the things that I've... I probably should have gone back and rewatched the keynote if I'd realized I wanted to talk more about this, but I've gotten myself stuck thinking a lot about when they said you can plug this into a Mac and now everything on your Mac... Now you're arranging these... you've basically got Spaces, Bart's favorite thing, Spaces, but you can have them all over in space out in front of you. And I've forgotten exactly what the details are of what you get from realityOS without the Mac. So I know you've still got applications though, right? You've got Photos and Notes and things. Oh yeah, it's a full computer. Every iPad app. Every iPad app. Okay. Right now, before the developers write a line of code. Okay. Yeah. So just like you can run an iPad app on the Mac, you can run iPad apps on realityOS out of the box. Okay. They appear as windows in realityOS. Sorry, visionOS. Oh, I said it too. Sorry, visionOS. Yeah, we're both in it, because the rumor has been telling us to prepare for realityOS for so long, it's just stuck in my brain. Just keep thinking Marvel. Marvel, Vision, Vision, Vision, Vision. Anyway, even before a single developer writes a single app, you already have the full stable of iPad apps ready to go on day one. And that's without anyone doing anything extra. So that already tells you that this is a full-on OS already.
But of course, everything you write in SwiftUI will be translated automatically, because the whole point of SwiftUI, the reason Apple have been pushing it for so long, is that SwiftUI is built around the concept of a declarative language. So you don't... What does that mean? I knew you'd ask. I've been trying to think about the perfect way to explain it. So do you remember in Programming By Stealth, we had a big discussion early on about the difference between using a P tag to say this is a paragraph versus just sticking in BR tags to say make an empty line? Remember, this is the light audience. So I know. So the point is you're saying what something is. You're not saying I need you to make it four pixels wide and I need you to make the text bold and blue. You're saying this is a heading. This is a button. Okay. This is a toggle. So you're declaring what you want. So you're literally saying to the computer, I want a window with two buttons at the bottom. I want a scrollable list in the middle. And then I want a heading above that. And I mean, obviously there's a syntax to it and it's a bit more code-y, but you're declaring what you want. And then that same declaration is used to build your watch app. It's used to build your Mac app, and it's used to build your iPad and your iPhone apps and your realityOS app. So your declaration of your desire gets translated into different pixels with different looks and feels and different responses depending on the environment they're in. Okay. And that means that even if you take a modern SwiftUI app, without the developer writing a line of code, it's not just available in realityOS, it already feels native in realityOS. Sorry, visionOS, visionOS. Because SwiftUI will translate it to be the way their buttons are and so forth. Okay. So do you think that their invention of Swift was actually with this kind of vision in mind?
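For the nerdier listeners, here is a minimal sketch of the kind of declaration Bart describes, a heading, a scrollable list, and two buttons at the bottom. The view and item names are illustrative placeholders, not anything from the show; the point is that you declare what the pieces are and SwiftUI renders them natively on each platform:

```swift
import SwiftUI

// A declarative UI sketch: we say WHAT the interface is,
// not how to draw its pixels on any particular platform.
struct SketchView: View {
    let items = ["First", "Second", "Third"]

    var body: some View {
        VStack {
            Text("My Heading")            // declared as a heading
                .font(.headline)
            List(items, id: \.self) { item in   // a scrollable list
                Text(item)
            }
            HStack {                      // two buttons at the bottom
                Button("Cancel") { }
                Button("Save") { }
            }
        }
    }
}
```

The same declaration can then be compiled for watchOS, macOS, iPadOS, iOS, or visionOS, and the framework decides the platform-appropriate look and feel.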
At least part of the push was, but it was also, even if this wasn't happening... So we know for a fact that there's seven years of development behind this thing. So for at least the last seven years, we have to look back at everything Apple were doing, and they already knew they were working on this. So everything they said for the last seven years, we now have to reappraise. Every WWDC for seven years back, we should now be looking at going, what were they really saying? And they were pushing SwiftUI and stuff. And they were selling it because you had iOS, iPadOS, macOS and the Apple Watch and the Apple TV. So they already had a lot of personas for your SwiftUI. It was already well worth your effort. But they also knew it would be even more worth your effort when visionOS came out too. Okay. The best thing about the internet is someone will go back and search that and show a timeline of, remember when they said this, and then they said this, and then they said this. And whether the story was really there or not, it'll be built, right? True. True. So I can say I was already sufficiently impressed by how well thought out this UI was just from the main keynote. But me being, you know, someone who writes code (I won't call myself a developer, but I am a coder), I obviously watch two keynotes every time there's a WWDC. I watch the public keynote and I watch the State of the Platforms. Which is also public. Which is, yeah, they're all public, right? Every single session is public actually, but it's not aimed at... Yeah. Right. The intended audience is not the press. The intended audience is the developers. And it's notable that you don't see the same Apple people on the two. You see the people who actually write the APIs on the State of the Platforms. You don't see the famous executives, right? You don't see your Craig Federighi and stuff.
You see, you know, lower level, more hands-on people anyway. And the nice thing about the State of the Platforms is it's actually chapterized. So when you go to it in your browser, you can jump straight to the chapter you want. And they have a chapter on visionOS. And so even if you don't watch the rest of the keynote, because it gets quite in the weeds at times, when they jump to visionOS, they pull out of the weeds, because this is a whole new OS. The developers need to be taught first principles, which is fantastic, because first principles are human friendly. So they actually start by explaining the concept of the UI. So they say that there are three atoms of content in visionOS. Most apps are going to be windows, which you should imagine mentally as a pane of frosted glass. And on that pane of frosted glass, you will paint your app. And you have all of the standard stuff. So sidebars and menus and all of the normal components that make up an iPad app or a Mac app, they're all there. They're all ready for you in SwiftUI. All of the standard controls have been adapted to make them fit in that space. And they explain, you know, how they've made the font a little bit bolder because that works better. White text, they have discovered, works really well in AR or VR, much better than black text. And so there's a whole bunch of translation to make it all work, but they've done that hard work and it's just there for you. So that's one atom: the window, which is a frosted pane of glass. The other atom is what they call a volume, which is for constrained 3D content; it's a box. So your app can exist in a box if you have three-dimensional content. If your app makes sense as some sort of three-dimensional experience, it can exist in a box. And then the last thing you can do is an immersive experience, which is the equivalent of a full-screen app.
There are your three ways of doing things, right? It's a pane of glass, a box, and a universe. You can make a universe. I'm going to interrupt real quick, audience. My granddaughter appears to have escaped and joined us here for a second. You can say hello and then you've got to go. Okay, bye-bye. The live show is used to seeing that happen, but that was a special little surprise for you, Bart. She's very comfortable on the microphone. I don't know where she gets it. I think growing up in the pandemic, kids are not afraid of cameras and microphones. Maybe that's what it is. That's what Granny was for ages. So let's see. I'm going to back you up to where I was before I got distracted. You started talking about the volume, about some box. Is this the 3D space that your pane of glass will be within? Like you don't let it get closer than this or farther than that? Well, no. So you can say that my app doesn't live in a window; my app instead lives in this volume. And so you as a developer would say, I'm writing an app that is three-dimensional. So if you're writing a player for 3D movies, you would define that as a volume. And then the person can control it; basically they can stretch and shrink the box, but it will stretch and shrink in three dimensions because it has a depth. Okay. But you can still pick it up and move it. So it's for 3D stuff. Got it. Okay. It might be a visualizer for a 3D model or something. By the way, I do want to say, when they showed the 3D CAD model and the people walking around it, they said that was PTC. That's a company that used to be called Parametric Technology Corporation. I think they don't go by that anymore; they just go by PTC. But that was the primary application that I used and then supported, and was probably one of the places I spent the most amount of money in my corporate job. And I was on the board of directors of the international users group for PTC. So I went, yay, PTC.
Cool. Perfect. Exciting. So that would be an example of an application that would use the volume. But you don't have to use a volume. You can use the pane of glass if you're 2D. Yeah. So a developer would decide what makes sense. So if I was writing a music player or something, I would want a pane of glass. But if I was writing a CAD app, I would absolutely want a volume. So what about the apps that they showed that have pieces that can come out? Like you're watching a basketball game and there's a panel that can come out and kind of float in another piece of space that shows the stats on a player or something. Well, give me a sec to come back to that, because I want to get the three atoms out before we zoom in a little bit more. So the third atom then is a fully immersive universe. So you can basically say I want to design an immersive experience, which means it's effectively the full-screen version in 3D, right? Everything else goes away and my experience is the only thing the user sees. So the main thing Apple demoed for the purpose of those is a customized version of their beautiful Mount Hood. So you would have some sort of koi pond that you could then enable as your fake background to replace your cubicle. And that would be a fully immersive app that you could then use as your background. Oh, so that's a separate app, the koi swimming around or something. So that would be a separate app from the thing that you're watching, or, you know, I've got Excel because I'm a nerd, 20 windows of Excel, but I'm inside a koi pond or something. You could do that, or it could be an experience like a fully three-dimensional movie of some sort, right? Some sort of interactive three-dimensional experience, a game or something, would also be an example of something that you would do as a fully immersive experience. Inside the pyramids? Right. Exactly. Yes.
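For the coders following along, the three atoms Bart describes map onto SwiftUI scene types on visionOS. This is a hedged sketch under that assumption; the app name and the stub views are illustrative placeholders, not Apple API:

```swift
import SwiftUI

// Illustrative stub views standing in for real app content.
struct GlassPaneView: View { var body: some View { Text("A 2D app") } }
struct BoxView: View { var body: some View { Text("3D content") } }
struct KoiPondView: View { var body: some View { Text("Immersive scene") } }

@main
struct ThreeAtomsApp: App {
    var body: some Scene {
        // Atom 1: a window, the pane of frosted glass.
        WindowGroup(id: "pane") {
            GlassPaneView()
        }

        // Atom 2: a volume, a box the user can resize in three dimensions.
        WindowGroup(id: "volume") {
            BoxView()
        }
        .windowStyle(.volumetric)

        // Atom 3: a fully immersive space, the "universe".
        ImmersiveSpace(id: "pond") {
            KoiPondView()
        }
    }
}
```

So a music player would declare only the plain window, a CAD viewer would use the volumetric window, and the koi pond or pyramids experience would live in the immersive space.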
Or exploring some sort of... yeah, or anatomy. I mean, the possibilities. Or anatomy. Oh, I'm going to go inside the human body. Right. Yeah, exactly. So there are your three big-picture atoms. But if you have a window-based app, you don't have to have just one window. You can have other windows, which you should keep close to your main window so that when you use the drag bar, they'll all move together. But you can have other ones, and they will be in the Z axis, and they will come towards the user. So any new stuff you open is in front of your main pane of glass, little mini panes of glass that are in front of it. Now, the thing they said, and they've done a lot of research here, is that you should use the Z dimension, but don't go mad. Oh, okay. So just give a little bit of depth towards you. Okay. And they demoed it actually in... so, okay, I went three levels deep. So first level, public keynote; second level, State of the Platforms; third level, they actually have a session called Principles of Spatial Design. And they explain the computer science they've put into figuring out what does and doesn't work, what's easy for the user, what's hard for the user, dos and don'ts. And one of the things they say is never, ever make your text three-dimensional, and they demo it. It's awful. Three-dimensional text looks terrible. Well, you're watching it in 2D, so how do you know? Oh, no, you can see. You can take a photo of something and know it looks horrible, right? I can take an animation of a piece of text and wiggle it left and right, and on a 2D screen I can still tell that that's not easy to read, because it looks embossed. It's much easier to read a flat thing. Trust me, it was really obvious. They also demoed that you do not use colored text; you use colored backgrounds with white text. And they show you clearly what does and doesn't work.
And so they walk through all of these principles of design, how many points you need to have on things, how you should do your corner radiuses and stuff, and they demo for you what does and doesn't work. And when you see the dos and don'ts, you're like, okay, thank goodness they worked this out. So the developers already have a real head start here in not breaking our brains. I appreciate what they're saying, and they're probably correct in what they're saying not to do. But this is a classic place where Apple, in my opinion, breaks a lot of usability when they do things like the translucent menu bar on the Mac, where it was changing colors right to left based on the desktop background you have. And it made it so white text suddenly wouldn't work on the left-hand side, but it was working on the right, because it was too light on the left and dark on the right. And I don't know, they get into that translucency thing just a wee bit more than is usable for a lot of people. They have gotten much better at making that work intelligently, right? You're right, when they started doing it, it was awful. But modern macOS tends to fade it in pretty well, so it actually works. I haven't noticed in a while, and maybe it's some of that experience they had, but remember, it was a while before we were able to even shut the darn thing off, and it was like, oh, thank goodness, I can turn that off, because now I can read my menu bar. Yeah, now it's an accessibility feature, which is what it is, right? So that's important. The other really important thing I need to say is, and they made a very big point of this in one of the three levels deep, I don't remember at which level they got into it, it might have been the State of the Union: windows have what Apple has named ornaments. So a menu bar, like at the top of, say, the Mail app, doesn't have to be in your main window.
You can detach that and have it nearby as a little floating element, a little floating rounded rect, just a little bit above or a little bit to the side. And they call those ornaments. And it actually looks way better to have an ornament. Do they have to be rectangles? Could it be a circle that you had to spin, if you really wanted to annoy people? I mean, it could be anything, but if you use what comes out of the box, you get very pretty rounded rectangles, a bit like the flattened Dock in modern versions of macOS, that sort of very pleasingly rounded rectangle. And it will sit just above, just below, just to the left, just to the right. You can have your ornaments wherever you want them, but they look very good. Will those also be slightly closer to you? They are, yes. They're slightly forward in the Z axis, so again, giving you some feeling of depth. Yes. Interesting. And they also talk about how you show what is currently selected. The other thing they've thought out, which has really impressed me, is privacy. So in this operating system, the pointer is your eyes. You can use a mouse, like you can use a mouse on the iPad, but you don't have to, and it's not the primary driver. Like the mouse on the iPad is a bonus extra, your eyes are the cursor in VisionOS. So you look at what you want, and it responds to you like the Apple TV does, but it's just a little wobble. The thing you're selecting slightly pulls out. Actually, all of the icons have layers, and they show you how to make an icon. It has to be three layers, and the system will round them and do the specular highlights and stuff, but you, the developer, provide a three-layered icon. I do remember seeing this with, I think it was maybe the Photos icon, and they showed it to you at an angle so you could see that the little flower was in front of a little background, in front of another background, like a matte framing.
But I do want to say that's another area where I don't think they're actually really good at this. And the perfect example of that is on the Apple TV, when you've hovered over an episode, or even a show that you want to watch. It's often really hard to tell which one you've selected. And there's, again, an accessibility feature where you can have it draw a white box around it. To everybody listening, if you didn't know that, go turn that on. You'll be amazed how much more easily you'll be able to tell what you've selected. So they may underdo the vividness of what you've selected. There we go. Yeah. And there are also, from day one, accessibility features already in VisionOS. There are already things you can turn on for people with reduced vision and stuff. That's already there, which is kind of impressive. Did you learn anything in the State of the Union about the touch stuff with the fingers? Or is that just... Yeah, so that's purely a gesture. So your gaze is the movement of the mouse, for want of a better description. The act of clicking the mouse is just pinching your fingers. And it doesn't matter where you do that gesture, because there are, is it 12 cameras? As long as your hand is somewhere within the field of view of the headset, it can recognize your hand and it can see the gesture. So you just move the thumb and whatever you call that finger, and that counts as a click. You just look at what you want to do, you move your thumb and finger, and you click. And you just make a hand gesture to grab the bar, if you're looking at the bar, and move it in or out. And so it doesn't matter where your hands are. You don't have to reach into the world. They can be anywhere. It's just the direction of movement that matters, not the positioning in 3D space. Right. Right.
I wonder on that one. Everything I've heard so far is that it's a really easy gesture to do, but I am relatively certain I will double-tap accidentally. My fingers will just kind of, I don't know, spasm and do it twice. I have that problem when I'm tapping on the iPhone, when I'm trying to type or something; I have a tendency to double-tap things. But that might just be an accessibility issue for me. I don't know how it would respond, obviously, since I haven't played with it. But anyway, where I was getting to: so your interaction is your eyes. Okay. If you're an advertiser, or if you're someone who wants to learn a lot about you, you're very interested in where people look, right? There are entire labs devoted to eye tracking. That is a thing. And this yoke does eye tracking as part of its bread and butter. But the app does not get your eye position until you make the gesture to click. Oh, so they don't see everywhere you're looking until you get there, until you select it. Until you interact. Okay. So you actually click. The OS is showing you what you're positioned on by wiggling buttons and stuff, but the app doesn't know the buttons are wiggling. The app doesn't know anything until you click, until you pinch your fingers. It's just such a shame that people do evil things with that information, because I think that's such useful information for a developer: to know it took me, you know, 15 seconds to find the right button, that I went over here, and I noticed this over here, and I went down over there. Oh, there it is, way over on the right. Oh, maybe that's an important button; I should move it to the left, as a developer. And I remember talking to a company at Macworld a hundred years ago where they were selling a tool that would let you do this on your app, and it would basically create a heat map of where people had been. Not eye tracking, but where people had been moving the mouse.
Actually, it was for ads at the time, but everybody I told about it was like, no, that's a terrible idea. But it's so useful. There's no reason you couldn't have something similar. Like, you can give permission for an app like TextExpander to track your keystrokes. That's a permission you give the app, and once you give the app the permission, it can read every keystroke in every app. There's no reason they couldn't put in a similar permission. What if you could do that inside TestFlight? If you agree to test my app, I'm going to ask for this permission: I am going to track your eyes, but I am not going to sell it. That seems very plausible. And being in a beta already gives different things, right? When you enroll your device into a beta, different things happen. So that's entirely within Apple's security model. That doesn't break anything. But in terms of the day-to-day, the fact that Apple are already thinking, before the thing has even launched to the public, about how to protect your privacy immediately made my mind go, okay, this is not a half-baked demo. This is very clearly a mature operating system with a lot of thought in it. And the more I watched the in-depth, third-level videos, the clearer it was that these people have been using these APIs to write apps, because they talk about when we did the Music app, when we did the Photos app, when we did... and they were talking about how they chose to use the APIs. It's really clear these are people who have been working in this environment already. And now they're sharing with the developers this amazing toolkit they've already been using. So I was impressed. That's interesting. Yeah. So they started with the Music app and then started mutating it, I assume, to the new language. Yes, exactly. And so you should think about it like this. Remember I said that the point of SwiftUI is that it's a declarative language. You declare what something is. But each declaration has optional attributes.
And those attributes are not the same across all devices. That is why, on the Mac, the same SwiftUI code does slightly different things, because on the Mac you can say, well, the Mac has a concept of a menu, which the iPad doesn't. Therefore, my Mac app has these extra optional bits of UI that are only going to be visible on the Mac. You just put them into your declaration, and they get ignored by things that don't use them. But they're there on the Mac. So it's basically like writing functions with optional arguments: you just put in the optional arguments, and on the iPad, the optional arguments for the iPad do a thing, and on the Mac, the optional arguments for the Mac do a thing. And in RealityOS, VisionOS, there are different sets of optional bells and whistles you can add to your declarative UI to say, and on VisionOS, I want you to... So: I want you to draw me a menu that has a copy button and a paste button. On the Mac, I'm going to say, here are the options for the Mac: I want it at the top of the screen. Or maybe I want it actually up in the window's toolbar, another thing you can do in modern apps like Mail, where the buttons are right next to the traffic lights. That's just an option you type in when you're doing your UI. But you would have another line underneath that says, and on VisionOS, it's an ornament. And so you've defined that you will have a copy button, a paste button, and what they do when you click on them. And then you've said: on the Mac, this menu bar goes here; on VisionOS, it goes there; on the iPad, it goes there. So it's still declarative, but you can add effectively conditional statements: if I'm on the Mac, then do this extra thing. And that's where the work is for the developers. Out of the box, you get an app that's already decent. But the spit and polish between a good app and a great app is doing those extra little things to make your Mac app more Mac-y.
And your iPad app more iPad-y, not just a bigger iPhone app. And SwiftUI has all that capability. So that's what they want developers to focus on. Start with your iPad app and then make it better. Did they talk at all, in any of this, about how Mac apps work in VisionOS? Because it's not really VisionOS when you're using your Mac to see things through this headset, right? If you opened Numbers on your Mac through the Vision headset, and you opened Numbers inside VisionOS, they would be different. Yeah, they didn't go into that. I'm not saying there isn't a session that went into that detail, but none of the sessions I saw did. Okay. I'm picturing that being real interesting. Maybe you only switch to the Mac apps when there isn't a good VisionOS app, right? You wouldn't use Numbers, unless you prefer it, I suppose. Yeah. I mean, I think you'd have to use it in the flesh, because it's a bit like Sidecar. It reminds me a lot of the Sidecar feature, where you can just bring your iPad next to your Mac and it becomes an external display. Well, in this case, you look at your Mac and you just sort of take its screen, and then you have an external display for your Mac in VisionOS. It's kind of like Sidecar: where suddenly you had Mac apps on an iPad, now you have Mac apps in VisionOS. I'm just picturing: take Overcast, which is an iOS app that doesn't run on the iPad; Marco writes a VisionOS app, but I actually end up using the Mac app, which is a Catalyst app, and I put that in VisionOS. Yes, the rabbit hole can go deep. Let's make the worst user experience possible by doing that, right? And then, I realize we're running to the end of our time window here, but the last thing I want to say impressed me so much: how clever they were about allowing you to be in this immersive world. It's not binary. It's not "I am immersed" or "I am not immersed."
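As a concrete illustration of those per-platform "optional arguments," here's a hedged sketch, not code from any session, of one SwiftUI view that puts the same controls in a toolbar on the Mac and iPad but in an ornament on visionOS. The copy and paste actions are placeholders; `.ornament` and `.toolbar` are real SwiftUI modifiers, though the exact anchor values here are this sketch's assumption:

```swift
import SwiftUI

// One declarative view; per-platform extras are added
// conditionally. The button actions are placeholders.
struct EditorView: View {
    @State private var text = ""

    var body: some View {
        TextEditor(text: $text)
        #if os(visionOS)
            // On visionOS, the controls float as an ornament
            // anchored just below the window, slightly forward
            // in the Z axis.
            .ornament(attachmentAnchor: .scene(.bottom)) {
                controls
            }
        #else
            // On macOS and iPadOS, the same controls live in
            // the toolbar instead.
            .toolbar { controls }
        #endif
    }

    // The shared declaration: which buttons exist and what
    // they do, independent of where each platform puts them.
    var controls: some View {
        HStack {
            Button("Copy") { /* copy action */ }
            Button("Paste") { /* paste action */ }
        }
    }
}
```

The shape matches what Bart describes: the declaration of the buttons is written once, and each platform's conditional block only decides presentation.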
It's actually a spectrum of how immersed you want to be. You can have it so the real world is perfectly in shot and your windows are exactly in the real world, and you can choose to make them small so they might only take up, you know, 60% of your field of view or something. And the real world is 100% crisp, 100% present. So you're very unimmersed. But you dial the Digital Crown and you start to blur the background a little bit, and your virtual world becomes stronger. And you can dial it all the way to "go away, universe," right? Which, if you're in the cubicle farm, is actually what you want. But it's a choice you have. And even if you're in the fully "go away, universe" state, it's still detecting people. So human beings break through. Gruber described it as being like a Force ghost. He's a Star Wars fan. But not a spooky ghost; he was very careful to point out, like a Force ghost, but not spooky. That's how he described it in his piece. The people who walk into the room break into your virtual world, so you're never completely isolated. You don't have this thing of being scared witless, which a lot of people describe, or walking into a wall. The other thing Gruber pointed out was that he was using it on the couch, very comfortable, really enjoying it. And they went, okay, now let's run the dinosaur app. And a portal opened up in the room, and a dinosaur poked its head out. And then they went, let's go look at the dinosaur. Stand up. And as soon as he stood up to walk, the coffee table in the real living room came into his universe, because it was an object he needed to know about. So it's constantly sensing your environment, so your shins will not get, you know, destroyed.
And so he was able to see the coffee table, walk around the coffee table, and then look into the portal and see the full dinosaur in its own little universe in the portal. And he said everything was fully rigid in the world; as he walked around, it was anchored like it was real. Interesting. Back to the recognizing people: the example they showed of somebody walking up and handing something to the woman who was working, that seemed like a great use of that. I know I keep poking at what won't work, but I'm wondering, will my dream of a happy person in a cubicle work if it thinks it's noticing these people on the other side of my cubicle wall, less than six feet away? I wonder how that'll work. But it's visual, right? This whole thing is based on processing what the sensors see. So the cubicle is your shield. Well, you might get the person across the aisle from you in the next cubicle. I guess it depends on how tall your cubicle is. Or put up a little curtain. What's it going to say? I've seen all sorts of interesting uses of box files and so forth when cubicle walls aren't high enough. Give me some privacy. Yeah, so it is surprisingly well thought out. And the other thing, of course: from your point of view in the virtual universe, having the people break through and having the physical shin dangers break through is obviously for your benefit. But there's the inverse. I'm working from home, I'm sitting in my office, and I'm staring at an empty wall. I've gotten rid of my very expensive Mac Studio display; I don't need it anymore. I now have infinity space. I'm working away at my desk. I'm using my keyboard, I'm using my trackpad, because that all works fine. And my darling beloved walks in, and I see him. But how does he know how deep I am in another universe? Well, that's why they put your eyes on the screen on the front of the darn thing.
And the representation of your eyes goes from crystal clear, your eyes, which means I am in the least immersed mode, to barely recognizable, completely blurred-out eyes, which means I am completely immersed; I have this thing dialed up to 11. And so it's actually very visible to the other person how deep you are in virtual reality. So how deep you are doesn't change because the person walked into the room? I thought it did. All right, so when they become a Force ghost, the representation of your eyes will also change, to show that you have seen them. So they work in unison. Okay. I don't understand this Force ghost reference. I don't know what that is, so I don't quite know what you mean. Okay. A ghost. It doesn't matter what kind of ghost, right? It's Star Wars. But when the person walks in, when the person walks in, you notice them, and that would start to clarify your eyes to them? Precisely. When the headset notices the person, two things happen: they appear in your view, and your eyes become clearer, at the same time, in lockstep with each other. So they're unlikely to see the fully blurred-out view. Well, they will while they're more than six feet away, right? I can't remember whether it's six feet; there's a distance they walk into to break through. So you can sit in a coffee shop, and the street can be happening without all of these random heads coming through your PowerPoint or whatever, right? But yeah, that gets back to the cubicle problem. That'll be an interesting difference. The thing I liked, again, in what we've been talking about here, is that it captures the imagination with what you'll be able to do with it, where my imagination wasn't captured with the other ones. Someone else's imagination was, which was a bunch of floating avatars with no legs, where Bob is a dinosaur and Sally's a hula girl, or vice versa, nothing wrong with that. But that's not an imagination that's got any value to me. This did, though.
To me, that's the thing, right? What am I going to do in spatial computing? Everything I do on my computer, I'm going to do in spatial computing, and I'm probably going to discover some bonus extras. As a pure bonus, I'm probably going to find things I can do in a spatial OS that I can't do anywhere else, and I'm going to love them. I have no idea what they are. I don't think anyone does, right? I've often heard it described that when television was invented, it was radio with pictures, and people thought about television as being, well, we do a radio play, but we can see people, or we do a theatre presentation, but we broadcast it to people. And only later did people realise that we can do things here that we couldn't do in the previous world. And that will be true of spatial computing. I don't know what it'll be, and a whole bunch of it probably won't interest me. Apparently the sports stuff was really impressive. I don't care, but there'll be something. Thinking about it logically, I want a planetarium app. I want stellar cartography from Star Trek, and that is entirely compatible with this headset as designed. Interesting. Well, and that's taking what is important to Bart; and sportsball is important to Lindsay. And many people. I want better Excel. Pivot tables in 3D. Oh my god, 3D pivot tables. Yes, that's exactly what I need. I need to have that Z factor going on. Graphs in general can work much better if you can animate them into a third dimension. Oh yeah, exactly. Maybe we should end by talking just a little bit about: how did none of this freaking leak? How did it not? Because there's no manufacturer yet. Okay, but there could have been the manufacturing of the hardware. But to know what it was for? I don't think it's in production, but I do think they have demo units, which means that they're handcrafted. Whatever's left of Jony Ive's lab still exists; Jony Ive has gone off to do his own thing.
But Jony Ive's labs still exist, so they have manufacturing equipment to fabricate one-off prototypes. They would have been building these things in-house. They haven't gone to a giant big factory and made a million of these yet. So that's why it hasn't leaked. I would imagine that changes today. But the hardware could have leaked, and you still wouldn't have known what it was for. That is also true. It's the OS that meant the humans had to keep their lips sealed, always, in every direction, you know, spherically. Like Project Purple. I don't know what that is. The iPhone's OS was called Project Purple inside Apple. That was worked on for a very long time under, what's his face? He left Apple over Apple Maps. Nice guy, didn't get on with Tim Cook. Oh, it's going to do my head in now; I'll think of his name in a couple of minutes. Scott Forstall. There we go. Finally, the penny dropped. Under Scott Forstall, Project Purple was what became iOS. That was five years in the making, and it stayed secret too. They actually had their own building. I would not be surprised if they took the Project Purple playbook, blew the dust off it, and just repeated it. Yeah. It just makes me so happy that nobody knew. I wanted to be surprised for the first time. I mean, 15-inch MacBook Air. Yay, we knew that, you know? That was about it, but I'm so pleased about it. And bits of the hardware did leak. Yeah, like the Digital Crown. But it didn't make sense. Yeah, the Digital Crown leaked, the eyes leaked, but the eyes sounded silly, right? It has a screen on the front with your eyes? That's what, oh, the guy who's so good at leaking stuff. Oh, I know who you mean. Gurman. He went to Bloomberg. Mark Gurman. Yeah. He had that, but it didn't make sense. It's like that parable where you have blind people trying to describe an elephant. It's like, I think it's a tree. No, you know, we're all just feeling around. We don't know it's a trunk.
We don't know it's tusks. We have no idea what it is. And when Apple put it all together, it was like, ah, okay, so the leaks were right. We just didn't understand. Now, I was really surprised that everybody I talked to who just watched the keynote didn't notice there was a scene where a man walked towards us and then turned his head, and the eyes were out in front; you could see his eyes on the outside of the goggles. And when he turned his head, it got real weird. It was just a split-second shot, but it looked really strange, because you realized that when you aren't looking straight at somebody, the eyes are displaced from the face. If you look straight at me, it looks okay, but they're forward. The eyes were just wrong. I mean, has anybody posted pictures? It's doing a 3D effect. I wanted to get out my ski goggles, put some googly eyes on them, and put them on for the live show. I forgot to bring them to Lindsay's house, though, so I can't do it. I can't believe I'd be the only person to think of it. But anyway, well, this is cool. This has been fun, Bart. I see what you wanted to talk about now. We did; I didn't actually know what we were talking about coming into this conversation. I knew I wanted to talk to you about it, though. So thankfully, you trust me enough that when I say these things, you go, oh, sure. Yeah. My response was, yay, sounds great, whatever that is. I didn't do show notes, but I will dig them up, because I think I was wise enough to bookmark the videos in my developer account. So I think I will be able to give you links to the videos I recommended: the chapter in the Platforms State of the Union, because that's chapterized, and then Principles of Spatial Design. Those are the most important ones. I'm glad you're going to do that, because I was really, really bummed.
I put in my calendar what time the State of the Union was, and I blocked off that time on my calendar to sit down and watch it. I was dedicated to it. I was going to do it. I was going to watch it, because I really enjoyed it the last couple of times. And they didn't play it. It was recorded and then given to us later. It wasn't live. I mean, it didn't go live at the time they said it would. It was on the schedule, and it said so on Apple's website. I was looking right at it at the right time. I'm in the right time zone, so I didn't mess that up. And it simply wasn't there, and I found out it was just recorded and given to us later. I was like, well, yeah, but it was recorded. Well, I'm wondering if the developer app might have had it in real time. Now, as it happens, real time doesn't work for me, being on the wrong side of the Atlantic Ocean and the continental United States. There's a slight time zone problem. So I always watch it in the developer app, but I don't know if the developer app would have given it to you live. I don't know. But in any case, that was the time slot I had dedicated to it, and I have not had a minute to myself that I could do it in ever since. So I will probably never watch it, which makes me sad, because I wanted to, but I know me. Well, the great thing is it's always there, and it's chapterized, so you can jump around and easily skip. But that's the problem. If it's always there, Bart, I'm not going to do it. It needs to be never there; I need it to only be there when I've signed up for it. Yeah. Yeah, exactly. Oh, well, okay, tell Steve to put a thing in your calendar at a random time: you must sit down and watch this. You must sit down and watch it now. Yeah. The Apple TV, by the way, is a fantastic place to have the developer app, because it's just the perfect thing. You can sit back on the couch and actually watch them in the developer app, and it'll remember your place and stuff.
And can you do it on the developer app without a developer account? You absolutely can. If you have an Apple ID, you can just sign in, and then it will sync your video position and your bookmarks between your Mac and everything else. You don't have to, though. You just install the app and you can watch every single presentation, all of them. You can even bookmark them, but they're local bookmarks because you're not signed in, so they'll only exist on the Apple TV, which is probably fine, right? But they've really lowered the barrier. Is there a developer app inside the Apple TV app on the Mac? Sorry; well, the developer website, if you sign in, will sync all your stuff. So there's a developer app for iOS, there's a developer app for Apple TV, and there's the developer.apple.com website for the Mac. If you sign in with any Apple ID, you can do it for free. You don't have to pay a cent. And they've actually made the betas free now as well. Even the developer betas are now free. So there's no need to pay any money unless you want to publish an app to the App Store. It's all free. Are you recommending that anyone put the developer beta on their production machine today? No. No, I bloody well am not. It's called a developer beta. And even when it goes to a public beta, yes, the word public is there, but there is also a second word: beta, or bayta, I don't know how you pronounce it. Don't install it on a production machine, definitely not one your income depends on. There you go. All right, well, Bart, now I dare you to figure out how you're supposed to sign off. Oh, I'm definitely going to say happy computing on this one. I mean, wow, this is the ultimate happy computing. The future of computing has been reinvented. How much happier does it get than that? I think I've ruined my ending. But anyway, until next time, when we're back to business as more usual, happy computing. You're going to find different ways to contribute.
If you'd like to do a one-time donation, you can click the PayPal button. If you want to make a recurring contribution, click the weekly Patreon button. You're only charged when I publish an episode of the NosillaCast, which, let's face it, is every single week, so I don't charge Patreon for Chit Chat Across the Pond Light or Programming By Stealth episodes. Another way to contribute is to record a listener contribution. It's a great way to help the Nosillacastaways learn from you, and it takes a little bit of the load off of me doing all the work. If you want to contact me for any reason, you can email me at allison@podfeet.com, and I really encourage you to follow me on Mastodon at @podfeet@chaos.social. Maybe you want to talk to the other Nosillacastaways. You can do that in our Slack group at podfeet.com/slack. Thanks for listening, and stay subscribed.