already here for another amazing Q&A session. So, as usual, I'm going to ask Ashima my icebreaker question. And the icebreaker question is: is there anything interesting in your surroundings, an item or maybe something else, that you want to share about you? I think I would like to share this: this is my workout mat, but it's made of wood, like straws, so it's an insulator. You could work out on a really cold floor and it would still remain warm, and it would remain cool on a warm floor. So it's a great way to work out in places that have huge temperature fluctuations. So yeah. Nice, nice. I also saw it was nicely decorated as well, colorful and everything. Yeah. Cool. I don't see them around here where I live, but it would be nice to find a shop where I can get one. Maybe Amazon has them. We'll see, we'll see. We'll put the link. John? Yeah, I just think I could use one of those myself for a bit of yoga in the morning when the floor is cold. Chris, have you got any curios you'd like to share with us? Yeah, I've got a bunch of crystals on my desk. If you listened to the podcast I did, I mentioned that I've gotten into rockhounding as a result of my child. I've never been that into... I mean, obviously crystals are cool, but when you find your own crystals in nature, it's just this surreal experience. Like, wow, I dug that out of the ground. Look at this perfectly shaped, you know, thing. So yeah, I've arranged the camera so that you can't see too much of the mess, but there are rocks everywhere in the rest of my room right now, outside the cone of, you know, observability here. Excellent stuff, thank you very much. Yeah. Yeah, that's really cool. How would you suggest I choose between Oz, Viz.clj, and Hanami, or other options, for data visualization?
I guess the answer I gave was basically: Hanami would give you the most customization possible, and if you want to go in depth, then that is definitely the option to choose. But if you're just starting out and you have never done any data visualization in Clojure, I would recommend starting off with Viz.clj. And like I said, we have these options to substitute, to create Hanami substitutions, so that will help you gradually learn Hanami as well, along with using Viz.clj. So Viz.clj would fulfill your initial requirement of creating small charts, and then you're slowly learning Hanami in the process. Other than that, we also have tech.viz, which has certain, you know, specialized plots. If you have those needs, for example a colorful histogram or something, you can check that out. And if you don't want to use any browser-based options, then I think cljplot would be the one to go with, which is right now in progress; Tomasz is building it. So yeah. Okay, thank you. Can I add a couple of points to that? Of course. Yeah, great. So yeah, I think that sounds great. As far as Oz is concerned, we did a workshop on this just earlier this week, and I think there's kind of a challenge with Oz in that it has evolved into kind of a Swiss Army knife. It's got a lot of different things that it can do, and so I think a lot of people aren't necessarily even aware of everything it has available. It is, you know, a set of Reagent components which have now been made more flexible as far as how you get data in and out, so you have access to the Vega view API for inserting or changing data sets. We've also been working on things similar to Notespace and Clerk with our namespace-as-a-notebook functionality. And we just released version 2.0 earlier this week, which has some really cool stuff to check out.
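To make the Hanami substitution idea above a bit more concrete, here is a minimal sketch of filling a Hanami template with substitution keys. It assumes the `aerial.hanami` library is on the classpath; the data values are made up for illustration:

```clojure
(require '[aerial.hanami.common :as hc]
         '[aerial.hanami.templates :as ht])

;; hc/xform takes a template and a set of substitution keys
;; (:DATA, :X, :Y, ...) and returns a plain Vega-Lite spec as
;; ordinary Clojure data, ready to hand to any Vega-aware tool.
(def point-spec
  (hc/xform ht/point-chart
            :DATA [{:x 1 :y 2} {:x 2 :y 4} {:x 3 :y 3}]
            :X "x"
            :Y "y"))
```

Because the result is just a data structure, this is the sense in which you can "gradually learn Hanami": start with a prebuilt template and swap in more substitutions as you need them.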
We're actually evaluating forms in parallel now, not just building the dependency graph and then computing one after the other; when we see that two forms are independent, we can compute them in parallel. And I'd say the other thing that really stands out with Oz is that for a long time now it's been designed with the goal of being able to produce static documentation or scientific documentation. So PDF output is on the horizon; we've been designing around that for a while and are putting it in place. But also if you want to build a static website, like a blog, or just produce a data science report, you can use the notebook functionality of Oz and then spit out a static HTML document that looks really nice. So just to cap all that off: that's Oz specifically, right? It's a Swiss Army knife, use it for what makes sense for you. I feel like it's a good first tool to grab if you think you might want to be doing a bunch of these things, because then you have everything there. But the other thing I'll say, and I mentioned this in the talk, is that it's really fantastic that as a community we've converged towards Vega and Vega-Lite as data visualization languages, because it means we've got a lot of interoperability. You're not locked into any one of these tools as long as they're all using this underlying language. So for example, you could take Hanami's templating mechanisms for more concisely describing data visualizations and use them with Oz, or, as you mentioned, with Viz.clj. So yeah, the answer is: use everything. Yeah. And Ashima, it was also suggested that you publish the docs on cljdoc.org, and you mentioned that. You tell me. Yeah, I was not aware of that site, so I'll definitely check it out, and if it suits our needs, then I would definitely use it.
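As a sketch of the interoperability point, here is a plain Vega-Lite spec written as Clojure data and rendered with Oz; the same map could equally be produced by Hanami's templates or consumed by Viz.clj. It assumes the `metasoarous/oz` dependency is available, and the data values are invented for illustration:

```clojure
(require '[oz.core :as oz])

;; A Vega-Lite spec is just data; nothing here is Oz-specific.
(def spec
  {:data {:values [{:lang "Clojure" :count 30}
                   {:lang "Other"   :count 20}]}
   :mark :bar
   :encoding {:x {:field :lang  :type :nominal}
              :y {:field :count :type :quantitative}}})

;; Interactive use from the REPL: opens a browser window.
(oz/view! spec)

;; Static use, as discussed above: spit out a standalone HTML page.
(oz/export! spec "chart.html")
```

The same `spec` map could be the output of `hc/xform` from Hanami, which is what "use everything" means in practice.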
All right, thank you. John, any more questions? Yeah, I've got a few questions for Chris. Actually, while you were talking about Oz, I was just curious: does Oz implement the complete Vega-Lite specification? And is it just Vega-Lite, or is it Vega as well? What's the situation there? Yeah, so it's just wrapping Vega and Vega-Lite. Again, it's complicated to describe what a Swiss Army knife is doing, because sometimes it's doing different things. But in the standard sort of usage, where you're at a REPL in a Clojure process and want to visualize some data interactively, it opens up a web browser and you can interact with a Vega or Vega-Lite visualization there. So it's just using the embed API to embed in a page. But if you're building statically compiled output, if you're building a web page, or, hopefully in the future here pretty soon, a PDF, it's going to use one of two mechanisms to try to statically compile an image. And incidentally, you can just use Oz to statically compile an image by itself, outside of the context of scientific documents; if that's what you want, that is something you can do. But to do that static compilation... let's see, who worked on it? I want to say Jack Rusher and, what is it, I'm forgetting the name of the organization that did this work, but Darkstar was the name of the project that was used. Applied Science? Yes, yes, thank you. That's using Graal to compile Vega and/or Vega-Lite, so that actually from within the JVM now, using Graal to interpret, or whatever, compile the JavaScript in the Vega libraries, it's able to compute at least the SVG.
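A rough sketch of the Darkstar route described here, assuming the `applied-science/darkstar` dependency (which bundles the Vega JavaScript libraries and evaluates them on GraalVM's JS engine) plus `org.clojure/data.json` for turning the spec into a JSON string:

```clojure
(require '[applied-science.darkstar :as darkstar]
         '[clojure.data.json :as json])

;; Darkstar takes a Vega-Lite spec as a JSON string and returns SVG
;; markup, entirely inside the JVM: no browser, no Node process.
(def svg
  (darkstar/vega-lite-spec->svg
    (json/write-str
      {:data {:values [{:x 1 :y 2} {:x 2 :y 1}]}
       :mark "line"
       :encoding {:x {:field "x" :type "quantitative"}
                  :y {:field "y" :type "quantitative"}}})))

(spit "chart.svg" svg)
```

This is the mechanism Oz calls out to for SVG output; the npm Vega CLI fallback mentioned next covers the cases Darkstar doesn't.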
I think that if you want PNG, which is sometimes better if you have bigger data, because you don't want an SVG with too many nodes, that could be a little bit problematic, then you might still have to install the npm Vega CLI to get it to work, which is the other mode. So Oz will use the npm Vega CLI as a fallback, but if you want SVG output, it will call out to the Darkstar library to produce that. Darkstar, that always reminds me of the movie. If you haven't seen the movie, it's worthwhile watching; it's a classic. I don't know if I have. It's a classic old one, one of the early sci-fi movies out there. Amazing movie. Yes. Nice. Cool. And so, are there any good tutorials to help people get into using Oz, if it's such a big Swiss Army knife, or should we look at the workshop you did earlier this week? Yeah, honestly, in the workshop we ended up getting into the weeds a little bit. My goal was to really try to do a lot of demonstrations, and unfortunately, well, unfortunately and fortunately, I mean, whatever, it was actually great: people ended up being a little more interested in asking questions about how things worked and kind of future-direction stuff. So not necessarily, but I do want to put out soon a better, full video that outlines and details all the things you could do, or actually maybe a bunch of separate videos that show the different features: one that gives you an overview and some that go into the details. Okay, that sounds good. Yeah, if you want any help with that, I'd be happy to get involved as well. Oh yeah, that'd be fantastic. Yeah. Cool. All right, great. I'm kind of curious if we could switch back to Polis a little bit.
I know I don't really want to air your dirty laundry, but I'm kind of curious about the original tech stack, and perhaps how you want to evolve it, and some specific parts of this awesome potential of Clojure data science that you want to unleash in this project as well. What kind of thoughts do you have on that? Yeah, absolutely. So one of the biggest things that I really want to do with the code base, and this is kind of funny, is... at some point we kind of reworked the code base, not reimplemented it, but we set up Stuart Sierra's Component library for dependency injection and lifecycle management, which seemed like a good idea at the time. I mean, it's kind of what everyone was using, and we wanted to be able to start and stop the system cleanly. And certainly it solved that part of the problem, but it came at the cost of making the code base a lot harder to work with, which I think is very problematic, especially from a data science perspective, because when you're designing software to be software, when you're doing software engineering, you can tolerate a little bit more cantankerousness, shall we say, in exchange for a cleaner system design where you can have a test environment running at the same time as your development environment. I mean, Component can do a lot of stuff, right? But I've recently come to the conclusion that for the kind of interactivity you want with a data science project, in order to dig into a namespace and start evaluating stuff, having to muck around with finding the right component from the system map and then sticking that into the right function call every time is just really kind of annoying. And so I'm thinking it would be great to replace it, and I'm interested in considering Integrant, but I think probably Mount is kind of the...
...seems like the simplest, or maybe it's the easiest, I'm not sure, but it seems like a very straightforward solution that solves the problem of making things more usable from the developer perspective. The other big thing, and this is a major thing, again, I touched on this briefly, but we implemented it using core.matrix, which at the time seemed like a great idea, because core.matrix has this really modular design where it's built against protocols, so you can actually implement the core.matrix API using whatever underlying data types. That felt like a really powerful feature initially, but since then... and I'm not someone who's deep in the performance and numerical computing side of things; that's not necessarily my bag. I'm more experienced with the algorithms and the analysis side of things, not necessarily the nitty-gritty of how they get done. But I recognize that a lot of the Clojure community has moved more towards the tech.ml stack, and that really buys us a lot: it solves some very real, pressing problems with performance, I think, and also, again, gives us the opportunity to interface more with Python, with R, with the rest of the data science sphere in other ecosystems. So that's a transition we'd like to make. So those are the two biggest things that come to mind. Right now I'm going through a very long, old issue queue, and if you go back to the talk, the link that I shared to our repo, you can find the GitHub project on that page, and that's where I'm starting to break things down a little bit. You can search by the math or clojure labels to find Clojure-related issues, and there's a priority column there, so you can take a look at that to see what's pressing in our minds. Excellent, thank you. Yeah, okay, we have another question for Ashima. I think she's answering straight away in the channel, but we'll get her live here.
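For contrast with the Component-style system map described above, the Mount approach being considered might look something like this minimal sketch. The database functions here are hypothetical placeholders, not anything from the actual code base:

```clojure
(require '[mount.core :as mount :refer [defstate]])

;; Hypothetical stand-ins for real resource functions:
(defn connect-to-database [url] {:conn url})
(defn disconnect [db] nil)

;; With Mount, stateful resources live in ordinary vars, so from the
;; REPL you can refer to `db` directly instead of threading a system
;; map through every function call.
(defstate db
  :start (connect-to-database "jdbc:postgresql://localhost/example")
  :stop  (disconnect db))

(comment
  (mount/start)   ; starts all defstates in dependency order
  (mount/stop))
```

The trade-off is exactly the one described: you give up some of Component's explicit wiring and multiple-system flexibility in exchange for REPL-friendly, directly referenceable state.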
So since Viz.clj doesn't have the Swiss Army knife dimension, does it have a notion of when it will be done? In other words, where will Viz.clj be going, and what do you want to do with it next? So I think it will still be in the alpha stage for at least a few more months. Usually we decide our direction based on the feedback we get from users; that was the process we decided to follow when we started building it, and it has worked very well for us so far, so we would like to continue with that. Just showing it in the study sessions and watching how people react to it, that would probably be the main deciding factor. There are a few more things we would like to explore, some specific use cases, such as the time series data visualization I mentioned, so maybe we'll also look into that, yeah. All right, fantastic. So let me thank Ashima again for her awesome talk and her effort, and Chris Small again for his talk and for being with us today, answering all these questions. It was amazing. Thank you very much.