So, I have written this file count_heads.jl, and I am going to walk through it; for readability, you can use it or not. How does this @time work? Does it do some kind of proxying? Right, I have been using @ without saying what it does. @something means it is a macro. Julia has full macros, like Lisp or any other metaprogramming environment, so you can read arbitrary Julia code, modify it, and do all that cool stuff. @time is basically a macro that takes an expression, puts a timer around it, and reports the result. It is a very convenient way to modify behavior. Sorry, can you repeat the last part? What does the @time macro do? The @time macro takes the expression and evaluates it inside a timer. We also have MATLAB-style timers, which are convenient to use: that is the easiest timer, you just put tic() and toc() around anything and it is good. But @time is even better because it gives the answer back, and @elapsed gives you the actual value of the elapsed time back, I think. When you say expression, what is the constraint around the expression? Can it be anything? Anything. Correct. So I can say this, and I can create a compound expression, or it can be 1 + 1 for all you know. A lot of our stuff works like this: tests, for example. You can say @test something, and it works by taking the expression, evaluating it, and seeing if it turns out to be whatever the macro expects. I think we will do a couple of examples of macros; it is really interesting to see how you can play around with the code. 
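The timing macros discussed here can be sketched in a few lines (a minimal sketch; the exact printed output varies by Julia version):

```julia
# @time evaluates an expression inside a timer, prints the timing,
# and still returns the expression's value.
s = @time sum(rand(10^6))

# @elapsed instead returns the elapsed time in seconds as a value.
t = @elapsed sum(rand(10^6))
```

Both take an arbitrary expression, which is exactly the point being made: the macro receives the code itself and wraps it.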
But for current purposes we are trying to do a parallel for loop. This is the inner part of the count_heads loop; I am just running it in parallel here. What might also help is a quick overview of available packages. I will do that, and a sense of the general direction in which most of these packages are getting built. I will do a little bit of ccall, then packages, and then we should be able to wrap up. ccall is for anything external. External libraries, yes. This syntax, like I said, anything with @ is done by a macro. @parallel is actually just a macro: it takes an expression and runs it in parallel. It will scan the expression for a for loop, look inside the for loop, split up the iterations, and run them on different processes. All of this is implemented in Julia itself, and in very few lines of Julia as a matter of fact. You were asking about the plus: the plus is just the reduce operation. Think of it as a map-reduce. The body of the loop is the map step, and the plus says how to combine the answers, because at the end of the day I will assign the result to something. The plus is just saying add up everything from every iteration. Does that make sense? Every time you call this function randbool() it generates a random true or false value, and the plus just says what to do with every value that comes up: add them up. Can it be any function, or does it have to be? It can be any function, any reducer that you want. So could it be, say, something just doing a summation of integer values? It could be. Yeah, you could do this. Okay, let us just do the first one, the one I was doing. 
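A runnable sketch of the coin-flip reduction just described. Note two era differences, named plainly: in current Julia the macro is `@distributed` (in the `Distributed` standard library) where this transcript's Julia used `@parallel`, and `randbool()` has since become `rand(Bool)`:

```julia
using Distributed
addprocs(2)   # start two local worker processes (adjust for your machine)

# Map-reduce: the loop body is the map step; (+) is the reducer that
# combines each iteration's last value, locally and then across workers.
nheads = @distributed (+) for i in 1:1_000_000
    Int(rand(Bool))   # one coin flip per iteration, counted as 0 or 1
end
```

With a million flips, `nheads` lands near 500,000; any associative reducer can stand in for `(+)`.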
Let me first do this loop without anything else. Okay, this probably is not a good idea. Sorry about this; I think this is a bad example. It is great to have to look at the manual. I thought I was looking at the 0.1 version of the manual; it is the latest one. So, the same error that Rabu pointed out. That talks about something else. The implementation is so comprehensive that you do not have to remember everything; you just go and do a lookup rather than remember. We are not in a parallel setup here, so obviously it is not like that. So, we will say @everywhere. Why does this not work? Okay, this is an example that does not work anymore, and I need to get to the bottom of it. Oh, okay, it is adding Bool. randbool is a function. But now it says there is no + method out here; it is adding Bool and Int. I have not imported all of these on the remote processes; for the remote processes, too, I have to import all these. So, @parallel is a macro which, like I said, takes + as the reducer. Now, let us make this a little bigger. Let us just go with the example here, which has been calibrated: it is 200 million iterations. I think this is actually going to end up slower than one processor, because I am running too many, perhaps; I do not know yet. We will see, because we do @time. Can I pass in segments of tables that I want to work on? You can pass anything you want; it is all just memory. But it will end up broadcasting the data and so on; you have to be careful and think about the underlying communication it will impose. If you have a large amount of compute it may be okay to broadcast even a large amount of data, but if it is like 20 GB then you may not want to broadcast, and you have to rethink how you organize your computation. It uses the usual serializers, right? Yeah, it automatically does that. 
Because it is running in parallel, it is already serializing this; it would otherwise not know how to send the body of the loop. You said even user-defined types are serialized? Yeah, anything that is fully in Julia gets serialized. If you have some arbitrary C pointer embedded inside, which happens a lot in Julia, then you are stuck. Okay, we should have done this before. The way to do this systematically is... Does the @parallel macro behave the same whether you have many running processes or just one? That is what I am going to check just now; I do not know the answer, though. So, now I am running with one processor, and the idea is to add one more processor, try it, and see if it speeds up or not. That would be the ideal. Or maybe it is not doing anything. Does each iteration of the for loop return the last computed value? Yes, that is a good question. In any block of code, not just a for loop, the return value is the last expression. So, if I do a conditional evaluation, an if block and an else block, and the else block executes and you assign the result to something, the assignment is always whatever the last expression that gets executed. In the for loop, it is this one. In a function, you do not necessarily have to return something. That I understand. But in this case the plus has to get each and every iteration of the for loop. What happens is that it will do the plus locally on each worker and then do the distributed plus across the processors. So, I will now do addprocs: we will add one more processor and see if it helps. I will start a new session, and this time I add those processors first. I might just run it again to make sure there is no JIT compilation effect coming in, although that should be pretty minor. 
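The last-expression rule just described is easy to see directly (a minimal sketch):

```julia
# Every block's value is its last expression.
y = begin
    a = 10
    a + 1        # the begin-block evaluates to 11
end

# Same for if/else: the executed branch's last expression is the value.
z = if y > 10
    "big"
else
    "small"
end
```

This is what the reducer consumes in the parallel loop: the last expression of each iteration's body.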
So, one processor is almost 13 seconds, and this one is 7 or 8 seconds. Not bad for doing zero work to parallelize it, right? If I add one more, I think I am not going to gain anything, so I probably should not even bother, because I have only a dual core. So, this is @parallel. Similar to @parallel we have pmap. Doesn't that machine have four cores? It is a Core i5. No, it is two, I am pretty sure, with hyperthreading, so it shows four logical cores. But I would not expect hyperthreading to give any major improvement on a compute-bound job. Still, we got some 10 to 20 percent. Not bad. We have one more trick, actually. Just like @parallel, we have pmap. It is almost the same, slightly faster. So, typically... oh, by the way, I have not shown help yet. This may be a good time to try it. There is very preliminary help available in Julia; it is usually one line. I like to say that it does not overpower you with help, it just gives you enough. But yeah, we have work to do on our help. So, that is the syntax of the map function: it takes the function you want to apply and the data you want to apply it to, and it just does f(x) over every x. Just like map, there is pmap: if you have a map call, you can do a pmap and get parallel execution. If you try arbitrary stuff, you will probably find things that do not work; those are good things to report bugs for. pmap may not even have a help entry. Oh, it does. Okay. So, there is the pmap interface, and there is the parallel for loop, which is kind of like your OpenMP-style interface, right? 
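A minimal pmap sketch (with no workers added it simply runs the calls locally, so the shape of the API is visible either way):

```julia
using Distributed

# pmap is the parallel analogue of map: apply f to every element,
# farming the calls out to any available worker processes.
squares = pmap(x -> x^2, 1:5)
```

pmap suits coarse-grained work where each call is expensive; for cheap per-iteration bodies, the reducing parallel for loop above is the better fit.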
I mean, in OpenMP you put similar pragma-style annotations, right? Right. You could say it is a directive, because it is a macro that changes the behavior of the program, just like directives do. So, that is about as much as I wanted to talk about parallel programming for now. You will find that this is actually good enough for a large number of use cases. There will be use cases where you want distributed arrays, or distributed data structures where each node computes something, gathers state as it goes, and then you want to work on the shared state and all that. The capabilities are all there, because, like I said, with remotecall and fetch you can pretty much do anything in Julia. But the goal is higher and higher levels of abstraction, stuff like @parallel, and we will keep building more abstractions like this to make these things more useful and user-friendly. This part of the manual is worth a good read, so you should try it out if you want to know more about the parallel stuff. I will quickly now go to calling C and Fortran code. It is my favorite part. No, the compiler just does this once: the first time it will do a dlsym and then store the result of it, so it is effectively doing something like this; it will not happen every time. ccall is actually special; it is not like regular functions. It is deeply embedded inside Julia, and Julia knows a lot about the structure of what it is doing. In fact, in this case, I bet we did not even need to give it libc: if the symbol is available in your library or in the program, you can just call it like that. And this is the output type, and this function takes no input, so I just have nothing there. You still have to say something: an empty tuple. Otherwise the arguments will not match. 
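The zero-argument case looks like this; `getpid` is used here as an assumed POSIX libc symbol, standing in for the function shown on screen:

```julia
# Even with no inputs you must supply the (empty) tuple of argument types:
# ccall(symbol, return-type, (input-types...), args...)
pid = ccall(:getpid, Cint, ())
```

No compilation step, no wrapper to build: the symbol is resolved once and the call is made directly.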
You mean, can I say Int32 there? Won't it become Int64? Int is Int64. The question is, is it really a conversion? Yeah, it is probably converting it; the bit representation will remain the same if the number is small enough. And this :getpid notation: that is a symbol, similar to symbols in Lisp or Ruby. You could have done it either way: it does not have to be a symbol, you can just give it a string, and the string can be constructed using arbitrary string construction operations. Does this affect how Julia does type inference, like when it is callinging a Julia function? Here it knows the input types and output types, because they are explicitly specified by the user, so calling a C function does not mess up the types in the rest of the code. What if your C code returns void* and stuff? Then you have more work to do. Let us take a slightly more complex example. I do not know if one is available that easily. Okay, this is not a bad one: the getenv function. I am not going to run it, I guess, because we all know it; I just want to highlight the syntax. It is the same: if the function takes input arguments, you give it a tuple of the input argument types. Here I am saying it is a pointer, a char*. No, sorry, let me take that back: this is the output type. The output of getenv is a string, so you get a char* as the output. The input is a char*, which is the environment variable you are looking for, and this part is actually passing the value. So, I take my getenv call; the input is a variable name. 
I put a variable name in here and then I run it. I do the ccall and the answer comes back. Is that comma extra? No, because this has to be a tuple: if you do not put a comma, it will not know it is a tuple. There is a difference: this is just one, this is also one, but this is a tuple of one element. The parser will strip the parentheses if it is just one element. There are other ways of specifying tuples, but this is the shortest. So, this is all it takes. Notice I did not compile anything: as long as the shared library is available and I can dlopen it, I can call it from Julia. Now, these are simple examples, and you will ask: what if I have to pass a struct? A pointer to a struct? What if the return value is a pointer? What if my header file is long and complex; do we have to type all this stuff by hand? The answer to most of these is no, you do not have to do any of this by hand; there are easy ways to make them work, and I am going to quickly describe them rather than go through it all. We can go into details later. Immutable types in Julia map to C structs: if you have an immutable type with the same structure as the C struct, you can just pass it and it will match exactly, which is very convenient. If you have callbacks, C functions calling back into Julia functions, you can register Julia functions for that as well, with cfunction; that is the way to create callbacks. The coolest thing, though, is a package called Clang.jl, which is based on the Clang compiler. Clang has a library called libclang through which you can analyze C source code. Isaiah Norton, a guy out of Boston, has written a package which can take a header file, parse it, and generate basically all those ccall lines for you. 
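The getenv walkthrough, as a hedged sketch in current syntax (that era's `Ptr{Uint8}` and `bytestring` are today's `Cstring` and `unsafe_string`); note the trailing comma that makes `(Cstring,)` a one-element tuple rather than a parenthesized type:

```julia
# Output type first, then the tuple of input types, then the argument.
p = ccall(:getenv, Cstring, (Cstring,), "PATH")

# getenv returns NULL for an unset variable, so check before converting.
value = p == C_NULL ? nothing : unsafe_string(p)
```

Asking for an unset variable is exactly the null case seen in the demo; a wrapper function would normally turn that into an error or `nothing`.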
So, you do not have to create them manually; you just call them. Of course, it is a work in progress. You will then ask: what about enums? What about defines? I believe all of that is doable, but the work is currently at a stage where it is not fully done; it is already quite usable, though. As an example, the WK library, which you can imagine has dozens of functions, was wrapped automatically using Clang.jl by one of my colleagues here. Unfortunately, he is not here today. So, this is what I wanted to show with the C stuff. Again, there is a very good blog entry on all the capabilities of calling C libraries. The last five or six blog posts are all really good introductions. One of them is an explanation of efficient immutable types in Julia; it is a nice little blog post. This is just some videos. Distributed numerical optimization takes a numerical optimization problem, but it shows you how to do asynchronous parallelism. In fact, let me just show it. This is the code for distributed parallelization in Julia. It looks kind of scary, but on the other hand, for running an asynchronous parallel job in distributed memory, not bad. I think that one was the synchronous version; this one is the asynchronous version. I just want to point it out rather than run through it. There are some performance numbers at the bottom: this was done on EC2 and gave good scaling and all that. You may wonder how you can get 105% scaling, right? Any guesses? Yes: as we add more machines, we get more cache, so sometimes we get superlinear speedups. Just a little over 100. The other one I wanted to point out is passing Julia callback functions to C. This is actually a really good blog post. 
For those of you who know anything about the scientific world, I guess some of you know about FFTW, the fast Fourier transform library. Steven Johnson is the author of FFTW; he also contributes a lot to Julia, and this is a blog post written by him. It shows how to call the C qsort function. I believe you never actually want to call C's qsort because it is quite slow, I am sure everyone knows this, but if you wanted to, it is an interesting case because it takes a callback for comparison, and this example shows how to work with the callback. So: passing Julia callback functions to C. If you imagine you have some complicated C library and you just want to pass in a Julia function to do some small amount of work, you can actually go ahead and do that. Is it a pointer to T? Well, T is a type parameter. What is T, any type? T is the type parameter of my comparison function. If you give it a double*, T will be one thing; if you give it something else, T will be something else. That is generic programming again. It is a very long blog post; I am not going to go through it right now, but you can even do closures, you can do all kinds of stuff here. And in a couple of really cool recent blog posts, you can actually create entire GUIs in Julia, with buttons and menus and drop-downs, by calling the Tcl/Tk libraries from Julia itself. So, that is a quick primer on the capabilities of calling C libraries from Julia. Everything you can imagine you would like to do with C libraries is actually possible; that was the limited point I was trying to make here. Okay, the last thing I will look at: let me just point out the packages. 
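The qsort pattern from that post, sketched with today's `@cfunction` (the post predates this macro and used the `cfunction` function instead):

```julia
# A Julia comparator with the signature C's qsort expects.
mycompare(a, b) = a < b ? Cint(-1) : (a > b ? Cint(1) : Cint(0))

# Turn it into a C-callable function pointer; Ref{Cdouble} means qsort's
# double* arguments are dereferenced to values for us.
mycompare_c = @cfunction(mycompare, Cint, (Ref{Cdouble}, Ref{Cdouble}))

A = [1.3, -2.7, 4.4, 3.1]
ccall(:qsort, Cvoid, (Ptr{Cdouble}, Csize_t, Csize_t, Ptr{Cvoid}),
      A, length(A), sizeof(eltype(A)), mycompare_c)
# A is now sorted in place by the C library, via the Julia callback
```

The C library drives the loop while Julia supplies the comparison logic, which is the whole callback story in miniature.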
So, if you go to docs.julialang.org; actually, let me start from there to show you all of it. docs.julialang.org has all the documentation. Some people have very interesting UI designs: it gives you the release-0.1 documentation, and if you want the latest documentation, you would never have guessed how unless I told you. You have to do it like that, or you bookmark it; there is a little button at the bottom which switches between versions. The key documentation is the Julia manual, which many of you have already seen, the standard library, which is a reference for all the documented functions, and then this thing called available packages. These are all the packages available in Julia. Some of them are very well developed, some are not, some are orphaned, and for some the author will start working again if you file an issue. The thing about it is that it is people like yourselves working in their free time, so depending on how much use a package gets and how much the author himself needs it, the quality varies. If you look at this list, I cannot even describe all of these things. There are utility packages, stuff like ArgParse and B-splines; a Benchmark package to just run a bunch of runs and benchmark things; some bio stuff; Bloom filters; machine learning stuff. Cairo for drawing, you know, Cairo surfaces. Calculus; Calendar, which we looked at; Catalan, which is number-theoretic functions. ChainedVectors is actually interesting: it is something that grows your vector in large chunks, depending on the data that you read. ChainedVectors is used heavily by the HDFS package in Julia. 
Say your block size is 64 MB, but the data you are reading spans three block boundaries. How do you handle it? You grow your vector by allocating chunks and chaining them, and it overloads indexing to make it look logically like one vector. Very cool trickery inside, but a very nice interface outside. Is that a new package? Yeah, that one was developed specifically for this. It has actually been done by a friend of mine here in Bangalore, Tanmay Mohapatra, and it is under active development. Clang is the one I mentioned that generates automatic wrappers. Clp is a linear programming library, a very high-quality open-source one; you may have heard of GLPK, for example. Clustering: this is your typical k-means and that kind of stuff. Codecs is compression and decompression. CoinMP is integer programming, I believe, if you have a need for such a thing. Continued fractions... I mean, this list just goes on and on. Curl... no, that is not the one I want. DataFrames: this is a good one. The DataFrames package alone has this many contributors, so you can imagine it is a pretty serious package; probably the most serious package in Julia in terms of work and contributions. Often when you go to a package, say DataFrames, and click on it, it will have a quick README which gives you an example of how to use it. Harlan Harris is a well-known data scientist from, I forget, the New York or Washington area, and he is the guy who started this. This is the kind of stuff it can do; maybe it looks a little familiar to people who have done a bit of R. You have your columns and names here, and you can index by name and so on and so forth. What else do we have? DecisionTree, another commonly used machine learning kind of thing, right? 
Give it data and it will give you a decision tree for how to classify it. Devectorize is an interesting package. One of the criticisms of Julia is that vectorized code performance tends to suffer sometimes. In every other scientific environment, vectorized code is fast and devectorized code is slow; in Julia, it is the other way around. The reason is that we just have more work to do on our memory management; it is not that the language has inherent issues. Devectorize is a package that can take a vectorized expression; it is a macro, actually, so it can read any vectorized expression and spit out a bunch of nested for loops which devectorize the vectorized operation without allocating any extra memory. You often get huge performance gains. It is really cool programming and a nice use of the metaprogramming facilities. DICOM is a medical image format you may know of; a pretty horrific file format, if I might say so. The support is not complete, but it is there. DimensionalityReduction: this is your principal component analysis, matrix factorization, and stuff like that. Distributions is an important core package: basically all the probability distributions and associated functions, you know, CDFs, PDFs, random number generators of various kinds for various distributions. I will just call out the more commonly known ones, because I do not know all of these myself. FITSIO is an astronomy file format, if you have need for such a thing. GLM is the standard workhorse, you know, generalized linear models. GLPK is the well-known GNU linear programming kit. GLUT is the OpenGL stuff. GSL is the GNU scientific library. GZip interfaces. Gadfly is a very cool graphics package for Julia, still in the experimental phase, but I actually do want to show you this one. 
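What the Devectorize macro generates can be written by hand; a sketch of the transformation (in the Julia of that era, each vectorized operation allocated a temporary array, which is exactly the cost being removed):

```julia
x = rand(1_000); y = rand(1_000)

# Vectorized form of the computation.
r1 = 2 .* x .+ y

# Devectorized form: a single loop, one output array, no temporaries.
r2 = similar(x)
for i in eachindex(x)
    r2[i] = 2 * x[i] + y[i]
end
```

The macro reads the vectorized expression at compile time and emits the loop below automatically; the two results are identical.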
So let us let it load for a bit. Gaston, some of you are using it; it is a gnuplot wrapper. MCMC is Markov chain Monte Carlo. This is the Graphs package, a bunch of graph libraries out here. Let us just open that. Sorry? Someone created an interface to create charts with the Google Charts API. Gurobi is a commercial solver package; I do not know exactly what it does, maybe probably linear programming. We have some really cool linear programming guys, MIT operations research guys, using Julia, so we have a lot of OR. It all tends to be biased by the people who use it, and as more communities come, more kinds of packages show up. How many of these packages will go from a contrib kind of mode to actually becoming part of the core, if you had to give out distributions? Actually, our goal is not to bring in any of these. Maybe one or two will become core, like the profiler; the profiler may become part of Base. It is unlikely that we will take any of the others into Base. Our plan is what the SciPy folks do in Python: they have this tool called Anaconda, which takes a bunch of scientific Python packages and bundles them up into one distribution. Julia Base is already quite big if you look at the capabilities it has; we do not want to make it unwieldy, and it will also be very well supported: long-term support, documentation, bug fixes, all that stuff. All these other packages will evolve, and most likely there will be a number of distributions. There may be a mega-distribution which has everything, but there may also be a distribution for OR people which has the OR-related packages all bundled up and tested. That is how I think this is going to evolve. But the general use case will be that there is a base Julia, and you do Pkg.add, get your packages of interest, and work with that. 
That is not good for deployments, though. Right, so there are two concerns: one is deployment, and the second is people who just want to quickly use things and not worry about the amount of package wrangling involved. My guess is, once things stabilize, and I think we are about three to six months away from the base stabilizing, the packages will start stabilizing themselves, because the largest changes are in the APIs. It is not functionality: bugs are usually rare; it is APIs that change. Once things stabilize there, you will not have this constant need to update and refresh, and you will be able to just use the version that comes with your Debian distribution or something. For deployment, the plan is to make easy deployment tools, so that when you have tested everything with a particular set of versions, like you were saying, like a jar file, you just create a bundle and stick it out there, so you do not get into this versions-and-dependencies business. Anything else... There is a very cool libCURL wrapper for doing HTTP client kind of stuff, and this one I would like to call out. Where do I see libCURL here? Well, Curl is different; that one needs to be deprecated. This one, LibCURL, I think. Yeah, this one. It is by another guy from Bangalore, actually, and it is a pretty high-performance library that wraps around curl. HDF5: if you are a MATLAB user, HDF5 gives you the ability to read and write binary MATLAB files. HDFS is your Hadoop file system interface, for parallel computing with data that is in HDFS. JSON, you know, for parsing JSON. LibSVM is a support vector machine library; machine learning stuff. MATLAB is an interface from Julia to MATLAB; someone wrote a Julia-to-MATLAB calling interface. Some finance stuff: MarketTechnicals. MLBase is machine learning. There is a Mongo interface. 
I do not know if it is really well tested, but it exists; it uses the Mongo C API. ODBC is for your databases... actually, I think one thing really missing is a libmysql wrapper. I think you would really benefit from having a traditional database before going to NoSQL; have the SQL versions first. What about file APIs? File IO and network IO are in Base, part of the language itself. ODE: ordinary differential equations, if you need to integrate various differential equations and stuff. OpenGL, you can imagine. Optim is another important one: a lot of the optimization packages, various iterative algorithms. I wonder what all it has; I do not think it has nonlinear optimization, but that could be wrong. Also, let me switch to this now. Gadfly is a graphics framework that produces really beautiful-looking and interactive graphics. It produces SVG files, so it is not meant for interactive use at the REPL so much as it is nice to integrate into HTML documents and things like that. You can look at datasets like this and label them. This is a standard R dataset of brain size versus, I think it is brain mass versus body mass; a very commonly used dataset. Histograms; and this is really nice, error bars; you can do stuff like that here. This, I think, is a linked plot: based on what you select here, it picks stuff on the other side. This is also another graphics package, with a different take: more than interactive graphics, it is sort of a grammar-of-graphics approach, and it creates SVG files, so you do need to double-click and open them in a browser. Expect that it will get much better soon. 
Is the interactivity in these SVG files done by generated JavaScript? What about the Codecs package; do we have audio and video codecs? No, I think it is just compression, the zip kind of stuff, if I am not mistaken, and text encodings; it has probably got stuff like base64. I do not think there are audio or video codecs. So, I think that pretty much gives you a sense of this stuff, right? I am wondering if there is anything else worth a look here. Some WebSockets stuff someone created. TextAnalysis could be a useful package. Sundials is an ODE... actually, there is a package called SimJulia; I do not know if it is showing up here. Oh, it is right here. SimJulia is a discrete event simulation library, which is pretty cool. Is there a container for deployment, an app container, something like that? We would like to have one. One of the things we have often talked about is deployment of Julia programs behind a REST interface, automating the whole thing. From what I understand, and this is what I am referring to, with other languages you use R for prototyping and then you copy it over, rewrite it in Python, and use that in production. Correct. So, you would want Julia to be used both for prototyping and for production? That is the holy grail, and absolutely, that is our intention. So, for it to run in production, you would need something like an HTTP container, a long-running process container. Correct. Is there something like that? There is nothing packaged. I think the pieces are there: we have HTTP server kind of components, and we have talked about deploying it behind a web server, making it an Apache module or, I forget the other web server that is commonly used... nginx. 
nginx, yeah, making it an nginx module and sort of just linking up the network layers in Julia so that, you know, data can flow. Is the effort moving toward more HTTP support for Julia? Yeah. It's very much on our interest list, but I think we have a lot of groundwork to cover before we can attempt it. I would do that as a post-1.0 thing. So, maybe by around the end of the year. And then based on use case, right? I mean, people have to want it. Finally, everything comes as a contribution. So, you know... So, at some point, there will be a web stack. There already is a web framework that someone has written in Julia. It's called WebStack. And apparently, if I remember reading... It's not showing up here; I think they've not registered it in the package database. If you Google for it, it's called WebStack.jl. And I think the claim was that it matched NodeJS performance on the simple benchmarks. So, there is a web framework in Julia. If someone really needs it, I think it will get developed further. That's how I see these things. It was actually written at... Has anyone heard of the Hacker School in New York? Stefan teaches as part of the Hacker School, and there were a bunch of people there who wanted to work on Julia, and they got together and wrote this WebStack framework. So, it's actually quite comprehensive. I think that pretty much covers everything I wanted to show you guys, maybe a little bit of a feel for the language. We didn't get into stuff like macros and expressions, you know, sort of deconstructing expressions and manipulating them. We could do all those things as we go along. I did not touch on a lot of the linear algebra and sparse matrix stuff, and a bunch of other cool stuff that we have.
But my goal was to give a language overview so that, if you have further interest, you can on your own figure out what's available and hack on pieces and stuff like that. Any specific questions, or anything you guys want me to cover? What about the functional paradigm, can you set up something like a lambda? Anonymous functions, yes; they're very commonly used. Let me... that's a good example to just show. So, you can just have stuff like this. If you have a number in front of a variable, then it becomes multiplication, so 2x is two times x. So, you can now pass this to any other function that needs it. Let's say you have f(x) = 2x. Then I could have done... no, this won't work, because I want to pass x as a function out here. So, I want a function that takes a function and evaluates it. So, maybe I do this. And then if I say... I think I'm getting myself into a bit of a mess here. Okay, this was a poorly constructed one; sorry. But to answer the question: you can pass functions to functions, functions are first-class objects, and you can create anonymous functions. So, lambdas, higher-order functions, all that stuff. You do take performance hits when you do some of these things; some of them we are going to address. Just the usual caveats. I need a good example for this; you'll have to try it out afterwards. I saw an anonymous function in the map example. Yeah, that's right. So, if you recollect the map example: map takes a function, so you can create functions on the fly without defining them and pass them there. That's an inline definition of the function. Yeah, so it's a lambda, right?
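A quick sketch of what the demo above was reaching for. The helper name `apply` is illustrative, not something from the session; the rest is standard Julia syntax for anonymous and higher-order functions:

```julia
# An anonymous function (lambda) bound to a name.
double = x -> 2x

# A higher-order function: takes a function f and evaluates it at x.
apply(f, x) = f(x)

apply(double, 5)            # 10
apply(x -> x^2, 3)          # 9: a lambda created on the fly

# map takes a function as its first argument, as in the earlier map example.
map(x -> 2x, [1, 2, 3])     # [2, 4, 6]
```

The same `apply` works for named and anonymous functions alike, since both are first-class values.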
So, it took an array of three elements and doubled it. No, no, there are certain cases which you have to try hard to find, but these idioms are quite fast. I mean, when I say something is not fast, there might be an extra 10 or 20 milliseconds of overhead that you may not want. Sorry? Is the anonymous function cached? Is it compiled each time, or compiled once inside a scope? I do not know the answer to that. My gut reaction would be that if it's the exact same thing, then it will not be recompiled, but if the types change, for example... Yeah. So, actually, here, this is an easy test. Let's say, if I do this... I wonder if I can call it like this or not. But it's too fast; it's too fast to measure. You can't measure the delay. Because if I pass 2.0 there, it would definitely have had to recompile, right? Because the types have changed. So, even then it's not going to come out of a cache. Sorry? Oh, yeah. So, once you've defined it and you're just using it multiple times, it just behaves like anything else; that call is essentially free. The internals... I would not be the right person to give a 100% answer on that. So, the last thing, I guess: this is our GitHub page. This is where everything happens; I guess a lot of people already visited it today. Wow, we just reached the 400th open issue today. So, we have 2800 issues closed and 400 open. That's, I would say, a pretty active project. And on the specific point about the performance of anonymous functions (I don't know if you'd call them anonymous, or what another name for them would be)... yeah, here. So, someone has filed this issue, you know. It's been closed.
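A minimal sketch of the timing test being attempted above, using the `@elapsed` macro mentioned earlier. The helper `apply` is hypothetical; the idea is that the first call to a method with a given set of argument types includes JIT compilation, so comparing the first and later calls gives a rough view of what gets cached and what gets recompiled:

```julia
# Hypothetical helper that evaluates a function at a point.
apply(f, x) = f(x)
double(x) = 2x

t_first  = @elapsed apply(double, 3)    # first call: includes JIT compilation
t_second = @elapsed apply(double, 3)    # same argument types: compiled code reused
t_float  = @elapsed apply(double, 2.0)  # new argument type (Float64): compiled again

# Times are in seconds; the very first call is usually the slowest.
println((t_first, t_second, t_float))
```

As the discussion says, individual calls are too fast to eyeball, which is why the compilation cost only shows up on the first call for each new type combination.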
That means it has been fixed. So, if you see any weird behavior, there's a good chance it has been reported, and either it's planned to be fixed or it has already been fixed. A good way to start contributing is looking for these "up for grabs" issues. These are generally considered to be self-contained, or at least not to require knowledge of too many moving parts of Julia. A lot of the other issues are pretty big and open-ended, and you may not know what to do about them. But take mmap on Windows, right? That's something, if you know some Windows programming, you can probably jump in and contribute. Or, you know, there are benchmarks, test suites, all kinds of stuff here. Write a parallel comprehension operator. What's this port to JavaScript? There is a crazy issue open, right? Just someone thought it should be done, because of Emscripten. Emscripten can take LLVM and run it inside the browser by compiling LLVM bitcode to JavaScript, and Julia has LLVM underneath it. Of course, it's not realistically possible as-is, because of the dynamic nature of the language and our reliance on something like a dozen C and Fortran libraries. So you would probably have to strip down Julia and then only have simple programs. The interesting thing is, if this works, then you don't need to do that whole deployment exercise: running Julia inside a browser, the whole thing can run locally in JavaScript on your machine. Possibly the language can run, but all the libraries... Yeah. But you may actually be able to just implement a parser in JavaScript itself from scratch, and a simple interpreter also. I don't know how possible this is, but clearly it's something that has been discussed quite a bit, as you can see. I wonder if it's an April 1st post or something. It's not, yeah. Can you imagine the use case of a browser-based MVC system? Sorry? Browser-based?
I mean, imagine it as the backend of a full-fledged MVC system... Yeah. That would be cool. I mean, I think it's probably easier to deploy the web service than to do this; that's my guess. There have been three or four attempts at building web services that serve Julia. Okay? Thank you very much for staying, and I guess we'll stick around for whoever wants to try out more stuff.