Welcome to another edition of RCE. Again, this is Brock Palen. You can find us online at rce-cast.com. Also, feel free to jump over to iTunes or your other favorite podcatcher and be sure to leave a review; that makes the podcast more visible to others. There's also a nomination form on the website, where we've recently had some contact from users giving recommendations, and the show today is an example of one.

Once again I have Jeff Squyres from Cisco Systems, one of the authors of Open MPI. Jeff, thanks a lot for your time.

Hey, Brock. It's a busy time for everybody, right? As we're recording today, there's a EuroMPI conference going on and an MPI Forum meeting going on; they go hand in hand. We've got Supercomputing coming up in a couple of months. What's going on in your neck of the woods?

Yeah, so I will also be at Supercomputing; drop by the Michigan booth and come see some of our recent advancements and the hardware we've put on the floor. Next week, probably about the time this goes out, I will also be at the Frontiers in Computing and Data Science mini-conference hosted by Michigan State University. It's their second year of doing it; it's very nice, and it's free, organized by their new computational sciences department. And the week after that, October 10th and 11th, there'll be an academic medical IT forum held at the University of Michigan. I will also be there, and you'll actually get a tour of our HPC data center that Monday. That's also a free conference, so come by and see me there.

Cool. All right. Well, today, as you were talking about, this was a user-suggested topic, and I think we actually have an award-winning category here, because we have the most people we have ever interviewed at once: four people from this particular project.
So, Brock, tell us what we've got today.

Yeah, so we have a little bit of a dining philosophers problem here, with four guests plus two hosts on the phone. This is the most we've ever done, and everyone's in a different location. The guests today are representing the Julia project, julialang.org. Guys, want to go ahead and introduce yourselves?

Hi, thanks for having us. I'm Jeff Bezanson. I recently finished a PhD at MIT. I'm mostly interested in compilers and programming languages, and I have a bit of background in scientific and research programming, which is part of what got me started on this.

Hi, Alan Edelman here, glad to be here as well. I've been a professor at MIT for 20 years, in applied math and the computer science lab, and I'm interested in the whole range from numerical algorithms all the way to making scientific and technical computing run faster and easier.

I'm Stefan Karpinski. I used to work as a data scientist. I learned MATLAB from Viral, who's going to introduce himself next. We got into this because we felt like we could do something good for data science, something good for numerical computing: a language that we could use for all the things we like doing.

Hi, this is Viral Shah. I've known all these guys for a very long time now, and I have a PhD in computer science and computational sciences from the University of California at Santa Barbara. I come from the scientific computing world; it's just what I love doing, and that's what all of us are doing today. Jeff, Stefan, and myself are co-founders of Julia Computing, along with Alan, but while Alan is still driving the research at MIT, the three of us run Julia Computing.

Okay, so what exactly is Julia?

Julia is a programming language.
It's a general-purpose programming language, but it was motivated by the kinds of problems people have in technical computing: the kinds of programs people write in MATLAB, in Python with NumPy, and in R. We looked at that world, and at our own experiences in it, and decided that a new language design was really necessary. The way I like to describe it is that Julia looks something like Python or R or MATLAB when you first come to it, so it's easy to start using, but under the hood it's something very different, and much more powerful.

Okay, so when you say it's something like these languages, what do you mean by that? Syntactically? Grammatically?

What I tend to mean is that when people come to Julia, they feel pretty much at home. They feel like they're ready to use it. Once you get past the square brackets or semicolons or a couple of other things, usually people are quite at home.

Yes, sometimes people get hung up on one-based versus zero-based indexing, but like Alan said, it all sort of disappears into the background once you start using it.

So, just for the record, which one are you: zero- or one-based indexing?

We are technically one-based. I mean, we started out as one-based indexing, but today you can do much more than one-based indexing, as of the latest Julia release.

My colleague Ron Rivest says that wars have been fought over lesser things. He thinks he has the solution: we should all go to one-half-based indexing and leave it at that.

To expand on what Alan's saying about people feeling comfortable: the syntax is superficially similar to MATLAB.
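To make the indexing discussion concrete, here is a minimal sketch of what one-based indexing looks like in practice; the array contents are just illustrative:

```julia
A = [10, 20, 30]

@assert A[1] == 10        # one-based: the first element is at index 1
@assert A[end] == 30      # `end` is shorthand for the last valid index
@assert firstindex(A) == 1 && lastindex(A) == 3

# Generic code asks the array for its index range rather than
# hard-coding 1:length(A); this is what makes other indexing
# conventions possible in later releases.
@assert collect(eachindex(A)) == [1, 2, 3]
```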
You can often translate MATLAB code to Julia just by changing a few parentheses to square brackets for indexing into arrays and not much else. But the semantics are probably closest to Python; it's a very straightforward dynamic language to use. Then there's this rabbit hole of advanced features you can go down, which you don't need to know about right away to write useful programs, but which can help you as you find yourself doing harder and harder things. And indeed, what happens when you start down this rabbit hole is you become a better programmer, in ways that, when you used those other languages, you never knew you were missing; but once you have them, you wonder how you lived without them.

Okay, so you've already mentioned Python and MATLAB, and we have Python and R as examples of open-source interpreted languages, as well as MATLAB as an example of a scientific but closed-source one. So why do we need a new one?

Well, I think one thing is that, to us, the key is to get the core abstractions really right. To us, technical computing is largely about functions, essentially, and not just any functions: the functions that arise in this domain tend to be quite complex, with many definitions and many notions of them, and that's what programs are built out of; that's what we observed. So we wanted a language with a generic-function-based paradigm, rather than the popular class-based OO paradigm found in Python, for instance, which is good for many things; but we felt that in this domain you really want generic functions, where things are function-based rather than object-based.

Sure. So a generic function is a function that has many definitions.
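The generic-function idea can be sketched with a small, hypothetical example: one function name, several method definitions, with the most specific matching method chosen per call.

```julia
# One generic function, several methods; dispatch picks the most
# specific method for the argument types at each call site.
describe(x) = "something"                 # fallback for any type
describe(x::Number) = "a number"          # more specific
describe(x::Integer) = "an integer"       # more specific still

@assert describe("hi") == "something"
@assert describe(1.5)  == "a number"
@assert describe(3)    == "an integer"

# Adding a case later means adding a method, not editing a class:
struct Point; x::Float64; y::Float64; end
describe(p::Point) = "a point"
@assert describe(Point(0.0, 1.0)) == "a point"
```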
Basically, you add new methods to a function rather than adding them to an object. You have a single function, and over time, as new types are defined or new cases are identified, you add other definitions to that function. When the function is called, there's a system for selecting which definition to call: basically, the most specific matching definition.

I was going to say: it gives you roughly the same kind of dispatch, selecting different pieces of code, that you get in object-oriented programming, but turned in a different direction, from a function perspective rather than a noun-and-object perspective.

If I could interject briefly regarding the broad question of why another language, there are two things I'd quickly like to say. One is that it's really not a competition; we're pretty friendly with the various other communities. We go to their conferences, they come to ours. But one rather different design decision worth bringing up is that languages like the ones we mentioned tend to be a kind of script layer over C or some other fast language, whereas, while Julia does use other languages, it's primarily Julia all the way down, and that has very different implications.

So let's dive right into that. That is the typical constraint, right? With NumPy and other things like that, you have nice abstractions in the target language itself, but under the covers it very quickly turns into C for speed, or sometimes even Fortran, something that can be utilized for its numerical efficiency in various types of computation. You mentioned two important things there. One, you mentioned the whole speed thing itself, and we'd like to understand how, as an interpreted language, Julia achieves its speed. And two, "it's Julia all the way down": let's dive into that when you're done with the first question.

Well, I'll address "Julia all the way down" first, because it's a good lead-in. When we first started implementing Julia, it was not particularly fast, because everything starts as baby steps. But we kept ourselves to this discipline: don't resort to implementing things in C just because it happens to be faster right now. Figure out how to implement the thing in Julia, and then work on the Julia compiler and library and all the pieces you need to make sure that thing is fast enough. We keep ourselves to that discipline to this day. There are times when it makes sense to just call a BLAS library, because we have better things to do with our time than re-implement BLAS, but otherwise it just makes sense to implement things in the language and go with that.

Okay, so when you say you've implemented a bunch of things in terms of Julia: somewhere down underneath it has to turn into assembly language. What is the crossover point? Is there a small core of C, or do you have a Julia-to-assembly compiler? How does that work?

Yeah, we compile to native code using LLVM. The LLVM compiler toolchain lets us use a C++ API to describe the instructions we want to execute and then say: give me the machine code for this. Then you get the machine code, you jump right into it, and you execute right away. LLVM is just a phenomenal library for this; it cuts down the amount of time and nonsense involved in implementing a new language by a decade or something like that.

Sometimes people wonder: can't I just take this LLVM and sprinkle it over my favorite language of choice? And often the answer is no, because what makes it possible for Julia to go from that very high-level, fun-to-use programming language down to the tightest assembly is Julia's language design. I would suggest that people who want to know about it in a great amount of detail look either at our paper that's going to appear in SIAM Review or at Jeff's PhD thesis.

That's a very good point, Viral: it's not just that you can have a JIT, or have LLVM, and your favorite language could be fast. You need all of those pieces, from top to bottom, to somehow fit together like a glove to get the speed. I sometimes like to give the analogy that one mistake in a computation, one arithmetic error, can kill the correctness of the entire program. It's the same with performance: you need everything to work together from top to bottom to achieve performance.

That ties back into what I was saying earlier about starting with the premise that we're going to implement everything top to bottom in this one language, so we can't take any shortcuts and say, oh, don't worry, that part will be done in C. A good example: integers don't have to be that fast in a high-level language like Python or R, because your for loops are implemented in C or C++. But in our case our integers have to be fast, because our for loops are implemented in Julia, and so you have to make different decisions. People don't always like this, but our integers are machine integers, and they overflow. That's something people coming from Python are surprised by, because Python has bigints everywhere, but we can't afford to do that, because we're actually building everything from the bottom up.

The word on the street, to sum this up, is that Julia solves the two-language problem, or, technically, Ousterhout's dichotomy, but that's the way it's been described, to categorize the whole story. So you've talked a lot about performance. Interpreted languages, scripting languages, tend to have a bad reputation there, but one of your big selling points is your performance versus a lot of scientific use cases where, in practice, people use scripting languages. How exactly does Julia compare?

I'll address that briefly. Interpreting versus compiling is actually a language implementation feature; pretty much any language can be either interpreted or compiled. With Julia, you start it up and you can type at it interactively, like scripting languages, but underneath it is in fact compiling code much of the time. Sometimes you'll type something, we'll compile it, and then run the compiled code: interactive, but still compiled. We have tons of performance comparisons.
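The machine-integer behavior mentioned above is easy to see directly; a minimal sketch:

```julia
# Julia's default Int is a fixed-width machine integer, so
# arithmetic wraps around on overflow instead of silently
# promoting to an arbitrary-precision integer the way Python does.
@assert typemax(Int64) + Int64(1) == typemin(Int64)

# Arbitrary precision is available, but it is opt-in:
@assert big(typemax(Int64)) + 1 > typemax(Int64)
```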
You can find performance comparisons everywhere. Most recently, in problem set one of my graduate course, I told the students to write a program in Julia, for the first time for most of them, and then take their favorite language, since everybody's an expert in one of them, and compare. I've seen everything from factors of seven to forty thousand reported back to me, all over the board, depending on how they did it and what they were doing.

An interesting variation on this is that there are very fast, JIT-compiled implementations of dynamic languages like JavaScript. JavaScript wasn't the first one, but it was the first really mainstream one, with the Google V8 engine, and that really proved you can do this. But Julia's implementation is actually pretty different, because we're not bolting a JIT onto a dynamic language that was not designed with performance in mind. We got the benefit of starting from scratch and thinking about every single decision, from the ground up, in terms of performance, and as a result Julia has to do far fewer tricks. We don't have to do anything particularly crazy to get good performance; in fact, we're closer to an ahead-of-time compiled language like C++, despite being dynamic, which is an interesting trade-off.

Let me interject a question here.
The traditional argument in high-performance computing and technical computing scenarios is Fortran versus C and/or C++. And Fortran, although a lot of computer scientists deride it or make fun of it, is fantastic for what it was intended for, which is numerical computation, and absolutely terrible at other things, like string manipulation. It does a passable job at file manipulation, but only in a fairly basic sense. C, on the other hand, is relatively good, and C++ is better, at those other kinds of things, but it has many more pitfalls you can easily fall into. Where does Julia fit in this? I don't even want to call it a two-dimensional spectrum, because I'm really talking about a bunch of different things here. You've emphasized how you designed it from the ground up for speed, but where do you fall on the system-level things you need in order to support that technical computing: loading your input, saving the output, all those kinds of things?

I like to tell this story. I've been in the high-performance computing world for a while, and I think it was over a decade ago that I visited the world's fastest supercomputer at the time, the Earth Simulator, in a suburb of Tokyo, Japan. A fellow gave me a wonderful tour of the whole place, and I asked him, how do you program this machine? My tour guide kind of paused, let out a little smile, and said, you know, in Japan we really respect our elders; that's why we program all in Fortran. The reason I tell this story is that in my graduate course there are lots of young fellows, men and women, and I ask them: what do you program in now, and has anybody programmed in Fortran? The answer is no. So we see this as an art that's not staying current much longer, at least from my point of view.

I've actually found that occasionally the syntactic similarities between Julia and Fortran are uncanny. They're accidental; I don't think any of us has done any earnest Fortran programming. But the difference is that in Fortran 2008, or whatever the latest standard is, the core of a program often looks a lot like a bit of Julia code. It's this high-level vectorized expression that goes down and does the computation very efficiently, but it's prefaced by two pages of type declarations explaining exactly what the layout and types of everything in the problem are, whereas in Julia it's just that one line, and that is enough to express the computation efficiently.

Actually, in one of the libraries, maybe it was AMOS, someone just took the Fortran routines and wrote a pretty trivial translation of them to Julia, and I think the code was maybe 50 lines or something.

Another point, which Stefan mentioned: we're not going to translate the BLAS or LAPACK, the very popular Fortran libraries. But interestingly enough, the thought experiment has circulated that if we did take the time to translate them, we believe we would be as fast, possibly even faster for some things, and more general as well. But it's a project that doesn't seem to be high on the priority list.

Okay, so we've focused a lot on serial performance, but we haven't said anything in terms of parallel performance, and we're interested in scientific and even non-scientific uses. My laptop here has four cores, and a desktop might have six, eight, twelve. Does Julia expose parallelism? And if so, is it implicit, as in you've implemented threading for some performance-sensitive functions, or is it explicit, such as threads or message passing?

Right. So we've had distributed-memory parallelism in the standard library from the beginning, which is a multi-processing implementation: we just start up multiple instances of the system, and they can pass messages between each other. The API for that is not really similar to MPI; it's more similar to an RPC kind of mechanism. We've had that for a while, and just recently we're adding some experimental support for shared-memory multi-threading.

Yeah, I should also add that there is MPI support that has been added as a package. While base Julia ships with RPC-style programming, MPI has been added as a first-class package in the language, and there are tons of MPI libraries now plugged into Julia, so you can do your distributed dense linear algebra, sparse linear algebra, a bunch of PETSc stuff, and just about anything else, including writing Julia plus MPI just like you would have written C plus MPI.

Indeed, my graduate course is all about the various ways of doing parallelism, all of the above, including GPU parallelism and other kinds of special hardware. The motto that has been emerging from the class has been: don't bring your algorithms to the high-performance hardware, bring your language to the high-performance hardware. So we keep asking, over and over again: how can you try your best to use the same code, possibly with minor modifications or possibly none at all, ideally, and work on lots of different kinds of parallelism at once? That has not been the tradition in high-performance computing, but in the end I believe this is what's needed for high-performance computing to get to the next level.

Wait, I want to inject something there. You mentioned GPUs and bringing your language to the hardware: FPGAs, Xeon Phi. How exactly do you do that? Do you have to inject something into the runtime? Say I have some customized piece of hardware that's really good at expressing, specifically, molecular dynamics methods or something like that, something a little higher level. How would I actually inject that into Julia?

So you describe the underlying hardware. On a GPU, one could, for example, emit CUDA at the other end, but the user could be typing Julia, or Julia-like syntax. That's actually the subject of today's lecture.

A big piece of this is that as people add support in LLVM for different types of hardware, we get a lot of that benefit for free. There are sort of two pieces to it. As people add LLVM back ends that can emit code for different GPU architectures, Xeon Phi, et cetera (and the Xeon Phi is really a sort of strange Pentium with lots of cores and extra-wide instructions), as long as you have a compiler back end that can address that, you're fine.
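The RPC-style, multi-process model described earlier can be sketched as follows. In current Julia these primitives live in the Distributed standard library; in the 0.x releases discussed here they were part of Base:

```julia
using Distributed

addprocs(2)   # launch two worker processes on this machine

# remotecall is the RPC-style primitive: run a function on a given
# worker, get a future back, and fetch the result when needed.
fut = remotecall(sum, workers()[1], 1:100)
@assert fetch(fut) == 5050

# Higher-level helpers such as pmap build on the same mechanism,
# distributing a map over the available workers.
@assert pmap(x -> x^2, 1:4) == [1, 4, 9, 16]
```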
GPUs are a little harder, because they're not completely general-purpose, but there's a huge amount of good work that's been done on getting Julia to just run generic code on GPUs.

So let me dive down into that, because you can look at these two different things the same way, under the broad nomenclature of architecture. I think it's actually a fascinating point you bring out, that LLVM is picking up support for new types of hardware and you quote-unquote get that for free. I'm sure it's not entirely free, but you leverage all of that work into your own, which is totally cool. How does that, though, manifest in terms of the underlying topology? It's not enough just to support the instruction set of the GPU, or the instruction set of the Phi, which is nominally the same as the big heavy-lifting Xeons. What about the architecture, in terms of knowing that, oh, this thing is across a PCI bus, so I need to choose how I access it? Rather than just running stuff, I've got to move things around, and that takes time, and that implies a different pipelining model. How do all these architecture and topology issues figure in?

I don't think we have a silver bullet for this yet, but one thing I've observed is that with these kinds of hardware, often simply the mechanics of accessing them can be very difficult. You have to learn a lot of different tools, put things together, and learn how the different worlds talk to each other. So we think, first, by integrating access to these things in a really nice way, so that the effort to try running something on, say, a GPU is as low as possible, it becomes a lot easier to experiment, and people can try making libraries that solve these kinds of problems.

Yeah, along those lines: we want Julia to be as much of a great scripting language on top of LLVM as possible, and it already is, really. There's one angle from which you look at the language and think, wow, this is an incredible scripting language for LLVM. So you can immediately try out these things that you get with LLVM, with minimum effort. And I think that's necessary in an area where, collectively, as a whole, we really have not yet figured out the best way to program GPUs or large distributed supercomputers. We have some ways to do it, but I don't think we have the best way yet.

Well, we do have annotations for code, right? We have @parallel, which is used to tell the compiler and the runtime system that the iterations of this loop can run in parallel, and @simd, which tries to find instruction-level parallelism in the code that's generated. As newer back ends get added to LLVM, these same kinds of constructs can be reused going forward. I personally find that if I could write my code at a high level in Julia and have it compile down to these architectures, so it could run on the CPU, on the GPU, on Knights Landing (the Xeon Phi, or the thing that comes after the Xeon Phi), I don't mind the extra mental challenge of figuring out, hey, take this chunk of data, send it out there, and then just run this Julia function there. Usually it's the cross-compiler stuff, writing a bit in Julia and another bit in C and then something in Python, that's the hard part. But once it's all available in Julia, it's easy to just say: hey, copy this data to the GPU, run it in Julia there, bring it back, do this thing on the CPU, and so on and so forth.

Now, do the same kinds of concepts also apply to networking? Because, though I really don't know a lot about the guts of LLVM, I would imagine LLVM does not directly interface with networking stacks, and there are many of those these days, for all the different types of networks out there. Do you foresee LLVM and/or Julia going in that direction, or are you going to continue to rely on external libraries, for example MPI, to give you multi-network support?

The basic thing we use is TCP/IP. If you download Julia 0.5 from our download page today and you try to do parallel computing with Julia, it will allow you to use shared memory on the same machine, or multi-threading, and then if you want to go across nodes, you're going to get essentially TCP/IP under the hood. The way to go across these other networks, currently, is the MPI package I talked about earlier. Often the MPI implementations are tuned for the hardware under the hood, and that's the easiest way to leverage all the kinds of networks that are coming up. But we also have one more thing that has been done and is not widely recognized, which is our cluster manager framework. A cluster manager framework allows you to implement different transports and different behaviors for Julia for different kinds of clusters. So you could totally take an InfiniBand cluster and override Julia's cluster manager with something that's InfiniBand-specific.

I think a lot of people these days are using MPI as sort of a transport layer that happens to be really carefully tuned. They're not necessarily using the MPI high-level interface to get their data all over the place (some people still are), but to some extent it's just the best way to get data between machines, and I'd like to see that decoupled in the future. I think there's no real reason why you should care so much about the low level. I mean, MPI is both of these things, right? It's this low-level transport piece, and then it's also this high-level programming paradigm piece, and I think those things should really be decoupled in the future; I think we're headed in that direction.

It's funny that you say that, actually, because I work at Cisco, right? A very large networking and server company. And whether you call MPI a high-level abstraction or a low-level abstraction very much depends on who you're talking to.

So you mentioned that, in the end, you're leveraging LLVM for the system-level stuff, but what is the actual Julia interpreter, the front end itself, implemented in?

Ah, so we actually use several languages ourselves. Despite talking about the two-language problem and wanting to solve it, we actually use multiple languages: you use the right tool for the job. We have a compiler front end that, believe it or not, is written in Scheme, which does parsing and some of the initial lowering passes. Then there are some compiler components written in Julia itself; the part that does type inference is implemented in Julia. And since LLVM is a C++ API, the code generator is written in C++.

So what about interacting with other languages? You've mentioned that you tried to do Julia first and fix performance issues rather than just slapping it on top. But what if I have some well-developed package and I want to hook it into Julia? How difficult or easy is it to make a Julia package that calls something external and translates between the data structures?

It's generally very, very easy. We care a lot about interop. The C calling convention is kind of the standard ABI that everything works with, so that's the common currency. We had C calling from very early on, which LLVM makes easy to do, so you can call C functions directly with no overhead. That also gives you Fortran right away. And since CPython, of course, has a C API, that also opens the door to calling Python, and packages have been made for doing that, so the Python interop is very good at this point too. People have gone ahead and also implemented interop with R and Java, which are considerably different, but those also work reasonably well.

I have to embarrass myself here: I never managed to figure out how to use MEX files in MATLAB, even though I've been using MATLAB for decades. But with Julia, I call C and Fortran all the time without any issue.

So along those lines, with these integrations into other languages and whatnot, can you describe some projects that are actually using Julia?

I kind of like to joke.
I was at a mit retreat among computer scientists where um, the person the professor before me was talking about How julia was being used all over the world by not sorry, but His app was being used all over the world by non technical people And I got up there and said that I was amazed how many people right here in our lab were using julia That always impresses me more than when people are using all over the world And um julia is being used for everything and everything. I mean, we could give you a very very long list Wasn't wasn't there a robotics lab right next to the julia lab at mit that you didn't even realize was using julia The story's been worse that the office was on the other side of my wall If I drilled a hole I would have seen them using julia, but I didn't know about it until months later at the julia con Yeah, so I I think the biggest killer app for julia so far has been um This optimization library called jump Um, and it's hard to describe what exactly it is because what it really is is it's a unified front-end language for describing optimization problems constrained linear optimization problems In such a way that you can swap out different back ends by just changing a single line of code Um, and that's an incredibly empowering thing in that area and that's become a really de facto standard to use jump um with different back ends in optimization Uh, and and that's one of the reasons why we're seeing a lot of julia used in In different types of you know in robotics because you need to do a lot of constrained optimization it's also being used by The fAA to develop their next generation air traffic control system um, so the the language for both defining what the spec is And actually having an implementation of that spec generated from Or from the spec itself All that is done in julia using metaprogramming and using You know the amazing facilities of jump to do this kind of optimization that you need to make sure things don't collide with each other So 
So while JuMP is a language for optimization written in Julia, what I'm hearing more and more is that in lots of other communities, in robotics and machine learning, in all kinds of science and engineering applications, people are saying they want to create a JuMP-like language. It's the metaprogramming that's generating a lot of excitement, in areas that have nothing to do with optimization, but the technology is working for those kinds of use cases.

We've also seen very quick and rapid adoption of Julia on Wall Street, where finance firms are using it for portfolio optimization, trading algorithms, calculating risk, all kinds of stuff. And that's obviously the kind of thing that Julia Computing focuses on and works with our customers on.

Yeah, so one of the things that drives people to try Julia, and then they find that it's really the only game in town, is if you really, really need a huge amount of performance but also a huge amount of productivity. Obviously there are languages where you can get great performance, like C++ and Fortran, and there are extremely productive languages like Python and R, where you can write a couple of high-level lines of code and get amazing stuff to work. But if you need both of those things at the same time, there are not a lot of games in town except for Julia.

So you had a vision when you set out to create Julia, but what is the strangest use case you've found Julia being used for?
Something you didn't expect? Honestly, we did not expect it to replace C++ as much as it has. In retrospect it made a lot of sense, because C++ programmers love performance. They love operator overloading, they like method overloading too, and templates. Those are the features of C++ that its programmers really don't want to give up, and Julia has good answers for all of those, because we have parametric types and we have good performance. It's just a good drop-in replacement for a lot of things. I think that's the biggest surprise to me.

I was personally surprised when I saw the FAA work, the folks at the Johns Hopkins Applied Physics Laboratory and Lincoln Labs using Julia for the advanced collision avoidance system for the FAA. They started using Julia way back, like in 2012, almost when it was announced. Given the critical nature of that project and the needs for performance and memory footprint and all these other constraints, it was really surprising that, even after looking at all the other alternatives, they decided to go ahead with Julia, and that too at such an early stage. So it's one of my favorite use cases.

Yeah, I think they had a lot of foresight. They thought very deeply about what their problem was and what kind of properties a language needed to have, and it's really interesting that their conclusion was that this Julia language seems like it really has what they need, which was metaprogramming and great performance.

I think if somebody asked me what my biggest surprise was, it would be almost psychological.
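To make the C++ comparison concrete, here is a minimal sketch (the type name `Vec2` is invented for illustration) of the two features mentioned above, parametric types and operator overloading, in plain Julia:

```julia
# Parametric type: one definition covers Vec2{Int}, Vec2{Float64}, ...
# much like a C++ class template.
struct Vec2{T<:Number}
    x::T
    y::T
end

# Operator overloading: + on Vec2 is just another method, and the
# compiler specializes it per element type, so there is no overhead.
Base.:+(a::Vec2{T}, b::Vec2{T}) where {T} = Vec2{T}(a.x + b.x, a.y + b.y)

v = Vec2(1, 2) + Vec2(3, 4)          # a Vec2 of Ints: (4, 6)
w = Vec2(1.0, 2.0) + Vec2(0.5, 0.5)  # a Vec2 of Floats: (1.5, 2.5)
```

The combination is what makes Julia feel like a drop-in for template-heavy C++: generic code is written once and compiled to specialized machine code per concrete type.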
I guess I kind of grew up thinking that all languages are the same, you know, a Turing machine: anything can be implemented in anything, sort of thing. And, by the way, I know the language that I know best, so I'm never going to change. And then people tried Julia and said, oh, this isn't so bad, I can do this. It's that very kind of psychology. I thought we were going to build this language and have fun with it, but it might just end there.

What license is Julia distributed under?

We use the MIT license for most things.

Okay, and then a follow-up question to that, which is not really related but sometimes related. I always like to ask this of other development projects we talk to here on the program: what source control management system do you use, and why?

We use git, and it's because Stefan suggested it many years ago and sort of forced us all to use it, and I'm glad he did.

Yeah, there was a little bit of kicking and screaming early on about git, because git doesn't have the best learning curve, but once you learn to use it as a power user, it's hard to imagine living without it. I also got us on GitHub pretty early, back when GitHub wasn't really ubiquitous and cool, and that was a good choice as well. I think GitHub has been the larger thing in driving the community, which I personally don't think would have happened as fast as it did without all the amazing collaborative features that GitHub offers. I mean, maybe it would have, but it certainly helped a lot.

So what's coming in the future for Julia? Any big changes coming, new features?
This current release is actually a pretty phenomenal release. I've been working on a blog post for it, and blog posts are way more work to write than you think they're going to be, but it's coming along pretty well. There are a lot of changes to functional programming. In the past, Julia has always been technically a functional language in the sense that you can pass functions around and return them from other functions, and lambdas are supported, but this is the first release where you can really do all of that without any overhead. So you can now use functional programming in all of the contexts where you want to use Julia, because it's such a high-performance language. Jeff did most of the spearheading of that work, and it's remarkable stuff.

There are a number of other changes. I'm going to finish that blog post up today and post it; maybe you can put a link to it on the website. But were there any other features in the new release that you guys want to talk about, or future stuff?

Yeah, I wanted to speak a little bit about architectures. I'm always excited about running Julia on multiple new architectures, and this was the first release where we have had, I would say, pretty good support for ARM, which means Julia is now running on Raspberry Pis, and the folks out at Berkeley are using our ARM implementation to do drift parking in model cars and all kinds of fun stuff. This release also saw our first port to POWER8, which means you can now start running Julia on all the big iron at many of those Top500 sites. So going forward, I think these two architectures are definitely going to mature a lot more, and hopefully we'll also have our GPU port coming out in the next few months.
That's been actively worked on by Tim Besard and Valentin Churavy.

We should put up a link for the self-driving car. I don't know what it is, a 15-second movie or whatever, but I love watching it.

Yeah, the self-driving car thing is really cool. Another big feature, and this is still experimental, is that we have native multi-threading support in Julia. So you can just put @threads on a for loop and get it completely parallelized, and it scales; it does the smart stuff with splitting up the work and allocating it appropriately, so you don't spin up a thousand threads for eight cores. That is still experimental, but we have a roadmap to get to Julia 1.0 next year, and that will include multi-threading. I think that's really key; it's going to be an exciting feature, because having access to first-class, high-quality, high-performance multi-threading in a dynamic scripting language is going to be revolutionary.

I also want to add that, like Stefan said, this release has been phenomenal. We've also had our first stable release that supports the Julia debugger, which is called Gallium, written by Keno Fischer. It's a side story, but Keno actually started working on Julia in high school, and he's now one of the co-founders of Julia Computing. So this Julia release has the first debugger that has been put out, and he's done some amazing integration with Mozilla's rr framework that allows some phenomenal kinds of debugging capabilities that I haven't seen in many other similar languages.
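The @threads feature described above can be sketched like this (a minimal example; it assumes Julia is started with more than one thread, for instance via the `JULIA_NUM_THREADS` environment variable, though the result is the same on a single thread):

```julia
using Base.Threads   # brings @threads and nthreads() into scope

# Square each element in parallel: @threads splits the iteration range
# across the available threads rather than spawning one thread per item.
function threaded_square!(out, xs)
    @threads for i in eachindex(xs)
        out[i] = xs[i]^2   # iterations write to disjoint slots, so no races
    end
    return out
end

xs = collect(1:8)
out = threaded_square!(similar(xs), xs)   # [1, 4, 9, 16, 25, 36, 49, 64]
```

The "smart stuff" mentioned above is visible here: the loop body never mentions threads or chunk sizes; the macro partitions the index range over however many threads the runtime was started with.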
So there's a lot of that kind of stuff coming out over the next few months.

Yeah, we joke about how Keno lives in the Keno-verse, which is like three to four years ahead of the rest of the universe. He always has this amazing technology that he's able to use and has developed, and eventually we get pieces of it that trickle back to the rest of us, and it's always mind-blowing stuff. I think the debugger fits into that category. He also wrote this amazing C++ bindings library, which lets you dynamically call C++ in the REPL from Julia and completely seamlessly integrate C++ libraries with Julia libraries, which is a tour de force. It's really impressive stuff. And this time-traveling debugger is going to blow people's minds.

Yeah, one way to think about it is that airplanes have black boxes, which I gather are really orange in the real world. Now every bit of software can have sort of the 2016 version of that: what went right and what went wrong with everything.

Yeah, and we plan to integrate all of this technology into our JuliaBox web platform. We've been offering this free service where you can go to JuliaBox.org, now JuliaBox.com because the company is hosting it, and we are unfortunately going to have to charge people some money at some point, but we'll try to keep a free tier. One of the key things there is that a lot of these pieces of technology are hard to set up locally, so the best way to get access to them is often to go to a web service where you already have C++ integration and a time-traveling debugger set up for you, and you can just use it right out of the box.

So can you give us a little bit of an overview? What exactly are the business model and services provided by Julia Computing?
Yeah, so our business model is very much a traditional open-source business. We provide consulting, training, and support for various forms of Julia. There's the Julia professional edition, which someone can download: it brings you a beautiful IDE, it brings you all the widely used Julia packages, it's easy to install, and it works behind the firewall, all that other stuff, for a small support fee every year. Over and above that, we do consulting and training services that help customers get running with Julia quickly, or write custom modules. We've been quite lucky in having customers who've actually hired us for consulting work to build open-source software and release it back; the debugger, for example, is an outcome of one of those engagements, and a bunch of other projects around static compilation came out like that. In terms of business models going forward, we do expect to build other pieces of software around Julia that are domain-specific and may not necessarily be open source, but that help particular industries get significant amounts of productivity and performance over and above what they're used to. We're targeting the world of finance first, but we're seeing a lot of traction in life sciences, in the traditional engineering ecosystems, and even in embedded computing, or the Internet of Things as it's often called.

Okay, thank you everyone for your time. This has been very interesting. We'll have this up soon.

Thanks, everyone. Great. Yeah, thank you. Yeah, thanks. Yeah, thank you very much.