Yeah, no worries. So the question here is: we've talked about the library ecosystem, and libraries are all fine and dandy, but quite often you want to install certain libraries for your code, for a certain kind of situation. And the ecosystem is so big. The whole Python ecosystem has probably hundreds of thousands of packages, for different Python versions, with different dependencies, and they are not all compatible with each other at the same time. You cannot install the whole of Python, all of it; it's impossible, there are just too many packages to have everything in one installation. So that raises the question: how do you manage installing different things for different projects, and how do you make sure you get the correct dependencies for each project? There are two major players here: the Python Package Index and Anaconda. Here's a big list of the differences between the Python Package Index and Anaconda, but I can try to say it in short form. The Python Package Index (PyPI) is a community project that provides Python packages as so-called wheels (like wheels of cheese, because everything in Python is a Monty Python joke). You get a Python package as one file that you can download from the index. These are often used in conjunction with virtual environments; they don't have to be installed into virtual environments, but they quite often are, so that you can have your own small Python world where you install only the packages you need. And this is very good. PyPI was originally developed for sharing Python code, code written in Python, but as we already talked about, much of "Python" code is not actually written in Python. There are underlying layers written in C, and there can be non-Python dependencies. For example, NumPy uses BLAS linear algebra libraries to run matrix operations and that sort of stuff, so it relies on underlying libraries that are not part of Python and were not written in Python; they are written in Fortran and C. Providing these is a bit complicated for the Package Index, so what package authors do is bundle everything into these wheels and ship it all to you, which can quite often result in very big installations. Sharing non-Python code this way is a bit complicated; there are a lot of packages that do it, but it's complicated. Writing these packages, on the other hand, is pretty simple. You can easily write your own packages, and we'll talk about it tomorrow; there will be a section on packaging your code. And if you're writing Python code that depends on other Python code, say your own thing that depends on other libraries like NumPy, it's very easy to share it on the Python Package Index and use pip to install it. pip is the installer tool for Python Package Index packages.
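As a concrete sketch of that pip-plus-virtual-environment workflow: this assumes a Unix-like system with Python 3 available, and `myenv` is just a hypothetical name.

```shell
# Create an isolated virtual environment: a small Python world of its own
python3 -m venv myenv

# The environment has its own interpreter and its own pip;
# packages installed with this pip land only inside myenv
myenv/bin/python -c "import sys; print(sys.prefix)"
myenv/bin/pip --version
```

On Windows the paths are `myenv\Scripts\python` and `myenv\Scripts\pip` instead. Running `myenv/bin/pip install numpy` would then download a wheel from PyPI into this environment only.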
The other major player is Conda. Conda is basically a solution to the question: okay, we have so many of these different dependencies and packages, and they might not work together, so how do we make certain that we get a consistent, working environment? Anaconda Inc. (it used to be Continuum Analytics, now it's Anaconda Inc.) developed this tool called Conda, which is basically a package installer, and they provided it in the Anaconda installation. So they provide a wide bunch of already-existing tools in one installation, this Anaconda distribution, which many of you probably already use. It was basically designed for data science people, or people in banks or something like that: you get one installation that already contains a lot of good stuff. But the project has moved beyond that, because the community thought, okay, we want more packages there, we want more things, and we don't want to be entirely dependent on Anaconda Inc. So what happened is that Anaconda also provides Anaconda.org, which is a package index for Conda packages, and you can have your own channels there where you provide your own packages. Some of these channels are now bigger than the channels provided by Anaconda Inc.; conda-forge, the open-source channel, is a lot bigger. But basically it's a different world. And what Conda does is manage all of these low-level dependencies as well: it can install compilers and tools and different kinds of linear algebra libraries for you. Especially when it comes to CUDA, GPU programming, it can install the correct CUDA toolkits, the GPU libraries that your code needs. So it tries to manage this whole dependency thing, but of course
it's very complicated, because there's a huge bunch of these libraries and different combinations. Is it correct to say that what the Python Package Index gives you, what you install with pip, is more or less Python packages and Python APIs: Python packages, plus C and Fortran and other code that's called from Python? But when we come to Conda, in addition to that you can also get system-level libraries, and standalone tools as well. For example in bioinformatics there are tools like SAMtools: if you want to analyze something, Conda will also get you these tools, which are not Python interfaces but tools in their own right. In addition, with pip, packages are sometimes distributed as source code. For Windows and Mac you mostly get binaries, but if only the source code is there, and you're on an Ubuntu computer for example, then when you download it, pip expects your system to have a compiler. Conda, on the other hand, can also bring down the compiler and set up everything, but mostly, maybe all the time, Conda packages are pre-compiled binaries, ready to be used. Yes, yes. And of course you can combine these: you can install pip packages in Conda. Conda creates its own environments, which are similar to the virtual environments I've already mentioned, but they're managed by Conda. On the whole ecosystem, I'll quickly mention this: over here there's a kind of glossary of different terms. When we talk about Anaconda and Conda, you can get a headache just from hearing all of these different terms, because there are so many of them. So I'll try to quickly go through the whole Conda thing: what the different parts of the ecosystem basically are.
So there's the Anaconda Cloud; this is the place where people store the packages. Then there's conda-forge, which is the largest open-source community channel. Anaconda Cloud also contains packages by Anaconda Inc., which are in the so-called defaults channel, the base channel, the R channel, those kinds of channels, but these are usually incompatible with conda-forge, because conda-forge is basically a completely different world within the same ecosystem. Anaconda Cloud contains two worlds: the open-source world of conda-forge, and the defaults world, which is curated by the company Anaconda Inc. You can use both, but usually you don't want to mix and match them, because you can get problems. Then there are the package managers. pip is the package manager for the Python Package Index: if a package comes from PyPI, you usually install it with pip. Then there's Conda, which is the package manager for Conda environments and Anaconda stuff and that sort of thing. And there's a newer implementation that we usually recommend, called Mamba, which is an open-source project as well, a C++ implementation of the same thing, because Conda itself is written in Python, so it's slow. When you create a huge environment, it can sometimes take minutes to figure out how to fit the different packages into the same environment. That's why people usually use Mamba, which can do this solving much faster. Inside, it's a mess: there's a SAT solver inside Conda, and it tries to figure out the correct packages you need in the environment, because Conda wants a working, consistent environment. pip will just install stuff into the environment. Then there are the distributions: basically, you can get an already-existing, good set of packages if you install Anaconda.
So Anaconda is a distribution of lots of packages by Anaconda Inc., and it's free for academic and non-commercial use, but companies have to pay; the free license doesn't cover companies. Then there's Miniconda, which is also provided by Anaconda Inc. It uses the defaults channel by default, so it tries to install the Anaconda Inc. packages from there, and it only contains Conda itself, the installer. You can use that as a starting point if you want to create your own environments; you don't have to install the whole Anaconda distribution. Then there's also Miniforge, which is the conda-forge version of the same thing: it uses conda-forge by default, so it's basically an open-source version of Miniconda, and it has Mamba installed as well. So these are some of the words you might hear when you look at webpages and installation instructions, and you basically need to choose what you want to use. Usually when people are on, let's say, Windows, the Anaconda Navigator is good enough that using it is a good idea, and you can manage packages with it as well; you don't have to use the command-line tool, you can use Anaconda Navigator to manage your package installations. If you're using Linux, you often either get the Anaconda installation and just use that, maybe creating your own environment that has only the packages you need, or you just take, let's say, Miniforge and use that to create your environments. That's more the command-line, do-it-yourself kind of way. There are different ways of installing the packages. And yeah, it's correct to say that regardless of whether you installed Anaconda or Miniconda or something else to get Conda, you can use the same package manager, Conda, to install packages from any channel.
For example, Anaconda and Miniconda can both be used to install from the same sources, but Miniconda is lightweight. But wouldn't you recommend, when you distribute code that needs Conda, that you list Miniconda as the requirement and provide the package list, rather than asking the user to install Anaconda and then use your package? Yes, usually it's a good idea to provide the minimal example. Usually if you go to a webpage for a library and you see "pip install this", what they mean is: this is the minimum needed for the installation. But you can usually install it in various other ways; you might install it as a Conda package, for example. Or what they are really saying is: create your own environment and then run this pip install command. Because if you just run the pip install command, you usually install into whatever environment you currently have active, and that's not a good idea. So usually it's a good idea to choose. For the sake of time, shall we show this best practice you are talking about? Yes. What is this environment, and why do we use it to isolate things? Yes, so, the environments. When we talk about environments: if you have been pip-installing things, you might have done it in the base environment of your Anaconda installation, or maybe in the environment that we have provided on the web page. What we mean by an environment is basically: wherever your Python interpreter is. Which Python interpreter are you running? That determines things: based on where that Python is, it will try to find packages near it. It basically goes one folder up and one folder down and tries to find the packages present there. If the Python is the system Python, it will try to find packages in the system libraries, and if it's installed in a completely different place, it will try to find packages there.
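The interpreter-relative lookup just described is easy to see from inside Python itself; a minimal sketch:

```python
# Which packages you get depends on which interpreter is running.
import sys
import sysconfig

print(sys.executable)                 # the interpreter that is running
print(sys.prefix)                     # the environment it belongs to
print(sysconfig.get_path("purelib"))  # where pip would install packages
```

Run inside an activated environment, all three paths point into that environment's folder; run with the system Python, they point at system locations, which is exactly why switching environments changes which packages you see.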
So what this environment means is that we create a folder, basically, get some Python in there, and then install packages relative to that Python. They are installed into their own world where they won't interact with others, and we don't have them loaded all the time; we can load them case by case. If we need to do one kind of thing with certain packages, we can activate this environment, so we basically step into the world of this Python, and then we run whatever we want there. If we are doing a different project, we can activate a different environment. This is a good idea because it makes your code much more reproducible, and you run into fewer problems, because as we mentioned previously, there are so many packages that you cannot get them all working at the same time. So it's better to have a small world that has only the things you need, where there aren't that many problems. That's an environment. And whenever there's a problem with Python, I always start with a new environment: you start from scratch and build up the environment with what you need. You can create an environment by running some commands yourself, and that is often recommended in places; you will find it in the exercises, and you can run it yourself after the course. But we don't recommend that way of installing packages in general, so let's jump straight to the correct solution, or the better solution. So what you're saying is: let's say I want NumPy 1.24 and NumPy 1.23, both versions of NumPy on the same system. Rather than just running pip install numpy, there's a better way to do this, which would help me reproduce it later? Yes. Yes.
And this is by recording the dependencies into an environment file. The problem otherwise is that you can install some version of NumPy and then realize, okay, my other code needs another version. So you install the other version, but where is the first one now? Have you lost it? Basically you either need to constantly reinstall stuff, and maybe when you do a reinstall, some dependency of that package suddenly gets reinstalled too, and now you're in a mess where the route you took to the current state of the environment depends on which commands you ran in which order. Then you cannot reproduce it anymore; you don't know how you got there. So it's usually better to create a world where everything is correct from the start, and how you do that is by recording the dependencies into either requirements.txt or environment.yml. Does it matter what you call these files, Simo? Yes: you can of course name them anything you want, but the thing is, if other people are trying to find them, these are the file names they will look for, and these are the ones the tools search for by default. pip, for example, will try to find requirements.txt. These are basically the conventions, so you should abide by the conventions. Of course, if you don't, then let the users or other people know that your requirements are in foobar.txt or something, but that's not ideal. If I visit a GitHub or GitLab page and requirements.txt is there, I know what it is. Yes, so it's better that you follow this. Yes, exactly. So you can always look at these files, and requirements.txt is meant for pip installations.
And what it contains is just a list of packages. We'll talk about versioning a bit later, but it's usually just a list of packages in a text file, and when you tell pip to install from it, it will just install those packages. Conda environment files are a bit more complicated, not in a bad way, but they have more information in them. They're usually called environment.yml. They contain the name of the environment, usually, so which environment you want to use; then where you want the packages to come from. I talked about defaults and conda-forge, the channels, so you define, in this case for example, that you want the packages to come from the defaults channel, the Anaconda Inc. packages. And then you have a list of dependencies, which are the packages. And you can notice from the syntax that you can easily convert between the two: take the dependencies from an environment.yml and put them into a requirements.txt, and you basically get a pip installation, and the other way around, you can convert quite easily. Which version of NumPy would this install, Simo? In this case, because we haven't specified any versions, both tools try to get the newest one. If some other package creates a requirement, okay, I need a lower version, then you might get that lower version. But the point here is that you let the tool do its job. If you don't have a specific need for a version, it's usually a good idea to let the tool decide which versions to get, to let the solver figure out what is compatible and what is needed.
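As an illustration, an environment.yml along the lines just described might look like this; the name and package list are hypothetical, and the channel is defaults as in the example discussed:

```yaml
name: my-project
channels:
  - defaults      # or conda-forge; pick one world, don't mix them
dependencies:
  - python
  - numpy
  - pandas
```

The dependencies section maps almost line for line onto a pip requirements.txt, which is why converting between the two formats is straightforward.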
Because you mentioned that they're easily interchangeable, I want to mention one thing: depending on the channel you select, there might be different versions of some packages. It will pick the latest version, but depending on the channels you specify, different versions can get installed. That's why it's important to specify the channel in your environment file, rather than depending on whatever defaults you have. Yes. And in a Conda environment file you can also specify, let's say, the Python version you want to use, and that sort of thing; you can get more elaborate with those. So why is it important to have the requirements file? Because it makes it possible to recreate the environment quickly. You can remove the whole thing and recreate it, and you should get basically the same situation. Of course, in some cases you want to lock down some versions. If you need to reproduce something, let's say you ran something for your paper and you want collaborators or reviewers or whoever to be able to reproduce it, then you might want to lock down, or pin, the versions you are currently using, because you know those work. In that case you can give version numbers. You can also give constraints like greater-than or less-than, all kinds of things, but basically you can lock down certain versions of packages. So let's say you want these exact versions of NumPy, Matplotlib, Pandas and SciPy, and the same with Conda, with Python 3.10 added here, so that you also get a certain Python version. So as a best practice, wouldn't you recommend, instead of having these exact versions, saying something like bigger than this version but less than that one, giving some range, rather than being this specific?
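Pinning might look like this in a requirements.txt; the version numbers here are purely illustrative, not the ones on the slide:

```
numpy==1.24.2
matplotlib==3.7.1
pandas==1.5.3
scipy==1.10.1
```

In a Conda environment.yml the same pins go into the dependencies list, with an added line like `python=3.10` to request a particular Python series.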
The way versioning usually goes in these packages, if people abide by it, is so-called semantic versioning. The first number is the major release: between two major releases, things are not expected to keep working. If the first number goes from one to two, it basically means the package is no longer compatible and you should expect to make major changes. The minor version, the second number, means they can introduce new features and deprecate old ones, but they usually give a few versions of leeway where you see deprecation warnings and that sort of thing, "this will go away in the next version" or something like that. So a change in the second number basically means some things might not work anymore, but the main things stay the same. And the third number is the patch version: if they have done their job correctly, it should never affect your code; it's fixes for bugs and that sort of thing. This means you can usually give quite a bit of leeway: you can say that it needs to be higher than version one but less than version two, or some range of versions. So the advice you gave, we also need to follow it ourselves when we distribute packages, stick to those rules, because people expect that. Yes. And it's usually a good idea not to pin yourself too hard to package versions, especially if you're developing something. If you pin yourself to a certain version, you're basically locked in time: you're at that moment, and the world moves onwards, but you are stuck at the time when you created the environment and pinned the versions. And in two, three, four years, most likely nobody can recreate that environment anymore. And that's a bad thing.
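The major/minor/patch ordering described here can be sketched with plain tuples, no extra packages needed:

```python
# Semantic versions (MAJOR.MINOR.PATCH) compare component by component.
def parse(version):
    return tuple(int(part) for part in version.split("."))

assert parse("1.23.5") < parse("1.24.0")  # minor bump: new features, deprecations
assert parse("1.24.0") < parse("1.24.1")  # patch bump: bug fixes only
assert parse("1.24.1") < parse("2.0.0")   # major bump: expect breaking changes

# A "higher than 1, lower than 2" range in pip syntax: numpy>=1.24,<2
print("semver ordering holds")
```

Real version strings can have suffixes like rc1 or .post1 that this toy parser does not handle; pip's resolver uses a full implementation of the versioning specification.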
So your code will be stuck in that time, so it's usually not a good idea to do that too often. Yeah. Sometimes we end up bypassing these practices; let's say I have already created an environment, is it possible to create this file from the existing environment, so I can give it to other people? Yes. In this exercise there's an example of how you can freeze an environment. You can use conda env export, and there's an additional flag here, --from-history, if you want to freeze it based on the commands you have run previously. You can also run pip freeze. But what I usually recommend, if you don't know how you created your environment: I personally start from a completely blank slate and then run the code until I no longer get import errors. If you really don't know what's in the environment, it's very hard to reproduce it. But you can freeze it if you want to keep an environment for posterity, for a publication or something like that. Should we show an example that learners could also follow? What do you recommend? Let's do an example. This will of course depend on what system you're running; in my case I'm running Linux, so I can use the terminal, for example in JupyterLab. I'll start a new terminal here. If you're running, let's say, Anaconda Navigator, you can point it to the environment file and let it work on that; there are various different user interfaces. But because I'm using the command line, I'll create the environment this way. So let's say I have an environment.yml; in this case it's basically the one over here with numpy and matplotlib and so on. I'll run exercise 4. You can run it yourself, and I recommend trying it.
But because of differences between systems, it might be complicated to get it working; do try if you feel like it. So what I'm going to do is create this environment using conda env create, pointing it to the environment.yml that I have. Can you show us what that file looks like? Yes, yes. It's just a .yml file, YAML syntax, with a list of packages in it. For those following along, it's in the teaching material. Oh yeah, I already have this environment, so I'm going to remove it first; you can remove environments with this command. And why is that? Removing environments is also important. I usually remove all of my environments every two months or so, and the reason is that I'm secure in the knowledge that I can recreate them from the environment files that I have. I don't fear losing my setup. If the system you have set up is so fragile that you're worried you'll mess it up and never be able to recreate it, that's a terrible fear to have as a programmer. And that's why it's very important to create the environment file: then you can let go of that fear; you don't have to worry about it anymore. Yeah. So typing and installing packages one by one would be easier. That is the easy way out, but the robust way is what you're doing: slightly more work, but you will thank your past self. Yes. And if you want to, let's say, add a new package there, what I usually do: you could run the install command in the environment yourself, and in the documentation there's also a way to update the environment based on an updated environment.yml. You can install new packages directly, but I always record them in the environment file anyway, because eventually you mess up the environment. Everybody does that.
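The remove-and-recreate cycle being demonstrated, as a command sketch (the environment name my-project is hypothetical, and the commands assume a working Conda installation):

```shell
# Freeze only what you explicitly asked for, not every transitive dependency
conda env export --from-history > environment.yml

# Throw the environment away without fear...
conda env remove --name my-project

# ...and rebuild the same world from the file
conda env create --file environment.yml
```

With Mamba installed, `mamba env create --file environment.yml` is a drop-in replacement for the last step and usually solves much faster.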
Everybody does a wrong installation or something at some point and breaks the environment. This is unfortunately the side effect of having so many different choices of libraries. At the same time, you can fix it by having a consistent way of recreating where you are, and that is either the environment.yml or the requirements.txt. Of course, requirements.txt works very well if you're only using Python packages, stuff in the pip world. But if you're using GPUs, if you're using more complex things, I really recommend checking out Conda, because it makes it possible to install much more complex environments where everything works together. So Simo, it's thinking now: Conda is thinking, it's solving the environment, you see this timer going on. So when you work on a system and create environments like this, from a requirements file, the waiting time is also shorter than when you add more and more to an existing environment, because Conda needs to figure out how everything matches. And another thing: when you work on a system for a long time, it can build up a big cache, and packages you don't want accumulate. So in addition to reproducibility, there are many other benefits to following the procedure you're doing now. Yes. I'll also mention that if you want to share the environment or reproduce it on another system, the environments themselves are usually not easy to transfer. There are tools to do it, but it's very finicky. It's usually better to just move the environment.yml to the other place and recreate the environment there. It's just one text file, very easy to copy, and then you recreate the environment in the new place. The environments themselves, if you are doing work with deep learning, can get to 4 gigabytes. You don't want to move 4 gigabytes when you can move a text file; that's not usually a good idea. No.
Moving things around between different systems would create issues, because these are pre-compiled binaries; you don't want to move something from Windows to a Unix computer, it will never work. But you can move this environment file and recreate the environment according to your target system. You can also easily version control the environment, because it's a text file: you can use Git and other version control tools and keep multiple snapshots. That's an important thing, and I should mention that in the Python Package Index, and also in the Conda channels, there's a huge amount of variety based on what operating system you're running. Packages are usually built for macOS, for Windows, for Linux, for different variants of Linux, with different dependencies, so you can have a huge number of different builds of the same package: the same package, built in different ways depending on where it's going to be installed. These tools make it possible to install the right one. And of course this is a bit of a demo effect, but the environment solving is taking a long time, and this is why Mamba is so popular. If we have time, I can show how long this takes with Mamba; I'm pretty certain it doesn't take this long. This can also happen in Anaconda Navigator, for example; it's pretty annoying sometimes that it doesn't produce output very often. What is it doing? It's trying to solve the environment, and it takes a while, and it doesn't necessarily produce output, so you can think it has crashed, but sometimes the solving just takes a long time. Let's see if there are any good questions in the chat. I will also try it on my terminal, let's see.
One reason could be that you have a lot of cached packages; there are ways to clear the Conda cache, for example, so it would check fewer things. Also, your base environment, where this JupyterLab is installed, might already have certain libraries installed, so what Conda is trying to do is avoid redoing things, maybe reuse things, and stay compatible. Yeah, and maybe it's also because of other things running on my laptop at this point. Are there any good questions? Where should the file be stored? Usually, if you're using version control, it's a good idea to store the environment and requirements files together with the code, so it's easy to reproduce. There was also a question about conda update. That's a good thing to mention: updating already-existing environments can sometimes be really complicated. Well, for pip it's less complicated, because pip usually just downloads stuff and doesn't necessarily care; it will just install it. It doesn't really solve; it just goes YOLO and installs packages. Which is good in many cases, but it sometimes gets you an environment that doesn't work anymore. What Conda does is the complete opposite: it tries to make certain that everything works. If you have ever been on a tour, a tourist tour or something, and there's one person who walks slowly, then the whole group needs to walk slowly; that is basically what can happen with Conda packages. There's one package that basically says, I don't want to update, I don't like updating, I don't want to be updated, and it can hold the whole environment back. It can create a situation where Conda goes off on a tangent trying to build an environment around something that doesn't want to update anymore. For all of these situations, the solution is usually to just create a new environment.
For fun's sake I created a new environment here, so let's try with Mamba. I'm running it now from my terminal, the exact same command, but this time with Mamba. Mamba is much faster, so I'm pretty certain that this will finish before the conda run gives any output, and it gives more output as well: it tells you where it tries to find the packages, what it downloads, which packages, and that sort of thing. So now it tells us what we are looking for, and soon it should give the answers. Now it's already doing the installation. This is why we usually recommend that people use Mamba when they create environments: it's already done while conda is still wondering. And this is maybe a good demo of why certain tools get reinvented with basically the same user interface. But do we have anything more for this session? Do you want to show something more? Yeah, maybe I'll quickly show how to activate the environment. We often recommend that you don't necessarily run `conda init`, though you can run it. If you look at a terminal and you see this "base" over here, it means that your conda is active now and Python will always be found from that environment. This can cause problems on many systems if you don't want it always active: if a program tries to use Python but suddenly gets the Python from the activated conda environment, that can cause problems.
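A quick way to see which Python an activated environment actually gives you, sketched in plain Python with no conda-specific assumptions:

```python
import sys

# Path of the interpreter that is actually running. With an activated
# conda environment, this points inside that environment's directory.
print(sys.executable)

# Root directory of the environment the interpreter belongs to.
print(sys.prefix)

# The Python version you actually got (e.g. 3.12 if nothing was pinned).
print(f"{sys.version_info.major}.{sys.version_info.minor}")
```

If the paths point at the base installation rather than your project environment, that's the "wrong Python" problem described above.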
So we usually recommend that people don't run `conda init`, but you can of course run it, and if you do, you can use `conda activate` to activate the environment, and now you have an environment here. But usually we don't recommend having it active all the time, because other programs that want to use Python might find this wrong Python instead of the Python they should be using. Now, in the file we used for this environment we didn't mention which version of Python we need, so let's check what version of Python we got: 3.12. In this environment I didn't specify which Python to use, so I just got the newest Python in this case. And if I run `which python`, which is a Linux-specific command, you can see that it shows the Python from my newly created environment.

Does an environment always include a Python, even if you don't mention it in your requirements file? No: if nothing requires Python, it doesn't necessarily contain Python. So when you create an environment, it can be a good idea to always add, say, pip and python, so that you always get pip and Python into the environment. But most of the time you install something like numpy, and that has a dependency on Python, so it will bring Python with it. It's always this kind of question of who the major players are, the major packages that you want, and those are the ones that bring everybody else with them. They're basically like Taylor Swift going to a restaurant: other people come because Taylor Swift is at the restaurant. Those are the stars of the environment. Say you install PyTorch: PyTorch will bring its friends with it, but nobody really cares about the friends, they care about PyTorch. This is usually how you want to think about the environment: there are the major players that you want, and those are the things you want conda to focus on.

To investigate what you said, that there might be other packages that pandas needed: is it possible to re-freeze this environment and show how that would look after creating it? Yes, let's run `conda env export`, and you notice here that I get this pretty horrible-looking environment file where suddenly there's a huge bunch of stuff. These are basically the exact versions I got, and many of them are packages that came in as dependencies of, say, numpy. But you don't want to give other people this as an installation instruction, unless they want the exact same environment; usually you want to give them just the packages you actually needed. In fact this file will break on more systems, because if you look at the hash, the third field after the two equals signs, that build hash is very specific to the system you are on now, so it will very likely break on a different system. Many of the lines here read as: you have a package, then you have a package version, and then you have a specific build of that package, for example in this case built for Python 3.12. It's very specific, and that's why you usually want to give a rough outline of the environment instead of the exact requirements. Of course, if you want to create the exact same cake that somebody else has created, you can give the exact grams, the exact brand of flour, when it was made, the exact amount of flour that is needed. But most of the time you want to give a generic recipe that says: just get me this amount of flour. It's similar here: you want to give a general recipe, you
don't want to give the exact one, and conda is like the cook that can then manage it. Also, the simpler file you used lets others build on your environment: for example, if they need, let's say, PyTorch in their program in addition to what you installed, it's easier for them to add PyTorch to that file than to edit this exported one, which is sort of overwhelming and not something you want to share, whereas the other one is very simple. Should we now go to a break? We have two minutes, I think, one minute, so you can have a look at the questions. Yeah, let's look at the questions quickly and then go to a break. If you have any more questions, please go and ask them and we'll try to answer them. Do you want me to share the questions? Yes, maybe, that would be great. OK, here we go. You can only see on the stream how this goes; I can see the questions on the stream. There are some questions about things like how you make the stuff in the environment file: do you have to do it manually? I think you might have answered that. Usually it's a good idea to write it yourself. Where do you store an environment file?
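To illustrate the `conda env export` output discussed earlier: each dependency line pins a package, an exact version, and a platform-specific build string. The names, versions, and build strings below are invented for illustration, not copied from the demo:

```yaml
# Sketch of `conda env export` output -- versions and build strings invented
name: my-analysis
channels:
  - conda-forge
dependencies:
  - python=3.12.4=h9d27549_0
  - numpy=1.26.4=py312h8753938_0   # package=version=build triplet
  - pandas=2.2.2=py312h1d6dd38_1
```

The third field after the second `=` is the build string, which encodes things like the Python version and the platform the package was built for, which is why these exact pins rarely transfer to a different system.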
So I think we have some Mamba installation instructions somewhere as well, so we have to find them and post them. Under question 44 we'll place the link as soon as I find the Mamba installation instructions. Yes. About the question: I often think of environments like going camping or going to a hotel, going on a trip. You pack the usual things: your toothbrush, your shampoo, that kind of stuff, and that's basically what most of an environment is. You always bring the usual stuff that rolls off your tongue: OK, I will need these anyway, I always want these. And then there are usually maybe two or three packages that are the actual meat of the whole thing, the special ones, like needing a certain kind of clothing for a trip, clothing for warm weather or something like that. That's the actual point of the environment. Yes, although our discussion is very interesting, Simo, we have to stop now. On question number 46 I will briefly mention something, especially about seeing "access denied" when you try to install, especially on shared HPC systems, for example. It's always the case that you are not allowed to modify the central installation, and actually you should not try to do that. You can use `--user`, as mentioned in the answer, but you should go for an environment instead; Simo was promoting isolated silos where you can add things, have different versions, and also delete them if you don't want them, so don't try to install into the central installation. Yes, so if you install, say, Anaconda Navigator, in that example you install it into Program Files as root or administrator, and you maybe don't have write access to that folder. That's a good thing, because then you cannot mess up the installation. What you want to do is create a separate realm, a separate world, a virtual environment or conda environment, where you then install the stuff, because if you
run, for example, `pip install --user`, it will install into folders that will always be present, so if some other environment uses the same Python version it will find the same packages, and then all hell breaks loose and suddenly nothing works anymore. You want to keep your packages in the environment, in a separate silo, a separate world: you first create the world, and then you install the packages into that world. Simo will take the rest of the questions. Thank you very much for your great introduction, I think it will be very useful for most of us. Richard, can you please take over? Yes, so I guess we will have a break until 23 past the hour, so see you then. Bye.
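Why `pip install --user` leaks across environments can be seen from where it installs. A small sketch, using only the standard library:

```python
import site

# Directory that `pip install --user` writes into. It is keyed by the
# Python version only (e.g. ~/.local/lib/python3.12/site-packages on
# Linux), not by the environment, so every environment that runs the
# same Python version sees the same user-installed packages.
print(site.getusersitepackages())

# For comparison: the environment-local site-packages directories,
# which are private to the current environment.
print(site.getsitepackages())
```

If the first path shows up on the import path of several environments at once, a package installed there can shadow or conflict with the versions those environments expect, which is the failure mode described above.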