Sorry folks, I don't know how I got disconnected; let me just share my screen again. So there are three components in Eventyay: one is the public section, where you can see the FOSSASIA Summit and so on; then there's the organizer section, which only the person who created an event can access; and then there's the admin section, which only the administrator of the platform can access. Whenever you create an event, you can access the organizer section: for example, I can go to "Manage Events", see the events I manage, and open the dashboard. This terminology is important, because sometimes people see in issues that the problem is in the organizer section; say I copy the URL and share it in the issue description, and they try to open that URL, and obviously they can't access it, because this is not a public section but an organizer section. So you would need to create your own event to access the organizer section for a particular event. And some bugs are in the admin section, which you cannot reach even by creating your own event; you need to deploy your own server locally to get to that section. So that's a general outlook of the sections of Open Event. Then there are the kinds of users. There's the admin user, at the top of the hierarchy; then there are owners of events, and the owner of an event can do anything related to that event, including deleting it or transferring it to another owner; then there are organizers, who can also do everything except deleting the event and transferring its ownership; then there are co-organizers, and there are also roles like video moderators. So that's the rundown of the Open Event platform. The next thing we'll do is set up the server, but before that let's take a short stop: does anyone have any questions so far?
Okay, I'll take that as people not having questions yet, so let's turn to installing the server. I have this repository, open-event-server, open. Whenever you want to start setting up a project, you go to the repository, there's the README, and you can see the installation instructions. We want the local installation instructions, so I just go to that section and start following them. I have a terminal here in which I've opened a fresh Docker container; it is clean, so I don't have anything pre-installed in it, and whatever we do, we do from scratch. I assume that nothing is installed on your computer except Git, and that your operating system is Linux. If you don't use Linux, if you have Windows, I recommend that you use Linux, Ubuntu or any of the distros, but if you can't, then I recommend that you use WSL 2: the Windows Subsystem for Linux allows you to run Linux in a virtual environment on Windows. But anyway, let's get started. As you can see, the first step is to get a copy of the source code, to clone it. If you want to develop Open Event, it is recommended that you first fork the repository and then clone your own fork, but as I'm just demonstrating here, and my fork has diverged a bit from the source project, I'm going to clone the original project itself. So I have copied this command and pasted it, and it's cloning; then I cd into the repository after it has cloned. Now we have cloned the repository and I am changing the directory to open-event-server. The next step is to install the dependencies. There's a section for macOS and one for Debian/Ubuntu, so I'm going to copy the Debian/Ubuntu installation instruction and paste it here; I have to enter my password, and once I have entered it I have to confirm that I want to install these dependencies, and now they are getting installed.
Anyone who is following the installation instructions along with me can do this too, and if they get stuck anywhere they can add their questions in the chat; whenever I take a break while something like the dependency installation is happening, I'll answer them. Now I have a command-line prompt asking for my geographic location; that's because there's no default locale in the Docker container, but if you're installing locally it most probably won't ask you this. Anyway, I enter Asia and then Kolkata as my time zone, and now this is set up. If I scroll up, there's an instruction that to start PostgreSQL you have to run this command, so let's run it, and I get an error that this must be run as the cluster owner or root. There are two options: I can either switch my user to postgres and run the command, or just run it as root; I'll run it as root, so now the cluster should have started. After you have installed PostgreSQL, you want to check whether you can access it using the psql command, the command-line client for PostgreSQL. It gives me an error that role "docker" does not exist; that's because when I run psql it assumes the database user is the same as my Ubuntu user, which here is docker. I don't want that, so I switch to the postgres user and try psql again, and it works, which means Postgres is installed correctly. I exit and continue with the instructions in the README. The next instruction is to install Poetry. First of all, I'll mention that the project only works on Python 3.8, so if you do not have Python 3.8 installed, you have to install it. As you can see here, I have Python 3.8.2: the dependency setup installed the correct Python version by default. But it may happen that you do not have the correct version on your Ubuntu, and it may also happen that you have 3.9, which is not supported either. The project depends on Python 3.8 features, so running it on Python 3.7 or 3.6 will throw syntax errors, and there are breaking changes in Python 3.9 that our project doesn't support yet, so we can't use 3.9. If you're in a situation like this you may need pyenv, which lets you install multiple versions of Python; we won't go into that here, there are instructions in the linked repository, and if you get stuck we can help you, here or in the Gitter channel. So let's install Poetry, which is the next step. Poetry is like pip: pip is Python's dependency management solution, but it has some disadvantages; it has no dependency locking, and its dependency resolution depends on the order of the dependencies. Poetry solves these issues, so we use Poetry instead. Let's install it, and it gives me an error: the command curl is not found. One general rule is that if some command is not found, you just sudo apt install that binary, so I install curl with this command and run the installer again, and now it says "python not found". But how can that be, I just installed Python? Well, on newer Ubuntu versions a plain `python` binary is not installed by default, or it may even point to Python 2.7, but I want Python 3, and if I type `python3` it works. So what I'll do is alias `python` to `python3`. It may happen that you have to alias it to `python3.8`, because by default `python3` may point to 3.6 while you installed 3.8 as an extra version; either way works, and you can even just type `python3.8` instead of `python` in every command. I'm going to use the alias because it's much easier to do it this way.
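A quick sketch of the version check and alias just described (on some setups you may need `python3.8` explicitly instead of `python3`):

```shell
# Check which Python 3 is on PATH; the project needs a 3.8.x release
python3 --version
# Newer Ubuntu releases ship no plain `python` binary; an alias in
# your shell profile (~/.bashrc) lets `python` mean `python3`
# interactively:
#   alias python=python3
# In scripts, call python3 (or python3.8) explicitly:
python3 -c 'import sys; print("%d.%d" % sys.version_info[:2])'
```

Note that an alias only applies to the interactive shell it is defined in; scripts and installers will still look for whatever binary name they were written against.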
So I run the command again, and now it is installing Poetry. One more important notice: whenever you install something, there are post-install instructions, and if things don't work after installation, you should scroll up and read them. It's similar to when we installed Postgres: it would not have worked right away, you had to read the instructions printed during installation. Now Poetry is installed, but if I run the poetry command you'll notice it's not working, so I read the instructions, and they say the installer has already added Poetry's bin directory to my PATH, but I have to run this command if I want to use it right away. What happens is that whenever we install something without sudo, it's not installed globally; it goes into the user's home directory, which is not on the PATH by default, so we can't use it without adding it to the PATH. That's why you normally have to add it in .profile, .bashrc or .zshrc; Poetry does this automatically, but you need to close the terminal and open it again for it to take effect. If you don't want to do that, you run the printed command in the current shell, and after I do that, poetry works. The next instruction is to install the dependencies, so I run `poetry install`. It throws an error saying there's no compatible Python version: Python 3.8.5 is not supported and the project requires Python 3.8.6. Poetry is very strict about its Python version, and with pyenv it's very easy to install a new one; the project currently pins Python 3.8.6, but we have 3.8.5. When you set this project up properly and locally, I recommend you install the exact version, but because I know that Python 3.8.5 and 3.8.6 are compatible with each other, I'm going to use a trick: I'll change pyproject.toml to specify that the required version of Python is 3.8.5 rather than 3.8.6. I don't have an editor in this container, so let me install one with the same apt command; then I go into pyproject.toml and say that Python 3.8.5 is required rather than 3.8.6. Now if I run `poetry install` again, it automatically creates a virtual environment and installs the dependencies there. So what is a virtual environment? You may have heard of it before: a virtual environment is a separate folder per project which stores that project's dependencies separately. If you use pip to install a dependency, it's installed globally and may conflict with other projects: one project may require requests 3.2, another requests 3.10, which is very difficult to manage. That's why we use per-project virtual environments and install the dependencies there. It's also important that a virtual environment is activated before you use it in a project: if the dependencies live in a separate folder, they won't automatically be picked up by the project, and you'll get errors. Without Poetry, creating a virtual environment means a fair amount of manual work, but with Poetry it's very simple, and there's a Poetry command to activate it, which we'll see shortly. Now the dependencies are installed, and the next instruction is `poetry shell`: this activates our virtual environment, and as you can see I have the virtual environment active here, managed automatically by Poetry. Then the next command is `pre-commit install`. What it does is that whenever you commit your changes for a pull request, it automatically formats the files, sorts the import order and removes any unused imports.
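For intuition, here is roughly what a virtual environment boils down to; Poetry automates all of this, and the directory name below is arbitrary:

```shell
# Create an isolated environment (--without-pip keeps the demo light)
python3 -m venv --without-pip demo-venv
# Activating prepends demo-venv/bin to PATH, so `python` now resolves
# inside the environment and packages installed there are found first:
. demo-venv/bin/activate
command -v python     # points into demo-venv/bin
deactivate            # restore the previous PATH
```

`poetry shell` does essentially the activation step for the environment Poetry created for the project.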
This helps the maintainers and you by doing that work automatically; otherwise we would have to ask you to do all of it manually before we could review the pull request. So I install the pre-commit hooks like this. The next instruction is to create the database for the project. For Linux users, we first create a database user for the project, so I copy this command: I make myself a superuser, entering my password for that, and once that's done I can run the psql commands without sudo. Previously we were switching to the postgres user with sudo before running commands; now I don't have to, I can run them directly. So I create the database for the project and make myself its owner, and similarly the test database as well. There is another instruction just below about the peer authentication method: it's very easy and does not require a password. And there are further instructions below for Mac users, for using the PostgreSQL shell, and so on; you can use those as well if you prefer. Then comes the part where we generate the configuration: we copy the .env.example file to .env. Let's see what's in the .env file: as you can see, there's the database URL, an app config, the postgres user, Flask settings; these are the configuration of the server, written as environment variables in a file. You could also create the environment variables yourself and just export them, and that would work as well, but after you reboot the PC they would be gone, which is inconvenient for a local project. On servers we recommend real environment variables, but .env is an easy way to configure environment variables locally, per project, so that's what we do. Then we read the next instruction; this is something most people forget to do, but it's very important: we need to create a secret key for the server. The secret key is used for cryptographic purposes: whenever we encrypt something or sign a login token, we use this secret key. If you do not set the secret key, or you share it with someone else, someone else can forge logins on your server, so never do that; every deployment should have a secure, separate secret key, because that's what prevents anyone from logging in to your server as an admin. So we run the command specified here to generate a random secret key, and then we write it into .env as the secret key; I go here, set it, and save. Then we create the database tables: we run these two commands to create the actual tables in our database; they run all the migrations and create everything required. Now it's asking me for my email. It's important that you enter your actual email, where you can receive mail; if you enter a dummy email you may not receive the confirmation emails. I enter that, then I enter a password (any password will do), I repeat it, and now it creates all the required tables. The next thing is to stamp the database head, to record that we have migrated the database. Cool, now it's done. There's a notice that if you used a different username and password for the database, you need to change the DATABASE_URL in .env. Then the next part is to run the application: there's an instruction for macOS and one for Celery, but if you just want to run the server you can do it like this. So I run the server, and now it's running on localhost:5000. To verify, I go to the next terminal and run `curl localhost:5000`, and I get a lot of output, but this is not how we want to access the API: for the API we have /v1 and then the API name. If I want to access events, I run this, and as you can see I don't have any events, because this is a new instance of the server.
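The secret-key step above can be done with a one-liner; the variable name `SECRET_KEY` is assumed from the server's .env, and any cryptographically random string works:

```shell
# Generate 32 random bytes as 64 hex characters:
key=$(python3 -c 'import secrets; print(secrets.token_hex(32))')
echo "$key"
# Then store it in the config file, e.g.:
#   echo "SECRET_KEY=$key" >> .env
```

The point is simply that the key comes from a CSPRNG and is unique per deployment, never copied from an example.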
The hello world of this API is settings, though, so I can look at the settings of this particular server. Also, let me install jq; another tip is that if you're dealing with JSON you can install jq, and it will format the JSON into a colorful, easily readable form in the terminal; it can also parse and filter JSON, so it's a very useful utility. As you can see, these are the settings: the app name is Open Event, and everything else is just null or false, not initialized yet. So the server is set up. Now let me also show you how to see the server in the browser; but before that I need to make it listen on 0.0.0.0, because I'm running it inside Docker and localhost doesn't work that way. Okay, now it's listening on all interfaces, so if I go here and access localhost:5000, it opens up and you can see the documentation in a bit; it's quite large, so it takes some time to load. There's how to log in, how to create a new user, all the event information, placeholders, microlocations, sessions, speakers and so on. Similarly, as you saw previously, I can access settings like this. And there's another little thing which is very cool: we don't have a lot of GraphQL support, but we do have basic GraphQL support for things like settings, so I can write a query for settings; as you can see it auto-completes, and I can query it like this. That's cool as well. So now the server is set up and everything is working as expected. I'll take a break here and ask: do you have any questions, is there anything you want answered, are you stuck at any point? Alvin, were you able to solve your issue, or do you still have the "python not found" error? You can answer in the chat. And if anyone has any other question, please write it in the chat.
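As a small illustration of JSON formatting on the command line (the response shape here is made up for the demo; `python3 -m json.tool` is a stdlib fallback when jq is not installed, while jq additionally supports filters such as `jq '.data.attributes'`):

```shell
# Pretty-print a compact JSON blob, as you would with output from
# something like `curl localhost:5000/v1/settings`:
echo '{"data":{"attributes":{"app-name":"Open Event"}}}' \
  | python3 -m json.tool
```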
Pratik will be answering there, or I'll answer in the next break. Alvin is asking whether these are the right instructions: no, that's the instruction for Docker, which is for deployment, not for development, so you have to follow the local installation instructions. Let me share the screen again: this local.md, these are the instructions; and if you have another Python version and it doesn't match what's required, you can use pyenv. Try that out, and if you face any problem you can let me know. Then, if nobody has any other question, let's move to the frontend. The frontend is the UI part: whatever you see on eventyay.com is the frontend, and the backend is all the logic, the API, the models, everything; validation, checking that everything works, all that logic lives on the server. I go to the frontend repository, where there are local installation instructions. As you can see, I need these dependencies: Git, which is already installed, and Node.js 14. If I go here it shows how to install it; you can download it, but I am more of a command-line person, so I'll install it from the command line. Let's go to the next terminal; we'll leave the server running in one and do the frontend part in this one. Let me clear the terminal and install Node. As I said, whenever we want to install something, we write sudo apt install and then the dependency. There's nothing called "node", and if you search for a bit you'll find the package is called nodejs on Linux, so I install that. Now it's installed, but if I run `node --version` I get 10.19.0, and I need 14; we need exactly 14 because, as I said, we need the version specified in the project, since that's what is tested. So how do I install Node 14? One way is nvm: like pyenv, nvm is the Node version manager, and it allows you to install multiple Node versions simultaneously. I search for nvm, and there it is; this is how you install it, so I copy this and paste it, and it's done. Again, it says that it has already added the required changes for nvm to be usable in the shell, and that you need to close and reopen your terminal; but if you want to use it in the same terminal, you need to copy this snippet and run it. I'll do that; again, this was not installed using sudo, and thus we need to make it available on the PATH manually. Now that's done, and if I run nvm I get instructions on how to use it. I want to install the 14 line of Node, and I have checked which version is used with the project: it's 14.15.0. Preferably use the latest version in the 14.x line; you can't use 15.x, but any 14.x is good. So I install Node 14.15.0, and now `node --version` shows that it has automatically switched to 14.15. If you have multiple Node versions it may not switch automatically, in which case you use `nvm alias default` and then the version, like 14.15.0. Whenever we install Node, npm is automatically installed, but we want Yarn; there's an installation instruction for Yarn, so I copy and run it, and now that's installed as well. Then we git clone the repository; again, you should fork it if you want to develop and clone your own fork, but as I'm just showing the instructions, I'll clone the original repository. Now it is cloned, I go into open-event-frontend, and the next instruction is to run `yarn`. Once I run yarn, it installs all the dependencies, which takes some time. So, does anyone have a question or a doubt, anything they want to ask? Also, if you run `python --version` and it's not found, you can also try `python3 --version`; that works as well.
Frontend installation is quite a bit easier than the server. If you joined this workshop the previous time as well, you would have noticed that we have changed the server commands to require fewer extra installations: libraries like libpango are now installed by default, so you don't have to do that yourself; curl and git are currently the only two things we assume you already have. Now the dependencies are installed, and the next step is to copy .env.example to .env; just like the server project, we have a .env file here as well. Let's look at it: there are configuration options for Mapbox, the API host and other things. The important part of the configuration is the API host; we'll see in a moment what it means. Then the next instruction is to run this command, and it throws "gettext not found", so we need to install gettext. This is a problem on Windows as well: people try to install this project on Windows, and Windows does not have gettext; it's not as easy as `sudo apt install gettext` there. I don't know whether package managers like Chocolatey handle gettext or not, but I tried installing it for a couple of people and it was very difficult, so I just recommend using WSL instead. I have installed gettext, I run the command again, and now it works. Moving on, there are some notes about installing gettext and about what FastBoot is, and after this we run `yarn start`. It takes a while and it's an involved process; it uses a huge amount of memory as well, so bear with me a little. Until then, if you have any questions or are facing any issues with the installation, do ask; everyone is very welcoming and keen to resolve doubts, and Pratik is also resolving doubts in the comment section in parallel. It does take time, though thirty minutes is a bit too much. As you can see, there are some deprecation warnings and other errors in the start-up output; unfortunately they are not from our code, so we can't do anything about them. Okay, now the project has started on this port, so let's go and see. There are some loading issues because it needs a lot of memory, and even with 16 GB mine is quite filled up; but the project is loading, and you can see it's similar to eventyay.com, except there are no upcoming events and no FOSSASIA Summit. Why is that? Basically, it's because of .env: in it, the API host is set to a FOSSASIA staging server. What we can do instead is set the API host to https://api.eventyay.com, and then obviously comment the other line out; then I have to restart `yarn start`, which takes some time, so I'm not going to do that, but if I did, you would see the FOSSASIA Summit and everything that's present on the actual eventyay.com locally as well. And then, obviously, there's the option of pointing the API host at your own local server, localhost:5000; that's what we'll do, so let's rerun `yarn start`. Unfortunately there's no easy way for the frontend to pick up the new API host without restarting; it takes a little less time than the cold start, but it is still somewhat slow. So that is how you configure the API host in the project. Now the API host is localhost:5000, and to see the change: notice that it currently says "eventyay" here, and after the project builds you will notice the change here as well.
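For reference, the switch described above is a one-line change in the frontend's .env; the variable name here is assumed from the example file, and the URLs are the ones discussed:

```shell
# Point the frontend at your local API server:
API_HOST=http://localhost:5000
# ...or at the production API instead:
# API_HOST=https://api.eventyay.com
```

Whichever line is active when `yarn start` runs is the API the built frontend will talk to, which is why a restart is needed after changing it.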
The project is built and the site has reloaded automatically; now it says "Open Event" instead of "eventyay", because by default we name the application "Open Event" on the server. I'm going to log in, using the email I registered when setting up the server so that I don't have to create a user again; you could create a new user here as well. I log in, and now I can go to the admin section, because this is my local server; as you can see, there are no sessions, no events, nothing. Now I go to the settings section and change the app name: for example, say you're running a college fest, Hack Fest 2.0; you can change the app name, set the API URL to localhost:5000, configure Google reCAPTCHA and the mail settings, set up the Rocket.Chat integration that we also use currently, set the tagline to "have fun", and so on. The API URL localhost:5000 is obviously temporary; if you deploy it, you'd give it an actual URL like api.eventyay.com. I'm going to disable email for now and click save. What is this? Okay, it wants a from-address, so let's set that, and the frontend URL is localhost:4200; save, then I go here and reload. So what happens is that the settings are cached; let me clear the settings cache and reload the page, and there you go: Hack Fest 2.0. Thus you can create events and do everything you can do on eventyay.com, and also explore the admin settings: on eventyay.com you can be an organizer, so you can already access the organizer section, but you cannot be an admin, so if you need to debug anything in the admin section or add a feature there, you need to deploy your own server locally. So this is basically how you deploy the server, how you deploy the frontend, and how you link the two. That's the main workshop part; we'll discuss a bit more, but first, do you have any questions, are you stuck anywhere, do you want to see something specific, what would you like to do for the rest of the workshop? Leslie says: "I tried to set up the Open Event server last night; in the final step, the command to run Celery and start the app did not work, but the gunicorn one could start the server." Right, Celery could not start and did not return to the command line correctly; okay, that's a good point, let's see how to run Celery. The README gives the command for it, so let's see what happens. I've opened a new terminal and I'll go to the server repository; again I run `poetry shell`. You don't have to run `poetry install` again and again: once the dependencies are installed, they stay installed, although if the dependencies change upstream and you get "No module named ..." errors, then you do run `poetry install` again. But whenever you want to work on the project, you run `poetry shell`; the indicator is that without an active poetry shell you won't see this prefix in the prompt, and you may not have access to the dependencies. So we had this command for running Celery; I copy it and run it, and there's an error about Redis. Just as we have the Postgres server, we have a Redis server, and we need to start it, so I start the Redis service and run the Celery command again. For Celery we need Redis to be running, so if you get this error, "cannot connect to Redis", it means the Redis server is not running; it's a good thing that we see such errors up front instead of assuming everything is working. This error happens to a lot of people, so if you get it, you just run `sudo service redis-server start`.
Now when I run it, I get Celery's startup message; there are some warnings, which unfortunately come from third-party code again, so we cannot do anything about them, but this is basically what you should see. Note that it does not return you to the terminal: like the `run server` command or `yarn start`, it keeps running in the foreground. This should be clearer in the documentation, because it lists the commands one after another as if they all happen in the same terminal, but that won't work. We had previously written that you add an ampersand at the end, so let me show you: if I kill Celery and run it with `&` at the end, it starts the Celery worker in the background and returns, and I have my terminal back. Celery is still running, but detached, so I can run anything in this terminal. However, this is not recommended: I can see the process ID of Celery and stop it with the kill command, but I cannot stop it with Ctrl-C, and I can easily forget about it. So the recommendation is simply to run these two commands in two different terminals. I'll make a note to change this in the documentation; "also integrate Socket.IO" is still written there, and I'll remove that too. Anyway, this is how you run Celery, and hopefully when you try it, this is how it will work. But say you did try it with the ampersand and it is now detached: how do you turn Celery off? You run the kill command with the PID that was printed when you started it with the ampersand, and Celery will shut down. So that's how you run Celery; hopefully that answers your question. Does anyone else have another question? I just read something Alvin wrote in the chat: nginx and Apache cannot work simultaneously on the same server.
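The ampersand-and-kill dance looks like this; `sleep` stands in here for the long-running Celery worker command:

```shell
# `&` starts the process in the background; the shell reports its PID.
sleep 30 &           # stand-in for the celery worker command
pid=$!               # $! holds the PID of the last background job
echo "worker PID: $pid"
# Ctrl-C no longer reaches the detached process; stop it by PID:
kill "$pid"
```

Running the worker in its own terminal avoids all of this, which is why that is the recommended setup.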
same server because basically both are services which bind on port 80 and 443 and only one can bind on 443 and 80 you can run either of them i recommend engine x it's much faster than apache it's not a process per request right and does you know you just even if you want to run just apache then you can just add the reverse proxy definition and proxy or your request to local host 8080 or wherever the unicorn processors are running because this workshop was about how to run it locally we just showed you how to run server using run server command if you want to deploy it in production as it is written in the command sorry the documentation that you want to use unicorn however if you're doing it running it in production then it's much better that you use docker compose so i'm just going to show you the documentation about docker compose and how easy it is to deploy it using docker compose so here it is written deployment with docker so what you have to do is to please scroll in the project seed into the project copy the command dot env again add the secret key this is extremely important this command is redundant but anyway you can do that you won't do anything and then you just do docker compose updash d and it will just do everything like you don't have to set up posters you don't have to set up redis you don't have to install the dependencies poetry this and that you don't have to install poetry installations anything it will we already build the image on every commit on the development branch and even on master branch whenever we make release we tag them and docker compose has those pre-built images so it will just download those images and run those and if you go to local os 8080 you know just see that so let's i'm showing i'll be showing you the docker compose file uh if you go here you will see that we use the eventy open event server image here and it will just run postgres release everything's pre-configured you don't have to uh you know run the create db 
command or anything else — it's extremely easy. It takes your settings from the .env file, and it also runs Celery automatically, so everything is extremely easy if you use Docker Compose. However, that is for deployment; if you want to develop locally, there's no Docker solution currently for that. We may have one in the future, but currently we recommend that you install Postgres, Redis, Celery, and things like that yourself. So that's it about the deployment question — does anyone have any other question? Yeah, so about the 8 GB RAM: I've seen people run the frontend with 4 GB of RAM as well. Thirty minutes is a bit too long — a person with 4 GB, I helped him run the frontend and it took him 10 minutes, so I don't think it should take 30 minutes. It also depends on how much memory is free for it; I think at least 2 GB of RAM should be free for it. And also, the yarn l10n:generate command should not take much more than a minute. It may happen that it has already finished but does not show you the prompt, so you assume it's still running — for that, I recommend you just press Enter and see if the prompt comes up; that helps as well. So I have this question: are there any plans to add a PWA feature to Eventyay to strengthen the offline capability? What we are going to do in the future is, we are evaluating whether we want to shift to React or Vue, so we will be developing a new site — it will first integrate with the existing site and then completely replace it; the details aren't final — but when we are building it, we will take into consideration that it should be very fast to load, and yes, as you said, a PWA will be there as well. However, offline is something we don't really know how to add — adding a local database would require a lot of effort. It could be made more friendly, though: for example, if a person is offline, then it will
show an app shell and say that you are offline, and in the future, yes, we can also think of keeping the loaded events in a local database so that users can see the events they have already loaded, or maybe adding a feature for downloading an event's data, so that if that data is in the local database, they can see it even when they are offline. But this is obviously just speculation for now, and we'll have to see what the priorities of the project are. Then, do we have any other question? Do we have any request about what to discuss next — do you want to discuss architecture, a specific feature, or see how a specific component of the project works? Okay, so then let's move on to the code of the project. I'll be showing you components in different parts of the project, so first let's open up the frontend. This is the frontend repository — let me collapse everything, and then let me explain the different parts of it. This is the root project folder, and we have app, config, and things like that. The most important part — all the source code of the project — is in the app folder, and then we have adapters, components, controllers, routes, models. In models we have the data, whatever we fetch from the API: for example, we have the event model, and the event model contains the event title, the attendees of the event, and other information like logo, description, things like that. Then there are feedback groups, video streams, video channels — a video channel has a name and a provider, like Jitsi or BigBlueButton — and then there are video stream moderators. This is all the data we load from the API. Then there are the routes. The starting point of the project I recommend is to visit router.js; there we have login, register, reset password, and these are the routes, so if someone accesses /login, it'll route the person there, and
then the code of the login route runs. This is how you navigate if you want to understand which particular component is used in which part of the project. Let's take an example: if you see the URL currently on your screen, you'll notice it's eventyay.com/e/identifier/video/workshop/299, something like that. So let's find out which part — which template, which JavaScript file — handles this particular route. If I go back to router.js, I start looking for where e is. There's public, but there's no e here — but there's this path option which says /e/event_id. This means it overrides the public route's path, and anything that loads /e/event_id is handled by the public route. So when you go to the event homepage, the public route loads it. As you can see, in the public route we have a model function which loads the event: it finds the event record with the event_id parameter, and it also loads the required information about the event, like social links, speaker call, tax, owner, organizers, and video streams. Then I look again for /e/event_id/video, but there's no video route either — there's this stream route, and then there's video/video_name, and then there's the stream id. So I go to this stream view route, and there's this model where we load the event from the public model, and then we find the stream: we have video-stream, and we find the particular video stream by its stream id, also including the channel and everything here. So this is how we load the video stream. But where's the template — where's the UI of the logic, sorry? If you look at stream view, there's this file in templates, and it loads this component — anything in templates is either used as the template of a route or the template of a component,
so I see that the view .ts or .js file uses this UI, this public stream video-stream component. So I go into app, then components, then public, then stream, then video-stream. This video-stream component is the logic, this is the UI: if it is loading, we show the loader; if we get an iframe, we show the iframe; if it is Jitsi, we use the jitsi-stream component. So now we are at Jitsi — let's move to this jitsi-stream component, and here it is shown how it is loaded: once this component is inserted, we call the Jitsi setup in the jitsi-stream component, and this is the logic — when setup Jitsi is called, we pass the channel URL and everything, we set up the information, we change the moderator password, and things like that. So this is the kind of top-down approach: you start at the root and follow the templates to find which component is used and where the logic is found. Another way of doing something like that is to use Ember Inspector. There's this extension — if you go to the Chrome Web Store or the Firefox add-ons site, you can install Ember Inspector — and then you can click Inspect, go to the Ember tab, click on a particular component, and say Inspect Ember Component. It will highlight a particular div or a particular card, so you can see which component it is, and you can see the arguments passed to it, the props passed to it, the functions passed to it, and things like that. So this is how you debug and navigate the project. Does anyone have any question for now — any project questions, or do you want any specific part of the project explained, any component, things like that? If you have any problem setting it up, you can ask that as well. If not, then Pratik, can you show people how to navigate the
issue board, how to find issues, things like that — what's the criteria for selecting issues currently? Okay, so I'll just share my screen — okay, can everyone see my screen? Yeah, okay. So first of all, this is the open-event-server main repository. If you want to contribute, the normal workflow is that first of all you will have to fork the repository in which you are planning to contribute — and these are general steps which you will follow in any open source repository you contribute to. The thing is that this is the main upstream repository, and you won't have the rights to push a new branch or anything on this repo, so forking is done so that you actually have a copy of that repo on your personal account. Forking also helps GitHub keep track that this repository is a fork of that repository, so if you push a new branch on your own repo, GitHub will automatically detect that there's a new branch, that maybe you want to make a PR, and it will show options here. So for example, with open-event-server, you can see that in my personal account there's an open-event-server repo, and it is telling me that it is two commits behind the 4.6 development branch — because that's the case, I haven't updated it yet. That is how GitHub keeps track of these things. So yeah, you will first do the local installation as Areeb just showed; after that, you will go to the Issues tab. Now, maybe while testing the system you've found a new issue — you can simply make a new issue here, and there's a template available already in which you can describe the problem you faced. After that, there are also labels: if, say, you are not so confident about your software development skills and you just want to try out the workflow and maybe contribute to the documentation, you can simply select docs, and here are all the issues that
are related to docs. There's also the good first issue label — you can find it in most open source repositories, and here as well these issues are tagged good first issue. Let's take this one, for example. First of all, see whether someone else has taken the issue or not — whether someone has said, okay, I'm working on this. In FOSSASIA we don't assign you issues; you simply come and say, okay, I will be working on this issue. But say someone took the issue but hasn't been active on it for a week or two — you can just ping them on GitHub itself, asking if they are still working on it, and if they don't reply, that means they are not working on it, and you can simply take that issue up and start working on it. There's also some discussion on this issue about a confusion the other person might be having, so you can do that as well: say you see an issue, you understand it, but you're not sure that what you have in mind is what it is asking for — you can just ping Areeb or me or whoever opened the issue and ask, "I think this is what it is saying, and I think this is the solution — am I going the correct way, and can I make a PR about it?" and we'll get back to you right away. So here's a PR related to this. For the PR, as I told you, first of all you will fork the repository, and after forking, you will clone it. Now, there are forks — so how do you make sure, whenever you are pushing a branch, say with git push, which repository you are trying to push to? Is it fossasia/open-event-server or is it username/open-event-server? Here come git remotes: git remotes are used to keep track of exactly this — which repository you are pointing at. The basic syntax of most git commands is git, secondly the action you are trying to do — let it be pull, push, fetch, anything — and third is the remote name. So I can set up multiple
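The fork-and-remotes setup being described here can be sketched as follows; the branch name is just an example from the PR shown on screen:

```shell
# Clone YOUR fork (git names it 'origin'), then add the main repo as 'upstream'
git clone https://github.com/<your-username>/open-event-server.git
cd open-event-server
git remote add upstream https://github.com/fossasia/open-event-server.git
git remote -v                    # origin -> your fork, upstream -> fossasia

git checkout -b fix-issue-7161   # do your work on a topic branch
git push origin fix-issue-7161   # push to your fork, then open the PR on GitHub
```

Pushing to `upstream` would be rejected for lack of rights, which is exactly why the push goes to `origin`.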
remotes. For example, the repository from which you clone automatically gets the remote name origin, and it's more of a convention that the main upstream repository should be named upstream. So you will simply run git remote add upstream with the open-event-server URL ending in .git, and that sets this repository as upstream. Say now you want to make a PR like this person — there's a branch fix-issue-7161 — and what he or she must have done is push it. Now, if they push to the upstream repo, it will simply throw an error that they don't have the appropriate rights, so what he or she must have done is run the command git push origin with the branch name, and that pushes it to the particular repo which he or she forked. GitHub, being intelligent about it, detects this and shows you an option here: you have pushed a new branch, maybe you want to make a PR about it — so you can simply make a PR from here. Okay, so thank you, Pratik. I hope everyone has now seen how to navigate issues, how to find issues, and how to contribute on GitHub as well. Does anyone have any question about this, or any request for what we should do next? We also have other stuff we can go through, but do you have any questions right now? As Mario said, we have a lot of people from India, so we can also take questions in Hindi if you are not comfortable in other languages. Okay, so then let me show you a couple of things as well. There's this issue where people don't understand what we mean when we say that we want something to be translatable. What we have is that the project supports multiple languages — you can see the language switcher at the bottom of the site. If I come here, you can see that it's English by default, but there are a lot of languages, like Bangla, French, and
Vietnamese, Spanish, and German. So we support multiple languages, and how it works is that we use gettext, which is the open source, kind of de facto standard for extracting translatable strings. We have this translations folder, and we have messages.pot. What does messages.pot do? It collects all the translatable strings from the project and lists where each string is present, and this is used by software like Weblate to show the translators where a message appears, so that they understand the context and can see where a particular string is used. Then there are different files for different languages — for example, this is the Hindi file, and it has the translations of these strings; we have French, Japanese, Korean, Polish, and everything like that. So when we say we want something to be translatable, it does not mean that we want you to add something to these files, because these are automatically generated. If I go to package.json, I can see that there are three commands: l10n:extract, l10n:update, and l10n:generate. We all ran l10n:generate while installing the frontend, and what it does is automatically create JSON files from these translations — in the frontend we do not consume the PO files directly, we consume these JSON files for translations. If I come here to assets/locales, you can see the JSON files which were generated by the generate command. So if, for example, someone selects Bengali as their language, this JSON file will be downloaded, and wherever "Manage Events" is written across the entire site, it will be replaced by this phrase. That's why we ran the generate command — otherwise the project would not have worked. So, for example, how do you make something translatable? Let's say I am in this component, event-card, and here there is no text
in here — okay, let me go to the side panel instead. Here you can see that there are multiple texts written. Let's say I add a new text: I add a div item and I write "Support". Now it is not automatically translated — even if there is a matching translatable string in the PO files which already has translations for different languages, it won't be translated. To make it translatable, we wrap it in the t helper. Okay, so now it is translatable — but only if it already has translations. If it does not have translations, I have to wrap it in t and then run this particular command, yarn l10n:extract. That will update the POT file and add this string there, and then I have to run l10n:update, which will update the PO files for all the languages. Once I've done that, I'll push the code, and Weblate and other translation projects will pick it up, show that there's a new string "Support" to be translated, and thus make it translatable. Now assume I had to do the same thing in a JS file. Let's say I go to add-to-calendar, and let's say I have this "is organizer" message — this is not translatable, because it is a normal string. Let's say I want to make it translatable: I need to wrap this as well, but I cannot wrap it in t, because there's no t function here — instead there's this l10n service. So I'll write this.l10n.t and pass in the string. By default there's no l10n service here, so I'll inject the service l10n. Now if I run the commands I showed you — l10n:extract and l10n:update — it will automatically be added to the POT file and generated for translation. So whenever you see an issue which says "make this thing translatable", you have to wrap it either in the t helper in the .hbs file or like this in JS. This is what I wanted to share about how to make something translatable in the project. Does anyone have any question?
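As a recap, the translation workflow just described looks roughly like this; the exact yarn script names are assumptions based on the package.json naming pattern mentioned above:

```shell
# 1. Wrap the string in the template:      {{t "Support"}}
#    or, in a JS file, after injecting the l10n service:
#    this.l10n.t('You are an organizer')
# 2. Then regenerate the translation catalogs (script names assumed):
yarn l10n:extract    # scan the source and add new strings to messages.pot
yarn l10n:update     # refresh the per-language .po files from the .pot
yarn l10n:generate   # emit the JSON files the frontend actually loads
```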
So — Russian Cyrillic: we don't have support for that currently, but if you want to contribute translations for it, you can definitely do that. For that you will need to create a new file for the translations, then run both commands, extract and generate, and then you can use Weblate for adding the translations. Does anyone have any other question, any request for what we should discuss next? You can ask your questions in Hindi too, don't forget — can anyone try to ask a question in Hindi? Then I don't know what else to discuss, so if anyone has any other suggestion we can take that up; otherwise we'll be wrapping up the workshop in a bit, so now is the last chance for any other question. And just to make sure you know: we also have one more workshop at our FOSSASIA Summit, which will also be given by Areeb himself, so that's another chance. "How is Sentry working in this project?" Yeah, so Sentry is the error reporting platform we use, and what we do for Sentry is different in the two projects, so I'll share my screen and show how it works in both. Here it's the frontend: first we go to package.json, and you can see we have added three dependencies for Sentry — @sentry/browser, @sentry/integrations, and @sentry/tracing. Then we have this file, sentry.ts, where we have imported Sentry, and we first check that the Sentry DSN does not include "dummy". Why is that? Let me first close the existing files. We have config — sorry, environment — and this is the environment we use to configure the project, like the app name and other stuff. Here there are Sentry configuration options, and this is the default DSN: we have added this dummy DSN here. So first we check that there's no dummy string in the DSN, and if there isn't, we initialize Sentry with integrations like Ember, console capture, and browser tracing. We have also added some
error checks to not send some very common errors like 404, and then we have also added some extra enrichment in Sentry to attach information about the JSON: if, for example, you're creating an order and the order creation fails, whatever JSON we are getting from the server is also sent in the Sentry event. So this is how we add Sentry in the frontend. Then there's the server, so I'll show that as well. There's config.py, and we have the Sentry DSN here, the Sentry release name, and the Sentry traces sample rate; this is enabled here. Otherwise, if I search for Sentry, this is where we import the Sentry integrations for Celery, Flask, Redis, and SQLAlchemy, and here we configure sentry_sdk.init: we install the integrations for Flask, Redis, Celery, and SQLAlchemy, and also set the release name and the traces sample rate for performance monitoring. So this is how we have configured Sentry in the project. I can also show you how the errors are reported in Sentry: we have our own hosted version of Sentry for FOSSASIA projects, and here you can see, whenever there's an error, what the error is. This is the interface where we can monitor how many errors are being reported; we can see which user experienced an error, which browser the user was using, and how many users it has affected. So this is generally how we do it — I hope that answered your question. Now, do you have any other doubts, any other questions? I guess for the moment there are no more doubts. Okay — "we would also like to know about open-event-wsgen". Yeah, so wsgen is a project for generating a website. For example, if you have an event on Eventyay and you don't want authentication, ticket management, or anything like that, but just want a static site listing all the speakers and all the sessions, and you want it to be fast. For example, if you go to Eventyay and you go to the sessions section, we load only a few sessions at a time to not make it
complicated, so whenever you scroll down it takes some time loading more sessions — but we can also generate a static site, so that every session and every speaker is pre-generated in the HTML, and then we just show it on the page, and you can filter and favorite sessions on that site as well. It is written in Node.js, and whenever a request comes in, we start a generation: we load the speakers, sessions, and event information from the API, pass them to the pre-written templates, and it generates a folder with several HTML and CSS files. We zip that folder, and you can download that zip, or you can also deploy it on the site itself to preview it. This is basically how you generate a static site using wsgen. Previously we had Dhruv Jain, who is now also working on Rocket.Chat, who solved a lot of issues in wsgen and made it possible for us to run modern APIs on the platform. So I hope that answers your question. Yeah, so if no one else has any question, then we are reaching the end — now is the last chance for any question. I guess we don't have any more questions. Are there any questions, folks? Please comment, or else we can move forward with our photo session. Okay — sorry if I pronounce your name wrong — someone is thanking us for the kind presentation; thank you, Pet, for your appreciation. So now, what do you think, folks, should we move forward with our photo shoot? I'll post it on my Twitter and you can also retweet it. Come on, show your beautiful faces, we all want to see you: Elvin, Gaurav, Svi, Che, Lessivong, Pet, Zihwang, Mario, Alexandria. Yes, we have four photos now — some more, come on. Yeah, five, yes. Can we have all faces, since this is the last workshop of day two? Leslie too, okay. After I ask people to switch on the camera, people are leaving — no problem. So I guess we take a screenshot with this many people. Yeah, so everyone can just wave their hands like that, and
then we can take a screenshot. Thank you, everyone. For closing words I defer to Shubham — thank you everyone for joining the session, Shubham will now take it from here. Sure, thank you, Areeb, for sharing the stage. So, day two was also really good: we got to hear so many good talks, so many lightning talks — today was the first day of lightning talks — and we attended good workshops on PSLab and got an introduction to the Pocket Science Lab device too, which was really good. So I hope you folks have enjoyed today's sessions, and we have many more things to learn tomorrow — who knows, if there's something you've eagerly wanted to learn for many days and the same topic shows up tomorrow, you'll be like, wow, I wanted to learn this for so long and an expert speaker in that particular area is giving a talk on it — so, opportunity. Yeah, let's take one more screenshot, because we have one more new face. So, Pratik, Areeb, do you want to say something? Just thank you everyone for joining, hope you learned something, and please stay for the rest of the week — hopefully when we have the next session we'll plan something more interesting, an expansion of this session. So thank you, everyone. Yeah, thanks everyone for joining, be healthy, take breaks between the sessions, and thank you for being such a good audience — see you all tomorrow at day three.