All right, we're starting up with the November SHARE community tech call. We've got a few folks on so far and are waiting for a few others to join. Let's see if anyone else joins in the next minute and then we'll get going. The link to the agenda is there if you don't have it yet. I was just adding a bunch of things to the agenda at the last minute, so I may just go ahead and share my screen here.

Okay, it's ticked over one minute, so let's get going. Is there anyone we can have take notes on the call? I think Harsh did it last time, so maybe someone other than Harsh.

I can do it.

Great, thank you. I think it's set so that anyone has suggestion rights, so just go ahead and type in, and I'll confirm everything at the end. Also, please go ahead and add yourself to the list of attendees if you haven't yet.

Okay. I think we have some newcomers on this call, at least the Center for Open Science folks. You're not new to SHARE, but you're still relatively new to this call, right?

That's true, I've lurked on one or two before. To introduce myself: I'm Abram Booth. I was on the SHARE team when we built SHARE v2 here at COS, and I'm now the maintainer of the SHARE instance we have running, so I'm familiar with a lot of the issues that SHARE faces. I'm looking forward to seeing where it goes from here.

Great, great. And for the recording, I'm Rick Johnson from Notre Dame. Let's keep rolling on through. Any other newcomers?

Okay. Looking at the action items from the last call, scrolling down: I think, Cam, you had the first two. Is Cam on today? I'm going to send him a note to see where he's at. And Harsh, I know you did take a look at GitHub; we've got something on the list to show for that. I have started pushing my updated flow code to GitHub as well.
So we'll show that. And of course the agenda was created, and the issue was created for the open-door stuff. Any questions on any of this?

I would like to demo one of the flows I've worked on.

Great, perfect. So I think the only thing we had before that was to mention that we were selected to present at CNI on December 10th. Myself, Ryan Mason, and Cam Blandford are all going to present. I think our presentation time is 5:45 p.m., the last time slot of the first day. We have a rough skeleton in place for the presentation and are still working through it, but the plan is to go through a lot of the strategy around the community effort: community development, building up more shared governance, working toward community-shared infrastructure, things of that nature, and then to demo a little bit of the early technology at the end. It's a 30-minute presentation, so we'll probably be moving through all of that pretty quickly, because we'll want to leave at least five or ten minutes at the end for questions. Let's see if there's anything on that.

I'll stop sharing and hand it over to Harsh, if you're able to demo the pushing piece from GitHub.

Yeah, sure thing. Can you see my screen now?

Yes, perfect.

So I was looking at how we ingest these flows, over here in the share-red-flows directory, into share-red. Typically, when you just start Node-RED (let me make the font bigger), when you just run `node-red`, it loads the default settings file, which is this one. Right now it doesn't have projects enabled, so it says projects are disabled. There's a specific setting you can go and change: if I make this `true` in the settings file, you're enabling projects.

If you start Node-RED again, it reads the same settings file, and now it knows you've enabled projects but there are no active projects. So when you go to the UI, the first thing it does is ask which project you want to use, because you've enabled the feature. You can either create a project, or clone a project or repository. In this case we want to clone a repository, so it asks you to set up your version control, which here is my GitHub username and email. Then you give it a project name. Let's call it SRF. Actually, I already have one of those, so I'll call it SRF1. You give it the git repository URL, the username and password for your GitHub profile, and an encryption key. The key can be anything, but you want to store it somewhere you can access later, so let's just enter something.

When you clone the project, it pulls it down over here. So now this file has been loaded from a GitHub repository. You can tell that by clicking this button over here, and you can look at the commit history: these are all the commits that were made to that repository earlier. There are a couple of ways you can work on this flow to push and pull from the repository itself. I can add something, say we connect it here, and then hit deploy. It says you have a local change, and you can add that as a commit and then push it to the repository. I haven't figured out how to do branching from the UI yet.

Alternatively, what I've been thinking about is, instead of using the UI: when you clone the project, it creates a projects folder, or if it already has a projects folder it adds the project in there. In this case... okay, I see what's happening.
So I don't see the projects folder over here right now, and there's a specific reason: I'm using a different settings file. Let me backtrack a little. Earlier I showed you how to change the settings file. Instead of using the default settings file, which is in the `.node-red` folder, it would be a little more convenient and more explicit to use the settings file in the share-red repository, that is, `share-red/settings.js`. You can tell Node-RED to use that file with the `--userDir` option, pointing at the folder that contains your settings file, which in this case is the current folder. If I do that, it picks up the settings file I have in my local folder, which is right here.

Now if I ingest... it already has one, really, because I was playing with this earlier, but let me repeat the same thing. This isn't going to work right now; if I go back to my console, this is an unrelated issue, so I need not worry about it. I can go to Projects, say New Project, clone the repository, and do the same process again. What happens is that it creates this folder over here, under the projects folder, and that's the cloned repository right there.

So now I have a way to go into the project folder for the one I just cloned, and that folder is version controlled, so I can run all the git commands over here. The project folder is a subfolder within your share-red checkout, but it's a separate local repository: when you commit this file, it doesn't go to the share-red project, it goes to the share-red-flows project, because it's got its own git configuration over here.

Does anybody have any questions?

Great.
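For reference, the projects toggle described above lives under `editorTheme` in Node-RED's `settings.js` (the projects feature shipped with Node-RED 0.18); a minimal sketch of the relevant fragment:

```javascript
// settings.js — enable the Node-RED projects feature so the editor
// offers to create or clone a version-controlled project on startup.
module.exports = {
    // ...other settings...
    editorTheme: {
        projects: {
            enabled: true
        }
    }
};
```

Starting Node-RED with `node-red --userDir .` from the share-red checkout, as in the demo, makes it read this local `settings.js` instead of the one in `~/.node-red`.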
So what I'll actually do is a quick write-up on this in the GitHub issue. I know Rick closed it this morning, but it would help folks to reference it later, and even for myself from a learning standpoint: yes, this is the way I figured it out, and maybe we need to change it later, but it adds some history. So I'll add some notes to the GitHub issue related to this work, and we can reference that later.

I'll add it to the meeting notes afterwards this week. Hi, Harsh. This is really good stuff. I didn't know that Node-RED had this capability built in, so it's exciting to see. From a security standpoint, though: I don't think the server sets up SSL certs, so is it then passing your GitHub password locally, unencrypted? Do we need to make it easy to create an SSL cert for the local server?

I don't know. I didn't really understand the question clearly. Is it about whether the flow file sitting on the local machine is encrypted or unencrypted?

That one is encrypted, I know; that's what it's doing with the credentials. And maybe it's not even that big of a risk, but you have to submit your password there, and I'm guessing it posts the password to that back-end server. So I suppose if someone had some way to monitor your local machine, they'd be able to grab it, potentially.

Oh, I see what you're saying.

Yeah. Maybe we need to make it easy to create a cert and install it in the local server, so that no credentials go over unencrypted channels. It's probably a fairly low-risk issue, but I think it could be an issue nonetheless.

Mm-hmm, something to think about. And you're referring specifically to the GitHub credentials, correct?

In the case that I saw, yeah; that was the credential you passed across.

Yeah, I see what you're saying, because it does maintain that connection from Node-RED to GitHub once you've given it those credentials.

Well, that part I expect goes over HTTPS to GitHub's servers. It's the part where, from the interface, you hit the back end of your own server. Again, it's probably fairly low risk, but just something we should think about.

Sure. I can post a question on the Node-RED Slack channel and see what other people are doing about it, and whether it's a concern at all.

Okay, we're good. Thanks, Harsh. Great demo.

Yeah, thank you, Harsh. And you probably saw, on that first screen that Harsh had, there were a couple of unknown nodes. That's actually something I discovered just before the call and was trying to fix: there are a couple of package dependencies that are not in the main `package.json` file in the share-red repo on GitHub yet, and I had some merge conflicts when I was trying to push the fix up. I ran out of time, essentially, but I'm going to work on that after the call to clean it up.

Okay, thanks, Rick. That saves me the energy of figuring out what's going on.

Sure, sure. I can actually show you what it looks like with that stuff turned on, if there isn't anything else on the GitHub topic before we move to other demos. Let me share again. And Abhinav, I think you said you had a demo as well? You can go after me.

Okay, so hopefully you can see my screen again.

Yes.
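On the SSL question above: Node-RED's `settings.js` does support serving the editor and admin API over HTTPS. A minimal sketch, with illustrative key and certificate file paths:

```javascript
// settings.js — serve the Node-RED editor/admin API over HTTPS so that
// credentials posted from the browser to the local server are encrypted.
// The key/cert paths are illustrative; generate them with e.g. openssl.
const fs = require("fs");

module.exports = {
    // ...other settings...
    https: {
        key: fs.readFileSync("privatekey.pem"),
        cert: fs.readFileSync("certificate.pem")
    },
    // redirect plain-HTTP requests to the HTTPS listener
    requireHttps: true
};
```

With `requireHttps: true`, Node-RED redirects plain-HTTP requests to the HTTPS listener, so credentials sent from the interface to the local back end no longer travel unencrypted.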
All right. So the two nodes that were missing are these splitter ones and the wait-paths node. Those are two npm modules that it has to load, and I think if you reference them in the `package.json` it will go ahead and bring them in when you do an `npm install`. Is that right? I'm trying to remember all the steps you have to do to have it reference the dependencies, but anyhow, I'll get those up there.

Okay. So, essentially, the work I did since last time. A quick recap of what's happening here: this is making a call to Crossref. It first takes in a config file, which is... blocking me. Thank you. Wait... there we go. I think I have the config file open here. Yes. It takes in this config file, which sets the URL that the REST calls to Crossref are sent to, plus a mapping of the Crossref attribute for each field we want: creator, given name, family name, title, etc.

That is then loaded into memory here. It's currently set in the global config, and that's something I want to move to the flow config, because the global obviously won't work well unless we're running just this one flow. But essentially it takes the URL from the global config and sends the request here. Then it takes the array of items and splits it into separate items, so that it runs the rest of these flows once for each item in the array coming from here. That's what the splitter is doing.

The thing I added this time was joining the different parsing operations for each item. It says: I want to get my creators, I want to get my titles. When it does that, they come back in separate objects, and what this wait-paths node is doing is merging those into one object and then pushing it through. So let me run this again.

Let me clear out the output here. I also got rid of a bunch of errors that were happening with the code in there, some type errors. So here's what's happening, looking at one example: the payload here has my creators and it has my titles. For the moment I was pushing the input work all together into the same object here; ideally that gets pulled out into a separate place. I came about halfway, in that I have it separately here, but I didn't quite get it pulled out of this section of the flow yet. Essentially the idea is that it takes all of the metadata from here and starts mapping it to new objects and properties over here. What's done so far is creators and titles. Before, they were coming back as separate objects, and now it's all in one, so that was an important step.

So now, really, all that's necessary to add other properties is to add new items to this list of flows: grab the property, push it to an object, and then set up a similar step where you say "I want the titles, creators, etc." This wait-paths node takes those in, and what it does is pretty cool: it actually waits for all of those branches to finish before it moves on to the next step. It's a node module that I found; that's one of the great things about using Node-RED, there are lots of modules like this that already exist. The timeouts are configurable, so if you don't want things to wait on those paths you can set them up however you want. Then right here I'm just grabbing the work, grabbing the paths, and pushing them into the main output object, because by default it's down here.
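As an aside, the join-then-emit behaviour described here can be pictured in plain JavaScript. This is a rough sketch of what a wait-style merge does, with hypothetical branch and field names; it is not the wait-paths node's actual source:

```javascript
// Rough sketch of the merge a wait-style node performs: collect the
// partial payloads from each parsing branch, then combine them into a
// single record once every expected branch has reported.
// Branch/field names (creators, titles) are illustrative.
function makeMerger(expectedBranches) {
    const parts = {};
    return function onBranchResult(branchName, payload) {
        parts[branchName] = payload;
        // Only emit once all expected branches have arrived
        if (expectedBranches.every((b) => b in parts)) {
            return Object.assign({}, ...expectedBranches.map((b) => parts[b]));
        }
        return null; // still waiting on other branches
    };
}
```

Each parsing branch delivers its own object; the merger holds them until the set is complete, which is why the flow can safely fan out to independent creator/title parsers running in parallel.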
So I just went ahead and pushed it there for now. Any questions on this stuff?

I was very interested in the wait logic. For the blue node, do you want to set its timeout based on the slowest branch preceding it? If we know that, say, the first yellow block is going to take the most time, you'd want to set the wait-paths timeout higher than that block's. And I wonder, can you set waits on those individual yellow nodes, to wait a certain time before sending?

I haven't played much with that side; that's something for me to explore too, but yeah, I would assume you could. When I was looking for how to do this, there were lots of different ways people were doing it, writing more code themselves. There's a way to do it with a join node as well, but that looked more intensive in terms of setting it up, so I thought I'd try this first. I haven't looked at the code behind this wait-paths node either; that's all wrapped up within the packaging for the node itself.

But that's a good segue: I think this flow in particular is ready for someone to collaborate on. If their environment is set up right, someone could add other properties to this while other people work in parallel, because they could write new code to parse other properties. All of this is self-contained, right? So it could be worked on independently of me continuing to work on this part. I think this is in a good place to start to branch off a little bit.

Nice.

And I started creating some issues in GitHub to list some of the different tasks for that. For example, changing this to push to a flow context instead of the global context, which I think is probably where it belongs, since just this particular flow is where those properties would live. Things like that.

I would love to take one of those issues and work through it, Rick.

Great, great. And I was actually just pushing all of these up with the GitHub integration. There are a few quirks to it. If you make a change, say I just add something real quick, it now detects that I made a change here, and I can look at the change that happened. This is actually really nice, the way it does it: it shows me exactly everything that's new here. Then all you have to do is push this down and do the commit, and it will take the commit message. Then there's an additional step that isn't necessarily obvious. Sorry, my Zoom window is continually in the way... there we go. This little button here shows a one next to the up arrow, and you just have to push that up. Myself, I have the luxury of being a committer on the repo, so I can push straight to it, but not everyone has that. In other cases you'll likely need to fork from the main repo, push up to your fork, and then, once you've pushed your changes up, submit a pull request, things like that.

So Rick, just a quick thing I noticed: when you added a new node, you didn't have to do a deploy for the change to show up in local changes. Did you?

Oh, I didn't see that.

Try taking out the node you added and adding it back again, and then refresh the local changes. I want to see... when I deployed, that change went away, I guess.

Yeah, now let's try adding a node again. Okay... so, yeah, it's not showing. I don't know what this file is, by the way; I haven't looked into that yet.

Didn't you hit the refresh button, Rick, when you added the node last time?

Did I?

The refresh button, right next to where it says local changes.

Oh, this one? Yeah. I'm not sure what happened before, but that seems to be what was necessary just now, because it didn't show up until I clicked deploy. I had to actually hit deploy changes, and then it showed up as "hey, there's a change in your local flow file."

Yeah, now it's here. And if you deploy again, that'll go away.

Yeah. Essentially, the other tricky thing that was obvious between Harsh's demo and mine: the two nodes that are missing in his setup are present in mine, and those settings are split. The settings are in the other repo, the share-red repo, and this harvest flow is in its own repo. So one of the things we'll probably need to work through is how best to keep those in sync, or whether we should change that repo hierarchy, etc. Something to think about.

If there are no questions on this, I'll hand it over. Abhinav, you said you wanted to demo as well?

Yeah. Can you guys see my screen?
Yes.

So I was working on a prototype of what the entire pipeline for harvesting will look like. I'm using PubMed Central for this, through its OAI-PMH endpoint. The flow is: you get a list of records from OAI-PMH, parse them, and then you can put them in MongoDB and also in Elasticsearch. There are also some extras: if there's an error in parsing, you put the entire document into Elasticsearch so that you can parse it again later and see all those errors. Then you can use MongoDB as the source of truth for all the data that you parse.

That's what I was trying. We're also thinking of building a feed kind of thing on Elasticsearch, so that you can register your queries and see how a feed flow would look. Suppose someone wants the records for a particular institution; he can say, "I want records for Virginia Tech," and once such a record gets ingested, he'll be notified. So that was the flow I was working on, and I got as far as putting data into the database.
I'm yet to work on the feed part. There were a couple of good things, and some pain points, that I realized while working on this.

One of them: these OAI-PMH endpoints work on the basis of something called a resumption token. If the number of records is too high, they won't give you all the records; they give them to you in batches of, say, 25, along with a resumption token which you can use to query the next batch of articles. So if you want to continue your harvesting for a long period of time and harvest everything, you need to set up some way of taking the resumption token and changing your URL again and again.

That was one part. Then error handling was one new thing I discovered: you can have catch nodes, and you can specify which nodes you want to catch errors from. In this case I'm catching all the parsing errors and putting them into Elasticsearch; then there's a DB error, which happens when I'm inserting, and that goes in there too. These error ingestions also ingest all the raw data, so that whatever the error was, you can parse the record again and regain that data. It won't be lost.

This is the node which puts the parsed data into ES, and this is the node which puts it into MongoDB. Then there's a node for MongoDB where I'm storing the state of harvesting, meaning the last time range in which I harvested. I read this initially to get the start and end times of my harvest, then update it afterwards with the range I just harvested, so that next time the flow starts it reads it again and continues. And this is the resumption-token file, which gets updated every batch so that I can create the URL appropriately. So this URL either carries a resumption token, or it's the URL built from a time range, since PubMed supports from and until parameters.
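That request-building logic can be sketched as a small helper. The `verb`, `from`, `until`, `metadataPrefix`, and `resumptionToken` parameter names come from the OAI-PMH protocol; the base URL and the `oai_dc` prefix here are illustrative choices, not necessarily what the flow uses:

```javascript
// Build the next OAI-PMH ListRecords request. The first call in a
// harvest uses from/until date bounds; every later call in the batch
// sequence must carry ONLY the resumptionToken from the last response.
function nextListRecordsUrl(baseUrl, { from, until, resumptionToken }) {
    const url = new URL(baseUrl);
    url.searchParams.set("verb", "ListRecords");
    if (resumptionToken) {
        url.searchParams.set("resumptionToken", resumptionToken);
    } else {
        url.searchParams.set("metadataPrefix", "oai_dc");
        url.searchParams.set("from", from);   // e.g. "2006-01-01"
        url.searchParams.set("until", until); // e.g. "2007-01-01"
    }
    return url.toString();
}
```

A looping flow then just feeds each response's token back into this builder until the endpoint stops returning one, which signals the end of the batch sequence.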
So if you want to list records from a given time range, you can use that; that's what I'm using here. This switch first checks the database: initially the resumption token will be empty, so there won't be anything there, and it goes to the database, gets the time ranges, and creates a URL. And then this is where I'm doing the mapping part.

One more thing I realized is that for all these big repositories, the schema of the data keeps changing. If you parse their OAI-PMH results for, say, 2004 versus 2018, you'll see the schema file has changed. Because the schema keeps changing, this mapping will also depend on the schema that a particular repository is using. For the purposes of this demo, this is the schema file.

And the schema file that you need, does that change depending on which set of records you get?

Yes. With every record you get, you also get the schema location, like this. You can use it to determine whether your flow will work for this particular record or not. I noticed that my flow works for this kind of schema, but there's one more schema that keeps coming in between, with some other name, and my flow fails for that. For those, all the records go into the ES error index, so I don't lose the data; that's why I had to work on the error part. But still, you'll need to maintain multiple mappings for the same source. That's what I realized, and it's one of the challenges. So I can demo this part.

One question I had: I see that there are those persistent connections, the get-last-date and the PMC DB nodes...

Sorry?

They're showing as connected without you running anything, essentially.

Yeah, my database is running locally, so it's connected.
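That per-record schema check can be sketched as a small guard. The schema identifiers below are purely illustrative; in practice the value would come from each record's declared schema location:

```javascript
// Route a harvested record: parse it if its declared schema is one the
// flow's mappings support, otherwise send it to the error index so the
// raw data is kept for a later re-parse. Schema names are hypothetical.
const SUPPORTED_SCHEMAS = new Set([
    "pmc-frontmatter-v2" // hypothetical identifier
]);

function routeRecord(record) {
    return SUPPORTED_SCHEMAS.has(record.schemaLocation)
        ? "parse"
        : "error-index";
}
```

Adding support for the second schema variant mentioned above would then amount to registering its identifier plus a matching set of field mappings.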
That's what the indicator is showing, that I can connect to the database. If I shut down my database, it will show as not connected, and then there's no point running your flow.

Regarding the parsing approach you mentioned: I had to create different functions for each field. It could be done two ways. Either you read the entire schema file and come up with a function that can handle every combination the schema allows, or you look at the data and figure out what kind of parsers you want to write. In the case of XML data, it's easier to automate, because you know what kind of schema can come; it's predefined and standardized.

So I'm fetching quite a lot of things. Article IDs, for example; I'll show one of the examples. These are the parsing nodes: all of these are individual functions, each of which parses one particular attribute of the record.

Right now, if you look at my harvesting state, which is stored in MongoDB, it has the start time, the last start time, the last end time, and the end time. We're only concerned with these two right now. This tells me that last time I started here and ended here, so when I run the flow, I'll start from this date and harvest for one year: my start date will be this, and the end will be 2007-01-01.

That may be in a window we can't see; it's possible that when you set up the sharing, it was just the browser.

No worries, just wanted to warn you. Can you see it?

Oh yes, you can see it now. So this is the harvesting state stored in MongoDB; it stores the last start and last end dates of the harvest. When I start my flow again, it will start from this end date and run for one year, so it will fetch records from 2006 to 2007. And right now, if you look, there are no records in the metadata collection. So, if I start here... can you see the browser?
No, now it's just the command line.

Can you see it now?

Not yet... there we go.

So I ran the harvesting, and these are the records that come in. This raw field has the entire record as a JSON string, so that if there's something I'm not parsing now but might need later, I don't have to re-harvest; I store the raw data here and can parse it later. But there are some fields I think will definitely be required, so I'm parsing those: the identifier, the schema location, the article type, and the journal information.

The article IDs in PubMed are pretty comprehensive; they give you all types of article IDs, and this is what the article-ID array looks like. Then there's the article title. For authors, I have the whole list; each author also has a type attribute, saying whether they were a contributor or an author. And there's a very nice link between authors and affiliations: each author has an affiliation ID which points to one of the affiliations, so you can determine whether an author belongs to Virginia Tech or not, that kind of mapping. I'm storing the ID here, and if you look at the affiliations, affiliation one points to Arizona State University, then there's affiliation two, and so on. So that's a good thing.

I've also extracted the abstract into one big paragraph, which is helpful, and we also have keywords; we're thinking a topic-modeling kind of application might need all these things. And if someone needs the full text, they can go and parse the raw data and fetch the full text from there. So this is what it looks like.

If you look at the screen now, we have 25 records, and if you look at the resumption-token file, it has one resumption token.
So if I go and Do my flow again this resumption token will be used so Yeah, so I can so I've actually tried this harvesting like a in in terms of a repeat interval So I can do it every five seconds and So and I can run it through the night and it will harvest all the records. So if I do it right now Yeah, so So this is harvesting every five seconds. So if If you see the Yeah, see the terminal and so with it re harvesting are you kind of Blowing away the data and recreating every time or are you or is there They're checking for Like existing records. Uh, so because uh, so I'm I'm assuming uh, the OAP image interface of PubMed will handle that part because they are only giving me all the batches But we can easily put a index on say identifier and then it will update those So if I so we can check that so suppose we want to So metadata count is 192. Uh, so it might have to now if we see how many So it's essentially assuming that from the last harvest you still have those records and it's giving you just new ones. Yeah. Yeah, okay I can also show the the ES endpoint, uh, which Currently so, uh, it will create two indexes one error index and one the actual index so So if you go here In the dashboard Yeah, so these are the records which are there in elastic search. So that's chat So we can create a search query on these like Suppose I want to search for this affiliation. So So I'll only get that affiliation Yeah So all these things so so we can essentially run the parser for this now And that's a sorry feed parser, which will keep on interested. So this was the demo That's great. Yeah, and then like so so if we did want to um No, I mean, this is really really fantastic the so if we so if we did want to reset the like on the The oi pmh endpoint side saying hey, we actually need all records now Uh, do you know how how we would do that? 
So so they don't give also you have endpoints for fetching all the records, but they don't give you one all at once They'll give you in batches of 25. So that's why you need that assumption token. They have this Like if the request of more than 25, they don't need to do that 25 and then give you a token Which you can query again to get the next 20. Oh, it's that result from talking. Got it. Got it. Got it. Okay So that's why you need to run this like an interval of some time. Sure. Sure. Sure. Sure I'm maybe asking questions that other folks already knew the answer to I really like the the error handling piece. I didn't know you could do that And and just having it on a separate workflow to catch the same error on different nodes. So that's pretty neat Yeah, that's really nice I had a quick question about uh, how did you go around like making those connections to elastic search and MongoDB database Did are those like inbuilt? connection nodes It's a stdp request node. So elastic search gives you a stdp endpoint So suppose if you want to put some data, this is this. This is the index name and this is the end point So whatever comes in message dot payload gets ingested So you just need to make sure whatever you are passing here is the actual thing which you want to ingest Cool. It's just a stdp endpoint Sounds like no questions from anyone else. Is that right? So rick, the issues that you've created on uh, uh, share, uh, github profile, uh, github, uh repository Is it like, uh, I can just go pick one which makes the most sense and start working through it Yeah, I think yeah, that that sounds good. Let me Share my screen again and I can show that show those for everyone Yeah, so the So I think Before saying that real quick, I would say I mean if I think yours is a great example of one we would want to push up to the sharehead flows When when you're ready for that. So definitely please take a look at that And at this point It doesn't have to be totally functional. 
It can be error-prone and buggy; we just want to start getting things pushed up. But yeah, so, pulling up the tickets... I had them open here, didn't I? Yeah, so here's the thing. I'm not sure this is necessarily the long-term best way to do it, but I went ahead and created a bucket milestone called "CrossRef flow" in here, just to start to sort some of these. We could certainly do it with labels as well, but I created these as kind of the ones that were immediate on my list looking at the flow I was working on.

So I need to fix this one first: the package.json is not up to date. Ignore that second one for a second. There I talked about pushing params to flow context and flow variables. "Add mapping to new fields" is one that could probably be broken out into a bunch of different fields, looking at what's coming back from CrossRef. And then this one: one of the things we're assuming here is that as the objects are coming back and we're building the JSON objects, to ensure they actually conform to the SHARE schema, we're going to have a validation step against SHARE. So that's the idea there. I said that correctly, right, Ryan, Cam? Yeah, you did. Okay, good.

So I would say, Harsh, any of these... I'll do the top one, the package.json, but the bottom two especially, either of those are definitely fair game. And like I said, if you wanted to tag the specific fields you're going to look at, you could create new issues for that. Sure.

I've also used an external npm module in my project. Should I demo how I did that? Because there was one gotcha there. Yeah, yeah. Wait, so which one were you using? I was using an XML parser module.
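The "push params to flow context" issue mentioned above refers to Node-RED's flow-scoped context store, which lets one function node stash values for downstream nodes in the same flow. A minimal mock of that API, for illustration only; in a real function node the runtime provides `flow` directly, and the key and value here are hypothetical:

```javascript
// Minimal stand-in for Node-RED's flow context API, for illustration;
// inside a real function node the runtime supplies `flow` itself.
function makeFlowContext() {
  const store = new Map();
  return {
    set: (key, value) => { store.set(key, value); },
    get: (key) => store.get(key),
  };
}

const flow = makeFlowContext();

// In one function node: stash a harvest parameter for later nodes.
flow.set('resumptionToken', 'abc123');

// In a downstream function node: read it back instead of re-deriving it.
const token = flow.get('resumptionToken');
```

The real calls are the same `flow.set(key, value)` and `flow.get(key)` inside function nodes.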
Okay, so, do you want me to... I can stop sharing quickly here. Yeah. So, if you go into the Node-RED user directory, you can see there's a settings.js, and in there there's something called functionGlobalContext. You can't just do require() inside a function node in a flow, so you have to import the module as a global context, which is what I'm doing here.

And the way to install it: you have to go to the root directory, this Node-RED home directory, and run npm install there. If you do it outside, like in your shared folder, it doesn't pick it up, because the module has to be in that node_modules directory for it to work. So if you go to the Node-RED directory, run npm install whatever, and then put that in your settings.js, you can then go and use it.

So I wonder if that has to do with the settings change that Harsh mentioned earlier; if you did that, maybe it would pick up the other one as well. So this is where you get the parser from global context, which you define in your settings.js, and now you can use this module. That's it. Cool.

Okay, and we actually just have five minutes left.
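The settings.js change described above looks roughly like this; the module name `xml-parser` and the context key are assumptions based on "an XML parser module", not the exact ones from the demo:

```javascript
// ~/.node-red/settings.js (the Node-RED user directory) -- sketch only.
// The module must be npm-installed in this same directory so it lands in
// its node_modules; installing it elsewhere won't be picked up. The
// module name and context key here are assumptions for illustration.
module.exports = {
  // ...other settings...
  functionGlobalContext: {
    xmlParser: require('xml-parser'),
  },
};
```

A function node then retrieves it with `const parser = global.get('xmlParser');` instead of calling `require()` directly, which Node-RED function nodes don't allow by default.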
We have been rolling right through the scheduled time here. Let me get back to the agenda quickly. So one thing I wanted to mention: I was thinking there may still be some folks that want to ramp up, and maybe we should just start having some drop-in hours where myself or Ryan or Cam are available to help folks get their environment set up. So I wanted to pose that. I didn't know what day or time would be best; I think we had done a Tuesday morning in the past. Ryan or Cam, do you think that is a good day and time? Yeah, that works for me.

Okay, well, I guess I could just pose the question whether there's anyone on the call at the moment that would want to join that, or we could continue to leave it as an open option. Yeah, we can definitely host office hours. Okay, well, why don't we just nail down a time and post that to the SHARE dev list and Discord, and then if folks want to drop in, they can.

And then the last question I had: we're going to present at CNI on December 10th, and the next one of these calls isn't actually scheduled until December 13th. So just a question of whether it would make sense to meet before then. I am certainly happy and willing to have another one of these before then; that might help us nail down what might be best to demo, etc. Any opinions? If you want to continue, just go ahead without me. Okay, okay. Well, it sounds like it would be just fine to continue as you are working, and we'll keep iterating on the various items. And Cam and Ryan, could you recap in a minute the things you're working on? Sure.
Yeah, so I've been working on a prototype for decentralized browser storage, so you can basically contribute to this network just by having your browser open for a little bit. And that could be extended to a Chrome extension or something, so it's running whenever you're just on your computer, with zero setup. That's what I've been working on. Okay. And I've been helping Cam along with that, as well as working on some of the Node-RED stuff that we've been mentioning. Okay.

So then, real quick, action items: that all makes sense, and I'm probably adding myself. "Update package.json dependencies" is one I need to do; I'll make sure that happens. Anything else to add to the action item list real quick? Okay, sounds like no.

All right, well, if there isn't anything else, thank you for joining, and I'm really excited about the progress so far. This has been a really exciting call from my end, and I think it'll be good, if we don't touch base before the 13th, to then look at where we are and continue to move forward. Yeah, this is good stuff. Thanks, all. Thank you. Thank you. Talk to you guys later.