Hi everyone, I'll be talking about the Victoria Police station locator. In fact, we made the station locator live just yesterday, so it's fresh off the press. This was a fairly small component that we had to build for Victoria Police.

Talking about the requirements: Victoria Police approached Salsa to build a new component on their existing website. They wanted to elevate the user experience for finding police stations. They already had police station data available within their organization in their DMS, exposed through an API, but the old implementation was not using it, so they had to add and update those stations manually, which was naturally very onerous. That also impacted the accuracy of the data. For example, if a police station's phone number changed, or if a police station was removed, it went through a whole process before it would be updated on the site. They wanted something which was, if not instant, at least much more timely.

The objective of the new component was to improve the user experience and minimize the dependency on manual content entry.

As for the solution: Victoria Police already has a website, which is hosted on Single Digital Presence (SDP) and supported by the SDP team, so we had to build on top of that. From the back-end perspective, we created a new content type called Station and automated migrations from the API into Drupal. One of the problems with such an approach in the historical projects we had done was that whenever these API migrations ran (in this case they wanted them to happen twice a day),
they could be very processor-heavy on the server. So SDP specifically asked us to make sure that this syncing was done in a much more performant way. What we did was a delta sync from the API: each API call was a delta call rather than a full sync, so the number of records touched was minimized.

From the front-end perspective, we implemented suburb and postcode search for stations. We also implemented filters: by whether the station is open 24 hours, by distance, and by specialty services. In addition, we did a Google Maps integration as well.

Very quickly, our team had a relationship manager, project manager, designer, tech lead, back-end developer, front-end developer and a QA analyst. In fact, the back-end developer for the project is on the call today.

On the timeline: project inception was 27 March, the build started in May, and go-live was, as I said, yesterday. So all in all it was a 15-week project. It would have been amazing if it was a 15-day project, but it was 15 weeks.

Talking about a few challenges: when we started the project, the API for the stations was still in development; we did not have an API at all. But once we started building, the API team from Victoria Police was very helpful and was able to provide us with appropriate samples, which made development a bit easier. The API went into production last week, and hence we were able to launch just yesterday.

The other major issue with the project was partial ownership. Because of that, we had to make sure that SDP was with us in all stages of the project.
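The delta sync described above can be sketched roughly as follows. This is a minimal illustration in Python, not the actual Drupal migration code; the record shape, the `deleted` flag, and the `fetch_stations` helper are all assumptions for the sake of the example.

```python
from datetime import datetime, timezone

def delta_sync(fetch_stations, store, last_sync):
    """Pull only records changed since the last sync and upsert them.

    `fetch_stations(since=...)` stands in for the Victoria Police API
    client, returning only records modified after `since` (the delta
    call); `store` is a dict keyed by station id standing in for the
    Drupal content store.
    """
    now = datetime.now(timezone.utc)
    changed = fetch_stations(since=last_sync)   # delta call, not a full dump
    for record in changed:
        if record.get("deleted"):
            store.pop(record["id"], None)       # drop decommissioned stations
        else:
            store[record["id"]] = record        # create or update in place
    return now                                  # becomes last_sync for the next run
```

Run on a schedule (twice a day, per the requirement), only the handful of changed records is touched instead of all 300-odd stations, which is what keeps the sync cheap on the server.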
So we made sure that SDP was kept in the loop on whatever designs we were implementing, including the technical designs, so they knew exactly what we were doing. Also, when we were building, we built it as a separate feature, and before we merged the feature into develop they had a good review and we walked them through the whole functionality as well.

Another thing SDP specifically requested after we started the project was that the back-end module be developed as a separate module in a separate repository, which would then be added into the Victoria Police repository. So we created a new repository with a new module, the Tide station locator, and that added to the complexity a bit because of the dependencies and so on. But having said that, it was a great learning experience and we were able to do it well.

Now a very quick demo. Before I demo the new functionality, let me show you what the old functionality was. Can everybody see my screen? This is how the old functionality looked: this is "find my local police station", and this is how the local police stations are shown. For example, if I want to search for a specific suburb, I type something like Ararat, search for it, and it shows me the result. And that was it. As you can see, this is a keyword search, so it's not very precise. Take for example when I search for Box Hill: it gives me Box Hill, yes, but it also gives me everything with "hill" or "box" in it, such as Forest Hill, Pyramid Hill, Swan Hill, and so on. If you look at these, they are just not relevant results. Also, it gives some information about the individual stations, but not a lot.

Now, looking at the new component that we launched just yesterday, this is the new police station locator page.
When we land on the page, as you can see, we get a full listing. We also have a Maps integration, where we can zoom in to a specific area, and I can filter or search by a suburb. When I'm searching by suburb, I can do something like this: if I select "any distance", it still shows me all 332 results, but sorted by distance. But if I choose, say, 10 kilometres and press search, it limits my results as well. I can go to individual police stations too, which gives me much more information, and a map there as well.

We also have a map view. When we are on the map, we have actually disabled the filter by distance, because the map tries to focus on an area, and the more I zoom out, the more results I start seeing, and so on. From here as well, I can go to an individual police station. And that's the main demo that I wanted to show. Any questions?

Is this using the back end? Sorry, I didn't get you. No, we are using Elasticsearch; SDP actually uses Elasticsearch.

Were there any challenges you came across, particularly around pagination? No, because this was mainly a listing. Pagination on the listing and on the maps was not a problem; we did not face any issues there. But if I talk about challenges, I would say the map view was pretty challenging, because deciding on the behavior of the map is always hard: should we allow panning, should we allow filtering on the map itself, and so on. Those were some of the reasons why, for example, we disabled the distance filter there: if we filter and then the user pans across the map, they would not see results, which did not look very good.
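A distance filter of the kind shown in the demo (sort everything by distance, optionally cap at N kilometres) can be sketched with the haversine formula. This is a minimal Python illustration, not the production implementation; the station dict shape with `lat`/`lon` keys is an assumption.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/long points, in kilometres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def filter_by_distance(stations, origin, max_km=None):
    """Sort stations by distance from `origin`; optionally cap at max_km.

    With max_km=None ("any distance"), every station is returned, just
    sorted nearest-first, matching the behaviour described in the demo.
    """
    with_dist = [
        (haversine_km(origin[0], origin[1], s["lat"], s["lon"]), s)
        for s in stations
    ]
    with_dist.sort(key=lambda pair: pair[0])
    if max_km is not None:
        with_dist = [p for p in with_dist if p[0] <= max_km]
    return [s for _, s in with_dist]
```

In practice a search backend like Elasticsearch can do this server-side with a geo-distance query, but the logic is the same.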
So there were, I would say, quite a lot of challenges around map behavior, but when it came to the list, I don't think there were many.

Okay, I had a brief look at it yesterday, and because I've done a few of these myself I was curious to see how you did it. I noticed that when you do a search it returns everything, and then the pagination just pulls from the data it already has. So it uses Elasticsearch for filtering by keywords, but it doesn't use Elasticsearch for paging, which is fine because it's not a lot of data, but I was wondering if there was a story behind that.

No specific story. Because we just had around 300 results, we knew the initial load would not be a problem. Even if in the future there are a lot more stations, how many can there be, maybe 500? That's not a big chunk of data, so that implementation was pretty okay.

Just about the back end: so this is a simple content type that you're managing the data in? Yes, this is a simple content type. We are migrating into that content type, and we have made sure the content type is not editable.

So it just gets the data from the API, always? There's no manual update? That is correct. It gets the data from the API twice a day, at 12pm and 6pm. And because we wanted to make sure the API was the source of truth, we have disabled any editing in the Drupal interface. I will be talking about that specifically at handover.

I'm just curious about one thing: everything is coming from the API, it's an API-driven thing, so why exactly use the back end? You could call the API directly. Yes, there was a huge discussion around that. The main reason was that Victoria Police specifically wanted individual pages. And secondly...
So if I go to the old implementation, you can see they had a lot of pages like this, with "location?" query strings. Within their content they were linking to pages like this, and we needed to create a listing of those and also create redirects. That is why we went with a content type. We had a major discussion with Anthony from SDP around whether we should push the data directly into Elasticsearch and not use Drupal content types at all, using the module that we used on the other project, Alan: Data Pipelines. We discussed that as well, but finally it was decided that it makes more sense to use Drupal, especially because the content is not that huge; we're just talking about 320 to 330 stations.

On this: it's a locator, and you can implement any kind of locator with this approach; I think that's why you have a separate module? The reason we have a separate module is not because we wanted it; it's because SDP wanted it. They wanted a module in a separate repository that can be installed on the reference site. That's how we ended up with a separate module. But it's not fully separate either, because only the back end is separate; the front end is not. So it's not as modular as we would have wanted, but the SDP front end does not support separate modules, so we were not able to do that.

As for the reason for using Elasticsearch: this is an existing project based on SDP, and SDP uses Elasticsearch; we just implemented one component, the station locator, nothing else. That's why we used the Elasticsearch that was already there. It's the default stack in the SDP implementation; for search they use Elasticsearch everywhere. So the front end can call that API directly; it doesn't need to go through the Drupal route. Any more questions?
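The split described in the Q&A, Elasticsearch for the keyword filtering and plain client-side slicing for pagination, could look roughly like this. The field names (`suburb`, `open_24h`, `location`) are illustrative assumptions, not the real SDP index mapping; the sketch just builds an Elasticsearch query body as a plain dict and paginates the already-fetched result list.

```python
def build_station_query(suburb=None, open_24h=None, origin=None, max_km=None):
    """Assemble an Elasticsearch bool query body for a station index.

    Field names are hypothetical; the shapes (match, term, geo_distance)
    are standard Elasticsearch query DSL.
    """
    must, filters = [], []
    if suburb:
        must.append({"match": {"suburb": suburb}})
    if open_24h is not None:
        filters.append({"term": {"open_24h": open_24h}})
    if origin and max_km:
        filters.append({
            "geo_distance": {
                "distance": f"{max_km}km",
                "location": {"lat": origin[0], "lon": origin[1]},
            }
        })
    return {"query": {"bool": {"must": must or [{"match_all": {}}], "filter": filters}}}

def paginate(results, page, per_page=10):
    """Client-side pagination: slice the result list the client already holds."""
    start = (page - 1) * per_page
    return results[start:start + per_page]
```

With only a few hundred records, fetching everything once and slicing locally is simpler than round-tripping `from`/`size` parameters to Elasticsearch for every page, which matches the rationale given above.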
Just one question about Google Maps: there are limitations on the API nowadays, so how exactly are you managing the API calls? It's straightforward API calls every time. On the limitations: there are two API keys that we are using. One key is used for the Google Maps Embed API and the Google Maps JavaScript API. Both of these are effectively free and don't have such limitations, so that was fine. The other APIs we had to use are Google geocoding and reverse geocoding.

Let me share my screen again. Reverse geocoding is used for the "use my location" feature. I can't show it right now because I have not allowed my browser to use my location, but when you use that feature, the browser gives us a latitude and longitude, which then needs to be reverse geocoded; that's where we use reverse geocoding. And when I search here and select one of these suggestions: the API we are using for postcodes only has suburb names and postcodes, it does not have lat/longs, so when we select one we use the Google Geocoding API as well. These are the two APIs that actually need some investment, but that's what it is. Was that your question?

It's a standard thing, nothing special going on. We did have a detailed conversation with Victoria Police, and their API team told us that they now have the option of geocoding and reverse geocoding themselves. That's something we were told just last week, so we have told them clearly that this is something we can look at in the future: we can change our implementation to use their APIs for geocoding and reverse geocoding.

Stop recording now.
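The two geocoding calls discussed in the Q&A can be sketched like this. This only builds Google Geocoding API request URLs and parses a response of the documented shape; the key is a placeholder, no network call is made, and the "VIC, Australia" address formatting is an assumption for illustration.

```python
from urllib.parse import urlencode

GEOCODE_ENDPOINT = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode_url(suburb, postcode, api_key):
    """Forward geocode: suburb + postcode to lat/long.

    Needed because the postcode dataset has suburb names and postcodes
    but no coordinates.
    """
    params = {"address": f"{suburb} VIC {postcode}, Australia", "key": api_key}
    return f"{GEOCODE_ENDPOINT}?{urlencode(params)}"

def reverse_geocode_url(lat, lng, api_key):
    """Reverse geocode: browser lat/long to an address, for 'use my location'."""
    params = {"latlng": f"{lat},{lng}", "key": api_key}
    return f"{GEOCODE_ENDPOINT}?{urlencode(params)}"

def extract_latlng(response):
    """Pull the coordinates out of a Geocoding API JSON response."""
    loc = response["results"][0]["geometry"]["location"]
    return loc["lat"], loc["lng"]
```

Since these endpoints are billed per request, caching geocode results for the fixed list of suburbs would be the obvious way to keep the "investment" small.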