Okay. Hello. Good afternoon. I think we have a nice audience here, so let's get started. Today I'd like to talk about, well, that's not a big surprise, I'd like to talk about search. And I know this is a Drupal event, but I'd like to take a step back for a minute, look at the bigger picture, and see what the expectations of our users are when it comes to search. First example, you're probably familiar with it: Wikipedia. When you start typing in the search field, you get suggestions for the article that you are probably looking for. And once you select a suggestion, you are taken directly to the article page. You still have the possibility to use the traditional full text search, but when you know what you're looking for, you will probably never end up on that page. We still have a couple of seats here in the front. Another example, on Amazon, you're probably also familiar with this one: when you start typing, you also get suggestions. But here the suggestions are of a different kind. You don't get suggested results, you get suggested search queries. And for the top searches you even get a category that's associated with that suggestion, and when you select it, it takes you to the search result page with that facet pre-selected. On Spotify, there again you have an autocomplete. And that autocomplete contains a lot of different kinds of results: you have tracks, you have albums, you have artists, and pretty much anything that you could be searching for on Spotify, you will find in that autocomplete. There is a search result page, but in practice you never need it. On GitHub, the search by default is limited to the project that you're currently looking at. 
You have the possibility to make a global search that searches through the entire data set, but by default you are limited to just the project which you're looking at. On Stack Overflow, the search is nothing really fancy, no fancy widgets. But when you search, the default sorting is by relevance, and relevance is actually a combination of multiple parameters. And you see that you can actually sort the results in different ways, for example by the most recent ones, in case you want to answer questions that were recently asked. You can also click on a tag, and it's also the search system that powers the listing for a specific tag. So you see that it's not only about searching, it's also about finding data and querying the data in multiple different ways, sometimes structured, sometimes unstructured. So what can we learn from these examples? Well, the first thing that we can learn is that there is no one size fits all when it comes to search. Every project has different data and different users with different needs. And if we want to match the needs of these users, we need small adjustments in the right places, and these are spread throughout the technical stack. It starts at the front end, with different search widgets, different business logic for how you use facets, different styling for the search results. At the application layer, you will be building different kinds of queries and preparing the data to give exactly what your application needs. And at the storage level, you need a dedicated system that will actually read the data fast enough, oftentimes on millions of entries and terabytes of data, to really give the performance that is needed. The other thing that we can learn from all these examples: well, all of them are built on Elasticsearch. Actually, all of them with one exception, and that exception is the Amazon example. 
And the reason why I pointed that out is because a lot of people, when they hear about Elasticsearch for the first time, think: hey, Elasticsearch, that must be an Amazon service, kind of like Elastic Compute Cloud. But no, Elasticsearch is an open source project, just like Drupal. So talking about Drupal, what does search look like in Drupal? Well, there is a search module in core. It's enabled by default, so it must be good, right? It lets you search for content. And, well, it lets you search for exact matches of content, and exact matches of content that were created before the last time cron ran. You also have an advanced search. No comments there. And actually, that search doesn't look so bad, but in terms of how it works, it works pretty much the same way as this one does. Do you know what this is? Drupal 4.7. If we take a look at the changelog, looking for search, we see that the release of Drupal 4.7 was the last time that any significant changes were made to the system. Just to put that into context: when Drupal 4.7 was released, these devices here did not exist. And this website here was still very cool. So, yeah, if search is relevant and important to your site, don't use the core search. But the core search is not the only option that we have. Chronologically, the next one that came along was Apache Solr. That effort was started, among others, by this guy, Robert Douglass, many years ago. The idea is to use the open source search engine Apache Solr and to integrate it in Drupal with the module of the same name. The module actually offers a good amount of functionality; that's from the official module page. But really, the main selling points were: well, it's fast. It's really fast. Search is a technically challenging problem, and if you want to solve it with the right performance, then you need to have a dedicated system. 
The second one: it does proper text analysis. It means that you can do stemming, you can do fuzzy searches, all these things that provide tolerance towards your users and provide really satisfying user experiences that don't frustrate them. And then we have the big selling argument: facets. And facets are not only the widgets where you have your links in the sidebar. It's really this concept that you start a search with just a couple of keywords, and then you get a lot of results, way too many results to just go through by hand. Facets then let you iteratively refine your result set down to exactly what you want to look for. And this is great, but somehow as a community we got stuck on this idea that facets are the essence of a powerful search. Don't get me wrong: facets are great. They're definitely a very useful concept, a very useful widget. But it's not the end of the story, and I think that things like text analysis might be much more important to provide good search experiences. But it's all happening under the hood, so it's not really fancy, there's nothing to show to clients. It's one of those things that are a little bit of magic; maybe we should understand it a little bit better. And of course, I can't talk about search in Drupal and not mention the Search API. There was a dedicated session on that, so I'm not going to go into detail. But in short, Search API lets you integrate a lot of different search backends and use them with site building tools to build views and different kinds of flexible search results. It's a very good project. We've used it on a lot of client cases and built some really cool stuff with it. But I have a couple of issues with it. The first one is that it's an abstraction of Solr. That means that you can do everything that you can do with Solr with other search backends, if they support that. But you can't do anything that Solr doesn't let you do. 
The second one is that it's all based on configuration. That's great, because it means that as a site builder you can do everything through the browser; you just click around and do your thing. But at the same time, there are certain tasks that are better solved with code than with configuration, and I personally believe that search is one of those. The third point is that it is hard to customize, because if you want to get to the small details that really make the difference between a search that works and a search that people really feel works, you have to go through the Search API hooks or API, so the Search API API. And then through that you can reach into the API of the underlying system, which can sometimes be quite awkward and make simple tasks a little bit more complex than they need to be. But I'm not here to complain about Search API. What I'd like to do is show you an alternative, and this alternative is Elasticsearch. My first contact with Elasticsearch was a couple of years ago, on a project on which we used Drupal and Node.js to talk to Solr. We had some pretty complex integration there, and at some point in the project we realized that we had some major issues with Solr, namely deadlocks around deployment and downtimes with reindexing. So we decided to give Elasticsearch a try. And to our surprise, we were able to switch over within around a day, and we never looked back. It just worked. Now, I'm not here to just tell you why Elasticsearch is better than Solr, but you can just look at Google Trends. I know Google Trends has to be taken with a grain of salt, but there is definitely a trend there that's difficult to deny. What are the selling points of Elasticsearch? Well, there's a lot of them. If you look on the Elasticsearch website, there's plenty of great explanation. 
But I think that there is one point that is really important and very relevant to us as Drupal developers and the kind of use cases that we have in combination with Drupal, and that's a developer friendly RESTful API. Developer friendly means that it is an API that is nice enough to work with that we actually want to work with it; we don't want to hide it behind some kind of abstraction layer. And it's a RESTful API, meaning that it works the same way as the stuff that we do on a daily basis, so it will be familiar to you if you're a web developer. So fast forward a couple of years: last year in the fall, we had a customer that came to us with a relatively simple project. They wanted an intranet. And this intranet had to be extremely simple, it had to be minimal. But one of the things it had to do is it needed a search. And that search, again, had to be very simple: you start typing anywhere on the intranet and it searches through users, it searches through content, and it also searches through the company calendar that's hosted elsewhere and that we get through an API. So it had to be quite powerful. At that point, the project was on Drupal 8 with an Angular front end, and the Search API was not really stable enough, especially the Elasticsearch integration. So we decided to build a simple solution directly using Elasticsearch. And this is what we got. What you see here is not the actual intranet, but a copy with fake content. When you start typing, you just search through everything, and you see there's error correction. When you select something, you jump directly to that user profile. If you're looking for content instead of users, you have a very similar experience. So it's simple, it works, and the users were really happy about this. So how did we go about building this, and how can you build this too? The first thing we did is we installed Elasticsearch, and that's pretty easy. 
You get it, you unzip it, and you run it. On a real project, you will want to use a package manager, so apt-get or whatever, or maybe a Docker image. If you want to have it in production with high availability, yes, you will want that kind of setup. But if you just want to try it out, you can actually do that in under a minute, which you can't say about most systems. Then we want to talk to Elasticsearch. So it's running, but we want to talk to it from our PHP code, and there's an excellent PHP library from Elastic. And since everybody here is using Composer to build their Drupal 8 projects, right, you only need to run that one command, and you have the Elasticsearch client installed. So now we're done with the setup, and we can start indexing. How do we index things? Well, you use your good old hook_entity_update(), and there you instantiate your client. In this case, I'm just calling the library directly; of course, you would wrap this in a service and use dependency injection to get it, but this is just the minimal amount of code to get things working. On the client we call the index method, and we specify that we want to save the data in a content index, and the type is article. A type is, roughly speaking, to an index what a table is to a database in MySQL terms. And we specify an ID, just so that we can refer to that entry later on. Then for the body, we just submit the entity object as is. And actually, the great thing is that it works. The not so nice thing is that what you get in your search index is this. You might recognize the structure: it's the same structure that you get from the standard REST API in Drupal core, without any modification. So the node ID is an array that contains objects, these objects have a value property, which is a string, and in that string you have a number. 
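To make that concrete, here is a sketch (not the actual project code) of the index request just described, written in Python to show the JSON body that the PHP client's arrays get converted to. The index, type, ID, and field values are illustrative assumptions:

```python
import json

# Sketch of the index request described above: the document goes into the
# "content" index under the type "article", with an explicit ID so we can
# refer to the entry later.
index_request = {
    "index": "content",
    "type": "article",
    "id": "5",
    "body": {
        # Without any normalization, Drupal's raw REST structure ends up in
        # the index: every field is a list of objects whose "value" is a
        # string, even for numbers like the node ID.
        "nid": [{"value": "5"}],
        "title": [{"value": "Hello world"}],
    },
}

print(json.dumps(index_request["body"]))
```

The body is exactly what gets stored, which is why the awkward `[{"value": "..."}]` wrapping shows up again in the search index.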
That's not the kind of data that you want to have in your JavaScript frontend, for example, and it's also not the kind of data that you want to have in your search index. So we're going to revise this a little bit. On entity save, we now run the entity through the serializer, using the serialization API from Drupal core, and we define a custom normalizer. And it's that data that we then put into Elasticsearch, with something that will look like this. Here we are fully free to define our own structure; you can specify really anything. You have strings, you have objects, you have numbers. And Elasticsearch will actually take that data and create a schema out of it, so you don't need to explicitly specify what your data structure is going to be. By default, you just send an object and it will take that. That's great, because it really means that you can get started very quickly. But there are some cases where you want to specify not just what data you have, but really what this data represents, and you need to do that so that Elasticsearch will be able to better handle your data. For example, we have a created property, which is quite common on entities. It's a timestamp, and a timestamp is a number, so if you index it, Elasticsearch will say: okay, that's a number. But we need to specify that it's a number that represents a date. So what we do is we specify that it is a date, and the format is epoch seconds, so Unix timestamps. For the content field, it gets recognized as a string already, but we want to tell Elasticsearch: hey, this is English, please parse this as English, or Finnish, or German, or Chinese. There's actually built-in support for all the Western languages, or really most of them, as well as support for Asian languages through plugins. So that's something that comes out of the box and works very nicely. 
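As an illustration, the explicit mapping just described might look like this, sketched in Python as the JSON body. This assumes the Elasticsearch 2.x-era syntax that was current at the time of the talk (`string` fields, `not_analyzed`), and the field names are assumptions:

```python
import json

# Sketch of an explicit mapping for the "article" type.
mapping = {
    "article": {
        "properties": {
            # A timestamp is just a number unless we tell Elasticsearch that
            # it represents a date, in epoch seconds (Unix timestamps).
            "created": {"type": "date", "format": "epoch_second"},
            # Run the built-in English analyzer (stemming etc.) on the body.
            "content": {"type": "string", "analyzer": "english"},
            # A URL is a string too, but we don't want it tokenized into words.
            "image_url": {"type": "string", "index": "not_analyzed"},
        }
    }
}

print(json.dumps(mapping, indent=2))
```

Everything not listed here still gets its type from dynamic mapping, so you only spell out the fields where the default guess isn't good enough.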
And then there are other fields, for example the image URL. It is a string, but we don't want to analyze it; we don't want to know what the words in that URL are. So we tell Elasticsearch not to analyze it. So now we have our content in Elasticsearch, structured properly, and the last step is that we need to start searching. Searching is also very simple. You have your client and you call the search method. There you can specify what index you want to search on, and you can even specify multiple ones; for example, we want to search on content and we also want to search users. And there we have a query language that is structured as JSON, or in this case PHP arrays that get converted to JSON. In this case we do a multi_match, because we want to search in multiple places. We search for whatever input we get from, for example, the controller where you put this code. Of course, this input has to be escaped, so make sure that you put proper security measures into place; please don't copy code from these slides and deploy it in production. And we specify what fields we want to search on. In this case we search on the label. And because we want search-as-you-type functionality, for example for the data that we get for the autocomplete, we also search through another version of that same label field that is indexed differently, as an autocomplete, meaning that you get partial matches. What we're doing here with this caret five (^5) is that we are boosting the label: we will get a result if we have a partial match or a complete match, but if we do get a complete match, then it will be ranked higher. Also, because we want to be tolerant towards our users, we specify some fuzziness. The fuzziness is typically a number, and it represents the number of characters that can be off when you're mistyping something. But if you specify auto, it will automatically increase the longer your input gets. 
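Put together, the search request just described looks roughly like this, sketched in Python as the JSON body (the index names, field names, and the `prefix_length` detail discussed next are assumptions for illustration):

```python
import json

# Sketch of the multi_match search request described above.
search_request = {
    "index": "content,users",  # searching several indices at once
    "body": {
        "query": {
            "multi_match": {
                "query": "toni",              # the (escaped!) user input
                "fields": [
                    "label^5",                # complete matches ranked higher
                    "label.autocomplete",     # partial, search-as-you-type hits
                ],
                "fuzziness": "auto",          # tolerance grows with input length
                "prefix_length": 1,           # first letter must match exactly
            }
        }
    },
}

print(json.dumps(search_request["body"]))
```

The same field is queried twice on purpose: the autocomplete-analyzed variant catches partial matches, while the boosted exact field makes sure a complete match always wins.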
So that means that if you type a short word, you might be allowed one or two typos, and if you type something like a whole sentence, then maybe even 10 characters might be off and that would be fine. And then we have a small but very important detail: the prefix length. The prefix length means how many characters from the beginning of the input will not be allowed to be fuzzy, to be wrong. And this is really important, because we didn't have this at first, and when we didn't have it, we had this problem. You start looking for your friend or your colleague Tony, and then you have a couple of search results like Tony and Tomy. And Tomy actually makes sense, because, you know, it's just one letter off. But Johnny is also one letter off, and that suggestion somehow looks wrong. The reason is that when we mistype something, we usually don't mistype the first letter; it's always later on in the word. It's also the way the brain works: we give a lot more importance to the first letter of a word than to the remaining ones. So we want to make sure that we're not going to get this last result, because even though it is technically correct, it doesn't feel right to the user. And when we're building this kind of suggestions, the suggestion is not the place to give interesting stuff to your user. The suggestion is the place where you just give the user what the user expects, which is a different strategy from what you then use for the search result page when you do the full text search. So this is what we built, and it actually worked out really nicely. The customer was very happy, the team was very happy, and the project was launched on time and under budget. So we decided to try this approach on other projects. Here's another project: Avery is a company that does office supplies, and they have an online catalog. 
One of their use cases is that you buy a product in the store, and then there's a product code on it that you need to enter on the site to get, for example, the Word template that goes along with the print-at-home business cards that you bought. So we had to support typing in a product code, and product codes are analyzed differently than regular text. We needed to make sure that these product codes were really understood as one word, not as multiple words, even though they might have hyphens and things like that in them. So that's one of the things that you can do within that search interface. But within exactly the same search interface, you can also search for product types and product attributes, in this case business cards that you can print on a laser printer. And the data that you get here in these suggestions is actually coming from facets, so it is based on the availability of products given what you already typed. So how did we build that? We are also doing a search query. In this case, the project is multilingual, so we actually have multiple indices, one per language. And what we're doing here is we don't actually expect any search results; we ignore the search results. We do the search query as usual, as I showed before, and here we do an aggregation. An aggregation is the kind of operation that gets you the data for facets, among other things, but it's more generic than that and actually lets you do a lot more than just facets. We call this aggregation suggested terms; that's just the name that we decided to give it. Then we use a terms aggregation, meaning that it will return the most popular terms. The field that we use there is suggestions, which is just a field where we put all the suggestions in, and we use the raw version of this field, meaning that it's not analyzed. 
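The request just described might be sketched like this in Python (field names, the aggregation name, and the include pattern are illustrative assumptions; the hits are suppressed because only the aggregation matters here):

```python
# Sketch of the suggestion aggregation described above.
agg_request_body = {
    "size": 0,  # we don't want search results, only the aggregation
    "aggs": {
        "suggested_terms": {                 # just the name we chose for it
            "terms": {
                "field": "suggestions.raw",  # raw: multi-word terms stay whole
                "include": ".*laser.*",      # only terms matching the input
            }
        }
    },
}

# Reading a (simulated) response works as described: walk the buckets under
# the aggregation name we specified.
fake_response = {"aggregations": {"suggested_terms": {"buckets": [
    {"key": "laser printers", "doc_count": 42},
    {"key": "laser labels", "doc_count": 7},
]}}}

suggestions = [bucket["key"]
               for bucket in fake_response["aggregations"]["suggested_terms"]["buckets"]]
print(suggestions)
```

Because the terms come back ordered by popularity, the buckets can be turned into the suggestion list for the frontend more or less directly.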
The reason we do that is because sometimes we have product categories or product attributes that are made up of multiple words, and we only want to return those as a whole, not as individual words. And to make sure that we only get the data that's relevant to us, we only include those aggregations, those facets, that match what the user has entered. Then, based on the result, we have our aggregations under the key that we specified before, and we just go through each of the buckets and create our response this way. So that's also something quite simple. One interesting challenge that we had, which was actually very easy to solve but a little bit unusual: well, I think everybody's familiar with paper sizes, and everybody knows that A4 is really the most common paper size. That means that when you have multiple variants of a product that come in multiple sizes, in terms of searching they're all going to do pretty much just as well, because the description is the same, the attributes are the same, only the size is different. However, A4 products sell a lot more, so we want to make sure that those will be ranked higher. To do that, we split our query into two parts using a Boolean query. There's the part that must match: in this case, the content must match our input, the same thing that we did before. And then we have a should part. The should part is like: it would be great if it matched, but it doesn't have to. In this case, it should match a term query where the size field is A4. And this just gives us that little boost, so that among different product variants, A4 will always be the first one. What's interesting here is that this example is just a static value, but there's nothing that stops you from putting in other, maybe dynamic data, based on the user profile or maybe based on user history. This is happening not at indexing time. 
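The must/should split just described can be sketched like this (field names and the query string are assumptions for illustration):

```python
# Sketch of the bool query described above: "must" carries the regular
# full-text match, while "should" doesn't have to match, but when it does
# (size == "A4") the document is ranked higher.
bool_query_body = {
    "query": {
        "bool": {
            "must": [
                # Same kind of match as before: the content must match
                # the user's input.
                {"match": {"content": "business cards"}},
            ],
            "should": [
                # Optional clause that boosts A4 variants above other sizes.
                # Static here, but the value could just as well come from a
                # user profile or user history at query time.
                {"term": {"size": "A4"}},
            ],
        }
    }
}
```

Since the boost lives in the query rather than in the index, changing the preferred size for a particular user never requires reindexing anything.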
It's happening at querying time, so you can actually build really powerful personalizations on top of this kind of modification. Another example that was pretty cool; a colleague of mine built this project. It's a portal for real estate professionals where they can do very fine grained searches based on a bunch of attributes, and then the user can save the search, and we wanted to send out notifications whenever there's something new. What you use there is the percolator. A percolator is a kind of coffee maker from the 60s that apparently makes really bad coffee, and somehow they used that name for an API that is actually really cool. What you do with the percolator is that instead of indexing content, you index a search query. And later on, when you index a new document, you can make a percolate query where you say: hey, are there any saved searches for which this document which I just created would be a result? This works in a very simple way, and also in a way that scales really easily. So it was very quick to implement, and we also know that we're not going to have any scalability issues with it. And this kind of feature, saved searches with notifications, is otherwise something that really doesn't scale well at all. After a couple of projects using this kind of approach, a colleague of mine said: hey, you know, it's not much custom code for each of these projects, but it is still custom code; there should be a way to optimize that. And so we created Elasticsearch Helper. Elasticsearch Helper, as the name implies, helps you, but it doesn't do anything on its own. The way that you use Elasticsearch Helper is you create a plugin, a plugin that defines an Elasticsearch index. You specify what the name of the index should be; you can also use placeholders in there, some pretty cool stuff; the type; and optionally what entity type and even bundle you want to put in the index. 
And actually that's all you need to get started; this example would put nodes into a search index. What's interesting here is that all the functionality, all the code that gets executed, is in this base class. And if you want to do things differently, if you want to have a slightly different setup, if you want to modify your data before it gets indexed or delete things in a different way, you can just override any of the methods of that base class. Inside each of these methods, it's just Elasticsearch code that gets executed, so you don't need to learn a new API; you are just using the Elasticsearch API. This kind of approach has also been used by more people. Here's another project, from colleagues in Belgium, where they are using Elasticsearch and Elasticsearch Helper to build a site that lets users search for student housing. It's making use of Elasticsearch's built-in geographical proximity search. That project also went very well. There's another project, built by colleagues in Finland, that's not launched yet, so I can't show it to you, but pretty much it looks like this: you search for stuff and then you get your search results. And what's interesting: it's not Angular, it's not React, it's nothing fancy. This is actually rendered as part of a view. What my colleague did is write a small Views plugin, which is going to be integrated in Elasticsearch Helper, that lets you perform a search, get the search results from Elasticsearch, and then render these entities through Views. So you still get all the flexibility of Views; you can even combine that with other filters. It actually works really nicely. So this approach can really be combined with more traditional site building methodologies if you want. 
There's one last project that I'd like to show, and that's something we did for SportScheck. It's a large sports retailer in Germany, Austria, and Switzerland, and they also have a big online shop that is actually not built on Drupal. But they have a lot of products and a lot of product categories, and they needed a system to manage all that content complexity. So we built a system, this one actually built in Drupal, where we pull in all the data and give the user the possibility to analyze the data and to do bulk operations on this large data set. This listing, for example; I know in terms of styling it's nothing very special, you've probably seen something like this before. But this data is actually coming from Elasticsearch, and for a listing like this, that's not really all that interesting. We can also use fancier interfaces, for example this tree view that really lets the editors go through product categories and assign bulk operations to them, which are then dispatched to other users for content creation and moderation. What's interesting here is that the first iteration of this tree was loading data from entities, and after a good amount of optimization, to load 6,000 categories we had something that took about 2 to 3 seconds on a cache miss. And 2 to 3 seconds, for some things that's okay, but it's really not fast. Then we switched to Elasticsearch, and the time to get exactly the same data from Elasticsearch was under 200 milliseconds. Actually, a good part of that time was just transferring the data. So it's faster; 10 times faster in this case. But it's not only about providing faster user experiences. We had another challenge in this project where we had to export a CSV file that had 32,000 entries. Now, when you have entities, you can create a view, and with Views you can create a CSV file. But Views really does not work with 32,000 entries. 
So what we did is we created that CSV file based on Elasticsearch, and we were actually able to do it without having to do anything about timeouts; it actually just worked. And it's not just about being faster; it's about being so much faster that you can do things that would otherwise not be possible. This is an example where we took data from somewhere else, and instead of importing it and creating entities for everything, we just took that data and put it straight into Elasticsearch, in the format that we wanted. This really gives us a lot of performance. And when we have all the data in Elasticsearch, what we can do is simply install Kibana. Kibana is a data visualization tool that lets you build graphs and visualizations, and lets you combine those in dashboards that will update in real time. It's a really amazing tool; it would probably need a dedicated presentation. But it's something that we are just putting in the hands of our customer, and what they can do with it is really dig into the data and understand the hundreds of thousands of products that they have and how they are structured, understand the traffic, and understand the correlations there. And what's great about using Elasticsearch is that when you have a structure in Elasticsearch that makes sense to the business, not only to how Drupal wants to structure it, then you can simply enable Kibana for each of these projects and give a tool to your users, to your content editors, to really understand all the content and all the data that they have. And trust me, whenever you have large enough data sets, there's always a story and always great insight that can be learned from them. Now we are actually getting to the "beyond" part of my presentation. In most of the examples that I showed you, it was the typical use case where you have Drupal, and Drupal sends data to Elasticsearch, and then Elasticsearch is searched, and when you query it, 
it returns data. And yeah, that works, but there's really nothing that says that these two operations need to happen this way. There's an interesting trend that we've been seeing on a lot of sites, especially in the publishing sector, where there is one CMS that pushes data to Elasticsearch, and then there's another instance, a frontend, that actually gets that data; and these are not the same system. Actually, they don't even need to be the same technology. The frontend could be a small satellite Symfony application, maybe it could be Django (we've actually seen this), it could be Node.js. And I think that when we think of decoupled Drupal, we always think of Drupal with a REST web service that talks to some kind of JavaScript library. But that doesn't need to be the case, and actually, when you put Elasticsearch in the middle, it has a lot of advantages. First, you really have some actual decoupling: you don't need any Drupal specific libraries in your frontend, which means that you can swap the frontend out for something else very easily, and if you want to, you can swap out the backend very easily as well. And Elasticsearch there in the middle provides a system that is really, really fast and that gives you a huge amount of flexibility when it comes to querying, not just for searching but for getting any kind of data, and really sorting that data by taking many parameters into consideration. But why stop there? I mean, Elasticsearch can get data from websites and CMSs, but hey, if you want to go the whole Internet of Things way, you can have your devices, your toaster, talk to Elasticsearch and get data from Elasticsearch. To include a couple of buzzwords here: you can have your Amazon Echo getting data from Elasticsearch, or even include that in a Facebook bot. This kind of stuff, you know, that's the future, and I think that Elasticsearch might be a better backend for these kinds of needs than the 
So today we are good at building websites, and I know, we build web applications, we build web platforms and all of that, but really it's all stuff that opens in a browser. There's something really interesting at the intersection between websites and search, and that's what we're doing at the moment. But we shouldn't forget that there's a whole lot more beyond search, well, beyond the parts that we integrate into a site. And I think it's really important to focus on that part too, because it means that when websites become less relevant, we'll still have something important and relevant to do.

So, in summary: I think that search is taking an increasingly central role in a lot of systems, a lot of projects, a lot of websites. And if search matters to your project, it is the details that matter: you need the right adjustments in the right places to get a system that creates a simple but powerful search experience. Elasticsearch has a really expressive API, and the challenge is not learning the API; everybody agrees that part is actually really easy. The challenge comes with learning all the possibilities that you have with this API, and how you can combine those possibilities to really create powerful search experiences.

And finally, last but not least: Elasticsearch is really fun to work with. This is the most consistent feedback I've gotten from everybody I've talked to who has worked with Elasticsearch. And that means that at the end of the day, when I come home after working on an Elasticsearch project, I'm happy. That's a whole new dimension of value generation compared to, you know, my salary getting paid and our customers getting rich. To me, this is really what motivates me to come here and talk to you about this way of doing things. I know it's very different from what we've been taught, from the Drupal way of doing things.
But it's something that, I think, makes a lot of sense, and it's really fun to work with, and that's the reason why I think you should definitely try it out.

I'd like to thank all the people who have shared their projects, their experiences, their challenges, their bugs, their frustrations, and their bug fixes with me. I learned a lot from that, and I'm really looking forward to hearing about your projects, about your challenges, and of course about your bug fixes too. I hope you learned something today. Thank you. Any questions? Yes, please come to the microphone.

Q: The big question is how Elasticsearch works with facets. If you want to do extra filtering on the search results, does it work with facets, or how do you propose doing these things?

A: Pretty much all of the examples I showed you had custom code for the user interface; maybe that was Angular code, sometimes it was just PHP code. But a facet is really two things: first you provide the data to display the facet, and then, when you click on it, you send a parameter that sets a filter. The bool query I showed earlier, the one that lets you specify multiple criteria, is where you add the filters that say: I'm searching for this keyword, and it should be in this category, and it should be in this date range, and so on. So you take the value that comes from selecting the facet and feed it into the query.

Q: But is it integrated with the Facets module?

A: No, it's not, so you have to do custom coding there, yes.

Q: Okay, thanks. Hi, I was wondering if there's an ongoing effort or any plans to integrate your Elasticsearch Helper module with the other D8 Elasticsearch Connector module, which not only provides Search API pluggability but also, I think, Views integration and such. How is that going?

A: There are no plans at the moment. I think the approaches are really different.
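Going back to the facets answer for a moment: the two parts described there, data to display plus a filter set on click, map directly onto a terms aggregation and a bool filter clause. A sketch under those assumptions (the `body` and `category` field names are hypothetical):

```python
def faceted_search(keywords, selected_categories=()):
    """A facet in two parts: a terms aggregation supplies the facet's
    values and counts, and each clicked value becomes a filter clause."""
    bool_query = {"must": [{"match": {"body": keywords}}]}
    # Each selected facet value is fed back into the query as a filter.
    if selected_categories:
        bool_query["filter"] = [
            {"term": {"category": c}} for c in selected_categories
        ]
    return {
        "query": {"bool": bool_query},
        "aggs": {
            # This aggregation provides the data to render the facet widget.
            "categories": {"terms": {"field": "category", "size": 20}}
        },
    }
```

The UI layer (Angular, PHP, whatever) just turns a clicked facet value into the `selected_categories` parameter on the next request.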
For example, the only configuration Elasticsearch Helper has is the server name and any authentication, so I'm not sure there's really much to be gained from integrating them. The only thing Elasticsearch Helper does is put all the things that you need into one plugin, instead of spreading them across hooks in different places, so it acts on a very different level from Search API, for example.

Q: So the Views integration you talked about, is that not in the Elasticsearch Helper module, or are you going to contribute that back?

A: At the moment it's in project-specific code, and we need to refactor that a little bit.

Q: Thanks. My question is simple: how about counting in Elasticsearch? Lucene had problems with it; I didn't try Solr, but I have a problem similar to yours on a German site: groups of results should be counted, the counts should be displayed, and so on.

A: Elasticsearch doesn't do very exact things when it comes to counting, but at the same time, I have to mention that I haven't seen any issues even with really large data sets. It can be imprecise once you get into millions of entries, especially if you use it in a distributed fashion, where results from different shards are combined. But Elasticsearch can actually tell you when there is some error in the counts, and with the kinds of data sets that we see in Drupal projects, I've never seen anything that was really problematic.

Q: I don't have a large amount of data, but I expect it should work with Elasticsearch, because that's why I want to switch to Elasticsearch. I have a solution with Search API at the moment, but the problem is very complex, and the search needs to be very, very complex.
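To make the counting caveat above concrete: in a distributed terms aggregation, each shard returns only its own top terms before the results are merged, which is where the imprecision comes from. Elasticsearch lets you both widen the per-shard candidate list (`shard_size`) and report the worst-case error (`show_term_doc_count_error`); those parameters are real, while the `category` field and aggregation name here are hypothetical.

```python
def category_counts(field="category", size=10):
    """Terms aggregation tuned for more accurate counts on a multi-shard
    index: a larger shard_size makes each shard return more candidate
    terms before merging, and show_term_doc_count_error reports the
    worst-case error on each returned count."""
    return {
        "size": 0,  # we only want the aggregation, not the hits
        "aggs": {
            "by_category": {
                "terms": {
                    "field": field,
                    "size": size,
                    "shard_size": size * 5,
                    "show_term_doc_count_error": True,
                }
            }
        },
    }
```

On a single-shard index, which is common for Drupal-sized data sets, the counts are exact anyway.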
That's why Elasticsearch looks very interesting to me, like the right solution.

A: Okay, great. Thanks.

Q: Thanks. One comment that maybe people should know: Elasticsearch and Solr actually run on the same search library, Lucene. So if you've been using Solr and you want to use Elasticsearch, you can apply a lot of that knowledge, because the text processing functionality is actually the same; it's just whether you configure it through the API or through XML configuration files. Though maybe configuring it through the API is nicer sometimes.

A: Well, actually, I think there are a lot of similarities in terms of how stemming works and so on, but one of the big differences is that for Solr you specify everything through configuration files, whereas in Elasticsearch that happens through the API, very much like creating a new table in MySQL: it is a query that creates the table. So the way you submit that configuration is very different. Also, most of the fine adjustments that were needed were in specifying custom analyzers and custom languages, and Elasticsearch provides a lot of functionality on top that makes that a lot easier. So what you learned in the past will be helpful, but you probably won't need all of it.

Q: Sure. I just had a question in terms of these implementations: do you basically proxy the queries through Drupal, or how do you access Elasticsearch? Since, like Solr, I assume it doesn't really have any security built in, so if you allow people access to it, they could just write their own data to it.

A: Yes. The first thing that comes to mind is: oh, Elasticsearch has a REST API, let's have JavaScript talk directly to it. Don't do that. Elasticsearch lets you do everything through the API, including shutting down your cluster, so you always want to make sure that all your requests go through some kind of proxy that checks user permissions, authentication, and so on.
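A minimal sketch of that proxy idea: after authenticating the user, forward only a small whitelist of read-only search requests and reject everything else, so that no write, delete, or cluster-administration request can ever reach Elasticsearch from the outside. The paths and index name here are hypothetical; the gatekeeper pattern is the point.

```python
# Requests the proxy will forward to Elasticsearch; everything else,
# including writes and cluster administration, is rejected.
# (Hypothetical index name "articles".)
ALLOWED_REQUESTS = {
    ("GET", "/articles/_search"),
    ("POST", "/articles/_search"),
}

def is_allowed(method: str, path: str) -> bool:
    """Gatekeeper check to run after authentication, before
    forwarding a request to the Elasticsearch cluster."""
    return (method.upper(), path) in ALLOWED_REQUESTS
```

In practice this check would live in whatever sits in front of the cluster, a Drupal controller, a Symfony route, or a Node.js middleware, together with the usual permission checks.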
So yes, that's a very important point. Thank you.

Q: Thanks for your presentation, I'm excited to try it out. Do you have examples of the types of things you've shown here available somewhere, so we can try them out? Or is it just Elasticsearch Helper, and that's it?

A: Well, the whole idea of Elasticsearch Helper is that it provides something really minimal, and the documentation that you should be reading is not the Elasticsearch Helper documentation but the Elasticsearch documentation. By the way, the people at Elastic have been very nice and sent me a couple of these books. This is "Elasticsearch: The Definitive Guide", and you can also find it on the website, so if you don't get a book, or if you don't have space in your suitcase, you can still read it online. The documentation is actually split into two parts. One is the reference, which tells you all the details: the names of the parameters and what values they take. And then there's the guide, which tells you how the various things fit together. It gives you exactly that kind of information: how do you build date aggregations, how should you index things if you want to build an autocomplete, that kind of thing.

Do we have one more question? Okay. If you are really interested in reading the book, please come up, we have a couple. We also have a bunch of Elastic stickers, if you want to put them on your laptop and be very proud. Help yourself!