Hello and welcome. My name is Shannon Kemp and I'm the Chief Digital Manager of DataVersity. We'd like to thank you for joining today's DataVersity Webinar, Ways to Optimize Your Business and Drive Impactful Results through Spatial Analytics, sponsored by Altrix. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the Webinar. For questions, we'll be collecting them via the Q&A in the bottom right-hand corner of your screen. If you'd like to tweet, we encourage you to share via Twitter using the hashtag #DataVersity. And if you'd like to chat with us or with each other, we certainly encourage you to do so; just click the chat icon in the upper right-hand corner for that feature. As always, we will send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the Webinar. Now let me introduce our speakers for today, Lisa Aguilera and Jessie Cho. Lisa is a Senior Product Marketing Manager at Altrix. She is passionate about the analytics space and about showing how innovative analytic technology can help analysts move past mundane data tasks, elevate their skills and expertise, and deliver increasingly sophisticated insights. Jessie is a Solutions Consultant at Altrix. Prior to joining the Altrix team, Jessie was a business analyst and an Altrix user for over two years. She understands firsthand the needs of business analysts as well as how to maximize Altrix to drive faster business insights. In her current role, Jessie leverages her unique experience to help organizations realize the benefits of self-service data analytics. And with that, let me turn it over to Lisa to get us started. Thank you so much, Shannon. And welcome, everyone. I want to say thank you very much for taking time out of your very busy day to join us.
I do hope that you will be able to pick up some useful bits of information from the materials that we present today. Now, before we get kicked off on our agenda, I just wanted to do a quick informal ask of how many of you on this call are actually performing spatial analytics or using some of the location information in your data. You can just go ahead and pop it in the Q&A; it'll help guide our conversation today as well. So to level-set, we're going to spend a quick moment explaining what spatial analytics is. Then I'm going to dive into four common use cases of spatial analytics to help you think about other ways you could be using location-based information. Then we're going to go into four case studies, real-world examples of the unique and varied ways that analysts like yourselves may be doing spatial analytics, deriving location data and blending it with their transactional point-of-sale data in order to get new insights. After which, I'll pass this on to Jessie for a quick demonstration, and then we'll get to the Q&A. So thank you all for participating in the chat. I see a couple of you are actually doing spatial analytics and a couple of you are not, so I hope that we'll cover both sides of the house in this conversation. Now, for those of you who may not know, there is actually a secret underlying piece of information in your data sets: the fact that 70% to 80% of all data has some type of spatial component to it. And some of this location information is really there to give relevance to the data sets that you're using. If you're not using this data to understand the where of things when you're performing your analytics, this is going to be a great session for you to understand why you'd want to tap into that. Tapping into spatial analytics is more than just mapping. It's more than creating drive-time analysis. It's more than what you would typically see.
It's really about helping organizations drive efficiencies in their offerings and their operations, and helping enhance profitability within your organization as an analyst. Now, some seem to think that you have to be an expert in spatial analytics, or you have to know mapping, or you have to have special mapping tools. What we're going to talk through today, and what we're going to showcase in some of these cases, is that there are a lot of analysts like yourself at other organizations, which you'll hear more about today, who don't have a spatial background and don't have big specialized tools, and who have still been able to tap into the location information within their data sets to derive creative, deeper insights that have driven some really impactful bottom-line benefits. Now, when you look at spatial analytics for location-based data, you can use it to do things like customize location services or inventory based on key customer habits within your organization. You can use it to help improve store service or asset location strategies. You can also use it to ensure availability and improve customer experiences, which will help drive down latencies or gaps in services, and to drive efficiencies in marketing and sales programs or offerings within your organization. So let's take a quick look at our first use case. This one is a pretty basic one that can impact many organizations. For those of you on the call, how many of you actually do drive-time analysis right now? And I'm not talking about point-to-point location analysis; I'm really talking about drive time, taking into consideration routes and traffic and everything else as part of your analysis. You can go ahead and informally type in. Great, I'm seeing a few yeses. Excellent. Wonderful. So for those of you who are already doing this, you understand the value it provides.
For those of you I see here, a few of you are saying no, maybe not yet; maybe you should be looking into it. So let's talk a little bit about this use case. Drive-time analysis is really important in helping determine the physical location of a new site or a new building and in evaluating the competitive mix in your area. It's also extremely important if you're working in an organization that uses real estate as part of a franchise model or something along those lines. Now, for those of you who went through the housing bubble in the last recession, I'm sure you can attest to the fact that real estate is an expensive and risky proposition. If you are in a hyper-competitive industry, a franchise-type industry, or a chain-type industry, location, as you know, is key to the success of those organizations. A good location can be competitive to acquire. It can also be quite time-sensitive to acquire, because it can be very difficult to find the right commercial locations at the right price in the right areas before the competition comes in and sweeps them up. Now, to offset some of the real estate or location risk, location analytics must balance ROI against demographic information, area competition, operational costs, and changes to things like roads and zoning. So, as I mentioned earlier, we're going to use a bit of an example here. Let's say you're an analyst working in a hyper-competitive market at, say, a coffee chain. A lot of these variables take on an even higher level of importance. Using our coffee chain example, as an analyst working in an organization like this, if you were able to visualize and understand location selection risk, you could use spatial analytics to understand the interplay of key variables and help make suggestions to refine your location selection strategy as more of the data becomes available and you become more comfortable with it.
Now, one key way of determining a new location would be using drive-time analysis, like I said earlier. This is important because you may have a location that looks physically close based on a straight line, but road by road, it may actually be further than you or your organization would prefer. It may not even be tapping into the right cluster of demographics that you're trying to target within that drive time. That's why it's really important to use drive-time analysis together with your user and demographic data. You could also look at it in terms of competing options. Again, if you were an analyst at a coffee chain organization, you would have to take into consideration things like restaurants in the surrounding area, other cafes, or even other existing franchise locations, and ensure that they're not situated too closely and that the customer base in that region isn't going to get cannibalized. What kind of data would you use when you build this type of analysis? You would want to pull in things like your customer loyalty program data and sales by store. If you were that analyst at a coffee chain, you would add parameters for competing businesses as well. You would then apply drive-time proximity to every geocoded customer address and location. Then you would look at the targeted customers segmented within a specific drive time from two potential locations. Finally, you would choose the location with the most profitable potential. Should you not have rich enough customer loyalty or point-of-sale data, you could go further and augment your data using demographic and firmographic information, bringing in something from Dun & Bradstreet, Experian, the Census, et cetera. What type of analysis should you be looking to build when you're trying to do drive-time analysis? You really want to look at drive time.
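To make the site-scoring step concrete, here is a minimal Python sketch, not taken from the talk: it scores two candidate sites by the total spend of geocoded customers reachable within a radius. Every coordinate, spend figure, and the 5 km threshold are illustrative assumptions, and the straight-line haversine distance is only a stand-in; a real drive-time analysis would query a routing engine (OSRM, Google Maps, etc.) for road-network travel times.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km. This is a straight-line proxy only;
    true drive-time analysis would call a routing service instead."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def score_site(site, customers, max_km=5.0):
    """Sum annual spend of geocoded customers within reach of a candidate site."""
    return sum(c["annual_spend"] for c in customers
               if haversine_km(site[0], site[1], c["lat"], c["lon"]) <= max_km)

# Hypothetical geocoded loyalty-program customers
customers = [
    {"lat": 37.78, "lon": -122.41, "annual_spend": 600},
    {"lat": 37.80, "lon": -122.27, "annual_spend": 450},
    {"lat": 37.77, "lon": -122.42, "annual_spend": 300},
]
site_a, site_b = (37.78, -122.42), (37.80, -122.26)
best = max([site_a, site_b], key=lambda s: score_site(s, customers))
```

The same loop extends naturally to subtracting a penalty for nearby competitors or existing franchise locations, which is the cannibalization check described above.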
You want to look at cluster maps of users within a given area, plot maps of competing offers, and potential customers who match key demographics within a certain drive time of the organization-specific offering that you're bringing to market. That's a bit of an overview on drive time. Let's get into the next use case that location analytics can impact. This one is really about mitigating risk and exposure, as well as adapting to disasters and being prepared within certain geographies. In this case, we're going to use a basic example of an insurance company. If you're an analyst at an insurance company, you really need to be on top of your game when it comes to setting premiums and mitigating exposure. Spatial analytics is extremely valuable for this, particularly in regard to disaster claims. When performing this type of analysis, you would need a couple of data sets. You would probably pull in your claim data, your customer data, your policyholder data, flood, weather, or seismic data depending on the risk exposure you're looking to analyze, and something like property value. After collecting all of this data, you would then create various analyses. For example, if you are an analyst at an insurance company, you probably already know this: you would want to understand your company's risk exposure regarding, let's say, earthquake policies. I'm from California, so this is a big thing for us out here. How much risk would it carry to cover certain households within a certain area under earthquake policies? You would do this by geocoding the addresses from policyholders' policies and mapping them, and then you would overlay a map of seismic information that denotes fault lines and maximum probable magnitudes.
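As a minimal illustration of that overlay, here is a Python sketch, not from the talk, that buckets geocoded policyholders by proximity to the nearest fault segment and its maximum probable magnitude. The fault coordinates, magnitudes, distance thresholds, and policyholder locations are all invented for illustration; real analysis would use published seismic hazard layers.

```python
import math

# Hypothetical fault segments: (lat, lon, max_probable_magnitude)
faults = [(37.7, -122.5, 7.9), (34.1, -118.1, 7.3)]

def km(lat1, lon1, lat2, lon2):
    """Equirectangular distance approximation, adequate at city scale."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371.0 * math.hypot(x, y)

def risk_bucket(lat, lon):
    """Tag a geocoded policyholder by distance to the nearest fault
    and that fault's maximum probable magnitude (thresholds assumed)."""
    d, mag = min((km(lat, lon, f[0], f[1]), f[2]) for f in faults)
    if d < 20 and mag >= 7.5:
        return "high"
    return "elevated" if d < 50 else "standard"

policyholders = [(37.75, -122.45), (36.0, -120.0), (34.0, -118.2)]
exposure = {b: sum(1 for p in policyholders if risk_bucket(*p) == b)
            for b in ("high", "elevated", "standard")}
```

The resulting counts per bucket are exactly the cluster report described next: how many households sit in each exposure band, which feeds premium setting and response-center staffing.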
After you've built out the seismic hazard map, you would then take the reassembled data and use it to understand where you have certain clusters, and you'd be able to pull that reporting information back to your decision makers to help them understand their risk exposure in certain areas. The other thing that you would be able to do with this type of data, if you are an analyst at an insurance company, for example, is set up response centers after disasters. Again, here in California, earthquakes are a big thing; they're real. So how many households do you have within certain areas that would need a fast response from your insurance claims company? Do you have enough staff in place to support the call zones in the areas you need to reach quickly, so that you have not only the right response time but the right number of support people available within a certain area? You could take all of that information, look at how your staff are distributed within your organization, blend all that data together, and pull reports for your business decision makers that will really help them understand their risk exposure as well as support within a given area. You could also take all this data, combine it with regional claims information for the high-risk areas you're facing in terms of claims management, and be much more proactive and effective in determining premiums for your organization. So there are multiple ways that you can look at this. Similarly, you could take a different direction: real estate investors can use hazard data. If you're working at a big real estate firm or a big development firm, you could take this data, blend it with hazard data such as floodplain maps, and evaluate which properties are valued appropriately, if your organization is in the habit of purchasing land to develop on.
You could also couple this information with predictive analytics and data sets such as listings and sales prices, school enrollment if you're looking at home development, crime, and more. And you could forecast trends in property values in different areas with relative ease by tapping into a lot of this information and pulling it together. So, on to the third use case. This one is a bit surprising. There are a lot of organizations that do tap into customer profiling in order to influence sales and marketing, but there are a lot of organizations that don't: they don't take location information as a factor, as a data point, in determining what they're going to do from a customer marketing or sales perspective. When using location analytics for this type of activity, you're going to hyper-target sales and marketing activities to microclimates within a region, which helps them resonate a lot better. You'd want to find out who your most loyal customers are within a given region. You'd want to find out where the best ROI for marketing spend is, where it should be targeted, and to whom. And then, who's most likely to purchase a certain product within a certain region? As we know, people's purchasing patterns and preferences actually differ quite a lot, not just state to state or city to city, but also from county to county. So what kind of data would you need to create this type of analysis? You'd want to tap into a lot of your customer data, your sales by location, and your responsiveness by location. And then, again, you'd best overlay demographic and firmographic information on top of that, from Dun & Bradstreet, Experian, and the Census, to be able to paint a full 360-degree picture of not only your existing customers but potential customers within a given area.
Now, typically customer information is gathered at the point of sale or the time of transaction, or via online surveys where customers are required to enter their receipt transaction in exchange for a prize or a drawing of some sort. Or you may get information from loyalty card data. But if you're faced with the challenge that the customer data you need is not part of your company's originally collected data set, what should you be doing? Imagine you are an analyst at a home renovation and retail chain that wants to optimize marketing spend. Your company decides it wants to target homeowners for renovation-related campaigns and renters for home accessory promotions, but your current customer data set is unable to distinguish between renters and owners. So how can you really get that data and tease it out? Well, the secret is fuzzy matching. For those of you who haven't heard of fuzzy matching, Jessie will talk through it a little in the demo. Fuzzy matching allows you to match data sets that don't have common shared identifiers. Using a waterfall matching process, a user can work through possible data associations and refine the data until you have the highest-probability matches and the best data possible. This customer data can then be further enriched by overlaying it with third-party data, something like Experian household data, where you would get over 300 different variable sets for demographic or psychographic attributes. Some of the analytics you can perform once you've collected all this data: you could overlay drive-time information with it, and you could see who should get which types of offers, coupons, or promotions based on their proximity to a certain store location. You could also determine purchase preferences and tendencies and use them to drive suggestions regarding inventory optimization and the preferences of anyone within a given region.
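The waterfall idea, trying progressively looser matching passes until a record lands, can be sketched in a few lines of Python. This is an illustrative toy using the standard library's string similarity, not Altrix's actual matching engine; the address strings and the 0.85 threshold are assumptions.

```python
from difflib import SequenceMatcher

def normalize(s):
    """Light normalization: lowercase, drop punctuation and spaces."""
    return "".join(ch for ch in s.lower() if ch.isalnum())

def waterfall_match(record, candidates, fuzzy_threshold=0.85):
    """Waterfall matching: exact pass, then normalized pass, then fuzzy
    similarity. Returns (match, tier) or (None, 'unmatched')."""
    for c in candidates:                      # tier 1: exact
        if record == c:
            return c, "exact"
    for c in candidates:                      # tier 2: normalized
        if normalize(record) == normalize(c):
            return c, "normalized"
    best, score = None, 0.0                   # tier 3: fuzzy similarity
    for c in candidates:
        s = SequenceMatcher(None, normalize(record), normalize(c)).ratio()
        if s > score:
            best, score = c, s
    if score >= fuzzy_threshold:
        return best, "fuzzy"
    return None, "unmatched"

# Hypothetical homeowner records to match customer addresses against
owners = ["123 Main St, Springfield", "45 Oak Ave, Portland"]
match, tier = waterfall_match("123 Main Street, Springfield", owners)
```

Here "123 Main Street" falls through the exact and normalized tiers but scores high enough on the fuzzy tier to match "123 Main St", which is the kind of owner-versus-renter linkage described above.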
So, for example, barbecues: maybe you carry barbecues year-round in most of your Florida stores, but let's say you're in Nebraska; barbecues in the wintertime, maybe not such a great idea. Customizing based on purchasing preferences within a given location is key in terms of operational efficiency. Now, you could also use it to determine potential purchasing patterns, using that Experian household data and overlaying it with your customer data to find potential customers for your store and customize and determine your offering based on their buying habits. It's another way to drive insights using location. So our last use case, before we get into some of the real-life case study scenarios and some of the insights analysts like yourselves have been able to build, is about how spatial analytics, or location analytics, can be tied to the placement of physical assets such as property. Not necessarily an actual store location, but physical property, things like service coverage areas. If you were an analyst working in the telco business, some data you may want to pull in to perform your analysis would be customer data, tower coverage within an area, data bandwidth coverage within an area, demographic information again, and population density. You would use all of this data, and you could create a couple of different analyses: customer maps of users within a given area, demographics of customers within a given area, heat mapping of coverage, and drive time to specific locations. So again, if you're an analyst working in a telco company, for example, you could be tasked with how to optimally place cell phone towers to reduce dropped calls.
Your company may collect signal-strength data, and you would then take that data, map the signal changes, and see where your coverage is lacking. You could then combine that mapped data with population density, growth, and customer locations, and use it to find ideal cell tower placements, or even usage. You could take that a little further: you could adjust dropped-call tolerances and iterate on this model, and you could overlay it with factors such as negotiated lease rates for placing cell towers on buildings. The other thing you could do, and an organization that we know of has done this for a telco in the New York area, is look at the demographic and customer information within a given region. They looked at where they had the best cell coverage as well as the best data coverage to understand what types of coverage offerings to make within each region. They used the analysis to decide that in areas where the demographic skewed much more heavily younger, users actually end up using data more and make fewer actual calls, so why increase call coverage when what they needed was more data coverage? They were able to really customize and optimize their service offering that way, using location data, coverage data, customer data, and demographic data. Now let's get into some actual spatial analytics use cases. If you are an analyst who has done spatial or location analytics, you may pick up some creative ideas and other ways to use and think about your location information when you're building your analytic insights. And for those of you on the call who don't have a spatial analytics background, these are analysts just like yourself.
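The coverage-gap step just described, finding the weak-signal areas that affect the most people, can be sketched as a simple scoring pass over grid cells. This is an illustrative toy, not the telco's actual method: the cell IDs, signal readings, populations, and the -100 dBm threshold are all invented.

```python
# Hypothetical grid cells with measured signal strength (dBm) and population
cells = [
    {"id": "A1", "signal_dbm": -70, "population": 1200},
    {"id": "A2", "signal_dbm": -102, "population": 5400},
    {"id": "B1", "signal_dbm": -95, "population": 300},
    {"id": "B2", "signal_dbm": -110, "population": 2100},
]

WEAK_DBM = -100  # assumed threshold below which calls tend to drop

def coverage_gaps(cells, threshold=WEAK_DBM):
    """Rank weak-signal cells by the population they affect; the top of
    this list is a candidate location for a new tower."""
    weak = [c for c in cells if c["signal_dbm"] < threshold]
    return sorted(weak, key=lambda c: c["population"], reverse=True)

gaps = coverage_gaps(cells)
```

Overlaying growth projections or lease rates, as mentioned above, would just add more terms to the sort key.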
The majority of the use cases that I'm going to be speaking to don't come from analysts who have spatial analytics or location intelligence backgrounds. They were able to produce these insights through a little bit of creativity and foresight in how they tackled their problems. So the first case I want to talk about today is Cardinal Health. This is a very interesting use case. The analysts at Cardinal Health did not have a spatial analytics background whatsoever. One of the ways they were using it was in the nuclear pharmacy division of Cardinal Health, where they were looking to improve the doctor-patient relationship for PET scans for oncology. Now, for those of you on the call who are lucky enough not to know what a PET scan is, it is a powerful imaging technique typically used to diagnose and treat many diseases, most notably cancer, and more than 1.5 million PET scans are conducted each year. The problem they were facing wasn't the sheer number of PET scans being conducted; it was the little-known fact that there is actually a huge logistics challenge with PET scans. The reason is that F18 is the nuclear medicine needed to perform these scans, and you have to have just the right amount of F18 strategically placed around the country to make it to hospitals in time for PET scans, because the F18 drug only has a half-life of six hours. So Cardinal Health had the challenge of figuring out how to support any one of 300 million people, seeing any one of tens of thousands of physicians at any one of tens of thousands of hospitals or clinics in the country, and ensure that this F18, with such a short shelf life, was readily available for their PET scans when they were there with their doctors. How could they manufacture enough of the drug in a four-hour period and have it delivered on time without inconveniencing patients?
The other thing they were trying to figure out was how to profitably price the drug, knowing that some of the F18 might not make it to its location before its shelf life expired and would be useless once it got there. What they did was use all of the data sources they could get their hands on: all of their Excel data, all of their CRM data, location data, hospital data, patient data, appointment data. They used this to analyze time and distance for delivery, and they blended that drive-time data with route information and the shelf-life data of the product to come up with optimized routes and drive times for securing and delivering the drug. The power of tapping into not just their traditional data sources but also the spatial and location data as part of the process resulted in huge savings at Cardinal Health. The value of this analytic output drove visibility into operational gaps they had, increased efficiency by 100x, and their first project saved almost $1.1 million in logistics fees. So I hope this is an encouraging story for those of you who may not have tapped into the location information in your data, to really drive home how valuable it can be. Now, another use case that we have here is Delhaize. If you are from the East Coast, you're probably familiar with the Delhaize brands, but for those of you who, like myself, live on the West Coast, you may not have heard of them before. They are a food supermarket operator. They have over 1,500 supermarkets in the eastern United States, and they operate under several names, like the Food Lion, Hannaford, and Bottom Dollar Food brands. They're also located internationally, in Europe as well.
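The shelf-life constraint that drives the whole Cardinal Health problem is just exponential decay. Here is a small Python sketch of it, using the six-hour half-life quoted in the talk; the transit times and the 50% minimum-dose cutoff are illustrative assumptions, not Cardinal Health's actual parameters.

```python
# F18 decays exponentially; per the talk, its half-life is six hours.
HALF_LIFE_H = 6.0

def remaining_fraction(transit_hours):
    """Fraction of F18 activity left after transit (exponential decay)."""
    return 0.5 ** (transit_hours / HALF_LIFE_H)

def deliverable(route_hours, min_fraction=0.5):
    """A route is viable if at least `min_fraction` of the dose survives;
    the 0.5 cutoff here is an assumed example, not a clinical figure."""
    return remaining_fraction(route_hours) >= min_fraction

print(deliverable(4.0), deliverable(9.0))  # prints: True False
```

Plugging drive times from a routing analysis into `remaining_fraction` is what turns "drive time" into "usable dose on arrival," which is the blend of route and shelf-life data described above.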
The analysts at Delhaize support the strategy, marketing, pricing, category management, merchandising, finance, and legal departments within Delhaize, and a lot of the analytics and insights they create use spatial or location data. They are a small team of analysts, and they have been able to create some pretty interesting things. One analysis where they use location data and operational data is in test-and-control initiatives they have taken on with Delhaize: really seeing what types of activities might be suitable for certain stores, and estimating the effectiveness of an initiative or change before rolling it out en masse. What they do is take information and match stores and households in the pre-period versus the period after the initiative. One of the initiatives they did was with their Food Lion brand. Food Lion had a flyer in a small local newspaper, and the organization wanted to evaluate the effectiveness of the flyer because they were spending about $1.5 million a week on it. The analysts at Delhaize established a test market, and they culled from that test list any locations with major things happening, like openings, closings, or remodels, or stores with atypically high seasonal volatility. They then used the location data to create tests of using the flyer versus not. The result was that where the flyers were working, the test showed a positive lift, and they figured out there were certain demographics and areas where the flyer was not working. The areas where they did see a positive lift showed a minimum of $1,400 per store per week in sales lift.
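The core arithmetic of a matched test-and-control read like this one is simple: compare average weekly sales in the flyer stores against the matched control stores. The sketch below is illustrative only; the store IDs and dollar figures are invented, not Delhaize's data.

```python
# Hypothetical matched stores: weekly sales in dollars
test_stores = {"T1": 101_400, "T2": 98_200, "T3": 105_900}      # got the flyer
control_stores = {"C1": 100_000, "C2": 97_000, "C3": 104_000}   # did not

def avg(d):
    """Mean of a dict's values."""
    return sum(d.values()) / len(d)

# Per-store weekly lift in dollars, and lift as a percentage of control
per_store_lift = avg(test_stores) - avg(control_stores)
lift_pct = per_store_lift / avg(control_stores) * 100
```

In practice the matching step (pairing each test store with a demographically and seasonally similar control store, and culling stores with openings, closings, or remodels) is what makes this difference attributable to the flyer rather than to store mix.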
Where the story was that it did not work, they stopped using the Sunday flyer and optimized their marketing spend by eliminating it. What they found was that it didn't work in urban areas, and eliminating it there also resulted in a lift, because they weren't inefficiently spending marketing dollars in areas where they didn't need to. Another test they're doing with location-based information is exploring omnichannel service efforts and offerings: deciding whether online grocery shopping will work for certain locations and provide a lift versus walking into physical locations. What they're doing there is taking their demographic information and customer information, and taking into consideration how far customers are coming from, to perform this test. Now, sorry, before I get too far into this, I didn't call out that at the bottom of each of the case slides I'm sharing with you is a link. On SlideShare is the full story of how they were able to perform these analytic use cases I'm talking about, and it dives into a bit more detail than I'm going into here. So if you are interested in hearing a little more about what Delhaize is doing, after the event you'll have this slide, and you can click on those links and find out more details. Now, the next use case we're going to look at is AAA, known for the roadside assistance they offer, along with the insurance, travel help, and DMV support available at some of their locations. I know I'm a big fan of their DMV support at some of their locations. The analysts at AAA actually use location data for a lot of different reasons. They've been able to tap into some of the spatial components of the data sets they work with day to day without having a team of spatial analytics experts pulling all of this in. What they do is use location data to help determine their clubs, what kinds of services they're going to offer at their clubs, and where to locate their clubs.
They also use it to help understand the demographics and purchase history of their members near each branch office and location, in order to find out how to support members and what offerings to introduce at key locations. They also use it for the physical alignment of services and the people who are there helping at each of these club locations: what are the peak hours for members coming into certain locations, and do they have enough people on staff, or too few, to support those high-traffic times and high-traffic locations? AAA uses a lot of their data, such as travel, ERS, auto buying, insurance, and vehicle repair data. They use income data, household counts, and other census data, and they've used it to build out a comprehensive view of their customers and potential customers within each area. What they've been able to do is really optimize the various service offerings of each membership location, make sure there's ease of access to them, make sure the drive-time distance is optimized, and, as I said earlier, make sure the services offered within each store area are optimized as well. To send out personalized communications to targeted marketing segments, AAA taps into all of their existing day-to-day operational data, plus the location information within that data, in order to drive deeper efficiencies. The last use case, and this one is my particular favorite: if you have time to check out one, I would highly suggest clicking on the slide for a bit more of a deep dive on the Cineplex story. For those of you who are not familiar with Cineplex, they are a leading entertainment company in Canada, and they actually operate one of the most modern and fully digitized motion picture theater circuits in the world. One of the objectives the analysts at Cineplex were faced with was identifying key audiences for new film releases to inform key marketing, sales, and other business decisions.
The problem Cineplex faces is that every week a new film comes out, and there are very few data points available prior to release to identify how successful or popular a movie is going to be in any given location. The analysts were tasked with forecasting demand for film products, because Cineplex wanted to improve the guest experience not only in the theater but also drive staffing decisions based on that demand: making sure they were efficiently staffed by predicting location-specific demand for the films playing any given weekend, and making sure their food and beverage marketing was accurately supported. As a further ask, the analysts at Cineplex were also tasked with helping to suggest personalized marketing efforts for upcoming releases as part of their customer loyalty program. The challenge they were looking at was how to determine whom to tell about which new releases without past purchase data available for the type of movie being offered, because they didn't have a lot of information to go off of. What they did use was their customer loyalty data. They combined it with film attribute data, genre and rating, and predicted the composition of the audience based on 13 age and gender segments. They then combined the results from the 13 segment models and created a forecasting model, consisting of multiple modules working together, to produce a forecast for each film, location, and business day. They tested this forecasting model's accuracy against a 52-week sample from the past, and they used market basket analysis with advance ticket purchases from loyalty members to identify companion films. For each member, they generated an interest score for new movies using consumer behavior data for companion films and other film attributes like actors, directors, and genres.
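To give a feel for what an interest score like that might look like, here is a deliberately simplified Python sketch in the spirit of the approach, not Cineplex's actual model: it blends a member's history with companion films and the overlap between a new film's attributes and the member's known likes. The weights, film names, and attribute sets are all invented for illustration.

```python
def interest_score(member, film, w_companion=0.6, w_attr=0.4):
    """Blend companion-film behavior with attribute affinity, both in 0..1.
    Weights are assumed for illustration, not tuned values."""
    # Strongest signal from any companion film the member has engaged with
    companion = max((member["seen"].get(c, 0.0) for c in film["companions"]),
                    default=0.0)
    # Share of the film's attributes (genres, actors, director) the member likes
    attrs = film["genres"] | film["talent"]
    overlap = len(attrs & member["likes"]) / max(len(attrs), 1)
    return w_companion * companion + w_attr * overlap

# Hypothetical loyalty member and upcoming release
member = {"seen": {"Space Saga 1": 0.9}, "likes": {"sci-fi", "Director X"}}
film = {"companions": ["Space Saga 1"], "genres": {"sci-fi"},
        "talent": {"Director X", "Actor Y"}}
score = interest_score(member, film)
```

Ranking members by a score like this is what lets the loyalty program decide whom to tell about which release, even with no past purchase data for the new title itself.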
With this, they were really able to address the request of their business leaders to optimize the guest experience at each location. Before I let Jessie show you how you can do this yourself, I just want to talk about the one thing all of these cases have in common, which is what a lot of you on the phone have in common today. Some of these analysts came to it without any location information or spatial analytics background; some came to it having done spatial analytics before. What all of these cases have in common is that they all used a self-service analytics platform that made it easy for them to take all of this location-based data, blend it together, and create the insights we talked about. One of the key ways they did that is they explored spatial and location data just like they would any other data set, using the platform to treat data as data, independent of whether it had a mapping or geolocation component to it. The problem is that most existing business intelligence and analytics tools today do display some spatial data on a map, but they leave the analysis between the points up to the user, so they're not as well-rounded as most people would like, and they can be challenging to use. Then there's the other side of the spectrum, where you have more sophisticated but niche tools for mapping and spatial analytics; only a select number of experts can use these tools, and they can be difficult to explore if you've never had exposure to them in the past.
You need to make sure that whatever you're using to perform your location intelligence is not narrow in scope or expensive, doesn't require you to work with a different tool set for your transactional data versus your location-based data, and doesn't require you to turn to somebody else for their analytic expertise. You want a true analytic tool that will let you access and use all the data you need for your insights, regardless of size or format, which is what those four use cases I just talked about did. You also want something that helps you geocode data quickly and easily, and that helps you perform inline visual analytics. Inline visual analytics is very different from what you normally get with Tableau: it's about helping you visually build out your analytic models so you can quickly iterate on them, quickly create assisted analytic models, and then output your production to a visual consumption platform, a map, or a PDF. The other thing is you want to make sure that whatever platform or tool you're using for your use cases lets you analyze data in new ways and easily share that information in the way it needs to be consumed by other users. For those of you who don't know Alteryx, it's easy to think of it as a repeatable workflow for accessing, profiling, prepping, and blending all of your data, and for performing spatial analytics, predictive analytics, and modeling on it. It's a comprehensive workbench that empowers analysts of all skill sets to access all the data within their organizations in a secure and governed way, so your IT department, data architects, and data managers feel safe with you connecting directly to the data you need in order to pull your insights.
It's a code-free way of liberating you from spending hours working on your data, and it helps you create and think of new ways of using your data to answer questions that just weren't answerable before, like the examples with Cineplex or Cardinal Health, empowering you to build nearly limitless analytic models and address a tremendous number of business questions that may come your way with the inline visual analytics that Alteryx offers as a platform. And just as Alteryx makes it easy to connect to and consume data, and is very agnostic as to where that data lives (in the cloud, in a traditional data warehouse, in Excel or Access format, it doesn't really matter, nor does size), Alteryx is equally agnostic in terms of sharing information and outputting data, because creating an analytic insight doesn't help if no one is there to actually consume it. Your insights don't make any impact on your organization if people aren't actually using them, so you need to be able to share them in a way your business decision makers want to consume them. I'm sure many of you can raise your hand and say you have to create ad hoc reports all the time, and Alteryx will make that very easy for you. Some of you may be creating reports that need to be repeated on a regular basis and are consumed through things like Tableau, Qlik, or Power BI; Alteryx allows you to quickly output all of your analytic information into the visual platforms your business decision makers want to consume information on, or it can write information to downstream systems. So with that, I'm going to pass this on to Jessie, who's going to show you a bit about Alteryx and how easy we make the analytic process from end to end.

Thank you, Lisa. Let me share my screen here. Can you all see my screen? Yes, you're coming through. Perfect. This is Alteryx Designer. It's a desktop software solution.
When you first open it up, you get this Getting Started screen, and we do have tutorials to get you familiar with the tool, so you can open any of those up and walk through how you would build out a workflow. This area we call the canvas; this is where you drag and drop your different tools. Down here is the results window; this is where you'll get any messages that occur while the workflow is running. To the left is the configuration window; this is where you'll configure each of the tools you drag onto the canvas. For example, if I bring in an Input Data tool, the configuration window on the left changes. Across the top are the tool categories. We have the Favorites tools; these are customizable, but most people start out with them. We have our In/Out tools, different Preparation tools, Join tools, Parse and Transform tools. There are In-Database tools as well, if you're processing a large dataset and want to utilize the processing speed of, for example, your SQL Server, do some manipulations and joins there, and then stream the results back into your computer's local working memory. We also have Reporting tools and then our Spatial tools; you can see there are icons for each that you can drag and drop onto the canvas. Then we have Predictive tools as well, and connectors that can output directly to Salesforce. We also have connectors to SharePoint. So that's a brief overview of our tools. Here I have a pre-built workflow, just for the sake of time. First we have our orders table, with customer ID, order ID, store number, product, unit price, discount, and quantity. We also have customer data with customer ID and their address, and then another input with our store information, with address as well as latitude and longitude. First we need to clean up our orders: we have some null values, which we can handle with the Data Cleansing tool here.
We calculate the total sales for each row and then summarize our data at the customer level. You can see down here in the results window we have customer ID, how many transactions they've placed with us, their total spend, and what store they shopped at. Then we join the customer data with our orders table using the Join tool, so now we know what location our customers are in. We have a Street Geocoder, which uses the TomTom data; we indicate the address fields to get a spatial object, which is essentially all of our customer points here. You can see in the Browse that all of our customers are mapped out around the Denver area. So we join these two data sets together, and then for the stores, because we already have latitude and longitude, we can create a point in Alteryx. Then we use the Non-Overlap Drivetime tool, where we're essentially creating a trade area based off of a 10-minute drive time from each store point. Here you can see we've created these trade areas based off of the 10-minute drive time for each of our stores. Then we're just renaming the fields here, so we have our store point and our store trade area. Next I want to see which of my customers are shopping within my trade areas and which are outside them. To do that I can use the Spatial Match tool, which gives me the customers that match: I have around 1,500 customers that were all within a 10-minute drive time of a store, and then I have these 800 or so customers that are outside that trade area. I want to dive deeper into those customers, so here I just have another Join tool to join back the spatial information: the customer point, the store trade area, and the store point. Then, because these customers are outside the store trade area, I wanted to calculate how far they are driving to the store they actually shop at. Here I can use the Distance tool and tell it which fields I want to calculate the distance between.
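For readers following along without Designer, the trade-area split from the Spatial Match step can be approximated in plain Python. This is only a rough stand-in for the drag-and-drop tools: it replaces the true 10-minute drive-time polygon with a straight-line radius, and the store and customer coordinates and the 5-mile cutoff below are invented for illustration.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def split_by_trade_area(customers, store, radius_miles):
    """Partition customers into those inside and outside a store's trade
    area, approximating the drive-time polygon with a simple radius."""
    inside, outside = [], []
    for cust in customers:
        d = haversine_miles(cust["lat"], cust["lon"], store["lat"], store["lon"])
        (inside if d <= radius_miles else outside).append(cust["id"])
    return inside, outside

# Hypothetical Denver-area points.
store = {"lat": 39.7392, "lon": -104.9903}
customers = [
    {"id": "c1", "lat": 39.75, "lon": -104.99},   # under a mile from the store
    {"id": "c2", "lat": 40.02, "lon": -105.27},   # roughly 24 miles away
]
print(split_by_trade_area(customers, store, 5.0))  # c1 inside, c2 outside
```

A real drive-time trade area depends on the road network and traffic, which is what the TomTom data behind the Alteryx tools provides; the radius here is just the simplest proxy.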
I chose the customer point and the store point, and then I wanted to output the drive time based off of peak hours; you can actually specify off-peak, night, or peak hours, so we have different options for the drive-time calculation there. You can also optimize your route based off of time or distance, and here in my units I chose miles. With Alteryx you're able to output data just as you input it: you can output to a flat file or to any database you have write access to. Alteryx respects the read and write access of your databases, so no one can access databases they don't already have access to. You can also summarize spatial objects. Here I'm grouping by the stores and taking the sum of the spend (how much customers bought there), the average drive time, and the average drive distance in miles, and combining my customer points, essentially to create a PDF report showing which customers are outside the trade area of each of the stores they shop at. So here's my PDF report. I can see that for store 100, these blue dots are my customers outside the 10-minute drive time, and the orange dots are my stores. I can see that some customers actually live closer to another store, but for some reason they drove out to store 100 to buy a product. Then I have some tables here showing what the revenue outside the 10-minute drive time was and what it was inside. We can see that total spend was a lot higher within the 10-minute drive time, and that those driving from outside were spending at least 20 minutes to get to the store. This report is broken out by store number, so I can see for each store how far customers are traveling to shop at that specific store. Maybe that's something I need to look into for those specific customers; I can ask why they drove to a farther store when there's actually a closer one to them. But I'm able to see that with this drive-time analysis. So that concludes the brief demo.
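The Summarize step in the demo, grouping by store and aggregating spend and distance, corresponds to an ordinary group-by. Here is a minimal stdlib sketch of that aggregation; the rows are made up, and the field names mirror the demo but are assumptions:

```python
from collections import defaultdict

def summarize_by_store(rows):
    """Group customer rows by store and compute total spend plus average
    drive distance, like the Summarize step in the demo workflow."""
    groups = defaultdict(list)
    for row in rows:
        groups[row["store"]].append(row)
    summary = {}
    for store, members in groups.items():
        summary[store] = {
            "total_spend": sum(r["spend"] for r in members),
            "avg_distance_mi": sum(r["distance_mi"] for r in members) / len(members),
        }
    return summary

# Hypothetical customer rows after the spatial matching and distance steps.
rows = [
    {"store": 100, "spend": 250.0, "distance_mi": 12.0},
    {"store": 100, "spend": 150.0, "distance_mi": 18.0},
    {"store": 200, "spend": 90.0, "distance_mi": 4.0},
]
print(summarize_by_store(rows))
```

In Designer the same result comes from configuring group-by and aggregation fields in the Summarize tool rather than writing any code, which is the point Lisa and Jessie are making.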
We have some time for some Q&A. Thank you so much for this great presentation. Just a reminder: if you have questions, go ahead and submit them in the Q&A in the bottom right-hand corner of the screen. And to answer the most commonly asked question, I will be sending a follow-up email by end of day Thursday with links to the recording and anything else requested throughout the webinar. So, what are some things to consider when looking at Alteryx for spatial analytics? Do you want to take that one?

Sure. There is a difference, and I'll give you the example of Delhaize, which actually uses a combination of tools together. Part of it comes down to being able to access the data you need securely and quickly, without coding or waiting for your IT or data analytics department to extract that data. We are extremely respectful of all of the data governance and read/write access policies that organizations have in place, but it is quite easy for you as an analyst to access your data. The other big difference is that Alteryx is a full analytics platform and workbench. Not only do you have a community, you can also perform the data blending and the data profiling, and one piece of feedback we hear is that certain traditional spatial tools start to choke when you're working with larger and larger amounts of data, or bringing in lots and lots of different data points. So we have many customers who use Alteryx as, if you will, their analytics engine, creating all their models and then outputting to things like Azure or MapInfo, or even Tableau for that matter. The other difference, as you saw with Jessie's demo here, is that Alteryx is a drag-and-drop environment. That's what we mean by inline visual analytics.
You get to do your full analytics workflow visually, without coding; you can share it quickly and easily; and you can upskill your individual analytics abilities. The data profiling you saw on Jessie's screen works exactly the same way for spatial: you just drop a tool on the canvas and you're doing spatial analytics. It's the same with predictive as well. We have plenty of training courses that will help you upskill your predictive capabilities, and you just can't get those with some of the more traditional spatial analytics tools.

Where can I learn more about the tool?

There are a lot of great resources for learning about the tools. I think I saw some questions coming in through chat about how you know when to augment your data with demographic data, and about getting a little more in depth with the spatial components and how to get them going. The Alteryx community, which Jessie is showing here, is a wonderful resource to go to. You don't have to be an Alteryx customer; you can download the free trial, along with everything you saw Jessie show, and there is plenty of free training you can sign up for. You can ask specific questions, like when to augment your data set, to the user group, and they'll answer your questions not just about Alteryx but about analytics in general. I highly recommend you check out the Alteryx community to learn about this. There's also the lunch-and-learn on spatial analytics, coming up this week on Thursday. Again, you don't have to be a customer; you can download the free trial and follow along to understand Alteryx and its spatial capabilities. Certainly a great set of resources there.

What processes do you have to determine the enrichment part of your flow?

Again, that's a great question. I think the best place to look is the Alteryx community, and the answer is going to depend on what type of analysis you're doing and what you're trying to augment.
I would very much recommend posting that specific question there. I think I also put a link in the chat window to a thread within the Alteryx community that addressed this specific question, so if you'd like to check the chat box, you can click on the link I put in there. Now, I certainly know the answer to this next one, since I've done several webinars with Tableau. The question is: is Tableau your product as well? Maybe you can talk to that, as well as how the two interact.

I'm going to let Jessie answer this question, because she comes from an analytics background and was sitting in your shoes not too long ago, performing a lot of insight work. Jessie, I'd like to hear your take on Tableau and Alteryx.

I was at a company where we had Alteryx Server and Tableau Server. I would build out a workflow process that updated operational dashboards on a daily basis, so that our operations team could look at the dashboard every morning and determine whether they needed to make any changes to scheduling for our part-time staff. Once I built out the workflow in Alteryx, I was able to schedule these workflows and have them automatically update Tableau data extracts. As you can see here, I have a Tableau data extract output, which would then automatically get updated onto Tableau Server. They work hand-in-hand really well, because Alteryx is really good at manipulating, blending, and joining all of the data sets. I was dealing with over 500 million rows of data, and doing a join on Tableau Server was somewhat difficult and clunky. So I used Alteryx to do my manipulations, joins, and calculations, because some calculations have to be done at different levels of detail. In Tableau, as I'm sure some of you are familiar with, you get that error saying you cannot mix aggregate and non-aggregate arguments at different levels of aggregation. I was able to do all those calculations within Alteryx and then push the results directly to Tableau.
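The level-of-detail problem Jessie describes, where a calculation mixes row-level values with a store-level aggregate, is often solved by pre-computing the coarser aggregate upstream and joining it back onto the detail rows, which is the kind of work an Alteryx workflow would do before the data reaches Tableau. A hypothetical sketch of that pattern, with invented field names:

```python
from collections import defaultdict

def add_store_share(rows):
    """Attach each row's share of its store's total sales: a calculation
    that mixes row-level and store-level detail, and is therefore easier
    to pre-compute upstream than inside a Tableau calculated field."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["store"]] += row["sales"]
    return [
        {**row, "store_share": row["sales"] / totals[row["store"]]}
        for row in rows
    ]

# Hypothetical detail rows, one per product per store.
rows = [
    {"store": "A", "product": "x", "sales": 30.0},
    {"store": "A", "product": "y", "sales": 70.0},
    {"store": "B", "product": "x", "sales": 50.0},
]
print(add_store_share(rows))
```

Once `store_share` arrives as an ordinary column, Tableau can display or filter on it without mixing aggregation levels in a calculated field.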
We also have a Tableau starter kit that is a complimentary download, where you can output directly to a templated workbook by just updating your TDE. So if you already have a TWB built and just want to refresh your TDE, you can use this icon as well. And then we also have polygon output for Tableau to build out your spatial visual dashboards, and we can output directly to Tableau Server or Tableau Online as well.

And where did the time go? Such a great presentation. I'm afraid we are at the top of the hour, however. Lisa and Jessie, I can certainly send the remaining questions over if you want to review those. Thanks so much for this great presentation, and thanks to our attendees for being so engaged; we just love all the questions coming in, as always. And just a reminder again: to answer your questions, I'll be following up by email. Jessie and Lisa, thank you so much. I hope you both have a great day, and thanks to Alteryx for sponsoring today's webinar. Thank you, Shannon. Thank you. Have a great day, all.