Hello and welcome, my name is Shannon Kemp and I'm the executive editor of DATAVERSITY. We'd like to thank you for joining this DATAVERSITY webinar, Six Keys to an Agile Tableau Implementation, sponsored by Alteryx. Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we'll be collecting them via the Q&A panel in the bottom right-hand corner of your screen. If you'd like to tweet, we encourage you to share highlights or questions on Twitter using hashtag DATAVERSITY. If you'd like to chat with us or with each other, we certainly encourage you to do so; just click the chat icon in the upper right for that feature. And as always, we will send a follow-up email in the coming days containing links to the slides, the recording of the session, and additional information requested throughout the webinar. Now let me introduce our speakers for today, Maymuna Block and Albert Pardia. Maymuna is an Alliance Marketing Manager at Alteryx focused on big data and data visualization partnerships. She has over 10 years of experience in partner marketing and is passionate about bringing joint technology solutions to customers. Albert is a Solutions Engineer at Alteryx with 16 years of experience in database marketing and seven years of experience using Alteryx from the client side. And with that, let me turn it over to Maymuna to get us started.

Hi Shannon, thank you so much. As Shannon mentioned, this is Six Keys to an Agile Tableau Implementation. I'm Maymuna Block, and I'll be joined by Albert Pardia here today. As Shannon mentioned, we're Alteryx. And of course, before you begin a conversation, you say hi, right? So I just want to say a little bit about us. We're the leading platform for self-service data analytics, and we're passionate about empowering analysts.
I'm going to start a little bit at the end here, because really the main message is delivering deeper insights in hours, not weeks. How we do that is we give analysts a way to prepare, blend, and analyze all their data in a repeatable workflow (more on the repeatable part later). The second pillar is a way to deploy and share analytics at scale, because when you're doing your analytics, how do you then share that work throughout an organization, and how do you make sure the most is being made of it? Now, because I'm not sure where everyone is in the data visualization portion of their analytics journey (and of course we'd love to talk about that later in the Q&A), I just want to talk a little about Tableau, since this is the six keys to an agile Tableau implementation. When we're talking about Tableau, what it's really about is visual data discovery, right? As the shift toward data-driven decision-making has continued, many organizations and analysts have realized the value in visualizing their data. The great thing about visualizing data is that it allows organizations to interpret and spot connections and interesting moments within vast amounts of data. Visually, things just jump out at you. Even without knowing the context of the dashboards I'm showing you here, you can see certain interesting moments and pinpoint areas where maybe you need to explore something further. It's really helping people make decisions based on data, and make those decisions faster. Tableau is a really powerful tool: it allows users to manipulate and interact directly with the data visually, which is even better. But as you move further along your path of visual data discovery, you realize something about fully taking advantage of your data. We're very lucky to be in a world now where we benefit from the vast amounts of data available.
But you realize that you're still in the same situation you were in before you started visualizing the data. Or, in some cases, visualizing data with Tableau has been so beneficial that it actually brings to light within organizations what many analysts already know: preparing data is a very slow and time-consuming task. So what about data prep? We took a survey and found that most analysts are using what they know, which is Excel. It's the preferred spreadsheet application, a legacy tool for basic calculations and manipulating numeric data. But unfortunately its design dates back well before the current age of big data, so it's not great at preparing and cleansing large volumes of data, especially disparate data, and it definitely was not designed for the frequency of data we have now. Another thing we found in the polls is that a lot of analysts are joining organizations that already have established in-house tools, and it's hard to enact change within an organization that already has its tools in place, especially when it may have spent money on them. So where do you go from there? SQL queries are also a methodology we hear about a lot. That requires specialized skills, and it can be time-consuming and often brittle: you might change one thing and then have to go back and recode because other parts break. So it's not surprising that when we asked, "are your existing tools effective?", only 7% of people said their existing tools are very or extremely effective. The end result is that most data analysts are spending the vast majority of their time cleansing and preparing data, not actually analyzing it. And analysis is the fun part, right? You know you have those insights in there, and you can't wait to get to the part where you have that aha moment within your data, maybe within Tableau.
So how do you ensure that you're making faster, agile decisions on all your data, with the most updated and consistent data? What we like to say is that your analytics are only as good as your data and your data policies. That's why we came up with this session, six keys to an agile Tableau implementation: to make sure you have a solid foundation of flexible, agile data prep so you can really maximize the analytics you're doing in Tableau. After we walk through the six keys, we'll get to the good stuff, which is how you accomplish it. So with that, let's get started with the six keys.

Number one: ensure that you can use all your data. You're not getting the full picture if your visualizations are only being done on half your data, right? Usually the problem is that you don't have a quick way to cleanse, blend, and join multiple data sources. As I've said, we're blessed with a bounty of data sources, but they are disparate. Or maybe you're working with multiple Excel sheets with billions of rows. So you need to establish a methodology that ensures you can use all of your data.

Number two: make sure your work is easily editable. You put a lot of work into your analytics, but then you have to go back and maybe recode or spend time rewriting formulas. And once your visualization is created, if you're already at the point of using Tableau, it's hard to change the work behind your workbook. So you're going back and spending time coding, or even waiting on outsourced help; a lot of companies hire consulting firms or contractors to code. How can you ensure that the analytic work behind your visualization is easily editable?

The third key: make sure you can enable ad hoc investigations.
These visualizations are helping spot new trends and pinpoint areas that need more investigation, but that then begets more questions, right? You want to dig deeper into that data, but do you really want to start your prep and analysis from scratch? Of course not. So it's really important that you set up your analytics in a workflow that can be easily replicated and then adjusted on the fly, to really enable instant ad hoc investigation.

The fourth key is getting the most from your analytics, and you need to make sure you're doing that. You want to understand your data, and maybe you already do, which is great, but then you're thinking: okay, how can I get more out of my data? It's great if you can apply spatial analytics or predict future outcomes, because the only thing better than making data-driven decisions is being proactive about those decisions and digging deeper into the data. To get spatial context around your data, maybe you want to assess drive time for store locations, or use predictive analytics to see which customers will give you the highest lifetime value. And there's a way for you to do this without the need for a data scientist or a specialist, and then visualize it in Tableau.

The fifth key is instant answers. Don't wait around for answers. I think we've all been in that situation where you're sitting there, something's spinning, and you're waiting and waiting. We're all used to really instant information these days, which is great. So maybe you're already using Tableau, and you have so much data that these visualizations are taking a lot of time to render. When your organization has become reliant on immediate access to insights, that can be really frustrating. So you need to ensure you have a way to pre-process the data so that your calculations are done before the visualization is rendered in Tableau.
That way you're not waiting for the visualization to render, and you can get yourself and your organization immediate answers.

And finally, last but not least: be able to share your work. You're doing an amazing job, right? So why wouldn't you want to share it? The reason is that others can then see the work behind your analytics and the thought process, and that's hard with some legacy tools. You often get the "how did we get to this?" question, and it's hard to go back and retrace your steps when you're already deep into the analytics. If you can package your work up and spread it throughout your organization, that saves time and makes your analytics scalable. The other benefit for your analytics in Tableau is that your analysis is then being done consistently: a workflow that blends all the data and gets every department or group to the same consistent data set. That means everyone is working off the same data, and that's really beneficial for an organization. So the sixth key is really to ensure that your work is shareable, which also means traceable, auditable, and reusable. And with that, I will pass it over to Albert so you can really see and assess how you can accomplish this.

Thank you, Maymuna. I'll go ahead and share my screen. All right. As Maymuna pointed out earlier, as line-of-business analysts we spend a majority of our time, close to 70, sometimes 80 percent, prepping our data, making it look good so that Tableau can consume it and we can create our dashboards. The tool that you're seeing here, which is Alteryx, actually takes care of that 70 to 80 percent of time: you save this workflow, and then you can repeat that process.
In the next 20 or 25 minutes I'm going to do a really high-level overview of the software. We're going to create a workflow where we grab data, do some data manipulation and data processing, and then output it into a Tableau workbook. As you can see here, this tool is really geared towards the line-of-business analyst with no coding experience at all. There are so many things that we need to do with data before we consume it, and the functionalities we have here cover a lot of those daily tasks: accessing data from different data sources; doing a lot of the prepping, blending, parsing, and aggregation; the spatial tools, if you're doing any type of geospatial analysis like drive time or trade area analysis; and then also getting into the more predictive level of analytics, where we're trying to predict behavior based on certain algorithms and models. We have all of those tools here at your fingertips. And again, you don't need to be a coder, an R coder or a SQL coder, to create these models. This tool is really geared towards the line-of-business analyst without that experience. As I begin this demo: it's a drag-and-drop environment. I drag the tool and drop it right here on the canvas, and this is where I grab my data. As you can see, we can grab different types of data sets, whether that be an Excel file, a CSV file, or a table from a database platform, whether that be SQL Server, Oracle, Amazon Redshift, Teradata; there's really a lot to list. If you're curious about the different types of data files we can access, you can go to our website, and under tech specs you can see all of the different data sources. And this list is not static.
It's always increasing. But let's go on with our workflow here. I'm going to go ahead and access a data set, and I'm going to grab the CSV file, as you see right there. Then I can just go ahead and run it: add a browse, run it, and just like that, you have access to the data. Now, I'm not limited in the number of input data tools I can bring in; I can bring in one, two, or 20 data inputs. There's no theoretical limit on how much data you can bring in; the limit really resides in the resources of either your server or your laptop. The second thing I'm going to do is view this data, which is right here. It's a loan data file where I have the name of the loan, some sort of indicator code, and the loan values by year. The first thing I need to do is transform some of the data types here. The reason is that I grabbed this particular file as a CSV, and all the data analysts out there know that once you grab something from a CSV, the default format is character, right there. This is common in Alteryx, where you can just change the field type of the highlighted fields into some sort of numeric type; in this case, we're going to use double. Right there. Now it's a double, an actual numeric type where we can do some sort of calculation on it, right? The next thing I'm going to do is prep this data so that these field names, these header names, become actual values. So I'm going to transpose this data set, where I choose these fields as my key fields, and go ahead and run it. There it is. Now you see all of those different years, those different values: header names are now actually values in one column. And from here, I can again manipulate and even rename header fields.
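For readers who think in code, the two steps Albert just performed (casting character fields to double, then transposing year headers into values) have a rough pandas equivalent. This is a sketch only; the file layout and column names are hypothetical, inferred from the demo, not the actual data set shown.

```python
import pandas as pd

# Hypothetical loan file: one row per loan, one column per year.
# CSV values arrive as character (string) data, as in the demo.
df = pd.DataFrame({
    "loan_name": ["A", "B"],
    "indicator_code": ["AG1", "AG2"],
    "2009": ["10", "20"],
    "2010": ["15", None],
})

# Change the field type of the year columns from character to double.
year_cols = ["2009", "2010"]
df[year_cols] = df[year_cols].astype("float64")

# Transpose: the id fields are the key fields; the year header names
# become actual values in a single "year" column.
long_df = df.melt(id_vars=["loan_name", "indicator_code"],
                  var_name="year", value_name="loan")
```

After the melt, each loan has one row per year, which is the long shape Tableau and the downstream tools expect.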
This one I'll call year, and this one on the bottom I'll call loan. Go ahead and run it again. One thing I notice right away is that I have null values. In this workflow, I am not going to delete the null values but isolate only the values that are not null. So I'm going to put a filter tool here; the filter tool is the equivalent of a WHERE clause in SQL. Find the loans where the value is not null and run that. Let me zoom in a little so you can see this: the true side, where the logic I want (loan is not null) holds, shows all the values that are not null, and the false side shows the values that are null. Now I can do further filtering, where I again pick a field, this time where year comes after 2009, for example. As you notice, while I'm creating this workflow I'm not doing any type of scripting at all; it's just drag and drop right onto the canvas. For the next step, let me go ahead and run it first. I notice that I have this indicator code, and I want a bit more of a descriptor. As I said in the beginning, you're not limited in the number of files you can bring in, so I'll bring in another file, a lookup table, there it is, aptly named lookup table. And I'm going to blend it; in other words, I'm going to do some sort of VLOOKUP, an inner join, because I want the data to be a little more descriptive. I see that what I want is really this column right here, this topic. And in this file, one of the things I realize is that I don't really need the entire field value. All I want is that major category, "Environment"; I don't need the "Land use" or "Agricultural production" part.
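The filter step maps directly onto boolean masks in pandas, including the filter tool's split into a true side and a false side. A minimal sketch, with column names assumed from the demo:

```python
import pandas as pd

long_df = pd.DataFrame({
    "loan_name": ["A", "A", "B", "B"],
    "year": [2009, 2010, 2009, 2010],
    "loan": [10.0, 15.0, 20.0, None],
})

# Filter tool ~ SQL WHERE: the true side keeps rows where loan is not null.
true_side = long_df[long_df["loan"].notna()]
# The false side collects the rows that failed the condition.
false_side = long_df[long_df["loan"].isna()]

# Further filtering: keep only years after 2009.
recent = true_side[true_side["year"] > 2009]
```

Chaining a second mask onto the true side mirrors dropping a second filter tool onto the canvas downstream of the first.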
So I need a way to parse those values out. Looking at this, I notice that the delimiter to parse on is a colon. So I'll go ahead and drop my Text to Columns tool, where I point it at that topic field; my delimiter is a colon, and I want it split into two columns. Hit OK, and there, scroll all the way to the right, and you can see I'm able to parse the values from that one column into two separate columns. Great. Now obviously I have all of these other attributes that I don't necessarily need. So I'll bring down this select tool, where I just select the fields or attributes I need, which are the series code and this topic field with the major category, which I'll rename as topic. The reason I'm bringing in the series code is that it will be my match key; I'm going to do an inner join here, a VLOOKUP. So I'll drop in this join tool and connect it from the true side on top, and as long as I know my match key, which here is called the indicator code and here is called series code, I can go ahead and run it. Let me zoom in, and you'll notice right away that there are three anchors here. The letter J, the middle anchor, shows you records that are common between the two branches, the two files; these are the records that matched. The top anchor, L, shows records that are only present in the file above, but as you can see there are zero records displayed, so that means all of the records matched. And the bottom anchor, R, shows records that fell out, that are only present in this lookup table. So now I have the choice of whether I want a right outer join, a left outer join (by using a union tool), or just the results of the inner join. For this example I'm just going to stick with the ones in the middle, because all I'm really doing is prepping this for Tableau.
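The Text to Columns split and the three-anchor join have tidy pandas counterparts; in particular, `merge` with an `indicator` column plays the same role as the J/L/R anchors. The sample values below are hypothetical stand-ins for the demo's loan and lookup files.

```python
import pandas as pd

data = pd.DataFrame({"indicator_code": ["AG1", "AG2"],
                     "loan": [10.0, 20.0]})
lookup = pd.DataFrame({
    "series_code": ["AG1", "AG2", "EC1"],
    "topic": ["Environment: Land use",
              "Environment: Agricultural production",
              "Economy: Trade"],
})

# Text to Columns: split on the colon delimiter into two columns.
lookup[["topic1", "topic2"]] = lookup["topic"].str.split(":", n=1, expand=True)

# Select tool: keep only the match key and the major category, renamed.
lookup_small = lookup[["series_code", "topic1"]].rename(
    columns={"topic1": "topic"})

# Join: the indicator column mirrors the anchors
# ("both" = J, "left_only" = L, "right_only" = R).
joined = data.merge(lookup_small, left_on="indicator_code",
                    right_on="series_code", how="outer", indicator="anchor")
inner = joined[joined["anchor"] == "both"]   # the middle (J) anchor
```

Filtering on `"left_only"` or `"right_only"` recovers exactly the records Albert inspects on the L and R anchors.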
Now the other thing I realize is that my year here is just a year value, and I need to change it into a format Tableau can recognize as a date. So I'll bring in this formula tool, directing it to the records I want, which come from the middle anchor right here. I'm going to create a new field, I'll call it year date, the type will be a date format, and I'll do a little bit of concatenation, taking the value of the year and concatenating it with some values here. Run it, hit OK, and my final data set: perfect. That's the value I want. From here, I will output this data. As I showed you in the beginning, we can output into several different formats, whether that be a CSV, an Excel file, or back into a database platform. But this time I just want to output into a TDE file, or even better: I already created a workbook, and I want to update the data source of that workbook so I don't need to recreate my dashboard. All I need to do is look for the Tableau output tool, where I update my workbook right here. All right. Excellent. Let me zoom out. What we did here is create a really simple workflow where we do a lot of our data prep and data blending. You can save this workflow and even schedule it to run at a certain cadence, whether that be daily, weekly, or monthly. Say, for example, I'm accessing a database, and I know that it's constantly being updated every week; since I can connect directly to the table in that database, you don't even need to open Alteryx. You can schedule this workflow to run at whatever frequency or cadence you want. As you can see here, after it ran we have the result right there, a TWB, the Tableau workbook. Let me go ahead and click on that.
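The formula-tool trick of turning a bare year into a real date is just string concatenation followed by a date parse. A small sketch, again with assumed column names:

```python
import pandas as pd

df = pd.DataFrame({"year": [2010, 2011], "loan": [15.0, 30.0]})

# Formula tool equivalent: concatenate the year with a month and day
# so the result is an actual date Tableau can recognize.
df["year_date"] = pd.to_datetime(df["year"].astype(str) + "-01-01")
```

Note that writing a native Tableau extract (.tde or .hyper) from Python requires an extra library (pantab or Tableau's Hyper API, mentioned here as options rather than as what the demo used); writing a plain CSV with `df.to_csv(...)` also works as a Tableau data source.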
Once this opens, and I do not claim to be a Tableau artist or artisan here, my workbook, as you can see, is very generic, but I was able to update the particular data source that I wanted, and it updated my workbook. Now, one of the other things I wanted to show you, because in the beginning I mentioned some of the functionalities here, and I think Maymuna mentioned them too: spatial, and the advanced analytics portion, which is the predictive. We have all of those tools as well. If you're doing any type of advanced analytics where you need to predict a certain behavior based on past attributes, you can use these tools we have here. What we did is actually use open-source R; R code is the engine behind these tools. Let me drop one of these tools right here. Now this particular tool is actually a macro, and when I say macro, a macro is a compilation of other tools all housed in one single tool. I can open up this macro and see what's inside; zoom out a little, and you can see all of the tools that comprise that macro.
For any R coders out there who want to see this: Alteryx is not a black box. You can definitely see the R code right here. And let's say, for example, you want to create your own R tool or your own algorithm; you can certainly put down this R tool, drop in your R code, and create your own macro, and put it right here. Again, we're targeting the line-of-business analyst with minimal to no coding experience; for those people, this is another option if you want to run some sort of predictive algorithm, creating a response model for example, using some of these predictive tools that we have. We have regression algorithms here and also some classification algorithms. And we have a whole host of other tools, around 50 in all: if you want to do some A/B testing or some time series prediction, it's all in here, and even simulation and optimization, we have that here as well.
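To make the response-model idea concrete: once a logistic regression has been fitted, scoring new customers (what the score tool does later in the demo) reduces to pushing each attribute row through the fitted coefficients and a sigmoid. The coefficients and customer data below are entirely made up for illustration; this is not Alteryx's actual R implementation.

```python
import numpy as np

# Hypothetical coefficients from an already-fitted logistic response model.
intercept = -2.0
coefs = np.array([0.8, 1.5])   # e.g. past purchases, recency score

# Three customers' attribute rows (made-up data).
X = np.array([[1.0, 0.5],
              [3.0, 2.0],
              [0.0, 0.1]])

# Score tool equivalent: probability each customer says "yes"
# to the next marketing communication.
scores = 1.0 / (1.0 + np.exp(-(intercept + X @ coefs)))

# High-value targets: customers above a chosen probability cutoff.
targets = np.where(scores > 0.5)[0]
```

Ranking or thresholding these probabilities is exactly how the "high-value targets" list that feeds the Tableau dashboard gets built.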
Now one of the things I wanted to show you is also annotation. I have a prepared workflow here, and one of the things you can do is be descriptive in your workflow. Let's say, for example, you create this workflow, it's saved, and you want another person to run it. The person taking over doesn't need to ask anyone, "hey, is there any kind of Word document I can read on how this workflow runs?" We have annotation capabilities here for whatever text you want to put on the workflow so that the other analyst can take over. Or maybe it's for you: maybe you want to know, "hey, what did I do two weeks ago? What were the processes I ran?" It's set up like a Microsoft Visio diagram, where you can see how each step is taken: from this point, accessing data; doing a data join, which is an inner join; and so on and so forth; and then doing a little bit of predictive work. The other functionality we have is spatial analytics. With the spatial analytics tools we can create trade areas, create maps, and do drive time analysis as well. These are the things that really help companies that need to visually see what areas they should target. For this particular example, I'll open up this workbook right here, because I just ran it. In this example, we're trying to create a response model, where we're trying to predict which customers have a high probability of saying yes to the next marketing communication. Once you create this workflow, you can run it several times right here, and as you can see, we're just updating this workbook that was already done, using the logistic regression
algorithm. Using a score tool, we're able to compute a score; you can see here we're able to figure out which records are high-value targets, and then we can see that in the workbook. Let me show you right there. What we're doing here is really just updating this based on the new data we're connected to, whether that be a standalone data file or a table in a database; we're accessing that live, and then our data visualization is updated as well. And from here we can see the value (no pun intended) in terms of which records, which customers, we should target. So I'm going to go ahead and pass the ball to Maymuna to wrap things up.

Great, thank you so much Albert, that was an awesome demonstration. Sorry, I muted myself for a second there. We covered a lot today, so I just wanted to bring it back to how we really help analysts discover deeper insights in hours, not weeks. You saw the workflow there, and again it's really about using all your data: prepare it, blend it, enrich it with Experian data and TomTom data, analyze it with predictive and spatial analytics, and then share it. And I really want to emphasize, as Albert showed, that this is all a reusable, repeatable workflow. We output to Tableau, obviously, and helping build that foundation for Tableau visual analytics is incredibly important, but we can also output to Qlik, Oracle, Microsoft Power BI, back into Excel, etc.
So there are a lot of amazing options here, and if you go to the website Albert showed, alteryx.com, you can see the input and output specs. What I want to end on is a really relatable and exciting story of a data analyst. This customer is North America's largest home improvement retailer, and as that kind of business they had a strategic goal to optimize revenue. Customer preferences vary from store to store, so how do you optimize your product mix by location to achieve that target? It's incredibly hard when you have 575 product categories and hundreds of thousands of product SKUs per store, not to mention multiple price points per product line. Combining all these potential SKU-store combinations with customer data and transactional data is a huge challenge. This analyst was using Excel to blend sales data, and it was taking him two weeks. Even then, the very best he could do was not to analyze individual stores but to cluster stores by geographic region, and to analyze the category performance of only 5 to 10% of all their merchandise. So it wasn't even an exact store, and it wasn't all their merchandise. This analyst downloaded the free 14-day trial of Alteryx, and within that time period he was able to build a workflow that decreased his time to blend all sales data from two weeks to less than one hour.
That's a huge time savings, right? And then obviously you can do other things with that time and analyze more data. What used to take two weeks to analyze only 5 to 10% of SKUs, he now does many times a day, every day, almost instantaneously, for management, across stores and on all the SKUs. He also decided to start using Tableau, and instead of putting things back into PowerPoint and collecting people's comments, he now has live dashboards. He uses Alteryx to prepare all the data and then delivers mass-customized Tableau workbooks to all the business users within this large home improvement retailer. He was also able to automate the process of delivering this data, which had been a tedious, hours-long effort, and reduced it down to about 30 minutes. This is a really cool example of how an analyst empowered himself and, in turn, his organization with all their data, which obviously gives them a competitive edge. So thank you so much. Before we move into the Q&A, I just wanted to highlight some important links here. I mentioned the free 14-day trial; you can download it right away. If you're not allowed to download software at your organization, there's also an online test run you can do. And you can download a starter kit, which has pre-built workflows and the Tableau visualizations we shared today; this is a really great way to get started after you download the trial, or even before, and then you can incorporate your own data. I would have loved to share some other customer stories, but you can go to alteryx.com/tableau and check out some of the other customers we have there, with stories from Pality as well as JPMorgan Chase and some other really interesting ones. And with that, let's move into the Q&A portion.
Thank you both for this great presentation and demonstration. Just a reminder, one of the common questions people ask is about the slides and the recording: I will send a follow-up email by end of day Thursday with links to both the slides and the recording, and I'll also include a link to the 14-day free trial as mentioned. So let's dive into the questions. Can Alteryx access SAS files from a UNIX server?

I'll answer that. Alteryx can actually access SAS files now; as long as you have access and permission to that server, the answer is yes. I'm also looking at some of the questions being written down here, and I think one of the common ones I'm seeing is: is Alteryx an ETL type of tool, and is that its only functionality? The answer is no. You know what, I'll go ahead and share my screen for a second. Maymuna, if you can pass me the ball for a second, thanks.

While you're doing that, maybe I'll cover this; sorry, I'm interjecting. Maybe people have started using it as an ETL tool, and then once they start preparing their data and cleansing it, they get all this time back, right? So they're saying, okay, well, what else can I do? That's when we see people start moving into those advanced analytics, which I think is really cool. Anyway, sorry Albert, take it away.

Not a problem; it's a good question, a fair question, because sometimes you just want to know, well, who do you guys compete against, right? Do you compete against an Informatica, an ETL tool? Do you compete against SAS, where they do all of these predictive algorithms, or RapidMiner, or any type of data mining or advanced analytics platform? Who do you compete with? If you know Gartner, they published this particular slide, where Alteryx actually resides right smack in the middle. We do all of those things. Again, our target is that self-service data analyst who doesn't
necessarily need to bother well not that I'm saying you're bothering anyone but you can as a self-service data analyst access data make it you can create a workflow where it can be an ETL type workflow you can create an Alterics workflow where it can be a you're creating a predictive model for example or do some sort of you know or creating a report a static report whether that be an Excel but also at the same time you know we do know we do have partnerships especially with Tableau we know that's one of our biggest partners which is Tableau and we know where our niche is we know our niche is really this portion right here this data preparation data blending portion where sometimes you know again you could use that in Tableau but when you start accessing more than one data source let's say you're accessing really different data sources and you're bringing in more than one million records you're bringing in hundreds and millions of rows this is where Tableau kind of bogs down a little bit and this is where Alterics is able to do all of that data wrangling data prepping and the data blending of all of those hundreds and millions of rows that we can prepare and put it in a dataset to be exported into Tableau all right well since you're addressing ETL let me ask the specific questions to ETL I primarily need an ETL too so what does the events analytics mean to me good question go ahead I was going to say I mean I think I don't want to rehash but I think what I'm saying is people don't think they're really interested in advanced analytics so oftentimes we find that at a certain point I've saved myself all these hours and all these weeks so now I do have time to perform advanced analytics and it's easy and I don't need to know how to code or maybe you're saying that because you already have someone that helps you in code and I think Albert showed someone can go in there and write their own code or adjust code so I think it does have meaning for anyone in analytics who 
just wants to help the organization and themselves take that analytics to another level sure and specifically just to reiterate or re-clarify ETL tool Tableau is a data display dashboard tool correct Tableau can also grab records from different data sources and it can do joins as well as there the where Altrix fits in is that we can do again targeting that line of business where we can access different data sources at a larger volume and then putting it in a user interface where really as you can see it's not a high learning curve at all it's a drag and drop process where you can see the progression of how your records are being transformed each step of the way yeah I think it's a hard question to answer just because I think we're always just so used to categorizing things and we go okay what bucket does something fit in right so the ETL tool or your visual analytics tool as Albert just said we do have a tiny visual portion but obviously Tableau is amazing we know Tableau is amazing and that's why a lot of customers use them together and Tableau does have those joins but for large amounts of data and you know multiple data sources it gets a little sticky right so that's why these two tools and conjuncts are stronger together and just going back to the imparts you've already addressed the question regarding SAS but what about specifically MicroStrategy and NEDESA NEDESA I think we have we have access to that MicroStrategy I do not believe so and I'm going to grab the ball again here and share the screen excellent and if you guys go to our website altrix.com under technical specifications right here this is the list that we have in regards to accessing all the different data sources and again if there's something here well I don't think I don't see NEDESA here if there's something here that we currently don't have then there's usually some sort of wrap around that we can access to one of the things that go ahead I think NEDESA is in here because it's under the parent 
company I thought I thought maybe not maybe not but I wanted to point out we also have a community group called altrix community now for those who are interested in downloading the tool you just go to altrix.com there's a 14 day free trial there that you can download onto your PC and you can start creating a workflow we also have a huge community group where we let's say for example NEDESA and it is under the IBM you were right there it is so we do have an database under IBM where we can access that this is a great site that we have this altrix community back then before this site was created we usually rely on stack overflow now we have a budding community group where we actually help out answer a lot of our questions here and then even share workflows among each other and share macros going back to your demonstration instead of excluding the null values could they have been switched to show a zero absolutely again it's going to be dependent on the particular workflow so if I go back to my original workflow here so you see this as a use here I can actually replace that with a zero let's go ahead and replace it with a zero put a formula tool there and let me zoom in a bit so you guys can see where I'm going to say that my loan now will be zero and I want to do a union where I'm stacking it and I'm bringing back all those records again right there and there it is you can do that love it of all the specific questions coming in too the questioner says this is all very cool but I could get overwhelmed by the volume in some of these actions is there a way to apply a distinct I still like to see all possible combinations the only one instance of each yes the answer to your question is yes we do have a lot of different functionalities here go ahead and stop cancel this workflow for now so I can show you the different functionalities again within the summarized tool and I know that we're going to more into the minutia the questions here and more in the minutia we do have a 
lot of function where you can group by and some sort of distinct count distinct and all that stuff we do have a lot of functionalities now this will lead more into a question again for all of those users that are new to altrix is there training involved are there resources for me out there where I can just after I download the tool can it show me how to create a workflow so again within our let me go back to altrix.com here altrix.com and under the resources you can see training right here and we have a lot of free training within the overview you can there is we have videos out there that you can watch and how to create a workflow and it's a self-paced training as well where you can do something that is very simple as accessing data or you can do something advanced where you're grabbing data from the web and it's coming in to as a JSON format and you need to parse that out for example right so I think one of the questions was reviewing the questions can you grab data or can you download data from the web and the answer is yes we have a download tool that can do a web scrape depending on the URL and the API and then once you do a web scrape usually sometimes it comes out as an XML format or a JSON format from there you can actually configure that data set into more structured tabular data set using our JSON data parser and again just want to reiterate we have a lot of training here as you can see on demand training as well where you have little youtube videos that you can view once it renders here you can watch little youtube videos on how to access data how to do a join or even something how do I do forecasting algorithm and I just wanted to give this is great training and I want to give a plug for the starter kit too in conjunction with the resources that Albert is mentioning the starter kit has pre-built workflows and there's even a marketing specific one that helps you assess customer ROI and segmentation so again incredible resource that already has pre-built 
workflows and pre-built tabular workbooks and then again if you need more help you can go back to some of the training as well so as Albert said a ton of resources available here and then once you take the next step of download the data and you want to communicate with us we will work with you to create a specific use case so there will be an engineer like myself who will help you create your workflow and then save you all that time and resources that you initially put and then into one simple workflow that you can save and run on a schedule and you can save tons of hours sometimes days in your work process so can the final prepared data be outputted through other BI tools like click etc Laimona you want to actually do I have fall I need to try so yes it can so click I can also output back to Excel peri data so there is a full list of what we output back to at www.authorics.txt and I will show what we can output to but yes we love Tableau great visualizations will obviously we offer multiple other options for users. Is there any data describing that has to be done to function display properly for example moving Oracle data to a text file you can actually do the data scrubbing within the workflow that you created you do all of the data cleansing data prep data scrubbing that you need to do within the all tricks workflow and then after you do all that data scrubbing process I'll put it to the another data platform whether that be an Excel or text file yeah and that's what we see a lot of customers using all tricks for is when you're trying to match all this disparate data of null values and consistent column names a lot of cleansing you know the one where the user is cleansing there's a lot of cleansing going on and having that reusable workflow is really exciting because then every time you're adding new data you're not starting from scratch you already have that work built you can add new data in very quickly. 
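That reusable cleansing idea, matching disparate data with null values and inconsistent column names, can be sketched outside of Alteryx too. The following is just an illustrative Python analogue of the kind of reusable cleansing step described above, not an actual Alteryx workflow; the field names are invented for the example:

```python
# A reusable cleansing step, analogous to the workflow described above:
# normalize inconsistent column names and replace null values, so each
# new batch of data runs through the same logic instead of starting
# the cleanup from scratch.

def cleanse(records, fill_value=0):
    cleaned = []
    for row in records:
        fixed = {}
        for name, value in row.items():
            key = name.strip().lower().replace(" ", "_")          # consistent column names
            fixed[key] = fill_value if value is None else value   # no more nulls
        cleaned.append(fixed)
    return cleaned

batch = [{"Loan Amount": 2500, "Region": "west"},
         {"Loan Amount": None, "Region": "east"}]
print(cleanse(batch))
```

The point mirrors the one made in the answer: the cleansing logic is written once and reused, so adding new data is just another call with a new batch.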
So of course we can't have a webinar without the hot topic of metadata. Where are the definitions of metadata for the Alteryx objects?

Okay, so within the workflow right here, let me go ahead and share my screen again. So yeah, is the metadata saved? The answer to that is yes, it is saved. And when I say metadata, it could be as simple as what type of value something is. Is that retained? Sometimes it is retained, depending on the format you're outputting to; if you're outputting into a text file, then yeah, that metadata can be gone. This whole entire workflow, as you're seeing right now, is actually XML-based, and we can also read XML, and within that XML you have all of the different metadata types that are stored in there. You can import this, and then we can do another workflow to parse it and extract all of the metadata information that's contained in here.

And I think you already answered the next question there, with the inquiry about exporting to XML. Yeah. Go ahead.

No, I saw one question here regarding a Mac version of Alteryx. The answer is no, there is no Mac version; however, you can run it in Parallels on a Mac. If you have a Windows version running in Parallels on your Mac, you can download Alteryx into that Parallels instance and use it there. And the reason I know that is because yesterday we had a client who uses a Mac a lot and doesn't want to use a PC, so he was using Alteryx on his Mac by running Parallels.
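Since, as just described, the workflow file itself is XML-based, the metadata stored in it can be pulled out by parsing that XML. The fragment below is a hypothetical, heavily simplified stand-in for a real workflow file, whose actual schema is much richer; it is only meant to show the parsing idea:

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified workflow XML; a real workflow file has a much
# richer schema, but the extraction approach is the same.
workflow_xml = """
<Workflow>
  <Node ToolID="1" Plugin="InputData"><Field Name="loan" Type="Int32"/></Node>
  <Node ToolID="2" Plugin="Formula"><Field Name="loan" Type="Int32"/></Node>
</Workflow>
"""

root = ET.fromstring(workflow_xml)
# Pull per-tool metadata out of the XML: tool id, tool type, and the
# typed fields each tool carries.
meta = [(node.get("ToolID"), node.get("Plugin"),
         [(f.get("Name"), f.get("Type")) for f in node.findall("Field")])
        for node in root.findall("Node")]
print(meta)
```

This is the same "workflow parsing" move mentioned in the answer: because the format is XML, a second workflow (or script) can read it and report on the metadata inside.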
Yeah, I just want to echo that there are quite a few customers who, for obvious reasons, are Mac-based, in different industries, and quite a few of them run it in that Parallels instance.

And we've got a lot of questions still coming in and only a few minutes left, so we can fit in a few more. Do you support full T-SQL logic: case statements, substring, recasting data types?

Yeah, the short answer is yes. For those SQL guys out there: again, this tool is really directed towards the non-SQL, non-scripting line-of-business analyst, but don't feel left out. We do have a SQL editor text box, because I know some of you have created pages and pages of scripting and you don't want to lose it. I used to be a SQL coder myself, with pages and pages of stored procedures, and I was able to translate them into an Alteryx workflow. And sometimes, if I created something in SQL and I want to do some sort of in-database processing, where I grab my records in the database first using my SQL logic, the answer is yes, I can drop my SQL script into one of those input data tools and it will do that processing in the database. So instead of grabbing millions and millions of rows, you're just grabbing a subset of that data.

An interesting question: compared to other data prep tools, where does Alteryx stand out?
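To make the in-database point from that last answer concrete: dropping your SQL into the input step means the database does the filtering, and only the needed subset comes back. Here is a small sketch using an in-memory SQLite database as a stand-in for a production database; the table and query are invented for illustration:

```python
import sqlite3

# In-memory SQLite stands in for the production database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "west", 10.0), (2, "east", 99.0), (3, "west", 55.0)])

# Instead of pulling every row and filtering afterwards, push the SQL
# logic into the query itself, so only the subset leaves the database.
query = "SELECT id, amount FROM orders WHERE region = 'west' AND amount > 20"
subset = conn.execute(query).fetchall()
print(subset)  # [(3, 55.0)]
```

The table holds three rows, but only one crosses the wire, which is the whole benefit when "millions and millions of rows" becomes a small subset.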
Yeah, I mean, where do we start? No, I think it's a hard question to answer, but Albert showed the Gartner sections earlier, and we really do stand alone in the middle there, right? As far as having ETL functionality, the ability to connect and blend all sources, the ability to do this cleansing, and then the repeatable workflow, I think Alteryx really stands out, and Gartner agrees, and other people agree, that we're in a unique space. You might have heard the term self-service analytics thrown around a lot lately. It's a little vague as a term, but it is true: this is self-service analytics, and it's end to end and reusable, which is amazing.

I love it. Do you have any chemical functionality, for example the ability to carry out structure searches or calculation of chemical properties within the tool set?

Chemical properties: with that, I'm assuming you're probably asking about some sort of MATLAB-style function. Because of our integration with open source R, you can actually put in any type of formula you want, whether that be some sort of differential equation, integrals, derivatives, any type of functionality. You can plug in any type of formula that you want.

And is there a free version for academic use, or an academic partnership? Yeah, I can answer that. We do work in conjunction with universities. There are some parameters around it, so I am more than happy to send that information privately and connect you with the right department to work through it.

Just a couple more questions here before we run out of time. Are there connectors to social networks available?
The answer is yes, we have native connectors to some social networks, and some people within our community have actually created macros to connect to others, whether that be Yelp or Facebook. Twitter, actually, we have as a native connector. So the answer is yes.

All right, well, I'm afraid we are just coming up to the top of the hour here, and that's all we have time for. Thanks to all of our attendees for such great interaction and such great Q&A, and thanks to Albert and Maymuna for a great presentation. We really appreciate your time today, and of course thanks to Alteryx for sponsoring today's webinar. Just a reminder, I'm sending a follow-up email with links to the slides and the recording, and I'll also include the link for the 14-day free trial and make sure you have a point of contact, so you can ask your additional questions and get those answered. I hope everyone has a great day. Thank you, everyone. Again, thank you.