Hello and welcome. My name is Shannon Kemp and I am the chief digital manager at DATAVERSITY. We'd like to thank you for joining this DATAVERSITY webinar, IT + Line of Business: Driving Deeper Insights Faster, Together, sponsored today by Alteryx.

Just a couple of points to get us started. Due to the large number of people attending these sessions, you will be muted during the webinar. For questions, we'll be collecting them via the Q&A panel in the bottom right-hand corner of your screen. If you'd like to tweet, we encourage you to share highlights or questions via Twitter using the hashtag #DATAVERSITY. If you'd like to chat with us or with each other, we certainly encourage you to do so; just click the chat icon in the upper right-hand corner for that feature. And as always, we'll send a follow-up email within two business days containing links to the slides, the recording of the session, and any additional information requested throughout the webinar.

Now, let me introduce our speakers for today: Tim Chandler, David Potts, Todd Talkington, and Raman Kaler.

Tim is the principal business analyst at Marketo. He currently leads Marketo's BI and data solutions to deliver business and operational insight across all of the company's business teams. With global experience in both business and IT, Tim has worked on many complex BI and data service deployments and developed sales and service application solutions for Cisco, Sun Microsystems, Hewlett-Packard, and Nortel Networks.

David is a manager of Solutions Architecture with Amazon Web Services. He provides technical guidance, design advice, and thought leadership to some of the most successful U.S. software partners on the planet. His deepest expertise spans big data, security, storage, and software-as-a-service business applications. Over the last 15 years, he has brought an intense customer focus to challenging and deeply technical roles in multiple industries. He holds a world record and a number of patents.

Todd is a senior technology partner manager at Tableau. He has been in the high-tech industry for 15 years and has spent seven years managing technology relationships. At Tableau, Todd has focused on managing the relationships with Tableau technology partners. In this capacity, Todd has become a thought leader in how Tableau technology partners can bring solutions to market that help Tableau customers better see and understand their data.

Raman is an Alliance Marketing Manager at Alteryx, where she focuses on defining strategy and executing programs for joint marketing efforts with key strategic partners. She currently focuses on partner marketing efforts with Tableau, Amazon Web Services, Cloudera, Databricks, and Salesforce.

With that, let me turn it over to Raman and Tim to get us started.

Hello and welcome. Hi, there. Thank you for joining us today as we present IT + Line of Business: Driving Deeper Insights Faster, Together. I'm Raman Kaler with Alteryx, and I'll be your host today. Today we have a great panel of speakers joining us from Amazon Web Services, Tableau, and of course Marketo. We'll kick off the hour with David Potts, a Solutions Architecture manager from AWS, and Todd Talkington, a technology partner senior manager with Tableau. David, Todd, and I will provide a quick overview of how Alteryx, AWS, and Tableau can further your self-service analytics practice, and then we'll hand the floor over to Tim Chandler, our featured speaker for the day.
As mentioned, Tim is heading up Marketo's BI and data solutions to deliver business and operational insights across all of the company's business teams. At the conclusion of Tim's presentation, we'll take Q&A from attendees. And with that, let's go ahead and get started.

Alteryx is a leading platform for self-service data analytics. It enables analysts to use a repeatable drag-and-drop workflow to easily prep, blend, and analyze all of their data, be it on premises or in the cloud, as is the case with AWS. With Alteryx, analysts can deploy and share their analytics at scale for deeper insights in hours, not weeks. All of this empowers analysts and line-of-business users to perform their own analytics, quickly process large volumes of data, and then output results to all popular formats, including Tableau, as we'll hear from Tim today.

Alteryx makes it easy to perform analytics, but what's more, with Alteryx you can use the same workflow to enrich your data if need be. We have relationships with leading providers of third-party data so that you can augment your data with additional insight to create the perfect data set for analysis. Regarding our advanced analytics, the predictive capabilities are based on the language R. The tools require no coding expertise, but if you do have R programmers, they can customize these tools or even create their own tools to be reused in Alteryx. Our spatial analytics tools utilize data from TomTom to build trade areas based on drive times or other variables and do sophisticated geospatial analysis.

Finally, Alteryx makes it easy to share the business insights you produce. With the legacy approach, analytic apps and reports often have to be created in another environment or custom-coded. With Alteryx, this step is simply added to the end of the workflow to enable reporting, output data for visualization, or create apps that allow business decision-makers throughout your organization to customize and run their own analytics without having to create custom reports. With that, I'd like to pass the floor over to David.

All right. Thanks for that, and thanks for passing me the virtual floor. I'm David Potts, and you know, the title slide says we're going to do an AWS overview, but what we're really going to do is focus on one particular service, and that's Amazon Redshift. I'm a big fan of Redshift. It's fast. It's simple. It scales to petabytes, and you can get started for less than $1,000 per terabyte per year. So go ahead and click forward, please.

So what is Redshift? Well, it's a relational data warehouse, and that's something we recognize in the industry. Data warehouses aren't new, but there were definitely some pain points that customers came back to us with, and we'll talk about that in a second. But if you think about it, one of the great features of Amazon Redshift is that it's fully managed, right? So you don't have to worry about getting it set up or doing the day-to-day undifferentiated heavy lifting of managing a data warehouse. We offer options, so you can choose SSD platforms or even HDD platforms, which are great for sequential, big-block queries. It's massively parallel, so you can farm out the processing to a number of compute nodes, and we'll look at that a little later. And it can scale to petabytes. We have plenty of customers that are using Redshift at petabyte scale.
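To make "fully managed" concrete, here is a minimal Python sketch, not shown in the webinar itself, of what provisioning a cluster looks like through the AWS API with boto3. The cluster identifier, node type, and credentials are hypothetical placeholders.

```python
# Minimal sketch: provisioning a Redshift cluster via the AWS API.
# All identifiers and credentials below are hypothetical placeholders.
import boto3

redshift = boto3.client("redshift", region_name="us-east-1")

redshift.create_cluster(
    ClusterIdentifier="demo-warehouse",   # hypothetical cluster name
    NodeType="dc2.large",                 # an SSD-backed node type
    NumberOfNodes=2,                      # start small; clusters can be resized later
    DBName="analytics",
    MasterUsername="admin",
    MasterUserPassword="ChangeMe-1234",   # in practice, keep this in a secrets store
)

# Block until the cluster is available, then look up its endpoint.
redshift.get_waiter("cluster_available").wait(ClusterIdentifier="demo-warehouse")
endpoint = redshift.describe_clusters(
    ClusterIdentifier="demo-warehouse"
)["Clusters"][0]["Endpoint"]
print(endpoint["Address"], endpoint["Port"])
```

That endpoint is the leader node David describes shortly; any PostgreSQL-compatible SQL client or BI tool, Tableau included, connects there.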
And all of that comes at a reasonably low price: you can get started at 25 cents an hour, around $1,000 per terabyte per year. So go ahead and click forward, please.

Now, I'll admit my bias here: I've been doing this for a long time. And in the old days, data warehousing was just for the largest Global 2000 companies. Data warehouse vendors would sell to central IT. They'd look for multi-year commitments and multi-year deployments; sometimes it would take that long to get it deployed. And you'd sometimes have to cough up millions of dollars to get it set up and going. That's a little challenging for most of us. So we took that feedback and used it to create the Amazon Redshift product. And in the old days, because these systems were so expensive and difficult to get, we would often have to leave data on the cutting room floor. We could only look at a very small subset of data, because the systems were expensive and it was hard to get more. Go ahead and click forward.

And what happened was, companies found that it was leading to dark data. We know from everything we've heard about big data that it's growing in velocity and volume, and frankly it's accelerating. But because it was expensive to pull data into these data warehouses, it led to a lot of dark data. And one of the things that I think the market missed in the past is that small companies have big data too. Whether you're in mobile or ad tech, IoT devices generate tons of data, and if you're into gaming, they track almost everything you do in there. And because of these long cycles and the high cost, it really stifled innovation. One of the great examples I like to give is that one of the best metrics for your agility as a business is how long it takes for you to have an idea in the shower and be able to test it out later that day. And one customer that really changed the game is Yelp. What they've done is democratize their data services so that anyone can experiment. And that allows them to come up with new ideas, new functionality, and new value for their business. Go ahead and click forward, please.

So what's great about Redshift? If I had to boil it down: 10 times faster, 10 times cheaper, and easy to provision; you can get it going in just a few clicks. It doesn't necessarily completely replace the DBA, but you sure can manage a lot more data using a managed service like this. And we allow you to easily leverage the tools on top of your big data platform that you already know and love. So we have a great partner ecosystem, and there are great tools that you can use to look at your data. For example, Tableau works very, very well with Redshift. And because it's a managed service, you can build it in line with your process flow, so if you have data pipelines that you're putting together through an ingestion tool like Alteryx, you can easily connect that to Redshift. And pay as you go: our customers tell us over and over that they only want to pay for what they use, and we support that through a lot of features, functionality, and a very friendly on-demand pricing approach. And it's managed availability, so no worries about learning difficult clustering technology and having to maintain that; we provide that for you, as well as disaster recovery. So a lot of great features there. Go ahead and click forward.

And don't just take my word for it. We've had a lot of customers stand up and say, we are really excited about Redshift, and this is how it's changed our business.
FINRA, for example, cut their data warehousing costs by 70% by implementing Amazon Redshift, and you can take a look at the website; we have a case study on that. Go ahead and click forward. And we have a huge ecosystem, too. Tons of partners integrate directly with Redshift. They see the value, and customers see the value. Go ahead and click forward.

So, a little bit about architecture. We're not going to go super, super deep here, but I just want to give you a sense of what this looks like. Redshift is set up with what's known as a leader node, and that's what you're going to communicate with if you're using a SQL client or a BI tool that needs to contact Redshift. It stores some metadata, it does your query optimization, and then it coordinates the query execution across the compute nodes. And the compute nodes are sliced up, so they're able to work in parallel. And they use what's known as columnar storage. Now, this is a technique where data is stored primarily in columns rather than rows, which is great for analytical queries and time-series queries, and makes it really easy to save quite a bit of I/O in these types of analysis workloads. And you can grow up to two petabytes compressed, which is quite a bit of data. So there are some scale numbers there. And you can get started, like I said, for just 25 cents an hour. So good value, and a good structure for very highly performant data processing and analysis. Go ahead and click forward, please.

And that's all that I had on Redshift. I'm going to jump into the Q&A; if you have questions about Redshift or AWS in general, I'll be answering Q&A while the other folks are talking. So thanks so much, and I'll hand it over to Tableau.

Hello, and thank you for passing it over here. Can you go to the next slide, please? To explain what Tableau is for those who have not been able to interact with it: Tableau is a visualization product that lets you bring data in and very easily look at it. We have a mission statement here at Tableau, which is: we help people see and understand their data. Now, on the surface it sounds pretty simple, but as we look at it, it is really much deeper. Our focus here at Tableau, when we say people, is to help absolutely anyone in the organization see and understand their data. And we do that by making Tableau very easy to use. It has drag-and-drop functionality: literally, when you attach to data, you will be able to drag it out onto what we call a palette, and it will begin to build charts. And you can then begin to play with those charts to really understand what it is that you're looking at.

So for example, as mentioned, we do connect to Amazon Redshift. We see a lot of customers using that as a platform to store data. And we see a lot of customers that use Alteryx to help clean and prepare the data that they're using. And so once it is up in Amazon and ready to use, Tableau can get connected there and bring that data down. Anyone in the organization can do that, and obviously you can set up rules and processes to make sure people are seeing the right data. But the benefit of Tableau versus really anything else on the market is that you're able to really explore the data. We talk about the unknown unknowns in Tableau. It helps you see those things that you may not see in other situations and really discover things about your organization that can help provide a roadway that's more beneficial than maybe the one you're on.
So it allows the front-line people to go get data, understand what's going on with their data, and ask questions. And those questions lead to true insights. So next slide, please.

And we keep it simple. There are really two products that we sell at Tableau. One is Tableau Desktop, and the other is Tableau Server. Now, Server comes in a couple of flavors, and I'll explain that. Tableau Desktop is where someone would author a workbook or dashboard within Tableau. It's installed on your desktop and very easy to use. In fact, you can go to Tableau.com and download a free copy for a trial period; I highly recommend that you do that to get some experience with it. And then once you build those workbooks and dashboards, they can be published up into Tableau Server.

Now, as I mentioned, there are a couple of flavors, and from a cloud perspective we try to meet you where you are. In other words, we try to make sure you're enabled for whatever way you want to go to market with your visualization tool. First, we have Tableau Server. Tableau Server can be installed on premises, number one, but it's also certified to be put into Amazon and used as a hosted solution there. The third option is to use Tableau Online: if you want us to manage the infrastructure and upgrades, we can also offer a way where you use Tableau Online as your sharing mechanism. We do have the ability to push those visualizations out to mobile as well, so your phones and your tablets can receive them, and you can do all the interaction on those mobile units that you can do on Tableau Server. So all in all, Tableau is a way to see and understand the data that you have, and we look forward to what we have here with Marketo. So I'll let you move forward from here.

Thanks, Todd. My name is Tim Chandler. I work at Marketo, and I've been there for a little over a year. The thing I really want to discuss with you over the next 20 minutes or so, if you go to the next slide, is the whole challenge of what's really happening, and what was presented to me a little over a year ago when I came to Marketo. You know, on the very surface of the challenge, when I talked to the CXOs and many other line managers in the company, it was: we can't see our data. We need a better view of what's going on, and when two people from different organizations come to us and show us a chart, the numbers should match. A pretty cool concept.

Hey, Tim, I don't mean to interrupt you. I'm so sorry, but can you speak up a little bit here? You've suddenly become very quiet. Is this better? It's a little better, if you could be louder. There we go. I'll just hold the mic near my face.

Okay, so the challenge on the surface seemed very simple. We have some data. We want to see what's going on, and we want to see it consistently. However, as probably many of you are aware, this is really just the tip of the iceberg. If we go to the next slide: the challenge is not for us to create a bunch of charts, but to get the business people involved in understanding their data and creating the charts themselves. It's one thing to go out and create a chart that someone has asked for, but it's much more powerful if you can empower people in the business to create the chart that they want. They have a much more intimate understanding and knowledge of not only what they need today, but what they need tomorrow.
We go to the next slide, and it gets even more complex, because we're not talking about one business person. We're talking about several business people across many different organizations. They're grabbing data from many different disparate areas, and they're creating many charts. It's not about what we see today; it's the sustainability of these charts and the trustworthiness of these charts as we go on. So the challenge, although it may sound really simple at the beginning, is a pretty tough challenge as we go down this path. I just wanted to frame the challenge that we had getting into this, because if it was simple, it would have been done a long time ago.

If we look at the next slide here: the first thing I set up, and this is probably a couple of months into this, is something I call Mavericks. It's a data pipeline, and simply put, it's a way of grabbing disparate data in the cloud, or wherever it happens to be, and pulling it through three separate stages. The first stage is basically the ETL stage, where it's massaging, blending, and prepping the data. The second stage is where it's actually putting the data into a relational database, which makes it a lot easier to get at the data later on, such as historical data from Salesforce. And the third stage is putting it into the right format and presenting it in Tableau, in business terms.

So the magic, and this is really the magic of this presentation, or what's happened over the last year: there's a lot going on on this page, so I'm going to talk about it a bit more. On the left-hand side, and this list keeps growing every month, we've been working in Alteryx to go in and grab data from many of these different areas. Now, some of these, like Salesforce.com, have a nice little tool where you can go in, put your credentials in, and boom, you're pulling out data right away. For a lot of these, though, you're having to use REST API tools to go in and grab the data in a secure fashion. Now, when it first comes in, that first box that we're looking at here says AWS; this is all AWS in this yellow box here. As data first comes into this environment, this box is crunching over 80 data flows every day. It's essentially refreshing, reaching out, and doing incremental and full loads based on the business requirements, all the time. So if I were to look at that server right now, it's busy; it's got a couple of things whizzing away, making things happen.

As it goes into Redshift, only certain jobs actually go into Redshift. So for instance, for Salesforce.com, every night at 11 o'clock I snapshot about 50 fields covering what's happening in our sales funnel, basically, and with those snapshots accumulated over the last year and a few months now, you can actually do things like pipeline velocity and many other cool things. And this was, again, simply going into Alteryx and saying: okay, read this table and save it in Redshift, and let's do it again tomorrow. The last part here, in more detail, is really making sure that we save this as a data extract in Tableau's native format, and that really enables the development of the dashboards and the presentation of the dashboards to happen much faster. So you can see it's a pretty simple concept. It's all secure, it all flows basically from left to right, and it's been up and running for well over a year now, and we've had great success with it. It's been very simplistic.
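Tim builds this snapshot job in Alteryx. Purely as an illustration of the idea, here is a rough Python sketch of a nightly funnel snapshot landing in Redshift, using the simple_salesforce and psycopg2 libraries. The fields, table, and endpoint are hypothetical, and the real job captures around 50 fields rather than the handful shown.

```python
# Rough sketch of a nightly "snapshot the sales funnel" job.
# Tim does this in Alteryx; object, field, and table names here are hypothetical.
from datetime import date

import psycopg2
from simple_salesforce import Salesforce

sf = Salesforce(username="...", password="...", security_token="...")

# Pull a few funnel fields from Salesforce (the real job snapshots ~50).
rows = sf.query_all(
    "SELECT Id, StageName, Amount, CloseDate "
    "FROM Opportunity WHERE IsClosed = false"
)["records"]

# Redshift speaks the PostgreSQL wire protocol, so psycopg2 works.
conn = psycopg2.connect(
    host="demo-warehouse.abc123.us-east-1.redshift.amazonaws.com",  # hypothetical
    port=5439, dbname="analytics", user="admin", password="...",
)
with conn, conn.cursor() as cur:
    # Append tonight's snapshot; the snapshot_date column is what makes
    # trend analyses like pipeline velocity possible later on.
    cur.executemany(
        "INSERT INTO opportunity_snapshot "
        "(snapshot_date, sf_id, stage, amount, close_date) "
        "VALUES (%s, %s, %s, %s, %s)",
        [(date.today(), r["Id"], r["StageName"], r["Amount"], r["CloseDate"])
         for r in rows],
    )
```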
You'll notice there's no MDM here, because we're able to go back into the sources, and if there's a single data source that's going to be responsible for something, we make sure that the business and IT understand that. For instance, Workday is going to be the definitive source for what we're looking at from an employee point of view, for hiring, and that then carries through the whole data model that we set up here.

Let me go on to the next slide. So, I mentioned there are about 80 workflows running. Probably one of the nearest and dearest data workflows is the opportunity workflow. This is one that a lot of the sales ops people are looking at, and a lot of the finance people are looking at. I know you can't read the little widgets, but it gives a little train-track kind of view of what's going on, and it's kind of interesting. You start off on the left, where it's basically just reading in five different tables from Salesforce.com, and as it goes from left to right, it's pruning the data and fixing up the data. Those gray boxes are basically blending the data together. The green boxes are converting to US currency. And then you've got a few other boxes; you've got oranges and a couple of other boxes that are collapsed, and this is where the business has come to us and said: oh, we want to be able to see these extra fields that will make Tableau a lot easier for us.

So one of the decisions we made very early on is that we are not looking to create Tableau experts. We are looking for people in the business who are going to be doing Tableau work, but we want to make it as easy as possible, and as easy as possible means creating the best data model possible, so that when the data is there, they can do more drag and drop and fewer calculations, fewer things like level of detail. Creating the right data model creates speed. As you finally go over to those far yellow boxes on this slide, that's where it's actually publishing out. So it's interesting: it starts off with reading and pulling in, then massaging and enriching, and then pushing out into Tableau, and this whole process takes about half an hour. We run it periodically during the day to refresh the data, and interestingly, there are other data workflows that will take data from this workflow, further enrich it, and create it for other parts of the organization. The nice thing is that we've got many people using this data set, so when it comes time to do reports, the numbers match as we go around the company.

If we go to the next slide: this is not to say there aren't challenges out there, and really over the next few slides I'm going to talk about the data challenges that we've had, the things that I think anyone is going to run into as they see this happening. So the first challenge we had is that when we started to see where all the data was coming from, we had to start building out our own REST API expertise, really developing that expertise in-house and working with Alteryx to create this. There are many different connectors that exist today, and this is a picture of some of them, but the one that's probably the most important, and the one that you'll probably come to know best, is the one at the bottom center: the download tool. It basically allows you to insert REST API calls into it, and it lets you pull data out of whatever source you're looking at.
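To give a generic picture of what goes inside it, here is a minimal Python sketch of the kind of paginated, authenticated REST pull the download tool performs, using the requests library. The endpoint, auth scheme, and paging parameters are all hypothetical; every vendor's API differs, and many cap page sizes around a thousand records, a limit Tim comes back to later.

```python
# Generic sketch of a paginated REST pull, the kind of call you'd embed
# in the download tool. Endpoint, auth, and paging parameters are hypothetical.
import requests

BASE_URL = "https://api.example-vendor.com/v1/records"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}           # credentials from the vendor

def fetch_all(page_size=1000):
    records, offset = [], 0
    while True:
        resp = requests.get(
            BASE_URL,
            headers=HEADERS,
            params={"limit": page_size, "offset": offset},
            timeout=30,
        )
        resp.raise_for_status()              # fail loudly on auth or HTTP errors
        batch = resp.json().get("records", [])
        if not batch:
            break                            # no more pages
        records.extend(batch)
        offset += page_size
    return records
```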
It's proven to be very reliable, it really made this happen in many different ways, and it's also proven to be very secure. Generally you're coding some custom work up front, and you typically have to have that conversation with vendors: when you start to say, hey, can I have REST API access and credentials, you get the deer-in-the-headlights look from some of the vendors, but as you go through and talk to the right people, you're able to get that.

We go to the next slide. So the next thing is not really about technology; it's really about understanding what the business objectives are. You can go look at some data and come away thinking: oh, well, if we just take this data that they're using in Excel spreadsheets, grab it from the right sources, automate it, and push it into Tableau, job done. Not at all. You've really got to understand how the business is using that data, and in many cases, when they pull up an Excel spreadsheet, they're really only touching the tip of the iceberg with it. So it's very important to develop a relationship with the business, understand what they're trying to get at, what insights they're trying to gain about their business, and then start to develop data that makes Tableau easier to use over time. And this is not a quick process; this is something that comes from iterative conversations with the business over time.

The next slide. So, building on the last one, the most important aspect, I think, is that once you start to understand more and more what's happening with the business objectives, you have to translate that into a data model. You have to better understand what they are trying to accomplish, what types of charts they're looking for, what answers they're really trying to see. And as you build this out into a Tableau extract, you're looking in two directions: you want to make sure that the business has what they need, but you also want to make sure that when you have multiple business groups looking at data, whenever possible they're looking at the same data instance. You want to cut down on the chatter, or the division of data, in the back-end systems. So this tool created by Alteryx to publish into Tableau is pretty much at the end of probably 50, 60, 70% of all the workflows we have, and at the end of the day we have about 12 core data extracts that are updated every few hours in the system and being used by the business today. Over to the next slide, please.

So those were the data challenges that we primarily ran into. If I had hours more, I could probably talk until sunset on those, but those are the highlights. I'm now shifting from data challenges to the business challenges that come up, and let me tell you how we approached this at Marketo. I've talked about these data models, in the lower left-hand orange circle there. The challenge now was: how do we get individuals across the company to drive adoption of charts and dashboards and move away from Excel? Our goal is not to get away from Excel per se, but to move to an automated process, so people aren't doing the dive-and-catch every month or every quarter; we're able to automate as much as we can. So what was done very early on is that we identified people in each of the organizations. In the partner organization we had Jane, we had Catherine in Legal, and so on and so forth through the organization.
And depending on which organization we were looking at, we really looked at the skill sets that each person brought to the table. We did not go out and say: hey, you're a data steward and you're going to have these roles and responsibilities. We did not approach it that way. We really looked at it from: hey, how can we make your life easier, and let's look at a way of making that happen for you. So it became a very personal conversation with people, but at the end of the day you start to evolve into this relationship. It's really about them understanding their data, you having a broader understanding of the data across the company, and then being able to supply them and quickly get them to where they need to be in automating dashboards. So let's move to the next slide, please.

So again, I kind of break this down into three challenges. The first thing I talked about was identifying the business owners who are going to help make this happen. I know in the traditional sense, at some larger companies, we've always had this very structured way of having data stewards and giving them roles and responsibilities. But I really don't believe that's the key way of making this happen. I believe it's better to look for people who are having pain with Excel. I don't know if you've run into it, but I have: someone's literally got macros and VLOOKUP tables and all kinds of stuff going on, and if you ask them to explain it, or whether it's documented, they look at you with that stare that you recognize: no, this is their tribal knowledge; only they understand it. So that's a huge opportunity to really understand and develop a way that they can legitimize the processes in the background, and to make this happen in a much more automated way. So these people over time become more of a data champion. But again, it's really about looking at the people in the groups, knowing what they bring to the table, and putting it in personal terms of how you can actually drive and make things happen with them. Let's go to the next slide.

So here's something I think is really interesting, from a training point of view. If you go out there and engage different groups or companies and say, hey, I want to train people on Tableau, I've seen that they want to create experts out of people. They want to get them creating charts, and they want to talk about level of detail, and they want to talk about calculations, and they want to get into all this stuff, and in my opinion that is usually not the way to go. You want to take people's data that they are actually using. In other words, you need to have the data up and running and ready, and you need to put that in an environment where the person is going to look at their data and ask questions about how to build their dashboards. That's the connection you need to make in training: you need to be able to show them their data and their dashboards. It's interesting looking at sample data, but it really clicks when they see their sales, their territory, and they want to know how to do territory planning for the next six months. The questions get really interesting, and they progress down a path that makes it much more sustainable and drives more adoption. We've really tried to stay away from teaching anyone about data blending, level of detail, or any advanced charts. Those things will come to people who want to continue on.
We're not going to hide that from anyone, but when people are introduced to Tableau, we're showing them how to solve business problems, not how to be experts in Tableau. That, I think, is a very key part, because most of the people who are going to be creating these charts have other day jobs, and even though they may wish to get into these things more and more, it's just not practical in many ways. Let's go to the next page.

Probably one of the biggest challenges overall is getting company-wide ratification of data. There are three different ways I look at this. One is when someone in finance is doing a calculation, or someone in sales ops is working on some other aspect of the data, and they're building something out. It's more in your interest to have that done in the back end by Alteryx, as it pushes the data into an extract, versus actually creating that custom field in Tableau. And here's why: if it's a valid field that many people in the company are going to be using, you want it to be a standard, and you want to drive it as a standard across the company, and there's no better way of doing that than just making sure it shows up in everyone's dashboard, so you don't have to advertise that calculation. Now, I know there are ways around that in Tableau, but again, let's keep it simple.

The second thing is sharing and using common standard fields, and that means taking away the fields, or never showing the fields, that just aren't being used in the company. And this is a huge advantage over what we see in Salesforce.com reports, where a lot of people are using a lot of different fields. In Tableau, you can govern this a lot better, and you can drive things into the mix and create greater standards across the company.

The other aspect of this is really categories. A lot of people at Marketo will look at the same number, but they'll slice it different ways, and those slices are basically different categories of how they look at it. Being able to make sure that those categories are understood and are company-wide, so that finance, sales ops, partners, and everyone else are using those same categories, is again a huge win, and this is something we were able to push through, working with people in the business, with Tableau. So let's go on to the next slide.

So, I've presented a lot of the data challenges and the dashboard challenges that we've gone through. The thing I want to talk a little bit about now is that the real magic in this whole thing is how you go from an idea in the shower to getting it into production. Or, to make it a little less sexy: going from business requirements to production. And you need to be able to do this at a pretty fast pace. If you really think about it, you're going to engage someone in the business, and they've usually got a fair bit of excitement: this is cool, I want to get my dashboards up, I want to make it happen. And you need to engage them in such a way that you're able to pull and modify that data. If there's something wrong with the data, or something needs to be added or changed, you need to be able to do that quickly and then get it into production as quickly as possible. If there are delays of weeks, the excitement goes down in the business, their attention shifts to other things, and it jeopardizes the adoption of what's happening.
So it's really important to be able to go in, see the business requirements, and then hours later go in and change something in Alteryx, create a different pipeline, push it into production or into a beta environment, and then come back to them and ask: hey, is this what you're looking for? Is this what you're looking for? And be able to quickly get to the point where they're going: yeah, that's exactly what's going on, great, I can replace my old process with it. So speed is still a very important aspect of this, to maintain that excitement with the business. Let's go to the next slide, please.

So what's happened at Marketo? These are some of the charts of what we've actually seen happening inside the company. I've got a red arrow here; it points to where we actually launched Mavericks. So Mavericks was running in the back end while we worked with many different business groups, and then at one point I said: okay, the data is locked in, it's been ratified, we're seeing that it's high quality, and we're ready to go make this thing happen in production. Each one of these colors represents a different organization in the company. You can see that up until this red arrow we're seeing moderate growth across about five different departments. But then, when the data became available in the back end, the adoption really started to take a totally different trajectory, and we started to see more and more organizations come on board. So the really exciting story that I love is that having this data pipeline in the back end, humming away day in and day out, is the foundation of being able to deploy Tableau, get it into the hands of people, and get them publishing across their departments. So we're seeing an uptick in the production and publishing of these dashboards, and also in their use. And we went from named-user licenses up to a whole site license about a quarter ago, and we're taking advantage of that nicely. Next slide, please.

So the next few slides are really just some snapshots of what the teams have done. And this isn't me, this isn't an IT creation; this is the business actually taking the data and going out and making this stuff happen. So this was our territory planning; there was a whole set of slides in here. Apologies for the grayed-out areas; I can't share some of the intimate data that's in here. But you get a flavor of it: all of a sudden, now that they're able to see where our existing sales are, in certain capacities and certain industries, we're able to see where we need to do our sales territory planning for the following year. Next slide.

Finance has a series of dashboards that are out there. They were able to see a lot of this in Excel spreadsheets, but it was typically at month end or, you know, once a quarter that they would see these things. Now they can see it, up to date, every few hours. So not only is it a lot less work, but they're seeing it when they want to see it. Next slide.

So, partners: again, they were doing the dive-and-catch with a lot of Excel spreadsheets, and we were able to create an extract that focused just on partners, and they were able to create this. Again, this isn't IT; this is people in the business creating these charts. Next slide. Likewise with Legal; we're actually continuing to do a lot of work with them.
As soon as they recognized that we could do automated pulls through APIs into their back-end systems and pull the data right into Tableau, they got very excited about this, because again, they're not having to do the dive-and-catch. Next slide.

So there was a lot of really good feedback, but the one piece of feedback that I liked the most was that it's saving just this one department 25 hours a month. I talked about the dive-and-catch: there were a lot of people inside the company who were using Excel and having to do a lot of manual work in it, and once we were able to automate these pulls through the APIs using Alteryx, we would pull the data in, push it into Tableau, and then have the business create those dashboards, and boom, it was all automated. They could sit back and see exactly what they needed to see. So it's going from a manual to an automated world, but the real trick here is that we got the business to really drive and make this happen. So next slide, please.

So if I leave you with any kind of feeling today: the really interesting thing, I think, is that IT is not about developing these dashboards. We're about enabling people and making it as easy as possible for them to deploy these dashboards. We want to promote people doing things themselves, in a self-service way. We are basically trying to be very responsive to the data needs, and when they need that little hint or little suggestion on how to make the charts better, we're there, but largely we want them to be autonomous in creating those Tableau dashboards. We in IT own the infrastructure, we own the governance, and we own the security. Less sexy stuff, but without that stuff nothing else is going to happen. So with that, we'll go to the next slide and take any questions.

Thank you, Tim. That was definitely an excellent presentation and a great explanation of how the three platforms work together to further your analytics practice. And with that, I'm going to go ahead and jump into the questions that have come in. So we have one question here: is Alteryx being used by end users after they view Tableau reports? And if so, for what purpose? I.e., post-ETL analysis or data cleansing?

Oh, okay. So today I'd say probably 90% of all Alteryx workflows are IT-originated, in conversation with the business. In finance, and a little in product marketing, we have people who are interested in Alteryx, and we've created something called Alteryx-in-a-box, where we've created a virtual PC in AWS. We've set it up with Alteryx, it's all set up, and we've provided a copy of static data in that environment. And we've given them some basic training and said: go play with this, go see what you can actually do, and come to us if there's stuff you need help with. It's basically an experimental sandbox for the business to go look into. And if there are some good ideas that come out of that, we will take those and put them into production in the Alteryx Mavericks data pipeline. The real challenge here is that we're not looking for more workflows; we're looking to streamline the workflows we have. So if there's a workflow we have today that 20 people are happy with, and we can modify it ever so slightly so that 25 people can use that data workflow, that's much more interesting to us than creating a separate new workflow.

Perfect. Thank you. And how long did it take you to set up this architecture at Marketo? So it took probably a month to set up the architecture with two or three data feeds.
Salesforce.com was the very first one. Oanda, our exchange-rate source, was the second one. And Coupa, I think, was the third. So those three, pulling that data into Alteryx, into Redshift, and then into Tableau, that whole thing took about a month to set up. The further refinement of adding more and more workflows, adding more and more sources and all that, was another four or five months. And even today, for instance, Jive is a data source that we want to get into and start reading from and writing into. So this environment just continues to evolve; a new data source is being added every other month or so.

Thank you for that. And where in the process did you have to manage cleaning the data, i.e., any duplicates, invalid data ranges, data exceptions, et cetera?

Oh, wow. So this is the dark work that happens: if you do it perfectly, no one cares about it, and if you don't do it exactly right, everyone shakes their head. So this happens right up front, in Alteryx. Also, a huge part of this is understanding what the data really is. A lot of times someone will come back and say: well, hold on, this data is wrong, because I'm seeing nulls with business units. And you spend time, you go in and look at the data, and you understand why it's actually that way. It's really more about understanding the data at that stage. And does the data model need to change? Maybe, maybe not. So there's understanding the business model. Then there's cleaning up data, like, for instance, our leads data. There are no set drop-downs; it's all free-form. So we have macros that we've created that go in and do huge cleanups of countries, of addresses, of names, of email addresses. There's a lot of basic cleanup that happens in data like that. With our opportunity data and things of that nature, a lot of times we're not in the business of trying to change the way people use Salesforce.com or other applications, but we will create reports that show where attention needs to be applied to improve the quality. So it's more about shining a light on issues, as opposed to fixing them, when we're looking at reports.

Great. And has management noticed the difference between groups using the Mavericks and Alteryx-in-a-box solutions versus groups that are not adopting them?

Yeah, so pretty much everyone is adopting it across the company. People don't go asking for Mavericks; they go asking for Tableau, and when that starts to happen, they then start to ask: where is this data coming from, and can I rely on it? And when we say, oh yeah, well, finance has been using this for eight months and sales has been using this for three months, and this is the same data that they're reporting up to the VPs right now, there's a lot of confidence around that, and people are really looking at it as a way of making sure that they're getting the right numbers.

Thank you. And can you describe the reliability of this architecture? Any issues and maintenance?

So I'd say over the last 14 months this has been running, it's only had probably two hiccups, and each of those hiccups was about four hours. So there are reports on reports, and one of those reports is: when did this last get updated? When we originally started this whole process, we actually did a lot of the development on the production system. Big no-no when you're scaling out, right? So we hit a certain point where we said: okay, we need to create sandboxes. So there are sandboxes for Tableau.
There are sandboxes for Alteryx. And things get tested in these environments before they go into production. So on the architecture slide I showed you, there are actually some sandboxes that you could view as sitting behind the existing boxes we have today.

Thank you. I will wait for a response and see what the individual asking says. Another question we have is: can you provide an estimate of the volume of data you are analyzing?

Well, the key areas of data that we're looking at inside our company are opportunity data blended with user data, account data, exchange rates, and product data. That is in the vicinity of probably only around a terabyte of data that we're looking at and massaging each day. When we start to look at our leads and our accounts, we are looking at vastly more data. We're looking at hundreds of millions of rows, and these rows are incredibly wide; we're talking 700 fields. So that is a challenge. The approach there is to really understand what the most important fields are, and what the criteria are for how we're going to pull data and push it through our Mavericks data pipeline. When we look at other data sources, they're relatively small. Exchange rates are a very small piece of data that we look at only weekly. There are several others in there, with finance and all that; again, we're talking hundreds of megabytes in most cases. Usually nothing ever too big.

The one thing I think is interesting to call out is that we have just one server running all of this in Alteryx. It's got two parallel processes that can run at any period of time. We noticed that if we use APIs all the time... I'm getting a few remarks saying that folks are having trouble hearing you again; just wanted to jump in. Okay, apologies for that. Sorry. The one thing I think is worth noting is that when we looked at all of the Alteryx workflows that are running, the ones that take the longest to run are the ones doing these API calls. If anyone's done REST API development, you know you can typically only grab about a thousand records at a time. If you're grabbing a few million rows and you're doing it a thousand at a time, that's going to take a while. One of the things that has changed with many vendors out there is that they're recognizing that this is a limitation, and they are creating ODBC connectors, or database connectors, that we can go to and grab data from. Instead of taking an hour or so to read data, it can take seconds. So we're migrating some of our slower workflows from REST APIs to these ODBC connectors, taking that data and just pushing it into Redshift as quickly as we can, and then doing what we want to do with it once we get it there.
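As a sketch of that migration, with all connection names, buckets, and tables hypothetical, the difference looks roughly like this in Python: one set-based ODBC read replaces thousands of 1,000-row API pages, and the result is staged to S3 and bulk-loaded with Redshift's COPY command.

```python
# Sketch of the REST-to-ODBC migration Tim describes: one bulk database
# read instead of thousands of 1,000-row API pages. The DSN, bucket,
# table, and IAM role are hypothetical, and the target table must exist.
import boto3
import pandas as pd
import psycopg2
import pyodbc

# One set-based read replaces the hour of paged API calls.
src = pyodbc.connect("DSN=vendor_warehouse")      # hypothetical ODBC data source
df = pd.read_sql("SELECT * FROM leads", src)

# Stage to S3, then let Redshift bulk-load it; COPY from S3 is the
# fast path for large loads.
df.to_csv("/tmp/leads.csv", index=False)
boto3.client("s3").upload_file("/tmp/leads.csv", "demo-staging-bucket", "leads.csv")

rs = psycopg2.connect(
    host="demo-warehouse.abc123.us-east-1.redshift.amazonaws.com",  # hypothetical
    port=5439, dbname="analytics", user="admin", password="...",
)
with rs, rs.cursor() as cur:
    cur.execute(
        "COPY leads FROM 's3://demo-staging-bucket/leads.csv' "
        "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy' "  # hypothetical role
        "CSV IGNOREHEADER 1;"
    )
```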
Thank you. I will take one more question, since we are coming up on the top of the hour. How did you track metrics and show quantifiable metrics to stakeholders? For example, the number of data sets cleaned up and how it impacted the business line.

I'm going to cut that into two parts. One of the first things we did is we wanted to have a baseline on this whole environment. There is a set of Alteryx and Tableau dashboards that we've created that actually monitor all the data usage and all the Tableau usage. We've done that right from the beginning, so we have a really good understanding of who's logging in, who's viewing what, and which data is being viewed. Setting that up was really essential.

Then the aspect that's probably more what the user is asking here is: how do you actually quantify and measure this? You saw that one letter at the very end of my presentation that says they were able to save 25 hours a month. One of the things that I did is reach out to every business owner who was actually going out there and creating these, and ask them a set of questions: how many hours is this saving you? What are your wins? Asking that once is not enough. I've been watching and monitoring what they're using, and when I see something tick up and I see 10 more users in there, I go: what's going on there? Then I get into a conversation with the business, and they'll tell me: oh, we've just been able to get rid of this Excel spreadsheet, or, I've been able to do this type of planning which we couldn't do before. Great. I capture that, and I report it back in, so the CIO understands what's happening and there's a greater understanding of the business benefits of what we're rolling out. It's really about maintaining a relationship with the business, asking the right questions, pulling that back in, and then making people in the IT crowd aware of it.

Awesome. At this time, I'd like to extend a big, big, big thank you to Tim for presenting the Marketo analytics story, as well as to David and Todd for the great information on AWS and Tableau. As a next step, please visit us at alteryx.com to download your 14-day free trial of Alteryx Designer. With the kit, you'll get pre-built Alteryx workflows that output to Tableau, making it that much easier to get hands-on experience with the platform. A follow-up email containing this URL will also be sent out to all webinar attendees. With that, I'd like to thank you for attending today's presentation. Thank you, and enjoy the rest of your day.

Raman, thank you so much. Tim and Todd, thank you so much as well. Just a reminder that I'll be sending a follow-up email by end of day Thursday with links to the slides, links to the recording of the session, and anything else requested throughout the webinar. Thanks, everybody. Again, as Raman said, we'll get this information displaying right now as well as in the follow-up email. Again, likewise, thanks to our attendees for being so engaged in everything we do and asking so many great questions. We just love it. We hope to see you at the next webinar, and hope everyone has a great day. Thanks, all.