Hello, everyone, and happy new year on behalf of the Amplitude and Willow Tree teams. Welcome, and thank you for joining us for our virtual event today. Today, we're going to discuss the top 10 digital analytics mistakes, with the goal of helping you inform some decisions as you're moving through FY23 and guiding you along those decision-making processes. And we have some experts here with us. And with that, I will pass it off to our presenters to introduce themselves. Hi, everyone. My name is Adam Greco, and I am a product evangelist with Amplitude. I've been in the digital analytics space for about 20 years. I've been through 1,000 or more digital analytics implementations, so I'm really excited to be here today and share some of the common, or top 10, mistakes that we see organizations making. And I'll hand it off to Jeremy next. Great. Thanks, Adam. My name is Jeremy Stern. I'm Senior Director of Analytics and Optimization at Willow Tree. Willow Tree is a digital product consultancy that builds digital products for all sorts of companies out there, and my team and I help make sure they're measuring the information they need to measure the success of these products and optimize them over time. And I've only been in the industry about 10 years, so not quite up to Adam's level yet, but hopefully I still have some insights to provide. And yeah, we're just hoping, as we kick off the new year, that maybe you made New Year's resolutions to get your analytics house in order, or you're just using a slow first week back in the office as an excuse to really take stock of what you're doing, focus on your analytics program, and give it a little love after being distracted at the end of the holiday season. So with that, we'll get into the first item on the list, which is, at the highest level, having a lack of formal analytics ownership at your organization. There isn't one right place that analytics should live in every company. Maybe you have a dedicated analytics department, maybe it's a product team or a marketing team, it could even be IT. You can figure out what works best for your org, but ideally it's someone. Someone is where the buck stops for analytics at your company. If it's everyone's job, it's no one's job. And that can lead to a lot of the other issues we're gonna talk about today. Yeah, and that kind of dovetails into the second item that I wanna talk about, which is really a lack of departmental coordination. So as Jeremy said, at a lot of organizations we see that there's a lack of ownership around analytics. In many cases, we even see a little bit of a fight between different departments about who really holds it. And lately I've seen the biggest ownership battles have been between product teams and marketing teams. And just a little history on this: for many years the digital analytics function lived in the marketing department, because digital analytics actually started 20 years ago when it was all about seeing how your digital ads were performing on a website. But as things have gotten more sophisticated, we've started to see that product teams are using digital analytics to improve apps just as much, if not more so, than marketing teams are using it to look at their advertising. And so over the last couple of years, we've started to see a little bit of a shift of ownership of digital analytics from marketing to product. Now at some companies, as Jeremy mentioned, there might even be a third team, like a data team, that owns it.
But I think what's really important is that you have coordination, and you don't wanna have a wrestling match between the groups. And one of the biggest mistakes that we see organizations make is that one group decides we're gonna use an analytics tool on our website, but then another group might use a different analytics tool, say on a mobile app. And that's because these groups aren't really coordinating. And when that happens, you basically have two analytics tools at your company. And now you have different sources of truth, and it makes it really difficult to understand what numbers are right and what numbers are wrong. Another downside of not having coordination around analytics is that you might have marketing teams that are basically looking at acquisition, but they don't really look at the data related to engagement or retention. And then you might have the product teams that are looking at engagement and retention, but not really looking at acquisition. And in the most extreme cases, we've actually seen organizations that have competing KPIs or North Star metrics between different departments, and that really causes problems. Just to give you one example, when I worked at salesforce.com, I was in the marketing group and I did marketing digital analytics, and my North Star metric was cost per lead. But the product team managed our free trial experience, and they were looking at how many people went into a free trial and then actually became customers. Their North Star metric was trial-to-paid conversion. But what was interesting is there were many things we did in marketing that actually helped us get a better cost per lead, but they negatively impacted the product team's North Star metric of getting trialers. For example, I might get a lot of students who wanna learn about Salesforce or CRM who fill out forms, which makes it much better for cost per lead, but then those are unqualified people using our free trial, which hurts the product team. So we're starting to see that marketing and product teams, and all the different departments, have to start working together and figure out what are the metrics that they wanna focus on, where if the metric goes up, both teams, or multiple teams, are actually happy. So a lot of information there, but that's just all around departmental coordination. So the next one I'll talk about, kind of related to this, is what I call the incongruent analysis model. Now, when you're doing digital analytics, there are a couple of different approaches you could take to doing analysis. One approach is what I call the centralized model, which is anyone who wants a dashboard or report or analysis done submits a ticket, or they go to a certain centralized team at the organization, and that team does the analysis, does the reports, does the dashboards or whatever, and then gives it back to the requester. The other model is more of a self-service model, which is anyone can go in and use the digital analytics product, do their own reporting, do their own analysis, come up with their own insights and their own conclusions. And then some companies try to have kind of a hybrid or a mix between the two. But I've seen a lot of organizations where the culture of the organization is incongruent with the model that they've chosen.
So for example, product teams tend to really like the self-service model because they like to be hands on, roll up their sleeves and basically tag their own data, do analysis on their own data, come up with their own conclusions. In many cases, marketers or large, Fortune 500-type companies are a little bit nervous about giving everyone access to data, or they just can't train everyone effectively, so they gravitate more towards a centralized model. But if you're a startup, a centralized model may not work, because it could actually add extra barriers, lengthen how long things take, and put people further away from the data they want. So the advice here is to really take stock of your organization and determine whether you're more of a centralized approach or more of a self-service approach, and make sure that fits your culture and your people. Because if you have people who don't actually want to do analysis or use analytics every day, then you might want to go towards a centralized model. But if you have people who do, then you might want to go towards self-service. And I think this is probably true of a lot of the points we make, but the right model for your company probably will change over time, right? So this might be something where, as a startup, you have one model and it works well for you. And as your organization grows, and as your maturity in using analysis and data grows, it might be worth taking stock and revisiting and wondering, is it time for us to centralize? So even if you went through this process internally a couple of years ago, it can't hurt to revisit. Yeah, and one last thing I'll say is that if you are at a very large organization, it could actually be that you have both. There could be some teams that really want a centralized model and some teams that want a decentralized or self-service model. So you just have to know your organization and know your teams. So the next one I'll dive into, and then have Jeremy chime in on, is the lack of strategic business requirements. And this is probably the most common culprit across all types of organizations. In the digital analytics area, we tend to see that many organizations love to collect data, but they don't often take the time to identify why are we collecting data? What data should we collect? What are the business requirements or business questions that we're hoping to answer? I jokingly call this the ready, fire, aim approach. You're basically doing all of the work, but you're not actually sure what you should be doing. Prior to joining Amplitude, I was a consultant in analytics for about 10 years, and my full-time job was going into organizations that had kind of messed up digital analytics programs. I would analyze all of the data they were collecting. And most, I'd say about 95%, of the time when I asked them, can you show me your business requirements around analytics, the questions that you hope to answer, they either didn't have them, or if they had them, they were five years old and never kept updated. And so I would actually go through all of the data that they were currently collecting, and I would do this process that I called reverse engineering, where I would turn each data point into the form of a question. I used to joke that it was digital analytics Jeopardy. And then I would show them the list of the business questions that could be answered based on the data they're collecting.
And I kid you not, 50% or more of the time when I showed them the business questions, they would say, hey, half of these questions I don't care about, why are you saying that we need to answer these questions? And I would go back to them and say, listen, I'm not saying you need to answer these questions. These are the questions that you can answer based on the data you're collecting. And the ironic thing was that a lot of times if I went to the same organization and said, hey, I think you should get rid of half of the data you're collecting, this seems pointless, they'd be like, no, we can't get rid of that data. But once I turned that data into questions and said, do you care about these questions, they'd say, no, I don't care about any of those questions. So I would then go back to them and say, okay, so therefore I can get rid of any data that is only there to answer these questions; if the only reason that data exists is because of this question, I can get rid of that data. And then they would say, no, can't get rid of the data. So it was kind of a weird phenomenon, but it is really important to start with the end in mind. Yeah, just last week we had a call with a new client, and they had a lot of events they were tracking, and we were asking similar questions about the why, and they said, we get so many ad hoc requests from different teams, there's no way we could keep up and actually document why all these events are getting tracked. It's all about speed and just making sure we get them in there. And so we pushed back and said, well, that makes it hard to audit for a lot of different reasons when you have such a piecemeal implementation, and they said, well, it's not piecemeal, it's very strategic. And I think that framing, the analytics Jeopardy, is a really good way to showcase that for folks, so I think I'm gonna steal that idea. And kind of related to that, when you have these longer implementations that build up over time, over years, over different stakeholders, an easy mistake to make is to never revisit your tracking requirements and your business requirements. Ideally, we're always treating analytics as an iterative process. Over time, your business questions are going to change. One of those questions you uncovered might have been extremely important to the business two years ago, but they got an answer, or maybe the product shifted and that answer doesn't matter anymore. Stakeholders change, and something that had a very high priority for one executive might not matter to their replacement. And so you just don't wanna end up with tons and tons of events and properties that just don't matter. Setting some sort of recurring cadence for your organization to look back over everything, ideally check events against those business questions, see if those business questions still matter, and cull the things that don't makes for a cleaner implementation and makes it easier to onboard new folks to your team. It also makes it easier to identify when an issue comes up; with these sprawling legacy implementations, it just becomes harder for everyone to do their job at every step of the process. And if you are really disciplined and keep a list of the business questions in some sort of spreadsheet, then what's really nice is that you can revisit that every six months and knock off questions that aren't important anymore. And if there's data that is only there for that business question, you can get rid of that data.
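To make that reverse-engineering exercise a little more concrete, here is a minimal sketch, in TypeScript, of the kind of question-to-event registry the speakers describe. The event names, question IDs, and filter logic are hypothetical illustrations, not a real tracking plan or anything specific to Amplitude; the same idea works just as well as columns in a spreadsheet.

```typescript
// A minimal sketch of the "digital analytics Jeopardy" audit described above.
// Event names and business questions are hypothetical examples.

interface BusinessQuestion {
  id: string;
  question: string;
  stillRelevant: boolean; // revisit this flag every review cycle
}

interface TrackedEvent {
  name: string;
  supportsQuestions: string[]; // ids of the questions this event helps answer
}

const questions: BusinessQuestion[] = [
  { id: "Q1", question: "Where do trial signups drop off?", stillRelevant: true },
  { id: "Q2", question: "Which campaigns drive repeat purchases?", stillRelevant: false },
];

const events: TrackedEvent[] = [
  { name: "Trial Step Completed", supportsQuestions: ["Q1"] },
  { name: "Coupon Banner Viewed", supportsQuestions: ["Q2"] },
  { name: "Legacy Widget Clicked", supportsQuestions: [] }, // answers nothing anyone asked
];

// Candidates for removal: events tied only to retired questions, or to no question at all.
const removalCandidates = events.filter((e) =>
  e.supportsQuestions.every(
    (id) => !questions.some((q) => q.id === id && q.stillRelevant)
  )
);

console.log(removalCandidates.map((e) => e.name));
// -> ["Coupon Banner Viewed", "Legacy Widget Clicked"]
```

Run on whatever recurring cadence you choose, a check like this surfaces the data that exists only to answer questions nobody asks anymore.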
And I also recommend that if you do make a list of your business requirements or business questions, you also add a priority next to them. One of the things that I like to do is look at the top-priority business questions and then map those to the data elements that support them. Those are the data elements where I wanna really, really make sure I have good data quality, and we'll get to data quality in a little bit as well. And as you're doing tracking, you might know from the start that this data is only gonna be relevant to us for the next six months. Whether it's a one-off event or a certain portion of your product that you know is only going to be relevant for this calendar year or for the summer months, once you've learned what you need, once that feature is retired or that section of your website is removed from your nav, maybe you can remove some of those events. And if you say ahead of time, we only need this through September 30th, that makes it easier to set a reminder for yourself: okay, when our engineers have time, maybe we can remove some of this tracking. Yep, exactly. And that kind of segues into our next common mistake, which is general implementation missteps. And there are a couple of key things that we see people get wrong in implementations. We touched on this a little bit: it's over-implementing and just having way too many events and properties. And people think, hey, data is cheap nowadays, you know, AWS, you know, server volumes aren't that expensive and so on. But what people don't think about is that every data point you add to your analytics implementation actually does have a cost, because you have to identify where the data is coming from, you've got to do tagging or, you know, some sort of coding through an SDK or JavaScript, you then have to figure out the data quality, you have to teach people what it means, you have to figure out which reports it should be used in, and so on. And so a lot of people don't realize how expensive every data point is. And I like to joke that in many cases, if you're spending X number of dollars a year on digital analytics between the team and the product, and you divide that by the number of data points you're collecting, you could almost jokingly see how much it costs per data point. But I think having too many data points in your implementation makes it harder for users even when they're just building a report; they're like, hey, there are 50 properties here, and if there were only 10, it would be much easier for them to understand. In many cases, I think that less is more when it comes to digital analytics implementations. I'd rather have an implementation with maybe, you know, 15 events and 20 properties that are rock solid, really important, and answer really good business questions, versus just tracking everything and having all of this data that you're not sure anyone is using or not. And I think having the business requirements that we talked about previously is a good guard against over-implementing, because it makes sure that you're not implementing any data that doesn't tie to a prioritized business question. Now, related to having too many events and properties is a concept that I always like to bring up, which is what I call the over-reliance on auto tagging. Now, there are different schools of thought around auto tagging. At Amplitude, as a product analytics vendor,
we generally are not huge fans of auto tagging. There are some cases where it does make sense, but auto tagging is where you basically just open up the floodgates and walk the DOM, or use technology to track every single data point that could possibly be clicked, and so on. In general, we find that this causes more problems than it solves. It definitely is easier to implement, because you don't have to be prescriptive and, you know, specifically say, I wanna track this event or this property. So the prescriptive approach sometimes takes a little more time upfront, but when you do auto tracking, you end up with just a lot of information that some administrator then has to go in and label. And then if you make changes to a website or an app, it possibly could break some of the auto tracking, and then you have to fix the data. And a lot of times people start to lose faith in the digital analytics implementation. There are vendors that we even work with that do auto tracking or auto tagging, for when you absolutely can't get development resources. But in general, if you are gonna do auto tagging, I recommend that you do that and then eventually find the things that are really important to your implementation and make those more prescriptive, because you'll end up with better data quality than you will with auto tagging. So those are two of the implementation missteps that I've seen, and I'll hand it off to Jeremy. Yeah, and one other point on potential risks of over-tagging, which could be a whole additional webinar, is the privacy implications. As, fortunately, the industry starts to focus more and more on good data hygiene and good privacy practices, it's good to be able to explain why you're collecting pieces of information about your users. And when you're just hoovering up everything possible, it becomes a lot harder to answer those questions when maybe legal comes and wants to know why you are collecting these five pieces of information about every user. Ideally, you know exactly why, it's tied to your specific business question, and everyone can leave that meeting happy, as opposed to having to go do a deep dive for the next three days to unearth everything that you auto tagged. And the last misstep we'll talk about, going all the way back to Adam's point about a lack of departmental coordination, is when different departments or teams within your organization aren't aligned on the taxonomy or on definitions of metrics. So this could be when your web team and your native team aren't talking to each other as they write their taxonomies, or when you have similar flows across a handful of different products in your portfolio. A big issue is that it impacts the ease of understanding of your data if someone has to learn net new what conversion rate means on the web versus on the app. It makes it harder to onboard new folks. It makes it harder to sometimes do valuable analyses about how your web and app users are different, or how users of different products in your portfolio behave differently in different situations. And so, again, everyone talking to each other and making everything as clean and as aligned as possible makes everyone's job easier.
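As one illustration of what staying aligned on a taxonomy can look like in practice, here is a minimal sketch of a shared constants module that both a web and a mobile codebase could import, so event and property names can't silently drift apart. The names, the track wrapper, and the step-one convention are hypothetical examples, not a prescribed standard.

```typescript
// One possible guard against diverging taxonomies: a single shared module that both
// the web and mobile codebases import, so event and property names stay identical.
// All names here are hypothetical examples.

export const EVENTS = {
  REGISTRATION_STARTED: "Registration Started",
  REGISTRATION_COMPLETED: "Registration Completed",
  CHECKOUT_COMPLETED: "Checkout Completed",
} as const;

export const PROPERTIES = {
  PLATFORM: "platform",                   // "web" | "ios" | "android"
  REGISTRATION_STEP: "registration_step", // both teams agree: step 1 is the first screen shown
  CURRENCY: "currency",                   // ISO 4217 code, e.g. "USD"
} as const;

// Stub tracker; in a real codebase this would wrap whatever analytics SDK you use.
export function track(name: string, props: Record<string, string | number>): void {
  console.log("track", name, props);
}

// Example call site, identical on web and native because both pull from the same constants.
track(EVENTS.REGISTRATION_STARTED, {
  [PROPERTIES.PLATFORM]: "web",
  [PROPERTIES.REGISTRATION_STEP]: 1,
});
```

The point is less the code than the agreement: both teams pull names and definitions from one place instead of re-inventing them.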
Yeah, and one horror story that this reminds me of: I was working with one of the largest retailers in the world, I believe a Fortune 50 company, and they had implemented on their website a full digital analytics implementation with all the events and properties that they wanted. And then a couple of years later, a different group implemented analytics on their mobile app once they created one. And what was interesting is about 70% of the events and properties they were collecting were the same, but the mobile app team just went off in their own world and never even spoke to the web team for some reason. So they tracked basically 70 to 80% of the same data, but they put it in different variable slots with different variable names, which meant that it couldn't be rolled up and you couldn't see aggregated data. And I got brought in as a consultant, and I literally had to make a spreadsheet that said, okay, in this project it's called this, in that project it's called that, and they oftentimes had similar but not exactly the same names, so I had to verify, is this really the same thing? And then the mobile app team and the web team had to decide which team was gonna basically lose all their historical data and have to readjust so that they were using the same events and properties on both web and the native app, which was just unbelievable. You'd think, why would anyone ever do that? But it does happen out there. And I ran into a similar situation. We had a client who had a very lengthy registration process. It was, let's say, 10 pages or screens long, and the completion rate across web and native was very different, in ways that didn't really align with what we had learned from talking to users or from the other pieces of the data. When higher-level executives saw a rolled-up conversion rate across those two, the numbers did not make sense. And we dove into it, asked questions, and figured out that one team considered the flow to start on screen one. The other team considered the flow to start on screen four, because they're like, well, it doesn't really count until we get this piece of information from them. And so at the end, everyone just saw a conversion rate. Everyone just assumed these teams had talked to each other and must be using the same definition. And of course, the numbers didn't align; they were measuring completely different things. And it took someone coming from the outside and asking extremely detailed questions, which they honestly thought were a waste of time because they're like, we know what we're tracking, we know what this definition is, to unearth those differences. So, a recurring theme: teams need to talk to each other. Yeah, and then the next mistake is analytics not being embedded in your overall processes. The biggest issue I see here is waiting until the very last minute in your release cycle to implement analytics. In an ideal world, from our perspective, as you develop a new feature, as you build a new screen or a new page, you add on the analytics for that page, you add on whatever interactions and whatever new user properties you want to learn, as that page is being developed, and you test those as you are testing out that page. And then ideally, when that page is finished being built, when everyone's confident in all the data and everything's tested, that all can be merged into your main project and we can treat that feature as fully feature complete, including the data. What we see a lot of the time is that teams push analytics to the end.
They say, in the final sprint we'll go back and we'll track everything that we've built for the past eight months. And to be honest, I think that's more likely to lead to errors, because going through now and tracking every single relevant button, every single relevant interaction all at once is a little more tedious than doing it as you're building that thing, when you're most familiar with how that flow works. It makes it a lot more tedious to test, because the analytics isn't tested as part of all the edge cases of that specific flow; you're just looking at it kind of in a vacuum. And the biggest risk of pushing analytics to the end is that when you start bumping up against a deadline, analytics is gonna be the first thing that gets cut, because we need to ship. We can't ship if the checkout isn't working. We'll figure out how to measure the checkout later. And that might make your PM happy for that last sprint because you get the release out, but then inevitably the first question you will get from some executive is, how's it doing? And then you can't answer that question. So the easiest way to make sure you aren't in that situation is to make sure that you already have analytics in the app three months, four months before it goes live. Yeah, and this happened to me over and over again when I was at Salesforce, everything you just talked about. We did agile sprint methodology, and we basically changed it to make sure that there was a swim lane for analytics, because it had to be involved at the beginning of the project, not just at the end. And I was kind of a jerk at times, because there would be some teams that would wait till the last minute, as you talked about, and it would basically get cut out. And at that point, they would say, well, after we release, can we go back and put the analytics tagging in? And I would basically say no. I'd be like, listen, if it's not part of the release, then you don't get it right away. So when someone asks, how is it doing, how is the new redesign doing versus the old, I don't want them coming to me and saying, why is there no data here? I'm gonna refer them to you. And I kind of had to almost punish people a little bit. It got to the point where some executive would say, why do we not have the data? And I'd point them to the person who cut it out at the end, and then that person kind of learned their lesson, and we didn't have as many of those problems. But it is really frustrating. And one extreme case that I had at Salesforce is we had someone who, in the mad panic right before the thing released, decided to try to put the analytics in, and the actual tagging of analytics in that case actually broke the experience. And then everyone was mad at our team, the analytics team, and we basically were like, hey, they were the ones who decided not to do this the right way and put it in at the last minute. That definitely increases risk, because a little bit of code here and there can break experiences. So definitely agree with all that. The next tip we want to mention is what we call using the wrong tool for the job. Now, there are a lot of different technologies out there, as many of you have seen in the Brinker MarTech stack slides with their 5,000 or 10,000 tools. And I think there are a lot of people out there, especially product teams, who get a little bit confused about what is the right tool for the job when it comes to analytics.
So the first mistake we see a lot of companies making is that they're using what we call a traditional marketing analytics product, like Adobe or Google Analytics, to try to do product analytics. Product teams have different needs than marketers. Marketers are very focused on acquisition: how did someone get to my app or my website? They don't go super deep on engagement and retention, on tracking features and buttons, or on all the types of reports that product analytics vendors have really built up over the last 10 years. Now, you can use marketing analytics products, but in many cases trying to do product analytics with them is the equivalent of trying to put a square peg in a round hole. And so I've given many talks over the years about why you need to use the right kind of analytics product for the right job. One of the things we're doing at Amplitude right now is trying to make it so that our product can do both marketing and product analytics so you don't have this issue; you could use one product for both, and that's just a big focus for us. But another problem that I've seen is that a lot of organizations and product teams get really enamored with business intelligence tools like Looker and Tableau and Power BI. And they basically say, well, what if we just collect all the data we need, shove it in a warehouse, and then use a BI tool on top of it to see how everything is doing? Now, the problem with that approach is that BI tools are really good at high-level summary information, reports and dashboards, but they don't go super deep. You're not gonna be able to do a conversion funnel. You're not gonna be able to de-dupe unique visitors very easily. You're not gonna be able to look at flow diagrams, build really cool segments, or hold different attributes constant like you could with SQL. So there's a whole bevy of reports that product analytics vendors have perfected over the last 10, 12 years that you're gonna miss out on. And when I see companies that try to get by just saying, we're gonna have a data warehouse and a BI tool, they oftentimes can get some of the information they need, but what they don't realize is the opportunity cost of all of the deep dives into the data that they're missing out on by not having a true product analytics product. So if your organization is either using a marketing analytics tool or a BI tool, you might wanna have some conversations about whether this is the right tool for what we're trying to do, or at least explore to see what are the things that you can do in product analytics that you can't do in marketing analytics or BI. And honestly, it's those same things you miss out on by using a BI tool instead of a product analytics tool. I've also seen a few cases of companies trying to use performance monitoring or crash reporting tools as their analytics tool. I think having good crash reporting is very important. Crash analytics, New Relic, all those sorts of tools are valuable, but they don't replace product analytics. That's not what they're designed for, and you will very quickly run into gaps. You can get most of the top-line vanity metrics you might need. You'll be able to see total visits, maybe some sort of user-level identity resolution, but as soon as you wanna start answering complicated questions about segments and about flows, it's gonna fall apart. We had one client whose procurement team told them, we have one analytics tool.
It was a performance monitoring tool. And so we tried to figure out how we could make that tool work for them. We asked, hey, can the tool do flow reporting? And they said, not in aggregate. If you look up any individual user, you can find that user's flow. And that's it. There's no way to even add two users together, let alone find any sort of patterns. And again, these tools all have purposes. They work a lot better when you use them for what that initial purpose is. Yeah, and one thing I'll add to that is we also see companies who say, you know what, we're really good at SQL and we could do everything we need with SQL. We actually had a little demo at our company where we basically had our best SQL person try to answer a question like, how many people came to this funnel, went to this step, then this step, and when they came to this step, they had to have done this. And I think it took like 15 minutes for someone to write the SQL to do that. And then we showed them in Amplitude how they could do that in about 30 seconds. So sometimes the answer is, yes, you can find a way to do it in SQL, but how many people at your company can do it in SQL, and how much time are they gonna spend doing that versus using something that was purpose-built for it? Yeah, and then another issue, in terms of when well-intentioned product teams bump up against their procurement organizations, is that a lot of teams can fall for the mistake of assuming that analytics are free. These tools are free, you'll never have to pay for these. They grew up using free Google Analytics for the past five years, and folks far enough away from the actual data just go, Google Analytics is free, why would you want us to pay for Amplitude? Why would you want us to pay for anything? This is one of the few tools that we have that's free. And they don't realize that, while maybe that worked when they were at a much smaller scale, now they're bumping up against limitations in terms of numbers of hits or numbers of users in the tool, or just in terms of overall functionality; you're hitting walls in the type of analysis you can run and the level of complexity of the questions that you can answer. And it might be a large hurdle to get over initially, just the mental shift that this is something that's going to be very valuable for the organization. We'll come back to this later, but for something that is going to drive profit, it's worth making an investment in the tool so you can actually use the data in the ways it's meant to be used. Another issue, and this is a little bit of a pivot from the analytics tooling itself to the ways people use dashboards: it's important to think, when you're building a dashboard on top of this data, what is that dashboard for? We've run into clients where, for every possible business question they've thought of, every question in the kind of list Adam described coming up with, they want the answers to all of those questions in one dashboard that maybe scrolls for a couple minutes to get to the bottom, or is 20 paginated pages. And at that point you lose the value of a dashboard: something you can open up real quick, get a health check, see how things are going, and move on with your day, or glance at quickly, notice an anomaly, and then go back to your data to dig in further. So just think through: what are dashboards for? Are they for doing one-off deep-dive analyses? Ideally not.
And so figure out what are the things you want to be able to see at a glance on a recurring basis, versus what are the things that, when you need to know them, you can go into the data and find. And then finally, sometimes product analytics overall isn't the right tool for your job. I think we can all fall into the trap of, this data is so powerful, any question we have, we will find an answer in the data. But quantitative analytics will tell us what users are doing; it's not designed to tell us why. And so you might see you have a lot of drop-off on the third page of your site. You might see a new redesign negatively impacted conversions, and you aren't sure why. So then figure out what tool you need to actually get hypotheses as to why folks are dropping off. Maybe it's a session recording tool. Maybe it's pulling in user researchers to actually go talk to the users, do surveys, do interviews, do focus groups, and actually say, hey, we've noticed people having trouble with this one page, can you tell us how you're experiencing it? But just don't think that when you have a hammer, everything's a nail. No matter how good your data is, sometimes you have to go talk to real people to actually figure out what your users think and need. Yeah, and I think that's really important. Just going back to some of my days as an analyst, one story I could tell you is when I was at Salesforce, we had, I think, about a 17% conversion rate from someone looking at a form, which was a pretty big one on our site, to completing it. And one of the things we couldn't figure out is why they were not completing it. We had all sorts of theories about, oh, well, we're asking for too much personal information and so on. And eventually, if someone went to the form and then navigated away, for 10% of them we would pop up a quick little survey that said, here are the top five reasons we think you may not have filled out this form, and then there was an "other" option. And we asked our executives to tell us their opinion of why they thought people weren't filling out the form. And every single one of our executives was wrong. They all thought it was because we were asking for a phone number and an email address, and people weren't comfortable sharing that. But the number one answer that we got, which actually started as many people writing it in under "other" before we added it as its own radio button option, was that they didn't know enough about the product yet to fill out the form. Like, they weren't educated enough. And so that was a huge learning for us. We actually added some videos and some information on the form so that they could, while they're right there, learn a lot more about our products. But sometimes the data won't tell you that why. And the other thing, and I'm excited to mention this, is one of the things that we're doing at Amplitude right now: we're actually integrating session replay right into our product. So we have partnerships with a couple of session replay vendors, so you can go into Amplitude and find out where the problem exists, and then you can actually go watch just the people who have that problem through this partnership. So really excited about that. And I totally agree that sometimes you've got to mix the yin and the yang, the quantitative and the qualitative; you can be the best analyst in the world, and you still can't understand that why until you bring the qualitative into the quantitative.
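For readers who want to picture the sampled exit survey described in that story, here is a rough sketch, assuming a plain web page and a hypothetical showSurvey helper. The 10% sample rate and the answer options mirror the anecdote above; how you detect someone leaving the form will vary by site, and none of this is a specific Amplitude or Salesforce implementation.

```typescript
// A rough sketch of a sampled exit survey: roughly 10% of visitors who leave the
// form without submitting see a short "why not?" prompt. showSurvey() is a stub;
// a real implementation would render a small modal or use a survey vendor's SDK.

const SURVEY_SAMPLE_RATE = 0.1;

const REASONS = [
  "I don't know enough about the product yet",
  "You ask for too much personal information",
  "I was just browsing",
  "The form is too long",
  "Other",
];

function showSurvey(question: string, options: string[]): void {
  console.log(question, options);
}

let formSubmitted = false;
document.querySelector("#lead-form")?.addEventListener("submit", () => {
  formSubmitted = true;
});

// When a visitor clicks away to another page without submitting, sample them in.
document.querySelectorAll("a.site-nav").forEach((link) =>
  link.addEventListener("click", () => {
    if (!formSubmitted && Math.random() < SURVEY_SAMPLE_RATE) {
      showSurvey("What kept you from completing this form?", REASONS);
    }
  })
);
```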
So really glad you brought that up. So the next one we'll talk about is data quality. We alluded to this a little bit earlier, and there's not too much to say about it, because everyone agrees that they should have really good data quality when it comes to analytics. But one of the mistakes, or tips, I'll give you: I see a lot of organizations that, when they first do their implementation, spend a lot of time verifying that all the data is correct. Then the minute they get sign-off and the analytics implementation is done, they kind of forget about checking the data quality. They don't periodically spot check it, and as they add new things to the implementation, they're often not as stringent or strict as they were during the initial implementation. So when people use a digital analytics product, whether it's a marketing team or a product team, I call it the three strikes rule: if they look at the data and it looks bad, or maybe data just stops and then starts again later, or there are gaps in the data, and that happens more than once, by the third time it happens people basically lose faith in the analytics implementation and the analytics team, and they just stop using the data. So this is one of those things; it's like they always say about your reputation: it takes a lifetime to build, but it takes a second to lose. You need to really focus on data quality and make sure that you have either people or a process in place to periodically verify that the data is accurate. And one of the things that people use in our product at Amplitude is a way to do root cause analysis, or anomaly detection, so we can alert you if there are cases where data looks like it's way beyond the usual standard deviations. So make sure that you take the time to check those out and see if that's actually something that's really happening or if there might be a tagging issue. But I don't know, Jeremy, any other kind of data quality things from your perspective? Yeah, just one very specific, zoomed-in example, but hopefully people can extrapolate from it. One thing I've seen is when you do an initial release of a site and it's English only, and then later you go back and localize. We inherited a site from another agency, and when we looked at their analytics, let's say the first 10% of their page views had English names, the next 10% had French names, and the next 10% had Chinese names. Those were the same sets of pages, but they were just dynamically pulling in the page title. And unless you had a very well-studied analyst who knew a whole lot of languages, it was very hard to aggregate performance across all those different languages. And that was not an intentional plan; they didn't have an intentional plan. They just set up their initial tracking, did a lot of testing of it, and then at some point just localized everything and assumed it would still keep working. And it did; that data wasn't inaccurate, it just was increasingly hard for them to use in the way they were hoping to. So things like that; just taking a second to go back and look at your data makes it easier to catch when that's happening. They had built reports on the English-language names, so a lot of those reports just didn't include any of the data that wasn't in English, and it took them a while to catch it. Yeah, cool.
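To make the "way beyond the standard deviations" idea concrete, here is a minimal sketch of the sort of spot check described: flag an event whose daily count falls far outside its recent range. The counts and the threshold are illustrative; in practice you would pull the history from your analytics tool, or lean on its built-in anomaly detection, rather than hand-rolling this.

```typescript
// A minimal data-quality spot check: flag any event whose daily count is far
// outside its recent range. Counts and the z-score threshold are illustrative.

function isAnomalous(history: number[], today: number, zThreshold = 3): boolean {
  const mean = history.reduce((a, b) => a + b, 0) / history.length;
  const variance =
    history.reduce((sum, x) => sum + (x - mean) ** 2, 0) / history.length;
  const stdDev = Math.sqrt(variance);
  // A perfectly flat history: only flag if today differs at all.
  if (stdDev === 0) return today !== mean;
  return Math.abs(today - mean) / stdDev > zThreshold;
}

// Example: "Checkout Completed" usually fires ~1,000 times a day; today it's 12.
const last14Days = [980, 1010, 995, 1020, 990, 1005, 1015, 970, 1000, 1025, 985, 1010, 995, 1000];
console.log(isAnomalous(last14Days, 12)); // true -> investigate: real drop, or broken tagging?
```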
Okay, so kind of the last couple here. One I like to talk about is what I call the failure to document, and take action on, aha moments. So if you think about it, the whole reason you're using digital analytics, whether, let's say, you wanna make your app better or make your product better, is that you're not just going in there to do reports and dashboards for their own sake; you're doing it because you want to learn something. You wanna have the data speak to you and tell you, hey, here's an insight, here's something that you learned. Now, product teams are very familiar with the aha moment: the aha moment is when someone's using a product and it just clicks with them why this product is awesome and they can't live without it. But we often don't take the time to do that in digital analytics. Every time you're using a digital analytics product, you're always querying and segmenting, and a lot of times you don't realize it, but you're actually learning a lot. But a lot of companies don't document this, they don't share it with their colleagues, they don't say, hey, I just learned five cool things, and they don't communicate that with others, which means that other people have to relearn those things, or they just get lost. Now, in Amplitude we've tried to make this a little bit easier by having things like notebooks, where you can add free-form text, talk about it, and embed learnings right there. But there's definitely more work that analytics teams need to do, and one piece of advice I would give all of you is to come up with a way, whether that's a spreadsheet or Confluence or something, where if you learn something really interesting, if you have an aha moment, you document it and find a way to share it with other folks. And if you think about it, even if you come up with 50 aha moments, you should be able to go back to each one of those, because they don't do you any good unless you actually take action on them. So for example, if you're doing a conversion funnel and you realize, I'll make this up, that first-time visitors convert way less than return visitors, and you can see that consistently, that's a good learning. But it doesn't do you any good if you're not saying, okay, well, how are we going to change this so that first-time visitors are converting at the same rate as returning visitors? What is the difference? What is it that we need to do for first-time visitors to get them to convert at the same kind of pace as return visitors? And once you're documenting aha moments, the cool thing you can do is say, listen, every month or every quarter I want the whole team to make a list of the key things they've learned, and then let's look at each of these and make a goal of deciding which ones we should take action on, which ones we should do. Because once you take action on it, whether that's an A/B test or changing a feature or adding content, then you can actually measure the result. And you know that old expression: if a tree falls in a forest but no one's around to hear it, does it make a sound? Well, it's the same thing with digital analytics: if you learn something new but you don't share it with anyone and you don't take any action on it, did you actually ever learn anything, or did it have any impact on your organization?
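One lightweight way to keep the kind of aha-moment log described above is sketched here, purely for illustration; the fields and the example entry are hypothetical, and a spreadsheet or Confluence page with the same columns works just as well.

```typescript
// One possible shape for an "aha moment" log. Fields and the example are hypothetical.

interface Insight {
  date: string;
  learning: string;         // the aha moment itself
  source: string;           // the chart, funnel, or analysis it came from
  actionTaken?: string;     // A/B test, feature change, content added...
  measuredImpact?: string;  // quantify it, so analytics reads as a profit center
}

const insightLog: Insight[] = [
  {
    date: "2023-01-05",
    learning: "First-time visitors convert at half the rate of return visitors",
    source: "Signup funnel segmented by new vs. returning",
    actionTaken: "A/B test: added explainer video to step 1 for new visitors",
    measuredImpact: "To be measured once the test reaches significance",
  },
];

console.log(insightLog.length, "documented learnings this quarter");
```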
And very few companies do this. I can go to any product team and say, hey, can you show me a list of maybe the last 50 things that your team has learned by using data, you know, through your digital analytics product, and usually I don't get anything. They're like, well, you know, Joe learned this and Joe learned that, but it's all in their heads. So find a way to document that and share that. And I think that leads to my next point, which is that you have to treat digital analytics as a profit center instead of a cost center. Most organizations are treating it as a cost center. It's like, hey, we spend money on the analytics team, we spend it on digital analytics vendors like Amplitude, and it's just an expense, just something we have to do. But that's a terrible way to think about it, because the whole point is that you're collecting data so that you can learn things, have these aha moments, then turn those aha moments into actionable things you're gonna do, test new things, try new things, and hopefully those actions can then be quantified. And this is the big learning here: most product teams aren't quantifying what they're gaining by taking action on their insights. But if you quantify that, then you could say, you know what, we spent X number of dollars on our team and Amplitude this year, but we saw a return of this much. And if you do that, then you'll never have to worry about justifying your investment in your analytics team or your renewal of an analytics product. And if you need additional tools or you wanna increase the headcount of analysts, you'll always be able to do that, because you're showing a return on investment. But treating analytics as a cost center is just a shame, because there's so much money in that data; you just have to go from aha moment to action to ROI. And that's something that very few product teams are doing, but I highly recommend that you do it. So that was our top 10 list. Hopefully some of those, if not all, were helpful for you as you think about 2023. Jeremy, anything else you wanna chime in with before we hand it back over to Gretchen? No, I think that was a great ending point. Just hopefully take what you learned today, put it into practice, and prove your value to all the people you need to. Awesome, Gretchen, I'll hand it back over to you. Thank you so much, Adam and Jeremy. That concludes our presentation. We hope you found today's conversation helpful. As we close, I wanna thank everyone for your interest in learning from these speakers. Have a great rest of the day, and we look forward to connecting with you again soon.