All right, thank you guys for letting us know where you're zooming in from and one thing that you're grateful for. Today's webinar is sponsored by AWS, Amazon Web Services, and today we're going to be talking to the experts about how to utilize analytics to better understand your donors. My name is Aretha Simons. I'm the webinar producer here at TechSoup. A quick word on how you can engage today: we would love for you to put your questions in the Q&A section. If you need closed captions, go ahead and tap the little CC at the bottom of your Zoom screen. This is being recorded, and you'll get the recording and the slides by tomorrow. Yes, by tomorrow. And with that, I'm going to turn this over to Mike Young and Angela. Have a great webinar, and thank you for being here. Aretha, thank you so much. Hey everyone, thanks for joining today's AWS and TechSoup Ask the Experts webinar. Today we're going to be discussing something that is near and dear to all of our hearts, especially this time of year: how to utilize analytics to better understand your donors. My name is Mike. I'm with the AWS programs team, and our goal is to bring programs to life that benefit nonprofits like yours. The goal of this Ask the Experts series is to provide practical guidance from our technical experts, so Angela will incorporate live how-to demos to give you real-world examples of how you can bring these technologies into your own organizations and make them your own. AWS was founded over 15 years ago, and some of our earliest customers were in fact nonprofits: organizations like the Red Cross, Code.org, and PBS, all of whom are still using AWS to power their innovation today. In just a minute, Angela, who is an AWS solutions architect, is going to hop on, or go off mute, I guess.
But first, I just want to get a sense of who is actually in the room today. So before I hand things over, a quick poll in the chat: put a one in the chat if you currently have, or have at some time in the past received, AWS credits through TechSoup. Okay, thanks. Now, whether you're a current user or not, put a two in the chat if you are completely new to AWS. Awesome. Okay, so it looks like we have attendees from both camps, which is great to see. We love chatting with old friends as well as making new ones. And I know somebody asked to talk a little bit more about the credits, so let's do that now. For those who are newer to the program, AWS works directly with TechSoup to provide nonprofit organizations with funding to support AWS cloud usage. Basically, these credits can be used to offset your cloud technology costs, whether that's storage, compute, running a virtual contact center, or experimenting with machine learning. The credits you get through TechSoup are applied against your AWS bill directly within your console. And through our new tiered program, nonprofits have access to credits worth $1,000, $2,000, or $5,000 annually, based on your organization's annual operating budget. So if you haven't already, and I see a couple of twos in the chat, we encourage you and your nonprofit to take advantage of this program simply by registering with TechSoup to kick off that process. Also, please keep in mind, as Aretha said, that this is an opportunity to ask us questions. Feel free to type questions over the next 45 minutes to an hour, and we'll be happy to support you the best that we can.
So that is it for me. Let's get to the good stuff. I'll hand it over to Angela. Thanks. There's my mute button. Hello, everyone. My name is Angela Si and I am a solutions architect here at AWS. I work primarily with nonprofit customers, such as yourselves. I saw in the chat that we have quite a few newcomers to AWS, so to give you a little bit of an idea of what solutions architects do: my role is to be your technical guide as you navigate the cloud. I'm here to help you learn which AWS services will help advance your organization's mission and address its challenges, and I'm a free resource to you. So if you are thinking about going to the cloud and you're not sure where to start, it's definitely good to engage your account team and your solutions architect. Without further ado, let's dive right into why you're here today, which is to learn how you can utilize analytics to better understand your donors. We know we want to understand our donors, but let's take a step back. Why is that important? What do we want to understand? And when I say understand and analyze donors, it might not be donors for you; it might be your volunteers, it might be your members, anyone that you're trying to reach. So why is it important to analyze this data? First and foremost, we want insights into donor behavior. We want to learn what they care about. Who are our donors? We want to learn who is connected to our organization, how we engage them, what motivates them to support our causes, and how we can keep our cause top of mind. So basically, we're trying to understand which population is interacting with us. And with that information, we can either strike while the iron is hot and pursue other potential members or donors within the same profile.
Or maybe there is a population that you want to reach but aren't quite reaching, and you see that in your data, and now it's time to pivot and reach out to the population you're not yet connected with. The next reason we may want to analyze donor data is donor retention. Donor retention is one of the most common objectives I hear from the nonprofit customers I work with. They want to be able to identify trends: what factors encourage our donors to continue to give and remain part of our organization, and what factors may cause our donors to become inactive and fall away from being engaged with us. And why is this important? How can it help us? Well, if we know the trend and we know what donor activity looks like, then maybe we can get ahead of the curve. If we observe that a donor is starting to trail off, maybe it's time for us to reach out, re-engage, and bring them back before it's too late. So it helps us develop communication strategies. The next reason we want to analyze donor data is to learn how our organization's brand and mission are being perceived by your community. I'm sure you all have a board of directors you need to provide reports to. Maybe it's volunteers, maybe it's donors who want to see how you have been using their dollars to do the good work. Being able to provide these reports, showing the data that backs up the good work your organization is doing, is really important, because you are building rapport and credibility, which helps them want to continue engaging with you and gives you a good reputation within the community and mission area you serve. Last but not least, we want to understand and analyze donor data to make data-driven decisions.
I'm sure you have all heard the catchphrase data-driven decisions, because I certainly have heard it from many of our nonprofit customers. We want this data so we can decide which fundraising events to host next, which fundraising events were successful and which were not. Is there a correlation with the method of contact? Do donors we reach by phone tend to give more, or maybe donors we reach by text? Is there any correlation anywhere that can help us better understand what we're doing, what can be improved, and what the next decision should be? So, on the topic of being data-driven: Harvard Business Review actually did a study, and they found that 99% of businesses want to be data-driven. I'm sure that's not surprising to you; it's probably why you're here today, because your organization probably wants to be data-driven as well. But guess how many of those businesses were actually successful in becoming data-driven? If you want to take a guess, go ahead and put it in the chat. Of the 99% of businesses, including yours, that want to be data-driven, what percentage did Harvard Business Review find were actually successful in that mission? Okay, I'm getting 50, I'm getting 10, 25, yeah. So you guys are pretty accurate, a little more in tune with reality than I was. It turns out only 26.5% of businesses have been successful. I was surprised when I saw that number, because collecting data and doing analytics has been around for years now, so I assumed it would be much higher. But it turns out only about a fourth of organizations thus far have been successful in becoming data-driven. So why is that? Why is the number so low? We've actually identified some of the key challenges that our nonprofit organizations are facing.
As I go through these, feel free to put in the chat a plus one, or plus two, or plus 1,000, if this is exactly what is going on with your team, to let us know it resonates with you. The first challenge is the explosion of data. These days, data is being collected from everything you use, and that results in a lot of data, which means you need to sift through the noise to extract meaning from it. That takes time and resources, both from a human perspective and a technology perspective. You need business analysts who go through all this data to clean it and transform it. And not only that, you need the technology: if you're currently on premises, you need the servers and the disk space to hold all this data and to perform analytics on that vast volume of data. Next, we have an explosion of personas, and what I mean by that is that there are many stakeholders at play. From an external perspective, like I mentioned, you probably have a board of directors you need to report to. You have your donors, your volunteers, and maybe governmental agencies or other chapters of your organization. So you have all these different external parties you need to report to. And internally, you have different teams with different objectives. Your marketing team might be interested in data related to their marketing efforts, whereas your programs team might be interested in how their outreach efforts and fundraising events are doing. So what does that mean for your IT staff, your developers, and your business analysts? They have to go and create all these different reports to satisfy everyone's objectives. And we simply don't have time for that, right? It's very tedious as well.
Another challenge we hear a lot about is the demand to be data-driven in time. It's not just about being data-driven, it's about being data-driven quickly enough. To give you an example, one of the first customers I worked with when I joined AWS was a financial institution that had their data analytics solution on premises. This forced them to run their reports only once a month, because that's all their one business analyst was able to do. And that was not enough for them. Can you imagine a financial institution producing financial reports only once a month? We need insights faster than that. So let's say your nonprofit organization has a marketing department producing newsletters, and these newsletters are meant to garner donations, but it turns out they're not doing that well, whether because the content doesn't fit what your constituents want, or because they're too repetitive and people stop clicking on them and reading them. Would you rather find that out after the second month of donations dropping and engagement falling, or only at the end of the quarter? Probably the first one, right? Faster decision making, again, takes time and effort that a lot of nonprofit organizations do not have, from a human perspective as well as an on-premises technology perspective. And on top of that, data is stored in silos. A lot of the nonprofit organizations I work with utilize software-as-a-service solutions, meaning out-of-the-box software: Salesforce, where the donor data is stored, or Google Analytics, where your web traffic data is stored, or something else.
These software solutions are great because they work out of the box and you can use them right away. But the challenge is that now my web analytics data is in the Google box, whereas my donor data is in the Salesforce box. And maybe you have yet another piece of software that actually collects payment for the donations, and that's in its own box. To bring all of this data together, again, your developers have to take time away to do that ingestion, and it becomes very tedious, and you have to spare resources that nonprofit organizations often do not have. On top of that, a lot of nonprofit organizations I work with have solutions on premises that were built by the person who came before, and no one knows how to change them because that person is not with us anymore. We don't want to touch it because it's working and we don't want to break it. So there's this application with its data sitting on premises, and we need to somehow incorporate that too. So data is in silos, and that's a big challenge. The next challenge is unlocking the value of the data. Now that you have all this data, what do you do with it? What questions should you ask? When people have all this data, they get decision paralysis. Think about going to a restaurant and being handed a 10-page menu; I don't know about you, but it's probably going to take me 15 minutes to decide what to order. Or think about a time when you had a really long to-do list: the first thing I usually decide to do is take a break first, right? As humans, we tend to get decision paralysis. So with this explosion of data, all of a sudden we have all this data in our hands, but what do we even do with it?
How do I unlock the value of it? What questions do I even ask? Where do I even start? That's a challenge. And last but not least, as I've mentioned throughout, resources. We don't have the resources, technology-wise and human-resource-wise, and it can get expensive. If this is just something we're trying out, and we're not sure it's going to return on the investment, it can turn into an expensive experiment if you're trying it on premises. So at AWS, we recognize that these are challenges within nonprofit organizations, and what we've done is develop a suite of low-code, no-code data analytics tools. This suite of tools allows you to analyze your donor data without having to write any code, which means you can start experimenting faster. You don't have to bother your developers or your IT team; your business analysts can jump into doing this themselves. These tools are also all serverless, which means you don't have to manage the servers, the compute resources, or the storage behind them. All you have to worry about is how you want to use the service to analyze your data. So today, specifically, I'm going to share three tools with you that are all low-code, no-code, and all serverless, to help your organization get started on gaining insights into your donor data. But before I do that, I want to tell you a little bit about Girls Who Code. Girls Who Code is one of the many nonprofit customers we have, and they are an organization dedicated to bridging the gender gap in the tech field.
They offer programs to students who are traditionally underrepresented in the tech field, providing them with education and support so they stay interested and continue on this path into college, into a job, and beyond. Since launching in 2012, they have helped 450,000 girls through their programs, and currently they have around 90,000 college-age alumni. So it's a very successful program. The challenge Girls Who Code ran into is that their data analysts were having to spend up to 40 hours just on data preparation before any data analytics could even happen. Data preparation alone was taking a really long time and was a bottleneck for them. The solution they found was to build a 360-degree-view data hub on AWS, where they were able to bring in data from the third-party solutions they were using, such as Salesforce. With all the data in this data hub on AWS, they can now gain insights into how their programs are doing and how their alumni are engaging, and make real-time decisions about how to better engage their alumni and their donors. The CFO of Girls Who Code said that because of AWS, they now have real-time, dynamic data coming in, which helps them optimize their fundraising campaigns and their efforts in this area. My goal for the next 40 to 45 minutes we have left is that some of the tools I share with you will resonate with you and help you optimize your organization's fundraising campaigns and efforts, just like we were able to help Girls Who Code. So before I jump in to tell you about the tools we have, let's set the scene a little bit.
In a report we published back in 2019, we broke the data analytics lifecycle down into five and a half stages. Stage zero, the half stage, starts with the data source, where your data is coming from. This can be on premises, like Excel sheets that you hold locally, or it can be application data. So we have our data source. Stage one of the lifecycle is data ingestion: how do we bring that data in? After you ingest the data, it needs to go somewhere, and that's where stage two comes in, the staging area. We call it the staging area because it is typically a raw data holding space. This data, like we said earlier, has a lot of noise we don't necessarily want, which is where stage three comes in: data cleaning. You extract only the columns you want, or maybe you join two data sets together. Maybe you have one data set with donor information and another with how much they actually donated, and you need to map these together to create one view. That's what happens in stage three. Then you move on to stage four, where you actually do the analytics and visualizations, and finally to stage five, where you archive the data. Once we've done the analysis and have our dashboard and our insights, we don't necessarily need to hold the raw data anymore, and we can push it down to an archival store to take advantage of cost savings. So, starting with stages one and two, data ingestion and data staging: the first service I want to talk to you about is Amazon AppFlow, a data ingestion tool that provides native connectors to bring in your third-party data. What do I mean by native connector? With a native connector, you don't have to write any code. It is going to be a couple of clicks; sorry, I don't know why the slide advanced on its own.
So, a couple of clicks, and you will have that connector set up, and within minutes the data will start flowing from your data source into AWS. What this tool can do is bring in data from silos. If you have Salesforce data, Google Analytics data, or SurveyMonkey data, AppFlow can bring it in and bridge that gap. And the fact that it is no-code means your business analysts can do this themselves. There are no API calls to write and no code to write, so your business analysts can easily get into the console and set this up themselves. As far as cost goes, there's no licensing fee. With AppFlow, you are charged only for the flows you run and the data you move. Once you've ingested your data, if you're not using the flow anymore, it is not going to charge you. And this offers you speed and agility. Like I mentioned, within a few clicks it will be ready and data will start to be ingested, which means there's no need for decision paralysis. Also, I apologize for the slide going back and forth; it looks like there's some sort of timer on that I was not aware of. But yes, speed and agility: there's no need for decision paralysis because you can easily try it out. Set up the connector with a couple of clicks, start ingesting the data, and see whether you want it. If you don't, no worries; stop using it, take it down, and you won't get charged for it anymore. Experiment, and it's not going to be an expensive experiment. And last but not least, it is secure and scalable. At AWS, security is something we take very seriously, and all of our services are built with security as job zero in mind.
And again, this service is serverless, which means you don't have to manage any of the underlying compute resources or storage; we take care of that for you. With Amazon AppFlow, there are only two things you need to do as a user. The first is to authenticate to the source. What I mean by that is, let's say you're a Salesforce user, and I keep using Salesforce because it's one of the most common sources I hear about from my nonprofit customers. You provide a token to AppFlow so it can authenticate to Salesforce on your behalf, so Salesforce knows which account you're connecting with your AWS account and that it is authorized to do so. The second thing you need to do is tell AppFlow what you want it to do. How often do you want data to be ingested? What type of data do you want to ingest? Which columns do you want? Do you want just the donor names? Just the donation amounts? Do you only want information for donors who gave above a certain dollar amount, or only donors who came to a specific event? You configure the flow. After you do the authentication and configure what you want for the data ingestion, you're done. AppFlow takes care of everything else for you: the compute, the storage, the execution. To really drive home how easy this is, I have a couple of screenshots here to show you. On the left is how you configure a flow. As you can see, it is a simple drop-down menu where you select the source of the data and then the destination where you want the data to go. On the top right and bottom right, the flow trigger and filters are examples of how you can customize your data ingestion. You can add a filter if, as I mentioned, there is only a subset of data you want to ingest.
As for the flow trigger: do you want to run this on demand? Do you want to run it on a schedule? Or do you want to run it any time specific data is updated, so you get real-time data ingestion and analytics, if that's something you're looking for? If not, if you're just trying it out, run it on demand; that way you're only ingesting data when you want to. I hope this shows you how simple the UI really looks and how easy it would be for your business analysts to go in and get this set up, because after you click through these, the data comes in, no code needed. I know I keep saying Salesforce because it's the one I get asked about most often, but here's a list of all the connectors AppFlow supports today, and we are continuously adding additional supported software. I've highlighted in orange some of the ones I hear about the most, to see if any of these sound familiar to you: Facebook Ads, Google Analytics, Mailchimp, Salesforce, and SurveyMonkey are definitely among the ones I hear of the most. So I've just taken you through how we can ingest donor data using Amazon AppFlow. Once that data is ingested, we typically recommend customers store it in Amazon Simple Storage Service, or S3. Just like its name says, Amazon Simple Storage Service is a storage service we offer. It is a cheap storage service where you can store a large amount of data. And by cheap, here's what I mean. With Amazon S3, we have different tiers: you have your most frequently accessed tier and your cold data tiers. The most frequently accessed tier is obviously going to be the most expensive, but even in that tier, we charge around $0.023 per gigabyte per month.
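For anyone who would rather script the flow setup Angela just described instead of clicking through the console, here is a rough sketch of that kind of flow definition using the AWS SDK for Python (boto3). The connector profile name, bucket name, and field names are all made up for illustration, and the actual `create_flow` call is left commented out because it requires AWS credentials and an existing Salesforce connector profile:

```python
# Sketch: a Salesforce -> S3 AppFlow flow defined in code instead of the
# console. All names below (profile, bucket, fields) are hypothetical.
flow_definition = {
    "flowName": "salesforce-donors-to-s3",
    "triggerConfig": {"triggerType": "OnDemand"},  # or "Scheduled" / "Event"
    "sourceFlowConfig": {
        "connectorType": "Salesforce",
        "connectorProfileName": "my-salesforce-profile",  # hypothetical
        "sourceConnectorProperties": {"Salesforce": {"object": "Contact"}},
    },
    "destinationFlowConfigList": [{
        "connectorType": "S3",
        "destinationConnectorProperties": {
            "S3": {"bucketName": "my-donor-data-lake"}  # hypothetical
        },
    }],
    # Tasks mirror the console's field-mapping/filter step: project only
    # the columns you actually want to ingest.
    "tasks": [{
        "taskType": "Filter",
        "sourceFields": ["Name", "Email", "Donation_Amount__c"],
        "connectorOperator": {"Salesforce": "PROJECTION"},
    }],
}

# With credentials and a connector profile in place, one call would submit it:
# import boto3
# boto3.client("appflow").create_flow(**flow_definition)
print(flow_definition["flowName"])
```

As in the console, the two decisions you actually make here are the source authentication (the connector profile) and the flow configuration (trigger, destination, fields); everything else is handled by the service.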
If per-gigabyte rates are hard to picture in your head, like they are for me, that means storing one terabyte of data, which is a lot, for a month in even the most expensive tier comes out to roughly $23. And like I mentioned, Amazon S3 has the capability to tier your data depending on how often you're using it. So if you are done with the data analytics and you are ready to archive the data, you can move it down to an archival tier, and, still using one terabyte as an example, it will only be about $1 a month in our coldest archival tier. So S3 is a good place to put all of your data. That is the first part of the data lifecycle. The next thing we want to talk about is stage three, data cleaning. For data cleaning, we have a visual data preparation tool, again no code needed, again serverless, with a simple UI for you to clean and transform your data, and that service is called AWS Glue DataBrew. A couple of things it can do. First, it can help you analyze your data quality. Does my data have any missing values? What is the pattern? What are the outlier values? What type of data is it? That makes the data cleaning process much easier, especially if you have a large volume of data. And of course it provides the capability to clean and normalize your data. It has over 250 built-in transformations, so think of it as a more advanced Excel; it even looks a bit like an Excel sheet in terms of its user interface. You can join columns together, split columns, and so on. And the cool thing is that it will remember what you did. The steps you take to clean your data, say I drop a column, or change the capitalization of a specific column, or add a calculated field, it will remember all of those steps and record them as a recipe.
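A quick back-of-the-envelope check on the S3 pricing mentioned above. The per-gigabyte rates here are approximate published list prices for one region at the time of the webinar, and they vary by region and over time, so treat them as illustrative only:

```python
# Back-of-the-envelope S3 storage cost check. Rates are approximate
# per-GB-month list prices and vary by region; request and retrieval
# fees are ignored here.
STANDARD = 0.023          # S3 Standard (most frequently accessed tier)
DEEP_ARCHIVE = 0.00099    # S3 Glacier Deep Archive (coldest tier)

def monthly_cost(gb, rate_per_gb):
    """Flat per-GB-month storage cost for a given tier."""
    return gb * rate_per_gb

tb = 1_000  # one terabyte expressed as 1,000 GB
print(f"1 TB in Standard:     ${monthly_cost(tb, STANDARD):.2f}/month")
print(f"1 TB in Deep Archive: ${monthly_cost(tb, DEEP_ARCHIVE):.2f}/month")
```

That works out to roughly $23 a month for a full terabyte in the hot tier and about $1 a month in the coldest archival tier, which is why moving finished data down the tiers is worth the couple of clicks.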
What that recipe does is, next time you ingest the same data set with updated values, it automatically applies those same steps. You don't have to go in there and clean the data again, which allows you to automate at scale. So if you have donor data coming in from Salesforce every week and you want to provide a weekly report, now you can do that. You just have to go through the first week of setting it up, building that recipe, and after that, all the cleaning is automated for you, which takes away the burden and allows you to provide more reports, more frequently. Again, to drive home how easy it is to use, I've included a screenshot of Glue DataBrew here. As you can see, at the top of each column you get a little graphical view of the data, so you get a better understanding of what your data looks like and its quality. And as you can see from the example transformation on the screen, this one joins the longitude and latitude columns into a single coordinate column, and there are many more transformations beyond that. On the right side, you can see it recording all the steps this business analyst is taking, so next time they don't have to do it again. So that's, at a high level, what Glue DataBrew can do. It provides data transformation, which brings us to the next step of the architecture we're building for analyzing donor data. Now that we've transformed our data, the next step is to analyze and visualize it, and that's where Amazon QuickSight comes in. QuickSight is our business intelligence tool. It gives you a unified business intelligence dashboard, and by that I mean a dashboard you can embed into your website.
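The record-and-replay recipe idea behind Glue DataBrew can be sketched in a few lines of plain Python. This is only an illustration of the concept, not the DataBrew API, and the column names are hypothetical:

```python
# Plain-Python sketch of DataBrew's record-and-replay idea: each cleaning
# step is recorded once as part of a "recipe", then reapplied unchanged to
# every fresh batch of the same data set. Column names are hypothetical.

def drop_column(rows, col):
    """Remove one column from every row."""
    return [{k: v for k, v in r.items() if k != col} for r in rows]

def uppercase(rows, col):
    """Normalize capitalization of one column."""
    return [{**r, col: r[col].upper()} for r in rows]

# The recipe: an ordered list of (step, argument) pairs, recorded once.
recipe = [(drop_column, "internal_id"), (uppercase, "state")]

def apply_recipe(rows, recipe):
    for step, col in recipe:
        rows = step(rows, col)
    return rows

week1 = [{"internal_id": 1, "name": "Ada", "state": "co"}]
week2 = [{"internal_id": 2, "name": "Grace", "state": "ny"}]  # fresh data

print(apply_recipe(week1, recipe))  # cleaned by hand the first week...
print(apply_recipe(week2, recipe))  # ...then the same steps replay automatically
```

The point of the sketch is the weekly-report workflow from above: the cleaning logic is defined once, and every subsequent ingestion of the same data set gets the identical treatment with no extra work.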
With QuickSight, you can create reports, and you can create emails that automatically send the dashboard to subscribers. You can export the dashboard as a PDF and print it out if you like. The other thing about QuickSight dashboards is that you can add filters, and you might be wondering why that is important. Remember, in the beginning I mentioned that one of the challenges I hear from customers is that the marketing team wants one report, the programs team wants another, and the board of directors wants yet another. Instead of building an individual report for each team, you can now have one general dashboard with filters, so each team can go in and adjust the dashboard to what they're looking for. Maybe a filter for the communication method, so the marketing team can filter down to only the people who responded to emails. Or maybe your board of directors wants to compare last year's donations to this year's, so they can have a filter based on the month, comparing January to January or May to May, or a filter on the date range, for example. You don't have to build individual dashboards anymore. And it is a low-cost solution for you to experiment with, try out, and utilize. With QuickSight, just like all the other services we've talked about today, there's no licensing fee; you only pay for what you use. In the case of QuickSight, it is pay-per-user on a monthly basis, and you can designate a user as just a reader or as an author, so you only pay for what you need. It is serverless and auto-scaling and provides high performance; you don't have to take care of the servers, you just worry about how to use it. And last but not least, it incorporates machine learning.
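The one-dashboard-many-filters pattern is easy to picture in code. This plain-Python sketch, with made-up donation records, shows how a single dataset can serve both the marketing team's email-only view and the board's January-over-January comparison:

```python
# One dataset, many filtered views: instead of building a separate report
# per team, each audience filters the same data. Records are made up.
donations = [
    {"amount": 50,  "channel": "email", "month": "2023-01"},
    {"amount": 200, "channel": "phone", "month": "2023-01"},
    {"amount": 75,  "channel": "email", "month": "2022-01"},
]

def view(rows, **filters):
    """Return only the rows matching every given filter, e.g. channel='email'."""
    return [r for r in rows if all(r[k] == v for k, v in filters.items())]

# Marketing's view: email responses only.
marketing = view(donations, channel="email")
# The board's view: this January versus last January.
jan_2023 = sum(r["amount"] for r in view(donations, month="2023-01"))
jan_2022 = sum(r["amount"] for r in view(donations, month="2022-01"))
print(len(marketing), jan_2023, jan_2022)  # 2 250 75
```

In QuickSight the filtering happens through dashboard controls rather than code, but the design idea is the same: one shared dashboard, with each team narrowing it to the slice they care about.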
So if you tuned into our big conference that just happened two weeks ago, AWS re:Invent, we released a new service called Amazon Q. Basically, it is our generative AI chatbot. And if you're using AWS today, you'll see the chatbot in the console, or even if you just go to the AWS documentation, it will come up as a gen AI assistant. So Amazon Q actually integrates with QuickSight. You can directly ask it questions like, you know, how many donations did I garner in the month of May, or generate me a dashboard on how many people responded to my email campaign, right? So that's something QuickSight can do as well. Which brings us to the end of the architecture that we have. So we started with data ingestion, then we went to data transformation, and now we end with QuickSight, where we're visualizing that donor data. So the next thing I want to do is go ahead and show you. And I'm only going to show you Amazon QuickSight out of the three services I've talked about. The reason is that I saw earlier we have quite a few people who are not yet on AWS. And the nice thing about QuickSight is that you don't need the first two pieces to get started, because you can directly upload your data to QuickSight and start generating insights. And it has capabilities for you to clean data a little bit too, you know, not necessarily at production scale, and not at the scale that DataBrew can, but it can do a little bit of data cleaning. And I think it's a great place to start among the three services, because you can build dashboards and see if this is a good investment for your organization, see if this is what you're looking for, without building out an entire end-to-end workflow. So with that, I am going to switch my screen share. All righty. Let me adjust this window real quick.
All right, so here you can see a dashboard that I have built. I'm going to pretend I am a nonprofit organization based in Denver, because that's where I am based. I am a business analyst for a nonprofit that provides afterschool programs and family-friendly activities for inner-city students and their families. And after this past Giving Tuesday, I got the donation data, and I've built a dashboard out of it with Amazon QuickSight. In this dashboard, you can see on the top left I've included our logo to make it look more professional. I've added some filters so that, you know, my board of directors and my team can filter this dashboard based on what they need. Maybe they want to analyze which geographic regions and which communities give the most, and which communities have the highest participation rate. Or maybe they're trying to understand which type of donors participate the most, right? Their years of membership within our organization. At the top, I also have insights that were generated automatically by QuickSight. So I have the top three zip codes by total donations, so I can see the communities that are the most engaged. I also have the top three donors by name who provided the most donations, so maybe it's time for me to send them a thank-you card, right? And on the right, I've also included the sum of donations from this year compared to last year. As I can see from this key metric, which was generated automatically by QuickSight, compared to last year we increased our donations by 136%. And I made sure to put this at the top so that my leadership can see it right away. Coming down here, I created two different graphs split by the method of giving. On the left is a pie chart with the count for each method of giving. So how many people gave during Giving Tuesday through each method?
So I can see that 246 donations were generated by people clicking on the email link, and that's about 31% of the total donations. On top of that, I've paired it with a bar chart telling me the donation amount. So the total donation sum gathered from the email link was $43,000. This tells me that for this past Giving Tuesday, it seems like my donors are most engaged through email, second by social media, and then direct website contribution. And it looks like the text link generated the fewest donations, only 17 of them. So maybe my constituents, the populations that I resonate with, don't really give through text for some reason. Coming down here, I also used QuickSight's geospatial map to automatically create a dot map. Basically, the larger the circle, the more donations we were able to garner from that particular area. And I can see that the spread of my donors came mainly from central to central-west Denver, with quite a few up here in the Boulder and Longmont area. And so let's say, you know, earlier I saw from my insights that the zip code 80228 was one of the top donor regions. Maybe I want to investigate: okay, what do we know about this area? So I built in a filter action where, when I click on that, it filters the entire dashboard down to only the people from this particular region. So if I come back up here to the method of giving, you'll notice that now social media is the number one method of giving in this region. So in my most-giving zip code, social media is the way that reaches the most people, or garners the most donations. Maybe I can use that information to target that specific area more on social media, right, for my Lakewood region donors. And let's say I don't want that filter anymore; I'm going to click on it and make it go away. And then coming down here, I created a dashboard for my marketing team, too.
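The automatic insights mentioned above, the top zip codes by total donations and the year-over-year key metric, come down to simple aggregations. QuickSight computes these for you, but here is a rough sketch of what is happening, using made-up sample figures (the zip codes and dollar amounts are invented for illustration):

```python
from collections import defaultdict

# Made-up sample donations; in the demo this comes from the uploaded CSV.
donations = [
    {"zip": "80228", "amount": 500}, {"zip": "80228", "amount": 300},
    {"zip": "80301", "amount": 450}, {"zip": "80202", "amount": 200},
]

# Top zip codes by total donation amount.
totals = defaultdict(int)
for d in donations:
    totals[d["zip"]] += d["amount"]
top_zips = sorted(totals, key=totals.get, reverse=True)[:3]

# Year-over-year change, like the 136% key metric on the dashboard
# (these dollar figures are invented to produce that percentage).
last_year_total, this_year_total = 50_000, 118_000
pct_change = round((this_year_total - last_year_total) / last_year_total * 100)
```

A group-by-and-sum plus a percent-change formula is all the "insight" is; the value of the dashboard is surfacing it automatically and keeping it current.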
So we're comparing against the newsletter: are the people who donate also the people who subscribe to our newsletter? On the left, we have the average number of donations throughout the year, split by true versus false: did they subscribe to the newsletter? At first glance, we see that people who are subscribed to the newsletter donate about four times a year. And so do the people who are not subscribed. So maybe my newsletter's not working. But if I look to the right, where I'm looking at the sum of donations instead, I'll notice that while the number of donations wasn't really affected by the newsletter, the amount was. From people who subscribed to the newsletter this year, we got around $94,000, as opposed to $50,000. And to make it more readable, so you don't have to hover over it, I added a metric at the end of each bar to make this dashboard easier for my readers to read. So now I've identified that our newsletters are working: subscribers are donating larger dollar amounts than people who are not subscribed. And then down here, I've added a couple more charts to showcase other graphs that QuickSight is capable of. You can include a table with raw data if you would like, additional line graphs, waterfall graphs, and histograms. So you get the idea: with Amazon QuickSight, there are a lot of graphics you can generate and a lot of insights you can start visualizing, right? Because with raw numbers, it's hard to see what they mean; it's hard to see the trend until we put them on a graph. And the last thing I want to point out on this view, up at the top here, is the search bar for Amazon Q. So let's say I want to ask it: what method of giving generated the lowest donations? What it will do is generate me a graph to show me that, right?
So you can ask it questions using natural language, and it will either generate a graph or give you an answer. And you can give feedback on whether the answer was what you were looking for, to help it learn and become more accurate. So in the last bit of time we have together, I just want to quickly take you through setting up Amazon QuickSight. Because you might be wondering: okay, Angela, this looks great and this looks beautiful, and I would love to have this, but how much time did you spend on it, right? It looks like it was a lot of effort to put together. And I'm going to show you in the next couple of minutes. So this is the author view of QuickSight. On the left-hand side, I'm going to go to datasets. To create a new dataset, you simply click on New dataset at the top right, and as you can see here, there's an option to upload a file. So if you're someone who's new to AWS and trying this out, this would be the easiest method to get started. I'm not going to upload a file just now, for the sake of time, but I'm going to show you the donor data that I did upload. When I come into this dataset, it tells me a little bit about my data: how many rows there are, what the size is. I'm going to go to Edit dataset at the top right, which takes me to the data preparation view of QuickSight. From here, there are quite a few things you can do. On the bottom left, you'll see excluded fields; these were the columns that I dropped. At the top, there's a plus sign where you can add calculated fields, or predicted fields if you have time series data. And in the middle, you'll see that this dataset actually has two files joined together. When I click on that, I can see that I joined them on donor ID. My two datasets are actually my donor data from this year and from last year.
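The join configured above, matching this year's and last year's donor records on donor ID, is a standard inner join. A minimal sketch of what QuickSight is doing with the two files (column names here are hypothetical):

```python
# Illustrative sketch of the dataset join: match this year's and last year's
# donor records on donor_id, as configured in the data preparation view.

this_year = [{"donor_id": 1, "amount_2023": 900},
             {"donor_id": 2, "amount_2023": 150}]
last_year = [{"donor_id": 1, "amount_2022": 400},
             {"donor_id": 3, "amount_2022": 250}]

def inner_join(left, right, key):
    """Combine rows from both datasets that share the same key value."""
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

joined = inner_join(this_year, last_year, "donor_id")
```

Once each donor's two years sit side by side in one row, year-over-year comparisons like the dashboard's key metric become a simple column difference.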
And that's how I was able to compare the two different years' donations. Here in the middle of the screen, you'll see a preview of your data, where you can change the column names and data types. You can drop columns from here as well, and just get a preview of what your data is going to look like before you analyze it. So this is the transformation screen. I didn't do too much other than dropping a couple of columns that I know I don't need and joining these two files together. So going back to the visual: coming in here, this is the exact same dashboard that you saw earlier, except from the author's perspective. I'm going to create a new sheet and start from scratch, just to show you how quickly you can start building your graphs. When adding a visual, on the left you'll see a list of the fields in my file. I'm going to click on last participated event as well as donation. And boom, immediately it gives me a bar graph telling me, hey, people whose last participated event was the bake sale donated close to $46,000. So maybe, because that's the most recent event, our organization is still top of mind, and that's why the people who went to that event came back and donated the most. Now let's say I don't like the graph that QuickSight suggested for me. On the left here is a list of different graphs you can choose from. So maybe instead I want a donut chart that shows the total donations in the middle, to help me get a better understanding. If you don't want QuickSight to give you a suggestion, you can also directly add a visual right here and select the type of graph that you like. So let's say I want to build a horizontal bar graph. Again, let's use last participated event as an example, and donation as the value. And now maybe I want to group it by years of membership.
And now I have each last participated event broken down by how many years those donors have been with our organization, giving me further insights. From here, I can add more graphs, I can add actions to filter, and I can add filters. And as you saw, it's as simple as dragging and dropping the fields into these specific places, and it will create a graph for you. All right. So the last thing I want to show you within Amazon QuickSight is interactions. Let's say I want an interaction where, if I click on this, it filters the graph on the right-hand side. Here's how you do that: I go to interactions, and then under actions, I can add a filter action. I'm going to leave everything as default and click save. So now when I click on the holiday bake sale and come over to the right-hand side, you'll notice the other graph was filtered too. That's how easy it is to add an action. So within a few minutes, I showed you how to ingest data, how to clean the data, and how to build two pretty good-looking dashboards. And as you can imagine, the one I just showed you really didn't take me that long. So that's it for QuickSight. To kind of wrap up the conversation we're having today: I showed you QuickSight because, you know, I know a lot of us are just starting on AWS, and I think it's a good place to start playing, just because it does everything within one service. But if you're someone who wants to do a little more and wants an end-to-end solution like the one we talked about today, one of the ways we help our customers is by providing these guidances. And within these guidances, a lot of the time we'll give you an architecture. You'll notice this architecture is very similar to the one we stepped through today.
And if you come down here, you will see a link to GitHub, where you can actually launch this exact architecture in your own AWS account without having to build it yourself. So if you're one of the people who are already on AWS, have familiarity with the cloud, and want to start building a full-blown architecture, this would be a great place to start. All right. With that said, I am going to share some wrap-up resources with you. Let me see here. Finishing thoughts. So today we talked about how you can use analytics to generate insights about your donors. We talked about why that's really important and why we want to analyze donor data. We also talked about the challenges of doing that, and why a lot of organizations, despite wanting to, have not been successful. And then we talked about how some AWS services, specifically the no-code data analytics services we touched on today, can help you easily get started. As a finishing thought, I want to share some resources. On the left here, you have some blogs, some video demos, some getting started guides, as well as the guidance that I showed you on screen earlier, with links and QR codes. And if you are ready to get started, or if you want to have a conversation with us, or if you have any questions or feedback about today's conversation, feel free to reach out to us at the email on the screen, or scan the QR code and submit a contact form. We are happy to meet with you, learn about your organization, and see how we can support you. So those are the technical resources, but we also have a lot of resources dedicated to nonprofit customers, such as the credits that I know a lot of you were asking about earlier. So Mike, do you want to talk a little bit about the credits and the other things we have?
Sure. So first, let me just say, Angela, thank you so much for that demo. That was really powerful information. We definitely practice what we preach; we actually use QuickSight really heavily on the program side as well. It's an awesome tool, and like you said, it's really pretty easy to implement. Also, I wanted to address the 10-page menu at a restaurant. If I went to a restaurant with 10 pages as a menu, as long as they have chicken tenders, I'm good to go. I literally get that everywhere I go. So I just want to say thank you all for joining us today. We'd like to invite you to connect. Any questions that you might have, feel free to reach out. In terms of engaging with us, we have multiple programs where we focus on things like partners and marketplace, cloud training, events and educational experiences, and of course, grants and credits. I've actually dropped a couple of links directly into the chat. If you click on that last link, it'll take you to the AWS page on the TechSoup site and give you some more information. Also, just want to plug: mark your calendars for March 20th, 2024. That marks our eighth annual IMAGINE conference, being held at Amazon's HQ2 in Arlington, Virginia. It's a chance for nonprofit organizations to get together and connect directly with AWS, as well as with other nonprofit leaders, on how technology can help drive impact for your organization and your mission. With that, again, I'd just like to say thank you, Angela, for joining us today. Again, I'm Mike Young. Thank you all for being here. And of course, thank you for your interest in TechSoup and AWS. I hope you all have a great holiday season. We'll see you all soon. Thank you.