Hi everyone. Thank you so much for joining us for our webinar today, Online Surveys for Nonprofits and Best Practices. I just want to go over a few housekeeping items before we get started. All callers will be muted. If you have questions, you should see a chat box on the left-hand side of your screen, so as the webinar moves along, feel free to type in your questions, and we will try to answer them as we go, or during the Q&A at the end. If you lose your Internet connection, just refresh the browser; there was a link emailed to you, so click on that again and you should be able to log back in. If you want to watch the webinar again, or if you have to leave a little early, we'll be hosting the recording on our website at techsoup.org/community/events-webinars. We'll also send an email once the presentation is over with the recording and any relevant links. If you're on social media and want to send us a tweet, you can use the hashtag #tswebinars and mention @TechSoup. But like I said earlier, the chat box Q&A is what we'll be focusing on for this webinar. So, just a little bit about TechSoup before we get started: we are in 236 countries and territories, and we work with over a million nonprofits, providing donated or discounted technology to help them run their organizations. I want to give you all a chance to try out the chat box, so if you can send me a chat and let me know where you're dialing in from, that would be great, and I'll read a few of them off. All right, so we have Arizona, Chicago, Atlanta, and Chesterton, Indiana, which is great because that's the state our guest speaker is from today. Massachusetts, New York. Okay, so we have people calling in from all over the country. Do we have anybody international? I don't see anything yet, but hopefully there is some international presence on the webinar today.
All right. Some of our technology partners include Adobe, Intuit, Microsoft, and Symantec, and today one of those partners, QuestionPro, is here to explain online surveying and how it works. I think a lot of us know the technology they've provided is very easy to use, but this session goes a little deeper: how do you ask the right questions and collect the right data, so that all of that information is really valuable to you and your organization? So I'd like to go ahead and introduce today's speakers. We have Vivek, who is here from QuestionPro. He is a founding member and the executive chairman of QuestionPro, and he plays a key role in defining the company strategy, using technology and innovation to maintain its leadership in the industry. In 2008, QuestionPro made Inc. Magazine's list of fastest-growing private companies, ranking 172nd overall and 25th among business service providers. We also have Mitch, the executive director of the Shafer Leadership Academy. The Shafer Leadership Academy provides the tools and resources to help develop leaders of all ages, backgrounds, and interests, and they are located in central Indiana. And then myself: my name is Sima, and I am the online learning producer here at TechSoup. I'm excited to host both of our guests today, so I'm going to go ahead and pass it off to Vivek. Awesome. Thank you, Sima. Thank you for having us. Hey, this is Vivek here from QuestionPro. A couple of quick items: we'd love to have you ask as many questions as you like. I love webinars that are interactive, and obviously I can't hear you, but definitely keep asking questions in the chat. Our intent is to stop along the way and answer questions as we go through this presentation over the next 35 to 45 minutes. That's me, and I'm super excited to have Mitch with me as well.
Mitch is actually one of our clients, and I'll go through the agenda really quickly. The goal today is to talk a little bit about online surveys and research. We will go over a couple of different survey types and give you some examples, and Mitch has some pretty amazing examples of how he's used QuestionPro internally within the Shafer Leadership Academy. He's going to talk a little bit about how you actually use it, and it's not really even about QuestionPro; it's about how you can use surveys for anything. So that's the general agenda for the hour, if you will. Hey, Vivek, sorry to interrupt. We're getting a couple of comments saying that people are having a little trouble hearing you. If you don't mind getting closer to the mic. Okay, I'm on a headset. Is this better? Can you guys hear me better? Yeah, if you want to chat in and let us know if that's okay. Yep, I think that's better. Okay, cool. So, a little bit about QuestionPro. We are an online survey platform. I started it back in 2005. We have about 100 employees, and we're fairly global: we have offices here in San Francisco, in Chicago, and also in India, Germany, and Mexico. I've been in this business for quite a while, frankly, and I've also written a couple of books about online surveys and online research. Mitch, I'd love to have you introduce yourself. Well, good afternoon to everyone in the Midwest or on the eastern side of the nation, and good morning to those of you who haven't had lunch yet. Thanks for the introduction, Vivek. My name is Mitch Isaacs, and I'm the executive director of the Shafer Leadership Academy. You're going to learn a lot more about us a little later on, but as you can see, we are a very small not-for-profit located in Muncie, Indiana, which is about an hour northeast of Indianapolis.
You may know of Muncie because you might know of Ball State University, which is David Letterman's alma mater; he's our proudest graduate, and that's what put Muncie on the map. You can see we're a small organization with a small staff, but we're out there doing a lot of great work, and I look forward to telling you more later on about who we are and what we do. But before that, I think Vivek's got a question for the group. Okay, all right. So the first question I have for everybody: have you ever run a survey for your organization? If you can just post your answer right there, we'll wait until we get about 100 or so results. I think we've got most of them, so let me read those off. Most of you have run surveys, which is great, and a few of you haven't; as you can see, the ratio is about 75 to 25. So let's jump right into it. What is generally the purpose of doing online surveys? There are a lot of ways to collect data, and you can measure behavioral data very easily. There are two kinds of data, if you want to think about it this way: behavioral data and what we call attitudinal data. Attitudinal data is what people think, and behavioral data is what people do. A good example to kick-start this conversation: let's say you invited 20 people to come to a webinar or to your birthday, and 8 said yes and 12 said no. So you have behavior: 8 people did something and 12 people didn't. But you don't have a good idea of why those 8 people showed up or why those 12 people did not, right? So what a survey does, in general, is help you understand why people are doing something. As a rule of thumb, if you keep asking the question "why," you can probably use a survey to determine the answer.
That goes for regular surveying too: surveys have been around for quite a while. This is not something new, and frankly, online surveys have been around for quite a while as well. Using online surveys within the nonprofit ecosystem, I wouldn't say it's new either, but what we want to do today is walk you through a couple of use cases where we've used online surveys to create interesting data. The key elements, as you can see on the screen, especially online, are that you collect the data in real time. We just ran a survey and figured out that 75% of you have done a survey before. That is information you've gathered, so now I know my audience really clearly: almost three-quarters of the people here have been exposed to surveys, and I can tailor the content as I talk to you. So the general idea behind running a survey is to collect data and then turn that data into context and action, as you can see here; use that data to inform your decisions. One of the key takeaways I have for you is this: if you want to measure attitudes, surveys are a great way to do it. Surveys are probably not as useful for measuring behavior, but they are great for measuring attitudes. Now let's talk a little bit about the different types of surveys, and this is not specific to nonprofits. One of the challenges I think nonprofits have, compared to commercial entities: we are a commercial entity, a tech company, so I have my board, and my board has very clear metrics for me, typically financial metrics, and in many cases other operational metrics as well.
So in the commercial world it's very clear: there are financial metrics, all the way from profit margins to net revenue to everything else. For mission-driven organizations, obviously, there often are no such financial metrics, and it becomes really tough for mission-driven organizations to showcase outcomes and results. So a lot of mission-driven organizations actually use surveys as a mechanism, as a proxy, to determine how they are doing. And these are the different survey types that our customers have used over the years. Just to give you some perspective: here at QuestionPro we have about 12,000 customers, including nonprofit, commercial, and enterprise customers across the board, and between all our customers we do about five to six million surveys a week. So we've seen pretty much every kind of survey that people have done; as you can see, the scale is extremely large. The types of surveys you can do really aren't even dependent on the platform; they're dependent on your imagination, if you will. I have another quick question for you. I know a lot of nonprofits depend on volunteers coming in and doing work for them, and volunteers typically come in for an experience. Have you used surveys to measure that experience? I'm going to skip through to the results in a minute; I've got enough data. As you can see from the live feed: yes, people have run surveys, but they've not used them for measuring volunteer experience. So that's one use case: you could use a survey to measure volunteer experience.
That's just one particular way, and I'm going to show a couple of different interesting ways in the next 10 or 15 minutes. We've seen that volunteer experience is one of those things: people come in for the experience, and you can measure it by just asking a few questions about how their experience was. What I want you to take away is that surveys can be a very effective mechanism for impact assessment. Showcasing impact is important, partly because nonprofits are clearly mission-driven, and the same thing often applies to government entities. That's why we do a lot of work for government entities, where the legislature has almost mandated that different agencies must provide a certain level of customer satisfaction. And how do you determine a level of customer satisfaction? A simple way is by asking the customers: are you happy with the service? Are you satisfied with the service? We'll go through a couple of different models for asking questions in a minute. So now let's talk about the survey itself. In research, there are broadly two different models for collecting data. One is quantitative, like the yes/no question we just asked; that's a quantitative model for collecting data. And then there's a qualitative model. Within a survey you can ask open-ended questions like: how do you feel about it? What are the things we haven't asked you? That's an important element. A lot of times in surveys, you may not think to ask all the questions, and an easy way of getting beyond that is by saying, "Hey, tell us something that we haven't asked you."
Then you can just look at those comments and make an informed decision. So broadly speaking, when you start designing a survey, you've got to think about what information you're going to collect quantitatively and what information you're going to collect qualitatively. Both can be done in the same survey, but you have to understand that there are two different models in play. When it comes down to creating a survey within QuestionPro — and this is not a talk about how to use QuestionPro, but generally a talk about surveys — there are, like the slide says, over 80 question types available, all the way from star-rating questions to slider questions. The system is fairly intuitive, but you have to think about your questions in terms of satisfaction: how satisfied are you? Would you recommend this nonprofit as a place to volunteer? Would you recommend this particular nonprofit to your friends, family, and colleagues? These kinds of questions, which elicit responses that reveal the thinking of your users, are what you want to design the survey around. You can also include logic within the survey: if someone says yes, ask these follow-up questions; if they say no, skip them, so you're not asking redundant questions. And one of the interesting things we've seen over the years is that people love giving visual feedback. Things like star-rating questions and slider questions are immensely popular among respondents, and the response rates for these kinds of questions are consistently higher than for simple radio buttons and checkboxes. So that's something for you to think about.
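To make the skip-logic idea concrete, here is a minimal sketch in Python. The question IDs, answer values, and rule table are all hypothetical illustrations, not QuestionPro's actual API:

```python
# Hypothetical sketch of survey skip logic: each rule maps
# (question, answer) to the question that should be shown next.
def next_question(question_id, answer, rules, default_order):
    """Return the next question to show, honoring skip-logic rules."""
    jump = rules.get((question_id, answer))
    if jump is not None:
        return jump
    # No rule matched: fall through to the next question in order.
    idx = default_order.index(question_id)
    return default_order[idx + 1] if idx + 1 < len(default_order) else None

order = ["q1_volunteered", "q2_experience", "q3_recommend"]
rules = {
    # A "no" on q1 skips the volunteer-experience question entirely.
    ("q1_volunteered", "no"): "q3_recommend",
}

print(next_question("q1_volunteered", "yes", rules, order))  # q2_experience
print(next_question("q1_volunteered", "no", rules, order))   # q3_recommend
```

The point of the sketch is that respondents who answered "no" never see the redundant follow-up, which is exactly the behavior described above.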
The final thing I want you to think about is customizing the theme: the logos and the colors. It's important to bring your brand into your surveys, because the surveys effectively represent your brand. Within QuestionPro, you can customize the theme, add logos, and change the colors of a survey. The question types and formats available are fairly extensive within QuestionPro itself, and frankly, not only QuestionPro but any survey platform gives you the option of creating different kinds of questions. There are also a whole bunch of libraries available — predefined questions, which we call survey templates or the question library — and you can use any of those to design your survey. And like I said earlier, a key takeaway: most people design surveys with very closed-ended questions, but my recommendation to all our clients, including nonprofits, is to always include an open-ended question in your survey. This gives the respondent the ability to say something you haven't asked about, so they don't feel boxed in. Here's a quick example of a survey. Most people have seen surveys like this. This one doesn't include a slider question, but the background, the photo, and the logos have been customized. Like I said, carrying your brand through the survey is hugely important because it represents you, and it increases response rates: that's the fundamental reason, because people know that it's you collecting the data, not somebody else.
Sima, I'm going to stop here really quickly to see if there are any questions we can engage with. Yeah, I think we're pretty good right now; I don't see any questions coming in. I just want to remind everyone, you should see a Q&A box on the screen, and we have Vivek and Mitch available to answer your questions, in addition to some representatives on the back end. So feel free to ask questions as we go along and we'll try to answer all of them. I think we're good for now. Okay, great. So the next thing I want to talk about is using surveys for, like I said, volunteer experience, as well as impact analysis: if you're serving a particular area, going back to that area and asking how they feel about your service. That's the typical model. Then I want to talk a little bit about some of the more atypical things people have used surveys for. We are a technology platform; we don't tell people how to create surveys. We help with the technology itself; we're not a market research company. But we've seen a whole lot of surveys over the many years, and as you can see, the spectrum of things you can do with surveys is extremely large. You can measure employee satisfaction with surveys. You can measure customer satisfaction. You can measure any program that you're running and ask: is this working or not working? You could start out with a simple yes/no question, or you could go a little more nuanced: rate this on a one-to-five scale.
And how are you doing with respect to the different services you're providing? One of the stories I want to tell here: about two or three months ago, we got contacted by a nonprofit called the Startup Policy Lab, right here in San Francisco. This was during the net neutrality debate, when the FCC canceled the net neutrality ruling. The FCC got about 20 million comments on its website, right? And the Startup Policy Lab wanted to figure out something, because there was a lot of talk that most of those comments were fake: people were effectively jamming the public commenting system with bots and fake submissions. Through a Freedom of Information request, they had obtained the list of everyone who had actually submitted comments. What we devised with them was this: from those 20 million, we sampled a subset of 600,000 users and sent a survey to all of them asking, "Did you post this comment on the FCC website?" — to validate whether it was actually human beings posting comments versus bots posting comments on the FCC's website. This was a project the Startup Policy Lab and we collaborated on. If you think about it, in the traditional sense it's not a survey, but we used the survey platform as a mechanism to validate a piece of data that was already given to someone. Not that you would necessarily do something like this, but what I want to do is expand everybody's horizons on how you can use the tool, how you can use surveys.
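The sampling step in the FCC story — drawing a 600,000-person subset from 20 million commenters — is at its core a simple random sample. A minimal sketch, with the numbers scaled down and the commenter list entirely made up for illustration:

```python
import random

def draw_sample(population, k, seed=42):
    """Draw a simple random sample of k records to receive the survey."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    return rng.sample(population, k)

# Scaled-down, hypothetical stand-in for the 20M commenter list.
commenters = [f"commenter_{i}" for i in range(20_000)]
sample = draw_sample(commenters, 600)

print(len(sample))       # 600
print(len(set(sample)))  # 600 -- sampling is without replacement
```

Fixing the seed matters in practice: it lets you re-derive exactly which records were surveyed when you later match responses back to the original list.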
The key thing surveys provide is the ability to collect data — open-ended data, closed-ended data, any kind of data you want — and to create, collect, and deploy fairly easily. So these are some of the models we have used internally and that you can continue to use as you go along. I've got another quick question for everybody in the group. For the 75% of you who have actually run surveys, presumably for some sort of impact assessment: have you used that data to present to your board? Let's take a look at the results. Good: at least half of us have presented survey data to the board. A lot of our customers use survey data to show impact, and the boards of most nonprofits want to be sure impact assessment is being done. So this is a good example: when we collect data, we need to present that data back to the relevant stakeholders somehow. My key takeaway, and the final one here, is that with QuestionPro and with TechSoup, you have access to a fairly versatile tool that can do a lot of data collection, execution, and reporting, and we'd love to have you use it in any way you see fit. With that, I'm going to pass it off to Mitch. All right. Well, thanks, Vivek. Once again, hello everyone. My charge is to give everyone here a few examples of how a not-for-profit is using a tool like QuestionPro in what we hope are some predictable but also some innovative ways. So, a little bit more about the Shafer Leadership Academy before I explain how we use online surveys for our organization. I tell folks we help people learn how to lead. We don't teach leadership, because I don't believe you can teach leadership. Our goal is to be facilitators.
What that means is having programs and experiences. Some of our programs are simple two-hour workshops, all the way up to our signature program, Emergence, which is an 8-week, 30-hour experience. So we do a lot of different things, but all of our programs come back to one simple idea, and that is that leadership happens between the ears and behind the rib cage, which is my way of saying it's a head-and-heart thing. I can't tell you how to see the world differently; I have to help you see the world with a new set of eyes and learn how to interact with people in a different way. And so we do that through facilitation, through small-group conversations and intentional activities and discussions. You can see on the screen some of the topics that we cover. One of our previous slides said that we did 113 programs for 3,000 people last year. That's not 113 different programs; we ran some programs more than once, but we cover a lot of topics. One of the important things is that it's probably not too different from what you do. In the not-for-profit world, measuring your mission can be a real challenge. I tell folks all the time that the difference between a for-profit organization and a not-for-profit organization is that a for-profit organization measures its success in one very simple way: profits. Money. How much money do we make? Those of us in the not-for-profit world have a more complicated task in front of us: we measure our success in mission. Hopefully we're making a little bit of money along the way to pay the bills and put some away in the bank for a rainy day, but at the end of the day, our goal in the not-for-profit world is to live out our missions. As you listen to me talk about the Shafer Leadership Academy, I hope you can relate: leadership is really tough to measure. It is hard to quantify what leadership is.
Let alone measure whether someone's become a better leader as a result of their experience with us, and I would expect that some of the other not-for-profits on this webinar have a similar situation. You can see on the screen in front of you what our mission is, and like any not-for-profit organization, it always comes back to the mission, because that's why we exist. So let me tell you broadly how we use online surveys and QuestionPro to assist our organization. First, it's measuring mission, and then board development, which I don't want to say is unique to us, but I think it's something we do that maybe some other organizations on this webinar haven't had a chance to do. In talking about measuring mission, Vivek touched on attitudes versus behavior. Excuse me, sorry about that; I got a little bit of feedback there for a moment. Vivek talked about attitude versus behavior, and when we think about measuring leadership, it comes down to similar components: what we call satisfaction, which is another way of measuring attitude; pre- and post-tests, which are cognitive; and then assessing behavior. Hopefully, with those three things together, we're able to capture a more complete picture of how we're living out our mission at the Shafer Leadership Academy. So, measuring program satisfaction: what does that mean? It's pretty simple, really. It's just asking folks what they thought of our programs. As Vivek said, another way to consider that is attitude. This is probably not very surprising, but after we complete a workshop — and I think we had four or five different workshops just last week — we have an instrument that we send out to everybody who attended, and we ask the same questions. A lot of the programs we do year in, year out, so we have found that it's helpful for all of our programs to ask the same base set of questions over and over again.
What that does is give us longitudinal data, so we can see, for our Fusion: Leading Multi-Generational Teams program, how satisfied people were in 2018 versus 2017 versus 2016 versus 2015. Are we getting better at that program if we're doing it often enough? Hopefully we should be. And if people are less satisfied one year than the next, what do we need to look at changing or adjusting? It's a way of making sure we're hitting the mark, and if we miss the mark, finding where the course correction is. This is also something I've used often in post-grant reports and grant proposals. As we all know, funders love to ask about goals, and being able to say that one of our goals is to hit a certain mark on satisfaction makes for an easy goal to measure. It's an easy goal to benchmark one year after the next, and it's the kind of data I've found funders really like: when you can say that 92% of our participants are satisfied or very satisfied with our programs, it's certainly a way to show funders that people are responding to what you're doing. It's also a way to help people understand that if they participate with your organization, they're probably going to have a good experience doing it. I would say the key is to ask the same questions year in and year out so you can really start benchmarking your data. The caveat here is that satisfaction data is not robust: you're just asking for people's opinions. Opinions are important and opinions matter, so I don't mean to minimize that, but they don't always give you a complete picture. Again, though, if you ask the same questions in the same way, time after time, it does provide you with some very helpful benchmarking data that can be used when assessing the effectiveness of your programs, offerings, and services, and also when reporting out to funders, donors, and stakeholders who might be interested in those metrics.
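The year-over-year benchmarking described above amounts to computing the same satisfaction metric on the same question each year. A minimal sketch, using a common "top-two-box" definition of satisfaction (share of 4s and 5s on a 1-5 scale) and entirely made-up response data:

```python
def top_two_box(responses):
    """Share of respondents answering 4 or 5 on a 1-5 satisfaction scale."""
    satisfied = sum(1 for r in responses if r >= 4)
    return round(100 * satisfied / len(responses), 1)

# Hypothetical responses to the same question, year over year.
by_year = {
    2016: [5, 4, 3, 4, 5, 2, 4],
    2017: [5, 5, 4, 4, 3, 4, 4],
    2018: [5, 4, 4, 5, 5, 4, 3],
}

for year, responses in sorted(by_year.items()):
    print(year, top_two_box(responses))
```

Because the question and the metric stay fixed, the numbers are directly comparable across years, which is what makes the benchmark usable in grant reports.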
Okay. Pre- and post-tests get at the cognitive idea. One of the things about leadership: if you want to find out whether people are better leaders — and again, this is a multi-pronged approach, right? Satisfaction is the first way; this is the second — another way of getting at whether people are better leaders is finding out if they know more things. And you do that through pre- and post-tests. One of our favorite pre- and post-tests is called the Leadership Practices Inventory. It comes from The Leadership Challenge, which, if you know anything about business books or you're interested in leadership at all, has been around for almost 40 years and is widely regarded as a really helpful model for leadership. With it comes an assessment on five metrics. Model the Way: what kind of role model are you? Encourage the Heart, which is just what it sounds like: how good are you at encouraging others and connecting with them on an emotional level? Inspire a Shared Vision, which is really not just your vision, but inspiring a vision that's shared among multiple people. Challenge the Process: making sure you're not satisfied with the status quo. And Enable Others to Act. Under each of those five things there are a number of questions. Folks take the Leadership Practices Inventory before one of our programs, and then they take it again after the program, and they're able to see — and we're able to see, and everyone who's interested in the progress of our organization is able to see — what kind of growth they've had, at least in what they knew before and after the program. There's also a way you can tie pre and post to behavior, and this has become one of our most important talking points in the last two years. So we talked about satisfaction: what did people think of our programs and offerings. We just talked about pre and post: did someone learn more after the program than before?
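A pre/post comparison like the one described above is usually summarized as the change in average score per practice. A minimal sketch, where the five practice names come from the Leadership Practices Inventory but the scoring scale and participant data are hypothetical:

```python
PRACTICES = ["Model the Way", "Inspire a Shared Vision",
             "Challenge the Process", "Enable Others to Act",
             "Encourage the Heart"]

def growth(pre, post):
    """Per-practice change in mean score between pre- and post-test."""
    out = {}
    for p in PRACTICES:
        out[p] = round(sum(post[p]) / len(post[p])
                       - sum(pre[p]) / len(pre[p]), 2)
    return out

# Hypothetical scores for three participants on each practice.
pre  = {p: [5, 6, 4] for p in PRACTICES}
post = {p: [7, 8, 6] for p in PRACTICES}
print(growth(pre, post))  # every practice up 2.0 in this toy example
```

Reporting the delta per practice, rather than a single overall score, shows stakeholders where a program is moving the needle and where it isn't.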
So: giving them the instrument before the program, and giving them the instrument again after the program. And then the behavioral piece is another way you can do a pre and post. What we've started doing is this. Emergence is our signature leadership program; it's where we started 11 years ago, it's eight weeks, and it's what people know us for. We now ask people, when they register, how involved they are in the community. Our mission is to create leaders for East Central Indiana, so we want to know, when you interact with us, how much are you already leading? At what level are you leading? And what we've been able to do, using a tool like QuestionPro, is reach out to our graduates one year, three years, and five years after they graduate, which is a model that a lot of alumni engagement offices use on college campuses. My background is in higher education — that's what my master's degree is in — so I learned some of these tools during my time on a college campus. Colleges will often reach out to their alumni at one, three, and five years, and oftentimes even longer, to see where they're at. So that's what we do: one year after completing our signature program, and then three years, and then five years after, we find out how they're involved in the community. Have they taken a leadership role? Are they serving on a not-for-profit board? Have they been promoted? Have they taken on a new job? Are they taking on more opportunities at work? These are measures of behavior. When we're trying to figure out what it means to lead — and there are lots of ways to lead — we've asked: what are the behaviors that leaders exhibit? Then we ask those questions after people complete the program, and you can see some of the statistics we've been able to report. The biggest one is that 75% of our graduates have taken a leadership role in the community.
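The one-, three-, and five-year outreach cadence described above can be scheduled directly from graduation years. A minimal sketch, with hypothetical graduate names and years:

```python
FOLLOW_UPS = (1, 3, 5)  # years after graduation to send the survey

def due_for_survey(graduates, current_year):
    """Return graduates whose 1-, 3-, or 5-year follow-up falls this year."""
    return [name for name, grad_year in graduates
            if current_year - grad_year in FOLLOW_UPS]

# Hypothetical graduate roster: (name, graduation year).
graduates = [("Ana", 2017), ("Ben", 2015), ("Cam", 2013), ("Dee", 2016)]
print(due_for_survey(graduates, 2018))  # ['Ana', 'Ben', 'Cam']
```

Running this once a year against the full roster gives the outreach list for each survey wave, so no cohort's follow-up window is missed.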
You can see the number of folks who have volunteered for extra assignments at work or serve on a not-for-profit board, and that's huge for us. And I would guess that for many of you, that could be huge too, because what I do as the executive director is, now I have just two to three major statistics that, when I'm in front of funders or donors or people who want to learn more about us, I can say: you know what, 75% of our graduates take a leadership role in the community. That makes an immediate impression on people who are interested in our work, and it tells a story in a very little bit of time. And for the data-oriented people whose support or engagement you're trying to seek, it helps them understand very quickly that you're effective at what you do. So, measuring behavior: try to drill down to those two to three key talking points. If you could stand in front of a group of people and give them two to three data points for impact, what might those be? And you can see what those are for Shaffer Leadership Academy and how we've been using them. So again, pre- and post-tests are helpful for assessing participant growth and also for tracking behavior. Before I talk about board development, I just want to pause real quick and see if there are any questions on the topics I just covered. Hey, Mitch, yeah, we do have some pretty good questions coming in. Let's see. I'm wondering if I should... I think they're probably relevant to both you and Vivek, so I'm going to hold off until the Q&A, which is coming right up, and then we can just tackle them all at once. Okay, happy to. Well, let me move forward with the second part, explaining how we use QuestionPro for board development. That's not a picture of my board. I wish we had a board room that was that nice. Our board room is not that nice, but it does make for a nice picture. You know, if you're like us, engaging your board can be a challenge.
Our boards are volunteers. They commit to our organizations because they have a heart for our work and they care about what we do, but at the end of the day, this is not their full-time job. Engaging your board in just a little bit of time can be really challenging. So I hope you will agree we've found a couple of ways that are quick and efficient that help us not only engage our board, but better assess our board for board recruitment and for better board effectiveness. So let me tell you a little bit about those things. We have a board matrix, and you can probably see a few of the items on the screen, but we give this to our board before we start our board recruitment season. Think of it as an inventory to analyze our current board. We ask demographic questions, you know, about race and gender and age. We ask background questions: do they have experience in fundraising, budgeting, human resources, government? We ask questions about their access to money. We ask questions about their connections to the community. What we've done is built a profile: if we had a perfect board member, what are all the qualities that that perfect board member would possess? And then we send this tool out to the board online, it takes them about five minutes to complete, and they check the boxes for the qualities they actually do have. Not every board member has everything. You might have a board member who has a lot of money to give but not a lot of time, or a board member who has plenty of time but not a whole lot of money, or one who can give great legal advice but couldn't market their way out of a paper bag. So, you know, we all have different needs from our board members, and this is creating a list of all of the things you could possibly need in a board member and giving it to each board member to complete online.
And then the great thing about a tool like QuestionPro is it aggregates all that data. What we're left with, you can see in the pie chart, is where our areas of expertise are, for instance, or where our demographics are, or where our connections are. And once you know where your board is strong, it also helps you understand where the board is weak. Oh, look, we don't have very many people of color on the board, or we have a lot of men and not enough women, or we have a lot of folks from government but no folks with legal experience, or we have a lot of folks from this sector of the community but not that sector. And so board members can do it quickly, anonymously, in five minutes, and once all that data is aggregated, our board recruitment team has a really good understanding of where the gaps are, and we can build a profile of the board members we need. And so we can be strategic when we go out and look for board members, because we know where our gaps are. The other thing we do, quite simply, is a board exit interview. When a board member is done with their term, and we have three-year terms, probably like most of you, that can be re-upped for a second term. But when somebody's done being on the board, either because they term out or their circumstances change, they meet with the board vice president. And before they meet with the board vice president, we give them this very simple survey where they evaluate their time as a board member, they evaluate their relationship with me, the board president, and the other board members, and they also provide some written feedback about their experience with the board. Then they sit down with the board vice president and go over that. Why the board vice president? Well, because as you all know, if a board member is going to have problems, it's probably going to be with the chief executive or with the board president. Rarely is it with the board vice president.
So we find that it's helpful for the board vice president, because they're usually an easy person to talk to, and it also gives them information before they move into their term as president, because in our organization, once you're board vice president, next year you're expected to be board president. So very simple. Folks can do it in just a couple of minutes. The key takeaway here: quick demographic surveys, simple evaluations. Again, time. Your board members are busy. These things can be done very quickly, it gives them an opportunity to do it online, and you get to see all of the data aggregated and use that information for better board effectiveness. And with that, I'm going to turn this back over. Awesome. Thank you. Sima, should we just go into Q&A right now? Yes, we can go ahead and move into Q&A. So let's see. Okay, sorry, give me one second to gather some of the questions. Okay, so we have a lot of good questions coming in. So in terms of the length of surveys, is there a best practice around, like, how many questions is too many questions? And, you know, if you do choose to incentivize people for a longer survey, I think the challenge with nonprofits is obviously you can't offer any sort of monetary rewards. So do you guys have any advice around that? Vivek, if you want to maybe take that question. Absolutely. I think what we've found is, especially with governments, nonprofits, schools, universities, and a lot of universities use us, the general perception is that you need to incentivize people. We don't think that that's true. And actually the data proves that a lot of people take surveys because they actually want to help. They want to give feedback.
And it's not always true that you need to give somebody a $5 Amazon gift card to get feedback, especially if they have some level of vested interest, or even just interest, in you. Clearly you're not doing consumer research; you're getting feedback. Typically, rewards are given for consumer research purposes, where a Procter & Gamble is trying to come up with a new kind of Gillette razor and they want to try out some new ideas, so they go out and get some research done. That's when rewards are typically given. Within the nonprofit ecosystem, I've rarely seen people giving rewards, and frankly I don't think it would substantially change the response rates. What does change the response rate is the way you communicate and the way you ask for people's feedback. And there is a science to that: make sure you say, here's why we're asking for the data, and here's how your feedback is going to impact us. If you can explain that clearly to somebody, your propensity to get them to respond and spend five minutes with you to take the survey is exponentially higher. So that's what I would say. That's my opinion. You don't necessarily need to give rewards. The answer is: look, we need some feedback, we need some input, and here's how your feedback will actually change our behavior. If you are aligned with that, then please spend five minutes, and you'll get people to respond. Now, the second part of the question was about long surveys versus short surveys. Clearly, the shorter the survey, the easier it is. I mean, obviously nobody has the time to go through a 30-minute survey.
I have seen organizations that get carried away in designing the survey, because it's actually fairly easy to design, you know, a 50-question survey or a 30-question survey. I would say the general rule is that somebody's got to be able to complete the survey within five minutes. After that, there's a huge drop-off in terms of attention span. I would say two to five minutes is where people have enough time to do this. So that equates to about 20 to 30 seconds per question, or five to 10 questions. If you can condense everything within five to 10 questions, that would be ideal. That's my opinion. Mitch? Oh, I would echo all of that. I think that's good advice. The one thing I would add that we have found is that timeliness matters. Obviously not every organization here does the same type of work we do, but when we do a workshop, we try to have a survey out within 24 hours. We find that if it takes us much longer than that, the response rate falls off a cliff. So we try to reach people while they're still thinking about the experience they had with us, and getting the survey out quickly really helps. Yeah, that's a good point, Mitch. On the commercial side, the way to think about it is what's called the moment of truth. Right when somebody buys something, Amazon sends me a survey literally within 20 minutes of me receiving the package. It's really important, because obviously we all live very busy lives, we don't remember a lot, and there's a recency bias. So, again, Mitch had a good point about making sure the survey goes out as close as you can to the point of interaction. Great, that's super helpful. Any other questions? Yeah, we have a lot of really good questions coming in.
So another question, in terms of anonymity: giving people some ease around offering this information and making sure that it's private. Are there any best practices, or anything that you specifically communicate to the people you're sending it to, so that they know it's anonymized? Well, for us, with the kind of work we do, somebody goes through a one-day seminar. We tell them at the end of the seminar, and then we also let them know in the follow-up email: we're going to send you a survey, and it's going to be anonymous. And I think when you're asking people to tell you their experience with an organization, it helps people be a little bit more candid when they know that's the case. So it's as simple as just telling people and hoping that they trust you enough to know that you're telling them the truth. But I would also say that for some of the things we do, it's not anonymous. If you're going to do a pre and post, you don't want it to be anonymous, because you need to know how you're impacting people. The same goes for measuring behavior: when we follow up with our grads one, three, and five years after completing the program, we have an online survey, but we also have interns who just call them and put the information into the online survey. In those cases, anonymity doesn't help us. So in the cases where it is anonymous, we make that very, very clear, but sometimes we need to know who the people are. We just tell them up front and, to Vivek's point earlier, we tell them why it's important to know who they are, and then move forward as needed. Yeah, I would echo all that. The only thing I'd like to add over here is that this has come up many, many times over the years.
To the point where we actually even built a tool, we call it respondent anonymity assurance, where we as a tech provider will guarantee that a survey is anonymous. This is not so much for nonprofits; for universities this has been a bigger issue. When universities are surveying their own students for research purposes, their institutional review boards typically require that the platform provider guarantee anonymity. So that can also be done. It's a flag that you can enable on QuestionPro, and then IP addresses and email addresses are all masked out, so nobody has any personally identifying information. Even though it's collected, it's not available, and you can go down that route also. But again, the answer really is contextual, frankly. In some cases anonymity is valued, and in many cases anonymity is irrelevant. Great, super helpful. So in terms of response rates, is there a healthy number in the survey world? If you send out 100 surveys and 30 people respond, is that considered good or bad? Or is there a general number that people should keep an eye on? Again, it's extremely contextual. Go ahead, because I know what works for us, but you see this more globally than us; I imagine you have a broader perspective on this. So please go ahead. Yeah, I think it's very contextual. It's extremely contextual. We've had surveys with response rates of 80%; employee surveys typically have extremely high response rates, right? A CEO can send an email to 40 of his employees, and pretty much everybody is going to take it. Whereas when you send out a survey to customers, you get lower response rates.
If you get response rates of about 10%, you're super happy, because customers are not in the business of taking surveys for you. So really, my answer to this typically is that the response rate is a direct reflection of the strength of your relationship with that person. In the case of board exit interviews, for example, I imagine the response rates are very high, because, look, you work closely with the board member, they're leaving, and you're sending them a survey. They're very likely going to take it; eight out of 10 are going to take it. Whereas if your relationship with that person is weak, then the odds of them taking it are lower. Simple. So, as for the range, I hate to be a consultant and say it depends, but general customer surveys for us are in the 5 to 15% range, and I would say that's pretty good for customer-oriented stuff. For employee-oriented surveys, where you have a much stronger relationship (I'd consider an employer-employee relationship a fairly strong relationship), it's usually in the 60 to 70% range. At least that's what we've seen. Mitch? Well, I think that you're exactly right. It is contextual. I chuckled a little bit when the question came in, because it took me back to my grad school days. This was certainly something we talked about in my master's program, and I think everything Vivek said is spot on. As a rule of thumb, I was taught (a rule of thumb, not by any means a fixed rule, just a guideline) that a 20% response rate is pretty good, depending on your sample size. So I was always taught that if you can get around the 20% mark, it's probably a reliable measurement. You know the statistic I quoted about the 75% of our graduates who have gotten involved?
I think that was probably about a 40 to 45% response rate, but that was with interns calling people a lot to get them to respond. For basic stuff, in grad school, what I always heard thrown around was about 20%. Perfect, that's good to know. I think it's nice for people to have a number to focus on. So another question that we got was about paper surveys versus online. For example, you brought up volunteers earlier in the presentation. So if there's some sort of volunteer event and people are leaving and you hand them a survey, do you find that pretty much all of your surveys are done online, or are paper surveys still a thing? Almost all of our surveys are done online, with the exception of one of our workshops, where an old-school facilitator really likes paper. And because that facilitator likes paper, we hand out paper, and the response rate, of course, is significantly higher when you do paper. My only concern about paper over online: you're going to get a better response rate, but working in leadership, one of the things we talk about is people having different thinking, learning, and working styles. And for those people who need some time to think about an experience, I don't know that paper is always the best format, because there are people out there who need it. Sometimes they need a couple of hours, or maybe a day, to process whatever it is that just happened. And when you hand them a piece of paper and say, fill this out, you may not be getting a complete response, because, as my wife keeps reminding me, she's a processor, right? I'm not. I know how I feel immediately. But in my marriage, we've learned that we think differently. So for the processor, sometimes they need time. Paper has its uses for a high response rate, but it's not my recommended way of doing surveys. Okay. Yeah, so one of our missions is to eliminate paper surveys, clearly.
We are a tech company, and that is completely connected to what we believe in. We do have a solution, in fact it's one of the products that we sell, and it's actually included as part of the TechSoup offering: an offline app that people usually download on an iPad. And I think the question was surrounding an event. Commercially, we do that at conferences, and a lot of nonprofits have also used us for collecting data internationally, you know, going to Africa and collecting data. The mechanism is: you download the QuestionPro offline app. You set your survey up online, exactly the same way as usual, but the structure of the survey gets downloaded to the app, and then you don't need to be connected. You can go wherever you want and collect data in the app, on the iPad. And then once you're back in Wi-Fi range, you can synchronize the data back to the cloud. That works very effectively in developing countries; we've seen it used for census-type data collection, frankly, not just surveys but census data. Health services providers have used that mechanism to collect data for impact assessment of programs they're running in remote parts of the world, where obviously there's no cell phone connection and no internet. So instead of paper and pen, you can pass an iPad around and make that happen. That's another mechanism you can use. That's great, yeah. I think that probably helps with the privacy as well. So that's really helpful. So another question that we got, let's see.
So: do you have tips on creating a brief survey that captures stakeholder feedback on how our programs are viewed and how they may value our programs slash organization? I think that might be a good question for Mitch. What are your key performance indicators? What are your metrics? It all comes back to your mission, right? So if you think about your mission statement, what are three to five questions you could ask that capture that? I explained what our mission was, so I would encourage you to think about your mission, because that's always our true north in the not-for-profit world. And then, what are a couple of questions that reflect that mission, that help you truly understand the job you're doing at meeting it? You'll know that not only from the work you do, but from the opportunity to ask the stakeholders their perceptions. And so I think if you bring it back to your mission, and questions that reflect your mission, you're not going to go wrong. Great. Okay, so I think we have time for one more question. And again, this is probably a little broad, and you kind of went over this earlier in the presentation, but in terms of analyzing the data once you have it, do you have any best practices around that? Yeah, go ahead, Vivek. Yeah, I'll be brief and then you can jump in. I think the general model is that you start with the high-level, all-aggregated data, which is produced by default. So: here's what everybody thinks. And then, depending upon the volume of data you get, the next level is to start segmenting the data and asking, look, do males think differently than females? Do people with five years of board experience think differently than people with 20 years of board experience? And you gain some insights from that. So that's the typical process of understanding data.
You start from the top and you say: this is what everybody thinks. Let's say we did the survey and 70% have done a survey before. A follow-up question would then be: what kind of surveys have you done? Was it board development? And you go down that process. So my advice here is that the best way to do it is to start from the top, look at the aggregated data, and then start doing the segments. I think that is spot on. The only thing I would add, and Vivek kind of touched on this earlier, is the importance of qualitative feedback. Almost all of our surveys are both quantitative and qualitative. So we're asking those rating and ranking questions, but then we're also asking open-ended questions that complement them, and I've found that that's helpful as well. When you see a data point that maybe leaves you scratching your head a little bit, when all you are seeing are the pie charts, you can usually then go back to the open-ended responses. One of the things I like about tools like QuestionPro is that if you see some dissatisfied responses for something you're doing, you can click on that. You don't know who the people are, but you can see the dissatisfied responses, and then you can go through and look at all the rankings, and you can also look at the open-ended qualitative feedback, and usually you're able to come to a more complete understanding of what happened. Perfect. All right. So I think we heard some bells in the background, which is our cue to wrap up the webinar. So I just wanted to go over a couple of last things. Thank you so much, Vivek and Mitch, for presenting today. I think you guys probably saw Lechica's chat, but we have a program with QuestionPro, so if you go to techsoup.org slash QuestionPro you can get more information if you are interested in surveying your audience. And you can also go to our product catalog for more general information on what we offer.
So just one thing that we like to do: we'd like to ask you to chat one thing that you learned in today's webinar. And in the spirit of surveys, we also have a survey, and we would love your feedback, because it really helps us shape future content. We want to know what you want to learn and what you learned today, and any feedback you can give us is always really helpful. We're on social media, so if you guys are on Facebook or Instagram or Twitter, we love social media love, and we post a lot of tips and tricks and things like that on there as well. And then we have a blog where we also post a lot of articles and how-tos, so check our blog for more information. And we have a few upcoming webinars; you can see the schedule here. I won't go through each one, but we'll be sending out the slides, and this is also hosted on our website if you're interested in joining us for a future webinar. I'd like to thank Vivek, Mitch, Lechica, John, and Esther, who were on the back end answering questions, and thank you all for participating.