Fantastic. Welcome to Creating and Building a Legal Aid Dashboard. As Brian mentioned, I am Laura Quinn. I am the founder and now the director of partnerships and knowledge at IdealWare. For those of you who may not be familiar with us, IdealWare is a nonprofit research organization geared to providing information on everything from very nuts-and-bolts infrastructure technology, through program and communications topics, all the way to cutting-edge technologies. So we're a resource with tons and tons of free articles and reports on our website, and we've been doing a lot of work with the legal aid sector, including LSNTAP and, in fact, LSC, which brings us to Peter Campbell. We're really excited to have you here with us today. Peter, do you want to quickly introduce yourself?

Yeah, I'm Peter Campbell. I'm the Chief Information Officer at Legal Services Corporation, here less in that official capacity than in my more general capacity as somebody who has worked with dashboards over the last decade or two.

Absolutely. So we're excited to have Peter with us, as well as Brian, to provide some context for the information we'll be talking about. Which is what? We're going to start by talking a little bit about what a dashboard is, and then we're going to walk through a process for figuring out what it means to you. Dashboard is one of those interesting words where people often feel like they know exactly what it is, and everybody has their own definition. We'll talk through defining what it is for your organization, understanding what people want, creating the metrics and figuring out what they are (probably the most complicated part of the dashboard), choosing your platform, charts and displays, implementing and rolling out, and iterating, which is a really important aspect here, even more important than usual in a technology project. And then we'll turn specifically to three or four legal aid case studies, so you can see what folks in the legal aid world are doing with dashboards.

So basically, a dashboard is anything that helps you consolidate information. If you imagine the dashboard of a car or the cockpit of an airplane, it's the idea of a lot of information all brought together in one place to help you easily monitor and manage things. And it can track a lot of different kinds of data, which is one of the reasons that if you ask different people, you'll get different definitions of what a dashboard is. For instance, Brian was just mentioning that LSNTAP hosted a seminar on Google Analytics dashboards a week or two ago, and that is 100% lots of data coming together into one place. But I think many people, when they think about it, tend to think of it as more of an organizational control panel. It's going to vary from person to person, as I mentioned, so it's really important to think through what it is for you. So: operational data, spend and budget, program impact, all of those things.

Good comments here. One of them: my organization is new and we don't have a dashboard system, but we want to use one for our board of directors, so they can know what's going on, the health of the organization, that type of thing. And the other is Viewer 26, who asks for examples of dashboards for case handlers, board members, and funders; those are kind of the common ones. Fantastic.
Yeah, and for both of those comments, that's great to know: both folks are looking for kind of high-level information, stuff for a board member, and then thinking about how that compares to what other people in the organization might use. And that's exactly the line we'll be going down. So think about everything from the operational day-to-day, things like: what are the upcoming milestones, what does the calendar look like, what is the number of open cases assigned to each attorney. Then more specific things, like the average time a case is open, time spent per program, or being able to see caseload by different areas of law. Or, and I think for some people this is kind of the holy grail of an operational dashboard, the idea of seeing program impact: being able to see, are you succeeding? How many cases have you won? What is the trend? Here in the lower right: foreclosures prevented. Being able to see the impact of your organization all gathered together. And in fact, there's been a fair amount of work in these areas over the last couple of years, so I have two new case studies for you on people using dashboards to track outputs and, sorry, outcomes and impacts.

We'll talk a little bit about software, but unfortunately, while many people think of this as a technology problem, it's not one that technology can easily solve for you. A lot of people, in fact, are using something as simple as Excel, or the existing systems they currently have, so case management or grant management software, or things like Crystal Reports. We'll talk specifically about software, but it's not fundamentally a software problem. So that's my overall definition of a dashboard and how it fits in. Peter, where have you seen dashboards having the most impact for organizations you've worked with?

I mean, I think you use dashboards for different things. If you ever go visit Microsoft in Seattle, they have screens up throughout their building showing largely outputs as opposed to outcomes: how long the server has been up, what the situation is. So that's clearly operational, just saying "is everything functional" and being able to see it in one place. More often in our sector, they're used for exactly what you were just showing: tracking the effectiveness of the organization and its outcomes, so that those cases can be made to funders and so you can keep a good eye on what you're accomplishing with your work. And then one thing that we push is that the best use of dashboards is to learn from them, to have that big-picture view of what's working and what isn't, and from there you can apply different strategies and see how they affect things overall for your organization. So not little silos of "we did this and this happened," but "we did this, and these are the things that changed across the board in how we operate."

Yeah, absolutely. And thinking about things across the board includes, potentially, caseload by individual case handler and things like that, so making sure we're thinking about the operational as well as the overall impact. Any thoughts to add on how dashboards fit into legal aid work?

I think a really good way to use them is to measure social media, specifically when you're putting things out there: trying to see what the reach is and what the feedback from the community is.
Most social media sites have very good built-in dashboards where you can see how adding an image impacts things, or how asking a question has a different impact than just stating something. Being able to watch those things over time lets you know what content is working and where your clients are really engaging.

Absolutely. Toby is asking a really interesting question: has anybody advocated for courts to use dashboards, so legal aid providers and the public can understand how our courts are working? Interesting. Peter or Brian, do you know anything in this area?

I know a gentleman who worked with the King County courts who was looking at doing something inside the courts to let people know what was going on. The project was still in its planning stages the last time I talked to him, though.

I haven't heard of any such projects, but I think it would be awesome.

Yes, I think clearly it would be awesome. The buy-in sounds like the challenge there: figuring out where the money is coming from and how to get the courts on board. Fantastic.

All right. So let's think through those seven steps we talked about, starting with defining who and what your dashboard is for. Given the whole variety of things you might use a dashboard for, you could have as the core users anything from the general public, to board members, to individual advocates, to managers; it could be across the board. So it's very useful, if not critical, to define who the one or two highest-priority users are, and then obviously to say who else the audiences are. It's really going to be problematic to say "we want to make this dashboard be all things to all people." If you fundamentally need something to support the board, make sure you understand the board's needs and can in fact meet them, as opposed to saying, all right, we're going to make something that's useful to both advocates and board members, advocates for day-to-day use and board members for monthly use. That sounds like a real challenge.

And think through your overall goals. Organizations can have some fundamentally different goals. One might be to help everybody see a centralized set of metrics, so everybody is on the same page as to how the organization is doing, what's going on, et cetera. Or you could have almost the opposite goal, which is to say that each staff member should be able to customize their own dashboard to manage, operationally, what they need for their day-to-day job. These aren't completely opposite, so you could do some of both, but it's important to recognize there's a tension between them: the more people can customize, the less everyone can simply say, "okay, this particular metric, the one on the upper right-hand side," and have a shared understanding of what people are looking at across the organization.

So there was a great question here: are there any orgs out there with pro bono programs that use dashboards to help manage those programs? Unfortunately, I am not aware of any, although the web lawyer project, the Tennessee online clinic project, does have some information from that online clinic stored in a central location.

In Washington State, we just implemented that, and we're still customizing it some.
We're going to try to bring in some of the different problem areas, the lawyers available, that type of information, and put it all onto a single page. So I think we're not that far from moving in that direction. I know there are some volunteer lawyer programs that use constituent management software, and those usually have very good dashboards in them, but I'm not sure.

I would say that in our sector there are a handful of programs that use JusticeServer, which is the Salesforce-based, kind of light, case management system. I think it was the first pro bono gateway, or one of the early pro bono gateways, and Salesforce has built-in dashboarding that's very easy to get going, so I'm sure it has good pro bono dashboards. And then I'd be curious, because I know Kemps and LegalServer both have pro bono gateways as well at this point, and I'm not sure whether Pika has one yet, whether they also have good metric displays and dashboards built into those or not.

Yeah, absolutely. And I think that's easy enough to get. We should ping those three vendors, try to get some pictures, and share whatever is available on them.

Great. And for something like that, it's obviously always useful to have an example of what other programs are tracking on their dashboards for a pro bono program. But it's also worth noting that a lot of this comes down to what you want to track about your programs: what are the numbers you need in order to effectively manage the program? And, as we're going to talk about in just a minute, a lot of the question of "do dashboards exist in any given area" is not necessarily "is there stuff out of the box" but "how do we define what we want to know." Because honestly, once you know what you want to know, what metrics you want to track, the actual creation of the dashboard is probably not a vastly complicated ordeal. You may or may not be able to do it in the system you've got, but something like plugging numbers into Excel may work perfectly satisfactorily for your needs. So think of it not as something you necessarily get out of the box, but as something that will almost always be customized to your organization.

Someone asks: is there a rule of thumb for how many metrics should be on a dashboard, five, seven? We're going to get into the idea of visualization a little further down our list, so let's hang on to that. The quick answer is: it depends a lot on how you're visualizing them, and who the audience is.

Absolutely. Another piece of talking through goals for your dashboard, and thinking through what's important to your organization, is this key question: is it important that people can self-serve? Is it important that they can turn on the faucet themselves and get this dashboard? Or is that unlikely? For a board member, for instance, it's kind of unlikely that they're going to want or need to self-serve, to go to a website and see a real-time, up-to-date set of metrics. It can be just as effective, and potentially a lot easier, to think of the dashboard as something that's generated every month or every week or whatever, rather than something people can go to at any moment to see.
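As a rough illustration of that generated-every-month idea: a dashboard like this can be as little as a short script run over an export from your case management system. This is a minimal sketch only, and every file and column name in it (case_export.csv, attorney, area_of_law, and so on) is hypothetical; your own export will look different.

```python
import pandas as pd

# Hypothetical export from the case management system:
# case_id, attorney, area_of_law, opened, closed (blank if still open)
cases = pd.read_csv("case_export.csv", parse_dates=["opened", "closed"])

open_cases = cases[cases["closed"].isna()]

# Operational metrics from the talk: open cases per attorney,
# caseload by area of law, average time a case stays open.
print("Open cases per attorney:")
print(open_cases["attorney"].value_counts(), "\n")

print("Open cases by area of law:")
print(open_cases["area_of_law"].value_counts(), "\n")

days_open = (cases["closed"] - cases["opened"]).dt.days  # NaN for open cases
print(f"Average days open (closed cases): {days_open.mean():.1f}")
```

Run monthly, the printed output (or the same numbers dropped into a document) is already the "piece of paper slipped under the door" for a board-style audience.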
And in general, like in any technology project, you want to define what success will look like. How will you know this is a good project, done well, that meets the needs defined for it? Just saying "we're going to create a dashboard" is not at all specific enough for you to know whether you're done or whether it worked. And there's a real danger here in saying, "okay, everybody, what things would be useful to you? Great, we've got 60 different metrics, we're going to glom them all together." That's really the opposite of what a dashboard is for: an overwhelming ton of information that's probably not well organized as to how it fits together or what it means. In that case, by attempting to make something for everybody, you've in fact made something that's not terribly useful to anyone. Peter, what would you add about defining your goals, before we move on to actual requirements from users?

Well, again, I think it really depends who the audience is. You're going to make a different dashboard for your board than for your internal staff. That speaks to the number of metrics, too. We have a dashboard for internal staff that scrolls down and probably has about 15 or 16 different charts and graphics on it. If we were trying to make a more readable summary of what we're accomplishing, we would cut that down to four or five, for maybe an external audience. That wasn't your question, though.

No, I think it is: defining who it's for is a fundamental question. Absolutely. Great. Brian, anything to add?

No, I definitely agree that the focus of it really matters a lot. I get a lot of people asking to customize or add different things, and it really depends on your user base and whether that is needed or not.

Yep, absolutely. I think also, if you're using software that supports charts and dashboards, like SharePoint or Salesforce, or even, I know, LegalServer, the best ones will allow you to, A, have a company dashboard that stays fixed, per the people in the organization responsible for defining what goes on there, in addition to personal dashboards where people can pick what they want to look at.

Yep. Often people use that as: there's the organizational dashboard, which is what everybody thinks about in decision-making, and then there's the operational day-to-day, where each staff person can look at whatever they want.

Yeah. Right. And what makes the individual dashboards really accessible is when they don't require code. People just have drop-down menus; they can easily pull, choose, and customize without having to go through IT to do it.

Yep, absolutely. Great. All right, so let's talk a little bit about understanding what your users want from the dashboard. This is a really interesting requirements-definition process, because it's not necessarily about what the gaps are in terms of features, as in a more typical requirements-gathering process, but about what information people are lacking in order to do whatever decision-making you're trying to support. So talk to people about their current processes, and think about the gaps and the workarounds.
What are people not able to do, or what crazy workarounds are they doing, because they're missing information that could more easily be put at their fingertips?

Back in a former life, I actually did a fair amount of reporting design, and something that worked really well for me is what I call the magical dashboard, very useful for a non-technical audience. Basically: imagine that every week or month, or whatever timeframe you're using, a piece of paper is slipped under your door. What information would be on that piece of paper, with basically no limitation on what's feasible? Any magical information can show up on this piece of paper. It's a very intuitive exercise for many people. Some people have a hard time thinking in metrics at all, but for somebody who knows numbers but not systems, this can be a very intuitive way to gather what they want to know. And in my experience, people would say, "oh, well, what I would really want to know, and this is really crazy, is the number of advocates working at 50% or more, or 80% or more, in terms of allocation." And you hear that and you're like, oh, we've never really looked at the numbers that way, but that's actually really easy. So you'll hear needs that people have assumed are not possible that in fact are, and of course you'll also hear wants that are not possible. But people don't tend to have a very good understanding of what's actually easy and what's hard when it comes to reporting.

I have totally experienced that: people think the simplest thing is really hard, so they don't ask for it, and vice versa. The other thing is, having rolled out a number of reporting and dashboard systems in my career, I've often found that asking people what they want to see in the dashboard is kind of like asking, "what's not in your pocket?" You can tell me what's in your pocket really easily, but if I ask what's not in your pocket, that's really hard to pin down. So if you're doing requirements gathering for what people want to see on the dashboard, come up with examples. Never go to them cold and say "let's discuss this in a meeting" with no visuals. Also work with the people, like in your finance department, who are comfortable with numbers and will be able to come up with examples, so you have not just your examples but real-world examples from other departments to show the people who will struggle more with it.

Yeah, when we were working on a dashboard project here for our CMS, we did two things. We came up with a little one-pager in advance explaining what dashboards were, with some visual examples. And then when we went to people, we tried to figure out what types of information they accessed on a daily basis, so we asked them some general questions about what they were doing repeatedly and then tried to construct something for them in kind of a paper-prototyping way.

Yep, absolutely. Great, lots of different approaches there. And in addition to metrics, which we're going to talk about in a second, try to define what they really need. Is it just something simple? Are there more complex indicators? Do they really need to tailor it?
I find that many people think they want to tailor it, but in fact, what they want to tailor it to is the same thing many other people want. So as a starting point, it can be easier to create one thing that's not customizable, depending on what you're using. Think about drilling into details. Scenario planning is something that doesn't come up very much in the legal world, but it's the idea of saying: all right, what would happen if, for instance, we took on this gigantic claims case? What would that look like in terms of allocating lawyers' time? Is that type of thing desirable or necessary?

So that's the high-level understanding of needs, and it bleeds a lot into mapping out what metrics you're actually going to track. Let's dive into that one. This fundamentally goes all the way up to the strategic, mission-based level: what are the fundamental goals of your organization, and how are you going to track them? If a fundamental goal of your organization is affordable and decent housing, what metrics will you be able to track to say whether you're actually on your way to that goal? This goes back to the idea of what outcomes you're tracking and how they connect to your mission. So this can become pretty tricky. Even on the operational level, it gets at things like: if what you want to know is how busy people are, what metrics will actually track that? And it's important to define not only what's desirable in an ideal world but what information you can actually collect. Unlike some other types of projects, we really discourage people from saying, "all right, great, I need a dashboard, so I'm going to put in a TIG grant for $100,000 to do that." That's not the way to start. It might be the way to continue something that has worked well for you, but you want to start with something you can actually do and iterate from there, because, as we'll talk about in a minute, iterating is especially important in the dashboard world.

So, like I said, defining your metrics could take a day; you could have a two-hour meeting to define some simple operational metrics, or maybe there's already a bunch of metrics you're tracking and you just need to centralize them into the dashboard. Or this could be a year-long strategic process; it could be infinitely large. And then think about where you're actually going to get this data. Do you have it? How easy will it be to get it from where it is into a dashboard? We'll talk more about the technical aspects, but just because the data exists doesn't mean it's easy to pull together, or especially to pull automatically. Fundamentally, this is where most dashboard projects live or die. It's great to say "we have defined our goals for a dashboard and we've got this beautiful technology solution," but if you don't know exactly what metrics will be useful to people, and you don't have buy-in across the organization as to what those are, it's going to be very hard for your dashboard to succeed. So this, step number three, is where dashboard projects live and die. Peter, is that your experience as well?

Totally. I've seen them have the most traction when the board is demanding it, and then flounder a bit when, like anything, it comes down to
where the person driving the project is situated.

Right, absolutely. And Brian, how have you gone through processes of actually defining what to track in the past?

What metrics to track? I mean, it's definitely sitting down with users, interviewing them, having them give it a try, and then going back and revising. What they think they're looking for may actually be different once they start looking at it, and when they start to see trends, there may be other variables or other things they'd like added. So get users involved as early as possible and then update it based on their needs.

Yeah, that speaks to the idea that this is a super-iterative process throughout, because it's very difficult for anybody to know exactly what the data will look like, what the metrics will be able to tell them, and how they'll use them to make decisions until they actually have that data in front of them.

All right. Let me pause, before we walk through some of the more tactical aspects of this, to ask for questions. Anybody have questions for any of us on the whole strategy-definition side of dashboards? Anybody have comments about what has worked well for them, or not so well? You're welcome to either hit star six and say it out loud, or enter it into the chat.

So, there is a question in the chat. Jessica asks: does anyone have an example of an effective dashboard requirements-gathering questionnaire that they've used? I'm going to turn to Peter and Brian, but I would just say that, to me, a questionnaire would be an adjunct to interviews. Maybe you would have an interview guide with questions, but I would really encourage you to think of a survey only as, possibly, a way to reach a bunch of people who aren't really core stakeholders. Brian, Peter, any experience with questionnaires for this?

Yeah, I have to say I haven't personally done a questionnaire. I've always done this in meetings.

We've definitely just done interviews with a few questions. I'd have to look back; if I have one, I will definitely post it to the wrap-up for this, but I think it was less formal than that when we did it.

Great. And I can look back through my past documents to see if I have one; something I did a lot was interview guides for this. And Nola mentions that he would love to see a bunch of examples from legal services organizations. Absolutely; we've got, I think, four examples before we wind up this particular session. Fantastic.

All right, so let's talk a little bit about platforms. The idea of your platform is that it will take all of your data, from whatever format it's in, and show it in the form of charts, metrics, good stuff. The most logical place, if it works, is your existing case management system. It may or may not have the functionality you need, but it's pretty clearly the place to start, because it probably has the data. Fundamentally, the best place for the dashboard is wherever the data already resides, because moving the data and trying to match multiple data sets together gets really quite complicated. You can also think about an Excel spreadsheet, which sounds really boring but is, in fact, a really great place to start.
Because fundamentally, especially if you're doing a weekly or monthly dashboard, assembling it by hand is not actually that complicated. You could potentially spend an hour each month putting together a dashboard in Excel, and you can do things like a simple dump: you run a report out of one system and dump it into one tab, you run another report from a different system and dump it onto a third tab, and then the first tab automatically becomes a summary dashboard, using some Excel magic. That's the type of thing you can build that becomes comparatively straightforward to update. Obviously Google Docs, Google Charts, things like that are also useful.

Think about plug-in reporting tools, so something that sits on top of an existing system. This, for instance, is a plug-in module that sits on top of LegalServer; this is Atlanta, and we'll look more at this particular case study coming up. Or something like Tableau. Sorry, this is the wrong note for Tableau, because Tableau is incredibly powerful, and "straightforward" is the opposite of what I'd say: it's very powerful and very complicated. For those of you familiar with something like Illustrator or Photoshop, the Adobe products, I liken it to those. You can fiddle a little bit and get somewhere without specific training, but it's really easy to go down some rabbit hole where you're like, what just happened? Undo, undo. So it's something that even quite technical people might want to consider getting training in. It is free if there's only one data source; if you have multiple databases... sorry, go ahead.

That's not how I understand it. So, data visualization software like Tableau: as those programs go, Cognos is kind of the granddaddy, along with Business Objects, and there are at least 20 or 25 different ones out there, all incredibly complex. Tableau is actually about the simplest of that bunch, given that they're all incredibly complex; so it's complex, but not as incredibly complex as, say, Cognos. My understanding of the pricing model is that they have nonprofit pricing, but when we looked at it, it was still too expensive for us, even at the nonprofit discount. But what they have is Tableau Public. The deal with Tableau Public is that you can use Tableau and do visualizations on their public server, as long as you're willing to upload your data to that server, where other people can see it. So if your dashboard uses information that is aggregated enough that it isn't revealing any client-privileged or personally identifiable information, then Tableau can be free. And it is very powerful and flexible. One of the big advantages of Tableau is that dropping any Tableau chart or map or graphic into a web page is really simple.

Absolutely. I think that, in fact, it no longer needs to be public to be free. We can look this up. Okay, we should clarify that; we'll include a clarification in the wrap-up. Sorry about that. We, in fact, used it last year without it being public, and it was free. However, as soon as you want to attach essentially more than one Excel spreadsheet's worth of information, it becomes, as people are saying, very expensive. Yeah. Cool.

Let's see. There are also a lot of specific dashboard tools, things like iDashboards, GoodData, QlikView.
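Going back to that Excel dump-tab pattern for a second: the same consolidation can be automated so the dumps and the summary tab are rebuilt in one step. A minimal sketch, with entirely hypothetical file, column, and sheet names (it also assumes pandas and an Excel writer engine such as openpyxl are installed):

```python
import pandas as pd

# Hypothetical report exports dumped from two different systems.
cms = pd.read_csv("cms_report.csv")        # e.g. one row per case, with a "status" column
grants = pd.read_csv("grants_report.csv")  # e.g. one row per grant, with a "spent" column

# The "first tab": a small summary computed from the dumps.
summary = pd.DataFrame({
    "metric": ["Open cases", "Closed cases", "Grant dollars spent"],
    "value": [
        (cms["status"] == "open").sum(),
        (cms["status"] == "closed").sum(),
        grants["spent"].sum(),
    ],
})

# Write the summary plus the raw dumps to one workbook,
# mirroring the summary-tab-plus-dump-tabs layout described above.
with pd.ExcelWriter("dashboard.xlsx") as writer:
    summary.to_excel(writer, sheet_name="Dashboard", index=False)
    cms.to_excel(writer, sheet_name="CMS dump", index=False)
    grants.to_excel(writer, sheet_name="Grants dump", index=False)
```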
Tools like iDashboards, GoodData, and QlikView all sound like they might be an easy solution to a dashboard, but they probably are not. They're designed to be lower-end alternatives to things like Tableau or Business Objects or Cognos for fairly complicated dashboard needs. I include them mostly because they sound like they might be something really easy and in fact are mostly not: getting the data all together into one place, into one tool, is fundamentally one of the hardest things about this, and a number of these tools' primary function is to map together complicated, disparate data sets, which may or may not be something you're actually looking to do. Then there are things specifically meant for reporting: Crystal Reports, JasperReports, Business Objects, SharePoint services. All of those are potential solutions.

This long list of potential tools is basically representative of the fact that, depending on exactly where your data is, how it's stored, and how complicated your metrics are, there's not really, as I said at the beginning, a specific technology problem here. If there is a technology problem, it's getting multiple data sources to live together and overlap in ways that can be reported upon. Otherwise, this is mostly a realm of visualizing data you already have. And I would definitely recommend that you start with something like Excel, or even literally a piece of paper: type it up in Word, send it around, and start that as your iterating process, so you can make sure you know what people actually need and will use before you go down the road of a fancy tool or, in particular, building your own. This is Kellogg; Kellogg built their own, and they had a lot of money to do it.

Peter... actually, let's start with Brian, because we're always starting with Peter. Brian, do you have any thoughts as people consider technology solutions to dashboards? Any advice there?

I would just say, don't get intimidated by the high-end price tags out there. Definitely take the advice you heard: you can use Excel, start simple. It's not that difficult to get started. And as you're working with a lot of products, some of them will have built-in dashboards also. There are some great tutorials online for just using what's built into the things you already have.

Viewer 36 comments that Microsoft has free tools, like Power BI, that compete with Tableau and Cognos and things like that. I would say those are free; I don't think they're simple. So if you have technical staff on hand to figure them out, they might be worth looking at. And they add, as well, that the biggest challenge is having folks think through what data they need to look at and why. Yeah, absolutely.

So let's look a little bit at charts and displays, going back to how you visualize your data, how many things go on a page, stuff like that. A core thing here is that you want to match your metrics to your visuals. You want to make sure that what you're doing with your dashboard lets people effectively see the data, and lets the data itself take center stage.
That typically means things that, to those of us who want to do glitzy things, may feel kind of boring, because it puts you in the realm of bar charts, line charts, and just simple numbers. Pie charts are generally not the preferred method of data visualizers in particular. If you look at this particular pie chart: how does this red slice compare to this yellow slice? Really hard to tell. If you put them in a bar chart, it'll be immediately obvious what the differences are. Something like a scatter plot is a little less familiar; it's not the type of thing Excel does as easily as some other charts (it will do it, but not as easily), so we might forget about it, but for the right type of information it can be a really powerful way to show overall trends.

Hey, Laura? Yeah, go ahead. Just on that point, I'm copying and pasting a URL into the chat here. Can everybody see the chat? I don't know if they can. Maybe we can send this out afterwards. I think they can, yes. Okay, it's not a very readable URL; it's one of those tiny ones. But this is a blog post by Ann K. Emery, kind of a guru on this stuff, about when to use pie charts and when not to. It's one of the most brilliant things I've read in the last month: really wonderful samples of why people hate pie charts, explained really well, and what you can use instead to more clearly present your data. Ah, fantastic. Very nice. We'll definitely share that on the blog. Perfect.

Make sure your graphics are actually doing something. This is the type of thing: these are actually some of the Salesforce out-of-the-box visuals. I think that when people think of dashboards, they often think of fancy things like this. But if we look at this one, it's showing us that we've had $4 million worth of sales (not a legal-specific example), and that means we're in the red. There's really almost no information conveyed in this graphic beyond what would be conveyed by the number itself, color-coded. And if you felt it was important to display more than red, yellow, green, there's no reason you couldn't put a spectrum in here as well; it could be a ten-point spectrum to show whether you're really in the red or only kind of in the red. So this is the type of thing that uses up a lot of visual and cognitive space while telling you basically nothing. Be really aware of that, because it's exactly what you often hear from a really excitable board member or somebody like that: they want something really visually polished, and "polished" seems to translate to glitz in their mind.

There's obviously a huge amount more to know about graphics and dashboards, but let me take on the question someone asked earlier: how many is the right amount? It depends a little on how exactly you're showing them. You can imagine showing a metric as a simple number, which can be powerful, or a number with a color code: this is how many cases we did this month, here is how many cases we did per program area. That can be really straightforward, but you're going to have information overload if you try to put 30 of those on a dashboard. On the other hand, you could fairly straightforwardly put up a bar chart of 30 items, especially if they're ordered in some way.
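To make the pie-versus-bar point from a moment ago concrete, here is a small sketch using made-up caseload numbers: the same five values that are genuinely hard to rank as pie slices are trivially ranked as bars. Everything here is illustrative, and it assumes matplotlib is available.

```python
import matplotlib.pyplot as plt

# Made-up caseload counts, deliberately close together.
areas = ["Housing", "Family", "Consumer", "Benefits", "Other"]
cases = [114, 98, 103, 91, 107]

fig, (left, right) = plt.subplots(1, 2, figsize=(9, 4))

left.pie(cases, labels=areas)   # which slice is biggest? hard to say
left.set_title("As a pie chart")

right.barh(areas, cases)        # the ordering is immediately obvious
right.set_title("As a bar chart")

fig.tight_layout()
plt.show()
```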
Thirty bars would be a little difficult with 30 program areas, but 30 timeframes on a line chart becomes much easier to visualize. So one of the core things to ask yourself is: are there ways to align this data so that this metric and this metric and this metric can be associated with each other and intuitively grasped as one object? You probably need two or three objects before it will feel like a dashboard, three probably being the minimum, and then you probably want to limit yourself. I do agree with Peter: it depends. If you have an internal audience who looks at it all the time, you could have something six pages long and people just know to scroll for what they're looking for, but that assumes a user base that will be tolerant of more information than they really want. So I'd say, optimally, and I'm kind of making numbers up, eight or ten objects, and make sure they're not all really heavy objects; you don't want ten stacked bar charts like this guy here. I don't know, Peter, Brian, do you have any thoughts on how to make your dashboard visually un-overwhelming while still putting good information there?

I mean, I think you've really got to test them with users directly. Colors matter a lot. You can often play with the size of things or the colors, or display them in a different way. But what looks good to me does not look good to the senior attorneys we worked with, so you really need that hands-on testing. And thinking about the colorblind becomes really important here as well.

Very true. Great. Peter, any thoughts? I'm not sure I have much to add on this. Cool. Great.

All right. So, like defining your metrics, this is something that could easily be not just a session but a book in and of itself, and is. We've got seven steps, and here's number six: implement and roll out your dashboard. This is the implementation stage, which could be very straightforward if you're basically pulling things together by hand, or a year-long project for a very complicated, automated dashboard. In this process you're pulling together the data and the visuals and, if you're actually building or configuring and mapping things, going through that typical build process. Don't forget about training, and about getting people on board. Hopefully you've already gotten people on board through the definition process, but make sure they know not only that the dashboard is there, but what they can get from it, how to use it, and how it fits into their day-to-day job. Getting buy-in tends to be a lot easier if you define specific situations in which you suggest they use it. For instance: how about you all take a look at this dashboard every month in the senior advocates meeting, to see where we are and what adjustments you might want to make? Having an actual place and time for it, where room is provided to think about it, can really make a difference. So, another big topic. Peter, any silver-bullet solutions to effectively rolling out and implementing your dashboard?
Well, I was going to say, maybe on a different point, that one thing I do advocate is that as you do your technology planning, look at your dispersed data sources and think in terms of integration. At LSC, we are now standardizing on Salesforce, and we're doing that because it is so easy now to take all the different Access databases and Excel spreadsheets out there that have information we want to look at collectively with our other information, and move that data into Salesforce so it's all in one place. That makes building the dashboard so much easier, when you're not going to different sources and doing all sorts of data transfer or rekeying it in, which unfortunately happens often. It's not an overnight process to get your technology to that point; I would say it's often a multi-year process. But becoming an information-based, decision-making organization is the goal. That's not the question you asked at all.

No, I think it is. The difficulties can come from many directions, and fundamentally, if you're looking to do a fairly complicated dashboard, that is a data-centralization problem from a technology perspective: you've got a lot of data coming from a lot of different sources, so you essentially need a data warehouse, or some data-warehouse-like solution, to get things to talk to each other and be close enough that they can be reported on together.

I would say we chose Salesforce after a pretty close evaluation against Microsoft. There are different ways to do this, so I'm not trying to say everybody has to use Salesforce; there are different platforms.

Absolutely. And in fact, depending on what exactly you're tracking: if you're looking to track operational metrics and case outcomes, then a case management system is the logical place to do it, because your data is already centralized, already in one place. Doing it there makes all sorts of sense.

All right, so number seven, which we've been talking about all along: plan to iterate. In this realm, it's really important that you start small. As we've mentioned a couple of times, it's really difficult for anybody to know exactly how the dashboard will be used and what will be most useful until it's actually in front of people and they're trying to make decisions based on it. Fundamentally, if you haven't tracked the metrics before, when you first look at it you're like: great, 87%. Is that good? Who knows? Over time you'll come to know more, and you may realize: oh, maybe that's not the right thing to be tracking at all; that doesn't even really track with what we care about. So do something that seems pretty approachable, start small, and plan to refine over time, with the idea that implementation is only a small part of a cycle: you define your goals, potentially as an organization or at least the goals of the dashboard, your features, and the metrics you're measuring; you implement that; you measure how it works; you talk to people and see what's actually used; and then you refine potentially all of those things, not just the reports, but potentially what metrics, what goals, what features your dashboard has.
Think about the data as well. This is another huge topic, incenting high-quality data, but you want to keep an eye on how you're using your metrics, because it will probably have an impact on the data itself. If people are looking at the data more regularly, the data may well get better, because it becomes obvious where there are gaps, people feel more accountable for it, people start to yell at other people because their data isn't in, things like that. It's also possible that you get a "you get what you measure" kind of effect, especially if you have metrics that aren't super well considered, with really detailed incentives tied to them. If, for instance, you pay advocates based on how many cases they close, that is obviously a bad idea, because they're then incented to pick easy cases and to close cases in easy ways that aren't necessarily the right ways. That's an extreme example, but think through: in what ways might what we're measuring actually impact either how the data is tracked or what people on staff are actually doing? Peter, do you have any examples of... sorry?

Yeah, that is very true. It reminds me of one of my earliest jobs, working fast food, in which we were incentivized to do things in a certain amount of time. We were very quick, the quality dove, but we met that metric. I've seen the same thing happen before getting into legal services; I worked a little bit with call centers, and there was an incentive there to close calls out very quickly, and you got very terse, your quality went down, but that metric went up. So you do have to be very careful about how you present this to staff and how it matters to them.

Yeah, I have a rant on help desk metrics. If you're deciding your help desk is effective based on how quickly they close tickets, or how many tickets they get each month, those aren't good metrics. If you're closing tickets quickly because you've been told to, you might not be solving the problems; you might actually have more tickets opening up because tickets that weren't resolved keep getting closed. And if you're going by the number of tickets and saying, well, fewer tickets this month than last month means we resolved a lot of big problems and people are having an easier time... or does it mean that we were a lousy help desk, we didn't help anybody, so they're asking each other questions instead of putting in tickets? So you do have to be very careful about what exactly you're measuring.

Yeah, I heard of a software testing team that incentivized the testers on bugs: you were rewarded for the number of bugs you filed, and that was a disaster. The bugs were very low quality, but there were lots of them. Totally makes sense.

We've got a great comment here from Jessica: it could be good to start by creating a list of metrics that would be on your utopian dashboard, and then identifying which metrics would be required for a minimally acceptable dashboard. I really like that idea. It goes back to this concept people talk about of the minimum viable product.
So basically the idea of: what's the very smallest thing we can start with that would be better than nothing, and then iterate from there? I really like that idea a lot.

Yeah, I'm a big fan of "think big, but plan realistically." And the flip side, which you mentioned earlier in the slides, is to also prioritize your metrics gathering by what information is easily available, to some extent. I mean, you don't want to just collect a bunch of information that isn't useful. But you look at the metrics that are high priority, and then you say: which of these are already in our case management system, right there, versus something where we'd have to go out and maybe do surveys or a lot more labor to bring in? And then you do that evaluation: is that effort worthwhile, or does that one go on the B list of metrics we'll collect, so we go for the low-hanging fruit first?

Absolutely. In fact, my colleague is currently leading a course on measuring your mission, and there's actually a scatter plot in it that is basically "how useful is it" versus "how difficult is it," with the idea that you plot your metrics and try to find the ones that are both pretty useful and pretty easy to start with.

Yeah, your average legal aid organization does not have a lot of staff hours to devote to the collection, so you have to prioritize it.

Yep, absolutely. Fantastic. And people are asking questions along the way, which I love; please continue doing that. So I'm going to forge on and we'll talk about some case studies, but any questions, on either the case studies or anything else, are more than welcome. Sorry, I had a little trouble with that slide.

Let us start with the Atlanta Legal Aid Society. They've now finished, this was from 2012, an executive dashboard, designed to be used by the management team and a little bit by the board, built in LegalServer. In fact, LegalServer has incorporated some of the features we'll be looking at into their base product. This is primarily looking at types of cases, so it's a count of the number of cases; you can see here the percentage of LSC cases, and down at the bottom here it shows a line chart of those that fit the LSC requirements and those that don't. One of the main things it does, which is a little hard to show effectively in screenshots, is that you can drill down through it. So it becomes really useful to see not only "oh, great, we've got this many new cases in Cobb County," but to then ask, "oh, what are those?" and drill down through them. And here we've got it by map, so you can see how many cases there are in each different place in Atlanta.

The way they went about this is that they pulled a number of other dashboards that other people, both in and beyond the legal aid world, were using; they showed those around to staff; and they developed prototypes. They're now done rolling it out, but they're in a very iterative cycle here, with the idea of starting relatively small on top of LegalServer, where people can drill down into the actual lists in LegalServer, with something that was not incredibly large, and then iterating from that. Peter, Brian, I don't know if either of you is familiar with that case study.
Anything to add, or anything to comment on about that case?

I am familiar with it. Northwest Justice Project was similarly involved in the same time period, working on dashboards with LegalServer. One thing to be very aware of when you go into a project like this and you're working with a contractor is putting things together so that you have specific deliverables in mind. It's very easy for a project like this to eat up a pretty good number of hours, and unless you've got those endpoint deliverables, it can be challenging. We spent a lot of time iterating on this, and that's one of our lessons learned.

I once negotiated a big software contract for a nonprofit where the initial proposal from the vendor was "we will iterate and iterate until we get this right," and the way I read it was "we will take as long and charge you as much as we possibly can before we get this done." So we negotiated it to: you iterate three times and get it right. Absolutely. Terrific.

So let's forge on. If you have questions about Atlanta Legal Aid, certainly let us know and we'll see what we can answer for you. Here's another really interesting example, from Blue Ridge Legal Services. Blue Ridge Legal Services has either a lot of different dashboards or one really long dashboard, depending on exactly how you think about it. They do a lot with dashboards, and they are in fact all built in Excel. They have data that is pulled out of their case management system and dumped into Excel, which then uses graphs and charts, as you can see here, to show a bunch of different things. This particular report we're looking at, you'll notice, is for the Winchester office. They do a lot of slicing and dicing, and there's more to this report; it's about three pages long, and we'll look at other pages in just a second. They can see it for the program as a whole, for each office, for different areas of law, and all the way down to the individual case handler.

You can see here that they've got client satisfaction levels: they have a satisfaction survey, and you can see that sliced and diced down to the same level of detail, so office versus office or advocate versus advocate. And they have outcomes data: what were the tangible outcomes of each individual case? So here is "we stopped debt collection," and number 11 there is "we obtained, preserved, or increased Medicaid benefits," with the dollar amounts for each. And in fact, one of the byproducts Blue Ridge mentions, which was not initially recognized as a core goal of the project: having these reports to show the individual advocates "you have saved or gotten people $14 million over the course of the year, and here is what your satisfaction rating looked like" was actually a huge morale booster. The advocates really loved being able to see this information and getting a holistic sense of what they're achieving. And there's a really interesting blog post, or white paper, by the executive director of Blue Ridge about this idea that you get what you measure, and about moving from simple cases-closed counts to these more sophisticated outcomes, including this satisfaction data, which, he mentioned, they don't really proactively use a lot.
There's not usually a need to say, "you really need to do something about these numbers." But he mentioned specifically that if somebody looks at their numbers for, say, treating people with courtesy and respect, and theirs is lower than a lot of their colleagues', well, that person is probably going to give some thought to what they're doing. It's basically just feedback for the person. And his point is that by measuring this, they're giving a clear sign that it's something they care about. The same is true of the outcomes produced: they're showing the advocate that what they care about is not necessarily the number of cases churned through, but how much money, how many people helped, what the real results in the world were, with the idea that somebody with two cases could have helped, well, millions is a lot, but tens of thousands of people with a class action suit or something like that. And that can't be represented at all by just a number of cases closed. Brian, any thoughts on this whole model of dashboarding?

No, this is the first time I've seen this. It looks interesting. I'm curious how it actually works for the advocates there, but I like the idea.

Yeah, certainly what they report is that the advocates really love it and are fascinated by it; they get kind of obsessed with the reports and really want to compare themselves.

I mean, that's really one of the things you want out of a dashboard, that engagement, where it's driving them to do something interesting with it.

Yep, absolutely. Peter, thoughts to add?

I'm impressed with this as well, and I have seen it before, because we have it in our toolkit. This is excellent work, and as you're describing, the information, I think, is really tactical and useful. It is a lot of work, though. They are collecting a lot of information; they're surveying to get it. This is not information that just naturally comes into the flow of their day-to-day work, the way case closure information does. So it's a lot of effort, and it would be really interesting to do a deep dive with them on the returns.

Yep, absolutely. All right, let's talk through another one: Utah Legal Services. Utah is kind of a different one; it's almost entirely looking at internal, productivity-type things. They have built, on top of Kemps, quarterly performance reports. This is, among other things, at an advocate level, and I'm sure they would say there's a lot of context used in addition to this. So they're not simply saying, "well, great, look at Julie's percent time to cases." This is percent allocated, so it's not necessarily good or bad, but it shows the amount of billable, or in this case maybe non-billable, work each of these people is doing. We can see that Leah is putting a lot more time to cases and projects than Tyler is; she's putting more time to cases, and Tyler is putting more time to projects. They provide these kinds of numbers and metrics quarterly, to let staff members see how they compare to other folks. And you'll notice on the side here the number of open cases whose notes have been unchanged for more than one, two, or three months.
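A staleness column like that is cheap to compute from a case export, which is part of why it makes a good quality-check metric. A minimal sketch, assuming a hypothetical CSV of open cases with case_id, advocate, and last_note_date columns, and treating a "month" as roughly 30 days:

```python
import pandas as pd

# Hypothetical export of open cases: case_id, advocate, last_note_date.
open_cases = pd.read_csv("open_cases.csv", parse_dates=["last_note_date"])

# Days since the notes on each case last changed.
days_stale = (pd.Timestamp.today() - open_cases["last_note_date"]).dt.days

# Count cases per advocate that are stale past each threshold.
for months in (1, 2, 3):
    stale = open_cases[days_stale > months * 30]
    print(f"\nOpen cases with notes unchanged for more than {months} month(s):")
    print(stale["advocate"].value_counts())
```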
You can see, for instance, that Randall here seems to be somewhat challenged in either getting his cases closed in the system or getting his notes updated. And this is the type of thing you can also use as a quality check when you're tracking other metrics. Because logically, to the extent that what we're looking at here is what it seems, that Randall is just not updating data as much as he potentially should (these names have been changed, by the way, obviously), we might have a skew in the data we're collecting. It might look like we are doing less in family law than we actually are, because Randall is entirely family law, and that messes up the data. Peter, were you going to say something? No, nothing right off the top of my head. Yep. Brian, you mentioned that you were doing some dashboarding on top of LegalServer. What types of things did you track in your dashboard? Was it a more operational type of thing, like what we're looking at here, or a more outcome-based dashboard? So one of the things we were working on was bringing in an external data source, such as census data, comparing that to our case data, and then creating a map so that we could overlay and look at different client communities and see if we were reaching different potential client communities. Great. How did that work? So there were some challenges to it. It's improving, I'll be honest. Are there challenges you're willing to share that other folks might learn from? I mean, a lot of it is just the practical visualization. We're all used to what you see out of Google Maps, and when you're creating something in-house and trying to get gradation colors and that type of stuff, to make it easy to use and understand, designing those tools in-house can be a challenge. Or actually, we're working with a contractor, but it's a lot more work to compete with what is a commercially available free product that's out there. Yeah, we've done similar mapping exercises here, and when I looked over the instructions of our data person who was doing those, who has since left, she had 137 steps to create a map that overlaid where the legal aid offices were with where the populations that needed legal aid were, that type of thing. It was this huge, immense process she was going through to develop them. We've since gotten more efficient at it with better software. Great, yeah, that sounds huge. All right, let us wind up with one last case study. This is the Cleveland Legal Aid Society, which, some of you may know, has been going through, in just the last two years or so, a pretty substantial look at how they are tracking case-result outcomes in particular: what happened at the end of extended representation cases. They spent about a year defining a system and process for what they were going to do. About six months of that was strategizing, figuring out what metrics they were going to define. They had each area of law, each department, think through how they would optimally measure their own success and come up with actual results that could be answered yes or no. So here, for example, at the bottom, for consumer cases: did we obtain a monetary claim? Did we reduce or avoid debts? Did we avoid arbitration? So six months defining those.
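To illustrate the data model those yes/no metrics imply, here is a rough sketch assuming a hypothetical table with one row per case per outcome measure. The measure names are the consumer-law examples just mentioned; the handling of "not applicable" anticipates the yes/no/not-applicable answers described next, with "not applicable" excluded from the denominator.

```python
import pandas as pd

# Hypothetical table: one row per case per outcome measure, answered
# yes / no / na ("not applicable" meaning the outcome was never pursued).
answers = pd.DataFrame({
    "case_id": [101, 101, 101, 102, 102, 102],
    "measure": ["Obtained monetary claim", "Reduced or avoided debt",
                "Avoided arbitration"] * 2,
    "answer":  ["yes", "yes", "na", "no", "yes", "na"],
})

# Exclude "not applicable" from the denominator: success = yes / (yes + no).
attempted = answers[answers["answer"] != "na"]
success_rate = (attempted["answer"] == "yes").groupby(attempted["measure"]).mean()
print(success_rate)
```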
Then there were four months of designing and implementing the system itself. They actually built it on top of, I think, Kemps, but it has since been rolled into Kemps, so you would no longer have to build this yourself. And then there were two months of training at the end to make sure that people knew what they were supposed to be doing. What we're looking at here is the form an advocate would use to define the outcomes on a case. Any given outcome could be a yes, a no, or a not applicable: a no meaning we tried for it but it didn't happen, while not applicable means we never really tried for it. They view the data in a bunch of different ways, but one of the ways is through Crystal Reports in a dashboard format. You can see here a time-based look: how many foreclosures were prevented over the time span we're looking at. And you can see that, among other things, the success rate is going up, so they are either more successfully picking their cases or more successfully closing them and getting good results. They can also look at, for instance, the amount of money, down in the lower left here: the amounts by which they have increased assets or decreased debt. These numbers get very big for almost any legal aid organization, so it can be a really powerful thing to show just how much money you have earned or saved, often for the most vulnerable parts of our population. And as we talked about, they continually iterate and assess. I think this is a really interesting report: they ask both their advocates and, for a sample of cases, their clients to define what outcomes were actually reached, and this graph is showing the level of agreement between advocates and clients. You can see that in many cases they agree pretty well, and where they diverge, it's almost always the clients being more optimistic than the attorneys, which I think is a pretty interesting thing. Could that be because the happier clients are more likely to fill out the survey? Could be, absolutely. It depends on exactly what this graph is showing: is it only comparing the cases where a survey actually came back, so the attorneys' ratings only for the ones that were returned, or the attorneys' ratings for everything? As always, we're using this to measure the effectiveness of a different metric, but this one should be questioned too. One of the things they were really concerned about was the idea that attorneys, or the organization as a whole, might be cherry-picking cases in order to boost their numbers: taking easy cases and avoiding hard cases, even when it seems like there's a true need and a real benefit the organization could really help with. So far they haven't seen any evidence of that kind of cherry-picking. All right, Peter, that's another one that I think you know reasonably well. Anything to add, or other things that seem interesting to you in the Cleveland case study? Yeah, the only thing I'll add is that from LSC's perspective, we are starting to look at outcomes, and they're doing exactly the type of thing that we're looking at, which is case closure outcomes in extended cases. Yep.
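Circling back to the advocate/client agreement chart: a rough sketch of how that comparison might be computed, assuming hypothetical paired records that exist only where a client survey came back, which is exactly the response-bias caveat raised above.

```python
import pandas as pd

# Hypothetical paired answers: one row per returned survey per outcome measure.
paired = pd.DataFrame({
    "measure":  ["Foreclosure prevented"] * 3 + ["Debt reduced or avoided"] * 3,
    "advocate": ["yes", "no", "yes", "yes", "yes", "no"],
    "client":   ["yes", "yes", "yes", "yes", "yes", "yes"],
})

# Share of paired cases where advocate and client gave the same answer.
agreement = (paired["advocate"] == paired["client"]).groupby(paired["measure"]).mean()
print(agreement)

# Where they disagree, check the direction: how often is the client the optimist?
disagree = paired[paired["advocate"] != paired["client"]]
print(f"Client more optimistic in {(disagree['client'] == 'yes').mean():.0%} of disagreements")
```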
So I know that the LSC grantees in the room will be hearing more about this, if that hasn't happened already, but our approach is to not be too demanding at first. We understand that outcomes management is a challenge, so we'll be introducing it gently and looking at those extended cases. Fantastic. Great. Brian, any comments on the Cleveland case study or anything else? This would also be a great time, for those of you out there, to ask any questions or make any comments as we are wrapping up. Brian? Yeah, my closing remarks here are just: get the staff who are going to be using the dashboard involved as early as possible in the process. This may be very new to them, and the more that they feel invested in what's going on and understand the types of metrics that are possible, the more success you're going to have with the dashboards being used. I've also just posted a survey there in the links, or in the chat, for feedback on this webinar. Fantastic. Peter, any closing thought or two? What would you most hope that people will take away from this? Well, kind of to Brian's point, something I'll also flag from my experience working with attorneys for most of my career is that attorneys and numbers are often kind of oil and water. So, again, I'm just going to reiterate: find your allies. If you're the IT person who's trying to put this together, or you're the champion of it, and there's resistance among the attorneys to actually quantifying what they're doing, which I have run into at law firms I've worked at in the past, find your allies among the attorneys who do want the data, who do want to see it, and let them be your surrogates for selling it to the rest of the organization. Yep. And I've been working with LSC looking at outcomes-type things, and through the two case studies that LSC has collected, they have found that in a lot of cases, tracking the numbers was actually really good for morale: people were very skeptical to begin with, and it turned into something where being able to see the tangible outcomes for the community was a real morale enhancer for staff. Yeah, my experience is that's commonly the case: the fear is not terribly justified, and the actual data becomes a real asset to doing the work. Yep. And that's another reason to start small. Data is one of those things where, if you put it in front of people, they are much more likely to see the value than if you just talk about it, which is obviously a reason to start with something, but not to invest all that much before you know exactly what's going to resonate for people and what's not. I'll turn it back over to Brian for closing thoughts about what's coming up for next year. Great, I think we are at an end. Brian? Excellent. Thank you guys so much for attending. We will be posting a survey here within the next week or so that will be open for a few weeks, for topics that people are interested in for next year's webinar series. Also, the LSNTAP email list has moved over to Google Groups; I'll drop another link to that here in the chat in just a moment, but it's a great place to ask questions about this type of stuff and to survey other experts in the field. Thank you so much to IdealWare, Laura, and Peter here with LSC; greatly appreciated. Very good presentation today. Thanks, Brian. Fantastic. Thanks so much, guys, and I hope to see you next year. Okay, bye everyone.