Hi everybody, this is Dave Vellante at Wikibon headquarters, and I want to talk to you about modernizing data protection. It's a topic that we've been talking about for many years. Really, we've been developing the theme that data protection in many organizations is broken, and data protection as a service is the fix.

Why do we say data protection is broken? Well, one of the things that people always talk about is data growth, and they draw these curves that are linear: data doubles in size every 18 months or so. In fact, the curve is exponential, and the data growth curve is reshaping. Why is that? People thought that the systems of record would sort of give way to systems of engagement, and what's really happening is transaction and analytic systems are coming together, creating systems of insight. Add Internet of Things and machine data, and, as I say, the curve is reshaping even more exponentially. So that's one factor.

The other factor is cloud. Cloud changes what's possible in businesses. In this 24 by 7 by 365 world, you really don't have any more backup windows. Cloud is also complex. You've got private cloud, you've got public cloud, you've got hybrid cloud, you've got this multi-cloud situation, you've got all your SaaS applications in there, you've got a multi-hypervisor world, open source comes in, and it really is putting pressure on backup windows. Backups are still failing. Recoveries can't be tested. You can't necessarily go to your board and say, hey, we have tested our recovery; we can guarantee that we can recover from a disaster or even from lost data. So there's cost and complexity that is really driving IT practitioners crazy. By the end of the decade, Wikibon estimates that 70% of all organizations are going to need to re-architect their backup. Why is that? Well, it's still mostly a one-size-fits-all world.
Backup generally is a daily incremental, maybe a weekly full, and whatever that policy is, organizations tend to apply it to their entire application portfolio. So they pick the midpoint, and by definition, applications are either under-protected or over-protected. What people really want to be able to do is, in a granular sense, tie the SLA to the application's value. Data loss really is more costly than ever in terms of reputation, compliance, and legal exposure. And we think that a software-led infrastructure approach is the key to building a cloud-like services catalog model that can give you flexibility, scale, agility, and, of course, cost effectiveness.

So we're here to talk about this, and we've got two practitioners. I'm going to start off with BJ Devkota. BJ is the CIO of Charles County Public Schools in Maryland. And then later on in the discussion, we're going to bring in Joe King. He's the CTO and Vice President of Technology Services at CAS Severn, a technology services company. So we'll start with BJ Devkota. Welcome to theCUBE. Thanks very much for coming on to Skype and joining us today.

Thank you, Dave.

So Charles County Public Schools, very interesting operation, large organization, 40-plus schools, $300 million plus budget. Tell us about the organization and your role there.

Thank you. We're about 30 miles south of Washington, DC. We have about 41 buildings and 27,000 kids. And as you can imagine, over the last 11 years, the amount of data that we needed to store in our servers has grown: student data, all the demographics, testing. You hear about testing every day in the news. All that data that we store, we had to back up over the years, and we had tapes. Now we've gone away from that to utilizing Spectrum Protect to back it up from disk to disk. Like you mentioned, that saves us a lot of time, a lot of manpower, and a lot of resources.

So you've got a distributed network, and you've got to protect data all over the place.
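The under- and over-protection problem Dave describes, where one midpoint policy covers the whole portfolio, can be sketched as a tiered policy catalog that ties the SLA to application value. This is a hypothetical illustration; the tier names and numbers are invented for the example, not any vendor's actual schema.

```python
# Hypothetical sketch: tie the backup SLA to the application's business value
# instead of applying one midpoint policy to the entire portfolio.

POLICIES = {
    "gold":   {"backup_every_hours": 1,   "retention_days": 90, "offsite_copy": True},
    "silver": {"backup_every_hours": 24,  "retention_days": 35, "offsite_copy": True},
    "bronze": {"backup_every_hours": 168, "retention_days": 14, "offsite_copy": False},
}

def policy_for(tier: str) -> dict:
    """Look up the granular backup SLA for an application's value tier."""
    return POLICIES[tier]

# A one-size-fits-all shop would give every application the "silver" midpoint,
# over-protecting low-value apps and under-protecting critical ones.
assert policy_for("gold")["backup_every_hours"] == 1      # critical data: hourly
assert policy_for("bronze")["backup_every_hours"] == 168  # low-value data: weekly
```

The point of the sketch is simply that a services-catalog model makes the policy a per-application lookup rather than a single global setting.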
How does that affect your data protection strategy?

It certainly does affect it. When you have 41 buildings spread throughout the county, you need to bring that data in and make sure that they all comply with the same standards for your backup. It is so much easier for us to have gone away from tapes and have everything in a disk format. Going disk to disk saves us time, and we're also very, very responsive: if a site were to go down and we needed to bring it back up, the time to get that site back up and running is, we're talking about, minutes and hours versus days.

Talk about the before and after. So you had to consolidate a number of resources. Talk about that project, that consolidation. What was it like before? What did the project look like? And let's talk about what happened afterwards and what the outcome was.

So we were using a homegrown backup system back in 2004. From that, we went to tapes. That was the logical thing: just back it up to tape and take it to an off-site location. And if you needed the data, you had to retrieve the tapes, load them up, and restore, and you would tell your customers, which are the principals in these buildings, that it might take a few days to get their data back, or maybe even a week to recover an entire site. So all these things coming into place has now given us Spectrum Protect sitting in the middle of this architecture: 41 different sites bringing in the data, storing it on disk, and the same architecture with disk at our off-site location. So there really doesn't need to be any kind of manual intervention; this thing just works like clockwork. In case of a backup failure, a disaster, or a site going down, all we really have to do is go to our Spectrum Protect, get that data, and get the school up and running within minutes and hours.

You talked about recovery time being much, much shorter. What about the RPO, the recovery point objective, the amount of data that was exposed?
Has that changed at all, or can you talk about that?

The student data, the testing data, and the data that was needed at the school, like you mentioned, was very, very critical. Now at this point, if one of our locations was to go down, we could bring that location back up, if we had the hardware, within the same day and get the school up and running without a whole lot of disruption.

Joe, I love this conversation. Joe, by the way, is the CTO and VP of Technology Services at CAS Severn. I love this discussion because BJ's job, and his organization's, is to run the schools; that's what they do every day. Joe, you cater to organizations that know their business but have technology problems. Tell us a little bit more about your business and what you guys do; it's a perfect case study.

Dave, first off, great to be on here and talk to you again. If you take a look at CAS Severn, this is a 39-year journey for us. We've focused on data protection as one of our core businesses. That could be K-12 school districts like Charles County Public Schools or large federal government agencies that we're working with. What we wanted to do is take the best of what we learned from those very large, complex environments and make that something real for our client that might not have a petabyte of data. If you take a look at a K-12 school district, that data is critical for them, irreplaceable, just like it would be for a large corporate customer. Having learned the best practices at those very large, complex clients, we were looking for a method, and we didn't want to have to go to market with three or four different things. We wanted to go to market with one. Coming up with what that was going to be, and the methodologies we were going to use, is a lot of the transformation that we've been making at CAS Severn over the last few decades.

Part of your role is the upfront consulting piece.
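The two recovery metrics raised in the exchange above, RTO (how long until service is restored) and RPO (how much data you can afford to lose), can be sketched in a few lines. This is an illustrative example with invented timestamps, not a description of the district's actual schedule.

```python
# Illustrative sketch of RTO and RPO, the two recovery metrics discussed above.
from datetime import datetime, timedelta

def rpo_exposure(last_good_backup: datetime, failure_time: datetime) -> timedelta:
    """Worst-case data loss: everything written since the last good backup."""
    return failure_time - last_good_backup

def meets_rto(outage_start: datetime, service_restored: datetime,
              rto_target: timedelta) -> bool:
    """Did the recovery finish within the promised restore window?"""
    return service_restored - outage_start <= rto_target

# With one nightly backup, a mid-afternoon site failure exposes most of a
# day's data; more frequent disk-to-disk backups shrink that window.
last_backup = datetime(2015, 5, 1, 2, 0)   # hypothetical nightly job at 2 AM
crash = datetime(2015, 5, 1, 16, 30)       # hypothetical failure at 4:30 PM
print(rpo_exposure(last_backup, crash))    # 14:30:00 of potential data loss
```

Moving from "days to recover a site" to "same day" is an RTO improvement; shrinking the gap between backups is what improves the RPO.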
I presume you go in, you do assessments, you look at the application portfolio. I was talking about application value before. From a backup context, presumably you've got to do a lot of that work up front. Can you talk about that a little bit?

Yeah, let's talk about a couple aspects of that. You brought up what's called policy: the policy question of how long do I keep something. It's really intriguing when you go into a client and you ask that question. Some have a very robust idea of what that is. You walk into some organizations and they go, well, that's easy: let's keep everything forever, because we have no idea what to do, not understanding necessarily what that exposes them to from a legal standpoint or just from a practicality standpoint. We have conducted a lot of policy workshops for our clients, where we go in, get the stakeholders in a room, and ask those hard questions. What are you mandated to do by regulation? Do you know? Don't make decisions on a whim. Make decisions for a reason. We really like it when those decisions are made for a business reason. Those policy workshops have been critical for us.

Now, if we shift from the policy standpoint into, we'll call it, more of the technology: if you ask your average customer how much data they're backing up and how successful they are, that is a nearly impossible question for most clients to answer. So we've actually worked with IBM and one of their methodologies, where we're able to go in and collect the hard data, not subjective data but the hard technical data on a system, to take a look at how long they're actually storing data. How successful are they in that daily backup cycle? What does that look like? We're then able to relay that information back, and for a lot of them, this is the first time that they've been exposed to that level of detail.
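The "hard data, not subjective data" idea Joe describes amounts to summarizing raw backup job records into an actual success rate and protected volume. The sketch below is hypothetical; the log format and field names are invented for illustration and are not IBM's or CAS Severn's actual tooling.

```python
# Hypothetical sketch: turn raw backup-job records into the "hard data"
# described above (success rate and protected volume) rather than a guess.

def summarize(backup_jobs: list) -> dict:
    """Compute the daily-cycle success rate and total protected gigabytes."""
    completed = [job for job in backup_jobs if job["status"] == "completed"]
    return {
        "jobs": len(backup_jobs),
        "success_rate": len(completed) / len(backup_jobs) if backup_jobs else 0.0,
        "protected_gb": sum(job["gb"] for job in completed),
    }

jobs = [  # invented sample records for one nightly cycle
    {"client": "school-01", "status": "completed", "gb": 120},
    {"client": "school-02", "status": "failed",    "gb": 0},
    {"client": "school-03", "status": "completed", "gb": 95},
    {"client": "school-04", "status": "completed", "gb": 210},
]
stats = summarize(jobs)
print(f"{stats['success_rate']:.0%} of jobs succeeded")  # 75% of jobs succeeded
```

A report like this, generated from the system's own records rather than from recollection, is what lets a client answer "how successful is your daily backup cycle?" with a number.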
And when you really make that plain for them, then they're able to engage in a conversation about how they change how they operate. And then, you know, that's when we come in with things that I'm sure we'll talk about, like blueprints and the actual underlying software technologies that we use to solve those problems. But it starts with having a clear understanding of the problem, not just guessing.

So, BJ, was that part of your journey? You know, go back a decade: was it about really understanding up front what those exposures were and what the policy should be? Was that a part of the process, or did that come sort of later? Can you talk about that a little bit?

This is something that we looked at: what do we need to back up? What are our policies? What are our procedures? What are the federal guidelines? What are the state guidelines? And the terms you hear on the news, whether it's No Child Left Behind, the PARCC assessment, or Common Core, all these federal and state requirements have mandates for how we need to store data. But Joe is absolutely right: that doesn't mean we should store everything forever. There are specific requirements, and when we looked at it, we looked at it knowing that there are always, obviously, limited resources. We don't have unlimited funds. So we needed to make sure the architecture we came up with initially, and as it keeps evolving, stays flexible.

And then, Joe, I wanted to come back to you and talk more about Spectrum Protect and where that fits in your portfolio.

So for us, it's a key part of our data protection portfolio, and it really sits at the core of a lot of other technologies, because from a data protection standpoint, in the end, you need someplace where data is collected. And so for us, we've really standardized on Spectrum Protect for our business.
If you take a look at what that has done from a growth perspective, from storage software and from the consulting work that we've done, it's really driven double-digit growth over the last couple of years. What I tell folks out there is, we're identifying and onboarding practitioners in this space basically as fast as we can find them. And if you take a look at it, that tends to mean folks are looking to change the way they're doing things. We see a lot of that transformation now: away from those purpose-built backup appliances that you were mentioning earlier in the discussion today, and toward a purely software-defined basis where they can build things to just the scale that they need.

So let's talk about software-defined. Before the whole software-defined meme sort of hit the industry, we talked about software-led. I guess that didn't stick; software-defined is sort of the rubric now. But what does software-defined mean, BJ, from a CIO's perspective? Is it a buzzword? Is it actually abstracting function from the hardware? Is it of value to you? What does it mean to you as a CIO?

I certainly think software is evolving, in terms of even your day-to-day operations, whether you're talking about your ERP or even the student information system. But what we're really concerned with at this point in K-12, specifically in our county, is the data that we own. Software is one thing, but the data that we own, the data that we have collected over so many years, has sensitive information, and we've got to make sure we protect that data. You hear horrible stories about a big school system losing their student information data or medical information, and that tends to hit a major newspaper, or the Washington Post, or CNN. And you don't want to do that.
And the best way to avoid that is to make sure your corporate data, your administrative data, which for us is student data, student demographics, and teacher data, stays intact. If you keep that intact, then it really does not even matter what kind of application you're using on the front end. Even if you wanted to take that application and put it in the cloud, the application itself is not going to be that important. For us, the most important thing is the data that sits inside our environment.

So what I heard from BJ is: fine, we know it's an industry buzzword, but there's meaning there in terms of being able to scale, to accommodate mobile, to accommodate growth, and not just in BJ's organization. So specifically, from a technologist's perspective, what does what we call software-led or software-defined mean?

So I'll give you how I look at it. I don't want to have to choose a different solution depending on where I want to deploy it. Can I look at things that I can deploy in a public cloud, then possibly hybrid, and then on-prem, without having to choose something that's individualized for each one of those purposes? So let's talk about a use case for that. Say you're a client that wants to do data recovery in the cloud and doesn't want to have your own on-premises disaster recovery site. For us, it's very easy to take Spectrum Protect, load it up in one of the public cloud providers, and use that to catch the secondary disaster recovery data for a client who is going to do that data recovery in the cloud itself. And so this allows us to not have to pick two products, three products, stack them together, and come up with, you know, some type of overly complex solution to the problem. We can keep it very straightforward and based on software. And we think that that's pretty powerful.
This is one of the reasons why we've really liked the shift away from purpose-built hardware devices and into software: it allows that kind of transformation and, really, in the end, choice.

So it's about efficiency, flexibility, lower cost, and, let's say, choice. It's a buzzword maybe, but it's here to stay.

No, it's not going anywhere. It's not going anywhere.

We have a number of practitioners in the Wikibon community. The old TSM days: very powerful, but complex, not necessarily easy to use. Has that changed? Have the investments that IBM has made changed that?

Oh, I think that's putting it lightly. So let's talk a little bit about the journey that's occurred over the last few years. If something's easy to use, you'll probably actually use it. You don't have that paralyzing feeling that some clients have with their infrastructure or the software that they've chosen, where the complexity keeps them from using it, where they're basically scared of what they have. The other thing that we see is that when you have this design thinking methodology, where user experience is taken into account and ease of use is taken into account, trouble calls into CAS Severn from folks who need help with what we would consider relatively rudimentary operations have been cut way down. And so again, our goal is to get what we deploy into the hands of our customers, so that they can run what's needed without having to have a fleet of professional services to maintain it from a day-to-day operations standpoint.

How about virtualization support? IBM invented virtualization on the mainframe, VMware came along, and now Hyper-V is all the rage in Microsoft shops, along with other open source hypervisors. How has IBM done in terms of supporting these various hypervisors and virtualization schemes?

Well, what we've seen in the last couple of years is a real focus on that area.
So let's talk about what's possible today with Spectrum Protect: administrators can work directly through the VMware interface that they're used to, on a self-service basis. So again, no phone calls; there don't have to be 17 emails going back and forth. And it's gone to the point now where the users themselves, not the VMware administrators but the users of those VMs, are able to do self-service recovery of the file and directory data that's out there. And you take a look at it: again, staffs aren't growing at any of these organizations, so self-service has to be very intuitive. So you can start to see the feedback loop: this design thinking, user experience work that IBM has done now allows things like user-led recovery to actually work. And that's been pretty powerful for our business.

So, BJ, can you confirm that? I mean, from your standpoint, the ease of use, the self-service piece, has that actually turned out to be the reality in your case?

Absolutely. I'm listening to Joe, and I'm going back to the 2004 days, and now we're in 2015. It feels like the product has evolved so much that even though we have not added any more resources, the resources we have in-house are happy. It's easy to use. They can do a lot of the troubleshooting, and the only time we really have to look at the product or place a call is when we come to some sort of bigger issue.

That's excellent. Joe, I'll give you the last word here: your advice for technology practitioners, from a practitioner. What would you advise people, specifically in the context of modernizing data protection?

So first off, we follow a philosophy of don't guess. Don't guess. Don't stick your finger up in the air and see which way the wind's blowing. Understand your data and understand what you're on the hook for. Understand what your industry rules and regulations are.
And if you don't know what those are, bring in a practitioner who might be able to lead you down that path. But go from a strong base of information.

Excellent. Gentlemen, great conversation. BJ Devkota, Joe King, thanks very much for coming on theCUBE. Great to see you.

Thank you, guys.

All right. Thanks for watching, everybody. This is theCUBE, and we're live from Wikibon headquarters. We'll see you next time.