from the Wigwam in Phoenix, Arizona. It's theCUBE, covering Data Platforms 2017, brought to you by Qubole.

Hey, welcome back everybody. Jeff Frick here with theCUBE. We are at the historic Wigwam Resort, just outside of Phoenix, Arizona, at Data Platforms 2017. It's a new big data event. You might say, God, there's already a lot of big data events, but Qubole's taken a different approach to big data: cloud first, cloud native. They're integrated with all the big public clouds, and they all come from big data backgrounds, practitioner backgrounds. So it's a really cool thing, and we're really excited to have our next guest, Colin Riddle. He's a big data architect from Epic Games, was up on a panel earlier today. Colin, welcome.

Thank you, thank you for having me.

Absolutely. So, I enjoyed your panel. A lot of topics that you guys covered. One of the ones we hear over and over again is: get early wins. You know, how do you drive adoption, change people's behavior? It's not really a technology story; it's a human factors and behavior story. So I wonder if you can share some of your experience and best practices, some stories.

So I don't know if there's really a rule book on best practices for that. I mean, every environment is different. Every company is different. But one thing that seems to be constant is resistance to change in a lot of places. So...

That is consistent.

So, you know, we had some challenges when I came in. We were running a system that was on its last legs, basically, and we had to replace it. There was really no choice; there was no fixing it. And so I did actually encounter a fair bit of resistance with regards to that when I started at Epic.

Now, it's interesting, you said a fair amount of resistance. Another one of your lessons was, you know, start slow, find some early wins, but you said you were thrown into a big project right off the bat. So I'm curious, how did the big project go?
But when you do start slow, how small does it need to be where you can start to get these wins to break down the resistance?

So, I mean, the way we approached it was, we looked at what was the most crucial process, or the most crucial set of processes, and that's where we started. That was what we tried to convert first, and then, you know, make that data available to people via an alternative method, which was Hive. And once people started using it and learned how to interact with it properly, you know, the barriers start to fall.

What were some of the difficult change management issues? Where did you come from in terms of the technology platform, and what resistance did you hit?

So it was really the user interface that was the main factor of resistance. We were running a Hive cluster. It was fixed size; it wasn't on-prem, but it was in a private cloud. And it was basically simply being overloaded. So we had to do constant maintenance on it. We had to prop it up, and the performance was degrading and degrading and degrading. So the idea behind the replacement was really to give us something scalable that would grow in the future, that wouldn't run into these performance blockers that we were having. But again, like I said, the hardest factor was the user interface differences. People were used to the tool set that we were working with. They liked the way it worked.

What was that tool set?

I would rather not actually say that on camera, if that's okay.

Does it source itself in Redmond or something?

No, no, it does not. No, they're not from Redmond. But I just don't want to cast aspersions on it.

You don't need to cast aspersions, yeah, yeah. So the conflict was really just around familiarity with the tool. It wasn't really about a wholesale change of behavior and becoming more data-centric.

No, no, because the tool that we replaced was an effort to become more data-centric to begin with.
So there definitely was a corporate culture of: we want to be more data-informed. And so that was not one of the factors that we had to overcome. It was really tool-based.

But the games market is so competitive, right? You guys have to be on your game all the time, and you've got to keep an eye on what everybody else is doing in their games and make course corrections, as I understand, as something becomes hot or new. So you guys have to be super nimble on your feet. How does taking this approach help you be more nimble in the way that you get new code out, new functionality?

So it's really very easy for us now to inject new events into the game, and we basically can break those events out and report on them, or analyze what's going on in the game, for free with the architecture that we have now.

Does that mean it's the equivalent of, in IT operations, we instrument everything from the applications to the middleware down to the hardware? Are you essentially doing the same to the game, so you can follow the pathway of a gamer, or the hotspots of all the gamers, that sort of thing?

I'm not sure I fully understand your question.

Like, when you're running analytics on a massively multiplayer game, what questions are you seeking to answer?

So really, what we are seeking to answer at the moment is: what brings people back? What behaviors can we foster in our players?

Yeah, engagement, exactly. And how do you measure engagement? Is it just as simple as do they come back, or time on game?

That's the most simple measure that we use for it, yeah.

All right, so Colin, we're short on time, so I want to give you the last word. When you come to a conference like this, there's a lot of peer interaction. There were some great questions coming out of the panel, specifically around: how do you measure success? It wasn't technical at all; it's what are the things that you're using to measure whether stuff is working?
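[Editor's note: the "do they come back" measure Colin describes can be sketched as a day-N return rate over gameplay event logs. This is a minimal, hypothetical example, not Epic's actual pipeline; the event shape and function name are assumptions for illustration.]

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical event log: each gameplay event is a (player_id, event_date)
# pair, the kind of record that might land in a Hive table.
events = [
    ("p1", date(2017, 5, 1)),
    ("p1", date(2017, 5, 2)),
    ("p2", date(2017, 5, 1)),
    ("p3", date(2017, 5, 2)),
]

def day_n_return_rate(events, cohort_day, n):
    """Fraction of players active on cohort_day who were active n days later."""
    active = defaultdict(set)  # date -> set of player ids seen that day
    for player, day in events:
        active[day].add(player)
    cohort = active[cohort_day]
    if not cohort:
        return 0.0
    returned = cohort & active[cohort_day + timedelta(days=n)]
    return len(returned) / len(cohort)

# Of the two players seen on May 1 (p1, p2), only p1 returned on May 2.
print(day_n_return_rate(events, date(2017, 5, 1), 1))  # 0.5
```

In practice this aggregation would run as a query over the event warehouse rather than in-memory Python, but the metric itself is exactly this simple.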
I wonder if you can talk to the power of being in an ecosystem of peers here, and any surprises or great insights that you've got. I know you've only been here for a couple of days.

So I would say that, obviously, the sessions and the breakouts are great, but I think one of the biggest values here is simply the networking aspect of it. Being able to speak to people who are facing similar challenges or doing similar things, even though they're in a completely different domain, the problems are constant, right? Or common, at least. So, you know, how do you do machine learning to categorize player behaviors, in our case; in other cases it's categorization of feedback that people get from websites, stuff like that. So yeah, I really think that the networking aspect is the most valuable thing about coming to this.

All right, awesome. Well, Colin Riddle from Epic Games, thanks for taking a few minutes to stop by theCUBE.

You're welcome, thank you very much.

Absolutely. All right, he's George Gilbert, I'm Jeff Frick. You're watching theCUBE from Data Platforms 2017 at the historic Wigwam Resort. Thanks for watching.