Hi, this is your host, Sapil Bhatia, and welcome to our yearly predictions video series. Today we have with us Rob Schmidt, partner at Carick. Rob, it's great to have you on the show.

Great to be here.

Of course, I'm going to ask you to pick up your crystal ball and share your predictions. But before that, just quickly remind our viewers: what is Carick all about?

Carick is all about providing excellent service and technical ability in cloud, application transformation, and digital transformation, helping companies accelerate their transformation journeys and deliver more value for their customers.

Now it's time for you to grab your crystal ball and share with us what predictions you have for 2024.

The first prediction I have for 2024 is that in the LLM space we're going to see companies really start to branch out and think about the types of LLMs they're using, where they're running them, and how they're running them. What we saw in 2023 was a lot of OpenAI consumption, a lot of people kicking the tires on OpenAI. I think people are going to start to spread out and try lots and lots of smaller models, whether that's Falcon or Llama 2. Those skill sets are going to be really important for enterprises to build out. You're going to need to rethink how the teams using these models are organized, how they're running them, and how they're instrumented. All of that is going to become top of mind for a lot of enterprises as they start to get more value out of LLMs in their enterprise journey. That's something we're already seeing in our work at Carick.

A second area: I think we're going to see the return of metadata management to the enterprise vernacular. That goes hand in hand with all of the companies starting to explore LLMs and trying to figure out how to integrate them into their enterprise workflows.
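The first prediction above, about spreading beyond a single vendor and instrumenting how models are run, often shows up in code as a thin routing layer so the application isn't welded to one provider. A minimal sketch, with hypothetical stub providers standing in for real OpenAI, Falcon, or Llama 2 clients:

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class ModelRouter:
    """Routes a prompt to one of several registered model backends."""
    providers: Dict[str, Callable[[str], str]]
    default: str

    def complete(self, prompt: str, model: Optional[str] = None) -> str:
        name = model or self.default
        if name not in self.providers:
            raise KeyError(f"unknown model: {name}")
        # A real router would also log latency, token counts, etc. here
        # (the instrumentation concern mentioned above).
        return self.providers[name](prompt)

# Stub providers; in practice these would wrap real API or local clients.
def fake_llama2(prompt: str) -> str:
    return f"[llama-2] {prompt}"

def fake_falcon(prompt: str) -> str:
    return f"[falcon] {prompt}"

router = ModelRouter(
    providers={"llama-2": fake_llama2, "falcon": fake_falcon},
    default="llama-2",
)

print(router.complete("Summarize this contract."))
print(router.complete("Summarize this contract.", model="falcon"))
```

All names here (`ModelRouter`, the stub functions) are illustrative, not any particular library's API; the point is that swapping models becomes a configuration change rather than a rewrite.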
On the metadata front, it becomes apparent very quickly that if the data isn't there, or it isn't usable, or it isn't well categorized and understood from a metadata perspective, it becomes very difficult and unwieldy to just fire that data into an LLM and start asking questions about it. So having an organized approach to maintaining the data you're embedding, or the data you're asking questions of, is going to become a very hot topic. A lot of companies are going to have to explore how they go back to their data libraries and manage them for greater efficiency.

The third prediction, which I don't think is much of a prediction at all since it's already a reality, is that coding assistants are going to become a daily part of life for most developers. We're already seeing in our own work and with our customers the speed at which we're able to improve delivery, and what we can now do with smaller teams, and the effectiveness of those teams, is really astounding. Most developers are going to have to start thinking about how this integrates into their daily workflow and how to turn it into a virtual pair programmer running alongside them. There are lots of other ways they'll use it too, especially when jumping into an unfamiliar code base: these tools are extraordinarily helpful for figuring out where things are, how they work, and how all the plumbing is set up. That's definitely something we're seeing, and we're already helping enterprises adopt the technology and integrate it into their development workflows.
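The metadata point above, keeping data categorized and auditable before it ever reaches an LLM, can be sketched as tagging each document chunk with metadata and filtering on it at retrieval time. This is a hypothetical minimal example, not any specific product's schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Chunk:
    """A piece of text plus the metadata that makes it usable later."""
    text: str
    metadata: Dict = field(default_factory=dict)

def filter_chunks(chunks: List[Chunk], **required) -> List[Chunk]:
    """Keep only chunks whose metadata matches every required key/value."""
    return [c for c in chunks
            if all(c.metadata.get(k) == v for k, v in required.items())]

corpus = [
    Chunk("Q3 revenue grew 12%.", {"source": "finance", "year": 2023}),
    Chunk("Onboarding takes five days.", {"source": "hr", "year": 2023}),
    Chunk("Q2 revenue was flat.", {"source": "finance", "year": 2022}),
]

# Only well-labeled finance data from 2023 is eligible for the prompt.
eligible = filter_chunks(corpus, source="finance", year=2023)
print([c.text for c in eligible])  # ["Q3 revenue grew 12%."]
```

Without the metadata, all three chunks would be indistinguishable blobs of text; with it, you can scope what the model sees, which is the "organized approach" the prediction is about.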
So now what we're seeing in the projects we're working on is that customers are out of the tire-kicking phase and into asking: how can we use this to boost the productivity of our employees? How can we use this to automate things that have been painful for us in the past? And we see two big modalities really starting to take over. One is retrieval-augmented generation, which is here to stay. That is a technique every enterprise needs to get familiar with and figure out how to use, because it enables very complicated and time-consuming tasks to be automated, particularly where you have people combing through reams of documentation, text, or files in order to reach some kind of decision, whether that's a credit report or something similar. Being able to do that at scale is a huge value add. If you can accelerate the people going through that material and help them find what they're looking for faster, the productivity gains are astronomical.

What kind of challenges do you see in 2024, not just for users, customers, and the ecosystem, but also for Carick to tackle?

While the generative AI and LLM work is very important, and everybody sees the potential in it, when a breakthrough or disruptive technology like this comes into the enterprise, a lot of other challenges arrive with it. It's not just a question of how do I plug this thing in and make it useful for my organization. There are cultural changes that have to happen, and there are often structural or business changes that need to happen as part of that. So a big challenge for a lot of enterprises is going to be how to package all of this up into one cohesive transformation.
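The retrieval-augmented generation pattern described above, retrieve the most relevant documents first, then ask the model about them, can be sketched end to end. The embedding here is a toy bag-of-words vector and the "LLM" is a stub; real systems would use a proper embedding model and a real model call, but the shape of the pipeline is the same:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: a bag-of-words term-count vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Rank documents by similarity to the query; return the top k."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def stub_llm(prompt: str) -> str:
    # Stand-in for a real model call.
    return f"Answer based on: {prompt}"

docs = [
    "The applicant's credit report shows two late payments in 2021.",
    "The office lease renews every March.",
    "Expense reports are due on the fifth business day.",
]

context = retrieve("late payments on the credit report", docs, k=1)
answer = stub_llm(f"Context: {context[0]}\nQuestion: any late payments?")
print(answer)
```

This is the "combing through reams of documents" case from the interview: retrieval narrows the pile down to the relevant pieces, and the model only ever reasons over those.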
And that's where we come in. That's where Carick can be a lot of help to our customers. We have great relationships with customers where we're doing this kind of work now, helping them figure out how to integrate this technology, which use cases can be done safely, and how to do it in a way that moves the needle for them, while thinking about it holistically. We're packaging this up not just as a technology solution, but as the whole package: here's how it works, here's how it's going to change your day to day, here's how you're going to interact with it, here are the guidelines. And of course there are financial considerations for all of this as well. So we make sure we present the whole vision of what it can be, how they're going to get there, and what the changes are going to be like, based on our experiences.

Looking at these challenges, which also translate into opportunities, what is going to be the focus for Carick in 2024?

Our focus is an ever-evolving thing year over year. We really started as Agile and cloud transformation specialists, people who could come in and help an organization modernize its infrastructure, whether that's adopting Kubernetes, moving to serverless, or any of the other ways we've helped customers do that kind of work. That base now allows us to move more effectively into this new generative AI and LLM world and help companies adopt it in a thoughtful and meaningful way. So that's going to be a big focus for us. In 2024 we have techniques, packages, and programs we've built out that can really accelerate the journey, both from an exploration perspective and an integration perspective.
We've now done a lot of work with customers to deploy this for real, in enterprise settings, at scale, with significant volume. And it's been nice to partner with the LLM companies that are out there: LangChain, Google Cloud, OpenAI, and all the groups we've had the experience of working with. It's been a lot of fun for us to put this together and learn from it, and in 2024 we get to pass those lessons on to, hopefully, a lot more people.

Rob, thank you so much for taking the time today to share your predictions with us. I would love to have you back on the show next year, not only to see how many of these three predictions turn out to be true, but also to get the next set of predictions. I really appreciate your time today.

Thank you. Thank you.