One easy definition of big data is anything that doesn't fit on your laptop. Historically, big data meant something on the order of petabytes; that scale was the impetus for creating these distributed systems. Yahoo had petabytes of web search data to analyze, and the same was true of Google and Microsoft, so you saw a lot of work on distributed data systems at those three companies. Google holds the patent on MapReduce, which it has been very cool about not enforcing strictly. Yahoo contributed much of the code base for Hadoop. Microsoft built its own version of Hadoop, written in C# instead of Java. In all three cases, the motivation was the same: how do we deal with these petabytes of data? That's where the definition of big data started. Today, I'd say big data is anything you can't analyze in a spreadsheet. Maybe that's a way to look at it. So anything in the terabyte range is probably going to be big data for most companies.