Okay, welcome back. So, Dr. Lapide, we have a question from one of our students about model overfitting. The specific question is: he's saying that forecasters are obsessed with lowering or reducing errors, so how can we explain the effects of overfitted models on strategic information? Okay. Actually, I learned this very early in my career. I had a boss, who used to be my manager, who described two concepts to me. One is validity of a model, and the second is face validity of a model. Now, what did he mean by validity of a model? Validity means: does it explain prior variations in the demand? That's validity. It explains it very accurately; maybe I fit it to 100%. In that case, it's perfect. That's validity. So you can say, okay, it's accurate. But the question is, if a reasonable person looks at it, a reasonable man or woman, does it make sense? Is it face valid? So, there are a couple of ways to get around this, because if you're purely a mathematical person, you're going to basically try to minimize the error of the model. But you have to get a forecast that explains all the causal factors. There's a difference between causality and correlation. Does it make sense, right? And actually, I'm a big advocate that the simpler the model, the better off you are. Because if you take 30 potential demand signals and you're trying to forecast from 50 historical data points, you get a very good R-squared, explaining 90% of the variation. But you can't explain why a given factor has any impact on demand at all. Because in reality, the way I always like to say it, demand forecasting is about understanding variation. If the demand did not vary, guess what? We're out of a job. We wouldn't need a forecaster. We'd just forecast a flat line. It's real simple. So we have to understand all the variation, and we can't explain it with factors that don't make sense.
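The 30-signals-against-50-points problem can be sketched in a few lines. Below is a minimal Python illustration with invented numbers (not from the case): demand is driven only by a trend plus noise, yet adding 30 random "signals" with no causal link to demand still pushes the in-sample R-squared up. That extra fit is exactly the part you cannot explain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 50 historical demand points driven only by trend + noise.
n = 50
t = np.arange(n)
demand = 100 + 2.0 * t + rng.normal(0, 10, n)

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Simple, face-valid model: trend only.
X_simple = np.column_stack([np.ones(n), t])
beta, *_ = np.linalg.lstsq(X_simple, demand, rcond=None)
r2_simple = r_squared(demand, X_simple @ beta)

# Overfit model: trend plus 30 random "demand signals" that have no
# relationship to demand at all.
X_over = np.column_stack([X_simple, rng.normal(size=(n, 30))])
beta_o, *_ = np.linalg.lstsq(X_over, demand, rcond=None)
r2_over = r_squared(demand, X_over @ beta_o)

print(f"trend-only R^2 : {r2_simple:.3f}")
print(f"30-signal R^2  : {r2_over:.3f}")  # always higher, but the gain is pure noise
```

In-sample R-squared can only go up when you add regressors to a least-squares fit, which is why minimizing fit error alone can never flag the model as overfit.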
Certainly, we can explain variation in terms of seasonality, trend, new product introductions, and promotions. Those make sense, because we're doing things to impact demand. But sometimes factors don't make sense. You have to get the model, as I mentioned before, down to facts, figures, and assumptions. The forecasters have to be very transparent about what they put in that model, so that reasonable people would agree: yes, those are things we should put in the model. Another thing: I did an exercise with students once, where we gave them some data to forecast, and they were trying to minimize error. So they put the data into Excel Solver and developed a least-error, best-fit model. And they presented it to me. The first thing I had them do was, I said, okay, I want you to put the forecast and the history on the same chart: the historical data and the future that you're forecasting. And about half of them, for a data set whose history looked flat or growing, had forecasted demand dropping down, going to zero. And I said, does it make sense to you that the business is growing and you're forecasting that? They said, well, we've got the best fit. No, that's not reasonable. So you have to be able to explain what's in the model and what makes sense to be in it. And if you have to get rid of some factors, because you can't explain why they have an impact on demand, then you take them out, even if it affects your so-called accuracy. So, just getting the most accurate model from a least-fit perspective is not the same thing as getting a valid model. That's the issue: face validity versus validity. So, hopefully that addresses your question. Thank you, Dr. Lapide. So, we'll start reading some of the answers that the students provided. Okay, so the question that we asked was about the factors that made the executive team and other managers develop their own forecasts.
And let me start with the first group. They are saying: explain the data and approach behind forecasts instead of heuristics; use reliable metrics to assess forecasts and budgets; be open to pushback from people at all levels who wanted to hear something else; and be more prepared when you know you're sharing bad news. The overall approach for both groups of users was similar: be objective, be professional, be open to feedback. I think this comment is more related to what the forecaster is supposed to do. But the answer is supposed to be about why the people around me did their own forecast and had doubts about my forecast, because they did their own in their heads. So I don't see that addressed here. You're right about the forecaster, but what I was looking for is why the executives feel the way they feel and doubt my forecast. What is going on that makes them doubt my forecast? Same thing with the other managers: why did they doubt my forecast? So I'm looking for the perspective of my co-workers and my managers, in terms of what is making them think I'm wrong when I say revenue is going to be flat, right? And you're right, I have to do all those things that you mentioned to explain it to them. But why did they initially have doubts? That's what I was looking for in the case, all right? Okay. So, let me take a look at another answer. Paulina is mentioning, firstly, that executives were constantly disappointed without trying to dive into the issue and get the big picture. And secondly, nobody wants to be the messenger in this case, but you have to be. Unfortunately, in many companies it's not easy to stick to the truth. I guess the executive team hadn't analyzed the components of the forecast. Once again, I think this is more a suggestion to the forecaster, about what you have to do to counter the doubts.
And again, it goes back to facts, figures, and assumptions: making your model both as accurate as it can be and as face valid as it can be by explaining all the facts. Again, transparency in the forecast is the most important thing. And I give four different things that the forecaster has to do. One of them is not to be political. It's very hard not to be political, because you're afraid you're going to lose your job. That is a really hard thing to do. I recently wrote an article that took the example of the 2016 presidential polling, where they blew the forecast totally. They did not forecast any chance of Trump winning. I pointed out the factors that were wrong with that, and then turned those into lessons for forecasters and planners. It was in the Journal of Business Forecasting a couple of issues ago. And sorry, what were the lessons to be learned from that? One of them was: don't be political. Of course, in politics you have to be political, because that's your business, right? Another was: they didn't believe their forecast. If anyone at the polls had forecasted Trump winning, they didn't believe it, because they didn't want to believe it. That's the problem. You cannot let your emotions get into the forecast. It has to be based on facts, figures, and assumptions that you can explain to people, and that people can say, yes, those make sense. And the other piece is that a lot of the data used in the 2016 campaign polling came from the internet and what I call the virtual world: what the TV media was saying, what the pollsters were saying, and many of the pollsters ran SurveyMonkey-style online surveys. That's a virtual view of what's going on. They entirely missed the whole segment of the population out in the rural areas, who don't take internet surveys, who don't even watch the same TV. They're living in the real world.
So the one lesson for supply chain managers is: you may get all this data, but it's still virtual. Is it telling you what's really happening on the ground, in the real world? And what happened with the 2016 polling, as I said, is that many of the pollsters were too politically biased to recognize, and even go after, what was happening in the rural areas of America. That's one of the lessons. Okay. Now, we have another answer, from Giovanni. He's saying that being in touch with different departments of the organization can make the difference in sharing the results of the forecast and gaining trust in it. It's important to share the results, speaking a different language according to the department you work with. For instance, marketing needs only a portion of them, as does finance. But I think this, again, is not addressing the question that you asked. We have another comment, from Robin: large company, lots of people to point blame at, to keep their own paychecks. He got to some of the points that I did want to come out. In other words, other people get paychecks, other people get promotions, based upon revenue growth next year. They expect to be rewarded. The business grew last year; now I expect my reward. My reward is a pay raise. My reward is a promotion. Because maybe that's the way it's been, year after year. Why are you now forecasting this? Now, instead of getting rewarded, I may lose my job. And I think that's what he said. That's a good, valid reason to doubt: it impacts me from the perspective of my career growth, right? Not the forecaster's career, but the managers'. The managers I'm telling this to may even have to think about leaving the company, because the company's not growing anymore. And in high tech, an important thing is to be growing all the time. Otherwise, you're not really a good high tech company.
Flat is not acceptable in high tech. Only positive growth is acceptable in high tech. Okay, so let me give another answer. They mentioned that they discussed how to decide on the best method of forecasting in a scenario, and, pertaining to the discussion we had on Dr. Lapide's old company, it always involves high costs when we try to implement something out of the forecast, and people don't always like changes that involve a high cost, or that are predicted to incur a high cost in the future. Sometimes forecasts bring in a solution that's not accepted initially by the majority, but eventually turns out to be the best over time. Something that you already mentioned. Yeah, so repeat back to me what you think the student is asking. No, in this case it was just a comment, about changes. A change, first of all, we have to explain. I think that's a very valid point. And the issue is, they had growth all along; that's what I pointed out before. They had growth all along, so why is it going to change? You have to prove to them why it's going to change. And this is what I said: turning points are the hardest things to forecast. I once wrote an article on this basis, which I think I mentioned here, called something like "Forecasting Superheroes Forecast Turns," because that's the hardest thing to do in forecasting: the major inflection point or turn in what's happening to the business. Okay, so another answer: they are mentioning that the management of the company were looking only at historical data as a realistic method. That's a very good one. But without digging too deeply into the reasons why profits were where they were. Exactly. They didn't understand the back bills: a big piece of last year's revenue was back bills, which you're not going to get again. It was back-billing for putting equipment on the contract late and therefore having to back-bill the customer.
Okay, I think we don't have more questions. So, any comments that you would like to share with the students about the answers? How much time do we have? You have 10 minutes. We can go through. Okay, so what I prepared is some of the answers I was expecting to get. Do not feel bad if you didn't get any answer right. That's okay; it's not about that. I wanted you to go through a discussion, because the real issue is: if you're going to have a job forecasting in industry, you have to understand that good news is easy to give and bad news is hard to give. And I've been through a couple of cases where I had managers call me, and they were running into trouble with their management, because the salespeople want to get you fired sometimes if you don't come up with what they came up with. So you have to do what I did, which is stick to your facts, figures, and assumptions, and be strong about that. And if somebody takes one of your facts and figures and you find out that, yes, maybe you shouldn't have done it that way, you correct it. Okay, so it's what I call, and again I mentioned this in the case, the way you defend the forecast: your forecast is innocent until proven guilty. If anyone can point out to you an assumption, a fact, or a figure which is a little bit faulty in some way, or did not take something into account, then you really have to accept that you have to change your forecast. But unless they come up with that, then you keep the forecast the same, because you're the forecaster. You get paid, we get paid good money, to forecast. So therefore, I have to be credible to the organization. What I like to do, which is hard to do, is, if the salespeople don't like the forecast, I tell them: okay, why don't you take this job and see how well you can forecast? I'm using the best methods. I'm getting paid pretty good money for this. Why don't you take my job and forecast?
Nobody in the organization wants it, because I used to get a lot of people who said, I don't want to forecast, because I could be wrong. There are people in this world who do not want to forecast because they could be wrong. Those are the worst forecasters; they should never be forecasting. You're going to be wrong. Yes. And that's why it's so important that when you produce a forecast, you produce the error associated with the forecast as well, because most forecasters are ashamed of the error. We're ashamed of the error because, on a forecast of an SKU a couple of weeks out, you could be 100% wrong at the warehouse level. That's not uncommon. A good number is 50%. And a lot of it has to do with the fact that promotions and pricing changes drive variation, especially in consumer products. So the forecasters in those environments are ashamed of their error. No, you shouldn't be ashamed of your error. You should give the error so that everyone understands the world is uncertain. The variation is caused by you and marketing and sales. If you didn't do sales and marketing activities that have an impact on demand, then demand wouldn't vary, and my life as a forecaster would be easy. But you want to be competitive. You want to put new products out. You want to do promotions. You are causing variation in a product that has maybe flat consumption. But that's what I get paid to do. So every time you ask a sales rep if they want to take your job, they'll say, no, I could be wrong. All right. So, that's a little bit of an embellishment on that. So, where are we? I'm trying to remember what we were talking about. So, let's put this slide up. I put together some of the things I was looking for. And as I said, I don't expect any of you to get all of it, because you're new to this, but I did want you to have the discussion, because I think that's where the learning takes place. But I wanted to give you back some of the things that I suggest would be things to talk about.
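Publishing the error alongside the forecast can be as simple as a couple of ratios. A minimal sketch with hypothetical SKU-level numbers follows; WMAPE (weighted MAPE) is used here instead of plain MAPE because zero-demand weeks, common at the warehouse level, would otherwise divide by zero. A signed bias is reported too, since systematic over- or under-forecasting matters as much as the error magnitude.

```python
# Hypothetical weekly SKU demand at one warehouse vs. the forecast made
# a couple of weeks earlier (made-up numbers, not from the case).
actuals  = [120, 80, 0, 45, 200]
forecast = [100, 90, 15, 60, 150]

abs_errors = [abs(a - f) for a, f in zip(actuals, forecast)]
wmape = sum(abs_errors) / sum(actuals)                    # robust to zero-demand weeks
bias  = sum(f - a for a, f in zip(actuals, forecast)) / sum(actuals)

print(f"WMAPE: {wmape:.0%}")   # at SKU/warehouse level, 50%+ is not unusual
print(f"Bias : {bias:+.0%}")   # negative = under-forecasting on the whole
```

Reporting these two numbers with every forecast is one concrete way to make the uncertainty visible rather than hiding it.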
So, what is up now are illustrative answers you could have come up with. The first one has to do with the executive team: why did they doubt my forecast? Well, as we said before, after years of double-digit percent revenue growth, it's a shock to them. So what's the first doubt? They were telling their bosses they were instituting programs to increase revenues, and now I'm telling them revenues will be flat. So what's going to happen when they go talk to their bosses about that? They're going to be asked: what, you can't grow revenues? And they said they were going to put programs in place to grow revenues. Second: last year's revenues were healthy despite a slowdown in new computer sales. So they think, okay, we did slow down in computer sales, but that won't hit the service area, which usually doesn't get impacted. The service business doesn't get impacted as quickly as the new business does. In new business you can drastically drop in sales, but that doesn't immediately impact your install base and your service revenues. So they thought: last year's service revenue was healthy even though computer sales were down, so we expect the same thing to happen next year. No: the slowdown was impacting the new revenue coming in from brand new computers being put on contract. The service revenue pattern never changed much; if you plot it out, you see very, very little change in the service business. And when you're dealing with service parts sales, supporting what we call durable products, you deal with what's called the install base. So, automobiles need parts. How do you forecast how many parts, how many tires, how many other things you need to put on cars that are working, an install base, so to speak, that is active? Well, if you know how many there are, you know how many tires you need, right? And if they break, they need a tire. So you do what's called install-base forecasting; there are all kinds of techniques for it.
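The install-base logic described above can be sketched in a few lines with invented numbers: parts demand follows the size of the active fleet and its failure rate, not this quarter's new-unit sales, which is exactly why service revenue stays relatively flat when new sales drop.

```python
# Install-base forecasting, minimal sketch. All numbers are assumed for
# illustration; none come from the case.
install_base = 40_000          # active units in the field
annual_failure_rate = 0.06     # fraction of units needing the part per year
attach_qty = 4                 # parts per unit (e.g. tires per car)

# Parts demand is driven by the fleet, not by this year's new-unit sales.
parts_demand = install_base * annual_failure_rate * attach_qty
print(f"expected annual parts demand: {parts_demand:,.0f}")

# The base erodes slowly (retirements) and is topped up by new units going
# on contract, so it moves far less than new-unit sales do.
retirement_rate = 0.05         # fraction of the base retired per year
new_on_contract = 1_500        # new units added to contracts this year
next_year_base = install_base * (1 - retirement_rate) + new_on_contract
print(f"install base next year: {next_year_base:,.0f}")
```

A sharp drop in `new_on_contract` barely moves `next_year_base`, which is the "insulation" of the service business the executives were counting on, and also why the back-bill distortion was so easy to miss.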
And when you look at the service parts revenues, they don't change much, because you have an install base out there, and the units don't all break down in the same year. So it's a relatively flat business: you've always had units out there, and you don't lose too many, all right? Third: back-bill revenues should still provide growth. Why did they think that? I had to show them. I had to go very deep with the financial organization, because they were the ones that understood the back bills. I had to work with them and look at the back bills, because that's what was giving everybody a false sense of comfort. Yes, growth in computer sales was down, but service revenues were great, and we always expect the service revenues to be great; they'd be relatively insulated from new computer sales. No: in this case, the back bills gave everybody a false sense of comfort, and they weren't going to recur. So I had to prove to them that they weren't going to get the same level of back bills the following year. Fourth: they want to avoid having to lay off people. Executives hate laying off people. First of all, they're all interested in increasing their headcounts. They don't want to lay off people. They want to give their people raises. They want to do lots of stuff for their people, but now they're facing having to lay people off and give no promotions or salary raises. As an executive, you don't want to have to do that, but you're going to have to. You have no choice. So that's the executive team. Now let me put up the other managers. What doubts did they have? If you look at finance, finance wants no change in financial performance. If anything, a financial organization likes to use last year's profitability, last year's operating margins, etc., and assume it's going to happen again.
So they don't want a lot of changes, but now they have to significantly change the budgets of all the organizations to take into account flat revenue versus growth. Therefore, it's a change issue. Again, like I mentioned before: change. Financial organizations aren't really good with change, because if it means that operating margins are down, the next question is, well, how are we going to get them up? So they don't want to change financial performance, but it's going to have to be done, because that's what's happening to the revenues. Marketing and sales managers are typically goaled on revenues and sales performance. Each salesperson is saying they're going to sell so much, and if I add up all the sales commitments, they're bigger than what we're actually going to get, because they were all forecasting growth in their sales plans as well. Now I have to convince them that's not going to happen. The reason they have doubts is that every sales rep and every sales manager told their bosses they were going to increase revenues. The marketing organization as well: they said they were going to increase revenue. My boss said he was going to increase revenue, right? But they weren't. The other thing is, they were friends with the forecaster. I worked at this company for 10 years. I knew many of the managers. One thing you learn as a forecaster is that the way to forecast best is to talk to everyone in the company who can impact demand. How is this year going? How are your sales going? How are your prospects going? The more information you collect from the people working with customers, working on promotions, the more you understand and the better you can forecast. And you develop friendships. You see them every day. You see them in the lunchroom, all that stuff. So these are your friends. Now you're telling them: oh, guess what? You thought you were going to buy that new car? Probably not.
You're not going to get the raise, right? Or you thought you were going to get a promotion and a raise? Probably not, because you'll be lucky if you don't get laid off, all right? This is what you have to tell your friends. And as I wrote in the case, for a long time people didn't want to talk to us, not because they hated us, but because we weren't saying things they liked. Okay. Then let me put up the third slide. The third one is employees in general; this covers both executives and managers, right? The big issue was being worried about losing your job. Layoffs in high tech: like I said before, in high tech you'd better be growing, because if you're not growing, we're reducing headcount, we're cutting raises, all the other stuff. And actually we did have layoffs several years later, as a result of the fact that computer sales were coming down, which pulled the service revenues down. The company as a whole had flat revenues for five years, and through until the time I left, there was a layoff every six months. Okay. So it was a reality; everyone was worried about keeping their job. They all expected promotions and raises next year as a reward for having a good year. Everybody expected that. I made this forecast in August, because we used to budget from August to March. The fiscal year obviously started in January, but the executives could never come up with the final budget until several months after January 1st, because this is a big battle between executives over resources: what expenses are in my budget, what are my costs, how many people can I hire? It took a long time in my organization to actually get resolved. But we started in August.
And so, in August of the prior year, I was telling them flat revenues, and it took me months and months and months to convince them. Then they were all working on: okay, how do we do this, right? Not so easy. The stuff we did before is not going to work, because we were in a growth environment and now we're in a flat environment. As an employee, you know things are not going to be the same. You're giving bad news. That's why it was the worst year. Yeah. But it was also the best year, because I learned what it takes to be a corporate forecasting person. And when I talk to forecasting organizations and they ask me, what makes a successful forecasting organization, is it the one with the best accuracy? I say, no, it's not the best accuracy. It's the organization that is credible enough that we believe it gives us the best forecast it can. Year after year: they may not do a good forecast one year, but if you look at the history of the forecasting organization and what they've been doing, they generally forecast the best we could do internally, which we have to do internally, right? This is the group we have to rely on to come up with the most credible forecast. Credibility is so much more important than accuracy. But credibility is gained by consistently forecasting to a level of accuracy where everyone agrees you are the best team possible in our environment to do this. So that's pretty much it, generally. One more thing: if you remember the end of the case, that senior VP said I was the only one who would tell him the truth. Throughout my forecasting career, I was lucky enough to have pretty good accuracy. If the senior VP wanted to know what revenue was going to happen this quarter, he didn't ask the salespeople. He didn't ask the marketing people. He asked my forecasting organization. That's when you know you are a good forecaster: when the executives rely on you to find out what's really going on. Excellent. So, it's time to wrap up the event. Thanks again, Dr.
Lapide, for accepting the invitation to lead the case study, and also for all the learning. So, just a final reminder for the students: tomorrow, July 26th at 1500 UTC, we are releasing the midterm exam. The exam is going to be available for one week, so it closes on August 2nd at 500 UTC. Remember that it's a timed exam: once you start the exam, you'll have only four hours to complete it. Okay. So, thanks again for joining the event. Thanks, Dr. Lapide, and see you next time. Thank you for joining.