Welcome back. Hope you enjoyed the time in the breakout rooms with your fellow learners and that you had some very interesting discussions. We're waiting for your insights in the main chat, so please share them with us. We already have some insights from breakout room one, but we're waiting for the rest. Don't be shy, share your insights. OK, Larry, should we begin commenting on the first question? Sure. So the first question was: how might you analyze your forecast errors in order to improve your forecasting? So, how to use forecast errors to improve forecasting. OK, we do have some feedback now, so let's see what breakout room one suggests. They said: track forecast accuracy, variance, and bias; if demand is bigger than supply, it might lead to finance issues; accept that forecasts are a challenge; make real adjustments to the baseline forecast; and understand that not all errors are due to a bad forecast, it could be supply issues that show up in the error figures. So that's a summary of the discussion in breakout room one. OK, room one. Shall we wait for room two, or do you want to go ahead? Please go ahead and comment on that. OK, so I think someone in room one mentioned it: the first thing is comparison to the baseline forecast. Because what I found in my research is that a lot of companies never keep their baseline forecast. They come up with a final forecast and compare forecast error against that. But it's very important to look at the baseline forecast and see in which cases the baseline forecast was the better forecast over time. And if it was, then we go with the baseline until we learn how to do better with what we're doing, because the current forecasting process is adding error rather than removing it. So it's important to understand: we built all these final forecasts on top of all these baseline forecasts we had. Maybe we're better off with the baseline forecast in some cases, for some products and some geographies.
And I think that's important. The other one I mentioned in the article is that a lot of people now compare everything to the naive forecast. The naive forecast is: whatever happens today is going to happen tomorrow. That's the most naive thing. Whatever happened last month happens next month, right? Whatever happened last week happens next week. It's a very simple way to forecast, and it's never far off. It's not a bad forecast; it's actually a really good one to compare against. If you have a sophisticated forecasting method, compare its error to the error you would have gotten if you had just used the naive forecast. If you can't beat the naive forecast, then you are not adding any value. So it's important to recognize that a simple forecast like the naive one may even be better than your current baseline forecast, right? That was the other thing I talked about. Another piece of this is: remember, forecasting is all about variation. You're going to get very high errors on demand that varies: demand that varies for promotional reasons because we promote it, and demand that just naturally varies, right? You need to put special focus on those if you can, but there's only so much you can do to increase the accuracy there. So what I've talked about a lot is ABC analysis. We do this in inventory, and we should do it in forecasting too. In inventory, we say the A items, maybe the top 5% of items, represent perhaps 50% of demand, right? Then we have the B items and the C items; the C items may be the 50% of items that represent only maybe 20% of demand. So it's important that we take the C items and put them on automatic: just let the computer do it, right? For the B items, the computer does it with some exceptions. And the A items, the A products, are the ones we spend a lot of time trying to forecast, right? That's a key thing. But there's also a twist on that.
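Larry's naive-forecast benchmark can be sketched in a few lines of Python. This is an illustration by the editor, not code from the session; the demand series and the "sophisticated" forecast numbers are made up. The same comparison works against a saved baseline forecast instead of the naive one.

```python
# Sketch: compare a forecasting method's error against the naive forecast.
# If the method can't beat the naive error, it is adding error, not value.

def mape(actuals, forecasts):
    """Mean absolute percentage error: mean of |actual - forecast| / actual, in %."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

demand = [100, 110, 95, 105, 120, 115]   # illustrative demand history

# Naive forecast: next period equals the last observed period.
naive_fc = demand[:-1]     # forecasts for periods 2..6
actuals = demand[1:]

# A made-up "sophisticated" forecast for the same periods.
model_fc = [104, 103, 101, 112, 118]

naive_err = mape(actuals, naive_fc)
model_err = mape(actuals, model_fc)
print(f"naive MAPE: {naive_err:.1f}%  model MAPE: {model_err:.1f}%")
```

Here the model beats the naive benchmark; when it doesn't, for some products or geographies, that is the signal to fall back to the simpler forecast.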
If you take a two-dimensional matrix, we have something called volume-variance analysis. ABC is a volume analysis; you can also plot a two-by-two that shows, say, which A items have high variation of error versus low variation of error. We want to do our analysis against that as well, because it comes to a point with high-variance items where you don't want to spend a lot of time chasing accuracy, because you're not going to get it; the demand is just highly variable. So that helps you figure out how to learn from the forecast errors. And there are other breakouts. Other breakouts include product groups and customer level. An aggregated forecast is typically more accurate than a disaggregated forecast, so if somebody needs something at an aggregated level, we should forecast the aggregate. All right, that's another thing. When you look at aggregation versus disaggregation: is it better to forecast at the detailed level and aggregate up, or to produce the forecast at the aggregate level and push it down to the detailed level? That's what I mean, because what happens in many companies is that they're so busy doing forecasting that all they do is keep track of error at a total level. They don't understand that by looking down into the depth of it and seeing where those errors are, you could figure out what we call the forecastability of each item. That is, how forecastable is it? What is the lowest error you can get? Because you reach a limit, and it's not worth spending a lot more time beyond it. So that's what I mean by looking at the error in much more detail, with lots of breakouts, OK? I think your article about forecastability could be really interesting for our learners too, so I might share it with them. There's an article called Forecastability, right? Yeah, OK. Yes, that's fantastic. Well, thank you, Larry. Should we go to the second question?
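The volume-variance two-by-two Larry describes can be sketched as a small classifier. This is an editor's illustration: the cutoffs, the coefficient-of-variation measure of variability, and the sample histories are all assumptions, not values from the session.

```python
# Sketch of a volume-variance segmentation: classify items by demand volume
# (ABC-style) and by variability (coefficient of variation of demand).
from statistics import mean, pstdev

def segment(history, volume_cutoff=1000, cv_cutoff=0.5):
    """Return (volume_class, variability_class) for one item's demand history."""
    total = sum(history)
    cv = pstdev(history) / mean(history)   # coefficient of variation
    vol = "high-volume" if total >= volume_cutoff else "low-volume"
    var = "high-variance" if cv >= cv_cutoff else "low-variance"
    return vol, var

# High-volume, stable item: worth hands-on forecasting effort.
print(segment([300, 310, 290, 305]))   # ('high-volume', 'low-variance')
# High-volume but erratic: accept a higher floor on its forecast error.
print(segment([50, 600, 20, 700]))
```

The high-volume, low-variance quadrant is where analyst time pays off; the high-variance quadrants are where you accept a limit on accuracy and buffer instead.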
Okay, no more feedback from the folks. No more feedback, they are kind of shy. So please go ahead and share your feedback. And now... We have more? Oh, good, okay. I missed it. So they said, for question one: forecast bias, the tendency to over- or under-forecast. Also check this against the original sales target and purchase budget to see if those were good assumptions to begin with. Okay. So yeah, connecting with the other functions. The forecast error in dollars and in units. "I would then focus on my top 10 or 20 SKUs using the power rule, 80-20." And 80-20 is the basis for ABC. Good point. "I tend to use the 80-20 rule just because of the large number of SKUs I would be managing." So that's your priority. Standard deviation: how stable or volatile is my forecast? You also mentioned that. If the standard deviation is large for a product with fairly stable demand, this could mean that my model needs to be tuned so that it is not overly reactive. Right. So yeah, thank you for your insights, breakout three. Those are really interesting points, and very aligned with what Larry has just mentioned. I agree 100%; I have no comments on that one. Yeah, it's just pointing out the important things again. Right. And from breakout two, now we're getting their feedback too. They said: calculate the MAPE and compare it with the naive MAPE, which is also a message from the article. A MAPE lower than the naive MAPE is a sign of value added. There's a smiley face. At a high level, the error is actual minus forecast, divided by actual, times 100, so a percentage. You can analyze by SKU to see which ones generate more errors, look at the effect of promotions and other market distortions or conditions, look at trends in the errors, and understand and remove bias. Okay. Just a little comment on that, because again it comes back to how you define the error.
The formula for forecast error is actual minus forecast, divided by actual. But there are many companies that compute the error with the forecast as the denominator. Now, what does it mean to divide by the forecast? That's not what we in science would call an error. Actually, I just had an email conversation with a company about this recently, and they found the company was dividing by the forecast. It turns out, if you do the math, and those of you out there who are mathematical can figure this out, there's a tendency to over-forecast if you use the forecast as the denominator. The reason is that with the forecast in the denominator, over-forecasting looks cheap: as the forecast grows very large, the measured error stays bounded, while under-forecasting by the same amount blows the error up. And so over-forecasting is what you end up doing. Again, the reason I wrote this article was that we're ashamed of our forecast error. The way to reduce the forecast error is not to divide by the forecast; that's wrong. You have to divide by the actual number. So people say, well, what if you have to divide by zero? Well, okay, you work around that; you can't divide by zero because that becomes infinity. But still, the correct formula for error is actual minus forecast divided by actual, right? Not forecast, right? So that's a little comment. An important point, yeah, sure. Unless you want to fake everybody out and have a lower forecast error. That's your choice, yeah. Not professional. That's not professional, of course. Okay, great. So thank you for your feedback. And let's go to the second question, which is actually really good for our course because it ties forecasting to inventory management, which is what you are all going into now. So our second question is: in what ways might a company use forecast errors to manage risks? We have some feedback here. Would you like me to read it before we discuss it? So, from breakout room one, they say forecast errors can be used to analyze the root cause of the error and avoid it in the future, say the assumptions went wrong, for example.
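Larry's point about the denominator can be made concrete with a small sketch (editor's illustration; the numbers are made up). Miss an actual of 100 units by the same 50 units in both directions and score the miss with each denominator:

```python
# Sketch: why dividing by the forecast (instead of the actual) rewards
# over-forecasting. Actual demand is 100; both forecasts miss by 50 units.
actual = 100
over, under = 150, 50   # an over-forecast and an under-forecast

def err(a, f, denom):
    """Absolute percentage error of forecast f against actual a, over denom."""
    return abs(a - f) / denom * 100

# Correct definition (divide by actual): both misses score the same 50%.
print(err(actual, over, actual), err(actual, under, actual))   # 50.0 50.0

# Forecast in the denominator: the over-forecast scores ~33%, the
# under-forecast 100%, so the metric quietly rewards inflated forecasts.
print(err(actual, over, over), err(actual, under, under))
```

With the forecast as denominator the penalty for over-forecasting is bounded while the penalty for under-forecasting is not, which is exactly the incentive to over-forecast that Larry describes.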
So forecast errors can be used to cover for those errors by keeping some safety stock, that is, to determine the safety stock. They also agree with the mention of errors caused by any kind of bias that can be found when we analyze the errors; for example, if the sales department is consistently over-forecasting their sales, which happens. Then, well, this is very long feedback, so I'll try to pull out the main points. Basically they say they can use the forecast errors to manage the inventory accordingly: reduce losses due to overstocking and understocking caused by the most recent forecast, and make changes to the EOQ, the EOQ being covered in the course right now. It's a good start to see how SKUs are performing, and it can then help the company prioritize and approach a target market well informed. Would you like to comment on that? No, it's all very good. Yes, and they describe a whole process: the past error of a forecast model is always a predictor of future forecast errors, and appropriate adjustments for the future can be used to plan the efficacy of supply chain logistics. Analysis of the forecast errors and the underlying assumptions and models can be used to refine these processes and avoid the same mistakes in the future. Forecasting model assumptions may need to be adjusted per SKU, geography, market, et cetera. And the adjustments made after analyzing forecasting errors and taking appropriate actions may help reduce risk by targeting the investment appropriately, per SKU, location, market, et cetera. And the last feedback, from breakout room three: companies can use forecast errors to negotiate terms like service level, discounts, service reliability, and network design; basically, use the errors as a means of doing sensitivity analysis and then optimize the total cost incurred. Very good. So that's from our students.
Now, a lot of the input is around inventory, and that's okay, because that's what we do. We know safety stock; everybody has learned safety stocks. But I want to step back a little first, because whenever I talk about risk management, the key thing for me is understanding the uncertainty, what I call the risk profile. I have to understand the risk profile I'm dealing with. I always use this quote from Sir Francis Bacon, the English philosopher, who lived from 1561 to 1626. Of course, he's not politically correct, but in those days they were not politically correct. He says: if a man will begin with certainties, he shall end in doubts; but if he will be content to begin with doubts, he shall end in certainties. And what does he mean? One of my students put it really well: don't think you know everything, because you don't. Basically, what Bacon says is that you have to start from the fact that you don't know; understand the uncertainty, the risk profiles. And if you do that, you'll come up with the right answer: the right decisions to make, the right forecasts to make, et cetera. But if you don't understand the uncertainties and think you know everything, that's the point forecast, and this is a problem in industry. Many people, many industries, only give a point forecast, and everybody has to run with that. That was the reason the second question was important, and one of the major reasons I wrote this article: everyone planning in the organization has to understand that there are errors in this forecast. We have to tell them; we can't be ashamed of the errors, we have to tell them, right? So that's one thing. Understanding the risk profile and the errors is the first step toward great risk management in all areas, all places in the company.
What are some of the things you can think about? Buffering. There is a book called Factory Physics; I learned about this a while back. It's a great book; you can look it up on Google, or Amazon sells it. One concept it talks about is buffering: how do we buffer in a supply chain or a factory, et cetera? You buffer on three things. You buffer on inventory, which we all know about, that's safety stock. But you also buffer on capacity. What does that mean? It means that if we have a plant, maybe we run it for two shifts but always keep the possibility of running a third shift: excess capacity that we have not planned for. That's the concept of buffering in manufacturing. And the third is time. The more time you give yourself, the better you're able to handle uncertainty. The concept is: if you're going to use a forecast to do something, do it as late as possible, so you're closer to the event, because short-term forecasts are more accurate than long-term forecasts. You want to make a decision when you've got a decent forecast about it. If you wait long enough, close enough to when it's actually going to happen, your error will be smaller. So that's the buffering on time. It's an interesting idea, and it holds in a lot of areas. The capacity case was a hedging strategy, where we use our capacity but keep excess capacity available if we need it, right? Then there's delaying decisions, and then risk pooling, which Yossi Sheffi has written a lot about. With a postponement strategy, you hold your inventory as work in process, in which case you can forecast it more accurately because it's an aggregate rather than a detailed SKU. So risk pooling is an important concept as well.
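The inventory buffer Larry mentions first is usually sized from the variability of forecast errors. A minimal sketch by the editor, assuming normally distributed errors and made-up inputs (a point forecast of 500 units and an error standard deviation of 80):

```python
# Sketch: safety stock from forecast-error variability, assuming normally
# distributed errors and a target cycle service level.
from statistics import NormalDist

def safety_stock(error_std, service_level):
    """Units to hold above the point forecast to hit the service level."""
    z = NormalDist().inv_cdf(service_level)   # ~1.645 for a 95% service level
    return z * error_std

mean_demand = 500   # point forecast for the period (illustrative)
error_std = 80      # std dev of historical forecast errors (illustrative)

# Stocking only the point forecast covers ~50% of periods; add safety stock:
ss = safety_stock(error_std, 0.95)
print(round(mean_demand + ss))   # stock-up-to level -> 632
```

The same logic motivates the capacity buffer: sizing to the mean alone leaves you short roughly half the time, so you reserve headroom above it.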
And that's pretty much what I covered in the article: these buffers, time, capacity, and inventory, can be used in multiple functions in a company. Marketing and sales and various pieces of the supply chain can all use these concepts: production, inventory, the size of distribution centers, when I need overflow. These are all things you've got to take into account to mitigate risk. Because the real issue is that you're trying to capture upside revenue potential. When there's an upside revenue possibility, supply chain can't say, no, we don't have it; we've got to have it. So we have to have the capability of capturing the upside. Because if we go with a point forecast, and the error is normally distributed, then half the time demand will be too high and half the time too low. The case people really worry about is when the forecast is too low, because then you lose revenue. But if you set your inventory at the average, you're going to be out of stock half the time. That's not right. That's why we put safety stock into it. Same thing with plants: you don't size your plant on the average, you size your plant on the fact that you need to capture some upside potential, okay? So that's pretty much what I wanted to talk about here. Really interesting, Larry. The buffering idea is a great addition, and it's going to be very helpful for our learners in their day-to-day work. So thank you, Larry, for being here with us today. We have some very interesting questions that were raised by our learners about other topics related to forecasting, but we have no time to address them now. I will talk to Larry about them and share his insights with you in the course, in the live event section, okay? So we can have some interesting discussions there in the forum, if Larry agrees. I agree; it was nice to talk to everyone today. Thank you. So it was fantastic to have Larry here.
He's a great expert in forecasting, and I think he added a lot of value to our course. So thank you so much. And just a couple of reminders for our learners. We are discussing current topics, and this week's topic is forecasting. You will find the post by Chris Caplice in the general channel in the forum; I will put the link in the live event section of the course too. So this week we are discussing Apple's problems with forecasting demand for the iPhone X. They're having a lot of problems there; they over-forecasted by something like 50%. So join the discussion and share your insights about this topic with us. And another reminder, an important one: we are opening the midterm exam tomorrow at 1500 UTC. It will be open from tomorrow, February 7th, until February 14th, so it will be open for one week. But remember, it is a timed exam: once you begin to take the exam, you only have four hours to complete it. So good luck, and I hope you all pass and do a great job. So thank you again, Larry. It was fantastic to have you here. Nice to be here. Yeah. Bye.