So, now we move to the contributed papers. The first one is by Luca Rossini from the University of Milan. The title of the presentation is "Bayesian Multivariate Quantile Regression with Alternative Time-Varying Volatility Specifications". You have 25 minutes, then 10 minutes for the discussion and 10 minutes for general Q&A. The floor is yours.

Thank you very much for giving us the chance to present the paper. It's always nice to come after Monica: she was my supervisor, so it's always a pleasure to listen to her. This is a joint paper with Matteo, whom you've already seen, he was answering before and doing the discussion, and also with Francesco Ravazzolo, whom a lot of you may know from his work on forecasting techniques. The title already tells you a little bit of what we will do, so there is no surprise. We go directly from the title: we are doing Bayesian multivariate quantile regression, and then we change the literature a little bit by putting a time-varying volatility specification in there. If you followed what Massimiliano was presenting this morning, and also what Karin was presenting, they were working more on the conditional mean. What we are trying to do in this talk, and also what Francesco, Matteo and I have been doing over the last year or so, is to move in the direction of working directly with quantile regression. We have a different paper, with Matteo and also Aubrey, who is in the audience, working on this other strand of quantile regression, and I think it is something that is turning out to be really interesting for the future. Okay.
I think all of you know, well, we are among central bankers here, that after the COVID-19 pandemic the way people want to estimate models changed a lot, and then the Russian invasion of Ukraine came on top of that. A referee of another paper of ours said not to call it the "Russian-Ukrainian war" but the "Russian invasion of Ukraine", so maybe we should change that inside the paper and the presentation too. It is something that is becoming really interesting also from, for example, an energy point of view, electricity or gas, and it is also shifting people's interest towards working directly with the tails. This morning, if you remember, Massimiliano was citing the paper by Adrian et al.; they were the first ones working on the tail-risk part and trying to quantify the uncertainty around these predictions. And as you have seen this morning, time-series models typically just work with the conditional mean of the variable of interest. What we would like to do is change that a little bit, because what we discussed with Francesco and Matteo is that the conditional mean alone is not well suited to capture skewness. Fat tails and asymmetries, which you were asking Monica about this morning, are something that quantile regression can deal with. Quantile regression is not new; it is not that we invented something here. There is the old 1978 Econometrica paper by Koenker and Bassett.
That was the first paper introducing the quantile regression model, and they used it to exploit the heterogeneous impact of covariates at different quantile levels of a variable of interest. Obviously, there is also a literature in economics and econometrics moving in that direction. Ferrara et al. introduced MIDAS, mixed-data sampling, into quantile regression. There is the paper by Massimiliano with Andrea Carriero et al., where they proposed nowcasting tail risk to GDP growth using a quantile regression model. And there is a paper by Simone, the chair of the session, about quantile vector autoregressive models, where they define quantile impulse response functions in that direction. But what we saw with Matteo and Francesco is that something is missing in the direction of time-varying volatility. This started from last year, with the paper we published with Matteo and Francesco in the Journal of Business & Economic Statistics, where we defined an asymmetric continuous probability score for forecasting, a novel score that is able to evaluate and compare density forecasts, particularly from an asymmetric point of view. That was the starting point of our discussion with Matteo and Francesco, and then we kept discussing, since Matteo and I are not at the same university, through Skype, WhatsApp and whatever, thinking about what we could do next. Also, in another paper with Dan and Matteo, we moved in the direction of mixed frequencies, applying a QVAR that combines macroeconomic and financial variables at different frequencies. So this is a little bit of the literature too.
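To make the Koenker and Bassett idea concrete, here is a minimal sketch of quantile estimation as minimization of the check (pinball) loss. Everything below, the simulated data, the grid search, the quantile level, is an illustrative assumption, not anything from the paper.

```python
import numpy as np

def check_loss(q, y, tau):
    """Average check (pinball) loss of a constant forecast q at level tau."""
    u = y - q
    return np.mean(np.where(u >= 0, tau * u, (tau - 1) * u))

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=2.0, size=50_000)
tau = 0.9

# Scan candidate values: the minimizer of the check loss is the tau-quantile
grid = np.linspace(-5, 10, 3001)
losses = [check_loss(q, y, tau) for q in grid]
q_hat = grid[int(np.argmin(losses))]
# q_hat sits near the 0.9-quantile of N(1, 2^2), i.e. 1 + 2 * 1.2816
```

Adding covariates to the forecast and minimizing the same loss over their coefficients gives the regression case, and repeating it over a grid of tau values is exactly how one exploits the heterogeneous impact of covariates across quantile levels.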
I know I am missing a lot of people who worked on this; all the other references are in the paper, just as a summary, otherwise I would spend all my 25 minutes on that slide. So, moving to the contribution. This is it in a nutshell, and then we will go through some theoretical formulas. In this paper we propose a framework for modeling a time-varying scale, a multiplicative component of the variance, inside multivariate quantile regression models. We do it through the usual representations of time-varying volatility: stochastic volatility and the GARCH model, the two best-known ones. We provide a general framework and then focus on these two time-varying volatility models. We build up the likelihood of this QVAR model through a multivariate asymmetric Laplace distribution. This is something done in the statistical literature: Petrella et al. work in that direction with the multivariate asymmetric Laplace. So we took it from the statistical literature, bringing statisticians and econometricians a bit closer, and moved in that direction. Another thing we do is couple the stochastic volatility and GARCH effects with the Gaussian mixture representation of the asymmetric Laplace distribution, which results in a standard deviation that also affects the conditional mean. That is a little bit different from what Aubrey was doing with GARCH-in-mean, and also from what Jamie Cross was doing with GARCH-in-mean and stochastic-volatility-in-mean models.
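To fix ideas on the two time-varying volatility processes just mentioned, here is a univariate sketch of the stochastic volatility and GARCH(1,1) recursions for the variance; all parameter values are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500

# Stochastic volatility: AR(1) in the log variance, with its own shocks
mu, phi, sigma_eta = -1.0, 0.95, 0.2
log_h = np.empty(T)
log_h[0] = mu
for t in range(1, T):
    log_h[t] = mu + phi * (log_h[t - 1] - mu) + sigma_eta * rng.normal()
h_sv = np.exp(log_h)                     # variance path, positive by construction

# GARCH(1,1): variance driven by past squared observations
omega, alpha, beta = 0.05, 0.10, 0.85    # alpha + beta < 1 ensures stationarity
h_garch = np.empty(T)
y = np.empty(T)
h_garch[0] = omega / (1 - alpha - beta)  # start at the unconditional variance
for t in range(T):
    y[t] = np.sqrt(h_garch[t]) * rng.normal()
    if t + 1 < T:
        h_garch[t + 1] = omega + alpha * y[t] ** 2 + beta * h_garch[t]
```

The key difference, echoed later in the talk: the stochastic volatility variance is the exponential of a latent Gaussian process with its own shocks, while the GARCH variance is a deterministic recursion in past squared observations, subject to stationarity constraints on its parameters.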
And then we reformulate the model to make possible a joint sampling of the whole trajectory of the time-varying volatility. In case I do not get to the end of the talk in time, or in case you get bored, this one slide gives you the take-home results, so you know what we find. The first thing you should know is that I am not here to sell you that one of the two time-varying volatility specifications is better than the other. What we show is that using time-varying volatility inside this QVAR model gives you better results than using constant volatility. That is the first take-home result: introducing time-varying volatility beats the constant-volatility QVAR model. I am not stating that stochastic volatility is better than GARCH, because it also depends on the specification, or the other way around. If I arrive in time, hopefully, we will introduce a model combination based on quantile-score weighting schemes. If you went through the posters, there was one by Giulia Mantuana and Krutare doing this sort of quantile combination; ours is closer to the literature on quantile VARs with stochastic and time-varying volatility. What we see is that these combination weights show a lot of significant variation over time, in particular in the tails. I will show you, hopefully, the left and the right tails rather than the median, but in the paper we have all the different quantiles. And we see that the QVAR combination with time-varying weights forecasts accurately.
The only caveat, as I said, is that we cannot state that the QVAR with stochastic volatility beats everything; what we can say is that time-varying volatility clearly beats constant volatility. So these are the take-home results of the paper. Okay, now a little bit of algebra and formulas. Compared with all the formulas Monica was showing you, we are not going into tensor representations; we stay a bit simpler, maybe on the conditional-mean part. As you can see, we do not do much with the conditional mean: we model the multivariate quantile regression through this MAL, the multivariate asymmetric Laplace distribution, which has three parameters one should take care of: the location, the skew parameter, and the scale matrix. As you can see from this slide, nothing special is put on the conditional mean; in the paper we just use an easy Gaussian prior, though one could apply global-local shrinkage priors or whatever, but that is beyond the scope of what we are presenting; we are working on a different part. The point is that in this representation you have this D, this θ₁, this capital Θ₂, and so on. θ₁ and Θ₂ are functions of the quantile: this is where the quantile enters everything. τ_j is the quantile level of series j, and as you know, a quantile level takes values between 0 and 1. Then we have this D matrix, which is just a diagonal scale matrix, and something that plays the role of a correlation matrix, with ones on the diagonal.
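As a sanity check on how θ₁ and θ₂ encode the quantile level, here is a univariate sketch of the exponential-Gaussian mixture representation of the asymmetric Laplace distribution, using the standard formulas from the Bayesian quantile regression literature; the location, scale and quantile level below are illustrative assumptions.

```python
import numpy as np

def al_draws(mu, sigma, tau, size, rng):
    """Asymmetric-Laplace draws via the exponential-Gaussian mixture."""
    theta1 = (1 - 2 * tau) / (tau * (1 - tau))   # skewness term
    theta2 = np.sqrt(2 / (tau * (1 - tau)))      # scale term
    w = rng.exponential(size=size)               # mixing variable (the W_t of the talk)
    z = rng.normal(size=size)
    return mu + sigma * (theta1 * w + theta2 * np.sqrt(w) * z)

rng = np.random.default_rng(2)
tau = 0.25
draws = al_draws(mu=1.5, sigma=0.7, tau=tau, size=200_000, rng=rng)
# By construction the tau-quantile of this distribution sits at mu:
print(np.quantile(draws, tau))   # ≈ 1.5
```

The multivariate version used in the paper stacks these series by series through the diagonal matrix D and the correlation matrix, which is what the conditional representation given the auxiliary variable exploits.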
So this is just the starting point: there is no time-varying volatility here, nothing changes over time; it is a QVAR, or we can say a quantile regression model, with constant volatility. In the paper there is a bit more theory on the MAL distribution, just to give you the flavor, and what happens is that we can rewrite it as a quantile regression model conditional on an auxiliary variable W_t. In equation 3, this x_t can be a Kronecker product taking care of any sort of regressors, but if you substitute in lags of y_t, it becomes a quantile VAR model. In the rest of the presentation we take just one lag of the VAR; you can extend it however you want, but I think one lag makes the point. And since this is the starting point, there is no heteroskedasticity, as written in the title of this slide: only homoskedastic variance for the conditional distributions. Nothing here is related to t; as you can see, there is no t inside the D component or anywhere else, to keep it as simple as possible.

Now this, I think, is the core slide of the whole presentation. We introduce the time-varying volatility, the heteroskedasticity component, where this Σ is no longer constant but becomes time-varying, through a diagonal matrix H_t that takes care of the positive elements and, as is done in the literature on stochastic volatility models, a matrix A that is lower triangular with ones on the diagonal. In practice, this lets us rewrite the multivariate quantile regression model we saw before in a time-varying specification, which is what comes up in formula 5: with respect to before, this H_t enters the two terms that are strictly related to the quantile, because remember that θ₁ and Θ₂ are functions of the τ's, the quantile levels of interest.

One question that may come up is: why did we decide to put in this time-varying volatility? It is not just a theoretical exercise, adding a more complicated model to the literature. If you think about what Massimiliano was presenting this morning, stochastic volatility is almost the baseline: if you want to forecast any macroeconomic variable, you need stochastic volatility inside; nobody even considers a BVAR or a VAR model without stochastic or time-varying volatility anymore. So it is natural to ask why not also put time-varying volatility, for example stochastic volatility, inside this kind of model. So we start with stochastic volatility, and there is also a slide, though maybe I will not have time to show it, on how what we propose for stochastic volatility generalizes to a GARCH model; obviously you then have to take care of the constraints of the GARCH, but that is the only thing. In practice, what enters is this
diagonal matrix H_t, which is your variance, the time-varying volatility by means of the stochastic volatility representation. And this is more or less the representation we saw before, except that, and maybe there was a small mistake on the slide here, this D becomes D_t, something related to the time-varying volatility part. What we found when deriving everything with Matteo and Francesco is that once we introduce stochastic volatility into our Σ, into our scale matrix, the square root of the volatility enters both the parameter related to θ₁ and the parameter related to the variance. In practice, conditioning on the auxiliary variable, W_t or ω_t as you want to call it, we resemble a VAR with stochastic volatility in mean. The main difference is that the VAR-SVM model includes a vector of log-volatilities, while in our case we have the square root of the volatility.

The other advantage is computational: we have an efficient algorithm that speeds up the computations. We were able to write down the likelihood function, based on the draws of the different variables of interest, in a way that allows us, instead of looping over time, to loop directly over the series. And the nice fact is not whether we loop over t or over j; it is that when we work over the series we are able to parallelize, because when you work over time it is really hard to parallelize: what happens today is heavily influenced by what happened the day before, and so on. In this case we can write it as a cycle over the series, which is easily parallelized; in those terms, from a statistician's point of view, we are replacing an O(T) complexity with an O(n) one, which is quite a bit smaller. So that is the advantage: we provide something computationally feasible.

On the other hand, just to show you I am not saying something untrue, we also have the GARCH representation. It does not change much with respect to what I showed you before; the only thing is that the diagonal terms of H_t no longer depend on an exponential, as one has when working with stochastic volatility, but just on a square, a power of two. Obviously, when you deal with a GARCH model you also need to take care of the stationarity conditions on the parameters of the GARCH specification, because you have the usual GARCH recursion for the variance component. And again, as with the stochastic-volatility-in-mean case, with the GARCH we arrive at the same result: if we condition on this auxiliary variable W_t, we arrive at a VAR with GARCH-in-mean representation. Again we work not with a vector of volatilities but directly with the square root, and we have the same computational advantages I showed you for the stochastic volatility representation.

As I said, we do not say much about which priors to put on, because our interest is more in the time-varying part and the quantile representation, so we just take easy Gaussian priors, though obviously one can work with other scenarios; that is not an issue. Then we have the usual prior specifications for the stochastic volatility and GARCH parameters.

Since I have five minutes left: the quantile score. Massimiliano already gave a really good explanation of the quantile score, so I will not spend much time on it; it is one of the tools we use to evaluate our main results. What we also propose is a combination of different models based on the quantile score. We are not building a quantile CRPS as we were showing before; we just work with the quantile scores and build a forecast combination based on time-varying weights, weights that change over time based on past performance, and we also consider constant average weights for the forecast combination.

So, to the results, for the last three minutes or less. We restrict ourselves to a really simple example, not because it is computationally intensive, but just to see what happens. We work with eight variables plus the NFCI. We do an in-sample analysis up to the second quarter of 2022, and an out-of-sample analysis using rolling and expanding windows, with a length of 40 years for the rolling window, and the out-of-sample period covers at least the COVID-19 pandemic.
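The time-varying weighting just described can be sketched as follows. The exponential weighting of rolling-window average quantile scores below is one simple choice, an assumption for illustration rather than the paper's exact scheme, and the two competing "models" are hypothetical fixed quantile forecasts.

```python
import numpy as np

def quantile_score(y, q, tau):
    """Pinball loss of quantile forecast q for outcome y at level tau."""
    u = y - q
    return np.where(u >= 0, tau * u, (tau - 1) * u)

def tv_weights(scores, window=8):
    """Time-varying combination weights from past quantile scores.
    scores: (T, K) array of scores for K models; lower is better."""
    T, K = scores.shape
    w = np.full((T, K), 1.0 / K)     # equal weights until enough history
    for t in range(window, T):
        avg = scores[t - window:t].mean(axis=0)
        ew = np.exp(-avg)            # penalize models with high past scores
        w[t] = ew / ew.sum()
    return w

rng = np.random.default_rng(3)
T, tau = 60, 0.1
y = rng.normal(size=T)               # realized outcomes
q_good = np.full(T, -1.28)           # near the true 0.1-quantile of N(0, 1)
q_bad = np.full(T, 2.0)              # badly placed forecast
scores = np.column_stack([quantile_score(y, q_good, tau),
                          quantile_score(y, q_bad, tau)])
w = tv_weights(scores)
# The well-placed forecast accumulates the larger weight over time
```

Because the weights are recomputed from a rolling window, they move around as the relative tail performance of the models changes, which is exactly the kind of time variation shown in the combination-weight plots below.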
Okay, this slide, which I hope renders better than on my computer, presents the quantile score changing through time for two different variables: on the left a macroeconomic variable, GDP, and on the right the financial conditions, the NFCI. There are three colors: the blue one is the left tail, the red one is the median, and the yellow one is the right tail. What one can see from the quantile score under stochastic volatility is jumps around the drops in GDP; for the 90% quantile, for example, there is a peak around 2014, and then something really strong in both the left and right tails when the COVID-19 pandemic arrived, while the red line, the median, moves much less. At the end you can also see the beginning of the Russian invasion of Ukraine showing up, in particular in the left tail. For the NFCI, things do not change as much, but there is a lot of variation during the COVID period, and also around what happened with the debt ceiling in 2008, for example in the left tail.

Then there is a result that puzzled us a little, and generated a lot of discussion with Francesco and Matteo: the combination weights. Here the colors of the lines change, so I apologize: the red line is the time-varying specification with stochastic volatility, the yellow one is the GARCH, and the blue one is the constant-volatility model. We show what happens for the NFCI in the left and right tails. What we see is that the blue dots are almost at zero, meaning the constant-volatility model gets almost zero weight; here I show just two percentiles, and only in some periods does it jump, for example during COVID-19. The other two, in particular the GARCH but also the stochastic volatility, have more persistent weights through time. So that is how the combination behaves.

And to conclude, because so far I have not compared much against the constant-volatility model: one could say that all I have shown is that the time-varying combination weights put little weight on constant volatility in estimation. From a forecasting point of view, since I have just a minute or two, this table shows the quantile score for two quantiles, the left and right tails. A bold number means that model is among the best, and the entries are ratios. What you can take away is that, whatever the variable, having a time-varying volatility model, and also some combination, particularly in a multivariate setting, gives you better results than not having anything, whichever quantile you take, left or right. Except for some small cases, for example the unemployment rate, for all the other variables we beat the QVAR with constant volatility. In the paper we have this for all the other quantiles too; this is just the idea of what happens at the left and right quantiles.

So, to conclude: we propose a new quantile VAR model with time-varying volatility. Again, I want to highlight and stress that we are not stating that one of the two time-varying volatility specifications is the best; what we are stating is that we beat the constant-volatility model. The second point I would like to underline is that we are not working on the conditional mean: we work directly on the quantiles. This changes the way you think a little, because the literature so far focuses on the conditional mean and then builds the quantiles after the conditional mean; in this case you work directly with the quantiles. That is one of the main aspects we try to highlight in this paper, and also in the other paper we have with Aubrey and Dan and Matteo. The paper is available on arXiv; it is a slightly old version and we are preparing a new one. If you have questions, feel free. Thank you, I hope to be on time.

Perfect, thank you very much, Luca. The discussant is Matteo Mogliani from Banque de France, 10 minutes.

Okay, thank you very much. It is a pleasure to discuss this very nice paper. As Luca said, there are no surprises in the title; what I was expecting was some surprises in the paper itself. We discussed over lunch that there are probably margins to improve the paper, not on the theory, but on the content, on some discussion of the results
and even on your approach, which will likely lead to a nice publication that I sincerely wish you, even though neither of you passed by my poster. So, just a couple of slides of comments that I think are most important for this paper.

First, to review the model a little. What Luca and Matteo, whom I had the pleasure to meet for the first time today, are doing here is simply, well, of course not so simply, moving from the VAR, which is quite standard, and we saw several representations of the Bayesian VAR today, to a more sophisticated version: the Bayesian quantile VAR, using this multivariate asymmetric Laplace distribution for the error structure. This gives the form of the Bayesian quantile VAR here: this is the scale matrix, this is the correlation matrix, and here we have the square root of the correlation matrix, which of course works because it is at least symmetric and positive definite. And they are not happy with that; they also move to something even more complicated, the Bayesian quantile VAR with time-varying volatility. Honestly, in my reading of the paper for this discussion I did not go through the algebra; I totally trust you on the final results, which in the end are very similar to the Bayesian quantile VAR, but in addition we find this term here, which results from the triangular matrix, and most importantly this square root of H here, which is the time-varying volatility matrix of the model.

So what is the time-varying volatility model in this paper? They are not happy with just one classic stochastic volatility; they discuss two time-varying volatility approaches. The first one is stochastic volatility itself: the standard autoregressive representation, which in your paper I guess is a random walk, for the log volatility, which leads to something similar to a VAR with stochastic volatility in mean, as they say, except that we do not have the volatility but its square root, so it is not exactly the same thing. And we also have a GARCH-type time-varying volatility, which leads to something similar to a VAR with GARCH in mean, again with the square root rather than the volatility itself.

Given all that, what are the main contributions of this paper? The first one, in my opinion, and I am not completely on top of the full literature, but I have the feeling that already considering a quantile VAR in a Bayesian framework is a nice feature of the paper, building upon the literature on Bayesian quantile regression and using this multivariate asymmetric Laplace for the errors. The second contribution is extending a quantile VAR to time-varying volatility and developing a sampler, which I could not really check because the technical appendix was missing from the paper, I guess. I had some idea of what it might look like, maybe I am wrong; it is not the Jacquier one or something? Okay, okay, that's fine. So they develop a sampler for the stochastic volatility, which is nice. And on the empirical side, they show that the quantile VAR with time-varying volatility may improve one-step-ahead forecasts. Also, if your question is why consider both stochastic volatility and GARCH, probably it is an a posteriori addition to the paper, I do not know, but they show that combining homoskedastic and heteroskedastic quantile VAR forecasts, with both time-varying volatility and time-varying weights, may improve the one-step-ahead forecasts. So, a lot of material here.

Now some comments. First of all, quantile regression with time-varying volatility. This is a serious question I have gotten many times; I do not work all my time with these models, but I get this question: why could time-varying volatility be empirically relevant here? In a linear regression, time-varying volatility accommodates changes in
distribution of the shocks but in the baseline take just a baseline quantized regression not the quantized VR regression what could argue that this time kind of model already accommodates some of those features so here this is the quantized regression models and you have that the fact is that you always look for using those models are generated by the asymmetry in the coefficients this term here shifts the location of the targeted quantized this term here rescates the shocks for the quantized and as you can see here and here you have some terms that are already in the exact time t so I honestly believe that adding some flexibility by introducing this time varying volatility is good for the model but maybe the paper should elaborate that I think it would be the first time more on the relevance of time varying volatility in a quantized regression so I couldn't find any paper discussing very clearly why should be important another comment is on the efficiency of the sampler so you claim in the slides but also in the paper that the sampler you develop is more efficient so can you provide more details on the efficiency can you provide Monte Carlo simulations just to get a grasp of the efficiency of the sampler again something that we completely forgotten nowadays to do is just run some conversion dynamics and show what are the diagnostics for the sampler on time variation time variation for the volatility is as often the case a part just a part of the broader picture in terms of flexibility so I don't want you to suggest to accommodate time variation also in the model parameters in this paper but this paper I think should take this point but just discussing already discussing the fact that you are picking up just time variation in the volatility but you can have also this time variation in the signal also in the parameters and suggest in your paper a way forward in order to accommodate also for some time variation in the parameters and finally on the forecast horizon so 
This is a very important point. VAR models are often used, and are very useful, to get multi-step iterated predictions, but in the quantile VAR framework we are still in some way limited to one step ahead. Simone has a nice paper on that, and I would like to discuss it with him, but I think that getting, let's say, a density forecast at each step ahead, computed through a quantile VAR in an iterative way, is still quite complicated. So, since the paper is of course limited to one step ahead, it would be nice to really discuss this issue in the paper. Because we don't have many papers on quantile VARs, it would be nice if at least the first papers working on this discussed all the issues that will hopefully be solved by the literature in a few years.

I have one minute, so some additional comments, mostly on the empirical analysis. First, how well does the model do compared to, let's say, a standard VAR with stochastic volatility à la Carriero, Clark and Marcellino? Not the one you are citing in the paper, the other one, which I discovered a few days ago is forthcoming in the JMCB. Second point: how well does the quantile VAR compare to a univariate quantile regression once you also include lags of additional exogenous predictors? That is, you take one equation and you put in the lags of the endogenous variables plus the lags of the exogenous ones, so you don't have a VAR system but just a univariate quantile regression with all the ingredients of the VAR. I would be curious to see how this compares to your quantile VAR. Then, on the results: you only focus on the quantile score. It would be nice to also provide an assessment of the entire density forecast, using different metrics like the CRPS or the log score. But this would imply, and this is quite important in the literature, a choice about how to compute the entire density from a discrete number of quantiles. You can use a
fine grid of quantiles and then try some kernel smoothing, or use the quantile-matching approach, which is very popular nowadays. It's mostly up to you, but you could increase the amount of results in your paper along these lines. And finally, to make a bridge towards the work of Simone: his paper discusses a sort of stress-testing approach, forecasting a given quantile q of variable Y conditional on a quantile q′ of variable X. Could you also treat that in this framework? I think the answer is yes, but could you, in this paper, approach the issue of stress through the proposed model in a way similar to what Simone did in his work? Thank you.

Let's collect a few questions from the floor. Please state your name and affiliation so everybody knows.

Hi, very nice presentation. I'm John Paredes from the ECB. I wanted to ask you a little bit more about your idea of the combination, because normally when we think about combination we think about different variables being important for different models, capturing different aspects of the economy. In this case, if I understand correctly, you are basing your combination on different types of stochastic volatility. But when I think about this and look at your results, sometimes one stochastic volatility specification is more important than the others; you are picking up different weights. I am thinking about how I can explain to my policymaker why I now pick this one, because I think this volatility is now different from that one. So how do you conceptualize what you want to capture there? Another point: you are choosing three specific stochastic volatility models, but I could think of a generalization, which is what machine learning models do: you don't pick one, you just let the machine learn whatever it is going to find. How could you make the match between these two types of literature? Thank you.

I am from the National Bank of Poland.
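On the point just raised about computing the entire density from a discrete number of quantiles before scoring it, one convenient shortcut, sketched here with an assumed standard normal forecast rather than anything from the paper, is that the CRPS equals twice the quantile (pinball) score integrated over the quantile level, so a forecast reported on a quantile grid can be scored on the whole density without reconstructing it first.

```python
import numpy as np
from scipy.stats import norm

def crps_from_quantiles(q, taus, y):
    """CRPS via the identity CRPS(F, y) = 2 * int_0^1 QS_tau(q_tau, y) dtau,
    approximated with the trapezoid rule on a discrete quantile grid."""
    u = y - np.asarray(q)
    qs = np.where(u >= 0.0, taus * u, (taus - 1.0) * u)  # pinball loss per level
    dtau = np.diff(taus)
    return 2.0 * np.sum(0.5 * (qs[:-1] + qs[1:]) * dtau)

taus = np.linspace(0.01, 0.99, 99)
q = norm.ppf(taus)                    # a standard normal forecast on the grid
score = crps_from_quantiles(q, taus, y=0.0)
# Closed form for N(0, 1) at y = 0 is (sqrt(2) - 1) / sqrt(pi) ~ 0.2337,
# so the grid approximation should land very close to that value.
```

The approximation error comes only from the grid spacing and the truncated tails, which is why a reasonably fine grid of quantiles is enough for ranking forecasts by CRPS.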
I have three questions. First, you have this term w_t, right? Is it i.i.d.? Because I probably missed the time structure on w_t, and my question would be: perhaps you could give up the full vectorized stochastic volatility and only have something like stochastic volatility inside w_t. What would happen then? The second question, related to that: is w_t exponentially distributed? I don't fully understand. If it is exponential, what if you make it a gamma with an estimated parameter? It probably spoils your Gibbs sampling scheme, that's my guess, but it's a single parameter. And third: what is the gain from the conditional correlation matrix? My question would be: what if you simplify this stochastic volatility into a single factor but instead go for something like dynamic conditional correlations? Because I'm not sure I picked up correctly the gain from the multivariate nature of your model. If your model is multivariate, what is the point from the forecasting point of view, what is the value added?

This is Davide Delle Monache. I have a couple of quick questions, one more technical, about the difference between the GARCH and the stochastic volatility specification. In the GARCH case the likelihood function is much easier; in the stochastic volatility case it seems a bit more difficult, so I didn't understand the difference between the two samplers. The other question connects with Matteo's discussion. You put together two models that both suffer from overparameterization: quantile regression, with the problem of estimating a lot of parameters with the same number of observations, and then you add stochastic volatility as well, so you have a lot of parameters there. For example, in a standard Bayesian quantile regression VAR you use Bayesian techniques to do shrinkage, because you have parameter uncertainty, a lot of parameters with the same number of observations. So you are adding yet another layer of uncertainty, because you have more
parameters to estimate. So it is a bit tough, right, to extract that much information from the same number of observations with lots of parameters.

Yes, Daniele Bianchi, Queen Mary University. Thanks, Luca, for the talk, very interesting. I have two very simple questions, and apologies, I'm not familiar with the multivariate Laplace. The first: obviously you buy flexibility, and computational tractability, but one cost you pay is that the third moment is not time-varying, as far as I understood. You have time-varying volatility, but skewness is constant all the time, and obviously the two things have an interplay. So I was wondering what your thoughts are about it, whether you have thought about an extension with time variation in the third moment; that probably relates to the w_t comment before. The second question: again, I'm not familiar with the framework, but can you capture correlations across quantiles? I think that would be interesting from a narrative perspective, for instance if upside risk on PPI translates into downside risk in industrial production, this type of correlation. Can you elaborate on that? Thank you.

Other points? I'm Aubrey, also a co-author of Luca and Matteo. It's great that you're using my method, which has just been accepted at the Journal of Econometrics. On the efficiency of the sampler: what we did for the SV-in-mean approach was compare against the alternative approach, particle Gibbs with ancestor sampling, and we showed in our paper that particle Gibbs with ancestor sampling has path degeneracy issues, or you get highly autocorrelated draws. This is not a question, but it is one comparison you could do to show the efficiency of your method for the SV-in-mean: show that you are not going to have this if
you used the alternative methods.

Let's take one last one, and I have one as well. Yeah, Bernd Schwaab from the ECB. A linear Gaussian VAR can be estimated equation by equation: if there are n variables, it's n univariate time series regressions. In your setting there are n variables and p quantiles, so I was wondering under which conditions this boils down to just running n times p univariate quantile regressions.

We are late, but I cannot refrain from asking a question as well. I'm probably the only, and unrepentant, frequentist econometrician in this room, but from the frequentist perspective there is a one-to-one matching between time-varying volatility, GARCH, and quantiles; my first paper in the area, with Rob Engle, was actually making that point. More importantly, I think one of the advantages of quantile regression is that it's a semi-parametric technique: you just have to specify the dynamics of the quantile, and you don't care about what the likelihood is. It seems to me that going this route you are mired in all the problems of the likelihood, and the questions we have heard point exactly to this. Okay, we go with time-varying volatility, but then: what about the exponential distribution, what about time-varying skewness, and we can go on, what about time-varying kurtosis, and so on. So it seems to me that this way of modeling creates other problems.

How much time do I have to answer? Let's say until twenty past, and then we steal some time from the coffee break. I think I will not have time to answer all the questions, in particular those from Matteo, whom I thank for the discussion. I apologize for not sending the supplementary material, it was a really hard and tough period, and for not coming to your poster, obviously, and that's completely my fault: I was discussing with Monica about daughters, let's say, I was showing her the photos of my daughters, and I think that was more important. I'm joking about it; I will come by later for sure. So, I completely agree with you
about the fact that we need to better motivate the empirical part; we know that is a little bit the difficult part of the paper, let's say. Matteo and I are more the statisticians; Francesco is maybe the one closer to the central bankers. So we know we need to motivate better the part related to time-varying volatility. As for time variation in the parameters, well, that opens up something really huge; we could also add neural networks, but let's say we wanted to stay simple. It was the first time we were working with time-varying volatility in this setting, apart from Simone's work, which is frequentist though, not the dark side, the Bayesian part. The point is that we wanted to be as simple as possible, just to give you the idea. Multi-step-ahead prediction is something we didn't go and look at too much; we just focused on one step ahead. On the efficiency of the sampler: we ran something; if I remember correctly there is no supplementary material, and that's completely my fault, but Matteo can add anything; we certainly ran it on the server, but maybe we didn't include it because the paper already has a lot of pages and a lot of things inside, and adding the convergence analysis inside the paper is sometimes not so well accepted, let's say. The comparison with BVARs with stochastic volatility: that is a different kind of approach; as I said, we are not working with the conditional mean, while that kind of paper works directly on the conditional mean, so that is maybe a little bit harder; maybe in a simulation we can try to do some sort of comparison, but I think it's a little bit hard. The comparison with a univariate quantile regression with additional regressors is something really nice, and I've done it in a lot of papers with Francesco working with electricity prices, but this
time I forgot about it, and I think it would be interesting, at least to see what happens. The CRPS is something that we didn't look at; we just focused on the quantile score, honestly, and we can try to look at it with Matteo and Francesco. The stress-testing approach: if the paper had all these things, plus the stress testing, it would become, I think, 60 pages, and it is already long enough; so maybe that is for a second paper, because here we focus just on the forecasting scenario. Stress testing is what Simone has done in the paper I was citing, with Shakis Bidi, if I remember the name correctly, and sorry if I pronounce it wrongly; so maybe it's something that we should also investigate with Matteo and Francesco.

How much do I have, one minute more, for the audience part? So, the combination is made across models, not across variables; we take each variable separately. I didn't fully get the point about the machine learning techniques for the time-varying part. It's something that Massimiliano was doing a little bit with the Dirichlet process, but as you were stating this morning, you were kicked out by the referee, because I think it's still a little bit of a problem. I'm working with Dirichlet processes with Monica, we have work on that, and I saw that it is really difficult to sell: to the statistics community you can sell it really quickly, let's say, because Dunson has done a lot of papers, but on the econometrics side I think it's a little bit hard to make people think about this sort of infinite representation. Also the neural network that Karim was using for the time-varying volatility is a little bit difficult to think about, also from a sampling point of view.

Regarding w_t, that was one of the main questions. That's something where we just took the exponential, I think, because
well, it was just a data augmentation thing, let's say, a sort of auxiliary variable, just a data augmentation device; maybe we should have thought more about it. And if we open up the question of third and fourth moments, I think we can discuss forever. Once you open the second moment, then moving to the third and the fourth, we can continue during dinner, I guess, and in the coming days and weeks, because I think it is something interesting. But as all of you said, or at least as the referee of the paper was saying, it's a lot; the model is already complicated as it is, and if you add a third layer it gets even more complicated. So, up to a point, as I was also saying to Matteo before, it's not easy, because it's not so easy, let's say, but we have already introduced something that is really complicated, and adding time variation in the third and fourth moments would become really cumbersome. Maybe in the future there will be time to move in that direction too; for now we stop at the second order. Even when we work with the conditional mean with time variation, we are still thinking about whether to move to the third and fourth moments. But I think it's something interesting, and with Matteo, Francesco, and the others working in that direction it would be something nice; and obviously with the others we can discuss, otherwise we are running out of time. So, thank you very much again.
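As a postscript on the w_t exchange above: the exponential data-augmentation device is standard for the asymmetric Laplace in Bayesian quantile regression. The sketch below is a generic univariate version, not the paper's multivariate specification: it simulates the exponential scale mixture of normals and checks that the tau-th quantile of the implied error sits at zero, which is what makes w_t a pure auxiliary variable in a Gibbs sampler.

```python
import numpy as np

rng = np.random.default_rng(42)
tau = 0.3

# Mixture representation of the asymmetric Laplace AL(tau):
#   eps = theta * w + sqrt(psi2 * w) * z,   w ~ Exp(1),  z ~ N(0, 1),
# with theta and psi2 chosen so that the tau-th quantile of eps is zero.
theta = (1.0 - 2.0 * tau) / (tau * (1.0 - tau))
psi2 = 2.0 / (tau * (1.0 - tau))

n = 200_000
w = rng.exponential(1.0, n)                    # the w_t augmentation variable
eps = theta * w + np.sqrt(psi2 * w) * rng.standard_normal(n)

frac_below_zero = np.mean(eps <= 0.0)          # should be close to tau
```

Conditional on w the model is Gaussian, which is exactly why the augmented sampler has conjugate full conditionals; swapping the exponential for a gamma with a free parameter, as asked from the floor, would break that conditional Gaussian-Exponential structure.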