Hi everybody, my name is Marcus, I'm a PhD student, and today I'll introduce you to my paper "Chaoticity versus Stochasticity in Financial Markets: Are S&P 500 Return Dynamics Chaotic?". So what will I present? I will first present the initial situation and the research idea behind the study. Afterwards I will present the chaos analysis framework we developed, then the empirical results, and finally I will elaborate on the implications and research prospects that originate from the study.

The initial situation of the study lies not within the study itself: we conducted a broad literature review on financial and risk modeling, where we analyzed the state of the field since 2008, since the end of the big financial crisis, and deduced whether there have been any changes and what has happened since then. We analyzed a lot here; most importantly, we highlighted quantitative features and implemented models, and we aggregated them into model family trees, so we really analyzed in depth all the financial and risk models applied by researchers during this time period. Our review resulted in over 800 unique models, methods, and algorithms, with the striking finding that there is no single best approach in financial and risk modeling, even when accounting for stylized facts like structural breaks, asymmetries, and so on.

So what does that mean? We made one major insight: models incorporating non-linearities, multi-fractality, and dynamical traits performed slightly better than the other models under consideration, but none of the research we read really explained why non-linearity and fractality produce better results. So we dug a bit deeper into the topic and realized that non-linearity, multi-fractal features, and dynamical properties are all inherent to the field of non-linear dynamics and chaos theory. That's why our research idea was to conduct a financial time series analysis based on non-linear dynamics and to fill this research gap: to elucidate why non-linear models perform better and why there is no single best approach.

What did we do next? We conducted another literature review, this time on non-linear dynamics, to check whether non-linear dynamics researchers were analyzing financial time series. Indeed there are many such results, yet most of them are inconclusive, mainly because the researchers could not differentiate between stochastic and chaotic dynamics. The basic question is: can we determine the true nature of the underlying time series dynamics, whether they are deterministically chaotic or stochastic? Most researchers who analyzed financial time series weren't able to really differentiate between the two. There are several problems here: many researchers work with the same kinds of methods, but they don't all share exactly the same methods, or they use only part of the framework that is really required to differentiate between non-linear deterministic and stochastic dynamics. In short, our study presents a novel combinatory framework of chaotic and non-linear dynamic measures, which we tailor towards financial time series in order to quantitatively differentiate chaotic and stochastic dynamics in a time series, and with it we aim to close the research gap.
For those of you who have never heard anything about chaos before, and since we're on the clock, I'll keep it very brief with a quote from Edward Lorenz, who said: when the present determines the future, but the approximate present does not approximately determine the future, you have chaotic dynamics. The problem with chaos is that a chaotic system seems random but isn't. A chaotic system appears random, yet it is a complex system with very different underlying patterns: feedback loops, recurrence, fractal patterns, and so on. So you see an apparently random system that looks like a random walk or some other stochastic process, but the true underlying dynamics follow chaotic behavior, and they are vastly more complex than randomness could ever be.

In short, there are three main characteristics of chaotic systems. The first is sensitivity to initial conditions, which is reflected in the Lorenz quote: normally, if you start with initial conditions that are slightly different, you expect the results to be slightly different and near each other, but in chaos, if you start near each other, you can end up anywhere; you don't get similar results even if you start infinitesimally close to each other in the phase plane. I drop the other two characteristics due to time; they are just mathematical properties of chaos. The key feature of chaos is sensitivity to initial conditions, which means the result depends very strongly on the starting point.

So what is our chaos framework? The slide is a little overloaded, but I'll break it down into four major parts. The first part addresses the fact that most studies do not apply enough rigor to the prerequisites and standard tests that need to be met in order to conduct a non-linear analysis at all. So in our first step, we test whether our data set fulfills all the requirements to be analyzed with non-linear dynamical methods. The three most important requirements are: the data has to be non-Gaussian, it has to contain non-linearities, and it needs to be stationary. There are ways to analyze non-stationary data as well, but for our framework you must have stationary data that is non-linear and noise-reduced, because noise will destroy your entire analysis. You are also required not to have temporal correlations in your data, because, like noise, they render all non-linear analysis futile.

Once the first step is completed and all the requirements are fulfilled, we can move on to the chaos measurements and tests, which are chaotic measures and methodologies for determining whether the system is chaotic, merely non-linear without chaos, and so on. After that, we can reconstruct the phase space, the space in which our data lives, in order to check whether we can visually reconstruct the attractor of the system, whether we can really show what the overall system looks like. Fourth, we conducted a recurrence quantification analysis, which is based on the recurrence properties of the system and is largely independent of steps one through three, so we use this fourth step as an independent validation of our previous results.

What data did we use? We used daily S&P 500 logarithmic returns, and we denoised them using a cascadic wavelet filter bank.
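To make the denoising step concrete, here is a minimal sketch of wavelet-based noise reduction for daily log returns. It assumes the third-party PyWavelets package; the 'db4' wavelet, the four decomposition levels, and the universal soft threshold are illustrative assumptions of mine, not necessarily the exact cascadic filter bank used in the paper.

```python
# Minimal wavelet-denoising sketch for daily log returns (illustrative only).
# Assumes PyWavelets (pip install PyWavelets); wavelet, level, and threshold
# are generic textbook choices, not the paper's exact filter bank.
import numpy as np
import pywt

def wavelet_denoise(returns: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    # Decompose the return series into approximation + detail coefficients.
    coeffs = pywt.wavedec(returns, wavelet, level=level)
    # Estimate the noise scale from the finest detail level (robust MAD estimator).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    # Universal threshold (Donoho-Johnstone), applied softly to all detail levels.
    thresh = sigma * np.sqrt(2.0 * np.log(len(returns)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    # Reconstruct; trim in case padding changed the length by one sample.
    return pywt.waverec(coeffs, wavelet)[: len(returns)]

# Usage with a synthetic price path standing in for S&P 500 closes:
rng = np.random.default_rng(0)
prices = 1000.0 * np.exp(np.cumsum(0.01 * rng.standard_normal(2000)))
log_returns = np.diff(np.log(prices))
denoised = wavelet_denoise(log_returns)
```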
As I said, noise is bad for the analysis, so we reduced the noise in our data using discrete wavelet filters. We specifically chose daily frequency because much of the literature argues that daily frequency is too low and that you have to use intraday data, so we used daily data to make the point that it is possible with the right tools and methods. Secondly, we used surrogate data sets, a transformation described in Kantz and Schreiber, which essentially eliminates determinism from a time series. You reassemble your time series randomly using Fourier transformations, so you literally destroy the determinism, in order to compare your original data set with the surrogate data sets: if your original data set is stochastic, you should see no difference between the surrogates and the original. To be really sure about this differentiation, we also used Brownian motion realizations as an example of a purely stochastic system, and a Lorenz system, where you know the governing equations, as an example of a purely deterministic and chaotic system. So we compare our denoised data set with the same data set minus determinism, with true stochastics, and with true deterministic chaos, in order to be sure.

What are our empirical results? I know we use a lot of different methodologies, and since we're on the clock, I'll go through this a little more quickly. In our first step, we see that we reduced the noise successfully and that we don't have analysis-destroying levels of temporal correlations; these are the most important points. We have strong nonlinearities, and we have stationary data sets, even though we use logarithmic differences, where you can argue about whether that is artificial or not, but for us it is a brute-force method to ensure stationarity. So we can say our prerequisites are met, and we can calculate all the chaos measures and tests.

For the chaos measures, you calculate correlation sums in order to determine whether there are scaling regions in your data, and we show that we do have scaling regions; you can show it graphically. We also show that the correlation dimension, the fractal dimension measure of the system, saturates with the embedding dimension. The embedding dimension indicates the assumed dimension of the system. The problem with time series reconstruction is that you don't know the true underlying dimension of the system, so you have to make an educated guess using embedding dimensions. You can vary these embedding dimensions in order to pinpoint a dimensional value that should be correct, and if you increase the embedding dimension, the correlation dimension has to saturate. We show that too.

The next point is that you have to have a low entropy level in your data, reflecting its information content, and then there is what is labeled here as Lyapunov exponents. A Lyapunov exponent, especially the maximum Lyapunov exponent, reflects whether two nearby points diverge exponentially or not. Basically, all the Lyapunov exponents and their sum serve to determine the nature of the phase space in which our data lives, and we can show that we have positive Lyapunov exponents that are significant. We also have a negative sum, which means our system, or its phase space, is dissipative: it contracts onto its own attractor. So you don't see several attractors floating around in phase space; you have a single strange attractor onto which the phase space contracts, and we are able to show it.
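As a concrete illustration of the surrogate construction described above, here is a minimal sketch of a Fourier phase-randomization surrogate in pure NumPy: it keeps the amplitude spectrum (and thus the linear correlation structure) while randomizing the phases, destroying any deterministic structure. This is an illustrative sketch, not the paper's exact implementation.

```python
# Minimal Fourier phase-randomization surrogate: preserve the amplitude
# spectrum, randomize the phases, thereby destroying determinism.
import numpy as np

def phase_randomized_surrogate(x: np.ndarray, seed=None) -> np.ndarray:
    rng = np.random.default_rng(seed)
    n = len(x)
    spectrum = np.fft.rfft(x)                        # one-sided spectrum of a real series
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    phases[0] = 0.0                                  # keep the mean (DC bin) intact
    if n % 2 == 0:
        phases[-1] = 0.0                             # Nyquist bin must stay real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=n)

# If the original series is purely stochastic, test statistics computed on x
# and on its surrogates should be statistically indistinguishable.
x = np.random.default_rng(0).standard_normal(1024)
s = phase_randomized_surrogate(x, seed=1)
```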
This brings us to point three of our analysis, the phase space reconstruction. We choose the time delay based on autocorrelation functions, and we have two approaches: a more traditional approach using Takens embedding, and a spectral embedding approach using Laplacian eigenmaps, and that is where we are able to show the strange attractor. I will show it in a moment. In our fourth step we can not only show it on a visual, graphical basis, we can really quantify the true nature of the dynamics. Our major result is that the S&P 500 return system is roughly equally divided, a mixture of deterministic chaos and stochasticity, and it yields a strange attractor, which is basically the explanation for why the literature is inconclusive most of the time.

Since this is all a little cryptic for those who are not into nonlinear dynamics: here we can see the strange attractor of the system, and we compare it to a Brownian motion, the lower picture, and the Lorenz system, where we can see it may be a mixture of both, and we can quantify that successfully using the recurrence quantification analysis.

So what are our contributions and limitations? This framework enables a clear and distinct resolution of inconclusive research results. We can differentiate between stochastic and deterministic chaotic dynamics in time series data sets with a very high level of confidence, because you can quantify the true dynamics and properly describe the true underlying nature of the time series dynamics; based on the recurrence quantification, you can tell what process the data follows, which is a major contribution to the research field. This does not come without drawbacks: the framework is not applicable to non-stationary data types (we developed another framework for that), temporal correlations can render the analysis futile, so without meeting that prerequisite you can't do it, and you need a lot of data. But since we're in the information age, the data requirement is not really an issue, since financial markets in particular are very well documented. So you have few limitations in exchange for much greater insight.

What does it all mean in a nutshell? The S&P 500 return dynamics are a mixture of stochastic and deterministic chaos, and, most importantly, you can calculate the Lyapunov time, the inverse of the maximum Lyapunov exponent, which tells you how long the system stays deterministic before drifting into chaos. For the S&P 500 returns the value is around 34, which can be interpreted in two time dimensions: in academic time, that is, the frequency of the time series itself, so in days, you have 34 days; or if you interpret it in SI units, in seconds, you have exactly 34 seconds before the system drifts into chaos. After the Lyapunov time has expired, no further predictions are possible, no matter how much computing power and how many models you apply; after this Lyapunov time, the system is no longer predictable.

So, to conclude from what we found: we need a paradigm shift away from building new model after new model, towards determining the predictability properties first. You need to determine whether the system is predictable at all before you apply a modeling approach, and once you have determined the predictability properties, you can select a model that suits the time span and reflects the underlying dynamics properly.
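To make the Takens reconstruction step above concrete, here is a minimal delay-embedding sketch in NumPy. The first-zero-of-the-autocorrelation heuristic for the delay and the fixed embedding dimension are illustrative assumptions; in the study these choices come from the autocorrelation analysis and the correlation dimension saturation described earlier.

```python
# Minimal Takens delay embedding: map a scalar series x(t) to vectors
# [x(t), x(t + tau), ..., x(t + (m-1)*tau)] that reconstruct the phase space.
import numpy as np

def delay_embed(x: np.ndarray, m: int, tau: int) -> np.ndarray:
    n_vectors = len(x) - (m - 1) * tau
    if n_vectors <= 0:
        raise ValueError("series too short for this (m, tau)")
    # Row i is the reconstructed state at time i.
    return np.column_stack([x[i * tau : i * tau + n_vectors] for i in range(m)])

def first_acf_zero(x: np.ndarray) -> int:
    # A common heuristic for tau: first zero crossing of the autocorrelation.
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1 :] / (x @ x)
    crossings = np.where(acf <= 0)[0]
    return int(crossings[0]) if len(crossings) else 1

# Usage on a toy series:
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 60, 3000)) + 0.05 * rng.standard_normal(3000)
tau = first_acf_zero(x)
states = delay_embed(x, m=3, tau=tau)   # 3-D reconstructed trajectory
```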
There are several open problems with that, and we have future research planned as well, because there is a field called chaos control, for those who don't know it, which asks: can we control chaos, can we "un-chaos" a chaotic system, and is chaos stable? So the question is: if we are in chaos at one point in time, can we go back? Are there time spans where the system turns predictable again, or not? Thank you.