All right. Well, first what I'd like to do is thank this man right here. Everyone, give him a round of applause.

So, a bit about me. I'm a mathematician, and sometimes I'm also an AI researcher for a very large corporation which I'm not going to promote, because I'm not here for that. My mom's really proud of it, though. She doesn't know what I do, but it sounds important.

What I want to talk about here is a mathematical theory of knowledge. I'm releasing a paper now, part one, where we define knowledge mathematically. This is not philosophy or something like that; this is a little bit different. So you're going to have to really sit down for the math here, and I'm sorry for that. I'm Bart Maul St. Clair, and I do something called The Amateur Academic, where I try to promote open science and academics for everyone. That's what I'm really here trying to do, and I really love doing it. You can of course always find me on Facebook, Twitter, Quora, LinkedIn, even ResearchGate. It's all available there. Please like and subscribe; I do this for the love, to promote open science and knowledge. And you know, it's quite ironic that I mathematically try to come up with definitions of knowledge while I promote it like this.

So: A Mathematical Theory of Knowledge, Part One. It's part of a three-part series, and I'm releasing it now. If you're watching this on YouTube on my channel, you can click on the link below. If you're here, there's no link; I don't know what you're going to click on.

Now, I have three choices for how to present this, because this is really difficult to do at all, and presenting it in a very short period of time, that's really difficult. So what are my three choices here?
I can describe it with math, of course. I could try it without math; that's a maybe. And I could talk about the applications, which, honestly, people love. They love practical things, they love examples. So I'm just going to skip ahead and go right into the applications. Maybe that's easier, and it will motivate you guys to care about this.

In machine learning, if you have a mathematical theory of knowledge, you can use it to type machine learning algorithms and to combine them. It's very helpful for that. Knowledge extraction, obviously, and possibly even new methods. That's what I'm really excited about, and that's why I do this really abstract mathematics. The other thing you could do with this is in expert systems: you can improve the modeling of expert systems, you can create new logics, and you can even combine different types of machine learning and expert systems to try to build hybrid systems that are much better. Those are the practical motivations. And you can of course put this on the blockchain if you like. I mean, we're at a blockchain talk, so I'll throw in those words every now and again, like "blockchain" or maybe "Internet of Things", in case you fall asleep.

So we're now at the crossroads: do I do it mathematically or not? And if it's a mathematical theory of knowledge, I actually do need to hit you with math. I hope you don't fall asleep during this; I think they would kill me for that. So why don't we split the difference? Why don't we do some math, just a light version? And if you clicked on this and you're watching it, I don't know what you were thinking, because it's a mathematical theory of knowledge, not a philosophical theory of knowledge. So bear with me here. Okay, first off:
We have the first axiom here, in words: knowledge is a compression of informational observations. What the heck does that mean? First off, let's define information. We have this guy, Professor Kohlas from Switzerland, who defines information as an abstract answer to a question. That seems good. And there's no math there yet, so that's good; we got two slides in with no math, right?

So here we go. An information algebra is what he built, and I'm very glad he did, so I didn't have to. Thank you for that, Professor Kohlas. An information algebra is a pair satisfying axioms one through nine, which I don't put here because that would take way too long. So I don't put those axioms in, but you can find them easily. That's from, like I said, Kohlas, and that's our foundation of information. When I talk about information, I'm not talking about facts; a lot of people think "facts", or information technology. That's not what it means here. It means something else. This is another type of information.

So let's get into this. You see there's a cursive D here, some special type of D. It represents domains of abstract questions. Now, what the heck is that? D is a distributive lattice, and I've underlined this here because I'm going to be going to lattices all the time. I love order theory, things like that. So we're going to be talking about lattices a lot. That's actually why it's a cursive D. Sorry, I didn't use LaTeX for this presentation; I didn't want to go full math for you guys, but it looks prettier in LaTeX, that's for sure.

So let's talk about lattices. This is important, and I don't want to do this to you, but it's important. If you look, it says "important" on the slide, so it's actually important. A lattice has two basic things. First, it's an order: it's a partially ordered set.
You see the normal L here, that's a set, and you have some sort of inclusion. It could be greater-than, less-than-or-equal-to, that kind of thing, or it could be "precedes" or "includes". But you can go with greater-than or less-than if you want; that works fine. And secondly, there are operators for meet and join. What the heck are meet and join? Well, you've got a great little diagram here. If you notice, the symbol for meet is pointing down: it's the infimum. It's going down to try to meet two members of the set, to find where they meet, the infimum. And join points in the other direction: it's going up, and when it goes up you get the supremum. That's really all you need to know: things go down and meet, or they go up and join. That's a lattice, more or less.

So now things are getting serious. Let me take off my jacket here. This is getting serious.

All right, here we go. Back to it: D is a distributive lattice. Good, we've got that now. Everyone's still with me? You're still awake? Yeah? Okay, great, just checking.

So we have Φ here, and it represents a set of pieces of information, where each piece, that little φ you see, relates to a specific domain in our lattice of domains D. Okay, so this is from Professor Kohlas. I wish I could say I came up with all this myself, but I didn't; I wish I could take credit. So that's what information is here: it relates to domains. And Φ is a semigroup, which, I don't know if you need to know what that is. Maybe not, but it's good to know, and I wrote it down on the slide; you can read it there.

Now, this is important: Φ is a semigroup under combination.
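Since the talk leans on lattices so heavily, here is a minimal sketch of my own (not from the slides) of meet and join on a concrete distributive lattice: the power set of {1, 2, 3} ordered by inclusion, where meet is intersection (the infimum, going down) and join is union (the supremum, going up).

```python
from itertools import combinations

def powerset(base):
    """All subsets of `base`, as frozensets."""
    items = list(base)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

def meet(a, b):
    """Infimum: the largest set below both a and b (intersection)."""
    return a & b

def join(a, b):
    """Supremum: the smallest set above both a and b (union)."""
    return a | b

L = powerset({1, 2, 3})

a, b = frozenset({1, 2}), frozenset({2, 3})
print(sorted(meet(a, b)))   # [2]
print(sorted(join(a, b)))   # [1, 2, 3]

# Distributivity, the property the talk's D is assumed to have:
# x ∧ (y ∨ z) == (x ∧ y) ∨ (x ∧ z) for every triple in the lattice.
assert all(meet(x, join(y, z)) == join(meet(x, y), meet(x, z))
           for x in L for y in L for z in L)
```

The ordering here is just subset inclusion, which matches the "inclusion" relation described above.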
There are several operations you can do in an information algebra, and combination is one of them. I'm not going to get into that; I'm going to spare you. It's going to be okay; we're going to get through this together.

So, the next axiom: knowledge representation is medium dependent. What does that mean, "medium dependent"? Well, it means there are knowledge types, surprise surprise: deterministic and non-deterministic. We're going to be discussing those types of knowledge.

Let's go for the non-deterministic first. Shannon information entropy is a good example of that. Maybe some of you know it, maybe not. If you don't know what it is: it's surprise. Surprise, where you didn't know I was going to do that, and if it were a probabilistic thing, you would see that. If you don't know what signal is coming next in a system, such as with Morse code or something, then the Shannon entropy is higher, because you're getting new information. That's Shannon information entropy. That was a very bad example, but it's funny, so I thought I'd go with it. You're still there, I guess? Right? Okay.

Now let's talk about deterministic. Information content is one of the important things there, and that's order. Remember, we're talking about the lattice and we're talking about ordering. The next one, of course, is vagueness. You've heard of vague information or vague data, where things aren't very precise? That can happen, right?

Now, it would be really cool if we could combine Boolean and non-Boolean set theories, meaning one that's very precise and one that's very vague. People have been doing this a lot, but a lot of it wasn't very good, and there are very technical reasons for that, which we're going to talk about right here. So what are the predicates of set logic, or set theory? Membership and equality. Membership, of course, is "element of"; and equality.
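Before we get to the fuzzy-set part, the "surprise" description of Shannon entropy above can be sketched in a few lines. This is a toy illustration of mine, not something from the talk's slides: rare outcomes are more surprising (-log2 p is large), so a less predictable source carries more information per symbol.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: the average surprise -log2(p)."""
    return -sum(p * log2(p) for p in probs if p > 0)

fair_coin   = [0.5, 0.5]   # maximally unpredictable
loaded_coin = [0.9, 0.1]   # mostly predictable

print(entropy(fair_coin))              # 1.0 bit per flip
print(round(entropy(loaded_coin), 3))  # 0.469 bits per flip
```

The fair coin surprises you the most, so each flip carries a full bit; the loaded coin usually does what you expect, so on average it tells you less.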
Equality there is two-valued, right? Now, when you get into fuzzy sets, you have fuzzy membership, where membership is defined on a closed unit interval, usually, something like that. It's a function, a mapping to the closed unit interval, or to a lattice, as we talked about before. But it's still regular equality, not fuzzy equality, and there were a lot of problems with that, because if you only take one of the two predicates of set theory and fuzzify it, it breaks things. It's not very good. I'm not going to get into why.

So there are these guys: Pitts, who first raised the issue, and Michael Barr, who actually did this. They came up with this thing which gives you fuzzy equality as well, so I call them "fuzzy fuzzy sets", because it's funny. They don't call them that; I do. Hopefully that sticks. These guys did a really great job with this, and I highly recommend reading the paper, especially if you're into category theory. It's really good.

So just really briefly, I'm going to show this. You don't have to understand what it is, but you can see here that there are two projection mappings to a set X, and that they're morphisms in the same category. I don't know if that means anything to you guys, but that's what it is. You have one mapping for your membership and one for your equality. Okay, so that's really great.

Now, why is that cool? Well, it forms a complete lattice. Remember that lattice? Hopefully. And the cool thing about this is you can do power sets and function sets, and you can do everything of an intuitionistic logic, so that's really, really great. This was proven by Michael Barr. Shout out to that guy; he's amazing, he did some really good work. He's still alive, by the way, 81 years old.

So with these fuzzy sets you can do Boolean, non-Boolean; you can do fuzzy information.
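To make the "fuzzify both predicates" idea concrete, here is a toy sketch of my own, not Barr's actual categorical construction: a carrier set with a graded membership map and a graded equality, both valued in the closed unit interval. The similarity function used for equality is a hypothetical choice, just for illustration.

```python
# Carrier set: possible answers to the question "is it cold?"
X = ["cold", "cool", "warm", "hot"]

# Fuzzy membership: degree to which each element belongs to the
# fuzzy set "cold things", a map mu: X -> [0, 1].
mu = {"cold": 1.0, "cool": 0.7, "warm": 0.3, "hot": 0.0}

def eq(x, y):
    """Fuzzy equality eq: X x X -> [0, 1].

    1.0 on the diagonal; off the diagonal, how interchangeable x
    and y are as answers (a made-up similarity for this sketch)."""
    return 1.0 - abs(mu[x] - mu[y])

print(eq("cold", "cold"))           # 1.0: fully equal to itself
print(round(eq("cold", "cool"), 2)) # 0.7: nearly interchangeable
print(eq("cold", "hot"))            # 0.0: not equal at all
```

Classic fuzzy set theory fuzzifies only `mu` and keeps crisp equality; the point above is that grading `eq` as well is what makes the whole structure behave.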
They form a topos, which is really great, so you get a complete Heyting algebra. That's great for some reasons which I'm not going to explain. Okay, that's one of the foundations of these types of knowledge. I hope you're still with me on this.

The cool thing, though, is the fuzzy lattice type embeddings. I put a meme in here, just because they said that would be funny. Is it? No? Okay, darn. So you can do lattice embeddings here for different types of fuzzy sets, so you can produce all of them. That's really great.

Type two, of course, is about measures. In this case we're going to talk strictly about probability, but it's measure-theoretic in general, and over the real numbers it does not form a complete lattice, unlike the other type. And of course you have information content and entropy. Now, information is handled discretely; it's handled fine, it's countable. Whereas knowledge may not be. That's the difference here, and that's actually another reason why, in the case where you have R, the real numbers, or something like that, it doesn't form a complete lattice.
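The "complete Heyting algebra" remark can be made concrete with a standard example (my choice of example, not necessarily the paper's): the unit interval with min as "and", max as "or", and the Gödel implication. It also shows why the resulting logic is intuitionistic rather than Boolean.

```python
def h_and(a, b):
    return min(a, b)

def h_or(a, b):
    return max(a, b)

def h_implies(a, b):
    """Gödel implication: the largest c with min(a, c) <= b."""
    return 1.0 if a <= b else b

def h_not(a):
    """Negation defined intuitionistically as a -> 0."""
    return h_implies(a, 0.0)

print(h_implies(0.3, 0.8))  # 1.0: premise is weaker than conclusion
print(h_implies(0.8, 0.3))  # 0.3
print(h_not(0.4))           # 0.0

# Excluded middle fails: "a or not a" need not equal 1, which is
# exactly what makes the logic intuitionistic, not Boolean.
print(h_or(0.4, h_not(0.4)))  # 0.4, not 1.0
```

In a Boolean algebra `a or not a` is always true; here it can fall strictly below 1, so the truth values behave like the open-ended, vague information the talk is after.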
So you can't have an order-theoretic description that's really complete, and it does not actually form an information algebra according to Kohlas, because that requires a distributive and complete lattice, and you don't get that. The reason behind all this is very, very complicated, but maybe you've heard of the Banach-Tarski paradox: if you have uncountable sets, you can rearrange the pieces of a sphere in a way that gives you two spheres. When you're trying to measure stuff like volume, those measures become meaningless. That's actually why the guys who invented probability formally, with sigma-algebras and this kind of difficult-to-explain machinery, limited things so that you could actually get meaningful measures out of them. Otherwise you'd run into some really bad problems, and that's why it doesn't form a complete lattice. I wrote a proof of this; it's actually pretty straightforward, and it's in the paper. All of this has been proven. You can check it out, or you can just come up to me and ask me to write it down really quick.

So, going back to it: knowledge is a compression of informational observations. What do we mean by compression? Well, that's using these types of knowledge, type one and type two. Now, sometimes they can be isomorphic with each other.
Sometimes you can have cases where they're equivalent, and in some cases they're not: in some cases one is better, in some cases you can use both of them, and you can combine them. You're always trying to find what you would call the atomic algebra, the least representation that you need, which would be the compression. That's something we're going to be talking about in part two of the paper; I'm not going to do that to you now. I think this was enough for you guys. Like I said, this time I decided to go really, really technical, and I'm sorry for that, but damn it, it's a mathematical theory of knowledge. Okay? So deal with it.

So anyway, I do The Amateur Academic. You can always check me out on Facebook, Twitter, Quora, even LinkedIn, whatever; I'm always available. You can ask me questions, especially on Quora. I love answering questions on Quora.

What I'd like to do right now, really quickly, is thank all of my friends, my loved ones, all the OGs out there, all of you watching. I know you're probably streaming this right now, Mom. Thanks, Mom. Thank you so much for all of your love and support over the years so that I could do this crazy math stuff. And thanks to the big company that pays me to do AI stuff so I can do this kind of crazy stuff with you guys. It's all about the love, and I really hope this inspires people, in one sense or another, to be more academic and do research. Because honestly, when it comes to things like fake news or whatever, you don't need to build a machine learning algorithm; you need to build people who can actually figure things out better, and that happens through academics and teaching academic methods. If there's only one thing you walk away with besides what a lattice is, it would be that.

That was A Mathematical Theory of Knowledge, Part One: Defining Knowledge. Thank you very much for your time. Let's get the heck out of here.