So, how's it going? Coming to you today from Thailand. I want to talk to you real quick about data and anecdotes. If you've been in rationalist circles for a while, you know that data is king and anecdotes are supposed to be essentially irrelevant. Anecdotes aren't data, and there's something to be kind of despised about those dumb laypeople who talk about anecdotes instead of real data. I think this is a mistake. I think anecdotes are one type of data. They are somewhat lower quality than regular scientific data, but in some circumstances they are higher quality and more reliable. Data is simply the reporting of something that happened: when I took this pill, this happened; when I said this thing, this happened; when I put these two elements together, this happened. It's just reporting experiences. Anecdotes are people reporting their experiences, just in a non-quantified, non-scientific way, and I think in many cases it's dogmatic to dismiss anecdotes and wholeheartedly embrace scientific data just because it appears more scientific. Let me give you a couple of examples. First, let's talk about a hypothetical drug. Let's say there's an antibiotic out there that's been approved by the mainstream FDA-type organizations and has a very low reported rate of side effects. Some of the side effects are fairly severe, but it's approved for mainstream use. But after the drug's approval, something happens: you start to get a lot of anecdotal stories. People start saying, hey, I took this drug and something really powerful happened to me. My tendons started to degrade and I popped a tendon. I've had healthy tendons and ligaments my whole life, but my knee tendon popped, my ankle tendon popped. They say, I took this antibiotic for a sinus infection and now I can barely get out of my chair because my joints hurt so bad.
And these anecdotes (they're data, they're just anecdotes) keep piling up and piling up, and then communities emerge around them: oh, that happened to you too? Yes, that happened to me too. If you're trying to evaluate what's going on from a scientific, rational standpoint, what do you do? Do you look at the scientific data that says, well, in the randomized controlled study, only 3% of people had severe side effects, or 1%, or a fraction of a percent, whatever, while in the real world, when people are taking this drug, lots of them are reporting very severe side effects? Are they just making it up? Is it placebo? Do the anecdotes not trump the data? Well, this isn't just a hypothetical. I urge everybody to Google the drug Levaquin, L-E-V-A-Q-U-I-N. This is an antibiotic in the quinolone family. I'll give you a few anecdotes. In my life, I have met four people ever who have taken Levaquin, and I'll tell you how I know about this drug in a second. I've met four people from four different bloodlines, each of whom had significant joint pain right after taking the drug. Maybe it's not connected, but maybe it is. My wife, my father, my sister-in-law, and a completely unrelated doctor in Atlanta. Four different bloodlines, the only four people I have ever met who had taken Levaquin, and each one of them had significant joint pain. My wife only took, I think, one or two pills, and we almost had to wheel her around because she was in so much pain. My father was in the same circumstance: he took the pills for a sinus infection and was practically crippled for some period of time. I don't know as much about what happened to my sister-in-law, but I know she had joint pain. And the doctor from Atlanta literally had to get foot surgery because a tendon snapped, and it just so happened that just prior to the surgery he had taken Levaquin for an infection.
Isn't that interesting? Now, according to the scientific data, a very, very low percentage of people report these side effects, and yet if you Google Levaquin joint pain, you will find an insane number of anecdotal stories from people online telling exactly what happened after they took the drug, with really tragic outcomes, everything from their joints being ruined to losing their eyesight and all kinds of crazy shit. I don't think this is coincidental. I think it is dogmatic, like religious dogma, to say all these people are just making it up because their experiences didn't make it into the randomized controlled study. A far more plausible explanation is that the official data that came out was wrong, and it's a far more dangerous drug than it was reported to be. And go figure: just in the past few years, the FDA in the United States has issued a black box warning saying, oh, by the way, some of the side effects of these drugs can be very severe. Hopefully this drug will be taken off the market at some point, because just from the research that I've done, it looks like it's causing more harm than good. Don't take my word for it. The point is this: if it turns out that Levaquin causes catastrophic joint problems, then somebody would have been more justified in believing stories like the ones I've just told you, about the four people I've met in my own personal life who had joint problems after taking Levaquin, than in dogmatically believing the official scientific study that came out, which was wrong.
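To make the four-for-four observation concrete, here's a back-of-the-envelope sketch. The rate and the independence assumption are illustrative (mine, not from any study), and the model deliberately ignores selection and reporting bias, which could matter a lot:

```python
# Sketch: if the reported severe-side-effect rate were correct, how likely
# is it that all n unrelated patients you happen to know were affected?
# (Illustrative numbers; assumes independence, ignores selection bias.)

def prob_all_affected(rate: float, n: int) -> float:
    """P(all n independent patients experience the side effect) = rate**n."""
    return rate ** n

official_rate = 0.01  # a low rate of the kind a trial might report
n_people = 4          # the four unrelated people from the story

p = prob_all_affected(official_rate, n_people)
print(f"P(all {n_people} affected | rate={official_rate:.0%}) = {p:.0e}")
# At a 1% rate, four for four is roughly one in a hundred million: either
# a wild coincidence, a biased sample, or a rate that was reported wrong.
```

A result like this doesn't prove the drug is dangerous by itself; it just says one of the assumptions (the official rate, the independence, or the unbiased sample) is probably wrong, which is exactly the kind of reasoning the anecdotes invite.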
So that is not to say, of course, that all anecdote-versus-data situations are like this, but I do think it illustrates something I believe to be far, far more common than most rational people think: the data is often wrong, either because the studies are done incompetently, or because they're biased (the drug companies financing the research have an incentive to bury the results they find), or because the dataset is relatively small. Unfortunately, I think you can take this example and apply it to all sorts of circumstances, especially in the field of nutrition and the side effects of drugs. One of the things I'll talk a lot more about in the future is that there's an epistemological problem here, a methodological problem, where the data that comes from the real world, from the actual ingestion of antibiotics by people who are prescribed them by their doctors, doesn't flow upwards. The information that, oh, by the way, this causes joint pain doesn't go back to the doctors, doesn't go back to the drug companies, doesn't go back to the people producing the studies most of the time. Most of the time, if you have any interaction with the medical establishment whatsoever and you say to your doctor, I took this drug and I had this reaction, they're not going to record it. They're going to say, oh, okay, well, sometimes that happens. Or they're going to say it's totally unrelated, you're just imagining it, don't worry about it. There's no reverse flow of information from the real world to the people who are publishing the studies. This happens all the time.
Before we talk about the next kind of anecdote, I do want to mention just how poor the actual theories from the professional intellectuals have been when it comes to nutrition and human health, in terms of the crappy theories we've had for the last several decades: the lipid hypothesis, the idea that eating saturated fat makes you fat and clogs your arteries. On that basis, the elite intellectual recommendation was to eat a bunch of carbs, a bunch of processed grains. Well, it turns out, most likely, this is a completely bunk theory and the opposite is the case: eating saturated fat is probably good for you and doesn't make you fat, while eating a bunch of carbohydrates does make you fat and causes heart disease. So they got the truth literally 180 degrees backwards, and this was the expert opinion being recommended by doctors and so on for decades. I suggest a great deal of skepticism when these people come out with their official data, especially if there are fundamental flaws in their theory or in how they gather the data. I think having faith in the scientific establishment is not much more respectable than having faith in a bunch of religious leaders telling you what foods you should or should not eat. Now, to head off the inevitable criticism of my skepticism toward science in practice and data gathering in practice, let me give you another case of anecdotes which I think illustrates the problem with how most people conceive of anecdotal evidence. And that's ghosts. There are lots of people in my life who will argue that they have seen ghosts. They believe in ghosts. They think they've seen ghosts in a graveyard. And I'm very, very skeptical of this notion, even though there's tons of anecdotal evidence. Well, we have to be very careful with anecdotal evidence, and we have to do the following thing.
When somebody reports data that they've received, whether it's I took this pill and this happened or I was in the graveyard and I saw this, we don't have to throw out the data just because it's anecdotal. What we need to do is challenge the theoretical interpretation of the data. So I'm not going to sit here and say that nobody who's been in a graveyard has seen anything, that they're just making it up or hallucinating. No, I don't think that's the correct way to approach any kind of data. Instead, say this: I have a positive belief, based on anecdotal evidence, that when people are in graveyards late in the evening, they see things that they think they can best explain by positing the existence of ghosts. Now, they might even agree with that. I think most people would agree with what I've just said. I've not dismissed the data. I've not dismissed the facts as they've reported them. What I've said is that there's a difference between somebody saying, oh, I saw this light in a graveyard moving in a funny way, and, oh, I saw a ghost in a graveyard. So I have a theoretical explanation for what people see in the graveyard at 2 o'clock in the morning that doesn't include ghosts and yet still gives you explanatory power. Maybe that theoretical interpretation of the data is superior. Let me give you my attempt at explaining what people see in graveyards in the evening. I, for one, have been in graveyards late at night, at 2 in the morning, and I report seeing lots of lights and shapes and movements that I never see in my everyday life. Now, if I had a belief system that was casual about what types of things exist, and I thought ghosts existed, I'd say, yeah, I've probably seen some ghosts out of the corner of my eye in a graveyard.
But positing the existence of ghosts actually complicates our theories greatly, because then we have to say, well, there are such things as souls that are not physical and yet can be seen, and they only inhabit graveyards at night, and nobody can ever take a picture of them that isn't blurry, but that's just because of their nature. You have all kinds of really difficult explaining to do if you say you think ghosts exist. But I can attest to the fact that spooky and confusing things happen at 2 o'clock in the morning in graveyards. So here's my explanation. In graveyards there are objects you really don't encounter anywhere else: gravestones, lots and lots of different sizes of gravestones. Some are rectangular, some have domed tops, some are really tall pillars, some are mausoleums, little mini buildings with dead bodies inside and grates on them. There are all kinds of weird shapes in graveyards that you don't encounter anywhere else, and they're all packed together in a way you wouldn't encounter anywhere else. In addition to the funky shapes, you also have the natural result of being in a place with a bunch of funky shapes: light does really weird things. If a car goes by and you've got this field of weirdly shaped objects, you get a bunch of really strange patterns of light all around you. Little bits of light bounce off this, just barely make it through that, and you have this kind of dome-shaped thing being projected over there that kind of looks like it's walking and moving, and then it goes away really quickly. It's just because you're in a weird circumstance at night: you can't really see very well, so you can't make out where the motion is coming from, or the light off in the distance. And then maybe there are animals there that you can't see because it's dark.
So you have sounds you're unfamiliar with, and motion you're unfamiliar with, and light you're unfamiliar with, and it's all moving quickly and you can't really pin it down. Now, that explanation explains the data, the data being: I saw a bunch of things I can't easily explain, and I don't know what they are, in graveyards at two o'clock in the morning. And it doesn't posit the existence of ghosts, objects whose workings are really hard to explain in any kind of rational sense. And this is the point. Whenever you're encountering data, anecdotal data or scientific data, you don't have to believe the communicator's theoretical attempt at explaining the data. You can accept the existence of the raw data without accepting the theoretical explanation. The same thing is true of Bigfoot. Do I think Bigfoot exists? No. Do I think lots of people have seen really big moving furry things in the forest that they don't exactly know what they are? Yeah, of course. This is also true of religious experiences. Do I think that people have all kinds of remarkable experiences in their sleep, communicating with what they think are beings from another dimension? Yeah, I think that happens. Do I think that in reality there are interdimensional beings communicating with earthlings? I'm a little bit more skeptical of that. I don't want to take away from the data of the experience; it's just that I'm skeptical of people's theoretical interpretation of that data. Now, this video would not be complete if I didn't, of course, bring it back to mathematics. A similar phenomenon is happening in mathematics, especially with calculus. People try to argue that the theory of calculus must be correct because in practice calculus works. We have lots and lots of data of the calculations in calculus giving us practical solutions to problems that work in the real world.
Then people turn around and say, ah, therefore the theory must be correct. This is no different from somebody seeing apparitions in graveyards, seeing lights in graveyards, and saying the reason they're there is because there are ghosts, and that fully explains all of these experiences. No. In mathematics, if you guys are unfamiliar with my position, by the way, I claim that there are subtle yet fundamental errors in the theory of modern mathematics, having to do with this idea of a completed infinity, which I don't think exists. You can see some of my other videos on the topic if you're interested. But I would say I have an explanation for the data of why calculus works, and it's not because the orthodox theoretical interpretation is correct. The reason calculus works is because reality is finite. The reason calculus solves Zeno's paradoxes is because there's a fundamental base unit of space: reality is not infinitely divisible. So this relationship between theory and data, accepting the raw data but not necessarily accepting the theory, is not just applicable to areas like antibiotics or ghosts. It's also applicable to what people think of as a purely deductive field of thought, mathematics. It applies anywhere there is data to be gathered, which is pretty much everywhere. My suspicion is that 50 years from now, we're going to have a much different perspective on the value of anecdotes and anecdotal evidence, and on some of the dogmatism that has plagued the past 150 years in regard to what people think of as scientific data. I think there's really a false dichotomy here: the difference between anecdotal data and scientific data is one of degree on a spectrum, and it's dogmatic to dismiss or accept one type of data over the other just because it falls under a particular taxonomic label.
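The Zeno point above can be sketched numerically. This is my own illustration, not the author's formal argument: a strictly finite number of halving steps already lands within any practical measurement tolerance of the limit, which is the sense in which "calculus works in practice" doesn't by itself settle whether a completed infinity is involved.

```python
# Zeno's dichotomy: 1/2 + 1/4 + 1/8 + ... is said to sum to exactly 1.
# A strictly finite sum, with no limit-taking, already gets arbitrarily
# close for any practical purpose.

def zeno_partial_sum(n_terms: int) -> float:
    """Sum the first n_terms of 1/2 + 1/4 + ... as a plain finite sum."""
    return sum(0.5 ** k for k in range(1, n_terms + 1))

for n in (4, 10, 50):
    gap = 1.0 - zeno_partial_sum(n)  # remaining distance to the limit, 2**-n
    print(f"{n:>2} steps: remaining gap = {gap:.3e}")
# After 50 halvings the remaining gap (2**-50) is far smaller than any
# distance you could physically measure on an everyday scale.
```

Whether this finite picture or the orthodox limit-based theory is the right interpretation is exactly the theory-versus-data question the transcript is raising; the computation itself is neutral between them.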
Plus, we use anecdotal data all the time in our everyday lives for constructing our beliefs about the world. If there's some product or service I'm thinking about using, and I personally know somebody whose opinion I respect who has used that particular product, good, or service, or if I'm trying to get information about a person, I listen to their opinion and their experience, even though it's not formally quantified in any kind of white paper, and I think it's important. I think you can probably learn more from other people telling you about their experience than you can from reading some formalized white paper produced by people you know nothing about. I at least know whether the people around me are trustworthy and competent. I don't know whether the people in the scientific establishment, or the people in the business, are competent. I have lots and lots of data of the established experts being fundamentally wrong, mistaken, and not worthy of trust. So for those few people in my life that I do actually trust, I put a lot more weight on their anecdotal evidence than I would on the official orthodox evidence.