My name is Ray Hall, and welcome to the Sunday Papers at TAM. We have a limited time budget and six great talks lined up for you, so let's get started. Our first speaker is a psychiatrist, Dr. Robert Stern. He's the Clinical Director of Behavioral Health Services in Iowa Falls, Iowa, and we welcome him to talk to us about the mystifying, misdefining, and misdiagnosing of dissociation. I think I got that. All right, welcome, Dr. Stern. Howdy, gang. This paper came about as a result of my inability to understand what I was reading in a very popular psychological test called the Dissociative Experiences Scale. It's a test to screen for multiple personality disorder and a couple of other things. Okay. Dissociation started out as hysteria. In 1860, Jean-Martin Charcot, who was the most famous neurologist of the time, got interested in a group of people whose symptoms made no sense to him whatsoever. They'd be paralyzed in the left arm one day, paralyzed in the right leg the next day, blind the third day. He was absolutely convinced that this was due to a neurological lesion, but he could never find the lesion. He finally gave up and said, well, this is psychological. But being a brilliant diagnostician, he said that its symptoms were suggestibility, exaggeration, selective amnesia, and attention-seeking behavior. Skip ahead 90 years to the first Diagnostic and Statistical Manual of the American Psychiatric Association. There it's conversion hysteria, and dissociation is mentioned as a symptomatic expression of hysteria. In the DSM-II, the second edition, you still have hysterical neurosis, but hysteria is broken up into the somatoform and conversion disorders, which are the physical expressions, and the purely psychological expressions, which are called hysterical neurosis, dissociative type. Those of you who remember your Monty Python will appreciate the Spanish Inquisition as I go through this.
So how is dissociation defined? In the DSM-II, it's an alteration of consciousness and identity. In the DSM-III, it's identity, memory, and consciousness. In the DSM-IV, it's consciousness, memory, identity, and perception. And in the DSM-5, which came out last spring, it is consciousness, memory, identity, emotion, perception, body representation, motor control, and behavior. One thing should be patently obvious: there is no diagnosis which would not fit into that definition of dissociation. You could put half of the medical diseases in Western civilization under that diagnosis. Nonetheless, that is what we use. Well, how did that definition come to be? There is no empirical validation for it. You have to remember that the DSM is not a scientific document; it is a consensus document. Well, this creates problems for people who want to diagnose multiple personality disorder. And in 1986, a couple of researchers got together and wrote up the Dissociative Experiences Scale: 28 questions to help people diagnose MPD, which is now called dissociative identity disorder. Well, I'm reading through this and it's not making sense to me. So I start reading through it: this questionnaire consists of 28 questions, blah, blah, blah, etc. I look at the scale, and something's not adding up in the back of my mind. So I go through and I actually look at it. All right, there are four questions — I'm sorry, four sentences. First sentence: 28 questions, 24 hours in your day. Second sentence: it asks how often, in the active voice. Third sentence: how often, in the passive voice. Fourth sentence: to what degree the experience described in the question applies to you, which I don't know how to answer. And then you are to circle a number to show what percentage of the time these things are happening. Trying to answer how often something happens on a scale that measures how much time something happens is a little bit like asking someone his weight in feet and inches.
Well, I called up the head of the psychology department in our little town. His name is Michael McDonald; everybody calls him Macko. Hey, Mack, you want to do a little experiment? And we get 58 students together and give them the DES, and they zip through it. The moment they turn in their questionnaire, we give them one more questionnaire with one sentence: How did you answer the DES — how often, to what degree, how many hours, or you didn't know? Thirty-three said they were answering how often, 18 said to what degree, one said how many hours, and six admitted they didn't know what they were answering. Now, this is on a test whose authors state has good validity and reliability. I mean, this test has taken off like a rocket. It is used not only to test for multiple personality disorder — dissociative identity disorder, as we know it today — but for mood disorders, psychotic disorders, personality disorders, and eating disorders, as well as things like irritable bowel syndrome, immune diseases, and pelvic inflammatory disease. And there was even one article I read trying to correlate dissociation with belief in Bigfoot and alien abductions. I was stumped. Well, we get these results back. Mack looks at me and he goes, Bob, surely you can't be the first person to see this. And I go, I don't know. So I go through the literature again. These are all published articles in peer-reviewed journals. One group of experts says the scale measures the number of times; another group says it measures the amount of time; one group, including one of our speakers, says it's both; and at least one person says it's neither. These are all peer-reviewed journals. And this is not an exhaustive list, either; this is just what I could fit on the PowerPoint slide. Well, this doesn't even get to the questions. So let's look at the questions. There are absorption questions, there are amnesia questions, and there are depersonalization and derealization questions.
Well, absorption means how much you're concentrating on something. But absorption is not only not pathological, it was never even a part of the definition of dissociation. The argument goes like this: the more you are concentrating on something like a baseball game, the less you are concentrating on the barking dog outside, and therefore you are dissociating. However, if you get distracted by the barking dog outside, you have shown a disruption in attention, and therefore you are also dissociating. And if you are watching the baseball game and simultaneously distracted by the barking dog, you are also dissociating, because you have split your attention. You can't go wrong with a definition like this. Well, there are also 16 amnesia questions. The questions ask you to remember not only what you forgot, but to put it in the form of a percentage. And there are depersonalization and derealization questions, which ask you to quantify your feelings. That's a little bit like asking, to what percentage do you love your neighbor? How do you answer that? Well, it occurred to me that this wasn't due to some sort of distortion of psychological principles. This was just plain bad writing. Let me show you some of the questions. Some people sometimes find that when they are alone, they talk out loud to themselves — circle the percentage. Do you sometimes never talk to yourself? Do you sometimes always talk to yourself? Or do you fall under the bell curve with the rest of us, people who sometimes sometimes talk to themselves? I have no idea how to answer that, but this is small-fry stuff. Try this one. Some people are told that they sometimes don't recognize friends or family members. Well, suppose you're stuck in a room with your Aunt Esmeralda and somebody flicks on the light ten times for one second each time, and for five of those seconds you recognize Aunt Esmeralda and for five you don't.
And you feel this applies to you 50% of the time, so you circle 50%. You have answered the question, right? Wrong. It's not asking you how often you don't recognize Aunt Esmeralda. It's asking you how often you're told that you don't recognize Aunt Esmeralda. She's still going to be upset, no matter what. Well, like I said, this is simply bad writing. So I go to my old Strunk and White, and there are the answers, primarily in Chapter 2: use the active voice; put statements in positive form; use definite, specific, concrete language; omit needless words; and avoid the use of qualifiers. So I rewrote the instructions to ask how many times a day and how many hours a day — for example, how many times a day and how many hours a day you talk to yourself. And I rewrote about 16 of the questions. In addition, I asked them how many hours they spend going to school or work, how much time they spend eating, and how much time they spend sleeping, because I wanted to know if there was a correlation between what they said they were doing and what they said they were dissociating from doing. Here are the stats. Forty-three of the 58 finished the questionnaires. The average DES score was 23, which is considered fairly normal for college students and pathological for anybody else; you can make of that what you will. The average number of hours in a 24-hour day that they spent dissociating, on these 16 questions — not the full 28 — was 48 hours a day, with dissociation ranges up to 1,700%. Was there a correlation? Apparently not. This might make statisticians happy, but it doesn't give you an idea of what these students actually said. So let me give you some examples. Student number one said he got so involved in watching TV that he was able to tune out things 90% of the time. How many hours a day do you watch TV? None. This was no accident: when I checked the frequency scores — how many times a day do you watch TV? — none. Then there's highway hypnosis: you know, driving down the street and not remembering all or part of the trip.
Student number one again: he dissociates 40% of his travel time. How many hours a day do you travel? None. Contrast that with student number 16, who remembers everything about his trip and spends ten hours a day traveling to and from school or work, which is the equivalent of traveling from San Diego to here and back every day. Autoprosopagnosia is one of those three-dollar words you will never see outside of a spelling bee or a neurology unit. It means the inability to recognize your own face. It is extremely rare and extremely debilitating. Nonetheless, 13 of our students in our little town of 5,000 people suffer from this rare malady, one of whom said he doesn't recognize himself 90% of the time for one hour a day. Contrast that with student number 14, who always recognizes himself but spends six hours a day looking at himself in the mirror — suggestive of another psychiatric malady called narcissism. All the students daydream, but two of them said they daydreamed 100% of the time, including while taking these tests, which I could certainly believe. Student number 14 said he was able to ignore pain 90% of the time. When asked how many hours he was in pain, he said none. Well, it's always easier to ignore pain when it's someone else's. And half the students said they were hallucinating — one of whom, however, said that he never hallucinated once daily, absolutely worthy of Alice in Wonderland. Well, what can you say about my little test? Some people sometimes exaggerate, and some exaggerate more than others. Some people haven't learned that there's only 100% of anything. Some people haven't learned that there are only 24 hours in a day, even when it's written in great big black letters at the top of the page. Some people will say yes to anything. And some people don't know what the hell they're talking about. What can you say about the DES?
It's a test whose instructions and questions aren't understood by the subjects taking it, whose answers are recorded on a scale which can't measure them, in order to test for a group of disorders which can't be defined — but with good validity and reliability. Thank you. And with 25 seconds to go. Dr. Stern, thank you for setting the bar rather high for the rest of the speakers. Does anybody have a question for Dr. Stern? We have time for one or two questions. Please stand up and come to the center if you have a question. How did this happen? Well, at the end of this test, I called up Mack and I said, Mack, I don't think psychology should be taught as an academic subject. Any other questions for our speaker? All right. Well, thank you, Dr. Stern, very much for that. Excellent. Wake up. Our next speaker is Mr. Jacques Rousseau. He is the head of the Free Society Institute, and he comes to us all the way from Cape Town. He's here to talk about the responsible believer. Jacques? Thank you, Ray. Good morning. These celebrations of reason, I think you might all agree, take place in what you could describe as a general climate of unreason — from your towns to South Africa, where I come from, which you might know from Oscar Pistorius or that other Nelson Mandela fellow. There's something fairly simple I think we can do which could help in combating this climate of unreason, and I could summarize it by saying that one of the important lessons the skeptical community can teach others is that things are often uncertain. We might have very good reasons to believe something, yet not feel entitled to claim that we can be sure of it. And this attitude of epistemic prudence — i.e., not making claims that aren't sufficiently warranted by the evidence — alongside a certain humility.
And by that I mean the ability to accommodate the possibility that you might be wrong — are essential resources for being a responsible believer, which is what I'm talking about here. So the idea of epistemic prudence is worth dwelling on for a moment. I think one striking difference between skeptical thinkers and non-skeptical thinkers is their attitude towards certainty. I'd like to suggest that a skeptical thinker would be far more likely to recognize that when we speak of truth, we're typically talking about the best-justified conclusion available to us rather than something we would claim to be certain of. By contrast, I think that non-skeptical thinkers often entertain the fantasy that they can be more sure of things than they are entitled to be. Justification is therefore our proxy for truth; it's our method of triangulating it, of approximating it. And a focus on justification rather than certainty can remind the community, and the people we interact with, that they need to adopt this more modest attitude toward what you can claim to know for sure. So, as I said, a non-skeptical view is that things can be known for certain. And what that leads to is a cycle of absurd proof and disproof of things, as you all know when you engage with the media, where things become true and then false when the next study is released. So here's one example. The diet wars are a current and fitting example, where moderation and any kind of nuance in scientific understanding is ground up by inflated claims that one particular diet is the only way to go — it's going to save your life, or it's going to kill you. Or this example: that sugar is not just something to be careful of, as we all know, but something that is both addictive and perhaps even the new tobacco. So it creates a kind of risk society, where people stop thinking, stop being able to think about a nuanced view of something, and instead leap towards dogmatism.
And I think the temperature of these debates, inside the scientific community and the popular-science community, might be far more comfortable and far more fruitful if proponents adopted a more considered tone and resisted claiming the final word on things. After all, our knowledge is contingent on what we can know at a given time, what we can possibly know. And the chances that you have something exactly right at any particular point in time are therefore often vanishingly small. Secondly, I'd like to encourage us to consider adopting a more humble attitude towards our epistemology. Being able to recognize that you might be wrong is an essential component of being persuasive in teaching people how to think about scientific things. A smugness or overconfidence regarding the set of ideas that you regard as true might sometimes be justified, sure — leaving aside the question of how politically attractive or how effective that might be — but it could also be a sign that your belief has to some extent ossified or calcified. I think our convictions can become a kind of item of faith rather than something we hold as a responsible belief, i.e., something we regard as potentially falsifiable. And I think it's exactly because skeptics tend to be good at this that we need to remind ourselves sometimes to do a kind of diagnostic check, especially perhaps in situations where we are emotionally invested in a conclusion. The point is that being better at avoiding things like confirmation bias and other common ways of getting things wrong doesn't make us immune to them, and it might make them a particularly problematic blind spot for us, precisely because we feel that we're good at this already. So in light of the fact that the world is complex, and that we can only know what is available to us at any given time, it should strike us as surprising, I think, how seldom we hear people say things like: I simply don't know.
I do not have an answer to that question. We tend instead to come up with an answer and then defend it in a post hoc kind of fashion. Those of you who read Jonathan Haidt's work will know that he uses this as a way to explain moral reasoning, where he speaks about the emotional dog wagging its rational tail. We come up with a conclusion, we feel very strongly about it, and then we act as a lawyer, in a sense, and try to justify that belief instead of saying: well, hold on, you've caught me. I actually don't have an argument for that view. I take back what I said. We're stubborn; we don't like to be wrong. So once we make that emotional commitment to something, we're reluctant to confess that we could have made a mistake. And this sort of escalation of certainty, this hyperbole, closes off the space for nuance, and I think that we as skeptics are the ones best placed, in a sense, to defend this more considered and nuanced way of expressing our views. To add to the difficulty of entertaining and encouraging this considered style of debate, the widespread availability of information via the Internet has arguably democratized the concept of expertise itself, and the idea of authority — and who can act as an authority — is under constant challenge from anybody who has an Internet-connected device, i.e., everybody. And while it's of course true that we shouldn't accept the word or the testimony of authorities in an uncritical sort of way, we should surely accept that authority and expertise are not the sorts of things you can get via Google. I mean, there are lots of people here in the medical profession, and I'm sure you are dismayed by how Dr. Google is often the first port of call for patients. Sometimes — in fact, most of the time — somebody out there will know more than you do, and you could quite possibly be wrong. And I think we should remind ourselves of that more often.
And I think another aspect of what this death of authority means is that no matter what your point of view is, you can find somebody who will support it and reinforce your belief, while you, in joining that community, reinforce theirs — and you have, in a sense, this escalation of confusion, with all of you walking away believing you're the authorities and everybody else baffled on the matter. Eli Pariser's concept of the filter bubble, I think, articulates this well, because if you're looking for evidence of Bigfoot on a cryptozoology website, you're of course going to find it. And you might end up walking away believing in the Loch Ness Monster, too, because that's the kind of thing you'll get exposed to on that website. So there's this self-supporting web of evidence, which is immune, in a sense, to correction from the outside. And as you all know, with conspiracy theories the situation gets even more absurd, in that being unable to prove your theory is evidence of a conspiracy to hide the relevant evidence, therefore leaving you more entitled to believe what you'd like to believe. So combine this filter bubble and this democratization of the idea of expertise with the nonsense of a kind of blanket, generalized respect for the opinions of others, and you quickly end up drinking too deeply from the well of postmodernism, where truth takes a back seat to sensation, and where simply being heard takes so much effort that we either withdraw from debate entirely or become superficially smug. I'm not accusing this crowd of being superficially smug, but among people who engage with science on that kind of meme level — via BuzzFeed or the Internet or whatever — you surely have come across people who don't seem to have an understanding of what they're talking about, yet feel quite smug about their rectitude in the conclusions they're expressing.
And despite the complications I've sketched now, I think we can develop, as well as teach, resources for separating unjustified from justified conclusions and for being more responsible believers. And what I mean by responsible believers is both taking responsibility for our beliefs, by holding ourselves to account for them, and secondly holding beliefs responsibly — in other words, forming them as carefully as possible and changing our minds when it's appropriate to do so. Some of you might have read Peter Boghossian's 2013 book — he was here last year — A Manual for Creating Atheists, in which he introduces the concept of street epistemology: simple but effective rhetorical and logical maneuvers that we can deploy in everyday situations. In a similar vein, I'd like to articulate a few concepts that can serve as resources for making it more likely that we end up with responsible beliefs, and also that we encourage responsible belief formation in others. So many of our blind spots in argumentation, I think, involve not understanding or taking full cognizance of the politics of the conversation, rather than focusing on the logic of the argument — forgetting that arguments and debates occur in a context. In the heat of battle we forget about all the things that our books and our lessons in logic might have taught us. Debates occur in a context, rather than in some hypothetical space of reasons. So, to borrow from Dan Dennett's book from last year, Intuition Pumps, we might usefully remind ourselves of Rapoport's rules when engaging with an argument, with an opponent. Rapoport's rules invite us to do the following: first, attempt to re-express our target's position so clearly, vividly, and fairly that they say, thanks, I wish I'd thought of putting it that way. Second, list any points of agreement, especially if they're not matters of widespread or general agreement.
Third, mention anything that you might have learned from your opponent, and only then are you permitted to start with rebuttal or criticism. One immediate effect of following these rules is that your target becomes a more receptive audience for your criticism than they would otherwise have been. Then, the distinction between explanations and reasons is worth dwelling on. Rozenblit and Keil from Yale University speak about the illusion of explanatory depth: we're inclined to believe that we have a great handle on our argument and our explanation. Those of you who teach, like me, know the feeling of preparing a lesson plan and thinking it's all perfectly clear, then walking out, and somebody asks you a question, and you suddenly find yourself a little bit dumbfounded, hard-pressed to answer. So they argue that instead of trying to provide reasons for our beliefs, we should try to provide explanations for them. For example, instead of asserting that you need universal healthcare because everybody is equal and therefore entitled to care, try instead explaining how your scheme would work: who would benefit, how would they benefit, who would pay for it, etc. So showing your workings — demonstrating that you understand the logical flow of the argument or of the claim you're making — stands a better chance both of persuading somebody else and of revealing to yourself the flaws, and how you could improve your own arguments. Next, something I'm sure you're all aware of: the backfire effect. Perceived threats don't make critical reflection easier for your opponents. If they feel like they've been threatened or challenged, ego certainly does come into it. We don't like our deep convictions being challenged, and perhaps this causes us to dig in our epistemic heels.
So again, this is about the politics and the tone of argumentation, whereby we should try to be charitable to the fact that the other person is deeply convinced of their point of view for sincere reasons of their own, whether they're high-quality reasons or not. Consider the possible long-term implications of how we can, via the backfire effect, rule out certain criticism as out of order. It has the potential of getting us onto a kind of slippery slope, because once you rule out one set of criticisms, how much easier it might be to rule out a second set, or a third, until you eventually become as epistemically virtuous as an astrologer. So we potentially create our own filter bubbles in skeptical gatherings like TAM and all of these sorts of conferences, and we need to guard against that. There's no shame in saying I don't know. So I'm arguing that there's a strong signaling value in nuance — a strong signaling value, to those outside the community and those inside the community, in acknowledging the possibility that things are complicated and that certainty is often not available to us. Our skeptical currency, our value as skeptics, is invested in fighting for a considered view and showing how it's often the most accurate reflection of the data available to us. And unearned certainty can be harmful to that political cause. Skepticism is not, after all, about simply being right. It's about effecting change in the world. It's about persuading people to take more responsible attitudes towards factual claims. And merely reinforcing our identity as skeptics, and appearing smug about that, can get in the way of effecting this change. So we should all try to help other people develop resources for being visible leaders who believe true things rather than false things. So, to conclude: skepticism is not a conclusion. It's a way to reach conclusions.
And our job is to always demonstrate that method, that way of reaching conclusions, rather than simply asserting those conclusions. Setting the example is a vital part of our mission; it's the way we encourage and inspire others to adopt these strategies. And I'd like to suggest that as humanism is to ethics — i.e., a kind of woo-free inspiration and guide for living a good life — skepticism is the equivalent for science: providing responsible examples and guidance on how to be a responsible believer, and on the importance of holding yourself accountable for that. Thank you. We have time for one or two questions, and we have a question for our speaker. Do you think that more speech being available improves the level of discourse? Yeah, that's a good question. In theory it should certainly do so, yes. But as I'm sure we're all aware, there's a very low signal-to-noise ratio out there in the blogosphere. So we do need to develop resources for separating good sources from bad. So yes, in theory it does, but the noise can drown out the quality stuff, also because of the kind of climate of sensationalism where, as always, headlines sell. It's easy, I think, to game that system and spread nonsense via high popularity. So yes, in theory, but we need to develop resources for moderating the quality of those resources, too. A hand for our speaker, please. In terms of finding resources, see the talk from last year on rbutr; I think we addressed that a little bit in our Sunday session last year. So our next speaker is Dr. Stuart Robbins. He's a PhD research scientist at the University of Colorado Boulder. His specialty is planetary geophysics, and he also has a blog called Exposing Pseudoastronomy. So please welcome Dr. Stuart Robbins, who's going to address the question: Cydonia, Mathematical Keys to the Numerology of Mars. So I'm going to talk about something that I don't think has been talked about a bunch of times.
I'm going to do sort of a straightforward debunking, and I'm going to focus on this Cydonia region of Mars. And Cydonia really isn't that special; it's just a region of the planet. We named a lot of regions in the early 1900s based on telescopic observations that just showed lightness differences, color differences, slight differences across the planet in early telescopes, and Cydonia happened to be one of those regions. Now, it was made famous — and it's perhaps the king of all, or the queen, depending on... I don't remember exactly what Cydonia means — but it's the cornerstone of a lot of space anomalies. That's because it's in Cydonia that, in 1976, scientists from NASA unveiled Viking photograph 035A72, which had the face on Mars — the infamous face, this mile-long feature that looked like a face under certain lighting conditions. And unfortunately, that sort of sparked this modern-day anomaly hunting on other planets. Mars is ripe for anomaly hunting, because an anomaly is very simple: you see something in any of the literally millions of photographs we have of the planet, you don't understand it, and immediately it's whatever — it's aliens or Bigfoot. There is a picture purportedly of Bigfoot on Mars, or there's Jesus on Mars, or almost anything else. And so what I'm going to take you through is a case study of some numerology claims, or numerological claims, that have been used to argue that the broader Cydonia region — all of these features that you see in this image — all combine, so the claim goes, to show that Mars was created, or at least that this region was laid out and the features were created, by some sort of intelligence. You might almost say it was intelligently designed, but I don't quite go that far. So this particular version of the claim I'm going to be addressing is made by Richard C. Hoagland. For those who don't know Richard C. Hoagland, he's one of the original face-on-Mars guys.
If you really read anything about, or look at anything about, space or astronomy anomalies, you're going to come across Richard Hoagland. He's the guy whose name the SGU — and I'll back away from the microphone a little bit — draws out as "Hoooagland." So that's Hoagland. This particular claim, as I said, is about numerology. In particular, there are in this case 19 features that show numerological relationships that could not possibly, the claim goes, be created by nature, by natural processes. They do this by looking at different kinds of features. So, for example, the angle of the staircase between me and the panel table — that would be one angle that would be measured. Another could be the angle between me, Ray, and the guy backstage. And if we convert those angles from degrees to radians — just multiplying by a scaling factor — we get a small number that's a key number, an irrational number: something like the square root of three, or pi, or perhaps the square root of five divided by pi. And because that's a weird irrational number, it can't possibly be created by nature — so the claim goes. The claim also goes that these angles are matched to three-significant-figure accuracy — that's like saying it's exactly 1.32 — or that the ratio of the first angle divided by the next angle is one of these numbers, or its tangent, or its cosine, or its sine. And so very quickly you can start to see that this might not be the most precise of claims, but it goes further. There's a feature called the D&M pyramid, named for Vincent DiPietro and Gregory Molenaar, who were some of the original face-on-Mars researchers in the 1980s. It's been termed the mathematical Rosetta Stone of Cydonia, because all of the angles and all of the ratios and all of the trigonometry — all of these same small numbers — find themselves, interestingly, in the D&M pyramid. So it can be used to read the mathematics of the broader Cydonia region.
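The matching game described above is easy to reproduce. Here's a minimal sketch, assuming a hypothetical menu of "special" constants and a one-percent tolerance (neither taken from Hoagland's actual list), showing how readily ordinary angles "match" something once they're converted to radians:

```python
# Sketch of the numerological procedure: convert a measured angle to
# radians (degrees x pi/180) and look for "significant" irrational
# constants nearby. The constant list and the tolerance below are
# illustrative choices, not values from the actual claim.
import math

SPECIAL = {
    "sqrt(2)": math.sqrt(2),
    "sqrt(3)": math.sqrt(3),
    "sqrt(5)/pi": math.sqrt(5) / math.pi,
    "pi/sqrt(5)": math.pi / math.sqrt(5),
    "e/pi": math.e / math.pi,
    "pi/e": math.pi / math.e,
}

def matches(angle_deg, tol=0.01):
    """Return the 'special' constants within a relative tolerance of
    the angle expressed in radians."""
    angle_rad = math.radians(angle_deg)
    return [name for name, value in SPECIAL.items()
            if abs(angle_rad - value) / value < tol]

# With enough candidate constants and a loose tolerance, many perfectly
# ordinary angles "match" something.
hits = [a for a in range(1, 180) if matches(a)]
print(len(hits), "of 179 whole-degree angles match a special number")
```

With only six candidate constants, a handful of whole-degree angles already match; allowing ratios and trig functions of pairs of angles, as the claim does, multiplies the chances enormously.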
And so because they really start with the D&M pyramid, I'm going to start with an analysis of that. So first we're going to address the claim of three significant figures. What I'm showing you here is the latest high-resolution imagery of this shape. And as you can see, it's not a perfect pentagram. In fact, it's really hard to figure out where the walls are, where the edges are. When I drew this three or four times in the space of a half hour, I got completely different results. You can squint your eyes and you can guess, and you can get something that sort of maybe converges at the same apex point, but trying to claim that this is an accurate measurement to three significant figures is a little bit silly. And that's illustrated by this movie as well, where I've taken what the anomalists claim are significant angles in the middle column, or I guess the column in the middle of the slide. I then show you my angle measurement, and then I show you the percent difference. And my angle measurement is done by just shifting the vertices a little teeny tiny bit. And right now I've faded in the Viking-era imagery, which has a pixel scale of about 50 meters, so pretty much the width of this room, versus CTX at seven and a half meters, maybe half the width of this stage. What you're seeing is the percent difference, where I've color-coded red as being greater than one percent, yellow as being less than one percent, and green as being less than point one percent. So that's sort of a proxy for one significant figure, two significant figures, or three significant figures. And you might notice that it's mostly red, and that's because they don't match. But another issue is that there are a lot of angles here. They've only pointed out, in this case, nine. However, there are actually 35 angles in this pentagram.
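To get a feel for how pixel scale limits angle measurements, here is a rough sketch of my own (not the speaker's code): it jitters each vertex of a hypothetical ~1 km triangle by up to one pixel and reports the resulting scatter in a measured angle. The coordinates and pixel scales are assumptions purely for illustration.

```python
import math
import random

def angle_at(b, a, c):
    """Interior angle at vertex b (in degrees) formed by points a and c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def angle_scatter(pixel_m, trials=10_000):
    """Standard deviation of a measured angle when every vertex of a
    kilometer-scale triangle is jittered by up to +/- one pixel."""
    a, b, c = (0.0, 0.0), (1000.0, 0.0), (400.0, 800.0)  # hypothetical feature
    rng = random.Random(42)
    samples = []
    for _ in range(trials):
        jitter = lambda p: (p[0] + rng.uniform(-pixel_m, pixel_m),
                            p[1] + rng.uniform(-pixel_m, pixel_m))
        samples.append(angle_at(jitter(b), jitter(a), jitter(c)))
    mean = sum(samples) / len(samples)
    return (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5

print(angle_scatter(50.0))   # Viking-era pixels, ~50 m
print(angle_scatter(7.5))    # CTX pixels, ~7.5 m
```

On a 1 km baseline, 50 m pixels produce angle scatter on the order of degrees, far coarser than the sub-0.1-percent precision the anomaly claims require.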
That means that there are also 595 ratios between all of those angles, one angle divided by another, which also means that there are 105 trigonometric relationships if you take the sine, cosine, and tangent of each of those angles. And when you match them to a significant number, you have 94 different significant numbers you can match to. This all points to a lot of cherry picking, which I'll get to in a few slides. But for the sake of completeness, when I actually do the measurements on the latest data, and I basically check the work of what was done 30 years ago, only two of the claimed significant numbers actually match to three significant figures. That's the cosine of angle E, which is allegedly equal to the square root of 5 divided by the constant e, and angle D divided by angle F, so a ratio, which is equal to pi over the square root of 5. Okay, maybe that's significant, maybe not. One thing that we like to do in physics and related fields like mine, astronomy, is something called a Monte Carlo simulation. A Monte Carlo is where you basically want to understand the statistics of what would be going on if this were a purely random phenomenon. In this case I've simulated 15,000 pentagrams. I've measured the 35 angles, the 595 ratios, and everything else, and I've compared those with the 94 different special numbers, and when I say I, I mean a computer code, because that'd be a lot of numbers, and I've binned the results. On the top graph you're seeing, of those 735 possible important numbers, how many of them match the 94 numbers that allegedly cannot appear in nature. In the bottom graph you're seeing the same thing, but matching to 0.1 percent instead of 1 percent. And what you're seeing is a buildup of these statistics. What you need is to understand the null hypothesis, the null hypothesis being what you would expect from pure random chance. The null hypothesis is: this is a natural feature, let's get the statistics.
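The Monte Carlo described here can be sketched in a few lines. This is my own toy reconstruction, not the speaker's code: the list of "special" numbers is a small illustrative stand-in for the 94 used in the talk, and the random-angle model is deliberately crude.

```python
import itertools
import math
import random

# Illustrative stand-ins for the talk's 94 "special" numbers.
SPECIAL = [math.sqrt(2), math.sqrt(3), math.sqrt(5), math.pi,
           math.e, math.pi / math.sqrt(5), math.sqrt(5) / math.e]

def random_angles(n=9, rng=random):
    """Stand-in for measuring n angles off a random figure:
    uniform between 10 and 170 degrees."""
    return [rng.uniform(10.0, 170.0) for _ in range(n)]

def candidate_values(angles):
    """Everything the anomaly hunters allow themselves to test:
    radian values, pairwise ratios, and sin/cos/tan of each angle."""
    rads = [math.radians(a) for a in angles]
    vals = list(rads)
    vals += [a / b for a, b in itertools.permutations(rads, 2)]
    vals += [f(r) for r in rads for f in (math.sin, math.cos, math.tan)]
    return vals

def matches(angles, tol=0.01):
    """Count candidates within `tol` (relative) of any special number."""
    return sum(1 for v in candidate_values(angles)
               if any(abs(v - s) / abs(s) < tol for s in SPECIAL))

# For the full 35-angle analysis, the ratio count is C(35, 2):
print(math.comb(35, 2))  # 595

rng = random.Random(0)
trials = [matches(random_angles(rng=rng)) for _ in range(2000)]
print(sum(trials) / len(trials))  # typical number of "hits" by pure chance
```

Even this crude version shows the point: with many candidate values and many target numbers, purely random angles rack up "significant" matches all the time.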
The average on the top is that you would expect about 222 of these numbers to be significant; to three significant figures, about 22 of them. And what just came on, which you can't actually read, are the one-sigma and two-sigma lines. One sigma is where you have about 68 percent of the data. Two sigma is where you have about 95 percent of the data, which means that outside of those you have about two and a half percent in either wing. In physics, the magic number is a five-sigma detection. That means that there is less than about a one in a million chance that your result is due to pure random fluctuations, random chance. For example, the Higgs boson: when that was first announced from CERN, that hey, we found it, it was around a four-and-a-half-sigma detection. It was a few months later, when they had built up enough statistics, that it actually became the magic five-sigma detection. In this case, what's being shown by the red arrow is exactly where the D&M pyramid lies. It is right within one sigma, which means that, statistically speaking, this is a pure random natural feature that shows absolutely nothing out of the ordinary. It's just natural, but some people don't like that answer. It almost seems superfluous now to go to the Cydonia region and do this exact same thing. You might notice in this case, though, that there are a lot more yellows and a lot more greens popping up, and that's because with Cydonia you don't have set features that you're trying to match to. In a pentagram you have an edge and a line to the middle, but with Cydonia, and now we're back to the Viking imagery, you can really pick and choose whatever you want and say that that's a significant feature. If this particular pyramid that you're calling a pyramid doesn't actually fit your numbers, then you can just go to the pyramid a kilometer over, and it fits your numbers. And because we're skeptics and we like logical fallacies, this is the Texas Sharpshooter fallacy.
You get a bunch of data, in this case 735 different possible angles, ratios, and trig values. You look for any cluster and say, hey look, 29 of these match these particular significant numbers, and because they match to three significant figures I'm drawing my target around them, and hey look, there are 29 significant numbers in my target, and you can ignore the other 700 that I'm not telling you about, but there are 29 numbers in my target and therefore it's significant. And this is very closely related to cherry picking, because I've now just plucked my numbers and said, hey, there we go, those are the ones that are important. So what are the lessons learned, and the applications to broader skepticism, broader debunking exercises? Well, first off, you have to have an a priori model of what is a significant number, because you can't just look at your data, figure out what's there, and then say, well, because there's this cluster, that is what's significant. In this case it's numerology, but it applies to a lot of other things. Say I'm hunting Bigfoot. What, a priori, is going to count as a Bigfoot photograph? If you don't have that in mind, then you can take a picture of something blurry and say, yeah, that counts, just because that's what I got. You also need to know what the background-level statistics are in order to be able to accept or reject the null hypothesis. If you don't know what you would expect from pure random chance, you can't say, well, this is beyond chance, this couldn't possibly be natural. You just can't say that; making that claim is fairly ridiculous.
You also need to know the limitations of your image analysis and data. In this case, 50 meters per pixel was great for the 1970s; we did a lot of science, and still do a lot of science, with Viking data. But when you're analyzing a feature that's a kilometer or two kilometers across with 50-meter pixels, claiming that you can measure angles to three-significant-figure accuracy is just impossible. And this applies to a lot of things, like "where's the birth certificate," you know, the whole birther stuff. The birth certificate has to do with image analysis and understanding what the images are showing you. It's the same thing with the Twin Towers and various photographic and image anomalies: you have to be able to understand the limitations of your data. You also have to be wary of pareidolia. In astronomy, the whole fact that we have constellations, that is pareidolia, but in astronomy claims in general, especially from Richard Hoagland and others, we deal with a lot of pareidolia. In this case, in Cydonia, we're connecting a face with a pyramid and another pyramid and a fortress and a tank and a city square and various other things, and when your claim is based on connecting these features that supposedly are intelligently created to begin with, you've already reached your conclusion before you've even started to gather your data. So with that in mind, I have 45 seconds left, so I will say that I did a detailed analysis of this that perhaps makes slightly more cohesion sense, or cohesive sense, or whatever that phrase is supposed to be. It can be found on YouTube at this link, or various other places. I also want to thank Sharon Hill at Doubtful News for helping to promote that, and that's all. Thank you. Once again, we have time for one or two questions for Stuart. No question? We seem convinced. More so than the people I argue with online. Of course, these folks can't mouth off; they can't argue with it. Well, thank you again, very much.
All right, our next presentation is loading; it takes a little while to get it to pop up. So we're pleased to welcome Chris Guest, the president of the Australian Skeptics Victorian Branch. He's a software developer with an academic background in philosophy, and we're in for some more math. You've heard of a great book, How to Lie with Statistics; well, it gets even more fun if you use Bayesian statistics. So here to explain that to us, Mr. Guest. What's glaring down at me on the screen we're looking at is Bayes' theorem. Okay, it represents a way to update the probability of a hypothesis given extra evidence as it comes in. Now, I appreciate that maths can be a bit ominous for some people, but there's laughter in this as well. It's all going to be fun. Okay, now, this is 18th century mathematics. It's all fairly established. It's used extensively in a lot of scientific and medical research. It's used in a lot of software applications, like image processing, so for example facial recognition. I've used it for text classification, like spam filtering, and so on. It gets a bit of use in consumer prediction and risk analysis. And I don't have an axe to grind with the approach per se, it's very established, but there are some peculiar cases that are worth delving into. So I want to start first with a simple real-world example of Bayesian inference that's used in a reasonably valid way. With HIV testing, the preliminary stage is called an ELISA test. It records a false positive rate of about one in 10,000. So in a low-risk population of males getting this test, for every 10,000 men you'll have about one man who's got HIV. Now, that will probably lead to one true positive result. But then, of all the men without HIV, you'll probably get one false positive result. Now, with a high-risk population, you'll have 150 or so with HIV: 150 true positives and, again, probably one false positive.
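The arithmetic behind those population counts is just Bayes' theorem. Here is a minimal sketch, assuming the figures from the talk (false positive rate 1 in 10,000; prevalence 1 in 10,000 for low risk and 150 in 10,000 for high risk) and, to keep things simple, a sensitivity of roughly 100%:

```python
def posterior(prevalence, sensitivity, false_positive_rate):
    """P(HIV | positive test) via Bayes' theorem:
    P(H|+) = P(+|H) P(H) / [P(+|H) P(H) + P(+|not H) P(not H)]"""
    p_pos = (sensitivity * prevalence
             + false_positive_rate * (1.0 - prevalence))
    return sensitivity * prevalence / p_pos

# Numbers as described in the talk (illustrative assumptions).
low_risk = posterior(1 / 10_000, 1.0, 1 / 10_000)
high_risk = posterior(150 / 10_000, 1.0, 1 / 10_000)
print(round(low_risk, 3))   # ~0.5: a positive test is roughly a coin flip
print(round(high_risk, 4))  # ~0.99: near certainty
```

The same "one in 10,000" test gives wildly different post-test certainty depending on the base rate, which is the whole point of the example.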
Now, we've been told that this test has a false positive rate of one in 10,000, so we'd be kind of assuming that if you get this result, it's 9,999 times in 10,000 correct. But what are the chances of having HIV given that an ELISA test comes back positive? It actually depends; our level of certainty depends on the population we've measured. Now, I'll throw in a bit of maths here. Every time we see this vertical stroke, that's just shorthand for saying "given that." In this case, we're trying to figure out the probability of HIV-positive status given that we've got a positive ELISA test, and we'll plug this into the equation we saw in the first slide. It's not too scary. And using all the numbers we just saw in the previous slide, for the low-risk population we get this result: it's less than 50%. Now, for the high-risk population, if you get a positive result in the test, you're about 99% sure that you've probably got HIV. So if we're told the test is accurate 9,999 times out of 10,000, it's a little bit counterintuitive to expect a positive result to be wrong 50% of the time. Our minds just aren't naturally geared towards dealing with probabilities, and there have often been a lot of problems with this. There have been tragic instances where there hasn't been sufficient post-test counselling and people have committed suicide through misunderstanding these kinds of probabilities. It's perhaps fuelled the fire of AIDS denialism. But in this case we're talking about something where all the terms are relatively well known and can be derived from observation. Things get fairly curious when we get involved in trying to guess probabilities without empirical corroboration, turning to areas of study where this kind of thing doesn't happen all the time. Now, back to the original equation, I'll just go through a bit of terminology. The initial probability we're dealing with is known as the prior probability.
Then the other term, the one that introduces the new evidence, is known as the likelihood. We've got a normalizing constant on the bottom, which we'll get rid of later, and the final outcome is known as the posterior probability. Okay, the next bit is probably the most important bit of maths in this talk, if you're going to remember anything. I just want to talk about conditional independence. Now, if you're dealing with the probability of two events, the probability of both of those events is going to be equal to the multiplication of their probabilities, on the proviso that they're conditionally independent, that they're not in some way causally related. I'll give an example with a deck of cards. So the probability of drawing the ace of hearts from a deck of cards is equal to the probability of drawing a heart times the probability of drawing an ace. So it's one in 52, and in that case the suit of the cards and the numbering of the cards are conditionally independent, that's fine. But if we were to take the probability of drawing a number card and drawing an even number, we can't just multiply those probabilities, because one is completely dependent on the other. Now, we'll get back to this later in the talk. Having given an example of appropriate use, I want to tell you about some of the more egregious uses of Bayesian reasoning. So of course I want to talk to you about Jesus. I'll go into the apologist Bayesian Jesus, and then I'll also cover the secularist Bayesian Jesus. First off, Timothy and Lydia McGrew, who are Christian philosophers. They had a paper called The Argument from Miracles, and they try to rescue the argument from miracles from the jaws of David Hume, pretty much. Their hypothesis is that Jesus was resurrected from the dead, and the facts that they're using to support this are the testimony of the women, the testimony of the disciples who saw Jesus after he came back to life, and also the conversion of Paul on the road to Damascus.
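Going back to the card example for a moment, conditional independence is easy to check by brute force over a whole deck. A quick sketch of my own for illustration:

```python
import itertools

ranks = ['A', '2', '3', '4', '5', '6', '7', '8', '9', '10', 'J', 'Q', 'K']
suits = ['hearts', 'diamonds', 'clubs', 'spades']
deck = list(itertools.product(ranks, suits))  # all 52 cards

def prob(event):
    """Probability of an event over a uniform draw from the deck."""
    return sum(1 for card in deck if event(card)) / len(deck)

is_ace = lambda c: c[0] == 'A'
is_heart = lambda c: c[1] == 'hearts'

# Independent events: the joint probability equals the product.
p_joint = prob(lambda c: is_ace(c) and is_heart(c))
assert abs(p_joint - prob(is_ace) * prob(is_heart)) < 1e-12  # 1/52 = 1/13 * 1/4

# Dependent events: a number card that is even. Multiplying is wrong here.
is_number = lambda c: c[0] not in {'A', 'J', 'Q', 'K'}
is_even = lambda c: c[0] in {'2', '4', '6', '8', '10'}
p_joint2 = prob(lambda c: is_number(c) and is_even(c))
print(p_joint2, prob(is_number) * prob(is_even))  # 5/13 vs 45/169: not equal
```

Every even card is a number card, so the joint probability is just 5/13, not the product of the two marginals, which is exactly the trap the later arguments fall into.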
Now, we're trying to find the chance of Jesus being resurrected given a set of facts. They do this in an odds form, so they try to establish a ratio of that compared to the chance of Jesus not being resurrected given the same set of facts. So let's try to figure this out. The first term is the priors: as a starting point, what are the chances that someone could just be resurrected from the dead? And they multiply that by the likelihoods: given what the women have said, is it more likely that someone came back from the dead; and given what the disciples have testified, on pain of martyrdom and so forth; and then the conversion of Paul. So there are all the terms, and notice these facts have been expanded out with an assumption of conditional independence. Now, they're pretty charitable to the skeptical cause, in that they put the odds against someone being resurrected from the dead at something like ten to the 40 to one, and, you know, that seems about right to me. But then they look at what the women have to say, and they think they're honest people, so the chances of them lying are going to be about a hundred to one. Now, they count about 13 people as disciples, and they're all pretty honest blokes, and each one's conditionally independent of the next, so that's about a thousand to one each, and, ah, something curious happened there. And Paul too, he's a pretty honest bloke. So when we combine the evidence, it's something like ten to the 44 to one against all of these people fibbing, and so we end up with the odds against Jesus not being resurrected at 10,000 to one. I guess the question is, should we take this to our bookie?
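As I reconstruct the figures in this passage (treat the exact exponents as my reading of the talk, not as established), the odds arithmetic goes like this:

```python
# Reconstruction of the McGrew-style odds arithmetic; the exponents
# below are illustrative readings of the talk's figures.
prior_odds = 10.0 ** -40          # prior odds that a resurrection happened

# Likelihood ratios, each assuming conditional independence:
women = 10.0 ** 2                 # women's testimony: 100 to 1
disciples = (10.0 ** 3) ** 13     # 13 disciples at 1,000 to 1 each
paul = 10.0 ** 3                  # Paul's conversion

bayes_factor = women * disciples * paul   # about 10^44
posterior_odds = prior_odds * bayes_factor
print(bayes_factor, posterior_odds)       # ~1e44, and odds of ~10,000 to 1
```

The striking thing is how the 13 "independent" disciples alone contribute a factor of 10^39, which is why the conditional-independence assumption carries the entire argument.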
But I think they're on the right track. I feel the premise here, the ten to the minus 40, is that extraordinary claims require extraordinary evidence; what I guess we're all differing on is whether this is extraordinary evidence. I think the problem is that they've set their likelihood terms at impossibly high values. They claim all the facts are conditionally independent, but that kind of rules out the possibility of some kind of collusion between these witnesses, or hallucinations, those sorts of things that a lot of us might think of as the likelier explanations. And it's not a crazy paper: it's well researched, the maths looks right, there are lots of references to scholarship. But it's been reviewed by theologians; statisticians haven't taken much of a look at it. Now, I'm all for teaching the controversy, so I want to talk about Richard Carrier and his position on this. He's a prominent secular classical scholar. He's just put out a book, which I haven't had a lot of time to look at, that's 700 pages long, that sets out to demolish the existence of Jesus using a Bayesian argument. There's some quality scholarship in there, but what I'm looking at today is an earlier paper where he looks at the issues with the passage in Tacitus that ties together Christ, Pontius Pilate, and the persecution of Christians by Nero.
Now, the passage is known as the Testimonium Taciteum; I'll just say TT from here on in, because I'm running short on time. He deals with four facts that would lead us to believe that it might not be accurate, that it may have been played with by Christian scribes at a later date. For example: this passage in a Roman book has no influence on Christian stories about persecutions; there's no explicit mention of it by other Christians until the fourth century; there are no Latin or Greek mentions until the fourth century; and it lends itself to interpolation, perhaps because it fits some other kind of cult rather than the actual Christian cult. So he starts off saying it's 200 to one that someone would have fluffed around with this passage in an old Roman book, and then he puts together some other figures. I won't go into the details of where he gets them from, but it's starting to take the form of a fairly similar kind of argument: again, we're looking at all these terms being multiplied together, and we end up with the odds against this passage being authentic at a bit over three to one. Now, the problem is exactly the same as the problem with the McGrews' paper: we've got this assumption of conditional independence thrown in. The proof assumes that all these things are independent, but that's not an obvious thing, given that we're talking about historical absences of references to these passages; I can't quite fathom it. So if any of these facts are mutually dependent, then the whole mathematical argument is invalid. I just want to say, look, Bayesian analysis is a fairly powerful technique, but we have to keep an eye out for cases like this, where there can be some circular reasoning and it's just used as a gloss on top of it, or where there are undue assumptions of conditional independence. Okay, thanks. In
2003, I debated Richard Swinburne here in the United States on the question of "Does God exist?", and I attacked his Bayesian analysis, his effort to claim that Bayesian analysis shows a greater probability that God exists because of first cause and fine-tuning and what have you. The way I attacked it, and he actually didn't even defend his position, which surprised me, was I said: given that the Bayesian analysis requires prior background information, and we have no prior background information about the supernatural, that is the link in his Bayesian analysis that destroys his argument. I appreciate your comment; yeah, it all comes back to metaphysics and whether we've got an empirical basis. Certainly, I'd like to see the paper. Well, thank you for that one. So, before we start with our fifth speaker out of six, I just want to remind you, if this looks like fun, and by the way, how are we doing so far? We enjoy your talks. So if you think it would be fun to stay up all night worrying about your talk on Sunday, you can apply for a 15-minute slot. We usually send out the call for papers around January. Then find me: Google "Ray Hall Fresno State" and just contact me if you have any interest in presenting a paper next year, and let me know. We have a little bit of a vetting process. But are we ready for the next speaker? We have a little bit of a video thing going on; we'll just check real quick. Looks like we are ready. So our next speaker is Michelle Nara, and she is a secondary school science teacher who is now pursuing her doctoral degree in education. She's come to speak to us about, oh, actually, she's in the Curriculum and Instruction department at Purdue University, and she's going to tell us about, let's see, what is the title of her talk, teaching the nature of science, and she's making the case that it's a social justice argument. So, Michelle. I've taught in front of 135 hormone-riddled eighth graders, but for some reason I'm a little nervous speaking in front of
rational people, so go figure. Like I said, I was a teacher for five years. I spent most of that time teaching eighth grade physical science, although I did teach environmental chemistry and forensic science, which was a lot of fun. I left the school system and resigned in 2008. I pursued photography, and I did pretty well; I sold some prints, and I was really happy. But I am an educator, and that's my passion, so in 2011 I decided to go back and get my PhD, and like Ray said, I'm currently studying at Purdue University in the Curriculum and Instruction department, in the curriculum studies program. Most of my work focuses on multicultural education, but I can't give up my science roots. Science is embedded in everything, and I think you'd all agree. So when I took up the study of the nature of science, I did it with a curriculum studies lens, more specifically a multicultural education perspective, which includes a strong social justice component. In a paper I recently wrote, I proposed that an understanding of the nature of science may provide learners with tools to transform societal norms to reflect greater social justice. In other words, teachers should be teaching children the nature of science with the direct purpose of promoting social justice, with social justice meaning all people are treated with fairness, respect, dignity, and generosity. There are many theories regarding the nature of science, from Popper to Feyerabend to Kuhn to Chalmers, and then there are various sociological explanations of the nature of science. Due to time constraints, I will not go into these definitions or ideas, though for the purpose of this talk it's important to distinguish between science, scientific knowledge, and the nature of science. Science offers the best explanation of our natural world available at the time. Scientific knowledge changes as new discoveries are made and new technologies are developed. The nature of science is the process of developing scientific knowledge, and this
process includes critical thinking, gathering evidence, skeptically questioning, and peer review. One of the intents of teaching science is to create scientifically literate citizens, which Bill Nye spoke about last night, and one function of a scientifically literate citizen is to use the process of science in solving problems, making decisions, and furthering his own understanding of the universe: using the process of science, the nature of creating scientific knowledge, to make decisions and solve problems. In addition to producing scientifically literate citizens, there are other arguments as to why teaching about the nature of science is important. These include a utilitarian argument: people need to know how to use the science and technologies that are available to them in everyday life. That seems obvious; I guess we would be lost without our smartphones. A cultural argument: we need to understand the complexities that go into making scientific knowledge in order to appreciate science and its value within society. And a democratic argument: people are bombarded with scientific and pseudoscientific claims all the time, through social media, the news, magazines, television shows, and political propositions. They're expected to make decisions, whether daily lifestyle decisions or voting decisions, based on their understanding of the nature of science. As I engaged with the philosophical and sociological aspects of the nature of science, I found myself asking: shouldn't we use science and its methods to promote social justice? And this is where my multicultural education comes in. So I proposed a social justice argument: teachers need to equip children with an understanding of the nature of science, critical thinking, skepticism, evidence-based reasoning, in order to combat things like racism, sexism, heteronormativity, and homophobia; in other words, to promote social justice, fair treatment for all. When confronted with uncritical stereotypes, black people are criminals, girls are poor at math, gay
men are pedophiles, students need to be taught to ask: is that true? What is the evidence? By demanding answers to these questions and by evaluating the evidence, I am confident that students will come to socially just conclusions and act accordingly. So how can teachers teach towards social justice? Teachers need to be skeptical of canned curriculum. They need to make the necessary changes to create curriculum that promotes, or results in, socially just actions, by reflecting on questions such as: why did I choose this curriculum? How do its explicit and implicit messages promote social justice? How can I adapt the curriculum to make it relevant to my students? In addition, teachers can seek out curriculum that promotes social justice. The JREF offers free K through 12 curricula that address areas of scientific skepticism and critical thinking through investigations of paranormal, pseudoscientific, and fringe claims. These modules give students a chance to engage in the nature of science through examining topics such as astrology, ESP, and illusions; also by promoting critical thinking and skeptical questioning, by allowing students to share their perspectives, by making predictions based on their prior knowledge, and by demonstrating the failure of hypotheses, which isn't always done in science classes. In other words, they're given the tools to become critical, skeptical thinkers. James Randi and the JREF use skepticism, science, and education to expose unjust acts. Why shouldn't teachers use skepticism and other elements of the nature of science to encourage students to question, and ultimately change, unjust social norms? Thank you. Thank you. Any questions from the floor? So, I think science is intrinsically interesting; if you just show people the universe, they'll see it's awesome, and we don't need to tell them why science is worth studying. But I also think that when people come to science from the angle of having a particular political axe to grind, that can have a danger of biasing
the science. For example, Steven Pinker's book The Blank Slate: The Modern Denial of Human Nature has a lot of examples of that. So, what do you think of that? I think the processes of science ultimately get to the truth, and whether it's a moral truth or a scientific truth, I think the processes of science will get there. I think it's wrong to be racist; maybe I'm biased in that sense, but I think that you can use science and evidence-based knowledge to create a better world. If I could add to this question: perhaps he's asking, how do you get past an ideology that is resistant to the facts, the facts that science comes to? That's kind of the challenge of the teacher; I don't know. Yes, it's a challenge, so we can put it out there. Any other questions? I'm gonna shut up, oh, sorry, let me take it. Like many skeptics, I'm sometimes given the question, you know, science is just another religion; at some level, they'll say, we have faith that these journal articles are true. How do you answer the question of why science is not equal to religion? Because science is based on evidence and skepticism and peer review, and the whole process of science is not a religion, right, just by definition. I think we could get a panel of philosophers up here and they could take that question for an hour or two, I would think. On your point on evaluating curriculum, you first should see if there's anything there that goes against social justice. I agree with you. And also, it seems like if you're trying to get something into the curriculum, you have to be careful that you're not being ideological. I agree and I disagree. I think you're right, I think we need to present evidence and we need to let students decide for themselves, which is what the JREF curriculum does, and I think that's what a good teacher does: a good teacher presents the content, presents the curriculum, and allows the student to work through and create their knowledge based on
their perspectives and on their experiences. I don't know if that answered your question, but, well, thank you, Michelle. Our final speaker is Steve Cuno. He is the president and founder of RESPONSE Agency, an evidence-based marketing firm, and I think this is your third time on this stage, so correct me. I don't want to admit that. Okay, well, with that, our final speaker, Steve. Good morning, everybody, nice to see you all here. My name is Steve Cuno, and I'm going to talk today about what you always wondered about advertising: an insider fesses up. I'm the insider who's fessing up. Speaking of advertising, and starting with a commercial: some of you know that I'm the co-author of a book by a former polygamist, former Mormon wife. It's called "It's Not About the Sex" My Ass, and some, thank you, some of you have asked me about it. I happen to have some copies with me, so if you're interested, grab me after. How's that for advertising? Now, I'm going to dispel some myths about advertising, and some of you may think that I'm only up here defending myself, because many of you don't like advertising and you may not like me. I'm really not here to defend myself or advertising. I do want to dispel myths, because I think if you're going to dislike me, you really ought to do so from an informed perspective. I know which ads most of you are more inclined to click on. I know, when you walk into a store, which direction most of you are likely to wander, right or left. Here's a hint: if you're from the United States, you'll probably wander to your right; if you're from certain other countries, you'll probably wander to your left. I know what kind of an envelope to send you in the mail that you're more likely to open, and if I put a letter inside that envelope, I know where you are likely to start reading. I also know which time of day, which day of the week, and which month of the year you are more likely to respond to certain kinds of advertisements. I even know if you're interested in
learning to do some auto mechanics. I even know that about 20 percent of you would rather learn to fix a car yourself than pay someone to repair it. Now, since I know all this stuff, and before you sic the hounds on me, let me tell you how I know it: by putting stuff out there and watching what you do. If you respond positively to the ad, I'll do more of it, and if you don't respond positively to the ad, I will try something else. Now, that has to be really disappointing, because the mythology about advertising is so much more juicy. So let's dispel some of the myths. I get this all the time: that we know how to control minds, that we hypnotize you somehow, or, short of that, that we find a way to make you do something you wouldn't normally do. We evil advertising geniuses know how to do that. Okay, no, we don't know how to do that. No advertiser on earth knows how to make you act against your will, how to make you make a buying decision without knowing why. We don't have that power. If we did, then the best advertising minds in history would not have come up with advertisements for the following failed products, and I swear I'm not making these up. By show of hands, how many of you out here today are wearing BIC disposable underwear? For that matter, how many of you want to smell like a hog? I present Harley-Davidson eau de toilette. Now mind you, the best advertising minds in the world were out pushing these products. They failed. You know, if you're hungry, there's nothing quite like a frozen dinner from the makers of Colgate toothpaste. And speaking of appetizing, wouldn't you love some yogurt from the people who tell you every month fifty ways to please your man? I give you Cosmopolitan brand yogurt. And speaking of appetizing again, how about, for the kids, purple ketchup from Heinz? Absolutely bombed. Here are a couple of more recent product failures, backed by incredibly huge budgets, that some of you may have heard of. How many of you remember New Coke? How many of you tried it? How many of you liked it better?
That's why it's gone, and it didn't matter what the ad said, did it? So if we advertisers are good at turning you into robots and making you buy against your will, we are doing a very good job of hiding it. But where on earth do the myths come from? Well, we can go back to 1957, when a man named James Vicary held a press conference and told the press that he had tried an experiment in a movie theater. He arranged with a theater owner, during the movie Picnic, which was a hit at the time, to flash a hidden message on the screen at one three-thousandth of a second, once every five seconds. It either said "Thirsty? Drink Coke" or "Hungry? Have some popcorn." At the end of about a four-to-six-week experiment, he discovered that Coca-Cola sales went up 18.1 percent and popcorn sales went up 57.7 percent. Now, you can imagine, when that hit the media, it caused outrage. So much outrage that, five years later, James Vicary admitted to having made the whole thing up. You know what happens with retractions: they're not as much fun as the sensationalized account. So we have that now. In fact, the idea of subliminal advertising controlling us has persisted to this day. You will still find people writing about it in all seriousness and citing this case study as if it were valid. In fact, 20 years after Vicary did his stuff, Wilson Bryan Key wrote Subliminal Seduction, where essentially he saw pareidolia everywhere, but he didn't call it pareidolia. He called it advertising people manipulating you with sexual images. In the book on the left, there's a photo of some booze with ice cubes, and somehow Mr. Key saw naked women in the ice cubes, and that was making people buy the booze. But my favorite Key book to hate is The Clam-Plate Orgy, where he found that Howard Johnson's had photos of clams on their menus, and in the photos of clams were hidden messages about sex, and also naked women. The poor guy saw sex everywhere he looked. I feel sorry for his wife if he came home and stared at the wallpaper too long. But
since he was so obsessed with menus, I have to digress for just a minute and share this menu with you. This is a photocopy of a menu that was printed in Salt Lake City for a restaurant; I've obscured their name. It had been in use among customers before somebody noticed the typo. It's about six down, under entrées. I don't know if you can see it here, so let me magnify that for you: "roast dick with a red currant sauce." I wonder what Wilson Bryan Key would have made of that. Think of all the helpless people ordering that without knowing why. Another source of the myths, of course, is yellow journalism. Sensationalism sells. Nobody wants to read an article that says, "Well, advertising didn't unduly influence anyone today." And you get the media reports about laboratory tests. These are fun. The tests do exist. The testers will bring subjects into a laboratory, they'll hook up an fMRI or they'll check for galvanic response, they'll show these people ads, and they'll say, "Oh look, this brain area lit up," or "There was a galvanic response. That means they're going to go buy." A couple of years ago there was a study out that really bothered people, because it showed that an advertisement could implant a false memory. People saw the commercial and thought they had consumed popcorn, which in fact they had not consumed, and everyone concluded, "This will make you buy." Wait a minute. This is a laboratory. Translating that into action in the marketplace is a whole other thing, and my own personal experience measuring and tracking advertising, which incidentally most advertisers do not do, is that there is no connection between the laboratory response and the real market response. So when you put all this stuff in the lab, it's interesting, and it may really tell you something about how our minds work, but in terms of data, that's all you get. Another source of the mythology is you and me. People love to spread rumors, and when it comes to advertising, it's kind of a love-hate thing we have going
on. If somebody says, "I saw this commercial, it was great," everybody piggybacks and talks about the commercial. Love. If somebody says, "Those damn advertisers are controlling us," everybody piggybacks on that, and away it goes. We also like someone to blame, don't we? I mean, face it: I would much rather blame McDonald's advertising for my gut than admit that I just can't resist the Sausage McMuffin. Oh my goodness. The industry itself is a source of the rumors. In his book Ogilvy on Advertising, the late icon of advertising David Ogilvy told a story of hiring a hypnotist to create a commercial, and he said the result was so compelling it terrified him, so he took a match to it and barely avoided involving his client in a national scandal. Uh-huh. Okay. No copy survives for us to look at, no testing was done, and, you know, commercials don't make themselves. We have never heard from the director of photography, the director, the lighting director, the editor, the actor, or even the hypnotist. This smacks of a story that Ogilvy invented to make clients want to hire his agency. And by the way, the mythology that we advertising people somehow know how to control you is something that serves our industry when we're trying to convince clients to give us money to make their ads. Another thing we do is gather ordinary, everyday people into a room, show them the ads, and ask them, "What do you think about the ads? Do you think you would buy it if you saw the ad?" Now, that's just plain silly. People cannot predict their own behavior, and I'm going to prove that to you right now. Right now, if you knew no one was watching, by show of hands, how many of you would not wash the next time you use a public restroom? You know what? Some 80 percent of you are either mistaken or lying. Consider that before you shake hands with anyone today. Another myth that the industry perpetuates, and this serves the industry, is that all advertising works. You know, most advertising does not work, and some of it
drives sales down. It's very unusual when an ad actually drives sales up. Those are the ones you hear about, but you don't hear about all the failures. The advertising industry is very good at saying the ads worked no matter what. Here's how they do it. If sales went up, they go to their clients and say, "Wow, look, the advertising campaign worked." If sales went down, they go to the client and say, "Good thing you ran the ad campaign, or they would have gone down more." It's a wonderful business when you can't lose. Another myth is that if you remember the ad, it was effective. How many remember the Taco Bell chihuahua? Yeah, I loved that campaign, didn't you? I attended a presentation by the head of marketing for Taco Bell, and he showed a slide of how the popularity of the campaign, and people remembering the ad, went through the roof. He also showed us what happened to sales at the same time: they went through the floor. It seemed that people loved the ad, and they thought chihuahuas were great, but they weren't buying tacos. So they retired the cute chihuahua that everybody loved, went back to very mundane close-ups of chopping lettuce and tomatoes and sizzling beef laced with oatmeal, and sales went back up again. We get accused of creating greed. Nonsense. Greed is a human trait. I do admit that we capitalize on it, but we didn't give it to you. We get accused of causing goods to cost more. No. No board of directors ever sat down and said, "You know, we should start advertising. Well, how will we pay for it? Let's increase prices." They divert funds from somewhere else, and when you hear of astronomical advertising budgets, look at the numbers. Take the billions McDonald's spends: if you took that and divided it up over all the units sold, you wouldn't see the price move by even a penny. Advertising often, however, increases volume of purchase, which creates economies of scale, which allows them to sell the product for less, which is why a lot of electronic things
nowadays cost less than they did once upon a time. We get accused of creating social ills. That's sort of true and sort of not. We actually kind of reflect what society is doing. If there's something happening in advertising you don't like, chances are it's sourced in something that's happening in society. This doesn't make me very popular when I say it, but advertising usually trails behind what culture is doing. Here's a hot-button example: we get accused of pushing pink on girls and blue on boys. Incidentally, these are my grandchildren. Clap if you think they're cute. Thank you; we weren't leaving this slide until you did that. This is kind of true and kind of not true. Advertising does push products on people that they didn't know they wanted. The classic example would be Post-it Notes: no one knew they needed Post-it Notes until advertising pushed them. They gave out free samples, and suddenly no one could live without them. Pull strategies are easier and more common: it's where we figure out what the market wants and we let them pull it out of us. Much, much easier to do. One of my clients had a catalog where he was marketing products for horses: saddles, bridles, riding crops. A vendor came to him and said, "Put my pink riding crop in your catalog." My client said, "No way." Now, most of his customers were middle-aged women, but for some reason he said, "No one's going to buy this darn thing." So he gave it a tiny space in the back of the catalog. Sales were overwhelming, and the way the catalog business works is that if people are pulling your product, you keep increasing the size of the ad in every edition of the catalog. So eventually this thing was consuming half a page toward the front of his catalog, and it was his best-selling item for a couple of years. Now, was he pushing pink on women? No, women were pulling it from him. Most strategies have a bit of push and pull. Marketers do push pink on women and girls, and meanwhile girls are pulling pink rather than some other color. But if you're a marketer, your job is to
supply what people want, and lots of people want pink. That's my granddaughter again; clap. Excuse me, Princess. I apologize to anyone that offends. Thirteen seconds? I'll do it quick. Advertising can't control minds, but it lies all the time, and that's what we should be concerned about. The first level of lying that advertising gets away with is called puffery, which is essentially exaggerating a claim, and the idea is: I know I'm exaggerating, you know I'm exaggerating, and it's all in good fun. Here's an example: a fishing ad that claims more hits than Google. I don't believe that's a literally true claim. It's kind of cute, though, and I hope you agree with me that it's harmless. Not all puffery is harmless. This one is for Joint Juice. Here's a golfer, and his lower back, or maybe it's his butt crack, is yelling "Yippee." Okay, there's puffery: butt cracks don't yell. Well, actually... But the explicit claim in the headline really bothers me: Joint Juice makes backs happy to play 18 more holes. Drink this stuff, your back will quit being sore, you can play 18 more holes. I don't see this as puffery. I see this as quackery, and a little bit dangerous. Next level of lying in advertising. Now, I have to back up: most advertising is honest. It's above board. I mean, it's my profession; these are my peers. Most advertising people are quite honest, but there are some scoundrels out there, and that's what we're talking about right now: the bad apples. Weaseling is pretty common. Weaseling has to do with telling the truth, saying something so that it's technically true but meant to mislead. That may sound familiar to some of you. I've always said that if you have to say "technically," you're lying. Michelin ad: stop up to 19 feet shorter with a Michelin tire. And it shows these cute animals that are not being run over by a car. The problem is, "up to 19 feet" includes zero, and shorter than what? Well, if you read the fine print, they're comparing themselves against one tire in
various specific conditions. I call this weaseling. Okay, it's true, but they're weaseling. Also, if you happen to know, as I do, that when you're browsing through a magazine, 80 percent of browsers will read the headline and not the rest of the ad, then you know the headline is going to convey a false impression. I don't think that's honest. I don't do that kind of thing when I do my work. Finally, there's out-and-out lying, and you'd be amazed at what advertisers can get away with, for three reasons. One is that regulatory bodies are busy, and they're not running around chasing false advertising claims. When you see something and say, "That's false advertising," chances are they're in the clear, because no one's going to chase them. Most false advertising claims are civil suits, filed not by government and not by individuals but by competitors. The most famous one I can remember is when Coca-Cola sued Pepsi years ago. Their claim was that if you took the Pepsi logo and turned it upside down and squinted at it, it looked like the Coke logo, and there was a big suit about that. Another reason they can get away with lying is that some of them have false identities, or they're out of the country and you can't find them. But the biggest and most loathsome way that some bad advertisers get away with lying is... and it just slipped my mind. Isn't that great? It'll come to me. It's that they can make money anyhow. You pay a fine, but you've made so much more money than the fine that it's worth it to keep going. So you pay another fine, and you pay another fine, but you're more profitable than the fines, so you keep going. Would you like a perfect example of that? Look to Kevin Trudeau, who is still making money from jail on his fraudulent products. What can we do about it? Don't expect the market... thank you... don't expect the marketer to notice; we're a small pond. But there is this personal integrity thing, and I have to tell you, it's easier said than done. For instance, I happen to think Whole Foods lies, or at least weasels, in its
advertising. But you know what? I love their deli. So if you see me sometime in the deli, you can call me a hypocrite, and you'll be right. The other problem is that with a lot of companies, if you dig deep enough, you'll find a wart. So we all have to draw our own boundaries, live our own standards, and not judge others who draw different boundaries. But we can refuse to buy the darn product. For instance, I won't go to Hobby Lobby, but lest you be too impressed, they don't sell anything I want; I might as well stand up here and pledge not to chew aluminum foil. We can write. We can blog. We can write letters to the editor. I happen to be published in a number of advertising journals, and so I've written articles. I don't know if you can see this; the headlines are like "Morality and Marketing," "Caveat Emptor," and "Fertilizer," because they wouldn't put "bullshit" in the headline. And I have to give these publications credit: number one, they're paying for these articles, and number two, they run them knowing they're going to offend a lot of the readers. I've got one called "How Ethical Is Your Direct Marketing?" that ran in a magazine with a circulation of 300,000 among marketers. So there's something you can do. You can raise your voice. You may not get any advertisers to change what they do, but you might warn off potential marks, and that is worth something. So we've looked at some advertising myths, and we've talked about what advertising can do, and most of it, I think, is helpful. It promotes a good economy, which is good for us. Sometimes it brings needed products and services to our attention, and that's a good thing. For instance, the advertising industry pretty much created the demand for deodorant in the '60s. Without making it obvious, sneak a look at the person sitting next to you, and now, if you're grateful that advertising pushed deodorant on us, raise your hand. Now, I do want to end on a positive note, so I want to show you a big, obnoxious, in your
face, can't-get-away-from-it ad that I kind of like. And, well, it's for a deodorant. There you have it. Thank you very much. Any questions? That's how you find me; I'd love to hear from you. What would you say about the relationship between advertising's models and young women having a rise, or maybe it's a myth, in anorexia? That's a good question. The question was, how does anorexia tie to advertisers using skinny models? Well, I think we're back to the push-pull thing. Definitely advertising is helping promote the problem. Where did it start? In movies, popular culture, I don't know, but advertising follows and picks up on that. It uses skinny models; women want to be skinny. It's a tragedy. It's awful. Just like the blue and pink thing: it's really unfair to girls who want to wear something besides pink and boys who do want to wear pink. But there is no conspiracy, back to pink for a minute, to push pink. If girls' color of choice tomorrow turned out to be green, you can bet the pink would be on the clearance racks and green would be stocked. And if suddenly Rubenesque women became the trend, advertising would show that. They'd take a little while to catch up, but they would do it. Do you think that advertising as an industry is overly enamored with things that we might learn in neurobiology, thinking that they can actually take advantage of the way our brains work? Absolutely. That's why you see all this laboratory testing going on, and what it really does, so far, is not establish how to make a more effective advertisement. It has shown how to sell your advertising agency's services to a client. If advertising can't cause large numbers of people to buy, or to accept a bad or inferior product, why is politics completely dominated by raising money for advertising? Wonderful question. My personal view, were it possible, and it really isn't, would be to ban advertising and keep it out of politics altogether. You know, if I sell you a bar of soap that you don't like, you're out a couple of bucks.
If I sell you a president you don't like, you could be out wars, ruined economies, all kinds of things. I've pondered this; there's no practical way to get advertising and promotion out of politics, and that's a shame. But to your question, then: all the fundraising that is directed to political advertising, how effective is it? Good question. There seems to be an indicator that the candidate more likely to win can raise more money, so again we've got this chicken-and-egg thing going on. Much political advertising, though, is hopelessly ineffective. It's just name recognition, which, for instance, Hillary Clinton already has, and name recognition is kind of another myth about advertising effectiveness. How many of you recognize the name Ford Edsel? Yeah. Name recognition didn't sell any of them. Good question, thank you. I haven't watched a television program in years except what I recorded, because I don't think it's possible to watch 60 minutes without zooming through the commercials. So how do you fix it? Well, that's a good problem that advertisers are facing. This market is getting more and more segmented. When I went to college, we had radio, TV, newspapers, magazines, billboards, direct mail; that was it. Now we've got this thing called the internet, which is all of the above, and like you, I don't watch broadcast TV. I stream. I'm very hard to hit with an advertisement. Now, you can get me through Facebook, but those ads are largely ineffective. You can get me online, and actually it's really fun, because online ads are extremely accountable: either you click or you don't, people know if you're clicking, and so it kind of holds an advertiser's feet to the fire. What I really love is when I'm looking at a page I want to see and up comes a commercial. Fine, someone has to pay for this page I want to see. And then suddenly it says, "You can click out of this ad now." Love that, because it puts the advertiser's feet to the fire: they have to make a commercial that you want to see. Good. Okay, one more question. I'm sorry,
we're running out of time. I wanted to ask a slightly more peripheral question: what are the ethics in advertising? Specifically, very recently we had the Facebook thing, where they did a study in which they tried to change people's moods, and I was wondering if you could comment a little bit on that in the short time we have left. I appreciate that; we chatted privately about this the other day. First of all, the Facebook thing wasn't an advertising experiment, but I don't care; it's kind of the same principle. When I talk about putting things out there and seeing what people do, here's how I do it. I'll tell you how I tested an ad for "It's Not About the Sex" My Ass on Facebook. We had two ads, headline A and headline B. I ran them both and I counted the clicks, and I found that people liked headline B better, so I got rid of headline A. Call that manipulative if you want; I just call it being smart. But now, the Facebook experiment, where they played with your mood over time, is a whole other matter. It should have called for informed consent. Apparently no calamities came out of it, but they could have. In the old days, they used to stage psychological experiments on the street. They'd create a trauma and see what onlookers would say or do, and finally they realized, you know, it's not really ethical to pretend to have somebody killed by a car in front of innocent people. That is effectively what Facebook did with their experiment, and I don't approve of it. Thank you very much. Thank you, Steve, ladies and gentlemen. That concludes the 12th Sunday Papers at TAM. Thank you very much for taking part.
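[Editor's note] The headline test Steve describes is a plain A/B split: show both ads, count the clicks, keep the winner. He simply eyeballed the counts; a minimal sketch of how one might also check that the winner's edge is statistically real, using made-up impression and click counts (the numbers and the z-test are illustrative, not from the talk):

```python
from math import sqrt, erf

def z_test_two_proportions(clicks_a, shows_a, clicks_b, shows_b):
    """Two-sided two-proportion z-test on click-through rates.

    Returns (lift of B over A, z statistic, two-sided p-value).
    """
    p_a = clicks_a / shows_a          # CTR of headline A
    p_b = clicks_b / shows_b          # CTR of headline B
    # Pooled click rate under the null hypothesis that A and B are equal
    p = (clicks_a + clicks_b) / (shows_a + shows_b)
    se = sqrt(p * (1 - p) * (1 / shows_a + 1 / shows_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, z, p_value

# Hypothetical counts: each headline shown 10,000 times
lift, z, p = z_test_two_proportions(120, 10_000, 180, 10_000)
print(f"lift = {lift:.4f}, z = {z:.2f}, p = {p:.4f}")
```

With counts like these, headline B's higher click rate is very unlikely to be chance, which is the "counting clicks" logic made explicit: online ads are accountable precisely because every impression and click is recorded.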