Oh, not bad. Maybe a chance to get set up if you get on there. What time do I start? You've got about 5-10 minutes to set up. We covered a lot the last two days. What do you think? We're jonesing for a drink. It's just like you're crying to us, Amanda. It's five o'clock. It's a good time. Do you want me to introduce myself? I got a joke planned. I had jokes planned, too. You just do your thing and I'll do mine. Ladies and gentlemen, our next presenter graduated last year with his M.A. here from the University of Georgia. He's currently doing his Ph.D. work at the University of Arizona. We've all been really excited to have him back this week, and it's been really great to see him. And I know that at least Rachel Kim is very excited that it's a very temporary visit. It's nice to finally... For those of you who don't know, I was the guy emailing you. Now you know me. I'm told I'm a syntactician. I don't have handouts, but I'm told that I'm one. This talk isn't so much about syntax as about how much I don't like it. So we'll just start. Actually, Vera and Tim aren't here. Quick poll: who here had a first experience with syntax, in a syntax class, that was an enjoyable experience which enlightened their view of language? And you feel like you actually learned... You've got some people in here. That's more than I expected. How many people can look at a syntax article and actually understand what's going on without reading it like five times? Okay, well, several liars. So anyway, you get into syntax class and you see all these magical trees, and you have all these magical concepts: functional categories, covert movements, specifiers. I don't even know what a specifier is. No one does. You have all these traces, feature checking, Death Eaters. And you start wondering what is science and what is... When you step into a syntax class, it's like stepping into some sort of seance. That's the feeling I get, at least.
One of the problems, I think, is that syntax has been burdened with some very unfortunate books, like Syntactic Structures, Aspects of the Theory of Syntax, Cartesian Linguistics, and a couple of others. Now, I have no idea what the common denominator behind all of these books is, but I think they all share the same sort of muddle-headedness, and I'd like to elucidate what I think is the problem with all of them. So that's what I'm going to do here. So, the flaw. What's the flaw? There's this diagram that I show in every presentation, because it's a really annoying one: it's called the inverted Y model of grammar. And basically the idea is that syntax is the core of language. Syntax produces a string, and that string becomes sound or meaning: on one branch it goes to the motor system and becomes phonetics, what you enunciate; on the other, it goes into part of the conceptual system and becomes meaning. Now, the problem with this... I'm not even going to talk about sound and meaning here. The problem is that there's nothing up here. That is, there's nothing that feeds into syntax, and that's a huge problem that you might not necessarily realize. Classic generative grammar models language as a random generator. That's to say, it's modeled as a machine that produces random sentences from a grab bag of words, and the objective is to make a system that can produce all grammatical utterances of a language and rule out all ungrammatical utterances. So first off, this is obviously not how the mind works. What happens when you produce language is that you have an idea, and your language faculty has some way of encoding this in syntax and eventually in enunciated words. But if you model it like this, you end up with a bunch of problems that you don't actually need to solve.
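To make the criticism concrete, here is a caricature of that "random generator plus filter" picture. This is purely my illustration, not from the talk's slides or any real framework: nothing feeds in upstream; strings are produced from a grab bag of words, and a filter rules the "ungrammatical" ones out after the fact.

```python
import itertools

# Toy lexicon; "the boy elapsed" is one of the talk's anomalous examples.
LEXICON = ["the", "boy", "elapsed"]

def grammatical(sentence):
    # Stand-in filter; a real grammar would be far richer. Purely illustrative.
    return sentence[0] == "the"

# Generate-and-filter: enumerate every string, keep those the grammar licenses.
strings = [list(p) for p in itertools.product(LEXICON, repeat=3)]
licensed = [s for s in strings if grammatical(s)]
print(len(strings), len(licensed))  # 27 candidate strings, 9 licensed
```

Note that nothing here models an intention to say anything; the "speaker" is just an enumerator plus a filter, which is exactly the architecture being criticized.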
Just to run through a couple of them: there are these things called selectional constraints, for sentences like "the harvest was clever to agree" or "the boy elapsed". Traditionally these were huge problems in syntax, because no one really knew how you're supposed to solve for them in generative grammar. If syntax is the very core, how do you rule out anomalous sentences? And the idea was, well, maybe there's this part of the syntax called selectional constraints that weeds them out later in the derivation, or something like that. The actual answer is that no one says these sentences because they don't make any sense. Why would you try to produce a sentence like this? It's not that there's a derivation that's beginning and crashing; it's that it can't even start, because no one wants to say this. Also, if you look at different categories: another Smith, not me, one of my colleagues at Arizona, has done this work on C-selection, saying that basically C-selection, which is thought to be this arbitrary aspect of syntax, can be boiled down almost entirely to semantics and other things like that. So you don't need all of this machinery, and syntax ends up just recapitulating parts of meaning. What ends up happening is that syntax does stuff that other parts of the mind are supposed to do, like, you know, make sense of the world. To isolate the error, there's a very good article, a very clear article, called... well, it's wrong, but it's still clear: Hauser, Chomsky, and Fitch 2002. This is the language evolution article. And it does well to distinguish what's called the broad language faculty from the narrow language faculty. The broad language faculty is everything that comes into play in our production of language, like our vocal apparatus, our understanding of events, you know, everything we need for language.
The narrow language faculty is that extra element that makes humans capable of producing sentences in the ways animals can't. So biolinguistics and syntax are sort of in the search for whatever that narrow language faculty is, what that extra element is. It's just an issue of: you have a chimpanzee; what do you add to it to create a human, besides the bowtie? What extra mental element do you need to create human-like cognition? And Hauser, Chomsky, and Fitch identified this as what's called recursion, or merge, or generativity, a bunch of different words. Their idea is that this element called merge evolved in the human psyche or something like that. And what merge does is combine different mental elements. So you have conceptions of apples and Sally and eating. Monkeys presumably have concepts of those things too. You do as well, but the unique thing about humans, to Chomsky, is that there's this unique way of binding them together to produce new sentences, new expressions. And it's not just as of 2002; if you look back, this is from Cartesian Linguistics, Chomsky says sort of the same thing: linguistic and mental processes are virtually identical, language providing the primary means for the free expression of thought and feeling, as well as for the functioning of the creative imagination. So the idea is basically that everything in our conscious mind comes about from this operation merge, and merge is tied into language. So anyway, why is this obviously wrong? That's the main question. Now, animals can't tell us, but pretty much all the evidence shows that animals do have merge-like thought. C.R. Gallistel has interesting work on this, showing that animals really do have a conception of argument structure, of direct and indirect objects, patients, stuff like that; of deixis; of basic numeracy. They're at least equivalent to people who have languages with primitive, quote unquote, numerals.
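As a minimal sketch of what merge is supposed to be (my own construal, not anyone's official formalism): it combines two mental elements into a new unordered object, and it can apply to its own output. Notice that nothing in the operation itself constrains what may combine with what.

```python
# Hypothetical sketch of Merge as bare, recursive binary combination.
# The point: the operation itself says nothing about what can combine with what.

def merge(a, b):
    """Combine two mental/linguistic objects into one unordered pair."""
    return frozenset([a, b])

# Recursive application builds hierarchical structure out of atomic concepts:
vp = merge("eats", "apples")   # {eats, apples}
sentence = merge("Sally", vp)  # {Sally, {eats, apples}}
```

Two lines of code suffice, which is part of the appeal of the proposal: the claim is that this one tiny operation is the whole narrow language faculty.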
And in fact, when you think about it, what's the use of having concepts of things if you can't merge them together? If I have a concept of Lisa, what use is that concept if I can't think of Lisa eating something or doing something? To make use of concepts, you have to be able to combine them. Oh yeah, I like this example. Generativity is in all human behaviors. If you ever work at Waffle House, you don't actually have to be taught the 22 million ways to make a burger at Waffle House. What you're taught is a generative operation: you perform a set of tasks in a recursive way to produce a hamburger to its specifications. And animals can do this too. Maybe they can't make hamburgers, but they can do similar generative things. They can understand argument structure, et cetera, et cetera. And one of the things I like talking about is how syntactic structure mirrors semantic structure. This is something really weird if you assume merge. Merge combines linguistic elements, but it doesn't specify how they have to be combined. Yet there are actually a lot of rules as to what can be combined in language. So the weird thing is that the structures of syntactic and semantic derivations are eerily similar. As Barbiers puts it, there is one property that makes generative grammar particularly uneconomical, namely the fact that X-bar structure itself does not contribute to the semantic interpretation. So the idea is: well, we have all this structure, and it looks like a semantic derivation. Why don't we just put semantics in it? Because when you take a semantics class and you do these relative clauses and dependencies, they look exactly like syntactic dependencies. Why don't we think of these as the same thing? Well, they are the same thing. So let's talk about cartography, which is related.
So, of course, Cinque and other linguists have gone into the semantic structure behind languages, and pretty much all of them show the same things. There are very detailed semantic categories that appear in particular orders in all human languages. What was traditionally called CP and TP are really very exploded categories that have very specific meanings in them. Noun phrases and adjective orders have very specific orderings; prepositional structures, same thing. And if you assume merge just puts things together in whatever order, this is a really weird coincidence. So, here, these are just cartographies. Adverbs are in this particular hierarchical order: manner adverbs are going to be lower in the structure than perspective adverbs, stuff like this. And this holds; Cinque's original stuff is on something like 75 different languages, and they all show the same order. And you have the same for argument structure, et cetera, et cetera. The problem is that merge, or recursion, as the narrow language faculty is too powerful and too complex. It implies that animals basically have to be cretins, that they can't really have any kind of thought behind them. And as Cinque notes, the language faculty (I hate the term universal grammar, because no one understands what it means, so we'll just say the language faculty) has a relatively rich content, far richer in fact than most people are used to and perhaps willing to assume. And he's referring to this complexity here. There's an enormous amount of complexity that seems to have come from nowhere if you assume that merge generates it all. So merge can't just combine; it has to know all of these orders. So it's not just one operation; it's actually a very complex operation. My answer, and I think the answer of a lot of people, but to put it in words, is this: you don't need an operation called merge.
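The ordering claim can be pictured with a toy fragment of an adverb hierarchy. The four labels below are illustrative stand-ins of my own, not Cinque's actual hierarchy, which is far more detailed:

```python
# Illustrative fragment of a Cinque-style adverb hierarchy, highest to lowest.
# These four slots are stand-ins; the real proposal has dozens of categories.
ADVERB_HIERARCHY = ["frankly", "probably", "already", "well"]

def obeys_hierarchy(adverbs):
    """True if the adverbs surface in the order the fixed hierarchy predicts."""
    positions = [ADVERB_HIERARCHY.index(a) for a in adverbs]
    return positions == sorted(positions)

print(obeys_hierarchy(["frankly", "probably", "well"]))  # True
print(obeys_hierarchy(["well", "frankly"]))              # False: manner adverb too high
```

The puzzle the talk is pressing: if bare merge can combine elements in any order, why does every language's output pass a check like this against the same fixed hierarchy?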
You need an operation called externalize. That's to say, we should assume that these semantic structures we use to analyze the world exist not only in humans but in non-human animals that are cognitively similar to us. They have the recursive ability to understand events, relative clauses. They don't have them explicitly, but they have mental analogs. And it's an epiphenomenon of how their brains are arranged. This is just speculation, but it's speculation we're working on. What's unique about human cognition is that we can externalize the structure; our brains have been arranged in this way. What the language faculty is, is a mechanism which links these lower mental processes into the conscious mind so we can externalize them. So, time for the green smiley faces, which half of you have seen already. My idea is basically that there are cartographies in your brain, and when you want to enunciate a DP or something like that, you really just have this externalize operation go up it and say, okay, what do I want to say in all of this? So it goes up, and the adjectives are arranged in a particular order depending on what they are, which ones you need, whatever. And again, the same thing is true in morphology. This is a Korean example: if you have agglutinative morphology, you see the exact same thing, the same order. It's just an issue of externalization. So, in a nutshell, all animals have evolved a series of mental heuristics for making cognitive distinctions, and recursion. I'm just going to say that animals are not stupid. Animals have basically similar cognitive apparatus. It's fourth declension, isn't it? Apparatūs, that would be the proper plural. Anyway. So animals pretty much have the same cognitive repertoire as us, I think. The big difference is that this repertoire is mostly non-conscious.
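The green-smiley-face idea can be sketched under my own toy assumptions: the cartography is a fixed structure, and "externalize" just walks it top-down, pronouncing whichever slots the speaker has filled. The surface order then falls out of the walk, not out of how the speaker supplied the pieces. The slot names and their order here are invented for illustration:

```python
# Toy cartography for a DP: fixed adjective slots, highest to lowest.
# Slot names and ordering are illustrative assumptions, not an attested hierarchy.
DP_CARTOGRAPHY = ["size", "shape", "color", "origin"]

def externalize(noun, features):
    """Walk the fixed structure top-down and pronounce only the filled slots."""
    adjectives = [features[slot] for slot in DP_CARTOGRAPHY if slot in features]
    return " ".join(adjectives + [noun])

# Features are supplied in arbitrary (dict) order; the output order is fixed:
print(externalize("ball", {"color": "red", "size": "big"}))  # "big red ball"
```

On this picture, nothing like merge has to "know" the orders: the knowledge lives in the structure being walked, and the operation itself stays trivially simple, which is what you want from something that evolved recently.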
When they analyze events and things, their mind of course addresses them in this way, and they make decisions at a non-conscious level and react to them. But our language faculty is an externalization scheme. I guess it's sort of like what you would call synesthesia. Synesthesia is where you hear colors or smell tastes or something like that. Here, it's a merger of this cognitive processing system with the motor and naming systems. You have this merger of the two systems, and what you get is that this lower portion of cognition sort of bubbles up into the conscious mind, and we can actually externalize it. So this allows otherwise non-conscious thoughts to be externalized into the world and, importantly, within our own minds, so we can actually have metacognition and things like this. As Chomsky notes, this is one of the huge advantages of language: we can actually think in language in our heads, contemplate events, and things like that. That's a huge benefit of it. So the end result is that I feel like this reconception has pretty much all the desiderata of a theory of grammar. Again, the narrow language faculty is mechanistically small, meaning it's something that could actually have evolved over the past 5 million years or so, which is what we need. It's not like merge, which has to be incredibly complex, has to basically have all the cartographies embedded in it implicitly, and also implies that animals are sort of stupid. And syntax, on my idea, becomes the study of cartographic structure: what order do you have these kinds of adverbs, prepositions, et cetera, et cetera. And importantly, in some way this is a window into human cognition, because it's showing us that our mind has a set way of analyzing events, and that can conceivably tell us how mental algorithms have evolved to address the external world.
So if you go back to the cartographies, which I'll do later, there might be a particular reason that they are in a certain order: because they have evolved in a particular way, or because of natural law constraints, or something like that. And one of my favorites is that because syntax is tied to a set structure, it suddenly becomes falsifiable. That means you can actually make scientific statements about syntax. You can't just say, oh well, this doesn't work, let's just... oh, this is an AP, this is an XP, this is an FP, all these structures that just mean nothing. So that's a nice benefit; actual falsifiability is new. So... how much time do I have? Okay, well, fine. So that's about it. I was going to go into more, but it's good to quote your colleagues when they cite you, so I'm going to do that. Merchant yesterday said, hey Luke, we're going to be moving to my house; stop working on your presentation and come join us. So I'm going to leave the rest of the time for questions, because that's all I got, and I hope that made at least a modicum of sense. So that's it. Any difference that there is between humans and animals has to be a very simple one, and that was merge and things making it through the interfaces. So I guess, are you claiming that the same cartographic hierarchy is also present in all animals, and if so, is it the same one? Basically, yes. If it's not clear, what I was trying to say is that over the course of bajillions of years of evolution, we have developed algorithms for addressing these kinds of semantic things, and the order we find them in is just sort of an emergent property of the fact that they're being externalized based on their ordering, if that makes sense. Is there any sort of evidence that the same cartography exists in animals as well? That would be very hard to find evidence for, in terms of the actual ordering, I think. Again, Gallistel's work; he has a lot of interesting stuff.
I mean, I don't know how you would test the actual ordering here, but it is sort of clear that they do have things like argument structure, and they might have similar categories, so I don't know how to test that unless you have some brilliant idea. Okay, Doug, yeah. So Nim Chimpsky, Washoe, Koko: they lack externalization? So, yeah. Well, there are two ways you can interpret that. You can say, in one sense, their reactions are just behavioristic: they're just learning, okay, I make these signs and I get a response. That's one possibility. Another is that they can develop, gradually, this sort of relationship between meaning and externalization, or some kind of phonetic realization, but it's not like there's an actual algorithm, like the green smiley face, that's going through all of the structure and forcing it out. If they can, for some words, they could be able to. Then what's preventing them from forming the structures, if not merge? Why is it that we doubt that Washoe said "water bird"? Well, because if you don't have it analyzed, you know, sort of algorithmically, you can't have particular orders of subjects and direct objects and stuff like that. You could theoretically have the externalization of individual words, but it wouldn't necessarily have any rhyme or reason to it. It just sounds like you're saying that they should have it. Okay, well, let me put it this way. The brain does a lot of things, and I'm not always consciously aware of them. You know what I mean? My whole point is that this sort of quote-unquote syntactic processing is happening in all animals, but we just have this extra algorithm that computationally externalizes it. Okay, so in the same way, it's like the difference between a first and a second language learner.
There are different cognitive processes that are working on this to address similar problems, and the output is totally different, but more salient in this example. Yeah. So, are you thinking of vervet monkeys? Is that what you're thinking of? Okay, yeah. So vervet monkeys have set calls for predators depending on their location, and I would say sort of what I said before: that's probably a behavioristic response. It's not like their calls have specific truth-functional content, you know what I mean? So it's not really the same thing. It's something that looks like language to us because we're humans, but it's probably just something else; I mean, it's about as related as bee communication. You know what I mean? I would say that what they're doing is not related to this at all. It's just that they've developed a reflex, a sort of cultural reflex, based on behavioristic things, so it's not really the same. Yeah, Jason. It might not be. Yeah. So I will say, you know, one of the big differences: Chomsky's idea is that language exists not for communication; he gets really pissy when people say that. He'll say that the purpose of language is to organize thought. And even my model does the same thing, because yes, you have this semantic processing, but it's actually bringing it into the realm of metacognition. You can actually analyze your own thoughts and sort of think them again, if that's what you mean. Yeah. Who is? Oh, Gallistel. No, he has nothing to do with cartography. He was the one who said basically animals have merge-like structure. Chomsky said, okay, the language faculty basically gives us the ability to count, argument structure, things like this; Gallistel says the empirical evidence says that's not true. The cartography is Cinque. Cinque, Rizzi. I will yap at you for ages about this if you ask me later. Sweet. Yeah.
Well, you know, parts of speech might be totally epiphenomenal. They might. I don't believe in them. Yeah, the why? Okay. So the lexicon is an input to syntax; it's something feeding into it. But I mean, in this model, syntax is the thing that generates what you want to say, basically. So what I'm saying is, think about it: whenever you say something, you're saying it for a reason. You don't just mumble a bunch of postmodern diatribes of sentences. You say things for a particular reason, and Chomsky models this the wrong way, because he's sort of implying that meaning is generated after syntax or something, which doesn't even make sense. So yes, there is something visually above syntax in the diagram, but that's something different; that's just feeding into the syntax system. Yeah. Sure, sure. Just checking.