 From the mid-1950s onwards, the American linguist Noam Chomsky established a number of objectives which have continued to direct the course of linguistic research to the present day. Central to Chomsky's approach is the notion of the well-formed sentence. Any theory capable of producing well-formed sentences and rejecting ill-formed sentences is referred to as a generative grammar. So we will look at sentences first, then at a central notion, the notion of competence, and finally at the components of a generative grammar. So let's look at sentences first. The simplest imaginable syntactic theory is a grammar that consists of a list of all well-formed sentences of a language. However, such a list could never be complete, since the number of sentences in a language is infinite. Two principles are responsible for this non-finite character of natural language. Let's look at them more closely. The first one is called creativity. Now behind me you see a picture which we could describe with a sentence: The pink horse welcomed the cross-eyed elephant called Susie. Grammaticality judgments, and this sentence is certainly grammatical, do not only apply to sentences we have heard before, but also to such novel utterances. This remarkable ability is referred to as creativity. Any native speaker is capable of understanding and producing new sentences. Well, here's another one: The interactive whiteboard behind me is used to present the main topics of my e-lecture. Have you heard this sentence before? I don't think so, but you can understand it. This creative character shows that language cannot simply be learned by imitation. Rather, languages are acquired by principles of abstraction. Some linguists even argue that these principles are innate. Now the second principle beyond creativity is referred to as recursion. Natural languages typically allow constructions that involve the repetitive occurrence of elements.
This phenomenon is defined as recursion; some people also call it iteration. Any attempt to describe such growing structures by enumerating the elements in them faces the problem that there is no upper limit to the length of such sentences. With a very simple rule system, however, the recursive character of natural language can be described. Well, let's look at these examples of recursion or iteration first. The first one is called sentence or noun phrase coordination. This one here. So here we have examples such as Jack saw Bill and Bill saw Mary and Mary saw Jane, and so on and so forth. There's no end to this sentence; we simply apply the same rule time and again. Another example would be adjectival iteration or recursion. Here you have examples such as the wonderful long old English tale: four adjectives on the trot, and even more are possible. Well, the last one is referred to as prepositional phrase attachment, and I would like to illustrate this phenomenon in more detail. Here is my example. Let's start with the noun phrase the ball. A next step would give us the ball in the box. So we have the first prepositional phrase. Then let's continue with the ball in the box under the table. And here we can go on: the ball in the box under the table with the lamp. You see, there is no end to the number of prepositional phrases that can be attached to this construction. And the rule is quite simple. Even though the number of sentences is infinite, not all sentences that are used in human conversation find their way into a grammatical theory. Syntacticians define a specific level of abstraction referred to as competence. Do you know what I mean? Well, as native speakers of a language, we are able to make numerous intuitive judgments about our language. We do not have to consult grammar books, and we do not have to interview large groups of native speakers.
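The prepositional phrase attachment discussed above can be captured by a single recursive rule. Here is a minimal sketch in Python; the function and data names are my own illustration, not part of the lecture's formal system:

```python
def build_np(head, pps):
    """NP -> 'the' N (PP), where PP -> P NP.

    The embedded NP is built by a recursive call, so the same rule
    can apply again and again: there is no upper limit on the number
    of prepositional phrases that can be attached.
    """
    np = f"the {head}"
    if pps:
        prep, noun = pps[0]
        # PP attachment: the preposition introduces a new, smaller NP
        np += f" {prep} " + build_np(noun, pps[1:])
    return np

# The lecture's growing example, one prepositional phrase at a time:
print(build_np("ball", []))               # the ball
print(build_np("ball", [("in", "box")]))  # the ball in the box
print(build_np("ball", [("in", "box"),
                        ("under", "table"),
                        ("with", "lamp")]))
# the ball in the box under the table with the lamp
```

A one-line rule thus generates a construction of unbounded length, which is exactly why a finite rule system can describe an infinite set of phrases.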
Rather, by virtue of knowing a language, we know that some sentences are fine and some are bad. This is referred to as competence, as native speaker competence. Consider some English sentences. Well, here we have the first one: The table saw the woman. This sentence is grammatically correct. It has a subject, it has a verb, it has an object, so the typical English word order. However, it is semantically at least questionable. Even though this wonderful picture here shows that tables might be able to see, we would confine this to special worlds, for example to fairy tales. So linguists normally put one or two question marks in front of such sentences to signal that they are at least semantically odd. The next sentence: John put the car. Well, this sentence is clearly ungrammatical. It violates the argument structure of the verb put. We know that put requires not only an object like the car, it also requires a location, an obligatory adverbial. That is, where do we put the car? So something is missing here. Well, then finally, we have a sentence that is also totally ungrammatical, because here we have a word order which is not in line with what we know about English word order: the verb comes first, then the subject, then the object. This VSO pattern in declarative sentences is not the standard word order of present-day English. So now we've already learned how to assess the status of a sentence, and in linguistics we use two symbols for this, the asterisk and the question mark. The asterisk means ungrammatical, and the question mark means semantically odd or questionable.
To achieve this goal we need a well-defined rule system. So we need rules in this component that combine all categories into successively larger units. You see the syntactic tree here, which is a visual manifestation of the syntactic rules. This rule system consists of so-called phrase structure rules, and the technique of analyzing sentences in this way is referred to as constituent analysis. This approach, which receives empirical support from a large number of constituent tests, has a long tradition, from the early systems of the 1950s to today's elaborate X-bar scheme. The second component is called the lexicon. This one here. The lexicon interacts with the phrase structure component and contains information about the lexemes of a language. This includes phonological, morphological, syntactic and semantic aspects. Well, the whole system works as follows. The phrase structure component contains a limited set of rules with which basic sentence structures can be generated. These structures are licensed by means of lexical information. In other words, the lexicon delivers the lexemes whose information licenses the basic sentence structures. Together, the phrase structure component and the lexicon constitute the core of any modern generative grammar. Let's summarize. Generative grammar not only provides a description of the structure of a language, but it also seeks to explain, among other things, the following phenomena. It wants to explain language processing: how do humans understand and produce speech? Language acquisition: what goes on in an infant's mind when he or she acquires his or her mother tongue? And language variation: why do languages change, and what is going on under the surface of language variation? Well, as a result of the discussion of these phenomena, it seems plausible to claim that language is like an instinct. We acquire and use it subconsciously.
The core of this process is what many linguists refer to as universal grammar. But that's an even more complicated story.
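To make the interaction of the phrase structure component and the lexicon a little more concrete, here is a toy generative grammar in Python. The rules and lexicon entries are invented for illustration and are of course far simpler than any real grammar of English:

```python
# Phrase structure component: a limited set of rewrite rules.
RULES = {
    "S":  ["NP", "VP"],   # a sentence is a noun phrase plus a verb phrase
    "NP": ["Det", "N"],   # a noun phrase is a determiner plus a noun
    "VP": ["V", "NP"],    # a verb phrase is a verb plus a noun phrase
}

# Lexicon: maps lexical categories to word forms. Only the syntactic
# side is modelled here; a real lexicon also carries phonological,
# morphological and semantic information.
LEXICON = {
    "Det": "the",
    "N":   "horse",
    "V":   "welcomed",
}

def generate(symbol):
    """Expand a symbol: look it up in the lexicon if it is a lexical
    category, otherwise apply its phrase structure rule recursively."""
    if symbol in LEXICON:
        return [LEXICON[symbol]]
    words = []
    for part in RULES[symbol]:
        words.extend(generate(part))
    return words

print(" ".join(generate("S")))  # the horse welcomed the horse
```

Note that the same recursive expansion would license infinitely many sentences as soon as one rule reintroduces a category it is part of, for example an NP rule that attaches a prepositional phrase containing another NP.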