So, this course deals with formal language and automata theory. In this introductory lecture, we will simply discuss the motivation behind the course, its contents, how we are going to conduct it, what the books are, what the basic requirements are, and so on. This course will be taken by two of us: I am Diganta Goswami, faculty in the Computer Science and Engineering Department, and my colleague K. V. Krishna is from the Mathematics Department of IIT Guwahati. We will briefly discuss the course, its contents, the books, how the topics will flow as we cover the syllabus, and the prerequisites for this course. We all know that computation is basically solving problems through the mechanical, pre-programmed execution of a finite number of unambiguous steps. Computation can be performed using a computer, as we know, and the theory of computation comprises the fundamental mathematical properties of computer hardware and software. Theoretical computer science basically comprises a few components. The first is data structures and algorithms: to solve any problem, we need to write an algorithm, that is, a finite number of unambiguous steps which, when followed in some order, solve a particular problem. While executing the algorithm, we need to store data, access data, manipulate it and so on, and for that purpose we need discrete structures, called data structures, for example stacks, queues and so on. So, in data structures and algorithms, we discuss all those concepts, the various objects and the algorithms. Then, to implement those algorithms, we need to write programs in some programming language.
So, therefore, the semantics of programming languages comes into the picture, and then the theory of computation, which, as we have already said, studies the mathematical properties of both hardware and software; the present course is a foundation course for the theory of computation. So, what is there in the theory of computation? Mainly, what we discuss here is: what can be computed? There are a large number of problems in the universe; in fact, the number of problems is infinite. So, can we compute, or solve, all those problems? That is what we have to see in the theory of computation. If we cannot compute something, why can we not compute it, and what are the limitations of computation? That we have to discuss here. Then, once we can compute or solve a problem, how difficult is it; that is, what is the complexity of the computable ones? That will also be discussed in this course. But obviously, to address all those aspects, we require certain models of computation. These models are basically abstract computers through which we can give mathematical proofs of the assertions we make. Now, this course is mainly intended to introduce and study the properties of the fundamental models of computation. As we have said, we need various kinds of abstract machines or abstract devices, and there are different kinds of abstract devices, abstract computers or models. So, we will study the different kinds of models of computation and their properties. For example, we will have a very simple model of computation, the finite automaton; then we have the pushdown automaton; then the Turing machine, which is the most powerful of all these models of computation. We will introduce those machines and study the various properties of those models. Now, the problems that we want to compute using those abstract devices or models of computation will be described as formal languages.
Therefore, we need to look for an abstract computer which can recognize or accept those languages. When an abstract computer, one of those models, accepts a language, we say that it solves the corresponding problem, because accepting the language means solving the problem; that is the notion. Now, whenever a problem gets solved, some problems will be easy and some will be hard; that means we want to classify problems into categories based on their complexity or hardness, from the easiest to the hardest, and since problems are given in terms of languages, we ask what the corresponding language of each problem is. Then there will be some problems which cannot be solved at all by any computing device; that means we want to show that there is no computing device to solve a particular problem, that is, it is an unsolvable or uncomputable problem, and again we ask what the corresponding languages for those problems are. So, since there are various models of computation and we give the problems, as formal languages, as input to these computing devices, corresponding to each kind of automaton we have a class of languages. For this course we have a large number of books available, from both Indian and foreign publishers, but there are some classical books, for example the book by Hopcroft, Motwani and Ullman; this is a classical book that you can refer to. Since we have different books, we will see that different books discuss, introduce or attack the subject in different ways. Every book will of course first introduce the basic concepts, such as what a language is, and then the various kinds of languages and the automata will be discussed; but the order varies: some books discuss automata first, that is, the various models of computation, and then formal languages, while some books first discuss automata and then introduce grammars, which are a mechanism to generate languages.
For example, the book by Hopcroft, Motwani and Ullman first discusses automata and then introduces grammars, and so on. Then another classical book is Elements of the Theory of Computation by Lewis and Papadimitriou. Then Sipser's Introduction to the Theory of Computation is also a very good book. Then we have the book by Peter Linz, which works out a large number of examples and may be useful for many of you. Then there is the book by T. A. Sudkamp, Languages and Machines: An Introduction to the Theory of Computer Science; this book first discusses grammars and languages and then automata. So, whenever you have any doubt, you can refer to any one of those books and get clarification. Now, we will see in what way we are going to cover the contents of this course: what the content is and what the flow of topics will be; that is what we want to discuss here. This course is divided into 15 modules, and each module comprises several lectures. Module 1 contains mainly 2 lectures. Initially we give the basic concepts: what an alphabet is, what strings are; then we formally define languages, and then we go for finite representation. What is meant by finite representation? As we have said, we need to use a computer to compute, and problems are given in terms of formal languages, but you know that languages may be finite or infinite. A finite language is fine; we can give it directly as input to a computer, as it is itself a finite representation. But in the case of an infinite language, we need to represent the language using some kind of finite representation and then give that as input to the computer. So, for every language we need to have some kind of finite representation, and what that finite representation is, and how we represent any language using a finite amount of information, is what we discuss there.
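To make the idea of finite representation concrete, here is a minimal sketch in Python; the example language and pattern are my own illustration, not from the lecture. The infinite language of all strings over {a, b} that end in "ab" is captured by one finite regular expression.

```python
import re

# The infinite language { w in {a,b}* : w ends in "ab" } is described
# by the single finite pattern below -- a finite representation of an
# infinite set of strings.
pattern = re.compile(r"[ab]*ab")

print(pattern.fullmatch("aab") is not None)   # True: in the language
print(pattern.fullmatch("aba") is not None)   # False: not in the language
```

Note that `fullmatch` requires the whole string to match the pattern, which is exactly the membership question for the language.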
So, we will use a kind of tool called the regular expression, which can represent certain infinite languages using a finite amount of information; but it is not the case that every language can be represented by a regular expression. There are many languages for which it is not possible to give a regular expression. Then we go for grammars in module 2, which is again divided into 2 to 3 lectures. A grammar is basically another combinatorial tool which can generate languages, so it is also a finite representation of languages. Here we first discuss a kind of grammar called the context-free grammar; in this grammar we have rules to generate the strings of the language. Now, we can impose some restrictions on the rules of the grammar, or we can allow more general rules; based on that, we have different kinds of grammars, each of which can generate a different class of languages. So, first we discuss context-free grammars and then we see how we can derive the strings of a language. When we derive a string in a language, there are other ways to present the derivation as well, via a structure called a derivation tree or parse tree, which is also useful in parsing programming languages. Then we impose some restrictions on the context-free grammar, and we see that there is a grammar called the regular grammar which is actually equivalent to the regular expression: the class of languages generated by regular grammars is the same as the class of languages represented by regular expressions. Then, in module 3, we come to automata. First we discuss the simplest kind of automaton, which is called the finite automaton. Automata are basically abstract computing devices, models of computation, and they can compute or accept different kinds of languages; that means they can solve different kinds of problems.
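As a small preview of how grammar rules generate strings, here is a hedged Python sketch; the grammar and the function name are illustrative choices of mine, not from the lecture. The context-free grammar with the two rules S -> aSb and S -> ε generates exactly the language { aⁿbⁿ : n ≥ 0 }.

```python
# A context-free grammar with rules  S -> aSb | epsilon  generates the
# language { a^n b^n : n >= 0 }.  This sketch derives one string by
# repeatedly rewriting the leftmost occurrence of the nonterminal S.
def derive(n):
    """Apply S -> aSb exactly n times, then finish with S -> epsilon."""
    sentential_form = "S"
    for _ in range(n):
        # one derivation step: replace the leftmost S by aSb
        sentential_form = sentential_form.replace("S", "aSb", 1)
    # final step: erase S via the rule S -> epsilon
    return sentential_form.replace("S", "", 1)

print(derive(3))  # aaabbb
```

Each call traces one derivation: S => aSb => aaSbb => aaaSbbb => aaabbb, which is precisely the sequence a parse tree of this grammar would record.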
So, the finite automaton is the simplest of all; that is why its computing power is limited. Then we introduce non-determinism in finite automata; by default, the finite automaton is deterministic. Non-determinism is an important concept in computer science. We introduce the concept of non-determinism, and then we have another model called the non-deterministic finite automaton, or simply NFA. But we can show that deterministic finite automata and non-deterministic finite automata are equivalent; that will be shown in this module itself. Then, in module 4, we come to the minimization of finite automata. What is done here is this: suppose we have a language and we construct a finite automaton to accept it; we can construct it in many different ways. In some cases the automaton will be simpler, in the sense that the number of states is smaller; in other cases the number of states may be larger, but both are equivalent in the sense that they accept the same language. Therefore, for any such language, we would like to ask: is there a finite automaton with the minimum number of states? Yes, and that is what we can construct. For that purpose, we first introduce a characterization of the class of regular languages, and then we show how to construct a finite automaton with the minimum number of states. Then we go on to show the equivalence between the various concepts: regular languages, regular grammars and finite automata. The class of languages represented by regular expressions is called the class of regular languages. We want to show that finite automata and regular expressions are equivalent: the class of languages represented by regular expressions is the same as the class of languages accepted by finite automata. Similarly, regular languages and regular grammars are equivalent: regular grammars generate exactly the class of regular languages, and so on. We will show the equivalence between these various models in module 5.
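To fix intuition for what a deterministic finite automaton is, here is a minimal sketch in Python; the state names and the particular language are my own illustration, not the lecture's. The machine is just a finite transition table plus a start state and a set of accepting states.

```python
# A deterministic finite automaton given as a transition table.
# This toy DFA (states q0, q1; alphabet {0, 1}) accepts exactly the
# binary strings containing an even number of 1s.
delta = {
    ("q0", "0"): "q0", ("q0", "1"): "q1",
    ("q1", "0"): "q1", ("q1", "1"): "q0",
}
start, accepting = "q0", {"q0"}

def accepts(w):
    """Run the DFA on input w; accept iff it halts in an accepting state."""
    state = start
    for symbol in w:
        state = delta[(state, symbol)]   # exactly one move per symbol
    return state in accepting

print(accepts("0110"))  # True: two 1s
print(accepts("1"))     # False: one 1
```

Determinism shows up in the table itself: each (state, symbol) pair has exactly one successor, so the run on any input is unique.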
In module 6 we will discuss some variants of finite automata. Normally, when we discuss a finite automaton, we see that it accepts a particular language. But a finite automaton can also produce output: given some input, it may produce some output. In that context, we will see the Moore machine and the Mealy machine, which produce output. We will also see another variant of the finite automaton, the 2DFA, where the reading head can move back and forth; in a normal finite automaton the reading head can move in only one direction, but in a 2DFA it can move back and forth. These various kinds of finite automata will be discussed in module 6. In module 7 we discuss the properties of regular languages: what the various properties are, and whether there is any characterization of regular languages. Again, suppose there is a language and we suspect it is not regular; how can we show that it is not regular? Do we have any tool to do this? This will be discussed in module 7: using a result called the pumping lemma, we can show that a language is non-regular. Many set-theoretic properties of regular languages will also be proved there. Then, in module 8, we go for the simplification of CFGs. Context-free grammars were already introduced in module 2, and in a grammar we will have many different kinds of rules. We will show that some rules are not really necessary and that we can simplify the grammar by eliminating or rewriting some of the rules and still have an equivalent grammar; that means, given a grammar, we can simplify it, by reducing the number of rules or by other kinds of simplification, and the simplified grammar will generate the same language as the previous one.
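The pumping-lemma argument mentioned above can be illustrated mechanically; the following Python sketch is my own demonstration, not the lecture's proof. For the language { aⁿbⁿ }, it tries every decomposition xyz of aᵖbᵖ allowed by the lemma and checks that pumping y once more always leaves the language.

```python
# Pumping-lemma intuition: if { a^n b^n } were regular with pumping
# length p, then a^p b^p could be written as xyz with |xy| <= p and
# |y| >= 1 such that x y^i z stays in the language for every i.
# Because |xy| <= p forces y to lie inside the block of a's, pumping
# changes the number of a's only -- so no decomposition can work.
def in_language(w):
    """Membership test for { a^n b^n : n >= 0 }."""
    n = len(w) // 2
    return w == "a" * n + "b" * n

p = 5
s = "a" * p + "b" * p
# Enumerate every legal split x = s[:i], y = s[i:j], z = s[j:]
# and pump with i = 2; every pumped string must fall out of the language.
fails = all(
    not in_language(s[:i] + s[i:j] * 2 + s[j:])
    for i in range(p + 1) for j in range(i + 1, p + 1)
)
print(fails)  # True: no decomposition survives pumping
```

The exhaustive check for one p is of course not a proof, but it shows exactly the contradiction the lemma delivers: every admissible split produces a string with more a's than b's.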
And then we will discuss some standard forms of grammars, called normal forms, which are useful in proving many theorems, and we will see how, given any grammar, to arrive at a standard form which generates the same language as the original one; that will be discussed in this module. There are many different kinds of standard forms; we are not going to discuss all of them, only a few. In module 9 we discuss the properties of CFLs, that is, context-free languages, the languages generated by context-free grammars, similar to the properties of regular languages: various closure properties, and also tools for showing that a given language is not context-free; those will be discussed in this module. As we have said earlier, with each abstract device or automaton we associate a class of languages. For example, for the finite automaton, the class of languages is the regular languages. Similarly, corresponding to the context-free languages, the automaton is the pushdown automaton, which we take up in module 10. We will see how the pushdown automaton differs from the finite automaton: we enhance the finite automaton by adding some more features to it, so that it can accept a larger class of languages, which is nothing but the context-free languages. We will also show in the process that the class of regular languages is a proper subset of the class of context-free languages; that means the pushdown automaton is more powerful in terms of the languages it accepts. And then we will prove that pushdown automata and context-free grammars are equivalent: given a pushdown automaton, we can construct an equivalent context-free grammar from it.
Similarly, given a context-free grammar, we can construct an equivalent pushdown automaton from it. Then, in module 11, we introduce the most powerful computing model, the Turing machine. The Turing machine is the most powerful computing device in the sense that whatever a present-day general-purpose computer can do, a Turing machine can also do; it is capable of computing whatever a general-purpose computer can compute. So, we will see how we can use a Turing machine to compute various functions. Then we will use a modular approach, in the sense that we can construct a very complex Turing machine, computing some complex function, out of simpler Turing machines which compute simple functions; that means we can combine Turing machines to compute more complex functions. We will also introduce the notion of an algorithm through Turing machines. And there will be different kinds of Turing machines, that is, variants of the Turing machine. For example, the basic model of the Turing machine has only one tape; a Turing machine with multiple tapes is one variant, and the multi-head Turing machine is another. Also, while the basic model has a one-way infinite tape to hold the input, we may introduce a two-way infinite tape as well. And we can show that all these models are equivalent to the basic Turing machine model. In module 12 we will discuss unrestricted grammars, which can generate any language that is accepted by a Turing machine; that means the class of languages generated by unrestricted grammars is equivalent to the class of languages accepted by Turing machines. We will introduce grammatically computable functions there. Unrestricted grammars are basically the most general form of grammar.
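The basic single-tape model can be sketched in a few lines of Python; the machine below, its state names and its task are illustrative assumptions of mine, not from the lecture. It walks right over the input, rewriting every 0 to 1, and halts when it reads a blank.

```python
# A minimal single-tape Turing machine sketch.  The transition table
# maps (state, read symbol) to (new state, symbol to write, head move).
BLANK = "_"
delta = {
    ("scan", "0"): ("scan", "1", +1),   # rewrite 0 as 1, move right
    ("scan", "1"): ("scan", "1", +1),   # leave 1 alone, move right
    ("scan", BLANK): ("halt", BLANK, 0),  # blank reached: halt
}

def run(tape_input):
    """Simulate the machine until it enters the halting state."""
    tape = list(tape_input) + [BLANK]
    state, head = "scan", 0
    while state != "halt":
        state, tape[head], move = delta[(state, tape[head])]
        head += move
    return "".join(tape).rstrip(BLANK)

print(run("0101"))  # 1111
```

Even this toy machine shows the three ingredients that give the model its power: a finite control (the table), an unbounded tape, and the ability to both read and write as the head moves.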
Then, in module 13, we discuss decidability and undecidability: what can be decided and what cannot be decided? We will show that many problems, normally problems related to the languages that we have already introduced, are decidable. For example: given a finite automaton and a string, is the string accepted by this finite automaton; that is, does the string belong to the corresponding regular language? Is this problem decidable or not? We can show that it is a decidable problem. Similarly, many other problems related to languages can be shown to be decidable. So, we will discuss some problems which can be solved, and then we will discuss undecidability. That means there are many problems which cannot be decided by any computing device, for example any Turing machine: whichever Turing machine you consider, it cannot solve that particular problem. In this case we need to use some tools, some methods, to show that these problems are not decidable. We will use the concept of diagonalization and first show that the halting problem for Turing machines is undecidable, and from there we can show that many other problems are also undecidable. We will also introduce a problem called PCP, Post's Correspondence Problem, which can be used to show that many problems related to languages are undecidable. Then, once we have the problems which are decidable, we want to know how difficult those problems are to solve. In module 14 we discuss that issue: complexity theory, how complex problems are, and whether we can classify problems based on their complexity level. There are, of course, a large number of complexity classes, but here we will introduce only the basic classes, for example P, NP and NP-complete. Here again we have to use many different tools.
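The decidability example mentioned above, membership for a regular language, can be made concrete; the following Python sketch and its DFA are my own illustration, not the lecture's. The point is that the simulation always halts after exactly |w| steps, so it is a decider: it gives a definite yes/no answer for every input.

```python
# The problem "does DFA M accept string w?" is decidable: simulating
# M on w takes exactly len(w) steps and then stops with yes or no.
def decide_membership(delta, start, accepting, w):
    state = start
    for symbol in w:               # len(w) steps -- always terminates
        state = delta[(state, symbol)]
    return state in accepting      # a definite yes/no answer

# Illustrative DFA over {a, b} accepting strings that end in 'b'.
delta = {("p", "a"): "p", ("p", "b"): "q",
         ("q", "a"): "p", ("q", "b"): "q"}
print(decide_membership(delta, "p", {"q"}, "aab"))  # True
print(decide_membership(delta, "p", {"q"}, "aba"))  # False
```

Contrast this with the halting problem: there, no procedure exists that is guaranteed to stop with a correct yes/no answer on every input, which is exactly what diagonalization is used to prove.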
For example, reduction is one tool to show that if we can solve one problem then we can also solve another; that means the two problems have comparable complexity levels. So, here we will first show that one problem is NP-complete, and then, using that, we can show that many other problems are NP-complete by this method of reduction. To show that a first problem is NP-complete, we use Cook's theorem: we introduce this theorem, prove it, and thereby show that satisfiability is NP-complete; then we use the method of reduction to show that many other problems are NP-complete. We present various NP-complete problems and, by the method of reduction, show that all of them are NP-complete. Finally, we conclude the course, in module 15, by giving a hierarchy of the language classes. This is known as the Chomsky hierarchy, named after the famous linguist Noam Chomsky. There are various language classes, for example the regular languages, the context-free languages, the deterministic context-free languages, the recursive languages and the recursively enumerable languages. The Turing machine accepts the class of languages called the recursively enumerable languages, and in between we will also introduce another computing model, the linear bounded automaton, which accepts the context-sensitive languages; the corresponding grammar for this is the context-sensitive grammar. And here we will show that there is a proper containment: the regular languages are properly contained in the context-free languages, the context-free languages are properly contained in the context-sensitive languages, the context-sensitive languages are properly contained in the recursive languages, and finally the recursive languages are properly contained in the recursively enumerable languages. This hierarchy is known as the Chomsky hierarchy. So, that is all about the plan, the flow of topics that we are going to cover in this course.
But this course requires some basic concepts from discrete mathematics. For example, we need the basic knowledge of set theory: what a set is, the cardinality of a set, finite and infinite sets, and so on, and the various operations on sets, union, intersection and so on. Then functions and relations: what a one-to-one function or a bijection is, and, for relations, the concepts of equivalence relation, equivalence class and partial ordering will be required. Then, most of the results or theorems in this course will be proved, and many different proof techniques will be used, but most commonly we will use mathematical induction, where, to prove a result, we have a basis step, an inductive hypothesis and then an inductive step. There is another kind of induction, called structural induction, which will also be very useful; it can be applied to many structures which are recursively defined, for example trees, or sets given by a recursive definition, and we can apply structural induction to prove many results about them. Then some concepts of graphs and trees are also required, because a finite automaton can be modelled as a graph, a directed graph, and we will ask whether there is a cycle in a graph, whether a graph is a tree, and so on; various properties of trees, such as the height of a tree and the levels of a tree, will also be needed. For example, when we parse a string of a programming language, we consider a kind of tree structure called a parse tree. So, all those concepts will be required. Once we have these basic concepts, we can follow the coming lectures very easily. So, that is all about the introduction to this course.
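As a small reminder of the induction template just described (basis step, inductive hypothesis, inductive step), here is the standard textbook example, proving the formula for the sum of the first n positive integers; the choice of example is mine, not the lecture's.

```latex
% Claim: for every n >= 1,  1 + 2 + ... + n = n(n+1)/2.
%
% Basis step (n = 1): the left side is 1 and the right side is
%   1(1+1)/2 = 1, so the claim holds.
%
% Inductive hypothesis: assume  \sum_{k=1}^{n} k = \frac{n(n+1)}{2}.
%
% Inductive step:
\begin{align*}
\sum_{k=1}^{n+1} k
  &= \sum_{k=1}^{n} k + (n+1)         \\
  &= \frac{n(n+1)}{2} + (n+1)         && \text{by the hypothesis} \\
  &= \frac{(n+1)(n+2)}{2},
\end{align*}
% which is the claimed formula with n replaced by n + 1.
```

Structural induction follows the same pattern, except that the basis covers the atomic structures (e.g. leaf nodes) and the inductive step covers each recursive construction rule (e.g. forming a tree from subtrees).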