We have been working on these robotic platforms for about the last two years, and this was our first version. It was configured as a universal robotic platform. It has a front-facing camera for image processing, a wireless transceiver for communication, and a position encoder. It was a universal design: this robot can take any kind of processor card in between, and you can use any kind of control system. So it was a universal robot, but it was just at the prototype stage. This particular variant has an FPGA board on top of it, and we also had other processor boards, such as 8051 boards. Then we designed the second version, which we call the CD bot. This robot is made from a CD casing. It has four infrared proximity sensors, which are used for obstacle detection without any physical contact, and four line sensors, which can follow a white line on the ground for localization. This robot also had a shaft encoder and an ultrasonic distance meter for obstacle detection. It was designed in a modular way: this card over here was originally designed with an 8051 core, but we can replace it with any kind of microcontroller. Even the sensor boards are interchangeable. So you can have any sensor board, any control strategy, and any microcontroller family; it was a sort of rapid prototyping platform. Then last year we designed this robot, which we call the Firebird 1. It has seven line sensors, a two-wheel differential drive configuration, infrared wireless communication, and three infrared range finders. These sensors give you the distance from an obstacle, precisely in millimeters. And this robot we actually gave in kit format, like this, to our MTech students.
The box carried all the mechanical parts, the electronic components, and all the necessary tools. The students assembled the robot themselves and then used an RTOS to run these machines. That was our experience last year; we deployed 20 of these machines in our lab. But the problem with this machine was that it was quite expensive, something like 7,000 rupees. So we went back to the drawing board and redesigned the whole concept. Over here we have a sort of universal machine which can do a 360-degree sensor scan. It has eight reconfigurable proximity sensors, and for pick and place it has an electromagnet over here; line sensors are also there. But it still was not good enough for mass production. So then we designed this machine, which we call the Firebird 2. This is our current platform, and we are already deploying it in our remote centers; our MTech students are already using it. This machine has three white-line sensors, a few bump sensors over here, wireless communication, and a few range-finding sensors, and you can also add a wireless camera. It is currently used in three courses at IIT Bombay: our own embedded systems course, a sensor networks course, and a mechatronics course from the mechanical department. This is another, highly mobile edition of the same platform. The basic parts of such a robot are some intelligence, such as a microcontroller, CPLD, or other device, then sensors, locomotion, power, and communication. This particular robot has an ATmega128 microcontroller. We chose this microcontroller because it can also be used in sensor networks; a TinyOS port is available. So you can treat this robot as a mobile sensor network node.
This robot has three linear infrared range finders, which can estimate the distance from obstacles in millimeters. It has three white-line sensors for localization and five bump sensors for bump protection. It has two position encoders, so you can find out what distance the robot has traveled, or its velocity, and you can run this machine under closed-loop control. It has one light sensor, and you can also measure the battery voltage. It has three optional infrared proximity sensors, which are cheaper in nature, and you can also have a servo mount so that you can do a 360-degree scan. It has an LCD, a buzzer, and LED indicators. The best part is that it has a 2.4 GHz CDMA wireless transceiver. Using this device, at any given instant you can have 3,000 machines talking with each other simultaneously. That's the best part, and that's what makes it ideal for mobile sensor network and collaborative robotics research. It has an onboard lithium-ion power pack; we are actually using two mobile phone batteries. While designing this machine, we took extreme care that it had to be very cheap and easy to maintain, so we chose lithium-ion batteries and also designed our own ultra-fast one-hour lithium-ion charger, of the kind usually used with mobile phones. This is how the robot looks. It is a very modular architecture: if tomorrow I want to use, say, an 8051 or a PIC or maybe an ARM7, I can just replace this microcontroller board and have any board over here. This is the main board, and this is the programmer-cum-position-encoder board over here. On top of the microcontroller board I can have any sensor board, so I can use any sensor that I want. Then on this side you have the 2.4 GHz CDMA transceiver. The bump sensors are fitted over here, at five places.
The LCD is fitted over here, and the battery fits in between the LCD and the main board, so it is quite a modular design. This is its infrared linear rangefinder. It measures the physical angle between the transmitted and received beams and gives you a distance estimate, so the estimate does not depend on surface reflectivity. It is a triangulation-based sensor. This is how the sensor looks: it has an infrared transmitter and a CCD array, and when light falls on the array it calculates at what point the intensity is maximum, and from that it gives you the distance estimate. Then these are the bump sensors. They are not ordinary switches; they are switches with hysteresis, so they don't give a chattering response when they bump into something. This is the light sensor, and this is its placement on the bottom side. These are very directional sensors, so they don't get affected by ambient light. The robot is powered by two very low-power DC motors, because if we say this is a mobile platform, it has to consume very little power so that it will have a long battery life, and the motors have to be cheap. So we chose motors of the kind used in cameras for roll winding. These are very small motors and consume not more than 20 milliamperes at any given instant. This is the physical size of the motor: this is a 50 paisa coin, and this is the motor. It is very cheap, something like a 110-rupee motor. We control this motor under closed-loop control, so you actually have a position encoder on top of it. The position encoder is a disk with slots, and there is an infrared transmitter and receiver; when the slotted disk passes in between, you get square pulses. Using those, you can find out how much the machine is moving.
To control the robot's velocity, we use pulse-width modulation. At any given frequency, by varying the pulse width you can control the power given to the motor, and that way you can control the speed. In this case the duty cycle is small, so the motor will run at a slower speed, and over here the duty cycle is very large, so the motor will be running quite fast. This is the position encoder; again, it is a photo film. It is just an exposed photo film with this pattern, and here is an infrared transmitter and an infrared receiver. Through this it does the position encoding. This wheel is made from simple foam, so it is a very low-cost design. This is another view, and this is its 2.4 GHz CDMA transceiver. We also designed an in-system programmer for this machine, so you don't have to remove the microcontroller for programming; you can do it on the spot. Thank you. Just to add to what he said, the idea of the evolution was to make it as compact as possible, and the design should be modular so that if someone wants to replace a particular component and add his or her own component, it should be easily doable. That's the idea. We are also trying to reduce the cost as much as possible, but, as he said, to sustain our operation and the labor cost and things like that, it's coming to around 15K. The basic idea with this robot is to spread robotics across engineering colleges. That's why we even have a sort of spin-off from our lab which takes care of the financial things related to this robot. So if any of the engineering colleges want it, they can contact us, and if the order is in bulk or something like that, some sort of discount is available. Maybe we can talk about it tomorrow when we give out the pamphlet about this robot.
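The duty-cycle idea from the talk can be written down as a small helper: the average power delivered to the motor scales with the fraction of each period that the pulse is high. The supply voltage and timings here are illustrative, not the robot's actual drive parameters:

```c
/* Duty cycle (0..1) from pulse on-time and PWM period. */
double pwm_duty(double t_on_us, double period_us)
{
    return t_on_us / period_us;
}

/* Average voltage seen by the motor for a given supply voltage:
   small duty cycle -> low average voltage -> slow motor,
   large duty cycle -> high average voltage -> fast motor. */
double pwm_average_voltage(double supply_v, double duty)
{
    return supply_v * duty;
}
```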
What I will do in the next 15 or 20 minutes is briefly describe the embedded real-time activities that are going on in our lab, the research that we are doing as part of the lab's work. Basically, we are into a couple of vertical domains. One is automotive embedded systems, providing real-time support for designing the embedded applications that go into automobiles. The other is sensor networks, where we deal with localization, or with sensing some interesting event and then broadcasting it to a base station. That's one vertical. The other thing is real-time systems as such. Vibhuti over here is working on an interesting RTOS project where we are trying to design and build a kernel that can support multiple RTOSes on top of it; we have that sort of requirement from a customer. First I will go through the automotive embedded systems, since I have a couple of videos of things people have worked on in the lab. Are you aware of the adaptive cruise control system that's there in high-end automobiles? Adaptive cruise control is one which tries to maintain a safe distance from your leading vehicle at any given point of time, so the driver need not keep pressing the accelerator or brake pedal continuously while driving on less congested roads. Say he's driving on a national highway and just wants to relax: he can set what safe distance he wants his car to maintain from the leading vehicle, and the embedded system will take care of it automatically for him. This is basically an enhanced version of cruise control. There are a couple of interesting real-time issues here: at what rate you need to sample your own velocity, at what rate you need to sample the leading vehicle's velocity and distance, at what rate your task should periodically compute and take a decision, and how to actuate the throttle and brake control.
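Those real-time questions boil down to a periodic task that samples the gap and both velocities, then decides a speed command. Here is a minimal proportional-control sketch of that decision step; the gain, the safe-gap rule, and the function name are assumptions for illustration, not the lab's actual controller:

```c
/* One adaptive-cruise-control decision: given our current speed,
   the measured gap to the leading vehicle, the desired safe gap,
   and a proportional gain kp, return a new speed command.
   Positive gap error (too far behind) -> speed up;
   negative gap error (too close)      -> slow down.
   Units are arbitrary but must be consistent. */
double acc_speed_command(double own_speed, double gap,
                         double safe_gap, double kp)
{
    double cmd = own_speed + kp * (gap - safe_gap);
    return cmd > 0.0 ? cmd : 0.0;   /* never command reverse */
}
```

A real controller would also use the relative velocity (a PD term) and actuator limits; this shows only the shape of the periodic decision.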
All of this, the robots and even the software part, is done in our lab by MTech students; Ashish is also working on this. This is just a simple cruise control operation that the video is demonstrating. Cruise control is nothing but setting your vehicle to always maintain a set speed. Let's say you want your vehicle always to be at 40 kilometers per hour on the road: it will do that for you, and you don't have to worry about braking or accelerating. This is the machine that we used. There is a leading vehicle and then a following vehicle; the leading vehicle is the one in front of the big machine. What we are trying to do is set some initial safe distance that the big vehicle has to maintain, and even through the turns and curves of the road it tries to maintain it as much as possible. At no point of time will a collision happen; that's what this embedded system ensures. This is basically to increase safety and comfort for the driver when he is on the road. In the same video there is a concept called platooning, where vehicles move in groups, maintaining the same safe distance from each other. This fits nicely in a highway scenario. Basically, we have used the same robot, the "lab in a box" as we call it, even for demonstrating our automotive embedded systems applications, so in that way it is a sort of multi-purpose robot. This is another application, which we call the automatic merge control system. It's a scenario where two roads are merging at an intersection point and you need to make a decision about which vehicle should go first and which should go next. The vehicles are intelligent enough to decide among themselves, depending on some criteria, who should be the leader and who should be the follower. The criteria can be anything.
Whoever is nearer to the intersection region can go first, or the velocity with which they are moving can decide the strategy, or even a priority associated with the vehicles. Priority in the sense that a particular vehicle may be an ambulance or a fire engine, so it needs to move ahead; things like that. Here our criterion was that whoever is nearer to the merge region should go first. We program these things using the Real-Time Application Interface, RTAI, which you will be using in the lab session to program these robots. Next is a sensor network application where the larger goal is to detect an oil slick in sea water and then inform whoever the relevant person is that there is an oil leakage over here, so they can try to contain it as soon as possible. What this video is showing is a simulation test bed for that. It has a light source mounted on top, and the robot is trying to scan the whole area for light intensity; it will then start rotating along the circumference where that light intensity dies off. The idea is that when you have an oil leakage, you differentiate between the boundary of the water containing oil and the pure water, and make these robots circle along that boundary. Then there is an application which needs image processing support: a sort of robot soccer, where the vehicle is supposed to drive the ball towards the goal post. (I don't know why this mouse is not working.) We need to detect the position of the robot, the position of the ball, and the position of the goal post, and then dynamically take the ball towards the goal post; that was the objective. So we are trying to do lots of stuff with the same robot, the same platform that we have designed and built in the lab. That's the basic idea.
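The light-boundary behaviour in the oil-slick demo amounts to a bang-bang rule: steer one way when the sensed intensity says the robot is inside the lit region, the other way when it is outside, so it ends up orbiting the contour where the intensity crosses a threshold. A minimal sketch, with the enum names and threshold as illustrative assumptions:

```c
typedef enum { STEER_OUT, STEER_IN } steer_t;

/* Bang-bang boundary tracking: if the sensed intensity is above the
   threshold we are inside the lit region, so steer outward; below it
   we are outside, so steer back in. Alternating these decisions each
   sampling period makes the robot hug the contour where the intensity
   equals the threshold. */
steer_t boundary_steer(double intensity, double threshold)
{
    return intensity > threshold ? STEER_OUT : STEER_IN;
}
```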
We are reducing the cost while also not compromising on the technical abilities that this robot comes with; that's the basic idea. So the problem statement is that you have to write a module for the Firebird 2, the robot which is shown over here. Part A of the assignment is that it should be able to follow the white line. We have some white-line strips in our laboratory, and your job is to program the robot so that it follows the white line wherever it goes, and as soon as the white line is not there, it should stop at that moment. Part B is that you have to implement adaptive cruise control, as in the videos you saw and what Guru explained just now, so I don't think I need to explain any more about adaptive cruise control. Let's just see. On whichever PC you will be sitting, you are given a login and password, and you have to go to the CEP directory. We have provided you with a small piece of code, cep.c. Basically, your job is to create multiple tasks: if you want to read the front sensor value or the white-line value, or you have to control the robot, you need to create a task for each individual operation. So basically your program will be in three parts. One is the initialization part, wherein you will create whatever tasks you want to create and allocate various resources to each of these tasks. The other part is where you will de-allocate all the resources and stop all the tasks. And the third part is where you define all the task functions: if you have three tasks, then each task should have a function wherein you define what activity that particular task has to do. In a simple C program we generally start execution of the program from the main function, but in a kernel module it has to be done in the init module. So this is the init module, and execution will start from this function. Since there is a single serial port that every task is sharing, you need to define a semaphore.
Basically it's a mutex: multiple tasks may try to access the serial port at the same time, so you need to synchronize between them. Every task will send some command and expect some response, and it should not happen that one task has sent a command for a particular sensor and another task gets that data. That's why you protect the serial port with the mutex. Then you initialize the tasks. In the code we have provided, we have already created two tasks: one reads the white-line data, and the other reads the front sensor. This is the front sensor, and if you keep some obstacle in front of the robot, you can sense the distance between the two vehicles. Then you set the tasks in periodic mode, because we want to sample the data periodically. Then we start the timer; basically this timer will help in scheduling all the tasks. Then, in order to communicate with the robot, you have to make use of the serial port, so you configure the serial port here. We define that we have to use the COM port of this PC, and then you specify the baud rate, the number of data bits, and how many stop bits you want. Once this is done, the actual task execution will start from this point onwards, when you call rt_task_make_periodic: "time now" is the time at the current instant, and from one period onwards the task will start being scheduled. As soon as these two tasks are created and initiated here, they will start execution. The first task is the white-line task, shown here; every task will be an infinite loop. You have to read some data, in this case the white-line data, and we have a function over here for that. If you want to print whatever white-line data has been read, you have to make use of the rt_printk function. In C we have the printf statement.
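The command/response pairing described above, where a task must hold the port from sending its command until it has read its own reply, can be illustrated with a portable user-space analogue. In the lab this role is played by an RTAI semaphore; here a pthread mutex and a fake port stand in, and all the names are illustrative:

```c
#include <pthread.h>
#include <string.h>

static pthread_mutex_t port_lock = PTHREAD_MUTEX_INITIALIZER;

/* Stand-in for the robot's serial link: replies to one known command. */
static void fake_port_transact(const char *cmd, char *resp, size_t n)
{
    if (strcmp(cmd, "GET_FRONT_PROXY") == 0)
        strncpy(resp, "150", n);   /* pretend reading: 150 mm */
    else
        strncpy(resp, "ERR", n);
    resp[n - 1] = '\0';
}

/* The command and its response happen under one lock, so another task
   can never steal this task's reply off the shared port -- exactly the
   hazard the semaphore in the lab code prevents. */
void query_sensor(const char *cmd, char *resp, size_t n)
{
    pthread_mutex_lock(&port_lock);
    fake_port_transact(cmd, resp, n);
    pthread_mutex_unlock(&port_lock);
}
```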
That won't work here, because this is a kernel module we are writing, not a user program. Then there is rt_task_wait_period. We have programmed the period as 100 milliseconds, so every 100 milliseconds you will be reading the white-line data as well as the front sensor data. At rt_task_wait_period, your task will stop and will execute again only after 100 milliseconds. The second task is the front sensor task. We have a function defined for reading the front sensor value, and this is how we read it. First we lock the port, because we have to use the semaphore in order to access the serial port. This is a macro, basically; you can go through the macro to see what lock_port1 and unlock_port1 mean. Then RTAI has provided some APIs for communicating through the serial port. We have rt_spwrite, where we specify the port number (we have specified the COM port), then what command you want to give (the command is get front proxy), and then the size of that buffer. Using nano2count, we have specified five milliseconds as the timeout, so within five milliseconds it has to write the command. After writing the command, you are expecting some response: you want to read whatever data the robot is sending via the serial port. There is an API provided for this, rt_spread_timed, and within five milliseconds it has to read the data; if it is unable to read the data, then you will get an error which says it's a timeout. We have also provided some temporary variables for converting whatever raw data you get into millimeters. So this is just a small program, and the cleanup module is the module wherein you delete the mutex, stop using the serial port, stop the timer, and delete the tasks.
This way you ensure that whatever resources were allocated to the module in the init module get de-allocated. Your job is to create two, three, or whatever number of tasks you want, as per your convenience, to do what is specified in part A and part B. For part A you have to follow the white line: once you read the white-line data, you have to decide whether to take a left turn or a right turn, and how to maneuver your vehicle so that it doesn't overshoot the white line but follows it properly. Part B means you have to see whether there is some obstacle in front of your vehicle and, based on some judgment and on finding the relative velocity between the two vehicles, adapt to the velocity of the front vehicle. You have to implement these two tasks in this program. Let us see how to compile the program. We have provided you with a Makefile; the file name goes here, cep.o. Just give the make command. As per this document, first you have to load the mandatory modules: rtai_hal, which is the hardware abstraction layer, then the scheduler module that RTAI provides. Since you are going to use a semaphore, the semaphore module is also required, and for serial communication you will be using the rtai_serial module. To compile the program, just run the make command, and it will generate the kernel object module for you. This is the file generated by the Makefile. Now just switch on the robot; there is a switch over here, and I will keep the robot at a certain distance from this obstacle. The second step is to insert this module into the kernel: just give the make in command, and later, to remove it, the make out command.
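For part A, the per-sample decision just described (read the white-line sensors, then steer) can be sketched as a pure function. The sensor encoding (nonzero = that sensor sees the line) and the move names are assumptions for illustration, not the assignment's required interface:

```c
typedef enum { GO_STRAIGHT, TURN_LEFT, TURN_RIGHT, STOP } move_t;

/* One line-follow decision from the three white-line sensors.
   Center sensor on the line: keep going. Line drifting to one side:
   turn toward that side. No sensor sees the line: stop, as part A
   requires. */
move_t line_decision(int left_on, int center_on, int right_on)
{
    if (center_on) return GO_STRAIGHT;
    if (left_on)   return TURN_LEFT;
    if (right_on)  return TURN_RIGHT;
    return STOP;
}
```

Inside the white-line task's loop, a function like this would run once per 100 ms period, with its result sent to the motors over the semaphore-protected serial port.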
What make in does is insert the .o module that was generated into the kernel, and now it is periodically sensing the white-line data as well as the front sensor data. Now I will remove the module, and whatever output from the rt_printk statements has gone into the kernel buffer, I have flushed into a file called a. This is the distance that was sensed, and it has given 15 centimeters as the answer. I hope everything is clear: just three commands. When you modify your code, compile it with the make command; to insert it, just give make in; and to remove it from the kernel, just give make out.