Hi, everyone, and welcome to our presentation. Today, me, Marta, and Dalek are going to introduce to you our project, which is a pick-and-place vehicle. Before diving into it, we would like to show you a quick demonstration of how it works.

So that concludes the demonstration. Let's now dig into what happened during this one academic year. The table of contents for our presentation is as follows: we're going to introduce the motivation behind this project, walk you through the process of what we did, quickly demonstrate the working principle of our project, talk about some of the applications it has, tell you how we envision the future work on it, and then conclude.

As many of you might know, robotic arms are heavily used in industry during manufacturing processes, mainly to assist humans in mundane and dangerous tasks. While researching, we thought: what if we took a robotic arm and actually mounted it on something, to make it mobile and usable in different areas? Another point of motivation that inspired us is the specialization of both of us. We are both interested in electronics engineering and want to pursue this career professionally, so we thought this project would be a great opportunity to learn a lot about it. Also, this kind of system is not widely available to students for research purposes, and we thought a lot of students would have benefited from having one during their studies. And finally, over these past four years at the AUA, we had a multidisciplinary curriculum, composed of courses in mechanical engineering, electronics, and software, and we thought it would make sense if our project was a multidisciplinary one as well.

So the three main components of our project are the vehicle, which is for the mobility.
The second is the robotic arm, which has the grabbing mechanism, and the third is the gesture control mechanism, which is a glove.

This is our system design. We have separated the vehicle with the robotic arm from the gesture control. For the vehicle and robotic arm, we have two DC motors and six servo motors. We also have an Arduino Nano board with BLE communication, a motor driver, and a video camera system. For the gesture control, there is no actuating part, only sensing: five flex sensors, a battery, an Arduino board, and, separately, a screen for the user to see what is happening in front of the device.

Let's now walk you through the process of what we did. We have separated it into three parts: the mechanical design, then the electrical circuits, and then the software programming behind it.

Let's start with the mechanical design of the robotic arm, as we started with it as well. These are the larger parts; they were designed using the SOLIDWORKS software. The holes visible here are for the servos that will be mounted on them later on, so the dimensions of the servos were taken into account. This is the second slide, with mainly the smaller parts. The smaller parts were for the grabbing mechanism, and the larger parts you can see starting from the base. The larger parts were printed with 50% infill, for a few reasons. First of all, we wanted them to be lightweight so as not to overload our motors and vehicle, and also for financial reasons, because the less infill, the less costly it would be. The smaller parts had 100% infill, because they were responsible for the grabbing mechanism, and in terms of cost the difference would not be that significant.

Next is the mechanical design of our vehicle. Again, most of the parts were 3D printed here, except for this plate.
The plate here is barely visible because of its color, but it is a four-millimeter-thick acrylic glass, which was cut using a laser cutting machine. The choice of material was based on a few criteria: it needed to be strong enough to bear the weight we were going to mount on it, but also lightweight enough, because again, we didn't want to increase the overall load on our motors. And finally, we opted for a material that would be cheap and available to us.

Here you can see our first prototype. Because the delivery of the parts was a bit delayed, and in order not to lose time on the logistics, we decided to build this kind of small vehicle to test some software and the communication; this was our prototype.

This is our design alternative, the first design of the vehicle. It's more or less the same; the only difference is the DC motors. We had a problem with those because they weren't providing enough torque, so when the vehicle was just set on the table it wasn't able to move. That was because of the absence of a gearbox. The second problem with those motors was the coupling between the motor and the wheel: it was not concentric, and that resulted in vibrations. That's how we decided to go for the other motors, the yellow ones which you can see here in the design; they already had an integrated gearbox, and the coupling was done much more easily. And here you can see the design without the cover and with the cover; in the video you also saw it without the cover. We did that so you can see the electronics inside, but for aesthetics we have the cover here.

Let's go a bit into the electrical design. On the left you can see the power tree of the robotic arm and the vehicle. It's powered using three lithium-ion batteries connected in series, which results in around 12 volts. Those batteries directly power the Arduino Nano board, which, because it has a wide input voltage range, doesn't need any converters.
The battery also supplies the DC motor driver and the DC-DC converter, which steps the voltage down from 12 to 5 volts to supply the servo motors and the video transmission system. On the right you can see our circuit. We have six servos connected directly to the Arduino. We have six pins going to the motor driver: four of them are digital pins used for direction control, and two of them are PWM pins used for the speed control of each motor. For the motor driver we also have the supply and the outputs for the two DC motors.

Now, in terms of software: when the program starts, it moves all the servos to a predefined zero position, which you saw in the video. It initializes the BLE (Bluetooth Low Energy) module and starts searching for the central device, which in our case is the gesture control glove. When it receives data from the glove, it reads the data and determines which mode it is currently in. In the off mode, it does nothing and stays as it is. In the vehicle mode, it controls the motors' speed and direction through the motor driver, and in the arm mode it controls the servos.

We had a slight problem with that, because the Arduino documentation lists only five PWM (pulse-width modulation) pins, which are what servo control requires, but we had used two of them for speed control and were left with three, while we needed six for the servos. So we tried a software solution: generating the PWM manually using timer interrupts. We succeeded in that, but the timing wasn't perfect, and there was some jitter in the servos. Then we decided to research what PWM frequency the Arduino actually generates, and we found out that this Arduino board can generate PWM on every GPIO pin. So we simply connected the servos to six adjacent pins, and it worked pretty well.

Now for the gesture control: this is our gesture control device.
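The pulse math behind that servo PWM can be sketched in plain, hardware-independent C++. The 50 Hz frame and the 1000 to 2000 microsecond pulse endpoints are typical hobby-servo values, assumed for illustration rather than measured from our board:

```cpp
#include <cstdint>
#include <algorithm>

// Hobby servos expect a pulse every ~20 ms whose width encodes the angle.
// Typical endpoints: ~1000 us -> 0 degrees, ~2000 us -> 180 degrees.
constexpr uint32_t kFrameUs    = 20000; // one PWM frame (50 Hz)
constexpr uint32_t kMinPulseUs = 1000;  // pulse width at 0 degrees
constexpr uint32_t kMaxPulseUs = 2000;  // pulse width at 180 degrees

// Map a target angle (0..180) to the required high-time in microseconds.
uint32_t angleToPulseUs(int angleDeg) {
    angleDeg = std::clamp(angleDeg, 0, 180);
    return kMinPulseUs +
           (kMaxPulseUs - kMinPulseUs) * (uint32_t)angleDeg / 180;
}

// Software PWM: given the elapsed time inside the current 20 ms frame,
// decide whether a servo pin should be driven high or low. A timer
// interrupt would evaluate this at a fixed tick rate for each channel.
bool pinLevel(uint32_t timeInFrameUs, int angleDeg) {
    return timeInFrameUs < angleToPulseUs(angleDeg);
}
```

On the actual board, a timer interrupt would evaluate `pinLevel` for each of the six channels every tick and write the result to the GPIOs; jitter in that tick is exactly the source of the servo noise we mentioned, which is why moving to hardware PWM fixed it.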
Here you can see, again, the same Arduino board, five flex sensors, one for each finger, a power switch, a mode button to change the mode, the on-board RGB LED, and the battery.

What design alternatives did we have here? First, a major issue was that we didn't know how to attach the flex sensors to the glove. We thought about gluing, but we thought it might damage the sensors, so we didn't proceed with that. We tried sewing the sensors onto the glove, but that didn't work well because the sensors were slipping out of their places. Finally we used double-sided tape, which worked pretty well, as you can see on the actual design. The second question we had was whether we should put the flex sensors on the inside of the palm or on the outside. In this picture they are on the inside; that was our first test, but it didn't work well, because these are flex sensors, not bend sensors: when a sensor is on the inside and you bend your finger, it does not give a consistent value, so you can't tell whether the finger is flexed or not. That's why we changed the design to have them on the outside, and it worked well.

From the circuit point of view, the power tree is pretty simple: it's just the battery connected directly to the Arduino. In the circuit we have five flex sensors with their signal conditioning. We use a voltage divider to feed a voltage into the Arduino and read the changing resistance of the flex sensors. We have the power input, and we have a button with a pull-up resistor to read the mode change.

From the software point of view, algorithmically it's a bit more complicated, but the block diagram is simple: we initialize the IMU (inertial measurement unit) module and the BLE module, and we connect to the peripheral device, which in this case is the vehicle.
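The signal conditioning for one flex sensor comes down to the divider arithmetic. A minimal sketch, assuming a 10 kΩ fixed resistor, a 3.3 V supply, and a 10-bit ADC; the actual resistor values in our build may differ:

```cpp
#include <cstdint>

// Flex sensor in a voltage divider: Vout = Vcc * Rfixed / (Rflex + Rfixed),
// read by the microcontroller's 10-bit ADC. The 10k fixed resistor and
// 3.3 V supply are illustrative assumptions.
constexpr double kVcc    = 3.3;     // supply voltage, volts
constexpr double kRfixed = 10000;   // fixed divider resistor, ohms
constexpr int    kAdcMax = 1023;    // full scale of a 10-bit ADC

// ADC count we would read for a given flex-sensor resistance.
int adcForResistance(double rFlex) {
    double vout = kVcc * kRfixed / (rFlex + kRfixed);
    return (int)(vout / kVcc * kAdcMax + 0.5);  // round to nearest count
}

// Invert the divider: recover the sensor resistance from an ADC count.
double resistanceForAdc(int adc) {
    return kRfixed * (double)(kAdcMax - adc) / (double)adc;
}
```

As the finger bends, the sensor resistance rises, the divider output drops, and the ADC count falls, which is the raw value the glove firmware then thresholds.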
When reading the button for the mode, we had to deal with button debouncing, which is a common problem with cheap buttons. We handled it in software, so that each press of the button is detected as one press rather than multiple. When the glove reads the mode: in the off mode, it sends an empty buffer to the vehicle, so the vehicle knows it has nothing to do. In the vehicle mode, it parses the IMU data, the roll and pitch angles, and then sends the vehicle the control buffer. In the arm mode, it parses both the IMU and the flex sensor data and sends the appropriate control buffer to the vehicle.

Okay, so now that you have been introduced to the work we have done, let's have a small demonstration of how it works. I'm turning on the vehicle and the gesture control glove. If the red light is displayed, it means that I can control neither the vehicle nor the robotic arm. I switch once, and now I'm controlling the robotic arm. Okay, let's see how it works: it uses the data from the IMU and the flex sensors. Bending the index finger controls this servo here, and tilting the hand right and left controls the bottom servo, which you cannot see from here. Now, if I bend my middle finger, it controls the next two servos; again, the movements are the same. We tried to design the gestures to be as intuitive as possible. The ring finger controls only one servo, which is right here; it only has these two directions, with no movement to the right and left. And the pinky finger is responsible for the grabbing mechanism; again, it only has the upward and downward direction, with nothing for right and left. Now, if I switch to the next mode, the green one, which is the vehicle mode, the robotic arm remains in the position it was left in before switching.
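The software debouncing and mode cycling described above can be sketched as follows; the 50 ms settle window and the off, arm, vehicle press order are assumptions matching the demo, not lifted from our firmware:

```cpp
#include <cstdint>

// A raw reading only counts as a new press once the input has been
// stable for kDebounceMs; 50 ms is a typical settle window.
constexpr uint32_t kDebounceMs = 50;

enum class Mode { Off, Arm, Vehicle };

struct DebouncedButton {
    bool stable = false;        // last debounced level (true = pressed)
    bool lastRaw = false;       // last raw sample
    uint32_t lastChangeMs = 0;  // when the raw sample last changed

    // Feed one raw sample; returns true exactly once per physical press.
    bool update(bool raw, uint32_t nowMs) {
        if (raw != lastRaw) { lastRaw = raw; lastChangeMs = nowMs; }
        if (nowMs - lastChangeMs >= kDebounceMs && raw != stable) {
            stable = raw;
            return stable;  // true only on a settled rising edge
        }
        return false;
    }
};

// Each debounced press cycles off -> arm -> vehicle -> off.
Mode nextMode(Mode m) {
    switch (m) {
        case Mode::Off: return Mode::Arm;
        case Mode::Arm: return Mode::Vehicle;
        default:        return Mode::Off;
    }
}
```

Without the debounce window, the contact bounce of a cheap button registers as several presses, and a single physical press could skip straight from off to vehicle mode.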
We have placed the vehicle on a stand so that you can see how the motors rotate: pitching the hand controls the forward and backward direction, and tilting it right and left results in clockwise and counterclockwise rotation.

Let's talk about a few applications. Firstly, it can be used in the military: on the battlefield, you don't want soldiers running around, so you might have this kind of small vehicle that can transport objects from point A to point B. On the top right you can see a small ball which, when thrown into a fire, can quickly extinguish it. This kind of small vehicle can carry a few of them, place them in a burning house, and open the route for the firemen to go inside, which is sometimes a problem. It can be used in hospitals: it can transport medicine or papers from one room to another without any human interaction. And it can be used to help handicapped people and perhaps improve their quality of life.

I want to talk about the future work and where this project can be used in our university. For the rising seniors and other students in the engineering sciences, this can serve as a base: they can make improvements, automate some processes, and present the result as their capstone. A few things that we thought of: path planning for the robotic arm, and control of the differential-drive vehicle, could be applied here as projects. Also, this can be good for demonstration purposes in different classes we are studying. Some of the options we thought of are circuits, mechatronics, systems engineering design, and more, as these courses cover some of the concepts that we have applied here.

A few learning outcomes. This surely was a journey for us, and we have learned a lot throughout it. I want to talk about the 3D design and printing first.
We already had some experience from our previous courses, but we learned a lot of new features and tools, did our research on different filaments and infills, and dug into the process of 3D printing. We also improved our software knowledge, as we learned a lot about parsing IMU data, button debouncing, reading sensor data, and configuring the communication. We also improved our soldering skills; we tried our best not to burn anything down in the process, and we didn't. We learned a lot about the different types of motors there are, about their capabilities and limitations; we also used flex sensors, which were studied and given signal conditioning. Different types of Arduino boards were also studied to decide which board we actually wanted to choose, and a lot of research was done on our microcontroller. Also, this video right here that you can see is our initial test vehicle. We also automated it so that it can drive without the gesture control mechanism, using its own sensors.

Finally, I want to say that this project would not have become a reality if it weren't for the support of our supervisors, our dearest professors, and the whole College of Science and Engineering, which showed us its utmost support and gave us the motivation to pursue this idea. I also want to give a special thank-you to Professor Mitna Guranyan, who was our supervisor, and to Dr. Rubina Tenileva, Dr. Hrecha Kocharian, and Dr. Bilark Guranyan for their contributions to this project. And we also want to thank you for your attention. We're ready for your questions.

Could you please tell us whether you had any testing procedures? How did you achieve the performance you were showing? I'm sure there were many failures before you had it working the way it works now. What kind of strategy were you following, how did you test, and what phases did you go through?

Okay, I think I can start and then you'll continue.
I want to talk about the robotic arm. One of the parameters we had was the maximum weight it can bear, which was about 200 grams. That was a kind of manual testing: we picked objects and gradually increased their weight until we figured out that at some point, somewhere between 250 and 350 grams, the arm was having a hard time. The motor was still pulling it upwards, but it was doing so with great difficulty. We actually burned out one of the servos because of it. At first we didn't figure it out, because it didn't stop working right afterwards, but that was the cause. This was one of the tests that we applied.

Excuse me, can I additionally ask why you think the motor stopped working because of the weight? What happens inside? I don't know if you can explain it.

I can try. The three servos that you can see right here are actually different from these ones: they are much smaller, and I'm guessing that their applications are limited and they're just not made for bearing that much weight.

More electronically, for example, is it because of the current or the voltage? What is happening?

I think the reason is that we have gears rotating each other. We're applying a small voltage, but it translates into a very high force on the gears. So when we're trying to grab a heavy object, the gears slip out of their positions, and then the servo is not able to detect its position, because there is a sensor in the servo that detects its position as feedback. So it's not able to track its position.

It's not related to that. It's simply the increasing current: the current increases because of the load, and that's what burns out the motor. Anyway, okay, thank you. Should we continue with the testing part?

Go ahead.
From the software point of view, at first we tested on the test vehicle, and each module was tested separately: we have separate files for DC motor testing, separate ones for the servos, separate ones for BLE connectivity. So we tested each module separately, then tried to integrate them into one project and test everything together.

What do you call that separate testing? Component testing?

Component testing, and then integration testing.

In the software world, it's called unit testing rather than component testing. Okay, thank you. I have two questions. The first is more technical. I've seen you're using these flex sensors; I guess they're working as analog sensors, right? If you bend them, the resistance changes. I was hoping to see you use that resistance change in the control, rather than using it as a switch, and I've noticed that you're using it as a switch. Have you tried to use it as a proportional control, and why did you step back from that idea?

The main problem with this was the following. Imagine one of the fingers controls this servo: when you bend your finger, it moves, right? Then, when you switch to the vehicle mode, you need to remember which position your finger was in. If you move your finger a little bit and then switch back to the arm mode, the servos would suddenly jump to whatever positions your fingers were in. That's why we opted for direction control instead of position control, so that the user doesn't have to remember the previous positions.

Well, I mean, you're using an IMU on your hand, right? If you rotate your hand, that could be what moves the vehicle, and if you move your fingers, that could drive the arm, for example. Have you experimented with those kinds of models?
It could, but even in this configuration, when you're trying to control only two servos with one IMU, one with roll and the other with pitch, sometimes when you're doing pitch you're also doing a little bit of roll, because you don't have any feedback on how much you've turned in each direction. So it's a bit complicated for a person to track everything, and especially, when a finger is bent, your hand goes up a little: you're bending your finger and it introduces some pitch angle, which complicates things. So we tried to make it as simple as possible, and we used the sensors as switches rather than continuous controls.

Did you consider two gloves, so that the controls are decoupled a little? With thought, that could give incredibly dexterous manipulation.

Yeah, we actually thought about doing that. But at the end of the day, as also mentioned in our capstone report, we wanted to make it in such a way that, if we introduce this to someone who has no technical background, it would take us only five to ten minutes to teach them to actually use it, because we also wanted it to be used in real life. But yes, we had that in mind; we just opted for a simpler version, also because of the lack of time. And it would also be more complicated: imagine a wire going from one hand to the other; you would be limited in your motion.

And my second question: if you had more resources, like time, money, and expertise, what would you change in your project? Are there any things that you were planning to do but gave up because of the lack of resources?

One thing that I want to mention is that now, as you can see, we switch between the robotic arm mode and the vehicle mode, which means that when you're actually driving, you cannot pick anything up. We thought it was okay to have it that way, but sometimes you want to grab something on the go. So if we had more time, we would consider making the two work simultaneously. That's one thing from me.
Also, we could add automation: we could maybe place a GPS module on it and give it waypoints, so that it goes from point A to point B and grabs an object at given coordinates. So automation is another direction. Also, we showed four applications, but this one is a proof of concept; we could have built a real vehicle for one of those applications, not just a general vehicle that proves that this is possible. And the final thing: we have a Bluetooth connection, and it works up to about 10 to 15 meters, so that is also a limiting condition. I think we would work on something to expand the range of this.

Okay, thank you. A couple of questions from me. The first one: why Bluetooth versus Wi-Fi? And also, have you prepared a user manual?

Well, this is not a user manual, but it kind of includes how the device is controlled. It's not exactly step-by-step for the user, like "you do this and this happens," but it's more or less the same concept. And why Bluetooth versus Wi-Fi? Because we had this board available here, so we wouldn't waste time waiting for other boards to arrive.

Maybe Wi-Fi would consume more power?

Maybe. I don't think we have a power consumption problem here, though, because we have batteries that run for a long time.

Good, thank you. Thank you.