In today's world, it is not sufficient to design autonomous systems that merely repeat given tasks. We must push the boundaries, leveraging the current state of the art in autonomy towards smarter systems that learn from and interact with their environment, collaborate with people and other systems, plan their future actions, and execute their tasks accurately.

Tech Hub has a fascinating motion-capture facility: a 15 by 15 by 5 meter flight arena equipped with 12 Vicon V8 tracking cameras capable of tracking multiple flying objects in six degrees of freedom with millimeter-level accuracy.

My research is about the analysis of graph-structured data for computer vision applications such as human action recognition. We work on skeleton-based human action recognition by modeling the sequence of body poses in an action video as an evolving graph. Efficient perception methods of this kind, such as efficient human action recognition, are of great importance in robotics for human behavior characterization, human-computer interaction, and many other real-time applications with limited computational resources, such as embedded GPUs or CPU-only platforms.

In our research we build digital twins of robotic systems. We have developed the Aarhus University robotics toolbox, which allows users with little to no experience with robots or digital twins to obtain very high-fidelity mathematical models of robots. What we hope to achieve with this research is very high-fidelity digital twins of robotic systems, so that we can perform, for instance, payload estimation, safety monitoring, and predictive maintenance.

One of our research areas here at ALAP is safe vessel navigation through congested harbours and waterways. We use autonomous drones and state-of-the-art AI to tell the captain the distance between his vessel and all the surrounding solid objects.
This will enable the captain to have a safe trip from departure to destination. The prospect of this research is that, in the future, this technology may enable autonomous ships to operate in harbours with no captain at all.

At ALAP we also design autonomous surveillance systems for monitoring traffic patterns. We combine artificial intelligence with drones to learn anomalies autonomously and to notify human operators of any abnormality. Our ultimate goal is to learn anomalies in a fully autonomous way, to eliminate mistakes caused by human factors, and to improve the safety of society.

The research here is mainly about the application of mobile manipulators in the factory, focusing on force control and computer vision. For now we apply AI to enable the robot to detect humans at the workstations and keep them safe. In the future, I think mobile robots can work as co-workers in a factory and so improve the working efficiency of the whole factory.
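The evolving-graph representation of skeleton-based action recognition mentioned earlier can be sketched minimally: a fixed skeleton adjacency matrix plus per-frame joint coordinates as node features. The five-joint skeleton, joint names, and edge list below are illustrative assumptions, not the lab's actual model.

```python
import numpy as np

# Hypothetical 5-joint skeleton (names and edges are illustrative only).
JOINTS = ["head", "torso", "l_hand", "r_hand", "pelvis"]
EDGES = [(0, 1), (1, 2), (1, 3), (1, 4)]  # kinematic-chain connections

def skeleton_adjacency(n_joints, edges):
    """Symmetric adjacency matrix of the skeleton graph, with self-loops."""
    A = np.eye(n_joints)
    for i, j in edges:
        A[i, j] = A[j, i] = 1.0
    return A

def action_as_evolving_graph(pose_sequence, edges):
    """Represent a pose sequence of shape (T, J, 3) as a list of graphs:
    one fixed adjacency A shared across frames, plus per-frame node
    features X_t (the 3-D joint coordinates at frame t)."""
    T, J, _ = pose_sequence.shape
    A = skeleton_adjacency(J, edges)
    return [(A, pose_sequence[t]) for t in range(T)]

# Toy example: 4 frames of random 3-D joint positions.
rng = np.random.default_rng(0)
poses = rng.normal(size=(4, len(JOINTS), 3))
graphs = action_as_evolving_graph(poses, EDGES)
print(len(graphs), graphs[0][0].shape, graphs[0][1].shape)
```

A recognition model would then consume this sequence of (adjacency, features) pairs, e.g. with a spatio-temporal graph network; the sketch only shows the data representation, which is the part the transcript describes.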