 So, we have introduced what ethics is overall and why we need to care about it in software engineering: it is easy and cheap to build products, it is quick to deploy them to everybody, and in many cases it is hard to take them back; once things are on the internet, you cannot just remove them. And as I already said, the focus on the product is maybe a bit too pronounced at the moment in computer science and software engineering. As a software engineer you need to take care of a lot of different things. What most other disciplines do, in engineering but also in other areas such as medicine, is that they have so-called codes of ethics. These are documents that state what your moral obligations are as a doctor or as a mechanical engineer, and you essentially subscribe to them. And in many countries, if you are in a professional organization, if you are a registered engineer or a registered doctor, then it is not just something that you put on your wall; you are actually obliged to follow it, and you may be liable if you act against it. In software engineering we are only at the start of that, because we are a relatively young discipline; software engineering is about 50 years old now. But we do have the ACM/IEEE Software Engineering Code of Ethics. ACM and IEEE are the two largest professional organizations in computer science and software engineering, and they launched this code. It covers a number of things, but in particular it lists eight principles that you need to take care of, and I thought I would introduce and discuss them briefly so that you are at least aware of them. First up, there is the public interest. As a software engineer, you should act in the public interest and, for example, only release products that are in the public interest. 
And this is a rather tricky discussion, because there are obviously some things where you can strongly argue that they are not in the public interest: for example, drones that kill people, military airplane software, these kinds of things. But then things get a bit more blurry, I would say. Think about the current debate around social media, Instagram, Facebook: products that make people addicted and change behavior, in particular of teenagers. The companies behind them, including those two, are probably aware of this nowadays, but when these products were launched, I am pretty sure there was a legitimate intention behind them and people actually thought they might help others. So it is not always that straightforward. If you are interested, there is also a game called Universal Paperclips, which is a really basic web interface. From an objective point of view it should not be engaging at all; it should be a rather boring game, but it happens to be really, really addictive. So these kinds of effects are maybe not that easy to predict, but addiction is another point we have to think about: making games that are addictive, but also launching products that take money from people who already have an addiction, gambling for example. Are you targeting certain groups because you can exploit them? That would definitely not be in the public interest. So this is very much at the level of what kinds of products we are launching. Is it, for example, deliberately separating families in airplane seating, something that is clearly unethical? In general, though, it is not that straightforward. So much for the public interest. The next principle: in most cases you have clients and you have an employer, and you should obviously also act in their interest, as far as this is possible and in accordance with the public interest. 
So, you should not, for example, hold features back from your client because you know they will be really important later on and you can charge much more money if you bring them up then. That would obviously be unethical. Similarly with the employer: I mentioned the Waymo-Uber case; you should not take code home and then take it to your next employer and use it there, because you are essentially stealing from your employer and acting against your employer's interest. There is, of course, a possible clash here. If you think about Dieselgate, for instance, you could argue that the engineers who built the cheating algorithm in the emissions software were probably acting in the employer's interest, but not in the public interest. That is why the statement in the code of ethics reads that you should act in the client's and employer's interest as far as it is in accordance with the public interest. Next up is the product, and this is not only about what product you build, but also about how you build it. You should not release a product that you know is under-tested or has potentially critical bugs that could lead to, for example, fatal injuries or financial loss; you should build and test the product in the best professional way. Again, this can often clash with other interests: your employer might push you to release as quickly as possible, but you know there are critical bugs left, so the ethical way would be to resist that decision. This is, of course, not an easy thing to do. Dieselgate is a good example; I would argue that some of the engineers who built those algorithms were very much caught in this clash. Am I resisting my manager, or am I just doing it and essentially working against the public interest? So, not an easy thing. 
As a software engineer, you are an expert in a certain area, and the expectation is that you maintain independent expert judgment. For example, if you are on an expert panel and asked to assess whether a product has the right quality, or to choose a product from a number of competitors, you should do this in an independent, professional way, without being influenced by lobbying groups or by one company paying you a lot of money to judge in their favor. So you should be independent in your judgment. Now, many of you will end up in management positions, so it is important that when you are managing people, you keep all of these things in mind as well. As a manager you make decisions, for example about when to release, and you should not make decisions that cause things to be released too early. But also, when it comes to your employees, you should really aim not to push them into ethical dilemmas, and also not to burn them out; you may know that you can achieve a release in a certain time, but only if they work way too much. So you have an additional responsibility towards your subordinates. Then, as I mentioned in the previous video, you have a responsibility towards the profession: that the profession is seen in the right light, has a good reputation, and acts professionally. This is something I believe we definitely need to work on in the next couple of years, because currently we are moving more and more in a direction where computer scientists and software engineers are not necessarily seen in the best light, because of all these issues coming up in the media about addiction, about dangerous products, about unethical products, and so on. So you have a responsibility to make the profession look good, and this of course relates to all the other points. 
The danger here lies not only in the products we release: if enough people are obviously engaging in malpractice, then no one will trust software engineers in the future. Imagine half of all doctors doing really strange things, harming patients, giving wrong diagnoses; people would stop going to the doctor. We face similar issues here. Now, most of you will work in teams, so you have a responsibility towards your colleagues to work with them in an ethical way. That means, for example, similar to the client, that you do not hold features back knowing that some colleague will probably have to implement them on the weekend; instead, you disclose information, you help each other, and you support each other. And finally, there is the principle called self. This is essentially about lifelong learning and keeping your knowledge up to date. It is your duty and your responsibility to keep learning, because things in the discipline change: techniques change, the knowledge changes. If you are able to build a product well right now, you might not be able to do so in 5 or 10 years if you do not keep learning about software engineering, programming, quality assurance, and so on. That is the bad news for those of you who want to graduate and then never learn anything again. This will not stop, especially if you are heading towards becoming a registered engineer. So, this is the code of ethics that you should definitely know, especially those of you who want to be engineers. Nevertheless, the question remains: does this help? We have this code, so is everything solved? No. There are actually studies showing that even if you give this code to people before they have to make certain decisions or implement certain things, they might still act against it; being more aware of it, they can sometimes even work around the issues. So this is only a starting point, of course. 
This is something that will probably need quite some time and a lot more debate. But as a starting point, it is really, really important that you are aware of this code and that it is not just some document that you can throw away and ignore. It will likely become much more meaningful in the coming years. As I have mentioned, in many other areas you are professionally liable if you violate your code of ethics, and we are probably heading in a similar direction here. So, that is what I wanted to say about ethics, just to give you a short introduction. In the last part of this module, we will now look into something completely different: productivity.