Have you ever wondered why you're seeing certain ads for job positions online? Online platforms allow advertisers to deliver job ads to carefully selected audiences. My name is Alexander Pirang. I'm a researcher at the Alexander von Humboldt Institute for Internet and Society in Berlin. Research shows that targeted advertising can perpetuate gender stereotypes or discriminate along racial lines. These risks call for new approaches to online advertising. At the Humboldt Institute, we recently hosted a virtual clinic that examined the ethical implications of targeted advertising. The clinic brought together 12 fellows from five continents and eight disciplines for two intense weeks.

Gender stereotyping in targeted advertising matters to me because it's an example of the intersection of gender, democracy, and technology. Let's say you're a woman looking for an engineering job. As you browse your Facebook or LinkedIn newsfeed, you might never be shown the ad for that job, because the platform decided you're more interested in seeing, say, a jewelry ad, or because men browsing their newsfeeds at the same time were judged more relevant for it. Because of that, you might not be presented with that opportunity. I would say this is a systemic problem, because it's often unintentional. Advertisers want their content to reach their target audience, and platforms also want to maximize the match between the advertiser's content and that audience. But during this process, because of the data collected and because of the biases and stereotypes that already exist within society, those biases are reinforced by algorithmic distribution, because the system is designed that way.

The objective of the clinic was to produce actionable, fairness-oriented solutions. The fellows produced three sets of guidelines that cover the whole targeted advertising spectrum.
These guidelines correspond to the three main areas of the targeted advertising process. The first area is ad targeting by advertisers: how advertisers place ads on the platform. The second is ad delivery by platforms: the platform's decision about which users get to see which ads. And lastly, there is the ad display to users: which users actually see which ads, and whether they are provided with additional information about them.

Here are the solutions our fellows proposed. First, for ad targeting by advertisers, they called for a legality-by-default approach that prohibits discrimination based on sensitive criteria. This ensures that marginalized groups are not prevented from receiving certain ads. They also called for a feedback loop that informs advertisers of potentially discriminatory outcomes of their ad campaigns. This adds much-needed transparency to the space and ensures that platforms and advertisers can be held responsible for potentially discriminatory outcomes.

Second, the fellows addressed how platforms optimize ads for relevance. They proposed a user-centered approach that puts users in charge of their own advertising profiles. What's very important for users to realize is that you leave a data footprint behind you. You should be aware that all of these platforms are collecting your data and using it for their own purposes and objectives. None of these systems is built in a neutral way.

Lastly, the fellows examined how ads are displayed to users. Currently, users are given very little information about why they see certain ads and why they do not see others. The fellows proposed an avatar solution: a user-friendly, gamified tool that visually communicates to users how platforms target them with ads. You can see what the avatar could look like and how you could interact with it. Specifically, it allows you to choose and select the criteria and preferences that are decisive for how ads are delivered to you.
Targeted advertising raises crucial questions. Who gets to see what on the Internet and who decides why? This matters to all of us and we as a society have to negotiate these issues.