Yeah, so what we see when we try to translate sentences that are gender-neutral in English, for instance "I'm a doctor" or "I'm a teacher", is that the machine basically doesn't know who is speaking, so it picks a gender. In many languages you have to pick one specific variant, male or female. And so what we see is that when you translate "I'm a nurse", it comes out in the female form, while "I'm a doctor" appears more often in the male form, and so on.

The data sets are not diverse. They're very big, but no one checks whether there is good representation of every group in our society, and this happens on many levels. There's usually an over-representation of, say, white males in our data.

So yes, there is definitely bias when you look into translations created by machine translation systems at the moment. However, this bias has not been put there consciously. It's a side effect of the data we've been feeding them, and of our own biases.
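To make this concrete, here is a minimal sketch of how you could probe this effect yourself, assuming the Hugging Face transformers library and the public Helsinki-NLP/opus-mt-en-de English-to-German model. The speaker doesn't name a specific system, so both choices are my assumptions, and the exact outputs will depend on the model version; treat the results as illustrative.

    # A minimal probe of gender defaults in machine translation.
    # Assumes: `pip install transformers sentencepiece` and the public
    # Helsinki-NLP/opus-mt-en-de model (an assumed example, not the
    # system discussed in the interview).
    from transformers import pipeline

    translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-de")

    # These English sentences carry no gender, but German forces a choice
    # (e.g. "Krankenschwester" vs. "Krankenpfleger", "Arzt" vs. "Aerztin").
    sentences = ["I'm a nurse.", "I'm a doctor.", "I'm a teacher."]

    for sentence in sentences:
        result = translator(sentence)[0]["translation_text"]
        print(f"{sentence} -> {result}")

Running a probe like this typically shows the pattern described above: occupation words skew toward the gender that dominates for them in the training data, even though nothing in the English input specifies one.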