Welcome, everyone. I'm just going to give it a minute or so for folks to file in virtually to the space. Welcome, everyone. Welcome to the first seminar in the 2020-2021 Health Law and Policy Seminar Series, convened at the Schulich School of Law in Halifax, Nova Scotia. My name is Matthew Herder. I am the Director of the Health Law Institute here at Dalhousie. The HLI is an interdisciplinary institute comprised of scholars from several different fields, including law, but also health promotion, occupational therapy, nursing, and social work. We also have policy analysts and a growing number of student fellows engaged with the institute, and together we are dedicated to improving social justice, health and well-being through our research, our teaching, and our public service. The seminar series, which is now in its 23rd year, is our flagship public event. To date it has been held in person; it is, of course, online this year, which is a first for us, and we're so pleased you could join us online from wherever you are today. Curated and developed this year by our Associate Director, Professor Adelina Iftene, with input from all of the HLI's members, and coordinated so well by Ashley Johnson, our administrative assistant, this year's series is focused on justice and diversity. A focus which, it goes without saying, cannot be ignored with so much injustice in the world today. Injustice and inequality, which the present pandemic deepens by the day. And I can imagine no better speaker than Lana James to open this year's series with that focus. Before I outline a few logistics and introduce our speaker, I would like to begin by acknowledging that we are in Mi'kma'ki, the ancestral and unceded territory of the Mi'kmaq people. This territory is covered by the treaties of peace and friendship, which Mi'kmaq and Wolastoqiyik people first signed with the British Crown in 1725.
The treaties did not surrender lands and resources, but in fact recognized Mi'kmaq and Wolastoqiyik title and established the rules of what was supposed to be an ongoing relationship between nations. In terms of logistics, today's seminar will of course be a little bit different for those of you who've attended in the past, so I want to note a few things before we get going. First, we'll be recording the session. My understanding, though, is that it will be mainly the speaker's presentation that is captured during the course of the event. Second, there is also live captioning during the event, which everyone should be able to see as the presentation proceeds on your screen. Third, you may have already noticed that there is no chat box in this Zoom webinar. We're going to instead use the question and answer function at the bottom of the Zoom screen. But we would kindly ask that you hold your questions until the end of the presentation. Lana will speak for approximately 40 minutes or so, leaving a fair bit of time, roughly 20 minutes, for discussion. I will field questions in the question and answer box and I'll read them out loud for Lana, who will get a chance to respond to each in turn. Before I turn it over, let me introduce our first speaker in this year's seminar series, Lana James. She is a public intellectual and a scientist. Her career spans the private sector and public service. She examines how artificial intelligence, or AI, disrupts the practice of healthcare and medicine while increasingly redefining rehabilitation, public health and healthcare systems. Her research lies at the intersection of race, ethnicity, clinical care, data privacy, AI, and the law. She is a doctoral candidate at the University of Toronto in the Faculty of Medicine. Please join me in welcoming Lana James to our seminar series. Her presentation today is entitled "Algorithmic Racism, Healthcare and the Law: Race-Based Data, Another Trojan Horse?"
Over to you, Lana. Thank you, Matthew. I appreciate that. And I will start with sharing the screen so that we can all go through this together. Okay. All right. Can everyone see the screen okay? So I'd like to say thank you for welcoming me here. And I'd also like to acknowledge and thank the Mi'kmaq First Nations of Nova Scotia and the Health Law Institute for inviting me to have this pressing and important conversation. And as Matthew said, my name is Lana James. And I would also like to do a land acknowledgement because I am in a different territory. I am sitting in Toronto, Ontario, as I give this talk to you all. And that means I'm in the territory of the Haudenosaunee First Nations, the Huron-Wendat, the Ojibwe, and most recently the Mississaugas of the Credit. This territory is covered by the Dish With One Spoon Wampum Belt Covenant, an agreement between the Haudenosaunee Confederacy and the Anishinaabe, Ojibwe and allied nations to peaceably share and care for the lands and resources around the Great Lakes. And I always invite everyone to think about how we might end the occupation in which we've been engaged through the colonial process. And the next thing that I do whenever I give a talk is ask everyone to join me in a moment of silence and three deep breaths. The moment of silence is in recognition of the people around the world, next door, down the street and across the country who are struggling for justice, for a just life, and for the opportunity to live in a free, kind and caring world. And so I'd like to take that moment of silence, joined by three breaths. So first, a moment of silence. And now I ask you to take three deep breaths. And now I'd like to begin with the provocation. And so the provocation is the question: is race-based data another Trojan horse? But I think first we need to revisit: what is a Trojan horse? And as Matthew said, the title of this presentation is Algorithmic Racism, Healthcare, and the Law:
Race-Based Data, Another Trojan Horse. And here we go. As we think about the provocation, it's important that we also think about the context. Context is everything, and as the saying goes, the devil is in the details. And as we go into this talk, I want us to be very conscious and cognizant of the location of this talk, which is in Canada, to an audience that is residing within Canadian territories. And that's very important because often when we think of discussions about data, we are automatically pulled into a discourse and a conversation where we are looking south of the border rather than within the configuration of this colonial nation state. And that's really important because one of the pieces that we're going to discuss is how we ended up with the current calls for race-based data. And the story goes that those calls come from folks looking next door. And so we have to understand that the United States is, of course, different from Canada. In the United States, they organized their political system, not long after its inception as a nation state, into the formation of overt racial capitalism, where race not only defined relationships but was a defining characteristic of the law. So that means that it stated very explicitly who was a person, who belonged and who did not, who had the ability to use and access the law and who did not, in explicit terms. And racial capitalism can be seen throughout, in terms of who was allowed to conduct business, who was allowed to hold land deeds. And this affected, in particular, black folks and indigenous folks, because we were the first two sets of peoples, in different ways and configurations, of the Americas. And so in the US, racial capitalism was front and centre in its laws and in its administration of day-to-day life.
So when we look at Canada, for those of you that know the history of what is now called Canada and how racial formation took place here, Canada took the opposite approach to the United States. It was equally invested in race, equally invested in racial capitalism. Canada has been and continues to be organized according to a racial hierarchy in which, if you look at it, you will see that assets, power and decision making follow a hierarchy in which white-skinned Europeans, and those who are perceived as such, are at the top. So in Canada there was a different approach, an implicit approach, meaning it was still very clear who could and could not go into various establishments. You know, segregation was practiced throughout Canada, both by policy and by social construction. And we know that because we have, across the country, cemeteries that were declared for colored folks only. Nova Scotia, Ontario, Alberta: all the provinces of what we now call Canada have had rules and regulations about where black people could and could not go, as well as where indigenous people could and could not go. And that has been the case from the inception of the colonial project until now. However, due to later forms of immigration and globalization, other peoples of color came to and were recruited into the national project. And so we must understand that because Canada chose an implicit form of racial capitalism, situating race as an implicit organizing factor, race was in fact part of our organizing, but unlike in the US, you will not see it as robustly represented in explicit terms. However, that does not change the fact that it is an organizing frame. And so in the US, they organize all of their statistics by race. For anyone who holds an American set of identifications, those documents tell you your race.
It says whether you are black or white: on your birth certificate, on your driver's license, on almost all your identification, and on your social security. And that is how the United States' racial apartheid system functions. Canada was a little more subtle, but nonetheless we are also organized in the same fashion. However, our race is not on our birth certificate, and it's not on our driver's license, and in that way we differ. And so what that means is that we have a different tale to tell. And so it's important that we don't rush over and think that because the United States has racial data, that's a wonderful thing. As I move on to the next slide, I want to point out that the racializing of data, as of people, is a project of racial capitalism and apartheid. Right. So it's important to understand that we don't have racial markers in the world for any other reason than to initiate, sustain and promulgate racial apartheid. Before the organizing of capital around race occurred, which is before 1492, and in particular in the United States of America before 1601, we did not organize ourselves globally according to race. Right. That was not a thing. This is a modern social construction. And so there is no existence of racial markers in disaggregated data without harkening back to and moving back into the formation of racial apartheid. That is why we have racial markers, both in Canada and the United States. So where the United States wanted to be very focused on managing and preventing relationships between white folks and black folks and non-white folks, Canada also managed that, but through a different set of means. And so I think it's very important that we understand that we are not actually making a social justice move in and of itself by asking for racial markers within disaggregated data, because racial markers arise and exist solely to maintain racial apartheid. We have since tried to relanguage that, but it doesn't change its origins.
So, moving into the topic at hand that has us talking about racial markers: what is a Trojan horse? You can go to Wiki and see, but more importantly, I hope you read the ancient story so that you understand it more in depth. This is a story that tells us about a 10-year war that involved the city of Troy and a siege. And in order to end the siege, there needed to be some kind of move made. And that move resulted in the construction of a wooden horse, and that wooden horse was given as a gift. It was given as a gift, in one telling of the story, to the goddess Athena. And it was to say, basically, you know, we lay down arms, we're done with this war, take this gift, offer it to Athena, and hopefully we can start off on a new foot. Well, needless to say, that is not what happened. What happened, in fact, was that within that horse there were soldiers. A select force of soldiers, the story tells us, who, when the city was dark at night, came out and were able to overcome their enemy. The other definition of a Trojan horse is the one we know better, and the way I used it in the provocation. It speaks to someone or something intending to defeat or subvert using deceptive means. It is also, fittingly, used to describe a computer program that has a concealed set of instructions that perform an illicit or malicious act, such as destroying data files. And this, I think, is why the provocation is so suitable. And so one of the questions I want to bring forward when we ask whether race-based data is a Trojan horse is to put forward another set of critical things to think about. We know the difference between the US and Canada: the US was explicit about how it wanted to mark each person's body, especially the bodies and lives of those who were not white, to make sure that it could track, control and restrict through its formations of racial apartheid. We know in Canada we had similar practices.
However, they were not done by putting our racial identification on our actual documents. But we were in fact tracked, and we did have restrictions about where we could go, who we could be with, what we could purchase, and when. And so we have to ask the question as we push forward into this conversation about disaggregated data and race-based data. And to clarify: disaggregated data almost always contains race-based data as one of the characteristics. However, race-based data speaks specifically to the demarcation of race or ethnicity. And in our conversation I'm going to use race and ethnicity interchangeably, not because they are one and the same, but because they function that way. And so one of the questions I want us to ask ourselves is: how might the focus on data collection be a tactic of distraction? This is a conversation happening amongst many data justice people and scholars. So what do we mean by a distraction? In Canada we have a plethora of data that does in fact have, quote, racial and ethnic markers. However, it requires users to do a little bit of work to use and coordinate that data so they can get a fulsome picture. Unlike the United States, which is invested in a different expression of apartheid, where you're constantly just able to click and take a look and then interpret that data without context, in Canada you must actually engage with information in order to extract that information and use it. And so I want to point back again: if the focus on data collection, particularly on race-based data, is a tactic of distraction, we can honestly consider how it is subverting action into acts of observation. And when we look at the amount of energy it takes to collect, curate, store, and negotiate data, we have to ask: what are we assuming? What is the assumption built in?
And the assumption built into data collection via disaggregated data, which often includes race-based data, or via race-based data specifically, which is what the call was for, is this: we are also moving into a commitment to technological determinism. And what does that mean? It means that we are saying that technology has the answers and is the path. And how do we come to that? Well, data is one component, but data does not work in and of itself. And so we need to engage with technology in order to be able to use that data in a particular way, to use it to count and to curate. And so we have to ask ourselves: how are we doing that, and why, and who does it serve? The second component of this provocation is data as a tool for evaluating the presence, access, and degree of human and civil liberties. So many proponents say we need to know people's race. We need to disaggregate the data because we need to evaluate whether human and civil liberties are equally present across all groups, whether all groups are accessing them, and the degree to which they are able to access and enjoy their human and civil liberties. And I think it's important also to differentiate between data and information. One of the concerns I have about the current discourse is that it is not engaged with critical data studies. And it is not necessarily completely engaged with the 21st century and the realities of AI and how data moves, unlike previous calls in previous decades. And so, using a definition that you can find online (you can see the reference there): data is a collection of values. Those values can be characters, numbers or other data types. If those values are not processed, they have little meaning to a human. Information, on the other hand, is data that was processed so a human can read, understand and use it. The example here goes further to say the P in CPU stands for processing, specifically data processing.
Processing data into information is the fundamental purpose of a computer. And so I want us to highlight that data is attached, in the modern period, our period where we are right now, to computation, right, and to computers. And note that we're talking about the processing of data, and it is done by computers, not by humans, right, and increasingly less so. And that brings us back to the other part of the title of the slide, which is the issue of power asymmetry. Power asymmetry is the difference between who holds power and who has to respond to the power that's being held. So currently we have seen marches across the country. We have seen huge protests. We've seen sit-ins. We've seen hunger strikes. We've seen calls to defund the police because of the abuse that is involved in power asymmetry. And we must understand data is different from information. Data allows those who are already in power, in the status quo, to define what becomes information, versus the calls that we have for justice, which are about using the information that we, black folks, folks living with disability and at multiple intersections, have shared in our demands for justice. And so there are hundreds of reports across provinces and school districts identifying issues around racial inequities and racial injustices that function at multiple intersections. So I want to clarify that when I'm talking about race-based data, I'm talking about it on multiple axes. What does it mean to be a black woman, a black woman who is disabled, a black woman who speaks a different language, a black woman who may or may not have citizenship? I am not talking about race as a single static category, because my life, for instance, is not lived in a single static category. And so it's important to make sure that we understand there's a difference between data and information. Data has no meaning in and of itself, and information is data that has already been processed.
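[Editor's note: to make the data-versus-information distinction concrete, here is a minimal sketch in Python, not part of the talk itself. The byte values and the schema are hypothetical illustrations; the point is that the same raw values carry no meaning until whoever controls the processing imposes a schema.]

```python
# "Data": a collection of raw values, opaque to a human reader.
# These particular bytes are a hypothetical record, chosen for illustration.
raw = bytes([0x34, 0x37, 0x2C, 0x46, 0x2C, 0x42])
print(raw)  # b'47,F,B' -- still just values, with little meaning on their own

# "Information": the same values after processing. Note that the schema
# below (age, sex, race code) is a choice made entirely by whoever runs
# the processing, not by the person the record describes.
age, sex, race_code = raw.decode("ascii").split(",")
record = {"age": int(age), "sex": sex, "race_code": race_code}
print(record)  # {'age': 47, 'sex': 'F', 'race_code': 'B'}
```

The asymmetry the speaker describes lives in that second step: whoever holds the computational power decides which schema turns data into information.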
Power asymmetries are central to understanding, because the power of whoever is processing, collecting, and storing the information determines what happens to the data and whether it becomes information. And there has been increasing discussion around community governance models and secretariats. And I also want to put a caution here. The purpose of this talk is to incite critical thinking and discussion, and to raise literacy and awareness of how technologies, and the technologies of AI, fundamentally change the landscape of the demands that we are making and have made. Right. And so this demand in this moment, around disaggregated data being unleashed into the system without the kinds of protections that I'm going to talk about very shortly, is a recipe for disaster. And I want to point out that this is an article that I wrote that came out in The Conversation just a few days ago, and I go through some of the reasons in that article as to why we need to be very careful about how we enter and how we push for demands. And because this is a discussion of race-based data, I want to mention Snowden here, who was gracious enough to wake up the world to understanding that data has the power to define what is and is not possible for your life when it is turned by a government on citizens. And it has come to pass: you know, he made that call in 2013, and it is only this year that the US courts decided that in fact he was correct, that it was unconstitutional, and there are deep concerns about what is going on. And I highlight that because data cannot be governed in the manner that we once thought. Data does not just go and sit in a file until you call it and it comes. Data can be moved, as long as it's on an interoperable platform, to wherever and to whomever commands it, much like a genie in a bottle. And that was the point of Snowden's work.
It was to highlight to people that we need to be very careful about who collects data. One of the most dangerous holders of data, and one of the longest abusers of data, has been our government and governments around the world, and we're seeing it as governments shift hard to the right, as historically we've seen before. In the beginning of the 20th century, when we saw the rise of fascism, we saw that data became very central in the genocide of peoples, and we saw that without the ability to access massive amounts of data, South African apartheid would in fact have been nearly impossible to manage. And interestingly, the same players that made the pass book of South African anti-black apartheid possible, such as IBM, with Watson the supercomputer, are also the ones who are giving us the technology to manage the very same data sets. And so I want to point out a lesson that should be learned, in power and democracy, for those of us who are in the conversation around race-based data and disaggregated data. What we saw in AI is the use of big tech power to actually structure a conversation, and I'm actually going to put another provocation forward. I'm going to suggest that our current focus on data is following the path that Silicon Valley has laid out for us. And I say that because, just as they wanted the conversation not to be about the laws and regulations that are needed to hem in big tech's use and abuses of our data, they spent a lot of money and resources swaying that conversation to make sure that we spent time talking about ethics and governance, all of which are toothless, without any binding ability, and actually have no power to restrict or prevent the abuses that laws and regulations and penalties in fact could. So I would like us to really meditate on this example. And this is an important story you can find in The Intercept.
And I think it's important that you read through it. The title is "The Invention of Ethical AI: How Big Tech Manipulates Academia to Avoid Regulation." In this story, you'll see right here, quote, the discourse of ethical AI was aligned strategically with a Silicon Valley effort seeking to avoid legally enforceable restrictions of controversial technologies. This is important in this conversation because those are the technologies that will be used to process your data. In Ontario, for example, the provincial government, without any bidding process, provided a $20 million contract to Salesforce. Salesforce has recently been hit with a $10 billion class action suit under the EU's GDPR for violation of the regulations around data in Europe. Salesforce is also alleged to have issues around human trafficking. And so Salesforce is one of the largest companies in the US that does what they do, and they are in the midst of a lawsuit, as are many of the tech companies whose technologies will be used to process this data. It's important not to separate these two things. Data has to be processed to become information, and data has to be stored in order to be shared. And so Snowden reminded us that the issue is not agreements on how you store it or agreements on how you share it, but not collecting it in the first place. And if you are collecting it, collecting it in ways where the retention, maneuverability and manipulation of that data are addressed before, not after, it's collected. And you can watch any one of his lectures. In fact, he actually gave a lecture here at Dalhousie about a year ago. And so I think it's really important to understand that when an individual has risked his life, amongst so many others, to make sure that we become literate and aware of how data and technology go together, we should not ignore that warning, as happened in the story of the Trojan horse.
The story includes a prophecy: you know, the people were made aware that inside of that Trojan horse there was potentially a problem, and the warning was ignored. And that is how the city fell. And so I offer this caution with that provocation, to point out that we already have examples of how data has been and is a Trojan horse. And we saw that with Cambridge Analytica. In the Cambridge Analytica scandal, they told us themselves: they just needed 5,000 data points to help restructure the potential outcome of an election. And that concern is still with us. So we need to understand that when we talk about disaggregated data and race-based data, and we talk about doing that with the government, we need to understand that it is the government and its partnerships with private corporations that continue to impair the lives of everyday citizens who pay their taxes and who live here and who contribute. And it's really important to understand that we have already seen the playbook of how technology will change, restructure and distort our demands. And so I also want to put this on the table. One of the arguments for race-based data has been: we need to make sure we know what's happening to the different populations and ensure that they're being treated equally. And I want to point out that one of the things that has not been done in these demands is to ensure that the actual laws and regulations around the collection and use of data precede the collection of it. Right. So we've had individuals, who have many vested interests, not all of them shared by the public, push a demand for data prematurely. And we have to ask why, because right here in the Canadian Charter of Rights and Freedoms, it says that every individual is equal before and under the law and has the right to equal protection and equal benefit of the law without discrimination.
How will that be possible without data laws and regulations in place, when we have already seen, and we have numerous articles in computational science, in data science, in statistics, that warn us that not only are the data and the algorithms used to process that data problematic, but that they have in fact already acted in a discriminatory fashion? So we have to ask the question: why on earth would we ever add more race-based data, because we already have race-based data, that will be open to more parties and have fewer controls, without the laws and regulations to ensure that that data does in fact support each of us being equal under and with the law? And so I'm arguing that in fact we need to make sure that our data laws are designed to fulfill and use the beneficence offered to us by subsection one. So we are allowed to organize ourselves in Canada, and we are supported in doing that, to make sure that we proactively protect populations and groups who have historically suffered at the hands of injustice. And the question is: why aren't we doing it with data? To suggest that community governance structures and ethics can manage the power of governments, who prorogue, who shut down legislative debate, who are currently misusing and, many would argue, abusing their powers under the pandemic to prevent lawful discussion and protest. We have to ask ourselves, what does it mean to fully hand over our personal information to governments, to private corporations, to researchers and scientists who have already been documented misusing and abusing data? And I want to point to this: "Hey, Alexa." This article is a good kind of tipping point for us to understand. This gentleman here in the picture is a former NSA executive, and he is the one who was intricately involved in taking up all that data, collection the US courts have actually said was unlawful. He is now part of Amazon.
So the gentleman who was part of the brain trust that undermined the constitutional rights of Americans, collected that data, and understands the system intimately is now in a high-powered position at Amazon. Amazon, the company that your federal government is in contract with, has access to and is smack dab in the middle of our federal data cache. And that's important to understand, because currently the discourse around data is making assumptions that this data is going to follow some path of the 1990s or early 2000s. And so the second provocation is here: why would the cart (data) come before the horse (data laws), and who benefits? And to further my point, here's an example of data. So here we have what seems like an almost incomprehensible string of alphanumerics. And in the second box, you have an example of information. This is very important to understand: the difference between data and information. I am part of REDE4BlackLives, and we have held the largest pan-African and, most importantly, pan-Canadian conversation around race-based data. We have had a conversation that began in May and continues, where we are in conversation with one another to think critically, and we are expanding that conversation not just to the policymakers and the academics and the private sector, who all have vested interests and money to gain, but to you, the people from whom your data will be taken, for whom your data will be repurposed, and who will not have control of your data once that happens. And I think it's very important to come to terms with the fact that once you release your data, you no longer have control. No community governance committee, no secretariat can guarantee you that, because once it looks like this, you require resources and computational power in order to have it mean this.
So many people think, I'm going to share my race; I'm going to share that I am, for instance, black, Caribbean, and from X location. Well, that's not actually what goes into the computer; a combination of zeros and ones goes into the computer. So "black" actually gets weighted; it becomes something else, something that you cannot define yourself, because it is inside the computer and out of your control. And I think it's important to understand that when you give it away, you have done just that: you've given it away. And the current conversation around disaggregated and race-based data is not happening within a framework of informed consent and informed refusal. Many people have not had this wonderful conversation we're having now. They have not been given an opportunity to gain the data literacy of the 21st century. They do not have the opportunity to understand how data ecosystems work, and even more importantly, they have no understanding of who holds the contracts for that data, who has the ability to move it around, or whether it's resting within Canadian jurisdictions or not. Is it cloud-based? Well, then it's pretty much out of your hands for sure. Is it on a server down the street? Maybe then it's still within someone's oversight. But to understand what race-based data and disaggregated data are, we must understand how they will be used. And I want you to continually ask the question: when we put the cart before the horse, who benefits? We saw in previous slides that who benefits are corporations, and governments who are in business with corporations, and that business is not always in the public interest, as we can see. And so, as I come to the third provocation, I ask: how has failing to question the death drive to datafication replaced demands for action on the data we already have?
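[Editor's note: a minimal sketch, with entirely hypothetical categories and weights, of the encoding step the speaker describes: once a self-description like "black" enters a model as zeros and ones, it is weighted in ways the data subject never sees and cannot define.]

```python
# Hypothetical category list chosen by whoever designs the system,
# not by the person describing themselves.
categories = ["Black", "Caribbean", "White", "Other"]

def one_hot(label):
    """Encode a category label as a vector of 0s and 1s."""
    return [1 if c == label else 0 for c in categories]

vector = one_hot("Black")
print(vector)  # [1, 0, 0, 0] -- the word is gone; only a position remains

# Inside a model, that vector is multiplied by learned weights that the
# data subject never sees. These weights are illustrative, not real.
weights = [0.73, -0.12, 0.05, 0.0]
score = sum(v * w for v, w in zip(vector, weights))
print(score)  # 0.73 under these illustrative weights
```

The design choice to flag: the category list and the weights are fixed upstream by whoever controls the pipeline, which is exactly the power asymmetry described earlier in the talk.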
So datafication is the process of taking information and reducing it to data bytes, reconstituting and manipulating it in ways that algorithms, deep learning, and different kinds of AI subfields can process. And that has very little to do with the justice demands that we have. So again, the third provocation is: how has failing to question the death drive to datafication replaced demands for action on the data we already have? So we already knew, before COVID-19 happened, what happens to Black people and Indigenous people in colonized territories and people of color, because pandemic has been a fundamental, central, and important organizing feature of how colonization happens. If you read the epidemiologic record and you understand the process of colonization, you know full well, whether we're talking South America or North America or Central America, that the Europeans who arrived on those shores did not win because they fought valorous battles. They won because their bodies contained diseases that Indigenous peoples in these territories did not have, and most of the warrior class was ill, sick, or dying by the time those confrontations happened in Central America, South America, what we call the United States, and what we call Canada. And so the pandemic was immensely important. Without the pandemics of measles, rubella, chickenpox, and smallpox, there would have been no way they would have been able to wrest these lands from the stewards for whom they are home. And so we know also that we're in a moment of pandemic. And for those of us who understand the histories of Black and Indigenous peoples in the Americas and around the world, we know that without pandemic, we wouldn't be in the current state of colonization that we're in. And so when we allow the pandemic to drive the speed at which we are moving, and drive the demands for race-based data, we are actually falling into the perfect strategy of colonization. And we would call it recolonization this time.
So I am going to suggest that it's really important that we understand the role of pandemic in who and what we've become as Canada, a colonial state that occupies Indigenous land without full respect for the Indigenous peoples who are here. This is possible because of pandemic. And in this moment, recolonization is actually occurring through the process of datafication. And so we have to balance our request for data with the necessity of protecting against datafication. And the second part of my provocation is: how is datafication devaluing the information that Black people provide? So we have, as I said, a plethora of reports: in education, in the carceral system, in the health system, for veterans. We have document after document, recommendation after recommendation. We've had protest, and this has been going on for hundreds of years. We have warehouses of data, not only on the problem, but on how to fix it. And so the question is: why are we now going down the road of big tech into datafication, rather than holding our ground and demanding that they act on the data we have for Black people? We had the 2017 UN report, and it was scathing; it identified and spoke directly to the entrenched racial apartheid that is Canada for Black people. There were very specific recommendations in that document and several others. Those have not been taken up. Had that document been acted on in 2017, we would not have been looking south of the border to ask for data that belongs to the American racial apartheid system. And so, again, we come back to the provocation: is race-based data another Trojan horse? And I'm going to say yes: it is a Trojan horse, because we've already seen what has happened with big tech and their distorting of our conversations, driving us to datafication. We've already seen the misuse and abuse of data. And we've seen it already here, as a case example. There is a Vice article, and I encourage you to read it.
And that article speaks very specifically to an accelerator where they're taking your health data in the province of Alberta, taking the data out of your health files and out of family and social services across the province, with your name and address attached, and handing it over to the police services, the same police services that have been and continue to be dogged by both allegations and proofs of mistreatment and murder of Black and Indigenous peoples, as well as other peoples, to now take that data and use it for the purpose of business. We must understand that when we make a demand for data and not a demand for justice, we are actually fueling a system and refurbishing a system of racial apartheid in Canada. And so we need to also make sure that we are raising the literacy and pushing informed consent and refusal. We need to understand that we have lots of data and we have not acted on it. So are we deluding ourselves to think that more data is going to do anything else, when we have hundreds of years of data of injustice that has not been acted on? So let us be careful that we are not on a fool's errand. Let's also be clear that there are different types of data, and that we have to raise the literacy around those types of data so that we can have informed consent and have the choice of informed refusal. So there is some data that we can collect safely in the current conditions, and there is the vast majority of data that is not safe to collect under the current conditions, because the technologies that we're living with can re-identify your data, and that data can end up in countries, in places, and with people you know nothing about and did not consent to. And more importantly, we don't need to only worry about what's out there.
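The claim that today's technologies can re-identify "anonymized" data is well documented in the literature as a linkage attack: records stripped of names still carry quasi-identifiers (postal code, birth year, sex) that can be joined against a public dataset that does carry names. A minimal sketch, with entirely invented records:

```python
# "De-identified" health records: names removed, quasi-identifiers kept.
health_records = [
    {"postal": "B3H 4R2", "birth_year": 1984, "sex": "F", "diagnosis": "asthma"},
    {"postal": "B3J 1A1", "birth_year": 1990, "sex": "M", "diagnosis": "diabetes"},
]
# A public dataset (e.g. a voter roll) that does carry names.
voter_roll = [
    {"name": "A. Smith", "postal": "B3H 4R2", "birth_year": 1984, "sex": "F"},
    {"name": "B. Jones", "postal": "B3K 2C3", "birth_year": 1975, "sex": "M"},
]

def reidentify(health, public):
    """Join on quasi-identifiers; a unique match re-attaches a name."""
    keys = ("postal", "birth_year", "sex")
    hits = []
    for h in health:
        matches = [p for p in public if all(p[k] == h[k] for k in keys)]
        if len(matches) == 1:  # unique combination => re-identified
            hits.append((matches[0]["name"], h["diagnosis"]))
    return hits

print(reidentify(health_records, voter_roll))  # [('A. Smith', 'asthma')]
```

The combination of postal code, birth date, and sex is unique for a large share of the population, which is why removing names alone is not anonymization.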
It has been our governments, as it was in World War II, when IBM sharpened its teeth on genocide, one of the largest holocausts, followed by apartheid South Africa, where data was again instrumental, with tech companies leading the way in organizing that data. So as a country that has not actually let go of its apartheid regime for Black and Indigenous peoples, it's important to understand that without laws in place first, before we go on to the mass collection of data, we too might find ourselves in the same place as the NSA, unlawfully collecting data because we have not protected the people to whom that data rightfully belongs. And so as I end this presentation, here are some tools for informed consent and refusal. We must have debate and discussion, like we are having here. We must make power visible, and the reorganizing of power a priority. Simply talking about community governance models and secretariats, as has been mentioned by some provinces, is insufficient. We saw here in Ontario how, with the change of government, we had a secretariat on anti-racism collapsed, compressed, and turned into a site of surveillance. And so we must really understand that these are dual-use technologies. The same data that we want to track human rights and civil liberties with is the same data that is used to track and create social credit systems. We must go through transformation as the path to decolonization. We must transform our relationship to power. We must reorganize power. We must raise the literacy so that we can all participate in this discussion, and not have the power brokers, the government, the tech companies, the researchers and academics take over and decide for us what is best. And so in closing: our collective survival requires that we question the premise and the past, because data won't save us.
Acting on the information that we already have, and believing Black people and people living with disabilities when they speak, trusting them as the reliable narrator, means that we act. We don't need to further observe and collect on suffering. We act to end it. And for more information, to get a good understanding of the complexity and the diversity of the discussion around race-based data and disaggregated data, please go to Ready for Black Lives. Go to our YouTube channel. We've had a COVID conversation symposium where this was the question, across 12 different conversations on health. And you can download them, you can watch them, you can have a conversation with them. And many people, both in high schools and universities, are using them as teaching tools. So please get educated on what race-based and disaggregated data actually are, so that you can engage in informed consent or informed refusal about what you want to happen with your life and your data. Thank you very much. Thank you, Lana. This would be the moment, of course, if we were able to do the series as usual in person, when there would be a round of applause, and my attempt to do so as one person seems not quite adequate, because I'm sure that lived experience would be very different. But thank you for such a thought-provoking series of provocations, at a time when many have called for data to be collected so we could demonstrate the injustice that is occurring. Your series of provocations, which ask us to think about how data distracts, or worse, could feed into the abuse of the neoliberal state and the corporations that are aligned with it, is a very, very critical observation. And of course you need not hear that from me. Very thought-provoking, and questions have already been streaming in. I've tried to field a few of them along the way, in terms of asking for references and the like.
But what I'd like to do is to invite folks who've been listening in to populate the Q&A box with questions, and I'll try and keep a running list and read them aloud. I will add that, because of the way in which we're doing this this year, I won't identify anyone; lots of people's usernames are not easy for me to identify in any way. But if you want to be identified, please indicate that in the question that you're drafting, if that's somehow relevant to the context of your question. So we have about 15 minutes or so for questions before we wrap up. The first question that I have on the list for you, Lana, from one of our attendees, was to ask you to respond to what I think has become a bit of a talking point, or the person raising this question has framed it as such: that race-based data collection, at least in the United Kingdom and the United States, arose during the civil rights movement. And so the question is, could you speak to that talking point, that history, in some way? Yeah, and this is where context is everything. So I've heard that sound bite before, and it's actually a little problematic, because it forgets the first part of why it even came up in the civil rights movement. So let's do a little history lesson, as Black folks continue to push back and demand full emancipation. So for those of you who don't know, we as Black people, since encounter with Europeans, have been fighting to end enslavement. And so in the modern era, that translates into walking while Black, shopping while Black, being surveilled, and constantly not being believed to be who we say we are. And so the whole piece around the civil rights is a bit of a misinterpretation, because Black folks made their demands for equality and for justice, and the state came back and said: prove to me your suffering. Go fetch, and bring forth something that I can parse. That is actually the conversation. Civil rights leaders and activists were inflamed.
They were angry, because they had the proof. They had the lynchings. They had the mob violence. They had the depressed wages. They had the racial segregation to prove it. But the white folks in power needed to buy some more time, and so they gave them the fool's errand and said: go fetch. And because of the power position that Black folks were in, they had to figure out how to mobilize that and make it work for them, rather than against them. So let's not misquote history. We approached the powers that be, the state governments of the day, with demands, and we said: it's time for equality, it's time to rectify, it's time to make equity a reality. And the state pushed back and said: I don't see it; you seem to be doing fine. What's your problem with suppressed wages? What's your problem with redlining? What's your problem with segregation? Prove to me that white supremacy is hard for you. So let's not distort the experience of our ancestors, and let's not revise the history of how we came to that. The response should have been: let us deal with the inequities in the funding of schools. Let us have the banks stop refusing mortgages and access to appropriate lending tools. Those were the demands that they had, and instead they were told to collect data, data that in fact reinforced a system of racial apartheid. Right? Because remember, we've been trying to move away from having Black be our sole identifier on our public documents, not towards it. So this actual turn to a demand for race-based data is actually a form of forgetting that happens in projects of white supremacy, in which the state and all of its institutions mobilize to erase memory, to invisibilize the past, so that everything is new and we're always starting over again. And as someone who's a student of history, I don't forget. We've been here before. And be careful what you ask for, because what we asked for the first time was justice.
And what we got was the errand of collecting data for white folks to decide whether or not they were going to, quote, support actions towards equity. Thank you for that response. The second question I have on the list, just to move along, reads as follows: data are answers to questions, questions designed by those with power to enable power, and decisions made with analysis of this data impact those without power. Is there justice in doing the opposite: empowering those most impacted to define the questions, collect their own data, do their own analysis, and challenge current decisions with new information? And if this is not a priority, where there's no time or money for this, does that confirm your hypothesis? I think your questioner has it on the money. That's exactly my point. So, my work in terms of informed consent and informed refusal: through Ready for Black Lives, we have a platform, and we've had over 10,000 people involved in this conversation, much more than any government in this province or in this country has had. Yet premiers and ministries are moving forward, making decisions, and not talking about what the risks actually are. Right? And so, in fact, the questioner is right on the money in the sense that, for communities to be able to understand the difference between data and information, there needs to be literacy-raising, like what we're doing right now. And then they need to understand: well, what is it that we want to solve? And what communities would find out is: didn't we complain about that already? Didn't we already do a report about that? And communities have memories. They have institutional memories, and they know what they demanded. Right? For instance, we've had an issue in the Toronto District School Board for decades; it is the district in the country with data on race for 10 years. And we see that with every generation that Black people are in Canada, their ability to participate is blocked successively.
So for multi-generational Canadians who have been in Canada from the inception of what we call this nation state, they are doing the worst, categorically, because what happens is that the system and its cumulative pressures of anti-Black racism wrap around you like a python and squeeze the life out of you, and your entire job is to resist. So we know what the data tells us, but we don't see the action. So this individual is very correct: we need to be asking the questions, because what we would find is that we have actually asked them, and we've answered them, and we have the recommendations for them. And what's being done right now is in the interest, not of human rights and human justice, but of the new data economy. Right? You will see in the coming years, if you're paying attention to fintech and the geopolitical reorganization of capital, that this system is demanding this data, not us. Even at the international level, we have the Human Rights Commission going straight down the road of datafication, never addressing the power asymmetry between the government, the data corporations, and the people. Rights are predicated on power differentials. And so for the people to ask those questions, they need the training, they need the resources, they need the support, and they need no strings attached. We also need a process for how we come together and have these conversations. Who's able to have these conversations? If we're talking about rural areas, if we're talking about places without unlimited data plans, how do you have those conversations? And so those aren't being funded, interestingly, are they? No, they're not. So I think this questioner has it right on the money. So we need to really collect our own data. Ready for Black Lives will be launching our platform that addresses a number of the issues that Edward Snowden has brought up, so we don't create data exhaust.
It's not accessible by other people who are not part of those pods, and that data can be disintegrated if necessary, should an abusive power seek to access it. We've spent the last couple of months bashing at it, doing adversarial attacks, so that we can have community-based data platforms. My concern here is that we have historical examples, as we do now, of what happens when the government has huge caches of data, and we see people go missing, from Canada to the US to Germany. People go missing, and so does democracy. So we need to be very careful of that, and the power of who asks the question is everything. A third question, which sort of pivots, in the sense of thinking about a context where data is already being collected, and one that probably many joining in the session can relate to. So the question is: can you talk about race-based data in relation to higher education collecting data on students and teachers? There's already a push to drive equity hiring; can you comment on that? And, I guess, to put a few extra words in the questioner's mouth: any potential misuse, or what are the sort of advantages or risks associated with that? So we'll start with the misuse, and I'll go to the positive use. So in terms of misuse, we've seen this in jurisdictions in Europe, and we've seen it in the US, where when we see educational disparities, and we see it currently under the Trump administration, and we've seen it in provinces in Canada, the rhetoric is very racialized: the poor performance and the suspension rates of Black, Indigenous, and racialized people become an indicator, because, remember, data is not information. Data can produce information, but only information is information. And so that data gets turned into a type of information that says: the issue here is that, well, you know, they're just not as bright, they're just violent people, they're criminogenic, so of course we have to suspend them. This is actually proof, and we've seen this argument.
This is actually proof, the argument goes, that these students are violent, and that we're protecting other students and our teachers, and that our suspension policies are working. So data is a double-edged sword. Information is a bit different, because you can go and ask who you got it from: is that actually what you intended? Data is raw, and it is defined, interpreted, coordinated, and curated by that which has power, right? The computational power to maintain and support data requires resources. At Ready for Black Lives, we're lucky: we came together as computational scientists who are committed to social justice to develop a system that doesn't look anything like what a tech company would build, because we're not looking to make money, and it doesn't look like anything the government would build, because we're not trying to control anybody, right? And so when it comes to that kind of data, I think, yes, there's a way to collect that. But you want to use a system where you're not producing data exhaust, where there are no individual data files attached to it, and where in fact the purpose of it is explicit, and therefore the questions that are going to be answered cannot then be manipulated to say something they were never intended to say, right? So the question is: how might racism, as a multi-axis issue, be impacting the recruitment, retention, and graduation of students in said institution? That question can be asked without attaching or ever collecting individuals' identities, which would put them at risk of other kinds of misuse. It can also be collected in a way that asks the question very specifically, collects the data field very specifically, and makes sure that that report is produced, and then the data, once it's verified and it's in the report and in the table, is not then able to be, quote, used for secondary analysis for an intent and a purpose that was never thought of to begin with. And so I think, yes, you can collect some kinds of data.
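The safer pattern described here, collecting for one explicit purpose, publishing only aggregate counts, and never retaining individual identifiers, can be sketched as follows. The field names and the suppression threshold are illustrative assumptions; small-cell suppression is one common safeguard against singling individuals out, not the specific mechanism the speaker's platform uses.

```python
from collections import Counter

def aggregate_only(responses, min_cell=5):
    """Tally anonymous categorical responses and suppress small cells
    that could single individuals out. No identifiers are ever stored."""
    counts = Counter(responses)
    return {category: n for category, n in counts.items() if n >= min_cell}

# Anonymous survey responses to one explicit question -- no names attached.
responses = ["Black"] * 12 + ["White"] * 30 + ["Indigenous"] * 3
report = aggregate_only(responses)
print(report)  # {'Black': 12, 'White': 30} -- the cell of 3 is suppressed
```

Because only the aggregate table is ever produced, there is no record-level dataset left over to be repurposed for secondary analysis.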
This is one of the kinds of data, I think, that can be collected safely and easily. But when we get into data that has to do with pandemic and health, this, my friend, I mean, all I can do is quote Snowden, right? It's all in the collection, because that data is weighted in a very particular way. All data is mathematical code, and those are mathematical equations. And so if the math has variables that have a particular weight, and that weight continues to carry through, it's already producing its own kind of decontextualized information. And that is the concern. So if the demand for race-based data and disaggregated data is about human rights, we have a problem, because we're not using the second part of that constitutional tool that we have, which says we not only are able to, but are supposed to, be proactive in protecting those categories by using the laws to do so. Does that answer the question? Yes, of course. So just a final quick comment and question from one of the other guests. This one is from a member of the Institute, Sean Harmon. And he writes: the issue of cart and horse is so important. We did a report on personal health information a year ago; I was a part of that as well. I think we recommended an alternative foundation for the collection and use of personal health information: greater transparency, thinking about the utility of a platform that relies on local expertise and true engagement, meaningful engagement, with those whose data is being collected. We got the sense in the course of this project, and this really goes to your point about the relationship between government and data companies, that the government was unwilling to consider a new course. They had relationships they wished to preserve. And what we often got in response to calls for regulation is: we don't know what we're regulating yet; we need to operate first to see what needs to be regulated or monitored. Oh, my God. So that has been, that is, the playbook of big tech.
I have a colleague, and you're going to see an article in the next two weeks, and we're going to have him on the Ready for Black Lives platform, where we have COVID conversations, so stay tuned for that. And we're going to be talking about the role of big tech as big tobacco, and how that's functioning in Canada. So we're going to have the second part of that piece that you just asked about laid out, laid bare, because this is not a coincidence. In what world do you say: you know what, we'll just put a plane in the sky and see what happens; maybe it'll fly, maybe we won't have any safety requirements; we'll just, you know, put 300 people on it? No, that's not what we do with drugs. We put rules in place first. It's called: you have to have approval before you put a drug on the market, correct? Well, except with COVID. Right, right. And so the question is: why are we not having those rules in place? And the idea that we don't know what we're regulating is, I'm sorry, patently untrue. We know what we're regulating. We have journals of computational science and medicine. We have a plethora of articles. We are in these debates about accountability, transparency, explainability, interpretability. These discussions have been going on; Europe actually led these discussions for over a decade. This is not a new topic. Regulation: if they don't know what they're regulating, how did they get the GDPR? Which, I might add, needs work, right? If we would use it in the Canadian context, it's seriously wanting. So what you're hearing there is the propaganda of big business coming out of the mouth of your government. And we have to pay attention that these are the same data companies that do not pay taxes, but use our public infrastructure as the raw-material data generators for their profits. So I think this is where Canadians across the country have to stand up and say: hell to the no, we need to have the horse before the cart.
We are a democratic constitutional society, and we need the tools of democracy and the constitution to function in sync. And that means regulations and laws. And if you don't know what you're regulating, lovely: you need to stop, because you're not ready. You didn't come ready. And you're undermining the people, right? You're undermining our ability to democratically engage with our institutions. And I'm going to finish on this note. At the pace that we're going in Canada, remember, we have one of the lowest thresholds for data protection. Corporations are descending on us like on a dying carcass, because you can get away with just about anything in Canada, compared to those inside the GDPR, compared to the UK. Even the US has better than us, right? They've banned facial recognition in places, right? They've at least acted on the hotspots. We continue to sit here as our police forces get militarized. And so if in fact our data is held by all of these Salesforces, Googles, and Amazons, why do you need to go to the polls, since they can't do anything anyway? And that was the push for Sidewalk Labs and these smart cities. That's why, unanimously around the world, people have shut those projects down, time after time after time. And we have to understand that before the pandemic, they were up against the ropes, and actually they're being put back up against the ropes in Congress, right? So you cannot separate data from the corporations that have brought us to this point, and from the power that they have to flout the laws and pay governments not to make laws and regulations. So I'm going to really encourage your partners and yourself that did that report: please share it with us at Ready for Black Lives. We will push it and get it out.
It's important that the people know that the government is not necessarily doing their bidding, and that they may be doing the bidding of corporations that do not have our best interests at heart. Thank you, Lana. Data collection is the end of democracy. Yeah, nothing short of that. We can change it, though. We can change it. We can change it by refusing the way the project has been framed, and reframing that project into action. All that time and money and energy being put out to collect data should instead be about: how come you're not acting on the data that you have? We should all be standing up with our reams of reports and saying: why haven't you acted on special needs and special assistance in elementary schools? Why haven't you done equity around pay? Why haven't you? Because let's be clear: in H1N1, they knew that Black folks were going to be on the order of over seven times more affected. They saw that in the data. You could have modeled that with COVID. We know what happens. Pandemic has been instrumental in the demise of Black and Indigenous life. Nothing new has happened here. Absolutely nothing. And calling for racial apartheid markers to be reinstituted means that we have an amnesia that is dangerous, and that we need to recall and remember that our ancestors did not struggle, did not die, were not brought down by the lash or the sword, for us to go back and drag the markers of racial apartheid out of the garbage. And I will fight that tooth and nail. I refuse to have my race on my birth certificate, on my driver's license. Those days are over. And as I close, I want to say one thing. My grandma, anybody who knows me knows my grandma. My grandma turns 102 in April of 2021, please God. She is solid as a rock. She expects a touchdown, not a fumble. One century, and we're still having the same conversation. Thank you, Lana. Before I thank you at a little more length, I'll just make a note for the folks who've joined us.
I mean, I feel like we could stop the seminar series after the first one, it's been so powerful, but this is the first of eight seminars we'll be hosting this year. You can find the list of speakers and guests on the website of the Health Law Institute; all are going to be via Zoom, and you'll be able to join them. Our next is on October 2. The speaker is Florence Ashley, from the Faculty of Law at the University of Toronto, and the title of that talk is "Can Health Law Help Protect Trans Youth from Conversion Therapy?", continuing our theme of justice and diversity in this year's series. With that little reminder offered, I just want to offer my very humble thanks for your wisdom, your expertise, your insights, and all of the calls to action that you shared today. It is an incredible privilege to host you for this presentation. And I hope everyone who was able to join us online, from wherever they are, feels as terrified, but also as motivated, to try and be an ally, to assist and help and ensure this stops, and that we do things differently. Thank you so much for sharing your insights today, Lana. Thank you, and I hope that you will come on over to the platform, everybody, Ready for Black Lives, and join us for our October series, where we're going to continue this conversation with some really interesting folks. And I want to say thank you for welcoming me. I appreciate it. Thanks everyone. Take care.