of applause, Gert, Jan, and Michel. Yes, good evening, everybody. I am presenting here as a parent who is concerned about the use of digital tools in education. I'm also standing here on behalf of my wife, Micah, who unfortunately couldn't be here; she would have been a wonderful presenter as well. I have no IT background myself: I work in development finance, and my wife works in child and adolescent psychiatry. But we have two kids, seven and nine, who go to primary school here in the Netherlands, and I'll tell you a bit about how we became involved with this initiative.

Let me introduce myself as well. My name is Gert Hilm, and I do work in IT: I'm a network designer, mainly for telecom operators. At home I own six computers that I use often (well, actually a bit more, depending on how you count), two daughters, and one wife. I'm a long-term open-source supporter and a privacy activist, and I'm also active for the Pirate Party in Delft. One of the photos you see on the slide is from a couple of weeks ago, the one with the small balloons: that's a security camera with a party hat on it, for the George Orwell Day we celebrated. I also do some sports, like mountain biking; I used to do triathlons, but now it's mainly mountain biking. So that's me. Please continue.

Yeah, thanks. How we got introduced as parents to digitization in schools was basically when COVID started, and our kids came home with two Google accounts each, made in their personal names and used to administer things about them at school. That was new to us. So we asked the school why this was being done, because before that the school had not been digitized that much. The response we got from the school was: well, the kids need to work with this because they'll need it for a job later.
They also said the kids are basically safer using Google than using a standalone computer with a USB stick, that it is all privacy compliant, and that they have a contract with Google to keep us safe. That last comment made me think back to my first contract with Google, when I became a proud user of Gmail. They said they would never look at my emails to provide me with marketing materials, and by now they're scavenging every corner of my inbox to show me advertisements, of course.

The school was actually very helpful: they said, we will delete your kids' accounts and make aliases for them. So at school they're now named after cartoon figures. But it is still a bit worrisome to us that all this is being used without a clear reason, at least to us. And soon we noticed that more and more accounts were being made. At the moment the school uses about eight digital tools for varying things: doing school exercises, recording video metrics of how the kids move around during gym lessons, and of course Google as a general tool, also for video meetings. There's also a fully digitized school report, which shows and presents everything around the kids and can be accessed not only by us but by others as well. And there's a lot of WhatsApp use at school at the moment.

A bit of a worry to us as well is that there seems to be a collective unawareness of the risks of using all these tools. They're very handy and very practical, of course, but at the same time you really see that the school doesn't teach the kids about them. For instance, when they go through how to log into an app, they say: click yes and accept all cookies. And when there was one app that needed a password, the school had the list of all the kids in the class, made usernames and passwords for them, and then sent the list with all 31 passwords to all 31 kids in the class.
And this was a tool that was also linked to the general education system in the Netherlands. It just shows that the school, very understandably, is not really on top of things.

So what did we do? Every school in the Netherlands has a privacy officer. That's basically a hired lawyer who doesn't work for the school but represents it when there are questions about privacy. So we asked him, but he didn't know anything about the school, because he probably had 50 schools to help on these kinds of matters.

The next step we took was to send requests to the providers of the digital services to the school, asking what data of our kids was stored and for what purpose. Some of these parties gave us a call and said: well, we were actually storing lots of data; are you from the media, and what should we do? They were a bit panicky. But they said they would delete all this excess information; they had years of data stored while they were only allowed to store it for months, for instance. They were actually very helpful. Then there were other parties who responded by saying: you have to go to your school, because we're only the data processor of this data. And then there was Google, who just never gave a reply at all.

So we were in a situation where the school was doing this, some parties were pointing at the school, some parties who did respond admitted they were in breach but tried to amend it, and the biggest one, Google, was a bit of a question mark. Because of that we were quite concerned. First of all, the school should be a very safe place for kids, also for their data. But it's also a very good place for parties who want to gather data.
If these kids are filling in everything, if they have to fill in what they're afraid of and do exercises on it, there's a lot of data being collected from them. And you want to protect them against parties that are collecting all that data to create profiles, because with these profiles you never know what they'll actually do. The only thing Google promises, I think, is that they won't use it for marketing. But of course they can change that unilaterally, and there are lots of other things they might use it for. We would like our kids to still have to write a resume the next time they want a job, instead of employers just googling them.

There might also be a risk of a self-fulfilling prophecy: if you have a profile that says this kid is good at that, then you might actually be steered by the IT to get more involved with that. You don't want that either.

Then there's, of course, vendor lock-in. When a school uses a certain tool for a long time (like me: I still have my Gmail account), the school will probably keep using that service for a long time. And the provider can unilaterally change the terms of use, which is something they could use to work these profiles a bit more and extract more value out of them. Because they offer this service for almost nothing: I think 10 euros per kid for the software plus sufficient computers in a class, which cannot be what it costs Google to offer this service, I believe.

And I think it's a basic right for kids to make their own mistakes, but also to be forgotten, especially at that age. We also think that digitization in education should be researched a bit more before being used at such an extensive scale, because we think face-to-face education is something that really works well. And the school, like I said, is very understandably not really on top of things.
They do teach kids about the dangers of social media, things like sexting or people misusing images. But the dangers of using digital services without really knowing what is being shared is not really covered. And there are lots of cool things kids could do with digital tools, like programming and just being more creative, which is not really done at school yet. So we were a bit concerned, and that's how we joined the coalition for fair digital education.

Some people came together; we started on the forum of Freedom Internet. There were lots of people worried about what's going on in schools, where the data was going, who had access, and why. But instead of all being angry, we decided: let's just do something, let's try to fix it. From there we started the Coalition for Fair Digital Education; that's, in Dutch, Coalitie Eerlijk Digitaal Onderwijs. Fortunately, we're not alone. Can you click back? We have quite some organizations supporting us in this mission to take back the educational system from Big Tech, basically. I'm not going to name them all, but as a few examples: the Free Software Foundation is quite involved, and Freedom Internet, because we started on their forum. And a while ago we launched our petition at the PublicSpaces conference.

First something about the Dutch educational system, because it's slightly different from most European countries. In the Netherlands we have freedom of education. That basically means the government pays for the schools, but the schools organize themselves. Basically, it's a way for confessional groups to do their own kind of indoctrination. And sometimes I have links to sources in this presentation that are not in English, but you can use a service like DeepL for that.
OK, what's wrong with surveillance capitalism? Well, a lot. It's basically a threat to our democracy: if we want to have a free, liberal, democratic society, we cannot have surveillance capitalism. It's a problem that's growing, and it's feeding polarization. Algorithms are optimized for engagement, so they try to hold your attention as long as possible. The entire business model is based upon collecting and harvesting data and extracting conclusions from all the data gathered and combined. What's also wrong is that it's a privatization of collective effort. Let me explain that: if we all together train an algorithm, then the result of that training should belong in the public domain and not be part of some big corporation. And the long-term consequences for kids are not very clear.

Big tech clouds are also far from free software. You can't really see what's happening in the cloud. Free software gives you four freedoms, and none of those four freedoms are available in the cloud: you can't download the software, you can't use it for any purpose, you can't study and modify it, and you can't adapt and redistribute it. If we want to have all these possibilities, we should stop using big tech clouds. Also an important reminder: there is no cloud, there are just someone else's computers. You can run capacity there and deploy some software or get some default services, but there's no real cloud; in the end it's just a computer somewhere else.

OK, let me do a quick introduction to the GDPR, because that's quite an important topic in this matter. For processing data, you have six legal grounds. First is consent: that is when you agree that your data is used for a certain purpose. But the problem with consent is that it can be withdrawn at any moment, so it's not a very useful basis for a school to process data: you can't suddenly stop processing when a parent withdraws.
So that's why schools never ask for consent. The second ground is an agreement: for instance, if I, as a network designer, make a design for you, then I can send you an invoice; I don't need your consent for that. Third, there are lots of laws that require you to process data: that's legal obligation. Fourth is vital interest: if you're dying somewhere, I don't need to ask your consent to check your blood type. Fifth is public interest; that's mainly used by governments, for instance when a municipality wants to hang cameras in public space for security. And sixth is legitimate interest: that's when I weigh your privacy against my interest. For instance, if I run a web server, it produces logs, and I can keep those logs because it's in my interest to be able to do troubleshooting and things like that; that can be based upon legitimate interest.

But the problem with surveillance capitalism is that complying is not a real business interest. You can make more money if you do not comply and stretch the GDPR as far as possible. I see mainly three large problems with the GDPR. First, there's consent. What it comes down to is all these annoying cookie banners where you have to click OK, or are being tricked into clicking OK. If you wanted to read them, there's a lot of legalese that's very hard to read and you don't have the time; and even if you had the time, they're not very transparent about what's actually done, so it's difficult to see the consequences. Second, the authorities are very understaffed and underfunded to properly enforce the GDPR. And third, what is important in schools: under the GDPR, your parents are allowed to know almost everything about you until you're 16. Can you do the next one? Well, the enforcement. The authorities are understaffed.
It's not that they don't want to enforce; they don't have the budget or the staff. And in my opinion, they should also be more clever in automating things. A nice example is all the security cameras that everybody just hooks up and points at the public street, which is not allowed under the GDPR, especially not if you did not properly secure that Chinese camera, et cetera. What the authorities do is respond to an individual complaint about a specific camera. But on OpenStreetMap there are thousands of cameras already mapped, and you can see they're pointing at the street. You can automate that: you can send everybody an automated warning that their camera is not compliant.

Another very important thing is privacy by design. That's part of the GDPR, but it's not enforced at all. I mean, you can't call the Google services privacy by design; they're data extraction and profiling by design. They try to write the legal documents so that it seems compliant with the GDPR, but it's not, and it cannot be, because it's not privacy by design. Privacy by design means you collect only the data you really need, delete it when you don't need it anymore, and protect it very well. Another nice example is advertising IDs. They promise in a legal document that they won't build profiles, but on the Chromebooks and Microsoft computers in schools, there's still an advertising ID. You can reset it, which maybe someone occasionally does, but the profiling just continues.

Luckily, we're not alone in this battle for better privacy enforcement. There's noyb (None of Your Business), the organization by Max Schrems. They're doing a tremendous job using automated tools to go after cookie banners. This is a cookie banner with a dark pattern.
There's a highlighted green button that you're supposed to click immediately, because that's the easiest option, and if you want to reject, you first have to open the policy and then maybe do another dozen clicks. As opposed to the way you should do it: if you use cookies for more than strictly necessary, there should just be a Reject All button to click if you don't want that. The fun thing about this project is that they automated it: they crawled the most popular websites used in the EU and sent them automated warnings that their cookie banner is not compliant.

Then the consent problem. Let's stick to the cookie banner example. If you open the terms and conditions or the privacy statement, you often get very long, very complicated documents with a lot of blah blah. You don't have the time; if you had the time, you probably don't have the legal knowledge to fully understand it; and maybe you don't have the technical understanding of what exactly can be done with data. So what do you do? You click the accept button, like everybody does, because you can't read all of those documents. It's too much, and they made it very difficult to read. Let me grab a little bit of water.

But there are some shortcuts you can take. If you read a privacy statement, there are a few things to look for. The first red flag is "we don't use the data for...". Well, I don't care what you're not doing; I want to know what you are doing. The things they describe they're not doing are always the very bad things, so that you think it's very good they're not doing them and feel a bit more comfortable. Another nice one is "we use the data, for example, to provide you this website". No one's going to object to that, but it's not complete: it's just an example to make you feel good, and the other things are not told.
Also a nice one, which you've probably all seen: "this statement can change at any time". So you've been reading it, and then it says it can change at any time. What's the point of reading it? The moment I click it away, it could be changed, and then I need to check again and again whether it changed, while they should be informing me that something has changed. Another one is "we don't sell your data, we only share it with selected business partners". You can translate that as: just anyone who wants to buy it. So that's bad.

Fortunately, there are also good examples of what a privacy statement can look like. A good privacy statement is short, simple, and understandable. It comes down to: we use your data to provide you this and this service. That's it. Here are a few companies with nice examples; I won't go through all of them. One that I really love is this one, from OpenKAT, a security-scanning tool that Brenno de Winter made during the corona pandemic to scan all the COVID test lanes for compliance and security. This is the entire privacy statement of the website about this tool. It's so short that it fits on a single slide in two languages: in Dutch it's a one-liner, and in English it became two lines. That's really how you should do it. Don't collect more data than you need, and think really hard about what you actually need, not about what might come in handy.

Then the GDPR age limit. If you're under 16, it's legal for your parents to know everything. But ask a 13- or 14-year-old whether he likes it that his parents are immediately informed when he was late at school, didn't do his homework, forgot his book, et cetera. Nonetheless, 80% of the secondary schools use Magister. That's a tool to track whether you were late, did your homework, forgot your book, all your grades, et cetera.
And they have a very handy app for parents: you immediately get pinged when new information is added. The default setting is that parents can see everything; the school has to actively change that default if they don't want parents informed immediately. But legally speaking, there's no problem, because you're under 16.

A little bit more on Magister. It's basically a pupil tracking system with a big centralized database. The problem with such a very big centralized database is that the information of an entire generation and their school behavior is in one database. So if you want to know something about someone, you know it's probably in that database. And there are all kinds of foreign three-letter agencies that are very interested in knowing a lot about specific individuals in a country. Can you move that one back, please? The question you should be asking is not what the chance is that this interest falls on your kid; that's maybe very small. But in the entire population, there will be kids that attract attention. Maybe because they become a terrorist or a criminal, but maybe because they become a politician or a negotiator or the CEO of some company; then suddenly they have the interest of secret services from abroad.

Apart from the privacy issues, the data collection, and the centralization, Magister is also a huge vendor lock-in. It's quite difficult to migrate to some other system, and it's usually overpriced. Its market share is around 70% to 80%, and we have 900,000 kids going to secondary school. If you calculate that with the price per pupil per year, it's 12 million euros per year. So imagine if those schools put this money together and hired a development team, instead of paying licenses year after year at prices that keep increasing.
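As a back-of-the-envelope check of that figure: the pupil count, the 70-80% market share, and the roughly 12 million total come from the talk, but the per-pupil price below is my own assumption, chosen only to be consistent with that total.

```python
# Rough sanity check of the licensing figure mentioned above.
# Only the pupil count, the market share, and the ~12 million total
# come from the talk; the per-pupil price is an assumed value.

pupils_secondary = 900_000      # pupils in Dutch secondary education (from the talk)
market_share = 0.75             # Magister's share, roughly 70-80% (from the talk)
price_per_pupil_eur = 17.75     # assumption, not stated in the talk

annual_spend = pupils_secondary * market_share * price_per_pupil_eur
print(f"~EUR {annual_spend / 1e6:.0f} million per year")  # ~EUR 12 million per year
```

Whatever the exact per-pupil price, the point stands: pooled across schools, a recurring budget of that size would fund a sizeable permanent development team.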
Well, we talked a lot about privacy now, and privacy is a very important public value, but there are more public values. So please discuss with your neighbor what other public values there are besides privacy. Can you take two minutes to really think about that?

Anyone who wants to share some other public values that are important to have in schools? You want to learn from it? Yes, you want to study the code and learn from it; that's transparency. You want to see what's happening in the cloud. Good. Anyone else? Political views? Children are exploring certain ideas, and they don't want anyone to know that. But that's privacy again. Anyone else? We need to go on. Equality, yes, that's a good one; it's on the list there.

What else do we have? The public domain: if you train an algorithm together with a million people, then the result of that training should be public domain. Autonomy is also very important: how can you be yourself if you're wondering what this algorithm is learning about you? You go to school so that you, as a student, learn, not so that the algorithm learns about you; that's the wrong way around. And equality: there are lots of adaptive learning systems, but they mainly work well for children who are already motivated and good at self-discipline, not for the others. So you increase the gap between the already motivated, good students and the ones that need a little help and a bit of a push. And if all this knowledge and all this power over an entire generation is in the hands of a very small group, that's a threat to democracy. If you know so much about so many people, it's very easy to manipulate them, to pump fake news around, to gain their attention, to hold their attention, and to push it to where you, as an organization, want to have it.
So who owns the spell checker? I'd say: if it's a public effort, it should be a public benefit. Google and Microsoft both have a really good spell checker, but it became so good because they harvested all our corrections. So why isn't there a public domain spell checker?

Schools should be a safe space. You should feel free; you should not be wondering what the algorithm is learning about you. For time, I'm going to skip a few slides. Democracy we're not going to skip; that's a very important one. The people that stormed the Capitol on the 6th of January were people using Facebook groups and YouTube. And YouTube keeps recommending the next, more radical video: as soon as you're interested in a certain subject, it will continue to provide you with more and more on that subject.

So the GDPR is good, but it should be enforced, and it's not enough, because privacy is not the only public value. Privacy is important, and there the GDPR does a good thing, but schools should look at the entire package of public values, not only privacy and not only GDPR compliance.

It also matters what you teach children. If you teach children to use all those services of surveillance capitalists, then that's what they'll use. Even if you make sure that within the school everything is legally compliant (although these companies are in the business of stretching the GDPR and expanding their room to use data as much as possible), as soon as the school bell rings, the kids pick up their phones and accept whatever terms and conditions come with the next app. And they won't read the policies, because they learned in school that these are good services; this is what you use. I'm going to skip this one as well. Kennisnet also has quite some good recommendations on how you can use Google and Microsoft products in school.
But those recommendations are also very impractical for a school. You have a computer that can do all those fantastic things, and we as a population made the spelling checker very good, for instance. But then you read Kennisnet's recommendations on how to use it in a privacy-friendly way, and you cannot use the spelling checker. That's absurd: we made it, and now we can't even use it, because then our privacy would be violated and it wouldn't be according to the GDPR. Another one, for Microsoft: you must make sure that in the entire school nobody installs the mobile app; you should only use the desktop version. These are recommendations from Kennisnet on how to work GDPR compliant. That's not the way we should be going.

Well, are schools nuts that they give all the data of the children away? No. They don't have the time, knowledge, and money to build their own environment. So we should help them with that. And the good thing is, we don't have to start from scratch, because there are lots and lots of very good open source tools; they're just not yet integrated into a complete package that can go to a school. This is an example from a German organization, Univention, that does identity and access management and links to lots of other open source tools. So it's really doable to base your digital system completely upon open source tools. This is another example that we'll skip.

So what I want to do is take all these building blocks, put them together, and make a nice package that solves the problem for the school: a service with an SLA, support, integration, and training, for a fixed price per pupil. We have three ways of getting there. First: let's just build it, and show at a few schools that it is possible and doable. Then lobby for political attention to get support for this. And also the legal route, by making sure that the GDPR is properly enforced.
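To make one of those building blocks concrete: here is a minimal sketch of what privacy-by-design account provisioning could look like, echoing the cartoon-figure aliases and the class password list from earlier in the talk. Everything in it (the alias pool, the function name) is hypothetical, not an existing tool.

```python
import secrets

# Hypothetical sketch: provision pseudonymous accounts for a class,
# so an external service never sees a pupil's real name. The mapping
# from real name to alias stays at the school; only aliases and
# per-pupil random passwords are exported.

ALIAS_POOL = [
    "heron", "otter", "badger", "lynx", "falcon", "marten",
    "beaver", "kestrel", "stoat", "osprey", "plover", "vole",
]

def provision_aliases(pupils):
    """Return (export_rows, local_mapping).

    export_rows   -- what goes to the external service
    local_mapping -- real name to alias, kept only at the school
    """
    if len(pupils) > len(ALIAS_POOL):
        raise ValueError("alias pool too small for this class")
    aliases = secrets.SystemRandom().sample(ALIAS_POOL, len(pupils))
    export_rows, mapping = [], {}
    for pupil, alias in zip(pupils, aliases):
        # One random password per pupil, never a single list with all
        # 31 passwords mailed to the whole class.
        export_rows.append({"username": alias,
                            "password": secrets.token_urlsafe(12)})
        mapping[pupil] = alias
    return export_rows, mapping

roster = ["Anna Jansen", "Bram de Vries", "Chris Bakker"]
rows, mapping = provision_aliases(roster)
```

The design choice is the point: with data minimization, the processor only ever stores an alias, so a breach or a profile on their side cannot be tied back to a child without the school's local mapping.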
How we're going to build it is partly a matter of looking abroad. In Germany, in Luxembourg, in France, there are fantastic deployments with Nextcloud and other open source tools used in schools on a very large scale. So it's possible and doable. Roughly, there are four parts to a school IT environment: the generic cloud environment, the student administration, the hardware in class, and the educational material. Currently we're doing a pilot in two primary schools in Amsterdam. They have Nextcloud as an IT environment, and all the communication with parents uses Signal. The Nextcloud part is done by one of our supporters from the Nextcloud community.

The next step is our petition, at eerlijkdigitaalonderwijs.petities.nl. You can scan the QR code or go to the URL directly. I'll give you a bigger one. Sorry, was there a question?

There is also other good news on the legal side of things. Last week, Denmark forbade the use of Google Workspace in the municipality of Helsingør, and they did that on a very generic basis: the ground for it is that data transfers from Europe to the USA are not allowed. And that's applicable to a lot of the other surveillance capitalists too. Office 365 has also been declared unusable for schools in some parts of Germany. Next thing: how do we get that done in the Netherlands? So if you're a lawyer, please join us.

How can you help? There's a Matrix chat room about this; please join it if you're on Matrix, and if you're not on Matrix, please get on Matrix and join the chat about this subject. If you work for an NGO that does anything in the digital field, please join us too, and sign the petition as an organization. After you've read it, of course, because you have to read things before you click. And most important: if you have kids at a school or work for a school, get the school to join.
Say as a school: we want a digital environment that's based upon public values. So, does anyone have a question?

Enforcement is a large part of the whole GDPR shit. Are you willing to work for the Autoriteit Persoonsgegevens? As a lawyer or as a techie? As a person who wants to automate the stuff you talked about. I'll consider it. And the second question is: shouldn't we just hack Magister? Hacking, that's illegal. It would give a nice scandal, yes.

Thanks for the talk. Do you think it is possible, now or within five years, to have open source solutions and service providers that are willing to work in a more privacy-friendly manner and provide schools with the right solutions, the right environment for the children? Well, they'd better get started now, because those schools in Denmark had two weeks to ditch Google. Not years: two weeks. Imagine you're a school with everything on Google Workspace, and the authority says: now you can't use it, and in two weeks you should have deleted all your data. So it's very wise to start now. There are open source solutions that provide the same possibilities as the Google and Microsoft solutions. They will not immediately be on par, but if you look at Nextcloud, for instance, you have a cloud environment where you can work together, and there are lots of plug-ins for schools. It's doable: in Germany there are already schools working this way, and hundreds of thousands of students are being migrated from Office 365 to Nextcloud. OK, thank you. Yeah, it's possible.

Would it be a good idea to teach the teachers? That's a very good idea. Could hackers who have children at school give the teachers a short course in things like this? There's an organization in Austria with a very German name that I can't pronounce that is doing exactly this: teach the teacher. One of the things, sorry, then I'm finished.
One thing that should be clarified is that open source is not free of cost. It does cost money; it's not gratis. It's free as in freedom, not free as in no money. That's true. You have to pay the people who develop things, but it's open for improvements, and that's a big difference. People think they get everything for free, but that's not true.

Let me just jump in here: the word you're looking for is Chaos macht Schule, Chaos does schools. They go into schools and teach teachers and kids and parents about all things online: who do you trust on the web, how much information do you want to give out, et cetera. But that's not IT education. Can you make it an individual choice again? Well, what we want is that the boards of schools say: we are going to base everything upon public values, not only privacy but also transparency, democracy, sovereignty. But I'm interjecting; it should be a Q&A between you and the audience. I'm sorry, I just wanted to help with the name. Carry on, please. Next question.

Thanks for the presentation. When it comes to climate change, a lot of youngsters are very much aware that it's about their future. Are you aware of any pupil-driven efforts to make this statement of yours: to do it in public, in their schools, or even in court? Not organized, no. I think that would be a very powerful statement. Yeah, you can help me with that. I mean, I talk to my own kids, but usually only to warn them about the current situation, and I don't hear of many other initiatives coming from kids themselves. OK, thanks. Next question, please; we have a few minutes left.

I was wondering if, legally, students or their parents can refuse the use of Chromebooks or Microsoft online services, for school or for extra activities outside of school. Is there a legal basis where they could say: no, I don't want to create a Google account?
No. That would only be possible if the processing were based upon consent, but schools just make an agreement to provide IT services and have a processing agreement with those providers, of which they claim it's all legally OK. So as a parent you have nothing to say in this, unless you go to court and proceed all the way. But then you have to take your school to court, and that's not a really nice solution.

So the parents have to agree to the use of Google services when enrolling their children in a school? No, the parents do not have to agree, because the school makes the agreement. Well, as a parent you have agreed that the school provides everything it needs for education, and that is of course a very broad understanding. The school then says: we need Google services for education. And with that, you consented at the beginning of the year, when you enrolled your kids.

So should the school make it public, before people register their kids, that their data will be sent to Google? Should that be explicit when people register their kids at a school? You know, with our kids, Google was introduced while they were already at school, along with seven other apps, and the school says: we need this for our education. So then it's difficult, yes.

And I do have to interject here, because we're running out of time. Please come up to the speakers afterwards, grab a drink at the bar, and discuss this further; I think it's a very important topic. For the end, please give a warm round of applause to Gert, Jan, and Michel. Thank you very much. Oh, thank you, Owa.