Good morning. I am Beeban Kidron. I'm a crossbench peer in the House of Lords in the United Kingdom, and I'm also the founder of an NGO called Five Rights Foundation, where we work at the intersection between childhood and technology. I just want to say, before we get our session going, what an incredible privilege it was to be listening to Katina there, demanding a better world, a better digital world, for all of us, and her absolute optimism that it's about to come down the pike. I have to say I'm not one of her students, but I definitely would like to join her movement. During this session we're going to ask you to use the Q&A board to ask questions, and towards the end of the session I will try and put your questions to our panel. Shortly I'm going to introduce my colleague, who will talk about our work with IEEE, which is to develop an age appropriate framework for published terms. We at Five Rights are very excited about it, and we believe that when it comes into the public domain it may well be a game changer, because in considering how to look at published terms, you actually end up thinking about what the published terms offer. We really do celebrate the role of collaboration in this arena, and we have had contributions from engineers, from academics, from business, from policymakers, even regulators, and the enforcement community across the world as we have developed our age appropriate published terms. And this conference, I think, embodies what we have been living for the last year or 18 months: pulling together technology and the needs of society, and considering what the public interest is in this space. I believe that there are sessions on racial justice, on environmental justice, some on the role of education, but this session focuses specifically on the rights of children and young people. And maybe we should call it intergenerational justice, just to mirror your language.
But it's really important in the conversation that we have that young people are not seen as passive recipients of technology and/or technology education, but as full citizens in their own right. We find that they are forgotten in some of the biggest conversations about technology, and they are marginalized for their age. Children are one in three of the world's online population, and if we design a world that is only suitable for adults, then one in three of the online population are simply unseen and unheard. Getting age appropriate published terms is part of this story. So let me go straight away to our first speaker. I'm very proud to introduce my colleague, my friend, Dr Rys Farthing. She is the director of projects here at Five Rights Foundation. She comes to us from an academic background of co-developing policy with marginalized young people, and trying to help them realize their economic, cultural and social rights. She is going to set out the background to the issue, and a glimpse of the work that we've done so far.

Well, firstly, thank you very much for that introduction. It's really fantastic to be here, and just to hear the enthusiasm around this project is incredible. I've been tasked as the background-information panelist, and I'm taking that brief seriously. I'm going to start at the beginning, if I can, by highlighting that, as Beeban mentioned, the size and scale of young people's digital lives is truly staggering. There are now over a billion children and young people online, and every single day another 170,000 go online for the first time. And as I'm sure we all saw this year, COVID has done nothing but absolutely amplify this. One US survey over the summer found that nearly half, nearly half, of American children were spending six or more hours a day looking at screens, up 500% from pre-COVID levels.
In Ireland, around 70% of children increased use of their smartphones, they upped their use of social media, and two thirds of them increased their use of gaming consoles, all at the same time. Now, a lot of this was driven, as we all saw, by the move of education online, but other aspects of young people's lives moved online too, like their social life and their interaction with friends. So we can now clearly say that young people live on and offline seamlessly, and this is just their life. And for us as a society, this creates some new obligations. Just as we want our young people to flourish offline, we must want them to flourish in the digital world as well, and we really need to protect and promote their rights in this digital world to achieve this. We are seeing moves in this direction: there have been legislative and regulatory responses worldwide, from online harms legislation such as that being put forward in the UK or that enacted in Australia or Rwanda recently, to international actions such as the general comment being drafted by the United Nations Committee on the Rights of the Child to enshrine children's rights in the digital world. But outside of regulations and law, which, you know, lawyers and politicians love, children's rights need to be realised absolutely everywhere, including in the design and delivery of products and services. This must cover everything from the way we design features on social media platforms, to the algorithms we create that recommend them content, right through to the terms and conditions that we create for these services. So that's what we're talking about today: terms and conditions, Ts and Cs as we all know them, the fine print that you click "I accept" on at the start of using any service. More formally we refer to these as published terms, which covers everything from terms of service documents to privacy notices, cookie policies and countless others.
The reason we're talking about these terms is because they're important. They set out the rules of engagement between a product or service and its user. And it's incredibly symbolic, then, that they've become absolutely infamous for the unfair power asymmetries between users and tech companies that they cement. They're described variously by academics and lawyers as the most pernicious violations of consumer expectations, and once even as "sadistic". But more specifically, they're criticized for being click-through, excessively long, deliberately drafted in a style that actively discourages reading, and for binding people to terms and conditions without any meaningful consent. And that's just for adult users. For young digital service users, the problem is even more acute. I'm going to talk us through five of the big problems we see for young users.

Firstly, in terms of presentation, we've got a really big issue. Quite apart from the services that are clearly aimed at young people, most children spend their time on mainstream services. Now, terms and conditions are not presented in age appropriate formats on these mainstream services at all, nor indeed even on the services for children: everything from legal jargon to simply relying on plain text. While the whole rest of a service or website is designed to within an inch of its life to absolutely maximise user engagement, terms and conditions are presented in plain black and white font, making it all too easy to click through without reading. Secondly, we've also got a really big problem in terms of offering fair terms. Terms quite often fail to meet basic consumer law obligations, especially around informed consent and parental consent for young people. Thirdly, terms really often just obscure the fact that the service in and of itself is not age appropriate, and the terms and conditions somehow become used as a tool that justifies its inappropriateness.
For example, we've seen terms and conditions that have asked young people to click "I agree" to fundamentally inappropriate things, like consenting to having your data collected and sold for marketing purposes, or consenting to turning on your location sharing services on social media, which, you know, broadcasts their location to adult strangers. Fundamentally, you just cannot click "I agree" to having your rights violated, nor to having laws broken. Fourthly, terms often make this really weird and inappropriate trade-off between young people's rights and commercial and business incentives. By choice, terms and conditions are written to put the commercial interests of a product or service before the needs and best interests of their young users. For example, in-game purchases link seamlessly to default payment settings on operating systems. Business models are driven by profit and sales maximization, but that does not mean that they should run roughshod over what's in the best interests of children. And lastly, terms and conditions have this kind of inclination towards inappropriately responsibilising young people and parents. By this, what I mean is that terms and conditions create these wildly inappropriate responsibilities for safety and data privacy for parents and young users. Think: "we default to this low safety mode unless you, the user, or your parents switch all of these features off." Why do we think that young people and their parents are best placed to do this? It's like a baby seat manufacturer saying it's up to you, toddler, and your mom to calculate whether this seat has the right features to keep you safe in a car crash. These responsibilities must instead rest with the service and product providers first, because they're the experts who built these products in the first place.
So fundamentally, we've got this big problem that many terms are just wildly inappropriate for young people, but getting this right is incredibly important. Getting terms right for young people is a moral, legal and business imperative. As Katina talked about earlier, there's a huge moral imperative here. Young people's rights, as codified in the Convention on the Rights of the Child, apply to young people wherever they are and wherever they go, including online. There can be absolutely catastrophic effects when these rights aren't realised: think of online harms, where terms and conditions allow young people to be recommended or even introduced to adult strangers, or offline harms, where we share their actual geophysical location, or where we allow dangerous misinformation to spread. Terms and conditions need to be changed in ways that create the conditions to realise children's rights. Similarly, we're seeing regulators increasingly acknowledging the importance of terms for young people. For example, in the UK we recently enacted an incredible piece of ground-breaking legislation, the Age Appropriate Design Code, which spends a full paragraph making it a requirement for any information society service that's likely to be accessed by children, which is a very expansive target in itself, to present terms and conditions in ways that are clear and accessible. Or, internationally, that general comment on children's rights in the digital environment, which I mentioned earlier, applies to every state that's ratified the Convention. At this stage I need to point out that the UNCRC is the most ratified document around the world, so this has huge implications for states. States will be required to require businesses to establish and implement regulatory frameworks and terms of service that adhere to the highest standards of ethics, privacy and safety. But fundamentally, it also makes good business sense.
We heard earlier about IBM making moves into more ethical business models, and businesses trying to adopt that. New startups investing in privacy enhancing technology and services have never been more in demand, and the drive for ethical business and ethically trained staff is absolutely increasing. And this includes consumer demand for ethical digital services. So basically what we've got is a real problem with terms and conditions for young people, and, on the other hand, a real drive to find a solution. That's the agenda I was asked to set out as the agenda-setting panellist, but I'd like to just add that at Five Rights we're also working on solutions. We're doing some research around this issue and working with children and young people, which I believe Ephraim is going to talk to you a little bit about later. But we're also currently working with the IEEE to support a working group who are trying to develop a standard around published terms that will provide an absolute framework for service providers on how to actually create age appropriate terms, addressing everything from how to present your terms to what your terms actually need to set out; so, what are the conditions that your terms need to create. And I think Alpesh is going to speak to us a little later about the capacity of a standard in this space. So I'm going to finish, if I can, by extending an invitation to anyone who's so inclined to join the standards working group and help us to really craft these solutions as well.

Thank you so much, Rys, for that contribution. And indeed I would like to invite anybody with comments and questions and thoughts into the Q&A as we talk. And just to say that the work of the working group over the last months has gone very, very carefully and fully through all the aspects of design that might be impacted by published terms.
And we would actually be grateful for contributions to our work after this event. Let's turn to Ephraim Lwemba. He's currently based at the University of Nottingham's Horizon Centre for Doctoral Training. He's worked in the digital marketing industry, which I think gave him pause for thought, and he came to feel that maybe the uses of personal data on the web were not quite right. He's recently been working with us at Five Rights to develop recommendations around published terms with children and young people. And let me first ask you, Ephraim: your research has been really focused around questions of privacy and personal data, and I know you did a recent audit of services. I'd really love you to explain to us what some of the key problems were that came out of that audit for you.

Thank you, Beeban. Yes, I have been doing an audit of the services that are targeted at young people, and I defined those services based upon some data that I found on Amazon Alexa ranking teen- and kid-focused websites. I think there are a few things that came out of that audit. Some of the things that are particularly important to keep in mind are that, on a lot of the websites that are targeted at young people, the strategies that services use to show privacy information and terms of service and other published terms are exactly the same as the ones that they use for adults. And as Rys was saying, these strategies tend to be that the terms are written in very long and legalistic language, and they are presented in ways that make it difficult for the user base to understand them. And I think one of the most interesting things I found was that a lot of the guidance that is actually available out there for presenting privacy information and published terms to young people is not actually heeded on most websites.
So I looked at 100 services and found that 90% of them, for example, put their privacy policy in the footer of the website. And a very large percentage of them were also not using any sort of strategies for simplifying the information, so the most that you'd find is a header saying "this is how we collect your data, this is how we use your data, and this is who we share your data with", with no ways of splitting out that information to make it more usable for any user, let alone a young person.

It's really fascinating that whenever we talk about young people and terms with adults, they immediately say, "Oh, but I have the same problem." Obviously, from a Five Rights perspective, we think that children deserve a higher bar and that we have a duty of care. So I was also interested, Ephraim, in whether you felt that the young people had more creative solutions than adults. And I know you've actually got a little bit of a video for us, which I'm hoping someone is going to play, and maybe after we play the video you can explain some of the ideas that the young people had. But if I understand it rightly, you asked them to come up with a list of shoulds and shouldn'ts to tell the conference audience.

Yes. We do have that video, from some young people who we spoke to in a COVID-secure youth club in the southeast of England, and we can share that with you and discuss after.

So we think that the shoulds that people should do for terms and conditions is: make it presentable for everyone to agree, so for everyone to be able to know what they're agreeing to. Another is that they should put an audio button, so it will be read out for you, so we won't have to read it ourselves, and also if you're blind or you can't actually read it yourself, someone else will read it for you, so you can hear it instead of reading it.
So if there is a lot of information to consume, you can put it into different sections, and at the beginning of the terms and conditions you can use subtitles and put an index, so that people can see exactly what they need to go through and address their worries. The next one is: get to the point. So bullet points, short sentences, short paragraphs, not masses and masses and masses of writing. Yeah. The next one is to not make it boring; obviously a lot of people will just get disengaged after a while if the terms and conditions are a bit boring, or if they use too much complicated language. It shouldn't just be an agree: there should be a disagree button, just in case you don't agree with what they're doing, you don't agree with them collecting your data and tracking other apps and websites you go to. And they shouldn't use jargon, which is basically tricking people into agreeing, so you have to show exactly what they're agreeing to. And don't make it too complicated: use words that we would hear, that we would use on a day-to-day basis, and not those big, big words that we wouldn't understand.

I think that's absolutely brilliant, because in one fell swoop they really focus in on a number of things that this very expert working group has spent 18 months identifying, and here you have two young girls coming up with it in a moment. What else did you get out of speaking directly to these young people?

So, I think some of the things that they mentioned there really point to a problem that is linked not only to the way that published terms are presented to people, but to a way in which the services are shown to be disproportionate to what you'd see in real life.
One thing that was mentioned by the young people in our workshop was that the terms were disproportionately long, and the practices on display were also disproportionate to what people got from the services. One of the young people said that they would be inclined to read published terms if it was in a situation that was useful to their lives, so consequential, like going to university, but not necessarily for social media. And that goes to a deeper problem with the volume of the terms and conditions that they see. I think another thing, and this was one of the most important points to me personally, was that when the young people looked at some of the examples of published terms that we had, the cookie notices that we showed them and the examples of privacy policies, they talked about how the way in which the information is presented would nudge you to accept, accept, accept. But I found that the way they understood it was that there was no clear way to really improve on the way that the information is presented without making it either too long, and therefore too difficult to understand, or keeping it too short; and when it was kept too short, there was a level of confusion that they had about it. But they thought that this should be the responsibility of the services themselves rather than the young people.

You know what, that's absolutely fascinating, because I think I have probably had that argument with a couple of tech platforms myself. And I want to ask you, and this is something we really did find in the working group: do you think that that is actually because, fundamentally, those published terms are covering up so much that is inappropriate for those young people that you can't, in all conscience, make them short and true?
And so we have this absolute tension between making it short and revealing a perhaps unpalatable truth.

Precisely. One of the things that one of the young people said to us was that they wished they had the choice to really decide how their data is being used, but the way that the sites and the services present the information to them often misleads them. One person said that they weren't really sure what a cookie notice was; until quite recently they thought it was a reward that a site gave you for signing up to their services. And sites use a lot of really confusing language that the young people don't understand, but one of the young people pointed out that perhaps that's because they don't really want you to know what they do with your data; perhaps that's because if you did know, everybody would say no. And this is a problem that you see with cookie notices in general: you have an "accept all" button and you have a "manage your choices" button, specifically telling you that if you accept all you get to click through and enjoy the site, and if you want to manage your preferences there's this long thing that you have to read, and you may not like the choices. So I think, yeah, that's definitely a very big problem.

I remember actually being in a workshop and watching, in front of my eyes, a young child, I think they were about 12 or 13, invent two-factor authentication. You know, they hadn't worked out that it already existed, and it was a sort of genius moment when we set them the task of trying to solve a problem. And I happen to know that one young person did suggest that they have personal terms. I wonder whether you could just explain what personal terms are, and whether you think that's a goer for the rest of us to consider as a possible solution to the problem you're describing.
Yeah, this was something that was actually brought up by a few young people I know that Five Rights have spoken to before, and I think, from a broader perspective, personal terms have been tried before in a range of ways. Personal terms basically mean you would have your own privacy preferences that you have decided on, and you'd set them at some point at your leisure, and then allow websites and services to read those in a machine-readable way. You wouldn't have to constantly accept things: you have your personal terms that you've set at the beginning of your process, and once you've set them you don't have to change them, and websites have to come to you to say yes or no to the terms that you've set. And I think young people thought this was a good idea, both in our own research but also in other research that I've read before.

Great. So I want to finish by making an observation and asking you if you have had a similar experience. In that short clip that you showed us, there's an incredible generosity from one of the young girls about accessibility, and this idea that we should look after people who perhaps aren't coming to this technology with, you know, all available routes, all available personal tools, etc. And I've noticed this before; I've noticed young people saying, why don't we have accessibility tools on by default, and then we switch them off for, you know, more able-bodied folk. So I wonder whether that's a theme in your work, whether young people are very specifically concerned about that, because it wasn't something that came up in the same granular depth in the working group, I must admit.

As you saw, some of the young people mentioned that it would be useful to have something like subtitles on your published terms by default.
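[To make the "personal terms" idea discussed a moment ago concrete, here is a purely illustrative sketch. Every name and preference in it is hypothetical and not part of any proposed standard; real-world precedents such as the W3C's P3P and the Global Privacy Control signal are far simpler, but the shape is the same: the user states preferences once, and services check their requested practices against them.]

```python
# Toy model of machine-readable "personal terms": preferences set once by
# the user; a service must check its requested data practices against them.
# All preference names here are invented for illustration.

# A young user's personal terms, set once at their leisure.
personal_terms = {
    "allow_analytics_cookies": False,
    "allow_marketing_use": False,
    "allow_location_sharing": False,
}

def service_may_proceed(requested_practices: list[str], terms: dict[str, bool]) -> bool:
    """A service lists the data practices it wants to use; it may proceed
    only if the user's personal terms already permit every one of them.
    Practices the user never opted into default to 'no'."""
    return all(terms.get(practice, False) for practice in requested_practices)

# The service comes to the user, not the other way around.
print(service_may_proceed(["allow_location_sharing"], personal_terms))  # False
print(service_may_proceed([], personal_terms))                          # True
```

[Note the design choice of defaulting any unrecognised practice to "no": it mirrors the young people's point above that the burden of safe configuration should rest with the service, not with the user or their parents.]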
One of the things that relates to accessibility was that they also thought published terms should be read aloud by default as well, and that you should have the opportunity to switch that off, but it should be as accessible as possible as a first port of call.

Thank you so much for that, and I'm sure people will ask you questions; I do encourage people to put their questions in the Q&A for a little bit later. I'm really happy to introduce Alpesh Shah. He's the Senior Director of Global Business Strategy and Intelligence at the IEEE Standards Association, where his focus is on organizational growth and the advancement of ecosystems towards accelerated outcomes. Before joining the IEEE, Alpesh worked as an advisor to various organizations with a focus on organizational growth. But I have to say that Alpesh has been a great friend to Five Rights, and we've asked him to talk from his perspective about some of the work that we've been doing, and the intersection between standards and rights, and whether this is in fact an important part of the ecosystem. Now, my understanding is that I can't see Alpesh, but that we are about to hear him. I hope that is in fact the case. Alpesh, can I first ask you about some of the things that Ephraim's been saying, that there's this incredible creativity amongst kids to find solutions. I just wonder, from your perspective, who you think the stakeholders are in solutions, and where you think standards fit into that stakeholder mapping?

Sure. Thanks for throwing me a softball there. First off, I just want to thank you, and Katina and the team here, for inviting me to a tremendous day and a tremendous conference with such a proven group of speakers. I really appreciate the comments made earlier both by Rys and Ephraim, and I think, when we think about the stakeholders...
I've always thought about it from the aspect that it always has to start with the child, and then work its way out from there. But too often it's been the other way around. So, as we're thinking about this space standing up and growing, and we look at the role of standardization: standardization is a great tool to help harmonize, to drive interoperability, and to offer greater access to innovation. To Beeban's earlier point about the role it plays within the ecosystem, it plays a very critical role. It was mentioned earlier that service providers should offer solutions that are better catered towards children; part of it is that we need creativity and also perspective in the innovators that are coming in as well. At a time when, if you think of even the US economy, 95% of it is really driven by small and mid-size enterprises, many of these players may not understand, may not have access, may not have the ability to develop the right types of solutions. Think about it from the angle of a standardized fabric, with access to libraries and an open-source platform, and really a children's stack, if you can imagine that, upon which they could build. And this idea of "code is law", if you will, being accessible to them starts to improve and increase the ability of the ecosystem to move in the direction that you've heard Beeban mention and speak about for quite some time. You heard Katina speak about it earlier, you heard Rys talk about it, you heard it in the examples from Ephraim, and you'll hear many folks talk about these concepts throughout the day, as in the past few days. And central to all of this is an age appropriate design framework.
This is something that Beeban and Rys and Ephraim and a number of folks at the Five Rights Foundation have really started to come together on, to help drive this idea: imagine if you could really lay out a common approach, where these ideas about terms and conditions and parental rights, ideas that are even reflected in the elements of the ICO's code, are accessible, are addressable, are achievable, in a mechanism that others can deploy not only in a manner where they do it right, but in a manner that's also cost effective. That's the additional benefit of standardization here. And when we think about it from an ecosystem perspective, how do we make the ecosystem better, the ground more fertile? There's also the opportunity for us to think about certification: imagine age appropriate design terms and conditions for every one of those service providers, and every one of those companies that have stepped up to the plate and made those T&Cs more accessible, and bringing in a registry. Now you've made it easier for school systems to go there and identify the right software or the right solutions, and imagine parents and the PTAs around the world having a clear ability to understand where data is compromising our students and their futures. These are all elements that are quite feasible through standardization, and the point that I really think is necessary, and has been driven by all the speakers here, is that it's really important to be accountable today for the future by taking proper action; standardization is one mechanism to help us take a step forward in that direction.

You know what, that was so wonderful, and I have to say I almost broke into song when you were building up your "imagine, imagine, imagine". I do spend a lot of time imagining just what you say, and I think it's a powerful picture.
It leads me to ask you another question, and I speak now a little bit in my role as a legislator. One of the issues we always hear, or one of the obstacles should I say, one of the reasons for inaction, is: this is a global problem, so we can't solve it on a national level. And so long as we see it that way, everybody's in some sort of checkmate. But it strikes me that standards don't necessarily have to observe national boundaries, and I just wonder whether you could talk a little bit about that, and about how it might help us knock that particular sacred cow off the, I don't know, I'm in the middle of a sentence I can't finish, but how it might help us attack the jurisdictional problem.

Sure. I think, you know, when we look at Wi-Fi technologies, your Wi-Fi for your laptop works wherever you go. IEEE is a global standardization body; many of the standards that come out of IEEE work across borders, across the world. You can look at our medical device work as well. Really, I think what it boils down to is context on one front. But I think the other problem is really something we were talking about a little earlier, which is just folks getting over that hump, feeling that they can really contribute in a meaningful way, and having access to those tools so they can start doing it. Part of it is building capacity. Then there's this other part with the current, let's say, players in the space. For some, there's this statement of "it can't be done, it's too hard". But, you know, honestly, we hear that about many things. Right? And that's not a reason not to do the right thing. What we are talking about today is doing the right thing, and doing the right thing in this case means taking the proper steps forward: showing folks it can be done, showing them how to do it, and then enabling them to move forward with it.
There's another observation worth making, and I'd be interested to know whether you see this in market terms or in terms of soft power. When we talk about tech companies we tend to have a handful of companies in mind, but actually, if you think about it, most advanced businesses are in some way tech companies. So there is a culture shift, and many, many of those companies don't necessarily have the same interests, and do have a specific, corporate commitment to young people. Do you see this work around what you refer to as the children's stack as an opportunity for those companies?

I see it as a great opportunity for the established businesses, for those that are starting their businesses now, and for those coming in. You asked me at the very beginning, Beeban, about this idea of who we are really designing it for, who the key stakeholders are, and where we are right now. And Five Rights has already started taking the necessary steps: you've already started working with children, getting their input into this. I think even of the terms and conditions problem statement; there's no reason why we can't think about what the elements are here that can be visualized in many ways as well. In the same vein, the established businesses may also be in a position where they get disrupted by a lot of what needs to occur. So yes, I think either they want to do it, or they'll be pressured to do it, or they opt to see others do it and then come to it that way.
I'm going to finish with one thing, which is that one of the things we keep hearing children say is that they want consistency. And it struck me as you were speaking that that's what you're offering: IEEE is offering kids the consistency they crave. Have you found other user groups who have that same desire for consistency?

Yeah, I think if I were to think about the various stakeholders, everyone is looking for consistency, because that consistency can help lead towards proper policy being in place that complements it so strongly. And when that's in place, it becomes almost part of the normal course of business. Where have we seen it? We see it amongst not only small and mid-sized businesses, we've seen it amongst the NGOs that are involved, we even see it amongst some of the major companies that are offering toys for children or are otherwise in the space. And more and more, we're seeing additional folks starting to come into this space with an interest in seeing this happen, because for them it gives a nice pivot point.

Fantastic. Now let me come back to the audience, because we have some questions, and please do keep them coming. I'm going to take one of these questions and go back to Rhys, because someone has asked: what are the most urgent requirements in helping young people in this space? In a sense, the work of the working group identified what they thought were the obstacles and the huge positive drivers, and I wonder whether you could extrapolate those for us.

Yeah, sure. Absolutely. Thank you, Beeban. It's an interesting question, because this is exactly what the working group that we've set up with the IEEE, and that we're working to support, has been grappling with for quite a while now: trying to work out, if you put it in a nutshell, what is it?
What is it that service providers can do to break down those barriers and go forward? We've created four categories of things that service providers need to do to make their terms and conditions age appropriate. Firstly, the big one, the easy go-to, which is age appropriate presentation: making sure that when you do have terms and conditions, young people can actually access them, in multiple formats, in a language that they understand, in a way that appeals to them, so that the information actually is accessible to them. The second driver the working group came up with is the idea that you don't have to reinvent the wheel every time. There is a legal concept of fair terms in consumer law, and it's actually quite developed. If service providers and product developers took a look at what fair terms are in the legal sense, and applied those in their own terms and conditions, you'd immediately get a bump in the interactions between young people and a service provider, and an improvement in the realization of rights for young people. The third thing the working group centred on was the idea that, again, you don't have to build everything from scratch: we have this huge canon of children's rights and children's rights law, and it contains all sorts of principles that are applicable in a digital setting as well. Take best interests, which speaks to how, when you're looking at your business model, you balance what's right for your business against what's right for the child. It points out that what children need, and children's rights, always have to be the paramount consideration when you're interacting with children.
So if we can get terms and conditions to be compliant with children's rights, all of a sudden we start to reshape the offer that young people and children are able to access. And then the final bucket, in terms of the good things that service providers can do: it sounds really obvious, but it's just not done very often, and it's as simple as recognizing children. A lot of service providers do provide services or products to huge volumes of young users, quite often young users who have specific needs or specific issues. But because they don't necessarily claim to target them, they just pretend that they don't have young users, so their terms and conditions don't need to be fit for them. So there's a first and obvious step of simply identifying whether children and young people are using your service, because most likely they are. Those were the four positive things. Then, in terms of what the working group decided were the difficulties that services and products have to look at: firstly, it was around challenging their own commercial interests. So looking at your business model, asking whether you have children and young people written through your corporate social values, and whether you can use that to change the product or service you're offering to make it more age appropriate. And the last one was the difficulty that the very particular platform intermediary status can give rise to: looking at how services and products can avoid inappropriately hiding behind their intermediary status to continue offering services and products that aren't age appropriate. So across those six areas, we've had this whole working group operating for a year to develop actual lists of dos and don'ts.
What are the tasks you need to do to get children's rights into terms and conditions, to make them offer fair terms, to make them age appropriate, and to ensure that commercial interests don't harm children? Looking at how you do this and how you get it right: that's been the work of this working group for quite a while now. But we are approaching the draft of a standard, I think it's safe to say; slowly, slowly, we're approaching that. And I think when it's released it may be quite a game changer, because, as Alpesh put it, this is one way to make it really easy to do the right thing, which is a type of capacity building. We're going to tool the sector and the industry up with a way of saying: here is how you can do it right.

Thank you, Rhys. That's absolutely fascinating, and you can tell exactly why it's taken us so long when you see that list of things, because they are not small issues. I have a question here which I might pose to Alpesh. The way it's been framed: might we be able to use the response to terms and conditions reform, especially in the online gaming sector, for application to adults as best practice? And I'm going to invite you, Alpesh, to think about that more broadly, because I know you've been doing a lot of work around ethics in general. What about this idea: if we work out what's good for kids, does that tell us anything about what's good for adults?

You know, I love that question; it's been on my mind for quite some time, especially as we think about this idea of the children's stack, which is something we've been exploring for a while, and about open source as another mechanism to standardize within IEEE. I think it's an interesting idea, especially if you're considering that your platform is one that you're looking to license down the road to others.
In particular, if I were to use the children's stack as an example, many of these by-design ideas really need to be fully vetted and developed, and then having the ability to invert what's happening now, turning things off for a more adult-oriented approach, versus the current approach of trying to dumb things down, may have more marketability in the future. I think it also depends on what segment you're looking to focus on and what you're looking to offer. But in the end, I think what we are all finding is that the big no-no is taking an adult-oriented system and giving it to kids. That's the big issue. The others are cases where, when we get into the grey areas, we really need to understand: what is our true intent? What is our expectation? When I hear "by design", quite a bit of it for me is: let's take a step back and understand what our end goal is, what we are trying to do, and for whose betterment. Too often that's something we forget when things are being designed, and we end up doing things because we can, as opposed to asking whether we should. This goes to your earlier point too. There's been a considerable amount of work driven by SSIT members, by the Global Initiative on Ethics of Autonomous and Intelligent Systems, and by others, where we start thinking about ideas such as nudging. As these technologies evolve and nudging plays a bigger and bigger role, how do you manage for that? Will an adult system better cater to children in this way, or will you take a children-based approach and have it bubble up? I think these are things that require a closer look, but I do think there's something in a nudging system designed for children.
And having transparency about where those decision points came in the nudge, and making that accessible to the proper players in the ecosystem, the parental ecosystem, that could be a good thing. Sorry, that's a bit of a broad answer, but I do like the thinking behind it.

Well, I think it's a fantastic answer. One of the category errors, and you can see why it was made, was the utopian vision that each one of us would be the same online. Unfortunately, or fortunately, we're not all the same: we have different needs, different requirements, and in the case of children, different rights. So it is a maturing of the sector to have to treat its users in different ways. Now, I have a question here which you are absolutely the person to answer: when gathering data and thinking around terms and conditions for a new service, how important is it to consult the primary stakeholder, in this case children and youth?

I think it is very important. From the workshops that we've been running, what I've learned through trial and error is that the majority of insights we gain come from talking to children, understanding their points of view, and specifically understanding what their pain points are in the services they use, and where they're coming from when they talk to their friends about those pain points. It's important to understand that children have quite a lot to say about certain cultural and situational aspects of using services that you can't necessarily get from talking to adults, or even from briefing people about what the literature says children require in certain circumstances.
I think a good example of this came from the workshop we ran most recently. A lot of the literature on how you should draft policy information for people, including young people, says that you should make sure it has a formal appearance, so that it builds trust and invites people to read it as a, quote unquote, serious document. When you actually show children policies built that way, they have very different ideas about how formality comes into play, or about what invites them to read or see something.

That's fantastic. Rhys, I'm aware of the clock ticking down, so I'm going to finish by asking you for a very short answer, please, around the role of child impact assessments: what responsibilities do digital content and service providers have in addressing their own service using child impact assessments? And the person who's asking has made the point about the intended and unintended consequences of their service.

Apologies, there we go; a year on and we're still unmuting ourselves. Look, I think it's an incredibly interesting question. At the moment, there isn't a requirement for services or products to undertake child impact assessments, and I think that's where a lot of this goes wrong. As Alpesh was saying earlier, the solutions aren't upstream enough, and people just don't think about children and young people until the product is fully developed. Part of the work we're doing is really to try, as Ephraim put it, to make sure that children and young people are seen as stakeholders from the beginning, and that their needs and their rights are recognized, including, as Ephraim was pointing out, their right to participate, to be involved, to be heard, to help come up with the solutions, to co-design.
But there is a simple tool that products and services companies can use: a child impact assessment, to evaluate themselves, or every new product that comes onto the market, from the outset. For example: here is a list of commonly known hazards when developing products for young people, things that can go wrong; have you looked at these? Or: here are some things you need to consider in your corporate social values. So there's huge capacity for child impact assessments to start driving that thinking upstream, and to help people develop services and products that are right from the get-go.

Thank you so much, Rhys. We are in the final couple of minutes of our session. There was a question about media literacy, and I'm going to use the role of the chair by saying: we are hugely supportive of media literacy, of data literacy, of digital literacy, of all the literacies. But it is not a substitute for creating an age appropriate online world for children in the first place; we mustn't educate them towards a system that is not good for them. And now I really do want to thank the panelists, Rhys, Ephraim and Alpesh: thank you so much for your words of wisdom. I want to thank Katina, who, I don't think we said right at the beginning, is in fact the chair of our IEEE working group; Katina, thank you for today and thank you for all that you do on our behalf. I want to thank the team at IEEE and at the conference for their support. And mainly, I want to thank the young people who gave us their thoughts and who are always our inspiration. We're going to close with another very short video from the young people we spoke to in Southeast England. Goodbye, and thank you very much.