Hi everyone, and welcome to the panel that's going to be talking about privacy by design practices in the various organizations that are present on this call, but also of course in the sector as a whole. I'm Udbhav, and I'm a senior manager for global public policy at the Mozilla Corporation, and I'll be moderating the discussion today. Before we kick off the discussion, I thought we could very quickly go around the room and have the speakers introduce themselves. So over to you to start off, and then please feel free to pass it on to the next speaker. Thanks, Udbhav. Hi everyone, my name is Uttara and I lead public policy for Snapchat in India. Very nice to be here. Kailash, would you like to go next? Yeah, thanks. Hi, I'm Kailash, I head technology at Zerodha. Zerodha is a stock broking firm. Over to you, Shalini. Hello Udbhav, hello everyone. This is Shalini. I work as a privacy engineer at Zeta. Great, thank you. I just wanted to confirm, Shalini, whether AVS Prabhakar would be joining us as well, because I know that he had accepted the invite, but I'm not sure whether he's on the call, so I thought I'd check with you since he's also from Zeta. Yes, he is actually travelling, so he won't be able to join us today. Okay, no worries. So I guess we can have the conversation with the folks in the room. I think, to kick it off:
Privacy by design as an idea has of course been around for quite some time, but it's the recent spate of regulations, starting with the GDPR in Europe a couple of years ago, and even now in India with the data protection bill, that's really brought this concept to the fore. Organizations of various sizes are thinking about how they can ensure that their practices, both when it comes to the design of their applications and services, but also their internal practices with regard to how they deal with the data they collect through those services, are as privacy preserving as possible. In the conversation today we have organizations from a multitude of sectors, from fintech to social media, that all carry a wealth of experience with regard to both their internal best practices and how they engage with the space overall. So to kick the discussion off, I thought we could start with you, Uttara, talking generally about Snap's own experiences with the concept of privacy and how it incorporates it into its applications, but also how you see the external environment when it comes to privacy regulation, and how it's impacting the development of apps and the provision of services in the technology sector. Thanks, Udbhav. You know, there's obviously a lot of complexity that you have to grapple with when considering questions about privacy by design, right, and my learning from being at Snap over the last nine months is that it is important for us to come together as an ecosystem and delve into these questions.
In my mind, a good, sound privacy by design practice really is a design and development process that considers the privacy and safety implications of new product features at the front end of the development process. That's what I've learned in the last nine months, being at Snap and learning from colleagues at a company that prioritizes privacy. It's not a post facto consideration, or merely a catchphrase to use in regulatory filings. I think "privacy by design" is used a lot in public policy parlance, but what I've really understood is that it is something companies need to think about at the get go, at the time of developing apps and even new features, and it sets a baseline: if a particular feature or update does not pass the internal privacy review, and you have detailed processes to determine that, you don't launch it. Another thing I'd like to say is that making specific design choices does inadvertently throw up questions about a trade-off between higher user engagement and privacy, especially if you're a consumer-facing company whose business model is, for instance, advertising based. And some companies, for instance Zerodha, who's on this panel as well, and Udbhav, you're from Mozilla as well, show that there is a way to make intentional choices that ensure we keep that balance in favor of privacy. So, you know, going into Snap a little bit.
It was a refreshing discovery for me how this privacy and safety by design practice, and the data management practice, when practiced in earnest, has such a powerful effect on preventing the types of failures that regulations, like the ones we're seeing now, try to regulate for after the fact. So a couple of key principles that seem to me core to a privacy by design practice. First, as I said earlier, a privacy and legal compliance by design framework, so all features are reviewed by a product attorney and a privacy engineer before launch. Second, limited retention of data. For instance, if you want to target ads, you only create a profile of a person that is relevant to who they are today, not some decade-long profile of them; there's no PII-based ad targeting, and the data is deleted as soon as it's been used, so there's no long profile that you build of a person and then target ads towards. You don't use sensitive data for targeting ads, and you limit the number of interest categories through which you target an ad to a certain person. User controls and transparency also become an important factor: allowing users the option to opt out of third-party data for targeting, and the option to see the kind of data that is collected about them. All of that transparency is important. Then of course you can have incremental protections for children, for instance saying that you won't target ads to them at all, or you won't profile children's data in any manner or form.
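The retention and targeting limits described here can be sketched in code. This is a hypothetical illustration only, not Snap's actual implementation; the category names, the cap of five interests, and the 30-day TTL are all assumptions made for the example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Illustrative limits -- these specific values are assumptions, not Snap's.
MAX_INTEREST_CATEGORIES = 5
PROFILE_TTL = timedelta(days=30)
SENSITIVE_CATEGORIES = {"health", "religion", "politics"}

@dataclass
class AdProfile:
    """A short-lived targeting profile that cannot accumulate indefinitely."""
    user_id: str
    interests: list = field(default_factory=list)
    updated_at: datetime = field(default_factory=datetime.utcnow)

    def add_interest(self, category: str) -> bool:
        # Refuse sensitive categories outright.
        if category in SENSITIVE_CATEGORIES:
            return False
        # Cap the number of categories used for targeting:
        # adding a new one evicts the oldest rather than growing the profile.
        if len(self.interests) >= MAX_INTEREST_CATEGORIES:
            self.interests.pop(0)
        self.interests.append(category)
        self.updated_at = datetime.utcnow()
        return True

    def is_expired(self, now: datetime) -> bool:
        # Stale profiles are deleted, not kept as a decade-long record.
        return now - self.updated_at > PROFILE_TTL
```

The point of the sketch is that "who they are today, not a decade-long profile" becomes enforceable structure: a hard cap plus a TTL, rather than a policy document.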
So those are the broad principles that strike me, from a design perspective and from a data management perspective, as central to a privacy by design practice. I'll stop there, and the one thing I'll add, just to answer the second part of your question, is that there are apps that are making these intentional choices with regard to their design and their data management practices. When that is happening, a lot of thought is being put into creating these apps, and I talked about that trade-off between privacy and engagement, which has an effect, at the end of the day, on the kind of revenues a company earns as well. So when that long-term focus on privacy is taken, keeping in mind all of these practical considerations, a law like the PDP Bill, or other regulations as well, can sometimes feel a little one-size-fits-all, because they're trying to regulate for the kinds of harms you typically see on the largest, most ubiquitous platforms. Smaller companies are not always creating those types of problems; in fact, they were often designed keeping in mind the types of failures we're seeing more broadly in the ecosystem. And it definitely feels like companies of Snap's size, the more medium-sized ones, are not the source of the problem, but we do get impacted by one-size-fits-all regulations. Thank you.
That was, I think, super enlightening, to see how an international company with services in multiple jurisdictions, rather than necessarily using regulation as the motivating factor for designing services, comes up with a set of principles that it can use to build globally. It's something we at Mozilla have tried to do as well with the lean data practices framework, which we've been talking about for quite some time and have worked pretty closely with hasgeek on, and I'd be happy to talk about that a little later in the conversation too. But moving on, Kailash, maybe you could go next. The financial sector, as you can imagine, has had the kind of regulation that the rest of the technology sector is only really beginning to see now for a very, very long time. And intrinsically, the relationships that financial players have with their customers are among the most important relationships, because they're fundamentally about their financial investments, their money, and various other things. So the term privacy in this case almost takes on a whole new dimension, because of how sensitive this data is and the power dynamics of the relationship between service providers and consumers. Could you talk a little bit about the approaches Zerodha itself follows in designing its products, but also some trends you generally see in the industry? Because over the last couple of years we've seen a growth of this idea of fintech in a very expansive way, and there are some practices that even regulators like the Reserve Bank of India have started to look into, and regulate or criticize in some instances as well.
So your thoughts and opinions on those, but more importantly on Zerodha's own internal approach, would be pretty welcome. Yeah. So we are in a bit of a peculiar spot here. We're a stock broker, and stock brokers in the capital markets are not just regulated but heavily regulated, by multiple entities; we are regulated by five or six different entities including SEBI, the capital markets regulator. So we have very strict regulations pertaining to absolutely every bit of information that we collect. To give you an example: our web server HTTP access logs we are mandated to store for three years, users' data we are mandated to store for eight years, and every bit of information that we collect, all the document scans, proofs, signatures, we are mandated to store. So the idea of lean data doesn't apply here, because we are mandated to collect humongous amounts of extremely sensitive data. That is a whole different playing field. And if you look at any broker, any entity in this ecosystem who's regulated by the same regulations, we are not in a position to collect any more additional data, because there's no additional data to collect; we collect absolutely everything, as mandated by the law and regulations. Now, the new draft privacy bill, as revised by the JPC, is of course an omnibus bill. It has provisions for personal data, non-personal data, social media, all sorts of things, and it's a really complex bill to even decipher or understand. But if you dissect it, many of the provisions prescribed in this bill we have already been following under our own regulations, and I'm pretty sure the banking industry has a similar, but slightly different, set of rules. So compliance is really a way of life here. Technology, apps, data, these are all secondary; compliance first, and then you build whatever you want to build to fit into the regulatory compliance framework.
So in this industry the way of thinking is entirely different; it's flipped around. Now, I say this industry because fintech is now an extremely broad, maybe even abused, term. We are heavily regulated financial technology players, but there are lots and lots of fintech apps that just pop up, that are not regulated, that operate in grey areas. When it comes to privacy by design there, we've seen a very unfortunate trend of certain kinds of fintech apps, especially ones that lend money as collateral-free loans, infringing on all possible common-sense, first-principles ideas of privacy by design: mining people's SMSes or mining people's contacts to do dubious kinds of profiling, and pressuring people to take out loans they can't afford. So unfortunately the word fintech has been shadowed by a spate of such applications, completely unregulated; the regulated side, though, is made up of entities like brokers and banks. So, coming back to privacy by design as a product: as I said, there's no way for us to be lean, as we are mandated to collect absolutely everything. And the amount of data we collect in fact goes up whenever new regulations come up that ask us to collect even more data. There are KYCs, there are re-KYCs, there are video KYCs, and these have to be repeated regularly. That said, we still generate huge amounts of sensitive financial data; people who invest and trade with us, their transactions are recorded for at least eight years. So there's this whole thing about data, "data is the new oil", as the adage goes. Our product design philosophy is kind of the antithesis of that. We think that this whole "data is the new oil" adage is highly overrated. So at Zerodha we don't do any sort of profiling. We have huge amounts of data which we store on behalf of our clients, but we never profile, we never extract behaviour.
We never use any of this data to send recommendations or to push any sort of engagement. In fact, I'd like to call what we practice user disengagement. Our philosophy is that you build and offer a quality product, offer quality services, keep everything transparent, and if people like it, they'll come use it. We've been following this principle since the very inception, and over a decade it has worked out for us. So we can say with conviction that that model works. You don't actually have to mine and profile people's data to offer them products or services or make sustainable revenue; people pay us to use the product. So that is also possible, and at Zerodha that's the philosophy we've adopted since the very beginning, despite storing mountains of financial data, which a lot of people may say is valuable, but as I said, we think that is overrated. In many cases, for many businesses, there is no need to mine and extract value out of people's private personal data. It's possible to just offer quality products and services. It may not work in all industries, but since we are specifically referring to fintech, it definitely works in fintech, and that's what has worked for us. Thank you.
I think that was both a very interesting insight and a very deliberate example of the fact that just because you have a lot of data doesn't necessarily mean you have to utilize that data in a particular way. In fact, when we at Mozilla talk about the lean data practices approach, essentially the first principle is to only collect the data that you necessarily need, and I think that both because of regulation and the kind of sector you're in, even though you end up collecting a lot of data, the follow-on points, which are about ways to utilize it that respect user agency and are in users' interest, are clearly something you've thought about. I have a couple of follow-up questions, but before really going into that: Shalini, would you like to come in and talk about both Zeta's experiences in the space as well as, in general, the design of its products? Sure. So we are in the process of fabricating privacy into the culture at Zeta, and more than that, into the processes that we are following in the business, and even when recommending things to the client. We have been very cautious with that, and also, while establishing the privacy governance structure at Zeta, we always made sure that there were the best sort of people to manage privacy, and in terms of handling the procedures and documenting things, everything was taken care of in a very lucid manner. And in terms of user experience, I have to speak about the front-facing, user-facing applications; we have always mandated that they are privacy compliant, in terms of having a transparent policy explaining to the user how we would use their data, how Zeta is going to process that data, and why we are processing that data.
So things like that we have been very particular about. Apart from that, we also maintain the habit of keeping an inventory wherein we record all the personal data that is being stored or processed by Zeta, so that we can in turn keep a keen eye on the data fields that have been generated, and also set the right set of controls for each and every field that's generated out of Zeta. Great, thank you. I think it's pretty clear that there are certain trends emerging from the points everyone has raised, and one of the primary goals of this conversation is to document some of these practices for the rest of the community. While people have given an overview of what some of those practices could be, before getting into the practices in detail we could also talk about what people think of the current state of these practices in the Indian technology sector as a whole. It's something that, for example, Kailash, you referred to when you were speaking about other fintech apps, apps that are completely unregulated, deceptive design patterns, and how they're associated with data collection. So maybe Uttara and Shalini, in that order: from your experiences as participants in this space, not necessarily as people representing your companies, what do you feel about the state of practices in the ecosystem right now? What should be done to enable some of these practices to be adopted better? And what role do you think both the companies in this room, but also organizations like hasgeek, can play in making sure those practices are more commonly adopted? So Uttara, would you like to go first?
Yeah, no, I think that's a great question. If I think about Snapchat, what's core to the app's design really is ephemerality, and the separation of social content from media content. So effectively there is no news feed where, say, my posts as well as the news are all in one place, and there's no public profile that people can react to and have a highly public conversation around. If I'm honest, just looking at the space, we go back to this whole privacy versus engagement question, right? I think the fact that some apps have open architectures, and I'm talking strictly about the social media space, is because there is something about that design that is much more heady and compelling, that grabs a user's attention and creates more engagement. And at the end of the day, what are the incentives for an app to prioritize privacy over that engagement? I think we're now at a point where the debate on safety on social media has reached a tipping point, such that regulation is now coming in to try to solve that problem. What's distinct about Snap, and maybe this is me having drunk too much of the corporate Kool-Aid, but it's also really having noticed how the company has been designed right from the start: it was very intentionally designed as an antidote to social media. It came from the learning that having this one feed where everybody posts, where you can see your friends' content as well as the news, or having a full record of everything you've said since you were sixteen years old on a public profile, is not privacy leaning. And that shapes everything about the app.
From the ephemerality of chat, and the fact that chat is separated from the part of the app where you can see the news: we have an old-school approach in Discover, which is our news tab, where you only work with a publisher, and you have a contractual arrangement with them. So that takes care of the fact that there's no content out there that is unlawful or off-color in any way. These types of choices, to me, were frankly prescient. I'd love to hear from the other panelists, and from you too, Udbhav, in terms of consumer demand for this. Is there a growing momentum, are we reaching the point where users feel like, hey, I no longer want an app to specifically target me, I don't want to be on an app where I'm performing for everybody? There are massive incentives in terms of vanity metrics, for instance, and there's an entire culture around the headiness of more social apps, which Snap, I think, is intentionally trying to steer away from. I think ultimately it's a long-term bet: you want users who ultimately choose your app because they feel safer and connect better on it. And I think the user base of Snapchat is kind of indicative of a shifting trend; we have mostly 13-to-25-year-olds on Snapchat the world over, and that's telling us something about the way younger people are preferring to communicate. Hopefully that's a trend that will stick. So yeah, a bit of a ramble there, but really, I don't know the answer in terms of user demand and that tipping point.
But ultimately, if companies are saying, hey, I don't mean to be this arbiter of big democratic questions, I don't mean to be an arbiter of free speech, I think Snap has taken a different approach and said, listen, we're just going to be a closed platform, and we're not going to become the destination for these heavy conversations in the first place, the ones causing the kinds of harms the regulation is trying to solve for. So yeah, that's Snap. Wonderful, thank you. And I'm happy to respond a little as well, because we definitely have some learnings at Mozilla in terms of the general demand for privacy and for more privacy-preserving practices in this space. We've seen that reflected both in the usage of our own products and in the kind of new products that consumers are asking organizations like Mozilla to create, and I'm happy to get into that. But before that, Shalini, I want to give you an opportunity to answer the same broader question I'd asked Uttara, in terms of your views on practices in the space as well as things that could be done to improve them.
Yeah, so there's one point I would like to raise here regarding awareness of privacy itself, of privacy as a practice that has to be followed by companies, as well as understanding the different laws that are emerging, prioritizing that, and seeing whether it's met in terms of compliance as well as governance at the company. We also hear phrases nowadays like "data is the new oil", and talk of keeping privacy at the center of user experience. But as often as we hear that this is the era of big data, I feel that this is the era of privacy, because with laws emerging globally, it's a very evident fact that we need to be responsible and accountable for users' privacy, keep them at the center of building things, and keep embedding privacy in every phase of developing products, be it at the fintechs or the social media companies, wherever it is possible. Yeah. Thank you.
And just to take that point forward and cover the thing I'd mentioned earlier: even at Mozilla, when we started talking about lean data practices with other organizations, which is now almost seven years ago, and in general when we first launched enhanced tracking protection, where Firefox was one of the first browsers that proactively blocked trackers and cookies from following you around the internet, at that time there was interest; we got media coverage, and consumers liked those services. But if one were to compare, say, 2013 or 2014, which is when some of these things started happening, to now, there's been a sea change in terms of both the demand from consumers for these services and how other players in the ecosystem engage with them. To give you an example with enhanced tracking protection: of course, the entities this impacted the most were advertisers and website operators, because advertising tends to be a source of revenue on most websites on the internet. But in 2013 to 2014 there wasn't really a conversation; it was, "Mozilla is doing this, okay, it's fine". It was very similar to how people tended to view ad blockers at that time, as something a minuscule fraction of people used. But now we've come to a scenario where, while Firefox doesn't block advertising in any way at all, we do restrict how users are tracked by third-party cookies across different websites.
Now the conversation is quite radically different. Advertisers, and other browsers like Chrome, are all really thinking about how to incorporate privacy-preserving practices, because they realize there is consumer demand for these things, and now certain players in the space are actually outliers for not providing the same privacy-preserving practices that, even five years ago, we were the outliers for providing. Just in the browser sector, whether one looks at Microsoft Edge, Apple Safari, Google Chrome, or us, almost all of them have some privacy-preserving practices, and at least three of those four browsers have very clear ones that outlaw certain kinds of third-party cookies, block trackers by default, and do many other things, almost all of which has happened in the last two to three years. I think that has fundamentally come from consumer demand: post Cambridge Analytica in 2018, the general awareness in society of some of these issues expanded, and it hasn't really stopped since then. And regulation, in many ways, is playing a similar role.
I do think, however, that the incentive structures, and this is something we have spoken about in the past in terms of whether the business model of the web is truly sustainable in the way it works right now, whether it's the broader venture capital sector or the notion that the only way to monetize the users you have is to do something with their data that ends up targeting or profiling them in a particular way, are slowly being challenged, even within the Indian sector. But that still remains a little more the exception than the rule, and I think organizations like hasgeek, carrying out measures that both make it easier for technical teams to understand how to bring these practices to their organizations, and engage in the policy conversations themselves, are really important; there's a mixture between the two that's needed to make these practices more sustainable in the long term. That's something we've been engaging with and been a part of, but also something we're really hoping to see scale in India over the coming years, because it ends up being pretty important. So with that, now that we've broadly covered the organizational overviews and what we think of the Indian space, I thought we could spend some time talking about some of these practices in particular. While all of us have mentioned them in passing, Kailash, maybe starting with you: imagine that you were talking to a room of developers from the space, and you had to tell them:
"Here are certain practices that you should keep in mind when you're thinking about the idea of privacy by design." What would those ideas, concepts, principles be that you'd like to get across to them? You could of course rely a little on the guide that hasgeek has put together, but even independent of that, from your own experience of running a tech team in a successful fintech startup, what would you say? I think more than technical measures, which we can just Google and figure out, it really comes down to the culture and philosophies that run a business. To give you a concrete example, we have this need-to-know philosophy. We have a large number of support agents who pick up phone calls and answer customer queries, and they need to look up information on the customers who are calling. Now, if there were a broad lookup form, it would be possible for anyone to look up anyone's information. By design we've restricted it in such a way that we have field-level permissions, so irrespective of who someone is, irrespective of their position in our company, somebody does not just get access to a super-admin dashboard where they can do everything. For that you have to have the top management on board; if your CXOs demand that they need access, then there's no point in discussing technical measures, because everything will trickle down to not being privacy-friendly at all. So it's very cultural. That was one concrete example: no matter what dashboard we build, what reporting system we build, what customer support tool we build, it's only given to people who really need to use it, and within those dashboards, screens, and views, people only get to see what they're supposed to look at, so, field-level permissions. We have a thousand-plus people, and there's no place where all of them can just log in and start pulling data.
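The field-level permission model Kailash describes can be reduced to a very small mechanism: map each role to the set of fields it genuinely needs, and deny everything else by default, with no super-admin role in the map at all. This is a minimal sketch under assumed role and field names, not Zerodha's actual schema.

```python
# Each role sees only the fields it needs to do its job.
# Deliberately, no role maps to "everything" -- there is no super admin.
ROLE_FIELDS = {
    "support_agent": {"name", "ticket_history"},
    "kyc_officer": {"name", "pan", "address"},
}

def visible_record(role: str, record: dict) -> dict:
    # Unknown or unlisted roles get an empty set: deny by default,
    # so seniority alone never grants data access.
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}
```

For example, a support agent looking up a caller would see the name and ticket history but never the PAN, while a hypothetical "cxo" role, absent from the map, would see nothing.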
Now, this simple thing, that people only get to see what they're meant to see, trickles down across all departments, across dozens if not a hundred systems, and it gets incorporated technically: from access control to field-level control to access audit logs to alerts, where an alert is raised if a certain bit of information is being pulled more than an expected number of times. Just that one principle has expanded into dozens of technical practices. And this whole idea also gets embedded into systems development, when you write software. We have a trading platform which is used by millions of people every day. Then there are these massive databases where data goes and sits, the data that people generate, their trades, their history. These are entirely separate; they sit on separate networks and don't have access to each other. Developers who work on, say, the trading platform have no access to the systems that do the number crunching. And that's incorporated right into the network design: there's no physical access, no connectivity between these two systems, and if these two completely different systems need to speak to each other, it's defined in a spec somewhere, and a certain API with very restricted, limited connectivity is created that goes through a gateway. So just because Zerodha has six products and they are all integrated with each other does not mean that all of them sit inside one network. This whole idea, that a person should only see what they're supposed to see, and similarly that a piece of software we write should only be able to access what it really needs to access, means everything else is closed off. Everything else is closed off by design.
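The field-level permission and audit-alert pattern Kailash describes could be sketched roughly as follows. This is a minimal, hypothetical illustration; all role names, field names, and thresholds here are made up for the example and are not Zerodha's actual systems.

```python
from collections import defaultdict

# Illustrative role -> visible-fields mapping (field-level permissions).
ROLE_FIELDS = {
    "support_agent": {"name", "ticket_history"},
    "compliance": {"name", "pan_number", "trade_history"},
}

ALERT_THRESHOLD = 3  # hypothetical: alert after this many pulls of one field
audit_log = []                    # every access attempt is recorded
lookup_counts = defaultdict(int)  # (agent, field) -> number of lookups

def lookup(agent: str, role: str, record: dict, field: str):
    """Return a field only if the role may see it; audit every attempt."""
    if field not in ROLE_FIELDS.get(role, set()):
        audit_log.append((agent, field, "DENIED"))
        raise PermissionError(f"{role} may not read {field}")
    audit_log.append((agent, field, "OK"))
    lookup_counts[(agent, field)] += 1
    if lookup_counts[(agent, field)] > ALERT_THRESHOLD:
        print(f"ALERT: {agent} pulled {field} unusually often")
    return record[field]

customer = {"name": "A. Customer", "pan_number": "XXXXX", "ticket_history": []}
print(lookup("agent42", "support_agent", customer, "name"))  # permitted field
try:
    lookup("agent42", "support_agent", customer, "pan_number")  # not permitted
except PermissionError as e:
    print("blocked:", e)
```

In a real deployment the check would sit server-side behind every dashboard and API, with the audit log shipped to a separate system the agents themselves cannot modify.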
So that's all I have to say: that philosophy then manifests in many different technical practices across all sorts of systems, networks, people, and business processes, absolutely everything. Thank you, and if I could just add to that, I think that historically that's very much been the case for us too. At Mozilla, from the beginning of the project in 2004, it took us close to a decade to start collecting even very basic telemetry about users, things like how many errors users were facing, because there was a lot of resistance internally to the idea of us collecting data at all. In fact, the lean data practices were developed as a way to assure internal stakeholders that when Mozilla dealt with data, it would deal with it in a particular way, so that we could begin to collect it. I think that philosophical approach has helped us navigate a lot of our conversations in the space over the last couple of years in a far simpler way than many of our peers and competitors have had to, because for us, some of the answers that are complicated for others are quite simple if you follow certain core principles and cultural tenets with regard to how they should be implemented. Sherrin, would you like to come in here? If there were practices or principles or outlooks that you'd want to share with the space, what would they be? Yeah, so the very first thing that came to my mind was the one that Kailash said, the role-based access controls, actually. Apart from that, I would like to bring up two things. One is data minimization, of course, because data minimization is the key way to limit the possibility of a data breach or a privacy breach: the less data you collect, the lower the possibility of a breach. That is one thing that I'll tell the developers for sure.
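Sherrin's data-minimization point can be made concrete with a toy sketch: whitelist the fields a feature actually needs at the point of collection, so extra data is never stored in the first place. The field names here are hypothetical.

```python
# Toy data-minimization example: a hypothetical signup form where only the
# fields the feature actually needs are accepted; everything else is dropped
# before it ever reaches storage.
ALLOWED_SIGNUP_FIELDS = {"email", "display_name"}

def minimize(submitted: dict) -> dict:
    """Keep only whitelisted fields; unneeded data is never persisted."""
    return {k: v for k, v in submitted.items() if k in ALLOWED_SIGNUP_FIELDS}

form = {
    "email": "user@example.com",
    "display_name": "asha",
    "phone": "+91-XXXXXXXXXX",     # not needed for signup, so discarded
    "date_of_birth": "1990-01-01", # likewise discarded
}
print(minimize(form))  # only email and display_name survive
```

The useful property is that a breach of the stored data can never expose fields that were filtered out at ingestion.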
And the next thing is about the cookies that are used in apps, and that is something that is handled by developers. They need to be really sure about what each cookie does, because there are cookie policies followed by companies which state that a certain cookie performs a certain task, whether it's a third-party cookie or an essential cookie. That needs to be transparent and communicated to the users very specifically. To be very honest, what I feel is that this awareness is a little lower in the Indian region, whereas at the international level it's much talked about and taken very seriously. So, yeah. Great, thank you. Uttara, anything on your end that you'd like to add? Yeah, I mean, I sort of already talked about this a little bit. Firstly, I fully concur with both Sherrin and Kailash: it has to be intentional, and it has to be something that permeates the entire organizational process; there should be a culture of privacy within the entity. From a Snap perspective, what are some lessons? This is perhaps more relevant for folks in the social space, but in a sense, each of the tabs on the Snapchat app has a lesson to offer. On the chat side, for instance, there's ephemerality, and the fact that folks need to add each other mutually in order to communicate; a stranger can't just send you a random message, and there's no "others" folder, for instance. Then we have a thing called the Map. The Map is private by default, and that to me again is a user control that allows people to protect themselves, and to only turn it on and put themselves on the map if they want to.
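Sherrin's earlier point, that developers should know exactly what each cookie does and set non-essential ones only with consent, could be sketched as a small cookie manifest. The cookie names and categories here are hypothetical, invented for the illustration.

```python
# Hypothetical cookie manifest: every cookie the site sets is declared up
# front with a category and a human-readable purpose, mirroring the cookie
# policy shown to users.
COOKIE_MANIFEST = {
    "session_id": {"category": "essential", "purpose": "keeps you logged in"},
    "ab_bucket": {"category": "analytics", "purpose": "A/B experiment group"},
    "ad_profile": {"category": "third_party", "purpose": "ad personalization"},
}

def cookies_to_set(consented_categories: set) -> list:
    """Essential cookies are always set; others need explicit consent."""
    return [
        name
        for name, meta in COOKIE_MANIFEST.items()
        if meta["category"] == "essential"
        or meta["category"] in consented_categories
    ]

print(cookies_to_set(set()))          # only the essential cookie
print(cookies_to_set({"analytics"}))  # essential + analytics, still no ads
```

Keeping the manifest as the single source of truth means the cookie policy page and the code can be generated from the same declaration and cannot drift apart.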
And then, as I said earlier, Discover, the news and entertainment part of the app, is completely separated from these two functions, which again enhances privacy. So these intentional app design choices, combined with the data management practices that Sherrin talked about, matter: for instance, we have a minimization practice, we delete by default, we have very short retention periods for data, and we have a way of targeting ads that does not use PII and does not allow individual-level targeting. All of these together, I think, make the app a much safer and much more private app for users than, say, a more open platform, so these to me are some powerful features that are perhaps useful for other types of use cases and other types of apps as well. Thank you. In terms of some practices that I'd be happy to add as well: within Mozilla, one of the things that guides how we view user data is that, as much as possible,
and unless there are regulatory requirements around it, we try not to have access to the data in the first place. As an example, browsing data on Firefox is end-to-end encrypted, which means it's synced between different instances of Firefox when they come online rather than residing in readable form on a centralized server. That's not to say Mozilla doesn't hold a copy of your browsing data; we do, because that's where it's synced from, but it's encrypted with keys that are only present on the devices the user has used to log in to that account. The thinking behind that philosophy is that, as a browser, we believe browsing data is among the most sensitive things users trust us with. We definitely wanted to give them the convenience of having it synchronized between their mobile phone or tablet and their laptop or desktop installations, so they didn't have to set everything up all over again on a new device, but we decided it was not something we wanted to be made directly available to ourselves. Firefox, for example, also relies on advertising for part of its revenue. Within Firefox's new tab page, there are certain tiles that are sponsored, that different players can bid to be part of, or that are sometimes synchronized directly with advertising servers. But none of that data is shared directly with advertisers: either the processing happens on the user's device, so it is never even available to a third party,
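The "keys never leave the device" idea behind such encrypted sync can be illustrated with a deliberately simplified sketch. This is a toy, not real cryptography: production sync systems use authenticated encryption such as AES-GCM, and the SHA-256 counter keystream below exists only to show the data flow, i.e. that the server stores and relays ciphertext it cannot read.

```python
import hashlib

# TOY ILLUSTRATION ONLY, not real cryptography. Real systems use vetted
# authenticated encryption (e.g. AES-GCM). The point demonstrated: the key
# lives only on the user's devices; the server never sees plaintext.

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the device-held key (SHA-256 counter)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream inverts itself

device_key = b"never-leaves-the-device"       # exists only on user devices
history = b"https://example.com/visited"
server_copy = encrypt(device_key, history)    # all the server ever stores
assert server_copy != history                 # opaque to the server
assert decrypt(device_key, server_copy) == history  # readable only on-device
print("server holds ciphertext; only devices with the key can read it")
```

Because decryption requires a key the server never receives, even a full compromise of the sync server yields only ciphertext.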
And where it is made available to a third party, it goes through a proxy server that Mozilla operates, which doesn't even share the IP address the ad request is coming from with the provider: it takes the request in, removes identifying information, shares it with the parties who would like to show the ad, processes the response, and then the same proxy server sends it back to the user's installation. Some of these are practices that even within Mozilla we have grown more comfortable with over time, the idea of dealing with data, with business models, and with preserving privacy at the same time, and it's very much an ongoing process. It's very easy, I think, for many players in the space to be completely privacy maximalist and say, we will never collect data and we will never do anything in a particular way. But at Mozilla, one of the things that I personally feel sets us apart a little bit is that we try to be constructive and come up with solutions to the privacy problems created by dealing with this data, rather than simply never touching the data at all, while also, in cases like the end-to-end encrypted sync I described, building in technical protections that ensure even we are never able, in practice, to access certain extremely sensitive pieces of data. In general, I think awareness of that kind of thinking in the space in India is pretty important, because whether it's in Silicon Valley culture or in Europe with laws like the GDPR, the gestation period for some of this awareness has actually been 15 or 20 years;
not everyone always thought this way, and a lot of this thinking is very, very new. The fact that our technology sector is growing to the extent that it is makes it all the more imperative that this awareness is as readily available to the rest of the space as possible. So, keeping an eye on the time, I think we'd allotted broadly about 15 minutes for this part of the conversation, so I thought that I could either ask follow-up questions to the things folks have said, or we could use the last four or five minutes for closing remarks, in case folks wanted to react to things said in the conversation so far, or make points they haven't gotten an opportunity to make. Sherrin, I thought I could start with you: anything that you think hasn't been covered, or responses to anything I've said or anything the other folks have said? Yeah, so first I would like to thank Hasgeek for making me part of this panel discussion with all of y'all; it feels a little intimidating for me as well, sometimes. One thing that I would like Hasgeek to do, in fact, is raise awareness in terms of privacy engineering and privacy enhancing technologies. I did a little research on privacy enhancing technologies and on how one can learn privacy engineering, and all of it is either from Carnegie Mellon University or from some other university across the globe, but not from India. So I think that has to be brought in; a culture of privacy has to be brought in. Yeah, I would like to end with that. Thank you. Thank you.
Um, yeah, no real new thoughts to add; again, many thanks to Hasgeek and to Mozilla for these new initiatives. I think one important gap, and I think the team at Hasgeek sees this as well, is that there seems to be regulatory activity developing at a frenetic pace in India; there's lots happening, almost on a weekly basis there's a new announcement, and a bunch of us are scrambling to respond meaningfully to policy. I think it's critical that these conversations happen in this community; very excited, by the way, to be honest, to be on a panel with a privacy engineer and a chief technology officer, I don't know how I got here. I think it's important for there to be a bridge between the tech community and the policy community, and to actually take the suggestions right back to regulators, because for several reasons the policy landscape, of course, is in some ways quite politically charged, but it's also not allowing the space for legitimate conversations around design, for instance, to make their way into government corridors. So I'm eager for there to be more thinking around how that gap can be bridged, and for policy making in general to be a bit more evolved, to source itself from problem statements and from solutions that come from people like yourselves and more folks in tech who can speak to these from a design perspective or a data management perspective, as opposed to being framed in the sort of antagonistic way that we sometimes see at the moment. Wonderful, thank you. Kailash, anything to add? Yeah, a couple of things. You earlier asked about the state of practices around us. I think the state is absolutely abysmal, and these large scale policy interventions and activity, practically starting with GDPR, are all reactionary.
The extent of abuse, extraction, and exploitation of data, of private data, reached a tipping point; that is when governments and regulators stepped in, and you can't really blame policy makers, because the extent of exploitation has been seismic, galactic really. Some of the biggest, most lucrative, most profitable corporations on the planet make their money by extracting and exploiting data, and many of these practices have been institutionalized, given a humane veneer, as if one can humanely collect and monetize private data. I think there are deeply philosophical questions here, and I think technological solutions to privacy problems are most likely band-aids. I can cite an example, which is the cookie pop-up. Half the sites on the planet are plagued by dark-pattern cookie pop-ups: there's a "yes, I agree" button, and then there's a "customize my preferences" button with really complex preferences behind it. The GDPR regulations called for a simple yes or no, but that's not what's happened. So the prescription of very specific technological measures to mitigate non-binary complexities fails, because these are not technological problems; it's not about the cookie, it's that organizations use cookies as a means to track. Such extremely specific prescriptions are often highly ineffective, and what they do is make the experience for end users even more complicated than it was. When somebody sees a cookie pop-up, they don't understand what it is, they hit yes, and suddenly they've legally consented to being tracked; it's probably slightly worse than before. There are all these unintended consequences of extremely specific technological measures prescribed by policymakers, and we also see this in our industry, when our regulators come up with very specific technological fixes for human slash stock market problems.
They often make it worse for everyone: they increase the compliance burden, make things difficult to implement, and make a whole bunch of things really difficult for end users. So, yes, we need really strong policies, but my view, from my experience of having worked in this and from my observations over the last many years, is that policy makers should ideally stay away from very specific technological recommendations; regulation should be on a principles basis. I use that phrase because Sebi, the capital markets regulator, has this very interesting concept where they publish circulars and use the words "in spirit". The circular may not very specifically describe a technical measure; sometimes they do, but most often they don't. But in spirit, they have certain things laid out, and if you don't follow them in spirit and they find out, then you're in trouble. You can't say, oh, you didn't ask us to show a pop-up, because in spirit the instruction was: do not collect private data, or do not set cookies, or whatever. So these technology and privacy regulations should ideally be in spirit, and the implementation should be left to broad industry discussion. Like I said, there's just so much activity, and there are so many extremely specific technological prescriptions coming through; I think that is just going to complicate things and probably have really bad unintended consequences. Technology policy making absolutely requires hands-on technological expertise, and regulators who are not hands-on technical should work hand in hand with people in the industry who get technology. And I'd like to cite one last example. In 2018, Sebi came up with a very comprehensive cyber security circular for the whole industry; they had just published a draft for review, and it had many infeasible, highly non-technical things in it.
It was clear that they didn't really have the technical capacity to prescribe very specific things. So we wrote back to them, and Sebi was highly receptive; in fact, they created an industry-plus-regulator committee to sit and write the individual points of the cyber security circular together. They accepted that they didn't have the technical expertise, and they welcomed people who did. Over a few months, the technical folks from the industry sat with Sebi, in Sebi's offices, and co-authored it. It was reviewed, passed through many stages, and became law for the whole industry. That was great; that was holistic regulation making by a really powerful, big regulator. I think more regulators, policymakers, and governments should take that approach: rather than bombarding the industry with highly technical measures, they should sit with technical experts and come up with holistic regulations. Sorry, that was a rant, but yeah, thank you. No, not at all, Kailash, thank you so much. I think it's also something that folks on, let's just say, the policy side have been asking for for a very long time; I know I've had many of these conversations as well. And I wouldn't say the trouble, but the issue with very broad horizontal laws like a data protection law or a data security act is that the political impetus behind who speaks, who doesn't, who's a part of the committee, who's not, who the law will impact, who it won't, and the timelines in which it will operate all end up being so complex that at its core, the representation that you mentioned suffers. For example, take India's first data protection bill, whose first draft came out in 2018.
Yeah, it had people from NASSCOM, people from the Data Security Council of India, people who were supposed to be the voice of the industry. But consider the influence the industry has had on the process since then: so much of the law has changed across its four iterations that it's now almost a radically different law in comparison to what it started off as. The ability to follow through, to make sure that technical expertise is readily available to government stakeholders whenever it's necessary, together with the government's willingness to engage in that format, is something that, from the Sebi example you've spoken about, many other regulators, both sectoral and horizontal, could definitely learn from. Thank you so much for sharing that experience, because even for those in the technical community who may see this conversation, I think it's a motivating impetus to reach out to regulators, to not just treat what regulators put out as something that will inevitably become law, but to share feedback, to be constructive, and to attempt to shape things in a manner that ends up benefiting not just themselves but everyone else in the space as well. With that, we are coming to a close. Thank you so much, Kailash and Sherrin, for sharing your thoughts; it's been very exciting. And thank you so much to Hasgeek for organizing the conversation. Wherever you are, have a good day, morning, or night. Goodbye.