Thanks, Tophik. I am monitoring the YouTube chat to see if there are any questions there. I have one question. Do you have any examples of scripts or automation which people can use to, let's say, monitor the APIs to make sure that personal data is not being leaked? Are there any examples of automation? So I think what we've done is actually we've built a custom platform. And that custom platform essentially has these kinds of checks in place. When I say custom platform, honestly speaking, it's just a collection of scripts that, before you ingest anything into things like ELK or a database or Splunk or anywhere else, will essentially look for regular expression pattern matches. And if it finds a pattern match, it will automatically mask the information before it records it. But yeah, I think I can add some more examples to my slides before I publish them. Any questions from the panelists on Zoom? There are no questions on the YouTube chat, at least yet. OK. OK, thank you. Devangana, do you want to take it forward? Sure, Rajat. Thank you. So yeah, I think over the next 15, 20 minutes, what we're going to do is we have a couple of panelists here today with us who are going to discuss their points of view and share their knowledge and experiences in the space of privacy by design. So we will continue with Tophik on our panel. He is a principal cybersecurity engineer at Emirates Group. We also have Sameer with us today, who is the co-founder of ARCA and comes with a lot of experience in the data security and privacy space. And we also have Satya, who is a senior solutions engineer at Akamai. They are going to be our panelists for today's discussion. So I would probably start with one question for all the panelists here. And this is perhaps just a continuation of what Tophik was mentioning in his presentation, which is around trust, but always verify.
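The masking approach Tophik describes, running regular-expression matches over every record before it reaches ELK or Splunk, could be sketched roughly like this in Python. The patterns and the placeholder format here are illustrative assumptions, not the actual platform's rules:

```python
import re

# Illustrative PII patterns; a real deployment would tune these per data source.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def mask_pii(record: str) -> str:
    """Mask anything that matches a known PII pattern before ingestion."""
    for label, pattern in PII_PATTERNS.items():
        record = pattern.sub(f"[MASKED:{label}]", record)
    return record

# Every log line would pass through this before being shipped to the log store.
print(mask_pii("user alice@example.com paid with 4111 1111 1111 1111"))
# → user [MASKED:email] paid with [MASKED:card]
```

In a real pipeline this function would sit in the ingestion path (for example as a Logstash filter or a forwarder plugin), so unmasked PII is never written to disk in the first place.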
One of the things that I have seen as a big struggle with implementing privacy by design is that there is a lot of commercial off-the-shelf (COTS) software, tools and technologies being used everywhere in organizations, especially when we start looking at the fields of data science and engineering and analytics, et cetera. And that brings in a big challenge because, to Tophik's earlier point, a lot of times we know what we build and what is in our control, but these COTS solutions are not in our control. And a lot of times the choice of a COTS solution is not made with due diligence that incorporates all the different factors, including privacy by design. Sometimes the choices are made in terms of a specific problem or a specific area that we want to address. So my question for the panelists would be: have you encountered situations wherein you have come across some of these COTS solutions in implementation, and you had to retrospectively look at how to address some of the privacy issues they bring about? Maybe we can start with Tophik, because you were just sharing your point of view, Tophik, and then go to the other panelists. Absolutely, thanks. I think your question is really valid, and I think this is a question that we asked ourselves back in the day when we started this whole privacy by design initiative. Out of experience, we've come up with this process that we like to call data privacy impact assessments. What we do as a part of this process is that every time you engage with a new partner or a new supplier or with a different team, you come up with all the data that is going to be exchanged at the time of development or during the lifecycle of the product itself. You catalog all of that data, and once you've cataloged it, you agree what kind of technical controls are going to be implemented.
And then, as a part of your development process, towards the end you go back and validate whether those technical controls were implemented in the way that they were meant to be. Now, there are challenges with systems that were already designed, so you have to do a lot of retro work. But you can always begin, and then eventually move in the direction where all of this data is identified right up front as a part of your data privacy impact assessments. OK, I think that's a very interesting point. Maybe, Sameer, can you share your point of view on this? Yeah, OK. So I mean, we have seen this happening across COTS. I would not want to go into the product development part of it, because that's a different one to tackle. But when it comes to COTS, this is a situation that we have seen happening every time. And when you look at privacy by design and try to implement it, there are really three facets to it. The first facet is where you're looking at saying, is this a situation where I need to deploy contracts to be able to take care of it? The second facet is, can I build wrappers around the COTS so that I'm able to filter and manage the data sharing and the connections and all? And the third part is when you really look at the technology and say, OK, can I force the COTS vendor to go back and redo certain parts of it so that I'm able to take care of it? So as long as, contractually, the COTS vendors are making sure that they take care of privacy in line with whatever you want, you're still in a lot of ways protected. And many of the contractual decisions are driven by what the law says, or even by what Tophik was describing: a privacy impact assessment, which is similar to what you would do for a security impact assessment, but you would go a lot deeper in terms of looking at what kind of personal data is coming in. Then you can also look at a lot of wrappers that you can build around it.
Wrappers are, to my mind, smaller software libraries which would allow you to filter the kind of data inputs that are going in, and also build access restrictions in terms of the data output that is likely to be shared with somebody else. Then, as a last part, you go into the monitoring, and you get into data minimization and whatever else you want to do. So these are the three facets, specifically when it comes to COTS. If, of course, I go out into development, there is a whole different layer of framework that you need to build on, from both the non-tech as well as the tech perspective. Thank you so much. I think you brought up two very important aspects that we typically look at when we are doing software development. I think those are considered as part of your NFRs when you're building a good, evolutionary platform, and there would be merit in bringing them into the data security and privacy world as well, one of which is extensibility. And perhaps, as a collective in the industry, if we could push back on some of the COTS that do not adhere to privacy by design aspects, then that force would at least ensure that they actually go back and take it seriously. So this is going to be perhaps an outcome of the collective rather than of individual organizations. Satya, do you have anything to share from your side on this? Yeah, so in my opinion, I think the way Tophik laid it out actually makes a lot of sense. Whenever you're building a product, if your design teams and your development teams take care of privacy at the very beginning, that's your best case scenario. But I think, as even Sameer and you pointed out, in a lot of situations businesses typically buy off-the-shelf components, and it's very hard to build a lot of these systems ground up.
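As a rough illustration of the wrapper idea Sameer outlines, filtering what goes into a COTS component and allow-listing what comes back out, here is a minimal Python sketch. The field names, the FakeCotsClient stand-in, and the send/fetch interface are all hypothetical, invented for the example:

```python
# Hypothetical stand-in for a vendor SDK whose internals we cannot modify.
class FakeCotsClient:
    def __init__(self):
        self.store = {}

    def send(self, payload):
        self.store[payload["order_id"]] = payload

    def fetch(self, record_id):
        return self.store[record_id]

BLOCKED_INPUT_FIELDS = {"email", "phone", "national_id"}  # never share these
ALLOWED_OUTPUT_FIELDS = {"order_id", "status"}            # expose only these

class PrivacyWrapper:
    """Filters what goes into the COTS product and what comes back out."""

    def __init__(self, client):
        self._client = client

    def send(self, payload: dict) -> None:
        # Input filter: drop fields the COTS product should never see.
        clean = {k: v for k, v in payload.items()
                 if k not in BLOCKED_INPUT_FIELDS}
        self._client.send(clean)

    def fetch(self, record_id: str) -> dict:
        # Output restriction: expose only an allow-listed subset downstream.
        record = self._client.fetch(record_id)
        return {k: v for k, v in record.items() if k in ALLOWED_OUTPUT_FIELDS}

wrapper = PrivacyWrapper(FakeCotsClient())
wrapper.send({"order_id": "o1", "status": "paid", "email": "a@b.com"})
print(wrapper.fetch("o1"))  # the email never reached the COTS store
```

The point of the design is that all integration code talks to the wrapper, never to the vendor client directly, so the privacy policy lives in one auditable place.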
The other thing that's happening is that a lot of businesses are also now being forced to consider privacy, not by choice but because of regulations: GDPR, and we have PDP on the way. So there is some market impetus pushing that agenda forward, which is a very good thing. So I think a little bit of awareness overall in the ecosystem would go a long way. Understanding the basics of what encryption is, where the content is encrypted: encryption is often a term that's thrown around, but it's important to understand whether the content is encrypted at rest or in transit, what exactly that means, and how it impacts the workflows. Another thing that often goes unnoticed, and I think Tophik touched upon this and it's very critical, is that if you have content on the browser, there is no way you can actually hide it from a person who is using the browser. With browser tools these days, you can just right click and inspect the traffic. And a lot of that is actually true for apps and other ecosystems as well. So in general, as long as you assume that clients are compromised, that goes a long way, both in terms of security as well as, in a lot of cases, privacy by extension. So I think businesses today are grappling with how to deal with privacy, given the fact that traditionally privacy was essentially a few compliance checkboxes which were not the focus, and it's hard to justify in terms of dollars, irrespective of the size of the company. If you're a large company, it's easy to justify with the damage to reputation, brand, and all that sort of stuff. But if you're a small company, a small organization, that's an investment you're making upfront, and it's difficult to justify the cost. I think that's where a collective effort in driving some of these best practices, a change in mindset, will go a long way. Very valid points, Satya. Thank you for sharing that.
So if I can just quickly summarize what the three of you talked about: I think one very important and very concrete step that organizations can take right away is having a data privacy assessment framework that can be used whenever they are building a solution or buying an off-the-shelf solution. And that goes a long way, because even though that data privacy assessment framework may not be complete, there is a starting point, and over time some of the feedback can be incorporated into adding more elements to this framework. There are certain aspects that Sameer talked about, which are around looking at what kind of contracts you have around data with the solution that you're buying, whether the solution is extensible, and what kind of push you can have on the COTS solution provider to go back and fix some of the security and privacy issues that they might arrive at. And one key point that you just mentioned, Satya, was: look at worst case scenarios and start from there, because that helps you understand what kind of risk you are running in case something were to go wrong. So I guess that's going to be helpful for a lot of organizations, and as technologists we could work collectively to build some of these assessments, or rather expose some of the vulnerabilities that come with some very popular COTS solutions, so that the industry as such becomes aware of them and the collective can drive a larger push towards privacy by design. The other question that I had, and I think Tophik covered this in his presentation as well, is the balance between convenience and privacy. Now, when we are starting with a greenfield project, sometimes it is easier to treat privacy by design as a first-class citizen if you have the right people or the right intent in the room.
It becomes extremely difficult when you have an existing platform or a product and you have to go back and fix some of the issues in there. And I think we all saw this happening when GDPR was rolled out, because a lot of people had to unhappily let go of some of the analytics and data science driven features that they were providing to their customers. So could you all share your points of view on how an organization could start looking at this balance, and perhaps let go of some of the key drivers of their business, especially around data science and analytics? Satya, would you want to go first? Sure. So I think, in my opinion, right? In general, the regulations on privacy, and I just want to make the distinction that I'm specifically talking only about the regulation part of it: if you look at it from the point of view of the growth of any product or industry, regulations are probably your worst nightmare, because you know that there's a lot of information that you could collect. And today, you can pick any vertical and you're likely to see some amount of personalization, recommendation, or advanced AI/ML techniques being used. All of this needs data to bring out those recommendations, get you the next link that you're most likely going to click on, or even make the right recommendations on product fit or what's important to your business. All of that requires data, and the regulations are specifically going to hamper it; I think that's a given. Organizations, even going forward, will always struggle when a new feature is being released. I think what's important is, again, a baseline where every product owner and company knows that if a new feature is going to be rolled out and it needs a new piece of data, the adoption is going to be very slow, not as fast as the size of your user base. That's critical, and fundamentally it's a change in mindset.
It essentially means that if tomorrow you're rolling out a feature, it's not going to be accessible to all of your users until they accept sharing that data with you; and if one person sharing their data drives another person's ability to see a feature, then you're going to see very slow rollouts. So that's a new sort of expectation, and we'll see that evolve over time, and that's where the regulations come in. Unless there are financial penalties, I don't think organizations will adhere very strictly to them. There will be creative ways, and people will work around the limitations of the regulations of the land to speed up development and bring faster time to market for whatever the products are. So it'll be an interesting space. We know how GDPR has evolved; it'll be interesting to see, when PDP comes, and with whatever updates to the regulation bills that a lot of countries are in the process of rolling out, how all of that plays out. I like the skepticism that you have here, and I think perhaps that might be the way to go forward: that maybe as a tech community we drive some of this, because organizations are never going to adhere 100% to these. One, there will always be creative ways of working around it, and two, as long as it keeps contradicting the commercial goals, this will never become a first-class citizen. So I think working with skepticism is definitely going to help there. Tophik, Sameer, would you want to share your points of view on that? I think there are two important aspects to this question, right? One is, where do you want to draw the line? And the other bit is, what is your risk appetite? What I mean here is that, at least for us, having operated or operating businesses across multiple geographies, you want to be able to put a dollar value on what happens if you lose business from a given geography due to privacy-related requirements.
For example, if my business is very, very active in Europe, and I face massive penalties there while 80% of my user base comes from there, I need to sit back and take a call on whether I want to prioritize a feature over a privacy control. And I think helping businesses make educated decisions using proper data drives a lot of change, as opposed to taking a regulatory framework and trying to fit your system into that framework. And as you mentioned in your question, it's very difficult to implement privacy in systems that have been built over a period of 30, 40 years. I think the only way you could address that problem (it doesn't necessarily solve it, but it gives you a start) is to have a data privacy office that has representation from different parts of your organization, which is something that I mentioned on my last slide. The reason we saw an advantage in doing that is that it's not about what the cybersecurity team says versus what the development team thinks versus what the business thinks. It's everyone giving their point of view, and as a result trying to choose the balanced approach in terms of going forward. So that's one approach that we've taken to solve the problem. But yes, it's extremely difficult, and you will never be able to say with 100% certainty that your system is compliant with all the privacy by design principles. Okay, I have a slightly different take on the whole convenience part. To say that you can either have privacy or convenience and not both is actually incorrect, because that's a trade-off that applies to security, but not really to privacy. Privacy is a way of life.
So as an organization, if I say that I want to embrace privacy and take whatever comes with building the business model around it, then you'll embrace privacy, and you're going to find ways and means to ensure that your ecosystem, your suppliers, vendors, the internal governance and all of that works with you to make sure that privacy is built in. Of course, there are legacy systems which pose their own set of hindrances to building privacy in, but it is not insurmountable. Like I said before, we've worked with people where they've built wrappers, and we've kind of taken care of it. And at least for newer systems, built in the last five, ten years, even if they're using the latest technology and all, it's a matter of the organization deciding to what extent it is going to implement privacy. And then the tech community is able to build a lot of privacy controls across the entire place. Not that it'll happen overnight, and I don't think the law is ever saying that it's going to happen overnight, but it's not insurmountable either. It's also not something where you need to see a conflict or make a trade-off between convenience and privacy. It's really not required, because I don't think there is any end user, ourselves included, who would be unhappy to share data if the privacy practices are being met. So it really is not an inconvenience. And given the way all of us in tech look to practices adopted by others, the challenge becomes even easier, because now that there are certain companies who are able to do it better, I would rather have a community working with them and learning from them. And in fact, for the whole privacy by design effort, whether you're doing it for legacy systems built some time back or you're just starting out now, we are already building frameworks. We have implemented them at places as well.
And we would open it up for the community as well; if we're able to get more contributors, nothing like it, because all of us will be able to drive and take care of the whole privacy by design effort. An example I can give is that the inconvenience actually falls more on tech, because there was a time earlier when, for building a mobile application, you could simply list all the permissions and say the mobile app needs all of them, without bothering about whether you really needed each permission or not. Today, the ecosystem, which is people like Google and Apple, is pushing back and saying: no, you need to tell us why you need that permission, and I will enable that permission for you as part of the ecosystem only when you really need it, not otherwise. So only when I want to access photos will I get the photos permission. When I want to take a photo, I will get the camera permission. So this inconvenience is coming towards us in tech rather than towards the end consumer. And we need to embrace this. I don't think there is anything wrong with it, because it affects all of us. You also have to look at it and say: whatever we build, if it does not have the right practices, it's also going to impact us. So therefore, it's not really an inconvenience. I would say it's a way of life. You either take it, or you can choose to ignore it and work around it. So there are some very important points that you brought up here, and it gives me a perfect segue into my next question. Earlier, privacy by design would be a conversation topic for perhaps just the security professionals, but it now asks for a more interdisciplinary, cross-functional involvement from teams and different kinds of leadership. You just brought up the point around organizational values, Sameer.
I guess you were mentioning that privacy is a way of life, and you have to decide whether you embrace that fact or you essentially completely ignore it and still end up building your business around it or against it. So I think perhaps at this point everyone would agree that we do need to treat it as something that needs interdisciplinary involvement, whether that's from the organizational leadership, your legal teams, developers, security teams, data scientists; I would say it perhaps even extends to the UX community involved in building these systems. I think that's something everybody would agree on, but have you seen instances in your career of organizations who have really embraced it and benefited from it? I guess we could talk about where things might have gone wrong, but we have very few positive stories to look at. So is there any story that you would like to share from your experience? Sure, sure. So about five, six years back, when this whole privacy thing started and we started working on it within India, the initial discussions were all with security folks, because the security folks and the tech people were looking at it saying: okay, it's going to come someday, it's going to impact me, so I might as well be prepared for it. And we had to force ourselves to go to the business or to the product managers and tell them: hey, this is really your problem, and it's not just a security problem, because privacy is going to be much, much bigger than security. Now, in the last two years at least, I'm happy to say that the discussions have started coming from the boards or the CEOs or the sales people, the marketing managers or the product managers. And while, yes, implementation gets deferred to the CISO and the tech community, a lot of decisions have started coming from there.
We've also seen, since we work with private equity and see the investor community as well, that quite a lot of them have now started engaging with us, saying: we want you, as part of the due diligence, to check how these companies are maintaining privacy and security within their systems. So that is now becoming a live thing, where everybody in the community is asking: how are you going to take care of privacy? Do you have any practices? Do I need to build something? It's a very positive spin, because one thing we have seen for sure is that when the business takes a stand and says, yes, I want privacy, a lot of the work is easier on the implementation side of things, because you don't then need to go back and convince anyone. And I think most of the larger organizations that we have seen, and even the smaller ones that we work with, have come to this realization, and in the last six months have actually started forming teams where multiple facets of the organization come together. Be it legal, and in fact we have seen even finance being included, then the physical security part of it being included, and you have marketing, sales and operations all coming in to chip in and say: privacy is there, the regulation is there, I have to work with it, so what is my input going to be? So I see a lot of positivity coming around, and a lot of realization as well. And I think, thanks to the whole Cambridge Analytica scandal and the amount of media movement that happened, privacy went mainstream with business at a much faster pace than security has managed for quite some time, because privacy is now not just linked to regulation; a lot of people realize it is linked to reputation. Most consumers surveyed have actually gone back and said: if your privacy practices are not great, I simply don't want to work with you.
And they have seen the impacts of #DeleteWhatsApp, #DeleteFacebook, and similar hashtag campaigns working, which as cancel campaigns have served to increase and improve the momentum overall. But at the end of the day, I'm done. Right. And I think what Sameer just shared is quite heartening to hear, so thank you for sharing that, Sameer. Tophik, Satya, do you have any stories that you want to contribute? Well, I think yes, I would agree with what Sameer was saying all this while: there has been a lot of change in the last two or three years from a privacy point of view. And the reason for it probably is that privacy, back in the day, was always left to cybersecurity or security professionals. But today, privacy is much more than that. And as a result, businesses do want to come to the table and have these conversations, to be able to facilitate privacy-related requirements as and when they're developing either new features or new systems. So I would say that I have seen a positive spin, at least where I work, where it's no longer just about cybersecurity standing up and claiming ownership of privacy. There are issues in privacy that have a security impact, but not every privacy issue is security related. And as a result, it's very important to socialize this and get different people across the organization to come together, talk about it, and as a result have a positive outcome from privacy by design discussions. Right. I think it's a very important point you brought up: privacy is not equal to security, and they have to be looked at as two separate entities, which might overlap but are not exactly equal. Satya, anything that you would want to add? I think I'll just stress two things from what Sameer and Tophik said.
I think the point that Sameer made, that brand reputation is important today and there's a lot of heat on social media if there's a breach of either privacy or security, right? It's a new phenomenon, and it's critical to speeding up the entire adoption of privacy guidelines and even security best practices. In some cases it's a mix of both, and it's hard to tell between the two. The other thing Tophik was talking about earlier is that sometimes it's been difficult to adopt, and the point Sameer was making is that it's not about how difficult or easy it is, it's about the mindset of the organization. I think, fundamentally, all of these are shifts in attitudes, and we'll have to see how it plays out. A lot of the power goes to the consumers of these products, I would say; if they force all of these products to adhere to privacy, or there's a regulatory framework pushing them towards it, it's just going to accelerate the process, because at the end of the day most organizations are there to make profit. So there has to be something that pushes you in a direction which either slows the pace at which you can move forward or takes a little bit more effort towards your goals. So I'll just round up with that. I love that point, Satya. And I think what you are essentially pointing at is that we as consumers used to be just consumers of technology, but to a large extent now we have become the drivers of technology as well. So there's a lot more power in consumers' hands now to drive some of these shifts in the technology world, and specifically in the world of privacy as we're talking right now. I see that the audience doesn't have any questions, but does anyone on the panel, or Rajat, Zainab, do you have any questions?
So I would like to make one point here, and this is also something that I wanted to understand from Tophik. Tophik, you mentioned a lot about the tokens and anonymization and encryption and all that comes in, and that there is a lot of additional quality review that needs to be done. But if I understood correctly, you were coming from the perspective of security: there is a security leakage, and therefore, because it contains personally identifiable information, there's a privacy leakage as well. Whereas when you look at privacy by design, or at a privacy leakage, there would be a lot more to it than that. And that is something for which we are working on a framework. So is there something in your organization that you have seen happening, or a movement that you've seen around, which we can work together on? Because I believe that, as a community, tech has to drive this, and unless everybody across different languages and different systems comes together, we will not be able to push this forward. It will remain a security issue, not really a privacy issue. Right, Sameer. So I think, with our experience implementing or going through this process, what we've realized is that although there is a lot of overlap between security issues and privacy-related issues (meaning that because there is a personally identifiable token, it's a security issue, but it has privacy implications), it is important to define the set of controls that have security implications separately from those that only have privacy implications, because the audiences that deal with the solutions to these two problems may be distinctly different.
So for example, if it's a security issue that has privacy implications, then you have one set of people, an agile or small team, that comes forward and decides the way forward, versus something that is purely privacy oriented, where you have another set of people that comes forward and decides the way forward. So I think it's very use-case driven, depending on what situation you are in. Sure, makes sense. I mean, that's what we have also observed. So even something like not being able to fulfil a user's rights becomes purely a privacy issue, whereas an actual data leakage becomes a security plus privacy issue. Correct. So the other example that I could potentially give you is around GDPR. GDPR has this provision where a user can request deleting all the data that the enterprise holds about that particular user. Now, that's again more privacy driven, but if you really want to be able to delete all information about that particular user, you also need to know how many systems are cataloging that data for you as of today. And that is where the security part comes in, where you're saying: let's not record what we don't need, or let's mask the information. And as a result, that rolls up to a point where the business is fairly confident that if somebody ever raised a request to delete their data, I know that it will get deleted and it will not be present in any part of my ecosystem. So the GDPR requirement has also now evolved into the right to be forgotten, which means that if, regulation-wise, I need to retain something, it is retained, but they've also added that even masked information should be removed. Because the moment you know where your masked information is, you're potentially opening up the possibility of unmasking it, and therefore the privacy is gone. So it's become a very touch-and-go situation there. Yeah, I fully agree with you.
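The cataloging idea Tophik mentions, knowing every system that holds a user's data so an erasure request can fan out to all of them, might look schematically like this. The system names and the per-system delete hooks are invented for illustration:

```python
from collections import defaultdict

class DataCatalog:
    """Tracks which systems hold data for each user, so an erasure
    request can be fanned out to every store on record."""

    def __init__(self):
        self._holdings = defaultdict(set)  # user_id -> set of system names

    def record_ingestion(self, user_id: str, system: str) -> None:
        # Called whenever a system stores data about a user.
        self._holdings[user_id].add(system)

    def erase(self, user_id: str, delete_hooks: dict) -> list:
        """Call each holding system's delete routine; return systems touched."""
        systems = self._holdings.pop(user_id, set())
        for system in systems:
            delete_hooks[system](user_id)  # hypothetical per-system deleter
        return sorted(systems)

catalog = DataCatalog()
catalog.record_ingestion("u42", "crm")
catalog.record_ingestion("u42", "warehouse")
hooks = {name: (lambda uid: None) for name in ("crm", "warehouse")}
print(catalog.erase("u42", hooks))  # → ['crm', 'warehouse']
```

This also shows why the masking discussed above matters: every system that never ingested the personal data in the first place is one fewer entry the catalog has to fan out to.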
I think it's also very complicated when there are multiple regulatory frameworks that have different ways of looking at user data, let alone its security or privacy aspects. For example, in the case of HIPAA, you want to hold the information for 10 years even after a particular treatment has been completed and closed. But if you then apply privacy regulations on top of it, it becomes difficult to retrieve that patient information if you've aligned with all the privacy requirements. Correct, correct, correct. So that is why the multi-disciplinary part comes in to decide, but there is this whole set of security work which is also required to contribute to the entire thing. It's just that, of late, we are seeing that security is not really driving the entire discussion till the end; it may be one of the initiators, and then a major contributor. But increasingly we are seeing tech drive it, primarily because all the economies are moving towards tech, and especially in the current COVID era, everybody is looking at digital. So it's tech that is going to be the primary driver, and each facet of tech is going to come in, whether it is AI, ML, a warehouse situation, or anything that has data in it. We will have to look at a filter in between which asks: are you going to ingest personal data? And if you are, then these are the additional things you definitely need to do. And I think that's something you can do pretty much on an ongoing basis. As you keep doing it, over time you will see that your systems are far more aligned to the overall goal, while at the same time the businesses run their operations as smoothly as possible given the guardrails around these regulatory frameworks. Absolutely. Hey, thanks Devangana for letting me ask the question. Sorry, I took up a little more time, I guess. No worries at all, Sameer. I think this was important, and these are the questions that all of us keep coming across in our day-to-day lives.
So I think this discussion was certainly helpful. I know that we are over time, so I guess any questions that the audience might have, they can reach out to any of you three, or all of you three, to get those clarifications. Rajat, Zainab, do you have any closing thoughts? No, thanks Devangana for moderating this session. For those of you who are watching, the videos of these sessions will be uploaded and processed by about next week on hasgeek.com/fifthelephant. And if you have suggestions for topics, please do leave them in the comments section on hasgeek.com/fifthelephant; we'd be happy to consider them. And if you'd like to speak, please reach out to us. We're looking for interesting case studies on data governance and on various aspects of engineering for compliance. That's all we have for today. Thank you very much. Thank you to all the panelists, to Tophik for taking the time out to do this presentation, and to Devangana and Rajat for being so supportive in getting this initiative together. Thank you everyone. Thank you. Thank you everyone. Bye bye.