Principal engineer with Seagate Research Group, focusing on data security, based here in Singapore. Seagate is a relatively new member of our Open Trusted Technology Forum, and he'll be bringing the supply chain security dimension to this discussion. Good morning, ladies and gentlemen. This morning, from the storage sector, I have the privilege of sharing our perspective on managing digital enterprises, and particularly on how we think about cybersecurity and risk management. This is content I put together with my counterparts on the US side; their names are mentioned here. So here we go. Coming from the storage sector, I will be talking about the digital disruption we have been seeing over the past few decades. With that basis, we will look at the scope of product security, then I intend to cover security certification and standards for designing secure products, and what is meant by a trusted lifecycle, which we will cover in my presentation as well. I'm going to speak from a practitioner's perspective. Before we end, we will look at how we intend to manage the risk associated with secure products and the trusted lifecycle. So we will start by trying to understand how we got here, from the storage perspective. We are really living in the era of data. It started, during some of our lifetimes, with IT 1.0, the mainframe era, and progressed to the client-server platform. The mainframe era, to us, was characterized by a centralized computing model with millions of users and thousands of applications. The way we see it, the first platform of compute expanded as the number of companies expanded. The second platform was the client-server era, with a distributed computing model. PCs were the primary enabler, expanding the user base to hundreds of millions of users and the use cases to tens of thousands of applications.
So the growth in this second era of compute was mainly driven by PCs and systems. Currently, we think we are in the third platform, the mobile and cloud era. Like the mainframe era, the mobile-cloud era is characterized by a centralized computing model. The emergence of cloud computing is at the center of this evolution, with the ability to access anything from anywhere. This is something we all know. Mobility expanded the user base to billions and the use cases to millions of applications, so the platform keeps expanding as more people come online. The mobile-cloud ecosystem is driven by five key trends, as outlined by the IDC Data Age 2025 study: from big data to machine learning in the cloud, and mobility and IoT at the endpoints. Today we are living through a digital disruption as we progress from the current mobile-cloud platform to the edge and edge-intelligence 4.0 platform. Two of the previous speakers also mentioned the edge; that's no coincidence, although I did not coordinate anything with them. It's interesting that we are all talking about the edge. Machine learning, artificial intelligence and new IoT use cases are resulting in trillions of things, that's the Internet of Things. The need for instant access and decision-making will once again put the emphasis on the endpoints and expand this distributed ecosystem. So it's interesting: we went from centralized to distributed to centralized, and we believe distributed computing is coming again, if it isn't already here. Based on forecast studies, we believe the amount of data will grow roughly tenfold, from 16 zettabytes in 2016 to 163 zettabytes in 2025. Whether this is accurate or not is yet to be seen; it could be more, it could be less. We will see endpoints communicating with each other and making life-critical decisions in real time. The medical sector is a clear example of that.
The oil and gas industry and many other industries as well. Our industry experience shows that customers are coming to the realization that the amount of data is really growing, and, in addition, that the amount of data that needs to be protected is growing. Data from new sources opens up new vulnerabilities for private and sensitive information. It is projected that by 2025, 90% of the data created will require some form of protection at some level. However, what is worrying is that less than half of that data, about 45%, will be sufficiently secured, according to the study. So the problem will only grow as data creation grows exponentially over the next 5 to 10 years. According to the Cost of a Data Breach study by IBM Security and the Ponemon Institute in the United States, the average cost of a data breach to a company is above US$7.9 million. This year's annual study was conducted in 15 countries or regions, spanned 17 different industries and covered 477 global companies. The same study indicates that the average total cost of a data breach in the ASEAN region is above US$2.53 million. Three root causes were identified in the study: around half, about 48%, of the breaches were attributed to malicious or criminal attacks; 27% to human error, such as negligent employees or contractors; and 25% involved system glitches. With newer regulations such as GDPR now implemented, the cost of a data breach is expected to grow if nothing is done about securing this data. So in order to tackle this data growth and the need for security, we think we need to leverage enterprise security risk management spanning four pillars: cyber, product, physical, and data privacy protection. Of these, cyber and physical security are top priorities, as they are the foundation for product resilience.
So with the advent of cyber-physical systems in the era of the edge, it is equally important to focus on protecting the product through risk management and security-by-design practices. It's interesting that the previous speaker also mentioned security by design. I would like to emphasise that this is apart from the conventional IT-driven protection against cybersecurity threats such as ransomware, malware, social engineering and phishing. What we are really thinking about is protecting the product itself, or the services that companies want to provide to their customers. Therefore, from the storage industry that I come from, we think the second line of effort is locking down the digital aspect of the product: protecting the device and ensuring its security and authenticity. The previous speakers also mentioned this: you have to take care of device-level protection through different means, and we will touch on those in a short while. Developing device-specific protection features, such as locking down software and firmware development and deployment, is vital from our perspective. The third line of effort involves securing the product throughout its lifecycle and managing all the associated risks. This includes secure design, sourcing of raw materials, manufacturing, delivery and service. Service, to us, consists of a few things: we consider deployment, in-use operation, and lastly the retirement of the product under the service category. It is simply not enough to design and develop secure products, because gaps in supply chain sourcing, manufacturing and delivery of the finished product to end customers can undo the effort put into the design and development of secure products. So as we think about protecting the digital enterprise, we really need to do a few things.
First, we need to make security available throughout the datasphere we intend to protect. We need to provide assurance of good security and authentic products across the global market space, and this is where security certification and standards can help us. We really believe in that, and we advocate and follow it. From our perspective, there are five critical areas of certification and standards, starting with the security algorithms, then the crypto module (that's security by design), then security functionality, which was also mentioned earlier, then secure data disposal, and lastly the trusted lifecycle, so that we have authentic and secure products. It really starts with foundational trusted cryptography designed into every product that is made. The National Institute of Standards and Technology provides a program through which we can engage independent labs to get our cryptographic algorithm implementations validated, and the validated algorithms are published online for the public to view. This gives us good confidence that we are following good cryptographic practices and subjecting them to independent assessment. Next is cryptographic module certification; this is the fundamental security-by-design aspect that we validate. The Cryptographic Module Validation Program is jointly administered by NIST and the Canadian Centre for Cyber Security, and it certifies that products meet the FIPS 140-2 standard. FIPS 140-2 certification, as many of you probably know, is required for products procured by federal organizations in the US and Canada for the sensitive-but-unclassified space, and in our experience it is a valuable certification to have in other geographies as well. Next is security functionality. This is about whether we are achieving, and communicating, the security and data protection we intend to provide in our products.
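As a small illustration of the "approved algorithms" point, not of any vendor's actual implementation: NIST-approved primitives such as SHA-256 (FIPS 180-4) and HMAC (FIPS 198-1) are available even in Python's standard library. The firmware-integrity use case and the names below are hypothetical; a real FIPS 140-2 deployment would also have to use a CMVP-validated module, not just an approved algorithm.

```python
import hashlib
import hmac

# Hedged sketch: computing and checking an integrity tag over a firmware
# image using NIST-approved primitives (SHA-256, HMAC). Illustrative only.

def integrity_tag(key: bytes, firmware_image: bytes) -> str:
    """Compute an HMAC-SHA-256 tag over a firmware image."""
    return hmac.new(key, firmware_image, hashlib.sha256).hexdigest()

def verify_tag(key: bytes, firmware_image: bytes, tag: str) -> bool:
    """Constant-time comparison, avoiding timing side channels."""
    return hmac.compare_digest(integrity_tag(key, firmware_image), tag)

key = b"\x00" * 32                  # placeholder key for the sketch
image = b"example firmware blob"
tag = integrity_tag(key, image)
assert verify_tag(key, image, tag)
assert not verify_tag(key, image + b"tampered", tag)
```

The point of the sketch is simply that picking validated, approved algorithms is cheap at design time; retrofitting them later is not.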
To us, this is achieved through Common Criteria certification; Common Criteria calls this a protection profile. Common Criteria is recognized by 28 member nations globally for information security acquisition. In short, products certified under FIPS and Common Criteria are for government and enterprise customers running highly secure, data-sensitive services and applications. Next is erasure. This is an often-neglected aspect of security. Products that are discarded without proper sanitization at end of life are a potential treasure trove of data, be it personal or enterprise related. Securely disposing of data need not be limited to government security or state secrets. In an enterprise context, the risk of valuable information such as price lists, sales figures, customer data and engineering data falling into competitors' hands is alarming enough. Apart from reputational damage and loss of competitiveness, there is the possibility of lawsuits from those affected by the release of their private information, even more so with new regulations such as GDPR and similar regulations around the world. So with data security standards and laws tightening around the world, can we be assured that data is properly sanitized in a strong way, and can we prove it? With data privacy regulations introduced by various jurisdictions, the need for secure data disposal requires emphasis and should not be ignored. Adhering to international data disposal standards can save enterprises effort as well as mitigate data breach risk. Apart from that, we also need to look at the longevity of the cryptographic algorithms we use in our products, systems and services. With advances in cryptography, what's relevant today may become unacceptable sometime in the future, and designing and building products takes time.
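One common mechanism behind strong, provable sanitization is cryptographic erase, the idea used in self-encrypting drives: user data is only ever stored encrypted under a media key, so destroying that one key sanitizes the entire device at once, in near-constant time. The toy class below sketches the concept; the XOR keystream is purely illustrative (real drives use validated AES hardware), and all names here are hypothetical.

```python
import hashlib
import secrets

# Toy sketch of cryptographic erase. NOT production crypto: the SHA-256
# counter keystream stands in for the AES engine of a real drive.

def _keystream(key: bytes, n: int) -> bytes:
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

class CryptoErasedStore:
    def __init__(self):
        self._media_key = secrets.token_bytes(32)  # generated on "the drive"
        self._ciphertext = b""

    def write(self, data: bytes) -> None:
        ks = _keystream(self._media_key, len(data))
        self._ciphertext = bytes(a ^ b for a, b in zip(data, ks))

    def read(self) -> bytes:
        if self._media_key is None:
            raise RuntimeError("media key destroyed; data is unrecoverable")
        ks = _keystream(self._media_key, len(self._ciphertext))
        return bytes(a ^ b for a, b in zip(self._ciphertext, ks))

    def crypto_erase(self) -> None:
        # Destroying the key is near-instant, regardless of capacity,
        # unlike multi-pass overwrites of the whole medium.
        self._media_key = None
```

This is why crypto-erase scales where overwrite-based sanitization does not: the time to sanitize is independent of how much data was stored.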
Typically, the development lifecycle from conceptualization to product realization can be anywhere between one and two, or even three, years, and the product can last in the market for three, four, five, even ten years, depending on the product. So choosing the right cryptographic algorithms, with an eye on their longevity, is also important, and it is something we focus on. So that's on designing secure products. But after many years of designing secure products and conforming to FIPS 140-2 and Common Criteria, we have grown to realize that these alone are not sufficient. Designing secure products is important, but definitely not sufficient to deliver authentic products, because the sourcing and building of the products play a crucial role in ensuring the designed product is manufactured and delivered authentic and untainted. Having a secure supply chain, where each and every component that goes into the making of the product is trusted, is vital for mitigating the introduction of maliciously tainted or counterfeit products. Counterfeiting is a huge problem: if you don't secure the supply chain, you may end up suffering business losses and reputational damage. Likewise, globalization is inherent in the business environment. To scale production to large volumes and deliver to a global base, it may be necessary for enterprises to have a global manufacturing presence, so it is likely that not all manufacturing capacity is within the direct span of control of the enterprise; enterprises could be utilizing multiple third parties that provide manufacturing as a service. This means that providing IP such as software and firmware, bills of materials, and specialized tools and equipment to the third-party manufacturer may become a necessity. However, not every manufacturing service provider's security compliance level may be acceptable; it could range from unacceptable to acceptable.
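The longevity concern can be made concrete with a simple check: does the algorithm's expected acceptability window cover the product's ship year plus its field life? The cutoff years below are illustrative examples, loosely based on NIST SP 800-131A / SP 800-57 transition guidance; check the current revisions before relying on any specific date.

```python
# Hedged sketch: will a product's crypto outlive its field life?
# Cutoff years are illustrative, loosely based on NIST SP 800-131A.
LAST_ACCEPTABLE_YEAR = {
    "3DES": 2023,               # disallowed for encryption after 2023
    "SHA-1 (signatures)": 2013, # long deprecated for digital signatures
    "RSA-2048": 2030,           # 112-bit security level through ~2030
    "AES-256": 9999,            # no currently projected cutoff
    "SHA-256": 9999,
}

def algorithm_outlives_product(algorithm, ship_year, field_life_years):
    """True if the algorithm stays acceptable for the whole field life.
    Unknown algorithms are conservatively rejected."""
    end_of_life = ship_year + field_life_years
    return end_of_life <= LAST_ACCEPTABLE_YEAR.get(algorithm, 0)

# A drive shipping in 2019 with a 10-year field life:
assert algorithm_outlives_product("AES-256", 2019, 10)
assert not algorithm_outlives_product("3DES", 2019, 10)
```

The arithmetic is trivial, but doing it at design time is exactly the point: a one-to-three-year development cycle plus a ten-year field life means an algorithm chosen today must still be acceptable well over a decade from now.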
Therefore, enterprises cannot afford to ignore the security of the operational technology elements required to build their products in large volumes. To us, it is about the lifecycle: can we assure the world that the products are authentic and have good security in them throughout the entire product lifecycle? When I say the entire product lifecycle, I'm really talking about design, source, manufacture, deliver and service. From a product cybersecurity perspective, the NIST Cybersecurity Framework provides a policy framework of security guidance for enterprises, a common risk management structure to assess and improve the ability to identify, protect, detect, respond and recover. Flowing down from this framework, this is where working with The Open Group and aligning with the Open Trusted Technology Provider Standard (O-TTPS) is complementary to secure product certification and standards. What we're really saying here is that by aligning with the global ISO version of O-TTPS for product security policy compliance, we believe we are covered on the identify and protect aspects of the NIST Cybersecurity Framework. To address product security throughout the lifecycle, we have to apply the controls throughout the organizational structure. That may require enterprises to establish high-level corporate policies across their business units, and to scale policy compliance both internally and externally to third parties, such as manufacturing-as-a-service providers. We believe these standards should apply to them as well; you can use them to benchmark the security compliance of these vendors. This works because ISO/IEC 20243 is a supply chain security standard for IT components, and it is recognized in the marketplace.
In today's cybersecurity world, it also matters how we communicate and handle cybersecurity compliance and issues on the detect, respond and recover side. Not only are identify and protect important; we have to look at it holistically and also cover detect, respond and recover. What we say and do, as we drive continued transparency and incident response capability in the Product Security Operations Centre and the Product Security Incident Response Team, is important. Switching gears to managing risk across the trusted product lifecycle of design, source, manufacture, deliver and service: enterprises have to consider establishing level-based compliance goals across their product portfolio. A good first step is to perform a self-assessment across the trusted product lifecycle, which helps determine the gaps in the areas of technology development and supply chain security for the enterprise. The conformance step here is about the gap-closure actions put in place after that self-assessment is done: once you have identified the gaps, you have to think about what to do to close them. So we put gap-closure actions in place, and a key aspect of that phase is creating awareness and education in the form of conformance training. To us, it is also about validating whether the initiated gap-closure actions have achieved the desired outcome. The next two levels are about preparing for and attaining ISO/IEC 20243 certification. Part of this could involve identifying, training and grooming personnel within the organization on security certification.
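The level-based ladder described above can be sketched as a simple ordered checklist. The level names below are my own shorthand for the steps the talk describes (self-assessment, gap closure, validation, certification prep, certification), not terminology taken from O-TTPS itself.

```python
# Hypothetical sketch of the level-based conformance ladder described above.
# Level names are illustrative shorthand, not official O-TTPS terms.
LEVELS = [
    "self-assessment",      # find gaps across design/source/manufacture/deliver/service
    "conformance",          # put gap-closure actions and conformance training in place
    "validation",           # confirm gap-closure actions achieved the desired outcome
    "certification-prep",   # train and groom personnel on ISO/IEC 20243 requirements
    "certified",            # attain ISO/IEC 20243 (O-TTPS) certification
]

def next_step(completed):
    """Return the lowest level not yet completed, or None once certified."""
    for level in LEVELS:
        if level not in completed:
            return level
    return None

assert next_step(set()) == "self-assessment"
assert next_step({"self-assessment", "conformance"}) == "validation"
assert next_step(set(LEVELS)) is None
```

The value of treating it as an ordered ladder is that every product line can report exactly one number, its current level, which makes portfolio-wide compliance goals easy to set and track.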
For example, on O-TTPS and supply chain security standards, because such trained personnel can apply their knowledge to tailor and deploy these skill sets within the enterprise's domains, which helps move the organization up the compliance levels; the end goal is to mitigate and reduce cybersecurity risk. By going with a standards-based, certification-driven approach, we think we have a systematic way of identifying these gaps and a concrete action plan in place to address them. That's what this slide is about. Basically, we're talking about adopting standards-based algorithms and certifications like FIPS 140-2 and Common Criteria, which serve to identify gaps, resolve them in the product, and provide assurance of security by design, and we think we need to do likewise for the product lifecycle. Despite an enterprise's best efforts to design and build secure products, it is likely that cybersecurity incidents will crop up from time to time, because the discovery of vulnerabilities can come from various sources. The security research community is always looking out for such vulnerabilities; they publish them as they find them, and they even notify companies, expecting an action plan to be in place and closure to happen in a timely manner. So that's one source, the research community. Or it could come from internal stakeholders within the organization. Or the input could come from publications or advisories that governments tend to publish from time to time. Or, the other very painful way of discovering such issues, it could come from counterfeit cases in the supply chain. Whatever the input source may be, enterprises will do well to have policies and processes in place to detect, respond and recover.
Having an incident response team in place that is able to review the situation, acknowledge it, determine the course of action, follow through on that course of action, communicating with both internal and external stakeholders, and close it out in a timely manner is important, because that is what brings stability and re-establishes trust with the enterprise's customers. This brings me to the last slide. In closing: enterprises are entering the era of the edge, as we keep hearing. This is not just a forecast; it's a reality we are seeing in our industry, where machine-to-machine connections are growing, resulting in a real data explosion. This presents a great opportunity for anyone involved in data, be it creating it, moving it, storing it, analyzing it, learning from it, selling it, and much more. At the same time, it presents great challenges in terms of security for products and services. Therefore, taking a standards-based approach to design, source, manufacture, deliver and service secure products will help enterprises with their governance, risk and compliance integration across information technology, operational technology and product technology. This, in turn, will help protect enterprises from IP leakage and from delivering tainted products, and help keep their networks and systems available when they are needed. So with that, thank you.