So good morning to all of you. My name is Sanjay Dhawan. My focus for the next 20, 25 minutes will be to talk about what is happening in the automotive space, and how displays, user interfaces, and human-machine interfaces are playing a very important role in the way the car is getting more and more connected and autonomous. Before I start, just a quick introduction to our company. I'm with Harman. Harman used to be a publicly traded company. Recently, we were acquired, and I'll talk about that as well in a minute. We're a market leader in providing the audio, video, control, and infotainment systems that you use in your daily consumer devices, in your homes, and so on, or in large venues like this. So these huge speakers are JBL, which are Harman products. The Staples Center next door is powered by Harman. And we're also number one in providing the infotainment systems that go inside the car. So this is the large screen which is providing you all the infotainment, connectivity, and other functions. We're the number one company in that space, a global company with almost 33,000 employees worldwide. Recently, about a month, month and a half ago, we announced that Samsung is acquiring Harman. So as of today, we're Samsung. Samsung, as you all know, got into the consumer device space 20 years back, displaced the leaders of that time, and became the number one consumer device company. About 10 years back, they entered the mobile space and became one of the leading mobile device manufacturers in the world. Recently, they decided that they want to get into the automotive space, because a lot of trends from consumer and mobile are moving more and more into the car. And I'll talk more about that as I proceed in my presentation. This time they decided that instead of doing it organically, they want to do it inorganically, by acquiring a company.
And they chose to acquire Harman. So as of March 10, Harman is the fourth, new automotive division of Samsung. The goal here is that the combination of Samsung and Harman will be used to enable the new vehicle of tomorrow. The new car, which is being architected for 2020 and beyond, is going to be extremely complicated, with lots of new technologies inside it. We're already seeing the car getting more and more connected, with its own telematics unit and its own central compute platform, which has a huge amount of capability to process the many different kinds of data being generated inside the car, and with multiple different displays: the instrument cluster to assist the driver, the infotainment displays, rear seat displays, and so on. But more important is also integrating new technologies like the one we heard about in the previous talk. We're working on bringing AR together with navigation inside the car. Today, you get your static navigation maps. But tomorrow, using a front-facing camera inside the car, you'll be able to get the actual camera feed with the navigation instructions augmented on top of it. So you can see the actual rendering of where you are going, including street names, building numbers, and so on. Autonomous is another key area, which is requiring us to build an amazing amount of compute and other capacity inside the car. And how the driver and the passengers interact inside an autonomous car is another area where the display industry can really help the automotive OEMs and companies like ours. So in terms of the key mega trends which are transforming the cars, I've listed six of those here. It starts with smart audio.
Smart audio is one of the areas which is improving and playing a very important role inside the car. A lot of interactions are becoming voice-oriented, and natural language plays a very important role in enabling those interactions inside the car. So instead of having knobs or touch screens creating distractions for the driver, audio and voice are going to play a very important role. Cockpits are getting more and more converged. In a typical car today, we have an instrument cluster, we have our infotainment screen, and we also have our mobile phone inside the car. And we as drivers are typically interacting with all three of these screens while we are driving. When you are driving on a freeway at 65 miles an hour, if you don't look forward for four seconds, your car has traveled the length of a football field. So that's how important it is for drivers to always be looking forward. Cockpit convergence, where the information being generated by the various different devices gets converged into one device, one screen, that assists the driver in the direction they need to focus on, becomes extremely important. Cloud connectivity is another mega trend that I'll talk about in a little more detail as I move forward in my presentation. But needless to say, the car is getting more and more connected, either with an embedded modem or through your mobile phone. Artificial intelligence and machine learning are playing a very important role as well inside the car. We're working on predictive algorithms, to create predictions of what the driver may want to do. For example, many times when you're going from your office to your home, you don't need navigation, because as a driver you know the route. But the reason you start the navigation system inside the car is because you want to get traffic information.
Why can't the car learn that you are at your office address, that you're going to your home, and, if there are any traffic problems, raise an alert and change your route? That's just a very simple example. But we're using machine learning and AI to predict driver behavior and to come up with new ways of interacting with the car, making it extremely simple and easy for the driver and the passengers. Shared mobility is a big trend, obviously. Many of us use Uber, Lyft, the various different car sharing apps. And this is a trend that is going to transform a lot of what happens in the automotive industry. I'm a user of Uber, and this morning it really helped me, so I'll take a minute here to share that with all of you. I was taking a very early morning flight from San Francisco to LA. The flight was at 5:40 AM. Around 5 AM, I am five minutes away from San Francisco Airport, and the car that was taking me to the airport just completely dies on the side of the road. So this is my limo driver, and the car is completely dead. And I'm literally five minutes away, about to miss my flight, giving a heart attack to my colleagues who were waiting to make sure that I reach here in time. I get out of the car. It's 5 AM, it's dark, obviously not very safe, and I can't really walk to the airport. I take my phone out and request an Uber. The app could pinpoint the exact location I was in on the freeway, and within two minutes there was an Uber ride to take me that last half mile to the terminal. So the point here is that shared mobility and the car sharing apps and technologies are really transforming the whole automotive industry. Autonomous driving: we're all talking about different companies at different stages, and so are we at Harman-Samsung.
In production right now is what we call level 2 autonomous driving capability, be it in Tesla or Mercedes or BMW or other cars. But clearly, as the technology evolves, the goal is to keep adding functionality to take it to level 4. Level 0, by the way, is hands on the wheel, eyes on the road. And level 4 is hands off, feet off, eyes off the road. So we're at level 1, level 2 right now, but there's a long way to go in the coming 5, 10, 20, 30 years. A lot of investment, a lot of great work is happening in the industry. Obviously, computing is playing a huge role, but also sensors, LiDARs, cameras. And all these technologies have to evolve together to create the level 4 autonomous experience that all of us are working towards. So to bring the discussion closer to the display and user experience side: the experience today, from a driver's perspective, is very disjointed. Like I said earlier, there are many different displays inside the car, and each of these displays is working in a disjointed fashion. What I mean by that is that the instrument cluster works independently of the infotainment system and of your mobile phone, and so on. The best example I can give you is that when you're driving, your navigation maps know the speed limits on the various different roads. But your cruise control system, or your adaptive cruise control system, is not talking to your navigation system to say, OK, I want to stay within the speed limit. A system where the two are talking to each other will be able to give the user the option: hey, I want to go on cruise control, but I also want to stay within the speed limits of the various different roads, so that I drive safely.
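As a toy illustration of that kind of integration, and only that — this is a hedged sketch, not Harman's actual implementation, and the function and signal names are hypothetical — a cruise controller that listens to the navigation map could simply cap its target speed at the map's speed limit when the driver opts in:

```python
from typing import Optional

def cruise_setpoint(driver_setpoint_mph: float,
                    map_speed_limit_mph: Optional[float],
                    stay_within_limit: bool) -> float:
    """Pick the cruise-control target speed.

    If the driver opted in to 'stay within speed limits' and the
    navigation map reports a limit for the current road segment,
    cap the driver's requested speed at that limit. Otherwise the
    driver's requested speed is used unchanged.
    """
    if stay_within_limit and map_speed_limit_mph is not None:
        return min(driver_setpoint_mph, map_speed_limit_mph)
    return driver_setpoint_mph

# Driver asks for 72 mph, but the map says this segment is a 65 mph zone.
print(cruise_setpoint(72.0, 65.0, stay_within_limit=True))   # 65.0
print(cruise_setpoint(72.0, 65.0, stay_within_limit=False))  # 72.0
```

The point of the sketch is just the data flow: once the map subsystem and the cruise subsystem share a signal, a one-line policy like this becomes possible; with today's disjointed architectures, there is no channel to carry `map_speed_limit_mph` at all.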
And there are many such examples I can give where, when the systems are disjointed, the experience we're able to deliver to the customer is a disjointed experience. So where this experience is going is convergence. In our company, we're converging onto a single, very powerful compute platform, which can run multiple different functions, using virtualization and other technologies to keep those functions separate, but also to enable the interactions between the various different subsystems inside the car. Just to give you an idea: in a typical car, there are between 20 and 100 embedded control units running completely separately, loosely coupled, and there are anywhere from 20 million to about 100 million lines of software code running all these different systems inside the car. The idea here is that to improve the user experience for the driver and the passengers, we need to drive towards convergence. And with convergence, these systems will work more closely together and present information which is more driver-centric and easier for the driver to consume. In terms of the key areas where work is going on with regard to humanizing the connected car, design and HMI are playing a very important role. HMI is human-machine interface. And there are all kinds of different technologies which companies like ours and many others are working on. On our mobile phones, we use touch and other mechanisms to interact with our devices. Remember, in a car you are driving. Both your hands are supposed to be on the wheel, and your eyes are supposed to be on the road; you're not looking at different screens.
So with that constraint in mind, the interaction with the driver, and which technologies to use, brings a new set of challenges. Obviously, voice is a very important way of interacting, and there's a lot of work going on in natural voice interactions at many different companies. Google is contributing a lot, and so are other companies, in terms of making that voice and audio interaction with the car more user-friendly. But there are other areas as well. For example, many of you may have experienced a little bit of haptic feedback on your steering wheel. The same technology is coming into touch pads and various other mechanisms, so that instead of looking at a screen, touching a screen, or pushing a button, there are other ways of interacting with the car. The HMI uses this haptic feedback, either on the steering wheel or on some other device, to tell you to press a button or not press a button, go left, go right, whatever interaction you need to do with the car. Various advanced interfaces like the heads-up display are playing a very important role in taking information from all these different subsystems that I talked about earlier, converging it, and projecting it in the direction in which you are driving and interacting with the car. Intelligent personalization is also playing a very important role inside the car. I gave some examples earlier, and we're working very closely with Microsoft, for example, to look at some of the productivity applications they produce and how to personalize them and present them in a very simple and easy way on the various different displays inside the car. Similarly, contextual solutions are playing a very important role, because you want to bring context to how this information is presented to and consumed by the driver.
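To make that intelligent personalization idea concrete, here is a hedged sketch of the commute-prediction example from earlier (the car learning that you go from the office to home and alerting you to traffic). It is purely illustrative; the function names, the one-hour window, and the 1.3x slowdown threshold are my assumptions, not Harman's algorithm:

```python
from collections import Counter

def learn_usual_destination(trip_log, origin, hour):
    """Guess where the driver usually goes from this origin at this hour.

    trip_log is a list of (origin, hour, destination) tuples from past
    trips; we return the most common destination within an hour of the
    current time, or None if there is no history to learn from.
    """
    candidates = Counter(dest for (o, h, dest) in trip_log
                         if o == origin and abs(h - hour) <= 1)
    if not candidates:
        return None
    dest, _count = candidates.most_common(1)[0]
    return dest

def maybe_traffic_alert(trip_log, origin, hour, eta_minutes, usual_eta_minutes):
    """Proactively alert (and suggest rerouting) when the predicted trip
    to the learned destination is much slower than usual."""
    dest = learn_usual_destination(trip_log, origin, hour)
    if dest is not None and eta_minutes > 1.3 * usual_eta_minutes:
        return f"Heavy traffic on your usual route to {dest}; rerouting."
    return None

# Two past office-to-home trips around 6 PM, one trip to the gym.
log = [("office", 18, "home"), ("office", 17, "home"), ("office", 18, "gym")]
print(maybe_traffic_alert(log, "office", 18, eta_minutes=45, usual_eta_minutes=30))
```

The driver never opens the navigation app: the car already knows the likely destination and only speaks up when the live ETA deviates from the learned baseline, which is exactly the low-distraction interaction the talk is arguing for.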
Audio and sound management: I mentioned voice earlier, but one other example is a new technology we introduced called individual sound zones inside the car. How many times has it happened to you that you are driving, you want to take a phone call, and the kids want to listen to music and are chatting away? It happens all the time to all of us. So what we have done is create individual sound zones: you can divide your car into two or four sound zones, and while the driver or one member of your family is on a conference call, the others can be listening to music, and all this without headphones. You are just sitting there in your natural environment, and we do this with very sophisticated sound management, including active noise cancellation. And finally, obviously, displays, the center display, the rear seat displays, and the instrument cluster displays, will keep playing a very important role in humanizing the car. In terms of the building blocks that are important: I mentioned connectivity, cloud, mobility; analytics will play a very, very important role moving forward. The connectivity can be native inside the car or through your mobile phone. Analytics matters because there is a lot of data being generated inside the car, and this data will play a very important role in creating new use cases and new sets of services for the drivers and users of the cars. I mentioned that the UI and AI will play a very important role, sound management, et cetera. And finally, cybersecurity. As the car gets more and more connected, securing it becomes an extremely important area, and almost all the companies in this space, including ours, are very much focused on that. In the PC or mobile world, if there is a hack on your PC or your phone, it's a nuisance.
In the case of a car, if there is a hack, it's life and death. So we're taking cybersecurity extremely seriously, just like we should. In terms of how this whole new area of the connected car is emerging, the approach is very evolutionary. As one of the other speakers said earlier, in the PC world or the mobile world you typically own a device for six months to two or three years, but a car you own for 10 years plus, the typical life of a car. As a result, the process of adding some of these new technologies has to work on a longer life cycle. The approach the industry is taking is to start with the architecture and get the architecture right. Having a compute platform that can live the life of the car, that's 10 years, becomes important, because we don't change our cars as often as we change our PCs or our mobile devices. So looking at the core architecture becomes very important. Looking at how to secure it, but also how to update it, is very important. The industry is looking at ways of updating the car on a regular basis, and you have started seeing many OEMs introduce over-the-air update technology, so that just like you update your apps and your PCs, you can do the same thing with your car. Tesla obviously introduced it with the Model S, but many other OEMs, with companies like ours enabling them, are bringing these new OTA technologies, so that the hardware, which has a longer life inside the car, can be updated with new software and new sets of services. Infrastructure is going to play a very important role as well as the car becomes more and more autonomous.
We're working with state and various city initiatives on the interactions between the car, the city, and the traffic signaling. For example, there is vehicle-to-infrastructure and vehicle-to-vehicle communication innovation going on. The idea here is very simple: the car, as it's driving on the road, should be able to interact with the other cars on the road, but it should also be able to interact with the infrastructure on the road. So the use case could be that before the car approaches a traffic signal, it should know what the state of the signal will be by the time it gets there, and it is already taking certain actions, as it drives autonomously or semi-autonomously, to avoid accidents, to avoid running red lights and causing deaths. So the whole idea is that vehicle-to-vehicle and vehicle-to-infrastructure interactions become very important. All this is happening through lots of partnerships in the industry. This whole space is huge, and I don't think there is any one company who can do it all. Certainly, as Harman-Samsung, we're one of the larger technology companies, with a lot of the different building blocks, but we also think that we need very strong partnerships with industry leaders in different spaces, be it Google, Apple, Microsoft, or other subsystem providers, like many of you here in this room providing various different building blocks for displays and so on. And talent obviously always plays a very important role as well. Moving on to the cloud platforms: the example I want to give here is that 20, 25 years back, as we all were using PCs, we took our PC, which was a very powerful hardware compute platform at that time, and we connected it to the internet.
We created a platform to enable various different apps on the PC. At that time, Microsoft did that with its Win16, Win32, and Win64 APIs, and a lot of the industry got together to write lots of different apps for those PCs. The platform exposed the capabilities of the PC, that very powerful compute device with a lot of different I/Os, and apps came in and used those capabilities to create new functionality. We know what happened with the internet and everything else. We did the same thing 10 years back with mobile devices. Two major platforms were created: Apple created iOS, Google created Android. And if you look at what these two platforms do, they take the hardware capabilities of a very smart mobile compute platform, which we call the mobile phone, and expose the capabilities of this platform for apps to be written and new services and new functionality to be created for these mobile devices. Obviously, we saw what happened in the last 10 years with the explosion of the mobile industry, with each of us using the mobile phone in all kinds of different ways. The same thing is happening in the automotive space. The car is getting more connected, and the compute intelligence inside the car is converging into one device. Using a software platform which is cloud connected, you can deliver all kinds of different services inside the car. Obviously, displays are going to play a very important role, we talked about that, but this connectivity to the car, which has the very powerful capability of taking us from A to B, will also create new opportunities. For example, consider the insurance industry. It's a huge industry, but the way it works today is that it looks at your past record.
And based on your past record, they try to decide how risky you are, how good a driver you are, and hence what your insurance rates should be. The future of this industry is very different. As the car gets more connected and more autonomous, the way the driver is driving today will define what the insurance rates will be. Meaning, if you're a driver who is constantly accelerating and decelerating, that data is being generated inside the car. And if, with the user's permission, this data is shared with the insurance agency, suddenly the insurance agency can decide how good a driver you are: do you always drive above the speed limit, do you stay within the speed limit, et cetera. So the point I'm making is that you can take a decision about the driving patterns of the driver based on how they're driving today, rather than on how they were driving previously, based on their accident history and other profile. There are other similar examples as well. The service industry: I'm sure it has happened to many of us that we take the car in for service, our car needs certain new parts, they don't have them, and they ask us to come back. A car is generating these codes called DTC codes, diagnostic trouble codes. Once the car is connected to the internet, these DTC codes can be transmitted over the internet, and various different automotive service providers can take those DTC codes, together with the make, model, and other information about the car, decide whether they have the parts needed to service your car, and give you a quote, so that when you take your car in for service, all the parts are there and you don't have to make repeated trips.
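As an illustrative sketch of that service use case, a shop could match a car's uploaded DTCs against its parts inventory and quote ahead of the visit. This is a hedged toy example, not any provider's actual system: the code-to-parts mapping, inventory, and prices are invented for illustration (though P0301 and P0420 are real OBD-II codes):

```python
# Hypothetical mapping from diagnostic trouble codes (DTCs) to the
# replacement parts typically needed; real mappings depend on make/model.
DTC_TO_PARTS = {
    "P0301": ["ignition coil", "spark plug"],  # cylinder 1 misfire
    "P0420": ["catalytic converter"],          # catalyst efficiency low
}

INVENTORY = {"ignition coil": 4, "spark plug": 12}  # parts on hand

PRICES = {"ignition coil": 85.0, "spark plug": 9.5,
          "catalytic converter": 950.0}

def quote_for_dtcs(dtc_codes):
    """Return (in_stock, missing, estimated_parts_cost) for a set of DTCs."""
    needed = [p for code in dtc_codes for p in DTC_TO_PARTS.get(code, [])]
    in_stock = [p for p in needed if INVENTORY.get(p, 0) > 0]
    missing = [p for p in needed if INVENTORY.get(p, 0) == 0]
    cost = sum(PRICES[p] for p in needed)
    return in_stock, missing, cost

# The car uploads "P0301"; the shop has both parts and can quote up front.
print(quote_for_dtcs(["P0301"]))
```

If the `missing` list is non-empty, the shop can order the parts before scheduling the visit, which is exactly the "no repeated trips" benefit described above.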
So the point I'm making here is that just as in the PC industry and in the mobile industry, where we took a powerful compute platform, connected it to the internet, and created a platform which allowed various different services to innovate, the same thing is going to happen in the car, and what that means is a lot of opportunity for all of us and our companies. So what does this mean from a display industry standpoint? At CES this year, we had a demo car which created a user experience for the driver and the passengers inside the car. And I keep mentioning passengers because, if you think about it, over the last so many years we have improved the experience for the driver inside the car, but what we have not done is much for the passengers. As this industry moves forward, we have to look at both the drivers and the passengers and what we can do to make the experience much better for both. I wish I had a video of this car to play here; unfortunately I don't, so I'll explain it. The idea was to use the windscreen, and also the rear seat screens and the side screens in the rear seats, as displays which can be used for various different purposes: for entertainment, for productivity, and so on, not just the screens which we traditionally use inside the car.
So we have to think outside the box about how to create new experiences, look at how we give people what they want when they are making a journey inside the car, make it easier for the driver, make it entertaining for the passengers, and let them use that time for productivity or entertainment or both, enabled by all these new technologies I talked about. So to summarize the discussion this morning: in creating that ultimate road trip that we are all working towards, autonomy is just the start. It's definitely not the destination. There is a lot of innovation that will come as some of these base technologies come together. User experience, how we create the display technologies, and the interaction with those displays, using voice, using haptic feedback, touch, and other means, will become extremely important. I was thinking about curved TVs, but curved displays inside the car are even more important. I think the real application of curved displays is probably inside the car more than anywhere else, because they fit in very well with the way we interact inside the car. Technology will play a very important role in providing the connectivity, the centralization of all the functions, and also in creating autonomous and other capabilities inside the car. And finally, as I mentioned, architecture, infrastructure, partnerships, and people will play a very important role. Thank you very much.