All right. Thank you, Carlos. Thank you, everybody, for your time. Hopefully you find this valuable, but as Carlos said, to make it valuable I want you to ask questions and have a healthy discussion here. We did the same thing a few months ago in Santa Clara and it went very well; I enjoyed the conversation and the questions. So hopefully this one will be good as well. I'll start by giving you a little bit of background about myself so you understand the type of jobs I have had and the type of expertise I have; hopefully that context helps you ask the questions where I can add value. As Carlos said, I'm currently a group product manager at LinkedIn. I have been at LinkedIn for the past 15 months or so, and I'm the product lead on feed relevance. LinkedIn has a news feed like Facebook, and we have three main teams on feed: Feed Engagement, Feed Relevance, and Publishing. My job is basically anything related to personalization and ranking of the items you see on the feed, and we use a lot of machine learning to do the ranking. Before LinkedIn I was at Yahoo for two years. I did a bunch of different things, but mostly I was on the personalization team: on the Yahoo homepage there is a stream of news that we were serving to users, and again we did a lot of personalization. I also worked in an innovation-type lab at Yahoo for six months, which was very interesting; we took an app from ideation to launch, and I think that was a good experience as well. Before that I was at a startup here in Menlo Park for two years. And before that I was a PhD student; I got my PhD from USC in LA in information retrieval. Before that I was a software engineer. So basically I'm a software engineer and scientist turned product manager.
And I can talk more about that if you're interested, because I've tried all these different roles. I think that's good enough for now, and if you have more questions about my background I can talk more. I'm happy to answer any questions you have about anything: product management, data, metrics, experimentation, the different roles we have in tech companies, LinkedIn, Yahoo. I did an internship at Microsoft at some point, so big companies and startups. Any questions you have, hopefully I can answer.

This is going to be very broad and I don't know if you can really answer it, but I'm currently working on a product that has to do with discovery, and I'm going back and forth with the CEO on whether we should consider a scroll, without any other action, as engagement. So I was wondering: at Facebook and LinkedIn, do you consider scroll as engagement, as valuable engagement? It's not really intended, maybe, to discover, but where would you take it from there?

I think you are asking the right person; it's very, very interesting, and I was surprised, because this came up both at LinkedIn and at Yahoo. Let me repeat the question for the live audience: for different products, do we consider a scroll as a positive signal? At both LinkedIn and Yahoo we were in charge of most of the core engagement metrics, and in both cases we actually had a lot of discussions about scroll. I'll give you a little more detail and step back a bit. At Yahoo, one of the main metrics we had, actually the main metric, was a notion of how much you are scrolling and how much time you are spending on the feed. At LinkedIn, we have a metric we call Engaged Feed Session, or EFS. For engagement, we count any click that you make on items.
But also, if you scroll past 10 items, we consider that a positive signal. So your question, or at least how I see it, is: is a scroll enough, or do you also want other positive signals like clicks? I don't think a scroll by itself is enough. One reason is that you can argue the opposite: if you have the perfect ranking, people will spend their time clicking on the first item, the second item, the third item. Whereas if people are scrolling a lot, you can argue they are not finding what is valuable. Then again, the argument from the other side is that nowadays people are used to just scrolling and looking at the headlines. So you should be very careful here. That's why, based on these past two experiences, I think you need clicks and other explicit actions in addition to scroll to have a good metric that captures the value of your system. Yes?

For a product where the stakeholders are primarily developers, a database for example, how do you attribute the return on investment of that product? Say, if you're optimizing for a metric like developer productivity?

That's a good question. So the question is: if the product is very back-end, like a database, and your customers are mostly developers, how do you evaluate the performance of your product? If it's only, say, the happiness of the developers, how do you capture that? I don't think that at the end of the day, for any product, the only metric or evaluation criterion would be developer happiness. It's very important, and we have to be careful about it, and usually engineering managers are very focused on it.
But even the products I've described, the products I have been working on for the past few years, have been more back-end, like personalization platforms. At LinkedIn it's easier to capture, because at the end of the day you have users and we have this metric. But at Yahoo, for example, we were the personalization platform; we were not the actual user-facing product. Our customers were basically the Yahoo homepage and other products. They had user-facing metrics, and the way we evaluated ourselves was that we served them what we believed was relevant, then looked at the metrics they had and checked whether we were improving them, using A/B testing. It's hard to capture the value directly, so usually we would give them one set of results, a second set of results, and, for example, no ranking from our side at all, and they would see the difference between the different experiments using our product. Does that answer your question, or did you mean something even more back-end? Okay. Yes?

Can you give me a few more examples of what you look for, as far as the data?

Okay, so the question is: what type of research do we do to understand our users, both qualitatively and quantitatively? I'll start from a very high level and you let me know if you want more. For any product there are two ways: as you said, you can evaluate the product qualitatively or quantitatively. Most of what we do is quantitative, and it's mostly based on users' actions. The example I gave is the Engaged Feed Session: we want to understand whether people are engaged in each session they come to. The way to capture that is to define a set of metrics.
Even before that, you as the business define your metrics. You say: my long-term metric is to have more users, or more retention. Then for each specific product you define short-term metrics that are correlated with those long-term metrics; I can talk more about that later if you're interested. Once you have those metrics, they are your way of quantitatively measuring the performance of your product. If you see more people are engaged, if you see an upward trend in Engaged Feed Sessions, you know you are doing a good job. I'm simplifying a lot here, but basically, by defining those metrics, capturing them, and looking at them week over week, month over month, and year over year, you understand whether you are doing a good job or not. That's the quantitative data side.

There's another approach that big companies usually take, because startups don't have the luxury, which is mostly user studies. You'll hear different names for them, and there are many things you can do, but for example, over the past few months we did a bunch of big user studies. We want direct user feedback, not only from the clicks and actions they take, but also from just talking to them and understanding the pain points they have. Based on that, we may come up with new metrics or improve the product. There are a bunch of different methods. Sometimes, right in the product, you can run user surveys; Facebook used to ask questions like "do you find your news feed relevant?" That's a form of user survey to get feedback. You can also run lab studies, where we have, say, 20 people come to our lab.
We have labs, and we have a researcher who is an expert in doing user research. They ask questions, and it's actually a science: how do you ask questions without leading people to give the answers you want to hear? For example, we ask them to just open their LinkedIn app and play with it, and from that alone we can learn a lot about the quality of the experience. But again, there is a science to it; you can go deeper, you can show them mocks, you can show them new features you want to build, you can ask about their needs, or you can just film them and then go talk with other PMs or stakeholders to understand the challenges they have. That's the qualitative part. Usually big companies do both, but the quantitative part is easier in the sense that you have much, much more data; it's more scalable. Every day you have hundreds of millions of users using Facebook, LinkedIn, and Google, so you get that data and it's easier to capture. Qualitative understanding is much more expensive because you need people, and the scale is small; you may have only 20 or 50 data points. Does that answer your question?

So the question is: what are some examples of quantitative metrics for, say, the LinkedIn news feed? As I said, one of the main metrics we have is called the Engaged Feed Session. We want to understand not only whether you come to the LinkedIn news feed, but whether you have some engagement, and we define engagement as any click you make: you like, you comment, you go see a profile from feed, you start a conversation, you scroll past 10 positions, or you watch a video.
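To make that definition concrete, the rule just listed (any explicit click-type action, or scrolling past 10 positions) could be sketched roughly like this. The function name, action names, and field names are my own illustration, not LinkedIn's actual implementation:

```python
# Illustrative sketch of an "engaged feed session" check, based on the
# definition in the talk: any click-type action, or scrolling past 10 items.
# Action names and the threshold handling are assumptions, not LinkedIn code.

SCROLL_THRESHOLD = 10
CLICK_ACTIONS = {"like", "comment", "profile_view", "message", "video_play"}

def is_engaged_session(actions, deepest_scroll_position):
    """Return True if a session counts toward Engaged Feed Sessions (EFS)."""
    if deepest_scroll_position >= SCROLL_THRESHOLD:
        return True
    return any(action in CLICK_ACTIONS for action in actions)
```

Under this rule a session with no clicks still counts if the member scrolled deep enough, which is exactly the trade-off discussed earlier: scroll alone is a weak signal, so it is combined with explicit actions.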
There are many different definitions; I'm not going to go into the details. That's one example, but we have many more. We have something we call viral actions: we want to understand whether people are taking viral actions, like like, comment, and share. The reason that's important for us is that it's not only a measure of engagement; it's also very good for the whole ecosystem, because if you start liking, commenting, and sharing, we can show the things you like, comment on, and share to your network, and that creates a new set of inventory for other members. That's another example. Other metrics, like revenue, are also very important. We have ads, for example, and we want to make sure we generate a good amount of revenue from feed. And I'm only talking about feed; there are many other products with their own metrics. These are just three examples, but we have many more. My team actually has a weekly meeting we call the ramp meeting; ramp meetings are where we review the experiments we want to ramp and ship. In that meeting we have a table of metrics; we used to have around 25, and we made it smaller, so now we have around 15. Each time we run an A/B test or experiment, we review all those metrics for the past two or three weeks and make the call. Sometimes there are trade-offs, and then we decide whether to ramp it further, meaning roll it out to more users and then ship it to 100% of users, or to shut it down. Sure? Yes?

As a product manager, when you're interacting with a customer to understand their pain points and problems, what do you prefer?
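As a sketch of the kind of check behind those ramp decisions: comparing one metric (say, EFS rate) between control and treatment often comes down to something like a two-proportion z-test. This is a generic statistics example, not LinkedIn's actual experimentation stack:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates,
    e.g. engaged sessions / total sessions in control (a) vs. treatment (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# |z| > 1.96 means the lift is significant at the 5% level (two-sided).
z = two_proportion_z(10_000, 100_000, 10_400, 100_000)
```

In a real ramp review you would look at a table of many such metrics at once and weigh the trade-offs between them, as described above, rather than gate on a single test.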
When a customer tells you what feature they want in a product, or when a customer tells you they have a problem and then you, as the product manager, develop a feature around that problem?

Okay, that's a very good question. So the question is: as a product manager, when you talk to your customers or users, do you prefer to hear about the actual problem they have, or about the features they want? Personally, I really prefer to understand the problem. I have had many cases where people come to me and say, oh, we need this feature, or this feature would be a good idea. But even that user, himself or herself, if you launch that feature, is not going to use it, or may use a different feature much more. I think one reason is that these are our users; they are not experts at coming up with product features or doing product design. The other reason is that, as a product manager, I want to understand the core issue. I want to understand why they are asking for something. If I understand that, I may come up with a solution that doesn't even require building a new feature: it may just be education, or marketing. So for me that's very important, in every respect. I was talking to a PM today, and he was happy that one of his junior PMs had started his deck and presentation with a very clear problem definition. I think that is super important for product managers, and it's something we can help everybody else with, because if you don't start by stating the problem you are trying to solve, there is going to be a lot of confusion and discussion, and then you build a feature and find that nobody uses it. Good question. Yes.
So the question is: how do I think the growth of AI is going to impact the products we are building at LinkedIn, and how do we make the transition? LinkedIn and many other companies are already very machine-learning focused; that's the state of the art for most of these companies. But we are also starting to go to the next level, which is incorporating actual AI into our systems. For us at this moment, at least for LinkedIn, it's mostly about improving on machine learning. There is so much more we can do, and I know all these things are closely related; it's hard for me to separate deep learning, machine learning, and AI, because they are all moving at a very fast pace and happening at the same time. For AI specifically, at least the way I see it, the opportunity is mostly in new interfaces: not just the actual ranking and personalization, but how we can use new interfaces powered by AI, like what Alexa, Siri, or Google are doing. We have a small team at LinkedIn right now exploring how we can use Alexa and other kinds of data to provide useful information related to your professional network: maybe news about companies you are interested in, or things you can do. What other areas do I feel will be impacted? Day to day, I see a lot of new techniques coming into machine learning. For example, over the past few months we have done a lot of tree modeling. Tree modeling has been around for a while, but there are many new techniques now, and it's improving the relevance of what we serve significantly.
Other companies, because they also work on hardware, may have more opportunities for AI; I'm thinking of companies like Uber, Apple, even Google. For us it's more about improving the core relevance and machine learning that we have. Yes?

How do you prioritize content for each user? When you log into LinkedIn, some posts are from two weeks ago, and once you scroll down you see a post from one hour ago. So how do you prioritize the content that's visible for each and every user?

Okay, so the question is: how do we prioritize the content we show to our users? The example was that one item on top can be two weeks old, while another item below is one hour old. Basically you're asking about everything I do, everything our team does, since we're in charge of relevance, so it could take a hundred hours to explain, but in a nutshell: this is basically what machine learning models do, and I'm trying to simplify it. You start by defining an objective function that you want to optimize. The objective function for us, again simplifying, is for people to click on the items we show, with a higher probability of clicking on the top items versus the others. So you tell the machine that this is the objective, and then you need training data so the machine can learn, because you already have labels. I'm going a little deeper here, but I'll try to keep it simple: the labels are the positive or negative values that tell you this was good, this was bad.
If you look at, say, the past three months of data, you know that for this user, in this session, the item on top was a positive, because that person clicked on it. So you have training data, and you have the objective function, and you train your model, and the model learns. Learns based on what? Based on a huge set of features that you define for the system. What are examples of features? Time, as you mentioned, is one feature. You say: I think time matters for relevance, but you let the machine decide how important it is. For your example, the machine looks at three months of data and learns that yes, time is a very important factor: when you show fresh content on top, people are more likely to click on it. But then you define another thousand features. You say: your affinity to the person who posted is also important. That's a hypothesis you have; it's not only about the topic but also who is sharing it, so you use that as another feature. Then you use the topic of the conversation or content as another one. We call another one update type: what type of update is this? Is it a job posting? Is it news? Is it a "people you may know" item? And then for different people, you learn different coefficients, or weights as we call them. We learn that, looking at your past three months of data, you are looking for jobs, so jobs are important for you, and we try to rank them higher. For me, you learn that I'm more interested in news related to my industry, so you rank that higher. And all this learning happens in the machine learning models. Is that a good answer? Yeah, okay. Yes?

I'm just curious: how does LinkedIn store relationship data?
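Before moving on: the recipe just walked through (click labels, hand-defined features such as recency, actor affinity, and update type, with weights learned from data) amounts to something like a logistic-regression click model. The weights and feature names below are invented for illustration; LinkedIn's real models are far larger:

```python
import math

# Hypothetical per-user learned weights; in a real system these come out of
# training on months of click logs, not from hand-tuning.
WEIGHTS = {"recency": 1.2, "actor_affinity": 2.0, "is_job_posting": 0.4}
BIAS = -3.0

def click_probability(features):
    """Predicted click probability: sigmoid of the weighted feature sum."""
    z = BIAS + sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def rank_feed(candidates):
    """Order candidate updates by predicted click probability, best first."""
    return sorted(candidates,
                  key=lambda c: click_probability(c["features"]),
                  reverse=True)
```

Note that under such a model a two-week-old post from someone with high affinity can still outscore a fresh post from a weak tie, which is exactly the behavior the questioner observed.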
What kind of data model do you use? Is it modeled on Facebook, or something else?

So the question is: how do we store the data at LinkedIn? This is a very, very broad question, but at a very high level, we are like Facebook. We actually have something called the economic graph. The economic graph is all the nodes we have; examples of these nodes are people, companies, jobs, content (meaning news), and the learning courses we have. We have all these nodes, and we also have all the relationships between them. We know that this person is part of this company, or this company is in this industry, or this learning course is about this skill; a skill is another important node. So we have this huge economic graph, which is one of the biggest assets of LinkedIn. We also have a network, but then there are many different things we do on top of that. We have it stored, but every team has a different view, if you will, on top of that data, and then you can use it differently.

So the question is: what is the vision for LinkedIn? We want to help people in their professional life, basically. I see it as a social network for your professional life. Facebook is a social network, and I think the value props they have are stay informed, stay connected, and stay entertained. For us it's a different set of value props, some of them close. We also have stay informed, but mostly about the domain you're interested in; for me that's staying informed about machine learning and product management. We have stay connected, again in a professional context: we want to connect people who can help each other in their professional lives. We also have another value prop called find jobs; most of you are probably using LinkedIn to find jobs, but right now we are moving past that.
We are trying to push stay informed and stay connected more. Then there's another value prop: network. Having a platform for people to network and learn from each other. We have many interesting ideas right now; we are working on something to help people mentor each other, for example. Anything that can help you in your professional life: our vision is to make it much, much easier for everybody to grow in their career, to find jobs, to stay informed about the things that matter for their jobs, and to learn and grow. Yes?

How does LinkedIn handle the signal-to-noise problem inherent in having lots of users with 500-plus connections? Some folks accept any and all connection attempts, and some are a bit more selective, connecting only with people they've had a relationship, or at least a conversation, with. Does your machine learning or presentation system analyze the strength of those connections? Obviously you can say this person is a colleague or a friend, but there may be a scenario where someone who is one link away from a target has a weaker link to that person than someone who is actually two links away, because they were all colleagues at some point. How do you handle that?

This is a very good question. So the question is: how do we handle the signal-to-noise problem, especially in this example of the people in your network, and how do we know which people you actually care about, so we can show their content? This has been one of my main challenges for the past few months; we have a big project on this. I'll give you some details, though I probably can't go too deep and share everything. We have something we call actor affinities.
That is, how much do you care to see content from this person? One of the problems we have on LinkedIn is that most people, just to grow their network, send a lot of invitations, and most people accept them. Even for me: on Facebook I don't accept invites from just anybody; I have to know them. On LinkedIn I used to accept invitations from everybody, and now I have a network of around 2,000 people, and to be honest, for most of them I don't care to see their content in my feed. This is a big problem, and it's actually one of the main sources of complaints from our users. The reason we did this (it predates me) is that, for the sake of networking and growing people's networks, we made it very easy to send invitations and for people to accept them. Now that we are focusing more on showing you very relevant content, from the people you care about, this is a big problem. So yes, we are using machine learning. I'll give you a few of the high-level signals we use. One is the number of interactions you have with that person. If somebody sends me an invitation today, I accept, and then we have no interactions for the next six months, we give it less weight. But if I see content from that person and start liking, commenting, or messaging them, that's a very good signal for us. We also look at the recency of those interactions: I may have been interested in software engineering four years ago, but I'm not anymore, and hence I'm not interacting with software engineers that much. Another signal, which is really hundreds of signals, is the skills you have, for example how many skills you have in common.
There's a higher chance that I'm interested in somebody who is a product manager at a Silicon Valley tech company than somebody in the health industry on the East Coast. We already have skills information in profiles, so we use profile data to understand whether people are close to each other or not. These are some examples; we are doing many things here, but this is a very good question. Let me take questions from new people first and then come back to you. Mr. Contrerasian? Okay, so go ahead.

So that's based on the assumption, given all the factors you just listed, that people are conversing and liking content on LinkedIn, right? For example, if you and I are friends outside, there's obviously no way LinkedIn will know; so how do you tackle this problem and decide on that content?

So the follow-up question is: the assumption is that these interactions are happening on the LinkedIn platform; what about interactions that happen off-site? I think you answered that question yourself: we don't have access to that data, and there is nothing we can do. Using profile data is a sort of proxy for it; again, there's a higher chance that I interact with people in Silicon Valley in product management. But we cannot read your Gmail data, for example, and this is true for all these companies. You can use the data you own and have access to, but if you don't have it, you have to approximate, using the things I described. Yeah.

So when someone accepts a connection, how strong is that connection? One point, basically, like that. Then what do you think could be done? Yeah, so the follow-up on the follow-up is: is there any value in, say, asking users directly?
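To tie the affinity discussion together: the signals mentioned above (recency-weighted interaction counts and profile-skill overlap) could combine into a toy score like the one below. The half-life, the weight, and the functional form are all made up for illustration; they are not LinkedIn's values:

```python
import math

def actor_affinity(interaction_times, my_skills, their_skills,
                   now, half_life_days=30.0, skill_weight=0.5):
    """Toy affinity score: recency-decayed interaction count plus a
    skill-overlap (Jaccard) term as a proxy for professional closeness.
    All constants are illustrative assumptions, not LinkedIn's."""
    decay = math.log(2) / (half_life_days * 86400)  # per-second decay rate
    # An interaction from 30 days ago counts half as much as one from today.
    recency_part = sum(math.exp(-decay * (now - t)) for t in interaction_times)
    # Skill overlap stands in for closeness we cannot observe on-platform.
    union = my_skills | their_skills
    overlap = len(my_skills & their_skills) / len(union) if union else 0.0
    return recency_part + skill_weight * overlap
```

The skill term is what gives a nonzero score even for connections you never interact with on the site, which is the "proxy for off-site relationships" point made in the answer above.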
I think you're all trying to reconstruct the roadmap we have. We are actually doing something right now that is very interesting. Not just for this specific problem: for any personalization problem, or any problem where you are using implicit data, as we call it, there is always another approach, which is to get explicit data. The drawback of explicit data is that it adds friction for users; they have to make another click or two and give us data. But the good thing about explicit data is that it's very, very accurate: they are directly telling us what they are interested in, so we don't have to guess. So yes, we have thought about that a lot, and we are doing a couple of things. One thing we are working on right now, which I think I can share because we are ramping it to about 2% of members, is that for each update we are playing with a swipe right and left to say "show more of this" or "show less of this." That's an explicit signal for us. The challenge is that at the first level, with just "show more" or "show less," it's a little tricky to tell whether it's about the topic you're interested in, the person who posted it, the publisher, and so on. But even that by itself is very valuable: if you say "show more" a couple of times for the same person, we understand you are interested in that person; for the same topic, we understand the topic. So we are exploring that, and it's going to be very valuable data, but the challenge is getting people to use it; we have to have a very good design, and it has to be smooth and seamless. These are the challenges of explicit signals.
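The swipe experiment just described boils down to folding explicit show-more/show-less votes into per-attribute preferences. A minimal sketch, assuming a simple counter-based preference store (entirely hypothetical, not LinkedIn's design):

```python
from collections import defaultdict

# Hypothetical preference store: attribute -> running score. One update has
# several attributes (author, topic, publisher), and a single swipe is
# ambiguous about which one the member meant, as noted in the talk, so the
# vote is spread across all of them and disambiguates over repeated swipes.

preferences = defaultdict(float)

def record_swipe(update_attributes, show_more):
    """Apply one explicit swipe: 'show more' = +1, 'show less' = -1."""
    vote = 1.0 if show_more else -1.0
    for attribute in update_attributes:
        preferences[attribute] += vote

record_swipe({"author:alex", "topic:machine_learning"}, show_more=True)
record_swipe({"author:alex", "topic:startups"}, show_more=True)
# After two swipes, the shared attribute (the author) stands out as the
# likely interest, even though each individual swipe was ambiguous.
```

This is exactly the ambiguity the answer raises: one swipe tells you little, but repeated swipes sharing an attribute let the system attribute the preference.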
And the second thing is that we can expand on it. We can say "show more of this," and with a good design we can actually show options: show more of Alex, show more of machine learning, show more of, I don't know, TechCrunch, all the things that update is about. Again, that adds another level of friction, but it gives us much more accurate and more granular data. These are the challenges we have, and that's why we run a lot of experiments to understand. Yes?

I'll kind of follow up on what you said, and I don't want to get you in trouble either, but Facebook almost got in trouble recently with a lawsuit about collecting cookies from people who visited sites where the Facebook button was present. With LinkedIn, if there is a LinkedIn extension or button somewhere, and you know that I went there, and you collect my cookies, and my friend, or someone I'm not connected with on LinkedIn, ended up on the Warriors page, and then later I go on LinkedIn and it says, hey, you two are connected, you have five mutual connections, and you went to this event recently, or you went to that page and bought tickets: are you allowed to use that data? I feel like there are ways around it; it's not on your platform, but you can still use data from outside your world.

So the question is about data from outside LinkedIn, from the places where we have some presence, like the InShare button, and whether we can use that data. I think we can. That's not my team, but we have talked about it before. I think we can, but the reason this is not really the most important thing to do is that we already have a lot of good data on LinkedIn itself, and to me there are so many things we can still do with that data to improve.
So we do have that data, like the share button in other places, but that data is sparse; there's not that much of it. And again, I don't think we are done with the data we already have, so our priority right now is using that data to make the product better. At some point we will probably start using the external data too, but we would have to increase its coverage as well, just to make sure we have enough. Yes. Can you talk about, organizationally, how you see product and engineering working together most effectively? So the question is about the relationship and the working dynamics between engineering and product, especially for products that are very technical, right? I have worked for the past four or five years on products that have been very technical, and I have had completely different experiences. Sometimes you get a lot of pushback from the engineering team and engineering leads, because if you want to go a little deeper, they feel that this is their thing and they have to take care of it. And sometimes they are very open to you helping them, right? For most product managers working on these products, they are already technical. I gave you my background: I come from a PhD background, which is usually not the case for product managers. That's good and bad, right? It's good because you understand relevance, ranking, personalization, machine learning, but at the same time you can become too much of a hands-on person. You tend to put on your software engineer or scientist hat and try to come up with the solutions yourself. So the best approach I've found so far is for product managers to come up with the what, and for engineering managers and engineers to come up with the how, right? So basically you clearly define the problem and why we should do it.
That is, which problem we should tackle and why, and then you let them come up with the solution, right? Again, this is simplifying things. Usually this works, but when you get to day-to-day work, it's sometimes very fuzzy which part is the "what" and which is the "how". Sometimes I come up with ideas like, oh, we should go and understand the content, find some tags in it, and classify it, because these are things I can help with. But then some people may argue this is too much, this is engineering work. Other people may say no, this is cool, it's good that we have a product manager who can help. So I've had my own challenges, and sometimes it's very tricky. You have to find a good balance. One last thing I want to add is that it also depends a lot on the actual people, right? Even if you define processes and clear roles and responsibilities, if you find a counterpart you can have a good relationship with, you can come up with solutions that work for everybody; you can have brainstorming sessions and everybody will be happy. But if you don't find that person, or if you cannot find a good way of interacting with each other, it's going to be very, very tricky. I have seen a lot of products of this type run into challenges because people cannot find a good mix. Yes. So the question is about LinkedIn for Students and how we use that data differently from the data we have on LinkedIn generally. Sorry, I don't have that much knowledge there. I know we have LinkedIn for Students, but I've never worked with that team and we've never used data from that team. So unfortunately I don't know much; I know they have a team and they're working on things. But for us, for feed, we serve it to everybody, including students, and we use the data very similarly, right?
For them, knowing that they are a student, or knowing that they are at a certain age, is a feature for us, as I explained. So the ranking can be a little different for them, but that's all, as far as my role goes. There may be other things that team is doing; sorry, I don't know much about that. Yes. Sorry, I didn't quite understand the second question. Okay, so there are two questions; I'll start from the second one. The second one is: how do you decide what is relevant, and how does the algorithm work to show relevant content? Yeah, and how do you collect the data? And how do you collect the data? So the first part I think I already answered to some extent. Basically you define an objective that you want to achieve; this is your true north, right? Then you use the data that you have from the past to understand the different interactions, and you come up with a set of features that you want to test. These are basically a set of hypotheses about what matters to the system. Recency of an item was one example, but also the publisher of an article, right? But how do we collect the data? Basically, any type of interaction users have on the site, we just store it, and then we use it later. We store clicks and likes; recently we have even been storing the dwell time per update, right? So if you spend, say, three seconds just looking at an item, that can also be a good signal for us. We collect all this data and feed it to the system. And the first part of your question, what was it? Needs, yeah. So the first question was: how do we identify user problems, right? Two ways, again. One is the feedback they have sent. There is a feedback link you can use to send feedback. There are also conversations where people just come and post something and complain, right?
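The dwell-time signal he describes, treating a few seconds spent looking at an update as implicit interest, can be sketched roughly like this. This is a toy illustration with invented names, not LinkedIn's actual instrumentation:

```python
def dwell_times(viewport_events):
    """Accumulate per-update dwell time from (timestamp, update_id) samples.

    viewport_events: ordered (t_seconds, update_id) pairs recording which
    update occupied the viewport at each moment; update_id None means the
    member scrolled away or left the feed.
    """
    dwell = {}
    for (t0, uid), (t1, _) in zip(viewport_events, viewport_events[1:]):
        if uid is not None:
            dwell[uid] = dwell.get(uid, 0.0) + (t1 - t0)
    return dwell

# Toy session: the member lingers on update u1 twice, skims past u2.
events = [(0.0, "u1"), (1.5, "u1"), (4.0, "u2"), (5.0, None),
          (6.0, "u1"), (9.5, None)]
dw = dwell_times(events)
print(dw)  # {'u1': 7.5, 'u2': 1.0}

# Treat >= 3 seconds of dwell as an implicit positive signal for ranking.
positive = {uid for uid, t in dw.items() if t >= 3.0}
print(positive)  # {'u1'}
```

The 3-second threshold here simply echoes the figure mentioned in the answer; in practice such thresholds would themselves be tuned through experiments.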
So we have a team, we call it GCO, that actually looks at all those complaints, and they come up with a list of categories of the main problems we have, right? PMs usually have a very good relationship with them and have to be very involved in talking to them. I have a weekly sync with them, right? I go and ask: what are the main problems and pieces of feedback people are complaining about? That gives you a very good understanding. The second part is the user studies I was talking about. When you do user studies, it's not only for new features; you also want to understand the existing problems users have. You may start doing user studies to understand new features and new problems, but if you are good at it, you are also going to learn a lot about the existing features and their problems. It comes back to the core of the problem: if you understand the core problem for users, then it's not really about new features versus existing features; it's about the needs users have and how you can translate those needs into product features. So, user studies, and also the GCO tickets we get from our users. Yes. So the question is about the difference between product management practices in big companies and startups. I'll give you my own experience; you should talk to other people as well. The startup I worked at only had about 10 people, and I was the only one on product. Actually, not only product: I was helping with data science, engineering, a lot of things, like closing the door and opening it. I mean, that's a very general way of looking at startups versus big companies, but it's true: you do a lot of things at a startup and you learn a lot of things. And if you want to be more specific about the product manager role, it's also true.
So I had to learn what product managers do basically on my own, right? I learned so many things just by practice, and I know I did so many things wrong, and then I learned, right? In big corporations you have mentors. Your manager, other product people; you have all these meetings where you talk to them. Especially for junior people it's a very good environment. Most of these companies actually have mentorship programs, so you get a mentor who is different from your manager. You can have a weekly meeting and learn from them. But your manager is also, supposedly, a good person to learn from, because these people have been around, right? For me, the biggest benefit of working at Yahoo and LinkedIn was that I met so many smart people who have done product management for a long time, and I learned so much from them. For the startup, the benefit was that I learned so many things very fast, by practice and by making mistakes. And the scope of your work is much, much larger. At the startup I did B2B, B2C, design; I worked with a design firm for a few months; I learned user studies. Because the scope of the work is basically everything, you learn all those things, but you don't really have a good mentor there. So those are the big differences for me. Sure, yes. So the question is: what is my approach to coming up with priorities, and how is that different in smaller companies, right? This is very broad. Do you have anything specific in mind as far as prioritization? Like prioritization of what to get done, say with data, for example. Sorting out how you can better utilize LinkedIn's data, how you go through those priorities. Okay, so in general, there are many different inputs, right? I'm very data driven.
Many of my decisions are based on data, and actually most product managers work that way as well. If you have data, this is the easiest and best way to prioritize. But the challenge is that many times you don't have the data, right? You want to come up with a new feature or new ideas, and it's very hard to evaluate because you don't have any explicit data from our members. Sometimes you have user studies, sometimes you don't. And there are many other things you do. You look at similar things that have been done in your company and at other companies and try to understand their impact. You look at the strategy of the company and how the initiatives align with it. So there are many inputs, and this is maybe the core of the product manager's job much of the time. You also learn by practice which things are more important; you develop a sense of product intuition as well. But I want to talk a little bit about smaller companies versus big ones. The biggest challenge for prioritization in big companies is that there are so many stakeholders, right? I'm on the LinkedIn news feed, and there are so many different teams that come to me wanting their update types on the feed. We have this learning team, for example; they want to put learning content on the feed. Then we have jobs, and they want jobs on the feed, and everybody wants their things on top, and everybody wants their things prioritized, right? There are so many dependencies, and that's why some of these big companies have another role, a Project Manager or TPM, Technical Project Manager, right? Those people actually help product people a lot with coming up with priorities and understanding the dependencies.
A startup, at least in the early years, is much easier in the sense that you don't have many dependencies, but it's harder because it's very, very difficult to come up with any sort of approximation of what's going to happen, because everything is completely new, right? So it's tricky to set priorities because you really have no idea what's going to happen. But that's why it's a startup; it's riskier, basically. Yes? You know the different functions that you have in a company like LinkedIn, data scientists and software engineers: where do you see the greatest shortage of manpower in the next five years, and where do you see the most surplus? So the question is: across the different functions we have at LinkedIn and other companies, like data science and the others, where do we basically need more people? Especially in the next... Where do you see the shortage? Where do you see the surplus of people? Yes, shortage and surplus of people in the next five years. You would have to have a very, very good holistic view of everything to answer that, right? I can speak from my past experience, but that's probably not good enough. The best person to ask is the CEO of LinkedIn or the CEO of Facebook, right? From where I sit, it seems that for most of these companies there is a shortage of talent in almost every function. I know this is not the answer you are looking for, but that's what I see. Hmm? Maybe we can put it another way. I noticed that when you were talking about your resume, you switched jobs about every two years. What is the average duration per role? Where is it the shortest? Where do people switch jobs the fastest? But that doesn't really tell you either. Software engineering is very hot, right? In Silicon Valley it's very, very hard to hire. Data science right now is very, very hard.
That's why I'm saying: looking at the people I know, the people I'm working with, and the people we are hiring, in almost all these functions there is very, very hard competition just to find talent, right? Sorry, I don't have a better answer, but I can see that maybe some functions, like PMMs (product marketing), designers, or user research, have a little less competition for talent compared with the main ones, which are engineering, data science, and product management. And I think data science and engineering are hotter than product management. I think this is because of the scale of these companies, like Facebook, Apple, Google, LinkedIn: they really, really need to find people to build things, and these are the actual people building things, right? Again, sorry, I don't have a better answer, but to me hiring is very hard across all these functions. Yes? I have a question about, so I assume you report up to senior management, a higher management team, and I'm wondering what tools you use to present your results to them. How do you show them your results, your data, whatever it is they're looking for? So the question is: when you report to your manager, your manager's manager, VPs, and others, what types of tools do you use to present and report, right? There are a bunch of different things. These big companies usually have very good processes and meetings so that everybody knows about almost everything. Actually, today before I came here, I sent an email.
We have this weekly report on all things related to, not only feed, but what we call the content ecosystem: feed, discovery, search, publishing. We have these meetings, and every week we come up with a document summarizing everything we did, all the experiments we shipped, everything we are going to do, and a list of the top metrics and how they changed in the last week, and we send it to the executive team and other people who want to look at it. So just by looking at this one-or-two-page document, they have a very good understanding of what's going on in this ecosystem for feed; then there is another one for messaging and another one for talent solutions, right? So they have a very good understanding of those things. We also have weekly meetings with my manager, and even with my manager's manager, and in those meetings we just talk informally. Every once in a while we have product reviews, and for those we usually create a deck, usually in Google Slides, or you could use PowerPoint as well, and then we present it. But more than that, what I've learned is that the tool you use is not really important; what matters is how you present it and what the content is. These people are super smart and they will understand, right? Even if you have two pages of a Word doc, if you present it very well and you come with a good problem definition, the things you did, the metrics, and everything, it's more than enough, and they will understand it. Did that answer your question? Yeah, yeah. It would be interesting to know what you're using below that, to get the data that you're presenting to them. So the question is: what do I do to get the data from the people doing the actual work, the team, the engineers, and so on, right?
For that, as a product manager, you have to be very involved and work with them day to day. They don't formally present anything to you; you have to understand what they're working on just by talking to them, right? So it's a lot of talking, but we also have our own meetings. The ramp meeting, the experiment meeting I mentioned, is a very good forum for me to understand what experiments we have, who is working on which experiments, and what metrics we are seeing from them. Then I synthesize all that information, make it very high level, and share it with VPs, directors, and others. And this is good for product managers to know: many times you are just a bridge, right? You have to be able to talk to engineers and understand what they are doing, but you also have to understand the strategy of the company and translate between the two. This is very important and often very hard, right? If some of the engineers on my team talked directly to the directors or VPs, nobody would understand each other. You have to be technical enough to understand them and the details, but you have to translate that into a high-level story. Yes, give me a little bit more; I just want to make sure we are on the same page. What do you mean by financial data? So the question is about the relationship between the data we have, I mean the things we care about as far as revenue, the financial data, and the product work we do. I'll answer at a high level, since most of what I work on is elsewhere. So, for example, my team's job is just to make sure the feed is very relevant, right? But we have to make sure we understand the revenue that is generated from the feed. That's another team, right?
They are running ad campaigns, basically generating ads and putting them on the feed, but there are dependencies here, right? Let's say you improve the personalization and ranking of the feed: there are going to be more sessions and more impressions, that's going to generate more ad impressions, and that's going to generate more revenue. So although revenue is not the main metric for us, we capture it and keep track of it, right? In this experiment meeting that we have, we make sure we capture it and also communicate it. If you are impacting revenue, positively or negatively, you report it. So that's the relationship; I know it's high level, but that's the relationship we have right now. Our goal is not to increase revenue; our goal is to increase engagement. But we have to be very careful about the revenue, because that's how we make money, right? And every once in a while I talk to the PMs on the revenue side, making sure I understand their vision, I share my vision, and we try to find something that works for both of us. Yes? As a product manager, you have to work cross-functionally a lot. You work with the engineering team, the marketing team, the developers. So what model do you prefer: the waterfall, command-style model, or do you prefer to involve the developers, the engineers, and the marketing team at the initial stage, so that when they come up with a design, you know that design is feasible and they will be happy with it? Whereas if you design something alone, the engineering team might come back with feedback that... Okay, so the question is that there are different functions that PMs work with, and which model do I prefer: to work separately with each function, or to have people from different functions work as a team, right?
There is no perfect answer here, right? It's case by case; product by product and company by company it's very different. But in general, I want to make sure at least the leads of each function have a common and good understanding of what we are trying to do. So at the beginning, when you start something, make sure that your design partner, your data science partner, your engineering partner, everybody, at least understands the problem. That has two benefits. One is that everybody is aligned; they talk to their own teams and then everybody is aligned. The other is to get input and feedback. As the product manager, you shouldn't feel that you have to come up with all the ideas yourself, right? These are all smart people, and there are so many implications in design, data science, legal, sales, everything, that you want to start the conversation as soon as possible to get input. The downside is that sometimes you have ideas and you want to move fast, and you don't want to have these meetings every week, right? Like, oh, this is the idea we have, let's all sit down. So for me, when I feel that this is an idea we really want to do and it's going to get prioritized, I get everyone involved. Before that, I just talk to the people I feel can give me input, and involve the rest later, rather than meeting every week. Did you have a question? How do we grow and keep users? Okay, the question is: how do we grow and keep users? That's the million-dollar question, right? The metrics that I explained, the rationale behind those metrics, is exactly this. That's why I said you start with the very long-term business metrics. You say retention is the long-term metric we want to optimize for, meaning that I want to make sure people are coming back, right?
So you define that metric, and then you run experiments to see whether they move that metric or not. The challenge is that some of these big, long-term metrics are very hard to move, and you cannot get enough data in two or three weeks of experiments. And you cannot run experiments for four months, because then you would move very, very slowly. The way these big companies handle that is that they keep the long-term metric, but they come up with a list of short-term metrics that they believe are correlated with it, right? And then they do some analysis. That's how we did it at Yahoo, actually. You run an analysis over, say, three months and you find: oh, this short-term metric is highly correlated with retention. Then let's optimize for this metric. The benefit is that with a short-term metric, you can see whether it moves or not within two weeks, right? So you come up with short-term metrics, which can be scrolling, clicking, or dwell time on feed. Then you run your experiments, you see that this experiment or this feature improves that metric, and you ship it. But don't forget, the assumption is that by shipping that experiment, you are going to increase retention and keep your users long-term. Does that make sense? So it's not easy to have that direct mapping at the beginning, but you have to find some proxy metrics to optimize for, so that at the end of the day you know that if you add this feature, you are going to keep more users or gain more users. If you do the right experiments, look at your metrics, and have the right correlations, after a while you are going to have more users and you are going to keep them longer. And that's how Facebook, LinkedIn, and all these companies are actually growing, right?
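The proxy-metric analysis he describes, picking the short-term metric most correlated with long-term retention, can be sketched as follows. The cohorts, metric names, and numbers are invented for illustration; a real analysis would use far more data and control for confounders:

```python
def pearson(xs, ys):
    """Pearson correlation, no external dependencies."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def rank_proxies(short_term, retention):
    """Rank candidate short-term metrics by |correlation| with retention.

    short_term: metric name -> per-cohort values measured early on.
    retention:  per-cohort retention measured months later.
    The top-ranked metric is the candidate proxy to optimize in
    two-week experiments.
    """
    return sorted(
        ((name, pearson(vals, retention)) for name, vals in short_term.items()),
        key=lambda kv: -abs(kv[1]),
    )

# Toy cohorts: scroll depth tracks later retention closely, clicks do not.
short_term = {
    "scroll_depth": [2.0, 3.1, 4.2, 5.0, 6.1],
    "clicks":       [5.0, 4.0, 6.0, 5.5, 4.5],
}
retention = [0.20, 0.31, 0.42, 0.50, 0.61]
ranked = rank_proxies(short_term, retention)
print(ranked[0][0])  # scroll_depth
```

Correlation, of course, does not prove the proxy is causal, which is exactly the assumption he flags: shipping against the proxy only pays off if moving it really does move retention.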
They are doing these things in a very principled way. Okay, so two more questions. Yes, go ahead. How do you deal with stakeholders who use data only when it's convenient for them? I don't know if you've dealt with that. And, you know, that's the question. So the question is: how do you deal with people who use data only when it supports their thesis? The way I do it, at least, is to make sure that I understand the data myself, right? I don't have to go deep into every data analysis I see, but by asking the right questions, people will understand that you understand the data, and they cannot come and give you just one view of the data, or only one data source, right? If you're a good product manager, you have to challenge everybody on your team. You have to challenge all the data scientists; that's actually your job when you interact with them. Your job is not to ask a question, get the data, and run with it. Your job is to make sure, first, that you ask the right questions, but also to ask the question from different points of view. So if they come up with a data analysis, you should follow up and say: what about this other data source? Or, have you thought about using the past three months of data instead of just last week, right? Again, through practice you learn when to trust the data and when not to. You also develop product intuition, so when something is very counterintuitive, you ask more questions and you don't get convinced easily. That doesn't mean your intuition is always right, so you do have to trust the data, but you find the balance. Yeah? A strategy question. So LinkedIn is used by highly skilled people to improve their professional lives, right? To get a better job, or to learn more from others.
But there is a big challenge for people who are lower skilled. How do you help those people who are lower skilled to grow? Because those are the people who are often not even on LinkedIn, or basically cannot get an opportunity. So how do you... So the question is mostly about strategy for LinkedIn and the fact that we are more focused on people with high skills rather than people with other types of skills. Again, this is a question at the CEO level, but I know we are doing a few things. As you said, we are focused mostly on that first group, and that's how we are growing, which is fine for now. But if we want to go after those other markets, one of the things we do, as far as I know, at a very high level, is run programs with some schools, universities, and communities around different areas in the U.S. to teach people how to use LinkedIn, how to present their skills, and then also match them with jobs from companies, jobs that are not highly technical, right? We have started those programs, and I think they are pilots to see what we can learn. I think the main challenge is education. Many of these users, maybe especially in third world countries, are not using LinkedIn as a social network at all; they don't see the value, because there are not a lot of jobs posted there by companies. So another thing we are doing is talking to companies in different places in the world, to encourage them to put up their job postings, but also to explain the other value LinkedIn can provide, like understanding and following your industry. This is a very good question, but this is something we are probably going to do a little better over the next few years. Right now the focus is mostly on the types of jobs we have today. Thank you so much. Thank you, thank you, everyone.