And thanks for the patience. This also means the talk is going to be 35 minutes and not 45, which makes it easier on both of us, I think. So let me quickly get started. Welcome to this talk. What I'm going to try to do here is bring together two or three topics of interest and connect them to agility: learn some lessons from cognitive science, bring in some research from anthropology, and link the two to agility, so we can see how distributed agile has challenges that these ideas can help solve.

Briefly, this is about shortcuts to agility. There are shortcuts out there, and those shortcuts obviously will not really take you there. What they will do is make the destination look close, from here to here. It'll look really awesome, but you won't actually get to agile. So unfortunately, at the end of it, you don't really get there. I'll try to explain why we don't get there using those shortcuts, and what possible ways there are to make sure you don't go down the wrong road.

Very briefly, this topic is about culture; it's about anthropology. I'll start by making this as Indian as possible, the way we get introduced back home. Let's say, Vanakkam ayya — it's the way we greet people down south where I come from, in Chennai. My father's name is Ramalingam Sumedanki Sivaramaiya. He's from a village called Kaveri Pakam, where he used to cultivate rice some 70 years back; he no longer does. After that he begot me, and I went and studied in a couple of colleges called IIT Madras and IIM Calcutta, and then worked in a few companies: Polaris, Orphelab, BNY Mellon, ThoughtWorks, JP Morgan, and now I'm at Oracle. To cut a long story short, this is the way we are used to being introduced in this culture, and it's actually quite onerous. That's it. It is what it is.
And recognizing it helps us be more sensitive to why somebody does it, and not treat it like some kind of comic show. I'll quickly get down to the conversation now. Over the next 40 minutes, we'll look at what the shortcuts are, what leads us to them, and how not to be led into one. I'll introduce research done by Joseph Henrich and colleagues in the US: a lot of anthropological studies in a couple of areas. One is called culture-gene coevolution, which is how our social culture and our genetics influence each other back and forth. This leads to an interesting finding: what we consider normal across our whole population is not really normal. Finally, having looked at these anomalies, we'll see what tools are available to understand these cultural anomalies, and whether agility can fit into one of them.

Before going there, a few disclaimers. This is an exploratory talk; it's not a completely researched field, so I'm going to have more questions than answers. And it'll be a bit of a roller coaster, right? Going back and forth by its very nature. At any point in time, if you object to what I'm saying, please feel free to do so. For example, right at the beginning I'll tell you that a lot of my terms will come from XP, which I shall use as though it's agile. It's not really — people have pointed that out to me — it's just another flavor of agile. And I know that people in this room might even believe XP is the only way to do agile. If you agree with me there, you might still be with me in this talk. The last thing is an interesting quote from a statistician, G. E. P. Box: all models are wrong, but some models are useful. Bear that in mind as a disclaimer.
What we're describing here are models concocted in laboratories and researched empirically. None of those models can be exact; real life is much more complex than what we can capture in them.

Now let's start the story. I'll get straight to the shortcuts part of it, right? Like most important parts of our life, agility also has two different paths. There's a fork in the road, and we need to be careful which one we take. One path is the quick, rewarding, familiar path — symbolized by the red caption there. The other is the tough, boring, slightly counter-intuitive path, represented by Yoda. The wise old man is obviously going to make that path look as boring and difficult to follow as possible, but in the end it will be more rewarding. At the top, that's a poem by Robert Frost, which says "Two roads diverged in a yellow wood" — and the poem continues on why you should take the path less traveled by.

Now I'll describe what lies on the quick, rewarding, familiar path. For want of a better word — since I don't have a name for it, and there's Yoda on the wise side — on this side I'll call this fellow the pretty mean imp, which incidentally abbreviates to PMI. But let me just not go down there. So what is it that lies down that other path? Why do we end up taking those shortcuts, and what makes those shortcuts not really work?

I'll briefly touch on a few concepts from cognitive science. Cognitive science is just one of the fields that delves into the way we behave: what goes into our behavior and our culture, what we're thinking — the entire apparatus of language, thought process, and cognition that finally results in a certain behavior.
But through that slice of life, you can still see what leads us to make certain decisions, and why those decisions are subject to fallacies, errors, and biases — biases that can lead us down a suboptimal path. Here I'm going to describe some of those biases which may lead us down the wrong path into agility.

Broadly speaking, there are many biases — it's a widely researched field — but most of them can be captured in two buckets. One is called loss aversion; the other is attribute substitution. You might have heard of tons of other biases, but they more or less fall into these two large buckets.

Under loss aversion, you might have heard of prospect theory and the endowment effect. Prospect theory says we view losses very differently from gains. The endowment effect says we value something we own much more than something we don't; once you are endowed with something, it takes on a different kind of value altogether. So loss aversion captures those aspects of our decision making where the very fact that we own something, or the way we perceive the value of a good, keeps us from making the right decisions, because we are averse to losses. Risk aversion is one of those things — why we take gambles that are extremely stupid, why we chase lotteries — all of that is explained by this concept of loss aversion. And it plays a big part in how we make decisions.

The second is attribute substitution, and it's a really common thing. Attribute substitution fundamentally says that when we are presented with a decision — a set of inputs that beg to be analyzed to answer the question we actually need to solve — we end up substituting the original question with something that's easier for us to answer.
As well as substituting the information with whatever is more readily available to us. When you substitute these attributes, you end up with a wrong answer to the right question. Let me give you an example. A study was done among very erudite, educated subjects — I think at Stanford or MIT — where people were offered a gamble on the toss of a coin. If the coin shows tails, you lose 100 rupees; if it shows heads, you gain 150. How many of you would take this gamble? The people who took it are at the not-at-all-loss-averse end of the spectrum. Most people actually don't take it. That's because the way we perceive losses on one side is much steeper than the way we perceive gains; for a given loss, we want to be compensated by almost double the amount in gains, especially when the amounts are fairly large.

This broadly leads to certain ways we behave — for example, being too invested in work already done. There's a very good agile principle that caters to exactly this kind of loss aversion: at regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly. At any point in time you might not have built the right product, but unless you have reviewed it and are willing to throw it away — which is what the principle enforces — you get too vested in what's already been built and unwilling to change it.

The second way loss aversion shows up is when value is perceived as what's available in-house rather than what the consumer really wants. Has this not happened to you? Have you not seen people reuse a certain technology, architecture, or component just because it's available?
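As an aside, the coin-toss gamble from a moment ago can be put in numbers. The sketch below shows why a bet with a positive expected value still feels like a bad deal; the loss-aversion coefficient of about 2.25 comes from Tversky and Kahneman's published estimates, and should be read as an illustrative parameter, not a law.

```python
# Why most people refuse a coin toss that loses 100 but wins 150:
# under prospect theory, losses weigh more heavily than gains.
# loss_aversion ~ 2.25 is Tversky & Kahneman's estimate; treat it
# as an illustrative parameter.

def expected_value(p_win, gain, loss):
    """Plain expected monetary value of the gamble."""
    return p_win * gain - (1 - p_win) * loss

def prospect_value(p_win, gain, loss, loss_aversion=2.25):
    """Subjective value: losses scaled up by the loss-aversion factor."""
    return p_win * gain - (1 - p_win) * loss_aversion * loss

ev = expected_value(0.5, 150, 100)   # +25.0: favorable on average
pv = prospect_value(0.5, 150, 100)   # -37.5: feels like a losing bet

print(f"expected value: {ev:+.1f}")  # +25.0
print(f"felt value:     {pv:+.1f}")  # -37.5
```

With a loss-aversion factor of 1 (losses and gains weighted equally), the felt value collapses back to the plain expected value, which is why a purely "rational" agent would take the bet.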
We try to reuse it because it worked wonderfully in the last project — which may not be the right thing for the next project, and not exactly what the customer wants. And I assure you this happens in the most exalted cases, where developers question why the customer even needs a search box: all he needs is a text search, why make it prettier? Are we looking at it from the customer's point of view, or are we just seeing what's available and easiest for us to do? So loss aversion, as you can see, is something compensated for by certain agile principles, which ensure you don't take the easier path and instead follow the one that's more valuable to the customer.

The last case is throwaway code. Have you seen organizations that are happy to invest in throwaway code? In some industries it's anathema — people ask why you'd waste dollars now; just build exactly what you want right up front. Fortunately, welcoming changing requirements, one of the 12 agile principles, ensures you get back on track.

The second basic cognitive bias is the one that happens through attribute substitution. The example is this: subjects are asked two questions among many others — how happy are you with your life in general, and how many dates did you have last month? What would you expect the answers to be? When asked in this order, the correlation between the two answers is almost negligible. Asked in the reverse order, it's a 0.66 correlation. You automatically assess your happiness from whatever is freshest in memory — how many dates you had last month, or equally what you bought last month, or name it what you will. The information gets substituted in your head.
The question was "how happy are you?", but what you're really answering is "what made you happy recently?". This is what's typically called an attribution bias. A similar example is this one — forget the picture on the left; on the right is a case that was again given to Harvard grads. There are two ways to describe the scenario. In the positive framing, treatment A saves 200 lives out of 600 people in a country, say, while treatment B gives a 33% chance of saving all 600 and a 66% chance of saving no one. Framed this way, almost everybody chooses the positive-sounding, certain option A, though both say exactly the same thing. In the negative framing — under treatment A, 400 people die — people choose treatment B instead. It makes no sense: saving 200 lives is exactly the same as 400 people dying. In the positive framing people prefer the certainty; in the negative framing they prefer the gamble. What this really says is that how you frame the question can completely change how people perceive it. The actual question gets ignored; all that matters is what information it triggers in your brain — am I losing something, or gaining something? Am I saving somebody, or is somebody already lost? The wrong information gets substituted in, and you end up answering the wrong question.

How does agility help you here? Take, for example, assuming that past experience is superior to end-user empathy. How often have we substituted what the user really wants with what we think is a better technology, or a better design? That is answered by the agile principle: business people and developers must work together daily throughout the project.
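As a side note, the two framings in that example are the same arithmetic in disguise. A tiny sketch makes it explicit — using exact 1/3 and 2/3 probabilities for the talk's rounded 33% and 66%:

```python
# The disease-framing problem: both treatments have the same expected
# outcome, yet people flip their choice depending on the wording.
# Exact thirds are assumed for the talk's rounded 33% / 66%.

population = 600

# Treatment A: 200 saved for certain.
a_saved = 200
a_dead = population - a_saved    # 400 die -- the same fact, reframed

# Treatment B: 1-in-3 chance everyone lives, 2-in-3 chance no one does.
b_saved = population / 3         # expected lives saved
b_dead = population - b_saved    # expected deaths

print(a_saved, b_saved)          # 200 200.0 -> identical expected lives saved
print(a_dead, b_dead)            # 400 400.0 -> identical expected deaths
```

The only difference between the two treatments is certainty versus a gamble — the expected outcomes are identical, which is exactly why the framing, not the facts, drives the choice.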
If you create your own silos, you probably don't have the right information required to build the right product — and with the wrong information, you'll be answering the wrong questions. The second pattern is reliance on experts and specializations over cross-functional teams. If you have specific experts and the team is not really cross-functional, isn't there a likelihood of bias creeping in? Wouldn't you substitute your ideal product with whatever the expert thinks? If a UI-only team sits in one location and a Java architect sits in another, and they're not part of the same agile group, everybody's decisions will optimize for their own world, and they'll answer the question "what is a good product?" based on what their own world considers ideal design. This is exactly what's mitigated by a couple of principles: build projects around motivated individuals, giving them the environment and support they need; and the best architectures, requirements, and designs emerge from self-organizing teams. So you bring in motivated individuals, and you make sure you don't create specialized teams. You can see that what all these principles fundamentally do is overcome certain natural thinking and behavioral flaws, keep the real values of the end user in mind, and stop you falling prey to those agile shortcuts.

Let me take another couple of quick anti-patterns and see whether one of the cognitive biases plays into them. The first one I just call "working product, mind us". The typical anti-pattern goes something like: "I'm happy to accept change, but we will need a foolproof change control board, and the customer has to pay for it." So change is obviously welcome.
I don't think anybody at this point says change is bad. But you make sure to erect every possible barricade and bureaucracy so the change never actually happens — you are that averse to change. The second symptom: "the customer doesn't understand how good this feature is." You just built it, and it's obviously awesome because it worked perfectly in some other product elsewhere. The last project cost a lot of money, you brought in a specialist to do it, so this must be the most awesome feature — why doesn't the customer want it? I don't really want to change it. This obviously results in a kind of loss aversion: you don't want to throw away what you have, or what you've already built. And Yoda comes and says the agile path you must take is to welcome changing requirements, even late in development. How do you stay on that path? You play games to visualize the value and the future. How do you avoid loss aversion and make the right decision? You visualize what the future looks like, so the future is actually valuable to you and you're not worried about losing what you have at present. There are multiple ways to create this, gamifying things to ensure the value of the future is not lost.

The second anti-pattern is something I call "Freeze! I'm Ma Baker" — it's a Boney M song, for people who are into music. What it really says is: just because somebody is asking for something doesn't mean you should really be doing it. For example, there's a story that says: "As an MIS, I want data from the old source to continue to reach me when I'm retired, so that I can rest in peace." Would that be a good story, the hyperbole apart?
But I'm sure you've seen stories that say "I want data for the system, because this data will be consumed by XYZ; the format from XML needs to be parsed," and so on. Is that really a good story? Would you call it agile? If not, what's wrong with it? I'd say this is attribute substitution: instead of solving the real problem — what is valuable to the user — you're just building something that's useful for the system. Whether it finally adds value to the user should be captured; that is the goal behind the whole thing. A meaningless goal doesn't become a good story just because somebody asked for it, and a well-formatted story isn't automatically a good one. How do you mitigate this? Yoda says — not me — that our highest priority is to satisfy the customer through early and continuous delivery of valuable software, and it's important to note what "valuable" means. Not valuable to the person paying the money, or to the next team consuming it, or to the architect who must sanction your final design — valuable to the users. And how do you ensure this happens? Keep the goals visible. One way to do it is product mapping, etc. These are just techniques, not agile itself, but they are the ways agile gets implemented that keep you off the wrong path of attribute substitution, by keeping the main attribute always in mind. Keep your goal visible, and keep every single piece of work tied to that goal.

The last bit: self-organization often becomes selfish organization. For example: "As a PM, I want productivity metrics so that I can be sure the team is not slacking off." How many agile groups here collect your velocity over a period of time and question your developers when the velocity falls?
Or worse, how many of you treat the number of bugs found by QA as the important metric for QA? This is the normal way we think: we equate productivity with very simple metrics, and that doesn't really help the final outcome of value to the users. Again, I'd fundamentally say this is attribute substitution, a cognitive bias, and the way to avoid it is to ensure that the best architectures, requirements, and designs emerge from self-organizing teams. Don't have an expert come and tell you what's right for the product; it should be the team, with the product owner, deciding what's valuable for the product.

So, broadly, we come back to those two roads. The real reason behind this talk is not those agile shortcuts, I'm afraid. I promised you a roller coaster ride, so I'm going to double back on this. You thought the agile path — everything we just talked about, overcoming all those biases — was the tough, boring, counter-intuitive one. Over the next four or five slides, I'll challenge that opinion. Enter culture. So far agile has been brilliant — not a silver bullet, but at least it countered all those biases and ensured you meet the customer's needs, deliver software frequently, get the right feedback, and deliver the right product. What could possibly go wrong? Well, I don't know if you've heard the Louis Armstrong song: "You like potato, I like potahto; you like tomato, I like tomahto." The idea is that when two cultures come together, what you call things makes a big difference, and the end of it might be "let's call the whole thing off". You don't want your project to fail simply because multiple cultures are involved that see the value and the goals very differently.
So what is it about culture that can affect agility? Case in point: research done by Joseph Henrich, whom we mentioned earlier, at the University of British Columbia. He devised a bunch of experiments in two broad areas. First, he observed that the usual demographic for studying the principles of culture in psychological and anthropological research — he classifies these as WEIRD studies — draws its samples from people who are Western, Educated, Industrialized, Rich, and Democratic. Believe it or not, he actually went back and looked at the sample demographics of the psychological research published in the most peer-reviewed journals, and about 90% of it came from this WEIRD demographic. He then questioned whether what was found in that research really applies to the entire world. The second thing he studied is culture-gene coevolution. There are two schools of thought: one says culture is purely anthropological; the other says it comes from genetics. And there's an ongoing argument that culture affects genetics and vice versa. He's done a lot of research on that too, and has found that there is indeed movement of information across both genes and culture.

He used both of these to analyze a few areas. One is the ultimatum game, or dictator game — I'll describe the concept. The second is the concept of altruistic punishment and antisocial punishment. And the third is whether the framing bias we saw earlier — the perception of a physical phenomenon — means the same thing across all populations. Among a lot of different studies, these are the ones I'll talk about today, which may have implications for how we view and implement agility. First, the ultimatum game and the dictator game. It's a very simple game.
The ultimatum game involves two players. You give a pair of subjects a sum of real money for a one-shot interaction. Player one, the proposer, offers a part of the sum to player two — say, out of 100 rupees, "I offer you this amount." If player two agrees, each takes his respective share; if he rejects, nobody gets anything. It's an economic game used to understand two things. One is what people consider fair: what do you think you should offer the other person, given that as the proposer you have complete authority to propose? The other is how the responder reacts to an act of unfairness. The variation is the dictator game: player one proposes an amount, and player two simply has to take it. The difference between the two, obviously, is that in the second one there is no concept of punishment; player one just offers whatever he thinks is fair in a one-shot interaction. So the ultimatum game is a measure of self-interest and tolerance to unfairness — both are studied under it — while the dictator game is an undiluted measure of fairness. That's about it.

Here's what came out of it. Most of this research was done in the US as well as among a lot of tribes in South America and Africa. The chart shows the median "fair" amounts that people offered across those populations. A few tribes on the left start from 25 out of 100, and at the extreme end there's the US and a second tribe which offered close to 50. In fact, he found that most of the WEIRD demographic has this kind of distribution of what they think is a fair offer — ranging between 45 and 50.
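For the curious, the mechanics of both games fit in a few lines. This is a minimal sketch, not the study's protocol: the acceptance threshold stands in for a culture's fairness norm, and the particular numbers are illustrative, loosely echoing the 25-versus-near-50 medians just described.

```python
# One round of the ultimatum game. The responder's minimum acceptable
# offer models a fairness norm; setting it to zero turns the game into
# the dictator game, where the responder has no power to reject.

def ultimatum_round(pot, offer, min_acceptable):
    """Return (proposer_payoff, responder_payoff) for one shot."""
    if offer >= min_acceptable:      # responder accepts the split
        return pot - offer, offer
    return 0, 0                      # rejection: nobody gets anything

# A responder whose norm demands at least 30 out of 100:
print(ultimatum_round(100, 45, 30))  # (55, 45) -- accepted
print(ultimatum_round(100, 25, 30))  # (0, 0)   -- costly rejection

# Dictator game: same low offer, but the threshold is effectively zero.
print(ultimatum_round(100, 25, 0))   # (75, 25) -- taken regardless
```

The costly rejection in the middle line is the interesting part: the responder gives up 25 just to deny the proposer 75, which is the behavior the punishment experiments probe.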
A similar chart for the dictator game looks very much the same: a lot of tribes again offered close to 25 — "I'll offer you 25 and take 75; the second person had a chance to reject it in the ultimatum game, but I'll still do that." In the dictator game, as you can see, people offered slightly smaller amounts, because the other person has no way to respond; in the absence of possible punishment or damage to reputation, people offer slightly less, but fairly close. Still, you can see that the US has consistently been at one extreme, while a lot of tribes offered not even half the amount. So depending on the underlying values, our concept of fairness can be widely skewed. One takeaway is that WEIRD is consistently at the extreme. The second, surprisingly, is that some cultures rejected offers that were not only too low but too high — something you would never come across in the WEIRD demographic. He found it only in certain areas that were not educated, or that saw little commercial transaction, so money ended up being a way of evaluating people rather than a denomination of currency. People thought, "that guy is really stupid to offer me this much money; I find it unfair that I should take it." That's one case in point.

The second area is the concept of altruistic punishment, and something called antisocial punishment. In the ultimatum game, you'd expect that when the offered amount is too low, the person receiving the money rejects it. This graph tells a quite different story.
What it says is this: altruistic punishment means that if somebody acts in a way that's inimical to the group or the common good, the rest of society is expected to punish it — you will go out of your way to punish the person even if it hurts you, just to signal to the other person what fairness requires. And lacking this concept of punishment, or the reputation that goes with it, people act very differently. As expected, in most of the countries — the US, Australia, UK, Switzerland, and a few others — the punishment always points one way: you are willing to forgo something when somebody behaves meanly. The green bars show the deviation in somebody's contribution; if someone acts against the common good, they are punished. At the bottom, you see a few countries with a very strange behavior: even people who contribute well get punished. I can see a few people nodding here — it's not unheard of — but it's almost marginal in the top six or seven countries out of the twenty. And note that this comparison is not the WEIRD demographic's contrast between industrialized nations and the tribes of Africa we saw earlier; these are differences among Western and non-Western countries themselves. It's a very strange behavior — so what underlying values and assumptions lead to this difference? We'll see a little more of this going forward.

The third thing he studied is the Müller-Lyer illusion. It has various avatars you'd have seen on the internet: how do you judge the length of two different lines when they're framed in different contexts?
Again, he ran the study across all the groups, and the graph shows what people perceive as the difference between the two framed lines — people report, say, that A looks like approximately 80% of B. You can clearly see people who think A is the shorter line compared to B, which is the normal way we perceive it given the framing. The graph measures, in different parts of the world and different tribes, how big they saw this difference: on the right, most people saw around a 20% variation in length; elsewhere you see completely zero. The white and black bars are adults and children — they wanted to see whether adults perceive it differently based on how they were culturally brought up during their lifetime. In most cases it's fairly similar, and — sorry, I had it backwards — the adults are actually less prone to the illusion than children; they've learned to compensate for it, which is why the black bars differ slightly from the white ones. But in some of the tribes it's very striking to note that the adults had a very small perception of it; they completely compensated for the bias and could see the lines as actually equal. So even the physical perception of length is not common across the entire world — and this is not just what you learn during your lifetime; it's culture-gene coevolution. Some of us have been genetically programmed to see, or not see, optical illusions depending on how they're framed.

These three experiments make you start questioning what we take as truth: how people perceive reality, what leads us to think something is fair, what our value systems say should be punished, and what's good — is it all the same? So, given so much anomaly, what do we do? I think what we are left to do is what
Professor X does: mind reading. But the good thing is, Professor X also has some tools there that make his mind reading even better — people are used to the X-Men. So two things are possible: you can try to read minds, or — the better option — use the available tools more responsibly. There are tools available that let us understand these cultural nuances and determine what's appropriate to the situation. For example, there's one called Hofstede's cultural index. It's been in the making for decades now, with a lot of research and data behind it. Geert Hofstede is a professor of sociology, and there are a couple of other tools available, like Trompenaars', which also define dimensions along which you can identify cultural differences. Hofstede originally proposed four dimensions, which later became six: the power distance index, individualism, uncertainty avoidance, masculinity, pragmatism, and indulgence versus restraint. All this data comes from years and years of study — the World Values Survey, an open data project you can go and get data from, covering about 90 countries; you can see what people actually responded and how differences in perception play out across those countries. I'll quickly go through each of these dimensions and let you evaluate what you make of them: how does this affect agility?

The power distance index is the degree to which the less powerful members of a society accept and expect that power is distributed unequally. It's not just the reality of how power is distributed — whether there is a hierarchy or not — it's also people's acceptance of it; that is what's defined as the power distance index. At the bottom is a cartoon of a famous disaster involving Japan Airlines, where there was a lack of communication because the co-pilot would not point
out the mistake to the main pilot and the later crash I think back in 60's something and Jal went through a major crisis after that so you can see power distance index has an implication culture and this is how the distribution of it really looks like on the left you see that our countries which are extremely high in power distance index which means they both tolerate and accept and impossibly expect that the power is distributed unequally and fairness in some ways follows power so India is somewhere there of course 77 while the US UK US actually most of the you will see the weird demographic countries fall on the right side and although other countries you would find so what it really means is in these countries where power distance is lower it's expected that if you question anybody you can walk up to anybody you do not need an authority to stand for yourself and power is not a deterrent to you are getting means in this world while on the ones on the left there is a natural assumption that the more powerful are likely to get more means in their life so might wins right is fairly prevalent in these countries so that's one of the dimensions individualism sorry before going there so it's just my hypothesis that I'm happy to listen to a different one that for agility I would think that ideally a low PDI might be more useful right so would you not think that if self-organizing team which has got to decide things for itself it should not be the PDI should not be in an obstruction to this so obviously the higher the PDI the more likely you will face a challenge in implementing certain aspects of agility second is individualism it's the degree to which individuals are integrated into groups in individualist societies the stress is put on personal achievements and individual rights while on a collective society you are more identified as a group you do not try and create your own identity beyond or you associate your identity mostly with a group a family with a firm you 
are working with, or with a country; to a certain extent, even patriotism is linked to this concept. Again, you can see there is a big variation in how the various cultures score on individualism: in the UK and Sweden it is really high, while it is really low in China and India. I don't see a big impact of this on agility; I would say that even an average amount of individualism could still aid agility. You do not necessarily have to promote individual achievement or individual rights over the group's.

I think I have five minutes more... two minutes more, so I want to quickly run through this tool and compare and contrast what would probably help us be more sensitive in applying agility.

Uncertainty avoidance is a society's tolerance for uncertainty and ambiguity. A society which tolerates ambiguity is one that is less likely to insist on rules; in a society which wants things to be more certain, you would have rules, and the rules would be followed more strictly. I would think that agility would ideally call for lower uncertainty avoidance: as we say, you don't want to avoid uncertainty, you want to embrace it. I will take the question after this so I can quickly run through, but I do have a hypothesis there on whether it is the end or the means: uncertainty is inevitable, yes, but you are embracing agility to minimize it at some level, right? Fair enough, let's get there.

Third is masculinity: the distribution of emotional roles between the genders. Masculine cultures value competitiveness, assertiveness, and materialism, while feminine cultures place more value on relationships and quality of life. This is obviously not gender-based stereotyping, but more an attempt to find out what drives a certain society, how it channels its energy. My hypothesis, as shown, was that you would want slightly higher masculinity as defined by these values; sorry, that is completely wrong, it is a mistake on the slide: masculinity should ideally be mid to low, much lower.

Next is pragmatism, which describes how people relate to the fact that so much happens around us that cannot be explained. In societies with a normative orientation, most people have a strong desire to explain as much as possible, while ones that are more pragmatic are okay with things just happening; you don't need to have the final answer to every single thing. My hypothesis here is ideally mid to high on the pragmatic side: I want people to be accepting and learning rather than finding answers for everything. A culture that is purely metrics-driven, trying to find the final answer for every bit, is not likely to accept agility that easily.

The last one is indulgence versus restraint, and I think here the score should be pretty clear. Indulgence stands for a society that allows relatively free gratification of basic and natural human drives, while restraint stands for a society that suppresses gratification of needs and regulates it by means of strict social norms. Societies which score low on this dimension have, in all likelihood, a tendency towards cynicism; one that exercises too much restraint, with a low indulgence quotient, is likely to be pessimistic at some level, because you are not really letting go of your natural urges. So my hypothesis here, again, is that a society which is higher on indulgence is likely to be more tolerant towards agile and to adapt to agility more easily.

Once again, I go back to the original disclaimer: this is a model which just helps you understand the nuances, and these are hypotheses coming from a certain cultural background that I possess. I am very open to discussing this, and that is the whole reason this topic exists: I want this discussion to be able to lead
to a more nuanced approach.

So, given that tool, that is one way of looking at it, and broadly it summarizes to something like this: the ideals I proposed are the green line in the spider chart, and the pink is India, for example. There is a fairly large variation there, and seeing it can lead to a better way of addressing things.

Quickly going back to the anti-pattern I talked about: for example, last time we said the ideal path for agility would be that "the best architectures, requirements, and designs emerge from self-organizing teams." With a more nuanced approach I might say, yes, it is a self-organizing team, but I might still put some kind of authority figure there who can say whether what you are doing is right or wrong. It may not work in every place, it may work in some, but I will not completely dismiss the idea of having a chief architect or a PM, somebody who is an authority in specific areas, because where power is not only present but expected, the lack of it can lead to problems. Here I am not condoning wrong behavior; I am just saying this can be a bulwark on which to build agility in the long run. If you try to jump to the end goal immediately, you are probably missing the calcium in between. Just a hypothesis.

Second, for example, is the agile principle which says you should welcome changing requirements. I would say: don't take the practice literally; take the value underneath it, which is the real goal. It is not about creating change for change's sake; if you are still creating a good product, and the product is supported by some other means, by hypothesis testing, I would say go for hypothesis testing rather than change for its own sake.

Last slide. The idea here is this: not everything is lost. You don't necessarily have to think that, since there is so much diversity in cultures, what works here won't work there. There is a very interesting statistic, again from the World Values Survey, which categorizes the same cultures on two parameters: one is survival versus self-expression values, and the other is traditional versus secular-rational values. It should be fairly clear that the societies that embrace more secular-rational and self-expression values are likely to be the ones that are more experimental, that are willing to learn much more, to question the traditions that do not make sense, and to move towards a better path. I could probably look at this in a lot more detail if somebody is willing. There is definite proof that over the last four decades even the nations that were lagging on some of those values have been moving towards the top-right quadrant. And there is a second finding: the richer you get, the more you move towards a pragmatic, more rational society.

With this, before concluding, I just want to say this: as Robert Frost said, over the years we can try to keep following the ideal path, but the ideal path is not always obvious; unless you question whether it is really ideal, have some way of verifying it, and adapt to the situation, you are probably not taking the right path. In the end, what will make a difference is what you see. Thank you.
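The spider-chart comparison described in the talk, an "agility-friendly" profile laid over a country's Hofstede scores, can be sketched in a few lines of code. Everything here is illustrative: the "ideal" ranges are only the speaker's hypotheses from the talk, and of the country scores only India's PDI of 77 comes from the slides; the remaining numbers are made-up placeholders, not Hofstede's published figures.

```python
# Sketch: flag where a country's Hofstede profile falls outside the
# hypothesized "agility-friendly" ranges from the talk. Scores run 0-100.

# Hypothesized ideal ranges (low, high) per dimension, per the talk:
# low PDI, moderate individualism, low uncertainty avoidance,
# mid-to-low masculinity, mid-to-high pragmatism, higher indulgence.
IDEAL = {
    "power_distance":        (0, 40),
    "individualism":         (40, 70),
    "uncertainty_avoidance": (0, 40),
    "masculinity":           (0, 50),
    "pragmatism":            (50, 100),
    "indulgence":            (50, 100),
}

def friction_points(profile):
    """Return the dimensions where the profile falls outside the ideal
    range, mapped to the distance from the nearest bound of that range."""
    gaps = {}
    for dim, score in profile.items():
        low, high = IDEAL[dim]
        if score < low:
            gaps[dim] = low - score
        elif score > high:
            gaps[dim] = score - high
    return gaps

# Illustrative profile: the PDI of 77 is from the talk; the other
# values are placeholders for demonstration only.
india_like = {
    "power_distance": 77,
    "individualism": 48,
    "uncertainty_avoidance": 40,
    "masculinity": 56,
    "pragmatism": 51,
    "indulgence": 26,
}

print(friction_points(india_like))
# → {'power_distance': 37, 'masculinity': 6, 'indulgence': 24}
```

The size of each gap suggests where the "nuanced approach" from the talk might be needed most, for example, retaining a visible authority figure on an otherwise self-organizing team when the power-distance gap is large.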