So this is lightning talks. Again, this is a last-minute impromptu thing that we thought would be great. We've always had lightning talks at the Agile India conference and we wanted to do one this year as well. The concept with lightning talks is that these are very short, three-minute talks. This is a great way for anyone to share their ideas and present to the greater forum. Especially if you are a first-time speaker, it's a great way to get up there, speak, present and, you know, be heard. What I would request is: if you want to come up and speak, please propose your topic in the Q&A tab, like a question. It doesn't have to be a question, but propose a topic using the Q&A. In case we have too many topics and cannot fit everybody, I'll pick topics based on the number of likes. Once your topic is picked, I will call out your name; you raise your hand, and then I can bring you on stage and you can present the topic. Okay. You can choose to use slides, or you can just present the topic without slides. It's up to you, whatever makes you comfortable. Great. This is such an awesome crowd. I see four topics there. Yeah. So we will get started, and maybe, you know, more people will get encouraged when they see some of these. Okay. Keep hitting those like buttons and we'll get started. Cool. So I think the one with the highest likes is the empathy one. So, Babita, if you are in the audience, can you please put up your hand and I will bring you on stage. Hello. Yes. Hi, Babita. Thank you. Thanks for proposing a very important topic; I think empathy is extremely important. I'm going to set my timer for three minutes. Are you going to do slides or are you just going to talk? I just want to talk. Perfect. Take it away. Yeah. Thank you, everyone, for voting for that.
Empathy is a word that I learned long back; one of my team leaders used to talk a lot about this word empathy. But I only started to feel it recently, when I took on the role of Scrum Master and decided to understand people more. Now I'm trying to apply it with my daughter as well, in day-to-day situations at home. What I observed is that when someone is being very, very emotional, whether because of the work that we are doing, or because a little kid has been hurt by someone or wants something, they don't need any of your advice. They don't need whatever solution you have come up with to solve their problem. What they need is for you to listen them out. You listen to whatever they have, keep asking them what more, what happened then, and then what happened, until they calm down enough to tell you all the problems they have faced. I feel that it's at that moment, when you go and share what you have, something that could help them out, that they open up their mind and listen to you. And then maybe they can look at the problem from a different direction altogether. So that was my learning, and I thought I'd be happy to share it. I don't think I'll need the full three minutes. Thank you. Great. So: empathy, and, as you said, people don't need advice or any of that stuff. That's great. Cool. Thanks, Babita. Next up we have FinCy, who wants to talk about why we should stop using the term agile with customers. So FinCy, if you can put up your hand, or actually FinCy can join directly; either way is fine. So, I actually wanted to encourage people to put up topics, and that's the only reason why I proposed this. Some enemies of mine over here just hit like on it. Since I am already on the stage and on the screen, let me just go ahead. So we had this discussion earlier in the morning today.
I don't know if Sridhavi and Lakshman are listening to this, but we spoke about customers being worn down by this term called agile. Every time we use this term, different people perceive it differently. We spoke about mindset and all that, and I don't want to go into that. But I felt, and I think I've been doing this for quite some time now, that I should stop using any agile terminology, or the term itself, in front of customers, because they hear a lot about agile in the market. They want to be agile, but they don't know what it is. Even we are not experts; we are all learning, and that's why we are here. So I think we should slow down on using that terminology, or agile jargon, with customers, and slowly start using facts instead to prove the point. If you really want to be agile, it actually means that mindset of doing things and adapting to change quickly. If you understand that part of agile, you really don't have to prove your point by putting words into their mind. Instead, just show them the results, show them facts, and slowly they'll start getting used to it. So I want to share something very personal; I wouldn't say a story, but something that happened to me of late. There was a client who heard this term and wanted to use it in their projects, and said that if you get this done for us, it's going to be project after project. So everybody wanted to showcase that they are agile: I know this, I know that. But the problem was that the person on the client side who actually proposed the project did not know anything about it. So slowly we started showing results, which actually helped him understand what it meant, what we meant. We brought him into demos, product demos, and it was a product, so slowly he started understanding what it meant.
So my general opinion, maybe it's just a thought that I want to put out there: stop throwing the word agile, or anything to do with agile, at customers, because for them, especially people who are completely on the business side of it, they might not understand the technicality behind it, but they would love to see results and facts. I'm done, Naresh. Awesome. So: results over terminology. Yes, please. All right. Thank you. All right. I am going to go next, and then there is one other topic after that, but, you know, folks, please go ahead and propose your topics; we're going to bring you up on stage. So I'm going to quickly go next. My topic is: after evangelizing TDD for about 10-plus years, and practicing it for maybe 15 years, why I stopped TDD, and what lessons I learned after I stopped TDD. The first thing that I discovered is that when I was doing TDD, and I had got quite proficient with it, I was able to get away with a lot more complexity. When I was doing TDD, my code could be fairly sophisticated, fairly, you know, dense and complex, and I could get away with it because I had the safety net of TDD. And somewhere deep down I felt that, you know, one of the core principles we always strive towards is simplification, simplifying things, and this started contradicting that a bit. This is again one of the places where I moved away from object orientation, got influenced by functional programming, and started writing more functional code, if you will, trying to embrace simplification and minimalism. So that's the first problem that I saw: I could get away with complexity because I had the safety net of those tests. And then, when I stopped TDD, it immediately hit me that I no longer had that safety net, so I could not write complex code.
So I really needed to focus on simplicity, and, you know, functional programming to some extent came to the rescue for me. The second thing that was bothering me for quite some time, and I mentioned this today in the keynote as well, is that I often kept making the statement that code is a liability. Code is not an asset; code is a liability, and you need to be throwing away code. You need to invest less in code and throw it away more frequently, which allows you to experiment more. But when you do TDD, you write very elegant code and you start getting attached to the whole thing. When you stop writing tests, you don't have the safety net, and you also want to experiment more. Of course, you can't afford to throw away everything, but you now have to become extra careful about how you decouple your code, because then you can throw away small parts of the code, not the whole thing. And, you know, you can move away from that attachment and the sunk cost which plays on your mind. So if you're trying to get into this exploratory mode where you want to try things, throw away things and move very rapidly, not having that pristine code written with tests really helps you get into more of an experimental mindset. It also gets you to think about how you will decouple things in a much deeper and more meaningful way. So that's the second reason. The third thing which people often keep saying is that to refactor code, you need tests; you cannot refactor code without tests. A lot of people cannot even think of refactoring their code without tests; it's a nightmare for them. And what I discovered over the years is that there are actually a lot of safe refactoring techniques with which you can still refactor large amounts of code without needing any tests. It just needs you to be a little more thoughtful and methodical, and to use the right tools.
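As a concrete illustration of "safe refactoring techniques" (my example, not one from the talk): some refactorings, like extract function, are purely mechanical. If you copy the extracted body verbatim and only introduce a name, behavior is preserved by construction, so no test suite is needed to make the move safely. A minimal sketch, with hypothetical function names:

```python
# A behavior-preserving "extract function" refactoring, done mechanically.
# Because the extracted body is copied verbatim, no tests are needed to
# be confident the two versions behave identically.

# Before: one function doing filtering and computation inline.
def total_price_before(items):
    cleaned = [i for i in items if i is not None and i.get("qty", 0) > 0]
    return sum(i["price"] * i["qty"] for i in cleaned)

# After: the filtering logic is extracted into its own named function.
def valid_items(items):
    return [i for i in items if i is not None and i.get("qty", 0) > 0]

def total_price_after(items):
    return sum(i["price"] * i["qty"] for i in valid_items(items))

# The refactor is verifiable by inspection, but we can also spot-check it:
items = [{"price": 10, "qty": 2}, None, {"price": 5, "qty": 0}]
assert total_price_before(items) == total_price_after(items) == 20
```

Modern IDEs automate exactly this kind of move (extract, inline, rename), which is what makes large refactorings without tests practical.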
So refactoring without tests is very much possible. We've done large amounts of refactoring without tests and, you know, the earth has not stopped spinning. Things can be quite easy. So the third point I want to make is that refactoring without tests is quite possible, and sometimes not having tests actually makes you refactor in very interesting ways which you would not have done otherwise. That was a moment of revelation for me. So those are the three big-ticket items that I would say really helped me. And like I was saying earlier, the ConfEngine code predominantly does not have any tests, barely any tests, and it's working fine. We've done lots of transactions, financial transactions, a lot of systems. I'm not saying we don't have any bugs. We may have bugs, but I don't think those bugs would have gone away if we had tests. So sometimes I think it's important to just step back and look at what benefits you're getting from TDD, and if you don't do it, what the advantages and disadvantages are. Just keep an open mind; don't be very dogmatic about it. These are all important tools in your toolkit as developers; use what works best in the given context. So thank you. That is my quick three minutes on lessons I learned from quitting TDD. Paul, why don't you come up? Paul, if you want to put your hand up, I'm going to pull you up. Hey. So, as I said, pick an argument. Yeah. Small stories. Whether we should or shouldn't do Selenium testing in the builds. What is "the build"? Plenty of things I'm ready to argue about, if you are ready to say something controversial, if you wanted to pick one of those or something else. I've been a big proponent of not writing end-to-end UI tests, or at least drastically limiting them. I think I gave a talk a few years ago about Selenium detox.
You know, the term Selenium was picked because it is a remedy for mercury poisoning. And I was kind of trying to find the reverse of that: now we have too much Selenium, Selenium poisoning, so we need to move away from that. For a start, I don't like the term E2E, and I don't use it, end-to-end or E2E, because every person I ask says a different thing. What is E2E? Everyone gives me a different account. Is it full-stack testing? Well, for some people it is, for others it's not. And I don't do it. I'm going to engineer something with unit tests after compile. And then, when that all passes, I'm going to follow with integration tests, and those could be headless service tests. You can do those with service virtualization technologies, you know, but they seldom involve a UI aspect. And then I'm going to bring up the functional testing with Selenium or Appium or something, Flutter driver in my new life in Flutter. But I'm not going to run that against the full stack. I'm going to bring up the smallest viable testable thing that I can, as quickly as possible, and I mean in split seconds. I'm going to do that on the same machine the tests are running on, the same localhost that Selenium WebDriver is running on. And I'm going to test something that could be in a Docker container, or something that's just stood up within a JVM or Node or Rack or, you know, a Python process that is only servicing the particular sequence of operations I want to do within the browser. And if I can, I'm not going to include the whole site. I'm just going to concentrate on one rectangle and leave everything else out. So I'll need, on my dev side rather than my test side, to set that up so that I can go and functionally test rectangles. But I'm running Selenium now at three tests a second, or five tests a second, depending on the technology.
And I can crank through a lot of tests without involving a Selenium cloud or a grid, and without parallelizing. And I can focus on individual things: if I'm testing a credit card page, I can test every single permutation that could be represented in HTML, CSS and JavaScript. But what am I hitting in the tier below? It's probably not the database. It's probably something that is only stood up minimally to support the functional stories within that test sequence. And I would again do compile, followed by unit tests, followed by service tests, followed by these component tests using Selenium or Appium or Flutter driver. And I want the whole build to be finished in one minute, all of it: compile, unit, service and functional tests. So in that particular vision, I don't ever say that was an E2E test, or a representation of any E2E exercise. Each particular thing that was tested was tested directly. Like, if you have to log in before you get to the credit card page, in my test suite I'll bypass the login page and just load up the credit card page ready to go, with a cookie. Cart not filled; pretend that I filled the cart. I'm straight on the credit card page, paying for a hypothetical cart, and I'll test every single edge case that could possibly happen from there. So I myself don't say E2E, and I refuse to cooperate when people start having conversations that feature the expression end-to-end, because they don't do it. Dammit, we can't disagree with each other. We need to pick an argument here. Yeah. I think I know who started E2E; I think they coined it back in 2011 or something like that, and it has run away since. People say "E2E testing frameworks", and it's like nobody knows what that means. Just for the benefit of the folks in the audience, Paul Hammant here is the co-author of Selenium. So he's someone I would trust when he makes that statement. He's been there; he's done that.
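Paul's "smallest viable testable thing" can be sketched in a few lines (my sketch, with hypothetical route and page names, not code from the talk): a throwaway server, stood up in-process on localhost, that serves only the one page the test sequence needs. A real suite would point Selenium WebDriver at this URL; here, to stay self-contained without assuming a browser is installed, we just fetch the page over plain HTTP to show the shape of the setup:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# A minimal, throwaway page under test (a hypothetical credit card form).
PAGE = b"<html><body><form id='card'><input name='number'></form></body></html>"

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/credit-card":   # the only route this stub supports
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(PAGE)
        else:                              # everything else is out of scope
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):          # keep test output quiet
        pass

# Port 0 asks the OS for any free port; startup takes split seconds.
server = HTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/credit-card"
html = urlopen(url).read()
assert b"id='card'" in html   # the rectangle under test is reachable
server.shutdown()
```

The point is that nothing outside this one "rectangle" exists: no login page, no database, no full site, so the browser-driving tests that run against it can go very fast.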
And so when he says that this is not what Selenium should be used for, that means a lot. It's not some Joe off the street making the statement. So I just want to clarify that. But I'm controversial, in that it's possible the Selenium team themselves don't agree with me anymore. So I'm happy to be in a club of one around component testing and the avoidance of E2E testing. I mean, from whatever interaction I've had with Simon and the gang, I think a lot of them are on similar pages, in my opinion: they are not encouraging or pushing people to be very top-heavy in that sense, to write a lot of these Selenium tests; they are actually recommending you push those tests to the lower layers of the pyramid. I stopped using some of these terms because everybody has their own interpretation of what an integration test is. You ask five people and everyone will have a very different definition of what an integration test is. So I stopped using that term; it just means nonsense now. Yeah, integration tests might be easier to argue: you could say "I think this means this" and have a reasonable case for it. But E2E, functional testing, they're way out there now. You can't get any five people to agree on even two alternates. But yeah, it's problematic. Now, folks, what do you think integration tests are? If you want, just quickly type out in the chat window what, according to you, an integration test is. I'm curious, because every time I talk to people, I find quite different definitions of what an integration test would be. How do we get to the chat window? Just on the right, you would see the Discuss tab. Yeah. Aslak spoke maybe three years ago at Agile India. He actually did a fairly good presentation on the need for fast cycle times within functional testing suites too, using some of the techniques we're alluding to. Correct. Aslak being the Cucumber guy, Mr. Cucumber. I remember him building this framework at one of the clients where we were.
And basically any code that did not have tests, he just deleted from the repository. And that was wicked, but I think it drove home the point that people could not just throw in code. He talked about it as professional malpractice. So Aslak can be quite intense; he's an extremist sometimes, right? In a good way. All right, I think it's just you and me, Paul, because I don't see any interactions on this discussion. I need some discussion here. So, what do you think integration tests are? If it helps, I can use a very cliched example that I typically use. I have a calculator application. It has a calculator UI, which is basically the front end where people can type in the numbers and hit equals. It also has a little display which shows the results and the numbers that you're typing in. Then it makes a service call; these days serverless is the fashionable thing, so there's a function in the cloud. It makes a call to that serverless function, which then looks up some kind of a data store, does some computation based on that, and returns the result back to you. To keep it simplistic, you have three components, if you will, at play: your UI, your function-as-a-service, and some kind of a data store. So if this is just a typical CRUD-application-type thing, what would be an integration test in this context? Can folks in the audience please type out what they think an integration test would be? So Vinaya says: making sure these components work well together as expected. Anyone else? If you agree with that answer, do a thumbs up. I see only three people doing the thumbs up. What about the rest of the folks? We don't have a thumbs-down feature. Thumbs up if you hate that as a definition of integration tests. You hate it; it's too abstract. I think it's fine. Vinaya said she did a couple of thumbs up. So it looks like it's just the three of us then.
Vinaya, do you want to come up on stage? Yeah, hi Vinaya. So I think, again, just in the spirit of being a little controversial and having an interesting discussion here, not trying to pick on anybody: the point I was trying to make earlier is that "making sure the components work well together as expected" seems too abstract to me. So please help me with an example of what an integration test would be in this context. Sure. The way I'm thinking about it is that anything that has been developed separately but is meant to work together, we want to make sure that that happens. So in the example that you gave of the calculator, let's say there's a UI layer and there's a host layer, and the work has been done separately on those. Each of them works well by itself; they have been unit tested, so individually, you know, they work. Now we want to make sure that when both of these are brought together, if I'm putting in a number from the UI, it's hitting the right services, returning the right responses, getting displayed correctly. So, you know, taking a cut across the components, seeing if that flow works or not, I would call that an integration test. So, just trying to build on that: you would have some kind of a test which would punch numbers into the UI, hit enter, send the request, and then the response would come back and you would assert whether the response that came back is what you expected. So you sent two and three, and then you would verify that five came back. Right. So, great. Now, here's something for you to think about: how is this different from the other, higher-level tests, like your end-to-end tests? Is this not a kind of full-stack test in some sense? How is this different from those tests? Right. So, I'm going back to my experience from many, many years ago as a developer and thinking about it.
And I would say that maybe when we are thinking of integration, we are probably focusing a little more on the interface, just that bit, and the other functionality, et cetera, maybe takes a backseat. Whereas when we are looking at higher-level functional tests, maybe we are a lot more concerned about the overall business flow, the functionality within each of the modules behaving right, and all of those things. In integration, from what I remember, we used to focus more on the interface: whether the two components know how to speak to each other the way we expected them to. Absolutely. So that's what I would prefer as the focus of integration tests: whether two components can talk to each other, can communicate with each other. And the first thing is, I would never assert that the response that I got back is five, because that is now functional. It really doesn't matter if I got five back, six back, or nine back; that is not a concern of the integration test. That is the functional aspect, and I would let functional tests deal with it, ideally at a unit level or maybe at a higher level. The integration test should certainly not be asserting what the result came back as; as long as the result was a valid number, or an integer where one was expected, that's good enough. That's one thing. The second is: do you want to drive from the UI in this specific case, or do you want to go from the last mile, from where you're actually making the API call? Like, in this case, you have some kind of an HTTP client which is making an API call. So you are probably interested in hitting from the HTTP layer and seeing if it can communicate with the right service. There is some kind of a configuration lookup to figure out where the server is: whether you can connect to it, whether you're using the right protocol. So if your server is expecting an HTTP JSON request and you're sending XML, it's not going to work.
Those are the kinds of things that generally people talk about when they talk about integration tests, and that's what makes sure that these two components that were independently developed can talk to each other. Right. While that's true, I would just say that, again, I'm talking about history, we did tend to mix the functional and the interface tests, unless we were talking about API calls from third-party vendors, et cetera. There, we knew that the functionality wasn't going to work at the first go, so that is where we really narrowed it down and said, okay, let's just look at whether the interface works and park everything else for later. But I appreciate the point. Cool. So, thank you. I mean, I was just trying to clarify again that the term "integration tests" means quite different things to different people. And now, with a mix of technologies thrown into this, it could get further complicated. Absolutely. And so I've now stopped using the term integration tests, because I think people come with a lot of baggage around it. In my morning's talk, I basically talked about unit tests, component tests, you know, component-level service tests, contract tests, and application tests. Application tests is something everyone understands. And then, you know, acceptance tests, and then essentially shadow-mode tests which you run in production. Those are the terminologies that I seem to be sticking with these days. But who knows, even those can be problematic. So contract test is your terminology of choice right now for integration tests? It's a subset, in some sense. It is only verifying whether these two components can contractually talk to each other and adhere to the contract. So that's the terminology we're using. The moment you say contract, everyone knows that you're not asserting functionality.
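The contract-versus-functional distinction being discussed can be sketched for the calculator example (a sketch with assumed field names, not code from the talk): a contract check pins down the shape of the response, JSON with an integer "result" field, but deliberately does not assert what the number is, because whether 2 + 3 equals 5 is a functional concern tested elsewhere:

```python
import json

def check_contract(raw_response: bytes) -> bool:
    """Return True if the calculator service's response honours the
    agreed contract: a JSON object with an integer 'result' field."""
    try:
        body = json.loads(raw_response)            # must be valid JSON
    except ValueError:
        return False
    return isinstance(body, dict) and isinstance(body.get("result"), int)

# Any integer result satisfies the contract...
assert check_contract(b'{"result": 5}')
assert check_contract(b'{"result": 9}')           # "wrong" answer, still contract-valid
# ...but shape and protocol violations fail it.
assert not check_contract(b'{"result": "five"}')  # wrong type
assert not check_contract(b'<result>5</result>')  # XML breaks the JSON contract
```

This is also why the mismatch mentioned above (server expecting JSON, client sending XML) is squarely a contract concern: it fails the shape check before any functionality is even in play.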
Everyone knows that all you're interested in is seeing whether this other thing meets the contract and is backward compatible with the contract, because those are the important concerns when you're integrating. True. Everything else I can test at a component level or at a unit level, in terms of functionality and other kinds of things. So I think just using that kind of terminology has at least helped us clarify this thinking process with people. Sure. Thank you. That was very useful. Thank you. Yeah, I do see, Shakti, you're back. So I'm going to quickly invite you. Can you accept? There we go. Finally. All right. Okay. Finally, I'm there, with my Safari browser. I'm not sure. Yeah, I think for three minutes it should work. Awesome. Cool. So your topic, Shakti, was about personal agility. Take it away. Thank you. Yeah. Thank you for this opportunity. Hi all. Just a small background about me: I'm an agile coach working at Société Générale. So we have been talking about agility, agile and things as such. When it comes to personal agility, it becomes very personal to us, right? What I'm trying to say is, if I had a slide that I could share here, I would strike out "agility" and I would make it "ability". Personal agility is nothing but personal ability: our ability to cope with a situation, our ability to finish the work we start, our ability to handle failures. These are the different terms we have been talking about in the agile world also, right? When all these are boiled down, I have found that it's really about the mindset. If we are talking about personal agility, we have to talk about the mindset shifts which make us personally agile, right? So there are four mindset shifts that I would like to talk about. First is self-awareness.
In Shane Hastie's session, I raised a question asking how we build a coaching culture: how do we start with getting people to accept being coached, teams to be coached, people to understand that, okay, it's not only about me, I can take help from others and still get the job done? So self-awareness becomes a very important aspect here, to transform ourselves. We have to first know our strengths and our weaknesses. Are we really vulnerable, which means: are we ready to accept failures? Are we ready to take risks? Are we ready to expose our limitations? Next: do we have limiting beliefs? Do we have biases in our mind, like, maybe I can't do this, maybe this can't happen because of so-and-so reasons, instead of diving into the doing part? And we also need to check on ourselves: are we really courageous enough to make decisions? A lot of times we get an opportunity to take ownership or responsibility, but do we really have the courage to make decisions based on our context, our experiences, the feelings that we have at that moment? So we assess ourselves on self-awareness; that's where we start. That mindset is the first stepping stone for personal agility. The second mindset shift, according to me, is the growth mindset. If we don't know how to celebrate our failures, if we don't know... I'll give you a last minute to quickly wrap up, please. Okay, okay, fine. So I think we know about the growth mindset: we should learn to fail, and we should learn from our failures. This I call the second mindset shift for personal agility. Next is to do things that really matter, to be able to prioritize our work.
Unless we know how to prioritize, unless we know what is very important, unless we know how to do things in a timely manner, our agility is at stake, which in turn hampers our abilities. The fourth is to be innovative. When I say innovative, it does not mean that we go and learn the latest technologies. From our own perspective, how innovative are we in completing, or taking up and deep-diving into, a new responsibility? So these are the four mindsets that I would like to stress upon: one is self-awareness, second is growth mindset, third is being able to prioritize, and fourth is to be innovative. These are the four key aspects I would call out to build our personal agility, which in turn makes us personally able to meet our goals. So yeah, that's all I had. All right, great. Thanks, Shakti. I love that: strike out the word agility and replace it with ability. I think that's great; I'll steal that. Thank you. All right. Thank you very much. We've run out of time.