XP Series Webinar

Mastering User-Centric Mindset: Unlocking Your Potential as a Tester

In this XP Webinar, you'll delve into mastering a user-centric mindset, unleashing your potential as a tester. Explore innovative strategies to elevate testing approaches, delivering exceptional user experiences that propel product excellence.

Watch Now

Nithin SS

Head of QA, Lodgify

Nithin is the founder of Synapse QA, a community space for testers and software quality advocates. He has nearly a decade of experience in the IT field, focusing on test automation delivery for web and mobile applications. Nithin actively runs test automation workshops, mentors and helps budding testers and other professionals share their knowledge, and works as Head of QA at Lodgify.

Kavya

Director of Product Marketing, LambdaTest

At LambdaTest, Kavya leads various aspects, including Product Marketing, DevRel Marketing, Partnerships, Field Marketing, and Branding. Previously, Kavya has worked with Internshala, where she managed PR & Media, Social Media, Content, and marketing initiatives. Passionate about startups and software technology, Kavya excels in creating and executing marketing strategies.

The full transcript

Kavya (Director of Product Marketing, LambdaTest) - Hi, everyone. Welcome to another exciting session of the LambdaTest XP Series. Through XP Series, we dive into a world of insights and innovation featuring renowned industry experts and business leaders in the testing and QA ecosystem.

I'm your host, Kavya, Director of Product Marketing at LambdaTest, and it's a pleasure to have you with us today. Get ready to unlock your potential as a tester by mastering the user-centric mindset. Before we start today's show, let me introduce you to our guest speaker, Nithin.

Nithin is the founder of Synapse QA, a vibrant community for testers and software quality advocates. With nearly a decade of experience in the IT field, Nithin brings a wealth of expertise to the table. Currently serving as the Head of QA at Lodgify, Nithin actively mentors budding testers and shares his knowledge through workshops and engagements.

In today's webinar, he'll dive deep into the world of user-centric testing and explore how testers can play a pivotal role in prioritizing user needs and delivering sustainable products. Nithin will also shed light on usability heuristics and their value in testing, share insights on the intersection of automatability, and discuss strategies to build a robust automation infrastructure that focuses on user-critical flows.

Before we get into the session itself, let's jump straight into the heart of today's show. Nithin, why don't you tell us a bit more about yourself and your journey, and then share your insights on today's discussion?

Nithin SS (Head of QA, Lodgify) - Thank you so much, Kavya. Thanks for the wonderful introduction. I'm really excited to be here, and it gives me immense pleasure knowing that you are also a Malayali from Kerala, so we share the same roots. Basically, I come from God's own country, Kerala. We call it God's own country, but it's not a country; it's a state in India.

But I'm currently based in Barcelona and working as Head of QA at Lodgify. It's been nearly 10 years in the testing industry. I started as a QA straight after my graduation and, basically, ended up in QA accidentally. From there, it was a wonderful journey where I found my passion in testing and also continued into leadership. So yeah, I'm glad to be here.

Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Nithin. I would definitely want to know more about how you accidentally became a QA before we jump onto the questions.

Nithin SS (Head of QA, Lodgify) - Sure thing. So after my graduation, my plan was to do civil service. I prepared for one year and I could clear the prelims, but the mains didn't turn out well. During that time, I badly needed a job. I was more into teaching, as teaching was my side hustle.

Then, when I started exploring job opportunities, what I figured out was that testing worked out for me. I had some development opportunities, but those didn't work out after a couple of interviews. In testing, though, I got my first offer.

So yeah, that's how it became testing. During graduation, testing was never taught; it wasn't a course or even a chapter I could explore, and I wasn't aware of it at all. When I went to the interview, there were many people who knew more about testing than I did. But somehow, I managed to get the job. So that was how I accidentally ended up here.

Kavya (Director of Product Marketing, LambdaTest) - No, that's a great story. Thank you so much for sharing that. I asked that question because a lot of budding folks who are looking to build a career in QA testing definitely have these questions in mind.

So yeah, thanks for sharing that personal snippet from your life. Now jumping straight into the conversation, what underrated user-centric concepts should testers focus on, and how do they improve testing?

Nithin SS (Head of QA, Lodgify) - Okay, so it's a common trap, right? We all have this belief in our mindset that user experience and user interface are the same thing, and we mix up that basic understanding. But those are two different concepts overall.

User experience is basically how users feel when they interact with the product, and user interface is the layout and structure of the application itself, right? So both have different aspects. When it comes to user experience, I would say the most underrated aspect is simplicity.

Testers tend to put more effort into ensuring the functionality and ignore how simple a certain feature or functionality is to use. So the most underrated aspect is simplicity, and that is where we need to put more focus: how easy the feature is for a user to use, and how intuitive it is for the user or customer, right?

How do they find it? Is it understandable for them? Are we giving them enough feedback so they know what is happening with the system? Those kinds of things. So testers need to focus more on ensuring functionality that works simply and delivers the right thing for the user.

Kavya (Director of Product Marketing, LambdaTest) - Interesting. So at the end of the day, what we have to focus on is how foundational concepts can have a huge impact on the effectiveness of testing, right? I mean, understanding the user's perspective.

Nithin SS (Head of QA, Lodgify) - Exactly. Exactly. Understanding the basics, yes, is the best.

Kavya (Director of Product Marketing, LambdaTest) - Yeah. Great, great. Just to dig deeper into this question, from your personal experience over the last 10 years or so, are there any examples that you might want to share with our audience today?

Any instances when you found that a simple concept could improve the testing while you were working on a project, or something along those lines?

Nithin SS (Head of QA, Lodgify) - Mm-hmm, yeah, okay, let me tell you one example first, and then I will jump to my personal project experience. So, I had a personal experience when I was trying to make a travel reservation.

I was booking a flight ticket, and when I was going through the payment, the payment was taking a long time and I was not getting any feedback, right?

So, in the end, what happened was my card was charged, and the payment went through, but I didn't get the proper feedback that the transaction was completed. So that happened with one of the airline websites, which has a lot of offers going on and coupons.

And on the other side, there is a fintech platform, which is kind of like a wallet where you make a transaction and send your money. After your money is deducted, you get feedback that your transaction is in progress, right?

They are actually taking care of things in the backend, but you get feedback that you need to wait a few minutes until you see a successful transaction status. If you compare these two instances: in the first one, my transaction went through, but I didn't get proper feedback.

In that case, as a user or a customer, I would drop off. I would never go back to that platform because my first experience was bad and I didn't get a proper status. On the other side, the platform gave me a status; it's a similar instance, because both involve payments, and the second one actually gave me feedback that I needed to wait.

So I knew I had to wait to get the right status. That really makes a difference when it comes to user experience, when we are ensuring quality and building products that deliver a better user experience.

And from my personal experience, in one of my previous projects, we had an instance where we got a lot of complaints about a form. You had to fill in your personal details first, and then the billing information, which was a replica of the information you had already filled in.

So you had two places where you had to fill in the same information. We started to get feedback from users: you are pushing me to fill in the same information twice. So we made a small suggestion: add a checkbox that lets you copy the whole thing over.

You click on it, and it uses the same information. That was one option. Based on the user feedback, we made the case from our quality perspective that this feature could help our users reuse the same data. After we added it, yes, we got good feedback from the users. And we had a similar thing with the checkout.

Our checkout process was taking a long time. You know there is a concept in Amazon like faster checkout: you enter your details once, they are saved in your profile, and you can reuse them anytime.

So these kinds of small things actually made a real difference in the last project I worked on. We got much better feedback from the users because it was a problem they wanted solved, and we jumped on it and solved it.

Kavya (Director of Product Marketing, LambdaTest) - Very interesting. So at the end of the day, testers have to notice how real users are interacting with the platform, with the software, with an app, for instance; that is what leads to effective testing. Very interesting.

Great, so moving on to the next question: how can testers integrate usability heuristics into their testing approach for better user-centricity?

Nithin SS (Head of QA, Lodgify) - Cool, that's a great one because, you know, testers have a lot of jobs, right? They need to ensure that everything works perfectly in functional terms, that everything is testable, and they also want to make sure that everything can be automated in the end to bring more efficiency.

So they have a lot to do, and user-centric approaches, usability, or user experience testing shouldn't be treated as a separate concept. It has to be embedded: when they do functionality testing, certain elements of the user-centric approach have to be embedded into it.

And that can only happen when they shape their testing approaches to be more empathetic towards users. When they define the scenarios to test, they should think of themselves as a user: what problem will our users have if something works this way, right?

They have to design the scenarios based on that, and it should always be backed by data. They should understand who the users are and what their personas are. What I advise my team to do is understand our user patterns and personas and have a persona list created.

This user will always use this functionality in this way. So they have a list of personas that can be reused across multiple features. Then they know the behavioral patterns based on the tools we have. We have real user monitoring and those kinds of tools, which give you the real interactions, how a user interacts with your product, and a heat map of the functionalities users actually use.

From that data, you can define better scenarios when it comes to user experience. And it's not an added effort, because you have data from different sources; you are just embedding it, or enabling it, to provide more data that enhances your testing approaches.

So you need to collaborate a lot with different stakeholders: with the infra team to get the metrics, and with the data team to collect the behavioral patterns, understand which functionalities are used and what the basic trends are, create a set of personas out of that data, and reuse it whenever new functionality or features are built. It won't be a repeated effort; it's a one-time investment, and those approaches can be reused.
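To make the persona-list idea concrete, here is a minimal sketch in TypeScript of how such a reusable persona catalogue might be represented. The field names and example data are hypothetical illustrations, not details from the webinar.

```typescript
// Hypothetical sketch of a reusable persona list, assuming personas are
// distilled from real user monitoring (RUM) and analytics data.
interface UserPersona {
  name: string;          // label used when referencing the persona in test design
  goals: string[];       // what this user is trying to achieve
  commonFlows: string[]; // paths this persona typically follows, from RUM heat maps
  devices: string[];     // where they usually interact with the product
}

// Example personas; the data here is illustrative only.
const personas: UserPersona[] = [
  {
    name: "first-time booker",
    goals: ["complete a reservation quickly", "understand payment status"],
    commonFlows: ["search -> select -> pay", "retry payment after a timeout"],
    devices: ["mobile web"],
  },
  {
    name: "returning merchant",
    goals: ["repeat a previous transaction with saved details"],
    commonFlows: ["login -> reuse saved details -> checkout"],
    devices: ["android app"],
  },
];

// Test designers can iterate over the persona list when deriving scenarios
// for a new feature, so the same personas are reused across features.
for (const p of personas) {
  console.log(`Derive scenarios for ${p.name}: ${p.commonFlows.join("; ")}`);
}
```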

Kavya (Director of Product Marketing, LambdaTest) - Very interesting that you shared this, because I definitely want to understand more. Say, for instance, a team hasn't used usability heuristics in the past; if they were to get started with this method, how do you think they should go about it?

Because, like myself, for instance, I work in marketing, right? So behavioral patterns are very key for us too. Even the point you mentioned about empathy, that testers should have empathy; that same concept applies to marketing folks as well.

That said, it's great to see that across all the different teams within an organization, we have common values or common data points that we are seeking at the end of the day. But let me reiterate my question.

Nithin SS (Head of QA, Lodgify) - Mm-hmm. Yeah.

Kavya (Director of Product Marketing, LambdaTest) - How should a team that hasn't used usability heuristics in the past go about it?

Nithin SS (Head of QA, Lodgify) - So, the first thing to do: in my past organization, when we experimented with onboarding our team to this kind of initiative, all these concepts were totally new, right? And we cannot go out and ask for customer interviews when the team is new.

So we have to train our team in their comfort zone, and their comfort zone is within the organization. We can create a working group with a set of different stakeholders and conduct a session similar to a bug bash. During that session, we can ask them how they interact with our product.

Keep a record of that session. We know that different stakeholders will have different use cases and will follow different user paths; record those instances, and that becomes the first dataset for understanding how different users use our product. As testers, we get a lot of insight from that first step. After that, it is about talking to different stakeholders, I would say.

We have the mindset that it's a collaborative team effort; user experience is not owned only by testers, it is always a whole-team approach. Testers can initiate the conversation, and then they have to act as the glue or a bridge between multiple stakeholders to get more details.

What we did last time was collect a set of user journeys from those sessions, and we started having conversations with multiple departments: are you monitoring these kinds of scenarios on your infra side? Are these kinds of transactions captured in your datasets, on the data team's side? Right.

We also started to share these scenarios with the customer-facing teams: is this the specific scenario users are reporting, like, okay, I'm facing an issue with this scenario, right? From customer-facing teams, we actually get more of the users' feelings, whether they enjoy using something or whether our features frustrate them.

Those kinds of internal insights make the team feel more confident and empowered with that data, so they become more comfortable going outside. The next step we took was to go outside, because the platform we had was a fintech application widely used in Southeast Asia.

We used it quite often ourselves when we were in Southeast Asia, so we knew how the transactions worked, the merchants who used the product, and everything. When we went for lunch, we used the platform, and while using it, we started to have conversations with the merchants.

How do you find using it? Are your customers okay with it, or are they having issues, those kinds of things. They started to share their genuine feedback; they knew we represented the company as well, but still. Once we had a set of inputs from them, we collected it and presented it to the product team.

With that, we had, I would say, a better market research analysis, if you want to call it that. We knew the pain points from internal customers, and we also had the feedback from external customers. That was the starting point. For any team: start from your internal comfort zone and then broaden your reach outside.

Once you are stable, the next step is to have the design and product teams do customer interviews. As testers, instead of ignoring those kinds of sessions, join one once in a while, and you get to know how your real users use the product. Then, when you are testing similar problems, you can be part of that conversation.

Kavya (Director of Product Marketing, LambdaTest) - Okay, thank you so much, Nithin. Thanks. I mean, great insights, for sure.

Nithin SS (Head of QA, Lodgify) - You know what the real feedback coming from the customers is, and you make use of that. These are all small steps, and in the end it will come to you automatically. You don't need to do anything special to ensure that the user experience is perfect; it will automatically come to you. Yeah.

Kavya (Director of Product Marketing, LambdaTest) - What I take away from this is how detecting usability issues early, from different stakeholders and from the customers at the end of the day, can save a lot of time and effort down the road. So yeah, thank you so much. That was very insightful.

Moving on to the next question, how can testers effectively assess and communicate UX issues in early development stages? I understand you must have already covered some bits of it, but I'd love to hear more.

Nithin SS (Head of QA, Lodgify) - Yeah, sure thing. So, we already have the usability heuristics, right? We can refer to those. With all the data points you have covered internally, based on the product specifications, you can create a checklist. This checklist can be reused across the product, and it can also be added as one of the acceptance criteria. Right?

This actually lets the team think from a user point of view during the inception stage as well. When they are refining something, they know these are the checklist items we need to tick off when it comes to user experience.

So having something readily available actually empowers the team to focus more on user experience. And they will consider that, okay, we need to optimize this now instead of optimizing it later. And they see the real value out of it.

So create a checklist based on the usability heuristics, keep it readily available, reuse it, and advocate more for usability and user experience, right? When you test the functionality, if you ignore these elements, then others will ignore them too.

You need to start advocating more. Initially, when the team is in the early stages, you need to over-communicate a bit. Because when you over-communicate and they hear the same phrase all the time, it becomes a familiar term for them.

"I want a better user experience. I want a better experience." It will be annoying at first, I know, but you have to keep iterating. That way it gets embedded in everyone's mind, and they automatically consider it when building those features in the early stages rather than waiting until the last moment. Right?

So over-communicate, and also make sure you have a reusable template added to your acceptance criteria for user experience and usability. Then everyone ensures it is met when the feature is available for testing, and when you are about to automate, you ensure these criteria are met as well.
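As an illustration of what such a reusable checklist could look like in practice, here is a minimal sketch in TypeScript. The heuristic names, questions, and structure are assumptions made for the example, not items taken from the webinar.

```typescript
// A hypothetical, reusable UX checklist derived from common usability
// heuristics. Each item can be copied into a ticket as acceptance criteria.
type ChecklistItem = {
  heuristic: string; // which usability heuristic the item comes from
  question: string;  // what the tester or acceptance criteria asks
};

const uxChecklist: ChecklistItem[] = [
  {
    heuristic: "Visibility of system status",
    question: "Does the user get feedback (spinner, progress, confirmation) while the action runs?",
  },
  {
    heuristic: "Recognition rather than recall",
    question: "Can previously entered data (e.g., billing details) be reused instead of re-typed?",
  },
  {
    heuristic: "Error prevention",
    question: "Are destructive or duplicate actions guarded with confirmations or disabled controls?",
  },
  {
    heuristic: "Simplicity",
    question: "Can the primary flow be completed without reading instructions?",
  },
];

// Render the checklist as numbered acceptance criteria for a user story.
const asAcceptanceCriteria = uxChecklist
  .map((item, i) => `${i + 1}. [${item.heuristic}] ${item.question}`)
  .join("\n");

console.log(asAcceptanceCriteria);
```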

Kavya (Director of Product Marketing, LambdaTest) - Great. As you were speaking, I was making a mental note of all the different pointers you mentioned. One is, of course, creating more structure: testers, at the end of the day, have to create more structure through the templates you suggested. Then advocating for user experience, and effective communication, in fact over-communication, right?

And then bridging the gap between different stakeholders so that you're able to create the smoothest possible cycle, I suppose. Great. That is what you had to share for early development stages, but for organizations at the enterprise level, teams that are global and operating at scale, do you have any feedback or tips?

Nithin SS (Head of QA, Lodgify) - Exactly, exactly. For them, at a more advanced level, I would say what we can do is create an experience strategy. An experience strategy is something really specific to the product, but we can leverage its potential to know what the end goals of our users are and what they try to achieve with a specific feature.

At the enterprise level, they have a large user base and a wider spectrum of use cases, right? So you create an experience strategy based on those portfolios. With that end goal in mind, when a feature is being developed, you can advocate for practices that work for the wider segment, not only for a limited one.

Having an experience strategy would be the ideal option at the enterprise level, or for a big MNC sort of company, because you know your wider audience and you create something for them with the end goal in mind; you keep your customer's end goal in focus rather than starting from scratch each time.

Kavya (Director of Product Marketing, LambdaTest) - Great, thank you so much for those insights. Moving on to the next question of the day, can you discuss the impact of user-centric testing on overall product quality and customer satisfaction and share any success stories that you might have seen?

Nithin SS (Head of QA, Lodgify) - Yeah, sure thing. So basically, I would like to mention that most of the time, we feel like actual behavior is the same as perceived behavior, right? And it's a common trap. So when we are interacting with a feature or a product, we feel like the user will also be interacting in the same way. And it's not the case.

One of the examples I mention in my talks is booking a cab in Uber, say, when we are about to go somewhere. When it comes to testing the functionality, the functionality is booking a cab in Uber; you just make sure you are able to book the cab, and that's it.

But in a real user scenario, if I'm booking a cab, I open the Uber app and check, and if it's showing a crazy amount, I go to another platform and check there. I compare the two, take the best option, and I might not go with Uber at all. Right?

So there are multiple scenarios happening in a user's real-life use case, but we don't cover them. If you consider the real scenario with the user: the user opens the Uber app and tries to make a search, then switches context away from your app view and opens another app on top of yours.

There is a context switch to a different application, and your app is running in the background. How the functionality behaves at that point is not included in the functional spec, but it is a user experience aspect you need to take into consideration.

Handling these kinds of situations, and making sure the actual scenario and the perceived one become more aligned, really brings more value. You need to know how your users really interact with the application and what potential interruptions could happen when they are making a specific transaction.

On our side, a set of customer interviews really helped us understand these kinds of patterns. For me too, initially it was purely about functionality. But I started to get insight into what my users expect by attending customer interviews, because I began to see how they interact and what they expect from certain things.

We had a filter, and when we tested the functionality, we focused only on the filter itself: does it filter, and are you getting the right filter output? But as a user, my expectation is: based on my filter criteria, am I getting narrowed-down results that match what I expect?

That is user experience. It's a different layer compared to functionality. And based on the experience I gained from customer interviews, we were able to advocate more for these kinds of approaches.

Like the one I mentioned earlier: having a checkbox to reuse the content, and having a fast-pay type of option, ensures that users can complete a transaction faster.

During COVID, we had a situation where everything was completely shut down. Ours was a fintech application where you need to go to a physical store to make a transaction. But during COVID, no one was going out, and no one was able to make a transaction physically.

So we had to think of different approaches to make the product sustainable. And we saw an option: we could start delivering things to people's homes using our platform.

But we only came to know that after listening to our customers' pain points; they were wondering what the next feature could be, because they still needed to use the platform but had no way to do so.

When we started to collect those details, we saw the potential to implement a feature. It could even be a simple Google form, and that's what we created: a form they could fill out, and we would bring them the things they ordered online.

It was a small feature built on customer feedback and the customer interviews we did, and it actually brought a lot of revenue and opportunities during COVID, when every business was shutting down.

So it could be small changes and optimizations. Even a small checkbox can make a huge difference when it comes to delighting your users.

Kavya (Director of Product Marketing, LambdaTest) - Those are some very interesting stories to hear. Nithin, thank you so much for sharing that. What it makes me understand is the tangible impact user-centric testing has on both product quality and customer satisfaction, especially when you are getting those inputs directly from the customers and users, right?

That must be super valuable at the end of the day. And yeah, incorporating user feedback truly seems like a game changer in that regard.

Nithin SS (Head of QA, Lodgify) - And I guess marketing and testing share a lot of commonalities. But considering the user's point of view and empathy, as you said earlier, isn't taught when the focus is only on functionality. People see testing as being only about the functionality aspect, but it's not.

Kavya (Director of Product Marketing, LambdaTest) - Thank you. Yeah, interesting to hear that as well. You mentioned looking at even potential interruptions during a transaction, right? A tester has to be fully aware of different scenarios, not just, as you said, the functionality of it.

But also looking at the obstacles and interruptions that come in when a transaction is happening, for that matter. Yeah, very interesting again; it's been a learning experience so far. Moving on to the next question: how can testers integrate user feedback into testing processes for iterative improvement?

Nithin SS (Head of QA, Lodgify) - What I would say is: collaborate more with all the stakeholders, and let testers lead the conversation, right? They can lead it because they act as the glue between stakeholders and customers.

Because they are the ones ensuring the product is built with the right amount of empathy, that we are building the right things, things that bring value to people.

When it comes to quality, I would define it as ensuring that we are solving the right problems for the customers or the users. That is how quality has to be defined. It is not the number of bugs or a zero-bug backlog, those kinds of things. It has to be something valuable that we deliver to the users.

So what I recommend is to collaborate more with stakeholders and be an advocate who actively ensures these aspects are considered whenever the team is building or creating something. Also, in many organizations these kinds of concepts are pretty new.

And most of the time it is a topic owned by the product teams. So you have to introduce this concept with a value-driven angle. When you have a smaller module to test, I would say incorporate these small changes at that feature development stage.

And with that smaller iteration, you share the updates. Okay, if we did only functional testing, this could have been the outcome. Since I incorporated this user-centric approach, now we get more insight from a user point of view. So they need to see what outcome it brings, right?

So start with a smaller feature and share the comparison: if you do functionality only, versus if you consider user-centricity along with functionality, what value does it bring? These kinds of differentiators actually help testers or QA advocate more for these practices.

What started with a smaller feature can be broadened to bigger features later. So start small, collaborate with stakeholders, communicate, communicate, and make sure you are open to receiving feedback and data from different stakeholders, because when you start something like this, you will initially be bombarded with a lot of information.

You have to streamline your knowledge sources and pick the ones that are really relevant for you. If you ask the data team for a set of data, they will give you tons of data, but you should know how to pick the pieces you really need.

Don't go straight into "I need to implement a user-centric approach." It doesn't work like that. You need a proper plan for the steps you should take, really start small, maybe with a small step on a small feature, then let the team know about the value and iterate from there. It's the same as creating an MVP and then refining it.

Kavya (Director of Product Marketing, LambdaTest) - Right. No, great, thanks for adding the marketing references. But yes, thank you so much, Nithin. Incorporating user feedback as well as aligning with various teams, especially the product team, for instance. What I feel is that user-centricity also seems to be closely implemented within your organization, within your team.

Testers have to also work closely with the product team, for instance, in order to align on needs, expectations, and so on. Very interesting. What also seems interesting to me is the value-driven factor you've mentioned a couple of times. How exactly, again, would it be implemented for teams that haven't done user-centric testing in the past? Where should they start in order to cultivate this? Because it seems like a team culture aspect.

Nithin SS (Head of QA, Lodgify) - Yeah. One example I can point to is from our current organization. We have scenarios defined that have to be automated, right? When we automated them, it was initially fully focused on the functionality aspect, but we were not covering certain use cases that our users mainly follow.

And we had a lot of customer-reported bugs. What we did in a later iteration: we have a real user monitoring system, and from there we know the common use cases users follow. We started to compare those scenarios with the ones we had defined for automation.

When we compared them, the set of steps we had defined to be validated was different from how users really interact. So where we put more effort was automating the scenarios the way the user actually performs the transaction.

We automated those flows, and we started to see a drastic drop in the bugs reported. That is the value, right? You focus on something that really comes from the user's point of view, because you observed the pattern, you fixed something, and you saw the bug count drop drastically.

That is the outcome and value you can bring to your team: because I pushed my testing toward capturing data from the user, our bug count actually went down. So the team knows that if we focus on this, our bugs will be reduced further, and we will get more benefits by incorporating these kinds of factors.
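A small sketch of this comparison, in TypeScript, is below: take the top flows observed in real user monitoring and diff them against the flows already automated to surface user-critical gaps. The flow names, data, and set-based matching are illustrative assumptions, not details from the webinar.

```typescript
// Hypothetical sketch: compare the flows real users actually follow
// (e.g., exported from a real user monitoring tool) against the flows
// already covered by automation, to find user-critical gaps.

const topUserFlows: string[] = [
  "search -> compare price in another app -> return -> book",
  "fill form -> copy personal details to billing -> submit",
  "checkout with saved payment details",
];

const automatedFlows = new Set<string>([
  "search -> book",
  "fill form -> submit",
]);

// Flows users follow that automation does not yet cover become the
// prioritized backlog for new user-centric automated scenarios.
const uncovered = topUserFlows.filter((flow) => !automatedFlows.has(flow));

console.log("User-critical flows missing from automation:");
uncovered.forEach((flow) => console.log(` - ${flow}`));
```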

So it's just like a small shift in the mindset, but you have to initiate that shift. No one else will do that. So I would advise all the testers to initiate that spark, and they should capture the real value by making a small change. And then that small change should be the beginning of more changes.

Kavya (Director of Product Marketing, LambdaTest) - Yeah, very interesting. That kind of mindset shift is exactly what I was thinking about, because you have to truly become a champion to cultivate a culture of quality. And yeah, I think it begins within the team before you scale it outward, I suppose. Thank you so much. Again, really interesting points.

Nithin SS (Head of QA, Lodgify) - Yeah.

Kavya (Director of Product Marketing, LambdaTest) - Moving on to the next bit, how do you balance the need for comprehensive test coverage with the focus on user-critical flows?

Nithin SS (Head of QA, Lodgify) - Going back to the previous response I mentioned, don't consider functionality and user experience as two different items or two different components. Both are the same.

If you are testing a functionality, you should make sure it has good UX: it is easy to use, it is compatible, it is simple and intuitive for the users, and it gives the right amount of feedback when they are performing something. Right?

Those are not two different things; preparing your mind not to treat functionality and user experience as two separate elements will make a huge change. Defining all those functional scenarios with this concept in mind will give you comprehensive test coverage.

It's like the situation I mentioned: we were too focused on automating scenarios defined by us, without considering what is valuable for the users, and that was why bugs kept increasing. Then we started to observe the patterns from users, and we slowly reduced the number of bugs in the system.

So you need to consider those kinds of inputs when defining test coverage, designing your tests in a way that puts more focus on what your users actually do, and adding those scenarios.

Say, for example, you are writing or deciding on scenarios to test your search feature. You can have two functional scenarios: is it searchable or not, and is it giving you the right result or not?

That is your functionality; you test the functionality, right? Then think about the next elements: user experience. You are searching for something, and within a certain time your user expects a result from the search. If it takes too long, that is a bad user experience.

And also, when you are searching, are you giving feedback back to the users? Maybe one loading icon will make a huge difference because the user knows the system is doing something. Otherwise, they will be thinking nothing is happening.

So when you define functional test cases, start thinking about what extra items you should add from the heuristics that are already available to you as a checklist. You can derive more test scenarios from that and test for them. That gives you comprehensive test coverage.
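To show how a functional check and these UX checks can sit in the same automated test, here is a minimal sketch using Playwright in TypeScript. The URL, test IDs, roles, and the 3-second budget are assumptions made for the example, not details from the webinar.

```typescript
// A hypothetical search test that pairs a functional assertion with
// UX-oriented checks: a visible loading indicator and a latency budget.
import { test, expect } from "@playwright/test";

test("search returns relevant results with visible feedback and acceptable latency", async ({ page }) => {
  await page.goto("https://example.com/catalog"); // hypothetical page under test

  const searchBox = page.getByRole("searchbox");
  await searchBox.fill("barcelona apartment");

  const started = Date.now();
  await searchBox.press("Enter");

  // UX check: the user should see that the system is doing something.
  await expect(page.getByTestId("loading-indicator")).toBeVisible();

  // Functional check: the right results appear for the query.
  const results = page.getByTestId("search-result");
  await expect(results.first()).toContainText("Barcelona");

  // UX check: results should arrive within the assumed 3-second budget.
  expect(Date.now() - started).toBeLessThan(3000);
});
```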

Kavya (Director of Product Marketing, LambdaTest) - Got it. And of course, I think testers must also identify which user flows to prioritize while maintaining test coverage. What stands out for me is finding that balance, the right equilibrium, between coverage and user-critical flows. It's a very strategic approach that testers have to initiate.

Nithin SS (Head of QA, Lodgify) - Yeah, and as you said, I totally agree. It is more about making that balance, right? You will have hundreds of scenarios to be tested, but you cannot test hundreds of scenarios. It is more about prioritizing what the real need is and also making sure that at least the critical ones are being tested and that it is perfectly working fine.

Kavya (Director of Product Marketing, LambdaTest) - Right. Thank you so much. Really appreciate your insights once again. So we are about to wrap up today's session, and we have one final question to answer. Can you please provide insights into how testers can continuously advocate for a user-centric mindset within their organizations?

Nithin SS (Head of QA, Lodgify) - I would say, be the annoying person at first, the one other people will thank later. At first, you're going to be annoying for sure. You need to ask as many questions as possible, and you need to initiate as much communication as possible.

So for that, consider yourself as a kid. What I tell my teams is you are a kid when it comes to starting to make a smaller change, right? So you are observing something new and you need to ask many, many questions, ignoring all your assumptions.

And you also need to consider the different mental models our customers will have when you ask those questions, right? When you ask more questions, yes, people will find it annoying at first. So be the annoying person at least for some time, maybe for one to two months. That is the pattern I saw from my experience.

So one to two months, you're going to be the annoying person on the table, right? And then, after that, once they start to see the value, it becomes a normal process. You don't need to say or mention anything about user experience or a specific case.

It is something that is understood and has already been taken care of, because it has become a practice, right? So initially, for the mindset shift, you need to put in some effort from your side, and don't get demotivated, because you will face a lot of harsh comments and pushback and everything.

Make sure you get buy-in through small iterations, and make sure the team understands this is a whole-team approach and not only your approach, not only something you alone want to change.

Start collecting more data as you make small iterations and present that data to the team and to a wider audience. Then they feel more confident believing in these ideas, and after that, it's easier to bring in those changes moving forward.

Kavya (Director of Product Marketing, LambdaTest) - Thank you so much, Nithin, that really helps. What I am taking away from it is that testers have to keep users at the heart of the decision-making process, collaborate with stakeholders in the process, and, as you said, be persistent and build a consistent culture within the team. Great insights. Thank you so much, Nithin, once again.

Nithin SS (Head of QA, Lodgify) - Thanks to you, Kavya, for the nice questions.

Kavya (Director of Product Marketing, LambdaTest) - So as we wrap up today's session, I would like to thank you, Nithin, for sharing valuable insights and expertise on mastering the user-centric mindset in testing. You have broken it down into as easily consumable a format as possible. I'm sure our audience has gained a wealth of knowledge today.

To our viewers, thank you for joining us in this episode of the XP Series. Stay tuned for more enlightening discussions, and subscribe to the LambdaTest YouTube channel for more XP episodes. Thank you once again for joining us today. Nithin, it has been a pleasure hosting you.

Nithin SS (Head of QA, Lodgify) - Thank you so much, Kavya, and thanks to the entire team. It has been a pleasure to be here with you, and I really enjoyed the conversation, and thanks again.

Kavya (Director of Product Marketing, LambdaTest) - Great, this is Kavya signing off. Until next time, happy testing. Have a great day ahead.

Nithin SS (Head of QA, Lodgify) - Bye-bye.

Past Talks

Future Trends and Innovations in Gen AI for Quality Engineering

In this XP webinar, you'll explore future trends and innovations in Gen AI for Quality Engineering, discovering how advanced technologies are reshaping QA practices. Gain valuable insights to optimize your approach and stay at the forefront of software quality assurance.

Watch Now ...
Flaky Tests from an Engineering Perspective

In this XP Webinar, you'll learn how to mitigate test unpredictability, optimizing development workflows, & enhancing overall product quality for smoother releases and better user experiences.

Watch Now ...
Testing Tomorrow: Unravelling the AI in QA Beyond Automation

In this webinar, you'll discover the future of QA beyond automation by exploring the realm of AI in testing. Unravel the potential of AI to revolutionize QA practices beyond conventional automation.

Watch Now ...