Saturday, July 8, 2017

Problem Not From the SDK; But I Said That's the Problem. I Was Wrong!

Today's software is not like the software and hardware I saw between 2006 and 2010.  The software we build today integrates third-party vendors' SDKs.  Also, the applications we build today are not desktop-only applications like in the 2000s; they are desktop and mobile applications which connect everywhere else and serve the user.

Recently, I was testing a product in its initial MVP state. It integrates four SDKs from different software vendors.  Among these, one SDK showed unusual behaviour, and the engineering team I was part of raised a support request with that SDK's vendor.  The behaviour I observed in the product was uncertain at first, and over a period of a week it settled into a stable pattern.  When I asked, I was told it was due to the SDK.

Here is my first miss: I took those words at face value, as I had got used to this behaviour for two weeks with no technical solution in sight. I did not question it enough or debug around it, as the time I had for testing was also crunched. Another tester and I owned the testing for the release. Stretching through late nights and weekends drove me to accept it when told it was a technical limitation and nothing could be done.  Here is my second miss -- I accepted that the behaviour was due to the SDK because it was consistent with, and synched very well with, our previous experience.

The story turned out differently in production.  One of the users reported the same behaviour, and I confidently replied that it was a known behaviour.  My programmer friends had the same opinion.  These programmer friends are highly skilled and know their job very well.  But we were all fooled by the software, as the pressure kept building upon us. Our earlier experience of talking with that SDK's team contributed to it too.

I took up that behaviour for investigation again today after hearing from production and from a techie friend who said, "it should not happen for so long".  Yes, it should not happen for so long; I too expect it to be gone after the synch is done.

I started questioning myself and doubting what I had learned.  I isolated all the environments for my observations and started watching my actions, the requests, the responses, the payloads, the race conditions, the served and unserved data streams, the logs, and eventually the breakpoints and values in code.

I was wrong! Yes, I was wrong in what I said confidently to every stakeholder, right from the CEO and CTO to all other business stakeholders. The engineering team too held the opinion I had.  It was time to go back and tell them, "I was wrong in what I communicated about the behaviour of the client software and the rationale behind it".

I explained why it happens, when it will happen, when it will not happen, the impact of the behaviour, the workaround for it in production, whether it depends on anything else, and why we technically missed it on our end.  It was not a problem in the third party's SDK. It was a UI element that got triggered each time the activity got triggered.

Though I have much more to investigate for other problems in using that SDK, this behaviour is not due to that SDK. The SDK worked well in this case.

What do I want to say here?

  • No matter how confident we are in the code and in the tests, we can be fooled, and we are fooled each time
  • There is no harm in doubting ourselves a second time about what we have listened to, what we have ignored, what we have learned, and what we have not carried through technically while we observe the behaviour consistently in the tests
  • Being non-technical in our observations and work helps a lot
  • Knowing the benefits of not thinking technically each time
  • Giving ourselves time to sit back and investigate the behaviour which everyone claims is due to "that"
  • If it is due to that, you will learn about it so you can test it better
  • If it is not due to that, then you will learn it is not so, figure out the actual cause, and help the team in fixing it
In this case I see no impact to the user in terms of data or product performance, but the experience is definitely annoying.

So I learn again that I should build the tests technically -- tests which put the product through its paces for the experience of each feature, each UI, each user action on a UI, and each interaction with the backend and its outcome on the client, and then evaluate the outcome of the same.

How will I do this? I can do this sitting in person and also with the help of automation.  But the point is, at what scale do I have to sample with my tests to experience it and investigate further? This takes time to learn, and it won't happen in the crunch time of releases.  Having said this, it is not bad to go and talk with stakeholders and buy time by quoting the priority and impact of the behaviour. We testers will have to take the initiative and assist the stakeholders in learning when there is a need for it.

Wednesday, July 5, 2017

PID Cat - Reading the logs and investigating the Android apps under test

Reading the logs and using them to design better tests is one of the key skills for a tester, in my opinion.  I insist and encourage testers to practice it, and I assist them in learning the logs and making use of them.  If you are not sure whether the product you are testing has a log or not, it is time to find out.

It can be server logs, proxy logs, client logs, hardware logs, deployment logs, etc.  The log contents and types of logs will vary from product to product.  What goes into the log will also vary, and it depends on what log level is set to print into the log file.

It happens that logging will be turned off or set to minimal, with the reasoning that it consumes disk space and IO on the box where it exists. On the other hand, DevOps and programmers usually monitor the logs in production.  The logs are one of the most useful utilities to know what's happening in production.  While they use them in the production environment, can't we testers use the same in the staging and pre-production environments? Won't that help us to test better? Yes, it will help!

Understanding how the logging feature is written for the product is one of the key tasks usually missed by a practicing tester. Learning how to use the logs will influence the way we interpret our observations and design further tests.

In this post, I will share one tool which helps in the context of testing Android mobile apps. For logs on Android devices, you may say "logcat"; yes, it helps to print the logs at different logging levels. But what if I want to pick the log of one particular Android app out of logcat's debug stream? That is not an easy task with logcat's output stream.

I have been using PID Cat for years now and I find it very useful. It is written by @JakeWharton in Python.  It prints the logs in colored text, and we have the choice to change the colors and choose what we want on our box in the program.  On top of that, I can fetch the log for just a particular Android application.  Also, we can contribute further to this tool.  The readability of the log when using PID Cat is much better.  Read about the logcat color script from Jeff Sharkey, it is here.

How to get PID Cat?

On Mac OS X, PID Cat can be installed via brew.  If brew is not configured on the Mac OS, configure it first. On successful configuration of brew, pidcat can be installed using the below command.
brew install pidcat

On Ubuntu, PID Cat can be installed by using the below commands in the terminal
sudo apt-get update
sudo apt-get install pidcat

On Windows, install Python and configure the path. I have tried it with Python 2.7 and 3.x during my tests, and it appears to work for me in my test environment.

Below are the steps to use PID Cat on a Windows machine.

  1. Install Python and configure the path
  2. Make sure the path of Python is set
  3. Make sure ADB is installed and the path is configured for it
  4. PID Cat is a Python program and it is available in the above mentioned Git repository
  5. Get this Python program file and save it locally on the drive
  6. Go to the drive where the file is saved and enter the command in the terminal -- python
  7. This should start printing the logs; on Windows 10 OS, I see colored log text

If using Cygwin on Windows OS, install the Python package and make sure it is installed. Now go to the root and access the directory 'cygdrive'.  From there, choose the physical partition drive of the Windows OS where the file is stored. Then use the below command to fetch the log of a specific app.

Note that, for all of the above, ADB has to be configured and set in the path. If not, it will create trouble. The general log of the Android device can also be collected via PID Cat, by using the command -- python .
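
To illustrate the idea behind PID Cat -- map the app's package name to its process ID, then keep only that process's lines from the logcat stream -- here is a minimal Python sketch. The package name, PIDs and log lines below are made up for illustration; the real tool reads `adb` output instead of these sample strings.

```python
# Sketch of pidcat's core idea: find the PID(s) of a package,
# then filter logcat "brief" format lines, e.g. "D/Tag( 1234): message".

def pids_for_package(ps_output, package):
    """Collect PIDs whose process name matches the package (ps-style output)."""
    pids = set()
    for line in ps_output.splitlines():
        fields = line.split()
        if fields and fields[-1] == package:
            pids.add(fields[1])  # second column holds the PID
    return pids

def filter_logcat(logcat_lines, pids):
    """Keep only the lines whose PID (inside the parentheses) matches."""
    kept = []
    for line in logcat_lines:
        start = line.find("(")
        end = line.find(")", start)
        if start != -1 and end != -1 and line[start + 1:end].strip() in pids:
            kept.append(line)
    return kept

# Made-up sample data standing in for `adb shell ps` and `adb logcat` output.
ps_output = """USER 1234 567 12:00 com.example.app
USER 4321 567 12:00 com.android.systemui"""
log = ["D/Net( 1234): request sent", "I/UI( 4321): drawn", "E/Net( 1234): timeout"]
print(filter_logcat(log, pids_for_package(ps_output, "com.example.app")))
```

PID Cat does this (plus coloring and re-resolving PIDs when the app restarts) against the live adb stream.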

Android device logs in colored text in Windows OS via Cygwin

Android device logs in colored text in Windows OS terminal

Now, having set up PID Cat and with it up and running, let us test and debug the apps much better.

Saturday, April 1, 2017

Web Client Performance: Webpage's Response Time and HTTP Requests -- Part 1

I'm practicing Web Client Performance Testing (WCPT) along with my colleague Chidambara. While we practice together, we are learning how to assess the current capability of a web page and then how to interpret it.  Why do I say this? Because when I test, I will write a report; that test report has to help someone decide, and assist in taking the necessary action if it has to be taken. If not, the purpose of the test report is merely to record what I tested and what my interpretation is; for that, I need not write a test report at all.  The same goes for automation, isn't it? We get a report on running the test suite, and this report is used further to decide what to do based on our interpretation of it. Coming back to WCPT, I happened to learn it this way and I'm sharing my learning with my fellow testers.

I picked the Google India search page and monitored its requests and responses. At the time of testing, I used Firefox in private mode. I noticed a total of 16 requests were fired and close to 1.4 MB of data was transferred. Overall it took 4.62 seconds on a 100 Mbps internet line. The below picture shows the requests fired and their timeline.

HTTP requests fired from Google Search India page

I notice the first two redirections taking a total of 294 ms, and then 484 ms to load the HTML document. In total, 778 ms is used to load the HTML document. That is close to a second; if I round it off, say 1 second to load the HTML document. Without the HTML document the web page will not get loaded, and for this reason the HTML document will be the first content to get downloaded on the client, i.e. on the browser.  Notice that there are two redirects with HTTP status code 302.
In a later blog post I will walk through how redirects can potentially affect web client performance. Chidambara and I practiced the same a few days back.

Moving ahead, I see the other 3 seconds are used to download the images, stylesheets (CSS) and scripts. This tells me that more time is taken to download these elements of the page than the HTML document. When I looked into other websites' pages, I observed the same at the time of testing.  With that, I learn that these elements are the time consumers if not handled well, thereby causing the slow response time. The slower the response time, the more time it takes to load a functional webpage.
Now you see, while carrying out a functional test you also test partially for performance, and vice versa.
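
To make the split concrete, here is a small sketch totaling up the phases observed above. The phase grouping is mine, and 3842 ms is simply the remainder of the measured 4.62 seconds after the redirects and the HTML document:

```python
# Breakdown of the observed page load timeline (milliseconds),
# taken from the measurement described above.
phases = {
    "redirects (2 x 302)": 294,
    "html document": 484,
    "images, css, scripts": 3842,  # remainder of the observed 4.62 s
}

total_ms = sum(phases.values())
for name, ms in phases.items():
    print(f"{name}: {ms} ms ({ms / total_ms:.0%} of {total_ms} ms)")
```

Run this and the page elements come out at roughly four-fifths of the total, which is the share the next paragraphs reason about.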

The 'capability', i.e. the performance attribute of the webpage, will be -- "how quickly it loaded on the client, i.e. what is its response time".

Note that the expectation is that it should be functional and the experience of using it should be pleasant to the user; these are the implicit expectations behind the words 'capability' and 'response time'.

In the above picture, the total time for receiving the HTML document is 778 ms. But the DOMContentLoaded event is fired after 1 second, somewhere near the 1.2 second mark. Where did the other ~440 ms go, then? On my network bandwidth of 100 Mbps and a machine configured with 8 GB RAM, this is still a question to me. Then what will it be on other network bandwidths and client machine configurations? Why did DOM parsing take time?
I see there is JS being received, i.e. in the 8th request. I'm not sure if it is synchronous or asynchronous. Technically, where there are no dependencies between the scripts or any other page elements, keeping them asynchronous helps.
Then there is a CSS file being received in the second-to-last request. If the CSS isn't received well before, the page loading has to wait for it, as the styles cannot be applied to the parsed DOM of the loaded HTML document.  It is a rule of thumb to keep CSS among the first requests received from the server, though the context of the website's design can change this rule of thumb.
If this gets tuned, can the response time of the Google India search page come close to 3.3 to 3.5 seconds?

If this is just for one webpage of a website, how many webpages does the product I'm testing have? Now I understand much better than ever that, just as functionality is coded into the product, performance too should be designed and coded with equal importance.  I derive my tests with such a thought process, as it serves as the base for building the tests and for sampling the variation and interpreting the test observations.

What did I learn here?

These are my key learnings from interpreting the web client's performance:
  • The more the HTTP requests, the higher the response time of the webpage
    • The fewer the HTTP requests, the better, from the point of view of the webpage's response time and performance
  • The HTML document takes a minimal share of the total load time of the web page, and the rest of the time is taken by the webpage's elements
    • Handling and optimizing this is important for better performance of the web client
  • The more the page weight, the more the response time of the webpage
    • Need to test and figure out whether there is any way the page weight can be optimized, so the response time of the webpage improves
  • If, say, 15% of the time is for the HTML document of the webpage and the other 85% is for the webpage's elements, then the problem area with respect to response time is evident
    • If the client is fine-tuned for better performance, the experience for the targeted user should still be pleasing, with a better response time
    • On top of this, if server performance is optimized, will the targeted user say, "it is just like that, so fast"? I do not know; but as a practicing tester I see that fine-tuning the performance of the web client is possible for a tester while she or he tests the website for functionality or any other quality criteria. In parallel, if server performance is taken care of based on contextual learnings from the test observations, it will be very useful.
  • Know this when looking at a webpage
    • Number of HTTP requests fired;
    • time taken by each request;
    • what each request carries from the client, and back from the server to the client;
    • size of each HTTP request transaction;
    • page weight and the contribution of each element in the webpage;
    • know when the DOMContentLoaded event got fired in the timeline;
      • in the above picture it is marked with a blue line in the timeline
      • it is fired when the HTML document is completely parsed and loaded, without waiting for the other elements of the webpage, i.e. CSS, JS, images etc.
    • know when the Load event got fired in the timeline;
      • in the above picture it is marked with a red line in the timeline
      • it is fired when all the page elements have finished loading
    • overall response time of the webpage;
    • understand what each page element is and why it is needed;
      • it is very important from the testing aspect to know whether it is actually needed for the webpage or website
      • if it is not needed, then as a tester I should be in a position to advocate my observations from the tests
      • if it is needed, then as a tester I should be in a position to advocate my observations from the tests and how it can be tuned and optimized
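
Much of the checklist above can be read off a HAR export from the browser's network panel. Here is a minimal sketch, assuming the standard HAR layout; the sample entries below are made up:

```python
def summarize_har(har):
    """Summarize request count, bytes transferred, and summed request time from a HAR dict."""
    entries = har["log"]["entries"]
    return {
        "requests": len(entries),
        "bytes": sum(e["response"]["bodySize"] for e in entries),
        "total_ms": sum(e["time"] for e in entries),
    }

# Made-up sample in the HAR layout; browsers export this from the network panel.
har = {"log": {"entries": [
    {"time": 294, "response": {"bodySize": 512}},      # redirect
    {"time": 484, "response": {"bodySize": 14000}},    # html document
    {"time": 1200, "response": {"bodySize": 350000}},  # images, css, scripts
]}}
print(summarize_har(har))
```

Note that summing per-request times overstates the wall-clock load time when requests run in parallel; the DOMContentLoaded and Load marks still have to come from the browser's timeline.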

It is important to know what HTTP is and how the web communicates, so that we understand the influence of bandwidth and latency on the response time. In the next post, I will share what I have understood about the same. In subsequent posts, I will share how Chidambara and I progressed in learning and practicing this.

Sunday, March 26, 2017

My Learning from Agile Testing Alliance's 12th Bengaluru Meetup

I attended Agile Testing Alliance's 12th Bengaluru Meetup hosted at Moolya Software Testing Pvt. Ltd.'s office on 25th March 2017. I got to know about this meetup from a Facebook share by Moolya and made up my mind to be there. The audience in the meetup were software testers, Agile trainers and coaches, and technical leads.

The below listed presentations came in between 9:40 am IST and 12:50 pm IST
  • Welcome and introduction to Agile Testing Alliance, by Nishi Grover Garg
  • Challenges of Agile for a Manager, by Preeth Pandalay, Techno Agilist, Agile Coach & Trainer
  • Behavior Driven Development: What, Why & How - from a tester's perspective, by Vinay Krishna, Agile Technology Coach
  • Problem Solving Techniques: An attempt to apply ideas across disciplines, by Ajay Balamurugadas, Tyto Software
  • Creating 100 mindmaps in 1 minute, a demo by, Dharamalingam K, Moolya Software Testing Pvt. Ltd
  • Concluding the meetup and vote of thanks, by Nishi Grover Garg

Brief lines on presentations from my notes

I'm listing a few points here out of my notes. They were engaging sessions, and I had to make sure that I listened, made notes, and tweeted to anyone who was curious to know what was happening in the meetup.

Nishi Grover Garg : Introduction to ATA and welcome note
  • She introduced herself and shared about the Agile Testing Alliance and what it does
  • Spoke about the recent testing conference that was held, i.e. GTR Pune 2017
  • Shared about the different certifications and the assistance from ATA

Preeth Pandalay : Challenges of Agile for a Manager
  • Started with management structure, management hierarchy and bureaucracy
  • Spoke about management in 21st century and in technology era
  • Shared his views on traditional structure of management and Agile management structure
  • From there, he spoke about traditional manager and Agile manager
  • Mentioned about the network based management and said it as organization structure 2.0
  • Then he said about Agile Team Cross Functional and mentioned Katrina Clokie's blog which speaks about this
  • With that he said, Agile team is self organized
  • He shared statistics of Agile helping to solve and deliver better
  • With the statistics walk-through, he said, Agile works
  • The audience had a question around -- "Management is yet to adapt to Agile while the teams are on Agile. How do we solve this so that the team gets much more support?"

Vinay Krishna : BDD -- What, Why, How - a tester's perspective
  • Started by asking, do you know what BDD is, and then said it is another buzzword and jargon
  • Says, "in this era we all are programmers and need to write code; testing is a specialization now."
  • He spoke about the surprises that software development brings and highlighted the "assumptions" people make in a team
  • Started a group activity asking the audience to draw a star having 12 points, and later asked, "why did you not ask questions, but assumed?"
  • Says, bug + feature = beature
    • misunderstanding at all levels
    • lack of effective communication
    • difficulty in communication
    • lots of assumptions
  • Then he shared, BDD = shared understanding by discussing examples
  • Continuing his talk, he said, to start with, have at least three amigos -- Business Analyst, Programmer and Tester
  • Also says it is useful if more than three amigos are identified and used
  • Shared how important a scenario is and the use of Gherkin
  • Mentioned on BDD framework saying,
    • Feature File
    • Step definition (glue code)
    • Actual implementation
  • Started another group activity and asked the audience to identify the scenarios for an ATM transaction
  • He said to avoid UI tests with BDD
  • Shared few myths around BDD
    • BDD is automation of functional testing
    • Using Cucumber is BDD
    • BDD is replacement of functional testing
  • Took questions from audience around
    • Difference between unit testing and BDD
    • Around the usefulness of BDD
    • Deriving the benefits of BDD in performance and security testing
    • Limitations of BDD

Ajay Balamurugadas : Problem Solving Techniques: An attempt to apply ideas across disciplines
  • Starts by asking, "How do you solve problem? Take a minute and let me know."
    • Audience started interacting
    • There were no slides; it was a white board and an interactive session throughout
  • Then he mentioned a crisp definition of "what is a problem?" by Jerry Weinberg
    • difference between expectation and reality
  • He mentioned about Problem Solving Leadership workshop by Jerry Weinberg
  • He says, "focus on things which can be controlled"
  • From here, he asked the audience to pick any one problem, so that he demonstrates how to solve it
    • The audience picked -- why less number of attendees to the meetup
    • He started to brainstorm around this problem while audience interactively shared their thoughts on how to solve it
    • Nishi Grover Garg, said this is useful and it will be used from the next upcoming meetup
  • Moving from here, he said about four techniques which can be used in solving the problem
    1. Attributes and Improvement
      • by, Robert P Crawford
      • Further with examples he said
        • identify the problem and list out the attributes of the problem
        • work on the improvement of the problem
        • If you miss out an important attribute, the problem might not be solved
    2. Six Thinking Hats
      • by, Edward de Bono
        • Mentioned about 6 different thinking hats -- White, Black, Yellow, Green, Blue and Red
        • Briefed what each hat means and what they signify
        • He recommended to avoid using Black hat immediately after the use of Green hat
    3. Questioning
      • He said the importance of questioning
      • Mentioned about Osborn Questioning
    4. SCAMPER
      • A mnemonic
        • S - substitute
        • C - combine
        • A - adapt
        • M - magnify
        • P - put
        • E - eliminate
        • R - rearrange, reverse
  • Later he shared one more mnemonic which he made while on the way to meetup
      • P - perception
      • R - reasoning
      • O - opportunity
      • B - beware of assumptions & problems caused by solving the problems
      • L - lawfulness
      • E - exploratory
      • M - management
  • He took the questions from audience on the techniques and applying it

Dharamalingam K : Creating 100 mindmaps in 1 minute, a demo

  • Walked through swiftly on what is Mindmap and where it can be used
  • He shared the problem which he and his team encountered when wanting to build a mindmap for a product's feature
  • Then, he said how he built the mindmap via Python program
  • He ran a quick demo which showed creation of mindmaps
  • He took the questions from audience
    • On mindmap
    • On the complexity and how to do this via programming

Nishi Grover Garg : Concluding the meetup and vote of thanks

  • She thanked the audience who made it to the meetup
  • Spoke about the Agile Testing Alliance and the benefits people can get from certification

I took the below back to my desk from this meetup
  • How to handle myself in teams which claim to run on Agile
  • How to coordinate and deliver my best in an environment which claims to run on Agile
  • How to focus on my work, irrespective of Agile or not, and assist fellow testers and stakeholders
  • Thoughts and questions around BDD apart from functional testing
    • A mind which says to explore on this
  • To focus on things which are in my control and where I can deliver
    • Do not take responsibility without having the authority
      • I repeated this to myself again
    • To read and build skills from below resources shared by Ajay
      • Attributes and Improve, by Robert P Crawford
      • Game Storming, by Dave Gray, Sunni Brown, James Macanufo
      • To explore and use what I can to learn in the web --
In the post-meetup hours, I was part of three interactive discussion sessions with Ajay and Pranav. I learned a lot discussing fundamental topics of software testing, programming and practice with Ajay and Pranav.

Here is a pic from the meetup

Attendees of Agile Testing Alliance's 12th Bengaluru Meetup hosted at Moolya

Saturday, March 25, 2017

Testing for Web Performance: Why do I want to share?

I was practicing with a fellow tester and he expressed that he wants to practice performance testing. I too have that wish, i.e. to practice performance engineering and testing. I try to find an opportunity each time to practice this, and I look out for fellow testers and mentors who can assist me here. However, I do not wait for a mentor now, though I want one. Instead, I start my practice and learn from it as I practice.

Maybe, like me, there could be other testers who want to practice performance testing. I wish to share my learning and practice here, in the simplest and shortest way.

I have heard Rahul Verma saying, "replace the word 'performance' with the word 'capability' when you hear the word performance". I feel it is more appropriate and meaningful to look at it as 'capability' when 'performance' is said.

Under the label 'Web Performance', I will share what I have been learning, practicing and thinking about web performance in the last few years. When I say 'web', it is not just the website; it is the web that connects different interconnected systems. I will be sharing my learning on one aspect of the web which is connected and communicating via the HTTP protocol.

Sunday, March 19, 2017

My learning from Agile Testing Days Asia 2017

I got an opportunity to attend Agile Testing Days Asia 2017 (ATDA-2017). I thank my friend Jyothi for sharing this opportunity with me.  Before I say much, I request the reader to go through this post.  I will share what I felt on being part of this conference and what I took back to my desk so I can practice my testing better than before.

I'm a person who adapts to a project's environment and does what is needed to deliver the best. Whether I'm in a project which claims to work on Agile principles, the manifesto and its various methodologies, or in a project that claims to be on another methodology, principle or process, I work to see what best I can do there in context.

With that, I have no idea whether Agile is always successful or not. I have an idea -- teams working together to accomplish something will get there when each assists the others to deliver.

I have worked with teams which claim to be on Agile. Even while being on Agile, I have worked beyond office hours, on weekends, under pressure, and yet seen the release delayed beyond the timeline. All the problems exist here just as they do in projects which claim to be non-Agile, i.e. as before Agile got adopted into software development. It's all in understanding, solving and going forward, is what I see, irrespective of principles and methodology.

I don't say Agile is wrong, or effective or not effective. I learn that it is we people who adopt it that make it something else, be it Agile or non-Agile. Now, I will stop talking about process or Agile here as it is not my area of interest for now.

The ATDA-2017 conference was organized by STeP-IN Forum and I am thankful to them. It is not an easy job to organize a conference. Getting people together, taking care of their presence, giving space for networking with fellow testers of different organizations, and helping them carry their thoughts back to the workplace is not an easy effort.  I thank you, STeP-IN Forum, your people and the committee. Respects!

Coming back to the conference, I attended Conference Day 1 and Conference Day 2.  The audience were from India, Bangladesh, Canada, Romania, Spain and the USA.

The day 1 had below talks
  1. Evoke the Soul of Agile - Keys to Lasting Transformation, by Selena Delesie, Leadership & Innovation Coach, Speak & Soul Igniter, Delesie Solutions Inc.
  2. Leveraging Global Talent for Effective Agility, by Todd Little, Agile Leadership Consultant & Innovation Software Executive
  3. Agile Testing of Microservices, by Praveen Kumar, Manager-Agile Practice and Manoj Kumar Nagaraj, Director, Capgemini
  4. The Agile QTOPIA, by Rahul Verma, CTO & Founder, Test Mile
  5. How to avoid Internet of Insecure Things, by Gaurav Maheshwari, Engg. Manager - Software and Prashant Jain, Project Manager, TVS
  6. Building Inner Trust - there is no agile without confidence, by Gina Encache, Founder & President, CHOICE
  7. Rev up to Agility, Apply Lean Six Sigma in Testing, by Ramesh BR, Sr. Director, OpenText
  8. Agile Games -- to build a jeep

The day 2 had below talks
  1. Divide and Conquer: Easier Continuous Delivery using Micro-Services, by Carlos Sanchez, Engineer at CloudBees, Member - Apache Software Foundation, Startup Technology Advisor
  2. What New Approaches to Software Can Teach the Enterprises, by Puneet Gupta, Global CTO, Brillio
  3. Agile and Startups - What can go wrong - A case study, by Vipin Jain, Director QA, Astegic Infosoft
  4. All sprints are guilty unless proven innocent, by Deepak Chopra, VP, Genpact
  5. Testing in a Responsive Enterprise, by Abhishek Johri, Agile Coach, TEK Systems; I remember the other co-presenter was named Mahesh, a consultant.
  6. Agility for The Last Mile, by Diwakar Menon, Founder, Last Mile Consultants
  7. There is no such thing called Agile Testing - Debunking Rituals and Ceremonies, by Srinivas Kulkarni, VP, JP Morgan
  8. The Big Fight - Cirque (Dangal) -- Agile - to be or not to be. Debaters -- Raj Netravati (moderator), Selena Delesie, Rahul Verma, Pradeep Soundararajan, Jayapradeep Jiothis, and Jayaprakash Prabhakar

Here is my experience of Day 1
  • I understand that Agile practice is a culture. It is very much essential to have a culture that helps an organization and team grow and deliver. In doing so, the process comes in; how to make use of the process and tailor it to accomplish things is the key. This was iterated, probably in one way or another, by every presenter, with examples of their work and the process brought into their project or workplace. I did not see anything about testing. Rather, it was all about the process.
  • Todd Little shared the importance of domain knowledge in programming and testing from one of his projects relating to the petroleum industry. Further, he shared how collaboration is important when working with teams situated in different geographical locations.
  • A contrasting presentation was from Rahul Verma, and to me this was expected as I have seen him present in similar lines. I was not surprised; however, the audience had tickles and laughs as he expressed his thoughts. It was around process, culture and mindset.
  • About security in IoT, it was more about the practices and guidelines. I expected to see testing here. It did not happen for me. I repeat, it is for me; I'm not sure about others. And this was the only presentation which did not have the word 'Agile' in the presenter's talk.
  • Gina Encache shared the importance of people relationships in a project and the working place. It was her first talk, and I'm glad that she made it in India.
  • Ramesh gave a case study of his work which had to do with process, and illustrated how it benefited the team in saving time and delivering. Nice! He showed his team members' photo to the audience, and that is remarkable! But again, I was disappointed as I did not see testing here. It was about process.
  • The game session was interesting. There were three teams, and each was given material to build a jeep. I enjoyed watching it.  The collaboration, communication, team effort, and one goal to accomplish helped all to stick together. Interestingly, here the teams did not follow any process; however, it was a time-boxed activity and the team members spoke about what they had to do and in what order. This was an example which the conference could probably have highlighted by having each team talk on the second day of the conference. Maybe the organizers did not think about it. I enjoyed this activity.
  • Atul Khatri presented his stand-up comedy. He did say that we will recollect his jokes after two or three days, and they will tickle us and bring laughs.
  • Following this, there was an announcement of the winning team who designed the jeep within the given design constraints and time.

Here is my experience of Day 2
  • It looked like a continued journey from Day 1.
  • I did not get the details of how to test micro-services. It was more about infrastructure and concepts. For one who hasn't heard about micro-services, this session will help to get an idea at a conceptual level.
  • Vipin Jain had a case study, and it was around a process and the impact of not understanding how to use a process.
  • After this, all the talks were on process and how to go about process so one can be well informed before a huge cost hits the business.
  • Testing in the Responsive Enterprise: I was waiting for this talk. Eventually, this too was on process, discussing the latest trends in software development and its methodologies.
  • Following this, there was a debate about being Agile or not. Each person in the debate had their views on Agile, tools, and Continuous Integration. But none of them spoke about testing! That was sad. I expected at least one of them to ask where testing is in all that we had heard for two days around Agile. It did not happen. See how one gets into the process trap while everyone is talking about process.
  • Shrini: I have seen him present and share his thoughts, so it was no surprise for me. I did expect something from Shrini about test execution. His was the only presentation which used the word "testing" quite often, along with the words "process" and "Agile".

My observation
  • The conference was Agile Testing Days, but it did not have testing in it, i.e., in the two days of the conference.
  • Most of the speakers were at the management level and shared thoughts on process.
  • Whereas most of the audience were at the execution level, i.e., testers who do testing. See the gap between presenters and audience.
  • Process, process, and process. The same is heard in the office, and the same was heard at the conference. Where is the testing? Or, where is the automation? How to do it? An example and demonstration of one such problem went missing from the conference.
  • Testers want to know and seek help in doing their tasks better. At least, I look for that rather than hearing about process. I did not get that.
  • The game activity in the conference was a clear example of collaborating and delivering as a team while knowing the goal. In the game session, everyone was agile and did what their role expected of them in building a jeep, assisting each other.
  • I do not remember much discussion on the Four Testing Quadrants and how to work with them, if Agile Testing was being discussed at an Agile Testing conference. As far as I remember, there was just a one-liner mention of the four quadrants of testing in Shrini's presentation.

My humble appeal to STeP-IN committee
  • I wish to share my honest opinion here as a practicing tester and seek your help to practice better. Please do not feel bad about it.
  • I see that most of the audience at the conference were practicing, hands-on testers at their desks. It is they who execute and inform the stakeholders (managers and management) about risk. When we testers improve our testing skills, eventually we will learn how to deal with the processes and principles that come under any name. Thereby we actually help the culture of the project, and of the company, to get better as well. That said, I request more testing in conferences than process and management.
  • If you look at the presenters and the audience, you will see the gap clearly. Please do encourage and pick hands-on testers. Give more slots to these testers and fewer slots to management people. Ask testers to come up and speak about their testing, automation, problem solving, and innovations. This is what we need, importantly, to march forward in the testing practice.
  • Having management people talk all the way is fine as long as we testers get what we came to get. If not, it will be boring, at least for me.
  • To the panel which selects the papers or topics for presentation: please keep the practicing testers in mind. See what it is that we testers need, and provide that. The orientation of problems for a tester and for management is of a different kind. You see, management sends their testers to conferences, not most of their management. I'm not saying to eliminate management talks, but they should not fill the space and eliminate testing altogether.
  • I wish to see the audience feel happy about coming to the conference, say it was very useful, and recommend it to themselves and others.

I took below to my desk from this conference
  • To focus on my testing. To learn how to make the best use of a process for delivering, by adapting and tailoring it to the context.
  • To convey the same to fellow testers at their desks and work along with them to deliver useful testing which is of value and creates demand for itself.
  • I said to myself, testing is agile in itself when the purpose of testing is understood. So is automation, when it is there to assist the testing. If I learn this consistently each day, it will help me adapt and tailor any process to do better.

Thursday, March 16, 2017

When not to test? When is the no need of testing?

I saw a tweet directed at me from my colleague Pradeep Lingan (PL). He had shared a question: when to start testing once the build arrives?

I was silent for a few seconds on reading this question. In that silence, I asked myself, "Why should I test? Why at all should I start testing at whatever point in the product development timeline?" Now I have two questions: the one from PL, and the one I ask about his question.

The question asked by PL is not always easy to answer. But it is a very logical and technical question which should be answered. If it is not answered, the stakeholders involved, and those who are expecting something valuable from testing, will come to ignore testing.

I wanted to ask PL the context in which he was asking this question. You see, it is not a simple question. It can be better understood when one knows the context which raised it.

As a practicing tester, I understand that where there is a contextual question, it is also possible to derive questions which are context-free. When I say context-free questions, I mean questions which are generic and can be used alongside contextual questions. Is context-free also a context? I will leave this to you for brainstorming.

I will not ask PL for context here. I will answer with my thoughts using a context-free approach. From there, I will let PL brainstorm with the context of the question he asked.

Why do we test? When is "the need" for testing?

As I learn and test, the purpose of Software Testing is:
  • To provide information on the product, so an authorized stakeholder can make a sound, informed decision on the product's development, scope of development, pause, abort, and release
  • Information from testing the product can be about its quality criteria and the risks around them, and about the risk of carrying out or not carrying out the testing in context
  • Information obtained from testing can be used to make decisions about the product; to weigh the decisions which stakeholders are going to make and the risks arising from them; to judge whether it is worth paying a tester or testing team for the information being received; to see whether we are investing too much for information that's not worth it; for prioritization; to decide whether we should carry out more and better testing; etc.

Software Testing is not about fixing. Software Testing is there to assist the fixing in a better, more rational, and empirical way. The information obtained from testing is to be used for bettering the product. That means carrying out testing does not, by itself, improve the product. When information from testing is used to better the product by fixing, it helps the product to improve.

The above points should help in starting to understand Software Testing and its service to the development of the product and to stakeholders. If you and I mutually agree on the above points, then you are looking for information about what could ruin the existence of the product, or the purpose of building the product, or incur loss to everyone in the business.

When not to test? When is "the no need" for testing?

Having read why we test and when testing is needed, it is quite straightforward to know when not to test and when there is no need for testing. This can be used as a starting point to decide whether to start or stop testing.

In the below cases, it is useful not to test:
  • When a tester and testing team have no clarity in knowing what is useful testing and what is not useful testing for the stakeholder
  • When we don't know the risk(s) on which the testing has to focus in the product and provide information
  • When we have no clarity on what is expected out of testing
  • When a tester or stakeholder feels the outcome of the testing being carried out is not going to help in making the decision
  • When there is no area or question in the product for which testing helps to find an answer
  • When no one is interested or curious about the outcome of the testing

In the below cases, there is no need for testing:
  • When the stakeholder learns or believes there is no risk in the product, and says to stop testing or denies testing for the product
  • When the outcome of testing will not change the decision of the stakeholder
  • When no information is expected out of testing
  • When a tester or stakeholder feels the outcome of the testing being carried out is not going to help in making the decision
  • When there is no area or question in the product for which testing helps to find an answer
  • When no one is interested or curious about the outcome of the testing
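If it helps the brainstorming, the two checklists above can be read together as a rough decision helper. Here is a sketch in JavaScript; the flag names are my own illustration, not a formal model of any kind:

```javascript
// Rough sketch of the "when not to test" checklists as a decision helper.
// Each entry pairs a condition with the reason it blocks useful testing.
function shouldTest(context) {
  const reasonsNotToTest = [
    [!context.clarityOnUsefulTesting, "no clarity on what useful testing is for the stakeholder"],
    [!context.risksKnown, "the risks testing should focus on are not known"],
    [!context.expectationsClear, "no clarity on what is expected out of testing"],
    [!context.outcomeAffectsDecision, "the outcome will not change any decision"],
    [!context.openQuestionsExist, "no area or question for testing to answer"],
    [!context.someoneIsInterested, "no one is interested in the outcome"],
  ];
  const blockers = reasonsNotToTest
    .filter(([applies]) => applies)
    .map(([, why]) => why);
  return { test: blockers.length === 0, blockers };
}
```

If any blocker applies, it is worth raising it with the stakeholder before spending testing time, rather than starting to test by default.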

It is business, and business also runs on money and time. As a tester, it is my responsibility as well to help make the decision of when testing is not needed and when not to test. To do this, I need to build credibility with stakeholders.

However, as a tester, I continue to uncover risks if that really helps the stakeholder and if I'm sensing them from my testing. While doing this, it is also good to note that the tester's time is important too, as it is value and money. If I think my investment of time is needed here, I will make it, provided I see a good cause and benefit there.

Did that help, in a context-free way, to decide when to start testing once the build is available or yet to be available? From here, build the contextual reasoning to figure out when to start, when to stop, and when there is no need for testing in a given situation.

Monday, March 13, 2017

Disclaimers for the blog posts under this label - "Testing and Conferences"

I want to start writing on what I feel and what I took back from attending a conference. I attend conferences that are oriented mainly towards Software Testing, and also towards programming. I ask myself, "Why am I not sharing with the organizers the experience I had attending the conference as an audience member or presenter?" I have never shared it, though I filled in the feedback form if one was given at the end of the conference or at the end of a conference day.

I'm not sure if I will be hated, or opposed, or seen as unusual for what I share. But when I share, I do it with honesty, seeking the good for me and for all practicing software testers; that includes me as well. With this, I want to keep this post as a disclaimer, or as pointers one should read if she or he has different thoughts about what I share in the successive blog posts under this label.

I want to convey the below points as a list of disclaimers:
  • The opinions and experiences I share under this blog label are my own. They have nothing to do with others. I am solely responsible for what I say here.
  • I do not mean to downgrade, oppose, or pull down the fame and legacy of any forum, association, group, or practitioner. Instead, I wish to be an honest tester who shares what I feel, with respect for the effort the group has put into making a conference for software testers.
  • I do not intend to speak about any speakers or presenters at a conference. It is a skill to present to an audience; I admire it, and I want to get better and grow here. Anyone who comes up on stage and presents has to be encouraged, and I'm one such encourager, with empathy.
  • I never talk about a person, i.e., an individual, nor about their experience or work. If I did so, it would be too bad of me. Rather, I will share what I felt overall and what I want to carry back from the conference to my desk to implement in my practice.
  • My sharing is directed to the conference organizers and the panel who set up the themes or topics to be presented to the audience. If the conference audience finds my thoughts relevant, I would not be surprised. I speak from the perspective of a tester who is hands-on in her or his job.
  • Organizing a conference or a talk is not a simple job. I know the effort it takes. These posts never point at that effort. But I will share what was beneficial, and what would have been beneficial, to me, to testers like me, and to people who want to see value out of a testing team.
  • In the blog posts under this label, I will share what could have been done to reach testers like me, so that I could go back to my desk and implement or try an exploration based on a thought or theme shared at the conference.
Note: I keep updating these disclaimer pointers as I learn.

At any time, if you think I'm doing wrong or have misunderstood, please feel free to convey it to me. My email id is ravisuriya1 at gmail dot com. I'm human and I'm fallible, yet I want to be better each time; if you happen to assist me in doing so, I will be thankful to you.

Thursday, March 9, 2017

Native, Web and Hybrid - How are these apps related and different?

During a discussion on mobile apps with a tester, one of the questions was: "What is a native app? What is the difference between a hybrid app and a native app?" Even though one uses mobile apps in and out almost every day on mobile devices, there is still confusion in knowing what these apps are and the differences among them. In this post, I do not share the advantages and disadvantages of the types of mobile apps; rather, I share what I understand of what they are. And this should help in starting to learn the advantages and disadvantages, if any.

Further, knowing the technology stack being used for a native mobile app, web app, or hybrid mobile app, a tester can design the tests better. This helps to figure out the testability, and from here one can get an idea of which parts are good to automate and assess, and which require human skills to assess further and test.

Mobile web app and mobile device

I will walk briefly through the native mobile app and the hybrid mobile app in this post. Before I do that, let me tell you a bit about the web app with Responsive Web Design (RWD) for mobile devices. Do you know a website that you use often? Open that site in your mobile device's browser now and experience how it is rendered. Now, why not change the orientation of your device and view it in portrait and landscape? What was your experience? Were you able to use it on the mobile device? Did it render with the same width and height as in the desktop browser, or did it get customized to your mobile device's screen?

Along with RWD, HTML5, and the latest web technologies of JS with CSS, there exist frameworks to build web apps for mobile which use the nativity of mobile devices. They exist today and are evolving. One such example is the Ionic framework.

While the mobile OS extends support for a web view to display web content within an app's activity, the above-said web technologies and frameworks support building a mobile web app. With this, the mobile web app can make use of the device's hardware and sensors, as a native mobile app does.

This mobile web app is not available as an app in Google's Play Store or Apple's App Store. However, it can feel like a mobile app in experience, though the implementation differs. When a website can be rendered to a mobile device's screen size, is it possible to get the feel of something other than a browser-based website, though the address bar is shown? Frameworks available today to build mobile web apps will help to do so.

Some examples of mobile web apps are GMail, Flipkart, Amazon, and Facebook.
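The "customized to your screen" behaviour in the orientation experiment above typically comes from breakpoints in the site's CSS media queries. The decision can be sketched as a plain function; the pixel values and layout names here are illustrative, not taken from any real site:

```javascript
// Illustrative sketch of RWD breakpoint logic: given a viewport width,
// pick a layout, as a site's CSS media queries would.
function layoutFor(viewportWidthPx) {
  if (viewportWidthPx < 600) return "single-column"; // typical phone portrait
  if (viewportWidthPx < 1024) return "two-column";   // tablet or phone landscape
  return "desktop";                                  // wide screens
}
```

Rotating the device changes the viewport width, which can cross a breakpoint and switch the layout; that is what the portrait and landscape experiment shows.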

Native app and mobile device

A mobile app which is installed on a mobile device and can make use of the hardware and sensors available on the device is typically a native app. This app is built for a particular mobile device's OS and platform. A mobile device comes with a few apps pre-installed, and the user can still look for an app she or he needs on Google's Play Store and Apple's App Store.

A native app can access what it requires on the mobile device, provided it has permission to do so. Now make a list of the permissions you have seen apps ask you for while installing and using them on the mobile device.

Some examples of native mobile apps are Contacts, Messages, Camera, Calendar, and those which can be downloaded from the respective stores of Google and Apple.

Note: In this context, I'm taking the Google and Apple platforms as they are well known to most.

Hybrid app and mobile device

As said above, HTML5 and JS technologies play the major role in a mobile web app. Now, can the same be used to build apps which can be installed on Android and iOS?

Technically, the hybrid app will have an HTML5 web app within a native wrapper which is supported by the mobile OS. This mobile app is developed with HTML5, JS technologies, and CSS, then compiled into native Android, native iOS, or any other mobile OS using wrapper support.

Apache Cordova is one such mobile application development framework, which helps to wrap an app so it can run in a WebView while having a nativity extension to the mobile device's (OS) platform. The plugins of Cordova can be used to access the device's hardware support. On knowing the word Cordova, if we happen to explore further, we will hear the words Phonegap, Titanium, Ionic, Xamarin, etc. Knowing how they support developing mobile applications will help in understanding the hybrid mobile app better.

If the hybrid mobile app is also a kind of web app, then are the HTML, CSS, and JS files stored in the app which will be installed on the device? Or will it get them from the server, as on launch of the web app in a desktop browser? This depends on how the hybrid app is designed. If they are packaged into the .apk or .ipa file, then web APIs can still be used to send and receive data. If they are received from a server, then a layer around the WebView will have to be built, and web APIs will still be used.

The apps built in this way are called hybrid because:

  • they are not native apps, as the rendering is done via WebViews instead of the platform's UI framework
  • they are not web apps, as they are packaged as an app for distribution across platforms, using the platform's nativity via its APIs

Have you used the GMail app on your phone after downloading it from the store? Are you using the Instagram, Facebook, YouTube, or Maps apps on your mobile device after downloading them from the store? Then those are examples of hybrid apps, which combine the strengths of the native app and the mobile web app.
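A pattern often seen in hybrid app code is to feature-detect the native bridge at run time and fall back to a plain web capability when the bridge is absent. A minimal sketch follows; the `env` object stands in for the browser's global object, and the capability names are illustrative rather than a real Cordova API:

```javascript
// Sketch of native-bridge feature detection in a hybrid app.
// `env` stands in for the global (window) object; the plugin shape
// shown here is illustrative, not an actual Cordova plugin layout.
function pickCameraSource(env) {
  if (env.cordova && env.cordova.plugins && env.cordova.plugins.camera) {
    return "native-camera-plugin"; // the wrapper injected a native bridge
  }
  return "html5-file-input";       // plain web fallback, e.g. a file input
}
```

For a tester, such seams are handy: both branches can be exercised by passing different `env` objects, without needing a device in hand for the first pass.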

So what's the point

To end this post: if asked which is best, i.e., the native app or the hybrid app, it depends on the context and on what one is trying to solve, with negotiation. Both help to accomplish the needs of the app-developing organization, programmer, tester, and user. While each has its own strengths and weaknesses, it is a choice to be made by the people who build, for the people they build for. In this post, I have not shared the advantages and disadvantages of each.

Now starts the fun of exploration around testing and automation. How to test the apps built in the hybrid way? What levels of isolation can I identify so that I can lay it out layer by layer and test? Where to start if it is being packaged into the nativity of the platform on the mobile OS?

If you are doing it or figured it out, pull me to pair with you and test together. Let us do it!

Thursday, January 26, 2017

The Testability and a Tester

In one of the recent discussions with fellow testers, the topic was "testability". While the fellow testers asked, "What is testability?", I had a question which I shared in the discussion: "What is codability?"

Codability and Testability

I see both of these as interrelated. If it is codable, then it is testable to a degree. What is that degree? That's the question of interest. But what will one accomplish by knowing the order of that degree? This forms the base for having the curiosity or wish to know about testability.

Say I have a product feature which I have to implement. The feature has cases which I need to handle at run time for different situations which a user can encounter. Well, I can code for the situations which I can think of and which are of priority from the point of view of using the product. Then for the other situations which I'm not seeing or have not thought of, will it be a hit for my product? What is the complexity level of the code I'm writing for this feature, and what are the risks that are subtle in the code I have written? The written code will always have risk tied along with it.

Now, can I say that testability is influenced by multiple factors and not just by the tests identified? After knowing this, it is essential to understand: "Testability is the easiness with which I can test the system in a given situation."

Factors influencing testability

The few factors which will potentially influence the order of easiness in learning the test challenge, identifying the test, approaching the test, and executing it are as below:
  • The testing skills of a tester
  • The programming skills of a tester
  • The technology skills of a tester
  • The experience of a tester in testing such systems
  • What is known by the tester about the product: the purpose of the product; the people who are building it; the people who will use or are using it; its features and functionality; its technology; the code written for it; how the product is built within the limitations set by the business team; the information available about the product; the process, practice, and culture followed by the teams building it; and the availability of the system and its present state
  • Time available for system development
  • Time available for testing
  • The level or order to which a testability aid is built into and can be provided in the system under test, and the complexity or simplicity of the same
  • Measures taken to improve the system and include testability aids in it
  • Existing tools or utilities for testing the product or code, or the ability to build such utilities
  • The freedom given to the tester and the testing to control and guide the testing activity
  • The information expected out of testing and its priority
  • Relationships with fellow testers, fellow programmers, business teams, and any others who are involved and interested in the system being programmed and tested
  • The audience interested in the test report or outcome of testing, and their expectations
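One small, concrete example of a "testability aid built into the system" from the list above is a seam for time: if the code asks for the current time through a parameter instead of reading the system clock directly, the test can control time. A minimal sketch in JavaScript; the session-expiry rule and the names are invented for illustration:

```javascript
// A testability aid: the clock is injected, so tests control time.
// In production the default argument reads the real system clock.
function isSessionExpired(session, now = () => Date.now()) {
  return now() - session.startedAt > session.ttlMs;
}
```

With the default argument the function behaves normally in production, while a test can pass a fake clock and check expiry on both sides of the boundary without waiting for real time to pass.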

Testability and Test Coverage

The test coverage accomplished by a tester in a given context is related to the skill set of the tester and the testability factors of the system and environment.

While I have understood this, I tried to learn the scale of testability keeping the test coverage as a base. Here is what I have observed for now.
Keeping the test coverage as a base scale might mislead me about what I can accomplish in coverage. The testability available might make it easy for me to reach the tenth level of coverage with my skills and the other influencing factors of testability, while covering the first three levels may not be easy, given those same skills and factors. The vice versa is also possible. I can take a long time to learn how I could have cut it short and still accomplished the same coverage at the testability level available to me in a context, had I learned about its complexity level earlier. By the time I reach an intended level of coverage, I would have made moves which do not favor the business timeline, learned from them, and repeated a few things. This is not wrong, but I have consumed time which I could have invested in testing to accomplish much more coverage. The same is represented in the image below.

Keeping the test coverage as the base and assessing the testability on a vertical scale, marking the breadth and width of the coverage spread which I want to accomplish or can accomplish, here is what I observe for now:
Understanding the test coverage I want to reach with my testing and automation, so that I provide the information expected, is a challenge each time. The simplest reason to quote first is the "testability" for that coverage mark. Once I get an idea of the skills I need to build and use in such situations, it helps me make strategic decisions in test execution and its management. This is very important, as I see it, from the point of view of a strategic testing base. It helps me and allows me to test the perceived level of testability itself for a marked coverage boundary or milestone.

This base of strategic thinking in learning and identifying testability can be used across the timelines of a project. Not just during the execution of testing; it can also be used pre-execution and post-execution. It gives an idea of how the tester has to be equipped for the changing needs of testing and engineering. For me, this is working in the contexts to which I'm exposed, and in a few cases I had to add a few vectors along with test coverage, such as technology-specific and programming-specific factors. I'm experimenting with this in varied projects having different technology and skill challenges.

Visibility out of testability

With this, for now I see,

  • Testability is related to codability, the programmer, the tester, the environment, the process, people, situations, and priorities
  • Codability gives the first hint of testability and of the skills needed to test the same
  • Testability influences the test coverage
  • Testability influences the time taken by a tester to test the system in a given context
  • As codability requires the skills of a programmer, testability requires the skills of a tester

Note: When automation is leveraged along with testability, wonders can be done via automation in assisting the testing.