Saturday, May 24, 2014

Stroking the Gestures in Gesturs 1.1


I practiced testing gesture strokes in the application Gesturs 1.1, which was provided for testing as part of STePIN Testers Arena's 4th contest. I covered various aspects of the tests across different quality criteria.

The tests I conducted on gesture strokes were interesting and gave me a new perspective on learning. It was a two-hour session in which I toured the product to understand how it calculates and executes the gestures I stroked. The coverage spanned installation, operation, and post-uninstallation.

The observations I made in the session are shared in this report. The coverage spans pre-installation, installation, configuration, built-in gestures, customized gestures, gesture stroking, stroking velocity, stroke angle, stroke length, gesture reversal, and gesture threshold area. I explored the product against functionality and response-time criteria.
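To make the stroke-level coverage areas concrete, here is a minimal sketch of how metrics like stroke length, velocity, and angle could be derived from sampled pointer positions. The function and its formulas are my own illustration for the testing idea, not Gesturs' actual implementation.

```python
import math

def stroke_metrics(samples):
    """Compute length, average velocity, and overall angle of a stroke.

    `samples` is a list of (x, y, t) tuples: pointer position in pixels
    and timestamp in seconds. These formulas are assumptions made for
    illustration; they are not how Gesturs itself computes its gestures.
    """
    # Path length: sum of distances between consecutive samples.
    length = 0.0
    for (x0, y0, _), (x1, y1, _) in zip(samples, samples[1:]):
        length += math.hypot(x1 - x0, y1 - y0)
    # Average velocity over the whole stroke, in pixels per second.
    duration = samples[-1][2] - samples[0][2]
    velocity = length / duration if duration > 0 else 0.0
    # Overall angle from first to last sample, in degrees.
    dx = samples[-1][0] - samples[0][0]
    dy = samples[-1][1] - samples[0][1]
    angle = math.degrees(math.atan2(dy, dx))
    return length, velocity, angle
```

For example, a straight horizontal stroke of 100 pixels taking half a second gives a length of 100, a velocity of 200 px/s, and an angle of 0 degrees; comparing such derived numbers against the gestures the product recognizes is one way to probe the velocity, angle, and length criteria listed above.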

From a user-experience standpoint, I felt very uncomfortable using Windows the way I usually do with the 'Windows' key on the keyboard: the Windows key was not functional while the Gesturs process was active in the Windows activity list. Using the product on a laptop trackpad, with no mouse, is another area where the experience varies across versions of Windows.

That is, Windows 7 and Windows 8 each gave a different uncomfortable experience when used with Gesturs 1.1. Whatever the case, it was not helping me complete my tasks any faster. I worked on how Gesturs could be designed to make the experience better, and while doing so I also worked on its test architecture, which impressed me when I tested it under the perspectives I thought of at the time. As noted, the product is still in an early state of development and will get better as it evolves. Wishing the best to the development team.

Testability for gesture testing was not straightforward, but there were clues to learn from and test against. I used a few tools available in Windows to speed up my learning during the session. There was a need to build a second layer of testability to understand how Gesturs' gestures function; once I did that, it became easier for me to proceed with the testing.
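One way to picture that second layer of testability is a small recorder that logs pointer samples so a stroke can be inspected after the fact. The class below and its names are entirely my own illustration of the idea; the hook that would feed it (a mouse-event callback on Windows) is left abstract.

```python
import csv
import time

class StrokeRecorder:
    """A minimal second-layer testability aid: log pointer samples so a
    stroke can be inspected or replayed later. Illustrative only; this is
    not part of Gesturs, and the event hook feeding record() is assumed."""

    def __init__(self):
        self.samples = []

    def record(self, x, y, t=None):
        """Store one pointer sample; timestamp defaults to now."""
        self.samples.append((x, y, t if t is not None else time.time()))

    def dump(self, path):
        """Write the recorded samples to a CSV file for offline study."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["x", "y", "t"])
            writer.writerows(self.samples)
```

With a log like this, the stroke the tester believes they made can be compared against the gesture the product reports, which is the kind of visibility the session needed.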

An interesting test I did not carry out was gesture stroking and response under different screen resolutions and varied hardware acceleration.


Functional Interoperability Tests for Image Uploading in Greenshot with Imgur


Interoperability exists everywhere data or information is exchanged or interpreted, and it is an interesting area that usually goes unnoticed. Software applications have this area in them too. The communication and exchange of data between two components of the same system, or of different systems, is challenging from several perspectives.

The list of challenges keeps growing as I continue to learn about it. The primary challenge I have seen so far in my practice is not understanding or knowing that there is a communication happening to accomplish the task, and how it is communicated and translated in each request and response. For example, how does the command 'netstat', 'tracert', or 'ping' function in MS-DOS and produce the requested information? How does the horn button, when pressed in a bike or car, produce the sound?
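Making that hidden communication visible often starts with reading what a tool reports. As a small sketch of that first step, here is a toy parser for one data line of `netstat`-style output; the field layout assumed here is a simplification (UDP lines, for instance, omit the state column), and the function is my own illustration, not a real netstat implementation.

```python
def parse_netstat_line(line):
    """Split one data line of `netstat -an`-style output into named fields.

    Assumes the common TCP layout: protocol, local address, foreign
    address, state. Returns None for lines that do not fit this shape.
    """
    parts = line.split()
    if len(parts) < 4:
        return None
    return {
        "proto": parts[0],
        "local": parts[1],
        "foreign": parts[2],
        "state": parts[3],
    }
```

Even a crude reading like this exposes the request/response pair behind the command: which local endpoint is talking to which foreign one, and in what state.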

That is how beautiful interoperability is, and most of the time it is not considered for study in daily software testing practice. In this blog post, I'm sharing the functional interoperability tests that Bhavana and I practiced on Greenshot and Imgur for uploading a captured image. The context framed in these session notes is an assumption I made for the testing practice.

Covering the Imgur API's limitations and rules is one of the interesting parts of this testing. Likewise, we understood how the API limitations can cause confusion at the UI and functionality level. The report can be found here.
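As a sketch of how a client could surface those API limits instead of confusing the user, the function below inspects rate-limit headers before attempting an upload. Imgur's v3 API documents headers such as `X-RateLimit-UserRemaining` and `X-RateLimit-ClientRemaining`; treat the exact names and semantics here as assumptions to verify against the current Imgur documentation, and the function itself as my own illustration, not Greenshot's code.

```python
def upload_allowed(headers, minimum=1):
    """Decide whether an upload should proceed, given rate-limit headers
    from a previous Imgur-style API response.

    `headers` is a dict of response headers. Returns (True, None) if no
    known credit counter is exhausted, else (False, exhausted_header).
    Header names are assumed from Imgur's v3 docs, for illustration.
    """
    for name in ("X-RateLimit-UserRemaining", "X-RateLimit-ClientRemaining"):
        value = headers.get(name)
        if value is not None and int(value) < minimum:
            return False, name
    return True, None
```

A check like this is also a cheap test oracle: if the remaining credits reported by the API say an upload must fail, but the UI proceeds without explanation, that mismatch is exactly the kind of confusion the session uncovered.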