Wednesday, March 17, 2010

Weekend Testing 28 -- Test and Experience Report


Mission: There are three tasks to be completed today. Time Duration: 1 hour.

Task 1:
Complete the game: http://www.gamesforthebrain.com/game/dragger/
Objective: Send the screenshot where the picture is built right.

Task 2:
Score 90 points in the game:
http://www.gamesforthebrain.com/game/memocoly/
Send the screenshot. Checkpoints: URL of the game, IQ Score, "Your solution is right, congratulations! (+10 points)"

Task 3:
Score 50 points in the game. http://www.gamesforthebrain.com/game/numberhunt/
Objective: Send the screenshot. Checkpoints: URL of the game, IQ Score, "Your answer xx is right, congratulations! (+10 points)"

Date: 13th March 2010, 3:20 PM IST
Machine: Windows XP SP3
Browser: Firefox 3.5.8
Tester: Ravisuriya

Context: The tester was given the mission of playing the games in 60 minutes. There was a power cut of about 15 minutes before starting the mission tasks. When power came back, 40 minutes remained to complete the session. I saw testers asking the facilitator questions about the mission and tasks. I started by taking up the first task.

To Deliver:
  1. Screenshot of the picture built right from Dragger.
  2. Screenshot of a score of 90, the URL, and the sentence confirming that you scored 10 points, from the Memocoly game.
  3. Screenshot of a score of 50, the URL, and the sentence confirming that you scored 10 points, from the NumberHunt game.

The mission was confusing to me, as it said "complete the game", "score 50 points", and "score 90 points". It did not say why I should score just 50 and 90 points and not more. No procedure to adhere to while playing the games was mentioned. During the power cut, I tried to understand the mission.


Assumptions:
  • Looking at the game descriptions in the mission statement, I thought the games would be time limited.
  • Tools could be of help here in accomplishing the mission.

Tools that helped me accomplish the mission:
  • Blank paper and a pencil.

Tasks:

Task-1) Dragger:

Browsing to the URL given in the mission, I found a jumbled image. I tried clicking the 'Refresh' button to find a simple image that I could arrange quickly. I found one jumbled picture that was easy for me and completed Task 1.

There was no restriction that any particular image's pieces had to be put in the right frame. I took it as an opportunity to choose a picture of my interest.


Task-2) Memocoly:

For the first couple of tries, I kept looking at the screen with my attention diverted nowhere. But this strained my eyes, as I was without spectacles, and I keep the brightness and contrast of the monitor I view at less than 50. I felt this could hurt me if I failed to recognize what the colors were. Thinking of how to overcome this, I devised a numbering system for the regions.

Mental model of the four parts of a geometric shape that appeared as a square:
  • Labeled each section as 1, 2, 3 and 4.
  • Wrote the numbers on the sheet in the order the regions of the square blinked.
  • Then clicked on those regions based on the numbers I wrote.
  • This helped me to complete the mission in good time.
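The steps above can be sketched as a small script. The region labels and the blink sequence below are my assumptions for illustration, not Memocoly's actual mechanics:

```python
# Sketch of the region-numbering strategy (hypothetical labels;
# Memocoly's real layout and mechanics may differ).
# The square is split into four regions, each labeled 1-4; the blink
# sequence is noted down on paper and then replayed in order.

REGIONS = {1: "top-left", 2: "top-right", 3: "bottom-left", 4: "bottom-right"}

def note_sequence(blinks):
    """Record the blinked regions as numbers, as done on the sheet."""
    return [n for n in blinks if n in REGIONS]

def replay(notes):
    """Return the regions to click back, in the noted order."""
    return [REGIONS[n] for n in notes]

notes = note_sequence([2, 4, 1, 3])
print(replay(notes))  # the order in which to click the regions
```

The point of the numbering is that a written digit is faster and more reliable to recall than a color, especially when the colors are hard to distinguish on a dim monitor.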

Task-3) NumberHunt:
  • Used Microsoft Calculator to calculate the displayed numbers.
  • This helped me to complete the third game a bit quicker.

Finally, I was able to accomplish the mission, i.e., to complete the three tasks. Among the few strategies I wrote down for completing the tasks, I used the strategy of using tools to accomplish the mission. The discussion session was interesting, and the chat transcript can be found here. The facilitator came up with an interesting thought for discussion on the Weekend Testing forum. The discussion topic title was: "Are we testers (who plan to meet the mission) or testers (who aim to improve our skills)?" My report of the three tasks as a PDF document is here.


Sunday, March 14, 2010

Weekend Testing 27 -- Test and Experience Report


Mission: To generate test ideas to test Google Buzz with quality criteria Performance.

Date: 06th March 2010, 3:05 PM IST.
To Deliver: Report of performance testing ideas for Google Buzz.
Tester: Ravisuriya

Context: The tester has been asked to look for test ideas to test the performance quality criterion of Google Buzz. It was an opportunity, and the first time, that this tester was working with the performance quality criterion. Google Buzz was live and being used by users for buzzing.


I started off with the mission statement. I was not very sure what 'performance' meant in the mission. After a question to the WT session facilitator, I assumed 'performance' here meant "how quick and responsive it is". I continued to collect details regarding the context in which 'performance' was being considered, but it remained an open question, left to the tester's choice in the session. Modeling myself as a tester looking for performance test ideas for Google Buzz, I began to brainstorm.


Assumptions made while brainstorming:
  • I was not sure what 'performance' meant in the test mission.
  • A thought: does performance mean "how well Google Buzz was performing compared with other similar social networking services"?
  • Or how well the application satisfies its desired claims under given constraints, in terms of processing user actions and its throughput interval?
  • I assumed, for this session, that 'performance' is how quick and responsive it is.
  • Various users are using Google Buzz at present, with no two having similar internet connection speeds.

A few questions I had while brainstorming about users and Google Buzz performance:
  • Performance test ideas in what context?
  • Who are the users, and what kinds of users are using Google Buzz?
  • How often have they used, and are they using, Google Buzz? What are their observations and perceptions of its performance?
  • What devices did they use to browse Google Buzz? Did the device have any influence on their perception of Google Buzz performance?
  • How long did they use Google Buzz at a stretch, i.e., without any break?
  • What environment did they have, or do they have, while using Google Buzz? Did it have any influence?
  • Are any tools used to know or understand the performance parameters of the application under test?
  • How is performance being measured? What are the units of measurement of performance in this context?
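One simple answer to the measurement question above is wall-clock timing of individual requests, summarized into familiar units (milliseconds). The sketch below is generic: `action` is a stand-in for a real Google Buzz operation such as posting a buzz or loading the feed, and the numbers are illustrative, not measured values:

```python
import time

def measure_latencies(action, runs=5):
    """Time an action repeatedly; return latencies in milliseconds."""
    latencies = []
    for _ in range(runs):
        start = time.perf_counter()
        action()  # stand-in for a real request to the application under test
        latencies.append((time.perf_counter() - start) * 1000.0)
    return latencies

def summarize(latencies):
    """Report min/median/max latency, common units for responsiveness."""
    ordered = sorted(latencies)
    return {
        "min_ms": ordered[0],
        "median_ms": ordered[len(ordered) // 2],
        "max_ms": ordered[-1],
    }

# Example with a dummy action that takes roughly 10 ms:
stats = summarize(measure_latencies(lambda: time.sleep(0.01)))
print(stats)
```

Reporting a spread (min/median/max or percentiles) rather than a single average matters, because responsiveness as perceived by users is dominated by the slow outliers.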

Brainstorming for Test Ideas:
  • Identify the kinds of users: their ages, businesses, purposes, environments, and consistency of using Google Buzz. These are a few to mention here; the list keeps growing as brainstorming and testing sessions go on.
  • Identify the possible and potential contexts within each of the environments mentioned above.
  • Use the analytics and log files of Google Buzz. The log files and analytical information regarding the performance of the application under test should include request processing times and throughput.
  • Test on various possible hardware and *software configurations and network topologies. Note: *software and hardware may be those installed on the device from which Google Buzz is browsed, and on the Google Buzz server.
  • Can Google Buzz be accessed from mobile phones? If yes, how can the device configuration and installed applications influence the time factor of Google Buzz's responsiveness?
  • Type of database Google Buzz makes use of.
  • Handling of incorrect or invalid entries. How quickly does the application respond to the user here?
  • How quickly does it cope if I keep buzzing with a very **small interval between two buzzes? Note: **small will be decided by the application's minimum and maximum tolerance values.
  • Will any other application using the internet or network influence Google Buzz's performance?
  • Study the features available or provided by Google Buzz. This might give more ideas on how performance tests can be devised.
  • Tests for observing how the performance of the application behaves when the number of parallel users is increased and decreased, when the number of transactions in a given time period is varied between the threshold and the least, and how long it takes to recover from these two extremes of variation, etc.
  • Performance of the application when its potential is pushed to the extreme. How long does it take to handle the requests from clients?
  • Type of server being used. Type of connections. When data stored by the application on the client is missing or only partially stored, how does the performance of the application behave?
  • How do other Google applications, used simultaneously alongside Google Buzz, eat into the time? Such Google application(s) need to be identified and tested along with Google Buzz.
  • Other large applications used over the internet alongside Google Buzz.
  • The maximum number of requests that can be handled at a given time (this includes various actions using various features of Google Buzz).
  • Usage of Google Buzz over a ***network that is not comfortable for using the application. Note: ***network refers to the properties of the network.
  • If any tool or program is used to look into the performance rating of the application, how much time is taken by the tool itself to do this? Does the time taken by the tool or program instructions have any influence?
  • How is the client's request sent to the server?
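The parallel-users idea above can be sketched with a thread pool: fire the same request from a varying number of simulated clients and compare throughput. The request function here is a local stand-in (a short sleep), not the actual Google Buzz API; client counts and timings are illustrative assumptions:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def simulated_request():
    """Stand-in for one client request; a real test would hit the server."""
    time.sleep(0.01)  # pretend the round trip takes about 10 ms
    return True

def run_load(users, requests_per_user):
    """Fire requests from `users` parallel clients; measure throughput."""
    total = users * requests_per_user
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(lambda _: simulated_request(), range(total)))
    elapsed = time.perf_counter() - start
    return {
        "ok": sum(results),                  # successful requests
        "elapsed_s": elapsed,                # total wall-clock time
        "throughput_rps": total / elapsed,   # requests per second
    }

# Compare throughput as the number of parallel users varies:
for users in (1, 5):
    print(users, run_load(users, requests_per_user=4))
```

Varying `users` up and down, and watching where throughput stops scaling, is one concrete way to locate the threshold and recovery behavior mentioned in the brainstorm.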

The test ideas I got during this session were generic to performance. I will use these test ideas to develop specific performance tests. Interaction with Dr. Meeta Prakash showed how these ideas can be turned into more specific performance test ideas. The discussion session transcript is here. The PDF document of my session report is here.