Wednesday, April 15, 2020

HTTP Status Codes and Error Codes: They are not the same!



In brief, the HTTP status code and the error code are two different things when it comes to APIs.  Most often, the HTTP status code is mistaken for the error code.  No, it is not!  Then why is it taken that way?  Probably because the 4xx and 5xx series describe client errors and server errors, and this has led people to assume the HTTP status code is the error code.  If you have been taking it that way, fine!  But don't do it from now on, and if you did, it is not right.

The 4xx series of HTTP status codes tells the user (that is, the client) about an error that originated from the client's input or interaction.  Likewise, the 5xx series tells the user about an error that occurred at the server end while processing the client's input or interaction.  For example, HTTP status code 404 in a server's response says the resource requested by the client was not found.  HTTP status code 500 says there was an error at the server end while processing the client's input or interaction.  For more details about HTTP status codes, refer here -- https://www.restapitutorial.com/httpstatuscodes.html
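
To make this distinction concrete, here is a rough client-side sketch in Python. It assumes the requests library and a made-up endpoint (https://api.example.com/books/42), and it only classifies a response by its status code family:

import requests  # assumed third-party dependency

# Hypothetical endpoint, used only for illustration.
response = requests.get("https://api.example.com/books/42")

if 400 <= response.status_code < 500:
    # 4xx: the problem came from the client's input or interaction,
    # e.g. 404 means the requested resource was not found.
    print("Client-side error:", response.status_code, response.reason)
elif response.status_code >= 500:
    # 5xx: the server failed while processing the client's input,
    # e.g. 500 means an internal error occurred at the server end.
    print("Server-side error:", response.status_code, response.reason)
else:
    print("Success:", response.status_code)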



Then what is an error code?

Say you are trying to authenticate yourself to a server in a request. The authentication fails and the server returns HTTP status code 401, which means unauthorized.  When the client receives this status code, it tells the user that the authentication failed.

This is not over yet.  Today's microservices are agile, scalable, and adaptive, and a well-implemented service can tell clearly what went wrong.  So can't a microservice tell why the authentication failed?  It can, if we implement that.  What exactly was incorrect during the authentication?  If this can be identified and stated precisely, it helps the user correct the mistake and attempt to authenticate again, right?  This is where the "error code" comes in handy!

For example, behind the HTTP status code 401, that is, unauthorized, there can be multiple reasons.  To mention a few -- incorrect user account, incorrect password, incorrect auth token, etc.  Now when the server responds with just the status code 401, will it help?  Yes, it will; but can we derive much more precise help?  Of course we can, and it is by defining an error code and its message for each such 401 failure.  Refer to the example below.

Incorrect User Account
  HTTP Status Code: 401
  Error Code: 1001
  Error Message: Invalid user account

Incorrect Password
  HTTP Status Code: 401
  Error Code: 1002
  Error Message: Incorrect password used to authenticate

Incorrect Auth Token
  HTTP Status Code: 401
  Error Code: 1003
  Error Message: Incorrect auth token used in authentication

If you observed the above, all the actions yield back a 401 response from the server.  But to tell precisely what happened, the services make use of the defined error codes and error messages.  When the client receives this agreed error code and message in the response, as a contract, it displays the appropriate message to the user.
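
To show how a service might define and return such error codes, here is a minimal server-side sketch using Flask. The /login route, the stub checks, and the code numbers simply mirror the table above; they are assumptions for illustration, not a definitive implementation:

from flask import Flask, jsonify, request

app = Flask(__name__)

# Error codes and messages agreed between client and server for 401 failures.
AUTH_ERRORS = {
    "unknown_account": (1001, "Invalid user account"),
    "bad_password":    (1002, "Incorrect password used to authenticate"),
    "bad_token":       (1003, "Incorrect auth token used in authentication"),
}

def unauthorized(reason):
    code, message = AUTH_ERRORS[reason]
    body = {"code": str(code), "status_code": 401,
            "header": "Unauthorised", "message": message}
    return jsonify(body), 401

def account_exists(user):
    # Stub for illustration; a real service would check a user store.
    return user == "demo"

def password_matches(user, password):
    # Stub for illustration only.
    return password == "secret"

@app.route("/login", methods=["POST"])
def login():
    data = request.get_json(silent=True) or {}
    if not account_exists(data.get("user")):
        return unauthorized("unknown_account")
    if not password_matches(data.get("user"), data.get("password")):
        return unauthorized("bad_password")
    return jsonify({"status": "ok"}), 200

With this in place, every authentication failure still comes back as 401, but the payload carries the precise error code and message the client has agreed to understand.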

Further, the client will be programmed with these error codes when processing responses from the microservices.  Based on the HTTP status code and the error code payload received, it acts accordingly.

Here is an example HTTP response with status code 401 and an error code payload:

HTTP/1.1 401 Unauthorized
Content-Type: application/json
Content-Length: 96
Connection: close
Date: Sat, 11 Apr 2012 15:04:31 GMT
Access-Control-Allow-Credentials: true
Access-Control-Allow-Methods: GET, POST, PUT, DELETE, HEAD, OPTIONS, PATCH
Access-Control-Allow-Origin: *
X-Request-Id: Req-87c-96fa-e65e6efcbcde
X-Trans-Id: abcdefghijklmnopqrstuvwxyzn0=
X-Transaction-Id: Txn-41cb7c71-b123-504f-c206-a52d651c

{"code":"1003","status_code":401,"header":"Unauthorised","message":"Invalid access token used"}

The client will look at the HTTP status code, both in the response and in the payload, along with the error code and message.
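
A rough sketch of that client-side handling in Python, again assuming the requests library, a made-up login endpoint, and the error codes defined in the table above:

import requests  # assumed third-party dependency

# User-facing messages keyed by the agreed error codes.
MESSAGES = {
    "1001": "We could not find that user account.",
    "1002": "The password you entered is incorrect.",
    "1003": "Your session token is invalid, please sign in again.",
}

# Hypothetical endpoint and credentials, for illustration only.
response = requests.post("https://api.example.com/login",
                         json={"user": "demo", "password": "wrong"})

if response.status_code == 401:
    error = response.json()  # the error code payload shown above
    # Fall back to the server's own message if the code is not recognized.
    print(MESSAGES.get(error.get("code"), error.get("message", "Authentication failed.")))
elif response.ok:
    print("Authenticated successfully.")
else:
    print("Unexpected response:", response.status_code)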



What tests can be done here?

Tests for all quality criteria can be done here; they need to be well thought out, modeled, and designed.  Also, an important test that is often missed here is the contract test between the client and the server for the defined error codes.
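
As a rough sketch of such a contract test, here is what it could look like with pytest and requests. The endpoint, the credentials, and the expected codes are assumptions carried over from the example above, not a definitive implementation:

import pytest
import requests

LOGIN_URL = "https://api.example.com/login"  # hypothetical endpoint

@pytest.mark.parametrize("payload, expected_code", [
    ({"user": "no-such-user", "password": "x"}, "1001"),
    ({"user": "demo", "password": "wrong"},     "1002"),
])
def test_401_error_code_contract(payload, expected_code):
    response = requests.post(LOGIN_URL, json=payload)
    assert response.status_code == 401

    body = response.json()
    # The contract: field names and code values agreed between client and server.
    assert set(body) >= {"code", "status_code", "header", "message"}
    assert body["code"] == expected_code
    assert body["status_code"] == 401

If the server team changes a code or renames a field without telling the client team, a test like this is what catches the broken contract.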


Friday, January 17, 2020

My Starting Days with BDD, Cucumber and Automation



Before I start to speak, I think about how to put my thought into a discussion mode. I take several approaches and strategies for this, i.e. to bring it up as a discussion rather than in an argumentative or debating tone. Yet there are setbacks at certain times, and I learn from them. I have not let that demotivate me; I have taken it as an opportunity to communicate better. The intention is to do the job well and right for the context and vision of the organization.

I once said, "I don't know, and this is what I know," to a team member in a meeting. When I reflected on it later that day, I could see that I had given up on something I had been holding on to for a long time. That is something I usually don't do, i.e. I don't give up. I stick to it and have patience, working on the task in silence, which looks odd to the people around me most of the time. The point here was not doing it perfectly or excellently, but doing the work in the right way. Figuring out the right way is not easy when we have a team with diverse skill sets and cultural practices.

Most of the time, it was me who wrote the feature file (the outcome of the collaboration in BDD practice), though I knew it should not be me writing it. I wanted to keep it at least at an average level, if not good, hence I rewrote it on review. Different teams practice it in different ways, and none knows which way is better.

Say there is a design approach for a product in programming; every programmer in the team follows it. But most of the time, this does not happen in testing and automation practice. When I sit back and ask myself, "why is it this way?", I wonder -- could it be that the team does not trust what I'm saying? Then whom do they trust? I don't have an answer for that either. I asked myself, "why are they so confused when we have a clear-cut solution approach?" I learned that this is where skill and expertise become distinct in the work.

During this time, I observed the work we were doing and analyzed it by staring at it to understand -- what are we doing, and are we doing it right? I believe that was one of the key jobs in my role. To mark where we were right and where we were not, I had to learn the subject, which was new to me as well. I took time here, thinking it would benefit the team and the organization. I asked myself, is it worth doing? Then who else would have done it? I had no answer for that within me.

In this context, the best thing for me to do was to let the testers do it, and if they saw a failure or difficulty later, they might reflect back on it. That was a hard call which I took. I failed! You see, that's how the failure tag gets put on a senior or a junior in the team, for doing or not doing a piece of work, when nothing goes as expected. Anyway, I failed!

What did I learn from this, then? It is okay to fail, and I take pride in it. What I got from this failure was incredible learning.

I'm glad that my friend Nikolay wrote about it here -- https://saucelabs.com/blog/is-bdd-automation-actually-killing-your-project

It is not just me who goes through this. There are other people too, as I saw on reading Nikolay's post. It is one of the good posts which I will refer to any tester who is practicing automation using Cucumber and the word BDD.

I also observed that at times even I give up and lose a bit of my patience, which I hardly remember doing in my career so far.

In the end, if the question is "Who is wrong here?", I don't see anyone being wrong. There comes a time when person A feels person B is wrong; then person A feels "I'm wrong and person B is right"; and then the time comes when person A sees that both person A and person B were right (or wrong).

It was the outcome that counted most that day, and the decision that was made for that outcome speaks out more than what is left unsaid. The team has to move on and continue doing what it is expected to do, in the right way for the context.