
Thursday, September 6, 2012

Ending The Automation Hackathon


We ended the automation hackathon on 31-08-2012 (last Friday). There were three main parts to our automation hackathon. In the first part we wrote tests for the QA-passed test cases, in the second part we did the patch automation, and in the last part we ported the old tests to the new framework. I completed all three parts two days before the end of the hackathon.

Problems occurred just as we put all the tests together. A lot of test cases failed due to uncleaned artifacts deployed by other tests, so we had to clean them up as best we could. We were asked not to fully clean the whole registry. It would be really easy to wipe the whole thing, but the correct thing to do is to clean only what we deployed ourselves. According to our team lead, the server should return to its initial state (the state before running the tests) after our tests run.

Even at the end of the hackathon, some errors remained in the migrated tests. We might have to work with the team lead for a few more days to get them corrected. I did the governance API testing, which includes a lot of test classes. I got some help from two of my friends to complete the test migration, as there were around 45 test classes with over 300 test cases. The hardest part of porting those tests was cleaning up the resources. The old tests use very low-level commands to clean the whole registry, but as we were asked to clean only the things we deployed ourselves, it was not that easy.
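The cleanup discipline we were asked to follow (remove only what the test itself deployed, and nothing else) can be sketched roughly like this. `ArtifactTracker` and the `Registry` interface are hypothetical names standing in for the real registry client, not the actual framework API:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch: record every artifact a test deploys, then remove only those,
// newest first, so the server returns to its pre-test state without
// wiping anything other tests or users put in the registry.
public class ArtifactTracker {
    // Hypothetical stand-in for the real registry client.
    public interface Registry {
        void delete(String path);
    }

    private final Deque<String> deployedPaths = new ArrayDeque<>();

    // Call this every time the test deploys something.
    public void recordDeployment(String path) {
        deployedPaths.push(path);
    }

    // Call this from the teardown (e.g. @AfterClass): undo only our own
    // deployments, in reverse order, instead of cleaning the whole registry.
    public void cleanUp(Registry registry) {
        while (!deployedPaths.isEmpty()) {
            registry.delete(deployedPaths.pop());
        }
    }
}
```

Deleting in reverse order matters when later artifacts depend on earlier ones (e.g. a resource inside a collection the test also created).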

With only the new tests (without any ported tests), 1100+ test cases in all, we achieved over 45% line coverage and over 60% class coverage. After all, testing is not about getting to 100% coverage. No matter what the percentage is, our goal should be to deliver a product with the least number of failures. If the product is failing, the percentage will not help. I think we did a great job towards that goal.

Tuesday, August 14, 2012

Things you should remember when automating tests (best practices)


Less Hard-coding
We are always tempted to hardcode things; it makes life easy. But it can make maintaining that code a big issue. I had to change several places where I had hardcoded things, for different reasons.

Dates and times
When working with dates and times, you might forget that those things change. I made this mistake once in my testing. I was asked to test the service-locking functionality: we can lock a service for a specific time. I just hardcoded the dates and times and did the testing. Soon I realized that if we run those tests in the future, they are going to fail. You might think that was foolish of me, but believe me, we miss those little things while coding.
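One way to avoid that stale-date trap is to compute the lock time relative to "now" instead of hardcoding it. A minimal sketch; the `lockUntil` helper and its date format are assumptions for illustration, not the real service API:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Sketch: derive the lock expiry from the current time so the test
// never goes stale, instead of hardcoding a literal date string.
public class LockDates {
    private static final DateTimeFormatter FORMAT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    // Bad: a literal like "2012-09-30 00:00:00" starts failing once that
    // date passes. Good: always one hour ahead of whenever the test runs.
    public static String lockUntil(LocalDateTime now) {
        return now.plusHours(1).format(FORMAT);
    }
}
```

Passing `now` in as a parameter (rather than calling `LocalDateTime.now()` inside) also keeps the helper itself testable with a fixed clock.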

Use of Constants
If you really need to hardcode something, do it through a constant at the beginning of the code. Then if it needs to change, it is easy for someone else to find and change it.
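A minimal sketch of the idea; the constant names and values below are illustrative, not taken from the real test suite:

```java
// Sketch: keep the few unavoidable hardcoded values as named constants
// at the top of the class, so there is exactly one place to change them.
public class RegistryTestConstants {
    public static final String SERVER_URL = "https://localhost:9443/services/";
    public static final String TEST_RESOURCE_PATH = "/_system/governance/trunk/test";
    public static final int REQUEST_TIMEOUT_SECONDS = 30;

    // Every test builds URLs through this helper instead of repeating
    // the literals, so a server move touches only the constants above.
    public static String resourceUrl(String name) {
        return SERVER_URL + "resource" + TEST_RESOURCE_PATH + "/" + name;
    }
}
```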

Do not leave things behind
When you create objects for testing, or make any configuration changes to the product for testing, make sure you undo them before you leave. Those objects and changes can break other tests run by other users. You can use @AfterClass for that purpose.
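The setup/teardown shape looks roughly like this. `ServiceClient` is a hypothetical stand-in for the real admin client; in TestNG the `setUp` and `tearDown` methods below would carry `@BeforeClass` and `@AfterClass` (shown here as comments to keep the sketch self-contained):

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of the "undo what you created" discipline from the post.
public class ServiceCleanupTest {
    // Hypothetical in-memory stand-in for the real service admin client.
    static class ServiceClient {
        private final Set<String> services = new HashSet<>();
        void deploy(String name) { services.add(name); }
        void undeploy(String name) { services.remove(name); }
        boolean exists(String name) { return services.contains(name); }
    }

    private final ServiceClient client = new ServiceClient();

    // @BeforeClass in TestNG: deploy everything the tests in this class need.
    public void setUp() {
        client.deploy("SampleService");
    }

    // @AfterClass in TestNG: undo every change this class made, so other
    // tests (and other users) see the server in its original state.
    public void tearDown() {
        client.undeploy("SampleService");
    }

    public boolean sampleDeployed() {
        return client.exists("SampleService");
    }
}
```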

Avoid Test anti-patterns
Just search for test anti-patterns and you will find a lot of them that you should avoid, or at least try to avoid.

Coverage Vs Stable Product
When we are testing, we normally get blinded by coverage. But our goal should always be to build a stable product. No matter what the test coverage is, if our product is failing, all the hard work is for nothing. This was told to me by our team lead, Krishantha ayya.

Clarity Framework


Based on the training session and the slide-set provided by Dharshana and Krishantha.

Clarity is an easy and optimized way to do integration testing. It can automate platform-wide scenarios, execute tests against Stratos and private clouds, adopt tooling support for automation (Selenium, SoapUI, JMeter, and Ravana; WSO2 Ravana is an open-source benchmark test framework that aims to facilitate performance benchmarks of different servers), and do reporting while keeping historical records.

Architecture

Works similarly to TestNG.


  • (Diagram: component-level view of the architecture.)


