Software Testing Practices, Test Methodologies and Test Competency Services


Welcome to the world of Software Testing & Best Test Practices.

All you wanted to know about Software Testing, Test Practices, Test Methodologies and building the Test Competency Services !!


Tuesday, February 21, 2012

The Why, What and How of Software Test Automation


Test automation is the use of software to control the execution of tests, the comparison of actual outcomes to predicted outcomes, the setting up of test preconditions, and other test control and test reporting functions. Commonly, test automation involves automating a manual process already in place that uses a formalized testing process.
While it may sound like I am writing basic stuff that most of you probably know, it is amazing how much discussion goes around, and how many conflicting views get expressed, once automation is embraced within a project or organization. In one of my bitterest experiences, nobody in my team knew what the automation team was doing, and vice versa. So here are some thoughts drawn not just from what I have experienced but also from what I have read since I started testing.

 
Why Test Automation
  • Consistency: The test scripts perform the same checks every time they are run. A manual test is subject to human error and may tend to skip areas believed to be stable.
  • Reusability: Being able to re-use tests repeatedly over a period of time improves productivity and gets work done in parallel with less manual intervention.
  • Better Quality: Since the scripts can be run for even the slightest change, the impact of ongoing changes is tested thoroughly and quickly, helping ensure decent quality of the product or project.
  • Low Running Cost: While the cost of building the automation may be considerable compared to manual testing costs, over a period of time the running cost of test execution reduces significantly.
  • Speed: A script executes many times faster than a manual test, giving you a full report on the quality of your product in a few minutes.
  • Formal: No more "testing-by-ear". A code coverage tool (if you wish to use one) can tell you how much of the code is exercised, and the test scripts can then tell you whether what you exercised works fine. Bear in mind that coverage measures what is run, not what is guaranteed to be correct.
  • Compactness: You can perform a full compatibility check by simply copying the application together with the test scripts onto all the platforms where you believe it should work, confirming that all functionality indeed behaves as expected. Maybe in the not-so-distant future all applications will come with a "minimal self-test script" so that the user can be confident that his/her installation works as expected.
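The consistency and reusability points above show up naturally in a table-driven test: one reusable script performs exactly the same checks on every run, over as many inputs as you care to list. A sketch, again with a made-up function under test:

```python
import unittest

# Hypothetical function under test for illustration.
def normalize_username(name):
    return name.strip().lower()

class NormalizeTest(unittest.TestCase):
    # The data table is the reusable part; adding a case costs one line,
    # and every case is checked identically on every run.
    CASES = [
        ("  Alice ", "alice"),
        ("BOB", "bob"),
        ("carol", "carol"),
    ]

    def test_all_cases(self):
        for raw, expected in self.CASES:
            with self.subTest(raw=raw):
                self.assertEqual(normalize_username(raw), expected)

if __name__ == "__main__":
    unittest.main(exit=False)
```

A manual tester might skip the "obvious" third case on a tired Friday; the script never will.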

Testing Under Pressure
Development teams have been blessed with wonderful GUI tools that assist in producing working code quickly, thereby improving their productivity twofold.
No similar effort has gone into enabling testers to increase their productivity. While automation tools exist, they demand good technical expertise, in effect expecting the tester to be on par with developers. The end result is a huge load on the test team to

  • Test more functionality in less time
  • Choose automation for application testing
  • Build scripted tests that can be re-used, despite the potential pitfalls when the system changes
  • Acquire programming skills on top of testing abilities
  • Cope with crunched timescales as iterative delivery gains prominence
  • Take a hands-on approach to understanding the product, rather than relying on product documentation, within those tight timescales

What "To" and "Not To" Automate

While I was tempted to simply say Environment Setup, Build and Installation, Test Data Creation, GUI interaction and the like, that did not seem a wise attempt, so I did some reading and could gather the following.

What To
  • Anything repeatable in nature that you have to test anyway, whether or not it raises many bugs
  • Anything that adds value for a tester, reduces his/her test cycle time and increases efficiency
  • Anything routine in nature that needs no change, for example configuring and setting up environments, or creating test data
  • Anything that needs an element of extrapolation to find out system behaviours, such as performance testing
  • Combinatorial tests for complex scenarios that are difficult or time-consuming to do through manual testing
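The "routine, needs no change" category above, environment setup and test data creation, is exactly what test fixtures automate. A sketch using Python's stdlib sqlite3 as a stand-in environment (the schema and the `count_active_users` query are invented for illustration):

```python
import sqlite3
import unittest

# Hypothetical query under test.
def count_active_users(conn):
    return conn.execute("SELECT COUNT(*) FROM users WHERE active = 1").fetchone()[0]

class ActiveUserTest(unittest.TestCase):
    def setUp(self):
        # Routine environment setup and test data creation, scripted once
        # instead of being repeated by hand before every test run.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE users (name TEXT, active INTEGER)")
        self.conn.executemany(
            "INSERT INTO users VALUES (?, ?)",
            [("alice", 1), ("bob", 0), ("carol", 1)],
        )

    def tearDown(self):
        self.conn.close()  # leave the environment clean for the next test

    def test_active_count(self):
        self.assertEqual(count_active_users(self.conn), 2)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Every test starts from a known, freshly built state, which is precisely the consistency a manual setup struggles to guarantee.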
What Not To
  • Tests of behaviour, and to some extent end-to-end tests, are risky to automate, although I myself was not fully convinced writing this, since automating end-to-end scenarios is where much of the value-add comes from. So it works and it doesn't, depending on the nature of the project.
  • Anything that takes about the same time as manual testing, or that will die before it delivers some or all of its expected value
  • Tests where it is difficult to predict the outcomes are best done manually
  • Exploratory tests, I guess, are hard to automate, although I am a bit unclear on whether exploratory testing fits with automation at all. Maybe a good topic to write about sometime soon. Feel free to feed back your comments on this if possible.

Guidelines To Test Automation

It's imperative that test automation is implemented according to certain guidelines, with a view to ensuring it can cope with the changes to workflow, requirements and infrastructure that come in every now and then.

Define Clear Goals and Automation Strategy
Having a clear vision and goals for what you want to achieve applies to any work a team takes on, and automation is no exception. As I have seen many times within my teams, having such goals is only half the challenge; the bigger half is ensuring the whole team is on common ground and understands the goals we are working towards.

Differentiate "What" to and "How" to Automate
It is advisable to have a clear separation between the people writing the automated test scripts and the people writing the framework that handles those scripts. Specific focus has to be given to how the framework is written, for future scalability and re-use.
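One lightweight way to picture that separation: framework code owns the "how" (driving the application), while test scripts state only the "what". The sketch below is entirely hypothetical; in a real project a browser driver such as Selenium would sit behind the page object, and here a fake driver stands in so the example runs on its own.

```python
# --- framework layer (the "how"): owned by the framework authors ------------
class LoginPage:
    # Hypothetical page object; scripts never touch raw locators or the
    # driver directly, so UI changes are absorbed here in one place.
    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type("username", user)
        self.driver.type("password", password)
        self.driver.click("submit")
        return self.driver.current_page()

# Stand-in driver so the sketch runs without a real browser.
class FakeDriver:
    def __init__(self):
        self.fields = {}

    def type(self, field, value):
        self.fields[field] = value

    def click(self, element):
        pass  # a real driver would interact with the UI here

    def current_page(self):
        return "dashboard" if self.fields.get("username") == "alice" else "login"

# --- script layer (the "what"): readable, locator-free test steps -----------
def test_valid_login():
    page = LoginPage(FakeDriver())
    assert page.login("alice", "secret") == "dashboard"

test_valid_login()
```

When the GUI changes, only the framework layer is edited; the scripts, which read like the test cases themselves, stay stable.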

Agree on Automation Framework and Interfaces
Agreeing on which interfaces should and should not be automated is as essential as defining and agreeing an automation framework. Every project exhibits its own set of characteristics, some of which will work from an automation perspective and some of which will not. It is essential that some thought is given to defining and agreeing such characteristics and interfaces for automation.

Standards and Conventions
Automation scripts become stale very easily, at times even within a week, due to changing requirements across the GUI, business rules, source code, environments and more. The only way to stay in control of such changes is to have a defined and agreed set of standards and conventions, ensuring the framework is clearly understood and any member of the team can change the framework or scripts easily.

Follow A Software Development Process and An Iterative One
Maintainability, readability, re-use, reliability, scalability and effectiveness are key for automation test scripts. These would be difficult to achieve in the long run if the team just jumped in and produced test scripts without proper planning or design. An automation framework covers not just the working code but also has to factor in environments, operating systems, multiple channels (web, mobile, tablet and others) and much more. The only way to achieve this is to explicitly follow a development process and develop iteratively, with a good amount of collaboration across the teams.

Align With SCM and Continuous Delivery
The key reason to align with SCM and Continuous Delivery is to have a working set of automation scripts on a regular basis. Feedback loops are quite critical for analyzing the changes going into a release, however big or small, and for quickly changing the automation scripts to keep them current. That is when the maximum ROI can be realized. From what I have heard and seen, at least 55% of automation strategy failures come down to not being able to keep the strategy and scripts "current".


Believe In Throw Away Automation

Automated tests produce their value after the source code changes, and every team must accept that, like any other living or non-living thing, an automated test has a death, at the latest when the application changes. That calls for either throwing the test away completely or fixing it so it lives some more days. Whichever option you choose, if the test has not repaid the automation effort by that point, you would have been better off leaving it as a manual test. This is one of the key factors, or assumptions, that needs to be well understood as part of the automation strategy, and it is up to the individual project to estimate whether that point comes within a week or within a few years. Investing in automation and realizing time and cost value from it is meant to be a cycle, and "throw away" is in a way good, keeping that cycle moving without stopping.
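The "repaid the automation effort" point above can be estimated with simple break-even arithmetic. The numbers below are entirely made up for illustration: a script that costs 10 hours to automate, saves 30 minutes of manual effort per run, and needs 5 minutes of maintenance per run pays for itself after 24 runs; if it dies before that, the manual test would have been cheaper.

```python
def break_even_runs(automation_cost_hrs, manual_mins_per_run, maintenance_mins_per_run):
    # Runs needed before the script has repaid its automation effort.
    net_saving = manual_mins_per_run - maintenance_mins_per_run
    if net_saving <= 0:
        return None  # never pays back; a candidate to throw away immediately
    return (automation_cost_hrs * 60) / net_saving

# Made-up figures: 10 hours to build, saves 30 min/run, costs 5 min/run upkeep.
print(break_even_runs(10, 30, 5))  # → 24.0
```

Whether 24 runs happen in a week or in a few years is exactly the per-project judgment the strategy has to make.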

How Do You Measure Value From Automation

While this happens to be one of the subjects I see most often ignored when implementing test automation in projects, I would measure it in the following ways:
  • The number of manual tests it saves you from running, weighed against the bugs those skipped manual tests would therefore cause you to miss
  • The amount of complexity and time the automated tests reduce on a periodic basis before their death
  • The flexibility it gives you to vary your sequencing of testing as required, and how it simplifies testing integrated systems
  • ROI in terms of pence and pounds: (Automation Gains - Automation Investment Costs) / (Automation Investment Costs)
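That last bullet is a one-line calculation. The figures below are invented purely to show the formula in action:

```python
def automation_roi(gains, investment):
    # (Automation Gains - Automation Investment Costs) / Automation Investment Costs
    return (gains - investment) / investment

# Made-up figures in pounds: £18,000 of execution effort saved over the period,
# £12,000 spent on tooling, framework and script development.
print(automation_roi(18000, 12000))  # → 0.5, i.e. a 50% return
```

A negative result means the automation has not yet repaid itself, which ties back to the throw-away point above.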

