Wednesday 24 December 2014

Automation Checklist

Checklist for Automation Approach

For each item, record Y/N and comments.

1. Have the test cases been identified for automation?
2. Is the data flow prepared for effective reusability of test scripts? (The main objectives of this approach are to increase code reuse, reduce code redundancy, and ease maintenance.)
3. Is a Requirements Traceability Matrix prepared for mapping each test case to a script?
4. Is the necessary test data for automation available?
5. Are all the common functions identified?
6. Is the design document available for scripting?
7. Are the folder names defined for the automation scripts?
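
Items 2, 4, and 5 above fit together: common functions plus externalized test data are what make scripts reusable. A minimal Python sketch of the idea (all names here are illustrative, not from any specific tool; in practice the data rows would come from a spreadsheet or CSV data sheet):

```python
# Hypothetical sketch: one common function shared by all test scripts,
# driven by external test data rows (in-line here for brevity).

def login(user, password):
    """Common function: reused by every script instead of re-coding login."""
    # ...would drive the application under test here; stubbed for the sketch...
    return user == "admin" and password == "secret"

def run_data_driven(test_data):
    """Run the same script logic once per data row: reuse, no redundancy."""
    results = []
    for row in test_data:
        ok = login(row["user"], row["password"])
        results.append((row["case"], "PASS" if ok == row["expected"] else "FAIL"))
    return results

test_data = [
    {"case": "TC01", "user": "admin", "password": "secret", "expected": True},
    {"case": "TC02", "user": "admin", "password": "wrong",  "expected": False},
]

for case, verdict in run_data_driven(test_data):
    print(case, verdict)
```

Because the script logic lives in one place, maintenance (item 2's objective) means editing one function rather than every script that logs in.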




Checklist for Automation Methodology

For each item, record Y/N and comments.

1. Is an identical start and end state defined for each test script?
2. Are the preconditions documented for each test script?
3. Are database tables defined with proper naming conventions?
4. Is the global GUI file defined?
5. Are all checkpoint results captured and validated?
6. Does every script have appropriate verification points?
7. Are verification points for key intermediate functional results or states inserted between steps, and at the end of the script, to verify the expected results?
8. Do the verification points also check for every error condition possible from the UI?
9. On execution of each test script, is the test log used to verify which verification points passed and which did not?
10. Are the steps that are not automated documented, with reasons?
11. Are exception handling and recovery scenarios in place for all expected and unexpected failures during script execution?
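
Items 6 through 11 describe one mechanism: verification points log pass/fail to the test log, and exception handling with a recovery routine lets the script continue past a failed step. A hedged Python sketch of that pattern (the function and step names are invented for illustration):

```python
# Hypothetical sketch: each step is wrapped so that verification points are
# logged, unexpected failures get a meaningful log entry, and a recovery
# routine restores the known start state so execution can continue.

test_log = []  # stands in for the automation tool's test log

def verification_point(name, actual, expected):
    """Verification point: compare actual vs expected and log the result."""
    status = "PASS" if actual == expected else "FAIL"
    test_log.append((name, status))
    return status == "PASS"

def recover():
    """Recovery scenario: return the application to the known start state."""
    test_log.append(("recovery", "DONE"))

def run_step(name, step):
    """Run one step; on any unexpected failure, log it and recover."""
    try:
        step()
    except Exception as exc:
        test_log.append((name, f"FAIL: {exc}"))
        recover()  # the script proceeds to the next step instead of stopping

def broken_step():
    raise RuntimeError("object not found")

run_step("open_order_form", lambda: verification_point("form_title", "Orders", "Orders"))
run_step("broken_step", broken_step)
run_step("close_form", lambda: verification_point("form_closed", True, True))

for entry in test_log:
    print(entry)
```

After the run, the test log shows which verification points passed, which step failed and why, and that recovery ran (item 9's review step).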





Checklist for Scripting Standards

For each item, record Y/N and comments.

1. Do the GUI object names used in the GUI map and the object names used in business functions follow the exact same naming conventions?
2. Are reference documents, such as conventions and standards, available for review?
3. Is the functionality of the script fully covered?
4. Does the script contain appropriate headers?
5. Is adequate inline documentation available?
6. Are meaningful naming conventions followed?
7. Has the code been adequately indented?
8. Is a common set of routines written, instead of replicating the same code across modules?
9. Are adequate procedures in place for the test script to proceed to the next step or iteration on failure, without execution stopping?
10. Are the error messages adequate?
11. Is a meaningful message written to the test report when a step fails?
12. Will the execution-time requirement be met?
13. Is the name of the script as per the “Naming Convention” given in the Scripting Standard document?
14. Is the header of the script as per the Scripting Standard document?
15. Is the version of the test script proper? Before the initial review the version should be ‘Draft’, and it should be incremented subsequently.
16. Are local variables used within functions wherever applicable?
17. Does the naming of variables, constants, and functions follow the naming convention document for scripts?
18. Are proper comments added at the beginning and end of loops where multiple loops are present?
19. Is the script free of unused variables?
20. Is the script free of unnecessary ‘pause’ and ‘printf’ statements?
21. Is the script free of syntax errors?
22. Is the modification history maintained properly?
23. Do the functions called inside the main script for entering data cover the entire screen flow for the corresponding product?
24. Is the tl_step function used only to report a problem/bug/pass result to the test result report?
25. Are script readability, maintainability, and consistency ensured?
26. Is the database connection session closed in the script?
27. Are the previous GUI files unloaded?
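
Items 24, 26, and 27 concern result reporting and cleanup. The tl_step function referenced in item 24 is WinRunner TSL, so this is a hedged Python analogue rather than the real API: a tl_step-style reporter used only for pass/fail results, with a try/finally block so the database session is closed and the GUI file is released even if a step fails. The names report, db, and gui_loaded are illustrative:

```python
# Hypothetical Python analogue of items 24, 26 and 27. In WinRunner TSL,
# tl_step(step_name, status, description) reports to the test results
# (status 0 = pass); here a list stands in for that report.

import sqlite3

report = []

def tl_step(step_name, status, description):
    """Report a pass/fail result to the test report (tl_step-style)."""
    report.append((step_name, "PASS" if status == 0 else "FAIL", description))

db = sqlite3.connect(":memory:")  # stands in for the application database
gui_loaded = True                 # stands in for a loaded GUI map file
try:
    value = db.execute("SELECT 1").fetchone()[0]
    tl_step("db_check", 0 if value == 1 else 1, "database reachable")
finally:
    db.close()          # item 26: close the database connection session
    gui_loaded = False  # item 27: unload the previously loaded GUI file

print(report)
```

Putting the cleanup in finally means a reviewer can answer items 26 and 27 with “yes” without tracing every failure path through the script.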

