Notes on Designing Automated Test Cases

Lucent Technologies and Merrill Lynch

Background

These notes describe, step by step, how to design automated test cases using the QA Partner automated test tool for client-server applications written in C++ for Windows 95 and in Visual Basic for Windows NT.


Designing Test Cases Using QA Partner (High-Level)

Review Tier 4 Functional Requirements and/or work through an existing prototype.

  1. Identify features / requirements and validation criteria (including data).
  2. For a particular feature, identify all possible test combinations.
  3. Document the test case combinations in a Word table or equivalent.
  4. Document the test cases and expected results (validation) for each test case in the Buster test management system.
  5. Using the Buster test case description, record the framework for one test condition in QA Partner.
  6. Identify the data items and replace them with variables (see the sketch after this list).
  7. Dry run your test case to prove it works correctly. Force test failures by altering expected results.
  8. Archive the test case in Buster.
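
To make steps 5 through 7 concrete, here is a before-and-after 4Test sketch of one recorded condition. Every name in it (CustomerApp, NewCustomer, the fields, and the data values) is a hypothetical stand-in for whatever QA Partner records against your own application; the parameterized version at the bottom is what replaces the recorded one.

// Step 5: the recorded framework for one test condition, literals embedded
testcase AddCustomer () appstate DefaultBaseState
    CustomerApp.Customer.New.Pick ()
    NewCustomer.Name.SetText ("Jane Smith")     // data item
    NewCustomer.Region.Select ("Northeast")     // data item
    NewCustomer.OK.Click ()
    Verify (CustomerApp.LastAction.GetText (), "Customer added")

// Step 6: the same condition with the data items replaced by variables,
// so a test plan can drive it with different combinations
testcase AddCustomer (STRING sName, STRING sRegion, STRING sExpected) appstate DefaultBaseState
    CustomerApp.Customer.New.Pick ()
    NewCustomer.Name.SetText (sName)
    NewCustomer.Region.Select (sRegion)
    NewCustomer.OK.Click ()
    Verify (CustomerApp.LastAction.GetText (), sExpected)

For step 7, dry-run the parameterized case and then pass it a deliberately wrong sExpected: Verify should raise an exception and log a failure, which proves the validation actually fires.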

Designing Test Cases Using QA Partner (Detailed)

Non Data-driven Test Case

  1. Define the QAP Environment, i.e., configure and load an Option File.
  2. Create a new Include file for your application.
  3. Declare your application’s windows in the include file.
  4. Convert implicit child windows into explicit window declarations as necessary, and rename the window objects (see the declaration sketch after this list).
  5. Compile the Include file.
  6. Record Window Identifiers using your renamed window objects. Place the cursor on each object within the application. Verify that each object is recognized by its declared name from the include file.
  7. Record window declarations for all dialog boxes, list boxes, combo boxes, etc.
  8. Repeat Step #6.
  9. Create a new Test Script file for your application’s test cases.
  10. Record and/or code a test scenario. Run and debug the test as necessary.
  11. Induce faults in the script and verify the script’s error handling procedures.
  12. Repeat steps 10 and 11 as necessary until the test case is completed.
  13. Run the completed script as a final checkout pass.
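
Steps 2 through 6 produce an include file along the following lines. This is a sketch only: the window names, classes, and tag strings (CustomerApp, NewCustomer, and so on) are hypothetical examples of what QA Partner records and what you might rename the objects to.

// application.inc -- window declarations (steps 3 and 4)
window MainWin CustomerApp
    tag "Customer Tracker*"

    Menu Customer
        tag "Customer"
        MenuItem New
            tag "New..."

    StaticText LastAction        // status line, tagged by index
        tag "#1"

window DialogBox NewCustomer
    tag "New Customer"
    parent CustomerApp
    TextField Name
        tag "Name:"
    ComboBox Region
        tag "Region:"
    PushButton OK
        tag "OK"
    PushButton Cancel
        tag "Cancel"

Once the include file compiles (step 5), Record / Window Identifiers should report each object by these renamed identifiers rather than by its raw tag; any object that still comes back unrecognized needs its declaration fixed before you record scripts against it (step 6).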

Data-driven Test Case

  1. Re-code the test case to accept and process variables.
  2. Define and code a data structure for the variables identified in step 1 and for validation criteria.
  3. Define your test case data as a 4Test list of records in an include file (a sketch follows this list).
  4. Run and debug the test case using data parameters (follow script debugging steps defined in the Non-data driven sections above).
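
A minimal sketch of the coding steps, reusing the hypothetical CustomerApp and NewCustomer declarations from the previous section; the CUSTDATA record and the data values are likewise invented for illustration.

// Step 2: a record for the variables plus the validation criteria
type CUSTDATA is record
    STRING sName
    STRING sRegion
    STRING sExpected             // expected result for this combination

// Step 3: the test data, declared as a list in the include file
LIST OF CUSTDATA lcdCustomers = {...}
    {"Jane Smith", "Northeast", "Customer added"}
    {"", "Northeast", "Name is required"}       // negative combination

// Step 1: the test case re-coded to accept and process the variables
testcase AddCustomerDriven () appstate DefaultBaseState
    CUSTDATA cd
    for each cd in lcdCustomers
        CustomerApp.Customer.New.Pick ()
        NewCustomer.Name.SetText (cd.sName)
        NewCustomer.Region.Select (cd.sRegion)
        NewCustomer.OK.Click ()
        Verify (CustomerApp.LastAction.GetText (), cd.sExpected)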

Optional: set up the test case to run as a distributed test.
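
A sketch of what the distributed option can look like in 4Test, assuming QA Partner agents are running on the target machines; the machine names, and the reuse of the AddCustomerDriven test case from above, are hypothetical.

// run the same data-driven test on two machines in parallel
main ()
    spawn
        SetMachine (Connect ("labpc1"))
        AddCustomerDriven ()
    spawn
        SetMachine (Connect ("labpc2"))
        AddCustomerDriven ()
    rendezvous       // wait for both spawned threads to finish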

QAP Terminology

  • Test Plan (.pln): a file generated by QA Planner. Represents an outline of your application and can be used to execute test cases by passing them data parameters.
  • Test Suite (.s): used to execute one or more test scripts.
  • Test Script (.t): contains the source code for your test cases.
  • Test Case: the QAP designation for a function declared with the testcase keyword.
  • Include File (.inc): contains code for appstates, functions, and window declarations.
  • Test Frame: a special form of include file that contains the code for the application's main window, including the code that supports the QAP recovery system (a minimal sketch follows this list).
  • Option File (.opt): a file that stores your QA Partner test environment settings.
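
As an illustration of the test frame entry (and of the appstate keyword mentioned under include files), here is a minimal sketch of what a test frame might contain. The wMainWindow and sCmdLine constants are the conventional hooks the QAP recovery system uses to find, and if necessary restart, the application; the application names and paths are hypothetical.

// tgaframe.inc -- minimal test frame sketch
const wMainWindow = CustomerApp      // tells the recovery system which
                                     // declaration is the app's main window

window MainWin CustomerApp
    tag "Customer Tracker*"
    const sCmdLine = "c:\custapp\custapp.exe"   // lets QAP restart the app

appstate CustomerBase () basedon DefaultBaseState
    // extend the built-in base state, e.g. dismiss a logon dialog
    // (Logon would be declared elsewhere in the frame)
    if Logon.Exists ()
        Logon.Cancel.Click ()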

The following diagram shows the relationships between the objects referenced above:

qapfuncs.inc
    ^
    |
tgaframe.inc
    ^
    |
application.inc
    window DialogBox YourApp
        ...
    YourFunctions (...)
        ...
    YourAppstates
        ...
    ^
    |
application.t
    testcase YourTest1 ()
    testcase YourTest2 ()
        ...
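
In code, each upward arrow in the diagram corresponds to a 4Test use statement. Assuming the file names shown, the wiring is:

// at the top of tgaframe.inc:
use "qapfuncs.inc"

// at the top of application.inc:
use "tgaframe.inc"

// at the top of application.t:
use "application.inc"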

Creating Test Cases

Non Data-driven Test Case

  1. Define the QAP Environment, i.e., configure and load an Option File.
  2. Create a new Include file for your application.
  3. Declare your application’s windows in the include file.
  4. Fix the window parenting structure as necessary, and rename the window objects.
  5. Compile the Include file.
  6. Record Window Identifiers using your renamed window objects. Place the cursor on each object within the application. Verify that each object is recognized by its declared name from the include file.
  7. Record window declarations for all dialog boxes, list boxes, drop-downs, etc.
  8. Repeat Step #6.
  9. Create a new Test Script file for your application’s test cases.
  10. Record a test scenario. Run and debug the test as necessary.
  11. Induce faults in the script and verify your error handling (see the sketch after this list).
  12. Repeat steps 10 and 11 as necessary until the test case is completed.
  13. Run the completed script as a final checkout pass.
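
For step 11, a fault-handling sketch: the do...except construct catches the exception Verify raises, logs it, and lets the script clean up so the next test case starts from a known state. Names are hypothetical, as in the earlier sketches.

testcase AddCustomerChecked () appstate DefaultBaseState
    CustomerApp.Customer.New.Pick ()
    NewCustomer.Name.SetText ("Jane Smith")
    do
        // induce a fault by deliberately mangling the expected value
        Verify (NewCustomer.Name.GetText (), "WRONG on purpose")
    except
        ExceptLog ()                // record the failure in the results file
        NewCustomer.Cancel.Click () // local cleanup: dismiss the dialog
        reraise                     // still report the test case as failed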

Data-driven Test Case

  1. Define and code a data structure for your test case data.
  2. Re-code the test case in the form of a function that will accept your test case data as parameters.
  3. Define your test case data as a 4Test list of records in main() (see the sketch after this list).
  4. Run and debug the test case using data parameters.
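
This variant keeps the data in main () rather than in an include file. A sketch, reusing the hypothetical CUSTDATA record and window names from the earlier data-driven example:

// Step 2: the test case re-coded as a function taking the data as a parameter
AddCustomer (CUSTDATA cd)
    CustomerApp.Customer.New.Pick ()
    NewCustomer.Name.SetText (cd.sName)
    NewCustomer.Region.Select (cd.sRegion)
    NewCustomer.OK.Click ()
    Verify (CustomerApp.LastAction.GetText (), cd.sExpected)

// Step 3: the data defined as a list in main (), which drives the function
main ()
    CUSTDATA cd
    LIST OF CUSTDATA lcdCustomers = {...}
        {"Jane Smith", "Northeast", "Customer added"}
        {"", "Northeast", "Name is required"}

    for each cd in lcdCustomers
        AddCustomer (cd)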

Optional: set up the test case to run as a distributed test (see the sketch following the first data-driven section).
