Before performing the main testing configuration, you should review the settings in the enumeration topics and make any needed changes. The values you specify in these topics appear in the selection lists of the main testing configuration topic.
The Test Results topic enables you to specify a set of values (called an "enumeration") that define the different outcomes or results of testing. For example, you specify whether the result of a successful Test Run execution should be labeled "Passed", "Succeeded", or something else. You should define a value for every possible result of your test execution process. An example of these values might be:
Passed (result value for tests which succeed)
Failed (result value for tests which fail)
Error (result value for tests which succeed but with errors)
Blocked (result value for tests which could not be run for some reason)
You can specify an existing icon for any value, or upload a custom icon by clicking the button on the relevant row.
The Create Defect column, if checked, means that a Work Item representing a defect will be created whenever an execution of a Test Run terminates with the result defined on that row. For example, you would probably check this option for a Test Result representing test failure.
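The result enumeration and the Create Defect option described above can be modeled conceptually as follows. This is an illustrative sketch only, not the tool's actual configuration format or API; all names here (`TestResult`, `create_defect`, `finish_execution`, and the result identifiers) are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TestResult:
    """One value of the Test Results enumeration (hypothetical model)."""
    result_id: str       # internal identifier, e.g. "failed"
    label: str           # display label, e.g. "Failed"
    create_defect: bool  # if True, a defect Work Item is created on this result

# Example enumeration mirroring the values listed above.
RESULTS = {
    "passed":  TestResult("passed",  "Passed",  create_defect=False),
    "failed":  TestResult("failed",  "Failed",  create_defect=True),
    "error":   TestResult("error",   "Error",   create_defect=True),
    "blocked": TestResult("blocked", "Blocked", create_defect=False),
}

def finish_execution(result_id: str) -> bool:
    """Return True if a defect Work Item should be created for this outcome."""
    return RESULTS[result_id].create_defect
```

In this model, checking the Create Defect box for the "Failed" and "Error" rows corresponds to setting their `create_defect` flags to `True`.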
You can add a new Test Result value by filling in the empty last row of the table, and add further rows by clicking the icon. You can remove a Test Result value by clicking the icon on the relevant row.
Be sure to save your changes when finished, before navigating to another topic.
This topic enables you to define custom fields which should be added to Test Runs to store information from external testing tool results or manual testing results. For example, if your tool returns an "error code" datum that you want to track in testing results, you can add a custom field to store it.
You can add as many fields as are needed, one field per row of the table. Add more rows using the icon.
The Multi field, when checked, enables multiple values to be entered into the field. This option is only enabled when the field type selected in Type supports multiple values.
The Require field, if checked, means that the field is required and changes to the Test Run cannot be saved unless a value is supplied. This is more applicable when Test Runs are executed manually, or when results of automated tests are logged manually.
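The Multi and Require options described above amount to validation rules applied when a Test Run is saved. The following sketch models that behavior; it is a hypothetical illustration, and the names (`CustomField`, `validate`, `errorCode`) are not part of the tool:

```python
from dataclasses import dataclass

@dataclass
class CustomField:
    """Hypothetical model of a custom Test Run field definition."""
    name: str
    multi: bool = False      # allow multiple values (only for multi-capable types)
    required: bool = False   # Test Run cannot be saved without a value

def validate(fields, values):
    """Return a list of validation errors for a Test Run's field values."""
    errors = []
    for f in fields:
        v = values.get(f.name)
        if f.required and (v is None or v == [] or v == ""):
            errors.append(f"'{f.name}' is required")
        if not f.multi and isinstance(v, list) and len(v) > 1:
            errors.append(f"'{f.name}' accepts only a single value")
    return errors
```

Under this model, a Test Run with a checked Require field and no value supplied would fail validation and could not be saved, matching the behavior described above.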
This topic enables you to define the different status values which Test Runs can have, including a standard or custom icon image to represent the status value in the user interface.
The Terminal option indicates that the status is a terminal one, meaning that testing activity is complete. If you set a Test Run to a status marked Terminal, the Finished On field value will be filled in with the current date-time.
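The effect of the Terminal option can be sketched as a simple status transition. This is a conceptual illustration with hypothetical names (`TestRun`, `set_status`, and the example status identifiers), not the tool's actual API:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical example: which status values are marked Terminal.
TERMINAL_STATUSES = {"passed", "failed", "aborted"}

@dataclass
class TestRun:
    status: str
    finished_on: Optional[datetime] = None

def set_status(run: TestRun, status: str) -> None:
    """Moving to a status marked Terminal stamps the Finished On field."""
    run.status = status
    if status in TERMINAL_STATUSES and run.finished_on is None:
        run.finished_on = datetime.now()
```

Setting a non-terminal status leaves Finished On empty; only a transition to a Terminal status fills it in.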
See also: User Guide: Customizing Test Run Templates.
This topic enables you to specify the values that define the type of a new Test Run. You can change the semantics or add more values to cover all the types of Test Runs you need to create to support your testing process. You can optionally require that users log an electronic signature for executions of Test Runs of any or all of the defined types.
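The per-type electronic signature requirement can be thought of as a simple lookup on the Test Run's type. This is a hypothetical sketch; the type names and function are examples only, not values or APIs defined by the tool:

```python
# Hypothetical mapping of Test Run type values to an e-signature requirement.
SIGNATURE_REQUIRED = {
    "manual": True,      # example type names; substitute your own enumeration
    "automated": False,
}

def needs_signature(run_type: str) -> bool:
    """True if executing a Test Run of this type requires an e-signature."""
    return SIGNATURE_REQUIRED.get(run_type, False)
```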