Setting the Testing Configuration

The Configuration topic (Testing > Configuration) is where you specify how Polarion's Test Case Management feature should work. For field-specific help, please refer to the Quick Help text at the bottom of the Configuration page. This configuration is available in the Global (repository) scope, and in projects. Project settings override the global settings.

The main artifacts of the testing process are Test Cases (Work Items of the configured Test Case type) and Test Runs.

The Testing Configuration page contains settings that specify a number of properties for Test Runs. These settings are described in the sections below.

Migrate Test Steps Button

Note:

This toolbar button is present only if the system property com.polarion.hideMigrationTestStepsButton is set to false.

(polarion.properties file location: Windows - polarion/configuration/polarion.properties, Linux - $POLARION_HOME/etc/polarion.properties)
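
For example, adding the following line to polarion.properties makes the Migrate Test Steps button available on the toolbar:

com.polarion.hideMigrationTestStepsButton=false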

In earlier implementations of the Test Steps feature, manual test steps were created and stored in the Description field of Work Items. In later implementations (beginning with version 2012-SR3) test steps are created and stored in a dedicated custom field of type Test Steps. The Migrate Test Steps button enables an administrator to move existing test steps data from the Description field to the new Test Steps field of Work Items. Migration is available only in the project scope and is performed on Work Items of the type specified by the administrator in the Migrate Test Steps dialog. (This would normally be the Test Case type.)

Before running the migration utility in a project that has existing Test Steps data in Description fields, you must create a corresponding Test Steps configuration in Administration > Testing > Test Steps. The names of the Test Step columns in the configuration must be the same as the column labels in the existing Test Steps tables (in the Description field of Work Items). You must also configure a custom field of type Test Steps for the Test Case type Work Item. For complete information, download the Polarion Test Steps Migration Guide.

Automated Testing

This section defines parameters used for importing results of automated tests (thereby enabling tracking of those results).

Test Case Type

Defines which of the configured Work Item types corresponds to a test case.

(Work Item Types are configured in Administration > Work Items > Enumerations.)

Default Test Run Template

Defines which Test Run to use as a template for creating new Test Runs when importing results of automated tests. (May be overridden in a job or build configuration.) Enter the ID of an existing Test Run from the current project.

Default Test Case Template

Defines which Work Item to use as a template for creating new Test Case Work Items while importing results of automated tests. (May be overridden in a job or build configuration.) Enter the ID of a Work Item from the current project, or projectID/workItemID for a Work Item from another project.
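
For example, a cross-project reference might look like the following line, where both the project ID and the Work Item ID are invented for illustration:

library/LIB-1024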

Custom Field for Test Case ID

Defines which string-type Custom Field (of Test Case type Work Items) to use for storing the ID string of an automated test whose results are being imported. It is needed for subsequent imports of the tests in order to recognize Test Case type Work Items created by previous imports. For Java JUnit tests, the value is the name of a specific test class and method.
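
For example, for a hypothetical JUnit test class com.example.tests.LoginTest with a test method testValidLogin, the stored ID would combine the class and method names along these lines:

com.example.tests.LoginTest.testValidLogin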

(Work Item Custom Fields are configured in Administration > Work Items > Custom Fields.)

Create Summary Defect

Defines the conditions under which a single "Summary" defect item is created and linked to a Test Run instead of multiple Defects (one for each failed Test Case). This is useful when automated tests fail because of an environmental problem that is not directly related to the individual Test Cases but causes many test failures.

Map Results of Automated Test to Enumerations

This section allows mapping of automated test results to configured Test Record Result and Test Run Status values in Polarion.

(Test Result values are configured in Administration > Testing > Test Results.)

This configuration also specifies for which Test Result(s) a linked Defect item should be created.

Before configuring this mapping, you need to configure the two enumerations: Test Run Statuses and Test Results, in the corresponding administration topics under Testing.

Note:

When there are multiple types of test results, the Test Run status will be mapped as follows:

  • If there are any errors, the Error status will be set.

  • If there are any failures (and no errors), the Failed status will be set.

  • Otherwise, the Passed status will be set.

Manual Testing

This section provides options and settings applicable to manual testing and manual type Test Runs.

Default Planning Query

Here you can specify a query, in Lucene query syntax, that retrieves a subset of all Work Items when a user launches manual selection of Test Cases for a Test Run. For example, such a query might retrieve all Test Case type items while excluding all other types, or some subset of Test Case items such as manual Test Cases not in Draft status. The purpose of the query is to limit the number of items in the Work Items table to a reasonable starting point for a user who is beginning manual selection of Test Cases for a manual Test Run. The user is not restricted to the results of this query: they can formulate and run any query in the Work Items Table view at any time during Test Case selection. However, if users select Test Cases that are outside the scope of the results of this default query, they are notified that they have selected such items and prompted to confirm their selection.
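
For example, a query along the following lines would retrieve manual Test Cases that are not in Draft status. Here testcase and draft are assumed type and status IDs, and caseType is a hypothetical custom field; substitute the IDs and fields actually configured in your project:

type:testcase AND caseType:manual AND !status:draft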

You can use the Select Test Cases item on the Actions menu of a Test Run to select test cases for it if the Select Test Cases field is set to Manual, Query on Create or Document on Create. In a Test Run Template, you can use the action only if the Select Test Cases field is set to Manual.

Allow Retest of Manual Test Case

Defines whether re-running of manually-executed Test Cases is allowed.

Defect Configuration

Create Defect in Project

Defines the project in which Defects (resulting from failed tests) should be created.

Defect Type

Defines which of the configured Work Item types is used to represent Defects resulting from test failures.

(Work Item Types are configured in Administration > Work Items > Enumerations.)

Defect Template

Defines which Work Item is used as a template for creating a new Defect type Work Item as a result of a test failure, either while importing results of automated tests or when a manual Test Case is executed. It is also used as the template for the "Summary Defect" (see Create Summary Defect). Enter the ID of a Work Item from the current project, or projectID/workItemID for a Work Item from a different project.

Link Role

Defines which Link Role is used for linking Defect Work Items (created when test results contain failed tests) to Test Case Work Items.

(Link Roles are configured in Administration > Work Items > Enumerations.)

Reuse Defects

Defines when an existing defect is reused, rather than a new one created, when a test case fails.

Available Reuse Defect Options:

Never:

When set to Never, a new defect is always created if the current Test Record does not already have a linked defect. When re-testing, the defect that is already linked to the Test Record can be reused.

If failed in group / If failed in previous
  1. First, both If failed in group and If failed in previous check iterations of the Test Case in the current Test Run. If there is a resolvable, unresolved, non-summary defect, it is reused.

  2. Then they check Test Runs with the same Group as the current Test Run and in the same Project as the current Test Run. If there is a resolvable, unresolved, non-summary defect, it is reused.

  3. Then, the If failed in previous strategy checks Test Runs in the previous group and in the same project as the current Test Run. (The previous group is determined by finding the last Test Run, by creation time, that has a different group.) If there is a resolvable, unresolved, non-summary defect, it is reused.

  4. Otherwise no defect is reused.

Always
  1. "Always" finds the top 20 Test Records in the Project of the current Test Run, for the same Test Case that have a defect set.

    (Sorted by execution time in descending order.)

  2. It then iterates over these defects and finds the first one that is resolvable and not a summary defect of its Test Run.

  3. If such a defect is found and has no resolution, it is reused.

  4. Otherwise no defect is reused.

Perform Auto-assignment

Defines if auto-assignment should be executed to automatically fill the Assignee field in new Defect items created automatically for a failed Test Case.

(Auto-assignment is configured in Administration > Work Items > Auto-assignment.)

Copy Test Case fields to Defect

Defines which fields are copied from the Test Case Work Item to automatically created Defect Work Item(s).

Copy Test Run fields to Defect

Defines which fields are copied from a Test Run to an automatically created Defect Work Item. When the source field is a single-value enumeration field and the target is a multi-value enumeration field, the value is added; otherwise, the value in the target field is overwritten.

To existing

When checked, this mapping is also applied when an existing Defect is linked to a new failed Test Record.

Automated Cleanup of Test Runs

There is a job polarion.jobs.testruns.delete that can be run to clean up old Test Runs. The job runs in the global scope, but it uses configurations from project scopes.
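
For example, if your installation schedules the job through the standard scheduler configuration (Administration > Scheduler), an entry along the following lines would run the cleanup nightly at 02:00. The cron expression and job name here are illustrative, not defaults:

<job id="polarion.jobs.testruns.delete" name="Test Runs Cleanup" cronExpression="0 0 2 ? * *" scope="system"/>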

Deletion of the old Test Runs by this job is configurable per project and is disabled by default. To enable it for a specific project, check the option Enable cleanup of old Test Runs on the configuration page.

(Administration > Testing > Configuration.)

Note:

Enabling this option in the global configuration enables it in all projects that do not define their own configuration for automated cleanup of old Test Runs. Test Runs in such projects will be deleted according to the settings in the global configuration, which may or may not be what is wanted. It is advisable to make sure that projects subject to regulatory compliance are configured so that no Test Runs important for regulatory reporting are deleted.

You can optionally define a Lucene query, in the field Query used by job to select Test Runs for deletion, that returns the set of Test Runs the job should delete. Polarion automatically extends this filtering query so that the job never deletes Test Run templates or Test Runs that have the Keep In History flag set.

When Test Runs are deleted by the job, no email notifications are generated, and nothing is shown in the Activity stream.

Example query

type:automated AND finishedOn:[19700101 TO $today-30d$]
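
The query above selects automated Test Runs whose finish date is more than 30 days in the past. A similar query, assuming a manual Test Run type is configured, could target manual Test Runs older than 90 days:

type:manual AND finishedOn:[19700101 TO $today-90d$]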

Macros for Test Run Pages

Polarion provides the following macros which can be used to incorporate export-import functionality into Test Run pages:

  1. {export-tests} - Renders a link that launches a dialog enabling a user to export a set of Test Cases to Microsoft Excel. Use the query parameter to specify the test cases to be exported. Example: {export-tests:query=status:active|sortby=id}

    Excel round-trip export prefers the current context's export template when searching for a default one. (It takes the first export template in lexicographical order from the project configuration, then the first in lexicographical order from the global configuration.)

  2. {import-test-results} - Renders a link which launches a dialog enabling a user to import a Microsoft Excel file containing test results.

For complete syntax information, see the Syntax Help provided when editing a Test Run page.

Configuring Test Run ID Prefix

You can configure any Test Run Template to add a prefix at the beginning of the IDs of new Test Runs created from the template. The prefix will be added to both manually specified and generated Test Run IDs.

To configure templates to use a Test Run ID prefix:

  1. Review the Test Run Templates and decide which of them should add a prefix to new Test Run IDs.

  2. In each template you choose, edit the template Properties and enter a prefix value in the Test Run ID Prefix field.

    For example, in a template "Software Design Verification Test", you might specify a prefix like "SWDVT" or "SDV".

When manually creating a Test Run, users can select the Test Run Template. If an ID prefix is configured in the template, and the project is not configured to generate Test Run IDs automatically, the prefix is shown as read-only in the Create Test Run dialog, before the text box in the ID field.

Generating Test Run IDs

You can set up the testing configuration to automatically assign IDs to new Test Runs. When configured, users who manually create new Test Runs do not need to specify an ID, and the naming convention will be consistent for all Test Run IDs.

To enable generated Test Run IDs:

  1. Enter Administration for the project you want to configure.

  2. In Navigation, select Testing > Configuration.

  3. In the Testing Configuration page, check the option Enable Generated Test Run IDs.

Note that this option setting is project-specific. You will need to set it in every project in which you want automatic generation of Test Run IDs.

Generated ID Format

The ID of new Test Runs will begin with the value configured in the Test Run ID Prefix field of the Test Run Template (if one has been specified), followed by a dash (-), followed by a unique ID generated by the system. This ID has the following form:

PREFIX-YYYYMMDD-HHMM with a uniqueness suffix (generated only if necessary) in the form of _NUM

For example: MYPREFIX-20150721-1032 or MYPREFIX-20150721-1032_2 - where MYPREFIX is an ID prefix value that has been configured in the Test Run template (not present if no prefix has been configured), and the trailing 2 in the second example is a uniqueness suffix.

When users manually create a new Test Run, if generated IDs are configured, the ID field is hidden in the Create Test Run dialog. The generated ID appears in the new Test Run after it is created, and the ID cannot be changed.

Limiting Notifications for Imports

The first time you import automated test cases (JUnit tests, for example) to Polarion, it is unlikely that you would want notifications sent about newly created Test Case type Work Items. It is also probably not necessary to show all the resulting Create activities in the Activity Stream. This is especially true if you are importing a large number of test cases.

An automated test case is a Work Item of the type configured as Test Case, with a non-empty value in the custom field defined as the Test Case ID (see configuration page in Administration > Testing).

There is a system property that defines a threshold above which notifications are not sent, and the Activity Stream is not updated when new automated test case Work Items are created. The default value is 50, meaning that if more than 50 new Test Case items are created, notifications and activity are not triggered.

If you want to change the default threshold value:

  1. Using any text editor, open the system properties file polarion.properties (see the file locations noted earlier in this topic).

  2. Search for com.polarion.maxNotificationsAboutCreatedTestCases. The property is not present in the file by default, so you will probably need to add it on a new line.

  3. Set the threshold value as follows: com.polarion.maxNotificationsAboutCreatedTestCases=[VALUE]

    where [VALUE] is a positive or negative integer, or zero:

    • If [VALUE] is positive, notifications and activity will be suppressed if the number of imported automated tests exceeds [VALUE].

    • If [VALUE] is 0 (zero), no notifications will be sent and no activity will be streamed regardless of the number of automated tests imported.

    • If [VALUE] is negative, no threshold is applied and all notifications are sent and all activity is streamed.
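
For example, to suppress notifications and Activity Stream updates only when an import creates more than 200 Test Case items, the line in polarion.properties would read as follows (200 is an arbitrary illustrative value):

com.polarion.maxNotificationsAboutCreatedTestCases=200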

Warning

Setting a negative value, and thereby no threshold, can result in significant temporary performance degradation of Polarion, your SMTP server, and your network when large numbers of automated tests are imported, to say nothing of user irritation when large numbers of notification emails arrive in their inboxes as the result of a test case import.

Limiting Test Run Records in History

If you have very large manual Test Runs, the default system configuration can result in excessive numbers of Test Run records in the historical database. In such cases, an administrator can set a system property to reduce the number of records generated. See the Administrator's Guide topic Preventing Excessive Test Run Records for information.