In manual testing, it is common for the testing steps of a test case to be executed multiple times with variations in the test procedure. For example, the same set of steps might be run on desktop and mobile operating systems, and/or with different browsers on the different operating platforms, or an electromechanical component might be tested with different input voltages. Polarion's Test Parameters (or "Parameterized Testing") feature can help make the test specification process efficient and flexible.
The discussion of this feature assumes you are already familiar with the information in the User Guide topic Creating and Managing Test Runs.
Rather than create separate test cases, or different sets of steps, it is possible to capture the variables in the testing process as Test Parameters, which can be applied by testers during different executions of the same Test Case. Test Parameters are placeholders for actual values. They enable test case authors to write test cases and steps in an abstract way, thereby enabling testers to execute them with specific variations.
For example, suppose you are testing a login operation for a web application. Your steps might be as follows:
Open Browser on your computer
Access the URL of server
Enter Username and Password in respective fields
Click Login button
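To make the idea concrete, here is a minimal hypothetical sketch (not Polarion code) of how a step written with parameter placeholders resolves to a concrete instruction once values are supplied; the placeholder syntax and the `resolve_step` helper are illustrative assumptions only:

```python
# Hypothetical illustration (not Polarion's implementation): a Test Step
# written with {Parameter} placeholders resolves to a concrete instruction
# once values are supplied for Browser, Username, and Password.

def resolve_step(step_template: str, values: dict) -> str:
    """Replace each {Parameter} placeholder with its supplied value."""
    return step_template.format(**values)

steps = [
    "Open {Browser} on your computer",
    "Access the URL of the server",
    "Enter {Username} and {Password} in the respective fields",
    "Click the Login button",
]

# Example values a Test Run might supply (values are illustrative)
values = {"Browser": "Firefox 40", "Username": "user1", "Password": "samplepass"}
resolved = [resolve_step(s, values) for s in steps]
# resolved[0] is now "Open Firefox 40 on your computer"
```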
Suppose you want to run this test using different browsers/versions and/or different user accounts. Rather than write separate test cases with similar steps, Test Parameters such as Browser, Username, and Password can be created and inserted into test steps when writing up a test case.
Authors/planners of Test Runs can supply values for the Test Parameters in the Test Cases planned into individual manual Test Runs. In the simple test above, for example, the Password parameter might be given the value samplepass in one Test Run, while another Test Run supplies different values. During execution of Test Runs, testers see these values and perform the steps accordingly.
Test Records are automatically created showing the step results with Test Parameter values applied.
If a single Test Case needs to be executed multiple times with different variables applied, multiple Iterations can be defined for the Test Steps with different parameter values specified and applied for each execution of the steps.
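As a rough, hypothetical sketch (not Polarion code), Iterations amount to resolving the same steps once per set of parameter values; the dictionaries and their values below are made up for illustration:

```python
# Hypothetical illustration: the same Test Step executed once per Iteration,
# each Iteration carrying its own parameter values (all values invented).
iterations = [
    {"Browser": "Firefox 40", "Username": "user1", "Password": "samplepass"},
    {"Browser": "Chrome 45", "Username": "user2", "Password": "otherpass"},
]

step = "Enter {Username} and {Password} in the respective fields"
executions = [step.format(**values) for values in iterations]
# One resolved instruction per Iteration
```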
Changes to Test Parameters are visible in Polarion in several ways:
All changes to Test Parameters (creating, renaming, and changing values) are written to the History of the artifact in which the change takes place: the Test Run Template, Test Run, or Test Case item.
Changes to Test Parameters are shown in the Activity stream. The entry shows the user who made the change, what artifact was changed, and what parameters changed.
By default, changes to Test Parameters are included in the information sent via notification events for Test Runs.
Also, Test Parameters can optionally be shown as a column in the Test Runs table via the Customize Table action on the toolbar of the Test Runs table. In the dialog, you add Test Parameters to the Columns Shown list.
This topic is mainly for testing managers or team leaders who plan testing activities and develop standards or best practices for a team or project by means of Test Run Templates. It is possible to configure Test Run Templates with Test Parameters (and, optionally, default parameter values), so that when a Test Run is instantiated from a template, the Test Run contains the parameters (and values, if configured) defined in the template. When the instantiated Test Run is executed, the respective values from the Test Run are pre-filled in Test Parameters in any Test Cases that have the same parameter in the Test Steps.
In the above example, it would make sense to define the Browser parameter in the Test Run Template because it is the testing environment and therefore global for all the Test Cases in all Test Runs based on the template. Then, in each Test Run, a value for Browser can be specified ("Firefox" or "Chrome" for example), which will become the default parameter value in the Test Cases selected for the Test Run.
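The resolution order just described can be summarized in a small hypothetical sketch; the function and dictionaries are invented for illustration and are not part of any Polarion API:

```python
# Hypothetical illustration of where a parameter's value comes from:
# a template default seeds the Test Run, the Test Run value applies to its
# Test Cases, and a value set at the Test Case level overrides both.

def resolve_value(name, template_defaults, run_values, case_values):
    if name in case_values:        # a Test Case level value wins
        return case_values[name]
    if name in run_values:         # otherwise the Test Run supplies it
        return run_values[name]
    return template_defaults.get(name)  # otherwise the template default, if any

browser = resolve_value(
    "Browser",
    template_defaults={},                 # template defines Browser, no default
    run_values={"Browser": "Firefox"},    # value set in this Test Run
    case_values={},                       # no override in the Test Case
)
```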
Consider a simple example that illustrates how this feature works, continuing with the idea of testing the login function of a web application. The test environment is always a web browser. The testing steps direct the tester to open the browser, access the application URL, and enter the username and password login credentials. The test checks both valid and invalid credentials. Let us assume that a parameter Browser is defined in a Test Run template named "Web-app Login Test". When a Test Run is created from that template, it contains the Browser parameter, but no value for it. This parameter can be saved to the Test Parameters Library, making it available to Test Case authors for insertion into Test Steps.
A default set of Test Parameters can be defined by an administrator in project Administration (Administration: Testing: Test Parameters). The parameters defined in this Test Parameters Library appear in the Add Parameter list in the Test Steps table of Test Cases, and Test Case authors defining Test Steps can select a parameter in the list to insert into a Test Step.
Now suppose you, the test planner, want a Test Run, instantiated from this template, to execute Test Cases using Firefox version 40. You would supply Firefox 40 as the value for the Browser parameter in the Test Run you create for this test. Let's assume the existence of a Test Case in which the author has created Test Steps containing the Test Parameters Browser, Username, and Password, and that you have selected it for execution in your Test Run.
When you open the Test Case via the Actions > Select Test Cases action in the Test Run, you see the value "Firefox 40" in place of the Browser parameter in the Test Steps. This value is from your Test Run. You now need to supply values for other parameters in the Test Steps, so testers executing the Test Run know how to actually execute the Test Case. You can do this in the Iteration 1 panel of the Test Run Planning sidebar.
After specifying all parameter values, you save the Test Run via the button in the sidebar. (You can optionally add Iterations and provide different values for the parameter... more on Iterations later.) At this point, the Test Run is ready to be executed by Testers.
When a tester executes the Test Run via its button, the Table view of Work Items opens in a new browser tab, with the Test Run Planning sidebar visible. The table lists the Test Cases selected in the Test Run. Test Parameters in the Test Steps are replaced by the actual values supplied by you, the Test Run author/planner. The tester proceeds to execute the steps according to the information. The parameter values in the steps are written to the test record.
A tester with the requisite permissions can modify parameter values during execution of the Test Run. Any changed values are written to the test record, so the actual values used to execute the test will always be traceable.
Test Steps of Test Cases often need to be executed multiple times with variations. In the example we have been using, this might mean running a different browser or browser version, or logging in to a different user account with different access permissions. When defining a Test Run, the Test Run author/planner can add another Iteration in the Test Run Planning sidebar of any selected Test Case, supplying different values for the Test Parameters in each Iteration and adding as many Iterations as are needed to cover all scenario variations.
After Iterations have been defined and the Test Run saved, a tester can execute the Test Run, performing the Test Steps multiple times using the different values specified in each Iteration. Test Step results are recorded as usual as each Iteration is executed, and Iteration results are part of the overall test record.
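Hypothetically, the per-Iteration records might be modeled like this; the structure and field names below are invented for illustration and do not reflect Polarion's internal format:

```python
# Hypothetical illustration: each executed Iteration contributes an entry
# to the test record, preserving the actual parameter values used, so the
# executed variations remain traceable.
iterations = [
    {"Browser": "Firefox 40"},
    {"Browser": "Chrome 45"},
]

test_record = [
    {"iteration": i, "values": values, "result": "passed"}  # result invented
    for i, values in enumerate(iterations, start=1)
]
```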
Testing can be paused and resumed any time until all steps for all Iterations have been executed. When there are multiple Iterations, a tab for each one appears on the left side of the steps in the Execute Tests section of the page. The process moves the tester to the next Iteration when one is completed. (The tester can disable this automatic progression by clearing the checkbox option Open next queued Test when finished at the bottom right of the Test Steps in the Execute Test section of the Test Case.) When all Iterations are executed, the process moves the tester to the next Test Case, if any are still waiting.
Iterations are only supported in Test Runs (including Test Run Templates) where Test Cases are selected Manually, or via the By Query on Create or From LiveDoc on Create options.
The following sections explain specifically how to set up a project to use parameterized testing. Generally, you will want to perform these operations in the order described, though there may be some concurrency, and some are optional, as noted.
Define the Test Parameters Library. Optional. (Project Administrator)
Define default Test Parameters in Test Run Templates. Optional. (Testing Manager or Lead)
Insert Test Parameters in the text of Test Steps in Test Case type Work Items. May take place concurrently with the next item. (Test Specification Author)
Create Test Runs and define Test Run Parameters in the Properties. May take place concurrently with Test Parameter specification in Test Cases. Optional. (Testing Manager or Lead)
In the Test Runs, provide actual Test Parameter values, optionally adding Iterations, for the Test Cases selected in the Test Run configuration. (Testing Manager or Lead)
Execute Test Cases following the Test Steps through all Iterations. (Tester)
When composing Test Cases, test specification authors who have the requisite permissions can optionally save Test Parameters they create to the project's Test Parameters Library. For example, if a common test environment is a web browser, it would make sense to add a parameter named Browser to the library. The parameters saved to the library appear as items in the Insert Test Parameter select list in the Test Steps table of Test Case type Work Items, when a Test Case author is editing the table and defining Test Steps. They are also shown in the Parameter Name column of the Manage Test Parameters dialog, accessible in Test Runs and Test Run Templates.
An administrator can access this library in the project Administration and perform several operations including defining new Test Parameters and modifying or deleting existing parameters. In the Administration configuration, you can only specify Test Parameter name and type. Parameter values are specified by users who create Test Runs.
In the first releases of the parameterized testing feature, only the String type is supported.
To access and manage Test Parameters in the Library:
Open the project while logged in with administrator permissions and enter Administration.
Expand the Testing topic and select Test Parameters.
Users must be granted repository access permissions for the test-parameters-library.xml file in the project repository in order to add parameters to the Test Parameters Library. Write access to that file is not granted by default for non-administrator users.
As stated earlier, you can configure Test Run Templates with Test Parameters, so that when a Test Run is instantiated from a template, it contains the parameters defined there. In essence, you define the default Test Parameters for all Test Runs based on the Template. You can add Test Run Parameters in both new and existing Test Run Templates. You can optionally provide default values for the parameters you define in a Test Run Template. The values will appear in Test Runs instantiated from the template.
In a Test Run instantiated from a template, users with the necessary permissions can:
Remove default parameters provided by the template.
Change any default parameter values provided by the template.
Define new parameters and values in the Test Run. These do not affect the template.
To work with Test Run Parameters in Test Run Templates, you must have permission to MODIFY Test Runs, and permission to DEFINE TEST PARAMETERS in Test Runs.
To define Test Parameters in a Test Run Template:
Open the project, and in Navigation select Test Runs.
In the table of existing Test Runs, click the button. The table now lists all currently defined Test Run Templates.
In the table of Test Run Templates, select the Test Run Template you want to configure.
In the detail of the selected template, click the button and scroll down to the Test Parameters section. That section displays a table of currently configured parameters.
Click on the section header, then click the button.
In the Manage Test Parameters dialog, click the icon to add a new Test Parameter. Select from the list of parameters currently in the Test Parameters Library, or enter the parameter name and type (if multiple types are available). Click the button when finished adding Test Parameters.
The table in the Test Parameters section now contains the parameters you added. You can optionally enter a value in the Value column for any or all parameters. This value will appear in Test Runs instantiated from the template. When finished, click the button to preserve changes, then click the button to exit from Properties. If you want to configure another Test Run Template, select it in the table and repeat the foregoing procedure.
When finished with all templates, click the button to toggle it off and return to the Test Runs management page.
This section applies to anyone who plans tests, creating and configuring Test Runs. When you create a new Test Run, you can define Test Parameters and their respective values. In the Test Cases selected for the Test Run, if the Test Steps contain a parameter of the same name as one defined in the Test Run, the Test Run will supply the value when the Test Cases are executed by testers. The Test Run author/planner must supply values for Test Parameters in Test Cases that do not have an equivalent parameter defined in the Test Run. Alternatively, testers can be given the necessary permission to modify parameters when executing the Test Cases of a Test Run.
For example, Test Case TC-1 might contain a step Open Browser, where "Browser" is a Test Parameter. A Test Run might then define a parameter Browser with the value Firefox 40. When TC-1 is selected for the Test Run, the tester sees the step Open Firefox 40; the value "Firefox 40" is supplied by the Test Run during execution.
If TC-1 contains a step "Enter Password in password field", where Password is a Test Parameter, and an equivalent parameter is not defined in the Test Run, then the Test Run author/planner must provide a value (e.g. Pass999) when configuring the Test Run, so the tester will see "Enter Pass999 in password field".
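The rule that every parameter in the steps must receive a value from somewhere can be sketched as a simple check. This is hypothetical illustrative code, not Polarion's implementation (Polarion instead shows a dialog citing missing values, as noted later):

```python
# Hypothetical illustration: a parameter used in the Test Steps must receive
# a value either from the Test Run or at the Test Case level.

def missing_parameters(step_params, run_values, case_values):
    supplied = set(run_values) | set(case_values)
    return [p for p in step_params if p not in supplied]

# TC-1 uses Browser and Password; the Test Run only defines Browser,
# so Password must be supplied at the Test Case level.
missing = missing_parameters(
    ["Browser", "Password"],
    run_values={"Browser": "Firefox 40"},
    case_values={},
)
```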
To specify Test Parameters, you should be familiar with the Test Cases, or the needs of Test Case authors who will be writing up the Test Cases and Test Steps that will be tested when executing the Test Run. For example, if some Test Cases need to be executed in different web browsers, you can understand that it's useful for the Test Run to have a parameter named Browser.
To define Test Parameters in a Test Run:
In your project, navigate into the Test Runs topic.
On the Test Runs page, select the Test Run you want to configure (or create a new Test Run).
In the detail of the selected Test Run, click the button and scroll down to the Test Parameters section. This section contains a table of currently configured Test Parameters. In a new Test Run, any parameters configured in the Test Run Template on which the Test Run is based appear in the table.
Click on the Test Parameters header, then click the button to open the Manage Test Parameters dialog.
In the dialog, click Select Parameter. The list contains Test Parameters in the Test Parameters Library that are not currently used in the Test Run.
If you want to add one of the library parameters to the Test Run, select it in the list, and select the type (if multiple types are listed).
If you want to define a new parameter in this Test Run, select the item that opens the Add New Parameter dialog, then enter the parameter name and specify the type (if multiple types are available). If you want the new parameter to be available project-wide, check the Add to Library option (enabled only if you have permission to add to the library).
After adding all Test Parameters, if you want to provide values for any of them, provide them in the Value field of the respective parameters.
Click the button to preserve changes, then click the button to exit from Properties.
Your next task is to open the Test Cases selected for your Test Run and provide values for any Test Parameters they contain for which there is no equivalent Test Parameter in the Test Run. For example, if some Test Case has a parameter User that is not also configured in the Test Run, you need to provide a value for it at the Test Case level.
To supply values for Test Parameters in Test Cases:
Select the Test Run on the Test Runs page and make sure Test Cases are already selected. (The selection mode must be Manually, By Query on Create, or From LiveDoc on Create.)
With the Test Run selected in the table, click (Actions) and choose Select Test Cases. A new browser tab opens displaying the Work Items table, which lists the selected Test Cases. The Test Run Planning sidebar also opens in the view, displaying the number of waiting executions and other Test Run statistics.
For each of the Test Cases, in the sidebar panel labeled Iteration 1, enter values for all empty parameter fields.
Click the button to update the Test Run.
By default, each Test Case has one Iteration. If a Test Case needs to be executed multiple times with different parameter values, click the button. A sequentially numbered Iteration panel is added to the sidebar, with parameter values pre-filled from the previous Iteration. Modify the parameter values as needed so that testers will process the steps correctly. Add additional Iterations until all necessary variations of the Test Case are covered. Be sure to save changes using the button.
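The pre-fill behavior described in this step can be sketched as follows; the helper function is hypothetical and purely illustrative:

```python
# Hypothetical illustration: a newly added Iteration starts as a copy of the
# previous Iteration's values, which the planner then modifies as needed.

def add_iteration(iterations):
    iterations.append(dict(iterations[-1]))  # pre-fill from previous Iteration
    return iterations

iterations = [{"Browser": "Firefox 40", "Password": "samplepass"}]
add_iteration(iterations)
iterations[1]["Browser"] = "Chrome 45"  # planner changes only what differs
```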
When all Test Cases of the Test Run have values, you can close the browser tab and return to the Test Run, which is now ready for testers to execute unless you wish to make other changes.
This section applies mainly to authors of Test Cases. The Test Parameters feature (sometimes called "parameterized testing") enables you to write abstract test procedures, inserting Test Parameters in place of some concrete specifications. When selected in a Test Run, a test planner or manager (or a tester with the requisite permissions) can supply concrete values for the parameters in the Test Steps. These values become part of the automatically created test records.
For example, in a Test Case that describes a test that must be run on different mobile operating systems, instead of writing "Android" or "iOS" or "Windows Mobile" in a Test Step, you could insert a parameter named OS. When planning the Test Case into a Test Run, a test manager can then supply a concrete value in place of OS, depending on which platform is being tested.
The Test Parameters themselves can be defined in any or all of several different places:
In the project's Test Parameters Library (in project Administration).
In Test Run Templates.
In individual Test Runs.
In Test Steps of Test Cases as they are being written.
When writing a Test Step in a Document or in the Work Items Table, you can insert any existing Test Parameter (already defined in the Test Case, or in the Test Parameters Library) into your text. Or, you can define a new Test Parameter and insert that, optionally adding it to the library. Note that you must have access permissions granted for the relevant folder in the Polarion repository in order to add parameters to the library. Writing to this location is not granted by default for non-administrator users. (See the topic Defining the Test Parameters Library.) You must also have the Polarion user permission ADD TO PARAMETERS LIBRARY.
To insert a Test Parameter in a Test Step:
Click the header of the Test Steps table in the Test Case. The table transitions to edit mode.
Place the insertion cursor at the point in the table where you want to insert a Test Parameter. (This will typically be in the Step Description column.)
In the drop-down options list beneath the last row of the table, click Insert Test Parameter.
Select one of the parameters in the drop-down list (you can filter the list by typing), or create a new parameter "on the fly" by clicking the item and entering properties for the new parameter in the Add New Parameter dialog.
This section applies mainly to a testing manager or team leader creating and setting up Test Runs. During planning of a Test Run, the author/planner must supply values for any Test Parameters defined only at the Test Case level (i.e. with no equivalent parameter in the Test Run). For example, if the Test Case author defined a parameter OS, and there is no parameter in the Test Run named OS, the Test Run author/planner must replace it with some concrete value, such as Android.
When the Select Test Cases field of the Test Run is set to Manually, By Query on Create, or From LiveDoc on Create, you can click the button in the Test Run Status section of the Test Run to load the selected Test Cases in the Work Items table, after which you can begin supplying parameter values per the steps outlined below. Note that when the Test Case selection mode is set to By Query on Execute or From LiveDoc on Execute, Test Parameter values cannot be specified at the Test Cases level. Values must be set in the Test Run, from which the Test Cases selected on execute will inherit the values. Also, multiple iterations per Test Case are not supported for the "on Execute" selection modes.
To provide Test Parameter values in the Test Cases of a Test Run:
Select the Test Run in the Test Runs topic in your project.
Click the button in the Test Run Status section of the Test Run. A new browser tab opens and displays the Work Items table, with the Test Cases of the Test Run listed, and the Test Run Planning sidebar open.
Select the first Test Case in the table, and in the Parameters section of the sidebar, specify values for any Test Parameters that do not yet have one. (Any filled-in values are coming from the equivalent parameter in the Test Run. You can overwrite these values in the Test Case, if desired.)
If the Test Steps need to be executed multiple times with different parameter values, click the button and provide the necessary parameter values. Create any additional Iterations needed (see Defining the Iterations, below).
Click the button in the sidebar to update the Test Run.
Click the (Refresh) icon in the editor toolbar of the Test Case and check that all parameters have been replaced with actual values in all Iterations in the Execute Tests section of the Test Case. Any parameters missing a value are highlighted in light red.
Repeat these steps for all other Test Cases of the Test Run listed in the table until you have provided Test Parameter values for all of them.
This section describes the procedure for defining multiple Iterations of Test Steps in Test Case type Work Items.
To define multiple Iterations of Test Steps:
Be sure you have specified all missing values in Iteration 1 (which is always present and cannot be removed).
Click the button in the sidebar to create another Iteration.
Fill in values for all Test Parameters. Repeat this step to create as many Iterations as you need to run the Test Steps with all possible parameter value variations.
Click the button in the sidebar to update the Test Steps.
Click the (Refresh) icon in the editor toolbar of the Test Case.
You should now see all actual values for all Test Parameters in the Test Steps, highlighted in light blue. Any undefined parameters are highlighted in light red. You should see tabs on the left-hand side of the Test Steps, one for each Iteration you defined. You can select the tabs and check that the Test Parameters all show the correct values.
Fix any still-missing parameter values in the Iterations via the Test Run Planning sidebar, save the changes, and refresh the view to check again that all parameters show the correct values. Repeat these procedures for all Test Cases of the Test Run, selecting each one in the Work Items table in turn.
Testers cannot execute a Test Case until all Test Parameters in all Iterations have a value. If execution cannot proceed for this reason, a dialog appears informing the tester about missing Test Parameter values.
This section is mainly for testers who execute Test Runs that include Test Cases that have Test Parameters in the Test Steps (i.e. parameterized Test Cases).
Before executing Test Steps in a Test Case, values must be specified for all Test Parameters in the Test Steps table, except those whose values are supplied by the Test Run you are executing. Normally this is already done by the creator of the Test Run. However, testers can be granted the requisite permission to modify Test Runs and define Test Parameters, in which case they can provide values for Test Parameters in the Test Run Planning sidebar during execution of a Test Run. In this case, follow the procedures described in Providing Parameter Values in Test Cases before you begin executing the Test Steps, except in Step 2 use the Execute Tests button rather than the Waiting button. The Test Run Planning sidebar opens automatically only if you are using a license that allows Test Run modifications (i.e. ALM or QA).
Note that it is possible to override any default parameter values provided by the Test Run if you have the requisite MODIFY permission for Test Runs.
Once all Test Parameters have been supplied with values, you are ready to begin executing the Test Case. Launch the Test Run in the normal way using the button. The button panel indicates how many total executions of the Test Cases are waiting. In the first Test Case, click the button to begin executing the test.
The main difference between executing regular Test Cases and parameterized Test Cases arises when there are multiple Iterations. You have to complete all the Iterations before the Test Run execution process moves on to the next Test Case, and of course you perform your testing with the variations specified in the Iterations. You do not have to step through the Iterations in order of appearance; if you want to perform Iteration 3 before Iteration 1, you are not prevented from doing so.
Also, if you have permission to MODIFY Test Runs, it is possible to add more Iterations while executing others. IMPORTANT: If you do this, be sure to click the (Refresh) icon in the sidebar after adding an Iteration and modifying the default values, in order to update the test record. Also click the (Refresh) icon in the editor toolbar to refresh the view of Test Steps. Remember that you can pause the Test Case and resume testing later. On pause, the current state of the current Iteration is logged in the test record.
Polarion supports off-line execution of parameterized Test Runs via export/import to Microsoft Excel. For detailed information on this feature, see Help for Excel round-trip. There is essentially no difference when Test Cases are parameterized, except that:
The exported Excel workbook contains sections for all Iterations of each Test Case. Testers can log results of each step in each Iteration.
Test Parameter values are rendered in bold font in a light blue color.
By default, Iteration names are appended to the Test Case title in the Title column. For example: "Test Valid Login - Iteration 1".
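The default naming can be sketched with a hypothetical helper (illustrative only; the actual rendering is controlled by the Excel export template, as noted below):

```python
# Hypothetical illustration of the default title shown in the exported
# workbook: the Iteration name is appended to the Test Case title.

def export_title(test_case_title: str, iteration_number: int) -> str:
    return f"{test_case_title} - Iteration {iteration_number}"

title = export_title("Test Valid Login", 1)
```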
It may be useful to keep the following points in mind:
If any Test Parameters in any Test Case in the Test Run have no value assigned, Excel round-trip export fails with a message citing which Test Cases are missing parameters.
By default, all selected Test Cases of the Test Run that have waiting Iterations, and/or that have no results, are exported to Excel. If incompletely executed Test Cases are included in the exported Test Run, all Iterations are exported, whether or not they have been executed.
Excel round-trip files containing only partial results may be re-imported to Polarion, in which case only new Test Records are imported.
A tester can retest Test Cases and Iterations by changing the result in the exported Excel workbook. When a workbook with retested Test Cases is imported back into Polarion, check the option Retest previously executed Test Cases in the Import Results from Excel dialog. Alternatively, the Test Run can be re-exported and the Test Cases re-executed. On re-import, the Retest previously executed Test Cases option should be checked.
The column to which the Iteration name is appended is configurable in the export template, so you may see it appended to some other column than Title in the exported workbook. For more information, see User Guide: Excel Round-trip Templates and Administrator's Guide: Specifying Column for Test Step Iteration Numbers.