The term System under Test designates the system or object that is to be tested. Other common terms for this are test item or test object. A system under test can be identified, for example, by the version number of the tested software. This chapter shows how to create and edit systems under test.
The overview page displays all existing systems under test in the selected project in a table. New systems under test are created here.
The table shows the following values:
| Column | Description |
|---|---|
| ID | Assigned automatically. |
| Additional Information | A tooltip appears when the cursor is placed over the icon shown here. |
| Description | The description of the system under test. |
| Actions | The executable actions. |
The Description entry can be edited directly in the table rows by clicking in the corresponding field.
By clicking on the button for creating a new system under test, a new empty table row appears in which the Version of the system under test can be defined. Saving creates the new system under test and stores it; the ID of the system under test (SUT00001) is assigned automatically by Klaros-Testmanagement. Click on the ID SUT00001 to get to the detail page of the system under test. Several table rows can be created and edited at the same time; the data is only stored in the database when it is saved, and discarding undoes all changes.
Red IDs: All rows with red IDs have been changed and are not yet saved!
The following actions can be performed in the action column:
If a system under test has been deleted, it is initially marked with a deletion marker and is only visible to administrators. For deleted systems under test, the following actions are available instead of Delete:
Certain actions can also be applied to several systems under test at the same time. To do this, select the systems under test to which the action is to be applied in the leftmost column.
The following bulk actions are supported for systems under test:
Bulk actions are described in detail in Section 5.2.3.1.5, “Bulk Actions”.
The following operations can be performed in the line above the table on the right:
All operations are described in detail in Section 5.2.3.1, “Overview Page”.
Each system under test has its own detail page with several additional tabs. Clicking on the ID of a system under test or on the icon on the right in the action column takes you to the tab that was selected last. When called for the first time, this is the Overview tab.
The following tabs are available: Overview, Properties, User Defined, Attachments, Jobs, Results and Changes.
On the detail pages, there are additional icons in the top right corner of the header. The following actions can be performed here:
| Action | Description |
|---|---|
| Open print view | A print-ready view of the system under test can be created here. Clicking the icon opens it in a new browser tab. Print views are described in detail in Section 5.2.3.2.1, “Print Pages”. |
| Create Bookmarks | Each individual detail page can also be reached directly via a hyperlink. Clicking the icon copies this link to the clipboard. The creation of bookmarks is described in detail in Section 5.2.3.2.2, “Bookmarks”. |
| Browse | Use the green arrows at the very top right to switch between the systems under test listed on the previous page. |
All data displayed here always refer to the currently displayed system under test.
The following values are displayed on the overview tab:
| Value | Description |
|---|---|
| Success | The overall success rate of all test cases. A success rate of 100% means that the latest test run of every test case has been successful. Even if a test case was successfully executed in the past, only its latest result counts towards the success rate. |
| Progress | The progress rate shows how many test cases have been executed. In contrast to the success rate, the progress rate only considers whether a test case was executed at least once, regardless of its result. |
| Compliance | The compliance rate shows how many of the test cases assigned to requirements have been successfully executed. |
| Coverage | The coverage rate shows how many of the test cases assigned to requirements have been executed at least once. In contrast to the compliance rate, the coverage rate only considers whether a test case has been executed at least once in the iteration, regardless of its result. |
| Automation Rate | The automation rate indicates the percentage of test cases that can be executed automatically. |
| Success history diagram | This chart shows the historical evolution of the success, progress, compliance, and coverage rates of this system under test for the selected test environment. The timeline is automatically adjusted to the existing data. |
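The manual does not spell out the exact formulas behind these rates. Purely as an illustration, the following Python sketch shows one plausible way such rates could be derived from per-test-case results; the data model, field names, and result values are assumptions for this example, not the actual Klaros-Testmanagement implementation.

```python
# Illustrative sketch only: TestCase and its fields are assumptions,
# not the Klaros-Testmanagement data model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TestCase:
    id: str
    automated: bool = False                                   # executable automatically?
    requirements: List[str] = field(default_factory=list)     # linked requirement IDs
    results: List[str] = field(default_factory=list)          # chronological, e.g. "passed"/"failed"

def _rate(part: int, total: int) -> float:
    return 100.0 * part / total if total else 0.0

def overview_rates(cases: List[TestCase]) -> dict:
    executed = [c for c in cases if c.results]
    # Success: only the latest result of each test case counts.
    latest_passed = [c for c in executed if c.results[-1] == "passed"]
    # Compliance and coverage look only at test cases linked to requirements.
    linked = [c for c in cases if c.requirements]
    return {
        "success": _rate(len(latest_passed), len(cases)),
        "progress": _rate(len(executed), len(cases)),
        "compliance": _rate(sum(1 for c in linked if c.results and c.results[-1] == "passed"), len(linked)),
        "coverage": _rate(sum(1 for c in linked if c.results), len(linked)),
        "automation": _rate(sum(1 for c in cases if c.automated), len(cases)),
    }

# Example: TC01 counts as successful because only its latest result is considered.
cases = [
    TestCase("TC01", automated=True, requirements=["REQ-1"], results=["failed", "passed"]),
    TestCase("TC02", requirements=["REQ-1"], results=["passed", "failed"]),
    TestCase("TC03"),
]
print(overview_rates(cases))
```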
The lower area contains a table giving an overview of the results for this system under test, broken down by test environment. In the Action column, the test environment for the diagram above can be selected.
This tab (Figure 6.33) allows the user to view or change the following attributes of the system under test:
| Field | Description |
|---|---|
| Version | The version of the system under test. |
You can create your own fields to meet individual requirements. For further information please refer to Section 5.2.3.2.4, “User Defined Properties”.
You can add any files as attachments to a system under test. For more information, see Section 5.2.3.2.6, “Attachments”.
This view lists all issues that are associated with this system under test.
The table shows the following values:
| Column | Description |
|---|---|
| ID | The ID of the issue in the external issue management system. |
| System | The external issue management system of this issue. |
| Additional Information | A tooltip appears when the cursor is placed over the icon shown here. |
| Summary | The summary of the issue. |
| Created | The date when the issue was created. |
| Created by | The user who created the issue. |
| Assigned | The user to whom the issue is assigned. |
| Priority | The priority of the issue. The possible values are specified by the external issue management system. |
| Status | The status of the issue. The possible values are specified by the external issue management system. |
| Actions | The executable actions. |
Clicking the button for creating a new issue opens the Issue Details page (Section 9.6.6, “Issue Details (Creating a new Issue)”), where new issues can be created and linked to the system under test. Clicking the corresponding icon links existing issues to the system under test (Section 9.6.7, “Link Issue”).
This page lists all iterations to which this system under test is assigned.
This system under test can be assigned to one or more iterations by clicking the corresponding button. This opens a dialog where multiple iterations can be selected at once.
The following bulk actions are supported for iterations:
The detail page of an iteration shows a list of all systems under test that are assigned to it, as shown in Section 6.2.2.7, “Systems under Test”.
The page displays the list of jobs assigned to the selected system under test.
The following values are displayed in the table:
| Column | Description |
|---|---|
| ID | Assigned automatically. |
| Summary | The summary of the job. |
| Priority | The priority of the job. Possible values are Trivial, Minor, Major, Critical and Blocker. |
| State | The status of the job. Possible values are New, Reopened, In Progress, Resolved, Closed and Rejected. |
| Progress | The percentage of executed test cases of this job and its subjobs. |
| Success | The success rate of the executed test cases of this job and its subjobs. |
| Due Date | The date on which this job is due to be finished. |
| Assigned | The user to whom the job is assigned. |
| Action | Available actions are Edit, Open Print View and Execute. For the possible representations of Execute, see Section F.2.2, “Execution Actions”. |
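The Progress and Success values above aggregate a job together with its subjobs. Purely as an illustration of that aggregation, and not of the actual Klaros-Testmanagement implementation, a recursive sketch using assumed names could look like this:

```python
# Illustrative sketch only: the Job structure and result values are assumptions.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Job:
    id: str
    # Latest result per assigned test case; None means "not yet executed".
    results: List[Optional[str]] = field(default_factory=list)
    subjobs: List["Job"] = field(default_factory=list)

def collect_results(job: Job) -> List[Optional[str]]:
    """Gather the per-test-case results of a job and all of its subjobs."""
    collected = list(job.results)
    for sub in job.subjobs:
        collected.extend(collect_results(sub))
    return collected

def job_metrics(job: Job) -> Tuple[float, float]:
    results = collect_results(job)
    executed = [r for r in results if r is not None]
    # Progress: share of executed test cases; Success: share of passed among the executed.
    progress = 100.0 * len(executed) / len(results) if results else 0.0
    success = 100.0 * executed.count("passed") / len(executed) if executed else 0.0
    return progress, success

# Example: the parent job aggregates its own results and those of its subjob.
child = Job("JOB-2", results=["passed", None])
parent = Job("JOB-1", results=["failed", "passed"], subjobs=[child])
print(job_metrics(parent))  # (75.0, 66.66...)
```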
The Results tab is further divided into a Test Runs, a Test Case Results and a Test Suite Results tab, showing the test results related to this system under test as described in Section 5.2.3.2.7, “Test Runs and Results”.
The Changes tab shows the change history of this system under test.
For a detailed description of the Changes view, see Section 5.2.3.2.8, “Change History”.