In the Test dashboard, data can be filtered by two criteria:
Dashboards: Choose a dashboard to select data specific to applications. If a user has more than one dashboard, there will be more than one option to select.
Applications: Choose an application to view the build health of cloud applications such as CAM or Enterprise Marketplace.
Depending on your selections from these filters, the data displayed in the widgets varies. Some widgets may show the message “No Data Available”, which means no recent data is available for your selection.
The Test dashboard displays data in multiple graphs and a table view, providing a view of critical components of the Test phase:
Test summary by application
Top 5 technical services with less coverage
Top 5 technical services with code smells/bugs
The Compare Release page offers statistics for comparing releases with each other, providing valuable insight into changes and improvements over time across features, updates, or versions of a product, software, or service. The following metrics are compared (a brief illustrative sketch follows this list):
Bugs
Coverage
Affected technical services
Code smells
Failed tests
Skipped tests
The duration for comparison is always 180 days; this option is set by default and cannot be changed.
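As a rough illustration of what such a comparison contains, here is a minimal Python sketch of the six metrics above; the ReleaseMetrics class, its field names, and the compare helper are hypothetical and not part of the product.

```python
# Illustrative only: a hypothetical model of the six metrics the Compare
# Release page reports per release; not the product's actual data model.
from dataclasses import dataclass, fields

@dataclass
class ReleaseMetrics:
    release: str
    bugs: int
    coverage: float                   # percentage of code covered by tests
    affected_technical_services: int
    code_smells: int
    failed_tests: int
    skipped_tests: int

def compare(old: ReleaseMetrics, new: ReleaseMetrics) -> dict:
    """Return the delta of each numeric metric between two releases."""
    return {
        f.name: getattr(new, f.name) - getattr(old, f.name)
        for f in fields(ReleaseMetrics)
        if f.name != "release"
    }

r1 = ReleaseMetrics("1.4.0", bugs=12, coverage=71.5,
                    affected_technical_services=4, code_smells=30,
                    failed_tests=9, skipped_tests=3)
r2 = ReleaseMetrics("1.5.0", bugs=8, coverage=74.2,
                    affected_technical_services=3, code_smells=25,
                    failed_tests=5, skipped_tests=2)
print(compare(r1, r2))  # negative deltas mean fewer bugs/failures in the newer release
```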
You can navigate to the Compare Release page from the Test dashboard by clicking the Compare releases button.
To see the comparison charts and their tabular representation, you must pick at least one application, two releases, and one environment on the Compare Release page. By default, all six parameters of a test run are shown as charts. However, you can remove any chart that does not meet your needs. Remember that if the selected charts are to become the default, they must be stored as custom views; otherwise, any changes in chart selection will be lost in the next login session.
The "Compare Release" page now includes an "Export to .CSV" functionality, allowing users to easily export comprehensive reports of product, software, or service comparisons over time. This feature enhances the analysis of changes, updates, and improvements between different releases for more efficient decision-making and tracking.
Deleted charts can be re-added from the Add components window by dragging and dropping the item. Charts can also be rearranged based on their importance.
The table chart concisely summarizes all charts.
The Test dashboard displays five widgets that provide a graphic representation of Test summary by application, Top 5 technical services with less coverage, Overall test status, Test type status, and Top 5 technical services with code smells/bugs, as well as a table view titled Technical service tests.
Test summary by application
This graph summarizes the tests performed on all configured applications based on the selection chosen, which might be the number of failed, passed, or skipped tests.
Top 5 technical services with less coverage
This graph depicts the top five technical services with the least coverage during the previous seven days.
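Conceptually, the widget performs a bottom-five selection over per-service coverage figures. A minimal sketch, with fabricated data, might look like this:

```python
# Illustrative only: selecting the five technical services with the least
# coverage, as this widget does conceptually. The data is fabricated.
import heapq

coverage_by_service = {
    "cart-service": 82.1,
    "auth-service": 45.0,
    "billing-service": 67.3,
    "search-service": 91.8,
    "catalog-service": 54.6,
    "notify-service": 38.9,
}

bottom_five = heapq.nsmallest(5, coverage_by_service.items(), key=lambda kv: kv[1])
for service, pct in bottom_five:
    print(f"{service}: {pct:.1f}% coverage")
```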
The Overall test status graph represents the number of tests passed, failed, and skipped over the selected timeframe. The statuses are described as follows (a small tallying sketch appears after this list):
Failed (Red): Tests that failed within the selected period.
Passed (Green): Tests that passed within the selected period.
Skipped (Yellow): Tests executed but not finalized; thus, their final result cannot be determined.
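Tallying tests into these three groups is a simple counting step. A minimal sketch with fabricated results:

```python
# Illustrative only: tallying test results into the three status groups
# the widget displays. The test records are fabricated for the example.
from collections import Counter

test_results = ["passed", "passed", "failed", "skipped", "passed", "failed"]
counts = Counter(test_results)
print(counts["passed"], counts["failed"], counts["skipped"])  # 3 2 1
```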
You can switch the Overall test status widget between a donut chart and a bar chart by selecting the donut chart icon or the bar graph icon.
By hovering over the graph, the following data is presented for each test status:
Group: The test status, from the three categories.
Date: The date of the execution of the tests.
Value: The total number of tests executed in a determined time frame.
The Overall test status widget presents two axes that indicate the tests within a specified time frame in which Passed, Failed, and Skipped tests occurred (see the sketch after this list):
X-Axis (Days(Year)): The X-axis corresponds to the dates within the activity's time frame.
Y-Axis (Total tests): The Y-axis corresponds to the number of Passed, Failed, or Skipped tests for the selected time frame.
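The chart behind these axes is essentially a per-day tally of each status. A minimal sketch, with fabricated execution records, of how such a series could be built:

```python
# Illustrative only: grouping test executions by date and status to produce
# the X-axis (dates) and Y-axis (totals per status). Records are fabricated.
from collections import defaultdict
from datetime import date

executions = [
    (date(2024, 3, 1), "passed"),
    (date(2024, 3, 1), "failed"),
    (date(2024, 3, 2), "passed"),
    (date(2024, 3, 2), "skipped"),
]

series = defaultdict(lambda: {"passed": 0, "failed": 0, "skipped": 0})
for day, status in executions:
    series[day][status] += 1

for day in sorted(series):      # X-axis: dates
    print(day, series[day])     # Y-axis: totals per status
```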
The Test type status widget is a graph showing the breakdown of test types. The widget displays the total numbers for each of the following:
Unit tests (Purple): Displays the results of automated tests that ensure a section of an application meets its design and behaves as intended.
Functional tests (Blue): Displays the results of successfully executed Functional tests.
Other tests (Pink): Displays the results of other tests executed.
When Unit tests, Functional tests, or Other tests are checked or unchecked, the Overall test status widget will automatically be updated based on the new parameters, as in the sketch below.
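Conceptually, checking or unchecking a test type filters the underlying records before the status counts are recomputed. A minimal sketch with fabricated records:

```python
# Illustrative only: recomputing Overall test status counts after the user
# checks or unchecks test types. The records are fabricated for the example.
from collections import Counter

records = [
    {"type": "unit", "status": "passed"},
    {"type": "functional", "status": "failed"},
    {"type": "other", "status": "passed"},
    {"type": "unit", "status": "skipped"},
]

selected_types = {"unit", "functional"}   # "other" is unchecked
counts = Counter(r["status"] for r in records if r["type"] in selected_types)
print(counts)  # Counter({'passed': 1, 'failed': 1, 'skipped': 1})
```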
Top 5 technical services with code smells/bugs
Based on a 7-day sample, this graph depicts the top 5 technical services with the most code smells/bugs.
Code Smells is the default selection; using the drop-down menu, you can view Bugs instead.
The Technical service tests table is located at the bottom of the Test dashboard. It provides technical service test data in tabular form and enables a detailed view of each service. Each row in the table displays information for a specific service, separated into columns by information type (a sketch of one row as a data structure follows this list):
Technical service: The name of the microservice within the larger application.
Applications: The IBM cloud application, such as CAM or Enterprise Marketplace, whose test environment health is shown.
Test type: The type of test.
Failed: The total number and percentage of tests in the Failed category or group.
Skipped: The total number and percentage of tests in the Skipped category or group.
Passed: The total number and percentage of tests in the Passed category or group.
Total: The total number of tests.
Bugs: The total number of bugs. (Hidden by default)
Code smells: The total number of code smells. (Hidden by default)
Coverage: The type of coverage. (Hidden by default)
Release: The release associated with the executed tests.
Environment: The instance of the application where the tests were executed.
Duration: The total time a test took to execute.
Execution date: The date the test was executed.
Tool engine: The test tool source.
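For reference, one row of this table could be modeled as follows; the class name, field names, and types are assumptions for illustration, not the product's schema.

```python
# Illustrative only: a row of the Technical service tests table modeled as a
# dataclass mirroring the columns above. Names and types are assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class TechnicalServiceTestRow:
    technical_service: str
    application: str
    test_type: str
    failed: int
    skipped: int
    passed: int
    total: int
    bugs: int              # hidden by default
    code_smells: int       # hidden by default
    coverage: str          # hidden by default
    release: str
    environment: str
    duration_seconds: float
    execution_date: date
    tool_engine: str
```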
The Technical service tests table displays all data regardless of the time frame selected. All columns in this table can be sorted. Above the table, you will find a search box that allows searching for technical services by name and a Settings icon that allows changing the table settings to show or hide pre-selected columns.
The table view also supports detailed views for each service. To access details for a specific technical service, select the overflow menu to the far right of the table and select View details. Select the following link for more information:
Technical services tests table view details