You can assess the status of your model testing activities by using the metrics in the Model Testing Dashboard. When you test your models against requirements, you maintain traceability between the requirements, models, test cases, and results. The dashboard helps you to track the status of these artifacts and the traceability relationships between them. Each metric in the dashboard measures a different aspect of the quality of the testing artifacts and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. From the dashboard, you can identify and fix testing issues. Update the dashboard metrics to track your progress toward testing compliance.
The Model Testing Dashboard shows data on the traceability and testing status of each component in your project. When you model software, a component is a functional part of the architecture that you can execute and test independently. The dashboard considers each model in your project to represent a component because you use models to design and test the algorithms.
Open the project that contains the models and testing artifacts. For this example, at the command line, type:
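If you are adapting these steps to your own files, a minimal sketch of opening a project from the MATLAB command line looks like the following. The project file name here is a placeholder, not the file used in this example:

```matlab
% Open a project from the MATLAB command line.
% "MyModelTestingProject.prj" is a hypothetical name; substitute your project file.
proj = openProject("MyModelTestingProject.prj");

% Confirm which project is now open.
disp(proj.Name)
```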
On the Project tab, click Model Testing Dashboard.
The first time that you open the dashboard for the project, the dashboard must identify the artifacts in the project and trace them to the models.
To run the traceability analysis and collect metric results, click Trace and Collect All. Collecting metric results requires a license for Simulink® Check™, Simulink Requirements™, and Simulink Test™. Once metrics have been collected, viewing the results requires only a Simulink Check license.
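As a programmatic alternative to clicking Trace and Collect All, you can collect metric results with the metric API in Simulink Check. This is a sketch of the typical pattern, assuming the project is already open and the required licenses are available:

```matlab
% Create a metric engine for the current project.
metric_engine = metric.Engine();

% Analyze traceability and collect results for the model testing metrics.
% Requires Simulink Check, Simulink Requirements, and Simulink Test licenses.
execute(metric_engine);
```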
The dashboard analyzes the traceability links from the artifacts to the models in the project and populates the widgets with metric results for the component that is selected in the Artifacts panel.
When the dashboard collects and reports metric data, it scopes the results to the artifacts in one component in the project. Use the Artifacts panel to see each component in the project, represented by the name of its model, and the artifacts that trace to it.
In the Artifacts panel, click the component db_DriverSwRequest. The dashboard widgets populate with metric data from the artifacts in this component.
In the Artifacts panel, expand the section for the component. Click the arrow to the left of db_DriverSwRequest. Each filtered section below the component shows the artifacts of each type that trace to the component.
Expand the Functional Requirements section. This component uses requirements in the file db_req_func_spec.slreqx. Click the arrow to the left of a file name to see the individual requirements that trace to the component.
You can explore the components and sections in the Artifacts panel to see which requirements, test cases, and test results trace to each component in the project. For more information on how the dashboard analyzes this traceability, see Trace Artifacts to Units for Model Testing Analysis.
In the Artifacts panel, click the component db_DriverSwRequest. The dashboard widgets populate with metric results for the component.
To update the metric results for the component, click Collect Results.
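After results have been collected, you can also query them programmatically instead of reading the widgets. A sketch using the metric API; the metric identifier below is an assumed example, so check the Model Testing Metrics reference for the valid IDs in your release:

```matlab
% Create a metric engine for the current project.
metric_engine = metric.Engine();

% Retrieve previously collected results for one metric.
% "TestCasesPerRequirementDistribution" is an assumed example ID;
% see the Model Testing Metrics reference for the full list.
results = getMetrics(metric_engine, "TestCasesPerRequirementDistribution");

% Display the value of each result.
for n = 1:numel(results)
    disp(results(n).Value)
end
```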
In the Test Case Analysis section of the dashboard, locate the Tests with Requirements widget. To view tooltips with details about the results, point to the sections of the dial or to the percentage result.
To explore the metric data in more detail, click an individual metric widget. For example, click the green section of the Tests with Requirements widget.
The table shows each test case for the component, the test file containing each test case, and whether the test case is linked to requirements.
The test case Set button is missing linked requirements. To open the test case in the Test Manager, in the Artifact column, click Set button.
Return to the results for the component. Above the table, click db_DriverSwRequest.
You can view a table of detailed results by clicking each widget in the dashboard. Use the hyperlinks in the tables to open the artifacts and address testing gaps. For more information on using the data in the dashboard, see Explore Status and Quality of Testing Activities Using the Model Testing Dashboard.
To use the Model Testing Dashboard to track your testing activities, set up and maintain your project using the best practices described in Manage Requirements-Based Testing Artifacts for Analysis in the Model Testing Dashboard. As you develop and test your models, use the dashboard to identify testing gaps, fix the underlying artifacts, and track your progress toward testing completion. For more information on finding and addressing gaps in your model testing, see Fix Requirements-Based Testing Issues.
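To archive a snapshot of your progress, you can generate a report of the collected metric results with the metric API. A minimal sketch, assuming the project is open and the licenses are available:

```matlab
% Collect the model testing metric results for the current project.
metric_engine = metric.Engine();
execute(metric_engine);

% Generate a report summarizing the collected results.
generateReport(metric_engine);
```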