Artifact Tracing

Units in the Model Testing Dashboard

A unit is a functional entity in your software architecture that you can execute and test independently or as part of larger system tests. Software development standards, such as ISO 26262-6, define objectives for unit testing. Unit tests typically must cover each of the requirements for the unit and must demonstrate traceability between the requirements, the test cases, and the unit. Unit tests must also meet certain coverage objectives for the unit, such as modified condition/decision coverage.
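
For example, if you measure coverage with Simulink Coverage, you can turn on decision, condition, and modified condition/decision coverage for a unit model before running its unit tests. The following is a minimal sketch under the assumption that coverage is recorded through the model configuration; the model name myUnitModel is a placeholder, and in a Simulink Test workflow you typically enable the same metrics in the test file instead.

    % Minimal sketch: enable model coverage, including MC/DC, for a unit model.
    % 'myUnitModel' is a placeholder model name; requires Simulink Coverage.
    model = 'myUnitModel';
    load_system(model);
    set_param(model, 'RecordCoverage', 'on');      % record coverage when the model simulates
    set_param(model, 'CovMetricSettings', 'dcm');  % d = decision, c = condition, m = MC/DC
    save_system(model);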

You can label models as units in the Model Testing Dashboard. The dashboard then provides metric results for each unit. If you do not specify the models that are considered units, then the dashboard considers a model to be a unit if it does not reference other models.

In the Model Testing Dashboard, in the Artifacts pane, the unit dashboard icon indicates a unit. If a unit is referenced by a component, it appears under the component. If a unit references one or more other models, those models are part of the unit. The referenced models appear in the Design folder under the unit and contribute to the metric results for the unit.

Artifacts panel showing the model Component1 expanded to show the models Unit1 and Unit2. The panel also shows the model Component2 expanded to show Component3.

To specify which models are units, label them in your project and configure the dashboard to recognize the label, as shown in Specify Models As Components and Units.

Components in the Model Testing Dashboard

A component is an entity that integrates multiple testable units together. For example, a model that references multiple unit models could be a component model. A component could also integrate other components. The Model Testing Dashboard organizes components and units under the components that reference them in the Artifacts pane. The dashboard does not provide metric results for components because components typically must meet different testing objectives than units.

If you do not specify the models that are considered components, then the dashboard considers a model to be a component if it references one or more other models.

In the Model Testing Dashboard, in the Artifacts pane, the component icon indicates a component. To see the models under a component, expand the component node by clicking the arrow next to the component icon.

To specify the models that are considered components, label them in your project and configure the dashboard to recognize the label, as shown in Specify Models As Components and Units.

Specify Models As Components and Units

You can control which models appear as units and components by labeling them in your project and configuring the Model Testing Dashboard to recognize the labels.

  1. Open your project. For example, at the command line, type dashboardCCProjectStart. This example project already has component and unit models configured.

  2. In the MATLAB® Project window, right-click in the Labels pane and click Create New Category. Enter a name for the category that will contain your testing architecture labels, for example, Testing Interface.

  3. Create a label for the units. In the Labels pane, right-click the category that you created and click Create New Label. Name the label Software Unit.

  4. Create another label for component models and name the label Software Component.

    Project window showing the Labels pane in the bottom left corner. The Testing Interface category is expanded and shows the labels Software Component and Software Unit under the category.

    The unit and component labels appear under the category in the Labels pane.

  5. Label the models in the project as components and units. In the project pane, right-click a model and click Add label. In the dialog box, select the label and click OK. For this example, apply these labels:

    • db_Controller — Software Component

    • db_ControlMode — Software Unit

    • db_DriverSwRequest — Software Unit

    • db_TargetSpeedThrottle — Software Unit

  6. On the Project tab, in the Tools section, click Model Testing Dashboard.

  7. On the Dashboard tab, click Options.

  8. In the Project Options dialog box, specify the category and labels that you created for the components and units. For the component interface, set Category to Testing Interface and Label to Software Component. For the unit interface, set Category to Testing Interface and Label to Software Unit.

    Project Options dialog box showing categories and labels specified for component and unit interfaces.

  9. Click Trace Artifacts. The dashboard updates the traceability information in the Artifacts pane and organizes the models under the component models that reference them. If a model is not referenced by a component, it appears at the top level of the component hierarchy in the Artifacts pane.

For each unit, the dashboard shows the artifacts that trace to the unit. To view the metric results for a unit, click the unit name in the Artifacts pane.
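
If you prefer to script this setup, the MATLAB project API provides an equivalent workflow. The following is a minimal sketch that assumes the example project from step 1 is open; the model file paths are placeholders that depend on your project layout, and createCategory errors if the category already exists.

    % Minimal sketch: create the testing category and labels, then apply them
    % to the models. File paths below are placeholders.
    proj = currentProject;
    category = createCategory(proj, 'Testing Interface');
    createLabel(category, 'Software Component');
    createLabel(category, 'Software Unit');

    % Label the component model.
    componentFile = findFile(proj, 'models/db_Controller.slx');
    addLabel(componentFile, 'Testing Interface', 'Software Component');

    % Label the unit models.
    for unitName = ["db_ControlMode" "db_DriverSwRequest" "db_TargetSpeedThrottle"]
        unitFile = findFile(proj, "models/" + unitName + ".slx");
        addLabel(unitFile, 'Testing Interface', 'Software Unit');
    end

After labeling the models, you still point the dashboard at the category and labels in the Project Options dialog box, as described in steps 7 and 8.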

Trace Artifacts to Units for Model Testing Analysis

To determine which artifacts are in the scope of a unit, the Model Testing Dashboard analyzes the traceability links between the artifacts and the software unit models in the project. The Artifacts pane lists the unit models, represented by the model names, organized by the components that reference them. Under each unit, the pane shows these artifacts that trace to the unit:

  • Functional Requirements

  • Design Artifacts

  • Test Cases

  • Test Results

Artifacts panel showing units and traced artifacts

To see the traceability path that the dashboard found from an artifact to its unit, right-click the artifact and click View trace to unit. A traceability graph opens in a new tab in the Model Testing Dashboard. The graph shows the connections and intermediate artifacts that the dashboard traced from the unit to the artifact. To see the type of traceability that connects two artifacts, place your cursor over the arrow that connects the artifacts. The traceability relationship is either one artifact containing the other or one artifact tracing to the other. For example, the trace view for the functional requirement CC003_05 shows that it is contained in the requirement Activating cruise control. The container requirement traces to the functional requirement Set Switch Detection, which traces to the unit db_DriverSwRequest.

Dashboard trace view for a functional requirement.

After the list of models, the Untraced folder shows artifacts that the dashboard has not traced to models. If an artifact returns an error during traceability analysis, the panel includes the artifact in the Errors folder. Use the traceability information in these sections and in the units to check if the testing artifacts trace to the models that you expect. To see details about the warnings and errors that the dashboard finds during artifact analysis, at the bottom of the Model Testing Dashboard dialog, click Diagnostics.

As you edit and save the artifacts in your project, the dashboard tracks your changes and indicates if the traceability data in the Artifacts panel might be stale by enabling the Trace Artifacts button. To update the traceability data, click Trace Artifacts. If the button is not enabled, the dashboard has not detected changes that affect the traceability information.
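
You can also collect the dashboard metric results programmatically with the metric.Engine API in Simulink Check; metric collection analyzes artifact traceability as part of the run. The following is a sketch that assumes the project is open and uses TestCaseStatus as an example metric identifier; verify the identifiers available in your release.

    % Minimal sketch: collect a model testing metric for the open project.
    % 'TestCaseStatus' is an example metric identifier; check the metrics
    % available in your release.
    metricEngine = metric.Engine();
    execute(metricEngine, 'TestCaseStatus');
    results = getMetrics(metricEngine, 'TestCaseStatus');
    generateReport(metricEngine, 'Type', 'html-file');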

Functional Requirements

The folder Functional Requirements shows requirements where the Type is set to Functional and that trace to the unit model directly or through a container requirement, a library subsystem, or a combination of the two. For more information about linking requirements, see Requirement Links (Simulink Requirements).

If a requirement does not trace to a unit, it appears in the Untraced Artifacts folder. If a requirement does not appear in the Artifacts panel when you expect it to, see Requirement Missing from Artifacts Pane.

When you collect metric results for a unit, the dashboard analyzes a subset of the requirements that appear in the Functional Requirements folder. The metrics analyze only requirements where the Type is set to Functional and that are directly linked to the model with a link where the Type is set to Implements. A requirement that traces to the unit but does not have these settings appears in the Functional Requirements folder but does not contribute to the metric results for requirements. For troubleshooting metric results for requirements, see Fix a requirement that does not produce metric results.
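
As an illustration of these settings, the following sketch uses the Requirements Toolbox (Simulink Requirements) API to set a requirement Type to Functional and to give its link to the unit the Implement type. The requirement set name, requirement ID, and linked block are placeholders; in practice you typically create the links in the Requirements Editor or on the Simulink canvas.

    % Minimal sketch: give a requirement and its link the settings that the
    % requirements metrics analyze. The names and IDs below are placeholders.
    reqSet = slreq.load('myRequirementSet');       % placeholder requirement set name
    req = find(reqSet, 'Id', 'R1');                % placeholder requirement ID
    req.Type = 'Functional';                       % requirement type that the metrics analyze

    load_system('db_DriverSwRequest');
    blockHandle = get_param('db_DriverSwRequest/Enable Switch', 'Handle');  % placeholder block
    link = slreq.createLink(req, blockHandle);     % link the requirement into the unit model
    link.Type = 'Implement';                       % link type that the metrics analyze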

Design Artifacts

The folder Design shows:

  • The model file that contains the block diagram for the unit.

  • Models that the unit references.

  • Libraries that are partially or fully used by the model.

  • Data dictionaries that are linked to the model.

Test Cases

The folder Test Cases shows test cases that trace to the model. This includes test cases that run on the model and test cases that run on subsystems in the model by using test harnesses. Create these test cases in a test suite file by using Simulink® Test™.
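
For reference, the following is a minimal sketch of creating such a test case programmatically with the Simulink Test API. The file, suite, and test case names are placeholders, and db_DriverSwRequest is the unit model from the example project.

    % Minimal sketch: create a test file, a test suite, and a baseline test
    % case that runs on the unit model. The names used here are placeholders.
    tf = sltest.testmanager.TestFile('db_DriverSwRequest_tests.mldatx');
    ts = createTestSuite(tf, 'Unit Test Suite');
    tc = createTestCase(ts, 'baseline', 'Detect set switch');
    setProperty(tc, 'Model', 'db_DriverSwRequest');  % run the test case on the unit model
    saveToFile(tf);                                  % save the test file so it can be traced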

If a test case does not trace to a unit, it appears in the Untraced Artifacts folder. If a test case does not appear in the Artifacts panel when you expect it to, see Test Case Missing from Artifacts Pane.

When you collect metric results for a unit, the dashboard analyzes a subset of the test cases that appear in the Test Cases folder. The dashboard analyzes only test cases that run on the model. Subsystem test harnesses appear in the folder but do not contribute to the metrics because they do not test the whole model. For troubleshooting test cases in metric results, see Fix a test case that does not produce metric results.

Test Results

The folder Test Results shows these types of test results from test cases that test the model:

  • Saved test results — results that you have collected in the Test Manager and have exported to a results file.

  • Temporary test results — results that you have collected in the Test Manager but have not exported to a results file. When you export the results from the Test Manager, the dashboard analyzes the saved results instead of the temporary results. Additionally, the dashboard stops recognizing the temporary results when you close the project or close the result set in the Simulink Test Result Explorer. If you want to analyze the results in a subsequent test session or project session, export the results to a results file (see the sketch after this list).
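
For example, the following sketch assumes your test files are already loaded in the Test Manager; it runs the tests and exports the result set so that the dashboard analyzes saved results rather than temporary results. The output file name is a placeholder.

    % Minimal sketch: run the loaded test files and export the result set.
    % The results file name is a placeholder.
    resultSet = sltest.testmanager.run;
    sltest.testmanager.exportResults(resultSet, 'db_DriverSwRequest_results.mldatx');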

If a test result does not trace to a unit, it appears in the Untraced Artifacts folder. If a test result does not appear in the Artifacts panel when you expect it to, see Test Result Missing from Artifacts Pane.

When you collect metric results for a unit, the dashboard analyzes a subset of the test results that appear in the Test Results folder. For troubleshooting test results in dashboard metric results, see Fix a test result that does not produce metric results.

Untraced Artifacts

The folder Untraced shows artifacts that the dashboard has not traced to models. Use the Untraced folder to check if artifacts are missing traceability to the units. When you add traceability to an artifact, update the information in the panel by clicking Trace Artifacts. The Model Testing Dashboard does not support traceability analysis for some artifacts and some links. If an artifact is untraced when you expect it to trace to a unit, see the troubleshooting solutions in Untraced Artifacts.

Artifact Errors

The folder Errors shows artifacts that returned errors when the dashboard performed artifact analysis. These are some errors that artifacts might return during traceability analysis:

  • An artifact returns an error if it has unsaved changes when traceability analysis starts.

  • A test results file returns an error if it was saved in a previous version of Simulink.

  • A model returns an error if it is not on the search path.

Open these artifacts and fix the errors. Then, to analyze the traceability in the dashboard, click Trace Artifacts.
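
For the common errors listed above, a command-line fix might look like this sketch. The model name and folder name are placeholders, and the addPath call assumes the missing model lives in a folder that is not yet on the project path.

    % Minimal sketch: resolve common artifact errors before re-tracing.
    % Save a loaded model that has unsaved changes ('myUnitModel' is a placeholder).
    if bdIsDirty('myUnitModel')
        save_system('myUnitModel');
    end

    % Add a folder to the project path so that its models can be found during
    % tracing ('models' is a placeholder folder name).
    proj = currentProject;
    addPath(proj, fullfile(proj.RootFolder, 'models'));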

Diagnostics

To see details about artifacts that cause warnings, errors, and information messages during analysis, at the bottom of the Model Testing Dashboard dialog, click Diagnostics. You can filter the diagnostic messages by type and clear the messages from the viewer.

The diagnostic messages show:

  • Modeling constructs that the dashboard does not support

  • Links that the dashboard does not trace

  • Test harnesses or cases that the dashboard does not support

  • Test results missing coverage or simulation results

  • Artifacts that return errors when the dashboard loads them

  • Information about model callbacks that the dashboard deactivates

  • Files that have file shadowing or path traceability issues

  • Artifacts that are not on the path and are not considered during tracing