Tests Management

Tests management is an important and integral part of verification. A tremendous amount of the development cycle is usually spent on verification, so managing tests in a systematic way shortens and eases the verification cycle significantly. Common questions - Do we have a test for each requirement? Which tests should we prioritize? Have we executed all of the tests? What is the status of the tests? Am I done with verification? - can be answered easily with efficient tests management.

Spec-TRACER offers robust tests management features, providing verification engineers and project managers real-time visibility into verification activities, test status and test results. Spec-TRACER provides the following:

Creation and Management of Test Plans and Test Cases
- Automatic capture of test cases by importing from DOC, XLS, TXT and CSV files via heading styles or tags, including test ID, name, description, tables and pictures. A track-changes mechanism detects changes when test cases are re-imported.
- Manual capture of test cases within the tool environment, which eliminates the use of MS Excel as the source; MS Excel spreadsheets can be the output of the tool for reporting purposes, not the source.
- Management of changes and versions of test cases. Different versions of test cases can be compared to measure the effectiveness of test case capture and to determine how frequently test cases changed during the project.

An example of a test plan that can be created in Spec-TRACER is shown below. The test plan is divided into columns that are completely customizable by the user:
- Code – Unique code of the requirement or feature that needs to be tested.
- Name – Name of the requirement or feature that needs to be tested.
- CoverageLink – Link to coverage objects from the design. The link is the name of an object or a hierarchical name referring to an object in the design.
- CoverageType – Type of coverage (Assertion, Cover, CoverGroup, CoverPoint, Cross, or Directed Test).
- CoverageWeight – Weight for a coverage link, used to calculate total coverage for a linked item and test session element. The weight is a positive integer or 0.
- CoverageGoal – Percentage value that serves as the coverage goal for a particular requirement or feature that needs to be tested.

Spec-TRACER also provides an environment for reviewing test plans, test cases and test results against a checklist, and for generating reports of the review activities.

Creation and Assignment of Test Attributes
- Pre-defined and user-defined test attributes can be assigned to give context to test characteristics and importance. The example below shows the user-defined test attributes TestFinishTime, TestInfo, Safety and Priority.

Spec-TRACER also includes a command-line application that can be executed from simulation scripts (running on an Aldec or non-Aldec simulator). The application:
- Parses log files based on regular expressions and stores test results in the Spec-TRACER database.
- Reads coverage databases (UCIS, ACDB, UCDB) and stores coverage results in the Spec-TRACER database.
- Automatically creates traceability from test cases to test results (log files, waveforms, code coverage and functional coverage).

Results Analysis
- Tests Regression – View the regression results of all test cases, including cumulative code coverage. The example below shows two test regressions, TS_Session_08_06 and TS_Session_09_20, and the status of all test cases executed in each regression, together with the cumulative code coverage for each regression and other test attributes such as TestFinishTime and TestInfo.
- Tests History – View the history of individual test cases executed across multiple regression runs. The example below shows the history of three test cases, TST-000, TST-001 and TST-002, and their status for each regression executed at different times.
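The CoverageWeight and CoverageGoal columns suggest a weighted roll-up of per-link coverage into a total for each test plan item. The sketch below is a minimal illustration of that idea only; the data model and formula are assumptions for illustration, not Spec-TRACER's actual algorithm.

```python
# Hypothetical illustration of a weighted coverage roll-up.
# The (percent, weight) pairs and the averaging formula are assumptions,
# not Spec-TRACER's actual data model or algorithm.

def total_coverage(links):
    """Weighted average of per-link coverage percentages.

    links: list of (coverage_percent, weight) pairs, where weight is a
    non-negative integer as described for CoverageWeight. Links with
    weight 0 contribute nothing to the total.
    """
    total_weight = sum(w for _, w in links)
    if total_weight == 0:
        return 0.0
    return sum(pct * w for pct, w in links) / total_weight

# Example: three coverage links with different weights.
links = [(100.0, 2), (50.0, 1), (0.0, 1)]
print(total_coverage(links))          # 62.5
goal = 80.0                           # a CoverageGoal-style threshold
print(total_coverage(links) >= goal)  # False: goal not yet met
```

A weight of 0 effectively excludes a link from the roll-up while keeping it visible in the plan, which matches the stated range of "a positive integer or 0".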
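The command-line application's log parsing can be pictured with a short sketch: scan a simulator log with a regular expression and collect per-test results. The log format, test IDs and pattern below are hypothetical; in practice the patterns would be configured to match each simulator's actual log format.

```python
import re

# Hypothetical simulator log excerpt; real formats vary per simulator,
# and the regular expression would be configured accordingly.
log_text = """\
# TEST TST-000: PASSED
# TEST TST-001: FAILED (assertion at top.sv:42)
# TEST TST-002: PASSED
"""

# Regular expression capturing a test ID and its status as named groups.
RESULT_RE = re.compile(r"TEST\s+(?P<test_id>TST-\d+):\s+(?P<status>PASSED|FAILED)")

def parse_results(text):
    """Return {test_id: status} for every result line matched in the log."""
    return {m.group("test_id"): m.group("status")
            for m in RESULT_RE.finditer(text)}

results = parse_results(log_text)
print(results)  # {'TST-000': 'PASSED', 'TST-001': 'FAILED', 'TST-002': 'PASSED'}
```

A dictionary like this is the kind of per-test status data that would then be stored in a results database and linked back to the originating test cases.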