Stress-Relief for Requirements-Based Verification

Verification of Safety-Critical FPGAs under Strict DO-254 Guidance

Louie de Luna, Aldec DO-254 Program Manager

If they’re being honest, anyone who has verified an FPGA under strict DO-254 guidance will tell you that it is stressful. Show me an engineer on their first DO-254 project – and I’ll show you someone pulling out their hair and downing what is probably their fifth cup of coffee while these important questions weigh heavily on their mind:


Have we reviewed all FPGA requirements and validated derived FPGA requirements? Do we have a good record of the review activities?

Do I have a test for each functional FPGA requirement? What’s the status of the tests? How do I track the progress and document the results?

How do I manage traceability and create traceability matrices?

What are the differences between the requirements baseline from SOI#2 and the requirements baseline from SOI#3?

What design and verification elements are impacted due to a requirement change?


These questions might sound scary to anyone new to DO-254, but they can be answered simply by understanding the primary processes involved and by having a well-defined set of deliverables. The processes and deliverables for requirements-based verification are: Requirements Capture and Allocation; Requirements Review and Validation; Test Planning, Execution and Management; Traceability; Coverage Analysis; and Defect Management.

Here’s an overview.

Requirements Capture and Allocation

TIP: The board designer is the ideal candidate to capture the FPGA requirements, since he or she knows the most about the FPGA’s intended functions.

The FPGA requirements capture process allocates board functions to the FPGA and records them as FPGA requirements. Requirements are captured and created based on the standards, methods, rules, procedures and criteria defined during the planning process. Requirements capture is iterative, as additional requirements may be generated during the design process.
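A captured requirement is essentially a record linking an FPGA-level statement back to the board function it was allocated from. A minimal sketch in Python, where the ID scheme ("BRD-REQ-010", "FPGA-REQ-001") and the status values are illustrative assumptions, not part of any standard:

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    """One FPGA requirement allocated from a board-level function."""
    req_id: str            # e.g. "FPGA-REQ-001" (naming scheme is hypothetical)
    text: str              # the requirement statement itself
    parent_board_req: str  # board requirement this was allocated from
    derived: bool = False  # True if created by a design decision, not allocation
    status: str = "draft"  # illustrative life cycle: draft -> reviewed -> baselined

def allocate(board_req_id: str, req_id: str, text: str) -> Requirement:
    """Record a board function as an FPGA requirement, keeping the parent link."""
    return Requirement(req_id=req_id, text=text, parent_board_req=board_req_id)

req = allocate("BRD-REQ-010", "FPGA-REQ-001",
               "The FPGA shall assert READY within 10 clock cycles of reset release.")
```

Keeping the `parent_board_req` link from the start is what later makes upstream traceability and impact analysis cheap rather than painful.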


Requirements Review and Validation

TIP: Baselining requirements before and after review enables measurement of effectiveness and amount of work required to implement changes.

In this process FPGA requirements are reviewed for suitability and correctness in relation to board requirements. Derived FPGA requirements, created as a result of design decisions or architectural changes, are validated for correctness and completeness and evaluated for impact on safety.  Review and validation activities are conducted against a well-defined checklist.


Test Planning, Execution and Management

Formulating a verification plan with all members of the verification team is an essential step. The verification engineer must create a directed test or a constrained-random test to verify each requirement and, since each test case is derived from a requirement, traceability must also be established. Each test case must have clearly described input conditions, test sequences and expected results, and must exercise the intended functions of the FPGA under normal and foreseeable abnormal conditions.

A self-checking testbench with PASS/FAIL criteria, or comparison against a golden waveform, works best. If defects are found, corrective actions must be recorded and implemented. Test results, including waveforms, log files, static timing reports and code coverage results, must be traced to the test case. Once a test case has passed, the corresponding requirement is considered verified, or covered.
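The self-checking idea boils down to comparing a captured output sequence against an expected one and recording a verdict tied to the test case ID. A minimal sketch in Python (the IDs and the shift-register values are made up for illustration; in practice the captured values would come from a simulation log or waveform dump):

```python
def check(test_id, actual, expected):
    """Self-checking comparison: PASS if captured output matches expected output."""
    verdict = "PASS" if actual == expected else "FAIL"
    return {"test": test_id, "verdict": verdict,
            "actual": actual, "expected": expected}

# Directed test: drive known inputs, compare captured output to expected values.
expected_seq = [0, 1, 3, 7]   # expected shift-register states (illustrative)
captured_seq = [0, 1, 3, 7]   # e.g. values parsed from a simulation log
result = check("TC-001", captured_seq, expected_seq)
```

Returning a structured record rather than just printing "PASS" is what lets the result be traced back to the test case and rolled up into the progress reports discussed below.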



Traceability

TIP: Some verification teams use spreadsheets to trace test cases to the appropriate requirement, and to trace test results to the appropriate test case. Spec-TRACER™ was developed to complete this task automatically, enabling simpler management of bi-directional traceability.

Downstream traceability between FPGA requirements and test cases will expose FPGA requirements that do not have a test, and upstream traceability between FPGA requirements and test cases will expose test cases that are either redundant or unnecessary.

Traceability also supports Impact Analysis. Having traceability links between FPGA requirements, HDL design, test case and test results enables the generation of downstream traceability reports to identify elements that may be impacted by a requirement change.
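Both checks fall out of a single test-to-requirement mapping. A small sketch in Python, with hypothetical requirement and test IDs: the downstream view exposes requirements with no test, and the upstream view flags requirements hit by multiple tests as candidates for a redundancy review:

```python
from collections import Counter

# Trace links: test case -> requirement it verifies (hypothetical IDs).
trace = {
    "TC-001": "FPGA-REQ-001",
    "TC-002": "FPGA-REQ-002",
    "TC-003": "FPGA-REQ-002",   # second test for the same requirement
}
requirements = {"FPGA-REQ-001", "FPGA-REQ-002", "FPGA-REQ-003"}

# Downstream: requirements with no test case at all.
untested = requirements - set(trace.values())

# Upstream: requirements covered by more than one test case --
# each extra test is either intentional or redundant, and should be reviewed.
counts = Counter(trace.values())
multiply_covered = {req for req, n in counts.items() if n > 1}
```

The same mapping, extended with links to HDL modules and result files, is what drives the impact-analysis reports: walk downstream from a changed requirement and everything reachable is potentially affected.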


Coverage Analysis

During test execution, verification engineers are expected to deliver progress reports to the project manager. Reports that support coverage analysis might include requirements coverage, test results and code coverage analysis.

Requirements Coverage Analysis details requirements that have been covered – or not covered – by a test case. For any requirement missing a test, a new test case must be created.

Test Results Analysis describes the actual results and the interpretation as to whether each actual result meets the expected result, including any necessary justification or corrective actions.

Code Coverage Analysis describes the parts of the HDL code that were not executed by the testbench. If 100% code coverage is not achieved, the testbench or design must be updated.
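A progress report for the project manager is, at its core, a tally of verdicts across the test cases. A minimal sketch in Python, assuming a simple verdict vocabulary (PASS / FAIL / NOT_RUN) and hypothetical test IDs:

```python
from collections import Counter

def progress_report(results):
    """Summarize test execution; results maps test ID -> verdict string."""
    tally = Counter(results.values())
    total = len(results)
    executed = tally["PASS"] + tally["FAIL"]
    return {
        "total": total,
        "executed": executed,
        "pass": tally["PASS"],
        "fail": tally["FAIL"],                # each FAIL should map to a recorded defect
        "percent_executed": 100.0 * executed / total if total else 0.0,
    }

report = progress_report({"TC-001": "PASS", "TC-002": "FAIL",
                          "TC-003": "PASS", "TC-004": "NOT_RUN"})
```

Joined with the traceability mapping above, the same tally answers the requirements-coverage question directly: a requirement is covered once a passing test case traces to it.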


Defect Management

Defects that are found during test execution must be recorded and managed, with each defect traced to a specific test case. Having a structured defect life cycle helps trace the journey of the defects so that they can be efficiently tracked, managed and quickly corrected.
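A "structured defect life cycle" usually means a fixed set of states with explicit, allowed transitions. A sketch in Python, where the state names and transition rules are one plausible scheme rather than anything mandated by DO-254:

```python
from enum import Enum

class DefectState(Enum):
    NEW = "new"
    ASSIGNED = "assigned"
    FIXED = "fixed"
    VERIFIED = "verified"
    CLOSED = "closed"

# Allowed transitions in this (illustrative) defect life cycle.
TRANSITIONS = {
    DefectState.NEW: {DefectState.ASSIGNED},
    DefectState.ASSIGNED: {DefectState.FIXED},
    DefectState.FIXED: {DefectState.VERIFIED, DefectState.ASSIGNED},  # reopen on failed retest
    DefectState.VERIFIED: {DefectState.CLOSED},
    DefectState.CLOSED: set(),
}

def advance(state, new_state):
    """Move a defect to a new state, rejecting illegal jumps (e.g. NEW -> CLOSED)."""
    if new_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state.name} -> {new_state.name}")
    return new_state

state = advance(DefectState.NEW, DefectState.ASSIGNED)
```

Rejecting illegal jumps is the point: a defect cannot quietly go from NEW to CLOSED without a fix and a retest, which is exactly the audit trail an SOI review will ask for.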


You may have already noticed that the processes and activities for requirements-based verification are simply engineering best practices.  That is DO-254 in a nutshell – a set of well-proven, engineering practices needed to ensure the commercial airplanes we fly in are safe and reliable.


Aldec offers Spec-TRACER to help manage the activities required for these processes. Spec-TRACER manages requirements capture, tracks each requirement’s version and history, and automates the generation of traceability matrices and coverage analysis reports.


If you’d like to learn more about requirements-based verification, watch our related webinar, Managing Requirements-Based Verification for Safety-Critical FPGAs and SoCs.




Louie de Luna is responsible for FPGA level in-target testing technology and requirements lifecycle management for DO-254 and other safety-critical industry standards.  He received his B.S. in Computer Engineering from University of Nevada in 2001.  His practical engineering experience includes areas in Acceleration, Emulation, Co-Verification and Prototyping, and he has held a wide range of engineering positions that include FPGA Design Engineer, Applications Engineer, Product Manager and Project Manager.

