Verifying complex SoC designs is costly and time consuming. It is well established that the time required to verify a design grows exponentially with the size of the design. In the past few years, many techniques and tools have emerged to help verification engineers cope with this problem. However, most of these techniques are based on dynamic simulation and rely on exercising the circuit to expose design issues, so designers still face the problem of creating stimulus for the design.

Designers can use the firmware that runs on the processor as part of the stimulus for verification simulations; the usual way to do this today is with a full-featured processor model. Compared with writing stimulus in HDL, firmware is faster and easier to create. The disadvantage of executing code on a full-featured processor model is that the model runs slowly, so only a small amount of software can be executed with this technique. Much of the firmware's execution consists of instruction fetches and memory read and write cycles, which have very little verification value. By hiding these low-value operations from the logic simulator while continuing to run register and memory-mapped I/O cycles in it, execution speed can be improved dramatically with minimal loss of verification coverage.
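As a conceptual sketch of this partitioning, the following C fragment shows one way a co-verification environment might decide which bus cycles need to reach the RTL simulator. The address ranges and the function itself are assumptions for illustration, not the API of any particular tool:

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical address map: code and data are served from host memory,
   while register and memory-mapped I/O cycles still go to the logic simulator. */
#define RAM_BASE   0x00000000u
#define RAM_END    0x0FFFFFFFu
#define MMIO_BASE  0x40000000u
#define MMIO_END   0x4FFFFFFFu

/* Return true if a bus cycle should be passed to the RTL simulation. */
static bool cycle_needs_rtl(uint32_t addr)
{
    if (addr >= MMIO_BASE && addr <= MMIO_END)
        return true;    /* register / MMIO access: high verification value   */
    if (addr >= RAM_BASE && addr <= RAM_END)
        return false;   /* instruction fetch or plain memory access: hide it */
    return true;        /* unknown region: be conservative                   */
}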

There are two main benefits to executing code faster in a simulation environment. First, faster simulation means that functional verification can use more code: diagnostics, drivers, firmware, and in some cases portions of the application code can all be used to verify the design. Second, because the simulation runs faster, more verification can be performed. Most verification efforts are limited by the amount of CPU time available to run simulations, and when each test consumes less of that time, many design teams choose to run additional tests. If firmware is used as part of verification, it drives the design with realistic stimulus that exercises typical operations. One of the challenges of creating stimulus for a design is estimating what typical operation looks like and encoding it in the testbench; using the actual software removes this problem for the verification engineer. However, running code alone is unlikely to provide enough stimulus, and in particular it will not cover most of the verification space. Designers therefore need additional techniques to supply the extra stimulus that reaches all of the design's boundary conditions.

Designers can use traditional directed tests and other verification techniques to supplement firmware as a stimulus source. Memory partitioning can be used to filter out unnecessary bus cycles during simulation and improve performance. This article presents a design example that uses code as stimulus together with assertion-based verification to expose design errors that could not be found with traditional verification techniques alone.

Solving the verification challenge

The verification challenges facing electronics engineers continue to grow. To illustrate them, this article uses a simple example: a graphics output device that displays RGB values on a 250 × 250 pixel matrix. It includes a register interface mapped into the processor's address space. The registers are: "row", an 8-bit register containing the row address of the pixel to be drawn; "column", an 8-bit register containing the column address of the pixel to be drawn; "pixel", an 8-bit register containing the RGB value of the pixel to be drawn; "size", an 8-bit register containing the size of the square of pixels to be drawn (1 writes a single pixel, 2 draws a 2 × 2 square, and so on up to a maximum of 16); and "status", an 8-bit read-only register that returns device status information.
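As a sketch only, the following C structure shows how this register interface might appear to firmware. The base address, the struct layout, and the assumption that writing the pixel register triggers the draw are illustrative, not part of the original specification:

#include <stdint.h>

/* Hypothetical memory-mapped register layout for the example display device.
   The base address is an assumption for illustration only. */
#define DISPLAY_BASE 0x40001000u

typedef struct {
    volatile uint8_t row;     /* pixel row address, 0..249          */
    volatile uint8_t column;  /* pixel column address, 0..249       */
    volatile uint8_t pixel;   /* RGB value of the pixel to be drawn */
    volatile uint8_t size;    /* 1 = single pixel .. 16 = 16x16     */
    volatile uint8_t status;  /* read-only device status            */
} display_regs_t;

#define DISPLAY ((display_regs_t *)DISPLAY_BASE)

/* Draw one square of pixels at (row, column). */
static inline void draw_square(uint8_t row, uint8_t col,
                               uint8_t rgb, uint8_t size)
{
    DISPLAY->row    = row;
    DISPLAY->column = col;
    DISPLAY->size   = size;
    DISPLAY->pixel  = rgb;   /* assumed: writing the pixel register triggers the draw */
}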

Using directed tests

The first step in verifying this example device is to test that every row and column can be addressed, that pixels of every size can be written, and that a representative sample of color values can be drawn. Typical pixel sequences should also be tested, such as writing the upper-right pixel immediately followed by the lower-left pixel; all of the corner pairs can be tested the same way. Incrementing and decrementing the row and column addresses, in order and out of order and in various combinations, should also be exercised. All of these tests can be performed either by writing and compiling a simple program that runs on a full-featured processor model, or by using a simple testbench with a bus functional model (BFM) that generates the bus cycles. Error conditions that might affect the design should be considered as well, for example setting the row or column address to a value greater than 249 or requesting a pixel size larger than the hardware supports.
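A minimal sketch of such a directed test written as firmware, reusing the hypothetical draw_square() helper from the earlier sketch (the helper and the register behavior are assumptions):

/* Directed diagnostic: walk every row/column address and hit the corner pairs. */
void direct_test(void)
{
    /* Address every row and column with a single pixel. */
    for (uint8_t r = 0; r < 250; r++)
        for (uint8_t c = 0; c < 250; c++)
            draw_square(r, c, 0xFF, 1);

    /* Corner-to-corner transitions, e.g. upper-right then lower-left. */
    draw_square(0, 249, 0xA5, 1);
    draw_square(249, 0, 0x5A, 1);
    draw_square(0, 0, 0x3C, 1);
    draw_square(249, 249, 0xC3, 1);

    /* Error conditions: out-of-range address and unsupported size. */
    draw_square(250, 0, 0xFF, 1);   /* row > 249  */
    draw_square(0, 0, 0xFF, 17);    /* size > 16  */
}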

These are the obvious tests to perform at the interface level; similar tests apply to the internal structures, and the verification strategy there closely mirrors the one used at the interface. Clearly, covering the entire verification space is not simple, even for an interface as small as the example device's. The possible single operations number 250 rows × 250 columns × 2^24 colors × 16 sizes, or roughly 1.7 × 10^13. The number of pairwise combinations of operations is the square of that value, or more than 10^26. The real challenge is to create the combinations that reveal design problems and to identify the problem areas that need immediate attention.
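For reference, the arithmetic behind these figures can be checked with a few lines of C (the numbers follow directly from the register ranges described above):

#include <stdio.h>

int main(void)
{
    /* 250 rows x 250 columns x 2^24 colors x 16 sizes */
    double single = 250.0 * 250.0 * 16777216.0 * 16.0;  /* ~1.7e13 operations   */
    double pairs  = single * single;                     /* ~2.8e26 combinations */
    printf("single operations: %.2e\n", single);
    printf("pairwise combinations: %.2e\n", pairs);
    return 0;
}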

Using assertions to expose problems early

Because stimulus drives the design, assertions can detect problems early. The assertions to add include checks that the row and column addresses never exceed 249 (their maximum legal value) and that the size field never exceeds 16. Once the assertions are in place and coverage points have been defined with an HDL coverage tool, stimulus is needed to drive the design. This can be achieved with constrained-random testing: constrained-random tests generate transactions for the testbench to apply to the device, and coverage feedback indicates whether the identified test points have been hit. When the design space is very large, however, constrained-random tests cannot be relied on to reach boundary conditions that the coverage points do not capture. It is not especially difficult to create stimulus that reaches 100% coverage as reported by HDL coverage tools, but traversing every state and covering every condition in the design still does not guarantee that the device is fully verified.
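In the design itself these checks would be written in an HDL assertion language; purely for illustration, the following C fragment shows the equivalent checks as a reference model might express them (the function name and usage are assumptions):

#include <assert.h>
#include <stdint.h>

/* Illustrative C rendering of the checks; in the actual design these would be
   HDL assertions attached to the register interface. */
static void check_register_write(uint8_t row, uint8_t column, uint8_t size)
{
    assert(row    <= 249);              /* row address must not exceed 249    */
    assert(column <= 249);              /* column address must not exceed 249 */
    assert(size   >= 1 && size <= 16);  /* size field must be 1..16           */
}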

Software code as stimulus

For a verification space of more than 10^26 combinations, it is unrealistic to exercise every possible combination in actual device operation. The focus should be on the operations the device will actually perform, spending less time on operations that are theoretically possible but unlikely to be used. The easiest and quickest approach is to find existing code that drives the device: diagnostic code, driver code, or application-level algorithms. Each kind of code provides a different level of verification and exposes different types of problems, so it is worth obtaining and using all of them.

For a brand-new design the code probably does not exist, but for a next-generation product some code is often already available. If such code exists, stimulus for the design is available with little effort or cost. If the code does not exist but the software team is willing to create it early in the design cycle, stimulus is still easy to obtain. Finally, if the verification team must create the code itself, writing C makes it easier to produce complex and varied stimulus for the design than any other approach.

The hypothetical display device

For the hypothetical display device, you would run diagnostic code that draws a variety of test patterns and color combinations to confirm connectivity. You could also run the driver code, hooked up to a simple paint application that writes pixels at representative locations, to exercise the driver. Finally, you could run the application that will ultimately use the device and draw a few images. Each type of code exercises the design in a different way, exposing problems that are not easily found by the other methods.
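As an illustrative sketch of the kind of diagnostic pattern code meant here, the following fragment fills most of the screen with color bars using the hypothetical draw_square() helper from the earlier sketch:

/* Diagnostic: fill the screen with a color-bar test pattern in 16x16 squares. */
void draw_color_bars(void)
{
    static const uint8_t bars[8] = { 0x00, 0x24, 0x92, 0xB6, 0x49, 0x6D, 0xDB, 0xFF };

    for (uint8_t r = 0; r < 240; r += 16)        /* 15 full 16x16 rows    */
        for (uint8_t c = 0; c < 240; c += 16)    /* 15 full 16x16 columns */
            draw_square(r, c, bars[(c / 16) % 8], 16);
}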

Hardware/software co-verification

Many hardware and verification engineers (and even some software engineers) believe that running any part of the application adds little to design verification: after all, if the driver has been tested against the device and the application against the driver, no further verification should be needed. Yet these same engineers would never consider releasing a product without system-level testing of all the software, nor would they accept releasing a hardware design to tapeout without system testing. System-level co-verification exercises all of the components, hardware, software, or a combination of both, to uncover problems that would never be discovered by testing them in isolation.
