SystemVerilog included a range of new features intended to improve verification productivity, and the most significant are probably:
- Object-oriented programming
- Constrained randomisation
Functional verification in simulation is entirely a software problem, so by including classes the verification community acquired all of the productivity gains of traditional OO programming (although roughly the state of the art from ~10 years ago: the lack of reflection and the limits of introspection mean that libraries written in SystemVerilog, such as UVM, depend heavily on the pre-processor rather than the language itself).
By adding constrained randomisation, the language requires compliant simulators to provide a constraint solver. This allows the test space to be defined by a set of rules and the simulator to generate test sequences. A classic analogy is Sudoku: by defining the problem in terms of constraints, the simulator can solve the problem rather than the verification engineer having to think up stimulus to exercise the DUT. Typically this makes it possible to hit corner cases that might have been missed with directed testing.
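Python doesn't ship a constraint solver the way a compliant SystemVerilog simulator must, but the idea can be sketched with rejection sampling: keep drawing random values and discard any that violate the rules. This is a toy stand-in (the packet fields and constraints below are made up for illustration):

```python
import random

def random_packet(max_tries=10000):
    """Toy constrained-random generator using rejection sampling.

    A real simulator's constraint solver is far smarter than this,
    but the contract is the same: every value returned satisfies
    the stated constraints."""
    for _ in range(max_tries):
        length = random.randint(0, 1024)
        kind = random.choice(["read", "write", "burst"])
        # Constraints: bursts must be long, reads must be short
        if kind == "burst" and length < 64:
            continue
        if kind == "read" and length > 16:
            continue
        return {"kind": kind, "length": length}
    raise RuntimeError("constraints could not be satisfied")
```

Running this in a loop yields legal-but-varied stimulus without the engineer enumerating cases by hand, which is the whole point of the approach.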
Another improvement in SystemVerilog was the DPI layer, which makes it easier to interface to external languages.
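For a feel of what DPI buys you, here is a rough analogue in Python: calling a plain C function from the high-level language via `ctypes` (a DPI-C import in SystemVerilog serves the same purpose; library-name resolution below varies by platform):

```python
import ctypes
import ctypes.util

# Load the system C math library and declare the C signature of sqrt(),
# much as a SystemVerilog DPI-C import declares a C function prototype.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(9.0))  # calls the C implementation directly
```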
SystemC is a slightly different beast - it can be used for modelling, interfacing to other languages, high-level synthesis etc.
You asked for a comparison of code; I use Python for verification so I don't have any good System[C|Verilog] examples, but Python is an object-oriented language, so this example might still be useful.
From a testbench I created of the OpenCores JPEG encoder:
```python
def compare(i1, i2):
    """
    Compare the similarity of two images
    From http://rosettacode.org/wiki/Percentage_difference_between_images
    """
    assert i1.mode == i2.mode, "Different kinds of images."
    assert i1.size == i2.size, "Different sizes."

    pairs = izip(i1.getdata(), i2.getdata())  # itertools.izip (Python 2)
    dif = sum(abs(c1 - c2) for p1, p2 in pairs for c1, c2 in zip(p1, p2))
    ncomponents = i1.size[0] * i1.size[1] * 3
    return (dif / 255.0 * 100) / ncomponents
```
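If you want to sanity-check the percentage-difference formula without PIL installed, a stub object with the same `mode`/`size`/`getdata()` interface is enough (Python 3 here, so `zip` replaces `izip`; `FakeImage` is purely illustrative):

```python
class FakeImage(object):
    """Minimal stand-in for a PIL Image, just enough for the arithmetic."""
    def __init__(self, pixels):
        self.pixels = pixels          # list of (R, G, B) tuples
        self.mode = "RGB"
        self.size = (len(pixels), 1)  # width x height

    def getdata(self):
        return self.pixels

def compare(i1, i2):
    """Percentage difference between two images (Python 3 port of the
    rosettacode snippet above)."""
    assert i1.mode == i2.mode, "Different kinds of images."
    assert i1.size == i2.size, "Different sizes."
    pairs = zip(i1.getdata(), i2.getdata())
    dif = sum(abs(c1 - c2) for p1, p2 in pairs for c1, c2 in zip(p1, p2))
    ncomponents = i1.size[0] * i1.size[1] * 3
    return (dif / 255.0 * 100) / ncomponents

black = FakeImage([(0, 0, 0), (0, 0, 0)])
half = FakeImage([(0, 0, 0), (255, 255, 255)])
print(compare(half, black))  # one fully-different pixel out of two -> 50.0
```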
```python
@cocotb.coroutine
def process_image(dut, filename="", debug=False, threshold=0.22):
    """Run an image file through the jpeg encoder and compare the result"""
    cocotb.fork(Clock(dut.clk, 100).start())

    driver = ImageDriver(dut)
    monitor = JpegMonitor(dut)

    stimulus = Image.open(filename)
    yield driver.send(stimulus)
    output = yield monitor.wait_for_recv()

    if debug:
        output.save(filename + "_process.jpg")

    difference = compare(stimulus, output)
    dut.log.info("Compressed image differs to original by %f%%" % (difference))

    if difference > threshold:
        raise TestFailure("Resulting image file was too different (%f > %f)" %
                          (difference, threshold))
```
You can see that the use of a structured testbench and OOP makes this code very understandable and quick to create. There was a pure Verilog testbench for this block, and although it doesn't have the same level of functionality the comparison is still interesting.
I was an ASIC Design Verification Engineer at Qualcomm. In the simplest way I can explain it:
Testing: Making sure a product works, after you've created the product (think QA).
Verification: Making sure a product works BEFORE you've created it.
They're both testing, just that verification is more complicated because you have to figure out a way to test the product before it exists and you have to be able to make sure it works as designed and to spec when it actually comes out.
For example, Intel is designing their next processor, they have the specs, they have the schematics and the simulations. They spend $1 Billion USD to go through fabrication and manufacturing. Then the chip comes back and they test it and find out that it doesn't work. They just threw a lot of money out the window.
Throw verification in. Verification engineers create models that simulate the behaviour of the chip, and they create the testbench that will test those particular models. They take the results of these models and compare them with the RTL (the model of the circuit written in a hardware description language) results. If they match, things are (usually) OK.
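As a toy illustration of that model-vs-RTL comparison, here is a minimal scoreboard sketch in Python (everything below is hypothetical; in practice the "DUT outputs" would come from the simulator, not a list):

```python
def golden_adder(a, b, width=8):
    """Hypothetical reference model: an adder that wraps at `width` bits."""
    return (a + b) % (1 << width)

def scoreboard(stimuli, dut_outputs, width=8):
    """Compare each DUT result against the reference model.

    Returns a list of (a, b, got, expected) tuples for any mismatches;
    an empty list means model and RTL agree on this stimulus set."""
    mismatches = []
    for (a, b), got in zip(stimuli, dut_outputs):
        expected = golden_adder(a, b, width)
        if got != expected:
            mismatches.append((a, b, got, expected))
    return mismatches
```

The design choice worth noting is that the model is deliberately simpler than the RTL: it captures intended behaviour, so disagreements point at a bug in one of the two.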
There are a number of different methodologies for the verification process, a popular one is Universal Verification Methodology (UVM).
There is a lot of depth in the field and people can spend their entire career in it.
Another random tidbit of information: Usually you need 3 verification engineers for 1 design engineer. That's what everyone in the field says anyway.
EDIT: A lot of people think of verification as a testing role, but it's not; it's a design role in itself, because you have to understand all the intricacies of your IC like a designer does, and then you have to know how to design models, testbenches, and all the test cases that will cover all the feature functionality of your IC, as well as trying to hit every single line of RTL code for all possible bit combinations. Remember that a processor nowadays has billions of transistors because the fabrication process allows smaller and smaller geometries (now 14 nm).
Also, in large corporations like Intel, AMD, Qualcomm, etc., designers don't actually design the chip. Usually the architect will define all the specs, lay out the types of pieces that need to go together to get a particular function with a specific requirement (i.e. speed, resolution, etc.), and then the designer will code that into RTL. It's by no means an easy job, it's just not as much designing as a lot of engineers coming out of school think it is. What everyone wants to be is an architect, but it takes a lot of education and experience to get to that point. A lot of architects have PhDs and something like 15-20 years of experience in the field as a designer. These are brilliant people (and sometimes crazy) who deserve to be doing what they're doing, and they're good at it. The architect on the very first chip I worked on was a bit awkward and didn't really follow some social norms, but he could solve anything you were stuck on regarding the chip, and sometimes he would solve it in his head and tell you to look at one signal and you'd be like, "how the hell did he do that?". Then you ask him to explain, and he does, and it goes way over your head. He actually inspired me to read textbooks even though I'd already graduated.
Basically, a specification is a list of requirements. A requirement is generally defined as any statement that contains the word "shall". For example, a specification for a digital multimeter might include the following requirement: "The DMM shall have a display resolution of 3000 counts". A function is a capability of the equipment; again using a DMM as an example, a function would be the ability to measure AC volts. A feature is a particular capability that makes the device stand out; for example, the DMM features a wide bandwidth of 100 kHz on the AC volts function, which is better than most DMMs. Features and functions must both be specified in the requirements or they won't be designed into the device. Configuration refers to whether the device as designed will meet all of its specifications. Again, for example, if the device uses 5% resistors, it would be hard to verify that it meets an accuracy requirement of 1%.
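To see why the 5% resistors make a 1% accuracy spec unverifiable, a quick worst-case corner sweep of a nominally 2:1 voltage divider makes the point (the resistor values here are illustrative):

```python
def worst_case_divider_error(tol, r_nominal=10_000.0):
    """Worst-case error (in %) of the nominal 0.5 gain of an
    equal-value voltage divider, with both resistors swept to
    their tolerance limits."""
    corners = (r_nominal * (1 - tol), r_nominal * (1 + tol))
    gains = [r2 / (r1 + r2) for r1 in corners for r2 in corners]
    return max(abs(g - 0.5) / 0.5 for g in gains) * 100

print(worst_case_divider_error(0.05))  # ~5% error: far outside a 1% spec
print(worst_case_divider_error(0.001))  # 0.1% parts would comfortably pass
```

With 5% parts the worst-case gain error is about 5%, so no amount of verification effort can show the design meets 1%; the configuration itself rules it out.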