SystemVerilog introduced a range of new features intended to improve verification productivity; the most significant are probably:
- Object-oriented programming
- Constrained randomisation
Functional verification in simulation is entirely a software problem, so by including classes the verification community gained all of the productivity benefits of traditional OO programming (although roughly ten years behind the state of the art: the lack of reflection and the limits of introspection mean that libraries written in SystemVerilog, such as UVM, depend heavily on the pre-processor rather than on the language itself).
By adding constrained randomisation, the language requires that compliant simulators provide a constraint solver. This allows the test space to be defined by a set of rules, from which the simulator generates test sequences. A classic analogy is Sudoku: by defining the problem in terms of constraints, the solver finds solutions rather than the verification engineer having to hand-craft stimulus to exercise the DUT. This typically makes it possible to hit corner cases that might be missed with directed testing.
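To illustrate the idea in Python (the language used in the examples below), here is a hypothetical sketch of constrained randomisation via simple rejection sampling. The transaction fields and constraint values are invented for illustration; a real SystemVerilog constraint solver works analytically and handles far more complex constraint sets.

```python
import random

def randomize_transaction(max_tries=10000):
    """Hypothetical sketch: draw a (length, addr) pair satisfying a set of
    constraints by rejection sampling. Roughly what `obj.randomize()` gives
    you for free in SystemVerilog."""
    for _ in range(max_tries):
        length = random.randint(1, 64)   # burst length in words
        addr = random.randint(0, 255)    # target address
        # Constraints: word-aligned address; long bursts only to low memory
        if addr % 4 == 0 and (length <= 32 or addr < 128):
            return length, addr
    raise RuntimeError("constraints not satisfied within try budget")
```

Each call yields a different legal stimulus, so a test loop can simply keep drawing transactions instead of enumerating them by hand.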
Another improvement in SystemVerilog was the DPI layer, which makes it easier to interface to external languages.
SystemC is a slightly different beast - it can be used for modelling, interfacing to other languages, high-level synthesis etc.
You asked for a comparison of code; I use Python for verification so I don't have any good System[C|Verilog] examples to hand, but Python is an object-oriented language, so this example might still be useful.
From a testbench I created of the OpenCores JPEG encoder:
```python
from itertools import izip  # Python 2; on Python 3 use the built-in zip

def compare(i1, i2):
    """
    Compare the similarity of two images.
    From http://rosettacode.org/wiki/Percentage_difference_between_images
    """
    assert i1.mode == i2.mode, "Different kinds of images."
    assert i1.size == i2.size, "Different sizes."
    pairs = izip(i1.getdata(), i2.getdata())
    dif = sum(abs(c1 - c2) for p1, p2 in pairs for c1, c2 in zip(p1, p2))
    ncomponents = i1.size[0] * i1.size[1] * 3
    return (dif / 255.0 * 100) / ncomponents
```
```python
@cocotb.coroutine
def process_image(dut, filename="", debug=False, threshold=0.22):
    """Run an image file through the JPEG encoder and compare the result."""
    cocotb.fork(Clock(dut.clk, 100).start())

    driver = ImageDriver(dut)
    monitor = JpegMonitor(dut)

    stimulus = Image.open(filename)
    yield driver.send(stimulus)
    output = yield monitor.wait_for_recv()

    if debug:
        output.save(filename + "_process.jpg")

    difference = compare(stimulus, output)
    dut.log.info("Compressed image differs to original by %f%%" % difference)

    if difference > threshold:
        raise TestFailure("Resulting image file was too different (%f > %f)" %
                          (difference, threshold))
```
You can see that the use of a structured testbench and OOP makes this code very readable and quick to create. There was also a pure Verilog testbench for this block; although it doesn't have the same level of functionality, the comparison is still interesting.
Testbenches can take stimulus from files and write the results to files.
Store the input audio/video data in a file (without compression) and use this as the stimulus, then store the results in another file. You may have to verify the result manually by playing or inspecting the output file.
Implementing and testing on an FPGA would be better, since simulating long audio/video streams can take a long time.
Best Answer
All chips, from the very simple to insanely complex, have functional specifications. These describe what the chip does. The IC designer will make a circuit to implement that functional spec, while a validation engineer or test engineer will develop a set of tests to check the implemented chip against the same functional spec.
It's not necessary for a test engineer or validation person to know 'what's inside'. In fact, good practice avoids having these folks know the implementation details, lest any design assumptions creep into their tests.
In some very limited circumstances a test engineer may need to know a low-level detail, such as an I/O pad structure, to test it properly. They may also need to know about a larger block, like an A/D or D/A converter, in order to access its design-for-test capabilities. But otherwise they treat the chip as if it were a black box.