
DVCon 2014 Wrap-up

Several folks at Verilab presented material at DVCon 2014. All papers and presentations have now been posted here on the Verilab website.

Mark Litterick presented his paper, co-authored with Marcus Harnish, entitled Advanced UVM Register Modeling - There’s More Than One Way to Skin A Reg. The slides for Mark’s presentation can be downloaded here: Advanced UVM Register Modeling Slides

Paul Marriott presented a paper authored by Jason Sprott, Gordon McGregor and André Winkelmann entitled A Guide To Using Continuous Integration Within The Verification Environment. The slides for this presentation can be downloaded here: Guide To Using Continuous Integration - slides

Jeff Montesano presented a poster he co-authored with Mark Litterick entitled Verification Mind Games - How to Think Like a Verifier. The associated paper can be downloaded here: Think like a verifier paper

Vanessa Cooper presented a poster she co-authored with Paul Marriott entitled Demystifying the UVM Configuration Database. The associated paper can be downloaded here: Configuration Database demystification paper

One Response to “DVCon 2014 Wrap-up”

  1. Tudor Timi Says:

    Hi,

    I just read the “Continuous Integration Within the Verification Environment” paper and I have to say I find it cool to see that Agile development is being applied to real life hardware projects.

    I would also like to share a few of my thoughts. You mention using a subset of the regression tests to check the health of the regression environment. I see two problems here.

    First, you are limited by the quality of the DUT. It could very well be the case that the DUT is still very buggy (especially at the start of a project), which will cause test failures even though finding bugs is exactly what you want the verification environment to do.

    Second, because you rely on constrained random testing, it can also be the case that the subset of tests chosen at the beginning is no longer relevant to the health of the verification environment. This could happen, for example, because sequences have changed to do different things than before. In the worst case, sequences that stimulate less of the DUT don’t cause failures and give a false sense of security. Adapting the test subset would also probably require a lot of manual effort.

    Combining the two points, it may even be the case that both the DUT and the VE are of high quality, yet one of the tests simply hits a real bug by virtue of randomization and signals that the build is “broken”.

    What would be better in my opinion is to apply unit testing for CI, using something like SVUnit or eUnit. The advantages of this approach would be independence from DUT quality, better predictability and maintainability, and less infrastructure required for setup. As SVUnit has a very specific log file structure (basically it says PASS or FAIL on the last line), only a simple parser for that log file format would be required to let Jenkins know whether a test has passed or failed (effort that only has to be spent once, and will probably be done by the SVUnit development team).
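    To illustrate the point, the parser could be as small as the sketch below: a Python script that reads the last non-blank line of a log and converts it into an exit status for Jenkins. The PASSED/FAILED keywords and the log layout are assumptions based on the description above, not the exact SVUnit format.

    ```python
    # Minimal sketch of a CI log check, assuming (per the comment above) that
    # the test runner reports PASS or FAIL on the last line of its log.
    # The exact SVUnit log format is not reproduced here; adjust the keywords
    # to match the real summary line.
    import sys

    def svunit_result(log_path):
        """Return True if the last non-blank line of the log reports a pass."""
        last = ""
        with open(log_path) as log:
            for line in log:
                if line.strip():
                    last = line.strip()
        if "FAILED" in last:
            return False
        if "PASSED" in last:
            return True
        # Anything else (truncated log, simulator crash) counts as a failure.
        return False

    if __name__ == "__main__" and len(sys.argv) > 1:
        # Jenkins marks the build step passed/failed from the exit status.
        sys.exit(0 if svunit_result(sys.argv[1]) else 1)
    ```

    Hooked into a Jenkins shell build step (`python check_log.py run.log`), a non-zero exit status is all that is needed to fail the build, which is why the whole integration stays this small.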

    With another process in place to raise VE quality, your approach could work very well in applying CI on the DUT. It would be interesting to consider this for a future paper.

    Best regards,
    Tudor
