
Archive for June, 2007

Averant’s Larry Lapides on Formal Verification

Sunday, June 24th, 2007 by JL Gray

Thursday morning at DAC I had the opportunity to speak with Larry Lapides, VP of Worldwide Sales for Averant, about formal verification and Averant’s formal verification product, Solidify.  Averant’s formal tools compete with those from Jasper, OneSpin, Cadence, and Synopsys, among others. 


The Core Problem of Computing

Tuesday, June 19th, 2007 by Will Partain

This note began as a review of the tech report “The Landscape of Parallel Computing Research: A View from Berkeley”, with David Patterson among the many authors. It turned into a trip down memory lane. By the end, I agreed with them that Something Big is in play.
As ever, I urge you to read the paper. It has received quite a bit of attention, so there is also much surrounding material; the group’s blog is one good jumping-off point.


Denali Night Fever and the Future of DAC

Tuesday, June 12th, 2007 by JL Gray

Music.  Dancing.  Free food and drink.  Did I mention drink?  All in a room full of engineers…?  As it turns out, Denali Night Fever provided an excellent opportunity for a veritable who’s who of EDA luminaries, and the rest of us just along for the ride, to relax after a couple of long days at DAC.  The event was held at the On Broadway Event Center just a few blocks away from the San Diego Convention Center.  Music was provided by Full Disclosure Blues, featuring Gary Smith and Aart de Geus, and by Cadence’s Ted Vucurevich with The Chad Tuckers.  Everyone’s favorite industry gadfly, John Cooley, was also present, earplugs and all, to serve as the judge of the “EDA Idol” competition. 

The party itself was a blast, but the more interesting question is whether the same held true for DAC this year.  According to Richard Goering, there were 5,135 registered attendees, 3,796 exhibitor attendees, and 400 “other” attendees, for a total of 9,331 people.  These numbers are down significantly from last year’s DAC in San Francisco, where presumably everyone and their dog went to the show as it was driving distance away, but they are also down slightly from the last DAC held in San Diego in 2004. 

Richard’s take was that there wasn’t much exciting going on at DAC this year, but I would tend to disagree.  All four of us from Verilab who attended the conference were able to attend interesting product demos and sessions, and met up with people we otherwise would have had to travel far and wide to see.  It also gave us a chance to catch up amongst ourselves, as there were Verilab attendees from the UK and US who don’t always have the opportunity to meet face to face. 

Some of the info at the conference could have been gleaned from attendance at DVCon or DATE.  The technical sessions at DVCon were consistently the most relevant to my role as a verification consultant.  Its smaller size (710 attendees) made it a good “starter conference” to help kick off the season.  DATE was good because it gave me the opportunity to catch up with current/former clients and colleagues of mine in Europe, and to get a better understanding of what the design and verification community in Europe is interested in, but the technical sessions were far too academic for my taste.  DAC, on the other hand, was a networking paradise.  A large number of people I wanted to see were there, including some unexpected surprises.  I also broadened my horizons a bit more when looking at product demos and was able to catch some interesting stuff I’d missed at the previous conferences. 

Is DAC still relevant?  For me, the answer is yes.  Your mileage may vary.  If you’ve never been to any of the major conferences (a situation I found myself in before this year), you’re missing out.  My horizons have broadened significantly over the last few months.  I’ve got a much better appreciation for the state of the industry, what tools and methodologies are available, and who to call if I need a helping hand than I did back at the beginning of February.

Carbon Design Maps RTL to C

Monday, June 11th, 2007 by JL Gray

After the keynote on Tuesday I had the opportunity to meet Soha Hassoun, an Associate Professor of Computer Science at Tufts University, while snapping a photo of Steven Levitan (DAC conference chair). Among other things, Soha is involved with a company called Carbon Design Systems. As it turns out, I’d been bombarded with emails from Georgia Marszalek of ValleyPR about Carbon, but for some reason I never fully grasped the value of the company’s product from the email descriptions. Based on the additional recommendation from Soha, I decided to take a look.

Thursday morning I went by the Carbon booth and spoke with Elizabeth Abraham, VP, Consulting Services and Product Marketing. She gave me an overview of Carbon’s Virtual System Prototype (VSP) software. VSP converts “Verilog, VHDL, and mixed language RTL designs into an ultra-fast, cycle-accurate virtual prototype.” Basically, the RTL is converted to a high-level C software model which can run 10-100x faster than the original design, according to Elizabeth. The other cool feature of Carbon’s product is the ability to debug hardware and software side by side, as bugs tracked down in the generated C code can be mapped back to the original hardware implementation.
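To give a feel for what “cycle-accurate software model” means, here is a toy sketch of the idea: the model holds the design’s register state and exposes a per-clock-edge update function, so software can drive it cycle by cycle without an event-driven simulator. Carbon’s tool generates C from real RTL; this is just an illustrative hand-written Python analogue of a trivial 2-bit counter, and every name in it is made up for the example.

```python
class CounterModel:
    """Toy cycle-accurate model of a 2-bit RTL counter with synchronous reset.
    (Illustrative only; a generated model would be C and machine-produced.)"""

    def __init__(self):
        self.count = 0  # models the flop's current state

    def clock(self, reset: bool) -> int:
        """Advance the model by one clock cycle; return the post-edge value."""
        if reset:
            self.count = 0
        else:
            self.count = (self.count + 1) & 0b11  # 2-bit wraparound
        return self.count

model = CounterModel()
model.clock(reset=True)  # cycle 0: apply reset
values = [model.clock(reset=False) for _ in range(5)]
print(values)  # five free-running cycles: [1, 2, 3, 0, 1]
```

Because the state update is plain straight-line code rather than scheduled events, a model like this runs much faster than RTL simulation, which is where the claimed 10-100x comes from.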

I asked Elizabeth how VSP compared with solutions such as the Cadence ISX, which can provide coverage metrics and constrained random testing for embedded software. Based on my understanding of the tools, it appears VSP is focused on verifying the full system hardware/software solution, whereas ISX is focused on testing the interface layer between the system software and hardware (i.e. not the entire software solution). The other difference is that VSP should dramatically speed up simulations, whereas ISX would not unless it was paired with a Palladium hardware acceleration box.

OneSpin formal verification

Friday, June 8th, 2007 by David Robinson

I had my first brush with formal methods about 11 years ago when I started my PhD. I was asked to look at the Z language, which would let you write a specification that could be formally proven to be correct. The downside was that, at any given time, there would only be three people on the planet with large enough brains to use it. Part of the complexity of Z was down to the fact that English characters were not allowed (that would have been too easy); only Greek symbols, by the looks of things. I’m not sure how you were meant to type it into a text editor, and I didn’t pursue it far enough to find out.


SystemVerilog Methodologies – It’s Getting Wild

Thursday, June 7th, 2007 by Jason Sprott

Every day this week at DAC I’ve been involved in at least one discussion on VMM versus AVM. It’s getting really competitive now. There’s all this talk of standards, Open Source, maturity, and compliance. On top of that, things just don’t stay still long enough to form an opinion that lasts more than five minutes.


VMM Users Group

Thursday, June 7th, 2007 by JL Gray

Tuesday I had the opportunity to attend the VMM Users’ Group luncheon. The highlight of the luncheon was a panel discussion moderated by Janick Bergeron, Chief Scientist at Synopsys. Before the panel got started, the folks from Synopsys had a few tidbits to share. According to Synopsys, the VMM is the most broadly adopted SystemVerilog library. They were also keen to point out that Synopsys had the highest percentage of reported users on Cooley’s DeepChip verification census.


Verilab Join Synopsys VMM Catalyst Program

Thursday, June 7th, 2007 by Jason Sprott

At DAC this week Synopsys announced the new VMM Catalyst Program, with over 50 founding members (including Verilab). Members of the VMM Catalyst Program get access to the Synopsys VCS Verification Library as well as the SystemVerilog source code for the VMM Standard Library.

Be Afraid. Be Very Afraid

Wednesday, June 6th, 2007 by David Robinson

Most verification engineers burn themselves at some point by disabling a checker and then forgetting about it. There are sensible reasons for doing this; think about it. You find an RTL bug on Friday, but it doesn’t get fixed immediately. You decide to comment out the checker in order not to pollute the weekend’s regression run. The problem is that you come in on Monday morning and start debugging the new errors you have. The commented out check gets forgotten about.

Burned? I positively set myself on fire doing this on my first ever project. I spotted the commented out check 3 days before code freeze. And guess what? It was masking a bug. Ouch.

I haven’t made the same mistake again. In fact, I go to excessive lengths to check that my testbench works correctly. I talk about one method in my book [1], where I create a special aspect that I load up at the start of regressions to verify that the testbench works before I run all of the other simulations. Another method I use is known as error injection (or fault injection, bug injection or mutation) where I’ll deliberately go and break the RTL and check that my testbench catches it.
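The error-injection idea boils down to a simple loop: deliberately break the design, rerun the testbench, and confirm the checker complains; if it doesn’t, the weakness is in the testbench, not the design. Here is a toy Python sketch of that principle (every name is illustrative; real fault injection mutates RTL and reruns simulations, not a parity function):

```python
def dut_parity(bits, bug=False):
    """Stand-in for the 'design': even parity over a list of bits.
    bug=True injects a deliberate fault (the result is inverted)."""
    p = sum(bits) % 2
    return (1 - p) if bug else p

def checker_passes(dut):
    """Stand-in for the 'testbench': compare the DUT against a golden model
    over a small set of stimuli."""
    stimuli = [[0, 0], [1, 0], [1, 1], [1, 1, 1]]
    return all(dut(s) == sum(s) % 2 for s in stimuli)

# Sanity check: the unbroken design passes the testbench.
assert checker_passes(lambda s: dut_parity(s))

# Fault injection: the mutated design MUST fail. If it passed,
# the checker would be too weak to catch this class of bug.
assert not checker_passes(lambda s: dut_parity(s, bug=True))
print("checker catches the injected fault")
```

The painful part in practice is scale: choosing where to inject, and rerunning regressions for every injected fault, which is exactly the manual burden described below.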

The problem with this approach is that it can be manually intensive. Determining the best place to inject a bug, running an entire regression to see if it is caught, and then repeating until you are happy you’ve done enough (and really, how do you know?) is tough going.

Not any more, though. I caught the Certess demo yesterday, and they seem to have solved the problem. I only saw a demo, but their solution looks pretty push-button. You load the design into their tool, it runs a regression to profile your tests and work out which faults should get caught by which tests, and then it injects the faults one at a time and runs the appropriate tests. If the tests don’t complain about errors, then you have a problem with your testbench.

As far as I know, this is the first time that we’ve been able to measure the quality of a verification environment. So all you verification engineers, IP providers and outsourcing companies out there – be afraid. This thing will tell you what functionality your stimuli aren’t activating, what functionality they aren’t propagating, and what bugs you aren’t detecting. Your boss and customers can now find out how good a job you are really doing.


[1] D. Robinson, “Aspect-Oriented Programming with the e Verification Language: A Pragmatic Guide for Testbench Developers”

0-IN for CDC Verification Still Looks Pretty Good

Wednesday, June 6th, 2007 by Jason Sprott

Over 50% of chip designs today have more than 20 clock domains. This makes CDC verification pretty high up on the priority list. At Verilab we have our own CDC workshop, which is split into a design portion (it’s better to get CDC design right in the first place) and a verification portion, which focuses on using SystemVerilog Assertions and dynamic simulation. This gets our clients hitting the ground running with CDC really quickly, using the tools they have at their disposal today. However, we’re always on the lookout for other cool CDC verification techniques. I got an update yesterday on 0-IN’s CDC verification capabilities, and they still look pretty good.


Work For Verilab