
Archive for April, 2007

Announcing “Aspect-Oriented Programming with the e Verification Language” - A New Book By David Robinson

Thursday, April 19th, 2007 by JL Gray

This week at DATE, Morgan Kaufmann Publishers announced release dates for Verilab’s very own David Robinson’s new book, “Aspect-Oriented Programming with the e Verification Language - A Pragmatic Guide for Testbench Developers”: August 2007 for the US and September 2007 for Europe. The book:

  • Introduces and explains a complex topic using familiar terms and examples
  • Will cause many daily coding problems to vanish and allow you to focus on your real job of verifying hardware designs
  • Takes a pragmatic approach to this complex subject: “do it this way, because it works…”

Information about how to order a copy will be announced as soon as it is available.

Synopsys SystemVerilog Support

Wednesday, April 18th, 2007 by JL Gray

According to a helpful representative at the Synopsys booth this afternoon, VCS supports 99% of the SystemVerilog Testbench features. Which 99%? I’m aware it doesn’t support parameterized classes or the shuffle() method for arrays, and I’m sure there are other unsupported features as well. Also, a heads up: while browsing through the VCS documentation recently I discovered that VCS includes AOP extensions for SystemVerilog. I haven’t tried them out yet to see if they work. However, as with Vera, the AOP support is nothing like the “when inheritance” found in Specman/e. When inheritance allows fields, methods, and additions to methods to be added to an instance of a class only when a given field has been randomly set to a specified value.
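
For anyone unfamiliar with the feature, here’s a rough sketch of what when inheritance looks like in e. The packet struct and its fields are hypothetical, purely for illustration, and have nothing to do with the Synopsys documentation:

    <'
    type packet_kind : [SHORT, LONG];

    struct packet {
        kind : packet_kind;      -- a randomly generated enumerated field
        data : list of byte;

        -- this constraint exists only in instances where kind == SHORT
        when SHORT'kind packet {
            keep data.size() == 4;
        };

        -- the crc field and its constraint exist only when kind == LONG
        when LONG'kind packet {
            crc : uint;
            keep data.size() == 64;
        };
    };
    '>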

Also useful to know? The 2006.06-12 release of VCS fixes several bugs in the graphical debugger (DVE) related to SystemVerilog.

Cadence uRM and Verification Planning

Wednesday, April 18th, 2007 by JL Gray

Tuesday afternoon I attended the Cadence/Doulos solutions workshop entitled “Adopting a Plan-to-Closure Methodology across Design Teams and Verification Teams”. The session was presented by Hamilton Carter from Cadence, co-author of the soon-to-be-released book “Metric Driven Design Verification”, and Dave Long from Doulos. Hamilton focused much of his portion of the session on verification planning and functional coverage. I’m sure much of the information from his talk will be covered in his book, but a few things stood out.

Hamilton stressed the importance of planning sessions and the idea of creating a prioritized set of metrics. He also highlighted the value of the verification planning document (vPlan). Later in the presentation I asked him whether it is possible to put too much emphasis on the vPlan, to the point where it crowds out other metrics that should be used alongside it to get an accurate picture of where a project is going (think bug counts, the number of recently changed lines of code, real progress in completing assigned tasks, etc.). According to Hamilton, the Cadence methodology doesn’t take these things into account yet, but he did mention that tools such as Enterprise Manager may at some point be integrated with LSF and Clearcase to the point where such information could be extracted automatically.

Next up was Dave Long. Dave’s description of uRM was the first time I’ve seen any details about how the methodology has been applied to SystemVerilog, and my first impression is that the results aren’t good (yet). First of all, Incisive does not yet support class-based test environments, only module-based ones. That may change soon, but it is a current limitation. Second, sequences, one of the more widely used features of eRM (the predecessor to uRM, focused on the e language), seem basically useless when implemented in SystemVerilog. The implementation relies on creating a driver with one task corresponding to each of what would originally have been an individual “when subtype” of a sequence. The first thing I would do if I were stuck using that feature would be to throw it away and code a more customizable solution (perhaps using factories?). The problems with the feature would be especially severe when dealing with verification IP. Currently in ‘e’ it is possible to override default sequences and add new ones very easily. With this new approach, the best possible outcome would be for a user to extend the original driver and hope it is possible to instantiate it in place of the base class in the verification IP.
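
To illustrate what I mean, here’s a rough e/eRM-style sketch of the kind of sequence extension that is trivial today. The sequence, item, and field names are hypothetical, and it assumes a sequence has already been declared with the eRM sequence statement:

    <'
    -- assume the verification IP already declared something like:
    --   sequence xyz_sequence using item=xyz_transfer, created_driver=xyz_driver;

    -- add a new sequence kind alongside the defaults
    extend xyz_sequence_kind : [WRITE_THEN_READ];

    -- define its behavior, without touching the original driver
    extend WRITE_THEN_READ xyz_sequence {
        !trans : xyz_transfer;
        body() @driver.clock is only {
            do trans keeping { .addr == 0x1000; .direction == WRITE; };
            do trans keeping { .addr == 0x1000; .direction == READ; };
        };
    };
    '>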

One other item of note - if I understood correctly, there have been no announced improvements to Cadence’s SystemVerilog support or to uRM. There may be some smaller announcements in the near future, but it doesn’t appear that anything major will be revealed for at least the next several months.

CoWare Says ESL or Extinction

Wednesday, April 18th, 2007 by JL Gray

Alan Naumann, President and CEO of CoWare, gave the second portion of the keynote this morning at DATE, entitled “Was Darwin Wrong? Has Design Evolution Stopped At the RTL Level? Or Will Software and Custom Processors (Or System-Level Design) Extend Moore’s Law?” Phew. The title of the talk was just about as long as Alan’s defense of the point that current problems in chip design are just like global warming and the extinction of the dinosaurs. On the bright side, Alan highlighted several key changes occurring in chip design and proposed long-term solutions to stave off our collective extinction.

For example, in the past, new products ramped slowly into the market, reached maturity after 3-5 years, and then were gradually end-of-lifed at a planned pace. These days, new products experience a tremendously fast ramp-up and reach the end of their usefulness after approximately a year. That means companies need to churn out new products much faster than ever before.

In addition to faster product ramps, projects today require significantly more investment in software development. According to Naumann, in 1997, a typical mobile phone had about 200K lines of code (LOC). In 2005, that number had jumped to 2.5M LOC!

Another interesting trend? Between 1985 and 2005, Naumann claimed the number of hardware engineers has increased by about 1.5-2x. In that same time, ASIC starts have dropped by an order of magnitude or more, and the number of embedded software engineers has skyrocketed.

So what do we need to do to deal with all of these changes in the chip design landscape? First, Naumann stated that individual engineers need to learn to model at a higher level of abstraction. Engineering teams need to differentiate their products using software and new architectures. Companies need to adjust the way they work with vendors so the relationship becomes more of a partnership, and (of course) they need to provide virtual models to customers in advance of real implementations.

Toshiba at DATE

Tuesday, April 17th, 2007 by JL Gray


[Photo: A Random Presenter and Panelists Before the DATE Keynote, originally uploaded by brillianthue]

It’s been a busy first day at DATE. The morning kicked off with keynote addresses given by Dr. Tohru Furuyama, General Manager, Center of Semiconductor Research and Development at Toshiba, and Alan Naumann, President and CEO of CoWare. Dr. Furuyama’s talk, entitled “Challenges of Digital Consumer and Mobile SoC’s: More Moore Possible?”, was heavily technical for a keynote and focused on the need for Electronic System Level (ESL) tools and methodologies to keep up with increasing design complexity in the SoC market.

According to Furuyama, development costs are increasing rapidly with the advent of each new manufacturing technology, and the cost of developing a chip from scratch has soared to between $8M and $50M. To reduce cost and decrease time to market (TTM), he claims it is necessary to develop system models in advance of the availability of RTL or silicon so that the software and hardware teams’ schedules can overlap. To prove his point, he cited internal project experience: before Toshiba started using system-level modeling, it took about 130 days after first samples returned from the fab to debug the software; afterwards, it took only 41 days. Furuyama felt that was a significant improvement, and so do I!

Furuyama also proposed taking advantage of behavioral synthesis techniques to speed hardware development and simulation times. Experience at Toshiba suggests that C models can provide a 5-10x improvement in simulation performance over RTL. Additionally, the performance of synthesized designs was found to be very close to that of manually written RTL, though the gate count was potentially higher.

I was surprised at the amount of technical detail that Dr. Furuyama went into during his talk, but it was interesting to hear. The moral of the story? Use ESL tools and methodologies in order to save development time and reduce project costs. Perhaps one of our astute readers can comment on whether the results Dr. Furuyama described have been seen elsewhere.

Welcome to Verilab DATE Coverage!

Sunday, April 8th, 2007 by JL Gray

Thanks for stopping by! Jason Sprott, Terry Lawell, and JL Gray from Verilab will be attending the Design, Automation and Test in Europe (DATE) conference, April 16-20, 2007. We will be posting information about the conference here on this site and on Cool Verification throughout the week. Check back often to see what we’ve been up to!
