Wednesday, July 26, 2006

Why can't we do it in EDA?

This was the question posed by Joe Costello, Chairman of Orb Networks and former CEO of Cadence, in his DAC 2006 keynote speech.

The “it” referred to here is mixing and matching new plug-ins (internal and external), bundling offerings on top of others’ products, and selling directly to customers.

With the increasing complexity of technological challenges, compounded by rising market pressures, it does indeed benefit both the big EDA companies and the small start-ups with niche solutions to collaborate. However, opening the tools and making them pluggable is not without its share of teething issues. And while standards take a long time to be formulated and adopted, they will still be required to some extent for “universal plug-ins”.

One scenario is where the EDA companies have the basic engines for the standard design activities. On top of these, smaller niche companies provide plug-ins that add value and tackle issues specific to leading-edge designs. With a uniform standard, these companies can take their plug-ins to various EDA companies. In its absence, however, each EDA company will need to work closely with these smaller companies and sell the complete “bundle with options” to the user.

A point to be noted here is that it would be naïve to assume that the present basic engines are implemented in a modular fashion where a plug-in can be dropped in quasi-seamlessly. Then comes the question: if the leading-edge issues are addressed in a modular way by the smaller companies, who are free to sell their wares to the other big EDA companies, what is in it for the big EDA companies? What will be their competitive edge? On the flip side, if the major EDA companies persist in trying to do everything on their own, given the complexities and constraints, it will not result in much growth for the EDA industry.
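
To make the idea concrete, here is a minimal sketch of what such a pluggable engine could look like. All of the names here (BaseEngine, AnalysisPlugin, LithoHotspotPlugin) and the toy analysis are purely illustrative assumptions, not any vendor’s actual API; the point is only that the base engine owns the flow and the design data, while niche plug-ins contribute specialized analyses through a stable interface, which is exactly where a uniform standard would matter.

```python
# Purely illustrative sketch of a pluggable analysis interface.
# None of these names correspond to a real EDA vendor's API.
from abc import ABC, abstractmethod


class AnalysisPlugin(ABC):
    """Contract a niche vendor's plug-in would implement against a base engine."""

    @abstractmethod
    def name(self) -> str:
        ...

    @abstractmethod
    def run(self, design_db: dict) -> dict:
        """Analyze the shared design data and return a report."""
        ...


class BaseEngine:
    """Stand-in for a big vendor's core engine that accepts external plug-ins."""

    def __init__(self) -> None:
        self._plugins: list[AnalysisPlugin] = []

    def register_plugin(self, plugin: AnalysisPlugin) -> None:
        self._plugins.append(plugin)

    def run_flow(self, design_db: dict) -> dict:
        # The core placement/routing/etc. would happen here; afterwards each
        # registered plug-in adds its leading-edge analysis on top.
        return {plugin.name(): plugin.run(design_db) for plugin in self._plugins}


class LithoHotspotPlugin(AnalysisPlugin):
    """Hypothetical niche-vendor plug-in for litho hotspot detection."""

    def name(self) -> str:
        return "litho_hotspots"

    def run(self, design_db: dict) -> dict:
        # A real analysis would use a calibrated litho model; this toy version
        # just flags wires narrower than some minimum printable width.
        min_width = design_db.get("min_printable_width", 0.065)
        hotspots = [net for net, width in design_db.get("wire_widths", {}).items()
                    if width < min_width]
        return {"hotspots": hotspots}


engine = BaseEngine()
engine.register_plugin(LithoHotspotPlugin())
print(engine.run_flow({"wire_widths": {"net_a": 0.060, "net_b": 0.090}}))
```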

Interestingly, there are signs of the industry moving in this direction. In DFM, TSMC sharing its process information with the EDA companies, who integrate it into their design flows, is one example. This can be treated as a “plug-in”.

Let’s take an example here: Synopsys recently came out with three tools in the DFM space - LCC (lithography compliance checking), CMP (chemical-mechanical polish) checking, and CAA (critical-area analysis). As per the press note, LCC inspects GDS-II files using a rapid-computation model of the lithography process, calibrated with foundry data. This scan predicts the actual shapes the mask features will produce, across the focus window of the lithography step. It then examines these features against a rule set to detect pinch-off, end-shortening, bridging, and other faults that could occur with a reasonable probability.

The normal output of the tool is a color-coded die map: green for areas that pass, yellow for areas of concern, and red for trouble spots. Design teams that are knowingly pushing the litho rules can look beneath this graphical presentation at a numerical database that gives them actual predictions of critical dimensions.

Designers can then invoke an auto-correction tool, which is based on extensive, process-dependent heuristics, to deal automatically with the majority of the problems—adding space between lines, moving edges or corners, and other such reasonable measures.
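
As a rough illustration of the kind of check being described, here is a toy version of the pass/concern/fail classification. The tile names, margins and predicted critical-dimension values are invented for the example; this is not how the Synopsys tool actually works internally, just the shape of the idea.

```python
# Illustrative sketch of a green/yellow/red die-map classification.
# Thresholds, tile names and predicted CDs below are assumptions, not real data.

def classify_tile(predicted_cd_nm: float, target_cd_nm: float,
                  warn_margin: float = 0.10, fail_margin: float = 0.20) -> str:
    """Map a predicted critical dimension onto a green/yellow/red verdict."""
    deviation = abs(predicted_cd_nm - target_cd_nm) / target_cd_nm
    if deviation >= fail_margin:
        return "red"      # likely pinch-off / bridging risk
    if deviation >= warn_margin:
        return "yellow"   # area of concern, worth a closer look
    return "green"        # passes the litho rule set


# Hypothetical per-tile predictions from a calibrated litho model,
# taken at the worst corner of the focus window.
predicted_cds = {"tile_0_0": 64.1, "tile_0_1": 55.8, "tile_1_0": 49.7}
target_cd = 65.0

die_map = {tile: classify_tile(cd, target_cd) for tile, cd in predicted_cds.items()}
print(die_map)  # {'tile_0_0': 'green', 'tile_0_1': 'yellow', 'tile_1_0': 'red'}
```

Designers pushing the rules would then drill down from the colors to the underlying predicted numbers, as described above.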

Now reconsider the situation with a small EDA company working on the basic LCC part. It takes as input the lithography process model provided by the big EDA vendor (I don’t think the big foundries would be that comfortable working closely with smaller companies on handling their process data!). The GDS-II is also provided as an input from the big EDA vendor’s tool(s). Finally, the auto-correction tool could be provided by either.

I cite this example because, while these three new tools do attempt to handle the first-order problems, they do not even begin to cover all the important sources of variation at 90 nm and finer geometries. TSMC cites more than 2000 independent sources of potential trouble.

I see a hybrid approach in the near future…

Should IP adopt a service biz model?

As pointed out in the article, most designers treat IP as a product. However, this product rarely comes with a guarantee, which is not that surprising. It would be almost suicidal, given that IP blocks are not plug-and-play objects. Not only the IP’s functionality but also its interface and its integration with the other components in the system determine whether the chip will work or not. A standalone guarantee for an IP therefore does not hold much credence.

A close working relationship between IP supplier and user has always been deemed vital for successful IP usage and integration; hence formalizing it and bundling things under the “service” umbrella would not be that major a leap of faith.

Friday, July 14, 2006

China Syndrome Cooling?

China Syndrome cooling, an article by Ed Sperling in Reed Electronics, points out the possible waning of the “Invest in the Booming China Market” wave among electronics companies. Possible reasons the companies cite for hedging their bets in other regions and countries include:

- China’s emphasis on allowing other cities besides Shanghai and Beijing to partake in the economic revolution is making it far more difficult for companies to manage logistics between their manufacturing sites inside of China;

- Rapidly rising labor costs are forcing some companies to consider comparable wage scales in places such as Vietnam, Malaysia and Eastern Europe;

- China’s ineffective policing of intellectual property theft has made many companies reluctant to move design operations there;

- Continued U.S. government regulations about what technology can be shipped into China or developed there has kept the lid on many companies’ plans;

- Manufacturers are looking to hedge their bets with backup strategies in case of a natural disaster or political issues that can affect regions.

Well, it makes economic and political sense for China to emphasize letting cities other than Shanghai and Beijing invite investment, so that growth is inclusive – something vital for both the economic and political stability of a country; otherwise the economic disparity thus created would undermine the growth achieved. To help the electronics companies, improving infrastructure in these other cities would go a long way.

Ineffective policing of IP is indeed an important deterrent. However, a few recent events show some progress. Hong Kong Science and Technology Parks Corporation (HKSTP) recently inaugurated the Intellectual Property Servicing Centre (IPSC) at the Hong Kong Science Park. Based within the Hong Kong Integrated Circuits Design Centre at the Park, it offers IP licensing, IP hardening, IP integration and IP verification services. Most notably, IPSC is run by HKSTP and will apply Hong Kong law to any legal dispute over intellectual property. A key vehicle for this will be the Hong Kong International Arbitration Centre, which is based in the SAR.

Under the “7+1” IC Design Centre framework signed with the High-Technology Research & Development Centre of the PRC’s Ministry of Science & Technology, HKSTP collaborated with Harbin Institute of Technology, Hefei University of Science and Technology, Zhejiang University and the Hong Kong University of Science and Technology in July 2005 to extend the SIP trading platform throughout Greater China. The collaboration is to develop a due-diligence platform, in legal and technical terms, for SIP certification and authentication purposes.

In October 2005, HKSTP also formed an alliance with China’s Ministry of Information Industry’s Software and Integrated Circuit Promotion Centre (CSIP) to support the Mainland China IC design industry. The alliance is to promote the cooperation and development of Mainland China’s IC enterprises under the guidance and supervision of the administrative bodies, to jointly facilitate the outreach and popularization of SIP in SoC design services, and to standardize SIP design, SIP standard promotion and the SIP protection mechanism.

Tuesday, July 04, 2006

DFM again

TSMC recently unveiled its 65nm DFM-compliance design support ecosystem by coming out with its DFM Data Kit, compiled in the DFM Unified Format (DUF). DUF has been developed by TSMC to align DFM tools. This kit should help put fabless designers on an equal footing with the IDMs. The format, though, models only random and systematic defects, with parametric defects planned for a future release.
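
To illustrate the distinction the format draws, here is a toy representation of the kinds of defect data such a kit might carry. This is emphatically not the actual DUF schema (which TSMC has not published in this form); the field names and numbers are assumptions for illustration only.

```python
# Illustrative only: a toy model of random vs. systematic defect data in a
# DFM data kit. NOT the real DUF schema; all fields and values are invented.
from dataclasses import dataclass


@dataclass
class RandomDefect:
    layer: str
    particle_size_um: float
    density_per_cm2: float     # expected random-particle density


@dataclass
class SystematicDefect:
    layer: str
    pattern: str               # layout pattern known to print poorly
    failure_probability: float


dfm_kit = {
    "random": [RandomDefect("metal1", 0.09, 0.02)],
    "systematic": [SystematicDefect("poly", "dense_line_end", 0.001)],
    # Parametric (e.g. device-variation) data would slot in here once the
    # format supports it, per the roadmap mentioned above.
    "parametric": [],
}
print(dfm_kit["systematic"][0])
```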

Now yet another tool has hit the “in news” DFM space.

Blaze DFM Inc. recently rolled out its solution, Blaze MO. It is marketed as improving parametric yield through better control over leakage, timing and variability.
It has an electrical focus, in contrast with other DFM tools, which have a geometric focus (wire spreading, lithography simulation and critical-area analysis).

The heat is on…