“Multi-chip packaging, 3D ICs are emerging trends”

SEPTEMBER 2012: Mentor Graphics Corporation is a world leader in electronic hardware and software design solutions, providing products, consulting services and award-winning support for the world’s most successful electronic, semiconductor and systems companies. Established in 1981, the company reported revenues in the last fiscal year of about $1,015 million.

In an interview with Pradeep Chakraborty, Dr Walden (Wally) C. Rhines, chairman and CEO, Mentor Graphics, and vice chairman of the EDA Consortium, USA, talks about the future prospects of the global semiconductor industry, the status of the global EDA industry, the advent of newer physical design capabilities, and what EDA now needs to do to handle 22nm and sub-22nm nodes. Excerpts:


Dr Walden (Wally) C. Rhines

Q. How is the global semiconductor industry behaving at the moment? Do you see it going past the $310 billion mark this year?
A. The absolute size of the semiconductor industry (in terms of total revenue) differs depending on which analyst you ask, because of differences in methodology and the breadth of analysts’ surveys. Current 2012 forecasts include $316 billion from Gartner, $320 billion from IDC, $324.5 billion from IHS iSuppli, $327.2 billion from Semico Research and $339 billion from IC Insights.

These numbers reflect growth rates from 4 per cent to 9.2 per cent, based on the different analyst-specific 2011 totals. Capital spending forecasts for the three largest semiconductor companies have increased by almost 50 per cent just since the beginning of this year. However, the initial spurt of demand was influenced by the replenishment of computer and disc drive inventories caused by the Thailand flooding. Now that this is largely complete, there is some uncertainty about the second half.

So, overall it looks like the industry will pass $310 billion this year, but it may not be by very much. The strong capital spending and demand for leading edge capacity should impact the second half but the bigger impact will probably be in 2013.

Q. What is the current status of the global EDA industry?
A. Year 2011 was a record year in terms of total EDA revenue ($6,128 million, including silicon IP), and the second highest on record (behind 2007) in terms of License & Maintenance revenue ($4,193 million).

Historically, the overall spending on EDA typically increases in line with semiconductor R&D spending, delayed by one year. Semiconductor R&D spending increased 11 per cent in calendar 2010, 6 per cent in 2011 and is likely to increase 12 per cent in 2012. If historical patterns continue, the strong EDA industry growth in 2011 should be followed by a good, but lower, rate of growth in 2012 and then acceleration in 2013. Current guidance by the major EDA companies supports this prediction.

Our forecast model (based on semiconductor R&D spending offset one year) predicts that EDA License & Maintenance Revenue will grow 3.5 per cent to $4,341 million in 2012. However, it probably understates the significance of the 28nm transition that is likely to increase semiconductor R&D-funded design activity this year beyond the current forecast. Moreover, with 20nm processes coming online in the latter part of the year, I expect the growth in EDA to continue at least through 2013.
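
To make the arithmetic concrete, here is a minimal sketch of the lag-one forecast in Python. The base figure and growth rate come from the interview; everything else is an illustration, not Mentor’s actual model.

```python
# Minimal arithmetic check of the lag-one forecast (illustrative only):
# EDA License & Maintenance revenue is assumed to track semiconductor R&D
# spending with a one-year delay; here we simply apply the forecast growth
# rate to the 2011 base cited above.
base_2011 = 4193      # $M, 2011 License & Maintenance revenue
growth_2012 = 0.035   # forecast growth rate from the lagged-R&D model
forecast_2012 = base_2011 * (1 + growth_2012)
print(f"2012 forecast: ${forecast_2012:,.0f}M")  # ~$4,340M, matching the $4,341M cited
```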

New design methodologies traditionally drive almost all the growth in EDA. Most of the growth of EDA during the last decade came from DFM (including resolution enhancement), power analysis, ESL and formal verification. Currently, the new methodologies that are driving EDA growth are largely related to the 28/20nm transition or the increase in system design requirements for EDA. Examples include accelerated adoption of emulation, embedded software development/verification and ESL verification.

Q. Has EDA’s share of total IC revenue gone up, rather than staying flat? Why?
A. EDA revenue, as a percentage of worldwide semiconductor revenue, continues at its long-term level of 2 per cent, with only a very slight decrease over the last 15 years. EDA cost per transistor continues to decrease on the same learning curve as semiconductor product cost per transistor, at about 30 per cent every year.
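
A quick back-of-envelope calculation shows what a sustained 30 per cent annual decline implies; the halving-time formula is standard, only its application here is illustrative.

```python
import math

# At a 30% annual decline, cost per transistor retains 70% of its value
# each year; solve 0.7^t = 0.5 for the halving time t.
annual_retention = 0.70
halving_time = math.log(0.5) / math.log(annual_retention)
print(f"Cost per transistor halves roughly every {halving_time:.1f} years")  # ~1.9 years
```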

Q. How has been the foundry spending in 2012 so far?
A. Capital investment by silicon foundries, which had been flat for the previous decade at about $7 billion ($6.9 billion) per year, doubled in 2010 to $14 billion ($14.1 billion), increased again in 2011 to $19 billion and is now projected to maintain a level above $17 billion in 2012. Forecasts for overall industry capital expenditures have increased significantly since the beginning of the year.

Much of this change can be attributed to TSMC and Samsung, which are expected to spend $8.5 billion and $7 billion in 2012, respectively. TSMC spent approximately $1.64 billion of its capex budget in Q1 2012. Capital expenditure plans for other major foundries include $2 billion for UMC, $430 million for SMIC and upwards of $3 billion for GlobalFoundries.

Despite all of this spending, 32/28/20 nm capacity is still fully loaded, with demand for more. Improvements in yield and throughput on these advanced nodes will free up much more of this capacity as we move through the year.

Q. Has 28/20nm semiconductor technology become a major ‘work horse’? What’s going on in that area?
A. It is clear that the semiconductor industry transition to the 28nm family of technologies, which broadly includes 32nm and 20nm, is a much larger transition than we have experienced for many technology generations.

The world’s 28nm-capable capacity now comprises almost 20 per cent of the total silicon area in production and yet, the silicon foundries are fully loaded with more 28nm demand than they can handle. In fact, high demand for 28/20nm has created a capacity pinch that is currently spurring additional capital expenditure by foundries.

As yields and throughput mature at 28nm, the major wave of capital investment will provide plentiful foundry capacity at lower cost, stimulating a major wave of design activity. Cost-effective, high yield 28nm foundry capacity will not only drive increasing numbers of new designs but it will also force re-designs of mature products to take advantage of the cost reduction opportunity.

Q. Are you seeing new physical design capabilities including DFR, DP, P&R coming up? Please elaborate.
A. New problems and new solutions continue to arise in physical design and verification. While some, like double patterning, are largely handled by the EDA software, others, like IC reliability checking, require insight by the design and manufacturing organizations.

The Calibre PERC product, which was first introduced in 2010 to attack the problem of verifying ESD (electrostatic discharge) protection, has been applied to a wide variety of other design problems, some of which involve circuit degradation mechanisms that cause failures in the field.

For example, it is used to identify electrical overstress arising from signals crossing multiple voltage domains, and electromigration issues when interconnects are not sized appropriately for the current loads on some nets. It can also check spacing between adjacent interconnects based upon the operational voltage difference, which optimizes area while preventing long-term dielectric breakdown.
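
To illustrate the idea (this is not Calibre PERC’s actual rule-deck syntax), here is a small Python sketch of a voltage-aware spacing check; the net names, voltages and spacing thresholds are invented for the example.

```python
# Hypothetical voltage-aware spacing check: the larger the operating-voltage
# difference between two adjacent nets, the more spacing is required to
# guard against long-term dielectric breakdown. Thresholds are invented.

def required_spacing_nm(voltage_difference):
    """Map a voltage delta (V) to a minimum spacing (nm). Illustrative table."""
    if voltage_difference <= 1.0:
        return 50
    elif voltage_difference <= 2.5:
        return 80
    return 120

def check_pair(net_a, net_b, actual_spacing_nm):
    delta_v = abs(net_a["voltage"] - net_b["voltage"])
    needed = required_spacing_nm(delta_v)
    if actual_spacing_nm < needed:
        print(f"VIOLATION: {net_a['name']}/{net_b['name']} spaced "
              f"{actual_spacing_nm}nm, need {needed}nm for dV={delta_v:.1f}V")

# Example: a 3.3V I/O net routed 60nm from a 1.0V core net
check_pair({"name": "VDD_IO", "voltage": 3.3},
           {"name": "VDD_CORE", "voltage": 1.0},
           actual_spacing_nm=60)
```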

Other applications, like ensuring design symmetry for transistor drive capacity and impedance matching, are used with cell libraries, memory and analog circuits.

3D IC physical verification challenges are also spurring new physical design capabilities. Since Calibre is used by the foundries as their design rule manual validation tool, we were confronted with 3D IC issues early on.

That led to a variety of features, now incorporated in Calibre 3DSTAC, to extend verification to multiple chips packaged on a silicon interposer or stacked with TSV interconnects. As it became apparent that 3D ICs would have unique test requirements, Mentor also accelerated development of software to support both 2.5D and full 3D test requirements over a year ago.

Currently, the primary applications we’re seeing are for silicon interposer based designs and memory chips stacked on processors with WideIO TSV-based interconnects to increase memory I/O bandwidth, and reduce access time and power dissipation.
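
As a rough illustration of what inter-die verification has to reason about (this is not Calibre 3DSTAC’s input format; all names and tolerances are invented), consider a connectivity check between a die’s micro-bumps and the pads on a silicon interposer:

```python
# Simplified 2.5D assembly check: flag die bumps that do not land on a
# matching interposer pad -- the inter-die analogue of an intra-die
# connectivity check. Field names and tolerances are invented.
from dataclasses import dataclass

@dataclass
class Bump:
    die: str      # which die (or the interposer) the bump/pad belongs to
    net: str      # logical net name
    x_um: float   # placement in interposer coordinates, microns
    y_um: float

def check_alignment(die_bumps, interposer_pads, tolerance_um=2.0):
    for b in die_bumps:
        landed = any(p.net == b.net and
                     abs(p.x_um - b.x_um) <= tolerance_um and
                     abs(p.y_um - b.y_um) <= tolerance_um
                     for p in interposer_pads)
        if not landed:
            print(f"OPEN: {b.die}.{b.net} bump at ({b.x_um}, {b.y_um}) has no pad")

check_alignment(
    die_bumps=[Bump("cpu_die", "DQ0", 100.0, 200.0),
               Bump("cpu_die", "DQ1", 140.0, 200.0)],
    interposer_pads=[Bump("interposer", "DQ0", 100.5, 200.5)],
)  # reports an open on DQ1
```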

Q. How do you see the global semiconductor industry progressing in 2013? Has it learned from the previous recession at all?
A. The semiconductor industry went into the last recession in remarkably good shape. Inventories were not excessive and capital spending had been relatively low for several years. That is why the global semiconductor industry was able to bounce back so sharply in 2010 and finally break the $300 billion mark in 2011.

The design boom caused by the current large investment in 32/28/20 nm capacity will inevitably be followed by a period of increased profitability for semiconductor and equipment companies and then a period of more rapid price declines for both semiconductors and electronic equipment. This time, however, the magnitude of these price decreases will probably be muted by the under-investment in semiconductor memory capacity that we are currently experiencing.

Q. What does EDA now need to do at handling 22nm and sub-22nm levels? Is it already happening?
A. As we move toward smaller geometries, we need better techniques to manage the growing problem of variability in nanometer integrated circuit manufacturing. We are really starting to see that DFM (design for manufacturing), something the industry has been talking about for years, is now becoming critical to design.

DFM requires a detailed understanding of OPC (optical proximity correction). Specialists in optics have joined traditional electronic design specialists at EDA companies to create these key technologies. The EDA companies are working closely with semiconductor manufacturers on process technology.

Q. How long will the integration density area savings that you get by going to new nodes remain compelling?
A. Up until now, shrinking line geometries and growing wafer sizes have enabled the semiconductor industry to continue to drive down the cost per transistor. But these two variables are running out of steam. Those of us in the semiconductor industry have always been driven to the ultimate solution for any application, the single chip. But sometimes the single-chip solution is not the lowest cost, or even the most reliable or best-quality solution.

Going forward, there are other capabilities that can be used to allow the industry to continue lowering the cost per transistor. One trend that is emerging today is multi-chip packaging, enabling us to pack more and more transistors in a package as opposed to on the chip. That will become increasingly true in the future. Another promising approach is 3D integrated circuits. This is especially interesting for memory-intensive applications, where high-speed interaction between memory and the processing unit significantly impacts performance and power dissipation.

Q. What are the other challenges that the EDA industry will likely face?
A. Low power design at higher levels is a pressing challenge. Architectural choices made in the front end of design have the most significant impact on power consumption. In addition, the speed of simulation is orders of magnitude faster at the system level. But assessing power at this abstract level traditionally has been extremely difficult to do with any degree of accuracy.

Fortunately, advanced design tools are emerging that provide accurate power modeling early in the design flow where architectural tradeoffs can easily be made. This will enable designers to explore more alternatives for applying the most efficient power strategies.
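
A minimal sketch of what such architecture-level power estimation looks like, assuming a simple activity-times-energy model; the event names and energy figures below are invented for illustration.

```python
# Architecture-level power estimate: sum over event types of
# (events per second) x (energy per event). Numbers are illustrative.
ENERGY_PJ = {"cpu_op": 20.0, "dram_access": 640.0, "sram_access": 5.0}

def estimate_power_mw(events_per_sec):
    picojoules_per_sec = sum(rate * ENERGY_PJ[ev] for ev, rate in events_per_sec.items())
    return picojoules_per_sec * 1e-12 * 1e3   # pJ/s -> W -> mW

# Architectural tradeoff: an on-chip SRAM cache converts most DRAM
# accesses into far cheaper SRAM accesses.
no_cache   = {"cpu_op": 1e9, "dram_access": 5e7}
with_cache = {"cpu_op": 1e9, "dram_access": 5e6, "sram_access": 4.5e7}
print(f"no cache:   {estimate_power_mw(no_cache):.1f} mW")    # ~52 mW
print(f"with cache: {estimate_power_mw(with_cache):.1f} mW")  # ~23 mW
```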

In the area of functional verification, the explosion in the complexity of verification continues to be a formidable challenge for EDA. As designs expand in size and complexity, simulation runs cannot reach effective coverage within a reasonable amount of time. To keep on schedule, designers are being forced to either lower their coverage goals or change methodologies. These methodology changes include ESL (electronic system level) design, coverage-based verification, emulation, hardware acceleration of test benches and assertion-based verification.

One of the most promising new methodologies is intelligent testbench automation that removes redundant simulation, giving top priority to running each unique test first. This results in achieving target coverage linearly, leaving more time for running random tests, or even expanding the test space to previously untested functionality.
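
The core idea can be sketched in a few lines of Python; this is only a conceptual illustration of unique-first ordering, not any product’s actual algorithm, and the coverage signature here is deliberately trivial.

```python
# "Unique tests first": order stimulus so every test that hits new coverage
# runs before any test that merely repeats coverage already seen.

def coverage_signature(test):
    """Stand-in for the coverage bins a test exercises; a real tool derives
    this from its stimulus rules rather than from raw test fields."""
    return (test["opcode"], test["addr"] % 4)

def unique_first(tests):
    seen, unique, redundant = set(), [], []
    for t in tests:
        sig = coverage_signature(t)
        if sig not in seen:
            seen.add(sig)
            unique.append(t)      # new coverage: run first
        else:
            redundant.append(t)   # already covered: defer or drop
    return unique + redundant

tests = [{"opcode": op, "addr": a} for op in ("LD", "ST") for a in range(6)]
for t in unique_first(tests):
    print(t)
```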

Emulation is also evolving to address the growing complexity of system design and the increasing need for hardware/software co-verification. While traditional ‘in-circuit emulation’ is still used, the trend of leading edge users is toward acceleration of test benches, or co-modeling, virtual IP stimulus (rather than plug-in hardware) and software debug for dozens of simultaneous users.

As a result, emulation can be set up as a typical IT server farm, with users remotely accessing the portion of the emulator capacity they need. The cost per cycle of emulation is two to three orders of magnitude lower than simulation on a traditional server farm. A large, and increasing, share of emulation deployment is in systems companies, which use emulation both to debug multi-chip systems and to develop and verify embedded software.
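
Some back-of-envelope arithmetic shows how the "two to three orders of magnitude" claim can arise; every figure below is an assumption chosen for illustration, not measured data.

```python
# Illustrative cost-per-cycle comparison. All figures are assumed.
sim_cycles_per_sec = 1e3         # assumed RTL simulation speed on one server
emu_cycles_per_sec = 1e6         # assumed emulator speed (order of 1 MHz)
sim_cost_per_hour  = 1.0         # assumed server cost, arbitrary units
emu_cost_per_hour  = 200.0 / 20  # assumed emulator cost, amortized over
                                 # 20 simultaneous remote users

sim_cost_per_cycle = sim_cost_per_hour / (sim_cycles_per_sec * 3600)
emu_cost_per_cycle = emu_cost_per_hour / (emu_cycles_per_sec * 3600)
print(f"simulation: {sim_cost_per_cycle:.2e} units/cycle")
print(f"emulation:  {emu_cost_per_cycle:.2e} units/cycle "
      f"({sim_cost_per_cycle / emu_cost_per_cycle:.0f}x cheaper)")  # ~100x
```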

Another challenging area is embedded software. The cost of designing hardware has actually not increased much over the last twenty years, according to the ITRS roadmap and Gary Smith EDA.

What has increased is the cost of system engineering and embedded software development. The enablement software for the SoC (drivers, Linux, lightweight executives, etc.) is becoming the bottleneck in the release process.

Ideally, the SoC design process would enable embedded software development ahead of silicon. But that’s not as easy as it sounds. First, the virtual representation needs to run sufficiently fast. Second, an environment that is native to the software development team needs to be established; trying to train the embedded software team on the use of hardware design tools is a non-starter. The solution that is emerging is a single embedded software development environment that is the same whether the target is a simulation, emulation, prototype or final product. The popularity of this kind of environment is growing rapidly.
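
One way to picture such an environment is a single debug front end with interchangeable backends; the class names below are a hypothetical sketch, not Mentor’s actual API.

```python
# Hypothetical sketch: one developer-facing interface, retargeted at a
# simulator, an emulator, or real hardware without changing the workflow.
class DebugTarget:
    def connect(self): raise NotImplementedError

class SimulationTarget(DebugTarget):
    def connect(self): print("attaching to instruction-set simulator")

class EmulationTarget(DebugTarget):
    def connect(self): print("attaching to emulator channel")

class HardwareTarget(DebugTarget):
    def connect(self): print("attaching to JTAG probe on the board")

def make_target(kind):
    return {"sim": SimulationTarget,
            "emu": EmulationTarget,
            "hw": HardwareTarget}[kind]()

# The team's workflow is identical before and after silicon arrives:
for stage in ("sim", "emu", "hw"):
    make_target(stage).connect()
```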

For example, Mentor’s Sourcery CodeBench, which is built upon the GNU open source tool chain for cross development of Linux, RTOS and bare-metal based embedded systems, is experiencing more than 20,000 downloads per month.

