January 09, 2012
Blurring the line between EDA & Test

by Russ Henke - Contributing Editor
Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us!




Calendar 2012 marks the third consecutive year in which this writer will have posted EDA WEEKLY articles in EDACafe.com as a Contributing Editor. 

Appearing every four weeks or so, EDA WEEKLY topics chosen to date have ranged from 'how to write an effective business plan,' to profiles on people, vendors and products from both private and public sources in Electronic Design Automation as well as in Mechanical CAD and Mechanical CAE.

Along with separate quarterly reports on financial results within the context of the then-current worldwide and national economies, emerging trends have been identified and explained, such as certain visionary vendors beginning to offer software tools in multiple technical disciplines toward the goal of simulating real-world multi-physical effects. The EDA WEEKLY has also reported on the steady march of start-ups, acquisitions, conferences, stockholder challenges, and the rise of electronics intellectual property in these two disciplines.

Even the exploits of certain companies in the world of electronics testing have been occasionally covered, such as Tektronix (May 30, 2011 -- "A Tektronix Gain") and the two-part series on Agilent Technologies (February 01, 2010 and March 29, 2010 -- "Agilent EEsof EDA"). The latter twosome in fact created the most "reader click-throughs of 2010." 

2012 - Happy New Year!

To launch 2012, the writer decided to once again turn to Agilent Technologies among others, to explore the relatively esoteric role that EDA has played in the past in the development and use of electronic test equipment, and how the formerly-bright line between EDA and Test is becoming blurred these days. 

Then, a new feature of the EDA WEEKLY that may appear from time to time: An article selected by this writer as likely possessing the qualities of special interest to readers, but penned by other persons for initial publication here. More on this later. 

Read on! 

Various font sizes are used in different sections of this EDA WEEKLY, to improve readability. A REMINDER: Depending on the READER’S PREFERENCE, the font size he/she is viewing can usually be changed by depressing the “control” key and then pressing the “+” or “-” keys of the computer in use.

Assuming, as we do, that the Synopsys acquisition of MAGMA will be fully consummated during the early part of 2012 and that MAGMA's financials will be subsumed by those of Synopsys, MAGMA will no longer be covered as a separate entity in future EDA WEEKLIES. The EDA WEEKLY will once again be seeking a publicly-held fifth vendor to add to the Group, to bring it back to five. Readers’ suggestions are welcome; please send them to the writer, along with the reasons why, via Email Contact by February 01, 2012. THANK YOU.

Introduction of EDA WEEKLY Topic #1 of 2012:

Blurring the line between EDA and Test

Behind every electronics device deemed so essential to modern life (smart phones, Digital TV, satellite radios, navigation systems, radar, laptop computers, iPads, etc.), stands a long and sophisticated supply chain of enterprises critical to the creation of these final consumer products.

As these end-products are designed, during each step of the way the intermediate devices must be tested to ensure that they will function as intended over a specified operating range and meet other standards of durability, safe operation, reasonable cost, and so on.

Accordingly, there has to be equipment available to carry out all that intermediate and final testing, and of course there is a whole industry of test equipment manufacturers and suppliers to serve this need -- an industry that has been around in some form since long before what we know as "modern EDA" ever existed.

But these days, sitting directly between EDA and Hardware Test is an interesting and growing new area where the two disciplines are becoming more and more blended.  No longer are the worlds of EDA and Test 100% distinct, and this edition of EDA WEEKLY explores this emerging trend.

The Blurring Line between EDA and Test & Measurement Hardware 

Clearly the pace of modern communications is moving faster and faster, and design engineers are more challenged than ever before, as the recent EDA WEEKLY article on CEVA demonstrated (see "CEVA's Cruising!" – first posted November 14, 2011):




This trend is partially driven by the competitive environment and partially because the next generations of communication protocols (3G, 4G, etc.) have barely been labeled “standards” when the consumer products using them have already hit the streets. 

To get a jump start on the competition, some of the leading companies are looking for ways to blur the line between EDA and Test to accelerate their designs and be first to market.

In doing several articles on Agilent Technologies in the past (see URLs below in Footnote [1]), the writer noticed that Agilent already appeared involved in the early stages of combining EDA and Test, and so he turned to this local source (Santa Rosa, CA) once more in late 2011 to begin his investigation.

And once again, Agilent did not disappoint! “You are truly on to something there,” agreed Charles Plott, Product Planning Manager for Agilent EEsof EDA's core EDA products. “There is no question that combining EDA and Test is a growing trend and there are many drivers at play.”




Charles continued, “Of course, EDA has a significant role in design, but there comes a point when you have to build something. The tests, specs, algorithms, and plots used in the early stages in EDA are the same ones you measure on the test bench. The question is whether we can save engineers' time by finding innovative ways to streamline the flow rather than treating them as two separate worlds – the EDA World and the Measurement World.”





Caption: Turning ideas into validated designs requires both EDA and Measurement


(figure provided by Agilent Technologies)


Charles continued, "Additional evidence supporting the trend that linking EDA and measurement is a growing field is that others are taking interest in this space. For example, in recent months National Instruments decided to acquire Applied Wave Research. At NI Week in Austin, NI started to share its future vision for linking EDA software to its instrumentation. Also, Synopsys announced that it would collaborate with the Rohde & Schwarz hardware company on a software validation library for the emerging LTE (Long-Term Evolution) phone standard. So there is plenty of evidence to support the existence of a new market dynamic here," said Charles.

So in this EDA WEEKLY article, in collaboration with Agilent and others, the writer explores several specific areas and examples where EDA and Measurement have become more intertwined. As you will see, this is not necessarily a new trend, but it is an increasing one, and it’s getting more attention given the companies involved, their growing investments and their declared strategies.


Example 1:

Device Modeling – One of the First Areas where EDA meets Measurement

Every EDA circuit simulator requires some level of transistor model.  It turns out that most of these models originate in a blended world of EDA and Measurement.  In some cases, the actual measured data is the model (more on this later).  Given that the transistor model is at the heart of most designs, its accuracy is of critical importance.  A further proof point is that EDA companies, foundries, and the top-tier companies designing communication products all have investments in what is called “Device Modeling.”  It is arguably the first place where the world of EDA and the world of Measurement cross paths.

The writer was fortunate to have the opportunity to speak with one of Agilent’s modeling software experts, who enthusiastically shared some insights about Measurements and EDA and its importance to Agilent.  “Having a credible modeling solution is a fundamental requirement for EDA suppliers – in other words, simulation alone is a partial solution – you have to have a core expertise in modeling,” said Dr. Roberto Tinti, of Agilent’s device modeling software team.

Dr. Roberto Tinti

“Agilent sees ‘modeling’ as a strategic investment on both the software and measurement/hardware side, knowing that this is one of the top issues of designers in many segments of the industry.  It is important for us to continue demonstrating both innovation and technical leadership in this area,”
Dr. Tinti continued. “I learned that it really comes down to this: device models are only as good as the measured data that is used to extract them, so measurement expertise is critical to be able to gather accurate and meaningful device data for modeling.  Today there is an entire EDA segment where dedicated software is used not only to control the measurements, but also to optimize the extraction of compact model device parameters based on a series of specific measurements.”
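As a concrete, deliberately simplified illustration of what "extracting compact model parameters from measurements" means, the sketch below fits the two parameters of the ideal diode law to a set of I-V points. The data here is synthetic and the model is trivial compared with the production compact models Dr. Tinti's tools target, but the shape of the task is the same: measure, then solve for the parameters that make the model reproduce the measurement.

```python
import numpy as np

# Synthetic "measured" forward I-V data for an illustrative diode
# (in practice these points would come from instruments driven by
# device-modeling software).
VT = 0.02585                      # thermal voltage at ~300 K, in volts
IS_TRUE, N_TRUE = 1e-14, 1.5      # "unknown" saturation current and ideality
v = np.linspace(0.4, 0.7, 16)     # bias sweep, volts
i = IS_TRUE * np.exp(v / (N_TRUE * VT))   # ideal diode law

# Extraction: the ideal diode law  I = Is * exp(V / (n*VT))  is linear in
# log space:  ln(I) = ln(Is) + V/(n*VT).  A least-squares line through
# (V, ln I) recovers both compact-model parameters.
slope, intercept = np.polyfit(v, np.log(i), 1)
n_fit = 1.0 / (slope * VT)
is_fit = np.exp(intercept)

print(f"extracted n  = {n_fit:.3f}")   # ~1.500
print(f"extracted Is = {is_fit:.2e}")  # ~1e-14 A
```

Real extraction flows optimize dozens of coupled parameters against many sweeps, but the principle, measurement in, model parameters out, is the one the quote describes.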


“Today we offer products like IC-CAP and WaferPro, not only to control the instruments (and even the probe station), but also to help collect, interpolate, and refine the measurements so they can be used in EDA design tools and circuit simulators downstream,” said Dr. Tinti. “This not only saves our customers time, but also helps provide them with the most accurate models for their simulations.”

Caption: Agilent’s WaferPro showing automation and statistical results.

The participation of other companies in this space confirms its importance. Accelicon, an EDA supplier based in Cupertino, CA and China, and specializing in device characterization, has found an important niche offering software focused on helping companies validate the accuracy of their models against the myriad measurements that get made.

So once again, EDA and Test are not so independent. [2]

As briefly mentioned above, there are some interesting technical advancements where the actual measurement itself is used as the device model. Dr. Tinti explains, “Device modeling for compound semiconductors has always been challenging.  There are many effects like carrier trapping, thermal and non-quasi-static, that strongly influence the device behavior and are very difficult to model using traditional compact models (equation-based).  In addition, new materials like GaN (gallium nitride) are now being used for high voltage, high power devices and no accurate compact device models are available. To fill these gaps companies like ours are exploring artificial neural network technology to extract models that are entirely based on measurements.” 


So here’s another case where the concept of separate EDA and Measurement worlds is no longer valid at all – the measurement itself is embedded in EDA.
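A minimal sketch of this "the measurement is the model" idea, using plain interpolation in place of the neural networks Dr. Tinti describes (all values below are invented for illustration):

```python
import numpy as np

# A measurement-based ("table") device model: instead of fitting a
# closed-form compact model, store measured operating points and
# interpolate between them -- conceptually the same move as the
# neural-network models described above, which also learn the
# current-voltage mapping directly from measured data.
v_meas = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])            # measured bias, V
i_meas = np.array([0.0, 0.1, 0.9, 4.0, 11.0, 22.0]) * 1e-3   # measured current, A

def device_current(v):
    """Behavioral model: current at bias v, built from measured data alone."""
    return np.interp(v, v_meas, i_meas)

# The simulator can now query the "model" at any bias within the sweep:
print(device_current(0.5))   # interpolated between measured points
```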

Example 2:
X-parameters Advance Active Component Simulation and Measurement

Many of the same observations in the last section can be extended to modeling in general (not just at the transistor level). In high-frequency design, modeling linear networks has been dominated by a technology known as "S-parameters." Like voltages and currents, S-parameters have been the long-standing figure of merit for anyone doing high-frequency communications design. Entire categories of EDA simulators (linear simulators, electromagnetic simulators, filter synthesis tools, etc.) have been built for producing S-parameters. In the measurement world, there exists a several-hundred-million-dollar industry supplying network analyzers for this same purpose.
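For readers less familiar with them, the sketch below illustrates, with invented numbers and independent of any particular vendor tool, what an S-parameter matrix encodes: it maps the waves incident on each port of a network to the waves reflected or transmitted out of each port.

```python
import numpy as np

# S-parameters relate incident waves a to reflected/transmitted waves b
# at each port:  b = S @ a.  As an illustration, here is the 2-port
# S-matrix of an ideal matched 6 dB attenuator: no reflection at either
# port (S11 = S22 = 0), and transmission S21 = S12 = 10**(-6/20).
s21 = 10 ** (-6 / 20)           # 6 dB loss as a linear voltage ratio
S = np.array([[0.0, s21],
              [s21, 0.0]])

a = np.array([1.0, 0.0])        # unit incident wave at port 1 only
b = S @ a                       # waves emerging from each port

print(b[0])                              # reflected at port 1: 0.0
print(20 * np.log10(abs(b[1])))          # transmitted to port 2: -6.0 dB
```

A network analyzer measures exactly these ratios; a linear simulator computes them, which is why the two worlds share the same figure of merit.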

More recently, there has been a technical breakthrough called "X-parameters" that this writer first mentioned in the EDA WEEKLY article of March 29, 2010:



Like S-parameters, X-parameters™ are a category of network parameters, one that allows engineers to completely characterize nonlinear circuits or systems with a measurement. So, again, as in the device modeling case, the measurement itself is used within the EDA simulation.

“X-parameters are one of the more interesting technical innovations that have hit the RF and Microwave market in recent years,” said Dr. Larry Dunleavy, President of Modelithics, Tampa, FL. “Nonlinear modeling and measurement has always been a challenge and we have seen some encouraging results as we work with customers providing modeling and measurement services.”

Other industry players on the measurement side see the trend as well.   “We’ve been working with Agilent since 2008 on the proof-of-concept and integration of Load Pull with X-parameters and have seen a growing interest from our customers,” said Greg Maury, President and Chief Executive Officer of Maury Microwave, Ontario, CA. “As a device characterization measurement and modeling solutions provider, X-parameters allow our 'load pull' customers to create nonlinear models of their non-50ohm active devices and use them directly with EDA tools, something that has been requested for years.”

Caption: Maury Microwave recently pioneered using X-parameters as part of a load-pull measurement where results are then used in EDA simulators.

“Sure, you can measure X-parameters with a Nonlinear Vector Network Analyzer (NVNA) to come up with a model of a component and then include it with your EDA tool,”
opined Charles Plott of Agilent EEsof.

Charles Plott

“But the same principles and algorithms used in the measurement equipment are also used in the EDA simulator itself,"
said Charles. "In other words, at the schematic level you can do a virtual measurement and replace the entire schematic with an X-parameter model that is more efficient and portable, and the schematic-level IP (intellectual property) is protected. Here again, you can see the old lines between measurement hardware and EDA software become pretty indistinguishable.”



Example 3:
System-level Simulation with Measurement Hardware in the Loop

The area that has arguably seen the most growth in synergies between EDA and Measurement is system design and verification (i.e., answering the question, "Is your component or system meeting an overall spec?"). Much of this is driven by the explosion of today’s complex communications standards: there is just a lot more to test than there used to be! Rather than waiting for final hardware prototypes to be available to verify against a spec, such as the latest LTE-Advanced standard (Long-Term Evolution – Advanced), it is possible to start this final verification task sooner using simulation. In this way, designers can resolve conflicts before they ever surface in hardware.

There are many examples where EDA and Hardware are used simultaneously in system design, according to Daren McClearnon, a senior member and application expert in the Software Modular Solutions organization of Agilent:

McClearnon explains, “If you look at a project timeline for a typical consumer product, today’s design flows are essentially multi-threaded, not sequential. When several designers are involved in a project, each moves his/her design forward in parallel, and to the extent that each can evaluate the others’ intermediate progress and changes, overall system verification can begin that much earlier. This eliminates many of the obvious downstream surprises, and allows earlier validation of the architecture as a whole.”

Daren McClearnon


"Agilent’s SystemVue product line sits in the intersection between EDA and Test.  SystemVue is an EDA design tool, not simply a test automation tool,” continued Daren. “This is an important distinction – rather than focusing on just being more efficient in testing the same way as always – a better approach is to verify a ‘virtual system’ from the first day of a project with simulation and then gradually incorporate more measurements along the way. This saves time and reduces project risk.”

So here's the thesis of this approach: Take the new 4G standards, for example; they are not only complicated, they are continuously ‘Evolving’ (the E in LTE). To save engineers time, communications system-level EDA tools provide the flexibility to create, import, or acquire early communication standards and reference designs. Those same reference signals can be downloaded to the test equipment to generate realistic signals for validation of a hardware prototype long before the rest of the system components are ready.

Caption: Downloading software LTE-Advanced signals from Agilent’s SystemVue product to be used within the test equipment as a source






[1] URLs of past EDA WEEKLIES involving Agilent Technologies





[2] Interviews with Agilent

The interviews contained in the article between the writer and Agilent Technologies' personnel all occurred in November 2011, prior to the Thanksgiving holiday November 24, 2011. In a subsequent news release by Agilent on December 2, 2011, the Company announced the pending acquisition of Accelicon:



While some if not all of the Agilent personnel interviewed for the EDA WEEKLY article herein were aware of the then-current acquisition dialog with Accelicon, to a person, each was silent about it.




Summary of the Entire Article to this point:

"EDA and Test are not so distinct."

Well, that's the assertion of the many credible experts quoted in this article. This writer for one believes them; he has seen the outcomes. That's just the way it is with controlled electricity - you can't "see" it, but you can sure observe its outcomes!

Great thanks to all those quoted in this article, and special thanks again to Agilent's Charles Plott for his time and enthusiasm.

Competitive pressures are driving needs for a blended solution, and the examples in this article bring this emerging area into better focus. Beyond that, it is clear that large industry players like Agilent have been investing heavily in this area, and other companies are jumping in as well, so this looks to be an exciting area to watch.




SERENDIPITY - Thy name is Mentor Graphics!

The "watching" referred to at the conclusion of the above article on Agilent paid off immediately!

Only two weeks had passed since the conclusion of the Agilent interviews before the writer received this December 12, 2011 news release from one of his former employers:


Mentor Graphics Announces Industry’s First Integrated Solution for Component-to-System Thermal Characterization and Analysis


WILSONVILLE, Ore., December 12, 2011—Mentor Graphics Corporation (NASDAQ: MENT) announced the electronic industry’s first combined technology for thermal characterization and simulation with T3Ster® hardware test products and its FloTHERM® software. The Mentor Graphics® T3Ster product is the world’s leading advanced thermal transient tester for semiconductor device packages and LEDs. Mentor’s FloTHERM product is the de-facto standard for electronics thermal simulation and analysis to predict airflow, temperature and heat transfer throughout electronics equipment including components, boards, and entire systems. The interface between T3Ster and FloTHERM seamlessly creates accurate thermal simulation models. The thermal characterization offering is unique because it is the only JESD51-14 compliant solution available on the market today.

Increased design complexity and smaller form factors create heat management problems which represent one of the biggest challenges in electronics today. Temperature is understood to be the key accelerator in the majority of reliability failures and IEEE Standard 1413 recognizes the need for accurate thermal data at all levels of a system’s implementation.

Designers working on subsystems and systems using information for LED, semiconductor, and package components use complex thermal analysis software to help accelerate the design of their products but the analysis, typically based on vendor datasheets, often provides insufficient results. Clear, accessible, and reliable data relating to thermal characteristics upstream and downstream is critical. Methods to achieve this have traditionally been awkward, manual and therefore error-prone – until now with Mentor Graphics T3Ster thermal characterization and interface to the FloTHERM thermal simulation software product.

Now the integration of Mentor’s T3Ster hardware measurement and FloTHERM software simulation provides a combined methodology of optimizing heat management in devices, sub-systems and full systems. Manufacturers are able to optimize their LED and IC package designs for effective heat dissipation. Once the device prototype is built, they can then characterize the device from a thermal perspective and build accurate models for use in FloTHERM thermal software simulations at both the sub-system and full system levels. Finally, systems integrators can further verify their heat management solutions with physical measurements using the T3Ster hardware.

“As LEDs become more powerful, more attention should be paid to thermal management, which is essential to ensure stable LED performance and long lifetime. This is why OSRAM is devoting considerable attention to thermal design. T3Ster’s accuracy and repeatability enable us to verify our thermal designs and confirm the stability and reliability of our products,” stated Dr. Thomas Zahner, quality manager, Osram Opto Semiconductors. “By testing in bulk we get increased statistical confidence in the measurement results. The structure functions built into the T3Ster software are extremely powerful for identifying different thermal attach issues during our extensive reliability testing.” (from “When Designing with Power LEDs, Consider Their Real Time Thermal Resistance,” by Andras Poppe, Mentor Graphics, Nov/Dec 2009 LED Professional Review.)

“To design our lighting systems, we needed reliable data regarding LED characteristics and simulation tools which deliver results quickly. We found that Mentor’s T3Ster and FloTHERM products were the best such tools available to us and we could use them in our project to verify our PearLight street lighting luminaires for proper heat management. Accuracy and speed in achieving the results was critical for our business,” said András Szalai, CFO, HungaroLux.

JEDEC is the organization dedicated to microelectronics industry standards. The Mentor Graphics T3Ster advanced thermal characterization tester for semiconductor packages is the only commercially available product to fully implement the JEDEC JESD51-14 new measurement methodology standard for the junction-to-case thermal resistance of power semiconductor devices. The T3Ster test methodology ensures higher accuracy and repeatability compared to classical steady-state measurements based on older standards. Mentor’s FloTHERM product allows engineers to implement virtual prototypes using advanced CFD techniques to simulate airflow, temperature and heat transfer in electronic systems. By using accurate thermal analysis, engineers can evaluate and test designs automatically before physical prototypes are built. When combined with the T3Ster product, engineers using the FloTHERM tool will benefit from both accurate thermal simulation models derived from real measurements and thermal package characterization testing.

Package characterization measurements provide an insight into the package structure with thermal resistances and thermal capacitances. Simulation software provides the engineer with information on specific sections of the design that correspond to the measured structure. Thermal interface materials are quite difficult to model since their conductivity and thickness cannot be determined with high accuracy. Thus, thermal package measurements produced by the T3Ster product, based on the resistance of these materials, can be used later for accurate model creation in FloTHERM software. This seamless process provides fast, easy and accurate model creation; identifies product design defects; and enables manufacturing quality checking. The combined T3Ster tester and FloTHERM analysis software solution is compatible with other Mentor Graphics products to provide comprehensive thermal simulation for optimum system reliability, from IC package and LED, to PCB, and to full system development.
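The thermal network idea behind those measurements can be sketched in a few lines. The resistor and capacitor values below are invented for illustration; real structure functions extracted by a transient tester contain many more stages:

```python
import numpy as np

# Thermal transient testing models the heat path as a ladder of thermal
# resistances (K/W) and capacitances (J/K).  In a Foster-form network,
# the junction temperature rise after a power step P is a sum of
# exponentials:  dT(t) = P * sum_i R_i * (1 - exp(-t / (R_i * C_i))).
# Transient testers work the other way around: they record dT(t) and
# extract the R_i, C_i structure from it.  Illustrative values only:
R = np.array([0.5, 2.0, 8.0])     # thermal resistances, K/W
C = np.array([1e-3, 1e-2, 0.5])   # thermal capacitances, J/K
P = 1.0                           # dissipated power step, W

def delta_t(t):
    """Junction temperature rise (K) at time t (s) after the power step."""
    tau = R * C                   # one time constant per RC stage
    return P * np.sum(R * (1.0 - np.exp(-t / tau)))

print(delta_t(0.0))     # 0 K at the instant of the step
print(delta_t(1e3))     # steady state: P * sum(R) = 10.5 K
```

Fitting such R and C values to a measured transient is, in miniature, what turns a hardware measurement into a simulation-ready thermal model.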

“Mentor’s best-in-class thermal simulation and measurement of semiconductor packages and LEDs provide tremendous advantages for customers faced with thermal challenges,”
said Dr. Erich Buergel, general manager of Mentor Graphics Mechanical Analysis Division. “The ease in creating accurate thermal models based on reliable thermal measurements helps users quickly identify design problems and create design alternatives which improve product quality, reliability, and profitability.”

The Mentor Graphics T3Ster and FloTHERM solution for efficient thermal package characterization is available today.

End of news release.

Long time readers of the EDA WEEKLY will recall that the December 07, 2009 issue introduced Dr. Erich Buergel of the  Mechanical Analysis Division (MAD) of Mentor Graphics in an article entitled, "MAD Progress."



Joining MAD only a few months before that article was posted, Erich had become the new General Manager of the MGC Mechanical Analysis Division, formed after the acquisition of Flomerics plc by MGC in August 2008.

Dr. Buergel (pictured above) continues to report to Henry Potts, VP & GM of the MGC Systems Design Division (SDD). Mr. Potts, headquartered in Longmont, CO, was featured in the July 25, 2011 EDA WEEKLY, entitled, "Back to the Future":



Not mentioned in either previous article was the existence of a small group in Budapest, Hungary that had been acquired by Flomerics shortly before Flomerics was acquired by MGC.



It turns out that this latter Budapest group is responsible for the development of the products for Thermal and Optical Testing discussed in the news release above:

        — T3Ster® – thermal testing of electronic parts and systems...

...is combined with the more recently developed TERALED® product...

...to create the T3Ster/TERALED® system for combined thermal and optical testing of LEDs.
So in the MGC News Release of December 12, 2011, we have found our fourth example of a system that blurs the line between EDA and Test, an example of a system already in production and use by several MGC customers!

Great thanks to MGC’s Suzanne Graham for arranging the MGC briefing on its new products, to Kim Coxe for e-mailing collaterals, and especially to John Isaac for braving the snows of Longmont, CO to conduct the Internet briefing. And a nod of congratulations to MGC for the “management moxie” to market these EDA/Test combination products.




A Personal Aside

The general topic of combining software analysis and testing to solve problems and/or to design better products, has long been an interest of the writer, no doubt because it's the same approach we mechanical engineers took, back in the seventies, when faced with understanding and simulating the mechanical dynamics of complex structures like machine tools and automobiles of the day.

Available techniques, such as finite element analysis software, were too limited in those days to model much of the structure, and even when we could build such computer models for then-current software such as ANSYS or NASTRAN, the digital computers available to us were way too small and slow for timely answers. By "us," I mean a tiny, 30-person consulting company based in Fairfax, OH, called “SDRC.”

“What did we do?" We used static and dynamic tests with electric or hydraulic vibration shakers, as well as new test equipment from Spectral Dynamics (then in San Diego, CA), to plot the actual structure’s ‘transfer function.’ That gave us just enough information (our equivalent of Agilent's “device models”) about the complex structure to construct approximate digital computer simulations. These simulations allowed us to help machine tool companies like Cincinnati Milacron build more stable, chatter-resistant milling machines, and to help Detroit car companies, in the face of the very first oil embargo, reduce the weight and mass of their cars while still achieving adequate structural integrity with vastly improved dynamic behavior.
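For the curious, the kind of transfer function being measured can be sketched for a single structural mode. The modal values below are invented for illustration, not the Milacron or Cadillac numbers:

```python
import numpy as np

# The "transfer function" measured with shakers is, for one
# mass-spring-damper mode, the frequency response
#     H(w) = X/F = 1 / (k - m*w**2 + j*c*w)
# whose peak locates the resonance responsible for chatter.
# Illustrative modal values for a single structural mode:
m, k, c = 100.0, 1.0e7, 2.0e3   # mass (kg), stiffness (N/m), damping (N*s/m)

def H(w):
    """Compliance frequency response at angular frequency w (rad/s)."""
    return 1.0 / (k - m * w**2 + 1j * c * w)

wn = np.sqrt(k / m)             # undamped natural frequency, rad/s
print(wn)                       # ~316.2 rad/s (~50.3 Hz)
print(abs(H(wn)))               # response at resonance, limited only by damping
```

Measuring enough of these modal peaks was the "just enough information" that stood in for a full finite element model.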



“The re-designed, downsized 1975 Cadillac Seville was one of our first success stories using this approach, a car that weighed 1000 pounds less than its predecessors with equivalent, arguably improved dynamic behavior,” said the writer recently, inwardly delighted that he could summon up such old memories.




Introduction of EDA WEEKLY Topic #2 of 2012:

The EDA WEEKLY is introducing a new feature in 2012 under the aegis of the current EDA WEEKLY writer, who began his period of care of the EDA WEEKLY franchise in October of 2009. The new feature: From time to time the EDA WEEKLY will publish guest articles that should be interesting to the readership but that the writer is unlikely to pursue on his own.

The following inaugural article is contributed by Ms. Linh Hong, vice president of marketing at Kilopass Technology, Santa Clara, CA. Ms. Hong is solely responsible for the following article’s content. (The backgrounds on the contributor and the company are presented at the end of the article.)


Building a Successful Non Volatile Memory (NVM) Company on the basis of CMOS Oxide Breakdown

Starting its second decade in business under current CEO Charlie Cheng, Kilopass Technology Inc. continues its successful growth driven by two major movements. The first comprises market forces where consumers are demanding greater functionality from their mobile smart devices beyond audio and video to include environmental data that will eventually provide life care for the consumer. The second involves technology forces that continue to deliver more transistors per silicon area for each new semiconductor process generation, now at 28nm going to 20nm.

The widespread adoption of Kilopass' unique standard logic CMOS anti-fuse, one-time programmable (OTP), non-volatile memory (NVM) intellectual property (IP) is reflected in the growing number of Kilopass foundry and IDM partners. Among foundries signing new agreements are UMC, SMIC, GLOBALFOUNDRIES, Dongbu and TowerJazz, which join long-standing Kilopass partner TSMC, the first to offer Kilopass IP at 28nm. The key to success for an IP company is silicon enablement, and Kilopass IP is available on process nodes from 180nm down to 28nm at its major foundry partners, providing solutions to customers across many markets. Among major Integrated Device Manufacturers (IDMs) inking deals with Kilopass are the major suppliers of image sensors, display drivers, and gaming chips.

To understand how this successful start-up is being driven by evolutionary technical and market forces, an explanation of the company’s patented anti-fuse NVM IP and how it compares with alternative NVM solutions is the place to begin. Next, a description of how this anti-fuse NVM IP has symbiotically evolved with the steady progression of each new generation of standard logic CMOS processes, currently at 28nm and moving to 20nm and beyond, is in order. Finally, a discussion of how the anti-fuse NVM IP uniquely serves the four high-volume applications where it is being incorporated will detail how market forces are driving the company’s ongoing success.

OTP anti-fuse memory technology has been available for several decades.  The principle behind its operation is simple. The basic storage element is a CMOS transistor that, in its un-programmed state, represents an open circuit. See Figure 1. During programming, the gate oxide of the transistor is broken down to produce a low-impedance path for current flow, thus storing a bit of data.  The gate oxide breakdown is permanent: unlike some other NVM solutions, the stored data is unaffected by the number of accesses or by environmental factors, even in hostile environments such as automotive and military/aerospace.
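The one-way behavior described above can be illustrated with a toy software model (a hypothetical sketch, not Kilopass' actual circuitry or interface): each cell starts un-programmed and reads as an open circuit, and programming it is irreversible.

```python
# Toy model of an anti-fuse OTP array. Class and method names are
# invented for illustration; real anti-fuse macros expose hardware
# interfaces, not Python objects.

class AntiFuseOTP:
    """Each cell starts un-programmed (open circuit, reads 0).
    Programming breaks down the gate oxide, permanently storing a 1.
    There is no erase operation, by design."""

    def __init__(self, size_bits):
        self._cells = [0] * size_bits  # all cells start un-programmed

    def program(self, addr):
        # Oxide breakdown is irreversible: a cell can only go 0 -> 1.
        self._cells[addr] = 1

    def read(self, addr):
        # Reads never alter the cell, regardless of access count.
        return self._cells[addr]

otp = AntiFuseOTP(8)
otp.program(3)
assert otp.read(3) == 1   # programmed cell reads back permanently
assert otp.read(0) == 0   # un-programmed cell remains an open circuit
```

The key property the model captures is that state transitions go in one direction only, which is exactly why the technology is immune to wear from repeated reads.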

Figure 1. Kilopass 2T Bit Cell

Kilopass was founded by prolific inventor Jack Peng, who patented the technology in 2001 and contributed to its early adoption by major customers.  He served as company CEO until the fall of 2005, when the reins were given to serial entrepreneur Bernie Aronson.  Aronson expanded the company’s customer base to over 50 and increased the number of total licenses to over 100, all contributing to over 100 million units in chip production.  After this successful run, Aronson handed the reins to Charlie Cheng in the fall of 2008.  By the end of 2010, Cheng had doubled the number of customers to 100, achieved 100 percent revenue growth, and seen the number of chips shipped with Kilopass IP exceed 2 billion units.

Charlie Cheng is a Silicon Valley veteran who honed his entrepreneurial skills at start-ups such as Aspec Technology, Edge Computer, Iomega, Viewlogic, and Zycad. His first solo entrepreneurial venture was Lexra, a CPU IP start-up that pioneered the first synthesizable 32-bit CPU core. After selling Lexra to MIPS Technology Inc., he joined Faraday Technology, where he held general management and marketing vice president roles before taking the helm as Chief Executive Officer.   After four years he left to become Entrepreneur-in-Residence at U.S. Venture Partners, a major Kilopass Technology investor, with an eye to finding a start-up where he could employ the full weight of his accumulated experience. Kilopass provided the ideal vehicle. Fluent in both English and Mandarin Chinese, Cheng is a graduate of Cornell University with a degree in mechanical engineering and computer science.

Making a Standard Logic CMOS Anti-Fuse

Peng’s invention enabled Kilopass to implement an anti-fuse in standard CMOS without additional mask or process steps, and thus at no extra cost: an elevated programming voltage converts a standard CMOS transistor into a low-resistance path for current, creating an anti-fuse, or bit cell.  Until 2001, when Kilopass was formed, creating the memory element required additional process steps.  Kilopass was conceived when 180nm standard logic CMOS became the semiconductor industry's volume manufacturing process. At this node, the gate oxide breakdown voltage is lower than the junction breakdown voltage, eliminating the need for added manufacturing steps.  Previously, extra protection had to be added to the transistor to prevent junction breakdown (the destruction of the transistor); this protection added manufacturing steps, boosting the cost of adding anti-fuse capability to any chip.

With each new CMOS process generation, transistor dimensions and gate oxide thickness shrink.  This shrinking improves the anti-fuse solution because the programming voltage required to cause gate oxide breakdown is reduced. Process scaling provides other benefits as well. The most obvious is that smaller transistors allow more memory to fit in a given silicon area.  Furthermore, successively smaller memory cells make the Kilopass memory, already the most tamper-resistant NVM available, even more secure: the smaller the cell, the smaller the read current needed to access data from any bit cell.

Kilopass has 59 patents issued or pending on anti-fuse technology.  As shown in Figure 2, the patents are divided into three groups of fundamental bit cells by number of transistors: 1T, 2T, and 3.5T.  Kilopass’ patents enable the embedding of OTP macros in standard CMOS products:


Figure 2. Anti-Fuse Patent Portfolio



Comparing NVM Alternatives

The competitive advantage of anti-fuse is best illustrated by comparing it with other embedded NVM technologies available in the market:

Embedded flash, the most expensive and most flexible of the embedded NVM technologies, is ideal for code and data storage that changes often. It can require 10 or more additional mask steps. The higher upfront cost of the technology is offset by high endurance, allowing a large number of read-write cycles. Commonly found in microcontrollers (MCUs), it gives end applications the flexibility to produce multiple configurations from one product.

For example, an MCU chip manufacturer may produce a low end 8-bit MCU for a dozen different appliance manufacturers.  By producing a base MCU and adding different features for each of the different customers in flash, the MCU supplier eliminates the need to manufacture a different MCU for each customer—thus saving silicon cost and eliminating the inventory problem of stockpiling different products for different customers.  Both the vendor and the customer benefit from the flexibility, respectively, to control inventory and provide differentiation.
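The one-base-die, many-configurations pattern described above can be sketched as a per-customer feature word burned into on-chip flash at test time. This is a hypothetical illustration; the feature names and bit layout below are invented, not any vendor's actual register map.

```python
# Hypothetical sketch: one base MCU die, differentiated per customer by
# a feature word the vendor writes into flash at final test.

FEATURES = {"uart2": 0x01, "pwm": 0x02, "adc12": 0x04, "can": 0x08}

def build_feature_word(enabled):
    """Pack the customer's enabled features into one flash byte."""
    word = 0
    for name in enabled:
        word |= FEATURES[name]
    return word

def has_feature(word, name):
    """Firmware checks the flash-resident word to enable a peripheral."""
    return bool(word & FEATURES[name])

# Customer A's appliance needs UART2 + PWM; customer B needs ADC + CAN.
cfg_a = build_feature_word(["uart2", "pwm"])
cfg_b = build_feature_word(["adc12", "can"])

assert has_feature(cfg_a, "pwm") and not has_feature(cfg_a, "can")
assert has_feature(cfg_b, "can") and not has_feature(cfg_b, "uart2")
```

Because flash is rewritable, the same die can even be reconfigured late in the supply chain, which is what makes the inventory savings possible.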

An alternative to flash is a floating-gate or charge-trapping solution in the form of the stacked-gate 1T cell: a CMOS transistor with a floating gate and a contacted gate overlapping each other. See Figure 3. Insulated by a high-quality oxide from the contacted gate, the floating gate is programmed by channel injection of electrons and erased by allowing the trapped electrons to escape. The presence or absence of electrons in the floating gate is read as a “1” or “0”. Like flash, the floating-gate memory can be electrically programmed and erased.

Figure 3. Floating Gate NVM Technology


However, both flash and the floating-gate solution require additional masks and processing steps on top of the standard CMOS logic process, though the latter requires fewer than the former.  How much additional cost a design requiring an embedded re-programmable NVM can bear determines whether an application opts for the flash or the floating-gate alternative.  Having to add processing steps to the standard logic CMOS process also means that these two solutions will not be available at the latest process node. Typically, flash on logic CMOS lags standard logic CMOS by three generations, whereas floating gate may be one or two generations behind.

Another alternative NVM storage technology is the electrical fuse (eFuse), made of polysilicon or metal.  The metal eFuse is a one-time programmable memory programmed by applying a high current to rupture a conductive link, or at least make its resistance significantly higher.  The reliability of a metal eFuse is a concern because debris and shards can cause the fuse to grow back over time.  See Figure 4. The polysilicon eFuse, with cobalt or nickel silicide on top, is programmed through a well-known reliability mechanism called electromigration, in which electron momentum pushes the silicide atoms out of the conductor link to produce a high-resistance or open circuit.

Figure 4. Poly Fuse Technology

The eFuse is typically custom-designed and provided by the foundry as a macro, affording the designer a lower-cost solution. However, migrating a design containing an eFuse from one foundry to another is problematic. Most fuses are programmed during wafer fabrication, with stringent power requirements for programming.  The eFuse bit cell is also the largest of the standard CMOS NVM technologies: if an application calls for bit density greater than 4Kb, each additional 4Kb eFuse increment begins to consume too much SoC area.

Finally, among all the NVM alternatives, ROM is the lowest-cost technology available; however, it is also the least flexible. ROM is typically used for storing code that does not change, such as the audio recording of “Happy Birthday” in a musical greeting card or the fonts in an ink-jet printer. The ROM is programmed as part of wafer fabrication, so its content can be changed only by producing a new mask layer.

The anti-fuse NVM addresses all the shortcomings of the four alternative NVM solutions.  Unlike flash or other charge-trapping NVM solutions, anti-fuse adds no cost over standard CMOS and scales easily with each new generation of process technology. Unlike the eFuse and ROM solutions, anti-fuse can be programmed in the final package, in the field, with a simple write command; it is also portable across multiple foundries and multiple processes from 180nm to 28nm.  Finally, of all the NVM solutions, anti-fuse is the most tamper-resistant, even against invasive attacks and scanning electron microscope observation.

Growing Demand for Embedded NVM

Fueling the company’s success and growth is the increasing consumption of digital multimedia content on portable devices (smartphones and tablets) and on consumer electronics devices in the home (Internet TVs and web-enabled set-top boxes). Further demand for the company’s NVM intellectual property (IP) comes from its incorporation in microcontrollers (MCUs) targeting automotive applications: over a hundred MCUs provide control of engine, transmission power, auto body, cabin environment, lamp, security, and audio entertainment in a typical new car today.  Finally, the number of analog and mixed-signal circuits incorporated in next-generation system-on-chip (SoC) designs continues to grow, producing additional consumption of Kilopass NVM IP.

According to a Global Industry Analysts, Inc. market research report published in May 2011, the global smartphone market will reach over 1.6 billion units by 2017.  This growth is fueled by the plethora of functionality smartphones provide, such as data services: applications, multimedia, and location-based services.  The market research firm sees near-field communications, enabling wireless purchases, electronic wallets and the like, becoming prevalent in the next generation of devices. All this functionality is fertile ground for theft and fraud by hackers exploiting the technology.  NVM within these portable devices provides tamper-resistant storage for personal identification numbers (PINs), encryption keys, and other private information.  However, most NVM technology is relatively easy to hack.

Kilopass’ embedded non-volatile memory resists hacking by passive, semi-invasive, and invasive methods; due to the nature of the technology, it is difficult to determine the content of the memory. Passive techniques, such as using current profiles to determine the word pattern, are unsuccessful because the bit-cell currents for “0”s and “1”s are much smaller than the current required for sensing or for operating the peripheral circuits that read the memory, so the pattern of the word being read cannot be determined.  Invasive techniques, including backside attacks and scanning electron microscope passive voltage contrast, are unsuccessful because it is very difficult to isolate a bit cell connected in a cross-point array. Furthermore, chemical etching or mechanical polishing cannot easily locate the oxide breakdown: in a cross-section or a top view, it is difficult to determine which bit is programmed.

Figure 5. NVM in Smart Phones

Figure 5 shows typical functions contained in a smart phone.  All can achieve higher integration by using embedded non-volatile memory (NVM) instead of serial flash memory or EEPROM on a separate die.  Kilopass estimates that 30 percent of the $5 billion spent each year on serial flash memory and EEPROM is for applications requiring 4Mb of memory or less.  Kilopass states that teardowns of Apple’s iPhone 3GS found about 10 serial flash memories and EEPROMs, most die-bonded within the packages of larger chips and thus invisible in system-level teardowns.  As smart phones continue to offer more features, embedded NVM will become more valuable, reducing chip count and adding security.

Figure 6. NVM in Home Electronics

In home consumer electronics, non-volatile memory has typically been used for conditional access: the encryption key loaded into the set-top box that enables the consumer to receive content from the service provider. However, as shown in Figure 6, with the advent of over-the-top media, content is circumventing the cable and satellite gateways via the Internet and the wired and wireless home networks that connect consumer electronics with PCs and digital devices.

Over-the-top media has spawned a proliferation of digital rights management keys that accompany digital multimedia over the Internet and throughout the home network.  For example, DTCP (Digital Transmission Content Protection) keys can be found on interfaces such as IEEE 1394, USB, MOST, Bluetooth, and TCP/IP, while HDCP (High-bandwidth Digital Content Protection) keys can be found on interfaces such as HDMI, DVI, DisplayPort, GVIF and UDI.

NVM in Automotive Applications

For today’s premium cars, "the cost of software and electronics can reach 35 to 40 percent of the cost of a car," said Manfred Broy, a professor of informatics at the Technical University of Munich, quoted in a story published in IEEE Spectrum in February 2009. The article asserted that these cars contain 70 to 100 microcontroller-based electronic control units (ECUs) networked throughout the body of the car, as illustrated in Figure 7. Even low-end cars now have 30 to 50 ECUs embedded in the body, doors, dash, roof, trunk, seats, and elsewhere.

Figure 7. NVM in the Automobile

Microcontrollers (MCUs) provide the intelligence directing these electronics.  With growing awareness of the need for security, automobile manufacturers are looking for new alternatives to flash and EEPROM: secure, low-cost, more temperature-tolerant and often field-programmable non-volatile memory (NVM) for the microcontrollers in automotive electronic systems. One-time programmable anti-fuse NVM with over-provisioned memory capacity is finding applications where the stored contents change infrequently over the life of the automobile: boot code for microcontrollers, eliminating external serial EEPROM; ROM patches to fix bugs in the field; and data logging of infrequently changing events, such as the odometer reaching service milestones.
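A common way "over-provisioned" OTP accommodates infrequently changing data such as odometer milestones is an append-only record log: since programmed bits can never be cleared, each update goes into the next free slot and reads return the newest record. A minimal sketch of that technique, with an invented interface (not Kilopass' actual macro API):

```python
# Hypothetical sketch: emulating updatable storage on one-time
# programmable memory by over-provisioning slots and appending.

class OTPLog:
    EMPTY = None  # marker for an un-programmed slot

    def __init__(self, slots):
        # Over-provision enough slots for the product's lifetime updates.
        self._slots = [self.EMPTY] * slots

    def update(self, value):
        """Write the new value into the next free slot (write-once)."""
        for i, slot in enumerate(self._slots):
            if slot is self.EMPTY:
                self._slots[i] = value
                return
        raise RuntimeError("OTP log exhausted: provision more slots")

    def current(self):
        """Return the most recently written value, or None if empty."""
        written = [s for s in self._slots if s is not self.EMPTY]
        return written[-1] if written else None

log = OTPLog(slots=16)                 # sized for the car's service life
for km in (10_000, 20_000, 30_000):    # odometer service milestones
    log.update(km)
assert log.current() == 30_000
```

Sizing the slot count against the expected number of lifetime updates is the design decision that "over-provisioned memory capacity" refers to.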

Analog mixed-signal circuits, including digital-to-analog converters, analog-to-digital converters, pulse-width modulators, and the like, are proliferating in new chips populated by sensors, accelerometers, and radio frequency circuits.  According to a TechNavio market research report published in May 2011, the global analog and mixed-signal device market will reach $49.45 billion in 2014. One key factor contributing to this market's growth is the explosion in intelligent handheld mobile devices incorporating applications (apps) that enable the device to sense the environment, location, speed, and direction, as well as to interact with the user through voice recognition and text-to-speech synthesis.

Chips containing these analog mixed-signal elements will soon include even more sensors than the ubiquitous accelerometer and GPS. According to the article “Which sensors are coming to your next smartphone?” in the May 23, 2011 issue of the web magazine Mobihealthnews.com, the following are in the offing: altimeters, heart-rate monitors, sensors to detect perspiration, and microphones plus temperature and humidity sensors for more environmental data.  Each will bring its share of analog mixed-signal circuits.

The problem with analog mixed-signal circuits is that they are subject to variation during manufacture and their characteristics change over time. To ensure they behave consistently, the circuits’ characteristics are trimmed during manufacture using digital parameters stored in NVM.  As the circuits’ characteristics drift over time, they are re-trimmed by changing the parameters in NVM to compensate for the drift. Storing these parameters presents a large market opportunity for NVM IP.
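The trim flow just described amounts to: measure the circuit, compute a digital correction code, and store that code in NVM so it is reloaded at every power-up. A sketch under invented assumptions (the target voltage, step size, and 5-bit trim range below are illustrative, not any real part's specification):

```python
# Hypothetical sketch of analog trimming with an NVM-stored code.
# Assumes a bandgap reference whose output moves ~4 mV per trim step.

TARGET_MV = 1200   # desired reference output, millivolts (assumed)
LSB_MV = 4         # one trim step moves the output ~4 mV (assumed)

def compute_trim_code(measured_mv):
    """Digital correction code that cancels the measured offset."""
    error = TARGET_MV - measured_mv
    code = round(error / LSB_MV)
    return max(-16, min(15, code))   # clamp to a 5-bit signed trim range

# At production test: measure once, then burn the code into NVM.
measured = 1188                      # example tester reading, mV
nvm_trim_code = compute_trim_code(measured)
assert nvm_trim_code == 3            # +3 steps, roughly +12 mV correction

# At every power-up the chip loads nvm_trim_code into the trim DAC,
# so the circuit behaves consistently despite manufacturing variation.
```

With a rewritable NVM the code can simply be overwritten to compensate for aging drift; with OTP, the same effect is achieved by writing each new code to a fresh slot, as in the append-only scheme described earlier in the automotive section.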


Kilopass Technology Inc. is an intellectual property company with both significant market forces and compelling technology trends driving its success.  Propelling its business success are the proliferation of multimedia content on handheld devices, all requiring secure DRM key storage; the increasing number of computing devices incorporated into smart devices to handle their expanding functionality, all needing NVM storage; and the expanding number of sensors migrating onto portable smart devices, all needing NVM storage for mixed-signal trimming data. The compelling technical trend driving the company forward is the relentless progression of Moore’s Law, producing a new generation of CMOS process technology every 18 months. With each new generation, the storage capacity of Kilopass IP grows, its reliability improves, its power consumption decreases, and its ability to thwart tampering, even by the most invasive techniques, improves. Kilopass is well positioned for continued growth and success. End of article.

Ms. Linh Hong is VP of marketing at Kilopass responsible for marketing Kilopass’ solutions globally. With 13 years of  semiconductor industry experience – primarily focused on logic NVM IP, high-speed SERDES IP and broadband communication ASICs – Linh served for three years in various director and management positions in field applications engineering and applications marketing at Kilopass before assuming her current role in April 2009. Prior to joining Kilopass, she was a design consultant and design manager at LSI Logic, where she also served in various design and marketing engineering functions. She began her career as a component engineer at Sun Microsystems. Linh holds a BS degree with honors in physics, and an MSEE degree in electrical engineering, both from University of California, Davis.

Kilopass Self Description

Kilopass Technology is expanding the horizons of embedded non-volatile memory to create new cost savings and design opportunities for today’s semiconductor industry.

The leader in embedded NVM intellectual property, Kilopass removes long-standing challenges to NVM integration across a wide range of markets, applications and SOC designs.  Its patented technologies and expanding set of one-time programmable (OTP) and multi-time programmable NVM solutions have boundless capacity to scale to advanced CMOS process geometries, are portable to every major foundry and integrated device manufacturer (IDM), and meet market demands for increased integration, higher densities, lower cost, better reliability and improved security.

Trusted by today’s most trusted brands, Kilopass technology has been integrated by over 100 customers, with more than 2 billion units shipped in over 300 industrial, automotive, consumer electronics, mobile and analog & mixed signal chip designs.  The company’s solutions are currently integrated into 20 million set-top boxes, 50 million DVD chip sets, 100 million Wi-Fi modules, and 500 million FM tuners.

The EDA WEEKLY writer also thanks Ms. Nanette Collins and Mr. Jonah McLeod for their assistance.



Companies wishing to submit previously unpublished articles of possible interest to EDA WEEKLY readers are invited to send them to Dr. Henke at http://www.henkeassociates.net, along with the author’s name, company, and all contact information. No guarantee can be made that a contributed article will be published, and if an article is accepted for publication, no guarantee on timing can be given.

About the writer of the EDA WEEKLY:

Since 1996, Dr. Russ Henke has been active as president of HENKE ASSOCIATES, a San Francisco Bay Area high-tech business & management consulting firm. The number of client companies served by Henke Associates during those years now numbers close to fifty. Engagement lengths have varied from a few weeks up to ten years and beyond.

During his previous corporate career, Henke operated sequentially on "both sides" of MCAE/MCAD and EDA, as a user and as a vendor. He's a veteran corporate executive from Cincinnati Milacron (Research Scientist – Oakley, OH), SDRC (President & COO – Fairfax, OH & Milford, OH), Schlumberger Applicon (Executive VP – Burlington, MA), Gould Electronics Imaging & Graphics (President & General Manager – San Jose, CA), ATP (Chairman and CEO – Campbell, CA), and Mentor Graphics Corporation (VP & General Manager – PCB Division San Jose, CA & Professional Services Division – Wilsonville, OR). 

Henke is a Fellow of the Society of Manufacturing Engineers (SME) and served on the SME International Board of Directors. Henke was also a board member of SDRC, PDA, ATP, and the MacNeal Schwendler Corporation, and he currently serves on the board of Stottler Henke Associates, Inc. (San Mateo, CA).  He also serves as VP Business Development of Stottler Henke, focused on commercial applications of artificial intelligence.

In addition, Henke is a member of the IEEE and a Life Fellow of ASME International. In April 2006, Dr. Henke received the 2006 Lifetime Achievement Award from the CAD Society, presented by CAD Society president Jeff Rowe at COFES2006 in Scottsdale, AZ. In February 2007, Henke became affiliated with Cyon Research's select group of experts on business and technology issues as a Senior Analyst. This Cyon Research connection aids and supplements Henke's ongoing, independent consulting practice (HENKE ASSOCIATES). Dr. Henke is also a contributing editor of the EDACafé EDA WEEKLY, and he has published EDA WEEKLY articles every four weeks since November 2009; all URLs available.

Since May 2003 HENKE ASSOCIATES has also published more than 100 independent COMMENTARY articles on MCAD, PLM, EDA and Electronics IP on IBSystems' MCADCafé and EDACafé; most URLs available.

Information on HENKE ASSOCIATES is available at http://www.henkeassociates.net.

March 31, 2012 will mark the 16th Anniversary of the founding of HENKE ASSOCIATES.


You can find the full EDACafe event calendar here.

To read more news, click here.

-- Russ Henke, EDACafe.com Contributing Editor.