In October 1991 a storm stronger than any in recorded history hit the coast off Gloucester, Massachusetts. This "Perfect Storm" - so called because it was three storms combined into one - created an almost apocalyptic situation in the Atlantic Ocean, where boats encountered 100-foot waves. It was the basis of a film of the same name. What is happening today on the EDA sea may not qualify as a perfect storm, but a comparable convergence is underway: a need, a technology and standards.
First, the need. Much, perhaps too much, has been written about the dramatic increase in the complexity of ICs by any available metric. This situation has ramifications for all aspects of the product lifecycle: design, verification, fabrication and test. EDA tools now enable design teams to generate designs that cannot be truly verified in anything like a reasonable time. Many articles cite the fact that 70% of design time is spent in verification. The traditional black-box approach to verification is to apply a set of input test vectors to a design simulation and compare the actual output against the intended output. The test bench consists of a combination of cases that reflect the specification, worry cases and random inputs. Unfortunately, even rather simple designs can require an enormous number of test vectors, and therefore simulation time, to test all possible combinations of inputs exhaustively. Consider a simple comparator with two 32-bit inputs, which would require 2^64 vectors to evaluate exhaustively. Eliminating redundant and illegal inputs from test benches to reduce the number of vectors is itself a major challenge. A second shortcoming of simulation is that internal errors may never propagate to the outputs, making them unobservable. Even if the effects do reach the outputs, it may be very difficult to trace them backwards to their cause. Lastly, certain types of implementation errors are difficult to stimulate with functional tests. These are referred to as verification hot spots. There is a lack of control, i.e. an inability to activate, stimulate or sensitize a specific point within the design. The consequence of these observability and controllability limitations is that a chip may require an expensive (~$1M) re-spin or, worse, a field recall due to design flaws undetected during the verification stage. According to a well-publicized study by Collett International Research, Inc., 70% of designs requiring multiple re-spins have logic flaws.
Another issue with simulation is coverage. How do we know that the verification process is complete? There is both functional (specification) coverage and implementation coverage. Metrics for the latter include toggle coverage, code coverage (line, branch and sub-expression coverage), and finite state machine (FSM) coverage (state, arc and path). While easy to measure, these metrics are difficult to correlate with coverage of functional intent. Functional coverage metrics, including coverage points, are designed to check that the verification tests have exercised specific key functionality - for example, executing all instruction types or transmitting and receiving all packet types. However, functional coverage on the chip inputs and outputs may miss the exercise of internal structures like FIFOs.
Next, the technology: Assertion-Based Verification (ABV). Assertions capture the design intent. They specify both legal and illegal behavior of a circuit structure. The basic ABV method compares the implementation of a design against its specified assertions by embedding assertions in the design and having them monitor design activity. Assertions are placed wherever protocols may be misused, assumptions violated, or design intent incorrectly implemented. Whenever an assertion is violated, it is reported immediately, without the problem having to propagate to the design's outputs. One can also insert protocol assertion monitors to check for violations of the corresponding interface protocols. Generally, there are two kinds of assertions. Concurrent assertions state that a given property must always be true throughout a simulation, while procedural (sometimes called temporal) assertions apply only for a limited time, or under specified conditions.
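As a hedged sketch of the idea, consider an assertion embedded next to a FIFO controller; the module, signal and message names here are illustrative, not from any particular product:

```systemverilog
// Hypothetical sketch: an assertion embedded alongside a FIFO design
// to catch a protocol misuse (push while full) at its source rather
// than waiting for corruption to reach the chip outputs.
module fifo_assertions (
  input logic clk,
  input logic rst_n,
  input logic push,
  input logic full
);
  // Design intent: the environment must never push into a full FIFO.
  // A violation is reported immediately, at the point of misuse.
  assert property (@(posedge clk) disable iff (!rst_n) !(push && full))
    else $error("push asserted while FIFO is full");
endmodule
```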
ABV can also be combined with formal verification techniques. Formal verification uses mathematical techniques to prove some assertions true (given a set of assumptions) and other assertions false (by discovering counterexamples). A proof means that static formal verification has exhaustively explored all possible behavior with respect to the assertion and has determined it cannot be violated. A counterexample shows the circumstances under which the assertion is not satisfied. One can also produce a bounded proof as measured by a proof radius. The proof radius of an assertion is the number of cycles of design operation from an initial state that are covered by exhaustive formal analysis. Within the proof radius, the formal analysis guarantees that the assertion cannot be violated by any legal input sequence.
Proof radius is an assertion-based coverage metric that suggests whether the design needs more assertions. Assertion density measures the number of assertions of each type in each module. Minimum sequential distance measures the minimum number of levels of sequential logic from a given register to any assertion. Together, these metrics give the development team guidance about where to add more assertions.
0-In Design Automation identifies four fundamental technologies and methodologies (“pillars”) that make the ABV process successful.
- Automatic Assertion Check
A set of pre-defined assertion rules is applied to the RTL code of the design to target common netlist problems and design flaws. Unlike a syntax and semantic checker such as lint, this step synthesizes the design and employs formal methods to analyze its internal structures. The process is automatic; no input is required from users.
- Static Formal Verification
Designers add assertions to their code to capture design intent and to annotate their assumptions. Exhaustive formal verification analyzes the assertions in a design determining whether they are true, false or indeterminate. No testbench or simulation is required. This approach is typically employed at the block level.
- Simulation with Assertions
User-specified assertions are simulated with the design and the test environment. The assertions monitor activities inside the design and on the interfaces and report immediately on any violation. At the same time, the assertions collect coverage and statistics data.
- Dynamic Formal Verification
This runs concurrently with simulation and leverages all simulation vectors. It identifies states “close” to the assertions and performs local formal analysis around them. Dynamic formal verification takes advantage of the design knowledge encapsulated in the simulation vectors to explore new behaviors in the design, instead of simply duplicating activities already performed by simulation.
The benefits of ABV include:
- Increasing observability of bugs
- Increasing controllability
- Facilitating the diagnosis of bugs by identifying bugs at or near their cause
- Finding bugs earlier in the design cycle (improving time to market)
- Uncovering bugs that would have otherwise remained undetected (more bugs)
- Enabling the use of formal and semi-formal verification techniques
- Preventing wasted simulation cycles
- Improving verification productivity
- Lowering the risk of bugs escaping detection
- Facilitating the integration of work from multiple designers
- Supporting design reuse and third-party IP
In a paper presented at DAC in 1996, DEC revealed that a third of all bugs found in its Alpha microprocessor were detected by assertions.
Third, the standards. Three standards have emerged that are relevant to ABV, namely, the Open Verification Library (OVL), Property Specification Language (PSL), and SystemVerilog Assertions (SVA). Standards lower the risk that resources devoted to developing assertions and assertion-related tools will subsequently be made obsolete, and they help create a market for commercial implementations. Users can have confidence that once assertions are inserted into a design they can be used by many tools and remain portable for design reuse. The three standards are described below, followed by a section on Accellera, the organization responsible for driving these standards.
Open Verification Library (OVL)
The OVL is a freely downloadable open source library available in two versions. One contains Verilog modules, and the other contains VHDL modules. These modules are used to specify properties of an HDL design to be verified, either in simulation or using formal or semi-formal methods. The modules are instantiated as assertion monitors that will flag violations of the specified property. These modules provide a standard interface for multiple design verification tools, thus enabling a more seamless flow. OVL defines approximately 30 checkers.
Every assertion library definition contains a severity_level, options, and a message. As an example, consider assert_always. The assert_always assertion continuously monitors test_expr at every positive edge of the triggering event or clock clk. It contends that the specified test_expr will always evaluate TRUE. If test_expr evaluates to FALSE, the assertion fires, indicating that an error condition has been detected in the code.
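A hedged sketch of an assert_always instantiation, with parameters in the severity_level / options / message order described above; the macro values, instance name and signals are illustrative:

```verilog
// Illustrative OVL checker instantiation inside a design's RTL.
// Fires at any rising edge of clk (with reset_n high) on which
// count exceeds 100.
assert_always #(
  `OVL_ERROR,            // severity_level
  `OVL_ASSERT,           // options
  "count out of range"   // message
) valid_count (
  .clk       (clk),
  .reset_n   (reset_n),
  .test_expr (count <= 8'd100)
);
```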
Property Specification Language (PSL)
A PSL specification consists of assertions regarding properties of a design under a set of assumptions. A property is built from Boolean expressions, which describe behavior over one cycle, sequential expressions, which describe multi-cycle behavior, and temporal operators, which describe relations over time between Boolean expressions and sequences.
assertion: A statement that a given property is required to hold and a directive to verification tools to verify that it does hold.
assumption: A statement that the design is constrained by the given property and a directive to verification tools to consider only paths on which the given property holds.
constraint: A condition (usually on the input signals) that limits the set of behaviors to be considered.
property: A collection of logical and temporal relationships between and among subordinate Boolean expressions, sequential expressions, and other properties that in aggregate represent a set of behaviors.
PSL consists of four layers: Boolean, temporal, verification, and modeling. The Boolean layer is used to build expressions composed of variables within the design model that are, in turn, used by the other layers. Boolean layer expressions are evaluated in a single evaluation cycle. The temporal layer is used to describe properties of the design. It is known as the temporal layer because, in addition to simple properties, it can also describe properties involving complex temporal relations between signals. Temporal expressions (always, never, next, until, before) are evaluated over a series of evaluation cycles. Sugar Extended Regular Expressions (SEREs) provide an easy way to string together sequences of Boolean expressions over time. The verification layer is used to tell the verification tools what to do (assume, assert, and so on) with the properties described by the temporal layer. The modeling layer is used to model the behavior of design inputs (for tools, such as formal verification tools, which do not use test cases) and to model auxiliary hardware that is not part of the design but is needed for verification. PSL comes in four flavors: one for each of the hardware description languages SystemVerilog, Verilog, VHDL, and GDL.
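A sketch of how the layers combine in practice, written in PSL's Verilog flavor as pragma comments; the property name and signals are illustrative. The Boolean layer supplies full, write_en, req and ack; the temporal layer contributes always and next; the verification layer issues the assert directives.

```verilog
// A full buffer must never be written.
// psl property no_overflow = always (full -> !write_en) @(posedge clk);
// psl assert no_overflow;

// Every request must be acknowledged on the following clock tick.
// psl assert always (req -> next ack) @(posedge clk);
```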
SystemVerilog Assertions (SVA)
According to SystemVerilog.org “In SystemVerilog, assertion information is built into the language, eliminating the need for the special modules, pragmas or PLI calls used in traditional Verilog. Embedded assertions capture real “design intent” in terms of functionality and constraints, and are verified by simulation before application of any formal or dynamic verification methods. This approach helps to avoid recoding errors, increase test accuracy, simplify the testbench, and enable test reuse. The full controllability and observability of internal circuit nodes afforded by ABV can significantly reduce design debug time. For example, IBM reports a 50% reduction in debug time using this methodology. Very importantly, this controllability and observability will spur the innovation of advanced design and verification tools, with one area of interest being assertion-based synthesis.”
Assertions are described in Section 17 of the SystemVerilog 3.1 Language Reference Manual (LRM). The following paragraphs give some idea of the language.
There are two kinds of assertions: concurrent and immediate. The immediate assertion statement is a test of an expression performed when the statement is executed in procedural code. The expression is non-temporal and treated as a condition, as in an “if” statement. The statement specifies the actions to be taken depending upon the success or failure of the assertion.
assert (expression) [[pass_statement] else fail_statement]
Concurrent assertions describe behavior that spans over time. The expression in the immediate formulation above would be replaced by a sequential expression or property. Concurrent assertions are based on clock semantics and use sampled values of variables. The timing model employed in a concurrent assertion specification is based on clock ticks. This way, a predictable result can be obtained from the evaluation, regardless of the simulator's internal mechanism of ordering events and evaluating events.
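A hedged sketch contrasting the two kinds of assertion; the module, signal and message names are illustrative:

```systemverilog
// The same intent written as an immediate assertion (procedural,
// evaluated when the statement executes) and as a concurrent
// assertion (clocked, using sampled values).
module arb_check (input logic clk, req, gnt);

  // Immediate assertion inside procedural code.
  always @(posedge clk) begin
    if (req)
      assert (gnt) else $error("gnt not asserted with req");
  end

  // Concurrent assertion: evaluated on every clock tick using sampled
  // values, independent of the simulator's internal event ordering.
  assert property (@(posedge clk) req |-> gnt)
    else $error("gnt not asserted with req");
endmodule
```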
The failure of an assertion has a severity associated with it:
$fatal terminates the simulation with an error code.
$error is a run-time error, the default.
$warning is a run-time warning, which can be suppressed in a tool-specific manner.
$info indicates that the assertion failure carries no specific severity.
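These severity tasks are called from an assertion's action block; a minimal sketch, with illustrative signal names:

```systemverilog
module severity_example (input logic clk, rd_en, wr_en, busy, start);

  // A failure here is fatal: simultaneous read and write must never
  // occur, so simulation terminates with an error code.
  assert property (@(posedge clk) !(rd_en && wr_en))
    else $fatal(1, "simultaneous read and write");

  // A failure here is only logged as a suppressible warning.
  assert property (@(posedge clk) busy |-> !start)
    else $warning("start asserted while busy");
endmodule
```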
A sequence expression describes one or more sequences by using regular expressions. Such a regular expression can concisely specify a set of zero, finitely many, or infinitely many sequences that satisfy the sequence expression. The most basic sequential expression is “a followed by b n delay cycles later” or
a ##n b where a and b are Boolean expressions or sequences
If n = 0, b starts on the same cycle that a completes. If n = 1, b starts on the cycle after a completes.
The sequence req ##1 gnt ##1 !req specifies that req shall be true on the current clock tick, gnt shall be true on the first subsequent tick, and req shall be false on the next clock tick after that. The sequence req ##2 gnt specifies that req shall be true on the current clock tick, and gnt shall be true on the second subsequent clock tick.
Sequences can be logically combined through the operators and, or, and intersect. The first_match operator matches only the first of possibly multiple matches for an evaluation attempt of a sequence expression, allowing all subsequent matches to be discarded from consideration. There are ways to specify the minimum and maximum number of repetitions of a sequence expression and the number of cycles between repetitions. There are also functions ($rose, $fell and $stable) to detect changes in values between two adjacent clock ticks.
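The req/gnt handshake above can be packaged into a named sequence and checked by a property; a hedged sketch, with illustrative names:

```systemverilog
module handshake_check (input logic clk, req, gnt);

  // req, then gnt one cycle later, then req deasserted one cycle after.
  sequence s_handshake;
    req ##1 gnt ##1 !req;
  endsequence

  // Whenever req rises, the handshake sequence must follow.
  property p_handshake;
    @(posedge clk) $rose(req) |-> s_handshake;
  endproperty

  assert property (p_handshake)
    else $error("handshake protocol violated");
endmodule
```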
A property defines a behavior of the design. A property can be used for verification as an assumption, a checker, or a coverage specification. To use the behavior for verification, an assert or cover statement must be used; a property declaration by itself does not produce any result. The result of property evaluation is either true or false. There are two kinds of property: sequence and implication. The implication construct allows a user to monitor sequences based on satisfying some criteria. The most common use is to attach a precondition to a sequence, so that the sequence is evaluated only when the condition succeeds. An assert statement specifies the actions to occur depending upon the truth or falsity of a property. The cover statement is used to gather information about the evaluation and report the results at the end of simulation.
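A hedged sketch of one implication property used both ways, as a checker and as a coverage specification; the signals and the 8-cycle bound are illustrative:

```systemverilog
module bus_check (input logic clk, start, done);

  // Precondition: start implies done within 1 to 8 cycles.
  property p_done;
    @(posedge clk) start |-> ##[1:8] done;
  endproperty

  // Checker: report any transaction that misses the deadline.
  assert property (p_done) else $error("done missed deadline");

  // Coverage: count attempts/successes, reported at end of simulation.
  cover property (p_done);
endmodule
```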
The results of a cover statement for a property contain:
Number of times attempted
Number of times succeeded
Number of times failed
Number of times succeeded because of vacuity
Each attempt with an attemptID and time
Each success/failure with an attemptID and time
Accellera was formed in 2000 through the unification of Open Verilog International and VHDL International to identify new standards, develop standards and formats, and foster the adoption of new methodologies. Accellera's mission is to drive worldwide development and use of standards required by systems, semiconductor and design tool companies, standards that enhance a language-based design automation process.
Accellera has a mature standardization process:
- Level 0: Identify needs through users and vendors
- Level 1: User- and vendor-supported incubation
- Level 2: Refinement and solidification through usage and tool implementation
- Level 3: Distribution of standards through the IEEE to the IEC (International Electrotechnical Commission)
In April 2004 Accellera announced it had become a Corporate Member of the IEEE Standards Association (IEEE-SA) in order to take part in and shape the direction of technology and its marketplace application in the IEEE-SA standards development environment at a corporate level. In May, Accellera announced it had offered the copyright of the language reference manual (LRM) for the SystemVerilog 3.1a hardware description and verification language (HDVL) to the IEEE-SA's Corporate Advisory Group. This group provides a platform for developers to produce market-relevant, full-consensus IEEE standards and will help guide standards through rapid accreditation. Standards developed under the corporate initiative occur through entity-based working groups in which each member has one vote. This industry-oriented method allows standards to be formed in one to two years, depending on participant commitment and the use of IEEE support services, which include administrative and project management, editing, meeting planning and marketing.
In July 2004 the IEEE announced that it had formed a working group within the IEEE-SA Corporate Initiative that intends to unify two Verilog standards efforts: an update of IEEE P1364, "Standard for Verilog Hardware Description Language," and the creation of IEEE P1800, "Standard for SystemVerilog Unified Hardware Design, Specification and Verification Language." IEEE P1800 will specify SystemVerilog, a broad standard that extends IEEE 1364 to aid the design and verification of large-gate-count, IP-based, bus-intensive chips.
In April 2002 Accellera's Formal Verification Technical Committee selected Sugar from IBM's Haifa Research Laboratory (over Verisity's e, Motorola's CBV, and Intel's ForSpec) as the basis for a standard property language. On May 29, 2003, the Accellera Board of Directors unanimously approved the Property Specification Language, or PSL (the official name for Sugar), as an official Accellera standard. On August 2, 2004 Accellera announced that its Board of Directors had approved PSL v1.1 as a standard and that it had begun the IEEE standardization process for PSL with the IEEE Corporate Advisory Group (CAG). The PSL 1.1 effort focused on refinement of PSL 1.01 and on alignment of syntax and semantics between PSL and SystemVerilog Assertions (SVA) where possible.
I interviewed Dennis Brophy, Chairman of Accellera. He sees an obvious need for, and clear benefits from, assertion-based verification. However, he believes that standards are the catalyst. Technology producers and consumers do not want to risk commitment to a particular implementation without the footing of a language standard. He believes that standards will expand and broaden the market for the available technology. We are really at the beginning.
I asked him about the future evolution of these standards. He observes that the IEEE has been project oriented, in that its work is seen as largely complete once a standard has been ratified. Permanent working groups of individual expert volunteers have not been successful. He hopes that they have entered new territory with corporate-based entity standards. Corporations can be a source of staff and/or funding to continue the evolution of a standard. Accellera will be participating in and also monitoring IEEE activities. If the IEEE does not wish to extend the specifications to meet what Accellera sees as the needs of its members, Accellera reserves the right to extend its own works. However, Dennis stressed that his organization is collaborating with the IEEE.
OVL has not gone out for ratification by Accellera. The initial content was based upon a donation by Verplex, now owned by Cadence. This was a set of elements based upon Verilog. Some effort was made to generate an equivalent VHDL version, but the result was not warmly received by the VHDL community. The focus then shifted to providing a library of checkers based upon PSL and SVA in the hope of significant performance improvements. Many companies view the generation of checker libraries as a competitive advantage.
The following sections cover two representative commercial implementations of ABV.
Real Intent, Inc.
In April 2000 Real Intent announced $4 million in private funding to commercialize its verification technology. The company had its first shipment in July 2000. In March 2002 the company and Co-Design Automation donated the SUPERLOG Design Assertion Subset language to Accellera.
Real Intent offers formal assertion-based verification (ABV) products under the family name Verix. Verix has been deployed at over 30 major IC design houses. The Verix product family is made up of Verix-IIV, Verix-CIV and Verix-EIV.
Verix-IIV (Implied Intent Verification) automatically extracts a comprehensive set of design assertions from an RTL design and formally verifies them. Since it does not need any test vectors or user-specified assertions, Verix-IIV can be used as the first verification tool in any design flow. Verix-IIV catches corner-case design bugs that can easily be missed in simulation. The system supports Undesired Sequence Check (USC) to ensure that undesired behaviors are not present in the design and Desired Sequence Check (DSC) to ensure that desired behaviors are present in the design.
Verix-CIV (Clock Intent Verification) automatically detects synchronization errors that are commonly associated with multiple-clock-domain designs. These problems include:
- Reset synchronization
- Glitch elimination
- Inadequate hold time
- Loss of correlation
Verix-EIV (Expressed Intent Verification) empowers users to verify complex design behaviors by writing their own design assertions. These user assertions can verify complex static and temporal design behaviors and detect deep errors without the need for testbenches and test vectors. User assertions are written in standard assertion formats such as SystemVerilog, PSL or OVL.
Verix supports the Verilog and the VHDL languages as well as mixed mode designs that allow users to process designs comprised of blocks in both languages.
Verix employs a combination of highly-optimized formal engines and patented automatic design partitioning.
I spoke with Pranav Ashar, the relatively new CTO at Real Intent. He was previously with NEC Research Labs in Princeton, NJ. He explained that a number of complex bugs really have simple symptoms that can be automatically extracted from the HDL design. This is the basis of Implied Intent Verification. Real Intent's approach is bottom-up. As soon as a designer has a block of HDL code, he can verify it. These verification results can be reused later at the next higher level of hierarchy. Ultimately, full-chip formal verification of multi-million-gate designs is accomplished by applying this process across the hierarchy. The challenge is to achieve sufficiently high throughput. This is accomplished through the hierarchy capability in conjunction with the formal verification engines. He says competitors use a top-down approach that requires the creation of high-level specifications.
He sees several changes in the last couple of years which bode well for ABV. First is the increasing realization by designers that traditional simulations are no longer giving them sufficient confidence that their designs are without serious bugs. Second is the standardization of assertion languages (PSL, SystemVerilog). Third is the advance in formal verification technology. Consequently, ABV is just now being accepted by mainstream designers.
0-In Design Automation
0-In, a privately held company, is based in San Jose, Calif. The company was founded in 1996 by Curtis Widdoes, who had previously founded Logic Modeling Systems Inc. and Valid Logic Systems Inc.; by Steven D. White, who was vice president of design verification at Synopsys; and by Richard C. Ho, whose doctoral thesis pioneered the practical application of formal methods for functional verification of digital circuits. The firm has had three nearly equal rounds of venture capital funding, in 1998, 2000 and 2001, for a total of $22 million. The company first released a product in 2000.
On January 16th 0-In announced an agreement whereby Cadence would integrate and license 0-In's library of assertion checkers, protocol monitors, assertion synthesis, and assertion management technology with Cadence's Incisive functional verification platform. On June 7th Mentor Graphics announced an agreement to acquire 0-In Design Automation (41 employees) for $50 million in common stock. Robert Hum, Mentor's VP of the design verification and test division, said that “Customers require a clear and unified approach to assertion-based verification. With the inclusion of 0-In's assertion and formal verification technology, as well as products under development, Mentor Graphics' Scalable Verification platform will become a leader in assertion-based verification." Mentor Graphics says it will continue to support 0-In's existing line of products and customers and will expand distribution and support of these products with Mentor Graphics' global sales and support organization. 0-In products will become part of the Mentor Graphics Scalable Verification platform, offering a complete range of integrated verification technologies including simulation, assertions, formal verification, emulation, and hardware/software co-verification.
Mentor has been a member of 0-In's Check-In Partner Program since January 2003. Synopsys, Cadence and Verisity are also members. In a November 2003 company newsletter 0-In claimed 14,000 assertion simulation licenses and 2,000 formal verification licenses. The company lists as customers many of the top electronics firms, including Cisco, AMD, Sun, Samsung, Fujitsu, HP and Lucent Technologies.
The 0-In product portfolio contains the following offerings.
Archer-CDV (Coverage-Driven Verification) is a system-level verification tool built around critical coverage points that enables coverage-driven verification methodologies based on simulation.
Archer-SF (Static Formal Verification) is a block-level assertion-based verification tool incorporating formal techniques for functional verification without the need for simulation. Archer-SF uses static formal verification technology to exhaustively verify properties of the design specified using assertions. Static verification can start early and be performed at any level of the design hierarchy. Advanced static analysis technology automatically detects common design problems that would otherwise go undetected, such as clock domain crossing (CDC) violations, RTL coding errors, simulation-to-synthesis mismatches and verification worry cases. Data-dependent issues are automatically promoted into the simulation environment.
Archer-ABV (Assertion Based Verification) seamlessly combines static and dynamic verification technologies to create a unique technology targeting the toughest bugs.
CheckerWare is a library of verification IP offering a rich set of assertion checkers for common RTL structures such as interfaces, datapaths and arbitration logic, and for complex standard interfaces such as DDR-SDRAM, AMBA, and PCI-Express. In addition to assertion checking, each element contains detailed structural coverage metrics, complete constraints for formal verification and control infrastructure.
The Archer Verification system provides full interoperability through assertion synthesis. This capability works with any mix of verification engines and standard assertion formats including Property Specification Language (PSL), System Verilog Assertions (SVA), and Open Verification Library (OVL) in addition to CheckerWare.
According to a May press release North American list prices for some of these components are: Archer-CDV $50,000; Archer-SF $60,000; and Archer-ABV $120,000.
I spoke with Steve White, a founder of 0-In and now a Mentor employee, and with Robert Hum of Mentor Graphics. Steve asserted that his group's competitive advantage lies in the breadth and depth of their solution sets plus support for all tools and all languages. He believes that the combination of simulation, static verification tools and new technologies (assertion automation, verification hot spots, a total coverage model, clock domain crossing) will lead to verification closure, resulting in reduced risk and fewer verification dollars spent per bug.
-- Jack Horgan, EDACafe.com Contributing Editor.