March 23, 2009
The Aart of Analogy Revisited
| by Peggy Aycinena - Contributing Editor
Despite turmoil and tempest, technical conferences continue unabated in Silicon Valley. In the last 4 weeks alone, DVCon, ISQED, Multicore-Expo, and SNUG have played themselves out in San Jose and Santa Clara. Over the next several weeks, more events will be here, as well. It's true, attendance is somewhat down due to travel budget restrictions, but the presentations, keynotes and panels continue nonetheless, as do the proceedings, conference bags, food and beverages.
No need to worry that these events are happening in a vacuum, however. Every speaker I’ve heard at every conference over the last 4 weeks has felt obliged to mention, no matter the nature of the talk, that the topic at hand is being affected by the global downturn. These tumultuous economic times are thoroughly woven into the content.
And while the ubiquitous empty office space all over Silicon Valley continues to attest to the lingering effects of the last downturn, circa 2001, National Semiconductor’s recent announcement that 26 percent of its workforce is being laid off summarizes the current downturn, circa 2009. Unemployment in Silicon Valley has now hit 10 percent, proving the current downturn is worse than the last.
But don’t let the news overwhelm. By all reports at all conferences of late, progress continues with people determined to stay focused on the future. At the EDA: Dead or Alive? panel at DVCon, the seven panelists were unequivocal: EDA is more than alive. It’s the single most important pre-requisite for progress in electronics. Unless the electronics industry itself dies, which it will not, EDA will never be dead.
In the weeks after the DVCon panel on February 25th, I spoke individually at length with all seven participants, following up on various topics discussed during the panel:
* Atrenta’s Ajoy Bose:
* Javelin Design’s Diana Feng Raggett:
* Synopsys’ Gary Meyers:
* EVE’s Lauro Rizzatti:
* Berkeley Design Automation’s Ravi Subramanian:
* SpringSoft’s Scott Sandler:
* Calypto’s Tom Sandoval:
Beyond these conversations, there have been a host of other topics tackled at the various conference venues in the last 4 weeks.
At ISQED on March 17th, Conference Chair and Synopsys President Chi-Foon Chan gave a keynote outlining the benefits and challenges inherent to EDA, with emphasis on digital design, packaging, and the massive ROI available to those who participate. Magma CEO Rajeev Madhavan followed with a keynote outlining the benefits and challenges inherent in analog and mixed-signal design, and the need to further enable process migration in analog. The final keynote came from Mentor Graphics’ VP and GM Simon Bloch, who gave a mini-tutorial on the definitions and importance of ESL, TLM, and moving to higher levels of abstraction. The three keynotes, taken together, were an effective summary of the major initiatives underway at the biggest companies in EDA.
The evening of March 17th at ISQED, EDA Tech Forum’s Paul Dempsey moderated a panel on another critical issue in EDA, design for manufacturing. Dempsey’s query, DFM: Insurance Policy or Secret Weapon?, led to a lengthy after-dinner give and take between the panelists: Chartered’s Walter Ng, Global Foundries’ Luigi Capodieci, Virage Logic’s Yervant Zorian, Atmel’s Steve Schumann, and Mentor Graphics’ Juan Rey.
Their conclusions were not surprising. DFM is important, the ROI can be quantified for the skeptics among you, and there are tools in place today that: a) engineers embrace, b) foundries endorse, and c) management accepts. The fact that people in the audience seemed a bit less than sold on the whole idea was due, per the panel, to a lack of education and exposure on the part of the audience. The panelists argued DFM is definitely here and, more importantly, is transparent to the engineer. Use it or suffer the consequences. Walter Ng emphasized, however, that DFM is still sorely lacking standards.
While ISQED was playing itself out at the DoubleTree in San Jose, the Multicore-Expo was underway up the road at the Santa Clara Convention Center. Conference Chair Markus Levy said upwards of 700 people were registered. Conference content was complex, which was not surprising given the fluid and ubiquitous nature of the evolution of multicore devices today. Even the definitions are still up for grabs, so get a pencil and create a grid with 2 columns and 15 rows. Label the two columns Hardware and Software. Label the 15 rows: Core, Pipe, Multicore, Multipipe, Multiprocessor, AMP, SMP, Multitask, Multithread, Load balancing, Granular parallel, Coarse parallel, Homogeneous, Heterogeneous, and Parallel code.
Now call Dave Stewart, CEO of Critical Blue, and Tony King-Smith, VP of Marketing at Imagination Technologies. Engage them in conversation – Stewart about software, and King-Smith about hardware. Jot down what they have to say in the appropriate cells of your grid. Then, accept that the jury is still completely out with respect to the definitions, who the major players are in multicore, and whether today’s multicore hardware is being fully utilized, let alone tomorrow’s hardware, and whether or not today’s legacy code can be parallelized, let alone tomorrow’s new code. Now you’re
ready to attend Multicore-Expo 2010 – that is, if you sweep the columns and rows into a third dimension that reflects a host of different application spaces. Multicore and everything that it involves/implies is the single most daunting paradigm facing the industry today – no matter how you define the industry.
Meanwhile, in case you weren’t feeling manic enough March 16th to 19th trying to be in two places at once – ISQED and Multicore-Expo – there was more. Silicon Valley SNUG 2009, the Synopsys Users Group, was also scheduled into the March 16th-to-19th timeslot. Don’t these conference organizers talk? Aren’t they sufficiently worried about attendance at the various conferences to not double or triple book? Who knows? These things are often a mystery to me.
Happily, SNUG offered very little access to the Press, with only an extremely limited number of sessions available during the 4-day event. Members of the Press weren’t even given a conference bag, which was sad. This one was a beauty. [Note to Synopsys tools users: Come to SNUG. The bags alone are worth the trip.] Meanwhile, sans bag, I attended the opening keynote on Monday, March 16th, given by Synopsys CEO Aart de Geus. His slide detailing the factors that have led to the current global economic collapse was pretty darn interesting, particularly if you’re more electrical engineer than economist.
Finally – on my list of highlights from the last 4 weeks of conferences, I include the evening panel at DVCon on cloud computing. Moderator Harry Gries deserves high praise for orchestrating a really great discussion. I learned a lot, the people in attendance are clearly heavily engaged in the concepts swirling around cloud computing, and I came away with: a) new respect for Amazon for providing the compute power to keep electronic design moving forward, and b) renewed enthusiasm for tracking the developments of “a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the
Internet.” (Thanks, Wikipedia, for that last bit.)
From my point of view, when you combine Cloud Computing, Multicore everything, and Open Source, you pretty much sum up the future of the Little EDA Industry that Could.
The Aart of Analogy revisited …
Synopsys CEO, Dr. Aart de Geus, is the 2009 Phil Kaufman Award recipient. There have been 15 Kaufman Award winners to date, but quite simply, Aart stands in a class by himself. Show me any other leader in the industry who has been at the helm of an organization for 20+ years, has grown the company from startup to perpetually dominant player, and whose name and flagship product – de Geus and Design Compiler – are effectively synonymous with electronic design automation.
The first time I conducted an in-depth interview with Aart de Geus was the week after the 2000 Presidential Election. Astoundingly, that original interview still lives:
An enormous amount of water has passed under the world’s bridges since that time – politically, militarily, economically, and technically. Yet through it all, Aart has continued to lead Synopsys. To this day, no one doubts de Geus is a tough competitor, no one doubts he is a superb musician, and no one doubts that his community spirit is unmatched in the industry. Those things are well known about Aart. What is less well-known is his great sense of humor and his ability to remain both candid and on message simultaneously.
It was an honor to sit down recently with Aart and chat at length once again. Please get your cup of coffee, click on the “print article” button up there on the right, and enjoy a thoroughly unscripted chat with the CEO of the largest company in EDA.
A conversation with Aart de Geus …
* Peggy – Would you have ever believed at the outset of your career that we would be at 22 nanometers today?
* Aart [laughing] – When I started my career, I didn’t even know what a nanometer was!
In fact, I’ll tell you a story. In 1978 or ‘79, I attended a conference in Switzerland of leaders in the field of electronics, or microelectronics. They all agreed on 2 things. Number 1, electronics was going to be a big deal and would move forward for many years to come. Number 2, one micron was the hardest barrier that we would never move across. And [laughing again], those same people who made the predictions are the ones who made 22 nanometers happen!
Of course, photo-lithography was what broke through the barrier – with enthusiasm and verve! The lesson here: whenever one predicts the end of something in high tech, there’s always a twist or new perspective that makes a new breakthrough possible.
By the way, in geologic time, 1978 was just yesterday. But some of us feel more attuned to geologic time than other time frames.
* Peggy – I went to a conference keynote recently where the CEO told his audience that electronics must push forward, so his kids can have more tunes on their iPods and more video downloads on their mobile devices. Do you think this kind of rampant consumerism, as they’re calling it now with the economic
downturn, is a problem? Or is consumerism the principle motivator for electronics, and therefore justifiable?
* Aart – Obviously, there’s an economic angle to this discussion. By sheer necessity, consumer consumption has gone down as people have less money, and find themselves in bad shape. But you actually imply a separate question – if there were more consumption, would the economic machine be less stuck?
The whole notion of accessing video on a mobile device is a driver to technology in a massive way. The data transformation, storage, display, commercial utilization – a video is orders of magnitude higher in consumption of bits and bytes than audio content. Therefore, if you can build the technology that makes mobile video available, the consumers will welcome it.
Sure, many of the consumer products have driven the investments and the economic machine that have made it possible to push Moore’s law, and therefore yet more investments. But also, because of these advances, there have been many inventions that are profoundly impacting humankind from a medical point of view or communications point of view. These have had an impact, as well, on the meaning of what life is all about.
One example, the entire domain of genetics. It’s really computational technology, and chip technology, that’s driving genome science forward at a breakneck speed. A month ago I was at Davos, and there were a whole bunch of sessions on the topic of genetics, including guys working on designing chips to make genetic mapping cheaper and cheaper. You should look at the website. For $1000, you can get your complete genetic profile. That opens up such a rich set of information about ailments and medical challenges. These are the fabulous side results of your rampant consumerism.
Clearly, however, it’s all a two-edged sword.
* Peggy – I’m very interested in the multicore technology. Is this a software problem or a hardware problem?
* Aart – The two are completely inter-related. We now design chips with multicores and other forms of parallelism, because we couldn’t design chips that were faster and faster without piercing a whole set of power limitations. Therefore, out of necessity, we
started to make multicore – if we can’t make a single processor core faster, let’s make 2 and [enhance performance that way].
There’s a loose analogy here. How can one woman have 9 babies in one month? It’s not possible. It’s one baby in 9 months with one woman, or 9 babies in 9 months with 9 women. Some applications are limited by nature in the amount of parallel processing that can occur.
Similarly for EDA software, like any other utilizer of compute power, we know some problems can be split up more easily than others. Three years ago, we started looking at the problem under the banner of Teracomputing. We said, let’s look at the state of the art of parallel programming. Now today, I can say that every one of our application tools uses multicore hardware.
It’s still true, however, that some problems are very, very tightly interconnected where it’s very difficult to split the application up into multiple pieces. But for things like a local DRC check, why not do these in parallel? That’s a very good use of multicore functionality.
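Aart's point about local DRC checks being a natural fit for parallelism can be sketched in a few lines. This is an illustrative toy, not Synopsys code: the tile representation, the spacing rule, and the function names are all hypothetical, and a real tool would fan the tiles out across processes or entire machines rather than Python threads.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical minimum-spacing rule (illustrative value, not a real PDK number).
MIN_SPACING = 3

def check_tile(tile):
    """Flag any pair of x-coordinates in this tile closer than MIN_SPACING.
    Each tile is independent of the others, so tiles can be checked in parallel."""
    xs = sorted(tile)
    return [(a, b) for a, b in zip(xs, xs[1:]) if b - a < MIN_SPACING]

def parallel_drc(tiles, workers=4):
    """Run the local check over all tiles concurrently, then merge the reports."""
    # Embarrassingly parallel: no tile depends on another tile's result.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        per_tile = list(pool.map(check_tile, tiles))
    # The only serial step is concatenating the violation reports.
    return [v for tile_violations in per_tile for v in tile_violations]

tiles = [[0, 10, 11], [20, 30, 40], [50, 51, 60]]
violations = parallel_drc(tiles)  # two tiles each contain one spacing violation
```

Because the check is purely local, adding workers scales almost linearly until the merge step dominates, which is exactly the kind of workload Aart contrasts with tightly interconnected problems.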
* Peggy – So, are you talking here about real progress in multicore, or just window dressing?
* Aart – No, it’s not just window dressing. This goes to the very architecture of the program. But it is true, there’s quite a spread in terms of the results and what you can get. For the most inter-twined programs, you can get 2-to-5x faster. But for
programs that can be highly partitioned – the things you would have in OPC or things related to very large amounts of data being checked locally – there, we can literally split the program into hundreds of simultaneous processes. And often, not just on one computer, but on many computers. The largest compute farms in the world today are being used for simulation and OPC.
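The spread Aart describes – 2-to-5x for tightly inter-twined programs versus hundreds of simultaneous processes for highly partitionable ones like OPC – is essentially Amdahl's law: the serial fraction of a program caps its parallel speedup no matter how many cores you throw at it. A minimal sketch, with illustrative serial fractions that are my own assumptions, not Synopsys measurements:

```python
def amdahl_speedup(serial_fraction, n_workers):
    """Amdahl's law: speedup = 1 / (s + (1 - s) / n),
    where s is the fraction of runtime that cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

# A tightly inter-twined tool: 30% of the work is serial.
tight = amdahl_speedup(0.30, 1000)   # plateaus near 1/0.30 ~= 3.3x

# A highly partitionable workload (e.g. local checks): 0.1% serial.
loose = amdahl_speedup(0.001, 500)   # roughly 330x on 500 workers
```

With a 30 percent serial fraction, no number of cores gets you much past 3.3x, squarely in the 2-to-5x band; shrink the serial fraction to a tenth of a percent and hundreds of workers deliver speedups in the hundreds.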
* Peggy – But is that the same problem? Partitioning programs for compute farms versus multithreading for multiple processors on a single chip?
* Aart – Well, there’s a link between those things. They have somewhat different characteristics, but simply put – it’s a balance between computation and shoving data around. If you have to go to a memory chip, it’s slower. If you have to go off to a completely different computer to process a thread, it’s really slow.
My analogy for this is about doing the dishes. Every so often, my kids have 10 friends over to the house. I tell them it would be efficient for the 10 of them to do the dishes together, the process would go 10 times faster. But, with 10 kids in the kitchen, the reality is that you can forget about any form of efficiency. The dishes are just colliding. Nonetheless, it’s still better than just one person cleaning up for all 10.
You can see the issue – it’s one of too many little programs versus data space allocation.
* Peggy – How do you customize your offerings for your customers, whether for their multicore compute platforms, or their compute farms, or whatever?
* Aart – Ah. This is a good question! Increasingly over the last few years, we have been making quite an effort to help our customers customize in a way that makes sense. Back 5-to-6 years ago, when AMD came out with their first multicore processor, initially
people jumped on them because they actually made our products run faster, which was a driver for us for selling those solutions.
But at the same time, people were saying to us, you also need to make the software work on our slower compute platforms. So, the interactions we began to have with the different people who provide the hardware for our customers was very important. But also, the good news – we’re now providing the tools to those people for designing their multicore processor products.
So, we’ve had good insights into all aspects of the problem. It’s really an ecosystem where we all have to dance together. If somebody is going to be wearing shoes that are too big, they’ll end up stepping on everyone else’s toes.
* Peggy – So, how are your tools helping to promote multicore technology?
* Aart – Actually, I was personally involved in that, because I got worried that our teams had so many different requests from our customers about all of this: “Help me re-architect your products, so it can match our multicore hardware.”
Clearly, we needed to coordinate our internal efforts. It turns out that by putting together a task force on terascale computing, we created a center of gravity at Synopsys on best-in-class tools. We could look at a [range of options] and say, that worked well, or that didn’t work for the following reasons.
I was able to just stand on the sidelines and receive [regular updates] on how things progressed. The task force allowed us to formulate our thinking around the general topic of multicore, and has moved our entire team forward.
* Peggy – Who’s on your task force?
* Aart – It’s just an internal group, although the first thing we did was to interact explicitly, and very informally, with all of the leading developers of multicore technology – our customers – plus a number of different universities. Mostly,
we have incredibly deep technology knowledge at Synopsys, so that was our principal source of inspiration.
* Peggy – Speaking of Synopsys, how is it the company has maintained its stock valuation over time? It’s been quite interesting to watch over the last year or two.
* Aart – Obviously, valuation is a function of two things – how the business is doing, and what’s going to happen in the future. We’re fortunate that we’ve read a number of threads correctly, that we’ve put together both a technology
and an economic strategy that was in the right place at the right time.
For many years, EDA success was measured by how good, how fast, and how effective an individual tool was. That was reasonable, because most things have to get better and faster as you scale with Moore’s law. But there’s a challenge here – whenever a field grows in scale, sooner or later you get systemic complexity, as well. It’s not just more things, it’s more things inter-related in more and more dimensions.
Looking at it simply – before, every new chip was faster, but at 130 nanometers, power became a massive restriction. That was an additional dimension added to the problem – the designers had to look at everything, including power. At the manufacturing level, at the transistor level, at the system level, with multiple voltages – everything was very complex with a whole laundry list of systemic issues touching place-and-route, embedded software, and so on. In other words, everything’s inter-related with everything and there’s tremendous system complexity.
So, what have we done? We’ve moved from scale complexity to systemic complexity and started to work on the interactions of various tools. It’s not just the output of one tool to another, but the upstream tools have to be clairvoyant about the downstream tools.
For many years, synthesis was just that. But then, we said we need to look at the interconnect, so we put a little placement into synthesis, and that clairvoyance allowed a little bit of routing to predict a little congestion. The same is true for the whole tool set. If you know what the tools are going to do, it goes to a more and more integrated flow. This has driven our entire M&A strategy, moving to get all of the puzzle pieces in place.
We couldn’t have predicted this recession, of course, but we’ve moved the entire company, plus a number of customers, into preferred relationships that – despite systemic complexity – still get chips out in a reasonable amount of time.
At the same time, we also realized the business model – the end-of-quarter transactions – was broken. We realized that we’re about long-term relationships. To get the order done just to make the numbers for the quarter was a very negative approach, so we migrated to a ratable business model to realize revenue on a day-by-day basis, with no early recognition.
This combination [of strategies], both technical and economic, has put us in a very different position today than others in EDA.
* Peggy – This is a complex process you’re describing, the evolution of strategies within a company. Does it require something like an orchestra conductor to break open the teams? To discourage fiefdoms, which are unwilling to open up and cooperate with
other fiefdoms in the organization, unwilling to link their products or code?
* Aart – I like your musical analogy, but is it a classical symphony approach? You will play it this way, or bye bye. Or is it a classical jazz approach? Where sideways listening skills are just as important as individual solos. I believe Synopsys is far
better at the second than the first.
True, it’s taken years to get value links, so that the value of all tools is considered important, so you can look at the big picture and not just the runtime of individual tools. Yes, this is a very difficult goal. But, from the economics of the customers, it’s still the throughput of the individual tools that is important.
So, this is exciting. Precisely because of the economic stress happening right now, many of the management teams at our customers are asking, what is the overall case for this project or that? Precisely in a downturn, people ask, is there a way to improve the overall picture, the overall efficiency of a project?
It took a lot of internal change at Synopsys for us to [reach this point]. But, right now I’m seeing such a sea change on the part of our customers – no matter how much tuning they do for their projects, there’s still too little money – so they are turning to us for help at a much higher degree.
* Peggy – I think I hear you describing the all-you-can-eat EDA model. Is that good for Synopsys? Good for the EDA industry?
* Aart – The technology benefits are obvious, but from a commercial point of view – it’s not that we can give you all you can eat, but that we put a lot of food on the table. Our customers have fixed budgets, because if they don’t, they
won’t be back at the restaurant.
It’s always a balance between what you can provide and what the customer can afford. But no matter what, we need to make our customers successful. Gluing together some Frankenstein flows may include best-of-class, but be worst-of-class in aggregate.
The challenge for the industry remains, however. Whenever you provide a lot of technology, people don’t fully appreciate what they’re getting for their money. In every industry, the question remains. Are we appreciated enough by our customers? At this point in time, I appreciate that people are designing with our tools. That’s as far as I can see right now.
* Peggy – So, how about the future of CMOS, the prospects for novel materials, a future that includes self-assembling compute platforms?
* Aart – So that begs the question, what is new? Look at 22 nanometers versus 250 nanometers a number of years ago, it’s extraordinarily evolved! We have seen just remarkable progress, but precisely because there’s been so much investment in silicon and CMOS.
Gallium arsenide never got to a critical mass [of acceptance] and never could move the state of the art forward. That’s the reality of anything that needs big investments.
It doesn’t mean that the door’s not open, however, for new breakthroughs as we get to much smaller devices. At the nano-scale, the technology is definitely being developed. But therein lies the problem. It takes a long time. Over time, yes there can be remarkable innovation in technology – but over time.
Still, I am in the camp of never say never, although you’re the one who used the term dinosaur to describe those of us who have been in the industry for a long, long time.
* Peggy – Hey! I never used the word dinosaur!
* Aart [laughing] – My daughter always calls me a homo erectus, but I say I’ve evolved. I’m a homo sapien. I’m well aware of the scale of time needed for evolution!
* Peggy – Okay then, 50 years from now, will we have self-assembling compute platforms?
* Aart – Here’s the problem with your question. At the beginning of our conversation, you asked if 25 years ago I could have imagined 22 nanometers. Well, 50 years is so much farther out than that!
* Peggy – Are you telling me that you have a limited imagination?
* Aart [laughing] – It’s reasonably safe to predict 50 years out because neither of us will be alive, so I can say anything. 50 years is such an amazingly long time. Clearly at that point in time, we will know an enormous amount more about materials, and
will be able to build things up from extremely tiny structures. 50 years from now the notion of storage will have been brought down to something much smaller than a transistor.
Right now, a guy is building a model of the human brain, one neuron at a time. By 2018, we’ll have an electronic model of the human brain. That’s a heck of a lot closer than 50 years. And if you extrapolate from that the notion of modeling sophisticated systems – where that leads to is hard to completely imagine, because simultaneously people have started to connect the human brain, via probes, directly to computers. People can move a cursor just by thinking the command.
As you know, proteins are the molecular complement to DNA. The point being that protein research has been supported initially by massive amounts of simulation. From there, you go to synthesis and molecular optimization. You won’t have to wait 50 years to see that come to pass. And, if you combine the understanding of DNA and the interactions with drugs that are completely designed and delivered [with a particular patient in mind], that’s all going to happen in the next 10 years.
We can see so many incredibly fascinating things are happening, that five decades from now – the extrapolation becomes nothing short of a science fiction movie. And there will be many phenomenal advances between now and then.
* Peggy – What will Synopsys bring to that process of advancement?
* Aart – No, here’s the real question – are there sufficient business opportunities that are generic enough to warrant development programs in these areas? Invariably these questions get answered when the time is right. A lot of fields of study need to
go through a lot of development before you can apply the solutions.
And, it all depends on the economics of the situation, to what degree you can standardize the problem. By the time you make commercial programs, the solutions have to be easy to use. Spending too much time and money, too soon, is tricky – particularly in the midst of the biggest recession in decades. You have to make sure the ship is safe and sound, while guaranteeing that progress is being made.
* Peggy – So, what is EDA? A service industry? A product industry?
* Aart – I love that question! For many years, I’ve argued that EDA is this multiplied by that, not this and that, or this or that. Even great tools, if they’re lacking
support services, will get you nowhere in EDA.
The analogy I like to use – our tools are like race cars. You have to win the Grand Prix with them by designing the ultimate chip, while our support teams are the pit crews who tell you, “In this weather use these tires,” and so on. If you don’t listen to your pit crew, you won’t win the race.
Between the design team, which is the ultimate driver, and the support team, which is the pit crew, and the quality of the tools – EDA is clearly product multiplied by service.
* Peggy – I know you do not want to be self-promoting, but how is it that you’re a technical leader and a business leader? That you’ve had success on both fronts?
* Aart – For starters, I’m neither a top-notch technologist, nor a top-notch business person. Because of that, I’ve been able to surround myself with really top-notch technologists and really top-notch business people.
In addition, the role of the CEO is an interesting one. At any point in time, you are reviewed on the business side and on the motivational side, on the picking-the-right-people side, and on the being a hard-ass-driver side. What I love about this job is, there is not a single day without a challenge that I’ve never seen before. There are always new challenges in high tech. New technologies can be dismissed by the marketplace in a matter of months, or they can be your saviors. If you look out over the entire economy, you have to put a lot of ideas out there to find one that succeeds.
I’ve been very fortunate to be able to participate in Synopsys, to have the support of an incredible COO and a fantastic, top-notch team at the top. But, throughout the organization, there are great people. I’m just the bandleader.
There is one down side, however. There is no Off Button, literally, in my life. Sometimes, I think it would be so cool to be able to go just one month without ever thinking about Synopsys and the business. Maybe it’s just me, but in my life the On Button is always on.
* Peggy – So, if you had not been an accomplished jazz musician, Synopsys would not have been a success?
* Aart [laughing] – It’s a jazz combo here. Someone has a great idea, and the other musicians build on it. We’re not marching a big, structured army.
And in a massive downturn, there are various schools of thought about how management must change, but one has to have adaptive leadership at all times. You can’t just say, here is the solution. Go execute on it. You have to share problems with many people in the organization, so that at any moment they can act independently on things. We’re all on a ship in high seas in a hurricane. If someone sees a sail has ripped, they need to be able to fix it without asking for permission. That’s very much what happens in a jazz combo, as well.
* Peggy – I know you have two daughters. What advice do you give them for their own futures?
* Aart – They both know I support them in whatever they do. They both happen to be quite good on the math and science side, and we support them in that. As parents, we try to give them the strongest set of values, give them a set of discipline skills –
finish what you are responsible for – and some degree of help in finding their own balance.
One of my daughters recently reminded me that I would pay for anything having to do with education. She said that when she has ice cream, she is more relaxed and studies better, so that could be a cost associated with education. [laughing] She clearly has the nascent talent needed to become a marketeer!
Ultimately, however, they’ll have to decide for themselves what they will do with their lives. We are very supportive of their ideas and ambitions.
Editor’s Note …
As a testament to the community spirit of Aart de Geus, he served as Honorary Chair of the 2008 San Jose Jazz Festival. In addition, de Geus and the Synopsys Outreach Foundation sponsor the Santa Clara Valley Science & Engineering Fair Association. The Awards Ceremony for the Synopsys Science & Technology Championship will take place next weekend, March 29th, at Great America in Santa Clara.
Next month in EDA Weekly …
We’ll be looking at a summary of EDA-related news from mid-February through mid-April. We’ll also be visiting
in Nice. See you there!
-- Peggy Aycinena, EDACafe.com Contributing Editor.