The State of PCB Design, 2011
January 18, 2012 | Estimated reading time: 7 minutes
This article appeared originally in the December 2011 issue of The PCB Magazine.
For the past 16 years, Mentor Graphics has conducted the Technology Leadership Awards program. Dozens of PCB designers from around the world submit their best designs for consideration. Compared with the previous year’s submissions, this set of designs offers an interesting vantage point from which to assess the state of PCB design in 2011. Some of the trends we noticed may surprise you…as they did our staff.
In addition to the TLA award program, I surveyed a number of managers within Mentor to get additional views on 2011. Their input was interesting and valuable, but perhaps not quite as surprising as the actual designs from the field. So, here are the major trends from this year, with a bit of commentary.
High-Speed Circuitry and the SERDES Crisis
Mentor’s development engineering director for high-speed PCB products, Dave Kohlmeier, refers to the “SERDES Crisis,” a term he coined in 2011 to describe the almost exponential increase in the use of SERDES interfaces. Previously, designs often included one SERDES interface, or perhaps a couple. Even a single interface requires careful design, because its extremely fast edge rates must be managed to ensure proper signal integrity. Adding a second interface increases the complexity: now there are even more asynchronous edges, with their potential for crosstalk and EMI.
The last couple of years have seen the number of SERDES interfaces on single PCBs increase dramatically. That has resulted in the absolute necessity of running signal integrity (SI) simulations to ensure that these very fast interfaces operate properly – and that is the “crisis.” Even the best designers using the best design tools may find it impossible to get a clean signal without simulation and tweaking the layout.
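Why simulation becomes unavoidable at these speeds can be seen with a common signal-integrity rule of thumb (a generic estimate, not a figure from the TLA data): a trace needs transmission-line treatment once its propagation delay approaches half the signal’s rise time. A minimal sketch, assuming an FR-4 stripline delay of roughly 170 ps per inch:

```python
# Rule-of-thumb critical trace length: a net needs transmission-line
# treatment (and thus SI simulation) once its one-way propagation delay
# exceeds roughly half the signal's rise time.
# The 170 ps/inch delay is an assumed value typical of FR-4 stripline.

def critical_length_in(rise_time_ps, t_pd_ps_per_in=170.0):
    """Longest trace (inches) that can still be treated as electrically short."""
    return rise_time_ps / (2.0 * t_pd_ps_per_in)

for t_r in (468, 223, 10):  # edge rates discussed in this article (ps)
    print(f"{t_r:4d} ps edge -> traces longer than ~{critical_length_in(t_r):.2f} in need SI analysis")
```

At a 10 ps edge rate, the critical length shrinks to a few hundredths of an inch, which is why effectively every net on such a board becomes a high-speed net.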
The PCB designs submitted for the TLA competition bear this out: 83% of the designs employed signal integrity simulation and analysis as a design procedure. Quite clearly, the effort put into SI is reaping benefits for the companies employing it.
Signal integrity isn’t the only area that can be negatively affected by extremely fast edge rates and multiple SERDES channels. Without clean power, every designer’s nightmare – intermittent problems – is unavoidable. Current droops, voltage spikes, high-current-density points and other power integrity (PI) issues can not only cause intermittent data errors but can also lead to more disastrous results, such as long-term component failures and even trace delamination.
Again, the TLA statistics show that PI analysis is becoming commonplace in PCB design. In 2011, exactly two-thirds of the designs incorporated PI simulation and analysis in the process. That number represents a 12% increase in the number of designs incorporating PI compared to just one year ago. As shown in Figure 1, spending up-front time to find problem points in the PCB design to ensure clean power is another trend that companies are viewing as worth the effort.
Figure 1. Current density maps produced with HyperLynx PI can graphically alert designers to problem points before the PCB design is committed to prototype, allowing rectification before making a physical board. In this illustration, the “hot” colored spikes indicate significant current density on that portion of the PCB.
To cement the trend of ever-increasing edge rates, I took a look at both the number of high-speed nets and the edge rates for 2010 and 2011. Looking at the 2010 numbers was a “WOW!” moment; the 2011 numbers are a “HOW?” moment.
The average number of high-speed nets on the design submissions actually decreased a small amount. However, the maximum number of high-speed nets in 2010 was an astonishing 5,400; in 2011 that number rose to 6,000. Who can predict how many high-speed nets will be packed into next year’s designs?
Edge rates also continue to climb, to the point that they are reaching physical limits. In 2010, the average edge rate was 468 ps; in 2011, that average was cut by more than half, to 223 ps. The minimum edge rate in 2010 was 10 ps; in 2011, five of the entrants’ designs were operating with edge rates of 10 ps or less.
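To put those edge rates in perspective, the widely used knee-frequency rule of thumb, f_knee ≈ 0.35 / t_rise (an assumption here, not a number from the survey), estimates the bandwidth a digital edge excites:

```python
# Knee-frequency rule of thumb: a digital edge with 10-90% rise time
# t_rise contains significant spectral energy up to about 0.35 / t_rise.
# This is a generic estimate, not a figure from the TLA data.

def knee_freq_ghz(rise_time_ps):
    """Approximate bandwidth (GHz) excited by an edge of the given rise time (ps)."""
    return 0.35 / (rise_time_ps * 1e-3)  # convert ps to ns; result is in GHz

for t_r in (468, 223, 10):  # edge rates discussed in this article (ps)
    print(f"{t_r:4d} ps edge -> ~{knee_freq_ghz(t_r):.1f} GHz of bandwidth")
```

A 10 ps edge excites energy out to roughly 35 GHz, well into the microwave region, which helps explain why board materials and geometry become limiting factors.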
One more trend deserves mention: the number of designs incorporating RF. In 2011, the number of designs with RF onboard rose by 43%, with one in five designs now including RF. That wireless technology is seeing growing use goes without saying. But the additional challenge of incorporating RF alongside other analog and high-speed digital signals on the same board is daunting. In addition to all the SI issues that multiple high-speed nets give rise to, designers now have to deal with RF energy “wiggling the bits.”
Quite obviously, the continuing trend in 2011 was faster, faster, faster. Not coincidentally, the trend in design tool usage was more and more simulation and analysis.
The Physical Challenge
The physical trends in PCB design were more surprising than the electronic trends. Table 1 shows a number of the average physical trends over the 16 years that the TLA program has existed. Most notable: over those 16 years, the number of nets has doubled while the total PCB area has decreased by 20%. If anyone doubts there is “art” to PCB design, this should dispel the thought!
Table 1. Physical design trends in 2011 (Source: Mentor Graphics TLA program).
But wait…there’s more. In addition to the number of nets climbing, the density of the components on the PCB increased by 30% in the last year. Let’s think about that…faster, denser and smaller. There’s no mistaking the trend.
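The 16-year arithmetic above is worth spelling out: if the net count doubled while board area shrank by 20%, routing density grew by considerably more than a factor of two.

```python
# Back-of-the-envelope check on the 16-year trend described above:
# nets doubled while total PCB area fell 20%, so routing density
# (nets per unit area) grew by 2.0 / 0.8 = 2.5x.

nets_growth = 2.0    # net count doubled over 16 years
area_growth = 0.80   # total PCB area down 20% over the same period

density_factor = nets_growth / area_growth
print(f"Nets per unit area rose ~{density_factor:.1f}x over 16 years")
```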
This leads to the obvious question: What has enabled that eye-opening trend? Unfortunately, the answer is not at all obvious. One possibility is an increase in the use of HDI technology. However, there was actually a slight decrease in designs using HDI, although nearly half of the designs did employ the technology. How about trace width and spacing? Looking at Table 1 again, you can see that they have remained constant over the last few years and actually haven’t changed much in a decade.
The answer could lie in the use of more dense components. More FPGAs were employed this year and chip-on-board designs doubled, but still only account for 6% of the total. Flip chip usage also doubled from 2010 to 2011, but is only nudging 10%. But, contradicting that notion is the fact that the average number of leads per part is four (this, of course, includes connectors, test points and 2-pin passive components).
Perhaps the answer lies in the tools and in the designers’ ability to fully utilize the capabilities built into those software tools. Employing SI simulation and analysis has allowed many designers to actually increase the density of their designs: rather than calculating and padding with a margin of error, they can determine exactly what effect layout density has on their signals. The same can be said of PI simulation and analysis. Prudent use of PI generally results in a tighter layout and the elimination of many bypass capacitors, freeing up space for denser components to be placed and routed.
Design for Manufacturability
Design for manufacturability (DFM) is a popular buzzword these days, but are companies actually employing the technology to reduce prototype iterations and re-designs? The TLA trends clearly show this to be the case.
At Mentor, we call these DFM tactics “left shift,” that is, shifting consideration of manufacturability to the left on the product design timeline. Don’t start thinking about manufacturability after the design is complete; “left shift” that thinking to early in the design, or even before the design is begun.
The entrants have clearly made an increasing commitment to this concept. In 2010, more than two-thirds of the companies entering the TLA (68%) employed DFM in their design procedure. This year, that number rose to almost three-quarters (74%), a six-percentage-point gain year over year. With new software tools coming online, this trend is certain to continue, probably at an increasing rate.
Tying It All Up
There were some very obvious trends in PCB design during 2011 and all point to an optimistic outlook. Table 2 shows a summary of the more dramatic trends.
Table 2. Design trends for 2011 (Source: Mentor Graphics TLA program).
Quite clearly, designers are putting more “stuff” into a smaller space and doing it more efficiently. While the tool capabilities continue to amaze, no less amazing is the obvious underlying skill of the designers who squeeze every last capability out of those tools. Add to that the large and increasing use of SI and PI simulation and analysis and you have a pretty good picture of “The State of PCB Design” for 2011.
David Wiens joined Mentor Graphics in 1999 through the acquisition of VeriBest. Over the past 25 years, he has held various engineering, marketing and management positions within the EDA industry. His focus areas have included advanced packaging, high-speed design, routing technology and integrated systems design. He holds a B.S. in computer science from the University of Kansas.