# Metrics are good, but insight is best

#### Article By: Steve Hageman

Answering the questions "Will my schematic fit on the given PCB area?" and "How long will it take to do the layout?" with a statistical approach.

Every engineer faces the same two questions once the schematic is complete: “Will my schematic fit on the given PCB area?” and “How long will it take to do the layout?”

I faced these same questions for decades, and about 10 years ago I finally had the time to take a statistical approach to the issue [1]. This involved writing a script that reads the schematic bill of materials (BOM), including each part's footprint area, pin count, and quantity in the design. The script then outputs the footprint-to-PCB utilization ratio and the total number of pins to be routed (Figure 1).
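The BOM analysis described above can be sketched in a few lines of Python. The column names (`area_mm2`, `pins`, `qty`) are hypothetical placeholders, not the article's actual format; adjust them to match whatever your CAD tool's BOM export produces:

```python
import csv

def analyze_bom(bom_path, board_area_mm2):
    """Sum part footprint area and pin count from a BOM CSV.

    Assumes columns 'area_mm2', 'pins', and 'qty' per line item --
    adapt to your own BOM export format.
    """
    total_area = 0.0
    total_pins = 0
    with open(bom_path, newline="") as f:
        for row in csv.DictReader(f):
            qty = int(row["qty"])
            total_area += float(row["area_mm2"]) * qty
            total_pins += int(row["pins"]) * qty
    # Footprint-to-board utilization ratio and routable pin total
    return total_area / board_area_mm2, total_pins
```

With the utilization ratio and pin count in hand, the fit question (Figure 1) reduces to comparing the ratio against whatever density your layer count and via technology can support.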

There have been a few (very few) attempts at PCB layout estimation reported in the past, and some of those methods use much more complex criteria as inputs to their estimators. My training in “analysis of variance” [2] taught me to keep it simple and start with the (probably) most important factors first. The ratio of part footprint area to PCB area and the total number of pins in the design seemed like a good start, based on the type of work I do most.

Figure 1 A schematic and a board blank, the question any designer faces is: “Will it fit?”. Source: Steve Hageman

#### Analysis of my typical design

I do a lot of baseband analog and RF design, but these designs always have some sort of logic control—be it something as simple as latches or as complex as a control FPGA or microprocessor—so everything is a mixed-signal design. While I may have to length-match some traces, these are not a large percentage of my designs. RF traces take longer to lay out than baseband signals, but I didn't break them out separately in my first-order analysis.

Even if the schematic is not complete and you are just working off a block diagram, with experience a decent estimate of the parts required can be made. This is especially true today, when you can look at a myriad of reference designs and simply count the parts they require (Figure 2). This is 1000% better than some wild guess, which always tends to be low.

Figure 2 Simple analysis of the total part area, total PCB area, and pin count can provide a simple answer to the question posed in Figure 1. Source: Steve Hageman

#### Results of analysis

Over the course of the next 10 or so PCB designs, I had enough information to start making sense of the data. The first useful result is having a good sense of the parts-to-PCB area ratio upfront instead of guessing, which directly leads to understanding the number of layers and the via technology needed. Naturally, large amounts of RF circuitry will also drive the PCB and via technology required. As the PCB gets dense, adding routing layers makes the routing job easier, and if the design starts to get very dense, via technology such as via-in-pad helps routing as well.

At first, I thought I might have to break out the baseband and RF designs into different estimates, but a very interesting fact emerged. I found that the layout time for a reasonably dense PCB hardly depended on what I was routing; rather, it was driven almost totally by the number of pins that needed to be routed.

Even though it takes longer to route an RF trace than a baseband trace, the total time is still driven by the number of pins that have to be routed. The reason is that nearly every pin on a baseband IC needs to be individually routed, while a typical RF IC has a lot of ground pins that are routed pretty much automatically by the copper layer fills. So while each RF trace takes longer to route, the myriad of ground pins on a typical RF IC made up for it, and on a per-part basis the time per pin came out the same no matter what technology I was routing.

This resulted in a very simple two-variable metric:

1. It takes me the same average amount of time to route each pin irrespective of what kind of signal it is.
2. A modifier has to be applied if:
   1. The density of the PCB starts to get too high. This is an exponential modifier: as the density ratio gets higher, the time to route goes up exponentially.
   2. Any constraints on layer count, vias, weird PCB shapes, etc. make the design more difficult to route.
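The exponential density modifier in item 2.1 can be sketched as a simple routing-time multiplier. The threshold and growth constant here are illustrative placeholders, not values from the article; they would have to be calibrated against your own project history:

```python
import math

def density_modifier(utilization, threshold=0.5, k=5.0):
    """Routing-time multiplier that grows exponentially once the
    footprint-to-board utilization ratio exceeds a threshold.

    'threshold' and 'k' are hypothetical calibration constants.
    """
    excess = max(0.0, utilization - threshold)
    return math.exp(k * excess)  # 1.0 below threshold, rises rapidly above
```

Below the threshold the modifier is 1.0 (no penalty); above it, each additional increment of density costs progressively more routing time, matching the exponential behavior described above.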

The analysis result was quite surprising for me, as it all came down to the total number of pins in the design and was irrespective of the specific technology being routed.

I came up with a metric of: X hours minimum to start a PCB design (this includes setting up the project and finally getting all the output files together to build the design), plus Y minutes per pin for a PCB at 0 to Z% density; outside of these constraints, some modifiers need to be added.

The final estimation equation is:

Time Estimate = X + (Y × Number of Pins)

My X, Y, and Z metrics are for me and the work I usually do; you will need to determine these metrics for yourself, and whether they even apply to your specific line of work.
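Putting the base equation and the density modifier together gives a complete estimator sketch. Every constant below (setup hours, minutes per pin, density threshold) is a hypothetical placeholder, since the article deliberately leaves X, Y, and Z for each designer to calibrate from their own data:

```python
import math

def layout_estimate_hours(pin_count, utilization,
                          setup_hours=8.0, minutes_per_pin=1.5,
                          density_threshold=0.5, k=5.0):
    """Sketch of the Time = X + (Y * pins) metric with an exponential
    density penalty. All constants are illustrative, not the author's.
    """
    base = setup_hours + (minutes_per_pin * pin_count) / 60.0
    # Penalty kicks in only when the board is denser than the threshold
    penalty = math.exp(k * max(0.0, utilization - density_threshold))
    return base * penalty
```

For a sparse board the penalty term is 1.0 and the estimate is simply the fixed setup time plus the per-pin time; a dense board of the same pin count produces a substantially larger number, which matches the experience that dense routing gets disproportionately slower.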

#### Application to firmware/software

I also typically write the firmware driver layers for my designs and, by applying the same analysis principles outlined above, I have found that it takes on average T hours to write and test each IC device driver. Some device drivers take less time because of reuse or manufacturer-supplied code examples, and some take longer because of unforeseen problems. This, however, all tends to average out over any given project.

Likewise, turn-on and test works out to roughly a fixed amount of time to verify each firmware driver and each IC. This too evens out over the course of a typical project.

I found this result is similar to results that other people have reported in different disciplines. For instance, Allen Holub, et al. have found that one can estimate software projects simply by counting the number of user stories [3]. I wager that many seemingly complex human endeavors can be accurately estimated in a similar manner. BUT don't just take some previously published result on faith: you probably can't skip the data acquisition and analysis steps, because they confirm whether you are on the right track and give you the insight needed to decide whether the results apply to your specific workflow.

#### Using simple tools to get a good handle on things

It seems simplistic and obvious after the analysis was done; however, it was anything but obvious what the outcome would be when I started this project. This is another case of applying an “analysis of variance” methodology [4] to a given problem to get a handle on a process, be it design or production. As I found in my previous column on “managing chaos,” a few very simple tools can make a big difference in understanding and getting a good handle on things.