Signal Integrity
The Significance of Signal Integrity
The key to any good oscilloscope system is its ability to accurately reconstruct a waveform – referred to as signal integrity. An oscilloscope is
analogous to a camera that captures signal images that we can then
observe and interpret. Two key issues lie at the heart of signal integrity.
When you take a picture, is it an accurate picture of what actually happened?
Is the picture clear or fuzzy?
How many of those accurate pictures can you take per second?
Taken together, the different systems and performance capabilities of an
oscilloscope contribute to its ability to deliver the highest signal integrity
possible. Probes also affect the signal integrity of a measurement system.
Signal integrity impacts many electronic design disciplines. But until a
few years ago, it wasn’t much of a problem for digital designers. They
could rely on their logic designs to act like the Boolean circuits they were.
Noisy, indeterminate signals were something that occurred in high-speed
designs – something for RF designers to worry about. Digital systems
switched slowly and signals stabilized predictably.
Processor clock rates have since multiplied by orders of magnitude.
Computer applications such as 3D graphics, video and server I/O
demand vast bandwidth. Much of today’s telecommunications equipment
is digitally based, and similarly requires massive bandwidth. So too
does digital high-definition TV. The current crop of microprocessor
devices handles data at rates up to 2, 3 and even 5 GS/s (gigasamples per
second), while some memory devices use 400-MHz clocks as well as data
signals with 200-ps rise times.
Importantly, speed increases have trickled down to the common IC
devices used in automobiles, VCRs, and machine controllers, to name
just a few applications. A processor running at a 20-MHz clock rate
may well have signals with rise times similar to those of an 800-MHz
processor. Designers have crossed a performance threshold that means,
in effect, almost every design is a high-speed design.
Without some precautionary measures, high-speed problems can
creep into otherwise conventional digital designs. If a circuit is
experiencing intermittent failures, or if it encounters errors at voltage
and temperature extremes, chances are there are some hidden signal
integrity problems. These can affect time-to-market, product reliability,
EMI compliance, and more.
Why is Signal Integrity a Problem?
Let’s look at some of the specific causes of signal degradation in today’s
digital designs. Why are these problems so much more prevalent today
than in years past?
The answer is speed. In the “slow old days,” maintaining acceptable
digital signal integrity meant paying attention to details like clock
distribution, signal path design, noise margins, loading effects,
transmission line effects, bus termination, decoupling and power
distribution. All of these rules still apply, but…
Bus cycle times are up to a thousand times faster than they were
20 years ago! Transactions that once took microseconds are now
measured in nanoseconds. To achieve this improvement, edge speeds
too have accelerated: they are up to 100 times faster than those of
two decades ago.
This is all well and good; however, certain physical realities have kept circuit board technology from keeping pace. The propagation time
of inter-chip buses has remained almost unchanged over the decades.
Geometries have shrunk, certainly, but there is still a need to provide
circuit board real estate for IC devices, connectors, passive components,
and of course, the bus traces themselves. This real estate adds up to
distance, and distance means time – the enemy of speed.
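As a rough, illustrative calculation (the trace length and per-inch delay below are assumed typical values, not figures from this primer), signals propagate across FR-4 circuit board material at very roughly 150 to 180 ps per inch, so even a short bus trace costs time on the scale of today's edges:

$$t_{\text{flight}} \approx 6\ \text{in} \times 170\ \text{ps/in} \approx 1\ \text{ns}$$

A delay of about a nanosecond was invisible against microsecond bus cycles, but it is a large fraction of a nanosecond-scale transaction and several times longer than a 200-ps edge.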
It’s important to remember that, because of its edge speed (rise time), a digital signal can carry much higher frequency components than its repetition rate might imply. For this reason, some designers deliberately seek IC
devices with relatively “slow” rise times.
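A widely used back-of-the-envelope estimate (a general rule of thumb, not a formula given in this primer) puts the highest significant frequency content, or "knee" frequency, of a digital edge at roughly half the reciprocal of its 10–90% rise time:

$$f_{\text{knee}} \approx \frac{0.5}{t_r} = \frac{0.5}{200\ \text{ps}} = 2.5\ \text{GHz}$$

By this estimate, a data signal with 200-ps edges carries significant energy well into the gigahertz range even if its clock runs at only 400 MHz – which is also why a 20-MHz processor with fast edges can present the same measurement challenge as an 800-MHz one.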