Just as the sensitivity of a bridge reflects how much its full-scale output changes per volt of excitation, the sensitivity of a strain gauge, or Gauge Factor (GF), reflects how much its resistance changes in proportion to applied strain (the ratio of fractional change in resistance to applied strain, GF = (ΔR/R) / ε). For common metallic strain gauges, the Gauge Factor is typically around 2.0, meaning the fractional resistance change is about twice the fractional dimension change under an applied force (ΔR/R = 2 × ΔL/L). However, GF varies slightly from application to application, and this affects the computed strain. For many instruments, the sensor Gauge Factor is used to compute ideal strain, while the Instrument Gauge Factor is used to compute indicated strain.
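The Gauge Factor relationship above can be sketched in a few lines of code. The function name and numbers below are illustrative only, assuming a gauge whose fractional resistance change ΔR/R is already known:

```python
# Sketch: computing strain from a measured resistance change,
# using the Gauge Factor definition GF = (dR/R) / strain.
# Values are illustrative, not from any particular gauge datasheet.

def strain_from_resistance(delta_r: float, r_nominal: float,
                           gauge_factor: float) -> float:
    """Return strain (dimensionless) given a resistance change in ohms."""
    return (delta_r / r_nominal) / gauge_factor

# Example: a 350-ohm gauge with GF = 2.0 whose resistance changes by 0.7 ohm
strain = strain_from_resistance(0.7, 350.0, 2.0)
print(strain)  # 0.001, i.e. 1000 microstrain
```

Rearranging the same definition the other way (ε = (ΔR/R)/GF) is how an instrument converts its resistance measurement into an indicated strain.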
A material subject to a uniaxial tensile or compressive force in one direction will also experience a lateral strain referred to as Poisson's Strain. Most materials that undergo a tensile force, or positive strain +ε (when stretched or elongated), will contract slightly due to a coincident negative strain -εT in the lateral/transverse dimension (their Poisson's ratio ν = -εT / ε is a positive number). Less commonly, some materials have a negative Poisson's ratio and expand in the transverse direction (+εT) in response to being stretched (+ε) in the longitudinal direction. In general, the proportion of contraction or expansion is indicated by the application material's Poisson's Ratio. Poisson's Ratio (ν) is the negative ratio of the simultaneous transverse strain, which occurs perpendicular to the applied force, to the main strain parallel to it.
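The sign convention can be made concrete with a short sketch (illustrative values; ν = 0.3 is typical of steel):

```python
# Sketch of Poisson's ratio sign convention: nu = -eps_T / eps_L,
# so the transverse strain is eps_T = -nu * eps_L.

def transverse_strain(eps_longitudinal: float, poissons_ratio: float) -> float:
    """Transverse (lateral) strain coincident with a longitudinal strain."""
    return -poissons_ratio * eps_longitudinal

# A positive (tensile) longitudinal strain of 1000 microstrain with
# nu = 0.3 produces a negative (contracting) transverse strain.
eps_t = transverse_strain(1000e-6, 0.3)      # about -300 microstrain

# A (less common) negative Poisson's ratio material instead expands
# transversely when stretched:
eps_t_auxetic = transverse_strain(1000e-6, -0.1)   # positive
```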
Strain gauge sensors typically consist of a very fine foil or wire grid that is bonded to the surface of an application material in the direction of an applied force (uniaxial) or lateral to it (bending force). These are referred to as bonded metallic or resistance strain gauges. They are designed to change their resistance slightly in proportion to applied strain. Most strain gauges have nominal resistance values ranging from 30 Ω to 3000 Ω, with 120 Ω, 350 Ω, and 1000 Ω being the most common. Their cross-sectional area is minimized by design to reduce the negative effect of the corresponding shear or Poisson's Strain coincident with the applied strain.
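To get a feel for how small these resistance changes are, here is a quick worked example with illustrative numbers, assuming GF = 2.0 and one of the common nominal resistances:

```python
# Sketch: magnitude of resistance change for a common 350-ohm gauge.
# Rearranging the Gauge Factor definition: delta_R = GF * strain * R_nominal.
gauge_factor = 2.0
r_nominal = 350.0        # ohms, a common nominal value
strain = 1000e-6         # 1000 microstrain, a fairly large strain

delta_r = gauge_factor * strain * r_nominal
# ~0.7 ohm of change on a 350-ohm gauge, i.e. only a 0.2% change,
# which is why strain gauges are typically read in a bridge circuit.
```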
The bonded metallic strain gauge has a foil grid attached to a thin backing material or carrier strip, which is attached directly to the application material. This helps facilitate an efficient transfer of strain from the body to the foil grid of the gauge and allows it to respond with a linear or nearly linear change in electrical resistance.
Ideally, the strain gauge resistance should only change in response to applied strain,
but unfortunately in practice, this is somewhat of an inexact science and it is
difficult to make both the gauge material and application material expand and
contract equally over temperature. As you can surmise, properly mounting the
gauge is critical to ensure the application material strain is accurately transferred
through the bonding adhesive and backing material to the gauge foil.
To curb potential problems caused by mismatched expansion and contraction rates
between the gauge and application material, gauge manufacturers try to minimize
sensitivity to temperature by selecting specific gauge materials for specific
application materials. While this helps to minimize strain error, temperature
remains a source of potential error and additional compensation is usually required.
Adverse effects like these are the reason strain instruments provide additional parametric controls for rescaling their measurements as required, such as utilizing Instrument Gauge Factor and Software Gain to help overcome application-induced skew in strain measurement.
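One such rescaling can be sketched as follows. If an instrument computes indicated strain using its own Gauge Factor while the sensor's datasheet specifies a different one, the reading can be corrected after the fact (the function and values below are illustrative, not any particular instrument's API):

```python
# Sketch: correcting indicated strain when the Instrument Gauge Factor
# differs from the sensor's Gauge Factor.
#   indicated = (dR/R) / GF_instrument
#   actual    = (dR/R) / GF_sensor
#   => actual = indicated * GF_instrument / GF_sensor

def correct_strain(indicated_strain: float, gf_instrument: float,
                   gf_sensor: float) -> float:
    """Rescale an indicated strain reading to the sensor's true Gauge Factor."""
    return indicated_strain * gf_instrument / gf_sensor

# Instrument fixed at GF = 2.0; sensor datasheet says GF = 2.1.
# The corrected value is slightly smaller than the indicated 1000 microstrain.
corrected = correct_strain(1000e-6, 2.0, 2.1)   # ~952 microstrain
```

Software Gain plays a similar role as a plain multiplicative correction applied to the indicated reading.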