Transmitter Damping Adjustments

The vast majority of modern process transmitters (both analog and digital) come equipped with a feature known as damping. This feature is essentially a low-pass filter function placed in-line with the signal, reducing the amount of process “noise” reported by the transmitter.

Imagine a pressure transmitter sensing water pressure at the outlet of a large pump. The flow of water exiting a pump tends to be extremely turbulent, and any pressure-sensing device connected to the immediate discharge port of a pump will interpret this turbulence as fluctuations in pressure. This means the pressure signal output by the transmitter will fluctuate as well, causing any indicator or control system connected to that transmitter to register a “noisy” water pressure:

Such “noise” wreaks havoc with most forms of feedback control, since the control system will interpret these rapid fluctuations as real pressure changes requiring corrective action. Although it is possible to configure some control systems to ignore such noise, the best solution is to correct the problem at the source, either by relocating the pressure transmitter’s impulse line tap to a place where it will not be exposed to so much turbulence, or by somehow preventing that sensed turbulence from being represented in the transmitter’s signal.

Since this noise occurs at a much higher frequency than the normal cycles of pressure in a process system, it is relatively easy to reduce the amount of noise in the transmitter signal simply by filtering that electronic signal using a low-pass filter circuit.

The simplest low-pass filter circuit is nothing more than a resistor and capacitor:

Low-frequency voltage signals applied to this circuit emerge at the output terminal relatively unattenuated, because the reactance of the capacitor is quite large at low frequencies. High-frequency signals applied to the same circuit become attenuated by the capacitor, which tends to “short” those signals to ground with its low reactance to high frequencies. The performance of such a filter circuit is primarily characterized by its cutoff frequency, mathematically defined as \(f = {1 \over 2 \pi RC}\). The cutoff frequency is the point at which only 70.7% of the input signal appears at the output (i.e. a \(-3\) dB attenuation in voltage).
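To put this in perspective, suppose (purely as an illustration) the filter were built with \(R = 10\) kΩ and \(C = 1\) μF. The cutoff frequency would then be \(f = {1 \over 2 \pi (10000 \times 1 \times 10^{-6})} \approx 15.9\) Hz, meaning pressure pulsations much faster than about 16 cycles per second would be strongly attenuated, while slower (real) process pressure changes would pass through largely unaffected. Increasing either \(R\) or \(C\) lowers the cutoff frequency, which increases the degree of damping.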

If successfully applied to a process transmitter, such low-pass filtering has the effect of “quieting” an otherwise noisy signal so only the real process pressure changes are seen, while the effect of turbulence (or whatever else was causing the noise) becomes minimal. In the world of process control, the intentional low-pass filtering of process measurement signals is often referred to as damping because its effect is to “damp” (attenuate) the effects of process noise:

In order for damping to be a useful tool for the technician in mitigating measurement noise, it must be adjustable. In the case of the RC filter circuit, the degree of damping (cutoff frequency) may be adjusted by changing the value of either \(R\) or \(C\), with \(R\) being the easier component to adjust. This next photograph shows the location of an adjustable resistance on the printed circuit board of a Rosemount model 1151 analog pressure transmitter:

In digital transmitters where the damping is performed by a digital algorithm, damping may be adjusted by setting a numerical value in the transmitter’s configuration parameters. In pneumatic transmitters, damping could be implemented by installing viscous elements in the mechanism, or more simply by adding volume to the signal line (e.g. excess tubing length, larger tubing diameter, or even “capacity tanks” connected to the tube for increased volume).
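To illustrate how such a digital damping algorithm might work, here is a minimal sketch of a single-pole (exponential) filter, the type of behavior most digital transmitters approximate. The function and parameter names are my own for illustration, not taken from any actual transmitter’s firmware:

```python
def damp(raw_samples, time_constant, sample_period):
    """Single-pole (first-order) damping filter.

    raw_samples   -- sequence of raw sensor readings
    time_constant -- damping time constant, in seconds (the "damping" setting)
    sample_period -- time between successive readings, in seconds
    """
    # Fraction of each new reading blended into the output: a larger time
    # constant yields a smaller alpha, and therefore heavier damping.
    alpha = sample_period / (time_constant + sample_period)
    damped = []
    output = None
    for reading in raw_samples:
        if output is None:
            output = reading                      # start at the first raw value
        else:
            output += alpha * (reading - output)  # blend in the new reading
        damped.append(output)
    return damped
```

A damping setting (time constant) of zero makes alpha equal to 1, passing the raw readings through untouched; progressively larger settings blend each new reading in more gradually, smoothing the reported signal at the cost of a slower response.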

The key question for the technician then becomes, “how much damping should be applied?” Insufficient damping will allow too much noise to reach the control system (causing “noisy” trends, indications, and erratic control), while excessive damping will cause the transmitter to understate the significance of sudden (real) process changes. In my experience there is a bad tendency for instrument technicians to apply excessive damping in transmitters. A transmitter with too much damping (i.e. cutoff frequency set too low, or time constant value set too high) causes the trend graph to be very smooth, which at first appears to be a good thing. After all, the whole point of a control system is to hold the process variable tightly to setpoint, so the appearance of a “flat line” process variable trend is enticing indeed. However, the problem with excessive damping is that the transmitter gives a sluggish response to any sudden changes in the real process variable.

A dual-trend graph of a pressure transmitter experiencing a sudden increase in process pressure shows this principle, where the undamped transmitter signal is shown in the upper portion and the over-damped signal in the lower portion (please note the vertical offset between these two trends is shown only for your convenience in comparing the two trend shapes):
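The same sort of single-pole filter makes it easy to see just how much an over-damped transmitter understates a sudden change. This sketch applies an (intentionally exaggerated) 8-second damping time constant to a simulated 10 PSI step in actual pressure; all values here are assumed purely for illustration:

```python
dt = 0.1                                  # sample period, seconds
tau = 8.0                                 # damping time constant, seconds (exaggerated)
alpha = dt / (tau + dt)                   # single-pole filter coefficient

times = [i * dt for i in range(600)]      # 60 seconds of samples
actual = [20.0 if t < 10.0 else 30.0 for t in times]   # 10 PSI step at t = 10 s

damped = []
reported = actual[0]
for pressure in actual:
    reported += alpha * (pressure - reported)          # first-order damping
    damped.append(reported)

# One time constant (8 seconds) after the step, the damped output has covered
# only about 63% of the actual change.
print(f"actual at t = 18 s:   {actual[180]:.1f} PSI")
print(f"reported at t = 18 s: {damped[180]:.1f} PSI")
```

If that pressure excursion were real and demanded prompt corrective action, the over-damped transmitter would still be reporting a value far short of the truth many seconds after the event.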

Excessive damping causes the transmitter to “lie” to the control system by reporting a process variable that changes much slower than it actually does. The degree to which this “lie” adversely affects the control system (and/or the human operator’s judgment in manually responding to the change in pressure) depends greatly on the nature of the control system and its importance to the overall plant operation.

One way damping may cause control problems is in systems where the loop controller is aggressively tuned. In such systems, even relatively small amounts of damping may cause the actual process variable to overshoot setpoint because the controller “thinks” the process variable is responding too slowly and takes action to speed its response. A common example of this is liquid flow control, where the process variable signal is typically “noisy” and the control action is typically aggressive. A technician may introduce damping to the transmitter with good intent, but unexpectedly cause the control system to wildly overshoot setpoint (or even oscillate) because the controller is trying to get a “sluggish” process variable to respond quicker than the transmitter filtering will allow the signal to change. In reality, the process variable (fluid flow rate) is not sluggish at all, but only appears that way because the transmitter is damped. What is worse, this instability will not appear on a trend of the process variable because the control system never sees the real process variable, but only the “lie” reported by the over-damped transmitter. If any rule may be given as to how much damping to use in any transmitter, it is this: use as little as necessary to achieve good control.
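The overshoot mechanism described above can be demonstrated with a simple loop simulation. Every number below is assumed purely for illustration: a flow process that truly settles in about one second, an aggressive PI tuning chosen for that fast response, and a transmitter damping time constant of ten seconds:

```python
dt = 0.05                  # simulation time step, seconds
tau_process = 1.0          # true process lag: flow responds within about a second
tau_damping = 10.0         # transmitter damping time constant, seconds
Kc, Ti = 2.0, 1.0          # aggressive PI tuning, chosen for the undamped loop

def run_loop(tau_filter, setpoint=50.0, duration=120.0):
    """Simulate a PI flow loop; the controller only ever sees the damped signal."""
    pv = meas = integral = 0.0
    real_trend, reported_trend = [], []
    for _ in range(int(duration / dt)):
        error = setpoint - meas                         # based on the damped measurement
        integral += error * dt
        valve = min(max(Kc * (error + integral / Ti), 0.0), 100.0)
        pv += (valve - pv) * dt / tau_process           # real flow response to the valve
        meas += (pv - meas) * dt / max(tau_filter, dt)  # transmitter damping filter
        real_trend.append(pv)
        reported_trend.append(meas)
    return real_trend, reported_trend

undamped_real, _ = run_loop(tau_filter=0.0)             # effectively no damping
damped_real, damped_reported = run_loop(tau_filter=tau_damping)

print(f"peak real flow, undamped transmitter: {max(undamped_real):.1f}%")
print(f"peak real flow, damped transmitter:   {max(damped_real):.1f}%")
print(f"peak flow reported by the damped transmitter: {max(damped_reported):.1f}%")
```

In this simulation, the real flow rises smoothly to the 50% setpoint when damping is negligible; with the ten-second damping time constant, the real flow overshoots far past setpoint and oscillates for some time, yet the damped trend the controller (and operator) sees makes the excursion appear much milder than it really is.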

Damping should be set to absolute minimum during calibration, so the results of applying stimuli to the transmitter will be immediately seen by the technician.