Calibration – Kaman KDM-8200 User Manual


[Figure: output voltage (Vdc) versus displacement across the calibrated range]

Calibration

6. Repeat Steps 3 through 5 as many times as necessary until you reach the desired output
voltage at each point. No further adjustment of the Zero, Gain and Linearity controls will be
needed when proper calibration is attained.


You may not be able to achieve maximum sensitivity using this technique. This method may be
preferred if the maximum output voltage does not exceed the op-amp saturation voltage (5.5 V for
8-volt internal regulation).
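As a quick check of that constraint, here is a minimal sketch. Python is used purely for illustration; the helper name is hypothetical and not part of the KDM-8200 documentation, and the 5.5 V figure is the saturation voltage quoted above for 8-volt internal regulation.

```python
# Illustrative check only: confirm that a desired full-scale output voltage
# stays at or below the op-amp saturation voltage (5.5 V at 8 V internal regulation).
SATURATION_V = 5.5  # volts, from the note above

def within_saturation(desired_full_scale_v: float) -> bool:
    """Return True if the requested full-scale output does not exceed saturation."""
    return desired_full_scale_v <= SATURATION_V

print(within_saturation(4.0))  # True: a 0-4 Vdc calibration is comfortably below 5.5 V
print(within_saturation(6.0))  # False: 6 Vdc would exceed the saturation voltage
```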


Alternate Bipolar Output Calibration

1. Use the Full-Scale procedure in this section to calibrate the system initially from 0 Vdc to the
desired maximum.

2. Position the target at mid-scale and adjust the Zero control counterclockwise until output
reads 0 Vdc.

3. Check the two ends, the points closest and farthest from the sensor, to see that they equal
minus and plus one half of the original full-scale output voltage. For example, if your original
full-scale voltage was 0-4 Vdc, the output should now read -2 to +2 Vdc.

In theory, adjusting the Zero control in this technique should not affect sensitivity or linearity. In
practice, however, you may see a very slight change, indicated by voltage readings other than
minus and plus one-half full scale. You may then choose to use the primary bipolar technique to
fine-tune the calibration.
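The expected end-point readings follow directly from the original full-scale span. The sketch below is illustrative only (a hypothetical Python helper, not part of the manual); it follows the text's convention that the point closest to the sensor reads the negative value and the farthest point the positive value.

```python
# Illustrative sketch: expected end-point readings after the Zero control
# is adjusted for 0 Vdc at mid-scale (alternate bipolar calibration).

def bipolar_endpoints(full_scale_v: float) -> tuple[float, float]:
    """Given the original unipolar full-scale voltage (e.g. 4.0 for 0-4 Vdc),
    return (closest-point reading, farthest-point reading) in Vdc."""
    half = full_scale_v / 2.0
    return (-half, +half)

# Example from the text: an original 0-4 Vdc calibration should now read -2 to +2 Vdc.
print(bipolar_endpoints(4.0))  # (-2.0, 2.0)
```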

High Accuracy Band Calibration

This procedure is used to monitor changes in position that are less than the specified linear
measuring range of your sensor, or when you are interested in increased accuracy over a smaller
range and not concerned about high accuracy outside of that range.

The high accuracy band procedure maximizes the linearity of output within a calibrated span.
The sensor installed in your system has a specified linear measuring range that defines system
performance characteristics, such as linearity, resolution and long-term stability.

Maximum linearity is centered on the mid-point of the sensor's specified linear measuring range,
in the region between 25% and 75% of full scale. For example, if you have a system with
a 40 mil specified range, its most linear region will be between 10 and 30 mils.
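As a worked illustration of that 25-75% rule, the sketch below (hypothetical Python, assuming a measuring range that starts at zero displacement) reproduces the 40 mil example:

```python
# Illustrative sketch: the most linear region spans 25% to 75% of the
# sensor's specified measuring range, centered on its mid-point.

def high_accuracy_band(specified_range_mils: float) -> tuple[float, float]:
    """Return (lower, upper) bounds of the 25%-75% band, in the same units
    as the specified range (assumed to start at zero displacement)."""
    return (0.25 * specified_range_mils, 0.75 * specified_range_mils)

# Example from the text: a 40 mil specified range gives a 10-30 mil band.
print(high_accuracy_band(40.0))  # (10.0, 30.0)
```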

Decreasing the linear measuring range of your system will improve system performance;
conversely, increasing the measuring range will degrade it. Depending upon your
measurement objectives, either approach can be used.
