Signal Calibrators Description
Signal Calibrators, also called Milliamp Calibrators or Loop Calibrators, are used to calibrate and troubleshoot signal transmitters and loops. Milliamp calibrators usually come with a built-in DC voltage calibration capability.
What is a signal transmitter?
Sensors such as pH meters, temperature probes, and positive displacement flowmeters output signals that cannot travel long distances and/or are susceptible to noise from other wiring in the same conduit. A loop-powered transmitter is typically supplied power from a 24 V DC power supply. It converts the analog measured signal to a 4-20 mA current signal and uses the supplied voltage from the signal loop to power itself. A non-loop-powered transmitter will be 3-wire or 4-wire, using the additional wires for power.
How does a mA signal translate to an Engineering Unit?
Zero is represented by 4 mA and full span (100%) by 20 mA, so 50%, regardless of whether the unit is psig, °C, liters per minute, or anything else, is 12 mA.
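The conversion is a simple linear interpolation over the 4-20 mA span. A minimal sketch; the function names below are illustrative, not from any particular library:

```python
def ma_to_percent(ma: float) -> float:
    """Convert a 4-20 mA loop signal to percent of span (0-100%)."""
    return (ma - 4.0) / 16.0 * 100.0

def ma_to_engineering(ma: float, lo: float, hi: float) -> float:
    """Convert a 4-20 mA loop signal to engineering units over the span lo..hi."""
    return lo + (ma - 4.0) / 16.0 * (hi - lo)

print(ma_to_percent(12.0))               # 12 mA is mid-span: 50.0
print(ma_to_engineering(12.0, 0, 200))   # hypothetical 0-200 psig range: 100.0 psig
```

The same two lines work for any engineering unit, since only the span endpoints change.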
Here is a graphical representation for converting 4-20 mA signals.
An elevated zero of 4 mA is used because with a 0 mA signal a true zero cannot be distinguished from a broken connection. For short distances or laboratory applications, a voltage output such as 0-5 VDC or 0-10 VDC is common and less expensive than a 4-20 mA transmitter, but current signals handle long runs with much less interference than voltage signals.
Milliamp Calibrators (Loop Calibrators) are important for compensating for wiring runs
At the other end of the wiring run from the transmitter is an instrument such as a PLC (programmable logic controller), DCS (distributed control system), data logger, or controller. These devices actually measure voltage, so an external precision 250 ohm shunt resistor is installed across the terminals (or built into the instrument). The instrument then reads a voltage signal from 1 to 5 VDC. How? Ohm's law: V = I x R, so at 20 mA, 0.02 A x 250 ohms = 5.0 V.
Great, but what happens when you are using 200 feet of 20 gauge wire? A quick internet search will turn up a resistance of roughly 1 ohm per 100 feet, meaning the instrument will see 0.02 x 252 = 5.04 V, an increase of 0.8%. So your brand new, NIST-calibrated device is +0.8% in error before it is even installed. Depending on your application, this could be significant. And it gets worse with longer runs and higher gauge numbers: 22 gauge is approximately 1.6 ohms/100 ft and 24 gauge is 2.6 ohms/100 ft. 20-24 gauge are typical wire gauges for process instrumentation.
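The arithmetic above can be sketched in a few lines; the 2 ohm figure is the rough 1 ohm per 100 ft estimate for 200 feet of 20 gauge wire from the text:

```python
SHUNT_OHMS = 250.0

def instrument_volts(ma: float, wire_ohms: float = 0.0) -> float:
    """V = I x R: voltage at the instrument terminals, with any wire
    resistance appearing in series with the 250 ohm shunt."""
    return (ma / 1000.0) * (SHUNT_OHMS + wire_ohms)

ideal = instrument_volts(20.0)         # 5.0 V with no wire resistance
actual = instrument_volts(20.0, 2.0)   # 5.04 V with ~2 ohms of wire
error_pct = (actual - ideal) / ideal * 100
print(round(error_pct, 2))             # 0.8
```

Plugging in 4 ohms or 5.2 ohms shows how quickly the error grows with longer runs or thinner wire.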
Rather than estimating the length of wire and its resistance, use a calibrator to quickly correct the error. These adjustments are often called 4-20 mA trim or current loop trim. Connect the milliamp calibrator in source mode in place of the transmitter, source 20 mA and 4 mA to the instrument, and set up the instrument to read 100% and 0% correctly. Make sure to wait long enough for the reading to stabilize and/or set any instrument damping (sometimes called filter) to zero.
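A 4-20 mA trim boils down to a two-point linear correction: source 4 mA and 20 mA, note what the instrument reads, and solve for a gain and offset. A minimal sketch, with hypothetical raw readings:

```python
def two_point_trim(read_at_4ma: float, read_at_20ma: float):
    """Return (gain, offset) such that corrected = gain * raw + offset
    maps the instrument's raw readings back to the true 4 and 20 mA points."""
    gain = (20.0 - 4.0) / (read_at_20ma - read_at_4ma)
    offset = 4.0 - gain * read_at_4ma
    return gain, offset

# Hypothetical: wiring resistance makes the instrument read slightly high.
gain, offset = two_point_trim(4.03, 20.16)
print(round(gain * 20.16 + offset, 3))   # 20.0
print(round(gain * 4.03 + offset, 3))    # 4.0
```

Many instruments perform exactly this math internally when you walk through their zero/span trim menus.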
Source Only vs. Read/Source Calibrators and the Milliamp Process Clamp Meter
The minimum function of a calibrator is to source a known thermocouple, RTD, milliamp, or voltage signal. That permits performing zero/span adjustments, and loop calibrators compensate for wiring runs, as discussed above. In source mode, you can test conditions that would otherwise be difficult or unsafe to create. Imagine testing a tank level scenario: with a mA calibrator, it is safe to simulate a 90% full condition to test the high alarm and 95% to test the high-high alarm. Another safety example is simulating a high pH mA signal to trigger shutdown of pumps sending water into the sewer.
“I think we have a bad valve…” Milliamp calibrators in source mode can also be used to test valve positioners (i.e., “stroke” the valve): send signals to open, close, or any intermediate position while watching the valve stem, on the bench or in the field, for troubleshooting.
Variable frequency drives (VFD) are used to power motors, blowers, and fans in process applications as well as in conveyor systems and machine tools. Control inputs are generally voltage (1 V to 5 V or 0 V to 10 V) or current (4 mA to 20 mA). A milliamp/voltage calibrator can source a signal for commissioning and troubleshooting.
Make sure the source feature has selectable zero/span, slow ramp, fast ramp, and step functions. These are invaluable for simulating a process: one cannot be in two places at the same time, but the calibrator can run an up or down ramp while the results are viewed at the controller.
For troubleshooting, it is also necessary to read the signal coming from the sensor, or a re-transmitted mA signal from a controller or recorder.
What is the difference between “Source” and “Simulate” in a milliamp calibrator?
Source actually outputs a 4 mA to 20 mA signal based on the value selected. Simulate does not output anything; instead, it controls the current drawn from an external loop power supply to be the selected value between 4 mA and 20 mA.
Milliamp Process Clamp, a cool tool!
It’s bad enough that you are having problems with a mA instrumentation loop, but having to spend time disconnecting the transmitter and/or the back of the controller or recorder wastes even more time. And of course the transmitter is 20 feet in the air or in some other inaccessible place. Not to mention the possibility of stripping the heads of the termination screws or re-connecting the wires incorrectly.
Traditionally, clamp meters could not read accurately down at the mA level, until Fluke developed its line of milliamp process clamp meters. With one of these you can clamp onto the loop just like a traditional clamp meter and get accurate milliamp readings. Different versions are available depending on budget: read current only; read/source current; and read/source current and DC voltage.
K110 and K100
Very small and compact, this microprobe is designed for accurate measurement of very low currents, with 50 µA DC sensitivity. Its small size and “clip” shape make it ideal for probing and measuring in tight wiring areas such as circuit boards, 4-20 mA process loops, or automotive electronic circuitry. It is an excellent companion to any DMM or instrument that will benefit from the probe’s high sensitivity, dynamic range, and waveform display characteristics.
What are Linearity, Repeatability, and Hysteresis?
Accuracy for process instrumentation is expressed as a percentage of full scale: how far the instrument deviates from a best-fit straight line (BFSL). For example, the average pressure transducer has 0.5% or 0.25% full scale accuracy. Sometimes the specification will state it as Linearity.
Repeatability is not accuracy; it is how much variation the same instrument shows under the same conditions over multiple tests, usually expressed as a percentage of full scale.
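The linearity figure can be computed from calibration data by fitting a best-fit straight line and taking the worst deviation as a percent of full scale. A minimal pure-Python sketch, using made-up readings:

```python
def best_fit_line(xs, ys):
    """Ordinary least-squares best-fit straight line; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def linearity_pct_fs(applied, indicated, full_scale):
    """Worst deviation from the BFSL, as a percent of full scale."""
    m, b = best_fit_line(applied, indicated)
    return max(abs(y - (m * x + b)) for x, y in zip(applied, indicated)) / full_scale * 100

applied   = [0, 25, 50, 75, 100]          # applied input, % of span
indicated = [0.1, 25.3, 50.2, 74.9, 99.8]  # hypothetical instrument readings
print(round(linearity_pct_fs(applied, indicated, 100), 3))   # 0.16
```

Note that the deviation is measured from the fitted line, not from the ideal readings, which is why a manufacturer's regression choices can shift the quoted number.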
One definition of hysteresis has to do with control systems and preventing “chatter,” and it is not applicable to our accuracy discussion. Think of your furnace thermostat, for example: it turns on when the ambient temperature falls below the setpoint, stays on until the temperature rises a preset amount above the setpoint, and will not come back on until the temperature falls a preset amount below the setpoint. This prevents unnecessary cycling (i.e., “chatter”) of the unit; if the temperature fluctuated between 69.9 and 70.1 °F, for example, the constant on/off action could wear out the furnace prematurely. This is different from hysteresis in the context of measurement accuracy.
From a measurement standpoint, hysteresis is the effect of loading versus unloading across the range of the instrument. Temperature can also affect hysteresis. For example, a pressure transducer has pressure applied from zero to 100 psig, then the pressure is removed from 100 psig back down to zero; at the same applied pressure, the rising and falling readings differ.
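Hysteresis can be quantified by sweeping the same applied points rising and then falling and comparing the readings. A sketch with hypothetical transducer data:

```python
def hysteresis_pct_fs(up, down, full_scale):
    """Worst-case difference between rising and falling readings taken at
    the same applied points, as a percent of full scale."""
    return max(abs(u - d) for u, d in zip(up, down)) / full_scale * 100

# Hypothetical readings at 0/25/50/75/100 psig applied
rising  = [0.0, 24.8, 49.7, 74.8, 100.0]
falling = [0.2, 25.2, 50.3, 75.1, 100.0]
print(round(hysteresis_pct_fs(rising, falling, 100.0), 2))   # 0.6
```

The worst gap here occurs mid-range, which matches the usual pattern: the endpoints agree, and the loading/unloading curves separate in between.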
How accurate is your pressure transducer, mass flowmeter, and other sensors subject to Hysteresis?
The graph above is for illustration purposes, but it indicates an inherent hysteresis problem with pressure transducers and, to a lesser extent, thermal mass flowmeters. The Y-axis represents the indicated pressure: the value on the transducer display, the mA output signal, or the HART signal value. The X-axis represents the applied input. The ISA-37.1 standard calls for pressure transducer calibration data to be collected at a minimum of 5-6 points of rising and falling pressure (0, 20, 40, 60, 80, and 100%). Even with several points collected, the result is still a best-fit straight line, and across the whole range in the graph there are places where accuracy is better or worse. Manufacturers can massage the data, too, and still be correct in their specification. They can take multiple tests and average the results, and they can be selective in which points they measure and use regression analysis to determine the best-fit straight line. The end result is 0.5% linearity generally speaking, but at your specific measurement point, for your specific batch today at 2 PM, the accuracy can be considerably worse.
There are a few approaches to reducing hysteresis error, and error in general. Most processes operate at a fixed temperature, pressure, flow, etc., or at least within a narrow range. Seeing a larger range might be needed to move the process to the desired value, but the critical time is spent in a narrow portion of a larger full scale. Keeping that in mind, consider the following:
When getting instruments NIST calibrated, get the data! Usually the factory or an independent calibration laboratory will charge a small amount extra for the calibration data, but it is worth it. Remember to ask how many calibration points are measured and which ones; if necessary, ask for more points, usually a small adder in price. Now you can see where the errors are and compensate. Some controllers, PLCs, etc. have a menu for a linearization table, typically 16 or more points. At a minimum they might have a menu called bias or offset where you can enter a single positive or negative offset. If your process runs at 100.0 psig for the critical portion of your batch and you know you are off by -0.2 psig, and you cannot make the adjustment at the pressure transducer, look for an input bias/offset menu in the controller.
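A linearization table of the kind mentioned above amounts to piecewise-linear interpolation between calibration points. A sketch of that correction, using made-up as-found data:

```python
from bisect import bisect_right

def linearize(raw, table):
    """Piecewise-linear correction from a calibration table of
    (indicated, applied) pairs, sorted by indicated value.
    Points beyond the table are extrapolated from the end segments."""
    raws = [r for r, _ in table]
    i = min(max(bisect_right(raws, raw) - 1, 0), len(table) - 2)
    (r0, t0), (r1, t1) = table[i], table[i + 1]
    return t0 + (raw - r0) * (t1 - t0) / (r1 - r0)

# Hypothetical as-found data: (instrument indicated, reference applied), psig
cal = [(0.0, 0.0), (25.3, 25.0), (50.4, 50.0), (75.2, 75.0), (99.8, 100.0)]
print(round(linearize(50.4, cal), 2))   # a reading of 50.4 corrects to 50.0
print(round(linearize(100.0, cal), 2))  # 100.2
```

This is the same math a controller applies internally when you key the table into its linearization menu; a single bias/offset entry is just the degenerate one-point version of it.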
When requesting calibration of your sensors, supply desired points and get the data as mentioned above.
Get higher accuracy sensors. Also keep in mind the difference between percent of full scale and percent of reading when selecting sensors. If you get a percent-of-full-scale instrument, make sure your operating range is in the top third of the instrument range.
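The gap between the two specification styles is easy to see numerically; the 0-100 psig range and 20 psig operating point below are made-up examples:

```python
def error_pct_fs(full_scale, spec_pct):
    """Worst-case error of a percent-of-full-scale spec: constant across the range."""
    return full_scale * spec_pct / 100.0

def error_pct_reading(reading, spec_pct):
    """Worst-case error of a percent-of-reading spec: shrinks with the reading."""
    return reading * spec_pct / 100.0

# A 0.5% spec on a 0-100 psig transducer, operating at only 20 psig:
print(error_pct_fs(100, 0.5))      # 0.5 psig, which is 2.5% of the 20 psig reading
print(error_pct_reading(20, 0.5))  # 0.1 psig
```

This is why the advice above says to keep a percent-of-full-scale instrument operating in the top third of its range: the fixed error band is a smaller fraction of a larger reading.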
Compensate for wiring runs with a loop calibrator.
DIY (Do It Yourself). Calibrate instruments in house. At TEquipment we carry Fluke Calibration and Techne, lines worth looking at.