Altek Process / Calibration Description
Process Calibration Equipment refers to test equipment used to calibrate process instrumentation, particularly temperature, pressure, and process signals (e.g. pulse, 4-20 mA, 0-10 VDC).
Where is process instrumentation used?
- Gas/Coal/Oil/Nuclear Power Generation
- Pharmaceutical and Hospital
The above are just a few examples of industries using process instrumentation in manufacturing, R&D, and the laboratory. There are many types of process instruments used to measure various variables (flow, level, temperature, pressure, RPM, pH, conductivity, humidity, etc.), and they return a measurement signal to a control system or data logger. All of these sensors, as well as the instruments receiving their signals, need to be calibrated periodically because they can drift, causing the process to deviate and leading to quality issues or even safety hazards. No one wants their medicine, food, power, etc. out of specification.
DIY (Do It Yourself): Justification of in-house calibration/troubleshooting instruments
Looking to justify the purchase of calibration equipment? Here is a list of considerations.
- Costs due to down time
- Costs due to out-of-specification product and its disposal
- Costs of outside calibration laboratories, shipping expense, and time
- Start by only taking the most important or critical calibrations in-house; leave less critical ones for an outside laboratory
- More control over calibration quality
- Better consistency
- Better accuracy
- Better documentation
- Calibrate more than once per year
- Calibrate instruments that normally would not get sent out
Most Common Process Calibration Variables
- Frequency (pulse signals from turbine flowmeters, for example)
- Milliamp loop current (4-20 mA) / DC Voltage (typically 0-30 VDC)
- Temperature (Thermocouple and/or RTD)
- Multifunction models combine two or more of the above capabilities
- HART Multifunction Calibrators add HART Communications to the multifunction capability.
The following hierarchical list shows the traceability of standards:
- National Measurement Standard (i.e. NIST in the USA, ISO in Europe)
- Primary Standard (i.e. Fluke Calibration)
- Secondary Standard (i.e. Fluke Calibration, Techne)
- Working Standard (i.e. "shop" standard; typical NIST-traceable calibrators)
- Process Measuring Instrument (i.e. temp. probe, pressure transmitter, etc.)
When selecting a calibrator, make sure to look at the accuracy of all the instruments it will be used with, not only measuring instruments like a temperature transmitter but also the receiving instrument, such as a recorder or controller. Modern instruments have good accuracies, so a calibrator with even better accuracy is required; otherwise, use the calibrator only for a calibration check.
The vast majority of calibrators will be in the Working Standards category. The rule of thumb in the instrumentation industry is to use a calibrator with 4x better accuracy than the device being calibrated (4:1 ratio). Certain industries require better or you can purchase Secondary or even Primary Standards for in-house calibration of calibrators and avoid the repeated costs of sending them out for calibration.
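The 4:1 rule of thumb above can be expressed as a quick accuracy-ratio check; the accuracy figures below are hypothetical examples, not specifications of any particular instrument.

```python
def tur(device_accuracy_pct, calibrator_accuracy_pct):
    """Test uncertainty ratio: device accuracy over calibrator accuracy
    (both expressed as percent of span)."""
    return device_accuracy_pct / calibrator_accuracy_pct

# Example: a 0.25% transmitter checked with a 0.05% working standard.
ratio = tur(0.25, 0.05)
print(f"TUR = {ratio:.1f}:1 -> {'OK' if ratio >= 4 else 'insufficient'}")
```

A ratio of 4 or better satisfies the rule of thumb; industries with stricter requirements simply raise the threshold.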
Consider Documenting Calibrators
If you are taking the time to calibrate, then take the time to properly document. Some calibrators make this easier: look for the ability to capture "As Found" and "As Left" results and PC connectivity. Multifunction calibrators will be better at documenting. HART Communicators and HART Multifunction Calibrators additionally have the capability to store measuring device HART configurations, making their replacements easy to set up.
When is it important to have an Intrinsically Safe Calibrator?
When you are working in a hazardous area with your calibrator or other electronic tools such as multimeters, make sure they are intrinsically safe. Equipment rated intrinsically safe will have been independently tested to comply with the requirements of an approving body and will be marked as such. There is a price premium, but it is worth it. Intrinsically safe devices are designed to use low amperage so that an internal spark cannot occur during a malfunction. It is obvious that an intrinsically safe calibrator and multimeter are needed when you are working in a gasoline storage tank farm, but there are many other such places. For example, flammable dusts, such as grain suspended in air, can explode with a spark. Consult with your organization's safety officer to learn which areas of the facility are hazardous. If you do not have intrinsically safe equipment, your only options are to remove the instrumentation and calibrate in a safe area or access the instrumentation loop using HART from a safe area.
What is the difference between Percent Full Scale (%FS) and Percent Reading?
As an example of the difference between 1% full scale and 1% of reading: if the full scale is 100 psig, then 1% of full scale is +/-1 psig. So if you are measuring 50 psig, the accuracy would be 50 +/-1 psig, or about 2% of reading, and the percentage gets worse the lower the pressure. With a 1% of reading specification, measuring 50 psig gives an accuracy of 50 +/-0.5 psig, maintaining 1% throughout the measuring range. This example is for pressure but applies to any unit of measure.
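A small sketch makes the difference concrete for the 0-100 psig example above:

```python
def error_fs(full_scale=100.0, pct=1.0):
    """Fixed error band for a percent-of-full-scale spec, e.g. +/-1 psig."""
    return full_scale * pct / 100.0

def error_rdg(reading, pct=1.0):
    """Error band for a percent-of-reading spec; it shrinks with the reading."""
    return reading * pct / 100.0

for p in (100.0, 50.0, 10.0):
    fs = error_fs()
    print(f"{p:5.1f} psig: +/-{fs:.1f} psig ({100 * fs / p:.0f}% of reading) "
          f"vs +/-{error_rdg(p):.1f} psig")
```

At full scale the two specifications agree; at 10 psig the full-scale spec has degraded to 10% of reading while the percent-of-reading spec still gives +/-0.1 psig.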
A Frequency Calibrator is used to source and read DC voltage pulses for calibration and troubleshooting problems with process instruments. Applications for calibration and troubleshooting with Frequency Calibrators include:
- Frequency counters
- Vibration systems
- Flowmeters such as turbine, paddle, vortex
Selection advice for Frequency Calibrators
- Check the maximum frequency that will be needed and compare against the calibrator specification. Most calibrators will be able to handle at least 10 kHz.
- Consider a multifunction calibrator that includes frequency along with other capabilities like loop, temperature, and pressure calibration
- Totalizer feature to count the pulses within a selectable time. Permits calibrating a totalizer without needing a stopwatch.
- Look for a calibrator that can Read and Source Frequency. Not all models can do both.
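As a concrete sketch of the frequency-to-flow relationship these calibrators exercise, here is a turbine flowmeter conversion. The K-factor of 900 pulses per gallon is a hypothetical example value, not taken from any particular meter's datasheet.

```python
K_FACTOR = 900.0  # pulses per gallon (hypothetical example value)

def flow_gpm(frequency_hz):
    """Convert pulse frequency to flow in gallons per minute."""
    return frequency_hz * 60.0 / K_FACTOR

def totalized_gallons(frequency_hz, seconds):
    """Total volume from a steady pulse rate over a timed window,
    like a calibrator's totalizer feature counting pulses."""
    return frequency_hz * seconds / K_FACTOR

print(flow_gpm(150.0))               # 150 Hz -> 10.0 gpm
print(totalized_gallons(150.0, 60))  # 10.0 gallons in one minute
```

Sourcing a known frequency into the flow computer and checking the displayed rate against this arithmetic is the essence of the calibration.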
Milliamp Signal Calibrators / Loop Calibrators / Voltage Calibrators
Milliamp Calibrators, also called Loop Calibrators, are used to calibrate and troubleshoot signal transmitters and loops. Milliamp calibrators usually come supplied with a built-in DC Voltage calibration capability.
What is a signal transmitter?
Sensors such as pH meters, temperature probes, and positive displacement flowmeters output signals that cannot travel long distances and/or are susceptible to noise from other wiring cables in the same conduit. A loop powered transmitter is supplied power typically from a 24 volt DC power supply. It converts the analog measured signal to a 4-20 mA current signal. It uses the supplied voltage from the signal loop to power itself. A non-loop powered transmitter will be 3-wire or 4-wire, using the additional wires for power.
How does a mA signal translate to an Engineering Unit?
Zero is represented by 4 mA and full span (100%) by 20 mA, so 50%, regardless of whether the unit is psig, °C, or liters per minute, is 12 mA.
Here is a graphical representation for converting 4-20 mA signals.
An elevated zero of 4 mA is used because a zero mA signal is indistinguishable between a true zero and a broken connection. For short distances or laboratory applications, a voltage output such as 0-5 VDC or 0-10 VDC is common and less expensive than a 4-20 mA transmitter. Current signals handle long runs with much less interference than voltage signals.
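The 4-20 mA scaling described above can be sketched as a pair of conversion functions:

```python
def ma_to_eng(ma, lo, hi):
    """Map a 4-20 mA signal onto an engineering range lo..hi."""
    return lo + (ma - 4.0) / 16.0 * (hi - lo)

def eng_to_ma(value, lo, hi):
    """Map an engineering value back to its 4-20 mA representation."""
    return 4.0 + (value - lo) / (hi - lo) * 16.0

print(ma_to_eng(12.0, 0.0, 100.0))  # 12 mA -> 50.0 (50% of span)
print(eng_to_ma(25.0, 0.0, 100.0))  # 25% of span -> 8.0 mA
```

Note that anything below 4 mA maps below zero, which is exactly why the elevated zero makes a broken wire detectable.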
Milliamp Calibrators (Loop Calibrators) are important for compensating for wiring runs
At the other end of the wiring run from the transmitter is an instrument such as a PLC (programmable logic controller), DCS (distributed control system), data logger, or controller. These devices actually will be measuring voltage so an external precision 250 ohm shunt resistor is installed across the terminals or built into the instrument. The instrument will be reading a voltage signal from 1 to 5 VDC. How? Look at ohms law, V = I x R, at 20 mA ==> 0.02 A x 250 ohms = 5.0 Volts.
Great, but what happens when you are using 200 feet of 20 gauge wire? A quick internet search will find a resistance of roughly 1 ohm per 100 feet, meaning the instrument will see 0.02 x 252 = 5.04 V, an increase of 0.8%. So your brand new, NIST-calibrated device is +0.8% in error before it is even installed. Depending on your application, this could be significant, and it gets worse with longer runs and higher gauges: 22 gauge is approximately 1.6 ohms/100 ft and 24 gauge 2.6 ohms/100 ft. 20-24 gauge are typical wire gauges for process instrumentation.
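The Ohm's law arithmetic above can be reproduced as follows, assuming (as in the example) that the receiving instrument effectively sees the 250 ohm shunt plus the wire-run resistance:

```python
SHUNT_OHMS = 250.0  # precision shunt at the receiving instrument

def shunt_voltage(ma, wire_feet, ohms_per_100ft):
    """Voltage seen by the instrument: loop current through the
    shunt plus the wire-run resistance, per the worked example."""
    loop_r = SHUNT_OHMS + wire_feet * ohms_per_100ft / 100.0
    return ma / 1000.0 * loop_r

v = shunt_voltage(20.0, 200, 1.0)  # 200 ft of 20 AWG at ~1 ohm/100 ft
err = (v - 5.0) / 5.0 * 100
print(f"{v:.2f} V at 20 mA -> {err:+.1f}% error")  # 5.04 V, +0.8% error
```

Re-running with 24 AWG (about 2.6 ohms/100 ft) shows how the error grows with thinner wire, which is why the 4-20 mA trim described next is worth doing.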
Rather than estimating the length of wire and the resistance loss, use a calibrator to quickly correct the error. These adjustments are often called 4-20 mA trim or current loop trim. Connect the milliamp calibrator in source mode in place of the transmitter, source 20 mA and 4 mA to the instrument, and set up the instrument to read 100% and 0% properly. Make sure to wait sufficient time for the reading to stabilize and/or set any instrument damping (sometimes called filter) to zero.
Source Only vs. Read/Source Calibrators and the Milliamp Process Clamp Meter
The minimum function of a calibrator is to source a known thermocouple, RTD, mA, or voltage. That permits performing zero/span adjustments and loop calibrators compensate for wiring runs, as discussed above. In source mode, testing of conditions that otherwise would be difficult or unsafe is possible. Imagine testing a tank level situation. With a mA calibrator, it is safe to simulate a 90% full condition to test a high alarm warning and 95% to test the high high (HH) alarm. Another safety example is simulating a high pH mA signal to trigger shutdown of pumps sending water into the sewer.
“I think we have a bad valve…” Milliamp calibrators in source mode can also be used to test valve positioners (i.e. “Stroke” the valve). Send signals to open, close, or any other position while watching the valve stem position on a bench or in the field for troubleshooting.
Variable frequency drives (VFD) are used to power motors, blowers, and fans in process applications as well as in conveyor systems and machine tools. Control inputs are generally voltage (1 V to 5 V or 0 V to 10 V) or current (4 mA to 20 mA). A milliamp/voltage calibrator can source a signal for commissioning and troubleshooting.
Make sure the source feature has selectable zero/span, slow ramp, fast ramp, and step ramp. It is invaluable in simulating a process. One cannot be at two places at the same time. The calibrator can be on an up or down ramp while the results are being viewed at the controller.
For troubleshooting problems, reading the signal coming from the sensor or a re-transmitted mA signal from a controller or recorder is necessary.
Milliamp Process Clamp, a cool tool!
It's bad enough that you are having problems with a mA instrumentation loop, but to also have to spend time disconnecting the transmitter and/or the back of the controller or recorder wastes more time. And of course the transmitter is 20 feet in the air or some other inaccessible place. Not to mention the possibility of stripping the heads of the termination screws or incorrectly re-connecting the wires.
Traditionally clamp meters were unable to read accurately down at the mA level until FLUKE developed their line of milliamp process clamp meters. With the clamp meter you can clamp along the loop just like a traditional clamp meter and get accurate milliamp readings. Different versions are available depending on budget to read current only; read/source current; and read/source current and DC voltage.
K110 and K100
Very small and compact, this microprobe is designed for accurate measurement of very low currents, with 50 µA DC sensitivity.
Very small size and “clip” shape make it ideal for probing and measuring in tight wiring areas such as circuit boards, 4 to 20mA process loops or automotive electronic circuitry. An excellent companion to all DMMs and instruments that will benefit from the probe’s high sensitivity, dynamic range and waveform displaying characteristics.
What is the difference between “Source” and “Simulate” in a milliamp calibrator?
Source will actually output a 4mA to 20mA signal based on the value selected. Simulate does not output anything but rather controls the current flow from an external source to be within 4mA to 20mA.
What are Linearity, Repeatability, and Hysteresis?
Linearity for process instrumentation is expressed as a percentage of full scale describing how far the instrument deviates from a best fit straight line (BFSL). For example, a typical pressure transducer has 0.5% or 0.25% full scale accuracy. Sometimes the specification will state it as Linearity.
Repeatability is not accuracy; it is how much the same instrument varies under the same conditions over multiple tests, usually expressed as a percentage of full scale.
One definition of Hysteresis has to do with control systems and preventing "chatter"; it is not applicable to our accuracy discussion. Think of your furnace thermostat: it turns on when the ambient temperature falls a preset amount below the setpoint and stays on until the temperature rises a preset amount above the setpoint. This prevents unnecessary cycling (i.e., "chatter") of the unit. If the furnace reacted to every fluctuation between 69.9 and 70.1 °F, the constant on/off action could wear it out prematurely. This is different from hysteresis in the context of measurement accuracy.
From a measurement standpoint, Hysteresis is the effect of loading versus unloading across the range of the instrument. Temperature can also affect hysteresis. For example, a pressure transducer has a pressure applied from zero to 100 psig. Then the pressure is removed from 100 psig to zero.
How accurate is your pressure transducer, mass flowmeter, and other sensors subject to Hysteresis?
The graph above is for illustration purposes but indicates an inherent hysteresis problem with pressure transducers and, to a lesser extent, thermal mass flowmeters. The Y-axis represents the pressure on the transducer display, the mA output signal, or the HART signal value. The X-axis represents the applied input. The ISA-37.1 standard calls for pressure transducer calibration data to be collected at a minimum of 5-6 points of rising and falling pressure (0, 20, 40, 60, 80, and 100%). Even with several points collected, it is still a best fit straight line; across the whole range there are places with better and worse accuracy. Manufacturers can play with the data too and still be correct in their specification: they can take multiple tests and average results, or be selective in which points they measure and use regression analysis in determining the best fit straight line. The end result is 0.5% linearity generally speaking, but at your specific measurement point, on your specific batch today at 2 PM, the accuracy can be considerably worse.
There are a few approaches to reducing your hysteresis error and error in general. Most processes operate at a fixed temperature, pressure, flow, etc., or at least a narrow range. Being able to see a larger range might be needed to move the process to the desired value, but the key time is spent in a narrow range of a larger full scale. Keeping that in mind, consider the following:
- When getting instruments NIST calibrated, get the data! Usually the factory or independent calibration laboratory will charge a small amount extra for the calibration data, but it is worth it. Remember to ask how many calibration points and which points are measured. If necessary, ask for more calibration points, usually a small adder in price. Now you can see where the errors are and compensate. Some controllers, PLC’s, etc., will have a menu to include a linearization table, typically 16 or more points. At a minimum, they might have a menu called bias or offset where you can enter a single point positive or negative offset. If your process runs at 100.0 psig for the main critical portion of your batch and you know you are off by -0.2 psig, if you cannot make the adjustment at the pressure transducer, look for an input bias/offset menu in the controller.
- When requesting calibration of your sensors, supply desired points and get the data as mentioned above.
- Get higher accuracy sensors. Also, keep in mind the difference between percent full scale and percent reading when selecting sensors discussed in greater detail above. If you get a percent full scale instrument make sure your operating range is in the top third of the instrument range.
- Compensate for wiring runs with a loop calibrator. Discussed earlier.
- DIY (Do It Yourself). Calibrate instruments in-house. At TEquipment we carry Fluke Calibration and Techne, lines worth considering.
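As a sketch of the linearization-table idea from the first bullet, here is a minimal piecewise-linear correction. The calibration points are hypothetical; a real table would come from the "get the data" calibration results described above, often 16 or more points.

```python
# (indicated, true) pairs from a hypothetical calibration report
cal_points = [(0.0, 0.0), (25.0, 25.3), (50.0, 50.4),
              (75.0, 75.2), (100.0, 99.8)]

def correct(indicated):
    """Piecewise-linear interpolation, like a controller's
    linearization table, mapping an indicated reading to the
    best estimate of the true value."""
    pts = sorted(cal_points)
    if indicated <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if indicated <= x1:
            return y0 + (indicated - x0) * (y1 - y0) / (x1 - x0)
    return pts[-1][1]

print(correct(50.0))  # at an indicated 50.0, the true value is 50.4
```

A single-point bias/offset menu is just the degenerate case of this table: one constant added to every reading.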
Multifunction Calibrators include two or more of the following: temperature, pressure, frequency, signal, and HART capability. Save money as well as space in the toolbox by purchasing one instrument with multiple functions.
Following is a discussion on HART Communications
What is a HART Communicator vs. HART Multifunction Calibrator? And what is HART anyway?
HART is a digital communications signal riding on top of a 4-20 mA signal, short for Highway Addressable Remote Transducer. It was developed by Emerson Process/Rosemount and later made public, so now all major instrument manufacturers support the standard. You may see other protocols such as Fieldbus (more popular in Europe), CANBUS (popular in automotive and production line manufacturing), and BRAIN (proprietary to Yokogawa), but HART is the king in process instrumentation. Smart pressure transmitters, temperature transmitters, flowmeters, and other instruments with HART capability give anyone with a HART communicator access to basic and advanced configuration parameters. Setting up ranges, units of measure, tag labels, input filter/dampening, and square root extraction for differential flow measurement devices like orifice plates is all possible remotely from anywhere on the 4-20 mA loop. Hazardous area transmitters become much less expensive because keypads or alternate interfaces suitable for use in explosive atmospheres are not needed.
A HART Communicator is used to communicate with the transmitter. Emerson still makes them, model 475, which we carry at TEquipment. We also carry Meriam’s MFC HART Communicator. A HART Communicator is not a calibrator. Re-ranging the URV and LRV (upper and lower range values) of a transmitter is not calibrating. A HART Multifunction Calibrator has the capabilities of a HART Communicator and a multifunction calibrator may also store the measuring instrument configuration files.
FAQ: Can I get a NIST certificate for my HART Communicator?
Answer: No, but you can get one for a HART Multifunction Calibrator.
A HART Communicator is an instrument configuration tool, much as a universal TV remote can be configured to operate your TV. Stored in memory are device-specific drivers so it can communicate with over a thousand registered instruments.
HART protocol has a standard set of menus, so basic tasks can be performed on the thousands of instruments that use them. If you need to drill deep into a very specific capability of your instrument, then you will need its device driver (DD). The HART Communication Foundation now maintains this list, and any HART Communicator or Multifunction Calibrator manufacturer will install the latest set when you purchase. These do not change frequently, but it is always a good idea to get the option for 3 years of no-charge upgrades for the Emerson 475 HART Communicator. For HART Multifunction Calibrators from Fluke, as of this writing, device updates are no-charge downloads. Martel Electronics makes a HART Multifunction Calibrator too, but it has loaded only the basic/universal HART commands, so upgrades are not needed. Meriam's MFC HART Communicator and HART Multifunction Calibrator, as of this writing, include at least 3 years of upgrades at no charge. If you will be using more than just the basic commands, then regardless of model, do check the supported devices list before purchase.
Pressure Calibrators are used to troubleshoot and calibrate pressure transducers, transmitters, and gauges. Unlike other calibrators, one pressure calibrator cannot cover all pressure ranges. At time of order, the pressure range must be specified. Selecting one very wide range is not recommended because pressure sensors are typically percent full scale accuracy (or have a percent full scale component). Refer to the earlier discussion Percent Full Scale vs. Percent Reading Accuracy.
The good news is that most designs have interchangeable optional pressure modules for alternate ranges available for purchase with the instrument or later.
Understanding Pressure Modules and Ranges
The above graphic best explains the difference between all the pressure related terms. Key to understanding is where the reference point is located. To further clarify, Differential has to do with measuring the difference between any two pressures and is performed by differential pressure transmitters. Differential pressure applications include measuring flow using orifice plates and laminar flow elements (both available from TEquipment).
Pressure measurement does not offer very wide ranges with good accuracy; this is inherent in the nature of the available technologies. It is especially true when you consider the full scale accuracy problem. A pressure transducer measuring 1000 psig +/-1% full scale will be +/-10 psig; it is not practical to use it to measure 8 psig, for example. The manufacturers' solution has been to offer more sensors with different ranges. In the previous example, you would obtain a model with a range of 0-10 psig or even 0-30 psig. Gauge pressure is not the only type of pressure measurement, as explained in the graphic.
Available calibrator pressure modules types:
A pressure calibrator or multifunction calibrator with pressure calibration capability will usually include one or two pressure modules, and others can be purchased separately. To summarize, when selecting the modules, consider the types of measurements being made and the range. When selecting the range, do not choose a range much larger than required. This is one time where too much is not a good thing, as it will lead to poor accuracy.
- Absolute Pressure. Pressure module measurements are referenced to zero pressure absolute, or a perfect vacuum (-14.7 PSIG or 0 PSIA). When these modules are open to atmosphere they will read approximately 14.7 PSIA (approximately 1 atmosphere at sea level). With absolute pressure starting at a true zero, there cannot be a negative absolute pressure measurement.
- Differential Pressure. Differential pressure transmitters will have two ports, a High and Low side and marked as such. You may see units such as psid.
- Dual or Compound. Pressure modules will read both positive and negative pressures through a single input port.
- Gauge Pressure. Pressure modules read pressure relative to the local atmospheric or ambient pressure, also known as one atmosphere. It’s stated in units of “G” (Gauge), for example if you are measuring PSI (Pounds per Square Inch), the pressure will be listed as PSIG.
- Vacuum. Pressure modules read only negative pressure with atmospheric pressure being your reference.
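The relationships among these pressure types can be summarized numerically, using the 14.7 psi sea-level atmosphere from the absolute-pressure description above:

```python
ATM_PSI = 14.7  # approximate atmospheric pressure at sea level

def gauge_to_absolute(psig):
    """Gauge pressure is referenced to atmosphere; add it back for absolute."""
    return psig + ATM_PSI

def absolute_to_gauge(psia):
    """Absolute pressure is referenced to a perfect vacuum."""
    return psia - ATM_PSI

print(gauge_to_absolute(0.0))  # open to atmosphere: 14.7 psia
print(absolute_to_gauge(0.0))  # perfect vacuum: -14.7 psig
```

A differential module simply reads the difference between its high and low ports, and a compound module is a gauge module whose range extends below zero.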
Zero and Span errors and how a calibrator corrects them
A zero error in a sensor is a positive or negative shift when the sensor is at zero. Similarly, a span error occurs at the maximum end of its measuring range (i.e. at full scale, also called span). A relatable real-world example of zero error is a digital weigh scale: at rest with no load, the display does not always read exactly zero. Scale manufacturers provide a zero or tare button to zero the scale, which also serves to zero out an empty container in filling applications.
The following charts graphically represent the effect of a zero offset, a span offset, and if both occur. In these examples, a pressure transducer with range of 0-100 psig is plotted, but it could easily be 0-100% for any variable being measured. The center line in each case is the actual (true) value and possible high and low errors are shown. Y-axis represents the value shown on the sensor display or signal output and the X-axis is the input pressure being seen by the transducer.
Load cells, pressure sensors, and flowmeters are particularly known for having a zero offset, but zero and span errors can occur with any instrument. For pressure, it is inherent in the technology because of a slight loss of memory in the deflection of the thin metal membrane in contact with the process and also because of changes in atmospheric pressure due to elevation above sea level versus where the pressure transducer was originally calibrated. In manual weighing, it is easy to press the zero button, but for process instrumentation, it is not as easy. Better sensors will have zero and span adjustments using a keypad, potentiometers (or pots), or through digital communications using a HART Communicator. Ideally, the process input (also called PV or process variable) to the sensor is set to zero and span. Using the example above, it would be setting the pressure vents of the transducer open to read atmospheric pressure (zero) and using a pressure calibration pump or deadweight tester for the full scale span. The second choice would be to trim the output (also called analog output AO) of the transducer instead of the input. This would mean that the transducer will still believe the pressure is incorrect but output the correct value. This could be a problem for smart instruments, where 4-20 mA and digital signals like HART are both being used.
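A minimal sketch of the two-point zero/span trim described above, using hypothetical as-found readings for a 0-100 psig transducer:

```python
# As-found readings (hypothetical): applied 0 psig reads 1.5,
# applied 100 psig reads 102.5.
zero_reading, span_reading = 1.5, 102.5
span_true = 100.0

gain = span_true / (span_reading - zero_reading)  # corrects the span error
offset = -zero_reading * gain                     # corrects the zero error

def trimmed(raw):
    """Apply the two-point zero/span correction to a raw reading."""
    return raw * gain + offset

print(round(trimmed(1.5), 9))    # as-found zero now reads 0.0
print(round(trimmed(102.5), 9))  # as-found span now reads 100.0
```

This is the arithmetic a keypad, potentiometer, or HART trim performs internally; trimming the input (PV) rather than the output (AO) keeps the digital and analog values consistent on smart instruments.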
Pressure Calibration Pumps
When calibrating pressure transducers on a bench, actual pressure needs to be generated so that the pressure transducer/transmitter and the pressure calibrator can be connected in a manifold arrangement to see the same pressure. Hand operated pressure test pumps serve this purpose.
There are overlapping ranges, so select a model that best covers your applications.
- Pneumatic models use air and typically range from vacuum to about 600 psig.
- Hydraulic Test Pumps use oil and achieve much higher pressures of up to 10,000 psig.
Features to consider when selecting pressure calibration pumps
- Kit versions include hoses, fittings, spare filters, and/or more than one pump to cover a wider range at a savings.
- Multi-turn knob for fine adjustment of pressure.
- Adjustable stroke control to allow for fast priming or filling of test systems. This gives the operator the ability to switch as needed to a smaller stroke for easier pumping at high pressure.
- Pump can be easily cleaned without disassembly.
Deadweight Testers (DWT)
Deadweight Testers use traceable weights to apply pressure to a fluid such as air, water, or oil to calibrate pressure gauges, transducers, transmitters, and portable calibrators. Due to their fundamental method of pressure measurement using calibrated piston-cylinders and masses, they offer unmatched measurement stability and reliability. DWT’s are considered primary standards.
Deadweight testers also inherently regulate a stable test pressure once the piston is floated, solving a problem that operators of some manual pressure calibrators encounter.
Deadweight testers also measure accurately over a wide range of pressure. The uncertainty of deadweight tester measurements is a percent of the measured value (% of reading). By including multiple piston-cylinders, a single Fluke Pressurements Deadweight Tester can calibrate units under test with full scale ranges that vary by a 100:1 ratio or more.
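The fundamental principle behind a deadweight tester's stability is simple force-over-area arithmetic: pressure equals mass times gravity divided by the piston area. This sketch uses illustrative numbers, not any particular instrument's piston data.

```python
G_LOCAL = 9.80665  # m/s^2; in practice masses are trimmed for local gravity

def dwt_pressure_kpa(mass_kg, piston_area_mm2):
    """Pressure generated by calibrated masses floating on a
    piston-cylinder: P = m * g / A, converted from Pa to kPa."""
    force_n = mass_kg * G_LOCAL
    return force_n / (piston_area_mm2 * 1e-6) / 1000.0

# 10 kg on a 1 cm^2 (100 mm^2) piston:
print(round(dwt_pressure_kpa(10.0, 100.0), 1))  # roughly 980.7 kPa
```

Because the pressure depends only on mass, gravity, and area, swapping to a smaller piston multiplies the range without changing the masses, which is how one tester covers a 100:1 span or more.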
If you think deadweight testers are too complicated or expensive, then this whitepaper from Fluke Calibration is a must read: Performing Precise Pressure Calibrations May Cost Less Than You Think
Deadweight Tester Selection Considerations
- Pressure range. Consider the highest and lowest pressure that needs to be generated. DWT's cover from vacuum to 60,000 psi (400 MPa). Performance is less than ideal when the pressure being generated is below 10% of full scale; above 10% of full scale, accuracy is percent of reading. Applications that require both low and high hydraulic pressures can be handled using dual piston units. These units are provided with both a low range and a high range piston/cylinder. Switching between the low range and high range on a Pressurements DWT is as simple as removing the masses from one piston and placing them on the other. No valves need to be switched.
- Pressure media. Pneumatic or hydraulic (water or oil). Pneumatic, or gas, instruments are ideal for lower pressure ranges. Gas is preferred whenever cleanliness is required. In addition, using gas reduces the impact of head height corrections. However, at higher pressures it is necessary to use water or oil.
The usefulness of gas as a media is limited to approximately 2,000 psi (14 MPa). There are two reasons for this. First, there is more risk of explosion at high gas pressures, which is a safety concern. Second, generating high gas pressure requires expensive intensifiers or gas boosters. Using oil or water eliminates these issues. Since water is not a very good lubricant, oil is preferable when allowed.
One advantage of liquids is they are incompressible. This allows a small change in the volume of the system (through a screw pump) to result in large changes in pressure. The most common approach is to use a mineral oil or water as the media. Oil is ideal in that it assists in lubricating the piston and cylinder. The downside to using oil as a media is it now introduces the device under test to possible contamination. An option is a liquid-to-liquid separator, which uses one liquid in the device and another in the DWT.
- Pressure generation options. On-board hand pumps are offered on Fluke Calibration Pressurements deadweight testers to generate vacuum or air pressure or to prime higher pressure hydraulic systems.
- Weight increments. Purchase the necessary weights for the desired ranges. The calibration weights need to be trimmed for the local gravity, so providing the final location of the instrument is critical for accuracy.
- Accuracy. DWT are inherently a percent of reading device. There is a lower breakpoint, normally 10% of full scale, where the specification ceases to be percent of reading.
Process Pressure Gauges
Process Pressure Gauges are analog or digital gauges with threaded connections for mounting on a gas, liquid, or steam line. Digital pressure gauges are either battery powered or 24 VDC powered. Pressure gauges come in a variety of types and ranges as explained above in Pressure Modules. Some models have datalogging capability.
Reference class pressure gauges are available. They are supplied with NIST calibration certificates and are high accuracy for use in pressure calibration work.
Pressure Transmitters are pressure gauges that output an analog signal, such as 4-20 mA, 0-5 VDC, or 0-10 VDC.
5 things to consider when selecting a Pressure Transmitter
- Pressure Range. The pressure range must be specified at time of order. Selecting one very wide range is not recommended because pressure sensors are typically percent full scale accuracy (or have a percent full scale component). Refer to the earlier discussion Percent Full Scale vs. Percent Reading Accuracy.
- Accuracy. See the earlier discussion on Percent Full Scale vs. Percent Reading Accuracy
- Process Connection. Pressure Transmitters are available with many plumbing connections for the process, such as 1/2 inch NPT thread or 1/4 inch NPT thread.
- Material Compatibility. Standard wetted parts (the parts in contact with the process) are 316 SS or 316L SS. Stainless is suitable for most applications, but not all.
- Output and connection type. Select 4-20 mA for long cable runs; for shorter lengths, voltage output is an option to consider. The wiring varies too: some transmitters are as simple as potted wires in the transducer with a short length of unterminated leads, while others have special connectors.
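Because transmitter outputs are linear, converting the signal back to engineering units is a simple scaling. A minimal sketch, assuming a hypothetical 0-100 psi transmitter with a 4-20 mA output:

```python
# Sketch: convert a linear 4-20 mA transmitter signal to engineering units.
# The 0-100 psi range below is a hypothetical example.

def ma_to_pressure(ma, low=0.0, high=100.0):
    """Map 4-20 mA linearly onto the configured range (psi here)."""
    return low + (ma - 4.0) / 16.0 * (high - low)

print(ma_to_pressure(4.0))   # 0.0 psi (zero)
print(ma_to_pressure(12.0))  # 50.0 psi (midpoint)
print(ma_to_pressure(20.0))  # 100.0 psi (span)
```

The same arithmetic, with 16.0 replaced by the voltage span, applies to 0-5 VDC or 0-10 VDC outputs.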
Temperature Calibration equipment is quite varied, matching the many different methods of temperature measurement. Temperature is the most measured physical property, so it is logical to invest in calibration equipment.
Here is a general discussion on Temperature Calibration that also covers the technology behind different temperature probes (thermocouple, RTD, and thermistor): Temperature Measurement and Calibration Application Note - What every instrument technician should know
Temperature Calibration Metrology Wells
Temperature Calibration Metrology Wells are used to compare a known set temperature with the measurement from the thermocouple, RTD, or liquid-filled thermometer under test. Checking temperature is vital in numerous processes. Temperature is the most widely measured variable, so a temperature dry well or liquid bath for calibration work is easy to justify.
Considerations when selecting a Temperature Dry Well or Liquid Bath
- Lab or portable.
- Temperature range. Because temperature is measured from cryogenic to extremely high values, dry wells and baths are made to cover wide ranges. Selection starts with understanding the required range.
- Desired accuracy. The rule of thumb for process instrumentation is to select calibration equipment with an accuracy 4x better than the instrument being calibrated. Baths come supplied with a digital temperature controller; look at its accuracy specification. If it is insufficient, you can purchase an external temperature probe and display with better accuracy and use that as the master meter. It will also be easier to send out for annual re-calibration than a complete system.
- Watch the stability specification. The benefit of having a high accuracy temperature probe is lost if the dry well or liquid bath temperature is constantly fluctuating.
- While it is important to account for future needs, do not overstate the range. As ranges widen, more complicated solutions are required. The following compares dry blocks and various liquid bath mediums.
Medium: Water
- Lowest initial cost
- Wide common temperature range
- Water is easy to clean up, non-hazardous, and costs practically nothing
- Accommodates odd-size temperature probes
- Takes several hours to warm up or cool down and stabilize*
- Water baths evaporate a lot of water as the temperature approaches the boiling point, even with a lid or polypropylene spheres (i.e., a ball blanket). Be prepared for refilling.
- Moisture from evaporation may corrode electronics over time

Medium: Denatured Alcohol (typical range: -40°C/°F to ambient)
- A water bath may be usable with denatured ethanol (alcohol); check with the manufacturer or TEquipment
- Evaporation and the volume of bath alcohol are a concern, not only for refilling but for flammability
- Easy to clean up
- Takes several hours to cool down and stabilize

Medium: Glycol or Oil
- Higher initial cost than a water bath and, over time, the most expensive option because the bath medium must be replaced as it oxidizes from the heat
- Typical bath mediums: glycol/water mix, mineral oil, Dow silicone oil, or polyalphaolefin (PAO)
- Glycol/water mix typical range: 0-95°C (32-200°F); oils are intended for temperatures above the boiling point of water
- Oil fumes must be well ventilated using a laboratory hood
- Accommodates odd-size temperature probes
- Takes several hours to warm up or cool down and stabilize*

Dry Block (typical range: -25 to 1200°C / -13 to 2192°F)
- Bench models have a higher initial cost than liquid baths
- Portable versions are available
- No evaporation or fuming concerns; hassle-free
- Requires inserts drilled for known probe diameters and lengths; some are supplied with the unit, additional ones are purchased separately
- Fastest warm-up and stabilization time, at 20-60 minutes
- Blocks are made of aluminum up to approximately its melting point of 660°C (1220°F); above that temperature, stainless steel is typically used
* One workaround used by some labs is to set the bath on a timer to turn on an hour or two before the start of the work day. Make sure to check the low-level and over-temperature shutdown safeties regularly.
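The 4x rule of thumb mentioned earlier reduces to a simple test accuracy ratio (TAR) check. A sketch, with hypothetical accuracy specifications:

```python
# Sketch: test accuracy ratio (TAR) check for the 4:1 rule of thumb.
# The probe and controller specs below are hypothetical examples.

def tar(uut_accuracy, standard_accuracy):
    """Ratio of the unit-under-test spec to the calibration standard spec."""
    return uut_accuracy / standard_accuracy

# Hypothetical: probe under test spec +/-0.5 C, bath controller spec +/-0.2 C
ratio = tar(0.5, 0.2)
print(ratio, ratio >= 4.0)  # 2.5 False -> controller alone is not good enough

# Adding an external reference probe at +/-0.05 C restores a 10:1 ratio
print(tar(0.5, 0.05) >= 4.0)  # True
```

This is the calculation behind the advice to add an external master-meter probe when the built-in controller's accuracy is insufficient.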
Getting sub-ambient temperatures with temperature baths
To lower the bath temperature below ambient, several different solutions are possible.
- Thermoelectric cooling is based on the Peltier effect. It is purely electronic, with no moving parts except a cooling fan. That is a great benefit over refrigeration compressors, but thermoelectric cooling is very limited in cooling capacity (i.e., BTUs of heat removal). Because of that limitation, it only works in very small baths. The technology can be adapted to dry blocks or small-volume liquid baths.
- Dip Cooler ("Coldfinger") is an external refrigerated chiller with a pump that circulates a cooling medium of denatured alcohol or glycol/water mix in a closed loop through a cooling coil immersed in the temperature calibration bath. A heat exchanger in the cooler has the cooling medium on one side and the refrigerant on the other. Available from Techne.
- Flow-thru Cooler is a refrigerated chiller without a circulation pump; the cooling medium of denatured alcohol or glycol/water mix is instead circulated in a closed loop by a pump in the temperature bath. Analogous to a drinking water fountain using city water pressure and no internal pump. A heat exchanger in the cooler has the cooling medium on one side and the refrigerant on the other. Available from Techne.
- Built-in refrigeration compressor(s). Having the compressors built into the bath offers the convenience of one instrument and avoids the need for circulating cold fluid. Colder temperatures are also possible. A negative is that the combined instrument is larger and heavier, and so more expensive to ship for repairs.
When purchasing a cooler or a bath with built-in refrigeration compressors, make sure to select a model for the appropriate local voltage (110 VAC or 220 VAC). Refrigeration compressors are only available in one voltage or the other. If the wrong one is purchased, a transformer to step the voltage up or down is expensive because of the amperage that compressors draw.
Temperature Calibrators and Metrology Wells are used to troubleshoot and calibrate temperature probes and the instruments connected to those probes, such as chart recorders, temperature controllers, and PLCs. Temperature is the most measured physical property, so it is logical to invest in a calibrator.
There are many different thermocouple and RTD probe types. Does that mean your new calibrator needs to handle all of them? Maybe. If you are a contractor, you might come across several; facilities with wide temperature ranges and varied applications might need the capability too.
Source Only vs. Read/Source Calibrators
The minimum function of a calibrator is to source a known thermocouple and/or RTD signal. That permits performing zero/span adjustments. In source mode, testing conditions that would otherwise be difficult or unsafe becomes possible. Imagine testing a reactor high-temperature alarm: with a temperature calibrator it is safe to simulate a reading at 90% of range to test the high alarm and 95% to test the high-high (HH) alarm. Another example is simulating zero and span to confirm that a remote temperature controller reads correctly.
For troubleshooting problems, reading the signal coming from the sensor is necessary.
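A typical source-mode check steps the simulated input through several points across the span and records the error displayed by the receiving instrument. A sketch, assuming a hypothetical 0-400 °C loop and made-up controller readings:

```python
# Sketch: five-point up-test for a sourced temperature input. The loop
# range and displayed readings below are hypothetical examples.

def percent_of_span_error(sourced, displayed, span_lo, span_hi):
    """Instrument error expressed as percent of span."""
    return 100.0 * (displayed - sourced) / (span_hi - span_lo)

# (sourced value, value displayed by the controller), both in C
test_points = [(0.0, 0.4), (100.0, 100.6), (200.0, 200.9),
               (300.0, 301.1), (400.0, 401.2)]

for src, disp in test_points:
    err = percent_of_span_error(src, disp, 0.0, 400.0)
    print(f"source {src:6.1f} C  error {err:+.2f} % of span")
```

Comparing each point's error against the loop's tolerance (e.g., ±0.25% of span) tells you whether a zero/span adjustment is needed.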
How many RTD and Thermocouple types is enough in a Calibrator?
Most Popular Temperature Probe Types
- PT-100 (0.00385 coefficient) RTD
- PT-1000 (0.00385 coefficient) RTD
Thermocouple and RTD Basics
Thermocouples (T/Cs) generate a small millivolt (mV) signal that varies nearly linearly with temperature. They are two dissimilar wires joined (typically welded) at the tip. RTDs use a resistance principle: a thin coiled wire, usually platinum, varies in resistance (ohms) with temperature. Thermocouples have a wider operating temperature range and better resistance to vibration and shock than RTDs, but RTDs are much more accurate and repeatable. RTDs are 4-10 times more expensive. Types K, J, T, N, and E are known as base-metal thermocouples because they are made from common metals; R, S, and B are made from noble metals.
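For the PT-100 and PT-1000 probes mentioned above, the resistance-versus-temperature relationship is described by the Callendar-Van Dusen equation with the standard IEC 60751 (0.00385 coefficient) constants. A sketch:

```python
# Sketch: platinum RTD resistance from temperature, using the IEC 60751
# Callendar-Van Dusen coefficients for the 0.00385 curve.
A = 3.9083e-3   # 1/C
B = -5.775e-7   # 1/C^2
C = -4.183e-12  # 1/C^4 (term only used below 0 C)

def pt_resistance(t_c, r0=100.0):
    """Resistance in ohms of a PT-100 (r0=100) or PT-1000 (r0=1000)."""
    if t_c >= 0.0:
        return r0 * (1.0 + A * t_c + B * t_c**2)
    return r0 * (1.0 + A * t_c + B * t_c**2 + C * (t_c - 100.0) * t_c**3)

print(pt_resistance(0.0))    # 100.0 ohm at the ice point
print(pt_resistance(100.0))  # ~138.51 ohm
```

A calibrator in RTD source mode synthesizes exactly this resistance so the receiving instrument sees a known temperature.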
Thermocouple Types and Applications
- Type K: Wide range. Inexpensive. Popular. Better above 1000 °F (538 °C).
- Type J: Wide range. Inexpensive. Popular. Keep away from atmospheres that oxidize the iron. Shorter life above 1000 °F (538 °C).
- Type T: Typically the best accuracy, especially at ambient and below freezing, but the lowest max temperature. Popular in the pharmaceutical industry.
- Type N: For applications where Type K has a shorter life. Better accuracy than K or J, but less popular.
- Type E: Highest mV output. Medium temperature range.
- Type R: High-temperature industrial applications above 1000 °F (538 °C).
- Type S: High-temperature laboratory applications above 1000 °F (538 °C).
- Type B: Same as Types R and S but with a higher max temperature and lower mV output.
Notice that temperature range and accuracy are absent from the thermocouple table above. They vary so much by manufacturer that it is best to consult the specifications when selecting a temperature probe.
Infrared Calibrators / Blackbodies
Infrared Calibrators, also called Blackbodies, are reference standards used to calibrate Infrared Thermometers and Thermal Imaging Cameras. Even IR Thermometers that cannot be adjusted benefit from testing to verify the consistency and validity of results. Virtually any instrument with a spot size diameter smaller than the cavity size can be calibrated. It is important not to hold the IR thermometer too close to the target; this causes the thermometer's optics to heat excessively, producing false readings. It is also important not to be too far away; then the target does not fill the IR thermometer's spot size, which also causes a false reading.
Advice for Selecting an Infrared Calibrator
- Make sure to know the desired temperature range
- Select a model with a target area larger than the spot size diameter of the instrument to be checked
- Decide between a surface type or cavity type infrared calibrator

How does an Infrared Calibrator work?
The units use a heated plate painted with a black paint having an emissivity of 0.95. The temperature is controlled by a digital controller that uses a precision platinum RTD as its sensor. Models that achieve temperatures below ambient use a Peltier thermoelectric cooling system.
The IR calibrator is calibrated with an emissivity setting of 0.95. It has a variable emissivity adjustment that allows the user to vary the apparent emissivity from 0.90 to 1.00; this setting should match the IR thermometer's emissivity setting. It is best to use 0.95. However, some IR thermometers do not allow an emissivity setting of 0.95; for these instruments, set the calibrator's emissivity to match the IR thermometer's setting.
Every object with a temperature above absolute zero (0 Kelvin) radiates energy over a wide spectral band. For example, if a significant part of this energy is within the band of 400–700 nm, we can see that energy. This is the visible light band. This is the case with an electric stove burner at a temperature of 800°C. The burner will appear red or orange to the eye (red hot). That burner is also emitting energy at other wavelengths, which we cannot see. This includes wavelengths in the infrared portion of the electromagnetic spectrum.
An example of an object emitting energy at wavelengths we can see is the sun. By Wien's displacement law, the wavelength of peak emission shifts with temperature: an object at room temperature (23°C) has a peak wavelength of 9.8 μm. The temperature corresponding to a peak wavelength of 8 μm is 192°F (89°C), and the temperature corresponding to a peak wavelength of 14 μm is -86°F (-66°C). This is one of the reasons the 8-14 μm band is widely used in handheld IR thermometers.
IR thermometers take advantage of this peak wavelength phenomenon. They measure the amount of energy radiating from an object and calculate temperature based on this measured energy. In most handheld IR thermometers, the sensor and optical system measure IR energy in the 8-14μm band.
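The peak wavelengths quoted above follow directly from Wien's displacement law, λ_peak ≈ 2898 μm·K / T. A quick sketch:

```python
# Sketch: Wien's displacement law reproduces the peak wavelengths quoted
# above: lambda_peak (um) = b / T (K), with b the Wien constant.
WIEN_B = 2897.77  # um*K

def peak_wavelength_um(t_celsius):
    """Wavelength of peak thermal emission for a blackbody at t_celsius."""
    return WIEN_B / (t_celsius + 273.15)

print(peak_wavelength_um(23.0))   # ~9.8 um (room temperature)
print(peak_wavelength_um(89.0))   # ~8 um
print(peak_wavelength_um(-66.0))  # ~14 um
```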
Emissivity is defined as the ratio of the energy emitted at a temperature to the energy emitted by a perfect blackbody at that same temperature. A perfect blackbody would have an emissivity of 1.0. However, in the real world there is no such thing as a perfect blackbody.
For example, if a perfect blackbody emits 10000 W/m2 at a given temperature and a material emits 5000 W/m2 at that same temperature, then the emissivity of that material is 0.5 or 50%. If another material emits 9500 W/m2 at that same temperature, it has an emissivity of 0.95.
It is important to note that for any opaque material, emissivity plus reflectivity equals 1.0 (a consequence of Kirchhoff's law, which states that emissivity equals absorptivity). Therefore, if a material's emissivity is 0.95, it reflects 5% of the energy radiated by objects facing it; if its emissivity is 0.50, it reflects 50%. This reflected energy can contribute to measurement error, especially when measuring materials with lower emissivity and objects at lower temperatures.
A lack of knowledge of emissivity can itself contribute greatly to inaccuracy in IR temperature measurement. For example, say we are measuring an object at 500°C and assume it has an emissivity of 0.95, but its emissivity is really 0.93. This would cause our 8-14 μm IR thermometer to read the temperature about 6.7 degrees low, a -6.7°C error.
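That error can be roughly reproduced with a simplified band-radiance model: integrate Planck's law over the 8-14 μm band, scale the signal by the emissivity mismatch, and invert numerically. This sketch ignores reflected ambient radiation and instrument detail, so it lands near, not exactly on, the -6.7 °C figure:

```python
import math

# Sketch: reading error of an 8-14 um IR thermometer when the assumed
# emissivity (0.95) differs from the true emissivity (0.93). Simplified
# Planck-band model; the exact error depends on the instrument.
C1 = 3.7418e8   # W*um^4/m^2, first radiation constant
C2 = 14388.0    # um*K, second radiation constant

def band_radiance(t_k, lo=8.0, hi=14.0, steps=600):
    """Integrate Planck's law over the band (midpoint rule)."""
    dl = (hi - lo) / steps
    total = 0.0
    for i in range(steps):
        lam = lo + (i + 0.5) * dl
        total += dl * C1 / (lam**5 * (math.exp(C2 / (lam * t_k)) - 1.0))
    return total

def indicated_temp(true_t_k, true_eps=0.93, assumed_eps=0.95):
    """Bisect for the temperature whose band signal matches the scaled one."""
    target = (true_eps / assumed_eps) * band_radiance(true_t_k)
    lo_t, hi_t = 200.0, 2000.0
    for _ in range(60):
        mid = 0.5 * (lo_t + hi_t)
        if band_radiance(mid) < target:
            lo_t = mid
        else:
            hi_t = mid
    return 0.5 * (lo_t + hi_t)

t_true = 500.0 + 273.15  # the 500 C object from the example, in kelvin
err = indicated_temp(t_true) - t_true
print(f"indicated error: {err:+.1f} C")  # reads several degrees low
```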
Emissivity, blackbodies and graybodies
Most people associate a blackbody calibration source with calibrating infrared thermometers. Although the word blackbody specifically refers to an ideal surface that emits and absorbs electromagnetic radiation with the maximum amount of power possible at a given temperature, many calibrators with non-ideal surfaces are also referred to as "blackbody calibrators." While an ideal surface would have an emissivity equal to 1.00, many of these "blackbody calibrators" have an emissivity of approximately 0.95 (better described as a "graybody"). A true blackbody calibration source is usually a long cavity with a narrow opening. Unfortunately, the opening is usually too narrow to be useful for calibrating common infrared thermometers, which require a large target size for an accurate calibration. The advantage of a true blackbody source is that its emissivity is precisely known, whereas traditional flat-plate calibrators have emissivities with uncertainties too large for meaningful calibrations of most thermometers.