〈645〉 WATER CONDUCTIVITY
Electrical conductivity in water is a measure of the ion-facilitated electron flow through it. Water molecules dissociate into ions as a function of pH and temperature and result in a very predictable conductivity. Some gases, most notably carbon dioxide, readily dissolve in water and interact to form ions, which predictably affect conductivity as well as pH. For the purpose of this discussion, these ions and their resulting conductivity can be considered intrinsic to the water.
Water conductivity is also affected by the presence of extraneous ions. The extraneous ions used in modeling the conductivity specifications described below are the chloride and ammonia ions. The conductivity of the ubiquitous chloride ion (at the theoretical endpoint concentration of 0.47 ppm when it was a required attribute test in USP XXII and earlier revisions) and of the ammonia ion at the limit of 0.3 ppm represents a major portion of the allowed water impurity level. A balancing quantity of cations, such as the sodium ion, is included in this allowed impurity level to maintain electroneutrality. Extraneous ions such as these may have a significant impact on the water's chemical purity and its suitability for use in pharmaceutical applications. The combined conductivities of the intrinsic and extraneous ions vary as a function of pH and are the basis for the conductivity specifications described in the accompanying table and used when performing Stage 3 of the test method. Two preliminary stages are included in the test method. If the test conditions and conductivity limits are met at either of these preliminary stages, the water meets the requirements of this test, and proceeding to the third stage is unnecessary. Only in the event of failure at the final test stage is the sample judged noncompliant with the requirements of the test.

INSTRUMENT SPECIFICATIONS AND OPERATING PARAMETERS
Water conductivity must be measured accurately using calibrated instrumentation. The conductivity cell constant, a factor used as a multiplier for the scale reading from the meter, must be known within ±2%. The cell constant can be verified directly by using a solution of known conductivity, or indirectly by comparing the instrument reading taken with the cell in question to readings from a cell of known or certified cell constant.
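As an illustration only (not part of the chapter), the following sketch shows the arithmetic behind an indirect cell-constant check: the cell constant inferred from a standard solution of known conductivity is compared with the nominal value and must agree within ±2%. The helper name, the roughly 147 µS/cm dilute KCl standard, and the example readings are assumptions for the example.

```python
# Sketch: verifying a conductivity cell constant against a standard solution.
# The standard-solution value and the measured conductance below are illustrative
# assumptions; they do not come from the chapter itself.

def verify_cell_constant(nominal_k_cm: float,
                         standard_conductivity_uS_cm: float,
                         measured_conductance_uS: float,
                         tolerance: float = 0.02) -> bool:
    """Return True if the inferred cell constant is within +/-2% of nominal.

    cell constant K (cm^-1) = known conductivity (uS/cm) / measured conductance (uS)
    """
    measured_k = standard_conductivity_uS_cm / measured_conductance_uS
    relative_error = abs(measured_k - nominal_k_cm) / nominal_k_cm
    return relative_error <= tolerance

# Example: a nominal 0.10 cm^-1 cell reading 1466 uS in a ~147 uS/cm KCl standard
# deviates by about 0.2%, well inside the +/-2% criterion.
print(verify_cell_constant(0.10, 146.9, 1466.0))  # True
```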
Meter calibration is accomplished by replacing the conductivity cell with NIST-traceable precision resistors (accurate to ±0.1% of the stated value) or an equivalently accurate adjustable resistance device, such as a Wheatstone bridge, to give a predicted instrument response. Each scale on the meter may require separate calibration prior to use. The frequency of recalibration is a function of instrument design, degree of use, etc. However, because some multiple-scale instruments have a single calibration adjustment, recalibration may be required between each use of a different scale. The instrument must have a minimum resolution of 0.1 µS/cm* on the lowest range. Excluding the cell accuracy, the instrument accuracy must be ±0.1 µS/cm.
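The resistor-substitution check reduces to simple arithmetic: with the cell replaced by a resistance R, the expected display is the cell constant divided by R. The sketch below, with assumed example values and helper names, illustrates this prediction and the ±0.1 µS/cm instrument-accuracy criterion; it is not part of the chapter.

```python
# Sketch: predicted meter response when the cell is replaced by a precision resistor.
# With the cell removed, the meter sees only the resistor R, so the conductance is
# 1/R and the displayed conductivity should be (cell constant / R).

def predicted_reading_uS_cm(cell_constant_cm: float, resistance_ohm: float) -> float:
    """Expected displayed conductivity (uS/cm) for a precision resistor."""
    conductance_S = 1.0 / resistance_ohm           # conductance in siemens
    return cell_constant_cm * conductance_S * 1e6  # convert S/cm to uS/cm

def meter_within_spec(displayed_uS_cm: float, predicted_uS_cm: float) -> bool:
    """Instrument accuracy requirement of +/-0.1 uS/cm, excluding the cell."""
    return abs(displayed_uS_cm - predicted_uS_cm) <= 0.1

# Example: a 0.10 cm^-1 cell-constant setting with a 100 kilo-ohm resistor
# predicts a display of 1.0 uS/cm; a reading of 1.05 uS/cm would be in spec.
pred = predicted_reading_uS_cm(0.10, 100_000.0)
print(pred, meter_within_spec(1.05, pred))  # 1.0 True
```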
Because temperature has a substantial impact on conductivity readings of specimens at high and low temperatures, many instruments automatically correct the actual reading to display the value that theoretically would be observed at the nominal temperature of 25°. This is done using a temperature sensor in the conductivity cell probe and an algorithm in the instrument's circuitry. This temperature compensation algorithm may not be accurate. Conductivity values used in this method are nontemperature-compensated measurements. Accuracy of the temperature measurement must be ±2°.
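For illustration only, the sketch below shows the kind of linear temperature-compensation algorithm many meters apply by default; the 2% per degree coefficient is an assumed, typical vendor default, not a value from this chapter. Because high-purity water does not follow such a linear law, a compensated display can differ appreciably from the raw reading this method requires, which is why compensation must be disabled.

```python
# Illustration: a generic linear temperature-compensation algorithm of the kind
# many meters apply by default.  The coefficient is an assumption, not a chapter
# value; this method uses nontemperature-compensated (raw) readings instead.

ALPHA = 0.02  # assumed linear coefficient, 2% per degree Celsius

def raw_to_compensated(raw_uS_cm: float, temperature_C: float) -> float:
    """What such a meter would display, referenced to 25 degrees Celsius."""
    return raw_uS_cm / (1.0 + ALPHA * (temperature_C - 25.0))

def compensated_to_raw(reading_25C_uS_cm: float, temperature_C: float) -> float:
    """Undo the same linear correction to recover the as-measured value."""
    return reading_25C_uS_cm * (1.0 + ALPHA * (temperature_C - 25.0))

# A raw reading of 1.4 uS/cm at 45 degrees would display as 1.0 uS/cm when
# linearly compensated -- not the value the Stage 1 table is meant to judge.
print(round(raw_to_compensated(1.4, 45.0), 2))  # 1.0
```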
The procedure described below is designed for measuring the conductivity of Purified Water and Water for Injection. Stage 1 of the procedure below may alternatively be performed (with the appropriate modifications to Step 1) using on-line instrumentation that has been appropriately calibrated, whose cell constants have been accurately determined, and whose temperature compensation function has been disabled. The suitability of such on-line instrumentation for quality control testing is also dependent on its location(s) in the water system. The selected instrument location(s) must reflect the quality of the water used.

PROCEDURE
Stage 1
1. Determine the temperature of the water and the conductivity of the water using a nontemperature-compensated conductivity reading. The measurement may be performed in a suitable container or as an on-line measurement.
2. Using the Stage 1—Temperature and Conductivity Requirements table, find the temperature value that is not greater than the measured temperature, i.e., the next lower temperature. The corresponding conductivity value in this table is the limit (a lookup sketch for this stage follows the table below). [NOTE—Do not interpolate.]
3. If the measured conductivity is not greater than the table value, the water meets the requirements of the test for conductivity. If the conductivity is higher than the table value, proceed with Stage 2.
Stage 1—Temperature and Conductivity Requirements
(for nontemperature-compensated conductivity measurements only)
Temperature (°C)   Conductivity Requirement (µS/cm)
0 0.6
5 0.8
10 0.9
15 1.0
20 1.1
25 1.3
30 1.4
35 1.5
40 1.7
45 1.8
50 1.9
55 2.1
60 2.2
65 2.4
70 2.5
75 2.7
80 2.7
85 2.7
90 2.7
95 2.9
100 3.1
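The following sketch (not part of the chapter) encodes Steps 1 through 3 against the table above: the limit is taken at the next lower table temperature, with no interpolation, and the nontemperature-compensated reading is compared against it.

```python
# Sketch of Stage 1 (Steps 1-3): look up the limit at the table temperature that
# is not greater than the measured temperature (no interpolation) and compare the
# nontemperature-compensated conductivity reading against it.

STAGE1_LIMITS_uS_cm = {  # temperature (degrees C) -> conductivity limit (uS/cm)
    0: 0.6, 5: 0.8, 10: 0.9, 15: 1.0, 20: 1.1, 25: 1.3, 30: 1.4,
    35: 1.5, 40: 1.7, 45: 1.8, 50: 1.9, 55: 2.1, 60: 2.2, 65: 2.4,
    70: 2.5, 75: 2.7, 80: 2.7, 85: 2.7, 90: 2.7, 95: 2.9, 100: 3.1,
}

def stage1_passes(temperature_C: float, conductivity_uS_cm: float) -> bool:
    """True if the water meets Stage 1; False means proceed to Stage 2."""
    # Next lower (or equal) table temperature; do not interpolate.
    table_temp = max(t for t in STAGE1_LIMITS_uS_cm if t <= temperature_C)
    return conductivity_uS_cm <= STAGE1_LIMITS_uS_cm[table_temp]

# Example: 1.2 uS/cm measured at 23 degrees is judged against the 20-degree
# limit of 1.1 uS/cm, so Stage 2 testing would be required.
print(stage1_passes(23.0, 1.2))  # False
```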
Stage 2
4. Transfer a sufficient amount of water (100 mL or more) to a suitable container, and stir the test specimen. Adjust the temperature, if necessary, and, while maintaining it at 25 ± 1°, begin vigorously agitating the test specimen while periodically observing the conductivity. When the change in conductivity (due to uptake of atmospheric carbon dioxide) is less than a net of 0.1 µS/cm per 5 minutes, note the conductivity.
5. If the conductivity is not greater than 2.1 µS/cm, the water meets the requirements of the test for conductivity. If the conductivity is greater than 2.1 µS/cm, proceed with Stage 3.
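As an illustration of Steps 4 and 5, the sketch below waits until the conductivity changes by less than 0.1 µS/cm over a 5-minute interval and then applies the 2.1 µS/cm limit. The read_conductivity_uS_cm callable is a hypothetical instrument interface assumed for the example, not something defined by this chapter.

```python
# Sketch of Stage 2 (Steps 4-5): agitate the specimen at 25 +/- 1 degrees, observe
# the conductivity until atmospheric carbon dioxide uptake has equilibrated
# (change of less than 0.1 uS/cm over 5 minutes), then apply the 2.1 uS/cm limit.
import time

def stage2_passes(read_conductivity_uS_cm, interval_s: float = 300.0) -> bool:
    """True if Stage 2 is met; False means proceed to Stage 3."""
    previous = read_conductivity_uS_cm()
    while True:
        time.sleep(interval_s)             # wait 5 minutes between observations
        current = read_conductivity_uS_cm()
        if abs(current - previous) < 0.1:  # equilibrated; note this conductivity
            return current <= 2.1
        previous = current
```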
Stage 3
6. Perform this test within approximately 5 minutes of the conductivity determination in Step 5, while maintaining the sample temperature at 25 ± 1°. Add a saturated potassium chloride solution to the same water sample (0.3 mL per 100 mL of the test specimen), and determine the pH to the nearest 0.1 pH unit, as directed under pH 〈791〉.
7. Referring to the Stage 3—pH and Conductivity Requirements table (a lookup sketch follows it below), determine the conductivity limit at the measured pH value. If the measured conductivity in Step 4 is not greater than the conductivity requirement for the pH determined in Step 6, the water meets the requirements of the test for conductivity. If either the measured conductivity is greater than this value or the pH is outside the range of 5.0 to 7.0, the water does not meet the requirements of the test for conductivity.
Stage 3—pH and Conductivity Requirements
(for atmosphere- and temperature-equilibrated samples only)
pH Conductivity Requirement (µS/cm)
5.0 4.7
5.1 4.1
5.2 3.6
5.3 3.3
5.4 3.0
5.5 2.8
5.6 2.6
5.7 2.5
5.8 2.4
5.9 2.4
6.0 2.4
6.1 2.4
6.2 2.5
6.3 2.4
6.4 2.3
6.5 2.2
6.6 2.1
6.7 2.6
6.8 3.1
6.9 3.8
7.0 4.6
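The following sketch (not part of the chapter) encodes Steps 6 and 7 against the table above: a pH outside 5.0 to 7.0 fails outright, and otherwise the Stage 2 conductivity noted in Step 4 is compared with the tabulated limit at the measured pH.

```python
# Sketch of Stage 3 (Steps 6-7): the pH measured after the saturated KCl addition
# selects a conductivity limit from the table above; the conductivity compared is
# the Stage 2 value noted in Step 4.

STAGE3_LIMITS_uS_cm = {  # pH x 10 (nearest 0.1 unit) -> conductivity limit (uS/cm)
    50: 4.7, 51: 4.1, 52: 3.6, 53: 3.3, 54: 3.0, 55: 2.8, 56: 2.6,
    57: 2.5, 58: 2.4, 59: 2.4, 60: 2.4, 61: 2.4, 62: 2.5, 63: 2.4,
    64: 2.3, 65: 2.2, 66: 2.1, 67: 2.6, 68: 3.1, 69: 3.8, 70: 4.6,
}

def stage3_passes(pH: float, stage2_conductivity_uS_cm: float) -> bool:
    """True if the water meets Stage 3; False means it fails the test."""
    key = round(pH * 10)                 # pH taken to the nearest 0.1 unit
    if key not in STAGE3_LIMITS_uS_cm:   # pH outside the 5.0-7.0 range fails
        return False
    return stage2_conductivity_uS_cm <= STAGE3_LIMITS_uS_cm[key]

# Example: a Stage 2 reading of 2.4 uS/cm with a measured pH of 6.1 passes.
print(stage3_passes(6.1, 2.4))  # True
```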

*  µS/cm (microsiemens per centimeter) = µmho/cm = reciprocal of megohm-cm.
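As a brief illustration of this unit relationship, the conversion below treats conductivity in µS/cm as the reciprocal of resistivity in megohm-cm; the 18.2 megohm-cm example value is an assumption for illustration, not a chapter requirement.

```python
# Footnote relationship: conductivity in uS/cm is the reciprocal of resistivity
# in megohm-cm (and 1 uS/cm = 1 umho/cm).
def resistivity_Mohm_cm_to_uS_cm(resistivity_Mohm_cm: float) -> float:
    return 1.0 / resistivity_Mohm_cm

# Example: water at 18.2 megohm-cm corresponds to about 0.055 uS/cm.
print(round(resistivity_Mohm_cm_to_uS_cm(18.2), 3))  # 0.055
```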

Auxiliary Information—
Staff Liaison: Gary E. Ritchie, M.Sc., Scientific Fellow
Expert Committee: (PW05) Pharmaceutical Waters 05
USP31–NF26 Page 245
Pharmacopeial Forum: Volume No. 33(4) Page 722
Phone Number: 1-301-816-8353