Contaminated Sediments Database for the Gulf of Maine, OFR 02-403

Gulf of Maine Contaminated Sediment Database:

Organic Contaminant Analysis

John Farrington and Bruce Tripp
Woods Hole Oceanographic Institution

Methods are in flux with time: use caution when merging datasets

Background

During the planning stages of a coastal monitoring program, many aspects of the developing program are hotly debated. The required data resolution and the choice of analytical methods should be among the topics discussed at the conception of the program. This critical aspect of data production limits the uses to which the resulting data can be put, but it is too often undervalued in the decision process. In addition, analytical methods have changed with time; for example, resolution has improved and detection limits have been lowered. Changes such as these affect the quality of the reported data very significantly but are usually invisible in the reported data table. Once chemical measurement data are reported, most users do not fully appreciate the inherent limits of any given analytical method or the constraints these limits should place on any decision-making that relies on the data.

A wide range of legitimate uses exists for analytical data, including:

  • compliance monitoring
  • trend analysis
  • model verification
  • contaminant flux budgeting
  • environmental quality standards development
  • process focused research

Data that meet each of these needs may be found in the literature, but data acquired for specific purposes such as these cannot be lumped into a single database without a review of the goals of the project that produced the data, the methods by which the data were generated, and an explicit assessment of the limitations these may have created. In general, raw data should not be released beyond the generating laboratory unless they are tightly coupled to information on accuracy, precision, detection limits, sampling and sample-handling specifics, and other "quality assurance" results. Without this information, data should be considered suspect and should not be merged with data from other sources.
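
By way of illustration, the sketch below shows one way such a screening step might be automated when records from multiple sources are compiled. It is a minimal sketch in Python; the field names are hypothetical assumptions and are not fields of this database.

    # Minimal sketch: screen candidate records for the QA/QC metadata that
    # should accompany any reported concentration before it is merged into a
    # combined database.  Field names are illustrative assumptions only.
    REQUIRED_QAQC_FIELDS = [
        "method",             # analytical method used
        "detection_limit",    # reported limit of detection
        "uncertainty",        # accuracy/precision estimate for the value
        "blank_results",      # procedural blank information
        "sampling_protocol",  # sampling and sample-handling specifics
    ]

    def acceptable_for_merge(record):
        """Return True only if every required QA/QC field is present and non-empty."""
        return all(record.get(field) not in (None, "") for field in REQUIRED_QAQC_FIELDS)

    def screen_records(records):
        """Split candidate records into those that may be merged and those to set aside."""
        accepted = [r for r in records if acceptable_for_merge(r)]
        suspect = [r for r in records if not acceptable_for_merge(r)]
        return accepted, suspect

Records that fail such a screen are not necessarily wrong; they are simply not documented well enough to be combined with data from other sources.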

An early concern of environmental chemists was the addition of artifacts during sampling and sample handling. No matter how accurate the analysis, the results may be worthless for the intended use if the initial sample was contaminated during sampling or work-up. As analytical methods have improved, this issue has become ever more important: high method blanks in older data may overwhelm low environmental concentrations for some measured compounds. To an unknown extent, early reports contain some undetermined amount of contaminant that was introduced during the sampling process. An example is the early reports of open-ocean metal concentrations, which were highly variable until investigators realized that their results were more closely correlated with collecting samples from a dirty metal ship than with any ocean cycling process. For this reason, older (pre-mid-1980s) organic contaminant data should be reviewed carefully before incorporation into a combined database.

Analytical chemistry methods are developed to meet specific data needs, and no single method exists that can meet all needs. Some methods are designed for compliance purposes and are adequate for analysis of highly contaminated samples. They may even be preferred in situations where large sample throughput, low cost per sample, and analytical simplicity are priorities. They may be completely unsatisfactory, however, where data on trace contaminant concentrations are needed or where susceptibility to interference from the sample material is an issue. Analyses performed for different purposes consequently produce data with quite different levels of accuracy and precision that are not directly comparable. Concentration data on the same chemicals will be produced by both routine methods and "state-of-the-art" research methods, however, and these data are likely to be merged by an unsuspecting data compiler with little regard for the associated caveats of data quality.

Overview of Analytical Methods for Organic Contaminants

Extraction, separation and clean-up

Regardless of the method used to identify and quantify organic chemical contaminants in coastal environmental samples, steps must first be taken to extract chemicals of interest from the bulk of the sample material and to separate them from other chemicals that might be co-extracted. These steps are usually accomplished by "wet-chemistry" procedures in the laboratory and may not fully extract chemicals from the sample material, or they may provide opportunities for the inadvertent addition of interfering artifacts. Some extraction procedures may be less efficient, but are simpler and more rapid. Some separation procedures will cleanly combine extracted chemicals into groups with similar structure, while other procedures include compounds of interest yet do not exclude other related but interfering groups. The extraction and separation methods selected by the analyst will ultimately affect the limit of detection of the chemical of interest and the final use of data. The analyst begins to make trade-off decisions (i.e., cost, method complexity, time, resolution, etc.) at the time of initial sampling. These early decisions concerning sample handling ultimately affect data interpretation; their implications must be crystal clear to all users of the data.

Analytical Instrumentation

Usually, the end result of the extraction, separation, and clean-up procedures is a concentrated mixture containing chemicals of similar structure that can be analyzed further. This further analysis is usually done with various kinds of instruments, where instrument selection is based on the specific data need. Some instruments will only detect compound groups, while others can resolve and detect individual compounds. All instruments have inherent detection limits, and these limits vary by orders of magnitude between types of instruments. In addition to resolution and detection-limit characteristics, the analyst will consider initial cost, cost of operation, availability, and complexity of operation when selecting an instrument. As with the selection of "wet chemistry" methods, the analyst limits the uses to which the data may legitimately be put as he or she selects an analytical instrument. Instruments commonly applied to the analysis of organic chemical contaminants in coastal samples include:

• Spectroscopic Methods

Infra-red Spectroscopy (IR)

Organic molecules have a flexible structure, which allows infrared light to be absorbed by the molecule. The amount of light absorbed (percent transmission) can be related to structural characteristics of the molecule and thus be used for identification. This sensitive technique cannot be used for analysis of individual hydrocarbons because it is difficult to separate natural from contaminant hydrocarbons; however, it can be used for remote analysis of samples. IR data will probably be reported for environmental samples only in situations that involve remote sensing of complex mixtures, such as oil spills.

UV-fluorescence

When excited by ultraviolet light, some organic molecules emit fluorescent light. By adjusting (or scanning) both the excitation and emission wavelengths, aromatic hydrocarbons may be analyzed. This method is especially useful for indicating the presence of polycyclic aromatic hydrocarbons (PAH). It is a bulk measure, however, and provides little indication of the complexity of a mixture or of how much of a signal might derive from interfering compounds (e.g., conjugated alkenes). The use of this method is most appropriately limited to highly contaminated samples or as a screening technique. Reported results cannot be directly compared to results from other methods of analysis.

Chromatographic techniques

All chromatographic methods work on the same principle: differential mobility. Separation of molecular types, and even individual compounds, is achieved by exploiting the relative affinity of an organic compound (or group of similar compounds) for two phases, one mobile and one stationary. The phases may be solid, liquid, or gas, and separation occurs as the compound being analyzed spends more or less time in either the mobile or the stationary phase. Chromatography is a workhorse method for separation of hydrocarbon mixtures.

During the past two decades, chromatographic separation methods have improved dramatically, both in detection limits and in the degree of resolution, i.e., the separation that can be achieved. This revolution is wonderful in that we can analyze environmental samples more accurately and precisely, and can use these enhanced techniques to ask increasingly sophisticated questions about the inner workings of environmental processes. A downside to this ever-changing analytical landscape is that historical data, produced by older and less specific methods, still reside in the published literature, ready to trip up an unsuspecting data user. Reported results, even results reported for the same compound, cannot be directly compared without a thorough review of QA/QC information. If that information is not available, then older ("low-resolution") and modern ("high-resolution") data sets can be compared only at non-quantitative or semi-quantitative levels.

High-Performance Liquid Chromatography (HPLC)

In HPLC, a liquid is pumped through a solid-phase column to allow for partitioning between the phases. With constantly improving resolution and sensitivity, HPLC has slowly evolved into a valuable analytical technique for separation of chemical contaminants. Compounds are subsequently eluted from the column, and detection of the presence and amount of chemical in the column effluent is accomplished in a variety of ways, including spectroscopy. Because the method of detection (and quantification) is necessarily different from that used for gas chromatography, reported concentrations cannot be compared directly when combining data resulting from these two methods.

Gas Chromatography (GC)

In GC, a gas carries the volatilized sample mixture through a column onto which a liquid stationary phase has been coated. After separation, the fractionated sample is sequentially eluted and compounds of interest are quantified with an appropriate detector. In early environmental studies (pre-1975), most GC analyses were accomplished with large-diameter columns packed with an inert substrate on which the liquid phase was coated. Resolution of complex environmental samples was mediocre, and full separation of most mixtures was not possible. Quantitative information on many individual compounds was therefore not available, although estimates of many chemical constituents were obtained by a variety of techniques. These older results remain in the literature. Beginning in the mid-1970s, high-resolution separation of environmental samples using capillary GC was introduced and eventually replaced packed-column separations as a routine analytical tool. For about a decade, reports using both methods were published. Packed columns are still (late 1990s) used for specific purposes where lower resolution is acceptable, but most reported analyses for organic compounds now reflect routine use of high-resolution separations.

Detectors

Several detectors are available for the analysis of the effluent stream from a gas chromatography column. The choice of detector depends on the chemical characteristics of the analyte in question and the sensitivity required for a specific application. The flame ionization detector is commonly selected because of its applicability to a broad range of analytes. Chemicals in the effluent stream are ionized in a flame burning between two electrodes. The ions migrate to one of the electrodes and cause a change in potential that is amplified and detected. Another type of detector is the electron capture detector. This instrument has a high sensitivity for analytes that contain halides and is commonly used for the analysis of chlorinated pesticides and PCBs. In all cases, the analyst must routinely monitor detector performance and sensitivity as part of a day-to-day laboratory quality assurance program.

Discussion

The minimum detectable amount of a given chemical can vary with the analytical method, the instrumentation, and how well these tools are applied. A method with a high detection limit may be appropriate for the analysis of highly contaminated samples but the wrong choice for estuarine process studies that require quantitative analysis of trace concentrations. "Standard methods" normally have high detection limits but can be used when cost, speed of analysis, and simplicity are given a high priority. Non-standard, state-of-the-art methods used in research projects normally have much lower detection limits but are usually slower, more complex, and more expensive. Either method can be the "right" one, depending on the circumstances. Even though both methods provide data on the same chemicals, these data are not directly comparable because methodological and detection limits differ.

A problem is that there is no "correct" answer when analyzing for the presence and concentration of organic contaminants in environmental samples. The concentration of contaminant found in any sample is related to the analytical method selected and how carefully it is applied. The same method may have different limits of detection when used in different laboratories. Any concentration measured should be reported with this caveat in mind and should be reported with a specific range of uncertainty. For example, a value of 7.5 µg/g should be reported with accompanying concentration range data, since 7.5 ± 0.3 µg/g is very different from 7.5 ± 1.0 µg/g. The information used to establish this range of uncertainty is an essential component of reported data and needs to be reported simultaneously. This essential information on analytical methods is generally designated as "quality assurance-quality control" (QA/QC).
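
To make concrete why the reported range matters, the short sketch below compares two hypothetical results using a simple root-sum-square rule with a coverage factor of two; the numbers, the rule, and the factor are assumptions chosen for illustration, not values or procedures from this report.

    # Minimal sketch: whether two reported concentrations can be told apart
    # depends entirely on the uncertainty reported with them.
    import math

    def distinguishable(value_a, unc_a, value_b, unc_b, k=2.0):
        """Treat two results as distinguishable if they differ by more than
        k times the combined (root-sum-square) uncertainty."""
        combined = math.sqrt(unc_a ** 2 + unc_b ** 2)
        return abs(value_a - value_b) > k * combined

    # 7.5 +/- 0.3 ug/g versus 8.5 +/- 0.3 ug/g: distinguishable
    print(distinguishable(7.5, 0.3, 8.5, 0.3))   # True
    # 7.5 +/- 1.0 ug/g versus 8.5 +/- 1.0 ug/g: not distinguishable
    print(distinguishable(7.5, 1.0, 8.5, 1.0))   # False

The same pair of reported values supports opposite conclusions depending on the uncertainty that accompanies them, which is why the range must travel with the number.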

With these concerns in mind, one can discuss the limit of detection (LD or DL) and the limit of quantification (LoQ). The published literature is replete with data records that are reported as "less than" but carry little, or even no, quantitative information about how the "less-than" limit was derived or what the LD value is. Without this information, it is difficult to interpret reported results, and accurate comparison with results from other times and places is impossible. When the analyst encounters very low contaminant concentrations in environmental samples, the error or uncertainty introduced by the analyst, the method, and the instrumentation becomes increasingly significant. The scale of this error must be known for the data to be interpreted. At higher contaminant concentrations, this inherent analytical error is usually relatively less important, and data from highly contaminated samples may be compared with less risk of erroneous conclusions. It is possible to detect a compound (LD) but not quantify it (LoQ) with any accuracy when the analytical error and the contaminant concentration are similar. If sufficient QA/QC procedures have been carefully applied, there may be useful data hidden in the values that are reported to be "less than" a detection level. The development of quality control charts as described by Villeneuve and Mee (1992) and others is a simple way to objectively ascertain a "limit of data acceptability" that can be used in data comparisons (if the basic QA/QC is reported with the data). If insufficient QA/QC information is available to conduct this exercise, the data set should be considered suspect and should not be automatically included with other data sets in a combined database.
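
To make the LD/LoQ distinction concrete, the sketch below applies one widely used blank-based convention (mean of replicate procedural blanks plus three or ten standard deviations). It is offered only as an illustration under that assumption; the blank values are invented, and this is not necessarily the control-chart procedure of Villeneuve and Mee (1992) or the convention used by any data source in this compilation.

    # Minimal sketch: derive a limit of detection (LD) and limit of
    # quantification (LoQ) from replicate procedural blanks using the common
    # 3-sigma / 10-sigma convention.
    import statistics

    def detection_limits(blank_values):
        """Return (LD, LoQ) computed from a list of procedural blank results."""
        mean = statistics.mean(blank_values)
        sd = statistics.stdev(blank_values)
        return mean + 3 * sd, mean + 10 * sd

    # Invented blank results, in the same units as the samples (e.g., ug/g)
    blanks = [0.04, 0.06, 0.05, 0.07, 0.05]
    ld, loq = detection_limits(blanks)
    print(f"LD  = {ld:.3f}")   # below LD: not detected
    print(f"LoQ = {loq:.3f}")  # between LD and LoQ: detected but not reliably quantified

Under a convention like this, a result reported as "less than" is interpretable only if the laboratory also reports how its LD was derived, which is exactly the QA/QC information discussed above.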

In summary, any competent analyst will regularly perform quality assurance tests in order to know that the methods and instrumentation are being applied consistently from day to day in a single laboratory. Since the purposes and detection needs vary between studies and between laboratories, these tests become doubly important when analysts attempt to compare analytical results between laboratories. The reports of numerous inter-laboratory comparison exercises can be found in the literature. When methods have been in flux, quality assurance tests are particularly important when comparing data not only between laboratories but also over time. Each data source should be carefully reviewed, and the constraints imposed by the methodology considered, prior to use of any database that has been compiled from a variety of sources or that covers a long period of time. Unfortunately, quality assurance data have not been uniformly and consistently reported. Data reported without such ancillary information remain suspect and can be compared only with extreme caution.

