
U.S. Geological Survey Data Series 74, Version 3.0

Long-Term Oceanographic Observations in Massachusetts Bay, 1989-2006


Data Processing


The following figures are in PDF format.

Figure 25. Filter weights for the PL33 low-pass filter.

Figure 26. Amplitude response of the PL33 low-pass filter as a function of frequency.


The time-series data were processed to provide a data set at the instrument sampling rate that has been carefully checked and edited.  As the data are processed, they are added to the USGS Oceanographic Time-Series Measurement Database (Montgomery and others, 2008).

Data-processing software and strategies evolved over the 16-year duration of the long-term measurement program.  Data processing was initially conducted using the WHOI Buoy Group Data Processing System (Tarbell and others, 1988), then a WHOI-USGS Oceanographic Data Processing System, and finally a USGS Oceanographic Data Processing System (Montgomery and others, 2008). The Buoy System ran on a VAX VMS computer and stored data in a VMS data format. The more recent systems run in MATLAB on all computers and store the data in EPIC Standard NetCDF files. For compatibility, the older Buoy-format data files have been translated to EPIC NetCDF.

In all of the data-processing systems, after data were decoded and calibrated, they were checked for instrument malfunctions and then edited. Instrument malfunctions and sensor degradation due to fouling were initially identified visually in plots by off-scale values and patterns of variability. The beginning and end of each data series were truncated and outlier points deleted. Short data gaps (typically less than 1 hour long) were filled by linear interpolation. After editing, the basic version of the data file includes all variables recorded at the basic sampling interval and is called the Best Basic Version. In some cases (differing lengths of good data, for example) the variables at the basic sampling interval are stored in several files. An hour-averaged data file and a low-pass filtered data file were created from the Best Basic Version. The Best Basic Version, hour-averaged version, and low-pass filtered version of the data are included in this report (see Digital Data Files).
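The gap-filling step described above can be sketched as follows. This is an illustrative Python example (the actual processing systems ran in MATLAB); the function name is hypothetical, hourly samples are assumed, and NaN marks missing values. Gaps longer than the threshold are left unfilled, and no extrapolation is done at the ends of the series.

```python
import numpy as np

def fill_short_gaps(x, max_gap=1):
    """Fill runs of NaN no longer than max_gap samples by linear
    interpolation from the neighboring good values; longer gaps
    and gaps touching the series ends are left as NaN."""
    x = x.astype(float).copy()
    isnan = np.isnan(x)
    idx = np.arange(len(x))
    # Candidate values for every missing point, interpolated from good data.
    filled = np.interp(idx, idx[~isnan], x[~isnan])
    run_start = None
    for i in range(len(x) + 1):
        if i < len(x) and isnan[i]:
            if run_start is None:
                run_start = i
        elif run_start is not None:
            # Fill only short interior gaps.
            if i - run_start <= max_gap and run_start > 0 and i < len(x):
                x[run_start:i] = filled[run_start:i]
            run_start = None
    return x
```

For example, with `max_gap=1` a single missing hour between good values of 1.0 and 3.0 is replaced by 2.0, while a 2-hour gap is left as NaN.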

Low-Pass Filter

A low-pass filter is used to remove tidal and higher-frequency fluctuations from the time-series data; these fluctuations sometimes mask smaller variations in the time-series plots that are driven by winds and the density field. The filter, called PL33 (Flagg and others, 1976; Beardsley and others, 1985), operates on hourly data values. The digital filter replaces each point with a weighted average of that point and the 33 points on either side of it (fig. 25). The filtering reduces the total length of the time series by 66 hours (33 hours at each end). The filter passes signals with periods longer than about 50 hours at essentially unreduced amplitude (fig. 26). The half-amplitude point of the filter is at 33 hours, and the half-power point is at 38 hours. The filter removes more than 99 percent of the amplitude at the semidiurnal tidal periods and more than 90 percent of the amplitude at the diurnal tidal periods (table 6). The low-passed data are subsampled every 6 hours.
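The filter-and-subsample procedure can be sketched as below. This is an illustrative Python example, not the original MATLAB code; the placeholder weights are a normalized 67-point Hann window, NOT the actual PL33 weights, which are tabulated in figure 25.

```python
import numpy as np

def lowpass_and_subsample(hourly, weights, subsample=6):
    """Apply a symmetric low-pass filter to hourly data and subsample.

    `weights` is the full symmetric filter (67 points for PL33: the
    center point plus 33 on either side). Points lacking a full window
    are dropped, shortening the series by 33 hours at each end.
    """
    filtered = np.convolve(hourly, weights, mode="valid")
    return filtered[::subsample]

# Placeholder weights: a normalized 67-point Hann window (NOT PL33).
w = np.hanning(67)
w /= w.sum()

hourly = np.random.default_rng(0).normal(size=500)
lp = lowpass_and_subsample(hourly, w)  # 434 filtered points, every 6th kept
```

Note that `mode="valid"` reproduces the 66-hour shortening described above: a 500-point hourly series yields 434 filtered points before the 6-hourly subsampling.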

Vector-Measuring Current Meter (VMCM)

VMCMs recorded data on 1/4-inch cassette tapes using Sea Data recorders. After the VMCMs were recovered, the data were read from the cassette onto a personal computer, and then translated into WHOI Carp format, using programs Seadata and PCARPHP (Danforth, 1990). Decoding and calibration were performed using the Buoy Group Data Processing System. Until September 1999, data were edited, truncated, averaged, and filtered using the Buoy System, but since then VMCM data have been written to NetCDF files, and these procedures have been conducted in the WHOI-USGS system. Since February 2001, data were decoded and calibrated using the WHOI-USGS system. After 2004, the VMCMs were upgraded to solid-state control circuitry and modern data-logging systems replaced the magnetic tape system. A new compass was included in these upgrades. The sensors for current velocity and temperature remain the same.

SEACAT and MicroCAT

SEASOFT programs (Sea-Bird Electronics, Inc., 1990) were used to read the data stored in SEACATs and MicroCATs into a file on a personal computer, convert to calibrated oceanographic units, calculate salinity and density, and write the data to ASCII flat files. ASCII files were translated to Buoy Format or NetCDF, and the data were edited, truncated, averaged, and filtered using the Buoy System (until September 1999) or the WHOI-USGS system.

Acoustic Doppler Current Profiler (ADCP)

The ADCP observations were processed using the USGS ADCP Data Processing System and elements of the WHOI-USGS Oceanographic Data Processing System. The ADCPs were normally configured to record data in beam coordinates (rather than Earth coordinates). Upon recovery, the ADCP data were transferred to a personal computer and converted to NetCDF format using software from the ADCP Toolbox. MATLAB routines were used to check data quality, flag bad values, convert to Earth coordinates using a four-beam or three-beam solution, truncate the data at the beginning and end of the deployment, and discard bins that were always above the water surface. Some near-surface bins were not discarded even though side-beam reflection at times of low tide renders these data invalid; near-surface ADCP data must therefore be interpreted with caution. On occasion, the ADCP skips an ensemble record because the data are poor. Data collected since 2000 have blank placeholders for the missing ensemble records. The ADCP processing produces an EPIC-compatible data file.
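For readers unfamiliar with the beam-to-Earth conversion, a simplified four-beam (Janus) solution is sketched below in Python. The 20-degree beam angle and the sign conventions are assumptions (they vary by instrument and mounting), and the subsequent rotation from instrument to Earth coordinates by heading, pitch, and roll is omitted; this is a sketch of the geometry, not the Toolbox code.

```python
import numpy as np

THETA = np.deg2rad(20.0)  # beam angle from vertical (assumed 20 degrees)

def beam_to_instrument(b1, b2, b3, b4):
    """Four-beam (Janus) solution in instrument coordinates.

    Each beam measures a radial velocity; opposing beam pairs give the
    horizontal components, all four give the vertical component, and the
    redundancy yields an error velocity used for quality checks. Signs
    follow one common convention and are an assumption here.
    """
    a = 1.0 / (2.0 * np.sin(THETA))
    b = 1.0 / (4.0 * np.cos(THETA))
    x = a * (b1 - b2)                            # across-instrument
    y = a * (b4 - b3)                            # along-instrument
    z = b * (b1 + b2 + b3 + b4)                  # vertical
    err = a * (b1 + b2 - b3 - b4) / np.sqrt(2)   # error velocity
    return x, y, z, err
```

A three-beam solution replaces one bad beam by assuming zero error velocity, at the cost of losing the quality check.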

Data Logging Current Meter (DLCM) Tripods

When DLCM data were recorded on Sea Data cassettes, the data were read from the cassette into a file on a personal computer, and then translated into WHOI Carp format, using programs Seadata and PCARPHP (Danforth, 1990). When DLCM data were recorded on a Tattletale hard disk, the Tattletale was attached to a personal computer and the data were copied into a file on the computer's hard disk, and then translated into WHOI Carp format using a C language program called SEADAT. Carp format DLCM data from both sources were processed using a Fortran program called NEWDISC that translates into calibrated oceanographic units and derives current speed and direction, standard deviation of pressure (PSDEV), and salinity. The WHOI Buoy Group routine NSINP was then used to translate the data into Buoy Group format, and the data were edited, truncated, averaged, and filtered using the Buoy System.

MIDAS Tripods

MIDAS data were recorded on a Tattletale hard disk and then copied to a personal computer's hard disk after recovery. Until February 1998, a C language program was used to translate the data to calibrated oceanographic units; rotate the velocity to produce east, north, and up components; compute the vector-averaged velocities from the burst measurements summed in situ; and calculate PSDEV and velocity variances, periods, and covariances from the burst measurements. The periods of the east and north velocity fluctuations were computed as two times the burst length divided by the number of sign changes in the burst. Due to an oversight, the mean was not subtracted from the burst measurements prior to computing the zero crossings. The result was an ASCII flat file that was translated into the WHOI Buoy Group format using routine NSINP, and the data were edited, truncated, averaged, and filtered using the Buoy System. Since February 1998, the WHOI-USGS system has been used to decode and calibrate the data, compute secondary variables, and perform all further processing.
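The zero-crossing period estimate described above can be sketched as follows. This illustrative Python version (function name hypothetical) removes the burst mean before counting sign changes, i.e., it includes the step that was omitted in the pre-1998 processing.

```python
import numpy as np

def zero_crossing_period(burst, dt):
    """Estimate a fluctuation period as two times the burst length
    divided by the number of sign changes in the demeaned burst.

    The mean is subtracted first, so the count reflects crossings of
    the burst mean rather than of zero (the oversight noted above)."""
    demeaned = burst - burst.mean()
    signs = np.sign(demeaned)
    nz = signs[signs != 0]                       # ignore exact zeros
    crossings = np.count_nonzero(nz[1:] != nz[:-1])
    if crossings == 0:
        return np.inf                            # no fluctuation resolved
    burst_length = dt * len(burst)
    return 2.0 * burst_length / crossings
```

As the text cautions, when the mean flow dominates the fluctuations (few or no crossings), the estimate degrades rapidly; a burst with a single crossing returns twice the record length.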

BASS current-meter data were initially processed by assuming a constant speed of sound of 1,500 m/s. BASS data from 1991 to 1998 were corrected using a time series of sound speed that was calculated from measured pressure, temperature, and salinity using the UNESCO algorithm (Fofonoff and Millard, 1983). Sound-velocity corrections were discontinued in 1998. Prior to 1998, BASS beam velocities were converted to horizontal and vertical components using sensor-tilt values that were in error by about 9 degrees due to a calibration error. The resulting reduction of the horizontal velocities is small, about 1 percent, but the vertical velocities are contaminated by about 15 percent of the horizontal velocity. The covariances between the horizontal and vertical velocities are contaminated by the horizontal velocities. After 1998, correct tilt values were used.

Because of the tilt error, the covariances prior to 1998 are contaminated by the horizontal velocity and are not included in this report. The periods of the u and v fluctuations may not reflect the true fluctuations when the mean is larger than the fluctuations. Only the standard deviations of the burst measurements are included in this report.

Transmissometer

Transmissometer data were processed along with the other data from SEACAT and tripod systems. The beam-attenuation coefficient (units of m-1) was computed from the light-transmission observations as -4(ln(T/100)), where T is percent light transmission over a beam length of 0.25 m. The beam-attenuation coefficient is linearly proportional to the concentration of suspended material in the water if the particles are of uniform size and composition (Moody and others, 1987). The size of the particles in the water changes with time, however, especially during resuspension events; thus the beam-attenuation measurements provide only a qualitative indication of suspended-sediment concentration.
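The conversion above is simple to express in code; this is an illustrative Python sketch (names are hypothetical). The factor of 4 is just the reciprocal of the 0.25-m beam length.

```python
import math

def beam_attenuation(percent_transmission, beam_length=0.25):
    """Beam-attenuation coefficient (1/m) from percent light
    transmission over a path of `beam_length` meters:
        c = -ln(T/100) / L,
    which reduces to -4 ln(T/100) for the 0.25-m beam used here."""
    return -math.log(percent_transmission / 100.0) / beam_length
```

For example, 100 percent transmission gives an attenuation of 0 m-1, and 50 percent transmission gives about 2.77 m-1.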

Wind Stress

Wind speed was adjusted to 10 m above the sea surface, and wind stress was calculated from wind speed and direction using the formulas of Large and Pond (1981).
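The Large and Pond (1981) neutral drag coefficient is commonly written as in the sketch below. This illustrative Python version assumes the wind has already been adjusted to 10 m, neglects atmospheric-stability effects, and uses a nominal air density; it is not the report's processing code.

```python
import numpy as np

RHO_AIR = 1.22  # kg/m^3, nominal air density (an assumption)

def wind_stress(u10):
    """Wind-stress magnitude (N/m^2) from 10-m wind speed (m/s) using
    the Large and Pond (1981) neutral drag coefficient:
        1000*Cd = 1.2              for u10 below about 11 m/s
        1000*Cd = 0.49 + 0.065*u10 for stronger winds (to about 25 m/s)
    """
    u10 = np.asarray(u10, dtype=float)
    cd = np.where(u10 < 11.0, 1.2e-3, (0.49 + 0.065 * u10) * 1e-3)
    return RHO_AIR * cd * u10**2
```

The stress vector is then directed along the wind; for example, a 10 m/s wind gives a stress of roughly 0.15 N/m^2.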

Time-Series Sediment-Trap Samples

After recovery of the time-series trap, the trap bottles were cleaned and photographed. The contents were then sieved using a 1,000-micron polyethylene screen in order to remove filamentous organic matter, such as seaweed, which would complicate the splitting process. Samples were split on a four-way splitter described by Honjo (1978). The splits were used to determine mass, sediment texture, and concentration of Clostridium perfringens, a bacterial spore that serves as a tracer of sewage. The fourth split was archived. The split designated for determination of mass was allowed to settle in a refrigerator for 3 to 5 days; the overlying clear seawater (which contained sodium azide) was measured for salinity and siphoned off, and the wet residue was subsequently freeze-dried. The mass was corrected for salt content using the weight lost on drying and the measured salinity of the overlying water. The sediment-collection rate (in grams/m2/day) was calculated from the measured weight, the time of exposure under the funnel, and the cross-sectional area of the trap (0.5 m2).
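The flux computation at the end of the paragraph is straightforward arithmetic; a minimal Python sketch follows. The function and variable names are illustrative, and the salt correction is simplified to subtracting a precomputed salt mass rather than reproducing the full salinity-based bookkeeping described above.

```python
def sediment_collection_rate(dry_mass_g, salt_mass_g, days, trap_area_m2=0.5):
    """Sediment-collection rate (g/m^2/day) from the salt-corrected dry
    weight, the exposure time under the funnel (days), and the trap
    cross-sectional area (0.5 m^2 for these traps)."""
    corrected_mass = dry_mass_g - salt_mass_g  # remove residual sea salt
    return corrected_mass / (days * trap_area_m2)
```

For example, 10.5 g of freeze-dried residue containing 0.5 g of salt, collected over a 10-day exposure, corresponds to 2.0 g/m2/day.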



U.S. Department of the Interior | U.S. Geological Survey