Scott: You have talked about NEHRP and its big effect on USGS and its earthquake
program. Would you now discuss the development of that program?
Wallace: Yes. This is a good place to tell about the evolution of the earthquake program
within the USGS. That would include, of course, both the things supported by
NEHRP, and those initiated and financed otherwise. It is a fairly complicated
business. I set the stage by starting with the 1964 Alaskan earthquake, but many
things preceded that event, and there was more than one line of action going on.
It was more like the channels of a braided stream, first one channel then another
carrying more water, always shifting, sometimes a dam or a sudden flood
altering the whole pattern.
Scott: Before you start discussing the earthquake program, let me ask about the organizational structure of USGS. You just now mentioned the three divisions into which USGS is divided--National Mapping, Water Resources, and Geologic. Also I have heard the terms "office" and "branch" referred to, but have never understood how they all related organizationally.
Wallace: I do not want to get deeply into organizational structure. For purposes of this oral history it is probably sufficient to refer to the three divisions, and to note that among them is the Geologic Division headed by a Chief Geologist. As to why the USGS switched from having Assistant Chief Geologists with Branches under them to "Offices"--I'd guess it was a change that made management feel good. Branches also changed in size and character. A history of all these changes would no doubt be of interest to some readers, but it would involve a lot of administrative detail that I don't see as belonging in this history.
In any event, I think the personality characteristics of participants far outweighed in importance the administrative structure within which they worked. For example, George Plafker, to whom I give the highest marks as a leader in earthquake matters, has always been administratively in the Alaskan Branch of the Geologic Division, but he has worked all over the world on earthquakes, and the Chief of the Alaskan Branch exerts almost no control over George. The "upwelling of nonadministrative leadership," to use Preston Cloud's phrase, characterized the USGS for 100 years and made its administration an "improbable bureaucracy," as he called it, because it was so different from the typical perception of a bureaucracy. (Cloud, Preston, 1980, "The Improbable Bureaucracy: The United States Geological Survey, 1879-1979," Proceedings of the American Philosophical Society, v.124, n.3, June 1980, pp.155-167.)
Scott: Those are some really interesting observations about administrative behavior that
I hope you will talk about more when you take a retrospective look at your entire
USGS experience. But for now, USGS titles and organizational nomenclature
need to be outlined, to help readers who may be unfamiliar with the terminology.
Could you give a very brief description of the structure of the USGS so that
readers will understand what Branch Chief, Division Chief, and so on stand for?
Wallace: The administrative levels of the USGS include the Director of the USGS at the
top, who oversees and speaks for 9,000-10,000 staff members. Under that level
are three Divisions, including the Geologic Division, Water Resources Division,
and National Mapping Division (formerly named the Topographic Division).
Each has a Chief, e.g. a Chief Geologist. For many decades the next level
consisted of Branches headed by Branch Chiefs. Before 1960 there were 10
branches with membership ranging from 50 to over 500. In 1960 branches were
increased to about 30, so that membership of each could be dropped to between
50 and 100, a number with which a Branch Chief could have personal contact.
Almost all my comments in this oral history pertain to the Geologic
Division. For many years the parts of the Geologic Division program were
administered by an Assistant Chief Geologist, who divided up money for the
various branches. In 1973 when the earthquake program began to grow, an
intermediate "Office" level was inserted between the Chief Geologist and
Branches. Parallel to this pyramid structure, representatives in the three regions
were titled Regional Geologists, or Assistant Chief Geologist for such and such a
region, but Regional Geologists had no control over money.
In 1995 the names were being changed to such things as "Team Leader"
for what may be somewhat parallel to a former "Branch." The stated goal is to
produce a "seamless" operation. Many names have changed at the branch level
as administrators try to adjust to changing times. Over the years many experiments have been tried, including establishing "Program Coordinators," who then share authority for the distribution of money with Branch Chiefs in a "matrix management" scheme. In my opinion, there never has been a perfect solution to managing a complex program in which research specialists are expected to contribute to a variety of programs while maintaining a steady course in their own areas of research. I have lived and operated under several such arrangements.
Scott: This brief description helps, although I see how it easily gets complicated.
Wallace: From a seismological point of view, by far the most important effort had been
the study and research aimed at the detection of underground nuclear explosions.
This effort was to find ways to monitor the Soviet underground nuclear tests
from afar, and to make nuclear test-ban treaties meaningful--even possible.
Seismological techniques seemed to have great promise, but the seismographs
then in place were too few, too primitive, and poorly placed for the task. There
were not very many competent seismologists available then, and some called
seismology "a cottage industry."
Recognizing this state of affairs, the Department of Defense, through its
Advanced Research Projects Agency (ARPA) and a program known as VELA
Uniform, turned on the money faucet for seismology in the mid to late 1950s.
Within a very few years more seismologists had been trained, new instruments
were designed and deployed, and theory had been developed to do the assigned
task. To make a long story short, the program was a huge success, and
underground explosions could be detected routinely.
Scott: How did the USGS respond to this clear earth-science need?
Wallace: Under the leadership of Lou Pakiser, a USGS group at the Denver center
participated actively in the VELA program. Their specific project was to
develop information about the structure of the earth's crust. All of the seismic
signals that came from Soviet underground explosions had to travel to distant
points where they could be detected, and the interpretation of the signals
depended heavily on knowing the structure of the earth's crust.
With success in hand, VELA projects began to be phased out, and by the
early 1960s, the USGS group decided to get involved in natural seismic events--earthquakes. Along came the 1964 Alaskan earthquake, and during 1965 and
1966 the USGS Crustal Studies group moved to the Menlo Park center in order
to study natural earthquakes.
Scott: Would you say a word or two more about that move, which seems to have been a
very important one?
Wallace: The Crustal Studies group--headed by Lou Pakiser--essentially took the initiative itself to move to Menlo. The move was not initiated by the Chief Geologist, but of course had his approval. It might be cited as an example of the "upwelling of non-administrative initiative," because the idea did not come from on high. While it did create problems, it in fact drove the agency to seek a better administrative structure. Remember, this was well before the concepts of what a national program should look like came into being. The consolidation of program elements then spread across seven or eight branches. While such seeming chaos may be anathema to some with the "pyramid-military-administrative" mentality, I see it as one of the strengths of the USGS in the way it facilitates the applications of talents of creative individuals to important missions.
A Gathering of Geologists and Geophysicists
Scott: I presume the move to Menlo Park tended to concentrate earthquake efforts.
Wallace: That move certainly added a lot. The Crustal Studies group from Denver,
together with the Menlo Park engineering and structural geologists, most of
whom were at that moment working on the Alaskan earthquake, made up an
impressive capability within the USGS. We could see that the USGS could
expand an already impressive record and mount a major multi-disciplinary, long-term study of earthquakes.
Wallace: Sometime after the Crustal Studies group came to Menlo Park, we started to use
the name "National Center for Earthquake Research." That was a move, in part,
to recognize the unity of the overall USGS earthquake effort, and to identify the
many, many products we were beginning to turn out here in Menlo Park. The
name served well for a number of years and was printed on all reports and maps.
For several years everyone had seen that some sort of high-level Division organization was needed to manage the earthquake activities of many Branches. In 1973 an "Office of Earthquakes Studies" was established, Bob Hamilton was selected to head the new office, and the name National Center for Earthquake Research was abandoned.
Some Major Players
Wallace: I will name just a few of the players. Among the geologists were Art Grantz,
Parke Snavely, George Gates, George Plafker, M. G.(Doc) Bonilla, Wally
Hanson, Ed Eckel and Dave McCulloch, and among the geophysicists from
Denver were Jerry Eaton, Dave Hill, Jack Healy, Barry Raleigh, and the Branch
Chief, Lou Pakiser. What a powerhouse that collection represented!
I might nominate here a Hero Number 1 of all the USGS investigators
who worked on the Alaskan earthquake. That is George Plafker, who first
figured out what really caused the Alaskan earthquake. Using an amazing array
of geologic information from decades of Alaskan Branch studies, George
demonstrated that a gigantic thrust fault produced the earthquakes observed. He
followed with some beautiful analyses of marine terraces to establish how often
such great earthquakes had happened in prehistoric times.
Scott: The Plafker story certainly bears out one of your themes--the importance of
individuals. Who were some of the major players in Washington?
Wallace: By then many people at USGS headquarters in Washington recognized the
potential earthquake responsibilities of the USGS. Wayne Hall and Dallas Peck
were two of these who championed the earthquake program. Wayne Hall and
Lou Pakiser had worked on the Pecora committee report of 1968, and Lou
Pakiser had been a member of the Press committee of 1965. Operating out of
the Office of the Director, Jim Balsley and Jim Devine played important roles.
Soon after the move from Denver, Lou Pakiser, as Chief of an operating branch, the Branch of Crustal Studies, took strong initiatives to create a program. He organized an advisory committee of non-USGS people for his Branch, and began to develop concrete ideas on how to move ahead. The program ideas that came out of these crustal-studies deliberations focussed on seismology and geophysics. The capabilities and interests of the dozens of geologists who had been so productive in studying the Alaskan earthquake were almost totally ignored. Indeed, I felt that Lou had identified geologists as "the enemy," and for several years this standoff prevailed. Of course, the geophysical-seismological camp and geologic camp were competing for funds and control of future plans. I can understand that Lou did not want any of his VELA money to be siphoned off for geologic work. There were other disciplinary conflicts--some leading members of the engineering community considered the USGS earthquake program to be anti-engineering.
Scott: To what extent were these conflicts resolved, and how did you try to resolve them?
Wallace: As I suggest, conflict within USGS was only one facet of a bigger contest among
all those interested in developing a national earthquake program. I have told about the string of reports and struggles that went on for 13 years after the 1964 Alaskan earthquake until the 1977 passage of the Earthquake Hazards Reduction Act, which established the National Earthquake Hazards Reduction Program (NEHRP).
To answer your question more directly, an important one-word answer might be "patience." Perhaps it is healthy to have some continuing tensions. As time went by, different groups were merged or mingled, breakthroughs were accomplished all around, new people came aboard, others left the program, moved away, retired, and gradually a sense of community grew within the USGS program. But consensus remains fragile, particularly in light of the plethora of projects and programs whose proponents are asking for support.
Scott: You have mentioned a few of the USGS people who were influential. Were
there others who played a part?
Wallace: In 1969 George Gates, as Regional Geologist in Menlo Park, was designated by
Washington headquarters as coordinator of the earthquake program in Menlo
Park. Art Grantz, as Chief of the Branch of Pacific Coast Geology was intimate
with the geologic thrust of the Alaskan investigation. Art had strong convictions
about what the emphasis should be in a USGS earthquake program. At the time,
Art was a major force in developing the mapping of the active strands of the San
Andreas fault, an effort directed toward defining earthquake hazards and learning
the geologic basis for minimizing hazards. That program led directly to making
California's Alquist-Priolo Act possible.
George Gates was closely associated with the engineering community,
and during those years he served on the Engineering Criteria Review Board
(ECRB) of the Bay Conservation and Development Commission (BCDC), which
I will discuss a little later. Perhaps the main concern of ECRB was seismic
safety. Cooperation with the state of California, its Division of Mines and
Geology, and the emerging Seismic Safety Commission was always high on the agenda.
At some point Jerry Eaton became Chief of the Branch of Seismology and he was very influential in shaping the program. He championed the development and led the way for a microearthquake study of the San Andreas fault. To my mind that net and project has produced some of the most important advances in understanding the geometry of the fault and the time-sequence of strain release through earthquakes. The so-called CAL-NET comprised seismometers and other instruments used to record, transmit and analyze data. Knowing really very little about seismology myself, I turned again and again to Jerry for guidance in the subject.
Scott: Did all these people work well together?
Wallace: The strong personalities of Pakiser and Grantz, and their very different ideas of
what should be emphasized in a USGS earthquake program inevitably led to
harsh words and strong feelings.
Scott: That must have made for difficult times.
Wallace: It surely was a difficult time. The rationale that "only seismology equals
earthquakes," seemed to rule the day in the USGS. Geology took a seat far to
the rear despite the fact that it was geological data that dominated the
contributions in the 28-volume USGS series on the Alaskan earthquake. And it
was through the interpretation of geologic mapping that the enormous slip on the
San Andreas fault had been defined. Art Grantz gave up the fight, and I
wondered whether or not I could stand these internecine battles. Fortunately,
those very strong animosities have long since disappeared and mutual respect
seems to rule the day.
Scott: What other things were happening about then?
Wallace: Early-on, the Nixon administration started a program--called something like
"New Scientific and Technologic Initiatives for Economic Development". Out
of a hundred or more ideas, seven or eight items survived a winnowing process,
and earthquake prediction came out a winner. Jack Healy, Barry Raleigh and
some others went back to Washington, D.C. for a congressional hearing. After their experiment at Rangely, Colorado, where they literally turned earthquakes on and off in the oil field by controlling fluid pressures, they started talking about the possibility of turning earthquakes on and off intentionally. This
translated into the concept of earthquake control, which was a dramatic
possibility to talk about. As I understood it, that idea caught the curiosity of a
congressional staffer. I am convinced earthquake prediction and related items were included in the economic development program because of that congressional aide's excitement. As a result the earthquake program got about $7
million, whereas we (USGS) had been getting about $1 million a year. It was a
big jump, but in some ways it was kind of a fluke.
In many ways earthquake control made real sense. Say there is some region in
the country--maybe Nevada, where the recurrence of earthquakes may be in
thousands of years--and where you know there are earthquakes waiting to
happen. You might trigger a bunch of earthquakes which, because of the state of
strain, are ready to go off in the next 100 years--or even 1,000 years. Once the
earthquakes are set off, you get rid of the accumulated strain and release much of
it. Then, whatever installation you want to put there, be it a nuclear reactor or
waste depository, will have a much lower possibility of being hit by a damaging
earthquake during its lifetime. That is the kind of rationale you might go
through in making use of earthquake control.
But, of course, in no way were we then or are we now prepared to trigger
earthquakes on the San Andreas fault, or to try to let the strain seep off
gradually. We just don't know enough about how the whole system works, and
the possible consequences of earthquake surprises would be completely
unacceptable. No way--I cannot see that coming, certainly not in my lifetime, or
for a century or more.
Scott: You talked about the origins of paleoseismology before, and I understand that it
now has an important role in the USGS earthquake efforts. What is the story on
how it gained acceptance?
Wallace: Yes, I noted earlier the Nevada paleoseismology studies that had really begun
serendipitously with the work on a geologic map of Nevada. Then in the 1960s I
began looking for a way to justify geology's role in the earthquake program,
based on evidence of prehistoric earthquakes preserved on fault scarps in
Nevada. Also between 1965 and 1968 field examinations along the San Andreas
revealed the first clear evidence of repeated displacements, each presumably
occurring at the time of a great earthquake on the fault.
Scott: When would you say paleoseismology actually began? Were there still earlier
forerunners predating your own involvement?
Wallace: I credit G.K. Gilbert with first using geologic evidence for making a statement about prehistoric earthquakes, which he did a full century ago. In his 1884 paper Gilbert notes a small cliff running along most of the base of the Wasatch Mountains of Utah, and says, "This little cliff is, in geologic parlance, a 'fault scarp,' and the earth fracture which has permitted the mountain to be uplifted is a 'fault.'" The small fault scarp of which Gilbert speaks represents one "fossil earthquake," one spasm of uplift of the mountain, and that spasm occurred a relatively few thousand years ago. (Gilbert, G.K., "A Theory of the Earthquakes of the Great Basin, With Practical Application," American Journal of Science, v. 27, n. 157, 1884, pp.49-53.)
Gilbert even used the evidence of scarps to make a prediction: "From Warm Springs to Emigration Canyon fault scarps have not been found, and the rational explanation of their absence is that a very long time has elapsed since their last renewal. In this period the earth strain has been slowly increasing, and some day it will overcome the friction, lift the mountains a few feet, and re-enact on a more fearful scale the catastrophe of Owens Valley."
The term "paleoseismology" was not used in Gilbert's time, of course,
nor was it yet in use even decades later when Charles Richter noted in New
Zealand that "Everywhere in the principal active areas of both islands, are
scarplets of the right height and extent to have originated in single seismic
events, without indication of accumulation or repetition, as if the locus of
fracture were constantly shifting." Surely this was an important
paleoseismological observation. (Richter, C.F., Elementary Seismology, 1958.)
Scott: Are there other early examples of using the idea of paleoseismology, if not the term itself?
Wallace: After the Alaskan earthquake of 1964, George Plafker not only found marine
platforms that had been raised above sea level during the earthquake, but also
many prehistoric marine terraces which showed, without a doubt, that similar
uplift events had taken place in the past. He and Meyer Rubin (Plafker and
Rubin, 1978) dated some of these events and demonstrated that they had
occurred at intervals of from 500 to 1400 years--a major paleoseismic finding.
(Plafker, George, and Rubin, Meyer, "Uplift History and Earthquake Recurrence as Deduced from Marine Terraces on Middleton Island, Alaska," in Isacks, B. L., and Plafker, George, (co-organizers), Proceedings of Conference VI, Methodology for Identifying Seismic Gaps and Soon-to-Break Gaps, U.S. Geological
Survey Open-file Report 78-943, 1978, pp. 687-722.)
Scott: Hasn't trenching become one of the principal means of doing paleoseismology?
Wallace: Indeed it has. The idea is that, in just the right circumstances, slip on faults cuts and offsets sedimentary layers of silt, sand, or gravel that can be dated by carbon-14 or other means, so the time of offset can be determined. The offset is assumed, of course, to have occurred at the time of a prehistoric earthquake. Trenching is a powerful method of investigation, and today is the most commonly used paleoseismological technique. Around the world I'll guess that thousands of trenches have been excavated to find paleoseismologic data. Such data have helped determine slip rates on faults, and the slip rates are then translated into estimates of earthquake frequency--truly a predictive exercise.
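[Editor's note: The arithmetic behind translating a trench-derived slip rate into an earthquake frequency can be sketched as follows. The numbers below are purely illustrative assumptions, not values from this interview.]

```python
# Illustrative sketch (hypothetical numbers): turning a slip rate measured
# from offset, dated trench layers into an average recurrence interval,
# assuming each earthquake releases a similar "characteristic" slip.

def recurrence_interval_years(offset_mm, offset_age_years, slip_per_event_mm):
    """Average years between earthquakes on a fault strand.

    offset_mm: total offset measured across dated layers in a trench
    offset_age_years: carbon-14 age of the offset layers
    slip_per_event_mm: typical slip in a single earthquake
    """
    slip_rate_mm_per_year = offset_mm / offset_age_years
    return slip_per_event_mm / slip_rate_mm_per_year

# Example: 10 m of offset in layers dated at 4,000 years gives a slip rate
# of 2.5 mm/yr; if one earthquake slips about 3 m, events recur on average
# every ~1,200 years.
interval = recurrence_interval_years(10_000, 4_000, 3_000)
print(round(interval))  # 1200
```

The same division underlies the "predictive exercise" Wallace describes: the shorter the interval relative to the time since the last event, the closer the fault is judged to be to failure.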
Scott: When were the first trenches dug to look for paleoseismologic data?
Wallace: In 1968, Jay Smith of Converse, Davis and Associates may have been the first to
gather paleoseismologic data by trenching. Also in 1968, after the Borrego
Mountain, California, earthquake, Malcolm Clark and Art Grantz certainly were
among the first to get important paleoseismologic results from trenching. (Clark,
M.M., Grantz, Arthur, and Rubin, Meyer, "Holocene Activity of the Coyote
Creek Fault as Recorded in Sediments of Lake Cahuilla," in The Borrego
Mountain earthquake of April 3, 1968, U.S. Geological Survey Professional
Paper 787, 1972, pp. 112-130.) M.G. Bonilla (1973) and H.E. Malde (1971)
were also engaging in similar studies at about the same time. (Malde, H.E.,
Geologic Investigations of Faulting Near the National Reactor Testing Station,
U.S. Geological Survey Open-file Report, 1971; Bonilla, M. G., "Trench
Exposure Across Surface Fault Rupture Associated With San Fernando
Earthquake," in National Oceanic and Atmospheric Administration, San
Fernando, California Earthquake, of February 9, 1971, Geological and
Geophysical Studies, v. 3, 1973, pp. 173-182.)
Scott: Some paleoseismologic studies had a profound effect on planning efforts in the Los Angeles area, if I remember correctly. Would you comment on those?
Wallace: Yes. In the late 1970s, when Kerry Sieh was working toward a Ph.D. under
Dick Jahns at Stanford, he latched onto the thesis topic of analyzing young
faulting along the San Andreas fault in central and southern California. I was
pleased to participate as a reviewer at his defense of his thesis. He discovered a
very significant site at Pallett Creek on the north side of the San Gabriel
Mountains where he used the trenching technique very effectively. By
meticulous and detailed analysis, he identified a series of prehistoric offsets
along the San Andreas fault. He determined that the offsets had been formed at
intervals averaging between 140 and 150 years, presumably when earthquakes occurred.
The fact that the most recent great earthquake in that area was in 1857, led to the ominous conclusion that a quiet cycle of about 150 years was probably almost over, and that another great earthquake was imminent. This caused much concern among public officials and the public in the Los Angeles region. Kerry's very careful work, and his ability to communicate the significance of a complex scientific story to the lay public, had a profound effect and helped increase manyfold the region's and the nation's awareness of the value of earthquake-hazard mitigation. Since then, Sieh has continued to be a leader.
Scott: Once it had been established pretty securely, I guess the idea of using paleoseismology spread pretty rapidly.
Wallace: Oh my, yes. By the late 1970s paleoseismology trenches were being excavated routinely in Japan, and in the early 1980s the practice also became widespread in China. I reviewed some of this history in a paper presented at the first international conference on paleoseismology organized in 1987 by Anthony Crone and Eleanor Omdahl. (Wallace, R.E., "A Perspective of Paleoseismology", Directions in Paleoseismology, Proceedings of Conference XXXIX, U.S. Geological Survey Open-file Report 87-673, 1987, pp.7-16.)
So by 1987 paleoseismology was recognized as meriting that kind of
attention. Following that, in 1994 a second major international conference on
paleoseismology was organized by Dave Schwartz and Carol Prentice of the
USGS and Bob Yeats of Oregon State University. In 1995 at least two other
conferences or special sessions are planned in Germany and Italy.
Paleoseismology clearly has come of age.
Scott: You planned to talk more about promoting the introduction of new ideas into
everyday practice? I have always thought of the remarkable USGS Bay Area
planning project as an extremely effective program of outreach and bridging
between researchers, practitioners and public policy people.
A Little Background
Wallace: Government uses buzz words like "technology transfer" in answering the
question: How do we translate what the scientist develops into immediate
application? I take something of a market approach. To use an analogy, if Ford
produces an Edsel, it won't sell; if it produces a Mustang, it will sell. If a scientist produces a useful theory or concept, it will be used; if not, it won't be. We all try to have our ideas and products be useful and used,
and try to put these ideas in useful forms. Our egos feed on such successes.
Scott: On the other hand, much research in its original form needs to be translated or interpreted for use by practitioners, and researchers may need help with that. The successes can also help feed budgets, especially when they involve things seen as useful to society. So everybody should benefit from effective outreach.
Wallace: Yes, "outreach" is another currently popular buzz word, and the Bay Area project is an excellent example. It was in 1970 that we started the San Francisco Bay Region Environmental and Resources Planning Study, a cooperative effort with the Department of Housing and Urban Development. The purpose was to develop some of the fundamental earth-science principles and data sets that would make regional planning possible and effective, then to develop bridges to the planning community. That in turn would promote wider use of earth-science knowledge and ideas in regional planning. The Bay Area project also had a strong earthquake-hazard-mitigation component.
Scott: Give a little background and history of the Bay Area project.
Wallace: In the late 1960s two federal agencies in Washington, USGS and HUD--the
Department of Housing and Urban Development--were considering a joint effort
to aid urban planning by helping with its earth-science aspects. Meanwhile here
in Menlo Park, ideas for such a study had grown out of experience with a
Geologic Hazards Committee in Portola Valley, California.
Portola Valley is a small residential town of about 3,000-4,000 population with a
strong commitment to maintaining its rural atmosphere and setting. It is located
right on the fault--in fact, the "Valley" part of the town's name comes from a
valley that follows the San Andreas fault. Several of us geologists from the
USGS, as well as geologists and geophysicists from Stanford University, and
private consultants, reside in Portola Valley. Naturally, we earth scientists were
very much concerned about the earthquake hazards that faced the town. Dwight
Crowder had organized us and had been the inspiration for the town's planning efforts.
Scott: Yes, others have mentioned Dwight Crowder and the Portola Valley effort that
he got started. So the USGS-HUD interest in Washington, and these local
activities in Portola Valley, came together in a most constructive way?
Wallace: Yes, they did.
Program Initiated in 1969
Wallace: Art Grantz and George Gates, both of whom held administrative posts at the time, took the local ideas back to Washington. To abbreviate the story, the cooperative USGS-HUD program came into being by the end of 1969. George Gates became the first Director. Jim Balsley, as USGS Assistant Director, carried the ball in Washington, and Director Bill Pecora championed and aided the project, from the early discussions on.
When George Gates retired in early 1970, Bill Pecora asked me to take on the direction of the study. Almost simultaneously, I was also assigned the post of Assistant Chief Geologist (soon to be renamed Regional Geologist), Western Region, for the Geologic Division.
For one who hated administration, that was a sorry time for me. For the
San Francisco Bay Area study, we put together a plan that involved all three
operating Divisions of the USGS: Geologic, Topographic, and Water Resources.
Something like $4 million to $5 million was spent on the study. As a result of
the study, close ties were developed between the scientific-technical people, and
the planners and other "decision-makers" around the nation.
At the outset, my counterpart in HUD thought the project would be
immediately able to draw on the 300,000 volumes in the USGS Menlo Park
library for help in urban planning. True, there was a lot of data in the library,
but not accessible to local planners and policymakers in a usable form. We also
lacked some of the fundamental information they needed. So we ended up
having to do a lot of new fundamental studies, basic research and interpretation,
and then developing practical steps for application.
New report formats had to be invented. We created three series of reports:
1. A basic data series.
2. An interpretive series.
3. A series directed specifically toward application, intended for the
planners and others who would really use the material. Preparing
that was a complicated process that had many steps.
After I directed the Bay Area Study for a couple of years, Bob Brown
took over from me and did a magnificent job until its completion around 1976.
The project demonstrated how important earth-science knowledge was in the
planning of urban and regional areas. City and county governments developed
and adopted guidelines based on the work of the study, which produced more
than 100 publications, providing a wealth of information for regional planning
and urban policymaking.
For example, San Mateo County adopted ordinances governing density of
housing on steep slopes and landslide-prone areas. Most communities in the San
Francisco Bay Area that were along the San Andreas fault hired consulting
geologists, prepared fault-hazard maps, and created ordinances to minimize
earthquake hazards. Prudent use of the land became the word of the day.
Scott: Many people may need to be reminded of that Bay Area Study effort and its significance. It had a much broader influence than just on earthquake preparedness and was widely used both in the Bay Area and nationally.
Wallace: Yes, that study won several awards in the planning arena. Also there were
important spinoffs from the effort.
A Spin-off and its Widening Influence
Scott: An example of a spin-off would be very interesting.
Wallace: A perfect example of a "spin-off," as well as of the "upwelling of nonadministrative leadership" alluded to several times elsewhere, came from a
discussion among some younger scientists.
Roger Borcherdt, Bob Page and Rob Wesson, operating outside of official
planning efforts, informally gathered a group together to brainstorm the
possibility of assessing the geographic variations in geologic hazards. A
multidisciplinary, multi-administrative-unit effort resulted, and eventually
became a formal part of the Bay-region study.
This multidisciplinary study produced a comprehensive assessment of the
feasibility of seismic zonation in the San Francisco Bay region. The report was
compiled and edited by Roger Borcherdt, who not long afterward was able to
pass on many of the ideas to an international audience through his role as co-chairman of the Fourth International Conference on Seismic Zonation. The ideas
of a few imaginative scientists, thus, spread internationally. (Borcherdt, R. D.,
ed, 1975a, Studies for seismic zonation of the San Francisco Bay region; U.S.
Geological Survey Professional Paper 941-A, 102 p.)
In addition, the maps produced were incorporated into the "General Seismic
Safety Plan" of most city and county governments in the San Francisco Bay
region. Plans for similar studies were incorporated into the Newmark-Stever
report for the National Science Foundation and the USGS, one of the most
influential reports in finally establishing the Earthquake Hazards Reduction
Reduction Act of 1977.
Even more recently, and perhaps one of the more important milestones that can
be linked back to the early effort, is the California Seismic Hazards Mapping
Act (AB-3897). Passed after the 1994 Northridge earthquake, this act requires
that methodologies and land-use policies be developed for maps showing
geographic variations in earthquake ground shaking, liquefaction, and landsliding.
Scott: That train of events certainly is interesting. So often the origin, blossoming, and
expansion of an idea are difficult to trace.
Wallace: So true! As scientists, we agonize because we may know how the fundamental ideas from research were born, but the path to important practical application is commonly so very obscure.
Bill Kockelman's Role: "Bridging"
The late Bill Kockelman played an essential role that I want to mention.
Bill had previously been New Mexico State Planner and came to USGS to work
as a planner on the Bay Area Study. Later he carried his talents over into the
earthquake program. He developed a rapport with a community of information
users that was wholly different from the usual contacts of our scientific and technical staff.
After he had applied his "bridging" efforts to the earthquake hazard
reduction for a decade, the planning community came to think of Bill as "Mr.
U.S. Geological Survey." He had an intense interest in this important work and
a remarkable ability for effective bridging that few can equal. For his expertise
as a planner, he was invited to serve on the California Seismic Safety
Commission. Later he received the Department of the Interior's Meritorious
Service Award for his work.
Scott: I knew Bill Kockelman well, enjoyed working with him, and served on the
Seismic Safety Commission with him. As part of the USGS outreach effort, he
and Bob Brown wrote a very useful guidebook--a USGS professional paper--to
help Bay Area policymakers and planners use geologic knowledge more
effectively. (Brown, Robert D., Jr., and William J. Kockelman, Geologic
Principles for Prudent Land Use: A Decisionmaker's Guide for the San
Francisco Bay Region, USGS Professional Paper 946, 1983.)
Then I got them to do a streamlined version for the Public Affairs Report
(bulletin of the UC Berkeley Institute of Governmental Studies). Their article
was published in 1985 and widely distributed in the Bay Area and beyond.
(Brown, Robert D., Jr., and William J. Kockelman, "Geology for
Decisionmakers: Protecting Life, Property and Resources," Public Affairs
Report, v. 26, no. 1, February 1985.)
Wallace: During the Bay Area study we often talked about "bridging" as a separate discipline. In fact, as I mentioned before, we treated it in the 1978 Steinbrugge report Issues for an Implementation Plan. You find some scientists who by personality and certain natural talents are effective in reaching across to planners and policy-makers. I think of George Mader as a planner who was also able to reach across and link with seismologists, geologists and engineers. Another person who could reach across was Bob Brown here in USGS, who followed me as director of the Bay Area study.
Wallace: To sum up, the Bay Area project illustrates the marvelous interactions among different disciplines that developed within the USGS. This gave it a power that many mainly scientific research organizations lack. You also mentioned the broader significance of the approach of the Bay Area Study. I should make it clear that the Bay Area Study was conceived as one of several demonstration studies (and not primarily earthquake), another was conducted in the East (Pennsylvania I believe) and a third in Arizona using very different designs and different techniques. Seattle is now getting still another approach, and other experiments are under way, but I don't know the details. The volcano program has similar hazard reduction experiments--witness its work on the Philippine volcano, Mt. Pinatubo, which erupted in 1991.
The whole idea was for the federal government to test approaches and
show the way, after which state and local governments would pick up the long-term implementation. While it seldom happens that way quite as fully as one
might wish, the Bay Area is unquestionably far ahead of the rest of the world,
so the USGS-HUD program must have had a lasting influence.
Scott: The Bay Area Study made excellent use of the Portola Valley experience as
something of a model that other communities might learn from in adapting its
policies to geologic hazards. I am sure it has had a significant continuing
influence in the Bay and more widely in California. I greatly fear that now the
budget crunch and force reductions will make it difficult for USGS to continue
doing innovative things in the future. My view of how our federal system can
work best is not prevailing these days.
Wallace: Yes, and we also get into the philosophic problem of downsizing all government, so popular today.
A Personal Note on Portola Valley
Wallace: As a long-time Portola Valley resident, I include this personal note on my own experience, which also illustrates the difference between voluntary risk acceptance and involuntary risk acceptance. The Portola Valley elementary school, one of three in town at the time, was situated directly over the San Andreas fault. I began to make strong statements at school board meetings, advocating that the school be abandoned. With this as my main policy position, I even tried--unsuccessfully--for a vacancy on the board. When I was interviewed by the board of education in applying to fill the vacancy, I also spoke out for higher teachers' salaries. But they did not like me very much.
Regarding the school on the fault, as a father, I took the position that I
personally was willing to accept the risk for my son to go to this school,
because, during the few years he would be attending the school, the probability
of a killer earthquake was very low, particularly for the relatively few hours he
would be in the most vulnerable room. I thought his chances of being injured in
a school-bus crash on the town's winding roads far exceeded the risk of dying at
the school in an earthquake.
On the other hand, as a potential school-board official, I would need to
consider the hundreds of students who would be exposed to the risk while
attending school there, over many years. I found the idea totally unacceptable
that, during the decades the school would be used, some students would probably
be killed because we required them to attend a school situated on the San Andreas fault.
After a decade or more of my pushing this idea, this school was
abandoned. But I must add that its use was changed at a time when the town's
school-age population was declining. With other schools taking up the slack,
Portola Valley school was turned into the Town Center, a use that drastically
lowered its occupancy level. Many other elements of the Portola Valley
experience could also illuminate practical earthquake hazard implementation
methods, which is why it became an important pilot project in the Bay Area Study.
Scott: Earlier you commented on the influence that individual earthquakes have on
earthquake studies, public policy, and the directions taken by earthquake
preparedness efforts. Would you say more on that, with particular reference to
the San Fernando earthquake, which was very influential in California?
Wallace: The San Fernando earthquake happened on February 9, 1971, just a few days
after we all had met for the Earthquake Engineering Research Institute's annual
meeting near the Los Angeles airport. Everyone immediately became involved
and was swamped with things to do. I flew down from Menlo Park in a light
plane with Jack Healy and we entered the Los Angeles basin flying along the San
Gabriel fault, thinking that we might as well be looking for earthquake things as
soon as possible. It was also a fine opportunity for taking some early air photos
with my beloved Rolleiflex.
Scott: San Fernando had a major impact on programs and mitigation efforts, followed by other earthquakes, such as Coalinga, Loma Prieta, and Northridge, each with their individual and unique impacts.
Wallace: Yes indeed, and in speaking of earthquakes that happened during this period, I
must not forget the Parkfield earthquake of June 1966. Although not large or
damaging, it set into motion what turned out to be the very first formal
earthquake prediction, validated by both the National and State earthquake
prediction councils. That was an important development, both scientifically and
from a disaster-management point of view.
Also, don't let me forget to say a little about another happening with
major impact on the program, the identification of the so-called "Palmdale
Bulge" by Bob Castle. That led to a "first" in hazard notification in 1976 by
USGS Director Vince McKelvey, who had just been delegated responsibility via
FEMA and the Department of Interior to issue warnings and predictions. He
took great pleasure in this "first," and flew to California to notify Governor
Jerry Brown personally about the strange Palmdale Bulge, which was believed
possibly to presage a big earthquake in southern California.
Wallace: Following a few days investigating the San Fernando earthquake, I met Jerry
Eaton at the Burbank airport when we were both on our way back to the Bay
Area. At the time, and despite the USGS's marvelous response to the San
Fernando earthquake, in many ways our earthquake program was in disarray.
Some key figures were gone: Lou Pakiser had moved back to Denver, and
George Gates had retired. In visiting while waiting for the plane, Jerry and I
seemed to say simultaneously, "It is up to you and me to get the USGS
earthquake program on track again."
The task was not easy for two people who disliked administrative roles.
But many things were brewing at the time. The U.S. Office of Management and
Budget (OMB) was beginning to say things about "duplication." The old U.S.
Coast and Geodetic Survey, which had an earthquake studies role for decades,
was now in NOAA, which had a powerful personality as its head--Director Bob White.
I think the first thing Jerry and I did was to recommend through USGS Director Bill Pecora that an advisory committee be reestablished. Frank Press agreed to serve as chairman, which was very fortuitous because by the spring of 1973 OMB was prepared to consolidate the USGS and NOAA earthquake programs into one.
Merger of USGS and NOAA
Scott: What was the makeup of the advisory panel, and how did it proceed?
Wallace: The advisory panel--which was a successor to the one Lou Pakiser had established several years earlier--consisted of non-USGS people from universities, state agencies and private practice. I made a point of expanding the discipline representation, including an earthquake engineer, Karl Steinbrugge. Such panels provide a broad perspective from outside the organization, which is invaluable. Any organization commonly is blind to its own faults. Unfortunately this and other such panels became the victims of a presidential directive to reduce "agencies of the federal government," even though it is questionable that a panel of 10 or 12 should be classed as an agency.
At the June 1973 meeting of the advisory panel, Frank Press admonished
the USGS to prepare a National Earthquake Program. There was some at least
mild resistance, because many felt that we could not speak for NOAA, the
Bureau of Standards and other agencies. Frank essentially scolded us, saying,
"If USGS cannot define a national program, who can and will?"
At first I was not aware that Frank Press and Karl Steinbrugge were
advising OMB on the merger of USGS and NOAA earthquake programs, but
during the meeting that became apparent. It was much later that I learned that
Karl favored having the earthquake program go to NOAA. He had good friends
there, such as Bill Cloud, Don Tocher and Fritz Matthiesen, all of whom were
strong, competent players in the earthquake business.
Scott: I suppose both the USGS and NOAA groups were concerned about how to
approach the merger and what the impact on each agency would be.
Wallace: I should say so! Unfortunately, I seemed to get more and more involved and was
put on the spot to prepare the total plan by drawing on the myriad programs
which Art Grantz, Jerry Eaton, Parke Snavely, I, and many others had written
over the years.
Scott: Give us some insight from your point of view on how things developed.
Wallace: On the Fourth of July, 1973, several of us of the USGS spent the holiday in Washington, D.C., getting out copies of all the USGS papers about earthquakes that had ever been published. I was duly impressed as the piles on a table in the Chief Geologist's office grew and grew until they toppled over. What an impressive record we amassed, going back to G.K. Gilbert in 1883 and Dutton's report on the 1886 Charleston, S.C., earthquake! We were sure that over at NOAA a similar panic exercise was going on.
I had abundant material for the program statement, but until August I had
not made much progress in integrating it. Inasmuch as Frank Press had
"ordered" it completed by the time of the next advisory panel meeting in
September, time was running out. I was scheduled to attend a meeting of the
International Geological Union in Montreal in late August. As things turned out,
my bed in the hotel in Montreal became the work place over which I spread all
of the various pieces of the report-to-be. Program plans that Art Grantz had
prepared were especially helpful. I heard almost none of the fine papers
presented at the meeting.
Scott: But you did get a national plan outlined?
Wallace: Yes, a plan of sorts. It was at least a basis for a national program. As soon as I
had a draft, I turned it over to the USGS Directorate. The document was
published with my name as author, even though I merely compiled materials
prepared by many others. More or less concurrently with the September meeting
of the advisory panel, the report was transmitted by the USGS directorate to OMB.
What an explosion that caused at NOAA. Bill Hess, head of the NOAA
earthquake program, phoned me. I don't remember being so loudly scolded ever
before or since. I had to hold the phone at some distance from my ear. He said
that we were supposed to develop a joint, cooperative program, and that I had
lied and cheated in sending the "national plan" to OMB unilaterally. Of course,
others had made that decision, but I had a part in it.
Frank Press or Karl Steinbrugge would have to report on the details of
action at OMB. Suffice it to say, OMB decided to consolidate the earthquake
programs of NOAA with that of USGS. In 1974 the NOAA group moved from
their San Francisco office to Menlo Park.
Scott: That merger was a major change in a long history of earthquake studies, wasn't it?
Wallace: Yes, it was traumatic for the folks in NOAA, and many of them were very
unhappy. The USGS bent over backward to make the merger as positive a move
as possible, but it was quite difficult. There is always the question of whether
competing programs are wasteful "duplication," useful "redundancies" or very
valuable from a straight "competitive" point of view.
With the merger, the management of the USGS earthquake program
began to take better shape. Bob Hamilton, who had been with the Crustal
Studies Branch in Menlo Park, had moved to Washington, D.C., to be a deputy in
the office of the Chief Geologist, with duties to take care of earthquake program
matters. About the time the NOAA group came in, the USGS earthquake
program was raised to the status of an Office, with Bob designated
as Office Chief by Chief Geologist Dick Sheldon, who headed the Geologic Division.
I was delighted by the earthquake program's added stature. I had finished my tour as Regional Geologist, and a temporary role in dividing up earthquake monies, and could tactfully refuse other administrative assignments. I looked forward to years of research ahead. That was achieved in large part, except for one assignment.
The Office of Earthquake Studies
Scott: I believe that for many years you had the title of Chief Scientist, Office of
Earthquake Studies. Later, "...Volcanoes and Engineering" was added to the title.
Wallace: Yes, that title represented a strange turn of events, and did not carry much
meaning. Chief Geologist Dick Sheldon, and Bob Hamilton, by then Chief,
Office of Earthquake Studies, were concerned about a still-active schism between
the geophysics group--the former Branch of Crustal Studies--and the dozens of
geologists in different branches who felt that geology was being given short
shrift. With the appointment of seismologist Bob Hamilton as Chief of the new
office, geologists felt that they had lost out again--although actually Bob also
qualified as a geologist.
Jointly, Dick and Bob proposed that I take on a title and some
responsibility to formalize the more-or-less informal lead role I had played in the
few months following the San Fernando earthquake. They suggested the title,
"Chief Scientist, Office of Earthquake Studies." The idea was that inasmuch as I
was a geologist, and it was the geologic group who felt disenfranchised, this
would make them feel better represented. I responded with a strong "No way!"
I didn't want anything that smacked of administration.
Under pressure, however, I accepted their idea, on the condition that I
would just do my thing and have no real administrative responsibility. The
ambiguous assignment had its pluses and minuses. To outsiders, it seemed that
some one person did represent the Office in Menlo Park, which was the largest
earthquake group. I had no administrative authority, which internally was well
understood, so I could not and did not try to take certain steps. The only
influence I could exert was by selling an idea, or still better, getting individuals
to invent the same or a similar idea themselves. Every three or four years I
resigned the title, as Bob Hamilton, Rob Wesson, and John Filson each in turn
rotated out as Chief. But the Chief Scientist title stuck until I retired in 1987.
Scott: You have covered important facets of the USGS earthquake program, but there
was a lot more going on, I believe. Could you briefly sketch out some of the
other major efforts?
Wallace: Yes, I will give those things some attention here, but first I want to emphasize that I never intended to make this oral history memoir into a full history of the USGS earthquake program. Instead I have planned all along to deal largely with my own personal research, field work, writings, and travels. I was, of course, exposed to almost all parts of the program at one time or another, but often the involvement was spotty and fragmented. So I would rather not tackle some topics to which I do not feel able to do justice. Interested readers might look at an excellent summary that Tom Hanks did of the program up to 1985. (Hanks, Thomas C., The National Earthquake Hazard Reduction Program--Scientific Status, U.S. Geological Survey Bulletin 1659, 1985.)
Putting Prediction Studies in Context
Scott: In doing that, you might put some of the projects related to prediction into
context. Prediction figured in the public mind and was often treated in the media
like some kind of "Gee Whiz" high-tech spectacular. In reality, a lot of solid
and fundamental scientific work and thought underlay the prediction effort. You
might talk about that as you round out your treatment of the USGS earthquake program.
Wallace: Prediction has already figured in my "Trail of Documents" discussion. First off,
I want to emphasize that "prediction" refers to more than just the prediction of
the day, hour, minute and size of an earthquake. The prediction of the effects of
earthquakes and their distribution is equally important for hazard-reduction
measures. (See: Holzer, Thomas L., "Predicting Earthquake Effects--Learning
from Northridge and Loma Prieta," Science, v. 265, August 26, 1994, pp.1182-1183.)
"Almost everything bends before it breaks" is the simple concept that underlay several of the USGS prediction projects. I often used a stick or a thin piece of cedar roofing shingle to illustrate this point for lay audiences. I would bend the piece of wood until it almost broke. Then as I continued bending it I would ask members of the audience to call out when they thought the stick was about to break. Someone in the audience was nearly always right on the mark.
Then I would tell the group that before the stick broke, I could hear little
crackles and pops as wood strands and fibers gave way. I compared the crackles
to earthquake foreshocks, and the break itself to the main earthquake.
Furthermore, the vibrations I felt when the stick broke were like the ground
shaking that goes with an earthquake. I pointed out that we would be better able
to predict the stick's breaking point if we knew more precisely how strong the
stick was, how its fibers were arranged, and how hard I was pressing.
Scott: You used the breaking-stick analogy to demonstrate the range of geologic
phenomena that must be studied to get a better idea of when earthquakes may occur.
Wallace: Yes, and the analogy suggests some major parts of the prediction program. How
can we measure bending (straining) of the Earth's crust before the big break?
How can small shocks noted before a major earthquake be identified as
foreshocks of the big event and not just the usual background of seismic activity?
How can we predict the strength of the earthquake shaking?
Scott: Those are straightforward questions, but are probably very hard to answer.
Wallace: I should say so. In 1960, who knew how to measure the bending (straining) of the earth's crust over hundreds or thousands of kilometers, or even over much smaller distances, in both two and three dimensions, and in a suitable time frame? Just how do you set about identifying small shocks as foreshocks?
Each idea for an experiment to address one of these problems is bound to
have branches and subtopics. Furthermore, the simple model of an earth that
behaves elastically is complicated by the presence of water almost everywhere.
Water makes rocks less brittle and more ductile. Similarly, the greater pressures
and heat prevailing at depths in the crust make rocks behave in a more ductile manner.
Measuring Crustal Bending and Warping (Straining)
Scott: How is the bending of the earth's crust measured?
Wallace: Methods first used for measuring bending or warping (straining) near the San Andreas fault were rather straightforward. From the 1960s on, the USGS tested and refined techniques such as alignment arrays, creepmeters, and tilt meters. Straight lines of monuments were placed in a fence-row pattern across the San Andreas fault and resurveyed repeatedly for any deviation from the original alignment. This might detect slight precursory movements.
So-called "creepmeters" were made of invar steel wires, perhaps 100
meters long and suspended in buried, protecting pipes. The wires were anchored
diagonally across the fault; one end was attached to a sensitive measuring device.
If even very small movements on the fault occurred, the amount of extension or
shortening was recorded. Bob Burford and Sandra Schulz (later to become Mrs.
Burford) spent many years recording changes in the earth using these techniques.
The Parkfield prediction area was an important target.
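The geometry behind reading a creepmeter wire can be sketched in a few lines. This is a hypothetical illustration, not USGS code: the function name and the 30-degree wire angle are assumptions made for the example.

```python
import math

def fault_slip_from_wire(extension_mm, wire_angle_deg):
    """Convert the measured change in length of a creepmeter wire
    anchored diagonally across a fault into fault-parallel slip.

    For a wire crossing the fault at angle wire_angle_deg (measured
    from the fault trace), a small fault-parallel slip s stretches
    the wire by roughly s * cos(angle), so s ~= extension / cos(angle).
    """
    return extension_mm / math.cos(math.radians(wire_angle_deg))

# A wire at 30 degrees to the fault that lengthens by 1.0 mm
# implies roughly 1.15 mm of fault-parallel slip.
slip = fault_slip_from_wire(1.0, 30.0)
```

The diagonal anchoring matters: a wire strung exactly along the fault would barely change length at all, while the cosine factor lets a modest angle translate slip into a measurable extension.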
Sensitive "tilt meters" were placed in patterns to detect very slight disturbances of the ground surface. A tilt meter is essentially a levelling device, commonly a bubble sensor electronically attached to a recorder, which can measure very minute changes in the slope of the ground.
As simple as they were, each of these methods required the invention and
construction of basic new instruments, followed by designing ways to protect the
devices from temperature and weather changes. In the shallow vaults built to
protect the recording devices at the end of a creep-meter, Sandy Schulz would
often find a family of black-widow spiders climbing around the recorder, or a
pool of rain water covering the instruments. Many variations in instrumentation
were tried to determine which were the best, and of course cost was always a
factor. How could it be done at lower cost and with less maintenance?
Improvisation was a daily affair.
Much interesting data was obtained, and some of it seemed to hold
promise for prediction. Tectonic creep--the slow slip sometimes observed along
faults--would proceed at a constant rate for a time. Then, not uncommonly just
before an earthquake, the rate of creep would slow down, halt, or speed up. But
there was always a big background question. Could these near-surface changes
really give a good clue as to what was happening at the depths where earthquakes originate?
Measuring Long Surface Lines
Wallace: When the USGS program began to take shape in the 1960s, there were no entirely satisfactory methods for measuring longer lines on the earth's surface. At first we relied on antiquated and unsuitable surveying methods; then the invention and development of the laser provided a new and potentially powerful approach. Taking each laser measurement was complicated, however, because the determined distance varied according to air temperature and humidity along the line of sight. For a time, light airplanes were flown along the line being measured, in order to make the atmospheric measurements while the lengths were being determined. Later, to overcome the correction problem, the USGS had two-color lasers designed and built. The two-color instruments could automatically correct for variations in air temperature and humidity along the lines of sight, and more frequent line measurements became practical.
A very successful two-color unit was deployed at Parkfield for the
prediction experiment there. The unit has performed very well over the years.
But the cost of building and operating custom-made units proved far too great for
our budgets. Consequently, and sad to say, similar laser units have never been
used for measurements along other reaches of the fault.
Following the launch of Sputnik in 1957, earth-orbiting satellites gave
science a way to view large areas of the crust simultaneously. Now the Global
Positioning System (GPS), developed by the U.S. Air Force, uses signals from
several orbiting satellites in a ranging mode to give positions routinely. The
potentials of this technique are now being realized in the earthquake prediction program.
I can remember in the 1970s hearing presentations to the Committee on
Seismology, National Academy of Sciences, on the precision of space techniques.
Early on, investigators were pleased to be able to make measurements accurate
to within hundreds of meters, then to meters and then to centimeters. Now
aircraft or hikers can use light-weight GPS units to determine precisely where
they are anywhere in the world. Automation of the process permits aircraft to be
guided to a landing automatically, without human intervention.
Jim Savage and associates, especially his colleague Mike Lisowski, have
been especially successful in using the long-line techniques, including
trilateration networks, laser ranging and GPS, to track crustal changes in the
western U.S. and Alaska. One of their papers recounts a long-term study in
Nevada and illustrates some of the techniques used and types of findings.
(Savage, J.C., Lisowski, M., Svarc, J.L., and Gross, W.K., "Strain
Accumulation Across the Central Nevada Seismic Zone, 1973-1994," Journal of
Geophysical Research, v.100, n.B10, 1995, pp. 20,257-20,269.) Efforts to
monitor the strain across the Long Valley caldera and Mammoth Mountain on
the east side of the Sierra Nevada have been extremely important in trying to
keep abreast of the possibility of major earthquakes and volcanic eruptions there.
Dave Hill has been the principal investigator of the Long Valley caldera.
The potential for both a major eruption and a major earthquake has caused local
consternation, inasmuch as the area is a major ski resort with as many as 50,000
visitors during winter weekends. So the USGS has attempted to maintain an on-going evaluation of the hazardous situation there.
While pursuing his specialty, which is seismology, Dave has been kept
busy watching a wide range of changes, e.g. emission of excessive amounts of
carbon dioxide, constantly providing the public with up-to-date information,
and doing his own scientific research. (Hill, David P., "Earthquakes and Carbon
Dioxide Beneath Mammoth Mountain, California." Seismological Research
Letters, v.67, n.1, 1996, pp.8-15.)
The Palmdale Bulge
Scott: I remember that the so-called Palmdale Bulge in southern California attracted a
great deal of attention for several years. It was also called the Southern
California Uplift, particularly by those who did not like the Palmdale area
singled out. There was a considerable debate about the Bulge, during which
even its very existence was seriously questioned. The Bulge issue was in a way
part of the earthquake prediction discussion. Can you say a word about that?
Wallace: Yes. The phenomenon was identified by some of the geodetic techniques also
used for prediction. The major finding that called attention to the bulge was
based on leveling records gathered over many decades by the National Geodetic
Survey (NGS) and others. Bob Castle (USGS) in 1974 began studying and
reviewing these records, obtained largely from NGS.
Regional leveling had been done much as it is done in local surveys such
as those done in laying out foundations for homes. Using a telescopic level to
view a rod placed at a point some distance away, the difference in elevation
between two points is determined. When the leveling must be extended over
miles or even hundreds of miles, however, very stringent procedures are
essential to prevent the accumulation of small routine or systematic errors that
could otherwise significantly distort the results.
Many level lines had been surveyed across southern California by both
the NGS and others working on railroads, highways and pipelines. It was these
records that Bob Castle reviewed, and to make a long story short, he found that
an enormous elliptically-shaped area had risen episodically by as much as 0.45
meters during the previous several decades. The area covered much of the San
Gabriel and San Bernardino Mountains, and beyond to the west and east,
extending in width from the front of the ranges on the south to well out in the
Mojave Desert to the north.
Scott: I believe it was called the Palmdale Bulge because it sort of centered on Palmdale, or the highest uplift was noted as near Palmdale. Is that correct?
Wallace: Yes, the uplift became known as the "Palmdale Bulge" because the town of
Palmdale on the north side of the San Gabriel Mountains was in the area of
greatest uplift. This pattern of uplift was not particularly surprising to me,
however, or to many other geologists either, because the geomorphology of the
Transverse Ranges clearly indicates a pattern of uplift over the past million years
or more, with rapid uplift in most recent geologic time.
Nevertheless, such clear instrumental evidence of modern uplift was
alarming. Among other concerns, the area defined by the uplift lay astride the
San Andreas fault, and across a part of the fault whose 1857 rupture produced
one of California's great historic earthquakes. Elsewhere I have told how Kerry
Sieh had estimated that the recurrence interval of great earthquakes on that fault
segment approximated the elapsed time since the 1857 earthquake.
Was another earthquake the size of 1857 about to occur? Considering
what I described earlier about the crust of the earth bending before breaking,
perhaps here was a mammoth-sized precursor of an impending great earthquake
in southern California.
Scott: The phenomenon did have ominous overtones. Because the implications seemed
very serious, I can see why USGS Director Vince McKelvey took Bob Castle's
analysis very seriously, and made a trip to California in 1976 to alert and brief
the Governor personally. The Bulge issue riveted the attention of the new
Seismic Safety Commission, all the more so since Commissioner Bob Rigney
was the administrative officer of San Bernardino County, whose territory ran
near Palmdale. With so much interest, I presume that the data and its
interpretation got some pretty thorough independent scrutiny by others, to
minimize the chance of some mistake having crept in?
Wallace: Yes indeed, they were checked by many people and by several independent
techniques. Perhaps the most significant challenge came from David Jackson, a
geophysicist at the University of California at Los Angeles. Jackson's analyses
led him to believe that systematic errors in the original surveying had led to the
misinterpretation of an uplift. The suspected errors concerned the accuracy of the level rods
and improper adjustments for weather factors and refraction along the lines of
sight between the rods and the leveling instrument.
As a result, many other tests were run, additional level lines were
measured, and other techniques were tried, in order to get independent
confirmation or denial of the Palmdale Bulge's existence. Among these, several
studies in the changes of gravity across the Bulge area strongly supported the
reality of an uplift. While the Palmdale Bulge survived these close scrutinies, in
the final analysis the height of the uplift may prove to have been less than originally reported.
The Bulge was first recognized not long after the 1971 San Fernando
earthquake, and since then numerous earthquakes of moderate size have occurred
along the southern flank of the Bulge, including such damaging earthquakes as
Landers (1992) and Northridge (1994). Clearly the southern
California region has been unusually active in the past few decades, and
earthquakes must be considered part of the overall pattern yet to be understood.
I wish we had a way to judge the significance of these earthquakes, because they
could be part of a pattern leading up to a great, disastrous earthquake. The final
chapter on this is yet to be written.
Seismological Studies and the Parkfield Experiment
Scott: You mention seismology, Dave Hill's specialty. Presumably the USGS had
many such projects going. Could you say something about them, particularly as
related to earthquake studies?
Wallace: The projects range from global seismology to detailed local studies, from the study of extremely large earthquakes to microearthquakes, and from foreshocks to the changes in normal patterns of seismicity. Within each area of study exciting possibilities continually show up in the search for the Holy Grail of prediction. Generally, however, very few of even the most exciting possibilities have had sufficient funding for adequate follow-up. The same holds true in the search for better ways of defining related potential hazards, regardless of when the big event might occur.
Scott: A mostly seismology project that was exceptional in attracting a great
deal of attention and getting quite a lot of funding was the Parkfield study.
Maybe you could start off with the Parkfield study. It was a very big thing for
several years. I even recall the Seismic Safety Commission taking a day-long
field trip to Parkfield for briefings on the whole program.
Wallace: Let me begin with a little background. Many in the USGS earthquake group that
came together after the 1964 Alaskan earthquake were excited about the idea of
earthquake prediction. It would require scientific investigations at the cutting
edge of science, and the potential societal benefits seemed to be enormous. As
noted earlier, Frank Press, later Science Adviser to President Carter, chaired a panel
that wrote the first major proposal in 1965 for a national program to reduce the
hazards of earthquakes. That proposal outlined a ten year program for learning
how to predict earthquakes.
The stage thus was set when a moderate-sized (M 5.5) earthquake struck the hamlet of Parkfield along the San Andreas fault in 1966 in a remote area of central California. The main earthquake hit at 9:26 p.m. on the evening of June 27th. A group of seismologists and geologists at Caltech, including Clarence Allen, left Pasadena for Parkfield soon after they had an epicentral location determined for the earthquake.
The next morning a USGS group, including Lou Pakiser, Doc (Manuel)
Bonilla, and me, set out for Parkfield. Where we crossed the general trace of
the San Andreas fault zone near Cholame, we stopped the car, got out and
looked for ground fractures that might have occurred during the earthquake.
"Eureka," there they were. The main fracture offset the white line in the middle
of Highway 46 by about 5 cm, and the apparent movement--east side to the
southeast--was what we would expect along the strike-slip San Andreas fault.
Scott: I guess that was the beginning of the interest in the Parkfield region?
Wallace: Yes it was, and some exciting things began to show up immediately. We found
Clarence Allen lying out in the shade of a big live oak tree, after an all-night
stint of driving and field study using headlights and flashlights. "Did you see the
offset in the white line on Highway 46?" he asked. "Yes we did," I said, taking
out my sketch of the offset. "It was about 5 cm," I reported, "or nearly 2 inches."
"It was?" Clarence replied incredulously. "I measured only about 1 inch." We
soon realized that we had encountered a new, previously unreported
phenomenon--now known as post-earthquake "creep" or "slip." That was very exciting.
Scott: I can imagine. I know about Karl Steinbrugge's discovery of tectonic creep at
the winery near Hollister. So at Parkfield you were getting first-hand exposure
to a new form of creep?
Wallace: Yes. It soon led to the notion that creep might speed up before an earthquake,
and thus serve as a good precursor. Later that same day we heard about a water
pipe line that broke where it crossed the San Andreas fault near the Wilson
ranch, and that the break had occurred about nine hours before the June 27
earthquake. The break in the pipe showed the characteristic movement on the
San Andreas fault--east side to the southeast. Did this truly represent pre-earthquake fault slip?
A few days later I examined the site carefully, looking for other possible
causes of the pipe break, such as being bumped or shifted by moving farm
machinery. After questioning local ranchers and making careful examinations, I
had found no explanation for the break other than pre-earthquake fault slip.
A local rancher, Herbert Durham, took me to a spot on Turkey Flat road
where the 1966 fault break had offset the road and recounted how the fault also
broke there in 1934. He told me of the difficulty he had back then getting his
team of horses to cross the break.
Clarence Allen reported that on June 16, 1966, some eleven days before
the earthquake, he had taken a group of Japanese scientists along this part of the
fault. Clear ground fractures could be seen even then. Dr. Keichi Kasahara, of
Japan, had taken photographs. With D.B. Slemmons, University of Nevada, as intermediary, Dr. Kasahara kindly provided me with a copy of the
photographs for use in our paper on the earthquake. Here again was a strong
suggestion of slip and cracking before the earthquake. At that time, however,
we did not have a good idea of the extent of fault creep in the absence of an earthquake.
Two years later, in 1968, Bob Brown and I reported in print that a creeping section of the fault extended from Cholame northwest to San Juan Bautista, but that south of Cholame the San Andreas fault seemed to be "locked," and no slip had been observed since the earthquake of 1857. We reached this conclusion by a study of fence lines. Northwest of Cholame, fences were clearly offset where they crossed the fault, but fences south of the latitude of Cholame were not offset. Our paper was the first to state the distribution and rates of slip along this creeping section and to note the sharp change to no slip to the southeast. Later, more precise measurements confirmed our crude first determinations. (Brown, R.D.Jr. and Wallace, R.E., "Current and Historic Fault Movement Along the San Andreas Fault Between Paicines and Camp Dix, California," Proceedings of Conference on Geologic Problems of San Andreas Fault System; Stanford University Publications, Geological Sciences, v. XI., 1968, pp. 22-41.)
In those days I became very optimistic that we would find many more
things happening before earthquakes on which we could base predictions. I
could not and still do not believe that a gigantic event involving perhaps
hundreds or thousands of cubic kilometers of the earth's crust could sneak up on
us without some warning. Could an elephant creep into our camp without us at
least hearing some slight noise, if we had the proper detection system?
Scott: I can see why your 1966 experiences made you optimistic about prediction. But
I believe only considerably later was any effort made to predict a repeat of the 1966 earthquake?
Wallace: That is right. In May 1984 Bill Bakun (USGS) and Tom McEvilly (University
of California, Berkeley) noted in print that the 1966 earthquake had been
preceded by almost identical events in 1922 and 1934. The detailed lines on the
seismographic records of the three earthquakes could be laid exactly one over
another. In their joint paper Bakun and McEvilly suggested that the next such
"characteristic" earthquakes might occur between 1983 and 1993. (Bakun,
W.H., and McEvilly, T.V., "Recurrence Model and Parkfield, California,
Earthquakes," Journal of Geophysical Research, v. 89, no. B5, 1984, pp. 3051-3058.)
This idea was amplified by Bill Bakun and Al Lindh in two papers published in 1985, although written in 1984, and they added the evidence that after 1857 a series of five very similar earthquakes had occurred, spaced about 22 years apart. On that basis they suggested a ten-year time window for the next Parkfield earthquake. In the Terra paper the prediction is stated as "within a ten-year time frame window centered on 1987-1988." (Bakun, W.H., and Lindh, A.G., "The Parkfield, California, Prediction Experiment," Earthquake Prediction Research, 3, 1985, pp. 285-304, published by Terra Scientific Publishing Co., Tokyo, Japan.) (Bakun, W.H., and Lindh, A.G., "The Parkfield, California, Earthquake Prediction Experiment," Science, v. 229, 1985, pp. 619-624.)
In early 1984 the emerging Parkfield prediction matter was reported at a
meeting of the National Earthquake Prediction Evaluation Council (NEPEC).
But just before the Parkfield discussion, and by sheer coincidence, I had
presented a proposed set of definitions of terms to apply to predictions. Jim
Davis (California Division of Mines and Geology) and Karen McNally
(University of California at Santa Cruz) and I had been designated as a
committee of the Southern California Earthquake Preparedness Project (SCEPP)
to prepare definitions.
The definitions had been requested to serve as a guide for southern California jurisdictions trying to design response measures to predictions. A misconception had grown that scientists were prepared at any moment to produce a valid prediction of a disastrous earthquake in southern California. That idea prompted SCEPP to contract with several communities, such as the City and County of Los Angeles, to prepare response plans.
Scott: All of this reminds me of the widespread excitement that prediction had generated back then. The idea had taken hold, seemingly all around the world.
Wallace: The national council's having been presented with an apparent prediction of a
Parkfield earthquake at the same meeting at which a new set of prediction definitions
was approved made it almost inevitable that Parkfield should be designated as a
formal "prediction." Soon after, the California Earthquake Prediction
Evaluation Council also formally accepted the Parkfield prediction as reasonable.
Within the earthquake hazard reduction program, an experiment to "trap"
a potentially damaging earthquake had been discussed ever since the Press report
had made such proposals. Parkfield seemed to be an excellent site for the experiment.
Scott: So prompted by the Parkfield "prediction," the scientific community began
marshalling resources for a "trapping" experiment there?
Wallace: Yes. In addition, those who were administratively responsible for disaster response measures--especially at the state level--were also activated. The network of seismometers was gradually enlarged and improved, a project to monitor water wells was started, more tilt meters, magnetometers, strong-motion recorders, and geodetic nets were put in place, and a time-lapse photo net was set up. Signals from each net were telemetered to Menlo Park, where they could be compared and analyzed. In addition, a local headquarters was established near Parkfield in an old farm house next to the fault.
I should interject here that although the "characteristic" M 5.5-6 earthquake held the focus of the Parkfield prediction, a series of interpretations arose that suggested the next earthquake might be a magnitude 7. Furthermore, many considered it not unreasonable that the successor to the great 1857 earthquake might start in the Parkfield reach of the fault, as it may have in 1857, possibly starting with a Parkfield-type earthquake as a point of nucleation. Concern ran high for several years, and still persists in a more moderate form. Meanwhile there has been no new hard evidence.
In connection with the Parkfield prediction, the California Office of
Emergency Services began designing and testing procedures for coordinating
local jurisdictions and the state in handling communications and prediction
announcements, and in developing responses. New terminology had to be
developed, radio frequencies assigned, and chains of command and delegations
of authority had to be devised. This was all considered essential for an
earthquake prediction to have beneficial societal consequences.
The USGS management in Reston, Virginia, had to ease its stringent
policy of requiring official headquarters clearance of important announcements
or news stories. The old procedures would not work, given the short time for
developing a prediction from seismic data analysis and formulating an
announcement. While that might seem like a relatively simple matter to rectify,
it actually took several years to achieve a satisfactory delegation of authority to
smaller offices such as Menlo Park, where predictions were most likely to be generated.
Bill Bakun and many others worked long and hard on specific criteria and
definitions, to designate predictions of different levels as indicated by signals
from the different instrumental systems. Finally the USGS headquarters came to
trust the staff at Menlo Park with the important duty of issuing predictions! In
fact that delegation may have been one of the most important administrative
accomplishments to come out of the Parkfield prediction experiment.
Scott: As one who has studied government in action, I can understand the extraordinary difficulty of achieving such delegation, particularly when the decisions concern sensitive, high-visibility matters. USGS deserves some real credit for what it was able to do. I also think many other scientific and public policy agencies learned a lot from Parkfield.
Wallace: Yes, a lot of good was done. Unfortunately, however, the predicted Parkfield earthquake simply did not happen within the time window where it had been expected. It still has not occurred, even as we are concluding work on this oral history.
The predicted earthquake had been assigned a probability of about 10
percent per year, with an estimated cumulative probability of well over 90
percent within 30 years. Although it missed the time-window, most scientists
involved with Parkfield felt that a characteristic earthquake of about 5.5-6
magnitude would still occur, following the pattern of nucleating in the same
small volume of rock along the fault where the 1966 event began.
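The probability figures Wallace quotes are consistent with each other, as a quick check shows. Assuming the roughly 10 percent annual probability applies independently year to year (an assumption for illustration, not a statement of the actual hazard model):

```python
# Quick check of the cumulative-probability arithmetic quoted for the
# Parkfield prediction: ~10 percent per year implies well over 90
# percent within 30 years, if each year is treated as independent.
p_year = 0.10
p_30yr = 1.0 - (1.0 - p_year) ** 30  # chance of at least one event in 30 years
print(round(p_30yr, 3))  # 0.958
```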
Despite the Parkfield earthquake's failure to arrive on time, many things
were learned about earthquake processes that occur before, during and after
moderate-sized events. Many smaller earthquakes were captured on the dense
array of instruments deployed at Parkfield, and hundreds of technical papers
have been written reporting a wealth of scientific findings.
Scott: In short, the Parkfield project produced an enormous amount of data and
substantially furthered our understanding of the earthquake process itself?
Wallace: No question about that! The failure to predict the specific time of the earthquake soon led to growing pressure to close down the prediction experiment and shift the funds to other projects. In response the USGS convened a special working group (made up of non-USGS scientists and emergency-management experts) of the National Earthquake Prediction Evaluation Council to evaluate the results of the experiment and to advise on what should be done: close down the experiment or continue at some level?
The working group reported in 1995, and its recommendations included
"The USGS should recognize and provide support for the Experiment as a
scientific experiment in the broader integrated context of an actual public
policy activity." (p. 11)
"Parkfield remains the best identified locale to trap an earthquake." (p.
"....the Experiment should be viewed with a long-term perspective. The
Experiment should not stagnate: rather it should continue to evolve."
(National Earthquake Prediction Evaluation Council Working Group,
B.H. Hager (ch.), Earthquake Research at Parkfield, California, for 1993 and
Beyond, Report of the NEPEC Working Group to Evaluate the Parkfield
Earthquake Prediction Experiment, U.S. Geological Survey Circular 1116, pp. 1-14, 1995.)
Scott: The working group clearly concluded that the Parkfield prediction experiment
had been very worthwhile.
Wallace: Let's now shift from the localized Parkfield project to other earthquake studies,
starting with global seismology. For many years Waverly Person in the USGS
office in Golden, Colorado, has been responsible for reporting earthquakes on
the global network. In a worldwide cooperative effort, Waverly draws on the
records from myriad institutions and universities around the world, as well as
managing USGS networks.
To serve the cooperative organizations well, more than a decade ago
Waverly's group was one of the very first to distribute vast quantities of data on
CD-ROMs. First, preliminary determinations of earthquake epicenters
were released, even before the majority of investigators had computers. This first
experiment in data distribution has been expanded in almost every other program
of the USGS: geologic map data, water resource data, and topographic data in
digital form are now available on CD-ROMs.
Wallace: Microearthquake seismology has become one of the most valuable and
productive parts of the USGS earthquake program, and its history is well
described by Jerry Eaton. (Eaton, J.P., 1996, Microearthquake Seismology in
USGS Volcano and Earthquake Hazards Studies: 1953-1995, U.S. Geological
Survey Open-file report 96-54, 1996.) It is hard to overstate the importance of
the microearthquake seismology program's accomplishments.
I cannot overemphasize the contribution of the microearthquake nets to understanding fault behavior along the San Andreas system. The nets have given us our first four-dimensional picture of earthquake occurrences along the faults (time being the fourth dimension). Microearthquake data clearly define the base of the brittle zone that produces earthquakes, as well as irregularities in the base and laterally across the fault. The timing of extensions and migration of seismicity are readily seen, as are gaps in seismicity. The clustering of aftershocks observed suggests significant physical processes.
I have already mentioned Jerry Eaton's development of microearthquake
nets, which he did first in Hawaii, and then along the San Andreas fault in
central California. Jerry made many of the early instruments out of begged and
borrowed components. But as the usefulness of mapping seismicity in time and
space became apparent, the net in central California was expanded to southern California.
Wallace: Automation in analyzing data became a necessity, and real-time processing
(RTP) by computer evolved through efforts of Sam Stewart, Rex Allen, and
many others. Most small earthquakes are now located automatically, and
magnitudes assigned, in a matter of seconds. An office of the USGS was
established in Pasadena in southern California adjacent to the Caltech offices to
facilitate close cooperation with that major center for seismologic research. Tom
Heaton and Lucy Jones became major players there.
Tom Heaton initiated and has led the way to what is truly a very short-term prediction technique--a prediction measured in seconds to tens of seconds.
It depends on having one or more seismometers very close to an epicenter. As
soon as the start of earthquake movement is detected, an electronic alarm is
transmitted to outlying areas. By arriving many seconds before the damaging
earthquake shaking arrives, the warning signal can automatically trigger a variety
of emergency measures--such as around nuclear power plants or along rail or
transport routes. Or automatic alerts can be flashed by radio and TV.
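The timing arithmetic behind this early-warning idea can be sketched briefly. The telemetered alert travels at essentially the speed of light, so the warning time at a distant site is roughly the travel time of the damaging shear waves minus the time needed to detect the event and issue the alert. The wave speed and processing delay below are assumed typical values for illustration, not figures from the actual system:

```python
# Hedged sketch of early-warning timing: a detection near the epicenter
# is telemetered (effectively instantaneously) to a distant site, where
# it arrives ahead of the slower, damaging S waves.

VS = 3.5  # assumed S-wave speed in the crust, km/s (typical value)

def warning_seconds(distance_km, processing_s=3.0):
    """Approximate seconds of warning at a site distance_km from the
    epicenter, given processing_s seconds to detect and issue the alert."""
    s_arrival = distance_km / VS      # when strong shaking arrives
    return max(0.0, s_arrival - processing_s)

# A site 100 km away gets on the order of 25 seconds of warning;
# a site nearly on top of the epicenter gets none.
print(round(warning_seconds(100.0), 1))
print(warning_seconds(5.0))
```

This is why the technique favors sites tens of kilometers or more from the source: close in, the shaking outruns the alert.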
Scott: Say a little more about the deployment of microearthquake nets.
Wallace: Microearthquake nets became essential for the study of other regions. After the
great earthquake of 1964 in Anchorage, the seismically active areas of Alaska
could not be ignored. Bob Page and John Lahr have led the investigations there
in recent years. Microearthquake nets and investigations were also established in
Nevada, Oregon, Washington, the mid-continent, and along the eastern
seaboard. Most of these studies were carried out cooperatively with
universities. Finding funds for expansion and maintenance of the regional
networks became a major perennial problem. Many other federal and state agencies were involved as well.
Temporary, portable nets are deployed immediately after almost every
major earthquake. At first, the seismic signals were scribed mechanically and
directly as the swinging arm of the seismograph scratched its movements onto
the smoked drums. As technology progressed the signals were recorded on
photographic film. More recently, the signals are sent in digital form to be
analyzed and stored by various computer techniques.
Scott: Those are valuable observations on some of the management and budgetary problems of program operation. Could you now say more about some of the other elements of the seismology effort? What about the strong-motion studies that engineers view as so important to their needs?
Wallace: Yes, I want to stress the importance of strong ground-motion studies. To study
strong motion requires a very different approach than regional or global
seismicity, different instrumentation and different deployment. Ordinary
seismographs are driven beyond their limits by strong ground motion and thus
fail to record significant details of motion. From very crude instrumentation, we
have progressed to where shaking across a very wide dynamic range (from very
weak to very strong) can be fully recorded. The advent of digital
recording underlies this fundamental progress.
In recent years, the strong signals can be dissected, and each new
generation of computers permits analyses of more and more complex signals.
As a geologist fascinated with faults, I have been excited by the work of
seismologists such as Ralph Archuleta (now at the University of California,
Santa Barbara), Paul Spudich, and Roger Borcherdt. (Editor's Note: Roger
Borcherdt received EERI's Outstanding Paper Award at the Annual Meeting in
Los Angeles, February 1996. The award was given for his 1994 Earthquake
Spectra paper on a methodology for estimating site-dependent response spectra.)
Analyses of strong-motion records have permitted them and others to
show exactly where rupture begins on a fault plane, then how a fracture spreads
in a microsecond time frame up and down and along the length of the fault plane.
I can see such information eventually letting us identify the weak and strong
places on a fault and thus permitting us to focus attention in just the right places
to look for predictive signals. Large lobes of directed energy commonly radiate
from a rupture in a process that still is not fully understood. Strong-motion
studies are essential to a better grasp of what is going on.
Scott: Yes, and we have been making a lot of progress in collecting good strong motion
information, through instrument installations in buildings, and so-called "free-field" installations away from buildings. A Los Angeles ordinance requiring
such instrumentation was a major breakthrough, and later a state-wide program
was set up, administered by the Division of Mines and Geology, which
eventually took over the Los Angeles program. The installations are financed
from a very small surcharge collected through fees for building permits. I think
the program is generally viewed as one of California's great success stories in
obtaining information that is essential to effective earthquake engineering.
Wallace: Yes, good estimates of probable strong motion are crucial for earthquake engineers, because it is the effect of strong ground motion on buildings that causes failure and collapse. The study of both the source of the motion in the earth and the response of buildings to that motion stand out as of the highest priority in assisting engineers to design earthquake-resistant buildings.
For decades Ted Algermissen and his colleagues worked at preparing and improving maps of the United States showing estimates of the strong ground motion to be expected in each region. Engineers have used these maps to set basic parameters for design and construction. In the field of strong-ground-motion seismology, Bill Joyner and Dave Boore have taken their fundamental studies and, working with engineers, have helped to keep building codes abreast of the state of the science.
Roger Borcherdt and several other investigators found ways to estimate
local variations in strong ground motion, and have prepared detailed maps
(microzonation maps) to depict these local variations. Addressing the
microzonation problem after the Loma Prieta earthquake, Jack Evernden
developed a powerful algorithm which, with the help of computers, permitted
him to successfully predict in considerable detail the strong motion likely to be
generated by any earthquake.
Selecting Priorities in a Complex Program
Scott: It sounds enormously complex.
Wallace: Vast amounts of data have been accumulated, creating great opportunities for
new scientific breakthroughs. But the potential is largely untapped, as available
scientific manpower has seldom equalled the opportunities begging to be pursued.
Other parts of the earthquake hazard reduction program always were in
competition with the seismology program. I remember scientists complaining
again and again that seismic nets were using so much of the budget, that "My
more important project cannot be funded or staffed." But how can we possibly
understand earthquakes if we don't know where and how big they are? Without
seismic records myriad cause-and-effect relations can never be established.
Even as I have been writing this, in brown-bag luncheon discussions at
the tables outside USGS Building 8, W.P. (Porter) Irwin repeatedly asserts, "If
even a small part of the funding had been devoted to geologic mapping, we
would now be way ahead in our understanding of the earthquake processes, and,
furthermore, we would have an invaluable legacy for a variety of other uses."
Earl Brabb angrily adds that, "The earthquake program would not continue
support for the landslide program that I labored so hard to develop under the Bay
Area study--even though landslides triggered by earthquakes are a major hazard."
At the same brown-bag lunches, Jack Healy holds to his conviction, "If
we had just put our money into drilling deep holes in the regions where
earthquakes form--2 to 10 km down--and then instrumented them properly, we
would by now have earthquake prediction in hand." These three brilliant and
productive scientists now are in emeritus status, but a wealth of diverse ideas still
swirl among the fully-active scientific staff.
Scott: I presume USGS staff members generate intriguing and potentially valuable ideas
all the time. It must be hard for managers, who have to select what to support,
and what not to support, particularly when funds are limited.
Wallace: Yes, that is a perennial problem. As the USGS rotates managers, you can be
sure that the personal ideas and prejudices of the manager of the moment play
strongly into the decisions made, especially the distribution of money.
Unfortunately, the manager's judgments about projects and directions are never
neutral, in so far as scientific staff is concerned. That is just human nature. So
it could put things in a rut if a single manager's preferences regarding
approaches and personnel were to dominate for an extended period.
Scott: The geologic study of earthquakes not only deals with phenomena at or near the
surface, such as visible fault breaks or shallow earthquakes but also has to
consider things that go on much deeper down. Would you say something about
that kind of work?
Wallace: The crust of the earth is not a single, simple block of rock, but is complexly
layered and divided into a multitude of blocks and blobs of different rock
materials. At places where the crust has been upwarped and eroded deeply, we
can see this diversity at the surface.
It takes "remote sensing" techniques, however, to determine the
arrangement of things at depth. Deep holes can be drilled, as in the exploration
for petroleum, but generally for the very deep crustal exploration, the numbers
and depths of drill holes needed far exceed realistic funding possibilities.
Geophysical techniques include mapping the patterns of rock density by
measuring gravity and mapping patterns of magnetic differences. Each rock and
group of rocks differ in their density and magnetic attraction. Analysis of these
patterns can disclose large regional trends and sharp boundaries between rock
types; earthquake-related faults can sometimes be recognized, and the general
configuration of deep structures deciphered. Deep-sounding techniques are
essential to the study of deep earthquake faults and to understanding how deep
forces cause the crust to bend.
Bob Jachens and Andy Griscom have focused on gravity techniques, and
have found places where the upper mantle (the material below the crust) comes
to the surface, or close to it. Izzy Zietz also comes to mind for being loudly
vocal throughout his career, cajoling, insisting, and selling the idea that a
magnetic map of the U.S. must be financed and completed, regardless of what
happened to other programs. Thanks to him, as well as many others, a magnetic
map of the U.S. does exist, and major invisible structures can be deciphered.
Seismic methods also are invaluable in exploring the deep crustal and
upper mantle parts of the earth. Many seismic refraction and reflection teams
have worked since before the earthquake program began. I have already told of
how in the late 1950s the group headed by Lou Pakiser, and headquartered in
Denver, worked to help solve crustal problems for the underground nuclear-testing and detection programs. Helping national defense needs in the battle with
the Soviet Union was no small accomplishment.
The Pakiser group moved to Menlo Park in the mid-1960s and formed the
core of the geophysical group studying natural earthquakes. Lou, himself, and
many members of the Crustal Studies team were leaders in the international
seismology community, and played a key role in assuring that the USGS became
an indispensable agency in the National Earthquake Hazards Reduction Program.
More recently, Walter Mooney and Gary Fuis have led the way in the
study of the deep crust. Using seismic reflection and refraction techniques, they
have explored deep "cross sections" along many long lines or transects across
Alaska, the margins of the United States bordering the Pacific, the San Andreas
fault, and the Sierra Nevada, as well as many other major faults, mountain ranges,
and basins within the continent. Without these geophysical views of the deep
structures within the earth, earthquake prediction and many other needs of the
nation could never be met.
In recent years, however, the program has shrunk because of an inflation-driven decrease in usable funds and available personnel. The RIF ("reduction in
force") imposed on the USGS in 1995 has severely limited most programs. The
RIF and the reorganization that accompanied it also prompted a decline in staff
morale. As Jerry Eaton observed to me recently, "Without the USGS's strong
tradition for innovation and encouragement for individual scientists to follow
their own instincts, an attitude which has withered recently, I would not have
come to the USGS to work."
Scott: For many years the office to which you were attached has been called the Office
of Earthquakes, Volcanoes and Engineering. Would you say a little about the
"volcanoes and engineering" portions of the program?
Wallace: Yes, but I will only say a word, because at this stage I cannot really do them
justice. For further information, I recommend to readers the bimonthly USGS
publication Earthquakes and Volcanoes, which includes both technical and non-technical articles.
The volcano program has many notable accomplishments, one of which I
will mention here--the successful forecasts of Mount Pinatubo's 1991 eruption in
the Philippines. Those forecasts came about after investigations by a group of
USGS volcanologists and seismologists headed by Dave Harlow. They had an
office close to the mountain, and stuck it out there even into the major phases of
the eruption when volcanic ash and blocks were falling all around them.
On the basis of their work and advice, the U.S. Air Force was able to
conduct a timely evacuation of Clark Air Base, saving many lives of military
personnel, as well as planes and equipment worth hundreds of millions of
dollars. In addition probably thousands of the lives of the local population were
saved. It was a remarkable accomplishment for science, and for USGS.
Wallace: The USGS has followed, adapted to, and made use of the explosion of the
computer era. Early on scientists in the earthquake program took to computers
to solve many geophysical problems. Computers became essential tools of
research and investigation. I think of Willie Lee and Pete Ward as leading the
USGS earthquake program's venture into the computer world.
Scott: You use a computer quite a bit yourself, don't you?
Wallace: I would be totally lost without a computer, although I have a lot still to learn. Learning about and coping with computers seems to become a full-time job. And I find that my joy in pondering and inventing new concepts about earth-science questions has suffered.
Scott: Yes, in addition to all you can do with them, there are some real down-sides to
working with personal computers. You have to be careful how you spend time.
Wallace: We all use them. Every scientist in the earthquake program either has a personal
computer on his desk, or can tap into the larger computers maintained by the
USGS's Information Systems Division. During the 1980s, funding to tool up and
join the computer age added problems to an already severely stretched budget.
The earthquake program has suffered the same stresses most computer
users face. The rate of technological change is phenomenal. What basic
computer techniques deserve large expenditures as the industry grows and is
transformed? Costly individual units seem to become obsolete even while they
are being warmed up.
Scott: I think most organizations and businesses have faced the dilemma of keeping up with the fast-changing field, but avoiding unwise expenditures.