I realize that the title of this new oral history, “Bruce N. Ames: The Marriage of Biochemistry and Genetics at Caltech, the NIH, UC Berkeley, and CHORI, 1954–2018,” can give the impression of a sequence of institutional histories framed by the life of one researcher. It’s really more about a certain type of scientific inquiry that manifests itself in a career both varied and foundational to the work done at each of these places. Dr. Ames studied with H.K. Mitchell at the California Institute of Technology at the beginning of the revolution in molecular biology: the coming together of genetics, the study of inheritance and development, and biochemistry, the study of the chemical processes that underlie cellular life. A key part of this revolution, of course, was the discovery of the helical structure of DNA by James Watson and Francis Crick at Cambridge University, building on the X-ray crystallography of Rosalind Franklin at King’s College London. But to boil it all down, it became possible to see biochemical processes not only as reactions to changing environments but also as the expressions of a genetic code that could be “switched on” or repressed experimentally. At Caltech, Ames displayed the kind of curiosity that endeared him to scientists like Max Delbrück, whose “phage group” was beginning to look at these biochemical mechanisms of genetics.
There were just a handful of sites in the early 1950s where this new work was being undertaken. Although Ames did his graduate training in biochemistry, he was always “hopping the fence” to see what was going on in other disciplines, especially genetics. He joined the National Institutes of Health as a fellow in 1953 and rose to become Chief of the Microbial Genetics section in 1962. During his time at the NIH, Ames spent one fellowship year at Crick’s laboratory at Cambridge and another at François Jacob and Jacques Monod’s laboratory at the Pasteur Institute. Historians of science have recorded their suspicions that some of these famous scientists learned of Ames’ idea that the biosynthesis of histidine was achieved through the activation of several coordinated genes, and that this idea informed Jacob and Monod’s theory of the operon (a cluster of genes expressed from a common promoter) without attribution to Ames. Asked about it decades later, Ames seemed surprised that historians of science had documented this observation. But the research programs in which Ames participated laid the foundations of the molecular understanding of life: in 1961, Sydney Brenner (of Crick’s laboratory) and researchers in Monod’s laboratory published the discovery of messenger RNA, or mRNA, the transcript of DNA that directs protein synthesis, and the foundation of the new vaccines developed to protect against COVID-19.
What becomes apparent reading Bruce Ames’ oral history is that he himself functioned as a kind of synthesizer. His restless mind would begin a research track and produce results, only to be drawn by an active curiosity toward research in other areas of biology, or even in the social sciences or political philosophy. In the labs he ran at the NIH, UC Berkeley, and the Children’s Hospital of Oakland Research Institute, Ames generated interest among colleagues, post-docs, technicians, and graduate students, and even among undergraduates and amateur activists, in undertaking research in a new direction. Once a program was off the ground, Ames was often already thinking about new areas of research.
This was not always an easy path, as institutional rigidity and disciplinary boundaries sometimes made for cool receptions when Ames attempted to enter a new field. In the mid-1960s, just before he moved from the NIH to take up a professorship in biochemistry at UC Berkeley, Ames became interested in the chemical preservatives in potato chips. This was the beginning of a long research trajectory in genetic toxicology, the study of the impact of toxins on DNA, at a time when the relationship between DNA damage and diseases such as cancer was poorly understood. Out of this research, Ames developed a simple, inexpensive bacterial test for the mutagenicity of chemical substances, what became known as the Ames Test, which dramatically cut the time and money it took to screen chemicals for mutagenicity, and thus potential carcinogenicity, compared with the standard method of animal trials. In the wake of this success, Ames was also frustrated by what he saw as institutional inertia and the misunderstandings of critics of science and industry. To advocate for a more balanced view of modernity, Ames applied his test to the plants humans eat every day, and found mutagenic chemicals produced by the plants themselves in far higher doses than the trace amounts of pesticide residues that accompany them.
In 2000, Ames retired from UC Berkeley and became a senior research scientist at the Children’s Hospital of Oakland Research Institute, where he worked until his retirement in 2018. During this time, Ames followed the logic of his inquiries into plant toxicity to examine nutrition, specifically the micronutrients that protect our DNA from mutagenic chemicals and processes, embarking on a third career (or fourth or fifth, depending on what we count). With over 550 publications, he is among the hundred most highly cited scientists across all disciplines. But perhaps more interesting than the story of a peripatetic and pioneering mind is Ames’ frequently proclaimed reliance on the talents of other researchers and technicians, whose widely varying backgrounds and skills, including those of his wife and fellow UC Berkeley scientist Giovanna Ferro-Luzzi Ames, complemented his own and made the research possible. From oral histories of top scientists, we can easily conclude that science is about leadership, but the skill of leadership is revealed over and over to be the identification, nurturing, and coordination of diverse talents. That will surely be another of Dr. Ames’ lasting contributions to science.
This interview with University of Chicago economist George S. Tolley is the latest in our series of interviews with Chicago economists as part of the Economist Life Stories project. However, along with this twenty-hour interview, we are also releasing online for the first time the oral history with his father, Howard R. Tolley, first head of UC Berkeley’s Giannini Foundation, and chief of the Bureau of Agricultural Economics (USDA) during the New Deal and World War II. This interview, conducted by Dean Albertson in the early 1950s, comes to us from the archives of the Columbia Oral History Project, which graciously granted permission for us to publish these interviews together.
These interviews will be of enormous importance to historians interested in the social and political contexts of the social sciences. Economists at the University of Chicago have been deeply involved in policy advocacy and policymaking since the beginning of World War II. Growing up in Washington during the Great Depression, with a father who was responsible for analyzing and proposing solutions to the problems of Depression-era farming, George S. Tolley felt that economics was the calling of his generation: to figure out how to prevent such a calamity from ever happening again.
George completed his PhD at the University of Chicago with Theodore Schultz and D. Gale Johnson in the 1950s, and returned as faculty in the 1960s. Schultz was the impresario of the department during this period, bringing his ethic of service to the nation, along with many important contacts. G.S. Tolley was part of Schultz’s agriculture group at Chicago, in which many prominent economists researched problems of agricultural modernization, whether in the rural US or around the world. True to this spirit of service, Tolley became Director of the Economic Development Division of the Economic Research Service of the US Department of Agriculture in 1964–65, and in 1974–75 he served as Deputy Assistant Secretary in the Office of Tax Analysis of the US Department of the Treasury.
As an outgrowth of his research on resource use and farm labor migration, G.S. Tolley was in on the ground floor of two new research areas, urban economics and environmental economics, neither of which could be more topical today. He has also consulted widely for federal, state, and municipal agencies on urban and environmental problems from the 1960s to the present day, both as an academic and as CEO of his firm RCF Consulting, Inc. Just as agricultural economics became central to policymaking in developing countries, so too did urban economics as megacities mushroomed across the globe, echoing the influence of Chicago’s agriculture group in this new domain. In the late 1980s and 90s, George also produced a seminal work on health economics, which moved the field toward a broader view of the economics of wellness.
Social sciences emerged to identify, define, and address the social problems and challenges of the age of which they were a part. If we think of the ideal of science as a belief in the possibility of making knowledge that stands independently of the biases, instrumentation, and idiosyncrasies of observation and experiment, we can understand what the social scientist is up against. Notwithstanding the commitments of many social scientists to such an ideal, it’s impossible for them to escape the social context in which they operate. Both of these oral histories chronicle policy controversies and challenges, and the emergence and evolution of sub-disciplines to tackle particular social problems (low farm income, labor migration, housing inequality and urban sprawl, management of natural resources, or the management of an insurance-based health care system). And both economists focused on the economics of human geography: the price of proximity to markets, opportunities, amenities, and resources. But the larger pattern in both life histories is this: the higher the political stakes of an area of research, the greater the social scientist’s commitment to an ideal of value-free science, often out of sheer necessity. George S. Tolley’s basic approach throughout his career was humility before the complexity of economic and social phenomena. But that approach becomes a policy orientation in itself: careful scrutiny of blanket prescriptions and proscriptions, and attention to the unintended consequences of well-intentioned plans, an orientation he passed on to his many accomplished graduate students.
When I asked John Prausnitz about his interest in science growing up, he said, “I really recognized that chemistry is life. Chemistry is how we live, and the body, what we eat, and what we inhale. Chemistry is it.” This is one key to understanding John Prausnitz’s approach to science and to life: he doesn’t separate them. His intellectual pursuits are not narrow. Engineering can still be understood, though less and less so, as “mere” applied science, a matter of fitting the fundamental principles long ago worked out by pioneering scientists to industrial or commercial processes. If this supposition was ever true, it began to come apart around the time Dr. Prausnitz entered the field of chemical engineering in the 1950s. And although he was briefly presented with the possibility of a career in the private sector, he quickly chose instead to devote his entire career and the rest of his life to UC Berkeley.
The mind of Dr. Prausnitz works on at least two tracks. First, he considers challenges beyond the confines of the academy; second, he ranges widely over the scientific and humanities literatures for tools to help him interpret and solve these problems. He was a key mover in the foundation of the field of molecular thermodynamics: using the principles of molecular behavior worked out in physical chemistry to predict the properties and behavior of mixtures of substances in various states. The scope of this revolution in engineering is hard to grasp until one considers that many large-scale chemical processing plants had been designed, constructed, and operated without careful consideration of chemical behavior at the molecular level. Some of his models, which simplified the interactions among types of molecules, were adopted by entire industries and dramatically improved their efficiency and efficacy over the decades.
By immersing himself in the literatures of different scientific disciplines, Prausnitz attacked problems with a much broader perspective than if he had stayed in his field, respecting the boundaries of disciplines. Over the decades, Dr. Prausnitz has also been a witness to the expansion of the field of chemical engineering, from petroleum production processes to electro-chemical engineering to bioengineering. Prausnitz was not merely a champion on the sidelines of these new subfields; he immersed himself, quite late in his career, in molecular biology in order to collaborate on contributions to bioengineering, drawing especially on work with the Department of Energy in its sponsorship of biofuels research. Again, Prausnitz brought to bear his expertise in molecular thermodynamics to help place the research on a stronger footing.
Always, however, Dr. Prausnitz was thinking about the larger context and meaning of the work, and as an academic, this meaning had a lot to do with teaching and students. Like all great mentors, he took an active role in the progress and wellbeing of students and collaborators. As an enthusiast of the history of science, he understands that science is above all a human endeavor and a social process.
– Paul Burnett
These times pose great challenges for us as individuals and as a nation. We are being called upon to look beyond our own narrow interests and to make changes in our behavior to keep ourselves and others safe. In reflecting on my interviews over the past year, most of which are not yet publicly available, I see people who have identified problems and engaged with them directly. I see people having hard conversations, which includes taking some degree of responsibility, either personally or institutionally, for something that has gone wrong, or that has been going wrong for quite some time. I see people who act in accordance with their values.
In the San Francisco Opera project, I see Dramaturg Emeritus Kip Cranna and former General Director David Gockley having difficult conversations about budgets and staffing during periods of crisis, which, in the arts, is always a relative term. I see former UC Berkeley Chancellor Robert Birgeneau spending a lifetime advocating for the excluded and disadvantaged, and taking criticism after making difficult administrative decisions. I see Susan Graham—one of the first professors of computer science at UC Berkeley—participating in the President’s Council of Advisors on Science and Technology that was established during the Obama Administration, which recently warned the federal government of the urgent need to replenish the national stockpile of personal protective equipment that had nearly been depleted after the H1N1 pandemic. And although I did not conduct the interview with nurse administrator Cliff Morrison, I felt close to his story, as it features prominently in the podcast I worked on about the early years of the AIDS epidemic. After spending a career caring for people living with AIDS, Morrison is currently participating in a study of the long-term effects of COVID-19, having contracted the disease while performing similar acts of service in this latest pandemic. In Cliff’s story, he took the step—audacious for the early 1980s—of asking patients what they needed and providing it for them, overriding an established hierarchy in the hospital by doing so. Although he was not the first to suggest patient-centered care, his act of courage was an important catalyst for the development of the “San Francisco Model” of nursing care that has since become a standard around the world.
But one of the interviews that really stays with me is with Bob Kendrick, who had a 60-year career in the mining industry. He tells a story of a mine accident that happened while he was the superintendent. What he relays in the story—and the fact of his telling it—is an example of taking responsibility that I take to heart.
And finally, there’s the oral history of George Leitmann, an engineering science professor at UC Berkeley who returned to Europe and risked his life to fight the Nazis and to make the world a better place.
In recent months, we have all been reminded, again, of the call to respect one another and to act to reduce harm to others, whether this involves simple acts of observing public health recommendations or speaking out and acting against organized discrimination, implicit bias in our own work, and systemic problems with police brutality against African Americans. Many of the oral histories listed below are examples of people who have spent their lives serving some idea of the greater good. I am grateful to all of my narrators this past year for reinforcing the importance of stepping up and taking responsibility for the world we live in, and the world we want to live in.
This year, we celebrate the completion or near-completion of the following interviews:
Bob Kendrick – Global Mining and Materials Research
George Leitmann – University History
John Prausnitz – University History
Bruce Ames – University History
Robert Birgeneau – University History
Susan Graham – University History
David Gockley – San Francisco Opera
Kip Cranna – San Francisco Opera
George Tolley – Economist Life Stories
A large number of recent news items have reflected on our current crisis by looking to the past for comfort, commiseration, and even some answers. My own writing on this is no exception. However, we need to be very careful about how we use history to inform our current context.
“Were masks effective in the 1918 flu?”
I was recently asked this question for an article that just appeared in the Smithsonian Magazine. It’s a fascinating exploration of the politics of masks in California during the 1918 flu, and the fact that Mayor Davie of Oakland was jailed for not wearing a mask in Sacramento. However, my statement at the end of the article, applied historically, is not correct by itself. I’m quoted as saying the gauze masks of 1918, “may not have been much use to the user but did offer protection to those around them.” I had in mind the ultimate public health lessons learned from the 1918 flu way down the line, in a study concluded a little more than ten years ago.
But back in 1918, public health leaders who studied the problem thought that the mask laws and mask use by the public were minimally effective.
This is from a study published in 1919 by the California State Department of Health. The above graph showed very little difference in death rates between Stockton, which mandated the wearing of masks in public, and Boston, which did not. So, early on, authorities were skeptical of the effectiveness of masks, but they also felt that masks were not used properly.
Part of the disappointment was that medical authorities had advised using medical gauze, which had a tighter weave than what most people understood as “gauze.” Then as now, not everyone had access to the recommended personal protective equipment. People were using cheesecloth for masks, with predictable outcomes. The problem was the user. A more pessimistic appraisal of masks came in a study published in 1921 by physician William T. Vaughan:
“One difficulty in the use of the face mask is the failure of cooperation on the part of the public. When, in pneumonia and influenza wards, it has been nearly impossible to force the orderlies or even some of the physicians and nurses to wear their masks as prescribed, it is difficult to see how a general measure of this nature could be enforced in the community at large.”
William T. Vaughan, Influenza: An Epidemiologic Study, (Baltimore, MD: American Journal of Hygiene Monographic Series, No.1, 1921) 241.
Mask skepticism was officially sanctioned by the Surgeon General of the US Navy in a 1919 report:
“No evidence was presented which would justify compelling persons at large to wear masks during an epidemic. The mask is designed only to afford protection against a direct spray from the mouth of the carrier of pathogenic microorganisms … Masks of improper design, made of wide-mesh gauze, which rest against the mouth and nose, become wet with saliva, soiled with the fingers, and are changed infrequently, may lead to infection rather than prevent it, especially when worn by persons who have not even a rudimentary knowledge of the modes of transmission of the causative agents of communicable diseases.”
“Epidemiological and Statistical Data, US Navy, 1918,” Reprinted from the Annual Report of the Surgeon General, US Navy, (Washington, DC: Government Printing Office, 1919) 434.
The Surgeon General of the US Navy acknowledged that mask wearing by hospital staff was good practice, but noted that “the morbidity rate, nevertheless, was very high among those attending the sick”; masks may only have prevented infection from a direct, close hit from a cough or sneeze of a patient. The protocols followed in the contagious annex of the US Naval Hospital in Annapolis, MD, were sufficient to prevent cross-contamination of “cerebro-spinal fever” (meningitis), diphtheria, measles, mumps, scarlet fever, and German measles. Not so with influenza. In fact, the infection rate of staff was as high in the high-protocol wards as in the improvised hospitals. In one improvised hospital at the Navy Training Station in Great Lakes, IL, the infection rate was higher among the corpsmen and volunteers who wore masks than among those who did not!
But what did all of this mean? Again, a discussion of a specific piece of technology by itself is not enough. This was not simply a question of “mask or no mask,” but of design, construction, supply, and use. The wearer needed to use a well-designed mask properly, and change masks frequently. Therefore, most of the expert complaints about masks around the Spanish Flu pandemic in the US seemed to be about the users and reliable access to steady supplies of properly constructed masks, not the concept of wearing a mask.
Indeed, that’s what the research team led by Howard Markel found when the Pentagon asked them to study the Spanish Flu pandemic. In 2007, they published their report on non-pharmaceutical interventions during epidemics and found that there was a “layered” effect of protection by using multiple techniques together: school closure, bans on public gathering, isolation and quarantine of the infected, limited closure of businesses, transportation restrictions, public risk communications, hygiene education, and wearing of masks.
So the key historical question here is not simply whether masks were useless. The broader, more troubling historical pattern that Erika Mailman revealed in her article is clear: the problem of public trust in public health. Some Americans, then as now, do not like being told what to do.
They especially do not like being instructed by “experts.” Americans arguably had more respect for expert authority during the flu pandemic than they do now, but even then, some would wear masks in public to comply with the law, then remove them when they went indoors, into close quarters with others and poor air circulation, when they needed protection the most. Mayor Davie’s casual rebellion aside, citizens might grudgingly comply with the letter of the law, but not its scientific spirit. I’m reminded of a recent TikTok video of a woman in Kentucky who entered a store wearing a mask with a long slit around the mouth because it “makes it a lot easier to breathe.” Another problem, then as now, was that the right equipment was unavailable in a time of crisis. So people made masks with what they had on hand. But a cheesecloth mask was probably worse than no mask at all, especially if the wearer touched it frequently and changed it rarely.
A public health technology such as a mask is not just a simple, inanimate object. It’s the care with which it is designed and constructed; it’s the infrastructure that can assure a steady, sanitary supply; it’s the use of one technology and practice in conjunction with others; and it is especially the informed users who take responsibility for their own health and that of their fellow Americans. Going back to the graph near the beginning of this blog post, we see that the curve of the death rate was bent by this layering of multiple public health interventions. Masks were not used widely and well enough to make much of a difference, but public health authorities tended to believe in their effectiveness for front-line workers exposed to the worst of the flu pandemic. If we invested more in public health research, primary health care, and public health education, we could further improve the quality and quantity of these layered non-pharmaceutical interventions, which worked together in 1918–19 and which are working now. We can do better at building communication and trust so that all of these measures can work together. But for them to work together, we have to work together.
Many comparisons have recently been made between COVID-19 and the great Spanish Flu pandemic of 1918-19. There has also been a lot of great research by historians into that pandemic, specifically to discover any lessons that might apply to future pandemics. Historians of medicine Howard Markel and Alexandra Stern at the University of Michigan conducted a large study of the Spanish Flu in the wake of the SARS epidemic in the early 2000s. One of the most important lessons they learned was the difference that social distancing could make. Remember the graph that was in the news early on in the COVID-19 pandemic which showed the differences in case-fatality rates between Philadelphia and St. Louis? This is where the evidence for the concept of “flattening the curve” comes from. The results were published in the Journal of the American Medical Association. Researchers from the same effort then compiled all kinds of information in an online encyclopedia of the Spanish Flu.
We at the Oral History Center at UC Berkeley have our own stories that touch on the Spanish Flu. Using the “Advanced Search” fields in our collection’s search engine, you can use a number of terms to bring up these oral histories. I found eight for “Spanish Flu,” after accounting for duplications. But it’s important to try different search terms: when I searched “flu” and “1918,” I got 591 hits. And when I pull up an oral history in PDF form by clicking “view transcript,” I hit Ctrl-F (or Command-F on a Mac) and type a search term into the field. When I hit Enter, I’m taken to each place in the transcript where that word is mentioned, in sequence. A lot of the mentions of flu come from our Rosie the Riveter World War II Homefront Project and from a series on Russian immigrants to the US conducted by UC Berkeley Professor Richard A. Pierce in the 1960s and 70s.
Usually there is just a brief mention or anecdote about that dreadful year 1918, as these are life histories about much more than that time of crisis. But what’s striking about these narratives is just how much was going on at the time. First of all, there was a world war that had claimed millions of lives. Social order had disintegrated across much of Eurasia; harvests went untended; revolution was overturning society in Russia; Russian Jews were being killed and displaced in pogroms; and millions of young men were returning from the Great War to homes thousands of miles away, in many cases bringing new and unfamiliar germs back to their families and communities. In addition, organized national public health institutions were underdeveloped in many countries, including the United States. The newly named US Public Health Service was only six years old when the flu struck, and the germ theory of disease had only begun to shape public health policy over the previous few decades. Then, as now, states and municipalities in the US had widely varying responses to the epidemic. Furthermore, no one knew exactly what viruses were at the time. Though much larger bacteria and parasites could be seen under a basic microscope, viruses were defined only as undetectable agents that passed through the finest filters then available. So, in addition to a poor understanding of the nature of viruses, part of what made the Spanish Flu so devastating was that human bodies in 1918 were in general more vulnerable to disease because of all the other factors that tended to cluster around epidemics: war, famine, persecution, and mass refugee migration.
Here Cal professor of business Jacob Marschak reminisces with his sister Frances and historian Richard Pierce about fleeing the Bolsheviks during the Russian Civil War, on a boat to Sevastopol:
Jacob Marschak: That was already after the Armistice, and there were many German prisoners returning. The Spanish flu was then raging, and was quite devastating. Many died of it. On the boat I got a terrible attack of it, and my sister nursed me. One man died of it and we buried him at sea.
Frances Sobotka: On the steamer he was severely ill with Spanish flu; I didn’t get it but he did, and it was very hard on him, because it was already about October, very cold, and he had the highest fever that I ever knew, and we had only places to be on the deck. Then somebody allowed him to go and stay in the machine compartment, which was too hot, and he always had a bottle of cognac with him. He was no more a drunkard than you or I, but somebody told him that it was the best thing against Spanish flu. By the time we reached Sevastopol he could already walk, but he was still very weak, and held my shoulder as if I was his stick.
Jacob Marschak, “Recollections of Kiev and the Northern Caucasus, 1917-18,” conducted by Richard A. Pierce in 1968 (Oral History Center, The Bancroft Library, University of California, Berkeley, 1971) 75.
Fate, chance, and one’s family, social, and financial resources helped determine who lived and died. Mary Prout, one of the narrators for our Rosie the Riveter WWII Home Front Project, describes her aunt’s good fortune when the flu came for her. Here she is from an interview in 2002 with Ben Bicais:
My mother was a housewife, but she became a nurse before my mom and dad got married. An excellent nurse, loved it. But her father didn’t want her to become a nurse because he felt that it was too hard on girls. They had to do all the custodial work and everything. So she said, “Okay, Father, I’ll graduate from high school, and then in three years I’m going to go to become a nurse.” And he said, “That’s fair enough.” So she had a young sister, and in three years, they both went to Mary’s Help [Hospital], both became RNs. Then after— well, it was after the war, I guess, my aunt was just a little gal, and she was getting that virulent flu that they had then, that terrible flu. It was just awful. So my mother knew that she wouldn’t live through it. So Mom tried to get an ambulance, but she couldn’t find one, so she got a hearse, and she went down to the veteran’s hospital in Palo Alto, picked up my aunt and saved her life, really. … Oh, yes, that was terrible. People were dying by the inches.
Mary Hall Prout, “Rosie the Riveter World War II American Home Front Oral History Project” conducted by Ben Bicais in 2002, (Oral History Center, The Bancroft Library, University of California, Berkeley, 2002) 27.
Things were bad enough here in the US, but in Russia, years of war and the Bolshevik Revolution had depleted supplies of food, and people had to barter away the last of their possessions for scraps. Russian émigré Valentina Vernon, whom historian Richard Pierce interviewed in 1980, described her diet in 1918:
Vernon: We ground that horrible stuff, and then used the skins from the potatoes; you could get a few potatoes, they were a sort of luxury. The skins were put in the oven and dried out and made into a powder, and the cakes made out of the dried vegetables were rolled in the black stuff and fried in coco-butter. Awful! There was no butter, but you still could find some coconut. And then the dried fish (seledki), that was the main supply of food. So naturally we lost some weight. And this was our existence until we left.
Pierce: When did you leave?
Vernon: In September, 1918, just 18 days before the birth of my son. I had the Spanish flu, but I didn’t die, and my son didn’t either. Like the doctor here, who says Russian women are awfully tough! But my sister-in-law refused to leave, and they shot her husband. (Vernon, 1980, 15)
Another émigré, Boris Shebeko, describes his experiences during the First World War and the Russian Civil War in his oral history. Since I couldn’t search this PDF – which was just a scan of an old photostat copy – I started scrolling through the transcript for references around the time period of the end of World War One. I quickly became immersed in the stories. He seemed to careen from one life-threatening experience to another: almost being overtaken by singing Cossacks, who rode their horses directly into his machine-gun fire; stepping over giant cracks on a march across the frozen Lake Baikal; or getting into a knife-fight with French sailors in Constantinople. He even suspected his colleague of being a serial killer! The Spanish Flu was just one of his problems in those days.
Boris Shebeko, “Boris Shebeko: Russian Civil War 1918–1922 and Emigration,” conducted by Richard Pierce in 1960 (Oral History Center, The Bancroft Library, University of California, Berkeley, 1961).
After reading through these fascinating stories, I was left wondering what influenza meant in those chaotic contexts of war, migration, and near-starvation. In 1918, the global circulation of people was higher than at any previous time in history. And by some measures, the world was more interconnected in 1918 than it was even as late as the 1970s. In 2020, we have our own special mix of challenges. We are clearly much more interconnected on a global scale: in 2018, there were 4.3 billion passengers on commercial flights. Add this incredible daily migration across the globe to our growing contact with ecosystems that had until recently been isolated from the global community, and the result is increased exposure to germs to which we have no natural immunity. It is for these reasons that we call the new epidemics—SARS, Zika, Ebola, COVID-19—“emerging” diseases. Although we know so much more about viruses and how to treat them than we did in 1918, we have not learned as much as we could from previous pandemics. Scarier yet, the knowledge we do have has not guided policy, and in some ways we are still very much flying blind in this pandemic.
What do these oral histories tell us about epidemics? As horrifying and tragic as some of these stories are, they are the stories of survivors, those who were lucky, resourceful, and somehow positioned well enough to help themselves or to help others. As we learned in our First Response podcast, about the early AIDS epidemic in San Francisco, people tend to have two basic reactions in epidemics: fear, which can lead to scapegoating and selfishness; and love, sympathy, and a sense of duty to family, communities and even to complete strangers, which lead to mutual aid, careful planning, adaptation to changing circumstances, and rational action based on the available facts. Let’s allow these histories of resilient survivors to be our guides, now and into the future.
In 1988, the World Health Organization established World AIDS Day, one of eight major global health campaigns begun by the United Nations. It stands today as a reminder that, after forty years, tens of millions of people have died of HIV/AIDS, and approximately 37 million are currently living with the disease. It also should remind us of how terrifying this disease is, how it hurts most those who have the fewest resources with which to defend themselves, and how we are nowhere near an end to the spread of HIV and the suffering it causes.
Last year, the Oral History Center released a 7-episode podcast about the first efforts to understand, track, and treat a mysterious new illness that was killing gay men in San Francisco. Based on the three dozen interviews conducted by Center historian Sally Smith Hughes, First Response: AIDS and Community in San Francisco highlights both the larger context of the arrival of the epidemic and the on-the-ground drama of the physicians, nurses, epidemiologists, and laboratory researchers who were fighting for scarce resources while also fighting the disease.
For decades now, HIV/AIDS has been evolving into a different disease as it has spread around the world into new social, economic, technological, and cultural contexts. By 2010, new infections in the United States were three times as likely to be among African American and Latino populations as among white men, for example.
There is therefore a lot of work for us to do at the Oral History Center to map the past thirty years of the epidemic. Sadly, however, many of the themes of First Response are still relevant today: the stigmatization of the disease, a reluctance to pay for the public-health and primary-care interventions necessary to check the spread of HIV, and our general struggle to address the larger context in which survivors of HIV find themselves. For now, we will be working with partners to develop curriculum content for Grade 11 classrooms across the country around the First Response podcast and our collection of interviews. If you would like to help with this work developing assignments and content for educators, please contact Paul Burnett at firstname.lastname@example.org.
I was looking back over my original 2014 proposal for The Global Mining and Materials Research Project, to which this new Bob Kendrick oral history is the latest addition. There was this passage about the project’s purpose: “to document expertise that can speak to the political, environmental, legal, social, and economic changes that surrounded mining exploration, permitting, production, processing, and remediation over the last thirty years.” This oral history is certainly about all of that, but it is also a life history in the fullest sense. The more proximate impetus for the interviews, as it too often is, was the narrator’s failing health.
In the lead-up to the interview sessions, Bob was concerned that he had not prepared enough, that he might not remember enough, or tell the stories well enough. But he was sure that he wanted to say some very specific things about his life experiences. We worked together with his wife of 65 years, Marian, and their sons Peter and Mike, to assemble material to frame the set of interviews that would be held in a western suburb of Phoenix. Marian and Bob then welcomed me into their home in February of this year for just two days and an afternoon. They had a dining room table covered with boxes of papers, photo albums, and artifacts, such as core samples from a gold mine, the hide of an anaconda and a stuffed piranha. For better or worse, I positioned two of Gary Prazen’s bronze sculptures of miners in the background behind Bob in the interview frame. All around the house were Marian’s paintings depicting beautiful scenes from some of their long sojourns in South America, Central Asia, Siberia, Canada, and all over the United States.
Bob and Marian did work all over the world, but one of the clear messages we get from a life history is the sense of origin, of place. There is a distinct mountain sensibility about this oral history. The place and time where Bob grew up, Leadville, CO in the 1930s and 40s, was a somewhat forbidding place, on a kind of vertical frontier, a sometimes dangerous place. Bob grew up with danger, but I suspect he also grew up with some variety of mountain culture. People did things up there that were difficult, sometimes because they had to, but sometimes precisely because they were difficult. Bob relished physical challenges, which perhaps prepared him for other kinds of challenges later in life.
There are parts of this oral history that are raw. There were stories that were difficult for Bob to recount, but that he wanted to tell nevertheless. These days, there is little talk about “character,” so little that I don’t know that we can easily define it anymore. When I heard Bob talk, still with a lot of grief after so many years, about what it’s like to tell a family that their loved one has been killed in the mine of which he was in charge, I thought I might have caught a glimpse. It occurs to me that these interviews were perhaps one more difficult thing he wanted to do. But this oral history is, to paraphrase Bob, also full of good things.
Bob is survived by his wife Marian, and his children Mike, Peter, Melissa, Rob, and Gina.
The Oral History Center wishes to thank the Freeport Foundation and Stanley Dempsey for their support of the Center and for making this oral history possible.
A long while ago, my colleague Shanna Farrell told our group about one of her pet peeves: the overuse, and misuse, of the term “oral history” in the media. She is certainly right about the increased use of the term. A cursory search of the mass-media landscape turns up some fun stories in Forbes on the week Wayne Gretzky hosted Saturday Night Live, in Billboard on the time Kanye West rushed the stage at the Video Music Awards, and in vulture.com (New York Magazine) on a scene in Christopher Nolan’s The Dark Knight, in which an actor narrowly avoids having a pencil pierce his eyeball.
As I was looking at these pieces, it occurred to me that what journalists mean by “oral history” is simply a more extensive use of quotations and a lighter touch with the narrative. But it’s still, of course, journalism. These are “thick descriptions” of a moment in time, not the life histories oral historians usually do.
We in the profession of oral history, however, are at a bit of a messaging disadvantage. After all, mainstream journalism, even in the age of social media, is the ultimate mainline to the public. In fact, journalists still for the most part shape and define what we call “the public.” But do all of these examples constitute a misuse of the term “oral history”? The piece on The Dark Knight, for example, is nothing more than a series of quotations from people interviewed for the article. How different are these from the secondary outputs developed within the field of oral history, such as the Oral History Center’s own podcasts?
There is a real appetite for what oral historians do, and it’s growing. Part of me welcomes this misuse of so-called oral history, as long as we have the opportunity to correct misconceptions about our nuts-and-bolts work, i.e., the co-creation of more in-depth life histories, and to highlight the core fact of our privileging the voice and authority of the narrator over our own.
With our hectic, multitasking lives, punctuated by the forced downtime of gridlocked traffic and monotonous subway rides, we’re living in a strange second Golden Age of Radio. Archival oral history projects have a great shot at reaching these audiences of bored commuters, pensive gardeners, and late-night snugglers if we can broaden our notion of how to package and curate the wonderful materials we’ve helped to make with our narrators.
“We’re used to hearing about how game-changing technology makes whole new ways of living and working possible. But what makes the game-changing technologies possible? We’re going to talk about Berkeley’s contribution in this domain, a bit upstream from the technology we all know.”
This season of the Berkeley Remix we’re bringing to life stories about our home — UC Berkeley — from our collection of thousands of oral histories. Please join us for our fourth season, Let There Be Light: 150 Years at UC Berkeley, inspired by the University’s motto, Fiat Lux. Our episodes this season explore issues of identity — where we’ve been, who we are now, the powerful impact Berkeley’s identity as a public institution has had on student and academic life, and the intertwined history of campus and community.
The three-episode season explores how housing has been on the front lines of the battle for student welfare throughout the University’s history; how UC Berkeley created a culture of innovation that made game-changing technologies possible; and how political activism on campus was a motivator for the farm-to-table food scene in the city of Berkeley. All episodes include audio from interviews from the Oral History Center of The Bancroft Library.
Episode 2, “Berkeley Lightning: A Public University’s Role in the Rise of Silicon Valley,” is about the contributions of UC Berkeley Engineering to the rise of the semiconductor industry in what became known as Silicon Valley in the 1960s and 70s. In contrast to the influential entrepreneurial spirit of a private university like Stanford, Berkeley’s status as a public institution had a different impact on Silicon Valley. We focus on the development of the first widely used design program for prototyping microchips. Originally designed by and for students, the software spread like lightning in part because Berkeley, as a public institution, made it available free of charge. The world has not been the same since.
This episode uses audio from the Oral History Center of The Bancroft Library, including interviews with Paul R. Gray, Professor of Engineering Emeritus, Department of Electrical Engineering and Computer Science, and Dr. Laurence Nagel, CEO of Omega Enterprises, former senior manager at Bell Laboratories, with a PhD from UC Berkeley EECS (oral history forthcoming).
“Berkeley Lightning” was produced, written, narrated, and edited by Oral History Center historian Paul Burnett.
The following is a written version of The Berkeley Remix Podcast Season 4, Episode 2, “Berkeley Lightning: A Public University’s Role in the Rise of Silicon Valley.”
Narration: Silicon Valley. It’s a real place, the valley roughly encompassed by Santa Clara County at the southern end of San Francisco Bay. But it’s also a mythic place, with just the right combination of top universities, electronics firms, defense dollars, and a concentration of rare, plucky college-dropout geniuses who would go on to hatch world-changing technologies in suburban garages.
We want to take that apart a bit. The university that looms largest in nearly every story of the rise of Silicon Valley is near the heart of that actual valley, Stanford University. But about 30 miles north, on the eastern edge of the Bay, lies the University of California, Berkeley, a longstanding rival to Stanford, if football is your game.
Our story here focuses on this other university, a public state university that established the institutions and teams whose innovations would help make Silicon Valley’s culture of innovation possible. There are many different stories we could tell about Berkeley’s role in the rise of Silicon Valley, from specific technologies such as flash memory to digital-to-analog conversion, aka the hardware and software that make it possible for you to listen to me right now.
We’re used to hearing about how game-changing technology makes whole new ways of living and working possible. But what makes the game-changing technologies possible? We’re going to talk about Berkeley’s contribution in this domain, a bit upstream from the technology we all know.
The centerpiece of just about any discussion of Silicon Valley is the development of its namesake, the silicon microchip: a tiny wafer that packs an ever-growing number of the components of an electronic circuit into a microscopic space, in what came to be called an integrated circuit. Chief among these components is the transistor. Transistors do many things, but one of them is to act as a switch, which allows them to process digital information, zeros and ones, much more efficiently and cheaply than the vacuum-tube mainframe computers that used to fill entire basements of office buildings. Once the transistor was invented, the race was on to increase the density and number of transistors on a chip. There are a few reasons to make integrated circuits smaller and denser. For one thing, they work better and more efficiently. And you can fit them into small spaces, such as the tip of a missile.
But imagine how tricky they are to make. Here is Berkeley engineering professor Paul Gray explaining the dimensions of the microchip.
Gray: It was probably maybe fifty mils by a hundred mils. A mil is a thousandth of an inch. So that would be a tenth of an inch in one dimension and one-twentieth of an inch in the other dimension.
Narration: And here is Larry Nagel, who was a student at Berkeley’s microelectronics lab in the 1960s, on making chip prototypes.
Nagel: And being a little bit clumsy we had to make up for a couple of times when things got dropped and things got otherwise messed up. So I guess I was probably working maybe probably four weeks at that before I had a chip that actually worked.
Narration: Now this was not normal, routine work for an electrical engineering student in the 1960s. Just a handful of universities that had links to the electronics industry and to military research had founded specialized microelectronics labs by the early 1960s. Here’s Dr. Gray again:
Gray: Berkeley had started the country’s first laboratory in which you could fabricate an integrated circuit. And that was in the about ’63, ’64 timeframe…Don Pederson was the faculty member here who really spearheaded the establishment of that laboratory. In that era Stanford and MIT also were starting labs—I think Berkeley was the first and then Stanford and MIT, some a year or two or three later, also got on the same track. … Berkeley and Stanford and MIT continued for the next several decades as the main institutions with this fabrication capability. And we still have a big lab, fabrication facility over there in the CITRIS Building.
Narration: So, UC Berkeley was the first university in the country to have a microchip fabrication facility. But that’s just the beginning. You have to understand that making microchips, by hand, is hard, really hard, and time-consuming.
Nagel: Well, just to build the circuit alone would probably take a couple of days for a hundred transistors for a kit. Maybe for a twenty transistor circuit a day. So one or two days, something of that order of magnitude. But actually measuring and getting the thing to work right, debugging it, could take a good deal longer than that. That could take weeks.
Narration: But from the time the microelectronics lab was founded until the end of the 1960s, the number of transistors on the same space on a chip went way up.
Gray: We were building chips that had on the order of a hundred transistors on them. It’s very difficult to predict by building a physical breadboard or a prototype out of discrete components how a chip like that’s going to behave electrically. You really needed circuit simulation even at that point, a program that would simulate the electrical behavior of a circuit.
Narration: Now if a computer program could do the work of prototyping a circuit, you wouldn’t have to waste time building dud after dud. As a student at the University of Arizona, Paul Gray learned about the work at Berkeley from his mentor, David Holland. And when Gray went off to work down in the valley at Fairchild Semiconductor, right around the time that Gordon Moore and others split off to found Intel, he recognized the importance of the work that Berkeley researchers were doing.
Gray: I do remember having a lot of connections with universities in general, Berkeley and Stanford, on various topics in those years. Of course, we were in an industrial park created by Fred Terman, owned by Stanford. We were on their land. But I don’t remember having a sense of a Stanford dominance of the landscape in terms of university engagements. We had a lot of interaction with the Berkeley people because of the computer-aided design activity. Don and his group here at Berkeley were developing that kind of simulator. We needed that. So we got a connection going and we got one of the early versions of SPICE, I think it was called something else at that point, and used that.
Burnett: CANCER, I think.
Gray: Yes, correct.
Narration: SPICE? CANCER? These are strange, if cool, names for software, but all will be explained later. As a result of reaching out to Berkeley to get hold of this computer program called CANCER, Paul Gray was invited to Berkeley to teach for a year, and then joined the faculty, where he got the backstory on what this software was all about.
First of all, Berkeley engineering was structured for this interaction between computing and electronics. At the time, only Berkeley and MIT had electrical engineering and computer science in the same department. Second, there was a leading light of the Electrical Engineering Department named Donald Pederson, who made computer-aided design a priority.
Gray: Somewhere in the early sixties, Don had recognized this need for computer simulation. … back in those days you could still try out your design by building what they called a breadboard and you plugged discrete devices in and it mimicked the chip, how the chip was going to behave. That was pretty ineffective anyway. But once you got bigger than a hundred transistors or so it became completely impossible to do that. Don recognized very early, the way things were going, it was going to be essential. And it was one of those early interdisciplinary things. To build an effective simulator of electronic circuits you have to have someone that knows devices and models the behavior of the transistors electrically. You have to have somebody that understands computer numerical analysis and how you actually solve differential equations on a computer on a large scale. And you have to have circuit people who understand what’s needed.
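The interdisciplinary recipe Gray describes can be made concrete with a toy example. A simulator models each device electrically, turns the circuit into differential equations, and solves them numerically, one time step at a time. The sketch below, in Python for readability (early SPICE was written in Fortran), applies the backward-Euler method to a single resistor-capacitor stage charging from a voltage source. The circuit, component values, and function name are illustrative assumptions for this post, not SPICE's actual code.

```python
# A minimal sketch of what a transient circuit simulator does:
# discretize the circuit's differential equation and solve it
# step by step. Here, a resistor R charges a capacitor C from a
# voltage source Vs, using the backward-Euler method.

def simulate_rc(Vs=5.0, R=1e3, C=1e-6, dt=1e-5, steps=500):
    """Return the capacitor voltage at each time step.

    Circuit equation: C * dv/dt = (Vs - v) / R
    Backward Euler:   C * (v_new - v) / dt = (Vs - v_new) / R
    Solving for v_new gives the update rule below.
    """
    v = 0.0  # capacitor starts discharged
    trace = []
    for _ in range(steps):
        v = (v + dt * Vs / (R * C)) / (1.0 + dt / (R * C))
        trace.append(v)
    return trace

trace = simulate_rc()
# After five time constants (RC = 1 ms, t = 5 ms) the capacitor
# has charged nearly all the way to Vs.
print(round(trace[-1], 2))
```

A real simulator generalizes this idea to thousands of coupled nonlinear equations, one set per transistor model, which is exactly why Pederson needed device physicists, numerical analysts, and circuit designers on the same team.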
Narration: The other piece of the story is the fact that UC Berkeley is an institution of higher education, and students need to be taught in an efficient manner. So Don Pederson asked Ron Rohrer to develop a graduate course where the challenge was to have the students build design software. Here’s Dr. Nagel again:
Nagel: But I would say actually the major emphasis of the simulation programs that were developed at Berkeley were more as teaching tools, so that students could actually get a first-hand—again, Don was an intuitive guy. By running circuits on a computer you could get an intuitive feel for how the circuit would work, something that would take hours and hours and hours if you were to do it in the lab. Because in one night you could build five different variations of some particular circuit, simulate it, and have your results of which variation worked the best. That would be hours and maybe days of laboratory work. By doing it on a computer the entire class—it was no longer just a graduate exercise. Undergraduates could also enjoy this thing.
Narration: But the first graduate class to get this program built was intense, to say the least.
Nagel: And, of course, Ron [Rohrer] came in the first day and said, “Well, for those of you who think this is going to be a course on circuit synthesis, you’re in for a shock because this is going to be a course on circuit simulation and you guys are going to learn all about circuit simulation by writing a circuit simulator. The judge for how well you do will be Don Pederson. If he likes the program that you write, you’ll all get As. If he doesn’t like the program you write, you’ll all fail.” Ron was a very brash guy. So immediately half the class turned white as a sheet and left the room and were gone. [laughter]
Narration: Building on a number of earlier versions, Larry Nagel’s CANCER program [Computer Analysis of Non-Linear Circuits, Excluding Radiation] was the result of that class. Here’s Larry Nagel explaining the significance:
Nagel: “CANCER: Computer Analysis of Non-Linear Circuits…Excluding Radiation!” Because we were very proud of the fact that we’d developed this program with no money from the government at a time when the government was heavily funding research into radiation effects on circuits…
[background audio of student protests, chanting]
…because we were at the time very much worried about some kind of a nuclear event disabling our missiles. … So that’s how we got the “excluding radiation,” because we weren’t doing radiation. We were Berkeley at the time, right? This was Berkeley. This was not MIT. This was Berkeley. So that’s how CANCER got started.
Narration: So, you had institutional innovations, a microelectronics lab and a teaching approach that focused on computer-aided circuit design, resulting in this technical innovation. But a crucial innovation was not technical at all; it was legal, and social. Larry Nagel’s SPICE, or Simulation Program with Integrated Circuit Emphasis, was central to this part of the story, as was UC Berkeley’s status as a public university:
Nagel: I like to think that SPICE was actually the first open-source project way back before there was such a thing as open-source. But Don Pederson had a very strong belief that anything that was developed at a public institution should be in the public domain. So all of the Berkeley programs were available free of charge, or basically for whatever it cost to load the program onto a tape. So the fact that this program was for free had an enormous impact in a lot of ways. First of all, the program was made available to anybody so it diffused very quickly out to various different universities. And all the students that learned to use SPICE took it with them to industry. So it wasn’t at all long after the original release of SPICE in 1971 that basically every major integrated circuit manufacturer had their own version of SPICE. There was a TI SPICE at Texas Instruments. There was an ADI SPICE at Analogue Devices. After I graduated there was a program called ADVICE, which was developed at Bell Laboratories and used at Bell Laboratories. …
Don’s deal was that you can have the program for free but if you find a bug in it and fix it you have to give it back to us. You have to tell us what the bug is and how you fixed it. There were several generations of students that kept improving SPICE. SPICE improved because you had this entire base of industry feeding information back.
Nagel: I think for Don it was really a matter of principle. He didn’t do a cost-benefit analysis. He just said, “This is how it has to be. We’re a public institution. We have to make it publicly available.” But if you look in hindsight, the reason that the program became as widely accepted as it did was largely because it was freely available to everyone. Anybody could walk out of Cory Hall with a tape and they had their version of SPICE. That process went on for a long time. I think the last version of SPICE that was released was SPICE III and that was in 1980s. So we’re talking about Berkeley being pretty much a SPICE factory for fifteen, maybe even twenty years.
Narration: Here’s Paul Gray again:
Gray: Long story short, within ten or fifteen years of that point in time, every circuit design engineer in the world was using some flavor of a derivative of SPICE. It became an industry standard. The industry wouldn’t really exist without that kind of simulation capability. Once you get [the scale] to the thousands and tens of thousands and hundreds of thousands of transistors, it’s the only way you can do design. … So it was a huge innovation and had an incredible impact and also pioneered a great software dissemination model for universities. Many others emulated that public domain model of dissemination. … It has been a big part of the landscape here at Berkeley for many years.
Narration: And so, is the Silicon Valley phenomenon really just about Stanford, chip companies in the Valley, and amateur hobbyists?
If you asked a random citizen on the street anywhere in the Bay Area, they would probably say that Stanford was way more important than Berkeley. I think “way more important” is not correct. … judging by the number of companies started by faculty or former students, or by the numbers of alumni employed in the valley, one could argue that Berkeley and Stanford are comparable contributors. In terms of innovations that have mattered, there are many.
Narration: So, what does this mean? The development of computer-aided design greatly facilitated experimental research in the design of new microchips. But there is something even more fundamental going on here. First, the easy availability of quasi-open-source software encouraged commercial development of new chip designs without the prohibitive expense of proprietary licensing, copyright, and other legal devices. But Berkeley’s model still allowed companies to make a version of the software they could own.
UC Berkeley deliberately went down a different road, as a direct consequence of its hard-wiring as a public university. The second pivot point here is the separation of design from manufacturing. This software, and all its descendants, allowed companies to focus on the more profitable end of microchip design and to outsource the much more expensive part of the process, manufacturing, to other companies and, eventually, to other countries. Now there are consequences to this change, environmental, socioeconomic, and otherwise, that we can’t cover here. But suffice it to say that in 2015, there were just a handful of major semiconductor fabrication facilities in the United States, with exports totaling $40 billion. Compare that to the fabless semiconductor design companies, many of which are located in Silicon Valley: total exports of $166 billion.
This is just one example of the influence that UC Berkeley’s Department of Electrical Engineering and Computer Science has had on the size, character, and global leadership of Silicon Valley. It was a combination of social, legal, and technical innovation that struck the Valley with lightning force, and in turn accelerated change and growth in the semiconductor industry to this very day. For more about these stories, visit the Oral History Center website at ucblib.link/OHC and search across our entire collection or through the full interviews with Paul Gray and Laurence Nagel.