Reckoning with the Mortally Inevitable

Abstract: Every human enterprise — even the best, including science and scholarship — is marred by human weakness, by our inescapable biases, incapacities, limitations, preconceptions, and sometimes, yes, sins. It is a legacy of the Fall. With this in mind, we should approach even the greatest scientific, cultural, and academic achievements with both grateful appreciation and humility. J. B. Phillips’s rendition of Paul’s words at 1 Corinthians 13:12 captures the thought nicely: “At present we are men looking at puzzling reflections in a mirror. The time will come when we shall see reality whole and face to face! At present all I know is a little fraction of the truth, but the time will come when I shall know it as fully as God now knows me!”


It can be argued even now, in this age of social-media-facilitated skepticism, that science enjoys the greatest universal prestige of any cultural phenomenon in the modern world. And not without justice. Its achievements — from its development of vaccines and medicines that have saved and extended the lives of millions, through its creation of astonishing earthly technologies, to its ever-progressing exploration of space and its peering back to the very dawn of creation in the Big Bang — richly merit the respect they typically receive.

Yet science is an inescapably human endeavor, pursued and interpreted and employed by fallible mortals. Its history is instructive in many ways — not least as a stage upon which human weaknesses, errors, and biases are fully displayed. An article in a recent issue of Scientific American takes a brief but clear-eyed look at a small selection of embarrassing episodes in that venerable magazine’s own past.1 More on that shortly, though.

This issue of Scientific American is full of articles worthy of notice. With Moritz Stefaner and Jen Christiansen, for example, Lorraine Daston considers “The Language of Science: How the Words We Use Have Evolved Over the Past 175 Years.”2 Maryn McKenna’s “Return of the Germs: For More Than a Century Drugs and Vaccines Made Astounding Progress against Infectious Diseases. Now Our Best Defenses May Be Social Changes” leads off with a confident prediction made by the distinguished Australian virologist Sir Frank Macfarlane Burnet (d. 1985), a Nobel laureate, in his co-authored 1972 book Natural History of Infectious Disease. After surveying with distinct satisfaction the rise of antibiotics and the triumph of vaccines over smallpox, measles, mumps, rubella, and polio, Burnet opined that “The most likely forecast about the future of infectious disease is that it will be very dull.”3

We know better these days.

And, in his fascinating article “How Astronomers Revolutionized Our View of the Cosmos: The Universe Turns Out to Be Much Bigger and Weirder Than Anyone Thought,” the British cosmologist and astrophysicist Martin Rees, Baron Rees of Ludlow, formerly master of Trinity College Cambridge and president of the Royal Society and, since 1995, Astronomer Royal, seems to be making a valiant effort to repair previous neglect (or even suppression) of the major contributions made by female scientists to the topic he’s discussing.4

This is entirely appropriate for the pages of Scientific American, since its own history in this regard is far from blameless.

Schwartz and Schlenoff, both senior editors at the magazine, begin by discussing an article about women engineers by Karl Drews that Scientific American published in 1908. One might well have expected it to be something of a celebratory piece. After all, women were moving rapidly forward in the United States; several states had already granted them the vote. Final ratification of the Nineteenth Amendment, which prohibited denying any citizen the vote on account of sex, was only twelve years away.

Almost as visible and much more directly relevant was the role played by Emily Warren Roebling in the completion of the famous Brooklyn Bridge. For the decade during which her husband Washington Roebling was bedridden with a serious long-term illness, she effectively assumed his role as the project’s chief engineer, not only demonstrating an extensive understanding of such topics as stress analysis, the strength of materials, cable construction, and the calculation of catenary curves, but also taking over day-to-day supervision of the internationally watched project until its completion. When the bridge was finally opened in 1883, a carriage carrying Emily Roebling and President Chester A. Arthur was the first to cross over it.5 Speaking on the occasion, Congressman Abram Hewitt, a future mayor of New York City, described the Brooklyn Bridge as “an everlasting monument to the sacrificing devotion of a woman and of her capacity for that higher education from which she has been too long disbarred.”6 Still in use today, the Brooklyn Bridge bears a plaque dedicated to the memory of Emily Warren Roebling, her husband Washington Roebling, and her father-in-law John A. Roebling, who had created the initial designs for the structure but who had died of tetanus in 1869, as the result of an accident.7

Karl Drews, however, would have none of that.

Obstacles to the success or prospects of female engineers, he wrote, are “inherent in the nature of the case and are due to women’s comparative weakness, both bodily and mental.” And he elaborated, saying that “The work of the engineer is creative in the highest sense of the word. From his brain spring the marvels of modern industry,” in contrast to women, “whose notable performances have hitherto been confined to the reproductive arts.” The path to the workshop, he condescendingly continued, takes “blistered hands, not dilettante pottering and observation.” Drews declared that even “the most resolute and indefatigable of women” cannot overcome these difficulties. And, in support of the soundness of his reasoning, he appealed to female inferiority in other fields beyond engineering. There has been, he noted, “no great woman composer, painter, or sculptor.” Even “the best of woman novelists are surpassed by men.”

“After making these conclusions in the first few paragraphs,” say Schwartz and Schlenoff, “Drews does something more insidious: he invokes data to support his case.” It seems that Drews mailed a letter to several dozen engineering firms and technical societies seeking to “obtain some definite information on the subject.” And then he cherry-picked, manipulated, and spin-doctored the “data” he had received in order to support his apparently pre-ordained conclusion.

A few women, for example, were mentioned in the responses that came to him. But the only woman he regarded as worthy to be mentioned in the same breath with male engineers didn’t really count because, he said, she was too “masculine.” When he found that some women had identified themselves in the previous United States Census as boilermakers, he consulted an electrical engineering institute to ask whether these self-identifications could possibly be authentic. The institute’s response? Absolutely not! In its reply, it explained that it was “too chivalrous” to permit any such thing.8

It’s not only sexism that was scientifically promulgated in Scientific American. Scientific racism also found expression in its pages. “The trappings of science,” report Schwartz and Schlenoff, “have been misused in these pages to uphold systemic oppression. Under the cloak of empirical evidence, some writers entrenched discrimination by framing it as unimpeachable truth.”9

William Tecumseh Sherman, of course, was famous for his “March to the Sea” in the American Civil War that had raged from 1861 to 1865 and, overall, for his harsh “scorched earth” tactics of “total war.” He followed similar principles in the subsequent Indian Wars, in which he expressly declined to distinguish between men and women, children and adults, and in which millions of bison were deliberately slaughtered, nearly rendering the species extinct, as a means of bringing the Native Americans to their knees and forcing them onto reservations.

In an 1868 column, the editors of Scientific American commented on a report from General Sherman about how railroad construction was being hindered in the West by “Indian affairs.” (The famous “Golden Spike” that linked the transcontinental railroad would be driven at Promontory Summit, Utah Territory, on 10 May 1869.) The magazine felt that Sherman wasn’t being sufficiently aggressive. “The Indians must be summarily and thoroughly squelched,” remarked Scientific American. “They are the most treacherous, as well as the most inhuman, of all barbarous races.”10

“During the 19th century,” Schwartz and Schlenoff flatly declare, “Scientific American published articles that legitimized racism.”11 Here is another example, supplied yet again by senior editors of the magazine itself:

Already in 1871, Charles Darwin had made the claim, heard around the world, that all living humans had descended by a process of evolution from the same biological ancestors. And, of course, Judaism, Christianity, and Islam had long taught that all peoples of the world were the posterity of Adam and Eve.

Very soon, though, a doctrine called “Social Darwinism” arose, in which the idea of the “survival of the fittest” was often used to account for, to defend, and even to advocate the natural superiority of certain classes. It is linked especially with the philosopher and sociologist Herbert Spencer (d. 1903) and with the statistician Sir Francis Galton (d. 1911), who was Darwin’s half-cousin.

On 5 October 1895, Scientific American published the text of a speech by Daniel G. Brinton, the president of the prestigious American Association for the Advancement of Science. (A surgeon turned ethnologist, Brinton also presided over the American Philosophical Society, the nation’s oldest learned society, at one point.) In that speech, Brinton contended that “the black, the brown, and the red races differentiate anatomically so much from the white … [that] they never could rival its results by equal efforts.” From birth, he declared as a self-evident fact requiring no defense or supporting evidence, a baby’s race determines “his tastes and ambitions, his fears and hopes, his failure or success.”

The highest goal of anthropology, according to Brinton, should be to measure what he called the “peculiarities” of “races, nations, tribes,” so that people can be governed according to the nature and capacities of their “sub-species.” The differences between those sub-species, Brinton announced to the most elite scientific organization of his day, over which he presided, “supply the only sure foundation for legislation; not a priori notions of the rights of man.”

So much for the quaint notion of the “self-evident” truth “that all men are created equal, that they are endowed by their Creator with certain unalienable Rights,” as enshrined in the American Declaration of Independence; now it was time for rule by scientific “experts.”

It may not be wholly coincidental that the very next year, 1896, saw the landmark Plessy v. Ferguson decision by the Supreme Court of the United States. In that decision, the Court upheld the constitutionality of racial segregation under the doctrine of “separate but equal.” As Loren Miller, a justice of the Supreme Court of California, remarked in a 1966 book, Plessy v. Ferguson “smuggled Social Darwinism into the Constitution.”12

However, views that we today would consider deeply racist didn’t vanish from the pages of Scientific American with the end of the nineteenth century. The magazine continued for decades to report on ideas of eugenics — “improvement” of the human species through controlled breeding — which Galton had passionately advocated and which later became an obsession of the National Socialist movement in Germany and a principal element of government policy under Hitler’s Third Reich.13 Class prejudice and racial bias appeared in the magazine under the guise of dispassionate science, with the editors responding uncritically and, at times, far from neutrally. When articles opposing eugenics and its racist agenda appeared, they “were often labeled ‘the opposition.’”14

Although a Scientific American staff writer argued in 1932 that humans, including scientists, were too ignorant to be able to effectively institute eugenic policies, “articles promoting eugenics as scientific consensus continued to appear in the magazine.” In 1933, for instance, one article promoted the then-controversial practice of birth control as a means of preventing the reproduction of “defectives.” The article was accompanied by a photograph depicting people in what appears to be a bread line, with an image of guinea pigs in a cage alongside it. The following year, in 1934, the president of the so-called Human Betterment Foundation opined in Scientific American that the “trend toward race degeneracy is evident in statistics so well known that they need not here be rehearsed.” A quotation in the article features an assertion from the famous Viennese surgeon Adolf Lorenz — father, incidentally, of the ethologist Konrad Lorenz — that the eugenic sterilization of undesirable elements “eventually will come to all civilized countries as a means of getting rid of the scum of humanity.” In 1935 — only five or six years before the Nazis began their “Final Solution” to the “Jewish Problem” — Scientific American published an article with the distinctly ominous title “The Oddest Thing about the Jews.”15

A passage from the late Hugh Nibley seems apropos here. In an essay entitled “Fact and Fancy in the Interpretation of Ancient Records,” he wrote:

Science represents a high court from whose judgment there is no appeal, the idea (Freud expresses it in his The Future of an Illusion) … that all other judgments are outmoded traditions; [that] the judges are free from prejudice and bias, and above petty personal interests, if they let the facts speak for themselves; that they suspend all judgment until all the facts have been gathered; that they proceed cautiously and carefully, step by step, making no mistakes, no guesses, never accepting a proposition until it is proven; that to question such a judge is an affront to his dignity and to his high office; that the judges never guess but always know; that they make no pronouncements until they have proven and verified everything; that they begin their investigations by accumulating facts with completely open minds, neither selecting or eliminating as they go; that their procedures and conclusions are in no way colored by any previous experience. That they never trust anything to luck and rarely make mistakes; that their accumulated decisions of the past compose a solid and reliable body of tested and proven knowledge called science; that by following the instructions and example of the judges, our civilization can emancipate itself from the darkness of ignorance; that to accept the decision of the judges as definitive is the mark of an intellectual person; that the knowledge of the judges is so deep and specialized that it cannot be put into ordinary language or understood by the layman but [that] science is a necessary domain of highly specialized experts and so forth …

Well, every one of these propositions is completely false.16

However, the purpose of my drawing upon this article and these episodes from the history of Scientific American is not to denigrate science. As I said earlier, the sciences have earned justifiable respect for their enormous achievements to date. Instead, I’m simply trying to point out that human fingerprints are visible, and unavoidable, in every human enterprise — science among them. Science should not be summarily rejected. But neither should it be deified. And if human factors have influenced even so rarefied and seemingly pure a discipline as mathematical logic, as has been persuasively argued,17 how much more so will this be true in “softer,” less clear-cut fields such as history, archaeology, philosophy, theology, and the social sciences?

We can, I think, respect the powers-that-be at Scientific American for their frank acknowledgment of some grave mistakes, even moral errors, in the magazine’s past. On the other hand, no great courage is required to admit the “sins” of others, to acknowledge the missteps of predecessors. Doing so can even sometimes be a form of moral preening or virtue-signaling in the present.

But acknowledging our own errors can be extremely difficult, not only morally but also practically, precisely because we can’t always easily discern them. The authors called out in the article by Schwartz and Schlenoff were probably not evil people by the standards of their times. They may well have been idealists. But, as we see today, they were blind — just as blind as the countless laypeople, politicians, administrators, religionists, bureaucrats, and captains of industry who relied upon and followed the all-too-human scientific experts. (This is a real-world example of the blind leading the blind.)

So here is the question that I raise: How can we be certain that we’re not blind today? The question is relevant even regarding — perhaps especially regarding — matters on which there is broad consensus, not least among experts. If we’re blind to our own errors and mistakes, we will obviously not see them.

That is why broad scientific conclusions, and apparent historical and social scientific truths, often need to be not only gratefully received but also carefully examined and, even if they appear to withstand scrutiny, at most tentatively accepted. Humility is an intellectual virtue as well as a practical virtue for everyday life. We cannot be certain which of today’s obvious facts will be overturned in the light of the morrow. We can be certain only that, as has demonstrably happened in earlier generations, it will happen again. Humans will not stop being humans; mistakes will be made, discovered, and discarded. The march of science, and of historical and other forms of understanding, hasn’t stopped. It hasn’t culminated with us.

Let me close with a word concerning the present, on a matter about which I am sure there is no discernible error on my part. As this volume of Interpreter: A Journal of Latter-day Saint Faith and Scholarship goes to press, it is a special pleasure for me to acknowledge the efforts of those who have made it possible. Allen Wyatt has now been joined in his demanding duties as the Journal’s editor by Jeff Lindsay, for which we’re grateful. We also appreciate the time and energy expended by the writers in these pages, who receive no compensation beyond our gratitude and, I hope, a sense of satisfaction for doing important things that are appreciated by many others. Peer reviewers, source checkers, and copy editors are all anxiously, selflessly, and expertly engaged in what we view as a good cause. (A fuller accounting of those involved with the Foundation — sans peer reviewers, who necessarily do their work in anonymity — can be found on pp. ii–iii of this volume.)

I am keenly aware that without the generous donations of time, energy, and, yes, funding that come from many people, the Interpreter Foundation could not accomplish its work.


1. Jen Schwartz and Dan Schlenoff, “Reckoning with Our Mistakes: Some of the Cringiest Articles in the Magazine’s History Reveal Bigger Questions about Scientific Authority,” Scientific American 323/3 (September 2020): 36–41, https://www.scientificamerican.com/article/reckoning-with-our-mistakes/.
2. Lorraine Daston, with Moritz Stefaner and Jen Christiansen, “The Language of Science: How the Words We Use Have Evolved Over the Past 175 Years,” Scientific American 323/3 (September 2020): 26–35, https://www.scientificamerican.com/article/the-language-of-science/.
3. See Maryn McKenna, “Return of the Germs: For More Than a Century Drugs and Vaccines Made Astounding Progress against Infectious Diseases. Now Our Best Defenses May Be Social Changes,” Scientific American 323/3 (September 2020): 50–56. The article is also available online, unhelpfully under a different title (“In the Fight against Infectious Disease, Social Changes Are the New Medicine: Vaccines and Drugs Drove a Century of Progress, But Today’s Contagions Thrive on Inequality”), https://www.scientificamerican.com/article/in-the-fight-against-infectious-disease-social-changes-are-the-new-medicine/.
4. Martin Rees, “How Astronomers Revolutionized Our View of the Cosmos: The Universe Turns Out to Be Much Bigger and Weirder Than Anyone Thought,” Scientific American 323/3 (September 2020): 58–64, https://www.scientificamerican.com/article/how-astronomers-revolutionized-our-view-of-the-cosmos/.
5. David McCullough, Brave Companions: Portraits in History (New York: Simon and Schuster, 1991), 116.
6. See the entry on Emily Warren Roebling at the website of the American Society of Civil Engineers, https://www.asce.org/templates/person-bio-detail.aspx?id=11203.
7. A photograph of the plaque is available at http://www.hmdb.org/PhotoFullSize.asp?PhotoID=68007.
8. For their discussion of the article by Karl Drews, see Schwartz and Schlenoff, “Reckoning with Our Mistakes,” 38–39.
9. Ibid., 40.
10. Ibid., 39–40.
11. Ibid., 40.
12. Ibid.
13. For Hitler’s own Social Darwinist views, see Richard Weikart, Hitler’s Religion: The Twisted Beliefs that Drove the Third Reich (Washington, DC: Regnery History, 2016); also Anton Grabner-Haider and Peter Strasser, Hitlers Mythische Religion: Theologische Denklinien und NS-Ideologie (Vienna, Cologne, and Weimar: Böhlau Verlag, 2007).
14. Schwartz and Schlenoff, “Reckoning with Our Mistakes,” 40.
15. For their discussion of eugenics, see ibid., 40–41.
16. Hugh Nibley, “Fact and Fancy in the Interpretation of Ancient Records,” 55 pp., d.s. typed transcript of an address given at the third annual Religion Lecture Series at Brigham Young University on 11 November 1965, 6–7. (The transcript of this address has also been circulated under the title “Intre-Ancient Records.” Topics include Karl Popper, science, bias, and dogmatism.) Thanks to Shirley S. Ricks for locating this item for me.
17. For a discussion of human factors in mathematical logic, see William Barrett, The Illusion of Technique: A Search for Meaning in a Technological Civilization (Garden City, NY: Anchor Press/Doubleday, 1979), 3–117.

About Daniel C. Peterson

Daniel C. Peterson (PhD, University of California at Los Angeles) is a professor emeritus of Islamic studies and Arabic at Brigham Young University, where he founded the University’s Middle Eastern Texts Initiative. He has published and spoken extensively on both Islamic and Latter-day Saint subjects. Formerly chairman of the board of the Foundation for Ancient Research and Mormon Studies (FARMS) and an officer, editor, and author for its successor organization, the Neal A. Maxwell Institute for Religious Scholarship, his professional work as an Arabist focuses on the Qur’an and on Islamic philosophical theology. He is the author, among other things, of a biography entitled Muhammad: Prophet of God (Eerdmans, 2007).
