Saturday, October 13, 2012

Faust in Copenhagen review

Faust In Copenhagen: A Struggle for the Soul of Physics, by Gino Segré, Viking, 310 pages, $25.95.


Gino Segré’s new book derives its title from a skit performed at the end of a small and informal week-long conference held at Niels Bohr’s physics institute in Copenhagen in April 1932. That year was the centennial of Johann Wolfgang von Goethe’s death, so the chosen skit was a physicist-based version of Goethe’s epic poem about the battle between God and the Devil (Mephistopheles) for the soul of the very learned but very dissatisfied Faust.

By tradition, the Copenhagen skit was written and performed by the youngest conference participants, parodying the mannerisms and physics discoveries of the older participants. Max Delbrück, a 25-year-old Berlin physicist who later won a Nobel Prize for his contributions to biology, accepted the task of writing and being the Master of Ceremonies for the 1932 skit.

In Delbrück’s script, instead of God there is Bohr, the great father-figure of 20th Century physics, who often prefaced his remarks with "I don’t mean to criticize, but…". Delbrück’s Mephistopheles is Wolfgang Pauli, from Zurich, something of a Peter Lorre look-alike who Segré says was “generally recognized as having the finest critical mind in quantum theory,” not to mention the sharpest tongue. “So young and already so unknown,” was a well-known Pauli-ism directed at fledgling physicists whose work didn’t quite measure up to that of the twenty-something-year-old founders of quantum theory. (Erwin Schrödinger is the exception to this rule. He was in his forties when he discovered the Schrödinger equation.)


The innocent romantic interest of Faust, Margareta (also called Gretchen, her German nickname), was not a person in the Copenhagen skit—she was the neutrino, an entity postulated by Pauli in 1930, not discovered experimentally until the 1950s, and famously made fun of in John Updike’s poem "Cosmic Gall".

Finally, the substitute for Faust was a heroic but soon-to-be tragic figure, Paul Ehrenfest, a Leiden professor and one of Einstein’s favorite physics confidants. Segré says Ehrenfest, who never won a Nobel Prize, “was perhaps the best teacher of them all.”

The skit became known colloquially as the Copenhagen Faust. Excerpts from it and humorous drawings of the characters (taken from George Gamow’s book Thirty Years That Shook Physics) are sprinkled here and there in the book, as are translated excerpts from the original Faust. But the Copenhagen Faust is only used as a leverage point in Segré’s book, which is a warmly written account of what happened in the lives of seven of the conference participants from just after World War I until the early 1930s, a time Segré says was “probably the 20th Century’s most dynamic period ... marked by James Joyce’s cryptic retelling of the story of Ulysses, Arnold Schoenberg’s atonal compositions, Giorgio de Chirico’s eerie landscapes, Le Corbusier’s manifesto for a new architecture, and Heisenberg’s perplexing uncertainty principle.”

Werner Heisenberg is indeed one of the seven 1932 conference participants profiled by Segré. Besides Bohr, Delbrück, Pauli, Ehrenfest, and Heisenberg, the other two physicists Segré introduces to the reader are the laconic British theorist Paul Dirac and the female experimental nuclear physicist Lise Meitner, whom Segré describes as one of the 20th Century’s great physicists, overlooked repeatedly by Nobel Prize judges for personal and political reasons.

As Segré points out, 1932 was a pivotal year in several ways, but particularly in physics and German politics. Nuclear physics was born in February of that year, midwifed into existence at the Cavendish Laboratory of Cambridge University by James Chadwick’s discovery of the neutron. Less than a year later, Hitler became the chancellor of Germany, and Faustian bargains—to save one’s family, one’s self, one’s country—were suddenly not so easy to refuse.


Segré’s book, however, is not about Faustian bargains. The "struggle" referred to in the title is between Heisenberg's version of quantum mechanics, called matrix mechanics, and Schrödinger's version, called wave mechanics. While the book does provide a scientific look at the lives of the seven physicists, it is also a personal account that includes stories about Segré’s mother and mother-in-law, both of whom arrived in Munich, Germany, as 18-year-olds in 1918, the same year Wolfgang Pauli and Adolf Hitler arrived there. Heisenberg, also 18 years old, was already in Munich, since it was his hometown and he attended the university there.

Segré’s uncle, Emilio Segré, who shared the 1959 Nobel Prize in physics with Owen Chamberlain for their discovery of the antiproton, makes a cameo appearance in the book.

Faust in Copenhagen lacks the pathos of Michael Frayn’s Tony Award-winning play Copenhagen, about the still-mysterious purpose of Heisenberg’s visit to Bohr in occupied Denmark in 1941, but it does almost as good a job of portraying the greatest generation of physicists as Jon Else’s masterful 1981 documentary film, The Day After Trinity, about J. Robert Oppenheimer and the making of the atomic bomb.

Near the beginning of The Day After Trinity, Hans Bethe, who won a Nobel Prize for explaining how stars produce their energy, says, "You may well ask why people with a kind heart and humanist feelings…why they would go and work on weapons of mass destruction." Unlike the film, Faust In Copenhagen doesn’t attempt to answer that question. But it does provide some enjoyable background reading.


 (similar to my review published in the Arkansas Democrat-Gazette on January 6, 2008)

Wednesday, October 10, 2012

Physicists on Wall Street

Physicists on Wall Street, unpublished review


[This is a book review I wrote for the Arkansas Democrat-Gazette three years ago, which the book review editor decided not to use. Probably a good decision, as it's an obscure book, and why write a highly critical review of an obscure book? Yeh, so nobody will buy it! Like anybody was gonna anyway, hey? I edited it some more before posting it. I just love to write and especially, apparently, critiquing everthang physics-related. I did have a review, a positive review, of the book Faust in Copenhagen, by Gino Segre, published in the ADG earlier in 2008. Got a whopping $75 for it! Then in March 2009, as if to end my attempts to write reviews for them, the higher-ups at the ADG unceremoniously jettisoned the two-page book review section from their Sunday edition in order, one assumes, to stay afloat financially or to just look good (i.e., lean and mean) to Wall Street. Oh, yeh, speaking of which...]

Physicists on Wall Street and Other Essays on Science and Society, by Jeremy Bernstein, Springer, 182 pages, $34.95.

 
Some of the tasks performed by physicists and accountants are rather similar, and also rather straightforward. They both, for instance, work with balance sheets, although physicists’ balance sheets must conform to natural laws called conservation laws, and accountants are only required to follow man-made laws.
 
Physicists at the Large Hadron Collider in Geneva, for instance, look at the known masses and energies going into proton-proton head-on collisions and balance, or equate, those with the masses and energies of the particles coming out of the collisions. The protons collide at nearly the speed of light.
 
In theory it's simple, at least if you know your relativistic physics well enough, but in practice identifying particles and energies after the collisions is a complicated engineering task. The process requires not only huge particle detectors but also special computer programs that statistically search for the presence of new and already-known elementary particles.
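The balancing act described above can be written down compactly. As a sketch (standard relativistic kinematics, not something taken from Bernstein's book): total energy and total momentum are conserved in each collision, and the "invariant mass" of whatever was produced can be reconstructed from the measured outgoing energies and momenta:

```latex
% Conservation laws: totals going in equal totals coming out
\sum_{\mathrm{in}} E \;=\; \sum_{\mathrm{out}} E,
\qquad
\sum_{\mathrm{in}} \vec{p} \;=\; \sum_{\mathrm{out}} \vec{p}

% Invariant mass of a produced system, from measured energies and momenta
M c^{2} \;=\; \sqrt{\Big(\textstyle\sum E\Big)^{2} \;-\; \Big(\textstyle\sum \vec{p}\,c\Big)^{2}}
```

A bump in the distribution of M, accumulated over many collisions, is how a new particle announces itself in the detector data.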
 
Nowadays, accountants and financial analysts face similarly monumental tasks in figuring out the results of complicated financial transactions. Some of these analysts also, of course, invented those complicated financial transactions, purposely making them hard for industry regulators and investors to figure out. By their (man-made) nature, financial instruments that are the hardest to figure out also make the most money—if the people trying to profit from them don’t get fooled themselves.
 
Because of the complex, computer-intensive nature of their work, quantitative financial analysts—also called “quants” or financial engineers—are often recruited from the PhD pool of applied mathematicians, engineers, computer scientists, and even elementary particle physicists.
 
Jeremy Bernstein himself is an elementary particle physicist and a professor emeritus at the Stevens Institute of Technology in New Jersey. He was a staff writer for The New Yorker from 1961 until 1995, and his compact scientific biography of Albert Einstein, simply titled Einstein, was nominated for a National Book Award in 1974. Since then he's written over twenty books on physics, physicists, and other subjects, including mountain climbing. 
 
With a background like that, Bernstein should be a sure bet. Of late, however, he’s been having a bad run. His two books published last year, Nuclear Weapons: What You Need to Know and Plutonium: A History of the World’s Most Dangerous Element, don’t deliver what their titles promise and often read like extemporaneous lectures transcribed directly into books. And we’re not talking Feynman-type lectures, either. These books seriously needed a good editor.
 
It does seem that Bernstein has lost his muse, or maybe just his editor. He says in the acknowledgements at the beginning of his Plutonium book, “When I first started writing books, now some decades ago, they were made up of things that had first appeared in the New Yorker.” He then gives credit to that magazine’s longtime editor, the late William Shawn, for helping him learn how to write about science for the general public. 
 
The publication of Physicists on Wall Street, a hodge-podge collection of essays on science, economics, and language, should set Mr. Shawn to spinning in his grave, if he has one. The book could provide nonfiction writing instructors with many examples of how not to write, starting with the first line in the preface: “Everyone has their own way of learning.” 
 
(That could easily have been changed by an editor, or by Bernstein himself, to “Different people have different ways of learning.” Or at least it could be grammatically correct if it read “Everyone has his or her own way of learning.”)
 
If all the problems with Bernstein’s writing were so slight, things wouldn’t be so bad. But his narrative bounces around as if it were following a random walk, the Brownian motion (also called the drunkard’s walk) that Einstein analyzed statistically in 1905 and that Wall Street put to use in the 1970s in the Black-Scholes formula.
 
For example, Bernstein writes this about a physics PhD named Emanuel Derman: “He interviewed at Salomon, where eventually he took a job for a very unhappy year, after which he returned to Goldman. One of the groups at Salomon that he interviewed with was one that had been handpicked by John Meriwether." 
 
The mention of John Meriwether leads Bernstein off in another direction, into a discussion of Meriwether's ill-fated hedge fund, Long Term Capital Management, which financially imploded in 1998. That particular debacle, from which regulators of the financial industry seem to have learned nothing, is also described in the book When Genius Failed: The Rise and Fall of Long Term Capital Management, by Roger Lowenstein. Emanuel Derman tells his own story in My Life As a Quant: Reflections on Physics and Finance, published in 2004. Either of these would be a better choice for learning about Meriwether and Derman than Bernstein’s rambling anecdotal account.
 
Bernstein’s writing on science and scientists of various sorts is only slightly better than his discussion of stock market economics. And his writing about language and linguists is too lengthy to maintain the reader’s interest. Again, the problem seems to be a total lack of much-needed editing.
 
Bernstein at least does make one astute and timely comment about the financial markets. "The key to everything was the assumption that the market would behave rationally,” he says in a chapter called “The Rise and Fall of the Quants.” “This continuity of behavior was one of the assumptions, for example, that went into deriving the Black-Scholes formula. If in the Brownian motion, for example, the drunkard suddenly falls down a manhole, all bets are off."
 
That’s a pretty good description of what actually happened to the financial world just after Bernstein’s book was published in August 2008. The cover of the September 29th issue of The New Yorker tells that same story pictorially: a businessman walking in front of the New York Stock Exchange, preoccupied with a cell phone call, is about to step into an open manhole. The cover is titled "Downward Mobility."
 
 

Sunday, October 7, 2012

A brief history of Lise Meitner

© 2002 David W. Trulock (an unpublished manuscript from March 2002)

Before Women's History Month ends, there's one woman who lived during the 20th Century who needs to receive more attention, especially in these times of war, rumors of war, and an increased threat of nuclear weapons use.

 Ever heard of Lise Meitner?

That's what I thought. She was born in Vienna in 1878 and died in Cambridge, England, in 1968. (Lise was originally Elise, but it's pronounced just like Lisa.) In her 90 years of unmarried life, Meitner earned a doctorate in physics, became a prominent experimental nuclear physicist in Berlin (and a protégée of Einstein, Schrödinger, et al.) and then became a refugee from Germany a few months after Hitler's annexation of Austria in 1938.

It was on Christmas Eve of 1938, after 6 months of living in exile in Sweden, that Meitner made the most important scientific discovery of her life—and one of the most important of the 20th Century. She and her physicist nephew Otto Frisch did some simple calculations and drawings that convinced them a uranium nucleus hit by a neutron could be split in half, a process that they shortly named nuclear fission.

Although Einstein's E = mc2 formula had been around since September 1905, Meitner, on that Christmas Eve morning, was the first to see how it applied to the fissioning of the uranium nucleus. Before being exiled from her work, Meitner had instigated experiments involving neutron bombardment of uranium with her collaborators Otto Hahn and Fritz Strassmann in Berlin.  Because she was Jewish, however, Meitner could not continue to lead their experimental team after March 1938, the month of Hitler's annexation and occupation of Austria. The annexation made Meitner a German citizen, and thus subject to the restrictive Nazi anti-Semitic laws. But she and Hahn kept in constant contact by mail after she illegally and with much sadness left Germany for Sweden.

It was only Hahn, however, who later received a Nobel Prize (in chemistry) for his and Strassmann's chemical separations, done in late 1938, showing neutron bombardment of uranium produced barium, an element with about half the mass of uranium. Hahn, like other scientists of the time, could not understand such a result until Meitner received word of it and she and Frisch made their theoretical calculation showing the nucleus could split. 

The calculation, based on Niels Bohr's newly developed liquid-drop model of the nucleus, used E = mc2 and predicted an energy release of about 200 million electron volts per nucleus, vastly greater, atom for atom, than the energy released in any chemical reaction. What was still necessary for the practical use of this energy was that there be a chain reaction, meaning more than one neutron would have to be released by each uranium fission. That more than one neutron was in fact released was soon demonstrated experimentally in Paris and New York.
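The Christmas Eve arithmetic can be sketched in modern units (the numbers here are the standard textbook reconstruction of the episode, not a quote from this manuscript): Meitner reckoned that the masses of the two fission fragments add up to about one-fifth of a proton mass less than the mass of the original nucleus, and E = mc2 converts that missing mass into the energy carried off by the flying fragments:

```latex
% Mass defect of fission, as Meitner estimated it from the known masses
\Delta m \;\approx\; \tfrac{1}{5}\, m_{p}

% Converting the missing mass to energy
E \;=\; \Delta m \, c^{2}
  \;\approx\; 0.2 \times 938~\mathrm{MeV}
  \;\approx\; 200~\mathrm{MeV}
```

About 200 million electron volts per splitting nucleus, which, as Frisch later recounted, matched the energy the liquid-drop picture said the two electrically repelling fragments would carry away.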

When Meitner had a chance to leave her pittance of a research job in Sweden in the early 1940s and work on the Manhattan Project in the United States, she refused, saying "I will have nothing to do with a bomb!" She had been a volunteer X-ray technician during World War I, and unlike many past and present supporters of war, she knew what it truly involved.

Ruth Lewin Sime, in her definitive 1996 biography, Lise Meitner: A Life in Physics, explains: "Meitner wanted no part of deaths anywhere: she could not commit herself and her physics—the two were not distinct—to a weapon of war. She had seen the casualties firsthand in 1915-1916; she had heard the screams. She could not do it. Her decision was instantaneous and absolute: there was no discussion. She would not work on the bomb."

Besides Sime's book, other references on this unusual and overlooked scientist include the 1992 BBC documentary "A Gift From Heaven" (Hahn's phrase for what he regarded as his fission discovery), and Rachel Barron’s book Lise Meitner: Discoverer of Nuclear Fission, published in 2000.

Although Meitner was nominated several times for a Nobel Prize in physics, it was never awarded to her.
------------------------
(There's an original b&w photo located outside the ground floor lecture hall in Robert Lee Moore Hall on the University of Texas at Austin campus that shows Meitner and Irène Curie and about 20 mostly famous male scientists, and it has their original signatures on it. The photo is from the 1933 Solvay physics conference, held in Brussels. Albert Einstein is not in the photo, and thus it doesn't have his signature on it. He had attended all the earlier conferences, held about every three years starting in 1911, but he permanently moved to Princeton, New Jersey, in 1933 to escape the Nazis, and did not attend the 1933 conference. In fact, even after the war was over, he didn't return to Europe even for a visit.)

Friday, October 5, 2012

Article in college paper on Amory Lovins

Consumer. Consumer…Arrgh!  [note: this was before the pirate "arrgh!" craze]


Lately I’ve been wondering whether I like being called a consumer. I never gave the idea much thought until I read an excerpt from The Strawberry Statement by James Simon Kunen in which he enumerates some of the things he doesn’t like: “calling people consumers” is something he doesn’t like. (Or I should say, didn’t like in 1968, when he wrote the book.) Then I began thinking about the word itself. Consumer. Consumer . . . Arrgh! I don’t like it! A vacuum cleaner consumes! A garbage disposal consumes! But it’s true: people also are consumers.

Our system of buying and selling is based on mass production and mass consumption. If you don’t want to be a consumer you can drop out of the system. Get some land, grow your own food, barter your handmade tapestries. You are going against the grain, though, and that’s not easy—you still have property taxes to worry about for one thing. The system won’t accommodate you; it’s much too inflexible, and many of the sources of production are too far removed, too conglomerated, to provide anything except the large-scale, lock-step service designed to bear the greatest amount of traffic.

Thus most of us are consumers by default: we fail to try anything that isn’t already systematic. But by what other means does one raise a family? The picture isn’t entirely gloomy, of course, because there are some advantages. The convenience of it all is the greatest advantage and is actually at the root of the whole system. If you can do all your shopping at Skaggs-Albertson’s you don’t need the grocer on the corner or the drugstore down the street. You can load up your car (another convenience) with all you need by making one stop. But the convenience breeds a certain habit, and the habit determines what kind of consumers we are. What kind of consumers we are determines . . . well that’s something to think about.

Energy consumption is something else to think about: “The energy in one U.S. gallon of oil is equivalent to one and a half weeks of a fine diet of 3,000 food calories per day (more than most of the world’s people get). The gallon lasts less than ten minutes in a fast car. A Concorde SST consumes it in about a tenth of a second. Millennia were consumed in putting the gallon together.”

It’s obvious that Amory Lovins, who put those words together in his 1975 book World Energy Strategies, is thinking about energy consumption. He is not painting the gloomy picture, however, like so many others, some of whom are trying to gently but firmly coerce the public into thinking that new power plants must be built without hesitation in order to supply the increasing energy needs of our society.

Perhaps you have wondered why energy usage must so inevitably increase. Isn’t conservation capable of holding the line on energy consumption?

That question has been exhaustively studied and the result is central to the idea of using alternative energy sources: zero increase in the rate of energy consumption is feasible—in fact a decrease in energy consumption is possible. Whether or not it’s desirable depends on who you ask about it.

Lovins talked about his ideas at the Old State House in Little Rock last month. About 150 people attended his lecture. Among his statistics, which were plentiful and thought-provoking, was a chart showing how government and industry had, over the last few years, lowered their predictions of future energy usage. Lovins called the chart a diagonal matrix because his predictions from several years ago, in the upper left of the chart, agreed precisely with the most recent predictions of government and industry, shown in the lower right. In other words, as time has passed even the most vested of vested interests have concluded that energy consumption isn’t going to increase as rampantly as they once thought it would. But they are not very eager to let the public know this.

At this point a little information on Amory Lovins himself is appropriate: He is a 27-year-old physicist, an American living in England, working as the chief British representative of Friends of the Earth International. He looks sort of like a scaled-down version of Isaac Asimov—smaller but with the same wiry hair, same black-framed eyeglasses, and behind the glasses the same rather wild, electric look in the eyes. His ideas, however, are far from being wild, although they certainly go beyond more conventional thinking.

Lovins wants to convince people that alternative energy sources will not only work, they will also provide a significant improvement over the present system. He claims the “soft path” of solar energy, wind energy and other diverse non-extinguishable sources of energy, “each doing what it does best and none of which is a panacea,” will ultimately be less expensive than coal, oil and nuclear energy. Government subsidies, he pointed out, make these conventional forms of energy seem cheaper than they really are. In an interview published August 20 in the Arkansas Gazette, Lovins says the subsidies should be abolished and low-interest loans from large holders of capital should be made available to help people finance insulation and solar collectors for homes and businesses.

In his speech Lovins called the soft path “a hopeful alternative to the energy future” and said it would “have no effects on the life-style” of people in the affluent Western nations.

One aspect of the soft path is a decentralization of power production, resulting in energy sources being better suited to a particular requirement. Solar energy, for instance, is well suited for heating—water heating and space heating. The latter of these accounts for 58% of the energy usage in the United States. When electric power from a steam generating plant is used for space heating, it’s like “using a forest fire to fry an egg,” Lovins said.

He also mentioned other problems with coal and nuclear power plants: environmental damage, including “zones of national sacrifice,” such as Appalachia, where coal is mined and used for energy in faraway cities, and also places where highly radioactive nuclear by-products would be cached; the political problems created by allocating considerable sums of money to meet the capital demands of large power plant construction; and the tendency to form an “elitist technocracy.”

Conservation is a key ingredient in the soft approach to solving energy problems. Of course, we are already being urged to conserve energy, but if we do a really good job of it, what happens? The electric rates go up, for one thing—the company must make a profit. So with respect to the present state of affairs, a little conservation is beneficial, but it’s hazardous to the economy to do a lot of conserving.

Again there is the alternative of dropping out of the system, although that means losing the conveniences and changing habits. (However, it also means gaining some independence from rising utility costs.) Lovins's approach is to initiate a new system, one that is not so far removed from the user, nor too conglomerated. Whether or not that system can be implemented depends on whether people are going to be insatiable consumers or sensible individuals.

      --David Trulock, in The College Profile, Hendrix College’s student newspaper, December 8, 1978, page 6. Lovins is now director of The Rocky Mountain Institute. Peace.

Monday, October 1, 2012

Displaying Little Regard for the Mysteries of Life

This article was published in October 1986 as an unsigned editorial in the now-defunct but once great bi-weekly Little Rock alternative newspaper Spectrum. Five years later, the editorial, with me listed as the author, was reprinted with other Spectrum articles in a book called A Spectrum Reader: Five Years of Iconoclastic Reporting, Criticism and Essays, edited by Bill Jones, Philip Martin, and Stephen Buel.



There are many simple questions for which science has no simple answers, and some of the simplest involve life itself. Two of those questions pertain to current events. One is "When did the human species come into existence?" and the other is "When does the life of an individual begin?"

Science does not have the answers to these questions. It does not have the answer to the former because a definitive pattern of human evolution has not been established. To the latter, science has no answer because it is not able to answer a more fundamental question: What is life?

This pair of simple questions is of current interest because of the points of view espoused by the creationism and pro-life movements, both of which have received considerable political attention and are about to receive more. Creationism arguments emanating from Louisiana will be heard later this year by the United States Supreme Court. In Arkansas, as well as several other states, the pro-life movement has helped put an anti-abortion amendment on the November 4 ballot.

Both the pro-lifers and the creationists have experienced setbacks in Arkansas in recent years. In early 1982, U.S. District Judge William R. Overton ruled against the equal-time teaching of creationism with evolution, saying that "creation science" is a religious point of view, not a scientific one. Two years ago, the Arkansas Supreme Court disallowed the ballot title of the Unborn Child Amendment, saying the title was misleading. The amendment was renamed "The Limitation of Abortion Funding Amendment" by the Unborn Child Amendment Committee and will appear on this year's ballot as Amendment 65.

The people responsible for the creationism and pro-life movements share a common belief concerning the time scale of the beginning of life. Creationists say the human species appeared virtually instantaneously, and pro-life proponents say that the life of an individual begins in the same fashion. The pro-life movement's point of view was stated a few months ago during a KLRE [public radio] interview, when the vice-president of the Unborn Child Amendment Committee at one point said, "Life begins at conception." Not "We believe life begins at conception," but "Life begins at conception," as if the statement represented an unquestionable fact.

Those of us who will admit to not being sure when life begins must ask of the pro-life movement, "How do you know that?" The only thing science has to offer in this regard is that there is a unique set of DNA present after fertilization occurs. But uniqueness and the existence of life are not the same thing.

More satisfactory from a scientific standpoint is the view that life does not begin at a certain moment, but rather that it evolves from one stage to another— from separate and living sperm and ovum to fetus, and from there to a full-fledged human being. Such a point of view does not imply that there is nothing wrong with abortion. On the contrary, it says that abortion is very disrespectful of the mystery of human life. However, by stating unequivocally that human life begins at conception, the pro-life movement is also disrespectful of this mystery.

Creationists have also shown little regard for the mystery of human origins. There is validity to the creationists' criticism that some people falsely accept evolution as a fact. That is not a problem with evolution but with people's understanding of science. This problem is only worsened by the arguments put forward by the creationist movement, because the creationists have created the perception that people must accept one theory or another, when in fact people do not have to accept any scientific theory as fact. That includes such now-obvious theories as Newton's theory of universal gravitation.

It is appropriate to mention gravitation at this point, because it, like evolution, can be said to contradict the Book of Genesis. Before Newton, it was imagined that celestial bodies moved in accordance with divine law, not in accordance with any natural law that could be described by humankind. What Newton did for physics, Darwin did for biology. Newton discovered universal gravitation and described it with a simple equation, and Darwin discovered the evolution of species, which he described in terms of the law of natural selection.

To put the scientific viewpoint in its proper setting, the late UCLA astronomer George Abell, who gave an energetic lecture at the University of Arkansas at Little Rock one afternoon not long before he died, included a section called "What science is and is not" in the first chapter of his highly-regarded astronomy textbook. Abell wrote, "One could argue, technically, that the sun and moon do not exist at all—that we are dreaming the whole thing. Most of us accept the existence of a real world; moreover, we accept many scientific theories as fact - such as the rotation and revolution of the earth. But in this acceptance we are going beyond the rules of science into religious belief. Science does not dictate as fact that the earth moves, but only that its motion is required by Newtonian theory."

In light of Abell's words, it should be reiterated that some people's acceptance of evolution as fact must be regarded as scientific naiveté or as a pseudo-religious belief. But even if one chooses, for the sake of argument, to consider the latter case, there is still no credence warranted for the creationists' claim that the theory of evolution is itself the equivalent of a religious belief. Evolution, including human evolution, is no more and no less godless than the theory of gravitation.

The creationists apparently are not satisfied with such a separation of religion and science. They wish to take the literal translation of the Book of Genesis and pit it against evolution in the arena of scientific theory. In doing so, they force people to make a choice between evolution and Genesis, much as the pro-life movement forces people to choose between the belief that life begins at conception and the belief that abortion is perfectly proper. In short, the creationist argument equates evolution with atheism, and the pro-life argument equates abortion with murder. That these two arguments do not have any scientific validity hardly matters if they are given political validity in the coming weeks.

Thursday, April 26, 2012

Disagreeing with Thomas Sowell on the sixties

(Guest Column from the 11 June 2006 op-ed page of the Pine Bluff Commercial.)

In his May 31st column criticizing liberalism in the 1960s ("Liberals are staying busy preserving their own vision of history"), Thomas Sowell tries to refute the recollection of a liberal political activist who is quoted as saying [to Sowell], "This country was about to blow up. There were riots everywhere. You can stand there now and criticize, but we had to keep the country together, my friend."
Sowell says it was instead people like Chicago Mayor Richard J. Daley who kept the country together during the riots of the '60s: "Even during the 1960s, riots were far more common and deadly in liberal bastions like New York City than in Chicago, where the original Mayor Daley announced on television that he had given his police orders to 'shoot to kill' if riots broke out."
Because of this, Sowell says, "the net effect was that Daley saved lives while liberals saved their vision." But it seems to me that the net effect of Sowell's comparison of rioting in New York and Chicago, and his quoting of Mayor Daley, is to give the wrong impression.
The 1969 Britannica Book of the Year, in an article on race relations, describes the contrast between Chicago and New York right after the April 4, 1968, murder of Rev. Martin Luther King Jr. in Memphis:  "In Chicago, angry blacks burned down blocks and, in the end, 12,500 troops were needed to bring the city under control.  In New York, a cool mayor kept violence to a minimum by walking the streets of Harlem three nights in a row, but many wondered if nonviolence was dead."
The yearbook says that on April 15, 1968, Daley called for police to "shoot to maim" looters as well as "shoot to kill" any arsonists.  It also reports that after complaints from civil rights activists, both orders were "subsequently modified."  The orders therefore represent failures on the part of Daley and the Chicago police, not successes.  And Daley's iron-fisted tactics failed again during the antiwar protests at the Democratic National Convention in Chicago in August 1968.

Sowell makes an issue of the fact that more congressional Republicans than Democrats voted for the Civil Rights Act of 1964 and the Voting Rights Act of 1965. But liberals and conservatives didn't fall neatly into Democrat or Republican affiliations in the 1960s.

Mayor Daley was famously a big-time Democratic party boss, while the "cool mayor" of New York mentioned in the Britannica yearbook was John V. Lindsay, a Republican who had his share of problems of his own making, but not when it came to the success of his liberal-minded riot control response in 1968.

Lindsay's career says a lot about what happened to the Republican party after the 1960s.  

When he was defeated in the 1969 New York Republican mayoral primary, Lindsay ran on the Liberal Party ticket and was re-elected to a second term.  Then he switched his party affiliation to Democrat in 1971 and made an unsuccessful run for the Democratic presidential nomination in 1972 [which went to George McGovern]. Of course, that was the year Richard Nixon—promising “law and order”—was re-elected in a landslide.
Lindsay's December 20, 2000, obituary on CNN's website quotes his reasoning for becoming a Democrat in 1971: "It has become clear that the Republican Party and its leaders in Washington have finally abandoned the fight for a government that will respond to the real needs of most of our people— and those most in need."

That is nearly a textbook definition of what liberalism is all about, and is also a fitting indictment of the Republican Party and its leaders in Washington today.

David Trulock is a writer and physicist who grew up in Pine Bluff in the 1960s and 1970s and recently moved back after living in Austin, Texas, for 17 years.  

Wednesday, April 25, 2012

1991 Ode to Walker Percy

This appeared in Spectrum Weekly (Little Rock), May 22, 1991.  Percy died on May 10, 1990, from prostate cancer or its complications.  I only got around to reading The Moviegoer in 1990, after I borrowed it from a friend in Austin. If I recall correctly, I'd barely finished reading it when Percy died.


Walker Percy, the Southern writer who died last May at the age of 74, was interested in what he once called "the dislocation of man in the modern age.”  But unlike some other writers and philosophers—Thoreau comes most vividly to mind—Percy did not draw any sharp conclusions on that subject.
Whereas Thoreau concluded most people "lead lives of quiet desperation," a protagonist in a Percy novel is more likely to move about in an open-ended wandering, wondering if a little quiet desperation isn't better than the alternative.
Percy's first and last novels consider that alternative—something the protagonist in the first novel sees as sort of a living death and something the last novel examines as a way to head off the existential search for meaning before it's begun.
Percy’s first novel, The Moviegoer, won the National Book Award for fiction when it was published in 1961. In a way, however, it later served as the model for The Thanatos Syndrome, Percy's last and best novel, published in 1987. As a model, The Moviegoer offered new insights, new angles from which to view familiar subjects, including one of the primary ways Americans identify with the world at large—by going to the movies.
Percy himself spent a good deal of time going to the movie houses of upper Manhattan in the late 1930s, when he was earning his psychiatry M.D. from Columbia University's College of Physicians and Surgeons.  However, while interning at Bellevue Hospital, Percy contracted tuberculosis, and thus spent the next few years convalescing in sanatoriums, where he read the Russian novelists, French existentialists such as Sartre, and the 19th-century Danish philosopher Soren Kierkegaard.
Percy had spent his childhood in Alabama and in Greenville, Mississippi, and after surviving his bout with tuberculosis, he returned to the South in 1943, to live near New Orleans.  In the mid-1950s he began work on The Moviegoer.
The book takes place in New Orleans, where John Bickerson Bolling, the thoughtful and observant 29-year-old narrator, has become happily entrenched in the American way of life:  selling stocks and bonds, watching television, dating his secretaries, and going to the movies, "Where Happiness Costs So Little," as the permanent lettering on the marquee at his neighborhood theater notes.
Bolling is "a model tenant and a model citizen," with "a wallet full of identity cards, library cards, credit cards . . . certifying, so to speak, one’s right to exist."  But underneath his bon vivant lifestyle, and that of his contemporaries, he senses a problem.  “For some time now,” Bolling confides to the reader, “the impression has been growing upon me that everyone is dead.”  It is during ordinary conversations that he senses this phenomenon.  “As I listen to Eddie speak plausibly and at length of one thing or another—business, his wife Nell, the old house they are redecorating—the fabric pulls together into one bright texture of investments, family projects, lovely old houses, little theater readings and such.”
But the fabric unravels as rapidly as it is woven, and "it seems that the conversation is spoken by automatons who have no choice in what they say.  Everyone seems to talk like a politician. I hear myself or someone else saying things like: 'In my opinion the Russian people are a great people but ...' or 'Yes, what you say about the hypocrisy of the North is unquestionably true. However ...' and I think to myself: this is death."
Bolling escapes death by going to the movies.  "Other people, so I have read, treasure memorable moments in their lives…  What I remember is the time John Wayne killed three men with a carbine as he was falling to the dusty street in Stagecoach and the time the kitten found Orson Welles in the doorway in The Third Man."
Percy is partly poking fun at himself and his New York movie-going period.  But he is also taking a deep satirical look into the relationship that has developed between people and movies. In that respect, The Moviegoer is more relevant today than it was 30 years ago.  More than ever, movies are a part of, or possibly a substitute for, something Percy also identified in the book.  He called it “the search.”
Bolling first thought of pursuing the search when he was wounded during the Korean War. "I came to myself under a chindolea bush," he remembers. "My shoulder didn't hurt but it was pressed hard against the ground, as if somebody sat on me. Six inches from my nose a dung beetle was scratching around under the leaves.  As I watched, there awoke in me an immense curiosity. I was onto something. I vowed that if I ever got out of this fix, I would pursue the search. Naturally, as soon as I recovered and got home I forgot all about it.”
But the idea of the search unexpectedly returns to him. One morning he notices that his wallet, watch, pen and keys look unfamiliar. What was unfamiliar about them he says, "was that I could see them.  They might have belonged to someone else.  A man can look at this little pile on his bureau for thirty years and never once see it. It is as invisible as his own hand. Once I saw it, however, the search became possible."
The search in one form or another remained one of the central themes in Percy's later novels. It is “what anyone would undertake if he were not sunk in the everydayness of his own life,” Bolling explains. “To become aware of the possibility of the search is to be onto something. Not to be onto something is to be in despair.”
Yet how does one tell if one is onto something or in despair, and does desperation become invisible to the person experiencing it?  Percy uses a quotation from Kierkegaard at the beginning of The Moviegoer to identify that state of being out of touch with one’s own emotions as true despair:  “The specific character of despair is precisely this: it is unaware of being despair."
Throughout all this Percy allows for the possibility that it is just his point of view, or Bolling's point of view, that makes some people seem to be truly in despair.  “Have 98% of Americans already found what I seek or are they so sunk in everydayness that not even the possibility of a search has occurred to them?" he wonders.
In The Moviegoer the search remains an abstraction. In The Thanatos Syndrome the search crystallizes into something specific and meaningful—namely the protagonist’s search for the cause of the syndrome.  Echoing a phrase from The Moviegoer, and also written in the first person, The Thanatos Syndrome opens with this line:  "For some time now I have noticed that something strange is occurring in our region."
The region is southeast Louisiana, near Baton Rouge, and the narrator is Dr. Tom More, a thoughtful, observant, and unfettered psychiatrist who is trying to figure out what the problem is with some of his patients and old friends—and also with his wife. Thanatos is a Greek word for death, hence the repetition of the moviegoer's observation, but as less of a metaphor.
More describes the syndrome as the abatement of such things as anxiety, depression, stress, insomnia, suicidal tendencies and chemical dependence. "Think of it as a regression from a stressful human existence to a peaceable animal existence."
What has caused it? Nothing less than a plot to create a utopia of happy automatons by treating the water supply with an uncommon isotope of sodium.  Statistics show an 85 percent decrease in violent crime, largely due to a drop in drug abuse, a 76 percent decrease in reported cases of AIDS, an 85 percent decrease in teenage pregnancy, a 95 percent decrease in teenage suicide, and to top all that off, notes one of the perpetrators, "LSU has not lost a football game in three years."
The thanatos syndrome, it seems, is simply life without the possibility of the search, but with a good dose of metaphysical anesthetic thrown in to prevent despair.  Maybe it is also Percy’s last statement about the dislocation of men and women in the modern age.  He seems to be asking us a question about ourselves as individuals:  What are we willing to give up to have safe, secure and also socially impressive lives?
He left us with a partial answer:  Don’t give up the search.

Friday, April 13, 2012

From the Robert Oppenheimer page of my website

In the photocopies I sent out recently to friends and family members, page 614 from The Making of the Atomic Bomb refers to other events that occurred on April 12, 1945, besides the death of President Roosevelt.  The other events were a report from Germany by Allied intelligence officers discussing the first physical evidence that German scientists had failed to develop an atomic bomb, and, in Tokyo--where it was Friday, April 13--the burning down, due to Allied bombing, of the scientific lab where some infinitesimal progress had been made toward a Japanese atomic bomb. Therefore, looking back in historical perspective, as of that date the United States had a monopoly on the Bomb. At the same time, however, the person most likely to use that monopoly wisely--Franklin Roosevelt--died. Also: Hitler's suicide was announced by German radio on May 1. The Germans surrendered on May 7, and within weeks of that date the fierce fighting on the various Pacific islands near Japan was over. The war was then a waiting and bombing game (see below) as the Allies prepared for the invasion of the Japanese home islands. (May 8, 1945, is the official date of the end of the war in Europe, or V-E Day.)

As several people discuss in The Day After Trinity, President Truman merely went along with the program that was already in place for dropping the bombs on Japan. Roosevelt might have done things differently, because as Commander-in-Chief of the U.S. military he had full authority over the Manhattan Project from the beginning. One thing he might have done differently: on the day of the Trinity test, he might have announced to the world that the U.S. had successfully exploded the first atomic bomb. Instead, the U.S. Army issued its planned press release saying an ammo dump at Alamogordo had exploded. In the name of secrecy, which was conventionally thought to equate to security, the nuclear age began with a government lie. The main intention of the lie was to keep a military secret during wartime, but the secret was to be given up as soon as the bomb was used, so the main result of not publicizing the Trinity test after it happened was to deprive the American people of having a voice in the use of the bomb against Japan. A secondary result was that the Japanese were not given a chance to surrender with knowledge of the Bomb before having it dropped on them.

By July 1945, the time of the Trinity test, the only fighting going on besides the isolated attacks from Japanese submarines was the unimpeded American bombing of Japanese cities with hundreds of new B-29 'Superfortress' planes. This bombing could hardly be called fighting since the Japanese air force had already been almost entirely destroyed, partly by its own kamikaze attacks on Allied ships. The large numbers of estimated lives saved by the use of the atomic bomb are only the estimated casualties of the future invasion of the Japanese homeland had the atomic bomb not been used. That presumes the Japanese would not have surrendered otherwise, which is a rather big question mark and is related to the Allied demand for "unconditional" surrender, which meant to the Japanese that they would have to give up their emperor, whom they considered to have a divine right and responsibility for ruling Japan. After the surrender and the American occupation of Japan, the emperor was left in place anyway, so the very fear that made the Japanese fight so hard and refuse the unconditional surrender demand was, in the end, never realized.

(The Japanese soldiers not only fought hard, they fought without mercy. Here's a quote that was to accompany the Hiroshima/Nagasaki 50th anniversary exhibit at the Smithsonian National Air and Space Museum in 1995. Because of complaints by the American Legion and the Air Force Association, the controversial exhibit was dumbed down to make it a simple patriotic exhibit. However, one of the things left out in the process was an accurate historical rendering of Japanese wartime atrocities. This excerpt comes from one of the original, cancelled exhibit labels: "In 1931 the Japanese Army occupied Manchuria; six years later it invaded the rest of China. From 1937 to 1945, the Japanese Empire was constantly at war. Japanese expansionism was marked by naked aggression and extreme brutality. The slaughter of tens of thousands of Chinese in Nanking in 1937 shocked the world. Atrocities by Japanese troops included brutal mistreatment of civilians, forced laborers and prisoners of war, and biological experiments on human victims." I got this excerpt from an article in The Bulletin of the Atomic Scientists, May/June 1995, by Stanley Goldberg, titled "Smithsonian suffers Legionnaires' disease.")

The Russians were another issue. The USSR was our ally in WWII but was not allowed any knowledge of the Manhattan Project. This was another secrecy issue that eventually had to be dealt with, but the Russians knew about the Bomb anyway, through espionage, so that part of the huge effort that equated secrecy with security had already failed before the Trinity test. (And the U.S. was aware of the espionage, but not the extent of it.) At the Potsdam conference near Berlin, a few days after Trinity, Truman told Stalin about the successful test of the Bomb, but Stalin was able to brush off the information as of no significance because the Russians were already at work on a bomb of their own. This was in effect the beginning of the Cold War. Announcing the Trinity test to the world at this time would have upstaged Stalin's secret knowledge of the Bomb, but instead Stalin was able to thumb his nose at Truman's uncharacteristic meekness and weakness in sharing the secret with him. Well, perhaps casualness is a better word than either meekness or weakness. Here's an excerpt from The Making of the Atomic Bomb, p. 690:

But in fact Stalin already knew about the Trinity test. His agents in the United States had reported it to him. It appears he was not immediately impressed. There is gallows humor in Truman's elaborately offhand approach to the Soviet Premier at the end of that day's plenary session at the Cecilienhof Palace, stripped and shabby, where pale German mosquitoes homing through unscreened windows dined on the sanguinary conquerors. Truman left behind his translator, rounded the baize-covered conference table and sidled up to his Soviet counterpart, both men dissimulating. "I casually mentioned to Stalin that we had a new weapon of unusual destructive force. The Russian Premier showed no special interest. All he said was that he was glad to hear it and hoped we would make 'good use of it against the Japanese.'" "That," concludes Robert Oppenheimer dryly, knowing how much at that moment the world lost, "was carrying casualness rather far."

Last but not least, Los Alamos scientists, not the American public, later had to bear the responsibility for the first use of the Bomb on a civilian population. Although that made the scientists heroes at the end of the war, it turned them into villains for later generations. One of the scientists at the Trinity test, Kenneth Bainbridge, got it right when he said to the others, "Now we're all sons of bitches." Americans during WWII considered Hirohito, the Japanese emperor, to be the embodiment of evil, something like Osama bin Laden is today, and had they known about the Bomb, they--we--probably would have been happy to nuke the Japanese, so the outcome would likely not have been different, but the "physicists have known sin" legacy would likely not exist. And although the American public would have been happy to nuke Hirohito, the hatred was not reflected in military policy. While nearly all of Tokyo was reduced to rubble by American bombing in early 1945, the emperor's palace was intentionally left untouched. This was mainly a practical matter: we needed Hirohito to declare a surrender, when that point eventually was reached.

It's possible to believe that the use of nuclear weapons at Hiroshima and Nagasaki provided the world with an inoculation against future uses of the Bomb, such as during the Korean War, the Cuban Missile Crisis and the Vietnam War. But the instability of the world situation these days, including even the U.S.-Russian situation, suggests the inoculation has worn off. The U.S. is going to try to build a nuclear missile defense, and Russia, as Vladimir Putin recently warned, is going to attempt to design cruise-type missiles that can dodge the U.S. defense system. China is not being vocal on the issue, but isn't going to twiddle its thumbs while the U.S. builds a nuclear missile shield. Besides that, the risk of an accidental nuclear missile firing is a current problem that is not lessened by political instability in the former U.S.S.R. and elsewhere (North Korea mainly), and this instability, or lack of attention to detail in regard to plutonium and enriched uranium, also increases the chances of terrorists obtaining a nuclear weapon. Pakistan's notorious Dr. Khan is an example of how easily the materials for making nuclear weapons can get out of hand. In the future, terrorism may come from places where we aren't expecting it.

In any case, terrorism is less of a national security threat than a public health threat, though it's potentially a very big public health threat. The most undesirable future is the use of many nuclear weapons in a short period of time, which only Russia, the U.S. and China can bring about, because of the existence of so many weapons in these particular nuclear arsenals. These arsenals are the biggest security threat in the world today.

To close this nuclear essay I'll just mention one other historical April 12 event related to The Day After Trinity. Robert Oppenheimer's security hearing in Washington D.C. began on April 12, 1954, exactly 50 years ago today. Oppenheimer turned 50 ten days later, and you can be sure he didn't have a happy 50th birthday, sitting through a day of testimony. See below for an excerpt from the April 22 session.
      
---DWT, April 12, 2004.  Updated once (October 13, 2004).  Oppenheimer was born April 22, 1904.
-------------------------------------------------------------------------------------------------------------------------------

"In the Matter of J. Robert Oppenheimer, Transcript of Hearing before Personnel Security Board, Washington D.C., April 12, 1954 through May 6, 1954." The hearing took place at the Atomic Energy Commission, Building T-3, Room 2022. Usually 7 or 8 people were in the room, including the Board, two lawyers for the AEC, and Oppenheimer and his two or three lawyers. The Personnel Security Board consisted of Gordon Gray, Ward T. Evans (a chemistry professor) and Thomas A. Morgan. The following is an excerpt from testimony given on April 22 by Norris Bradbury, a Navy physicist who worked on the Manhattan Project during the war, and who became director of Los Alamos after Oppenheimer resigned in October 1945. Like most of the witnesses, he testified in favor of Oppenheimer, but the testimony got a little weirder than usual--the hearing itself in retrospect was titanically weird--about 12 pages into the transcript for April 22. The part that is not unusual for the hearing, and is likely to again be a contentious issue in these times of the PATRIOT Act, is the question of loyalty to one's country versus loyalty to friends.

...
Dr. EVANS. Do you think that scientific men as a rule are rather peculiar individuals?
The WITNESS. When did I stop beating my wife?
Mr. GRAY. Especially chemistry professors?
Dr. EVANS. No, physics professors.
The WITNESS. Scientists are human beings. I think as a class, because their basic task is concerned with the exploration of the facts of nature, understanding, this is a quality of mind philosophy--a scientist wants to know. He wants to know correctly and truthfully and precisely. By this token it seems to me he is more likely than not to be interested in a number of fields, but to be interested in them from the point of view of exploration. What is in them? What do they have to offer. What is their truth. ... Therefore I think you are likely to find among people who have imaginative minds in the scientific field, individuals who are also willing, eager, to look at a number of other fields with the same type of interest, willingness to examine, to be convinced and without a priori convictions as to rightness or wrongness, that this constant or that curve or this or that function is fatal.
I think the same sort of willingness to explore other areas of human activity is probably characteristic. If this makes them peculiar, I think it is probably a desirable peculiarity.
Dr. EVANS. You didn't do that, did you?
The WITNESS. Well---
Dr. EVANS. You didn't investigate these subversive organizations, did you?
The WITNESS. No. Perhaps my interest lay along other lines. I don't think one has to investigate all these political systems.
Dr. EVANS. Do you go fishing and things like that?
The WITNESS. Yes, I have done a number of things. Some people, and perhaps myself among them, I was an experimental physicist in those days, and I was very much preoccupied with my own investigations.
Dr. EVANS. But that didn't make you peculiar, did it?
The WITNESS. This I would have to leave to others to say.
Dr. EVANS. Younger people sometimes make mistakes, don't they?
The WITNESS. I think this is part of people's growing up.
Dr. EVANS. We all do.
...
Dr. EVANS. You spoke of loyalty. Would you put loyalty to your country above loyalty to your friends?
The WITNESS. I would.
Dr. EVANS. That is all I have.

REDIRECT EXAMINATION

By Mr. SILVERMAN: [one of Oppenheimer's lawyers]
Q. Doctor, from your knowledge of Dr. Oppenheimer, today, do you think he would put loyalty to his country above loyalty to a friend?
A. I believe he would.
Mr. SILVERMAN. That is all.
...
(This excerpt is from pages 491 and 492 of the 992-page transcript.)