Dog Brothers Public Forum
  Show Posts
30251  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Environmental issues on: March 13, 2007, 08:50:03 AM
Today's NY Times:

Published: March 13, 2007
Hollywood has a thing for Al Gore and his three-alarm film on global warming, “An Inconvenient Truth,” which won an Academy Award for best documentary. So do many environmentalists, who praise him as a visionary, and many scientists, who laud him for raising public awareness of climate change.

But part of his scientific audience is uneasy. In talks, articles and blog entries that have appeared since his film and accompanying book came out last year, these scientists argue that some of Mr. Gore’s central points are exaggerated and erroneous. They are alarmed, some say, at what they call his alarmism.

“I don’t want to pick on Al Gore,” Don J. Easterbrook, an emeritus professor of geology at Western Washington University, told hundreds of experts at the annual meeting of the Geological Society of America. “But there are a lot of inaccuracies in the statements we are seeing, and we have to temper that with real data.”

Mr. Gore, in an e-mail exchange about the critics, said his work made “the most important and salient points” about climate change, if not “some nuances and distinctions” scientists might want. “The degree of scientific consensus on global warming has never been stronger,” he said, adding, “I am trying to communicate the essence of it in the lay language that I understand.”

Although Mr. Gore is not a scientist, he does rely heavily on the authority of science in “An Inconvenient Truth,” which is why scientists are sensitive to its details and claims.

Criticisms of Mr. Gore have come not only from conservative groups and prominent skeptics of catastrophic warming, but also from rank-and-file scientists like Dr. Easterbrook, who told his peers that he had no political ax to grind. A few see natural variation as more central to global warming than heat-trapping gases. Many appear to occupy a middle ground in the climate debate, seeing human activity as a serious threat but challenging what they call the extremism of both skeptics and zealots.

Kevin Vranes, a climatologist at the Center for Science and Technology Policy Research at the University of Colorado, said he sensed a growing backlash against exaggeration. While praising Mr. Gore for “getting the message out,” Dr. Vranes questioned whether his presentations were “overselling our certainty about knowing the future.”

Typically, the concern is not over the existence of climate change, or the idea that the human production of heat-trapping gases is partly or largely to blame for the globe’s recent warming. The question is whether Mr. Gore has gone beyond the scientific evidence.

“He’s a very polarizing figure in the science community,” said Roger A. Pielke Jr., an environmental scientist who is a colleague of Dr. Vranes at the University of Colorado center. “Very quickly, these discussions turn from the issue to the person, and become a referendum on Mr. Gore.”

“An Inconvenient Truth,” directed by Davis Guggenheim, was released last May and took in more than $46 million, making it one of the top-grossing documentaries ever. The companion book by Mr. Gore quickly became a best seller, reaching No. 1 on the New York Times list.

Mr. Gore depicted a future in which temperatures soar, ice sheets melt, seas rise, hurricanes batter the coasts and people die en masse. “Unless we act boldly,” he wrote, “our world will undergo a string of terrible catastrophes.”

He clearly has supporters among leading scientists, who commend his popularizations and call his science basically sound. In December, he spoke in San Francisco to the American Geophysical Union and got a reception fit for a rock star from thousands of attendees.

“He has credibility in this community,” said Tim Killeen, the group’s president and director of the National Center for Atmospheric Research, a top group studying climate change. “There’s no question he’s read a lot and is able to respond in a very effective way.”

Some backers concede minor inaccuracies but see them as reasonable for a politician. James E. Hansen, an environmental scientist, director of NASA’s Goddard Institute for Space Studies and a top adviser to Mr. Gore, said, “Al does an exceptionally good job of seeing the forest for the trees,” adding that Mr. Gore often did so “better than scientists.”

Still, Dr. Hansen said, the former vice president’s work may hold “imperfections” and “technical flaws.” He pointed to hurricanes, an icon for Mr. Gore, who highlights the devastation of Hurricane Katrina and cites research suggesting that global warming will cause both storm frequency and deadliness to rise. Yet this past Atlantic season produced fewer hurricanes than forecasters predicted (five versus nine), and none that hit the United States.

“We need to be more careful in describing the hurricane story than he is,” Dr. Hansen said of Mr. Gore. “On the other hand,” Dr. Hansen said, “he has the bottom line right: most storms, at least those driven by the latent heat of vaporization, will tend to be stronger, or have the potential to be stronger, in a warmer climate.”

In his e-mail message, Mr. Gore defended his work as fundamentally accurate. “Of course,” he said, “there will always be questions around the edges of the science, and we have to rely upon the scientific community to continue to ask and to challenge and to answer those questions.”

He said “not every single adviser” agreed with him on every point, “but we do agree on the fundamentals” — that warming is real and caused by humans.

Mr. Gore added that he perceived no general backlash among scientists against his work. “I have received a great deal of positive feedback,” he said. “I have also received comments about items that should be changed, and I have updated the book and slideshow to reflect these comments.” He gave no specifics on which points he had revised.

He said that after 30 years of trying to communicate the dangers of global warming, “I think that I’m finally getting a little better at it.”

While reviewers tended to praise the book and movie, vocal skeptics of global warming protested almost immediately. Richard S. Lindzen, a climatologist at the Massachusetts Institute of Technology and a member of the National Academy of Sciences, who has long expressed skepticism about dire climate predictions, accused Mr. Gore in The Wall Street Journal of “shrill alarmism.”

Some of Mr. Gore’s centrist detractors point to a report last month by the Intergovernmental Panel on Climate Change, a United Nations body that studies global warming. The panel went further than ever before in saying that humans were the main cause of the globe’s warming since 1950, part of Mr. Gore’s message that few scientists dispute. But it also portrayed climate change as a slow-motion process.

It estimated that the world’s seas in this century would rise a maximum of 23 inches — down from earlier estimates. Mr. Gore, citing no particular time frame, envisions rises of up to 20 feet and depicts parts of New York, Florida and other heavily populated areas as sinking beneath the waves, implying, at least visually, that inundation is imminent.

Bjorn Lomborg, a statistician and political scientist in Denmark long skeptical of catastrophic global warming, said in a syndicated article that the panel, unlike Mr. Gore, had refrained from scaremongering. “Climate change is a real and serious problem” that calls for careful analysis and sound policy, Dr. Lomborg said. “The cacophony of screaming,” he added, “does not help.”

So too, a report last June by the National Academies seemed to contradict Mr. Gore’s portrayal of recent temperatures as the highest in the past millennium. Instead, the report said, current highs appeared unrivaled since only 1600, the tail end of a temperature rise known as the medieval warm period.

Roy Spencer, a climatologist at the University of Alabama, Huntsville, said on a blog that Mr. Gore’s film did “indeed do a pretty good job of presenting the most dire scenarios.” But the June report, he added, shows “that all we really know is that we are warmer now than we were during the last 400 years.”

Other critics have zeroed in on Mr. Gore’s claim that the energy industry ran a “disinformation campaign” that produced false discord on global warming. The truth, he said, was that virtually all unbiased scientists agreed that humans were the main culprits. But Benny J. Peiser, a social anthropologist in Britain who runs the Cambridge-Conference Network, or CCNet, an Internet newsletter on climate change and natural disasters, challenged the claim of scientific consensus with examples of pointed disagreement.

“Hardly a week goes by,” Dr. Peiser said, “without a new research paper that questions part or even some basics of climate change theory,” including some reports that offer alternatives to human activity for global warming.

Geologists have documented age upon age of climate swings, and some charge Mr. Gore with ignoring such rhythms.

“Nowhere does Mr. Gore tell his audience that all of the phenomena that he describes fall within the natural range of environmental change on our planet,” Robert M. Carter, a marine geologist at James Cook University in Australia, said in a September blog. “Nor does he present any evidence that climate during the 20th century departed discernibly from its historical pattern of constant change.”

In October, Dr. Easterbrook made similar points at the geological society meeting in Philadelphia. He hotly disputed Mr. Gore’s claim that “our civilization has never experienced any environmental shift remotely similar to this” threatened change.

Nonsense, Dr. Easterbrook told the crowded session. He flashed a slide that showed temperature trends for the past 15,000 years. It highlighted 10 large swings, including the medieval warm period. These shifts, he said, were up to “20 times greater than the warming in the past century.”

Getting personal, he mocked Mr. Gore’s assertion that scientists agreed on global warming except those industry had corrupted. “I’ve never been paid a nickel by an oil company,” Dr. Easterbrook told the group. “And I’m not a Republican.”

Biologists, too, have gotten into the act. In January, Paul Reiter, an active skeptic of global warming’s effects and director of the insects and infectious diseases unit of the Pasteur Institute in Paris, faulted Mr. Gore for his portrayal of global warming as spreading malaria.

“For 12 years, my colleagues and I have protested against the unsubstantiated claims,” Dr. Reiter wrote in The International Herald Tribune. “We have done the studies and challenged the alarmists, but they continue to ignore the facts.”

Michael Oppenheimer, a professor of geosciences and international affairs at Princeton who advised Mr. Gore on the book and movie, said that reasonable scientists disagreed on the malaria issue and other points that the critics had raised. In general, he said, Mr. Gore had distinguished himself for integrity.

“On balance, he did quite well — a credible and entertaining job on a difficult subject,” Dr. Oppenheimer said. “For that, he deserves a lot of credit. If you rake him over the coals, you’re going to find people who disagree. But in terms of the big picture, he got it right.”

30252  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / What's so Funny? on: March 13, 2007, 08:46:12 AM
Today's NY Times:

So there are these two muffins baking in an oven. One of them yells, “Wow, it’s hot in here!”
And the other muffin replies: “Holy cow! A talking muffin!”

Did that alleged joke make you laugh? I would guess (and hope) not. But under different circumstances, you would be chuckling softly, maybe giggling, possibly guffawing. I know that’s hard to believe, but trust me. The results are just in on a laboratory test of the muffin joke.

Laughter, a topic that stymied philosophers for 2,000 years, is finally yielding to science. Researchers have scanned brains and tickled babies, chimpanzees and rats. They’ve traced the evolution of laughter back to what looks like the primal joke — or, to be precise, the first stand-up routine to kill with an audience of primates.

It wasn’t any funnier than the muffin joke, but that’s not surprising, at least not to the researchers. They’ve discovered something that eluded Plato, Aristotle, Hobbes, Kant, Schopenhauer, Freud and the many theorists who have tried to explain laughter based on the mistaken premise that they’re explaining humor.

Occasionally we’re surprised into laughing at something funny, but most laughter has little to do with humor. It’s an instinctual survival tool for social animals, not an intellectual response to wit. It’s not about getting the joke. It’s about getting along.

When Robert R. Provine tried applying his training in neuroscience to laughter 20 years ago, he naïvely began by dragging people into his laboratory at the University of Maryland, Baltimore County, to watch episodes of “Saturday Night Live” and a George Carlin routine. They didn’t laugh much. It was what a stand-up comic would call a bad room.

So he went out into natural habitats — city sidewalks, suburban malls — and carefully observed thousands of “laugh episodes.” He found that 80 percent to 90 percent of them came after straight lines like “I know” or “I’ll see you guys later.” The witticisms that induced laughter rarely rose above the level of “You smell like you had a good workout.”

“Most prelaugh dialogue,” Professor Provine concluded in “Laughter,” his 2000 book, “is like that of an interminable television situation comedy scripted by an extremely ungifted writer.”

He found that most speakers, particularly women, did more laughing than their listeners, using the laughs as punctuation for their sentences. It’s a largely involuntary process. People can consciously suppress laughs, but few can make themselves laugh convincingly.

“Laughter is an honest social signal because it’s hard to fake,” Professor Provine says. “We’re dealing with something powerful, ancient and crude. It’s a kind of behavioral fossil showing the roots that all human beings, maybe all mammals, have in common.”

The human ha-ha evolved from the rhythmic sound — pant-pant — made by primates like chimpanzees when they tickle and chase one another while playing. Jaak Panksepp, a neuroscientist and psychologist at Washington State University, discovered that rats emit an ultrasonic chirp (inaudible to humans without special equipment) when they’re tickled, and they like the sensation so much they keep coming back for more tickling.

He and Professor Provine figure that the first primate joke — that is, the first action to produce a laugh without physical contact — was the feigned tickle, the same kind of coo-chi-coo move parents make when they thrust their wiggling fingers at a baby. Professor Panksepp thinks the brain has ancient wiring to produce laughter so that young animals learn to play with one another. The laughter stimulates euphoria circuits in the brain and also reassures the other animals that they’re playing, not fighting.

“Primal laughter evolved as a signaling device to highlight readiness for friendly interaction,” Professor Panksepp says. “Sophisticated social animals such as mammals need an emotionally positive mechanism to help create social brains and to weave organisms effectively into the social fabric.”

Humans are laughing by the age of four months and then progress from tickling to the Three Stooges to more sophisticated triggers for laughter (or, in some inexplicable cases, to Jim Carrey movies). Laughter can be used cruelly to reinforce a group’s solidarity and pride by mocking deviants and insulting outsiders, but mainly it’s a subtle social lubricant. It’s a way to make friends and also make clear who belongs where in the status hierarchy.


Which brings us back to the muffin joke. It was inflicted by social psychologists at Florida State University on undergraduate women last year, during interviews for what was ostensibly a study of their spending habits. Some of the women were told the interviewer would be awarding a substantial cash prize to a few of the participants, like a boss deciding which underling deserved a bonus.

The women put in the underling position were a lot more likely to laugh at the muffin joke (and others almost as lame) than were women in the control group. But it wasn’t just because these underlings were trying to manipulate the boss, as was demonstrated in a follow-up experiment.

This time each of the women watched the muffin joke being told on videotape by a person who was ostensibly going to be working with her on a task. There was supposed to be a cash reward afterward to be allocated by a designated boss. In some cases the woman watching was designated the boss; in other cases she was the underling or a co-worker of the person on the videotape.

When the woman watching was the boss, she didn’t laugh much at the muffin joke. But when she was the underling or a co-worker, she laughed much more, even though the joke-teller wasn’t in the room to see her. When you’re low in the status hierarchy, you need all the allies you can find, so apparently you’re primed to chuckle at anything even if it doesn’t do you any immediate good.

“Laughter seems to be an automatic response to your situation rather than a conscious strategy,” says Tyler F. Stillman, who did the experiments along with Roy Baumeister and Nathan DeWall. “When I tell the muffin joke to my undergraduate classes, they laugh out loud.”

Mr. Stillman says he got so used to the laughs that he wasn’t quite prepared for the response at a conference in January, although he realizes he should have expected it.

“It was a small conference attended by some of the most senior researchers in the field,” he recalls. “When they heard me, a lowly graduate student, tell the muffin joke, there was a really uncomfortable silence. You could hear crickets.”

30253  DBMA Espanol / Espanol Discussion / Re: Mexico on: March 13, 2007, 08:31:45 AM
Geopolitical Diary: U.S.-Mexican Relations Changing

U.S. President George W. Bush is scheduled to meet with Mexican President Felipe Calderon on Tuesday in Mexico -- the last stop of Bush's Latin American tour. The agenda for the meeting is predictable; issues to be discussed include trade, security, counternarcotics programs and the polarizing immigration and border control debate.

Bush's trip has focused on political alliances, and his stop in Mexico is no different. Mexico has traditionally been an ally, but tensions have recently risen over border and immigration policies. Smoothing these tensions and reaffirming Mexico's long-term status as a U.S. ally is the driving motivation behind the U.S. president's visit.

However, though Bush is arriving in Mexico with a largely rhetorical agenda, his counterpart could meet him with a much stiffer proposition.

Since taking office in December 2006, Calderon has aggressively approached his presidency. His presidential campaign called for massive reforms, and he has wasted no time pursuing them. He already has announced plans for a constitutional redraft and a controversial reform of Mexican state-run oil company Petroleos Mexicanos (Pemex) and already has launched a massive multistate offensive to counter narcotics trafficking. Though the attack against drug cartels has not severely impacted their operations, it has won Calderon domestic support; with recent approval ratings ranging from 58 percent to 73 percent, it is clear that Mexicans approve of Calderon's boldness. And since the Mexican government depends on oil money, Calderon desperately needs this approval to push through the Pemex reform.

Bush might not be prepared to meet with a bold Mexican president; former Mexican President Vicente Fox rarely challenged Bush and reveled in a close friendship with his U.S. counterpart. And while Calderon has not disparaged U.S.-Mexican ties, he has made it clear that he is not interested in helping to repair U.S.-Latin American relations, noting that the United States has to "regain respect" in the region.

Mexico has long demanded increased attention -- and a solution -- to the immigration debate. But a visit between Bush and Calderon will have little, if any, impact on the immigration front. Bush's hands are all but tied -- he faces an opposition Congress and a populace deeply divided on the issue at home -- and he is not in a position to settle the immigration issue, much less to do so in a way that Mexico would desire.

Calderon knows this as well as Bush does, and is not expecting a sudden shift in U.S. immigration and border policies. A breakthrough on the immigration front at this point is not plausible, but with this visit Calderon can earn himself a few more approval points at home.

Though Mexico's close relationship with the United States is not likely to change in the near future -- trade, security issues and proximity will tie the nations together indefinitely -- the Mexican government is no longer interested in pushing the U.S. agenda in Latin America. Calderon is ready to be an independent leader, and Bush could find him to be less of an ally than expected.
30254  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Music on: March 12, 2007, 11:36:17 PM


Is It Live ... or Yamaha? Channeling Glenn Gould

A “reperformance” of Glenn Gould’s famous 1955 mono rendition of Bach’s “Goldberg” Variations played at the Yamaha piano studios in New York on March 7.

NY Times
Published: March 12, 2007

Was that relief I felt as the piano was playing? A feeling that some worry had been alleviated or a fear quieted? Why then was it also mixed with disappointment, as if some deep yearning had been thwarted? Not yet, not yet, not yet: relief and frustration intertwined.

For months I had deliberately avoided listening. A technologically oriented, musically sophisticated company, Zenph Studios, claimed that it could bring the voices of the musical dead back to life. It could achieve, that is, what technology has long dreamt of: It would make light of the material world and all its restrictions.

Zenph claimed it could take a 50-year-old mono recording and distill from its hiss-laden, squished sound all of the musical information that originally went into it. It wouldn’t “process” the recording to get rid of noise; it wouldn’t pretend to turn mono into stereo; it wouldn’t try to correct things that were sonically “wrong.” Instead the claim was that it would, using its proprietary software, learn from recorded sound precisely how an instrument — a piano, for starters — was played, with what force a key was struck, how far down the sustain pedal was pressed, when each finger moved, how each note was weighted in a complex chord and what sort of timbre was actually produced.

Then it would effectively recreate the instrument. A digital file encoded with this information would be read by Yamaha’s advanced Disklavier Pro — a computerized player piano — and transformed into music. A recorded piano becomes a played piano. This would be sonic teleportation, monochromatic forms reincarnated as three-dimensional sound — not colorization but re-creation.
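Zenph’s actual software and file format are proprietary, but the pipeline the article describes — note onsets, key velocities, and pedal positions extracted from audio, serialized into a digital file a Disklavier can replay — maps naturally onto a Standard MIDI File, which is what computerized player pianos read. As a purely illustrative sketch (the event values below are invented, not taken from the Gould encoding), here is a minimal stdlib-only writer for such a file:

```python
import struct

def vlq(n):
    """Encode an integer as a MIDI variable-length quantity."""
    out = [n & 0x7F]
    n >>= 7
    while n:
        out.append((n & 0x7F) | 0x80)
        n >>= 7
    return bytes(reversed(out))

def write_midi(events, ticks_per_beat=480):
    """events: list of (absolute_tick, status_bytes), e.g. note-on/off,
    pedal control changes. Returns the bytes of a format-0 MIDI file."""
    track = b""
    last = 0
    for tick, msg in sorted(events, key=lambda e: e[0]):
        track += vlq(tick - last) + msg   # delta time, then the event
        last = tick
    track += vlq(0) + b"\xff\x2f\x00"     # end-of-track meta event
    return (b"MThd" + struct.pack(">IHHH", 6, 0, 1, ticks_per_beat)
            + b"MTrk" + struct.pack(">I", len(track)) + track)

# Hypothetical fragment: sustain pedal down, one mezzo-forte G4 for a beat.
events = [
    (0,   bytes([0xB0, 64, 127])),  # control change: sustain pedal down
    (0,   bytes([0x90, 67, 64])),   # note on, pitch 67 (G4), velocity 64
    (480, bytes([0x80, 67, 0])),    # note off one beat later
    (480, bytes([0xB0, 64, 0])),    # pedal up
]
data = write_midi(events)
```

The hard part of Zenph’s work is of course everything this sketch takes as given: recovering those velocities and timings from a hiss-laden 1955 mono recording in the first place.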

Zenph also announced it had accomplished this feat of technological legerdemain with one of the most remarkable recordings of the last century: Glenn Gould’s 1955 mono rendition of Bach’s “Goldberg” Variations. Gould, who retreated from performance into the private realm of the recording studio where he could splice and fiddle with sound and phrase, would be posthumously pulled back into the realm of public performance.

Gould believed technology liberated performer and listener. Here this pianist, who died in 1982, would be freed from the ultimate constraint.

And indeed, last September in Toronto, Zenph gave a public “reperformance” of Gould’s “Goldbergs” on a specially prepared Yamaha Disklavier. Zenph’s “Goldbergs” inspired a standing ovation from the audience members, many of whom knew Gould and some of whom had heard him play live. The press reports glowed.

Then one day last week Zenph — which took its name from “Senf,” the German word for mustard — brought a press demonstration of its “Goldbergs” to Yamaha’s New York piano studios, playing portions of the work both on the Disklavier and from its recording, due to be released at the end of May on Sony BMG Masterworks.

Before the demonstration I returned to the 1955 recording, which I had not heard for several years. I was swept away again. This is not spiritual playing, plumbing the profundity of Bach’s meditations; it is ecstatic, uncanny in its intoxication. The recording is skittish, illuminating, thrilling and extraordinarily physical: the playing seeps into muscles as well as ears; every phrase exerts the pressure and play of dance.

John Q. Walker, Zenph’s president, knows this as well. He is a brilliant software engineer (who did important work in computer networking) and a musician who speaks of his enterprise with impassioned fervor. Last week, when he started the Yamaha instrument playing his encodings of Gould, something thrilling really did take place. The piano produced sounds that were indisputably human and unmistakably Gouldian. The playing could not have come from any other pianist.

But wait. ... Gould’s recorded piano sound is dry, as if each note were squeezed free of moisture. The phrases quiver; connections between notes are tensile, as if they were being held together by sinews. But at the demonstration the sound was often plump, rotund, even bell-like. That is partly the character of Yamaha pianos. And isn’t that a problem? Any great pianist will adjust a performance to the instrument, treating one with a “wet sound” differently from one with more sharply etched qualities, phrasing differently, even adjusting tempo. This difference in instruments limits Zenph’s claims; it also seemed to slacken the music’s sinews.

Presumably though the recording — done on another Yamaha that the piano technician, Marc Wienert, voiced to resemble Gould’s old Steinway — would have a better effect. Yet it leaves a similar impression. Is this some psychoacoustic phenomenon then, some disorientation caused by close familiarity with the old mono sound? When recordings were first becoming widely available at the turn of the 20th century, there were demonstrations in concert halls in which singers would begin a song, and a hidden gramophone with its amplifying horn would complete it. One London newspaper reported: “The most sensitive ear could not detect the slightest difference between the tone of the singer and the tone of the mechanical device.”

Bizarre. But am I experiencing something in reverse, treating sonic antiquity with reverence and not recognizing musical similarities? We all learn languages of listening, ways of interpreting reproductions, imagining full-size orchestras emerging from clock radios, ignoring hisses or distortions, compensating for flaws.

Does the new instrumentation seem less convincing because it disrupts the old familiar language of listening? I don’t think so. In Zenph’s recording, the music’s tensile line really is loosened. I admire what I hear and might not even realize what was missing without comparing, but I am not intoxicated with Gould’s exuberance or infected with his ecstatic amazement. The music is the same, yet not the same.

Of course one might say, “How could it be otherwise?” Think of the kinds of processing and analysis that had to be done: filtering out Gould’s hums or groans, isolating the sound of the piano with all its intricate overtones, taking into account the way sound was compressed or altered by every microphone, processor or wire it passed through.

Then there’s another step: “reverse engineering” the sound, as if reconstructing the instrument that created it. Then another: producing the music from yet another instrument, Yamaha’s Disklavier. And another: recording the music yet again.

The process is mind-bogglingly complex. And at every moment there are also human decisions — adjustments of the piano, musical alterations. Perhaps over time both human practice and technological possibilities will evolve further, leaving fewer distinctions. A recording by Art Tatum is due next from Zenph, along with other recordings from Sony BMG Masterworks’ rich archives.

But why all this effort? (Five man-months for a “reperformance,” as Mr. Walker explained.) Partly perhaps because contemporary sound is considered preferable and marketable. Partly because, as Zenph’s Web site points out, the great recordings of the past are passing into the public domain. The European Union allows just 50 years of protection — and this is a way of maintaining proprietary control.

But is the result really musically superior? It could only be that if there were absolutely nothing lost and every difference were an improvement; neither is the case. This is a disappointment then, though one that is exhilarating in its enterprise and promise.

The disappointment is also a relief. For had Zenph succeeded, there would have been a severe price. Had that really been Gould’s sound coming from the piano, it would have dealt a severe blow indeed to an ancient prejudice: that music, in all its complexity, is beyond the reach of the merely technical, and that it belongs, in creation and interpretation, to humanity’s ever-shrinking domain. Relief: no Gouldian robotics. Yet.

Connections, a critic’s perspective on arts and ideas, appears every other Monday.
30255  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Nuclear War? on: March 12, 2007, 11:02:57 PM
Report: Dirty bomb materials still available

Government reports ‘limited progress’ securing nuclear material worldwide

By Lisa Myers & the NBC News Investigative Unit

WASHINGTON - International inspectors working in the former Soviet republic of Georgia last summer tracked down dangerous radiological materials in an abandoned military complex.

It was an important mission. But a new report by U.S. government watchdogs says a parallel effort overseas by the U.S. Department of Energy has made only "limited progress securing many of the most dangerous sources" — waste disposal sites and abandoned generators across Russia, each with enough material for several devastating dirty bombs.
The new report by the Government Accountability Office says that DOE is doing an admirable job securing low-risk radiological sources — the proverbial low-hanging fruit — at the expense of more dangerous materials that remain vulnerable to terrorists.
“Many of the highest-risk and most dangerous sources still remain unsecured, particularly in Russia,” the GAO writes. “Specifically, 16 of 20 waste storage sites across Russia and Ukraine remain unsecured while more than 700 RTGs [radioisotope thermoelectric generators] remain operational or abandoned in Russia and are vulnerable to theft or potential misuse.”

RTGs can contain up to 250,000 curies of Strontium-90. Experts say an explosion dispersing that much Strontium-90 could contaminate a wide area.
"You would cause a significant contamination over a square mile — many, many city blocks, and with the right city blocks, Wall Street or the White House,” says Leonard S. Spector, deputy director of the Center for Nonproliferation Studies at the Monterey Institute. “The impact could be very devastating.”
A test explosion by U.S. scientists working at the Sandia National Labs near Albuquerque, N.M., showed how a dirty bomb works: Conventional explosives spread the radioactive material, which can contaminate large areas.
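
As a rough back-of-the-envelope check (my own sketch, not part of the report), the 250,000-curie figure quoted above can be converted into SI units and into the implied mass of Strontium-90, using the isotope's standard half-life of about 28.8 years:

```python
# Sanity-checking the 250,000-curie RTG figure: total decay rate in
# becquerels, and the implied mass of pure Sr-90 from its half-life.
import math

CI_TO_BQ = 3.7e10             # 1 curie = 3.7e10 decays/s (by definition)
HALF_LIFE_S = 28.8 * 3.156e7  # Sr-90 half-life, ~28.8 years, in seconds
AVOGADRO = 6.022e23
MASS_NUMBER = 90              # grams per mole of Sr-90

activity_bq = 250_000 * CI_TO_BQ                  # total decay rate
lam = math.log(2) / HALF_LIFE_S                   # decay constant (1/s)
specific_activity = lam * AVOGADRO / MASS_NUMBER  # Bq per gram of Sr-90
mass_kg = activity_bq / specific_activity / 1000  # implied source mass

print(f"{activity_bq:.2e} Bq from roughly {mass_kg:.1f} kg of Sr-90")
```

Roughly 10^16 decays per second from under two kilograms of material, which is why an unguarded RTG is treated as dirty-bomb feedstock.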

The new report says the DOE has focused most of its energies in the last three years on securing small sources of radioactive materials in Russia and abroad — largely found in medical equipment stored in doctors’ offices.
Meanwhile, the report says, major waste disposal sites sit protected by primitive fences. And more than 700 generators are vulnerable to terrorists.

"If you look at the past six months, we see, I think, an upsurge in criminal and terrorist activity using radioactive materials,” says Charles Ferguson, a science and technology fellow at the Council on Foreign Relations.
Last year, according to the International Atomic Energy Agency, there were 85 confirmed thefts or losses of nuclear or radioactive materials worldwide — mostly small amounts. Most of those materials have not been recovered.
Last fall, al-Qaida’s leader in Iraq called on militant scientists to create dirty bombs to be tested on U.S. bases in Iraq.

“I am disturbingly concerned about this because it can grow into a huge threat,” says Sen. Daniel Akaka, D-Hawaii, who will chair a hearing Tuesday on the issue at a subcommittee of the Senate Committee on Homeland Security. “These generators are sources that can be used for dirty bombs, and [they are] there for the taking. I feel that DOE is not meeting the priority of our nation in security.”
The report also criticizes the DOE for a “steady” decline in its budget for the International Radiological Threat Reduction program. It says, “[F]uture funding is uncertain because the agency places a higher priority on securing special nuclear material” than on protecting dirty-bomb material.

The DOE points out that the GAO report also applauds its efforts in many areas. The agency also says it has made progress, having upgraded security at 500 sites in more than 40 countries. DOE officials say they are now moving to secure more of those high-risk generators and waste sites in Russia, and that their budget request for next year represents a slight increase.

“DOE and the National Nuclear Security Administration are committed to securing and removing vulnerable radiological sources around the world,” says Andrew Bieniawski, who heads up the DOE’s Global Threat Reduction Initiative, run under the National Nuclear Security Administration.
As of January, the agency has spent approximately $120 million to secure vulnerable radiological sources, an expenditure that demonstrates a strong commitment to a program that has produced tangible results and reduced the risks of terrorists acquiring the materials to make a dirty bomb, Bieniawski said.
30256  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Iran on: March 12, 2007, 10:37:19 PM
Iran, Russia: Nuclear Reactors and Geopolitics

Russian President Vladimir Putin on March 12 jumped into the dispute over Russia's construction of a nuclear reactor in Iran, explicitly telling state press that all work will be suspended until the Iranians resume their payments. The message between the lines is clear: Russia will not complete the Bushehr reactor -- or at least not while Putin remains president.


Russian President Vladimir Putin on March 12 personally ordered the suspension of any transfers of nuclear materials and technologies to Iran's Bushehr nuclear power plant project, ostensibly because of Iran's unwillingness to meet its payment schedule for the project. The idea that Iran, currently flush with petrodollars and facing down the U.N. Security Council over its nuclear program, would choose this moment to stop paying its primary political backer, Russia, is an odd one.

The reality is that Putin has no intention of ever completing the Bushehr project.

The Bushehr project dates to 1995, when the Russians agreed to build it for Iran, and was supposed to be completed by 1999. In theory, aside from some simple -- if essential -- component installation, the facility has been ready since 2004. Now, pushing three years later, the project remains a white elephant, and the Russians are claiming the Iranians are not paying for their services.

The nuclear card has been among Iran's most reliable means of drawing Washington's attention and pushing the Americans to take Tehran's concerns over the future of Iraq seriously, so Putin's announcement has delivered the Iranians a strong blow. If a junior minister or representative of a state firm were to insist that a bogus payment problem existed, it easily could be written off as bureaucratic stubbornness or the payment getting lost in the mail. Not so when a president -- particularly one as sober, controlling and exacting as Putin -- puts his personal seal on the policy. Bushehr is not going to be finished.

This does not eliminate Iran's nuclear card. Tehran still has its uranium conversion program at Isfahan, its uranium enrichment program at Natanz, and a heavy-water reactor under construction at Arak, but these facilities are not under regular international inspections, and moreover have direct uses in a nuclear weapons program. (Though uranium power reactors such as Bushehr can be used in a weapons program, they require extensive additional support infrastructure first.) It is far more difficult to convince the West -- and especially the Europeans, who are less inclined to view Iranian plans as nefarious -- that these facilities are all for the peaceful development of nuclear energy when one's power plant is not getting off the ground.

Ultimately, it is all political. Russia uses Bushehr as a means of injecting its influence into the Middle East, positioning itself as an impossible-to-ignore go-between for the West and Iran. So long as the facility is under construction, Moscow has maximized its leverage with all parties.

Should the facility ever come on line, however, Moscow will lose hugely. First, the West would be furious with Russia for giving Iran functional nuclear technology, severely damaging Russian relations with the West. Second, with Bushehr operational, neither the West nor Iran would need to keep talking to Russia about the Iranian nuclear power program. Third, Iran is not a natural Russian ally. The two have fought in a number of wars and actively compete for influence in Azerbaijan and Turkmenistan. A nuclear-armed Iran is actually more of a long-term threat to Russia than it is to the United States, which a strategist like Putin knows well.

Not even in the case of a breach in U.S.-Russian relations -- and those relations are not exactly in tip-top shape -- will Putin change this policy. There is only one conceivable policy evolution in Russia that would allow Iran access to Russian nuclear technology: regime change that saw the ejection of Putin and his inner circle of pragmatists in favor of Russia's siloviki.

The siloviki are a loosely aligned group of Russian nationalists and ultranationalists who dominate the country's military, intelligence and foreign policy apparatus and share the goal of resurrecting Russia as a great power. One of the siloviki's most glaring weaknesses is that they consider anything bad for the United States by definition good for Russia. Many siloviki have declared their support for proliferating nuclear technology far and wide in order to complicate U.S. efforts globally.

Under a siloviki government, therefore, Russia might actually give Iran what it needs to make Bushehr operational -- and perhaps even more -- but not until then.
30257  DBMA Espanol / Espanol Discussion / Re: Mexico on: March 12, 2007, 02:40:27 PM
Court Papers Show How 'Iron River' of Guns Flows Into Mexico

Monday , March 12, 2007

MESA, Ariz. —
Human and drug-smuggling organizations in Mexico are getting their guns from the same places law-abiding U.S. citizens are getting theirs: licensed gun dealers and gun shows, according to court documents.

"There's an iron river of guns flowing to Mexico," said special agent Thomas Mangan, spokesman for the Phoenix office of the U.S. Bureau of Alcohol, Tobacco, Firearms and Explosives.

Search warrant affidavits show smugglers are getting guns from "straw purchasers," people with clean records who buy guns for smugglers, who then sneak them across the border for a few hundred dollars.  Records show the weaponry is bought from legitimate dealers in U.S. cities from Tucson to Scottsdale and Apache Junction to Avondale.

On Jan. 21, agents with the U.S. Bureau of Immigration and Customs Enforcement arrested Cedric Lloyd Manuel and Miguel Apodaca of Phoenix with nine assault rifles at the Arizona-Mexico border.  The guns had been bought the day before at gun stores in Apache Junction, Scottsdale and Phoenix. They were purchased by three brothers, Lucio, Rosendo and Marcos Aguilar.  Between November and the Jan. 21 arrests at the border, the Aguilars and others in the straw-purchasing crew bought 66 assault rifles, records show.

"Manuel (Aguilar) stated that he had taken probably about 20 loads of firearms into Mexico over the past couple of months," ATF special agent Heidi Peterson wrote in the affidavit.  The Aguilar family, Manuel Apodaca and the alleged ringleader, Blas Bustamante, have been charged in U.S. District Court with gun violations.  Mangan said the value of guns triples across the border.

He said Mexican crime organizations use the same infrastructure for smuggling humans and drugs north as they do to move the guns south.
He said the agency is working on a number of Arizona gun trafficking investigations while they also work with Mexican authorities to trace guns used in crimes across the border.

One such crime was the shooting of Ramon Tacho Verdugo, the 49-year-old police chief of Agua Prieta, Sonora, who was gunned down as he left the police station Feb. 26.
30258  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Israel, and its neighbors on: March 12, 2007, 11:01:59 AM

U.S./SAUDI ARABIA: The United States will hold separate talks with Israel and Saudi Arabia before an Arab League summit in Riyadh in late March in order to come to a compromise on the so-called Saudi initiative for the settlement of the Arab-Israeli conflict, Israeli daily Haaretz reported.
30259  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Afghanistan-Pakistan on: March 12, 2007, 10:59:29 AM
AFGHANISTAN: Swiss weekly newspaper SonntagsBlick reported that former Taliban Defense Minister Mullah Obaidullah Akhund, who was captured in February in Pakistan, was set free after only two days. While the report has not been confirmed by Pakistan, a SonntagsBlick reporter allegedly met with the former leader Feb. 28.

PAKISTAN: More than 20 people were injured when riot police clashed with 3,000 lawyers in Pakistan. The lawyers were striking to protest the suspension of Chief Justice Iftikhar Muhammad Chaudhry. The strike affected superior and lower courts all over the country.

AFGHANISTAN: The first joint meeting of the Pakistani-Afghan Jirga Commission began. The two-day talks are aimed at convening traditional jirga meetings on both sides of the border in an attempt to control violence in the tribal regions. These talks also will include a discussion of ways to stop illegal cross-border migration.

PAKISTAN: U.S. and Pakistani agents have arrested two suspected German terrorists in Pakistan, German magazine Der Spiegel reported. The men are accused of contacting terrorists and visiting an al Qaeda camp near the Afghan-Pakistani border.
30260  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Nuclear War? on: March 12, 2007, 10:57:02 AM
This is the first I have seen explicitly linking Russia's Bushehr plant and Iran's enrichment program.

RUSSIA/IRAN: Russia and Iran have begun talks that could last several days to settle financial issues related to the construction of Iran's Bushehr nuclear power plant, an Atomstroyexport spokesman said. The announcement came after Russian President Vladimir Putin decided to delay completion of the plant until Iran ends its uranium enrichment program.

RUSSIANS: IRAN NUKE PLANT TO BE DELAYED: The state-run Russian company building Iran's first nuclear power plant said Monday that the reactor's launch will be postponed because of Iranian payment delays. Russian media reports, meanwhile, indicated that the Kremlin was growing tired of Iran's nuclear defiance in the face of U.N. Security Council sanctions, with three agencies citing an unidentified official warning Iran to cooperate and stop playing "anti-American games."

IRAN: Iran should make concessions to the international community regarding its nuclear program to avoid additional sanctions from the U.N. Security Council, former Iranian President Mohammed Khatami said in an interview with economic daily Sanaat va Tosee.
30261  DBMA Martial Arts Forum / Martial Arts Topics / Re: Crimes using knives on: March 12, 2007, 10:31:25 AM

Hospital security staff are being equipped with stab-proof vests, shields and helmets to protect them against violent patients and relatives.
The protection is available to staff at hospitals in Cheshire in response to an unprecedented number of assaults against doctors and nurses.
Last year across the UK, there were 75,000 attacks on NHS staff - one every seven minutes.

It is estimated that the violence costs the NHS around £100,000 a year in security, time off for affected staff, and legal costs.
Nursing leaders say violence against staff is 'endemic' in the NHS, making them dangerous and 'traumatic' places to work.

Managers at North Cheshire NHS trust, which covers Halton and Warrington hospitals, decided to act after 73 assaults occurred at the hospitals in 2005/06.
A spokesman said: "Security staff have been issued with stab-proof vests. Helmets and shields are available."
Security staff are called in by doctors and nurses when situations get out of hand.
Chris Todd, the trust's security management specialist, said: "Our main priority is safeguarding the well-being of our staff and patients.
"We meet with our local police and community support officers to discuss how we can address these issues at a local level.
"The trust will not tolerate abusive physical or verbal behaviour from patients or relatives towards any of our staff.

"We keep a thorough record of all incidents and will be tracking the progress of any of these matters through the criminal justice system."
All staff attend conflict resolution training sessions, to help them recognise and defuse potentially violent situations.
The number of incidents in 2006/07 is expected to be lower than the previous year.
Hospitals across the country are taking their own measures to deal with violence on NHS wards.
In Nottingham, plain-clothed police officers have been brought in to patrol wards and step in at the first sign of trouble.

Undercover officers began patrolling the accident and emergency department at the Queen's Medical Centre in Nottingham at the weekend.
Inspector Andy Baguley of Nottinghamshire Police said violence in hospitals is a crime and should not be tolerated.
He said: "Doctors, nurses and other staff shouldn't have to put up with rowdy and abusive behaviour."
Last week the BBC's Panorama programme revealed that the vast majority of assaults never reached the courts.

Of 58,700 incidents in England in the year 2005/06, only 850 ended with a prosecution.
A survey by the Royal College of Nursing found that 80 per cent of accident and emergency nurses had suffered harassment or assault over the past year, and a quarter claimed they had been physically assaulted.
The violence problem in England prompted the Department of Health last June to unveil a crackdown on violence and verbal abuse in England's hospitals.
They promised that anyone being threatening or abusive to NHS staff would be slapped with a £1,000 fine and bosses would have the power to remove them from the premises.

Patients needing treatment would still be treated, but could later face fines or be subject to criminal action.
Last year St James Hospital in Leeds introduced police onto its wards to protect staff from physical and verbal abuse.
30262  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Health Thread (nutrition, medical, longevity, etc) on: March 12, 2007, 04:55:53 AM
AFPA [American Fitness Professionals & Associates] March 2007 Health & Fitness Newsletter Online, vol. 12 no. 3

"There are no great people in this world, only great challenges which ordinary people rise to meet."
William "Bull" Halsey, Admiral, 1882-1959

Table of Contents:

AFPA Fitness Conferences for Spring 2007
AFPA announces Pilates Fitness Instructor Level II Certification
Depression Promotes Heart Problems
Why Doctors Miss Colon Cancer
Why Exercising as You Age Becomes More Important and Challenging
Fruits and Vegetables Improve Male Fertility
Take advantage of AFPA's Resources
Use Folic Acid to Cut Heart Disease, Say

Depression Promotes Heart Problems

Depression appears to increase the development of blood vessel plaques, known as atherosclerosis, a condition that can lead to heart attack, stroke, and a host of other cardiovascular problems, according to a report in the Archives of General Psychiatry.

Patients' psychological status influences quality of life, and may also have a "significant impact" on their physical status, including cardiovascular health, Dr. Jesse C. Stewart, from Indiana University-Purdue University Indianapolis, told Reuters Health.

Stewart and colleagues evaluated the contribution of depression, anxiety, and anger to atherosclerosis among 324 men and women between 50 and 70 years old.

Symptom scoring tests evaluated the presence of depression, anxiety and anger, while the extent of atherosclerosis was assessed using an imaging test, which measured the thickness of the walls of the carotid arteries, major blood vessels in the neck that carry oxygen-rich blood to the brain.

Why Doctors Miss Colon Cancer

An interesting study underscored one more reason, among a seemingly never-ending number of them, why patients may die from the errors their doctors make: Your physician may be missing signs of colon cancer right in front of them.

Among more than 12,000 colon cancer patients, 430 patients had a new or missed tumor that was diagnosed anywhere from six months to three years after having a colonoscopy. What's more, family physicians and internists who did their own colonoscopies were generally far more prone to miss colon cancer, with women (85 percent) edging men (77 percent).

The other troublesome variable, Canadian researchers discovered, was where a colonoscopy was performed. An office setting tripled the risk of new or missed cancers among men and doubled it among women.

Fortunately, there are many natural measures you can take -- none of which have anything to do with a drug, doctor or procedure -- to prevent or fight colon cancer. A few to get you started:

Have your C-reactive protein levels checked and reduce them, if necessary.
Get the right amount of exercise.
Rebalance the ratio of omega-3 fats you consume by taking a high quality fish oil or krill oil.
Eat plenty of vegetables, ideally based on your body's unique metabolic type.

Gastroenterology, Vol. 132, No. 1, January 2007: 96-102
Yahoo News February 23, 2007

Why Exercising as You Age Becomes More Important and Challenging

A biological process called AMP-activated protein kinase (AMPK), which boosts muscles, begins to fail with advancing age. This leads to a need for increased effort to achieve the same effects from exercise, and could help explain the link between aging and type 2 diabetes.

AMPK stimulates the body to burn off fat by producing mitochondria, the power sources of cells. The skeletal muscles of athletes have been found to contain a much higher number of mitochondria, which is likely linked to AMPK activity.

When scientists compared the skeletal muscle of 3-month-old rats and 2-year-olds, they found that AMPK was significantly slowed down in older animals. In addition, the muscle of young rats who did more exercise had double the normal AMPK activity, but this effect was not nearly as strong in older rats.

Older people have more fat in their muscles and livers than younger people do. These fat cells have been linked to insulin resistance and type 2 diabetes.

Cell Metabolism February 7, 2007; 5(2): 151-
BBC News February 10, 2007
Science Daily February 7, 2007

Fruits and Vegetables Improve Male Fertility

A new study shows that eating fruits and vegetables can improve fertility in men. Researchers from the University of Rochester compared the dietary intake of antioxidants of 10 fertile and 48 infertile men and correlated the findings with sperm motility. Infertile men were twice as likely to have a low intake of fruits and vegetables (fewer than five servings per day) compared with fertile men. Also, men with the lowest overall intake of dietary antioxidants had lower sperm motility than men with higher intakes.

Lewis V, Kochman L, Herko R, Brewer K, Andolina E, Song G. Dietary antioxidants and sperm quality in infertile men. Paper presented at: Annual Scientific Meeting of the American Society for Reproductive Medicine; October 2006; New Orleans.


Use Folic Acid to Cut Heart Disease, Say

The scientific evidence is strong enough to justify using folic acid as a cheap and simple way of reducing heart disease and strokes.

Debate continues over whether raised blood levels of homocysteine (an amino acid implicated in the development of arterial disease) cause heart disease and stroke, and whether folic acid, which lowers homocysteine, will help reduce the risk of these disorders. So heart expert Dr. David Wald and colleagues set out to clarify the issue. They examined all the evidence from different studies to see whether raised homocysteine is a cause of cardiovascular disease.

Some studies looked at homocysteine and the occurrence of heart attacks and strokes in large numbers of people (cohort studies), some focused on people with a common genetic variant which increases homocysteine levels to a small extent (genetic studies), while others tested the effects of lowering homocysteine levels (randomised controlled trials).

The conclusion that homocysteine is a cause of cardiovascular disease explains the observations from all the different types of study, even if the results from one type of study are, on their own, insufficient to reach that conclusion, say the authors.

Since folic acid reduces homocysteine concentrations, it follows that increasing folic acid consumption will reduce the risk of heart attack and stroke. They therefore take the view that the evidence is now sufficient to justify action on lowering homocysteine concentrations, although the position should be reviewed as evidence from ongoing clinical trials emerges.

BMJ Volume 333, pp 1114-7
Source: Diabetes In Control
30263  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Nuclear War? on: March 11, 2007, 11:13:38 AM
Here's the URL:
30264  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Internet and related technology on: March 11, 2007, 09:43:49 AM
Perhaps I'm stretching the term "related technology" but I didn't know where else to put this and it didn't deserve its own thread:

Tired of getting recorded messages?  This site gives 500 contact numbers and instructions as to how to get a live human for customer service.

30265  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Dark Energy Part Two on: March 11, 2007, 09:28:05 AM

The challenge with dark energy, as opposed to dark matter, is even more difficult. Dark energy is whatever it is that’s making the expansion of the universe accelerate, but, for instance, does it change over time and space? If so, then cosmologists have a name for it: quintessence. Does it not change? In that case, they’ll call it the cosmological constant, a version of the mathematical fudge factor that Einstein originally inserted into the equations for relativity to explain why the universe had neither expanded nor contracted itself out of existence.

After the discovery of dark energy, Perlmutter concluded that the next generation of dark-energy telescopes would have to include a space-based observatory. But the search for financing for such an ambitious project can require as much forbearance as the search for dark energy itself. “I don’t think I’ve ever seen as much of Washington as I have in the last few years,” he says, sighing. Even if his Supernova Acceleration Probe didn’t now face competition from several other proposals for federal financing (including, perhaps inevitably, one involving his old rival Riess), delays have prevented it from being ready to launch until at least the middle of the next decade. “Ten years from now,” says Josh Frieman of the University of Chicago, “when we’re talking about spending on the order of a billion dollars to put something up in space — which I think we should do — you’re getting into that class where you’re spending real money.”

Even some cosmologists have begun to express reservations. At a conference at Durham University in England last summer, a “whither cosmology?” panel featuring some of the field’s most prominent names questioned the wisdom of concentrating so much money and manpower on one problem. They pointed to what happened when the government-sponsored Dark Energy Task Force solicited proposals for experiments a couple of years ago. The task force was expecting a dozen, according to one member. They got three dozen. Cosmology was choosing a “risky and not very cost-effective way of moving forward,” one Durham panelist told me later, summarizing the sentiment he heard there.

But even if somebody were to figure out whether or not dark energy changes across time and space, astronomers still wouldn’t know what dark energy itself is. “The term doesn’t mean anything,” said David Schlegel of Lawrence Berkeley National Laboratory this past fall. “It might not be dark. It might not be energy. The whole name is a placeholder. It’s a placeholder for the description that there’s something funny that was discovered eight years ago now that we don’t understand.” Not that theorists haven’t been trying. “It’s just nonstop,” Perlmutter told me. “There’s article after article after article.” He likes to begin public talks with a PowerPoint illustration: papers on dark energy piling up, one on top of the next, until the on-screen stack ascends into the dozens. All the more reason not to put all of cosmology’s eggs into one research basket, argued the Durham panelists. As one summarized the situation, “We don’t even have a hypothesis to test.”

Michael Turner won’t hear of it. “This is one of these godsend problems!” he says. “If you’re a scientist, you’d like to be around when there’s a great problem to work on and solve. The solution is not obvious, and you could imagine it being solved tomorrow, you could imagine it taking another 10 years or you could imagine it taking another 200 years.”

But you could also imagine it taking forever.

“Time to get serious.” The PowerPoint slide, teal letters popping off a black background, stared back at a hotel ballroom full of cosmologists. They gathered in Chicago last winter for a “New Views of the Universe” conference, and Sean Carroll, then at the University of Chicago, had taken it upon himself to give his theorist colleagues their marching orders.

“There was a heyday for talking out all sorts of crazy ideas,” Carroll, now at Caltech, recently explained. That heyday would have been the heady, post-1998 period when Michael Turner might stand up at a conference and turn to anyone voicing caution and say, “Can’t we be exuberant for a while?” But now has come the metaphorical morning after, and with it a sobering realization: Maybe the universe isn’t simple enough for dummies like us humans. Maybe it’s not just our powers of perception that aren’t up to the task but also our powers of conception. Extraordinary claims like the dawn of a new universe might require extraordinary evidence, but what if that evidence has to be literally beyond the ordinary? Astronomers now realize that dark matter probably involves matter that is nonbaryonic. And whatever it is that dark energy involves, we know it’s not “normal,” either. In that case, maybe this next round of evidence will have to be not only beyond anything we know but also beyond anything we know how to know.



That possibility always gnaws at scientists — what Perlmutter calls “that sense of tentativeness, that we have gotten so far based on so little.” Cosmologists in particular have had to confront that possibility throughout the birth of their science. “At various times in the past 20 years it could have gotten to the point where there was no opportunity for advance,” Frieman says. What if, for instance, researchers couldn’t repeat the 1963 Bell Labs detection of the supposed echo from the big bang? Smoot and John C. Mather of NASA (who shared the Nobel in Physics with Smoot) designed the Cosmic Background Explorer satellite telescope to do just that. COBE looked for extremely subtle differences in temperature throughout all of space that carry the imprint of the universe when it was less than a second old. And in 1992, COBE found them: in effect, the quantum fluctuations that 13.7 billion years later would coalesce into a universe that is 22 percent dark matter, 74 percent dark energy and 4 percent the stuff of us.

And if the right ripples hadn’t shown up? As Frieman puts it: “You just would have thrown up your hands and said, ‘My God, we’ve got to go back to the drawing board!’ What’s remarkable to me is that so far that hasn’t happened.”

Yet in a way it has. In the observation-and-theory, call-and-response system of investigating nature that scientists have refined over the past 400 years, the dark side of the universe represents a disruption. General relativity helped explain the observations of the expanding universe, which led to the idea of the big bang, which anticipated the observations of the cosmic-microwave background, which led to the revival of Einstein’s cosmological constant, which anticipated the observations of supernovae, which led to dark energy. And dark energy is ... ?

The difficulty in answering that question has led some cosmologists to ask an even deeper question: Does dark energy even exist? Or is it perhaps an inference too far? Cosmologists have another saying they like to cite: “You get to invoke the tooth fairy only once,” meaning dark matter, “but now we have to invoke the tooth fairy twice,” meaning dark energy.

One of the most compelling arguments that cosmologists have for the existence of dark energy (whatever it is) is that unlike earlier inferences that physicists eventually had to abandon — the ether that 19th-century physicists thought pervaded space, for instance — this inference makes mathematical sense. Take Perlmutter’s and Riess’s observations of supernovae, apply one cornerstone of 20th-century physics, general relativity, and you have a universe that does indeed consist of .26 matter, dark or otherwise, and .74 something that accelerates the expansion. Yet in another way, dark energy doesn’t add up. Take the observations of supernovae, apply the other cornerstone of 20th-century physics, quantum theory, and you get gibberish — you get an answer 120 orders of magnitude larger than .74.
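
The "120 orders of magnitude" remark can be made concrete with a short calculation (my own sketch, not from the article; H0 = 70 km/s/Mpc and the .74 share are assumed round values): compare the naive quantum-field vacuum-energy scale, the Planck density, against the dark-energy density inferred from observation.

```python
# The cosmological-constant mismatch: naive quantum vacuum-energy scale
# (the Planck density) versus the observed dark-energy density.
import math

G    = 6.674e-11   # gravitational constant (m^3 kg^-1 s^-2)
c    = 2.998e8     # speed of light (m/s)
hbar = 1.055e-34   # reduced Planck constant (J s)
MPC  = 3.086e22    # metres per megaparsec

H0 = 70e3 / MPC                            # Hubble constant, in 1/s
rho_crit = 3 * H0**2 / (8 * math.pi * G)   # critical density (kg/m^3)
rho_dark = 0.74 * rho_crit                 # the observed .74 share

rho_planck = c**5 / (hbar * G**2)          # naive vacuum-energy density

gap = math.log10(rho_planck / rho_dark)    # orders of magnitude apart
print(f"mismatch: about 10^{gap:.0f}")
```

Depending on conventions the exponent comes out near 120 to 123; either way, the quantum-theory estimate overshoots the observed value absurdly, which is the "gibberish" the article describes.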

Which doesn’t mean that dark energy is the ether of our age. But it does mean that its implications extend beyond cosmology to a problem Einstein spent the last 30 years of his life trying to reconcile: how to unify his new physics of the very large (general relativity) with the new physics of the very small (quantum mechanics). What makes the two incompatible — where the physics breaks down — is gravity.

In physics, gravity is the ur-inference. Even Newton admitted that he was making it up as he went along. That a force of attraction might exist between two distant objects, he once wrote in a letter, is “so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it.” Yet fall into it we all do on a daily basis, and physicists are no exception. “I don’t think we really understand what gravity is,” Vera Rubin says. “So in some sense we’re doing an awful lot on something we don’t know much about.”

It hasn’t escaped the notice of astronomers that both dark matter and dark energy involve gravity. Early this year 50 physicists gathered for a “Rethinking Gravity” conference at the University of Arizona to discuss variations on general relativity. “So far, Einstein is coming through with flying colors,” says Sean Carroll, who was one of the gravity-defying participants. “He’s always smarter than you think he was.”

But he’s not necessarily inviolate. “We’ve never tested gravity across the whole universe before,” Riess pointed out during a news conference last year. “It may be that there’s not really dark energy, that that’s a figment of our misperception about gravity, that gravity actually changes the way it operates on long ranges.”

The only way out, cosmologists and particle physicists agree, would be a “new physics” — a reconciliation of general relativity and quantum mechanics. “Understanding dark energy,” Riess says, “seems to really require understanding and using both of those theories at the same time.”


“It’s been so hard that we’re even willing to consider listening to string theorists,” Perlmutter says, referring to work that posits numerous dimensions beyond the traditional (one of time and three of space). “They’re at least providing a language in which you can talk about both things at the same time.”

According to quantum theory, particles can pop into and out of existence. In that case, maybe the universe itself was born in one such quantum pop. And if one universe can pop into existence, then why not many universes? String theorists say that number could be 10 raised to the power of 500. Those are 10-with-500-zeros universes, give or take. In which case, our universe would just happen to be the one with an energy density of .74, a condition suitable for the existence of creatures that can contemplate their hyper-Copernican existence.

And this is just one of a number of theories that have been popping into existence, quantum-particle-like, in the past few years: parallel universes, intersecting universes or, in the case of Stephen Hawking and Thomas Hertog just last summer, a superposition of universes. But what evidence — extraordinary or otherwise — can anyone offer for such claims? The challenge is to devise an experiment that would do for a new physics what COBE did for the big bang. Predictions in string theory, as in the 10-to-the-power-of-500-universes hypothesis, depend on the existence of extra dimensions, a stipulation that just might put the burden back on particle physics — specifically, the hope that evidence of extra dimensions will emerge in the Large Hadron Collider, or perhaps in its proposed successor, the International Linear Collider, which might come online sometime around 2020, or maybe in the supercollider after that, if the industrial nations of 2030 decide they can afford it.

“You want your mind to be boggled,” Perlmutter says. “That is a pleasure in and of itself. And it’s more a pleasure if it’s boggled by something that you can then demonstrate is really, really true.”

And if you can’t demonstrate that it’s really, really true?

“If the brilliant idea doesn’t come along,” Riess says, “then we will say dark energy has exactly these properties, it acts exactly like this. And then” — a shrug — “we will put it in a box.” And there it will remain, residing perhaps not far from the box labeled “Dark Matter,” and the two of them bookending the biggest box of them all, “Gravity,” to await a future Newton or Einstein to open — or not.
30266  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Physics & Mathematics on: March 11, 2007, 09:27:20 AM
Dark Energy
NY Times

Three days after learning that he won the 2006 Nobel Prize in Physics, George Smoot was talking about the universe. Sitting across from him in his office at the University of California, Berkeley, was Saul Perlmutter, a fellow cosmologist and a probable future Nobelist in Physics himself. Bearded, booming, eyes pinwheeling from adrenaline and lack of sleep, Smoot leaned back in his chair. Perlmutter, onetime acolyte, longtime colleague, now heir apparent, leaned forward in his.

“Time and time again,” Smoot shouted, “the universe has turned out to be really simple.”

Perlmutter nodded eagerly. “It’s like, why are we able to understand the universe at our level?”

“Right. Exactly. It’s a universe for beginners! ‘The Universe for Dummies’!”

But as Smoot and Perlmutter know, it is also inarguably a universe for Nobelists, and one that in the past decade has become exponentially more complicated. Since the invention of the telescope four centuries ago, astronomers have been able to figure out the workings of the universe simply by observing the heavens and applying some math, and vice versa. Take the discovery of moons, planets, stars and galaxies, apply Newton’s laws and you have a universe that runs like clockwork. Take Einstein’s modifications of Newton, apply the discovery of an expanding universe and you get the big bang. “It’s a ridiculously simple, intentionally cartoonish picture,” Perlmutter said. “We’re just incredibly lucky that that first try has matched so well.”

But is our luck about to run out? Smoot’s and Perlmutter’s work is part of a revolution that has forced their colleagues to confront a universe wholly unlike any they have ever known, one that is made of only 4 percent of the kind of matter we have always assumed it to be — the material that makes up you and me and this magazine and all the planets and stars in our galaxy and in all 125 billion galaxies beyond. The rest — 96 percent of the universe — is ... who knows?

“Dark,” cosmologists call it, in what could go down in history as the ultimate semantic surrender. This is not “dark” as in distant or invisible. This is “dark” as in unknown for now, and possibly forever.

If so, such a development would presumably not be without philosophical consequences of the civilization-altering variety. Cosmologists often refer to this possibility as “the ultimate Copernican revolution”: not only are we not at the center of anything; we’re not even made of the same stuff as most of the rest of everything. “We’re just a bit of pollution,” Lawrence M. Krauss, a theorist at Case Western Reserve, said not long ago at a public panel on cosmology in Chicago. “If you got rid of us, and all the stars and all the galaxies and all the planets and all the aliens and everybody, then the universe would be largely the same. We’re completely irrelevant.”

All well and good. Science is full of homo sapiens-humbling insights. But the trade-off for these lessons in insignificance has always been that at least now we would have a deeper — simpler — understanding of the universe. That the more we could observe, the more we would know. But what about the less we could observe? What happens to new knowledge then? It’s a question cosmologists have been asking themselves lately, and it might well be a question we’ll all be asking ourselves soon, because if they’re right, then the time has come to rethink a fundamental assumption: When we look up at the night sky, we’re seeing the universe.

Not so. Not even close.

In 1963, two scientists at Bell Labs in New Jersey discovered a microwave signal that came from every direction of the heavens. Theorists at nearby Princeton University soon realized that this signal might be the echo from the beginning of the universe, as predicted by the big-bang hypothesis. Take the idea of a cosmos born in a primordial fireball and cooling down ever since, apply the discovery of a microwave signal with a temperature that corresponded precisely to the one that was predicted by theorists — 2.7 degrees above absolute zero — and you have the universe as we know it. Not Newton’s universe, with its stately, eternal procession of benign objects, but Einstein’s universe, violent, evolving, full of births and deaths, with the grandest birth and, maybe, death belonging to the cosmos itself.

But then, in the 1970s, astronomers began noticing something that didn’t seem to fit with the laws of physics. They found that spiral galaxies like our own Milky Way were spinning at such a rate that they should have long ago wobbled out of control, shredding apart, shedding stars in every direction. Yet clearly they had done no such thing. They were living fast but not dying young. This seeming paradox led theorists to wonder if a halo of a hypothetical something else might be cocooning each galaxy, dwarfing each flat spiral disk of stars and gas at just the right mass ratio to keep it gravitationally intact. Borrowing a term from the astronomer Fritz Zwicky, who detected the same problem with the motions of a whole cluster of galaxies back in the 1930s, decades before anyone else took the situation seriously, astronomers called this mystery mass “dark matter.”


So there was more to the universe than meets the eye. But how much more? This was the question Saul Perlmutter’s team at Lawrence Berkeley National Laboratory set out to answer in the late 1980s. Actually, they wanted to settle an issue that had been nagging astronomers ever since Edwin Hubble discovered in 1929 that the universe seems to be expanding. Gravity, astronomers figured, would be slowing the expansion, and the more matter the greater the gravitational effect. But was the amount of matter in the universe enough to slow the expansion until it eventually stopped, reversed course and collapsed in a backward big bang? Or was the amount of matter not quite enough to do this, in which case the universe would just go on expanding forever? Just how much was the expansion of the universe slowing down?

The tool the team would be using was a specific type of exploding star, or supernova, that reaches a roughly uniform brightness and so can serve as what astronomers call a standard candle. By comparing how bright supernovae appear and how much the expansion of the universe has shifted their light, cosmologists sought to determine the rate of the expansion. “I was trying to tell everybody that this is the measurement that everybody should be doing,” Perlmutter says. “I was trying to convince them that this is going to be the tool of the future.” Perlmutter talks like a microcassette on fast-forward, and he possesses the kind of psychological dexterity that allows him to walk into a room and instantly inhabit each person’s point of view. He can be as persuasive as any force of nature. “The next thing I know,” he says, “we’ve convinced people, and now they’re competing with us!”
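The standard-candle logic can be sketched in a few lines. This is an illustrative calculation, not the teams' actual analysis pipeline: because a Type Ia supernova's absolute magnitude M is roughly known (about -19.3 is a commonly quoted figure, not from the article), its apparent magnitude m gives its distance through the inverse-square law, expressed as the distance modulus.

```python
import math

def distance_modulus(d_parsecs):
    """mu = m - M = 5 * log10(d / 10 pc), from the inverse-square law."""
    return 5 * math.log10(d_parsecs / 10)

def apparent_magnitude(M_abs, d_parsecs):
    """Predicted apparent magnitude of a standard candle at a given distance."""
    return M_abs + distance_modulus(d_parsecs)

# A Type Ia supernova (absolute magnitude ~ -19.3) at 100 megaparsecs:
m = apparent_magnitude(-19.3, 100e6)  # distance modulus 35, so m = 15.7
```

Supernovae that came out systematically fainter than such a relation predicted at their redshifts were the anomaly both teams stumbled on.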

By 1997, Perlmutter’s Supernova Cosmology Project and a rival team had amassed data from more than 50 supernovae between them — data that would reveal yet another oddity in the cosmos. Perlmutter noticed that the supernovae weren’t brighter than expected but dimmer. He wondered if he had made a mistake in his observations. A few months later, Adam Riess, a member of a rival international team, noticed the same general drift in his math and wondered the same thing. “I’m a postdoc,” he told himself. “I’m sure I’ve messed up in at least 10 different ways.” But Perlmutter double-checked for intergalactic dust that might have skewed his readings, and Riess cross-checked his math, calculation by calculation, with his team leader, Brian Schmidt. Early in 1998, the two teams announced that they had each independently reached the same conclusion, and it was the opposite of what either of them expected. The rate of the expansion of the universe was not slowing down. Instead, it seemed to be speeding up.
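In general relativity the slowing-versus-speeding question has a compact statement. In standard Friedmann cosmology (not spelled out in the article), the acceleration of the cosmic scale factor a depends on both density and pressure:

```latex
\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\,(\rho + 3p)

% Ordinary matter (p = 0) always gives \ddot{a} < 0: deceleration.
% A component with equation of state p = w\rho and w < -1/3
% makes \ddot{a} > 0: the expansion speeds up, as observed.
```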

That same year, Michael Turner, the prominent University of Chicago theorist, delivered a paper in which he called this antigravitational force “dark energy.” The purpose of calling it “dark,” he explained recently, was to highlight the similarity to dark matter. The purpose of “energy” was to make a distinction. “It really is very different from dark matter,” Turner said. “It’s more energylike.”

More energylike how, exactly?

Turner raised his eyebrows. “I’m not embarrassed to say it’s the most profound mystery in all of science.”

“Extraordinary claims,” Carl Sagan once said, “require extraordinary evidence.” Astronomers love that saying; they quote it all the time. In this case the claim could have hardly been more extraordinary: a new universe was dawning.

It wouldn’t be the first time. We once thought the night sky consisted of the several thousand objects we could see with the naked eye. But the invention of the telescope revealed that it didn’t, and that the farther we saw, the more we saw: planets, stars, galaxies. After that we thought the night sky consisted of only the objects the eye could see with the assistance of telescopes that reached all the way back to the first stars blinking to life. But the discovery of wavelengths beyond the optical revealed that it didn’t, and that the more we saw in the radio or infrared or X-ray parts of the electromagnetic spectrum, the more we discovered: evidence for black holes, the big bang and the distances of supernovae, for starters.


The difference with “dark,” however, is that it lies not only outside the visible but also beyond the entire electromagnetic spectrum. By all indications, it consists of data that our five senses can’t detect other than indirectly. The motions of galaxies don’t make sense unless we infer the existence of dark matter. The brightness of supernovae doesn’t make sense unless we infer the existence of dark energy. It’s not that inference can’t be a powerful tool: an apple falls to the ground, and we infer gravity. But it can also be an incomplete tool: gravity is ... ?

Dark matter is ... ? In the three decades since most astronomers decisively, if reluctantly, accepted the existence of dark matter, observers have eliminated the obvious answer: that dark matter is made of normal matter that is so far away or so dim that it can’t be seen from earth. To account for the dark-matter deficit, this material would have to be so massive and so numerous that we couldn’t possibly miss it.

Which leaves abnormal matter, or what physicists call nonbaryonic matter, meaning that it doesn’t consist of the protons and neutrons of “normal” matter. What’s more (or, perhaps more accurately, less), it doesn’t interact at all with electricity or magnetism, which is why we wouldn’t be able to see it, and it can rarely interact even with protons and neutrons, which is why trillions of these particles might be passing through you every second without your knowing it. Theorists have narrowed the search for dark-matter particles to two hypothetical candidates: the axion and the neutralino. But so far efforts to create one of these ghostly particles in accelerators, which mimic the high levels of energy in the first fraction of a second after the birth of the universe, have come up empty. So have efforts to catch one in ultrasensitive detectors, which number in the dozens around the world.

For now, dark-matter physicists are hanging their hopes on the Large Hadron Collider, the latest-generation subatomic-particle accelerator, which goes online later this year at the European Center for Nuclear Research on the Franco-Swiss border. Many cosmologists think that the L.H.C. has made the creation of a dark-matter particle — as George Smoot said, holding up two fingers — “this close.” But one of the pioneer astronomers investigating dark matter in the 1970s, Vera Rubin, says that she has lived through plenty of this kind of optimism; she herself predicted in 1980 that dark matter would be identified within a decade. “I hope he’s right,” she says of Smoot’s assertion. “But I think it’s more a wish than a belief.” As one particle physicist commented at a “Dark Universe” symposium at the Space Telescope Science Institute in Baltimore a few years ago, “If we fail to see anything in the L.H.C., then I’m off to do something else,” adding, “Unfortunately, I’ll be off to do something else at the same time as hundreds of other physicists.”

Juan Collar might be among them. “I know I speak for a generation of people who have been looking for dark-matter particles since they were grad students,” he said one wintry afternoon in his University of Chicago office. “I doubt how many of us will remain in the field if the L.H.C. brings home bad news. I have been looking for dark-matter particles for more than 15 years. I’m 42. So most of my colleagues, my age, we are kind of going through a midlife crisis.” He laughed. “When we get together and we drink enough beer, we start howling at the moon.”

Although many scientists say that the existence of the axion will be proved or disproved within the next 10 years — as a result of work at Lawrence Livermore National Laboratory — the detection of a neutralino one way or the other is much less certain. A negative result from an experiment might mean only that theorists haven’t thought hard enough or that observers haven’t looked deep enough. “It could very well be that Mother Nature has decided that the neutralino is way down there,” Collar said, pointing not to a graph that he taped up in his office but to a point below the sheet of paper itself, at the blank wall. “If that is the case,” he went on to say, “we should retreat and worship Mother Nature. These particles maybe exist, but we will not see them, our sons will not see them and their sons won’t see them.”

30267  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Neuroscience: The Brain on the Stand: Part Three on: March 11, 2007, 09:20:18 AM

Greely acknowledges that lie-detection and memory-retrieval technologies like this could pose a serious challenge to our freedom of thought, which is now defended largely by the First Amendment protections for freedom of expression. “Freedom of thought has always been buttressed by the reality that you could only tell what someone thought based on their behavior,” he told me. “This technology holds out the possibility of looking through the skull and seeing what’s really happening, seeing the thoughts themselves.” According to Greely, this may challenge the principle that we should be held accountable for what we do, not what we think. “It opens up for the first time the possibility of punishing people for their thoughts rather than their actions,” he says. “One reason thought has been free in the harshest dictatorships is that dictators haven’t been able to detect it.” He adds, “Now they may be able to, putting greater pressure on legal constraints against government interference with freedom of thought.”

In the future, neuroscience could also revolutionize the way jurors are selected. Steven Laken, the president of Cephos, says that jury consultants might seek to put prospective jurors in f.M.R.I.’s. “You could give videotapes of the lawyers and witnesses to people when they’re in the magnet and see what parts of their brains light up,” he says. A situation like this would raise vexing questions about jurors’ prejudices — and what makes for a fair trial. Recent experiments have suggested that people who believe themselves to be free of bias may harbor plenty of it all the same.

The experiments, conducted by Elizabeth Phelps, who teaches psychology at New York University, combine brain scans with a behavioral test known as the Implicit Association Test, or I.A.T., as well as physiological tests of the startle reflex. The I.A.T. flashes pictures of black and white faces at you and asks you to associate various adjectives with the faces. Repeated tests have shown that white subjects take longer to respond when they’re asked to associate black faces with positive adjectives and white faces with negative adjectives than vice versa, and this is said to be an implicit measure of unconscious racism. Phelps and her colleagues added neurological evidence to this insight by scanning the brains and testing the startle reflexes of white undergraduates at Yale before they took the I.A.T. She found that the subjects who showed the most unconscious bias on the I.A.T. also had the highest activation in their amygdalas — a center of threat perception — when unfamiliar black faces were flashed at them in the scanner. By contrast, when subjects were shown pictures of familiar black and white figures — like Denzel Washington, Martin Luther King Jr. and Conan O’Brien — there was no jump in amygdala activity.
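The I.A.T. measure described above boils down to a reaction-time difference between the two pairing conditions. A minimal sketch (illustrative only: real I.A.T. scoring uses a more elaborate algorithm, and these reaction times are invented):

```python
def mean(xs):
    return sum(xs) / len(xs)

def iat_effect(incompatible_ms, compatible_ms):
    """Difference in mean response time (milliseconds) between the
    'incompatible' pairing (e.g. black faces + positive adjectives)
    and the 'compatible' pairing. A positive value is read as an
    implicit-bias effect."""
    return mean(incompatible_ms) - mean(compatible_ms)

# Invented reaction times for one hypothetical subject:
effect = iat_effect([820, 790, 810], [690, 710, 700])  # 320/3 ms, about 107 ms
```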

The legal implications of the new experiments involving bias and neuroscience are hotly disputed. Mahzarin R. Banaji, a psychology professor at Harvard who helped to pioneer the I.A.T., has argued that there may be a big gap between the concept of intentional bias embedded in law and the reality of unconscious racism revealed by science. When the gap is “substantial,” she and the U.C.L.A. law professor Jerry Kang have argued, “the law should be changed to comport with science” — relaxing, for example, the current focus on intentional discrimination and trying to root out unconscious bias in the workplace with “structural interventions,” which critics say may be tantamount to racial quotas. One legal scholar has cited Phelps’s work to argue for the elimination of peremptory challenges to prospective jurors — if most whites are unconsciously racist, the argument goes, then any decision to strike a black juror must be infected with racism. Much to her displeasure, Phelps’s work has been cited by a journalist to suggest that a white cop who accidentally shot a black teenager on a Brooklyn rooftop in 2004 must have been responding to a hard-wired fear of unfamiliar black faces — a version of the amygdala made me do it.

Phelps herself says it’s “crazy” to link her work to cops who shoot on the job and insists that it is too early to use her research in the courtroom. “Part of my discomfort is that we haven’t linked what we see in the amygdala or any other region of the brain with an activity outside the magnet that we would call racism,” she told me. “We have no evidence whatsoever that activity in the brain is more predictive of things we care about in the courtroom than the behaviors themselves that we correlate with brain function.” In other words, just because you have a biased reaction to a photograph doesn’t mean you’ll act on those biases in the workplace. Phelps is also concerned that jurors might be unduly influenced by attention-grabbing pictures of brain scans. “Frank Keil, a psychologist at Yale, has done research suggesting that when you have a picture of a mechanism, you have a tendency to overestimate how much you understand the mechanism,” she told me. Defense lawyers confirm this phenomenon. “Here was this nice color image we could enlarge, that the medical expert could point to,” Christopher Plourd, a San Diego criminal defense lawyer, told The Los Angeles Times in the early 1990s. “It documented that this guy had a rotten spot in his brain. The jury glommed onto that.”


Other scholars are even sharper critics of efforts to use scientific experiments about unconscious bias to transform the law. “I regard that as an extraordinary claim that you could screen potential jurors or judges for bias; it’s mind-boggling,” I was told by Philip Tetlock, professor at the Haas School of Business at the University of California at Berkeley. Tetlock has argued that split-second associations between images of African-Americans and negative adjectives may reflect “simple awareness of the social reality” that “some groups are more disadvantaged than others.” He has also written that, according to psychologists, “there is virtually no published research showing a systematic link between racist attitudes, overt or subconscious, and real-world discrimination.” (A few studies show, Tetlock acknowledges, that openly biased white people sometimes sit closer to whites than blacks in experiments that simulate job hiring and promotion.) “A light bulb going off in your brain means nothing unless it’s correlated with a particular output, and the brain-scan stuff, heaven help us, we have barely linked that with anything,” agrees Tetlock’s co-author, Amy Wax of the University of Pennsylvania Law School. “The claim that homeless people light up your amygdala more and your frontal cortex less and we can infer that you will systematically dehumanize homeless people — that’s piffle.”

V. Are You Responsible for What You Might Do? The attempt to link unconscious bias to actual acts of discrimination may be dubious. But are there other ways to look inside the brain and make predictions about an individual’s future behavior? And if so, should those discoveries be employed to make us safer? Efforts to use science to predict criminal behavior have a disreputable history. In the 19th century, the Italian criminologist Cesare Lombroso championed a theory of “biological criminality,” which held that criminals could be identified by physical characteristics, like large jaws or bushy eyebrows. Nevertheless, neuroscientists are trying to find the factors in the brain associated with violence. PET scans of convicted murderers were first studied in the late 1980s by Adrian Raine, a professor of psychology at the University of Southern California; he found that their prefrontal cortexes, areas associated with inhibition, had reduced glucose metabolism and suggested that this might be responsible for their violent behavior. In a later study, Raine found that subjects who received a diagnosis of antisocial personality disorder, which correlates with violent behavior, had 11 percent less gray matter in their prefrontal cortexes than control groups of healthy subjects and substance abusers. His current research uses f.M.R.I.’s to study moral decision-making in psychopaths.

Neuroscience, it seems, points two ways: it can absolve individuals of responsibility for acts they’ve committed, but it can also place individuals in jeopardy for acts they haven’t committed — but might someday. “This opens up a Pandora’s box in civilized society that I’m willing to fight against,” says Helen S. Mayberg, a professor of psychiatry, behavioral sciences and neurology at Emory University School of Medicine, who has testified against the admission of neuroscience evidence in criminal trials. “If you believe at the time of trial that the picture informs us about what they were like at the time of the crime, then the picture moves forward. You need to be prepared for: ‘This spot is a sign of future dangerousness,’ when someone is up for parole. They have a scan, the spot is there, so they don’t get out. It’s carved in your brain.”

Other scholars see little wrong with using brain scans to predict violent tendencies and sexual predilections — as long as the scans are used within limits. “It’s not necessarily the case that if predictions work, you would say take that guy off the street and throw away the key,” says Hank Greely, the Stanford law professor. “You could require counseling, surveillance, G.P.S. transmitters or warning the neighbors. None of these are necessarily benign, but they beat the heck out of preventative detention.” Greely has little doubt that predictive technologies will be enlisted in the war on terror — perhaps in radical ways. “Even with today’s knowledge, I think we can tell whether someone has a strong emotional reaction to seeing things, and I can certainly imagine a friend-versus-foe scanner. If you put everyone who reacts badly to an American flag in a concentration camp or Guantánamo, that would be bad, but in an occupation situation, to mark someone down for further surveillance, that might be appropriate.”

Paul Root Wolpe, who teaches social psychiatry and psychiatric ethics at the University of Pennsylvania School of Medicine, says he anticipates that neuroscience predictions will move beyond the courtroom and will be used to make predictions about citizens in all walks of life.

“Will we use brain imaging to track kids in school because we’ve discovered that certain brain function or morphology suggests aptitude?” he asks. “I work for NASA, and imagine how helpful it might be for NASA if it could scan your brain to discover whether you have a good enough spatial sense to be a pilot.” Wolpe says that brain imaging might eventually be used to decide if someone is a worthy foster or adoptive parent — a history of major depression and cocaine abuse can leave telltale signs on the brain, for example, and future studies might find parts of the brain that correspond to nurturing and caring.


The idea of holding people accountable for their predispositions rather than their actions poses a challenge to one of the central principles of Anglo-American jurisprudence: namely, that people are responsible for their behavior, not their proclivities — for what they do, not what they think. “We’re going to have to make a decision about the skull as a privacy domain,” Wolpe says. Indeed, Wolpe serves on the board of an organization called the Center for Cognitive Liberty and Ethics, a group of neuroscientists, legal scholars and privacy advocates “dedicated to protecting and advancing freedom of thought in the modern world of accelerating neurotechnologies.”

There may be similar “cognitive liberty” battles over efforts to repair or enhance broken brains. A remarkable technique called transcranial magnetic stimulation, for example, has been used to stimulate or inhibit specific regions of the brain. It can temporarily alter how we think and feel. Using T.M.S., Ernst Fehr and Daria Knoch of the University of Zurich temporarily disrupted each side of the dorsolateral prefrontal cortex in test subjects. They asked their subjects to participate in an experiment that economists call the ultimatum game. One person is given $20 and told to divide it with a partner. If the partner rejects the proposed amount as too low, neither person gets any money. Subjects whose prefrontal cortexes were functioning properly tended to reject offers of $4 or less: they would rather get no money than accept an offer that struck them as insulting and unfair. But subjects whose right prefrontal cortexes were suppressed by T.M.S. tended to accept the $4 offer. Although the offer still struck them as insulting, they were able to suppress their indignation and to pursue the selfishly rational conclusion that a low offer is better than nothing.
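The ultimatum game described in the paragraph above can be captured in a few lines. A minimal sketch, with the $20 pot and the "offers of $4 or less get rejected" behavior taken from the article's description, and everything else illustrative:

```python
def ultimatum(offer, pot=20, rejection_threshold=5):
    """One round of the ultimatum game: the proposer offers `offer`
    dollars out of `pot`. The responder rejects any offer below the
    threshold, in which case neither player gets anything.
    Returns (proposer_payoff, responder_payoff)."""
    if offer < rejection_threshold:
        return (0, 0)               # insulting offer rejected
    return (pot - offer, offer)

# Intact right prefrontal cortex: offers of $4 or less are rejected.
fair = ultimatum(10)                            # (10, 10)
low = ultimatum(4)                              # (0, 0) -- rejected as unfair
# With the region suppressed by T.M.S., subjects accepted the low offer
# (modeled here, purely illustratively, as a lowered threshold):
low_tms = ultimatum(4, rejection_threshold=0)   # (16, 4)
```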

Some neuroscientists believe that T.M.S. may be used in the future to enforce a vision of therapeutic justice, based on the idea that defective brains can be cured. “Maybe somewhere down the line, a badly damaged brain would be viewed as something that can heal, like a broken leg that needs to be repaired,” the neurobiologist Robert Sapolsky says, although he acknowledges that defining what counts as a normal brain is politically and scientifically fraught. Indeed, efforts to identify normal and abnormal brains have been responsible for some of the darkest movements in the history of science and technology, from phrenology to eugenics. “How far are we willing to go to use neurotechnology to change people’s brains we consider disordered?” Wolpe asks. “We might find a part of the brain that seems to be malfunctioning, like a discrete part of the brain operative in violent or sexually predatory behavior, and then turn off or inhibit that behavior using transcranial magnetic stimulation.” Even behaviors in the normal range might be fine-tuned by T.M.S.: jurors, for example, could be made more emotional or more deliberative with magnetic interventions. Mark George, an adviser to the Cephos company and also director of the Medical University of South Carolina Center for Advanced Imaging Research, has submitted a patent application for a T.M.S. procedure that supposedly suppresses the area of the brain involved in lying and makes a person less capable of not telling the truth.

As the new technologies proliferate, even the neurolaw experts themselves have only begun to think about the questions that lie ahead. Can the police get a search warrant for someone’s brain? Should the Fourth Amendment protect our minds in the same way that it protects our houses? Can courts order tests of suspects’ memories to determine whether they are gang members or police informers, or would this violate the Fifth Amendment’s ban on compulsory self-incrimination? Would punishing people for their thoughts rather than for their actions violate the Eighth Amendment’s ban on cruel and unusual punishment? However astonishing our machines may become, they cannot tell us how to answer these perplexing questions. We must instead look to our own powers of reasoning and intuition, relatively primitive as they may be. As Stephen Morse puts it, neuroscience itself can never identify the mysterious point at which people should be excused from responsibility for their actions because they are not able, in some sense, to control themselves. That question, he suggests, is “moral and ultimately legal,” and it must be answered not in laboratories but in courtrooms and legislatures. In other words, we must answer it ourselves.
30268  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Neuroscience: The Brain on the Stand Part Two on: March 11, 2007, 09:19:20 AM
(Page 4 of 9)

The leading neurolaw brief in the case, filed by the American Medical Association and other groups, argued that because “adolescent brains are not fully developed” in the prefrontal regions, adolescents are less able than adults to control their impulses and should not be held fully accountable “for the immaturity of their neural anatomy.” In his majority decision, Justice Anthony Kennedy declared that “as any parent knows and as the scientific and sociological studies” cited in the briefs “tend to confirm, ‘[a] lack of maturity and an underdeveloped sense of responsibility are found in youth more often than in adults.’ ” Although Kennedy did not cite the neuroscience evidence specifically, his indirect reference to the scientific studies in the briefs led some supporters and critics to view the decision as the Brown v. Board of Education of neurolaw.

One important question raised by the Roper case was the question of where to draw the line in considering neuroscience evidence as a legal mitigation or excuse. Should courts be in the business of deciding when to mitigate someone’s criminal responsibility because his brain functions improperly, whether because of age, in-born defects or trauma? As we learn more about criminals’ brains, will we have to redefine our most basic ideas of justice?

Two of the most ardent supporters of the claim that neuroscience requires the redefinition of guilt and punishment are Joshua D. Greene, an assistant professor of psychology at Harvard, and Jonathan D. Cohen, a professor of psychology who directs the neuroscience program at Princeton. Greene got Cohen interested in the legal implications of neuroscience, and together they conducted a series of experiments exploring how people’s brains react to moral dilemmas involving life and death. In particular, they wanted to test people’s responses in the f.M.R.I. scanner to variations of the famous trolley problem, which philosophers have been arguing about for decades.

The trolley problem goes something like this: Imagine a train heading toward five people who are going to die if you don’t do anything. If you hit a switch, the train veers onto a side track and kills another person. Most people confronted with this scenario say it’s O.K. to hit the switch. By contrast, imagine that you’re standing on a footbridge that spans the train tracks, and the only way you can save the five people is to push an obese man standing next to you off the footbridge so that his body stops the train. Under these circumstances, most people say it’s not O.K. to kill one person to save five.
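What makes the divergent intuitions puzzling is that the utilitarian arithmetic is identical in both cases: one death traded for five lives. A toy tally, purely for illustration:

```python
def net_lives_saved(spared, killed):
    """Crude utilitarian score: lives saved minus lives lost."""
    return spared - killed

switch = net_lives_saved(spared=5, killed=1)      # divert the train
footbridge = net_lives_saved(spared=5, killed=1)  # push the man

print(switch == footbridge)  # True: same outcome, opposite intuitions
```

Since the scores are equal, a purely deliberative judge would treat the two cases alike; the fact that most people do not is precisely what Greene and Cohen set out to explain with the scanner.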

“I wondered why people have such clear intuitions,” Greene told me, “and the core idea was to confront people with these two cases in the scanner and see if we got more of an emotional response in one case and reasoned response in the other.” As it turns out, that’s precisely what happened: Greene and Cohen found that the brain region associated with deliberate problem solving and self-control, the dorsolateral prefrontal cortex, was especially active when subjects confronted the first trolley hypothetical, in which most of them made a utilitarian judgment about how to save the greatest number of lives. By contrast, emotional centers in the brain were more active when subjects confronted the second trolley hypothetical, in which they tended to recoil at the idea of personally harming an individual, even under such wrenching circumstances. “This suggests that moral judgment is not a single thing; it’s intuitive emotional responses and then cognitive responses that are duking it out,” Greene said.

“To a neuroscientist, you are your brain; nothing causes your behavior other than the operations of your brain,” Greene says. “If that’s right, it radically changes the way we think about the law. The official line in the law is all that matters is whether you’re rational, but you can have someone who is totally rational but whose strings are being pulled by something beyond his control.” In other words, even someone who has the illusion of making a free and rational choice between soup and salad may be deluding himself, since the choice of salad over soup is ultimately predestined by forces hard-wired in his brain. Greene insists that this insight means that the criminal-justice system should abandon the idea of retribution — the idea that bad people should be punished because they have freely chosen to act immorally — which has been the focus of American criminal law since the 1970s, when rehabilitation went out of fashion. Instead, Greene says, the law should focus on deterring future harms. In some cases, he supposes, this might mean lighter punishments. “If it’s really true that we don’t get any prevention bang from our punishment buck when we punish that person, then it’s not worth punishing that person,” he says. (On the other hand, Carter Snead, the Notre Dame scholar, maintains that capital defendants who are not considered fully blameworthy under current rules could be executed more readily under a system that focused on preventing future harms.)


(Page 5 of 9)

Others agree with Greene and Cohen that the legal system should be radically refocused on deterrence rather than on retribution. Since the celebrated M’Naghten case in 1843, involving a paranoid British assassin, English and American courts have recognized an insanity defense only for those who are unable to appreciate the difference between right and wrong. (This is consistent with the idea that only rational people can be held criminally responsible for their actions.) According to some neuroscientists, that rule makes no sense in light of recent brain-imaging studies. “You can have a horrendously damaged brain where someone knows the difference between right and wrong but nonetheless can’t control their behavior,” says Robert Sapolsky, a neurobiologist at Stanford. “At that point, you’re dealing with a broken machine, and concepts like punishment and evil and sin become utterly irrelevant. Does that mean the person should be dumped back on the street? Absolutely not. You have a car with the brakes not working, and it shouldn’t be allowed to be near anyone it can hurt.”

Even as these debates continue, some skeptics contend that both the hopes and fears attached to neurolaw are overblown. “There’s nothing new about the neuroscience ideas of responsibility; it’s just another material, causal explanation of human behavior,” says Stephen J. Morse, professor of law and psychiatry at the University of Pennsylvania. “How is this different than the Chicago school of sociology,” which tried to explain human behavior in terms of environment and social structures? “How is it different from genetic explanations or psychological explanations? The only thing different about neuroscience is that we have prettier pictures and it appears more scientific.”

Morse insists that “brains do not commit crimes; people commit crimes” — a conclusion he suggests has been ignored by advocates who, “infected and inflamed by stunning advances in our understanding of the brain . . . all too often make moral and legal claims that the new neuroscience . . . cannot sustain.” He calls this “brain overclaim syndrome” and cites as an example the neuroscience briefs filed in the Supreme Court case Roper v. Simmons to question the juvenile death penalty. “What did the neuroscience add?” he asks. If adolescent brains caused all adolescent behavior, “we would expect the rates of homicide to be the same for 16- and 17-year-olds everywhere in the world — their brains are alike — but in fact, the homicide rates of Danish and Finnish youths are very different than American youths.” Morse agrees that our brains bring about our behavior — “I’m a thoroughgoing materialist, who believes that all mental and behavioral activity is the causal product of physical events in the brain” — but he disagrees that the law should excuse certain kinds of criminal conduct as a result. “It’s a total non sequitur,” he says. “So what if there’s biological causation? Causation can’t be an excuse for someone who believes that responsibility is possible. Since all behavior is caused, this would mean all behavior has to be excused.” Morse cites the case of Charles Whitman, a man who, in 1966, killed his wife and his mother, then climbed up a tower at the University of Texas and shot and killed 13 more people before being shot by police officers. Whitman was discovered after an autopsy to have a tumor that was putting pressure on his amygdala. “Even if his amygdala made him more angry and volatile, since when are anger and volatility excusing conditions?” Morse asks. “Some people are angry because they had bad mommies and daddies and others because their amygdalas are mucked up. The question is: When should anger be an excusing condition?”

Still, Morse concedes that there are circumstances under which new discoveries from neuroscience could challenge the legal system at its core. “Suppose neuroscience could reveal that reason actually plays no role in determining human behavior,” he suggests tantalizingly. “Suppose I could show you that your intentions and your reasons for your actions are post hoc rationalizations that somehow your brain generates to explain to you what your brain has already done” without your conscious participation. If neuroscience could reveal us to be automatons in this respect, Morse is prepared to agree with Greene and Cohen that criminal law would have to abandon its current ideas about responsibility and seek other ways of protecting society.

Some scientists are already pushing in this direction. In a series of famous experiments in the 1970s and ’80s, Benjamin Libet measured people’s brain activity while telling them to move their fingers whenever they felt like it. Libet detected brain activity suggesting a readiness to move the finger half a second before the actual movement and about 400 milliseconds before people became aware of their conscious intention to move their finger. Libet argued that this leaves 100 milliseconds for the conscious self to veto the brain’s unconscious decision, or to give way to it — suggesting, in the words of the neuroscientist Vilayanur S. Ramachandran, that we have not free will but “free won’t.”
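The timing argument above is simple arithmetic on the figures Libet reported. A minimal sketch using the numbers as given in the text (the variable names are my own):

```python
MS_READINESS_BEFORE_MOVE = 500   # readiness activity detected before the movement
MS_READINESS_TO_AWARENESS = 400  # delay until conscious intention is reported

# The window left between becoming aware of the intention and the
# movement itself -- the room Libet claimed for a conscious veto.
veto_window_ms = MS_READINESS_BEFORE_MOVE - MS_READINESS_TO_AWARENESS
print(veto_window_ms)  # 100 -- Ramachandran's "free won't"
```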

Morse is not convinced that the Libet experiments reveal us to be helpless automatons. But he does think that the study of our decision-making powers could bear some fruit for the law. “I’m interested,” he says, “in people who suffer from drug addictions, psychopaths and people who have intermittent explosive disorder — that’s people who have no general rationality problem other than they just go off.” In other words, Morse wants to identify the neural triggers that make people go postal. “Suppose we could show that the higher deliberative centers in the brain seem to be disabled in these cases,” he says. “If these are people who cannot control episodes of gross irrationality, we’ve learned something that might be relevant to the legal ascription of responsibility.” That doesn’t mean they would be let off the hook, he emphasizes: “You could give people a prison sentence and an opportunity to get fixed.”


(Page 6 of 9)

IV. Putting the Unconscious on Trial

If debates over criminal responsibility long predate the f.M.R.I., so do debates over the use of lie-detection technology. What’s new is the prospect that lie detectors in the courtroom will become much more accurate, and correspondingly more intrusive. There are, at the moment, two lie-detection technologies that rely on neuroimaging, although the value and accuracy of both are sharply contested. The first, developed by Lawrence Farwell in the 1980s, is known as “brain fingerprinting.” Subjects put on an electrode-filled helmet that measures a brain wave called p300, which, according to Farwell, changes its frequency when people recognize images, pictures, sights and smells. After showing a suspect pictures of familiar places and measuring his p300 activation patterns, government officials could, at least in theory, show a suspect pictures of places he may or may not have seen before — a Qaeda training camp, for example, or a crime scene — and compare the activation patterns. (By detecting not only lies but also honest cases of forgetfulness, the technology could expand our very idea of lie detection.)

The second lie-detection technology uses f.M.R.I. machines to compare the brain activity of liars and truth tellers. It is based on a test called Guilty Knowledge, developed by Daniel Langleben at the University of Pennsylvania in 2001. Langleben gave subjects a playing card before they entered the magnet and told them to answer no to a series of questions, including whether they had the card in question. Langleben and his colleagues found that certain areas of the brain lighted up when people lied.

Two companies, No Lie MRI and Cephos, are now competing to refine f.M.R.I. lie-detection technology so that it can be admitted in court and commercially marketed. I talked to Steven Laken, the president of Cephos, which plans to begin selling its products this year. “We have two to three people who call every single week,” he told me. “They’re in legal proceedings throughout the world, and they’re looking to bolster their credibility.” Laken said the technology could have “tremendous applications” in civil and criminal cases. On the government side, he said, the technology could replace highly inaccurate polygraphs in screening for security clearances, as well as in trying to identify suspected terrorists’ native languages and close associates. “In lab studies, we’ve been in the 80- to 90-percent-accuracy range,” Laken says. This is similar to the accuracy rate for polygraphs, which are not considered sufficiently reliable to be allowed in most legal cases. Laken says he hopes to reach the 90-percent- to 95-percent-accuracy range — which should be high enough to satisfy the Supreme Court’s standards for the admission of scientific evidence. Judy Illes, director of Neuroethics at the Stanford Center for Biomedical Ethics, says, “I would predict that within five years, we will have technology that is sufficiently reliable at getting at the binary question of whether someone is lying that it may be utilized in certain legal settings.”
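The 80-to-90-percent figures invite a base-rate caveat that the accuracy number alone conceals: if lies are relatively rare among the statements tested, even an accurate test generates many false accusations. The calculation below is a standard Bayes'-rule illustration with assumed numbers, not data from the article; it treats the test's sensitivity and specificity as both equal to the quoted accuracy.

```python
def false_discovery_rate(accuracy, base_rate):
    """Fraction of 'deception' verdicts that are wrong, assuming
    sensitivity == specificity == accuracy."""
    true_pos = accuracy * base_rate              # lies correctly flagged
    false_pos = (1 - accuracy) * (1 - base_rate)  # truths wrongly flagged
    return false_pos / (true_pos + false_pos)

# At 85% accuracy, if only 1 in 10 tested statements is actually a lie,
# roughly 61% of "lying" verdicts are false positives.
print(round(false_discovery_rate(0.85, 0.10), 2))  # 0.61
```

This is one reason a jump from 85 to 95 percent accuracy matters far more in court than the ten-point difference suggests.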

If and when lie-detection f.M.R.I.’s are admitted in court, they will raise vexing questions of self-incrimination and privacy. Hank Greely, a law professor and head of the Stanford Center for Law and the Biosciences, notes that prosecution and defense witnesses might have their credibility questioned if they refused to take a lie-detection f.M.R.I., as might parties and witnesses in civil cases. Unless courts found the tests to be shocking invasions of privacy, like stomach pumps, witnesses could even be compelled to have their brains scanned. And equally vexing legal questions might arise as neuroimaging technologies move beyond telling whether or not someone is lying and begin to identify the actual content of memories. Michael Gazzaniga, a professor of psychology at the University of California, Santa Barbara, and author of “The Ethical Brain,” notes that within 10 years, neuroscientists may be able to show that there are neurological differences when people testify about their own previous acts and when they testify to something they saw. “If you kill someone, you have a procedural memory of that, whereas if I’m standing and watch you kill somebody, that’s an episodic memory that uses a different part of the brain,” he told me. Even if witnesses don’t have their brains scanned, neuroscience may lead judges and jurors to conclude that certain kinds of memories are more reliable than others because of the area of the brain in which they are processed. Further into the future, and closer to science fiction, lies the possibility of memory downloading. “One could even, just barely, imagine a technology that might be able to ‘read out’ the witness’s memories, intercepted as neuronal firings, and translate it directly into voice, text or the equivalent of a movie,” Hank Greely writes.

30269  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Neuroscience: The Brain on the Stand on: March 11, 2007, 09:18:14 AM
I. Mr. Weinstein’s Cyst

When historians of the future try to identify the moment that neuroscience began to transform the American legal system, they may point to a little-noticed case from the early 1990s. The case involved Herbert Weinstein, a 65-year-old ad executive who was charged with strangling his wife, Barbara, to death and then, in an effort to make the murder look like a suicide, throwing her body out the window of their 12th-floor apartment on East 72nd Street in Manhattan. Before the trial began, Weinstein’s lawyer suggested that his client should not be held responsible for his actions because of a mental defect — namely, an abnormal cyst nestled in his arachnoid membrane, which surrounds the brain like a spider web.

The implications of the claim were considerable. American law holds people criminally responsible unless they act under duress (with a gun pointed at the head, for example) or if they suffer from a serious defect in rationality — like not being able to tell right from wrong. But if you suffer from such a serious defect, the law generally doesn’t care why — whether it’s an unhappy childhood or an arachnoid cyst or both. To suggest that criminals could be excused because their brains made them do it seems to imply that anyone whose brain isn’t functioning properly could be absolved of responsibility. But should judges and juries really be in the business of defining the normal or properly working brain? And since all behavior is caused by our brains, wouldn’t this mean all behavior could potentially be excused?

The prosecution at first tried to argue that evidence of Weinstein’s arachnoid cyst shouldn’t be admitted in court. One of the government’s witnesses, a forensic psychologist named Daniel Martell, testified that brain-scanning technologies were new and untested, and their implications weren’t yet widely accepted by the scientific community. Ultimately, on Oct. 8, 1992, Judge Richard Carruthers issued a Solomonic ruling: Weinstein’s lawyers could tell the jury that brain scans had identified an arachnoid cyst, but they couldn’t tell jurors that arachnoid cysts were associated with violence. Even so, the prosecution team seemed to fear that simply exhibiting images of Weinstein’s brain in court would sway the jury. Eleven days later, on the morning of jury selection, they agreed to let Weinstein plead guilty in exchange for a reduced charge of manslaughter.

After the Weinstein case, Daniel Martell found himself in so much demand to testify as an expert witness that he started a consulting business called Forensic Neuroscience. Hired by defense teams and prosecutors alike, he has testified over the past 15 years in several hundred criminal and civil cases. In those cases, neuroscientific evidence has been admitted to show everything from head trauma to the tendency of violent video games to make children behave aggressively. But Martell told me that it’s in death-penalty litigation that neuroscience evidence is having its most revolutionary effect. “Some sort of organic brain defense has become de rigueur in any sort of capital defense,” he said. Lawyers routinely order scans of convicted defendants’ brains and argue that a neurological impairment prevented them from controlling themselves. The prosecution counters that the evidence shouldn’t be admitted, but under the relaxed standards for mitigating evidence during capital sentencing, it usually is. Indeed, a Florida court has held that the failure to admit neuroscience evidence during capital sentencing is grounds for a reversal. Martell remains skeptical about the worth of the brain scans, but he observes that they’ve “revolutionized the law.”

The extent of that revolution is hotly debated, but the influence of what some call neurolaw is clearly growing. Neuroscientific evidence has persuaded jurors to sentence defendants to life imprisonment rather than to death; courts have also admitted brain-imaging evidence during criminal trials to support claims that defendants like John W. Hinckley Jr., who tried to assassinate President Reagan, are insane. Carter Snead, a law professor at Notre Dame, drafted a staff working paper on the impact of neuroscientific evidence in criminal law for President Bush’s Council on Bioethics. The report concludes that neuroimaging evidence is of mixed reliability but “the large number of cases in which such evidence is presented is striking.” That number will no doubt increase substantially. Proponents of neurolaw say that neuroscientific evidence will have a large impact not only on questions of guilt and punishment but also on the detection of lies and hidden bias, and on the prediction of future criminal behavior. At the same time, skeptics fear that the use of brain-scanning technology as a kind of super mind-reading device will threaten our privacy and mental freedom, leading some to call for the legal system to respond with a new concept of “cognitive liberty.”


(Page 2 of 9)

One of the most enthusiastic proponents of neurolaw is Owen Jones, a professor of law and biology at Vanderbilt. Jones (who happens to have been one of my law-school classmates) has joined a group of prominent neuroscientists and law professors who have applied for a large MacArthur Foundation grant; they hope to study a wide range of neurolaw questions, like: Do sexual offenders and violent teenagers show unusual patterns of brain activity? Is it possible to capture brain images of chronic neck pain when someone claims to have suffered whiplash? In the meantime, Jones is turning Vanderbilt into a kind of Los Alamos for neurolaw. The university has just opened a $27 million neuroimaging center and has poached leading neuroscientists from around the world; soon, Jones hopes to enroll students in the nation’s first program in law and neuroscience. “It’s breathlessly exciting,” he says. “This is the new frontier in law and science — we’re peering into the black box to see how the brain is actually working, that hidden place in the dark quiet, where we have our private thoughts and private reactions — and the law will inevitably have to decide how to deal with this new technology.”

II. A Visit to Vanderbilt

Owen Jones is a disciplined and quietly intense man, and his enthusiasm for the transformative power of neuroscience is infectious. With René Marois, a neuroscientist in the psychology department, Jones has begun a study of how the human brain reacts when asked to impose various punishments. Informally, they call the experiment Harm and Punishment — and they offered to make me one of their first subjects.

We met in Jones’s pristine office, which is decorated with a human skull and calipers, like those that phrenologists once used to measure the human head; his father is a dentist, and his grandfather was an electrical engineer who collected tools. We walked over to Vanderbilt’s Institute of Imaging Science, which, although still surrounded by scaffolding, was as impressive as Jones had promised. The basement contains one of the few 7-tesla magnetic-resonance-imaging scanners in the world. For Harm and Punishment, Jones and Marois use a less powerful 3 tesla, which is the typical research M.R.I.

We then made our way to the scanner. After removing all metal objects — including a belt and a stray dry-cleaning tag with a staple — I put on earphones and a helmet that was shaped like a birdcage to hold my head in place. The lab assistant turned off the lights and left the room; I lay down on the gurney and, clutching a panic button, was inserted into the magnet. All was dark except for a screen flashing hypothetical crime scenarios, like this one: “John, who lives at home with his father, decides to kill him for the insurance money. After convincing his father to help with some electrical work in the attic, John arranges for him to be electrocuted. His father survives the electrocution, but he is hospitalized for three days with injuries caused by the electrical shock.” I was told to press buttons indicating the appropriate level of punishment, from 0 to 9, as the magnet recorded my brain activity.

After I spent 45 minutes trying not to move an eyebrow while assigning punishments to dozens of sordid imaginary criminals, Marois told me through the intercom to try another experiment: namely, to think of familiar faces and places in sequence, without telling him whether I was starting with faces or places. I thought of my living room, my wife, my parents’ apartment and my twin sons, trying all the while to avoid improper thoughts for fear they would be discovered. Then the experiments were over, and I stumbled out of the magnet.

The next morning, Owen Jones and I reported to René Marois’s laboratory for the results. Marois’s graduate students, who had been up late analyzing my brain, were smiling broadly. Because I had moved so little in the machine, they explained, my brain activity was easy to read. “Your head movement was incredibly low, and you were the harshest punisher we’ve had,” Josh Buckholtz, one of the grad students, said with a happy laugh. “You were a researcher’s dream come true!” Buckholtz tapped the keyboard, and a high-resolution 3-D image of my brain appeared on the screen in vivid colors. Tiny dots flickered back and forth, showing my eyes moving as they read the lurid criminal scenarios. Although I was only the fifth subject to be put in the scanner, Marois emphasized that my punishment ratings were higher than average. In one case, I assigned a 7 where the average punishment was 4. “You were focusing on the intent, and the others focused on the harm,” Buckholtz said reassuringly.


(Page 3 of 9)

Marois explained that he and Jones wanted to study the interactions among the emotion-generating regions of the brain, like the amygdala, and the prefrontal regions responsible for reason. “It is also possible that the prefrontal cortex is critical for attributing punishment, making the essential decision about what kind of punishment to assign,” he suggested. Marois stressed that in order to study that possibility, more subjects would have to be put into the magnet. But if the prefrontal cortex does turn out to be critical for selecting among punishments, Jones added, it could be highly relevant for lawyers selecting a jury. For example, he suggested, lawyers might even select jurors for different cases based on their different brain-activity patterns. In a complex insider-trading case, for example, perhaps the defense would “like to have a juror making decisions on maximum deliberation and minimum emotion”; in a government entrapment case, emotional reactions might be more appropriate.

We then turned to the results of the second experiment, in which I had been asked to alternate between thinking of faces and places without disclosing the order. “We think we can guess what you were thinking about, even though you didn’t tell us the order you started with,” Marois said proudly. “We think you started with places and we will prove to you that it wasn’t just luck.” Marois showed me a picture of my parahippocampus, the area of the brain that responds strongly to places and the recognition of scenes. “It’s lighting up like Christmas on all cylinders,” Marois said. “It worked beautifully, even though we haven’t tried this before here.”

He then showed a picture of the fusiform area, which is responsible for facial recognition. It, too, lighted up every time I thought of a face. “This is a potentially very serious legal implication,” Jones broke in, since the technology allows us to tell what people are thinking about even if they deny it. He pointed to a series of practical applications. Because subconscious memories of faces and places may be more reliable than conscious memories, witness lineups could be transformed. A child who claimed to have been victimized by a stranger, moreover, could be shown pictures of the faces of suspects to see which one lighted up the face-recognition area in ways suggesting familiarity.

Jones and Marois talked excitedly about the implications of their experiments for the legal system. If they discovered a significant gap between people’s hard-wired sense of how severely certain crimes should be punished and the actual punishments assigned by law, federal sentencing guidelines might be revised, on the principle that the law shouldn’t diverge too far from deeply shared beliefs. Experiments might help to develop a deeper understanding of the criminal brain, or of the typical brain predisposed to criminal activity.

III. The End of Responsibility?

Indeed, as the use of functional M.R.I. results becomes increasingly common in courtrooms, judges and juries may be asked to draw new and sometimes troubling lines between “normal” and “abnormal” brains. Ruben Gur, a professor of psychology at the University of Pennsylvania School of Medicine, specializes in doing just that. Gur began his expert-witness career in the mid-1990s when a colleague asked him to help in the trial of a convicted serial killer in Florida named Bobby Joe Long. Known as the “classified-ad rapist,” because he would respond to classified ads placed by women offering to sell household items, then rape and kill them, Long was sentenced to death after he committed at least nine murders in Tampa. Gur was called as a national expert in positron-emission tomography, or PET scans, in which patients are injected with a solution containing radioactive markers that illuminate their brain activity. After examining Long’s PET scans, Gur testified that a motorcycle accident that had left Long in a coma had also severely damaged his amygdala. It was after emerging from the coma that Long committed his first rape.

“I didn’t have the sense that my testimony had a profound impact,” Gur told me recently — Long is still filing appeals — but he has testified at more than 20 capital cases since then. He wrote a widely circulated affidavit arguing that adolescents are not as capable of controlling their impulses as adults because the development of neurons in the prefrontal cortex isn’t complete until the early 20s. Based on that affidavit, Gur was asked to contribute to the preparation of one of the briefs filed by neuroscientists and others in Roper v. Simmons, the landmark case in which a divided Supreme Court struck down the death penalty for offenders who committed crimes when they were under the age of 18.

30270  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Between Black & Immigrant Muslims Part Two on: March 11, 2007, 09:05:45 AM

“Our kids would come home from school and say, ‘Where is my Christmas tree, my Hanukkah lights?’ ” recalled Dr. Khan, who lives in nearby Jericho. “We didn’t want them to grow up unsure of who they are.”

Since opening in 1993, the mosque has thrived, with assets now valued at more than $3 million. Hundreds of people pray there weekly, and thousands come on Muslim holidays.

The mosque has an unusually modern, democratic air. Men and women worship with no partition between them. A different scholar delivers the Friday sermon every week, in English.

Perhaps most striking, a majority of female worshipers do not cover their heads outside the mosque.

“I think it’s important to find the fine line between the religion and the age in which we live,” said Nasreen Wasti, 43, a contract analyst for Lufthansa. “I’m sure I will have to answer to God for not covering myself. But I’m also satisfied by many of the good deeds I am doing.”

She and other members use words like “progressive” to describe their congregation. But after Sept. 11, a different image took hold.

In October 2001, a Newsday article quoted a member of the mosque as asking “who really benefits from such a horrible tragedy that is blamed on Muslims and Arabs?” A co-president of the mosque was also quoted saying that Israel “would benefit from this tragedy.”

Conspiracy theories about Sept. 11 have long circulated among Muslims, and Dr. Khan had heard discussion among congregants. Such talk, he said, was the product of two forces: a deep mistrust of America’s motives in the Middle East and a refusal, among many Muslims, to engage in self-criticism.

“You blame the other guy for your own shortcomings,” said Dr. Khan.

He visited synagogues and churches after the article ran, reassuring audiences that the comments did not reflect the official position of the mosque, which condemned the attacks.

But to Congressman Peter T. King, whose district is near the mosque, that condemnation fell short. He began publicly criticizing Dr. Khan, asserting that he had failed to fully denounce the statements made by the men.

“He’s definitely a radical,” Mr. King said of Dr. Khan in an interview. “You cannot, in the context of Sept. 11, allow those statements to be made and not be a radical.”

When asked about Mr. King’s comments, Dr. Khan replied proudly, “I thought we had freedom of speech.”

It hardly seems possible that Mr. King and Dr. Khan were once friends.

Mr. King used to dine at Dr. Khan’s home. He attended the wedding of Dr. Khan’s son, Arif, in 1995. At the mosque’s opening, it was Mr. King who cut the ribbon.

After Sept. 11, the mosque experienced the sort of social backlash felt by Muslims around the country. Anonymous callers left threatening messages, and rocks were hurled at children from passing cars.

The attention waned over time. But Mr. King cast a new light on the mosque in 2004 with the release of his novel “Vale of Tears.”

In the novel, terrorists affiliated with a Long Island mosque demolish several buildings, killing hundreds of people. One of the central characters is a Pakistani heart surgeon whose friendship with a congressman has grown tense.

“By inference, it’s me,” Dr. Khan said of the Pakistani character. (Mr. King said it was a “composite character” based on several Muslims he knows.)

For Dr. Khan, his difficulties after Sept. 11 come as proof that Muslims cannot stay fragmented. “It’s a challenge for the whole Muslim community — not just for me,” he said. “United we stand, divided we fall.”

The Litmus Test

Imam Talib and his bodyguard set off to Westbury before dusk on Oct. 14. They passed a fork on the Long Island Expressway, and the imam peered out the window. None of the signs were familiar.

He checked his watch and saw that he was late, adding to his unease. He had visited the mosque a few times before, but never felt entirely at home.

“I’m conscious of being a guest,” he said. “They treat me kindly and nicely. But I know where I am.”


At the Islamic Center of Long Island, Dr. Khan was also getting nervous. Hundreds of congregants had gathered after fasting all day for Ramadan. The scent of curry drifted mercilessly through the mosque.

Dr. Khan sprang to his feet and took the microphone. He improvised.

“All of us need to learn from and understand the contributions of the Muslim indigenous community,” he said. “Starting with Malcolm X.”

It had been six years since Imam Talib and Dr. Khan first encountered each other in Chicago. Back then, Imam Talib rarely visited immigrant mosques, and Dr. Khan had only a peripheral connection to African-American Muslims.

In the 1980s, the doctor had become aware of the high number of Muslim inmates while working as the chief of medicine for a hospital in Nassau County that oversaw health care at the county prison. His mosque began donating prayer rugs, Korans and skullcaps to prisoners around the country. But his interaction with black Muslim leaders was limited until Sept. 11.

After Dr. Khan read the book “Black Rage,” he and Imam Talib began serving together on the board of a new political task force. Finally, in 2005, Dr. Khan invited the imam to his mosque to give the Friday sermon.

That February, Imam Talib rose before the Long Island congregation. Blending verses in the Koran with passages from recent American history, he urged the audience to learn from the civil rights movement.

Dr. Khan listened raptly. Afterward, over sandwiches, he asked Imam Talib for advice. He wanted to thaw the relationship between his mosque and African-American mosques on Long Island. The conversation continued for hours.

“The real searching for an answer, searching for a solution, was coming from Dr. Khan,” said Imam Talib. “I could just feel it.”

Dr. Khan began inviting more African-American leaders to speak at his mosque, and welcomed Imam Talib there last October to give a fund-raising pitch for his organization, the Muslim Alliance in North America. The group had recently announced a “domestic agenda,” with programs to help ex-convicts find housing and jobs and to standardize premarital counseling for Muslims in America.

After the imam arrived that evening and spoke, he sat on the floor next to a blazer-clad Dr. Khan. As they feasted on kebabs, the doctor made a pitch of his own: The teenagers of his mosque could spend a day at Imam Talib’s mosque, as the start of a youth exchange program. The imam nodded slowly.

Minutes later, the mosque’s president, Habeeb Ahmed, hurried over. The congregants had so far pledged $10,000.

“Alhamdulillah,” the imam said. Praise be to God.

It was the most Imam Talib had raised for his group in one evening.

As the dinner drew to a close, the imam looked for his bodyguard. They had a long drive home and he did not want to lose his way again.

Dr. Khan asked Imam Talib how he had gotten lost.

“Inner city versus the suburbs,” the imam replied a bit testily.

Then he smiled.

“The only thing it proves,” he said, “is that I need to come by here more often.”

30271  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Between Black & Immigrant Muslims on: March 11, 2007, 08:59:49 AM
It's the NY Times, so of course there are the shadings that one expects from the Times, but the article addresses matters of interest, and so I post it here:


Dr. Faroque Khan, left, and Imam Al-Hajj Talib ‘Abdur-Rashid serve very different mosques, one on Long Island and one in Harlem.

Published: March 11, 2007
NY Times

Under the glistening dome of a mosque on Long Island, hundreds of men sat cross-legged on the floor. Many were doctors and engineers born in Pakistan and India. Dressed in khakis, polo shirts and the odd silk tunic, they fidgeted and whispered.

Imam Al-Hajj Talib ‘Abdur-Rashid at a rally against profiling. (Photo: James Estrin/The New York Times)
One thing stood between them and dinner: A visitor from Harlem was coming to ask for money.

A towering black man with a gray-flecked beard finally swept into the room, his bodyguard trailing him. Wearing a long, embroidered robe and matching hat, he took the microphone and began talking about a different group of Muslims, the thousands of African-Americans who have found Islam in prison.

“We are all brothers and sisters,” said the visitor, known as Imam Talib.

The men stared. To some of them, it seemed, he was from another planet. As the imam returned their gaze, he had a similar sensation. “They live in another world,” he later said.

Only 28 miles separate Imam Talib’s mosque in Harlem from the Islamic Center of Long Island. The congregations they each serve — African-Americans at the city mosque and immigrants of South Asian and Arab descent in the suburbs — represent the largest Muslim populations in the United States. Yet a vast gulf divides them, one marked by race and class, culture and history.

For many African-American converts, Islam is an experience both spiritual and political, an expression of empowerment in a country they feel is dominated by a white elite. For many immigrant Muslims, Islam is an inherited identity, and America a place of assimilation and prosperity.

For decades, these two Muslim worlds remained largely separate. But last fall, Imam Talib hoped to cross that distance in a venture that has become increasingly common since Sept. 11. Black Muslims have begun advising immigrants on how to mount a civil rights campaign. Foreign-born Muslims are giving African-Americans roles of leadership in some of their largest organizations. The two groups have joined forces politically, forming coalitions and backing the same candidates.

It is a tentative and uneasy union, seen more typically among leaders at the pulpit than along the prayer line. But it is critical, a growing number of Muslims believe, to surviving a hostile new era.

“Muslims will not be successful in America until there is a marriage between the indigenous and immigrant communities,” said Siraj Wahhaj, an African-American imam in New York with a rare national following among immigrant Muslims. “There has to be a marriage.”

The divide between black and immigrant Muslims reflects a unique struggle facing Islam in America. Perhaps nowhere else in the world are Muslims from so many racial, cultural and theological backgrounds trying their hands at coexistence. Only in Mecca, during the obligatory hajj, or pilgrimage, does such diversity in the faith come to life, between black and white, rich and poor, Sunni and Shiite.

“This is a new experiment in the history of Islam,” said Ali S. Asani, a professor of Islamic studies at Harvard University.

That evening in October, Imam Al-Hajj Talib ‘Abdur-Rashid drove to Westbury, on Long Island, with a task he would have found unthinkable years ago.

He would ask for donations from the immigrant community he refers to, somewhat bitterly, as the “Muslim elite.”

But he needed funds, and the doors of immigrant mosques seemed to be opening. Imam Talib and other African-American leaders had formed a national “indigenous Muslim” organization, and he knew that during the holy month of Ramadan, the Islamic Center of Long Island could raise thousands of dollars in an evening.

It is a place where BMWs and Mercedes-Benzes fill the parking lot, and Coach purses are perched along prayer lines.

In Harlem, many of Imam Talib’s congregants get to the mosque by bus or subway, and warm themselves with space heaters in a drafty, brick building.

Before the terrorist attacks of Sept. 11, Imam Talib had only a distant connection to the Islamic Center of Long Island. In passing, he had met Faroque Khan, an Indian-born doctor who helped found the mosque, but the two had little in common.

Imam Talib, 56, is a thundering prison chaplain whose mosque traces its roots to Malcolm X. He is a first-generation Muslim.

Dr. Khan, 64, is a mild-mannered pulmonologist who collects Chinese antiques and learned to ski on the slopes of Vermont. He is a first-generation American.

But in the turmoil that followed Sept. 11, the imam and the doctor found themselves unexpectedly allied.

“The more separate we stay, the more targeted we become,” Dr. Khan said.

Each man recognizes what the other has to offer. African-Americans possess a cultural and historical fluency that immigrants lack, said Dr. Khan; they hold an unassailable place in America from which to defend their faith.

 For Imam Talib, immigrants provide a crucial link to the Muslim world and its tradition of scholarship, as well as the wisdom that comes with an “unshattered Islamic heritage.”

Both groups have their practical virtues, too. African-Americans know better how to mobilize in America, both men say, and immigrants tend to have deeper pockets.

Still, it is one thing to talk about unity, Imam Talib said, and another to give it life. Before his visit to Long Island last fall, he had never asked Dr. Khan and his mosque to match their rhetoric with money.

“You have to have a litmus test,” he said.

One Faith, Many Histories

Imam Talib and Dr. Khan did not warm to each other when they met in May 2000, at a gathering in Chicago of Muslim leaders.

The imam found the silver-haired doctor faintly smug and paternalistic. It was an attitude he had often whiffed from well-to-do immigrant Muslims. Dr. Khan found Imam Talib straightforward to the point of bluntness.

The uneasy introduction was, for both men, emblematic of the strained relationship between their communities.

Imam Talib and other black Muslims trace their American roots to the arrival of Muslims from West Africa as slaves in the South.

(Is this at all true?  I thought the point was that the Muslim Arab slave traders felt free to enslave the non-Muslim blacks?)

That historical link gave rise to Islam-inspired movements in the 20th century, the most significant of which was the Nation of Islam.

The man who founded the Nation in 1930, W. D. Fard, spread the message that American blacks belonged to a lost Muslim tribe and were superior to the “white, blue-eyed devils” in their midst. Under Mr. Fard’s successor, Elijah Muhammad, the Nation flourished in the 1960s amid the civil rights struggle and the emergence of a black-separatist movement.

Overseas, Islamic scholars found the group’s teachings on race antithetical to the faith. The schism narrowed after 1975, when Mr. Muhammad’s son Warith Deen Mohammed took over the Nation, bringing it in line with orthodox Sunni Islam. Louis Farrakhan parted ways with Mr. Mohammed — taking the Nation’s name and traditional teachings with him — but the majority of African-American adherents came to embrace the same Sunni practice that dominates the Muslim world.

Still, divisions between African-American and immigrant Muslims remained pronounced long after the first large waves of South Asians and Arabs arrived in the United States in the 1960s.

Today, of the estimated six million Muslims who live in the United States (a commonly claimed number that I have seen plausibly challenged), about 25 percent are African-American, 34 percent are South Asian and 26 percent are Arab, said John Zogby, a pollster who has studied the American Muslim population.

“Given the extreme from which we came, I would say that the immigrant Muslims have been brotherly toward us,” Warith Deen Mohammed, who has the largest following of African-American Muslims, said in an interview. “But I think they’re more skeptical than they admit they are. I think they feel more comfortable with their own than they feel with us.”

For many African-Americans, conversion to Islam has meant parting with mainstream culture, while Muslim immigrants have tended toward assimilation. Black converts often take Arabic names, only to find foreign-born Muslims introducing themselves as “Moe” instead of “Mohammed.”

The tensions are also economic. Like Dr. Khan, many Muslim immigrants came to the United States with advanced degrees and quickly prospered, settling in the suburbs. For decades, African-Americans watched with frustration as immigrants sent donations to causes overseas, largely ignoring the problems of poor Muslims in the United States.

Imam Talib found it impossible to generate interest at immigrant mosques in the 1999 police shooting of Amadou Diallo, who was Muslim. “What we’ve found is when domestic issues jump up, like police brutality, all the sudden we’re by ourselves,” he said.

Some foreign-born Muslims say they are put off by the racial politics of many black converts. They struggle to understand why African-American Muslims have been reluctant to meet with law enforcement officials in the wake of Sept. 11. For their part, black Muslim leaders complain that immigrants have failed to learn their history, which includes a pattern of F.B.I. surveillance dating back to the roots of the Nation of Islam.

The ironies are, at times, stinging.

“From the immigrant community, I hear that African-Americans have to learn how to work in the system,” said Nihad Awad, the executive director of the Council on American Islamic Relations, adding that this was not his personal opinion.

At the heart of the conflict is a question of leadership. Much to the ire of African-Americans, many immigrants see themselves as the rightful leaders of the faith in America by virtue of their Islamic schooling and fluency in Arabic, the original language of the Koran.

“What does knowing Arabic have to do with the quality of your prayer, your fast, your relationship with God?” asked Ihsan Bagby, an associate professor of Islamic studies at the University of Kentucky in Lexington. “But African-Americans have to ask themselves why have they not learned more in these years.”


Every year in Chicago, the two largest Muslim conventions in the country — one sponsored by an immigrant organization and the other by Mr. Mohammed’s — take place on the same weekend, in separate parts of the city.

The long-simmering tension boiled over into a public rift with the 2000 presidential elections. That year, a powerful coalition of immigrant Muslims endorsed George W. Bush (because of a promise to stop the profiling of Arabs).

The nation’s most prominent African-American Muslims complained that they were never consulted. The following summer, when Imam Talib vented his frustration at a meeting with immigrant leaders in Washington, a South Asian man turned to him, he recalled, and said, “I don’t understand why all of you African-American Muslims are always so angry about everything.”

Imam Talib searched for an answer he thought the man could understand.

“African-Americans are like the Palestinians of this land,” he finally said. “We’re not just some angry black people. We’re legitimately outraged and angry.”

The room fell silent.

Soon after, black leaders announced the creation of the Muslim Alliance in North America, their first national “indigenous” organization.

But the fallout over the elections was soon eclipsed by Sept. 11, when Muslim immigrants found themselves under intense public scrutiny. They began complaining about “profiling” and “flying while brown,” appropriating language that had been largely the domain of African-Americans.

It was around this time that Dr. Khan became, as he put it, enlightened. A few weeks before the terrorist attacks, he read the book “Black Rage,” by William H. Grier and Price M. Cobbs. The book, published in 1968, explores the psychological woes of African-Americans, and how the impact of racism is carried through generations.

“It helped me understand that even before you’re born, things that happened a hundred years ago can affect you,” Dr. Khan said. “That was a big change in my thinking.”

He sent an e-mail message to fellow Muslims, including Imam Talib, sharing what he had learned.

The Harlem imam was pleased, if not yet convinced.

“I just encouraged the brother to keep going,” Imam Talib said.

An Oasis in Harlem

One windswept night in Harlem, cars rolled past the corner of West 113th Street and St. Nicholas Avenue. A police siren blared as men huddled by a neon-lit Laundromat.

Across the street stood a brown brick building, lifeless from the outside. But upstairs, in a cozy carpeted room, rows of men and women chanted.

“Ya Hakim. Ya Allah.” O wise one. O God.

Imam Talib led the chant, swathed in a black satin robe. It was Ramadan’s holiest evening, the Night of Power. As the voices died down, he spotted his bodyguard swaying.

“Take it easy there, Captain,” Imam Talib said. “As long as you don’t jump and shout it’s all right.”

Laughter trickled through the mosque, where a translucent curtain separated men in skullcaps from women in African-print gowns.

“We’re just trying to be ourselves, you know?” Imam Talib said. “Within the tradition.”

“That’s right,” said one woman.

The imam continued: “And we can’t let other people, from other cultures, come and try to make us clones of them. We came here as Muslims.”

He was feeling drained. He had just returned from the Manhattan Detention Complex, where he works as a chaplain. Some of the mosque’s men were back in jail.

“We need power,” he said quietly. “Without that, we’ll destroy ourselves.”

Since its birth in 1964, the Mosque of Islamic Brotherhood has been a fortress of stubborn faith, persevering through the crack wars, welfare, AIDS, gangs, unemployment, diabetes, broken families and gentrification.

The mosque was founded in a Brooklyn apartment by Shaykh-‘Allama Al-Hajj K. Ahmad Tawfiq, a follower of Malcolm X. The Sunni congregation boomed in the 1970s, starting a newspaper and opening a school and a health food store.

With city loans, it bought its current building. Fourteen families moved in, creating a bold Muslim oasis in a landscape of storefront churches and liquor stores. The mosque claimed its corner by drenching the sidewalk in dark green paint, the color associated with Islam.

The paint has since faded. The school is closed. Many of the mosque’s members can no longer afford to live in a neighborhood where brownstones sell for millions of dollars.



But an aura of dignity prevails. The women normally pray one floor below the men, in a scrubbed, tidy room scented with incense. Their bathroom is a shrine of gold curtains and lavender soaps. A basket of nylon roses hides a hole in the wall.

Most of the mosque’s 160 members belong to the working class, and up to a third of the men are former convicts.

Some congregants are entrepreneurs, professors, writers and musicians. Mos Def and Q-Tip have visited with Imam Talib, who carries the nickname “hip-hop imam.”

Mosque celebrations are a blend of Islam and Harlem. In October, at the end of Ramadan, families feasted on curried chicken and collard greens, grilled fish and candied yams.

Just before the afternoon prayer, a lean man in a black turtleneck rose to give the call. He was Yusef Salaam, whose conviction in the Central Park jogger case was later overturned.

Many of the mosque’s members embraced Islam in search of black empowerment, not black separatism. They describe racial equality as a central tenet of their faith. Yet for some, the promise of Islam has been at odds with the reality of Muslims.

One member, Aqilah Mu’Min, lives in the Parkchester section of the Bronx, a heavily Bangladeshi neighborhood. Whenever she passes women in head scarves, she offers the requisite Muslim greeting. Rarely is it returned. “We have a theory that says Islam is perfect, human beings are not,” said Ms. Mu’Min, a city fraud investigator.

It was the simplicity of Islam that drew Imam Talib.

Raised a Christian, he spent the first part of his youth in segregated North Carolina. As a teenager, he read “The Autobiography of Malcolm X” twice. He began educating himself about the faith at age 19, when as an aspiring actor he was cast in a play about a man who had left the Nation of Islam.

But his conversion was more spiritual than political, he said.

“I’d like to think that even if I was a white man, I’d still be a Muslim because that’s the orientation of my soul,” the imam said.

He has learned some Arabic, and traveled once to the Middle East, for hajj. Yet he feels more comfortable with the Senegalese and Guinean Muslims who have settled in Harlem than with many Arabs and South Asians.

He is trying to reach out, but is often disappointed.

In November, he accepted a last-minute invitation to meet with hundreds of immigrants at the Islamic Cultural Center of New York, an opulent mosque on East 96th Street.

The group, the Coalition for Muslim School Holidays, was trying to persuade the city to recognize two Muslim holidays on the school calendar. The effort, Imam Talib learned, had been nearly a year in the making, and no African-American leaders had been consulted.

He was stunned. After all, he had led a similar campaign in the 1980s, resulting in the suspension of alternate-side parking for the same holidays.

“They are unaware of the foundations upon which they are standing,” he said.

Backlash in the Suburbs

Brush Hollow Road winds through a quiet stretch of Long Island, past churches and diners and leafy cul-de-sacs. In this tranquil tableau, the Islamic Center of Long Island announces itself proudly, a Moorish structure of white concrete topped by a graceful dome.

Sleek sedans and S.U.V.’s circle the property as girls with Barbie backpacks hop out and scurry to the Islamic classes they call “Sunday school.”

It is a testament to America’s influence on the mosque that its liveliest time of the week is not Friday, Islam’s holy day, but Sunday.

Boys in hooded sweatshirts smack basketballs along the pavement by a sign that reads “No pray, no play.” Young mothers in Burberry coats exchange kisses and chatter.

For members of the mosque — many of whom work in Manhattan and cannot make the Friday prayer — Sunday is the day to reflect and connect.

The treasurer, Rizwan Qureshi, frantically greeted drivers one Sunday morning with a flier advertising a fund-raiser.

“We’re trying to get Barack Obama,” Mr. Qureshi, a banker born in Karachi, told a woman in a gold-hued BMW.

“We need some real money,” he called out to another driver.

The mosque began with a group of doctors, engineers and other professionals from Pakistan and India who settled in Nassau County in the early 1970s.
30272  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Race, religion, ethnic origin on: March 10, 2007, 10:25:32 PM
Wall Street Journal

D-Day in Little Rock
Eisenhower's civil rights showdown.

Thursday, March 8, 2007 12:01 a.m. EST

In spring 1954, as the Supreme Court was deliberating on Brown v. Board of Education, President Dwight D. Eisenhower invited Chief Justice Earl Warren to a stag dinner at the White House. He seated Warren at the same table as John W. Davis, the lawyer who had argued against school desegregation before the court. Eisenhower proceeded to tell the chief justice what a "great man" Davis was.

As it happened, Eisenhower had authorized his Justice Department to file an amicus brief in the case opposing Davis and public-school segregation. And he specifically allowed his solicitor general, Lee Rankin, to tell the justices during oral argument that "separate but equal" schools were unconstitutional. Yet he sympathized with the segregated South. "These are not bad people," he told Warren at the dinner. "All they are concerned about is to see that their sweet little girls are not required to sit in school alongside some big, overgrown Negroes." Warren was appalled.

To put it kindly, Eisenhower was ambivalent on civil rights. "Conservative by nature, he hoped that the advance of the civil rights movement would be gradual, allowing time for the South to change," writes Kasey S. Pipes in "Ike's Final Battle." Most of all, Eisenhower didn't want to lead a civil-rights crusade from the White House. "The only crusade he had ever wanted to lead was liberating Europe in World War II," Mr. Pipes says.

But when necessary--or when steps toward desegregation were relatively painless--Eisenhower acted. He broke the color barrier in the military by deploying black soldiers alongside whites to win the Battle of the Bulge in December 1944 and January 1945. As president, he integrated the schools and movie theaters in Washington, D.C., and federal installations around the country. Most important, he sent U.S. Army troops to Little Rock, Ark., in September 1957 to escort nine black students into Central High School after days of violent protest. It was a defeat from which segregationist forces never recovered.

"Little Rock represented something else as well: the culmination of Eisenhower's own attitude toward racial justice," Mr. Pipes writes. "Ike had enjoyed the luxury of endorsing civil rights in broad terms, knowing full well that much of segregation law was a state and local matter. Little Rock ended that."

Two days after the Army troops arrived in Little Rock, Eisenhower decided to address the nation on prime-time television. This surprised his attorney general, Herbert Brownell, who had been prodding Eisenhower for years to act more boldly on civil rights. The president wrote most of the speech himself, including a passage, suggested by Secretary of State John Foster Dulles, arguing that violent opposition to racial integration was weakening America's influence and prestige in the world.

In the speech, Eisenhower lauded the desegregation efforts of other Southern communities and their willingness to comply with federal law. This was a new tack for the president, who had refused to endorse Brown v. Board of Education, the Supreme Court's decision declaring segregated public schools unconstitutional. Nor had he denounced the murder of Emmett Till by racist thugs in Mississippi in 1955, despite pleas by the teenage boy's mother.

"He feared that moralizing from the bully pulpit would raise not only awareness, but also the collective blood pressure of the South," Mr. Pipes writes. "He saw no point in riling an already angry population. . . . To put it bluntly, Eisenhower had little interest in trying to change the minds of millions of Southerners."

But he had learned a lesson from Little Rock. His view had been, as Mr. Pipes puts it, that "segregationists and civil rights advocates were cut from the same cloth." In his dealings with Arkansas Gov. Orval Faubus, he learned otherwise.

Faubus betrayed Eisenhower. In the midst of the Little Rock crisis--as Arkansas's National Guard was blocking the nine black students from Central High--Faubus had agreed to meet the president in Newport, R.I. At the end of their 20-minute talk, Faubus gave the president the clear impression that he would change the National Guard's orders, requiring it to protect the black students as they entered Central High. But Faubus didn't follow through. Eisenhower felt double-crossed and told Brownell: "You were right. Faubus broke his word." The president then took the next step, dispatching the 101st Airborne.

Mr. Pipes is not a professional historian. He is a public-relations consultant and speechwriter who worked in the Bush White House from 2002 to 2005. But he has written a highly readable and credible account of Eisenhower's struggle with race and civil rights. While sympathetic, he doesn't sugarcoat Eisenhower's qualms about desegregation or excuse his unwillingness to move decisively before Little Rock.

Eisenhower famously regretted his appointment of Earl Warren as chief justice. (Warren served in that role from 1953 to 1969.) Warren confronted Eisenhower about the president's feelings toward him when they flew together to Winston Churchill's funeral in 1965. Eisenhower explained that it was Warren's liberal rulings on national security that had upset him. He didn't mention Brown v. Board of Education, and understandably so: Years earlier Eisenhower had told an aide, privately, that he thought the Brown decision was wrong; by 1965, he had concluded that it was right.

Mr. Barnes is executive editor of The Weekly Standard and co-host, with Morton Kondracke, of "The Beltway Boys" on Fox News Channel. You can purchase "Ike's Final Battle" at the OpinionJournal bookstore here.
30273  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Help our troops/our cause: on: March 10, 2007, 10:21:45 PM
Free Walter Reed
The wounded deserve more than political recrimination.
Wall Street Journal

Saturday, March 10, 2007 12:01 a.m. EST

The reports of poor conditions at Walter Reed Army Medical Center have set off a political firestorm. It remains to be seen whether the system that created the problem is capable of fixing it.

The Walter Reed facility is located inside the District of Columbia. While the press reports have been dramatic, it strains credulity to think these problems are suddenly news to Congress and all its staff, the executive branch, the Pentagon (across the Potomac), the rest of the Washington press corps, or servicemen and their families.

So now Congress is holding hearings, the White House is setting up an independent commission and Vice President Dick Cheney has pledged "there will be no excuses, only action." Arguably "only action" is a federal-government oxymoron. The action so far has consisted of firings and recrimination. If this continues, the incentive for anyone in government to think innovatively about Walter Reed will vanish.

Not surprisingly, the story beneath the Walter Reed mess is a morass. It is government, in its inevitable sprawl, working at cross purposes with itself. For starters, Walter Reed is scheduled to shut down in 2011 as part of the base-closure commission process. No surprise that resources going into Walter Reed would not rise under this circumstance.

Meanwhile, President Bush has proposed spending $38.7 billion on military health care in the coming year--double what the military spent in 2001. Over the past six years the military has expanded health coverage for reservists and for military families and has added a more generous prescription drug benefit.

What has happened is that for more than a decade military health care has shifted away from long hospital stays in favor of increased outpatient care, mirroring the private-sector trend. The military and the entirely separate Department of Veterans Affairs--which itself spends tens of billions of dollars on health care--have shuttered large in-patient facilities and opened hundreds of outpatient clinics.
By and large this has been for the good. The military in fact is a pace-setter in medical procedures to treat the severely injured; it drives advances in prosthetic limbs, trauma care and reconstructive surgery. Approximately 98% of those wounded on the battlefield who reach a hospital survive. Consider the following comparison: The ratio of those wounded to killed today is seven to one; in Vietnam that ratio was closer to three to one. And the VA is excelling at outpatient care. The Rand Corporation recently found that on nearly every measure of quality of care--preventive services, follow-ups, chronic care--VA patients receive better care than most civilians.

But the problems are real and significant. The military provides health care to more than nine million people. The VA runs the largest unified health-care program in the country to cover an additional five million people. It's predictable that patients will get lost inside a government system this vast. In recent weeks veterans from the Korean and Vietnam wars have stepped forward to tell their own stories of fighting the health-care bureaucracy.

More than three million people eligible for cheap prescription drugs through the VA are opting instead to pay a little extra for Medicare drug coverage. Why? Because, as a Manhattan Institute study recently found, only 22% of the most important drugs released in recent years are covered by the VA.

These manifest problems will now tread water while we await the president's commission, Congress's hearings and on into the darkness. We have some shorter-term ideas to get help where it's needed.

For starters, free the patients captive inside this system. Congress should give these wounded soldiers vouchers to pay for outpatient care anywhere in America they wish--near home and family, at innumerable state-of-the-art rehab facilities, at specialized care institutions. Army word-of-mouth would quickly transmit data on best care, location, cost and family support. The professionals and staff in these places would move heaven and earth to help the service men and women.

To make this work, give a primary role to nonprofit foundations. The Fisher House program of comfort homes for families is perhaps the most famous. There are others more than willing to help.

Certainly the government needs to right its own battered programs. But in the meantime, let the American people--the world's greatest reservoir of medical, financial and volunteer skills--at last get involved helping those who've been fighting on our behalf in Iraq and in the war on terror.
30274  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Nuclear War? on: March 10, 2007, 10:06:06 PM
I have never heard of "Zee" so this must count as highly unconfirmed info-- but given AQ's recent boasts in this regard, it is worth noting:
Top Pakistan nuclear scientists in Taliban Custody: Zee News Exclusive

New Delhi, March 07: Two top nuclear scientists of the Pakistan Atomic Energy Commission (PAEC) are currently in Taliban custody. The two were working at PAEC's facility in North West Frontier Province. Zee News investigations reveal that the two scientists were kidnapped about six months ago. To avoid international embarrassment, the Pakistani government has kept this information under wraps.

According to information available with Zee News, the nuclear scientists were kidnapped by the Taliban at the behest of Al-Qaeda. Further investigations reveal that Al-Qaeda may be using the expertise of the scientists to produce nuclear bombs. The two scientists are reportedly being held somewhere in Waziristan, near the Afghanistan border.

In January this year Pakistan security agencies had foiled another attempt by Taliban militia to kidnap nuclear scientists. Earlier, incidents of Taliban militia stealing uranium in NWFP have already been reported. PAEC also has a uranium mining facility in NWFP.

With repeated Al-Qaeda threats against the US, news of the kidnapping of nuclear scientists will increase pressure on Pakistan to attack terrorist camps.

Bureau Report
30275  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Islam the religion on: March 10, 2007, 10:01:42 PM
Wall Street Journal

Free Radical
Ayaan Hirsi Ali infuriates Muslims and discomfits liberals.

Saturday, March 10, 2007 12:01 a.m. EST

NEW YORK--Ayaan Hirsi Ali is untrammeled and unrepentant: "I am supposed to apologize for saying the prophet is a pervert and a tyrant," she declares. "But that is apologizing for the truth."

Statements such as these have brought Ms. Hirsi Ali to world-wide attention. Though she recently left her adopted country, Holland--where her friend and intellectual collaborator Theo van Gogh was murdered by a Muslim extremist in 2004--she is still accompanied by armed guards wherever she travels.

Ms. Hirsi Ali was born in 1969 in Mogadishu--into, as she puts it, "the Islamic civilization, as far as you can call it a civilization." In 1992, at age 22, her family gave her hand to a distant relative; had the marriage ensued, she says, it would have been "an arranged rape." But as she was shipped to the appointment via Europe, she fled, obtaining asylum in Holland. There, "through observation, through experience, through reading," she acquainted herself with a different world. "The culture that I came to and I live in now is not perfect," Ms. Hirsi Ali says. "But this culture, the West, the product of the Enlightenment, is the best humanity has ever achieved."

Unease over Muslim immigration had been rising in the Low Countries for some time. For instance, when the gay right-wing politician Pim Fortuyn--who declared, "I am in favor of a cold war with Islam," and believed the borders should be closed to Muslims--was gunned down in 2002, it was widely assumed his killer was an Islamist. There was a strange sense of relief when he turned out to be a mere animal-rights activist. Ms. Hirsi Ali brought integration issues to further attention, exposing domestic abuse and even honor killings in the Dutch-Muslim "dish cities."

In 2003, she won a seat in the parliament as a member of the center-right People's Party for Freedom and Democracy (VVD). The next year, she wrote the script for a short film called "Submission." It investigated passages from the Quran that Ms. Hirsi Ali contends authorize violence against women, and did so by projecting those passages onto naked female bodies. In retrospect, she deeply regrets the outcome: "I don't think the film was worth the human life."

The life in question was that of Van Gogh, a prominent controversialist and the film's director. At the end of 2004, an Islamist named Mohammed Bouyeri shot him as he was bicycling to work in downtown Amsterdam, then almost decapitated him with a curved sword. He left a manifesto impaled to the body: "I know for sure that you, Oh Hirsi Ali, will go down," was its incantation. "I know for sure that you, Oh unbelieving fundamentalist, will go down."

The shock was palpable. Holland--which has the second largest per capita population of Muslims in the EU, after France--had always prided itself on its pluralism, in which all groups would be tolerated but not integrated. The killing made clear just how apart its groups were. "Immediately after the murder," Ms. Hirsi Ali says, "we learned Theo's killer had access to education, he had learned the language, he had taken welfare. He made it very clear he knew what democracy meant, he knew what liberalism was, and he consciously rejected it. . . . He said, 'I have an alternative framework. It's Islam. It's the Quran.' "

At his sentencing, Mohammed Bouyeri said he would have killed his own brother, had he made "Submission" or otherwise insulted the One True Faith. "And why?" Ms. Hirsi Ali asks. "Because he said his god ordered him to do it. . . . We need to see," she continues, "that this isn't something that's caused by special offense, the right, Jews, poverty. It's religion."

Ms. Hirsi Ali was forced into living underground; a hard-line VVD minister named Rita Verdonk, cracking down on immigration, canceled her citizenship for misstatements made on her asylum application--which Ms. Hirsi Ali had admitted years before and justified as a means to win quicker admission at a time of great personal vulnerability. The resulting controversy led to the collapse of Holland's coalition government. Ms. Hirsi Ali has since decamped for America--in effect a political refugee from Western Europe--to take up a position with the American Enterprise Institute. But the crisis, she says, is "still simmering underneath and it might erupt--somewhere, anywhere."
That partly explains why Ms. Hirsi Ali's new autobiography, "Infidel," is already a best seller. It may also have something to do with the way she scrambles our expectations. In person, she is modest, graceful, enthralling. Intellectually, she is fierce, even predatory: "We know exactly what it is about but we don't have the guts to say it out loud," she says. "We are too weak to take up our role. The West is falling apart. The open society is coming undone."

Many liberals loathe her for disrupting an imagined "diversity" consensus: It is absurd, she argues, to pretend that cultures are all equal, or all equally desirable. But conservatives, and others, might be reasonably unnerved by her dim view of religion. She does not believe that Islam has been "hijacked" by fanatics, but that fanaticism is intrinsic in Islam itself: "Islam, even Islam in its nonviolent form, is dangerous."

The Muslim faith has many variations, but Ms. Hirsi Ali contends that the unities are of greater significance. "Islam has a very consistent doctrine," she says, "and I define Islam as I was taught to define it: submission to the will of Allah. His will is written in the Quran, and in the hadith and Sunna. What we are all taught is that when you want to make a distinction between right and wrong, you follow the prophet. Muhammad is the model guide for every Muslim through time, throughout history."

This supposition justifies, in her view, a withering critique of Islam's most holy human messenger. "You start by scrutinizing the morality of the prophet," and then ask: "Are you prepared to follow the morality of the prophet in a society such as this one?" She draws a connection between Muhammad's taking of child brides and modern sexual oppressions--what she calls "this imprisonment of women." She decries the murder of adulteresses and rape victims, the wearing of the veil, arranged marriages, domestic violence, genital mutilation and other contraventions of "the most basic freedoms."

These sufferings, she maintains, are traceable to theological imperatives. "People say it is a bad strategy," Ms. Hirsi Ali says forcefully. "I think it is the best strategy. . . . Muslims must choose to follow their rational capacities as humans and to follow reason instead of Quranic commands. At that point Islam will be reformed."

This worldview has led certain critics to dismiss Ms. Hirsi Ali as a secular extremist. "I have my ideas and my views," she says, "and I want to argue them. It is our obligation to look at things critically." As to the charges that she is an "Enlightenment fundamentalist," she points out, rightly, that people who live in democratic societies are not supposed to settle their disagreements by killing one another.

And yet contemporary democracies, she says, accommodate the incitement of such behavior: "The multiculturalism theology, like all theologies, is cruel, is wrongheaded, and is unarguable because it is an utter dogmatism. . . . Minorities are exempted from the obligations of the rest of society, so they don't improve. . . . With this theory you limit them, you freeze their culture, you keep them in place."

The most grievous failing of the West is self-congratulatory passivity: We face "an external enemy that to a degree has become an internal enemy, that has infiltrated the system and wants to destroy it." She believes a more drastic reaction is required: "It's easy," she says, "to weigh liberties against the damage that can be done to society and decide to deny liberties. As it should be. A free society should be prepared to recognize the patterns in front of it, and do something about them."

She says the West must begin to think long term about its relationship with Islam--because the Islamists are. Ms. Hirsi Ali notes Muslim birth rates are vastly outstripping those elsewhere (particularly in Western Europe) and believes this is a conscious attempt to extend the faith. Muslims, she says, treat women as "these baby-machines, these son-factories. . . . We need to compete with this," she goes on. "It is a totalitarian method. The Nazis tried it using women as incubators, literally to give birth to soldiers. Islam is now doing it. . . . It is a very effective and very frightening way of dealing with human beings."

All of this is profoundly politically incorrect. But for this remarkable woman, ideas are not abstractions. She forces us back to first principles, and she punctures complacencies. These ought to be seen as virtues, even by those who find some of Ms. Hirsi Ali's ideas disturbing or objectionable. Society, after all, sometimes needs to be roused from its slumbers by agitators who go too far so that others will go far enough.

Mr. Rago is an editorial page writer for The Wall Street Journal.
30276  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Humor/WTF on: March 10, 2007, 09:06:38 PM
Man saws house in two in divorce split
German man chain saws house in two in divorce split, takes his half
Updated: 11:55 a.m. ET March 10, 2007

BERLIN - A 43-year-old German decided to settle his imminent divorce by chain sawing a family home in two and making off with his half in a forklift truck.

Police in the eastern town of Sonneberg said on Friday the trained mason measured the single-story summer house — which was some 8 meters (26 feet) long and 6 meters wide — before chain sawing through the wooden roof and walls.

"The man said he was just taking his due," said a police spokesman. "But I don't think his wife was too pleased."

After finishing the job, the man picked up his half with the forklift truck and drove to his brother's house, where he has since been staying.

Copyright 2007 Reuters Limited. All rights reserved. Republication or redistribution of Reuters content is expressly prohibited without the prior written consent of Reuters.
30277  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Military Science on: March 10, 2007, 10:36:21 AM
Museum Review | U.S.S. Monitor Center

A Celebrity Warship Gets a Hall of Fame to Call Its Own

Published: March 10, 2007

NEWPORT NEWS, Va. — As sacred relics go, it doesn’t seem too inspiring. In appearance, Nathaniel Hawthorne said, it “looked like a gigantic rat-trap.” In life, it had little more than a single day of major achievement, and in that it was less than triumphant. In death, it was even less grand, sinking into the Atlantic during a storm, not even a year after it first lumbered onto the scene.

So why, after 145 years, $15 million in oceanic explorations and more than a decade of dives and excavation, is the Civil War battleship the Monitor being given a second life at a cost of $30 million, with its artifacts, history and accounts of its career displayed in a 63,500-square-foot space? That’s precisely what is happening at the U.S.S. Monitor Center, which opened March 9 at the Mariners’ Museum here.

Something seems off kilter about the entire scale: why this kind of attention and expense? It is much easier to see why the Mariners’ Museum itself was interested. Rich in land (a 550-acre park) and endowment ($110 million) and founded in 1930 by Archer M. Huntington (of the railroad Huntingtons) to explore what he called the “culture of the sea,” this museum features a collection of about 150 boats, a major research library, world-class navigation equipment and exhibitions about the history of navigation. But it has been drawing only about 60,000 visitors a year in a region where American history is a major tourist attraction, shipbuilding a local industry and the United States Navy a nearby presence.

Now that may change with the opening of its U.S.S. Monitor Center, in conjunction with the National Oceanic and Atmospheric Administration. (The government owns the wreck and oversaw its excavation.) While the scale and attention can be a little disorienting to a visitor without sea legs, by the time you have passed through the well-annotated, smartly presented exhibits, watched the widescreen re-creations of historic battles and read something about what this ship meant to its contemporaries and devotees, the Monitor starts to loom large.

The center’s galleries are meant, in current museum style, to be evocative re-creations of times and places — turning points of experience. (The exhibits were overseen by the museum’s chief curator, Anna Gibson Holloway, and designed by DMCD Inc.) The history begins on a gun deck of a 1798 warship, where the vulnerabilities of the age of sail could be sensed in the evolution of ever more powerful guns. The early 19th century sounded the death knell for that age; the Civil War allowed it a final breath; the Monitor and the Confederate ironclad Virginia buried it.

Then comes a room evoking the Gosport Navy Yard in Virginia in 1862: the Union had tried to destroy the yard and remove its warships, but only half-managed to burn the Merrimack and leave it in the mud. Lacking the North’s industrial facilities but not ingenuity, the Confederates took the burned hull of the Merrimack, built on it and layered on four inches of iron, renamed it the Virginia and, with this strange contraption, emulated the armored ships that were transforming European navies. A 50-foot-long replica of the Virginia’s bow here is a monstrosity that understandably inspired fear and bewilderment among those used to wooden vessels with billowing sails.

Then a visitor enters the board room of 1862, where Navy officials discussed what kind of armored warship the Union could hastily construct. A Confederate ironclad ship, it was justifiably feared, could wipe out the entire Union Navy. A brilliant Swedish engineer, John Ericsson, had fruitlessly peddled an ironclad design to Napoleon III, but the urgency of war now won him American approval. Abraham Lincoln saw Ericsson’s model and famously declared: “All I have to say is what the girl said when she stuck her foot into the stocking. It strikes me there’s something in it.”

The catch: Ericsson was given 100 days.

One hundred days! This was to be a revolutionary vessel in which the crew and engine were to be entirely housed below the water line. If naval weaponry had traditionally been aimed at targets by turning the ship, here a gun turret would rotate on enormous gears, allowing shots in almost any direction. Everything about the Monitor was experimental, but there was no time for experiments. It was built in the Brooklyn Navy Yard, with numerous contractors bringing the ship in on time.

This technological marvel then took on mythic dimensions. On March 8, 1862, the Virginia had steamed into Hampton Roads, not far from its birthplace, and almost effortlessly destroyed two Union ships, the Cumberland and the Congress, mauling them with its iron ram. With 121 men dead on the first and 240 on the second, it was the worst naval defeat for the United States until Pearl Harbor. What would come next? The Confederacy’s triumphal river journey to Washington? The Union Navy had become obsolete — until the next day, when the Monitor met the Virginia in battle.

In the museum a 13-minute wide-screen show, intriguingly composed of animated paintings and maps and aided by lighting and sound effects, recounts the great battle that followed, as these behemoths tested out their gear, each side claiming victory.

This battle is in every elementary-school textbook. About 20,000 people stood on the banks, watching. The clash — chronicled by letters of participants and witnesses — apparently ended in a draw. But the age of sail definitively lost. The Times of London declared that the British Royal Navy had 149 first-class warships before the battle, but “we now have two.” Jules Verne, inspired by the Monitor, wrote “20,000 Leagues Under the Sea,” published in 1870.

As a drama, the encounter could not have been more skillfully plotted: the Union disaster, the last-minute rescue, the celebration. There were Monitor playing cards, hats, scrimshaw and sheet music.

And there were also complaints, because what was ending was not just the technology of sail. The seaman’s center of gravity had changed — which may be why the vessel’s living quarters below the water, reproduced here, were given unusual attention. An entire culture had evolved around sailing and naval warfare, complete with manners and strategies, uniforms and training. Now the action was below the water line. And in combat, there was no more hand-to-hand confrontation or urgent need to know the ropes. This wasn’t really life at sea; it was life in the engine room.

“All the pomp and splendor of naval warfare are gone by,” Hawthorne mournfully wrote. “Henceforth there must come up a race of enginemen and smoke-blackened commoners who will hammer away at their enemies under the direction of a single pair of eyes.”

In his recent book “Ironclad,” Paul Clancy points out that Melville wrote poems about the Monitor, referring to the turret as the seaman’s “welded tomb,” and noting that warriors

Are now but operatives; War’s made

Less grand than Peace.

After their major battle, the deaths of the Virginia and the Monitor seemed to prove Melville’s point. Within days the Virginia, cornered, was run aground and set on fire by the Confederates: a suicide avoiding capture. By the next winter, the Monitor too, in less than glorious circumstances, came to its accidental death in a storm. The Union produced another generation of ironclads, but the Civil War stumbled along its bloody course, undeterred.

A good portion of the museum is devoted to the recent rescue of the Monitor from the sea floor, itself done at great risk. There is a full-size reproduction of the rusted, lichen-encrusted gun turret, just as it was found sunk off the coast of Cape Hatteras, N.C. Outside the museum’s glass wall, a full-scale exterior deck of the Monitor is recreated; inside, a replica of the turret’s mechanism is also reproduced. It will take 15 years to rehabilitate the original turret in tanks filled with 90,000 gallons of water.

So what are these relics, then, that so attract a visitor’s gaze? Here, not far from where the Monitor fought its main battle, the rusted machinery, silver forks, glass bottles, the human-size propeller and interlocking turret gears all seem to offer testimony to a moment when the world changed, when, as with the Civil War itself, something had come to an end, and something else — which could either turn out horrifying or magnificent — had not yet begun.

The U.S.S. Monitor Center is at 100 Museum Drive, Newport News, Va.; (757) 596-2222.

Washington Post
July 11, 1862


New Type of Ship Fights for North
By Elizabeth
HAMPTON ROADS, Va.--Officers of the U.S.S. Monitor displayed their new type of battleship as it lay anchored in the James River in Virginia on July 9, 1862. Four months ago it fought a battle that could change the course of naval warfare forever.

The twelve officers are posing in front of the ship's turret, one of the many new features of this vessel. It can turn, allowing the ship's two cannons to be pointed in any direction. It gave the ship its nickname, "a tin can on a shingle."

Unlike the traditional battleship, which is made of wood, the Monitor is covered with iron. This kind of ship is called an ironclad. That makes it harder for cannon balls to sink the ship.

The ship fought a famous battle just four months ago, in March 1862, against the Merrimac. Both ships were ironclads.

The Merrimac was a Union ship at the beginning of the Civil War. But the Confederates captured it and turned it into an ironclad renamed the C.S.S. Virginia. In common usage, however, it was still called the Merrimac.

On March 8, 1862, the Merrimac won a victory at Hampton Roads, Va., against Union ships that were blockading the Confederate coast.

A Union officer watching the one-sided battle between the Merrimac and one of the Union ships, the Congress, said that the Merrimac "fired shot and shell into her with terrific effect, while the shot from the Congress glanced from her sloping sides without doing any apparent effect."

But the next day, March 9, the Union ironclad, the Monitor, arrived on the scene. The Merrimac and the Monitor fought each other for almost five hours.

Describing the first exchange of gunfire, Lt. Samuel Dana Greene, an officer on the Monitor, said, "The turrets and other parts of the ship were heavily struck, but the shots did not penetrate; the tower was intact and it continued to revolve. A look of confidence passed over the men's faces and we believed the Merrimac would not repeat the work she had accomplished the day before."

Neither ship was able to do much damage to the other ship. The battle was considered a draw.

Although there was no winner, the battle is likely to change the course of naval warfare forever. It has brought worldwide attention to the importance of ironclad ships.

The Monitor was built in less than four months according to the design of a man who is not in the picture. His name was John Ericsson, a Swedish immigrant.

Ericsson's design was unusual and not everyone liked it. But when it was shown to President Lincoln, he said, "All I have to say is what the girl said when she put her foot into the stocking. 'It strikes me there may be something in it.'"

The Union has plans to build other ships designed by John Ericsson called "monitors." They will be ironclad, easy to maneuver, and will have revolving turrets.

The officers of the Monitor include Captain John Lorimer Worden, a young man of 24 with a long beard. He was blinded permanently in one eye by an explosion in the battle.

Lt. Samuel Dana Greene, the second in command, is 22. He took over after Worden was wounded. Another officer was Lt. Thomas Oliver Selfridge Jr.
30278  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Intel Matters on: March 10, 2007, 08:06:52 AM
March 9, 2007 -- "A VERY big fish" -- so Tehran sources describe former Deputy Defense Minister Ali-Reza Askari (sometimes called "Asghari" in the West), who disappeared in Istanbul on Sunday.

Askari's disappearance fits an emerging pattern. Since December, the United States and its allies appear to have moved onto the offensive against the Islamic Republic's networks of influence in the Middle East:
* Jordan has seized 17 Iranian agents, accused of trying to smuggle arms to Hamas, and deported them quietly after routine debriefing.
* A number of Islamic Republic agents have been identified and deported in Pakistan and Tunisia.
* At least six other Iranian agents have been picked up in Gaza, where they were helping Hamas set up armament factories.
* In the past three months, some 30 senior Iranian officials, including at least two generals of the Revolutionary Guards, have been captured in Iraq.
All but five of the Islamic Republic agents seized in Iraq appear to have been released. One of those released was Hassan Abbasi, nicknamed "the Kissinger of Islam," who is believed to be President Mahmoud Ahmadinejad's strategic advisor.

Among those still held by the Americans is one Muhammad Jaafari Sahraroudi, a senior Revolutionary Guard commander wanted by the Austrian police in connection with the murder of three Iranian Kurdish leaders in Vienna in 1989.

All this looks like a message to Tehran that its opponents may be moving onto the offensive, in a revival of Cold War tactics.
But let us return to the "big fish."

A retired two-star general of the Islamic Revolutionary Guard, Askari had just led a military mission to Damascus, the Syrian capital. He was making a private "shopping stopover" in Turkey on his way back to Tehran.

The Iranian mission's task was to lay the foundations for a Syrian armament industry, licensed to manufacture Iranian-designed weapons. The 30 or so experts that had accompanied Askari remained in Syria to work out the technical details.

According to some reports, Askari had stopped over in Istanbul to meet with an unidentified Syrian arms dealer who lives in Paris.

Having at first denied reports of the general's disappearance, Tehran authorities eventually came out with a confirmation. The Islamic Republic's police chief, Gen. Ismail Ahmadi-Muqaddam, issued a statement Tuesday claiming that the missing general had been abducted by a Western intelligence service and taken to "a country in northern Europe."

Foreign Ministry sources in Tehran, however, said that Askari might have defected, possibly to the United States, where he has relatives. Some reports in the Iranian and Arab media suggest that the Israeli secret service Mossad and the CIA are behind Askari's disappearance.

Israel has denied involvement in the general's disappearance, but The London Daily Telegraph speculated on Monday that Askari could have been abducted by Israel to shed light on the whereabouts of Israel Air Force Lt.-Col. Ron Arad, missing since 1986, who might have been held at one point by Iran. Askari was involved in a deal to transfer Arad to Tehran after his capture by the Lebanese Hezbollah.

Iranian Foreign Minister Manouchehr Mottaki was quoted Monday as saying Iran was "taking necessary steps" to solve the case: "A director-general from the [foreign] ministry has traveled to Turkey . . . We have asked Turkey to investigate Askari's case."

According to Iranian sources, Askari, in his late 50s, joined the Islamic Revolutionary Guard (IRG) at its very start in 1979. He was an associate of Mostafa Chamran, a naturalized U.S. citizen of Iranian origin who returned to Iran when the mullahs seized power in 1979 and helped found the IRG. When Chamran was appointed defense minister two years later, Askari became one of his advisers.

Always in the shadows, Askari was in charge of a program to train foreign Islamist militants as part of Tehran's strategy of "exporting" the Khomeinist revolution.

In 1982-83, Askari (along with Ayatollah Ali-Akbar Mohatashami-Pour) founded the Lebanese branch of Hezbollah and helped set up its first military units. The two men supervised the 1983 suicide attacks on the U.S. Embassy and on the U.S. Marine barracks in Beirut - killing more than 300 Americans, including 241 Marines. Iranian sources say Askari was part of a triumvirate of Revolutionary Guard officers that controlled Hezbollah's armed units until the end of the '90s.

Askari led the 500-man Iranian military mission in Beirut from 1998 to 2000 before returning home to work for the Strategic Defense Procurement Committee. In that capacity, he often traveled abroad to negotiate arms deals.

Tehran sources claim that Askari was also involved in Iran's controversial nuclear program, which, although presented as a civilian project, is controlled by the Islamic Revolutionary Guard. They also say that last November he was appointed a member of the Strategic Defense Planning Commission set up by Ali Khamenei, the "Supreme Guide."
Indeed, Iran is rife with rumors about the case: Askari has been transferred to Romania, where he is being debriefed by the Americans; he had documents with him, mostly related to Iran's military purchases abroad; Israeli efforts to see him (in connection with his years of running Hezbollah) have so far failed to meet with success . . .
Whether he defected or was abducted, Askari is a big catch with a mine of information about the activities of the Revolutionary Guard and its elite arm, the Quds Corps, which controls Arab and Turkish radical groups financed by Tehran. Last month, the United States accused the Quds Corps of supplying special projectiles to terrorists in Iraq to kill GIs.
Iranian-born journalist and author Amir Taheri is based in Europe.
A different interpretation:

From: stevewessler
Date: 3/8/2007 6:53:43 PM
Subject: [ASA] Iranian General Reportedly Defects

Iranian General Reportedly Defects
Kenneth R. Timmerman
Wednesday, March 7, 2007

A former high-ranking Iranian government official, Brig. Gen. Alireza
Asghari, 63, has defected to the United States, Iranian exiles and
other sources told Newsmax today.

Asghari had access to highly-classified intelligence information and
"defected to the Americans with lots of secrets," respected Iranian
journalist Alireza Nourizadeh told Newsmax from London.
The disappearance of the former Revolutionary Guards General has
created a panic in Tehran.

Gen. Asghari left Iran on an officially-sanctioned trip to Damascus,
Syria, then went missing during a stop-over in Istanbul, Turkey on
February 7, according to statements by Iranian government officials
in Tehran.

Nourizadeh believes he had been sent to Damascus to supervise an arms
deal between Iran and Syria that was signed last June during a trip
to Tehran by Syria's defense minister.

"It is possible that former deputy defense minister Asghari was
kidnapped by Western intelligence services because of his Defense
Ministry background," the head of Iran's national police, Gen. Ismail
Ahmadi-Moghaddam, said in Tehran yesterday.

But Newsmax has learned from Iranian sources that Gen. Asghari's
family also managed to leave Iran just before he went missing, and
that he sold his house in the Narmak area of Tehran in December.
Both are considered clear indications that he defected and had been
planning his departure for some time.

As a senior member of the general staff of the Revolutionary Guards
Corps, Gen. Asghari had access to highly-classified operational
information, as well as strategic planning documents, said Shahriar
Ahy, an Iranian political analyst based in Washington, D.C. "It will
take them months to know just what they've lost," Ahy told Newsmax.

The damage control investigation could reach the very summit of the
Iranian government because of Gen. Asghari's long-standing personal
relationship to former Defense Minister Admiral Ali Shamkhani, Ahy
said. "The loss of Gen. Asghari will severely hamper the regime's
operations outside the country, because he will pull back the cloth
on what he knows," Ahy said. "Intelligence agents will be called
back, and operations will be put into deep freeze" as the regime
tries to figure out what secrets Asghari compromised.

Gen. Asghari is believed to have detailed knowledge of the
Revolutionary Guards Qods Force units operating in Iraq. He is also
believed to have come out with extensive information on Iran's
clandestine nuclear weapons program, which will make it harder for
Russia and China to come to Iran's defense at the ongoing 6-power
talks on Iran's nuclear program.

From 1989-1993, Gen. Asghari was stationed in Lebanon as Iran's
liaison to Hezbollah. Israeli press accounts have identified him as
the Iranian official who "knows the most" about what happened to
Israeli navigator Ron Arad, who was reportedly "sold" to Iran after
his plane was shot down over southern Lebanon in 1986.

The Iranian regime requires top officials such as Gen. Asghari to
obtain an authorization before they can travel abroad. Gen. Asghari's
10-day trip to Syria was approved by the military judicial
authorities, sources inside Iran told Newsmax. Two days after he
arrived in Damascus, his family managed to leave Iran, the sources
said. The main impediment to defections by high-ranking Iranian
officials is fear that any family members left behind will be
arrested, tortured, and possibly killed.

The Persian-language website Baztab claims that Gen. Asghari's
name was on a CIA "hit list" of twenty former Revolutionary Guards
officers. Baztab is owned by former Revolutionary Guards commander
Gen. Mohsen Rezai, now a top aide to former president Ali Akbar
Hashemi-Rafsanjani. Alireza Nourizadeh, the Iranian journalist based
in London, tells Newsmax that Gen. Asghari planned his defection
carefully. "While he was in Damascus, he sent a fax or an email to
Tehran saying that one of his contacts, who was an arms dealer, was
in Turkey and wanted to meet him," he told Newsmax. "So they gave him
permission to go to Turkey, where he defected."

The Iranian military attaché in Istanbul had reserved a room for Gen.
Asghari at the Continental hotel, Nourizadeh said, but Asghari
complained that it was not safe. Instead, he booked three rooms at
the Gilan Hotel, in the Taksim district which is popular among
Iranians. "After calling a relative in Tehran, he left the hotel at
6:30 PM and disappeared," he said.

During the 1990s, Gen. Asghari was in charge of short- and medium-
range missile projects at the Defense Industries Organization. "He
ran the Nazeat, Fajr, and Zelzal missile programs," Nourizadeh said.
From 1996-1997, he worked on secret nuclear procurement projects, and
traveled frequently to Russia, China, North Korea, and Southeast Asia
buying equipment and parts.

Nourizadeh believes Gen. Asghari defected because he had incurred the
wrath of his superiors in the Defense ministry during a stint as the
Defense Ministry's Inspector General. "He discovered two gangs of
corrupt officials who had embezzled the government for $90 million
and $150 million," Nourizadeh said. "After he exposed them, he was
arrested. He was Mr. Clean."
Eventually, Gen. Asghari was rehabilitated and put to work on the
Iran-Syria arms deals signed last year, but he never forgave his
superiors for orchestrating his fall from power.
30279  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Professor Pino - enough is enough - I've had it. on: March 09, 2007, 09:33:01 PM
Today's Peggy Noonan from the WSJ:


'That's Not Nice'
Our political discourse needs less censorship and more self-discipline.

Friday, March 9, 2007 12:01 a.m. EST

Here is what has been said the past week or so that sparked argument: Bill Maher, on HBO, said a lot of lives would be saved if Vice President Cheney had died, and Ann Coulter, at a conservative political meeting, suggested John Edwards is a "faggot."

She was trying to be funny and get a laugh. He was trying to startle and get applause.

What followed was the predictable kabuki in which politically active groups and individuals feigned dismay as opposed to what many of them really felt, which was grim delight. Conservatives said they were chilled by Mr. Maher's comments, but I don't think they were. They were delighted he revealed what they believe is at the heart of modern liberalism, which is hate.

Liberals amused themselves making believe they were chilled by Ms. Coulter's remarks, but they were not. They were delighted she has revealed what they believe is at the heart of modern conservatism, which is hate.

The truth is many liberals were dismayed by Mr. Maher because he made them look bad, and many conservatives were mad at Ms. Coulter for the same reason.

I realized as I watched it all play out that there's a kind of simple way to know whether something you just heard is something that should not have been said. It is: Did it make you wince? When the Winceometer is triggered, it's an excellent indication that what you just heard is unfortunate and ought not to be repeated.

In both cases, Mr. Maher and Ms. Coulter, when I heard them, I winced. Did you? I thought so. In modern life we wince a lot. It's not the worst thing, but it's better when something makes you smile.

One of the clearest statements ever about the implied limits of legitimate political discourse was made by the imprisoned Socrates in his first dialogue with Crito, when he said, "That's not nice." Actually, it was your grandmother who said "That's not nice." She's the one who probably taught you the wince. It is her wisdom, encapsulated in those three simple words, that is missing from the current debate.
We tie ourselves in knots trying to explain why it is, or why it isn't, always or occasionally, helpful or destructive to use various epithets, or give full voice to our resentments. But the simple wisdom of Grandma-- "That's not nice"--is a good guide. (I should say that when I was a kid, grandmas were older people who had common sense. They had observed something of people, had experienced life directly, not only through books or TV. Almost all of them had religious faith, and had absorbed the teachings of the Bible. Almost all of them sat quietly at the kitchen table, and even when I was a kid they were considered old fashioned. They were often ethnic and had accents. As a matter of fact, all of them were.)

I think that as America has grown more academic or aware of education, the wisdom of Grandma has been denigrated. Or ignored. Or stolen and dressed up as something else. For instance, Rudy Giuliani's success in cleaning up and reviving the city of New York is generally attributed to his embrace of what is called, in academic circles, the broken-window theory. It holds that when criminals see that even small infractions are met and punished, they will understand that larger infractions will be met and punished. It also holds that when neighborhoods deteriorate, criminals are emboldened. People from Harvard won great prizes for these insights.

But all of broken-windows theory comes down to what Grandma always knew and said: "Fix the window or they'll think no one cares! When people think no one cares, they do whatever they want." There was not a single grandmother in America circa 1750-2007 who didn't know this. But no one wants to quote Grandma. She's so yesterday. And her simple teachings have been superseded by more exotic forms of instruction.

Fifty years ago, no one speaking at a respected political gathering would say, would even think of saying that Adlai Stevenson is a faggot. Nor would Arthur Godfrey or Jack Paar have declared on their television shows that we'd be better off if Eisenhower died. Is our discourse deteriorating? Yes, it is.
Part of the reason is that Grandma had more sway in the public sphere 50 years ago, which is to say common sense and a sense of decorum had more sway. Another part is that privately people felt they had more room to think or say whatever they wanted without being shamed or shunned. It let the steam out. We think of the 1950s as buttoned up, but in a way America had more give then. Men were understood not to be angels.

Our country now puts less of an emphasis on public decorum, courtliness, self-discipline, decency. America no longer says, "That's not nice." It doesn't want to make value judgments on "good" and "bad." We have come to rely on censorship to maintain decorum. We are very good at letting people know that if they say something we don't like, we'll shame them and shun them, even ruin them.

But censorship doesn't make people improve themselves; it makes people want to rebel. It tells them to toe the line or pay a price. People who are urged in the right direction and taught in the right direction will usually try to discipline and improve themselves from within. But they do not enjoy censorship from without. They fight back. They are rude in order to show they are unbroken.

This is human. And Grandma would have understood this, too.

I think the atmosphere of political correctness is now experienced by normal people--not people who speak on TV, but normal people--as so oppressive, so demanding of constant self-policing, that when someone says something in public that is truly not nice, not nice at all, they can't help but feel that they are witnessing a prison break.
As long as political correctness reigns, the more antic among us will try to break out with great streams of Tourette's-like forbidden words and ideas.

We should forbid less and demand more. We should exert less pressure from without and encourage more discipline from within. We should ask people to be dignified, hope they'll be generous, expect them to be fair. When they're not, we should correct them. But we shouldn't beat them to a pulp. Because that's not nice.

Ms. Noonan is a contributing editor of The Wall Street Journal and author of "John Paul the Great: Remembering a Spiritual Father" (Penguin, 2005), which you can order from the OpinionJournal bookstore. Her column appears Fridays on OpinionJournal.com.

30280  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Homeland Security on: March 09, 2007, 05:21:45 PM
"the problem is administrative, not operational."

Would someone explain this distinction to me please?
30281  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Afghanistan-Pakistan on: March 09, 2007, 03:14:41 PM
The Afghan guard who stops suicide bombers
A gatekeeper's resolve has earned him the nickname 'Rambo' at a US base in Kabul.
By Mark Sappenfield | Staff writer of The Christian Science Monitor
KABUL, AFGHANISTAN - There is trouble outside Camp Phoenix. The American base on the dusty outskirts of Kabul has called for English translators. The problem is, the Americans have now hired their translator, and a crowd of Afghan job hunters at the camp gate is getting unruly.

The US soldiers are nervous. One yells obscenities and waves his gun. The crowd cowers but doesn't budge. Then, another soldier steps forward, armed only with a thick wooden staff, wrapped in peeling red tape.

The name tag on his broad chest says "Rambo," and though he wears US Army fatigues, he speaks in perfect Dari, ordering the crowd to leave. It reluctantly disperses.

This is a normal day for Rambo, an Afghan who has stood guard here for more than four years, pledging his life to the American soldiers that rid his land of the Taliban. But on Jan. 16, Rambo's gatekeeping made him a bona fide hero.

On that day, Rambo wrenched open the driver's side door of a moving car and wrestled a suicide bomber into submission before he could detonate his explosives. President Bush lauded him in a nationally televised speech several weeks ago, and before that, slightly exaggerated accounts of his feat circled through cyberspace, pleading for America to offer him citizenship or at least a medal.

Dutiful: Four days off in four years

On this gray day, amid the intermittent raindrops of a coming storm, Rambo seems somewhat weary of the story, asking a lieutenant whether he really needs to tell it again. So far as he is concerned, his only job is to protect those American soldiers at the gate. It is why he has taken only four days off in more than four years, even working Fridays, though that is the Muslim day of rest.

But the lieutenant kindly requests Rambo's patience. To Rambo, that is an order. "If you want me to do it, I will do it," he tells her with martial deference.

In fairness, his story is not just about the day he stopped a suicide bomber, when the steel of his resolve to protect American troops became so apparent to all who did not know him. To those who do, who gave him the "Rambo" nickname, the name tag, and the stick, his devotion was already evident.

At every corner of Camp Phoenix, Rambo stops to salute American officers. Soldiers heading out on patrol call out his name as if he were a fraternity brother. He is unquestionably one of them, because he is so willing to make the same sacrifice that they, too, have been called upon to make.

COMMENDATIONS: The Afghan security guard 'Rambo' was praised in a speech by President Bush, and he proudly displays awards in his room at Camp Phoenix, near Kabul, Afghanistan.

Yet he is also unquestionably Afghan, and never more so than when he smothered his countryman and would-be martyr at the front gate. To Rambo, whose name has been withheld for his protection, what happened that day was a matter of pride – a personal pride that burns deeper than love of country, or family, or faith.

"I made a promise to every American soldier," he says in grave tones. "Even if there is only one American soldier, I will be here to protect him."

Amid Camp Phoenix's soil-filled blast walls and bristling guard towers, designed to keep soldiers separate from the unsettled Afghanistan beyond, Rambo is a living lesson in the character of his country, where friends pledge their lives to defend you and enemies never rest until you have been destroyed.

On a clear, chilly Tuesday in mid- January, those two perceptions of the American presence here collided.

How he spotted the suicide bomber

Having spoken for five loving minutes about his well-worn red stick and its many uses in crowd control, the black-bearded Rambo is at last primed to talk about his legendary feat, his dark eyes bright with enthusiasm. He sits on a cold, wooden picnic bench in the Camp Phoenix compound, immune to the freezing rain, his rough and blackened hands working frantically to depict the scene.

When the driver of an off-white sedan did not brake as he approached the gate, Rambo sensed danger. He ran to the door, flung it open, and saw two buttons by the gearshift, each with a wire running to a gas tank that filled the entire back seat.

Before the terrorist could reach the buttons, Rambo seized his hands, and a Security Forces soldier arrived to help. In an instant, it was over.

Later in the day, the car exploded when a demolition team failed to disarm it, but no one was injured.

Before and since the event, Rambo has gotten recognition for his role at Camp Phoenix. In his dark and low-ceilinged room – a nestlike clutter of boxes and badges and potato-chip bags – Rambo displays a letter from the former commander of NATO. There is a framed commendation that bears both the US and Afghan flags, as well as a jumble of military coins given for his service.

In another corner, he uncovers a pile of letters from American soldiers, their wives, and their mothers – one with a lipstick-stained kiss of gratitude. These are his treasures. The thanks he has always received for his service makes his monastic existence worthwhile. Even before Jan. 16, he stayed here from before dawn until after dusk. Now, he lives on the base full time. In fact, he has not been home for three months.


He bears the security measures joyfully. And he doesn't heed the Afghans who roll down their windows and shout obscenities at him as they pass. "I don't care what they say," he says. "I will protect my friends."

Yes, he says, the Americans are here to help hold his country together as it attempts to heal after three decades of misrule and civil war. But more than that, he loves Americans because they have treated him with respect.

"They are good and they have strong hearts," he says.

They have given him this uniform, which is frayed at the cuffs from constant use. They have created a "Rambo fund" to help him get a TV, and have helped two of his sons get jobs. On his shoulder he proudly wears the patches of every unit that has come through Camp Phoenix – each vying for the esteemed piece of real estate that is Rambo's uniform.

"When you think of Camp Phoenix, you think of Rambo," says 1st Lt. John Stephens of 1-180th Infantry Battalion, who is in the midst of his second tour here. "He's the rock of Camp Phoenix."

Taliban rocket killed his wife and child

Rambo's journey to the American side of the war is a simple one. During the days of the Taliban, his wife and one of his children were killed when a rocket crashed into their home. It was not intentional, he says, but it was indicative of the lives ruined by Taliban rule. Moreover, as a member of the Army during a former government, he felt unsafe and eventually fled to Pakistan for refuge.

The fall of the Taliban in 2001 brought him back to Kabul, where he resumed an old job as a truck driver and security guard at a transportation company. When Camp Phoenix commandeered the building used by the transportation company in 2003, Rambo stayed on as a security guard for the new installation. He has been here ever since, and he has been "Rambo" for almost as long.

His handle was the suggestion of a woman who was here during the early days of Camp Phoenix. "I liked Rambo even from before," he says, betraying no knowledge of anyone named Sylvester Stallone, as if Rambo and the actor are synonymous. "Sometimes he is in a movie where he is wild, and sometimes he has a necktie and is very respectable."

Which Rambo is he? "It depends," he says with a smile. "If a polite man comes, I will be a Rambo who is polite and gentle. But if it is Al Qaeda, I will be the wild Rambo."

Soldiers here will vouch for that, telling of instances where Rambo pulled people out of car windows. Back during Communist times, when he was a tank commander, Rambo says that he cut all the medals off the uniform of a superior officer when the officer (falsely, he insists) accused him of not fixing a tank correctly.

Today, he returns to the gate, huddling beside a fire in an old oil drum along with his American colleagues. They are his responsibility, he says, and he is determined not to forsake that trust.

"I don't want to be blamed," he says. "I promised these people a lot. Dying is better than to be blamed."

30282  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Mexico-US matters on: March 09, 2007, 02:40:57 PM

OFFICERS OUTGUNNED ON U.S. BORDER: Violence along the U.S.-Mexico border is undergoing what U.S. law-enforcement authorities call "an unprecedented surge," some of it fueled by weapons and ammunition purchased or stolen in the United States. Federal, state and local law-enforcement officials from Texas to California, concerned about the impact of illegally imported weapons into Mexico, say they already are outmanned and outgunned by ruthless gangs that collect millions of dollars in profits by smuggling aliens and drugs into this country.
30283  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Help our troops/our cause: on: March 09, 2007, 08:39:40 AM
Today's NY Times

WASHINGTON, March 8 — Staff Sgt. Gregory L. Wilson, from the Texas National Guard, waited nearly two years for his veterans’ disability check after he was injured in Iraq. If he had been an active-duty soldier, he would have gotten more help in cutting through the red tape.

Allen Curry of Chicago has fallen behind on his mortgage while waiting nearly two years for his disability check. If he had filed his claim in a state deploying fewer troops than Illinois, Mr. Curry, who was injured by a bomb blast when he was a staff sergeant in the Army Reserve in Iraq, would most likely have been paid sooner and gotten more in benefits.

Veterans face serious inequities in compensation for disabilities depending on where they live and whether they were on active duty or were members of the National Guard or the Reserve, an analysis by The New York Times has found.

Those factors determine whether some soldiers wait nearly twice as long to get benefits from the Department of Veterans Affairs as others, and collect less money, according to agency figures.

“The V.A. is supposed to provide uniform and fair treatment to all,” said Steve Robinson, the director of veteran affairs for Veterans for America. “Instead, the places and services giving the most are getting the least.”

The agency said it was trying to ease the backlog and address disparities by hiring more claims workers, authorizing more overtime and adding claims development centers.

The problems partly stem from the agency’s inability to prepare for predictable surges in demand from certain states or certain categories of service members, say advocates and former department officials. Numerous government reports have highlighted the agency’s backlog of disability claims and called for improvements in shifting resources.

“It’s Actuary Science 101,” said Paul Sullivan, who until last March monitored data on returning veterans for the V.A. “When 5,000 new troops get deployed from California, you can logically expect a percent of them will show up at the V.A. in California in a year with predictable types of problems.”

“It makes no sense to wait until the troop is already back home to start preparing for them,” Mr. Sullivan said. “But that’s what the V.A. does.”

Veterans’ advocates say the types of bureaucratic obstacles recently disclosed at Walter Reed Army Medical Center are eclipsed by those at the Veterans Affairs division that is supposed to pay soldiers for service-related ills. The influx of veterans from the Iraq war has nearly overwhelmed an agency already struggling to meet the health care, disability payment and pension needs of more than three million veterans.

Stephen Meskin, who retired last year as the V.A.’s chief actuary, said he had repeatedly urged agency managers to track data so they could better meet the needs of former soldiers. “Where are the new vets showing up?” Mr. Meskin said he kept asking. “They just shrugged.”

Agency officials say they have begun an aggressive oversight effort to determine if all disability claims are being properly processed and contracted for a study that will examine state-by-state differences in average disability compensation payments.

“V.A.’s focus is to assure consistent application of the regulations governing V.A. disability determinations in all states,” the department said in a written statement.

Many new veterans say they are often left waiting for months or years, wondering if they will be taken care of.

Unable to work because of post-traumatic stress disorder and back injuries from a bomb blast in Iraq in 2004, Specialist James Webb of the Army ran out of savings while waiting 11 months for his claim. In the fall of 2005, Mr. Webb said, he began living on the streets in Decatur, Ga., a state that has the 10th-largest backlog of claims in the country.

“I should have just gone home to be with family instead of trying to do it on my own,” said Mr. Webb, who received a Bronze Star for his service in Iraq. “But with the post-traumatic stress disorder, I just didn’t want any relationships.”

After waiting 11 months, he began receiving his $869 monthly disability check and he moved into a house in Newnan, Ga. About three weeks ago, Mr. Webb moved back home to live with his parents in Kingsport, Tenn.

The backlogs are worst in some states sending the most troops, and discrepancies exist in pay levels.

Illinois, which has deployed the sixth-highest number of soldiers of any state, has the second-largest backlog. The average disability payment for Illinois veterans — $7,803 a year — is among the lowest in the nation, according to 2005 V.A. data.


In Pennsylvania, which has sent the fourth-highest number of troops, the claims office in Pittsburgh is tied for second for longest backlogs, where 4 out of 10 claims have been pending for more than six months. Veterans from this state on average receive relatively low payments, $8,268 per year, according to 2005 V.A. data. Comparable 2006 data were not available.

The agency’s inspector general in 2005 examined geographic variations in how much veterans are paid for disabilities, finding that demographic factors, like the average age of each state’s veteran population, played roles. But the report also pointed to the subjective way that claims processors in each state determined level of disability.

Staffing levels at the veterans agency vary widely and have not kept pace with the increased demand. The current inventory of disability claims rose to 378,296 by the end of the 2006 fiscal year. The claims from returning war veterans plus those from previous periods increased by 39 percent from 2000 to 2006. During the same period, the staff for handling claims has remained relatively flat, a problem the department highlighted in its 2008 proposed budget. The department expects to receive about 800,000 new claims in each of 2007 and 2008.

“It’s clear to everyone here that the system over all is struggling and some veterans are waiting far too long for decisions,” Senator Larry E. Craig, Republican of Idaho, said Wednesday at a hearing before the Senate veterans affairs committee.

The growing strains on the veterans agency have affected some soldiers more than others.

While the Reserve and National Guard have sent a disproportionate number of soldiers to the war, the average annual disability payment for those troops is $3,603, based on 2006 V.A. data for unmarried veterans with no dependents. Active-duty soldiers on average receive $4,962.

Though the V.A. acknowledged that there were discrepancies, officials also said they believed that a significant factor might be length of service. Active-duty soldiers generally serve longer, and therefore more suffer from chronic diseases or disabilities that develop over time. Many who served in the Guard think they are losing the battle against the bureaucracy.

“We take a harder toll,” said Mr. Wilson, the Texan, referring to the fate of reservists and Guard troops compared with active duty soldiers.

He said that last month he received his disability check for his back injuries but only after a 21-month wait and the intervention of a congressman and a colonel.

When active-duty soldiers near discharge, they have access to far more programs offering assistance with benefits than do reserve and National Guard soldiers, according to veterans’ advocates.

“The active-duty guys, they get those resources,” Mr. Wilson said. “We don’t.”

He said that while active-duty soldiers often received medical disability evaluations in about 30 days, many reservists he knew waited two years or more to get an initial appointment. Active-duty personnel also routinely received legal advice about appeals and other issues from military lawyers, while reservists had to request those hearings, he said.

For years, the V.A.’s inspector general, the Government Accountability Office, members of Congress and veterans’ advocates have pointed out the need to improve how the V.A. tracks data on soldiers as they are deployed and when they are injured. That would help prepare for their future needs and ease delays in processing health and benefit claims.

In 2004, a system was designed to track soldiers better, prepare for surges in demand and avoid backlogs. But the system was shelved by program officials under Secretary Jim Nicholson for financial and logistical reasons, V.A. officials said Thursday at a hearing before the House Veterans Affairs Committee.

The V.A., which has said it has an alternate tracking system nearly operational, depends on paper files and lacks the ability to download Department of Defense records into its computers.

President Bush has appointed a commission to investigate problems at military and veterans hospitals.

For Mr. Curry, the reservist from Chicago who has fallen behind on his mortgage payments, his previous life as a $60,000-a-year postal worker is a fading memory. “It’s just disheartening,” he said. “You feel like giving up sometimes.”


Richard G. Jones contributed reporting from Trenton, Bob Driehaus from Cincinnati, and Sean D. Hamill from Pittsburgh.

30284  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Iraq on: March 09, 2007, 08:28:41 AM

1212 GMT -- IRAQ -- U.S. forces in Iraq captured 16 suspected al Qaeda militants who allegedly were responsible for numerous suicide bombings, kidnappings and beheadings, the U.S. military said March 9. Six insurgents, including an al Qaeda leader known as "the Butcher" because of his involvement in beheadings, were captured and one was killed in an early morning raid in the northern city of Mosul. Two more were captured in Al Fallujah and eight were apprehended near Karmah.
30285  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Afghanistan-Pakistan on: March 08, 2007, 03:03:10 PM
AFGHANISTAN: Fugitive Afghan militant leader Gulbuddin Hekmatyar said his forces have stopped cooperating with the Taliban, and suggested that he is open to talks with Afghan President Hamid Karzai. Hekmatyar told The Associated Press in a video response to questions that his group contacted Taliban leaders in 2003 and agreed to wage a joint holy war against U.S. troops. He did not say when the split occurred, but that "certain elements among the Taliban rejected the idea of a joint struggle against the aggressor." Hekmatyar said his forces are now mounting only restricted operations, partly because of a lack of resources.
30286  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Nuclear War? on: March 08, 2007, 03:01:28 PM
That seems to be a rather broad brush that would get quite a few people who really had nothing to do with it, not to mention contaminating a goodly portion of the planet on which we live for quite a long time and generally greatly irritating the neighbors downwind of the mushroom cloud.

Any chance of your narrowing it down a tad?

Or is the problem precisely that we wouldn't know who did it or where they could be found? Which then reduces us to "Kill 'em all and let God sort it out"? Is this what our strategy should be?  How do you think this would play with the American people?  Anyone who then saw an American abroad?
30287  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: WW3 on: March 08, 2007, 02:49:34 PM

 March 8, 2007 -- IMAGINE the reaction if Western agents slaughtered a hundred Sunni pilgrims on their way to Mecca. The outrage would spark incendiary rhetoric, riots and revenge killings from Peshawar to Paris.
But when Sunni suicide bombers murdered 118 Shia pilgrims (and wounded almost 200 more) on Tuesday, Sunnis around the globe looked away: Shias only count as Muslims when America can be blamed for their suffering.

Many of those Shia victims of religious totalitarianism were traveling on foot to Karbala to honor Mohammed's grandson Hussein - who was butchered by the founders of Sunni Islam, to whom power was worth more than the Prophet's family.

The hatred goes deep.

The Sunni Arab campaign against Shias isn't just a struggle for political advantage: It reflects an impulse to genocide. And it makes a grim joke of claims of Muslim unity.

The Tuesday atrocities, followed by smaller-scale attacks on more pilgrims yesterday, were meant to be as outrageous as possible. They not only underscored the hatred Sunni extremists feel toward all Shias, but had the immediate goal of provoking Muqtada al-Sadr's Shia militia to retaliate.

The Sunni insurgents and their foreign-terrorist allies are worried. The recent effort by American and Iraqi forces to pacify Baghdad has shown early signs of success. Wary of tangling with our troops again, Sadr's Mahdi Army has been lying low, while the Sunni extremists have taken heavy losses.

The Sunnis want the Shias back in the fight.

Why? Because they want to disrupt the Baghdad security plan. Because they want to deepen the reawakened hatred between Iraq's religious communities. And because they yearn for a regional conflict that would "put Shias back in their place."

So they slaughtered more than a hundred pilgrims - men, women and children; young and old - in Allah's name.

Where was the outcry?

Human-rights groups were too busy applauding European requests for the extradition of CIA operatives (the real enemies of Western civilization, of course). Since this butchery wasn't the fault of Americans or Brits, the Europeans themselves took no interest.

American leftists, who raved that Abu Ghraib was another Auschwitz, didn't offer a single word of pity for the Muslim victims of Muslims.

All to be expected.

But shouldn't Muslims have denounced the attacks on the pilgrims? Shouldn't such an atrocity have sparked Arab anger that transcended Islam's internal divide? After all, those murdered Shias were fellow Arabs, not Persians.

Where were the public statements of sympathy by government ministers and mullahs? Where was the noble Arab media? Where are the outraged demonstrations?

Not only is Islamic unity a sham, the Middle East's hypocrisy stinks like a shallow grave. Sunnis regard Shias as Untermenschen. No Sunni government wants to see Shias receive a fair deal - in Iraq or anywhere else.

In the short term, the question is whether Shias will take the bait and retaliate against Sunni Arab civilians in Iraq. The Baghdad government is doing its best to calm the furious Shia community. We'll just have to wait and see what happens.

But the greater, long-term danger is one this column has highlighted before: The administration's rush back into the arms of the Saudis and other America-hating Sunni Arab governments is a colossal strategic mistake.

The moral issues are bad enough: To the Saudi royal family, dead Shias aren't tragedies - they're trophies. One almost expects those bloated, bigoted princes to organize Shia-hunting safaris the way they slaughter endangered species when vacationing in impoverished African countries (been there, seen that).

The strategic catastrophe that would result from a return to our wretched mistakes of the 20th century would cost us dearly. When picking allies in the Middle East, we've been on the wrong side of history for over a half-century. And now the Saudis are waging a propaganda campaign to convince American opinion-makers that they're our best pals in the whole, wide world.

It works. An honorable elder statesman I respect recently got suckered during a junket to Saudi Arabia. He left Riyadh convinced he'd been sitting down with our indispensable allies.

Well, the view I've seen with my own eyes - in dozens of Muslim and mixed-faith countries - is of Saudi money spent lavishly to divide struggling societies, to block social and educational progress for Muslims and to preach deadly hatred toward the West.

Until 9/11, the Saudis got away with their extremist filth in this country, too. And Saudi-funded mosques here still seek to prevent Muslims from integrating into American society.

The Saudis, not the Iranians, are the worst anti-American hate-mongers in the world today. When our dignitaries visit Prince Bandar and his buddies, they get the (literal) royal treatment. But in the slums of Mombasa or Cairo, in Lahore, Delhi and Istanbul, the Saudis do everything in their power to make Muslims hate us.

After the suicide attacks on those pilgrims, did any member of the Saudi royal family visit the kingdom's own oppressed Shias to express sympathy and Muslim solidarity?

Our relationship with the Saudis reminds me of the scene in the film "The Shining" when Jack Nicholson's character imagines he's embracing a beautiful woman only to open his eyes and find himself smooching a decomposing corpse. It's time for Washington's Saudi-lovers to open their eyes.

By the way: The two suicide bombers who killed those pilgrims were Saudis.

Ralph Peters' latest book is "Never Quit the Fight."

30288  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Is there something they are not telling us? on: March 08, 2007, 01:41:08 PM
LAX passenger hides objects in his body; bomb squad called

By Andrew Blankstein

Los Angeles Times

Authorities called in the bomb squad early Tuesday and diverted a flight to Las Vegas after Los Angeles International Airport security screeners found hidden wires and other objects in a body cavity of a Philadelphia-bound passenger.

Fadhel Al-Maliki, a 35-year-old Iraqi national living in Atlantic City, N.J., had been flagged by security officials at LAX and was undergoing a secondary "selectee screening" when he set off a metal detector.

Al-Maliki, a former security guard, told screeners that he knew what had triggered the alarm and proceeded to remove items from his rectum, including a rock, chewing gum and a thin wire filament.

Fetters, federal security director at LAX, said at a news conference that Transportation Security Administration officers had become alarmed because Al-Maliki was acting strangely but initially refused to identify the items he had hidden.

Concern that the objects might be components for an explosive device led airport authorities to call in the Los Angeles Police Department and FBI bomb technicians as well as a hazardous material team.

A preliminary investigation appeared to rule out a theory that Al-Maliki may have been looking for weaknesses in security or was rehearsing for a terrorist act, federal and local law enforcement authorities said.

During questioning, Al-Maliki said the objects in his rectum were used to alleviate stress, federal law enforcement sources said.

The rock, authorities said he told them, was from another planet.

As Al-Maliki was being detained, his two bags were loaded onto US Airways Flight 1422, which took off for Philadelphia with 143 passengers and six crew members on board, said Liz Landau, a spokeswoman for the airline.

Federal officials said the bags had been checked for explosives, chemicals and other hazardous materials using the most modern and extensive screening devices available. Even so, they diverted the aircraft to McCarran International Airport in Las Vegas "out of an abundance of caution."

There, passengers were taken off the plane, which was parked away from the terminal. Passengers had to leave their carry-on bags aboard, and the plane and their luggage were searched, Landau said.

Federal officials also said a search of Al-Maliki's luggage turned up nothing "hazardous or illegal." "Based on our investigation, there was no threat to Los Angeles International Airport or the airports in Las Vegas or Philadelphia," said Ethel McGuire, the FBI assistant special agent in charge of the Joint Terrorism Task Force.

Airport police briefly blocked access to roads leading to LAX and diverted vehicle traffic. But no other flights were disrupted at the airport, and Terminal 1, the building used by Southwest Airlines and US Airways, remained open.

After several hours of questioning, the FBI determined that Al-Maliki had not committed a crime, but he was turned over to U.S. Immigration and Customs Enforcement.

At Tuesday afternoon's news conference, authorities said that Al-Maliki had been in the United States legally since 1994 but that federal officials were reviewing his immigration status because he may have outdated information on his green card.

Law enforcement sources said Al-Maliki previously served time in jail for criminal trespassing in Atlantic City.

In addition, he was arrested on suspicion of possession of a destructive device, but the sources said charges were dropped; details of the incident were unavailable.

A law enforcement source close to the investigation said Al-Maliki spent only a day in Los Angeles, arriving Monday afternoon after taking a flight from Philadelphia.

Copyright 2007 Los Angeles Times
All Rights Reserved
30289  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Nuclear War? on: March 08, 2007, 01:35:45 PM
Is it that easy?

Let's say NY or Chicago or Los Angeles or DC gets hit with a small nuke or a major dirty bomb and no one (or everyone!) takes credit. Whom do you think we should hit?
30290  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: The 2008 Presidential Race on: March 08, 2007, 01:30:07 PM
"Consider a contrast between the two front-runners for their respective party's nomination. A strong argument can be made that the shortcomings and vulnerabilities of Sen. Hillary Rodham Clinton, D-N.Y., are well known to virtually all; on Wall Street, they would say her numbers have already been discounted for her negatives... For Giuliani, the story is quite different. A cursory glance at not just Giuliani's stands on social and cultural issues, but also his complicated marital and personal life and the circumstances around his ability to avoid being drafted during the Vietnam War reveal ominous warning signals... And that is before discussing his support for gun control measures while he was mayor of New York City or mentioning that the first of his three marriages was to his second cousin and that one wife found out from a televised news conference that he was leaving her. The list could go on and on. Can he still win the GOP nomination? My guess remains no" -- political handicapper Charlie Cook, writing at
30291  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Part Three on: March 08, 2007, 01:05:43 PM

“I knew I couldn’t be a novelist,” said Wilson, who crackled with intensity during a telephone interview, “so I chose something as far as possible from literature — I chose science.” He is disarmingly honest about what motivated him: “I was very ambitious, and I wanted to make a mark.” He chose to study human evolution, he said, in part because he had some of his father’s literary leanings and the field required a novelist’s attention to human motivations, struggles and alliances — as well as a novelist’s flair for narrative.
Wilson eventually chose to study religion not because religion mattered to him personally — he was raised in a secular Protestant household and says he has long been an atheist — but because it was a lens through which to look at and revivify a branch of evolution that had fallen into disrepute. When Wilson was a graduate student at Michigan State University in the 1970s, Darwinians were critical of group selection, the idea that human groups can function as single organisms the way beehives or anthills do. So he decided to become the man who rescued this discredited idea. “I thought, Wow, defending group selection — now, that would be big,” he recalled. It wasn’t until the 1990s, he said, that he realized that “religion offered an opportunity to show that group selection was right after all.”

Dawkins once called Wilson’s defense of group selection “sheer, wanton, head-in-bag perversity.” Atran, too, has been dismissive of this approach, calling it “mind blind” for essentially ignoring the role of the brain’s mental machinery. The adaptationists “cannot in principle distinguish Marxism from monotheism, ideology from religious belief,” Atran wrote. “They cannot explain why people can be more steadfast in their commitment to admittedly counterfactual and counterintuitive beliefs — that Mary is both a mother and a virgin, and God is sentient but bodiless — than to the most politically, economically or scientifically persuasive account of the way things are or should be.”

Still, for all its controversial elements, the narrative Wilson devised about group selection and the evolution of religion is clear, perhaps a legacy of his novelist father. Begin, he says, with an imaginary flock of birds. Some birds serve as sentries, scanning the horizon for predators and calling out warnings. Having a sentry is good for the group but bad for the sentry, which is doubly harmed: by keeping watch, the sentry has less time to gather food, and by issuing a warning call, it is more likely to be spotted by the predator. So in the Darwinian struggle, the birds most likely to pass on their genes are the nonsentries. How, then, could the sentry gene survive for more than a generation or two?

To explain how a self-sacrificing gene can persist, Wilson looks to the level of the group. If there are 10 sentries in one group and none in the other, 3 or 4 of the sentries might be sacrificed. But the flock with sentries will probably outlast the flock that has no early-warning system, so the other 6 or 7 sentries will survive to pass on the genes. In other words, if the whole-group advantage outweighs the cost to any individual bird of being a sentry, then the sentry gene will prevail.
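The arithmetic in that paragraph can be made concrete with a toy expected-value calculation. This is only a sketch with made-up numbers (flock size, loss probabilities), not anything drawn from Wilson's actual model:

```python
# Toy expected-value model of the sentry trade-off described above.
# Every number here (flock sizes, probabilities) is an illustrative
# assumption, not data from the article.

def expected_survivors(flock_size, n_sentries, p_sentry_lost, p_flock_wiped):
    """Expected number of birds left after one predator encounter.

    p_sentry_lost: chance each sentry is picked off for sounding the alarm.
    p_flock_wiped: chance the whole flock is destroyed in the attack.
    """
    survivors_if_flock_lives = flock_size - n_sentries * p_sentry_lost
    return (1 - p_flock_wiped) * survivors_if_flock_lives

# Flock A: 10 sentries out of 30. Each sentry runs a 35% risk of being
# taken, but the early warning cuts the flock-wipeout risk to 10%.
with_sentries = expected_survivors(30, 10, 0.35, 0.10)

# Flock B: no sentries. No individual cost, but a 50% wipeout risk.
without_sentries = expected_survivors(30, 0, 0.0, 0.50)

print(with_sentries, without_sentries)
```

Under these made-up numbers the flock carrying the costly sentry gene ends up larger on average (about 23.9 birds versus 15), which is the whole-group advantage Wilson invokes: the gene loses at the individual level but wins at the group level.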

There are costs to any individual of being religious: the time and resources spent on rituals, the psychic energy devoted to following certain injunctions, the pain of some initiation rites. But in terms of intergroup struggle, according to Wilson, the costs can be outweighed by the benefits of being in a cohesive group that out-competes the others.

There is another element here too, unique to humans because it depends on language. A person’s behavior is observed not only by those in his immediate surroundings but also by anyone who can hear about it. There might be clear costs to taking on a role analogous to the sentry bird — a person who stands up to authority, for instance, risks losing his job, going to jail or getting beaten by the police — but in humans, these local costs might be outweighed by long-distance benefits. If a particular selfless trait enhances a person’s reputation, spread through the written and spoken word, it might give him an advantage in many of life’s challenges, like finding a mate. One way that reputation is enhanced is by being ostentatiously religious.

“The study of evolution is largely the study of trade-offs,” Wilson wrote in “Darwin’s Cathedral.” It might seem disadvantageous, in terms of foraging for sustenance and safety, for someone to favor religious over rationalistic explanations that would point to where the food and danger are. But in some circumstances, he wrote, “a symbolic belief system that departs from factual reality fares better.” For the individual, it might be more adaptive to have “highly sophisticated mental modules for acquiring factual knowledge and for building symbolic belief systems” than to have only one or the other, according to Wilson. For the group, it might be that a mixture of hardheaded realists and symbolically minded visionaries is most adaptive and that “what seems to be an adversarial relationship” between theists and atheists within a community is really a division of cognitive labor that “keeps social groups as a whole on an even keel.”

Even if Wilson is right that religion enhances group fitness, the question remains: Where does God come in? Why is a religious group any different from groups for which a fitness argument is never even offered — a group of fraternity brothers, say, or Yankees fans?

Richard Sosis, an anthropologist with positions at the University of Connecticut and Hebrew University of Jerusalem, has suggested a partial answer. Like many adaptationists, Sosis focuses on the way religion might be adaptive at the individual level. But even adaptations that help an individual survive can sometimes play themselves out through the group. Consider religious rituals.

“Religious and secular rituals can both promote cooperation,” Sosis wrote in American Scientist in 2004. But religious rituals “generate greater belief and commitment” because they depend on belief rather than on proof. The rituals are “beyond the possibility of examination,” he wrote, and a commitment to them is therefore emotional rather than logical — a commitment that is, in Sosis’s view, deeper and more long-lasting.

Rituals are a way of signaling a sincere commitment to the religion’s core beliefs, thereby earning loyalty from others in the group. “By donning several layers of clothing and standing out in the midday sun,” Sosis wrote, “ultraorthodox Jewish men are signaling to others: ‘Hey! Look, I’m a haredi’ — or extremely pious — ‘Jew. If you are also a member of this group, you can trust me because why else would I be dressed like this?’ ” These “signaling” rituals can grant the individual a sense of belonging and grant the group some freedom from constant and costly monitoring to ensure that their members are loyal and committed. The rituals are harsh enough to weed out the infidels, and both the group and the individual believers benefit.

In 2003, Sosis and Bradley Ruffle of Ben Gurion University in Israel sought an explanation for why Israel’s religious communes did better on average than secular communes in the wake of the economic crash of most of the country’s kibbutzim. They based their study on a standard economic game that measures cooperation. Individuals from religious communes played the game more cooperatively, while those from secular communes tended to be more selfish. It was the men who attended synagogue daily, not the religious women or the less observant men, who showed the biggest differences. To Sosis, this suggested that what mattered most was the frequent public display of devotion. These rituals, he wrote, led to greater cooperation in the religious communes, which helped them maintain their communal structure during economic hard times.

In 1997, Stephen Jay Gould wrote an essay in Natural History that called for a truce between religion and science. “The net of science covers the empirical universe,” he wrote. “The net of religion extends over questions of moral meaning and value.” Gould was emphatic about keeping the domains separate, urging “respectful discourse” and “mutual humility.” He called the demarcation “nonoverlapping magisteria,” from the Latin magister, meaning “teacher.”

Richard Dawkins had a history of spirited arguments with Gould, with whom he disagreed about almost everything related to the timing and focus of evolution. But he reserved some of his most venomous words for nonoverlapping magisteria. “Gould carried the art of bending over backward to positively supine lengths,” he wrote in “The God Delusion.” “Why shouldn’t we comment on God, as scientists? . . . A universe with a creative superintendent would be a very different kind of universe from one without. Why is that not a scientific matter?”

The separation, other critics said, left untapped the potential richness of letting one worldview inform the other. “Even if Gould was right that there were two domains, what religion does and what science does,” says Daniel Dennett (who, despite his neo-atheist label, is not as bluntly antireligious as Dawkins and Harris are), “that doesn’t mean science can’t study what religion does. It just means science can’t do what religion does.”

The idea that religion can be studied as a natural phenomenon might seem to require an atheistic philosophy as a starting point. Not necessarily. Even some neo-atheists aren’t entirely opposed to religion. Sam Harris practices Buddhist-inspired meditation. Daniel Dennett holds an annual Christmas sing-along, complete with hymns and carols that are not only harmonically lush but explicitly pious.

And one prominent member of the byproduct camp, Justin Barrett, is an observant Christian who believes in “an all-knowing, all-powerful, perfectly good God who brought the universe into being,” as he wrote in an e-mail message. “I believe that the purpose for people is to love God and love each other.”

At first blush, Barrett’s faith might seem confusing. How does his view of God as a byproduct of our mental architecture coexist with his Christianity? Why doesn’t the byproduct theory turn him into a skeptic?

“Christian theology teaches that people were crafted by God to be in a loving relationship with him and other people,” Barrett wrote in his e-mail message. “Why wouldn’t God, then, design us in such a way as to find belief in divinity quite natural?” Having a scientific explanation for mental phenomena does not mean we should stop believing in them, he wrote. “Suppose science produces a convincing account for why I think my wife loves me — should I then stop believing that she does?”

What can be made of atheists, then? If the evolutionary view of religion is true, they have to work hard at being atheists, to resist slipping into intrinsic habits of mind that make it easier to believe than not to believe. Atran says he faces an emotional and intellectual struggle to live without God in a nonatheist world, and he suspects that is where his little superstitions come from, his passing thought about crossing his fingers during turbulence or knocking on wood just in case. It is like an atavistic theism erupting when his guard is down. The comforts and consolations of belief are alluring even to him, he says, and probably will become more so as he gets closer to the end of his life. He fights it because he is a scientist and holds the values of rationalism higher than the values of spiritualism.

This internal push and pull between the spiritual and the rational reflects what used to be called the “God of the gaps” view of religion. The presumption was that as science was able to answer more questions about the natural world, God would be invoked to answer fewer, and religion would eventually recede. Research about the evolution of religion suggests otherwise. No matter how much science can explain, it seems, the real gap that God fills is an emptiness that our big-brained mental architecture interprets as a yearning for the supernatural. The drive to satisfy that yearning, according to both adaptationists and byproduct theorists, might be an inevitable and eternal part of what Atran calls the tragedy of human cognition.

Robin Marantz Henig, a contributing writer, has written recently for the magazine about the neurobiology of lying and about obesity.
30292  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Science vs. God Part Two on: March 08, 2007, 01:04:39 PM
Folkpsychology, as Atran and his colleagues see it, is essential to getting along in the contemporary world, just as it has been since prehistoric times. It allows us to anticipate the actions of others and to lead others to believe what we want them to believe; it is at the heart of everything from marriage to office politics to poker. People without this trait, like those with severe autism, are impaired, unable to imagine themselves in other people’s heads.

The process begins with positing the existence of minds, our own and others’, that we cannot see or feel. This leaves us open, almost instinctively, to belief in the separation of the body (the visible) and the mind (the invisible). If you can posit minds in other people that you cannot verify empirically, suggests Paul Bloom, a psychologist and the author of “Descartes’ Baby,” published in 2004, it is a short step to positing minds that do not have to be anchored to a body. And from there, he said, it is another short step to positing an immaterial soul and a transcendent God.

The traditional psychological view has been that until about age 4, children think that minds are permeable and that everyone knows whatever the child himself knows. To a young child, everyone is infallible. All other people, especially Mother and Father, are thought to have the same sort of insight as an all-knowing God.

But at a certain point in development, this changes. (Some new research suggests this might occur as early as 15 months.) The “false-belief test” is a classic experiment that highlights the boundary. Children watch a puppet show with a simple plot: John comes onstage holding a marble, puts it in Box A and walks off. Mary comes onstage, opens Box A, takes out the marble, puts it in Box B and walks off. John comes back onstage. The children are asked, Where will John look for the marble?

Very young children, or autistic children of any age, say John will look in Box B, since they know that’s where the marble is. But older children give a more sophisticated answer. They know that John never saw Mary move the marble and that as far as he is concerned it is still where he put it, in Box A. Older children have developed a theory of mind; they understand that other people sometimes have false beliefs. Even though they know that the marble is in Box B, they respond that John will look for it in Box A.

The adaptive advantage of folkpsychology is obvious. According to Atran, our ancestors needed it to survive their harsh environment, since folkpsychology allowed them to “rapidly and economically” distinguish good guys from bad guys. But how did folkpsychology — an understanding of ordinary people’s ordinary minds — allow for a belief in supernatural, omniscient minds? And if the byproduct theorists are right and these beliefs were of little use in finding food or leaving more offspring, why did they persist?

Atran ascribes the persistence to evolutionary misdirection, which, he says, happens all the time: “Evolution always produces something that works for what it works for, and then there’s no control for however else it’s used.” On a sunny weekday morning, over breakfast at a French cafe on upper Broadway, he tried to think of an analogy and grinned when he came up with an old standby: women’s breasts. Because they are associated with female hormones, he explained, full breasts indicate a woman is fertile, and the evolution of the male brain’s preference for them was a clever mating strategy. But breasts are now used for purposes unrelated to reproduction, to sell anything from deodorant to beer. “A Martian anthropologist might look at this and say, ‘Oh, yes, so these breasts must have somehow evolved to sell hygienic stuff or food to human beings,’ ” Atran said. But the Martian would, of course, be wrong. Equally wrong would be to make the same mistake about religion, thinking it must have evolved to make people behave a certain way or feel a certain allegiance.

That is what most fascinated Atran. “Why is God in there?” he wondered.

The idea of an infallible God is comfortable and familiar, something children readily accept. You can see this in the experiment Justin Barrett conducted recently — a version of the traditional false-belief test but with a religious twist. Barrett showed young children a box with a picture of crackers on the outside. What do you think is inside this box? he asked, and the children said, “Crackers.” Next he opened it and showed them that the box was filled with rocks. Then he asked two follow-up questions: What would your mother say is inside this box? And what would God say?

As earlier theory-of-mind experiments already showed, 3- and 4-year-olds tended to think Mother was infallible, and since the children knew the right answer, they assumed she would know it, too. They usually responded that Mother would say the box contained rocks. But 5- and 6-year-olds had learned that Mother, like any other person, could hold a false belief in her mind, and they tended to respond that she would be fooled by the packaging and would say, “Crackers.”

And what would God say? No matter what their age, the children, who were all Protestants, told Barrett that God would answer, “Rocks.” This was true even for the older children, who, as Barrett understood it, had developed folkpsychology and had used it when predicting a wrong response for Mother. They had learned that, in certain situations, people could be fooled — but they had also learned that there is no fooling God.

The bottom line, according to byproduct theorists, is that children are born with a tendency to believe in omniscience, invisible minds, immaterial souls — and then they grow up in cultures that fill their minds, hard-wired for belief, with specifics. It is a little like language acquisition, Paul Bloom says, with the essential difference that language is a biological adaptation and religion, in his view, is not. We are born with an innate facility for language but the specific language we learn depends on the environment in which we are raised. In much the same way, he says, we are born with an innate tendency for belief, but the specifics of what we grow up believing — whether there is one God or many, whether the soul goes to heaven or occupies another animal after death — are culturally shaped.

Whatever the specifics, certain beliefs can be found in all religions. Those that prevail, according to the byproduct theorists, are those that fit most comfortably with our mental architecture. Psychologists have shown, for instance, that people attend to, and remember, things that are unfamiliar and strange, but not so strange as to be impossible to assimilate. Ideas about God or other supernatural agents tend to fit these criteria. They are what Pascal Boyer, an anthropologist and psychologist, called “minimally counterintuitive”: weird enough to get your attention and lodge in your memory but not so weird that you reject them altogether. A tree that talks is minimally counterintuitive, and you might believe it as a supernatural agent. A tree that talks and flies and time-travels is maximally counterintuitive, and you are more likely to reject it.

Atran, along with Ara Norenzayan of the University of British Columbia, studied the idea of minimally counterintuitive agents earlier this decade. They presented college students with lists of fantastical creatures and asked them to choose the ones that seemed most “religious.” The convincingly religious agents, the students said, were not the most outlandish — not the turtle that chatters and climbs or the squealing, flowering marble — but those that were just outlandish enough: giggling seaweed, a sobbing oak, a talking horse. Giggling seaweed meets the requirement of being minimally counterintuitive, Atran wrote. So does a God who has a human personality except that he knows everything or a God who has a mind but has no body.

It is not enough for an agent to be minimally counterintuitive for it to earn a spot in people’s belief systems. An emotional component is often needed, too, if belief is to take hold. “If your emotions are involved, then that’s the time when you’re most likely to believe whatever the religion tells you to believe,” Atran says. Religions stir up emotions through their rituals — swaying, singing, bowing in unison during group prayer, sometimes working people up to a state of physical arousal that can border on frenzy. And religions gain strength during the natural heightening of emotions that occurs in times of personal crisis, when the faithful often turn to shamans or priests. The most intense personal crisis, for which religion can offer powerfully comforting answers, is when someone comes face to face with mortality.

In John Updike’s celebrated early short story “Pigeon Feathers,” 14-year-old David spends a lot of time thinking about death. He suspects that adults are lying when they say his spirit will live on after he dies. He keeps catching them in inconsistencies when he asks where exactly his soul will spend eternity. “Don’t you see,” he cries to his mother, “if when we die there’s nothing, all your sun and fields and what not are all, ah, horror? It’s just an ocean of horror.”

The story ends with David’s tiny revelation and his boundless relief. The boy gets a gun for his 15th birthday, which he uses to shoot down some pigeons that have been nesting in his grandmother’s barn. Before he buries them, he studies the dead birds’ feathers. He is amazed by their swirls of color, “designs executed, it seemed, in a controlled rapture.” And suddenly the fears that have plagued him are lifted, and with a “slipping sensation along his nerves that seemed to give the air hands, he was robed in this certainty: that the God who had lavished such craft upon these worthless birds would not destroy His whole Creation by refusing to let David live forever.”

Fear of death is an undercurrent of belief. The spirits of dead ancestors, ghosts, immortal deities, heaven and hell, the everlasting soul: the notion of spiritual existence after death is at the heart of almost every religion. According to some adaptationists, this is part of religion’s role, to help humans deal with the grim certainty of death. Believing in God and the afterlife, they say, is how we make sense of the brevity of our time on earth, how we give meaning to this brutish and short existence. Religion can offer solace to the bereaved and comfort to the frightened.

But the spandrelists counter that saying these beliefs are consolation does not mean they offered an adaptive advantage to our ancestors. “The human mind does not produce adequate comforting delusions against all situations of stress or fear,” wrote Pascal Boyer, a leading byproduct theorist, in “Religion Explained,” which came out a year before Atran’s book. “Indeed, any organism that was prone to such delusions would not survive long.”

Whether or not it is adaptive, belief in the afterlife gains power in two ways: from the intensity with which people wish it to be true and from the confirmation it seems to get from the real world. This brings us back to folkpsychology. We try to make sense of other people partly by imagining what it is like to be them, an adaptive trait that allowed our ancestors to outwit potential enemies. But when we think about being dead, we run into a cognitive wall. How can we possibly think about not thinking? “Try to fill your consciousness with the representation of no-consciousness, and you will see the impossibility of it,” the Spanish philosopher Miguel de Unamuno wrote in “Tragic Sense of Life.” “The effort to comprehend it causes the most tormenting dizziness. We cannot conceive of ourselves as not existing.”

Much easier, then, to imagine that the thinking somehow continues. This is what young children seem to do, as a study at Florida Atlantic University demonstrated a few years ago. Jesse Bering and David Bjorklund, the psychologists who conducted the study, used finger puppets to act out the story of a mouse, hungry and lost, who is spotted by an alligator. “Well, it looks like Brown Mouse got eaten by Mr. Alligator,” the narrator says at the end. “Brown Mouse is not alive anymore.”

Afterward, Bering and Bjorklund asked their subjects, ages 4 to 12, what it meant for Brown Mouse to be “not alive anymore.” Is he still hungry? Is he still sleepy? Does he still want to go home? Most said the mouse no longer needed to eat or drink. But a large proportion, especially the younger ones, said that he still had thoughts, still loved his mother and still liked cheese. The children understood what it meant for the mouse’s body to cease to function, but many believed that something about the mouse was still alive.

“Our psychological architecture makes us think in particular ways,” says Bering, now at Queen’s University Belfast in Northern Ireland. “In this study, it seems, the reason afterlife beliefs are so prevalent is that underlying them is our inability to simulate our nonexistence.”

It might be just as impossible to simulate the nonexistence of loved ones. A large part of any relationship takes place in our minds, Bering said, so it’s natural for it to continue much as before after the other person’s death. It is easy to forget that your sister is dead when you reach for the phone to call her, since your relationship was based so much on memory and imagined conversations even when she was alive. In addition, our agent-detection device sometimes confirms the sensation that the dead are still with us. The wind brushes our cheek, a spectral shape somehow looks familiar and our agent detection goes into overdrive. Dreams, too, have a way of confirming belief in the afterlife, with dead relatives appearing in dreams as if from beyond the grave, seeming very much alive.

Belief is our fallback position, according to Bering; it is our reflexive style of thought. “We have a basic psychological capacity that allows anyone to reason about unexpected natural events, to see deeper meaning where there is none,” he says. “It’s natural; it’s how our minds work.”

Intriguing as the spandrel logic might be, there is another way to think about the evolution of religion: that religion evolved because it offered survival advantages to our distant ancestors. This is where the action is in the science of God debate, with a coterie of adaptationists arguing on behalf of the primary benefits, in terms of survival advantages, of religious belief.

The trick in thinking about adaptation is that even if a trait offers no survival advantage today, it might have had one long ago. This is how Darwinians explain how certain physical characteristics persist even if they do not currently seem adaptive — by asking whether they might have helped our distant ancestors form social groups, feed themselves, find suitable mates or keep from getting killed. A facility for storing calories as fat, for instance, which is a detriment in today’s food-rich society, probably helped our ancestors survive cyclical famines.

So trying to explain the adaptiveness of religion means looking for how it might have helped early humans survive and reproduce. As some adaptationists see it, this could have worked on two levels, individual and group. Religion made people feel better, less tormented by thoughts about death, more focused on the future, more willing to take care of themselves. As William James put it, religion filled people with “a new zest which adds itself like a gift to life . . . an assurance of safety and a temper of peace and, in relation to others, a preponderance of loving affections.”

Such sentiments, some adaptationists say, made the faithful better at finding and storing food, for instance, and helped them attract better mates because of their reputations for morality, obedience and sober living. The advantage might have worked at the group level too, with religious groups outlasting others because they were more cohesive, more likely to contain individuals willing to make sacrifices for the group and more adept at sharing resources and preparing for warfare.

One of the most vocal adaptationists is David Sloan Wilson, an occasional thorn in the side of both Scott Atran and Richard Dawkins. Wilson, an evolutionary biologist at the State University of New York at Binghamton, focuses much of his argument at the group level. “Organisms are a product of natural selection,” he wrote in “Darwin’s Cathedral: Evolution, Religion, and the Nature of Society,” which came out in 2002, the same year as Atran’s book, and staked out the adaptationist view. “Through countless generations of variation and selection, [organisms] acquire properties that enable them to survive and reproduce in their environments. My purpose is to see if human groups in general, and religious groups in particular, qualify as organismic in this sense.”

Wilson’s father was Sloan Wilson, author of “The Man in the Gray Flannel Suit,” an emblem of mid-’50s suburban anomie that was turned into a film starring Gregory Peck. Sloan Wilson became a celebrity, with young women asking for his autograph, especially after his next novel, “A Summer Place,” became another blockbuster movie. The son grew up wanting to do something to make his famous father proud.
30293  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Science vs. God on: March 08, 2007, 01:00:44 PM
A major piece here, well worth the time to read it.  Comments?

March 4, 2007
Darwin’s God

God has always been a puzzle for Scott Atran. When he was 10 years old, he scrawled a plaintive message on the wall of his bedroom in Baltimore. “God exists,” he wrote in black and orange paint, “or if he doesn’t, we’re in trouble.” Atran has been struggling with questions about religion ever since — why he himself no longer believes in God and why so many other people, everywhere in the world, apparently do.

Call it God; call it superstition; call it, as Atran does, “belief in hope beyond reason” — whatever you call it, there seems to be an inherent human drive to believe in something transcendent, unfathomable and otherworldly, something beyond the reach or understanding of science. “Why do we cross our fingers during turbulence, even the most atheistic among us?” asked Atran when we spoke at his Upper West Side pied-à-terre in January. Atran, who is 55, is an anthropologist at the National Center for Scientific Research in Paris, with joint appointments at the University of Michigan and the John Jay College of Criminal Justice in New York. His research interests include cognitive science and evolutionary biology, and sometimes he presents students with a wooden box that he pretends is an African relic. “If you have negative sentiments toward religion,” he tells them, “the box will destroy whatever you put inside it.” Many of his students say they doubt the existence of God, but in this demonstration they act as if they believe in something. Put your pencil into the magic box, he tells them, and the nonbelievers do so blithely. Put in your driver’s license, he says, and most do, but only after significant hesitation. And when he tells them to put in their hands, few will.

If they don’t believe in God, what exactly are they afraid of?

Atran first conducted the magic-box demonstration in the 1980s, when he was at Cambridge University studying the nature of religious belief. He had received a doctorate in anthropology from Columbia University and, in the course of his fieldwork, saw evidence of religion everywhere he looked — at archaeological digs in Israel, among the Mayans in Guatemala, in artifact drawers at the American Museum of Natural History in New York. Atran is Darwinian in his approach, which means he tries to explain behavior by how it might once have solved problems of survival and reproduction for our early ancestors. But it was not clear to him what evolutionary problems might have been solved by religious belief. Religion seemed to use up physical and mental resources without an obvious benefit for survival. Why, he wondered, was religion so pervasive, when it was something that seemed so costly from an evolutionary point of view?

The magic-box demonstration helped set Atran on a career studying why humans might have evolved to be religious, something few people were doing back in the ’80s. Today, the effort has gained momentum, as scientists search for an evolutionary explanation for why belief in God exists — not whether God exists, which is a matter for philosophers and theologians, but why the belief does.

This is different from the scientific assault on religion that has been garnering attention recently, in the form of best-selling books from scientific atheists who see religion as a scourge. In “The God Delusion,” published last year and still on best-seller lists, the Oxford evolutionary biologist Richard Dawkins concludes that religion is nothing more than a useless, and sometimes dangerous, evolutionary accident. “Religious behavior may be a misfiring, an unfortunate byproduct of an underlying psychological propensity which in other circumstances is, or once was, useful,” Dawkins wrote. He is joined by two other best-selling authors — Sam Harris, who wrote “The End of Faith,” and Daniel Dennett, a philosopher at Tufts University who wrote “Breaking the Spell.” The three men differ in their personal styles and whether they are engaged in a battle against religiosity, but their names are often mentioned together. They have been portrayed as an unholy trinity of neo-atheists, promoting their secular world view with a fervor that seems almost evangelical.

Lost in the hullabaloo over the neo-atheists is a quieter and potentially more illuminating debate. It is taking place not between science and religion but within science itself, specifically among the scientists studying the evolution of religion. These scholars tend to agree on one point: that religious belief is an outgrowth of brain architecture that evolved during early human history. What they disagree about is why a tendency to believe evolved, whether it was because belief itself was adaptive or because it was just an evolutionary byproduct, a mere consequence of some other adaptation in the evolution of the human brain.

Which is the better biological explanation for a belief in God — evolutionary adaptation or neurological accident? Is there something about the cognitive functioning of humans that makes us receptive to belief in a supernatural deity? And if scientists are able to explain God, what then? Is explaining religion the same thing as explaining it away? Are the nonbelievers right, and is religion at its core an empty undertaking, a misdirection, a vestigial artifact of a primitive mind? Or are the believers right, and does the fact that we have the mental capacities for discerning God suggest that it was God who put them there?

In short, are we hard-wired to believe in God? And if we are, how and why did that happen?

“All of our raptures and our drynesses, our longings and pantings, our questions and beliefs . . . are equally organically founded,” William James wrote in “The Varieties of Religious Experience.” James, who taught philosophy and experimental psychology at Harvard for more than 30 years, based his book on a 1901 lecture series in which he took some early tentative steps at breaching the science-religion divide.

In the century that followed, a polite convention generally separated science and religion, at least in much of the Western world. Science, as the old trope had it, was assigned the territory that describes how the heavens go; religion, how to go to heaven.

Anthropologists like Atran and psychologists as far back as James had been looking at the roots of religion, but the mutual hands-off policy really began to shift in the 1990s. Religion made incursions into the traditional domain of science with attempts to bring intelligent design into the biology classroom and to choke off human embryonic stem-cell research on religious grounds. Scientists responded with counterincursions. Experts from the hard sciences, like evolutionary biology and cognitive neuroscience, joined anthropologists and psychologists in the study of religion, making God an object of scientific inquiry.

The debate over why belief evolved is between byproduct theorists and adaptationists. You might think that the byproduct theorists would tend to be nonbelievers, looking for a way to explain religion as a fluke, while the adaptationists would be more likely to be believers who can intuit the emotional, spiritual and community advantages that accompany faith. Or you might think they would all be atheists, because what believer would want to subject his own devotion to rationalism’s cold, hard scrutiny? But a scientist’s personal religious view does not always predict which side he will take. And this is just one sign of how complex and surprising this debate has become.

Angels, demons, spirits, wizards, gods and witches have peppered folk religions since mankind first started telling stories. Charles Darwin noted this in “The Descent of Man.” “A belief in all-pervading spiritual agencies,” he wrote, “seems to be universal.” According to anthropologists, religions that share certain supernatural features — belief in a noncorporeal God or gods, belief in the afterlife, belief in the ability of prayer or ritual to change the course of human events — are found in virtually every culture on earth.

This is certainly true in the United States. About 6 in 10 Americans, according to a 2005 Harris Poll, believe in the devil and hell, and about 7 in 10 believe in angels, heaven and the existence of miracles and of life after death. A 2006 survey at Baylor University found that 92 percent of respondents believe in a personal God — that is, a God with a distinct set of character traits ranging from “distant” to “benevolent.”

When a trait is universal, evolutionary biologists look for a genetic explanation and wonder how that gene or genes might enhance survival or reproductive success. In many ways, it’s an exercise in post-hoc hypothesizing: what would have been the advantage, when the human species first evolved, for an individual who happened to have a mutation that led to, say, a smaller jaw, a bigger forehead, a better thumb? How about certain behavioral traits, like a tendency for risk-taking or for kindness?

Atran saw such questions as a puzzle when applied to religion. So many aspects of religious belief involve misattribution and misunderstanding of the real world. Wouldn’t this be a liability in the survival-of-the-fittest competition? To Atran, religious belief requires taking “what is materially false to be true” and “what is materially true to be false.” One example of this is the belief that even after someone dies and the body demonstrably disintegrates, that person will still exist, will still be able to laugh and cry, to feel pain and joy. This confusion “does not appear to be a reasonable evolutionary strategy,” Atran wrote in “In Gods We Trust: The Evolutionary Landscape of Religion” in 2002. “Imagine another animal that took injury for health or big for small or fast for slow or dead for alive. It’s unlikely that such a species could survive.” He began to look for a sideways explanation: if religious belief was not adaptive, perhaps it was associated with something else that was.

Atran intended to study mathematics when he entered Columbia as a precocious 17-year-old. But he was distracted by the radical politics of the late ’60s. One day in his freshman year, he found himself at an antiwar rally listening to Margaret Mead, then perhaps the most famous anthropologist in America. Atran, dressed in a flamboyant Uncle Sam suit, stood up and called her a sellout for saying the protesters should be writing to their congressmen instead of staging demonstrations. “Young man,” the unflappable Mead said, “why don’t you come see me in my office?”

Atran, equally unflappable, did go to see her — and ended up working for Mead, spending much of his time exploring the cabinets of curiosities in her tower office at the American Museum of Natural History. Soon he switched his major to anthropology.

Many of the museum specimens were religious, Atran says. So were the artifacts he dug up on archaeological excursions in Israel in the early ’70s. Wherever he turned, he encountered the passion of religious belief. Why, he wondered, did people work so hard against their preference for logical explanations to maintain two views of the world, the real and the unreal, the intuitive and the counterintuitive?

Maybe cognitive effort was precisely the point. Maybe it took less mental work than Atran realized to hold belief in God in one’s mind. Maybe, in fact, belief was the default position for the human mind, something that took no cognitive effort at all.

While still an undergraduate, Atran decided to explore these questions by organizing a conference on universal aspects of culture and inviting all his intellectual heroes: the linguist Noam Chomsky, the psychologist Jean Piaget, the anthropologists Claude Levi-Strauss and Gregory Bateson (who was also Margaret Mead’s ex-husband), the Nobel Prize-winning biologists Jacques Monod and Francois Jacob. It was 1974, and the only site he could find for the conference was just outside Paris. Atran was a scraggly 22-year-old with a guitar who had learned his French from comic books. To his astonishment, everyone he invited agreed to come.

Atran is a sociable man with sharp hazel eyes, who sparks provocative conversations the way other men pick bar fights. As he traveled in the ’70s and ’80s, he accumulated friends who were thinking about the issues he was: how culture is transmitted among human groups and what evolutionary function it might serve. “I started looking at history, and I wondered why no society ever survived more than three generations without a religious foundation as its raison d’être,” he says. Soon he turned to an emerging subset of evolutionary theory — the evolution of human cognition.

Some cognitive scientists think of brain functioning in terms of modules, a series of interconnected machines, each one responsible for a particular mental trick. They do not tend to talk about a God module per se; they usually consider belief in God a consequence of other mental modules.

Religion, in this view, is “a family of cognitive phenomena that involves the extraordinary use of everyday cognitive processes,” Atran wrote in “In Gods We Trust.” “Religions do not exist apart from the individual minds that constitute them and the environments that constrain them, any more than biological species and varieties exist independently of the individual organisms that compose them and the environments that conform them.”

At around the time “In Gods We Trust” appeared five years ago, a handful of other scientists — Pascal Boyer, now at Washington University; Justin Barrett, now at Oxford; Paul Bloom at Yale — were addressing these same questions. In synchrony they were moving toward the byproduct theory.

Darwinians who study physical evolution distinguish between traits that are themselves adaptive, like having blood cells that can transport oxygen, and traits that are byproducts of adaptations, like the redness of blood. There is no survival advantage to blood’s being red instead of turquoise; it is just a byproduct of the trait that is adaptive, having blood that contains hemoglobin.

Something similar explains aspects of brain evolution, too, say the byproduct theorists. Which brings us to the idea of the spandrel.

Stephen Jay Gould, the famed evolutionary biologist at Harvard who died in 2002, and his colleague Richard Lewontin proposed “spandrel” to describe a trait that has no adaptive value of its own. They borrowed the term from architecture, where it originally referred to the V-shaped structure formed between two rounded arches. The structure is not there for any purpose; it is there because that is what happens when arches align.

In architecture, a spandrel can be neutral or it can be made functional. Building a staircase, for instance, creates a space underneath that is innocuous, just a blank sort of triangle. But if you put a closet there, the under-stairs space takes on a function, unrelated to the staircase’s but useful nonetheless. Either way, functional or nonfunctional, the space under the stairs is a spandrel, an unintended byproduct.

“Natural selection made the human brain big,” Gould wrote, “but most of our mental properties and potentials may be spandrels — that is, nonadaptive side consequences of building a device with such structural complexity.”

The possibility that God could be a spandrel offered Atran a new way of understanding the evolution of religion. But a spandrel of what, exactly?

Hardships of early human life favored the evolution of certain cognitive tools, among them the ability to infer the presence of organisms that might do harm, to come up with causal narratives for natural events and to recognize that other people have minds of their own with their own beliefs, desires and intentions. Psychologists call these tools, respectively, agent detection, causal reasoning and theory of mind.

Agent detection evolved because assuming the presence of an agent — which is jargon for any creature with volitional, independent behavior — is more adaptive than assuming its absence. If you are a caveman on the savannah, you are better off presuming that the motion you detect out of the corner of your eye is an agent and something to run from, even if you are wrong. If it turns out to have been just the rustling of leaves, you are still alive; if what you took to be leaves rustling was really a hyena about to pounce, you are dead.
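The payoff asymmetry behind this argument can be made concrete with a toy expected-cost calculation. The probabilities and costs below are illustrative assumptions, not figures from the article; the only point is that when a miss is vastly costlier than a false alarm, "always assume an agent" is the cheaper policy.

```python
# Toy expected-cost model of agent detection (illustrative numbers only).
# A rustle is really a predator with small probability p, but misses are
# far costlier than false alarms, so "always assume an agent" wins.

P_PREDATOR = 0.01      # chance the rustle is actually a hyena (assumed)
COST_FLEE = 1          # wasted effort fleeing from leaves (false alarm)
COST_EATEN = 10_000    # cost of ignoring a real predator (miss)

def expected_cost(assume_agent: bool) -> float:
    """Expected cost of reacting to one rustle under a fixed policy."""
    if assume_agent:
        # Always flee: pay the small fleeing cost every time.
        return COST_FLEE
    # Never flee: pay the catastrophic cost whenever it was a predator.
    return P_PREDATOR * COST_EATEN

assert expected_cost(True) < expected_cost(False)  # 1 < 100
```

Under these assumptions the cautious policy costs 1 per rustle while the skeptical policy costs 100 on average, so a bias toward detecting agents persists even though it is usually wrong in any single case.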

A classic experiment from the 1940s by the psychologists Fritz Heider and Marianne Simmel suggested that imputing agency is so automatic that people may do it even for geometric shapes. For the experiment, subjects watched a film of triangles and circles moving around. When asked what they had been watching, the subjects used words like “chase” and “capture.” They did not just see the random movement of shapes on a screen; they saw pursuit, planning, escape.

So if there is motion just out of our line of sight, we presume it is caused by an agent, an animal or person with the ability to move independently. This usually operates in one direction only; lots of people mistake a rock for a bear, but almost no one mistakes a bear for a rock.

What does this mean for belief in the supernatural? It means our brains are primed for it, ready to presume the presence of agents even when such presence confounds logic. “The most central concepts in religions are related to agents,” Justin Barrett, a psychologist, wrote in his 2004 summary of the byproduct theory, “Why Would Anyone Believe in God?” Religious agents are often supernatural, he wrote, “people with superpowers, statues that can answer requests or disembodied minds that can act on us and the world.”

A second mental module that primes us for religion is causal reasoning. The human brain has evolved the capacity to impose a narrative, complete with chronology and cause-and-effect logic, on whatever it encounters, no matter how apparently random. “We automatically, and often unconsciously, look for an explanation of why things happen to us,” Barrett wrote, “and ‘stuff just happens’ is no explanation. Gods, by virtue of their strange physical properties and their mysterious superpowers, make fine candidates for causes of many of these unusual events.” The ancient Greeks believed thunder was the sound of Zeus’s thunderbolt. Similarly, a contemporary woman whose cancer treatment works despite 10-to-1 odds might look for a story to explain her survival. It fits better with her causal-reasoning tool for her recovery to be a miracle, or a reward for prayer, than for it to be just a lucky roll of the dice.

A third cognitive trick is a kind of social intuition known as theory of mind. It’s an odd phrase for something so automatic, since the word “theory” suggests formality and self-consciousness. Other terms have been used for the same concept, like intentional stance and social cognition. One good alternative is the term Atran uses: folkpsychology.
30294  DBMA Martial Arts Forum / Martial Arts Topics / Euro MAI Interview with Guro Lonely Dog on: March 08, 2007, 12:58:59 PM
'LONELY DOG' by Matt Tucker

In the medieval Swiss capital of Bern lies one of the stickfighting world's best-kept secrets. Instructor Matt Tucker travelled to Bern to spend a week training with the Dog Brothers Martial Arts Chief Instructor for Europe, Guro Benjamin 'Lonely Dog' Rittiner.

Foreword by Guro Marc 'Crafty Dog' Denny   

When Benjamin first came to train with me (1997 or '98) with some students of his, I was very impressed with how much he had absorbed on his own.  We hit it off well, and he continued to come to train with me in LA and would assist me at seminars that I gave in Spain, England, and Italy.  This, combined with his outstanding work ethic and natural talent, enabled him to grow very well in DBMA even though he lived in Switzerland and I in California. 

Although we the Dog Brothers are known for an intense kind of fighting, we are about something much more than all that-- something which is revealed through the fighting perhaps, but something that is not about the fighting.  As Benjamin fought at the Gatherings, everyone was very impressed not only by his skill as a fighter, but also by the man he showed himself to be.  He made Dog Brother in the minimum number of Gatherings required (5). 

As he continued to host me in Switzerland our friendship grew (likewise my friendship with his wonderful wife Cornelia), as did his skill and knowledge in DBMA--he became the only other person I have promoted to Guro in DBMA.  Because of the age disparity (about 20 years) I have almost paternal feelings for him, which have allowed me to transcend the secretive nature I had about certain things when I was still fighting and to teach him as if he were a son.  Tomorrow is promised to no one, and if something were to happen to me he would be the one to step into my role in DBMA.

As I continued coming to Europe, I felt the desire that many people had to become part of the Dog Brothers, and I realized how difficult it was for people in Europe to fly repeatedly to California.  I discussed this with Benji and shared with him what I thought were the ingredients and building blocks necessary for a Dog Brothers Gathering.  For several years we worked together to prepare the way.  In the spring of 2006 we held an "Invitational Gathering" to make sure that we had the nucleus of people necessary to establish the respect required by the "Dog Brothers code" and were ready to take the next step.  We were ready.

This past October 1st, we held the first DB Gathering ever outside of Los Angeles.  The plan was for all three founders of the Dog Brothers (now the governing body of the Dog Brothers, known as "The Council of Elders"-- because we are old)-- Top Dog (Eric), Salty Dog (Arlan) and me-- to witness the Gathering, but business matters intervened for Salty and so it was only Top Dog and me.  I proposed to Eric and Arlan that Benji become a member of The Council of Elders, and they enthusiastically agreed.

Eric and I were very, very impressed with the fighting skill at the Euro Gathering.  Even more important though was the strong Dog Brother feeling shown by all the fighters there, regardless of which system they came from.  Indeed, we consider this Gathering to be one of the best Dog Brothers Gatherings ever-- a very special day! 

The Dog Brothers Gathering is now an established event, and people fighting at the Gathering are eligible to be considered part of the Dog Brothers tribe-- regardless of which system they train in.  To become a full Dog Brother, one must make it to the main Gathering in Los Angeles.

And so ladies and gentlemen, I present to you Guro Benjamin "Lonely Dog" Rittiner: highly regarded Dog Brother, member of the DB Council of Elders, Guro in DBMA and head of our organization in Europe, and my very good friend. 

The Adventure continues,
Marc "Crafty Dog" Denny
Guiding Force of the Dog Brothers
Founder/Head Instructor Dog Brothers Martial Arts

Matt Tucker: How did you first get involved with Martial Arts/Fighting?

Benjamin Rittiner: My first contact with the martial arts was at school when I was around 7 or 8 years old. We had a project to make a short movie, and some of the older children had the idea to make a short Kung Fu movie. Sadly I was one of the younger children tied to the tree, awaiting the heroes (the older children) to come rescue us. This was when I saw my first Ninja Shuriken/Throwing Star. When I was around ten years old I got my first Nunchaku and played around a bit with it until I started my formal training in 1984 with Karate, after which I moved on to other Japanese martial arts such as Judo, Ju Jutsu, and the Bujinkan. Later on I also trained in Boxing, Thai-Boxing and Sanda, where I gained experience fighting in competition.

MT: How did you first come across Dog Brothers Martial Arts?

BR: Whilst I was attending a Tai Kai seminar in Luxembourg, my teacher had a friend who trained in the Inosanto Blend, so we went to his gym to train for a couple of hours on some basics, Sumbrada and Heaven Six etc. This type of training really interested me and I wanted to continue training in the Filipino Martial Arts, but I couldn't find a teacher here in Switzerland at that time. I started to collect as many instructional videos on FMA as I could so I could continue to learn. In 1994 I came across the Dog Brothers Real Contact Fighting series and it suddenly hit me: 'That's cool, I want to do this.' I realised that this was the direction I wanted to go, so I trained with some friends for about 4 years using these videos.

MT: So it was 4 years before you had any formal training under Marc Denny?

BR: Yes, I just trained from the videos. I would watch them again and again, hundreds of times, perfecting each specific move until the tapes eventually broke! But after a few years I knew that if I wanted to go any further I would have to go to the USA. So I wrote a letter to the Dog Brothers address in Hermosa Beach, California. Marc Denny replied to my letter and invited me over. Some months later I made my first visit to Hermosa Beach.

MT: How did the training differ from what you had been doing on your own?

BR: It was pretty interesting. Marc just asked me to do some Carenza (shadow boxing). He commented that I moved quite well and asked me who my teacher was, to which my reply was 'My teacher is VCR!'. During the 5-day PTP we covered a lot of material, and mostly he was surprised at how fast I could adapt to it. It was not until the 5th day, when Marc showed me some techniques that really gave me a hard time, that I think he was quite glad to find something I could not do straight away... Overall he was impressed that someone could learn so much from just videos.

This first training with Guro Crafty greatly changed my understanding of stickfighting. As far as the fighting went, I had already developed a pretty solid structure in what we call "regular lead". This is with the stick in the right hand and the same leg forward, used to shuffle forward and back. He taught me to use both leads, meaning that I could fight with the right foot forward but also with the left foot as a lead. This, along with the knowledge of triangle footwork and an overall sense of angling footwork, helped me a lot in making my fighting game more alive... This was a very important lesson for me.

The deepest lesson I got over the years of training with Guro Crafty was the ability to analyse my opponents: to understand that I will face different structures, and that the better I can analyse them and the more solutions I have against those different structures, the better it is. It's truly like Sugar Ray Leonard once said: "You don't beat the man, you beat his style."

Guro Crafty is a great teacher and he was a very feared fighter, but what I admire most when I think about him is his great ability to analyse structures, and through that, of course, the fighters. This is truly the reason why he is the "Crafty Dog"....

I was always a talented and skilled fighter, but he also made me a smart fighter. I never forgot the sentence I heard one day from Grand Tuhon Leo Gaje: "easy to be hard, hard to be smart"... (smiley)

MT: Was it hard to find training partners to help you keep progressing when you came back home?

BR: It was harder to keep them, because in the beginning we would train just 3 or 4 techniques and we sparred a lot. In those days, because I knew a bit more and was inexperienced, I believe I may have pushed things a little too hard, and this is why people stopped training.

MT: Do you believe it is important to spar, even if it is controlled sparring from day one?

BR: It differs from student to student. If a student has no experience then he should perhaps wait longer than, for example, someone who takes up boxing. In boxing you can start with easy sparring from day one. Stickfight sparring can be more dangerous, and if you make it too safe with too much protection or padding then you can learn a lot of mistakes. In general I think most students should wait about 6 months before they start sparring, but sometimes you may get a beginner who is a natural fighter and does not care about bruises, so he can jump into sparring a bit earlier. I truly believe it's better to build up a student without giving him too many bruises in the beginning. The difficulty is not to break the student's spirit, as too much pain too early will cause the student to quit. It is always easy to teach fighters to fight and much harder to get regular people to do a Gathering. The latter is the more interesting goal for me personally as a teacher.

MT: Your nickname 'Lonely Dog' is, I presume, because you had no one to train with when you came back to Bern?

BR: No, actually it was to do with the sense of tribe in California amongst the DB group. It was such a great warm feeling to be part of this tribe at the Gatherings that when I flew back and saw this huge distance between California and Switzerland I did feel somewhat on my own.

MT: Can you explain the sense of tribe within the Dog Brothers?

BR: To fight like we do is very intense and dangerous. To make it happen we need something that controls the energy. If it were just a competition to see who is best, it would be extremely dangerous and therefore you would need a lot of rules. However, the idea is not to limit the person through rules; if you have no rules, heavy sticks and less protection, then you need something else that controls the violence. So the idea that we are one tribe, that we are all friends and we all want to learn and grow together, instils the 'safety' in our fighting, and it's very, very special. In normal competition you don't get people fighting and then afterwards discussing the fight at the side of the mat. In my amateur boxing days I was very nervous when I competed, even though there were rules, headgear and protection. I was more concerned about winning; I did not know my opponent and did not know what kind of person he was. At the 'Gatherings' I have never felt this kind of pressure. If I have to tap, then it's simply a good lesson.

MT: How dangerous is Real Contact Stick Fighting?

BR: It depends how smart the fighter is! (Laughs). Actually it is quite dangerous. We have sticks and we hit each other, but there are two things that reduce the risk. One is the code: do not break your opponent spiritually or physically. We want to show him his weaknesses, but if he is stunned we won't take the final blow that might seriously injure him.
The second is how you fight. I want to have clever fights. Many fights are tough fights and attract the type of people who are drawn to the danger, those who want to test their balls, which is a fair reason to take part, and many people walk away from a Gathering a different and more confident person.
For me it's more about controlling and dominating a fight through my strategy, and the more I apply this, the less the chance of injury. But a fight is a fight, and in real contact there is always a risk. If you take the risk away then it would not be the same experience, and it is through this risk that you make bigger steps in your progression as a martial artist.

MT: How long before a fight do you start to concentrate your training on it?

BR: It has changed over the years. At least 8 weeks of preparation, covering both cardio and technique. I tend to keep my cardio at a basic level; it is more important to have your head right and to be ready for the Gathering mentally. There were fights where I had just recovered from flu and had 5 days to prepare. You cannot build any cardio in such a short amount of time, but you can do a lot of mental preparation in 5 days. So over the years I have come to keep my cardio at a base level, and since I am always working on power, the mental game is more important to me. Many people spar very hard up to a week before a fight, which can be dangerous as you can walk into a fight already injured. Personally I stop hard sparring at least 6 weeks prior to a fight, as it takes about 6 weeks for a broken finger to mend.

MT: How does the use of rhythm training (training to music) improve a student's performance with regards to fighting in a Gathering?

BR: Again this varies from fighter to fighter, and some people just have no rhythm! Someone does not need rhythm to be a good fighter. If someone has no rhythm I don't force them to train with rhythm, but if they have a bit of feel for it then it can help a great deal. I have developed something called the 'Boogie Woogie' as a specific shadow boxing drill, and since I have done this I have discovered how to break rhythm, maintain rhythm and control the pace of a fight. I believe a major point in fighting is to dominate the rhythm you want to fight at, and to know how to change that rhythm to disrupt your opponent and force him to create an opening.

MT: You have assisted on DBMA's instructional DVDs with Marc Denny and were recently asked to shoot your own DBMA instructional DVD. What are 'Cycle Drills' and how did you come up with the idea?

BR: 'Cycle Drills' is a very basic drill where you just defend and counter. The reason I came up with the idea was to have something more defence/counter orientated. The biggest problem many people have is that they train in medio (medium) range, and when they tap in for a fight they find themselves outside of largo (long) range, in what we call 'Snake' range, and they have serious problems closing the gap to use their medio techniques. To help them close the gap from largo to medio range, Guro Crafty developed many training drills, like the Snaggle Tooth progression and the Attacking Blocks drill. However, in Cycle Drills I wanted to address more the idea of what to do if the opponent is pushing the fight: how to use the techniques we have within a counter-fighting structure. Cycle Drills is a generator that can be used by both beginners and more advanced students, and it allows the fighter to develop the ability to counter-strike safely with the stick, the complementary hand or even power kicks. It also helps with closing to 'Corto' (close) range to clinch, and on to takedowns and grappling.

MT: Can anyone have the opportunity to advance in Dog Brothers Martial Arts, and can one refrain from Real Contact Stick Fighting and still train in the DBMA system?

BR: Absolutely. Indeed, MOST people who train in DBMA are what we call "Practitioners", interested in our mission statement of "Walk as a Warrior for all your days."
I think RCSF is not for everybody, but the knowledge gained from these experiences and brought back to the Dojo is for everyone, so the practitioners can benefit from the fighters.

MT: I have often heard comments from other martial artists who criticise the Dog Brothers as being nothing more than brawlers with sticks. Could you give us your thoughts on this and explain the difference between Full Contact Full Armour matches and Real Contact Stick Fighting (low armour)?

BR: Actually, I don't have much of a view on the Full Armour tournaments, but I have seen a few videos and I have seen how they train. To be honest these are brawls; I don't see any strategy. I see skill, a lot of cardio and some nice-looking techniques, but not the strategy you need when fighting with real sticks. If you used the Full Armour approach in Real Contact Stick Fighting you would make your way to the hospital sooner than you think. There has to be a lot of strategy and skill involved to survive. To the naked eye of someone just training in FMA, the fighting looks different from how they expect it to look. So many people say that they can't see the skill, but it's the same in MMA: you need a fighter's eye to see the skill, the strategy and the timing, because it all happens so fast in RCSF. One of the nice things about the DBMA DVDs is that "if you see it taught, you see it fought". The teaching material is illustrated with actual fights, and slow motion is used to slow the fight down to where people can actually see what is going on.

MT: How do you want to see DBMA in Europe grow from here on?

BR: I am really happy to see what's happened over the last two years, and if it carries on like this then I am very much looking forward to the future. The training groups at the moment are pretty small and I am really excited to see them grow and develop. I am glad that the sense of tribe is allowing the different groups to work together. Last year Top Dog and Crafty Dog came to open our first European Dog Brothers Gathering here in Bern, and I was very surprised to see how much interest we had, with 42 fighters. It was a long day with a lot of fights. I am also seeing a unique fighting style appearing among the European Dog Brothers, much as in boxing, where European boxing is different from American, and American is different from Mexican boxing, etc. I am already seeing a direction that will make us different (laughs)!

Also, Dog Brothers Martial Arts has a lot to offer the practitioners (those who don't wish to fight), with a full self-defence system and healing arts for both men and women, and I would like to see this grow alongside those who wish to fight at the 'Gatherings'. Within the system we organize this under the headings of "Ritual" and "Reality", which are combined to yield a "Totality".  Real Contact Stickfighting falls within "Ritual"; street application matters are within "Reality". (There is also a Law Enforcement/Military component to the system.)  Some DBMA people are more interested in one, some are more interested in the other, some seek to blend the two, and some start with one but wind up focusing on the other.  This applies to our instructors as well! I think having the adrenal-state experience of Real Contact Stickfighting is very useful in helping people to understand what they are capable of, and what skills they need to be capable of in real time for the street.  In summary I would say that we like to let people explore and grow as they will.

MT: Finally, can you tell us a bit about your first ever London seminar later this year?

BR: Yes, I am very happy to be teaching a 2-day seminar on the 9th & 10th of June 2007 in Plumstead, London. The plan is to give the participants an overview of DBMA. This time I want to focus on Single Stick. I'm going to cover different areas with the single stick: there will be our blend of Kali and Krabi Krabong (Los Triques), clinchwork with the stick, but also some self-defence material.
I will also be visiting our Glasgow group in August.

MT: Thank you for a fantastic week's training, and I look forward to our next adventure.

BR: Woof woof

30295  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Nuclear War, WMD issues on: March 08, 2007, 12:14:38 PM
This article raises some very important and very scary questions.  Comments?

The Words None Dare Say: Nuclear War
By George Lakoff

"The elimination of Natanz would be a major setback for Iran's nuclear ambitions, but the conventional weapons in the American arsenal could not insure the destruction of facilities under seventy-five feet of earth and rock, especially if they are reinforced with concrete. "-Seymour Hersh, The New Yorker, April 17, 2006
"The second concern is that if an underground laboratory is deeply buried, that can also confound conventional weapons. But the depth of the Natanz facility - reports place the ceiling roughly 30 feet underground - is not prohibitive. The American GBU-28 weapon - the so-called bunker buster - can pierce about 23 feet of concrete and 100 feet of soil. Unless the cover over the Natanz lab is almost entirely rock, bunker busters should be able to reach it. That said, some chance remains that a single strike would fail. " - Michael Levi, New York Times, April 18, 2006

03/01/07 -- A familiar means of denying a reality is to refuse to use the words that describe that reality. A common form of propaganda is to keep reality from being described.

In such circumstances, silence and euphemism are forms of complicity both in propaganda and in the denial of reality. And the media, as well as the major presidential candidates, are now complicit.

The stories in the major media suggest that an attack against Iran is a real possibility and that the Natanz nuclear development site is the number one target. As the above quotes from two of our best sources note, military experts say that conventional "bunker-busters" such as the GBU-28 might be able to destroy the Natanz facility, especially with repeated bombings. On the other hand, they also say such iterated use of conventional weapons might not work, e.g., if the rock and earth above the facility becomes liquefied. On that supposition, a "low yield" "tactical" nuclear weapon, say, the B61-11, might be needed.

If the Bush administration, for example, were to insist on a sure "success," then the "attack" would constitute nuclear war. The words in boldface are nuclear war, that's right, nuclear war - a first strike nuclear war.

We don't know what exactly is being planned - conventional GBU-28s or nuclear B61-11s. And that is the point. Discussion needs to be open. Nuclear war is not a minor matter.

The Euphemism

As early as August 13, 2005, Bush, in Jerusalem, was asked what would happen if diplomacy failed to persuade Iran to halt its nuclear program. Bush replied, "All options are on the table." On April 18, the day after the appearance of Seymour Hersh's New Yorker report on the administration's preparations for a nuclear war against Iran, President Bush held a news conference. He was asked,

"Sir, when you talk about Iran, and you talk about how you have diplomatic efforts, you also say all options are on the table. Does that include the possibility of a nuclear strike? Is that something that your administration will plan for?"

He replied,

"All options are on the table."

The President never actually said the forbidden words "nuclear war," but he appeared to tacitly acknowledge the preparations - without further discussion.

Vice-President Dick Cheney, speaking in Australia last week, backed up the President.

"We worked with the European community and the United Nations to put together a set of policies to persuade the Iranians to give up their aspirations and resolve the matter peacefully, and that is still our preference. But I've also made the point, and the president has made the point, that all options are on the table."

Republican Presidential candidate John McCain, on FOX News, August 14, 2005, said the same.

"For us to say that the Iranians can do whatever they want to do and we won't under any circumstances exercise a military option would be for them to have a license to do whatever they want to do ... So I think the president's comment that we won't take anything off the table was entirely appropriate."

But it's not just Republicans. Democratic Presidential candidate John Edwards, in a speech in Herzliyah, Israel, echoed Bush.

"To ensure that Iran never gets nuclear weapons, we need to keep ALL options on the table. Let me reiterate - ALL options must remain on the table."

Although Edwards has said, when asked about this statement, that he prefers peaceful solutions and direct negotiations with Iran, he has nonetheless repeated the "all options on the table" position - making clear that he would consider starting a preventive nuclear war, but without using the fateful words.

Hillary Clinton, at an AIPAC dinner in New York, said,

"We cannot, we should not, we must not, permit Iran to build or acquire nuclear weapons, and in dealing with this threat, as I have said for a very long time, no option can be taken off the table."

Translation: Nuclear weapons can be used to prevent the spread of nuclear weapons.

Barack Obama, asked on 60 Minutes about using military force to prevent Iran from developing nuclear weapons, began a discussion of his preference for diplomacy by responding, "I think we should keep all options on the table."

Bush, Cheney, McCain, Edwards, Clinton, and Obama all say indirectly that they seriously consider starting a preventive nuclear war, but will not engage in a public discussion of what that would mean. That contributes to a general denial, and the press is going along with it by a corresponding refusal to use the words.

If the consequences of nuclear war are not discussed openly, the war may happen without an appreciation of the consequences and without the public having a chance to stop it. Our job is to open that discussion.

Of course, there is a rationale for the euphemism: to scare our adversaries by making them think that we are crazy enough to do what we hint at, while not raising a public outcry. That is what happened in the lead-up to the Iraq War, and the disaster of that war tells us why we must have such a discussion about Iran. Presidential candidates go along, not wanting to be thought of as interfering in ongoing indirect diplomacy. That may be the conventional wisdom for candidates, but an informed, concerned public must say what candidates are advised not to say.

More Euphemisms

The euphemisms used include "tactical," "small," "mini-," and "low yield" nuclear weapons. "Tactical" contrasts with "strategic"; it refers to tactics, relatively low-level choices made in carrying out an overall strategy, but which don't affect the grand strategy. But the use of any nuclear weapons would be anything but "tactical." It would be a major world event - in Vladimir Putin's words, "lowering the threshold for the use of nuclear weapons," making the use of more powerful nuclear weapons more likely and setting off a new arms race. The use of the word "tactical" operates to lessen their importance, to distract from the fact that their very use would constitute a nuclear war.

What is "low yield"? Perhaps the "smallest" tactical nuclear weapon we have is the B61-11, which has a dial-a-yield feature: it can yield "only" 0.3 kilotons, but can be set to yield up to 170 kilotons. The power of the Hiroshima bomb was 15 kilotons. That is, a "small" bomb can yield more than 10 times the explosive power of the Hiroshima bomb. The B61-11 dropped from 40,000 feet would dig a hole 20 feet deep and then explode, send shock waves downward, leave a huge crater, and spread radiation widely. The idea that it would explode underground and be harmless to those above ground is false - and, anyway, an underground release of radiation would threaten ground water and aquifers for a long time and over a wide distance.

To use words such as "low yield" or "small" or "mini-" nuclear weapon is like speaking of being a little bit pregnant. Nuclear war is nuclear war! It crosses the moral line.

Any discussion of roadside canister bombs made in Iran justifying an attack on Iran should be put in perspective: little canister bombs (EFPs - explosively formed projectiles) that shoot a small hot metal ball at a Humvee or tank, versus nuclear war.

Incidentally, the administration may be focusing on the canister bombs because it seeks to claim that the Authorization for Use of Military Force Against Iraq Resolution of 2002 permits the use of military force against Iran based on its interference in Iraq. In that case, no further authorization by Congress would be needed for an attack on Iran.

The journalistic point is clear. Journalists and political leaders should not talk about an "attack." They should use the words that describe what is really at stake: nuclear war - in boldface.

Then there is the scale of the proposed attack. Military reports leaking out suggest a huge (mostly or entirely non-nuclear) airstrike on as many as 10,000 targets - a "shock and awe" attack that would destroy Iran's infrastructure the way the U.S. bombing destroyed Iraq's infrastructure. The targets would not just be "military targets." As Dan Plesch reports in the New Statesman, February 19, 2007, such an attack would wipe out Iran's military, business, and political infrastructure. Not just nuclear installations, missile launching sites, tanks, and ammunition dumps, but also airports, rail lines, highways, bridges, ports, communications centers, power grids, industrial centers, hospitals, public buildings, and even the homes of political leaders. That is what was attacked in Iraq: the "critical infrastructure." It is not just military in the traditional sense. It leaves a nation in rubble, and leads to death, maiming, disease, joblessness, impoverishment, starvation, mass refugees, lawlessness, rape, and incalculable pain and suffering. That is what the options appear to be "on the table." Is nation destruction what the American people have in mind when they acquiesce without discussion to an "attack"? Is nuclear war what the American people have in mind? An informed public must ask and the media must ask. The words must be used.

Even if the attack were limited to nuclear installations, starting a nuclear war with Iran would have terrible consequences - and not just for Iranians. First, it would strengthen the hand of the Islamic fundamentalists - exactly the opposite of the effect U.S. planners would want. It would be viewed as yet another major attack on Islam. Fundamentalist Islam is a revenge culture. If you want to recruit fundamentalist Islamists all over the world to become violent jihadists, this is the best way to do it. America would become a world pariah. Any idea of the U.S. as a peaceful nation would be destroyed. Moreover, you don't work against the spread of nuclear weapons by using those weapons. That will just make countries all over the world want nuclear weaponry all the more. Trying to stop nuclear proliferation through nuclear war is self-defeating.

As Einstein said, "You cannot simultaneously prevent and prepare for war."

Why would the Bush administration do it? Here is what conservative strategist William Kristol wrote last summer during Israel's war with Hezbollah.

"For while Syria and Iran are enemies of Israel, they are also enemies of the United States. We have done a poor job of standing up to them and weakening them. They are now testing us more boldly than one would have thought possible a few years ago. Weakness is provocative. We have been too weak, and have allowed ourselves to be perceived as weak.

The right response is renewed strength -- in supporting the governments of Iraq and Afghanistan, in standing with Israel, and in pursuing regime change in Syria and Iran. For that matter, we might consider countering this act of Iranian aggression with a military strike against Iranian nuclear facilities. Why wait? Does anyone think a nuclear Iran can be contained? That the current regime will negotiate in good faith? It would be easier to act sooner rather than later. Yes, there would be repercussions -- and they would be healthy ones, showing a strong America that has rejected further appeasement."

-William Kristol, Weekly Standard, 7/24/06

"Renewed strength" is just the Bush strategy in Iraq. At a time when the Iraqi people want us to leave, when our national elections show that most Americans want our troops out, when 60% of Iraqis think it all right to kill Americans, Bush wants to escalate. Why? Because he is weak in America. Because he needs to show more "strength." Because if he knocks out the Iranian nuclear facilities, he can claim at least one "victory." Starting a nuclear war with Iran would really put us in a worldwide war with fundamentalist Islam. It would make real the terrorist threat he has been claiming since 9/11. It would create more fear - real fear - in America. And he believes, with much reason, that fear tends to make Americans vote for saber-rattling conservatives.

Kristol's neoconservative view that "weakness is provocative" is echoed in Iran, but by the other side. Mahmoud Ahmadinejad was quoted in The New York Times of February 24, 2007 as having "vowed anew to continue enriching uranium, saying, 'If we show weakness in front of the enemies, they will increase their expectations.'" If both sides refuse to back off for fear of showing weakness, then the prospects for conflict are real, despite repeated analyses, like that of The Economist, holding that the use of nuclear weapons against Iran would be politically and morally impossible. As one unnamed administration official put it (The New York Times, February 24, 2007), "No one has defined where the red line is that we cannot let the Iranians step over."

What we are seeing now is the conservative message machine preparing the country to accept the ideas of a nuclear war and nation destruction against Iran. The technique used is the "slippery slope." It is done by degrees. Like the proverbial frog in the pot of water - if the heat is turned up slowly the frog gets used to the heat and eventually boils to death - the American public is getting gradually acclimated to the idea of war with Iran.

* First, describe Iran as evil - part of the axis of evil. An inherently evil person will inevitably do evil things and can't be negotiated with. An entire evil nation is a threat to other nations.
* Second, describe Iran's leader as a "Hitler" who is inherently "evil" and cannot be reasoned with. Refuse to negotiate with him.
* Then repeat the lie that Iran is on the verge of having nuclear weapons - weapons of mass destruction. IAEA Director General Mohamed ElBaradei says they are at best many years away.
* Call nuclear development "an existential threat" - a threat to our very existence.
* Then suggest a single "surgical" "attack" on Natanz and make it seem acceptable.
* Then find a reason to call the attack "self-defense" - or better protection for our troops from the EFPs, or single-shot canister bombs.
* Claim, without proof and without anyone even taking responsibility for the claim, that the Iranian government at its highest level is supplying deadly weapons to Shiite militias attacking our troops, while not mentioning the fact that Saudi Arabia is helping Sunni insurgents attacking our troops.
* Give "protecting our troops" as a reason for attacking Iran without getting new authorization from Congress. Claim that the old authorization for attacking Iraq implied doing "whatever is necessary to protect our troops" from Iranian intervention in Iraq.
* Argue that de-escalation in Iraq would "bleed" our troops, "weaken" America, and lead to defeat. This sets up escalation as a winning policy, if not in Iraq then in Iran.
* Get the press to go along with each step.
* Never mention the words "preventive nuclear war" or "national destruction." When asked, say, "All options are on the table." Keep the issue of nuclear war and its consequences from being seriously discussed by the national media.
* Intimidate Democratic presidential candidates into agreeing, without using the words, that nuclear war should be "on the table." This makes nuclear war and nation destruction bipartisan and even more acceptable.

Progressives managed to blunt the "surge" idea by telling the truth about "escalation." Nuclear war against Iran and nation destruction constitute the ultimate escalation.

The time has come to stop the attempt to make a nuclear war against Iran palatable to the American public. We do not believe that most Americans want to start a nuclear war or to impose nation destruction on the people of Iran. They might, though, be willing to support a tit-for-tat "surgical" "attack" on Natanz in retaliation for small canister bombs and to end Iran's early nuclear capacity.

It is time for America's journalists and political leaders to put two and two together, and ask the fateful question: Is the Bush administration seriously preparing for nuclear war and nation destruction? If the conventional GBU-28s will do the job, then why not take nuclear war off the table in the name of controlling the spread of nuclear weapons? If GBU-28s won't do the job, then it is all the more important to have that discussion.

This should not be a distraction from Iraq. The general issue is escalation as a policy, both in Iraq and in Iran. They are linked issues, not separate issues. We have learned from Iraq what lack of public scrutiny does.

George Lakoff is a Senior Fellow at the Rockridge Institute. Lakoff is Professor of Linguistics at the University of California, Berkeley.
30296  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Evolutionary biology/psychology on: March 08, 2007, 11:57:34 AM
I just realized that this subject has a thread on the "Politics & Religion" forum too, which I will bring over here when I get my wife to remind me how to do that.  embarassed

Please continue to post here, but know that there are many interesting posts there too.

30297  Politics, Religion, Science, Culture and Humanities / Politics & Religion / Re: Iran on: March 08, 2007, 11:46:39 AM
IRAN: The board of the International Atomic Energy Agency voted to suspend 22 technical aid programs to Iran as part of the expansion of international sanctions on Tehran over its refusal to halt its uranium enrichment program. The widely expected decision, which stiffens the penalties placed on Iran by the U.N. Security Council on Dec. 23, 2006, was made by consensus.
30298  Politics, Religion, Science, Culture and Humanities / Science, Culture, & Humanities / Re: Conan the B. & Robert Howard on: March 08, 2007, 11:45:21 AM

Do I understand correctly that there is a new series of Conan being released now?  I had not heard.  If so, who is publishing it, and how can I buy it?
30299  DBMA Espanol / Espanol Discussion / Re: Mexico on: March 08, 2007, 11:39:39 AM
Mexico: A Rise in Killings in Sonora State
March 07, 2007 | 18:57 GMT


Three police officers were killed March 6 in Mexico's Sonora state, the latest in a spate of drug-related slayings in this relatively quiet state. The rise in criminal activity is believed to be related to a campaign of intimidation by Mexico's drug cartels, but it could also indicate that rival cartels are moving into territory controlled by the Sinaloa federation.


The body of a municipal police officer was found March 6 in a rural area near Hermosillo, the capital of Mexico's Sonora state. The officer, whose hands and feet were bound, had apparently been executed. A note left with the body said "the problem is not with the government" and listed the names of five other police officers. This could suggest that the officer had been an informant for the cartels and was killed by fellow officers. Later that day, a municipal police officer was shot and killed while patrolling Obregon Avenue in Cananea, near the U.S. border. The night before, an agent from the Sonora State Judicial Police was executed in the parking lot of Hermosillo's state attorney general's office.

Since the beginning of the year, crime has been on the rise in Sonora state. By late February, it was estimated that 15 executions had taken place in the state in 2007 and five had occurred during the last week, including the two in Hermosillo. This is well above the state's usual homicide rate. Almost all of the victims so far have been law enforcement officials.

The killings are believed to be a reaction to Mexican President Felipe Calderon's crackdown on drug cartel operations throughout Mexico. Sonora Gov. Jose Eduardo Robinson Bours Castelo, referring to the current situation as a "period of executions," has said the killings are part of the cartels' attempts to intimidate police and dissuade them from cooperating with Mexican federal authorities in the anti-cartel campaign.

Another explanation for the increase in violence in Sonora could be the movement into the state of members of various cartels escaping areas where Calderon's crackdowns are taking place. Organized crime in Sonora is controlled by a federation of drug cartels led by the Sinaloa cartel, which originated in Sinaloa state, which borders Sonora to the south. Sonora is important to the federation as a corridor for transporting drugs from Central and South America into the United States. While federal security efforts disrupt organized crime in other states -- such as Baja California, Tamaulipas and Michoacan -- areas with less federal presence, such as Sonora, could prove to be attractive cartel sanctuaries.

Despite the increase in violence in Sonora state, the threat to U.S. citizens visiting there remains minor. The main risk remains Sonora's notoriously hazardous roadways rather than the unlikely possibility of being caught in the crossfire between cartel and law enforcement personnel.
30300  DBMA Espanol / Espanol Discussion / Re: Peru on: March 08, 2007, 11:37:42 AM
Woof to all:

I am proud to announce that Rainer and "Sniper" now head a DBMA Training Group.  grin

As soon as they send me the information they want me to post on our site so that people know how to contact them, I will do so.

The Adventure continues!
Guro Crafty