Dog Brothers Public Forum


Author Topic: Evolutionary biology/psychology  (Read 91815 times)

« on: November 01, 2006, 07:41:09 AM »

An Evolutionary Theory of Right and Wrong
NY Times
Published: October 31, 2006
Who doesn't know the difference between right and wrong? Yet that essential knowledge, generally assumed to come from parental teaching or religious or legal instruction, could turn out to have a quite different origin.

Primatologists like Frans de Waal have long argued that the roots of human morality are evident in social animals like apes and monkeys. The animals' feelings of empathy and expectations of reciprocity are essential behaviors for mammalian group living and can be regarded as a counterpart of human morality.

Marc D. Hauser, a Harvard biologist, has built on this idea to propose that people are born with a moral grammar wired into their neural circuits by evolution. In a new book, "Moral Minds" (HarperCollins, 2006), he argues that the grammar generates instant moral judgments which, in part because of the quick decisions that must be made in life-or-death situations, are inaccessible to the conscious mind.

People are generally unaware of this process because the mind is adept at coming up with plausible rationalizations for why it arrived at a decision generated subconsciously.

Dr. Hauser presents his argument as a hypothesis to be proved, not as an established fact. But it is an idea that he roots in solid ground, including his own and others' work with primates and in empirical results derived by moral philosophers.

The proposal, if true, would have far-reaching consequences. It implies that parents and teachers are not teaching children the rules of correct behavior from scratch but are, at best, giving shape to an innate behavior. And it suggests that religions are not the source of moral codes but, rather, social enforcers of instinctive moral behavior.

Both atheists and people belonging to a wide range of faiths make the same moral judgments, Dr. Hauser writes, implying "that the system that unconsciously generates moral judgments is immune to religious doctrine." Dr. Hauser argues that the moral grammar operates in much the same way as the universal grammar proposed by the linguist Noam Chomsky as the innate neural machinery for language. The universal grammar is a system of rules for generating syntax and vocabulary but does not specify any particular language. That is supplied by the culture in which a child grows up.

The moral grammar too, in Dr. Hauser's view, is a system for generating moral behavior and not a list of specific rules. It constrains human behavior so tightly that many rules are in fact the same or very similar in every society: do as you would be done by; care for children and the weak; don't kill; avoid adultery and incest; don't cheat, steal or lie.

But it also allows for variations, since cultures can assign different weights to the elements of the grammar's calculations. Thus one society may ban abortion, while another may see infanticide as a moral duty in certain circumstances. Or as Kipling observed, "The wildest dreams of Kew are the facts of Katmandu, and the crimes of Clapham chaste in Martaban."

Matters of right and wrong have long been the province of moral philosophers and ethicists. Dr. Hauser's proposal is an attempt to claim the subject for science, in particular for evolutionary biology. The moral grammar evolved, he believes, because restraints on behavior are required for social living and have been favored by natural selection because of their survival value.

Much of the present evidence for the moral grammar is indirect. Some of it comes from psychological tests of children, showing that they have an innate sense of fairness that starts to unfold at age 4. Some comes from ingenious dilemmas devised to show a subconscious moral judgment generator at work. These are known by the moral philosophers who developed them as "trolley problems."

Suppose you are standing by a railroad track. Ahead, in a deep cutting from which no escape is possible, five people are walking on the track. You hear a train approaching. Beside you is a lever with which you can switch the train to a sidetrack. One person is walking on the sidetrack. Is it O.K. to pull the lever and save the five people, though one will die?

Most people say it is.

Assume now you are on a bridge overlooking the track. Ahead, five people on the track are at risk. You can save them by throwing down a heavy object into the path of the approaching train. One is available beside you, in the form of a fat man. Is it O.K. to push him to save the five?

Most people say no, although lives saved and lost are the same as in the first problem.

Why does the moral grammar generate such different judgments in apparently similar situations? It makes a distinction, Dr. Hauser writes, between a foreseen harm (the train killing the person on the track) and an intended harm (throwing the person in front of the train), despite the fact that the consequences are the same in either case. It also rates killing an animal as more acceptable than killing a person.

Many people cannot articulate the foreseen/intended distinction, Dr. Hauser says, a sign that it is being made at inaccessible levels of the mind. This inability challenges the general belief that moral behavior is learned. For if people cannot articulate the foreseen/intended distinction, how can they teach it?

Dr. Hauser began his research career in animal communication, working with vervet monkeys in Kenya and with birds. He is the author of a standard textbook on the subject, "The Evolution of Communication." He began to take an interest in the human animal in 1992 after psychologists devised experiments that allowed one to infer what babies are thinking. He found he could repeat many of these experiments in cotton-top tamarins, allowing the cognitive capacities of infants to be set in an evolutionary framework.

His proposal of a moral grammar emerges from a collaboration with Dr. Chomsky, who had taken an interest in Dr. Hauser's ideas about animal communication. In 2002 they wrote, with Dr. Tecumseh Fitch, an unusual article arguing that the faculty of language must have developed as an adaptation of some neural system possessed by animals, perhaps one used in navigation. From this interaction Dr. Hauser developed the idea that moral behavior, like language behavior, is acquired with the help of an innate set of rules that unfolds early in a child's development.

Social animals, he believes, possess the rudiments of a moral system in that they can recognize cheating or deviations from expected behavior. But they generally lack the psychological mechanisms on which the pervasive reciprocity of human society is based, like the ability to remember bad behavior, quantify its costs, recall prior interactions with an individual and punish offenders. "Lions cooperate on the hunt, but there is no punishment for laggards," Dr. Hauser said.

The moral grammar now universal among people presumably evolved to its final shape during the hunter-gatherer phase of the human past, before the dispersal from the ancestral homeland in northeast Africa some 50,000 years ago. This may be why events before our eyes carry far greater moral weight than happenings far away, Dr. Hauser believes, since in those days one never had to care about people remote from one's environment.

Dr. Hauser believes that the moral grammar may have evolved through the evolutionary mechanism known as group selection. A group bound by altruism toward its members and rigorous discouragement of cheaters would be more likely to prevail over a less cohesive society, so genes for moral grammar would become more common.

Many evolutionary biologists frown on the idea of group selection, noting that genes cannot become more frequent unless they benefit the individual who carries them, and a person who contributes altruistically to people not related to him will reduce his own fitness and leave fewer offspring.

But though group selection has not been proved to occur in animals, Dr. Hauser believes that it may have operated in people because of their greater social conformity and willingness to punish or ostracize those who disobey moral codes.

"That permits strong group cohesion you don't see in other animals, which may make for group selection," he said.

His proposal for an innate moral grammar, if people pay attention to it, could ruffle many feathers. His fellow biologists may raise eyebrows at his proposing such a big idea when much of the supporting evidence has yet to be acquired. Moral philosophers may not welcome a biologist's bid to annex their turf, despite Dr. Hauser's expressed desire to collaborate with them.

Nevertheless, researchers? idea of a good hypothesis is one that generates interesting and testable predictions. By this criterion, the proposal of an innate moral grammar seems unlikely to disappoint.


« Reply #1 on: November 03, 2006, 03:18:38 PM »

Looking at Flipper, Seeing Ourselves
Published: October 9, 2006

NO one blinks when a celebrity is called "vacuous" or a politician a
"moron" - but when headlines screamed that dolphins are "dimwits" and
"flippin' idiots," I was truly shocked. Is this a way to talk about an
animal so revered that there are several Web domain names that include
"smart dolphin"?

This is not to say that one should believe everything about them. For example, their supposed "smile" is fake (they lack the facial musculature for expressions), and all we seem to have learned from chatting "dolphinese" with them is that lone male dolphins are keenly interested in female company.

Nevertheless, it's going too far to say that dolphins are dimwits. Yet this
is the claim of Paul Manger, a South African scientist who says that
dolphins' relatively large brains are due simply to a preponderance of fatty
glial cells. These glia produce heat, which allows the brain's neurons to do
their job in the cold ocean.

Based on this observation, Professor Manger couldn't resist speculating that
the intelligence of dolphins and other cetaceans (like whales and porpoises)
is vastly overrated. He offered gems of insight, such as that dolphins are
too stupid to jump over a slight barrier (as when they are trapped in a tuna
net), whereas most other animals will. Even a goldfish will jump out of its
bowl, he noted.

If we skip the technicalities - such as that glial cells are not simply
insulation, that they add connectivity to the brain, and that humans, too,
have many more glial cells than neurons - the question remains why the
prospect of animal intelligence sets off such controversy. Could it be that
the huge size of the dolphin brain, which exceeds ours by 15 percent or
more, threatens the human ego? Are we to ignore the billions and billions of
neurons that dolphins do possess?

The goldfish remark reminded me of a common strategy of those who play down
animal intelligence. They love to "demonstrate" remarkable cognitive feats
in small-brained species: if a rat or pigeon can do it, it can't be that
special. Thus, some pigeons have been trained to use "symbolic communication" by pecking a key marked "thank you!" that delivered food to another pigeon. And they have also been conditioned to peck at their own bodies in front of a mirror, supporting the claim that they are self-aware.

Clearly, pigeons are trainable. But is this truly comparable to the actions
of Presley, a dolphin at the New York Aquarium, who, without any rewards,
reacted to being marked with paint by taking off at high speed to a distant
part of his tank where a mirror was mounted? There he spun round and round,
the way we do in a dressing room, appearing to check himself out.

What is so upsetting to some people about the closeness between animal and
human intelligence, or between animal and human emotions, for that matter?
Just saying that animals can learn from each other, and hence have
rudimentary cultures, or that they can be jealous or empathic is taken by
some as a personal affront. Accusations of anthropomorphism will fly, and we'll be urged to be parsimonious in our explanations. The message is that animals are not humans.

That much is obvious. But it is equally true that humans are animals. Is it
so outlandish, from an evolutionary standpoint, to assume that if a
large-brained mammal acts similarly to us under similar circumstances, the
psychology behind its behavior is probably similar, too? This is true
parsimony in the scientific sense, the idea that the simplest explanation is
often the best. Those who resist this framework are in "anthropodenial" -
they cling to unproven differences.

Since Aristotle, humans have known that dolphins are incredibly social. Each
individual produces its own unique whistle sound by which the others
recognize him or her. They enjoy lifelong bonds and reconcile after fights
by means of "petting." The males form power-seeking coalitions, not unlike
the politics of chimpanzees and humans. Dolphins also support sick
companions near the surface, where they can breathe. They may encircle a
school of herring, driving the fish together in a compact ball and releasing
bubbles to keep them in place, after which they pick their food like fruit
from a tree.

In captivity, dolphins are known to imitate the gait and gestures of people
walking by, and to outsmart their keepers. One female dolphin that was rewarded with a fish for every piece of debris she collected from her tank managed to con her trainers into a bounty of snacks. They discovered she had been hiding large items like newspapers underwater, only to rip small pieces from them, bringing these to her trainer one by one.

There are tons of such observations, which is why most of us believe in
dolphin intelligence - glia or no glia. It also explains why the slaughter
of dolphins, as still occurs every year in Japan, arouses such strong
emotions and controversy.

Still, I must admit that the whole dolphin affair has also offered me some
fresh insights. From now on, if I find my goldfish thrashing on the floor, I
will congratulate him before dropping him back into his bowl.

Frans de Waal, a professor of psychology at Emory University, is the author
of "Our Inner Ape."

« Reply #2 on: November 09, 2006, 02:24:14 PM »

Once this is sequenced it'll be interesting to compare and contrast it to the human genome:

Scientists Create Neanderthal Genome

Wednesday, 8th November 2006, 19:06
Scientists are reconstructing the genome of Neanderthals - the close relations of modern man.

The ambitious project involves isolating genetic fragments from fossils of the prehistoric beings who originally inhabited Europe to map their complete DNA.

The Neanderthal people were believed to have died out about 35,000 years ago - at a time when modern humans were advancing across the continent.

Lead researcher Dr Svante Paabo, an evolutionary geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, said: "This would be the first time we have sequenced the entire genome of an extinct organism."

But the prospect of using the genome to produce a living Neanderthal has been ruled out.

A popular caricature portrays Neanderthals as beetle-browed brutes - but this is far from the truth, reports New Scientist.

"Neanderthals were sophisticated stone-tool makers and made razor-sharp knives out of flint," said Dr Richard Klein, an anthropologist at Stanford University, California.

"They made fires when and where they wanted and seem to have made a living by hunting large mammals such as bison and deer."

Neanderthals also buried their dead, which, fortunately for researchers, increases the odds of the bones being preserved.

"By sequencing their entire genome we can begin to learn more about their biology," said Dr Eddy Rubin, a geneticist at the Lawrence Berkeley National Laboratory in Walnut Creek, California.

The genetic data could also solve the biggest mystery of all: why did Neanderthals die out while modern humans went on to conquer the globe?

Dr Paabo and colleagues pioneered the genetic study of Neanderthals by extracting and decoding fragments of
mitochondrial DNA (mtDNA) from the bones of the original specimen, discovered in 1856 in the Neander Valley in Germany.

The mtDNA Dr Paabo sequenced suggested humans split from Neanderthals about 500,000 years ago - which fits neatly with the fossil record. It also suggested Neanderthals did not interbreed with our ancestors.

Dr Paabo's team have selected two Neanderthal specimens to work on because both have "clean" DNA that is relatively uncontaminated.

One is a 38,000-year-old fossil from Vindija, Croatia. The other is the original specimen, which, despite being
extensively handled, has unusually clean DNA in its right upper arm bone.

During its lifetime the individual lost the use of its left arm after breaking it and had to rely on the right arm - causing the bones to grow thicker and denser than usual.

After death this shielded the DNA from contamination. The researchers are also hunting for new specimens that can be sampled before other people get their hands on them.

They have so far sequenced about a million base pairs of nuclear DNA from the Croatian fossil and hope to publish a draft of the whole genome in two years.

"It is definitely possible to sequence the entire genome from such well-preserved specimens," said Dr Eske Willerslev, an expert in ancient DNA at the University of Copenhagen, Denmark.

"Perhaps the biggest difficulty will be verifying the sequences obtained are genuinely from the Neanderthal genome and not a contaminant - as so much of it will be identical to the human genome."

The genome is sure to fuel the particularly intense controversy that has surrounded a
much-vaunted aspect of human uniqueness - language.

"There's been a debate going for more than 30 years about the speech capabilities of Neanderthals," says Dr Philip
Lieberman, a cognitive scientist at Brown University in Providence, Rhode Island.

"It's clear from the fossil record and comparisons with modern humans that Neanderthals could speak."

But the prospect of the genome providing the blueprint for resurrecting a living "Jurassic-Park-style" Neanderthal is unlikely.

Dr Paabo said: "We would be able to create a physical Neanderthal genome but we will not be able to recreate a Neanderthal - even if we wanted to."

« Reply #3 on: November 13, 2006, 11:10:06 PM »

New Scientist news service, 13 November 2006
Roxanne Khamsi

A mother's diet can change the behaviour of a specific gene for at least two subsequent generations, a new study demonstrates for the first time.

Feeding mice an enriched diet during pregnancy silenced a gene for light fur in their pups. And even though these pups ate a standard, un-enriched diet, the gene remained less active in their subsequent offspring.

The findings could help explain the curious results from recent studies of human populations, including one showing that the grandchildren of well-fed Swedes had a greater risk of diabetes.

The new mouse experiment lends support to the idea that we inherit not only our genes from our parents, but also a set of instructions that tell the genes when to become active. These instructions appear to be passed on through "epigenetic" changes to DNA: genes can be activated or silenced according to the chemical groups that are added onto them.

Gene silencer

David Martin at the Children's Hospital Oakland Research Institute in California, US, and colleagues used a special strain of genetically identical mice with an overactive version of a gene that influences fur colour. Mice with the AVY version of this gene generally have golden fur.

Half of the mice were given a diet enriched with nutrients such as vitamin B12 and zinc. These nutrients are known to increase the availability of the "methyl" chemical groups that are responsible for silencing genes. The rest of the mice received a standard diet.

The pups of mice on the standard diet generally had golden fur. But a high proportion of those born to mice on the enriched diet had dark brown fur.

Martin believes that the nutrient-rich maternal diet caused silencing of the pups' AVY genes while they developed in the womb.

Passed down

Intriguingly, even though all of the pups in this generation received a standard diet, those that had been exposed to a high-nutrient diet while in the womb later gave birth to dark-coated offspring. Their control counterparts, by comparison, produced offspring with golden fur.

This shows that environmental factors, such as an enriched diet, can affect the activity of the AVY gene for at least two generations, the researchers say.

"The results make it clear that nutritional status can affect not only that individual, but that individual's children as well," says study member Kenneth Beckman.

Skin colour

Beckman notes that the AVY gene is linked to weight and diabetes risk. He adds that there is some evidence that a related gene in humans might affect skin colour ? but it is unknown if it also affects weight.

Even though humans may have a similar gene, they should not make dietary changes based on the results of the mouse experiment, researchers stress. "It would be irresponsible to make any prescriptions about human behaviour based on these findings," says Martin.

An earlier Swedish study, which used historical data on harvests in Sweden, found that a youngster had a quadrupled risk of diabetes if their grandfather had good access to food during his own boyhood (see "Grandad's diet affects descendants' health").

Journal reference: Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.0607090103)


« Reply #4 on: November 17, 2006, 03:24:02 AM »

So broad in scope it seems misfiled; Dalrymple is becoming one of my favorite essayists.

The Gift of Language -

Theodore Dalrymple

No, Dr. Pinker, it's not just from nature.

Now that I've retired early from medical practice in a slum hospital and the prison next door, my former colleagues sometimes ask me, not without a trace of anxiety, whether I think that I made the right choice or whether I miss my previous life. They are good friends and fine men, but it is only human nature not to wish unalloyed happiness to one who has chosen a path that diverges, even slightly, from one's own.

Fortunately, I do miss some aspects of my work: if I didn't, it would mean that I had not enjoyed what I did for many years and had wasted a large stretch of my life. I miss, for instance, the sudden illumination into the worldview of my patients that their replies to simple questions sometimes gave me. I still do a certain amount of medico-legal work, preparing psychiatric reports on those accused of crimes, and recently a case reminded me of how sharply a few words can bring into relief an entire attitude toward life and shed light on an entire mental hinterland.

A young woman was charged with assault, under the influence of alcohol and marijuana, on a very old lady about five times her age. Describing her childhood, the young accused mentioned that her mother had once been in trouble with the police.

"What for?" I asked.

"She was on the Social [Security] and working at the same time."

"What happened?" I asked.

"She had to give up working." The air of self-evidence with which she said this revealed a whole world of presuppositions. For her, and those around her, work was the last resort; economic dependence on state handouts was the natural condition of man.

I delighted in what my patients said. One of them always laced his statements with proverbs, which he invariably mangled. "Sometimes, doctor," he said to me one day, "I feel like the little boy with his finger in the dike, crying wolf." And I enjoyed the expressive argot of prison. The prison officers, too, had their own language. They called a loquacious prisoner "verbal" if they believed him to be mad, and "mouthy" if they believed him to be merely bad and willfully misbehaving.

Brief exchanges could so entertain me that on occasion they transformed duty into pleasure. Once I was called to the prison in the early hours to examine a man who had just tried to hang himself. He was sitting in a room with a prison officer. It was about three in the morning, the very worst time to be roused from sleep.

"The things you have to do for Umanity, sir," said the prison officer to me.

The prisoner, looking bemused, said to him, "You what?"

"U-manity," said the prison officer, turning to the prisoner. "You're Uman, aren't you?"

It was like living in a glorious comic passage in Dickens.

For the most part, though, I was struck not by the verbal felicity and invention of my patients and those around them but by their inability to express themselves with anything like facility: and this after 11 years of compulsory education, or (more accurately) attendance at school.

With a very limited vocabulary, it is impossible to make, or at least to express, important distinctions and to examine any question with conceptual care. My patients often had no words to describe what they were feeling, except in the crudest possible way, with expostulations, exclamations, and physical displays of emotion. Often, by guesswork and my experience of other patients, I could put things into words for them, words that they grasped at eagerly. Everything was on the tip of their tongue, rarely or never reaching the stage of expression out loud. They struggled even to describe in a consecutive and logical fashion what had happened to them, at least without a great deal of prompting. Complex narrative and most abstractions were closed to them.

In their dealings with authority, they were at a huge disadvantage - a disaster, since so many of them depended upon various public bureaucracies for so many of their needs, from their housing and health care to their income and the education of their children. I would find myself dealing on their behalf with those bureaucracies, which were often simultaneously bullying and incompetent; and what officialdom had claimed for months or even years to be impossible suddenly, on my intervention, became possible within a week. Of course, it was not my mastery of language alone that produced this result; rather, my mastery of language signaled my capacity to make serious trouble for the bureaucrats if they did not do as I asked. I do not think it is a coincidence that the offices of all those bureaucracies were increasingly installing security barriers against the physical attacks on the staff by enraged but inarticulate dependents.

All this, it seems to me, directly contradicts our era's ruling orthodoxy about language. According to that orthodoxy, every child, save the severely brain-damaged and those with very rare genetic defects, learns his or her native language with perfect facility, adequate to his needs. He does so because the faculty of language is part of human nature, inscribed in man's physical being, as it were, and almost independent of environment. To be sure, today's language theorists concede that if a child grows up completely isolated from other human beings until the age of about six, he will never learn language adequately; but this very fact, they argue, implies that the capacity for language is "hardwired" in the human brain, to be activated only at a certain stage in each individual's development, which in turn proves that language is an inherent biological characteristic of mankind rather than a merely cultural artifact. Moreover, language itself is always rule-governed; and the rules that govern it are universally the same, when stripped of certain minor incidentals and contingencies that superficially appear important but in reality are not.

It follows that no language or dialect is superior to any other and that modes of verbal communication cannot be ranked according to complexity, expressiveness, or any other virtue. Thus, attempts to foist alleged grammatical "correctness" on native speakers of an "incorrect" dialect are nothing but the unacknowledged and oppressive exercise of social control - the means by which the elites deprive whole social classes and peoples of self-esteem and keep them in permanent subordination. If they are convinced that they can't speak their own language properly, how can they possibly feel other than unworthy, humiliated, and disenfranchised? Hence the refusal to teach formal grammar is both in accord with a correct understanding of the nature of language and is politically generous, inasmuch as it confers equal status on all forms of speech and therefore upon all speakers.

The locus classicus of this way of thinking, at least for laymen such as myself, is Steven Pinker's book The Language Instinct. A bestseller when first published in 1994, it is now in its 25th printing in the British paperback version alone, and its wide circulation suggests a broad influence on the opinions of the intelligent public. Pinker is a professor of psychology at Harvard University, and that institution's great prestige cloaks him, too, in the eyes of many. If Professor Pinker were not right on so important a subject, one to which he has devoted much study and brilliant intelligence, would he have tenure at Harvard?

Pinker nails his colors to the mast at once. His book, he says, "will not chide you about proper usage . . ." because, after all, "[l]anguage is a complex, specialized skill, which . . . is qualitatively the same in every individual. . . . Language is no more a cultural invention than is upright posture," and men are as naturally equal in their ability to express themselves as in their ability to stand on two legs. "Once you begin to look at language . . . as a biological adaptation to communicate information," Pinker continues, "it is no longer as tempting to see language as an insidious shaper of thought." Every individual has an equal linguistic capacity to formulate the most complex and refined thoughts. We all have, so to speak, the same tools for thinking. "When it comes to linguistic form," Pinker says, quoting the anthropologist Edward Sapir, "Plato walks with the Macedonian swineherd, Confucius with the head-hunting savage of Assam." To put it another way, "linguistic genius is involved every time a child learns his or her mother tongue."

The old-fashioned and elitist idea that there is a "correct" and "incorrect" form of language no doubt explains the fact that "[l]inguists repeatedly run up against the myth that working-class people . . . speak a simpler and a coarser language. This is a pernicious illusion. . . . Trifling differences between the dialect of the mainstream and the dialect of other groups . . . are dignified as badges of 'proper grammar.'" These are, in fact, the "hobgoblins of the schoolmarm," and ipso facto contemptible. In fact, standard English is one of those languages that "is a dialect with an army and a navy." The schoolmarms he so slightingly dismisses are in fact but the linguistic arm of a colonial power - the middle class - oppressing what would otherwise be a much freer and happier populace. "Since prescriptive rules are so psychologically unnatural that only those with access to the right schooling can abide by them, they serve as shibboleths, differentiating the elite from the rabble."

Children will learn their native language adequately whatever anyone does, and the attempt to teach them language is fraught with psychological perils. For example, to “correct” the way a child speaks is potentially to give him what used to be called an inferiority complex. Moreover, when schools undertake such correction, they risk dividing the child from his parents and social milieu, for he will speak in one way and live in another, creating hostility and possibly rejection all around him. But happily, since every child is a linguistic genius, there is no need to do any such thing. Every child will have the linguistic equipment he needs, merely by virtue of growing older.

I need hardly point out that Pinker doesn’t really believe anything of what he writes, at least if example is stronger evidence of belief than precept. Though artfully sown here and there with a demotic expression to prove that he is himself of the people, his own book is written, not surprisingly, in the kind of English that would please schoolmarms. I doubt very much whether it would have reached its 25th printing had he chosen to write it in the dialect of rural Louisiana, for example, or of the slums of Newcastle-upon-Tyne. Even had he chosen to do so, he might have found the writing rather difficult. I should like to see him try to translate a sentence from his book that I have taken at random, “The point that the argument misses is that although natural selection involves incremental steps that enhance functioning, the enhancements do not have to be an existing module,” into the language of the Glasgow or Detroit slums.

In fact, Pinker has no difficulty in ascribing greater or lesser expressive virtues to languages and dialects. In attacking the idea that there are primitive languages, he quotes the linguist Joan Bresnan, who describes English as “a West Germanic language spoken in England and its former colonies” (no prizes for guessing the emotional connotations of this way of describing it). Bresnan wrote an article comparing the use of the dative in English and Kivunjo, a language spoken on the slopes of Mount Kilimanjaro. Its use is much more complex in the latter language than in the former, making far more distinctions. Pinker comments: “Among the clever gadgets I have glimpsed in the grammars of so-called primitive groups, the complex Cherokee pronoun system seems especially handy. It distinguishes among ‘you and I,’ ‘another person and I,’ ‘several other people and I,’ and ‘you, one or more other persons, and I,’ which English crudely collapses into the all-purpose pronoun we.” In other words, crudity and subtlety are concepts that apply between languages. And if so, there can be no real reason why they cannot apply within a language—why one man’s usage should not be better, more expressive, subtler, than another’s.

Similarly, Pinker attacks the idea that the English of the ghetto, Black English Vernacular, is in any way inferior to standard English. It is rule-governed like (almost) all other language. Moreover, “If the psychologists had listened to spontaneous conversations, they would have rediscovered the commonplace fact that American black culture is highly verbal; the subculture of street youths in particular is famous in the annals of anthropology for the value placed on linguistic virtuosity.” But in appearing to endorse the idea of linguistic virtuosity, he is, whether he likes it or not, endorsing the idea of linguistic lack of virtuosity. And it surely requires very little reflection to come to the conclusion that Shakespeare had more linguistic virtuosity than, say, the average contemporary football player. Oddly enough, Pinker ends his encomium on Black English Vernacular with a schoolmarm’s pursed lips: “The highest percentage of ungrammatical sentences [are to be] found in the proceedings of learned academic conferences.”

Power User
Posts: 784

« Reply #5 on: November 17, 2006, 03:24:42 AM »

Over and over again, Pinker stresses that children do not learn language by imitation; rather, they learn it because they are biologically predestined to do so. “Let us do away,” he writes, with what one imagines to be a rhetorical sweep of his hand, “with the folklore that parents teach their children language.” It comes as rather a surprise, then, to read the book’s dedication: “For Harry and Roslyn Pinker, who gave me language.”

Surely he cannot mean by this that they gave him language in the same sense as they gave him hemoglobin—that is to say, that they were merely the sine qua non of his biological existence as Steven Pinker. If so, why choose language of all the gifts that they gave him? Presumably, he means that they gave him the opportunity to learn standard English, even if they did not speak it themselves.

It is utterly implausible to suggest that imitation of parents (or other social contacts) has nothing whatever to do with the acquisition of language. I hesitate to mention so obvious a consideration, but Chinese parents tend to have Chinese-speaking children, and Portuguese parents Portuguese-speaking ones. I find it difficult to believe that this is entirely a coincidence and that imitation has nothing to do with it. Moreover, it is a sociological truism that children tend to speak not merely the language but the dialect of their parents.

Of course, they can escape it if they choose or need to do so: my mother, a native German-speaker, arrived in England aged 18 and learned to speak standard English without a trace of a German accent (which linguists say is a rare accomplishment) and without ever making a grammatical mistake. She didn’t imitate her parents, perhaps, but she imitated someone. After her recent death, I found her notebooks from 1939, in which she painstakingly practiced English, the errors growing fewer until there were none. I don’t think she would have been favorably impressed by Professor Pinker’s disdainful grammatical latitudinarianism—the latitudinarianism that, in British schools and universities, now extends not only to grammar but to spelling, as a friend of mine discovered recently.

A teacher in a state school gave his daughter a list of spellings to learn as homework, and my friend noticed that three out of ten of them were wrong. He went to the principal to complain, but she looked at the list and asked, “So what? You can tell what the words are supposed to mean.” The test for her was not whether the spellings were correct but whether they were understandable. So much for the hobgoblins of contemporary schoolmarms.

The contrast between a felt and lived reality—in this case, Pinker’s need to speak and write standard English because of its superior ability to express complex ideas—and the denial of it, perhaps in order to assert something original and striking, is characteristic of an intellectual climate in which the destruction of moral and social distinctions is proof of the very best intentions.

Pinker’s grammatical latitudinarianism, when educationists like the principal of my friend’s daughter’s school take it seriously, has the practical effect of encouraging those born in the lower reaches of society to remain there, enclosing them in the mental world of their particular milieu. Of course, this is perfectly all right if you also believe that all stations in life are equally good and desirable and that there is nothing to be said for articulate reflection upon human existence. In other words, grammatical latitudinarianism is the natural ideological ally of moral and cultural relativism.

It so happens that I observed the importance of mastering standard, schoolmarmly grammatical speech in my own family. My father, born two years after his older brother, had the opportunity, denied his older brother for reasons of poverty, to continue his education. Accordingly, my father learned to speak and write standard English, and I never heard him utter a single word that betrayed his origins. He could discourse philosophically without difficulty; I sometimes wished he had been a little less fluent.

My uncle, by contrast, remained trapped in the language of the slums. He was a highly intelligent man and what is more a very good one: he was one of those rare men, much less common than their opposite, from whom goodness radiated almost as a physical quality. No one ever met him without sensing his goodness of heart, his generosity of spirit.

But he was deeply inarticulate. His thoughts were too complex for the words and the syntax available to him. All through my childhood and beyond, I saw him struggle, like a man wrestling with an invisible boa constrictor, to express his far from foolish thoughts—thoughts of a complexity that my father expressed effortlessly. The frustration was evident on his face, though he never blamed anyone else for it. When, in Pinker’s book, I read the transcript of an interview by the neuropsychologist Howard Gardner with a man who suffered from expressive dysphasia after a stroke—that is to say, an inability to articulate thoughts in language—I was, with great sadness, reminded of my uncle. Gardner asked the man about his job before he had a stroke.

“I’m a sig . . . no . . . man . . . uh, well, . . . again.” These words were emitted slowly, and with great effort. . . . “Let me help you,” I interjected. “You were a signal . . .” “A sig-nal man . . . right,” [he] completed my phrase triumphantly. “Were you in the Coast Guard?” “No, er, yes, yes . . . ship . . . Massachu . . . chusetts . . . Coast-guard . . . years.”

It seemed to me that it was a cruel fate for such a man as my uncle not to have been taught the standard English that came so naturally to my father. As Montaigne tells us, there is no torture greater than that of a man who is unable to express what is in his soul.

Beginning in the 1950s, Basil Bernstein, a London University researcher, demonstrated the difference between the speech of middle- and working-class children, controlling for whatever it is that IQ measures. Working-class speech, tethered closely to the here and now, lacked the very aspects of standard English needed to express abstract or general ideas and to place personal experience in temporal or any other perspective. Thus, unless Pinker’s despised schoolmarms were to take the working-class children in hand and deliberately teach them another speech code, they were doomed to remain where they were, at the bottom of a society that was itself much the poorer for not taking full advantage of their abilities, and that indeed would pay a steep penalty for not doing so. An intelligent man who can make no constructive use of his intelligence is likely to make a destructive, and self-destructive, use of it.

If anyone doubts that inarticulacy can be a problem, I recommend reading a report by the Joseph Rowntree Trust about British girls who get themselves pregnant in their teens (and sometimes their early teens) as an answer to their existential problems. The report is not in the least concerned with the linguistic deficiencies of these girls, but they are evident in the transcript in every reply to every question. Without exception, the girls had had a very painful experience of life and therefore much to express from hearts that must have been bursting. I give only one example, but it is representative. A girl, aged 17, explains why it is wonderful to have a baby:

Maybe it’s just—yeah, because maybe just—might be (um) it just feels great when—when like, you’ve got a child who just—you know—following you around, telling you they love you and I think that’s—it’s quite selfish, but that’s one of the reasons why I became a mum because I wanted someone who’ll—you know—love ’em to bits ’cos it’s not just your child who’s the centre of your world, and that feels great as well, so I think—it’s brilliant. It is fantastic because—you know—they’re—the child’s dependent on you and you know that (um)—that you—if you—you know—you’ve gotta do everything for the child and it just feels great to be depended on.

As I know from the experience of my patients, there is no reason to expect her powers of expression to increase spontaneously with age. Any complex abstractions that enter her mind will remain inchoate, almost a nuisance, like a fly buzzing in a bottle that it cannot escape. Her experience is opaque even to herself, a mere jumble from which it will be difficult or impossible to learn because, for linguistic reasons, she cannot put it into any kind of perspective or coherent order.

I am not of the ungenerous and empirically mistaken party that writes off such people as inherently incapable of anything better or as already having achieved so much that it is unnecessary to demand anything else of them, on the grounds that they naturally have more in common with Shakespeare than with speechless animal creation. Nor, of course, would I want everyone to speak all the time in Johnsonian or Gibbonian periods. Not only would it be intolerably tedious, but much linguistic wealth would vanish. But everyone ought to have the opportunity to transcend the limitations of his linguistic environment, if it is a restricted one—which means that he ought to meet a few schoolmarms in his childhood. Everyone, save the handicapped, learns to run without being taught; but no child runs 100 yards in nine seconds, or even 15 seconds, without training. It is fatuous to expect that the most complex of human faculties, language, requires no special training to develop it to its highest possible power.
Power User
Posts: 42482

« Reply #6 on: December 10, 2006, 01:39:38 PM »

SOUTH NAKNEK, Alaska — The National Geographic Society’s multimillion-dollar research project to collect DNA from indigenous groups around the world in the hopes of reconstructing humanity’s ancient migrations has come to a standstill on its home turf in North America.

A review board stopped DNA research in South Naknek, Alaska.

Billed as the “moon shot of anthropology,” the Genographic Project intends to collect 100,000 indigenous DNA samples. But for four months, the project has been on hold here as it scrambles to address questions raised by a group that oversees research involving Alaska natives.

At issue is whether scientists who need DNA from aboriginal populations to fashion a window on the past are underselling the risks to present-day donors. Geographic origin stories told by DNA can clash with long-held beliefs, threatening a world view some indigenous leaders see as vital to preserving their culture.

They argue that genetic ancestry information could also jeopardize land rights and other benefits that are based on the notion that their people have lived in a place since the beginning of time.

“What if it turns out you’re really Siberian and then, oops, your health care is gone?” said Dr. David Barrett, a co-chairman of the Alaska Area Institutional Review Board, which is sponsored by the Indian Health Service, a federal agency. “Did anyone explain that to them?”

Such situations have not come up, and officials with the Genographic Project discount them as unlikely. Spencer Wells, the population geneticist who directs the project, says it is paternalistic to imply that indigenous groups need to be kept from the knowledge that genetics might offer.

“I don’t think humans at their core are ostriches,” Dr. Wells said. “Everyone has an interest in where they came from, and indigenous people have more of an interest in their ancestry because it is so important to them.”

But indigenous leaders point to centuries of broken promises to explain why they believe their fears are not far-fetched. Scientific evidence that American Indians or other aboriginal groups came from elsewhere, they say, could undermine their moral basis for sovereignty and chip away at their collective legal claims.

“It’s a benefit to science, probably,” said Dr. Mic LaRoque, the Alaska board’s other co-chairman and a member of the Turtle Mountain Chippewa Tribe of North Dakota. “But I’m not convinced it’s a benefit to the tribes.”

The pursuit of indigenous DNA is driven by a desire to shed light on questions for which the archeological evidence is scant. How did descendants of the hunter-gatherers who first left humanity’s birthplace in east Africa some 65,000 years ago come to inhabit every corner of the Earth? What routes did they take? Who got where, and when?

As early humans split off in different directions, distinct mutations accumulated in the DNA of each population. Like bread crumbs, these genetic markers, passed on intact for millennia, can reveal the trail of the original pioneers. All non-Africans share a mutation that arose in the ancestors of the first people to leave the continent, for instance. But the descendants of those who headed north and lingered in the Middle East carry a different marker from those who went southeast toward Asia.
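The bread-crumb picture above can be sketched as a toy model. This is purely illustrative: the tree below contains only three branches, and the marker labels (borrowed from the Y-chromosome naming convention) are used schematically, not as a claim about the project's actual data.

```python
# Toy lineage tree: each branch acquires one private mutation, and a
# population carries every marker accumulated along its path to the root.
LINEAGES = {
    "out_of_africa": {"parent": None, "new_marker": "M168"},
    "middle_east":   {"parent": "out_of_africa", "new_marker": "M89"},
    "south_asia":    {"parent": "out_of_africa", "new_marker": "M130"},
}

def markers(lineage):
    """Collect all markers inherited along the path back to the root."""
    acquired = set()
    node = lineage
    while node is not None:
        acquired.add(LINEAGES[node]["new_marker"])
        node = LINEAGES[node]["parent"]
    return acquired

# Both descendant populations share the out-of-Africa marker, but each
# also carries the private marker of its own branch.
shared = markers("middle_east") & markers("south_asia")
```

Shared markers place two populations on a common stretch of the migration route; markers found in only one of them show where the routes diverged.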

Most of the world’s six billion people, however, are too far removed from wherever their ancestors originally put down roots to be useful to population geneticists. The Genographic Project is focusing on DNA from people still living in their ancestral homelands because they provide the crucial geographic link between genetic markers found today and routes traveled long ago.

In its first 18 months, the project’s scientists have had considerable success, persuading more than 18,000 people in off-the-grid places like the east African island of Pemba and the Tibesti Mountains of Chad to donate their DNA. When the North American team arrived in southwestern Alaska, they found volunteers offering cheek swabs and family histories for all sorts of reasons.

The council members of the Native Village of Georgetown, for instance, thought the project could bolster a sense of cultural pride.


Glenn Fredericks, president of the Georgetown tribe, was eager for proof of an ancient unity between his people and American Indians elsewhere that might create greater political power. “They practice the same stuff, the lower-48 natives, as we do,” Mr. Fredericks said. “Did we exchange people? It would be good to know.”

Others said the test would finally force an acknowledgment that they were here first, undermining those who see the government as having “given” them their land.

Still others were interested in the mechanics of migration: “Were the lands all combined? Did they get here by boat?”

For many nonindigenous Americans who feel disconnected from their roots, the project has also struck a chord: nearly 150,000 have scraped cells from their cheek and sent them to the society with $100 to learn what scientists know so far about how and where their individual forebears lived beyond the mists of prehistory.

By giving the broader public a way to participate, though it is likely to generate little scientific payoff, the project has created an unusual set of stakeholders with a personal interest in its success. More details, the project explains in the ancestral sketches it gives individuals, will come only with more indigenous DNA.

“I think you have to be sensitive to these cultures,” said Jesse R. Sweeney, 32, a bankruptcy lawyer in Detroit who hopes the millennia-size gaps in his own ancestors’ story will eventually be filled in. “But hopefully they will change their mind and contribute to the research.”

Mr. Sweeney’s DNA places his maternal ancestors in the Middle East about 50,000 years ago. After that, they may have gone north. Or maybe south: “This is where the genetic clues get murky and your DNA trail goes cold,” read the conclusion to his test results on the project’s Web site. “By working together with indigenous peoples around the globe, we are learning more about these ancient migrations.”

The first large effort to collect indigenous DNA since federal financing was withdrawn from a similar proposal amid indigenous opposition in the mid-1990s, the Genographic Project has drawn quiet applause from many geneticists for resurrecting scientific ambitions that have grown more pressing. As indigenous groups intermarry and disperse at an ever-accelerating pace, many scientists believe the chance to capture human history is fast disappearing.

“Everyone else had given up,” said Mark Stoneking, a professor at the Max Planck Institute for Evolutionary Anthropology. “If they get even a fraction of what they are trying for, it will be very useful.”

Unlike the earlier Human Genome Diversity Project, condemned by some groups as “biocolonialism” because scientists may have profited from genetic data that could have been used to develop drugs, the Genographic Project promises to patent nothing and to avoid collecting medical information. The project has designated half the proceeds from the sale of kits to the public for programs designed to preserve traditional cultures and language.

In May, project officials held a stormy meeting in New York with the indigenous rights group Cultural Survival while protestors carried signs reading “National Geographic Sucks Indigenous Blood.” Shortly after, the United Nations Permanent Forum on Indigenous Issues recommended suspending the project.

On the ground, every region has its challenges. To make scientific progress, the project’s geneticists are finding they must first navigate an unfamiliar tangle of political, religious and personal misgivings.

Pierre Zalloua, the project director in the Middle East, faces suspicion that he is an emissary of an opposing camp trying to prove their lineages are not important. Himla Soodyall, the project’s South African director, finds herself trying to explain to people who worship their ancestors what more her research could add. In Australia, some aboriginal groups have refused to cooperate.

But among the 10 geneticists the society has given the task of collecting 10,000 samples each by the spring of 2010, Theodore G. Schurr, the project’s North American director, is in last place. Fewer than 100 vials of DNA occupy a small plastic box in his laboratory’s large freezer at the University of Pennsylvania, where he is an assistant professor of anthropology. And at the request of the Alaska review board, he has sent back the 50 or so samples that he collected in Alaska to be stored in a specimen bank under its care until he can satisfy their concerns.

American Indians, Dr. Schurr says, hold the answer to one of the more notable gaps in the prehistoric migration map. Although most scientists accept that the first Americans came across the Bering Strait land bridge that connected Siberia and Alaska some 20,000 years ago, there is no proof of precisely where those travelers came from, and the route they took south once they arrived.



Comparing the DNA of large numbers of American Indians might reveal whether their ancestors were from a single founding population, and when they reached the Americas. And knowing the routes and timing of migrations within the Americas would provide a foundation for studying how people came to be so different so quickly.

But almost every federally recognized tribe in North America has declined or ignored Dr. Schurr’s invitation to take part. “What the scientists are trying to prove is that we’re the same as the Pilgrims except we came over several thousand years before,” said Maurice Foxx, chairman of the Massachusetts Commission on Indian Affairs and a member of the Mashpee Wampanoag. “Why should we give them that openly?”

Some American Indians trace their suspicions to the experience of the Havasupai Tribe, whose members gave DNA for a diabetes study that University of Arizona researchers later used to link the tribe’s ancestors to Asia. To tribe members raised to believe the Grand Canyon is humanity’s birthplace, the suggestion that their own DNA says otherwise was deeply disturbing.

When Dr. Schurr was finally invited to a handful of villages in Alaska, he eagerly accepted. But by the time he reached South Naknek, a tiny native village on the Alaska Peninsula, to report his analysis of the DNA he had taken on an earlier mission, the Alaska review board had complained to his university supervisors.

The consent form all volunteers must sign, the Alaska board said, should contain greater detail about the risks, including the fact that the DNA would be stored in a database linked to tribal information.

Dr. Schurr’s latest attempt at a revised form is to be reviewed this month by the board in Alaska and by the University of Pennsylvania board supervising the project.

In the meantime, his early results have surprised some of the Alaskans who gave him their DNA. In South Naknek, Lorianne Rawson, 42, found out her DNA contradicted what she had always believed. She was not descended from the Aleuts, her test results suggested, but from their one-time enemies, the Yup’ik Eskimos.

The link to the Yup’iks, Ms. Rawson said, only made her more curious. “We want them to do more research,” she added, offering Dr. Schurr more relatives to be tested.

But she will have to wait.
Power User
Posts: 42482

« Reply #7 on: December 11, 2006, 07:11:13 AM »

Today's NY Times

Study Detects Recent Instance of Human Evolution
Published: December 10, 2006
A surprisingly recent instance of human evolution has been detected among the peoples of East Africa. It is the ability to digest milk in adulthood, conferred by genetic changes that occurred as recently as 3,000 years ago, a team of geneticists has found.

Convergent Adaptation of Human Lactase Persistence in Africa and Europe (Nature Genetics)

The finding is a striking example of a cultural practice — the raising of dairy cattle — feeding back into the human genome. It also seems to be one of the first instances of convergent human evolution to be documented at the genetic level. Convergent evolution refers to two or more populations acquiring the same trait independently.

Throughout most of human history, the ability to digest lactose, the principal sugar of milk, has been switched off after weaning because there is no further need for the lactase enzyme that breaks the sugar apart. But when cattle were first domesticated 9,000 years ago and people later started to consume their milk as well as their meat, natural selection would have favored anyone with a mutation that kept the lactase gene switched on.

Such a mutation is known to have arisen among an early cattle-raising people, the Funnel Beaker culture, which flourished some 5,000 to 6,000 years ago in north-central Europe. People with a persistently active lactase gene have no problem digesting milk and are said to be lactose tolerant.

Almost all Dutch people and 99 percent of Swedes are lactose-tolerant, but the mutation becomes progressively less common in Europeans who live at increasing distance from the ancient Funnel Beaker region.

Geneticists wondered if the lactose tolerance mutation in Europeans, first identified in 2002, had arisen among pastoral peoples elsewhere. But it seemed to be largely absent from Africa, even though pastoral peoples there generally have some degree of tolerance.

A research team led by Sarah Tishkoff of the University of Maryland has now resolved much of the puzzle. After testing for lactose tolerance and genetic makeup among 43 ethnic groups of East Africa, she and her colleagues have found three new mutations, all independent of each other and of the European mutation, which keep the lactase gene permanently switched on.

The principal mutation, found among Nilo-Saharan-speaking ethnic groups of Kenya and Tanzania, arose 2,700 to 6,800 years ago, according to genetic estimates, Dr. Tishkoff’s group is to report in the journal Nature Genetics on Monday. This fits well with archaeological evidence suggesting that pastoral peoples from the north reached northern Kenya about 4,500 years ago and southern Kenya and Tanzania 3,300 years ago.

Two other mutations were found, among the Beja people of northeastern Sudan and tribes of the same language family, Afro-Asiatic, in northern Kenya.

Genetic evidence shows that the mutations conferred an enormous selective advantage on their owners, enabling them to leave almost 10 times as many descendants as people without them. The mutations have created “one of the strongest genetic signatures of natural selection yet reported in humans,” the researchers write.

The survival advantage was so powerful perhaps because those with the mutations not only gained extra energy from lactose but also, in drought conditions, would have benefited from the water in milk. People who were lactose-intolerant could have risked losing water from diarrhea, Dr. Tishkoff said.
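The arithmetic behind such a strong signal can be illustrated with a minimal haploid selection model. The selection coefficient below is a hypothetical value chosen only for illustration; the article reports the strength of the signal, not this exact number.

```python
# Deterministic allele-frequency change under selection: carriers of the
# favored allele leave (1 + s) offspring for every 1 left by non-carriers.
def spread(p0, s, generations):
    """Frequency of a favored allele after the given number of generations."""
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
    return p

# Even a modest hypothetical advantage (s = 0.05) carries a rare allele
# (starting at 1 percent) past half the population within about a hundred
# generations, i.e., a few thousand years at roughly 25 years a generation.
frequency = spread(0.01, 0.05, 100)
```

The point of the sketch is only the speed: under strong selection, frequency change that looks glacial per generation compounds into a sweep on an archaeological timescale.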

Diane Gifford-Gonzalez, an archaeologist at the University of California, Santa Cruz, said the new findings were “very exciting” because they “showed the speed with which a genetic mutation can be favored under conditions of strong natural selection, demonstrating the possible rate of evolutionary change in humans.”

The genetic data fitted in well, she said, with archaeological and linguistic evidence about the spread of pastoralism in Africa. The first clear evidence of cattle in Africa is from a site 8,000 years old in northwestern Sudan. Cattle there were domesticated independently from two other domestications, in the Near East and the Indus valley of India.

Both Nilo-Saharan speakers in Sudan and their Cushitic-speaking neighbors in the Red Sea hills probably domesticated cattle at the same time, since each has an independent vocabulary for cattle items, said Dr. Christopher Ehret, an expert on African languages and history at the University of California, Los Angeles. Descendants of each group moved southward and would have met again in Kenya, Dr. Ehret said.

Dr. Tishkoff detected lactose tolerance among both Cushitic speakers and Nilo-Saharan groups in Kenya. Cushitic is a branch of Afro-Asiatic, the language family that includes Arabic, Hebrew and ancient Egyptian.

Dr. Jonathan Pritchard, a statistical geneticist at the University of Chicago and the co-author of the new article, said that there were many signals of natural selection in the human genome, but that it was usually hard to know what was being selected for. In this case Dr. Tishkoff had clearly defined the driving force, he said.

The mutations Dr. Tishkoff detected are not in the lactase gene itself but a nearby region of the DNA that controls the activation of the gene. The finding that different ethnic groups in East Africa have different mutations is one instance of their varied evolutionary history and their exposure to many different selective pressures, Dr. Tishkoff said.

“There is a lot of genetic variation between groups in Africa, reflecting the different environments in which they live, from deserts to tropics, and their exposure to very different selective forces,” she said.

People in different regions of the world have evolved independently since dispersing from the ancestral human population in northeast Africa 50,000 years ago, a process that has led to the emergence of different races. But much of this differentiation at the level of DNA may have led to the same physical result.

As Dr. Tishkoff has found in the case of lactose tolerance, evolution may use the different mutations available to it in each population to reach the same goal when each is subjected to the same selective pressure. “I think it’s reasonable to assume this will be a more general paradigm,” Dr. Pritchard said.

Power User
Posts: 42482

« Reply #8 on: December 26, 2006, 04:50:26 PM »

Devious Butterflies, Full-Throated Frogs and Other Liars
The green frog has been known to deceive eavesdroppers with its croak.


Published: December 26, 2006
If you happen across a pond full of croaking green frogs, listen carefully. Some of them may be lying.


A croak is how male green frogs tell other frogs how big they are. The bigger the male, the deeper the croak. The sound of a big male is enough to scare off other males from challenging him for his territory.

While most croaks are honest, some are not. Some small males lower their voices to make themselves sound bigger. Their big-bodied croaks intimidate frogs that would beat them in a fair fight.

Green frogs are only one deceptive species among many. Dishonesty has been documented in creatures ranging from birds to crustaceans to primates, including, of course, Homo sapiens. “When you think of human communication, it’s rife with deception,” said Stephen Nowicki, a biologist at Duke University and the co-author of the 2005 book “The Evolution of Animal Communication.” “You just need to read a Shakespeare play or two to see that.”

As Dr. Nowicki chronicled in his book, biologists have long puzzled over deception. Dishonesty should undermine trust between animals. Why, for example, do green frogs keep believing that a big croak means a big male? New research is offering some answers: Natural selection can favor a mix of truth and lies, particularly when an animal has a big audience. From one listener to the next, honesty may not be the best policy.

“I think it could explain a lot of mysteries in the evolution of communication in animals, including humans,” said Stephen P. Ellner, a mathematical biologist at Cornell University.

Tales of animal deception reach back at least as far as Aesop’s fables. In the late 19th century, the naturalist George Romanes made a semi-scientific study of deceptive animals. In his 1883 book, “Mental Evolution in Animals,” Romanes wrote about how one of his correspondents had sent him “several examples of the display of hypocrisy of a King Charles spaniel.”

By the mid-1900s, scientists had documented deception in cases where one species fooled another. Some nonpoisonous butterflies, for example, evolved the same wing patterns that poisonous species used to warn off birds. Within a species, however, honesty usually prevailed. Animals gave each other alarm calls to warn of predators; males signaled their prowess in fighting; babies let their parents know they were hungry. Honesty benefited both the sender and the receiver.

“The point of signaling was to get information across,” Dr. Nowicki said. “Deception was almost not an issue.”

There was just one hole in this happy arrangement: it presented a great opportunity for liars. Shrikes, for example, regularly use alarm calls to warn one another of predators. But sometimes the birds will use false alarm calls to scare other shrikes away from food.

Imagine that a shrike fools other shrikes with a false alarm. It eats more, and therefore may hatch more babies. Meanwhile, the gullible, less-nourished shrikes hatch fewer babies. If false alarms become common, natural selection should favor shrikes that are not fooled by them.

When scientists created mathematical models of this theory, they found that dishonesty could undermine many vital kinds of communication. The challenge, then, was to find out how honesty countered the advantage of deception. “The liars ought to be able to take advantage of the system, so that you’d have selection on the listeners to ignore the signals,” said Jonathan Rowell, a postdoctoral researcher at the University of Tennessee.
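
The dynamic described above can be sketched as a toy simulation. Everything in the snippet below is invented for illustration (the payoff values, the false-alarm rate, the predation risk are all assumptions, not figures from any study); it only shows the qualitative point that gullible listeners lose ground once false alarms become common.

```python
# Toy replicator dynamics for the shrike scenario: a population of
# listeners, a fraction `g` of whom are gullible (they always flee
# alarms). All payoff numbers below are hypothetical.

def next_gullible_fraction(g, false_alarm_rate,
                           food_value=1.0, predation_cost=4.0,
                           predator_rate=0.05):
    """One generation of replicator dynamics on the listener population."""
    # A gullible listener forfeits food whenever a false alarm sounds,
    # but is never caught by a real predator.
    w_gullible = food_value * (1.0 - false_alarm_rate)
    # A skeptic keeps the food, but pays the predation cost on the
    # occasions when the alarm was real and it stayed put.
    w_skeptic = food_value - predation_cost * predator_rate
    w_mean = g * w_gullible + (1.0 - g) * w_skeptic
    return g * w_gullible / w_mean

g = 0.9  # start with mostly gullible listeners
for _ in range(200):
    g = next_gullible_fraction(g, false_alarm_rate=0.5)
print(round(g, 3))  # gullibility collapses when half the alarms are false
```

With these made-up numbers, frequent false alarms drive the gullible fraction toward zero, which is exactly the selection pressure the models predicted: liars prosper only until listeners evolve to ignore them.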

Amotz Zahavi, a biologist at Tel Aviv University, proposed a way for honesty to prevail. His idea was that honesty won out only because lying carried a relatively large cost. His theory eventually led to elaborate mathematical models and experiments that confirmed it.

Roosters attract hens, for example, with their large red combs. Hens benefit from choosing mates in good condition, because their chicks will tend to be in good condition as well. The bigger and brighter a comb, the better condition the rooster is in.

Theoretically, a weak rooster could fool hens by growing a deceptively large comb. But it costs a weak rooster more than it does a strong one to build a big comb. This tradeoff leads to honest signals from weak and strong roosters alike.
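
The rooster tradeoff can be made concrete with a minimal costly-signaling sketch. The payoff numbers below are invented assumptions; the point is only that when a big comb costs a weak rooster more than a strong one, each type's self-interested choice ends up revealing its true condition.

```python
# A minimal handicap-principle sketch for the rooster example.
# All payoff values are hypothetical.

MATING_BENEFIT = 10.0          # value of attracting a hen with a big comb
COMB_COST = {"strong": 3.0,    # a healthy rooster grows a big comb cheaply
             "weak": 12.0}     # a weak rooster pays dearly for the same comb

def best_comb(condition):
    """Return the comb size that maximizes this rooster's net payoff."""
    payoff_big = MATING_BENEFIT - COMB_COST[condition]
    payoff_small = 0.0  # a small comb costs nothing but attracts no hens
    return "big" if payoff_big > payoff_small else "small"

# Honest separation: each type's best choice signals its condition.
print(best_comb("strong"), best_comb("weak"))  # → big small
```

The strong rooster nets 10 − 3 = 7 by growing a big comb; the weak rooster would net 10 − 12 = −2, so it is better off not bluffing. Deception is priced out of the market, which is the core of Zahavi's argument.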

“The mystery of why there is honesty was suddenly solved,” Dr. Ellner said. “All the big problems fell away.”

But if biologists had explained why deception should not win out, why did it continue to thrive? “We couldn’t explain all the dishonesty,” Dr. Ellner said.

Dr. H. Kern Reeve, an evolutionary biologist at Cornell, said that “deception is popping up with a surprising frequency.”

Even crustaceans can lie. Male stomatopods dig burrows, to which they try to attract females. Some males choose to try to evict other stomatopods from their burrows and take them over. These conflicts are dangerous because stomatopods can deliver crushing blows with their claw-like appendages. But the stomatopods rarely come to blows. Instead, males raise themselves up and extend their appendages, like a boxer raising his gloves. The sight of big appendages causes smaller stomatopods to back down.


Yet even the biggest, meanest stomatopod has moments of weakness. Like all crustaceans, stomatopods must molt. A freshly molted stomatopod has a soft, tender exoskeleton. Even in this vulnerable state, however, males will still raise their claws in a bold crustacean bluff.

Dr. Rowell recently created a more complicated model of animal signals that may explain why deception is so common. Previous models examined only a single animal sending a signal to a single receiver. But real signals are rarely so private. “They’re not happening in a one-on-one situation,” Dr. Rowell said. “They’re really happening in public.”

A signaler may have different relationships with different listeners. In some cases, honest signals are best. But eavesdroppers may be able to use honest signals for their own advantage.

To capture this extra layer of complexity, Dr. Rowell built a mathematical model with two receivers instead of one. The signaling animal could choose to be honest or dishonest. The receivers could respond to the signal as an honest one or a dishonest one.

Working with Dr. Ellner and Dr. Reeve, Dr. Rowell discovered that honesty and deception could reach a stable coexistence in the model. The signalers could sometimes be dishonest, and yet the receivers continued to believe the signals despite the deception.

Dr. Rowell and his colleagues published the details of their model in the December issue of The American Naturalist.
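
The logic behind such bluffing equilibria can be sketched in miniature. The snippet below is a hypothetical, stripped-down slice of this kind of signaling game (the published model is richer, with two receivers; all payoff numbers here are invented): it shows why a rival should keep heeding a “big” croak as long as bluffs stay rare enough, which is what lets a stable level of deception persist.

```python
# A fraction `bluff_rate` of "big" croaks come from small bluffers.
# A rival that attacks wins easily against a bluffer but is badly
# beaten by a genuinely big male. All payoffs are hypothetical.

WIN_VS_BLUFFER = 2.0    # payoff for attacking a small male that bluffed
LOSS_VS_BIG = -10.0     # payoff for attacking a genuinely big male
BACK_DOWN = 0.0         # payoff for heeding the croak and retreating

def rival_best_response(bluff_rate):
    """Attack only if the expected payoff beats backing down."""
    expected_attack = (bluff_rate * WIN_VS_BLUFFER
                       + (1.0 - bluff_rate) * LOSS_VS_BIG)
    return "attack" if expected_attack > BACK_DOWN else "back down"

# With these numbers, rivals keep backing down until more than
# 5/6 of big croaks are bluffs (2b - 10(1 - b) > 0 gives b > 5/6).
for rate in (0.1, 0.5, 0.9):
    print(rate, rival_best_response(rate))
```

Because attacking a genuine big male is so costly, even a substantial minority of liars cannot make it worth the rival's while to call the bluff, so listeners continue to believe the signal despite the deception.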

“It’s really important,” Dr. Nowicki said of the study. “They’re coming up with new angles that could explain how you could have more deception and keep it stable.”

Dr. Rowell argues that real-world cases of deception, like bluffing, support the model. When a male green frog or stomatopod bluffs, other males have to decide whether to heed the signal or to ignore it and attack. Attacking is risky, because it is possible that the signaler is not bluffing.

“The challenger isn’t willing to take that gamble,” Dr. Rowell said.

The model also showed how deception could be used against eavesdroppers. Green frogs — along with many other frogs and toads — attract females with a distinctive mating call. Dr. Ellner’s rough translation of their call: “I’m looking for female frogs, and if you come on my lily pad, I’ll show you a good time.”

In most cases, male frogs follow up on their mating calls by courting the females they attract. But sometimes they attack instead. This deceptive reaction may be a way for the males to cope with other males that eavesdrop on them. Such eavesdroppers, instead of holding onto their own territory, sneak around and try to intercept females attracted to the mating calls of other males.

If males are always honest in their mating calls, they may lose out to sneaky males. But if they attack, they can ambush the sneaky males and drive them away. Natural selection thus favors deception, despite the fact that the frogs sometimes attack potential mates. The females, meanwhile, are better off trusting the mating calls than ignoring them.

Dr. Reeve cautioned that the model was only the first step in understanding how networks of listeners can drive the evolution of deception. “Right now it needs to be tested in detail, experimentally,” he said.

Different species may be prone to different levels of deception. Solitary animals may evolve to be more honest than animals that spend long lives in big societies. If that is true, then humans may be exquisitely primed to deceive.

“We’re in a network of individuals watching us,” Dr. Reeve said. “If you provide a signal to one individual, it’s being eavesdropped on by lots of other people.”

Dr. Rowell is exploring cases of human deception with his model. In one case, he examines how terrorist organizations communicate to their sleeper cells.

“Your two listeners are the government and terrorist sleeper cells,” Dr. Rowell explained. “The sleeper cells don’t have a direct communication with whoever your terrorist signaler is.

“They might give something out over the Web, and the government picks it up. You find that you can very easily get a level of dishonesty from the terrorist signaler to get the government to waste resources on phantom attacks. You can see this evolution going on between sleeper cells and the government.”

« Reply #9 on: February 09, 2007, 08:31:11 AM »

Today's NY Times:

Our ancestors have arrived at the American Museum of Natural History. They are very old, and we are only beginning to recognize them and ourselves in them. They remind us of our origins long ago and how we have emerged as modern humans in the fullness of time.

The museum’s new permanent exhibition on human origins, which opens tomorrow, merges notable achievements in paleontology and genetics, sciences that have made their own robust evolutionary strides in recent years. Each introduces evidence supporting the other in establishing a genealogy extending back to protohuman species that arose in Africa from earlier primates some six to seven million years ago.

These two scientific threads run through the exhibition like the strands of the DNA double helix.

Ellen V. Futter, the museum’s president, said the “mutually reinforcing evidence” was organized in the exhibition to address three fundamental questions: Where did we come from? Who are we? And what lies ahead for us?

Turn right at the entrance of the new installation, the Anne and Bernard Spitzer Hall of Human Origins, and you see paleontology’s side of the story. More than 200 casts of prehuman and human fossils and artifacts illustrate stages in physical and behavioral evolution. Four life-size tableaus depict scenes in the lives of human predecessors, the realism stamped by the presence of pesky flies on their shoulders.

Some of the most striking displays are reconstructions from fossil and other evidence of what these ancestors probably looked like. Museum scientists and technicians have recreated the faces and bodies of the famous Lucy skeleton and Neanderthals — even the controversial Hobbits, the tiny specimens of what may be a previously unknown extinct species found recently in Indonesia.

The reconstruction of Turkana Boy is especially evocative. Based on one of the most complete ancestral skeletons ever excavated, the fleshed-out Homo ergaster, a species that lived in Africa 1.9 to 1.4 million years ago, is almost six feet tall, with a body form remarkably like that of modern humans.

“The fossils on which the reconstructions are based are witnesses to a dynamic history,” said Ian Tattersall, a paleoanthropologist at the museum and co-curator of the exhibition. “Now we have a much larger story to tell, with the addition of what we are learning from molecular biology.”

Bear left in the hall, and there is the sign “DNA Tells Us About Human Origins.” Below are three tubes containing particles of DNA in a milky white solution. The samples are not particularly impressive, until you think that this is the stuff of encoded information shaping an entire organism and the material that has transformed the study of genetics, or genomics, and revealed the place of humans in the rest of life.

One of the vials holds human DNA, and another a chimpanzee’s. The analysis of their genetic material has confirmed what comparative anatomy predicted, showing that human DNA is 98.8 percent identical to that of chimps and bonobos, our closest living relatives. And our DNA is, on average, 96 percent identical to our most distant primate kin, some of which are mounted on the wall.
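
In principle, a percent-identity figure like the 98.8 percent cited above comes from counting matching positions in aligned sequences. The snippet below illustrates the arithmetic with made-up toy strings, not real genomic data (real genome comparisons also involve alignment, gaps, and choices about what to count).

```python
# Percent identity between two pre-aligned sequences, illustrated
# with invented toy strings.

def percent_identity(seq_a, seq_b):
    """Fraction of aligned positions that match, as a percentage."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * matches / len(seq_a)

toy_a = "ACGTACGTAC"
toy_b = "ACGTACGTAA"  # differs at one of ten positions
print(percent_identity(toy_a, toy_b))  # → 90.0
```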

The third vial contains a DNA sample from a 40,000-year-old Neanderthal, the extinct close cousin of Homo sapiens. The discovery of a Neanderthal skull in 1856 led to the recognition that different kinds of humans once lived on Earth. This rare DNA specimen, on display in this country for the first time, was donated by the Max Planck Institute in Leipzig, Germany, the first laboratory to succeed in extracting the genetic material from Neanderthal bones.

Standing nearby are the skeletons of a chimpanzee, a Neanderthal and a modern human, and stations with interactive electronic displays are ready, at the touch of a screen, to explain the differences and similarities between the bones, brains and DNA of the three species.

Other computer animations offer insights into how scientists decode the hereditary information, how it is transmitted through generations, and how mutations of mitochondrial DNA, the traits inherited through the mother’s lineage, reveal relationships through time and migrations. A video of a “tree of life” changes before your eyes, like a kaleidoscope, showing the branching interrelationships among 479 species.


Rob DeSalle, the exhibition’s other curator and a molecular biologist at the museum, said genomics is leading to the discovery of “the history between other species and humans and the relationships of humans to each other.”

The genetics side of the exhibition is not as visually compelling as the fossils and reconstructed life in other sections. Plan to invest more time with the interactive displays and videos, which convey the truly new contributions to understanding the science of human evolution and the complexity and connectivity of life.

The Hall of Human Origins occupies the galleries of its predecessor, the Hall of Human Biology and Evolution, which had its opening 12 years ago, before many of the advances in genomics and a number of major fossil discoveries. That exhibition closed in September 2005 to make way for its more up-to-date replacement, supported by a gift from the Spitzers, the parents of Gov. Eliot Spitzer of New York.

Some of the cast of fossil characters may be familiar to regular museum visitors, but they have been revitalized in new settings. For example, the Australopithecus couple that left tracks walking 3.5 million years ago across a plain at Laetoli, Tanzania, appear here. The surprise is that they are so small, no more than three feet tall. Yet the discovery of their footprints was the first clear evidence that prehumans were walking upright well before they made tools.

In the habitat displays, two Homo ergasters butcher a carcass and fight off a vulture and a jackal trying to steal the meat, and a Homo erectus, Peking Man, crouches and is about to be pounced on by a hyena. The curators said these were reminders that early human ancestors were prey rather than predator for much of their history.

Toward the back of the gallery, the cultural aspects of evolution are illustrated. An exact reproduction of the painted animals from the cave art at Lascaux in France stretches across the wall. Other displays include a replica of a 75,000-year-old piece of ochre decorated with geometric patterns, a recent discovery in South Africa and one of the earliest examples of symbolic thinking and creativity in modern humans. In this context the exhibition reviews the elements that make humans different from other life: tool use, language, music and writing, as well as art and other forms of creative expression.

Off in a side room, the Spitzer Hall has an educational laboratory with microscopes and laptops ready for visitors, guided by instructors, to try their hands at examining fossils and learning how to decode DNA. The lab is designed with young people and student groups in mind, but anyone is free to experience something of what it is like to delve into the human past. Elsewhere a multimedia bulletin board offers news of the latest developments in research into the human past.

One issue cannot be entirely sidestepped in any public presentation of human evolution: that many people in this country doubt and vocally oppose the very concept. In a corner of the hall, several scientists are shown in video interviews professing the compatibility of their evolution research with their religious beliefs.

Standing nearby at the end of a tour of the exhibition, Michael J. Novacek, a paleontologist and the museum’s senior vice president, said that a previous show on Darwin had been a reassuring test case. The exhibition was popular, he said, and provoked “very little negative response.”

Dr. Novacek said the new hall was “an emphatic statement about the theory of evolution and its power to tell us our origins and history.”

“We emphasize that a scientific theory is an argument that is very carefully tested against scientific evidence,” he continued, “and this one has withstood much scrutiny.”

The modern human capacity for symbolic and creative expression has brought forth different narratives to explain where we came from, drawn from myth, religion and pre-Darwin science. The exhibition’s parallel lines of fossil and molecular evidence have the cumulative effect of solidifying the foundation for the more recent scientific narrative of human evolution.

There are still many gaps in knowledge, and unsolved mysteries. But seeing ourselves in the train of preceding species, we also recognize the degree of our separation from other animals, even our earliest ancestors. Only modern Homo sapiens in our time could present with such newfound authority the epic narrated through the museum’s Hall of Human Origins.

The Anne and Bernard Spitzer Hall of Human Origins will open tomorrow at the American Museum of Natural History, Central Park West and 79th Street. Museum hours: daily, 10 a.m. to 5:45 p.m. (to 8:45 p.m. on Fridays). Suggested museum admission: $14; $10.50 for students and 60+; $8 for children 2 to 12; free for members. (212) 769-5100 or (212) 769-5200;


« Reply #10 on: March 05, 2007, 10:17:45 AM »

UCLA Study on Friendship among Women

By Gale Berkowitz

A landmark UCLA study suggests friendships between women are special. See the following article: Taylor, S. E., Klein, L.C., Lewis, B. P., Gruenewald, T. L., Gurung, R.A.R., & Updegraff, J. A. (2000). "Female Responses to Stress: Tend and Befriend, Not Fight or Flight", Psychological Review, 107(3), 411-429.

They [friendships between women] shape who we are and who we are yet to be. They soothe our tumultuous inner world, fill the emotional gaps in our marriage, and help us remember who we really are. By the way, they may do even more.

Scientists now suspect that hanging out with our friends can actually counteract the kind of stomach-quivering stress most of us experience on a daily basis. A landmark UCLA study suggests that women respond to stress with a cascade of brain chemicals that cause us to make and maintain friendships with other women.

It's a stunning find that has turned five decades of stress research - most of it on men - upside down. "Until this study was published, scientists generally believed that when people experience stress, they trigger a hormonal cascade that revs the body to either stand and fight or flee as fast as possible," explains Laura Cousino Klein, Ph.D., now an Assistant Professor of Biobehavioral Health at Penn State University and one of the study's authors. "It's an ancient survival mechanism left over from the time we were chased across the planet by sabre-toothed tigers."

Now the researchers suspect that women have a larger behavioural repertoire than just "fight or flight." "In fact," says Dr. Klein, "it seems that when the hormone oxytocin is released as part of the stress response in a woman, it buffers the 'fight or flight' response and encourages her to tend children and gather with other women instead. When she actually engages in this tending or befriending, studies suggest our bodies release more oxytocin, which further counters stress and produces a calming effect. This calming response does not occur in men," says Dr. Klein, "because testosterone - which men produce in high levels when they're under stress - seems to reduce the effects of oxytocin. Estrogen," she adds, "seems to enhance it."

The discovery that women respond to stress differently than men was made in a classic "aha!" moment shared by two women scientists who were talking one day in a lab at UCLA. "There was this joke that when the women who worked in the lab were stressed, they came in, cleaned the lab, had coffee, and bonded,” says Dr. Klein. "When the men were stressed, they holed up somewhere on their own. I commented one day to fellow researcher Shelley Taylor that nearly 90% of the stress research is on males. I showed her the data from my lab, and the two of us knew instantly that we were onto something."

The women cleared their schedules and started meeting with one scientist after another from various research specialties. Very quickly, Drs. Klein and Taylor discovered that by not including women in stress research, scientists had made a huge mistake: The fact that women respond to stress differently than men has significant implications for our health.

It may take some time for new studies to reveal all the ways that oxytocin encourages us to care for children and hang out with other women, but the "tend and befriend" notion developed by Drs. Klein and Taylor may explain why women consistently outlive men. Study after study has found that social ties reduce our risk of disease by lowering blood pressure, heart rate, and cholesterol. "There's no doubt," says Dr. Klein, "that friends are helping us live." In one study, for example, researchers found that people who had no friends increased their risk of death over a 6-month period. In another study, those who had the most friends over a 9-year period cut their risk of death by more than 60%. Friends are also helping us live better. The famed Nurses' Health Study from Harvard Medical School found that the more friends women had, the less likely they were to develop physical impairments as they aged, and the more likely they were to be leading a joyful life.

In fact, the results were so significant, the researchers concluded, that not having close friends or confidantes was as detrimental to your health as smoking or carrying extra weight. And that is not all: when the researchers looked at how well the women functioned after the death of their spouse, they found that even in the face of this biggest stressor of all, those women who had a close friend and confidante were more likely to survive the experience without any new physical impairments or permanent loss of vitality. Those without friends were not always so fortunate.

Yet if friends counter the stress that seems to swallow up so much of our life these days, if they keep us healthy and even add years to our life, why is it so hard to find time to be with them? That is a question that also troubles researcher Ruthellen Josselson, Ph.D., co-author of "Best Friends: The Pleasures and Perils of Girls' and Women's Friendships" (Three Rivers Press, 1998).

"Every time we get overly busy with work and family, the first thing we do is let go of friendships with other women," explains Dr. Josselson. "We push them right to the back burner. That is really a mistake, because women are such a source of strength to each other. We nurture one another. In addition, we need to have unpressured space in which we can do the special kind of talk that women do when they are with other women. It's a very healing experience."

« Reply #11 on: March 06, 2007, 04:01:26 AM »

A friend brought this website to my attention and I have just begun surfing it a bit; it has some distinctive takes on various matters.  Check it out.

« Reply #12 on: March 08, 2007, 11:57:34 AM »

I just realized that this subject also has a thread on the "Politics & Religion" forum, which I will bring over here when I get my wife to remind me how to do that.  embarassed

Please continue to post here, but know that there are many interesting posts there too.


« Reply #13 on: March 13, 2007, 08:46:12 AM »

Today's NY Times:

So there are these two muffins baking in an oven. One of them yells, “Wow, it’s hot in here!”
And the other muffin replies: “Holy cow! A talking muffin!”

Did that alleged joke make you laugh? I would guess (and hope) not. But under different circumstances, you would be chuckling softly, maybe giggling, possibly guffawing. I know that’s hard to believe, but trust me. The results are just in on a laboratory test of the muffin joke.

Laughter, a topic that stymied philosophers for 2,000 years, is finally yielding to science. Researchers have scanned brains and tickled babies, chimpanzees and rats. They’ve traced the evolution of laughter back to what looks like the primal joke — or, to be precise, the first stand-up routine to kill with an audience of primates.

It wasn’t any funnier than the muffin joke, but that’s not surprising, at least not to the researchers. They’ve discovered something that eluded Plato, Aristotle, Hobbes, Kant, Schopenhauer, Freud and the many theorists who have tried to explain laughter based on the mistaken premise that they’re explaining humor.

Occasionally we’re surprised into laughing at something funny, but most laughter has little to do with humor. It’s an instinctual survival tool for social animals, not an intellectual response to wit. It’s not about getting the joke. It’s about getting along.

When Robert R. Provine tried applying his training in neuroscience to laughter 20 years ago, he naïvely began by dragging people into his laboratory at the University of Maryland, Baltimore County, to watch episodes of “Saturday Night Live” and a George Carlin routine. They didn’t laugh much. It was what a stand-up comic would call a bad room.

So he went out into natural habitats — city sidewalks, suburban malls — and carefully observed thousands of “laugh episodes.” He found that 80 percent to 90 percent of them came after straight lines like “I know” or “I’ll see you guys later.” The witticisms that induced laughter rarely rose above the level of “You smell like you had a good workout.”

“Most prelaugh dialogue,” Professor Provine concluded in “Laughter,” his 2000 book, “is like that of an interminable television situation comedy scripted by an extremely ungifted writer.”

He found that most speakers, particularly women, did more laughing than their listeners, using the laughs as punctuation for their sentences. It’s a largely involuntary process. People can consciously suppress laughs, but few can make themselves laugh convincingly.

“Laughter is an honest social signal because it’s hard to fake,” Professor Provine says. “We’re dealing with something powerful, ancient and crude. It’s a kind of behavioral fossil showing the roots that all human beings, maybe all mammals, have in common.”

The human ha-ha evolved from the rhythmic sound — pant-pant — made by primates like chimpanzees when they tickle and chase one another while playing. Jaak Panksepp, a neuroscientist and psychologist at Washington State University, discovered that rats emit an ultrasonic chirp (inaudible to humans without special equipment) when they’re tickled, and they like the sensation so much they keep coming back for more tickling.

He and Professor Provine figure that the first primate joke — that is, the first action to produce a laugh without physical contact — was the feigned tickle, the same kind of coo-chi-coo move parents make when they thrust their wiggling fingers at a baby. Professor Panksepp thinks the brain has ancient wiring to produce laughter so that young animals learn to play with one another. The laughter stimulates euphoria circuits in the brain and also reassures the other animals that they’re playing, not fighting.

“Primal laughter evolved as a signaling device to highlight readiness for friendly interaction,” Professor Panksepp says. “Sophisticated social animals such as mammals need an emotionally positive mechanism to help create social brains and to weave organisms effectively into the social fabric.”

Humans are laughing by the age of four months and then progress from tickling to the Three Stooges to more sophisticated triggers for laughter (or, in some inexplicable cases, to Jim Carrey movies). Laughter can be used cruelly to reinforce a group’s solidarity and pride by mocking deviants and insulting outsiders, but mainly it’s a subtle social lubricant. It’s a way to make friends and also make clear who belongs where in the status hierarchy.

Which brings us back to the muffin joke. It was inflicted by social psychologists at Florida State University on undergraduate women last year, during interviews for what was ostensibly a study of their spending habits. Some of the women were told the interviewer would be awarding a substantial cash prize to a few of the participants, like a boss deciding which underling deserved a bonus.

The women put in the underling position were a lot more likely to laugh at the muffin joke (and others almost as lame) than were women in the control group. But it wasn’t just because these underlings were trying to manipulate the boss, as was demonstrated in a follow-up experiment.

This time each of the women watched the muffin joke being told on videotape by a person who was ostensibly going to be working with her on a task. There was supposed to be a cash reward afterward to be allocated by a designated boss. In some cases the woman watching was designated the boss; in other cases she was the underling or a co-worker of the person on the videotape.

When the woman watching was the boss, she didn’t laugh much at the muffin joke. But when she was the underling or a co-worker, she laughed much more, even though the joke-teller wasn’t in the room to see her. When you’re low in the status hierarchy, you need all the allies you can find, so apparently you’re primed to chuckle at anything even if it doesn’t do you any immediate good.

“Laughter seems to be an automatic response to your situation rather than a conscious strategy,” says Tyler F. Stillman, who did the experiments along with Roy Baumeister and Nathan DeWall. “When I tell the muffin joke to my undergraduate classes, they laugh out loud.”

Mr. Stillman says he got so used to the laughs that he wasn’t quite prepared for the response at a conference in January, although he realizes he should have expected it.

“It was a small conference attended by some of the most senior researchers in the field,” he recalls. “When they heard me, a lowly graduate student, tell the muffin joke, there was a really uncomfortable silence. You could hear crickets.”


« Reply #14 on: March 20, 2007, 08:14:20 AM »

Scientist Finds the Beginnings of Morality in Primate Behavior
Illustration by Edel Rodriguez based on source material from Frans de Waal
Social Order: Chimpanzees have a sense of social structure and rules of behavior, most of which involve the hierarchy of a group, in which some animals rank higher than others. Social living demands a number of qualities that may be precursors of morality.

Published: March 20, 2007

Some animals are surprisingly sensitive to the plight of others. Chimpanzees, who cannot swim, have drowned in zoo moats trying to save others. Given the chance to get food by pulling a chain that would also deliver an electric shock to a companion, rhesus monkeys will starve themselves for several days.

The Beginnings of Morality? Biologists argue that these and other social behaviors are the precursors of human morality. They further believe that if morality grew out of behavioral rules shaped by evolution, it is for biologists, not philosophers or theologians, to say what these rules are.

Moral philosophers do not take very seriously the biologists’ bid to annex their subject, but they find much of interest in what the biologists say and have started an academic conversation with them.

The original call to battle was sounded by the biologist Edward O. Wilson more than 30 years ago, when he suggested in his 1975 book “Sociobiology” that “the time has come for ethics to be removed temporarily from the hands of the philosophers and biologicized.” He may have jumped the gun about the time having come, but in the intervening decades biologists have made considerable progress.

Last year Marc Hauser, an evolutionary biologist at Harvard, proposed in his book “Moral Minds” that the brain has a genetically shaped mechanism for acquiring moral rules, a universal moral grammar similar to the neural machinery for learning language. In another recent book, “Primates and Philosophers,” the primatologist Frans de Waal defends against philosopher critics his view that the roots of morality can be seen in the social behavior of monkeys and apes.

Dr. de Waal, who is director of the Living Links Center at Emory University, argues that all social animals have had to constrain or alter their behavior in various ways for group living to be worthwhile. These constraints, evident in monkeys and even more so in chimpanzees, are part of human inheritance, too, and in his view form the set of behaviors from which human morality has been shaped.

Many philosophers find it hard to think of animals as moral beings, and indeed Dr. de Waal does not contend that even chimpanzees possess morality. But he argues that human morality would be impossible without certain emotional building blocks that are clearly at work in chimp and monkey societies.

Dr. de Waal’s views are based on years of observing nonhuman primates, starting with work on aggression in the 1960s. He noticed then that after fights between two combatants, other chimpanzees would console the loser. But he was waylaid in battles with psychologists over imputing emotional states to animals, and it took him 20 years to come back to the subject.

He found that consolation was universal among the great apes but generally absent from monkeys — among macaques, mothers will not even reassure an injured infant. To console another, Dr. de Waal argues, requires empathy and a level of self-awareness that only apes and humans seem to possess. And consideration of empathy quickly led him to explore the conditions for morality.

Though human morality may end in notions of rights and justice and fine ethical distinctions, it begins, Dr. de Waal says, in concern for others and the understanding of social rules as to how they should be treated. At this lower level, primatologists have shown, there is what they consider to be a sizable overlap between the behavior of people and other social primates.

Social living requires empathy, which is especially evident in chimpanzees, as well as ways of bringing internal hostilities to an end. Every species of ape and monkey has its own protocol for reconciliation after fights, Dr. de Waal has found. If two males fail to make up, female chimpanzees will often bring the rivals together, as if sensing that discord makes their community worse off and more vulnerable to attack by neighbors. Or they will head off a fight by taking stones out of the males’ hands.

Dr. de Waal believes that these actions are undertaken for the greater good of the community, as distinct from person-to-person relationships, and are a significant precursor of morality in human societies.


Macaques and chimpanzees have a sense of social order and rules of expected behavior, mostly to do with the hierarchical natures of their societies, in which each member knows its own place. Young rhesus monkeys learn quickly how to behave, and occasionally get a finger or toe bitten off as punishment. Other primates also have a sense of reciprocity and fairness. They remember who did them favors and who did them wrong. Chimps are more likely to share food with those who have groomed them. Capuchin monkeys show their displeasure if given a smaller reward than a partner receives for performing the same task, like a piece of cucumber instead of a grape.

These four kinds of behavior — empathy, the ability to learn and follow social rules, reciprocity and peacemaking — are the basis of sociality.

Dr. de Waal sees human morality as having grown out of primate sociality, but with two extra levels of sophistication. People enforce their society’s moral codes much more rigorously with rewards, punishments and reputation building. They also apply a degree of judgment and reason, for which there are no parallels in animals.

Religion can be seen as another special ingredient of human societies, though one that emerged thousands of years after morality, in Dr. de Waal’s view. There are clear precursors of morality in nonhuman primates, but no precursors of religion. So it seems reasonable to assume that as humans evolved away from chimps, morality emerged first, followed by religion. “I look at religions as recent additions,” he said. “Their function may have to do with social life, and enforcement of rules and giving a narrative to them, which is what religions really do.”

As Dr. de Waal sees it, human morality may be severely limited by having evolved as a way of banding together against adversaries, with moral restraints being observed only toward the in group, not toward outsiders. “The profound irony is that our noblest achievement — morality — has evolutionary ties to our basest behavior — warfare,” he writes. “The sense of community required by the former was provided by the latter.”

Dr. de Waal has faced down many critics in evolutionary biology and psychology in developing his views. The evolutionary biologist George Williams dismissed morality as merely an accidental byproduct of evolution, and psychologists objected to attributing any emotional state to animals. Dr. de Waal convinced his colleagues over many years that the ban on inferring emotional states was an unreasonable restriction, given the expected evolutionary continuity between humans and other primates.

His latest audience is moral philosophers, many of whom are interested in his work and that of other biologists. “In departments of philosophy, an increasing number of people are influenced by what they have to say,” said Gilbert Harman, a Princeton University philosopher.

Dr. Philip Kitcher, a philosopher at Columbia University, likes Dr. de Waal’s empirical approach. “I have no doubt there are patterns of behavior we share with our primate relatives that are relevant to our ethical decisions,” he said. “Philosophers have always been beguiled by the dream of a system of ethics which is complete and finished, like mathematics. I don’t think it’s like that at all.”

But human ethics are considerably more complicated than the sympathy Dr. de Waal has described in chimps. “Sympathy is the raw material out of which a more complicated set of ethics may get fashioned,” he said. “In the actual world, we are confronted with different people who might be targets of our sympathy. And the business of ethics is deciding who to help and why and when.”

Many philosophers believe that conscious reasoning plays a large part in governing human ethical behavior and are therefore unwilling to let everything proceed from emotions, like sympathy, which may be evident in chimpanzees. The impartial element of morality comes from a capacity to reason, writes Peter Singer, a moral philosopher at Princeton, in “Primates and Philosophers.” He says, “Reason is like an escalator — once we step on it, we cannot get off until we have gone where it takes us.”

That was the view of Immanuel Kant, Dr. Singer noted, who believed morality must be based on reason, whereas the Scottish philosopher David Hume, followed by Dr. de Waal, argued that moral judgments proceed from the emotions.

But biologists like Dr. de Waal believe reason is generally brought to bear only after a moral decision has been reached. They argue that morality evolved at a time when people lived in small foraging societies and often had to make instant life-or-death decisions, with no time for conscious evaluation of moral choices. The reasoning came afterward as a post hoc justification. “Human behavior derives above all from fast, automated, emotional judgments, and only secondarily from slower conscious processes,” Dr. de Waal writes.



However much we may celebrate rationality, emotions are our compass, probably because they have been shaped by evolution, in Dr. de Waal’s view. For example, he says: “People object to moral solutions that involve hands-on harm to one another. This may be because hands-on violence has been subject to natural selection whereas utilitarian deliberations have not.”

Philosophers have another reason biologists cannot, in their view, reach the heart of morality, and that is that biological analyses cannot cross the gap between “is” and “ought,” between the description of some behavior and the issue of why it is right or wrong. “You can identify some value we hold, and tell an evolutionary story about why we hold it, but there is always that radically different question of whether we ought to hold it,” said Sharon Street, a moral philosopher at New York University. “That’s not to discount the importance of what biologists are doing, but it does show why centuries of moral philosophy are incredibly relevant, too.”

Biologists are allowed an even smaller piece of the action by Jesse Prinz, a philosopher at the University of North Carolina. He believes morality developed after human evolution was finished and that moral sentiments are shaped by culture, not genetics. “It would be a fallacy to assume a single true morality could be identified by what we do instinctively, rather than by what we ought to do,” he said. “One of the principles that might guide a single true morality might be recognition of equal dignity for all human beings, and that seems to be unprecedented in the animal world.”

Dr. de Waal does not accept the philosophers’ view that biologists cannot step from “is” to “ought.” “I’m not sure how realistic the distinction is,” he said. “Animals do have ‘oughts.’ If a juvenile is in a fight, the mother must get up and defend her. Or in food sharing, animals do put pressure on each other, which is the first kind of ‘ought’ situation.”

Dr. de Waal’s definition of morality is more down to earth than Dr. Prinz’s. Morality, he writes, is “a sense of right and wrong that is born out of groupwide systems of conflict management based on shared values.” The building blocks of morality are not nice or good behaviors but rather mental and social capacities for constructing societies “in which shared values constrain individual behavior through a system of approval and disapproval.” By this definition, chimpanzees in his view do possess some of the behavioral capacities built into our moral systems.

“Morality is as firmly grounded in neurobiology as anything else we do or are,” Dr. de Waal wrote in his 1996 book “Good Natured.” Biologists ignored this possibility for many years, believing that because natural selection was cruel and pitiless it could only produce people with the same qualities. But this is a fallacy, in Dr. de Waal’s view. Natural selection favors organisms that survive and reproduce, by whatever means. And it has provided people, he writes in “Primates and Philosophers,” with “a compass for life’s choices that takes the interests of the entire community into account, which is the essence of human morality.”

Power User
Posts: 42482

« Reply #15 on: April 20, 2007, 10:42:25 AM »

This URL about the psychology of risk-taking was just posted in the Parkour thread on the Martial Arts forum, but it seems to me like it belongs here as well.
Power User
Posts: 107

« Reply #16 on: April 21, 2007, 08:24:22 PM »

Along the lines of evolutionary biology/psychology, but from a purely philosophical vantage point, has anyone read Robert Pirsig's follow-up to 'Zen and the Art of Motorcycle Maintenance,' entitled 'Lila: An Inquiry into Morals'?  It deals with how the universe developed morality, and what morality actually is.

In short, Pirsig developed a metaphysics of morality, whereby the universe has an innate drive to move from a state of lower quality to a state of higher quality.

Pirsig developed a hierarchy of quality:

inorganic quality
organic quality
social quality
intellectual quality
dynamic quality

inorganic to organic (why the universe develops life from lifelessness)
organic to social (organisms evolve from singular entities into group entities)
social to intellectual (organisms develop reason and rationality)

In short, Pirsig stated that each level of quality is in conflict with the lower level and the higher level.

For example, organic quality comprises all the adaptations the individual organism has acquired in order to survive at that level.  The organic level of quality is always in conflict with inorganic quality (death, a return to static inorganic quality) on the one end, and with social quality (the group) on the other.

These qualities conflict because each is adaptive when viewed from below and maladaptive when viewed from above: extreme aggression, for example, is seen as a detriment by social quality, but is adaptive insofar as it keeps the organism alive.

Pirsig, I think, was arguing from a wide perspective that each of these phenomena, which appear disconnected when viewed through the microscope of science, fits together when viewed from a big-picture perspective.

Taking Pirsig's metaphysics of moral quality as a template, a whole host of moral questions can be answered.  Stealing, for example, allows an individual biological entity to gain a resource advantage.  From a purely biological perspective there is nothing immoral about stealing.

From the social level of quality, which is a higher level than the biological, stealing throws the social order into chaos, as a society cannot exist without boundaries on individual behavior.  Once individuals see that a society is unable to restrain biological quality and maintain social order, the individual biological entities see no benefit in continuing to abide by that order, which results in a deterioration of the social order.

Each evolutionary ratchet step of morality is more moral than the one below it.  Biological order is a lower level of quality than social order; social order is a lower level of quality than intellectual order.

Pirsig used the example of Fascism versus Communism.  Fascism represented social order, the subservience of the individual to the state.  Communism, however, represented an idea, an intellectual idea.  As such, Communism was a higher level of quality than Fascism, and hence more moral.  Likewise, Western liberal democracy, being an intellectual idea manifested as government, is a higher level of quality than Fascism.
« Last Edit: April 21, 2007, 08:30:11 PM by sgtmac_46 »
Power User
Posts: 42482

« Reply #17 on: June 27, 2007, 08:26:08 AM »

From a Few Genes, Life's Myriad Shapes
Published: June 26, 2007
NY Times

Since its humble beginnings as a single cell, life has evolved into a
spectacular array of shapes and sizes, from tiny fleas to towering
Tyrannosaurus rex, from slow-soaring vultures to fast-swimming swordfish,
and from modest ferns to alluring orchids. But just how such diversity of
form could arise out of evolution's mess of random genetic mutations - how a
functional wing could sprout where none had grown before, or how flowers
could blossom in what had been a flowerless world - has remained one of the
most fascinating and intractable questions in evolutionary biology.

Now finally, after more than a century of puzzling, scientists are finding
answers coming fast and furious and from a surprising quarter, the field
known as evo-devo. Just coming into its own as a science, evo-devo is the
combined study of evolution and development, the process by which a nubbin
of a fertilized egg transforms into a full-fledged adult. And what these
scientists are finding is that development, a process that has for more than
half a century been largely ignored in the study of evolution, appears to
have been one of the major forces shaping the history of life on earth.

For starters, evo-devo researchers are finding that the evolution of complex
new forms, rather than requiring many new mutations or many new genes as had
long been thought, can instead be accomplished by a much simpler process
requiring no more than tweaks to already existing genes and developmental
plans. Stranger still, researchers are finding that the genes that can be
tweaked to create new shapes and body parts are surprisingly few. The same
DNA sequences are turning out to be the spark inciting one evolutionary
flowering after another. "Do these discoveries blow people's minds? Yes,"
said Dr. Sean B. Carroll, biologist at the Howard Hughes Medical Institute
at the University of Wisconsin, Madison. "The first response is 'Huh?' and
the second response is 'Far out.' "

"This is the illumination of the utterly dark," Dr. Carroll added.

The development of an organism - how one end gets designated as the head or
the tail, how feet are enticed to grow at the end of a leg rather than at
the wrist - is controlled by a hierarchy of genes, with master genes at the
top controlling a next tier of genes, controlling a next and so on. But the
real interest for evolutionary biologists is that these hierarchies not only
favor the evolution of certain forms but also disallow the growth of others,
determining what can and cannot arise not only in the course of the growth
of an embryo, but also over the history of life itself.

"It's been said that classical evolutionary theory looks at survival of the
fittest," said Dr. Scott F. Gilbert, a developmental biologist at Swarthmore
College. By looking at what sorts of organisms are most likely or impossible
to develop, he explained, "evo-devo looks at the arrival of the fittest."

Charles Darwin saw it first. He pointed out well over a century ago that
developing forms of life would be central to the study of evolution. Little
came of it initially, for a variety of reasons. Not least of these was the
discovery that perturbing the process of development often resulted in a
freak show starring horrors like bipedal goats and insects with legs growing
out of their mouths, monstrosities that seemed to shed little light on the
wonders of evolution.

But the advent of molecular biology reinvigorated the study of development
in the 1980s, and evo-devo quickly got scientists' attention when early
breakthroughs revealed that the same master genes were laying out
fundamental body plans and parts across the animal kingdom. For example,
researchers discovered that genes in the Pax6 family could switch on the
development of eyes in animals as different as flies and people. More recent
work has begun looking beyond the body's basic building blocks to reveal how
changes in development have resulted in some of the world's most celebrated
of evolutionary events.

In one of the most exciting of the new studies, a team of scientists led by
Dr. Cliff Tabin, a developmental biologist at Harvard Medical School,
investigated a classic example of evolution by natural selection, the
evolution of Darwin's finches on the Galápagos Islands.

Like the other organisms that made it to the remote archipelago off the
coast of Ecuador, Darwin's finches have flourished in their isolation,
evolving into many and varied species. But, while the finches bear his name
and while Darwin was indeed inspired to thoughts of evolution by animals on
these islands, the finches left him flummoxed. Darwin did not realize for
quite some time that these birds were all finches or even that they were
related to one another.


He should be forgiven, however. For while the species are descendants of an
original pioneering finch, they no longer bear its characteristic short,
slender beak, which is excellent for hulling tiny seeds. In fact, the
finches no longer look very finchlike at all. Adapting to the strange new
foods of the islands, some have evolved taller, broader, more powerful
nut-cracking beaks; the most impressive of the big-beaked finches is
Geospiza magnirostris. Other finches have evolved longer bills that are
ideal for drilling holes into cactus fruits to get at the seeds; Geospiza
conirostris is one species with a particularly elongated beak.

But how could such bills evolve from a simple finch beak? Scientists had
assumed that the dramatic alterations in beak shape, height, width and
strength would require the accumulation of many chance mutations in many
different genes. But evo-devo has revealed that getting a fancy new beak can
be simpler than anyone had imagined.

Genes are stretches of DNA that can be switched on so that they will produce
molecules known as proteins. Proteins can then do a number of jobs in the
cell or outside it, working to make parts of organisms, switching other
genes on and so on. When genes are switched on to produce proteins, they can
do so at a low level in a limited area or they can crank out lots of protein
in many cells.

What Dr. Tabin and colleagues found, when looking at the range of beak
shapes and sizes across different finch species, was that the thicker and
taller and more robust a beak, the more strongly it expressed a gene known
as BMP4 early in development. The BMP4 gene (its abbreviation stands for
bone morphogenetic protein, No. 4) produces the BMP4 protein, which can
signal cells to begin producing bone. But BMP4 is multitalented and can also
act to direct early development, laying out a variety of architectural plans
including signaling which part of the embryo is to be the backside and which
the belly side. To verify that the BMP4 gene itself could indeed trigger the
growth of grander, bigger, nut-crushing beaks, researchers artificially
cranked up the production of BMP4 in the developing beaks of chicken
embryos. The chicks began growing wider, taller, more robust beaks similar
to those of a nut-cracking finch.

In the finches with long, probing beaks, researchers found at work a
different gene, known as calmodulin. As with BMP4, the more that calmodulin
was expressed, the longer the beak became. When scientists artificially
increased calmodulin in chicken embryos, the chicks began growing extended
beaks, just like a cactus driller.

So, with just these two genes, not tens or hundreds, the scientists found
the potential to recreate beaks, massive or stubby or elongated.

"So now one wants to go in a number of directions," Dr. Tabin said. "What
happens in a stork? What happens in a hummingbird? A parrot?" For the
evolution of beaks, the main tool with which a bird handles its food and
makes its living, is central not only to Darwin's finches, but to birds as a whole.

BMP4's reach does not stop at the birds, however.

In lakes in Africa, the fish known as cichlids have evolved so rapidly into
such a huge diversity of species that they have become one of the best known
evolutionary radiations. The cichlids have evolved in different shapes and
sizes, and with a variety of jaw types specialized for eating certain kinds
of food. Robust, thick jaws are excellent at crushing snails, while longer
jaws work well for sucking up algae. As with the beaks of finches, a range
of styles developed.

Now in a new study, Dr. R. Craig Albertson, an evolutionary biologist at
Syracuse University, and Dr. Thomas D. Kocher, a geneticist at the
University of New Hampshire, have shown that more robust-jawed cichlids
express more BMP4 during development than those with more delicate jaws. To
test whether BMP4 was indeed responsible for the difference, these
scientists artificially increased the expression of BMP4 in the zebrafish,
the lab rat of the fish world. And, reprising the beak experiments,
researchers found that increased production of BMP4 in the jaws of embryonic
zebrafish led to the development of more robust chewing and chomping parts.



And if being a major player in the evolution of African cichlids and Darwin's
finches - two of the most famous evolutionary radiations of species - were
not enough for BMP4, Dr. Peter R. Grant, an evolutionary biologist at
Princeton University, predicted that the gene would probably be found to
play an important role in the evolution of still other animals. He noted
that jaw changes were a crucial element in the evolution of lizards, rabbits
and mice, among others, making them prime candidates for evolution via BMP4.

"This is just the beginning," Dr. Grant said. "These are exciting times for
us all."

Used to lay out body plans, build beaks and alter fish jaws, BMP4
illustrates perfectly one of the major recurring themes of evo-devo. New
forms can arise via new uses of existing genes, in particular the control
genes or what are sometimes called toolkit genes that oversee development.
It is a discovery that can explain much that has previously been mysterious,
like the observation that without much obvious change to the genome over
all, one can get fairly radical changes in form.

"There aren't new genes arising every time a new species arises," said Dr.
Brian K. Hall, a developmental biologist at Dalhousie University in Nova
Scotia. "Basically you take existing genes and processes and modify them,
and that's why humans and chimps can be 99 percent similar at the genome level."

Evo-devo has also begun to shine a light on a phenomenon with which
evolutionary biologists have long been familiar, the way in which different
species will come up with sometimes jaw-droppingly similar solutions when
confronted with the same challenges.

Among the placental mammals of the Americas and the marsupials of Australia,
for example, the same sorts of animals have evolved independently: beasts
that burrowed, loping critters that grazed, creatures that had long snouts
for eating ants, and versions of the wolf.

In the same way, the cichlids have evolved pairs of matching species,
arising independently in separate lakes in Africa. In Lake Malawi, for
example, there is a long and flat-headed species with a deep underbite that
looks remarkably like an unrelated species that lives a similar lifestyle in
Lake Tanganyika. There is another cichlid with a bulging brow and frowning
lips in Lake Malawi with, again, an unrelated but otherwise extremely
similar-looking cichlid in Lake Tanganyika. The same jaws, heads, and ways
of living can be seen to evolve again and again.

The findings of evo-devo suggest that such parallels might in fact be
expected. For cichlids are hardly coming up with new genetic solutions to
eating tough snails as they each crank up the BMP4 or tinker with other
toolkit genes. Instead, whether in Lake Malawi or Lake Tanganyika, they may
be using the same genes to develop the same forms that provide the same
solutions to the same ecological challenges. Why not, when even the beaked
birds flying overhead are using the very same genes?

Evo-devo has even begun to give biologists new insight into one of the most
beautiful examples of recurring forms: the evolution of mimicry.

It has long been a source of amazement how some species seem so able to
evolve near-perfect mimicry of another. Poisonous species often evolve
bright warning colors, which have been reproduced by nonpoisonous species or
by other, similarly poisonous species, hoping to fend off curious predators.

Now in a new study of Heliconius butterflies, Dr. Mathieu Joron, an
evolutionary biologist at the University of Edinburgh, and colleagues, found
evidence that the mimics may be using some of the same genes to produce
their copycat warning colors and patterns.

The researchers studied several species of tropical Heliconius butterflies,
all of which are nasty-tasting to birds and which mimic one another's color
patterns. Dr. Joron and colleagues found that some of the main elements of
the patterns - a yellow band in Heliconius melpomene and Heliconius erato
and a complex tiger-stripe pattern in Heliconius numata - are controlled by
a single region of DNA, a tightly linked set of genes known as a supergene.

Dr. Joron said he and colleagues were still mapping the details of color
pattern control within the supergene. But if this turned out to function, as
researchers suspected, like a toolkit gene turning the patterns on and off,
it could explain both the prevalence of mimicry in Heliconius and the
apparent ease with which these species have been shown to repeatedly evolve
such superbly matching patterns.

One of evo-devo's greatest strengths is its cross-disciplinary nature,
bridging not only evolutionary and developmental studies but gaps as broad
as those between fossil-hunting paleontologists and molecular biologists.
One researcher whose approach epitomizes the power of such synthesis is Dr.
Neil Shubin, an evolutionary biologist at the University of Chicago and the
Field Museum.
Power User
Posts: 42482

« Reply #18 on: June 27, 2007, 09:03:58 AM »


Last year, Dr. Shubin and colleagues reported the discovery of a fossil fish
on Ellesmere Island in northern Canada. They had found Tiktaalik, as they
named the fish, after searching for six years. They persisted for so long
because they were certain that they had found the right age and kind of rock
where a fossil of a fish trying to make the transition to life on land was
likely to be found. And Tiktaalik appeared to be just such a fish, but it
also had a few surprises for the researchers.

"Tiktaalik is special," Dr. Shubin said. "It has a flat head with eyes on
top. It has gills and lungs. It's an animal that's exploring the interface
between water and land."

But Tiktaalik was a truly stunning discovery because this water-loving fish
bore wrists, an attribute thought to have been an innovation confined
strictly to animals that had already made the transition to land.

"This was telling us that a piece of the toolkit, to make arms, legs, hand
and feet, could very well be present in fish limbs," Dr. Shubin said. In
other words, the genetic tools or toolkit genes for making limbs to walk on
land might well have been present long before fish made that critical leap.
But as fascinating as Tiktaalik was, it was also rock hard and provided no
DNA that might shed light on the presence or absence of any particular gene.

So Dr. Shubin did what more and more evo-devo researchers are learning to
do: take off one hat (paleontologist) and don another (molecular biologist).
Dr. Shubin oversees one of what he says is a small but growing number of
laboratories where old-fashioned rock-pounding takes place alongside
high-tech molecular DNA studies.

He and colleagues began a study of the living but ancient fish known as the
paddlefish. What they found, reported last month in the journal Nature, was
that these thoroughly fishy fish were turning on control genes known as Hox
genes, in a manner characteristic of the four-limbed, land-loving beasts
known as tetrapods.

Tetrapods include cows, people, birds, rodents and so on. In other words,
the potential for making fingers, hands and feet, crucial innovations used
in emerging from the water to a life of walking and crawling on land,
appears to have been present in fish, long before they began flip-flopping
their way out of the muck. "The genetic tools to build fingers and toes were
in place for a long time," Dr. Shubin wrote in an e-mail message. "Lacking
were the environmental conditions where these structures would be useful."
He added, "Fingers arose when the right environments arose."

And here is another of the main themes to emerge from evo-devo. Major events
in evolution like the transition from life in the water to life on land are
not necessarily set off by the arising of the genetic mutations that will
build the required body parts, or even the appearance of the body parts
themselves, as had long been assumed. Instead, it is theorized that the
right ecological situation, the right habitat in which such bold, new forms
will prove to be particularly advantageous, may be what is required to set
these major transitions in motion.

So far, most of the evo-devo work has been on animals, but researchers have
begun to ask whether the same themes are being played out in plants.

Of particular interest to botanists is what Darwin described as an
"abominable mystery": the origin of flowering plants. A critical event in
the evolution of plants, it happened, by paleontological standards, rather suddenly.

So what genes were involved in the origin of flowers? Botanists know that
during development, the genes known as MADS box genes lay out the
architecture of the blossom. They do so by turning on other genes, thereby
determining what will develop where - petals here, reproductive parts there
and so on, in much the same manner that Hox genes determine the general
layout of parts in animals. Hox genes have had an important role in the
evolution of animal form. But have MADS box genes had as central a role in
the evolution of plants?

So far, said Dr. Vivian F. Irish, a developmental biologist at Yale
University, the answer appears to be yes. There is a variety of
circumstantial evidence, the most interesting of which is the fact that the
MADS box genes exploded in number right around the time that flowering
plants first appeared.

"It's really analogous to what's going on in Hox genes," said Dr. Irish,
though she noted that details of the role of the MADS box genes remained to
be worked out. "It's very cool that evolution has used a similar strategy in
two very different kingdoms."

Amid the enthusiastic hubbub, cautionary notes have been sounded. Dr. Jerry
Coyne, an evolutionary biologist at the University of Chicago, said that as
dramatic as the changes in form caused by mutations in toolkit genes can be,
it was premature to credit these genes with being the primary drivers of the
evolution of novel forms and diversity. He said that too few studies had
been done so far to support such broad claims, and that it could turn out
that other, more mundane workaday genes, of the sort that were being studied
long before evo-devo appeared on the scene, would play equally or even more
important roles.

"I urge caution," Dr. Coyne said. "We just don't know."

All of which goes to show that, as with any emerging field, evo-devo's
significance and the uniqueness of its contributions will continue to be
reassessed. It remains to be seen how separate from, or how integrated into,
the rest of evolutionary thinking its findings will end up being.
Paradoxically, it was during just such a flurry of intellectual synthesis
and research activity, the watershed known as the New or Modern Synthesis in
which modern evolutionary biology was born in the last century, that
developmental thinking was almost entirely ejected from the science of evolution.

But perhaps today synthesizers can do better, broadening their focus without
constricting their view of evolution as they try to take in all of the great
pageant that is the history of life.

"We're still a very young field," Dr. Gilbert said. "But I think this is a
new evolutionary synthesis, an emerging evolutionary synthesis. I think we're
seeing it."

« Reply #19 on: June 29, 2007, 01:54:59 PM »

By RANDOLPH E. SCHMID, AP Science Writer
Mon Jun 25, 5:00 PM ET

WASHINGTON - Researchers studying Neanderthal DNA say it should be possible to construct a complete genome of the ancient hominid despite the degradation of the DNA over time.  There is also hope for reconstructing the genome of the mammoth and cave bear, according to a research team led by Svante Paabo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Their findings are published in this week's online edition of Proceedings of the National Academy of Sciences.

Debate has raged for years about whether there is any relationship between Neanderthals and modern humans. Some researchers believe that Neanderthals were simply replaced by early modern humans, while others argue the two groups may have interbred.

Sequencing the genome of Neanderthals, who lived in Europe until about 30,000 years ago, could shed some light on that question.  In studies of Neanderthals, cave bear and mammoth, a majority of the DNA recovered was that of microorganisms that colonized the tissues after death, the researchers said.  But they were able to identify some DNA from the original animal, and Paabo and his colleagues were able to determine how it broke down over time. They also developed procedures to prevent contamination by the DNA of humans working with the material.

"We are confident that it will be technically feasible to achieve a reliable Neanderthal genome sequence," Paabo and his researchers reported.

They said the problem of damaged areas in some DNA could be overcome by using a sufficient amount of Neanderthal DNA from different individuals, so that the whole genome can be determined.

"The contamination and degradation of DNA has been a serious issue for the last 10 years," observed Erik Trinkaus, a professor at Washington University in St. Louis. "This is a serious attempt to deal with that issue and that's welcome.  I'm not sure they have completely solved the problem, but they've made a big step in that direction," said Trinkaus, who was not involved in the research.

Anthropologist Richard Potts of the Smithsonian's National Museum of Natural History, called the work "a very significant technical study of DNA decay."

The researchers "have tried to answer important questions about the potential to sequence ancient DNA," said Potts, who was not part of the research.

Milford Wolpoff, an anthropologist at the University of Michigan, said creating a complete Neanderthal genome is a great goal.

But it is "sample intensive," he said, and he isn't sure enough DNA is available to complete the work. Curators don't like to see their specimens ground up, he said.

The research was funded by the Max Planck Society and the National Institutes of Health.

« Reply #20 on: July 09, 2007, 08:11:42 AM »

The International Hoplology site is well worth a look.

Its "Three Axioms" break down the aggressive instinct differently than Konrad Lorenz did.

To refresh memories, KL wrote of three categories: Territory, Hierarchy, and Reproduction. To these three, in the case of humans, I have added Hunting: a criminal stealing money, for example, is in effect taking food, and his behaviors will be those of a hunter.

In contrast, as seen below, hoplology apparently has two categories. My first intuitive response is that their approach also seems to have merit.


Three Axioms of Hoplology

1. The foundation of human combative behavior is rooted in our evolution. To gain a realistic understanding of human combative behavior, it is necessary to have a basic grasp of its evolutionary background.

2. The two basic forms of human combative behavior are predatory and affective. Predatory combative behavior is that combative/aggressive behavior rooted in our evolution as a hunting mammal. Affective combative behavior is that aggressive/combative behavior rooted in our evolution as a group-social animal.

3. The evolution of human combative behavior and performance is integral with the use of weapons. That is, behavior and performance is intrinsically linked to and reflects the use of weapons.

« Reply #21 on: July 31, 2007, 03:35:47 PM »

In Games, an Insight Into the Rules of Evolution

Published: July 31, 2007
When Martin Nowak was in high school, his parents thought he would be a nice boy and become a doctor. But when he left for the University of Vienna, he abandoned medicine for something called biochemistry. As far as his parents could tell, it had something to do with yeast and fermenting. They became a little worried. When their son entered graduate school, they became even more worried. He announced that he was now studying games.

In the end, Dr. Nowak turned out all right. He is now the director of the Program for Evolutionary Dynamics at Harvard. The games were actually versatile mathematical models that Dr. Nowak could use to make important discoveries in fields as varied as economics and cancer biology.

“Martin has a passion for taking informal ideas that people like me find theoretically important and framing them as mathematical models,” said Steven Pinker, a Harvard linguist who is collaborating with Dr. Nowak to study the evolution of language. “He allows our intuitions about what leads to what to be put to a test.”

On the surface, Dr. Nowak’s many projects may seem randomly scattered across the sciences. But there is an underlying theme to his work. He wants to understand one of the most puzzling yet fundamental features of life: cooperation.

When biologists speak of cooperation, they speak more broadly than the rest of us. Cooperation is what happens when someone or something gets a benefit because someone or something else pays a cost. The benefit can take many forms, like money or reproductive success. A friend takes off work to pick you up from the hospital. A sterile worker bee tends to eggs in a hive. Even the cells in the human body cooperate. Rather than reproducing as fast as it can, each cell respects the needs of the body, helping to form the heart, the lungs or other vital organs. Even the genes in a genome cooperate, to bring an organism to life.

In recent papers, Dr. Nowak has argued that cooperation is one of the three basic principles of evolution. The other two are mutation and selection. On their own, mutation and selection can transform a species, giving rise to new traits like limbs and eyes. But cooperation is essential for life to evolve to a new level of organization. Single-celled protozoa had to cooperate to give rise to the first multicellular animals. Humans had to cooperate for complex societies to emerge.

“We see this principle everywhere in evolution where interesting things are happening,” Dr. Nowak said.

While cooperation may be central to evolution, however, it poses questions that are not easy to answer. How can competing individuals start to cooperate for the greater good? And how do they continue to cooperate in the face of exploitation? To answer these questions, Dr. Nowak plays games.

His games are the intellectual descendants of a puzzle known as the Prisoner’s Dilemma. Imagine two prisoners are separately offered the same deal: if one of them testifies and the other doesn’t talk, the talker will go free and the holdout will go to jail for 10 years. If both refuse to talk, the prosecutor will only be able to put them in jail for six months. If each prisoner rats out the other, they will both get five-year sentences. Not knowing what the other prisoner will do, how should each one act?
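
The payoff structure just described can be written down directly. Here is a minimal sketch in Python; the sentences are exactly the ones given above, in years of jail time, lower being better:

```python
# Sentences (in years of jail) for each pair of actions, as given in the
# article. Keys are (my_action, other_action); values are my sentence.
SENTENCE = {
    ("silent", "silent"): 0.5,  # both refuse to talk: six months each
    ("silent", "talk"):   10,   # the holdout gets 10 years
    ("talk",   "silent"): 0,    # the talker goes free
    ("talk",   "talk"):   5,    # both rat each other out: five years each
}

def my_sentence(my_action, other_action):
    """Years I serve, given what each of us does (lower is better)."""
    return SENTENCE[(my_action, other_action)]

# Whichever move the other prisoner makes, talking shortens my sentence,
# yet if both prisoners follow that logic they do far worse than mutual silence.
assert my_sentence("talk", "silent") < my_sentence("silent", "silent")
assert my_sentence("talk", "talk") < my_sentence("silent", "talk")
```

The two assertions at the end are the dilemma itself: defection dominates for each player individually, even though mutual cooperation is jointly better.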

The way the Prisoner’s Dilemma pits cooperation against defection distills an important feature of evolution. In any encounter between two members of the same species, each one may cooperate or defect. Certain species of bacteria, for example, spray out enzymes that break down food, which all the bacteria can then suck up. It costs energy to make these enzymes. If one of the microbes stops cooperating and does not make the enzymes, it can still enjoy the meal. It can gain a potential reproductive edge over bacteria that cooperate.

The Prisoner’s Dilemma may be abstract, but that’s why Dr. Nowak likes it. It helps him understand fundamental rules of evolution, just as Isaac Newton discovered that objects in motion tend to stay in motion.

“If you were obsessed with friction, you would have never discovered this law,” Dr. Nowak said. “In the same sense, I try to get rid of what is inessential to find the essential. Truth is simple.”

Dr. Nowak found his first clues to the origin of cooperation in graduate school, collaborating with his Ph.D. adviser, Karl Sigmund. They built a version of the Prisoner’s Dilemma that captured more of the essence of how organisms behave and evolve.

In their game, an entire population of players enters a round-robin competition. The players are paired up randomly, and each one chooses whether to cooperate or defect. To make a choice, they can recall their past experiences with other individual players. Some players might use a strategy in which they had a 90-percent chance of cooperating with a player with whom they have cooperated in the past.

The players get rewarded based on their choices. The most successful players get to reproduce. Each new player has a small chance of randomly mutating its strategy. If that strategy turns out to be more successful, it can dominate the population, wiping out its ancestors.

Dr. Nowak and Dr. Sigmund observed this tournament through millions of rounds. Often the winners used a strategy that Dr. Nowak called "win-stay, lose-shift." If they did well in the previous round, they did the same thing again. If they did not do so well, they shifted. Under some conditions, this strategy caused cooperation to become common among the players, despite the short-term payoff of defecting.
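
A minimal sketch of the "win-stay, lose-shift" rule in a repeated Prisoner's Dilemma; the payoff values and the aspiration threshold below are illustrative choices, not the parameters of Dr. Nowak's actual simulations:

```python
import random

# Illustrative Prisoner's Dilemma payoffs (temptation 5 > reward 3 > punishment 1 > sucker 0).
# Keys are (player_a_move, player_b_move); values are (a_payoff, b_payoff).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def win_stay_lose_shift(last_move, last_payoff, aspiration=2):
    """Repeat the last move if it paid at least `aspiration`; else switch."""
    if last_payoff >= aspiration:
        return last_move                      # win: stay
    return "D" if last_move == "C" else "C"   # lose: shift

def play(rounds=100, seed=0):
    """Two win-stay-lose-shift players with random starting moves."""
    rng = random.Random(seed)
    a, b = rng.choice("CD"), rng.choice("CD")
    for _ in range(rounds):
        pa, pb = PAYOFF[(a, b)]
        a, b = win_stay_lose_shift(a, pa), win_stay_lose_shift(b, pb)
    return a, b

# Whatever the starting moves, the pair settles into mutual cooperation.
assert all(play(seed=s) == ("C", "C") for s in range(10))
```

With these illustrative payoffs, two win-stay-lose-shift players lock into mutual cooperation within a few rounds: after a mutual defection both players "lose" and shift together back to cooperating, which then "wins" and persists.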

In order to study this new version of the Prisoner’s Dilemma, Dr. Nowak had to develop new mathematical tools. It turned out that these tools also proved useful for studying cancer. Cancer and the Prisoner’s Dilemma may seem like apples and oranges, but Dr. Nowak sees an intimate connection between the two. “Cancer is a breakdown of cooperation,” he said.

Mutations sometimes arise in cells that cause them to replicate quickly, ignoring signals to stop. Some of their descendants acquire new mutations, allowing them to become even more successful as cancer cells. They evolve, in other words, into more successful defectors. “Cancer is an evolution you don’t want,” Dr. Nowak said.

To study cancer, however, Dr. Nowak had to give his models some structure. In the Prisoner’s Dilemma, the players usually just bump into each other randomly. In the human body, on the other hand, cells only interact with cells in their neighborhood.

A striking example of these neighborhoods can be found in the intestines, where the lining is organized into millions of tiny pockets. A single stem cell at the bottom of a pocket divides, and its daughter cells are pushed up the pocket walls. The cells that reach the top get stripped away.

Dr. Nowak adapted a branch of mathematics known as graph theory, which makes it possible to study networks, to analyze how cancer arises in these local neighborhoods. “Our tissue is actually organized to delay the onset of cancer,” he said.

Pockets of intestinal cells, for example, can only hold a few cell generations. That lowers the chances that any one will turn cancerous. All the cells in each pocket are descended from a single stem cell, so that there’s no competition between lineages to take over the pocket.

As Dr. Nowak developed this neighborhood model, he realized it would help him study human cooperation. “The reality is that I’m much more likely to interact with my friends, and they’re much more likely to interact with their friends,” Dr. Nowak said. “So it’s more like a network.”

Dr. Nowak and his colleagues found that when they put players into a network, the Prisoner’s Dilemma played out differently. Tight clusters of cooperators emerge, and defectors elsewhere in the network are not able to undermine their altruism. “Even if outside our network there are cheaters, we still help each other a lot,” Dr. Nowak said. That is not to say that cooperation always emerges. Dr. Nowak identified the conditions when it can arise with a simple equation: B/C>K. That is, cooperation will emerge if the benefit-to-cost (B/C) ratio of cooperation is greater than the average number of neighbors (K).

“It’s the simplest possible thing you could have expected, and it’s completely amazing,” he said.
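
The rule lends itself to a one-line check. A sketch, assuming only the relation stated above (cooperation is favored when B/C exceeds the average number of neighbors K):

```python
def cooperation_favored(benefit, cost, neighbor_counts):
    """Nowak's network rule: cooperation can take hold when the
    benefit-to-cost ratio of an altruistic act exceeds the average
    number of neighbors in the network (B/C > K)."""
    k = sum(neighbor_counts) / len(neighbor_counts)
    return benefit / cost > k

# A sparse network (everyone has 2 neighbors) favors cooperation at B/C = 3,
# but a denser network (4 neighbors each) does not:
assert cooperation_favored(benefit=3, cost=1, neighbor_counts=[2, 2, 2, 2])
assert not cooperation_favored(benefit=3, cost=1, neighbor_counts=[4, 4, 4, 4])
```

The comparison captures the intuition in the article: the fewer neighbors each player has, the easier it is for tight clusters of cooperators to shield each other from defectors.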

Another boost for cooperation comes from reputations. When we decide whether to cooperate, we don’t just rely on our past experiences with that particular person. People can gain reputations that precede them. Dr. Nowak and his colleagues pioneered a version of the Prisoner’s Dilemma in which players acquire reputations. They found that if reputations spread quickly enough, they could increase the chances of cooperation taking hold. Players were less likely to be fooled by defectors and more likely to benefit from cooperation.
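
A toy version of such a reputation game, in the spirit of the image-scoring models Dr. Nowak pioneered with Dr. Sigmund; the score threshold and payoff values here are illustrative, not the published parameters:

```python
import random

def image_scoring_round(scores, rng, benefit=1.0, cost=0.5):
    """One round of a toy indirect-reciprocity game: a random donor meets a
    random recipient and helps only if the recipient's reputation (image
    score) is non-negative. Helping raises the donor's own score, so good
    deeds pay off later; refusing lowers it. Returns this round's payoffs."""
    donor, recipient = rng.sample(range(len(scores)), 2)
    payoffs = [0.0] * len(scores)
    if scores[recipient] >= 0:
        payoffs[donor] -= cost         # helping costs the donor...
        payoffs[recipient] += benefit  # ...and benefits the recipient
        scores[donor] += 1             # observers note the good deed
    else:
        scores[donor] -= 1             # refusing help damages reputation
    return payoffs
```

Run over many rounds, players who keep positive scores keep receiving help, which is the mechanism by which fast-spreading reputations can stabilize cooperation.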

In experiments conducted by other scientists with people and animals, Dr. Nowak’s mathematical models seem to fit. Reputation has a powerful effect on how people play games. People who gain a reputation for not cooperating tend to be shunned or punished by other players. Cooperative players get rewarded.

“You help because you know it gives you a reputation of a helpful person, who will be helped,” Dr. Nowak said. “You also look at others and help them according to whether they have helped.”

The subject of human cooperation is important not just to mathematical biologists like Dr. Nowak, but to many people involved in the current debate over religion and science. Some claim that it is unlikely that evolution could have produced humans’ sense of morality, the altruism of heroes and saints. “Selfless altruism presents a major challenge for the evolutionist,” Dr. Francis S. Collins, the director of the National Human Genome Research Institute, wrote in his 2006 book, “The Language of God.”

Dr. Nowak believes evolutionary biologists should study average behavior rather than a few extreme cases of altruism. “Saintly behavior is unfortunately not the norm,” Dr. Nowak said. “The current theory can certainly explain a population where some people act extremely altruistically.” That does not make Dr. Nowak an atheist, however. “Evolution describes the fundamental laws of nature according to which God chose to unfold life,” he declared in March in a lecture titled “Evolution and Christianity” at the Harvard Divinity School. Dr. Nowak is collaborating with theologians there on a project called “The Evolution and Theology of Cooperation,” to help theologians address evolutionary biology in their own work.

Dr. Nowak sometimes finds his scientific colleagues astonished when he defends religion. But he believes the astonishment comes from a misunderstanding of the roles of science and religion. “Like mathematics, many theological statements do not need scientific confirmation. Once you have the proof of Fermat’s Last Theorem, it’s not like we have to wait for the scientists to tell us if it’s right. This is it.”


« Reply #22 on: August 07, 2007, 09:02:37 AM »

In Dusty Archives, a Theory of Affluence
Published: August 7, 2007

For thousands of years, most people on earth lived in abject poverty, first as hunters and gatherers, then as peasants or laborers. But with the Industrial Revolution, some societies traded this ancient poverty for amazing affluence.

Breaking Out of a Malthusian Trap

Historians and economists have long struggled to understand how this transition occurred and why it took place only in some countries. A scholar who has spent the last 20 years scanning medieval English archives has now emerged with startling answers for both questions.

Gregory Clark, an economic historian at the University of California, Davis, believes that the Industrial Revolution — the surge in economic growth that occurred first in England around 1800 — occurred because of a change in the nature of the human population. The change was one in which people gradually developed the strange new behaviors required to make a modern economy work. The middle-class values of nonviolence, literacy, long working hours and a willingness to save emerged only recently in human history, Dr. Clark argues.

Because they grew more common in the centuries before 1800, whether by cultural transmission or evolutionary adaptation, the English population at last became productive enough to escape from poverty, followed quickly by other countries with the same long agrarian past.

Dr. Clark’s ideas have been circulating in articles and manuscripts for several years and are to be published as a book next month, “A Farewell to Alms” (Princeton University Press). Economic historians have high praise for his thesis, though many disagree with parts of it.

“This is a great book and deserves attention,” said Philip Hoffman, a historian at the California Institute of Technology. He described it as “delightfully provocative” and a “real challenge” to the prevailing school of thought that it is institutions that shape economic history.

Samuel Bowles, an economist who studies cultural evolution at the Santa Fe Institute, said Dr. Clark’s work was “great historical sociology and, unlike the sociology of the past, is informed by modern economic theory.”

The basis of Dr. Clark’s work is his recovery of data from which he can reconstruct many features of the English economy from 1200 to 1800. From this data, he shows, far more clearly than has been possible before, that the economy was locked in a Malthusian trap — each time new technology increased the efficiency of production a little, the population grew, the extra mouths ate up the surplus, and average income fell back to its former level.
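
The trap's dynamic can be sketched with a toy model; all parameter values here are illustrative, not Dr. Clark's data. Any gain in productive efficiency raises income only briefly, because population then grows until per-capita income falls back to subsistence:

```python
def malthusian_steps(tech, population, subsistence=1.0, growth_rate=0.05, steps=200):
    """Toy Malthusian dynamics: per-capita income = tech / population.
    Population grows when income exceeds subsistence and shrinks below it,
    so long-run income returns to subsistence no matter how high tech is."""
    for _ in range(steps):
        income = tech / population
        population *= 1 + growth_rate * (income - subsistence)
    return tech / population

# Doubling the technology level barely changes long-run per-capita income:
base = malthusian_steps(tech=100.0, population=80.0)
doubled = malthusian_steps(tech=200.0, population=80.0)
assert abs(base - 1.0) < 0.01
assert abs(doubled - 1.0) < 0.01
```

Escaping the trap, in this sketch, requires `tech` to grow faster than the population can respond, which is exactly the acceleration in productivity growth the article attributes to the Industrial Revolution.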

This income was pitifully low in terms of the amount of wheat it could buy. By 1790, the average person’s consumption in England was still just 2,322 calories a day, with the poor eating a mere 1,508. Living hunter-gatherer societies enjoy diets of 2,300 calories or more.

“Primitive man ate well compared with one of the richest societies in the world in 1800,” Dr. Clark observes.

The tendency of population to grow faster than the food supply, keeping most people at the edge of starvation, was described by Thomas Malthus in a 1798 book, “An Essay on the Principle of Population.” This Malthusian trap, Dr. Clark’s data show, governed the English economy from 1200 until the Industrial Revolution and has in his view probably constrained humankind throughout its existence. The only respite was during disasters like the Black Death, when population plummeted, and for several generations the survivors had more to eat.

Malthus’s book is well known because it gave Darwin the idea of natural selection. Reading of the struggle for existence that Malthus predicted, Darwin wrote in his autobiography, “It at once struck me that under these circumstances favourable variations would tend to be preserved, and unfavourable ones to be destroyed. ... Here then I had at last got a theory by which to work.”

Given that the English economy operated under Malthusian constraints, might it not have responded in some way to the forces of natural selection that Darwin had divined would flourish in such conditions? Dr. Clark started to wonder whether natural selection had indeed changed the nature of the population in some way and, if so, whether this might be the missing explanation for the Industrial Revolution.

The Industrial Revolution, the first escape from the Malthusian trap, occurred when the efficiency of production at last accelerated, growing fast enough to outpace population growth and allow average incomes to rise. Many explanations have been offered for this spurt in efficiency, some economic and some political, but none is fully satisfactory, historians say.

Dr. Clark’s first thought was that the population might have evolved greater resistance to disease. The idea came from Jared Diamond’s book “Guns, Germs and Steel,” which argues that Europeans were able to conquer other nations in part because of their greater immunity to disease.

In support of the disease-resistance idea, cities like London were so filthy and disease ridden that a third of their populations died off every generation, and the losses were restored by immigrants from the countryside. That suggested to Dr. Clark that the surviving population of England might be the descendants of peasants.

A way to test the idea, he realized, was through analysis of ancient wills, which might reveal a connection between wealth and the number of progeny. The wills did that, but in quite the opposite direction from what he had expected.

Generation after generation, the rich had more surviving children than the poor, his research showed. That meant there must have been constant downward social mobility as the poor failed to reproduce themselves and the progeny of the rich took over their occupations. “The modern population of the English is largely descended from the economic upper classes of the Middle Ages,” he concluded.

As the progeny of the rich pervaded all levels of society, Dr. Clark considered, the behaviors that made for wealth could have spread with them. He has documented that several aspects of what might now be called middle-class values changed significantly from the days of hunter-gatherer societies to 1800. Work hours increased, literacy and numeracy rose, and the level of interpersonal violence dropped.

Another significant change in behavior, Dr. Clark argues, was an increase in people’s preference for saving over instant consumption, which he sees reflected in the steady decline in interest rates from 1200 to 1800.

“Thrift, prudence, negotiation and hard work were becoming values for communities that previously had been spendthrift, impulsive, violent and leisure loving,” Dr. Clark writes.

Around 1790, a steady upward trend in production efficiency first emerged in the English economy. It was this significant acceleration in the rate of productivity growth that at last made possible England’s escape from the Malthusian trap and the emergence of the Industrial Revolution.

In the rest of Europe and East Asia, populations had also long been shaped by the Malthusian trap of their stable agrarian economies. Their workforces easily absorbed the new production technologies that appeared first in England.

It is puzzling that the Industrial Revolution did not occur first in the much larger populations of China or Japan. Dr. Clark has found data showing that their richer classes, the Samurai in Japan and the Qing dynasty in China, were surprisingly infertile and so would have failed to generate the downward social mobility that spread production-oriented values in England.

After the Industrial Revolution, the gap in living standards between the richest and the poorest countries started to accelerate, from a wealth disparity of about 4 to 1 in 1800 to more than 50 to 1 today. Just as there is no agreed explanation for the Industrial Revolution, economists cannot account well for the divergence between rich and poor nations or they would have better remedies to offer.

Many commentators point to a failure of political and social institutions as the reason that poor countries remain poor. But the proposed medicine of institutional reform “has failed repeatedly to cure the patient,” Dr. Clark writes. He likens the “cult centers” of the World Bank and International Monetary Fund to prescientific physicians who prescribed bloodletting for ailments they did not understand.

If the Industrial Revolution was caused by changes in people’s behavior, then populations that have not had time to adapt to the Malthusian constraints of agrarian economies will not be able to achieve the same production efficiencies, his thesis implies.


Dr. Clark says the middle-class values needed for productivity could have been transmitted either culturally or genetically. But in some passages, he seems to lean toward evolution as the explanation. “Through the long agrarian passage leading up to the Industrial Revolution, man was becoming biologically more adapted to the modern economic world,” he writes. And, “The triumph of capitalism in the modern world thus may lie as much in our genes as in ideology or rationality.”

What was being inherited, in his view, was not greater intelligence — being a hunter in a foraging society requires considerably greater skill than the repetitive actions of an agricultural laborer. Rather, it was “a repertoire of skills and dispositions that were very different from those of the pre-agrarian world.”

Reaction to Dr. Clark’s thesis from other economic historians seems largely favorable, although few agree with all of it, and many are skeptical of the most novel part, his suggestion that evolutionary change is a factor to be considered in history.

Historians used to accept changes in people’s behavior as an explanation for economic events, like Max Weber’s thesis linking the rise of capitalism with Protestantism. But most have now swung to the economists’ view that all people are alike and will respond in the same way to the same incentives. Hence they seek to explain events like the Industrial Revolution in terms of changes in institutions, not people.

Dr. Clark’s view is that institutions and incentives have been much the same all along and explain very little, which is why there is so little agreement on the causes of the Industrial Revolution. In saying the answer lies in people’s behavior, he is asking his fellow economic historians to revert to a type of explanation they had mostly abandoned and in addition is evoking an idea that historians seldom consider as an explanatory variable, that of evolution.

Most historians have assumed that evolutionary change is too gradual to have affected human populations in the historical period. But geneticists, with information from the human genome now at their disposal, have begun to detect ever more recent instances of human evolutionary change like the spread of lactose tolerance in cattle-raising people of northern Europe just 5,000 years ago. A study in the current American Journal of Human Genetics finds evidence of natural selection at work in the population of Puerto Rico since 1513. So historians are likely to be more enthusiastic about the medieval economic data and elaborate time series that Dr. Clark has reconstructed than about his suggestion that people adapted to the Malthusian constraints of an agrarian society.

“He deserves kudos for assembling all this data,” said Dr. Hoffman, the Caltech historian, “but I don’t agree with his underlying argument.”

The decline in English interest rates, for example, could have been caused by the state’s providing better domestic security and enforcing property rights, Dr. Hoffman said, not by a change in people’s willingness to save, as Dr. Clark asserts.

The natural-selection part of Dr. Clark’s argument “is significantly weaker, and maybe just not necessary, if you can trace the changes in the institutions,” said Kenneth L. Pomeranz, a historian at the University of California, Irvine. In a recent book, “The Great Divergence,” Dr. Pomeranz argues that tapping new sources of energy like coal and bringing new land into cultivation, as in the North American colonies, were the productivity advances that pushed the old agrarian economies out of their Malthusian constraints.

Robert P. Brenner, a historian at the University of California, Los Angeles, said although there was no satisfactory explanation at present for why economic growth took off in Europe around 1800, he believed that institutional explanations would provide the answer and that Dr. Clark’s idea of genes for capitalist behavior was “quite a speculative leap.”

Dr. Bowles, the Santa Fe economist, said he was “not averse to the idea” that genetic transmission of capitalist values is important, but that the evidence for it was not yet there. “It’s just that we don’t have any idea what it is, and everything we look at ends up being awfully small,” he said. Tests of most social behaviors show they are very weakly heritable.

He also took issue with Dr. Clark’s suggestion that the unwillingness to postpone consumption, called time preference by economists, had changed in people over the centuries. “If I were as poor as the people who take out payday loans, I might also have a high time preference,” he said.

Dr. Clark said he set out to write his book 12 years ago on discovering that his undergraduates knew nothing about the history of Europe. His colleagues have been surprised by its conclusions but also interested in them, he said.

“The actual data underlying this stuff is hard to dispute,” Dr. Clark said. “When people see the logic, they say ‘I don’t necessarily believe it, but it’s hard to dismiss.’ ”

Power User
Posts: 42482

« Reply #23 on: August 28, 2007, 07:56:23 AM »

A good discussion of human evolutionary biology/psychology of aggression:
Power User
Posts: 784

« Reply #24 on: August 31, 2007, 02:17:03 PM »

August 27, 2007

One Species' Genome Discovered Inside Another's

Bacterial to Animal Gene Transfers Now Shown to be Widespread, with Implications for Evolution and Control of Diseases and Pests
Scientists at the University of Rochester and the J. Craig Venter Institute have discovered a copy of the genome of a bacterial parasite residing inside the genome of its host species.

The research, reported in today's Science, also shows that lateral gene transfer—the movement of genes between unrelated species—may happen much more frequently between bacteria and multicellular organisms than scientists previously believed, posing dramatic implications for evolution.

Such large-scale heritable gene transfers may allow species to acquire new genes and functions extremely quickly, says Jack Werren, a principal investigator of the study. If such genes provide new abilities in species that cause or transmit disease, they could provide new targets for fighting these diseases.

The results also have serious repercussions for genome-sequencing projects. Bacterial DNA is routinely discarded when scientists are assembling invertebrate genomes, yet these genes may very well be part of the organism's genome, and might even be responsible for functioning traits.

"This study establishes the widespread occurrence and high frequency of a process that we would have dismissed as science fiction until just a few years ago," says W. Ford Doolittle, Canada Research Chair in Comparative Microbial Genomics at Dalhousie University, who is not connected to the study. "This is stunning evidence for increased frequency of gene transfer."

"It didn't seem possible at first," says Werren, professor of biology at the University of Rochester and a world-leading authority on the parasite, called wolbachia. "This parasite has implanted itself inside the cells of 70 percent of the world's invertebrates, coevolving with them. And now, we've found at least one species where the parasite's entire or nearly entire genome has been absorbed and integrated into the host's. The host's genes actually hold the coding information for a completely separate species."

Wolbachia may be the most prolific parasite in the world—a "pandemic," as Werren calls it. The bacterium invades a member of a species, most often an insect, and eventually makes its way into the host's eggs or sperm. Once there, the wolbachia is ensured passage to the next generation of its host, and any genetic exchanges between it and the host also are much more likely to be passed on.

Since wolbachia typically live within the reproductive organs of their hosts, Werren reasoned that gene exchanges between the two would frequently pass on to subsequent generations. Based on this and an earlier discovery of a wolbachia gene in a beetle by the Fukatsu team at the University of Tokyo, Japan, the researchers in Werren's lab and collaborators at J. Craig Venter Institute (JCVI) decided to systematically screen invertebrates. Julie Dunning-Hotopp at JCVI found evidence that some of the wolbachia genes seemed to be fused to the genes of the fruitfly, Drosophila ananassae, as if they were part of the same genome.

Michael Clark, a research associate at Rochester, then brought a colony of ananassae into Werren's lab to look into the mystery. To isolate the fly's genome from the parasite's, Clark fed the flies a simple antibiotic, killing the wolbachia. To confirm that the ananassae flies were indeed cured of the wolbachia, Clark tested a few samples of DNA for the presence of several wolbachia genes.

To his dismay, he found them.

"For several months, I thought I was just failing," says Clark. "I kept administering antibiotics, but every single wolbachia gene I tested for was still there. I started thinking maybe the strain had developed antibiotic resistance. After months of this I finally went back and looked at the tissue again, and there was no wolbachia there at all."

Clark had cured the fly of the parasite, but a copy of the parasite's genome was still present in the fly's genome. Clark was able to see that wolbachia genes were present on the second chromosome of the insect.

Clark confirmed that the wolbachia genes are inherited like "normal" insect genes in the chromosomes, and Dunning-Hotopp showed that some of the genes are "transcribed" in uninfected flies, meaning that copies of the gene sequence are made in cells that could be used to make wolbachia proteins.

Werren doesn't believe that the wolbachia "intentionally" insert their genes into the hosts. Rather, it is a consequence of cells routinely repairing their damaged DNA. As cells go about their regular business, they can accidentally absorb bits of DNA into their nuclei, often sewing those foreign genes into their own DNA. But integrating an entire genome was definitely an unexpected find.

Werren and Clark are now looking further into the huge insert found in the fruitfly, and whether it is providing a benefit. "The chance that a chunk of DNA of this magnitude is totally neutral, I think, is pretty small, so the implication is that it has imparted some selective advantage to the host," says Werren. "The question is, are these foreign genes providing new functions for the host? This is something we need to figure out."

Evolutionary biologists will certainly take note of this discovery, but scientists conducting genome-sequencing projects around the world also may have to readjust their thinking.

Before this study, geneticists knew of examples where genes from a parasite had crossed into the host, but such an event was considered a rare anomaly except in very simple organisms. Bacterial DNA is very conspicuous in its structure, so if scientists sequencing a nematode genome, for example, come across bacterial DNA, they would likely discard it, reasonably assuming that it was merely contamination—perhaps a bit of bacteria in the gut of the animal, or on its skin.

But those genes may not be contamination. They may very well be in the host's own genome. This is exactly what happened with the original sequencing of the genome of the ananassae fruitfly—the huge wolbachia insert was discarded from the final assembly, despite the fact that it is part of the fly's genome.
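The cleanup step described above can be sketched in a few lines. This is a hypothetical illustration, not the pipeline the sequencing projects actually used: the taxon labels and read names are invented, and real assemblies classify reads with similarity searches against sequence databases rather than a ready-made label.

```python
# Hypothetical sketch of the assembly-cleanup step described above:
# reads that look bacterial are discarded as presumed contamination,
# which also silently drops genuinely integrated bacterial DNA such as
# a wolbachia insert. Reads and labels are invented for illustration.

def looks_bacterial(taxon):
    """Stand-in for a similarity-search classification of a read."""
    return taxon == "bacterial"

# (read_id, taxon assigned by the classifier)
reads = [
    ("r1", "insect"),
    ("r2", "bacterial"),   # true contaminant, e.g. gut flora
    ("r3", "bacterial"),   # actually a wolbachia gene on the fly's chromosome 2
    ("r4", "insect"),
]

kept = [rid for rid, taxon in reads if not looks_bacterial(taxon)]
print(kept)  # ['r1', 'r4']: r3 is lost even though it is part of the fly genome
```

The point of the sketch is that the filter cannot distinguish a contaminating bacterial read from an integrated one; both look bacterial, so both are thrown away.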

In the early days of the Human Genome Project, some studies appeared to show bacterial DNA residing in our own genome, but those were shown indeed to be caused by contamination. Wolbachia is not known to infect any vertebrates such as humans.

"Such transfers have happened before in the distant past" notes Werren. "In our very own cells and those of nearly all plants and animals are mitochondria, special structures responsible for generating most of our cells' supply of chemical energy. These were once bacteria that lived inside cells, much like wolbachia does today. Mitochondria still retain their own, albeit tiny, DNA, and most of the genes moved into the nucleus in the very distant past. Like wolbachia, they have passively exchanged DNA with their host cells. It's possible wolbachia may follow in the path of mitochondria, eventually becoming a necessary and useful part of a cell.

"In a way, wolbachia could be the next mitochondria," says Werren. "A hundred million years from now, everyone may have a wolbachia organelle."

"Well, not us," he laughs. "We'll be long gone, but wolbachia will still be around."

This research was funded by the National Science Foundation's Frontiers in Integrative Biological Research program, which supports large, integrative projects addressing major questions in biology.
Power User
Posts: 42482

« Reply #25 on: August 31, 2007, 06:36:36 PM »

That is fascinating!
Power User
Posts: 42482

« Reply #26 on: October 09, 2007, 07:33:28 AM »

NY Times
Published: October 9, 2007
Royal is a cantankerous old male baboon whose troop of some 80 members lives in the Moremi Game Reserve in Botswana. A perplexing event is about to disturb his day.

From the bushes to his right, he hears a staccato whoop, the distinctive call that female baboons always make after mating. He recognizes the voice as that of Jackalberry, the current consort of Cassius, a male who outranks Royal in the strict hierarchy of male baboons. No hope of sex today.

But then, surprisingly, he hears Cassius’s signature greeting grunt to his left. His puzzlement is plain on the video made of his reaction. You can almost see the wheels turn slowly in his head:

“Jackalberry here, but Cassius over there. Hmm, Jackalberry must be hooking up with someone else. But that means Cassius has left her unguarded. Say what — this is my big chance!”

The video shows him loping off in the direction of Jackalberry’s whoop. But all that he will find is the loudspeaker from which researchers have played Jackalberry’s recorded call.

The purpose of the experiment is not to ruin Royal’s day but to understand what goes on in a baboon’s mind, in this case how carefully the animals keep track of transient relationships.

Dorothy Cheney and Robert Seyfarth, a husband-and-wife team of biologists at the University of Pennsylvania, have spent 14 years observing the Moremi baboons. Through ingenious playback experiments performed by themselves and colleagues, the researchers say they have worked out many aspects of what baboons use their minds for, along with their limitations.

Reading a baboon’s mind affords an excellent grasp of the dynamics of baboon society. But more than that, it bears on the evolution of the human mind and the nature of human existence. As Darwin jotted down in a notebook of 1838, “He who understands baboon would do more towards metaphysics than Locke.”

Dr. Cheney and Dr. Seyfarth are well known for a 1990 book on vervet monkeys, “How Monkeys See the World,” in which they showed how much about the animals’ mental processes could be deduced from careful experiments.

When a baby vervet’s call is played to three females, for instance, the mother looks to the source of the sound. The two others look to the mother, evidence that vervets know whose baby is whose.

An experiment like this — recording the sounds, waiting until the animals are in the right place and performing numerous controls — can take months to complete, but the results are widely admired by other biologists. “Any work of Dorothy and Robert’s is going to be as good as you get in the field,” said Robert M. Sapolsky, a Stanford biologist and an author who has studied baboons in the wild for many years.

“There is no one else in the area of animal behavior who does such incredibly interesting experiments in the field,” said Marc Hauser, a biologist at Harvard who was their first student.

Dr. Cheney and Dr. Seyfarth have summed up their new cycle of research in a book titled, after Darwin’s comment, “Baboon Metaphysics.” Their conclusion, based on many painstaking experiments, is that baboons’ minds are specialized for social interaction, for understanding the structure of their complex society and for navigating their way within it.

The shaper of a baboon’s mind is natural selection. Those with the best social skills leave the most offspring.

“Monkey society is governed by the same two general rules that governed the behavior of women in so many 19th-century novels,” Dr. Cheney and Dr. Seyfarth write. “Stay loyal to your relatives (though perhaps at a distance, if they are an impediment), but also try to ingratiate yourself with the members of high-ranking families.”

Baboon society revolves around mother-daughter lines of descent. Eight or nine matrilines are in a troop, each with a rank order. This hierarchy can remain stable for generations.

By contrast, the male hierarchy, which consists mostly of baboons born in other troops, is always changing as males fight among themselves and with new arrivals.

Rank among female baboons is hereditary, with a daughter assuming her mother’s rank.

News of that fact gave great satisfaction to a member of the British royal family, Princess Michael of Kent. She visited Dr. Cheney and Dr. Seyfarth in Botswana, remarking to them, they report: “I always knew that when people who aren’t like us claim that hereditary rank is not part of human nature, they must be wrong. Now you’ve given me evolutionary proof!”


Baboons live with danger on every side. Many fall prey to lions, leopards, pythons and the crocodiles that in the wet season stalk the fords where baboons cross from one island to another. Baboon watchers are subject to the same hazards. Dr. Cheney and Dr. Seyfarth say their rules are not to work alone or to wade into water deeper than knee high. They often find themselves sitting in a tree with baboons waiting out a lion below. But going into New York is more petrifying, they contend, than dodging Botswana’s predators.

The baboons will bark to warn of lions and leopards, but pay no attention to some other species dangerous to humans like buffalo and elephant. On two occasions, baboons have attacked animals, a leopard and a honey badger, that threatened their human companions. “We haven’t lost any post-docs,” Dr. Seyfarth said.

For female baboons, another constant worry besides predation is infanticide. Their babies are put in peril at each of the frequent upheavals in the male hierarchy. The reason is that new alpha males enjoy brief reigns, seven to eight months on average, and find at first that the droits de seigneur they had anticipated are distinctly unpromising. Most of the females are not sexually receptive because they are pregnant or nurturing unweaned children.

An unpleasant fact of baboon life is that the alpha male can make mothers re-enter their reproductive cycles, and boost his prospects of fatherhood, by killing their infants. The mothers can secure some protection for their babies by forming close bonds with other females and with male friends, particularly those who were alpha when their children were conceived and who may be the father. Still, more than half of all deaths among baby baboons are from infanticide.

So important are these social skills that it is females with the best social networks, not those most senior in the hierarchy, who leave the most offspring.

Although the baboon and human lines of descent split apart some 30 million years ago, the species have much in common. Both are primates whose ancestors came down from the trees and learned to survive on the ground in large social groups. The baboon mind may therefore shed considerable light on the early stages of the evolution of the human mind.

In some of their playback experiments, Dr. Cheney and Dr. Seyfarth have tested baboons’ knowledge of where everyone stands in the hierarchy. In a typical interaction, a dominant baboon gives a threat grunt, and its inferior screams. From their library of recorded baboon sounds, the researchers can fabricate a sequence in which an inferior baboon’s threat grunt is followed by a superior’s scream.

Baboons pay little attention when a normal interaction is played to them but show surprise when they hear the fabricated sequence implying their social world has been turned upside down.

This simple reaction says a lot about what is going on in the baboon’s mind. That the animal can construe “A dominates B,” and distinguish it from “B dominates A,” means it must be able to break a stream of sounds down into separate elements, recognize the meaning of each, and combine the meanings into a sentence-like thought.

“That’s what we do when we parse a sentence,” Dr. Seyfarth said. Human language seems unique because no other species is capable of anything like speech. But when it comes to perceiving and deconstructing sounds, as opposed to making them, baboons’ ability seems much more language-like.
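The logic of the playback experiment can be written down directly. This is a toy sketch, not the researchers' procedure: the numeric ranks are invented for illustration (the article states only that Cassius outranks Royal), and real baboons of course evaluate voices, not lookup tables.

```python
# Toy sketch of the playback-experiment logic: a threat-grunt followed
# by a scream is "expected" only if the grunter outranks the screamer.
# The numeric ranks below are invented for illustration.

ranks = {"Cassius": 1, "Royal": 2, "Jackalberry": 3}  # 1 = most dominant

def sequence_is_expected(threat_grunter, screamer):
    """True if the caller who threat-grunts outranks the one who screams."""
    return ranks[threat_grunter] < ranks[screamer]

print(sequence_is_expected("Cassius", "Royal"))  # True: normal interaction
print(sequence_is_expected("Royal", "Cassius"))  # False: fabricated, hierarchy-violating
```

Evaluating the second, fabricated sequence requires exactly the steps the researchers attribute to the baboon: identifying each caller, recalling each one's rank, and comparing the two, which is why surprise at the violation is informative.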

Assuming that early humans inherited the same ability from their joint ancestor with baboons, then when humans first started to combine sounds in the beginning of spoken language, “their listeners were all ready to perceive them,” Dr. Seyfarth said.

Baboons may be good at perceiving and thinking in a combinative way, but their vocal output consists of single sounds that are never combined, like greeting grunts, the females’ sexual whoop and the males’ competitive “wahoo!” cry. Why did language, expressed in combinations of sounds, evolve in humans but not in baboons?

A possible key to the puzzle lies in what animal psychologists call theory of mind, the ability to infer what another animal does or does not know. Baboons seem to have a very feeble theory of mind. When they cross from one island to another, ever fearful of crocodiles, the adults will often go first, leaving the juveniles fretting at the water’s edge. However much the young baboons call, their mothers never come back to help, as if unable to divine their children’s predicament.

But people have a very strong ability to recognize the mental states of others, and this could have prompted a desire to communicate that drove the evolution of language. “If I know you don’t know something, I am highly motivated to communicate it,” Dr. Seyfarth said.

It is far from clear why humans acquired a strong theory of mind faculty and baboons did not. Another difference between the two species is brain size. Some biologists have suggested that the demands of social living were the evolutionary pressure that enhanced the size of the brain. But the largest brains occur in chimpanzees and humans, who live in smaller groups than baboons.

But both chimps and humans use tools. Possibly social life drove the evolution of the primate brain to a certain point, and the stimulus of tool use then took over. Use of tools would have spurred communication, as the owner of a tool explained to others how to use it. But that requires a theory of mind, and Dr. Cheney and Dr. Seyfarth are skeptical of claims that chimpanzees have a theory of mind, in part because the experiments supporting that position have been conducted on captive chimps. “It’s bewildering to us that none of the people who study ape cognition have been motivated to study wild chimpanzees,” Dr. Cheney said.

“Baboons provide you with an example of what sort of social and cognitive complexity is possible in the absence of language and a theory of mind,” she said. “The selective forces that gave rise to our large brains and our full-blown theory of mind remain mysterious, at least to us.”
Power User
Posts: 42482

« Reply #27 on: November 20, 2007, 06:08:57 AM »

Despite flash, males are simple creatures

Females evolve slower, but it's because they're more complex

By Jeanna Bryner
updated 11:06 a.m. ET, Mon., Nov. 19, 2007

The secret to why male organisms evolve faster than their female counterparts comes down to this: Males are simple creatures.

In nearly all species, males seem to ramp up glitzier garbs, more graceful dance moves and more melodic warbles in a never-ending competition to woo the best mates. Called sexual selection, the result is typically a showy male and a plain-Jane female. Evolution speeds along in the males compared to females.

The idea that males evolve more quickly than females has been around since 19th century biologist Charles Darwin observed the majesty of a peacock’s tail feather in comparison with those of the drab peahen.

How and why males exist in evolutionary overdrive despite carrying essentially the same genes as females has long puzzled scientists.
New research on fruit flies, detailed online last week in the journal Proceedings of the National Academy of Sciences, finds males have fewer genetic obstacles to prevent them from responding quickly to selection pressures in their environments.

"It’s because males are simpler," said lead author Marta Wayne, a zoologist at the University of Florida in Gainesville. "The mode of inheritance in males involves simpler genetic architecture that does not include as many interactions between genes as could be involved in female inheritance."

The finding could also shed light on why diseases show up differently in men and women.

Complicated chromosomes
Wayne and her colleagues examined more than 8,500 genes shared by both sexes of the fruit fly Drosophila melanogaster. Of those genes, about 7,600 are expressed differently in males and females, doing different jobs in each sex.

The flies were identical genetically, except for their sex chromosomes.
In flies and humans, thousands of genes made up of DNA are packaged into tiny units called chromosomes. Humans inherit one set of 23 chromosomes from each parent, for 46 in all; fruit flies inherit a set of just four from each parent. One pair is the sex chromosomes: females carry two X chromosomes (XX), while males carry an X and a Y (XY).

Many genes are found on the X chromosome, whereas few are associated with the Y chromosome. For female fruit flies, the X-chromosome genes can come in two flavors called alleles that not only interact with each other but also with other genes.

For instance, if one allele is dominant over the other, that allele would get "expressed" while the recessive allele would stay hidden. Though under cover, the recessive allele kind of hitches a ride on the X chromosome and can be passed on to future generations.

That's not the case with males.
"We find direct evidence that the expression of the genes on the X has this covering behavior in females whereas in males they're out in the open," said study team member Lauren McIntyre, also of UF.

Males only have one X chromosome, so what you see is what you get. If that particular gene gives the male a boost in terms of sexual selection, say a gene responsible for fluffier feathers, the gene would be selected for in the game of natural selection over successive generations. But if the gene is no good for males, it would get selected against over time.

"Having one X means your genes are more open to selection in males," UF researcher Marina Telonis-Scott said in a telephone interview. "So in a female if you have a recessive allele that confers a sickness, it can be concealed within the two X's but if you've only got one, such as the male, you're more open to selection."

And the reason males are genetic simpletons, it turns out, is sex. The researchers suggest this uncomplicated (compared with females) genetic pathway allows males to respond at the drop of a hat to the pressures of sexual selection. That way they can win females, produce more offspring and start the cycle over again.
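The "covering" effect described above can be made concrete with a standard textbook model of allele-frequency change. This is a deterministic toy sketch, not an analysis from the study: the selection coefficient and starting frequency are arbitrary illustrative choices.

```python
# Toy model of the "covering" idea: a harmful recessive allele hides in
# heterozygous females (two X's) but is fully exposed in hemizygous
# males (one X), so selection removes it faster when exposed.
# The parameters s and q0 are arbitrary illustrative choices.

s = 0.2    # fitness cost paid when the allele is expressed
q0 = 0.5   # starting frequency of the harmful allele
GENS = 50

def step_hidden(q):
    # Diploid, recessive: only aa homozygotes (frequency q^2) pay the cost.
    return q * (1 - s * q) / (1 - s * q * q)

def step_exposed(q):
    # Hemizygous: every carrier (frequency q) pays the cost.
    return q * (1 - s) / (1 - s * q)

q_hidden, q_exposed = q0, q0
for _ in range(GENS):
    q_hidden = step_hidden(q_hidden)
    q_exposed = step_exposed(q_exposed)

print(f"after {GENS} generations: hidden q = {q_hidden:.3f}, exposed q = {q_exposed:.3f}")
```

Running the loop shows the exposed allele driven toward zero far faster than the hidden one, which is the sense in which having a single X leaves a male's genes "more open to selection."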

While not as prominent a trend, they also found a similar pattern in so-called autosomal genes, which are those found on any chromosome save the sex chromosomes. Many of the fruit-fly autosomal genes, however, did work in concert with genes located on the X chromosome.

Human implications
The "elephant lurking in these results," of course, is how they would apply to men and women.

The researchers caution the results don't directly translate to humans. "The X function is thought to be quite different in flies than humans," McIntyre told LiveScience. In humans, one of the X chromosomes gets inactivated in females, though research is finding this inactivation isn't always absolute.

However, the results could help explain differences in symptoms and responses to diseases in men and women, the authors say. Sexual selection does occur in humans, they note. In addition, fruit flies and humans share an evolutionary history, the authors point out, which is why we share more than 65 percent of our genes with the tiny insects.

"If we see a mechanism in flies it may also be true in everything that shares that evolutionary history," McIntyre said.
On a basic level, the genetic machinery works in a similar manner in flies and us.

"There's a health aspect in figuring out differences in gene expression between the sexes," Wayne said. "To make a male or a female, even in a fly, it's all about turning things on — either in different places or different amounts or at different times — because we all basically have the same starting set of genes."
Power User
Posts: 42482

« Reply #28 on: January 22, 2008, 09:16:22 AM »

NY Times

As the candidates have shown us in the succulent telenovela that is the 2008 presidential race, there are many ways to parry for political power. You can go tough and steely in an orange hunter’s jacket, or touchy-feely with a Kleenex packet. You can ally yourself with an alpha male like Chuck Norris, befriend an alpha female like Oprah Winfrey or split the difference and campaign with your mother. You can seek the measured endorsement of the town elders or the restless energy of the young, showily handle strange infants or furtively slam your opponents.

Just as there are myriad strategies open to the human political animal with White House ambitions, so there are a number of nonhuman animals that behave like textbook politicians. Researchers who study highly gregarious and relatively brainy species like rhesus monkeys, baboons, dolphins, sperm whales, elephants and wolves have lately uncovered evidence that the creatures engage in extraordinarily sophisticated forms of politicking, often across large and far-flung social networks.

Male dolphins, for example, organize themselves into at least three nested tiers of friends and accomplices, said Richard C. Connor of the University of Massachusetts at Dartmouth, rather like the way human societies are constructed of small kin groups allied into larger tribes allied into still larger nation-states. The dolphins maintain their alliances through elaborately synchronized twists, leaps and spins like Blue Angel pilots blazing their acrobatic fraternity on high.

Among elephants, it is the females who are the born politicians, cultivating robust and lifelong social ties with at least 100 other elephants, a task made easier by their power to communicate infrasonically across miles of savanna floor. Wolves, it seems, leaven their otherwise strongly hierarchical society with occasional displays of populist umbrage, and if a pack leader proves a too-snappish tyrant, subordinate wolves will collude to overthrow the top cur.

Wherever animals must pool their talents and numbers into cohesive social groups, scientists said, the better to protect against predators, defend or enlarge choice real estate or acquire mates, the stage will be set for the appearance of political skills — the ability to please and placate, manipulate and intimidate, trade favors and scratch backs or, better yet, pluck those backs free of botflies and ticks.

Over time, the demands of a social animal’s social life may come to swamp all other selective pressures in the environment, possibly serving as the dominant spur for the evolution of ever-bigger vote-tracking brains. And though we humans may vaguely disapprove of our political impulses and harbor “Fountainhead” fantasies of pulling free in full glory from the nattering tribe, in fact for us and other highly social species there is no turning back. A lone wolf is a weak wolf, a failure, with no chance it will thrive.

Dario Maestripieri, a primatologist at the University of Chicago, has observed a similar dilemma in humans and the rhesus monkeys he studies.

“The paradox of a highly social species like rhesus monkeys and humans is that our complex sociality is the reason for our success, but it’s also the source of our greatest troubles,” he said. “Throughout human history, you see that the worst problems for people almost always come from other people, and it’s the same for the monkeys. You can put them anywhere, but their main problem is always going to be other rhesus monkeys.”

As Dr. Maestripieri sees it, rhesus monkeys embody the concept “Machiavellian” (and he accordingly named his recent popular book about the macaques “Macachiavellian Intelligence”).

“Individuals don’t fight for food, space or resources,” Dr. Maestripieri explained. “They fight for power.” With power and status, he added, “they’ll have control over everything else.”

Rhesus monkeys, midsize omnivores with ruddy brown fur, long bearded faces and disturbingly humanlike ears, are found throughout Asia, including in many cities, where they, like everybody else, enjoy harassing the tourists. The monkeys typically live in groups of 30 or so, a majority of them genetically related females and their dependent offspring.

A female monkey’s status is usually determined by her mother’s status. Male adults, as the ones who enter the group from the outside, must establish their social positions from scratch, through bluff and bite, the baring of canines and, most importantly, the rallying of their bases.

“Fighting is never something that occurs between two individuals,” Dr. Maestripieri said. “Others get involved all the time, and your chances of success depend on how many allies you have, how wide is your network of support.”

Monkeys cultivate relationships by sitting close to their friends, grooming them at every possible opportunity and going to their aid — at least, when the photo op is right. “Rhesus males are quintessential opportunists,” Dr. Maestripieri said. “They pretend they’re helping others, but they only help adults, not infants. They only help those who are higher in rank than they are, not lower. They intervene in fights where they know they’re going to win anyway and where the risk of being injured is small.”

In sum, he said, “they try to gain maximal benefits at minimal cost, and that’s a strategy that seems to work” in advancing status.

Not all male primates pursue power by appealing to the gents. Among olive baboons, for example, a young male adult who has left his natal home and seeks to be elected into a new baboon group begins by making friendly overtures toward a resident female who is not in estrous at the moment and hence not being contested by other males of the troop.

“If the male is successful in forming a friendship with a female, that gives him an opening with her relatives and allows him to work his way into the whole female network,” said Barbara Smuts, a biologist at the University of Michigan. “In olive baboons, friendships with females can be much more important than political alliances with other males.”

Because males are often the so-called dispersing sex, while females stay behind in the support network of their female kin, females form the political backbone among many social mammals; the longer-lived the species, the denser and more richly articulated that backbone is likely to be.

With life spans rivaling ours, elephants are proving to possess some of the most elaborate social networks yet observed, and their memories for far-flung friends and relations are well in line with the species’ reputation. Elephant society is organized as a matriarchy, said George Wittemyer, an elephant expert at the University of California, Berkeley, with a given core group of maybe 10 elephants led by the eldest resident female. That core group is together virtually all the time, traveling over considerable distances, stopping to dig water holes, looking for fresh foliage to uproot and devour.

“They’re constantly making decisions, debating among themselves, over food, water and security,” Dr. Wittemyer said. “You can see it in the field. You can hear them vocally disagree.” Typically, the matriarch has the final say, and the others abide by her decision. If a faction disagrees strongly enough and wants to try a different approach, “the group will split up and meet back again later,” said Dr. Wittemyer.

Age has its privileges, he said, and the older females, even if they are not the biggest, will often get the best spots to sleep and the best food to eat. But it also has its responsibilities, and a matriarch is often the one to lead the charge in the face of conflicts with other elephants or predatory threats, sometimes to lethal effect.

Hal Whitehead of Dalhousie University and his colleagues have found surprising parallels between the elephant and another mammoth mammal, the sperm whale, possessor of the largest brain, in absolute terms, that the world has ever known. As with elephants, sperm whale society is sexually segregated, the females clustering in oceanic neighborhoods 40 degrees north or south of the Equator, and the males preferring waters around the poles.

As with elephants, the core social unit is a clan of some 10 or 12 females and their offspring. Sperm whales also are highly vocal. They communicate with one another using a Morse code-like pattern of clicks. Each clan, Dr. Whitehead said, has a distinctive click dialect that the members use to identify one another and that adults pass to the young. In other words, he said, “It looks like they have a form of culture.”

Nobody knows what the whales may have to click and clack about, but it could be a form of voting — time to stop here and synchronously dive down in search of deep water squid, now time to resurface, move on, dive again. Clans also seem to caucus on which males they like and will mate with more or less as a group and which ones they will collectively spurn. By all appearances, female sperm whales are terrible size queens. Over the generations, they have consistently voted in favor of enhanced male mass. Their dream candidate nowadays is some fellow named Moby, and he’s three times their size.


« Reply #29 on: May 21, 2008, 06:53:22 AM »

Civilization and the Texas Cult
May 21, 2008; Page A17

The desperate tragedy involving polygamous cultists in Texas has attracted a growing phalanx of lawyers, judges, law enforcers and assorted psychologists.

Those responsible for coping with this astonishing disaster would be well-advised to add a primatologist to the team. The fact is that, despite all the blather about faith and freedom of religion, the men operating the various compounds in question are behaving in virtually the same manner as countless dominant males in countless primate troops observed over the years.

The essence of the case is that the men who control the politics of the group (as well as the hapless women and children who live there) have used junk theology about heaven, hell, paradise and salvation to maintain their unquestioned access to all females of reproductive age (or younger).

That's the reproductive fantasy of any adult male primate.

In this blow to simple decency, the Texas polygamists are not pathfinders. Multiple wives are of course permitted in the Islamic religion, and co-wives are a feature of dozens of human groups in which powerful men control sufficient resources to be able to support more than one woman.

This is usually because the societies in which they live are sharply unequal. Sex and offspring flow to those with resources.

One of the triumphs of Western arrangements is the institution of monogamy, which has in principle made it possible for each male and female to enjoy a plausible shot at the reproductive outcome which all the apparatus of nature demands. Even Karl Marx did not fully appreciate the immense radicalism of this form of equity.

The Texans' faith-flaunting is morally disgraceful and crudely cynical. It also raises bewildering questions about human gullibility on one hand and the efficacy of the Big Lie on the other.

Can anyone really believe that the notorious communal bed to which senior men command 16-year-old girls is part of some holy temple apparatus? Apparently some people do, and the few escapees from the fetid zoo have testified to the power the ridiculous theory wields.

The victims are not only young women but young men too. They are reproductively and productively disenfranchised, and are in effect forced to leave the communities to become hopeless, ill-schooled misfits in the towns of normal life. No dignified lives as celibate monks with colorful costumes for them.

Again, the issue is cross-cultural. Osama bin Laden has at least five wives, which means that four young men of his tribe have no date on Saturday night and forever. They may become willing jihadists, or desperate suicides eager to soothe their god by killing infidels and Americans.

Elsewhere, preference for sons has meant a sharp shortage of women in China. It is known that raiding parties from there cross into bordering countries with more regular sex ratios to steal women.

The deranged cults have been operating in plain sight for years in Texan communities whose police forces have been earnestly writing parking tickets while ignoring what is obvious major criminality. Some 400 young children have been drastically separated from their mothers – who among other derogations of civil life are allegedly part of longstanding welfare fraud engineered by their sexual tyrants.

And now what? It will be intensely depressing but probably useful to acknowledge this is at bottom a natural matter, a product of our inner behavioral nature. Understanding the shadowy sources of this nightmare may help our community cope with its victims.

Mr. Tiger teaches anthropology at Rutgers and is the author of "The Decline of Males" (St. Martin's, 2000).

« Reply #30 on: August 20, 2008, 09:47:55 AM »

A rounder face 'means men are more aggressive'
By Roger Highfield, Science Editor
Last Updated: 12:01am BST 20/08/2008

Men with round faces tend to be more aggressive, a study of sportsmen has shown.

The male sex hormone testosterone makes faces more circular and now scientists have studied whether this characteristic is also linked to behaviour.

A Canadian team studied 90 ice hockey players and found the rounder the face, the more aggressive the players.

For male varsity and professional hockey players, the facial ratio was linked in a statistically significant way with the number of penalty minutes per game, report Justin Carre and Prof Cheryl McCormick of Brock University, Ontario.

The penalties were incurred by players for violent acts including slashing, elbowing, checking from behind, fighting and so on.

However, there was not a link between facial shape and aggression in women.

"The facial structure of a man provides an indication of how aggressive he will be in a competitive situation," says Prof McCormick.

"Therefore, we are able to predict, with some accuracy, the behaviour of men on the basis of their facial features.

"If men's faces are providing cues as to their potential for aggression, then people are probably picking up on this cue, likely on a subconscious level."

The findings, published in Proceedings of the Royal Society B: Biological Sciences, suggest that the shape of the face may have been honed by evolution as a marker of the propensity for aggressive behaviour: ancestors who did not pick up this warning sign could have found out to their cost that they were dealing with a more volatile and violent person.

By one theory, testosterone is responsible for the development of rugged looks, a jutting jaw and brow, a deep voice and other trappings of masculinity, but it also damps down the body's protective immune system, so only high-quality men (that is, those with healthy 'good genes') can afford to display these macho characteristics.

But the hormone affects more than appearance, and a range of earlier work has shown that testosterone levels affect behaviour beyond aggression.

For example, women's judgements of how interested a man was in infants, based on his face alone, predicted his actual interest in infants; related work has found that more feminised faces are judged more trustworthy.

People also show some accuracy at identifying 'cheaters' from their looks in an idealised game of cooperation. "Together, these findings suggest that people can make accurate inferences about others' personality traits and behavioural dispositions based on certain signals conveyed by the face," say the researchers.

However, there is a long and fraught history of attempting to read a personality from the way someone looks.

The Crime Museum at Scotland Yard in central London has more than 30 casts made of the heads of those hanged for murder at Newgate prison during the 19th century to provide evidence to back the then "scientific" theory of phrenology, which said that character and criminality could be determined by the shape of a person's head.

Phrenologists believed that the brain had different "brain organs" which represented a person's personality traits.

These were thought to be proportional to a person's propensities, as reflected by "bumps" in the skull. This work, now written off as pseudoscience, was used to back the idea that some people are "born criminal" and could be identified.

But today's study shows that there may be a bit more to looks than we thought. "Given that people readily make judgements of others based on their looks, and that we have evidence that the face may actually be providing relevant information, it will be fascinating to see if people's judgements of faces are accurate," says Prof McCormick.

"Although we naturally wince a bit at the comparison to phrenology, the comparison is certainly one that has crossed our minds."

« Reply #31 on: August 25, 2008, 09:45:35 AM »

What's Sexier than Public Policy?

Ronald Bailey | August 25, 2008, 10:09am

Sex, of course. That's why newspapers obsess over Sen. Larry Craig's (R-ID) (alleged) public bathroom romances but not his position on the Medicare prescription drug benefit program, which is costing taxpayers billions. And why CNN ran 24/7 coverage of Gov. Eliot Spitzer's (D-NY) high-cost hotel dalliances, but not his serial abuses of prosecutorial discretion.

The Washington Post's Shankar Vedantam, in his always interesting Department of Human Behavior feature, delves into the question of why media tend to focus on sex over policy. Evolutionary psychologists argue that, thanks to our evolutionary biology, gossip is what interests readers, listeners, and viewers. As Vedantam explains:

[University of Guelph in Ontario psychologist Hank] Davis and other evolutionary psychologists argue that the reason John Edwards's adultery has more zing in our heads than a dry policy dispute that could cost taxpayers billions of dollars is that the human brain evolved in a period where there were significant survival advantages to finding out the secrets of others. Since humans lived in small groups, the things you learned about other people's character could tell you whom to trust when you were in a tight spot.

"We are continuing to navigate through the modern world with a Stone Age mind," Davis said.

In the Pleistocene era, he added, there was no survival value in being able to decipher a health-care initiative, but there was significant value in information about "who needs a favor, who is in a position to offer one, who is trustworthy, who is a liar, who is available sexually, who is under the protection of a jealous partner, who is likely to abandon a family, who poses a threat to us."

We may consciously know that we are no longer living in small hunter-gatherer groups and that it no longer makes sense to evaluate someone like Edwards as we might a friend or intimate partner, but our reptilian brain doesn't realize this. Our prefrontal cortex might reason that a man who cheats on his wife while she is fighting cancer could make a perfectly fine president in a complex world, but the visceral distaste people feel about Edwards stems from there being an ancient part of the human brain that says, "Gee, I don't want to get mixed up with this guy, because even in my hour of greatest need I might not be able to count on him," said Frank T. McAndrew, an evolutionary social psychologist at Knox College in Illinois.

Most Americans, of course, will never have any personal interaction with the people they elect president. Nonetheless, if the evolutionary psychologists are correct, people will tend to choose leaders they can relate to personally -- and reject the leaders with whom they cannot see having a personal relationship.

"The human brain does not have any special module for evaluating welfare policy or immigration policy, but it has modules for evaluating people on the basis of character," said Satoshi Kanazawa, an evolutionary psychologist at the London School of Economics. "That is probably why we have this gut reaction to affairs and marriages and lying. All of those things existed in the ancestral environment 100,000 years ago."

Whole Vedantam feature here.
Power User
Posts: 100

« Reply #32 on: August 25, 2008, 12:09:21 PM »

Regarding the face width = aggression piece above, here is the actual paper:

I have lots of questions about studies like this, especially when they get reported in the popular press, which often plays up the sensational aspects of the research.

The test groups were actually pretty homogeneous (hockey players, mostly of the same race). This is good for the experiment because it controls for such differences. But how well does this correlation hold outside of the test group? Some cultural and racial groups have rounder faces than others. Are there similar findings among those groups?

Correlation does not imply causation.  Just because two things vary with one another does not necessarily mean much.  Here is an explanation of the method used in this study:

"The main result of a correlation is called the correlation coefficient (or "r"). It ranges from -1.0 to +1.0. The closer r is to +1 or -1, the more closely the two variables are related.

If r is close to 0, it means there is no relationship between the variables. If r is positive, it means that as one variable gets larger the other gets larger. If r is negative it means that as one gets larger, the other gets smaller (often called an "inverse" correlation).

While correlation coefficients are normally reported as r = (a value between -1 and +1), squaring them makes them easier to understand. The square of the coefficient (or r square) is equal to the percent of the variation in one variable that is related to the variation in the other. After squaring r, ignore the decimal point. An r of .5 means 25% of the variation is related (.5 squared = .25). An r value of .7 means 49% of the variance is related (.7 squared = .49).

A correlation report can also show a second result of each test - statistical significance. In this case, the significance level will tell you how likely it is that the correlations reported may be due to chance in the form of random sampling error. If you are working with small sample sizes, choose a report format that includes the significance level. This format also reports the sample size.

A key thing to remember when working with correlations is never to assume a correlation means that a change in one variable causes a change in another. Sales of personal computers and athletic shoes have both risen strongly in the last several years and there is a high correlation between them, but you cannot assume that buying computers causes people to buy athletic shoes (or vice versa)."  from

Just because someone has a round face does not mean he will be violent. Making that assumption would place a whole bunch of people in a category that is not accurate for them. There are stronger correlations out there that mean more (and those generate incorrect presumptions too).

So they found a mild correlation between face shape and the number of penalties accrued. The next question is: is it significant or important in any way?

Are the guys with the round faces more prone to dastardly deeds or are they the "heroes" of the hockey team?  Lots of times the accruers of penalties are the enforcers.  Hockey by design is a rough game. Maybe these guys are actually the ones doing the job that is expected of them.

The other part of the study tried to predict the number of penalties from a questionnaire meant to indicate "trait dominance":

"Participants completed a 10-item questionnaire assessing trait dominance (International Personality Item Pool scales; Goldberg et al. 2006). Some examples of items include 'Like having authority over others' and 'Want to be in charge'. Responses were scored on a Likert scale ranging from -2 (very inaccurate) to +2 (very accurate), and had high reliability (Cronbach's alpha = 0.82)."
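The reliability figure in that quote, Cronbach's alpha, measures how consistently the 10 items hang together across respondents. A minimal sketch of the standard formula, alpha = (k/(k-1)) * (1 - sum of item variances / variance of total scores), using invented Likert responses (three items, six respondents, not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: one list per questionnaire item, each holding every
    respondent's score on that item.
    """
    k = len(items)

    def var(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent, summed across all items.
    totals = [sum(scores) for scores in zip(*items)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical dominance items scored on the -2..+2 scale.
responses = [
    [1, 2, 2, -1, 0, 1],
    [2, 2, 1, -2, 0, 1],
    [1, 1, 2, -1, -1, 2],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Values near 1 mean the items largely measure the same underlying trait; the 0.82 reported in the paper is conventionally considered good reliability.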

What does it mean that this test gave an insignificant result? Why isn't it correlated with the number of penalties as well? It's a question they brought up, and there is no explanation for it.


There's nothing wrong with the paper. But scientific studies are often portrayed by the popular press as more than they are. I don't see much in this paper that would pique my interest beyond being mildly interesting. It won't make me look askance at my cheerful Polish friend with the gentle disposition and extremely wide face, other than to send him the article and innocently ask him if he has Neanderthal roots as a joke   grin



« Reply #33 on: August 26, 2008, 06:51:26 AM »

Nice post.
« Reply #34 on: September 07, 2008, 12:18:44 PM »

Jews and Their DNA

Hillel Halkin From issue: September 2008

Eight years ago, I published an article in these pages called “Wandering Jews—and Their Genes” (September 2000). At the time I was working on a book about a Tibeto-Burmese ethnic group in the northeast Indian states of Mizoram and Manipur, many of whose members believe that they descend from the biblical tribe of Manasseh, and about a group of Judaizers among them known as the B’nei Menashe, over a thousand of whom live today in Israel as converts to Judaism.

This led me to an interest in Jewish historical genetics, then a new discipline. Historical genetics itself was still a pioneering field, launched by the discovery that two sources of DNA in the human body, the Y chromosome that determines male sex and the mitochondria that aid cell metabolism, never change (barring rare mutations) in their transmission from fathers to sons and from mothers to children of both sexes. This made it possible to trace paternal and maternal lines of descent far into the past and to learn about the movements and interactions of human populations that originated hundreds, thousands, and even tens of thousands of years ago.

In my article, I observed that preliminary studies in Jewish genetics had both “shored up” and “undermined” some conventional ideas about Jewish history. On the one hand, they had indicated that there was a high degree of Y-chromosome similarity among Jewish males from all over the world, coupled with a much lower degree when the comparison was made between Jews and non-Jews in the same region. The one part of the globe in which Jews correlated as highly with many non-Jews as they did with other Jews was the Middle East—precisely what one might expect of a people that claimed to have originated in Palestine (or in Ur of the Chaldees, if you go back to Abraham) and to have spread from it.

Other studies established that the Y chromosomes of kohanim—male Jews said to descend from the priestly caste whose supposed progenitor was the biblical Aaron—had their own unique DNA signature, labeled the Cohen Modal Haplotype. Not only did half of all kohanim, who comprise about four percent of the world’s Jewish population, share this DNA configuration, but minor mutations in it pointed to a common ancestor who lived a few centuries before or after 1000 B.C.E.—that is, close to the period in which Aaron and his brother Moses are situated by biblical chronology.

Such evidence seemed to confirm traditional notions of Jewish origins. It suggested that the Jews, while certainly not a “race,” were indeed, despite the skepticism of many modern historians, the highly endogamous people they had always considered themselves to be, one that had admixed with outsiders relatively little during long centuries of wandering in the Diaspora. It also strengthened the reliability of the Bible as a historical source. Modern critics who contended that the Bible was a late document that imagined a largely non-existent past had always singled out the priestly codes of the Pentateuch as a prime illustration of this. But if the priesthood was really an institution going back to early Israelite history, rather than the backward projection in time of later generations, revisionist Bible criticism itself needed to be revised.

Yet there was contrary evidence, too. Early studies of mitochondrial DNA reported that Jewish women, unlike Jewish men, did not correlate well with one another globally. Furthermore, the greatest demographic mystery of Jewish history—that of the origins of the Ashkenazi population of Central and Eastern Europe—had only appeared to deepen.

The standard Jewish version of these origins was that Ashkenazi Jewry had first crystallized in the late first millennium of the Christian era in the French-German borderland along the Rhine; that it had reached the Rhineland from southern France, to which it had come in earlier centuries either directly from Palestine or via Italy and Spain; and that it had then migrated eastward and northward into Central and Eastern Europe.

Even before the advent of historical genetics, however, this account had been challenged. There were linguists who argued that East European Yiddish, the Germanic language of most Ashkenazi Jews, had more in common with the dialects of southern and southeastern Germany than with those of the Rhineland in the west. There were demographers who contended that the Jewish population of the Rhineland prior to the appearance of East European Jewry, which would eventually become the world’s largest Jewish community, was too small to account for the latter’s rapid growth.

The early genetic findings appeared to support the challengers. If the Rhineland theory was correct, Ashkenazi DNA should have had greater affinities with non-Jewish DNA from northern France and western Germany than with non-Jewish DNA from elsewhere; no one denied, after all, that wherever and whenever Jews had lived, some Gentiles must have joined them or begotten children with them. Yet there was no sign of this. Where, then, had Ashkenazi Jewry come from?

It was a mixed picture. Since then, eight years have gone by, historical genetics has greatly refined its methods and taxonomy, and several major new studies in Jewish genetic history have been published. What, viewed from their perspective, does Jewish history look like now?



Two new books address this question. One, David B. Goldstein’s Jacob’s Legacy: A Genetic View of Jewish History, is the work of a scientist who teaches at Duke University and has been personally involved in much Jewish genetic research.1 The other, Jon Entine’s Abraham’s Children: Race, Identity, and the DNA of the Chosen People, is by a layman and journalist.2 Yet since Entine has done a serious and responsible job of reporting, and Goldstein has written a non-technical survey for the general reader, the difference between them is one more of style than of substance. They agree on most major points, starting with the puzzling disparity in the distribution patterns of Jewish Y-chromosome and mitochondrial DNA.

The fact of this disparity is now solidly established. There is no doubt that statistically (and only statistically: it is important to keep in mind that any randomly chosen Jewish individual may prove an exception to the rule), Jewish males with antecedents in such widely separated places as Yemen, Georgia, and Bukhara in Central Asia are far more likely to share similar Y-chromosome DNA with one another than with Yemenite, Georgian, or Bukharan non-Jews. Jewish females from the same backgrounds, on the other hand, yield opposite results: their mitochondrial DNA has markedly less resemblance to that of Jewish women from elsewhere than it does to that of non-Jewish women in the countries their families hailed from. The main difference between them and these Gentile women is that their mitochondrial DNA is less varied—that is, they descend from a small number of maternal ancestors. Geneticists call such a phenomenon, in which a sizable population has developed from a very small number of progenitors, a “founder” or “bottleneck” effect. (In “bottlenecks,” these few progenitors are survivors of larger groups that were drastically reduced by war, famine, plague, or other calamities.)

This calls for a new understanding of the spread of Jewish settlement in the Diaspora. Until now, it has been assumed that nearly all of the world’s Jewish communities began with the migration of cross-sections of older communities, which took their families, institutions, and practices with them and perpetuated their lives in new surroundings. Now, it would seem, as David Goldstein writes, that

[some] Jewish men . . . travel[ed] long distances to establish small Jewish communities [by themselves]. They would settle in new lands and, if unmarried, take local women for wives. The communities might [at a later date] have been augmented by additional male travelers from Jewish source populations. Once they were established, however, the barriers would go up against further input of new mitochondrial DNA, precisely because of female-defined ethnicity [i.e., the halakhic practice of determining Jewishness by the mother]; few [additional] females would be permitted to join.
Presumably, these adventurous bachelors setting out (perhaps on business ventures) for far lands could not persuade Jewish women to come with them, or else they traveled to their destinations with no intention of staying there. In the absence of rabbis to perform conversions, they married local women who, while consenting to live as Jews, were not halakhically Jewish. By halakhic standards, therefore, their descendants were not Jewish, either, even though their Jewishness was not challenged by the rabbinical authorities. Although such communities must, in their first generations, have known the truth about themselves, this does not appear to have bothered them or anyone else very much.


In a class by itself is the mitochondrial DNA of Ashkenazi women. It does not correlate closely with the DNA of non-Jewish women in Western, Central, or Eastern Europe and it has a large Middle Eastern component. Yet in their maternal lineage, Ashkenazim, too, exhibit a strong “founder effect.” Over forty percent of them, a 2005 study showed, descend from just four “founding mothers” having Middle-Eastern-profile mitochondrial DNA. Since Ashkenazi Y-chromosome DNA does not exhibit so dramatic a founder effect, one can assume that Ashkenazi Jewry, too, began with the migration of a preponderantly male group of Jews to new territories. Because these territories, however, were more contiguous with the old ones than were far-flung regions like Bukhara or Yemen, the men were more able to import wives from existing Jewish communities and less dependent on marrying local Gentiles.

But where did Ashkenazi Jewry, male and female alike, derive from if not from the Rhineland? One possibility that is more consistent with the linguistic data is that it entered southern Germany from northern Italy and pushed further north from there into the Slavic-speaking areas of Europe. Another is that Jews migrated to Slavic lands from the Byzantine Empire. These hypotheses, which are not mutually exclusive, can now claim a measure of scientific support, since the Y chromosomes of Ashkenazi Jews have more in common with those of Italians and Greeks than with those of West Europeans.

A more dramatic scenario, popularized by Arthur Koestler in his 1976 book The Thirteenth Tribe, has to do with the Khazars, a Turkish people living between the Black and Caspian Seas, whose royal house adopted Judaism (with what degree of rabbinical supervision, we have no way of knowing) in the 8th century c.e. A great deal is obscure in the history of the Khazar kingdom, which at its apogee ruled much of present-day Ukraine, and the degree of the Judaization of its population is uncertain. Yet Koestler and a small number of historians on whom he based himself were convinced that, following the destruction of this kingdom in the 11th century by its Slavic enemies, many of its Jews fled westward to form the nucleus of what was to become East European Jewry.3

The Khazar theory never had many backers in scholarly circles; there was little evidence to support it and good reasons to be dubious about it. Why, for instance, does medieval rabbinic literature almost never mention the Khazars? Why, if they spoke a Turkish language, did East European Jewry become Yiddish-speaking? “Like virtually every academic I have ever consulted on the subject,” David Goldstein writes, “I was initially quite dismissive of Koestler’s identification of the Khazars [with] Ashkenazi Jewry.” Yet, he continues, “I am no longer so sure. The Khazar connection seems no more farfetched than the spectacular continuity of the Cohen line.”

This is one of the few occasions on which Jon Entine disagrees with him. Abraham’s Children declares:

The studies of the Y-chromosome and [mitochondrial] DNA do not support the . . . notion that Jews are descended in any great numbers from the Khazars or some Slavic group, although it’s evident some Jews do have Khazarian blood. The Khazarian theory has been put to rest, or at least into perspective.


Who is right? Either could be, for the latest evidence is ambiguous. It consists of two studies. One, “Y-Chromosome Evidence for a Founder Effect in Ashkenazi Jews,” was published in 2004 in the European Journal of Human Genetics by a small team from the Hebrew University of Jerusalem. The other was the work of a larger, American-Israeli-British group to which Goldstein belonged; its report, “Multiple Origins of Ashkenazi Levites: Y-Chromosome Evidence for Both Near Eastern and European Ancestries,” appeared in the American Journal of Human Genetics in 2003. Both studies discuss a mutation, widely found in Poland, Lithuania, Belarus, and Ukraine, that occurs in a Y-chromosome classification known as Haplogroup R, at a DNA site labeled M117.

The Hebrew University study states:

Recent genetic studies . . . showed that Ashkenazi Jews are more closely related to other Jewish and Middle Eastern groups than to their host populations in Europe. However, Ashkenazim have an elevated frequency of R-M117, the dominant Y-chromosome haplogroup in Eastern Europeans, suggesting possible gene flow [into the Ashkenazi population]. In the present study of 495 Y chromosomes of Ashkenazim, 57 (11.5 percent) were found to belong to R-M117.
As for the American-Israeli-British study, it was designed to ascertain whether Levites, who functioned as priests’ assistants in the ancient Temple and supposedly descend from the tribe of Levi, have a worldwide genetic signature similar to or the same as the Cohen Modal Haplotype.4 The answer turned out to be negative, since the Y chromosomes of Levites from different geographical backgrounds proved to correlate no better with one another than they did with the Y chromosomes of non-Levitic Jews. And yet, rather astonishingly, Ashkenazi Levites, when taken separately, do have a “modal haplotype” of their own—and it is the same R-M117 mutation on which the Hebrew University study centered! Fifty-two percent of them have this mutation, which is rarely found in non-Ashkenazi Jews and has a clear non-Jewish provenance.



What is one to make of this finding? An 11.5-percent incidence of R-M117 among Ashkenazi Jews in general is easily explainable: the mutation could have entered the Jewish gene pool slowly, in small increments in every generation, during the thousand years of Ashkenazi Jewry’s existence. (This need not necessarily have been via conversion to Judaism and marriage to Jewish women. Pre- and extra-marital sexual relations, and even rape, widespread in times of anti-Jewish violence, were in all likelihood more common.) But the 52-percent rate among Levites is something else. Here we are dealing not with a gradual, long-term process (no gradual accretion could plausibly have produced such a concentration), but with a one-time event of some sort.

Such an event could obviously not have been a sudden influx of Levites into the Jewish community from a Gentile society. Both of our studies, therefore, raise the possibility that the original R-M117 Levites were Khazarian Jews who migrated westward upon the fall of the Khazar kingdom. Of course, since all or most Khazarian Jews were converts (although some may have been Jews who came from elsewhere), few could have descended from Aaron. Yet it is quite possible that some became, or were designated, “honorary” Levites in the course of the Judaization of the Khazarian population. As the American-Israeli-British study observes, Jews traditionally held to “a lesser degree of stringency for the assumption of Levite status than for the assumption of Cohen status,” so that self-declared Khazarian Levites might have fathered lineages whose Levitic pedigree came to be accepted.

But if R-M117 did enter the East European Jewish gene pool via a lineage of Khazar Levites, how many Khazars can be assumed to have joined the Ashkenazi community? At this point, it becomes pure guesswork. Analyzing the data, the American-Israeli-British study concludes that the number of R-M117 Levites absorbed by Ashkenazi Jewry ranged from one to fifty individuals. But as much as we might like to do the rest of the arithmetic ourselves, we can’t. For one thing, we have no way of knowing what the percentage of Levites in the Khazarian Jewish population was. Nor do we know the percentage of Khazars possessing M117, which is found in 12 or 13 percent of Russian and Ukrainian males today. If these were also its proportions among the Khazars, there would have been seven non-M117 Khazars joining or founding Ashkenazi Jewry for every Khazar who had the mutation.
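The seven-to-one ratio quoted above follows from simple arithmetic, and the founder range can be bracketed the same way. A minimal sketch: the 12.5-percent M117 frequency and the one-to-fifty founder counts come from the figures above, everything else is illustrative, and the sketch deliberately ignores the unknown fraction of Levites among Khazarian Jews, which is why the article's true upper bound runs to "many thousands":

```python
# Back-of-the-envelope arithmetic for the ratio discussed above.
# Assumption (from the article): M117 occurs in ~12.5% of the source
# population, as it does in Russians and Ukrainians today.
m117_freq = 0.125

# For every Khazar carrying M117, how many non-carriers?
non_carriers_per_carrier = (1 - m117_freq) / m117_freq
print(non_carriers_per_carrier)  # 7.0

# If between 1 and 50 R-M117 Levite founders were absorbed, the implied
# total Khazar migrants (carriers plus non-carriers) already spans a wide
# range, before the unknown Levite fraction widens it further:
for founders in (1, 50):
    total = founders * (1 + non_carriers_per_carrier)
    print(founders, round(total))  # 1 8, then 50 400
```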

In sum, even if the R-M117 Levites are traceable to Khazaria, the total flow of Khazarians into the East European Jewish population could have been anywhere from a single person to many thousands. If it was the latter, the Khazar input was significant, as David Goldstein suspects it was; if the former, it was trivial, as Jon Entine believes. The last eight years of research in Jewish historical genetics have not left us any wiser in this respect.

« Reply #35 on: September 07, 2008, 12:19:26 PM »

Traditional accounts of Jewish history, it would appear, are part true and part myth. Despite their dispersion in space and time, the Jews have continued to be that most curious (and in the eyes of many, preposterous) of combinations: at once a people or nation, fellow communicants in the world’s oldest monotheistic religion, and a family or tribe belonged to only by those born or married into it. They could not have remained such an amalgam had they not clung to strict rules of membership and admission.

Yet these rules were not observed everywhere or always. There were periods and places in which a blind eye was turned to them, most often when violations were not remediable. Had a rabbi arrived in Yemen or Bukhara soon after the founding of its Jewish community, he might have been able to insist on the halakhic conversion of its handful of Jews. But this would no longer have been practicable after several generations had gone by, especially since Yemenite and Bukharan Jews would have forgotten by then that their maternal progenitors were not halakhically Jewish and would have reacted with resentment to such a demand. Similarly, Khazars identifying themselves as Levites were accepted as such without inquiries into their past. It is an old rabbinic adage that one does not inflict demands on the public that the public is incapable of meeting. Better a tolerated myth than an intolerable truth.

Such, at any rate, was the attitude of a pre-modern age in which all Jews accepted rabbinic authority, so that all rabbis felt obliged to find solutions for all Jews. Since the mid-19th century, however, this has progressively ceased to be true. Rabbinic authority itself has fractured and dissipated. Most Jews no longer want rabbis to be responsible for them, and most rabbis no longer feel responsible for most Jews. The consequence of this, as reflected in the “Who Is A Jew?” debate that has racked world Jewry for the past several decades, is that the Jewish tribe is breaking up. In the United States, Orthodox rabbis do not recognize the Jewishness of converts to Reform or Conservative Judaism, Conservative rabbis do not recognize the Jewishness of children born to Jewish fathers but not to Jewish mothers, and Reform rabbis routinely preside over the marriages of Jewish men to non-Jewish women even though they may be creating future generations that they alone will consider Jewish.

In Israel, where non-Orthodox marriages and conversions cannot be performed, the problem is even more severe, for Jewishness in a Jewish state is a secular legal category as well. Israel’s Law of Return, for example, guarantees the right to immigrate and acquire Israeli citizenship to every Jew and his immediate family, including the first two generations of his descendants. Yet the more contentious the question of who is a Jew becomes, the more this law divides Jews rather than unites them.

Meanwhile, already living in Israel are hundreds of thousands of halakhically non-Jewish immigrants, most from the former Soviet Union, who entered the country under the Law of Return because they were either married to Jews or had a Jewish father or grandfather. As matters stand now, they and their children cannot have a Jewish wedding in Israel. Many of them, probably most, would like recognition as Jews, and not a few would be willing to convert in order to obtain it. But Israel’s Orthodox rabbinate has made the conversion procedure so difficult, in part by hinging it on the promise to live an Orthodox life, that most prospective converts have been deterred. Recently, perhaps for the first time in Jewish history, a conversion was retroactively annulled by the rabbinate on the grounds that such a promise was not kept.

For its part, the rabbinate insists that it has been forced to adopt more rigorous standards by the secular nature of Israeli society, which precludes the kind of “honor system” for determining Jewish identity that was operative in Jewish life in the past. Even Israelis whose Jewishness might appear to be beyond question now find themselves questioned about it.



To take a small personal example: my Israeli-born daughter, whose Israeli ID card lists her as “Jewish” and who is getting married in Israel this month, has been required to provide a letter from an Orthodox rabbi in the United States, where I and my wife were born and raised, attesting to the Orthodox ceremony in which we were wed in New York. The reasoning behind this is simple. Had we been married in Israel, this would have been considered proof of our daughter’s Jewishness, since our own Jewishness would already have been rabbinically certified. But if we were married in a non-Orthodox ceremony in the United States, we would have to bring further proof of our Jewishness since no non-Orthodox rabbi could be trusted to have vetted us properly.

And what could such further proof be? If we could find no Orthodox rabbi to speak for us, it would indeed be difficult to supply. My daughter would then have had the option of either arduously trying to assemble convincing evidence or of getting married outside of Israel (in which case her marriage would be recognized by Israeli secular law). Yet if she were to choose the second of these courses, as an increasing number of young Israelis are doing nowadays in their disinclination to deal with the rabbinate, she would in effect be choosing it for her children, too, since by the time they reached marriageable age, proof of their Jewishness would be even more difficult. In this manner, a growing public is being created in Israel that is losing its Jewish status in the eyes of rabbinic law.

The rabbinate’s position is understandable. Once, when there was no secular advantage in being Jewish, there was no reason to suspect anyone’s declaration of Jewishness; now, such avowals can no longer be taken at face value. And understandable, too, is the position of Israeli secularists who are indifferent to the rabbinate’s attitude or even welcome it.

For such secular Israelis, the idea of biological Jewishness is an embarrassing anachronism. Secular Zionism, after all, set out to normalize Jewish existence. Surely, they reason, its goal should therefore be to make Israelis a people whose identity is based, like that of other peoples, on territory, language, and culture rather than on shared blood ties. If Orthodoxy wishes to hasten this process, so much the better. Perhaps one day Israel will become the “state of all its citizens” that democratic values require it to be, a country of Hebrew-speaking Jews, Muslims, and Christians, all equal before the law. Although the great majority of secular Israelis do not yet subscribe to this point of view, more and more will come to it if things continue on their present course.

As far as much of the rest of the world is concerned, biological Jewishness has always been an embarrassing anachronism—at least ever since the time of the Roman Empire and early Christianity. For the most part, Jews have nevertheless managed to go their own unembarrassed way. The genetic record shows that they have on the whole succeeded. But this is only, the same record shows, because they have made a point in the past of not embarrassing one another. There is a lot of DNA in the Jewish people that came in, as it were, through the back door. Unless ways are found to keep this door open, the walls of the house may have to be torn down.



In 2003, a year after the publication of my book Across the Sabbath River, I became involved in a historical genetics-research project myself. I did so at the invitation of two geneticists whose names appear on many of the scientific papers mentioned in this article: Professor Karl Skorecki and Dr. Doron Behar of Rambam Hospital and the Rappaport Research Institute in Haifa. They had read my book and wanted to know how I felt about taking part in a DNA study of the Mizo and Kuki people of northeast India, the purpose of which would be to determine whether there was evidence for a “Jewish”—that is, a Middle Eastern—origin for any of them.

Both men had qualms about the matter. Unlike other genetic investigations they had participated in, this one might have practical consequences. The B’nei Menashe believe that they descend from one of the “ten lost tribes” of Israel that was driven into exile by the Assyrians in the 8th century b.c.e. This belief, which first surfaced in Mizoram and Manipur in the 1950’s, is basic to their identity. Because of it, they have chosen to live Jewish lives and to convert once they have managed to reach Israel.

In my book I had come to the unexpected conclusion that there was a kernel of historical truth in their claim, although I did not think that more than a tiny fraction of Mizos and Kukis might have distant Israelite ancestors. What would happen, Skorecki and Behar asked, when our study was published? Whatever its findings, they would be certain to disappoint the B’nei Menashe and perhaps even to undermine their sense of Jewishness. And what if these findings were seized on by those in the Israeli government who wished to shut the country’s gates to the B’nei Menashe? Did we have the moral right to take such risks?

I answered that I thought we did. (This is the only basis I can imagine for David Goldstein’s strange statement in Jacob’s Legacy that I “agitated for the Mizos to undergo DNA tests in order to vindicate their claims.”) Israel’s gates had already been shut—and, apart from briefly swinging open again in 2006-7, have remained so—and even if one did not agree that the scientific truth was worth pursuing at all costs, someone else would pursue it in this case if we didn’t. It was best for the work to be done by an Israeli team that was sensitive to the issues involved.

In the end, we went ahead. Three rounds of sampling, based on the theories in my book and involving approximately 500 people, were carried out in India in 2003, 2006, and 2007. Although Goldstein writes (on what grounds, I again don’t know) that “most” Mizos and Kukis “resisted” genetic testing, I am aware of only one case in which someone who was asked to be sampled refused to cooperate. The difficulties were of an entirely different nature, such as a suitcase full of samples that was lost for several days in Tashkent, or the fact that at a critical juncture one of our samplers was murdered for reasons having nothing to do with our study.

The final lab results are now being tabulated. They will not, so it seems, be earth-shaking. Nearly all of the samples have turned out to have typically Tibeto-Burmese DNA. Although a very few look Middle Eastern, there may be no way of absolutely ruling out other possible sources for them. After all our effort, the results are inconclusive. And in any case, as historical geneticists are fond of saying, “absence of evidence is not evidence of absence.” There are many reasons why an originally small input of DNA might not turn up in a study: its bearers may have failed to reproduce their lineage, or the sample may be too small, or a crucial population group may be missing from it.

It has not been my impression, however, that the B’nei Menashe are waiting for the results with bated breath. In the five years that have passed since the study was commenced, scarcely any of them has contacted me to ask about it, and there has been, as far as I know, little discussion of it in their community. There appears to be no reason to think that, when eventually published, it will have much of an impact on them or their fate.

This comes as a relief. Despite my assurances to Skorecki and Behar, I too had my doubts. But the B’nei Menashe are more grounded in their own beliefs than we had feared. They will stick to them regardless of what two highly professional geneticists and one sadly amateur historian say in some scientific journal.



This, I think, is as it should be. There may be a few people who can subsist on an austere regimen of all truth and no myth, and there are all too many people who live on a flabby diet of all myth and no truth. But some indeterminably proportioned combination of the two dispositions is what most of us require for our health. This is as true of societies as it is of individuals.

I myself have long suspected, starting far before I knew anything of historical genetics or Arthur Koestler’s The Thirteenth Tribe, that I have Khazar blood in me. One of my father’s sisters had distinctly slanty eyes. In one of her daughters, these are even more pronounced. The daughter’s daughter has features that could come straight from the steppes of Asia.

I rather like the idea of Khazar forefathers. Far from deconstructing my Jewishness, it romanticizes it even more. To think that my distant ancestors on the plains of Russia had the intelligence and folly to choose Judaism for their religion; that they prayed to a Jewish God as they rode into battle; that (as the historians tell us) they held back the Muslim invasion of Europe from the east and helped keep the West safe for Dante and Shakespeare. Does it make me feel that, as Arab propaganda would have it, I don’t belong in Palestine? Why should it? We Khazars threw in our lot with the Jews and the Jews embraced us. Since then, we’ve also been Jews.

And who is we? Each of us has had many thousands of forebears, and each of those had many thousands in turn. The traces of millions of human beings are in our minds, our hair, our eyes and noses, our inner organs, the shape of our toes, our trillions of cells. By pure chance, two of these many lines of descent, the strictly paternal and the strictly maternal, are passed on unchanged and can be given labels like R-M117. Instructive as they are, we needn’t make too much of them.



Hillel Halkin is a columnist for the New York Sun and a long-time contributor to COMMENTARY. His “How Not to Repair the World” appeared in our July-August issue.


1 Yale, 176 pp., $26.00.
2 Grand Central, 432 pp., $27.99.
3 An assimilationist Jew and at one time of his life an idiosyncratic Zionist, Koestler was attracted to this theory because it demonstrated, so he thought, that the Jews of the Diaspora were a “pseudo-nation” held together by “a system of traditional beliefs based on racial and historical premises which turn out to be illusory.” Either, therefore, they should emigrate to Israel or they should cease to exist. Ironically, however, Koestler’s book was soon enlisted by Arab propaganda in its war against Israel and Zionism. What claim could the Jews have to Palestine, Arab spokesmen asked, if their original ancestors came from southern Russia?
4 Constituting, like priests, about four percent of the world’s Jews, Levites can easily be identified because, again like priests, they are assigned minor tasks in Jewish ritual to this day, so that every religiously observant Levite knows he is one.

« Reply #36 on: October 01, 2008, 01:05:40 PM »

Public release date: 1-Oct-2008

Contact: Kevin Beaver
Florida State University
Study reveals specific gene in adolescent men with delinquent peers

But family environment can tip the balance for better or worse

TALLAHASSEE, Fla. -- Birds of a feather flock together, according to the old adage, and adolescent males who possess a certain type of variation in a specific gene are more likely to flock to delinquent peers, according to a landmark study led by Florida State University criminologist Kevin M. Beaver.

"This research is groundbreaking because it shows that the propensity in some adolescents to affiliate with delinquent peers is tied up in the genome," said Beaver, an assistant professor in the FSU College of Criminology and Criminal Justice.

Criminological research has long linked antisocial, drug-using and criminal behavior to delinquent peers -- in fact, belonging to such a peer group is one of the strongest correlates to both youthful and adult crime. But the study led by Beaver is the first to establish a statistically significant association between an affinity for antisocial peer groups and a particular variation (called the 10-repeat allele) of the dopamine transporter gene (DAT1).

However, the study's analysis of family, peer and DNA data from 1,816 boys in middle and high school found that the association between DAT1 and delinquent peer affiliation applied primarily for those who had both the 10-repeat allele and a high-risk family environment (one marked by a disengaged mother and an absence of maternal affection).

In contrast, adolescent males with the very same gene variation who lived in low-risk families (those with high levels of maternal engagement and warmth) showed no statistically relevant affinity for antisocial friends.

"Our research has confirmed the importance of not only the genome but also the environment," Beaver said. "With a sample comprised of 1,816 individuals, more than usual for a genetic study, we were able to document a clear link between DAT1 and delinquent peers for adolescents raised in high-risk families while finding little or no such link in those from low-risk families. As a result, we now have genuine empirical evidence that the social and family environment in an adolescent's life can either exacerbate or blunt genetic effects."

Beaver and research colleagues John Paul Wright, an associate professor and senior research fellow at the University of Cincinnati, and Matt DeLisi, an associate professor of sociology at Iowa State University, have described their novel findings in the paper "Delinquent Peer Group Formation: Evidence of a Gene X Environment Correlation," which appears in the September 2008 issue of the Journal of Genetic Psychology.

The biosocial data analyzed by Beaver and his two co-authors derived from "Add Health," an ongoing project focused on adolescent health that is administered by the University of North Carolina-Chapel Hill and funded largely by the National Institute of Child Health and Human Development. Since the program began in 1994, a total of nearly 2,800 nationally representative male and female adolescents have been genotyped and interviewed.

"We can only hypothesize why we saw the effect of DAT1 only in male adolescents from high-risk families," said Beaver, who will continue his research into the close relationship between genotype and environmental factors -- a phenomenon known in the field of behavioral genetics as the "gene X environment correlation."

"Perhaps the 10-repeat allele is triggered by constant stress or the general lack of support, whereas in low-risk households, the variation might remain inactive," he said. "Or it's possible that the 10-repeat allele increases an adolescent boy's attraction to delinquent peers regardless of family type, but parents from low-risk families are simply better able to monitor and control such genetic tendencies."

Among female adolescents who carry the 10-repeat allele, Beaver and his colleagues found no statistically significant affinity for antisocial peers, regardless of whether the girls lived in a high-risk or low-risk family environment.
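The study's design, testing the allele's association with delinquent-peer affiliation separately within high-risk and low-risk families, can be sketched as a stratified odds-ratio comparison. All counts below are hypothetical toy numbers, not the study's data; the actual analysis fit regression models to the Add Health sample.

```python
# Toy gene-by-environment comparison: cross-tabulate 10-repeat-allele
# carriers vs. non-carriers against delinquent-peer affiliation,
# separately within each family-risk stratum.

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table laid out as:
    a = carriers with delinquent peers,   b = carriers without
    c = non-carriers with delinquent peers, d = non-carriers without."""
    return (a * d) / (b * c)

# High-risk families: carriers show elevated affiliation (made-up counts).
or_high = odds_ratio(60, 40, 30, 70)

# Low-risk families: no association (made-up counts).
or_low = odds_ratio(25, 75, 25, 75)

print(round(or_high, 2))  # 3.5  -> allele matters in high-risk homes
print(round(or_low, 2))   # 1.0  -> no effect in low-risk homes
```

A gap between the two stratum-specific odds ratios is the signature of the gene X environment pattern the paper reports; a formal analysis would test the interaction term directly.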
« Reply #37 on: October 19, 2008, 08:24:36 PM »

Genetic-based Human Diseases Are An Ancient Evolutionary Legacy, Research Suggests

ScienceDaily (Oct. 19, 2008) — Tomislav Domazet-Lošo and Diethard Tautz from the Max Planck Institute for Evolutionary Biology in Plön, Germany, have systematically analysed the evolutionary point of origin of a large number of genes, including many that can trigger diseases. Their studies show for the first time that the majority of these genes were already in existence at the origin of the first cells.

The search for further disease genes, particularly those involved in diseases with several genetic causes, is thus made easier. Furthermore, the results confirm that the basic functional interconnections of disease-causing genes can also be studied in model organisms (Molecular Biology and Evolution).

The Human Genome Project, which deciphered the human genetic code, uncovered thousands of genes that, if mutated, are involved in human genetic diseases. The genomes of many other organisms were deciphered in parallel. This now allows the evolution of these disease-associated genes to be studied systematically.

For this analysis, Tomislav Domazet-Lošo and Diethard Tautz from the Max Planck Institute for Evolutionary Biology in Plön (Germany) used a novel statistical method, "phylostratigraphy," developed by Tomislav Domazet-Lošo at the Ruđer Bošković Institute in Zagreb (Croatia). The method allows the point of origin of any existing gene to be determined by tracing the last common ancestor in which that gene existed. Based on this information, it is then possible to determine the minimum age of any given gene.
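The core logic of phylostratigraphy as described here can be sketched in a few lines: order the phylostrata from oldest to youngest and assign each gene to the oldest stratum whose genomes contain a detectable homolog. The stratum names and homolog data below are hypothetical placeholders, not the authors' pipeline or dataset:

```python
# Sketch of the phylostratigraphy idea: a gene's minimum age is the
# oldest evolutionary stratum in which a homolog can still be detected.

PHYLOSTRATA = ["cellular_origin", "eukaryota", "metazoa", "vertebrata", "mammalia"]

# Hypothetical homolog-detection results per gene (in practice these
# would come from sequence-similarity searches against many genomes):
homologs = {
    "geneA": {"cellular_origin", "eukaryota", "metazoa", "vertebrata", "mammalia"},
    "geneB": {"vertebrata", "mammalia"},
}

def minimum_age_stratum(gene):
    """Return the oldest stratum with a detectable homolog of `gene`."""
    for stratum in PHYLOSTRATA:  # iterate oldest first
        if stratum in homologs[gene]:
            return stratum
    return None

print(minimum_age_stratum("geneA"))  # cellular_origin
print(minimum_age_stratum("geneB"))  # vertebrata
```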

Applying this method to disease genes, the scientists from Plön arrived at surprising findings. The vast majority of these genes trace back to the origin of the first cell. Other large groups emerged more than one billion years ago, around the first appearance of multi-cellular organisms, as well as at the time of origin of bony fishes about 400 million years ago. Surprisingly, they found almost no disease-associated genes among those that emerged after the origin of mammals.

These findings suggest that genetic diseases primarily affect ancient cellular processes, which emerged during the early stages of life on Earth. This leads to the conclusion that all living organisms today, not only humans, are affected by similar genetic diseases. It also implies that genetically caused diseases can never be eliminated completely, because they are linked to ancient evolutionary processes.

Although it was already known that many disease-associated genes also occur in organisms distantly related to humans, such as the fruitfly Drosophila or the roundworm Caenorhabditis, the analysis of Domazet-Lošo and Tautz now shows for the first time that this is systematically true for the vast majority of these genes. At present it remains unknown why more recently evolved genes, for example those involved in the emergence of the mammals, tend not to cause diseases when mutated.

The research results of the scientists from Plön also have some practical consequences. It will now be easier to identify candidates for further disease genes, in particular for those involved in multi-factorial diseases. Furthermore, the results confirm that the functional knowledge gained about such genes from remote model organisms is also relevant for understanding the genes in humans.

« Reply #38 on: October 19, 2008, 08:47:18 PM »

That's deep  shocked
« Reply #39 on: November 07, 2008, 11:47:18 AM »

'Junk' DNA Proves Functional; Helps Explain Human Differences From Other Species


According to a new study, what was previously believed to be "junk" DNA is one of the important ingredients distinguishing humans from other species. (Credit: iStockphoto)
ScienceDaily (Nov. 5, 2008) — In a paper published in Genome Research on Nov. 4, scientists at the Genome Institute of Singapore (GIS) report that what was previously believed to be "junk" DNA is one of the important ingredients distinguishing humans from other species.

More than 50 percent of human DNA has been referred to as "junk" because it consists of copies of nearly identical sequences. A major source of these repeats is internal viruses that have inserted themselves throughout the genome at various times during mammalian evolution.

Using the latest sequencing technologies, GIS researchers showed that many transcription factors, the master proteins that control the expression of other genes, bind specific repeat elements. The researchers showed that from 18 to 33% of the binding sites of five key transcription factors with important roles in cancer and stem cell biology are embedded in distinctive repeat families.

Over evolutionary time, these repeats were dispersed within different species, creating new regulatory sites throughout these genomes. Thus, the set of genes controlled by these transcription factors is likely to significantly differ from species to species and may be a major driver for evolution.

This research also shows that these repeats are anything but "junk DNA," since they provide a great source of evolutionary variability and might hold the key to some of the important physical differences that distinguish humans from all other species.

The GIS study also highlighted the functional importance of portions of the genome that are rich in repetitive sequences.

"Because a lot of the biomedical research uses model organisms such as mice and primates, it is important to have a detailed understanding of the differences between these model organisms and humans in order to explain our findings," said Guillaume Bourque, Ph.D., GIS Senior Group Leader and lead author of the Genome Research paper.

"Our research findings imply that these surveys must also include repeats, as they are likely to be the source of important differences between model organisms and humans," added Dr. Bourque. "The better our understanding of the particularities of the human genome, the better our understanding will be of diseases and their treatments."

"The findings by Dr. Bourque and his colleagues at the GIS are very exciting and represent what may be one of the major discoveries in the biology of evolution and gene regulation of the decade," said Raymond White, Ph.D., Rudi Schmid Distinguished Professor at the Department of Neurology at the University of California, San Francisco, and chair of the GIS Scientific Advisory Board.

"We have suspected for some time that one of the major ways species differ from one another – for instance, why rats differ from monkeys – is in the regulation of the expression of their genes: where are the genes expressed in the body, when during development, and how much do they respond to environmental stimuli," he added.

"What the researchers have demonstrated is that DNA segments carrying binding sites for regulatory proteins can, at times, be explosively distributed to new sites around the genome, possibly altering the activities of genes near where they locate. The means of distribution seem to be a class of genetic components called 'transposable elements' that are able to jump from one site to another at certain times in the history of the organism. The families of these transposable elements vary from species to species, as do the distributed DNA segments which bind the regulatory proteins."

Dr. White also added, "This hypothesis for formation of new species through episodic distributions of families of gene regulatory DNA sequences is a powerful one that will now guide a wealth of experiments to determine the functional relationships of these regulatory DNA sequences to the genes that are near their landing sites. I anticipate that as our knowledge of these events grows, we will begin to understand much more how and why the rat differs so dramatically from the monkey, even though they share essentially the same complement of genes and proteins."
« Reply #40 on: November 07, 2008, 11:49:02 AM »

Second Post:

'Space invader' DNA infiltrated mammalian genomes
22:00 20 October 2008 news service
Jessica Griggs

Parts of mammalian DNA are so alien they have been dubbed "space invaders" by the researchers that found them. The discovery, if confirmed, will change our understanding of evolution.

We normally get our genes "vertically" – handed down from our parents and theirs before them. Bacteria get theirs in this way too, but also "horizontally" – passed from one, unrelated individual to another.

Now biologists at the University of Texas, Arlington, have found the unexpected: horizontal gene transfer has occurred in mammals and amphibians too.

The culprit is a kind of "parasitic" DNA found in all our cells, known as a transposon. Study leader Cédric Feschotte says that what he calls "space invader" transposons jumped sideways millions of years ago into several species by piggybacking on a virus.

The transposon then assimilated itself into sex chromosomes, ensuring that it would get passed on to future generations. "It is very interesting conceptually – the idea that some parts of a mammal's DNA don't come from an ancestral species," he says.

Alien invasion

Out of 26 animal genomes, the team found a near-identical length of DNA, known as the hAT transposon, in seven species, separated by some 340 million years of evolution.

These include species as widely diverged as a bush baby, a South American opossum, an African clawed frog and a tenrec – a mammal that looks like a hedgehog, but is actually more closely related to elephants.

The fact that invasive DNA was seen in a bush baby but not in any other primates, and in a tenrec but not in elephants, hints that something more exotic than standard inheritance is going on.

However, this patchy distribution by itself does not rule out the traditional method, as some of the species could have lost the transposon DNA over the course of evolutionary history.

So the team looked at the position of the hAT transposon – if it had been inherited from a common ancestor it would have been found in the same position, with respect to other genes, in each species. But they could not find a single case of this.
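The logic of that insertion-site test can be sketched as a set intersection: under vertical inheritance from a common ancestor, at least one insertion site (identified by its flanking genes) should be shared by every species. A minimal sketch, in which the species and gene names are invented for illustration (the real analysis works on genome coordinates, not labels):

```python
# Toy version of the insertion-site test. Under vertical inheritance,
# the hAT element should sit between the same flanking genes in every
# species; under horizontal transfer, the sites are unrelated.
# All species and gene names below are invented for illustration.

insertion_sites = {
    "bush_baby": {("GENE_A", "GENE_B"), ("GENE_C", "GENE_D")},
    "tenrec":    {("GENE_E", "GENE_F"), ("GENE_G", "GENE_H")},
    "opossum":   {("GENE_I", "GENE_J")},
}

# Sites common to all species: vertical inheritance predicts at least one.
shared = set.intersection(*insertion_sites.values())

# An empty intersection mirrors what the team found: no insertion site
# shared across species, arguing against a common-ancestor origin.
```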

Since first entering the genome, the hAT has been able to reproduce dramatically – in the tenrec, 99,000 copies were found, making up a significant chunk of its DNA. Feschotte speculates that this must have had a dramatic effect on its evolutionary development.

"It's like a bombardment", he says. "It must have been evolutionarily significant because the transposon generated a huge amount of DNA after the initial transfer."

Feschotte says he expects many more reports of horizontal gene jumping. "We're talking about a paradigm shift because, until now, horizontal transfer has been seen as very rare in animal species. It's actually a lot more common than we think."

Mammal extinctions?

The team thinks that the hAT transposon invasion occurred about 30 million years ago and spread across at least two continents. "It's like a pandemic, and one that can infect species that weren't genetically or geographically close. It's puzzling, scary almost," Feschotte says.

It may not be a coincidence that the time of the invasion coincides with a period in evolutionary history that saw mass mammal extinctions. This is usually attributed to climate change, Feschotte says, but it is not crazy to suppose that this type of invasion could contribute to species extinction.

The hAT transposon does not occur in humans, but some 45% of our genome is of transposon origin.

Feschotte's work on the hAT transposon is the first time that a "jumping gene" has been shown to have entered mammalian genomes, and the first time it has been shown to do so at around the same time, in a range of unrelated species, in different parts of the world.

Feschotte admits that we cannot rule out another transposon offensive occurring in mammals, and thinks that bats are the animals most likely to be the source. For some reason, he says, they seem to be most susceptible to picking up transposons – possibly because of the viruses they carry.

'Rather scary'

"Bats are notorious reservoir species for a plethora of viruses, including some very nasty to humans like rabies, SARS and perhaps Ebola," he says.

"Since these bats are full of active DNA transposons and are frequently involved in viral spill-over, the door for the transfer of an active DNA transposon to humans seems wide open. Rather scary."

Greg Hurst, an evolutionary biologist at the University of Liverpool, UK, says that the arrival of a new transposable element can be evolutionarily significant, because new elements tend to be more active. "They will jump a fair bit more than older elements, which the resident genome will have evolved to suppress."

Most of the consequences of having a transposon jump around in your genome will be deleterious, Hurst says, but some will be advantageous. "The evolutionary life of the species could certainly hit the fast lane for a bit when it happens."

Journal reference: Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.0806548105)

« Reply #41 on: November 13, 2008, 07:25:36 PM »

Perhaps a chew toy for Crafty:

How warfare shaped human evolution

12 November 2008 by Bob Holmes

IT'S a question at the heart of what it is to be human: why do we go to war? The cost to human society is enormous, yet for all our intellectual development, we continue to wage war well into the 21st century.

Now a new theory is emerging that challenges the prevailing view that warfare is a product of human culture and thus a relatively recent phenomenon. For the first time, anthropologists, archaeologists, primatologists, psychologists and political scientists are approaching a consensus. Not only is war as ancient as humankind, they say, but it has played an integral role in our evolution.

The theory helps explain the evolution of familiar aspects of warlike behaviour such as gang warfare. It even suggests that the cooperative skills we've had to develop to be effective warriors have turned into the modern ability to work towards a common goal.

These ideas emerged at a conference last month on the evolutionary origins of war at the University of Oregon in Eugene. "The picture that was painted was quite consistent," says Mark Van Vugt, an evolutionary psychologist at the University of Kent, UK. "Warfare has been with us for at least several tens, if not hundreds, of thousands of years." He thinks it was already there in the common ancestor we share with chimps. "It has been a significant selection pressure on the human species," he says. In fact several fossils of early humans have wounds consistent with warfare.

Studies suggest that warfare accounts for 10 per cent or more of all male deaths in present-day hunter-gatherers. "That's enough to get your attention," says Stephen LeBlanc, an archaeologist at Harvard University's Peabody Museum in Boston.

Primatologists have known for some time that organised, lethal violence is common between groups of chimpanzees, our closest relatives. Whether between chimps or hunter-gatherers, however, intergroup violence is nothing like modern pitched battles. Instead, it tends to take the form of brief raids using overwhelming force, so that the aggressors run little risk of injury. "It's not like the Somme," says Richard Wrangham, a primatologist at Harvard University. "You go off, you make a hit, you come back again." This opportunistic violence helps the aggressors weaken rival groups and thus expand their territorial holdings.

Such raids are possible because humans and chimps, unlike most social mammals, often wander away from the main group to forage singly or in smaller groups, says Wrangham. Bonobos - which are as closely related to humans as chimps are - have little or no intergroup violence because they tend to live in habitats where food is easier to come by, so that they need not stray from the group.

If group violence has been around for a long time in human society then we ought to have evolved psychological adaptations to a warlike lifestyle. Several participants presented the strongest evidence yet that males - whose larger and more muscular bodies make them better suited for fighting - have evolved a tendency towards aggression outside the group but cooperation within it. "There is something ineluctably male about coalitional aggression - men bonding with men to engage in aggression against other men," says Rose McDermott, a political scientist at Stanford University in California.

Aggression in women, she notes, tends to take the form of verbal rather than physical violence, and is mostly one on one. Gang instincts may have evolved in women too, but to a much lesser extent, says John Tooby, an evolutionary psychologist at the University of California at Santa Barbara. This is partly because of our evolutionary history, in which men are often much stronger than women and therefore better suited for physical violence. This could explain why female gangs only tend to form in same-sex environments such as prison or high school. But women also have more to lose from aggression, Tooby points out, since they bear most of the effort of child-rearing.

Not surprisingly, McDermott, Van Vugt and their colleagues found that men are more aggressive than women when playing the leader of a fictitious country in a role-playing game. But Van Vugt's team observed more subtle responses in group bonding. For example, male undergraduates were more willing than women to contribute money towards a group effort - but only when competing against rival universities. If told instead that the experiment was to test their individual responses to group cooperation, men coughed up less cash than women did. In other words, men's cooperative behaviour only emerged in the context of intergroup competition (Psychological Science, vol 18, p 19).

Some of this behaviour could arguably be attributed to conscious mental strategies, but anthropologist Mark Flinn of the University of Missouri at Columbia has found that group-oriented responses occur on the hormonal level, too. He found that cricket players on the Caribbean island of Dominica experience a testosterone surge after winning against another village. But this hormonal surge, and presumably the dominant behaviour it prompts, was absent when the men beat a team from their own village, Flinn told the conference. "You're sort of sending the signal that it's play. You're not asserting dominance over them," he says. Similarly, the testosterone surge a man often has in the presence of a potential mate is muted if the woman is in a relationship with his friend. Again, the effect is to reduce competition within the group, says Flinn. "We really are different from chimpanzees in our relative amount of respect for other males' mating relationships."

The net effect of all this is that groups of males take on their own special dynamic. Think soldiers in a platoon, or football fans out on the town: cohesive, confident, aggressive - just the traits a group of warriors needs.

Chimpanzees don't go to war in the way we do because they lack the abstract thought required to see themselves as part of a collective that expands beyond their immediate associates, says Wrangham. However, "the real story of our evolutionary past is not simply that warfare drove the evolution of social behaviour," says Samuel Bowles, an economist at the Santa Fe Institute in New Mexico and the University of Siena, Italy. The real driver, he says, was "some interplay between warfare and the alternative benefits of peace".

Though women seem to help broker harmony within groups, says Van Vugt, men may be better at peacekeeping between groups.

Our warlike past may have given us other gifts, as well. "The interesting thing about war is we're focused on the harm it does," says Tooby. "But it requires a super-high level of cooperation." And that seems to be a heritage worth hanging on to.
« Reply #42 on: November 19, 2008, 01:26:09 PM »

News -  November 19, 2008

Scientists Sequence Half the Woolly Mammoth's Genome
Study could be a step toward resurrecting a long-extinct animal
By Kate Wong

Editor's note: This story will appear in our January issue but is being posted early because of a publication in today's Nature.

Thousands of years after the last woolly mammoth lumbered across the tundra, scientists have sequenced a whopping 50 percent of the beast’s nuclear genome, they report in a new study. Earlier attempts to sequence the DNA of these icons of the Ice Age produced only tiny quantities of code. The new work marks the first time that so much of the genetic material of an extinct creature has been retrieved. Not only has the feat provided insight into the evolutionary history of mammoths, but it is a step toward realizing the science-fiction dream of being able to resurrect a long-gone animal.

Researchers led by Webb Miller and Stephan C. Schuster of Pennsylvania State University extracted the DNA from hair belonging to two Siberian woolly mammoths and ran it through a machine that conducts so-called high-throughput sequencing. Previously, the largest amount of DNA from an extinct species comprised around 13 million base pairs—not even 1 percent of the genome. Now, writing in the November 20 issue of Nature, the team reports having obtained more than three billion base pairs. “It’s a technical breakthrough,” says ancient-DNA expert Hendrik N. Poinar of McMaster University in Ontario.

Interpretation of the sequence is still nascent, but the results have already helped overturn a long-held assumption about the proboscidean past. Received wisdom holds that the woolly mammoth was the last of a line of species in which each one begat the next, with only one species existing at any given time. The nuclear DNA reveals that the two mammoths that yielded the DNA were quite different from each other, and they seem to belong to populations that diverged 1.5 million to two million years ago. This finding confirms the results of a recent study of the relatively short piece of DNA that resides in the cell’s energy-producing organelles—called mitochondrial DNA—which suggested that multiple species of woolly mammoth coexisted. “It looks like there was speciation that we were previously unable to detect” using fossils alone, Ross D. E. MacPhee of the American Museum of Natural History in New York City observes.

Thus far the mammoth genome exists only in bits and pieces: it has not yet been assembled. The researchers are awaiting completion of the genome of the African savanna elephant, a cousin of the woolly mammoth, which will serve as a road map for how to reconstruct the extinct animal’s genome.

Armed with complete genomes for the mammoth and its closest living relative, the Asian elephant, scientists may one day be able to bring the mammoth back from the beyond. “A year ago I would have said this was science fiction,” Schuster remarks. But as a result of this sequencing achievement, he now believes one could theoretically modify the DNA in the egg of an elephant to match that of its furry cousin by artificially introducing the appropriate substitutions to the genetic code. Based on initial comparisons of mammoth and elephant DNA, he estimates that around 400,000 changes would produce an animal that looks a lot like a mammoth; an exact replica would require several million.

(The recent cloning of frozen mice is not applicable to woolly mammoths, Schuster believes, because whereas mice are small and therefore freeze quickly, a mammoth carcass would take many days to ice over—a delay that would likely cause too much DNA degradation for cloning.)

In the nearer term, biologists are hoping to glean insights into such mysteries as how woolly mammoths were adapted to their frigid world and what factors led to their demise. Miller notes that by studying the genomes of multiple mammoths from different time periods, researchers will be able to chart the decrease in genetic diversity as the species died out. The downfall of the mammoths and other species may contain lessons for modern fauna in danger of disappearing, he says.

Indeed, the team is now sequencing DNA they have obtained from a thylacine, an Australian marsupial that went extinct in 1936, possibly as a result of infection. They want to compare its DNA with that of the closely related Tasmanian devil, which is currently under threat from a devastating facial cancer.

“We’re hoping to learn why one species went extinct and the other didn’t and then use that [knowledge] in conservation efforts,” Miller says. If the research turns up genes associated with survival, scientists can use that information to develop a breeding program for the Tasmanian devil that maximizes the genetic diversity of the population—and increases the frequency of genes that confer immunity. Perhaps the greatest promise of ancient DNA is not raising the dead but preserving the living.
Power User
Posts: 42482

« Reply #43 on: November 19, 2008, 07:47:59 PM »

What an extraordinary world we live in!

What does "proboscidean" mean?
« Reply #44 on: November 19, 2008, 08:54:47 PM »

I assumed a root of "proboscis" and so figured "of big-nosed creatures."

« Reply #45 on: November 20, 2008, 02:46:00 AM »

Duh. :embarrassed:

« Reply #46 on: December 06, 2008, 08:09:35 AM »

Margaret Somerville | Friday, 5 December 2008
Aping their betters
If animals co-operate to benefit their community, does it mean they are ethical beings?

Recently, I participated in a round-table discussion, “Apes or Angels: What is the Origin of Ethics?” at McGill University. It was billed as honouring the 150th anniversary of the publication of Charles Darwin's Theory of Natural Selection.

The issue on the table was whether the ethical system that underlies "our unique social and economic system ... that leads us to rely on the support and co-operation of other individuals, largely unknown to one another" is simply the result of evolution through natural selection and a more advanced form of the social co-operation we see in animals; or whether "our social behaviour and the ethics on which it is based [are] uniquely human and owe nothing to the processes that govern societies of ants or bacteria. Our bodies may have evolved, but our ethics requires another kind of explanation."

In short, are ethics and morality in humans just one more outcome of natural selection through evolution, or do they have some other origin?

My co-panelists included world-renowned evolutionary biologists; distinguished academics specializing in researching the relation of economics and evolutionary biology; an anthropologist with expertise on co-operative behaviour in apes and monkeys; and a global leader in the field of evolution education, whose expert witness testimony in the U.S. federal trial on biological evolution, education and the U.S. constitution, contributed to the court ruling that the teaching of intelligent design in high-school science classes was unconstitutional.

I was a loner as an ethicist and, possibly, the only person who thought that humans were not just an improved version of other animals in terms of ethical behaviour.

First, we discussed whether we could say animals had a sense of ethics. My co-panelists referred to research that shows primates perceive and become angry when they can see they are not being treated fairly -- for instance, when one gets a bigger reward for a certain response than another. They explained that animals form community and act to maximize benefit to the community, including through self-sacrifice. They proposed that these behaviours were early forms of ethical conduct and that it was relevant in tracing and understanding the evolution of ethics in humans to know when these behaviours first appeared, in which animals, and at what point on the evolutionary tree.

This approach reflects a range of crucial assumptions.

First, that ethics -- and one assumes morality, as ethics is based on morality -- is just a genetically determined characteristic not unique to humans. Genetic reductionism is a view that we are nothing more than "gene machines", including with respect to our most "human" characteristics, such as ethics.

We probably have genes that give us the capacity to seek ethics. (These genes might need to be activated by certain experiences or learning. We can imagine them as being like a TV set: we need it to see a telecast, but it doesn't determine what we see.) I propose, however, that ethics consists of more than just a genetically programmed response.

Ethics require moral judgment. That requires deciding between right and wrong. As far as we know, animals are not capable of doing that. There's a major difference between engaging in social conduct that benefits the community, as some animals do, and engaging in that same conduct because it would be ethically wrong not to do so, as humans do.

My colleagues believed ethics were not unique to humans. Definition is a problem here. If ethics are broadly defined to encompass certain animal behaviour, they are correct. But if ethics are the practical application of morality, then to say animals have ethics is to attribute a moral instinct to them.

My colleagues' approach postulates an ethics continuum on which humans are just more "ethically advanced" than animals -- that is, there is only a difference in degree, not a difference in kind, between humans and animals with respect to having a capacity to be ethical.

Whether animals and humans are just different-in-degree or different-in-kind ("special" and, therefore, deserve "special respect") is at the heart of many of the most important current ethical conflicts, including those about abortion, human embryonic stem cell research, new reproductive technologies, and euthanasia.

Princeton philosopher Peter Singer is an "only a difference in degree" adherent. He says we are all animals and, therefore, giving preferential treatment to humans is "speciesism" -- wrongful discrimination on the basis of species identity. Animals and humans deserve the same respect. What we wouldn't do to humans we shouldn't do to animals; and what we would do for animals -- for instance, euthanasia -- we should do for humans.

MIT artificial intelligence and robotics scientist, Rodney Brooks, argues the same on behalf of robots. He claims that those which are more intelligent than us will deserve greater respect than we do.

In contrast, I believe that humans are "special" (different-in-kind) as compared with other animals and, consequently, deserve "special respect".

Traditionally, we have used the idea that humans have a soul and animals don't to justify our differential treatment of humans and animals in terms of the respect they deserve. But soul is no longer a universally accepted concept.

Ethics can, however, be linked to a metaphysical base without needing to invoke religious or supernatural features or beliefs. We could speak of a secular "human spirit" nature or, as German philosopher Jürgen Habermas describes it, an "ethics of the human species". I propose that ethics necessarily involve some transcendent experience, one that humans can have and animals cannot.

And I want to make clear that we can believe in evolution and also believe in God. The dichotomy often made in the media between being "atheist-anti-religion/pro-evolution," on the one hand, and "believer-pro-religion/anti-evolution," on the other, does not reflect reality. Evolution and a belief in God are not, as Richard Dawkins argues, incompatible.

The argument that it is dangerous to abandon the idea of human specialness, and the claim that a moral instinct and the search for ethics are uniquely human, were greeted with great skepticism by my colleagues, who seemed to think that only religious people would hold such views.

To conclude with "Do ants have ethics?" -- that is, does the behaviour, bonding and formation of community in animals have a different base from that in humans? How we answer that question is of immense importance, because it will have a major impact on the ethics we hand on to future generations.

Margaret Somerville is director of the Centre for Medicine, Ethics and Law at McGill University and author of The Ethical Imagination: Journeys of the Human Spirit.

« Reply #47 on: December 09, 2008, 04:22:47 PM »

Spare the Rod, Spoil Society
Does punishing free riders increase long-run cooperation?

Ronald Bailey | December 9, 2008
Want to punish the fat cats on Wall Street who have allegedly wrecked the economy? Of course you do! And it's only natural, according to research done by two economists whose work focuses on the puzzle of human cooperation. As Swiss researchers Ernst Fehr and Simon Gächter noted in 2002, "Unlike other creatures, people frequently cooperate with genetically unrelated strangers, often in large groups." They argued that one significant key to cooperation is the existence of "altruistic punishment."

During the course of human evolution, people frequently engaged in cooperative activities such as big game hunting and the preservation of common property resources like fisheries. But it's all too easy for individuals to free ride on such projects. So how does true cooperation occur? Fehr and Gächter point to altruistic punishers: people who respond with strong emotion or even violence when someone else benefits from the labor of others without contributing something themselves. It may cost something to act as altruistic punisher, but going postal on non-cooperators does encourage everybody to contribute to the public good. The New York Times even speculated that this drive to punish free riders was behind the American public's disinclination to support the Congressional bailout proposals back in September.

To test their hypothesis, Fehr and Gächter set up a series of public goods experiments in which each player chose how much money to contribute to a joint investment—without knowing beforehand how much the other players would contribute. If everyone puts in a lot, they maximize their profits. However, the games were rigged such that non-cooperators could gain even more by taking a share of the profits while retaining their own initial endowments. The researchers found that when punishing non-cooperators was possible (say, spend $1 to reduce the free riders' endowments by $3), it substantially increased the amount that nearly all subjects invested in the public good. For example, in experiments done in 2000, where free riders could be punished, experimental subjects contributed two to four times more than when there was no punishment option.
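The structure of such a game can be sketched in a few lines. The numbers here follow the article's 1:3 punishment ratio, but the endowment and pool multiplier are illustrative assumptions, not the exact experimental parameters:

```python
# Illustrative public-goods round with an optional punishment stage.
# Endowment (20) and pool multiplier (1.6) are assumed values for
# illustration; the 1-costs-the-punisher / 3-destroyed-for-the-target
# ratio follows the article's example.

def public_goods_round(contributions, endowment=20, multiplier=1.6,
                       punish_spend=None, punish_cost=1, punish_impact=3):
    """Return each player's payoff for one round.

    contributions[i]: amount player i puts into the common pool.
    punish_spend[i][j]: punishment points player i spends on player j.
    """
    n = len(contributions)
    pot = sum(contributions) * multiplier
    share = pot / n  # the multiplied pool is split equally
    payoffs = [endowment - c + share for c in contributions]
    if punish_spend:
        for i in range(n):
            for j in range(n):
                pts = punish_spend[i][j]
                payoffs[i] -= pts * punish_cost    # cost to the punisher
                payoffs[j] -= pts * punish_impact  # loss to the target
    return payoffs

# Without punishment, the free rider (player 2) beats the cooperators:
no_punish = public_goods_round([20, 20, 0])

# With punishment, cooperators can make free riding unprofitable:
spend = [[0, 0, 4], [0, 0, 4], [0, 0, 0]]
punished = public_goods_round([20, 20, 0], punish_spend=spend)
```

In this toy run the free rider earns the most until punishment is applied, at which point the targeted deductions wipe out the free-riding advantage even though the punishers pay for them out of pocket.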

However, Dutch experimenters noted that public goods games where punishment was allowed actually produced lower overall returns than did games in which no punishment occurred. Why? Because the destruction of the non-cooperators' resources was greater than the subsequent gains from cooperation. Punishment increases cooperation, but it also makes the group poorer. This is not a particularly inspiring outcome.

In a new study published last week in Science, however, Gächter and his colleagues show that the lower overall returns from games in which punishment is possible may be an experimental artifact resulting from the number of rounds in which the games are played. In most experimental conditions, public goods games are played for ten rounds or less. The new research compares the outcomes of games lasting for 10 rounds versus 50 rounds, both when punishment is possible and when it is not.

What happens when players have only ten rounds in which to invest? As satisfying as it is to punish free riders, the average payoffs after ten rounds are indeed lower than when no punishment is allowed. The results are quite different when the games last 50 rounds. Interestingly, the payoffs in the initial rounds when punishment is possible are lower than the payoffs when punishment can't occur. However, as rounds of play accumulate, the payoffs in the games where free riders can be punished rise rapidly, while the payoffs in games in which free riders are not punished drop throughout the duration of play.

Another happy result is that once players understand that they can be punished for free riding, they start investing enthusiastically in the common pool, causing the costs of punishment to drop to near zero. "Overall, our experiments show that punishment not only increases cooperation, it also makes groups and individuals better off in the long run because the costs of punishment become negligible and are outweighed by the increased gains from cooperation," conclude the researchers. "These results support group selection models of cooperation and punishment, which require that punishment increases not only cooperation but also group average payoffs."

So punishing free riders increases cooperation and boosts incomes over the long-run. But that is not always the case (at least in shorter-term games). Earlier this year, Gächter and his colleagues reported the results from a series of public goods games using players from 16 different societies. Their research turned up profound cross-cultural differences in response to punishment. All groups punished free riders, but the free riders did not all respond with increased cooperation. Instead, some sought revenge by punishing their punishers—if you whack me, I'll whack you, in other words. So a cycle of vendettas broke out.

For example, players from Muscat (in Oman), Greece, and Saudi Arabia were the most vengeful. On the other hand, players from the United States, Australia, and Britain were the least vengeful and most likely to respond to punishment with increased cooperation. The researchers concluded that revenge is stronger among participants from "societies with weak norms of civic cooperation and a weak rule of law." Not surprisingly, the overall payoffs were significantly lower in the games in which participants indulged in cycles of vengeance.

We've come a long way from the bands of Pleistocene hunter-gatherers in which these psychological tendencies evolved. In today's complex economy, which encompasses globe-spanning webs of cooperation, how do people correctly identify free riders who merit punishment? Are the investment bankers with big bonuses free riders? What about hedge fund managers? Government agencies? Politicians? Perhaps the good news from experimental economics is that while Americans want to punish free riders as much as the next guys do, we are unlikely to engage in a self-defeating cycle of financial vengeance that will make us all poorer.

Ronald Bailey is Reason's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.

« Reply #48 on: December 11, 2008, 03:07:43 PM »

Evolutionary theory predictions confirmed
« Reply #49 on: December 14, 2008, 07:30:22 PM »


In sixteenth-century Paris, a popular form of entertainment was cat-burning, in which a cat was hoisted in a sling on a stage and slowly lowered into a fire. According to historian Norman Davies, "[T]he spectators, including kings and queens, shrieked with laughter as the animals, howling with pain, were singed, roasted, and finally carbonized." Today, such sadism would be unthinkable in most of the world. This change in sensibilities is just one example of perhaps the most important and most underappreciated trend in the human saga: Violence has been in decline over long stretches of history, and today we are probably living in the most peaceful moment of our species' time on earth.

In the decade of Darfur and Iraq, and shortly after the century of Stalin, Hitler, and Mao, the claim that violence has been diminishing may seem somewhere between hallucinatory and obscene. Yet recent studies that seek to quantify the historical ebb and flow of violence point to exactly that conclusion.

Some of the evidence has been under our nose all along. Conventional history has long shown that, in many ways, we have been getting kinder and gentler. Cruelty as entertainment, human sacrifice to indulge superstition, slavery as a labor-saving device, conquest as the mission statement of government, genocide as a means of acquiring real estate, torture and mutilation as routine punishment, the death penalty for misdemeanors and differences of opinion, assassination as the mechanism of political succession, rape as the spoils of war, pogroms as outlets for frustration, homicide as the major form of conflict resolution—all were unexceptionable features of life for most of human history. But, today, they are rare to nonexistent in the West, far less common elsewhere than they used to be, concealed when they do occur, and widely condemned when they are brought to light.

At one time, these facts were widely appreciated. They were the source of notions like progress, civilization, and man's rise from savagery and barbarism. Recently, however, those ideas have come to sound corny, even dangerous. They seem to demonize people in other times and places, license colonial conquest and other foreign adventures, and conceal the crimes of our own societies. The doctrine of the noble savage—the idea that humans are peaceable by nature and corrupted by modern institutions—pops up frequently in the writing of public intellectuals like José Ortega y Gasset ("War is not an instinct but an invention"), Stephen Jay Gould ("Homo sapiens is not an evil or destructive species"), and Ashley Montagu ("Biological studies lend support to the ethic of universal brotherhood"). But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler.

To be sure, any attempt to document changes in violence must be soaked in uncertainty. In much of the world, the distant past was a tree falling in the forest with no one to hear it, and, even for events in the historical record, statistics are spotty until recent periods. Long-term trends can be discerned only by smoothing out zigzags and spikes of horrific bloodletting. And the choice to focus on relative rather than absolute numbers brings up the moral imponderable of whether it is worse for 50 percent of a population of 100 to be killed or 1 percent in a population of one billion.

Yet, despite these caveats, a picture is taking shape. The decline of violence is a fractal phenomenon, visible at the scale of millennia, centuries, decades, and years. It applies over several orders of magnitude of violence, from genocide to war to rioting to homicide to the treatment of children and animals. And it appears to be a worldwide trend, though not a homogeneous one. The leading edge has been in Western societies, especially England and Holland, and there seems to have been a tipping point at the onset of the Age of Reason in the early seventeenth century.

At the widest-angle view, one can see a whopping difference across the millennia that separate us from our pre-state ancestors. Contra leftist anthropologists who celebrate the noble savage, quantitative body-counts—such as the proportion of prehistoric skeletons with axemarks and embedded arrowheads or the proportion of men in a contemporary foraging tribe who die at the hands of other men—suggest that pre-state societies were far more violent than our own. It is true that raids and battles killed a tiny percentage of the numbers that die in modern warfare. But, in tribal violence, the clashes are more frequent, the percentage of men in the population who fight is greater, and the rates of death per battle are higher. According to anthropologists like Lawrence Keeley, Stephen LeBlanc, Phillip Walker, and Bruce Knauft, these factors combine to yield population-wide rates of death in tribal warfare that dwarf those of modern times. If the wars of the twentieth century had killed the same proportion of the population that die in the wars of a typical tribal society, there would have been two billion deaths, not 100 million.
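The scale of that extrapolation is easy to check with back-of-the-envelope arithmetic, using only the two figures the passage itself supplies (100 million actual war deaths versus two billion implied deaths):

```python
# Back-of-the-envelope check of the tribal-warfare extrapolation,
# using the article's own figures.
actual_war_deaths = 100e6   # 20th-century war deaths (article's figure)
implied_deaths = 2e9        # deaths if tribal-warfare mortality rates had applied

ratio = implied_deaths / actual_war_deaths
print(f"Tribal-rate extrapolation exceeds the actual toll by a factor of {ratio:.0f}")
```

In other words, the population-wide death rate from warfare in a typical tribal society would be roughly twenty times the rate the twentieth century actually produced.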

Political correctness from the other end of the ideological spectrum has also distorted many people's conception of violence in early civilizations—namely, those featured in the Bible. This supposed source of moral values contains many celebrations of genocide, in which the Hebrews, egged on by God, slaughter every last resident of an invaded city. The Bible also prescribes death by stoning as the penalty for a long list of nonviolent infractions, including idolatry, blasphemy, homosexuality, adultery, disrespecting one's parents, and picking up sticks on the Sabbath. The Hebrews, of course, were no more murderous than other tribes; one also finds frequent boasts of torture and genocide in the early histories of the Hindus, Christians, Muslims, and Chinese.

At the century scale, it is hard to find quantitative studies of deaths in warfare spanning medieval and modern times. Several historians have suggested that there has been an increase in the number of recorded wars across the centuries to the present, but, as political scientist James Payne has noted, this may show only that "the Associated Press is a more comprehensive source of information about battles around the world than were sixteenth-century monks." Social histories of the West provide evidence of numerous barbaric practices that became obsolete in the last five centuries, such as slavery, amputation, blinding, branding, flaying, disembowelment, burning at the stake, breaking on the wheel, and so on. Meanwhile, for another kind of violence—homicide—the data are abundant and striking. The criminologist Manuel Eisner has assembled hundreds of homicide estimates from Western European localities that kept records at some point between 1200 and the mid-1990s. In every country he analyzed, murder rates declined steeply—for example, from 24 homicides per 100,000 Englishmen in the fourteenth century to 0.6 per 100,000 by the early 1960s.

On the scale of decades, comprehensive data again paint a shockingly happy picture: Global violence has fallen steadily since the middle of the twentieth century. According to the Human Security Brief 2006, the number of battle deaths in interstate wars has declined from more than 65,000 per year in the 1950s to less than 2,000 per year in this decade. In Western Europe and the Americas, the second half of the century saw a steep decline in the number of wars, military coups, and deadly ethnic riots.

Zooming in by a further power of ten exposes yet another reduction. After the cold war, every part of the world saw a steep drop-off in state-based conflicts, and those that do occur are more likely to end in negotiated settlements rather than being fought to the bitter end. Meanwhile, according to political scientist Barbara Harff, between 1989 and 2005 the number of campaigns of mass killing of civilians decreased by 90 percent.

The decline of killing and cruelty poses several challenges to our ability to make sense of the world. To begin with, how could so many people be so wrong about something so important? Partly, it's because of a cognitive illusion: We estimate the probability of an event from how easy it is to recall examples. Scenes of carnage are more likely to be relayed to our living rooms and burned into our memories than footage of people dying of old age. Partly, it's an intellectual culture that is loath to admit that there could be anything good about the institutions of civilization and Western society. Partly, it's the incentive structure of the activism and opinion markets: No one ever attracted followers and donations by announcing that things keep getting better. And part of the explanation lies in the phenomenon itself. The decline of violent behavior has been paralleled by a decline in attitudes that tolerate or glorify violence, and often the attitudes are in the lead. As deplorable as they are, the abuses at Abu Ghraib and the lethal injections of a few murderers in Texas are mild by the standards of atrocities in human history. But, from a contemporary vantage point, we see them as signs of how low our behavior can sink, not of how high our standards have risen.

The other major challenge posed by the decline of violence is how to explain it. A force that pushes in the same direction across many epochs, continents, and scales of social organization mocks our standard tools of causal explanation. The usual suspects—guns, drugs, the press, American culture—aren't nearly up to the job. Nor could it possibly be explained by evolution in the biologist's sense: Even if the meek could inherit the earth, natural selection could not favor the genes for meekness quickly enough. In any case, human nature has not changed so much as to have lost its taste for violence. Social psychologists find that at least 80 percent of people have fantasized about killing someone they don't like. And modern humans still take pleasure in viewing violence, if we are to judge by the popularity of murder mysteries, Shakespearean dramas, Mel Gibson movies, video games, and hockey.

What has changed, of course, is people's willingness to act on these fantasies. The sociologist Norbert Elias suggested that European modernity accelerated a "civilizing process" marked by increases in self-control, long-term planning, and sensitivity to the thoughts and feelings of others. These are precisely the functions that today's cognitive neuroscientists attribute to the prefrontal cortex. But this only raises the question of why humans have increasingly exercised that part of their brains. No one knows why our behavior has come under the control of the better angels of our nature, but there are four plausible suggestions.

The first is that Hobbes got it right. Life in a state of nature is nasty, brutish, and short, not because of a primal thirst for blood but because of the inescapable logic of anarchy. Any beings with a modicum of self-interest may be tempted to invade their neighbors to steal their resources. The resulting fear of attack will tempt the neighbors to strike first in preemptive self-defense, which will in turn tempt the first group to strike against them preemptively, and so on. This danger can be defused by a policy of deterrence—don't strike first, retaliate if struck—but, to guarantee its credibility, parties must avenge all insults and settle all scores, leading to cycles of bloody vendetta. These tragedies can be averted by a state with a monopoly on violence, because it can inflict disinterested penalties that eliminate the incentives for aggression, thereby defusing anxieties about preemptive attack and obviating the need to maintain a hair-trigger propensity for retaliation. Indeed, Eisner and Elias attribute the decline in European homicide to the transition from knightly warrior societies to the centralized governments of early modernity. And, today, violence continues to fester in zones of anarchy, such as frontier regions, failed states, collapsed empires, and territories contested by mafias, gangs, and other dealers of contraband.

Payne suggests another possibility: that the critical variable in the indulgence of violence is an overarching sense that life is cheap. When pain and early death are everyday features of one's own life, one feels fewer compunctions about inflicting them on others. As technology and economic efficiency lengthen and improve our lives, we place a higher value on life in general.

A third theory, championed by Robert Wright, invokes the logic of non-zero-sum games: scenarios in which two agents can each come out ahead if they cooperate, such as trading goods, dividing up labor, or sharing the peace dividend that comes from laying down their arms. As people acquire know-how that they can share cheaply with others and develop technologies that allow them to spread their goods and ideas over larger territories at lower cost, their incentive to cooperate steadily increases, because other people become more valuable alive than dead.
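Wright's logic can be sketched as a toy payoff comparison. The numeric payoffs below are invented purely for illustration; the point is the crossover, not the specific numbers: once the gains from trade grow large enough, cooperating with a neighbor beats raiding him.

```python
# Toy non-zero-sum model: two neighbors each choose to "trade" or "raid".
# All payoff values are invented for illustration.

def payoff(my_move, their_move, gains_from_trade):
    """Return my payoff for one interaction."""
    if my_move == "trade" and their_move == "trade":
        return gains_from_trade   # both profit from exchange
    if my_move == "raid" and their_move == "trade":
        return 3                  # plunder an unprepared neighbor
    if my_move == "trade" and their_move == "raid":
        return -2                 # get plundered
    return -1                     # mutual raiding: costly fighting

def best_response(their_move, gains_from_trade):
    """My payoff-maximizing move against a given neighbor."""
    return max(("trade", "raid"),
               key=lambda m: payoff(m, their_move, gains_from_trade))

# With meager gains from trade, raiding a trading neighbor pays better:
print(best_response("trade", gains_from_trade=1))   # -> raid
# Once technology raises the value of exchange, trading dominates:
print(best_response("trade", gains_from_trade=5))   # -> trade
```

As the `gains_from_trade` parameter rises, the neighbor becomes "more valuable alive than dead," and the best response flips from predation to cooperation.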

Then there is the scenario sketched by philosopher Peter Singer. Evolution, he suggests, bequeathed people a small kernel of empathy, which by default they apply only within a narrow circle of friends and relations. Over the millennia, people's moral circles have expanded to encompass larger and larger polities: the clan, the tribe, the nation, both sexes, other races, and even animals. The circle may have been pushed outward by expanding networks of reciprocity, à la Wright, but it might also be inflated by the inexorable logic of the golden rule: The more one knows and thinks about other living things, the harder it is to privilege one's own interests over theirs. The empathy escalator may also be powered by cosmopolitanism, in which journalism, memoir, and realistic fiction make the inner lives of other people, and the contingent nature of one's own station, more palpable—the feeling that "there but for fortune go I."

Whatever its causes, the decline of violence has profound implications. It is not a license for complacency: We enjoy the peace we find today because people in past generations were appalled by the violence in their time and worked to end it, and so we should work to end the appalling violence in our time. Nor is it necessarily grounds for optimism about the immediate future, since the world has never before had national leaders who combine pre-modern sensibilities with modern weapons.

But the phenomenon does force us to rethink our understanding of violence. Man's inhumanity to man has long been a subject for moralization. With the knowledge that something has driven it dramatically down, we can also treat it as a matter of cause and effect. Instead of asking, "Why is there war?" we might ask, "Why is there peace?" From the likelihood that states will commit genocide to the way that people treat cats, we must have been doing something right. And it would be nice to know what, exactly, it is.

[First published in The New Republic, 3.19.07.]