Dog Brothers Public Forum
Author Topic: Evolutionary biology/psychology  (Read 53152 times)
tim nelson
Newbie
*
Posts: 23


« Reply #100 on: April 15, 2011, 05:18:40 PM »

I highly recommend books by Paul Shepard: Nature and Madness, The Tender Carnivore, and Coming Home to the Pleistocene. He was way ahead of his time, in my opinion.

He argues that the hunter-gatherer lifestyle is the healthiest one for us. He spends a lot of time comparing our development with that of other primates socially, physically, nutritionally, etc., and with other large predators, both social and less social. I liked his idea that we developed into quite a mix: digestive systems like those of true omnivores (raccoons, bears, boars), but a social structure and culture that is more wolf-like. Modern human males in particular evolved spending considerable time hunting large game whenever and wherever we lived, so the male psyche was shaped by those complex processes of solving a moving problem, coordinating with others, and that particular kind of exertion. Without large game to hunt we need, and crave, comparable experiences, such as hunting humans in war, fighting, etc.

Anyway, I liked most of his stuff. Lots of convincing evidence, and his frankly pro-hunter-gatherer stance helps.
Logged
Body-by-Guinness
Power User
***
Posts: 2788


« Reply #101 on: April 22, 2011, 01:19:32 PM »

A New Hominin – A. sediba
Published by Steven Novella under Evolution
Following the branching bush of human evolution is getting increasingly difficult. When I studied human evolution in college, things were much simpler. There were a few Australopithecus species followed by a few Homo species, leading to modern humans. It was recognized at the time that these fossil species probably did not represent a nice clean straight line to Homo sapiens, but it seems the family tree has become much bushier than was imagined at the time.
Here is a recent representation of the hominin family tree. We have added more species of Australopithecus and Homo, plus the new genera Kenyanthropus and Paranthropus (not even including older genera that predate Australopithecus).
Now researchers have announced the discovery of yet another species of early hominin, about 2 million years old – likely a late species of Australopithecus named A. sediba. They discovered four individuals – two adults, a child and an infant – who likely fell into a “death trap” in a cave in what is now Malapa, South Africa.
Each bit of fossil evidence is like a piece to a complex puzzle. As more pieces fit into place, however, the picture becomes more complex and more questions are generated. We are still at the stage where new evidence generates more questions than answers, and we have no idea how complex the final picture that emerges will be.
The new discovery is no exception. A. sediba has a mixture of modern (Homo) and primitive (Australopithecine) traits. It has a small brain like a primitive Australopithecus, but has pelvic structure and hand features that are more modern than other members of the genus.
It should also be noted that the first members of the Homo genus arose about a million years before the age of these specific specimens – so these individuals do not represent a population that is ancestral to our genus.
As always, there are multiple ways to interpret this data. It is possible that A. sediba is the ancestral Australopithecine species that led to Homo – either directly, or closely related to that species (yet to be discovered). In this case, these individuals would be later representatives of that species. Species often persist, even for millions of years, after other species branch off from them. So it is always possible to find representatives of an ancestral species that are more recent than the species that evolved from them.
It is also possible that A. sediba is a separate line of Australopithecines that did not lead to Homo, but developed some similar features. In this case the “modern” features in A. sediba would be analogous to (similar to, but not ancestrally related to) the modern Homo features, rather than homologous to them (related through evolutionary derivation).
Another possibility that was not mentioned in the Science article that I linked to is that these individuals, and possibly A. sediba as a species, or perhaps just one breeding population, represent the results of interbreeding between Homo and Australopithecus species. In this case modern features would literally have been mixed together with the more primitive features in A. sediba.
This adds a new layer of complexity to our picture of the human family tree (or any family tree). When species divide, the separation is not clean, and later remixing of genes is not only possible but probable. There is genetic evidence, for example, of later mixing of genes between human ancestors and chimpanzee ancestors after the split. So it’s not a stretch to think that hominin populations were at least occasionally interbreeding.
I suspect there are many more hominin species and subspecies to be discovered. The picture that is emerging is fascinating, if it is becoming increasingly difficult to keep track of it all. I’ll just have to muddle through.

http://theness.com/neurologicablog/?p=3139
Logged
Body-by-Guinness
Power User
***
Posts: 2788


« Reply #102 on: April 22, 2011, 02:20:57 PM »

Second post.

The Neuroscience of the Gut
Strange but true: the brain is shaped by bacteria in the digestive tract
By Robert Martone | Tuesday, April 19, 2011

Researchers track the gut-brain connection
People may advise you to listen to your gut instincts: now research suggests that your gut may have more impact on your thoughts than you ever realized. Scientists from the Karolinska Institute in Sweden and the Genome Institute of Singapore led by Sven Pettersson recently reported in the Proceedings of the National Academy of Sciences that normal gut flora, the bacteria that inhabit our intestines, have a significant impact on brain development and subsequent adult behavior.

We human beings may think of ourselves as a highly evolved species of conscious individuals, but we are all far less human than most of us appreciate. Scientists have long recognized that the bacterial cells inhabiting our skin and gut outnumber human cells by ten-to-one. Indeed, Princeton University scientist Bonnie Bassler compared the approximately 30,000 human genes found in the average human to the more than 3 million bacterial genes inhabiting us, concluding that we are at most one percent human. We are only beginning to understand the sort of impact our bacterial passengers have on our daily lives.

Moreover, these bacteria have been implicated in the development of neurological and behavioral disorders. For example, gut bacteria may have an influence on the body’s use of vitamin B6, which in turn has profound effects on the health of nerve and muscle cells. They modulate immune tolerance and, because of this, they may have an influence on autoimmune diseases, such as multiple sclerosis. They have been shown to influence anxiety-related behavior, although there is controversy regarding whether gut bacteria exacerbate or ameliorate stress related anxiety responses. In autism and other pervasive developmental disorders, there are reports that the specific bacterial species present in the gut are altered and that gastrointestinal problems exacerbate behavioral symptoms. A newly developed biochemical test for autism is based, in part, upon the end products of bacterial metabolism.

But this new study is the first to extensively evaluate the influence of gut bacteria on the biochemistry and development of the brain. The scientists raised mice lacking normal gut microflora, then compared their behavior, brain chemistry and brain development to mice having normal gut bacteria. The microbe-free animals were more active and, in specific behavioral tests, were less anxious than microbe-colonized mice. In one test of anxiety, animals were given the choice of staying in the relative safety of a dark box, or of venturing into a lighted box. Bacteria-free animals spent significantly more time in the light box than their bacterially colonized littermates. Similarly, in another test of anxiety, animals were given the choice of venturing out on an elevated and unprotected bar to explore their environment, or remaining in the relative safety of a similar bar protected by enclosing walls. Once again, the microbe-free animals proved themselves bolder than their colonized kin.
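
To make the light/dark box comparison concrete, here is a minimal sketch of how such data might be analyzed; the group sizes, the seconds-in-light values and the use of a Welch t statistic are my own assumptions for illustration, not the study's actual numbers or methods.

# Toy comparison of light/dark box results, germ-free vs. colonized mice.
# All values below are invented; the PNAS paper's real data differ.
from statistics import mean, stdev
from math import sqrt

germ_free = [210, 260, 240, 255, 230, 245]   # seconds spent in the lighted box
colonized = [150, 170, 140, 165, 155, 160]   # seconds spent in the lighted box

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

print("mean germ-free :", mean(germ_free))    # ~240 s
print("mean colonized :", round(mean(colonized)))    # ~157 s
print("Welch t        :", round(welch_t(germ_free, colonized), 2))
# A large positive t means the germ-free mice spent more time in the lighted
# box, i.e. behaved less anxiously, as described above.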

Pettersson’s team next asked whether the influence of gut microbes on the brain was reversible and, since the gut is colonized by microbes soon after birth, whether there was evidence that gut microbes influenced the development of the brain. They found that colonizing an adult germ-free animal with normal gut bacteria had no effect on their behavior. However, if germ free animals were colonized early in life, these effects could be reversed. This suggests that there is a critical period in the development of the brain when the bacteria are influential.

Consistent with these behavioral findings, two genes implicated in anxiety -- nerve growth factor-inducible clone A (NGF1-A) and brain-derived neurotrophic factor (BDNF) -- were found to be down-regulated in multiple brain regions in the germ-free animals. These changes in behavior were also accompanied by changes in the levels of several neurotransmitters, chemicals which are responsible for signal transmission between nerve cells. The neurotransmitters dopamine, serotonin and noradrenaline were elevated in a specific region of the brain, the striatum, which is associated with the planning and coordination of movement and which is activated by novel stimuli, while there were no such effects on neurotransmitters in other brain regions, such as those involved in memory (the hippocampus) or executive function (the frontal cortex).

When Pettersson’s team performed a comprehensive gene expression analysis of five different brain regions, they found nearly 40 genes that were affected by the presence of gut bacteria. Not only were these primitive microbes able to influence signaling between nerve cells while sequestered far away in the gut, they had the astonishing ability to influence whether brain cells turn on or off specific genes.   

How, then, do these single-celled intestinal denizens exert their influence on a complex multicellular organ such as the brain? Although the answer is unclear, there are several possibilities: the vagus nerve, for example, connects the gut to the brain, and it’s known that infection with Salmonella bacteria stimulates the expression of certain genes in the brain, which is blocked when the vagus nerve is severed. This nerve may be stimulated as well by normal gut microbes, and serve as the link between them and the brain. Alternatively, those microbes may modulate the release of chemical signals by the gut into the bloodstream which ultimately reach the brain. These gut microbes, for example, are known to modulate stress hormones which may in turn influence the expression of genes in the brain.

Regardless of how these intestinal “guests” exert their influence, these studies suggest that brain-directed behaviors, which influence the manner in which animals interact with the external world, may be deeply influenced by that animal’s relationship with the microbial organisms living in its gut. And the discovery that gut bacteria exert their influence on the brain within a discrete developmental stage may have important implications for developmental brain disorders.

http://www.scientificamerican.com/article.cfm?id=the-neuroscience-of-gut
Logged
Body-by-Guinness
Power User
***
Posts: 2788


« Reply #103 on: April 26, 2011, 10:11:03 PM »

Baby Language
Published by Steven Novella under Neuroscience
Recent studies demonstrate that babies 12-18 months old have similar activity in their brains in response to spoken words as do adults, a fact that tells us a lot about the development of language function.
In the typical adult brain language function is primarily carried out in highly specialized parts of the brain – Wernicke’s area (in the dominant, usually left, superior temporal lobe) processes words into concepts and concepts into words, while Broca’s area (in the dominant posterior-inferior frontal lobe) controls speech output. The two areas are connected by the arcuate fasciculus and are fed by both auditory and visual input. Taken as a whole this part of the brain functions as the language cortex. A stroke or other damage to this area in an adult results in loss of one or more aspects of speech, depending on the extent of damage.
Damage to this part of the brain in babies, however, does not have the same effect. When such children grow up they are able to develop essentially normal language function. There are two prevailing theories to explain this. The first is that language function is more widely distributed in infants than in adults, perhaps also involving the same structures on the non-dominant side of the brain. As the brain matures language function becomes confined to the primary language cortex.
The second theory is that brain plasticity allows non-damaged parts of the brain to take over function for the language cortex. Such plasticity exists even in adult brains, but is vastly more significant in babies, whose brains are still developing and wiring themselves. There is still a lot of raw brain material that has not yet fully specialized and can be co-opted for whatever functions are needed.
The new research has implications for this debate. If the former theory is correct, then babies who are just learning language would activate their brains more broadly than adults in response to language. If babies show a similar pattern of activation, that would support the plasticity theory.
This latest research firmly supports plasticity as the answer. Researchers at the University of California used functional MRI scans and magnetoencephalography (MEG) to look at the brain activity of 12-18 month old children in response to spoken words. They found that their primary language cortex lit up in a similar pattern to adults. They further tested to see if the children had any sense of the meaning of the words. They showed pictures of common objects with either a correct or incorrect spoken word. The children showed increased language area activity when the words were incongruous to the picture – and the researchers showed this is the same increase in activity as seen in adults.
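To make the congruent-versus-incongruent contrast concrete, here is a minimal sketch of the kind of within-child comparison involved; the activation numbers and the simple paired difference are illustrative assumptions, not the study's actual data or analysis.

# Hypothetical language-area responses (arbitrary units) per child to words that
# match or mismatch the pictured object; all values invented for illustration.
congruent   = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8]
incongruent = [1.6, 1.4, 1.9, 1.5, 1.8, 1.2]

diffs = [i - c for c, i in zip(congruent, incongruent)]
print(f"mean incongruity effect: {sum(diffs) / len(diffs):.2f}")
# A consistently positive difference across children is the adult-like
# incongruity effect the researchers report in 12-18-month-olds.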
What this research implies is that the genetic program for brain design comes into effect very early in brain development. The language cortex is destined to be language cortex right from the beginning, as long as nothing goes wrong with this process.
It should also be noted that this study looked only at the response to individual words. What it says about the 12-18 month old stage of development is that children of this age are already programming their language areas and storing up words and their meanings. This research did not look at other aspects of language, such as grammar – the ability to string multiple words together in a specific way in order to create meaning. It also did not look at the visual processing of written words.
Any parent of young children will likely remember with great detail the functional language development of their own children. At this age, and even younger than 12 months, children do seem to be sponges for language. Once they start learning words, they do so very quickly. Young children also seem to understand far more words than they can say. I don’t think this is mere confirmation bias (although that would tend to exaggerate the appearance of word acquisition), and research bears out that children can understand many more words than they can say. The ability to speak comes a bit later than the ability to assign meaning to specific words.
I remember that I played games with my children when they were about one year old, and still in the babbling stage. They could reliably, for example, retrieve specific toys by name (I was very careful to avoid the Clever Hans effect). I remember, in fact, being very surprised at how well they performed – they seemed to understand many more words than I would have thought given the rudimentary nature of their babbling. In this case, careful research confirms subjective experience – children learn to understand words spoken to them before they gain the ability to say them.
This makes sense from the point of view that it is very neurologically difficult to articulate. We take it for granted, but it does require dedicated cortex to pull off this feat. Also, think about how easy it is to become dysarthric – we start slurring our words even when we are just a little sleep deprived, or with a moderate amount of alcohol. It does seem to be a function that goes early when brain function is even slightly compromised, which says something about how demanding it is.
One more tangential point – it also strikes me that we tend to naively judge what is going on in people’s heads by what is coming out of their mouths. This is not unreasonable in most circumstances, but there are many reasons why people may be more mentally sharp than is evidenced by their articulation. Young children are just one example – they may be babbling with their mouths, but there is more linguistically going on in their brains.

http://theness.com/neurologicablog/?p=2711
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #104 on: May 17, 2011, 01:08:55 PM »


      Status Report: "The Dishonest Minority"



Three months ago, I announced that I was writing a book on why security
exists in human societies.  This is basically the book's thesis statement:

     All complex systems contain parasites.  In any system of
     cooperative behavior, an uncooperative strategy will be effective
     -- and the system will tolerate the uncooperatives -- as long as
     they're not too numerous or too effective. Thus, as a species
     evolves cooperative behavior, it also evolves a dishonest minority
     that takes advantage of the honest majority.  If individuals
     within a species have the ability to switch strategies, the
     dishonest minority will never be reduced to zero.  As a result,
     the species simultaneously evolves two things: 1) security systems
     to protect itself from this dishonest minority, and 2) deception
     systems to successfully be parasitic.

     Humans evolved along this path.  The basic mechanism can be
     modeled simply.  It is in our collective group interest for
     everyone to cooperate. It is in any given individual's short-term
     self-interest not to cooperate: to defect, in game theory terms.
     But if everyone defects, society falls apart.  To ensure
     widespread cooperation and minimal defection, we collectively
     implement a variety of societal security systems.

     Two of these systems evolved in prehistory: morals and reputation.
     Two others evolved as our social groups became larger and more
     formal: laws and technical security systems.  What these security
     systems do, effectively, is give individuals incentives to act in
     the group interest.  But none of these systems, with the possible
     exception of some fanciful science-fiction technologies, can ever
     bring that dishonest minority down to zero.

     In complex modern societies, many complications intrude on this
     simple model of societal security. Decisions to cooperate or
     defect are often made by groups of people -- governments,
     corporations, and so on -- and there are important differences
     because of dynamics inside and outside the groups. Much of our
     societal security is delegated -- to the police, for example --
     and becomes institutionalized; the dynamics of this are also
     important.

     Power struggles over who controls the mechanisms of societal
     security are inherent: "group interest" rapidly devolves to "the
     king's interest."  Societal security can become a tool for those
     in power to remain in power, with the definition of "honest
     majority" being simply the people who follow the rules.

     The term "dishonest minority" is not a moral judgment; it simply
      describes the minority who does not follow societal norms.  Since
     many societal norms are in fact immoral, sometimes the dishonest
     minority serves as a catalyst for social change.  Societies
     without a reservoir of people who don't follow the rules lack an
     important mechanism for societal evolution.  Vibrant societies
     need a dishonest minority; if society makes its dishonest minority
     too small, it stifles dissent as well as common crime.

At this point, I have most of a first draft: 75,000 words.  The
tentative title is still "The Dishonest Minority: Security and its Role
in Modern Society."  I have signed a contract with Wiley to deliver a
final manuscript in November for February 2012 publication.  Writing a
book is a process of exploration for me, and the final book will
certainly be a little different -- and maybe even very different -- from
what I wrote above.  But that's where I am today.

And it's why my other writings -- and the issues of Crypto-Gram --
continue to be sparse.

Lots of comments -- over 200 -- to the blog post.  Please comment there;
I want the feedback.
http://www.schneier.com/blog/archives/2011/02/societal_securi.html
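
As a purely illustrative aside, the cooperate/defect incentive Schneier describes above can be put into numbers with a toy payoff model; the payoffs and detection probabilities below are arbitrary assumptions of mine, not anything from the book.

# Toy model: a population of cooperators plus a "dishonest minority" of
# defectors who free-ride unless a security system catches them.
def round_payoffs(n_coop, n_defect, p_caught):
    """Average payoff per cooperator and per defector for one round."""
    n = n_coop + n_defect
    share_coop = n_coop / n
    coop_payoff = 2.0 * share_coop                  # cooperation pays when most cooperate
    cheat_gain = 3.0 * share_coop                   # defectors exploit the cooperators
    defect_payoff = cheat_gain * (1 - p_caught) - 4.0 * p_caught   # penalty if caught
    return coop_payoff, defect_payoff

for p_caught in (0.0, 0.2, 0.5):
    c, d = round_payoffs(n_coop=90, n_defect=10, p_caught=p_caught)
    print(f"detection {p_caught:.0%}: cooperator {c:.2f}, defector {d:.2f}")
# With no chance of being caught, defection pays; raising the odds of detection
# flips the incentive, which is the role the essay assigns to morals,
# reputation, laws and technical security systems.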
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #105 on: August 19, 2011, 01:19:39 PM »

TTT for the attention of Richard Lighty
Logged
prentice crawford
Power User
***
Posts: 775


« Reply #106 on: September 08, 2011, 06:10:09 PM »

 

  Closest Human Ancestor May Rewrite Steps in Our Evolution
By Charles Q. Choi, LiveScience Contributor

  A startling mix of human and primitive traits found in the brains, hips, feet and hands of an extinct species identified last year make a strong case for it being the immediate ancestor to the human lineage, scientists have announced.

These new findings could rewrite long-standing theories about the precise steps human evolution took, they added, including the notion that early human female hips changed shape to accommodate larger-brained offspring. There is also new evidence suggesting that this species had the hands of a toolmaker.

Fossils of the extinct hominid known as Australopithecus sediba were accidentally discovered by the 9-year-old son of a scientist in the remains of a cave in South Africa in 2008, findings detailed by researchers last year. Australopithecus means "southern ape," and is a group that includes the iconic fossil Lucy, while sediba means "wellspring" in the South African language Sotho.

Two key specimens were discovered — a juvenile male as developed as a 10- to 13-year-old human and an adult female maybe in her late 20s or early 30s. The species is both a hominid and a hominin — hominids include humans, chimpanzees, gorillas and their extinct ancestors, while hominins include those species after Homo, the human lineage, split from that of chimpanzees.

To begin to see where Au. sediba might fit on the family tree, researchers pinned down the age of the fossils by dating the calcified sediments surrounding them with advanced uranium-lead dating techniques and a method called paleomagnetic dating, which measures how many times the Earth's magnetic field has reversed. They discovered the fossils were approximately 1.977 million years old, which predates the earliest appearances of traits specific to the human lineage Homo in the fossil record. This places Au. sediba in roughly the same age category as hominids such as Homo habilis and Homo rudolfensis, which were thought to be potential ancestors to Homo erectus, the earliest undisputed predecessor of modern humans.

"As the fossil record for early human ancestors increases, the need for more accurate dates is becoming paramount," said researcher Robyn Pickering at the University of Melbourne in Australia.

Small but humanlike brain

Most aspects of Au. sediba display an intriguing mix of both human and more primitive features that hint it might be an intermediary form between Australopithecus and Homo.

"The fossils demonstrate a surprisingly advanced but small brain, a very evolved hand with a long thumb like a human's, a very modern pelvis, but a foot and ankle shape never seen in any hominin species that combines features of both apes and humans in one anatomical package," said researcher Lee Berger, a paleoanthropologist at the University of Witwatersrand in Johannesburg, South Africa. "The many very advanced features found in the brain and body and the earlier date make it possibly the best candidate ancestor for our genus, the genus Homo, more so than previous discoveries such as Homo habilis."

The brain is often thought of as what distinguishes humanity from the rest of the animal kingdom, and the juvenile specimen of Au. sediba had an exceptionally well-preserved skull that could shed light on the pace of brain evolution in early hominins. To find out more, the researchers scanned the space in the skull where its brain would have been using the European Synchrotron Radiation Facility in Grenoble, France; the result is the most accurate scan ever produced for an early human ancestor, with a level of detail of up to 90 microns, or just below the size of a human hair.

The scan revealed Au. sediba had a much smaller brain than seen in human species, with an adult version maybe only as large as a medium-size grapefruit. However, it was humanlike in several ways — for instance, its orbitofrontal region directly behind the eyes apparently expanded in ways that make it more like a human's frontal lobe in shape. This area is linked in humans with higher mental functions such as multitasking, an ability that may contribute to human capacities for long-term planning and innovative behavior.

"We could be seeing the beginnings of those capabilities," researcher Kristian Carlson at the University of Witwatersrand told LiveScience.

These new findings cast doubt on the long-standing theory that brains gradually increased in size and complexity from Australopithecus to Homo. Instead, their findings corroborate an alternative idea — that Australopithecus brains did increase in complexity gradually, becoming more like Homo, and later increased in size relatively quickly.

Modern hips

This mosaic of modern and primitive traits held true with its hips as well. An analysis of the partial pelvis of the female Au. sediba revealed that it had modern, humanlike features.

"It is surprising to discover such an advanced pelvis in such a small-brained creature," said researcher Job Kibii at the University of the Witwatersrand.  "It is short and broad like a human pelvis ... parts of the pelvis are indistinguishable from that of humans."

Scientists had thought the humanlike pelvis evolved to accommodate larger-brained offspring. The new findings of humanlike hips in Au. sediba despite small-brained offspring suggest these pelvises may have instead initially evolved to help this hominin better wander across the landscape, perhaps as grasslands began to expand across its habitat.

When it came to walking, investigating the feet and ankles of the fossils revealed surprises about how Au. sediba might have strode across the world. No hominin ankle has ever been described with so many primitive and advanced features.

"If the bones had not been found stuck together, the team may have described them as belonging to different species," said researcher Bernhard Zipfel at the University of the Witwatersrand.

The researchers discovered that its ankle joint is mostly like a human's, with some evidence for a humanlike arch and a well-defined Achilles tendon, but its heel and shin bones appear to be mostly ape-like. This suggested the hominid probably climbed trees yet also walked in a unique way not exactly like that of humans.

Altogether, such anatomical traits would have allowed Au. sediba to walk in perhaps a more energy-efficient way, with tendons storing energy and returning that energy to the next step, said researcher Steve Churchill from Duke University in Durham, N.C. "These are the kinds of things that we see with the genus Homo," he explained.

What nice hands …

Finally, an analysis of Au. sediba's hands suggests it might have been a toolmaker. The fossils — including the most complete hand known in an early hominin, which is missing only a few bones and belonged to the mature female specimen — showed its hand was capable of the strong grasping needed for tree-climbing, but that it also had a long thumb and short fingers. These would have allowed it a precision grip useful for tools, one involving just the thumb and fingers, where the palm does not play an active part.

Altogether, the hand of Au. sediba has more features related to tool-making than that of the first human species thought of as a tool user, the "handy man" Homo habilis, said researcher Tracy Kivell at the Max Planck Institute for Evolutionary Anthropology in Germany. "This suggests to us that sediba may also have been a toolmaker."

Though the scientists haven't excavated the site in search of stone tools, "the hand and brain morphology suggest that Au. sediba may have had the capacity to manufacture and use complex tools," Kivell added.

The researchers do caution that although they suggest that Au. sediba was ancestral to the human lineage, all these apparent resemblances between it and us could just be coincidences, with this extinct species evolving similar traits to our lineages due, perhaps, to similar circumstances.

In fact, it might be just as interesting to imagine that Au. sediba was not directly ancestral to Homo, because it opens up the possibility "of independent evolution of the same sorts of features," Carlson said. "Whether or not it's on the same lineage as leading to Homo, I think there are interesting questions and implications."

The scientists detailed their findings in the Sept. 9 issue of the journal Science.

                                                P.C.
Logged

Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #107 on: September 24, 2011, 11:09:57 AM »

Matt Ridley is an author whom I follow.  I have read his "The Red Queen" (on the evolutionary reasons sex exists and the implications thereof) and "Nature via Nurture" (also quite brilliant, and a book which has triggered a shift in how I think about these things).



By MATT RIDLEY
The crowd-sourced, wikinomic cloud is the new, new thing that all management consultants are now telling their clients to embrace. Yet the cloud is not a new thing at all. It has been the source of human invention all along. Human technological advancement depends not on individual intelligence but on collective idea sharing, and it has done so for tens of thousands of years. Human progress waxes and wanes according to how much people connect and exchange.

When the Mediterranean was socially networked by the trading ships of Phoenicians, Greeks, Arabs or Venetians, culture and prosperity advanced. When the network collapsed because of pirates at the end of the second millennium B.C., or in the Dark Ages, or in the 16th century under the Barbary and Ottoman corsairs, culture and prosperity stagnated. When Ming China, or Shogun Japan, or Nehru's India, or Albania or North Korea turned inward and cut themselves off from the world, the consequence was relative, even absolute decline.

Knowledge is dispersed and shared. Friedrich Hayek was the first to point out, in his famous 1945 essay "The Use of Knowledge in Society," that central planning cannot work because it is trying to substitute an individual all-knowing intelligence for a distributed and fragmented system of localized but connected knowledge.

So dispersed is knowledge, that, as Leonard Read famously observed in his 1958 essay "I, Pencil," nobody on the planet knows how to make a pencil. The knowledge is dispersed among many thousands of graphite miners, lumberjacks, assembly line workers, ferrule designers, salesmen and so on. This is true of everything that I use in my everyday life, from my laptop to my shirt to my city. Nobody knows how to make it or to run it. Only the cloud knows.

One of the things I have tried to do in my book "The Rational Optimist" is to take this insight as far back into the past as I can—to try to understand when it first began to be true. When did human beings start to use collective rather than individual intelligence?

In doing so, I find that the entire field of anthropology and archaeology needs Hayek badly. Their debates about what made human beings successful, and what caused the explosive take-off of human culture in the past 100,000 years, simply never include the insight of dispersed knowledge. They are still looking for a miracle gene, or change in brain organization, that explains, like a deus ex machina, the human revolution. They are still looking inside human heads rather than between them.

 ."I think there was a biological change—a genetic mutation of some kind that promoted the fully modern ability to create and innovate," wrote the anthropologist Richard Klein in a 2003 speech to the American Association for the Advancement of Science. "The sudden expansion of the brain 200,000 years ago was a dramatic spontaneous mutation in the brain . . . a change in a single gene would have been enough," the neuroscientist Colin Blakemore told the Guardian in 2010.

There was no sudden change in brain size 200,000 years ago. We Africans—all human beings are descended chiefly from people who lived exclusively in Africa until about 65,000 years ago—had slightly smaller brains than Neanderthals, yet once outside Africa we rapidly displaced them (bar acquiring 2.5% of our genes from them along the way).

And the reason we won the war against the Neanderthals, if war it was, is staring us in the face, though it remains almost completely unrecognized among anthropologists: We exchanged. At one site in the Caucasus there are Neanderthal and modern remains within a few miles of each other, both from around 30,000 years ago. The Neanderthal tools are all made from local materials. The moderns' tools are made from chert and jasper, some of which originated many miles away. That means trade.

Evidence from recent Australian artifacts shows that long-distance movement of objects is a telltale sign of trade, not migration. We Africans have been doing this since at least 120,000 years ago. That's the date of beads made from marine shells found a hundred miles inland in Algeria. Trade is 10 times as old as agriculture.

At first it was a peculiarity of us Africans. It gave us the edge over Neanderthals in their own continent and their own climate, because good ideas can spread through trade. New weapons, new foods, new crafts, new ornaments, new tools. Suddenly you are no longer relying on the inventiveness of your own tribe or the capacity of your own territory. You are drawing upon ideas that occurred to anybody anywhere anytime within your trading network.

In the same way, today, American consumers do not have to rely only on their own citizens to discover new consumer goods or new medicines or new music: The Chinese, the Indians, the Brazilians are also able to supply them.

That is what trade does. It creates a collective innovating brain as big as the trade network itself. When you cut people off from exchange networks, their innovation rate collapses. Tasmanians, isolated by rising sea levels about 10,000 years ago, not only failed to share in the advances that came after that time—the boomerang, for example—but actually went backwards in terms of technical virtuosity. The anthropologist Joe Henrich of the University of British Columbia argues that in a small island population, good ideas died faster than they could be replaced. Tierra del Fuego's natives, on a similarly inhospitable and small land, but connected by trading canoes across the much narrower Magellan strait, suffered no such technological regress. They had access to a collective brain the size of South America.
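
Henrich's point lends itself to a toy simulation: if each generation imitates the most skilled person it can reach, and copying is noisy and on average lossy, then group size alone decides whether skill is kept or lost. The sketch below is my own illustration of that idea; every parameter in it is invented.

# Toy "collective brain": learners copy the best available model imperfectly.
import random

def simulate(pop_size, generations=100, seed=0):
    random.seed(seed)
    skill = 10.0                                    # skill level of the best model
    for _ in range(generations):
        copies = [skill + random.gauss(-2.0, 1.0)   # copying usually loses a little...
                  for _ in range(pop_size)]         # ...but occasionally overshoots
        skill = max(0.0, max(copies))               # the best copy becomes next model
    return skill

for n in (5, 50, 500):
    print(f"population {n:3d}: skill after 100 generations = {simulate(n):5.1f}")
# A handful of learners rarely produces an overshoot, so skill decays toward zero
# (the isolated-Tasmania case); a large connected population almost always does,
# so skill is maintained or grows (the Tierra del Fuego / mainland case).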

Which is of course why the Internet is such an exciting development. For the first time humanity has not just some big collective brains, but one truly vast one in which almost everybody can share and in which distance is no obstacle.

The political implications are obvious: that human collaboration is necessary for society to work; that the individual is not—and has not been for 120,000 years—able to support his lifestyle; that trade enables us to work for each other not just for ourselves; that there is nothing so antisocial (or impoverishing) as the pursuit of self-sufficiency; and that authoritarian, top-down rule is not the source of order or progress.

Hayek understood all this. And it's time most archaeologists and anthropologists, as well as some politicians and political scientists, did as well.

Mr. Ridley writes the Journal's weekly Mind & Matter column. He is the author of "The Rational Optimist: How Prosperity Evolves" (Harper, 2010). This op-ed is adapted from his Hayek Prize lecture, given under the auspices of the Manhattan Institute, to be delivered on Sept. 26.

Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #108 on: January 13, 2012, 08:31:36 AM »

This footage seems to me to be quite extraordinary. Apparently the crow has observed humans snowboarding and has taken up the sport himself.

Thus we see:

a) cross-species learning
b) the use of a tool
c) play

http://www.youtube.com/watch?v=3dWw9GLcOeA&feature=share
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #109 on: February 06, 2012, 07:38:47 AM »


By LIONEL TIGER

When the first phone line linked two New England towns, the inevitable arrogant scold asked if the people of town X had anything to say to the folks of town Y. His implication was "no." Why have more to do with (implicitly fallen) fellow humans than absolutely necessary? Why should technology abet friendliness?

Mr. Scold was wrong. One of the most successful magazine launches of the last decades was People, carefully and endlessly just about that, week in and week out, year after year. Europe boasts a strange menagerie of similar publications that ceaselessly chronicle the libidinous events in the lives of minor Scandinavian royalty and the housing buys and sells of soccer stars before and after their divorces. Magazines pay the price of a used fighter plane for the first photo of the baby of certified stars.

People want to know about this town and that other town too. It's their nature.

Primates always want to know what is going on. If it's over the hill where you can't see for sure what's up, that's even more stimulating and important to secure long-range survival. Primates are intensely interested in each other and other groups. It was pointed out in the 1960s that in some ground-living species, members of the group glanced at the lead primate every 20 or 30 seconds. Think Louis Quatorze or Mick Jagger. Look, look, look—people are always on the lookout.

The human who has most adroitly—if at first innocently, and in the next weeks most profitably—capitalized on this is Facebook founder Mark Zuckerberg.

"Facebook." Get it? Not FootBook or ElbowBook. The face. It gets you a driver's license and stars send it out to fans. We know that many users' first and classical impulse was acquiring convivial acquaintance with young women. Facebook married that ancient Darwinian urgency to a cheap, brilliantly lucid, and endlessly replicable technology.

The result has been virtually incalculable and not only for Mr. Zuckerberg's lunch money. Nearly one-sixth of Homo sapiens are on Facebook. Half of Americans over age 12 are on it. It is world-wide and has been joined by other tools of conviviality such as Twitter. Nearly 15% of Americans already belong to that new tribe. There are others.

Mr. Zuckerberg has re-primatized a group of humans of unprecedented number, diffusion and intensity. His product costs him virtually nothing to produce—it is simply us. We enter his shop, display ourselves as attractively or interestingly as we can, replenish ourselves hourly or daily or by the minute, and do it for nothing. Doesn't cost him a nickel.

And why? Just because we're primates with endlessly deep interest in each other, with a knack and need to groom each other—either physically, as monkeys do, or with "What a nice hairdo/dress/divorce/promotion!" as Facebookworms do. There is much to transmit between towns and between people.

Mr. Zuckerberg bestrides vast business numbers once dreamt of only by toothpaste and soft-drink makers. This reflects a new commercial demography in which the consumer is not someone who wants something necessary, but rather one who seeks to assert simply what he is. And the tool he uses in order to become nothing more or less than an efficient, interesting and socially prosperous primate is the Facebook page.

The technology is new but the passion for connection isn't. In Paris a hundred years ago pneumatic tubes ran all through the parts of town that could afford them so messages could be written and sent as if by courier. When I was a student in London, there were mail deliveries twice a day and in some environs three. Homo sapiens wants to know, to exchange, to show its face.

And when the counting houses work triple-time recording the riches from all this, it will be sweet comedy to remember that Mr. Zuckerberg became the richest primatologist in the world because he gave his customers nothing new, except the chance to be their old ape selves.

Mr. Tiger, an emeritus professor of anthropology at Rutgers, is the author of "The Decline of Males" (St. Martins, 2000) and, with Michael McGuire, of "God's Brain" (Prometheus Books, 2010).
Logged
bigdog
Power User
***
Posts: 2165


« Reply #110 on: March 21, 2012, 07:58:15 PM »

http://www.nytimes.com/2012/03/20/opinion/brooks-when-the-good-do-bad.html?_r=1&ref=davidbrooks


"According to this view, most people are naturally good, because nature is good. The monstrosities of the world are caused by the few people (like Hitler or Idi Amin) who are fundamentally warped and evil.

This worldview gives us an easy conscience, because we don’t have to contemplate the evil in ourselves. But when somebody who seems mostly good does something completely awful, we’re rendered mute or confused.

But of course it happens all the time. That’s because even people who contain reservoirs of compassion and neighborliness also possess a latent potential to commit murder."
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #111 on: March 22, 2012, 06:12:49 AM »

Indeed BD.  Coincidentally enough I am re-reading right now a rather thick book that is a collection of essays on the concept, Jungian and otherwise, of the shadow.
Logged
DougMacG
Power User
***
Posts: 5964


« Reply #112 on: April 26, 2012, 02:32:48 PM »

America’s false autism epidemic, by Dr. Allen Frances, professor emeritus at Duke University’s department of psychiatry

The apparent epidemic of autism is in fact the latest instance of the fads that litter the history of psychiatry.

We have a strong urge to find labels for disturbing behaviors; naming things gives us an (often false) feeling that we control them. So, time and again, an obscure diagnosis suddenly comes out of nowhere to achieve great popularity. It seems temporarily to explain a lot of previously confusing behavior — but then suddenly and mysteriously returns to obscurity.

Not so long ago, autism was the rarest of diagnoses, occurring in fewer than one in 2,000 people. Now the rate has skyrocketed to 1 in 88 in America (and to a remarkable 1 in 38 in Korea). And there is no end in sight.

Increasingly panicked, parents have become understandably vulnerable to quackery and conspiracy theories. The worst result has been a reluctance to vaccinate kids because of the thoroughly disproved and discredited suggestion that the shots can somehow cause autism.

There are also frantic (and probably futile) efforts to find environmental toxins that might be harming developing brains, explaining the sudden explosion of autism.

Anything is possible, but when rates rise this high and this fast, the best bet is always that there has been a change in diagnostic habits, not a real change in people or in the rate of illness.

So what is really going on to cause this “epidemic”?

Perhaps a third of the huge jump in rates can be explained by three factors: the much-increased public and provider awareness of autism, the much-reduced stigma associated with it and the fact that the definition of autism has been loosened to include milder cases.

Sixteen years ago, when we updated the DSM (the official manual of psych diagnoses) for the fourth edition, we expanded the definition of autism to include Asperger’s. At the time, we expected this to triple the rate of diagnosed cases; instead, it has climbed 20 times higher.
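
As a rough check on those figures (my arithmetic, not the author's):

# Fold-change implied by the prevalence numbers quoted in this piece.
old_rate = 1 / 2000      # "fewer than one in 2,000"
us_rate  = 1 / 88        # current US figure cited
kr_rate  = 1 / 38        # Korean figure cited

print(f"US increase vs. old rate:    {us_rate / old_rate:.1f}x")   # ~22.7x
print(f"Korea increase vs. old rate: {kr_rate / old_rate:.1f}x")   # ~52.6x
# The DSM-IV committee expected roughly a 3x rise from broadening the definition;
# the observed rise is far larger, which is the author's argument that diagnostic
# habits, not true incidence, are driving the numbers.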

That unexpected jump has three obvious causes. Most important, the diagnosis has become closely linked with eligibility for special school services.

Having the label can make the difference between being closely attended to in a class of four versus being lost in a class of 40. Kids who need special attention can often get it only if they are labeled autistic.

So the autism tent has been stretched to accommodate a wide variety of difficult learning, behavioral and social problems that certainly deserve help — but aren’t really autism. Probably as many as half of the kids labeled autistic wouldn’t really meet the DSM IV criteria if these were applied carefully.

Freeing autism from its too tight coupling with service provision would bring down its rates and end the “epidemic.” But that doesn’t mean that school services should also be reduced. The mislabeled problems are serious in their own right, and call out for help.

The second driver of the jump in diagnosis has been a remarkably active and successful consumer advocacy on autism, facilitated by the power of the Internet. This has had four big upsides: the identification of previously missed cases, better care and education for the identified cases, greatly expanded research and a huge reduction in stigma.

But there are two unfortunate downsides: Many people with the diagnosis don’t really meet the criteria for it, and the diagnosis has become so heterogeneous that it loses meaning and predictive value. This is why so many kids now outgrow their autism. They were never really autistic in the first place.

A third cause has been overstated claims coming from epidemiological research — studies of autism rates in the general population. For reasons of convenience and cost, the ratings in the studies always have to be done by lay interviewers, who aren’t trained as clinicians and so are unable to judge whether the elicited symptoms are severe and enduring enough to qualify as a mental disorder.

It’s important to understand that the rates reported in these studies are always upper limits, not true rates; they exaggerate the prevalence of autism by including people who’d be excluded by careful clinical interview. (This also explains why rates can change so quickly from year to year.)

So where do we stand, and what should we do? I am for a more careful and restricted diagnosis of autism that isn’t driven by service requirements. I am also for kids getting the school services they need.

The only way to achieve both goals is to reduce the inordinate power of the diagnosis of autism in determining who gets what educational service. Psychiatric diagnosis is devised for use in clinical settings, not educational ones. It may help contribute to educational decisions but should not determine them.

Human nature changes slowly, if at all, but the ways we label it can change fast and tend to follow fleeting fashions.

Dr. Allen Frances, now a professor emeritus at Duke University’s department of psychiatry, chaired the DSM IV task force.

Read more: http://www.nypost.com/p/news/opinion/opedcolumnists/america_false_autism_epidemic_jfI7XORH94IcUB795b6f7L#ixzz1tB0kPCdK
Logged
JDN
Power User
***
Posts: 2004


« Reply #113 on: April 27, 2012, 11:55:38 AM »

On a personal note, an acquaintance of mine at a local 4 year college sought and was diagnosed with Attention Deficit Disorder.  He was given private instruction, private tutors, extra time on tests, exemptions from certain requirements, etc.  At what cost?  And frankly, IMHO his degree is suspect.  My heart bleeds for the truly handicapped; but this has become ridiculous. 
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #114 on: July 25, 2012, 09:01:53 PM »



Silicon Valley Says Step Away From the Device
Tech firms are uneasy over the effect time online has on relationships.
By MATT RICHTEL
Published: July 23, 2012
 
Stuart Crabb, a director in the executive offices of Facebook, naturally likes to extol the extraordinary benefits of computers and smartphones. But like a growing number of technology leaders, he offers a warning: log off once in a while, and put them down.


In a place where technology is seen as an all-powerful answer, it is increasingly being seen as too powerful, even addictive.

The concern, voiced in conferences and in recent interviews with many top executives of technology companies, is that the lure of constant stimulation — the pervasive demand of pings, rings and updates — is creating a profound physical craving that can hurt productivity and personal interactions.

“If you put a frog in cold water and slowly turn up the heat, it’ll boil to death — it’s a nice analogy,” said Mr. Crabb, who oversees learning and development at Facebook. People “need to notice the effect that time online has on your performance and relationships.”

The insight may not sound revelatory to anyone who has joked about the “crackberry” lifestyle or followed the work of researchers who are exploring whether interactive technology has addictive properties.

But hearing it from leaders at many of Silicon Valley’s most influential companies, who profit from people spending more time online, can sound like auto executives selling muscle cars while warning about the dangers of fast acceleration.

“We’re done with this honeymoon phase and now we’re in this phase that says, ‘Wow, what have we done?’ ” said Soren Gordhamer, who organizes Wisdom 2.0, an annual conference he started in 2010 about the pursuit of balance in the digital age. “It doesn’t mean what we’ve done is bad. There’s no blame. But there is a turning of the page.”

At the Wisdom 2.0 conference in February, founders from Facebook, Twitter, eBay, Zynga and PayPal, and executives and managers from companies like Google, Microsoft, Cisco and others listened to or participated in conversations with experts in yoga and mindfulness. In at least one session, they debated whether technology firms had a responsibility to consider their collective power to lure consumers to games or activities that waste time or distract them.

The actual science of whether such games and apps are addictive is embryonic. But the Diagnostic and Statistical Manual of Mental Disorders, widely viewed as the authority on mental illnesses, plans next year to include “Internet use disorder” in its appendix, an indication researchers believe something is going on but that requires further study to be deemed an official condition.

Some people disagree there is a problem, even if they agree that the online activities tap into deep neurological mechanisms. Eric Schiermeyer, a co-founder of Zynga, an online game company and maker of huge hits like FarmVille, has said he has helped addict millions of people to dopamine, a neurochemical that has been shown to be released by pleasurable activities, including video game playing, but also is understood to play a major role in the cycle of addiction.

But what he said he believed was that people already craved dopamine and that Silicon Valley was no more responsible for creating irresistible technologies than, say, fast-food restaurants were responsible for making food with such wide appeal.

“They’d say: ‘Do we have any responsibility for the fact people are getting fat?’ Most people would say ‘no,’ ” said Mr. Schiermeyer. He added: “Given that we’re human, we already want dopamine.”

Along those lines, Scott Kriens, chairman of Juniper Networks, one of the biggest Internet infrastructure companies, said the powerful lure of devices mostly reflected primitive human longings to connect and interact, but that those desires needed to be managed so they did not overwhelm people’s lives.

“The responsibility we have is to put the most powerful capability into the world,” he said. “We do it with eyes wide open that some harm will be done. Someone might say, ‘Why not do so in a way that causes no harm?’ That’s naïve.”

“The alternative is to put less powerful capability in people’s hands and that’s a bad trade-off,” he added.

Mr. Crabb, the Facebook executive, said his primary concern was that people live balanced lives. At the same time, he acknowledges that the message can run counter to Facebook’s business model, which encourages people to spend more time online. “I see the paradox,” he said.

The emerging conversation reflects a broader effort in the valley to offer counterweights to the fast-paced lifestyle. Many tech firms are teaching meditation and breathing exercises to their staff members to help them slow down and disconnect.

At Cisco, Padmasree Warrior, the chief technology and strategy officer and its former head of engineering, a position where she oversaw 22,000 employees, said she regularly told people to take a break and a deep breath, and did so herself. She meditates every night and takes Saturday to paint and write poetry, turning off her phone or leaving it in the other room.

“It’s almost like a reboot for your brain and your soul,” she said. She added of her Saturday morning digital detox: “It makes me so much calmer when I’m responding to e-mails later.”

Kelly McGonigal, a psychologist who lectures about the science of self-control at the Stanford School of Medicine (and has been invited to lecture at the business school at Stanford), said she regularly talked with leaders at technology companies about these issues. She added that she was impressed that they had been open to discussing a potential downside of their innovations. “The people who are running these companies deeply want their technology and devices to enhance lives,” said Dr. McGonigal. “But they’re becoming aware of people’s inability to disengage.”

She also said she believed that interactive gadgets could create a persistent sense of emergency by setting off stress systems in the brain — a view that she said was becoming more widely accepted.

“It’s this basic cultural recognition that people have a pathological relationship with their devices,” she said. “People feel not just addicted, but trapped.”

Michelle Gale, who recently left her post as the head of learning and development at Twitter, said she regularly coached engineers and executives at the company that their gadgets had addictive properties.

“They said, ‘Wow, I didn’t know that.’ Or, ‘I guess I knew that but I don’t know what to do about it,’ ” recalled Ms. Gale, who regularly organized meditation and improvisation classes at Twitter to encourage people to let their minds wander.

Google has started a “mindfulness” movement at the company to teach employees self-awareness and to improve their ability to focus. Richard Fernandez, an executive coach at Google and one of the leaders of the mindfulness movement, said the risks of being overly engaged with devices were immense.

“It’s nothing less than everything,” he said, adding that if people can find time to occasionally disconnect, “we can have more intimate and authentic relationships with ourselves and those we love in our communities.”

Google, which owns YouTube, earns more ad revenue as people stay online longer. But Mr. Fernandez, echoing others in Silicon Valley, said they were not in business to push people into destructive behavior.

“Consumers need to have an internal compass where they’re able to balance the capabilities that technology offers them for work, for search, with the qualities of the lives they live offline,” he said.

“It’s about creating space, because otherwise we can be swept away by our technologies.”

Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #115 on: September 16, 2012, 03:35:11 PM »

Proposition:

"Ontogeny recapitulates philogeny."

True or false?


In a somewhat related vein:

http://en.wikipedia.org/wiki/Lamarckism   I remember reading a comment many years ago that criticized something Konrad Lorenz had said as being Lamarckian, but this past year I read Matt Ridley's "Nature via Nurture" -- a book I found quite exciting, though certain passages went right over my head with nary a look back -- and it seemed to me to resurrect the question.  In a related vein, there is this: http://en.wikipedia.org/wiki/Epigenetics
« Last Edit: September 16, 2012, 03:53:55 PM by Crafty_Dog » Logged
objectivist1
Power User
***
Posts: 557


« Reply #116 on: September 16, 2012, 03:45:00 PM »

This is from Wikipedia, which I know is not necessarily the authoritative source, but I've also read articles by modern biologists which state that this theory is not valid:

Haeckel

Ernst Haeckel attempted to synthesize the ideas of Lamarckism and Goethe's Naturphilosophie with Charles Darwin's concepts. Haeckel is often seen as having rejected Darwin's theory of branching evolution in favor of a more linear, Lamarckian "biogenetic law" of progressive evolution, but this is not accurate: he used the Lamarckian picture to describe the ontogenic and phylogenic history of individual species, while agreeing with Darwin that all species branched from one, or a few, original ancestors.[18] Since around the start of the twentieth century, Haeckel's "biogenetic law" has been refuted on many fronts.[7]
Haeckel formulated his theory as "Ontogeny recapitulates phylogeny". The notion later became simply known as the recapitulation theory. Ontogeny is the growth (size change) and development (shape change) of an individual organism; phylogeny is the evolutionary history of a species. Haeckel's recapitulation theory claims that the development of advanced species passes through stages represented by adult organisms of more primitive species.[7] Otherwise put, each successive stage in the development of an individual represents one of the adult forms that appeared in its evolutionary history.
For example, Haeckel proposed that the pharyngeal grooves between the pharyngeal arches in the neck of the human embryo resembled gill slits of fish, thus representing an adult "fishlike" developmental stage as well as signifying a fishlike ancestor. Embryonic pharyngeal slits, which form in many animals when the thin branchial plates separating pharyngeal pouches and pharyngeal grooves perforate, open the pharynx to the outside. Pharyngeal arches appear in all tetrapod embryos: in mammals, the first pharyngeal arch develops into the lower jaw (Meckel's cartilage), the malleus and the stapes. But these embryonic pharyngeal arches, grooves, pouches, and slits in human embryos could not at any stage carry out the same function as the gills of an adult fish.
Haeckel produced several embryo drawings that often overemphasized similarities between embryos of related species. The misinformation was propagated through many biology textbooks and in popular knowledge, and it persists even today. Modern biology rejects the literal and universal form of Haeckel's theory.[8]
Haeckel's drawings were disputed by Wilhelm His, who had developed a rival theory of embryology.[19] His developed a "causal-mechanical theory" of human embryonic development.[20]
Darwin's view, that early embryonic stages are similar to the same embryonic stage of related species but not to the adult stages of these species, has been confirmed by modern evolutionary developmental biology.
Modern status

The Haeckelian form of recapitulation theory is now considered defunct.[21] However, embryos do undergo a period where their morphology is strongly shaped by their phylogenetic position, rather than selective pressures.[22]
Logged

"You have enemies?  Good.  That means that you have stood up for something, sometime in your life." - Winston Churchill.
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #117 on: February 21, 2013, 04:30:11 PM »



http://www.pointofinquiry.org/
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #118 on: April 05, 2013, 11:01:04 AM »


http://www.dickmorris.com/chinas-secret-genetic-engineering-dick-morris-tv-lunch-alert/?utm_source=dmreports&utm_medium=dmreports&utm_campaign=dmreports
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #119 on: April 22, 2013, 11:05:34 PM »



WSJ
By WILLIAM Y. BROWN

DNA was the topic of U.S. Supreme Court argument on April 15. Can a gene be patented if it occurs in nature—which is generally grounds for exclusion—but has been identified by an individual scientist or company and removed from the cells in which it occurs? Lower courts are split on the matter, and the justices didn't tip their hands.

But whether a gene can be patented will be irrelevant if it disappears before anyone has identified it. That is what's happening now and will continue to happen—at a rate perhaps 100 to 200 times faster than in prehistoric days—due to modern man's outsize influence on nature and encroachment on habitat. Unless we have sequenced a species' DNA, extinction means gone forever and never really known. Preservation of the DNA is the simpler, cheaper route, with sequencing to follow. If the Library of Congress is where every book is stored, the world needs the equivalent for species DNA.

Preserving the DNA of known species would provide genetic libraries for research and commerce and for recovery of species that are endangered—the Amur Leopard and the Northern Right Whale, for example. Preservation would also offer the potential to restore species that have gone extinct. We currently lack preserved DNA for most of the 1.9 million species that have been named, but 1.9 million is fewer than the number of people in Houston. No doubt additional species exist, but their DNA can be preserved as they are named. The job is doable.

Just a small fraction of species are maintained as living organisms in cultivation or captivity or are kept frozen as viable seeds or cells. These are the best, because whole, reproducing organisms can be grown from them by planting or cloning. Botanical gardens and zoos keep the living stuff. The Millennium Seed Bank at Kew Gardens in England is on course to preserve frozen seeds of all vascular plant species, and the Svalbard Seed Vault in Norway is taking seed duplicates from other facilities. The San Diego "Frozen Zoo" has some 20,000 viable cell cultures representing 1,000 vertebrate species, including "Lonesome George," the last Pinta Island Galapagos tortoise, which expired last year. Its DNA would have disintegrated if the Frozen Zoo hadn't mounted a heroic effort after the tortoise's death to get a sample.

For a fraction more species, DNA is kept at low temperature in dead cells or in extracted form. The American Museum of Natural History in New York keeps 70,000 samples in liquid nitrogen, the Academy of Natural Sciences in Philadelphia has frozen samples for 4,000 bird species, and the National Museum of Natural History at the Smithsonian has embarked on an ambitious course to freeze species tissues.

Yet the DNA of most species is still not preserved. We need a plan. One might think that preserving the DNA of life on earth would cost a moonshot of money. But a viable cell culture in liquid nitrogen for a species at the Frozen Zoo costs only $200 to $300 to establish and just $1 a year to maintain. Multiplying $250 per species by 1.9 million species comes to $475 million, ignoring what has already been done. The U.S. pays more than twice that daily on the national debt. But let's be real: nobody is throwing new money around, even when the priority is obvious.
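A quick back-of-the-envelope check of that arithmetic, written out as a few lines of Python purely for illustration (the figures come from the paragraph above, not from any dataset):

# Back-of-the-envelope cost of preserving DNA for every named species,
# using the figures quoted above.
NAMED_SPECIES = 1_900_000      # roughly 1.9 million named species
SETUP_COST = 250               # dollars per species, midpoint of the $200-$300 estimate
ANNUAL_UPKEEP = 1              # dollars per species per year in liquid nitrogen

one_time_total = NAMED_SPECIES * SETUP_COST    # 475,000,000
yearly_total = NAMED_SPECIES * ANNUAL_UPKEEP   # 1,900,000

print(f"One-time setup: ${one_time_total:,}")
print(f"Annual upkeep:  ${yearly_total:,}")

Even the ongoing upkeep, about $1.9 million a year at those rates, is small next to the one-time setup cost.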

There is another way that could work, and would be much cheaper. First, we could develop a website to track preservation progress, with the key information managed directly by the contributing facilities. It would be a "wiki" site for DNA repositories, and many keepers would be delighted to share information if they could manage it themselves. They could both update their holdings and let people know which species they will take and under what conditions.
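To make that idea concrete, here is a minimal sketch, in Python, of what one facility's self-managed entry on such a site might contain. The field names and the example values are hypothetical, invented for illustration rather than taken from any existing repository system:

from dataclasses import dataclass

@dataclass
class HoldingsRecord:
    """One repository's self-reported entry on a hypothetical DNA-tracking wiki."""
    facility: str              # e.g. "San Diego Frozen Zoo"
    species: str               # binomial name of the preserved species
    sample_type: str           # "viable cell culture", "frozen tissue", "extracted DNA", ...
    will_accept_more: bool     # whether the facility will take new specimens of this kind
    deposit_conditions: str    # free-text conditions contributors must meet

# The contributing facility, not a central curator, creates and updates its own records:
record = HoldingsRecord(
    facility="San Diego Frozen Zoo",
    species="Chelonoidis abingdonii",   # Lonesome George's species, used only as an example
    sample_type="viable cell culture",
    will_accept_more=True,
    deposit_conditions="contact curator; collection and export permits required",
)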

Second, we can establish new incentives and mandates for contributing specimens, including grant, publication and permit requirements. Some grant makers and publications already require that DNA information be shared with a genetic information bank kept by the National Institutes of Health. Why not tissue too?

Third, donors who care could help develop and fund "citizen science" projects of museums and nonprofit groups to collect, identify and contribute specimens to repositories. The collections would grow, and so might public connection to nature. At the end of it all, we will preserve what we appreciate. And patent lawyers will be happy too, because they'll have something to fight about.

Mr. Brown, a former president of the Academy of Natural Sciences, is a senior fellow at the Brookings Institution.
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #120 on: May 06, 2013, 03:03:21 PM »



http://www.upworthy.com/2-monkeys-were-paid-unequally-see-what-happens-next?g=2&c=upw1
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #121 on: May 16, 2013, 08:45:35 AM »

http://www.dailymail.co.uk/health/article-2325414/Men-physically-strong-likely-right-wing-political-views.html
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #122 on: June 20, 2013, 07:07:42 AM »

Hat tip to David Gordon


Wired for Culture: Origins of the Human Social Mind by Mark Pagel
W. W. Norton & Company | 2012 | ISBN: 0393065871, 0393344207 | English | 432 pages

A fascinating, far-reaching study of how our species' innate capacity for culture altered the course of our social and evolutionary history.

A unique trait of the human species is that our personalities, lifestyles, and worldviews are shaped by an accident of birth—namely, the culture into which we are born. It is our cultures and not our genes that determine which foods we eat, which languages we speak, which people we love and marry, and which people we kill in war. But how did our species develop a mind that is hardwired for culture—and why?

Evolutionary biologist Mark Pagel tracks this intriguing question through the last 80,000 years of human evolution, revealing how an innate propensity to contribute and conform to the culture of our birth not only enabled human survival and progress in the past but also continues to influence our behavior today. Shedding light on our species’ defining attributes—from art, morality, and altruism to self-interest, deception, and prejudice—Wired for Culture offers surprising new insights into what it means to be human.
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #123 on: September 05, 2013, 08:10:17 AM »

The Decline of Violence
http://www.ted.com/talks/steven_pinker_on_the_myth_of_violence.html

Non-Zero Sum
http://www.ted.com/talks/robert_wright_on_optimism.html


« Last Edit: September 05, 2013, 08:16:11 AM by Crafty_Dog » Logged
ccp
Power User
***
Posts: 4090


« Reply #124 on: October 16, 2013, 09:32:51 AM »

Not long ago many people wondered if we are still "evolving."  How can we be, if there is no survival of the fittest?  Even those who are not "fit" still get to survive and reproduce in our society.

Now it is clear.  Not only are we evolving, but evolution will accelerate.  We will soon begin to control our evolution and accelerate it -- from simply choosing the sex of babies, to removing flawed DNA, to inserting chosen DNA.  Parents will be able to view menus of traits.  You want your son to be tall and athletic?  How about an IQ of 180?  How about an extrovert?  High energy?

No problem.   

Not only will evolution accelerate so that we develop master races of humans -- we will be controlling it.

Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #125 on: October 16, 2013, 11:12:01 AM »

There is, in truth, a terror in the world.  Under the hum of the miraculous machines and the ceaseless publications of the brilliant physicists a silence waits and listens and is heard.

It is the silence of apprehension.  We do not trust our time, and the reason we do not trust our time is that it is we who have made the time, and we do not trust ourselves.  We have played the hero's part, mastered the monsters, accomplished the labors, become gods -- and we do not trust ourselves as gods.  We know what we are.

In the old days the gods were someone else; the knowledge of what we are did not frighten us.   There were Furies to pursue the Hitlers, and Athenas to restore the Truth.  But now that we are gods ourselves we bear the knowledge for ourselves-- like that old Greek hero who learned when all his labors had been accomplished that it was he himself who had killed his son.
Logged
Crafty_Dog
Administrator
Power User
*****
Posts: 31267


« Reply #126 on: May 16, 2014, 12:44:15 AM »

http://www.ancient-origins.net/human-origins-science/human-skull-challenges-out-africa-theory-001283
Logged