Dog Brothers Public Forum
Topic: Philosophy
January 06, 2007, 09:32:52 AM
We kick off a new thread with this piece:
January 2, 2007
Free Will: Now You Have It, Now You Don’t
By DENNIS OVERBYE
I was a free man until they brought the dessert menu around. There was one of
those molten chocolate cakes, and I was suddenly being dragged into a vortex,
swirling helplessly toward caloric doom, sucked toward the edge of a black
(chocolate) hole. Visions of my father’s heart attack danced before my glazed
eyes. My wife, Nancy, had a resigned look on her face.
The outcome, endlessly replayed whenever we go out, is never in doubt, though I
often cover my tracks by offering to split my dessert with the table. O.K., I can
imagine what you’re thinking. There but for the grace of God.
Having just lived through another New Year’s Eve, many of you have just resolved
to be better, wiser, stronger and richer in the coming months and years. After
all, we’re free humans, not slaves, robots or animals doomed to repeat the same
boring mistakes over and over again. As William James wrote in 1890, the whole
“sting and excitement” of life comes from “our sense that in it things are really
being decided from one moment to another, and that it is not the dull rattling off
of a chain that was forged innumerable ages ago.” Get over it, Dr. James. Go get
yourself fitted for a new chain-mail vest. A bevy of experiments in recent years
suggest that the conscious mind is like a monkey riding a tiger of subconscious
decisions and actions in progress, frantically making up stories about being in
control.
As a result, physicists, neuroscientists and computer scientists have joined the
heirs of Plato and Aristotle in arguing about what free will is, whether we have
it, and if not, why we ever thought we did in the first place.
“Is it an illusion? That’s the question,” said Michael Silberstein, a science
philosopher at Elizabethtown College in Pennsylvania. Another question, he added,
is whether talking about this in public will fan the culture wars.
“If people freak at evolution, etc.,” he wrote in an e-mail message, “how much
more will they freak if scientists and philosophers tell them they are nothing
more than sophisticated meat machines, and is that conclusion now clearly
warranted or is it premature?”
Daniel C. Dennett, a philosopher and cognitive scientist at Tufts University who
has written extensively about free will, said that “when we consider whether free
will is an illusion or reality, we are looking into an abyss. What seems to
confront us is a plunge into nihilism and despair.”
Mark Hallett, a researcher with the National Institute of Neurological Disorders
and Stroke, said, “Free will does exist, but it’s a perception, not a power or a
driving force. People experience free will. They have the sense they are free.
“The more you scrutinize it, the more you realize you don’t have it,” he said.
That is hardly a new thought. The German philosopher Arthur Schopenhauer said, as
Einstein paraphrased it, that “a human can very well do what he wants, but cannot
will what he wants.”
Einstein, among others, found that a comforting idea. “This knowledge of the
non-freedom of the will protects me from losing my good humor and taking much too
seriously myself and my fellow humans as acting and judging individuals,” he said.
How comforted or depressed this makes you might depend on what you mean by free
will. The traditional definition is called “libertarian” or “deep” free will. It
holds that humans are free moral agents whose actions are not predetermined. This
school of thought says in effect that the whole chain of cause and effect in the
history of the universe stops dead in its tracks as you ponder the dessert menu.
At that point, anything is possible. Whatever choice you make is unforced and
could have been otherwise, but it is not random. You are responsible for any
damage to your pocketbook and your arteries.
“That strikes many people as incoherent,” said Dr. Silberstein, who noted that
every physical system that has been investigated has turned out to be either
deterministic or random. “Both are bad news for free will,” he said. So if human
actions can’t be caused and aren’t random, he said, “It must be — what — some
weird magical power?”
People who believe already that humans are magic will have no problem with that.
But whatever that power is — call it soul or the spirit — those people have to
explain how it could stand independent of the physical universe and yet reach from
the immaterial world and meddle in our own, jiggling brain cells that lead us to
say the words “molten chocolate.”
A vote in favor of free will comes from some physicists, who say it is a
prerequisite for inventing theories and planning experiments.
That is especially true when it comes to quantum mechanics, the strange
paradoxical theory that ascribes a microscopic randomness to the foundation of
reality. Anton Zeilinger, a quantum physicist at the University of Vienna, said
recently that quantum randomness was “not a proof, just a hint, telling us we have
free will.”
Is there any evidence beyond our own intuitions and introspections that humans
work that way?
Two Tips of the Iceberg
In the 1970s, Benjamin Libet, a physiologist at the University of California, San
Francisco, wired up the brains of volunteers to an electroencephalogram and told
the volunteers to make random motions, like pressing a button or flicking a
finger, while he noted the time on a clock.
Dr. Libet found that brain signals associated with these actions occurred half a
second before the subject was conscious of deciding to make them.
The order of brain activities seemed to be perception of motion, and then
decision, rather than the other way around.
In short, the conscious brain was only playing catch-up to what the unconscious
brain was already doing. The decision to act was an illusion, the monkey making up
a story about what the tiger had already done.
Dr. Libet’s results have been reproduced again and again over the years, along
with other experiments that suggest that people can be easily fooled when it comes
to assuming ownership of their actions. Patients with tics or certain diseases,
like chorea, cannot say whether their movements are voluntary or involuntary, Dr.
Hallett said.
In some experiments, subjects have been tricked into believing they are responding
to stimuli they couldn’t have seen in time to respond to, or into taking credit or
blame for things they couldn’t have done. Take, for example, the “voodoo
experiment” by Dan Wegner, a psychologist at Harvard, and Emily Pronin of
Princeton. In the experiment, two people are invited to play witch doctor.
One person, the subject, puts a curse on the other by sticking pins into a doll.
The second person, however, is in on the experiment, and by prior arrangement with
the doctors, acts either obnoxious, so that the pin-sticker dislikes him, or nice.
After a while, the ostensible victim complains of a headache. In cases in which he
or she was unlikable, the subject tended to claim responsibility for causing the
headache, an example of the “magical thinking” that makes baseball fans put on
their rally caps.
“We made it happen in a lab,” Dr. Wegner said.
Is a similar sort of magical thinking responsible for the experience of free will?
“We see two tips of the iceberg, the thought and the action,” Dr. Wegner said,
“and we draw a connection.”
But most of the action is going on beneath the surface. Indeed, the conscious mind
is often a drag on many activities. Too much thinking can give a golfer the yips.
Drivers perform better on automatic pilot. Fiction writers report writing in a
kind of trance in which they simply take dictation from the voices and characters
in their head, a grace that is, alas, rarely if ever granted nonfiction writers.
Naturally, almost everyone has a slant on such experiments and whether or not the
word “illusion” should be used in describing free will. Dr. Libet said his results
left room for a limited version of free will in the form of a veto power over what
we sense ourselves doing. In effect, the unconscious brain proposes and the mind
disposes.
In a 1999 essay, he wrote that although this might not seem like much, it was
enough to satisfy ethical standards. “Most of the Ten Commandments are ‘do not’
orders,” he wrote.
But that might seem a pinched and diminished form of free will.
Dr. Dennett, the Tufts professor, is one of many who have tried to redefine free
will in a way that involves no escape from the materialist world while still
offering enough autonomy for moral responsibility, which seems to be what everyone
cares about.
The belief that the traditional intuitive notion of a free will divorced from
causality is inflated, metaphysical nonsense, Dr. Dennett says, reflecting an
outdated dualistic view of the world.
Rather, Dr. Dennett argues, it is precisely our immersion in causality and the
material world that frees us. Evolution, history and culture, he explains, have
endowed us with feedback systems that give us the unique ability to reflect and
think things over and to imagine the future. Free will and determinism can
co-exist.
“All the varieties of free will worth having, we have,” Dr. Dennett said.
“We have the power to veto our urges and then to veto our vetoes,” he said. “We
have the power of imagination, to see and imagine futures.”
In this regard, causality is not our enemy but our friend, giving us the ability
to look ahead and plan. “That’s what makes us moral agents,” Dr. Dennett said.
“You don’t need a miracle to have responsibility.”
Other philosophers disagree on the degree and nature of such “freedom.” Their
arguments partly turn on the extent to which collections of things, whether
electrons or people, can transcend their origins and produce novel phenomena.
These so-called emergent phenomena, like brains and stock markets, or the idea of
democracy, grow naturally in accordance with the laws of physics, so the story
goes. But once they are here, they play by new rules, and can even act on their
constituents, as when an artist envisions a teapot and then sculpts it — a concept
sometimes known as “downward causation.” A knowledge of quarks is no help in
predicting hurricanes — it’s physics all the way down. But does the same apply to
the stock market or to the brain? Are the rules elusive just because we can’t
solve the equations or because something fundamentally new happens when we
increase numbers and levels of complexity?
Opinions vary about whether it will ultimately prove to be physics all the way
down, total independence from physics, or some shade in between, and thus how free
we are. Dr. Silberstein, the Elizabethtown College professor, said, “There’s
nothing in fundamental physics by itself that tells us we can’t have such emergent
properties when we get to different levels of complexities.”
He waxed poetic as he imagined how the universe would evolve, with more and
more complicated forms emerging from primordial quantum muck as from an elaborate
computer game, in accordance with a few simple rules: “If you understand, you
ought to be awestruck, you ought to be bowled over.”
George R. F. Ellis, a cosmologist at the University of Cape Town, said that
freedom could emerge from this framework as well. “A nuclear bomb, for example,
proceeds to detonate according to the laws of nuclear physics,” he explained in an
e-mail message. “Whether it does indeed detonate is determined by political and
ethical considerations, which are of a completely different order.”
I have to admit that I find these kinds of ideas inspiring, if not liberating. But
I worry that I am being sold a sort of psychic perpetual motion machine. Free
wills, ideas, phenomena created by physics but not accountable to it. Do they
offer a release from the chains of determinism or just a prescription for a very
intricate weave of the links?
And so I sought clarity from mathematicians and
computer scientists. According to deep mathematical principles, they say, even
machines can become too complicated to predict their own behavior and would labor
under the delusion of free will.
If by free will we mean the ability to choose, even a simple laptop computer has
some kind of free will, said Seth Lloyd, an expert on quantum computing and
professor of mechanical engineering at the Massachusetts Institute of Technology.
Every time you click on an icon, he explained, the computer’s operating system
decides how to allocate memory space, based on some deterministic instructions.
But, Dr. Lloyd said, “If I ask how long will it take to boot up five minutes from
now, the operating system will say ‘I don’t know, wait and see, and I’ll make
decisions and let you know.’ ”
Why can’t computers say what they’re going to do? In 1931, the Austrian
logician Kurt Gödel proved that in any consistent formal system rich enough to
include arithmetic, there are statements that cannot be proven either true or
false; a few years later Alan Turing extended the idea to a kind of idealized
computer now called a Turing machine. Among the troublesome statements are
self-referential ones, like the famous paradox attributed to the Cretan
philosopher Epimenides, who said that all Cretans are liars: if he is telling the
truth, then, as a Cretan, he is lying.
One implication is that no system can contain a complete representation of itself,
or as Janna Levin, a cosmologist at Barnard College of Columbia University and
author of the 2006 novel about Gödel, “A Madman Dreams of Turing Machines,” said:
“Gödel says you can’t program intelligence as complex as yourself. But you can let
it evolve. A complex machine would still suffer from the illusion of free will.”
Another implication is that there is no algorithm, or recipe for computation, to
determine when or if any given computer program will finish some calculation. The
only way to find out is to set it computing and see what happens. Any way to find
out would be tantamount to doing the calculation itself.
“There are no shortcuts in computation,” Dr. Lloyd said.
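Dr. Lloyd’s “no shortcuts” point can be made concrete — this sketch is mine, not from the article — with the Collatz iteration, a textbook example of a computation whose behavior has no known predictive formula: the only known general way to learn how long it runs is to run it and watch.

```python
def collatz_steps(n: int) -> int:
    """Count iterations of the 3n+1 map until n reaches 1.

    No known formula predicts this count from n alone; the only
    known general method is to run the iteration itself --
    Lloyd's "wait and see."
    """
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

# Neighboring starting points behave wildly differently:
# 26 settles quickly, while 27 wanders for over a hundred steps.
print(collatz_steps(26))  # 10
print(collatz_steps(27))  # 111
```

The jump from 10 steps at n=26 to 111 steps at n=27 is the point: nothing short of simulating the process tells you which behavior you will get.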
That means that the more reasonably you try to act, the more unpredictable you
are, at least to yourself, Dr. Lloyd said. Even if your wife knows you will order
the chiles rellenos, you have to live your life to find out.
To him that sounds like free will of a sort, for machines as well as for us. Our
actions are determined, but so what? We still don’t know what they will be until
the waiter brings the tray.
That works for me, because I am comfortable with so-called physicalist reasoning,
and I’m always happy to leverage concepts of higher mathematics to cut through a
knotty philosophical debate.
The Magician’s Spell
So what about Hitler?
The death of free will, or its exposure as a convenient illusion, some worry,
could wreak havoc on our sense of moral and legal responsibility. According to
those who believe that free will and determinism are incompatible, Dr. Silberstein
said in an e-mail message, it would mean that “people are no more responsible for
their actions than asteroids or planets.” Anything would go.
Dr. Wegner of Harvard said: “We worry that explaining evil condones it. We have to
maintain our outrage at Hitler. But wouldn’t it be nice to have a theory of evil
in advance that could keep him from coming to power?”
He added, “A system a bit more focused on helping people change rather than paying
them back for what they’ve done might be a good thing.”
Dr. Wegner said he thought that exposing free will as an illusion would have
little effect on people’s lives or on their feelings of self-worth. Most of them
would remain in denial.
“It’s an illusion, but it’s a very persistent illusion; it keeps coming back,” he
said, comparing it to a magician’s trick that has been seen again and again. “Even
though you know it’s a trick, you get fooled every time. The feelings just don’t
go away.”
In an essay about free will in 1999, Dr. Libet wound up quoting the writer Isaac
Bashevis Singer, who once said in an interview with the Paris Review, “The
greatest gift which humanity has received is free choice. It is true that we are
limited in our use of free choice. But the little free choice we have is such a
great gift and is potentially worth so much that for this itself, life is
worthwhile living.”
I could skip the chocolate cake, I really could, but why bother? Waiter!
Correction: January 4, 2007
An article in Science Times on Tuesday about the debate over free will misstated
the location of Elizabethtown College, where Michael Silberstein, who commented
on free will and popular culture, is a science philosopher. It is in
Pennsylvania, not Maryland.
Reply #1 on: January 06, 2007, 12:25:00 PM
Homo sapiens is closer to "free will" than lower species. I was searching last evening on the smartest animals in the world. It seems to be a somewhat subjective analysis, but what came up was chimps, other apes, whales, dolphins, dogs, cats, crows, mice, octopuses, and elephants.
Just one thought. Star Trek's Spock (Vulcan) would be further along the evolutionary scale, toward a living organism with more free will than us. He operates on logic, and his choices are free of emotions, urges, and feelings. One has to wonder how in the world he could stand working and living alongside humans!
Reply #2 on: January 06, 2007, 06:30:50 PM
Hmmm... free will?
Was it free will of me to post here and now or did I simply have to?
Was it destined of me to read this thread?
Was I destined to write an answer without much content?
I guess I pretty much believe in free will, but I also believe that there are A LOT of forces within us that allow us to question whether we act upon free will. Free choice seems simple and natural on the surface, but decision-making seems to be a rather complicated process reaching deep within the subconscious material we're made of. Very interesting topic.
"En un lugar de la Mancha, de cuyo nombre no quiero
acordarme, no ha mucho tiempo que vivía un hidalgo de los de
lanza en astillero, adarga antigua, rocín flaco y galgo corredor."
Reply #3 on: April 10, 2009, 11:52:19 AM
Some thoughts on philosophy about our culture:
I was watching a cable show and the topic was that school prayer is offensive to some.
I am wondering: if we don't have religion, then where do we get values from?
Those who are against any form of religion in the public domain, school, government, courts, etc are I think called secularists.
They don't believe in a higher power, a God, or a power greater than us.
So what do they believe in and where are we/our children supposed to get values that makes us good citizens, neighbors, family members, friends?
I conclude that their values are what would be so-called political correctness. Political correctness transcends previous values, religion. Live and let live, make love not war. Do not dare offend anyone for their lifestyle. In the extreme it is even more. We are not responsible for anything. Murderers did not decide to kill. *They* are the victims of bad genes, bad upbringing, childhood abuse, etc.
Bin Laden is a killer, but the US in its arrogance and capitalistic imperialism brought on his hatred.
All lifestyles are OK. Anyone who disagrees with that is not.
These are the values, the "codes" the left has decided we should all live by.
Political correctness is the new "Ten Commandments" taught by the liberal educational majority to children now.
It is only by progression, by extension, that socialism, fascism or the like also is taught.
Capitalism is not correct.
Democracy only is correct as long as the majority hold to political correctness.
Freedom of speech is only acceptable to the extent it is politically correct.
Reply #4 on: July 15, 2012, 05:00:04 PM
A philosopher for the Facebook generation
Jean-Jacques Rousseau wrote hundreds of pages about himself – this is what is tragi-comic about him – yet he had no self-knowledge
By Theodore Dalrymple
6:30PM BST 30 Jun 2012
"Here I am, then, alone in the world, with no longer a brother, neighbour, or friend, but only myself, for company. The most sociable and loving of humans has been banished from society by unanimous agreement."
Thus Jean-Jacques Rousseau at the beginning of his Reveries of a Solitary Walker, his last work. It might be said to be the founding document of the age of self-pity: has anyone come after him so lacking in compassion that he feels not pity for himself?
The tercentenary of Rousseau’s birth has just passed. He was born in Geneva (remember this when someone asks you with sarcasm to name three great Swiss) on June 28 1712, the son of a watchmaker, and died at Erménonville, France on July 2 1778. In his 66 years of life he became one of the most influential, which is not necessarily to say one of the best or clearest, thinkers of all time. His trajectory was truly remarkable, and his work still arouses passionate controversy. For example, was he a libertarian or an incipient totalitarian? No one is neutral about Rousseau. I don’t think I have ever heard anyone say of him “well, on the one hand, but on the other…”
His initial passion was music, and his first publication a new system of musical notation which he presented to the French Academy of Sciences. He then published a Dissertation on Modern Music, contributed articles on music to Diderot’s great Encyclopaedia and composed an opera, The Village Soothsayer, that was actually performed at the Paris opera.
But of course it is for his philosophical ideas that he is principally remembered, revered and reviled. Whether his profound influence upon history and society was the result of the truth of what he said, or of its convenience for the people who followed him, may be questioned.
When someone reads the opening words of The Social Contract, for example, namely that Man is born free and everywhere is in chains, does he think, “Gosh, that is true, I never thought of that before!” or does he think, “I wish I were free of all the irritating restraints on my behaviour that prevent me from doing exactly as I choose”?
Rousseau was so contradictory that what we take from him depends almost as much on us as on him. He is a kind of lightning conductor for our desires. Democrats see in his concept of “the general will” the notion of popular sovereignty; aspiring dictators see in it something they believe that they embody, a semi-mystical entity that is independent of any individual’s will, much less that of the numerical majority, and of which he is merely the inspired mouthpiece, as it were.
Rousseau was genuinely revolutionary in the way in which he overturned the notion of Original Sin. For most thinkers before him the question was how Man was to be made good, given his bad or imperfect nature; for Rousseau the question was how Man became bad, given his natural goodness (his answer was society). He did not believe in a return to Nature, exactly, but sought the political means to restore Man to his natural goodness. Personally, I think Rousseau was disastrously mistaken in this; in my opinion, the limitation of the bad in Man is infinitely more important and less sinister politically than the search for the good. When you have limited the bad, the good can take care of itself.
Rousseau was also the unwitting founder of the psychology of the Real Me, that is to say of the inner core of each of us that remains immaculate and without sin, however the external person actually behaves. The inner core, the Real Me, is good; what might be called the Epiphenomenal Me, that is to say the one that loses his temper, tells lies, eats too much, etc, is the result of external influences upon him. In this way a monster of depravity may preserve a high opinion of himself and continue his depravity; nothing he can do can deprive him of the natural goodness first espied by Rousseau.
Jean-Jacques was also, in his way, the philosophical progenitor of Facebook, of the notion that we should live our lives in the open, hiding nothing, for concealment is both the symptom and the cause of insincerity, which was one of J-J’s bugbears. He begins his Confessions in a self-congratulatory way: “Here is the only portrait of a man, painted exactly after nature and in all her truth, that exists and probably ever will exist.” The portrait is extremely interesting because Rousseau, whatever his faults, was an extremely interesting man. Who would not be amused by Rousseau’s account of how he became aware as a child of the sexual pleasure to be had by being beaten by a woman? He continues: “To be at the knees of an imperious mistress, to obey her orders, to have to ask her pardon, was for me a very sweet pleasure…”
The problem is that while all men are born equal, they are not all born equally interesting; so the confessional mode does not suit everyone. Besides, and this is what is tragi-comic about him, Rousseau wrote many hundreds of pages about himself but had no self-knowledge. He quarrelled with virtually everybody he ever knew; he even managed to reduce the philosopher David Hume, one of the most equable human beings in the history of the world, known in Paris as le bon David, to absolute fury. Yet never once did Rousseau think, “Maybe it’s not them, maybe it’s me.”
Thus this most fascinating man was the originator of the most characteristic of our modern vices – self-expression without self-examination.
Reply #5 on: December 18, 2012, 01:39:46 PM
Modern Wisdom from Ancient Minds
Victor Davis Hanson - December 18, 2012
The Tragic View
Of course we can acquire a sense of man’s predictable fragilities from religion, the Judeo-Christian view in particular, or from the school of hard knocks. Losing a grape crop to rain a day before harvest, or seeing a warehouse full of goods go up in smoke the week before their sale, or being diagnosed with leukemia on the day of a long-awaited promotion convinces even the most naïve optimist that the world sort of works in tragic ways that we must accept, but do not fully understand.
Yet classical literature is the one of the oldest and most abstract guides to us that there are certain parameters that we may seek to overcome, but must also accept that we ultimately cannot.
You Can’t Stop Aging, Nancy
Take the modern obsession with beauty and aging, two human facts that all the Viagra and surgery in the world cannot change. I expect few readers have endured something like the Joe Biden makeover or the Nancy Pelosi facial fix (I thought those on the Left were more inclined to the natural way? Something is not very green and egalitarian about spending gobs of money for something so unnatural). Most of you accept wrinkles, creaky joints, and thinning hair. Oh, we exercise and try to keep in shape and youthful, but a weathered Clint Eastwood looks preferable to us to a stretched and stitched Sylvester Stallone.
The Greek lyric poets, from Solon to Mimnermus, taught that there is nothing really “golden” about old age. That did not mean that at about age 50-70 one is not both wiser than at 20 and less susceptible to the destructive appetites and passions — only that such mental and emotional maturity comes at the terrible price of a decline in energy and physicality. When I now mow the lawn or chain-saw, in about 10 minutes a knee is sore, an elbow swollen, a back strained — and from nothing more than a silly wrong pivot. Biking 100 miles a week seems to make the joints more, not less, painful. At 30, going up a 30-foot ladder was fun; at near 60 it is a high-wire act. There is some cruel rule that the more it is necessary at 60 to build muscle mass, the more the joints and tendons seem to rebel at the necessary regimen.
The ancients honored old age, as the revered Gerousia and the Senate attest, but on the concession that with sobriety came far less exuberance and spontaneity. I suppose old Ike would never have mouthed JFK’s “pay any price” to intervene and oppose communism. Yet we must try to stay competitive until the last breath, if not with our bodies, then with our minds — like old blabbermouth Isocrates railing in his 90s, or Sophocles writing the Oedipus at Colonus (admittedly not a great play) well after 90. Cicero’s De Senectute reminds us that knowledge and learning can bridge some of the vast gap between the age cohorts. I remember an 80-year-old woman in one of my Greek classes who palled around with the 20-somethings; apparently when they were all reading Homer, they all forgot trivial things such as looks and age — at least for the ephemeral two hours they were reading The Iliad. (One young man after a class said, “She looks good in jeans.”)
In terms of relative power, the Greeks and Romans felt that youth often trumped wisdom, at least in the sense that the firm 21-year-old held all the cards with her obsessed 50-year-old admirer. When I sometimes read of the latest harassment suit that involved consensual adult sex involving an “imbalance in power,” I wonder what a Petronius, who wrote about crafty youth using their beauty to incite and humiliate the foolish aging, would think. Was Paula Broadwell really a victim in a “power imbalance”? Over the decades I have seen a number of adept young graduate students who fooled silly old goats (often the same nerds that they were in high school) into consensual relationships that aided their careers, but then, when the benefits were exhausted, they moved on, only to define themselves as victims as the need arose. A Greek would laugh at that idea of victims and oppressors.
As far as beauty goes, what is so attractive about either the perfect Stepford wives’ look or the starved model appearance? From red-figure vase painting to Rubens, Western tastes have appreciated curves, not lines. Where did the new beauty profile come from that is abnormal and usually achieved only through surgery: 5’ 10” females, weighing 120 lbs., with micro-waists and huge breasts and rears, as if more than 1% of the population is born that way? Ovid also reminds us that, on occasion, a blemish can mesmerize the beholder, in the way perhaps Cleopatra’s ample nose incited Caesar and Antony. I used to find the actress Sandy Dennis’s uncorrected overbite appealing in the way I don’t find today’s oversized, bleached, spot-lighted, and perfectly capped choppers inviting. A mole for the Greeks should not be removed. The classics remind us that a small defect is no defect at all. Forty years ago, I once knew an undergraduate with a scar running across her chin, maybe six inches in length, and a few millimeters wide. It was hypnotic. And what happened to the classical emphases on voice, comportment, grace, and gesture as ingredients of beauty? Have they simply fallen by the wayside in our boobs/butt-obsessed popular culture? Are there voice or posture classes anymore, or has it become all liposuction and implants?
Admittedly classical literature is aristocratic, at least in the sense that the well-read and learned had more money than those whom they often wrote about. But that said, it is striking how frequently, over a thousand years of Greek and Latin masterpieces, words like “mob” (ochlos) and “throng” (turba) arise to describe the herd-like desire for entitlements without worry as to how they were to be funded. Virgil (vulgus vult decipi, ergo decipiatur) and Horace (Odi profanum vulgus et arceo) would assume that even the Wall Street Journal is not read at Super Wal-Mart. (But be careful: at a local electric motor shop, the two Hispanic mechanic/owners once asked me how I would rate Peter Green’s Alexander the Great — and then cited four other biographies — while I was waiting to have a motor rewound.)
Alexis de Tocqueville put forth a thesis that American democracy had a chance because the small-scale entrepreneur (see above) and autonomous, self-reliant agrarian were not so prone to the Siren-calls of the European mob. He felt that we in America would not perhaps follow the model of the fourth-century Athenian dêmos or imperial Roman vulgus that flocked to the cities for the dole, and hated the wealthy the more they taxed them (don’t think Obama will be happy with just raising rates on “millionaires”) — as if the ability to pay high taxes was always proof of the ability to pay even more. Tocqueville derived that pessimistic view from Aristotle, whose best democracy was a politeia — rule by owners of some property, who were largely agrarian and self-reliant, and did not expect subsidies from others. Classics, then, teaches us to beware a situation when 47% of the population do not pay income taxes and nearly half of us receive federal and state subsidies. Perhaps we should go over the cliff so that the 53% all understand the burdens of higher taxes to subsidize the 47% who pay no income taxes. If we hike taxes on those who make over $1 million a year, then can we not insist that everyone pays at least $500 per year in federal income taxes — to appreciate that April 15 is not Christmas?
In that regard I now often think of Solon’s seisachtheia, the “shaking off” of debts by those small farmers of Attica burdened by having to pay 1/6th (or so scholars still believe) of their produce to their creditors — or the Messenian helots who were obligated to give ¼ to ½ of everything they produced to their Spartan overlords. Yet at this point, with a looming 40% federal tax rate, 12% California tax, returning payroll and higher Medicare taxes, and the new Obamacare hit, millions would prefer the oppressive take of classical serfdom to the present 55-60% of their income grabbed by the state. The new American helots, after all, will fork over sixty percent of their almond crops to the IRS, build six out of ten houses for their government, drive their trucks until July for Washington — and write thirty PJ weekly columns a year for Obama. The Tea Party might have been better named the Helot Party.
I was thinking of the class strife in Sallust’s Conspiracy of Catiline the other day as well; I used to teach it and the Jugurthine War in third-year Latin. In my thirties I never quite understood the standard hackneyed redistributionist call of the late Roman republic for “cancellation of debts and redistribution of property!” But recently I reread Sallust with a new awareness — in the context of all the talk of mortgage forgiveness, credit card forgiveness, student loan forgiveness, wealth taxes, and new estate taxes. The subtext of those Catilinarian platforms, of course, is that someone else was culpable simply for having had enough money in the first place (as if through luck rather than prudence, character, dutifulness, etc.) to pay back what he had borrowed — and therefore as atonement should pay for others who were defrauded by the system.
In the Roman state, those who borrowed unwisely periodically needed a clean slate — paid for by those who mostly did not, albeit always dressed up in the language of the noble poor and the rapacious rich. “Pay your fair share,” “fat cat,” “you didn’t build that,” “at some point you’ve made enough money,” etc. are right out of the demagoguery so brilliantly chronicled by Aristophanes, Plutarch, and Sallust. Debt relief and redistribution were not quaint classical topoi, but inherent in the human condition. For now our would-be Gracchus in the White House seems a lot more like a Publius Clodius Pulcher (author of the expansion of the grain dole), an upscale elite who chose demagoguery as the best route to power, fame, influence, and riches — and who can’t finish a sentence without blasting “millionaires and billionaires” as the source of all our woes. How did it happen that those in government, with higher than private-sector salaries, with access to free perks, with better than normal pensions and benefits, so often talk about the need for higher taxes without anyone replying that they are selfish in asking the worse off to subsidize the better off?
The Golden Mean
One theme sort of resonates through classical literature. Character consists of moderation, of avoiding hubris and thereby escaping nemesis. Character is formed through balanced behavior, from the trivial acts of not overeating, oversleeping, and overdrinking (“glutton,” “sloth,” and “drunk” have disappeared from the American vocabulary, though they were ubiquitous in Western languages for the last 2,500 years), to being humble in success and resilient in humiliation and defeat. But here is the warning: the good man — whether Ajax or Socrates — should expect — perhaps even welcome? — the disdain of the crowd, and usually will not win acclaim or receive what he deserves in this life. (Achilles finally came to accept that.)
Once upon a time in Hollywood, great directors grasped that, and so in their versions of the Iliad or the Sophoclean play — think Shane, Ride the High Country, High Noon, The Searchers, The Man Who Shot Liberty Valance — the man with character, if not killed, rides off into the sunset alone, glad to be free of those he saved. We don’t like our George S. Pattons and Curtis LeMays, at least until we are faced with the Waffen SS and the Japanese imperial military. Today, Marshal Will Kane might be dubbed a “loser,” and Ethan Edwards “obsessed.”
Whatever character is, it was not Susan Rice’s recent letter/op-ed bowing out of consideration for nomination to the office of Secretary of State. Instead it was Euripidean projection, Pentheus-style, as she alleged politicization and cheap partisan distraction on the part of her critics, even as she unleashed a pattern of obfuscation of her own and race/gender pandering from her supporters.
Ave atque Salve
I was given a great gift to have been a student of classics, to have lived on a farm, and to have had a father who was nobly self-destructive in the Ajaxan sense (on his Selma gravestone is inscribed Sophocles’ chiastic aphorism, “live nobly, or nobly die”). He practiced an archaic code that won him admiration, but made his job, his career, and his life almost impossible, whether over Tokyo in a B-29, or on a tractor, or in the Byzantine labyrinth of junior college administration, at which he excelled with his colleagues and students, but was deemed too eccentric by his administrative superiors. When I came home at 26 puffed up with a PhD, he met me in the driveway and said, “The shed needs new shingles,” a not too subtle reminder right out of Hesiod that with intellectual progress can come moral regress.
One of the great, though inadvertent, gifts of the Obama administration has been to remind us that the Rhodes Scholarship, the Harvard Law degree, the Stanford PhD, and the Princeton BA mean, well, nothing much at all, if not perhaps a suspicion that a lot of intellectual branding and grandstanding came at the expense of two years on a tuna boat, or a year picking apples, or four summers at Starbucks, or anything to remind the young genius that he was not so smart after all, and that character is not created by getting an award or being stamped by an unworldly elite institution.
In this age of Obama and a corrupting equality of result, we must continue to speak out, with dash and style, with the knowledge that most of our peers prefer sameness and mandated equality to freedom and liberty, if the latter result in inequality. But at least we are not alone: the best of the ancient world nods with us.
And that is the point, is it not — to keep the ancient faith and so welcome rather than fear the popular anger of the age?
"You have enemies? Good. That means that you have stood up for something, sometime in your life." - Winston Churchill.
Reply #6 on:
March 21, 2013, 11:03:05 AM »
March 20, 2013 | 0900 GMT
By Robert D. Kaplan
Chief Geopolitical Analyst
What is modernity? Is it skyscrapers, smart phones, wonder drugs, atomic bombs? You're not even close. Modernity, at least in the West, is the journey away from religious virtue toward secular self-interest. Religious virtue is fine for one's family and the world of private morality. But the state -- that defining political structure of modern times -- requires something colder, more chilling. For the state must organize the lives of millions of strangers and protect their need to selfishly acquire material possessions. If everyone stole from everyone else there would be anarchy. So the state monopolizes the use of force, taking it away from criminals. The state appeals not to God, but to individual selfishness. Thus, it clears the path for progress.
Thomas Hobbes conceived of the modern state in his Leviathan, published in 1651. Hobbes is known wrongly as a gloomy philosopher because of his emphasis on anarchy. Hobbes was actually a liberal optimist, who saw the state as the solution to anarchy, allowing people to procure possessions and build a community. Hobbes knew that in the path toward a better world, order first has to be established. Only later can humankind set about making such order non-tyrannical.
But what did Hobbes' philosophy ultimately build on? It built on the first of the moderns, the early 16th century Florentine Niccolo Machiavelli, whose masterpiece, The Prince, was written 500 years ago in 1513. Here is an anniversary as important as the 500th anniversary of Columbus discovering America, celebrated in 1992.
By taking politics away from the narrowing fatalism of the medieval Roman Catholic Church, Machiavelli created the very secular politics from which Hobbes could conceive of the idea of the state. The Prince may be less a work of cynicism than an instructional guide to overcome fate -- the fatalism of the Roman Catholic Church at that time. Thus, Machiavelli, more than Michelangelo perhaps, was the true inventor of the Renaissance. The founders of the American Republic, who conceived of a polity in which church and state were separate and in which government existed to lay the rules for individuals to compete freely in the struggle to acquire wealth, owed much to Machiavelli and Hobbes.
But it is with Machiavelli, more than with Hobbes, that the principles of Western modernity truly begin. Indeed, we are fortunate to have still among us one of the great interpreters of Machiavelli, Harvard Professor Harvey C. Mansfield Jr. Mansfield knows that it is more important to tell hard truths than it is to be liked and to get good reviews. That is why I have always had such deep respect for him, even though I have never met him. I know Mansfield the way one should know a great scholar: only through his writings.
Mansfield's book, Machiavelli's Virtue (1996), though drawing on the ideas of an earlier interpreter of Machiavelli, University of Chicago political scientist Leo Strauss, is an academic classic in its own right. Mansfield himself may not necessarily agree with Machiavelli, but he fearlessly shows why this towering figure of the Renaissance is still so relevant. For by setting the terms for political reality, Machiavelli helps lay the foundation for geopolitics.
Mansfield, interpreting Machiavelli's original Italian, explains to us that necessity frees people from religious faith. People may pray to God and go to church or synagogue or the mosque, but they must also acquire food and possessions for the sake of their loved ones, and thus they must enter into competition with their fellow human beings; just as nations must enter into competition with other nations. This is not something to lament, however. For in the last analysis, self-interest can lead to peace while rigid moral principles can lead to war. Self-interest informs compromise with other human beings, and thus a state governed by self-interest is likely to compromise with other states: whereas a person or state governed solely by religious or moral virtue will tend to delegitimize as immoral those with whom he or it disagrees -- and therein lies conflict. Virtue, in other words, is fine. But outstanding virtue -- because it tempts sanctimoniousness -- is dangerous. It is ultimately with this maxim that we find philosophical justification for moderation in contemporary politics and statecraft.
Those who find such thinking dark or cynical may be under the illusion that politics can bring respite from primitive necessity. Machiavelli, as Mansfield explains, is doubtful of this. Yes, politicians may announce their intention to strive for truth and justice, but their unspoken concerns and desires, even in a democracy -- especially in a democracy -- are really about satisfying the selfish needs of their constituents. Face it, primitive necessity is a fixture of the human condition. And, therefore, the only way to reduce conflict and suffering is through anxious foresight: the ability to foresee danger and necessities ahead. Thus are intelligence agencies more likely to prevent atrocities than humanitarians.
In politics, explains Machiavelli (through Mansfield), one who does good often cannot be good. One must even learn how to be bad, or at least devious, for the sake of the common good. This is not necessarily a case of the end justifying the means, for Machiavelli is careful to stipulate that only the minimum amount of cruelty should be applied for the sake of the greatest amount of good.
Machiavelli is all about results. He believes that you define something in politics not by its inherent excellence, but by its outcome. For political virtue is separate from individual perfection. A leader may be honest, unselfish and moral, but if he starts a war that later proves unnecessary and kills many people, he lacks virtue -- despite being on a personal level very sympathetic. Conversely, a leader may be cynical, selfish and excessively ambitious, but if he keeps his countrymen away from danger he can still be said to have virtue -- despite being personally unappealing. Likeability has nothing to do with virtue, it turns out. For politics -- and especially geopolitics -- is concerned, according to Machiavelli, with knowing about the world rather than knowing about heaven. Indeed, precisely because Machiavelli was concerned with men and not with God, he was a humanist.
Machiavelli has his limits. For example, he could not have foreseen 20th century totalitarianism, which mirrored the self-righteousness of the medieval Church with which he was in conflict, but on a much larger scale. He imagined the never-ending struggle between Italian city-states, not the titanic conflicts between gargantuan nuclear powers. And because the stakes are arguably higher now, given weapons of mass destruction, there is a danger of taking Machiavelli too far and using his philosophy to justify all sorts of risky subterfuges.
But there is a greater danger in simply dismissing his philosophy as unworthy of our so-called enlightened age. For our age is determined less by globalization than by the battle of space and power, both between states and between groups within states themselves -- as witnessed most recently by the ethnic and sectarian turmoil throughout the Greater Middle East. An American leader who is forced to grapple with such anarchy, even as he must take care to adopt the right tone with a militarily ascendant China and with an economically rising Latin America, could do worse than act "Machiavellian." And thanks to Professor Mansfield, we now know the true meaning of that adjective.
Mike Tyson on Philosophy
Reply #7 on:
December 16, 2013, 08:48:18 AM »
Mike Tyson Explores Kierkegaard
The former heavyweight champ considers philosophy and love.
By Mike Tyson
Dec. 13, 2013 6:15 p.m. ET
I'm currently reading "The Quotable Kierkegaard," edited by Gordon Marino, a collection of awesome quotes from that great Danish philosopher. (He wanted his epitaph to read: "In yet a little while / I shall have won; / Then the whole fight / Will all at once be done.") I love reading philosophy. Most philosophers are so politically incorrect—challenging the status quo, even challenging God. Nietzsche's my favorite. He's just insane. You have to have an IQ of at least 300 to truly understand him. Apart from philosophy, I'm always reading about history. Someone very wise once said the past is just the present in funny clothes. I read everything about Alexander, so I downloaded "Alexander the Great: The Macedonian Who Conquered the World" by Sean Patrick. Everyone thinks Alexander was this giant, but he was really a runt. "I would rather live a short life of glory than a long one of obscurity," he said. I so related to that, coming from Brownsville, Brooklyn.
What did I have to look forward to—going in and out of prison, maybe getting shot and killed, or just a life of scuffling around like a common thief? Alexander, Napoleon, Genghis Khan, even a cold pimp like Iceberg Slim—they were all mama's boys. That's why Alexander kept pushing forward. He didn't want to have to go home and be dominated by his mother. In general, I'm a sucker for collections of letters. You think you've got deep feelings? Read Napoleon's love letters to Josephine. It'll make you think that love is a form of insanity. Or read Virginia Woolf's last letter to her husband before she loaded her coat up with stones and drowned herself in a river. I don't really do any light reading, just deep, deep stuff. I'm not a light kind of guy.
— Mr. Tyson is the author of "The Undisputed Truth."
Re: Mike Tyson on Philosophy
Reply #8 on:
December 16, 2013, 09:47:19 AM »
Quote from: Crafty_Dog on December 16, 2013, 08:48:18 AM
I don't really do any light reading, just deep, deep stuff. I'm not a light kind of guy.
That describes me as well. Interesting to see Tyson opening himself up to the intellectual world. I saw an interesting interview of him by Greta Van Susteren not long ago. He may actually be getting his life in order. Miraculous, if true.
Reply #9 on:
December 16, 2013, 10:26:41 AM »
His life truly has been an Adventure.
David Brooks: The Ambition Explosion
Reply #10 on:
November 29, 2014, 02:42:02 PM »
The Ambition Explosion
NOV. 27, 2014
In 1976, Daniel Bell published a book called “The Cultural Contradictions of Capitalism.” Bell argued that capitalism undermines itself because it nurtures a population of ever more self-gratifying consumers. These people may start out as industrious, but they soon get addicted to affluence, spending, credit and pleasure and stop being the sort of hard workers capitalism requires.
Bell was right that there’s a contradiction at the heart of capitalism, but he got its nature slightly wrong. Affluent, consumerist capitalists still work hard. Just look around.
The real contradiction of capitalism is that it arouses enormous ambition, but it doesn’t help you define where you should focus it. It doesn’t define an end to which you should devote your life. It nurtures the illusion that career and economic success can lead to fulfillment, which is the central illusion of our time.
Capitalism on its own breeds people who are vaguely aware that they are not living the spiritually richest life, who are ill-equipped to know how they might do so, who don’t have the time to do so, and who, when they go off to find fulfillment, end up devoting themselves to scattershot causes and light religions.
To survive, capitalism needs to be embedded in a moral culture that sits in tension with it, and provides a scale of values based on moral and not monetary grounds. Capitalism, though, is voracious. The personal ambition it arouses is always threatening to blot out the counterculture it requires.
Modern China is an extreme example of this phenomenon, as eloquently described by Evan Osnos in his book, “Age of Ambition,” which just won the National Book Award for nonfiction.
As Osnos describes it, the capitalist reforms of Deng Xiaoping raised the ambition levels of an entire society. A people that had been raised under Mao to be a “rustless screw in the revolutionary machine” had the chance, in the course of one generation, to achieve rags-to-riches wealth. This led, Osnos writes, to a hunger for new sensations, a ravenous desire to make new fortunes.
Osnos describes the “English fever” that swept some Chinese youth. Li Yang was a shy man who found that the louder he bellowed English phrases the bolder he felt as a human being. Li filled large arenas, charging more than a month’s wages for a single day of instruction. He had the crowds shouting English phrases en masse, like “I would like to take your temperature!” and repeating his patriotic slogans, “Conquer English to make China stronger!”
Osnos interviewed a member of the Li cult who called himself Michael and considered himself a “born-again English speaker.” For Michael, learning English was intermingled with the aspirational mantras he surrounded himself with: “The past does not equal the future. Believe in yourself. Create miracles.”
It was this ambition explosion as much as anything else that created China’s prosperity. One mother who called herself “Harvard Mom” had her daughter hold ice cubes in her hands for 15 minutes at a time to teach fortitude. Soon China was building the real estate equivalent of Rome every fortnight.
But the fever, like communism before it, stripped away the deep rich spiritual traditions of Buddhism and Taoism. Society hardened. Corruption became rampant. People came to believe that society was cruel and unforgiving. They hunkered down. One day, a little girl was hit by a bread truck in the city of Foshan. Seventeen people passed and did nothing as she lay bleeding on the ground. The security video of the incident played over and over again on TV, haunting the country.
Li Yang, the English teacher, turned out to be a notorious wife-beater. His disciple, Michael, became embittered. The optimistic slogans now on his wall had undertones of frustration: “I have to mentally change my whole life’s destiny!” and “I can’t stand it anymore!”
This led, as it must among human beings who are endowed with a moral imagination that can be suppressed but never destroyed, to a great spiritual searching. Osnos writes that many Chinese sensed that there was a spiritual void at the core of their society. They sought to fill it any way they could, with revived Confucianism, nationalism, lectures by the Harvard philosopher Michael Sandel and Christianity.
Osnos writes that this spiritual searching is going out in all directions at once with no central melody. One gets the sense that the nation’s future will be determined as much by this quest as by political reform or capitalist innovation.
China is desperately searching for a spiritual and humanist nest to hold capitalist ambition. Those of us in the rest of the world are probably not searching as feverishly for a counterculture, but the essential challenge is the same. Capitalist ambition is an energizing gale force. If there’s not an equally fervent counterculture to direct it, the wind uproots the tender foliage that makes life sweet.
Last Edit: November 29, 2014, 02:44:31 PM by Crafty_Dog
New Men, New Rights
Reply #11 on:
January 05, 2016, 02:30:14 PM »
Son of Saul, Kierkegaard, and the Holocaust
Reply #12 on:
February 29, 2016, 02:54:25 PM »
‘Son of Saul,’ Kierkegaard and the Holocaust
By Katalin Balog February 28, 2016 9:15 pm February 28, 2016 9:15 pm
The Stone is a forum for contemporary philosophers and other thinkers on issues both timely and timeless.
Art is often the subject of philosophy. But every now and then, a work of art — something other than a lecture or words on a page — can function as philosophy. “Son of Saul,” a film set in Auschwitz-Birkenau during the Holocaust, is such a work of art. It engages with a profound set of problems that also occupied the 19th-century Danish philosopher Soren Kierkegaard.
Written and directed by the Hungarian filmmaker László Nemes, “Son of Saul” won awards at Cannes, the Golden Globes and elsewhere before making its way to the Oscars to win the award for best foreign language film. It follows a day in the life of Saul, a member of the Sonderkommando, a group of mostly Jewish prisoners the Nazis forced to assist with herding people to the gas chambers, burning the bodies and collecting gold and valuables from the corpses. The film creates a direct, experiential and visceral engagement with these events by maintaining a relentless focus on the minute-to-minute unfolding of Saul’s world.
In long, unbroken shots, we see the reality of the death camp revealed, its textures made tangible. By using close-ups and shallow focus images throughout, Nemes gives viewers no opportunity to disengage from Saul’s point of view. It is as though we are shadowing him in hell. In immersing the viewer this way, Nemes places us there with Saul. This seems to be a moral imperative as well as an aesthetic choice. By eliciting a full, visceral engagement from the viewer, the film embodies a respect for the singular events of the Holocaust that more commercial treatments of the subject fail to achieve. The film is a thoroughly personal, subjective account of the Holocaust.
The movie’s central theme is Saul’s inner world, the loss and recovery of his soul. In scene after scene we see his face unmoved, his eyes watching but remote; there is a repellent sense of his — and our own — indifference. But then he witnesses a young boy briefly surviving the gas, only to be put to death a few minutes later by a Nazi doctor, possibly Josef Mengele. From this moment, he becomes consumed by the idea of giving the boy a proper Jewish burial. He claims the boy is his son. Saul’s backstory is entirely missing from the film — we don’t even learn if he really had a son — but that is beside the point. What matters, to him and to us, is that he is able to feel again.
Much of Kierkegaard’s philosophy is a warning against the tendency — greatly accelerated in modern times — to take an increasingly objective, abstract perspective on the world. While the paradigm example of this is science, it is most problematic when applied to one’s own life and existence. To identify life with its abstractions is, in Kierkegaard’s view, a dangerous but all too common error.
There are generally two, radically different ways to relate to the world: objective and subjective. Objectivity is an orientation towards reality based on abstracting away, in various degrees, from subjective experience, and from individual points of view. A subjective orientation, on the other hand, is based on an attunement to the inner experience of feeling, sensing, thinking and valuing that unfolds in our day-to-day living. This distinction has been brought into contemporary philosophical discourse most notably by Thomas Nagel, in a number of his essays, most famously in “What Is It Like to Be a Bat?”
The spectacular success of science in the past 300 years has raised hopes that it also holds the key to guiding human beings towards a good life. Psychology and neuroscience have become a main source of life advice in the popular media. But philosophers have long held reservations about this scientific orientation to how to live life. As the 18th-century Scottish philosopher David Hume famously pointed out, no amount of fact can legislate value, moral or otherwise. You cannot derive ought from is. But there is another, in some ways more radical concern, expressed in Western philosophy most forcefully by Kierkegaard, and in literature by Dostoyevsky — two religiously inspired thinkers — namely that our experience of life matters in ineffable ways that no objective understanding of the world can capture.
Wittgenstein, in a well-known letter to Ludwig von Ficker, the publisher of the “Tractatus,” claimed that “the whole point of the book is to show that what is important lies in what cannot be expressed” in a scientific language. Suppose there was a super-intelligent organism — in a twist on Frank Jackson’s knowledge argument — that lacked any feeling or experience, a creature of pure thought. Such a creature could, for example, know everything about the brain, even everything about the world at large, all in scientific terms; but it would know nothing of human significance.
This line of thought holds that human significance comes from subjective experience and that human beings cannot thrive without an orientation towards, and engagement with, the subjective experience of their lives, and that, as a matter of fact, a predominantly objective, conceptual orientation to oneself is detrimental to well being. As Kierkegaard put it, “Science and scholarship want to teach that becoming objective is the way. Christianity teaches that the way is to become subjective, to become a subject.”
Science is the best method we have for approaching the world objectively. But in fact it is not science per se that is the problem, from the point of view of subjectivity. It is objectivizing, in any of its forms. One can frame a decision, for example, in objective terms. One might decide between career choices by weighing differences in workloads, prestige, pay and benefits between, say, working for an advanced technology company versus working for a studio in Hollywood. We are often encouraged to make choices by framing them in this way. Alternatively, one might try to frame the decision more in terms of what it might be like to work in either occupation; in this case, one needs to have the patience to dwell in experience long enough for one’s feelings about either alternative to emerge. In other words, one might deliberate subjectively.
This is, of course, a crude opposition. We hardly ever deliberate purely objectively or purely subjectively. And there are built-in limits to subjective decision-making. Often, as the philosopher Laurie Paul has argued (and as the psychologist Paul Bloom explained in an essay on her work), we are not even in a position to imagine what our lives will be like after a life-altering decision.
The bottom line is that what Kierkegaard pointed out is a steady push in society toward more objectivity and less engagement with subjectivity, with what is — sometimes derisively — called inwardness. Modern humans tend less to live thoroughly immersed in life, experiencing it, and more to be distracted by its abstractions, by all the ways our culture conceptually frames our existence: as individuals, Democrats and Republicans, men and women, one percenters, workers, consumers, and so on. And here, as a result, is the problem: by becoming less subjective, we become more cut off from sources of meaning and value.
One does not have to agree with Kierkegaard’s single-minded, hostile rejection of objective thought and objectivity to still consider what he has to say about the cultivation of subjectivity, because that is where his major insights lie. So what about his exhortation to become subjective? Why is there even a need for this? Isn’t it true that, given our experience of life, we already are? It seems that one cannot fail to be a subject, to be subjective. However, as Kierkegaard points out, the mind can flee its own subjectivity; instead of dwelling in the presence of one’s experience, one can escape into alienation; into theorizing about needs, goals and happiness, and live by abstract principles and objective measures. As Freud has described, there are various ways of doing this: by repressing experience, dissociating from it, numbing it, turning away from it.
Most commonly, we turn our back on subjectivity to escape from pain. Suffering, one’s own, or others’, might become bearable, one hopes, when one takes a step back and views it objectively, conceptually, abstractly. And when it comes to something as monumental as the Holocaust, one’s mind cannot help but be numbed by the sheer magnitude of it. How could one feel the pain of all those people, sympathize with millions? Instead one is left with the “facts,” the numbers.
“Son of Saul” approaches its stupefying subject in a way that echoes Kierkegaard’s imperative. The audience is not given any space to distance from Saul’s reality or turn it into an abstraction of suffering, innocence, or goodness; the film doesn’t depict the story of the Holocaust in generic ways that would encourage getting lost in a historical account. Rather, it allows viewers to feel its textures, and perceive the sights and sounds that make up individual experience. In this way, the film depicts what many critics have argued could not be depicted.
It achieves this by not letting the viewer off the hook, by demanding participation, as far as it is possible in the imagination, in the experience of the Holocaust. As Kierkegaard puts it in “Either/Or,” “For one may have known a thing many times and acknowledged it … and yet it is only by the deep inward movements, only by the indescribable emotions of the heart, that for the first time you are convinced that what you have known belongs to you … for only the truth that edifies is truth for you.”
In addition to employing a subjective approach visually, Nemes also makes subjectivity the theme of the film. It depicts the loss of subjectivity both at the larger societal level, as well as at the level of the individual. It is this double engagement of subjectivity that makes the film so effective.
The death camp is the absurd end point of technological thinking, of the objectification of human beings. Totalitarian regimes the world over know the value and power of subjectivity — that is why they work so hard to destroy it. Murdered Jews are referred to as “pieces” in the movie by the German camp administrators. The victims are denied the status of subjects; they are mere physical objects to be dealt with. In this world of mechanized objectivity, the Nazis’ industrial brutality is considered entirely normal. Any individual who does not obey its construct of proper behavior has only his own conscience to rely on.
Imre Kertész, winner of the Nobel Prize in Literature for his autobiographical novel about a Hungarian boy taken to Auschwitz, explained this vividly in his acceptance speech, describing a crucial experience that contributed to the writing of the novel: “The experience was about solitude, a more difficult life … the need to step out of the mesmerizing crowd, out of history, which renders you faceless and fateless.”
At a yet deeper level, it is the subjectivity of its protagonist that is the main theme of the movie. Through his encounter with the boy, Saul gains back his soul. At the moment that Saul witnesses the killing of the boy, he becomes — in Kierkegaard’s beautiful expression — a Knight of Faith: someone who has made a commitment, and who can pursue it with passion, as if it were the surest thing, even in the face of overwhelming odds. Saul is finally able to experience the death and destruction around him by committing to this one dead boy, much as the film nudges viewers to relate to the Holocaust as real by committing them to this one near-dead protagonist. Having seen it, you will remember, deliriously, having been there, in some small part of your body. It is a duty created by the Holocaust, the movie suggests — a dark counterpart of the imperative for Jews to relive, each year at Passover, the experience of deliverance from bondage into freedom.
At the same time that it depicts Saul’s conversion, “Son of Saul” also directly engages the viewer’s subjectivity through its style and mode of presentation; its achievement is to embody the very dynamic that is its subject matter. Kierkegaard called such communication — the only sort he thought befitting a subjective thinker — “double reflection.” He thought it the only way to guard the authenticity of the message, the only way to avoid being a town crier of subjectivity. In this way, “Son of Saul” is both art and philosophy: It makes inwardness visible. Through its depiction of death and destruction it reminds us how to live.
Katalin Balog is an associate professor of philosophy at Rutgers University-Newark. She writes about the nature of mind, consciousness and the self.