The Oil Price Bubble Bursts
And prices are falling a dollar a day
Ronald Bailey | November 18, 2008
Oil prices have dropped by 60 percent since July. And they fell without the benefit of a gasoline tax holiday, new anti-speculator regulations, or a windfall profits tax on oil companies. A year ago, crude oil was going for $88.00 per barrel and gasoline cost an average of $2.76 per gallon. Over the following months, the price soared, reaching an inflation-adjusted record high of just over $147 per barrel in July. Then the bottom fell out. Yesterday, the price was hovering around $58, up from a recent low of $53 per barrel. The result is gasoline prices plummeting from a national average of $4.11 per gallon in July to below $2.07 per gallon now. So what happened?
First, just as one would expect, higher prices led to lower demand. U.S. demand for petroleum in 2008 was 5.4 percent lower than in 2007, falling by 1.1 million barrels per day (bpd) from 20.7 million to 19.6 million bpd. As prices rose Americans curtailed their driving. The Federal Highway Administration reported that in August 2008, Americans drove 15 billion fewer miles, or 5.6 percent less, than they did in August 2007. On the other hand, recent high prices have called forth new sources of supply. For example, Canadian oil sands now produce 1.1 million barrels per day. And new deepwater offshore production rigs like the Thunder Horse (250,000 barrels per day) and Tahiti (125,000 barrels per day) platforms are coming online. Falling demand and increasing supply mean lower prices.
In addition, a good portion of the lower demand for oil is the result of the global economic slowdown. "This time the usual petroleum boom/bust cycle lined up on top of the business cycle," said Tim Evans, an energy futures analyst at Citigroup's Futures Perspective. In March 2008, Evans warned that we were in the midst of a bubble and that oil prices would drop. When the investment firm Goldman Sachs suggested the possibility of $200 per barrel oil, Evans predicted that prices would fall to $60 to $70 per barrel. He observed presciently that "this is the riskiest time to be long in crude oil since 1980."
So as prices drop will demand increase? Yes, but Evans believes that U.S. demand will rise slowly. Why? In part because various federal government policy responses to recent high oil prices are unlikely to be reversed. For example, the federal government has mandated that Corporate Average Fuel Economy standards for automobiles rise from 27.5 miles per gallon now to 35 miles per gallon by 2020. Evans thinks that hybrid automobile technology may look economically attractive even at current prices. Plug-in hybrids like the Chevy Volt should use about 2 cents of electricity per mile, compared to 12 cents per mile for gasoline. In addition, Evans says, "The biofuels initiatives aren't going to go away. Even if they are not economically smart, the votes are there to make sure that we stick with these programs." So subsidized biofuels will displace some demand for gasoline, putting downward pressure on the price of crude oil.
On the supply side, those "windfall profits" that oil companies have been earning in the last couple of years are paying for exploration and development of more oil supplies. It is true that the oil companies have been using their record profits to buy back stock and thus increase shareholder value. Some members of Congress believe that the oil companies should spend their profits on alternative energy projects that the companies don't believe can be justified economically. And if the oil companies don't stop enriching their shareholders, Congress will see to it that the "windfall profits" are taxed away and spent by government bureaucrats on alternative energy projects. It is possible that the members of Congress know better how to spend oil company profits than do their executives, but the Federal government's record in this area is not impressive.
Naturally, suppliers don't like lower prices, so the members of the Organization of Petroleum Exporting Countries (OPEC) want to drive up prices by restricting supply. In October, OPEC members pledged to cut oil production by 1.5 million barrels per day beginning on November 1. They plan to hold another meeting later this month to discuss further reductions. Even as consumers enjoy lower prices at the gas pump now, analysts at the International Energy Agency fret that those low prices will lead to underinvestment in oil production capacity, resulting in a crude oil supply crunch by the middle of the next decade. Disturbingly, 80 percent of the world's known oil reserves are owned by government oil companies whose revenues are looted rather than reinvested in production. In any case, lower prices and the credit crunch are already causing oil companies to shelve some projects. Alternative energy promoters also fear lower petroleum prices because they make their projects even less economically feasible. Some are advocating a higher gasoline tax in order to counteract the deleterious effects of lower crude oil prices on the glorious alternative energy future.
So what's next for oil prices? For the coming year, Evans thinks that the price of oil will bounce around in a trading range of $50 to $90 per barrel, averaging around $70 per barrel.
Ronald Bailey is reason's science correspondent. His book Liberation Biology: The Scientific and Moral Case for the Biotech Revolution is now available from Prometheus Books.
Health Insurance Premiums Rise Up To 33 Percent With State Pricing Rule, USA
18 Nov 2008
New research shows that the cost of health insurance for a typical family increases about $100 per month when state governments limit price adjustments based on factors like age, health or risky behaviors such as smoking.
The finding by Brigham Young University economist Mark Showalter is one of several examples of how one state's set of rules can result in widely different prices than what's found in the state next door. Perhaps the most eye-opening contrast exists in Trenton, New Jersey, where premiums cost about twice as much as those sold across the Delaware River in Pennsylvania.
"Establishing the actual costs of specific state regulations informs discussion of how to make health insurance more affordable," Showalter said. "It helps present a picture of what would happen if consumers were allowed to buy insurance from other states."
Showalter began the research during an appointment as a senior economist for the U.S. Council of Economic Advisers. He co-authored the new study with Amanda Kowalski of the National Bureau of Economic Research and William Congdon of The Brookings Institution. Their report will appear later this month in the academic journal Forum for Health Economics & Policy.
The researchers analyzed prices offered state-by-state for the estimated 26.5 million Americans who purchase directly from insurers rather than through an employer.
Seven states prevent insurers from adjusting prices based on one or more factors like age, health status or risky behavior. The researchers found such rules - known as community ratings - increased family premiums between 21 and 33 percent.
The rule is intended to promote equity but may consequently make insurance too expensive for healthy people. The study found New Jersey's strict form of community ratings responsible for premiums set two to three times higher than if the requirement were not in place.
University of Minnesota health economist Roger Feldman, who was not involved with the study, is funded by the U.S. Department of Health and Human Services to figure out how an interstate market might reduce the number of uninsured Americans.
"This study enables us to predict the effect of allowing consumers to shop for insurance across state lines," said Feldman. "Those kinds of simulations would not be possible without this study."
The researchers also found that health insurance premiums rise 10 percent or more when a state government makes insurers accept all doctors, hospitals or pharmacies instead of steering customers to an exclusive network of providers.
Twenty-one states have laws that require insurers to allow a patient to buy prescription drugs from any pharmacy they choose. Seven of those states force insurers to offer the same flexibility with choosing a doctor or hospital. According to the study, these laws result in a typical family paying about $30 more per month for health insurance.
Data for Showalter's study came from two major health insurers that do business nationwide. The analysis took into account factors that vary by location, such as the cost of health care, state taxes and consumer demographics.
"All of the regulations we studied have presumed benefits," Showalter said. "Our goal was to quantify the costs so that policymakers can better weigh the two."
A BYU alumnus, Showalter received a Ph.D. from the Sloan School of Management at the Massachusetts Institute of Technology. In 1991 he joined the BYU Economics Department faculty and has since published many journal articles on health and education.
Shocking News! State Mandates Increase the Cost of Health Insurance!
Ronald Bailey | November 18, 2008, 11:36am
Researchers at Brigham Young University, the National Bureau of Economic Research and the Brookings Institution have found that health insurance mandates raise the price for everybody. As the press release describing the study explains:
New research shows that the cost of health insurance for a typical family increases about $100 per month when state governments limit price adjustments based on factors like age, health or risky behaviors such as smoking.
The finding by Brigham Young University economist Mark Showalter is one of several examples of how one state's set of rules can result in widely different prices than what's found in the state next door. Perhaps the most eye-opening contrast exists in Trenton, New Jersey, where premiums cost about twice as much as those sold across the Delaware River in Pennsylvania...
Seven states prevent insurers from adjusting prices based on one or more factors like age, health status or risky behavior. The researchers found such rules - known as community ratings - increased family premiums between 21 and 33 percent.
The rule is intended to promote equity but may consequently make insurance too expensive for healthy people. The study found New Jersey's strict form of community ratings responsible for premiums set two to three times higher than if the requirement were not in place.
Who knew that 1,800 federal and state mandates would boost the price of health insurance? Well, actually, lots of analysts do. For example, Harvard business school professor Regina Herzlinger told reason:
"It's like I'm shopping for a car and my state mandates that all cars have heated seats," says Herzlinger. Car buyers would not long stand for a heated car seat mandate that raises the price of a car by $1,000, and similarly individual health insurance shoppers would object to unnecessarily expensive insurance mandates.
Legislators rarely consider the costs of such mandates to consumers, so the good news is that the study quantifies those costs so that the trade-offs can be weighed explicitly. The whole press release for the study is available here.
After some hesitation—causing it to miss the initial filing deadline—the District of Columbia appealed to the Supreme Court. The NRA was simultaneously pushing a new federal law that would have mooted the newly renamed Heller case by overturning the city’s anti-gun laws. Levy lobbied against the measure, arguing that a Supreme Court victory would be more permanent and more important to the whole country than just overturning D.C.’s restrictions. That bill did not pass in 2007, and the Heller case was taken up by the Supreme Court in November of that year. Only at that point, after years of obstruction, did the NRA become highly cooperative, putting together a significant amicus brief endorsed by the majority of both houses of Congress and by Vice President Dick Cheney.
Chris Cox, head of the NRA’s Institute for Legislative Action, is happy with how his organization’s relationship to Heller turned out. Sure, he admits, there was conflict along the way. “In my experience, you get a bunch of lawyers in the room and you’ll probably not have agreement,” he says. “There was concern prior to [ John] Roberts and [Samuel] Alito even being on the Court as to whether or not the timing was right. It all worked out. Was it lucky? Was it strategy? I’ll let other people answer that. But I applaud Alan [Gura] and his team. The victory was ultimately due to a lot of hard work by a lot of people for decades, certainly including the NRA, and in the end the Second Amendment is stronger.”
At the oral arguments before the Supreme Court, decades of intellectual debate about the meaning of the Second Amendment came springing to life. D.C.’s lawyer Walter Dellinger started off by proposing a version of the Second Amendment that, while nominally protecting an individual right, protects it only when the individual is participating in the common defense as part of a militia—meaning that, in practical terms, the Second Amendment guarantees Americans today no right to home defense at all.
Chief Justice John Roberts prodded him on why the Framers said “the people” if they meant “the militia.” Dellinger said, well, the terms were really congruent, so the right applied to all the people but only for a militia purpose. For example, Dellinger offered, a private citizen might have a cause of action under the Second Amendment if the federal government interfered with his state’s right to form a militia.
Dellinger was only a few minutes into his presentation when Justice Anthony Kennedy—considered to be the swing vote, in this case as in many others—buoyed the Levy team by suggesting the Second Amendment “supplemented” the militia “by saying there’s a general right to bear arms quite without reference to the militia either way,” and talking of how the Founding Fathers’ attitudes about guns were born from a frontier experience, with worries about personal, not merely civil and political, defense from hostile crooks, Indians, wolves, and grizzly bears.
At the same time, Kennedy made it clear that he believed the Second Amendment right to bear arms was, like other rights in the Constitution, subject to regulation. Even with the historical examples from early America and England, he saw that by some of those laws, “You couldn’t conceal a gun and you also couldn’t carry it, but yet you had a right to have it.”
Dellinger argued that the legal right in D.C. to own (yet not, by the letter of the law, ever use in the home) long guns obviated any constitutional difficulties that might exist in the handgun ban. Chief Justice Roberts, straight out of the Heller team’s playbook, made the First Amendment analogy, asking Dellinger: Would it be constitutionally acceptable for a municipality to ban books as long as newspapers—a viable substitute source of expression—were still legal?
When it was Gura’s turn, he was asked to explain the meaning of the militia reference. He said it was to describe a purpose of the right of the people that the Amendment protected. He angered some in the hardcore gun rights movement when he concluded that the weapons protected by the Amendment should be ones that combined a militia purpose and a normal civilian purpose, since people were expected to supply them from their own everyday collection of weapons they typically used. Gura did not want to be pressed into arguing that machine guns should have unlimited Second Amendment protection.
He did ably defend the idea that personal self-defense was built into weapons rights during the Founding era. He granted that reasonable licensing doesn’t necessarily violate the Second Amendment. He also granted that empirical considerations about such matters as murder rates could play into policymakers’ decisions about what made for a reasonable gun regulation—but added that the very purpose of a constitutional right is to make sure that not everything is up for grabs just because a legislature thinks regulations are “reasonable.”
Many Internet gun-rights activists accused Gura of selling out on the machine gun issue. “We wanted to win,” Gura responds. “And you win constitutional litigation by framing issues in as narrow a manner as possible. I could not tell the justices honestly that I hadn’t thought about machine guns. ‘Gee, I don’t know, maybe…’ That’s a bunch of crap. I would have lost credibility, it would have been obviously a lie and I’m not going to lie to the Court, and I would have lost the case.”
Justice Antonin Scalia’s majority opinion said everything that Gura said, and that a generation of Second Amendment scholars had been saying for decades. The Second Amendment protected an individual right. The prefatory clause did not restrict the operative one; the protected right went beyond militia service. The relevant contemporaneous debates and state constitutions at the time of the Founding supported this interpretation. The Miller precedent was about the type of weapon, not the people to whom the right accrued.
Still, the decision wasn’t everything devotees of gun rights might have hoped for. Scalia also wrote: “The Second Amendment right is not unlimited. It is not a right to keep and carry any weapon whatsoever in any manner whatsoever and for whatever purpose: For example, concealed weapons prohibitions have been upheld under the Amendment or state analogues. The Court’s opinion should not be taken to cast doubt on longstanding prohibitions on the possession of firearms by felons and the mentally ill, or laws forbidding the carrying of firearms in sensitive places such as schools and government buildings, or laws imposing conditions and qualifications on the commercial sale of arms.”
Heller, then, by no means settled the entire gun control debate. It did instantly generate a series of lawsuits, many sponsored by the NRA, against jurisdictions with gun bans similar to D.C.’s, including Chicago (hit with two suits) and three of its suburbs. Some Illinois towns have already rescinded their handgun restrictions. Washington, D.C., after months of foot-dragging that prompted Dick Heller to file another lawsuit against the city, has finally allowed its citizens to register, own, and keep loaded in the home both revolvers and semiautomatic handguns.
Still, most gun laws short of total bans will likely survive under the Heller standard, even if it is authoritatively established that the Second Amendment ruling in the case applies to state and local actions. In the near term at least, Heller will heat up the gun debate instead of ending it.
But the case was vitally important to American public policy. For one thing, it normalized within constitutional law the notion that self-defense is a right. Guns can kill, to be sure. But the principle that Heller vindicated was one at the core of Western liberalism: self-defense, which is for life. Those who believe in a strong activist government generally do so because they fear the potential savagery of human social life. Yet with gun control, they seem unwilling to allow the individual to do anything about that savagery.
The Heller case was a prime example of how calm, dedicated, and strategic thinking on the part of crusaders for smaller government can achieve real and (probably) lasting victories. Fighting against even those who should have been their staunchest allies, Levy and his team of libertarian lawyers watched the zeitgeist, crafted a smart (though risky) strategy, and won.
Our legal system and our Constitution allowed them to do something about D.C.’s gun laws, even as D.C.’s gun laws did not allow its citizens to do much about their own safety. Because this group of people acted to preserve the right to self-defense, the rest of America has seen affirmed at least the basics of that right. The contours of that right to self-defense remain to be defined by others who choose to follow in Levy and his crew’s footsteps.
Senior Editor Brian Doherty is the author of Gun Control on Trial: Inside the Supreme Court Battle Over the Second Amendment, from which this is excerpted.
The inside story of how a gang of libertarian lawyers made constitutional history
Brian Doherty | December 2008 Print Edition
On the last day of the U.S. Supreme Court’s 2008 spring session, the justices declared by a 5-4 decision in D.C. v. Heller that, yes, the Second Amendment does secure an individual right to keep and bear arms. With that, the high court voided the District of Columbia’s extreme regulations on gun ownership, which had amounted in practice to a complete ban on any usable weapon for self-protection, even in the home.
In retrospect, D.C. v. Heller seems almost inevitable, because of shifting public and academic attitudes toward gun rights. But victory came only after a protracted struggle, with many pitfalls along the way. It was pulled off by a small gang of philosophically dedicated lawyers—not “gun nuts” in any stereotypical sense, but thoughtful libertarians who believe Second Amendment liberties are a vital part of our free republic. Together they consciously crafted a solid, clean civil rights case to overturn the most onerous and restrictive set of gun regulations in the country. In the process, they set the stage for further legal challenges to other firearms restrictions from coast to coast.
Someone was going to reach the Supreme Court with a challenge to firearms regulation. In the 2001 Fifth Circuit case U.S. v. Emerson, a federal appeals court for the first time declared unequivocally that the Second Amendment, despite containing the word “militia” in its preamble, did indeed protect an individual right to bear arms. Though groundbreaking in the judicial system, that individual rights interpretation was already dominant within the legal academy, after decades of scholarship chipped away at the once-preeminent “collective rights” view that the amendment only protected either a state’s right to maintain a militia, or an individual’s rights within the context of militia service.
The Emerson decision rippled beyond the courts. On November 9, 2001, then–Attorney General John Ashcroft sent a memo to all U.S. attorneys praising the case for how it “undertook a scholarly and comprehensive review of the pertinent legal materials and specifically affirmed that the Second Amendment ‘protects the right of individuals, including those not then actually a member of any militia or engaged in active military service or training, to privately possess and bear their own firearms.’ ”
Gun rights were on the rise politically as well. Democrats lost Congress in 1994, and the White House in 2000, in part because of a backlash against the 1994 assault weapon ban. In the 21st century, the party no longer makes gun control a major issue. On the state level, laws making it easier for citizens to carry weapons have also been proliferating over the past two decades; the number of states with concealed-weapon “shall issue” standards (objective criteria with little or no bureaucratic discretion) now stands at a de facto 37, up from just eight in 1986.
That was the legal, political, and social environment in which Heller was launched in 2003. “The timing was ripe,” says attorney Robert Levy, then a senior fellow at the libertarian Cato Institute (and now its chairman) and the man who financed and spearheaded the case.
Yet Heller was almost derailed on a series of occasions, sometimes by the very people who cherish gun rights and constitutional protections the most, including the National Rifle Association (NRA). Many lacked confidence that the Court was ready to catch up with the legal academy. In the hour of opportunity, many blinked. Victory over these self-doubts provides a powerful reminder that, as Barry Goldwater reminded us, sometimes an overly fearful moderation in the pursuit of justice is no virtue, and that even decades of bad policy and bad political philosophy can turn around with smart, tenacious efforts.
Parker Becomes Heller
The inevitable post-Emerson challenge to gun restrictions could well have come from a radically different point of view. Various Washington, D.C., public defenders, for example, were trying to apply Emerson to reduce the prison sentences of their clients—street criminals who typically had a whole host of charges hanging over their heads, not otherwise law-abiding citizens seeking to arm themselves in their home.
So, prodded on by suggestions from a young lawyer named Clark Neily from the libertarian public interest law firm the Institute for Justice, Robert Levy assembled a team that included his Cato colleague Gene Healy (who dropped out before the case reached the Supreme Court), Neily himself, and the private-practice attorney who eventually argued the case in front of the Court, a Virginia libertarian named Alan Gura. Levy’s team then went searching for the ideal clients.
D.C. was the best place to start litigating the Second Amendment. The district is not a state but a federal enclave under direct control of Congress (though it has its own government with home-rule leeway), so lawyers could sidestep the contentious and still-unresolved issue of whether the Second Amendment applied to the states via the Fourteenth Amendment, which stipulates that “No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States.…nor deny to any person within its jurisdiction the equal protection of the laws.” That amendment has for the past half-century or so been interpreted to apply the provisions of most of the Bill of Rights to state and local government actions.
Besides, the city had the most ridiculously severe gun laws in the country. According to D.C. Codes 7-2502.01, 7-2502.02, 7-2507.02, and 22-4504 and 4515, it was illegal to have a handgun without registering it, and you couldn’t register it if you didn’t already own it before the law was passed in 1976; it was illegal to have your long gun in the home in any condition other than unloaded and disassembled or trigger-locked; and if you had a registered handgun, even carrying it around your house could net you a year in jail and a $1,000 fine.
After much searching by Levy’s team, six plaintiffs were selected. They filed the case on February 10, 2003. Back then, it wasn’t the Heller case, but the Parker case, named after original lead plaintiff Shelly Parker.
Parker, a black woman, had the potential to become a new kind of civil rights icon, standing up not just for the right to be treated fairly by other people but to take control over her own life and safety. She had a dramatic story of the type that should make everyone this side of Sarah Brady want to overnight her an out-of-state mail-order handgun.
In February 2002, Parker, a former nurse now working in software design, moved to a neighborhood on the northeastern edge of Capitol Hill rife with tenacious drug gangs. She wanted her neighborhood to be a safer and more comfortable place for law-abiding citizens, and so made a nuisance of herself to local drug dealers, walking the streets as a one-woman citizen patrol, calling cops when she saw illegal activity, and installing a security camera for her yard.
By June of that year, Parker’s car window had been broken, her security camera had been stolen, and a gang lookout rammed a car into her back fence. When the first news stories about the case appeared, one young drug dealer, physically imposing at over seven feet tall, allegedly shook her gate one night, shouting, “Bitch, I’ll kill you! I live on this block, too.” Parker thought it would be a good thing for her to have a firearm to protect herself in her home; D.C. law forbade her from doing so.
But, like four of the other original six plaintiffs, Parker was found by the U.S. Court of Appeals for the D.C. Circuit to lack legal “standing”—that is, she had not suffered a direct injury under the law sufficient to allow her to challenge it. By March 2007, Dick Heller was the only plaintiff left. As many involved with the case would admit without wanting to stress it too much, Heller was probably the plaintiff they wanted least as a Second Amendment poster boy.
Heller isn’t a sweet lady trying to turn around a dodgy neighborhood; he’s an outspoken ideological activist seeking to push the federal government back within its constitutional bounds, and therefore (his lawyers fretted) potentially off-putting to judges, media, and citizens alike. One of his best friends, a thick, intense, walrus-mustachioed man named Dane vonBreichenruchardt, runs a small-scale political action group called the Bill of Rights Foundation and appears with Heller at most press conferences and events.
The best hook about Heller was his day job, as a trained and licensed special police officer contracted by a private firm to provide security services for the District of Columbia. For years, he carried a gun every day at the Thurgood Marshall Federal Judicial Center, yet he still had to turn over his sidearm and bullets at the end of each workday and go home, defenseless.
The city could hardly maintain that it was inherently unsafe for Dick Heller to possess or handle a weapon, since he does it every day as part of his job, and is deputized to do so by the city itself, background checks and all.
Heller knew his lawyers weren’t comfortable with him openly discussing many of his anti-government enthusiasms. When the cameras or notepads were in front of him, he wanted to talk about “the insanity of it, the overreach of government relegating all of us to second-class citizenship. The government grants us a gun then takes it away, says your life is not worth spit, but says ‘take care of us 9-5.’ That’s where I developed the idea that we truly are second-class citizens. How is that any different than Moscow?”
And that, he acknowledges, “is when the lawyers would go like this.” He makes a pained and annoyed face. “ ‘Moscow’ and ‘communist’—they didn’t want to hear that yet—until June! They said after the decision comes down, go for it. They almost wrote it down for me: ‘I just want to defend my own life in my own home.’ ”
The NRA v. Heller
The Heller case quickly found a powerful opponent in the National Rifle Association. This surprises nearly every layman I discuss the case with, most of whom assume the NRA was behind the lawsuit in the first place. The Parker lawyers received backroom visits from allies of the NRA before their case was filed, discouraging them from going forward. The Supreme Court (which still had Sandra Day O’Connor back then) would not reliably deliver a victory, they argued, and an authoritative statement from the Supremes that the Second Amendment did not protect an individual right could prove devastating to the long-term cause.
This was an intellectually respectable objection, the Levy team thought, but ultimately too fearful. If no one would fight for the Second Amendment qua Second Amendment in a relevant case, then its supposed paladins were as complicit in its irrelevance as were the most rabid partisans for the idea that the Second Amendment only applied to militias and is thus a dead letter.
“The second problem the NRA had with our case was territorial,” Gura says. “They didn’t want something like this going on that they didn’t have their hands in.” In fact, in April 2003, less than two months after Parker’s filing in U.S. District Court for the District of Columbia, a new lawsuit challenging D.C.’s gun laws, Seegars v. Ashcroft, was filed with the backing of the NRA and its longtime Second Amendment legal eagle Stephen Halbrook in charge.
As per then-standard NRA practice, Halbrook offered the court a menu of options to choose from to overthrow D.C. gun laws, hoping one of them might work even if a direct Second Amendment challenge did not. Among them were claims that Congress had only empowered D.C. to create for itself regulations that were “usual and reasonable,” and that D.C.’s gun laws, being the most severe ones in the nation, were therefore unusual and unreasonable.
Unlike the Levy team, Halbrook and the NRA chose to sue not only Washington, D.C., but the U.S. Department of Justice. The DOJ is a significantly more formidable opponent than the District of Columbia. To add insult to injury, because of their unease with Levy and his comparatively inexperienced crew, the NRA team used Seegars as an excuse to try to scuttle Parker altogether by taking over the case, through the legal gambit of “consolidation.” That’s when two cases that are asking courts to decide on essentially the same matter can be combined, whether or not one of the parties really wants it—a hostile takeover of the litigation, as it were. The consolidation request, made to the court in April 2003, was denied.
Then in January 2004, at the D.C. District Court, all but one Seegars plaintiff—a woman with a registered shotgun contesting the trigger-lock aspect of D.C.’s laws—were denied standing. The last remaining plaintiff lost the case on a basic “doesn’t belong to a militia” argument. The Seegars team appealed, bringing their case into the appeals process before Parker had even been considered at the District Court. It wasn’t until March 31, 2004, that that court dismissed Parker, basically on the grounds that those plaintiffs weren’t in a militia, either. The Levy team expected this initial loss, but appealed, determined to fight the case all the way through the appeals process.
Because the D.C. Circuit Court of Appeals decided that the issues in both cases were essentially the same, they halted the appeals progress of Parker, at D.C.’s request, pending resolution of Seegars. Then in a February 2005 decision, Seegars was wrecked on the rock of standing, for D.C. Circuit-specific peculiarities explained further below.
The NRA also harmed Parker through its decision to bring DOJ into the case. The D.C. Circuit Court of Appeals, in coming down with its Parker decision on March 9, 2007, booted five of the original plaintiffs off the case for lack of standing, the same grounds on which the Seegars plaintiffs had been tossed away. The standing argument had been introduced to the case by the Justice Department; D.C. hadn’t thought of it on its own.
Sure, Parker and her compatriots might think that a core, fundamental constitutional right was being denied them. But by the D.C. Circuit’s standard, they had suffered no specific injury such that they had standing to sue.
The D.C. Circuit has a peculiar position on standing, more stringent than in any other circuit. The 1997 case Navegar v. U.S., coincidentally involving a gun manufacturer, established that plaintiffs must, in the language of D.C.’s filing to dismiss the plaintiffs in Parker, “demonstrate a threat of prosecution that is ‘credible and immediate,’ or imminent, and ‘not merely abstract or speculative.’ ” More or less, D.C. said that since the plaintiffs might be able to get away with breaking the gun laws, they had no standing to challenge those laws.
How is it that Heller alone survived the standing challenge? Even before the Parker case was officially filed, his friend Dane vonBreichenruchardt knew Heller was involved and intending to be a plaintiff—it was vonBreichenruchardt, who already knew Levy, who had introduced Heller to Levy.
VonBreichenruchardt had been a plaintiff in a previous case against certain regulations affecting the operations of nonprofits, rules that he felt amounted to a prior restraint on his First Amendment rights. He saw his case dismissed for lack of standing, for various reasons, one of which was that since he had not actually been punished for violating the law, it could be said that his claim that the regulations in question violated his rights was merely speculative.
So vonBreichenruchardt encouraged Heller to fill out a form to register one of the handguns Heller owned (apparently stored outside the district), even though he knew there was no way the city would actually accept the illegal pistol.
“It makes all the difference in the world that this one guy went down and filled out an absolutely meaningless piece of paper which you knew in advance was a futile act,” Neily says. “It was not intentional on the part of Alan, Bob, and myself, but it was intentional on the part of Dick and Dane, and it was very important that Dane had that insight and did that.” Heller slid in because he had a permit denied: a clear injury with a paper trail.
Standing wasn’t the only issue the D.C. Circuit Court of Appeals decided on March 9, 2007. The other action judges took that day proved to be better news for the Parker team. In a two-one vote, the three-judge panel sent the case back to District Court with an order: Grant summary judgment to Heller. Translation: Heller wins.
The decision was a glorious victory for the Levy team and for the Second Amendment. Judge Laurence H. Silberman, in his majority opinion, hit all the right points. He decided that the “people” referred to in the Amendment meant the people, that is, all of us as individuals. He decided that “bear arms” had more than just a military meaning in the idiom of the Founding era.
Silberman’s decision interpreted the 1939 Supreme Court case U.S. v. Miller, the dominant precedent regarding the Second Amendment, to say that cases hinged on the type of weapon the right affected, and whether the weapon had potential militia use, not on whether persons claiming the right were themselves in a militia. The judge did not accept D.C.’s claim that any constitutional infringement was mitigated because the city might not punish a long-gun owner for loading and using his weapon in self-defense in defiance of the letter of the law. “Judicial leniency,” he wrote, “cannot make up for the unreasonable restriction of a right.”
I mean, do you really think that the great majority of this nation's youth is lock stepping to some type of liberal educational agenda? If so, you are veering into paranoid conspiracy territory.
Is that a fact? Guess you've never dealt with any of the "green" gibberish being pounded into kids' heads these days. Recycling, by just about any empirical measure, is a grossly ineffective tactic that for the most part leads to sorted piles of trash heading to the landfill instead of unsorted piles of trash doing the same. The underlying cause of the recycling effort would be done a hell of a lot of good if some entrepreneur came along and created markets for recycled material, but what's being taught from kindergarten to grad school in just about every school in the nation: sort your trash into piles that end up in the same place, or find economically productive means of using recycled material? I've rarely seen the latter while the former is epidemic, IMO because members of the nanny-state left figure if they can get you feeling guilty enough to muck around in your trash can from elementary school on, they are well on their way to getting you to feel guilty enough about everything else to dictate other odious, counterproductive responses.
As mentioned above, the Second Amendment is another case in point. It's not like there isn't copious source material demonstrating the Second Amendment means what it says, and it's not like there aren't leftists in possession of intellectual rigor who have, oftentimes reluctantly, come out and said the framers clearly intended US citizens to have an individual right to keep and bear arms, but try to find those facts in any curriculum in America. My kids are regularly chastised for relaying historical truth with no veering into paranoid territory required. Think on that for a minute: kids contending with opprobrium at school because they spoke accurately. And that's 'sposed to be trumped by the concern that the folks who are force feeding falsehoods might feel insulted when their methods are called out? Which is the bath water and where is the baby?
The solution is simple: intellectual rigor, but that's getting pretty hard to find in the current everybody-who-competes-gets-a-trophy climate. If "save the planet," or "don't hurt anyone's feelings," or "regurgitate this reflexively" are the educational ends then yes, muddleheaded lockstep is a common result. If, on the other hand, the ability to marshal evidence, speak of it cogently, document sources, analyze them effectively, respond incisively to criticism, and so on is the educational end, then level-headed thinking results.
Indeed, the best class I ever took was taught by Roger Wilkins, a very left wing professor who won a Pulitzer for writing the Washington Post editorials that helped drum Nixon out of office. The first day of class he said something like "I am an unapologetic liberal and believe the left wing democratic ideals are what this country needs most." I remember thinking "oh fornication, here we go again: I'll take on some left wing fruit loop on his own ground and end up with another 'C.'"
Didn't happen; Roger respects rigor above all else and favors a well-framed argument he disagrees with over fuzzy-headed gibberish favoring his side of the aisle. He got my libertarian number quickly and used me as a foil as needed. Many a class would start with Roger saying something like "Guinness, let's hear what you think about affirmative action," and then it would be on from there. I spent many an office hour debating things further with him, took every course he offered, earned an "A" in them all and was used as the example when students complained he never gave high grades, helped him with his research, and, when he'd be called out of town suddenly, was the guy he'd ask to help with his classes in his absence. I learned a hell of a lot from the guy, and respect the hell out of him for showing me just how effective rigor embedded in the lesson plan can be. As such I'm further all sorts of annoyed that I had to spend so much far less productive time in classrooms with baton wavers who'd get wounded and snarky if you didn't join their parade.
Bottom line: there's no need to present alternative forms of education if rigor is introduced early and throughout the curriculum. Alas, these days concepts of PC epistemological purity seem to be the driving force.
I've certainly had a hard time finding historically accurate information about the Second Amendment and its genesis at various colleges, and my kids are certainly not getting accurate info about it in elementary school. AGW, on the other hand, is all over the place; I've had a couple interesting conversations with teachers after my kids relay that dad says AGW is unmitigated foolishness.
Thanks for chasing down that link, Freki. I meant to track it back, too.
More comment on the "hottest October" double data entry problem Goddard has confessed to. Think the author gets it right: the scary part about this error is not that an error occurred, but that a major anomaly went unnoticed because it fit so neatly into what global warming evangelists sought to find.
Lorne Gunter: Global warming numbers get a little help from their friends
Posted: November 17, 2008, 9:13 AM by Kelly McParland
Last week, the Goddard Institute for Space Studies – one of four agencies responsible for monitoring the global temperatures used by the U.N.’s Intergovernmental Panel on Climate Change – released its statistics for October. According to the GISS figures, last month was the warmest October on record around the world.
This struck some observers as odd. There had been no reports of autumn heat waves in the international press and there is almost always blanket coverage of any unusually warm weather since it fits into the widespread media bias that climate catastrophe lies just ahead. In fact, quite the opposite had occurred; there had been plenty of stories about unseasonably cool weather.
London had experienced its first October snow in 70 years. Chicago and the Great Plains states had broken several lowest-temperature records, some of which had stood for 120 years. Tibet had broken snowfall records. Glaciers in Alaska, the Alps and New Zealand had begun advancing. Sea ice expanded so rapidly it covered 30% more of the Arctic than at the end of October 2007. (Of course, you saw few stories about that, too, since interest in the Arctic ice cover is reserved only for when its melting reinforces hysteria over global warming and polar bear extinction.)
So the GISS claim that October was the warmest ever seemed counterintuitive, to say the least.
Thanks, though, to Steve McIntyre, the Toronto computer analyst who maintains the blog climateaudit.org, and Anthony Watts, the American meteorologist who runs wattsupwiththat.com, we did not have to wait long to find out the cause of the GISS’s startling statistics: Data-entry error.
October wasn’t the warmest October ever, it was only the 70th warmest in the past 114 years – in the bottom half of all Octobers, not at the top of the list. So why the massive discrepancy between the published GISS numbers and the correct ones?
Um, some guy – not at Goddard, a GISS spokesman was quick to point out as he toed the ground and gazed downward sheepishly – had supplied the NASA branch with September figures for much of the globe, rather than October ones. September being typically a much warmer month than October (at least in the Northern Hemisphere), when the September temps had been entered into the October report they produced – heh, heh – an unprecedented spike upwards in last month’s temperature.
Yeah, no kidding, like when Santa’s bathroom scale readings are inadvertently entered into Paris Hilton’s weight diary and they produce an unprecedented upward tonnage.
I truly think there was simply a case of garbage in, garbage out.
There have been some in the blogosphere who have charged that GISS’s actions were deliberate; that the institute lied to cover up the fact that through most of 2008 global temperatures have been on a downward plunge. Since that’s bad news for a group that has been at the forefront of promoting climate-change hysteria, GISS manipulated the data to support its campaign.
Frankly, I don’t think it’s that nefarious. Still, I think a bigger problem – unscientific bias at GISS and elsewhere in the global-warming community – has been exposed by this incident.
September figures from scores of weather stations around the world seem merely to have been copied into the GISS October database. Temperatures from Ethiopia, Kenya, Tunisia, Kazakhstan, most of Russia, Ukraine, Brazil, Malaysia, the Philippines, Finland, the U.K., Ireland and elsewhere seem to have been incorrectly duplicated. The problem isn’t that this mistake occurred, but rather that no one at Goddard seemed to think a one-month temperature jump of nearly a full degree worldwide warranted a double-check. The keepers of one of the U.N.’s four primary temperature records are sure the globe is warming dangerously, so sure it never even occurred to them to check why or how October’s figures were so anomalous.
It took bloggers using little more than desktop PCs and Internet connections only a few hours to find the errors. The difference is, they were prepared to look. Their minds were not so clouded by bias in favour of the warming theory that they have stopped asking obvious questions.
Scientists and activists who support the warming theory often insist the science is settled and this incident proves it is settled – in their own minds. For too many, scientific inquiry has ceased.
Long piece with a lot of formatting I'm too lazy to replicate. What follows are the first several paragraphs:
The Futile Quest for Climate Control
By Robert M. Carter
The idea that human beings have changed and are changing the basic climate system of the Earth through their industrial activities and burning of fossil fuels—the essence of the Greens’ theory of global warming—has about as much basis in science as Marxism and Freudianism. Global warming, like Marxism, is a political theory of actions, demanding compliance with its rules.
Marxism, Freudianism, global warming. These are proof—of which history offers so many examples—that people can be suckers on a grand scale. To their fanatical followers they are a substitute for religion. Global warming, in particular, is a creed, a faith, a dogma that has little to do with science. If people are in need of religion, why don’t they just turn to the genuine article?
Climate change knows three realities: science reality, which is what working scientists deal with every day; virtual reality, which is the wholly imaginary world inside computer climate models; and public reality, which is the socio-political system within which politicians, business people and the general citizenry work.
The science reality is that climate is a complex, dynamic, natural system that no one wholly comprehends, though many scientists understand different small parts. So far, science provides no unambiguous evidence that dangerous or even measurable human-caused global warming is occurring.
The virtual reality is that computer models predict future climate according to the assumptions that are programmed into them. There is no established Theory of Climate, and therefore the potential output of all realistic computer general circulation models (GCMs) encompasses a range of both future warmings and coolings, the outcome depending upon the way in which they are constructed. Different results can be produced at will simply by adjusting such poorly known parameters as the effects of cloud cover.
The public reality in 2008 is that, driven by strong environmental lobby groups and evangelistic scientists and journalists, there is a widespread but erroneous belief in our society that dangerous global warming is occurring and that it has human causation.
William Kininmonth (“Illusions of Climate Science”, Quadrant, October) has summarised well the nature of the main scientific arguments that relate to human-caused climate change. Therefore, I shall concentrate here a little less on the science, except as background information that relates to how we got to where we are today. My main aim is to explain the need for a proper national climate change policy that relates to real rather than imaginary risk, a policy position that neither the previous nor the present Australian government has achieved. Instead—in response to strong pressure from lobby groups whose main commonality is financial or other self-interest, and a baying media—our present national climate policy is to try to prevent human-caused global warming. This will be a costly, ineffectual and hence futile exercise.
Using September's data for October made the world seem pretty darn warm indeed. After his "hockey stick" and algorithm (AlGoreIthm?) debacles, you'd think Hansen would check his figures. No doubt it's a vast conspiracy launched by big oil.
The world has never seen such freezing heat
By Christopher Booker
Last Updated: 12:01am GMT 16/11/2008
A surreal scientific blunder last week raised a huge question mark about the temperature records that underpin the worldwide alarm over global warming. On Monday, Nasa's Goddard Institute for Space Studies (GISS), which is run by Al Gore's chief scientific ally, Dr James Hansen, and is one of four bodies responsible for monitoring global temperatures, announced that last month was the hottest October on record.
This was startling. Across the world there were reports of unseasonal snow and plummeting temperatures last month, from the American Great Plains to China, and from the Alps to New Zealand. China's official news agency reported that Tibet had suffered its "worst snowstorm ever". In the US, the National Oceanic and Atmospheric Administration registered 63 local snowfall records and 115 lowest-ever temperatures for the month, and ranked it as only the 70th-warmest October in 114 years.
So what explained the anomaly? GISS's computerised temperature maps seemed to show readings across a large part of Russia had been up to 10 degrees higher than normal. But when expert readers of the two leading warming-sceptic blogs, Watts Up With That and Climate Audit, began detailed analysis of the GISS data they made an astonishing discovery. The reason for the freak figures was that scores of temperature records from Russia and elsewhere were not based on October readings at all. Figures from the previous month had simply been carried over and repeated two months running.
The error was so glaring that when it was reported on the two blogs - run by the US meteorologist Anthony Watts and Steve McIntyre, the Canadian computer analyst who won fame for his expert debunking of the notorious "hockey stick" graph - GISS began hastily revising its figures. This only made the confusion worse because, to compensate for the lowered temperatures in Russia, GISS claimed to have discovered a new "hotspot" in the Arctic - in a month when satellite images were showing Arctic sea-ice recovering so fast from its summer melt that three weeks ago it was 30 per cent more extensive than at the same time last year.
A GISS spokesman lamely explained that the reason for the error in the Russian figures was that they were obtained from another body, and that GISS did not have resources to exercise proper quality control over the data it was supplied with. This is an astonishing admission: the figures published by Dr Hansen's institute are not only one of the four data sets that the UN's Intergovernmental Panel on Climate Change (IPCC) relies on to promote its case for global warming, but they are the most widely quoted, since they consistently show higher temperatures than the others.
If there is one scientist more responsible than any other for the alarm over global warming it is Dr Hansen, who set the whole scare in train back in 1988 with his testimony to a US Senate committee chaired by Al Gore. Again and again, Dr Hansen has been to the fore in making extreme claims over the dangers of climate change. (He was recently in the news here for supporting the Greenpeace activists acquitted of criminally damaging a coal-fired power station in Kent, on the grounds that the harm done to the planet by a new power station would far outweigh any damage they had done themselves.)
Yet last week's latest episode is far from the first time Dr Hansen's methodology has been called in question. In 2007 he was forced by Mr Watts and Mr McIntyre to revise his published figures for US surface temperatures, to show that the hottest decade of the 20th century was not the 1990s, as he had claimed, but the 1930s.
Another of his close allies is Dr Rajendra Pachauri, chairman of the IPCC, who recently startled a university audience in Australia by claiming that global temperatures have recently been rising "very much faster" than ever, in front of a graph showing them rising sharply in the past decade. In fact, as many of his audience were aware, they have not been rising in recent years and since 2007 have dropped.
Dr Pachauri, a former railway engineer with no qualifications in climate science, may believe what Dr Hansen tells him. But whether, on the basis of such evidence, it is wise for the world's governments to embark on some of the most costly economic measures ever proposed, to remedy a problem which may actually not exist, is a question which should give us all pause for thought.
I couldn't care less what consenting adults do behind closed doors. I've heard some compelling arguments as to why institutions designed to propagate families shouldn't be conferred to those biologically incapable of having children, and have some visceral feelings about children being thrust through adoption into same sex parenting controversies, but all in all hope American society is evolving in a manner where the content of someone's character matters far more than what gender they get their jollies with.
Feminists in this country do stand up for the rights of women all over the world.
Uhm, yeah, right. Just like all of them did when Palin was nominated. Duck over to the Daily Kos or the Democratic Underground some day if you want a full dose of situational ethics on full display. Whether it's support of petty dictators like Hugo Chavez, apologists for home-grown terrorists like Bill Ayers and Bernadette Dohrn, or boneheads who think our dead troops got their just deserts, there are a lot of folks on the left who have a very hard time defining a standard and then applying it consistently, which is the reason I no longer count myself among their ranks.
I've seen the abstract of the article published in Nature, but have yet to read the full piece, summarized below.
Global warning: We are actually heading towards a new Ice Age, claim scientists
By CHER THORNHILL
Last updated at 5:06 PM on 13th November 2008
It has plagued scientists and politicians for decades, but scientists now say global warming is not the problem.
We are actually heading for the next Ice Age, they claim.
British and Canadian experts warned the big freeze could bury the east of Britain in 6,000ft of ice.
Most of Scotland, Northern Ireland and England could be covered in 3,000ft-thick ice fields.
The expanses could reach 6,000ft from Aberdeen to Kent – towering above Ben Nevis, Britain’s tallest mountain.
And what's more, the experts blame the global change on falling - rather than climbing - levels of greenhouse gases.
Lead author Thomas Crowley from the University of Edinburgh and Canadian colleague William Hyde say that currently vilified greenhouse gases – such as carbon dioxide – could actually be the key to averting the chill.
The warning, published in the authoritative journal Nature, is based on records of tiny marine fossils and the earth’s shifting orbit.
There are plenty of instances where folks claiming all sorts of PC bona fides stand with thugs who have committed numerous atrocities. The left's love affair with Castro is a case in point, as is its pro-Palestinian, anti-Israel stance. Just about any time any organ of the UN convenes, sundry plutocrats, autocrats, kleptocrats, and their attendants spout some anti-Western-democratic snivel from their perch atop a downtrodden populace, with rare objection from the left side of the aisle.
My comment was a parody of conversations I've witnessed on many occasions where sweetness and light types have had to invoke situational algebra to figure out how to respond when someone they stand in solidarity with behaves in a manner that would earn a white, male, Republican an excoriation in no uncertain terms. If you have not witnessed the same I would question your veracity and hope that the horror you espouse when the extreme is lampooned informs your thinking in less stark instances.
Wait a second, it happened in Afghanistan, so we have to be culturally sensitive. Missives about the oppression of women are only appropriate when they focus on instances of sexism occurring in Western democracies. At least I think that's the current standard, though I confess I get confused when calculating which situational ethic trumps what heinous act.
Laurent Murawiec’s The Mind of Jihad is, at last, a book on radical Islam that does it all. Unlike many engaged in the heated debate over the nature of our enemies, Murawiec does not believe that ancient texts tell us all we need to know. He insists that all ideas change over time, even those believed to have been dictated by God’s angel. He has therefore immersed himself not only in the sacred texts of Islam but also in the richly variegated speeches, writings, and actions of its most extremist practitioners: the jihadis waging war against us.
He candidly admits that it was not easy, that many of his initial ideas turned out to be wrong, and that his current understanding of “the mind of jihad” surprises him. This understanding holds that the current doctrine is far more than the resuscitation of medieval commandments, and in fact has a lot to do with modern European and Soviet totalitarianism.
As Murawiec tells us in fascinating detail, the jihadis have been willing to collaborate with all European totalitarian movements and regimes. And although we have heard quite a lot about their collaboration with the Fuhrer (in the person of Amin al-Husayni, the Grand Mufti of Jerusalem), there was a constant, intimate and extremely important alliance with the Soviet Union, which gave some of the key jihadis training in organization (and, most likely, intelligence as well).
He does go a bit far at times, though. “Most of the ugly repertoire of Modern Arab and Muslim anti-Semitism,” he writes, “came from the Soviet Union (with only the racial-biological component added by the Nazis).” That gives insufficient credit to the long tradition of Muslim anti-Semitism; they didn’t need Lenin and Stalin to teach them to hate Jews. But they did need Hitler and, more importantly, Himmler, to explain the most modern ways to hate, and then annihilate, the Jews. No surprise that the mufti quietly visited Auschwitz with his buddy Adolf Eichmann.
But perhaps the most valuable part of this invaluable book is the fascinating exposition of how Islamists, theoretically tied to a social and political doctrine that made it very difficult, if not impossible, to rebel against Islamic rulers, came to embrace a very leftist call for revolution. The key figure, according to Murawiec, is the Pakistani Sayyid Abul Ala Maududi, a friend of Khomeini and of Sayyid Qutb (Osama bin Laden’s hero). Maududi, as Murawiec notes, is a throwback to the medieval European chiliasts, like Thomas Muntzer and the radical Anabaptists. And like the European millenarians, Maududi’s claims are universal: “Islam addresses its call for effecting (its) program of destruction and reconstruction, revolution and reform not to just one nation, but to all humanity.” This effectively transforms Islam from a religion into a political cause, a call to arms, “as if Lenin’s ‘The State and Revolution’ had become their bedtime reading.”
As a result of these European and Soviet influences, the jihadis are inspired by a real lust for blood, and are members of a cult of death. Murawiec has a wonderful eye and a fine nose for telling anecdotes, such as that of Jordanian Prime Minister Wasfi al-Tell’s assassination at the Sheraton hotel in Cairo in November 1971. One of the major figures in the repression of the PLO in Jordan, al-Tell had been the object of death threats following “Black September,” and Arafat’s vengeance was swift and brutal:
Five . . . shots, fired at point-blank range. . . . He staggered back against the shattered swing doors . . . and he fell dying among the shards of glass on the marble floor. As he lay there, one of his killers bent over and lapped the blood that poured from his wounds.
Murawiec calmly draws the proper conclusion: “Something out of the ordinary was occurring, not war in the accepted sense, not political conflict or even guerrilla warfare.”
The Mind of Jihad is a work of considerable elegance and culture; it probably could only have been written by a European who has become an American, as it combines the best of French appreciation for the details of jihadist ideology — and jihadism’s connection to European precursors — with a keen pragmatic eye for the terrible consequences of these ideas and passions. It’s a hell of a book, and it deserves a lot of attention.
— Michael Ledeen is Freedom Scholar at the Foundation for the Defense of Democracies.
Hmm, mayhaps the engines of commerce are loosening things up in China?
China Eases a Licensing Rule for Media
By DAVID BARBOZA | November 14, 2008
SHANGHAI — China agreed on Thursday to loosen restrictions on foreign news and information providers inside the country, settling a trade dispute with the United States, the European Union and Canada.
The agreement, which was signed in Geneva, allows international news and information agencies, like Bloomberg, Dow Jones & Company and Thomson Reuters, to more freely compete and sell their services inside China, where government controls were tightened in 2006.
The United States and European Union had filed a case against China at the World Trade Organization in March arguing that China unfairly required foreign news and financial information providers to be licensed by the Xinhua News Agency, a Chinese state-controlled entity that serves as the official outlet for the Communist Party and is also a competitor of the foreign news companies. Canada later filed its own complaint against China.
According to the settlement, China agreed to remove the requirement that financial news providers be licensed by Xinhua and instead will set up an independent regulatory agency to oversee all financial news and information providers.
Foreign news and financial services companies are eager to sell their services into China’s booming financial services market, where a growing number of Chinese companies and government agencies are seeking valuable and timely news and financial information.
The United States trade representative, Susan C. Schwab, called the settlement a major step toward making financial information more widely available.
She said: “I am very pleased we have been able to sign an agreement with China today to allow financial information suppliers like Bloomberg, Dow Jones, Thomson Reuters to operate in China free of unfair restrictions that threatened to place them at a serious disadvantage.”
Doctor Doom: The Worst Is Not Behind Us
Nouriel Roubini | 11.13.08, 12:01 AM ET
It is useful, at this juncture, to stand back and survey the economic landscape--both as it is now, and as it has been in recent months. So here is a summary of many of the points that I have made for the last few months on the outlook for the U.S. and global economy, as well as for financial markets:
--The U.S. will experience its most severe recession since World War II, much worse and longer and deeper than even the 1974-1975 and 1980-1982 recessions. The recession will continue until at least the end of 2009 for a cumulative gross domestic product drop of over 4%; the unemployment rate will likely reach 9%. The U.S. consumer is shopped-out, saving less and debt-burdened: This will be the worst consumer recession in decades.
--The prospect of a short and shallow six- to eight-month V-shaped recession is out of the window; a U-shaped 18- to 24-month recession is now a certainty, and the probability of a worse, multi-year L-shaped recession (as in Japan in the 1990s) is still small but rising. Even if the economy were to exit a recession by the end of 2009, the recovery could be so weak because of the impairment of the financial system and the credit mechanism that it may feel like a recession even if the economy is technically out of the recession.
--Obama will inherit an economic and financial mess worse than anything the U.S. has faced in decades: the most severe recession in 50 years; the worst financial and banking crisis since the Great Depression; a ballooning fiscal deficit that may be as high as a trillion dollars in 2009 and 2010; a huge current account deficit; a financial system that is in a severe crisis and where deleveraging is still occurring at a very rapid pace, thus causing a worsening of the credit crunch; a household sector where millions of households are insolvent, into negative equity territory and on the verge of losing their homes; a serious risk of deflation as the slack in goods, labor and commodity markets becomes deeper; the risk that we will end in a deflationary liquidity trap as the Fed is fast approaching the zero-bound constraint for the Fed funds rate; the risk of a severe debt deflation as the real value of nominal liabilities will rise, given price deflation, while the value of financial assets is still plunging.
--The world economy will experience a severe recession: Output will sharply contract in the Eurozone, the U.K. and the rest of Europe, as well as in Canada, Japan and Australia/New Zealand. There is also a risk of a hard landing in emerging market economies. Expect global growth--at market prices--to be close to zero in Q3 and negative by Q4. Leaving aside the effects of the fiscal stimulus, China could face a hard landing, with a growth rate of 6% in 2009. The global recession will continue through most of 2009.
--The advanced economies will face stag-deflation (stagnation/recession and deflation) rather than stagflation, as the slack in goods, labor and commodity markets will lead advanced economies' inflation rates to fall below 1% by 2009.
--Expect a few advanced economies (certainly the U.S. and Japan and possibly others) to reach the zero-bound constraint for policy rates by early 2009. With deflation on the horizon, zero-bound on interest rates implies the risk of a liquidity trap where money and bonds become perfectly substitutable, where real interest rates become high and rising, thus further pushing down aggregate demand, and where money market fund returns cannot even cover their management costs.
Deflation also implies a debt deflation where the real value of nominal debts is rising, thus increasing the real burden of such debts. Monetary policy easing will become more aggressive in other advanced economies even if the European Central Bank cuts too little too late. But monetary policy easing will be scarcely effective, as it will be pushing on a string, given the glut of global aggregate supply relative to demand--and given a very severe credit crunch.
--For 2009, the consensus estimates for earnings are delusional: Current consensus estimates are that S&P 500 earnings per share (EPS) will be $90 in 2009, up 15% from 2008. Such estimates are outright silly. If EPS falls--as is most likely--to a level of $60, then with a price-to-earnings (P/E) ratio of 12, the S&P 500 index could fall to 720 (i.e. about 20% below current levels).
If the P/E falls to 10--as is possible in a severe recession--the S&P could be down to 600, or 35% below current levels.
And in a very severe recession, one cannot exclude that EPS could fall as low as $50 in 2009, dragging the S&P 500 index to as low as 500. So, even based on fundamentals and valuations, there are significant downside risks to U.S. equities (20% to 40%).
Similar arguments can be made for global equities: A severe global recession implies further downside risks to global equities on the order of 20% to 30%. Thus, the recent rally in U.S. and global equities was only a bear-market sucker's rally that is already fizzling out--buried under a mountain of worse-than-expected macro, earnings and financial news.
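The arithmetic behind these equity scenarios is simply index level ≈ earnings per share (EPS) × price-to-earnings (P/E) multiple. A minimal sketch using the figures quoted in the column; the ~910 "current level" is an assumption inferred from the stated percentage drops, not a number from the text:

```python
# S&P 500 scenarios from Roubini's column: index level = EPS * P/E multiple.
# EPS and P/E figures are quoted in the text; CURRENT_LEVEL is an assumed
# value (~910) back-calculated from the column's "20% below" / "35% below".

CURRENT_LEVEL = 910

scenarios = {
    "consensus (delusional)": (90, 12),
    "likely recession": (60, 12),
    "severe recession": (60, 10),
    "very severe recession": (50, 10),
}

for name, (eps, pe) in scenarios.items():
    level = eps * pe
    change = (level - CURRENT_LEVEL) / CURRENT_LEVEL * 100
    print(f"{name:>24}: EPS ${eps} x P/E {pe} = {level} ({change:+.0f}% vs ~{CURRENT_LEVEL})")
```

With these inputs the $60 EPS / 12 P/E case gives 720 (roughly 20% below 910) and the $60 / 10 case gives 600 (roughly a third below), matching the column's stated downside range.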
--Credit losses will be well above $1 trillion and closer to $2 trillion, as such losses will spread from subprime to near-prime and prime mortgages and home equity loans (and the related securitized products); to commercial real estate, to credit cards, auto loans and student loans; to leveraged loans and LBOs, to muni bonds, corporate bonds, industrial and commercial loans and credit default swaps. These credit losses will lead to a severe credit crunch, absent a rapid and aggressive recapitalization of financial institutions.
--Almost all of the $700 billion in the TARP program will be used to recapitalize U.S. financial institutions (banks, broker dealers, insurance companies, finance companies) as rising credit losses (close to $2 trillion) will imply that the initial $250 billion allocated to recap these institutions will not be enough. Sooner rather than later, a TARP-2 will become necessary, as the recapitalization needs of U.S. financial institutions will likely be well above $1 trillion.
--Current spreads on speculative-grade bonds may widen further as a tsunami of defaults will hit the corporate sector; investment-grade bond spreads have widened excessively relative to financial fundamentals, but further spread-widening is possible, driven by market dynamics, deleveraging and the fact that many AAA-rated firms (say, GE) are not really AAA, and should be downgraded by the rating agencies.
--Expect a U.S. fiscal deficit of almost $1 trillion in 2009 and 2010. The outlook for the U.S. current account deficit is mixed: The recession, a rise in private savings and a fall in investment, and a further fall in commodity prices will tend to shrink it, but a stronger dollar, global demand weakness and a larger U.S. fiscal deficit will tend to worsen it. On net, we will observe still-large U.S. twin fiscal and current account deficits--and less willingness and ability in the rest of the world to finance them unless the interest rate on such debt rises.
--In this economic and financial environment, it is wise to stay away from most risky assets for the next 12 months: There are downside risks to U.S. and global equities; credit spreads--especially for the speculative grade--may widen further; commodity prices will fall another 20% from current levels; gold will also fall as deflation sets in; the U.S. dollar may weaken further in the next six to 12 months as the factors behind the recent rally wear off, while medium-term bearish fundamentals for the dollar set in again; government bond yields in the U.S. and advanced economies may fall further as recession and deflation emerge but, over time, the surge in fiscal deficits in the U.S. and globally will reduce the supply of global savings and lead to higher long-term interest rates unless the fall in global real investment outpaces the fall in global savings.
Expect further downside risks to emerging-markets assets (in particular, equities and local and foreign currency debt), especially in economies with significant macro, policy and financial vulnerabilities. Cash and cash-like instruments (short-term dated government bonds and inflation-indexed bonds that do well both in inflation and deflation times) will dominate most risky assets.
So, serious risks and vulnerabilities remain, and the downside risks to financial markets (worse than expected macro news, earnings news and developments in systemically important parts of the global financial system) will, over the next few months, overshadow the positive news (G-7 policies to avoid a systemic meltdown, and other policies that--in due time--may reduce interbank spreads and credit spreads).
Beware, therefore, of those who tell you that we have reached a bottom for risky financial assets. The same optimists told you that we reached a bottom and the worst was behind us after the rescue of the creditors of Bear Stearns in March; after the announcement of the possible bailout of Fannie and Freddie in July; after the actual bailout of Fannie and Freddie in September; after the bailout of AIG in mid-September; after the TARP legislation was presented; and after the latest G-7 and E.U. action.
In each case, the optimists argued that the latest crisis and rescue policy response was the cathartic event that signaled the bottom of the crisis and the recovery of markets. They were wrong literally at least six times in a row as the crisis--as I have consistently predicted over the last year--became worse and worse. So enough of the excessive optimism that has been proved wrong at least six times in the last eight months alone.
A reality check is needed to assess risks--and to take appropriate action. And reality tells us that we barely avoided, only a week ago, a total systemic financial meltdown; that the policy actions are now finally more aggressive and systematic, and more appropriate; that it will take a long while for interbank and credit markets to mend; that further important policy actions are needed to avoid the meltdown and an even more severe recession; that central banks, instead of being the lenders of last resort, will be, for now, the lenders of first and only resort; that even if we avoid a meltdown, we will experience a severe U.S., advanced economy and, most likely, global recession, the worst in decades; that we are in the middle of a severe global financial and banking crisis, the worst since the Great Depression; and that the flow of macro, earnings and financial news will significantly surprise (as during the last few weeks) on the downside with significant further risks to financial markets.
I'll stop now.
Nouriel Roubini, a professor at the Stern Business School at New York University and chairman of Roubini Global Economics, is a weekly columnist for Forbes.com.
IT'S a question at the heart of what it is to be human: why do we go to war? The cost to human society is enormous, yet for all our intellectual development, we continue to wage war well into the 21st century.
Now a new theory is emerging that challenges the prevailing view that warfare is a product of human culture and thus a relatively recent phenomenon. For the first time, anthropologists, archaeologists, primatologists, psychologists and political scientists are approaching a consensus. Not only is war as ancient as humankind, they say, but it has played an integral role in our evolution.
The theory helps explain the evolution of familiar aspects of warlike behaviour, such as gang warfare. It even suggests that the cooperative skills we had to develop to be effective warriors have turned into the modern ability to work towards a common goal.
These ideas emerged at a conference last month on the evolutionary origins of war at the University of Oregon in Eugene. "The picture that was painted was quite consistent," says Mark Van Vugt, an evolutionary psychologist at the University of Kent, UK. "Warfare has been with us for at least several tens, if not hundreds, of thousands of years." He thinks it was already there in the common ancestor we share with chimps. "It has been a significant selection pressure on the human species," he says. In fact several fossils of early humans have wounds consistent with warfare.
Studies suggest that warfare accounts for 10 per cent or more of all male deaths in present-day hunter-gatherers. "That's enough to get your attention," says Stephen LeBlanc, an archaeologist at Harvard University's Peabody Museum in Boston.
Primatologists have known for some time that organised, lethal violence is common between groups of chimpanzees, our closest relatives. Whether between chimps or hunter-gatherers, however, intergroup violence is nothing like modern pitched battles. Instead, it tends to take the form of brief raids using overwhelming force, so that the aggressors run little risk of injury. "It's not like the Somme," says Richard Wrangham, a primatologist at Harvard University. "You go off, you make a hit, you come back again." This opportunistic violence helps the aggressors weaken rival groups and thus expand their territorial holdings.
Such raids are possible because humans and chimps, unlike most social mammals, often wander away from the main group to forage singly or in smaller groups, says Wrangham. Bonobos - which are as closely related to humans as chimps are - have little or no intergroup violence because they tend to live in habitats where food is easier to come by, so that they need not stray from the group.
If group violence has been around for a long time in human society then we ought to have evolved psychological adaptations to a warlike lifestyle. Several participants presented the strongest evidence yet that males - whose larger and more muscular bodies make them better suited for fighting - have evolved a tendency towards aggression outside the group but cooperation within it. "There is something ineluctably male about coalitional aggression - men bonding with men to engage in aggression against other men," says Rose McDermott, a political scientist at Stanford University in California.
Aggression in women, she notes, tends to take the form of verbal rather than physical violence, and is mostly one on one. Gang instincts may have evolved in women too, but to a much lesser extent, says John Tooby, an evolutionary psychologist at the University of California at Santa Barbara. This is partly because of our evolutionary history, in which men are often much stronger than women and therefore better suited for physical violence. This could explain why female gangs only tend to form in same-sex environments such as prison or high school. But women also have more to lose from aggression, Tooby points out, since they bear most of the effort of child-rearing.
Not surprisingly, McDermott, Van Vugt and their colleagues found that men are more aggressive than women when playing the leader of a fictitious country in a role-playing game. But Van Vugt's team observed more subtle responses in group bonding. For example, male undergraduates were more willing than women to contribute money towards a group effort - but only when competing against rival universities. If told instead that the experiment was to test their individual responses to group cooperation, men coughed up less cash than women did. In other words, men's cooperative behaviour only emerged in the context of intergroup competition (Psychological Science, vol 18, p 19).
Some of this behaviour could arguably be attributed to conscious mental strategies, but anthropologist Mark Flinn of the University of Missouri at Columbia has found that group-oriented responses occur on the hormonal level, too. He found that cricket players on the Caribbean island of Dominica experience a testosterone surge after winning against another village. But this hormonal surge, and presumably the dominant behaviour it prompts, was absent when the men beat a team from their own village, Flinn told the conference. "You're sort of sending the signal that it's play. You're not asserting dominance over them," he says. Similarly, the testosterone surge a man often has in the presence of a potential mate is muted if the woman is in a relationship with his friend. Again, the effect is to reduce competition within the group, says Flinn. "We really are different from chimpanzees in our relative amount of respect for other males' mating relationships."
The net effect of all this is that groups of males take on their own special dynamic. Think soldiers in a platoon, or football fans out on the town: cohesive, confident, aggressive - just the traits a group of warriors needs.
Chimpanzees don't go to war in the way we do because they lack the abstract thought required to see themselves as part of a collective that expands beyond their immediate associates, says Wrangham. However, "the real story of our evolutionary past is not simply that warfare drove the evolution of social behaviour," says Samuel Bowles, an economist at the Santa Fe Institute in New Mexico and the University of Siena, Italy. The real driver, he says, was "some interplay between warfare and the alternative benefits of peace".
Though women seem to help broker harmony within groups, says Van Vugt, men may be better at peacekeeping between groups.
Our warlike past may have given us other gifts, as well. "The interesting thing about war is we're focused on the harm it does," says Tooby. "But it requires a super-high level of cooperation." And that seems to be a heritage worth hanging on to.
I just wish Bush, i.e. the Republican Party, had been more fiscally responsible. I mean, Democrats are supposed to spend, but Republicans are supposed to keep a tight purse. Look at the mess we are in when they both team up and spend and print money.
No debate at all there. Indeed, as everyone writes myopic op-eds about how the Repubs are now doomed, I'm hoping this last election serves to prune the party of its non-fiscally-conservative and non-libertarian impulses. It'll be interesting to see how the God Squad side of the party influences the recovery.
On the flip side, across the aisle certainly doesn't have much call to become anything but more of the same. It'll be interesting to see how they handle their "mandate." Already a bunch of rumblings from the Bush loathers that investigations pend; think that impulse has big backfire potential.
My point in response was that the Republican side as well led by Bush these past 8 years has not been fiscally responsible with my tax dollars; I don't care about voluntary contributions.
Ha! Fiscally responsible legislators. Next you'll be demanding sober Vicars.
And no, in politics when you win (again on both sides of the aisle) the same standards don't apply. To some degree, (unfortunately) I think that when you win the same standards (versus the loser) don't apply to most things in life.
Alas, these days when vying for the win the same standards don't apply, witness the various mea culpas emerging as media outlets deconstruct their coverage of Palin et al.
When reason.tv spoke with former FEC head Brad Smith earlier this year, he offered this through-the-looking-glass take on campaign finance requirements:
Imagine if George Bush were to announce here in the fading twilight of his presidency that in order to prevent terrorists from infiltrating American political parties and thus asserting control of American government, we needed to introduce the PATRIOT II Act. And the PATRIOT II Act would require citizens to report to the government their political activities. And the government would keep that in a database, which by the way they would then make available to private individuals like employers or maybe groups that might want to protest outside your home...
You know what, we have that law already, and it's called campaign finance, it's called the Federal Election Campaign Act. Which requires you to report to the government, or requires the campaigns to report to the government people who give them money and the government keeps that in a database, and they make that available, anybody can go online and look that stuff up on the Internet.
Ta Da! Meet Scott Eckern, the Mormon artistic director of the California Musical Theater (take a second to ponder that combo), who was forced to resign yesterday after activists mining campaign donations publicized the fact that he had given money to the effort to ban gay marriage in California.
It is, of course, the perfect right of the theater to send him packing for any reason, and I personally think anyone who gives money to oppose gay marriage sucks nuts.
But the whole episode is pretty unsavory. Eckern, who seems to have a decent relationship with his sister (a lesbian), and good relationships with his theater colleagues (many of them gay), was probably not spewing anti-gay bile at work. If he had been, it's hard to imagine he would have lasted for seven years in his current position.
Instead, Eckern's private, personal donation to a legal political cause he believes in was forced into the public eye by government-mandated disclosure. It seems unlikely that Eckern wanted the donation to be made public—he may not have even known that it would be. Though I hesitate to make this comparison for obvious reasons, Eckern was essentially outed by the state for his privately-held views.
But wait, The New York Times says "the swift resignation was not met with cheers by those on either side." Whew. At least everyone realizes that this is a forced error, that everyone has been put into a terrible position by forces outside of their control.
Or not. Marc Shaiman, the Tony Award-winning composer, told the Times that the entire episode left him "'deeply troubled' because of the potential for backlash against gays who protested Mr. Eckern’s donation." [itals mine]
"It will not help our cause because we will be branded exactly as what we were trying to fight," said Mr. Shaiman, who is gay.
At worst, those who forced out Eckern are guilty of failing to give him the benefit of the doubt, and perhaps (as Shaiman can't quite bring himself to admit) a little hypocrisy. Imagine the situation reversed: A small non-profit that focuses on, say, education and happens to be culturally conservative, discovers that an employee has given money to protect gay marriage and fires him.
But the real culprit here is campaign finance laws. Not all political actions should be public actions, and this case illustrates why minorities of all kinds occasionally need privacy to be full participants in political life.
Wow, speaking of Glib Cognitive Dissonance, are you really conflating how the national debt is handled with how the DNC handles its debt? I realize Bush is the root of all evil and controls sundry nefarious plots that would put the Illuminati to shame, but last time I looked Congress held a purse string or two.
The absurdity doesn't end there. By reneging on his promise to accept federal campaign funds BHO managed to outspend McCain several times over. By what factor is hard to pin down as the act of reneging removed BHO from oversight. McCain, on the other hand, snared himself both by accepting federal funds and then by being forced to abide by McCain Feingold. Don't get me wrong, I think that latter piece of legislation is an utter abomination that directly impinges on First Amendment freedoms, and it's amusing to watch the resulting petard hoist, but imagine the media narrative if the roles were reversed: filthy rich Republicans break promise and bludgeon honorable black guy who played by the rules to defeat by grossly outspending him. Guess the standards don't apply the same way when your guy wins.
We live in two Americas. One America, now the minority, functions in a print-based, literate world. It can cope with complexity and has the intellectual tools to separate illusion from truth. The other America, which constitutes the majority, exists in a non-reality-based belief system. This America, dependent on skillfully manipulated images for information, has severed itself from the literate, print-based culture. It cannot differentiate between lies and truth. It is informed by simplistic, childish narratives and clichés. It is thrown into confusion by ambiguity, nuance and self-reflection. This divide, more than race, class or gender, more than rural or urban, believer or nonbeliever, red state or blue state, has split the country into radically distinct, unbridgeable and antagonistic entities.
There are over 42 million American adults, 20 percent of whom hold high school diplomas, who cannot read, as well as the 50 million who read at a fourth- or fifth-grade level. Nearly a third of the nation’s population is illiterate or barely literate. And their numbers are growing by an estimated 2 million a year. But even those who are supposedly literate retreat in huge numbers into this image-based existence. A third of high school graduates, along with 42 percent of college graduates, never read a book after they finish school. Eighty percent of the families in the United States last year did not buy a book.
The illiterate rarely vote, and when they do vote they do so without the ability to make decisions based on textual information. American political campaigns, which have learned to speak in the comforting epistemology of images, eschew real ideas and policy for cheap slogans and reassuring personal narratives. Political propaganda now masquerades as ideology. Political campaigns have become an experience. They do not require cognitive or self-critical skills. They are designed to ignite pseudo-religious feelings of euphoria, empowerment and collective salvation. Campaigns that succeed are carefully constructed psychological instruments that manipulate fickle public moods, emotions and impulses, many of which are subliminal. They create a public ecstasy that annuls individuality and fosters a state of mindlessness. They thrust us into an eternal present. They cater to a nation that now lives in a state of permanent amnesia. It is style and story, not content or history or reality, which inform our politics and our lives. We prefer happy illusions. And it works because so much of the American electorate, including those who should know better, blindly cast ballots for slogans, smiles, the cheerful family tableaux, narratives and the perceived sincerity and the attractiveness of candidates. We confuse how we feel with knowledge.
The illiterate and semi-literate, once the campaigns are over, remain powerless. They still cannot protect their children from dysfunctional public schools. They still cannot understand predatory loan deals, the intricacies of mortgage papers, credit card agreements and equity lines of credit that drive them into foreclosures and bankruptcies. They still struggle with the most basic chores of daily life from reading instructions on medicine bottles to filling out bank forms, car loan documents and unemployment benefit and insurance papers. They watch helplessly and without comprehension as hundreds of thousands of jobs are shed. They are hostages to brands. Brands come with images and slogans. Images and slogans are all they understand. Many eat at fast food restaurants not only because it is cheap but because they can order from pictures rather than menus. And those who serve them, also semi-literate or illiterate, punch in orders on cash registers whose keys are marked with symbols and pictures. This is our brave new world.
Political leaders in our post-literate society no longer need to be competent, sincere or honest. They only need to appear to have these qualities. Most of all they need a story, a narrative. The reality of the narrative is irrelevant. It can be completely at odds with the facts. The consistency and emotional appeal of the story are paramount. The most essential skill in political theater and the consumer culture is artifice. Those who are best at artifice succeed. Those who have not mastered the art of artifice fail. In an age of images and entertainment, in an age of instant emotional gratification, we do not seek or want honesty. We ask to be indulged and entertained by clichés, stereotypes and mythic narratives that tell us we can be whomever we want to be, that we live in the greatest country on Earth, that we are endowed with superior moral and physical qualities and that our glorious future is preordained, either because of our attributes as Americans or because we are blessed by God or both.
The ability to magnify these simple and childish lies, to repeat them and have surrogates repeat them in endless loops of news cycles, gives these lies the aura of an uncontested truth. We are repeatedly fed words or phrases like yes we can, maverick, change, pro-life, hope or war on terror. It feels good not to think. All we have to do is visualize what we want, believe in ourselves and summon those hidden inner resources, whether divine or national, that make the world conform to our desires. Reality is never an impediment to our advancement. The Princeton Review analyzed the transcripts of the Gore-Bush debates, the Clinton-Bush-Perot debates of 1992, the Kennedy-Nixon debates of 1960 and the Lincoln-Douglas debates of 1858. It reviewed these transcripts using a standard vocabulary test that indicates the minimum educational standard needed for a reader to grasp the text. During the 2000 debates, George W. Bush spoke at a sixth-grade level (6.7) and Al Gore at a seventh-grade level (7.6). In the 1992 debates, Bill Clinton spoke at a seventh-grade level (7.6), while George H.W. Bush spoke at a sixth-grade level (6.8), as did H. Ross Perot (6.3). In the debates between John F. Kennedy and Richard Nixon, the candidates spoke in language used by 10th-graders. In the debates of Abraham Lincoln and Stephen A. Douglas the scores were respectively 11.2 and 12.0. In short, today’s political rhetoric is designed to be comprehensible to a 10-year-old child or an adult with a sixth-grade reading level. It is fitted to this level of comprehension because most Americans speak, think and are entertained at this level. This is why serious film and theater and other serious artistic expression, as well as newspapers and books, are being pushed to the margins of American society. Voltaire was the most famous man of the 18th century. Today the most famous “person” is Mickey Mouse.
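The essay doesn't specify which vocabulary test the Princeton Review used. As an illustration of how such grade-level scores are typically computed, here is a minimal sketch of the standard Flesch-Kincaid grade-level formula (a different but widely used readability measure, not the cited tool), using an assumed, deliberately crude syllable heuristic:

```python
import re

def syllables(word: str) -> int:
    """Crude syllable estimate: count vowel groups, with a floor of one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade: 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syll = sum(syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syll / len(words)) - 15.59

# Two made-up samples: short declarative sentences vs. dense polysyllabic prose.
simple = "See the dog. The dog can run. I like the dog."
dense = ("The epistemological ramifications of contemporary political "
         "discourse necessitate sophisticated analytical frameworks.")

print(f"simple: grade {fk_grade(simple):.1f}")
print(f"dense:  grade {fk_grade(dense):.1f}")
```

Longer sentences and more syllables per word both push the score up, which is why a sixth-grade score for a debate transcript says as much about sentence length and word choice as about ideas.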
In our post-literate world, because ideas are inaccessible, there is a need for constant stimulus. News, political debate, theater, art and books are judged not on the power of their ideas but on their ability to entertain. Cultural products that force us to examine ourselves and our society are condemned as elitist and impenetrable. Hannah Arendt warned that the marketization of culture leads to its degradation, that this marketization creates a new celebrity class of intellectuals who, although well read and informed themselves, see their role in society as persuading the masses that “Hamlet” can be as entertaining as “The Lion King” and perhaps as educational. “Culture,” she wrote, “is being destroyed in order to yield entertainment.”
“There are many great authors of the past who have survived centuries of oblivion and neglect,” Arendt wrote, “but it is still an open question whether they will be able to survive an entertaining version of what they have to say.”
The change from a print-based to an image-based society has transformed our nation. Huge segments of our population, especially those who live in the embrace of the Christian right and the consumer culture, are completely unmoored from reality. They lack the capacity to search for truth and cope rationally with our mounting social and economic ills. They seek clarity, entertainment and order. They are willing to use force to impose this clarity on others, especially those who do not speak as they speak and think as they think. All the traditional tools of democracies, including dispassionate scientific and historical truth, facts, news and rational debate, are useless instruments in a world that lacks the capacity to use them.
As we descend into a devastating economic crisis, one that Barack Obama cannot halt, there will be tens of millions of Americans who will be ruthlessly thrust aside. As their houses are foreclosed, as their jobs are lost, as they are forced to declare bankruptcy and watch their communities collapse, they will retreat even further into irrational fantasy. They will be led toward glittering and self-destructive illusions by our modern Pied Pipers—our corporate advertisers, our charlatan preachers, our television news celebrities, our self-help gurus, our entertainment industry and our political demagogues—who will offer increasingly absurd forms of escapism.
The core values of our open society, the ability to think for oneself, to draw independent conclusions, to express dissent when judgment and common sense indicate something is wrong, to be self-critical, to challenge authority, to understand historical facts, to separate truth from lies, to advocate for change and to acknowledge that there are other views, different ways of being, that are morally and socially acceptable, are dying. Obama used hundreds of millions of dollars in campaign funds to appeal to and manipulate this illiteracy and irrationalism to his advantage, but these forces will prove to be his most deadly nemesis once they collide with the awful reality that awaits us.
So it's not like we don't have a model for what can happen with soak the rich tax schemes. . . .
Empire State Implosion
The financial meltdown and the welfare state.
The global credit panic has swept away many illusions, and we're about to find out if that includes those of the politicians who have feasted for years on Wall Street tax revenues. Ground Zero is New York, which has lived a tax-and-spend fantasy thanks to the long bull market and "progressive" tax rates. Reality is now biting.
The financial services industry employs between 2% and 3% of nongovernment workers in New York, the same as it did in the late 1970s. What's changed is the share of total wages in the state represented by Wall Street jobs, which had skyrocketed to nearly 20% last year from a little over 2% in 1977.
"This is 212,000 people making nearly $80 billion in wages and salaries last year," explained E.J. McMahon of the Manhattan Institute at a recent panel discussion on the financial crisis. "This is all taxed at the margin, so it plays an outsized role in the state's finances." This is also the dirty little secret of highly "progressive" tax rates: They make a state dependent on relatively few taxpayers.
The financial industry doubled its percentage of the national economy in the 1980s, and did so again between 1990 and 2006. As Wall Street wages have grown, so has New York's dependence on revenue from the personal income tax. In 1977 personal income taxes represented less than 45% of all state taxes. In 2007 they represented about 60%. And for the past 30 years, inflation-adjusted state spending has tracked closely with booms and busts on Wall Street. According to John Cape, a former state budget director, about 45,000 New York taxpayers provide the state "with anywhere from 20% to 30% of total income tax receipts."
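The concentration McMahon and Cape describe can be made concrete with some back-of-the-envelope arithmetic. All inputs below are the article's own figures; the script is just an illustration, not new data:

```python
# Back-of-the-envelope check of the tax-concentration figures quoted
# above. All inputs are the article's numbers.
wall_street_jobs = 212_000      # workers in the securities industry
wall_street_wages = 80e9        # "nearly $80 billion" in wages and salaries

avg_wage = wall_street_wages / wall_street_jobs
print(f"Average Wall Street wage: ${avg_wage:,.0f}")  # about $377,000

# Under a steeply progressive code, a small group carries the load:
top_filers = 45_000             # John Cape's estimate
share_low, share_high = 0.20, 0.30
print(f"{top_filers:,} filers supply {share_low:.0%}-{share_high:.0%} "
      f"of income tax receipts")
```

An average wage near $377,000 per worker, taxed at the top marginal rates, is why a sliver of the workforce can dominate state revenue.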
New York City has also done little to decrease its addiction to revenue from a single industry. Mayor Michael Bloomberg missed the chance to use 9/11 as an opportunity for reform, and he's declined to challenge public unions over pay and benefits. Bigger and bigger budgets have been submitted and approved as though record Wall Street profits would never end. The financial industry is 14% of gross city product. In 2006, New York City received 50% of its personal income tax revenue from the top 1% of earners, many of whom work in finance.
During previous downturns Albany has resisted structural reforms. Instead of lessening the state's dependence on this narrow slice of the tax base, lawmakers have been content to wait for Wall Street to come roaring back. To cover the rising costs of debt payments, school aid, Medicaid, pensions and other budget drivers, they've raised taxes, sometimes temporarily but often permanently.
It would be a tragic mistake to view the current downturn as merely another cyclical blip. It may take Wall Street years to come back, and once it does it certainly won't look the same. Fewer big global banks are likely to emerge from the ashes, and while they will be better capitalized, they will also be more highly regulated. More reasonable leverage ratios mean less risk-taking and less profit even in good times. Bonus pools are likely to be anemic for some time.
New York's revenue coffers are set to take a hit. The only question is how big. The state budget deficit is already projected to be $1.5 billion in the current fiscal year, and Governor David Paterson estimates it could grow to $14 billion over the next two years if nothing is done.
To his credit, the Democratic Governor is trying to force Albany to confront its addictions. He's said that a tax hike -- even one targeting only the "rich" -- would be damaging. Mr. Paterson is urging labor unions to renegotiate contracts on behalf of public employees. And he's proposed trimming as much as $2 billion from this year's budget, including cuts to health care and education.
Naturally, union officials and hospital advocacy groups are balking at the Governor's requests and pushing for tax increases, but out-of-control education and Medicaid spending is what has fed the state's structural deficit. New York spends more money per pupil ($14,000) than any other state. Its only rivals are New Jersey and Connecticut, and all three are at least 40% above the national average. The state's Medicaid costs of $2,260 per resident are twice the national average and equal to what Texas and Florida spend combined.
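The spending comparisons above imply a couple of numbers worth making explicit. This sketch only restates the article's figures; it introduces no new data:

```python
# Quick arithmetic behind the spending comparisons above; all figures
# are the article's.
ny_per_pupil = 14_000                   # dollars per pupil
# "at least 40% above the national average" implies the national
# average is at most:
national_per_pupil_max = ny_per_pupil / 1.40
print(f"Implied national per-pupil average: at most "
      f"${national_per_pupil_max:,.0f}")  # at most $10,000

ny_medicaid_per_resident = 2_260        # dollars per resident
national_medicaid = ny_medicaid_per_resident / 2  # "twice the national average"
print(f"Implied national Medicaid average: ${national_medicaid:,.0f}")  # $1,130
```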
If New York wants to make sure a rejuvenated financial industry returns to Wall Street, it should be looking to reform its steeply progressive tax code. A leaner, more risk-averse and heavily regulated finance industry will be all the more sensitive to the high cost of doing business in New York. The Big Apple already imposes the highest personal income tax rate of any jurisdiction in the country (10.5%). And it's significantly higher than neighboring New Jersey (8.97%) and Connecticut (5%).
The financial industry has been having a painful reckoning with more realistic assessments of risk. New York's politicians need a similarly rude awakening.
Washington's $5 Trillion Tab
Elizabeth Moyer, 11.12.08, 5:15 PM ET
For all the fury over Treasury Secretary Henry Paulson's $700 billion emergency economic relief fund, it seems downright puny when compared to the running total of the government's response to the credit crisis.
According to CreditSights, a research firm in New York and London, the U.S. government has put itself on the hook for some $5 trillion, so far, in an attempt to arrest a collapse of the financial system.
The estimate includes many of the various solutions cooked up by Paulson and his counterparts Ben Bernanke at the Federal Reserve and Sheila Bair at the Federal Deposit Insurance Corp., as the credit crisis continues to plague banks and the broader markets.
The Fed has taken on much of that total, including lending a cumulative $1 trillion in overnight or short-term loans since March to primary dealers through its emergency discount window and making a cumulative $1.8 trillion available through its term auction facility, a series of short-term transactions it began making available twice a month in January. It should be noted that a portion of the funds lent in these programs has been repaid and that the totals represent what has been made available.
The Fed also took on tens of billions in debt, including $29 billion in debt of Bear Stearns, and made $60 billion of credit available to American International Group. It is committing $22.5 billion to set up a special purpose vehicle to manage some of AIG's residential mortgage-backed securities, and it is financing $30 billion of a second fund to hold $70 billion of multi-sector collateralized debt obligations on which AIG wrote credit default swaps.
The Treasury, in addition to the $700 billion raised in the Emergency Economic Stabilization Act, agreed to guarantee money market funds against losses up to $50 billion, will inject $40 billion of capital into AIG and is backing the conservatorship of Fannie Mae and Freddie Mac, to the tune of $200 billion.
The FDIC, meanwhile, is guaranteeing $1.5 trillion of senior unsecured bank debt.
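The headline figure of "some $5 trillion" can be sanity-checked by summing the line items the article enumerates. The grouping and labels below are my own shorthand for those items, so treat this as a rough tally rather than CreditSights' actual methodology:

```python
# Rough tally of the commitments listed above, in billions of dollars,
# as the article reports them. Labels are this sketch's shorthand.
commitments_bn = {
    "Fed short-term loans to primary dealers": 1_000,
    "Fed term auction facility":               1_800,
    "Bear Stearns debt":                          29,
    "AIG credit line":                            60,
    "AIG RMBS vehicle":                           22.5,
    "AIG CDO fund financing":                     30,
    "Treasury EESA fund":                        700,
    "Money market guarantee":                     50,
    "AIG capital injection":                      40,
    "Fannie/Freddie backstop":                   200,
    "FDIC bank-debt guarantee":                1_500,
}
total = sum(commitments_bn.values())
print(f"Total: ${total / 1000:.2f} trillion")  # about $5.43 trillion
```

The sum lands a bit above $5 trillion, consistent with CreditSights' "some $5 trillion, so far," and with the article's note that portions of the Fed loans have already been repaid.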
Not included in the total are the Fed's long-existing discount window lending to commercial banks, the mortgage modification plan announced by regulators on Tuesday, support for the Federal Home Loan Banks and a myriad of other programs.
Paulson and Bernanke have tried any number of ways to stop the free fall in housing prices and unfreeze the credit markets, with limited success. Rates that banks charge each other for three-month loans have dropped to 2.1% over the corresponding Treasury security, from their high of 4.8% in October. But lending is contracting as banks brace for rising credit costs and corporate borrowers hunker down.
The Treasury has turned its focus from attempting to buy troubled assets from banks, which was the original intent of the October Emergency Economic Stabilization Act, to injecting capital in the form of preferred equity stakes.
It started out with $125 billion worth of investments in eight major U.S. banks and has since expanded the program to an increasingly broad range of financial and nonfinancial companies. And with just $60 billion left of its initial $350 billion authorization under the emergency act, the Treasury faces a growing number of companies--including Detroit's automakers--begging for assistance.
David Hendler, an analyst at CreditSights, says it looks as if the government is left holding the bag, and of course that translates into everyone.
"The losses have to be taken, but no one wants to take them," Hendler said at a conference Wednesday, speaking about the banks and their handling of troubled assets. "It seems like the taxpayers are going to be taking a good portion of that."
China tries to do so with a degree of success (think Olympics and protest coverage), but it's a losing game over the long term. It'd be like the war on drugs 2.0: you criminalize a large portion of your population and drop a lot of your wealth into enforcement efforts. Not saying it couldn't happen, but the US would have to take the Orwellian turn the UK is flirting with and then widely deploy a lot of nanny state software and hardware.
Emanuel to Republican Drug Warriors: 'Thanks for the White Flag'
Jacob Sullum | November 12, 2008, 1:20pm
In today's column, I noted that Rep. Rahm Emanuel (D-Ill.), Barack Obama's choice for chief of staff, has a history as a hard-line drug warrior. Here is another example of his tougher-than-thou rhetoric, from a 2006 press release "in response to reports that Attorney General Alberto Gonzalez' [sic] called the war on terror a real war, not like the war on drugs":
Thanks for the white flag. From the United States' most senior law enforcement official, the man who should be leading the war on drugs, this white flag of surrender will not be reassuring to the millions of parents trying to protect their kids.
The excuse for Emanuel's attempt to position himself to the right of the Bush administration on drug policy is not just lame but alarming. The statement by Gonzales to which he refers was made during an interview with The Kansas City Star in which the attorney general defended the administration's unilateral, indefinite detention of suspected terrorists. Here's an excerpt from the Star article, which I found on Nexis (italics added):
[Gonzales] said that "just like in every other war," the American people will have to trust the government to protect the rights of those in custody while pursuing justice in secret. Pressed on how long extraordinary measures—for instance, the imprisonment of suspects without the filing of charges—might continue, he said they would last at least until the pursuit of al-Qaida and its accomplices has come to an end.
"First of all this is a real war," he said, drawing a distinction between the war on terror and "the war on drugs or the war on poverty or something like that. It's like the Cold War. At some point this conflict is going to be over. But today it is not over."
Instead of challenging the Bush administration's use of war rhetoric to justify chucking habeas corpus, due process, and the separation of powers, Emanuel faulted it for waging the war on drugs with insufficient enthusiasm. Not only does this not bode well for drug policy in the Obama administration; it further undermines the next president's claim to be better than Bush on civil liberties in general.
The Obamas are a warm vision for the White House -- but he should strive toward full transparency. Plus: Yes, I still like Sarah Palin!
By Camille Paglia
Nov. 12, 2008
Dazed and confused. A week after the election of Barack Obama, millions of American news junkies are in serious cold turkey, the big bump of withdrawal from two years of addiction to the dizzying ups and downs of a campaign that threatened never to end.
Eat dirt, you sour Clintons, who said Obama was "unelectable." Obama's 8 million vote margin over his Republican opponent -- miraculously sparing us endless litigation and chad counting -- was an exhilarating testimony to his personal gifts and power of persuasion. And the formidable Michelle Obama, with her electric combo of brains and style, is already rewriting first ladyhood. The warm partnership of the Obamas (wonderfully caught by the camera as they disappeared offstage after his victory) has set an inspiring standard for modern marriage.
Yes, it's true we know relatively little about Barack Obama, and his triumph is a roll of the dice. But John McCain (like Bob Dole) was a major Republican misfire -- a candidate of personal honor and heroic sacrifice who was woefully inadequate for the times. McCain's lurching grandstanding during the Wall Street crisis made him look like a ham actor on a bender. In debate, McCain was always pugnacious but too often bland or rambling, and he often missed glaring opportunities to score off Obama's vagueness or contradictions.
McCain's brusque treatment of his long-suffering wife, Cindy, was also off-putting -- nowhere more so than after his concession speech, when he barely remembered to give her a perfunctory hug. Probably no one is more relieved by McCain's defeat than Cindy, who seemed too frail and tightly wound for the demanding role of first lady. Now she can slip away once more into blessed privacy.
No one knows whether Obama will move to the center or veer hard left. Perhaps even he doesn't know. But I have great optimism about his political instincts and deftness. He wants to be president of all the people -- if that is possible in so divided a nation. His natural impulse seems to be toward reconciliation and concord. The big question will be how patient the Democratic left wing is in demanding drastic changes in social policy, particularly dicey with a teetering economy.
As I've watched Obama gracefully step up to podiums or move through crowds, I've been reminded not of basketball, with its feints and pivots, but of surfing, that art form of his native Hawaii. A photograph of Obama body surfing on vacation was widely publicized in August. But I'm talking about big-time competitive surfing, as in this stunning video tribute to the death-defying Laird Hamilton (who, like Obama, was raised fatherless in Hawaii). Obama's ability to stay on his feet and outrun the most menacing waves that threaten to engulf him seems to embody the breezy, sunny spirit of the American surfer.
In the closing weeks of the election, however, I became increasingly disturbed by the mainstream media's avoidance of forthright dealing with several controversies that had been dogging Obama -- even as every flimsy rumor about Sarah Palin was being trumpeted as if it were engraved in stone on Mount Sinai. For example, I had thought for many months that the flap over Obama's birth certificate was a tempest in a teapot. But simple questions about the certificate were never resolved to my satisfaction. Thanks to their own blathering, fanatical overkill, of course, the right-wing challenges to the birth certificate never gained traction.
But Obama could have ended the entire matter months ago by publicly requesting Hawaii to issue a fresh, long-form, stamped certificate and inviting a few high-profile reporters in to examine the document and photograph it. (The campaign did make the "short-form" certificate available to Factcheck.org, a project of the Annenberg Public Policy Center at the University of Pennsylvania.) And why has Obama not made his university records or thesis work widely available? The passivity of the press toward Bush administration propaganda about weapons of mass destruction led the nation into the costly blunder of the Iraq war. We don't need another presidency that finds it all too easy to rely on evasion or stonewalling. I deeply admire Obama, but as a voter I don't like feeling gamed or played.
Another issue that I initially dismissed was the flap over William Ayers, the Chicago-based former member of the violent Weather Underground. Conservative radio host Sean Hannity began the drumbeat about Ayers' association with Obama a year ago -- a theme that most of the mainstream media refused to investigate or even report until this summer. I had never heard of Ayers and couldn't have cared less. I was irritated by Hillary Clinton's aggressive flagging of Ayers in a debate, and I accepted Obama's curt dismissal of the issue.
Hence my concern about Ayers has been very slow in developing. The mainstream media should have fully explored the subject early this year and not allowed it to simmer and boil until it flared up ferociously in the last month of the campaign. Obama may not in recent years have been "pallin' around" with Ayers, in Sarah Palin's memorable line, but his past connections with Ayers do seem to have been more frequent and substantive than he has claimed. Blame for the failure of this issue to take hold must also accrue to the conservative talk shows, which use the scare term "radical" with simplistic sensationalism, blanketing everyone under the sun from scraggly ex-hippies to lipstick-chic Nancy Pelosi.
Pursuing the truth about Ayers, I recently rented the 2002 documentary "The Weather Underground," from Netflix. It was riveting. Although the film seems to waver between ominous exposé and blatant whitewash, the full extent of the group's bombing campaign is dramatically demonstrated. It's not for everyone: The film uses gratuitous cutaways of horrifying carnage, from the Vietnam War to the Manson murders (such as Sharon Tate's smiling corpse, bathed in blood). But the news footage of the Greenwich Village townhouse destroyed in 1970 by bomb-making gone wrong in the basement still has enormous impact. Standing in the chaotic street, actor Dustin Hoffman, who lived next door, seems like Everyman at the apocalypse.
Ayers comes off in the film as a vapid, slightly dopey, chronic juvenile with stunted powers of ethical reasoning. The real revelation is his wife, Bernardine Dohrn (who evidently worked at the same large Chicago law firm as Michelle Obama in the mid-1990s). Of course I had heard of Dohrn -- hers was one of the most notorious names of our baby-boom generation -- and I knew her black-and-white police mug shot. But I had never seen footage of her speaking or interacting with others. Well, it's pretty obvious who wears the pants in that family!
The mystery of Bernardine Dohrn: How could such a personable, attractive, well-educated young woman end up saying such things at a 1969 political rally as this (omitted in the film) about the Manson murders: "Dig it. First they killed those pigs, then they ate dinner in the same room with them. They even shoved a fork into a victim's stomach. Wild!" And how could Dohrn have so ruthlessly pursued a decade-long crusade of hatred and terrorism against innocent American citizens and both private and public property?
"The Weather Underground" never searches for answers, but it does show Dohrn, then and now, as a poised, articulate woman of extremely high intelligence and surprising inwardness. The audio extra of her reading the collective's first public communiqué ("Revolutionary violence is the only way") is chilling. But the tumultuous footage of her 1980 surrender to federal authorities is a knockout. Mesmerized, I ran the clip six or seven times of her seated at a lawyer's table while reading her still defiant statement. The sober scene -- with Dohrn hyper-alert in a handsome turtleneck and tweedy jacket -- was tailor-made for Jane Fonda in her "Klute" period, androgynous shag. Only illegalities by federal investigators prevented Dohrn from being put away on ice for a long, long time.
Given that Obama had served on a Chicago board with Ayers and approved funding of a leftist educational project sponsored by Ayers, one might think that the unrepentant Ayers-Dohrn couple might be of some interest to the national media. But no, reporters have been too busy playing mini-badminton with every random spitball about Sarah Palin, who has been subjected to an atrocious and at times delusional level of defamation merely because she has the temerity to hold pro-life views.
How dare Palin not embrace abortion as the ultimate civilized ideal of modern culture? How tacky that she speaks in a vivacious regional accent indistinguishable from that of Western Canada! How risible that she graduated from the State University of Idaho and not one of those plush, pampered commodes of received opinion whose graduates, in their rush to believe the worst about her, have demonstrated that, when it comes to sifting evidence, they don't know their asses from their elbows.
Liberal Democrats are going to wake up from their sadomasochistic, anti-Palin orgy with a very big hangover. The evil genie released during this sorry episode will not so easily go back into its bottle. A shocking level of irrational emotionalism and at times infantile rage was exposed at the heart of current Democratic ideology -- contradicting Democratic core principles of compassion, tolerance and independent thought. One would have to look back to the Eisenhower 1950s for parallels to this grotesque lock-step parade of bourgeois provincialism, shallow groupthink and blind prejudice.
I like Sarah Palin, and I've heartily enjoyed her arrival on the national stage. As a career classroom teacher, I can see how smart she is -- and quite frankly, I think the people who don't see it are the stupid ones, wrapped in the fuzzy mummy-gauze of their own worn-out partisan dogma. So she doesn't speak the King's English -- big whoop! There is a powerful clarity of consciousness in her eyes. She uses language with the jumps, breaks and rippling momentum of a be-bop saxophonist. I stand on what I said (as a staunch pro-choice advocate) in my last two columns -- that Palin as a pro-life wife, mother and ambitious professional represents the next big shift in feminism. Pro-life women will save feminism by expanding it, particularly into the more traditional Third World.
As for the Democrats who sneered and howled that Palin was unprepared to be a vice-presidential nominee -- what navel-gazing hypocrisy! What protests were raised in the party or mainstream media when John Edwards, with vastly less political experience than Palin, got John Kerry's nod for veep four years ago? And Gov. Kathleen Sebelius of Kansas, for whom I lobbied to be Obama's pick and who was on everyone's short list for months, has a record indistinguishable from Palin's. Whatever knowledge deficit Palin has about the federal bureaucracy or international affairs (outside the normal purview of governors) will hopefully be remedied during the next eight years of the Obama presidencies.
The U.S. Senate as a career option? What a claustrophobic, nitpicking comedown for an energetic Alaskan -- nothing but droning committees and incestuous back-scratching. No, Sarah Palin should stick to her governorship and just hit the rubber-chicken circuit, as Richard Nixon did in his long haul back from political limbo following his California gubernatorial defeat in 1962. Step by step, the mainstream media will come around, wipe its own mud out of its eyes, and see Palin for the populist phenomenon that she is.
Camille Paglia's column appears on the second Wednesday of each month. Every third column is devoted to reader letters. Please send questions for her next letters column to this mailbox. Your name and town will be published unless you request anonymity.
Oh my, so many places this tidbit could be posted. Please note that the original piece links to a video of the primary source.
Female Egyptian Lawyer Promotes Sexual Harassment against Jews
by Hana Levi Julian
(IsraelNN.com) A female Egyptian lawyer has recommended that Arab men begin sexually harassing Jewish women as a means of forcing Jews to leave Israel. Egypt, which signed a peace treaty with Israel in 1979, is perceived among Western nations as a moderate Arab nation where secular Arabs are a majority.
In a video clip of the interview which aired on Al Arabiyah television on October 31, 2008, Nagla Al-Imam said, "In my opinion, they are fair game for all Arabs, and there is nothing wrong… this is a new form of resistance."
According to a translation provided by the Middle East Media Research Institute (MEMRI), which released the clip, Al-Imam specified, however, that her "resistance" plan did not include rape.
"No. Sexual harassment… In my view, the [Israeli women] do not have any right to respond. The resistance fighters would not initiate such a thing, because their moral values are much loftier than that. However, if such a thing did happen to them, the [Israeli women] have no right to make any demands, because this would put us on equal terms – leave the land so we won't rape you. These two things are equal," she said.
Al-Imam added that she did not want "young Arab men to be interrogated," but rather, she wanted "these Zionist girls with Israeli citizenship to be expelled from our Arab countries. This is a form of resistance, and a way of rejecting their presence."
Convergence in Action
A socialist columnist writes a libertarian article for a conservative magazine.
I don't agree with all the points, but do enjoy a good screed when I encounter one.
November 17, 2008 Issue
A Long Train of Abuses
By Alexander Cockburn
If there’s one thing defenders of civil liberties know, it’s that assaults on constitutional freedoms are bipartisan. Just as constitutional darkness didn’t first fall with the arrival in the Oval Office of George W. Bush, the shroud will not lift with his departure and the entry of President Barack Obama.
As atrocious as the Bush record on civil liberties has been, there’s no more eager and self-righteous hand reaching out to the Bill of Rights to drop it into the shredder than that of a liberal intent on legislating freedom. Witness the great liberal drive to criminalize expressions of hate and impose fierce punitive enhancements if the criminal has been imprudent enough to perpetrate verbal breaches of sexual or ethnic etiquette while bludgeoning his victim to death.
No doubt the conservatives who cheered Bush on as he abrogated ancient rights and stretched the powers of his office to unseen limits would have shrieked if a Democrat had taken such liberties. But now Obama will be entitled to the lordly prerogatives Bush established.
Growing up in Ireland and the United Kingdom, I gazed with envy at the United States, with its constitutional protections and its Bill of Rights contrasting with the vast ad hoc tapestry of Britain’s repressive laws and “emergency” statutes piled up through the centuries. Successive regimes from the Plantagenet and Tudor periods forward went about the state’s business of enforcing the enclosures, hanging or transporting strikers, criminalizing disrespectful speech, and, of course, abolishing the right to carry even something so innocuous as a penknife. Instructed by centuries of British occupation, my native Ireland, I have to say, took a slightly more relaxed attitude. My father once asked an Irish minister of justice back in the 1960s about the prodigious size and detail of the Irish statute book. “Ah, Claud,” said the minister equably, “our laws are mainly for guidance.”
President Bush was also a man unbound by law, launching appalling assaults on freedom, building on the sound foundation of kindred assaults in Clinton’s time, perhaps most memorably expressed in the screams of parents and children fried by U.S. government forces in the Branch Davidian compound in Waco. Clinton, too, flouted all constitutional war powers inhibitions, with his executive decision to rain bombs on the civilian population of the former Yugoslavia.
Bush has forged resolutely along the path blazed by Clinton in asserting uninhibited executive power to wage war, seize, confine, and torture at will, breaching constitutional laws and international treaties and covenants concerning the treatment of combatants. The Patriot Act took up items on the Justice Department’s wish list left over from Clinton’s dreadful Antiterrorism and Effective Death Penalty Act of 1996, which trashed habeas corpus protections.
The most spectacular abuses of civil liberties under Bush, such as the prison camp at Guantanamo, are acute symptoms of a chronic disease. The larger story of the past eight years has been the great continuity between this administration and those that have come before. The outrages perpetrated against habeas corpus under Republicans and Democrats alike, for example, have been innumerable, many of them little publicized. Take the case of people convicted of sexual felonies, who reach the end of their stipulated terms only to find that they face continued imprisonment without any specified terminus, under the rubric of “civil confinement,” a power as fierce as any lettre de cachet in France’s ancien régime.
Free speech is no longer a right. Stand alongside the route of a presidential cavalcade with a humble protest sign, and the Secret Service or local law enforcement will haul you off to some remote cage labeled “Designated Protest Area.” Seek to exercise your right to dispense money for a campaign advertisement or to support a candidate, and you will fall under the sanction of McCain-Feingold, otherwise known as the Bipartisan Campaign Reform Act of 2002.
In the case of public expressions of protest, we may expect particular diligence by the Secret Service and other agencies in the Obama years, though his reneging on a campaign promise to accept only public financing has stopped campaign-finance reform in its tracks. Liberals joyously eying Obama’s amazing $150 million haul in his final weeks have preserved a tactful silence on this topic, after years of squawking about the power of the corporate dollar to pollute democracy’s proceedings.
Worse than in the darkest days of the ’50s, when Americans could have their passports revoked by fiat of the State Department, citizens and legal residents no longer have the right to travel freely even inside the nation’s borders. Appearance on any of the innumerable watch lists maintained by government agencies means inability to get on a plane. And today you need your papers for more than just travel. The Indiana statute recently approved by the Supreme Court demands that persons lacking “proper” ID only cast provisional ballots, with a bureaucratic apparatus for subsequent verification. Thus, Americans no longer have an unimpaired right to vote, even if of appropriate age.
The late Murray Kempton used to tell me he remembered that Alf Landon, campaigning against FDR and specifically Social Security back in 1936, used to shout to the crowds words to the effect of “Mark my words, those Social Security numbers will follow you from cradle to grave.” Landon was right. Today you might as well have the SS number tattooed on your forehead, along with all other significant “private” data, preferably in some bright hue so the monitoring cameras along highways and intersections can get a clean hit. “Drill baby drill” has been the war cry of the government’s data-mining programs throughout the Bush years, and we can expect no improvement ahead.
Fourth Amendment protections have likewise gone steadily downhill. Warrantless wiretappers had a field day under Bush, and Congress reaffirmed their activities in the FISA bill, for which Obama voted in a turnaround from previous pledges. Incoming vice president Joe Biden can claim a significant role here since he has been an ardent prosecutor of the war on drugs, used since the Harrison Act of 1914—and even before then with the different penalties attaching to opium as used by middle class whites or Chinese—to enhance the right of police to enter, terrorize, and prosecute at will. Indeed, the war on drugs, revived by President Nixon and pursued vigorously by all subsequent administrations, has been as powerful a rationale for tearing up the Constitution as the subsequent war on terror. It’s like that with all wars. Not far from where I live in northern California, combating narcotics was the excuse for serious inroads in the early 1990s into the Posse Comitatus statutory inhibition on use of the U.S. military in domestic law enforcement, another constitutional abuse whose roots have continued to sink deeper during the Bush years.
In the past eight years, Bush has ravaged the Fourth Amendment with steadfast diligence, starting with his insistence that he could issue arrest warrants if there was reason to believe a noncitizen was implicated in terrorist activity. Seized under this pretext and held within America’s borders or in some secret prison overseas, the captive had no recourse to a court of law. Simultaneously, the “probable cause” standard, theoretically disciplining the state’s innate propensity to search and to seize, has been systematically abused, as have the FBI’s powers under the “material witness” statute to arrest and hold their suspects. Goodbye habeas corpus.
Not only individual liberties but federalism and the rights of states have been relentlessly eroded in the Bush years, often amidst liberal cheers at such excrescences as the No Child Left Behind law. Property rights, too, have suffered great setbacks. Government’s power to seize land under the canons of “eminent domain” received sinister buttress by the Supreme Court in the 2005 Kelo decision.
Have there been any bright patches in the gloom? I salute one: the vindication of the Second Amendment in the Supreme Court’s recent Heller decision, written by Justice Scalia. Liberals would do well to acknowledge the wisdom of that ruling, just as conservatives should recognize the continuity between the outrages they decried under Clinton and the strip-mining of American liberties that has taken place under Bush.
Alexander Cockburn is coeditor of the newsletter and website CounterPunch (counterpunch.org) and has written a biweekly column for The Nation for many years. Next spring CounterPunch Books will publish his A Short History of Fear: The Rise and Fall of Global Warming.
Schumer’s Fairness Doctrine fatuity
Published by Briggs at 6:37 am under Philosophy, Politics
First listen to the appalling Chuck Schumer responding to a question about the proposed Fairness Doctrine (link from Unfair Doctrine):
Let’s summarize. He said:
I think we should all try to be fair and balanced, don’t you? [Radio broadcasts]: It’s not like printing a broadside…Do you think we should allow people to put pornography on the air? Absolutely not. The very same people who don’t want the Fairness Doctrine want the FCC to limit pornography on the air. But you can’t say “Government, Hands off” in one area to a commercial enterprise, “But you’re allowed to intervene in another.” That’s not consistent.

Schumer is a treasure trove for people like me who are always on the lookout for examples of appallingly bad reasoning to use in teaching students logic. Almost any Schumer speech can be milked for at least one lesson—you could probably get half a semester out of this bare minute.
Now, nobody knows what any new Fairness Doctrine might be since it is now in its “trial balloon” phase. But we can look to an earlier, abandoned incarnation of it for some clues. We can also glean hints from Schumer’s words.
Schumer thinks we should try to be “Fair & Balanced.” A fine thing, but not something that can be mandated. This is not a question of opinion or morality. For example, suppose that on some matter the truth is A (where this is some argument or proposition about a decision we have to make). I set up a newspaper to tout A. Another group, unhappy with the reality of A, says “B is better because it shows we care.” But since A is true, it is absurd for me to publish anything else. It is even more absurd for the government to threaten me with criminal liability for my refusal to explain the merits of B.
Of course, we don’t often know the truth of some thing, but we can make a rational guess. It might be, conditional on some evidence, that A is nearly true, or more than likely true, and that every other alternative to A is less likely to be true. Again, it is absurd for me to publish anything else, and equally or more absurd for the government to intervene.
Can the government ban certain opinions from being published? The answer is yes. In certain circumstances, it is rational to proscribe behavior. Some examples: calls for armed insurrection, pleas for murder or other crimes, for sedition, and so on. It is not only right that the government ban these; it is its duty to do so. The exact limits of opinion that can and should be banned are, of course, unknown, and will be, in some cases, flexibly defined. But in no case does it make sense for the government to say, “Ok, make your plea for murdering the president, but you also have to allow Mr X 5 minutes to offer his counter opinion.” The ludicrousness of any such argument is apparent. In short, either an idea is banned or it is allowable (a trivial tautology, but one that bears mentioning).
It does not follow from the fact that the vast majority of Americans want to ban or limit broadcast pornography that the government can ban, limit, or regulate any other opinion. Whether or not it is right to ban or limit certain opinions, or what constitutes the definition of those opinions, it does not follow—it is idiotic to propose—that the government should allow airing of the controversial opinion but then require the broadcaster to provide time for counter opinions. If that were the case, then we could have a station air Deep Throat followed by a plea for proper dental hygiene.
Proper dental hygiene? Why not “The evils of pornography”? Why not, indeed. Now comes the easiest refutation of any implementation of a Fairness Doctrine. Suppose I say “A is true!” The government wants to say, “You may say A is true, but I mandate that you allow fair time for opponents of A. You shall also bear the expense of this.” Who are the legitimate opponents of A? Those who say B? C? D? E? F…?
This is the meat of it, friends. Pay attention. In order to enforce any “Fairness” Doctrine, the government will be forced to define the opposite of A. Because, for any matter that is uncertain, there is an enormous, effectively infinite number of alternatives to A. You cannot, in finite time, broadcast every alternative to A even if you wanted to. The only way to mandate broadcasting alternatives to A is by the government dictating—and dictating requires a dictator—what those alternatives are.
For example, in the earlier incarnation of this naked power grab, a prominent person who was “attacked” on the air was to be allowed time to offer his defense. What defines an “attack”? Does any negative opinion about the Great Leader in power constitute an “attack”? The Great Leader proposes a tax increase, and a broadcaster says, “This will negatively affect credit and so make it more difficult to get home loans.” Is this an “attack”? Who can say? The government wants to say. In fact, it must say.
There is no way around this fact: the government must get into the business of defining what an “attack” is, what its limits are, and so on. There is no alternative if you require a Fairness Doctrine. There must come into existence an office to administer Fairness (I propose “Ministry of Truth”).
Of course, many, like Schumer, would like nothing better than to be in the business of defining the limits of opinion on political matters. The reason for this is as obvious as it is odious.
Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.
It is impossible for any Fairness Doctrine to be consonant with those words. It is not a debatable point: it is logically impossible. Unless, as Schumer and other advocates of the “living constitution” want to do, you change the meaning of the plain-English words “Congress shall make no law abridging the freedom of the press.” They must interpret this to mean “Congress shall make no law abridging the freedom of the press unless that law allows us to respond to people who hurt our feelings or otherwise pick on us, or unless the speech printed or broadcast is hateful.” This is so absurd that I am shocked that anybody but an academic could ever think it.
Well, that’s enough. I’m already sick of this. There are no subtleties involved in this argument, not anywhere. To see these power-hungry politicians licking their chops over the possibilities opened up by their recent electoral victory is truly frightening.
Sigh. I didn’t even get to the obvious logical absurdity in Schumer’s phrase “But you can’t say…” I’ll leave that for homework.
Five Myths About the Great Depression
Herbert Hoover was no proponent of laissez-faire.
By ANDREW B. WILSON

The current financial crisis has revived powerful misconceptions about the Great Depression. Those who misinterpret the past are all too likely to repeat the very mistakes that made the Great Depression so deep and devastating.
Here are five interrelated and durable myths about the 1929-39 Depression:
- Herbert Hoover, elected president in 1928, was a doctrinaire, laissez-faire, look-the-other-way Republican who clung to the idea that markets were basically self-correcting. The truth is more illuminating. Far from a free-market idealist, Hoover was an ardent believer in government intervention to support incomes and employment. This is critical to understanding the origins of the Great Depression. Franklin Roosevelt didn't reverse course upon moving into the White House in 1933; he went further down the path that Hoover had blazed over the previous four years. That was the path to disaster.
Hoover, a one-time business whiz and a would-be all-purpose social problem-solver in the Lee Iacocca mold, was a bowling ball looking for pins to scatter. He was a government activist fixated on the idea of running the country as an energetic CEO might run a giant corporation. It was Hoover, not Roosevelt, who initiated the practice of piling up big deficits to support huge public-works projects. After declining or holding steady through most of the 1920s, federal spending soared between 1929 and 1932 -- increasing by more than 50%, the biggest increase in federal spending ever recorded during peacetime.
Public projects undertaken by Hoover included the San Francisco Bay Bridge, the Los Angeles Aqueduct, and Hoover Dam. The Republican president won plaudits from the American Federation of Labor for his industrial policy, which included jawboning business leaders to refrain from cutting wages as the economy fell. Referring to counteracting the business cycle and propping up wages, Hoover said: "No president before has ever believed that there was a government responsibility in such cases . . . we had to pioneer a new field." Though he did not coin the phrase, Hoover championed many of the basic ideas -- such as central planning and control of the economy -- that came to be known as the New Deal.
- The stock market crash in October 1929 precipitated the Great Depression. What the crash mainly precipitated was a raft of wrongheaded policies that did major damage to the economy -- beginning with the disastrous retreat into protectionism marked by the passage of the Smoot-Hawley tariff, which passed the House in May 1929 and the Senate in March 1930, and was signed into law by Hoover in June 1930. As prices fell, Smoot-Hawley doubled the effective tariff duties on a wide range of manufactures and agricultural products. It triggered the beggar-thy-neighbor policies of countervailing tariffs that caused the international economy to collapse. Some have argued that the increasing likelihood that the Smoot-Hawley tariff would pass was a major contributing factor to the stock-market collapse in the fall of 1929.
- Where the market had failed, the government stepped in to protect ordinary people. Hoover's disastrous agricultural policies involved the know-it-all Hoover acting as his own agriculture secretary and in fact writing the original Agricultural Marketing Act that evolved into Smoot-Hawley. While exports accounted for 7% of U.S. GDP in 1929, trade accounted for about one-third of U.S. farm income. The loss of export markets caused by Smoot-Hawley devastated the agricultural sector. Following in Hoover's footsteps, FDR concentrated on trying to raise farm income by such tactics as setting quotas on production and paying farmers to remove acreage from production -- even though this meant higher prices for hard-pressed consumers and had the effect of both lowering productivity and driving farmers off their land.
- Greed caused the stock market to overshoot and then crash. The real culprit here -- as in the housing bubble in our own time -- is the one identified by the economic historian Charles Kindleberger in the classic book "Manias, Panics, and Crashes": a speculative fever induced by excessively easy credit and broken by the inevitable return to more realistic valuations.
In the late 1920s, cheap and easy money fueled a tremendous increase in margin trading and a proliferation of "investment trusts" that offered little in the way of dividends or demonstrable earnings per share, but still promised phenomenal capital gains. "Speculation," as Kindleberger neatly defined it, "involves buying for resale rather than use in the case of commodities, and for resale rather than income in the case of financial assets."
The last thing Hoover wanted to do upon coming to office was to rein in the stock market boom by allowing interest rates to rise to a more normal level. The key to prosperity, in his view, lay not in sound money and rising productivity, but in letting the good times roll -- through government action aimed at maintaining high wages and high stock market valuations.
- Enlightened government pulled the nation out of the worst downturn in its history and came to the rescue of capitalism through rigorous regulation and government oversight. To the contrary, the Hoover and Roosevelt administrations -- in disregarding market signals at every turn -- were jointly responsible for turning a panic into the worst depression of modern times. As late as 1938, after almost a decade of governmental "pump priming," almost one out of five workers remained unemployed. What the government gave with one hand, through increased spending, it took away with the other, through increased taxation. But that was not an even trade-off. As the root cause of a great deal of mismanagement and inefficiency, government was responsible for a lost decade of economic growth.
Hoover was destined to fill the role of the left's designated scapegoat. Despite that, the one place where he and FDR truly "triumphed" was in enlisting the support of leading writers and intellectuals for government planning and intervention. This had a lasting effect on the way that generations of people think about the Great Depression. The antienterprise spirit among thought leaders of this time (and later) extended to top business publications. "Do you still believe in Lazy-Fairies?" Business Week asked derisively in 1931. "To plan or not to plan is no longer the question. The real question is who is to do it?"
In his economic policies and his incessant governmental activism, Hoover differed far more sharply with his Republican predecessor than he did with his Democratic successor. Calvin Coolidge, president from 1923 to 1929, made no secret of his disdain for Hoover, who served as his secretary of commerce and won praise from such highly regarded liberals as John Maynard Keynes and Jean Monnet. "That man has offered me unsolicited advice for six years, all of it bad," Coolidge said. He mockingly referred to Hoover as "Wonder Boy."
With the vitality of U.S. and world economies at stake, it is essential that the decisions of the coming months are shaped by the right lessons -- not the myths -- of the Great Depression.
Mr. Wilson, a former Business Week bureau chief, is a writer based in St. Louis.
Milwaukee Puts a Vote-Fraud Cop Out of Business
Local Democrats don't take the issue seriously.
By JOHN FUND

Last week Mike Sandvick, head of the Milwaukee Police Department's five-man Special Investigative Unit, was told by superiors not to send anyone to polling places on Election Day. He was also told his unit -- which wrote the book on how fraud could subvert the vote in his hometown -- would be disbanded.
"We know what to look for," he told me, "and that scares some people." In disgust, Mr. Sandvick plans to retire. (A police spokeswoman claims the unit isn't being disbanded and that any changes to the unit "aren't significant.")
In February, Mr. Sandvick's unit released a 67-page report on what it called an "illegal organized attempt to influence the outcome of (the 2004) election in the state of Wisconsin" -- a swing state whose last two presidential races were decided by less than 12,000 votes.
The report found that between 4,600 and 5,300 more votes were counted in Milwaukee than the number of voters recorded as having cast ballots. Absentee ballots were cast by people living elsewhere; ineligible felons not only voted but worked at the polls; transient college students cast improper votes; and homeless voters possibly voted more than once.
Much of the problem resulted from Wisconsin's same-day voter law, which allows anyone to show up at the polls, register and then cast a ballot. ID requirements are minimal. If someone lacks any ID, he can vote so long as someone who lives in the same city vouches for him. The report found that in 2004 a total of 1,305 "same day" voters gave information that was declared "un-enterable" or invalid by election officials.
According to the report, this loophole was abused by many out-of-state workers for the John Kerry campaign. They had "other staff members who were registered voters vouch for them by corroborating their residency."
The investigative unit believed at least 16 workers from the Kerry campaign, and two allied get-out-the-vote groups, "committed felony crimes." But local prosecutors didn't pursue them in part because of a "lack of confidence" in the abysmal record-keeping of the city's Election Commission.
Pat Curley, Milwaukee Mayor Tom Barrett's chief of staff, told me he was very upset by the surprise release of the report. "I don't believe all of the facts are necessarily accurate," he said. Which ones? He only cited the report's interpretation of state policy on homeless voters. He denies the mayor's office had any role in disbanding the unit.
Mr. Sandvick says the problems his unit found in 2004 are "only the tip of the iceberg" of what could happen today. His unit has found out-of-state groups registering their temporary workers, a college dorm with 60 voters who aren't students, and what his unit believes are seven illegal absentee ballots.
"The time to stop voter fraud is prior to when the questionable ballot is mixed in with all the valid votes," he says. Former police captain Glenn Frankovis agrees: "This issue could be solved if [the police chief] would assign police officers to the polling locations as was customary about 20 years ago." But election monitors are now viewed as "intimidating" in minority precincts and have been withdrawn.
Mr. Sandvick's report concluded "the one thing that could eliminate a large percentage of the fraud" it found would be elimination of same-day voter registration (which is also in use in seven other states). It also suggested that voters present a photo ID at the polls, a requirement the U.S. Supreme Court declared constitutional this spring.
But weeks after the vote fraud report was released, Wisconsin Sen. Russ Feingold introduced federal legislation to mandate same-day registration in every state. He claimed the system had worked well in Wisconsin and if "we can bring more people into the process, [it] only strengthens our democracy." Democrats tell me his bill is a top priority of the new Congress.
"They say voter fraud isn't a problem," notes Mr. Sandvick, "but after this election it may be all too clear it is." Now that Mr. Sandvick is resigning from the force after a long, honorable career, let's hope someone else is allowed to follow up on the spadework he's done.
'Space invader' DNA infiltrated mammalian genomes
22:00 20 October 2008, NewScientist.com news service, Jessica Griggs
Parts of mammalian DNA are so alien they have been dubbed "space invaders" by the researchers that found them. The discovery, if confirmed, will change our understanding of evolution.
We normally get our genes "vertically" – handed down from our parents and theirs before them. Bacteria get theirs in this way too, but also "horizontally" – passed from one, unrelated individual to another.
Now biologists at the University of Texas, Arlington, have found the unexpected: horizontal gene transfer has occurred in mammals and amphibians too.
The culprit is a kind of "parasitic" DNA found in all our cells, known as a transposon. Study leader Cédric Feschotte says that what he calls space invader transposons jumped sideways millions of years ago into several species by piggybacking onto a virus.
The transposon then assimilated itself into sex chromosomes, ensuring that it would get passed on to future generations. "It is very interesting conceptually – the idea that some parts of a mammal's DNA don't come from an ancestral species," he says.
Out of 26 animal genomes, the team found a near-identical length of DNA, known as the hAT transposon, in seven species, separated by some 340 million years of evolution.
These include species as widely diverged as a bush baby, a South American opossum, an African clawed frog and a tenrec – a mammal that looks like a hedgehog, but is actually more closely related to elephants.
The fact that invasive DNA was seen in a bush baby but not in any other primates, and in a tenrec but not in elephants, hints that something more exotic than standard inheritance is going on.
However, this patchy distribution by itself does not rule out the traditional route, as some of the species could have lost the transposon DNA at some point in their evolutionary history.
So the team looked at the position of the hAT transposon – if it had been inherited from a common ancestor it would have been found in the same position, with respect to other genes, in each species. But they could not find a single case of this.
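The logic of that positional test can be sketched in a few lines. All species-to-gene assignments below are invented for illustration, not data from the study; the point is only the comparison itself:

```python
# Toy illustration (invented data) of the positional test described above:
# a transposon inherited vertically from a common ancestor should sit
# between the same orthologous flanking genes in every species that
# carries it; independent horizontal invasions land at unrelated positions.

# Hypothetical insertion sites: (upstream gene, downstream gene) per species.
insertions = {
    "bush_baby":   ("GENE_A", "GENE_B"),
    "opossum":     ("GENE_C", "GENE_D"),
    "tenrec":      ("GENE_E", "GENE_F"),
    "clawed_frog": ("GENE_G", "GENE_H"),
}

def vertically_consistent(sites):
    """Vertical inheritance predicts one insertion site shared by all
    carrier species; more than one distinct site argues against it."""
    return len(set(sites.values())) == 1

# No two species share a site, consistent with horizontal transfer:
print(vertically_consistent(insertions))  # False
```

In the real analysis the comparison is over genomic coordinates and orthology maps rather than gene-name pairs, but the decision rule is the same: no shared ancestral position was found for any of the seven species.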
Since first entering the genome, the hAT has been able to reproduce dramatically – in the tenrec, 99,000 copies were found, making up a significant chunk of its DNA. Feschotte speculates that this must have had a dramatic effect on its evolutionary development.
"It's like a bombardment", he says. "It must have been evolutionarily significant because the transposon generated a huge amount of DNA after the initial transfer."
Feschotte says he expects many more reports of horizontal gene jumping. "We're talking about a paradigm shift because, until now, horizontal transfer has been seen as very rare in animal species. It's actually a lot more common than we think."
The team thinks that the hAT transposon invasion occurred about 30 million years ago and spread across at least two continents. "It's like a pandemic, and one that can infect species that weren't genetically or geographically close. It's puzzling, scary almost," Feschotte says.
It may not be a coincidence that the time of the invasion coincides with a period in evolutionary history that saw mass mammal extinctions. This is usually attributed to climate change, Feschotte says, but it is not crazy to suppose that this type of invasion could contribute to species extinction.
The hAT transposon does not occur in humans, but some 45% of our genome is of transposon origin.
Feschotte's work on the hAT transposon is the first time a "jumping gene" has been shown to have entered mammalian genomes, and the first time one has been shown to do so at around the same time, in a range of unrelated species, in different parts of the world.
Feschotte admits that we cannot rule out another transposon offensive occurring in mammals, and thinks that bats are the species most likely to be the source. For some reason, he says, they seem to be most susceptible to picking up transposons – possibly because of the viruses they carry.
"Bats are notorious reservoir species for a plethora of viruses, including some very nasty to humans like rabies, SARS and perhaps Ebola," he says.
"Since these bats are full of active DNA transposons and are frequently involved in viral spill-over, the door for the transfer of an active DNA transposon to humans seems wide open. Rather scary."
Greg Hurst, an evolutionary biologist at the University of Liverpool, UK, says that the arrival of a new transposable element can be evolutionarily significant, because new elements tend to be more active. "They will jump a fair bit more than older elements, which the resident genome will have evolved to suppress."
Most of the consequences of having a transposon jump around in your genome will be deleterious, Hurst says, but some will be advantageous. "The evolutionary life of the species could certainly hit the fast lane for a bit when it happens."
Journal reference: Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.0806548105)
'Junk' DNA Proves Functional; Helps Explain Human Differences From Other Species
ScienceDaily (Nov. 5, 2008) — In a paper published in Genome Research on Nov. 4, scientists at the Genome Institute of Singapore (GIS) report that what was previously believed to be "junk" DNA is one of the important ingredients distinguishing humans from other species.
More than 50 percent of human DNA has been referred to as "junk" because it consists of copies of nearly identical sequences. A major source of these repeats is internal viruses that have inserted themselves throughout the genome at various times during mammalian evolution.
Using the latest sequencing technologies, GIS researchers showed that many transcription factors, the master proteins that control the expression of other genes, bind specific repeat elements. The researchers showed that from 18 to 33% of the binding sites of five key transcription factors with important roles in cancer and stem cell biology are embedded in distinctive repeat families.
Over evolutionary time, these repeats were dispersed within different species, creating new regulatory sites throughout these genomes. Thus, the set of genes controlled by these transcription factors is likely to significantly differ from species to species and may be a major driver for evolution.
This research also shows that these repeats are anything but "junk DNA," since they provide a great source of evolutionary variability and might hold the key to some of the important physical differences that distinguish humans from all other species.
The GIS study also highlighted the functional importance of portions of the genome that are rich in repetitive sequences.
"Because a lot of the biomedical research use model organisms such as mice and primates, it is important to have a detailed understanding of the differences between these model organisms and humans in order to explain our findings," said Guillaume Bourque, Ph.D., GIS Senior Group Leader and lead author of the Genome Research paper.
"Our research findings imply that these surveys must also include repeats, as they are likely to be the source of important differences between model organisms and humans," added Dr. Bourque. "The better our understanding of the particularities of the human genome, the better our understanding will be of diseases and their treatments."
"The findings by Dr. Bourque and his colleagues at the GIS are very exciting and represent what may be one of the major discoveries in the biology of evolution and gene regulation of the decade," said Raymond White, Ph.D., Rudi Schmid Distinguished Professor at the Department of Neurology at the University of California, San Francisco, and chair of the GIS Scientific Advisory Board.
"We have suspected for some time that one of the major ways species differ from one another – for instance, why rats differ from monkeys – is in the regulation of the expression of their genes: where are the genes expressed in the body, when during development, and how much do they respond to environmental stimuli," he added.
"What the researchers have demonstrated is that DNA segments carrying binding sites for regulatory proteins can, at times, be explosively distributed to new sites around the genome, possibly altering the activities of genes near where they locate. The means of distribution seem to be a class of genetic components called 'transposable elements' that are able to jump from one site to another at certain times in the history of the organism. The families of these transposable elements vary from species to species, as do the distributed DNA segments which bind the regulatory proteins."
Dr. White also added, "This hypothesis for formation of new species through episodic distributions of families of gene regulatory DNA sequences is a powerful one that will now guide a wealth of experiments to determine the functional relationships of these regulatory DNA sequences to the genes that are near their landing sites. I anticipate that as our knowledge of these events grows, we will begin to understand much more how and why the rat differs so dramatically from the monkey, even though they share essentially the same complement of genes and proteins."
A new study for the most part confirms John Lott's "more guns, less crime" thesis, and calls into serious question the only intervening study that found otherwise. The piece is very statistically dense; abstract and conclusion follow:
Carlisle E. Moody and Thomas B. Marvell
“Shall issue” right-to-carry concealed weapons laws require authorities to issue concealed-weapons permits, allowing the permit holder to carry a concealed handgun, to anyone who applies, unless the applicant has a criminal record or a history of mental illness. The shall-issue laws are state laws, applicable to all counties within the state. In contrast, states with “may issue” laws allow considerable discretion to the authorities. In may-issue states, authorities typically require that the applicant demonstrate a particular need for a concealed weapons permit, and self-defense usually is not deemed sufficient. Consequently, shall-issue states are much more permissive of individual freedom to carry concealed handguns. In 1997 John Lott and David Mustard published “Crime, Deterrence and Right-to-Carry Concealed Handguns” in the Journal of Legal Studies. They found that shall-issue states had lower violent crime rates, presumably because the laws result in more people carrying concealed weapons. Criminals might be deterred by the greater likelihood of others being armed, and of arms being concealed. Lott and Mustard’s article created a furor and the debate continues. Much of this debate takes place in op-ed columns, letters to editors, internet chat rooms, and web logs. In this article we concentrate on the academic debate. We review the main threads of the discussion in the literature and extend the debate with our own statistical analyses. In particular, we extend the investigation of influential work in the Stanford Law Review by Ian Ayres and John J. Donohue III (2003a, 2003b), who, contrary to Lott and Mustard, claim to find that shall-issue laws actually lead to an overall increase in crime. The new statistical analysis contained in the present article finds that shall-issue laws are generally beneficial. Purists in statistical analysis object with some cause to some of the methods employed both by Ayres and Donohue, by us, and by the literature in general.
But the new investigation presented here upgrades Ayres and Donohue in a few significant ways, so, at least until the next study comes along, our paper should neutralize Ayres and Donohue’s “more guns, more crime” conclusion.
Summary and Conclusion

Many articles have been published finding that shall-issue laws reduce crime. Only one article, by Ayres and Donohue, who employ a model that combines a dummy variable with a post-law trend, claims to find that shall-issue laws increase crime. However, the only way that they can produce the result that shall-issue laws increase crime is to confine the span of analysis to five years. We show, using their own estimates, that if they had extended their analysis by one more year, they would have concluded that these laws reduce crime. Since most states with shall-issue laws have had these laws on the books for more than five years, and the law will presumably remain on the books for some time, the only relevant analysis extends beyond five years. We extend their analysis by adding three more years of data, control for the effects of crack cocaine, control for dynamic effects, and correct the standard errors for clustering. We find that there is an initial increase in crime due to passage of the shall-issue law that is dwarfed over time by the decrease in crime associated with the post-law trend. These results are very similar to those of Ayres and Donohue, properly interpreted. The modified Ayres and Donohue model finds that shall-issue laws significantly reduce murder and burglary across all the adopting states. These laws appear to significantly increase assault, and have no net effect on rape, robbery, larceny, or auto theft. However, in the long run only the trend coefficients matter. We estimate a net benefit of $450 million per year as a result of the passage of these laws. We also estimate that, up through 2000, there was a cumulative overall net benefit of these laws of $28 billion since their passage. We think that there is credible statistical evidence that these laws lower the costs of crime.
But at the very least, the present study should neutralize any “more guns, more crime” thinking based on Ayres and Donohue’s work in the Stanford Law Review. We acknowledge that, especially in light of the methodological issues of the literature in general, the magnitudes derived from our analysis of crime statistics and the supposed costs of crime might be dwarfed by other considerations in judging the policy issue. Some might contend that allowing individuals to carry a concealed weapon is a moral or cultural bad. Others might contend that greater liberty is a moral or cultural good. All we are confident in saying is that the evidence, such as it is, seems to support the hypothesis that the shall-issue law is generally beneficial with respect to its overall long run effect on crime.
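The "dummy variable plus post-law trend" specification the abstract describes can be illustrated with a toy regression on simulated data. Everything below is a hypothetical sketch (the state, years, and coefficients are invented, not taken from the study): crime is regressed on an intercept, a shall-issue dummy that switches on at adoption, and a trend counting years since adoption, so a positive one-time jump can be swamped over time by a negative trend.

```python
import numpy as np

# Hypothetical single-state annual panel; names and numbers are illustrative.
rng = np.random.default_rng(0)

years = np.arange(1977, 2001)
law_year = 1990                                      # invented adoption year
post = (years >= law_year).astype(float)             # dummy: 1 once law is in force
trend = np.where(post == 1, years - law_year, 0.0)   # years since adoption, else 0

# Simulate crime with a small initial jump (+3) dwarfed by a yearly decline (-1.2),
# mirroring the paper's "initial increase dwarfed by the post-law trend" finding.
crime = 100 + 3.0 * post - 1.2 * trend + rng.normal(0, 0.5, len(years))

# Ordinary least squares via numpy: columns are intercept, dummy, trend.
X = np.column_stack([np.ones_like(post), post, trend])
beta, *_ = np.linalg.lstsq(X, crime, rcond=None)
intercept, jump, slope = beta

# The long-run net effect after t years is roughly jump + slope * t,
# which is why only the trend coefficient matters as t grows.
net_after_10_years = jump + slope * 10
print(jump, slope, net_after_10_years)
```

This also makes the paper's point about the analysis window concrete: truncate the sample a few years after `law_year` and the positive `jump` dominates; extend it and the negative `slope` does.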
Where Was the Ad? And there must be some excellent reason we didn’t see this from the McCain campaign, right? Right?
By Bill Whittle
Of the many — actually, it approaches infinity — missed Republican opportunities of this campaign season, I feel there is one so obvious I had to try to remedy it — personally.
It’s just a few hours until the polls open, but I wanted it out there just so I could say it was. Here you go.
My friends, I’m John McCain. Back in 2002, I fought hard to limit the amount of money in politics. I thought it was corrosive and anti-democratic. Public financing of campaigns has long been a Democratic rallying cry, and I crossed the aisle to work with my colleague, Senator Russ Feingold, to pass legislation limiting the amount of money being pumped into campaigns. Nothing I have done has damaged me more with the base of my own party, but I thought it was the right thing to do, so I did it.
During the primaries, both Senator Obama and I agreed to make this campaign about issues and not about money, and I was proud and pleased when he joined me in a pledge to accept public financing for the general election.
However, back in June, Senator Obama renounced that pledge. Once it became clear that he could raise more money by breaking his promise – not just to me, and to America, but to the Democratic Party ideal they have fought for for so long – once he realized he could raise more money by breaking that promise, he broke it.
I did not.
So now, Senator Obama has raised over $600 million. Because I remained committed to a principle we both agreed upon, he is able to outspend me at least seven to one. Remember that, next time you see an ad run by Senator Obama. Or the next one. Or the one after that. Or the one after that. Or the one after that. Or the one after that. Or the one after that.
And if that doesn’t bother you – at least a little – just ask yourself one question: What if Senator Obama, running on a platform of Change and “a new kind of politics,” was the one to accept public financing, and the Republican opponent did not? What if the Democrat, true to his principles and a personal pledge, held true to his beliefs, while the Republican raised six hundred million dollars and turned off the standard credit card anti-fraud protections while doing so? What if the Republican outspent the Democrat more than seven to one, and, as a result, was up by a few points in key battleground states?
What would you think then?
Would you not be inclined to say he “bought the election?” And do you think, in the face of that advantage, that anyone will ever accept public financing again?
And what if, in the face of that disadvantage, all you had to trust and depend on was the fundamental integrity of the press to present whatever damaging information they and their army of reporters could uncover, on either candidate?
What if they too failed to live up to their obligation to you? Then where would this principled stand leave you?
Imagine a Republican sitting in a room where people were being disparaged due to their religion or the color of their skin, and then try to imagine a news organization sitting on a video of the incident. Never happen, it'd be all over the airwaves faster than you can say "news cycle."
Alas, you don't have to imagine someone trying to excuse similar behavior by pointing out tangential associations that have nothing to do with the point under discussion as you've done it already. But hey, if sitting on a board that gives money to an organization that has someone in it who supports terror is bad, then I guess serving on a board where the guy sitting next to you supported terror tactics is even worse, right?
This sort of inane equivocation doesn't bode well for informed discussion. . . .
The Los Angeles Times’s Strange Notion of Journalistic Ethics Give us the tape … or at least a transcript of Obama’s radical shindig.
By Andrew C. McCarthy
When it comes to insulting our collective intelligence, the Obamedia soundtrack of the ongoing campaign breaks new ground on a daily, indeed an hourly, basis. Still, the Los Angeles Times takes the cake.
Change you can believe in is a short hop from fairy tales you can be sold. In that spirit, the Times tells us, we’d really, really love to release the videotape we’re holding of that 2003 Khalidi shindig — the one where Barack Obama joined a motley collection of Israel-bashers, including the former terrorists Bill Ayers and Bernadine Dohrn, to sing the praises of Rashid Khalidi — former mouthpiece for PLO master-terrorist Yasser Arafat. But alas, our hands are tied by journalistic ethics.
Of course the ever ethical Times would never try to skew election coverage in favor of a candidate it has recently endorsed (after blowing kisses at him for two years). Nor would the newspaper give its readers anything but a complete, accurate, and truthful account of an event like the Khalidi Bash that it deemed worthy enough to cover. You can take that to the bank. But, gosh-darn, it turns out that a “source” the Times won’t name supposedly provided reporter Peter Wallsten with the videotape on the solemn promise that the paper would never let it see the light of day … except to report on it as the Times saw fit.
If you believe that one, I’ve got a tax cut for you.
Let’s suspend disbelief for a moment. Let’s pretend that there is really some sentient being out there who actually leaks a videotape to a reporter wanting and expecting the event depicted to be given news coverage but somehow not wanting or expecting the tape itself to be published. And let’s further pretend that this phantom source who doesn’t want the tape disclosed nevertheless gives the tape to the newspaper rather than keeping control over it himself.
Let’s say we buy that this highly unlikely scenario actually happened. That would still not prevent the Los Angeles Times from putting out a transcript of the Khalidi testimonials and other speechifying.
We know, for example, that Barack Obama spoke for several minutes. Yet the Times has provided us with only the most cursory summary — to be more precise, not a summary but an account. A summary is a synopsis that fairly reflects what was said. Reporter Wallsten, to the contrary, fleetingly tells us only that “Obama adopted a different tone [from rabid anti-Israel speakers] in his comments and called for finding common ground.”
How so? We’re not told. Here’s the entirety of the Times description of Obama’s remarks:
His many talks with the Khalidis, Obama said, had been “consistent reminders to me of my own blind spots and my own biases. . . . It’s for that reason that I'm hoping that, for many years to come, we continue that conversation — a conversation that is necessary not just around Mona and Rashid's dinner table,” but around “this entire world.”
How very enlightening. What were the topics of the dinner-table talk? What blind spots and biases was Obama referring to? Did anything in his speech provide clues? We have no idea: the Times doesn’t tell us.
Moreover, we also know that several speakers that night sang paeans to Khalidi — who regards the establishment of a Jewish state in “Palestine” as the Nakba (i.e., “The Catastrophe”) and justifies terrorist attacks against Israeli military and government targets. The Times concedes the party was a forum “where anger at Israeli and U.S. Middle East policy was freely expressed.” Yet, again, we are given only two blurbs:
[A] young Palestinian American recited a poem accusing the Israeli government of terrorism in its treatment of Palestinians and sharply criticizing U.S. support of Israel. If Palestinians cannot secure their own land, she said, “then you will never see a day of peace.” One speaker likened “Zionist settlers on the West Bank” to Osama bin Laden, saying both had been “blinded by ideology.”
You know there was a lot more where that came from, spouted by several other speakers whom the Times story fails to name. Why not put out a transcript of what was said and by whom? And if the Times has information about what was in the commemorative book that was prepared for the occasion of Khalidi’s triumphant departure to assume the Edward Said chair at Columbia University, why not put that out too?
Even if you accept for argument’s sake the bunk about honoring the “source’s” supposed wishes, the newspaper wouldn’t need to release the tape in order to give us a more comprehensive account of what happened that evening. So it’s not that the Times is simply withholding the tape. The Times is trying to suppress the story. Not the story as Wallsten spun it back in April. The full story.
The full story couldn’t be more relevant. Barack Obama says he is a staunch supporter of Israel. The importance of the Khalidi festivities isn’t simply that Obama lavished praise on a man who was an Arafat apologist — although that is troubling in itself. What also matters is that many speakers (no doubt including Obama’s good friend Khalidi himself) said extremely provocative things about Israel and American policy.
While that went on, Obama apparently sat there in tacit acceptance, if not approval. He didn’t get up to leave. He wasn’t roused to a defense of his country. He didn’t deliver a spirited condemnation of Islamic terror. He just sat there. And when it came his turn to speak, he spoke … glowingly … about Khalidi. He was clearly comfortable around the agitators and, equally crucial, they were clearly comfortable spewing their bile in front of him — confident that they were certainly not giving offense.
Why would the Times think it’s not newsworthy to tell us in detail what Obama sat through and chose not to refute? He says he supports Israel, but shouldn’t we get a peek at what he actually does when Israel is under attack? After all, he wants to be in charge, and soon the attacks may be more than just verbal.
All of that could be made known by the publication of a transcript, without breaching any purported promise to the purported source.
But, the Times sputters, we’ve already done that news story back in April. The material facts have already been publicized thanks to our crack reporting.
Bill Ayers and Bernadine Dohrn were at the party. Given the controversy over their extensive relationship with Obama — sitting on boards together, doling out millions of dollars together, lauding each other’s writings, joint appearances at conferences, Obama’s introduction to Chicago politics in the Ayers/Dohrn home, etc. — didn’t the Times think their attendance together at a party for Khalidi was worth reporting?
Given that Obama now preposterously claims he and Ayers barely know each other, didn’t the Times think it was worth mentioning that guest-of-honor Khalidi, a very close friend of Obama, just happens also to be a very close friend of Ayers?
The party was sponsored by the Arab American Action Network (AAAN) — an organization founded by Khalidi and his wife (who also worked for the PLO’s press agency) and lavishly funded by Obama and Ayers when they sat together on the board of the Woods Fund. Did the Times think that was newsworthy?
Again, apparently not. Wallsten’s article does not mention the AAAN’s role in the party. He describes the AAAN as “a social service group” which is headed by Khalidi’s wife and was given a $40,000 grant by the Woods Fund when Obama sat on the board. In fact, AAAN is an activist Palestinian organization that regards Israel as illegitimate and supports driver’s licenses and welfare benefits for illegal aliens. Further, it was founded by both Khalidi and his wife, it actually received almost twice as much Woods Fund support as the Times said (i.e., $75,000, not $40,000), and, at the time of those grants, one of Obama’s partners on the board was Bill Ayers.
Besides Obama and Khalidi (about whose speeches the Times tells us precious little), who else spoke at the party? What was said? What was written in the commemorative book prepared for the occasion? The Times doesn’t tell us.
In fact, though the Times’s story runs 2000 words, very little of it is about the party the Times now contends it covered adequately. Most of it is dedicated to probing what Wallsten frames as the alluring mystery of Barack Obama’s position on the Israeli/Palestinian dispute. Is he really a strong Israel supporter? Do anti-Israeli Palestinians really have good reason to regard him as a friend? Would he shift away from the strong U.S. alliance with Israel to a more “even-handed” approach—as one Chicago Palestinian-rights activist claims to have heard Obama say he favored (Obama denies it)?
We don’t know. The Times raises these and other questions, acknowledges that they are vexing, but then withholds from us critical information by which we might draw our own informed conclusions.
The mainstream press, of course, is urging Congress to enact a “shield law,” protecting reporters from government subpoenas. To a former prosecutor, that’s worth noting. You see, in matters of great public importance, prosecutors have ethical obligations, too. One of them says that if you provide an incomplete or misleading version of an event to the public’s courts, and you have information in your file that would clarify the situation, you are duty-bound to disclose that information. That way, the factfinder is equipped to make an intelligent, informed decision about what the truth is.
By contrast, the mainstream media want the right to mislead you, to provide you with a woefully incomplete record, but to deprive you of clarifying information even when it is readily at their disposal. You just have to take their word for what happened, and never you mind the details.
Are you comfortable taking the Obamedia’s word for it? Or do you think you ought to have a look at what Los Angeles Times has unilaterally decided not to show you?
The time for a newspaper to start worrying about journalistic ethics is when it publishes the story, not six months later when, in the stretch run of a crucial election, it gets called on an obviously incomplete report. Ethics, furthermore, are about fair and honest treatment. If the videotape at issue involved John McCain rubbing elbows with radicals or the CIA trying to protect national defense secrets, the Times would publish it and revel in the inevitable Pulitzer for its “courage” in doing so.
Let’s see the tape … or at least a transcript.
— National Review’s Andrew C. McCarthy chairs the FDD’s Center for Law & Counterterrorism and is the author of Willful Blindness: A Memoir of the Jihad (Encounter Books 2008).
Media's Presidential Bias and Decline Columnist Michael Malone Looks at Slanted Election Coverage and the Reasons Why
Column By MICHAEL S. MALONE Oct. 24, 2008 —
The traditional media are playing a very, very dangerous game -- with their readers, with the Constitution and with their own fates.
The sheer bias in the print and television coverage of this election campaign is not just bewildering, but appalling. And over the last few months I've found myself slowly moving from shaking my head at the obvious one-sided reporting, to actually shouting at the screen of my television and my laptop computer.
But worst of all, for the last couple weeks, I've begun -- for the first time in my adult life -- to be embarrassed to admit what I do for a living. A few days ago, when asked by a new acquaintance what I did for a living, I replied that I was "a writer," because I couldn't bring myself to admit to a stranger that I'm a journalist.
You need to understand how painful this is for me. I am one of those people who truly bleeds ink when I'm cut. I am a fourth-generation newspaperman. As family history tells it, my great-grandfather was a newspaper editor in Abilene, Kan., during the last of the cowboy days, then moved to Oregon to help start the Oregon Journal (now the Oregonian).
My hard-living -- and when I knew her, scary -- grandmother was one of the first women reporters for the Los Angeles Times. And my father, though profoundly dyslexic, followed a long career in intelligence to finish his life (thanks to word processors and spellcheckers) as a very successful freelance writer. I've spent 30 years in every part of journalism, from beat reporter to magazine editor. And my oldest son, following in the family business, so to speak, earned his first national byline before he earned his drivers license.
So, when I say I'm deeply ashamed right now to be called a "journalist," you can imagine just how deep that cuts into my soul.
Now, of course, there's always been bias in the media. Human beings are biased, so the work they do, including reporting, is inevitably colored. Hell, I can show you 10 different ways to color variations of the word "said" -- muttered, shouted, announced, reluctantly replied, responded, etc. -- to influence the way a reader will apprehend exactly the same quote. We all learn that in Reporting 101, or at least in the first few weeks working in a newsroom.
But what we are also supposed to learn during that same apprenticeship is to recognize the dangerous power of that technique, and many others, and develop built-in alarms against them.
But even more important, we are also supposed to be taught that even though there is no such thing as pure, Platonic objectivity in reporting, we are to spend our careers struggling to approach that ideal as closely as possible.
That means constantly challenging our own prejudices, systematically presenting opposing views and never, ever burying stories that contradict our own world views or challenge people or institutions we admire. If we can't achieve Olympian detachment, then at least we can recognize human frailty -- especially in ourselves.
For many years, spotting bias in reporting was a little parlor game of mine, watching TV news or reading a newspaper article and spotting how the reporter had inserted, often unconsciously, his or her own preconceptions. But I always wrote it off as bad judgment and lack of professionalism, rather than bad faith and conscious advocacy.
Sure, being a child of the '60s I saw a lot of subjective "New" Journalism, and did a fair amount of it myself, but that kind of writing, like columns and editorials, was supposed to be segregated from "real" reporting, and, at least in mainstream media, usually was. The same was true for the emerging blogosphere, which by its very nature was opinionated and biased.
But my complacent faith in my peers first began to be shaken when some of the most admired journalists in the country were exposed as plagiarists, or worse, accused of making up stories from whole cloth.
I'd spent my entire professional career scrupulously pounding out endless dreary footnotes and double-checking sources to make sure that I never got accused of lying or stealing someone else's work -- not out of any native honesty, but out of fear: I'd always been told that to fake or steal a story was a firing offense; indeed, it meant being blackballed out of the profession.
And yet, few of those worthies ever seemed to get fired for their crimes -- and if they did they were soon rehired into even more prestigious jobs. It seemed as if there were two sets of rules: one for us workaday journalists toiling out in the sticks, and another for folks who'd managed, through talent or deceit, to make it to the national level.
Meanwhile, I watched with disbelief as the nation's leading newspapers, many of whom I'd written for in the past, slowly let opinion pieces creep into the news section, and from there onto the front page. Personal opinions and comments that, had they appeared in my stories in 1979, would have gotten my butt kicked by the nearest copy editor, were now standard operating procedure at the New York Times, the Washington Post, and soon after in almost every small town paper in the U.S.
But what really shattered my faith -- and I know the day and place where it happened -- was the war in Lebanon three summers ago. The hotel I was staying at in Windhoek, Namibia, only carried CNN, a network I'd already learned to approach with skepticism. But this was CNN International, which is even worse.
I sat there, first with my jaw hanging down, then actually shouting at the TV, as one field reporter after another reported the carnage of the Israeli attacks on Beirut, with almost no corresponding coverage of the Hezbollah missiles raining down on northern Israel. The reporting was so utterly and shamelessly biased that I sat there for hours watching, assuming that eventually CNNi would get around to telling the rest of the story, but it never happened.
The Presidential Campaign
But nothing, nothing I've seen has matched the media bias on display in the current presidential campaign.
Republicans are justifiably foaming at the mouth over the sheer one-sidedness of the press coverage of the two candidates and their running mates. But in the last few days, even Democrats, who have been gloating over the pass -- no, make that shameless support -- they've gotten from the press, are starting to get uncomfortable as they realize that no one wins in the long run when we don't have a free and fair press.
I was one of the first people in the traditional media to call for the firing of Dan Rather -- not because of his phony story, but because he refused to admit his mistake -- but, bless him, even Gunga Dan thinks the media is one-sided in this election.
Now, don't get me wrong. I'm not one of those people who think the media has been too hard on, say, Republican vice presidential nominee Gov. Sarah Palin, by rushing reportorial SWAT teams to her home state of Alaska to rifle through her garbage. This is the big leagues, and if she wants to suit up and take the field, then Gov. Palin better be ready to play.
The few instances where I think the press has gone too far -- such as the Times reporter talking to prospective first lady Cindy McCain's daughter's MySpace friends -- can easily be solved with a few newsroom smackdowns and temporary repostings to the Omaha bureau.
No, what I object to (and I think most other Americans do as well) is the lack of equivalent hardball coverage of the other side -- or worse, actively serving as attack dogs for the presidential ticket of Sens. Barack Obama, D-Ill., and Joe Biden, D-Del.
If the current polls are correct, we are about to elect as president of the United States a man who is essentially a cipher, who has left almost no paper trail, seems to have few friends (that at least will talk) and has entire years missing out of his biography.
That isn't Sen. Obama's fault: His job is to put his best face forward. No, it is the traditional media's fault, for it alone (unlike the alternative media) has had the resources to cover this story properly, and has systematically refused to do so.
Why, for example (to quote the lawyer for Republican presidential nominee Sen. John McCain, R-Ariz.), haven't we seen an interview with Sen. Obama's grad school drug dealer -- when we know all about Mrs. McCain's addiction? Are Bill Ayers and Tony Rezko that hard to interview? All those phony voter registrations that hard to scrutinize? And why are Sen. Biden's endless gaffes almost always covered up, or rationalized, by the traditional media?
Joe the Plumber
The absolute nadir (though I hate to commit to that, as we still have two weeks before the election) came with Joe the Plumber.
Middle America, even when they didn't agree with Joe, looked on in horror as the press took apart the private life of an average person who had the temerity to ask a tough question of a presidential candidate. So much for standing up for the little man. So much for speaking truth to power. So much for comforting the afflicted and afflicting the comfortable, and all of those other catchphrases we journalists used to believe we lived by.
I learned a long time ago that when people or institutions begin to behave in a manner that seems to be entirely against their own interests, it's because we don't understand what their motives really are. It would seem that by so exposing their biases and betting everything on one candidate over another, the traditional media are trying to commit suicide -- especially when, given our currently volatile world and economy, the chances of a successful Obama presidency, indeed any presidency, are probably less than 50/50.
Furthermore, I also happen to believe that most reporters, whatever their political bias, are human torpedoes, and, had they been unleashed, would have raced in and roughed up the Obama campaign as much as they did McCain's. That's what reporters do. I was proud to have been one, and I'm still drawn to a good story, any good story, like a shark to blood in the water.
So why weren't those legions of hungry reporters set loose on the Obama campaign? Who are the real villains in this story of mainstream media betrayal?
The editors. The men and women you don't see; the people who not only decide what goes in the paper, but what doesn't; the managers who give the reporters their assignments and lay out the editorial pages. They are the real culprits.
Why? I think I know, because had my life taken a different path, I could have been one: Picture yourself in your 50s in a job where you've spent 30 years working your way to the top, to the cockpit of power, only to discover that you're presiding over a dying industry. The Internet and alternative media are stealing your readers, your advertisers and your top young talent. Many of your peers shrewdly took golden parachutes and disappeared. Your job doesn't have anywhere near the power and influence it did when you started your climb. The Newspaper Guild is too weak to protect you any more, and there is a very good chance you'll lose your job before you cross that finish line, 10 years hence, of retirement and a pension.
In other words, you are facing career catastrophe -- and desperate times call for desperate measures. Even if you have to risk everything on a single Hail Mary play. Even if you have to compromise the principles that got you here. After all, newspapers and network news are doomed anyway -- all that counts is keeping them on life support until you can retire.
And then the opportunity presents itself -- an attractive young candidate whose politics likely match yours, but, more important, who offers the prospect of a transformed Washington with the power to fix everything that has gone wrong in your career.
With luck, this monolithic, single-party government will crush the alternative media via a revived fairness doctrine, re-invigorate unions by getting rid of secret votes, and just maybe be beholden to people like you in the traditional media for getting it there.
And besides, you tell yourself, it's all for the good of the country . . .
This is the opinion of the columnist and in no way reflects the opinion of ABC News.
Michael S. Malone is one of the nation's best-known technology writers. He has covered Silicon Valley and high-tech for more than 25 years, beginning with the San Jose Mercury News as the nation's first daily high-tech reporter. His articles and editorials have appeared in such publications as The Wall Street Journal, the Economist and Fortune, and for two years he was a columnist for The New York Times. He was editor of Forbes ASAP, the world's largest-circulation business-tech magazine, at the height of the dot-com boom. Malone is the author or co-author of a dozen books, notably the best-selling "Virtual Corporation." Malone has also hosted three public television interview series, and most recently co-produced the celebrated PBS miniseries on social entrepreneurs, "The New Heroes." He has been the ABCNews.com "Silicon Insider" columnist since 2000.
Perhaps Doug can chime in here as I don't have the economic background to explain this in full, but the problem with most redistributionist schemes is that most "wealth" is based on symbols in some account somewhere. Those symbols have value based on their liquidity and the ability to trade them for goods and services. If the government comes along and starts taking a greater percentage of each symbol as it is passed around--symbols they ultimately create and arbitrate--then not only is the number of symbols in a given account reduced, but the belief that a given symbol will purchase X amount of goods or services also takes a hit. After all, if the government can take back 10 percent of every one of their symbols when transferred, there's nothing to prevent them from snagging 20, 40, or 60 percent, particularly when "soak the rich" is the shrill cry.
The sad thing here is that the rich are much more conversant in the language of symbols, and have many more options when a given government starts redistributing symbols. The rich find precious metals, offshore accounts, different currencies etc. in which to warehouse their symbols, or, more sadly still, decide wealth creation is not worth the tax hit, pick their chips up off the table and move on. It's the middle and lower classes who don't have the option of moving what symbols they do have elsewhere, and hence end up sucking it up when the government dilutes what can be obtained with their symbols by taking a greater percentage off the top.
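The compounding effect the comment describes can be sketched numerically. The function and the rates below are hypothetical, not from the article; they simply illustrate how a flat levy on each transfer erodes the same unit of wealth as it changes hands repeatedly:

```python
# Hypothetical sketch: value remaining after repeated per-transfer taxation.
# Each hand-off keeps (1 - rate) of the value, so n transfers leave (1 - rate)**n.
def value_after_transfers(start, rate, transfers):
    """Value left after `transfers` hand-offs, each taxed at `rate`."""
    return start * (1 - rate) ** transfers

# $100 passed through 5 transactions at the rates the comment mentions:
for rate in (0.10, 0.20, 0.40):
    print(f"{rate:.0%} per transfer: ${value_after_transfers(100, rate, 5):.2f}")
# 10% per transfer leaves $59.05; 20% leaves $32.77; 40% leaves just $7.78.
```

The point of the sketch is that even a modest-sounding per-transfer rate compounds geometrically, which is why the comment treats the difference between 10 and 40 percent as far more than a fourfold change.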
DANIEL J. FLYNN
Obama: The Oak Grown from Acorn
The radical group is front and center when it comes to voter fraud.
16 October 2008
Stealing Elections, Revised and Updated: How Voter Fraud Threatens Our Democracy, by John Fund (Encounter, 175 pp., $19.95)
Last week, well before news broke today of an FBI voter-fraud investigation of the Association of Community Organizers for Reform Now (Acorn), Nevada authorities raided the group’s Las Vegas headquarters. The offices of Nevada’s secretary of state and attorney general, both Democrats, seized computers, voter-registration cards, and employee information after Acorn submitted numerous fraudulent names and addresses as part of its voter-registration drive. “Some of these [forms] were facially fraudulent; we basically had the starting lineup for the Dallas Cowboys,” Ross Miller, Nevada’s secretary of state, explained. “Tony Romo is not registered to vote in Nevada.” Acorn’s Project Vote alleges that the raid is part of a nationally orchestrated effort to suppress voter turnout. “Project Vote has been attacked all over the country because we registered at least 1.2 million voters,” theorizes Nevada Acorn’s Bonnie Smith-Greathouse. “That could sway an election.”
And that’s just the point, argues John Fund in the updated and timely reissue of his Stealing Elections: How Voter Fraud Threatens Our Democracy. Fund contends that recent changes in election laws have made it easier to “sway an election,” as Smith-Greathouse puts it—through cheating. “The United States has a haphazard, fraud-prone election system befitting a developing nation rather than the globe’s leading democracy,” Fund asserts. At times, Fund’s subject seems more fitting for a magazine exposé than for a book—until one confronts the sheer volume of examples he has compiled. Like a portrait of corruption from a century prior, Lincoln Steffens’s Shame of the Cities, Fund’s Stealing Elections adopts a muckraking style and spotlights a national problem by illuminating it on a city-by-city basis.
In the name of making every vote count, efforts to expand the electorate have resulted in tallying votes that shouldn’t be considered and negating valid votes. Over a century’s worth of reforms designed to protect the concept of “one man, one vote” have been undermined in just a few decades. Fund points out that most states now allow voters to obtain absentee ballots without establishing a need (such as status as a student, soldier, or diplomat, or showing that one would be out of state on Election Day). One state, Oregon, has eliminated polling places entirely. The raison d’être of the secret ballot—to protect the public from having votes bought or coerced—is thus discarded.
Same-day registration, which backers argue further democratizes elections, is, according to Stealing Elections, “not a reform at all but an added opportunity for mischief”—such as vote buying. The comical scheme of an Al Gore–supporting New York socialite offering free cigarettes to homeless Milwaukeeans in exchange for votes could only occur in a state with same-day registration. Voters registering multiple times under the Motor Voter law, some liberals’ hostility toward poll workers checking government-issued identifications, and lawyers invading locales with election disputes—all increase the chances that legitimate votes will wind up cast aside or canceled out by illegitimate ones.
Stealing Elections overflows with examples of electoral shenanigans. The controversial 2004 Democratic primary, for instance, in which Texas Secretary of State Henry Cuellar unseated Congressman Ciro Rodriguez, was rife with peculiarities that affected the outcome. While Rodriguez boasted a slim 126-vote lead on election night, the recount in Zapata County turned up a missing ballot box with 304 votes, four-fifths of them for Cuellar. "Webb County reported that their recount came up with 115 more votes than they had first reported," Fund writes. "Cuellar won every one of the newly discovered votes." In San Antonio, an area the challenger carried decisively, election officials discovered voter-registration applications for 42 dead people.
On election night that same year, Washington State voters elected Republican Dino Rossi over Democrat Christine Gregoire. On Christmas Eve, state lawyers overturned the election after a third recount. “Nearly 2,000 more votes were counted in King County than the number of individual voters who appeared on the list of those who had cast a ballot,” Fund reports. In one Seattle precinct—where most of the voters had curiously registered just that past year—70 percent of voters listed a government administration building as their residential address. Election officials found hundreds of “lost” ballots, accepted the votes of hundreds of ineligible felons, and, in a few instances, counted the votes of those residing in graveyards. One ballot punched for Gregoire but listing Rossi in the “write-in” line was strangely added in the recount to the totals for Gregoire. Given the strange methodology employed by ballot counters, it’s not surprising that Gregoire is now Washington’s governor.
In St. Louis, dogs join the dead on the election rolls. In 2000, voters nationwide let out a collective gasp in the waning hours of Election Day. Lawyers for Jesse Jackson and Al Gore convinced judges in St. Louis to keep polls open in selected African-American neighborhoods, altering election law by extending voting hours for those most likely to support Gore. Along with the discovery of a voting machine in an abandoned lot the day after the election, and the revelation that 56,000 St. Louis voters had registered multiple times, Missouri voters also learned that “Robert Odom”—on whose behalf Gore-Lieberman lawyers had successfully sued to keep the polls open—had voted in the early afternoon, before the court order extending poll-closure times was issued. The lawsuit was clearly premeditated, as the evidence of computerized phone banks, all-too-ready with a get-out-the-vote message, made clear. The exclamation point to the Show Me State’s 2000 horror show was provided by Ritzy, the 13-year-old spaniel who had been on the voter rolls for eight years.
A common thread in many of the cases that Fund spotlights is the shadowy presence of Acorn. Two and a half years after the debacle in Seattle, Washington’s attorney general indicted seven Acorn workers for their role in what he called “the worst case of voter registration fraud” in the state’s history. In St. Louis, eight Acorn workers pled guilty to election fraud this past April. On the other side of Missouri, in 2006, four Kansas City Acorn workers were indicted after officials deemed nearly 15,000 of their 35,000 registrations phony.
In the mid-nineties, Barack Obama ran Acorn’s Project Vote campaign in Illinois. He sued the state of Illinois on the group’s behalf in 1995 to implement the Motor Voter law. “After he joined the board of the Woods Fund,” Stealing Elections notes, “Obama saw to it that substantial grants were given to Acorn.” Senator Obama has championed Acorn’s legislative priorities in Congress. His presidential campaign even donated more than $800,000 to Acorn. Obama is the oak grown from Acorn, a group so proud of its association that it boasts “Obama Organizing Fellows” and runs a “Camp Obama” training event. While Acorn boasts of its Obama association, the candidate, of course, is more reticent. That’s because he well knows that many non-dead, non-animal voters would not find a close association with such a group a desirable quality in a potential president.
“Once a community organizer, then a foundation grant-maker, and now a lobbyist for direct government funding, Barack Obama has been with Acorn throughout his career,” Fund writes. “In return, Acorn is pledging to spend $35 million this year registering voters—both real and fictive. Should Obama become president, look for Acorn to have a vastly more ambitious legislative agenda, and for Obama to be responsive.” Acorn, in other words, has a lot riding on Tony Romo voting early, often, and everywhere.
Daniel J. Flynn is the author of A Conservative History of the American Left. (Crown Forum, 2008).
I'm no fan of McCain, and if it weren't for the likelihood of a couple Supreme Court nominations, I'd be hoping for a BHO win with Democratic Control of both House and Senate so that the GOP would be inspired to return to its more Libertarian roots and be poised to throw the bums out after what I anticipate would be 4 not particularly fruitful years. In short we wouldn't have had a Reagan if Jimmy Carter hadn't come first. . . .
October 22, 2008
The Second Coming of Jimmy Carter
By Rick Richman
Barack Obama is taking America down a path modeled by Jimmy Carter, and threatens to be as bad a president as his trailblazer. An unlikely guide will unwittingly help make the case.
David Brooks asserted in the New York Times last week that, after watching Barack Obama for two years, it is "easy to sketch out a scenario in which he could be a great president."
[Obama] has shown the same untroubled self-confidence day after day. . . .
Brooks connected these personality traits with the "unshakable serenity" of FDR and Reagan, which in turn led to the Brooksian "scenario" of potential Obamian greatness.
I have no idea what Brooks means by an "organized unconscious"; nor exactly what a "deep, bottom-up process" is; nor how Obama's "untroubled self-confidence" differs from George W. Bush's "untroubled self-confidence." Still less do I understand how these esoteric personality traits relate to seeing "reality unfiltered," as opposed to representing a filter of their own.
What interests me, however, is Brooks' belief that, based on personality traits he has observed for two years, he can predict a presidency reminiscent of FDR and Reagan.
Such a prediction -- made before the man takes office, before he has even made a single cabinet choice, much less made a presidential policy decision; before he has faced a single crisis, much less handled one successfully -- is transparently absurd. But more than that, it brings forth a sense of déjà vu.
We have been down this road before, with an inexperienced driver, and the car crashed.
On November 3, 1976, the day after Jimmy Carter's election, the New York Times ran a profile explaining his remarkable political victory -- how a one-term governor from Georgia, with no significant record, began planning his presidential campaign in the second year of his one-and-only four-year term, and then went on to secure the nomination from more experienced rivals and defeat a sitting president:
He believed passionately that if he could talk to enough voters about a "Government as good as the American people," he could win. . .
Words, skillfully used, could play dual roles for him. Liberals came to conceive of him as one of their own. Conservatives responded to him sympathetically as well. Blacks in Harlem voiced their support. Whites in Mississippi got behind him. . . .
[T]he theme was always visible: a government as good as the people. It was voiced a hundred different ways, but the impact on his listeners was constant.
Americans, he said, were entitled to decent, compassionate, honest, competent government because Americans are decent, compassionate, honest and competent.
In other words: Jimmy Carter won by constantly telling Americans that he was the one they were waiting for.
He made them think that voting for him reflected well on them. He played on the electorate's hope for change, and he offered a blank slate on which that hope could be projected. His speeches were secular sermons that would later translate into presidential addresses about the need to transcend our inordinate fear of communism and to overcome the malaise that was hindering his policies.
Carter had built his campaign on something that was, at the time, unique in modern American politics: the thoughtful campaign autobiography. Written while he was governor, it was re-published in paperback in June 1976 and given a New York Times review, written by a member of the editorial board. The review extolled both the book and its author:
Jimmy Carter has contrived a new literary form, the campaign biography written as autobiography by the candidate himself. It is a skillful, simply-written blend of personal history, social description and political philosophy that makes fascinating reading. . . .
Critics, friendly as well as unfriendly, worry whether Jimmy Carter believes in anything larger than his own success. This book does not provide conclusive answers. . . . Basically, however, Carter reminds one of two earlier Presidents, Theodore Roosevelt and John F. Kennedy. Although both were of a progressive bent, they were really neither liberal nor conservative by conviction. Rather, they believed in governing.
Carter was certified as the One in the closing benediction at the 1976 Democratic convention, given by no less a figure than the father of Martin Luther King, Jr. Televised on all three networks (the entire visual media at the time), the benediction heralded Jimmy Carter as someone sent to redeem the country: "Surely the Lord sent Jimmy Carter to come on out and bring America back where she belongs."
Thirty-two years later, no one associates Jimmy Carter with Roosevelt or Kennedy, or with "governing." Few people believe the Lord sent him, or that he brought America back where she belonged.
What were we thinking when we elected him? The answer is: some of the same things we are thinking now.
He was a blank slate to be filled with visions of Roosevelt and Kennedy. People thought his unique background and perspective would unite North and South, black and white. He had accomplished little in his political career, but he had written a thoughtful autobiography, with an audaciously hopeful title: "Why Not the Best?" He gave good speeches.
There was little substantive content to his campaign, which instead endlessly repeated his government-as-good-as-its-people mantra. His one specific proposal was "zero-based budgeting," under which each year the federal budget would start at zero and be analyzed by him line by line. He had no national or foreign policy experience.
But as a liberal governor from a Southern state, Carter was thought to have a remarkable "temperament." The New York Times thought he was a "keenly intelligent man" because the cover page of his autobiography featured quotations from Reinhold Niebuhr ("The sad duty of politics is to establish justice in a sinful world"), Bob Dylan (about "a funny ol' world that's a-comin' along"), and Dylan Thomas ("A hand rules pity as a hand rules heaven").
Now flash forward thirty years. In April 2007, shortly after Obama announced his candidacy, David Brooks had a one-on-one interview with him. They were speaking about effective aid to Africa. As Brooks related the conversation the next day in "Obama, Gospel and Verse":
Out of the blue I asked, "Have you ever read Reinhold Niebuhr?"
Obama's tone changed. "I love him. He's one of my favorite philosophers."
So I asked, What do you take away from him?
"I take away," Obama answered in a rush of words, "the compelling idea that there's serious evil in the world, and hardship and pain. And we should be humble and modest in our belief we can eliminate those things. But we shouldn't use that as an excuse for cynicism and inaction. I take away ... the sense we have to make these efforts knowing they are hard, and not swinging from naïve idealism to bitter realism."
My first impression was that for a guy who's spent the last few months fund-raising, and who was walking off the Senate floor as he spoke, that's a pretty good off-the-cuff summary of Niebuhr's "The Irony of American History." My second impression is that his campaign is an attempt to thread the Niebuhrian needle, and it's really interesting to watch.
A less credulous commentator might have noted that Obama had used 70 words and four sentences to express a cliché: we can't do everything, but we must do everything we can. He might have noted that "threading the Niebuhrian needle" is simply the Goldilocks principle applied to idealism and realism (not too much; not too little - just right). He might have observed that Obama spoke well but did not really say anything. But Obama already had him at "Niebuhr."
Nine months later, after Obama won the Iowa caucuses, a "vibrating" David Brooks (in Leon Wieseltier's observation) wrote that it was "a huge moment."
Whatever their political affiliations, Americans are going to feel good about the Obama victory, which is a story of youth, possibility and unity through diversity -- the primordial themes of the American experience. . . .
At first blush, his speeches are abstract, secular sermons of personal uplift -- filled with disquisitions on the nature of hope and the contours of change.
He talks about erasing old categories like red and blue (and implicitly, black and white) and replacing them with new categories, of which the most important are new and old. . . .
It was like the second coming of the 1976 Jimmy Carter -- the one who would unite North and South, black and white, and provide us a government as good as we were; it was the second coming of the man who knew Niebuhr! By last week, Brooks was speaking of FDR and Reagan.
If elected, Obama will be the least experienced president since Jimmy Carter. No one knows what Obama really thinks, much less what he will actually do, since he had one set of policies in the primaries and another during the general election, and his rhetoric is as unspecific as Carter's was (except Obama did say in the debates - twice - that he intended to go through the federal budget "line by line").
He has released no records from college or law school, nor his law firm client list, nor the files relating to his legislative experience in Illinois. He has acknowledged a history of drug use and the fact that he currently smokes, but he refuses to release any medical records. He has spent most of his still-unfinished first term in the Senate running for president, which his supporters argue is the executive experience that qualifies him for the presidency.
His own running mate has told us Obama could have made a better vice-presidential choice, and has warned us that Obama's inexperience will result in multiple international crises in his first six months. But Obama wrote an excellent autobiography, has an organized unconscious, and knows Niebuhr.
There is a good chance that if we elect him, we will one day ask: what were we thinking?
McCain Feingold and his very tepid support of the Second Amendment are two of the reasons I have a very hard time getting behind McCain. With that said, BHO has been turning campaign spending on its ear, with very little notice by the MSM.
October 22, 2008, 8:00 a.m.
Fake Donors, Phony Pledge On campaign finance, Obama declared independence from his promises.
By David Freddoso
Starting in June, Barack Obama’s website stopped asking for donations. Instead, it began asking for citizens who would “declare their independence from a broken system by supporting the first presidential election truly funded by the people.”
Perhaps the campaign did not expect that among those "declaring their independence" would be donors named "Doodad Pro," "Derty Poiiuy," and "Jgtj Jfggjjfgj." (And you thought Barack Obama had a funny name.) They may not have known that at least four Missourians and one Virginian would declare their independence involuntarily and later find fraudulent donations to Obama's campaign on their credit card statements. The Obama campaign cannot claim ignorance of "Good Will," whose address is the Goodwill headquarters in Austin, and whose occupation is "Loving You." The Goodwill office received a letter from Obama last month indicating that Mr. Will had exceeded the legal limit with his $7,000 in contributions, and asking whether part of the money could be directed to Obama's general election campaign.
Such abuse of the system may just be the inevitable consequence of a political system driven by massive amounts of money — or at least, that’s what Barack Obama used to say, before he figured out how to use that system to his advantage.
Reporters now note dryly that Barack Obama promised to take public matching funds for the presidential election, which would have limited the amount he could spend, and that he then reneged on his promise in June. This narrative understates the case.
Obama actually went much farther than merely giving his word that he would accept matching funds. In February of 2007, he challenged all of the Republican candidates for president to pledge, along with him, that they would take matching funds. It was supposed to be a rare display of political courage on his part, for the sake of principles he believed in.
Sen. John McCain, who has long clashed with conservatives on issues of campaign finance, accepted Obama’s challenge on Obama’s terms. Obama would later write on a November 2007 questionnaire from the Midwest Democracy Network: “If I am the Democratic nominee, I will aggressively pursue an agreement with the Republican nominee to preserve a publicly financed general election.” In February of this year, he wrote an op-ed stating again that he would “aggressively pursue” an agreement with McCain that would set “real spending limits.” He repeated this promise on FOX News on April 27.
Then, all of a sudden, Barack Obama announced in June that the public campaign-financing system was "broken" and so he could not participate in it. Presumably, someone went and broke the public campaign-financing system sometime between April and mid-June of this year.
Who did it? Barack Obama did. He broke the system as soon as it became clear to him that by rejecting public financing, he might be able to raise half a billion dollars and drown his opponent in money, as he is doing now.
It may all seem like a minor point now — just an occasion for a bit of Republican whining as Obama’s attack ads dominate the airwaves thanks to his broken promise. After all, Obama has raised quite a bit of money. But his donations from fake donors evoke the fake promise he made on principle just months ago to restrict campaign spending and limit the influence of special interests.
News reporters often assume, incorrectly, that the numbers in the FEC reports they scour each quarter are put on the Internet by magic. In fact, each one has to be recorded individually by a human being in what is really a painstaking process. This applies not only to the larger amounts contributed by Mr. Will and Mr. Jfggjjfgj, but also to amounts less than $200. A pair of human eyes has to check each one, even if amounts smaller than $200 are not required by law to be disclosed in any report.
Obama’s finance team missed quite a few obviously troubling large donations, from such unsavory individuals as Mr. Jfggjjfgj, “Mong Kong,” “Test Person,” and “Jockim Alberton,” who lives at a fictional address on a street that does not exist in Wilmington, Delaware. How many fictional characters might there be among the $220 million that Obama has collected in small, undisclosed contributions?
Obama’s small donors have all been recorded, and he could easily follow McCain’s lead by disclosing this major source of his campaign’s money. Hopefully the list of donors contains no one with Asdfjkl as a surname, and it bears no resemblance to an ACORN voter-registration list.
— David Freddoso is a staff reporter for National Review Online and author of The Case Against Barack Obama.