Dog Brothers Public Forum
Topic: Physics & Mathematics (Read 20690 times)
Physics & Mathematics
March 11, 2007, 09:27:20 AM
Three days after learning that he won the 2006 Nobel Prize in Physics, George Smoot was talking about the universe. Sitting across from him in his office at the University of California, Berkeley, was Saul Perlmutter, a fellow cosmologist and a probable future Nobelist in Physics himself. Bearded, booming, eyes pinwheeling from adrenaline and lack of sleep, Smoot leaned back in his chair. Perlmutter, onetime acolyte, longtime colleague, now heir apparent, leaned forward in his.
“Time and time again,” Smoot shouted, “the universe has turned out to be really simple.”
Perlmutter nodded eagerly. “It’s like, why are we able to understand the universe at our level?”
“Right. Exactly. It’s a universe for beginners! ‘The Universe for Dummies’!”
But as Smoot and Perlmutter know, it is also inarguably a universe for Nobelists, and one that in the past decade has become exponentially more complicated. Since the invention of the telescope four centuries ago, astronomers have been able to figure out the workings of the universe simply by observing the heavens and applying some math, and vice versa. Take the discovery of moons, planets, stars and galaxies, apply Newton’s laws and you have a universe that runs like clockwork. Take Einstein’s modifications of Newton, apply the discovery of an expanding universe and you get the big bang. “It’s a ridiculously simple, intentionally cartoonish picture,” Perlmutter said. “We’re just incredibly lucky that that first try has matched so well.”
But is our luck about to run out? Smoot’s and Perlmutter’s work is part of a revolution that has forced their colleagues to confront a universe wholly unlike any they have ever known, one that is made of only 4 percent of the kind of matter we have always assumed it to be — the material that makes up you and me and this magazine and all the planets and stars in our galaxy and in all 125 billion galaxies beyond. The rest — 96 percent of the universe — is ... who knows?
“Dark,” cosmologists call it, in what could go down in history as the ultimate semantic surrender. This is not “dark” as in distant or invisible. This is “dark” as in unknown for now, and possibly forever.
If so, such a development would presumably not be without philosophical consequences of the civilization-altering variety. Cosmologists often refer to this possibility as “the ultimate Copernican revolution”: not only are we not at the center of anything; we’re not even made of the same stuff as most of the rest of everything. “We’re just a bit of pollution,” Lawrence M. Krauss, a theorist at Case Western Reserve, said not long ago at a public panel on cosmology in Chicago. “If you got rid of us, and all the stars and all the galaxies and all the planets and all the aliens and everybody, then the universe would be largely the same. We’re completely irrelevant.”
All well and good. Science is full of homo sapiens-humbling insights. But the trade-off for these lessons in insignificance has always been that at least now we would have a deeper — simpler — understanding of the universe. That the more we could observe, the more we would know. But what about the less we could observe? What happens to new knowledge then? It’s a question cosmologists have been asking themselves lately, and it might well be a question we’ll all be asking ourselves soon, because if they’re right, then the time has come to rethink a fundamental assumption: When we look up at the night sky, we’re seeing the universe.
Not so. Not even close.
In 1963, two scientists at Bell Labs in New Jersey discovered a microwave signal that came from every direction of the heavens. Theorists at nearby Princeton University soon realized that this signal might be the echo from the beginning of the universe, as predicted by the big-bang hypothesis. Take the idea of a cosmos born in a primordial fireball and cooling down ever since, apply the discovery of a microwave signal with a temperature that corresponded precisely to the one that was predicted by theorists — 2.7 degrees above absolute zero — and you have the universe as we know it. Not Newton’s universe, with its stately, eternal procession of benign objects, but Einstein’s universe, violent, evolving, full of births and deaths, with the grandest birth and, maybe, death belonging to the cosmos itself.
But then, in the 1970s, astronomers began noticing something that didn’t seem to fit with the laws of physics. They found that spiral galaxies like our own Milky Way were spinning at such a rate that they should have long ago wobbled out of control, shredding apart, shedding stars in every direction. Yet clearly they had done no such thing. They were living fast but not dying young. This seeming paradox led theorists to wonder if a halo of a hypothetical something else might be cocooning each galaxy, dwarfing each flat spiral disk of stars and gas at just the right mass ratio to keep it gravitationally intact. Borrowing a term from the astronomer Fritz Zwicky, who detected the same problem with the motions of a whole cluster of galaxies back in the 1930s, decades before anyone else took the situation seriously, astronomers called this mystery mass “dark matter.”
So there was more to the universe than meets the eye. But how much more? This was the question Saul Perlmutter’s team at Lawrence Berkeley National Laboratory set out to answer in the late 1980s. Actually, they wanted to settle an issue that had been nagging astronomers ever since Edwin Hubble discovered in 1929 that the universe seems to be expanding. Gravity, astronomers figured, would be slowing the expansion, and the more matter the greater the gravitational effect. But was the amount of matter in the universe enough to slow the expansion until it eventually stopped, reversed course and collapsed in a backward big bang? Or was the amount of matter not quite enough to do this, in which case the universe would just go on expanding forever? Just how much was the expansion of the universe slowing down?
The tool the team would be using was a specific type of exploding star, or supernova, that reaches a roughly uniform brightness and so can serve as what astronomers call a standard candle. By comparing how bright supernovae appear and how much the expansion of the universe has shifted their light, cosmologists sought to determine the rate of the expansion. “I was trying to tell everybody that this is the measurement that everybody should be doing,” Perlmutter says. “I was trying to convince them that this is going to be the tool of the future.” Perlmutter talks like a microcassette on fast-forward, and he possesses the kind of psychological dexterity that allows him to walk into a room and instantly inhabit each person’s point of view. He can be as persuasive as any force of nature. “The next thing I know,” he says, “we’ve convinced people, and now they’re competing with us!”
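The standard-candle logic is just the inverse-square law: if every such supernova peaks at roughly the same intrinsic brightness, then how faint it appears tells you how far away it is. A minimal sketch (the luminosity value and function name here are illustrative, not the teams' actual pipeline):

```python
import math

# Illustrative sketch of the standard-candle idea: if every Type Ia
# supernova peaks at roughly the same intrinsic luminosity L, then the
# observed flux F falls off as 1/d^2, so the distance follows from
# d = sqrt(L / (4 * pi * F)).
L_PEAK = 1.0e36  # assumed peak luminosity in watts (made-up round number)

def candle_distance(flux_w_per_m2):
    """Distance (meters) implied by an observed flux via the inverse-square law."""
    return math.sqrt(L_PEAK / (4 * math.pi * flux_w_per_m2))

# A supernova seen at one quarter the flux of another identical one
# must be twice as far away.
print(candle_distance(1.0e-9) / candle_distance(4.0e-9))  # -> 2.0
```

Combined with how much the expansion has redshifted each supernova's light, that simple distance estimate is what let both teams chart the expansion history.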
By 1997, Perlmutter’s Supernova Cosmology Project and a rival team had amassed data from more than 50 supernovae between them — data that would reveal yet another oddity in the cosmos. Perlmutter noticed that the supernovae weren’t brighter than expected but dimmer. He wondered if he had made a mistake in his observations. A few months later, Adam Riess, a member of a rival international team, noticed the same general drift in his math and wondered the same thing. “I’m a postdoc,” he told himself. “I’m sure I’ve messed up in at least 10 different ways.” But Perlmutter double-checked for intergalactic dust that might have skewed his readings, and Riess cross-checked his math, calculation by calculation, with his team leader, Brian Schmidt. Early in 1998, the two teams announced that they had each independently reached the same conclusion, and it was the opposite of what either of them expected. The rate of the expansion of the universe was not slowing down. Instead, it seemed to be speeding up.
That same year, Michael Turner, the prominent University of Chicago theorist, delivered a paper in which he called this antigravitational force “dark energy.” The purpose of calling it “dark,” he explained recently, was to highlight the similarity to dark matter. The purpose of “energy” was to make a distinction. “It really is very different from dark matter,” Turner said. “It’s more energylike.”
More energylike how, exactly?
Turner raised his eyebrows. “I’m not embarrassed to say it’s the most profound mystery in all of science.”
“Extraordinary claims,” Carl Sagan once said, “require extraordinary evidence.” Astronomers love that saying; they quote it all the time. In this case the claim could hardly have been more extraordinary: a new universe was dawning.
It wouldn’t be the first time. We once thought the night sky consisted of the several thousand objects we could see with the naked eye. But the invention of the telescope revealed that it didn’t, and that the farther we saw, the more we saw: planets, stars, galaxies. After that we thought the night sky consisted of only the objects the eye could see with the assistance of telescopes that reached all the way back to the first stars blinking to life. But the discovery of wavelengths beyond the optical revealed that it didn’t, and that the more we saw in the radio or infrared or X-ray parts of the electromagnetic spectrum, the more we discovered: evidence for black holes, the big bang and the distances of supernovae, for starters.
The difference with “dark,” however, is that it lies not only outside the visible but also beyond the entire electromagnetic spectrum. By all indications, it consists of matter and energy that our five senses can’t detect except indirectly. The motions of galaxies don’t make sense unless we infer the existence of dark matter. The brightness of supernovae doesn’t make sense unless we infer the existence of dark energy. It’s not that inference can’t be a powerful tool: an apple falls to the ground, and we infer gravity. But it can also be an incomplete tool: gravity is ... ?
Dark matter is ... ? In the three decades since most astronomers decisively, if reluctantly, accepted the existence of dark matter, observers have eliminated the obvious answer: that dark matter is made of normal matter that is so far away or so dim that it can’t be seen from earth. To account for the dark-matter deficit, this material would have to be so massive and so numerous that we couldn’t possibly miss it.
Which leaves abnormal matter, or what physicists call nonbaryonic matter, meaning that it doesn’t consist of the protons and neutrons of “normal” matter. What’s more (or, perhaps more accurately, less), it doesn’t interact at all with electricity or magnetism, which is why we wouldn’t be able to see it, and it can rarely interact even with protons and neutrons, which is why trillions of these particles might be passing through you every second without your knowing it. Theorists have narrowed the search for dark-matter particles to two hypothetical candidates: the axion and the neutralino. But so far efforts to create one of these ghostly particles in accelerators, which mimic the high levels of energy in the first fraction of a second after the birth of the universe, have come up empty. So have efforts to catch one in ultrasensitive detectors, which number in the dozens around the world.
For now, dark-matter physicists are hanging their hopes on the Large Hadron Collider, the latest-generation subatomic-particle accelerator, which goes online later this year at the European Center for Nuclear Research on the Franco-Swiss border. Many cosmologists think that the L.H.C. has made the creation of a dark-matter particle — as George Smoot said, holding up two fingers — “this close.” But one of the pioneer astronomers investigating dark matter in the 1970s, Vera Rubin, says that she has lived through plenty of this kind of optimism; she herself predicted in 1980 that dark matter would be identified within a decade. “I hope he’s right,” she says of Smoot’s assertion. “But I think it’s more a wish than a belief.” As one particle physicist commented at a “Dark Universe” symposium at the Space Telescope Science Institute in Baltimore a few years ago, “If we fail to see anything in the L.H.C., then I’m off to do something else,” adding, “Unfortunately, I’ll be off to do something else at the same time as hundreds of other physicists.”
Juan Collar might be among them. “I know I speak for a generation of people who have been looking for dark-matter particles since they were grad students,” he said one wintry afternoon in his University of Chicago office. “I doubt how many of us will remain in the field if the L.H.C. brings home bad news. I have been looking for dark-matter particles for more than 15 years. I’m 42. So most of my colleagues, my age, we are kind of going through a midlife crisis.” He laughed. “When we get together and we drink enough beer, we start howling at the moon.”
Although many scientists say that the existence of the axion will be proved or disproved within the next 10 years — as a result of work at Lawrence Livermore National Laboratory — the detection of a neutralino one way or the other is much less certain. A negative result from an experiment might mean only that theorists haven’t thought hard enough or that observers haven’t looked deep enough. “It could very well be that Mother Nature has decided that the neutralino is way down there,” Collar said, pointing not to a graph that he taped up in his office but to a point below the sheet of paper itself, at the blank wall. “If that is the case,” he went on to say, “we should retreat and worship Mother Nature. These particles maybe exist, but we will not see them, our sons will not see them and their sons won’t see them.”
Dark Energy Part Two
Reply #1 on: March 11, 2007, 09:28:05 AM
The challenge with dark energy, as opposed to dark matter, is even more difficult. Dark energy is whatever it is that’s making the expansion of the universe accelerate, but, for instance, does it change over time and space? If so, then cosmologists have a name for it: quintessence. Does it not change? In that case, they’ll call it the cosmological constant, a version of the mathematical fudge factor that Einstein originally inserted into the equations for relativity to explain why the universe had neither expanded nor contracted itself out of existence.
After the discovery of dark energy, Perlmutter concluded that the next generation of dark-energy telescopes would have to include a space-based observatory. But the search for financing for such an ambitious project can require as much forbearance as the search for dark energy itself. “I don’t think I’ve ever seen as much of Washington as I have in the last few years,” he says, sighing. Even if his Supernova Acceleration Probe didn’t now face competition from several other proposals for federal financing (including, perhaps inevitably, one involving his old rival Riess), delays have prevented it from being ready to launch until at least the middle of the next decade. “Ten years from now,” says Josh Frieman of the University of Chicago, “when we’re talking about spending on the order of a billion dollars to put something up in space — which I think we should do — you’re getting into that class where you’re spending real money.”
Even some cosmologists have begun to express reservations. At a conference at Durham University in England last summer, a “whither cosmology?” panel featuring some of the field’s most prominent names questioned the wisdom of concentrating so much money and manpower on one problem. They pointed to what happened when the government-sponsored Dark Energy Task Force solicited proposals for experiments a couple of years ago. The task force was expecting a dozen, according to one member. They got three dozen. Cosmology was choosing a “risky and not very cost-effective way of moving forward,” one Durham panelist told me later, summarizing the sentiment he heard there.
But even if somebody were to figure out whether or not dark energy changes across time and space, astronomers still wouldn’t know what dark energy itself is. “The term doesn’t mean anything,” said David Schlegel of Lawrence Berkeley National Laboratory this past fall. “It might not be dark. It might not be energy. The whole name is a placeholder. It’s a placeholder for the description that there’s something funny that was discovered eight years ago now that we don’t understand.” Not that theorists haven’t been trying. “It’s just nonstop,” Perlmutter told me. “There’s article after article after article.” He likes to begin public talks with a PowerPoint illustration: papers on dark energy piling up, one on top of the next, until the on-screen stack ascends into the dozens. All the more reason not to put all of cosmology’s eggs into one research basket, argued the Durham panelists. As one summarized the situation, “We don’t even have a hypothesis to test.”
Michael Turner won’t hear of it. “This is one of these godsend problems!” he says. “If you’re a scientist, you’d like to be around when there’s a great problem to work on and solve. The solution is not obvious, and you could imagine it being solved tomorrow, you could imagine it taking another 10 years or you could imagine it taking another 200 years.”
But you could also imagine it taking forever.
“Time to get serious.” The PowerPoint slide, teal letters popping off a black background, stared back at a hotel ballroom full of cosmologists. They gathered in Chicago last winter for a “New Views of the Universe” conference, and Sean Carroll, then at the University of Chicago, had taken it upon himself to give his theorist colleagues their marching orders.
“There was a heyday for talking out all sorts of crazy ideas,” Carroll, now at Caltech, recently explained. That heyday would have been the heady, post-1998 period when Michael Turner might stand up at a conference and turn to anyone voicing caution and say, “Can’t we be exuberant for a while?” But now has come the metaphorical morning after, and with it a sobering realization: Maybe the universe isn’t simple enough for dummies like us humans. Maybe it’s not just our powers of perception that aren’t up to the task but also our powers of conception. Extraordinary claims like the dawn of a new universe might require extraordinary evidence, but what if that evidence has to be literally beyond the ordinary? Astronomers now realize that dark matter probably involves matter that is nonbaryonic. And whatever it is that dark energy involves, we know it’s not “normal,” either. In that case, maybe this next round of evidence will have to be not only beyond anything we know but also beyond anything we know how to know.
That possibility always gnaws at scientists — what Perlmutter calls “that sense of tentativeness, that we have gotten so far based on so little.” Cosmologists in particular have had to confront that possibility throughout the birth of their science. “At various times in the past 20 years it could have gotten to the point where there was no opportunity for advance,” Frieman says. What if, for instance, researchers couldn’t repeat the 1963 Bell Labs detection of the supposed echo from the big bang? Smoot and John C. Mather of NASA (who shared the Nobel in Physics with Smoot) designed the Cosmic Background Explorer satellite telescope to do just that. COBE looked for extremely subtle differences in temperature throughout all of space that carry the imprint of the universe when it was less than a second old. And in 1992, COBE found them: in effect, the quantum fluctuations that 13.7 billion years later would coalesce into a universe that is 22 percent dark matter, 74 percent dark energy and 4 percent the stuff of us.
And if the right ripples hadn’t shown up? As Frieman puts it: “You just would have thrown up your hands and said, ‘My God, we’ve got to go back to the drawing board!’ What’s remarkable to me is that so far that hasn’t happened.”
Yet in a way it has. In the observation-and-theory, call-and-response system of investigating nature that scientists have refined over the past 400 years, the dark side of the universe represents a disruption. General relativity helped explain the observations of the expanding universe, which led to the idea of the big bang, which anticipated the observations of the cosmic-microwave background, which led to the revival of Einstein’s cosmological constant, which anticipated the observations of supernovae, which led to dark energy. And dark energy is ... ?
The difficulty in answering that question has led some cosmologists to ask an even deeper question: Does dark energy even exist? Or is it perhaps an inference too far? Cosmologists have another saying they like to cite: “You get to invoke the tooth fairy only once,” meaning dark matter, “but now we have to invoke the tooth fairy twice,” meaning dark energy.
One of the most compelling arguments that cosmologists have for the existence of dark energy (whatever it is) is that unlike earlier inferences that physicists eventually had to abandon — the ether that 19th-century physicists thought pervaded space, for instance — this inference makes mathematical sense. Take Perlmutter’s and Riess’s observations of supernovae, apply one cornerstone of 20th-century physics, general relativity, and you have a universe that does indeed consist of .26 matter, dark or otherwise, and .74 something that accelerates the expansion. Yet in another way, dark energy doesn’t add up. Take the observations of supernovae, apply the other cornerstone of 20th-century physics, quantum theory, and you get gibberish — you get an answer 120 orders of magnitude larger than .74.
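That “120 orders of magnitude” figure is worth seeing as plain arithmetic. With illustrative round numbers (the observed dark-energy density is often quoted as roughly 10^-9 joules per cubic meter, while naive quantum vacuum-energy estimates land near 10^111 J/m^3; both values here are assumptions for the sake of the sketch):

```python
import math

# Illustrative round numbers only: the observed dark-energy density is
# roughly 1e-9 J/m^3, while a naive quantum-field-theory estimate of the
# vacuum energy comes out near 1e111 J/m^3 -- the famous mismatch of
# about 120 orders of magnitude mentioned in the article.
observed = 1e-9    # J/m^3, approximate observed dark-energy density
predicted = 1e111  # J/m^3, rough quantum vacuum-energy estimate

discrepancy = math.log10(predicted / observed)
print(round(discrepancy))  # -> 120
```

It is the size of that exponent, not any subtlety in the measurement, that makes physicists call this the worst prediction in the history of theoretical physics.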
Which doesn’t mean that dark energy is the ether of our age. But it does mean that its implications extend beyond cosmology to a problem Einstein spent the last 30 years of his life trying to reconcile: how to unify his new physics of the very large (general relativity) with the new physics of the very small (quantum mechanics). What makes the two incompatible — where the physics breaks down — is gravity.
In physics, gravity is the ur-inference. Even Newton admitted that he was making it up as he went along. That a force of attraction might exist between two distant objects, he once wrote in a letter, is “so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it.” Yet fall into it we all do on a daily basis, and physicists are no exception. “I don’t think we really understand what gravity is,” Vera Rubin says. “So in some sense we’re doing an awful lot on something we don’t know much about.”
It hasn’t escaped the notice of astronomers that both dark matter and dark energy involve gravity. Early this year 50 physicists gathered for a “Rethinking Gravity” conference at the University of Arizona to discuss variations on general relativity. “So far, Einstein is coming through with flying colors,” says Sean Carroll, who was one of the gravity-defying participants. “He’s always smarter than you think he was.”
But he’s not necessarily inviolate. “We’ve never tested gravity across the whole universe before,” Riess pointed out during a news conference last year. “It may be that there’s not really dark energy, that that’s a figment of our misperception about gravity, that gravity actually changes the way it operates on long ranges.”
The only way out, cosmologists and particle physicists agree, would be a “new physics” — a reconciliation of general relativity and quantum mechanics. “Understanding dark energy,” Riess says, “seems to really require understanding and using both of those theories at the same time.”
“It’s been so hard that we’re even willing to consider listening to string theorists,” Perlmutter says, referring to work that posits numerous dimensions beyond the traditional (one of time and three of space). “They’re at least providing a language in which you can talk about both things at the same time.”
According to quantum theory, particles can pop into and out of existence. In that case, maybe the universe itself was born in one such quantum pop. And if one universe can pop into existence, then why not many universes? String theorists say that number could be 10 raised to the power of 500. Those are 10-with-500-zeros universes, give or take. In which case, our universe would just happen to be the one with an energy density of .74, a condition suitable for the existence of creatures that can contemplate their hyper-Copernican existence.
And this is just one of a number of theories that have been popping into existence, quantum-particle-like, in the past few years: parallel universes, intersecting universes or, in the case of Stephen Hawking and Thomas Hertog just last summer, a superposition of universes. But what evidence — extraordinary or otherwise — can anyone offer for such claims? The challenge is to devise an experiment that would do for a new physics what COBE did for the big bang. Predictions in string theory, as in the 10-to-the-power-of-500-universes hypothesis, depend on the existence of extra dimensions, a stipulation that just might put the burden back on particle physics — specifically, the hope that evidence of extra dimensions will emerge in the Large Hadron Collider, or perhaps in its proposed successor, the International Linear Collider, which might come online sometime around 2020, or maybe in the supercollider after that, if the industrial nations of 2030 decide they can afford it.
“You want your mind to be boggled,” Perlmutter says. “That is a pleasure in and of itself. And it’s more a pleasure if it’s boggled by something that you can then demonstrate is really, really true.”
And if you can’t demonstrate that it’s really, really true?
“If the brilliant idea doesn’t come along,” Riess says, “then we will say dark energy has exactly these properties, it acts exactly like this. And then” — a shrug — “we will put it in a box.” And there it will remain, residing perhaps not far from the box labeled “Dark Matter,” and the two of them bookending the biggest box of them all, “Gravity,” to await a future Newton or Einstein to open — or not.
massive black holes
Reply #2 on: April 21, 2007, 09:01:57 PM
I was watching the "Science" channel on cable the other night. They had a show on supermassive black holes. I didn't realize that present theory holds that there is a black hole in every galaxy and that it is in some way related to the clustering of the stars in that galaxy. It is also theorized that quasars are related to supermassive black holes.
I remember from my astronomy classes in the 70's (ugh!) that quasars were the farthest objects in the universe and there was absolutely no explanation as to what they were. A lot of discovery has happened since then. A lot of theories have been formulated.
Yet every time I read about space I am left with this empty feeling. I feel like we will never be able to understand "where it all began". It seems unanswerable. It seems incomprehensible. Should this thread be headed under religion or God? But to me the concept of God doesn't really answer the great questions asked since the beginning of man. But it is more comforting.
This link is not to the particular show but to another space site which came up today on a news link:
The Shadow goes
Reply #3 on: June 20, 2007, 09:01:37 AM
The Shadow Goes
By MARGARET WERTHEIM
Published: June 20, 2007
ON Thursday, the summer solstice, the Sun will celebrate the year’s lazy months by resting on the horizon. The word solstice derives from the Latin “sol” (sun) and “sistere” (to stand still). The day marks the sun’s highest point in the sky, the moment when our shadows shrink to their shortest length of the year. How strange to think that these mundane friends, our ever-present familiars, can actually go faster than the sun’s rays.
I remarked on this recently to my husband as we sat on the porch with our shadows pooling by our chairs. Nothing can go faster than light, he insisted, expressing what is surely the most widely known law of physics, ingrained into us by a thousand “Nova” programs.
That is the point, I explained: Nothing can go faster than light. A shadow isn’t a thing. It’s a non-thing. It’s the absence of light.
Special relativity dictates that we cannot move anything more quickly than the particles of light known as photons, but no law says you can’t do nothing faster than light. Physicists have known this for a long time, even if they generally do not mention it on PBS documentaries.
My husband looked troubled, as did my sister and some friends I regaled with the story that evening. Like the warp drive on “Star Trek,” faster-than-light travel is supposed to be a science-fiction fantasy. Isn’t it?
They are right about the travel: According to relativity, no physical substance can exceed the speed of light because it would take infinite energy to accelerate anything to such a velocity.
Yet the laws of physics pertain only to that which is. That which isn’t is not bound by relativity’s restraint. From the point of view of relativity, a shadow (having no mass) is a non-thing, an existential void.
It’s quite easy to conjure up a faster-than-light shadow, at least in theory. Build a great klieg light, a superstrong version of the ones set up at the Academy Awards. Now paste a piece of black paper onto the klieg’s glass so there is a shadow in the middle of the beam, like the signal used to summon Batman. And we are going to mount our light in space and broadcast the Bat-call to the cosmos.
The key to our trick is to rotate the klieg. As the light turns, the bat shadow sweeps across the sky. Round and round it goes, projecting into the void. Just as the rim of a bicycle wheel moves faster than its hub, so too, away from the source our bat shadow will fly faster and faster, a consequence of the geometry that guarantees the rim of a really big wheel moves faster than a co-rotating small wheel.
At a great enough distance from the source, our shadow bat will go so fast it will exceed the speed of light. This does not violate relativity because a shadow carries no energy. Literally nothing is transferred. Our shadow bat can go 10 times the speed of light or 100 times faster without breaking any of physics’ sacred rules.
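The wheel-rim geometry reduces to one formula: a point swept at rotation frequency f and radius r moves at v = 2πf·r, so beyond r = c/(2πf) the shadow outruns light. A quick sketch (the function names are mine, not standard physics library calls):

```python
import math

C = 299_792_458.0  # speed of light in meters per second

def shadow_speed(rotation_hz, radius_m):
    """Tangential speed of the swept shadow at a given radius:
    v = 2 * pi * f * r. Relativity imposes no limit here, because
    nothing physical travels along the arc."""
    return 2 * math.pi * rotation_hz * radius_m

def superluminal_radius(rotation_hz):
    """Radius beyond which the shadow outruns light: r = c / (2 * pi * f)."""
    return C / (2 * math.pi * rotation_hz)

# A klieg light spun once per second: past roughly 48 million meters,
# the bat shadow sweeps faster than light; at twice that radius it
# moves at twice the speed of light.
r = superluminal_radius(1.0)
print(r)                              # ~4.77e7 meters
print(shadow_speed(1.0, 2 * r) / C)  # -> 2.0
```

The arithmetic also shows why the trick needs astronomical distances: at one rotation per second, nothing closer than about 48,000 kilometers from the axis even reaches light speed.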
My sister leapt to the heart of this apparent paradox: Why isn’t the light itself traveling faster than the speed of light? Isn’t it also rotating in space? Actually, no. The bulbs that produce the light are spinning, but the light particles leave the source at 186,000 miles a second, the vaunted “speed of light.” Once emitted, the photons continue to travel at this speed directly away from the source. Only the shadow revolves around the great circle. The critical point is that no object, no substance, defies light.
My husband was right to object that you’d need one spectacular klieg to produce a detectable shadow thousands of miles out in space. Still, the theory is sound.
The anthropologist Mary Douglas noted that all systems of categorizing break down somewhere, unable to incorporate certain forms. By standing beyond relativity’s injunction, shadows suggest the limits of all classification schemes, a tension that even modern science cannot completely resolve.
In the terms recognized by relativity, shadows are non-things. Yet before the invention of clocks, shadows were the most important means for telling time. Weightless and without energy, shadows can nonetheless convey information — though they cannot, despite our giant klieg, be used for faster-than-light communication. That’s because the shadow’s location cannot be detected until the light, moving at its ponderous relativistic pace, arrives.
“Here there be monsters,” said the medieval maps, signaling the limits of reason’s reach. As a map of being, physics is flanked by the monsters of non-being whose outlines we glimpse in the paradoxes of quantum mechanics and in the zooming arc of a shadow bat going faster than light.
In Christian theology we are told, “God is that which nothing is greater than.” The scientific corollary might be, “Light is that which nothing is faster than” — a statement true both in spirit and fact.
Margaret Wertheim, the director of the Institute for Figuring, a science and mathematics education organization, is writing a book on physics and the imagination.
Super Collider may Cause a Black Hole that will Kill us All, Oh My!
Reply #4 on:
August 29, 2007, 09:02:06 AM »
For those needing a new unreasoned fear to latch on to. . . .
Aug 3, 2007
Fears over factoids
Recent TV programmes have claimed that the Earth could be destroyed by black holes created in particle accelerators and that helium-3 from the Moon could be used for fusion energy. Frank Close warns that these "factoids" must be stamped out before they become accepted as facts
Did you know that when the Large Hadron Collider (LHC) comes online at CERN next spring, it could end up creating mini black holes that destroy the Earth? This is not something from a Dan Brown novel, but from a TV documentary broadcast as part of the BBC's Horizon series in the UK on 1 May – a programme that has been running for 40 years and is supposedly the flagship of TV science in the country. Although the documentary itself was fairly measured, the producers began the programme with the black-hole claim and used it in their publicity for the show.
Physicists who recall superb Horizon documentaries of the past – for example, on the discovery of the W and Z bosons – will have been disappointed that such a marvellous project as the LHC should have been sensationalized in this way. It was disheartening that the programme makers felt the need to rehash these unnecessary concerns over black holes being produced in particle accelerators, which physicists had already dismissed before the Relativistic Heavy Ion Collider (RHIC) came online at the Brookhaven National Laboratory in 2000 (Physics World July 2000 pp19–20, print edition only).
Meanwhile, another Horizon documentary, broadcast on 10 April, claimed that one reason for sending humans to the Moon is so that we can mine it for helium-3 as a fuel for fusion power back on Earth. The need to bring helium-3 back from the Moon has even been briefly referred to in Physics World (May 2007 pp12–13, print edition only) and, more worryingly, has been presented to US congressional committees, including the Science and Technology Committee of the House of Representatives in 2004.
As a particle physicist, I am of course interested in the LHC; and as the chair of a working group set up by the British National Space Centre to look into the future of UK space science – including the possibility of humans returning to the Moon – I am also intrigued by the helium-3 story. Both of the claims bother me and, on investigation, each is revealed as an example of what I call "factoid science" – myths of dubious provenance that propagate, become received wisdom and could even influence policy. So what is the reality and what can physicists do to correct such misinformation?
The story of the LHC as an Armageddon machine would be laughable were it not so serious. Aficionados of Dan Brown – whose novel Angels and Demons was set partly at CERN – might believe that the Geneva lab produces antimatter capable of making weapons of mass destruction. But I did not expect to find similarly outlandish statements used to promote Horizon. As the programme's website puts it: "Some scientists argue that during a 10-year spell of operation there is a 1 in 50 million chance that experiments like the LHC could cause a catastrophe of epic proportions." The site then invites the public to take part in a poll on whether the LHC should be turned on or not, based on this "probability".
While the LHC will create the most energetic collisions ever seen on Earth, cosmic rays at these and even higher energies have been bombarding our and other planets for billions of years without mishap. When I asked the producers of Horizon where they had obtained the 1-in-50-million statistic, I was told it had been taken from a "reliable source": Our Final Century by Cambridge University cosmologist Martin Rees. But when I read his book, it became clear that the programme's research had sadly been incomplete. On page 124, Rees discusses a paper published in 1999 by CERN theorists Arnon Dar, Alvaro de Rújula and Ulrich Heinz that uses the fact that the Earth and the cosmos have survived for several billion years to estimate the probability of colliders producing hypothetical particles called "strangelets" that might destroy our planet (1999 Phys. Lett. B 470 142).
Rees fairly describes their conclusions as follows: "If the experiment were run for 10 years, the risk of catastrophe was no more than 1 in 50 million." In other words, the chance of disaster is one in at least 50 million (as no disaster has occurred); this is rather different from saying, as Horizon does, that there is a "1 in 50 million" probability of a catastrophe happening from the moment the LHC switches on.
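The distinction Close draws, a bound inferred from survival rather than a measured probability, can be illustrated with a "rule of three" style estimate: zero disasters in many natural trials only caps how large the per-trial risk can be. A toy sketch, with an invented trial count that is not Dar et al.'s actual figure:

```python
def upper_bound_per_trial(n_trials: int, confidence: float = 0.95) -> float:
    """Upper bound on the per-trial disaster probability, given n_trials
    independent trials with zero disasters observed.
    Solves (1 - p) ** n_trials = 1 - confidence for p."""
    return 1 - (1 - confidence) ** (1.0 / n_trials)

# Illustrative only: suppose nature has already run a comparable
# 'experiment' 500 million times with no catastrophe.
bound = upper_bound_per_trial(500_000_000)
print(f"95% upper bound on per-trial risk: {bound:.2e}")
```

The output is a ceiling, not an estimate: the true risk could be arbitrarily smaller, which is exactly the "one in at least 50 million" point Close makes against the Horizon framing.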
Moreover, when Dar and colleagues wrote their 1999 paper, a committee of eminent physicists appointed by the Brookhaven lab was also investigating if RHIC could produce strangelets (arXiv:hep-ph/9910333v3). That study used not just information from cosmology but also data from collisions between heavy ions (albeit at lower energies than RHIC would obtain) to show that the chances of catastrophe are no more than one part in 10^19.
Furthermore, these figures refer specifically to strangelets being produced at RHIC, as Rees makes clear, and have nothing to do with the question of whether we should risk creating black holes. Indeed, why does Horizon talk about black holes at all? The only reason can be that a theory does exist that posits that mini black holes could be produced in a collider. But if one mentions this theory, then one must include the whole of it, which clearly states that mini black holes pose no hazard whatsoever because they do not grow but evaporate and die.
As if any more evidence was needed that colliders are safe, CERN also set up an "LHC safety-study group" to see if its new collider could create black holes or strangelets. It concluded – in an official CERN report published in 2003 (CERN-2003-001) – that there is "no basis for any conceivable threat" of either eventuality, which is as near as science can get to saying zero. Unfortunately, the Horizon programme made no mention of these serious and time-consuming enquiries even though CERN's press office gave the programme's researchers a copy of the lab's 2003 report. Instead, the public has been led to believe that scientists are prepared to embark on experiments that could spell the end of the planet.
Let me now turn to the helium-3 factoid. At most fusion experiments, such as the Joint European Torus (JET) in the UK, a fuel of deuterium and tritium nuclei is converted in a tokamak into helium-4 and a neutron, thereby releasing energy in the process. No helium-3 is involved, so where does the myth come from? Enter "helium-3 fusion" into Google and you will find numerous websites pointing out that the neutron produced in deuterium–tritium fusion makes the walls of the tokamak radioactive, but that fusion could be "clean" if only we reacted deuterium with helium-3 to produce helium-4 and a proton.
Given that the amount of helium-3 available on Earth is trifling, it has been proposed that we should go to the Moon to mine the isotope, which is produced in the Sun and might be blown onto the lunar surface via the solar wind. Apart from not even knowing for certain if there is any helium-3 on the Moon, there are two main problems with this idea – one obvious and one intriguingly subtle. The first problem is that, in a tokamak, deuterium reacts up to 100 times more slowly with helium-3 than it does with tritium. This is because fusion has to overcome the electrical repulsion between the protons in the fuel, which is much higher for deuterium–helium-3 reactions (the nuclei have one and two protons, respectively) than it is for deuterium–tritium reactions (one proton each).
Clearly, deuterium–helium-3 is a poor fusion process, but the irony is much greater as I shall now reveal. A tokamak is not like a particle accelerator where counter-rotating beams of deuterium and helium-3 collide and fuse. Instead, all of the nuclei in the fuel mingle together, which means that two deuterium nuclei can rapidly fuse to give a tritium nucleus and proton. The tritium can now fuse with the deuterium – again much faster than the deuterium can with helium-3 – to yield helium-4 and a neutron.
So by bringing helium-3 from the Moon, all we will end up doing is creating a deuterium–tritium fusion machine, which is the very thing the helium aficionados wanted to avoid! Undeterred, some of these people even suggest that two helium-3 nuclei could be made to fuse with each other to produce deuterium, an alpha particle and energy. Unfortunately, this reaction occurs even more slowly than deuterium–tritium fusion and the fuel would have to be heated to impractically high temperatures that would be beyond the reach of a tokamak. And as not even the upcoming International Thermonuclear Experimental Reactor (ITER) will be able to generate electricity from the latter reaction, the lunar-helium-3 story – like the LHC as an Armageddon machine – is, to my mind, moonshine.
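The Coulomb-repulsion argument above can be made roughly quantitative. A back-of-envelope sketch (the ~3 fm contact radius is an assumed round number, for illustration only), showing that the deuterium–helium-3 barrier is about twice the deuterium–tritium one:

```python
# Rough Coulomb-barrier comparison for the fusion reactions discussed above.
# Barrier ~ Z1 * Z2 * e^2 / (4 * pi * eps0 * r) at a nuclear contact radius r.
E_CHARGE = 1.602176634e-19   # elementary charge, C
K_COULOMB = 8.9875517923e9   # Coulomb constant, N*m^2/C^2
R_CONTACT = 3.0e-15          # assumed contact distance, m (crude round number)

def coulomb_barrier_keV(z1: int, z2: int) -> float:
    """Electrostatic barrier two nuclei of charge z1, z2 must overcome, in keV."""
    joules = K_COULOMB * z1 * z2 * E_CHARGE**2 / R_CONTACT
    return joules / E_CHARGE / 1000.0  # J -> eV -> keV

dt = coulomb_barrier_keV(1, 1)   # deuterium (Z=1) + tritium (Z=1)
dhe = coulomb_barrier_keV(1, 2)  # deuterium (Z=1) + helium-3 (Z=2)
print(f"D-T barrier:   {dt:.0f} keV")
print(f"D-He3 barrier: {dhe:.0f} keV ({dhe / dt:.0f}x higher)")
```

The factor of two in the barrier translates, via quantum tunnelling, into the much larger rate penalty Close describes; this sketch only captures the electrostatic part of the story.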
Does any of this matter beyond raising the blood pressure of some physicists? All publicity is good publicity, some might say. But I believe we should all be concerned. The LHC factoid has now been repeated in the New Yorker and in various reviews of the Horizon documentary. Even some nonphysics colleagues are asking me to explain what it is all about. If Horizon claims to be the flagship TV science series on which the public rely to form their opinions, I would hope that their researchers do their research, and that the editors then take due account of it.
The factoids about mining the Moon for fusion fuel and of the LHC Armageddon make a cautionary tale. A decade from now it is possible that committees of well-informed scientists and rather less-well-informed politicians, with public opinion weighing on their minds, will be deciding on our involvement in mega-projects such as the next huge accelerator, human space exploration, or even a post-ITER commercial fusion plant.
Decision making driven by public opinion that is influenced by factoids already has a dire history in the bio-medical arena: the controversy over whether to give children a combined immunization against measles, mumps and rubella (MMR) being the most recent example. My advice is that if you see an error in the media, speak out, write to the editors and try to get corrections made. It is an opportunity to get good science in the news.
About the author
Frank Close is a theoretical physicist at the University of Oxford, UK
Reply #5 on:
December 19, 2007, 10:26:01 AM »
Laws of Nature, Source Unknown
By DENNIS OVERBYE
Published: December 18, 2007
“Gravity,” goes the slogan on posters and bumper stickers. “It isn’t just a good idea. It’s the law.”
And what a law. Unlike, say, traffic or drug laws, you don’t have a choice about obeying gravity or any of the other laws of physics. Jump and you will come back down. Faith or good intentions have nothing to do with it.
Existence didn’t have to be that way, as Einstein reminded us when he said, “The most incomprehensible thing about the universe is that it is comprehensible.” Against all the odds, we can send e-mail to Sri Lanka, thread spacecraft through the rings of Saturn, take a pill to chase the inky tendrils of depression, bake a turkey or a soufflé and bury a jump shot from the corner.
Yes, it’s a lawful universe. But what kind of laws are these, anyway, that might be inscribed on a T-shirt but apparently not on any stone tablet that we have ever been able to find?
Are they merely fancy bookkeeping, a way of organizing facts about the world? Do they govern nature or just describe it? And does it matter that we don’t know and that most scientists don’t seem to know or care where they come from?
Apparently it does matter, judging from the reaction to a recent article by Paul Davies, a cosmologist at Arizona State University and author of popular science books, on the Op-Ed page of The New York Times.
Dr. Davies asserted in the article that science, not unlike religion, rested on faith, not in God but in the idea of an orderly universe. Without that presumption a scientist could not function. His argument provoked an avalanche of blog commentary, articles on Edge.org and letters to The Times, pointing out that the order we perceive in nature has been explored and tested for more than 2,000 years by observation and experimentation. That order is precisely the hypothesis that the scientific enterprise is engaged in testing.
David J. Gross, director of the Kavli Institute for Theoretical Physics in Santa Barbara, Calif., and co-winner of the Nobel Prize in physics, told me in an e-mail message, “I have more confidence in the methods of science, based on the amazing record of science and its ability over the centuries to answer unanswerable questions, than I do in the methods of faith (what are they?).”
Reached by e-mail, Dr. Davies acknowledged that his mailbox was “overflowing with vitriol,” but said he had been misunderstood. What he had wanted to challenge, he said, was not the existence of laws, but the conventional thinking about their source.
There is in fact a kind of chicken-and-egg problem with the universe and its laws. Which “came” first — the laws or the universe?
If the laws of physics are to have any sticking power at all, to be real laws, one could argue, they have to be good anywhere and at any time, including the Big Bang, the putative Creation. Which gives them a kind of transcendent status outside of space and time.
On the other hand, many thinkers — all the way back to Augustine — suspect that space and time, being attributes of this existence, came into being along with the universe — in the Big Bang, in modern vernacular. So why not the laws themselves?
Dr. Davies complains that the traditional view of transcendent laws is just 17th-century monotheism without God. “Then God got killed off and the laws just free-floated in a conceptual vacuum but retained their theological properties,” he said in his e-mail message.
But the idea of rationality in the cosmos has long existed without monotheism. As far back as the fifth century B.C. the Greek mathematician and philosopher Pythagoras and his followers proclaimed that nature was numbers. Plato envisioned a higher realm of ideal forms, of perfect chairs, circles or galaxies, of which the phenomena of the sensible world were just flawed reflections. Plato set a transcendent tone that has been popular, especially with mathematicians and theoretical physicists, ever since.
Steven Weinberg, a Nobel laureate from the University of Texas, Austin, described himself in an e-mail message as “pretty Platonist,” saying he thinks the laws of nature are as real as “the rocks in the field.” The laws seem to persist, he wrote, “whatever the circumstance of how I look at them, and they are things about which it is possible to be wrong, as when I stub my toe on a rock I had not noticed.”
The ultimate Platonist these days is Max Tegmark, a cosmologist at the Massachusetts Institute of Technology. In talks and papers recently he has speculated that mathematics does not describe the universe — it is the universe.
Dr. Tegmark maintains that we are part of a mathematical structure, albeit one gorgeously more complicated than a hexagon, a multiplication table or even the multidimensional symmetries that describe modern particle physics. Other mathematical structures, he predicts, exist as their own universes in a sort of cosmic Pythagorean democracy, although not all of them would necessarily prove to be as rich as our own.
“Everything in our world is purely mathematical — including you,” he wrote in New Scientist.
This would explain why math works so well in describing the cosmos. It also suggests an answer to the question that Stephen Hawking, the English cosmologist, asked in his book, “A Brief History of Time”: “What is it that breathes fire into the equations and makes a universe for them to describe?” Mathematics itself is on fire.
Not every physicist pledges allegiance to Plato. Pressed, these scientists will describe the laws more pragmatically as a kind of shorthand for nature’s regularity. Sean Carroll, a cosmologist at the California Institute of Technology, put it this way: “A law of physics is a pattern that nature obeys without exception.”
Plato and the whole idea of an independent reality, moreover, took a shot to the mouth in the 1920s with the advent of quantum mechanics. According to that weird theory, which, among other things, explains why our computers turn on every morning, there is an irreducible randomness at the microscopic heart of reality that leaves an elementary particle, an electron, say, in a sort of fog of being everywhere or anywhere, or being a wave or a particle, until some measurement fixes it in place.
In that case, according to the standard interpretation of the subject, physics is not about the world at all, but about only the outcomes of experiments, of our clumsy interactions with that world. But 75 years later, those are still fighting words. Einstein grumbled about God not playing dice.
Steven Weinstein, a philosopher of science at the University of Waterloo, in Ontario, called the phrase “law of nature” “a kind of honorific” bestowed on principles that seem suitably general, useful and deep. How general and deep the laws really are, he said, is partly up to nature and partly up to us, since we are the ones who have to use them.
But perhaps, as Dr. Davies complains, Plato is really dead and there are no timeless laws or truths. A handful of poet-physicists harkening for more contingent nonabsolutist laws not engraved in stone have tried to come up with prescriptions for what John Wheeler, a physicist from Princeton and the University of Texas in Austin, called “law without law.”
As one example, Lee Smolin, a physicist at the Perimeter Institute for Theoretical Physics, has invented a theory in which the laws of nature change with time. It envisions universes nested like Russian dolls inside black holes, which are spawned with slightly different characteristics each time around. But his theory lacks a meta law that would prescribe how and why the laws change from generation to generation.
Holger Bech Nielsen, a Danish physicist at the Niels Bohr Institute in Copenhagen, and one of the early pioneers of string theory, has for a long time pursued a project he calls Random Dynamics, which tries to show how the laws of physics could evolve naturally from a more general notion he calls “world machinery.”
On his Web site, Random Dynamics, he writes, “The ambition of Random Dynamics is to ‘derive’ all the known physical laws as an almost unavoidable consequence of a random fundamental ‘world machinery.’”
Dr. Wheeler has suggested that the laws of nature could emerge “higgledy-piggledy” from primordial chaos, perhaps as a result of quantum uncertainty. It’s a notion known as “it from bit.” Following that logic, some physicists have suggested we should be looking not so much for the ultimate law as for the ultimate program.
Anton Zeilinger, a physicist and quantum trickster at the University of Vienna, and a fan of Dr. Wheeler’s idea, has speculated that reality is ultimately composed of information. He said recently that he suspected the universe was fundamentally unpredictable.
I love this idea of intrinsic randomness for much the same reason that I love the idea of natural selection in biology, because it and only it ensures that every possibility will be tried, every circumstance tested, every niche inhabited, every escape hatch explored. It’s a prescription for novelty, and what more could you ask for if you want to hatch a fecund universe?
But too much fecundity can be a problem. Einstein hoped that the universe was unique: given a few deep principles, there would be only one consistent theory. So far Einstein’s dream has not been fulfilled. Cosmologists and physicists have recently found themselves confronted by the idea of the multiverse, with zillions of universes, each with different laws, occupying a vast realm known in the trade as the landscape.
In this case there is a meta law — one law or equation, perhaps printable on a T-shirt — to rule them all. This prospective lord of the laws would be string theory, the alleged theory of everything, which apparently has 10^500 solutions. Call it Einstein’s nightmare.
But it is too soon for any Einsteinian to throw in his or her hand. Since cosmologists don’t know how the universe came into being, or even have a convincing theory, they have no way of addressing the conundrum of where the laws of nature come from or whether those laws are unique and inevitable or flaky as a leaf in the wind.
These kinds of speculation are fun, but they are not science, yet. “Philosophy of science is about as useful to scientists as ornithology is to birds,” goes the saying attributed to Richard Feynman, the late Caltech Nobelist, and repeated by Dr. Weinberg.
Maybe both alternatives — Plato’s eternal stone tablet and Dr. Wheeler’s higgledy-piggledy process — will somehow turn out to be true. The dichotomy between forever and emergent might turn out to be as false eventually as the dichotomy between waves and particles as a description of light. Who knows?
The law of no law, of course, is still a law.
When I was young and still had all my brain cells I was a bridge fan, and one hand I once read about in the newspaper bridge column has stuck with me as a good metaphor for the plight of the scientist, or of the citizen cosmologist. The winning bidder had overbid his hand. When the dummy cards were laid, he realized that his only chance of making his contract was if his opponents’ cards were distributed just so.
He could have played defensively, to minimize his losses. Instead he played as if the cards were where they had to be. And he won.
We don’t know, and might never know, if science has overbid its hand. When in doubt, confronted with the complexities of the world, scientists have no choice but to play their cards as if they can win, as if the universe is indeed comprehensible. That is what they have been doing for more than 2,000 years, and they are still winning.
Plane vs. Conveyer Belt
Reply #6 on:
February 04, 2008, 12:08:54 PM »
Plane vs. Conveyer Belt: Hell Yeah the Plane Takes Off
by Higgins - January 31, 2008 - 4:20 PM
Last night the Discovery show Mythbusters settled a longstanding debate: whether an airplane on a conveyer belt (running at the same speed, but in the opposite direction as the plane) can take off. The short answer, as liveblogged by Jason Kottke:
HELL YEAH THE PLANE TAKES OFF
It’s a curious problem. As a thought experiment, it seems (at least to me) like the plane shouldn’t take off, since it’s not gaining takeoff velocity relative to the ground. But according to, you know, SCIENCE, the plane doesn’t need to reach takeoff velocity relative to the ground — it just needs an appropriate amount of lift. It’s the velocity of the air relative to the wings that counts, which is generated by the action of the engines.
Despite explanations of this sort from physicists, the issue wasn’t really settled until last night’s Mythbusters episode — they replicated the experiment on a small scale, then with a real airplane (albeit an ultralight), using a huge tarp dragged by a truck as the “conveyer belt.” Even the plane’s pilot thought the plane wouldn’t take off. When Jason Kottke first blogged about the issue last February, his comment thread was hot with controversy. So Kottke tuned in to Mythbusters last night and liveblogged the event, with results visible above. His exuberance over the plane’s liftoff has resulted in a “HELL YEAH THE PLANE TAKES OFF” tee-shirt available starting at $18. Wow.
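The physics the blog is gesturing at can be sketched with the standard lift equation. The numbers below are made-up, roughly ultralight-sized values, purely for illustration; the point is that ground speed never appears in the formula:

```python
def lift_newtons(air_density: float, airspeed: float,
                 wing_area: float, lift_coeff: float) -> float:
    """Standard lift equation: L = 0.5 * rho * v^2 * S * C_L.
    v is the speed of the air over the wings, NOT speed over the ground."""
    return 0.5 * air_density * airspeed**2 * wing_area * lift_coeff

# Assumed illustrative values: sea-level air, 15 m^2 wing, C_L = 1.2
rho, S, cl = 1.225, 15.0, 1.2
for v in (0.0, 10.0, 20.0):  # airspeed in m/s
    print(f"airspeed {v:4.1f} m/s -> lift {lift_newtons(rho, v, S, cl):7.1f} N")
```

A conveyor belt changes wheel speed, not airspeed, so as long as the engines pull air over the wings, lift builds and the plane takes off.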
Watch the Mythbusters clip in question below…. (Note: if this clip is pulled down, I’ll try to dig up another.)
See <http://www.mentalfloss.com/blogs/archives/11750> for the Mythbusters clip...
Reply #7 on:
February 25, 2008, 01:26:35 PM »
Electron Filmed for First Time - Yahoo! News
Reply #8 on:
March 29, 2008, 10:14:23 AM »
Asking a Judge to Save the World, and Maybe a Whole Lot More
More fighting in Iraq. Somalia in chaos. People in this country can’t afford their mortgages and in some places now they can’t even afford rice.
None of this nor the rest of the grimness on the front page today will matter a bit, though, if two men pursuing a lawsuit in federal court in Hawaii turn out to be right. They think a giant particle accelerator that will begin smashing protons together outside Geneva this summer might produce a black hole or something else that will spell the end of the Earth — and maybe the universe.
Scientists say that is very unlikely — though they have done some checking just to make sure.
The world’s physicists have spent 14 years and $8 billion building the Large Hadron Collider, in which the colliding protons will recreate energies and conditions last seen a trillionth of a second after the Big Bang. Researchers will sift the debris from these primordial recreations for clues to the nature of mass and new forces and symmetries of nature.
But Walter L. Wagner and Luis Sancho contend that scientists at the European Center for Nuclear Research, or CERN, have played down the chances that the collider could produce, among other horrors, a tiny black hole, which, they say, could eat the Earth. Or it could spit out something called a “strangelet” that would convert our planet to a shrunken dense dead lump of something called “strange matter.” Their suit also says CERN has failed to provide an environmental impact statement as required under the National Environmental Policy Act.
Although it sounds bizarre, the case touches on a serious issue that has bothered scholars and scientists in recent years — namely how to estimate the risk of new groundbreaking experiments and who gets to decide whether or not to go ahead.
The lawsuit, filed March 21 in Federal District Court, in Honolulu, seeks a temporary restraining order prohibiting CERN from proceeding with the accelerator until it has produced a safety report and an environmental assessment. It names the federal Department of Energy, the Fermi National Accelerator Laboratory, the National Science Foundation and CERN as defendants.
According to a spokesman for the Justice Department, which is representing the Department of Energy, a scheduling meeting has been set for June 16.
Why should CERN, an organization of European nations based in Switzerland, even show up in a Hawaiian courtroom?
In an interview, Mr. Wagner said, “I don’t know if they’re going to show up.” CERN would have to voluntarily submit to the court’s jurisdiction, he said, adding that he and Mr. Sancho could have sued in France or Switzerland, but to save expenses they had added CERN to the docket here. He claimed that a restraining order on Fermilab and the Energy Department, which helps to supply and maintain the accelerator’s massive superconducting magnets, would shut down the project anyway.
James Gillies, head of communications at CERN, said the laboratory as yet had no comment on the suit. “It’s hard to see how a district court in Hawaii has jurisdiction over an intergovernmental organization in Europe,” Mr. Gillies said.
“There is nothing new to suggest that the L.H.C. is unsafe,” he said, adding that its safety had been confirmed by two reports, with a third on the way, and would be the subject of a discussion during an open house at the lab on April 6.
“Scientifically, we’re not hiding away,” he said.
But Mr. Wagner is not mollified. “They’ve got a lot of propaganda saying it’s safe,” he said in an interview, “but basically it’s propaganda.”
In an e-mail message, Mr. Wagner called the CERN safety review “fundamentally flawed” and said it had been initiated too late. The review process violates the European Commission’s standards for adhering to the “Precautionary Principle,” he wrote, “and has not been done by ‘arms length’ scientists.”
Physicists in and out of CERN say a variety of studies, including an official CERN report in 2003, have concluded there is no problem. But just to be sure, last year the anonymous Safety Assessment Group was set up to do the review again.
“The possibility that a black hole eats up the Earth is too serious a threat to leave it as a matter of argument among crackpots,” said Michelangelo Mangano, a CERN theorist who said he was part of the group. The others prefer to remain anonymous, Mr. Mangano said, for various reasons. Their report was due in January.
This is not the first time around for Mr. Wagner. He filed similar suits in 1999 and 2000 to prevent the Brookhaven National Laboratory from operating the Relativistic Heavy Ion Collider. Those suits were dismissed in 2001. The collider, which smashes together gold ions in the hopes of creating what is called a “quark-gluon plasma,” has been operating without incident since 2000.
Mr. Wagner, who lives on the Big Island of Hawaii, studied physics and did cosmic ray research at the University of California, Berkeley, and received a doctorate in law from what is now known as the University of Northern California in Sacramento. He subsequently worked as a radiation safety officer for the Veterans Administration.
Mr. Sancho, who describes himself as an author and researcher on time theory, lives in Spain, probably in Barcelona, Mr. Wagner said.
Doomsday fears have a long, if not distinguished, pedigree in the history of physics. At Los Alamos before the first nuclear bomb was tested, Emil Konopinski was given the job of calculating whether or not the explosion would set the atmosphere on fire.
The Large Hadron Collider is designed to fire up protons to energies of seven trillion electron volts before banging them together. Nothing, indeed, will happen in the CERN collider that does not happen 100,000 times a day from cosmic rays in the atmosphere, said Nima Arkani-Hamed, a particle theorist at the Institute for Advanced Study in Princeton.
What is different, physicists admit, is that the fragments from cosmic rays will go shooting harmlessly through the Earth at nearly the speed of light, but anything created when the beams meet head-on in the collider will be born at rest relative to the laboratory and so will stick around and thus could create havoc.
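The cosmic-ray comparison can be quantified. For a cosmic-ray proton striking a proton at rest, the centre-of-mass energy in the ultra-relativistic limit is sqrt(s) ≈ sqrt(2 · E_lab · m_p c²), so matching the LHC's design collisions requires a lab-frame energy of about 10^17 eV, well below the highest-energy cosmic rays ever observed (~10^20 eV). A minimal sketch:

```python
# Fixed-target equivalent of the LHC's collisions.
# sqrt(s) = sqrt(2 * E_lab * m_p c^2)  =>  E_lab = s / (2 * m_p c^2)
M_P = 0.938272e9      # proton rest energy, eV
SQRT_S_LHC = 14.0e12  # LHC design centre-of-mass energy (7 TeV per beam), eV

e_lab = SQRT_S_LHC**2 / (2 * M_P)
print(f"Equivalent fixed-target cosmic-ray energy: {e_lab:.1e} eV")
```

Collisions at this energy and beyond have been happening in the upper atmosphere for billions of years; the difference the article notes is only that the LHC's debris is born nearly at rest in the lab frame.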
The new worries are about black holes, which, according to some variants of string theory, could appear at the collider. That possibility, though a long shot, has been widely ballyhooed in many papers and popular articles in the last few years, but would they be dangerous?
According to a paper by the cosmologist Stephen Hawking in 1974, they would rapidly evaporate in a poof of radiation and elementary particles, and thus pose no threat. No one, though, has seen a black hole evaporate.
As a result, Mr. Wagner and Mr. Sancho contend in their complaint, black holes could really be stable, and a micro black hole created by the collider could grow, eventually swallowing the Earth.
But William Unruh, of the University of British Columbia, whose paper exploring the limits of Dr. Hawking’s radiation process was referenced on Mr. Wagner’s Web site, said they had missed his point. “Maybe physics really is so weird as to not have black holes evaporate,” he said. “But it would really, really have to be weird.”
Lisa Randall, a Harvard physicist whose work helped fuel the speculation about black holes at the collider, pointed out in a paper last year that black holes would probably not be produced at the collider after all, although other effects of so-called quantum gravity might appear.
As part of the safety assessment report, Dr. Mangano and Steve Giddings of the University of California, Santa Barbara, have been working intensely for the last few months on a paper exploring all the possibilities of these fearsome black holes. They think there are no problems but are reluctant to talk about their findings until they have been peer reviewed, Dr. Mangano said.
Dr. Arkani-Hamed said concerning worries about the death of the Earth or universe, “Neither has any merit.” He pointed out that because of the dice-throwing nature of quantum physics, there was some probability of almost anything happening. There is some minuscule probability, he said, “the Large Hadron Collider might make dragons that might eat us up.”
Reply #9 on: July 14, 2009, 11:13:09 PM »
How to map the multiverse
04 May 2009 by Anil Ananthaswamy
BRIAN GREENE spent a good part of the last decade extolling the virtues of string theory. He dreamed that one day it would provide physicists with a theory of everything that would describe our universe - ours and ours alone. His bestselling book The Elegant Universe eloquently captured the quest for this ultimate theory.
"But the fly in the ointment was that string theory allowed for, in principle, many universes," says Greene, who is a theoretical physicist at Columbia University in New York. In other words, string theory seems equally capable of describing universes very different from ours. Greene hoped that something in the theory would eventually rule out most of the possibilities and single out one of these universes as the real one: ours.
So far, it hasn't - though not for any lack of trying. As a result, string theorists are beginning to accept that their ambitions for the theory may have been misguided. Perhaps our universe is not the only one after all. Maybe string theory has been right all along.
Greene, certainly, has had a change of heart. "You walk along a number of pathways in physics far enough and you bang into the possibility that we are one universe of many," he says. "So what do you do? You smack yourself in the head and say, 'Ah, maybe the universe is trying to tell me something.' I have personally undergone a sort of transformation, where I am very warm to this possibility of there being many universes, and that we are in the one where we can survive."
We keep banging into the possibility that we are one universe of many. Maybe that's telling us something
Greene's transformation is emblematic of a profound change among the majority of physicists. Until recently, many were reluctant to accept this idea of the "multiverse", or were even belligerent towards it. However, recent progress in both cosmology and string theory is bringing about a major shift in thinking. Gone is the grudging acceptance or outright loathing of the multiverse. Instead, physicists are starting to look at ways of working with it, and maybe even trying to prove its existence.
If such ventures succeed, our universe will go the way of Earth - from seeming to be the centre of everything to being exposed as just a backwater in a far vaster cosmos. And just as we are unable to deduce certain aspects of Earth from first principles - such as its radius or distance from the sun - we will have to accept that some things about our universe are a random accident, inexplicable except in the context of the multiverse.
One of the first to argue for a multiverse was Russian physicist Andrei Linde, now at Stanford University in California. In the 1980s, Linde extended and improved upon an idea called inflation, which suggests that the universe underwent a period of exponential expansion in the first fractions of a second after the big bang. Inflation successfully explains why the universe looks pretty much the same in all directions, and why space-time is "flat", despite Einstein showing that it can just as easily be curved.
Linde realised that inflation could be ongoing or "eternal", in the sense that once space-time starts inflating, it can stop in some parts (such as ours) yet take off with renewed vigour elsewhere. This process continues ad infinitum, giving rise to a patchwork of regions of space, each with different properties. When and how inflation ceases in a particular patch dictates the exact nature and types of fundamental particles there and the laws of physics that govern their behaviour. Over time, eternal inflation gives rise to just about every possible type of universe predicted by string theory. Our universe, argues Linde, is a part of this multiverse.
It wasn't until 1998, however, that the multiverse gained any traction, when astronomers studying distant supernovae announced that the expansion of the universe is accelerating. They put this down to the vacuum of space having a small energy density, which exerts a repulsive force to counteract gravity as the universe ages. This became known as dark energy, or the cosmological constant.
Its discovery was a huge blow. Up till then, physicists had hoped that some ultimate theory would deduce the values of fundamental constants of nature from first principles, including the cosmological constant, and explain why the laws of physics are as they are, just right for the formation of stars and galaxies and possibly the emergence of life. This seems not to be the case. Nothing in string theory, or indeed any other theory in physics, can predict the observed value of the cosmological constant.
However, if our universe is part of a multiverse then we can ascribe the value of the cosmological constant to an accident. The same goes for other aspects of our universe, such as the mass of the electron. The idea is simply that each universe's laws of physics and fundamental constants are randomly determined, and we just happen to live in one where these are suited for life. "If not for the multiverse, you would have these unsolved problems at every corner," says Linde.
The other compelling argument for a multiverse comes from string theory. This maintains that all fundamental particles of matter and forces of nature arise from the vibration of tiny strings in 10 dimensions. For us not to notice the extra six dimensions of space, they must be curled up, or compactified, so small as to be undetectable. For decades, mathematicians toiled over what different forms this compactification could take, and they found myriad ways of scrunching up space-time - a staggering 10^500 or more.
Each form gives rise to a different vacuum of space-time, and hence a different universe - with its own vacuum energy, fundamental particles and laws of physics. The hope, nurtured by Greene and others, was that there was some kind of uniqueness principle that would pick out the particular form of space-time that produces our universe.
That hope has since receded dramatically. In 2004, Michael Douglas of the State University of New York in Stony Brook, and Leonard Susskind of Stanford University surveyed the developments in string theory to date and concluded that all these theoretical varieties of space-time should be taken seriously as physical realities - that is, they point to a multiverse. Susskind coined the term "the landscape of string theory" to describe the 10^500 or more different universes. Nothing in string theory suggests that any one of these universes is preferred over others. Rather, it appears all are equally likely.
Together, dark energy and string theory are making physicists see the multiverse anew. "Just about everybody is convinced that the idea of uniqueness has gone down the drain," says Susskind. So what are we to do? Throw up our hands and admit that we will never be able to explain why our universe is the way it is?
Exploring the landscape
Not a bit of it. Susskind argues that we can still ask meaningful questions within the context of the multiverse, just not the ones we'd ask if ours were the only universe. Questions such as: can we identify the exact point in the landscape that corresponds to our universe, or at least the parts of the landscape that most closely resemble our universe? Is it possible to tell which of our universe's properties can be derived from first principles and which ones are random?
We can still ask meaningful questions about the universe, just not the ones we'd ask if it were unique
Also, can we find parts of the landscape with the right conditions for eternal inflation to take place? After all, the landscape and eternal inflation are independent concepts. Confirming that they are compatible would lend more credence to the multiverse idea.
These are not trivial questions to answer, but string theorists are rising to the challenge by feverishly exploring the landscape. Investigating a collection of 10^500 universes is not a matter of enumerating the properties of each of them, however. "We just can't make a list of 10^500 things," says Nobel laureate Steven Weinberg of the University of Texas at Austin. "That's more than the number of atoms in the observable universe."
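Weinberg's comparison is easy to make concrete, since Python's arbitrary-precision integers handle the numbers directly (the ~10^80 atom count is the usual rough estimate, assumed here):

```python
# Weinberg's point in numbers: 10^500 dwarfs the atom count of the
# observable universe (a commonly quoted rough estimate is ~10^80).
landscape_vacua = 10 ** 500
atoms_in_observable_universe = 10 ** 80

ratio = landscape_vacua // atoms_in_observable_universe
print(len(str(ratio)) - 1)  # 420: even the ratio is itself ~10^420
```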
The first line of attack has been to develop mathematical models of the landscape. These describe the landscape as a terrain of hills and valleys, where each valley represents a place with its own parameters (such as the mass of the electron) and fields (such as gravity).
How does a universe develop according to this scenario, and what can it tell us about ours? Imagine the universe as it starts off as a speck of space-time. This baby universe is filled with fields, whose properties change due to quantum fluctuations. If the conditions are ripe for inflation, the speck will grow and this will alter its nature. Depending on the changing environment inside the emerging universe, the inflationary process could grind to a halt, continue apace or even spawn other specks of space-time.
According to the landscape picture, the baby universe starts off in one valley. Quantum fluctuations can then cause the entire universe to "tunnel" through an adjoining hill, eventually ending up in another valley with different properties. This process continues, with the universe tunnelling from valley to valley, until it reaches a place stable enough for inflation to run its full course.
Given this scenario, one of the most important tasks is reconciling eternal inflation with the landscape. "The whole picture can be boiled down to one issue: is there eternal inflation in the landscape?" says Henry Tye of Cornell University in Ithaca, New York. In Linde's model of eternal inflation, the speck of space-time starts off with high energy density. The energy density slowly falls as space-time inflates. The quest is to find configurations of space-time among the 10^500 that match Linde's requirements for eternal inflation.
Until recently, this had seemed impossible. Then, last year, Eva Silverstein and Alexander Westphal of Stanford University identified two places within the landscape for Linde's version of eternal inflation to take place (Physical Review D, vol 78, p 106003).
It's a promising start, but Tye argues that eternal inflation within string theory is not a done deal. Physicists could just as well start with string theory models of the universe with entirely different initial conditions that would lead to inflation, though not eternal inflation.
Experiments are the key to answering such concerns, by testing the predictions of the various alternative theories. For instance, the energy density in the model proposed by Silverstein is high enough to create strong gravitational waves, ripples in space-time generated by the rapid expansion of the universe. Such waves could have polarised the photons of the cosmic microwave background, the radiation left over from the big bang, and such an imprint would still be detectable today. The European Space Agency's Planck satellite, due to launch soon, will look for any polarisation.
If Planck sees it, then it will lend support to Silverstein's models and eternal inflation. But even if experiments like Planck do lend support for eternal inflation, theorists will need independent confirmation for the ideas of string theory. Unfortunately no specific predictions of string theory are yet within experimental reach, but there is one key general property that could be confirmed soon. String theory requires that the universe has a property known as supersymmetry, which posits that every particle known to physicists has a heavier and as yet unseen superpartner. Physicists will be looking for some of these superpartners at the Large Hadron Collider, the new particle accelerator at CERN, near Geneva, Switzerland.
The scenario of a universe tunnelling through the landscape also makes a unique prediction. If our universe emerged after tunnelling in this way, then the theory predicts that space-time today will be ever so slightly curved. That's because in this scenario, inflation does not last long enough to make the universe totally flat.
Today's measurements show the universe to be flat, but the uncertainty in those measurements still leaves room for space-time to be slightly curved - either like a saddle (negatively curved) or like a sphere (positively curved). "If we originated from a tunnelling event from an ancestor vacuum, the bet would be that the universe is negatively curved," says Susskind. "If it turns out to be positively curved, we'd be very confused. That would be a setback for these ideas, no question about it."
Until any such setback the smart money will remain with the multiverse and string theory. "It has the best chance of anything we know to be right," Weinberg says of string theory. "There's an old joke about a gambler playing a game of poker," he adds. "His friend says, 'Don't you know this game is crooked, and you are bound to lose?' The gambler says, 'Yes, but what can I do, it's the only game in town.' We don't know if we are bound to lose, but even if we suspect we may, it is the only game in town."
Anil Ananthaswamy is a consulting editor for New Scientist
Reply #10 on: September 10, 2009, 10:47:18 AM »
Evaporating, Extra Dimensional Black Holes?
Reply #11 on: September 16, 2009, 02:54:08 PM »
HUNTING HIDDEN DIMENSIONS
Black holes, giant and tiny, may reveal new realms of space. By Diana Steele, September 26th, 2009; Vol. 176 #7 (p. 22)
Black hole blast: The creation of black holes in the Large Hadron Collider, which will smash protons together at nearly the speed of light, would indicate the existence of extra dimensions. A simulation of one possible fingerprint of a black hole in the collider's Compact Muon Solenoid detector shows colored cones representing different particle types, with bar lengths indicating particles' energy intensity. (Image: The CMS Collaboration)
In many ways, black holes are science’s answer to science fiction. As strange as anything from a novelist’s imagination, black holes warp the fabric of spacetime and imprison light and matter in a gravitational death grip. Their bizarre properties make black holes ideal candidates for fictional villainy. But now black holes are up for a different role: heroes helping physicists assess the real-world existence of another science fiction favorite — hidden extra dimensions of space.
Astrophysical giants several times the mass of the sun and midget black holes smaller than a subatomic particle could provide glimpses of an extra-dimensional existence.
Out in space, astrophysicists are looking hard to see if large black holes are shrinking on a time scale that might be detected by modern telescopes. If so, it might mean the black holes are evaporating into extra dimensions.
In the laboratory, black holes far smaller than anything that could be seen with a microscope might be produced in Europe’s Large Hadron Collider after it starts running again in November (SN: 7/19/08, p. 16). The detection of such a black hole, which would evaporate in a hail of subatomic particles in a tiny fraction of a second, would provide evidence that unseen dimensions of space exist.
What makes either of these ideas even plausible is a bold theory put forth just over 10 years ago that purports to explain the weakness of gravity by supposing that some of it is leaking out into extra dimensions.
Gravity feels strong to humans because it makes climbing hills hard. But one of the fundamental paradoxes about gravity is demonstrated by the fact that an ordinary refrigerator magnet can pick up a paperclip — counteracting the entire mass of the Earth pulling down on the clip.
Physicists call this the “hierarchy problem,” referring to the fact that all the other forces of nature are more than 30 orders of magnitude stronger than gravity.
“It’s hard to explain such a huge number from any mathematical postulate or any physical principle,” says Greg Landsberg, a theoretical physicist at Brown University in Providence, R.I. “It’s a bit of an embarrassment for our field, because what it really means is, we don’t seem to understand gravity.”
Measuring extra dimensions
Isaac Newton declared in the 17th century that gravity gets weaker by the square of the distance between two objects. If the moon were twice as far from Earth, it would feel one-quarter the gravity.
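The inverse-square relationship in the moon example above is a one-liner; a minimal sketch:

```python
# Newton's inverse-square law: gravitational pull scales as 1/r^2.
def gravity_ratio(distance_factor):
    """Relative force after multiplying the separation by distance_factor."""
    return 1.0 / distance_factor ** 2

print(gravity_ratio(2))   # 0.25 -- twice as far, one-quarter the gravity
print(gravity_ratio(10))  # 0.01 -- ten times farther, 1% of the gravity
```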
But in 1998, theoretical physicists Nima Arkani-Hamed, Savas Dimopoulos and Gia Dvali pointed out that gravity had never been measured below a distance of about a millimeter. Suppose, they suggested, that gravity differs from Newtonian expectations at distances smaller than a millimeter.
That could happen if there are extra dimensions of space that gravity leaks into. These hidden dimensions might be shaped, for example, like the circumference of a hose. From a distance, the hose looks like a one-dimensional line, but seen up close, it has a curled-up second dimension. Arkani-Hamed, Dimopoulos and Dvali — whose model is known as ADD, short for their names — suggest that there could be extra dimensions as large as a millimeter in diameter.
“In principle, the extra dimensions can be so small, like trillions and trillions of times smaller than a millimeter, and that’s what string theory predicts,” says theoretical astrophysicist Dimitrios Psaltis of the University of Arizona in Tucson. But “if you introduce those large extra dimensions, then gravity can get diluted in some way.”
Gravity may spread into the extra dimensions while the other known forces and particles are confined to the three familiar spatial dimensions. So gravity could be just as strong as the other forces — but only felt strongly at short distances.
The universe as a flatland: The known universe could be very thin in an extra dimension beyond the familiar three dimensions of space. (Image: J. Korenblat; NASA)
Tiny curled extra dimensions aren’t the only possibility. In 1999, theoretical physicists Lisa Randall and Raman Sundrum proposed that one extra dimension might stretch out to infinity. If either theory is true, it would also mean that at very small distances, gravity would be much stronger than Newton’s prediction.
The idea of “large” extra dimensions sent experimental physicists scrambling.
So far, physicists using sensitive small-scale experiments have measured the force of gravity at distances just under 50 micrometers and haven’t found any deviation from Newton’s law yet. But they keep looking.
Shrinking black holes
Black holes, as the most gravitationally dense objects in the universe, might provide another way of testing the extra-dimension hypotheses. Black holes know a thing or two about gravity; the trick is getting them to reveal their secrets.
In the 1970s, theoretical physicist Stephen Hawking calculated that black holes actually lose mass. That mass vanishes over time in the form of what’s now called Hawking radiation. “Over time” generally means over billions of years, like the age of the universe. The larger the black hole is, the more slowly it shrinks. But as it gets smaller, the evaporation rate accelerates.
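In the standard four-dimensional version of Hawking's result, the evaporation time grows as the cube of the mass: t = 5120 π G² M³ / (ħ c⁴). The extra-dimensional variants discussed here modify that rate; the sketch below is only the plain 4D baseline, using standard values for the constants:

```python
import math

# Plain four-dimensional Hawking evaporation: t = 5120*pi*G^2*M^3/(hbar*c^4).
# Extra-dimensional models change this rate; this sketch is the 4D baseline.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
SECONDS_PER_YEAR = 3.156e7

def evaporation_time_years(mass_kg):
    """Approximate 4D Hawking evaporation time, in years."""
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return t_seconds / SECONDS_PER_YEAR

solar_mass_kg = 1.989e30
print(f"{evaporation_time_years(solar_mass_kg):.1e} years")  # ~2e67 years
# Cubic scaling: a hole 10x as massive survives 1000x as long.
```

The ~10^67-year figure for a solar-mass black hole is why "over time" here means vastly longer than the age of the universe.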
And if there are extra dimensions of the Randall-Sundrum type, astrophysical black holes might emit gravity waves into these other dimensions and shrink faster than otherwise expected. So, Psaltis thought, finding a small black hole that’s really old would limit the size of the extra dimensions. “If you notice that a black hole lived, for example, a hundred million years,” Psaltis says, “that means that it couldn’t have evaporated, couldn’t have lost its mass really, really fast.”
But finding out the age and weight of a black hole is about as tricky as discovering that of a vain movie star. So Psaltis tried to find a way to get the black hole to reveal a little bit more about itself.
He found a black hole that looked like it had been kicked out of the plane of the Milky Way galaxy following a violent supernova explosion, like a fastball hit over the wall at Fenway Park. Since the black hole would have been born in the explosion, Psaltis could estimate its age by measuring how fast it and its companion star were zooming away from the galaxy, then backtracking to find out how long ago it had been ejected.
He calculated that this particular black hole, J1118+480, was a minimum of 11 million years old. Using that age and an estimated mass, Psaltis put an upper limit of 80 micrometers on the size of any extra dimensions, as he reported in Physical Review Letters in 2007.
Tim Johanssen, Psaltis’ graduate student, came up with another idea for measuring whether black holes are losing weight, one that doesn’t depend on knowing their ages. Most black holes a few times the mass of the sun have been detected because they orbit a companion star. The masses of the star and the black hole, as well as the distance between them, determine how fast the two rotate around each other, like Olympic pair skaters spinning around each other in a death spiral. If the mass of the black hole is changing, the rate at which it and its companion orbit each other, called the orbital period, changes as well.
Johanssen calculated how quickly a black hole would have to lose mass in order to see a noticeable difference in the orbital period. “Just from normal astrophysical mechanisms, we would expect [the period] to halve or double at a time-scale on the order of the age of the universe, billions of years,” says Psaltis. “If extra dimensions exist, and they are as large as, say, a tenth of a millimeter, then that time scale goes down to about several millions of years. Which means that if you make an observation over a year, you expect a change in the orbital period of a few parts per million. This is tiny, but this is something that modern observations of binary systems can actually do.”
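The scale of the effect Psaltis describes can be roughly reproduced: if the orbital period halves over a characteristic time T, the fractional change accumulated in a single year is about ln(2)/T, valid when T is much longer than a year. A back-of-envelope sketch, with the timescales treated as order-of-magnitude assumptions:

```python
import math

# If a binary's orbital period halves over a characteristic time T, the
# fractional change accumulated in one year is roughly ln(2) / T
# (valid when T is much longer than a year).
def fractional_change_per_year(halving_time_years):
    return math.log(2) / halving_time_years

# Ordinary astrophysics: halving over ~the age of the universe.
print(f"{fractional_change_per_year(1e10):.1e}")  # ~7e-11 per year
# With large extra dimensions: halving over a few million years.
print(f"{fractional_change_per_year(3e6):.1e}")   # ~2e-7 per year
```

That second number sits at the parts-per-ten-million to parts-per-million scale that Psaltis says modern observations of binary systems can actually resolve.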
Johanssen, Psaltis and astronomer Jeffrey McClintock of the Harvard-Smithsonian Center for Astrophysics looked closely at the best-studied black hole binary, A0620-00, which has been observed for about a decade. So far, they found, there has been no observable change in its orbital period. That let them constrain the size of the extra dimension to less than 161 micrometers. Their results appeared in February 2009 in the Astrophysical Journal.
Another researcher, Oleg Gnedin of the University of Michigan in Ann Arbor, extrapolated from Psaltis’ work. Gnedin learned of a recently discovered black hole in a globular cluster, one of the oldest groups of stars in the universe. Black holes in globular clusters are on the order of 10 billion years old. The mere existence of a black hole this old puts a very tight constraint — less than 3 micrometers — on the size of the Randall-Sundrum extra dimensions, Psaltis says. That work was published online at arXiv.org in June (SN: 8/1/09, p. 7).
Although the globular cluster work sets the tightest constraint on extra-dimension size so far, the researchers admit that it relies on a lot of assumptions.
Psaltis is pinning his hopes on observations of binary systems because, he says, they're "a measurement of what is happening right now to the black hole that we are seeing. It does not depend on the history." The fact that researchers haven't seen any changes in orbital periods so far doesn't mean the extra dimensions don't exist, he says, just that they haven't been found yet. Any change in orbital period would challenge physicists' current theory of forces and particles in the universe — called the standard model.
But even if the extra-dimension theories are correct, observers still may never find evidence of such dimensions in the astrophysical black holes. One reason may be that the extra dimensions are of the ADD variety, small and curled up, in which case these tiny dimensions make no difference to the massive black holes in outer space.
The other reason may be that black holes don’t really evaporate faster into other dimensions even if they do exist, says Randall, the Harvard theoretical physicist who coauthored two of the popular extra-dimension models. “People have suggested that the decay rates of black holes might be a way of distinguishing” between the models, she says, “but it’s not fully resolved.”
Micro black holes
It will become pretty clear that large extra dimensions exist if a micro-sized black hole happens to appear in the Large Hadron Collider, or LHC, near Geneva. That’s because if gravity really is much stronger than expected at distances around a few micrometers or so, the LHC may be able to pack enough matter and energy into a small enough space that the system will automatically collapse into a black hole.
But before anyone starts worrying about Geneva disappearing into a black hole, know that this gravitationally dense midget wouldn’t even cross the diameter of an atomic nucleus before disintegrating (SN Online: 6/24/08).
“In this sense, these black holes are completely organic,” says Landsberg. “You could put them in your salad, and you wouldn’t notice that they exist because they immediately evaporate.”
But they might make their presence known to the LHC’s detectors.
That’s the province of a number of theoretical physicists, including Glenn Starkman of Case Western Reserve University in Cleveland. Starkman led a team that developed a computer program, called BlackMax, that tells researchers what subatomic debris a black hole might leave behind as evidence.
Inside the LHC, two beams of protons will stream at speeds close to the speed of light in opposite directions around a circular tunnel. Protons are actually somewhat spread out, says Starkman, and mostly made up of subatomic particles called quarks and gluons. It’s extremely unlikely that any two of these particles will hit each other exactly head-on. But if two quarks or two gluons, or one of each, get close enough to each other as they are flying in opposite directions, there could be enough energy in a small enough space that a black hole would form — if, and only if, gravity is strong enough to start playing a role. “For that to happen,” says Starkman, “there have to be more than three dimensions.”
The black hole would evaporate almost instantaneously, perhaps in a hail of subatomic particles shooting forth in all directions, like a cherry bomb firecracker. Or perhaps researchers would see a signature event in which some of the energy disappears, carried away into other dimensions by gravitons — the invisible gravitational counterpart to the photon.
The good thing, from a theoretical physics point of view, is that if the LHC makes any black holes at all, it will make a lot of them — as many as one per second, or 30 million a year. “Now, 30 million a year may involve optimistic assumptions, but perhaps a million or a hundred thousand or even ten thousand is not impossible,” says Stanford University’s Savas Dimopoulos, the middle “D” of the ADD extra-dimension hypothesis. “Even if you have 10,000 black holes, that is a lot of events to do statistics with and to start testing in detail both the existence of the black hole and the framework of large dimensions.”
Randall, like many, is skeptical. “It’s a cute idea,” she says. But with coauthor Patrick Meade, then at Harvard and now at the Institute for Advanced Study in Princeton, N.J., she argues that the scenario is highly unlikely. Their work was published in May 2008 in the Journal of High Energy Physics.
“It’s virtually impossible that you’re going to make genuine black holes at the LHC because the energy isn’t really high enough,” she says. “You could see some evidence of interesting gravitational effects in higher dimensions in terms of how things would scatter off each other … but it seems very unlikely that you would actually have anything that is really a genuine black hole.”
Still, there are a lot of uncertainties, and until the LHC is up and running, no one will really know.
Dimopoulos, for one, remains optimistic, but he has hedged his bets. In addition to large extra dimensions, he has a stake in two other leading candidates for solving the hierarchy problem. These theories, called technicolor and supersymmetry, don’t rely on extra dimensions — and they both might show their colors at the LHC.
But chances are “that nature may choose a completely different route, and it may be that the solutions to the hierarchy problem will be something that nobody ever thought about,” he says. “And that may be the most exciting scenario for what we will discover.”
Diana Steele is a freelance science writer based in Oberlin, Ohio.
Cosmic Radiation and Warming
Reply #12 on: December 09, 2009, 02:56:12 PM »
There is a flash video I can't embed associated with this post that is well worth an hour of your life. Contrast the style of the lecturer to that of the AGW panic mongers.
Revolt of the Physicists
Climate science seemed settled in the 1990s. The only theory around was that the increase in CO2 and other greenhouse gases was causing the increase in world temperatures. But then physicists got involved. My guess is that the average physicist has an IQ of somewhere between 150 and 200. The progress that they have been making is incredible.
If you have a scientific background and you still believe in man-made global warming, get out a cup of coffee, a cup of tea, or a glass of brandy, whatever helps you think best, and watch the following lecture from CERN, one of Europe's most highly respected centers for scientific research:
[Video links: Windows Media | Flash]
This lecture by Jasper Kirkby reviews the recent research that physicists have been conducting into climate change. Physicists have discovered that changes in the rate of cosmic ray inflow cause climate change and that solar activity shields the earth from cosmic rays. They haven't completely worked out the mechanism yet, but they think it has to do with cosmic rays causing cloud formation and clouds reflecting sunlight back into space.
When Kirkby gets to the screen showing Galactic Modulation of Climate over the last 500 million years and the cosmic ray variation that explains it, take a close look at the line that plots CO2 over the same period. Note that that line doesn't correspond at all to the temperature periodicity evident in the temperature data. Also listen when Kirkby points out that CO2 concentrations used to be 10 times higher than they are today.
And don't miss the most chilling (literally) prediction of all, based on a careful study of sunspot intensity. This prediction was originally submitted and rejected for publication in 2005 (Sunspots May Vanish by 2015), but has been coming true ever since. At present, the earth appears to be headed toward a period of dramatic cooling due to reduced solar activity.
Meanwhile, clueless world leaders will be meeting at a UN Climate Change Conference in Copenhagen December 7-18 in an attempt to reduce carbon emissions in order to slow global warming.
WSJ: Dark Matter?
Reply #13 on:
December 21, 2009, 09:27:01 PM »
By LAWRENCE KRAUSS
In early December, the Cryogenic Dark Matter Search (CDMS) experiment, located in the deep Soudan Mine in northern Minnesota, leaked a tantalizing hint that it may have discovered something remarkable. The experiment is designed to directly detect new elementary particles that might make up the dark matter known to dominate our own Milky Way galaxy, all galaxies, and indeed all mass in the universe—so news of a possible breakthrough was thrilling.
The actual result? Two pulses were detected over the course of almost a year that might have been due to dark matter, CDMS announced on Dec. 17. However, there is a 25% chance that the pulses were actually caused by background radioactivity in and around the detector.
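The quoted odds are a textbook Poisson question: given some expected number of background events, how likely is it that background alone produces at least the two observed pulses? A minimal sketch in Python; the background expectation of 1.0 events below is illustrative, not the published CDMS value:

```python
import math

def prob_at_least(k, mu):
    """Poisson probability of observing k or more events
    when the expected count is mu."""
    # P(N >= k) = 1 - sum_{n < k} e^(-mu) * mu^n / n!
    return 1.0 - sum(math.exp(-mu) * mu**n / math.factorial(n)
                     for n in range(k))

# Hypothetical background expectation (illustrative, NOT the published
# CDMS number): with roughly one expected background event, the chance
# that background alone yields two or more pulses comes out near the
# article's quoted 25%.
mu_background = 1.0
print(f"P(>=2 | mu={mu_background}) = {prob_at_least(2, mu_background):.3f}")
```

This is why two events are "intriguing, but less than convincing": a quarter of the time, plain radioactivity would look just as good.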
Physicists remain fascinated by the possibility that the events at CDMS, reported on the back pages of the world's newspapers, might nevertheless be real. If they are, they will represent the culmination of one of the most incredible detective stories in the history of science.
Beginning in the 1970s, evidence began to accumulate that there was much more mass out there than meets the eye. Scientists, mostly by observing the speed of rotation of our galaxy, estimated that there was perhaps 10 times as much dark matter as visible material.
At around the same time, independent computer calculations following the possible gravitational formation of galaxies supported this idea. The calculations suggested that only some new type of material that didn't interact as normal matter does could account for the structures we see.
Meanwhile, in the completely separate field of elementary particle physics, my colleagues and I had concluded that in order to understand what we see, it is quite likely that a host of new elementary particles may exist at a scale beyond what accelerators at the time could detect. This is one of the reasons there is such excitement about the new Large Hadron Collider in Geneva, Switzerland. Last month, it finally began to produce collisions, and it might eventually directly produce these new particles.
Theorists who had proposed the existence of such particles realized that they could have been produced during the earliest moments of the fiery Big Bang in numbers that could account for the inferred abundance of dark matter today. Moreover, these new particles would have exactly the properties needed for such material. They would interact so weakly with normal matter that they could go through the Earth without a single interaction.
Emboldened by all of these arguments, a brave set of experimentalists began to devise techniques by which they might observe such particles. This required building detectors deep underground, far from the reach of most cosmic rays that would overwhelm any sensitive detector, and in clean rooms with no radioactivity that could produce a false signal.
So when the physics community heard rumors that one of these experiments had detected something, we all waited with eager anticipation. A convincing observation would vindicate almost half a century of carefully developed, if fragile, arguments suggesting a whole new invisible world waiting to be discovered.
For the theorist working at his desk alone at night, it seems almost unfathomable that nature might actually obey the delicate theories you develop on pieces of paper. This is especially true when the theories involve ideas from so many different areas of science and require leaps of imagination.
Alas, to celebrate would be premature: The reported results are intriguing, but less than convincing. Yet if the two pulses observed last week in Minnesota are followed by more signals as bigger detectors turn on in the coming year or two, it will provide serious vindication of the power of human imagination. Combined with rigorous logical inference and technological wizardry—all the things that make science worth celebrating—scientists' creativity will have uncovered hidden worlds that a century ago could not have been conceived.
If, on the other hand, the events turn out to have been mere background radioactivity, physicists will not give up. It will only force us to be more clever and more energetic as we try to unravel nature's mysteries.
Mr. Krauss is director of the Origins Institute at Arizona State University, and a theoretical physicist who has been involved in the search for dark matter for 30 years. His newest book, "Quantum Man," will appear in 2010.
Mixed Quantum State
Reply #14 on:
March 19, 2010, 12:03:43 AM »
Scientists supersize quantum mechanics
Largest ever object put into quantum state.
A quantum drum has become the first visible object to be put into a superposition of quantum states. (Image: A. Olsen/iStockphoto)
A team of scientists has succeeded in putting an object large enough to be visible to the naked eye into a mixed quantum state of moving and not moving.
Andrew Cleland at the University of California, Santa Barbara, and his team cooled a tiny metal paddle until it reached its quantum mechanical 'ground state' — the lowest-energy state permitted by quantum mechanics. They then used the weird rules of quantum mechanics to simultaneously set the paddle moving while leaving it standing still. The experiment shows that the principles of quantum mechanics can apply to everyday objects as well as to atomic-scale particles.
The work is simultaneously being published online today in Nature and presented today at the American Physical Society's meeting in Portland, Oregon [1].
According to quantum theory, particles act as waves rather than point masses on very small scales. This has dozens of bizarre consequences: it is impossible to know a particle's exact position and velocity through space, yet it is possible for the same particle to be doing two contradictory things simultaneously. Through a phenomenon known as 'superposition' a particle can be moving and stationary at the same time — at least until an outside force acts on it. Then it instantly chooses one of the two contradictory positions.
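The superposition-and-collapse rule described above can be mimicked with a toy simulation: hold two amplitudes, and on measurement pick one outcome with probability equal to the squared amplitude (the Born rule). This is a sketch of the statistics only, not of the actual experiment:

```python
import random

# A toy two-state system: amplitudes for |stationary> and |moving>.
# In an equal superposition each outcome has probability |amplitude|^2 = 0.5.
amplitudes = {"stationary": 2**-0.5, "moving": 2**-0.5}

def measure(amps, rng=random.random):
    """Collapse the superposition: return one outcome with
    probability given by its squared amplitude (Born rule)."""
    r, cumulative = rng(), 0.0
    for outcome, amp in amps.items():
        cumulative += abs(amp)**2
        if r < cumulative:
            return outcome
    return outcome  # guard against floating-point rounding

# Repeated measurements split roughly 50/50 between the two outcomes.
counts = {"stationary": 0, "moving": 0}
for _ in range(10_000):
    counts[measure(amplitudes)] += 1
print(counts)
```

Before measurement the system is genuinely in both states; the randomness only appears when an outside interaction forces a choice.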
The paddle is around 30 micrometres long. (Image: O'Connell, A. D. et al.)
But although the rules of quantum mechanics seem to apply at small scales, nobody has seen evidence of them on a large scale, where outside influences can more easily destroy fragile quantum states. "No one has shown to date that if you take a big object, with trillions of atoms in it, that quantum mechanics applies to its motion," Cleland says.
There is no obvious reason why the rules of quantum mechanics shouldn't apply to large objects. Erwin Schrödinger, one of the fathers of quantum mechanics, was so disturbed by the possibility of quantum weirdness on the large scale that he proposed his famous 'Schrödinger's cat' thought experiment. A cat is placed in a box with a vial of cyanide and a radioactive source. If the source decays, it triggers a device that will break the vial, killing the cat. During the time the box is shut, Schrödinger argued, the cat is in a superposition of alive and dead — an absurdity as far as he was concerned.
Cleland and his team took a more direct measure of quantum weirdness at the large scale. They began with a tiny mechanical paddle, or 'quantum drum', around 30 micrometres long that vibrates when set in motion at a particular range of frequencies. Next they connected the paddle to a superconducting electrical circuit that obeyed the laws of quantum mechanics. They then cooled the system down to temperatures below one-tenth of a kelvin.
At this temperature, the paddle slipped into its quantum mechanical ground state. Using the quantum circuit, Cleland and his team verified that the paddle had no vibrational energy whatsoever. They then used the circuit to give the paddle a push and saw it wiggle at a very specific energy.
Next, the researchers put the quantum circuit into a superposition of 'push' and 'don't push', and connected it to the paddle. Through a series of careful measurements, they were able to show that the paddle was both vibrating and not vibrating simultaneously.
"It's wonderful," says Hailin Wang, a physicist at the University of Oregon in Eugene who has been working on a rival technique for putting an oscillator into the ground state. The work shows that the laws of quantum mechanics hold up as expected on a large scale. "It's good for physics for sure," Wang says.
So if trillions of atoms can be put into a quantum state, why don't we see double-decker buses simultaneously stopping and going? Cleland says he believes size does matter: the larger an object, the easier it is for outside forces to disrupt its quantum state.
"The environment is this huge, complex thing," says Cleland. "It's that interaction with this incredibly complex system that makes the quantum coherence vanish."
Still, he says, there are plenty of reasons to keep trying to get large objects into quantum states. Large quantum states could tell researchers more about the relationship between quantum mechanics and gravity — something that is not well understood. And quantum resonators could be useful for something, although Cleland admits he's not entirely sure what. "There might be some interesting application," he says. "But frankly, I don't have one now."
O'Connell, A. D. et al. Nature doi:10.1038/nature08967 (2010).
Reply #15 on:
March 19, 2010, 04:46:49 AM »
That is one principle that may get us to the stars. Now we need to "macrosize" it, and figure out how to control the variables that determine the W's (the who, what, when, and where) of the return to normalcy.
Reply #16 on:
June 08, 2010, 12:37:55 PM »
Higgs Boson Found?
Reply #17 on:
July 12, 2010, 07:05:02 PM »
Large Hadron Collider rival Tevatron 'has found Higgs boson'
Rumours are emerging from the rival to the Large Hadron Collider that the Higgs boson, or so-called "God particle", has been found.
By Tom Chivers
Published: 5:24PM BST 12 Jul 2010
Tommaso Dorigo, a physicist at the University of Padua, has said in his blog that there has been talk coming out of the Fermi National Accelerator Laboratory in Batavia, Illinois, that the Higgs has been discovered.
The Tevatron, the huge particle accelerator at Fermi - the most powerful in the world after the LHC - is expected to be retired when the CERN accelerator becomes fully operational, but may have struck a final blow before it becomes obsolete.
If one form of the rumour is to be believed - and Prof Dorigo is extremely circumspect about it - then it is a "three-sigma" signature, meaning there is only about a 0.3 per cent chance that such a signal would arise from a statistical fluctuation alone. But, of course, that is only if the rumour is to be believed.
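The "three-sigma" figure is ordinary Gaussian statistics: the probability that a fluctuation stays within n standard deviations of the mean is erf(n/sqrt(2)). A quick check of the standard thresholds:

```python
import math

def sigma_to_confidence(n_sigma):
    """Two-sided Gaussian probability that a random fluctuation
    stays within n_sigma standard deviations: erf(n / sqrt(2))."""
    return math.erf(n_sigma / math.sqrt(2))

for n in (1, 2, 3, 5):
    print(f"{n} sigma -> {sigma_to_confidence(n):.5%}")

# 3 sigma corresponds to about 99.73% (a ~0.3% fluke chance);
# 5 sigma, the usual discovery threshold in particle physics,
# to about 99.99994%.
```

This is why a three-sigma effect counts as "evidence" but not "discovery": roughly one in 370 such bumps is noise.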
In the post, titled "Rumors about a light Higgs", Prof Dorigo said: "It reached my ear, from two different, possibly independent sources, that an experiment at the Tevatron is about to release some evidence of a light Higgs boson signal.
"Some say a three-sigma effect, others do not make explicit claims but talk of an unexpected result."
While media attention has been focusing on the LHC, the Tevatron has been quietly plugging away in the search for the Higgs. In the 27 years since it was first completed (it has been regularly upgraded since then) it has discovered the top quark and observed four different baryons. While it has not been able to pinpoint the elusive Higgs, it has narrowed the search, reducing the window of possible masses where it might be found.
Last year, Fermi physicists said they expected to have enough data to find or rule out the Higgs by early next year, and gave themselves a fifty-fifty chance of finding it before the end of 2010.
The Higgs boson is the last of the particles posited by the standard model of particle physics still to be found. It is said to explain why other particles have mass, and its discovery would confirm the standard model. If its existence is ruled out altogether, then other, previously less popular theories will have to be examined.
New Scientist suggests that more may be known this month, when scientists present their findings at the International Conference on High Energy Physics (ICHEP), which opens in Paris on 22 July.
Reality behind reality?
Reply #18 on:
October 20, 2010, 10:58:08 AM »
More news of other dimensions and worlds:
Last Edit: October 20, 2010, 11:00:57 AM by prentice crawford
Reply #19 on:
January 24, 2011, 07:01:19 PM »
Quantum Entanglement Could Stretch Across Time
When is a Theory not a Theory?
Reply #20 on:
March 30, 2011, 03:05:56 PM »
Physicist Fist-Fight: What’s the Deal with Strings?
Date: March 9, 2011 | Author: Brian Trent
Category: Astronomy/Space Science, General Science, Physics/Mechanics
Can one theory explain everything?
This is the oldest objective of science. We can retrace the question to the pre-Socratics, when Thales in 585 BCE first suggested that everything was ultimately made of water, and that the apparent phases of matter were simply different states of this fundamental water, this blood of the Apsu or ichor of the gods.
What Thales was attempting was a universal theory. We have since moved on from water to the discovery of four fundamental forces: the strong nuclear force, the weak nuclear force, gravity, and electromagnetism. On this quartet our understanding of the universe – however incomplete – hangs both powerfully and delicately. Can they ever be combined into a Grand Unification Theory?
Significantly, electromagnetism was once deemed to be two separate forces, electricity and magnetism, but James Clerk Maxwell, writing in the second half of the nineteenth century, united them for the scientific world.
Today we have come up with two perfectly wonderful, credible theories to tackle the nature of the universe. We have General Relativity to handle gravity, and we have the Standard Model to handle the other three. As best we can tell, both theories are correct and utterly irreconcilable with each other. It was Albert Einstein’s Holy Grail to reconcile them. He didn’t.
And so the quest continues. Yesterday in New York, seven leading physicists attended the American Museum of Natural History for the 11th annual Isaac Asimov Memorial Debate, and they discussed whether or not it was even possible to come up with the proverbial Theory of Everything.
From the article:
Many physicists say our best hope for a theory of everything is superstring theory, based on the idea that subatomic particles are actually teensy tiny loops of vibrating string. When filtered through the lens of string theory, general relativity and quantum mechanics can be made to get along.
Ah yes, string theory. The radical discipline which emerged, almost out of defiance, against the scientific establishment and purported to contain the secrets to explaining everything. The hip brand of new thinking that seemed to merge science and mysticism (more on that later.)
As Brian Greene, professor of physics and mathematics at Columbia University, says, “There’s been an enormous amount of progress in string theory. There have been issues developed and resolved that I never thought, frankly, we would be able to resolve. The progress over the last 10 years has only solidified my confidence that this is a worthwhile direction to pursue.”
I will freely admit that I am a fan of superstring theory. How can you look at it and not gape at its fascinating aesthetics? It seems very Zen, the cosmological picture of simple elegance. When physicists smash particles together and get a shower of seemingly endless smaller particles, there is something enticing about a theory that states this endlessly diverse shower is nothing more than different vibration pitches of the same underlying string-like structure. I've attended Greene's lectures and he is articulate and convincing — he has the Sagan touch for communicating ideas in a lucid and affecting manner.
There’s just one little problem:
Superstring theory isn’t a theory.
Since science works on theories and the ability to test those theories, string theory is a little too radical for its own good. String theory does not lend itself to being tested. It does not make quantifiable predictions. Without these important elements, it is not science by our prevailing definition. It becomes a thought experiment. A philosophy. Perhaps, even, a religion.
From the article:
Neil deGrasse Tyson, director of the museum’s Hayden Planetarium, suggested that string theory seems to have stalled, and contrasted the lack of progress of “legions” of string theorists with the seemingly short 10 years it took one man – Einstein – to transition from special relativity to general relativity.
“Are you chasing a ghost or is the collection of you just too stupid to figure this out?” deGrasse Tyson teased.
In fact, one of the chief components of string theory is that it requires extra dimensions to work: ten in most superstring formulations, eleven in M-theory. The instant we involve higher dimensional planes, we stray into phantasmagoric territory. The addition of dimensions becomes something like spackle. Tyson's offhand reference to ghosts is probably quite calculated and deliberate.
What do we think? If string theory is philosophy and not science, does this suggest that the tools we use are no longer practical in addressing the investigation of the cosmos? Or is it simply a matter of time before we develop those tools?
Re: Quantum Entanglement Could Stretch Across Time
Reply #21 on:
March 30, 2011, 10:23:49 PM »
Ooooooh - thinking about time travel just hurts the head. And the funny thing is that equations treat time and space the same - so why would it be that we can imagine teleporting through space so easily but time travel gets complicated so quickly?
Anyone see the low-budget scifi called Primer? Available on NetFlix - a great film considering it was made for $7000. A great film regardless. But the way it makes your head hurt.... that seems so typical of time travel.
That aside - the thing to keep in mind about that article is that it applies to quantum states, not elephants:
“You can send your quantum state into the future without traversing the middle time,” said quantum physicist S. Jay Olson of Australia’s University of Queensland, lead author of the new study.
In ordinary entanglement, two particles (usually electrons or photons) are so intimately bound that they share one quantum state — spin, momentum and a host of other variables — between them. One particle always “knows” what the other is doing. Make a measurement on one member of an entangled pair, and the other changes immediately.
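A crude classical picture of that perfect correlation can be coded, with a loud caveat: this "shared coin flip" toy reproduces only the anti-correlation itself, and by Bell's theorem no such local model can reproduce the full statistics of entangled measurements. As an illustration only:

```python
import random

def entangled_pair():
    """Toy model of a spin-anticorrelated pair: measuring one
    member immediately fixes the other's outcome.
    (Classical mimicry only -- not real quantum mechanics.)"""
    a = random.choice(("up", "down"))
    b = "down" if a == "up" else "up"
    return a, b

# Every trial yields opposite outcomes, no matter how far apart
# the two measurements happen.
for _ in range(100):
    a, b = entangled_pair()
    assert a != b
print("100 trials: outcomes perfectly anti-correlated")
```

The time-entanglement result discussed in the article extends this kind of correlation across a temporal gap rather than a spatial one.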
So you have to limit your imagination to what you know about quantum states - the Star Trek analogy involving Scotty doesn't apply so readily.
What does that get anyone? That's where the imagination really comes in and I must say fails me admirably. What kind of technology could you build with it? Undetectable signals (since they aren't present for the "middle time")? Feedforward loops (mechanisms that track states in the past in order to synchronize themselves with past events)? Past-time sensors that send data to a future receiver?
Basically they are quantum "echoes from the past" - kind of like a camera recording current events so they can be replayed later. But you aren't replaying them, you are receiving the signal directly through time.
Maybe.... you could put sensors in an environment where they wouldn't be able to send a signal or record either (for some reason) and have them instead beam the signal into the future. For example, measuring state within a device that is exploding (and would overwhelm a signal with EM or destroy a stored signal).... now why would you want to do that?
OK - how about back to quantum encryption? You encode data and send it into the future - it is effectively gone until that later time when it arrives and can be recaptured. Then it can be retransmitted forward again and the information disappears. That might be a good way to hide information.
Really, since you are stuck in quantum terms, the only thing worth considering is information - data - communications and computation.
Anyone else have a better concept for using time entanglement?
Anti-Gravity from Anti-Atoms?
Reply #22 on:
May 03, 2011, 06:27:52 PM »
Scientists could be months away from discovering antigravity
Scientists at CERN have announced that they've been able to trap 309 atoms of antihydrogen for over 15 minutes. This is long enough that soon, they'll be able to figure out whether antimatter obeys the law of gravity, or whether it's repelled by normal matter and falls "up" instead. It would be antigravity, for real.
While it's never been tested experimentally due to how difficult it is to create and store the stuff, it's disappointingly likely that antimatter will fall "down" just like regular matter. The thinking behind this is that antimatter (despite the "anti-") is made of regular ordinary energy, and even if it's got an opposite charge, it should still obey the same general rules as matter does. Antimatter falling up would mean a violation of the law of conservation of energy, among other things.
That said, if antimatter were to exhibit antigravity, it would go a long way towards explaining some of the peculiarities of our universe. For example, the universe is supposed to have just as much antimatter as it does matter, but we don't know where the antimatter is. If antimatter and normal matter repelled each other, it could mean that there are entire antimatter galaxies out there. Also, that repulsion would explain why the universe is not just expanding, but speeding up its expansion, something that's tricky to figure out when everything in the universe is always attracted towards everything else.
In either case, the team at CERN should be able to put the debate to rest within a couple months, when they plan to trap a blob of antihydrogen and then just watch it to see which way it falls. Down, and the laws of physics stay in place. Up, and you might just get that hoverboard you've always wanted.
Safer Ports through Physics
Reply #23 on:
May 05, 2011, 10:31:58 AM »
Physics for safer ports: New technology uses nuclear 'fingerprints' to scan cargo ships
April 29th, 2011 in Technology / Engineering
The Port of Savannah is the fourth largest container port in the United States, importing hundreds of large metal boxes from cargo ships shown here. Credit: Georgia Department of Economic Development.
While 700 million travelers undergo TSA's intrusive scans and pat-downs each year, 11 million cargo containers enter American ports with little screening at all. And the volume of those containers, roughly equivalent to 590 Empire State Buildings of cargo, could contain something even worse than box knives or exploding shoes, namely nuclear weapons.
Two teams of North Carolina physicists are mapping the intricacies of the atomic nucleus, which could provide better security at the ports. The scientists have identified new "fingerprints" of nuclear materials, such as uranium and plutonium. The fingerprints would be used in new cargo scanners to accurately and efficiently identify suspicious materials. The physics might also be used to improve analysis of spent nuclear fuel rods, which are a potential source of bomb-making materials.
The problem starts at ports, where terrorists may try to smuggle an entire dirty bomb or even smaller amounts of plutonium or uranium by hiding it within the mountains of cargo that pass into the country each day. Cargo scanners using the new nuclear fingerprints would be sensitive enough to spot an entire bomb or the smaller parts to build one, according to Mohammad Ahmed, a nuclear physicist at Duke University.
Ahmed and his colleagues are developing the fingerprints for the next-generation detectors with HIGS, the High Intensity Gamma-Ray Source. It is the world's most intense and tunable source of polarized gamma rays and is located on Duke's campus as part of the Triangle Universities Nuclear Laboratory. HIGS produces gamma rays that are guided to collide with target materials, causing a variety of nuclear reactions.
In the reaction Ahmed and his Duke colleagues study, the collision creates a spray of particles, which fly into a group of detectors. The detectors count the number of neutrons knocked from the atomic nuclei of the target material in either a parallel or perpendicular direction relative to the polarization plane of the gamma-ray beam. The ratio of neutrons emitted parallel to the plane to those emitted perpendicular is distinct to each material, giving it a unique fingerprint.
Ahmed said these fingerprints could eventually be used to distinguish special nuclear materials, like weapons-grade uranium, from naturally occurring uranium or ordinary objects such as clothing or granite countertops, distinctions that current port scanners cannot make.
In a separate but related project, nuclear physicists from three North Carolina universities are slamming the HIGS beam into atomic nuclei and observing the energy pattern and distribution of the gamma rays that fluoresce back out of the collision. Each material has a distinct fluorescence pattern based on its nuclear structure, according to physicist Calvin Howell, who leads the Duke group.
New neutron "fingerprints" discovered with polarized gamma-rays at Duke could be the foundation for new port security scanners. Graphic: Ashley Yeager, Duke
Howell and his collaborators are studying the fluorescence patterns of potentially dangerous nuclear materials and non-nuclear contraband such as explosives and drugs. They are also identifying the patterns of steel and lead because terrorists can use the metals to conceal and ship weapon-making materials.
The two anti-terrorism projects were developed with the support of the Department of Homeland Security's Domestic Nuclear Detection Office, or DNDO. The agency awarded Ahmed and his colleagues a $2 million grant, while Howell and his collaborators received grants totaling $2 million. DNDO is funding both projects in response to the SAFE (Security and Accountability For Every) Port Act of 2006, which requires security agents to scan for nuclear materials in all of the containers entering the United States through the nation's 22 busiest ports.
Five years after Congress and the president approved the legislation, the equipment to satisfy this mandate still doesn't exist. Meanwhile, the United States transfers about 20 percent of the world's freight across its borders and has more than 300 maritime ports for sea containers and an additional 300 access points, such as border crossings, where dangerous materials might enter the country.
The Duke scientists say their use of polarized gamma-ray beams could one day help satisfy the SAFE policy, and they are building the fingerprint library to make it happen.
The HIGS data show, for example, that a precisely tuned gamma beam at 6 MeV causes weapons-grade uranium, U-235, to emit one neutron parallel to the polarization plane for each neutron emitted perpendicular to the plane, giving the material a neutron fingerprint of one.
Naturally occurring uranium, U-238, emits three parallel neutrons for every one emitted perpendicular to the polarization plane of the beam, giving it a neutron fingerprint of three.
Beryllium, which can also be a neutron source in nuclear weapons, has a neutron fingerprint of 10. The team is now measuring the neutron fingerprints of plutonium and other fissile materials, Ahmed said.
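Using the ratios quoted above, a scanner's decision logic might, in spirit, amount to a nearest-fingerprint lookup. This is an illustrative sketch only, not the teams' actual software; the tolerance value is invented:

```python
# Neutron "fingerprints" quoted in the article for a 6 MeV beam:
# ratio of neutrons emitted parallel vs. perpendicular to the
# gamma beam's polarization plane.
FINGERPRINTS = {
    1.0: "U-235 (weapons-grade uranium)",
    3.0: "U-238 (natural uranium)",
    10.0: "beryllium",
}

def classify(parallel_counts, perpendicular_counts, tolerance=0.25):
    """Return the material whose fingerprint best matches the
    measured ratio, or None if nothing matches within the
    (hypothetical) relative tolerance."""
    ratio = parallel_counts / perpendicular_counts
    best = min(FINGERPRINTS, key=lambda f: abs(f - ratio))
    if abs(best - ratio) / best <= tolerance:
        return FINGERPRINTS[best]
    return None

print(classify(2980, 1005))  # ratio ~2.97 -> natural uranium
print(classify(500, 520))    # ratio ~0.96 -> weapons-grade uranium
```

In a real detector the counts would carry Poisson statistics and background, so the matching step would be a likelihood fit rather than a simple ratio threshold.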
Howell and his collaborators, meanwhile, work at lower energies on HIGS, about 3 MeV. (Surgeons, for comparison, use a "Gamma Knife" at roughly 1 MeV to treat brain tumors.) Their team has already identified the fluorescence patterns of several special nuclear materials and lead.
Both teams will report their results at a meeting with DNDO officials on Thursday, April 28 in Washington D.C. and will store their results in a nuclear identification database.
Ahmed and Howell said that engineers at one private security company and scientists at U.S. national laboratories have already begun using the database to design new port security scanners.
The new detectors will search cargo for the fingerprints using an electron accelerator, possibly coupled to lasers that produce a finely tuned gamma-ray beam, said Craig Wuest of the Global Security Principal Directorate at Lawrence Livermore National Laboratory (LLNL).
The design sounds complex, but in some ways it resembles medical scanning equipment and appears promising to pursue, he said.
Howell's "nuclear resonance fluorescence" approach is interesting because it uses a beam with lower-energy gamma rays and reduces the potential irradiation and contamination of cargo while providing "sufficient detection sensitivity," Wuest, who was not involved in the research, added.
One of Wuest's colleagues at LLNL, nuclear physicist Dennis McNabb, is more intrigued with Ahmed's and Weller's technique. Scientists are only just beginning to measure the fingerprints and background signatures from this neutron-scattering process, and because "the research is in progress, how to best use the data is still an open question," McNabb said.
He also explained that cargo scanners using the data from both teams could be ready for use at ports in about 10 years.
Still, some scientists question whether the emerging science and technology can mature fast enough to meet the real-world threats of terrorists and dirty bombs. For instance, Thomas Cochran, a physicist and senior scientist at the Natural Resources Defense Council, voiced "serious doubts" and said the government should focus instead on eliminating inventories of highly enriched uranium, improving port security, boosting intelligence efforts and training first responders.
Other experts disagree and are urging the government to accelerate research on new science and technologies that could significantly reduce the threat of nuclear weapons smuggling, which seems likely to persist into the next decade. McNabb, a proponent, said, "it takes time to develop new technologies" and suggests that the research may accelerate development in other areas of nuclear security.
The new information from HIGS could improve analysis of spent nuclear fuel rods, which are an environmental issue as well as a potential source of bomb materials, according to Duke physicist Anton Tonchev.
He works on the nuclear resonance fluorescence project with Howell and said the technique provides a nondestructive way to measure the quantities of plutonium and other nuclear materials that remain after the rods are removed from a nuclear reactor.
Currently, the spent fuel rods must be opened and tested to assess what materials remain in them. The process is expensive, but critical for the International Atomic Energy Agency to accurately calculate the amount of leftover fissile and nuclear materials. McNabb and Tonchev said that a new technique to distinguish the leftover U-235, U-238 and plutonium in the spent rods without opening them could substantially lower the costs to manage and account for nuclear waste to prevent nuclear proliferation by terrorists.
Regardless of how fast engineers turn the fingerprint data and new approaches into workable scanning and nuclear fuel devices, the Duke scientists said there is immediate value in compiling a robust database of both the neutron and nuclear resonance fluorescence fingerprints. Government officials at the DNDO concur and cite HIGS as the only facility with the ability to produce such a database, according to Ahmed.
Because of the demand, the physicists have recruited graduate and undergraduate students from Duke, University of North Carolina, North Carolina Agricultural and Technical State University, North Carolina Central University, James Madison University and George Washington University to help with the effort. They especially encourage students from historically black colleges and universities to participate, hoping the effort will help broaden the diversity of nuclear physicists working to identify new ways to curb the threat of future terror attacks.
Provided by Duke University
Reply #24 on:
May 12, 2011, 08:03:08 AM »
"Cargo scanners using the new nuclear fingerprints would be sensitive enough to spot an entire bomb or the smaller parts to build one, according to Mohammad Ahmed, a nuclear physicist at Duke University."
Nice work Mohammed!
"They especially encourage students from historically black colleges and universities to participate, hoping the effort will help broaden the diversity of nuclear physicists working to identify new ways to curb the threat of future terror attacks."
Affirmative action in physics
Reply #25 on:
May 17, 2011, 03:16:36 AM »
would a program like that help short whites obtain a diversity on pro basketball teams?
reality of the universe beyond human comprehension
Reply #26 on:
June 18, 2011, 10:13:28 AM »
Good article in Scientific American with an interview of this fellow. After studying string theory and black holes, he comes to the conclusion that human minds are totally incapable of truly understanding the universe. I can't pull up the article, but he is probably even more interesting than the celebrated Stephen Hawking. Then again, I am a mere mortal compared to any of these people:
From Wikipedia, the free encyclopedia
Born: South Bronx, New York City, New York, USA
Institutions: Yeshiva University; University of Tel Aviv; Korea Institute for Advanced Study
Alma mater: City College of New York
Doctoral advisor: Peter A. Carruthers
Known for: Holographic principle; String theory landscape; Hamiltonian lattice gauge theory
Notable awards: American Institute of Physics Science Writing Award; Sakurai Prize (1998); Boris Pregel Award, New York Academy of Science (1975)
Leonard Susskind (born 1940) is the Felix Bloch Professor of Theoretical Physics at Stanford University. His research interests include string theory, quantum field theory, quantum statistical mechanics and quantum cosmology. He is a member of the National Academy of Sciences and the American Academy of Arts and Sciences, an associate member of the faculty of Canada's Perimeter Institute for Theoretical Physics, and a distinguished professor of the Korea Institute for Advanced Study. Susskind is widely regarded as one of the fathers of string theory, having, with Yoichiro Nambu and Holger Bech Nielsen, independently introduced the idea that particles could in fact be states of excitation of a relativistic string. He was the first to introduce the idea of the string theory landscape in 2003. In 1998, Susskind was awarded the J.J. Sakurai Prize for his "pioneering contributions to hadronic string models, lattice gauge theories, quantum chromodynamics, and dynamical symmetry breaking." Susskind's hallmark, according to colleagues, has been the application of "brilliant imagination and originality to the theoretical study of the nature of the elementary particles and forces that make up the physical world."
 Early life and education
Susskind was born to a poor Jewish family from the South Bronx section of New York City, and now resides in Palo Alto, California. He began working as a plumber at the age of 16, taking over for his father who had become ill. Later, he enrolled in the City College of New York as an engineering student, graduating with a B.S. in physics in 1962. In an interview in the Los Angeles Times, Susskind recalls the moment he discussed with his father this change in career path: "When I told my father I wanted to be a physicist, he said, ‘Hell no, you ain’t going to work in a drug store.’ I said no, not a pharmacist. I said, ‘Like Einstein.’ He poked me in the chest with a piece of plumbing pipe. ‘You ain’t going to be no engineer,’ he said. ‘You’re going to be Einstein.’" Susskind then studied at Cornell University under Peter A. Carruthers where he received his Ph.D. in 1965. He has been married twice, first in 1960, and has four children.
Susskind was an Assistant Professor of Physics, then an Associate Professor at Yeshiva University (1966–1970), after which he went for a year at the University of Tel Aviv (1971–72), returning to Yeshiva to become a Professor of Physics (1970–1979). Since 1979 he has been Professor of Physics at Stanford University, and since 2000 has held the Felix Bloch Professorship of Physics.
In 2007, Susskind joined the Faculty of Perimeter Institute for Theoretical Physics in Waterloo, Ontario, Canada, as an Associate Member. He has been elected to the National Academy of Sciences and the American Academy of Arts and Sciences, and was awarded the 1998 Sakurai Prize for theoretical physics. He is also a distinguished professor at Korea Institute for Advanced Study.
 Scientific career
Susskind was one of at least three physicists who independently discovered during or around 1970 that the Veneziano dual resonance model of strong interactions could be described by a quantum mechanical model of strings, and was the first to propose the idea of the string theory landscape. Susskind has also made contributions in the following areas of physics:
The independent discovery of the string theory model of particle physics
The theory of quark confinement
The development of Hamiltonian lattice gauge theory
The theory of scaling violations in deep inelastic electroproduction
The theory of symmetry breaking sometimes known as "technicolor theory"
The second, yet independent, theory of cosmological baryogenesis (Sakharov's work was first, but was mostly unknown in the Western hemisphere.)
String theory of black hole entropy
The principle of black hole complementarity
The causal patch hypothesis
The holographic principle
M-theory, including development of the BFSS matrix model 
Introduction of holographic entropy bounds in physical cosmology
The idea of an anthropic string theory landscape
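Several items on this list, such as the string theory of black hole entropy and holographic entropy bounds, revolve around the Bekenstein-Hawking formula, which assigns a black hole an entropy proportional to its horizon area rather than its volume. A minimal sketch of that formula (an illustration using standard textbook constants, not anything from the source article):

```python
import math

# Physical constants (SI units, approximate CODATA values)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
hbar = 1.055e-34     # reduced Planck constant, J s
k_B = 1.381e-23      # Boltzmann constant, J/K
M_sun = 1.989e30     # solar mass, kg

def bh_entropy(mass_kg):
    """Bekenstein-Hawking entropy S = k_B * A * c^3 / (4 * G * hbar),
    where A is the horizon area of a Schwarzschild black hole."""
    r_s = 2 * G * mass_kg / c**2          # Schwarzschild radius, m (~3 km for the Sun)
    area = 4 * math.pi * r_s**2           # horizon area, m^2
    return k_B * area * c**3 / (4 * G * hbar)

s = bh_entropy(M_sun)   # on the order of 1e54 J/K, enormously more than
                        # the thermal entropy of an ordinary solar-mass star
```

Because the radius scales with mass, the entropy scales with mass squared, which is one way of seeing why the horizon area, not the interior, carries the information content in the holographic picture.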
 Development of String Theory
The story goes that "In 1970, a young physicist named Leonard Susskind got stuck in an elevator with Murray Gell-Mann, one of physics' top theoreticians, who asked him what he was working on. Susskind said he was working on a theory that represented particles 'as some kind of elastic string, like a rubber band.' Gell-Mann responded with loud, derisive laughter."
Susskind is the author of two popular science books, The Cosmic Landscape: String Theory and the Illusion of Intelligent Design published in 2005, and The Black Hole War: My battle with Stephen Hawking to make the world safe for quantum mechanics published in 2008.
 The Cosmic Landscape
Main article: The Cosmic Landscape
The Cosmic Landscape: String Theory and the Illusion of Intelligent Design is Susskind's first popular science book, published by Little, Brown and Company on December 12, 2005. It is Susskind's attempt to bring his idea of the anthropic landscape of string theory to the general public. In the book, Susskind describes how the string theory landscape was an almost inevitable consequence of several factors, one of which was Steven Weinberg's prediction of the cosmological constant in 1987. The question addressed here is why our universe is fine-tuned for our existence. Susskind explains that Weinberg calculated that if the cosmological constant was just a little larger, our universe would cease to exist.
 The Black Hole War
Main article: Susskind-Hawking battle
The Black Hole War: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics is Susskind's second popular science book, published by Little, Brown and Company on July 7, 2008. The book is his most famous work and explains what he thinks happens to the information and matter stored in a black hole when it evaporates. The book grew out of a debate that began in 1981, when a meeting of physicists tried to decode some of the mysteries of how fundamental particles behave. During this discussion Stephen Hawking stated that the information inside a black hole is lost forever as the black hole evaporates. It took Leonard Susskind nearly three decades to formulate the theory he believed would prove Hawking wrong, which he then published in his book, The Black Hole War. Like The Cosmic Landscape, The Black Hole War is aimed at the lay reader. He writes: "The real tools for understanding the quantum universe are abstract mathematics: infinite dimensional Hilbert spaces, projection operators, unitary matrices and a lot of other advanced principles that take a few years to learn. But let's see how we do in just a few pages."
An entire series of courses of lectures on essential theoretical foundations of modern physics by Susskind is available on the iTunes platform from "Stanford on iTunes"  and YouTube from "StanfordUniversity's Channel" . These lectures are intended for the general public as well as students. The following courses are available:
 Modern Physics: The Theoretical Minimum
1 Classical Mechanics (Fall 2007) iTunes YouTube
2 Quantum Mechanics (Winter 2008) iTunes YouTube
3 Special Relativity and Classical Field Theory (Spring 2008) iTunes YouTube
4 Einstein's General Theory of Relativity (Fall 2008) iTunes YouTube
5 Cosmology (Winter 2009) iTunes YouTube
6 Statistical Mechanics (Spring 2009) iTunes YouTube
Particle Physics: 1 Basic Concepts (Fall 2009) iTunes YouTube
Particle Physics: 2 Standard Model (Winter 2010) iTunes YouTube
Particle Physics: 3 Supersymmetry, Grand Unification, String Theory (Spring 2010) iTunes
String Theory and M-Theory (Winter 2011) iTunes YouTube
 A separate series of lectures on Quantum Mechanics and Special Relativity
Quantum Entanglements Part 1 (Fall 2006) iTunes YouTube
Quantum Entanglements Part 2 (Not available online)
Quantum Entanglements Part 3 (Spring 2007) iTunes YouTube
(Note that some of the lecture names are a little mixed-up: "Quantum Entanglements Part 3" is in fact a lecture series on special relativity, and the order in which the lectures were given is 1, 4, 5, 6, 7, 2&3, 8 and 9 (in terms of the numbers given on the videos); There is no mention of string theory in the series "Supersymmetry, Grand Unification, String Theory.")
 Smolin-Susskind Debate
The Smolin-Susskind debate refers to a series of intense postings in 2004 between Lee Smolin and Susskind, concerning Smolin's argument that the "Anthropic Principle cannot yield any falsifiable predictions, and therefore cannot be a part of science." It began on July 26, 2004, with Smolin's publication of "Scientific alternatives to the anthropic principle". Smolin e-mailed Susskind asking for a comment. Not having had a chance to read the paper, Susskind asked for a summary of its arguments. Smolin obliged, and on July 28, 2004, Susskind responded, saying that the logic Smolin followed "can lead to ridiculous conclusions". The next day, Smolin responded, saying that "If a large body of our colleagues feels comfortable believing a theory that cannot be proved wrong, then the progress of science could get stuck, leading to a situation in which false, but unfalsifiable theories dominate the attention of our field." This was followed by another paper by Susskind which made a few comments about Smolin's theory of "cosmic natural selection". The Smolin-Susskind debate finally ended with each of them agreeing to write a final letter which would be posted on Edge, with three conditions attached: (1) No more than one letter each; (2) Neither sees the other's letter in advance; (3) No changes after the fact.
Although the exchanges ended in 2004, the animosity remains. In 2006, Susskind criticized Smolin as a "mid-level theoretical physicist" whose "popular book-writing activities and the related promotional hustling have given him a platform high above that merited by his physics accomplishments."
 See also
List of theoretical physicists
 Further reading
Chown, Marcus, "Our world may be a giant hologram", New Scientist, 15 January 2009, magazine issue 2691. "The holograms you find on credit cards and banknotes are etched on two-dimensional plastic films. When light bounces off them, it recreates the appearance of a 3D image. In the 1990s physicists Leonard Susskind and Nobel prizewinner Gerard 't Hooft suggested that the same principle might apply to the universe as a whole. Our everyday experience might itself be a holographic projection of physical processes that take place on a distant, 2D surface."
 External links
Leonard Susskind's Homepage (Stanford University)
"Interview with Leonard Susskind."
"Smolin vs. Susskind: The Anthropic Principle" Susskind and Lee Smolin debate the Anthropic principle
Radio Interview from This Week in Science March 14, 2006 Broadcast
"Father of String Theory Muses on the Megaverse": Podcast.
Leonard Susskind at the Internet Movie Database
The Cosmic Landscape book discussion at The Commonwealth Club, February 2007
The Black Hole War speaks on black hole conflict at The Commonwealth Club, July 2008
Leonard Susskind: My friend Richard Feynman - A Ted talk***
Physics in one minute
Reply #27 on:
September 15, 2011, 05:56:30 PM »
faster than speed of light?
Reply #28 on:
September 22, 2011, 02:28:01 PM »
A fundamental pillar of physics — that nothing can go faster than the speed of light — appears to be smashed by an oddball subatomic particle that has apparently made a giant end run around Albert Einstein's theories.
Scientists at the world's largest physics lab said Thursday they have clocked neutrinos traveling faster than light. That's something that according to Einstein's 1905 special theory of relativity — the famous E = mc² equation — just doesn't happen.
"The feeling that most people have is this can't be right, this can't be real," said James Gillies, a spokesman for the European Organization for Nuclear Research, or CERN, outside the Swiss city of Geneva.
Gillies told The Associated Press that the readings have so astounded researchers that they are asking others to independently verify the measurements before claiming an actual discovery.
"They are inviting the broader physics community to look at what they've done and really scrutinize it in great detail, and ideally for someone elsewhere in the world to repeat the measurements," he said Thursday.
Scientists at the competing Fermilab in Chicago have promised to start such work immediately.
"It's a shock," said Fermilab head theoretician Stephen Parke, who was not part of the research in Geneva. "It's going to cause us problems, no doubt about that - if it's true."
The Chicago team had similar faster-than-light results in 2007, but those came with a giant margin of error that undercut its scientific significance.
Outside scientists expressed skepticism at CERN's claim that the neutrinos — one of the strangest well-known particles in physics — were observed smashing past the cosmic speed barrier of 186,282 miles per second (299,792 kilometers per second).
University of Maryland physics department chairman Drew Baden called it "a flying carpet," something that was too fantastic to be believable.
CERN says a neutrino beam fired from a particle accelerator near Geneva to a lab 454 miles (730 kilometers) away in Italy traveled 60 nanoseconds faster than the speed of light. Scientists calculated the margin of error at just 10 nanoseconds, making the difference statistically significant. But given the enormous implications of the find, they still spent months checking and rechecking their results to make sure there were no flaws in the experiment.
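The figures quoted in the article can be sanity-checked with simple arithmetic (a back-of-the-envelope sketch using the article's numbers, not the OPERA collaboration's actual analysis): over a 730 km baseline, light takes about 2.4 milliseconds, so arriving 60 nanoseconds early corresponds to exceeding the speed of light by roughly 2.5 parts in 100,000.

```python
# Back-of-the-envelope check of the OPERA figures quoted above.
c = 299_792_458.0          # speed of light, m/s
baseline_m = 730e3         # Geneva -> Gran Sasso distance quoted in the article
early_s = 60e-9            # neutrinos reported arriving 60 ns early
sigma_s = 10e-9            # quoted margin of error

light_time = baseline_m / c            # ~2.435e-3 s, i.e. about 2.4 ms
excess = early_s / light_time          # fractional amount by which v exceeds c
significance = early_s / sigma_s       # ~6 sigma by the article's own numbers

print(f"light travel time: {light_time * 1e3:.3f} ms")
print(f"v/c - 1 ≈ {excess:.2e}")       # ~2.5e-5
print(f"effect / error ≈ {significance:.0f} sigma")
```

The tiny fractional excess is exactly why the result hinged on nanosecond-level timing and meter-level distance measurements, and why a subtle systematic error was the first suspect.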
"We have not found any instrumental effect that could explain the result of the measurement," said Antonio Ereditato, a physicist at the University of Bern, Switzerland, who was involved in the experiment known as OPERA.
The CERN researchers are now looking to the United States and Japan to confirm the results.
A similar neutrino experiment at Fermilab near Chicago would be capable of running the tests, said Stavros Katsanevas, the deputy director of France's National Institute for Nuclear and Particle Physics Research. The institute collaborated with Italy's Gran Sasso National Laboratory for the experiment at CERN.
Katsanevas said help could also come from the T2K experiment in Japan, though that is currently on hold after the country's devastating March 11 earthquake and tsunami.
Scientists agree if the results are confirmed, that it would force a fundamental rethink of the laws of nature.
Einstein's special relativity theory that says energy equals mass times the speed of light squared underlies "pretty much everything in modern physics," said John Ellis, a theoretical physicist at CERN who was not involved in the experiment. "It has worked perfectly up until now."
He cautioned that the neutrino researchers would have to explain why similar results weren't detected before, such as when an exploding star — or supernova — was observed in 1987.
"This would be such a sensational discovery if it were true that one has to treat it extremely carefully," said Ellis.
Re: speed of light broken
Reply #29 on:
September 23, 2011, 09:25:20 AM »
The media needs to take a cold bath.
As has been said clearly, the result is reproducible and therefore "puzzling" - and so the group at CERN have offered up their data to other groups to confirm (or deny). The natural course of science. The speed of light limit is a long-standing principle (if you call 100 years "long") and represents a significant challenge to the scientific status quo. But note that nothing will fall apart or become more or less true should this result stand - even if it does turn out that neutrinos are "breaking the law", like so many revelations in scientific discovery it will simply mean developing newer and better models that account for everything we have seen to date plus this.
This is currently in the realm of "science fiction" - or at least "science speculation".
In gravitation, Newton was proved "wrong". Then Einstein. And I'm sure one day Hawking will also fall to the latest and best description of gravitation. We all know that we haven't been able to incorporate gravitation properly into a unified theory. It may be because there is some subtlety about space-time that we haven't accounted for. This may help us uncover it.
It's exciting frankly. I don't know why the media portrays it as threatening. No scientist thinks that they know the ultimate truth about life the universe and everything. Science is a series of models - not some stranglehold on truth about reality. Any scientist who believes otherwise should be shown the door. In fact, challenging existing canon is a natural consequence of what every critical thinking scientist should be doing - albeit not gratuitously. It's good science (remember, this is not a religion, despite our tendencies).
In this circumstance, unambiguously, the first position taken is that they must be making systematic errors and they want another group to check it independently. Good scientific process. Until that is done, this is like a UFO sighting - possibly something revolutionary but much more likely something quite banal and well understood - a mistake in scientific preparation and/or analysis. Occam's Razor at work.
The media is like some crazed terrier who goes barking madly up and down the hallway every time it hears a floorboard creak. You need a new dog.
Update: From the UK Guardian
Professor Jim Al-Khalili at the University of Surrey said it was most likely that something was skewing the results. "If the neutrinos have broken the speed of light, it would overturn a keystone theory from the last century of physics. That's possible, but it's far more likely that there is an error in the data. So let me put my money where my mouth is: if the Cern experiment proves to be correct and neutrinos have broken the speed of light, I will eat my boxer shorts on live TV."
Update: XKCD skewers the issue
Last Edit: September 23, 2011, 05:56:36 PM by trickydog
WSJ agrees with Tricky Dog
Reply #30 on:
September 26, 2011, 07:58:59 PM »
By MICHIO KAKU
Einstein wrong? Impossible!
That was the reaction of physicists around the world last week when they heard that experiments in Switzerland indicate that Einstein's theory of relativity might be wrong. Since 1905, when Einstein declared that nothing in the universe could travel faster than light, the theory has been the bedrock of modern physics. Indeed, most of our high-tech wizardry depends on it.
Of course, crackpots have been denouncing Einstein's theory of relativity for years. Like many physicists, I have boxes full of self-published monographs that were mailed to me from people who claim that Einstein was wrong. In the 1930s the Nazi Party criticized Einstein's theory, publishing a book called "100 Authorities Denounce Relativity." Einstein later quipped that you don't need 100 famous intellectuals to disprove his theory. All you need is one simple fact.
Well, that simple fact may be in the form of the latest experiments at the largest particle accelerators in the world, based at CERN, outside Geneva. Physicists fired a beam of neutrinos (exotic, ghost-like particles that can penetrate even the densest of materials) from Switzerland to Italy, over a distance of 454 miles. Much to their amazement, after analyzing 15,000 neutrinos, they found that they traveled faster than the speed of light—60 billionths of a second faster, to be precise. In a billionth of a second, a beam of light travels about one foot. So a difference of 60 feet was quite astonishing.
Cracking the light barrier violated the core of Einstein's theory. According to relativity, as you approach the speed of light, time slows down, you get heavier, and you also get flatter (all of which have been measured in the lab). But if you go faster than light, then the impossible happens. Time goes backward. You are lighter than nothing, and you have negative width. Since this is ridiculous, you cannot go faster than light, said Einstein.
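All of the effects Kaku lists flow from the Lorentz factor γ = 1/√(1 − v²/c²): time dilation, mass-energy increase, and length contraction each scale with γ, which diverges as v approaches c and involves the square root of a negative number for v > c. A small illustrative sketch (standard special relativity, not specific to the article):

```python
import math

def lorentz_gamma(v_over_c):
    """Lorentz factor gamma = 1 / sqrt(1 - (v/c)^2).
    Real and >= 1 for v < c; diverges as v -> c;
    undefined (imaginary) for v > c."""
    return 1.0 / math.sqrt(1.0 - v_over_c**2)

print(lorentz_gamma(0.5))      # ~1.155: modest time dilation at half light speed
print(lorentz_gamma(0.99))     # ~7.09: moving clocks run about 7x slow
print(lorentz_gamma(0.9999))   # ~70.7: diverging toward infinity near c

try:
    lorentz_gamma(1.01)        # v > c makes 1 - (v/c)^2 negative; sqrt fails
except ValueError:
    print("gamma is imaginary for v > c -- the 'impossible' regime Kaku describes")
```

The math.domain error for v > c is the code-level version of Kaku's point: within relativity, the faster-than-light regime simply has no real-valued answer.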
A part of the OPERA detector experiment to measure neutrinos.
The CERN announcement was electrifying. Some physicists burst out with glee, because it meant that the door was opening to new physics (and more Nobel Prizes). New, daring theories would need to be proposed to explain this result. Others broke out in a cold sweat, realizing that the entire foundation of modern physics might have to be revised. Every textbook would have to be rewritten, every experiment recalibrated.
Cosmology, the very way we think of space, would be forever altered. The distance to the stars and galaxies and the age of the universe (13.7 billion years) would be thrown in doubt. Even the expanding universe theory, the Big Bang theory, and black holes would have to be re-examined.
Moreover, everything we think we understand about nuclear physics would need to be reassessed. Every school kid knows Einstein's famous equation E=MC2, where a small amount of mass M can create a vast amount of energy E, because the speed of light C squared is such a huge number. But if C is off, it means that all nuclear physics has to be recalibrated. Nuclear weapons, nuclear medicine and radioactive dating would be affected because all nuclear reactions are based on Einstein's relation between matter and energy.
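To make the "vast amount of energy" concrete (a standard illustration, not a figure from the article): converting a single gram of mass entirely to energy yields on the order of the energy released by a large nuclear weapon.

```python
c = 299_792_458.0              # speed of light, m/s

def mass_energy(mass_kg):
    """E = m * c^2, in joules."""
    return mass_kg * c**2

e_one_gram = mass_energy(0.001)     # ~9.0e13 J from one gram of mass
hiroshima_j = 6.3e13                # ~15 kilotons TNT, a commonly cited estimate
print(f"1 g of mass = {e_one_gram:.2e} J, "
      f"about {e_one_gram / hiroshima_j:.1f}x the Hiroshima bomb's yield")
```

The size of c² (nearly 9 × 10¹⁶ in SI units) is the whole story here, which is why any revision to c would ripple through every nuclear calculation.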
If all this weren't bad enough, it would also mean that the fundamental principles of physics are incorrect. Modern physics is based on two theories, relativity and the quantum theory, so half of modern physics would have to be replaced by a new theory. My own field, string theory, is no exception. Personally, I would have to revise all my theories because relativity is built into string theory from the very beginning.
How will this astonishing result play out? As Carl Sagan once said, remarkable claims require remarkable proof. Laboratories around the world, like Fermilab outside Chicago, will redo the CERN experiments and try to falsify or verify their results.
My gut reaction, however, is that this is a false alarm. Over the decades, there have been numerous challenges to relativity, all of them proven wrong. In the 1960s, for example, physicists were measuring the tiny effect of gravity upon a light beam. In one study, physicists found that the speed of light seemed to oscillate with the time of day. Amazingly, the speed of light rose during the day, and fell at night. Later, it was found that, since the apparatus was outdoors, the sensors were affected by the temperature of daylight.
Reputations may rise and fall. But in the end, this is a victory for science. No theory is carved in stone. Science is merciless when it comes to testing all theories over and over, at any time, in any place. Unlike religion or politics, science is ultimately decided by experiments, done repeatedly in every form. There are no sacred cows. In science, 100 authorities count for nothing. Experiment counts for everything.
Mr. Kaku, a professor of theoretical physics at City College of New York, is the author of "Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100" (Doubleday, 2011).
About tachyons (particles that travel faster than c)
Reply #31 on:
September 26, 2011, 08:41:04 PM »
A bartender says "Hey! We don't serve tachyons in this establishment".
Two tachyons walk into a bar.
Re: About tachyons (particles that travel faster than c)
Reply #32 on:
September 26, 2011, 08:42:27 PM »
Quote from: trickydog on September 26, 2011, 08:41:04 PM
A bartender says "Hey! We don't serve tachyons in this establishment".
Two tachyons walk into a bar.
Reply #33 on:
September 26, 2011, 11:18:25 PM »
"60 billionths of a second faster"..."My gut reaction, however, is that this is a false alarm. Over the decades, there have been numerous challenges to relativity, all of them proven wrong."
Interesting stuff. I recall an experiment 10 or 12 years ago where researchers also claimed to make light pulses travel slightly faster than the speed of light for a very short time. Nothing seemed to come of that in terms of theories discarded or products commercialized. The speed of light is already pretty fast.
Re: About tachyons (particles that travel faster than c)
Reply #34 on:
September 28, 2011, 12:49:21 AM »
Quote from: trickydog on September 26, 2011, 08:41:04 PM
A bartender says "Hey! We don't serve tachyons in this establishment".
Two tachyons walk into a bar.
I can already tell I'm going to start telling this joke constantly.
Reply #35 on:
September 28, 2011, 08:44:23 AM »
TD, CW, You run in fast circles if you can tell that joke without some explaining.
Quote from: Cranewings on September 28, 2011, 12:49:21 AM
Quote from: trickydog on September 26, 2011, 08:41:04 PM
A bartender says "Hey! We don't serve tachyons in this establishment".
Two tachyons walk into a bar.
Reply #36 on:
September 28, 2011, 09:59:39 AM »
Well, I'm not so fast! Explain it to me please!
Reply #37 on:
September 28, 2011, 12:29:54 PM »
Explaining someone else's joke is dangerous territory, good chance of screwing up. The joke writers can correct me or build on this.
In layman's terms, the theory of relativity is about things that happen really fast, messing with the concept of time as we know it, all based on the speed of light, a constant, which is now being challenged. Arrive at your train destination before you departed, that kind of thing... The tachyon is a hypothetical subatomic particle that would travel faster than light, violating that limit and threatening Einstein's great theory. In comes the old joke line, like the priest and the rabbi or the man and his dog go into a bar and the bartender says..., only this time the events happen in reverse order, messing with our concept of time.
A bartender says "Hey! We don't serve tachyons in this establishment".
Two tachyons walk into a bar.
Reply #38 on:
October 24, 2011, 07:47:53 AM »
Teenager solves math riddle posed by Sir Isaac Newton
Reply #39 on:
May 27, 2012, 10:48:24 AM »
BYU Physics class
Reply #40 on:
November 22, 2012, 12:58:18 PM »
Pi and other infinities
Reply #41 on:
January 01, 2013, 01:00:38 PM »
explaining that which is unexplainable
Reply #42 on:
October 12, 2013, 11:30:23 AM »
Higgs Boson Gets Nobel Prize, But Physicists Still Don’t Know What It Means
By Adam Mann
Data from the CMS experiment, one of the main Higgs-searching experiments at the Large Hadron Collider. Image: CERN
More than a year ago, scientists found the Higgs boson. This morning, two physicists who 50 years ago theorized the existence of this particle, which is responsible for conferring mass to all other known particles in the universe, got the Nobel, the highest prize in science.
For all the excitement the award has already generated, finding the Higgs — arguably the most important discovery in more than a generation — has left physicists without a clear roadmap of where to go next. While popular articles often describe how the Higgs might help theorists investigating the weird worlds of string theory, multiple universes, or supersymmetry, the truth is that evidence for these ideas is scant to nonexistent.
No one is sure which of these models, if any, will eventually describe reality. The current picture of the universe, the Standard Model, is supposed to account for all known particles and their interactions. But scientists know that it’s incomplete. Its problems need fixing, and researchers could use some help figuring out how. Some of them look at the data and say that we need to throw out speculative ideas such as supersymmetry and the multiverse, models that look elegant mathematically but are unprovable from an experimental perspective. Others look at the exact same data and come to the opposite conclusion.
“Physics is at a crossroads,” said cosmologist Neil Turok, speaking to a class of young scientists in September at the Perimeter Institute, which he directs. “In a sense we’ve entered a very deep crisis.”
The word “crisis” is a charged one within the physics community, invoking eras such as the early 20th century, when new observations were overturning long-held beliefs about how the universe works. Eventually, a group of young researchers showed that quantum mechanics was the best way to describe reality. Now, as then, many troubling observations leave physicists scratching their heads. Chief among them is the “Hierarchy Problem,” which in its simplest form asks why gravity is approximately 10 quadrillion times weaker than the three other fundamental forces in the universe. Another issue is the existence of dark matter, the unseen, mysterious mass thought to be responsible for strange observations in the rotation of galaxies.
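One common way to quantify the gap the article alludes to is the ratio of the Planck scale (where gravity becomes strong) to the electroweak scale set by the Higgs field. A back-of-envelope check, using standard published constants rather than figures from the article:

```python
# Rough, order-of-magnitude check of the "Hierarchy Problem" ratio quoted
# above. The values below are standard physical constants; comparing the
# Planck scale to the electroweak scale is one common way to state the problem.

M_PLANCK_GEV = 1.22e19   # Planck mass-energy, ~1.22 x 10^19 GeV
V_HIGGS_GEV = 246.0      # Higgs vacuum expectation value (electroweak scale), GeV

ratio = M_PLANCK_GEV / V_HIGGS_GEV
print(f"Planck / electroweak scale ratio: {ratio:.1e}")  # ~5 x 10^16, i.e. tens of quadrillions
```

The result, a few times 10^16, is the "approximately 10 quadrillion" factor the article cites; why that number is so enormous is exactly what the Hierarchy Problem asks.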
The solution to both these problems might come from the discovery of new particles beyond the Higgs. One theory, supersymmetry, goes beyond the Standard Model to say that every subatomic particle — quarks, electrons, neutrinos, and so on — also has a heavier twin. Some of these new particles might have the right characteristics to account for the influence of dark matter. Engineers built the Large Hadron Collider to see if such new particles exist (and may yet see them once it reaches higher energy in 2014), but so far it hasn’t turned up anything other than the Higgs.
In fact, the Higgs itself has turned out to be part of the issue. The particle was the final piece in the Standard Model puzzle. When scientists discovered it at the LHC, it had a mass of 125 GeV - about 133 times the mass of a proton - squarely within the range standard physics expected. That was kind of a buzzkill. Though happy to know the Higgs was there, many scientists had hoped it would turn out to be strange, to defy their predictions in some way and give a hint as to which models beyond the Standard Model were correct. Instead, it's ordinary, perhaps even boring.
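The mass comparison is a one-line conversion, since particle masses are quoted in GeV (the proton's is about 0.94 GeV). A quick sanity check, with masses taken from standard particle listings rather than the article:

```python
# Sanity check on the Higgs-to-proton mass comparison. Masses in GeV/c^2
# are standard published values.
HIGGS_MASS_GEV = 125.0     # measured Higgs boson mass
PROTON_MASS_GEV = 0.9383   # proton mass

print(f"Higgs / proton mass: {HIGGS_MASS_GEV / PROTON_MASS_GEV:.0f}")  # ~133
```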
All this means that confidence in supersymmetry is dropping like a stone, according to Tommaso Dorigo, a particle physicist at the LHC. In one blog post, he shared a rather pornographic plot showing how the findings of the LHC eliminated part of the evidence for supersymmetry. Later, he wrote that many physicists would have previously bet their reproductive organs on the idea that supersymmetric particles would appear at the LHC. That the accelerator’s experiments have failed to find anything yet “has significantly cooled everybody down,” he wrote.
In fact, when the organizers of a Higgs workshop in Madrid last month asked physicists there if they thought the LHC would eventually find new physics other than the Higgs boson, 41 percent said no. As to how to solve the known problems of the Standard Model, respondents were all over the map. String theory fared the worst, with three-quarters of those polled saying they did not think it is the ultimate answer to a unified physics.
One possibility has been brought up that even physicists don't like to think about. Maybe the universe is even stranger than they think - so strange that even post-Standard Model models can't account for it. Some physicists are starting to question whether our universe is natural. This cuts to the heart of why our reality has the features that it does: that is, full of quarks and electricity and a particular speed of light.
This problem, the naturalness or unnaturalness of our universe, can be likened to a weird thought experiment. Suppose you walk into a room and find a pencil balanced perfectly vertical on its sharp tip. That would be a fairly unnatural state for the pencil to be in because any small deviation would have caused it to fall down. This is how physicists have found the universe: a bunch of rather well-tuned fundamental constants have been discovered that produce the reality that we see.
A natural explanation would show why the pencil is standing on its end. Perhaps there is a very thin string holding the pencil to the ceiling that you never noticed until you got up close. Supersymmetry is a natural explanation in this regard – it explains the structure of the universe through as-yet-unseen particles.
But suppose that infinite rooms exist with infinite numbers of pencils. While most of the rooms would have pencils that have fallen over, it is almost certain that in at least one room, the pencil would be perfectly balanced. This is the idea behind the multiverse. Our universe is but one of many and it happens to be the one where the laws of physics happen to be in the right state to make stars burn hydrogen, planets form round spheres, and creatures like us evolve on their surface.
The multiverse idea has two strikes against it, though. First, physicists would refer to it as an unnatural explanation because it simply happened by chance. And second, no real evidence for it exists and we have no experiment that could currently test for it.
As of yet, physicists are still in the dark. We can see vague outlines ahead of us but no one knows what form they will take when we reach them. Finding the Higgs has provided the tiniest bit of light. But until more data appears, it won’t be enough.
Last Edit: October 12, 2013, 04:39:57 PM by Crafty_Dog
CERN May Not Have Discovered Higgs Boson After All
Reply #43 on:
November 09, 2014, 08:18:09 AM »
November 9, 2014, by Corey Leighton
In July of 2012, researchers at CERN announced that the 40-year hunt for the elusive Higgs boson may have come to an end. The announcement made headlines around the world, and particle physicists considered the critical discovery to be the first of many from the lab's famous Large Hadron Collider. But scientists at the University of Southern Denmark's Center for Cosmology and Particle Physics Phenomenology are now casting doubt, saying that the detected particle may not be the elusive Higgs boson after all.
The Higgs boson is one of the key building blocks of the Standard Model of particle physics. The standard model attempts to explain the electromagnetic, weak, and strong nuclear forces, and the Higgs boson is a critical piece of the puzzle. Its discovery would lead the way to understanding the Higgs field, which, in turn, would explain how everything we see around us has mass. So, the announcement from CERN that it had been detected was received with much fanfare… excitement which might now be premature.
“The current data is not precise enough to determine exactly what the particle is,” says university researcher Mads Toudal Frandsen. “It could be a number of other known particles.”
Frandsen's team now suggests that the detected particle may not be a Higgs boson at all; instead, it could be a 'techni-higgs' particle, which would support a set of theories beyond the Standard Model known as 'Technicolor'.
“A techni-higgs particle is not an elementary particle. Instead, it consists of so-called techni-quarks, which we believe are elementary,” he says.
“Techni-quarks may bind together in various ways to form for instance techni-higgs particles, while other combinations may form dark matter. We therefore expect to find several different particles at the LHC, all built by techni-quarks.”
The ultimate verdict most likely lies deep in the heart of the now-dormant LHC, which is currently silent while CERN scientists work to increase the power of the world's most powerful particle supercollider. CERN hopes to have the LHC back online in early 2015.
Source: Tech Times
The Big Rip
Reply #44 on:
July 02, 2015, 11:29:00 PM »
New model of cosmic stickiness favors ‘Big Rip’ demise of universe
This is a timeline of the life of the universe that ends in a Big Rip. Credit: Jeremy Teaford, Vanderbilt University
From Vanderbilt University:
The universe can be a very sticky place, but just how sticky is a matter of debate.
That is because for decades cosmologists have had trouble reconciling the classic notion of viscosity based on the laws of thermodynamics with Einstein’s general theory of relativity. However, a team from Vanderbilt University has come up with a fundamentally new mathematical formulation of the problem that appears to bridge this long-standing gap.
The new math has some significant implications for the ultimate fate of the universe. It tends to favor one of the more radical scenarios that cosmologists have come up with known as the “Big Rip.” It may also shed new light on the basic nature of dark energy.
The new approach was developed by Assistant Professor of Mathematics Marcelo Disconzi in collaboration with physics professors Thomas Kephart and Robert Scherrer and is described in a paper published earlier this year in the journal Physical Review D.
“Marcelo has come up with a simpler and more elegant formulation that is mathematically sound and obeys all the applicable physical laws,” said Scherrer.
The type of viscosity that has cosmological relevance is different from the familiar “ketchup” form of viscosity, which is called shear viscosity and is a measure of a fluid’s resistance to flowing through small openings like the neck of a ketchup bottle. Instead, cosmological viscosity is a form of bulk viscosity, which is the measure of a fluid’s resistance to expansion or contraction. The reason we don’t often deal with bulk viscosity in everyday life is because most liquids we encounter cannot be compressed or expanded very much.
Disconzi began by tackling the problem of relativistic fluids - fluids moving at a substantial fraction of the speed of light. Astronomical objects that produce such flows include supernovae (exploding stars) and neutron stars (stars that have been crushed down to the size of a city).
Scientists have had considerable success modeling what happens when ideal fluids – those with no viscosity – are boosted to near-light speeds. But almost all fluids are viscous in nature and, despite decades of effort, no one has managed to come up with a generally accepted way to handle viscous fluids traveling at relativistic velocities. In the past, the models formulated to predict what happens when these more realistic fluids are accelerated to a fraction of the speed of light have been plagued with inconsistencies: the most glaring of which has been predicting certain conditions where these fluids could travel faster than the speed of light.
“This is disastrously wrong,” said Disconzi, “since it is well-proven experimentally that nothing can travel faster than the speed of light.”
These problems inspired the mathematician to re-formulate the equations of relativistic fluid dynamics in a way that does not exhibit the flaw of allowing faster-than-light speeds. He based his approach on one that was advanced in the 1950s by French mathematician André Lichnerowicz.
Next, Disconzi teamed up with Kephart and Scherrer to apply his equations to broader cosmological theory. This produced a number of interesting results, including some potential new insights into the mysterious nature of dark energy.
In the 1990s, the physics community was shocked when astronomical measurements showed that the universe is expanding at an ever-accelerating rate. To explain this unpredicted acceleration, they were forced to hypothesize the existence of an unknown form of repulsive energy that is spread throughout the universe. Because they knew so little about it, they labeled it “dark energy.”
Most dark energy theories to date have not taken cosmic viscosity into account, despite the fact that it has a repulsive effect strikingly similar to that of dark energy. “It is possible, but not very likely, that viscosity could account for all the acceleration that has been attributed to dark energy,” said Disconzi. “It is more likely that a significant fraction of the acceleration could be due to this more prosaic cause. As a result, viscosity may act as an important constraint on the properties of dark energy.”
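How bulk viscosity can imitate dark energy is easy to see in a standard textbook treatment, where the viscous term lowers the fluid's effective pressure. This is only an illustrative sketch of the conventional picture, not the Disconzi-Kephart-Scherrer equations, and the numbers are made up:

```python
# Illustrative sketch of how bulk viscosity can mimic dark energy in a
# standard expanding-universe (FRW) model: viscosity reduces the effective
# pressure, p_eff = p - 3*zeta*H, and expansion accelerates when
# rho + 3*p_eff < 0. All quantities are in arbitrary units.

def effective_pressure(p, zeta, H):
    """Effective pressure of a bulk-viscous fluid in an expanding universe."""
    return p - 3.0 * zeta * H

def accelerating(rho, p, zeta, H):
    """True if the expansion accelerates, i.e. rho + 3*p_eff < 0."""
    return rho + 3.0 * effective_pressure(p, zeta, H) < 0

rho, p, H = 1.0, 0.0, 1.0                    # pressureless matter, arbitrary units
print(accelerating(rho, p, zeta=0.0, H=H))   # False: no viscosity, no acceleration
print(accelerating(rho, p, zeta=0.2, H=H))   # True: viscous term acts like dark energy
```

With zero viscosity, ordinary matter can only decelerate the expansion; switching on even a modest bulk viscosity flips the sign, which is the "strikingly similar" repulsive effect Disconzi describes.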
Another interesting result involves the ultimate fate of the universe. Since the discovery of the universe’s run-away expansion, cosmologists have come up with a number of dramatic scenarios of what it could mean for the future.
One scenario, dubbed the “Big Freeze,” predicts that after 100 trillion years or so the universe will have grown so vast that the supplies of gas will become too thin for stars to form. As a result, existing stars will gradually burn out, leaving only black holes which, in turn, slowly evaporate away as space itself gets colder and colder.
An even more radical scenario is the “Big Rip.” It is predicated on a type of “phantom” dark energy that gets stronger over time. In this case, the expansion rate of the universe becomes so great that in 22 billion years or so material objects begin to fall apart and individual atoms disassemble themselves into unbound elementary particles and radiation.
The key value involved in this scenario is the ratio between dark energy’s pressure and density, what is called its equation of state parameter. If this value drops below -1 then the universe will eventually be pulled apart. Cosmologists have called this the “phantom barrier.” In previous models with viscosity the universe could not evolve beyond this limit.
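The article's "22 billion years or so" figure can be recovered from a well-known back-of-envelope approximation for phantom dark energy (Caldwell, Kamionkowski & Weinberg, 2003). The parameter values below are illustrative round numbers, not taken from the article:

```python
# Back-of-envelope estimate of the time remaining until a Big Rip, using the
# approximation from Caldwell, Kamionkowski & Weinberg (2003):
#   t_rip - t_now ~ (2/3) * |1+w|^-1 * (1/H0) * (1 - Omega_m)^-1/2
# where w is the dark-energy equation of state parameter. Illustrative values:
# Hubble time 1/H0 ~ 14 Gyr, matter fraction Omega_m ~ 0.3.

def years_to_big_rip(w, hubble_time_gyr=14.0, omega_m=0.3):
    """Approximate time until the Big Rip, in billions of years (Gyr)."""
    if w >= -1:
        raise ValueError("Big Rip requires phantom dark energy, w < -1")
    return (2.0 / 3.0) / abs(1.0 + w) * hubble_time_gyr / (1.0 - omega_m) ** 0.5

print(f"{years_to_big_rip(-1.5):.0f} Gyr")  # ~22 billion years, as the article says
```

For w = -1.5 this gives roughly 22 billion years; as w approaches -1 from below, the rip recedes to the infinite future, which is why the -1 "phantom barrier" matters.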
In the Disconzi-Kephart-Scherrer formulation, however, this barrier does not exist. Instead, the new formulation provides a natural way for the equation of state parameter to fall below -1.
“In previous models with viscosity the Big Rip was not possible,” said Scherrer. “In this new model, viscosity actually drives the universe toward this extreme end state.”
According to the scientists, the results of their pen-and-paper analyses of this new formulation for relativistic viscosity are quite promising but a much deeper analysis must be carried out to determine its viability. The only way to do this is to use powerful computers to analyze the complex equations numerically. In this fashion the scientists can make predictions that can be compared with experiment and observation.
The research was supported by National Science Foundation grant 1305705 and Department of Energy grant DE-SC0011981.
Re: Physics & Mathematics
Reply #45 on:
July 03, 2015, 11:01:21 AM »
A bit over my head, but provokes a sense of wonder nonetheless.
Reply #46 on:
August 19, 2015, 02:39:32 PM »
Quantum spookiness confirmed
Reply #47 on:
August 31, 2015, 08:42:54 PM »