Archive for November, 2006

Unique Marvel of Ancient Greek Technology Gives Up New Secrets

Thursday, November 30th, 2006

The most sophisticated mechanical device of ancient Greece may finally be giving up its secrets. Researchers have long known the so-called Antikythera mechanism was a calendar of sorts that represented the positions of the sun and moon using a series of gears. In its complexity it outshined all other objects for a thousand years following its creation sometime around the second century B.C. Now an international consortium of researchers has probed the machine’s corroded fragments with sophisticated x-ray and light imaging tools to uncover the true sophistication of this geared wonder.

The device could predict eclipses as well as reproduce a subtle irregularity in the moon’s orbit, they reveal. Moreover, it may have been able to represent the motions of the planets, although the necessary gears seem to be long gone. They also confirm a prior hypothesis that the device relied on spiral grooves to count off certain lunar cycles. “We don’t know what it was for but we do believe we know what it did,” says astronomer Mike Edmunds of Cardiff University in Wales.

Divers recovered the Antikythera instrument in 1901 from a 2,000-year-old shipwreck that had sunk beneath the Mediterranean Sea, midway between Greece’s Peloponnesian peninsula and the island of Crete. What they found was a hunk of calcified bronze gears and other fragments, along with a decayed wooden box that had housed the mechanism. In pioneering work begun in the 1950s (and first described in a 1959 Scientific American article), the late science historian Derek Price reasoned that this encrusted mess was a solar-lunar calendar. The box would open in the front to reveal a dial with rotating gear-driven pointers to mark off the sun’s position and the phases of the moon. The back of the device contained two dials that counted off other cycles.

More…
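– For a feel of the sort of cycle counting those “spiral grooves” imply, here is a quick back-of-the-envelope check of the classic Metonic relation (235 lunar months is almost exactly 19 solar years), one of the luni-solar cycles ancient calendars were built around. This is my own illustration using standard mean astronomical values, not anything taken from the researchers’ reconstruction.

```python
# Back-of-the-envelope check (illustrative only, not the researchers' work):
# how closely 235 lunar (synodic) months line up with 19 solar years,
# the Metonic relation a geared luni-solar calendar would have to embody.
SYNODIC_MONTH_DAYS = 29.530589   # mean interval between successive new moons
TROPICAL_YEAR_DAYS = 365.24219   # mean solar year

months, years = 235, 19

lunar_days = months * SYNODIC_MONTH_DAYS
solar_days = years * TROPICAL_YEAR_DAYS

print(f"235 synodic months: {lunar_days:.2f} days")
print(f"19 tropical years : {solar_days:.2f} days")
print(f"mismatch          : {(lunar_days - solar_days) * 24:.1f} hours")
```

The two periods agree to within a couple of hours over 19 years, which is exactly the sort of relation a gear train with the right tooth counts can reproduce mechanically.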

Administration Lawyer Claims Link Between CO2 and Warming “Cannot Unequivocally Be Established”

Thursday, November 30th, 2006

– Every day we delay taking decisive action against Global Climate Change is a day in which the power of the coming changes increases. Even if all CO2 emissions ceased entirely today, the experts tell us that the material already in the atmosphere will still produce strong effects in the coming decades as the Earth’s massive and ponderous climate shifts into a new configuration to accommodate it. Someday, the names of those who advocate delay now will be cursed by future generations.
———————————

Yesterday, the Supreme Court heard oral arguments in Massachusetts v. EPA. The outcome of the case will “likely determine whether the [Environmental Protection Agency] can regulate [greenhouse gas emissions] from power plants and other industries.”

Deputy Solicitor General Gregory Garre, who argued the case for the administration, admitted to the Justices that he had limited knowledge of climate science. “I am not an expert on global climate change,” Garre said.

Despite being uninformed in this “extraordinarily complex area of science,” Garre tried to introduce an element of doubt into the link between greenhouse gas emissions and climate change. From Slate’s account of the arguments:

Justice Stephen Breyer lights into Garre for some of the agency’s silly reasoning in declining to regulate the emissions. When Garre says that scientific uncertainty alone can justify the EPA’s refusal to regulate, Justice John Paul Stevens asks whether it matters that even the scientists who worked on the National Research Council study on global warming felt there was less scientific uncertainty than the EPA claimed. Garre insists that there is a “likely connection” between greenhouse gases and global warming but that “it cannot unequivocally be established.”

There is no doubt among the experts. The Intergovernmental Panel on Climate Change (IPCC), a body which involves thousands of scientists from over 120 countries who develop detailed reports on climate change, produced a report in 2001 which was reviewed by more than 1,000 top experts, including so-called “climate skeptics” and representatives from industry. The report stated, “There is new and stronger evidence that most of the warming observed over the last 50 years is attributable to human activities.”

Most recently, the National Academy of Sciences unequivocally concluded that natural causes cannot explain the unprecedented warmth over the last 400 years, and “human activities are responsible for much of the recent warming.”

To the original article:

EXPERTS WANT TIGHTER CONTROLS ON NANOTECHNOLOGY

Thursday, November 30th, 2006

– It is one of the signature attributes of mankind that, as we’ve used our intelligence to bull our way to dominance of the planet and the biosphere, we’ve repeatedly underestimated the effects of our actions on the world around us.

– Rachel Carson’s book, Silent Spring, was perhaps our first major wake-up call in this regard. Today, the world’s soil, streams and oceans swarm with chemicals of all sorts that have no analogues in the natural world and are, in many cases, having unexpected and damaging effects on the planet’s biological forms – including us.

– Reviewing Kurt Vonnegut’s 1963 book Cat’s Cradle, in which he introduced us to the hypothetical Ice-Nine, is instructive at this point as we embark on releasing larger and larger quantities of nanotechnological materials into the natural environment and, once again, assuming that all will be all right. It is an amazing assumption that we seem to make over and over again so that caution will not get in the way of profits.

——————————–

WASHINGTON (AFP)—Nanotechnologies pose real threats to health and the environment and need prompt testing and oversight, but government and industry are moving slowly on the issue, scientists and environmentalists said.

Speaking after the U.S. Environmental Protection Agency took its first step to regulate a nanomaterial–near atomic-sized particles of silver being used as pesticide in products from shoes to a washing machine–experts told AFP that nanotechnology is already producing materials that can harm the environment and human health.

“There are some very serious concerns about potential health consequences,” said Patrice Simms of the U.S. Natural Resources Defense Council (NRDC).

“We know next to nothing about their potential health effects,” said Simms.

Nanotechnology is the creation and use of materials barely larger than atomic in scale, measuring usually between one and 100 nanometers. A nanometer is one-billionth of a meter, and a human hair is roughly 80,000 nanometers in width.

At that size–small enough to pass through cell membranes in the body–many materials can take on physical and chemical properties not seen in their larger forms, giving them uses never imagined before.

A Washington-based group, The Project on Emerging Nanotechnologies, has catalogued 356 products already using nanotechnology, including “breathable” bedsheets, lighter, stiffer golf clubs, skin care creams, computer chips and antibacterial socks.

The technology also promises more substantial “miracle” uses, from health applications like cancer treatments, to drinking water filtration systems for poor countries, to longer-life batteries.

But materials at that size may also pose dangers when they are inhaled, ingested, absorbed through the skin, or spread through nature by wind and water, scientists warn.

“Something different happens when you begin to work at a very small scale,” said Andrew Maynard, chief science advisor at the Project on Emerging Nanotechnologies.

“We know that a lot of materials like asbestos and particles affect the health because of their shapes and sizes as well as their chemistry.

“It’s reasonable to assume that some of these new materials are going to do the same thing,” he said, noting that there are a number of new nanomaterials in filament form, like asbestos, which causes lung disease.

The problem is that both industry and the government have assumed the existing regulatory framework for chemicals and other materials is adequate, Simms pointed out.

More…
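– Just to make the scale in the piece above concrete, a trivial bit of arithmetic on the figures it quotes (my own illustration, nothing more):

```python
# Scale comparison built only from the numbers quoted in the article above.
HAIR_WIDTH_NM = 80_000            # a human hair, roughly, in nanometers
NANOPARTICLE_RANGE_NM = (1, 100)  # typical nanomaterial sizes

for size_nm in NANOPARTICLE_RANGE_NM:
    ratio = HAIR_WIDTH_NM / size_nm
    print(f"a {size_nm} nm particle is about {ratio:,.0f} times "
          f"narrower than a human hair")
```

Even the largest particles in that range are hundreds of times narrower than a hair; small enough, as the article notes, to pass through cell membranes.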

The dollar’s slide: How far, how hard?

Wednesday, November 29th, 2006

– The US Trade Deficit and the US National Debt are, in many people’s minds, getting seriously overextended, and there will come a time when this perception gains general traction with those foreign investors who like to stockpile their money in US Bonds because of their safe and certain returns. When they decide too much is too much and shift their investments elsewhere, the US is going to come in for a hard financial landing. Since the US Trade Deficit and National Debt continue to rise year after year as if there is no tipping point, one can only wonder when we’ll cross it and everything will suddenly change. This possibility is just one of the many threads that make up the Perfect Storm hypothesis.

———————————–

The currency sank about 2.5 percent against the euro in the last 5 sessions. More losses may be coming.

(Fortune Magazine) — U.S. currency traders gorged on Thanksgiving turkey and took a half day last Friday while the rest of the world quietly bet against the dollar.

At first, it looked like a handful of speculators were taking advantage of light trading volume, which makes it easier to move a market up or down. But then more players started lining up against the greenback, too, and the worries hit harder than post-holiday indigestion.

The dollar has tumbled about 2.5 percent against the euro in the five sessions through Tuesday. Although the greenback came back a bit Wednesday, the dollar’s near its weakest against the euro since March 2005. The dollar also fared badly against the British pound, though it’s done slightly better against the lowly Japanese yen.

“With the dollar debacle, the health of the economy, current and future, is on trial,” said Brian Wesbury, chief economist at First Trust Advisors.

More…

061130 – Thursday – Correspondence

Wednesday, November 29th, 2006

Dennis:

Your latest email suggested that your pessimism is deepening. How can that be in light of the fact that the Bushies apparently were unable to use Diebold’s machines to win? A small victory, but one nonetheless.

D.

——————–

D.,

Sorry for my slow response. I’ve been a busy boy since I arrived here in New Zealand two weeks ago.

Well, you pose an interesting question. And, if I judged the world’s probable future solely through the lens of US politics, I think it would, indeed, indicate misplaced pessimism on my part. The current turn away from Bush’s policies is a good sign. I wish, however, that I felt the change of direction was due to the Democrats offering up a new and persuasive vision for the country’s future but, I fear instead, that it was due more to a series of serious misadventures on the Republican side which pushed the electorate in the Democrats’ direction.

I recently read a book called Crashing the Gate: Netroots, Grassroots, and the Rise of People-Powered Politics by Zuniga and Armstrong (review) which analyzes why the Republicans have so badly outmaneuvered the Democrats over the last 15 years and what has been and continues to be wrong with the machinery on the Democratic side. The book is written from the POV of a new and vibrant thread on the Democratic side – those young and technical types who have collectively joined forces to influence politics through the Internet. If you’ll recall, we first heard of these folks when Howard Dean’s campaign surged ahead so strongly due to their organizing via the Internet.

So, until people like Zuniga and Armstrong capture the control mechanisms of the Democratic Party Machinery, I think we’ll continue to see more of the same tired strategies which the Republicans have long since learned to organize around and bulldoze through.

But, in a fundamental way, all of that is peripheral to my concerns for the world’s probable future. The wellsprings of why the world is on a collision course with disaster have much more to do with our inborn biological imperatives than with which country is currently sitting atop the dog pile.

I don’t know if you and I have discussed the concept of Biological Imperatives before or not. The idea is that all biological forms here on earth, from very near the beginning of biological evolution until the present, share deep inborn imperatives to propagate their genes forward in time and to create and protect spaces within which their progeny can grow to maturity so that they can, in their turn, propagate their genes forward as well. It is a strategy which has served all of biology well up until now. But now one species, us, has become so powerful that we’ve broken free of all the checks and balances of the natural world, and we’ve grown until we’ve covered the planet. With no more frontiers to conquer and no more spaces to fill, this strategy has finally, after billions of years, come to the place where its applicability has run out. A new strategy that acknowledges limits has to be implemented, or we are going to self-destruct and take much of the biosphere with us.

So, if you buy this hypothesis, then what’s going on with global politics is of only marginal importance. I tend to think of the Democrats and the Republicans these days as Tweedle-Dee and Tweedle-Dum. Neither of them seems to have even the faintest grasp of what’s going on and what the stakes are.

Dennis Gallagher
samadhisoft.com
in Aotearoa/New Zealand

The Storm Perfected by Jim Kunstler

Tuesday, November 28th, 2006

– Jim Kunstler is the author of The Long Emergency: Surviving the Converging Catastrophes of the Twenty-First Century which is an excellent analysis of why the demise of cheaply available oil will inevitably lead to the demise of suburbia and all of the consequences which will follow from that.
————————–

November 27, 2006
Last week, I had one of those clarifying moments when the enormity of the American fiasco stirred my livers and lights again. I was riding in a car at sundown between St. Cloud and Minneapolis on I-94 through a fifty-mile-plus corridor of bargain shopping infrastructure on each side of the highway. The largest automobile dealerships I have ever seen lay across the edge of the prairie like so many UFO landing strips, with eerie forests of sodium-vapor lamps shining down on the inventory. The brightly colored signs of the national chain fried food parlors vied for supremacy of the horizon with the big box logos. The opposite lane was a blinding river of light as the cars plied north from the Twin Cities to these distant suburbs in the pre-Thanksgiving rush hour.

All that tragic stuff deployed out on the prairie was but the visible part of the storm now being perfected for us. On the radio, Iraq was coming completely apart and with it the illusion of America being able to control a larger set of global events — with dire implications for all the glowing plastic crap along the interstates, and the real-live people behind the headlights in those rivers of cars.

The main fresh impression I had amidst all this is how over it is. The glowing smear of auto-oriented commerce along I-94 (visible from space, no doubt) had the look of being finished twenty minutes ago. Beyond the glowing logos lay the brand new residential subdivisions full of houses that now may never be sold, put up by a home-building industry in such awful trouble that it may soon cease to exist. If suburbia was the Great Work of the American ethos, then our work is done. We perfected it, we completed it, and, like a brand new car five minutes after delivery, it has already lost much of its value.

More…

Web Tool Said to Offer Way Past the Government Censor

Tuesday, November 28th, 2006

– I like this idea.  I think all of us should have free and unfettered access to information so we can each make up our own minds about things.   I oppose governments which try to control access to information.

———————————————-

TORONTO, Nov. 21 — Deep in a basement lab at the University of Toronto, a team of political scientists, software engineers and computer-hacking activists, or “hacktivists,” has created the latest, and some say most advanced, tool yet for allowing Internet users to circumvent government censorship of the Web.

The program, called psiphon (pronounced “SY-fon”), will be released on Dec. 1 in response to growing Internet censorship that is pushing citizens in restrictive countries to pursue more elaborate and sophisticated programs to gain access to Western news sites, blogs and other censored material.

“The problem is growing exponentially,” said Ronald Deibert, director of the University of Toronto’s Citizen Lab, which designed psiphon. “What might have started as censorship of pornography and Western news organizations has expanded to include blogging sites, religious sites, health information sites and many others.”

Psiphon is downloaded by a person in an uncensored country (psiphon.civisec.org), turning that person’s computer into an access point. Someone in a restricted-access country can then log into that computer through an encrypted connection and, using it as a proxy, gain access to censored sites. The program’s designers say there is no evidence on the user’s computer of having viewed censored material once the user erases the Internet history after each use. The software is part of a broader effort to live up to the initial hopes human rights activists had that the Internet would provide unprecedented freedom of expression for those living in restrictive countries.

“Governments have militarized their censorship efforts to an incredible extent so we’re trying to reverse some of that and restore that promise that the Internet once had for unfettered access and communication,” Dr. Deibert said.

When it opened in 2000, the Citizen Lab, which is one of four institutions in the OpenNet Initiative (opennetinitiative.org), was actively monitoring a handful of countries, mainly China, Iran and Saudi Arabia, that censored the Internet. But citing increased filtering by governments, the lab now monitors more than 40 countries.

The program’s designers say existing anticensorship programs are too complicated for everyday computer users, leave evidence on the user’s computer and lack security in part because they have to be advertised publicly, making it easy for censors to detect and block access to them.

“Now you will have potentially thousands, even tens of thousands, of private proxies that are almost impossible for censors to follow one by one,” said Qiang Xiao, director of the China Internet Project at the University of California, Berkeley.

Instead of publicly advertising the required login and password information, psiphon is designed to be shared within trusted social circles of friends, family and co-workers. This feature is meant to keep the program away from censors but is also the largest drawback because it limits efforts to get the program to as many people as possible.

The software is also designed to allow users to post on blogs and other Web sites like Wikipedia, which has been a problem for some other anticensorship programs. By requiring only login information and no installation, psiphon is intended for anyone with basic computer knowledge because psiphon functions much the same as any typical browser.

“So far it’s been tech solutions for tech people,” said Dmitri Vitaliev, a human rights activist in Russia who has been testing psiphon in countries where the Internet is censored. “We have not had very good tools so everyone has been eagerly awaiting psiphon.”

Original article on the NY Times site here…
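– The description above of how psiphon works comes down to a very old idea: a trusted machine in an uncensored country fetches pages on your behalf and relays them back. The sketch below is only a toy illustration of that general relay pattern, not psiphon’s code; a real tool would add the encrypted connection and shared login the article describes, and the host, port and “url” query parameter here are hypothetical choices of my own.

```python
# Toy relay sketch (illustration only, NOT psiphon). A friend in an
# uncensored country runs this; a remote user then requests
#   http://<friend's machine>:8080/fetch?url=<censored page>
# and this machine retrieves the page and passes it back.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs
from urllib.request import urlopen

class TinyRelay(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        target = query.get("url", [None])[0]
        if not target:
            self.send_error(400, "missing url parameter")
            return
        with urlopen(target) as response:   # fetch from the open Internet
            body = response.read()
        self.send_response(200)             # relay the bytes back
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), TinyRelay).serve_forever()
```

What makes psiphon interesting is not the relay itself but the packaging: the encrypted connection, logins passed along trusted social circles, and nothing to install for the person behind the censor’s wall.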

Fragmentation Quickly Destabilizes Amazon Rain Forest

Tuesday, November 28th, 2006

The towering–and tiny–trees of the Amazon can live for hundreds of years. But a 22-year study of what happens when the rain forest is sliced up by timber cutting, cattle ranching and soy farming has revealed that survivors in various fragments do not last for long. “In just two decades,” notes William Laurance of the Smithsonian Tropical Research Institute, who led the study, “a wink of time for a thousand-year-old tree, the ecosystem has been seriously degraded.”

Since 1980, researchers have been studying 40 different one-hectare plots in nine rain forest fragments in central Amazonia near Manaus. Covering roughly 32,000 individual trees representing 1,162 species, 24 of the hectare plots rest near the edges of the remnant fragments while 16 lie deep within intact interiors. Comparing the two reveals that trees located on the edges of such fragments quickly perish, dying nearly three times faster than their interior peers. “When you fragment the rain forest, hot winds from the surrounding pastures blow into the forest and kill many trees, which just can’t handle the stress,” explains team member Henrique Nascimento of Brazil’s National Institute for Amazonian Research. “Also, winds build up around the fragment and knock down a lot of trees.”

Although overall tree species richness did not change over the two decades of the study, the type of species that predominated at the edges changed radically: from specialized trees capable of persisting in the dark understory to so-called generalist species. “These species are fast-growth, short-lived species with low wood density,” Nascimento explains, such as Cecropia sciadophylla, which has increased by more than 3,000 percent after fragmentation. Such edge fragments are also highly unstable, with one species replacing another in rapid succession, and the trees themselves remain generally smaller than their undisturbed, towering brethren in the interior.

More…

Mysterious Stabilization of Atmospheric Methane May Buy Time in Race to Stop Global Warming

Tuesday, November 28th, 2006

Since 1978 chemists at the University of California, Irvine, have been collecting air in 40 locations from northern Alaska to southern New Zealand. Using gas chromatography, the scientists have measured the levels of methane–CH4–in the lowest layer of our atmosphere. Although not nearly as abundant as carbon dioxide–CO2–methane remains the second most important greenhouse gas, both because each molecule of CH4 in the atmosphere traps 23 times as much heat as carbon dioxide and because it helps create more ozone–yet another greenhouse gas–in the atmosphere. During the two decades of measurements, methane underwent double-digit growth as a constituent of our atmosphere, rising from 1,520 parts per billion by volume (ppbv) in 1978 to 1,767 ppbv in 1998. But the most recent measurements have revealed that methane levels are barely rising anymore–and it is unclear why.

Chemist Isobel Simpson led the research examining samples from 1998 through 2005 and found that methane levels had practically stopped rising, reaching 1,772 ppbv in 2005. During this period some years did see rises while others actually saw slight decreases, according to the paper presenting the result in the November 23 issue of Geophysical Research Letters. By also measuring levels of ethane (C2H6) and perchloroethylene, or perc (C2Cl4), the researchers determined that these pulses in methane levels during this period could be linked to major forest fires, such as the massive burn in Indonesia from late 1997 to early 1998. “All three of these molecules are removed by the same process–reaction with hydroxyl,” a radical formed from water in the atmosphere, explains Nobel Prize-winning chemist F. Sherwood Rowland, who participated in the research. “Both methane and ethane are produced in biomass burning, but perc is an industrial solvent. If biomass burning is the source, then perc [levels] should behave quite differently from the two hydrocarbons, and this is what we observed.”

But that does not solve the larger question of why methane in the atmosphere seems to have reached a plateau. “The scientific community agrees that the pause is source-driven rather than sink-driven, that is, caused by decreasing emissions of methane,” Simpson says. “I don’t believe we have reached a consensus on which sources have decreased and by how much.” Leading hypotheses include: the collapse of the Soviet Union, which resulted in a decline in energy use in Russia and the other former Soviet republics; repairs to oil and gas lines to prevent leaks; decreasing emissions from coal mining; widespread drought that led to decreased emissions from natural wetlands; and a decline in rice production. “The trends of major man-made sources such as rice fields and cattle have greatly slowed down over the last two decades,” notes physicist Aslam Khalil of Portland State University. “As these–rice and cattle–were once big sources, their lack of continued increase would then cause atmospheric methane to stop increasing as well.”

More…
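– To put numbers on “barely rising anymore,” here is the arithmetic on the concentrations quoted above; a back-of-the-envelope check of my own, not part of the published analysis:

```python
# Growth arithmetic using only the concentrations quoted in the article.
ppbv_1978, ppbv_1998, ppbv_2005 = 1520, 1767, 1772

total_78_98 = (ppbv_1998 - ppbv_1978) / ppbv_1978 * 100
annual_78_98 = ((ppbv_1998 / ppbv_1978) ** (1 / 20) - 1) * 100
total_98_05 = (ppbv_2005 - ppbv_1998) / ppbv_1998 * 100

print(f"1978-1998: +{total_78_98:.1f}% total (about {annual_78_98:.2f}% per year)")
print(f"1998-2005: +{total_98_05:.1f}% total")
```

Roughly a 16 percent rise over the first twenty years against a fraction of a percent over the last seven: that is the plateau the researchers are trying to explain.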

Scientists fear results of collapsed ice shelf

Tuesday, November 28th, 2006

By JOHN HENZELL, The Press newspaper, Christchurch, New Zealand

The Ross Ice Shelf, a raft of ice the size of France, could collapse quickly, triggering a dramatic rise in sea levels, scientists warn.

A New Zealand-led drilling team in Antarctica has recovered three million years of climate history, but the news is not good for the future.

Initial analysis of sea-floor cores near Scott Base suggests the Ross Ice Shelf had collapsed in the past and had probably done so suddenly.

The team’s co-chief scientist, Tim Naish, said the sediment record was important because it provided crucial evidence about how the Ross Ice Shelf would react to climate change, with potential to dramatically increase sea levels.

“If the past is any indication of the future, then the ice shelf will collapse,” he said.

“If the ice shelf goes, then what about the West Antarctic Ice Sheet? What we’ve learnt from the Antarctic Peninsula is that when once-buttressing ice shelves go, the glaciers feeding them move faster and that’s the thing that isn’t so cheery.”

Antarctica stores 90 per cent of the world’s ice, with the West Antarctic Ice Sheet holding an estimated 30 million cubic kilometres.

In January, British Antarctic Survey researchers predicted that its collapse would make sea levels rise by at least 5m (16ft), with other estimates predicting a rise of up to 17m (55ft).

More…