High Risk Investment That Brought Down The U.S. Economy Returns, With A New Name

February 10th, 2015

When a restaurant fails a health code inspection, sometimes the easiest thing to do is to close up shop, let people forget what happened, then slap a new sign on the door and reopen under a new name. That’s essentially what the world’s biggest banks are doing with a complex, high-risk investment product that helped destroy the global economy less than eight years ago.

Goodbye, “collateralized debt obligations.” Hello, “bespoke tranche opportunities.” Banks including Goldman Sachs are marketing that newfangled product, according to Bloomberg, and total sales of “bespoke tranche opportunities” leaped from under $5 billion in 2013 to $20 billion last year.

Like other derivatives, these “BTOs” allow investors to place wagers on the outcome of various loans, bonds, and securities in which they are not directly invested. Hedge funds and other sophisticated financial industry actors use derivatives both as a form of insurance to manage the total risk they are exposed to across their whole investment portfolio, and to gamble on real-world economic events such as mortgage payments, municipal bonds, and the price of physical commodities. The resulting web of complicated contracts can be very difficult to untangle, and can involve impossible-sounding amounts of money. The Financial Crisis Inquiry Commission concluded that derivatives “were at the center of the storm” and “amplified the losses from the collapse of the housing bubble by allowing multiple bets on the same securities.” In 2010, the total on-paper value of every derivative contract worldwide was $1.4 quadrillion, or 23 times the total economic output of the entire planet.

Collateralized debt obligations (CDOs) are a form of derivative that breaks one pool of financial assets — either direct loans or securities that are based on groups of loans — into multiple layers of riskiness. Those layers are called tranches, and investors who buy the least-risky tranche of the derivative will get paid before those who buy the second tranche, and so on. Banks selling traditional CDOs had to create these multiple risk tranches based on a given set of loans or securities, and then hope that someone would buy each of them.
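The seniority ordering described above can be illustrated with a toy payment "waterfall." This is a minimal sketch for illustration only; the numbers and the single `waterfall` function are hypothetical and not drawn from any actual CDO, which would involve far more complicated contract terms.

```python
def waterfall(collections, tranche_sizes):
    """Distribute cash collected from a loan pool to tranches
    in order of seniority.

    collections: total cash collected from the underlying loans.
    tranche_sizes: amounts owed to each tranche, most senior first.
    Returns the payment each tranche actually receives.
    """
    payments = []
    remaining = collections
    for owed in tranche_sizes:
        paid = min(owed, remaining)  # senior claims are filled first
        payments.append(paid)
        remaining -= paid
    return payments

# A pool owes three tranches 50, 30, and 20, but collects only 60:
# the senior tranche is paid in full, the middle tranche gets 10,
# and the most junior tranche is wiped out.
print(waterfall(60, [50, 30, 20]))  # [50, 10, 0]
```

The point of the example is that losses flow from the bottom up: the junior tranche absorbs the first shortfall, which is why it carries the highest risk and, in normal times, the highest promised return.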

The new “bespoke” version of the idea flips that business dynamic around. An investor tells a bank what specific mixture of derivatives bets it wants to make, and the bank builds a customized product with just one tranche that meets the investor’s needs. Like a bespoke suit, the products are tailored to fit precisely, and only one copy is ever produced. The new products are a symptom of the larger phenomenon of banks taking complex risks in pursuit of higher investment returns, Americans for Financial Reform’s Marcus Stanley said in an email, and BTOs “could be automatically exempt” from some Dodd-Frank rules.

This is not the first time that large banks have tried to reboot the CDO machine since the financial crisis made those products a much-reviled household name. In early 2013, JP Morgan Chase and Morgan Stanley tried and failed to find buyers for a new set of CDOs. The nature of that failure helps illuminate the rationale behind the new version of the product. Finding buyers for the various different layers of risk was “like trying to line up boxcars,” one investor told the Financial Times after the 2013 reboot effort fizzled. Many of the firms that used to buy such products prior to the crisis “no longer exist, and those that survive have very bad memories” of the experience, another analyst said.

Since then, those same old characters seem to have found a way to get back into the business. In addition to Goldman, which narrowly avoided criminal charges after a Senate investigation revealed its shady pre-crisis mortgage dealings, sellers of “bespoke tranche opportunities” now include Citigroup and the French banking giant BNP Paribas. BNP’s recent notoriety doesn’t relate to the financial crisis, but rather to the bank’s violation of various U.S. sanctions against Iran, Cuba, and Sudan. And while Citigroup’s past leadership now says financial deregulation was a mistake and that megabanks like Citi should be broken up to protect the economy, its current leadership is chipping away at key Dodd-Frank reforms. Citi was also heavily involved in the “robosigning” scandal that led to hundreds of thousands or even millions of unjust foreclosures.

– to the Original:  ➡

 

No, climate models aren’t exaggerating global warming

February 10th, 2015

Weather and climate agencies around the world have been almost unanimous in declaring 2014 the hottest year on record — something that has prompted considerable chagrin among climate change doubters. That’s because these “skeptics” have long sought to cast doubt on man-made global warming by pointing to an alleged global warming “pause” or “slowdown” — going on to suggest that the computerized climate models that scientists use to project future temperatures are flawed, and overestimate carbon dioxide’s warming effect.

So, is that true? Do the models consistently overestimate the warming effects of greenhouse gases like CO2?

As a recent study suggests, the answer is no. While many models didn’t predict the relatively modest surface-warming “hiatus,” it’s not because they’re biased in favor of greenhouse-gas emissions’ warming effects. Rather, researchers report in Nature, these computer simulations just struggle to predict “chaotic” (or random) short-term changes in the climate system that can temporarily add or subtract from CO2 emissions’ warming effects.

It’s true that air temperatures have risen more slowly over the past 15 years or so than climate models, on average, predicted. And scientists are slowly beginning to figure out why temperatures didn’t rise quite as much as expected.

One probable contributor is pure natural variability: Cyclical processes in the Earth’s climate and temporary changes in the amount of solar radiation that reach the Earth’s surface can introduce “blips” into the Earth’s warming trend. Right now, oceans may be temporarily sucking up more heat from the atmosphere than they normally do. Moreover, a temporary downturn in solar output and an increase in light-reflecting aerosol pollution (acting like a chemical sunblock of sorts) could also have partially masked CO2-driven warming.

But researchers Jochem Marotzke of the Max Planck Institute of Meteorology and Piers M. Forster of the University of Leeds also wanted to check whether climate models are biased, by testing how their temperature predictions stack up against reality. So the researchers tested how 114 model simulations that underpin last year’s assessment report of the U.N. Intergovernmental Panel on Climate Change (IPCC) performed — not just for the 15-year period from 1998-2012 but for all 15-year periods stretching back to 1900. If this analysis showed that the models consistently overestimated or underestimated the amount of warming that actually occurred, that would point to some sort of systematic bias.

As it turns out, however, the models did pretty well. In each 15-year period, the model simulations produced a range of predictions. But each 15-year interval’s actual temperature trend always fell somewhere in the models’ prediction range. Moreover, even when 15-year actual temperature trends did fall toward the edges of the corresponding predicted ranges, they weren’t consistently at the higher or lower edges. Basically, when the models were missing the mark, they weren’t doing so consistently in one direction.
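The basic test described above can be sketched in a few lines: for each 15-year window, compute the observed warming trend and ask whether it falls inside the range of trends produced by the model ensemble. This is only an illustrative sketch with made-up synthetic data, not the study's actual method or data (the real analysis used 114 IPCC-class simulations against the observational temperature record, and more careful statistics than a simple min-max range).

```python
import random

def trend(series):
    """Ordinary least-squares slope of a time series (degrees per year)."""
    n = len(series)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(series) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

def inside_ensemble_range(observed, ensemble, window=15):
    """For each 15-year window, check whether the observed trend
    falls within the min-to-max spread of the ensemble's trends."""
    results = []
    for start in range(len(observed) - window + 1):
        obs_trend = trend(observed[start:start + window])
        model_trends = [trend(run[start:start + window]) for run in ensemble]
        results.append(min(model_trends) <= obs_trend <= max(model_trends))
    return results

# Synthetic illustration: a 0.02 degree/year underlying trend plus
# random year-to-year noise, for both "observations" and 20 "model runs".
random.seed(0)
def synthetic(years=45):
    return [0.02 * t + random.gauss(0, 0.1) for t in range(years)]

observed = synthetic()
ensemble = [synthetic() for _ in range(20)]
hits = inside_ensemble_range(observed, ensemble)
print(f"{sum(hits)}/{len(hits)} windows fall inside the ensemble range")
```

A systematic bias would show up here as observed trends repeatedly falling outside (or always at one edge of) the ensemble's spread; the study found neither pattern.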

So, it’s true that the IPCC model runs didn’t predict the recent warming slowdown. But as these findings show, they didn’t accurately predict certain other 15-year periods of warming accelerations or slowdowns in the past either, and it’s not because they were always overestimating warming. Indeed, in some 15-year periods, the models underestimated warming. Essentially, that means climate skeptics are cherry-picking when they point out that climate models didn’t predict the recent 15-year hiatus.

That doesn’t entirely explain why the model simulations in a given year produced varying results to begin with, though. Was it due to differences in the underlying physics coded into the models? (The models differ slightly in terms of how much light they assume hits the Earth, how “sensitive” temperatures are to changes in CO2, and how much heat the oceans suck up.) Or was it just random fluctuations in the climate system? Or a combination? The researchers did a statistical analysis to answer that question.

In the end, none of those physical reasons was a major factor. Random fluctuations had 2.5 times the impact on the model predictions’ variations as all those physical factors together did, the researchers found. Only when the researchers used longer-term intervals (of more than 60 years) did differences in sunlight amount, ocean heat trapping or climate sensitivity start to make a big difference.

So climate models may not provide the perfect picture of what will happen to temperatures in a given short-term period (on 10- or 20-year scales). But maybe they simply can’t, due to the random ways in which climate can temporarily fluctuate. That doesn’t mean that climate models aren’t valuable to us. They still give us a good sense of the long-term picture, the one that is more important for us to worry about anyway: that temperatures are increasing, and that natural factors can’t explain this increase.

As the researchers argue, then, their findings ought to put to rest assertions by climate “skeptics” that climate models overestimate how much warming we’re going to get.

– to the Original:  ➡

 

Red state, red power: Nebraska’s publicly-owned electricity system

February 1st, 2015

– Here’s a story I love and it reminds me of another story I posted some time ago here.  That one is about a different kind of Bank in North Dakota – another Red state.

-dennis

= = = = = = = = = = = = = = = = = = =

Republican Nebraska’s energy is all publicly owned or cooperative, writes Thomas M. Hanna, and prices are among America’s lowest, with great service standards and a strong commitment to renewables. Decentralised and locally accountable, this could be the model that replaces inefficient, unresponsive monopolies – both nationalised and corporate.

Around the world, people often assume that in the United States, home to a no-holds-barred version of ‘free market’ capitalism, private ownership operates more or less across the board.

There is, however, a rich and robust history and experience of public ownership throughout the country – often found in the least expected of places.

For instance, there is one state where every single resident and business receives electricity from a public or community-owned institution rather than a for-profit corporation.

It is not a famously liberal state like Vermont or Massachusetts. Rather, it is conservative Nebraska, with its two Republican Senators and two (out of three) Republican members of Congress, that has embraced the complete socialization of energy distribution.

The ‘red states’ – named after the color now given to states that vote Republican in elections – are often ‘red’ in more ways than one.

Public and cooperative ownership for good service, low prices

In Nebraska, 121 publicly-owned utilities, 10 cooperatives, and 30 public power districts provide electricity to a population of around 1.8 million people. Public and cooperative ownership keeps costs low for the state’s consumers.

Nebraskans pay one of the lowest rates for electricity in the nation and revenues are reinvested in infrastructure to ensure reliable and cheap service for years to come. “There are no stockholders, and thus no profit motive”, the Nebraska Power Association proudly proclaims.

“Our electric prices do not include a profit. That means Nebraska’s utilities can focus exclusively on keeping electric rates low and customer service high. Our customers, not big investors in New York and Chicago, own Nebraska’s utilities.”

Payments (in lieu of taxes) from the state’s publicly-owned utilities exceed $30 million a year and support a variety of social services throughout the state, including the public education system.

Nebraska has a long history of publicly-owned power systems dating back to the beginnings of electrification in the late 1800s. Initially, these co-existed with small private utilities. However, in the post-World War One era, large corporate electric holding companies backed by Wall Street banks entered the market and began taking over smaller private and municipal systems.

Using their financial and political power, these corporations dramatically consolidated the power industry in Nebraska and attempted to stop new cooperatives and publicly-owned utilities from forming. During this time more than one third of the state’s municipal utilities were sold to private corporations.

Tired of abusive corporate practices, in 1930 residents and advocates of publicly-owned utilities took a revenue bond financing proposal straight to the voters, bypassing the corporate-influenced legislature which had previously failed to pass similar legislation.

It was approved overwhelmingly – signaling both popular support for publicly-owned utilities in the state and also the beginnings of their resurgence. Led by powerful Nebraska Senator George W. Norris – the driving force behind the publicly-owned Tennessee Valley Authority – a series of state and federal laws were passed including:

  • the state’s Enabling Act (1933) which allowed 15% of eligible voters in an area to petition for a decision on a publicly-owned utility;
  • the Public Utility Holding Company Act (1935) which forced the breakup and restructuring of corporate electricity monopolies;
  • and the Rural Electrification Act (1936) which provided financing for rural electricity projects.

By 1949, Nebraska had solidified its status as the first and only all-public power state.

Local democracy in action

Local control and the possibility for democratic participation are defining features of Nebraska’s publicly-owned electricity system. At the ground level, public utilities and cooperatives are run by publicly elected power district boards, cooperative boards, or elected city councils (often through appointed boards).

These bodies establish budgets, establish service standards and policies, and set prices. Regularly scheduled meetings of power boards and councils are open to public involvement and comment.

Should they so wish, every Nebraskan has the opportunity to become involved in the decision making of their local electricity provider.

One such example relates to the increasing use and proliferation of renewable energy facilities. While the state remains heavily reliant on coal and nuclear sources to provide low-cost energy to consumers, interest in renewable energy – primarily wind – has taken off in recent years.

In 2003, electricity consumers, many of whom drove more than 100 miles for the event, participated in an eight-hour deliberative polling survey for the Nebraska Public Power District (NPPD) – a public corporation owned by the state of Nebraska that supplies energy to 600,000 people via local publicly-owned utilities and cooperatives.

The topic at hand was the potential addition of more than 200 MW of wind energy by 2010. 96% of the participants supported the wind project, with 50% agreeing it was the right size and 36% wanting it expanded (compared to just 3% who wanted it reduced).

In addition to its other wind power facilities, in 2005 NPPD began operating the Ainsworth Wind Energy Facility, the nation’s second largest publicly-owned wind farm consisting of 36 turbines generating up to 59.5 MW of energy.

In 2011, the state’s energy plan acknowledged both that power generation from wind had doubled every two years since 2006 and that developing just 1 percent of the potential energy from wind in Nebraska would satisfy the state’s entire peak demand.

Moreover, public ownership of electricity generation and distribution in Nebraska is complemented by another seemingly socialist idea – planning. The Nebraska Power Review Board is a state agency that oversees the publicly-owned electricity system.

In addition to its regulatory functions, such as monitoring rate increases and arbitrating conflicts, the five-person Review Board (appointed by the Governor and confirmed by the legislature with party, occupational, and term limit restrictions) “oversees the preparation and filing of a coordinated long-range power supply plan”, as well as the location and construction of new electricity generation facilities.

Decentralised and locally accountable

As demonstrated by Nebraska’s nearly 100 years of experience with a completely public and community-owned electricity system, American experimentation offers an interesting alternative to how public ownership has often been implemented in other parts of the world.

Describing the post-World War Two British public-ownership program, University of Glasgow professor Andrew Cumbers writes:

“The nationalization of the electricity, gas and other utilities resulted in the centralization of many activities that had formerly been locally or municipally owned and subject to a reasonable degree of local democratic control …

“Not only did this eviscerate important traditions of municipal socialism and more democratic forms of public ownership, but it also led to an increasing number of costly and unaccountable decisions (notably the decision to invest in nuclear power) by nationalized entities.”

Such experiences often reinforce the concern that public ownership of larger scale systems can lead to inefficiency, unaccountability, and bureaucracy. But Nebraska demonstrates that this does not necessarily have to be the case.

The principles of subsidiarity and local control can, in fact, be preserved through a networked mix of publicly-owned institutions at various scales without sacrificing efficiency or service quality. Of course, public ownership alone is not a fix-all solution.

It does, however, provide an opportunity for a community, a city, or even a whole region or nation to become actively involved in economic decision making on important matters affecting their lives, their environment, and their future.

– To the Original: ➡


The Davos oligarchs are right to fear the world they’ve made

January 27th, 2015

Escalating inequality is the work of a global elite that will resist every challenge to its vested interests

The billionaires and corporate oligarchs meeting in Davos this week are getting worried about inequality. It might be hard to stomach that the overlords of a system that has delivered the widest global economic gulf in human history should be handwringing about the consequences of their own actions.

But even the architects of the crisis-ridden international economic order are starting to see the dangers. It’s not just the maverick hedge-funder George Soros, who likes to describe himself as a class traitor. Paul Polman, Unilever chief executive, frets about the “capitalist threat to capitalism”. Christine Lagarde, the IMF managing director, fears capitalism might indeed carry Marx’s “seeds of its own destruction” and warns that something needs to be done.

The scale of the crisis has been laid out for them by the charity Oxfam. Just 80 individuals now have the same net wealth as 3.5 billion people – half the entire global population. Last year, the best-off 1% owned 48% of the world’s wealth, up from 44% five years ago. On current trends, the richest 1% will have pocketed more than the other 99% put together next year. The 0.1% have been doing even better, quadrupling their share of US income since the 1980s.

This is a wealth grab on a grotesque scale. For 30 years, under the rule of what Mark Carney, the Bank of England governor, calls “market fundamentalism”, inequality in income and wealth has ballooned, both between and within the large majority of countries. In Africa, the absolute number living on less than $2 a day has doubled since 1981 as the rollcall of billionaires has swelled.

In most of the world, labour’s share of national income has fallen continuously and wages have stagnated under this regime of privatisation, deregulation and low taxes on the rich. At the same time finance has sucked wealth from the public realm into the hands of a small minority, even as it has laid waste the rest of the economy. Now the evidence has piled up that not only is such appropriation of wealth a moral and social outrage, but it is fuelling social and climate conflict, wars, mass migration and political corruption, stunting health and life chances, increasing poverty, and widening gender and ethnic divides.

Escalating inequality has also been a crucial factor in the economic crisis of the past seven years, squeezing demand and fuelling the credit boom. We don’t just know that from the research of the French economist Thomas Piketty or the British authors of the social study The Spirit Level. After years of promoting Washington orthodoxy, even the western-dominated OECD and IMF argue that the widening income and wealth gap has been key to the slow growth of the past two neoliberal decades. The British economy would have been almost 10% larger if inequality hadn’t mushroomed. Now the richest are using austerity to help themselves to an even larger share of the cake.

The big exception to the tide of inequality in recent years has been Latin America. Progressive governments across the region turned their back on a disastrous economic model, took back resources from corporate control and slashed inequality. The numbers living on less than $2 a day have fallen from 108 million to 53 million in little over a decade. China, which also rejected much of the neoliberal catechism, has seen sharply rising inequality at home but also lifted more people out of poverty than the rest of the world combined, offsetting the growing global income gap.

These two cases underline that increasing inequality and poverty are very far from inevitable. They’re the result of political and economic decisions. The thinking person’s Davos oligarch realises that allowing things to carry on as they are is dangerous. So some want a more “inclusive capitalism” – including more progressive taxes – to save the system from itself.

But it certainly won’t come about as a result of Swiss mountain musings or anxious Guildhall lunches. Whatever the feelings of some corporate barons, vested corporate and elite interests – including the organisations they run and the political structures they have colonised – have shown they will fight even modest reforms tooth and nail. To get the idea, you only have to listen to the squeals of protest, including from some in his own party, at Ed Miliband’s plans to tax homes worth over £2m to fund the health service, or the demand from the one-time reformist Fabian Society that the Labour leader be more pro-business (for which read pro-corporate), or the wall of congressional resistance to Barack Obama’s mild redistributive taxation proposals.

Perhaps a section of the worried elite might be prepared to pay a bit more tax. What they won’t accept is any change in the balance of social power – which is why, in one country after another, they resist any attempt to strengthen trade unions, even though weaker unions have been a crucial factor in the rise of inequality in the industrialised world.

It’s only through a challenge to the entrenched interests that have dined off a dysfunctional economic order that the tide of inequality will be reversed. The anti-austerity Syriza party, favourite to win the Greek elections this weekend, is attempting to do just that – as the Latin American left has succeeded in doing over the past decade and a half. Even to get to that point demands stronger social and political movements to break down or bypass the blockage in a colonised political mainstream. Crocodile tears about inequality are a symptom of a fearful elite. But change will only come from unrelenting social pressure and political challenge.

– To the original: ➡

 

As inequality soars, the nervous super rich are already planning their escapes

January 26th, 2015

Hedge fund managers are preparing getaways by buying airstrips and farms in remote areas, former hedge fund partner tells Davos during session on inequality

With growing inequality and the civil unrest from Ferguson and the Occupy protests fresh in people’s mind, the world’s super rich are already preparing for the consequences. At a packed session in Davos, former hedge fund director Robert Johnson revealed that worried hedge fund managers were already planning their escapes. “I know hedge fund managers all over the world who are buying airstrips and farms in places like New Zealand because they think they need a getaway,” he said.

Johnson, who heads the Institute for New Economic Thinking and was previously managing director at Soros, said societies can tolerate income inequality if the income floor is high enough. But with an existing system encouraging chief executives to take decisions solely on their profitability, even in the richest countries inequality is increasing.

Johnson added: “People need to know there are possibilities for their children – that they will have the same opportunity as anyone else. There is a wicked feedback loop. Politicians who get more money tend to use it to get even more money.”

Global warming and social media are among the trends the 600 super-smart World Economic Forum staffers told its members to watch out for long before they became ubiquitous. This year, income inequality is fast moving up the Davos agenda – a sure sign that it is poised to burst into the public consciousness.

Jim Wallis, founder of Sojourners and a Davos star attraction after giving the closing address in 2014, said he had spent a lot of time learning from the leaders behind recent social unrest in Ferguson. He believes that will prove “a catalytic event” which has already changed the conversation in the US, bringing a message from those who previously “didn’t matter”.

So what is the solution for getting these new voices sufficiently recognised that they can actually change the status quo, so that those with power realise these people do matter?

Clarke said: “Solutions are there. What’s been lacking is political will. Politicians do not respond to those who don’t have a voice. In the end this is all about redistributing income and power.”

She added: “Seventy five percent of people in developing countries live in places that are less equal than they were in 1990.”

The panellists were scathing about politicians, Wallis describing them as people who held up wet fingers “to see which way the money is blowing in from.”

Author, philosopher and former academic Rebecca Newberger-Goldstein saw the glass half full, drawing on history to prove society does eventually change for the better. She said Martin Luther King was correct in his view that the arc of history might be long, but it bends towards justice.

In ancient Greece, she noted, even the greatest moralists like Plato and Aristotle never criticised slavery. Newberger-Goldstein said: “We’ve come a long way as a species. The truth is now dawning that everybody matters because the concept of mattering is at the core of every human being.” Knowing you matter, she added, is often as simple as having others “acknowledge the pathos and reality of your stories. To listen.”

Mexican micro-lending entrepreneur Carlos Danel expanded on the theme. His business, Gentera, has thrived by working out that “those excluded are not the problem but realising there’s an opportunity to serve them.”

He added: “Technology provides advantages that can lower costs and enable us to provide products and services that matter to the people who don’t seem to matter to society. And that’s beyond financial services – into education and elsewhere.”

Which, Danel believes, is why business was created in the first place – to serve. A message that seemed to get lost somewhere in the worship of profit.

– To the original: ➡

– Research thanks to Kierin M.

‘It is profitable to let the world go to hell’

January 22nd, 2015

As politicians and business leaders gather in Davos, climate expert Jørgen Randers argues that democracy will continue to hamper climate action

How depressed would you be if you had spent more than 40 years warning of an impending global catastrophe, only to be continually ignored even as you watch the disaster unfolding?

So spare a thought for Jørgen Randers, who back in 1972 co-authored the seminal work Limits to Growth (pdf), which highlighted the devastating impacts of exponential economic and population growth on a planet with finite resources.

As politicians and business leaders gather in Davos to look at ways to breathe new life into the global battle to address climate change, they would do well to listen to Randers’ sobering perspective.

The professor of climate strategy at the Norwegian Business School has been pretty close to giving up his struggle to wake us up to our unsustainable ways, and in 2004 published a pessimistic update of his 1972 report showing the predictions made at the time are turning out to be largely accurate.

What he cannot bear is how politicians of all persuasions have failed to act even as the scientific evidence of climate change mounts up, and as a result he has largely lost faith in the democratic process to handle complex issues.

In a newly published paper in the Swedish magazine Extrakt he writes:

It is cost-effective to postpone global climate action. It is profitable to let the world go to hell.

I believe that the tyranny of the short term will prevail over the decades to come. As a result, a number of long-term problems will not be solved, even if they could have been, and even as they cause gradually increasing difficulties for all voters.

Randers says the reason for inaction is that there will be little observable benefit during the first 20 years of any fiscal sacrifice, even though tougher regulations and taxes will guarantee a better climate for our children and grandchildren.

He has personal experience of this, having chaired a commission in Norway that in 2006 came up with a 15-point plan to solve the climate problem if every Norwegian was willing to pay €250 (£191) in extra taxes every year for the next generation or so.

If the plan had been given the green light, it would have allowed the country to cut its greenhouse gas emissions by two-thirds by 2050 and provide a case study other rich countries could learn from.

He says:

In my mind, the cost was ridiculously low, equivalent to an increase in income taxes from 36% to 37%, given that this plan would eliminate the most serious threat to the rich world in this century.

In spite of this, a vast majority of Norwegians were against this sacrifice. To be frank, most voters preferred to use the money for other causes – like yet another weekend trip to London or Sweden for shopping.

When it comes to more regulation or higher taxes, Randers says voters tend to revolt and, as a result, politicians will continue to refuse to take courageous steps for fear of being thrown out of office at the next election.

“The capitalist system does not help,” says Randers. “Capitalism is carefully designed to allocate capital to the most profitable projects. And this is exactly what we don’t need today.”

– More: ➡

– research thanks to Kael L.

Prosecute Torturers and Their Bosses

January 20th, 2015

– If I were to consider renouncing my U.S. citizenship, this would be very near the top of the list of reasons why.  

– The U.S. executed Japanese soldiers after WWII as war criminals for doing the same stuff the U.S. did here and claimed it was OK.   I don’t think so.

“No amount of legal pretzel logic can justify the behavior detailed in the report. Indeed, it is impossible to read it and conclude that no one can be held accountable. At the very least, Mr. Obama needs to authorize a full and independent criminal investigation.”

– dennis

——————————-

Since the day President Obama took office, he has failed to bring to justice anyone responsible for the torture of terrorism suspects — an official government program conceived and carried out in the years after the attacks of Sept. 11, 2001.

He did allow his Justice Department to investigate the C.I.A.’s destruction of videotapes of torture sessions and those who may have gone beyond the torture techniques authorized by President George W. Bush. But the investigation did not lead to any charges being filed, or even any accounting of why they were not filed.

Mr. Obama has said multiple times that “we need to look forward as opposed to looking backwards,” as though the two were incompatible. They are not. The nation cannot move forward in any meaningful way without coming to terms, legally and morally, with the abhorrent acts that were authorized, given a false patina of legality, and committed by American men and women from the highest levels of government on down.

Americans have known about many of these acts for years, but the 524-page executive summary of the Senate Intelligence Committee’s report erases any lingering doubt about their depravity and illegality: In addition to new revelations of sadistic tactics like “rectal feeding,” scores of detainees were waterboarded, hung by their wrists, confined in coffins, sleep-deprived, threatened with death or brutally beaten. In November 2002, one detainee who was chained to a concrete floor died of “suspected hypothermia.”

These are, simply, crimes. They are prohibited by federal law, which defines torture as the intentional infliction of “severe physical or mental pain or suffering.” They are also banned by the Convention Against Torture, the international treaty that the United States ratified in 1994 and that requires prosecution of any acts of torture.

So it is no wonder that today’s blinkered apologists are desperate to call these acts anything but torture, which they clearly were. As the report reveals, these claims fail for a simple reason: C.I.A. officials admitted at the time that what they intended to do was illegal.

In July 2002, C.I.A. lawyers told the Justice Department that the agency needed to use “more aggressive methods” of interrogation that would “otherwise be prohibited by the torture statute.” They asked the department to promise not to prosecute those who used these methods. When the department refused, they shopped around for the answer they wanted. They got it from the ideologically driven lawyers in the Office of Legal Counsel, who wrote memos fabricating a legal foundation for the methods. Government officials now rely on the memos as proof that they sought and received legal clearance for their actions. But the report changes the game: We now know that this reliance was not made in good faith.

No amount of legal pretzel logic can justify the behavior detailed in the report. Indeed, it is impossible to read it and conclude that no one can be held accountable. At the very least, Mr. Obama needs to authorize a full and independent criminal investigation.

– More: ➡

Scientists react to warmest year: 2014 underscores ‘undeniable fact’ of human-caused climate change

January 19th, 2015

In 134 years of temperature records, the warmth in 2014 exceeded them all, NOAA and NASA announced today.

Unsurpassed heating of the world’s oceans fueled the chart-topping warmth.

Ocean temperatures were more than 1°F above average, NOAA said. They warmed to a new record even in the absence of an El Niño event, a naturally occurring cycle of ocean heating in the tropical Pacific.

“This is the first year since 1997 that the record warmest year was not an El Niño year at the beginning of the year, because the last three have been,”  Gavin Schmidt, who directs the NASA Goddard Institute for Space Studies, told the Post’s Chris Mooney.

Related: It’s official: 2014 was the hottest year in recorded history (Washington Post Wonk Blog)

Land temperatures weren’t quite record-setting, but still ranked 4th warmest since the start of the data set in 1880. California, much of Europe, including the United Kingdom, and parts of Australia all experienced their warmest years.

News of record global warmth may surprise residents of the eastern U.S., which witnessed colder than normal temperatures in 2014. But the chill was an anomaly and, in fact, the eastern U.S. was among the coolest areas of the world compared to normal.

In NOAA’s analysis of global temperatures, 7 of 12 months in 2014 reached record highs, including December.

Thirteen of the warmest 15 years on record have occurred since the year 2000, according to Climate Central, a non-profit science communications organization based in Princeton, New Jersey. The likelihood of this happening by chance, without the assistance of manmade greenhouse emissions, is less than 1 in 27 million, it calculated.
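The "1 in 27 million" figure comes from Climate Central's own statistical modelling, which accounts for trends and year-to-year correlation. As a rough illustration of the kind of reasoning involved (not their actual method), a simple hypergeometric null model can ask: if each year in the 1880–2014 record were equally likely to rank anywhere, what is the chance that at least 13 of the 15 warmest land in one specific 15-year window? The function name and the independence assumption here are illustrative, not from the source:

```python
from math import comb

def prob_top_k_in_window(n_years=135, top_n=15, window=15, k_min=13):
    """Hypergeometric probability that at least k_min of the top_n
    warmest years fall inside a specific window of `window` years,
    assuming rankings are random (no warming trend, no correlation).
    n_years=135 covers 1880-2014 inclusive."""
    total = comb(n_years, top_n)  # ways to place the top_n years
    hits = sum(comb(window, k) * comb(n_years - window, top_n - k)
               for k in range(k_min, min(top_n, window) + 1))
    return hits / total

p = prob_top_k_in_window()
print(f"chance under a no-trend null: about 1 in {1/p:,.0f}")
```

This naive model ignores autocorrelation in the climate system, so it produces an even smaller probability than Climate Central's more careful estimate; the point either way is that the clustering of record years is vanishingly unlikely without a warming trend.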

– More: ➡

 

Planetary Boundaries 2.0 – new and improved

January 18th, 2015

As Science publishes the updated research, four of nine planetary boundaries have been crossed

Four of nine planetary boundaries have now been crossed as a result of human activity, says an international team of 18 researchers in the journal Science (16 January 2015). The four are: climate change, loss of biosphere integrity, land-system change, altered biogeochemical cycles (phosphorus and nitrogen).

Two of these, climate change and biosphere integrity, are what the scientists call “core boundaries”. Significantly altering either of these “core boundaries” would “drive the Earth System into a new state”.

“Transgressing a boundary increases the risk that human activities could inadvertently drive the Earth System into a much less hospitable state, damaging efforts to reduce poverty and leading to a deterioration of human wellbeing in many parts of the world, including wealthy countries,” says lead author Professor Will Steffen, researcher at the Centre and the Australian National University, Canberra. “In this new analysis we have improved our quantification of where these risks lie.”

Other co-authors from the Centre are Johan Rockström, Sarah Cornell, Ingo Fetzer, Oonsie Biggs, Carl Folke and Belinda Reyers.
What’s new?
The new paper is a development of the Planetary Boundaries concept, which was first published in 2009, identifying nine global priorities relating to human-induced changes to the environment. The science shows that these nine processes and systems regulate the stability and resilience of the Earth System – the interactions of land, ocean, atmosphere and life that together provide conditions upon which our societies depend.

The research builds on a large number of scientific publications critically assessing and improving the planetary boundaries research since its original publication. It confirms the original set of boundaries and provides updated analysis and quantification for several of them, including phosphorus and nitrogen cycles, land-system change, freshwater use and biosphere integrity.

Though the framework keeps the same processes as in 2009, two of them have been given new names to better reflect what they represent, and several have now also been assessed on a regional level.

“Loss of biodiversity” is now called “Change in biosphere integrity.” Biological diversity is vitally important, but the framework now emphasises the impact of humans on ecosystem functioning. Chemical pollution has been given the new name “Introduction of novel entities,” to reflect the fact that humans can influence the Earth system through new technologies in many ways.

“Pollution by toxic synthetic substances is an important component, but we also need to be aware of other potential systemic global risks, such as the release of radioactive materials or nanomaterials,” says Sarah Cornell, coordinator of the Planetary Boundaries research at the Centre. “We believe that these new names better represent the scale and scope of the boundaries,” she continues.

In addition to the globally aggregated Planetary Boundaries, regional-level boundaries have now been developed for biosphere integrity, biogeochemical flows, land-system change and freshwater use. At present only one regional boundary (South Asian Monsoon) can be established for atmospheric aerosol loading.

Nine planetary boundaries
1. Climate change
2. Change in biosphere integrity (biodiversity loss and species extinction)
3. Stratospheric ozone depletion
4. Ocean acidification
5. Biogeochemical flows (phosphorus and nitrogen cycles)
6. Land-system change (for example deforestation)
7. Freshwater use
8. Atmospheric aerosol loading (microscopic particles in the atmosphere that affect climate and living organisms)
9. Introduction of novel entities (e.g. organic pollutants, radioactive materials, nanomaterials, and micro-plastics).

– More: ➡

The rate of sea-level rise is ‘far worse than previously thought,’ study says

January 18th, 2015

Researchers have come up with a new and improved way of measuring the rise in the sea level, and the news is not good: The seas have risen dramatically faster over the last two decades than anyone had known.

For hundreds of years, the seas were measured by more or less the equivalent of plopping a yardstick into the ocean and seeing if the water went up or down. But now, that method looks to be outdated.

According to a new study published on Wednesday in Nature, the new method involves an advanced statistical model that analyzes all of the factors contributing to sea rise. It has yielded what appears to be a much more accurate picture of the oceans and suggests previous studies had severely underestimated the acceleration of recent sea rise.

“What this paper shows is that the sea-level acceleration over the past century has been greater than had been estimated by others,” lead author Eric Morrow said in a statement. “It’s a larger problem than we initially thought.” Co-author Carling Hay added in an interview with the BBC: “The acceleration into the last two decades is far worse than previously thought. This new acceleration is about 25 percent higher than previous estimates.”

– More: ➡