Archive for the ‘CrashBlogging’ Category
NSA hiding Equation spy program on hard drives
Wednesday, February 18th, 2015

– In 1999, Motorola, at my request, sent me to Silicon Valley for a week-long course in advanced Windows Win32 programming.
– During this course, I remember talking with another participant; a young computer whiz who was from the NSA.
– He talked about how they (the NSA computer guys) conducted red-team green-team battles to see who could infiltrate the other team’s computer systems.
– But the thing he talked about that caught my interest the most was when he said the hot new frontier was getting into firmware as a way of exerting control over computers remotely. It was a new idea that immediately fascinated me, but once he saw my interest, I think he realized that he might be talking too much and clammed up. He avoided me for the rest of the week.
– The story, below, says that the technique of firmware infiltration may have been around since 2001. I’m sure I heard the sound of the other shoe dropping when I read that.
– The article says:
It is not clear how the NSA may have obtained the hard drives’ source code. Western Digital spokesman Steve Shattuck said the company “has not provided its source code to government agencies.” The other hard drive makers would not say if they had shared their source code with the NSA.
– I don’t find it all that mysterious. How hard would it be for the NSA to field computer-savvy agents directed to seek employment in these companies? Or, as the article says, to require the companies to provide their source code to the NSA for security reviews before the U.S. Government will allow it to be used in U.S. facilities?
– Once the NSA has the firmware’s source code, they can modify it, intercept the firms’ drives in shipment, and refresh the firmware on the intercepted drives with the NSA’s new stuff … which does everything the old firmware did … and a bit more.
– The interception-during-shipment technique was outed over a year ago as being one of their favorite techniques though in that case it had to do with routers.
– dennis
= = = = = = = = = = = = = = = = = = = = = = = = =
The US National Security Agency has figured out how to hide spying software deep within hard drives made by Western Digital, Seagate, Toshiba and other top manufacturers, giving the agency the means to eavesdrop on the majority of the world’s computers, according to cyber researchers and former operatives.
That long-sought and closely guarded ability was part of a cluster of spying programs discovered by Kaspersky Lab, the Moscow-based security software maker that has exposed a series of Western cyberespionage operations.
Kaspersky said it found personal computers in 30 countries infected with one or more of the spying programs, with the most infections seen in Iran, followed by Russia, Pakistan, Afghanistan, China, Mali, Syria, Yemen and Algeria. The targets included government and military institutions, telecommunication companies, banks, energy companies, nuclear researchers, media, and Islamic activists, Kaspersky said.
The firm declined to publicly name the country behind the spying campaign, but said it was closely linked to Stuxnet, the NSA-led cyberweapon that was used to attack Iran’s uranium enrichment facility. The NSA is the agency responsible for gathering electronic intelligence on behalf of the United States.
A former NSA employee told Reuters that Kaspersky’s analysis was correct, and that people still in the intelligence agency valued these spying programs as highly as Stuxnet. Another former intelligence operative confirmed that the NSA had developed the prized technique of concealing spyware in hard drives, but said he did not know which spy efforts relied on it.
NSA spokeswoman Vanee Vines declined to comment.
Kaspersky published the technical details of its research on Monday, which should help infected institutions detect the spying programs, some of which trace back as far as 2001.
The disclosure could further hurt the NSA’s surveillance abilities, already damaged by massive leaks by former contractor Edward Snowden. Snowden’s revelations have hurt the United States’ relations with some allies and slowed the sales of US technology products abroad.
The exposure of these new spying tools could lead to greater backlash against Western technology, particularly in countries such as China, which is already drafting regulations that would require most bank technology suppliers to proffer copies of their software code for inspection.
TECHNOLOGICAL BREAKTHROUGH
According to Kaspersky, the spies made a technological breakthrough by figuring out how to lodge malicious software in the obscure code called firmware that launches every time a computer is turned on.
Disk drive firmware is viewed by spies and cybersecurity experts as the second-most valuable real estate on a PC for a hacker, second only to the BIOS code invoked automatically as a computer boots up.
“The hardware will be able to infect the computer over and over,” lead Kaspersky researcher Costin Raiu said in an interview.
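Raiu’s point about re-infection is the crux of why firmware is such valuable territory: the firmware sits below the filesystem, so wiping the disk and reinstalling the operating system never touches it. Here is a deliberately toy Python model of that idea (no real firmware or malware involved; every name in it is hypothetical):

```python
# Toy model illustrating why firmware-resident code survives an OS
# reinstall while ordinary file-based malware does not. Purely a
# conceptual sketch -- none of this reflects real drive internals.

class Disk:
    def __init__(self, firmware_hook=None):
        self.files = {}
        self.firmware_hook = firmware_hook  # code living below the filesystem

    def wipe_and_reinstall_os(self):
        self.files = {"os": "clean install"}  # filesystem erased and rebuilt
        # ...but a reinstall never touches the firmware

    def boot(self):
        if self.firmware_hook:
            self.firmware_hook(self)  # firmware runs before the OS loads

def implant(disk):
    # Hypothetical payload: the firmware re-drops a file at every boot
    disk.files["implant"] = "re-dropped by firmware"

clean = Disk()
infected = Disk(firmware_hook=implant)

for d in (clean, infected):
    d.wipe_and_reinstall_os()
    d.boot()

print("implant" in clean.files)     # False
print("implant" in infected.files)  # True
```

The asymmetry in the last two lines is the whole story: the “cleaned” machine comes back infected on the next boot, which is what makes firmware implants so hard to evict.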
Though the leaders of the still-active espionage campaign could have taken control of thousands of PCs, giving them the ability to steal files or eavesdrop on anything they wanted, the spies were selective and only established full remote control over machines belonging to the most desirable foreign targets, according to Raiu. He said Kaspersky found only a few especially high-value computers with the hard-drive infections.
Kaspersky’s reconstructions of the spying programs show that they could work in disk drives sold by more than a dozen companies, comprising essentially the entire market. They include Western Digital, Seagate, Toshiba, IBM, Micron Technology and Samsung.
Western Digital, Seagate and Micron said they had no knowledge of these spying programs. Toshiba and Samsung declined to comment. IBM did not respond to requests for comment.
GETTING THE SOURCE CODE
Raiu said the authors of the spying programs must have had access to the proprietary source code that directs the actions of the hard drives. That code can serve as a roadmap to vulnerabilities, allowing those who study it to launch attacks much more easily.
“There is zero chance that someone could rewrite the [hard drive] operating system using public information,” Raiu said.
Concerns about access to source code flared after a series of high-profile cyberattacks on Google Inc and other US companies in 2009 that were blamed on China. Investigators have said they found evidence that the hackers gained access to source code from several big US tech and defense companies.
It is not clear how the NSA may have obtained the hard drives’ source code. Western Digital spokesman Steve Shattuck said the company “has not provided its source code to government agencies.” The other hard drive makers would not say if they had shared their source code with the NSA.
Seagate spokesman Clive Over said it has “secure measures to prevent tampering or reverse engineering of its firmware and other technologies.” Micron spokesman Daniel Francisco said the company took the security of its products seriously and “we are not aware of any instances of foreign code.”
According to former intelligence operatives, the NSA has multiple ways of obtaining source code from tech companies, including asking directly and posing as a software developer. If a company wants to sell products to the Pentagon or another sensitive US agency, the government can request a security audit to make sure the source code is safe.
“They don’t admit it, but they do say, ‘We’re going to do an evaluation, we need the source code,'” said Vincent Liu, a partner at security consulting firm Bishop Fox and former NSA analyst. “It’s usually the NSA doing the evaluation, and it’s a pretty small leap to say they’re going to keep that source code.”
Kaspersky called the authors of the spying program “the Equation group,” named after their embrace of complex encryption formulas.
The group used a variety of means to spread other spying programs, such as by compromising jihadist websites, infecting USB sticks and CDs, and developing a self-spreading computer worm called Fanny, Kaspersky said.
Fanny was like Stuxnet in that it exploited two of the same undisclosed software flaws, known as “zero days,” which strongly suggested collaboration by the authors, Raiu said. He added that it was “quite possible” that the Equation group used Fanny to scout out targets for Stuxnet in Iran and spread the virus.
– To the Original: ➡
The 1847 lecture that predicted human-induced climate change
Monday, February 16th, 2015

“The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun.”
= = = = = = = = = = = = = = = = = = = = = =
A near-forgotten speech made by a US congressman warned of global warming and the mismanagement of natural resources
When we think of the birth of the conservation movement in the 19th century, the names that usually spring to mind are the likes of John Muir and Henry David Thoreau, men who wrote about the need to protect wilderness areas in an age when the notion of mankind’s “manifest destiny” was all the rage.
But a far less remembered American – a contemporary of Muir and Thoreau – can claim to be the person who first publicised the now largely unchallenged idea that humans can negatively influence the environment that supports them.
George Perkins Marsh (1801-1882) certainly had a varied career. Here’s how Clark University in Massachusetts, which has named an institute in his memory, describes him:
Throughout his 80 years Marsh had many careers as a lawyer (though, by his own words, “an indifferent practitioner”), newspaper editor, sheep farmer, mill owner, lecturer, politician and diplomat. He also tried his hand at various businesses, but failed miserably in all – marble quarrying, railroad investment and woolen manufacturing. He studied linguistics, knew 20 languages, wrote a definitive book on the origin of the English language, and was known as the foremost Scandinavian scholar in North America. He invented tools and designed buildings including the Washington Monument. As a congressman in Washington (1843-49) Marsh helped to found and guide the Smithsonian Institution. He served as US Minister to Turkey for five years where he aided revolutionary refugees and advocated for religious freedom. He spent the last 21 years of his life (1861-82) as US Minister to the newly United Kingdom of Italy.
In other words, he kept himself busy. But I would argue his defining moment came on 30 September, 1847, when, as a congressman for the Whig party (a forerunner of the Republican party), he gave a lecture to the Agricultural Society of Rutland County, Vermont. (The speech was published a year later.) It proved to be the intellectual spark that led him to go on and publish in 1864 his best-known work, Man and Nature: Physical Geography as Modified by Human Action.
More than 160 years on, it really does pay to re-read his speech as it seems remarkably prescient today. It also shows that he was decades ahead of most other thinkers on this subject. After all, he delivered his lecture a decade or more before John Tyndall began to explore the thesis that slight changes in the atmosphere’s composition could cause climatic variations. And it was a full half a century before Svante Arrhenius proposed that carbon dioxide emitted by the “enormous combustion of coal by our industrial establishments” might warm the world (something he thought would be beneficial).
Yes, in his speech, Marsh talks about “civilised man” and “savages” – and the language is turgid in places – but let’s cut him a little slack: this was 1847, after all. It’s about halfway through that he gets to the bit that matters most to us today:
Man cannot at his pleasure command the rain and the sunshine, the wind and frost and snow, yet it is certain that climate itself has in many instances been gradually changed and ameliorated or deteriorated by human action. The draining of swamps and the clearing of forests perceptibly effect the evaporation from the earth, and of course the mean quantity of moisture suspended in the air. The same causes modify the electrical condition of the atmosphere and the power of the surface to reflect, absorb and radiate the rays of the sun, and consequently influence the distribution of light and heat, and the force and direction of the winds. Within narrow limits too, domestic fires and artificial structures create and diffuse increased warmth, to an extent that may effect vegetation. The mean temperature of London is a degree or two higher than that of the surrounding country, and Pallas believed, that the climate of even so thinly a peopled country as Russia was sensibly modified by similar causes.
Some of the terminology he uses is clearly a little archaic to our ears today, but, broadly speaking, his hunch has subsequently proved to be correct. You can see him grappling with concepts that we now know as the urban heat island effect and the greenhouse effect.
But in the speech he also called for a more thoughtful approach to consuming natural resources, despite the apparent near-limitless abundance on offer across the vast expanses of northern America. As the Clark University biography notes, he wasn’t an environmental sentimentalist. Rather, he believed that all consumption must be reasoned and considered, with the impact on future generations always kept in mind: he was making the case for what we now call “sustainable development”. In particular, he argued that his audience should re-evaluate the worth of trees:
The increasing value of timber and fuel ought to teach us that trees are no longer what they were in our fathers’ time, an incumbrance. We have undoubtedly already a larger proportion of cleared land in Vermont than would be required, with proper culture, for the support of a much greater population than we now possess, and every additional acre both lessens our means for thorough husbandry, by disproportionately extending its area, and deprives succeeding generations of what, though comparatively worthless to us, would be of great value to them.
The functions of the forest, besides supplying timber and fuel, are very various. The conducting powers of trees render them highly useful in restoring the disturbed equilibrium of the electric fluid; they are of great value in sheltering and protecting more tender vegetables against the destructive effects of bleak or parching winds, and the annual deposit of the foliage of deciduous trees, and the decomposition of their decaying trunks, form an accumulation of vegetable mould, which gives the greatest fertility to the often originally barren soils on which they grow, and enriches lower grounds by the wash from rains and the melting snows.
The inconveniences resulting from a want of foresight in the economy of the forest are already severely felt in many parts of New England, and even in some of the older towns in Vermont. Steep hill-sides and rocky ledges are well suited to the permanent growth of wood, but when in the rage for improvement they are improvidently stripped of this protection, the action of sun and wind and rain soon deprives them of their thin coating of vegetable mould, and this, when exhausted, cannot be restored by ordinary husbandry. They remain therefore barren and unsightly blots, producing neither grain nor grass, and yielding no crop but a harvest of noxious weeds, to infest with their scattered seeds the richer arable grounds below.
But this is by no means the only evil resulting from the injudicious destruction of the woods. Forests serve as reservoirs and equalizers of humidity. In wet seasons, the decayed leaves and spongy soil of woodlands retain a large proportion of the falling rains, and give back the moisture in time of drought, by evaporation or through the medium of springs. They thus both check the sudden flow of water from the surface into the streams and low grounds, and prevent the droughts of summer from parching our pastures and drying up the rivulets which water them.
On the other hand, where too large a proportion of the surface is bared of wood, the action of the summer sun and wind scorches the hills which are no longer shaded or sheltered by trees, the springs and rivulets that found their supply in the bibulous soil of the forest disappear, and the farmer is obliged to surrender his meadows to his cattle, which can no longer find food in his pastures, and sometime even to drive them miles for water.
Again, the vernal and autumnal rains, and the melting snows of winter, no longer intercepted and absorbed by the leaves or the open soil of the woods, but falling everywhere upon a comparatively hard and even surface, flow swiftly over the smooth ground, washing away the vegetable mould as they seek their natural outlets, fill every ravine with a torrent, and convert every river into an ocean. The suddenness and violence of our freshets increases in proportion as the soil is cleared; bridges are washed away, meadows swept of their crops and fences, and covered with barren sand, or themselves abraded by the fury of the current, and there is reason to fear that the valleys of many of our streams will soon be converted from smiling meadows into broad wastes of shingle and gravel and pebbles, deserts in summer, and seas in autumn and spring.
The changes, which these causes have wrought in the physical geography of Vermont, within a single generation, are too striking to have escaped the attention of any observing person, and every middle-aged man, who revisits his birth-place after a few years of absence, looks upon another landscape than that which formed the theatre of his youthful toils and pleasures. The signs of artificial improvement are mingled with the tokens of improvident waste, and the bald and barren hills, the dry beds of the smaller streams, the ravines furrowed out by the torrents of spring, and the diminished thread of interval that skirts the widened channel of the rivers, seem sad substitutes for the pleasant groves and brooks and broad meadows of his ancient paternal domain.
If the present value of timber and land will not justify the artificial re-planting of grounds injudiciously cleared, at least nature ought to be allowed to reclothe them with a spontaneous growth of wood, and in our future husbandry a more careful selection should be made of land for permanent improvement. It has long been a practice in many parts of Europe, as well as in our older settlements, to cut the forests reserved for timber and fuel at stated intervals. It is quite time that this practice should be introduced among us.
After the first felling of the original forest it is indeed a long time before its place is supplied, because the roots of old and full grown trees seldom throw up shoots, but when the second growth is once established, it may be cut with great advantage, at periods of about twenty-five years, and yields a material, in every respect but size, far superior to the wood of the primitive tree. In many European countries, the economy of the forest is regulated by law; but here, where public opinion determines, or rather in practice constitutes law, we can only appeal to an enlightened self-interest to introduce the reforms, check the abuses, and preserve us from an increase of the evils I have mentioned.
A footnote: it is 150 years ago this year since Marsh was personally appointed by Abraham Lincoln to be the US’s first ambassador to Italy. (Marsh was buried in Rome.) Just three years later, Lincoln approved the legislation which would lead to the creation of Yosemite National Park in California. This acted as a precedent across the world for federal and state governments to purchase or secure wilderness areas so they could be protected in perpetuity from development or exploitation. It’s speculation, of course, but I’ve always wondered whether Marsh and Lincoln ever discussed such matters, be it in person or in correspondence. Perhaps, there’s a keen historian out there who knows the answer?
– To the Original: ➡
– Research thanks to: Piers L.
High Risk Investment That Brought Down The U.S. Economy Returns, With A New Name
Tuesday, February 10th, 2015

When a restaurant fails a health code inspection, sometimes the easiest thing to do is to close up shop, let people forget what happened, then slap a new sign on the door and reopen under a new name. That’s essentially what the world’s biggest banks are doing with a complex, high-risk investment product that helped destroy the global economy less than eight years ago.
Goodbye, “collateralized debt obligations.” Hello, “bespoke tranche opportunities.” Banks including Goldman Sachs are marketing that newfangled product, according to Bloomberg, and total sales of “bespoke tranche opportunities” leaped from under $5 billion in 2013 to $20 billion last year.
Like other derivatives, these “BTOs” allow investors to place wagers on the outcome of various loans, bonds, and securities in which they are not directly invested. Hedge funds and other sophisticated financial industry actors use derivatives both as a form of insurance to manage the total risk they are exposed to across their whole investment portfolio, and to gamble on real-world economic events such as mortgage payments, municipal bonds, and the price of physical commodities. The resulting web of complicated contracts can be very difficult to untangle, and can involve impossible-sounding amounts of money. The Financial Crisis Inquiry Commission concluded that derivatives “were at the center of the storm” and “amplified the losses from the collapse of the housing bubble by allowing multiple bets on the same securities.” In 2010, the total on-paper value of every derivative contract worldwide was $1.4 quadrillion, or 23 times the total economic output of the entire planet.
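Those last two figures can be sanity-checked against each other with a line of arithmetic: if the notional value of all derivatives was 23 times world economic output, the implied world output is about $61 trillion.

```python
# Cross-checking the article's figures: $1.4 quadrillion in notional
# derivative contracts, described as 23 times total world economic output.
notional = 1.4e15            # $1.4 quadrillion
ratio = 23
implied_world_gdp = notional / ratio
print(f"${implied_world_gdp / 1e12:.0f} trillion")  # $61 trillion
```

That ~$61 trillion figure is in line with commonly cited estimates of world GDP around 2010, so the two numbers in the article are at least mutually consistent. (Note that “notional” value counts the face amounts the bets reference, not money actually at risk.)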
Collateralized debt obligations (CDOs) are a form of derivative that breaks one pool of financial assets — either direct loans or securities that are based on groups of loans — into multiple layers of riskiness. Those layers are called tranches, and investors who buy the least-risky tranche of the derivative will get paid before those who buy the second tranche, and so on. Banks selling traditional CDOs had to create these multiple risk tranches based on a given set of loans or securities, and then hope that someone would buy each of them.
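The seniority ordering described above is often called a payment “waterfall”: cash collected from the underlying pool pays the senior tranche in full before anything flows down to the next layer. A minimal sketch, with made-up tranche sizes:

```python
# A minimal sketch of a CDO payment waterfall: cash from the loan pool
# fills each tranche in order of seniority. The sizes here are invented
# for illustration, not drawn from any real deal.

def waterfall(collected, tranche_sizes):
    """Distribute collected cash to tranches, most senior first."""
    payouts = []
    for size in tranche_sizes:
        paid = min(collected, size)  # senior layers are paid in full first
        payouts.append(paid)
        collected -= paid
    return payouts

tranches = [60, 30, 10]  # senior, mezzanine, equity (amounts owed)

print(waterfall(100, tranches))  # everyone paid: [60, 30, 10]
print(waterfall(75, tranches))   # shortfall hits juniors first: [60, 15, 0]
```

The second call shows why the tranches carry different risk: a 25% shortfall in collections wipes out the equity tranche and dents the mezzanine, while the senior tranche is untouched.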
The new “bespoke” version of the idea flips that business dynamic around. An investor tells a bank what specific mixture of derivatives bets it wants to make, and the bank builds a customized product with just one tranche that meets the investor’s needs. Like a bespoke suit, the products are tailored to fit precisely, and only one copy is ever produced. The new products are a symptom of the larger phenomenon of banks taking complex risks in pursuit of higher investment returns, Americans for Financial Reform’s Marcus Stanley said in an email, and BTOs “could be automatically exempt” from some Dodd-Frank rules.
This is not the first time that large banks have tried to reboot the CDO machine since the financial crisis made those products a much-reviled household name. In early 2013, JP Morgan Chase and Morgan Stanley tried and failed to find buyers for a new set of CDOs. The nature of that failure helps illuminate the rationale behind the new version of the product. Finding buyers for the various different layers of risk was “like trying to line up boxcars,” one investor told the Financial Times after the 2013 reboot effort fizzled. Many of the firms that used to buy such products prior to the crisis “no longer exist, and those that survive have very bad memories” of the experience, another analyst said.
Since then, those same old characters seem to have found a way to get back into the business. In addition to Goldman, which narrowly avoided criminal charges after a Senate investigation revealed its shady pre-crisis mortgage dealings, sellers of “bespoke tranche opportunities” now include Citigroup and the French banking giant BNP Paribas. BNP’s recent notoriety doesn’t relate to the financial crisis, but rather to the bank’s violation of various U.S. sanctions against Iran, Cuba, and Sudan. And while Citigroup’s past leadership now says financial deregulation was a mistake and that megabanks like Citi should be broken up to protect the economy, its current leadership is chipping away at key Dodd-Frank reforms. Citi was also heavily involved in the “robosigning” scandal that led to hundreds of thousands or even millions of unjust foreclosures.
– to the Original: ➡
No, climate models aren’t exaggerating global warming
Tuesday, February 10th, 2015

Weather and climate agencies around the world have been almost unanimous in declaring 2014 the hottest year on record — something that has prompted considerable chagrin among climate change doubters. That’s because these “skeptics” have long sought to cast doubt on man-made global warming by pointing to an alleged global warming “pause” or “slowdown” — going on to suggest that the computerized climate models that scientists use to project future temperatures are flawed, and overestimate carbon dioxide’s warming effect.
So, is that true? Do the models consistently overestimate the warming effects of greenhouse gases like CO2?
As a recent study suggests, the answer is no. While many models didn’t predict the relatively modest surface-warming “hiatus,” it’s not because they’re biased in favor of greenhouse-gas emissions’ warming effects. Rather, researchers report in Nature, these computer simulations just struggle to predict “chaotic” (or random) short-term changes in the climate system that can temporarily add or subtract from CO2 emissions’ warming effects.
It’s true that air temperatures have increased more slowly in the past 15 years or so, while climate models on average predicted much more warming. And scientists are slowly beginning to figure out why temperatures didn’t rise quite as much as expected.
One probable contributor is pure natural variability: Cyclical processes in the Earth’s climate and temporary changes in the amount of solar radiation that reach the Earth’s surface can introduce “blips” into the Earth’s warming trend. Right now, oceans may be temporarily sucking up more heat from the atmosphere than they normally do. Moreover, a temporary downturn in solar output and an increase in light-reflecting aerosol pollution (acting like a chemical sunblock of sorts) could also have partially masked CO2-driven warming.
But researchers Jochem Marotzke of the Max Planck Institute of Meteorology and Piers M. Forster of the University of Leeds also wanted to check whether climate models are biased, by testing how their temperature predictions stack up against reality. So the researchers tested how 114 model simulations that underpin last year’s assessment report of the U.N. Intergovernmental Panel on Climate Change (IPCC) performed — not just for the 15-year period from 1998-2012 but for all 15-year periods stretching back to 1900. If this analysis showed that the models consistently overestimated or underestimated the amount of warming that actually occurred, that would indicate some sort of systematic bias.
As it turns out, however, the models did pretty well. In each 15-year period, the model simulations produced a range of predictions. But each 15-year interval’s actual temperature trend always fell somewhere in the models’ prediction range. Moreover, even when 15-year actual temperature trends did fall toward the edges of the corresponding predicted ranges, they weren’t consistently at the higher or lower edges. Basically, when the models were missing the mark, they weren’t doing so consistently in one direction.
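The core of the comparison above is simple to state: for each 15-year window, compute the observed temperature trend and ask whether it falls inside the spread of trends produced by the ensemble of model runs. Here is a hedged sketch of that procedure using synthetic data — the numbers are invented stand-ins, not the actual IPCC runs or observational records:

```python
# Sketch of the windowed trend comparison: does the observed 15-year
# trend fall inside the range spanned by an ensemble of model runs?
# All series here are synthetic (same underlying warming plus noise),
# standing in for real observations and the 114 IPCC simulations.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2013)
warming = 0.008 * (years - 1900)                       # degrees C, invented rate
observed = warming + rng.normal(0, 0.1, years.size)
ensemble = [warming + rng.normal(0, 0.1, years.size)
            for _ in range(20)]                        # 20 synthetic "model runs"

def trend(series, start, length=15):
    """Least-squares slope (deg C per year) over a window of `length` years."""
    w = slice(start, start + length)
    return np.polyfit(years[w], series[w], 1)[0]

windows = range(0, years.size - 15)
inside = sum(
    min(trend(e, s) for e in ensemble)
    <= trend(observed, s)
    <= max(trend(e, s) for e in ensemble)
    for s in windows
)

print(f"{inside}/{len(windows)} windows fall inside the ensemble range")
```

Because the synthetic observations are statistically exchangeable with the synthetic runs, most windows land inside the ensemble range, and the misses scatter above and below rather than piling up on one side — which is essentially the unbiasedness result Marotzke and Forster report for the real models.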
So, it’s true that the IPCC model runs didn’t predict the recent warming slowdown. But as these findings show, they didn’t accurately predict certain other 15-year periods of warming accelerations or slowdowns in the past either, and it’s not because they were always overestimating warming. Indeed, in some 15-year periods, the models underestimated warming. Essentially, that means climate skeptics are cherry-picking when they point out that climate models didn’t predict the recent 15-year hiatus.
That doesn’t entirely explain why the model simulations in a given year produced varying results to begin with, though. Was it due to differences in the underlying physics coded into the models? (The models differ slightly in terms of how much light they assume hits the Earth, how “sensitive” temperatures are to changes in CO2, and how much heat the oceans suck up.) Or was it just random fluctuations in the climate system? Or a combination? The researchers did a statistical analysis to answer that question.
In the end, none of those physical reasons was a major factor. Random fluctuations had 2.5 times the impact on the model predictions’ variations as all those physical factors together did, the researchers found. Only when the researchers used longer-term intervals (of more than 60 years) did differences in sunlight amount, ocean heat trapping or climate sensitivity start to make a big difference.
So climate models may not provide the perfect picture of what will happen to temperatures in a given short-term period (on 10- or 20-year scales). But maybe they simply can’t, due to the random ways in which climate can temporarily fluctuate. That doesn’t mean that climate models aren’t valuable to us. They still give us a good sense of the long-term picture, the one that is more important for us to worry about anyway: that temperatures are increasing, and that natural factors can’t explain this increase.
As the researchers argue, then, their findings ought to put to rest assertions by climate “skeptics” that climate models overestimate how much warming we’re going to get.
– to the Original: ➡