Climate: LNG in B.C. vs Alberta tarsands

I was aware you were unable to understand.



OBD, you have passed on to the list of the irrelevant with your hoax theory.
It's all just "noise" from your side and your friends.
The denialists are denying it's the warmest year, which makes sense if you're in denial.
Seek help.....

https://www.youtube.com/watch?v=CHHJLQVUX0k
 
http://fortune.com/2014/12/08/oil-prices-drop-impact/
Oil price drops: Don't panic, really

COMMENTARY by Cyrus Sanati @beyondblunt DECEMBER 8, 2014, 12:32 PM EST

Oil prices have a lot more room to fall before things get really scary. Here’s why.

The recent drop in crude prices won’t kill off the US shale oil industry. It’ll just make it more efficient.

Profit margins and break-even points are relative not only to the price of oil, but also to the cost of doing business. As oil prices drop, producers will undoubtedly renegotiate their ludicrously expensive oil service contracts, slash wages for their workforce and cut perks to bring their costs in line with the depressed price for crude. The demand for oil remains strong, which should provide an adequate floor for producers in the long run, but only after they get their finances in order.

How oil prices ever reached $100 a barrel still remains a mystery to many who have followed the industry for years. But the 40% drop in oil prices over the past six months has been shocking for oil bears and bulls alike. Why on earth did it fall so hard, so fast? There is plenty of speculation, ranging from the Saudis’ wish to “crush” the U.S. shale industry, to the U.S. colluding with the Saudis to flood the market in order to bankrupt an aggressive Russia and an obstinate Iran.

Conspiracy theories aside, the fact is oil prices have dropped and they may stay “low” for a while. This has analysts, journalists, and pundits running around claiming that it’s the end of the world.

It is understandable that people are nervous. After all, the oil industry is a major producer of jobs and wealth for the U.S. It contributes around $1.2 trillion to U.S. GDP and supports over 9.3 million permanent jobs, according to a study from The Perryman Group. Not all of that money and those jobs come directly from the shale oil industry, or even the energy industry as a whole; much of the total derives from the multiplier effect the industry has on local economies. Given this, it’s clear why any drop in the oil price, let alone a 40% drop, is cause for concern.

Nowhere in the U.S. is that concern felt more acutely than in Houston, Texas, the nation’s oil capital. The falling price of crude hasn’t had a major impact on the city’s economy, at least not yet. But people, especially the under-40 crowd—the Shale Boomers, as I call them—are starting to grow very worried. At bars and restaurants in Houston’s newly gentrified East End and Midtown districts, you often hear the young bucks (and does) comparing notes on their company’s break-even points with respect to oil prices. Those who work for producers with large acreage in the Bakken shale in North Dakota are saying West Texas Intermediate (WTI) crude needs to stay above $60 a barrel for their companies to stay in the black. Those who work for producers with large acreage in the Eagle Ford shale play in south Texas say their companies can stay above water with oil as low as $45 to $50 a barrel.

Both groups say that they have heard their companies are starting to walk away from some of the more “speculative” parts of their fields, which translates to a decrease in production, the first such decrease in years. This was confirmed Wednesday when the Fed’s Beige Book noted that oil and gas activity in North Dakota decreased in early November due to the rapid fall in oil prices. Nevertheless, the Fed added that the outlook from “officials” in North Dakota “remained optimistic,” and that they expect oil production to continue to increase over the next two years.

What are these “officials” thinking? Don’t they worry about the break-even price of oil? Sure they do, but unlike the Shale Boomers, they also probably remember drilling for oil when it traded in the single digits, which really wasn’t that long ago. For these seasoned oil men, crude at $60 a barrel still looks mighty appealing.

Doug Sheridan, the founder of EnergyPoint Research, which conducts satisfaction surveys, ratings, and reports for the energy industry, recalls when he had lunch with an oil executive of a major energy giant 10 years ago who confided in him that his firm was worried that oil prices had risen too high, too fast. “He was concerned that the high prices would attract negative attention from the press and Congress,” Sheridan told Fortune. “The funny thing was, oil prices were only around $33 a barrel.”

The shale boom has perpetuated the notion that drilling for oil, especially in shale formations, is somehow super complicated and expensive. It really isn’t. Fracking a well involves just shooting a bunch of water and chemicals down a hole at high pressure—not exactly rocket science. The drilling technique has been around since the 1940s, and the energy industry has gotten very good at doing it over the decades. Recent advances in technology, such as horizontal drilling, have made fracking wells even easier and more efficient.

But even though drilling for oil has become easier and more efficient, production costs have gone through the roof. Why? There are a few reasons for this, but the main one is the high price of oil. When oil service firms like Halliburton and Schlumberger negotiate contracts with producers, they usually take the oil price into consideration. The higher the oil price, the higher the cost for their services. This, combined with the boom in cheap credit over the last few years, has increased demand for everything related to the oil service sector—from men to material to housing. In what other industry can someone without a college degree start out making six figures for doing manual labor? You can in oil and gas, especially in places like Western North Dakota. There, McDonald’s employees make $20 an hour and rent for a modest place can top $2,000 a month.

But as the oil price drops, so will costs, bringing the “break-even” price down with it. Seasoned oil men know how to get this done—it involves a little Texas theater, which is sort of like bargaining at a Turkish bazaar. The producers will first clutch their hearts and tell their suppliers that they simply cannot afford to drill any more given the sharp slump in oil prices. Their suppliers will offer a slight discount on their services but the producer will say he’s “walking away.” This is where we are in the negotiating cycle.

After letting the oil service firms sweat a bit (traditionally around two to four months), a producer will give their former suppliers a call, saying they are “thinking” of getting back in the game. Desperate for work, the suppliers will now be willing to renegotiate a whole new agreement based on a lower oil price. The aim of the new contract is to give producers close to the same margin they had when prices were much higher. Profits are restored and everyone is happy.

This negotiation will happen across all parts of the oil and gas cost structure. So welders who were making $135,000 a year will probably see a pay cut, while the administrative staff back at headquarters will probably miss out on that fat bonus check they have come to rely on. Rig workers and engineers will see their pay and benefits slashed as well. Anyone who complains will be sent to Alaska or somewhere even worse than Western North Dakota in the winter, like Siberia (seriously). And as with any bursting bubble, asset prices will start to fall for everything from oil leases to jack-up rigs to townhouses in Houston. Oh, and that McDonald’s employee in Western North Dakota will probably need to settle for $15 an hour.

But oil production will continue, that is, until prices reach a point at which it truly makes no sense for anyone to drill anywhere.

So, what is the absolute lowest price oil can be produced for in the U.S.? Consider this—fracking last boomed in the U.S. back in the mid-1980s, when a barrel of oil fetched around $23. That is equivalent to around $50 a barrel today, when adjusted for inflation. That fracking boom went bust after prices fell to around $8 a barrel, which is worth around $18 in today’s money. With oil last week hitting $63 a barrel, it seems that prices have a lot more room to fall before things get really scary.
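The article's inflation adjustment can be sanity-checked with a rough CPI multiplier; the ~2.2× factor from the mid-1980s to 2014 used below is an assumed round figure, not an official statistic:

```python
# Rough check of the article's inflation-adjusted oil prices.
# CPI_MULTIPLIER (~2.2x, mid-1980s dollars to 2014 dollars) is an
# assumed round figure, not an official BLS number.
CPI_MULTIPLIER = 2.2

def to_2014_dollars(nominal_price):
    """Convert a mid-1980s nominal price to approximate 2014 dollars."""
    return nominal_price * CPI_MULTIPLIER

print(to_2014_dollars(23))  # boom-era $23/barrel -> about $50
print(to_2014_dollars(8))   # bust-era $8/barrel -> about $18
```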
 
http://m.thetyee.ca/Opinion/2014/05...-Fossil-Fuel-Subsidies/#.VLtwpYfXjdM.facebook

IMF Pegs Canada's Fossil Fuel Subsidies at $34 Billion

In such giveaways we're a world leader, a fact rarely noted when federal budgets are debated.

By Mitchell Anderson, 15 May 2014, TheTyee.ca
There's a reason Canada enjoys some of the cheapest gas in the developed world. Nozzle photo via Shutterstock.
While Canada slashes budgets for research, education and public broadcasting, there is one part of our economy that enjoys remarkable support from the Canadian taxpayer: the energy sector.

The International Monetary Fund estimates that energy subsidies in Canada top an incredible $34 billion each year in direct support to producers and uncollected tax on externalized costs.

These figures are found in the appendix of a major report released last year estimating global energy subsidies at almost $2 trillion. The report estimated that eliminating the subsidies would reduce global carbon emissions by 13 per cent. The stunning statistics specific to this country remain almost completely unreported in Canadian media.

Contacted by The Tyee, researchers from the IMF helpfully provided a detailed breakdown of Canadian subsidies provided to petroleum, natural gas and coal consumption. The lion's share of the $34 billion is uncollected tax on the externalized costs of burning transportation fuels like gasoline and diesel -- about $19.4 billion in 2011. These externalized costs include impacts like traffic accidents, carbon emissions, air pollution and road congestion.

The report also referenced figures sourced from the OECD showing an additional $840 million in producer support to oil companies through a constellation of provincial and federal incentives to encourage fossil fuel extraction. This brought total petroleum subsidies in Canada in 2011 to $20.23 billion -- more than 20 times the annual budget of Environment Canada.

In comparison to other countries, Canada provides more subsidies to petroleum as a proportion of government revenue than any developed nation on Earth besides the United States and Luxembourg.

Natural gas consumption also enjoys billions in subsidies in Canada. The IMF estimates that un-priced carbon emissions from burning natural gas added up to $7.3 billion per year. There's another $440 million in producer support and $360 million in other un-taxed externalities, all of which tops $8.1 billion. This tax giveaway on natural gas alone is 44 per cent more than Canada provides in international aid every year.

What about coal? Canada consumes over 30 million tonnes per year. While we currently export over half our domestic production, the IMF study only considered externalized costs within our own country. They found that the coal industry receives $4.5 billion in annual subsidies -- almost all of this is un-priced carbon and sulfur dioxide emissions. This generous largesse towards the dirtiest of fuels is about four times what the CBC receives in public support every year.

Or we could spend that on...

What could Canada do with an extra $34 billion a year? Both Vancouver and Toronto are struggling with how to fund long overdue upgrades to public transportation. Subway construction comes in at about $250 million per kilometre, meaning we could build about 140 kilometres of badly-needed urban subway lines every year. Light rail transport (LRT) is about one-quarter of the cost of subways, meaning for the same money we could build about 560 kilometres of at-grade transit infrastructure.
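The subway and LRT figures in that paragraph follow from simple division; everything below uses the article's own numbers ($34 billion per year, about $250 million per subway kilometre, LRT at a quarter of that):

```python
# Back-of-envelope check of the article's transit arithmetic,
# using only figures quoted in the article (in $ millions).
annual_subsidy = 34_000                    # $34 billion per year
subway_cost_per_km = 250                   # subway construction, per km
lrt_cost_per_km = subway_cost_per_km / 4   # LRT at roughly 1/4 the cost

print(annual_subsidy / subway_cost_per_km)  # 136.0 km, which the article rounds to "about 140"
print(annual_subsidy / lrt_cost_per_km)     # 544.0 km, which the article rounds to "about 560"
```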

This foregone revenue in less than two years could fully fund the Big Move transit plan for southern Ontario, providing affordable access for 80 per cent of people living from Hamilton to Oshawa. Toronto's transit system has languished for decades. This sorely needed infrastructure would save the average household thousands in wasted time sitting in traffic, and Canada's economy billions in reduced congestion costs.

The proposed Vancouver subway line to the University of British Columbia could be built with less than two months' worth of the subsidies provided to the energy sector every year. Forty kilometres of rapid transit in Surrey could be had for about the same amount.

What about green energy infrastructure? Adding solar and wind capacity provides some of the best job-generation per dollar of any option available -- more than seven times the employment from an equivalent investment in oil and gas extraction. Extrapolating the findings from a 2012 report on green jobs, $34 billion could create 500,000 person years of employment and install more than 150,000 megawatts of clean generating capacity. Canada currently ranks 12th in the G20 on green energy investment and has been steadily falling behind our competitors.

Canada's infrastructure deficit of crumbling roads and outdated water and sewage treatment is pegged at $171 billion. This backlog could be wiped out in five years with the revenue we forgo in subsidies to the energy sector.

Of course, not all things of value can be measured by bricks and mortar. Thirty-four billion dollars each year could provide $10-a-day childcare for 5.5 million children ages 0 to 5. Canada's child care costs are currently the highest in the OECD.

No free lunch in energy costs

For all the complaining Canadians do about fuel prices, it's ironic to note the IMF essentially says we are undervaluing the true cost of gasoline by about $0.30 per litre. Compared to other nations, Canada enjoys some of the cheapest gas in the developed world. Fuel in Italy and Germany is almost double our price at the pump. Ever think it's odd that bottled water at the gas station costs more than the fuel you just put in your tank?

Consider for a moment all the costs of finding and extracting crude oil, shipping it across the globe, refining it into gasoline and trucking it to your neighbourhood. Not to mention the billions spent by some countries projecting military power into volatile oil-producing parts of the world and the very human price of those interventions. Additional un-priced costs after petroleum is burned, such as climate change, traffic congestion, road accidents and air pollution make gasoline perhaps the most subsidized substance on Earth.

Every decision based on artificially low energy prices can have years of unintended consequences. If gas is cheap, people will choose to buy cars rather than take transit, clogging both our roads and emergency rooms. Transportation accidents alone cost Canada $3.7 billion each year. Every vehicle bought based on low fuel prices will produce years of carbon emissions, and every owner over the life of that vehicle will have an interest in voting for cheaper gas.

The opposite, of course, is also true. Less than half of Vancouverites in their early twenties today have chosen to get a driver's license, down from 60 per cent 10 years ago. Better public transit and more expensive car ownership seem to be the main factors driving this remarkable demographic shift.

The IMF can hardly be accused of being a left-leaning, alarmist organization. Through this valuable research, they make the case that there is no free lunch in energy costs, and we exclude these externalized costs at our peril.

A country can be judged on what it chooses to tax and what it chooses to subsidize. And by that yardstick, this nation currently seems to care more about cheap energy than almost anything else.

Mitchell Anderson is a Vancouver based freelance writer and frequent contributor to The Tyee. He is writing a book, The Oil Vikings: What Norway can teach the world about wise resource use.
 
Gavin Schmidt now admits NASA are only 38% sure 2014 was the hottest year

I said the vaguest scientists in the world lie by omission, and it’s what they don’t say that gives them away. The “hottest ever” press release didn’t tell us how much hotter the hottest year supposedly was, nor how big the error bars were. David Rose of the Daily Mail pinned down Gavin Schmidt of NASA GISS to ask a few questions that bloggers and voters want answered but almost no other journalist seems to want to ask.

Finally…

Nasa admits this means it is far from certain that 2014 set a record at all

Does that mean 97% of climate experts are 62% sure they are wrong?*

The thing with half-truths is that they generate a glorious fog, but it has no substance. Ask the spin-cloud of a couple of sensible questions and the narrative collapses. This is the kind of analysis that would have stopped the rot 25 years ago if most news outlets had investigative reporters instead of science communicators trained to “raise awareness”. (The media IS the problem). If there was a David-Rose-type in most major dailies, man-made global warming would never have got off the ground.

The claim made headlines around the world, but yesterday it emerged that GISS’s analysis – based on readings from more than 3,000 measuring stations worldwide – is subject to a margin of error. Nasa admits this means it is far from certain that 2014 set a record at all.

Yet the Nasa press release failed to mention this, as well as the fact that the alleged ‘record’ amounted to an increase over 2010, the previous ‘warmest year’, of just two-hundredths of a degree – 0.02°C.

The margin of error is about a tenth of a degree, so those error bars are five times the size of the difference pushed in headlines all over the world. Gavin Schmidt, of course, is horrified that millions of people may have been misled:

GISS’s director Gavin Schmidt has now admitted Nasa thinks the likelihood that 2014 was the warmest year since 1880 is just 38 per cent. However, when asked by this newspaper whether he regretted that the news release did not mention this, he did not respond.

I’m sure he’s too busy contacting newspapers and MSNBC to make sure stories from NASA GISS are accurate and scientifically correct.
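The error-bar comparison above is a single ratio; a minimal sketch, assuming the two figures quoted in this post (a ~0.1 °C margin of error against a 0.02 °C claimed record margin):

```python
# Ratio of the measurement uncertainty to the claimed record margin.
# Both figures are the ones quoted in this post, not official values.
error_margin = 0.10    # deg C, approximate margin of error
record_margin = 0.02   # deg C, 2014 over 2010, per the press reports

# Rounded to dodge floating-point noise in the division.
print(round(error_margin / record_margin))  # error bars five times the claimed difference
```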

Read more: http://www.dailymail.co.uk

In the mood for sport? Turn the torch back on the journalists who were too gullible to ask a sensible question. Let’s start asking the ABC and BBC journalists why they didn’t ask “how much hotter was it” and “how big are those error bars”.
 
NOAA, NASA: 2014 was probably not the warmest year on our record

A direct proof that the professional alarmists are intentionally lying

As I discussed in detail, the surface temperature record significantly disagrees with the satellite datasets on the question of whether 2014 was the warmest or a near-warmest year.

Satellites answer this question with a clear "No": 1998 was 0.3 °C warmer than 2014. This difference (decrease of temperature) is rather safely greater than their error margin, which allows you to say that the global mean temperature as defined and calculated via the RSS methodology, for example, almost certainly didn't peak in 2014. (If it had, it would be no big deal anyway, but it did not.) The year 2014 was tied for 6th and 7th place among the 36 years covered by the RSS AMSU satellite methodology, for example.

On the other hand, NOAA's NCDC and NASA's GISS ended up with a 2014 global mean temperature about 0.02 °C higher than the second warmest year on their records, with their (different) definitions of the global mean temperature; the second warmest year on both records is 2010, closely followed by 2005.

Immediately, sensible people – including several climate scientists – were telling them that this difference – 0.02 °C – is so tiny that it is easily beaten by the error margin which prevents you from acquiring any confidence while deciding which year was actually *the* warmest one.




Now, the two questions are: how much do the error margins of the NOAA, NASA temperature records matter? And if they change the answer to the question whether 2014 was the warmest one, did they know about this fact when they loudly announced that "2014 was the warmest year" or did they overlook that detail?




The answer is that the result is heavily affected by the error margins, and NASA and NOAA knew that – but were careful to feed the wrong answer to the media. Here is a January 16th, 2015 tweet by Gavin Schmidt of NASA's GISS, pretty much the most important guy in that agency. He reveals that they provided the media with fraudulent claims while seeming arrogant towards a person who merely asked ("@DavidRoseUK Were you on the press call? The uncertainties were directly addressed."):


When the uncertainties are taken into account, you can't say which year was the warmest one on the two records. Instead, you may only quantify the probability – by integrating the probability distributions in various ways – that one year or another was the warmest one. And both NASA and NOAA did this exercise for us – in fact, before they spoke to the media! If you were one of the few people in the world who attended the press conference, you could have heard a "footnote". But no participant noticed that footnote, partly because most journalists are either dishonest or stupid.

The answer is that 2014 had the highest chance, according to their methodology and (possibly manipulated) raw numbers. But in both cases, the chance was lower than 50%. If I simply subtract the first numbers from 100 percent, the tables above imply that
According to NOAA, the probability that a different year than 2014 was the warmest one was 52%.
According to NASA, the probability that a different year than 2014 was the warmest one was 62%.
Using the informal language for the probability ranges, both teams' work implies that
It is more likely than not that the warmest year was a different one than 2014.
The temperatures were pretty much constant for a decade or two, so the error margins guarantee that all the warmer years in those decades – and there have been several warmer years like that – had comparable chances of having been the #1.
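The two percentages above are just complements of the agencies' own "2014 was the warmest" probabilities; a minimal sketch, taking the 48% (NOAA) and 38% (NASA) figures implied in the text:

```python
# Probability that some year OTHER than 2014 was the warmest, computed
# as the complement of each agency's own estimate that 2014 was #1.
# The input percentages are the ones quoted/implied in this post.
p_2014_warmest = {"NOAA": 48, "NASA GISS": 38}  # percent

for agency, p in p_2014_warmest.items():
    print(f"{agency}: {100 - p}% chance a different year was warmest")
```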

As the tweet unambiguously proves, Gavin Schmidt knew about this fact. That didn't prevent them from pushing virtually all mainstream media to publish the lie – in the very title – that
NASA: 2014 was the warmest year
Sorry, but even your own work shows that this probably wasn't the case and you were deliberately lying to the media – and everyone else – about the results of your work. You are doing it all the time, 24 hours a day, 7 days a week, and 52 or 53 weeks a year.

The tweet by Gavin Schmidt is a simple example of mass manipulation in action. They publish some of the correct yet inconvenient clarifications at places where almost no one reads – the press conference was attended by a small number of people, Schmidt has a few thousand Twitter followers, almost no one reads the bulk of the IPCC reports etc. – while at the places which matter because millions of people read them, they always post the distortions, oversimplifications, and downright lies.

This combination of strategies allows them to say that they "did release the truth". However, they are careful that they only speak the truth when almost no one listens.

Update

Hours after this blog post was published, The Daily Mail published a criticism by David Rose with pretty much the same content. It is totally plausible but not certain that he was able to make the same conclusion independently of this TRF blog post.
 

NASA Keeps Telling "Warmest" Lies



By Alan Caruba

On January 16 The New York Times reported the lies NASA keeps telling about global warming with an article titled “2014 Breaks Heat Record, Challenging Global Warming Skeptics.” We have reached the point where neither a famed government agency nor a famed daily newspaper can be believed simply because both are lying to advance the greatest hoax of the modern era.

Remember that 2014 started off with something called a “polar vortex” to describe the incredibly cold weather being experienced and remember, too, that we were being told that it was evidence of global warming! That’s how stupid the “Warmists” who keep saying such things think we are.

The Earth is in the 19th year of a natural cooling cycle based on the reduced radiation of the Sun which is in its own natural cycle. It hasn’t been getting warmer and most people who give it any thought at all know the truth of that.

Enough people have concluded this that, according to a recent CNN poll, more than half, 57%, say that global warming is not a global threat. In addition, the poll revealed that only 50% of Americans believe the alleged global warming is caused by man-made emissions, while 23% believe it is the result of natural changes, and 26% believe global warming is not a proven fact.

That’s progress. No youngster under the age of 19 has ever experienced a single day of global warming. No computer model that ever predicted it has been accurate. Neither the Pope nor the President, nor any other world leader who repeats the global warming claim is correct.

The latest claim came from NASA and, as I continue to remind readers, it is a government agency whose budget depends on parroting the lies the President keeps telling about global warming.

Astrophysicist, Dr. David Whitehouse, said “The NASA press release is highly misleading…talk of a record is scientifically and statistically meaningless.” He was joined by climatologist Dr. Roy Spencer who said “We are arguing over the significance of hundredths of a degree.”

Do you believe that a hundredth of a degree makes a difference? Well, it does if you are a government agency desperately trying to keep the global warming hoax alive. Climatologist Dr. Pat Michaels asked “Is 58.46 degrees distinguishable from 58.45 degrees? In a word, NO.”

Marc Morano, the editor of CFACT’s ClimateDepot.com, said, “There are dueling global datasets—surface temperature records and satellite records—and they disagree. The satellites show an 18 year-plus global warming standstill and the satellite was set up to be ‘more accurate’ than the surface records.” As for the NASA claim, Morano dismissed it as “simply a political statement not based on temperature gauges.” Morano, a former member of the staff of the U.S. Senate Environmental & Public Works Committee, is working on an upcoming documentary “Climate Hustle.”

How does this affect you? The lie that carbon dioxide and methane emissions, dubbed “greenhouse gases”, are causing global warming is the basis for the Obama administration’s attack on the nation’s energy sector and, in particular, the provision of electricity by coal-fired plants. In the past six years many of these plants have been shut down or will be. The result is less electricity and higher prices for electricity. The other result is an attack on the oil and natural gas industries that drill for these resources. There is not a scintilla of truth to justify what is being done to Americans in the name of global warming.

There is yet another result, and that is the loss of jobs in the energy sector and the reduction in revenue to the nation and the states. The nation’s economy overall has been in a sluggish state that the word “growth” doesn’t even begin to describe. That hurts everyone.


Most of us don’t have a lot of time to get up to speed and stay there regarding the facts surrounding global warming or climate change. An excellent source of information is the Environment & Climate News, a monthly publication by The Heartland Institute, a thirty year old non-profit free market think tank that will sponsor its tenth annual International Conference on Climate Change in Washington, D.C. in June.

NASA has been allowed to degrade to the point where the agency that sent men to the Moon no longer has the capacity even to transport astronauts to the International Space Station without Russian rockets. We have gone from the world’s leader in space exploration to an agency that has been turned into a propaganda machine asserting that a hundredth of a degree “proves” that global warming is happening.

The U.S. and the rest of the world are setting records, but they are records for how cold it has become everywhere. There was snow recently in Saudi Arabia from a storm that swept across Iraq, Syria, Lebanon, Israel, and Jordan. Does that sound like global warming to you? For an excellent source of information on the cooling of the planet, visit http://iceagenow.info.

You have an obligation to yourself, your family, friends and co-workers to not just know the truth but to denounce entities like NASA, the EPA, and The New York Times, Time, Newsweek, National Geographic, and others that keep repeating the lies about global warming.

© Alan Caruba, 2015
 
I see you are wrong again.
Catching on yet?


 

LOL
 
Can you see a trend here OBD?
https://www.youtube.com/watch?v=vA7tfz3k_9A#t=202
 
The Most Dishonest Year On Record

Last week, according to our crackerjack mainstream media, NASA announced that 2014 was the hottest year, like, ever.

No, really. The New York Times began its report with: “Last year was the hottest in earth’s recorded history.”

Well, not really. As we’re about to see, this is a claim that dissolves on contact with actual science. But that didn’t stop the press from running with it.

If you follow the link I gave to the New York Times piece, you will see that this opening sentence has since been rewritten, for reasons which will soon become clear. But the Times wasn’t the only paper to start with that claim, and most of the headlines are still up. The Washington Post has: “2014 Was the Hottest Year in Recorded History.” The Boston Globe: “2014 Was Earth’s Hottest Year in Recorded History.” And so on.

You can see how misleading this is. When you read the phrase “in recorded history,” you think we’re talking about a really long time—the time dating back to the first historical records in Sumeria, circa 3500 BC. (That’s what you’ll find if you look up the phrase “recorded history.”) That’s a time frame of 5,000 to 6,000 years. But in the case of the temperature record, it actually means only 135 years. Accurate, systematic, global thermometer measurements of surface temperatures go back only to 1880. That’s why the Times report, presumably after getting whacked for a wildly misleading opening sentence, changed it to: “Last year was the hottest on earth since record-keeping began in 1880.” Which is a whole lot less impressive.

That “recorded history” gaffe is even worse when you consider that during “recorded history,” in the 5,000-year sense of the phrase, there’s good evidence that the Earth has been warmer than it is today.

We don’t have thermometer measurements going back that far, but scientists can use “proxies”—things they can measure that tend to vary with temperature, such as the composition of ancient deposits of seashells, or the thickness of the rings in ancient, slow-growing trees—to get very rough estimates. These have usually shown warmer temperatures during Roman times and the Middle Ages, when “recorded history” describes wine grapes growing in Northern England and Newfoundland.

There have been a few attempts to write these warm periods out of existence—one of them being Michael Mann’s infamous “hockey stick” graph, which implausibly asserts that global temperatures remained totally flat in every century except the 20th—but these claims have been controversial to say the least.

That’s why the implication that this is the warmest year ever in all of human history should never have gotten by a reporter who knows the first thing about the science on this issue. It implies a claim that we’re pretty sure just isn’t true.

Now let’s move on to the corrected statement, that this is the hottest year since the thermometer record began in 1880. But this is a very short time for gathering data about the climate and distinguishing new trends from natural variation. For example, about half of the warming in that record occurred before 1940, too early to have been caused by human activity. That early warming was probably a natural rebound from the Little Ice Age, a cool period that ended in the middle of the 19th century.

More broadly, all changes in temperature that we observe today are relatively small variations within a much larger trend on a geological time scale. We know that the earth is going through a series of freezing and warming cycles on a scale of tens of thousands to hundreds of thousands of years. And it has mostly been freezing. We’re fortunate enough to live in a cozy, warm “interglacial” period between ice ages. So we’re all staring down the barrel of the next ice age, yet we’re spending our time worrying about global warming.

But let’s say we take this hyperventilation about a few relatively warm decades seriously. Even by that standard, this latest claim is ridiculously over-hyped.

If 2014 is supposed to be “hotter” than previous years, it’s important to ask: by how much?

You can spend a long time searching through press reports to get an actual number on this—which is a scandal unto itself. Just saying one year was “hotter” or “the hottest” is a vague qualitative description. It isn’t science. Science runs on numbers. You haven’t said anything that is scientifically meaningful until you state how much warmer this year was compared to previous years—and until you give the margin of error of that measurement.

The original NASA press release did not give those figures—and most press reports just ran with it anyway. This in itself says a lot. When it comes to global warming, “journalism” has come to mean: “copying press releases from government agencies.”

But a few folks decided to do some actual journalism, and Britain’s Daily Mail reports that

the NASA press release failed to mention…that the alleged ‘record’ amounted to an increase over 2010, the previous ‘warmest year’, of just two-hundredths of a degree—or 0.02C. The margin of error is said by scientists to be approximately 0.1C—several times as much.

Pause for a moment to digest that. The margin of error was plus or minus one tenth of a degree. The difference supposedly being measured here is two hundredths of a degree—one fifth the size of the margin of error. The Daily Mail continues:

As a result, GISS’s director Gavin Schmidt has now admitted NASA thinks the likelihood that 2014 was the warmest year since 1880 is just 38 per cent. However, when asked by this newspaper whether he regretted that the news release did not mention this, he did not respond.

This is not exactly a high point in the employment of the scientific method.
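For what it’s worth, the flavor of that 38 percent figure can be reproduced with a back-of-the-envelope calculation. The sketch below is a simplified pairwise comparison of 2014 against 2010 alone, under the assumption of independent Gaussian measurement errors; it is not NASA’s actual method, which compares many candidate years at once.

```python
from math import erf, sqrt

# Figures quoted in the article: 2014 beat 2010 by 0.02 C, with a
# stated measurement uncertainty of roughly 0.1 C for a single year.
diff = 0.02
sigma = 0.10

# If each year's anomaly carries an independent Gaussian error of
# sigma (a simplifying assumption), the error on their difference
# has standard deviation sqrt(2) * sigma.
sigma_diff = sqrt(2) * sigma

# Probability that 2014 genuinely was warmer than 2010.
z = diff / sigma_diff
p_warmer = 0.5 * (1 + erf(z / sqrt(2)))
print(round(p_warmer, 2))  # about 0.56 -- barely better than a coin flip
```

Even this crude two-year comparison shows why a 0.02 C lead inside a 0.1 C error bar cannot support a confident “hottest year” headline.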

If we take into account this margin of error, the most we can say is that 2014 was, so far as we know, just as warm as 2005 and 2010. There is no significant difference between these years. And that gives the lie to claims of runaway global warming. As the redoubtable Judith Curry recently pointed out:

The real issue that is of concern to me is the growing divergence between the observed global temperature anomalies and what was predicted by climate models. Even if 2014 is somehow unambiguously the warmest year on record, this won’t do much to alleviate the growing discrepancy between climate model predictions and the observations.

She links to this graph which shows that observed temperatures are coming in at or below the low end of the range predicted by the climate models. With every year that passes, the models predict a greater and greater increase in temperature—but the actual observations remain stubbornly flat. Curry concludes that “ranking 1998, 2005, 2010, and 2014 as the ‘warmest years’ seems very consistent with a plateau in surface temperatures since 1998.”

So allow me to suggest a more accurate first sentence to sum up this story: “In the tiny little blip of geological time for which we have accurate surface temperature records, last year was pretty much the same as 2005 and 2010, continuing a plateau of global temperatures that has lasted nearly 20 years.”

What remains of the original description of this news? Nothing but bluff, spin, and the uncritical press-release journalism that dominates mainstream reporting on the climate. It may or may not be the hottest year ever, but this is definitely in the running for the most dishonest year on record.
 

http://www.iflscience.com/environment/solving-puzzle-sea-level-rise-reexamining-past
Solving The Puzzle Of Sea-Level Rise By Reexamining The Past

January 16, 2015 | by Carling Hay

photo credit: Harvard and Rutgers scientists propose a new, potentially more accurate way to measure the rate of sea-level rise.

When you ask yourself what the biggest unanswered scientific questions are, “how did sea levels change over the past 100 years?” is unlikely to appear at the top of your list.

After all, haven’t we already figured that out? It turns out that obtaining a complete picture of how our oceans have been changing is not a simple task, yet is vital for making future projections. In a paper published in Nature this week, my research group developed a more accurate yardstick for measuring rising sea levels, offering clues to a discrepancy that scientists have grappled with for years and potentially insight into future projections of the rise in sea levels.

Sea-level observations over the 20th century come from tide-gauge records, which, in their simplest form, are essentially yardsticks attached to coastlines around the world. The global coverage of these measurements is limited, particularly at the start of the 20th century and in the southern hemisphere. Additionally, even the most complete records can include significant gaps in time. The incompleteness of these records makes obtaining estimates of global mean sea level very difficult.

The latest report from the Intergovernmental Panel on Climate Change (IPCC) included two different estimates of the global increase in sea level over the 20th century. The first estimate came from a suite of previously published studies that analyzed the tide-gauge observations directly. The 1901-1990 rate obtained using these approaches falls in the range of 1.5 to 2 millimeters per year.

The second estimate was computed by adding estimates from individual sources, such as melting water from ice on land and expansion of the ocean. This “bottom-up” approach produced a lower rate of 1 to 1.2 millimeters per year over the same time period.

Analyzing the fingerprints

Explaining the difference between these two different estimates has been a pressing issue within the sea-level community. Most scientists believe the higher estimate derived from tide-gauge records, but they have questioned the lower “bottom-up” estimate.

One possible explanation is that estimates of Greenland and Antarctic melting over the century may have been underestimated. Measurements of ice sheet mass balances have traditionally come from ground-based measurements and satellite observations. However, these observations are also very limited.

That’s when we started to ask ourselves whether or not it was possible to use sea-level observations to try to estimate how much individual ice sheets and glaciers have been melting. Could we use tide-gauge observations over the 20th century to infer how individual contributions combine to produce the global increase in sea levels?

Addressing this question turned out to be much more challenging than any of us initially thought. The physics that allows us to tackle this problem comes down to understanding why sea level at one tide gauge is different than sea level at another.

There are a variety of factors that affect local sea level measurements. These include ongoing effects due to the last ice age, heating and expansion of the ocean due to global warming, changes in ocean circulation and present-day melting of land ice.

All of these processes produce unique patterns, or “fingerprints,” of sea-level change that we can model and predict. Our goal has been to infer the individual contributions to sea level by looking for these fingerprints in the tide-gauge records. This type of “fingerprinting” analysis had been applied to paleo sea-level records, but no one had attempted to look for these predicted patterns in 20th-century sea-level observations.



Developing a way to fingerprint modern sea-level records involved drawing on data analysis and statistical techniques common in other fields such as engineering, economics and meteorology. We brought together these techniques and applied them, for the first time, to the field of sea-level research.

Essentially, our method extracts global information from the limited local observations. Once we began applying our statistical approach to tide-gauge records, we realized that we could add all of our estimated individual contributions to produce a record of global mean sea-level change over time. We assumed that our estimate would agree with the previously published results, but what we found was actually quite different.
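The core idea described above—recovering global contributions from limited local records by fitting known spatial patterns—can be illustrated with a toy least-squares example. Everything below is hypothetical: the gauge patterns and amplitudes are made up, and the paper’s actual statistical machinery is far more sophisticated than an ordinary least-squares fit.

```python
# Toy "fingerprinting": infer the amplitudes of two known spatial
# patterns from local tide-gauge readings.

# Hypothetical fingerprints at four tide gauges (mm/yr of local rise
# per unit of each underlying process).
ocean_warming = [1.0, 0.8, 0.6, 0.4]
ice_melt = [0.2, 0.5, 0.9, 1.1]

# Synthetic observations built from known "true" amplitudes.
true_a, true_b = 0.7, 1.2
obs = [true_a * w + true_b * m for w, m in zip(ocean_warming, ice_melt)]


def dot(x, y):
    return sum(xi * yi for xi, yi in zip(x, y))


# Least-squares fit via the 2x2 normal equations.
A11 = dot(ocean_warming, ocean_warming)
A12 = dot(ocean_warming, ice_melt)
A22 = dot(ice_melt, ice_melt)
b1 = dot(ocean_warming, obs)
b2 = dot(ice_melt, obs)
det = A11 * A22 - A12 * A12
a_hat = (A22 * b1 - A12 * b2) / det
b_hat = (A11 * b2 - A12 * b1) / det
# a_hat and b_hat recover 0.7 and 1.2: the global contributions fall
# out of purely local measurements.
```

The point of the sketch is the inversion itself: because each process leaves a distinct spatial pattern, a handful of local gauges can pin down the global amplitude of each contribution.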

The results of our analysis show that the previous estimate for 1901-1990 was too high. We estimate a 1901-1990 “sea-level rise” rate of 1.2 millimeters per year, down from 1.5 millimeters per year. In fact, our lower estimate agrees with the “bottom-up” approach presented in the IPCC. It closes “the sea-level budget” gap by eliminating the discrepancy between the two different types of measurements.

Image Credit : NASA Goddard Space Flight Center, CC BY

While this may initially appear to be a positive result – that the rise in sea levels was slower last century – it isn’t necessarily. When we look at sea-level rise in the last few decades, we find that our 1993-2010 rate of 3 millimeters per year agrees with previously published results. Using our model, this means that the rate of global sea-level rise has increased not by a factor of two (3 divided by 1.5 millimeters per year) but rather by a factor of 2.5 (3 divided by 1.2 millimeters per year).

That implies that the rate of global sea-level rise is 25% higher in recent decades – a substantial increase. And this revision may affect projections of future sea-level rise.
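The arithmetic behind that factor-of-2.5 conclusion is simple enough to check directly, using only the rates quoted in the article:

```python
# Rates of global mean sea-level rise quoted in the article (mm/yr).
rate_1901_1990_old = 1.5   # earlier tide-gauge estimates
rate_1901_1990_new = 1.2   # the revised Hay et al. estimate
rate_1993_2010 = 3.0       # the recent rate, which both approaches agree on

factor_old = rate_1993_2010 / rate_1901_1990_old   # 2.0
factor_new = rate_1993_2010 / rate_1901_1990_new   # 2.5

# The implied acceleration is 25% larger under the revised baseline.
relative_increase = factor_new / factor_old - 1    # 0.25
```

Lowering the historical baseline leaves the recent rate untouched, so the apparent acceleration between the two eras necessarily grows.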
 
http://www.iflscience.com/environment/humanity-existential-danger-zone-study-confirms

Humanity Is In The Existential Danger Zone, Study Confirms
January 15, 2015 | by James Dyke
photo credit: Humans have ignored the signs. Thomas Hawk, CC BY-NC


The Earth’s climate has always changed. All species eventually become extinct. But a new study has brought into sharp relief the fact that humans have, in the context of geological timescales, produced near instantaneous planetary-scale disruption. We are sowing the seeds of havoc on the Earth, it suggests, and the time is fast approaching when we will reap this harvest.

All this comes in the year that the UN climate change circus will pitch its tents in Paris. December’s Conference of the Parties will be the first time individual nations submit proposals for their carbon emission reduction targets. Sparks are sure to fly.

The research, published in the journal Science, should focus the minds of delegates and their nations as it lays out in authoritative fashion how far we are driving the climate and other vital Earth systems beyond any safe operating space. The paper, headed by Will Steffen of the Australian National University and Stockholm Resilience Centre, concludes that our industrialised civilisation is driving a number of key planetary processes into areas of high risk.

It argues climate change along with “biosphere integrity” should be recognised as core elements of the Earth system. These are two of nine planetary boundaries that we must remain within if we are to avoid undermining the biophysical systems our species depends upon.

The original planetary boundaries were conceived in 2009 by a team led by Johan Rockstrom, also of the Stockholm Resilience Centre. Together with his co-authors, Rockstrom produced a list of nine human-driven changes to the Earth’s system: climate change, ocean acidification, stratospheric ozone depletion, alteration of nitrogen and phosphorus cycling, freshwater consumption, land use change, biodiversity loss, aerosol loading, and chemical pollution. Each of these nine, if driven hard enough, could alter the planet to the point where it becomes a much less hospitable place on which to live.

The past 11,000 years have seen a remarkably stable climate. The name given to this most recent geological epoch is the Holocene. It is perhaps no coincidence that human civilisation emerged during this period of stability. What is certain is that our civilisation is in very important ways dependent on the Earth system remaining within or at least approximately near Holocene conditions.

This is why Rockstrom and co looked at human impacts in these nine different areas. They wanted to consider the risk of humans bringing about the end of the Holocene. Some would argue that we have already entered a new geological epoch – the Anthropocene – which recognises that Homo sapiens have become a planet-altering species. But the planetary boundaries concept doesn’t just attempt to quantify human impacts. It seeks to understand how they may affect human welfare now, and in the future.



It’s been a stable 11,000 years. Steffen et al​

The 2009 paper proved to be very influential, but it also attracted a fair amount of criticism. For example, it has been argued that some of the boundaries are not in fact global in scale. There are very large regional variations in consumption of freshwater and phosphorus fertiliser pollution, for instance.



Phosphorous pollution in croplands. Steffen et al

That means that while globally we may be in the green, there could be an increasing number of regions that are deep in the red.

Updated boundaries

The latest research develops the methodology so that it now includes regional evaluations. For example, it assesses basin-level freshwater use and biome-level species extinction rates. It also includes a new boundary of “novel entities” – new forms of life and novel compounds the likes of which the Earth system has not experienced and whose impact is therefore extremely challenging to assess. Ozone-depleting CFCs are perhaps the best example of how a seemingly inert substance can produce planetary damage.



Tree cover remaining in the world’s major forest biomes. Steffen et al​

The paper also gives an update on where we stand on some of the planetary boundaries. At first sight, it looks as though there may be some good news in that climate change is no longer in the red. But then closer inspection reveals that a new yellow “zone of uncertainty with increasing risk” has been added to the previous green and red classification.



2/9ths into the red. Steffen et al

Climate change impacts are firmly within this new yellow zone. Our atmosphere currently has about 400 parts per million (ppm) of carbon dioxide. To return to the green zone we still need to get back down to 350ppm – the same precautionary boundary as before.

Perhaps most importantly, the research produces a two-tier hierarchy in which climate change and biosphere integrity are recognised as the core planetary boundaries through which the others operate. This makes sense: life and climate are the main columns buttressing our continued existence within the Holocene. Weakening them risks amplifying stresses on the other boundaries.

Reasons not to be cheerful

And so to the very bad news. Given the importance of biodiversity to the functioning of the Earth’s climate and the other planetary boundaries, it is with real dismay that this study adds yet more evidence to the already burgeoning pile that concludes we appear to be doing our best to destroy it as fast as we possibly can.

Extinction rates are very hard to measure, but the background rate – the rate at which species would be lost in the absence of human impacts – is something like ten a year per million species. Current extinction rates are anywhere from 100 to 1,000 times higher than that. We are possibly in the middle of one of the great mass extinctions in the history of life on Earth.
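To put those multipliers in rough absolute terms, here is a quick sketch. Note that the total-species figure below is an assumption on my part (a commonly cited estimate), not a number from the article.

```python
# Figures from the article: a background rate of about 10 extinctions
# per year per million species, with current rates 100 to 1,000 times
# higher than background.
background_per_million = 10
low_multiplier, high_multiplier = 100, 1_000

# Assumed total species count (~8.7 million is a commonly cited
# estimate; the article does not give one).
total_species_millions = 8.7

background_per_year = background_per_million * total_species_millions  # ~87
current_low = background_per_year * low_multiplier    # ~8,700 per year
current_high = background_per_year * high_multiplier  # ~87,000 per year
```

Even at the low end of the quoted range, the implied losses run to thousands of species a year, which is what motivates the mass-extinction comparison.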
 

The credibility of your posts has now been shown to be very questionable due to the recent "this is the warmest year ever" claim put out by NASA.
It makes all your arguments questionable, due to the motivation of the scientists.




Here's a quicker method for this thread.
Did OBD post it?
A) Yes - it's bad science.
B) No - it's probably not bad science.
 
The credibility of your posts has now been shown to be very questionable due to the recent "this is the warmest year ever" claim put out by NASA.
It makes all your arguments questionable, due to the motivation of the scientists.

ya OBD - 95% of all of NASA's, NOAA's, and the IPCC's likely thousands of scientists, technicians, and support staff are "secretly" against big oil and are making this stuff up to ruin the economy...

CHECK YOUR MEDICATION!

I think it's contraindicated.
 