

environmentalresearchweb blog

Renewable Impacts

By Dave Elliott

It has been an eventful year for renewables. While progress continues apace, with renewables now supplying around 15% of electricity in the UK and 22% of global electricity, in this pre-Xmas post, rather than spelling out all the good news, I will look at some of the less good stories from the year, concerning wind power and concentrated solar power (CSP). Continue reading


AGU Fall Meeting 2014: new underwater vehicle discovers ‘slimeballs’

by Liz Kalaugher

The Arctic can seem like a barren landscape but huge numbers of “slimeballs” are lurking beneath the sea ice. As Antje Boetius of the Alfred Wegener Institute (AWI) Helmholtz Center for Polar and Marine Research, Germany, explained at an AGU Fall Meeting press conference, she discovered these larvaceans, as they are more correctly known, during a July 2014 test run of the brand new Nereid Under Ice (NUI) remotely operated vehicle. “It was the first time we could see such an abundance of life under the ice,” she said.

The larvaceans, so named because scientists first thought they must be the larval stage of a creature that would later become more beautiful, feed on the algae growing beneath the sea-ice. And they’re gelatinous, so they look pretty slimy. Copepods and ctenophores (comb jellyfish) were also in evidence, with the jellyfish feeding on the copepods. What’s not clear, according to Boetius, is which animals are feeding on the jellyfish and acting in turn as food for the seals, and ultimately the polar bears, also spotted during the tests.

NUI was ideal for this task because its fibre-optic link to the Polarstern icebreaker means it can stray further from the ship than a conventional tether would allow, reaching undisturbed ice or even beneath glacial ice tongues or ice shelves. The tethering system for NUI was originally developed for the Nereus deep-sea robot, which explored the Mariana Trench in 2009 but was lost in May 2014.

Although polar explorer Nansen was the first to describe, in Boetius’ words, the “brownish greenish mass of ice algae”, scientists hadn’t been able to map the algae until now. “People still assume zero production under the ice but that is wrong,” said Boetius. Previously, scientists were only able to observe algae underneath ice broken up as their research ships passed through it. Even buoys can’t reach the top two metres of the ocean beneath sea-ice – they’re at risk of damage from ice projecting below the surface.

Boetius first included NUI in a research cruise proposal five years ago, even though it didn’t then exist. “The biggest fun is when someone tells you that you can’t do something and you go ahead and do it anyway,” she said. “It was a real joy to be able to work with it.” The researcher was confident that Woods Hole Oceanographic Institution (WHOI), US, could develop such a vehicle in time for an Arctic research cruise by AWI’s icebreaker Polarstern. And the organization came up trumps. As well as the fibre-optic link, the $3 million vehicle incorporates under-ice and seafloor landing skids and an acoustic communication system in case the fibre breaks. “Usually ROVs are constrained to stay below the ship,” said Michael Jakuba, WHOI’s lead project engineer for NUI. “This one can be further away.”

This summer NUI made four dives in the Arctic, reaching up to 800 m away from the ship and to a maximum depth of 45 m. Now Boetius and colleague Christopher German of WHOI have 16 hours of video to examine in detail.


AGU Fall Meeting 2014: territory dispute over Greenland helps climate researchers

by Liz Kalaugher

Old aerial photos of Greenland helped researcher Anders Bjork track glacier retreat. Image credit: Natural History Museum of Denmark.

You never know how history will turn out. Back in 1931, a group of Norwegians settled in south-east Greenland. Both they and the Danes, who reckoned Greenland belonged to them, began aerial mapping to prove their claims to the land in the international court in The Hague.

The science that helped the Danes win the dispute is now valuable to researchers studying the retreat of coastal glaciers. As Anders Bjork of the Natural History Museum of Denmark detailed in a press conference at the AGU Fall Meeting, he’s used some of the photos from these four years of flights to examine 110 years of changes in Greenland.

Denmark, according to Bjork, was “quite surprised” when Norway moved into Greenland’s desolate, uninhabited south-east.

Both the Danes, assisted by German pilots, and the Norwegians surveyed the region from the air, dealing with temperatures of -20°C and flight altitudes of 11,000 feet to avoid Greenland’s high mountains. This was a challenge for the aviators given that the highest peak in Denmark is just 500 feet, Bjork explained. Louise Boyd from Marin County, just down the road from the San Francisco meeting, also took aerial shots of the area.

By comparing the historic photos with data from IceBridge flights, Bjork found that one glacier retreated by 5 km between 1932 and 2013. The pictures also revealed that the glaciers retreated by 30 metres per year in the early part of last century, compared to 10 metres per year in the last fifteen years or so. The 1920s and 1930s saw temperatures rise by 2°C per decade as the effects of the Little Ice Age waned; from 1990 to 2010 the increase was 1.3°C per decade. Bjork says the massive meltdown after the Little Ice Age shows that glaciers respond fast to changes in climate and precipitation.
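A quick back-of-the-envelope check on those figures (a Python sketch; the 5 km retreat and the date range are from Bjork’s talk as reported above, the rest is plain arithmetic):

```python
# Sanity check on the glacier-retreat figures quoted above.

total_retreat_m = 5_000   # one glacier's retreat between 1932 and 2013 (Bjork)
years = 2013 - 1932       # 81 years

avg_rate_m_per_yr = total_retreat_m / years
print(f"Average retreat rate: {avg_rate_m_per_yr:.0f} m/yr")  # ~62 m/yr

# Note that this average sits well above the regional rates quoted
# (30 m/yr early in the 20th century, 10 m/yr recently), so this
# particular glacier evidently retreated much faster than the norm.
```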

Bjork reckons the scientific data also had a big influence on the international court’s final decision. The Danes had explored the area since the mid-19th century, longer than Norway. “The Danish superstars were polar explorers,” he said. “Thousands of people were cheering when they left [on their expeditions].”


AGU Fall Meeting 2014: why do volcanologists have all the best tunes?

by Liz Kalaugher

After a long, fascinating day of talks, posters and rain, where better to head than to a bar, particularly one with free entertainment? Again the AGU had it covered – last night saw the Fall Meeting’s third annual Open Mic night, with delegates signing up for three-minute spots.

The range of approaches taken by these multitalented geoscientists was awesome. There was an astronomy rap, singing with accompaniments from iPods, ukuleles, or guitars, juggling, plate-spinning, a brief moment of interpretative dance, performance poetry, readings, advice to new postdocs (“say nothing, just smile”), no fewer than two renditions of Patsy Cline’s Crazy, the first serenaded to compere Richard Alley of Pennsylvania State University on the grounds that “they’re crazy for ignoring your data”, science jokes (“what’s an astronaut’s favourite computer key? The space bar”) and a tribute to a participant’s thesis advisor to the tune of Eye of the Tiger. Not to mention a parody of a hit from Frozen – “let it flow, I never cared about pyroclastics anyway”.

Thank you to everyone who participated for a thoroughly enjoyable evening. The variety was astounding. Except for one thing – volcanologists seemed particularly likely to take up a musical instrument and sing about their science. So was last night’s dataset a fluke or has the AGU Fall Meeting found its very own reproducible result? Be here next year to witness the repeat experiment. If you come up to the mic you could even skew the results.


AGU Fall Meeting 2014: climate scientists avoid the weather

California raining: geoscientists at the AGU Fall Meeting huddle in the lobby of Moscone West.

by Liz Kalaugher

At 6.10 pm on Tuesday, it felt for the first time like there really were 24,000 delegates at the AGU Fall Meeting in San Francisco. Most of them, it seemed, were clustered in the lobby of Moscone West, trying to avoid heading out into the rare Californian rain.

The rain is “certainly welcome”, according to Jay Famiglietti of NASA’s Jet Propulsion Laboratory speaking to a press conference that morning. (Some of the conference attendees in the lobby later that day may have disagreed.) But it won’t have been enough to stop the drought. The state will need 42 cubic km (or 11 trillion gallons) of water to recover, Famiglietti has determined using data from the GRACE (Gravity Recovery and Climate Experiment) satellites, in the first calculation of its kind. Since 2011, the Sacramento and San Joaquin river basins have decreased in volume by 15 cubic km of water each year, the analysis also revealed, more water than California residents use for their annual domestic and municipal needs.
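A quick unit check on those figures (a minimal Python sketch; the volumes are Famiglietti’s as quoted above, the conversion factors are standard):

```python
# Unit check on the California drought figures quoted above.

M3_PER_KM3 = 1e9          # cubic metres per cubic kilometre
US_GAL_PER_M3 = 264.172   # US gallons per cubic metre

deficit_km3 = 42          # water needed for the state to recover (Famiglietti)
annual_loss_km3 = 15      # Sacramento + San Joaquin basin loss per year since 2011

deficit_gal = deficit_km3 * M3_PER_KM3 * US_GAL_PER_M3
print(f"42 km^3 = {deficit_gal / 1e12:.1f} trillion US gallons")  # ~11.1 trillion

# The deficit equals roughly three years of recent basin losses:
print(f"Deficit / annual loss = {deficit_km3 / annual_loss_km3:.1f} years")
```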


AGU Fall Meeting 2014: Mecca pollution reaches mega levels

by Liz Kalaugher

Each year millions of pilgrims head to the Saudi Arabian city of Mecca, temporarily doubling or even tripling its 1.7 million population in the world’s largest religious pilgrimage. But the number of inhabitants is not the only factor that gets a boost over the five days of the Hajj – the resulting traffic congestion as pilgrims travel between holy sites means that air pollution levels soar.

“The air quality during the Hajj season exposes pilgrims to dangerous levels of pollutants,” said Haider Khwaja of the University at Albany, US, at a press conference at the AGU Fall Meeting in San Francisco. “This is a major, major health problem.” The pollution can exceed World Health Organization (WHO) standards for PM10, PM2.5 and ozone many times over.

As Isobel Simpson of the University of California, Irvine, US, explained, when she and her colleagues took measurements in October 2012 for one of Saudi Arabia’s first air quality studies, they were surprised at how high some of the pollutant concentrations were – among the highest the team had seen in urban areas around the world in two decades of research.

The situation was particularly bad inside Mecca’s tunnels. The city has 58 tunnels, from 0.7 to 1.4 km long, some shared by vehicles and pedestrians. Inside the Al-Masjid Al-Haram tunnel, levels of carbon monoxide reached a peak of 57 parts per million (ppm), the study revealed, 300 times the background level. Exposure to this kind of concentration for just a short time can cause headache, dizziness and nausea, as well as potentially bringing on heart attacks. This isn’t just an issue for pilgrims – local citizens such as police officers, hotel workers and volunteers can spend a considerable amount of time inside the tunnels doing their jobs or giving directions.
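Taken together, the peak level and the “300 times background” figure pin down the implied background concentration (arithmetic only, using the values as quoted):

```python
# What background CO level do the quoted tunnel figures imply?

peak_ppm = 57    # CO peak inside the Al-Masjid Al-Haram tunnel
ratio = 300      # "300 times the background level"

background_ppm = peak_ppm / ratio
print(f"Implied background CO: {background_ppm:.2f} ppm")  # ~0.19 ppm

# ~0.2 ppm is a plausible ambient CO level, so the two quoted
# figures are mutually consistent.
```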

Khwaja has found that the PM10 concentrations along one particular route caused a 6–700% increase in the risk of daily mortality, a 7–840% increase in the risk of hospital admissions and a 39–4550% increase in the risk of cough. Children and those over fifty are likely to be the most susceptible groups.
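To put those percentages on a more intuitive scale, a percentage increase in risk translates to a relative risk of 1 + increase/100. A quick sketch using the upper ends of the quoted ranges (my reading of the ranges, not a figure from the talk):

```python
# Convert the quoted percentage increases in risk to relative risks,
# taking the upper end of each quoted range.
for outcome, pct_increase in [("daily mortality", 700),
                              ("hospital admissions", 840),
                              ("cough", 4550)]:
    relative_risk = 1 + pct_increase / 100
    print(f"{outcome}: up to {relative_risk:.1f}x the baseline risk")
```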

So what’s causing the pollution? The major source is vehicle exhaust, Simpson said, with a surprise entry for gasoline evaporation in the number two spot. Adding a rubber vapour-lock ring around every gas-pump nozzle could significantly cut these emissions when drivers fill up their tanks, Simpson explained. Liquefied petroleum gas was the third most important source, with coolants from air-conditioning systems for accommodation tents taking fourth place.

Other ways to tackle the problem, according to Azhar Siddique of King Abdulaziz University, Saudi Arabia, include reducing the amount of benzene in the country’s gasoline (which is relatively rich in the compound); cutting fuel sulphur levels; introducing emissions testing and regulations for vehicles, such as compulsory catalytic converters or particulate traps; raising the price of fuel; providing separate tunnels for vehicles and pedestrians; managing traffic during the Hajj; and introducing more public transport. The Hajj route has had a railway since 2011 and there are plans to expand this, as well as to introduce air quality monitoring to Mecca’s tunnels.

With as many as 5 million pilgrims projected for 2025, it’s clear the air pollution problem isn’t going to go away by itself.


AGU Fall Meeting 2014: Lightning never strikes twice?

by Liz Kalaugher

Normally aircraft pilots avoid thunderstorms but in spring 2014 one particularly intrepid aviator spent five days deliberately seeking out and flying directly through storms, for hours at a time. The goal? To find out more about the X-rays and gamma-rays created by lightning.

As Pavlo Kochkin of the Eindhoven University of Technology in The Netherlands explained at the AGU Fall Meeting, in experiments “not easy to repeat”, the plane, donated by Airbus, headed out from Toulouse, France, earlier this year, chock-full of sensors from ILDAS, the in-flight lightning strike damage assessment system. On 30 April it made for northern Italy, where it descended into a storm from a height of 4 km and spent five hours battling turbulence and enduring 20 lightning strikes whilst taking measurements all the while.

The tests confirmed that lightning is bright in X-rays, according to Kochkin, and may have witnessed a long gamma-ray glow, as well as confirming that the radiation is a property of the lightning and not related to altitude.

Kochkin stressed that people don’t need to worry about terrestrial gamma-ray flashes when they get on a plane. But if the pilot of a research craft deliberately flying through a storm suffered a direct hit from a gamma-ray, they could receive a dose of radiation equivalent to a whole-body CT scan in just a few microseconds. “It’s not necessarily great but it’s not going to kill you,” said Kochkin. Gamma-rays could also affect the plane’s electronics but “probably the aircraft will do fine”.
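To see why “a CT scan’s worth of dose in a few microseconds” is remarkable, compare dose rates. A rough sketch; the ~10 mSv CT dose and the ~2.4 mSv/yr natural background are typical reference values, not figures from the talk:

```python
# Rough dose-rate comparison for a direct gamma-ray hit on an aircraft.

ct_dose_msv = 10.0        # typical whole-body CT dose (assumed, ~10 mSv)
flash_duration_s = 5e-6   # "a few microseconds"

dose_rate = ct_dose_msv / flash_duration_s
print(f"Dose rate during the hit: {dose_rate:.1e} mSv/s")  # ~2e6 mSv/s

# For scale, natural background is ~2.4 mSv per *year*:
background_rate = 2.4 / (365.25 * 24 * 3600)  # ~7.6e-8 mSv/s
print(f"~{dose_rate / background_rate:.0e} times the natural background rate")
```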

As Joseph Dwyer, University of New Hampshire, Durham, US, detailed, there are several sources of X- and gamma-rays inside our atmosphere. Lightning near the ground, for example, produces X-rays, whilst thunderstorms glow in gamma-rays, sometimes continuously. “This is really strange,” said Dwyer. “It’s … like a black hole or supernova would do.” But in space, or in particle accelerators, the gamma-rays are produced inside a vacuum – in the atmosphere they form under pressure. Thunderstorms also produce terrestrial gamma-ray flashes, which last a few milliseconds and can be picked up by low Earth orbit spacecraft. Around 1000 such flashes a day are detected worldwide.

Understanding these high-energy emissions could give us clues to how thunderstorms and lightning work, Dwyer said – we still don’t know how lightning starts. Since terrestrial gamma-ray flashes form around the same time as lightning, as a result of the strong electric field inside the storm cloud that may also cause lightning, “you can think of them as a probe for looking at lightning initiation.” Terrestrial gamma-ray flashes are tricky to study too, though. “It’s hard to measure inside a storm,” Dwyer said, particularly since not many spacecraft are dedicated to the task.

But it’s worth the effort. Terrestrial gamma-ray flashes are the most energetic atmospheric phenomenon on planet Earth, as Themistoklis Chronis, University of Alabama, Huntsville, US, put it. Discovered in the early 1990s, the flashes were first linked to high-altitude sprites before scientists realised their association with intra-cloud lightning.

The flashes are still able to surprise today. Despite theories that only stronger storms would create the flashes, Chronis has found that even weak clouds are up to the job. He and his colleagues looked at 24 storms that produced the flashes in locations along the US Gulf Coast, the Caribbean and Guam, where NEXRAD ground radar and atmospheric sounding data were also available to provide details on storm strength. Any type of storm can generate a terrestrial gamma-ray flash, the team concluded.

The flashes all originated from the highest part of the storm, between seven and nine miles high, Chronis found. This could be because gamma rays from flashes lower down encounter so much water vapour as they travel higher into the atmosphere that they become too weak to reach NASA’s Fermi Gamma-ray Burst Monitor high above them. So there may be more terrestrial gamma-ray flashes than we think.

Now Chronis would like to look at other parts of the world and to find out whether the flashes form during the growing or decaying phase of the storm. “Lightning still holds many secrets,” he said.


WREC in London

By Dave Elliott

Some continue to portray renewables as marginal, with, for example, ExxonMobil claiming that their potential is limited by ‘scalability, geographic dispersion, intermittency (in the case of solar and wind), and cost relative to other sources’, and that renewables are only likely to make up about 5% of the global energy mix by 2040: www.ft.com/cms/s/0/5a2356a4-f58e-11e3-afd3-00144feabdc0.html?siteedition=uk#axzz33albsQ2B

Most, however, see renewables as booming, with IRENA looking to 30% or more of global primary energy coming from renewables by 2030 (www.irena.org/remap). That is the sort of future, on the way to maybe near-100% renewable power by 2050, envisaged by most who attended the 13th biennial World Renewable Energy Congress, held this year at Kingston University, London, in August.

Continue reading


Balancing variable renewables

By Dave Elliott

There is now a range of books looking at the technical and policy options available for managing the use of variable energy resources such as wind and solar energy. The pioneering text in this area was Earthscan’s “Renewable Electricity and the Grid” from 2007, edited by Godfrey Boyle, with contributions from many of the UK’s top experts. But the field has since expanded, with, for example, a lot of new work being done in the US. Continue reading


RE < C: The end of a project and the stereotype of Silicon Valley

A recent article by two Google engineers, Ross Koningstein and David Fork, in IEEE Spectrum has raised quite a discussion. The article, entitled “What It Would Really Take to Reverse Climate Change”, discusses Google’s investment in the “RE<C” project, which sought to “…develop renewable energy sources that would generate electricity more cheaply than coal-fired power plants do”. The goal was to produce a gigawatt of power (presumably installed capacity). Google abandoned the project in 2011 because, according to the article, they believed it would not meet their cost goal and would also not avert significant impacts from climate change (they cite the need to keep atmospheric concentrations of CO2 below 350 ppm, as suggested by James Hansen).

I commend the two engineers for writing this article discussing their efforts and thoughts. However, I see this foray into energy as typical of the Silicon Valley mentality that is used to “solving” some technological problem quickly, selling the company or idea to a larger company, and then moving on to the next great app. Whether it is RE<C or making advanced biofuels from algae or cellulosic feedstocks, the Silicon Valley stereotype assumes the “energy problem” will be solvable just as cellular phones were, and that their “energy days” will be just another line on the CV. Unfortunately, the realities of the energy production business are harder to change than those on the energy consumption side. Most innovative companies of the last several years have emerged to use information to consume energy more smartly, because we no longer have the money and demographics to keep increasing energy consumption. This is part of the new reality.

The Google engineers don’t mention the solution that will come about but needs no technology: consuming less energy. This will be the only solution that actually reduces CO2 emissions, but it will coincide with higher energy prices and costs, not the “cheap zero-carbon energy” stated as a goal. The reason is the rebound effect, or Jevons Paradox (named after the British economist William Stanley Jevons): the cheaper energy becomes, the more the world consumes in aggregate, across all people consuming energy, even as any single device (refrigerator, car, etc.) becomes more energy efficient.
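A toy constant-elasticity demand model makes the rebound logic concrete. This is a sketch only; the 20% efficiency gain and the elasticity values are illustrative assumptions, not empirical estimates:

```python
# Toy rebound-effect (Jevons) model. Demand for energy *services* has a
# constant price elasticity e; an efficiency gain lowers the effective
# price of a service. If e > 1, total energy use rises despite the gain.

def energy_use(efficiency_gain: float, elasticity: float) -> float:
    """Total energy use relative to a baseline of 1.0."""
    price_ratio = 1.0 / (1.0 + efficiency_gain)   # effective service price falls
    services = price_ratio ** (-elasticity)       # demand for services responds
    return services / (1.0 + efficiency_gain)     # convert back to energy units

for e in (0.5, 1.0, 1.5):
    print(f"elasticity {e}: energy use = {energy_use(0.20, e):.3f} x baseline")

# elasticity 0.5 -> 0.913 (the efficiency gain saves energy, partly rebounded)
# elasticity 1.0 -> 1.000 (the rebound exactly cancels the saving)
# elasticity 1.5 -> 1.095 (Jevons: total energy use actually increases)
```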

Even divergent opinions on the limits of the planet and human endeavours agree that the effect of cheap energy is to increase total consumption compared with a world where energy is more expensive. I explain this concept via two books I use for my energy class at The University of Texas: The Bottomless Well (TBW) and Limits to Growth: The 30-Year Update (LTG). I use these two books specifically (there are other possibilities) to force students to confront widely divergent opinions on how people interpret the past use of energy when guiding (or not) future energy policy and the use of natural resources. TBW is optimistic that human ingenuity, the discovery of new technologies and increased efficiency will provide the services we crave. LTG accepts that humans are clever animals, but holds that the physical constraints of a finite planet will eventually trump even gains in efficiency (so that production and consumption cannot increase for ever), forcing a reduction in consumption and in the physical stocks we can maintain (largely people and industrial capital). TBW says it is best for the government to get out of the way of industry in improving technologies. LTG says that forward-looking policies are (really, “would have needed to have been already”) necessary to minimize environmental damage and promote the equity that will be needed after the world peaks in annual throughput (roughly GDP, but not exactly).

From the IEEE Spectrum article, I view Google as starting in the TBW camp but never quite reaching the conclusion of the LTG authors. That is to say, they no longer believe existing technology can solve the problem (they stopped their project), yet they believe the solution is some new technology we have yet to create. The Google authors state in their IEEE Spectrum article: “Our reckoning showed that reversing the trend [of increasing atmospheric CO2 concentration] would require both radical technological advances in cheap zero-carbon energy, as well as a method of extracting CO2 from the atmosphere and sequestering the carbon.” They further state: “Not only had RE<C failed to reach its goal of creating energy cheaper than coal, but that goal had not been ambitious enough to reverse climate change. That realization prompted us to reconsider the economics of energy. What’s needed, we concluded, are reliable zero-carbon energy sources so cheap that the operators of power plants and industrial facilities alike have an economic rationale for switching over soon—say, within the next 40 years.” Businesses choose the most economic solutions because those are the ones that give them the greatest chance of growing, not shrinking. If all businesses are growing, and storing, streaming and beaming more and more information in the cloud servers that Google has provided us with, then this requires more resources, not less … more emissions, not less. Cheap low-carbon energy might coincide with cheap high-carbon energy too, because if energy is really cheap enough, we might be growing enough to continue to afford fossil energy. Personally, I doubt this outcome, because the days of large growth are over. But how do we really assess how “cheap” energy is? Let’s look at a time series from the UK.

Figure 1 shows a calculation from Roger Fouquet of the cost of energy in England and the United Kingdom. What a nice piece of work! (Note: the UK is perhaps the best case for understanding long-term energy costs, as its transition to fossil fuels started in earnest in the late 1700s.) If we use England and the UK as a proxy for the modern world, Fouquet’s calculations indicate that the last decade (2000–2010) was effectively the time of the cheapest energy in the history of mankind (see Figure 1). It was cheap energy that enabled the human population to reach 7 billion. In other words, cheap energy enabled us to farm land more intensively with less human effort and produce more food, making it possible for the population to grow. Without modern farming (fertilizer inputs based on creating ammonia from the hydrogen in natural gas, liquid-fueled combustion engines in tractors, fossil-fueled transport and storage of food) we would not have 7+ billion people on the planet. It is simply too expensive, and physically impossible, to feed 7 billion people via subsistence farming. More expensive food and energy (really, food is energy) puts downward pressure on population, and that in turn eases pressure on the environment.

 

Figure 1. The cost of energy for energy services as a percentage of England and United Kingdom gross domestic product [Data courtesy of Roger Fouquet].

 

At the end of the article, the Google team states: “We’re not trying to predict the winning technology here, but its cost needs to be vastly lower than that of fossil energy systems.” There are two mathematical ways for competing technologies to become vastly lower in cost than fossil energy systems. Either the new technologies become cheaper while fossil energy stays roughly constant (or becomes cheaper more slowly), or fossil energy becomes more expensive while the competing technologies get cheaper, stay the same cost, or increase in cost more slowly. The real curb on resource consumption and CO2 emissions will show up in aggregate energy costs, per Figure 1. If energy spending as a fraction of GDP increases, it indicates we are reaching diminishing returns to consumption and that our responses (e.g. research, new energy resource extraction) are inadequate to keep increasing consumption. That is the interpretation if we are trying to make energy cheaper and cheaper (most people and governments want this). However, it is theoretically possible to choose on purpose (e.g. by policy) to increase energy spending as a fraction of GDP. Putting a price or tax on CO2 emissions is an example policy. (Note: internalizing the cost of CO2 emissions makes fossil fuel consumption more expensive but does not make renewable energy cheaper.)
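A minimal sketch of those two routes, with purely illustrative annual rates (none of these numbers come from the article):

```python
# Two routes to a renewables/fossil cost ratio "vastly lower" than today:
# renewables get cheaper, fossil gets dearer, or both at once.

def cost_ratio(years: int, renewable_change: float, fossil_change: float) -> float:
    """Renewable-to-fossil cost ratio after `years`, starting from parity (1.0),
    given constant annual fractional cost changes for each technology."""
    return ((1 + renewable_change) / (1 + fossil_change)) ** years

# Route 1: renewables fall 5%/yr, fossil cost flat.
print(f"Route 1 after 20 yr: {cost_ratio(20, -0.05, 0.00):.2f}")  # ~0.36
# Route 2: renewables flat, fossil rises 3%/yr (e.g. a carbon price ramping up).
print(f"Route 2 after 20 yr: {cost_ratio(20, 0.00, 0.03):.2f}")   # ~0.55
# Both at once.
print(f"Combined after 20 yr: {cost_ratio(20, -0.05, 0.03):.2f}") # ~0.20
```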

Energy has practically never been cheaper than during the time Google has existed as a company. If energy is already this cheap, how can we say it is not cheap enough to invest in technologies to mitigate fossil-fuel impacts (carbon capture from coal-fired power plants, and even capture of CO2 from the air)? The common statement is that we need (low-carbon) energy to be cheaper to mitigate climate change. This is tantamount to waiting for Godot to arrive. It’s as if we’re saying: “we’re so smart, but if only we were a little smarter, we’d have the cheap unobtainium we’ve been hoping for, so that we can do as many things as we want with no environmental impact.” Unfortunately, all elements in the periodic table have mass and obey the laws of physics, not our social laws of economics. There are fundamental energetic reasons (low energy return on energy invested) why we have yet to be able to “policy-induce” cellulosic liquid biofuels into existence.

The climate solution that Google could not find is not made of more of some new kind of widget; it is made of less of all past and future widgets. We got into the climate predicament through millions of incremental advancements, and perhaps we will only reduce emissions rates in the same way. In terms of practically playing in the energy space, Google ventured into energy management as a hybrid solution to “solving” the energy problem by buying Nest (thermostats that learn your habits and program your home climate control) earlier this year. This should pay dividends for all, with the trade-off of going further into an Orwellian future of increased mass information on citizens’ activities. It is unclear whether these types of technologies will help Google (and the rest of us) decrease environmental impacts, increase use of low-carbon energy, or decrease greenhouse gas emission rates. But we can be sure that Google owning Nest follows its existing business model of gathering more information to continue selling targeted ads based on your habits.
