How oil spills kill fish: new study points to cardiac arrest; possible implications for humans

Oil spills kill fish. That’s well known. Now scientists say they have found out why oil spills kill adult fish. The chemicals in the oil often trigger an irregular heartbeat and cardiac arrest.

A joint study by Stanford University and the US National Oceanic and Atmospheric Administration (NOAA) has found that crude oil interferes with fish heart cells. The toxic consequence is a slowed heart rate, reduced cardiac contractility and irregular heartbeats that can lead to cardiac arrest and sudden cardiac death.

The study was published Feb. 14, 2014 in the prestigious international journal Science and unveiled at the convention of the American Association for the Advancement of Science in Chicago.

The study is part of the ongoing Natural Resource Damage Assessment of the April 2010 Deepwater Horizon oil spill in the Gulf of Mexico.

Scientists have known for some time that crude oil is “cardiotoxic” to developing fish. Until now, the mechanisms underlying the harmful effects were unclear.

Exxon Valdez

Studies going back to the Exxon Valdez oil spill in Alaska in 1989 have shown that exposure to crude oil-derived chemicals disrupts cardiac function and impairs development in larval fishes. The studies describe a syndrome of embryonic heart failure, bradycardia (slow heartbeat), arrhythmias (irregular heartbeats) and edema in exposed fish embryos.

In the aftermath of the Deepwater Horizon spill, the two science teams began studying young fish in the Gulf of Mexico to find out how oil specifically impacts heart cells.

Crude oil is a complex mixture of chemicals, some of which are known to be toxic to marine animals.

Past research focused on “polycyclic aromatic hydrocarbons” (PAHs), which can also be found in coal tar, creosote, air pollution and stormwater runoff from land. In the aftermath of an oil spill, the studies show PAHs can persist for many years in marine habitats and cause a variety of adverse environmental effects.

The scientists found that oil interferes with cardiac cell excitability, contraction and relaxation – vital processes for normal beat-to-beat contraction and pacing of the heart.

Low concentrations of crude

The study shows that very low concentrations of crude oil disrupt the specialized ion channel pores – through which ions flow in and out of the heart cells – that control heart rate and contraction in the cardiac muscle cell. This cyclical signalling pathway in cells throughout the heart is what propels blood out of the pump on every beat. The protein components of the signalling pathway are highly conserved in the hearts of most animals, including humans.

The researchers found that oil blocks the potassium channels distributed in heart cell membranes, increasing the time to restart the heart on every beat. This prolongs the normal cardiac action potential, and ultimately slows the heartbeat. The potassium ion channel impacted in the tuna is responsible for restarting the heart muscle cell contraction cycle after every beat, and is highly conserved throughout vertebrates, raising the possibility that animals as diverse as tuna, turtles and dolphins might be affected similarly by crude oil exposure. Oil also resulted in arrhythmias in some ventricular cells.

“The ability of a heart cell to beat depends on its capacity to move essential ions like potassium and calcium into and out of the cells quickly,” said Barbara Block, a professor of marine sciences at Stanford. “We have discovered that crude oil interferes with this vital signalling process essential for our heart cells to function properly.”

Nat Scholz, leader of the Ecotoxicology Program at NOAA’s Northwest Fisheries Science Center in Seattle, said: “We’ve known from NOAA research over the past two decades that crude oil is toxic to the developing hearts of fish embryos and larvae, but haven’t understood precisely why.”

Long term problems in fish hearts

He added: “These new findings more clearly define petroleum-derived chemical threats to fish and other species in coastal and ocean habitats, with implications that extend beyond oil spills to other sources of pollution such as land-based urban stormwater runoff.”

The new study also calls attention to a previously underappreciated risk to wildlife and humans, particularly from exposure to cardioactive PAHs, which are also present where air pollution is high.

“When we see these kinds of acute effects at the cardiac cell level,” Block said, “it is not surprising that chronic exposure to oil from spills such as the Deepwater Horizon can lead to long-term problems in fish hearts.”

The study used captive populations of bluefin and yellowfin tuna at the Tuna Research and Conservation Center, a collaborative facility operated by Stanford and the Monterey Bay Aquarium. That meant the research team was able to directly observe the effects of crude oil samples collected from the Gulf of Mexico on living fish heart cells.

“The protein ion channels we observe in the tuna heart cells are similar to what we would find in any vertebrate heart and provide evidence as to how petroleum products may be negatively impacting cardiac function in a wide variety of animals,” she said. “This raises the possibility that exposure to environmental PAHs in many animals – including humans – could lead to cardiac arrhythmias and bradycardia, or slowing of the heart.”

Tuna spawning

The Deepwater Horizon disaster released over 4 million barrels of crude oil during the peak spawning time for the Atlantic bluefin tuna in the spring of 2010. Electronic tagging and fisheries catch data indicate that Atlantic bluefin spawn in the area where the Deepwater Horizon drilling rig exploded and sank, raising the possibility that eggs and larvae, which float near the surface waters, were exposed to oil.

Bluefin tuna
An Atlantic bluefin tuna (© Gilbert Van Ryckevorsel/TAG A Giant/Courtesy Stanford University)

The spill occurred in the major spawning ground of the western Atlantic population of bluefin tuna in the Gulf of Mexico. The most recent stock assessment, conducted in 2012, estimated the spawning population of the bluefin tuna to be at only 36 percent of the 1970 baseline population. Many other pelagic fishes were also likely to have spawned in oiled habitats, including yellowfin tuna, blue marlin and swordfish.

Block and her team bathed isolated cardiac cells from the tuna in low-dose crude oil concentrations similar to what early-life-stage fish may have encountered in the surface waters where they were spawned after the April 2010 oil spill in the Gulf of Mexico.

They then recorded how ions flowed into and out of the heart cells to identify the specific proteins in the excitation-contraction pathway that were affected by crude oil chemical components.

Fabien Brette, a research associate in Block’s lab and lead author on the study, said the scientists looked at the function of healthy heart cells in a laboratory dish and then used a microscope to measure how the cells responded when crude oil was introduced.

“The normal sequence and synchronous contraction of the heart requires rapid activation in a coordinated way of the heart cells,” Block said. “Like detectives, we dissected this process using laboratory physiological techniques to ask where oil was impacting this vital mechanism.”

Related: Oil spill caused “unexpected lethal impact” on herring, study shows

Methane leaks from natural gas industry 50 per cent higher than EPA estimates, study says

EPA Gas Leakage
EPA Greenhouse Gas Inventory leakage estimates (top) and results from recent experimental studies (below). The studies either focus on specific industry segments, or use broad atmospheric data to estimate emissions from multiple segments or the entire industry. They have generally found either higher emissions than EPA inventory methods predict, or mixed results (some sources higher and others lower).
( Stanford University School of Earth Sciences)

A new study indicates that US atmospheric emissions of methane, a potent greenhouse gas leaking in large quantities from the natural gas industry, are likely 50 per cent higher than previously estimated by the US Environmental Protection Agency.

The study, “Methane Leakage from North American Natural Gas Systems,” published in the Feb. 14 issue of the international journal Science, synthesizes findings from more than 200 studies ranging in scope from local gas processing plants to total emissions from the United States and Canada.

The scientists say this first thorough comparison of evidence for natural gas system leaks confirms that organizations including the EPA have underestimated U.S. methane emissions generally, as well as those from the natural gas industry specifically.

Natural gas consists predominantly of methane. Even small leaks from the natural gas system are important because methane is a potent greenhouse gas – about 30 times more potent than carbon dioxide.
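The potency comparison translates into simple CO2-equivalent arithmetic, sketched below. The multiplier is the rough 100-year figure cited above; actual GWP values vary by time horizon and assessment report.

```python
# CO2-equivalent arithmetic for a methane leak, using the rough
# 100-year global warming potential (GWP) cited in the article.
# GWP values vary by time horizon and assessment report.
GWP_CH4_100YR = 30

def co2_equivalent(methane_tonnes: float, gwp: float = GWP_CH4_100YR) -> float:
    """CO2-equivalent mass (tonnes) of a given methane emission."""
    return methane_tonnes * gwp

# A 1,000-tonne methane leak counts like 30,000 tonnes of CO2
# in a 100-year greenhouse gas inventory.
print(co2_equivalent(1000))
```

This is why even leak rates of a few per cent can erase much of the combustion advantage natural gas holds over other fossil fuels.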

“People who go out and actually measure methane pretty consistently find more emissions than we expect,” said the lead author of the new analysis, Adam Brandt, an assistant professor of energy resources engineering at Stanford University. “Atmospheric tests covering the entire country indicate emissions around 50 per cent more than EPA estimates,” said Brandt. “And that’s a moderate estimate.”

The standard approach to estimating total methane emissions is to multiply the amount of methane thought to be emitted by a particular kind of source, such as leaks at natural gas processing plants or belching cattle, by the number of that source type in a region or country. The products are then totalled to estimate all emissions.
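The bottom-up accounting just described is an emission factor multiplied by an activity count, summed over source types. A minimal sketch, with made-up factors and counts for illustration only (these are not EPA values):

```python
# Bottom-up inventory: emissions = sum over source types of
# (per-source emission factor) x (number of sources).
# All numbers below are hypothetical, for illustration only.
emission_factors = {          # tonnes CH4 per source per year
    "gas_processing_plant": 25.0,
    "gas_well": 1.2,
    "cattle": 0.1,
}
source_counts = {
    "gas_processing_plant": 600,
    "gas_well": 500_000,
    "cattle": 90_000_000,
}

total = sum(emission_factors[s] * source_counts[s] for s in emission_factors)
print(f"Estimated CH4 emissions: {total:,.0f} tonnes/yr")
```

If any per-source factor is set too low, as the article argues the EPA’s 1990s-era rates are, every total built on it inherits the bias.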

The national natural gas infrastructure produces a combination of intentional releases, often for safety purposes, and unintentional emissions from faulty valves and cracks in pipelines. In the United States, the emission rates of particular gas industry components – from wells to burner tips – were established by the EPA in the 1990s.

Since then, many studies have tested gas industry components to determine whether the EPA’s emission rates are accurate, and a majority of these have found the EPA’s rates too low. The new analysis does not try to attribute percentages of the excess emissions to natural gas, oil, coal, agriculture, landfills, etc., because emission rates for most sources are so uncertain.

Several other studies have used airplanes and towers to measure actual methane in the air, to test total estimated emissions. The new analysis, which is authored by researchers from seven universities, several national laboratories and US federal government bodies, and other organizations, found these atmospheric studies covering very large areas consistently indicate that total U.S. methane emissions are about 25 to 75 per cent higher than the EPA estimate.

Some of the difference is accounted for by the EPA’s focus on emissions caused by human activity. The EPA excludes natural methane sources like geologic seeps and wetlands, which atmospheric samples unavoidably include. The EPA likewise does not include some emissions caused by human activity, such as abandoned oil and gas wells, because the amounts of associated methane are unknown.

The new analysis finds that some recent studies showing very high methane emissions in regions with considerable natural gas infrastructure are not representative of the entire gas system. “If these studies were representative of even 25 percent of the natural gas industry, then that would account for almost all the excess methane noted in continental-scale studies,” said a co-author of the study, Eric Kort, an atmospheric science professor at the University of Michigan. “Observations have shown this to be unlikely.”

Methane air sampling systems
Top-down methods take air samples from aircraft or tall towers to measure gas concentrations remote from sources. Bottom-up methods take measurements directly at facilities. Top-down methods provide a more complete and unbiased assessment of emissions sources, and can detect emissions over broad areas. However, they lack specificity and face difficulty in assigning emissions to particular sources. Bottom-up methods provide direct, precise measurement of gas emissions rates. However, the high cost of sampling and the need for site access permission leads to small sample sizes and possible sampling bias.
(Stanford University School of Earth Sciences)

Natural gas as a replacement fuel

The scientists say that even though the gas system is almost certainly leakier than previously thought, generating electricity by burning gas rather than coal still reduces the total greenhouse effect over 100 years. Not only does burning coal release an enormous amount of carbon dioxide, but mining it also releases methane.

Perhaps surprisingly though, the analysis finds that powering trucks and buses with natural gas instead of diesel fuel probably makes the globe warmer, because diesel engines are relatively clean. For natural gas to beat diesel, the gas industry would have to be less leaky than the EPA’s current estimate, a scenario the new analysis finds quite improbable.

“Fueling trucks and buses with natural gas may help local air quality and reduce oil imports, but it is not likely to reduce greenhouse gas emissions. Even running passenger cars on natural gas instead of gasoline is probably on the borderline in terms of climate,” Brandt said.
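The diesel comparison comes down to a breakeven leak rate. A back-of-the-envelope sketch, using rough, illustrative round numbers (not figures from the study), shows the shape of the calculation; real analyses also account for engine-efficiency differences, which push the breakeven lower:

```python
# Back-of-the-envelope breakeven leak rate for natural gas vs diesel.
# All constants are rough, illustrative values, and the sketch ignores
# engine-efficiency differences, which in practice lower the breakeven.
CO2_PER_GJ_DIESEL = 74.0  # kg CO2 per GJ of diesel burned (approx.)
CO2_PER_GJ_GAS = 56.0     # kg CO2 per GJ of natural gas burned (approx.)
CH4_KG_PER_GJ = 18.0      # kg of methane contained in one GJ of gas (approx.)
GWP_CH4 = 30.0            # 100-year warming potential of methane

# Gas loses its combustion advantage once the CO2-equivalent of leaked
# methane cancels the diesel/gas difference:
#   GWP_CH4 * CH4_KG_PER_GJ * f / (1 - f) = DIESEL - GAS
ratio = (CO2_PER_GJ_DIESEL - CO2_PER_GJ_GAS) / (GWP_CH4 * CH4_KG_PER_GJ)
breakeven = ratio / (1 + ratio)
print(f"Breakeven leak rate: about {breakeven:.1%}")
```

The point of the sketch is the sensitivity: with methane thirty times as potent as CO2, a leak rate of a few per cent is enough to wipe out the fuel-switching benefit.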

The natural gas industry, the analysis finds, must clean up its leaks to really deliver on its promise of less harm. Fortunately for gas companies, a few leaks in the gas system probably account for much of the problem and could be repaired. One earlier study examined about 75,000 components at processing plants. It found some 1,600 unintentional leaks, but just 50 faulty components were behind 60 percent of the leaked gas.
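The skew in that earlier survey, with 50 components out of roughly 1,600 leaks driving 60 percent of the lost gas, is a classic heavy-tailed pattern. A sketch of how such a ranking is computed from survey data (the leak rates below are synthetic, not the survey's measurements):

```python
import random

# Rank measured leaks by size and count how few account for most of
# the total - the pattern behind "50 components caused 60 percent of
# the leaked gas". The leak rates here are synthetic, drawn from a
# heavy-tailed distribution: most leaks are tiny, a handful are huge.
random.seed(1)
leak_rates = sorted((random.paretovariate(1.2) for _ in range(1600)),
                    reverse=True)

total = sum(leak_rates)
cumulative, n_top = 0.0, 0
while cumulative < 0.6 * total:
    cumulative += leak_rates[n_top]
    n_top += 1
print(f"{n_top} of {len(leak_rates)} leaks carry 60% of the total gas")
```

When emissions concentrate in a small number of "super-emitters" like this, finding and fixing the worst offenders delivers most of the reduction.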

“Reducing easily avoidable methane leaks from the natural gas system is important for domestic energy security,” said Robert Harriss, a methane researcher at the Environmental Defense Fund and a co-author of the analysis. “As Americans, none of us should be content to stand idly by and let this important resource be wasted through fugitive emissions and unnecessary venting.”

Gas companies not cooperating

One possible reason leaks in the gas industry have been underestimated is that emission rates for wells and processing plants were based on operators participating voluntarily. One EPA study asked 30 gas companies to cooperate, but only six allowed the EPA on site.

“It’s impossible to take direct measurements of emissions from sources without site access,” said Garvin Heath, a senior scientist with the National Renewable Energy Laboratory and a co-author of the new analysis. “But self-selection bias may be contributing to why inventories suggest emission levels that are systematically lower than what we sense in the atmosphere.”

The research was funded by the nonprofit organization Novim through a grant from the Cynthia and George Mitchell Foundation. “We asked Novim to examine 20 years of methane studies to explain the wide variation in existing estimates,” said Marilu Hastings, sustainability program director at the Cynthia and George Mitchell Foundation. “Hopefully this will help resolve the ongoing methane debate.”

Other co-authors of the Science study are Francis O’Sullivan of the MIT Energy Initiative; Gabrielle Pétron of the National Oceanic and Atmospheric Administration (NOAA) and the University of Colorado; Sarah M. Jordaan of the University of Calgary; Pieter Tans, NOAA; Jennifer Wilcox, Stanford; Avi Gopstein of the U.S. Department of State; Doug Arent of the National Renewable Energy Laboratory and the Joint Institute for Strategic Energy Analysis; Steven Wofsy of Harvard University; Nancy Brown of the Lawrence Berkeley National Laboratory; independent consultant Richard Bradley; and Galen Stucky and Douglas Eardley, both of the University of California-Santa Barbara. The views expressed in the study are those of the authors, and do not necessarily reflect those of the U.S. Department of State or the U.S. government.