With climate change, the oceans are becoming more acidic, and that is a threat to the Dungeness crab, according to a study by the US National Oceanic and Atmospheric Administration.
The study says ocean acidification expected to accompany climate change may slow development and reduce survival of the larval stages of Dungeness crab.
The Dungeness crab is a key component of the Northwest marine ecosystem and vital to fishery revenue from Oregon to Alaska.
The research by NOAA Fisheries’ Northwest Fisheries Science Center in Seattle indicates that the declining pH anticipated in Puget Sound could jeopardize populations of Dungeness crab and put the fishery at risk. The study was recently published in the journal Marine Biology.
Ocean acidification occurs as the ocean absorbs carbon dioxide from the combustion of fossil fuels. Average ocean surface pH is expected to drop to about 7.8 off the West Coast by 2050, and could drop further during coastal upwelling periods.
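Because pH is a logarithmic scale, a seemingly small drop translates into a large change in acidity. A minimal sketch of the arithmetic, assuming a present-day average surface pH of about 8.1 (a figure not stated in the article itself):

```python
# pH is the negative base-10 logarithm of hydrogen ion concentration,
# so each drop in pH multiplies the acidity.
def relative_acidity(ph_now: float, ph_future: float) -> float:
    """Return how many times more acidic the future water would be."""
    return 10 ** (ph_now - ph_future)

# Assumed present-day surface pH of ~8.1 vs. the projected 7.8:
print(round(relative_acidity(8.1, 7.8), 2))  # roughly a doubling (~2.0)
```

In other words, the projected drop of about 0.3 pH units would roughly double the hydrogen ion concentration of surface waters.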
Dungeness crab is the highest revenue fishery in Washington and Oregon, and the second most valuable in California, although the fishery was recently closed in some areas because of a harmful algal bloom. The Dungeness crab harvest in 2014 was worth more than $80 million in Washington, $48 million in Oregon and nearly $67 million in California.
“I have great faith in the resiliency of nature, but I am concerned,” said Jason Miller, lead author of the research, which was part of his dissertation. “Crab larvae in our research were three times more likely to die when exposed to a pH that can already be found in Puget Sound, our own back yard, today.”
Scientists collected eggs from Dungeness crabs in Puget Sound and placed them in tanks at the NWFSC’s Montlake Research Laboratory. The tanks held seawater with a range of pH levels reflecting current conditions as well as the lower pH occasionally encountered in Puget Sound when deep water wells up near the surface. Larvae also went into tanks with the even lower-pH conditions expected with ocean acidification.
“The question was whether the lower pH we can expect to see in Puget Sound interferes with development of the next generation of Dungeness crab,” said Paul McElhany, a NOAA Fisheries research scientist and senior author of the paper. “Clearly the answer is yes. Now the question is, how does that play out in terms of affecting their life cycle and populations overall?”
Larvae hatched at the same rate regardless of pH, but those at lower pH took longer to hatch and progressed through their larval stages more slowly. Scientists suggested that the lower pH may reduce the metabolic rate of embryos. That could extend their vulnerable larval period, or could jeopardize the timing of their development in relation to key food sources, researchers suggested.
Larval survival also dropped by more than half at lower pH. At pH 8.0, roughly equivalent to seawater today, 58 percent of the crab larvae – called zoeae – survived for 45 days. At pH 7.5, which sometimes occurs in Puget Sound now, survival was 14 percent. At pH 7.1, which is expected to roughly approximate the pH of water upwelling on the West Coast with ocean acidification, zoeae survival remained low at 21 percent.
“Areas of greatest vulnerability will likely be where deep waters, naturally low in pH, meet acidified surface waters,” such as areas of coastal upwelling along the West Coast and in estuary environments such as Hood Canal, the new study predicts.
Scientists have identified a new species of a strange marine mammal group that lived on the Pacific Coast between 33 million years ago and 10 million years ago. The new specimens — from at least four individuals — were recovered from Unalaska, in the Aleutians.
The desmostylians, unlike marine mammal species alive today such as whales, seals and sea cows, are extinct. The researchers call them “desmos” for short. Unlike whales and seals, but like manatees, desmos were vegetarians.
The desmos are found from Baja California, up along the west coast of North America, around the Alaska Peninsula and the storm-battered Aleutian Islands, to Russia’s Kamchatka Peninsula and Sakhalin Island, and on to the Japanese islands.
Their strange columnar teeth and odd style of eating don’t occur in any other mammal. They rooted around coastlines, ripping up vegetation, such as marine algae, sea grass and other near-shore plants.
The new species, 23 million years old, was a big, hippo-sized animal with a long snout and tusks. It has a unique tooth and jaw structure indicating it was not only a vegetarian, but literally sucked vegetation from shorelines like a vacuum cleaner, said vertebrate paleontologist and study co-author Louis L. Jacobs of Southern Methodist University, Dallas.
They probably swam like polar bears, using their strong front limbs to power along. On land, they would have had the lumbering gait of a sloth.
Desmos were large, stocky-limbed mammals, but their modern relatives remain a mystery. Scientists have previously linked the animals, tentatively, to manatees, horses or elephants. Adult desmostylians were large enough to be relatively safe from predators.
The identification of a new species belonging to Desmostylia has deepened the mystery of the rare group’s brief journey through prehistoric time, according to the new study.
The newly discovered creatures lived in what is now Unalaska’s Dutch Harbor.
“The new animal — when compared to one of a different species from Japan — made us realize that desmos do not chew like any other animal,” said Jacobs, a professor of earth sciences. “They clench their teeth, root up plants and suck them in.”
To eat, the animals buttressed their lower jaw with their teeth against the upper jaw, and used the powerful muscles that attached there, along with the shape of the roof of their mouth, to suction-feed vegetation from coastal bottoms. Big muscles in the neck would help to power their tusks, and big muscles in the throat would help with suction.
“No other mammal eats like that,” Jacobs said. “The enamel rings on the teeth show wear and polish, but they don’t reveal consistent patterns related to habitual chewing motions.”
The new specimens also represent a new genus — meaning desmostylians in the same family diverged from one another in key physical characteristics, particularly the tooth and jaw structure, said Jacobs, who is one of 10 scientists collaborating on the research.
Discovery of a new genus and species indicates the desmostylian group was larger and more diverse than previously known, said paleontologist and co-author Anthony Fiorillo, vice president of research and collections and chief curator at the Perot Museum of Nature and Science, Dallas, and an adjunct research professor at SMU.
“Our new study shows that though this group of strange and extinct mammals was short-lived, it was a successful group with greater biodiversity than had been previously realized,” Fiorillo said.
Compared to other mammals, desmos were latecomers and didn’t appear on earth until fairly recently — 33 million years ago. Also unusual for mammals, they survived a mere 23 million years, dying out 10 million years ago.
The research was funded by the Perot Museum of Nature and Science, U.S. National Park Service – Alaska Region Office, and SMU’s Institute for the Study of Earth and Man.
The newest desmo made its home on Unalaska Island, the farthest north of any occurrence of the group, which only lived along the shores of the North Pacific.
“That’s the only place they’re known in the world,” Jacobs said. The Unalaska fossils represent at least four individuals, and one is a baby.
“The baby tells us they had a breeding population up there,” Jacobs said. “They must have stayed in sheltered areas to protect the young from surf and currents.”
In addition, “the baby also tells us that this area along the Alaska coast was biologically productive enough to make it a good place for raising a family,” said Fiorillo.
Just as cattle assemble in a herd, and a group of fish is a school, multiple desmostylians constitute a “troll” — a designation selected by Jacobs to honor Alaskan Ray Troll, the artist who has depicted desmos most.
The first Unalaska fossils were discovered in the 1950s in a rock quarry during U.S. Geological Survey mapping.
Others found more recently were on display at the Ounalashka Corporation headquarters. Those specimens were offered to Fiorillo and Jacobs for study after Fiorillo gave a public presentation to the community on his work in Alaska.
“The fruits of that lecture were that it started the networking with the community, which in turn led us to a small, but very important collection of fossils that had been unearthed in the town when they built a school a few years earlier,” Fiorillo said. “The fossils were shipped to the Perot Museum of Nature and Science for preparation in our lab and those fossils are the basis for our work now.”
From there, the researchers discovered that the fossils were a new genus and species.
The authors report their discoveries in a special volume of the international paleobiology journal Historical Biology. The article was published online Oct. 1 at http://bit.ly/1PQAHZJ
The researchers named the new mammal Ounalashkastylus tomidai. “Ounalashka” means “near the peninsula” in the Aleut language of the indigenous people of the Aleutian Islands.
“Stylus” is from the Latin for “column” and refers to the shape of cusps in the teeth.
“Tomida” honors distinguished Japanese vertebrate paleontologist Yukimitsu Tomida.
The article appears in a special volume of Historical Biology to honor the career accomplishments of Tomida upon his retirement from the Department of Geology and Paleontology in Tokyo’s National Museum of Nature and Science.
In addition to Jacobs, Fiorillo and Polcyn, other authors were Yosuke Nishida, SMU; Yuri Kimura, Smithsonian Institution and the Tokyo Museum; Kentaro Chiba, University of Toronto; Yoshitsugu Kobayashi, Hokkaido University Museum; Naoki Kohno, National Museum of Nature and Science; and Kohei Tanaka, University of Calgary.
The Historical Biology article is titled “A new desmostylian mammal from Unalaska (USA) and the robust Sanjussen jaw from Hokkaido (Japan), with comments on feeding in derived desmostylids” and appears in the special issue “Contributions to vertebrate palaeontology in honour of Yukimitsu Tomida.”
A “devastating megathrust earthquake” could hit Haida Gwaii sometime in the future, according to Canadian and US studies carried out after the magnitude 7.8 earthquake off Haida Gwaii on Oct. 27, 2012 and the 7.5 magnitude quake off Craig, Alaska, a few weeks later on Jan. 5, 2013.
The 2004 Indian Ocean earthquake and the 2011 Tōhoku earthquake in Japan, both accompanied by major tsunamis, are recent examples of “great” (higher than magnitude 8.0) megathrust earthquakes. Most of the concern on the west coast has focused on the likelihood of a megathrust earthquake on the Cascadia Fault on the Juan de Fuca plate, which stretches from northern California to the middle of Vancouver Island.
The 2012 Haida Gwaii main shock was the second largest seismic event in Canada since the establishment of a modern seismograph network. The first was the 1949 Haida Gwaii/Queen Charlotte earthquake, with a magnitude of 8.1. That 1949 Haida Gwaii earthquake was a strike-slip event, where the plates move side-to-side, similar to the 1906 San Francisco earthquake and other quakes on the San Andreas Fault in California.
The 2012 Haida Gwaii earthquake is characterized in the studies as a “mini-megathrust” event, where part of the crust is pushed upward, meaning that a larger megathrust could have much more destructive consequences from both the earthquake and a possible tsunami.
Complex system of faults
The new studies show that the Pacific and North America plate boundary off the coast of British Columbia and southeastern Alaska creates a system of faults capable of producing very large earthquakes. The scientists conclude that while the two earthquakes in 2012 and 2013 released strain built up over years on the tectonic plates, those events did not release strain along the Queen Charlotte Fault off the west coast of Haida Gwaii. That means the fault remains the likely source of a future large earthquake.
A special issue of the Bulletin of the Seismological Society of America (BSSA), released Monday, April 6, 2015, contains 19 scientific and technical papers, outlining the results of the work carried out over the past two years.
The team estimated the rupture dimension of the 2012 Haida Gwaii earthquake to be about 120 kilometres long at a depth of about 30 kilometres.
The Craig earthquake ruptured the Queen Charlotte fault over a distance of more than 100 kilometres and at a depth of about 20 kilometres.
The two areas are joined in what is called the Queen Charlotte Fairweather Fault System. To the south the Queen Charlotte Fault also interacts with the Juan de Fuca plate that stretches from Vancouver Island to northern California.
“The study of these two quakes revealed rich details about the interaction between the Pacific and North America Plates, advancing our understanding of the seismic hazard for the region,” said Thomas James, research scientist at Geological Survey of Canada.
Two faults off Haida Gwaii
The studies conclude that the interaction between the plates off Haida Gwaii is much more complex than previously believed. Before the 2012 earthquake, the Queen Charlotte Fault, a strike-slip fault similar to the San Andreas Fault in California, was believed to be the dominating tectonic structure in the area. The 2012 tremor confirmed the existence of a previously suspected thrust fault beneath what is called the “Queen Charlotte Terrace,” to the west of the Queen Charlotte Fault, where the Pacific plate is sliding at a low angle below the North American plate.
The Queen Charlotte Terrace, which is about a kilometre below the surface of the ocean, is built up of layers of sediment, several kilometres thick, scraped off the oceanic plate as it subducts under the North American plate. It may also include some fragments of oceanic crust. For most of the terrace, it is “present as a clearly defined linear feature,” but the study adds: “north of about 53.5° N, a complex pattern of ridges and valleys appears.”
The earthquake was “essentially a mini-megathrust earthquake along the dipping plate interface of a subduction system,” one of the scientific papers says. The epicenter of the Haida Gwaii main shock was located about five kilometres landward (northeast) of the Queen Charlotte Fault. That probably means that the rupture was near the bottom of the locked plates, where the plate motion’s side to side movement is also thrusting downward. Significant aftershocks appeared to cluster on the periphery of the main rupture zone with most of the aftershocks occurring seaward to the west.
The scientists used GPS observations of crustal motion to locate the earthquake’s rupture offshore to the west of Haida Gwaii.
The situation off Haida Gwaii is complex because while the Pacific plate is converging with the North American plate at a rate of 15 to 20 millimetres a year, at the same time the two plates are slipping by each other toward the north northwest at an angle of about 20 degrees at a rate of about 50 millimetres a year.
Honn Kao, a seismologist with the Geological Survey of Canada, said, “This was an event on the thrust interface of the plate boundary system, confirming that there is a subduction system in the Haida Gwaii area.
“The implication of a confirmed subduction zone is that in addition to the Queen Charlotte Fault, we now have another source which can produce devastating megathrust earthquakes in the area,” said Kao.
The study of the Haida Gwaii tremor looked at the causative faults, the rupture processes and depths of the main shock and sequence of strong aftershocks.
The Haida Gwaii earthquake generated a significant tsunami that left deposits indicating run-up exceeding 3 metres (maximum 13 metres) in a number of bays and inlets along about 230 kilometres of the west coast of Haida Gwaii. In Hawaii, a 0.8 metre wave was measured on a tide gauge.
In Queen Charlotte City, perceptible shaking lasted for one and a half to two minutes, with very strong shaking for about 30 seconds. The earthquake was felt as far away as Yukon Territory, Alberta, and Montana.
The study says “Damage was limited, in part owing to the sparse population, but also because of the seismic resistance of the generally low rise, wood-frame buildings on the islands. Felt intensities were at expected values close to the source zone, but regional intensities were smaller than predicted.”
The Haida Gwaii rupture also shook southeastern Alaska. The northwest direction of ground motion then may have influenced the timing of the Craig earthquake a few weeks later in January 2013. That earthquake occurred farther north in southeast Alaska, where relative plate motion is nearly parallel to the Queen Charlotte fault.
The Haida Gwaii aftershocks clustered around the periphery of the rupture zone, both on the seaward and landward side of the plate boundary, and reflected what the study calls “normal faulting behavior, caused by the bending, extending or stretching of rock, rather than the thrust faulting of the main shock.” The pattern of aftershocks is similar to those observed after the 2011 Japanese megathrust earthquake.
“Our observations of normal faulting imply that the main shock of the Haida Gwaii earthquake dramatically altered the stress field in the rupture zone, especially in a neighboring region,” Kao said.
The distribution of aftershocks occurred to the north of a previously identified seismic gap where large earthquakes have not occurred in historic times. The gap is located to the south of where the 1949 magnitude 8.1 Queen Charlotte earthquake ruptured.
Though the Haida Gwaii earthquake may have activated some part of the Queen Charlotte Fault, Kao said, it was limited and did not relieve stress along the seismic gap.
The study concludes:
The Haida Gwaii event confirmed substantial seismic and tsunami hazard from large thrust events on the plate margin along the southern Queen Charlotte fault. It occurred where relatively young oceanic lithosphere underthrusts North America, and in some ways is an analog for the much larger megathrust earthquakes known to occur on the Cascadia subduction zone to the south, where the young Juan de Fuca plate and other small plates subduct beneath North America. The Haida Gwaii earthquake had a complex pattern of main shock rupture and aftershocks and a large tsunami.
Further study needed
The Geological Survey of Canada plans further studies to understand the formations off Haida Gwaii.
One question is whether there are any records of major earthquake events in the past history of Haida Gwaii. The study notes that the impact of the tsunami was relatively minor “in this region with steep rocky coastlines.” That means there are limited sources of coastal sediments that can be checked for past events. It adds: “Low-elevation lakes, ponds, and bogs may offer the best opportunities for paleotsunami studies,” warning that large earthquakes in the past that produced tsunamis may have left little evidence in the “paleoseismic record of Haida Gwaii and similar settings worldwide.”
Megathrust earthquakes occur at subduction zones at destructive plate boundaries where one tectonic plate is subducted (forced underneath) by another. These interplate earthquakes are the planet’s most powerful, with moment magnitudes that can exceed 9.0. Since 1900, all earthquakes of magnitude 9.0 or greater have been megathrust earthquakes. During the rupture, one side of the fault is pushed upwards relative to the other, and it is this type of movement that is known as thrust. The displacement of the ocean in a thrust can trigger a tsunami.
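The moment magnitude scale is logarithmic in energy: each whole unit of magnitude corresponds to roughly a 32-fold increase in released energy. A rough sketch of why a magnitude 9.0 megathrust dwarfs the 2012 magnitude 7.8 event (the scaling relation is the standard one; the specific comparison is an illustration, not a figure from the studies):

```python
def energy_ratio(m_large: float, m_small: float) -> float:
    """Approximate ratio of seismic energy released between two quakes,
    using the standard relation: energy scales as 10**(1.5 * magnitude)."""
    return 10 ** (1.5 * (m_large - m_small))

# A hypothetical magnitude 9.0 megathrust vs. the 2012 magnitude 7.8
# Haida Gwaii main shock:
print(round(energy_ratio(9.0, 7.8)))  # ~63 times more energy
```

So a full megathrust rupture of the kind the studies warn about would release on the order of sixty times the energy of the 2012 event.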
A transform fault is one where the motion is predominantly horizontal. Those faults end abruptly and are connected on both ends to other faults, ridges, or subduction zones. The best-known (and most destructive) are those on land at the margins of tectonic plates. Transform faults are the only type of strike-slip fault at plate boundaries, showing strike-slip, or side-to-side, movement.
Queen Charlotte Terrace
The Queen Charlotte Terrace is a 25 kilometre wide zone of built up marine sediment immediately west of the active Queen Charlotte fault. The crust is about 12 kilometres thick at the terrace. On Haida Gwaii, the earth’s crust is 18 kilometres thick at the eastern edge. On the BC mainland the crust is in excess of 30 kilometres thick.
The 1949 Haida Gwaii quake was one of the largest in the recorded history of North America.
The largest known earthquake along the coast was the megathrust event on the Cascadia fault on January 26, 1700, in which the Juan de Fuca plate ruptured for about 1,000 kilometres from what is now northern California to Vancouver Island, estimated at magnitude 9.0. The dating is based on a tsunami that hit Japan with no associated local earthquake, as well as studies of tree rings from the remains of trees downed in the tsunami.
Acidification of the oceans poses an already growing risk to the northwest coast fishery, including crab and salmon, according to studies released by the US National Oceanic and Atmospheric Administration.
As more carbon dioxide is released into the atmosphere and absorbed by the oceans, the water is becoming more acidic, which affects many species, especially shellfish, by dissolving their shells.
A NOAA study released today of environmental and economic risks to the Alaska fishery says:
Many of Alaska’s nutritionally and economically valuable marine fisheries are located in waters that are already experiencing ocean acidification, and will see more in the near future…. Communities in southeast and southwest Alaska face the highest risk from ocean acidification because they rely heavily on fisheries that are expected to be most affected by ocean acidification…
An earlier NOAA study, released in April, identified a long-term threat to the salmon fishery: small ocean snails called pteropods, a prime food source for pink salmon, are already being affected by the acidification of the ocean.
The term “ocean acidification” describes the process of ocean water becoming more acidic as a result of absorbing nearly a third of the carbon dioxide released into the atmosphere from human sources. This change in ocean chemistry is affecting marine life, particularly the ability of shellfish, corals and small creatures in the early stages of the food chain to build skeletons or shells.
Today’s NOAA study is the first published research by the Synthesis of Arctic Research (SOAR) program, which is supported by a US inter-agency agreement between NOAA’s Office of Oceanic and Atmospheric Research and the Bureau of Ocean Energy Management (BOEM) Alaska Region.
Des Nobles, President of Local #37 Fish [UFAWU-UNIFOR], told Northwest Coast Energy News that the fisheries union and other fisheries groups in Prince Rupert have asked both the Canadian federal and the BC provincial governments for action on ocean acidification. Nobles says so far those requests have been ignored.
Threat to crabs
The studies show that red king crab and tanner crab grow more slowly and don’t survive as well in more acidic waters. Alaska’s coastal waters are particularly vulnerable to ocean acidification because of cold water that can absorb more carbon dioxide and unique ocean circulation patterns which bring naturally acidic deep ocean waters to the surface.
“We went beyond the traditional approach of looking at dollars lost or species impacted; we know these fisheries are lifelines for native communities and what we’ve learned will help them adapt to a changing ocean environment,” said Jeremy Mathis, Ph.D., co-lead author of the study, an oceanographer at NOAA’s Pacific Marine Environmental Laboratory in Seattle, and the director of the University of Alaska Fairbanks School of Fisheries and Ocean Sciences Ocean Acidification Research Center.
As for Dungeness crab, Sarah Cooley, a co-author of the Alaska study, who was with the Woods Hole Oceanographic Institution at the time, told Northwest Coast Energy News, “The studies have not been done for Dungeness crab that have been done for king and tanner crab, that’s something we’re keenly aware of. There’s a big knowledge gap at this point.” She says NOAA may soon be looking at a pilot study on Dungeness crab.
Risk to Salmon, Mackerel and Herring
In a 2011-2013 survey, a NOAA-led research team found the first evidence: “that acidity of continental shelf waters off the West Coast is dissolving the shells of tiny free-swimming marine snails, called pteropods, which provide food for pink salmon, mackerel and herring.”
The survey estimated that the percentage of pteropods along the west coast with dissolving shells due to ocean acidification had “doubled in the near shore habitat since the pre-industrial era and is on track to triple by 2050 when coastal waters become 70 percent more corrosive than in the pre-industrial era due to human-caused ocean acidification.”
That study documented the movement of corrosive waters onto the continental shelf from April to September during the upwelling season, when winds bring water rich in carbon dioxide up from depths of about 120 to 180 metres to the surface and onto the continental shelf.
“We haven’t done the extensive amount of studies yet on the young salmon fry,” Cooley said. “I would love to see those studies done. I think there is a real need for that information. Salmon are just so so important for the entire Pacific Northwest and up to Alaska.”
In Prince Rupert, Barb Faggetter, an independent oceanographer whose company Ocean Ecology has consulted for the fishermen’s union and NGOs, was not part of the study but spoke generally about the threat of acidification to the region.
She is currently studying the impact of the proposed liquefied natural gas terminals that could be built at Prince Rupert near the Skeena River estuary. Faggetter said that acidification could affect the species eaten by juvenile salmon. “As young juveniles they eat a lot of zooplankton including crustaceans and shellfish larvae.”
She added, “Any of the shellfish in the fishery, including probably things like sea urchins, are all organisms that are susceptible to ocean acidification because of the loss of their capacity to actually incorporate calcium carbonate into their shells.”
Faggetter said her studies have concentrated on potential habitat loss near Prince Rupert as a result of dredging and other activities for liquefied natural gas development. She adds that ocean acidification “has been a consideration that climate change will further worsen any potential damage that we’re currently looking at.”
Her studies of the Skeena estuary are concentrating on “rating” areas based on the food supply available to juvenile salmon, as well as predation and what habitat is available and the quality of that habitat to identify areas that “are most important for the juvenile salmon coming out of the Skeena River estuary and which are less important.”
She said that climate change and ocean acidification could impact the Skeena estuary and “probably reduce some of the environments that are currently good because they have a good food supply. If ocean acidification reduces that food supply that will no longer be good habitat for them” [juvenile salmon].
The August 2011 NOAA survey of the pteropods was done at sea using “bongo nets” to retrieve the small snails at depths up to 200 metres. The research drew upon a West Coast survey by the NOAA Ocean Acidification Program that was conducted on board the R/V Wecoma, owned by the National Science Foundation and operated by Oregon State University.
Nina Bednarsek, Ph.D., of NOAA’s Pacific Marine Environmental Laboratory in Seattle, the lead author of the April pteropod paper said, “Our findings are the first evidence that a large fraction of the West Coast pteropod population is being affected by ocean acidification.
“Dissolving coastal pteropod shells point to the need to study how acidification may be affecting the larger marine ecosystem. These near shore waters provide essential habitat to a great diversity of marine species, including many economically important fish that support coastal economies and provide us with food.”
Ecology and economy
Today’s study of the effects of acidification on the Alaska fishery examined a state where the fishing industry supports over 100,000 jobs and generates more than $5 billion in annual revenue. Fishery-related tourism also brings in $300 million annually to the state.
The study also shows that approximately 120,000 people, or roughly 17 percent of Alaskans, rely on subsistence fisheries for most, if not all, of their dietary protein. The Alaska subsistence fishery is open to all residents of the state who need it, although a majority of those who participate in the subsistence fishery are Alaska’s First Nations. In that way it is somewhat parallel to Canada’s Food, Ceremonial and Social program for First Nations.
“Ocean acidification is not just an ecological problem—it’s an economic problem,” said Steve Colt, Ph.D., co-author of the study and an economist at the University of Alaska Anchorage. “The people of coastal Alaska, who have always looked to the sea for sustenance and prosperity, will be most affected. But all Alaskans need to understand how and where ocean acidification threatens our marine resources so that we can work together to address the challenges and maintain healthy and productive coastal communities.”
The Alaska study recommends that residents and stakeholders in vulnerable regions prepare for environmental challenge and develop response strategies that incorporate community values and needs.
“This research allows planners to think creatively about ways to help coastal communities withstand environmental change,” said Cooley, who is now science outreach manager at Ocean Conservancy, in Washington, D.C. “Adaptations can be tailored to address specific social and environmental weak points that exist in a community.
“This is really the first time that we’ve been able to go under the hood and really look at the factors that make a particular community in a borough or census area less or more vulnerable to changing conditions resulting from acidification. It gives us a lot of power so that we don’t just look at environmental issues but also look at the social story behind that risk.”
As for the southern part of the Alaska panhandle nearest British Columbia, Cooley said, “What we found is that there is a high relative risk compared to some of the other areas of Alaska, and that is because the communities there undertake a lot of subsistence fishing. There tend not to be a whole lot of commercial harvests in the fisheries there, but they are very, very important from a subsistence standpoint… And they’re tied to species that we expect to be on the front line of acidification, many of the clam species that are harvested in that area and some of the crab species.”
Long term effects
Libby Jewett, Director of the NOAA Ocean Acidification Program and author of the pteropod study said, “Acidification of our oceans may impact marine ecosystems in a way that threatens the sustainability of the marine resources we depend on.
“Research on the progression and impacts of ocean acidification is vital to understanding the consequences of our burning of fossil fuels.”
“Acidification is happening now,” Cooley said. “We have not yet observed major declines in Alaskan harvested species. In Washington and Oregon they have seen widespread oyster mortality from acidification.
“We don’t have the documentation for what’s happening in Alaska right now, but there are a lot of studies starting up right now that will just keep an eye out for that sort of thing. Acidification is going to be continuing progressively over the next decades into the future indefinitely until we really curb carbon dioxide emissions. There’s enough momentum in the system that is going to keep acidification advancing for quite some time.
“What we need to be doing as we cut the carbon dioxide is find ways to strengthen communities that depend on resources, and this study allows us to think differently about that and to really look at how we can strengthen those communities.”
Faggetter said: “It’s one more blow to an already complex situation here. My study has been working particularly on eelgrass on Flora Bank (pdf), which is a very critical habitat that is going to be impacted by these potential industrial developments. That impact will affect our juvenile salmon and our salmon fishery very dramatically, and it could be further worsened by ocean acidification.”
She said that acidification could also be a long-term threat to plans in Prince Rupert to establish a geoduck (pronounced “gooey-duck”) fishery.
The popular large clam, 15 to 20 centimetres long, is harvested in Washington State and southern BC, but so far hasn’t been subject to commercial fishing in the north.
NOAA said today’s study shows that by examining all the factors that contribute to risk, more opportunities can be found to prevent harm to human communities at a local level. Decision-makers can address socioeconomic factors that lower the ability of people and communities to adapt to environmental change, such as low incomes, poor nutrition, lack of educational attainment and lack of diverse employment opportunities.
NOAA’s Ocean Acidification Program and the state of Alaska are also developing tools to help industry adapt to increasing acidity.
The new NOAA study is the first published research by the Synthesis of Arctic Research (SOAR) program, which is supported by an inter-agency agreement between NOAA’s Office of Oceanic and Atmospheric Research and the Bureau of Ocean Energy Management (BOEM) Alaska Region.
Oil spills kill fish. That’s well known. Now scientists say they have found out why oil spills kill adult fish. The chemicals in the oil often trigger an irregular heartbeat and cardiac arrest.
A joint study by Stanford University and the US National Oceanic and Atmospheric Administration has found that crude oil interferes with fish heart cells. The toxic consequence is a slowed heart rate, reduced cardiac contractility and irregular heartbeats that can lead to cardiac arrest and sudden cardiac death.
The study was published Feb. 14, 2014 in the prestigious international journal Science and unveiled at the convention of the American Association for the Advancement of Science in Chicago.
The study is part of the ongoing Natural Resource Damage Assessment of the April 2010 Deepwater Horizon oil spill in the Gulf of Mexico.
Scientists have known for some time that crude oil is “cardiotoxic” to developing fish, but until now the mechanisms underlying its harmful effects were unclear.
Studies going back to the Exxon Valdez oil spill in Alaska in 1989 have shown that exposure to crude oil-derived chemicals disrupts cardiac function and impairs development in larval fishes. The studies describe a syndrome of embryonic heart failure, bradycardia (slow heartbeat), arrhythmias (irregular heartbeats) and edema in exposed fish embryos.
Studies of young fish began in the aftermath of the Deepwater Horizon spill in the Gulf of Mexico. The two science teams wanted to find out how oil specifically impacts heart cells.
Crude oil is a complex mixture of chemicals, some of which are known to be toxic to marine animals.
Past research focused on “polycyclic aromatic hydrocarbons” (PAHs), which can also be found in coal tar, creosote, air pollution and stormwater runoff from land. In the aftermath of an oil spill, the studies show PAHs can persist for many years in marine habitats and cause a variety of adverse environmental effects.
The scientists found that oil interferes with cardiac cell excitability, contraction and relaxation – vital processes for normal beat-to-beat contraction and pacing of the heart.
Low concentrations of crude
The study shows that very low concentrations of crude oil disrupt the specialized ion channel pores – where molecules flow in and out of the heart cells – that control heart rate and contraction in the cardiac muscle cell. This cyclical signalling pathway in cells throughout the heart is what propels blood out of the pump on every beat. The protein components of the signalling pathway are highly conserved in the hearts of most animals, including humans.
The researchers found that oil blocks the potassium channels distributed in heart cell membranes, increasing the time to restart the heart on every beat. This prolongs the normal cardiac action potential, and ultimately slows the heartbeat. The potassium ion channel impacted in the tuna is responsible for restarting the heart muscle cell contraction cycle after every beat, and is highly conserved throughout vertebrates, raising the possibility that animals as diverse as tuna, turtles and dolphins might be affected similarly by crude oil exposure. Oil also resulted in arrhythmias in some ventricular cells.
“The ability of a heart cell to beat depends on its capacity to move essential ions like potassium and calcium into and out of the cells quickly,” said Barbara Block, a professor of marine sciences at Stanford. She said, “We have discovered that crude oil interferes with this vital signalling process essential for our heart cells to function properly.”
Nat Scholz, leader of the Ecotoxicology Program at NOAA’s Northwest Fisheries Science Center in Seattle, said, “We’ve known from NOAA research over the past two decades that crude oil is toxic to the developing hearts of fish embryos and larvae, but haven’t understood precisely why.”
Long-term problems in fish hearts
He added: “These new findings more clearly define petroleum-derived chemical threats to fish and other species in coastal and ocean habitats, with implications that extend beyond oil spills to other sources of pollution such as land-based urban stormwater runoff.”
The new study also calls attention to a previously underappreciated risk to wildlife and humans, particularly from exposure to cardioactive PAHs that can also exist when there are high levels of air pollution.
“When we see these kinds of acute effects at the cardiac cell level,” Block said, “it is not surprising that chronic exposure to oil from spills such as the Deepwater Horizon can lead to long-term problems in fish hearts.”
The study used captive populations of bluefin and yellowfin tuna at the Tuna Research and Conservation Center, a collaborative facility operated by Stanford and the Monterey Bay Aquarium. That meant the research team was able to directly observe the effects of crude oil samples collected from the Gulf of Mexico on living fish heart cells.
“The protein ion channels we observe in the tuna heart cells are similar to what we would find in any vertebrate heart and provide evidence as to how petroleum products may be negatively impacting cardiac function in a wide variety of animals,” she said. “This raises the possibility that exposure to environmental PAHs in many animals – including humans – could lead to cardiac arrhythmias and bradycardia, or slowing of the heart.”
The Deepwater Horizon disaster released over 4 million barrels of crude oil during the peak spawning time for the Atlantic bluefin tuna in the spring of 2010. Electronic tagging and fisheries catch data indicate that Atlantic bluefin spawn in the area where the Deepwater Horizon drilling rig collapsed, raising the possibility that eggs and larvae, which float near the surface waters, were exposed to oil.
The spill occurred in the major spawning ground of the western Atlantic population of bluefin tuna in the Gulf of Mexico. The most recent stock assessment, conducted in 2012, estimated the spawning population of the bluefin tuna to be at only 36 percent of the 1970 baseline population. Additionally, many other pelagic fishes were also likely to have spawned in oiled habitats, including yellowfin tuna, blue marlin and swordfish.
Block and her team bathed isolated cardiac cells from the tuna in low dose crude oil concentrations similar to what fish in early life stages may have encountered in the surface waters where they were spawned after the April 2010 oil spill in the Gulf of Mexico.
They measured the heart cells’ response to record how ions flowed into and out of the heart cells to identify the specific proteins in the excitation-contraction pathway that were affected by crude oil chemical components.
Fabien Brette, a research associate in Block’s lab and lead author on the study said the scientists looked at the function of healthy heart cells in a laboratory dish and then used a microscope to measure how the cells responded when crude oil was introduced.
“The normal sequence and synchronous contraction of the heart requires rapid activation in a coordinated way of the heart cells,” Block said. “Like detectives, we dissected this process using laboratory physiological techniques to ask where oil was impacting this vital mechanism.”
The worldwide population of Orcas probably crashed during the last Ice Age, creating a “bottleneck” in the genetic diversity of the species around the world, a problem that could continue to affect killer whales today, according to a new genetic study published on February 4, 2014.
Rus Hoelzel of the School of Biological and Biomedical Sciences at Durham University in the UK and colleagues used DNA sequencing from archive material from earlier studies, or from museum specimens, to track the evolution of the Orcas.
One group of Orcas that lives off South Africa is an exception, with greater genetic diversity than the others, the new study has revealed.
“Killer whales have a broad world-wide distribution, rivaling that of humans. At the same time, they have very low levels of genetic diversity,” Hoelzel said.
“Our data suggest that a severe reduction in population size during the coldest period of the last ice age could help explain this low diversity, and that it could have been an event affecting populations around the world.
The Orca population along the Northwest Pacific Coast has the same low genetic diversity as in other areas, the study showed.
The killer whale is a top predator, feeding on everything from seals to sharks. From its position at the top of the food chain, the Orca also serves as a sentinel species for past and future ocean ecosystems and environmental change.
In the study published in the journal Molecular Biology and Evolution, Hoelzel and his colleagues assembled 2.23 gigabytes of Northern Hemisphere killer whale genomic data and mitochondrial DNA (mtDNA) from 616 samples worldwide.
The scientists concluded that killer whales were stable in population size during most of the Pleistocene (2.5 million – 11,000 years ago) followed by a rapid decline and bottleneck during the last great period of the Ice Age (110,000-12,000 years ago).
“Our data supports the idea of a population bottleneck affecting killer whales over a wide geographic range and leading to the loss of diversity,” Hoelzel said. “The South African population stands out as an exception, which may be due to local conditions that were productive and stable over the last million years or so.”
They point to the “Benguela upwelling” ocean system, which delivers nutrient-rich cold water to the oceans off South Africa. The Benguela system remained stable through the last glacial period, and its nutrient-rich water would have been able to sustain the supplies of fish and dolphins that killer whales in this part of the world feed on.
The scientists believe that other major upwelling systems around the world – the California current off North America; Humboldt off South America; and the Canary current off the coast of North Africa – were either disrupted or collapsed altogether during the last glacial or Pleistocene periods. This could potentially have reduced the food supply to killer whales in these areas, leading to the fall in their numbers.
While it was likely that other factors affecting killer whale populations were “overlapping and complex”, the scientists ruled out hunting as a cause of the population bottleneck. Hunting by early man could not have happened on a sufficient scale to produce a global decline in killer whale numbers.
In an e-mail to Northwest Coast Energy News, Hoelzel said that the team sequenced the DNA from a male killer whale from the Southeast Alaska resident community. “This genome revealed the same pattern of historical population dynamics as we found for a whale from the North Atlantic, suggesting shared history across a very broad geographic range, and a shared population bottleneck around the time of the last glacial maximum,” Hoelzel said.
The scientists say examining the genetic diversity of the ocean’s other top predators, such as sharks, might reveal a similar negative impact on their numbers during the Ice Age.
If you read both the 76 pages of Volume One of the Northern Gateway Joint Review decision and the 417 pages of Volume 2, a total of 493 pages, one word keeps reappearing. That word is “burden.”
The JRP panel asks “How did we weigh the balance of burdens, benefits, and risks?”
And it says:
Many people and parties commented on the economic benefits and burdens that could be brought about by the Enbridge Northern Gateway Project. In our view, opening Pacific Basin markets would be important to the Canadian economy and society. Though difficult to measure, we found that the economic benefits of the project would likely outweigh any economic burdens.
The JRP notes:
The Province of British Columbia and many hearing participants argued that most of the project’s economic benefits would flow to Alberta, the rest of Canada, and foreign shareholders in oil and pipeline companies. They said British Columbia would bear too many of the environmental and economic burdens and risks compared to the benefits.
But, as the panel does throughout the ruling, it accepts, with little, if any, skepticism, Northern Gateway’s evidence and assertion:
Northern Gateway said about three-quarters of construction employment would occur in British Columbia, and the province would get the largest share of direct benefits from continuing operations.
It does touch on the “burdens” faced by the Aboriginal people of northern BC and others in the event of a catastrophic spill.
In the unlikely event of a large oil spill, we found that there would be significant adverse effects on lands, waters, or resources used by Aboriginal groups. We found that these adverse effects would not be permanent and widespread. We recognize that reduced or interrupted access to lands, waters, or resources used by Aboriginal groups, including for country foods, may result in disruptions in the ability of Aboriginal groups to practice their traditional activities. We recognize that such an event would place burdens and challenges on affected Aboriginal groups. We find that such interruptions would be temporary. We also recognize that, during recovery from a spill, users of lands, waters, or resources may experience disruptions and possible changes in access or use.
And the JRP goes on to say:
We recommend approval of the Enbridge Northern Gateway Project, subject to the 209 conditions set out in Volume 2 of our report. We have concluded that the project would be in the public interest. We find that the project’s potential benefits for Canada and Canadians outweigh the potential burdens and risks….
We are of the view that opening Pacific Basin markets is important to the Canadian economy and society. Societal and economic benefits can be expected from the project. We find that the environmental burdens associated with project construction and routine operation can generally be effectively mitigated. Some environmental burdens may not be fully mitigated in spite of reasonable best efforts and techniques…. We acknowledge that this project may require some people and local communities to adapt to temporary disruptions during construction.
As for the chance of a major oil spill, again the JRP talks about burdens:
The environmental, societal, and economic burdens of a large oil spill, while unlikely and not permanent, would be significant. Through our conditions we require Northern Gateway to implement appropriate and effective spill prevention measures and spill response capabilities, so that the likelihood and consequences of a large spill would be minimized.
It is our view that, after mitigation, the likelihood of significant adverse environmental effects resulting from project malfunctions or accidents is very low.
We find that Canadians will be better off with this project than without it.
One fact stands out in the Joint Review ruling: Northern British Columbia must bear the “burden” of the Northern Gateway project for the good of Alberta and the rest of Canada. The JRP accepts, without much questioning, Northern Gateway’s assurances that environmental disruptions during construction will be minimal and that the chances of a major spill from either a pipeline or a tanker are minimal.
Canadians as a whole may be better off with the Northern Gateway. Whether the people who live along the pipeline and tanker route will be better off is another question, one which the Joint Review Panel dismisses with casual disdain.
The politics of the Joint Review Panel
There are actually two Joint Review Panel reports.
One is political, one is regulatory. The political decision by the three-member panel, two from Alberta and one from Ontario, is that the concerns of northwestern British Columbia are fully met by Enbridge Northern Gateway’s assurances. A second political decision runs throughout both volumes of the report: the panel takes the view that many parts of the environment have already been degraded by previous human activity, and that therefore the construction and operation of the Northern Gateway will have little consequence.
Here is where the Joint Review Panel is blind to its own bias. With its mandate to rule on the Canadian “public interest,” the panel makes the political determination that, in the Canadian public interest, northwestern BC must bear the “burden” of the project, while other political issues were not considered because, apparently, those issues were outside the JRP’s mandate.
…some people asked us to consider the “downstream” emissions that could arise from upgrading, refining, and diluted bitumen use in China and elsewhere. These effects were outside our jurisdiction, and we did not consider them. We did consider emissions arising from construction activities, pipeline operations, and the engines of tankers in Canadian territorial waters.
During our hearings and in written submissions, many people urged us to include assessment of matters that were beyond the scope of the project and outside our mandate set out in the Joint Review Panel Agreement. These issues included both “upstream” oil development effects and “downstream” refining and use of the products shipped on the pipelines and tankers… Many people said the project would lead to increased greenhouse gas emissions and other environmental and social effects from oil sands development. We did not consider that there was a sufficiently direct connection between the project and any particular existing or proposed oil sands development or other oil production activities.
If someone in Northwestern British Columbia favours the Northern Gateway project, if they believe (and many people do) what Enbridge Northern Gateway says about the economic benefits, then it is likely they will accept the burden and the further environmental degradation imposed by the Joint Review Panel on this region of British Columbia.
For those opposed to the project, on the other hand, the decision to impose the burden on this region is both unreasonable and undemocratic, since no one in northern BC, in the energy-friendly east or the environmental west, has been formally asked to accept or reject the project. For opponents, the notion that Canadians can continue to degrade the environment with no consequence, simply because it has already been disrupted by earlier industrial development, will only fuel opposition to the project.
As for the assertion that greenhouse gas emissions were not part of the Joint Review Panel’s mandate, that is mendacious. The panel made a political decision on the role of the people of northwestern BC and the state of northwestern BC’s environment. The panel made a political decision to avoid ruling on the role of Northern Gateway in contributing to climate change or the larger worldwide economic impact of pipelines and the bitumen sands.
The Joint Review Panel is supposed to be a regulatory body. Should the pipeline, terminal and tanker project go ahead after the expected court challenges from First Nations on rights, title and consultation, and from the environmental groups, those 209 conditions kick in.
While the Joint Review Panel largely accepts Enbridge Northern Gateway’s evidence with few questions, in some areas the panel does find flaws in what Northern Gateway planned. In a few instances, it actually accepts the recommendations of intervenors, many from First Nations who, while opposed to the project, successfully demanded route changes through environmentally sensitive or culturally significant territory.
When it comes to regulations, as opposed to politics, the Joint Review Panel has done its job and done it well. If all 209 conditions and the other suggestions found in the extensive second volume of the ruling are actually enforced then it is likely that the Northern Gateway will be the safe project that Enbridge says it will be and actually might meet BC Premier Christy Clark’s five conditions for heavy oil pipelines across BC and tankers off the BC coast.
But there is a big but.
The question is, however, who is going to enforce the 209 conditions? In recent conversations on various social media, people who were quiet during the JRP hearings have now come out in favour of the pipeline project. Read those comments and you will find that the vast majority of project supporters want those conditions strictly enforced. Long before the JRP findings and before Premier Christy Clark issued her five conditions, supporters of the Northern Gateway, speaking privately, often had their own list of a dozen or two dozen conditions for their support of the project.
The people of northwestern BC had already witnessed cuts to Fisheries and Oceans, Environment Canada and the Canadian Coast Guard in this region even before Stephen Harper got his majority government in May 2011.
Since winning that majority, Harper has cut millions of dollars from the budgets for environmental studies, monitoring and enforcement. The Joint Review Panel began its work under the stringent rules of the former Fisheries Act and the Navigable Waters Act, both of which were gutted in the Harper government’s omnibus bills. Government scientists have been muzzled and, if allowed to speak, can only speak through departmental spin doctors. The Joint Review Panel requires Enbridge Northern Gateway to file hundreds of reports on the progress of surveying, environmental studies, safety studies, construction plans and activities, and project operations. What is going to happen to those reports? Will they be acted on, or just filed in a filing cabinet, perhaps posted in an obscure and hard-to-find location on the NEB website, and then forgotten?
Will the National Energy Board have the staff and the expertise to enforce the 209 conditions? Will there be any staff left at Environment Canada, Transport Canada, Fisheries and Oceans and the Canadian Coast Guard where the conditions demand active participation by government agencies, or ongoing consultation between federal agencies and Northern Gateway? Will there actually be monitoring, participation and consultation between the project and the civil service, or will those activities amount to nothing more than meetings every six months or so, when reports are exchanged and then forgotten? Although Stephen Harper and his government say the Northern Gateway is a priority, the bigger priority is a balanced budget, and it is likely there will be more cuts in the coming federal budget, not enhancements to environmental protection for northwestern BC.
The opponents of the project might reluctantly agree to the 209 conditions if the Harper government forces the project to go ahead. It will be up to the supporters to decide whether or not they will continue their support of Northern Gateway if the 209 conditions turn out to be nothing more than a few pages in an Adobe PDF.
Special report: Clio Bay cleanup: Controversial, complicated and costly
Haisla First Nation Chief Counsellor Ellis Ross says the Haisla proposed to the KM LNG project, a partnership of Chevron and Apache, that the marine clay be used to cover the thousands of logs at the bottom of Clio Bay. The proposal came after years of frustration with the Department of Fisheries and Oceans and the BC provincial government, which for decades ignored requests for help in restoring almost fifty sunken log sites in Haisla traditional territory.
The problem is that remediation of the hundreds of sites on Canada’s west coast, most containing tens of thousands of sunken logs, has been so low on DFO’s priority list that, even before the omnibus bills that gutted environmental protection in Canada, remediation of sunken log sites by DFO could be called no priority.
Now that KM LNG has to dispose of a total of about 3.5 million cubic metres of marine clay and possibly other materials from the Bish Cove site, log remediation has suddenly become a high priority at DFO.
The controversy is rooted in the fact that although the leaders of the Haisla and the executives at Chevron knew about the idea of capping Clio Bay, people in the region, including many residents of Kitimat and some members of the Haisla, were surprised when the project was announced in the latest KM LNG newsletter distributed to homes in the valley.
In a statement sent to Northwest Coast Energy News, Chevron spokesperson Gillian Robinson Ridell said:
The Clio Bay Restoration Project proposed by Chevron is planned to get underway sometime in early 2014. The proposal is fully supported by the Federal Department of Fisheries and Oceans and the Haisla First Nation Council. The project has been put forward as the best option for removal of the marine clay that is being excavated from the Kitimat LNG site at Bish Cove. Chevron hired Stantec, an independent engineering and environmental consulting firm with extensive experience in many major habitat restoration projects that involve public safety and environmental conservation. The Haisla, along with Stantec’s local marine biologists, identified Clio Bay as a site that has undergone significant environmental degradation over years of accumulation of underwater wood debris caused by historic log-booming operations. The proposal put forward by the marine biologists was that restoration of the marine ecosystem in the Bay could be achieved if marine clay from Chevron’s facility site was used to cover the woody debris at the bottom of the Bay. The process outlined by the project proposal is designed to restore the Clio Bay seafloor to its original soft substrate that could sustain a recovery of biological diversity.
Non-aboriginal residents of Kitimat are increasingly worried about being cut off from both Douglas Channel and the terrestrial back country by industrial development. These fears have been heightened by reports that say that Clio Bay could be closed to the public for “safety reasons” for as much as 16 months during the restoration project.
The fact that Clio is known both as a safe anchorage during bad weather and as an easy-to-reach location for day trips from Kitimat has made those worries even more acute.
There is also a strong feeling in Kitimat that the residents were kept out of the loop when it came to the Clio Bay proposal.
In a letter to the District of Kitimat, DFO said:
Clio Bay has been used as a log handling site for decades which has resulted in areas of degraded habitat from accumulations of woody debris materials on the sea floor. The project intends to cap impacted areas with inert materials and restore soft substrate seafloor. The remediation of the seafloor is predicted to enhance natural biodiversity and improve the productivity of the local fishery for Dungeness crab. The project area does support a variety of life that will be impacted and therefore the project will require authorization from Fisheries and Oceans Canada for the Harmful Alteration, Disruption or Destruction (HADD) of fish and fish habitat.
The letter avoids the controversy over the use of marine clay by saying “inert material” will be used. That can only increase the worries of residents who say that not only clay but sand, gravel and other overburden from Bish Cove and the upgrade of the Forest Service Road may be used in Clio Bay. (The use of “inert material” also gives DFO an out if the department concludes the usual practice of using sand is better. That, of course, leaves the question of what to do with the clay.)
Although Ellis Ross has said he wants to see large numbers of halibut and cod return to Clio Bay, the DFO letter only mentions the Dungeness Crab.
Search for “remediation” on the DFO site and the viewer is redirected to a page that cites the omnibus bills passed by the Conservative government and says:
On June 29, 2012, the Fisheries Act was amended. Policy and regulations are now being developed to support the new fisheries protection provisions of the Act (which are not yet in force). The existing guidance and policies continue to apply. For more information, see Changes to the Fisheries Act.
On April 2nd, 2013 the Habitat Management Program’s name was changed to the Fisheries Protection Program.
So, despite what communications officers for DFO and the Harper government may say, there was no policy then and there is no policy now on remediation of log sites. Given Harper’s attitude that LNG and possibly bitumen export must proceed quickly with no environmental barriers, it is likely that environmental remediation will continue to be no priority—unless remediation becomes a problem that the energy giants have to solve and pay for.
On the other hand, the State of Alaska and the United States Environmental Protection Agency spent a decade at a site near Ketchikan studying the environmental problems related to sunken logs at transfer sites.
Those studies led Alaska to issue guidelines in 2002 with recommended practices for rehabilitating ocean log dump sites and for the studies that should precede any remediation project.
The Alaska studies also show that in Pacific northwest coast areas, the ecological effects of decades of log dumping, either accidental or deliberate, vary greatly depending on the topography of the region, the topography of the seabed, flow of rivers and currents as well as industrial uses along the shoreline.
The Alaska policy is based on studies and a remediation project at Ward Cove, near Ketchikan, which in many ways resembles Clio Bay, not far from Kitimat.
The Alaska policy follows guidelines from both the US Environmental Protection Agency and the US Army Corps of Engineers that recommend using thin layers of “clean sand” as the best practice method for capping contaminated sites. (The Army Corps of Engineers guidelines say that “clay balls” can be used to cap contaminated sites under some conditions. Both a spokesperson for the Corps of Engineers and officials at the Alaska Department of Environmental Conservation told Northwest Coast Energy News that they have no records or research on using marine clay on a large scale to cap a site.)
The EPA chose Sechelt, BC-based Construction Aggregates to provide the fine sand for the Ward Cove remediation project. The sand was loaded onto 10,000-tonne deck barges, hauled up the coast to Ward Cove, offloaded and stockpiled, then transferred to derrick barges and carefully deposited on the sea bottom using modified clamshell buckets.
The EPA says
Nearly 25,000 tons of sand were placed at the Ward Cove site to cap about 27 acres of contaminated sediments. In addition, about 3 acres of contaminated sediments were dredged in front of the main dock facility and 1 acre was dredged near the northeast corner of the cove. An additional 50 acres of contaminated sediments have been left to recover naturally.
A report by Integral Consulting, one of the firms involved in the project, estimated that 17,800 cubic metres of sand were used at Ward Cove.
In contrast to the 17,800 cubic metres of sand used at Ward Cove, the Bish Cove project must dispose of about 1.2 million cubic metres of marine clay at sea (with another 1.2 million cubic metres slated for deposit in old quarries near Bees Creek).
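For a rough sense of the difference in scale, the two volumes can be compared directly. This is only a back-of-envelope sketch using the figures reported above:

```python
# Back-of-envelope scale comparison, using the volumes reported above.
ward_cove_sand_m3 = 17_800        # sand placed at Ward Cove (Integral Consulting estimate)
bish_cove_clay_m3 = 1_200_000     # marine clay slated for disposal at sea from Bish Cove

ratio = bish_cove_clay_m3 / ward_cove_sand_m3
print(f"Clio Bay would receive roughly {ratio:.0f} times the volume placed at Ward Cove")
```

In other words, the proposed Clio Bay deposit is on the order of 67 times the volume of capping material used in the Alaska benchmark project.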
Studies at Ward Cove began as far back as 1975. In 1990, Alaska placed Ward Cove on a list of “water-quality limited sites.” The studies intensified after the main polluter of Ward Cove, the Ketchikan Paper Company, agreed to a remediation plan in a 1995 consent decree with the Environmental Protection Agency. After almost five years of intensive studies of the cove, the sand-capping and other remediation operations were conducted from November 2000 to March 2001. Major post-remediation studies were carried out at Ward Cove in 2004 and again in 2009. The next one is slated for 2015.
Deaf ears at DFO
“We need to put pressure on the province or Canada to clean up these sites. We’ve been trying to do this for the last 30 years. We got nowhere,” Ellis Ross says. “Before, when we talked [to DFO] about getting those logs and cables cleaned up, it fell on deaf ears. They had no policy and no authority to hold these companies accountable. So we’re stuck, we’re stuck between a rock and a hard place. How do we fix it?”
Ross says there has been one small pilot project using marine clay for capping which the Haisla’s advisers and Chevron believe can be scaled up for Clio Bay.
Douglas Channel studies
The one area around Kitimat that has been studied on a regular basis is Minette Bay. The first study occurred in 1951, before Alcan built the smelter, and was used as a benchmark in later studies. In 1995 and 1996, DFO studied Minette Bay and concluded that because the water there was so stagnant, log dumping had not contributed to low levels of dissolved oxygen, although it said it could not rule out “other deleterious effects on water quality and habitat” from log dumping.
That DFO report also says that there were complaints about log dumping at Minette Bay as far back as 1975, which would tend to confirm what Ross says, that the Haisla have been complaining about environmentally degrading practices for about 30 years.
Ross told Northwest Coast Energy News that if the Clio Bay remediation project is successful, the next place for remediation should be Minette Bay.
A year after the Minette Bay study, DFO did a preliminary study of log transfer sites in Douglas Channel, with an aerial survey in March 1997 and on-water studies in 1998. The DFO survey identified 52 locations with sunken logs on Douglas Channel as “potential study sites.” That list does not include Clio Bay. On-water studies were done at the Dala River dump site at the head of the inlet on Kildala Arm; at Weewanie Hotsprings, at the southwest corner of the cove; at the Ochwe Bay log dump, where the Paril River estuary opens into the Gardner Canal; and at the Collins Bay log dump, also on the Gardner Canal.
In the introduction to its report, published in 2000, the DFO authors noted “the cumulative effect of several hundred sites located on BC coast is currently unknown.”
Since there appears to have been no significant follow-up, that cumulative effect is still “unknown.”
In 2000 and 2001, Chris Picard, then with the University of Victoria and now science director for the Gitga’at First Nation, did a comparison survey of Clio Bay and Eagle Bay under special funding for a “Coasts Under Stress” project funded by the federal government. Picard’s study found that Eagle Bay, where there had been no log dumping, was in much better shape than Clio Bay. For example, the study found that Dungeness crabs were observed five times more often in unimpacted Eagle Bay.
Although low oxygen levels have been cited as a reason for capping Clio Bay, Picard’s study says that “near surface” oxygen levels “did not reliably distinguish Clio and Eagle Bay sediments.” While Clio Bay showed consistently low oxygen levels, Eagle Bay showed “considerable interseasonal variation,” which is consistent with the much more intensive and ongoing studies of oxygen levels at Ward Cove.
It appears that Chevron was taken by surprise by the controversy over the Clio Bay restoration. Multiple sources at the District of Kitimat have told Northwest Coast Energy News that in meetings with Chevron, company officials seemed to be scrambling to find out more about Clio Bay.
This is borne out by the fact that, in its communications with Northwest Coast Energy News, Chevron says its consulting firm, Stantec, has cited just two studies: Chris Picard’s survey of Clio Bay and a 1991 overview of log-booming practices on the US and Canadian Pacific coasts. So far, Chevron has not cited the more up-to-date and detailed studies of Ward Cove that were conducted from 1995 to 2005.
Chevron says that Stantec marine biologists are now conducting extensive field work using divers and Remote Operated Vehicle surveys to “observe and record all flora and fauna in the bay and its levels of abundance. Stantec’s observations echoed the previous studies which determined that the massive amount of wood has harmed Clio Bay’s habitat and ecosystem.”
Reports in the Australian media seem to bear out Chevron’s position on environmental responsibility. Things seem to be working at Barrow Island.
Robinson went on to say:
Those same high environmental standards are being applied to the Kitimat LNG project and the proposed Clio Bay Restoration project. The proposed work would be carried out with a stringent DFO approved operational plan in place and would be overseen by qualified environmental specialists on-site.
The questions that everyone in the Kitimat region must now ask are: just how qualified are the environmental specialists hired by Chevron? And, given staff and budget cuts and pressure from the Prime Minister’s Office to downgrade environmental monitoring, just how stringently will DFO monitor the Clio Bay remediation?
Alaska standards
In the absence of comprehensive Canadian studies, the only benchmark available is that set by Alaska, which calls for:
Capping material, typically a clean sand, or silty to gravelly sand, is placed on top of problem sediments. The type of capping material that is appropriate is usually determined during the design phase of the project after a remediation technology has been selected. Capping material is usually brought to the site by barge and put in place using a variety of methods, depending on the selected remedial action alternative.
Thick capping usually requires the placement of 18 to 36 inches of sand over the area. The goal of thick capping is to isolate the bark and wood debris and recreate benthic habitat that diverse benthic infauna would inhabit.
Thin capping requires the placement of approximately 6 to 12 inches of sand on the project area. It is intended to enhance the bottom environment by creating new mini-environments, not necessarily to isolate the bark and wood debris. With thin capping, surface coverage is expected to vary spatially, providing variable areas of capped surface and amended surface sediment (where mixing between capping material and problem sediment occurs) as well as limited areas where no cap is evident.
Mounding places small piles of sand or gravel dispersed over the waste material to create habitat that can be colonized by organisms. Mounding can be used where the substrate will not support capping.
Special report: Clio Bay cleanup: Controversial, complicated and costly
Ward Cove, just eight kilometres west of Ketchikan, Alaska, was so polluted by effluent from pulp and saw mills and a fish plant, and so filled with 16,000 sunken logs, that it qualified for a U.S. Environmental Protection Agency Superfund cleanup.
The Ward Cove project is now considered a benchmark for cleaning up similar bays. Alaska officials emphasized to Northwest Coast Energy News that while Ward Cove does provide guidelines for capping and dredging logs, they were not aware of any project where logs were capped that did not also have other forms of contamination.
If you look at satellite images of Clio Bay, BC, and Ward Cove side by side, you immediately see the similarities and differences between the two bodies of water. (Note: due to the parameters of Google Earth, the images are at slightly different scales.)
Both Clio Bay and Ward Cove are 1.6 kilometres long, somewhat elbow shaped, off a main channel and surrounded by mountains. Ward Cove is 0.8 kilometres wide. Clio Bay is about 0.5 kilometres wide, 0.8 at its widest point. Both have steep slopes from the mountains. Ward Cove is 61 metres deep at the mouth of the cove, decreasing toward the head. Clio Bay is deeper: 182 metres at the mouth, 90 metres in the centre and between 20 metres and 9 metres at the head.
Both Clio Bay and Ward Cove are subject to tidal circulation, and both are influenced by fresh water. Ward Cove is fed by Ward Creek, the smaller Walsh Creek, and runoff precipitation that enters the cove from the steep mountain slopes. Clio Bay is fed by one creek, a number of small streams and mountain slope runoff, especially during the spring melt.
Haisla Chief Counsellor Ellis Ross estimates there are between 10,000 and 20,000 sunken logs in Clio Bay. The official summary from the United States Environmental Protection Agency said there were 16,000 sunken logs in Ward Cove.
The major difference is that Ward Cove was the site of major industrial development, including a pulp mill, a sawmill and a fish plant. That meant the levels of pollutants in Ward Cove were much higher than in Clio Bay, which has never been used for an industrial plant. It was the pollutants in Ward Cove, mainly ammonia, hydrogen sulfide and 4-methylphenol, combined with the thousands of sunken logs, that made the cove a target for cleanup and the associated studies.
A fish plant, Wards Cove Packing, opened in 1912 and ceased operations in 2002. The Ketchikan Paper Company mill began operating in 1954 and closed in 1997. Prior to 1971 and the rise of the environmental movement, no permits were required by KPC for discharging effluent into the cove. After that, the US Environmental Protection Agency issued a discharge permit and monitored effluent. Throughout the time the KPC mill was operating, the EPA says, “high volumes of log storage (approximately 7 billion board feet) caused accumulation of bark waste and sunken logs at the bottom of the cove.” Gateway Forest Products, a sawmill and veneer plant, continued to store logs in Ward Cove until 2002.
A 2009 monitoring report, conducted by the US Army Corps of Engineers after the cleanup for the EPA noted:
An ecological risk assessment was also conducted using a food-web assessment to estimate risks of bioaccumulative chemicals to representative birds and mammals at the top of the Ward Cove food web. The chemicals evaluated were arsenic, cadmium, mercury, zinc, chlorinated dioxins/furans, and PAHs. The results of this assessment indicated that there are no unacceptable risks to higher trophic level organisms in Ward Cove.
A human health risk assessment was conducted to identify potential risks posed by chemicals detected in sediments or seafood (e.g., fish, shellfish). Ingestion of seafood that may contain chemicals bioaccumulated from the sediments was identified as the only complete exposure pathway for humans. The chemicals that were evaluated included: arsenic, cadmium, mercury, zinc, phenol, 4-methylphenol, chlorinated dioxins/furans, and PAHs. Results concluded that sediments in Ward Cove do not pose an unacceptable risk to human health.
A 2007 report on the Ward Cove remediation from the Alaska Department of Environmental Conservation noted:
The continuing residues impairment in Ward Cove is caused by the historical accumulation of wood waste on the bottom of the cove. The waste includes an estimated 16,000 sunken logs over at least 75 percent of the bottom and decomposing pulp, wood, and bark waste in sediments in thicknesses up to 10 feet over at least 50 percent of the bottom. Wood waste residues can displace and smother organisms, alter habitat, release leachates, create anoxic conditions, and produce toxic substances, all of which may adversely affect organisms that live both on top of sediments and within sediments.
That is a similar problem to Clio Bay.
The report notes that problems with oxygen increase with depth, noting:
The dissolved oxygen impairment was due largely to the fish-processing waste discharge from the seafood processing facility until 2002, and it was limited to the summer months in deeper waters of the cove (below the pycnocline, or stratification layer, approximately 10 meters deep). With that discharge removed, limited monitoring in August and September 2003 indicated that dissolved oxygen impairment might remain near the bottom in waters at depths of 30 meters and greater at certain times and locations due to low natural levels of dissolved oxygen and the continuing decomposition of wood waste. Above 30 meters depth, the waters of the cove appeared to meet the [Alaska state] standard for dissolved oxygen. However, there may be limited capacity for waters at 30 meters and deeper to receive additional loading of oxygen-demanding materials and still meet the standard in summer months.
That suggests the worries about oxygen depletion at Clio Bay are justified, given Clio Bay's greater depth.
Studies of the biology of Ward Cove began in 1951, with more in the 1960s and one in 1974. In 1995, Ketchikan Paper Company signed a consent decree with the EPA that called for remediation of Ward Cove. In 2000, KPC and Gateway Forest Products signed a second consent decree with the EPA. Those agreements called on the companies to dredge sediments to improve navigation, to remove logs and other debris from the dredging areas, and to place a thin-layer cap of 15-30 cm (six to 12 inches) of sand over about 11 hectares (27 acres) of sunken logs.
The major studies of Ward Cove began in 1995, after the first consent decree. The remediation did not take place until the initial studies were completed in 1999, with dredging and capping taking place from November 2000 to March 2001.
The EPA positioned 13 water quality monitoring stations, which operated from 1997 to 2002, to measure salinity, temperature and dissolved oxygen: nine inside Ward Cove and four outside the cove in Tongass Narrows. Those studies showed that levels of dissolved oxygen in the cove varied by season, depth and location. Many species, from salmon to mobile bottom dwellers like crabs, were often able to detect and avoid low oxygen areas.
The EPA and the companies involved planned the remediation to include dredging, capping logs and sediment, and leaving some areas where nature would take its course.
The reports say that complete dredging, removal and disposal of the contamination would have cost $200 million. The total actual cost of the Ward Cove Remediation Project, beginning with development of the Remedial Design Work Plan, was estimated at $3,964,000 (in 2000 US dollars).
The EPA says the cost of the capping component of the project, “including preliminary field investigations and reporting, design and plans development, post construction engineering, procurement, construction management, project management, mobilization, demobilization, engineering/QC and science support, surveys, and capping items,” was $2,563,506. Based on the volume of capping material placed, the unit cost of log capping for the Ward Cove Remediation Project was $110 per cubic yard.
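The EPA's cost figures can be cross-checked against the Integral Consulting volume estimate cited earlier. A quick calculation, using the reported total cost, the $110-per-cubic-yard unit cost, and the standard conversion of one cubic yard to about 0.7646 cubic metres:

```python
# Cross-check: implied capping volume from the EPA's total cost and unit cost.
total_capping_cost = 2_563_506       # USD, capping component of the Ward Cove project
unit_cost_per_yd3 = 110              # USD per cubic yard of capping material

implied_yd3 = total_capping_cost / unit_cost_per_yd3
implied_m3 = implied_yd3 * 0.7646    # cubic yards -> cubic metres

print(f"Implied volume: {implied_yd3:,.0f} cubic yards (~{implied_m3:,.0f} cubic metres)")
```

The result, roughly 17,800 cubic metres, is consistent with the volume Integral Consulting reported for Ward Cove.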
The plan called for dredging about 17,050 cubic yards in the area near the cove’s main dock and dredging another 3,500 cubic yards nearby to improve navigation. Before the dredging, 680 tonnes of sunken logs had to be removed. After dredging, a “thin-layer cap of clean, sandy material” was placed in dredged areas unless native sediments or bedrock had been reached during dredging.
In other areas, most covered in sunken logs, the plan called for placement of a thin-layer cap (approximately 6 to 12 inches) of clean, sandy material, with the possibility of “mounding,” dropping mounds of sand on specific areas. The 2009 report says the area of sand deposits actually increased “due to the fact that thin layer placement was found to be successful over a broader area, and it was not necessary to construct mounding.”
The plan called for natural recovery in areas where neither capping nor mounding was practicable, and so about 50 acres were left alone. (DFO says it plans to leave some parts of Clio Bay uncapped as “reference areas.”)
Slope and sand
Two studies were carried out as part of the remediation at Ward Cove that do not appear to be contemplated at Clio Bay. The first looked at the “ability of the organic material to support the weight of 15 to 30 centimetres of sand.” Standard engineering equations used at other fill and capping sites were part of that study. A second study was carried out to determine the minimum factor of safety for a given slope, which, given the steep mountains that line Clio Bay, is likely to be a factor in the deposit of marine clay. That study determined that “for a silty fine sand and a factor of safety of 1.5, the maximum slope would be approximately 40 per cent.”
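The Ward Cove reports do not publish the equations behind those numbers, but a common simplified model for a dry, cohesionless slope, the infinite-slope equation FS = tan(φ)/tan(β), is consistent with them. As an illustration only (the model choice is an assumption, not something stated in the reports), the reported safety factor and maximum slope imply a friction angle in the low thirties of degrees, a plausible value for a silty fine sand:

```python
import math

# Infinite-slope stability for dry, cohesionless material (an assumed, simplified
# model; the Ward Cove reports do not publish their equations): FS = tan(phi)/tan(beta)
factor_of_safety = 1.5
max_slope = 0.40                  # the reported 40 per cent maximum slope (tan of slope angle)

# Friction angle implied by the reported safety factor and maximum slope
phi = math.degrees(math.atan(factor_of_safety * max_slope))
print(f"Implied friction angle: {phi:.0f} degrees")
```

This is only a consistency sketch; the actual Ward Cove analysis may have used a different, more detailed method.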
Those studies led to the conclusion that, for the Ward Cove remediation project, the material to be placed on the fine organic sediment could not be gravel and coarse sand.
That’s because the larger gravel and coarse sand “would tend to sink into the sediment and would not provide quality benthic (seabottom) habitat.”
The project decided to use “fine to medium sand with minimal fines.” It also concluded “Because of the very soft existing sediments and steep slopes at Ward Cove, the … material must be released slowly so that the settling velocity is low and bed impact minimized.”
That meant the EPA had to look for a source of quality sand that met its criteria. The sand was found at Construction Aggregates in Sechelt, BC, loaded onto 10,000-tonne deck barges, tugged up the coast, unloaded onto land using a conveyor, and stockpiled while more tests were done to determine how to deposit the sand on the sunken logs.
Sand was placed on a smaller barge and taken to the deposit site. Initial tests were done with a mechanical dredge equipped with a clamshell bucket. The operator deposited the sand in “swaths” released from the bucket. To make it work properly, the bucket, as supplied by the manufacturer, had to be modified by welding baffle plates to it and lengthening the chains to ensure consistent deposition of the sand. Two computers running WINOPS, a positioning and guidance software package designed for dredging operations, “provided the operator and deck engineer the precise locations of the derrick barge position” in order to ensure precise deposition of the sand. The WINOPS system made use of three differential global positioning receivers. One GPS receiver was located at the top of the derrick and provided the centre positioning of the dredge bucket. Two fixed receivers, one near the starboard centre spud and one near the centre aft, provided the barge position and heading.
Although using marine clay is likely to produce different engineering challenges at Clio Bay, it is not currently clear that the project has contemplated the level of precision that was used at Ward Cove.
While KM LNG must find a way to dispose of the marine clay from the Bish Cove excavation site, there is a silver lining for the Haisla Nation’s aim of restoring both Clio Bay and the other 50 sites in their traditional territory, since the Kitimat Sand Hill would likely be a ready resource for any future projects.
Monitoring
The EPA considered the project finished in September 2001, and long term monitoring began, with major updates every five years in 2004 and 2009.
An EPA report on the 2004 review showed that the three sand-capped areas and one shallow natural recovery area (not sand-capped) had achieved biological recovery; three other natural recovery areas tested had not achieved biological recovery but were making significant progress.
The 2004 studies showed that benthic (sea bottom) communities in uncapped areas showed “species commonly found in areas where organic enrichment is low or declining,” adding, “In three other natural recovery areas, benthic communities have not progressed as far toward recovery but are making significant progress.”
By the time of the 2009 update, most of the old industrial infrastructure on land at Ward Cove had been demolished and the land area was slated for redevelopment. Many of the companies that had been there had either gone out of business or had declared bankruptcy, and the land was taken over by the Ketchikan Gateway Borough, mostly through foreclosure.
The EPA declared that “The remedial action construction is complete, and the remedial action is an operating or ongoing remedial action.”
The 2009 report says the project was successful in eliminating sediment toxicity. The area was quickly being recolonized by diverse bottom-dwelling macroinvertebrate species, and those species were spreading beyond the specific study areas, so recovery of Ward Cove is expected to continue.
However, the 2004 report went on to say that “the achievement of stable benthic biological communities with balanced species composition in more than 75 percent of the area with documented coverage by wood residues on the bottom of Ward Cove” would happen within 40 years of the 2004 study.
The next review of Ward Cove is slated for August 2015.
Chevron, the partner with Apache in the KM LNG (also known as Kitimat LNG) project at Bish Cove, said Sunday that the company will hold an open house in Kitimat on the controversial Clio Bay reclamation project.
Chevron says there will be a public open house at Riverlodge Tuesday, October 8 from 4 pm to 8 pm.
In an e-mail to politicians and local groups, including Douglas Channel Watch, Marc Douglas, a senior advisor for Chevron, based in Calgary, invited local stakeholders for a series of one hour meetings the same day at the KM LNG offices in City Centre.
Chevron Canada invites you to a meeting to discuss the Clio Bay Marine Life Restoration Project.
This proposed project would see Chevron excavate marine clay from the Kitimat LNG construction site at Bish Cove and work closely with the Federal Department of Fisheries and Oceans to deposit this natural material in specific locations in Clio Bay. The clay will cap-off decaying wood debris left by historic log booming operations that has accumulated on the bottom of Clio Bay, damaging the Bay’s natural ecosystem. A key goal of the project is to restore natural marine life populations in Clio Bay. Come and share your thoughts and ideas with us and learn more about this innovative restoration project.
There has been growing controversy over the Clio Bay project in recent weeks. Members of the Haisla Nation and residents of Kitimat were initially told that, due to the large number of sunken logs, Clio Bay was deprived of oxygen, with limited sea life, and that capping the logs with clay from Bish Cove would restore the ecosystem. However, beginning with a discussion at District of Kitimat Council on September 3, more people have been challenging the idea that Clio Bay needs restoration, with fishers posting photographs of recent catches on Facebook pages.
On Sept.3, Councillor Phil Germuth told Council: “Those logs have actually created a woody reef, where like any other reef, an ecosystem is being sustained. So to say that those logs are suffocating the life out of Clio Bay doesn’t seem to have a lot of merit.”
At the time, Chevron told the media that they had consulted with the Department of Fisheries and Oceans and concluded that carefully placed clay would improve the ecosystem.