Ocean acidification a threat to the Dungeness crab: NOAA study

With climate change, the oceans are becoming more acidic, and that is a threat to the Dungeness crab, according to a study by the US National Oceanic and Atmospheric Administration.

The study says ocean acidification expected to accompany climate change may slow development and reduce survival of the larval stages of Dungeness crab.

The Dungeness crab is a key component of the Northwest marine ecosystem and vital to fishery revenue from Oregon to Alaska.

The research by NOAA Fisheries’ Northwest Fisheries Science Center in Seattle indicates that the declining pH anticipated in Puget Sound could jeopardize populations of Dungeness crab and put the fishery at risk. The study was recently published in the journal Marine Biology.

Ocean acidification occurs as the ocean absorbs carbon dioxide from the combustion of fossil fuels. Average ocean surface pH is expected to drop to about 7.8 off the West Coast by 2050, and could drop further during coastal upwelling periods.
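
Because the pH scale is logarithmic, that numerically small drop conceals a large change in ocean chemistry. Here is a minimal sketch of the arithmetic in Python; the 8.1 baseline is an assumption (a typical present-day surface value), while 7.8 is the 2050 projection cited above:

    # pH is the negative base-10 logarithm of hydrogen-ion concentration,
    # so a 0.3-unit drop roughly doubles acidity.
    def hydrogen_ion_concentration(ph: float) -> float:
        """Return [H+] in mol/L for a given pH."""
        return 10 ** -ph

    baseline = hydrogen_ion_concentration(8.1)   # assumed present-day pH
    projected = hydrogen_ion_concentration(7.8)  # projected 2050 pH
    print(f"Relative increase in [H+]: {projected / baseline:.2f}x")
    # Output: Relative increase in [H+]: 2.00x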

Survival of Dungeness crab larvae, called zoeae, declined at the lower pH levels expected with ocean acidification. (Jason Miller)

Dungeness crab is the highest-revenue fishery in Washington and Oregon, and the second most valuable in California, although the fishery was recently closed in some areas because of a harmful algal bloom. The Dungeness crab harvest in 2014 was worth more than $80 million in Washington, $48 million in Oregon and nearly $67 million in California.

“I have great faith in the resiliency of nature, but I am concerned,” said Jason Miller, lead author of the research, which was part of his dissertation. “Crab larvae in our research were three times more likely to die when exposed to a pH that can already be found in Puget Sound, our own back yard, today.”

Scientists collected eggs from Dungeness crabs in Puget Sound and placed them in tanks at the NWFSC’s Montlake Research Laboratory. The tanks held seawater with a range of pH levels reflecting current conditions as well as the lower pH occasionally encountered in Puget Sound when deep water wells up near the surface. Larvae also went into tanks with the even lower-pH conditions expected with ocean acidification.
“The question was whether the lower pH we can expect to see in Puget Sound interferes with development of the next generation of Dungeness crab,” said Paul McElhany, a NOAA Fisheries research scientist and senior author of the paper. “Clearly the answer is yes. Now the question is, how does that play out in terms of affecting their life cycle and populations overall?”

Larvae hatched at the same rate regardless of pH, but those at lower pH took longer to hatch and progressed through their larval stages more slowly. Researchers suggested that the lower pH may reduce the metabolic rate of embryos, which could extend their vulnerable larval period or jeopardize the timing of their development in relation to key food sources.

Larval survival also dropped by more than half at lower pH. At pH 8.0, roughly equivalent to seawater today, 58 percent of the crab larvae – called zoeae – survived for 45 days. At pH 7.5, which sometimes occurs in Puget Sound now, survival was 14 percent. At pH 7.1, which is expected to roughly approximate the pH of water upwelling on the West Coast with ocean acidification, zoeae survival remained low at 21 percent.
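
For readers who want to check those figures, a short Python sketch; the survival numbers come straight from the study as reported here, and the script is illustrative only:

    # Survival of zoeae after 45 days at each pH level, as reported above.
    survival = {8.0: 0.58, 7.5: 0.14, 7.1: 0.21}
    baseline = survival[8.0]
    for ph, s in sorted(survival.items(), reverse=True):
        print(f"pH {ph}: {s:.0%} survived, {1 - s:.0%} died, "
              f"{s / baseline:.0%} of the pH 8.0 baseline")
    # At pH 7.5, survival is roughly a quarter of the baseline --
    # the drop of "more than half" described above.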

“Areas of greatest vulnerability will likely be where deep waters, naturally low in pH, meet acidified surface waters,” such as areas of coastal upwelling along the West Coast and in estuary environments such as Hood Canal, the new study predicts.

 

Ocean acidification growing risk to west coast fishery, including crab and salmon, US studies show

Acidification of the oceans poses a growing risk to the northwest coast fishery, including crab and salmon, according to studies released by the US National Oceanic and Atmospheric Administration.

As more carbon dioxide is released into the atmosphere and absorbed by the oceans, the water is becoming more acidic, and that affects many species, especially shellfish, whose shells can begin to dissolve.

A NOAA study released today of environmental and economic risks to the Alaska fishery says:

Many of Alaska’s nutritionally and economically valuable marine fisheries are located in waters that are already experiencing ocean acidification, and will see more in the near future…. Communities in southeast and southwest Alaska face the highest risk from ocean acidification because they rely heavily on fisheries that are expected to be most affected by ocean acidification…

An earlier NOAA study, released in April, identified a long-term threat to the salmon fishery: small ocean snails called pteropods, a prime food source for pink salmon, are already being affected by the acidification of the ocean.

This photograph from NOAA of a pteropod, important in the ocean diet of pink salmon, shows the first evidence of marine snails from the natural environment along the U.S. West Coast with signs that shells are dissolving. (NOAA)

NOAA says:

The term “ocean acidification” describes the process of ocean water becoming more acidic as a result of absorbing nearly a third of the carbon dioxide released into the atmosphere from human sources. This change in ocean chemistry is affecting marine life, particularly the ability of shellfish, corals and small creatures in the early stages of the food chain to build skeletons or shells.

Today’s NOAA study is the first published research by the Synthesis of Arctic Research (SOAR) program, which is supported by a US inter-agency agreement between NOAA’s Office of Oceanic and Atmospheric Research and the Bureau of Ocean Energy Management (BOEM) Alaska Region.

Canada’s Department of Fisheries and Oceans says it has ongoing studies on ocean acidification, including the role of pteropods in the life cycle of salmon.

Des Nobles, president of Local #37 Fish [UFAWU-UNIFOR], told Northwest Coast Energy News that the fisheries union and other fisheries groups in Prince Rupert have asked both the Canadian federal and the BC provincial governments for action on ocean acidification. Nobles says so far those requests have been ignored.

Threat to crabs

The studies show that red king crab and tanner crab grow more slowly and don’t survive as well in more acidic waters. Alaska’s coastal waters are particularly vulnerable to ocean acidification because of cold water that can absorb more carbon dioxide and unique ocean circulation patterns which bring naturally acidic deep ocean waters to the surface.

“We went beyond the traditional approach of looking at dollars lost or species impacted; we know these fisheries are lifelines for native communities and what we’ve learned will help them adapt to a changing ocean environment,” said Jeremy Mathis, Ph.D., co-lead author of the study, an oceanographer at NOAA’s Pacific Marine Environmental Laboratory in Seattle, and the director of the University of Alaska Fairbanks School of Fisheries and Ocean Sciences Ocean Acidification Research Center.

As for Dungeness crab, Sarah Cooley, a co-author of the Alaska study who was with the Woods Hole Oceanographic Institution at the time, told Northwest Coast Energy News: “The studies have not been done for Dungeness crab that have been done for king and tanner crab; that’s something we’re keenly aware of. There’s a big knowledge gap at this point.” She says NOAA may soon be looking at a pilot study on Dungeness crab.

A healthy pteropod collected during the U.S. West Coast survey cruise. (NOAA)

Risk to Salmon, Mackerel and Herring

In a 2011-2013 survey, a NOAA-led research team found the first evidence “that acidity of continental shelf waters off the West Coast is dissolving the shells of tiny free-swimming marine snails, called pteropods, which provide food for pink salmon, mackerel and herring.”

The survey estimated that the percentage of pteropods along the west coast with dissolving shells due to ocean acidification had “doubled in the near shore habitat since the pre-industrial era and is on track to triple by 2050 when coastal waters become 70 percent more corrosive than in the pre-industrial era due to human-caused ocean acidification.”

That study documented the movement of corrosive waters onto the continental shelf from April to September, during the upwelling season, when winds bring water rich in carbon dioxide up from depths of about 120 to 180 metres to the surface.

“We haven’t done the extensive amount of studies yet on the young salmon fry,” Cooley said. “I would love to see those studies done. I think there is a real need for that information. Salmon are just so so important for the entire Pacific Northwest and up to Alaska.”

In Prince Rupert, Barb Faggetter, an independent oceanographer whose company Ocean Ecology has consulted for the fishermen’s union and NGOs, was not part of the studies but spoke generally about the threat of acidification to the region.

She is currently studying the impact of the proposed liquefied natural gas terminals that could be built at Prince Rupert near the Skeena River estuary. Faggetter said that acidification could affect the species eaten by juvenile salmon. “As young juveniles they eat a lot of zooplankton, including crustaceans and shellfish larvae.”

She added, “Any of the shellfish in the fishery, including probably things like sea urchins, are all organisms that are susceptible to ocean acidification because of the loss of their capacity to actually incorporate calcium carbonate into their shells.”

Faggetter said her studies have concentrated on potential habitat loss near Prince Rupert as a result of dredging and other activities for liquefied natural gas development. She added that ocean acidification “has been a consideration that climate change will further worsen any potential damage that we’re currently looking at.”

Her studies of the Skeena estuary are concentrating on “rating” areas based on the food supply available to juvenile salmon, as well as predation and what habitat is available and the quality of that habitat to identify areas that “are most important for the juvenile salmon coming out of the Skeena River estuary and which are less important.”

She said that climate change and ocean acidification could impact the Skeena estuary and “probably reduce some of the environments that are currently good because they have a good food supply. If ocean acidification reduces that food supply that will no longer be good habitat for them” [juvenile salmon].

Bongo nets are deployed up to 200 meters deep to catch marine snails (pteropods), which are indicators of the progress of ocean acidification. The pteropod samples were collected during the U.S. West Coast survey cruises in 2011 and 2013. Unlike the US, Canada’s DFO is using models to track what’s happening to pteropods. (NOAA)

The August 2011 NOAA survey of the pteropods was done at sea using “bongo nets” to retrieve the small snails at depths up to 200 metres. The research drew upon a West Coast survey by the NOAA Ocean Acidification Program that was conducted on board the R/V Wecoma, owned by the National Science Foundation and operated by Oregon State University.

The DFO study, according to the agency’s website, is “being examined in the context of model predictions.”

Nina Bednarsek, Ph.D., of NOAA’s Pacific Marine Environmental Laboratory in Seattle, the lead author of the April pteropod paper, said, “Our findings are the first evidence that a large fraction of the West Coast pteropod population is being affected by ocean acidification.

“Dissolving coastal pteropod shells point to the need to study how acidification may be affecting the larger marine ecosystem. These near shore waters provide essential habitat to a great diversity of marine species, including many economically important fish that support coastal economies and provide us with food.”

Ecology and economy

Today’s study on the effects of acidification on the Alaska fishery examined the potential effects on a state where the fishing industry supports over 100,000 jobs and generates more than $5 billion in annual revenue. Fishery-related tourism also brings in $300 million annually to the state.

A map of Alaska shows the economic and ecological risks to parts of the state from ocean acidification. (NOAA)

The study also shows that approximately 120,000 people, or roughly 17 percent of Alaskans, rely on subsistence fisheries for most, if not all, of their dietary protein. The Alaska subsistence fishery is open to all residents of the state who need it, although a majority of those who participate in the subsistence fishery are Alaska Natives. In that way it is somewhat parallel to Canada’s food, social and ceremonial fishery for First Nations.

“Ocean acidification is not just an ecological problem—it’s an economic problem,” said Steve Colt, Ph.D., co-author of the study and an economist at the University of Alaska Anchorage. “The people of coastal Alaska, who have always looked to the sea for sustenance and prosperity, will be most affected. But all Alaskans need to understand how and where ocean acidification threatens our marine resources so that we can work together to address the challenges and maintain healthy and productive coastal communities.”

The Alaska study recommends that residents and stakeholders in vulnerable regions prepare for environmental challenges and develop response strategies that incorporate community values and needs.

“This research allows planners to think creatively about ways to help coastal communities withstand environmental change,” said Cooley, who is now science outreach manager at Ocean Conservancy in Washington, D.C. “Adaptations can be tailored to address specific social and environmental weak points that exist in a community.

“This is really the first time that we’ve been able to go under the hood and really look at the factors that make a particular community in a borough or census area less or more vulnerable to changing conditions resulting from acidification. It gives us a lot of power so that we don’t just look at environmental issues but also look at the social story behind that risk.”

As for the southern part of the Alaska panhandle nearest British Columbia, Cooley said, “What we found is that there is a high relative risk compared to some of the other areas of Alaska, and that is because the communities there undertake a lot of subsistence fishing. There tend not to be a whole lot of commercial harvests in the fisheries there, but they are very, very important from a subsistence standpoint… And they’re tied to species that we expect to be on the front line of acidification, many of the clam species that are harvested in that area and some of the crab species.”

Long term effects

Libby Jewett, director of the NOAA Ocean Acidification Program and an author of the pteropod study, said, “Acidification of our oceans may impact marine ecosystems in a way that threatens the sustainability of the marine resources we depend on.

“Research on the progression and impacts of ocean acidification is vital to understanding the consequences of our burning of fossil fuels.”

“Acidification is happening now,” Cooley said. “We have not yet observed major declines in Alaskan harvested species. In Washington and Oregon they have seen widespread oyster mortality from acidification.

“We don’t have the documentation for what’s happening in Alaska right now, but there are a lot of studies starting up right now that will just keep an eye out for that sort of thing. Acidification is going to be continuing progressively over the next decades, into the future indefinitely, until we really curb carbon dioxide emissions. There’s enough momentum in the system that is going to keep acidification advancing for quite some time.

“What we need to be doing as we cut the carbon dioxide is to find ways to strengthen communities that depend on resources, and this study allows us to think differently about that and to really look at how we can strengthen those communities.”

Faggetter said, “It’s one more blow to an already complex situation here. My study has been working particularly on eel grass on Flora Bank, which is a very critical habitat, which is going to be impacted by these potential industrial developments, and that impact will affect our juvenile salmon and our salmon fishery very dramatically. That could be further worsened by ocean acidification.”

She said that acidification could also be a long-term threat to plans in Prince Rupert to establish a geoduck (pronounced “gooey-duck”) fishery.

The popular large clam, 15 to 20 centimetres across, is harvested in Washington State and southern BC, but so far hasn’t been subject to commercial fishing in the north.

NOAA said today’s study shows that by examining all the factors that contribute to risk, more opportunities can be found to prevent harm to human communities at a local level. Decision-makers can address socioeconomic factors that lower the ability of people and communities to adapt to environmental change, such as low incomes, poor nutrition, lack of educational attainment and lack of diverse employment opportunities.

NOAA’s Ocean Acidification Program and the state of Alaska are also developing tools to help industry adapt to increasing acidity.

The new NOAA study is the first published research by the Synthesis of Arctic Research (SOAR) program, which is supported by an inter-agency agreement between NOAA’s Office of Oceanic and Atmospheric Research and the Bureau of Ocean Energy Management (BOEM) Alaska Region.

The pteropod study was published in April in Proceedings of the Royal Society B. The ecological and economic study is published in Progress in Oceanography.

Harper’s Northern Gateway strategy and why it will end up in a muddy mess

It appears that Stephen Harper’s strategy for approving Northern Gateway has been revealed on background to The Globe and Mail’s Gary Mason. (Either it’s a revelation or a trial balloon.)

It comes down to the idea that Harper will approve Gateway “in the national interest,” count on a vote split between the NDP and Liberals in British Columbia to avoid any consequences to the Conservative majority and then leave it up to Enbridge to actually get the job of building the pipeline and terminal project done.

Mason quotes “a senior member of Mr. Harper’s government,” and while Mason doesn’t say what part of Canada the source is from (unlikely, in my view, that the source is from BC), what the member told Mason reveals that the Harper government is still mired in the Matrix-world that has always governed its policy on Northern Gateway.

The first step, apparently coming in the next few days, is that the Harper government will announce “rigorous” new tanker protocols for traffic along the west coast.

Tanker protocols
So the obvious question is, will these protocols be new, or will the government simply be re-announcing paper policies from March 2013? How many of the recommendations of the tanker task force is the government actually going to accept?

Even if the protocols are new, just who is going to enforce those policies?

Mason says:

Even if Gateway and the Kinder Morgan expansion went ahead, he argued, B.C. would still only see about 60 per cent of the annual oil tanker traffic the neighbouring state of Washington deals with. And yet Washington has an exceptionally clean record when it comes to the safe transport of oil in and out of its harbours – this, he noted, while operating under marine safety regulations that are not as rigorous as the ones Ottawa intends to put in place for the shipment of oil along the West Coast.

There are a lot of big problems with that statement.

First, there’s an organization Mason’s source may have heard of, known as the United States Coast Guard. The United States rigorously enforces its “weak” regulations, while Canada’s Coast Guard is plagued by staff shortages and budget cuts.

Second, the State of Washington also rigorously enforces its environmental regulations, not only on the coast but across the state. I have been told by retired British Columbia forestry and environmental officials (not to mention Fisheries and Oceans) that there are often more state environmental watchdogs in most Washington State counties than in all of northern British Columbia, where the Northern Gateway is supposed to be going.

The September 2013 report by the US National Oceanic and Atmospheric Administration on the export of Canadian bitumen sands through the US shows that the Washington Department of Ecology is working on strengthening regulations for both pipelines and (where it is in state jurisdiction) tanker traffic. The same report says the Alaska Department of Environmental Conservation is updating its plans and possible regulations in anticipation that bitumen-filled tanker traffic from Kitimat would come close to the coast en route to Asia.

Third, the coast of northern British Columbia is more rugged and stormy than the waters off Washington.

Who pays?

The one factor that the urban media seems to ignore is the big question.

Who pays?

Who pays to enforce the 209 conditions that the Joint Review Panel imposed on the Northern Gateway project?

If the Harper government announces new tanker regulations in the coming days, who pays to enforce those regulations?

There were no provisions in the February budget for enforcing the 209 conditions. Rather, there were continuing budget cuts to the very departments that the JRP ruled must be involved in the studying, planning, implementation and enforcement of the 209 conditions: Environment Canada, Fisheries and Oceans and Transport Canada.

So while Mason says “The federal government will play its part in meeting the five conditions laid out by the B.C. government for support of the project,” the response must be “Show me the money!”

During the recent plebiscite campaign, Northern Gateway finally revealed its plans for the “super tugs” that will escort tankers along the coast and up Douglas Channel. Owen McHugh, a Northern Gateway emergency manager, said, “Adding these four or five tugs to the north coast provides a rescue capability that doesn’t exist in this format. So for any large commercial vessel that is traveling on our coast, this capacity to protect the waters of the north coast.” Those tugs and Northern Gateway’s plans to station teams at small bases along the coast mean that the company is, in effect, creating a parallel, private coast guard on the BC Coast.

What about the Coast Guard itself? The Harper government has been gutting Coast Guard resources along the coast since even before it had its majority. It closed and dismantled the Kitsilano Coast Guard station in Vancouver. There is more dependence on the Royal Canadian Marine Search and Rescue volunteers, who have to raise money locally for modern rescue boats, which cost up to $750,000. The money the government was “generously” giving to RCMSAR had to be split among 70 stations in 42 communities along the coast, as well as its administrative and training staff.

And speaking of boats, what about Coast Guard vessels on the coast? As the Globe and Mail has reported, the government’s shipbuilding program is already over budget and behind schedule. The priorities are Arctic/Offshore Patrol Ships and new destroyers, and the crippling of HMCS Protecteur has raised concerns about the already troubled supply ship program.

Does anyone notice what is missing from that list? What’s missing are better Coast Guard vessels just to police all the expected tanker traffic on the west coast (whether LNG or bitumen), and any mention of dedicated spill response vessels, which under the “polluter pay” policy will likely be left to private contractors (in the hope that the ships are available at the time of a spill).

How will we know?

Then there is the question of how people will even know whether the 209 conditions are being enforced, or whether the reports demanded by the Joint Review Panel are going to be sitting on the National Energy Board server, ignored.

There is every indication, given the government’s obsession with secrecy, that until there is a disaster the Canadian public will never know what’s going on. Harper’s muzzling doesn’t just cover government scientists; it covers the lowest levels of the bureaucracy, as District of Kitimat Council found out when low-level DFO bureaucrats refused to appear publicly before council to discuss the risk to the Kitimat River.

So the scenario is, according to Mason’s source:

“I think once this decision is made, Enbridge could have shovels in the ground the next day,” the member said. “They are ready to go. This means the First Nations could start realizing profits from this right away, as opposed to the promised profits from LNG, which may never materialize. I think they need to think about that.”

First, one of the blunders the Conservatives have always made is the assumption that eventually the First Nations of British Columbia can be paid off, ignoring the commitment of the First Nations, especially on the coast, to protect the environment that has sustained them for thousands of years.

While the LNG market is volatile, the “member” forgets that most of the First Nations of British Columbia have opposed the Northern Gateway since Enbridge first floated the idea in 2001. The current LNG rush didn’t start until after Japan shut down its nuclear power plants following the March 2011 earthquake. The first major anti-Enbridge rally, “The Solidarity Gathering of Nations,” was held at Kitamaat Village in May 2010.

Writing off BC

It appears that Conservatives, in their election strategy have already written off Gateway opponents:

Still, there is a raw political calculus that needs to be taken into account. Polls measuring support for the project in B.C. vary, but generally have shown that anywhere from 55 to 60 per cent of the province opposes Gateway and 40 to 45 per cent support it. Isn’t that enough to scare off a government that needs critical votes in B.C. to win another majority?
“Let’s say 60 per cent are against it,” he said. “And that vote splits between the Liberals and the NDP come the next election. Who are the 40 per cent going to vote for?”

As for the cabinet, it has consistently shown its contempt for northwestern British Columbia and that is unlikely to change.

Mason also speculates that Harper will approve Gateway to stick it to Barack Obama over the delays on Keystone XL. As he points out, that’s a political, not an economic, decision.

There are civil disobedience classes being held across northwestern BC this month. Access to Information requests by the Vancouver Observer revealed increased RCMP surveillance of the anti-Gateway movement. There has always been talk of a “war in the woods” if the pipeline project is forced on an unwilling population.

So it comes down to a question that Mason and the Conservatives are avoiding. Mason’s source says Northern Gateway is crucial to the national interest:

“At the end of the day, you have to do what’s right, not what’s politically expedient,” he said. “You have to ask: What’s in the best interests of all Canadians?”

So, given all that, will the Harper government leave Enbridge to tough it out on its own?

Highly unlikely.

But will the Harper government, with its bean-counting obsession with balancing the budget, be willing to pay for all that is needed?

Highly unlikely.

There’s lots of marine clay along the pipeline route, laid down by ancient oceans. That brings to mind just one word: quagmire. Not just the wet, sticky BC mud, but a political quagmire.

Kitimat Votes: 25th anniversary of Exxon Valdez disaster looms over Northern Gateway plebiscite

On March 24, 1989, the tanker Exxon Valdez plowed into Bligh Reef in Alaska’s Prince William Sound, spilling 260,000 to 750,000 barrels, or 41,000 to 119,000 cubic metres, of crude oil.
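
A quick unit check on those figures, using the standard oil-barrel conversion (1 barrel = 42 US gallons, or about 0.159 cubic metres); this is just arithmetic on the numbers reported above:

    # Convert the reported spill range from barrels to cubic metres.
    BARREL_M3 = 0.158987  # one oil barrel in cubic metres
    for barrels in (260_000, 750_000):
        print(f"{barrels:,} bbl = {barrels * BARREL_M3:,.0f} m^3")
    # 260,000 bbl is about 41,000 m^3 and 750,000 bbl about 119,000 m^3,
    # matching the range reported above.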

That was 25 years ago. The media loves anniversary stories and the Exxon Valdez look-backs and updates are already ramping up—right in the middle of the Kitimat plebiscite on the Northern Gateway pipeline and terminal project.

The hashtag #ExxonValdez25 is beginning to trend, based on a Twitter chat for Monday sponsored by the US National Oceanic and Atmospheric Administration.

The voters of Kitimat who will have to cast their ballots on the Joint Review Panel’s interpretation of the Northern Gateway proposal will find once again that the JRP tilted toward the industry and downplayed the lingering risks from a major tanker disaster—and that means neither the pro nor the anti side can be happy with the events that will be marked on March 24, 2014.

The Exxon Valdez accident is part of the Joint Review Panel findings that the economic benefits of Northern Gateway outweigh the risks. The JRP generally accepted the industry position, taken by both Northern Gateway and by ExxonMobil that Prince William Sound has recovered from the Exxon Valdez incident, something that is fiercely debated and disputed.

One area that is not in dispute is that the Exxon Valdez disaster brought laws that forced energy companies to use double-hulled tankers. However, commercials indicating that Northern Gateway will be using double-hulled tankers because the company respects the BC coast push things a bit far, since those tankers are required by law.

Northern Gateway told the Joint Review Panel that

on a worldwide basis, all data sets show a steady reduction in the number and size of oil spills since the 1970s. This decline has been even more apparent since regulatory changes in 1990 following the Exxon Valdez oil spill, which required a phase-in of double-hulled tankers in the international fleet. No double-hulled tanker has sunk since 1990. There have been five incidents of double-hulled tankers that have had a collision or grounding that penetrated the cargo tanks. Resulting spills ranged from 700 to 2,500 tonnes.

The Haisla countered by saying:

The Haisla Nation said that, although there have been no major spills since the Exxon Valdez spill in Prince William Sound, there were 111 reported incidents involving tanker traffic in Prince William Sound between 1997 and 2007. The three most common types of incidents were equipment malfunctions, problems with propulsion, steering, or engine function, and very small spills from tankers at berth at the marine terminal. The Haisla Nation said that, in the absence of state-of-the-art prevention systems in Prince William Sound, any one of those incidents could have resulted in major vessel casualties or oil spills.

 

Related: What the Joint Review Panel said about the Exxon Valdez disaster

A local daily newspaper, the Anchorage Daily News, sums it all up:

The herring of Prince William Sound still have not recovered. Neither have killer whales, and legal issues remain unresolved a quarter of a century later. Monday is the 25th anniversary of the disaster, in which the tanker Exxon Valdez ran aground on Bligh Reef and spilled at least 11 million gallons of oil into the pristine waters of the sound.

Prince William Sound today looks spectacular, a stunning landscape of mountainous fjords, blue-green waters and thickly forested islands. Pick up a stone on a rocky beach, maybe dig a little, though, and it is possible to still find pockets of oil.

“I think the big surprise for all of us who have worked on this thing for the last 25 years has been the continued presence of relatively fresh oil,” said Gary Shigenaka, a marine biologist for the National Oceanic and Atmospheric Administration.

Britain’s Sunday Telegraph headlined: Exxon Valdez – 25 years after the Alaska oil spill, the court battle continues

The legal dispute over the spill is still ongoing, with the Telegraph’s Joanna Walters noting:

[S]tate senator Berta Gardner is pushing for Alaskan politicians to demand that the US government forces ExxonMobil Corporation to pay up a final $92 million (£57 million), in what has become the longest-running environmental court case in history. The money would primarily be spent on addressing the crippled herring numbers and the oiled beaches.
“There’s still damage from the spill. The oil on the beaches is toxic and hurting wildlife. We can’t just say we’ve done what we can and it’s all over – especially with drilling anticipated offshore in the Arctic Ocean – this is significant for Alaska and people around the world,” she told The Telegraph.

An ExxonMobil spokesman then gave The Telegraph the energy sector’s standard response:

Richard Keil, a senior media relations adviser at ExxonMobil, said: “The overwhelming consensus of peer-reviewed scientific papers is that Prince William Sound has recovered and the ecosystem is healthy and thriving.”
But federal scientists estimate that between 16,000 and 21,000 gallons of oil from the spill lingers on beaches in Prince William Sound and up to 450 miles away, some of it no more biodegraded than it was at the time of the disaster.

The Sunday Telegraph chronicles which species have recovered in Exxon Valdez: Animal populations in Prince William Sound, Alaska

Overall, the Exxon Valdez disaster was, as US National Public Radio reported, a spur to science. But NPR’s conclusion is the exact opposite of that from the Northern Gateway Joint Review Panel—at least when it comes to fish embryos.

Why The Exxon Valdez Spill Was A Eureka Moment For Science

Twenty-five years of research following the Exxon Valdez disaster has led to some startling conclusions about the persistent effects of spilled oil.
When the tanker leaked millions of gallons off the Alaskan coast, scientists predicted major environmental damage, but they expected those effects to be short lived. Instead, they’ve stretched out for many years.
What researchers learned as they puzzled through the reasons for the delayed recovery fundamentally changed the way scientists view oil spills. One of their most surprising discoveries was that long-lasting components of oil thought to be benign turned out to cause chronic damage to fish hearts when fish were exposed to tiny concentrations of the compounds as embryos.

(NPR also reports on the The Lingering Legacy Of The Exxon Valdez Oil Spill)

It seems that some species recovered better than others from the oil spill.

For example, the recovery of the sea otter population has received widespread media coverage, but with widely divergent points of view. The more conservative and pro-industry writers point to the recovery of the otter population, while environmental coverage stresses the quarter century it took for the otter population to rebound.

Scientific American online and other media outlets reported “25 Years after Exxon Valdez Spill, Sea Otters Recovered in Alaska’s Prince William Sound,” quoting a report from the U.S. Geological Survey that said the spill killed 40 percent of the 6,500 sea otters living in the sound, with more deaths in 1990 and 1991. USGS reported that the main sea otter population in the sound was 4,277 in 2013.

“Although recovery timelines varied widely among species, our work shows that recovery of species vulnerable to long-term effects of oil spills can take decades,” said the lead author of the study, Brenda Ballachey, a research biologist with the U.S. Geological Survey. “For sea otters, we began to see signs of recovery in the years leading up to 2009, two decades after the spill, and the most recent results from 2011 to 2013 are consistent with recovery.”

The Joint Review Panel generally accepted Northern Gateway’s and the energy industry’s evidence on the Exxon Valdez incident and concluded:

The Panel’s finding regarding ecosystem recovery following a large spill is based on extensive scientific evidence filed by many parties, including information on recovery of the environment from large past spill events such as the Exxon Valdez oil spill. The Panel notes that different parties sometimes referred to the same studies on environmental recovery after oil spills, and drew different conclusions.

In its consideration of natural recovery of the environment, the Panel focused on effects that are more readily measurable such as population level impacts, harvest levels, or established environmental quality criteria such as water and sediment quality criteria.

The Panel finds that the evidence indicates that ecosystems will recover over time after a spill and that the post-spill ecosystem will share functional attributes of the pre-spill one. Post-spill ecosystems may not be identical to pre-spill ecosystems. Certain ecosystem components may continue to show effects, and residual oil may remain in some locations. In certain unlikely circumstances, the Panel finds that a localized population or species could potentially be permanently affected by an oil spill.

Scientific studies after the Exxon Valdez spill indicated that the vast majority of species recovered following the spill and that functioning ecosystems, similar to those existing pre-spill, were established.

Species for which recovery is not fully apparent, such as Pacific herring, killer whales, and pigeon guillemots, appear to have been affected by other environmental factors or human influences not associated with the oil spill. Insufficient pre-spill baseline data on these species contributed to difficulties in determining the extent of spill effects.

Based on the evidence, the Panel finds that natural recovery of the aquatic environment after an oil spill is likely to be the primary recovery mechanism, particularly for marine spills. Both freshwater and marine ecosystem recovery is further mitigated where cleanup is possible, effective, and beneficial to the environment.

Natural processes that degrade oil would begin immediately following a spill. Although residual oil could remain buried in sediments for years, the Panel finds that toxicity associated with that oil would decline over time and would not cause widespread, long-term impacts.

The Panel finds that Northern Gateway’s commitment to use human interventions, including available spill response technologies, would mitigate spill impacts to ecosystems and assist in species recovery.

It is clear, however, from the local coverage in Alaska and from the attention of the world’s media that Prince William Sound has not fully recovered from the Exxon Valdez incident (it may yet, in who knows how many years). Anger and bitterness still remain among the residents of Alaska, especially since the court cases are dragging on after a quarter century.

Those are the kinds of issues that Kitimat residents will face when they vote in the plebiscite on April 12. Just who do the people of Kitimat believe: those who say the chances of a spill are remote and that the environment and the economy would quickly recover, or those who point to Prince William Sound a quarter century later? It probably depends on whether or not you consider 25 years quick. Twenty-five years is quick in geological time, but it is a third or a half of a human lifetime.

As for the residents of Kitamaat Village, and probably many people in Kitimat, Haisla Chief Counsellor Ellis Ross summed it up in a Facebook posting on Sunday:

If this happens in Kitamaat, all those campaigning for Enbridge will pack up and leave for another coastline to foul. Haisla don’t have much of a choice. We would have to stay and watch the court battles on who should pay what.

Ross is right. Whether it’s Prince William Sound or Douglas Channel, the people who live in the region are stuck with the mess while the big companies walk away and the lawyers get rich.

 

Anniversary stories (as of March 23, 20:00 PT)

Alaska Media

Valdez Star
First Associated Press story on Exxon Valdez Oil Spill reprinted

KTUU

Exxon Valdez Oil Spill 25th Anniversary: Alaskans Remember

Alaska Dispatch

Exxon Valdez oil lingers on Prince William Sound beaches; experts debate whether to clean it up

While Alaska’s Prince William Sound is safer, questions linger about preventing oil spills

Recalling the shock and sadness of Exxon Valdez spill 25 years ago

How the Exxon Valdez spill gave birth to modern oil spill prevention plans

Seward City News
25 years later Exxon Valdez memories still stink

Bristol Bay Times
Exxon lesson: Prevention, RCACs the key to avoiding future disaster

Anchorage Daily News
Red Light to Starboard: Recalling the Exxon Valdez Disaster

Exxon Valdez photogallery

25 years later, oil spilled from Exxon Valdez still clings to lives, Alaska habitat

 

World Media
Al Jazeera
The legacy of Exxon Valdez spill
The tanker ran aground 25 years, but the accident continues to harm the environment and human health

Vancouver Sun
Opinion: Oil spills — the 10 lessons we must learn

Reality check: Next incident would ruin coastal economy

Seattle Times

Promises broken by the Exxon Valdez oil spill, 25 years later

SFGate

25 years since the Exxon Valdez spill

CNN
After 25 years, Exxon Valdez oil spill hasn’t ended

How oil spills kill fish: new study points to cardiac arrest; possible implications for humans

Oil spills kill fish. That’s well known. Now scientists say they have found out why oil spills kill adult fish. The chemicals in the oil often trigger an irregular heartbeat and cardiac arrest.

A joint study by Stanford University and the US National Oceanic and Atmospheric Administration has found that crude oil interferes with fish heart cells. The toxic consequence is a slowed heart rate, reduced cardiac contractility and irregular heartbeats that can lead to cardiac arrest and sudden cardiac death.

The study was published Feb. 14, 2014 in the prestigious international journal Science and unveiled at the convention of the American Association for the Advancement of Science in Chicago.

The study is part of the ongoing Natural Resource Damage Assessment of the April 2010 Deepwater Horizon oil spill in the Gulf of Mexico.

Scientists have known for some time that crude oil is “cardiotoxic” to developing fish. Until now, the mechanisms underlying the harmful effects were unclear.

Exxon Valdez

Studies going back to the Exxon Valdez oil spill in Alaska in 1989 have shown that exposure to crude oil-derived chemicals disrupts cardiac function and impairs development in larval fishes. The studies have described a syndrome of embryonic heart failure, bradycardia (slow heartbeat), arrhythmias (irregular heartbeats) and edema in exposed fish embryos.

Studies of young fish began in the aftermath of the Deepwater Horizon spill in the Gulf of Mexico. The two science teams wanted to find out how oil specifically impacts heart cells.

Crude oil is a complex mixture of chemicals, some of which are known to be toxic to marine animals.

Past research focused on “polycyclic aromatic hydrocarbons” (PAHs), which can also be found in coal tar, creosote, air pollution and stormwater runoff from land. In the aftermath of an oil spill, the studies show PAHs can persist for many years in marine habitats and cause a variety of adverse environmental effects.

The scientists found that oil interferes with cardiac cell excitability, contraction and relaxation – vital processes for normal beat-to-beat contraction and pacing of the heart.

Low concentrations of crude

The study shows that very low concentrations of crude oil disrupt the specialized ion channel pores – where molecules flow in and out of the heart cells – that control heart rate and contraction in the cardiac muscle cell. This cyclical signalling pathway in cells throughout the heart is what propels blood out of the pump on every beat. The protein components of the signalling pathway are highly conserved in the hearts of most animals, including humans.

The researchers found that oil blocks the potassium channels distributed in heart cell membranes, increasing the time to restart the heart on every beat. This prolongs the normal cardiac action potential, and ultimately slows the heartbeat. The potassium ion channel impacted in the tuna is responsible for restarting the heart muscle cell contraction cycle after every beat, and is highly conserved throughout vertebrates, raising the possibility that animals as diverse as tuna, turtles and dolphins might be affected similarly by crude oil exposure. Oil also resulted in arrhythmias in some ventricular cells.
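
As a rough illustration of that mechanism, consider a toy single-compartment membrane model in Python. This is not the study’s model; every parameter below is invented for illustration. The point is only the trend: with less potassium conductance, the cell takes longer to repolarize after each beat.

    # Toy model of repolarization after a beat:
    #   dV/dt = -(g_k * (V - E_K) + g_leak * (V - E_leak)) / C_m
    # All parameters are hypothetical, chosen only to show the trend.
    def repolarization_time(g_k, g_leak=0.05, e_k=-90.0, e_leak=-80.0,
                            v_peak=30.0, v_rest=-80.0, c_m=1.0, dt=0.01):
        """Milliseconds for the membrane to fall from v_peak to near rest."""
        v, t = v_peak, 0.0
        while v > v_rest + 1.0:
            current = g_k * (v - e_k) + g_leak * (v - e_leak)
            v -= dt * current / c_m
            t += dt
        return t

    normal = repolarization_time(g_k=0.5)
    blocked = repolarization_time(g_k=0.25)  # half the K+ conductance lost
    print(f"Repolarization: {normal:.1f} ms normal, {blocked:.1f} ms blocked")
    # Less potassium conductance means a longer repolarization -- a
    # prolonged action potential and, ultimately, a slower heartbeat.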

“The ability of a heart cell to beat depends on its capacity to move essential ions like potassium and calcium into and out of the cells quickly,” said Barbara Block, a professor of marine sciences at Stanford. “We have discovered that crude oil interferes with this vital signalling process essential for our heart cells to function properly.”

Nat Scholz, leader of the Ecotoxicology Program at NOAA’s Northwest Fisheries Science Center in Seattle, said, “We’ve known from NOAA research over the past two decades that crude oil is toxic to the developing hearts of fish embryos and larvae, but haven’t understood precisely why.”

Long term problems in fish hearts

He added: “These new findings more clearly define petroleum-derived chemical threats to fish and other species in coastal and ocean habitats, with implications that extend beyond oil spills to other sources of pollution such as land-based urban stormwater runoff.”

The new study also calls attention to a previously underappreciated risk to wildlife and humans, particularly from exposure to cardioactive PAHs, which can also be present when there are high levels of air pollution.

“When we see these kinds of acute effects at the cardiac cell level,” Block said, “it is not surprising that chronic exposure to oil from spills such as the Deepwater Horizon can lead to long-term problems in fish hearts.”

The study used captive populations of bluefin and yellowfin tuna at the Tuna Research and Conservation Center, a collaborative facility operated by Stanford and the Monterey Bay Aquarium. That meant the research team was able to directly observe the effects of crude oil samples collected from the Gulf of Mexico on living fish heart cells.

“The protein ion channels we observe in the tuna heart cells are similar to what we would find in any vertebrate heart and provide evidence as to how petroleum products may be negatively impacting cardiac function in a wide variety of animals,” Block said. “This raises the possibility that exposure to environmental PAHs in many animals – including humans – could lead to cardiac arrhythmias and bradycardia, or slowing of the heart.”

Tuna spawning

The Deepwater Horizon disaster released over 4 million barrels of crude oil during the peak spawning time for the Atlantic bluefin tuna in the spring of 2010. Electronic tagging and fisheries catch data indicate that Atlantic bluefin spawn in the area where the Deepwater Horizon drilling rig collapsed, raising the possibility that eggs and larvae, which float near the surface waters, were exposed to oil.

An Atlantic bluefin tuna (© Gilbert Van Ryckevorsel/TAG A Giant/Courtesy Stanford University)

The spill occurred in the major spawning ground of the western Atlantic population of bluefin tuna in the Gulf of Mexico. The most recent stock assessment, conducted in 2012, estimated the spawning population of the bluefin tuna to be at only 36 percent of the 1970 baseline population. Additionally, many other pelagic fishes were also likely to have spawned in oiled habitats, including yellowfin tuna, blue marlin and swordfish.

Block and her team bathed isolated cardiac cells from the tuna in low-dose crude oil concentrations similar to what fish in early life stages may have encountered in the surface waters where they were spawned after the April 2010 oil spill in the Gulf of Mexico.

They measured the heart cells’ response to record how ions flowed into and out of the heart cells to identify the specific proteins in the excitation-contraction pathway that were affected by crude oil chemical components.

Fabien Brette, a research associate in Block’s lab and lead author on the study, said the scientists looked at the function of healthy heart cells in a laboratory dish and then used a microscope to measure how the cells responded when crude oil was introduced.

“The normal sequence and synchronous contraction of the heart requires rapid activation in a coordinated way of the heart cells,” Block said. “Like detectives, we dissected this process using laboratory physiological techniques to ask where oil was impacting this vital mechanism.”

Related: Oil spill caused “unexpected lethal impact” on herring, study shows

 

Methane leaks from natural gas industry 50 per cent higher than EPA estimates, study says

This shows EPA Greenhouse Gas Inventory leakage estimates. Below: This shows results from recent experimental studies. Studies either focus on specific industry segments, or use broad atmospheric data to estimate emissions from multiple segments or the entire industry. Studies have generally found either higher emissions than expected from EPA inventory methods, or found mixed results (some sources higher and others lower).
(Stanford University School of Earth Sciences)


A new study indicates that atmospheric emissions of methane, a critical greenhouse gas, mostly leaking from the natural gas industry, are likely 50 per cent higher than previously estimated by the US Environmental Protection Agency.

A study, “Methane Leakage from North American Natural Gas Systems,” published in the Feb. 14 issue of the international journal Science, synthesizes diverse findings from more than 200 studies ranging in scope from local gas processing plants to total emissions from the United States and Canada.

The scientists say this first thorough comparison of evidence for natural gas system leaks confirms that organizations including the EPA have underestimated U.S. methane emissions generally, as well as those from the natural gas industry specifically.

Natural gas consists predominantly of methane. Even small leaks from the natural gas system are important because methane is a potent greenhouse gas – about 30 times more potent than carbon dioxide.

“People who go out and actually measure methane pretty consistently find more emissions than we expect,” said the lead author of the new analysis, Adam Brandt, an assistant professor of energy resources engineering at Stanford University. “Atmospheric tests covering the entire country indicate emissions around 50 per cent more than EPA estimates,” said Brandt. “And that’s a moderate estimate.”

The standard approach to estimating total methane emissions is to multiply the amount of methane thought to be emitted by a particular kind of source, such as leaks at natural gas processing plants or belching cattle, by the number of sources of that type in a region or country. The products are then totalled to estimate all emissions.
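
A minimal sketch of that bottom-up bookkeeping, in Python. The emission factors and source counts below are made-up placeholders, not EPA figures; the CO2-equivalent line uses the roughly 30-times potency ratio cited above.

    # Bottom-up inventory: (emission factor per source) x (source count),
    # summed over source types. All numbers are hypothetical placeholders.
    emission_factors = {                 # tonnes CH4 per source per year
        "gas_processing_plant": 120.0,
        "cattle": 0.1,
    }
    source_counts = {
        "gas_processing_plant": 500,
        "cattle": 90_000_000,
    }
    total_ch4 = sum(emission_factors[k] * source_counts[k]
                    for k in emission_factors)
    GWP_CH4 = 30  # methane's potency relative to CO2, as cited above
    print(f"CH4: {total_ch4:,.0f} t/yr, about "
          f"{total_ch4 * GWP_CH4:,.0f} t CO2e/yr")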

The national natural gas infrastructure has a combination of intentional leaks, often for safety purposes, and unintentional emissions, like faulty valves and cracks in pipelines. In the United States, the emission rates of particular gas industry components – from wells to burner tips – were established by the EPA in the 1990s.

Since then, many studies have tested gas industry components to determine whether the EPA’s emission rates are accurate, and a majority of these have found the EPA’s rates too low. The new analysis does not try to attribute percentages of the excess emissions to natural gas, oil, coal, agriculture, landfills, etc., because emission rates for most sources are so uncertain.

Several other studies have used airplanes and towers to measure actual methane in the air, to test total estimated emissions. The new analysis, authored by researchers from seven universities, several national laboratories, US federal government bodies and other organizations, found that these atmospheric studies covering very large areas consistently indicate total U.S. methane emissions about 25 to 75 per cent higher than the EPA estimate.

Some of the difference is accounted for by the EPA’s focus on emissions caused by human activity. The EPA excludes natural methane sources like geologic seeps and wetlands, which atmospheric samples unavoidably include. The EPA likewise does not include some emissions caused by human activity, such as abandoned oil and gas wells, because the amounts of associated methane are unknown.

The new analysis finds that some recent studies showing very high methane emissions in regions with considerable natural gas infrastructure are not representative of the entire gas system. “If these studies were representative of even 25 percent of the natural gas industry, then that would account for almost all the excess methane noted in continental-scale studies,” said a co-author of the study, Eric Kort, an atmospheric science professor at the University of Michigan. “Observations have shown this to be unlikely.”

Top-down methods take air samples from aircraft or tall towers to measure gas concentrations remote from sources. Bottom-up methods take measurements directly at facilities. Top-down methods provide a more complete and unbiased assessment of emissions sources, and can detect emissions over broad areas. However, they lack specificity and face difficulty in assigning emissions to particular sources. Bottom-up methods provide direct, precise measurement of gas emissions rates. However, the high cost of sampling and the need for site access permission leads to small sample sizes and possible sampling bias.
(Stanford University School of Earth Sciences)

Natural gas as a replacement fuel

The scientists say that even though the gas system is almost certainly leakier than previously thought, generating electricity by burning gas rather than coal still reduces the total greenhouse effect over 100 years. Not only does burning coal release an enormous amount of carbon dioxide, but mining it also releases methane.

Perhaps surprisingly though, the analysis finds that powering trucks and buses with natural gas instead of diesel fuel probably makes the globe warmer, because diesel engines are relatively clean. For natural gas to beat diesel, the gas industry would have to be less leaky than the EPA’s current estimate, which the new analysis also finds quite improbable.

“Fueling trucks and buses with natural gas may help local air quality and reduce oil imports, but it is not likely to reduce greenhouse gas emissions. Even running passenger cars on natural gas instead of gasoline is probably on the borderline in terms of climate,” Brandt said.
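
To see why the leak rate matters so much in the gas-versus-diesel comparison, here is a hedged back-of-envelope sketch. The combustion and energy-content factors are round illustrative numbers, not the study’s values; only the structure of the calculation is the point.

    # Back-of-envelope: kg CO2-equivalent per MJ of fuel, counting
    # combustion CO2 plus GWP-weighted leaked methane. Illustrative only.
    CO2_GAS = 0.055     # kg CO2/MJ for natural gas combustion (assumed)
    CO2_DIESEL = 0.074  # kg CO2/MJ for diesel combustion (assumed)
    CH4_PER_MJ = 0.020  # kg of methane holding 1 MJ (about 50 MJ/kg)
    GWP = 30            # methane potency vs CO2, as cited in this story

    def gas_co2e(leak_fraction):
        """kg CO2e per MJ delivered, for a given upstream leak fraction."""
        return CO2_GAS + leak_fraction * CH4_PER_MJ * GWP

    for leak in (0.01, 0.03, 0.05):
        print(f"gas at {leak:.0%} leakage: {gas_co2e(leak):.3f} kg CO2e/MJ "
              f"(diesel: {CO2_DIESEL:.3f})")
    # Break-even leak rate where gas loses its advantage over diesel:
    breakeven = (CO2_DIESEL - CO2_GAS) / (CH4_PER_MJ * GWP)
    print(f"Break-even leakage: {breakeven:.1%}")  # about 3% here
    # A fuller comparison would also account for engine efficiency, one
    # reason diesel does relatively well; that pushes the threshold lower.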

The natural gas industry, the analysis finds, must clean up its leaks to really deliver on its promise of less harm. Fortunately for gas companies, a few leaks in the gas system probably account for much of the problem and could be repaired. One earlier study examined about 75,000 components at processing plants. It found some 1,600 unintentional leaks, but just 50 faulty components were behind 60 percent of the leaked gas.

“Reducing easily avoidable methane leaks from the natural gas system is important for domestic energy security,” said Robert Harriss, a methane researcher at the Environmental Defense Fund and a co-author of the analysis. “As Americans, none of us should be content to stand idly by and let this important resource be wasted through fugitive emissions and unnecessary venting.”

Gas companies not cooperating

One possible reason leaks in the gas industry have been underestimated is that emission rates for wells and processing plants were based on operators participating voluntarily. One EPA study asked 30 gas companies to cooperate, but only six allowed the EPA on site.

“It’s impossible to take direct measurements of emissions from sources without site access,” said Garvin Heath, a senior scientist with the National Renewable Energy Laboratory and a co-author of the new analysis. “But self-selection bias may be contributing to why inventories suggest emission levels that are systematically lower than what we sense in the atmosphere.”

The research was funded by the nonprofit organization Novim through a grant from the Cynthia and George Mitchell Foundation. “We asked Novim to examine 20 years of methane studies to explain the wide variation in existing estimates,” said Marilu Hastings, sustainability program director at the Cynthia and George Mitchell Foundation. “Hopefully this will help resolve the ongoing methane debate.”

Other co-authors of the Science study are Francis O’Sullivan of the MIT Energy Initiative; Gabrielle Pétron of the National Oceanic and Atmospheric Administration (NOAA) and the University of Colorado; Sarah M. Jordaan of the University of Calgary; Pieter Tans, NOAA; Jennifer Wilcox, Stanford; Avi Gopstein of the U.S. Department of State; Doug Arent of the National Renewable Energy Laboratory and the Joint Institute for Strategic Energy Analysis; Steven Wofsy of Harvard University; Nancy Brown of the Lawrence Berkeley National Laboratory; independent consultant Richard Bradley; and Galen Stucky and Douglas Eardley, both of the University of California-Santa Barbara. The views expressed in the study are those of the authors, and do not necessarily reflect those of the U.S. Department of State or the U.S. government.

 

Haisla response lists evidence rejected by Northern Gateway Joint Review

Members of the Northern Gateway Joint Review Panel, left to right, Kenneth Bateman, chair Sheila Leggett and Hans Matthews make notes at the June 25, 2012 hearings at the Haisla Recreation Centre, Kitamaat Village. A map of Douglas Channel can be seen behind the panel. (Robin Rowland/Northwest Coast Energy News)

The Haisla Nation, in its response to the Crown on the Northern Gateway Joint Review Panel, details four studies, three Canadian and one American, that were released after the Joint Review evidentiary deadline had passed. The Haisla say this evidence should be considered in any decision on the Northern Gateway pipeline, terminal and tanker project. (The American report, from the National Oceanic and Atmospheric Administration, was released after the JRP final report.)

JRP chair Sheila Leggett's constant citing of rules of procedure and her stubborn refusal to consider new evidence and studies in a rapidly changing situation was one of the reasons many people in the northwest said the JRP had lost credibility.

The Haisla say: "It is incumbent upon Canada to consider and discuss the information in these reports as part of a meaningful consultation process…" and then list the "key findings" that have potential impacts on aboriginal rights and title:

The West Coast Spill Response Study, prepared for the government of British Columbia, which found:

  • Most oil spilled into the marine environment cannot be cleaned up
  • There is a disconnect between planning and actual response capability
  • Canada’s spill response is “far from world class.”

The Transport Canada Ship Oil Spill Preparedness and Response study:

  • Douglas Channel will go from low risk to high risk for spills if the project goes ahead
  • The study recommends preparation for a “true worst case discharge” rather than “the credible worst case discharge” as proposed by Northern Gateway
  • Canada needs a much more rigorous regulatory regime covering tankers.

The joint federal government technical report on the properties of bitumen from the Canadian Oil Sands:

  • There are uncertainties on how diluted bitumen would behave in a marine environment.
  • Northern Gateway did not provide adequate information about sediment levels to allow for proper study of interaction with diluted bitumen
  • Dispersant may not be effective.
  • Weathered diluted bitumen would "reach densities at which it will sink [in] freshwater without mechanical or physical assistance."

The US National Oceanic And Atmospheric Administration report on Transporting Alberta Oil sands:

  • Diluted bitumen has "significant differences from conventional crudes." (The JRP used conventional crude as a benchmark in its findings)
  • The physical properties of diluted bitumen "fluctuate based on a number of factors."
  • Pipeline operators may not have detailed information related to products in the pipeline at the time of a spill
  • There is a lack of experimental data on the weathering behaviour of oil sands product which limits the ability of spill response organizations “to understand and predict the behaviour and fate of oil sands products in freshwater, estuarine and saltwater environments.”

 
Related

Ottawa’s Northern Gateway consultation with First Nations limited to three simple questions and 45 days: documents

Haisla ask cabinet to postpone Northern Gateway decision to allow for adequate consultation with First Nations

Haisla consultation reply outlines flaws in Northern Gateway Joint Review report

 

The tsunami, Twitter and the Zones: Did social media amplify government generated confusion?

Kitimat, BC and New York City had one thing in common this week: the use and misuse of social media, Twitter and Facebook, which spread both accurate warnings and dangerous misinformation about an impending disaster. In the case of New York and the surrounding area, it was Superstorm Sandy that caused widespread devastation. For Kitimat it was the tsunami warning after the 7.7 earthquake off Haida Gwaii, which brought no damage but a lot of worry for residents.

New York has a population of millions, it is the media centre for the United States, and much of the U.S. Northeast coast is still recovering from the horrendous damage from Superstorm Sandy.

Kitimat has a population of about 8,000, and my home town is off the media radar except when the Enbridge Northern Gateway pipeline issue pops up on the national assignment desks. If the October 27, 2012 tsunami from the Haida Gwaii earthquake did come up Douglas Channel to Kitimat harbour, it was so minimal that any water rise was scarcely noticed.

In one way New York (the state and the city) plus New Jersey and other states were ahead of Kitimat. In the US, there were numerous official sources on Twitter and Facebook, as well as those ubiquitous live TV news conferences with New York Mayor Michael Bloomberg or various state governors.

On October 27, neither Kitimat nor the nearby town of Terrace had any official emergency outlets on social media. In Kitimat, that may change as early as this Monday when District Council considers what happened last Saturday night.

It has been documented that there was no official response from Emergency Management British Columbia (still largely known under its former name, the Provincial Emergency Program) until an hour after the first earthquake report from the US Geological Survey. Only sometime later did BC's provincial emergency officials hold a short conference call with reporters. (At the time, the BC Liberals were holding a policy convention at Whistler. After the conference call, TV reporters at the convention were doing live reports with taped clips of Attorney General Shirley Bond. It should have been easy for Bond and other senior government officials, including Premier Christy Clark, who is plummeting in the polls, to hold a live news conference just as US state governors and mayors did later in the week during Superstorm Sandy.)

So in that hour of silence from the BC government, one question has to be raised: were the tsunami warnings so completely uncoordinated, at least as far as the public was concerned, that they became one cause of the misinformation and inaccurate information on Twitter and Facebook? Or did confusing information from authorities simply compound and amplify the social media misinformation that was already spreading across British Columbia and around the world?

Here in the northwest, the two area fire chiefs, Trent Bossence of Kitimat and John Klie of Terrace, said after the quake that landline phones and some cell phones were out, in some areas for up to an hour after the first shock. Klie told CFTK's Tyler Noble on Open Connection that after the landline phones came back up, the Terrace fire department was flooded with calls from people "who wanted it now." The ability of firefighters to get information was then delayed "because so many people were trying to get through."

Kitimat has the advantage of being a small town. Emergency services had already scheduled a volunteer recruiting session last Monday night (October 29) for Emergency Social Services, the folks who run, coordinate and work in reception centres during an emergency, so it was easy to turn that meeting into an earthquake/tsunami warning post mortem. (Imagine that happening in New York.)

The most important issue on Saturday night was the false information on both Facebook and Twitter that the Kildala neighbourhood was being evacuated due to the tsunami warning. Other false information on social media indicated that the giant Bechtel work camp at the Rio Tinto Alcan Kitimat Modernization Project was also being evacuated.

As Kitimat’s Emergency Plan Coordinator Bob McLeod told the earthquake post mortem about the information on Facebook and Twitter:

Kitimat Emergency Coordinator Bob McLeod at the earthquake postmortem Oct. 29, 2012 (Robin Rowland/Northwest Coast Energy News)

"Your aim is to be saving people, and you're not saving people. There was one case where someone was going around banging on doors in Kildala, telling them to get out. I think it was over when he was in the lockup that night. But this is the type of foolishness that goes on. You have people going on Facebook saying 'Alcan's been evacuated. They're evacuating Kildala.' I am going to be generous and say it is misinformation… It was a blatant lie. And that does not help."

 

 

(For those outside Kitimat, you can check the town on Google Maps.) As seen on this screen grab, Kildala is a low-lying part of town. The area north of Highway 37 is higher on a hill. Closer to the ocean at Douglas Channel are the Bechtel/RTA Kitimat Modernization Project work camps.

Map of Kitimat

Walter McFarlane of the Kitimat Daily recounted his experiences at the post mortem. (We were both at a Haisla dinner at Kitamaat Village when the quake struck. See my earlier story here and McFarlane's Kitimat Daily story here.)

After driving from the village to the town, McFarlane told the meeting that he stopped at the town viewpoint where "people were telling me they had already been evacuated out of the Kildala neighbourhood, so my first stop after that was the fire department." The fire hall is a couple of blocks from the viewpoint, so it was easy to get accurate information from the fire department.

McFarlane continued, "I found the night of the earthquake that no information is just as bad as wrong information. People were calling me on my cell saying, 'Why does the Kitimat Daily say we have to evacuate?'" That was because the Daily republished a warning from the Pacific Tsunami Warning Centre that "said tsunami warning, evacuation for the north coast. People were saying, we're on the north coast, we've got to go."

I was about fifteen to twenty minutes behind McFarlane in reaching town. (I did not leave Kitamaat Village until after we heard the first tsunami warning.) As soon as I got back into cell range, my cell phone started to beep with saved messages from my TV and radio news clients calling for information. When I got to my home office, my landline was still dead and would be for about another twenty minutes. The only sources of information at that point were Google News, Facebook and Twitter.

I saw the initial, and as it turned out general, warning from the Pacific Tsunami Warning Center. Soon I was also getting what I hoped was more specific information on my marine radio from the Canadian Coast Guard Prince Rupert communications station.

But that, too, was somewhat confusing. The Coast Guard advisory mentioned various zones, for example Zone A and Zone B, but there was little specific context, and at that point I had no idea what Zone A meant. Prince Rupert Coast Guard Radio then went on to say evacuate low-lying coastal areas. (Transcript below.)

With that confusion, and mindful of "when in doubt, leave it out," I did not mention the zone system in any information I posted on Facebook and Twitter that night. I only retweeted official information or tweets from reporters I knew and trusted. (I did not see any tweeted official information from the province with a link to the page that identifies the official tsunami zones.)

From the interview on CFTK, it appears that both the Kitimat and Terrace fire departments were also getting inadequate information.

"We went to our normal place to look, EM BC (Emergency Management BC), and there was nothing there, so we went to Plan B to get information and went on from there," Bossence told Tyler Noble.

Klie said: "We struggle with that every disaster, big or small. Social media, I think emergency organizations are trying to tap into more and more. Up north we may be a little behind the eight ball, but sure enough Twitter and Facebook information is out there instantly. Looking at Facebook with my son, I saw that they were evacuating whole cities and I knew that was not true. Because of my experience I can filter some of the information, but there is so much information out there that it's hard to filter what's real and not real. It's an area emergency coordinators have to get into because it's the fastest way of getting information out."

"Once the phone system came back online at the Fire Hall we got a flood of phone calls," Bossence told CFTK. "It was nonstop and it was people wanting to know: 'What's going on? What are we going to do? Are we leaving?' And they're giving us 'This is what I'm reading, this is what I'm being texted, on Facebook they're saying we're supposed to evacuate.' Adding to that, we had an individual going around claiming he was from the fire department, going door to door and telling people to evacuate. That was the added issue we had to deal with. It was definitely misinformation, and the sense of urgency that was coming out through the social network (and eventually the media) was a big problem for us."

In Kitimat, I was told about the man going door to door with inaccurate information and as soon as I confirmed it with reliable official sources, I posted that on both Twitter and Facebook, emphasizing there was, at that time, no evacuation order.

But every situation is different. In contrast, during Superstorm Sandy, another story about men going door to door in Williamsburg, a section of Brooklyn, was not true, as can be seen in an article summing up problems with Twitter in New York, where Jared Keller of Bloomberg reported:

I experienced this firsthand during Hurricane Sandy. After retweeting a message warning about muggers in Williamsburg dressed as Con Ed workers as an experiment, I received two sceptical responses checking the claim within 15 minutes, both from people who work in the media industry and spend a significant amount of time on Twitter. Within an hour, I received a mass text message from friends of mine who aren’t completely plugged into the social Web with the same warning: “I just read a news alert of two separate reports of people posing as coned workers, knocking on people’s door and robbing them at gunpoint in Williamsburg. I just want to pass along the info. Stay safe and maybe don’t answer your door.” Two other friends responded with thanks.

Keller goes on to say: "'I know a lot of people, especially on Facebook, who end up believing whatever they see first,' says Kate Gardiner, a social media journalist. 'It's almost impossible to track something back to its point of origin there.'"

You can read Keller's complete article, How Truth and Lies Spread on Twitter, here.

See also How to Tweet Responsibly During a Breaking-News Event by Garance Franke-Ruta, a senior editor at The Atlantic.

With the earthquake and tsunami warning Saturday night, Twitter misinformation spread internationally. The first hashtag I saw was #bcquake, but as the tsunami warning gained traction (especially after the warning was extended from BC and Alaska to Washington, Oregon and California and then to Hawaii) the more common hashtag #tsunami became prominent. As people outside BC began tweeting, they began using #Canadaquake, and soon #prayforcanada also began to trend. Completely inaccurate information spread on #prayforcanada (believed to have originated in Indonesia) that it was Vancouver, not the north coast, that had been hit by the 7.7 magnitude earthquake.

Are you in the Zone?

At this point, one question has to be asked. The spread of misinformation, first the well-intended but wrong, second mere rumour, and third the deliberately misleading, has been seen on social media not only during the earthquake and tsunami warning on the West Coast last weekend and during Superstorm Sandy on the East Coast, but all the way back to the 2004 Christmas tsunami in Southeast Asia.

For the west coast in 2012, however, how much of the problem of misinformation on social media during the earthquake and tsunami warning was the fault of confusing information from the authorities? Just how were people supposed to interpret such general terms as "north coast" and "low-lying areas"?

From the BC Provincial Emergency Program warnings, you have to ask: what is Zone A? It turns out, checking a day or so later, that the province of British Columbia has created Tsunami Identification Zones.

Emergency Management Tsunami Zones
Before October 27, it is likely no one outside the provincial bureaucracy had ever heard of the provincial tsunami zones. At the time, no one in BC, either on Twitter or Facebook or through the media, was identifying the BC tsunami zones for the public. Later on, the television networks put up maps showing Zones A and B, but that was only good if you had power and were watching the right channel. Kitimat Daily and Terrace Daily posted an official update at 10:42, long after the danger was past, explaining the zone system. It was no good at all if you were listening to news reports on radio or to Prince Rupert Coast Guard Radio on a fishing boat and had no access to the actual maps.

Compounding the confusion is that the US system appears to be very different from the Canadian one.

The US system also has two levels of warning. The Pacific Tsunami Warning Center sends out general warnings, but hands over to the Alaska-based West Coast and Alaska Tsunami Warning Center for a more specific warning map. That centre uses its own system of lettered and numbered zones for the west coast of North America. (See the Oct. 27 tsunami advisory here. Note it is a Google Maps plugin.)

 

Alaska BC tsunami warning map
Possibly adding to uncertainty for those who sail the coast of British Columbia is that usually, when the Canadian Coast Guard talks about zones on marine radio, it is talking about the fishing zones defined by the Department of Fisheries and Oceans, which are numbered, not lettered.

 

DFO Management areas
Fisheries management zones as defined by the Department of Fisheries and Oceans (DFO)

 

So in case of a tsunami warning, Kitimat is in Zone B for the province of British Columbia and the Provincial Emergency Program, and in Zone BZ921 for the West Coast and Alaska Tsunami Warning Centre. For the much more familiar fisheries management areas, Kitimat is in Zone 6 (which of course has nothing to do with tsunamis; it is simply the coastal zone system everyone is familiar with).
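Laid out as data, the overlap is easier to see. This is only an illustrative sketch; Kitimat's three designations are taken from the paragraph above, and the structure itself is invented:

```python
# One community, three unrelated "zone" labels, as described above.
# A warning that says only "Zone A" or "Zone B" is ambiguous unless it
# also says whose zone system it is using.

KITIMAT_ZONES = {
    "BC Provincial Emergency Program (tsunami)": "Zone B",
    "West Coast and Alaska Tsunami Warning Centre": "Zone BZ921",
    "DFO fisheries management (nothing to do with tsunamis)": "Zone 6",
}

for system, zone in KITIMAT_ZONES.items():
    print(f"Kitimat is in {zone} under the {system}")
```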

Tsunami warning map
Adding to the confusion is the fact that the EM British Columbia map shows Terrace, far inland up the Skeena River, in Zone A along with Prince Rupert for tsunami warnings (if a tsunami were big enough to reach Terrace along the Skeena River valley, I can only assume that much of the west coast of North America would already have been wiped out).

Tsunami Zone A

 

The Monday post mortem

Warning brochures
At the Monday, October 29 post mortem, when McLeod outlined the events of October 27, he began by looking back three weeks, saying, "I have a feeling of frustration about a couple of things. On October 7, I took 4,000 brochures [How Prepared Are You if Disaster Strikes?] down to the post office to mail out to the residents of Kitimat. They were all delivered by the post office. On Sunday, I had people coming to me and saying, 'What are we supposed to do in the case of an earthquake?' It is really, really difficult to get people interested."

McLeod said that after he felt the earthquake, he went online to check information and then went up to the fire hall, which is Kitimat’s emergency coordination centre. There he met Fire Chief Bossence, his deputy, the RCMP detachment commander Staff Sergeant Steve Corp and representatives from Bechtel and the Rio Tinto Alcan modernization project.

"For the first little while we were going online trying to get information. The usual method of dissemination is that information comes from the West Coast and Alaska tsunami warning system, then it goes to Victoria; Victoria gives it to the geophysical specialists and they will confirm or deny whatever the information is, and then it goes to the Provincial Emergency Program and they shoot it out to coastal communities.

“While in this case you’re working with what you find out from different sources and you are trying to determine how reliable these sources are.”

"In our case, for me the first thing you do when you get word of an impending tidal wave [tsunami] is check the tide. If you're on a high tide, it's a different situation than a low tide.

“The movie version of a tidal wave is this 50 foot mountain of water roaring along and this is not what is going to happen particularly in Douglas Channel because of the depth. So you are going to see a surge such as we saw in Japan and it will be an increasing surge of water.

"We were told that potentially some sort of surge would hit Langara [the northernmost island in Haida Gwaii] at 9:16. 9:16 came and went and there was no notification of a noticeable surge of water. So we were down to a non-event, and we were on a receding tide." (See advisory below.)

"Misinformation going out is not helpful," McLeod said. "You've got to set up a stream of how you get information out to people, and it's a valid point. The District website, the Facebook page, something like that can get information out. But again, if you lose power, where do you get it? Text can work even locally with cell phones. If you're in a dead area with a cell phone, you can still get text."

McLeod then asked the audience, mainly people ranging from their thirties to their seventies, if they text. Only four or five people put up their hands. "You people are going to be saved, the rest of us…" McLeod quipped.

If a conclusion can be drawn from the earthquake and tsunami warning in the Kitimat region on October 27, it is not just that in an emergency inaccurate, incomplete or malicious information can spread at the speed of light on social media; it is worse: incomplete, inadequate and confusing information from the authorities is amplified and distorted by rapid posting on social media. That concept is not new to anyone who has played the telephone game, where the outcome is often completely different from the start.

If Gardiner is correct when she says "I know a lot of people, especially on Facebook, who end up believing whatever they see first," the BC government delays made everything worse. People tweeted the first thing they saw, and the first thing people saw came from multiple and often conflicting sources. Add to that the tweets that were exaggeration, rumour and lies.

The problem in 2012 is that it is not one person talking to one person talking to one person; it is a tweet or Facebook posting that goes out to thousands or millions of people, and that is a lot more dangerous.

McLeod told the post mortem that emergency services is trying to get more information out to the public, but he added: "The unfortunate part is that if you publish it this week, by Christmas no one will remember. If you start throwing it out every week, it becomes like a stop sign at the end of the street. Nobody sees it."

(Coming next: If Kitimat had to evacuate)

Transcript of Prince Rupert Coast Guard Radio tsunami warning.

Pan pan. Pan pan. This is Prince Rupert Coast Guard Radio, Prince Rupert Coast Guard Radio. Warning for coastal British Columbia issued by Environment Canada on behalf of the British Columbia Provincial Emergency Program at 2057 Pacific Daylight Time, Saturday 27 October. Tsunami warning for Zone A, the north coast and Haida Gwaii, Zone B, the central coast and including Bella Coola, Bella Bella and (unintelligible). A tsunami warning has been issued. If you are in a low-lying coastal area, you are at risk and must move to higher ground or inland now.
Do not return until directed to do so. Closely monitor local radio stations for additional information from local authorities. Please minimize phone use in affected areas. For further information contact the provincial emergency program at website www. papa echo papa period bravo charlie period charlie alpha. Prince Rupert Coast Guard Radio, over.

General warning from the Pacific Tsunami Warning Centre

000
WEPA42 PHEB 280341
TIBPAC

TSUNAMI BULLETIN NUMBER 003
PACIFIC TSUNAMI WARNING CENTER/NOAA/NWS
ISSUED AT 0341Z 28 OCT 2012

THIS BULLETIN APPLIES TO AREAS WITHIN AND BORDERING THE PACIFIC
OCEAN AND ADJACENT SEAS…EXCEPT ALASKA…BRITISH COLUMBIA…
WASHINGTON…OREGON AND CALIFORNIA.

… TSUNAMI INFORMATION BULLETIN …

THIS BULLETIN IS FOR INFORMATION ONLY.

THIS BULLETIN IS ISSUED AS ADVICE TO GOVERNMENT AGENCIES. ONLY
NATIONAL AND LOCAL GOVERNMENT AGENCIES HAVE THE AUTHORITY TO MAKE
DECISIONS REGARDING THE OFFICIAL STATE OF ALERT IN THEIR AREA AND
ANY ACTIONS TO BE TAKEN IN RESPONSE.

AN EARTHQUAKE HAS OCCURRED WITH THESE PRELIMINARY PARAMETERS

ORIGIN TIME – 0304Z 28 OCT 2012
COORDINATES – 52.9 NORTH 131.9 WEST
DEPTH – 10 KM
LOCATION – QUEEN CHARLOTTE ISLANDS REGION
MAGNITUDE – 7.7

EVALUATION

NO DESTRUCTIVE WIDESPREAD TSUNAMI THREAT EXISTS BASED ON
HISTORICAL EARTHQUAKE AND TSUNAMI DATA.

HOWEVER – THE WEST COAST/ALASKA TSUNAMI WARNING CENTER HAS
ISSUED A REGIONAL WARNING FOR COASTS LOCATED NEAR THE EARTHQUAKE.
THIS CENTER WILL CONTINUE TO MONITOR THE SITUATION BUT DOES NOT
EXPECT A WIDER THREAT TO OCCUR.

THIS WILL BE THE ONLY BULLETIN ISSUED FOR THIS EVENT UNLESS
ADDITIONAL INFORMATION BECOMES AVAILABLE.

THE WEST COAST/ALASKA TSUNAMI WARNING CENTER WILL ISSUE PRODUCTS
FOR ALASKA…BRITISH COLUMBIA…WASHINGTON…OREGON…CALIFORNIA.

A more specific warning from the West Coast/Alaska Tsunami Warning Centre

 

WEAK51 PAAQ 280334
TSUAK1

BULLETIN
PUBLIC TSUNAMI MESSAGE NUMBER 2
NWS WEST COAST/ALASKA TSUNAMI WARNING CENTER PALMER AK
834 PM PDT SAT OCT 27 2012

THE MAGNITUDE IS UPDATED TO 7.7. THE WARNING ZONE REMAINS THE
SAME.

…THE TSUNAMI WARNING CONTINUES IN EFFECT FOR THE COASTAL
AREAS OF BRITISH COLUMBIA AND ALASKA FROM THE NORTH TIP OF
VANCOUVER ISLAND BRITISH COLUMBIA TO CAPE DECISION
ALASKA/LOCATED 85 MILES SE OF SITKA/…

…THIS MESSAGE IS INFORMATION ONLY FOR COASTAL AREAS OF
CALIFORNIA – OREGON – WASHINGTON AND BRITISH COLUMBIA FROM
THE CALIFORNIA-MEXICO BORDER TO THE NORTH TIP OF VANCOUVER
ISLAND BRITISH COLUMBIA…

…THIS MESSAGE IS INFORMATION ONLY FOR COASTAL AREAS OF
ALASKA FROM CAPE DECISION ALASKA/LOCATED 85 MILES SE OF
SITKA/ TO ATTU ALASKA…

A TSUNAMI WARNING MEANS… ALL COASTAL RESIDENTS IN THE WARNING
AREA WHO ARE NEAR THE BEACH OR IN LOW-LYING REGIONS SHOULD MOVE
IMMEDIATELY INLAND TO HIGHER GROUND AND AWAY FROM ALL HARBORS AND
INLETS INCLUDING THOSE SHELTERED DIRECTLY FROM THE SEA. THOSE
FEELING THE EARTH SHAKE… SEEING UNUSUAL WAVE ACTION… OR THE
WATER LEVEL RISING OR RECEDING MAY HAVE ONLY A FEW MINUTES BEFORE
THE TSUNAMI ARRIVAL AND SHOULD MOVE IMMEDIATELY. HOMES AND
SMALL BUILDINGS ARE NOT DESIGNED TO WITHSTAND TSUNAMI IMPACTS.
DO NOT STAY IN THESE STRUCTURES.

ALL RESIDENTS WITHIN THE WARNED AREA SHOULD BE ALERT FOR
INSTRUCTIONS BROADCAST FROM THEIR LOCAL CIVIL AUTHORITIES.
EARTHQUAKES OF THIS SIZE ARE KNOWN TO GENERATE TSUNAMIS.

AT 804 PM PACIFIC DAYLIGHT TIME ON OCTOBER 27 AN EARTHQUAKE WITH
PRELIMINARY MAGNITUDE 7.7 OCCURRED 25 MILES/40 KM SOUTH OF
SANDSPIT BRITISH COLUMBIA.
EARTHQUAKES OF THIS SIZE ARE KNOWN TO GENERATE TSUNAMIS.
IF A TSUNAMI HAS BEEN GENERATED THE WAVES WILL FIRST REACH
LANGARA ISLAND BRITISH COLUMBIA AT 916 PM PDT ON OCTOBER 27.
ESTIMATED TSUNAMI ARRIVAL TIMES AND MAPS ALONG WITH SAFETY RULES
AND OTHER INFORMATION CAN BE FOUND ON THE WEB SITE
WCATWC.ARH.NOAA.GOV.

TSUNAMIS CAN BE DANGEROUS WAVES THAT ARE NOT SURVIVABLE. WAVE
HEIGHTS ARE AMPLIFIED BY IRREGULAR SHORELINE AND ARE DIFFICULT TO
FORECAST. TSUNAMIS OFTEN APPEAR AS A STRONG SURGE AND MAY BE
PRECEDED BY A RECEDING WATER LEVEL. MARINERS IN WATER DEEPER
THAN 600 FEET SHOULD NOT BE AFFECTED BY A TSUNAMI. WAVE HEIGHTS
WILL INCREASE RAPIDLY AS WATER SHALLOWS. TSUNAMIS ARE A SERIES OF
OCEAN WAVES WHICH CAN BE DANGEROUS FOR SEVERAL HOURS AFTER THE
INITIAL WAVE ARRIVAL. DO NOT RETURN TO EVACUATED AREAS UNTIL AN
ALL CLEAR IS GIVEN BY LOCAL CIVIL AUTHORITIES.

PACIFIC COASTAL REGIONS OUTSIDE CALIFORNIA/ OREGON/ WASHINGTON/
BRITISH COLUMBIA AND ALASKA SHOULD REFER TO THE PACIFIC TSUNAMI
WARNING CENTER MESSAGES FOR INFORMATION ON THIS EVENT AT
PTWC.WEATHER.GOV.

THIS MESSAGE WILL BE UPDATED IN 30 MINUTES OR SOONER IF
THE SITUATION WARRANTS. THE TSUNAMI MESSAGE WILL REMAIN
IN EFFECT UNTIL FURTHER NOTICE. FOR FURTHER INFORMATION STAY TUNED
TO NOAA WEATHER RADIO… YOUR LOCAL TV OR RADIO STATIONS… OR SEE
THE WEB SITE WCATWC.ARH.NOAA.GOV.

$$

Did the media overreact to the earthquake and tsunami warning?


There were also numerous tweets on October 27 accusing the media of overreacting. The Haida Gwaii quake was magnitude 7.7. Compare that to the Haiti earthquake of January 12, 2010, which was magnitude 7.0, or the Christchurch, New Zealand earthquake of February 22, 2011, which caused major damage at magnitude 6.3. The Haida Gwaii earthquake was a major event. The tsunami warning that eventually reached as far off as Hawaii had to be taken seriously.
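Because earthquake magnitude is logarithmic, those comparisons are starker than the raw numbers suggest. Radiated energy scales roughly as 10 raised to 1.5 times the magnitude difference; a quick back-of-the-envelope check:

```python
# Rough rule of thumb: radiated seismic energy ratio between two quakes
# is about 10 ** (1.5 * (m1 - m2)).

def energy_ratio(m1, m2):
    return 10 ** (1.5 * (m1 - m2))

# Haida Gwaii 7.7 vs Haiti 7.0: roughly 11x the energy
print(f"{energy_ratio(7.7, 7.0):.0f}x")
# Haida Gwaii 7.7 vs Christchurch 6.3: roughly 126x the energy
print(f"{energy_ratio(7.7, 6.3):.0f}x")
```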

Fortunately Haida Gwaii is sparsely populated and there was minimal damage largely because most of the houses and buildings are wood and can absorb some of the shaking from an earthquake.

Given the tsunami damage in Southeast Asia in 2004 and in Japan in 2011, no media organization could ignore the developing story.

If there is justifiable criticism, it is that some media overhyped the story in the beginning, rather than acting to reassure the public in a responsible manner. But the media that overhyped the earthquake and tsunami are the kind that would overhype any story. That is generally the result of management listening to "TV doctors" and media consultants who urge hype to increase ratings. (It often works.) But those who, quite early in the event, tweeted that the media was overreacting were themselves guilty of overreaction in their tweets.

After the earthquake: Kitimat must immediately upgrade its emergency communications

As the 7.7 magnitude earthquake hit off Haida Gwaii shortly after eight o'clock on Saturday, I was at the Haisla Recreation Centre as the Haisla Nation marked the return of the G'ps Golox totem pole. Like a boat being lifted by gentle waves, the Rec Centre began to quietly roll up and down, and then the rolling seemed to accelerate just a bit. I realized that it was an earthquake. As I told CBC's Ian Hanomansing later in the evening, I have been in a number of earthquakes, and for me at least, this quake, at least at Kitamaat Village, was not shaking the rec centre as badly as some of the others I have felt.

The subsequent events of the evening show that the emergency communication system in Kitimat needs immediate improvement.

Cell service

Cell phone service at the village is poor, and after the rolling stopped neither I nor my Kitimat Daily colleague Walter McFarlane was able to get "bars."

Now, as a former network producer for both CBC and CTV, I have handled a large number of earthquake stories from around the world over the past quarter century (sitting at a desk, I should add). With that experience, I was hoping to get a cell hit at the village so I could bring up Twitter. I already subscribe to the US Geological Survey and Canadian earthquake alert feeds. The US and Canadian computers automatically report earthquakes within seconds of detection and send out a Twitter bulletin at the same time as those computers are alerting their human masters. If I had been able to get cell service, I would have known within minutes that the Haida Gwaii earthquake was a major event. (I did follow the alerts from my computer once I got back to Kitimat itself.)

Recommendation One. Cell service in Kitimat, Kitamaat Village and the harbour area must be upgraded as soon as possible. Telus has applied to council to erect a new cell tower here. Given the events of the past 24 hours, District Council should make sure that all parts of the District of Kitimat and the Haisla Nation have proper cell coverage no matter what service one subscribes to, not just for the convenience of subscribers but for emergency situations.

Automatic alerts

With experience, one knows that in a situation such as Saturday night's, the official websites such as the US Geological Survey and the Pacific Tsunami Warning Center, as well as Natural Resources Canada, are often overwhelmed. That is why the media use RSS feeds, Twitter feeds and e-mail alerts. It is also important to realize that these emergency organizations have their own language and procedures. It appears that a lot of the confusion on Saturday came from misinterpretation of the various Canadian and US warning systems.
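As an illustration of what consuming those automated feeds looks like, here is a minimal sketch that polls one of the real-time GeoJSON summary feeds USGS publishes at earthquake.usgs.gov. The magnitude threshold and polling interval are arbitrary choices for the example; a real newsroom or emergency office would rely on the push alerts rather than polling.

```python
# Minimal sketch: watch a USGS real-time earthquake feed and flag large
# events. USGS publishes GeoJSON summary feeds at earthquake.usgs.gov;
# the threshold and poll interval here are arbitrary example values.
import time
import requests

FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/4.5_hour.geojson"

def significant_quakes(min_magnitude=7.0):
    """Return (magnitude, place) pairs for recent events above the threshold."""
    data = requests.get(FEED, timeout=10).json()
    events = []
    for feature in data["features"]:
        props = feature["properties"]
        if props.get("mag") is not None and props["mag"] >= min_magnitude:
            events.append((props["mag"], props["place"]))
    return events

if __name__ == "__main__":
    while True:
        for mag, place in significant_quakes():
            print(f"ALERT: M{mag} earthquake, {place}")
        time.sleep(60)  # polling is a stand-in; the push feeds are faster
```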

Recommendation Two. If Kitimat emergency services are not familiar with how the US-based earthquake and tsunami centres work, they should be trained in those systems. The Americans are well ahead of Canada in this area: their alerts go out automatically by computer, are constantly updated and, as Saturday night showed, are often quicker and farther ahead than the Canadian systems.

Once I was back in Kitimat, it was clear that communications were breaking down, and this was at a time when the tsunami warning was still active. There were numerous messages on Twitter and Facebook from residents of Kitimat either trying to find out what was going on or retweeting/reposting rumours, including one that the Kildala neighbourhood was being evacuated. I am told that residents were calling the RCMP to ask what was going on. This was another breakdown, since North District HQ in Prince George handles all police services in this region and was likely busy with quake calls on Haida Gwaii, so information calls in Kitimat that should have been handled by an emergency services public communications person were being handled by the Mounties.

There were reports that one man was going door to door in Kildala telling people to evacuate. Whether this person was well intentioned but misinformed or an imposter intent on mischief doesn't matter; there was an information vacuum.

It was clear from Twitter that other districts and municipalities were using that service to spread official information. (I don't follow other areas on Facebook, so it is unclear if information was being posted there. There was certainly no official presence from Kitimat on Facebook Saturday night.) It appears from reports in the Kitimat Daily and tweets about the Northern Sentinel that Kitimat emergency services was sending information out by fax. While faxing information was an advance in the 1980s, faxes are obsolete in 2012. Many major newsrooms no longer use fax machines, after being inundated by junk faxes and after they laid off the editorial assistants who would have cleared those machines (even by the late 90s, most faxes were dumped in the garbage unless the EA had been told to look for a specific fax). Also, though it is now more than two years since I returned to Kitimat, and I regularly freelance for Global, CBC and Canadian Press, I had no contact from anyone in emergency services (and I don't have a fax machine).

Recommendation Three: The District of Kitimat must immediately bring its emergency communications into the 21st century, with Twitter accounts, a Facebook page and an emergency e-mail or text message plan for media and other officials who can get the messages. (A number of jurisdictions already use text messages for emergency alerts at various graduated levels: official, media, public.) When the main means of communication today is social media, an emergency organization can no longer follow outdated procedures; an organization must be on social media as soon as it becomes clear that there is an emergency (as we are seeing with all the official tweets during the Hurricane Sandy crisis on the east coast).
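The graduated-levels idea in the parenthesis above maps naturally onto tiered subscriber lists. A minimal sketch, with invented tiers, addresses and a stubbed-out send function, of how one alert could fan out from officials to media to the public:

```python
# Sketch of graduated alert tiers, as some jurisdictions use: the same
# alert goes to different audiences depending on its level. The tiers,
# subscriber lists and send() stub are all invented for illustration.

TIERS = ["official", "media", "public"]  # broadening audiences, in order

subscribers = {
    "official": ["emergency.coordinator@example.org"],
    "media": ["newsroom@example.org"],
    "public": ["everyone-who-opted-in@example.org"],
}

def send(recipient, message):
    print(f"-> {recipient}: {message}")  # stand-in for SMS/e-mail/Twitter

def broadcast(message, level):
    """Send an alert to the named tier and every narrower tier before it."""
    for tier in TIERS[: TIERS.index(level) + 1]:
        for recipient in subscribers[tier]:
            send(recipient, message)

broadcast("Tsunami warning for Zone B. No evacuation ordered.", "public")
```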

CFTK

In an emergency situation, local radio and television are vital to communications and letting people know what is going on.

CFTK did a much better job on March 27, 1964, after Kitimat felt the magnitude 9.2 Good Friday Anchorage earthquake, than it did on the weekend with the Haida Gwaii earthquake.

The inadequate coverage of the quake was certainly not the fault of the current CFTK news staff, who were working hard (probably on their own time and unpaid) keeping Twitter updated with what they knew. The fault lies with corporate management across the media, which these days doesn't want to spend the money, resources and training to fulfill the public service portion of their broadcast licence mandate.

(There was a similar breakdown in the May 2000 Walkerton, Ontario E. coli crisis, where the local medical officer of health was initially unable to alert the public because local radio wasn't staffed on weekends; the local stations were taking satellite feeds from their corporate headquarters.)

In 1964, long before satellites, when the microwave towers that joined CFTK to the Canadian networks were still being built, the staff of CFTK, then of course under local management, went to a live special within an hour of the Anchorage quake being felt in Kitimat, far from Alaska. The CFTK anchors kept their audience updated with "rip and read" wire copy, a camera on an atlas for a map, and phone interviews.

In contrast, on this Saturday night, CFTK was taking the CBC BC network feed, which was a hockey rerun (hardly a show that attracts major audience numbers and certainly not a vital broadcast), until CBC management in Vancouver decided to go to a full network news special.

Since CFTK is the station that broadcasts not only to Kitimat but to Haida Gwaii as well, CFTK should have been ahead of Vancouver on this story, called in its staff and mounted its own live special, joining the CBC feed when it began but, as on an election night, breaking away for local news when justified. CFTK has a responsibility under its licence from the CRTC to provide that service to the northwestern region, not just to send what ad revenue it generates back to Astral.

Rio Tinto Alcan

Another question that must be asked in this situation is: where was Rio Tinto Alcan on Saturday night? In any area under a tsunami warning, the first clue of a problem would come from observing what was happening between the low tide line and the maximum high tide line. In Prince Rupert, from the Twitter feeds I saw, public officials were monitoring the waterfront and the tide lines and updating the public. RTA has all the advantages of the private port of Kitimat. It appears that monitoring the water level at the tide lines at the port of Kitimat was the responsibility of Plant Protection. Was RTA communicating what was happening with emergency services? Since RTA runs the private port, unlike in other jurisdictions, RTA had a responsibility to the people of Kitimat to report promptly to the public on conditions on the waterfront. Corporate public relations cannot just send out news releases with "good news." That means RTA public relations should have used its corporate Twitter account, which usually sends out a news release every few weeks, to keep Kitimat updated on a minute-by-minute basis. If RTA communications staff in Kitimat do not have access to the RTA corporate Twitter account, they should establish their own local Twitter feed.

Both in 1964 and in 2012, the tsunami that came up Douglas Channel was minimal. But we know that this region does have a record of major quakes, and that Douglas Channel has also experienced major landslides that can, in some circumstances, trigger a tsunami without an earthquake. The next few years will see more industrial development along Douglas Channel, which can also bring other hazards to the Kitimat region. While there are always communications breakdowns in situations like Saturday's, it is clear that the Kitimat emergency communications system needs a major upgrade to make sure the public is informed quickly and accurately of what is going on.

 

Seattle cod trawler wastes 114 tons of Alaska halibut: Alaska Dispatch


The Alaska Dispatch reports that a Seattle cod trawler wasted 114 tons of Alaska halibut:

The Seattle-based trawler Alaska Beauty recently had a great week of halibut fishing… Only one problem: Alaska Beauty wasn’t supposed to be fishing halibut; it was supposed to be fishing cod.

Despite that, 43 percent of its catch was halibut. All of that halibut, by law, must be dumped back into the sea. Most of it goes back dead. Some Alaskans are starting to get angry at this sort of large “by-catch” of halibut by Pacific Northwest and Kodiak-based trawlers at a time when the species’ stocks are declining, and Alaska charter and commercial longline fisheries are locked in a bitter battle over every flatfish.

An anonymous blogger who goes by the name of Tholepin notes in the latest post: "228,800 pounds of halibut wasted by draggers just last week. Value? In cash terms to longliners, about $1.6 million. In lost reproductive potential, in lost growth potential, in long-term resource damage; all unknowns…"