Bob McLeod, who recently retired as the District of Kitimat’s emergency coordinator, told Northwest Coast Energy News: “I think we’ve done quite a bit. One of the biggest issues in the first one was trying to get information out. We’ve come a long way on that. Whether you reach everyone or not, that’s another thing, because you never reach everybody. One of the critical things to me is getting the information out so you avoid all this Facebook, Twitter speculating and rumour. The communications aspect has improved a hundredfold.
“We did more work on the mapping and planning. Over the course of the last year, there were a lot of meetings with industry and various stakeholders, discussing emergency preparedness in general but touching on some of these other things as well.
“One of the things we did was to try to set up some shelter points. We have an agreement with the Baptist Church, the Catholic Church and the Seventh Day Adventists. They’re strategically located and could be gathering points for the various neighborhoods if necessary.
“We’ve also done quite a lot of work on Riverlodge as a group lodging centre, thinking in terms of an earthquake where there may be damage and you have to move people.
“We did look at the evacuation planning and we’ve had a couple of exercises involving that, looking strategically at how you can move people from certain neighborhoods, asking which neighborhoods would be at the most risk if you ended up with a tsunami situation.
As for tsunamis, McLeod said, “From everything we’ve heard and been told, a tsunami in extremely deep water like that is not going to be as dangerous as one in shallower water, but the possibility is still there.
“The thrusts are the killers when it comes to tsunamis, but there is a very good warning system on the tsunamis. We do get very very rapid feedback on the earthquakes.
“The only danger in that regard is if you have a severe earthquake and you have part of a mountain drop into the salt chuck, you’re going to get a massive wave and you’re going to get no warning whatsoever, like the Moon Bay collapse in the seventies.
“The emergency plan is in good shape. We scheduled a number of exercises last year through training programs.
“One of the things I personally push is personal preparedness. I think as a community, we fail greatly at that. That was evident even during the snowstorm. People are just not prepared to look after themselves; it’s unfortunate. You just have to keep chipping away.”
Last week, Northwest Coast Energy News asked Rio Tinto Alcan and the Haisla Nation Council if either could comment on updated earthquake or tsunami response plans. So far, we have received no answers.
The province of British Columbia has posted a request for bids for an extensive airshed study for Prince Rupert, a study with a much wider scope than the controversial Kitimat airshed study. The maximum cost for the study is set at $500,000.
The request for proposals calls for “a study of potential impacts to the environment and human health of air emissions from a range of existing and proposed industrial facilities in the Prince Rupert airshed, further referred to as Prince Rupert Airshed Study (PRAS) in North West British Columbia.”
The “effects assessment” should include the “prediction of effects of existing and proposed air emissions of nitrogen dioxide, sulphur dioxide and fine particulate matter (PM2.5)” from “an existing BC Hydro gas fired turbine, a proposed oil refinery, and seven proposed LNG export terminals (Pacific Northwest LNG, Prince Rupert LNG, Aurora LNG, Woodside LNG, West Coast Canada LNG, Orca LNG, and Watson Island LNG).”
In addition to “stationary sources” of nitrogen dioxide, sulphur dioxide and particulate matter, “the impact assessment will also include rail and marine transportation sources of these contaminants in the study area.”
The request for proposal goes on to say:
The identified sources will be used for air dispersion modelling to determine how the contaminants in various aggregations (scenarios) will interact with the environment, including surface water, soils, vegetation and humans. Interactions of interest will include:
– water impact mechanisms related to acidification and eutrophication;
– soil impact mechanisms related to acidification and eutrophication; and
– vegetation and human health impact mechanisms related to direct exposure.
Water and soil impact predictions will be based on modelled estimates of critical loads for both media, given existing and predicted conditions in the airshed. Vegetation and human health impact predictions will be based on known thresholds of effects, given modelled existing and predicted conditions (contaminant concentrations) in the airshed.
Although the documents say that the Prince Rupert study will be based on the same parameters as the Kitimat airshed study, the Kitimat study only looked at sulphur dioxide and nitrogen dioxide, and did not include particulate matter.
Environmental groups also criticized the Kitimat airshed study for not including greenhouse gases. The proposed Prince Rupert study also does not include greenhouse gases.
A draft report is due by March 15, for review by the province and affected First Nations, and subject to peer review. The District of Kitimat was not asked to comment on the Kitimat airshed study, even though scholars as far away as Finland were asked to review it. It appears that Prince Rupert itself is also excluded from a chance to review the new study. The final report is due on May 15.
The province has issued a permit to Rio Tinto Alcan to increase sulphur dioxide emissions from the Kitimat Modernization Project. The Environmental Appeal Board will hold hearings in January 2015. Elisabeth Stannus and Emily Toews, from Kitimat, have appealed the decision to allow RTA to increase sulphur dioxide emissions.
Acidification of the oceans means there is already a growing risk to the northwest coast fishery, including crab and salmon, according to studies released by the US National Oceanic and Atmospheric Administration.
As more carbon dioxide is released into the atmosphere and absorbed by the oceans, the water is becoming more acidic, and that affects many species, especially shellfish, by dissolving their shells.
A NOAA study released today of environmental and economic risks to the Alaska fishery says:
Many of Alaska’s nutritionally and economically valuable marine fisheries are located in waters that are already experiencing ocean acidification, and will see more in the near future…. Communities in southeast and southwest Alaska face the highest risk from ocean acidification because they rely heavily on fisheries that are expected to be most affected by ocean acidification…
An earlier NOAA study, released in April, identified a long-term threat to the salmon fishery: small ocean snails called pteropods, a prime food source for pink salmon, are already being affected by the acidification of the ocean.
The term “ocean acidification” describes the process of ocean water becoming more acidic as a result of absorbing nearly a third of the carbon dioxide released into the atmosphere from human sources. This change in ocean chemistry is affecting marine life, particularly the ability of shellfish, corals and small creatures in the early stages of the food chain to build skeletons or shells.
Today’s NOAA study is the first published research by the Synthesis of Arctic Research (SOAR) program, which is supported by a US inter-agency agreement between NOAA’s Office of Oceanic and Atmospheric Research and the Bureau of Ocean Energy Management (BOEM) Alaska Region.
Des Nobles, President of Local #37 Fish [UFAWU-UNIFOR], told Northwest Coast Energy News that the fisheries union and other fisheries groups in Prince Rupert have asked both the Canadian federal and the BC provincial governments for action on ocean acidification. Nobles says so far those requests have been ignored.
Threat to crabs
The studies show that red king crab and tanner crab grow more slowly and don’t survive as well in more acidic waters. Alaska’s coastal waters are particularly vulnerable to ocean acidification because of cold water that can absorb more carbon dioxide and unique ocean circulation patterns which bring naturally acidic deep ocean waters to the surface.
“We went beyond the traditional approach of looking at dollars lost or species impacted; we know these fisheries are lifelines for native communities and what we’ve learned will help them adapt to a changing ocean environment,” said Jeremy Mathis, Ph.D., co-lead author of the study, an oceanographer at NOAA’s Pacific Marine Environmental Laboratory in Seattle, and the director of the University of Alaska Fairbanks School of Fisheries and Ocean Sciences Ocean Acidification Research Center.
As for Dungeness crab, Sarah Cooley, a co-author of the Alaska study who was with the Woods Hole Oceanographic Institution at the time, told Northwest Coast Energy News, “The studies have not been done for Dungeness crab that have been done for king and tanner crab; that’s something we’re keenly aware of. There’s a big knowledge gap at this point.” She says NOAA may soon be looking at a pilot study on Dungeness crab.
Risk to Salmon, Mackerel and Herring
In a 2011-2013 survey, a NOAA-led research team found the first evidence “that acidity of continental shelf waters off the West Coast is dissolving the shells of tiny free-swimming marine snails, called pteropods, which provide food for pink salmon, mackerel and herring.”
The survey estimated that the percentage of pteropods along the west coast with dissolving shells due to ocean acidification had “doubled in the near shore habitat since the pre-industrial era and is on track to triple by 2050 when coastal waters become 70 percent more corrosive than in the pre-industrial era due to human-caused ocean acidification.”
That study documented the movement of corrosive waters onto the continental shelf from April to September during the upwelling season, when winds bring water rich in carbon dioxide up from depths of about 120 to 180 metres to the surface and onto the continental shelf.
“We haven’t done the extensive amount of studies yet on the young salmon fry,” Cooley said. “I would love to see those studies done. I think there is a real need for that information. Salmon are just so so important for the entire Pacific Northwest and up to Alaska.”
In Prince Rupert, Barb Faggetter, an independent oceanographer who was not part of the study and whose company, Ocean Ecology, has consulted for the fishermen’s union and NGOs, spoke generally about the threat of acidification to the region.
She is currently studying the impact of the proposed liquefied natural gas terminals that could be built at Prince Rupert near the Skeena River estuary. Faggetter said that acidification could affect the species eaten by juvenile salmon. “As young juveniles they eat a lot of zooplankton, including crustaceans and shellfish larvae.”
She added, “Any of the shellfish in the fishery, including probably things like sea urchins, are all organisms that are susceptible to ocean acidification because of the loss of their capacity to actually incorporate calcium carbonate into their shells.”
Faggetter said her studies have concentrated on potential habitat loss near Prince Rupert as a result of dredging and other activities for liquefied natural gas development. She added that ocean acidification “has been a consideration that climate change will further worsen any potential damage that we’re currently looking at.”
Her studies of the Skeena estuary are concentrating on “rating” areas based on the food supply available to juvenile salmon, as well as predation and what habitat is available and the quality of that habitat to identify areas that “are most important for the juvenile salmon coming out of the Skeena River estuary and which are less important.”
She said that climate change and ocean acidification could impact the Skeena estuary and “probably reduce some of the environments that are currently good because they have a good food supply. If ocean acidification reduces that food supply that will no longer be good habitat for them” [juvenile salmon].
The August 2011 NOAA survey of the pteropods was done at sea using “bongo nets” to retrieve the small snails at depths up to 200 metres. The research drew upon a West Coast survey by the NOAA Ocean Acidification Program that was conducted on board the R/V Wecoma, owned by the National Science Foundation and operated by Oregon State University.
Nina Bednarsek, Ph.D., of NOAA’s Pacific Marine Environmental Laboratory in Seattle, the lead author of the April pteropod paper said, “Our findings are the first evidence that a large fraction of the West Coast pteropod population is being affected by ocean acidification.
“Dissolving coastal pteropod shells point to the need to study how acidification may be affecting the larger marine ecosystem. These near shore waters provide essential habitat to a great diversity of marine species, including many economically important fish that support coastal economies and provide us with food.”
Ecology and economy
Today’s study on the effects of acidification on the Alaska fishery examined the potential effects on a state where the fishing industry supports over 100,000 jobs and generates more than $5 billion in annual revenue. Fishery-related tourism also brings in $300 million annually to the state.
The study also shows that approximately 120,000 people, or roughly 17 percent of Alaskans, rely on subsistence fisheries for most, if not all, of their dietary protein. The Alaska subsistence fishery is open to all residents of the state who need it, although a majority of those who participate in the subsistence fishery are Alaska Natives. In that way it is somewhat parallel to Canada’s Food, Social and Ceremonial program for First Nations.
“Ocean acidification is not just an ecological problem—it’s an economic problem,” said Steve Colt, Ph.D., co-author of the study and an economist at the University of Alaska Anchorage. “The people of coastal Alaska, who have always looked to the sea for sustenance and prosperity, will be most affected. But all Alaskans need to understand how and where ocean acidification threatens our marine resources so that we can work together to address the challenges and maintain healthy and productive coastal communities.”
The Alaska study recommends that residents and stakeholders in vulnerable regions prepare for environmental challenge and develop response strategies that incorporate community values and needs.
“This research allows planners to think creatively about ways to help coastal communities withstand environmental change,” said Cooley, who is now science outreach manager at Ocean Conservancy, in Washington, D.C. “Adaptations can be tailored to address specific social and environmental weak points that exist in a community.
“This is really the first time that we’ve been able to go under the hood and really look at the factors that make a particular community in a borough or census area less or more vulnerable to changing conditions resulting from acidification. It gives us a lot of power so that we don’t just look at environmental issues but also look at the social story behind that risk.”
As for the southern part of the Alaska panhandle nearest British Columbia, Cooley said, “What we found is that there is a high relative risk compared to some of the other areas of Alaska, and that is because the communities there undertake a lot of subsistence fishing. There tends not to be a whole lot of commercial harvest in the fisheries there, but they are very, very important from a subsistence standpoint… And they’re tied to species that we expect to be on the front line of acidification, many of the clam species that are harvested in that area and some of the crab species.”
Long term effects
Libby Jewett, Director of the NOAA Ocean Acidification Program and author of the pteropod study said, “Acidification of our oceans may impact marine ecosystems in a way that threatens the sustainability of the marine resources we depend on.
“Research on the progression and impacts of ocean acidification is vital to understanding the consequences of our burning of fossil fuels.”
“Acidification is happening now,” Cooley said. “We have not yet observed major declines in Alaskan harvested species. In Washington and Oregon they have seen widespread oyster mortality from acidification.
“We don’t have the documentation for what’s happening in Alaska right now, but there are a lot of studies starting up that will keep an eye out for that sort of thing. Acidification is going to continue progressively over the next decades and into the future indefinitely until we really curb carbon dioxide emissions. There’s enough momentum in the system that is going to keep acidification advancing for quite some time.
“What we need to be doing as we cut carbon dioxide is find ways to strengthen communities that depend on these resources, and this study allows us to think differently about that and to really look at how we can strengthen those communities.”
Faggetter said, “It’s one more blow to an already complex situation here. My study has been working particularly on eelgrass on Flora Bank, which is a very critical habitat that is going to be impacted by these potential industrial developments, and that impact will affect our juvenile salmon and our salmon fishery very dramatically. That could be further worsened by ocean acidification.”
She said that acidification could also be a long-term threat to plans in Prince Rupert to establish a geoduck (pronounced “gooey-duck”) fishery.
The popular large 15 to 20 centimetre clam is harvested in Washington State and southern BC, but so far hasn’t been subject to commercial fishing in the north.
NOAA said today’s study shows that by examining all the factors that contribute to risk, more opportunities can be found to prevent harm to human communities at a local level. Decision-makers can address socioeconomic factors that lower the ability of people and communities to adapt to environmental change, such as low incomes, poor nutrition, lack of educational attainment and lack of diverse employment opportunities.
NOAA’s Ocean Acidification Program and the state of Alaska are also developing tools to help industry adapt to increasing acidity.
About 52 million years ago, what is now the Bulkley Valley was home to a tiny hedgehog and an ancient ancestor of tapirs, which lived on the shores of a placid lake surrounded by a lush upland forest.
The newly discovered fossils at Driftwood Canyon near Smithers are a significant advance in the study of the ancient history of the region. That’s because while Driftwood Canyon Provincial Park is known for beautifully preserved fossils of leaves, fishes and insects, these are the first mammalian remains found at the site.
The fossil hedgehog and tapir are even more significant because at the time they lived near an upland lake, Earth was going through a period of rapid global warming, now called the Paleocene-Eocene Thermal Maximum.
In the past couple of years, climatologists and paleontologists have started to pay closer attention to the Thermal Maximum period in hopes of understanding what could happen during climate change today.
Driftwood Canyon first became famous in 1977 with the discovery of the oldest known ancestor of salmon, Eosalmo driftwoodensis, which lived in an Eocene lake at Driftwood Canyon.
Today’s study says the ancient hedgehog is a species hitherto unknown to science. It is named Silvacola acares, which means “tiny forest dweller,” since this minute hedgehog likely had a body length of only five to six centimetres (two to two and a half inches), about the size of an adult human thumb.
“It is quite tiny and comparable in size to some of today’s shrews,” said Dr. Jaelyn Eberle of the University of Colorado, lead author of the study. She speculated Silvacola may have fed on insects, plants and perhaps seeds.
Did it have quills like contemporary hedgehogs? “We can’t say for sure,” Eberle said. “But there are ancestral hedgehogs living in Europe about the same time that had bristly hair covering them, so it is plausible Silvacola did too.”
The delicate fossil jaw of Silvacola was not freed from the surrounding rock as is typical for fossils. Instead it was studied using an industrial high resolution CT (computed tomography) scanner at Penn State University so it could be studied without risking damage to its tiny teeth.
Hedgehogs are no longer found naturally in North America. Modern hedgehogs and their relatives are restricted to Europe, Asia, and Africa. Hedgehogs have become quite the rage as pets in North America in the past several years. The most common hedgehog pet today is the African pygmy hedgehog, which is up to four times the length of the diminutive Silvacola.
The other mammal discovered at the site, Heptodon, was about the size of a medium-sized dog. It is an ancient relative of modern tapirs, which resemble small hornless rhinos with a short, mobile trunk, or proboscis.
“Heptodon was about half the size of today’s tapirs, and it lacked the short trunk that occurs on later species and their living cousins. Based upon its teeth, it was probably a leaf-eater, which fits nicely with the rain forest environment indicated by the fossil plants at Driftwood Canyon,” Eberle said.
Most of the fossil-bearing rocks at Driftwood Canyon formed on the bottom of an ancient lake and are well-known for their exceptionally well-preserved leaves, insects, and fishes.
“The discovery in northern British Columbia of an early cousin to tapirs is intriguing because today’s tapirs live in the tropics. Its occurrence, alongside a diversity of fossil plants that indicates a rain forest, supports an idea put forward by others that tapirs and their extinct kin are good indicators of dense forests and high precipitation,” she said.
Forests, lakes, rivers
Fossil plants from the site indicate the area seldom experienced freezing temperatures and probably had a climate similar to that of Portland, Oregon, located roughly 1,126 kilometres or 700 miles to the south.
The current and previous studies have shown the hedgehog and tapir lived on the shores of a lake surrounded by a mixed conifer-broadleaf forest with redwoods, such as Metasequoia and Sequoia, as well as cedars, fir, larch, golden larch, spruce, pine and rare ginkgoes. There were also broadleaf deciduous trees such as alder, birch, sassafras, elms, and relatives of the oak family. The lake contained Azolla, a floating fern frequently found as preserved mats in the fossil shale of the cliff at Driftwood; together with the fine preservation of the insects, the Azolla mats indicate a quiet-water lake.
The remains of the hedgehog were found in the fossil lake bed, while the tapir was found in river sediments.
Reconstructions of the paleoclimate suggest the region had a mean annual temperature of between 10 and 15 degrees C, with minimal winter freezing, and annual precipitation of about 100 centimetres. Today, the mean annual temperature for Smithers is 4.2 degrees C, with 50.85 centimetres of precipitation a year.
“Driftwood Canyon is a window into a lost world – an evolutionary experiment where palms grew beneath spruce trees and the insects included a mixture of Canadian and Australian species. Discovering mammals allows us to paint a more complete picture of this lost world,” said Dr. David Greenwood of Brandon University, a co-author of the study.
“The early Eocene is a time in the geological past that helps us understand how present day Canada came to have the temperate plants and animals it has today. However, it can also help us understand how the world may change as the global climate continues to warm.”
The Driftwood Canyon site is the northernmost of a series of Eocene lake sites, spanning about 1,000 kilometres from Smithers south to Republic in northern Washington, that the scientists call the Okanagan Highlands, with a mixture of temperate and tropical plants and animals and a high diversity of insects and plants.
While Driftwood Canyon is now among the sites considered key indicators of climate change 50 to 53 million years ago, the Harper government has cut almost all funding for paleontology research, not just at Driftwood Canyon but across the country, because looking for fossils does not usually fit the Conservative policy of funding only science that promotes industry.
“Within Canada, the only other fossil localities yielding mammals of similar age are from the Arctic, so these fossils from British Columbia help fill a significant geographic gap,” said Dr. Natalia Rybczynski of the Canadian Museum of Nature, a co-author of the study.
Other fossils of this age come from Wyoming and Colorado, some 4,345 kilometres or 2,700 miles to the south of the Arctic site of Ellesmere Island. In addition, sources have told Northwest Coast Energy News that the provincial budget for Driftwood Canyon, despite its significance, is the same as other small parks of that size, with virtually no security to prevent fossils leaving the park, either in the hands of professional looters or if they are picked up and taken home by visitors.
There are consistent reports that looted fossils from Driftwood Canyon are regularly showing up at fossil shows in the United States.
Sources have told Northwest Coast Energy News that the provincial government has ignored requests to improve security at Driftwood Canyon because it is considered a small (just 21 hectares) low priority park off the main tourist routes, rather than a significant fossil site.
The mammal fossils were discovered in 2012, before the budget cuts, and are now in the Royal British Columbia Museum in Victoria. The fieldwork was supported by the Natural Sciences and Engineering Research Council of Canada.
The study, “Early Eocene mammals from the Driftwood Creek beds, Driftwood Canyon Provincial Park, Northern British Columbia,” was published in the July 8, 2014 edition of the Journal of Vertebrate Paleontology.
Iron and steel in hatcheries, including rebar supporting concrete structural elements, could be distorting the ability of salmon and trout to navigate using the earth’s magnetic fields, according to a study released today by Oregon State University.
The exposure to iron and steel distorts the magnetic field around the young fish, affecting the fish’s “map sense” and their ability to navigate, said Nathan Putman, who led the study while working as a postdoctoral researcher in OSU’s Department of Fisheries and Wildlife, part of the university’s College of Agricultural Sciences.
For decades, scientists have studied how salmon find their way across vast stretches of ocean.
In a study last year, Putman and other researchers presented evidence of a correlation between the oceanic migration patterns of salmon and drift of the Earth’s magnetic field. They confirmed the ability of salmon to navigate using the magnetic field in experiments at the Oregon Hatchery Research Center.
That earlier research confirmed that fish possess a map sense, determining where they are and which way to swim based on the magnetic fields they encounter.
“The better fish navigate, the higher their survival rate,” said Putman, who conducted the research at the Oregon Hatchery Research Center in the Alsea River basin last year. “When their magnetic field is altered, the fish get confused.”
Subtle differences in the magnetic environment within hatcheries could help explain why some hatchery fish do better than others when they are released into the wild, Putman said.
The study suggests that stabilizing the magnetic field by using alternative forms of hatchery construction may be one way to produce a better yield of fish, he said.
“It’s not a hopeless problem,” he said. “You can fix these kinds of things. Retrofitting hatcheries with non-magnetic materials might be worth doing if it leads to making better fish.”
The new findings follow the earlier research by Putman and others that confirmed the connection between salmon and the Earth’s magnetic field.
Researchers exposed hundreds of juvenile Chinook salmon to different magnetic fields that exist at the latitudinal extremes of their oceanic range.
Fish responded to these “simulated magnetic displacements” by swimming in the direction that would bring them toward the center of their marine feeding grounds.
Putman repeated that experiment with the steelhead trout and achieved similar results. He then expanded the research to determine if changes to the magnetic field in which fish were reared would affect their map sense. One group of fish was maintained in a fiberglass tank, while the other group was raised in a similar tank but in the vicinity of iron pipes and a concrete floor with steel rebar, which produced a sharp gradient of magnetic field intensity within the tank. Iron pipes and steel reinforced concrete are common in fish hatcheries.
The scientists monitored and photographed the juvenile steelhead, called parr, and tracked the direction in which they were swimming during simulated magnetic displacement experiments. The steelhead reared in a natural magnetic field adjusted their map sense and tended to swim in the same direction. But fish that were exposed to the iron pipes and steel-reinforced concrete failed to show the appropriate orientation and swam in random directions.
More research is needed to determine exactly what that means for the fish. The loss of their map sense could be temporary and they could recalibrate their magnetic sense after a period of time, Putman said. Alternatively, if there is a critical window in which the steelhead’s map sense is imprinted, and it is exposed to an altered magnetic field then, the fish could remain confused forever, he said.
“There is evidence in other animals, especially in birds, that either is possible,” said Putman, who now works for the National Oceanic and Atmospheric Administration. “We don’t know enough about fish yet to know which is which. We should be able to figure that out with some simple experiments.”
Putman’s findings were published this week in the journal Biology Letters. The research was funded by Oregon Sea Grant and the Oregon Department of Fish and Wildlife, with support from Oregon State University. Co-authors of the study are OSU’s David Noakes, senior scientist at the Oregon Hatchery Research Center, and Amanda Meinke of the Oregon Hatchery Research Center.
A Duke University study of the US shale gas boom has found that oil and gas development from shale fields has generally helped the public finances of local communities, providing new revenues and resources that usually — but not always — outweigh the increased demand for public services and other costs.
It found that many local governments in western North Dakota and eastern Montana, near the Bakken shale formation, have thus far experienced net negative fiscal effects. Also, some municipalities in rural parts of Colorado and Wyoming struggled to manage rapid population growth as natural gas production accelerated in the mid-to-late 2000s.
The research is a snapshot of the fiscal impact to date in the eight states and does not examine the long-term economic impact to governments and the communities they serve, a question the authors say is important and needs additional study.
Daniel Raimi and Richard Newell gathered data from communities surrounding ten oil and gas “plays” from September 2013 through February 2014, traveling to Arkansas, Colorado, Louisiana, Montana, North Dakota, Pennsylvania, Texas and Wyoming to interview local officials and collect information firsthand.
The report describes major revenue sources for local governments, which can include property taxes, sales taxes and state-collected severance taxes or fees that are sent back to the local level. Some local governments also partner with oil and gas companies to help maintain roads, an approach that helped reduce expenses associated with heavy truck traffic in states including Arkansas, Colorado and Pennsylvania.
New costs for local governments associated with oil and gas development include damage to roads from heavy truck traffic, water and sewer service expansion, and government staffing and other needs brought on by rapid population growth.
The researchers found that the net impact of recent oil and gas development has generally been positive for local public finances.
“The fiscal effects for local governments tend to vary from state to state, but we found that for most of them new revenues were outweighing new demand for services,” said Newell, director of the Duke University Energy Initiative and Gendell Professor of Energy and Environmental Economics at Duke’s Nicholas School of the Environment.
Newell and Raimi found net positive fiscal effects in regions where oil and gas booms were ongoing or had slowed in recent years, as well as in regions that experienced different scales of activity. This includes local governments in diverse regions where population density and government capacity vary substantially.
“One of the key questions is how these fiscal effects change over time,” said Raimi, an associate in research with Duke’s Energy Initiative. “In very rural areas, some local governments have faced challenges when development first surges. In many cases, those challenges faded over time. In most other areas, we found net positive or at least roughly neutral financial effects on local government.”
“In some parts of North Dakota, populations have doubled, tripled or even quadrupled just in the past few years,” Raimi said. “For local governments in these areas, it’s hard to keep up with the demand for services, especially costly infrastructure projects such as sewer and water treatment plants.”
The study was financed with support from the Alfred P. Sloan Foundation. The Shale Public Finance project will continue to produce a series of publications that describe local experiences from a variety of U.S. local governments and identify key findings.
It comes down to the idea that Harper will approve Gateway “in the national interest,” count on a vote split between the NDP and Liberals in British Columbia to avoid any consequences to the Conservative majority and then leave it up to Enbridge to actually get the job of building the pipeline and terminal project done.
Mason quotes “a senior member of Mr. Harper’s government,” and while Mason doesn’t say what part of Canada the source is from (unlikely, in my view, that the source is from BC), what the member told Mason reveals that the Harper government is still mired in the Matrix-world that has always governed its policy on Northern Gateway.
The first step, apparently coming in the next few days, is that the Harper government will announce “rigorous” new tanker protocols for traffic along the west coast.
Even if the protocols are new, just who is going to enforce those policies?
Even if Gateway and the Kinder Morgan expansion went ahead, he argued, B.C. would still only see about 60 per cent of the annual oil tanker traffic the neighbouring state of Washington deals with. And yet Washington has an exceptionally clean record when it comes to the safe transport of oil in and out of its harbours – this, he noted, while operating under marine safety regulations that are not as rigorous as the ones Ottawa intends to put in place for the shipment of oil along the West Coast.
There are a lot of big problems with that statement.
First, there’s an organization that Mason’s source may have heard of known as the United States Coast Guard. The United States rigorously enforces its “weak” regulations, while Canada’s Coast Guard is plagued by staff shortages and budget cuts.
Second, the State of Washington also rigorously enforces its environmental regulations, not only on the coast but across the state. I have been told by retired British Columbia forestry and environmental officials (not to mention Fisheries and Oceans) that there are often more state environmental watchdogs in most Washington State counties than in all of northern British Columbia, where the Northern Gateway is supposed to go.
The September 2013 report by the US National Oceanic and Atmospheric Administration on the export of Canadian bitumen sands through the US shows that the Washington Department of Ecology is working on strengthening regulations for both pipelines and (where it’s in state jurisdiction) tanker traffic. The same report says the state of Alaska Department of Environmental Conservation is updating its plans and possible regulations in anticipation that bitumen-filled tanker traffic from Kitimat would come close to the coast en route to Asia.
Third, the coast of northern British Columbia is more rugged and stormy than the waters off Washington.
The one factor that the urban media seems to ignore is the big question.
Who pays to enforce the 209 conditions that the Joint Review Panel imposed on the Northern Gateway project?
If the Harper government announces new tanker regulations in the coming days, who pays to enforce those regulations?
There were no provisions in the February budget for enforcing the 209 conditions. Rather, there were continuing budget cuts to the very departments that the JRP ruled must be involved in the studying, planning, implementation and enforcement of the 209 conditions: Environment Canada, Fisheries and Oceans and Transport Canada.
So while Mason says “The federal government will play its part in meeting the five conditions laid out by the B.C. government for support of the project,” the response must be “Show me the money!”
During the recent plebiscite campaign, Northern Gateway finally revealed its plans for the “super tugs” that will escort tankers along the coast and up Douglas Channel. Owen McHugh, a Northern Gateway emergency manager, said, “Adding these four or five tugs to the north coast provides a rescue capability that doesn’t exist in this format. So for any large commercial vessel that is traveling on our coast, this capacity to protect the waters of the north coast.” Those tugs and Northern Gateway’s plans to station teams at small bases along the coast mean that the company is, in effect, creating a parallel, private coast guard on the BC Coast.
What about the Coast Guard itself? The Harper government had been gutting Coast Guard resources along the coast even before it won its majority. It closed and dismantled the Kitsilano Coast Guard station in Vancouver. There is more dependence on the Royal Canadian Marine Search and Rescue volunteers, who have to raise money locally for modern rescue boats, which cost up to $750,000. The money the government was “generously” giving to RCMSAR had to be split among 70 stations in 42 communities along the coast, as well as its administrative and training staff.
Does anyone notice what is missing from that list? There is no mention of better Coast Guard vessels just to police all the expected tanker traffic on the west coast (whether LNG or bitumen), and no mention of dedicated spill response vessels, which under the “polluter pay” policy will likely be left to private contractors, in the hope that the ships are available at the time of a spill.
How will we know?
Then there is the question of how people will even know whether the 209 conditions are being enforced, or whether the reports demanded by the Joint Review Panel are going to be sitting on the National Energy Board server, ignored.
There is every indication, given the government’s obsession with secrecy, that until there is a disaster the Canadian public will never know what’s going on. Harper’s muzzling doesn’t just cover government scientists; it covers the lowest level of bureaucrats, as District of Kitimat Council found out when low-level DFO bureaucrats refused to appear publicly before council to discuss the risk to the Kitimat River.
So the scenario, according to Mason’s source, is:
“I think once this decision is made, Enbridge could have shovels in the ground the next day,” the member said. “They are ready to go. This means the First Nations could start realizing profits from this right away, as opposed to the promised profits from LNG, which may never materialize. I think they need to think about that.”
While the LNG market is volatile, the “member” forgets that most of the First Nations of British Columbia have opposed the Northern Gateway since Enbridge first floated the idea in 2001. The current LNG rush didn’t start until after Japan shut down its nuclear power plants following the March 2011 earthquake. The first major anti-Enbridge rally, “The Solidarity Gathering of Nations,” was held at Kitamaat Village in May 2010.
Writing off BC
It appears that the Conservatives, in their election strategy, have already written off Gateway opponents:
Still, there is a raw political calculus that needs to be taken into account. Polls measuring support for the project in B.C. vary, but generally have shown that anywhere from 55 to 60 per cent of the province opposes Gateway and 40 to 45 per cent support it. Isn’t that enough to scare off a government that needs critical votes in B.C. to win another majority?
“Let’s say 60 per cent are against it,” he said. “And that vote splits between the Liberals and the NDP come the next election. Who are the 40 per cent going to vote for?”
Mason also speculates that Harper will approve Gateway to stick it to Barack Obama and the delays on Keystone XL. As he points out that’s a political, not an economic decision.
There are civil disobedience classes being held across northwestern BC this month. Access to Information requests by the Vancouver Observer revealed increased RCMP surveillance of the anti-Gateway movement. There has always been talk of a “war in the woods” if the pipeline project is forced on an unwilling population.
So it comes down to a question that Mason and the Conservatives are avoiding. Mason’s source says Northern Gateway is crucial to the national interest:
“At the end of the day, you have to do what’s right, not what’s politically expedient,” he said. “You have to ask: What’s in the best interests of all Canadians?”
So, given all that, will the Harper government leave Enbridge to tough it out on its own?
And will the Harper government, with its bean-counting obsession with balancing the budget, be willing to pay for all that is needed?
There’s lots of marine clay along the pipeline route, laid down by ancient oceans. That brings to mind just one word: quagmire. Not just the wet, sticky BC mud, but a political quagmire.
A new study, based at the University of Alberta, released this week, indicates that natural selection may be making the mountain pine beetle more tolerant of colder temperatures and that the beetle may be evolving the ability to fly longer distances.
A second study, from the Colorado School of Mines, also released this week, is tracking how the extent of pine beetle infected or killed trees in forests is changing ground water and stream flows.
The mountain pine beetle infestation has wreaked havoc in North America, across forests from the American Southwest to British Columbia and Alberta. Millions of hectares of forest have been lost, with severe economic and ecological impacts from a beetle outbreak ten times larger than previous ones.
Dust from beetle killed wood is believed partially responsible for the explosions at the Lakeland Mill in Prince George and the Babine Forest Products mill in Burns Lake. The explosion in early 2012 at the Babine Forest Products mill killed two workers and injured another twenty. The Lakeland Mill explosion killed two workers and injured twenty four others.
As part of the fight to contain the mountain pine beetle, scientists recently sequenced the pine beetle genome.
Using that genome, Jasmine Janes and colleagues at the University of Alberta, with assistance from the University of British Columbia and the University of Northern British Columbia, used genetics to track how the pine beetle was able to expand its range so rapidly. The study was published in Molecular Biology and Evolution.
Studied at molecular level
While teams of researchers have tracked the path of the pine beetle across BC on the ground, how the beetle spread so easily “is only beginning to be understood at the molecular level,” the study says.
Pine beetles were collected from 27 sites in Alberta and British Columbia. The University of Alberta scientists were especially interested in how the pine beetle was able to jump across the Rockies, something that earlier researchers believed would not happen.
By looking at the genetic markers, the team concluded that the pine beetle may have been able to spread by adjusting its cellular and metabolic functions to better withstand cooler climates and facilitate a larger geographic dispersal area.
In an e-mail to Northwest Coast Energy News, Janes said the research looked at genomic signatures to find out how the beetle had been able to spread into Alberta: where did it come from, what route did it take, and how did it overcome the physical and climatic barriers that had always been assumed to stop it?
The research discovered that there are two genetically different populations of pine beetles, one from the south of British Columbia and Alberta and one in the north. Another group of pine beetles, found near Valemount, “were harder to classify as being from either north or south genetically. The beetles in this area were showing higher genetic diversity.”
The pine beetle has always been around and killed older trees (called “low quality hosts”), helping to renew the ecosystem. There were also larger five-year infestations that occurred on a 20 to 40 year cycle. The current pine beetle “epidemic” in BC has gone on now for more than 20 years, with the pine beetle “observed in previously unrecorded numbers” over a much wider area.
It is generally believed that climate change, by bringing milder winters, has helped the pine beetle survive.
The study notes:
Successful establishment of mountain pine beetle on pure jack pine in northern Alberta has raised concerns that the mountain pine beetle will continue to expand its range into the vast boreal forest of jack pine that extends across North America from the Northwest Territories to the Atlantic Coast.
The team concludes that one group of the beetles originated in southwestern BC, perhaps from Whistler or Manning Park. Then the beetles spread north up the west coast of BC toward Houston “with rapid population size increases in central and northern BC and then dispersed long distances with prevailing winds toward the east and Alberta.”
Some of the beetles from southwestern BC appear to have taken a different route, moving more slowly eastward in the south of BC toward the Crowsnest Pass area and then moving northward along the base of the Rockies, Janes said, adding, “Our research suggests that these two routes have then met in the middle again, around Valemount and that is why we see the genetic patterns across the landscape that we observed.”
The second part of the research was answering the question of how the beetles were able to do this. How could they withstand the colder temperatures to spread further north and east?
To answer that, the scientists used the same genetic markers (single nucleotide polymorphisms) to conduct “selective sweeps” of the beetle genome. The sweeps look for unusual genetic markers that could indicate the beetles are under “selective pressure.”
The study looked at specific functions in the beetle: the genes that govern “actin filaments” that “control muscle contractions like shivering and moving wings”; “the synthesis of cholesterol that provides energy for metabolic activities” and transport of ions across cell membranes.
Normally, female pine beetles can only fly short distances to find a new host tree to lay eggs. They can travel longer distances if they are up in the tree canopy and are then carried by the wind. Pine beetles strong enough to fly longer distances on their own are a sign that the threat is likely evolving.
The study concludes that Canadian Mountain Pine Beetle range expansion:
may continue as populations are currently exhibiting signals of selection. These signals suggest ongoing adaptation of metabolic and cellular processes that could potentially allow them to withstand colder temperatures, shift developmental timing and facilitate longer dispersal flights.
Janes said further research is required to fully validate and understand these signatures of selection, but it does suggest that the beetle is adapting and that is why it “may have been able to breach the Rocky Mountains.”
Water and trees
In Colorado alone, the mountain pine beetle has killed pine trees across more than 3.4 million acres. The new research findings show the consequences of an obvious observation: dead trees don’t drink water.
The Colorado study asked how all those dead trees are changing stream flow and water quality.
“The unprecedented tree deaths caused by these beetles provided a new approach to estimating the interaction of trees with the water cycle in mountain headwaters like those of the Colorado and Platte Rivers,” said Reed Maxwell, a hydrologist at the Colorado School of Mines.
Maxwell and colleagues have published results of their study of beetle effects on stream flows in this week’s issue of the journal Nature Climate Change.
As the trees die, they stop taking up water from the soil, a process known as transpiration. Transpiration is the movement of water through a plant and its evaporation from leaves, stems and flowers.
The “unused” water then becomes part of the local groundwater and leads to increased water flows in nearby streams.
“Large-scale tree death due to pine beetles has many negative effects,” says Tom Torgersen of the US National Science Foundation’s Directorate for Geosciences and program director for the NSF’s Water, Sustainability and Climate program.
“This loss of trees increases groundwater flow and water availability, seemingly a positive,” Torgersen says.
“The total effect, however, of the extensive tree death and increased water flow has to be evaluated for how much of an increase, when does such an increase occur, and what’s the water quality of the resulting flow?”
Under normal circumstances, green trees use shallow groundwater in late summer for transpiration.
Red- and gray-phase trees (those affected by beetle infestations) stop transpiring, leading to higher water tables and greater water availability for groundwater flow to streams.
The Colorado study shows that the fraction of late-summer groundwater flows from affected watersheds is about 30 per cent higher after beetles have infested an area, compared with watersheds with less severe beetle attacks.
“Water budget analysis confirms that transpiration loss resulting from beetle kill can account for the increase in groundwater contributions to streams,” write Maxwell and scientists Lindsay Bearup and John McCray of the Colorado School of Mines, and David Clow of the U.S. Geological Survey, in their paper.
Dead trees create changes in water quality
“Using ‘fingerprints’ of different water sources, defined by the sources’ water chemistry, we found that a higher fraction of late-summer stream flow in affected watersheds comes from groundwater rather than surface flows,” says Bearup.
“Increases in stream flow and groundwater levels are very hard to detect because of fluctuations from changes in climate and in topography. Our approach using water chemistry allows us to ‘dissect’ the water in streams and better understand its source.”
With millions of dead trees, adds Maxwell, “we asked: What’s the potential effect if the trees stop using water? Our findings not only identify this change, but quantify how much water trees use.”
An important implication of the research, Bearup says, is that the change can alter water quality.
The new results, she says, help explain earlier work by Colorado School of Mines scientists. “That research found an unexpected spike in carcinogenic disinfection by-products in late summer in water treatment plants.”
Where were those water treatment plants located? In bark beetle-infested watersheds.
The TRIA project, a collaboration of researchers at the University of Alberta, the University of British Columbia, the University of Montreal and the University of Northern British Columbia, investigates the physiology, genetics and ecology of all three players in the mountain pine beetle system: the pines (lodgepole and jack pine, and their hybrids), the beetle and the fungus (several types).
The Colorado study is funded by the National Science Foundation’s (NSF) Water, Sustainability and Climate (WSC) Program. WSC is part of NSF’s Science, Engineering and Education for Sustainability initiative.
Using more wood in the construction of buildings and bridges, and thus reducing the amount of steel and concrete required, would substantially reduce global carbon dioxide emissions and fossil fuel consumption, according to a Yale University study.
The idea is that using wood would reduce the energy required to produce steel and concrete, and therefore cut greenhouse gas emissions.
The study says that sustainable management of wood resources would allow proper management of forests while also reducing fossil fuel burning.
The results were published in the Journal of Sustainable Forestry.
Scientists from the Yale School of Forestry & Environmental Studies (F&ES) and the University of Washington’s College of the Environment evaluated a range of scenarios, including leaving forests untouched, burning wood for energy, and using various solid wood products for construction.
The researchers calculated that the amount of wood harvested globally each year (3.4 billion cubic meters) is equivalent to only about 20 percent of annual wood growth (17 billion cubic meters), and much of that harvest is burned inefficiently for cooking.
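As a quick sanity check on that proportion, the two totals reported above do work out to roughly one-fifth; this is simply arithmetic on the study’s reported figures, not additional data:

```python
# Check the harvest fraction implied by the study's reported totals.
annual_harvest_m3 = 3.4e9   # cubic meters of wood harvested globally per year
annual_growth_m3 = 17e9     # cubic meters of annual global wood growth

fraction = annual_harvest_m3 / annual_growth_m3
print(f"Harvest is {fraction:.0%} of annual growth")  # Harvest is 20% of annual growth
```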
They found that increasing the wood harvest to the equivalent of 34 percent or more of annual wood growth would have profound and positive effects:
Between 14 and 31 percent of global CO2 emissions could be avoided by preventing emissions related to steel and concrete; by storing CO2 in the cellulose and lignin of wood products; and other factors.
About 12 to 19 percent of annual global fossil fuel consumption would be saved, including savings achieved because scrap wood and unsellable materials could be burned for energy, replacing fossil fuel consumption.
Wood-based construction consumes much less energy than concrete or steel construction. For example, manufacturing a wood floor beam requires 80 megajoules (mj) of energy per square meter of floor space and emits 4 kilograms (kg) of CO2. By comparison, for the same square meter, a steel beam requires 516 mj and emits 40 kg of CO2, and a concrete slab floor requires 290 mj and emits 27 kg of CO2. Through efficient harvesting and product use, more CO2 is saved through the avoided emissions, materials, and wood energy than is lost from the harvested forest.
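Those per-square-meter figures can be put side by side; the numbers below are just the ones quoted from the study, and the ratios follow directly from them:

```python
# Embodied energy (MJ) and CO2 emissions (kg) per square meter of floor,
# as quoted from the study for each structural option.
materials = {
    "wood beam": (80, 4),
    "steel beam": (516, 40),
    "concrete slab": (290, 27),
}

wood_mj, wood_co2 = materials["wood beam"]
for name, (mj, co2) in materials.items():
    print(f"{name}: {mj / wood_mj:.2f}x the energy, {co2 / wood_co2:.2f}x the CO2 of wood")
```

A steel beam comes out at roughly six and a half times the energy and ten times the CO2 of the wood beam, with the concrete slab in between.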
“This study shows still another reason to appreciate forests — and another reason to not let them be permanently cleared for agriculture,” said Chadwick Oliver, Pinchot Professor of Forestry and Environmental Studies, Director of the Global Institute of Sustainable Forestry at F&ES and lead author of the new study. “Forest harvest creates a temporary opening that is needed by forest species such as butterflies and some birds and deer before it regrows to large trees. But conversion to agriculture is a permanent loss of all forest biodiversity.”
The manufacture of steel, concrete, and brick accounts for about 16 percent of global fossil fuel consumption. When the transport and assembly of steel, concrete, and brick products is considered, its share of fossil fuel burning is closer to 20 to 30 percent, Oliver said.
Reductions in fossil fuel consumption and carbon emissions from construction will become increasingly critical as demand for new buildings, bridges and other infrastructure is expected to surge worldwide in the coming decades with economic development in Asia, Africa, and South America, according to a previous F&ES study. And innovative construction techniques are now making wood even more effective in bridges and mid-rise apartment buildings.
According to Oliver, carefully managed harvesting also reduces the likelihood of catastrophic wildfires.
And maintaining a mix of forest habitats and densities in non-reserved forests — in addition to keeping some global forests in reserves — would help preserve biodiversity in ecosystems worldwide, Oliver said. About 12.5 percent of the world’s forests are currently located in reserves.
“Forests historically have had a diversity of habitats that different species need,” Oliver said. “This diversity can be maintained by harvesting some of the forest growth. And the harvested wood will save fossil fuel and CO2 and provide jobs — giving local people more reason to keep the forests.”
The article, “Carbon, Fossil Fuel, and Biodiversity Mitigation with Woods and Forests,” was co-authored by Nedal T. Nassar of the Yale School of Forestry & Environmental Studies and Bruce R. Lippke and James B. McCarter of the University of Washington.
The herring, now dwindling on the Pacific Coast, was once “superabundant” from Washington State through British Columbia to Alaska, and that is a warning for the future, a new study says.
A team of scientists led by Simon Fraser University argues that the archaeological record on the Pacific Coast offers a “deep time perspective” going back ten thousand years that can be a guide for future management of the herring and other fish species.
The archaeological study looked at 171 First Nations’ sites from Washington to Alaska, recovering and analyzing 435,777 fish bones from various species.
Herring bones were the most abundant and dating shows that herring abundance can be traced from about 10,700 years ago to about the mid-nineteenth century with the arrival of Europeans and the adoption of industrial harvesting methods by both settlers and some First Nations.
That means herring were perhaps the greatest food source for First Nations for ten thousand years, surpassing the “iconic salmon.” Herring bones were the most frequent, appearing at 56 per cent of the sites surveyed, and made up 49 per cent of the bones at sites overall.
The study is one of many initiatives of the SFU-based Herring School, a group of researchers that investigates the cultural and ecological importance of herring.
“By compiling the largest data set of archaeological fish bones in the Pacific Northwest Coast, we demonstrate the value of using such data to establish an ecological baseline for modern fisheries,” says Iain McKechnie. The SFU archaeology postdoctoral fellow is the study’s lead author and a recent University of British Columbia graduate.
Co-author and SFU archaeology professor Dana Lepofsky states: “Our archaeological findings fit well with what First Nations have been telling us. Herring have always played a central role in the social and economic lives of coastal communities. Archaeology, combined with oral traditions, is a powerful tool for understanding coastal ecology prior to industrial development.”
From their ancient data-catch, the researchers drew concrete evidence that long-ago herring populations were consistently abundant and widespread for thousands of years. This contrasts dramatically with today’s dwindling and erratic herring numbers.
“This kind of ecological baseline extends into the past well beyond the era of industrial fisheries. It is critical for understanding the ecological and cultural basis of coastal fisheries and designing sustainable management systems today,” says Ken Lertzman, another SFU co-author. The SFU School of Resource and Environmental Management professor directs the Hakai Network for Coastal People, Ecosystems and Management.
The paper says that the abundance of herring is additionally mirrored in First Nations’ place names and origin narratives. The researchers give the example of the 2,400-year-old site at Nulu, where herring made up about 85 per cent of the fish found in local middens. In Heiltsuk oral tradition, it is at Nulu where Raven first found herring. Another site, 25 kilometres away at the Koeye River, has only about 10 per cent herring remains and is not associated with herring in Heiltsuk tradition.
(In an e-mail to Northwest Coast Energy News, McKechnie said “there is a paucity of archaeological data from Kitimat and Douglas Channel. There is considerable data from around Prince Rupert, the Dundas Islands and on the central coast Namu/Bella Bella/ Rivers Inlet area and in southern Haida Gwaii.”)
The study says that the archaeological record indicates that places with abundant herring were consistently harvested over time, and suggests that the areas where herring massed or spawned were more extensive and less variable in the past than today. It says that even if there were natural variations in the herring population, the First Nations harvest did not affect the species overall.
Many coastal groups maintained family-owned locations for harvesting herring and herring roe from anchored kelp fronds, eel grass, or boughs of hemlock or cedar trees. Herring were also harvested at other times of the year when massing in local waters, but most ethnohistorical observations identify late winter and springtime spawning as the key period for harvesting both roe and fish.
The herring and herring roe were either consumed or traded among the First Nations.
Sustainable harvests were encouraged by building kelp gardens, wherein some roe-covered fronds were not collected, by minimizing noise and movement during spawning events, and by elaborate systems of kin-based rights and responsibilities that regulated herring use and distribution.
Industrial harvesting and widespread consumption changed all that. Large numbers of herring were harvested for rendering into oil or meal. By 1910, the problem was already becoming clear: in that year British Columbia prohibited the reduction of herring for oil and fertilizer. There were reports at the time that larger bays on the Lower Mainland were “being gradually deserted by the larger schools where they were formerly easily obtained.”
But harvesting continued; in 1927, the fishery on eastern Vancouver Island processed 31,103 tons of herring. The SFU study notes that this is roughly twice the harvest rate for 2012 and would also be about 38 per cent of the current herring biomass in the Strait of Georgia.
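For scale, the ratios the study cites imply rough figures for the modern fishery. The implied values below are back-calculated from those ratios, not numbers stated in the study itself:

```python
# Rough figures implied by the ratios the SFU study cites for the 1927
# eastern Vancouver Island herring fishery. The implied values are
# back-calculated here, not reported directly by the study.
harvest_1927_tons = 31_103

# 1927 was "roughly twice the harvest rate for 2012":
implied_2012_harvest = harvest_1927_tons / 2

# 1927 was "about 38 per cent of the current herring biomass":
implied_current_biomass = harvest_1927_tons / 0.38

print(f"Implied 2012 harvest: ~{implied_2012_harvest:,.0f} tons")
print(f"Implied Strait of Georgia biomass: ~{implied_current_biomass:,.0f} tons")
```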
In Alaska, reduction of herring began in 1882 and reached a peak of 75,000 tons in 1929.
As the coastal populations dwindled, as with other fisheries, the emphasis moved to deeper water. By the 1960s, the herring populations of British Columbia and Washington had collapsed. Canada banned herring reduction entirely in 1968; Washington followed in the early 1980s.
In the 1970s, the herring population off Japan collapsed, opening up demand for North American roe. The roe fishery targeted female herring just as they were ready to spawn, further reducing the herring population, so that the fishery is now limited to just a few areas, including parts of the Salish Sea and the waters off Sitka and Togiak, Alaska.
The First Nations food, social and ceremonial herring fishery continues.
Government fishery managers, scientists, and local and indigenous peoples lack consensus on the cumulative consequences of ongoing commercial fisheries on herring populations. Many First Nations, Native Americans, Alaska Natives, and other local fishers, based on personal observations and traditional knowledge, hypothesize that local herring stocks, on which they consistently relied for generations, have been dramatically reduced and made more difficult to access following 20th century industrial fishing.
Deep time perspective
The SFU study says that some fisheries managers suggest the herring population has simply shifted to other locations; other possible causes include climate change and the rebounding of predator populations.
But the study concludes that:
Our data support the idea that if past populations of Pacific herring exhibited substantial variability, then this variability was expressed around a high enough mean abundance such that there was adequate herring available for indigenous fishers to sustain their harvests but avoid the extirpation of local populations.
These records thus demonstrate a fishery that was sustainable at local and regional scales over millennia, and a resilient relationship between harvesters, herring, and environmental change that has been absent in the modern era.
Archaeological data have the potential to provide a deep time perspective on the interaction between humans and the resources on which they depend.
Furthermore, the data can contribute significantly toward developing temporally meaningful ecological baselines that avoid the biases of shorter-term records.
Other universities participating in the study were the University of British Columbia, University of Oregon, Portland State University, Lakehead University, University of Toronto, Rutgers University and the University of Alberta.