Wednesday, 22 March 2017

Health of geophysical industry critical to market recovery

Nikki C. Martin
President
International Association of Geophysical Contractors




Energy is one of the most important markets in the world, and the geophysical industry is the foundation of safe, affordable, and accessible oil and gas exploration and development. While OPEC’s landmark agreement at the end of the year to cut oil output boosted crude prices and optimism for some semblance of market certainty, no one can predict what the oil services market holds for 2017. One can be certain, however, that global energy demand will continue to increase, and that demand cannot be met without the geophysical industry.

From discovery to delivery, a viable geophysical industry is essential to providing energy resources to the world. The geophysical industry has enabled discoveries of significant energy resources previously overlooked or thought impossible, beneath thick salt and within shale formations, helping to double the world’s proved oil and gas reserves since the 1980s.

For decades, the geophysical industry has pioneered technology providing the blueprint for locating and producing safe, affordable energy and the more than 6,000 petroleum-based products indispensable to everyday life. Oil and gas powers the equipment needed to construct cities, the equipment to harvest crops and produce food, and the hundreds of millions of vehicles moving citizens and goods around the world. It is the foundational material for innovative and life-saving technology, from mobile phones to medical imaging devices.

Although the industry is currently in challenging times, the demand for these products and energy is not going away, and fossil fuels will dominate the energy supply for decades, requiring the ability to explore and develop resources currently out of reach. Critical resource plays are moving farther offshore and into more complex onshore formations.

Members of the International Association of Geophysical Contractors (IAGC), the cornerstone of the energy industry, are driving technological innovation, reducing risk, and enhancing recovery. Even in the midst of the downturn, they have not let technology become stagnant, demonstrating a relentless pursuit to innovate and improve, so the discoveries that are impossible today are possible tomorrow.

Our members strive to provide trusted experience, improved confidence, and reduced uncertainty in the interpretation of the subsurface for their clients. Likewise, IAGC works to reduce regulatory uncertainty by promoting and ensuring a safe, environmentally responsible, and competitive exploration industry. The industry’s viability hinges not only on navigating one of the longest crises in the oil services market, but also overcoming the many present and real regulatory challenges pushed by the anti-fossil fuel lobby around the world.

Unfortunately, attacks from environmental extremists do not diminish with the price of oil, but resources do. The geophysical industry is often the first expenditure at the earliest stage of exploration and the first presence of oil and gas in a geographic area. As a bellwether for the service industry, it has been the hardest hit by the current downturn and the target of an increasingly anti-development environmental activist agenda.

Environmental groups are disseminating misinformation about the geophysical industry because of its critical role in exploration and development, saying seismic exploration is the gateway to oil and gas drilling. They believe if they stop seismic exploration, they can halt development.

These extremists are increasingly advocating for application of the precautionary principle to exploration and development activities around the globe. This approach gives more weight to evidence suggesting any effect of a proposed activity than to evidence showing no effect; where effects are disputed, they argue, the activity should be resisted without boundary or reason.

This principle resists decision-making based upon the “best available science,” mandated by US law. In practice, however, we see increasing challenges to exploration industry activities calling for “precautionary” mitigation measures and agency decisions, regardless of the proposed activities’ actual limited impact on the environment or affected species. In the Mediterranean, environmental activists attempted unsuccessfully to oust IAGC from a treaty group after it urged science-based mitigation, rather than strict adherence to the precautionary principle. And even in the US, the principle is flexing its muscle.

The Obama administration recently denied applications to conduct seismic surveying in the Atlantic, kowtowing to environmental activists and substituting politics for science. The decision directly contradicted the Bureau of Ocean Energy Management’s own repeated findings that there is no scientific evidence of sound from seismic surveys adversely impacting marine animal populations, the environment, commercial fishing, or coastal communities. IAGC is working to reverse this egregious decision and see these surveys proposed for the Atlantic permitted without delay.

Since IAGC increased its focus on regulatory and government engagement and advocacy just over three years ago, it has been shaping and leading the oil and gas industry’s dialogue on the geophysical industry’s interaction with the environment, and engaging government and regulatory entities with credible scientific, technical, and legal analysis. IAGC will continue to advocate for informed energy policies that are based on the best available data, so that exploration is supported by scientific impact analyses, and development decisions are made with an updated understanding of the resources available.

As the oil services market recovers, be assured that the IAGC is working now to ensure that the industry is positioned for future success and the exploration industry still has the opportunity to operate onshore and offshore around the world for decades to come. And as new resource plays are discovered, and existing fields are yielding more than anticipated, one will find the geophysical industry at the forefront of turning yesterday’s overlooked and impossible into today’s discoveries.


Monday, 20 March 2017

Delivering Change: Service And Supply Companies Finding Ways To Thrive In The New Normal

By: Maurice Smith
Published By: Daily Oil Bulletin
March 20th, 2017


It's official. Cautious optimism has crept into the oil and gas industry entering 2017 as the price of oil hovers around $50 per bbl and companies are looking to modestly grow spending and increase drilling activity as they align to the new normal of improving, if not exceptional, commodity prices going forward.

At least, that is the feeling of most of the respondents to the Grant Thornton-sponsored JWN Service & Supply 2017 Outlook Report: Delivering change: aligning with a new normal. The comprehensive report, which includes the results of a survey of over 350 representatives of the service and supply industry, will be released in April.

One thing is clear, the report found -- service and supply companies have to learn how to operate in a new normal in order to not only survive, but to thrive. It was a theme that played out at two industry workshops held in conjunction with the survey.

Workshops were held in Edmonton and Calgary in January. In order to facilitate open communication, the workshops were conducted under the Chatham House Rule, which allows participants the freedom to voice ideas without their statements being attributed to them as individuals or organizations.

Other themes to emerge at the workshops included the necessity to continue to restructure to remain competitive in the new normal; the need to attract new employees who are adaptable to the new environment and who will be the leaders of tomorrow; the need to build continuity and succession planning into the enterprise; and the benefits that can be derived from diversifying into new products, services, industries and geographies.

For many service and supply companies, aligning to the new normal revolves around a new focus on meeting the needs of the customer: building a better product or service, building trust and open lines of communication with the customer, and approaching deals as a win-win proposition for both sides. Keeping and satisfying present customers takes priority over aggressively chasing new ones.

Workshop participants spoke of taking a customer-facing approach, increasing co-operation with competitors and/or like businesses and finding ways to "be the best contractor."

"Collaboration has been a focus for us this year," said one participant. "You don't want to give your customer incentive to go find another supplier. You want to come closer to them to build trust for your product."

"We are more customer-oriented and listen to them to deliver solutions as required," said another.

With activity expected to experience an uptick if crude oil prices remain firm, there is increased focus on employee training and hiring practices, and on creating the right workforce for the new normal. Many companies are engaged in long-term planning for how to attract and retain new talent, aiming to be less reactive and more stable going forward, and finding ways to build company loyalty.

One participant put it this way: "There's been a lot of change in how we do business. Some of it is driven internally -- doing so much with so few -- and coming up with better ways to do things. We are hiring too. We are bringing in different people than we let go, with different skillsets. They are more open-minded, less specific, and not age-dependent. As we reduced staff, people that were let go were those who were too specific and unable to 'cross' [roles and skills]."

Said another: "Companies are looking for people who are open, adaptable [and] understand all of it -- not just one very specific area. A mix of technical and business/sales/marketing [expertise]. The people who can do both are very rare."

One company said it was focused on retooling and cross-training its staff, and "rebuilding its bench strength" as it looks forward to 2017. It is also exploring the option of building a technology platform to make it easier to connect contractors, vendors and customers.

Lessons learned from the boom and bust cycle have also played into new hiring practices. "We take company culture seriously. The culture suffered when we were growing. During growth, it was all about finding warm bodies. We didn't really look at whether they would be a good fit. Now, we've been able to concentrate on culture in a positive way. It's important to hire people that 'fit' into company culture."

The cyclical nature of the business also puts succession planning and mentorship increasingly in the spotlight. The current environment makes it a good time to "have a good look at who you have and develop them for the future."

"Now it's about building the culture internally and spending more time mentoring and training ... building more junior assets so that when they get busy, they are ready."

Having the right workforce also entails creating a healthy work environment to attract and keep the best candidates. "[We are] focused on creating an environment where employees thrive and love coming to work," one said. "People prefer a steady job, stability and good benefits rather than high pay, high volatility. [The priority is in] engaging employees and giving them a voice to be active participants for changes within the organization."

"If you harness their interest, they'll give you 200 percent," said another. "If your employees are engaged, they will appreciate that [and will be loyal]. Companies that have not engaged their employees, [those employees] are just waiting around for something better and waiting to jump ship."

Cross-training is another big priority, even more critical with the reductions in staffing that have occurred in recent years. "Your job is what we require you to do, not what your title says."

"Today, personal development/training is not optional, it is mandatory," another added.

Companies need to be proactive in transferring knowledge from those who will be leaving the workforce to their successors as increasing numbers of baby boomers hit retirement age, many participants said. While large companies tend to have processes in place to deal with continuity, many smaller companies do not, putting knowledge retention at risk when staff turnover occurs.

"You need to capture the [company] knowledge and keep it with the company, not the person. Lots of times people walk out the door with their laptops [containing that knowledge] -- we see this all the time," offered one workshop partaker.
"Look at all the processes and document them all."

"Salespeople should document their relationships," another offered.

Standard procedures can be put in place to ensure that important data is not lost and that knowledge will be passed on in a systematic way. "There's a lot of unleveraged data," said one executive. It may "sit in someone's spreadsheet buried away -- data would be there but value diminishes if no one knows what it means. Even if it's on the company server, it may be impossible to find."

Technology could also play a role in maintaining valuable data. Field operators can collect data on well sites with a smartphone or laptop and sync it automatically to the cloud, for example.

Meanwhile, diversification was found to be one route to prosperity for some companies in the downturn. "We added complementary services and with a bit of trial and error it proved to be hugely beneficial. We broadened our scope of operations beyond Alberta and focused on tapping the U.S. market."

Another company said it used the downturn as an opportunity to invest in research and development to create a new product it plans to launch into the Canadian and international markets in 2017.

Others said they were leveraging government agencies and not-for-profit organizations to obtain funds and expertise to assist in going global and accessing new markets, and are now well positioned to take their business to the next level in 2017.

The Service & Supply 2017 Outlook Report will be released the first week of April, with launch events in Calgary April 4 and in Edmonton April 6. Key findings and insights from the report, including recommendations essential for companies aligning to the new normal, will be presented. For more information or to register, visit:

http://www.jwnenergy.com/events/service-supply-2017-outlook-report-launch-yyc/




Wednesday, 15 March 2017

Reality bites: Carbon pricing in Canada and beyond

By: Jason Clemens and Kenneth Green - Analysts with the Fraser Institute
Published: Fort Nelson News


TROY MEDIA - There's a general, indeed strong, consensus within the economics community that a properly designed carbon tax can both reduce emissions and improve the economy. We broadly agree with this academic analysis. The problem is that carbon taxes in the real world have to be implemented through a political system that deviates substantially from the academic ideal.
Economists tend to agree that the most efficient way to manage emissions is by placing a price on them that reflects the social costs of the emissions. By placing a price on carbon, emitting firms are incentivized to introduce emission-reducing technologies or change their production. In other words, the introduction of a price on carbon creates incentives for firms (and individuals) to respond to the social costs of emissions.

Critically, however, there are several key assumptions necessary for this approach to be efficient. First, the introduction of a carbon price must replace, not be in addition to, existing regulations.

Second, revenues from carbon pricing (i.e., the tax) must be used in their totality to reduce other more costly taxes such as marginal personal or business income taxes. The idea is that revenues from carbon pricing are used to reduce other more damaging taxes so there's a net improvement in incentives for investment and entrepreneurship, which yields stronger economic growth.

Third, and related to the second, revenues from the carbon tax should not be used to subsidize substitutes (wind, solar, or other alternative energies) for carbon-emitting activities, since the whole point of introducing the price on carbon is to allow the market to determine the optimal substitutes.
No jurisdiction in or outside of Canada, including much-heralded British Columbia, meets these assumptions. No province or country has introduced an "ideal" carbon-pricing system, and thus the benefits from it will necessarily be less than theory suggests.

No jurisdiction that introduced carbon pricing has eliminated the corresponding command-and-control regulations. Europe, California, and all of the Canadian provinces have retained most, if not all, of their existing regulations after introducing carbon pricing.

Moreover, no province or country has maintained revenue neutrality for carbon pricing.
Perhaps the closest and certainly most talked about is B.C., which maintained revenue neutrality for the first five years of its carbon tax. However, beginning in 2013-14, B.C.'s carbon tax began generating revenues in excess of the legitimate tax offsets. Indeed, the government's own projections indicate that the carbon tax will generate almost $900 million in net revenues over a six-year period.
Finally, many jurisdictions, including Ontario, specifically use carbon-pricing revenues to subsidize alternative energy sources such as wind and solar. Subsidizing substitutes for carbon-intense energy, such as wind and solar, short-circuits the market process envisioned by carbon-pricing advocates by having governments choose the "right" solution.
Further complicating the economics of carbon pricing are considerations regarding competitiveness and potential leakage. Specifically, adding a carbon tax raises costs for firms in carbon-intensive industries such as agriculture, manufacturing, and resources relative to competitors in jurisdictions without such taxes, creating incentives for firms to shift production from jurisdictions with carbon taxes to those without, which would damage the Canadian economy without providing any environmental benefit. This is made all the more pressing now that it's clear the United States will not introduce a national carbon tax.
Our own federal government has mandated carbon pricing for all provinces by 2018. It's imperative, therefore, that we understand the realities of carbon pricing as opposed to its theoretical possibilities. The politically altered carbon pricing observed in and outside of Canada, including B.C.'s carbon tax, will inevitably deliver lower benefits than the theoretical models predict or advocates suggest - and do real harm to the Canadian economy.

Tuesday, 7 March 2017

SCIENTIFIC LOGIC AND CLIMATE CHANGE

By: Henry Lyatsky, P.Geoph., P.Geol., Lyatsky Geoscience Research & Consulting Ltd.
Published: CSEG Recorder (page 26)


                             
“…There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.” – DONALD RUMSFELD


The planet Earth is cooling. The interglacial climate period that has kept us warm for the last several thousand years, allowing civilization to rise and flourish, is over. Earth is about to return to the deep-freeze conditions of the last ice age that ended some 12,000 years ago, when all of Canada and much of Europe were covered by the sort of thick continental glaciers that today blanket remote Greenland and Antarctica.

This scientific “truth” was drilled into me, a young geology undergrad in Calgary, by esteemed professors in basic courses at the beginning of the 1980s.

In the 1970s the media were abuzz with global-cooling scares. Cooling was supposedly a scientific fact.

Thankfully, the old fears of an impending new ice age have so far proved unfounded. But now, global warming has replaced the global-cooling craze.

This article is not, by any means, a final word. I am neither a climatologist nor a logician. My purpose is to encourage the readers to explore scientific logic, to always be skeptical, to question the methods and the motives, and to always be ready to wonder and be surprised.

Empiricism

Natural science is empirical. Empiricism says knowledge is derived from what we can sense or observe. Knowledge is gained by passive observation of natural occurrences or by active, preferably controlled, experiments.
Epistemology is the study of human knowledge. It deals with how we know things.
Much philosophical ink has been spilled on these subjects over the past several millennia. Too much of that ink flowed uselessly or is irrelevant to the discussion at hand. Only a few key points are summarized below.
Karl Popper (1950, 1968) was probably the past century’s foremost empirical philosopher. He taught that a legitimate scientific hypothesis must be falsifiable, i.e., capable of being disproved by subsequent observations or experiments. If a “theory rules out certain possible occurrences, … it will be falsified if these possible occurrences do in fact occur”.
An abstract hypothesis that is not capable of being falsified is not science at all. It is metaphysics – something worthless to science, even damaging, which scientists should avoid like the plague. If one discards falsifiability, then any arbitrary fantasy, postulation or assumption can call itself science. When empirical logical rigor dies, science dies.
In Popper’s scheme, where abstract hypotheses are constantly modified, discarded and created anew to account for accumulating facts, science should advance more or less smoothly. In the actual progress of science, however, there are fits and starts. Some new discoveries are more consequential than others. Besides, as Kuhn (1962) noted, scientists need some common intellectual framework in which to interpret data and communicate with one another. These tentative abstract frameworks, or “paradigms”, can exist for a period of time, as pressure builds up from the growing volume of contrary facts. When the pressure gets too much, a shift occurs to a new paradigm, and the cycle of new discovery and testable conceptualization starts again.
A fact is something we observe. Our observations are necessarily limited and subjective, but this is not an argument to abandon empiricism. A lack of empiricism leads to relativism and arbitrary postulations. On the contrary, the imperfection of observation is an argument to maximize the gathering of facts and tighten the empirical rigor.
Do observed facts represent “reality”? What is a phenomenon? These questions are at least as old as Ancient Greece, going back to Plato’s cave and Heraclitus’ note that “everything changes and nothing remains still” (see also Russell, 1961).
All these caveats complicate empirical thinking, but we have no alternative. Empiricism is the only way to keep ourselves disciplined and grounded in “reality”, whatever that is, as we expand our knowledge and avoid pitfalls.

Deduction and induction

Inductive reasoning works from the particular to the general, as it looks for a falsifiable scheme that explains all available relevant facts. This is the empirical way to develop testable hypotheses.
Deduction works the other way, from the general to the specific, by applying firm rules of logic to reach a conclusion. Deduction can be used to test hypotheses by making falsifiable predictions. In Popper’s words, a fundamental question in empirical science is “how do we test scientific statements by their deductive consequences?”
For example, an abstract hypothesis may deductively predict that the rocks in a particular locality should be Silurian limestone. If Jurassic sandstone is found instead, the hypothesis is falsified. If Silurian limestone does indeed crop out, the hypothesis is by no means proved correct: it has only passed one particular test and awaits many others.
Two major pitfalls can trip up the unwary and the lazy.
1. Canonizing a tentative hypothesis, model or paradigm and treating it as received truth.
2. Applying concept-driven deduction where fact-driven induction is required, which leads to false determinism.
This is where today’s climate science often goes wrong. In geoscience and climatology alike, hypotheses are too often canonized unskeptically. The meticulous collection of facts is lazily disregarded in favor of grand but premature or idle theorizing and fancy-looking modeling. Arbitrary hypotheses are sometimes even used to insist, without evidence, not on what the facts are but on what they should be.





Abstract modeling

“All models are wrong but some are useful” – George Box

A model is an abstraction, a conceptual representation of some phenomenon. It is neither a truth nor an empirical fact. A model is something we formulate. In natural sciences, it is a mentally contrived system, typically composed of linked elements, designed to represent some aspects of an empirically sensed reality.
Parameters are factors that relate a model’s variables to its functions; they define how a function changes in response to the variables.
If enough of a system’s elements and links are known, the variables can be defined or at least constrained. This is the main purpose of scientific modeling. It can be done if a system is sufficiently well determined; i.e., the loose variables do not overwhelm what is known.
If the unconstrained variables are too many and the knowns too few, the system is underdetermined and it defies unique solution.
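The idea of an underdetermined system can be made concrete with a toy sketch (my illustration, not the author's): when the loose variables outnumber the independent observations, very different parameter choices fit the data equally well, and no unique solution exists.

```python
# Toy illustration of underdetermination (hypothetical example, not from
# the article): one observation, two unknown parameters.
def model(a, b):
    # The "model" output depends on two parameters we want to infer.
    return a + b

observed = 3.0  # a single measured value

# Two very different parameter choices reproduce the observation equally well:
candidates = [(0.0, 3.0), (10.0, -7.0)]
for a, b in candidates:
    assert abs(model(a, b) - observed) < 1e-9

# With one known and two unknowns, the data cannot single out "the" answer;
# more independent observations are needed to constrain the system.
```

Adding a second, independent observation (say, a measurement of a - b) would fully determine both parameters; that is the sense in which the knowns must not be overwhelmed by the loose variables.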
An abstract model is only as good as its known parameters, acknowledged variables, and assumptions. What do we assume about the system’s elements? And about ways in which they are linked? Dealing with a complex phenomenon such as climate, with its multitude of unknown factors, influences and links, one must be extremely cautious.
It is commonly lack of knowledge that makes things deceptively appear simpler than they are. Climate is extremely complex, with a multitude of interconnected influences and feedbacks.
A conceptual model with too few interlinked elements reduces uncertainty and it may even enable a unique solution, but it may be too coarse to be useful. Simple models might have their uses, however, at the appropriate levels of the phenomenological hierarchy. For example, for all we know, all atoms of the same isotope are the same. For some purposes, their behavior can thus be modeled with some simplicity. Yet, atomic physics and molecular chemistry do not define the variety of geologic features in the rocks: every basin and every mineral deposit is different from any other.
At these higher levels of the phenomenological hierarchy, many small events, such as the Brownian motion of molecules, average each other out and become irrelevant, while new complexity emerges.
The ever-more-rapid advances in computing power enable extremely elaborate and complex models. Unfortunately, modelers sometimes forget the phenomenon they are trying to represent and instead fall in love with the computer-generated abstractions. In the absence of sufficient factual (i.e., observational) constraints, model complexity must never become a goal in itself.
Mathematics is abstract. What is three? A number is an abstract means to count observed occurrences, but it is not these occurrences themselves (as in “three fingers”). Complex mathematics, by itself, does not make a model true. It is a means to an end, not an end in itself. Without adequate factual constraints or realistic parameterization, too much math simply piles one abstraction on top of another.
As with any abstract scientific scheme or hypothesis, be it quantitative or qualitative, a model is empirically tested by its falsifiability. Do the observed facts accord with the model’s predictions?
The manner in which the term “model” is commonly misused in tectonics is highly misleading. In this misusage, what is called a model is merely a genetic tectonic scenario intended to account for some rock configuration. Strictly speaking, these are not formal models, just loose and often speculative hypotheses or even fantasies. If they are not falsifiable, they should not be proposed.
In climatology, unknown variables are legion. It is a common mistake to take a climate model too seriously, even more so to canonize it. An arbitrarily chosen model becomes a false and misleading paradigm, leading scientists astray or diverting them to spend their time fighting to debunk it.

Illogic of the “official” global-warming dogma

Science has been “settled” before, and not just for global cooling. In the 1890s, for example, some very prominent physicists thought that nothing much fundamentally new remained to be discovered in their science: classical mechanics and electromagnetics covered the main bases, and future research would be a matter of elaboration. However, in 1895 Wilhelm Röntgen discovered X-rays, and in 1905 came Einstein.
Science is never settled. There is no last word. A good scientist is always a skeptic, never a denier. Personally, I am agnostic about anthropogenic global warming. I simply don’t know. Given the current state of knowledge, nobody knows – and I am happy to admit it. Clearly, though, the “official” global-warming dogma is bogus.
In contrast to the current debates about climate change, the science was much more compelling in identifying the causes of the depletion of the atmosphere’s ozone layer, which vitally protects us from harmful extraterrestrial radiation. A particular type of man-made molecule was found to be at fault, with a particular sort of chemical reaction. The scientific case was empirically clear and relatively simple, and the consensus was genuine. The international Montreal Protocol in 1987 rightly mandated a world-wide phase-out of the culprit chemicals.

Climate has been changing the entire time there’s been planet Earth, getting colder and warmer by turns. Identification of glacial deposits and facies in ancient sedimentary rocks makes this clear, as do studies of paleo-faunas and paleo-floras. Glacial deposits and landforms are spectacular around Calgary, and the magnificent moraines in Wisconsin are the reason that state’s name is attached to the last period of major glaciation.

What drove past climate changes? Perhaps fluctuations of solar output, or natural changes in the composition and particle load of the atmosphere, or variations in the Earth’s orbit. It has all been going on forever.

A famous Swedish scientist, Svante Arrhenius, in 1896 took a stab at explaining the ice ages in the geologic past. Based on the contemporary knowledge of chemistry and physics, he suggested that an increase of atmospheric CO2 should lead to an increase in the world’s temperature. Since human industrial activity produces CO2 and other gases that supposedly warm the planet through a “greenhouse effect”, modern anti-capitalism activists conclude that the most advanced industrial countries are the culprits behind man-made global warming.

Arrhenius’ scientific idea is a supposition that should be falsifiable, or testable based on observed facts. A tendency or a theoretical potentiality may well deserve serious consideration, but they are not facts. So far, the factual record is mixed. Perhaps it will take a lot longer before we reliably know what is actually happening with the climate.
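Arrhenius' supposition is today often summarized by a logarithmic relation between CO2 concentration and temperature change. The sketch below (my gloss, not a formula from the article) shows why such a prediction is hard to pin down: the outcome hinges almost entirely on an assumed "climate sensitivity" parameter, exactly the kind of loosely constrained variable discussed above.

```python
import math

# Commonly cited logarithmic form of Arrhenius-style reasoning (an
# illustrative assumption, not a settled fact): dT = S * log2(C / C0),
# where S is the assumed warming per doubling of atmospheric CO2.
def delta_t(c_now, c_ref, sensitivity_per_doubling):
    return sensitivity_per_doubling * math.log2(c_now / c_ref)

# For the same CO2 rise (roughly 280 ppm pre-industrial to 400 ppm),
# different assumed sensitivities give very different "predictions":
low = delta_t(400.0, 280.0, 1.5)   # about 0.77 C
high = delta_t(400.0, 280.0, 4.5)  # about 2.32 C
```

The threefold spread between the two results comes entirely from the parameter S, not from the observations, which is precisely the caution about mistaking model output for fact.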

Predictions such as Arrhenius’ are hypotheses. Only observations are facts. Climate models predicting global temperatures a century ahead are too dependent on assumptions to be reliable. They might as well be 10,000-year predictions.

One only needs to follow the news to know the official global-warming dogma stands falsified. No global warming has been measured for the last two decades. The infamous “hockey-stick graph” of the 1990s predicted an imminent period of rapidly accelerating global warming. So far at least, nothing like this has occurred.

But a canonized model must be upheld. We are now told that the “missing” heat is hiding in the world’s oceans, soon to come out with a vengeance. Maybe it is, but in the absence of hard evidence this is not science but merely Popper’s metaphysics.

The Arctic and Antarctic ice was supposed to melt away fast. Setting aside the anomaly of the strong 2015-2016 El Niño, recent years have seen a growth of polar ice, preceded by a reduction since the 1990s. And the Arctic and Antarctic ice does not shrink and grow in tandem. Are the re-grown ice caps here to stay? Will they keep growing? Will they shrink again? It is much too soon to tell. At any rate, our modern factual knowledge of the climate is too limited to permit long-term predictions.

Leonardo DiCaprio, a Hollywood actor, was ridiculed when he proclaimed southern Alberta’s ageless chinook winds to be evidence of global warming. Less farcically, America’s venerable space agency, NASA, is often accused of fudging the data to make the recent decades’ global warming look bigger than it was. This was allegedly done by artificially depressing the graphed temperature values from earlier in the 20th century, so any later increases would look steeper. Many academics defend NASA (they would, wouldn’t they?) but in the absence of open scientific transparency, how does one judge?

A false claim by the alarmists is that 97% of climate scientists believe global warming to be man-made and urgent. The roots of this claim vary, but reportedly, they tend to involve exaggerating the strength of stated convictions and the extent of reported individuals’ alarm. The poll sample is sometimes small. Ask if there is global warming and if any of it is man-made, and most reasonable people’s answers (my own included) would be at least non-negative. However, that is no evidence of a consensus for urgency.

On the other extreme, to deny man-made drivers of climate change completely, as some activists do, is as problematic as it is to postulate them without question. These activists deserve full credit for standing up to the warmist dogma, but their own absolutism itself should invite skepticism. A scientist is not a scientist if he is not a healthy skeptic.

What is going on with the climate? The point is, we don’t know.

Don’t thank your prof

How many of us had a college course in scientific logic? In all my programs, I never did. And if you are not taught, how would you know?

Perhaps the professors themselves don’t know logic well enough to teach it. Or they don’t care, or couldn’t be bothered. Nor is the history of ideas taught much. Ignorance, unless it is checked, propagates from generation to generation.

Much of this has to do with the much-bemoaned (e.g., Bloom, 1987) loss since the 1960s of the classical education. As access to college education has expanded in the past decades, the quality has declined. “More means worse”, as the writer Kingsley Amis used to say. That’s why a B.Sc. degree is no longer quite enough to ensure a strong career, and an M.Sc. could be advisable to improve one’s competitive advantage.

Removal of logical rigor leads to relativism. Everything is deemed subjective (Bloom, 1987; Hughes, 1993). Truth is relative: each person has his own, to be upheld without question in the interest of diversity and tolerance. One idea or theory or value system or even behavior is as good as any other – and who do you think you are to judge?

Apart from the obvious (un)ethical repercussions in the social sphere, such lazy and deconstructive thinking destroys empiricism, and with it the integrity of science.

Thought control

“Does Big Brother exist?”
“Of course he exists. The Party exists. Big Brother is the embodiment of the Party.”
“Does he exist in the same way as I exist?” “You do not exist.”
– GEORGE ORWELL (“Nineteen Eighty-Four”)

In 1633, the great Galileo famously saved himself from the Inquisition by formally recanting his scientific views. He got away with merely house arrest.

The modern system of administering the academia took its essential form in the 1940s. Some of the inspiration is commonly attributed to Vannevar Bush, a prominent American inventor, weapons designer, entrepreneur and public administrator. The intent was to give scholars freedom to work and create. The academia is officially independent, to shield it from the whims of fluctuating public passions and political pressures. Outsiders have limited scope to interfere.

Tenured professors are essentially safe from layoffs. The academics decide for themselves who will be hired into their midst as new faculty. They review each other’s funding applications and publication drafts, and build up each other’s citation counts by mutually quoting each other’s work in their own papers.

In such an opaque and self-enclosed system, all members of the academic cartel have a common interest to maximize “funding” (i.e. their intake of taxpayers’ money) while avoiding external scrutiny. To gain career advancement, each cartel member vitally depends on every other for favorable reviews and citations.

In a tightly closed shop, this is a recipe for self-organizing conformism. The academia’s incestuous self-rule (“academic independence”) is organized around mutual peer review of each other’s work. Collective self-accountability, however, is no accountability at all. We have heard plenty about all this after Climategate.


In such a closed system, open debate, which is the lifeblood of scientific advance, is choked off. Healthy skeptics of the official climate-change dogma are sometimes dismissed pejoratively as “deniers”, which poisonously and falsely hints at an analogy with the denial of the Holocaust.
Insistence on mutual peer review often leads to censorship or self-censorship. Inconvenient dissenters are easy to squeeze out, by killing their grant applications and publication drafts. Evil be to him who evil thinks.
If something undesirable does make it into print, it can simply go unquoted and ignored.

The academia has its failings in geoscience too. Along the Pacific coast of British Columbia and northern Washington state, a number of published studies (e.g., Acharya, 1992; McCrumb et al., 1989; Lyatsky, 1996) suggest that the oceanic Juan de Fuca plate offshore seems to be deforming internally and no longer subducting.

Empirically, the Cascadia subduction zone lacks some of the main features from which subduction zones are normally identified. There is no bathymetric trench(!), the supposed arc volcanism is weak and scattered, earthquake seismicity does not suggest an east-west lithospheric stress, and regional patterns of ongoing coastal subsidence and uplift do not fit the simple pattern expected of subduction.

The conventionally assumed scheme with ongoing, rigid-plate West Coast subduction is obviously falsified. This suggests that the earthquake risk in the Vancouver-Seattle area, while clearly very substantial, might be exaggerated. No active subduction should mean no impending mega-thrust quake.

The scientific work discussing these complications has been published in major peer-reviewed scientific journals and high-end book lines.
However, one finds few if any references to these studies in the literature on West Coast tectonics, not even to rebut them!

Each particular academic failing may have its own specific, even innocent, causes and explanations, and cheap individual finger-pointing is usually not warranted. The overall pattern, though, is clear: scientific work that does not fit the “party line” of the moment can have a hard time getting published or cited.

If maximizing funding and promoting the academic cartel’s agendas is a priority, voters and legislators can be scared into opening their wallets by scientific prophesies of impending doom. Very troubling are some famous and controversial words from the late, prominent climate scientist, Stephen Schneider:
“On the one hand, as scientists we are ethically bound to the scientific method, in effect promising to tell the truth, the whole truth, and nothing but – which means that we must include all doubts, the caveats, the ifs, and the buts. On the other hand, we are not just scientists but human beings as well. And like most people we’d like to see the world a better place, which in this context translates into our working to reduce the risk of potentially disastrous climatic change. To do that we need to get some broad-based support, to capture the public’s imagination. That, of course, entails getting loads of media coverage. So we have to offer up scary scenarios, make simplified, dramatic statements, and make little mention of any doubts we might have. This ‘double ethical bind’ we frequently find ourselves in cannot be solved by any formula. Each of us has to decide what the right balance is between being effective and being honest. I hope that means being both.”
Nonsense. There is no “one hand, other hand” in science. A true scholar does not get to have it both ways. Socrates drank hemlock – rendering himself permanently ineffective – to maintain his intellectual integrity.
Climategate was not a fluke. The self-serving modern academia can no longer be regarded as a credible source of climate science. In today’s “post-truth” intellectual and political environment, where mere shrillness on all sides too often drowns out thought and replaces intellectual rigor, openness is needed more than ever. If sunlight is the best of disinfectants, it is time to throw the windows wide open.

Some political agendas

It is not just in the Leap Manifesto. Lomborg (2001) called it “the Litany”.

The modern, populous, industrial civilization, it is sometimes said in various words, is an unsustainable parasite load on the planet, and strong government action to stop it is only righteous.

Democracy and freedom, say some people on both the far left and the far right, are a Western capitalist sham. Citizens are mere automatons and puppets, as a cabal of big corporations and banks controls the media and brainwashes common people into more consumption than virtue and environmental responsibility would require.

The planet, we are told, is facing a man-made environmental catastrophe, which will soon wipe out countless species and make our current way of life impossible. What sort of catastrophe, exactly, is not important: before global warming, for example, there were the Club of Rome’s neo-Malthusian, now empirically discredited “limits to growth”.


The urgency of the supposed impending catastrophe makes it imperative for the government to take things firmly in hand, redirect or curtail the industrial and business activity, impose punitive taxation, and reduce our living standards and rates of consumption. Individual free choice would be severely restricted. Some virtuous form of communalism would be imposed instead. The age of capitalism, free trade and “neo-liberalism” would be brought to an end.


Because opposition to this radical agenda objectively contributes to imminent planetary destruction, we are told, such opposition must not be allowed to stand in the way of vital progress. There is no time to waste! Global-warming skeptics, we have been told even by some very prominent public figures, belong in jail, or at the very least such skepticism should be made as socially unacceptable as overt racism.

This whole brave new world of post-capitalist environmental justice would presumably be directed by correctly illuminated and selfless persons with necessary powers and a grave responsibility to save the planet. These philosopher-kings would presumably resemble the activists who propose such solutions.


The superficially seductive idea of a rationally organized and virtuous society, run in the name of “the people” but actually from the top down by illuminati, is at least as old as Plato’s “Republic”. Many liberal scholars, not least Popper (1950), have denounced such schemes as a call for tyranny.


Today’s would-be philosopher-kings often find it convenient to dress up in pseudo-environmental green. A former senior UN climate official, Christiana Figueres, has been pilloried in the media for stating the following:

“This is the first time in the history of mankind that we are setting ourselves the task of intentionally, within a defined period of time, to change the economic development model that has been reigning for at least 150 years, since the Industrial Revolution.”

And more. “This is probably the most difficult task we have ever given ourselves, which is to intentionally transform the economic development model for the first time in human history.”


No, this is not at all the first time in history. The Marxists’ previous attempt to do something of this nature littered Eurasia with 100 million corpses and failed in squalor.


This comment from Europe’s former climate commissioner, Connie Hedegaard, received its own share of harsh rebukes.

“Let’s say that science, some decades from now, said ‘we were wrong, it was not about climate’, would it not in any case have been good to do many of the things you have to do in order to combat climate change?”


Perhaps so, but this is a debate of an entirely separate nature, for which dubious science must not be used as a cover.

With previous hints of approval from the Barack Obama administration, some American state-level prosecutors have been investigating ways to prosecute oil companies, think tanks and assorted global-warming “deniers” under the Racketeer Influenced and Corrupt Organizations Act, whose initial intent was to stop organized crime. In the 1970s, would they have prosecuted skeptics of global cooling?


On the heels of the 20th century, this is chilling. Remembering the totalitarian perversions in National Socialist (Nazi) Germany and Communist Russia and China, the radical socio-economic schemes advanced by many climate alarmists today are nothing new. Benito Mussolini, an ex-socialist who implemented one such scheme almost a century ago, said it memorably:

“For if the nineteenth century was a century of individualism (Liberalism always signifying individualism) it may be expected that this will be a century of collectivism, and hence the century of the State.”

Have we not had enough?






REFERENCES

Acharya, H., 1992. Comparison of seismicity parameters in different subduction zones and its applications for the Cascadia subduction zone; Journal of Geophysical Research, v. 97, p. 8831-8842.

Bloom, A., 1987. The Closing of the American Mind; Simon and Schuster.

Hughes, R., 1993. Culture of Complaint; Oxford University Press.

Kuhn, T.S., 1962. The Structure of Scientific Revolutions; University of Chicago Press.

Lomborg, B., 2001. The Skeptical Environmentalist; Cambridge University Press.

Lyatsky, H.V., 1996. Continental-Crust Structures on the Continental Margin of Western North America; Springer-Verlag.

McCrumb, D.R., Galster, R.W., West, D.O., Crosson, R.S., Ludwin, R.S., Hancock, W.E., and Mann, L.V., 1989. Tectonics, seismicity and engineering geology in Washington; in: R.W. Galster (ed.), Engineering Geology in Washington, v. I; Washington Division of Geology and Earth Resources, Bulletin 78, p. 97-120.

Popper, K.R., 1950. The Open Society and Its Enemies (2nd edition); Princeton University Press.

Popper, K.R., 1968. The Logic of Scientific Discovery (2nd English edition); Routledge.

Russell, B., 1961. History of Western Philosophy (new edition); George Allen & Unwin.

About the author


Henry Lyatsky is a Calgary-based consultant who has worked in oil and mineral exploration around North America and overseas. He is the first or sole author of three books (Springer-Verlag) and two atlases (Alberta Geological Survey) on the regional geology and geophysics of western Canada.





Read more in the mid-April RECORDER – FOCUS: Greenhouse Gas/Environmental Geoscience.