
The Courts against Proposition 65


A federal judge has frozen plans to require all products containing the widely used herbicide glyphosate to display a Proposition 65 warning, a landmark ruling that could signal the turning of the tide for California’s nanny-state regulations.  The decision by Federal District Judge William Shubb represents a significant blow both to the much-maligned Proposition 65 and to the organization that accounts for so many of its listings, the International Agency for Research on Cancer (IARC).

Shubb’s ruling is a major victory for states’ rights – due to the size of California’s economy, its overregulation spills over into other jurisdictions – and for farmers’ groups around the country who had pushed back against California’s imposition of these costly and misleading warnings.  Shubb has struck another nail in the coffin of Proposition 65, a law so universally detested that even the leftist L.A. Times came out against it.

A lengthy legal wrangle

Around 1.8 million tons of glyphosate have been used across the U.S. since 1974.  Such a commonly used chemical has naturally demanded rigorous health and safety assessment, and it has repeatedly been certified as non-threatening to humans by regulatory bodies all over the world, including in the U.S., Europe, Canada, Australia, New Zealand, and Japan.

The controversy over the substance arose when IARC – a semi-autonomous branch of the World Health Organization (WHO) based in Lyon, France, which was recently slammed by House Science Committee members for its “manipulation of scientific data” and “shoddy work” – found it “probably carcinogenic” to humans.  Despite the fact that this remains the only major study to reach such a conclusion, IARC’s ruling meant that glyphosate was automatically added to California’s Proposition 65 list, a lengthy catalogue of supposed carcinogens.  Under California state law, any product containing these substances must display a warning stating that the product in question “is known to the State of California to cause cancer,” resulting in a proliferation of inaccurate and scare-mongering signs all over the Golden State.

The new classification had several immediate implications for glyphosate.  First, products containing the herbicide would have been required to carry a Proposition 65 label beginning July 7, 2018.  In addition to this onerous burden, no fewer than 184 plaintiffs sprang up across America, accusing herbicide manufacturers of giving them cancer.  Federal judge Vince Chhabria is currently determining whether these plaintiffs’ cases can proceed, a decision hinging on whether a link between glyphosate and cancer has been “tested, reviewed and published and is widely accepted in the scientific community.”

Judge Shubb’s ruling means that, while glyphosate will remain listed for now under Proposition 65, glyphosate products will no longer have to carry warning labels starting this July.  It also makes it difficult for Judge Chhabria to find that the connection between glyphosate and cancer is “widely accepted in the scientific community.”

IARC: Notoriously unreliable

The rationale behind Judge Shubb’s ruling?  According to Shubb, a label branding glyphosate as “known to cause cancer” would be “misleading at best” and both “factually inaccurate and controversial.”  This is because, in over 40 years of research, the IARC study is the only one to have found glyphosate dangerous to human health.  It’s also because the study itself is mired in controversy.

Influential scientist Aaron Blair withheld key information from the IARC panel studying glyphosate.  Consequently, the IARC did not take into consideration the most comprehensive investigation into the long-term effects of glyphosate on farmers, which found zero evidence linking glyphosate with cancer.

Further muddying the water, there were considerable discrepancies between two drafts of IARC’s glyphosate report.  Someone at IARC redacted portions of the report disagreeing with the organization’s eventual conclusions.  The toxicologist in charge of reviewing the data has since claimed he does not know who made the edits, or why and when.  His story is difficult to verify, since IARC unusually discourages the experts working on its reports from retaining drafts or discussing their work.

The perils of Proposition 65

Given these critical issues with IARC’s review process, as well as conclusions diametrically opposed to those of the rest of the scientific community, Judge Shubb deemed it prudent to freeze the introduction of warning labels.  His decision has made the plaintiffs in the case quietly and cautiously optimistic that the listing itself will also be overturned in due course.  Regardless of the final outcome, this controversy is yet another thorn in the side of Proposition 65 itself, which has been the subject of intense debate in recent years.

Recent attempts to impose Prop 65 signage on commodities as commonplace as coffee and fast food have prompted disbelief and derision from critics, who claim that the overabundance of warning notices is having numerous detrimental effects on society.  In addition to the carbon footprint caused by producing millions of plastic, paper, and metal signs, a toxic litigation racket has sprung up in the state.  Private enforcers of Proposition 65 requirements, colloquially known as “bounty hunters,” have crafted a lucrative career around extorting businesses that have failed to comply with the warning rules.  In 2013 alone, these litigious mercenaries collected a cool $15 million.  Business is booming in California, but for all the wrong reasons.

Finally, and perhaps most concerning of all, the proliferation of warning signs all over California has had the exact opposite effect of what was originally intended.  Instead of reacting with caution and concern to a Proposition 65 notice, desensitized citizens have become jaded by the signs’ omnipresence in gas stations and garages, restaurants, and liquor stores.  The cotton wool intended to protect the everyman is in danger of blinding and smothering him with its ubiquity.  Judge Shubb’s ruling, therefore, might represent a victory not just for the farmers, the states, and scientific integrity, but for common sense itself.


How Regulations Made California's Fires Worse


After raging through almost all of December, the Thomas fire, the largest wildfire ever recorded in California, was finally contained on January 12.  While the worst is behind us (for now), the fact that last year’s wildfires spun so violently out of control puts the spotlight on the Golden State’s government and its lack of fire prevention measures.

The fires across the state caused unprecedented damage and loss of life.  Unsurprisingly, California governor Jerry Brown was quick to pin the blame on climate change for the forest fires’ ferocity and extraordinary longevity this season.  Whatever truth there may be to this, it would be a mistake to gloss over how misguided policies and regulations have hurt California’s ability to prevent and respond to fires.

According to the California Department of Forestry and Fire Protection (CAL FIRE), shrubs and live and dead vegetation are the most important factor in forest fires, being an easily ignitable fuel source that helps spread the flames quickly over vast distances.  For a dry and warm state prone to fires, regular clearing measures removing this vegetation should be common sense.  However, California has enacted several laws that heavily restrict such vital fire-preventing measures as logging, removal of dead trees, and clearing of dry underbrush.

During a congressional hearing in May, California congressman Tom McClintock blasted environmentalists for having fervently opposed such measures since the early 1970s.  Instead, they have been advocating that forests be left to their own devices – despite the fact that thousands of years of history shows that forests need to be appropriately maintained in order to reap all their benefits and reduce the risk of fires.  This understanding of the environment has too often been trumped by politics in California.

When a 2013 environmental impact report advocated the benefits of large-scale vegetation management in San Diego County, activists violently rejected its conclusions.  As in the decades before, concerns over wildlife and environmental impacts were ultimately more important than the safety of fellow citizens, with the result that brush and dead vegetation were allowed to accumulate unimpeded for more than forty years.

Ironically, 2013 also saw a range of massive wildfires across California that were exacerbated by the U.S. Forest Service failing to follow through on crucial tree-thinning projects.  The same happened immediately before the recent devastating fires, with the U.S. Forest Service once again neglecting to clear brush in the woods around Los Angeles as originally planned.

Worse still, government agencies have actively stymied rescue efforts.  In this case, it was CAL FIRE withholding a license for a Boeing 747 Global SuperTanker firefighting plane, capable of dropping almost 20,000 gallons of fire retardant on the inferno below.  Although the license was finally granted in September, precious months had passed during which the plane could have deployed to offer much needed fire-suppressing support across California.  The Boeing subsequently proved instrumental in containing much of the fires ravaging California in December – less than a week after Donald Trump declared the California fires a national emergency.

As the SuperTanker example shows, private resources are indispensable for a swift and effective response in national emergencies.  Top-of-the-line equipment like the SuperTanker is heavily reliant on functioning fixed-base operators (FBOs) for refueling and replenishing payloads.  In the midst of the fires, a campaign to impose price regulations on these very FBOs emerged, threatening to undermine the first line of defense in providing essential services during natural disasters.

If history is any guide, more regulation certainly isn’t what’s needed in times of emergency, especially not when those regulations seek to counteract the market forces that keep rescue efforts running.  Luckily, the Federal Aviation Administration (FAA) reached the same conclusion amid the December wildfires.  In a document published on December 7, the FAA argued strongly that such price controls are unnecessary, given that FBOs must adapt to business pressures that are often beyond their control.

While the FAA acted for the greater good in this instance, the same can’t be expected from California’s authorities.  Luckily, other states have been taking note.  Oregon, a state equally prone to wildfires, has successfully prevented the outbreak of large-scale fires since it implemented comprehensive thinning and pruning measures in the 1990s.  Meanwhile, citizens in New Mexico are petitioning the state legislature to facilitate forest management so that fires at the scale of those in California can be prevented.

California would be smart to cut the environmentalist zeal from its policies and follow Oregon’s or New Mexico’s lead.  A catastrophe the likes of which we witnessed in 2017 should more than suffice to trigger a rethinking of old approaches.  Deregulation means not only standing up to the green lobby, but also protecting citizens’ lives and homes in national emergencies.


Coal Is Here for the Long Haul


While renewables and nuclear power are set to be the world’s fastest-growing sources of energy through 2040, it’s fossil fuels that will still account for more than 75% of world power production for decades to come. That’s according to the U.S. Energy Information Administration’s (EIA) latest international outlook report, which predicts that global energy consumption will rise by a whopping 28% between 2015 and 2040, with most of the growth driven by rapidly developing Asian nations.

The global energy outlook lays bare what many politicians and energy sector specialists have long known to be true but have been wary of saying above a whisper, especially during the Obama years: coal is here to stay. The numbers – along with major power installation projects underway domestically and abroad – don’t lie. And while green activists might wring their hands, this energy outlook in fact represents a goldmine for the U.S. if we choose the right strategy: one focused on meeting the needs of developing economies hungry for more energy, and on investing in clean coal and similar technologies to maximize the efficiency of our nation’s new plants.

While the liberal media has focused myopically on the one-off shuttering of certain coal plants as a harbinger of a carbon-free future, in fact, it’s coal-fired installations that continue to provide the backbone of electricity generation across the country. In Michigan, for instance, three nuclear power plants powered 28% of the state’s net electricity generation last year, but now an estimated 10-20% of that electricity is set to evaporate after the scheduled shutdown of the Palisades nuclear plant. On top of that, U.S. natural gas production has fallen for the first time since 2005, certain states have imposed rules on fracking that could further curtail production, and natural gas prices have shot up by 50% over the past 14 months. This trend, combined with Michigan’s loss of a key source of base load power, makes it abundantly clear that coal-fired plants will have to make up the shortfall. With other states across the country relying on similar energy mixes — overall, 19.7% from nuclear, 30.4% from coal, 33.8% from natural gas, and 14.9% from renewables — steady and even growing reliance on coal-fired plants is set to be replicated nationwide.

On top of domestic plans to increase investments in coal-powered installations, new schemes to supply allied nations with coal are also underway. In Longview, Washington, the Millennium terminal, the largest proposed coal export station in North America, is set to help energy-poor allies like Japan meet their power, national security, and economic growth needs. Japan has always lacked adequate energy resources, and this shortfall grew even greater after Fukushima and the ensuing suspension of nuclear energy production. As a result, the government reevaluated the importance of imported coal to base load power and has emerged as a leader in the clean coal technology space. The government has plans to build an additional 48 high-efficiency, low-emissions power plants and is constructing two advanced gasification-based coal plants near Fukushima.

Fortunately, highly developed economies like Japan have enough human and financial capital to invest in advanced coal technology projects. But that’s far from the case for developing countries, which will account for the bulk of world coal production and use for the next 20+ years. According to the EIA, Africa, the Middle East, and non-OECD Asian states are predicted to increase coal capacity and generation through 2040, with coal consumption in those countries growing on average 2.4% per year and accounting for 20% of total energy use.

Encouragingly, a number of these countries, notably India, have seen the writing on the wall and have started to push for more investment in carbon capture, utilization and storage (CCUS) technology to meet their energy needs more affordably and efficiently. Earlier this summer, India’s government announced a new National Mission on advanced ultra-supercritical technologies for clean coal utilization at a cost of $248 million, as well as the creation of two centers of excellence on clean coal technology. The government has also been prioritizing ultra-high-efficiency, low-emissions technology, with plans to develop an 800-MW power plant with ultra-supercritical boilers within the next three years.

But with New Delhi still struggling to bring electricity to the 20% of the public that is off the grid, it will need to collaborate with more advanced partners to achieve its energy goals. The head of the World Coal Association said as much earlier this month when he called for India to ally with states like the U.S. to clinch cheaper funding from multilateral development banks and access more efficient technologies. Here, the U.S. has a golden opportunity to export more American commodities and know-how and to catch up with the likes of China and Japan, which have both been pumping colossal funding into power projects overseas and investing in clean energy technologies.

Already, a few steps have been taken towards seizing this chance, with the administration announcing that Washington will use its vote at the World Bank, where it is the biggest shareholder, to help countries use fossil fuels more efficiently and access renewable energy sources. But in other ways, the administration has been falling short. Despite positive words for clean coal, Trump still hasn’t put his money where his mouth is, threatening to cut funding for the very department that researches CCUS by 55%. With the administration’s own EIA making it clear that coal will be on the menu for years to come, the U.S. needs to invest more in exportable technology to help developing nations gain access to reliable, affordable sources of base load power — and to benefit from a market that is set to explode. 


FEMA Apologists Unite


After Hurricane Harvey devastated Texas, the floodgates of the Left’s self-righteous wrath have opened with similar fury. Amid all the howling, President Trump asked Congress to approve a $7.85 billion down payment for disaster relief, most of which is meant to fill the coffers of the Federal Emergency Management Agency’s (FEMA) disaster relief fund. The original plan to cut FEMA’s budget by $876 million to free up resources for national security priorities such as the border wall was already on the verge of being repealed. But with the new funding request pegged to raising the debt ceiling, Democrats have positioned themselves to stifle any debate about fiscal responsibility.

As it stands now, Democrats can kill two birds with one stone and push through two of their main agenda items – a higher debt ceiling and continued funding for a dysfunctional federal agency. With budget and debt-ceiling deadlines approaching at the end of the month, the Left is willingly pushing the U.S. toward financial ruin.

Trump’s decision may be motivated by the desire to avoid a drawn-out congressional standoff over the upcoming budget and debt ceiling, for which he needs Democrat support. Unfortunately, it has also set him on a collision course with the House Freedom Caucus, whose members have already made it clear they will not support such fiscal maneuvering. And rightfully so: with U.S. debt standing at almost $20 trillion, thanks in large part to Obama’s generous $836 billion giveaway in 2009, the U.S. needs fiscally responsible structural reforms more than anything else.

To no one’s surprise, liberals are not concerned with such matters, willfully ignoring that the entire argument for greater government involvement in disaster relief stands on shaky ground at best. Blinded by their gleeful I-told-you-so attitude, however, they fail to see the irony that FEMA itself is the best example of this. FEMA has been one of the main hooks for Trump-bashing from the big-government crowd. Yet while projects relating to national security are worth their funding, throwing billions of dollars at an organization as underperforming as FEMA is not. Harvey’s estimated costs of more than $100 billion are overwhelming FEMA’s capabilities, and pouring government money into the sieve that is our current emergency relief infrastructure will not change that.

The fact that FEMA is extraordinarily bad at doing what it is supposed to do should be reason enough for normal, rational people to rein in its ability to wreak havoc on the budget. Originally designed to provide grants to uninsured homeowners for emergency repairs, the agency doesn’t provide long-term recovery loans to those in need, and massive red tape has often hindered relief efforts rather than aided them. Now the agency is already faltering under Harvey-induced pressure: it has begun referring the needy to the Small Business Administration to apply for loans in their search for long-term recovery assistance. Houston Mayor Sylvester Turner vented his frustration with FEMA’s penchant for bureaucratic delays, calling on the organization to step up its efforts as tens of thousands of people wallow in emergency shelters.

After Hurricane Katrina in 2005, FEMA’s National Flood Insurance Program (NFIP) had to be bailed out to reduce its debt load to just under $18 billion. Then, following Hurricane Sandy in 2012, the NFIP was $24 billion in debt, and had to be kept afloat with another payout package when losses were projected to reach $12 billion, far exceeding the NFIP’s borrowing capacity of $2.9 billion. Such damage estimates were only a fraction of the projected cost of Hurricane Harvey, which raises the question: does FEMA expect a handout for every disaster, no matter the price tag?

The worshippers of the god of big government would gladly respond with a resounding “Yes,” but one mustn’t forget that Trump’s budget proposal published in March still stands. Cutting 9% from public funds, it’s part of a long-term plan to make the government leaner and get the private sector to shoulder greater responsibility. In what can rightfully be regarded as a plea for help in handling the situation, FEMA chief William Long recently reiterated Trump’s goal to form public-private partnerships (PPPs) in disaster relief efforts.

The private sector has already proved itself more effective in urgent situations than the federal government, especially when supplies are needed to alleviate suffering, or prevent and contain the spread of infectious diseases. With Harvey-related flooding increasing the risk of cholera infections, it’s worth remembering how, following Katrina, various enterprises stepped in to pick up the slack where federal authorities failed. Walmart’s supply chain and distribution system delivered supplies to locations where the government had little to no reach. And by leveraging its expertise and resources, Walmart was able to prevent a cholera outbreak as the federal response floundered.

Other countries have been relying on PPPs for years to respond effectively to national catastrophes. In Guinea, a treatment and research center was built by mining operator Rusal to help the government fight the 2014 Ebola epidemic. Three years later, that same center is running a vaccination program, set to continue into 2018, to test a new vaccine against the virus. During Thailand’s 2011 floods, Honda partnered with the Thai Department of Disaster Prevention and Mitigation (DDPM) to provide vehicles for disaster management activities. The company also trained local communities on best practices for dealing with future natural disasters.

The left is short-sighted in its clamoring for greater funding for government agencies. Past emergency responses illustrate that America’s private sector is fully capable of coordinating with Texas policymakers to deliver a more efficient, cost-effective relief effort. Hurricane Harvey is an opportunity for America to implement a new private-sector driven system of emergency relief, one free of red tape and the trappings of government bureaucracy. 


Kaspersky Kerfuffle: Can Foreign Contractors Protect America's Data?


With controversy over Russia swirling around Washington, our lawmakers have suddenly gotten wise to the fact that the Moscow-based cybersecurity firm Kaspersky Labs has spent two decades securing important contracts with U.S. government agencies. After years of resolute inaction, the House Science, Space, and Technology Committee got around to asking 22 federal agencies for their documents and communications concerning Kaspersky last month. The deadline to reply was Friday, August 11.

Before this year, lawmakers on both sides of the aisle studiously ignored the online security giant’s Kremlin connections. Other countries stopped trusting Kaspersky with their secrets years ago: China’s government procurement office blacklisted them back in 2014. The Trump administration finally yanked the company from two lists of approved vendors used by government agencies one month ago. Before then, American officials had apparently been comfortable with millions of devices automatically sending reams of data back to Kaspersky’s headquarters in Moscow.

Incredibly, U.S. government systems have been so reliant on Kaspersky software that the Obama White House ruled out going after the company as a retaliatory measure in 2016. Eugene Kaspersky, a tech heavyweight trained at a KGB-funded university, has close links with the Russian government and the Federal Security Service – the same FSB that inherited the legacy of the KGB. The move to pull Kaspersky from the U.S. General Services Administration (GSA)’s list of approved vendors for contracts came after Bloomberg obtained emails showing the company had been involved in developing software for the FSB, and that some of the anti-virus vendor’s employees had accompanied state investigators on cybercrime raids.

Taking Kaspersky off the GSA’s procurement list might be the first concrete action after years of speculation over its links with Russia’s security services, but Kaspersky products purchased outside the GSA contract process can still be used by agencies. This raises the question the House is now trying to answer: just how much official data has this company had access to? Even more importantly, what are the potential dangers of handing government contracts (and sensitive or confidential data) to foreign companies, especially those known to be hostile to U.S. interests?

Then again, that second question assumes the government even knows who it’s handing data over to. Chinese tech firms like Huawei and ZTE may be banned from bidding for U.S. government contracts over spying concerns, but some of America’s most important secrets and mission-critical data still manage to end up in China with hardly a second thought. In one of the most spectacular own goals of the Obama era, the Office of Personnel Management (OPM) hired overseas contractors and gave them root access to the personnel records of 14 million federal employees and applicants — including undercover American intelligence personnel based in China.

Where were those contractors based? Argentina, for one, but also inside the People’s Republic of China. At the time, Congressman Jason Chaffetz of the House Oversight Committee compared OPM’s move to “leaving all your doors and windows unlocked and hoping nobody would walk in and take the information.” As it turns out, Beijing is perfectly capable of walking through open doors. By attacking OPM, Chinese hackers managed to steal millions of those records.

Unfortunately, the spy agencies themselves also hand over mission-critical tasks to potentially unreliable foreign companies. Systran International, the formerly French-owned firm that was acquired by South Korea’s CSLi in 2014, provides translation services to the National Security Agency (NSA), as well as to carmakers such as Ford. The company’s machine translation technology is used by the NSA to track online conversations in foreign languages for evidence of terror-related activity in real time. In Paris, though, where Systran is still based, the company’s recent struggles and layoffs have officials wondering whether they can trust the company’s ability to keep the classified data from France’s spy agencies (who also use its services) secure.

One of the most prominent overseas companies providing IT services to U.S. government agencies is Japan’s NTT Data Corp, which works with the military, the Department of Homeland Security (DHS), and the Drug Enforcement Administration (DEA). In all, NTT Data serves more than 50 federal agencies, raking in billions every year in spite of President Trump’s “America First” jobs agenda. NTT has managed to snap up so much of the federal contracting market in part by buying U.S. competitors, such as Dell’s IT services business, to make itself seem “almost as American as American companies.”

Apart from depriving U.S. companies of contracts, it’s abundantly clear that farming out our country’s most sensitive IT infrastructure to foreign firms poses very real national security issues. Then again, such threats don’t just come from outside the homeland. Bradley (now Chelsea) Manning, Ed Snowden, and Reality Winner have forced us to learn the hard way that too many people have access to classified data and too many untrustworthy individuals are somehow receiving clearances. Around five million people in the U.S. currently have security clearances that allow them access to sensitive and confidential material, with many of them working on behalf of outside contractors (American or otherwise).

Embarrassing leaks or the loss of federal employees’ personal data are bad, but they’re not the worst thing that can happen if the government continues to bungle its approach to protecting the nation’s data. The Russians have repeatedly tested new cyberweapons by temporarily bringing down parts of Ukraine’s electric grid. Those attacks have put Congress on edge, but it turns out that the U.S. isn’t all that much better prepared for a similar attack on our own grid or other infrastructure networks. In the event of a major operation conducted by the Russians, Chinese, or even a lesser adversary like the North Koreans or non-state cybercriminals, the disorganized hodgepodge of companies and contractors responsible for keeping us connected could finally be exposed as our Achilles heel.

With controversy over Russia swirling around Washington, our lawmakers have suddenly gotten wise to the fact that the Moscow-based cybersecurity firm Kaspersky Labs has spent two decades securing important contracts with U.S. government agencies. After years of resolute inaction, the House Science, Space, and Technology Committee got around to asking 22 federal agencies for their documents and communications concerning Kaspersky last month. The deadline to reply was Friday, August 11.

Before this year, lawmakers on both sides of the aisle studiously ignored the online security giant’s Kremlin connections. Other countries stopped trusting Kaspersky with their secrets years ago: China’s government procurement office blacklisted them back in 2014. The Trump administration finally yanked the company from two lists of approved vendors used by government agencies one month ago. Before then, American officials had apparently been comfortable with millions of devices automatically sending reams of data back to Kaspersky’s headquarters in Moscow.

Incredibly, U.S. government systems have been so reliant on Kaspersky software that the Obama White House ruled out going after the company as a retaliatory measure in 2016. Eugene Kaspersky, a tech heavyweight trained at a KGB-funded university, has close links with the Russian government and the Federal Security Service – the same FSB that inherited the legacy of the KGB. The move to pull Kaspersky from the U.S. General Services Administration (GSA)’s list of approved vendors for contracts came after Bloomberg obtained emails showing the company had been involved in developing software for the FSB, and that some of the anti-virus vendor’s employees had accompanied state investigators on cybercrime raids.

Taking Kaspersky off the GSA’s procurement list might be the first concrete action after years of speculation over its links with Russia’s security services, but Kaspersky products purchased outside the GSA contract process can still be used by agencies. This raises the question the House is now trying to answer: just how much official data has this company had access to? Even more importantly, what are the potential dangers of handing government contracts (and sensitive or confidential data) to foreign companies, especially those known to be hostile to U.S. interests?

Then again, that second question assumes the government even knows who it’s handing data over to. Chinese tech firms like Huawei and ZTE may be banned from bidding for U.S. government contracts over spying concerns, but some of America’s most important secrets and mission-critical data still manage to end up in China with hardly a second thought. In one of the most spectacular own goals of the Obama era, the Office of Personnel Management (OPM) hired overseas contractors and gave them root access to the personnel records of 14 million federal employees and applicants — including undercover American intelligence personnel based in China.

Where were those contractors based? Argentina, for one, but also inside the People’s Republic of China. At the time, Congressman Jason Chaffetz of the House Oversight Committee compared OPM’s move to “leaving all your doors and windows unlocked and hoping nobody would walk in and take the information.” As it turns out, Beijing is perfectly capable of walking through open doors. By attacking OPM, Chinese hackers managed to steal millions of those records.

Unfortunately, the spy agencies themselves also hand over mission-critical tasks to potentially unreliable foreign companies. Systran International, the formerly French-owned firm that was acquired by South Korea’s CSLi in 2014, provides translation services to the National Security Agency (NSA), as well as to carmakers such as Ford. The company’s machine translation technology is used by the NSA to track online conversations in foreign languages for evidence of terror-related activity in real time. In Paris, though, where Systran is still based, the company’s recent struggles and layoffs have officials wondering whether they can trust its ability to keep secure the classified data of France’s spy agencies, which also use its services.

One of the most prominent overseas companies providing IT services to U.S. government agencies is Japan’s NTT Data Corp, which works with the military, the Department of Homeland Security (DHS), and the Drug Enforcement Administration (DEA). In all, NTT Data serves more than 50 federal agencies, raking in billions every year in spite of President Trump’s “America First” jobs agenda. NTT has managed to snap up so much of the federal contracting market in part by buying U.S. competitors, such as Dell’s IT services arm, to make itself seem “almost as American as American companies.”

Beyond depriving U.S. companies of contracts, farming out our country’s most sensitive IT infrastructure to foreign firms clearly poses very real national security risks. Then again, such threats don’t just come from outside the homeland. Bradley (now Chelsea) Manning, Ed Snowden, and Reality Winner have forced us to learn the hard way that too many people have access to classified data and too many untrustworthy individuals are somehow receiving clearances. Around five million people in the U.S. currently hold security clearances granting them access to sensitive or confidential material, and many of them work on behalf of outside contractors (American or otherwise).

Embarrassing leaks or the loss of federal employees’ personal data are bad, but they’re not the worst thing that can happen if the government continues to bungle its approach to protecting the nation’s data. The Russians have repeatedly tested new cyberweapons by temporarily bringing down parts of Ukraine’s electric grid. Those attacks have put Congress on edge, but it turns out that the U.S. isn’t all that much better prepared for a similar attack on our own grid or other infrastructure networks. In the event of a major operation conducted by the Russians, Chinese, or even a lesser adversary like the North Koreans or non-state cybercriminals, the disorganized hodgepodge of companies and contractors responsible for keeping us connected could finally be exposed as our Achilles heel.




How the Paris Climate Deal Is Resurrecting American Coal


When former President Barack Obama committed the U.S. to the Paris Climate Agreement struck at the 2015 COP21 summit, it was widely seen as the death knell for the beleaguered U.S. coal mining industry. But what was supposed to be the apotheosis of his pursuit of environmentalism at the expense of American jobs and competitiveness turned out to be one of the main forces that propelled Trump to November’s shock victory. With the shoe now on the other foot, the White House faces a choice: should it stay in the agreement or leave?

To be fair, much of the hostility directed at the accord stems from the way Obama rode roughshod over the process of adopting it. The former president made no secret of his hatred for coal country. He unashamedly admitted as much back in 2008 during his first presidential campaign, when he vowed to bankrupt any company attempting to build a coal-fired plant by aggressively slapping it with exorbitant charges on carbon emissions.

Years later, in blatant disregard of constitutional stipulations, the Obama White House concluded the Paris agreement without submitting it to the Senate for ratification or consulting Congress at all. The decision to circumvent the legislative branch is particularly disquieting because it speaks to the Obama administration’s contempt for anyone who disagreed with its environmental policies. That misplaced environmental zeal caused enormous job losses and the bankruptcies of major players in the coal sector, reducing a once-flourishing industry to a shadow of its former self. Adding insult to injury, Obama refused to go quietly: by last-minute executive order, he rolled out new regulations cracking down on coal mining across the country, despite the tens of millions of dollars the move was projected to cost the industry every year.

But emotions aside, in an unexpected and ironic twist, the Paris accord presents an opportunity for coal to beat Obama at his own game. And that is exactly what is happening. America’s largest players in the coal and oil industries, including Cloud Peak and Exxon Mobil, have reversed their opposition to the deal and advised Trump to stay in, arguing that it is better to be part of the club and help steer policy than to sit out the debate.

Indeed, the Trump White House and American coal producers can use their participation in international talks on the future of the world’s energy mix to promote the development of high-efficiency, low-emission (HELE) coal-fired power plants and carbon capture and storage (CCS) technology. All major international bodies agree on this point: the goals of COP21 cannot be met without upgrading coal plants. As such, implementing the accord could turn the U.S. into a global clean coal leader, save thousands of jobs in the process, and ensure the continuation of an industry worth billions of dollars to the country’s economy.

Clean coal — an American export?

Clearly, the interests of U.S. coal and the country’s wider energy industry will be best served by retaining a seat at future international climate policy discussions. At a time when clean coal technology is creating new possibilities, working with other countries is the only way forward for U.S. coal. Particularly now that the industry’s future is so heavily dependent on exports to foreign markets, Washington cannot allow others to dictate policies that will create unfavorable market conditions for American coal producers.

Aggressively exploring these opportunities has the potential to bring about a renaissance in the U.S. coal industry. With a renewed focus, American energy firms can look to emulate advances made by Asian competitors such as India, which has demonstrated the technological feasibility of carbon capture and utilization at a facility in Chennai. In fact, New Delhi is actively looking toward the U.S. for increased technology and energy cooperation in the clean coal sector, making it possible for the U.S. to become an industry frontrunner.  

At the same time, India is a major market for clean coal technology. Instead of passing harsh regulations and forcing coal plants to close – as Obama imagined – the Indian government embarked on an ambitious plan to convert 40 GW of old coal plants into “supercritical” ones using HELE technology. In fact, in what sounds like music to environmentalists and climate realists alike, the country’s energy minister recently quipped that upgrading those coal plants will reduce CO2 emissions more than building 100 GW of solar panels.

The White House has said it will decide whether the U.S. will remain a signatory to the Paris Agreement before G7 leaders meet in Italy next month. If he wants to keep his promise to America’s coal mining communities and secure the long-term future of an industry ravaged by his predecessor’s obsession with bringing about its end, the president would do well to listen to the growing number of industry experts and members of his own team who are vocally advocating the benefits of remaining inside the climate tent. If Trump plays this right, he will not only preserve U.S. energy firms’ global competitiveness and secure many thousands of jobs; he will also write a plot-twist ending to Obama’s anti-coal drama. Obama wanted coal dead, but he may have inadvertently set the stage for its revival.
