Needed in Virginia: A Fiscal Early Warning System


In the wake of the City of Petersburg’s fiscal meltdown, the General Assembly has appointed Del. R. Steven Landes, R-Augusta, to head a subcommittee to study how other states deal with fiscally stressed localities. In surveying best practices in other states, it’s a good bet that Landes will familiarize himself with a new report, “State Strategies to Detect Local Fiscal Distress,” published by the Pew Charitable Trusts.

It turns out that 22 states, many of which have learned from hard experience, have established mechanisms for monitoring the fiscal health of local governments. The idea is to spot the warning signs of fiscal distress in the hopes that local elected officials will take action before their municipality becomes the next Detroit, Mich., Stockton, Calif., or Central Falls, R.I.

Virginia requires cities, counties and towns to file audited documents called Comprehensive Annual Financial Reports (CAFRs), but the oversight ends there. No one at the state level analyzes the data or conveys findings to the public.

Of the 22 states that monitor local fiscal health, Pew classifies eight as “early warning states” with laws defining when local governments are in “fiscal distress” and systems to identify when a locality reaches that status. As Pew notes, an early warning system can save states big headaches and liabilities down the road if it can avert local government failure.

Another big bonus: Credit-rating agencies look positively upon such systems. “All else being equal, we tend to assign higher ratings to troubled governments in states with strong oversight, well-established policies of intervention, and a track record of success,” the study quotes Moody’s Investors Service as saying. Among the key indicators that analysts in other states scrutinize:

  • Audits and other financial information submitted on time
  • Deficits and minimum fund balances
  • Size of debt-service obligations, and debt service as a ratio of population and operating revenue
  • Sufficiency of cash for services
  • Total revenue and expenditures per capita
  • Unrestricted fund balances
  • Cash-to-liabilities ratio
  • Pension plan funding ratios

Few local elected officials have the financial training to read municipal balance sheets and spot the signs of impending trouble. A state report card on key indicators and ratios would be invaluable to them and to members of the public. Flashing warning signs would give them time to address problems before they reach the point where, as in Petersburg, city officials must make catastrophic cuts to core government services.
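
To make the “report card” idea concrete, here is a minimal sketch, in Python, of how a state analyst might compute a few of the indicators listed above from figures in a locality’s CAFR. Every field name and warning threshold below is a hypothetical placeholder, not part of any actual state system or of Pew’s report.

```python
# Illustrative sketch only: computing a few fiscal-stress indicators from
# CAFR-style figures. All field names and warning thresholds are hypothetical.

def fiscal_indicators(cafr: dict) -> dict:
    """Return a handful of ratios an early-warning analyst might track."""
    revenue = cafr["total_revenue"]
    return {
        # Unrestricted fund balance as a share of operating revenue
        "fund_balance_ratio": cafr["unrestricted_fund_balance"] / revenue,
        # Cash available to cover current liabilities
        "cash_to_liabilities": cafr["cash_and_equivalents"] / cafr["current_liabilities"],
        # Debt service as a share of operating revenue
        "debt_service_ratio": cafr["debt_service"] / revenue,
        # Pension plan funding ratio
        "pension_funded_ratio": cafr["pension_assets"] / cafr["pension_liabilities"],
    }

# Hypothetical warning thresholds; a real system would set these in statute or policy.
THRESHOLDS = {
    "fund_balance_ratio": ("below", 0.05),
    "cash_to_liabilities": ("below", 1.0),
    "debt_service_ratio": ("above", 0.15),
    "pension_funded_ratio": ("below", 0.60),
}

def flag_warnings(indicators: dict) -> list[str]:
    """List any indicators that cross their warning thresholds."""
    flagged = []
    for name, (direction, limit) in THRESHOLDS.items():
        value = indicators[name]
        if (direction == "below" and value < limit) or (direction == "above" and value > limit):
            flagged.append(f"{name} = {value:.2f} ({direction} {limit})")
    return flagged

if __name__ == "__main__":
    example = {  # made-up figures for a hypothetical locality
        "total_revenue": 80_000_000,
        "unrestricted_fund_balance": 2_000_000,
        "cash_and_equivalents": 6_000_000,
        "current_liabilities": 9_000_000,
        "debt_service": 14_000_000,
        "pension_assets": 150_000_000,
        "pension_liabilities": 300_000_000,
    }
    for flag in flag_warnings(fiscal_indicators(example)):
        print("WARNING:", flag)
```

A real early warning system would, of course, track trends over several years rather than a single snapshot, with thresholds set in statute or regulation.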

(This article first ran in Bacon’s Rebellion on October 10, 2016)


Future of the Interstates Study Under Way


The Transportation Research Board last week formally kicked off a several-year study of the future of America’s Interstate highway system. In the FAST Act, Congress authorized $5 million for the study to figure out how this vital infrastructure should be upgraded and restored for the 21st century. TRB is in charge of the project, and has selected a blue-ribbon committee to oversee the work. Chaired by Norman Augustine (former CEO of Lockheed Martin), its 14 members include former DOT Secretary Norm Mineta, Metropolitan Transportation Commission Executive Director Steve Heminger, University of Southern California researcher Genevieve Giuliano, University of Texas researcher Michael Walton, and Michigan DOT Director Kirk Steudle. The Committee held its first meeting last week, at the National Academy of Sciences in Washington, DC.

There are five priorities for the study, as follows:

  1. Identify reconstruction priorities for the next 50 years, based on the current conditions of individual Interstates and their rates of deterioration.
  2. Project future demand for the system, both freight and passenger.
  3. Integrate new technological capabilities, to increase both safety and efficiency.
  4. Propose current major highways (on the National Highway System) that make sense to upgrade to Interstate status.
  5. Identify the required resources for reconstruction, expansion, and ongoing maintenance.

This study is long overdue, and should have been launched prior to the system’s 50th anniversary a decade ago. But at least it is now under way. While all five priorities are important, I’d like to offer a few comments on several of them. Since my own research that produced the 2013 Reason Foundation study “Interstate 2.0,” I have kept an eye out for proposed new Interstate corridors. As part of that exercise, I compiled data on large cities that have grown the most or shrunk the most between 1950 and 2008. Top of the list were Las Vegas (over 21 times more people than in 1950) and Phoenix (14 times more). Yet because the map of what became the Interstate system was drawn up during World War II, those two cities were almost afterthoughts—and there is still no Interstate route between them. Other high-growth cities include Tucson (11 times), San Jose (9 times), Austin (5 times), and Albuquerque and Charlotte (both 4 times larger). Yet a number of the new Interstate routes proposed by members of Congress and local boosters in recent years connect much smaller places. I hope the Committee and its consultants develop a rigorous methodology for assessing benefits versus costs for such new corridors.
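
By way of illustration only (this is my own sketch, not anything TRB or its consultants have adopted), such a screening methodology might discount each candidate corridor’s projected benefits and costs to present value and rank candidates by benefit/cost ratio. Every corridor name and dollar figure below is hypothetical.

```python
# Hypothetical corridor-screening sketch: rank proposed new Interstate corridors
# by a simple benefit/cost ratio using net-present-value discounting. All
# corridors, dollar figures, and parameters below are invented for illustration.

def npv(annual_amount: float, years: int, rate: float) -> float:
    """Present value of a constant annual amount received for `years` years."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

def benefit_cost_ratio(capital_cost: float, annual_benefit: float,
                       annual_maintenance: float, years: int = 40,
                       rate: float = 0.04) -> float:
    """Discounted benefits divided by up-front capital plus discounted maintenance."""
    benefits = npv(annual_benefit, years, rate)
    costs = capital_cost + npv(annual_maintenance, years, rate)
    return benefits / costs

if __name__ == "__main__":
    candidates = {  # entirely hypothetical corridors and numbers
        "Phoenix-Las Vegas": dict(capital_cost=3.0e9, annual_benefit=400e6, annual_maintenance=30e6),
        "Small-city connector": dict(capital_cost=2.0e9, annual_benefit=90e6, annual_maintenance=25e6),
    }
    for name, params in sorted(candidates.items(),
                               key=lambda kv: benefit_cost_ratio(**kv[1]),
                               reverse=True):
        print(f"{name}: B/C = {benefit_cost_ratio(**params):.2f}")
```

A corridor with a ratio well above 1.0 would merit further study; one well below 1.0 would not, however enthusiastic its local boosters.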

Automated and connected vehicles will certainly have an impact on the demand for Interstate capacity in coming decades. One of the most promising developments is likely to be truck platooning, especially in corridors with high and increasing heavy-truck movements. FHWA’s Freight Analysis Framework several years ago projected which specific Interstate corridors would see trucks account for 40% of their traffic at some point between 2020 and 2040. Reason’s 2015 policy study “Truck-Friendly Tolls for 21st Century Interstates” drew on that research to propose dedicated truck lanes in 11 major corridors, seven of which are multi-state corridors. Such dedicated lanes would be the best setting for truck platooning—and given how long it takes to get major highway projects implemented, planning needs to start now so that the lanes are in place when truck platooning is ready for large-scale implementation.

Finally, the question of the resources needed for reconstruction and modernization is perhaps the most important outcome to be expected from this study. It is very clear that with current fuel taxes not generating enough revenue to even properly maintain current highway capacity, there is no identified funding source for the $1-2 trillion set of mega-projects that will constitute a 21st century Interstate system. Such a program cannot and will not be fundable out of annual cash flow. It is a need tailor-made for long-term user-fee financing. And it would also be a good fit for procurement as a large set of P3 concession megaprojects, shifting the typical megaproject risks to investors, rather than saddling taxpayers with those risks.

For reference:

“Interstate 2.0: Modernizing the Interstate Highway System via Toll Finance,” September 2013.

“Truck-Friendly Tolls for 21st Century Interstates,” July 2015.

(This article first ran in the September 2016 edition of Surface Transportation Innovations)


Deeper Analysis on Potential Pacific Trade Pact: Japan


(Editor’s note: this is the fourth in this series by Gary Baise, a noted agriculture policy/environmental stewardship expert who lives here in Virginia.)

The Trans Pacific Partnership (TPP) sets forth each nation’s tariff commitments. It is argued by many in agriculture that TPP is a good deal and needs all of our support. Others are more critical. The annex sections dealing with Japan appear to be fodder for those who have concern about the fairness of TPP regarding the United States. The section on rice, for example, does not appear to be great.

While Canada has four chapters regarding tariff eliminations and tariff rate quotas and Chile has two, there are eight such chapters related to Japan. Among the eight chapters is Japan’s tariff schedule, which runs 1,133 pages. (HUGE!) The level of detail is mind-boggling. The Japanese schedule shows certain areas where Japan imposes substantial tariffs. Certain meats presently carry a 50% base rate tariff; in year 1 of the agreement, if ratified, the tariff on cheek meat and head meat would drop from 50% to 39%, and the phase-down on such products would run for 20 years. Even after 20 years, a 9% tariff would still apply.
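
To illustrate the arithmetic of such a phase-down, here is a toy sketch that assumes a year-1 cut to 39% followed by equal annual reductions down to a residual 9% in year 20. This illustrates the mechanism only; the actual Japanese schedule defines its own staging steps.

```python
# Toy illustration of a tariff phase-down: a year-1 rate followed by equal
# annual reductions to a final rate. This is NOT the actual TPP schedule for
# any product; the equal-step staging here is an assumption for illustration.

def staging_schedule(year1_rate: float, final_rate: float, years: int) -> list[float]:
    """Tariff rate in effect for each of years 1..years (the last year ends at final_rate)."""
    step = (year1_rate - final_rate) / (years - 1)
    return [round(year1_rate - step * (year - 1), 4) for year in range(1, years + 1)]

if __name__ == "__main__":
    # Loosely modeled on the meat example above: a 50% base rate cut to 39% in
    # year 1, then phased down to a residual 9% by year 20.
    for year, rate in enumerate(staging_schedule(0.39, 0.09, 20), start=1):
        print(f"Year {year:2d}: {rate:.1%}")
```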

There are dozens of products where the remarks column refers the reader back to the general notes of the tariff schedule. Japan’s general notes to Annex 2-D do indicate that many products will come into the country duty free, but a person has to examine the 1,100-plus pages to determine when a given product becomes duty free. Tariffs on certain products under category B4 are eliminated in 4 years, on others in 6 years. As pointed out earlier, some tariffs are not eliminated for 20 years.

Under Section N of Japan’s tariff schedule, there is language which on its face appears indecipherable. To give you a flavor, I will quote it: “Customs duties on originating goods provided for in the items in staging category JPB8…shall be eliminated as follows: i) the customs duties shall be reduced to 10 percent ad valorem per liter, on the date of entry into force of this agreement for Japan;”. This type of language goes on for 2.5 pages, covering products through 10 years. Maybe this language is helpful for American products, but the complexity and obtuseness make it hard to call this a transparent deal. There is also nothing I can find to date about currency manipulation, which can have a major impact on prices.

Japan has added an Appendix B-1, Agricultural Safeguard Measures, a 25-page document. It sets out “the originating agricultural goods that may be subject to agricultural safeguard measures under paragraph 5 of the General Notes of the Tariff Schedule of Japan” and describes how Japan may apply those safeguard measures and increase the rate of customs duties on each agricultural product. In addition, the document claims that for certain agricultural products Japan can limit the number of metric tons coming into the country for at least 10 years; beginning in year 11 and running through year 15, it can grant small increases in the number of tons of whatever agricultural product is listed in a category. Rice is an example.

Japan’s Appendix A, page 40, Section C describes the quota quantity of rice from the United States that will be allowed in duty free. Rice – a staple in the Japanese diet – is a contentious issue. In year 1, the U.S. is able to export 50,000 metric tons of rice to Japan, and in year 15 that amount increases to 70,000 metric tons. The amount of rice produced in Japan in 2014, according to rice statistics, equaled 8,345,000 metric tons.

I suppose U.S. rice growers should be happy with this quantity of rice to be exported, because Australia is limited to 6,000 metric tons and by year 15 can export 8,400 metric tons to Japan. Something is better than nothing, but this opening created by TPP appears fairly insignificant. In the negotiations, U.S. rice producers had sought to export 165,000 metric tons of rice to Japan. The results fell well short of that goal. Promoters of TPP argue that in 15 years the U.S. will reduce its current 11% tariffs on imported rice products to 0%. This appears to be a pretty steep price to pay to get a mere 50,000-70,000 tons into Japan over 15 years.
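
To put those quota volumes in perspective, here is a quick back-of-the-envelope calculation using only the tonnage figures quoted above (the 2014 production number is the one cited in the preceding paragraph):

```python
# Back-of-the-envelope comparison of the TPP rice quotas with Japan's own
# production, using only the tonnage figures cited in this article.

JAPAN_RICE_PRODUCTION_2014 = 8_345_000  # metric tons

quotas_metric_tons = {
    "U.S. quota, year 1": 50_000,
    "U.S. quota, year 15": 70_000,
    "U.S. negotiating goal": 165_000,
    "Australia quota, year 1": 6_000,
    "Australia quota, year 15": 8_400,
}

for label, tons in quotas_metric_tons.items():
    share = tons / JAPAN_RICE_PRODUCTION_2014
    print(f"{label}: {tons:,} metric tons = {share:.2%} of Japan's 2014 production")
```

Even the quantity U.S. negotiators originally sought would have amounted to roughly 2% of Japan’s 2014 production; the quota actually obtained is well under 1%.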

It appears a new negotiator is needed.

(This article first ran in Farm Futures on September 20, 2016)


Who is Guarding the Guards?


Several years ago, Wells Fargo Bank discovered that employees had boosted sales by opening some 2 million deposit and credit card accounts without customer knowledge or authorization. Over the next few years, the bank fired more than 5,000 employees for misconduct and reimbursed customers $2.6 million in fees that they may have incurred on the bogus accounts.

Insufficient response and retribution, regulators and politicians howled. They played no role in uncovering the fraud, but they are hounding bank officials and demanding $185 million in fines.

In another action, the Environmental Protection Agency, Federal Trade Commission and State of California agreed to a $14.7 billion settlement with Volkswagen, to compensate 482,000 buyers of diesel cars that the company illegally made appear less polluting than they actually were.

“This settlement shows that EPA is committed to upholding standards to protect public health, enforce the law and protect clean air,” said Administrator Gina McCarthy. But it’s just a “partial settlement,” a “first step” in holding VW accountable for breaching “the public’s trust,” added DOJ Deputy AG Sally Yates.

Meanwhile, Ms. Yates wants prosecutors to employ the Responsible Corporate Officers Doctrine (or Park Doctrine) more often, to hold executives individually accountable for the actions of company employees, without requiring that the government prove the executives intended to break any laws, were negligent, or even knew that someone in the company was violating a law.

No one should be victimized by corporate fraud, negligence or incompetence. But neither should they be victimized by negligent, incompetent or criminal actions of government agencies and bureaucrats, or of third parties they hire to validate their policies and agendas. Those actions also breach the public trust.

Equally fundamental and essential, policies and rules that affect our livelihoods, living standards and liberties must be based on honesty, accountability, evenhanded application, and verifiable evidence.

Those basic guidelines are patently ignored today, as countless examples demonstrate beyond doubt.

The IRS repeatedly abused its power in targeting conservative groups. But then Lois Lerner’s emails mysteriously disappeared, she took the Fifth and retired with full pension, “two employees on the night shift” deleted the email backup tapes (with no repercussions), and IRS Commissioner John Koskinen steadfastly refuses to cooperate with congressional investigators. No Park Doctrine for any of them.

Abuses are rampant throughout federal, state and local governments, as news accounts constantly attest. Incompetence, fraud and public trust violations just in the environmental arena are mind-numbing.

On August 5, 2015, an EPA-hired crew negligently reopened the Gold King Mine above Silverton, Colorado, and unleashed a 3,000,000-gallon toxic flash flood that contaminated rivers all the way to Lake Powell in Utah. EPA waited an entire day before notifying the public, offered apologies but only minimal compensation, refused to fire, fine or demote anyone – and issued a report that whitewashed the agency’s incompetence and even scrubbed the names of EPA on-site coordinator Hayes Griswold and his team.

But it’s on the regulatory front that the duplicity, exaggeration, fabrication and betrayal of our public trust are really outrageous – and used to amass more power and control over our energy, economy, job creation and living standards, close down companies and industries that regulators detest, and advance crony corporatist deals with favored entities, regardless of costs or impacts on jobs, health and welfare.

EPA is determined to make our air not merely safe or healthy, but pristine, with no human pollutants. Since 1970, US cars have reduced tailpipe pollutants by 99% and coal-fired power plants have eliminated 92% of their particulate, sulfur dioxide and nitrogen oxide emissions. That’s still not enough, says EPA.

To promote its claim that any soot and dust particles are deadly, the agency employs “epidemiological” studies that attempt to link slightly higher death and pollution rates in different locales – and attribute the difference to manmade particulates. However, it is impossible to distinguish health effects due to vehicle, refinery or power plant pollutants from scores of natural pollutants, or to tell whether a death was caused by pollution or by bacteria, obesity, smoking, diabetes or countless other factors.

So to augment its baseless claims, EPA employed illegal experiments on people. But even when its human guinea pigs breathed up to 30 times more particulates than the agency insists are lethal, no one died. Apparently, air pollutants are a health hazard when they come from cars, refineries or coal-fired power plants – but not when they are administered in massive quantities by researchers hired by EPA.

EPA gets away with this by having activist groups posing as scientific bodies rubberstamp its pseudo-science. Since 2000, it has paid the American Lung Association more than $25 million, given its “independent” Clean Air Scientific Advisory Committee members over $181 million, and let CASAC deny membership to industry or other experts who might question EPA findings.

EPA also wants to regulate all ponds, puddles, creeks, ditches and other “waters of the United States” (WOTUS) that are even remotely connected to a navigable waterway. That way it can control nearly all land uses and family, farm and industrial activities in the USA – based on equally specious “science” regarding supposedly dangerous pollutants that might get into drinking water or wildlife habitats.

The junk science really goes into hyper-drive on climate change. Of course, it’s not just EPA. Virtually every Executive Branch agency has been enlisted in President Obama’s campaign to use “dangerous manmade climate change” to justify fundamentally transforming our nation’s energy, economic, legal and constitutional systems: from NASA and NOAA, to Agriculture and Interior, and even the Defense Department and Securities and Exchange Commission. The agenda overrides science and ethics.

EPA’s 54.5 mpg dictate for vehicles will force millions into smaller, lighter, plasticized cars that will not survive collisions with walls, trees, trucks and buses – causing thousands more serious injuries and deaths every year. That human toll is ignored in the agency’s “social cost of carbon” reports. So are the absence of hurricanes hitting the US mainland for 11 years, no rise in average global temperatures for 18 years, followed by a couple tenths of a degree since then, and the barely seven inches per century in Real World sea level rise, contrary to climate models and White House, EPA, IPCC and Al Gore assertions.

Equally absurd, these regulators are hobbling the US economy, while China, India and other developing nations produce and use increasing amounts of oil, natural gas and coal every year. Perhaps worse:

Federal regulations cost US businesses and families $1.9 trillion per year – with EPA alone accounting for $353 billion of that. This is a major reason for America’s anemic 1.1% annual economic growth and its worst labor participation rate in decades. As always, poor and minority families are hit hardest. And far too many of these regulations and costs are based on questionable, fabricated, even fraudulent science.

To top it off, illegal, unethical collusion has also become rampant at EPA: in sue and settle lawsuits, Alaska’s Pebble Mine permits, the Clean Power Plan, and helping climate activists with fund raising.

If these actions were committed by a private corporation, EPA and Justice Department SWAT teams would come after its executives, with no intent, negligence or knowledge required. But Ms. McCarthy and her staff have not been held to any such Park Doctrine standards – at least not yet.



Energy Facts and Figures for Dummies Part VI


(This is part VI of a series on energy. We hope it helps the reader better understand the issues facing our country and our state as we endeavor to meet the energy needs of our citizens and our businesses.)

Coal discussion, continued from last time…

Consumption, Production, Exports, and Prices

* In 2015, the U.S. produced 895 million short tons of coal, consumed 802 million short tons, and had net exports of 63 million short tons.[534] [535]

* Between 2008 and 2015, U.S. coal consumption declined by 28%, primarily as a result of lower natural gas prices and stricter environmental regulations:[536] [537] [538] [539]

Coal Production, Consumption, Exports


* In 2014, the average domestic price of coal was $35.72 per ton.[541]

Inflation-Adjusted Average U.S. Coal Prices [542] [543]

* Demand for electricity varies on an hourly, daily, and seasonal basis due to factors such as:

  • the time of the day, which influences the usage of lighting, computers, and other electric devices.
  • the weather, which influences the usage of heating, air conditioning, and ventilation systems.[544] [545] [546]

* As shown in the following graph, the terms “baseload” and “peak load” are used to describe the minimum and maximum demands for electricity over a given time period. The term “intermediate load” is used to describe the range between them.[547] [548]

Electricity Load Curve Example
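
Since the load-curve graphic itself is not reproduced here, a minimal sketch with invented hourly demand figures may help illustrate the three terms:

```python
# Illustrative only: derive baseload, peak load, and intermediate load from a
# day's worth of hourly electricity demand. The demand numbers are invented.

hourly_demand_mw = [
    620, 600, 590, 585, 600, 650, 720, 800, 860, 900, 930, 950,
    960, 955, 945, 940, 950, 980, 1000, 990, 940, 860, 760, 680,
]

baseload = min(hourly_demand_mw)     # demand that must be met around the clock
peak_load = max(hourly_demand_mw)    # the highest demand of the period
intermediate = peak_load - baseload  # the swing served by plants that ramp up and down

print(f"Baseload:     {baseload} MW")
print(f"Peak load:    {peak_load} MW")
print(f"Intermediate: {intermediate} MW of swing between baseload and peak")
```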


* Coal is the dominant energy source for generating baseload capacity because, once built, coal plants’ low fuel costs make them inexpensive to run continuously.[550] [551] [552]

* Natural gas is the dominant energy source for generating intermediate and peak load capacity because:

  • natural gas power plants can ramp up and down quickly, which is ideal for intermediate and peak load capacity.
  • natural gas power plants are less expensive to build than coal and nuclear power plants.[553] [554] [555] [556]

* In 2009, natural gas became competitive with coal for generating baseload capacity in some areas of the U.S. This was primarily due to increased domestic natural gas production, which reduced prices. Other factors included increased coal prices, stricter environmental regulations, and expansion of natural gas pipelines.[557] [558] [559] [560]

* In 2012, both coal and natural gas fuels were competitive for generating baseload capacity under differing circumstances in different regions of the U.S.[561]

* Due to their higher efficiency, natural gas power plants that employ a technology called “combined cycle” can generate baseload power less expensively than coal plants when natural gas is about 1.5 times the price of coal.[562][563] [564] In 2015, the average energy-equivalent price paid by electric power plants for natural gas was about 1.5 times the price of coal.[565]
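
The roughly 1.5-to-1 break-even ratio follows from the plants’ relative efficiencies. Here is a rough sketch of the fuel-cost arithmetic; the heat rates and fuel prices are typical round numbers assumed for illustration, not figures from the cited sources.

```python
# Rough fuel-cost comparison behind the ~1.5x break-even rule of thumb for
# combined-cycle gas vs. coal baseload generation. Heat rates are typical
# round numbers (Btu of fuel per kWh of electricity), assumed for illustration.

COAL_HEAT_RATE = 10_000  # Btu/kWh, conventional coal steam plant (assumed)
NGCC_HEAT_RATE = 6_700   # Btu/kWh, natural gas combined-cycle plant (assumed)

def fuel_cost_per_mwh(fuel_price_per_mmbtu: float, heat_rate_btu_per_kwh: float) -> float:
    """Fuel cost in $/MWh = fuel price ($/MMBtu) times heat rate (MMBtu/MWh)."""
    mmbtu_per_mwh = heat_rate_btu_per_kwh * 1_000 / 1_000_000
    return fuel_price_per_mmbtu * mmbtu_per_mwh

if __name__ == "__main__":
    coal_price = 2.00                     # $/MMBtu, assumed for illustration
    for gas_multiple in (1.0, 1.5, 2.0):  # gas price as a multiple of the coal price
        gas_price = coal_price * gas_multiple
        coal_cost = fuel_cost_per_mwh(coal_price, COAL_HEAT_RATE)
        gas_cost = fuel_cost_per_mwh(gas_price, NGCC_HEAT_RATE)
        print(f"Gas at {gas_multiple:.1f}x coal price: coal fuel ${coal_cost:.2f}/MWh "
              f"vs. combined-cycle gas ${gas_cost:.2f}/MWh")
```

At about 1.5 times the coal price, the gas plant’s efficiency advantage roughly offsets its more expensive fuel, which is the break-even point the cited sources describe.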


* In the U.S., coal is mined in two primary ways: surface mining and underground mining. Per the U.S. Department of Energy:

Surface mining accounts for about 60 percent of the coal produced in the United States. It is used mostly in the West where huge coal deposits lie near the surface and can be up to 100 feet thick.

In surface mining, bulldozers clear and level the mining area. Topsoil is removed and stored for later use in the land reclamation process. Specially designed machines … expose the coal bed. Smaller shovels load the coal into large trucks that remove the coal from the pit.

Before mining begins, coal companies must post bonds for each acre of land to be mined to assure that it will be properly reclaimed. In the reclamation process … the area is restored as nearly as possible to its original contours. Since 1977, more than 2 million acres of coal mine lands have been reclaimed in this manner.

Where coal seams are too deep or the land is too hilly for surface mining, coal miners must go underground to extract the coal. Most underground mining takes place east of the Mississippi, especially in the Appalachian mountain states and is used to produce about 30 percent of U.S. coal today.[566]

* In 2014, 16 U.S. coal workers were killed while working.[567] [568] Thanks to technological advances, improved safety measures, and stricter regulations,[569] [570] coal worker fatalities have declined from a high of 3,242 in 1907 to a low of 16 in 2014:

Coal Worker Fatalities


* Per the Encyclopædia Britannica:

Coal mines and coal-preparation plants caused much environmental damage in the past. Surface areas exposed during mining, as well as coal and rock waste (which were often dumped indiscriminately), weathered rapidly, producing abundant sediment and soluble chemical products such as sulfuric acid and iron sulfates. Nearby streams became clogged with sediment, iron oxides stained rocks, and “acid mine drainage” caused marked reductions in the numbers of plants and animals living in the vicinity. Potentially toxic elements, leached from the exposed coal and adjacent rocks, were released into the environment. Since the 1970s, however, stricter laws have significantly reduced the environmental damage caused by coal mining.[572]

Natural Resources

* The U.S. has more recoverable coal reserves than any other nation, amounting to one quarter of the world’s coal resources.[573] [574]

* Based on U.S. Energy Information Administration (EIA) estimates from 2011, the U.S. has roughly 262 billion short tons of recoverable coal reserves, comprising 23 billion tons of lignite, 96 billion tons of subbituminous coal, 139 billion tons of bituminous coal, and 4 billion tons of anthracite. These resources amount to the following (the arithmetic is sketched after the list):

  • 285 years of lignite production at the 2011 production rate.
  • 187 years of subbituminous coal production at the 2011 production rate.
  • 277 years of bituminous coal production at the 2011 production rate.
  • 1,764 years of anthracite production at the 2011 production rate.[575] [576] [577]
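
The years-of-supply figures above are simply recoverable reserves divided by the annual production rate. A minimal sketch of that arithmetic follows; the 2011 production rates shown are approximations back-calculated from the article’s own numbers, included only to make the example runnable.

```python
# Years of supply = recoverable reserves / annual production. Reserve figures
# are the EIA 2011 estimates cited above; the 2011 production rates are
# approximate values back-calculated from the years-of-supply figures in the
# list above (i.e., they are assumptions for illustration).

reserves_billion_tons = {      # recoverable reserves, billions of short tons
    "lignite": 23,
    "subbituminous": 96,
    "bituminous": 139,
    "anthracite": 4,
}

production_million_tons_2011 = {  # approximate 2011 production, millions of short tons
    "lignite": 81,
    "subbituminous": 513,
    "bituminous": 502,
    "anthracite": 2.27,
}

for coal_type, reserves in reserves_billion_tons.items():
    annual = production_million_tons_2011[coal_type]
    years = reserves * 1_000 / annual  # convert billions of tons to millions of tons
    print(f"{coal_type}: roughly {years:,.0f} years of supply at the 2011 production rate")
```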


* Nuclear energy is so-named because it is stored in the nuclei of atoms. Through the process of fission, this energy is transformed into heat, which can be used to power steam boilers that drive electricity-generating turbines.[578] [579]

* Uranium is the primary fuel used in nuclear power plants because the process of fission is most easily achieved with elements with heavy nuclei, and uranium is the “heaviest naturally-occurring element available in large quantities.”[580] [581]

* The world’s first controlled nuclear fission reactor was built in the U.S. by Italian physicist Enrico Fermi, and it became operational in 1942.[582] The world’s first nuclear-powered electricity plant was built in the Soviet Union, and it became operational in 1954.[583]

* Through fission, a single pound of uranium can generate as much energy as burning three million pounds of coal.[584]

* In 2015, nuclear energy supplied 8.5% of all primary energy consumed in the United States:

Nuclear Energy Usage


* In 2015, nuclear energy generated 19.5% of all electricity produced in the U.S.[586]

Baseload Power

* Demand for electricity varies on an hourly, daily, and seasonal basis due to factors such as:

  • the time of the day, which influences the usage of lighting, computers, and other electric devices.

  • the weather, which influences the usage of heating, air conditioning, and ventilation systems.[587] [588] [589]

* As shown in the following graph, the terms “baseload” and “peak load” are used to describe the minimum and maximum demands for electricity over a given time period. The term “intermediate load” is used to describe the range between them.[590] [591]

Electricity Load Curve Example


* Nuclear power is a major source of baseload capacity because once built, low fuel costs make nuclear plants inexpensive to run continuously, which is ideal for generating baseload capacity.[593] [594] [595]


* Because the products of nuclear fission emit hazardous levels of radiation, generate heat, and could be used in weapons called “dirty bombs,” they must be reprocessed and/or stored in secure locations and cooled.[596] [597][598] [599] [600] [601]

* Waste and fuel from commercial nuclear power plants cannot accidentally or intentionally be used to produce a nuclear blast. Such explosions require different grades of materials than those used and produced by commercial power plants.[602] [603]

* Nuclear power plant operators must pay up-front fees to the federal government for the future costs of decommissioning of their plants, thus making it impossible for operators to avoid these costs through bankruptcy after the plant closes.[604] [605] [606]

* The Nuclear Waste Policy Act of 1982 required the federal government to:

  • take responsibility for storing waste from nuclear power plants and find at least one suitable location to store it.
  • collect fees from nuclear power plant operators for storing the waste.
  • start accepting waste from the power plants by 1998.[607] [608] [609] [610] [611]

* A 1987 law directed the federal government to evaluate storing the waste in Yucca Mountain, which is located on a 230-square-mile plot of federal land in the Mojave Desert of southern Nevada:[612] [613]

Yucca Mountain


* Current law limits the amount of fuel that could be stored at Yucca Mountain to 70,000 metric tons, which is about equal to the nation’s current nuclear waste. Per evaluations performed by the Department of Energy, at least 3-4 times this amount can be safely stored at Yucca.[615] [616]

* At a cost of hundreds of millions of dollars during the 1990s, the U.S. Department of Energy drilled a 5-mile-long, 25-foot-diameter tunnel into Yucca Mountain, along with a 2-mile-long tunnel that branches off of it.[617] [618] [619]

* A 2002 federal law approved the Yucca site for permanent nuclear waste storage.[620] [621]

* By 2006, Minnesota had banned the construction of new nuclear power plants, and 11 other states had restricted the construction of new plants until certain provisions for long-term disposal of nuclear waste are met.[622] [623] [624]

* In June 2008, the Bush administration Department of Energy (DOE) submitted an application to the Nuclear Regulatory Commission (NRC) for approval to construct a waste repository at Yucca Mountain.[625]

* In March 2009, the Obama administration DOE announced that it was going to terminate the Yucca Mountain repository. Inquiries to DOE by the U.S. Government Accountability Office and Nuclear Regulatory Commission found that the decision “was made for policy reasons, not technical or safety reasons.” Per the Obama administration DOE:

[The Energy] Secretary’s judgment is not that Yucca Mountain is unsafe or that there are flaws in the license application, but rather that it is not a workable option and that alternatives will better serve the public interest.[626][627]

* After this announcement, the Obama administration moved to shut down the Yucca Mountain program by September 2010 by terminating leases and contracts, archiving documents, eliminating the jobs of all federal employees working on the project, and disposing of or transferring federal assets used for the project.[628]

* Between 1983 and 2011, the federal government spent roughly $15 billion “to evaluate potential nuclear waste repository sites, evaluate the Yucca Mountain site in more depth, and develop and submit the license application for it.”[629]

* Between 1983 and 2011, nuclear power plant operators paid more than $30 billion in fees (including earned interest) to the federal government to dispose of nuclear waste. The government used $9.5 billion of these fees “to evaluate potential nuclear waste repository sites, evaluate the Yucca Mountain site in more depth, and develop and submit the license application for it.”[630] [631]

* At the end of 2014, the U.S. had more than 74,000 metric tons of commercial nuclear waste, most of which is being stored at nuclear power plants.[632] [633] [634] [635]

* Because the federal government breached its responsibility to start taking waste from power plants in 1998, it has paid $5.3 billion in court-ordered damages and settlements to power plant operators as of September 2015.[636] [637]

* In 2015, the Inspector General of the Department of Energy estimated that the federal government’s total liabilities for breaching this responsibility will amount to $29 billion. The nuclear power industry estimates that it will be at least $50 billion.[638]

* In 2013, a three-judge panel of the District of Columbia Court of Appeals ruled (2-1) that the NRC “was violating federal law by declining to further process the license application” for the Yucca facility. The court ordered the NRC to continue this process.[639] [640]

* After this ruling, the NRC published reports in 2014 and 2016 finding that the Yucca facility could safely store nuclear waste for a million years.[641] [642] [643]
