The nuclear power industry in the United States has had a history of wild swings from optimism to pessimism to fatalism. After the first wave of more than 100 nuclear reactors planned in the 1960s and 1970s was completed, two decades passed without a new reactor being built. Then, starting in the early 2000s, a new sense of optimism arose as the nuclear industry, electric utilities, the US government, and even some environmental organizations came to see nuclear power as a solution to global warming. With fossil fuel prices high (at a time before fracking technology drove natural gas prices to historic lows), most in the industry believed that new nuclear plants could be built quickly and be cost-competitive with other new power sources. These plants would incorporate new technology and advanced safety features, would be governed under streamlined regulations designed to avoid costly mid-construction design changes, would apply lessons learned from building the earlier plants, and would have access to competitive financing backed by federal loan guarantees. This was heralded as the beginning of the “nuclear renaissance.” Between 2007 and 2009, 13 companies applied to the Nuclear Regulatory Commission (NRC) for construction and operating licenses for 31 new nuclear power reactors in the US. Today, plans for virtually all of those reactors have been cancelled, and US nuclear power generation, which peaked in 2010, has been declining ever since (Figure 1).