COAL MINING PRODUCTIVITY: THE WHOLE STORY
Coal plays an important role in our national well-being: it provides more than a fifth of the energy and half of the electricity consumed in the United States. Much attention therefore focuses on coal mining productivity: how much coal can be mined for each hour of labor spent? National statistics on coal mining productivity show that--except during the 1970s--coal mines have become steadily more efficient.
However, an Energy Laboratory study by Dr. A. Denny Ellerman and Professors Thomas Stoker and Ernst Berndt suggests that those national statistics do not tell the whole story. When the researchers analyzed productivity data for more than 19,000 mines from 1972 to 1995, they found that some regions and some technologies lagged far behind others. For example, while western longwall mines were five times more productive in 1995 than in 1972, other types of mines improved by less than half as much.
Detailed analyses of why productivity changed brought some unexpected results. For example, even after accounting for geology and technology, bigger mines were more productive than smaller ones. Also, prices affect the national aggregates. When coal prices increase relative to labor prices, companies open smaller mines with less favorable geology and overall productivity drops. Indeed, according to the analysis, price increases were more important than new regulations in causing overall productivity to plummet in the 1970s.
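That composition effect can be made concrete with a small numeric sketch (the figures below are hypothetical, not from the study): even if no individual mine becomes less efficient, opening smaller mines with less favorable geology lowers the industry-wide average of tons produced per labor hour.

```python
# Hypothetical illustration of a composition effect: aggregate
# productivity falls when low-productivity mines enter the mix,
# even though every existing mine's productivity is unchanged.

def aggregate_productivity(mines):
    """Industry-wide tons per labor hour: total output / total hours."""
    total_tons = sum(tons for tons, hours in mines)
    total_hours = sum(hours for tons, hours in mines)
    return total_tons / total_hours

# Year 1: two large, geologically favorable mines (5.0 tons/hour each).
year1 = [(500_000, 100_000), (500_000, 100_000)]

# Year 2: the same two mines, plus small new mines with poor geology
# (2.0 tons/hour each) opened in response to higher coal prices.
year2 = year1 + [(100_000, 50_000), (100_000, 50_000)]

print(aggregate_productivity(year1))  # 5.0
print(aggregate_productivity(year2))  # 4.0 -- the aggregate drops
```

No mine got worse between the two years; the aggregate fell purely because the mix shifted toward less productive operations, which is the kind of change that national averages alone cannot distinguish from a genuine decline in efficiency.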
The study shows that aggregated national productivity data do not provide an accurate picture of the efficiency with which an industry uses its resources or of the causes of changes in overall productivity.
Dr. Ellerman is a senior lecturer at the Sloan School and executive director of the Center for Energy and Environmental Policy Research (CEEPR). Professor Stoker is the Gordon Y Billard Professor of Applied Economics; Professor Berndt is the Louis E. Seley Professor of Applied Economics. Their research was supported by CEEPR.
REDUCING DOWN TIME IN NUCLEAR POWER PLANTS
Today, the typical US nuclear power plant spends almost two out of every 18 months shut down for refueling. As owners of such plants face new competition for customers, they're looking for ways to reduce costs, and refueling less often is one option.
Working closely with power plant operators, MIT Energy Laboratory researchers have designed reactor cores and operating procedures that would enable power plants to run for up to about four years before needing to refuel. Because of the extra cost of the more highly enriched fuel required, adopting a four-year "extended operating cycle" under today's economic conditions would be cost-effective only at plants that now experience relatively long down times for refueling and forced shutdowns, not at plants that already operate more efficiently.
A three-year operating cycle requiring less highly enriched fuel would bring savings at many more plants. And if laser-based technology now being developed reduces the cost of enriched uranium, the economics of the extended cycles would improve significantly.
Perhaps most important, the MIT team identified strategies that plant operators can use to reduce forced shutdowns and to perform more maintenance procedures while their plants are on line. The researchers emphasize that any reduction in down time will not only reduce costs but also prevent possible long-term damage to plants caused by repeated stopping and starting.
The team is led by Neil Todreas, the KEPCO Professor of Nuclear Engineering, who holds appointments in the Departments of Nuclear Engineering and Mechanical Engineering. The research was funded by the DOE's Idaho National Engineering and Environmental Laboratory University Research Consortium.
This column features summaries of MIT research drawn from several sources. If you have an item to suggest, send it to Elizabeth Thomson, News Office assistant director for science and engineering news, Rm 5-111, or firstname.lastname@example.org.
A version of this article appeared in MIT Tech Talk on January 13, 1999.