
President's Report supplement

"WE MUST REMIND OURSELVES, AND THE PUBLIC, THAT OUR VALUE TO PRACTICAL CONCERNS LIKE HEALTH, ECONOMIC PRODUCTIVITY, AND NATIONAL SECURITY ACCRUES ULTIMATELY FROM OUR ENTHUSIASM FOR MYSTERIES - OUR READINESS, AND THAT OF OUR STUDENTS, TO EXPLORE THE TRULY UNKNOWN."

This is a period in American higher education when it is essential that research universities articulate their value to the nation and world. We are operating in a political and economic environment that requires increased efficiency and cost effectiveness on the part of all of its institutions, including universities. To many, being "cost effective" implies that our education and research programs - particularly in engineering and science - must be more clearly and directly relevant to industry and other pervasive human endeavors. As a result, the discourse about our role in the community often is focused on university contributions that have obvious, widespread, and positive impact. Research universities in general, and MIT in particular, have had and will continue to have an extremely strong story to tell in this context. It is inherent in our institutional nature, and we are proud of it.

These circumstances invariably lead us to highlight our recent accomplishments, discuss current trends in education, and provide indicators of important technology transfer and medical advances - in other words, to talk about what we have already learned. Yet as we consider the nature of universities, and as we continue a dialogue with the public, we would do well to remember that the ultimate rationale for supporting a university system derives more from the unknown than the known.

It is the romance of discovery that draws young people to study and to pursue careers in science. It is the dream of creating entirely new devices, materials, and techniques that drives engineers. Humanists and social scientists look for new insights into the human psyche and social systems. Architects and planners seek new aesthetics and systems to enhance the quality of our lives. Management experts explore new principles upon which to organize institutions and the way we work.

It is a fact of modern life that research universities must increase their connections with the worlds of industry and professional practice. We must teach our students to relate analysis and theory to the practical and the concrete. However, it is the pursuit of the truly unknown - of principles, insights, materials, and organisms of which we currently have no inkling - that will yield the greatest rewards for a society that invests in education, scholarship, and research. New knowledge can advance the human spirit, strengthen the economy, and enhance the quality of life.

My annual report gives me an opportunity to reflect on the reasons why we in the academy dedicate our careers to research and education. This year, in preparing this report, I asked several members of the faculty to give me their reasons - in the form of the questions and puzzles they are seeking to solve. Their replies were illuminating both in content and in what they showed of different styles of thought. Even with their contributions, this report can offer only a tiny sampling of the countless gateways to the unknown. This sampling, however, offers more than sufficient justification for investing personal energy and public resources in building individual careers and major institutions devoted to education and research.

Interestingly, issues of what we can and cannot know or predict permeated many of the examples. Historically, we have employed science to discover basic principles on which we could then base practical predictions. Through engineering we have used basic principles and predictions to develop devices and systems to accomplish work, heal disease, travel, harness energy, communicate, learn, entertain, and create wealth. In the modern world, however, we must deal with ever larger and more complex issues and systems, both natural and constructed. As we do so, we must be willing to consider the limits of our historical strategies.

THE EARTH AND ITS CLIMATE

Such questions bear on matters of immense practical value, such as the ability to predict climate and earthquakes.

Climate can be loosely understood as the long-term average state of the weather, and results from the complex interactions among the atmosphere, biosphere, oceans, and land masses. Given the fact that there are natural variations in climate that have enormous impact on our lives, as well as the prospect that human activities, notably the burning of fossil fuels, might trigger a cascade of dangerous changes in climate beginning with "global warming," there is widespread agreement that we need to improve our capability to predict climate in order to inform public policy. This societal need has emerged as one of the greatest challenges we face in the natural sciences.

Computer modeling is one of the most valuable tools in the arsenal of scientists who study climate. However, even the most elaborate climate models, running on the most powerful computers, cannot reproduce today's climate without introducing uncomfortable levels of artificiality. Improving the computer codes may not be sufficient to answer the basic questions about the climate, because we do not know, even in principle, which aspects of climate are predictable. We must turn to other modes of analysis to address the issue of predictability.

Another tool available to scientists who work with very large, complex systems is "chaos theory," which grew out of the insights of MIT professor E. N. Lorenz. In his studies of weather, which is the short-term behavior of the atmosphere, Lorenz discovered that perturbations of a system that are so small as to be unobservable can lead to dramatically differing results over time. Chaos is now known to have applications in areas as diverse as chemical reactions and heart disease.
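
Lorenz's point about tiny perturbations can be made concrete with a few lines of code. The sketch below is a minimal illustration, not taken from the report: it integrates the standard Lorenz equations from two starting points that differ by one part in a million and prints how far apart the trajectories drift over time.

```python
# Minimal illustration of sensitive dependence on initial conditions in the
# Lorenz system: dx/dt = s(y - x), dy/dt = x(r - z) - y, dz/dt = xy - bz.
# Standard parameters s=10, r=28, b=8/3; simple Euler integration for brevity.

def lorenz_step(x, y, z, dt=0.001, s=10.0, r=28.0, b=8.0 / 3.0):
    dx = s * (y - x)
    dy = x * (r - z) - y
    dz = x * y - b * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def trajectory(x0, y0, z0, steps=40000):
    x, y, z = x0, y0, z0
    path = []
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
        path.append((x, y, z))
    return path

# Two starting points differing by one part in a million in x.
a = trajectory(1.0, 1.0, 1.0)
b = trajectory(1.000001, 1.0, 1.0)

# Print the separation every 5,000 steps: it grows from about 1e-6
# to the full size of the attractor.
for i in range(0, 40000, 5000):
    sep = sum((p - q) ** 2 for p, q in zip(a[i], b[i])) ** 0.5
    print(f"t = {i * 0.001:5.1f}   separation = {sep:10.2e}")
```

An initial difference far too small to measure grows, within a few tens of model time units, into trajectories that bear no resemblance to each other, which is exactly why long-range weather forecasts degrade so quickly.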

Scientists are now trying to learn what elements of climate are chaotic, as well as how interactions among the subsystems of climate, such as the oceans, the polar ice caps, and the clouds that help cool the earth, will amplify or damp the human impact on climate. They are also trying to refine precisely what we mean by prediction: what we have to know, in what detail, over what time span, in order to satisfy particular needs of society.

Many of the unknowns in our understanding of climate have parallels in the study of earthquakes, another physical system in which the ability to predict events in the short term could have tremendous benefits for individual lives and national economies. We know that earthquakes occur primarily at the boundaries of the earth's tectonic plates, and that the most active boundaries are situated in regions where the populations are increasing and mega-cities are developing most rapidly. Further, there is evidence that some large earthquakes appear to be predictable: premonitory phenomena led to the evacuation of the Chinese city of Haicheng, for example, before it was destroyed by a magnitude 7.5 earthquake in 1975.

By contrast, sensitive instruments in place in Kobe, Japan, and Northridge, California, showed no systematic premonitory events leading up to recent earthquakes that resulted in significant suffering and property damage. The problem is that there are many types of earthquakes occurring in different geological settings, and we do not know which classes of earthquakes are predictable. We don't understand the processes that lead to ground failure, or the interactions among earthquakes and other events that occur along fault systems. We don't know with any reliability how serious an event will occur, or where and when, yet that is the level of understanding we need to protect lives and property.

HUMAN SYSTEMS AND ORGANIZATIONS

Understanding the behavior of physical systems like the climate, weather, and the earth's crust is one thing. Systems and organizations that involve human beings are quite another. These systems have the ability to think, communicate, adjust themselves to changing conditions, and to intentionally change themselves. This lends an even greater level of complexity to understanding and prediction. Still, we know empirically that the behavior of certain aspects of such systems can be understood in an approximate sense and that there are basic principles, though of a much less deterministic nature than in the case of physical systems.

For example, we have a number of very reliable indices of national economic growth and many years of data on individual countries. We can show that for the last few decades, rapid economic growth, with an associated rise in standard of living, has occurred in a number of less-developed Asian countries, but not in many less-developed African countries. There have also been substantial variations in growth rates among the leading industrialized nations, where only Japan and Germany have approached the rapid growth rates of the best-performing Asian economies in the post-World War II period. Yet we do not know why national economies grow at such different rates, either at a particular moment or over time.

We know the likely factors that affect economic growth--education, capital accumulation, national investment in research and development, tax structures, trade policies, regulation, and basic legal and political structure. The relative importance of these factors and their interactions, however, is not known with any degree of precision, yet governments continue to develop and implement economic policy. In fact, governments routinely fall because they fail to live up to public expectations for growth--a situation where lack of knowledge of what works can actually contribute to worldwide political instability.

On a smaller scale, we do not know what the successful organization of the coming decades will look like. Even the most experienced business leaders cannot predict which companies will thrive and which will go under. By drawing on such fields as coordination science, information technology, learning theory, and strategic analysis, we hope to find the principles that will provide the basis of organizations that are efficient, flexible, innovative, and successful over time.

USING INFORMATION AND INFORMATION TECHNOLOGY

Some of the most profound and pervasive changes in the nature of organizations and economies are being created by rapidly expanding access to information. Even nations, for centuries our largest and most dominant organizations, will not be immune. We do not know what consequences the explosion in networked electronic communications will hold for the nation state.

The enormous collective bandwidth of the Internet makes it quite unlike the telephone, and it has the potential to create a new kind of "society," an entity in itself. We cannot predict whether we will have a society of very local nets, centered on individuals and small groups, or a massive global society. We do not know the consequences in either case, nor how to steer these developments even if we could determine what outcome is desirable. But clearly, the outcomes will affect the very fabric of our communities and our own daily lives. Already we are faced with the development of extragovernmental systems that are not only rich in information but that operate around the world. One need only think of the problem of organizing and operating very large scale, integrated, international systems such as a global air-traffic control system to see the magnitude of the challenge.

And on the interface of learning and information technology, we confront the fact that we really do not know how best to use our information infrastructure and new media to promote learning among children, particularly among those children whose home and community environments do not nurture and reinforce the most positive kinds of learning. Nor have we made more than a dent in the potential of information technology to promote lifetime learning among adults.

One feature of information technology--the vast archives of information available worldwide and the rapidly proliferating tools for accessing and manipulating this information--presents us with a particularly powerful and complex set of challenges. We do not know how the vast store of instantly available information can or will be understood and used.

Access alone does not assure that information can be located or understood. How can knowledge be gathered from disparate sources and then represented and shaped to enhance our understanding and our ability to use it productively? Can we strengthen our ability to transmit and understand concepts as well as simple facts? Can we better the odds that individuals of different age, language, experience, and culture will be able to assimilate and utilize the knowledge to which they now will have shared access?

These are not new problems, nor are they ones that are defined only in terms of modern information technology, but they are increasingly compelling. We need to explore the power of the human mind to better locate and use to advantage what already is known, and to take in newly available information and grasp its significance.

MEMORY, LANGUAGE, AND THOUGHT

There is no greater mystery than how we learn, remember, think, and communicate, and there is no field in which major advances would have more profound effects for human progress and health.

The achievements of human memory are astounding. We can easily recognize thousands of faces, innumerable visual scenes, and countless melodies and familiar voices. We execute motor skills like driving a car, playing a piano, or skiing. But we do not know how we learn and remember, or how we think and communicate. We do not yet know the chemical or physical nature of storage of information in the brain. We do not know where in the brain information is stored, how we retrieve it, or whether there are limits to the amount we can store.

There is every reason to believe, however, that continued, determined investigations of the biological basis of learning and memory, coupled with computer modeling, will greatly expand our understanding of the mind in the decades ahead. Not only is this an exciting scientific frontier, but a better understanding of the brain, brain chemistry, and the role of genetics may prove the key to vastly improved diagnostic and therapeutic techniques for chemically based mental illnesses such as schizophrenia and manic-depressive syndromes. Such advances would enable us to reduce both human suffering and staggering costs of health care.

Not only do we not understand brain function in the large, but in a more specific case, we do not understand the relationship between language and thought. Can we have thoughts that cannot be expressed in words? Can everything that can be expressed in one language be expressed in any other language as well? Cultural matters aside, linguists believe that the answer to both questions is no, but we do not know for sure. The answers are important if we are to perfect machine translation.

Another question in linguistics has even wider practical importance. All cultures have spoken language, but the discovery of written language is an historical rarity. That suggests that our biological endowment does not support reading the way it does spoken language. We do not know why or how it is that some children seem to learn to read with all the ease with which all children acquire spoken language. If we can find out, we may be able to greatly enhance our teaching of reading to every child and eventually bring down the illiteracy rate among adults as well.

No less important than cognitive science, linguistics, and biology in helping us to understand the processes of perception and thought, though from an entirely different perspective, are the disciplines of the arts--the domain of playwrights, musicians, sculptors, dancers, and their kin. They remind us that we still do not know the parameters of free will. Nor do we always know how to see through the accepted social conventions and get down to the truth beneath. How much of our lives is within our control? When does courage consist of accepting and working with our fate, and when does it consist of fighting back? Artists remind us continually that in much of human experience, questions cannot be answered just once, for all times and all places, but rather must be asked and answered by each generation, each culture, each individual. In a society and a world where rigidity of thought and inability to see another point of view constitute a deadly epidemic, that message is more crucial than ever.

ENERGY AND THE EFFICIENT USE OF RESOURCES

As world population and industrialization expand simultaneously, issues of the efficient use of natural resources and its relation to environmental quality are becoming paramount. The underlying questions come from such disparate disciplines as engineering, chemistry, economics, political science, and materials science. They affect both developed and developing societies.

For example, although economists can show that pollution imposes real social costs, markets do a poor job of encouraging individual and organizational players to incorporate these costs into their decision making. Governments have to step in, but the approaches they have adopted to date have not, by and large, encouraged industry to be efficient and technologically innovative in solving pollution problems. We know how to design policy tools to control pollution efficiently, at least in theory; until recently, however, policymakers have not relied on these instruments. Work is going on at MIT and elsewhere to evaluate the potential of various "economic instruments," such as taxes on specific emissions or tradable permits for certain effluents, to achieve both social and market goals--work that could make the policy choices more compelling.

We do not know how to produce materials with no waste by-products. In the most expansive sense, this is the objective of the emerging field of "industrial ecology," which tracks the production, use, and disposal or recycling of materials. Industrial ecologists concern themselves not only with the material inputs and outputs of a fabrication process, but with its energy requirements as well.

We do not know how to convert solar energy into practical, cost-efficient fuels for a wide variety of applications, nor do we know how to create advanced fuels for nuclear fission reactors. Renewable, safe alternative sources of energy are critical to our ability to enhance our quality of life while sustaining the quality of our environment.

On an even more fundamental level, we do not know how to extract all the energy from existing fuel sources. We know that a certain amount of energy is stored in chemical bonds, but when we burn a fuel to release it, we waste much of that energy as untapped heat and chemical by-products. And as anyone who has worked on technologies from spacecraft to pacemakers can tell us, the ability to milk every unit of energy from a power source could be a breakthrough of great practical importance.
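
A rough numerical illustration of how much chemical energy goes unused, using approximate, textbook-level figures rather than anything from the report: natural gas releases on the order of 50 megajoules per kilogram when burned, but a conventional power plant converts only a fraction of that into electricity, and the rest leaves as heat.

```python
# Rough illustration of energy lost in converting fuel to useful work.
# The heating value and efficiencies below are approximate, illustrative
# figures, not precise engineering data.
FUEL_ENERGY_MJ_PER_KG = 50.0   # approx. heating value of natural gas

def useful_energy(mass_kg, efficiency):
    """Energy delivered as work or electricity at a given conversion efficiency."""
    return mass_kg * FUEL_ENERGY_MJ_PER_KG * efficiency

for label, eff in (("older steam plant", 0.35),
                   ("modern combined-cycle plant", 0.55),
                   ("ideal, lossless conversion", 1.00)):
    print(f"{label:28s}: {useful_energy(1.0, eff):5.1f} MJ per kg of fuel")
# Even the best current plants leave a large fraction of the fuel's
# chemical energy as waste heat.
```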

An important quest of modern chemistry that bears on the efficient use of energy and resources has to do with catalysts--substances that cause reactions to speed up but are not consumed themselves. Catalysts are at the heart of most industrial processes in the chemical, petroleum, fertilizer, pharmaceutical, and related industries. And yet, we do not know how to design catalysts for many important chemical reactions. Many of the catalysts we presently employ were discovered serendipitously; scientists then worked backward to reconstruct in each case how the reaction might work. We need to discover more about the fundamental principles governing the operations of catalysts, principles that could enable us to design new catalysts that would have a host of implications for energy, the economy, and the environment.

Similarly, superconductivity, the ability of a material to carry electric current without any loss of energy, is a phenomenon that could have almost limitless practical applications. We know how superconductivity works at low temperatures, but the applications are limited by the difficulty of holding materials at appropriate temperatures. We also know that there are materials that can superconduct at much higher temperatures, but we do not fully understand how "high-temperature superconductivity" works. An understanding of this phenomenon would open the possibility of creating new materials to take us to the next step, room-temperature superconductivity, which holds out exciting promise for electric power storage and transmission, as well as ocean and rail transportation.

CANCER AND HEALTH

There was a time when the public hoped cancer might respond to an all-out attack with military singleness of purpose. We had defeated polio, ended centuries of death and disfigurement from smallpox, and created such a wealth of antibiotics that once life-threatening injuries were reduced to almost trivial annoyances.

Cancer, however, turned out not to be a single disease, in the sense of one causal agent and one set of symptoms. Rather, it is a condition of runaway cell growth triggered by the confluence of a multitude of causal factors and revealed in a multitude of physical responses. But cancer is yielding to discoveries of basic science, sometimes of quite surprising origin and nature, and cancer research has demonstrated clearly the importance of the interplay between fundamental science and applied fields like medicine and even engineering.

We still have more questions than answers, but understanding the basic cellular processes in all living organisms underlies our ability to understand and, ultimately, to prevent or treat cancer.

We now know that genes are a key to our understanding, but we do not know all the specific genes whose mutations contribute to the development and progression of cancer, nor do we understand the mechanisms by which they do it. This includes both "oncogenes," genes that can cause cancer, and "tumor suppressor genes," genes that suppress excess growth and, if absent or damaged, allow tumors to develop. A number of genes in each category have been discovered, but many have not yet been identified, nor have all the properties of known genes been explored for their possible use in cancer treatment. For example, identification of such genes may lead to diagnostic tests that can identify high-risk individuals or identify which cancers are treatable by radiation or chemotherapy.

We also do not know how and why cells die. The suppression of normal cell death--essentially cell suicide, known as "apoptosis"--is believed to be involved in the growth of certain cancers, and promoting the controlled death of particular cells is obviously the objective of much of cancer treatment. This also has implications in the understanding of auto-immune and neuro-degenerative diseases.

Yet another puzzle is this: We do not know why tumor cells migrate to new sites in the body. This question is closely related to a more general problem that arises in developmental biology: how do different cells know where to go during the development of an embryo? Here is yet another example of how studying the fundamental process can shed light on any number of related cases, such as why white blood cells home in on a site of infection or inflammation. We do know that there are areas on the cell surface (called adhesion receptors) that control how cells attach to their neighbors and whether or not they migrate. While research is far enough along that clinical trials are underway on blockers for adhesion in platelets and in white blood cells, not enough is yet known about the receptors that are actually used by the cells in spreading cancer to a new site. Once we know that, it may then be possible to inject patients with drugs that block the adhesion receptors on stray cancer cells and prevent them from binding to new sites in the body.

In the questions above, as in many others, the outline of the unknown evolved: the basic advances did not come from people looking at how to block metastasis, but from scientists trying to understand basic cell biology. Further advances in health-related fields will come from the increasing interaction of biology with other scientific and engineering disciplines.

For example, we do not know the threshold of safe exposure of living organisms to radiation. Cell biologists will be called upon to help nuclear scientists and engineers find out. We know that such a threshold exists. Cells have an inherent capacity to repair small changes caused by atomic or nuclear interaction, a capacity that allows biological life to flourish amidst the background radiation of the natural environment. Safety standards are essential for the medical use of radiation as well as in regulating emissions from industrial applications of radioactive materials. A biologically based safety standard is likely to offer far more reliable protection than our present standard, which is based on extrapolations from the effects of high levels of radiation.
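
The difference between the two kinds of standard can be made concrete with a toy comparison. The sketch below uses entirely hypothetical numbers, chosen only to show the shape of the two models: a linear extrapolation from high-dose effects assigns some risk to every dose, however small, while a threshold model assigns none below the level that cells can repair.

```python
# Toy comparison of two ways to estimate risk at low radiation doses.
# The slope and threshold are arbitrary illustrative values, not real dosimetry.

RISK_PER_UNIT_DOSE = 0.01   # hypothetical slope fitted to high-dose data
REPAIR_THRESHOLD = 5.0      # hypothetical dose (arbitrary units) cells can repair

def linear_extrapolation(dose):
    """Linear, no-threshold style estimate: risk proportional to dose."""
    return RISK_PER_UNIT_DOSE * dose

def threshold_model(dose):
    """Biologically based estimate: no excess risk below the repair threshold."""
    return RISK_PER_UNIT_DOSE * max(0.0, dose - REPAIR_THRESHOLD)

for dose in (0.5, 2, 5, 20, 100):
    print(f"dose {dose:6.1f}: linear {linear_extrapolation(dose):7.3f}   "
          f"threshold {threshold_model(dose):7.3f}")
# The two estimates nearly coincide at high doses but diverge sharply at the
# low doses that matter most for setting safety standards.
```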

The interface between biology and yet another discipline--mathematics--offers more unanswered questions. We do not know how viruses form their elegant, geometric structures from commonly occurring protein building blocks, nor do we understand the role of these structures in the infection process. By applying mathematical methods to analyze viral protein structure, we hope to gain sufficient understanding of the infection process to aid in the development of anti-viral drugs for applications from HIV to influenza.

And finally: We do not know how living cells interact with molecules of nonliving materials. The answers to this question hold the promise of making great strides in the development of artificial limbs, organs, and tissues. The opportunities here have spurred cell biologists and researchers in such areas as materials and chemical engineering to work together in the emerging new field of biomaterials.

THE PHYSICAL UNIVERSE

Humankind continues to passionately pursue the age-old questions about our universe. We do not know how old the universe is, what it is made of, or what its fate will be; we do not understand what mechanism generates mass in the basic building blocks of matter.

On a somewhat more modest scale, we do not know whether stars other than our own sun have earth-like planets capable of sustaining life, and we do not yet have the ability to detect life on such planets, or even good methods of detecting the planets themselves.

Even the basic mathematics that is the language of theoretical physics must still be advanced in fundamental ways. We do not understand, in more than a limited way, three-dimensional spaces and four-dimensional geometries of space and time. Recent ideas from quantum field theory have given mathematicians novel, effective tools to help classify these spaces and perceive their shapes. New insights in these areas are expected, for example, to change our conceptions of the "big bang" and the expanding or contracting universe, which have been based, of necessity, on our present-day theories of four-dimensional geometries.

We don't know whether antimatter comes from other galaxies; the answer would bear on fundamental questions about the origin of the universe. Nor do we know whether the universe is indeed predominantly constituted of so-called dark matter. Most basic knowledge of the physical universe is sought with both ground-based and space-based instruments, whose development is made possible by the advancing state of the art in engineering, electronics, computers, and communications technology. In turn, instrumentation development often advances our engineering capabilities. One noteworthy experiment that will address such basic issues is the Alpha Magnetic Spectrometer (AMS), which will be placed on the International Space Station.

Even though space-based instruments are doing much to advance our knowledge of the universe, we are still drawn by the adventure of human exploration of space. Further human exploration of the solar system presents exciting opportunities for space science and challenges for space technology, but the biggest unknown at present remains the crew itself.

We do not know how to plan a mission to Mars that would not result in a dangerously unhealthy crew. Current knowledge and experience indicate that available countermeasures such as exercise may not be adequate to offset the deconditioning effects of prolonged weightlessness. We must find ways either to dramatically shorten the journey or to develop some means of artificial gravity that will provide a more earth-like inertial environment for the long trip to and from Mars.
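
One way to make the artificial-gravity option concrete is the elementary relation a = omega^2 * r for a rotating structure. The sketch below is a back-of-the-envelope illustration, with radii chosen arbitrarily rather than drawn from any mission design: it shows the spin rates needed to simulate one Earth gravity at several radii.

```python
# Back-of-the-envelope: spin rate needed for a rotating spacecraft (or a
# tethered pair of modules) to simulate gravity, using a = omega^2 * r.
# The radii below are arbitrary illustrative values, not mission parameters.
import math

G_EARTH = 9.81  # m/s^2, target centripetal acceleration (1 g)

def spin_rpm(radius_m, accel=G_EARTH):
    """Rotation rate, in revolutions per minute, giving `accel` at `radius_m`."""
    omega = math.sqrt(accel / radius_m)   # angular velocity in rad/s
    return omega * 60.0 / (2.0 * math.pi)

for r in (10, 50, 200, 1000):
    print(f"radius {r:5d} m -> {spin_rpm(r):5.2f} rpm for 1 g")
# Small radii demand fast spins, which raise concerns about Coriolis effects
# on the crew; gentle spin rates require structures hundreds of meters across.
```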

CONCLUSION

These questions--about our physical universe, our social systems, our biological systems--represent the thoughts of only a handful of faculty at one institution, albeit a faculty and an institution that are world leaders. Such questions cause us to look to the future rather than the past, a particularly appropriate focus for the MIT community and those who would share our adventure.

Being able to shape good questions is a critical capacity for every teacher and learner; it is the key to education. Unanswered questions are the single most valuable thing we lay before our graduate students. The "right thesis topic" is the question that will open not one door but many; in fact, it is a question that will lead to a whole career's worth of new questions.

Doc Edgerton once remarked that students were always coming to him worried that all the really interesting problems had been solved, that there wasn't a lot of the fun of discovery left for young scientists and technologists. Somebody had already invented the silicon chip and cloned the first gene; what was left? Doc, of course, took the opposite tack; he thought the world was a sea of wonderful puzzles. That was probably the secret of his great success, as an inventor and researcher and as a teacher: his zest for the unanswered questions.

His is a mantle the entire university community must take up. We must remind ourselves, and the public, that our value to practical concerns like health, economic productivity, and national security accrues ultimately from our enthusiasm for mysteries--our readiness, and that of our students, to explore the truly unknown.

Charles M. Vest, November 1995

---------------------------------------------------------------

IN SPECIAL RECOGNITION

This eventful year saw a number of changes within the faculty and staff of MIT.

Many of these changes significantly affected the Institute's senior administration, including the resignation of Professor Mark S. Wrighton as provost, and the appointment of Professor Joel Moses as his successor.

Professor Wrighton, a member of the chemistry faculty at MIT since 1972 and provost since 1990, became chancellor of Washington University in St. Louis on July 1, 1995. Dr. Wrighton, the CIBA-GEIGY Professor of Chemistry, had become a full professor in 1977 at the unusually young age of 28. He headed MIT's Department of Chemistry from 1987 until he was named provost in 1990. He has mentored about 70 doctoral students at MIT, and is widely considered to be one of the nation's leading scientists. His extraordinary service and accomplishments as provost contributed greatly to the continued excellence and vitality of MIT.

Professor Moses, the Dugald C. Jackson Professor of Computer Science and Engineering and a member of the MIT faculty since 1967, had been head of the Department of Electrical Engineering and Computer Science from 1981 to 1989, and dean of the School of Engineering since 1991. As dean, Professor Moses set a new course for engineering education and has been widely recognized as a national leader in that effort.

A version of this article appeared in MIT Tech Talk on November 15, 1995.
