



Part 1: Science, technology, and development


1 Modern science and technology
2 The story of development thinking
3 Measuring science, technology, and innovation


Part 1 sets the scene. Jean-Jacques Salomon first reviews the emergence of modern science: its successive institutionalization, professionalization, and industrialization. The fact that this process tended to happen in a different order in developing countries raises particular problems for them. In recent years, in industrialized countries the expansion of modern science and technology has gone hand in hand with the rise of science policy - that is, policy for science and policy through science - as a result of increasing concern about the impact of advances in science and technology on society. Science is linked to the state, and in the context of the Cold War there was a full-scale mobilization of scientific research. It is impossible to overestimate the importance of the innumerable innovations generated by economic competition and by defence-related R&D during this period, and especially the role they played in the conception and development of the new technologies that characterize the "new technical system" now flourishing. In an era of increasing international competitiveness, innovation rests on a much wider range of actors, institutions, and issues, raising a lively debate on the role of the state: how far should it intervene, under what circumstances, and on what criteria? The chapter ends with a discussion of the universality of science and the coexistence and complementarity of rationalities, which may challenge Western science as a unique model, but not its operational effectiveness.

What is development? Nasser Pakdaman traces the evolution since the Second World War of the ideas, theories, and practices that have lain behind the efforts of third world countries to emerge from "underdevelopment." The patchiness of their success - indeed, the frequent failures - has led some commentators to speak of the rise and fall of development economics, as if the subject were bound to disappear so that others could rise, phoenix-like, from its ashes, drawing on an ever wider range of social sciences (sociology, anthropology, history, etc.). There is now a better understanding of the factors leading to economic growth, but there is still no clear definition of what constitutes economic development, beyond the fact that it involves a process of gradual transformation over the long term and that the ingredients are never exclusively economic. The current preference for "sustainable development" arises out of an awareness that the pure "economic paradigm" has its limits, whether inspired by the Left or the Right, and that economic theory and practice must abandon the illusions of rapid "take-off" or "catching up" and instead fit in with the historical realities that shape the specific characteristics - and constraints - of each country.

Though it is difficult to achieve international comparability using existing R&D and innovation indicators, Jan Annerstedt attempts to provide from the existing statistics a comprehensive picture of measurements of science, technology, and innovation, stressing the uneven distribution of R&D spending: in 1988-1989, the third world accounted for a little more than 4.5 per cent of total R&D funds, with considerable differences among developing countries. A proposed worldwide science and technology-related typology identifies countries (a) with no science and technology base, (b) with the fundamental elements of a science and technology base, (c) with a well-established science and technology base, and (d) with an economically effective science and technology base, notably in relation to industry. Finally, the author argues that to develop policies that could avoid further marginalization in foreign investment and technology transfer, the developing countries need much more detailed and statistically grounded analyses of the role of science and technology in the globalization process, and he reviews the innovation indicators now in the making.

1 Modern science and technology

Jean-Jacques Salomon


The emergence of modern science
The expansion of modern science and technology
Cultures and coexistence of rationalities
References


The emergence of modern science

It has been said that all the old scientific movements of all the different civilizations were rivers flowing into the ocean of "modern" science [31]. Modern science has its roots in a past that is extremely diverse in both time and space, ranging from the earliest civilizations of Asia, Mesopotamia, and Egypt to the "Greek miracle," and on through the Judeo-Christian, Arab, and scholastic traditions. However, science as we understand the term is a relatively recent phenomenon. A major advance occurred in the seventeenth century, an advance so different from all previous ones that it can be called an unprecedented "intellectual revolution."

Gaston Bachelard [1] has labelled it an epistemological breakthrough and Thomas Kuhn [19] a paradigm shift. Either way, this turning-point was of even greater historical significance because it began in Europe and developed almost exclusively there for several centuries. The economic and social transformations that came in the wake of the invention of printing, together with the enormous stimulus to curiosity provided by the "great discoveries" that accompanied this scientific revolution, helped to ensure, strengthen, and speed up the expansion of Western civilization relative to all the others. It is not surprising that the history of Western science has often been written as a history of conquest, oversimplified in such a way that science features as an agent of European colonialism or as a residual feature of post-colonial imperialism. Yet this history is no less complicated than the concept of a scientific revolution itself [7].

Modern science did not happen in a single day - it took time to make an impact on people's thinking and on institutions, with added difficulties because, when experimental science started, most facts were still so uncertain that speculation had a field day. Furthermore, some of the most innovative thinkers (such as Kepler and Newton) in many respects belonged to the old order: half in the modern era through their radical contributions to astronomy, but half in the past because of their links with hermeticism, mysticism, or astrology. In a system of thought that had freed itself neither from alchemy nor from the bookish tradition handed down from Aristotle, the spread of new ideas was hindered by strong resistance, born of a combination of prejudice, dogma, and habit. The scientific revolution of the seventeenth century has generated a huge literature, which is constantly being reinterpreted and reassessed [24].

"Nature is expressed in mathematics": Galileo's famous phrase appeared in his Saggiatore in 1623; it marks symbolically the break with the ancient notion of Nature as an ensemble of substances, forms, and qualities and suggests instead a completely different conception in terms of quantitative phenomena that can by definition be measured and therefore potentially controlled. This "intellectual reform" led not only to the transformation of science - which gradually developed into a range of many and varied sciences, each of them in turn splitting up into more and more specialized subdisciplines - but also to one of perceptions, structures, and institutions. The break between arts and crafts and science reflected a break in the social order and hence a class distinction; technology, until then reserved for the "servile class," becomes the indispensable collaborator of speculative science, which had been reserved for the "professional class." This nearing of theory and practice is a revolutionary turn at both the intellectual and the social level. For the old saying, "to know is to contemplate," a new one was substituted: "to know is to act, to manipulate, to transform" - knowledge is power, in Bacon's phrase. And by the same token, the technician's know-how is to be closely associated with the scientist's theoretical way of thinking and doing.

The process of the creation, expansion, consolidation, and success of modern science has had three distinct phases: institutionalization, professionalization, and industrialization. In all the industrialized countries these phases occurred in the same historical sequence and took several centuries, whereas in the developing countries - most of which became independent nations only very recently - they have often occurred in a different order, with professionalization starting before institutionalization, or even industrialization before professionalization. The problems of the scientific and technological systems in many of these countries, such as the lack of social recognition of their scientists and research institutions, can often be largely attributed to this hasty development, which frequently occurs without the benefit of any previous scientific tradition and within a few decades in circumstances very different from those of the industrialized countries.

The institutionalization of science

Bacon, in his utopia New Atlantis (1627), already envisaged scientific research as a public service, taking in most of the functions that it in fact acquired between his day and ours: research would become a profession, managed by administrators, the subject of political decision-making, requiring funding and choices to be made; it would yield usable and useful results; it would be responsible for informing and educating at all levels, drawing on a wide range of specialists, from researchers to administrators, even to scientific attachés, whose brief would be both to make known a country's discoveries abroad and to monitor - if not spy on - developments elsewhere. The link that modern science established between theory and practice creates a power to act inseparable from its power to explain.

Institutionalization began in the scholarly communities of the Academies, the first ones appearing in Italy: they distanced themselves both from Aristotelian science (grammar, rhetoric, and logic) and from other institutions (political, religious, philosophical), which did not share their exclusive concern with "perfecting the knowledge of natural things and of all useful arts... by experiment," to quote the charter of the Royal Society (1662). Herein lies the origin both of the secularization of the modern world - the differentiation of the sphere of scientific proofs and facts from that of faith and conviction - and of the reductionist, positivist, or even "scientistic" leanings of some scientists. One can also see in the stance of the Academies the beginnings of the conflicts that science has had ever since Galileo with authorities who thought they could impose their beliefs, contrary to scientific theories and scientifically established facts. Indeed, little has changed since Galileo wrote to Christina of Lorraine that to interfere with the work of researchers "would be to order them to see what they do not see, not to understand what they understand and, when they seek, to find the opposite of what they find."

Nevertheless, from the outset, the scientific establishment has been linked to those with political power, demanding their protection and support and in return providing useful and usable results. The style of institutionalization naturally varied according to the national context. The Académie Royale des Sciences in France was created by Louis XIV's minister Colbert and kept under tight royal control; its members received salaries and the state treasury allocated 12,000 livres per year for equipment and experiments; and certain foreign scholars (such as Huygens and the Cassinis) were recruited from abroad at huge salaries - an early example of an organized "brain drain." By contrast, the Royal Society in London enjoyed purely formal official support and until 1740 had an annual budget of less than £232, mainly contributed by the Fellows, and only two official appointments. Both, however, were eager to gain recognition through services rendered to the state, e.g. by solving the problem of calculating longitude at sea, a major strategic concern for which the maritime nations offered substantial rewards [27].

The process of institutionalization spread throughout the seventeenth and eighteenth centuries. The laboratories attached to the academies provided a new setting, outside the universities, for the activities of researchers and the development of new ideas. But institutionalization did not yet mean professionalization, even though the members of the Paris or Berlin academies received salaries. The membership was still limited to a tiny elite, many of whom were active in politics, the army, or the church rather than engaged in scientific research. Institutionalization helped to foster the "role of the scientist as researcher," but this role was just starting to develop and was far from achieving social recognition [2].

The professionalization of science

A profession is a legally recognized occupation, usually offering a lifetime career path as well as a livelihood. Scientific research began to achieve this status in the early nineteenth century, but did not do so fully until the eve of the Second World War. The École Polytechnique in France started the process: it provided for the first time technical training involving both a research laboratory and teaching by specially appointed professors (e.g. Monge). However, Polytechnique soon concentrated on teaching rather than on "science in the making," and its graduates became senior civil servants rather than research scientists. The German chemist Liebig, who had trained in Paris, introduced the model to his university in Giessen, whence it spread throughout the Continent. Research became the purview of (professional) university teachers rather than of (amateur) academicians. Humboldt's reform of the university was very much along these lines, making scientific research an integral part of the university's responsibilities. Merely to possess knowledge and transmit it was not sufficient; the university must also create knowledge.

These developments were reflected in the changing membership of the Royal Society: the number of academic scientists more than doubled between 1881 and 1914, by which time they made up 61 per cent of the total, while other categories such as "distinguished laymen," soldiers, and clergy were drastically reduced. First used by Whewell in 1840, the term "scientist" came to replace "natural philosopher" or "savant," initially in the English-speaking countries and, a century later, elsewhere. Indeed, the language and activities of science had become incomprehensible to anyone who had not had the appropriate training. New specialisms, disciplines, and subdisciplines proliferated and generated their own networks of institutions, journals, and meetings. The number of researchers grew enormously: not just scientists, but engineers and technical experts, increasingly working in teams or groups, often outside the universities in public or industrial laboratories, or for defence establishments. As in any profession, the growth in numbers led to fierce competition for recognition and hence for resources and survival. James Watson gives a very personal and vivid account of the discovery of the structure of DNA in The Double Helix [59], describing the ruthless behaviour often required to be recognized as one of the top research teams in the world and to achieve the ultimate accolade, the Nobel Prize. The American catch-phrase, "publish or perish," is another example of the distortion of the scientific ethic brought about by competition within a worldwide scientific community, where the "credit" attached to results produced and published to gain fame also determines the financial "credit" that all research programmes require to survive.

The process of professionalization implies membership in a community, with its own rules and initiation rites and tests for entry and continued acceptance. The scientific community in fact has a double role: communication and regulation. It is responsible for disseminating the results of work in progress, as well as publicizing and promoting science, both within its own ranks and outside, to decision makers and the general public. It also looks after scholarly exchanges, sanctions qualifications and research projects, sees to the promotion of researchers and honours them with prizes and grants. In institutional terms, these functions are carried out by the Academies, learned societies, "peer review committees," boards of examiners, and juries. The basic qualification for the researcher is the doctorate, which originated in Germany in the mid-nineteenth century and is now the standard entry requirement for the profession.

In basic research, unlike technological research, scientists are expected to share their results freely with the rest of the scientific community. Progress occurs through and depends on publishing results and on cooperation that by definition transcends national and ideological boundaries: it is indeed a matter of "public knowledge," where the norms set the conditions for working in the field, just as they do for advancing knowledge and know-how [64]. In return, scientists expect to receive additional resources in order to continue their work, perhaps leading to further and more substantial recognition. There are indeed certain similarities with the process of canonization by the Church, except that the candidates are alive and the cursus of honours (publication in prestigious journals, membership of learned societies, national and international prizes, etc.) helps them to advance in their careers. Kuhn [19] has shown that professionalization in the natural sciences is inseparable from this regulatory role of the scientific community. If science is able to advance, it is precisely because the learning process depends on the publication of current research efforts in a given field. A scientific revolution occurs when a new "paradigm" is adopted, obliging the community to throw away the books and articles produced on the basis of the previous paradigm. There is no equivalent in scientific education of the art museum or the library of classics. Whereas in the arts or social sciences one cannot ignore the work of the great names of the past - the writings of Plato or Weber are still a fundamental element of discussions in philosophy or sociology - a modern student of physics is not required to read Newton, Faraday, or Maxwell.

Finally, the process of professionalization not only leads to recognition of status in the abstract, but also (perhaps above all) involves socially sanctioned rewards in terms of income and resources directly linked to the activity of research. This social legitimation occurred earlier in the United States than in Europe, just after the First World War. As Ben-David [2] has pointed out,

The requirement of a Ph.D. made suitable candidates scarcer, and raised thereby the market value of those who possessed the degree. But its principal effect was to create a professional role that implied a certain ethos on the part of the scientist as well as his employer. The ethos demanded that those who received the Ph.D. must keep abreast of scientific developments, do research, and contribute to the advancement of science. The employer, by employing a person with a Ph.D., accepted an implicit obligation to provide him with the facilities, the time, and the freedom for continuous further study and research which were appropriate to his status.

In Europe in the interwar period, scientists had great difficulties in convincing governments to recognize their role as researchers. In fact, research activities there still appeared to be an end in themselves - a calling rather than a productive function - in the context of a university culture insulated by its institutions and context from community problems and mundane affairs; they were kept on the fringe of university functions and remained there for so long that Jean Perrin, Nobel prizewinner in physics, could say, as late as 1933, that "the use of university grants for scientific research is an irregularity to which the authorities are prepared to turn a blind eye" [52].

It was only after the Second World War that the function of scientists devoting themselves full time to research came to be fully recognized in most of the capitalist industrialized countries, with negotiable salaries. In the United States, this negotiation takes place on the basis of individual contracts, whereas in countries such as France, it is part of the standard negotiations with trade unions and professional organizations relating to conditions in the public service. Whatever the system, however, research has joined the general category of professions that provide their members with their livelihood. This stage would probably not have been reached as fast or on the scale that it has without the stimuli of developments in industry and of deliberate policies for science and technology launched after the Second World War.

The industrialization of science

The industrialization of science should not be confused with industrial research. The latter dates back to the mid-nineteenth century and merely brings together the laboratory and the factory. Industrialization means the development of big equipment and the application of industrial management methods to scientific activities themselves. This stage of "big science" [43] occurred only between the world wars and increased rapidly after 1945. In fact, science and technology had relatively little contact with one another until the middle of the nineteenth century; and technology contributed to science (via scientific instruments) rather than vice versa. As is well known, the Industrial Revolution was not closely linked to science at the outset, but rather was produced by craftsmen and engineers, often trained on the job. The most famous example is the steam engine, which was invented almost a century before the principles of thermodynamics were understood.

The turning point came again thanks to Liebig, who brought about the creation of "applied science" in Germany with the exploitation of advances in organic chemistry in the dyeing industry between 1858 and 1862. Von Baeyer's team, working on the synthesis of indigo, was given direct support by the Badische Anilin- und Soda-Fabrik, which invested almost £1 million in both research and development, i.e. in establishing the chemical reactions required on a large scale prior to commercial production. Similarly, Menlo Park, created by Edison in 1876, was the first R&D laboratory in electromechanics and one of the first instances of substantial venture capital being invested by banks hoping to profit from future inventions. Edison marked not so much the end of the heroic age of great inventors as the beginning of science-based technology. A self-taught experimenter rather than a scholar himself, he brought to Menlo Park scientists and technicians trained in the best European institutions.

Industrial research soon spawned a new type of entrepreneur: holders of science degrees from universities and engineering schools, who were employed by industrial firms or who themselves started new industries. It is important to realize that these developments depended on special conditions whose absence in developing countries often explains their difficulties in properly integrating scientists and laboratories into the production process. For industrial research to flourish, there must already be a layer of relatively mature and varied industries, and the industrialists themselves need an adequate scientific background that they can bring to bear on both management and production. There must also be a pool of scientists willing to undertake "directed" research on the problems facing firms, with the aim of producing commercially viable results within a reasonably short time [6]. In some specific fields of scientific research (elementary particles, fusion, astronomy, space research, the genome) no progress is conceivable without a critical mass of manpower, equipment, and institutions. These prerequisites could not be satisfied in Europe until the beginning or even the middle of the twentieth century. The Industrial Revolution was accompanied by essential transformations of higher education: the combination of research and teaching, the creation of new specialisms, the modification of university structures in line with changes arising from scientific progress, and also the introduction of university-industry contracts and the increasing recruitment by industry of university-trained scientists.

The industrialization of research - and even of science itself - is the most recent development, dating back to the aftermath of the First World War. The system for supplying weapons, transport, food, and health care (the first vaccines) set up in order to wage the war provided a model for the rational management of technology in terms of organization, discipline, standardization, coordination, separation of line and staff, etc. [47, 48]. The First World War did not so much create new weapons as adapt existing civilian technologies for military purposes (automobiles turned into armoured cars, aeroplanes into bombers, etc.). It was the first war whose outcome was determined by success in maintaining a constant supply of matériel, of machines as much as munitions, and also the first in which military operations began to be mechanized and subjected to scientific management. The basic principles underlying the American and European industrial systems with regard to machine tools, spare parts, standardization, and mass production were then extended from the military to the civilian economy via the armies' suppliers: Taylorism and Fordism thus had their first applications [25]. The changes begun in the interwar period, most vividly illustrated by the creation of enormous industrial laboratories such as Bell Laboratories or Du Pont de Nemours in the United States, were considerably strengthened during and just after the Second World War, which provided the immediate stimulus for new weapons systems (the atomic bomb, radar, computers, jet engines, rockets, etc.), sanctioning the shift to "big science" as well as "big technology." The links between science and technology became so close that their advances became increasingly interdependent.

The characteristic feature of this stage is that science became increasingly capital-intensive, dependent on huge investments in manpower and specialized equipment. This was partly because research programmes were far more expensive than before and partly because they were far more ambitious in terms of both scale and expectations of quick results. "This change is as radical as that which occurred in the productive economy when independent artisan producers were displaced by capital-intensive factory production employing hired labour" [46, p. 44]. Science became indispensable to industry, while industry imposed itself on science, forcing science to adopt its concerns, making science dependent on its contracts, and influencing the moral code even to the extent of sometimes preventing the publication of certain results or, conversely, insisting on patenting things that previously had remained in the public domain (e.g. computer software or biological cloning). The industrialization of science also altered and extended the scientist's role: in the university, that of teacher, administrator, and research scientist; with various state agencies, that of contractor for research, assessor of research proposals, official adviser on existing projects, military or diplomatic adviser, and specialist in strategic problems such as the management of advanced weapons systems or negotiations on arms control; with commercial industry, that of private consultant to firms and businessman manufacturing equipment of his own invention. These transformations did not occur without causing problems: they challenged traditional values and exposed researchers to conflicts of interest, forcing them to make political, ideological, or commercial commitments from which their predecessors had been sheltered (or claimed to have been) thanks to the "neutrality" of science.

Habits change over time: the "detached" academic researcher came to be replaced by the scientific entrepreneur struggling for recognition and maximum profit. Henceforth, many more scientist-researchers worked in industrial laboratories, public or private, and for the military than in the universities. In the era of industrialized science, businesses organize themselves with a view to science-based production and technical innovation. The distinction between science and technology has become blurred: as technologies have become increasingly sophisticated and complex, the innovation process has become increasingly dependent on the findings and methodology of science. From now on, the practice and advance of science are far more dependent on technology than vice versa. Important discoveries are as likely to be made in industrial laboratories as in universities (e.g. nylon by Du Pont, the transistor by Bell Laboratories, enzyme synthesis by Merck, superconductors by IBM). And the system of management, control, and evaluation typical of industry is increasingly applied to research activities, including those in universities.

