
Thursday, 26 June 2008

Science and technology studies

Science and technology studies (STS) is the study of how social, political, and cultural values affect scientific research and technological innovation, and how these in turn affect society, politics, and culture. More than two dozen universities worldwide offer baccalaureate degrees in STS; about half of these also offer doctoral or master's programs.

STS scholars tend to be inspired by one or both of the following:

  • The discovery of relationships between scientific and technological innovations and society, from new and revealing perspectives, with the assumption that science and technology are socially embedded.
  • Concern over the direction and the risks of science and technology.

For the impacts of science and technology upon society, and vice versa, see Technology and society.


History

STS is a new and expanding subject; for example, in 2005, four major U.S. universities announced new STS programs. Like most interdisciplinary programs, it emerged from the confluence of a variety of disciplines and disciplinary subfields, all of which had developed an interest -- typically during the 1960s or 1970s -- in viewing science and technology as socially embedded enterprises.

Early developments

The key disciplinary components of STS took shape independently, beginning in the 1960s, and developed in isolation from each other well into the 1980s, although Ludwik Fleck's 1935 monograph Genesis and Development of a Scientific Fact anticipated many of STS's key themes:

  • Science studies, a branch of the sociology of scientific knowledge that places scientific controversies in their social context.
  • History of technology, which examines technology in its social and historical context. Starting in the 1960s, some historians questioned technological determinism, a doctrine that can induce public passivity toward technological and scientific 'natural' development. At the same time, some historians began to develop similarly contextual approaches to the history of medicine.
  • History and philosophy of science (1960s). After the publication of Thomas Kuhn's well-known The Structure of Scientific Revolutions (1962), which attributed changes in scientific theories to changes in underlying intellectual paradigms, programs were founded at the University of California, Berkeley and elsewhere that brought historians of science and philosophers together in unified programs.
  • Science, technology, and society. In the mid- to late-1960s, student and faculty social movements in the U.S., UK, and European universities helped to launch a range of new interdisciplinary fields (such as Women's Studies) that were seen to address relevant topics that the traditional curriculum ignored. One such development was the rise of "science, technology, and society" programs, which are also -- confusingly -- known by the STS acronym. Drawn from a variety of disciplines, including anthropology, history, political science, and sociology, scholars in these programs created undergraduate curricula devoted to exploring the issues raised by science and technology. Unlike scholars in science studies, history of technology, or the history and philosophy of science, they were and are more likely to see themselves as activists working for change rather than dispassionate, "ivory tower" researchers. As an example of the activist impulse, feminist scholars in this and other emerging STS areas addressed themselves to the exclusion of women from science and engineering.
  • Science, engineering, and public policy studies emerged in the 1970s from the same concerns that motivated the founders of the science, technology, and society movement: A sense that science and technology were developing in ways that were increasingly at odds with the public’s best interests. The science, technology, and society movement tried to humanize those who would make tomorrow’s science and technology, but this discipline took a different approach: It would train students with the professional skills needed to become players in science and technology policy. Some programs came to emphasize quantitative methodologies, and most of these were eventually absorbed into systems engineering. Others emphasized sociological and qualitative approaches, and found that their closest kin could be found among scholars in science, technology, and society departments.

During the 1970s and 1980s, leading universities in the U.S., UK, and Europe began drawing these various components together in new, interdisciplinary programs. For example, in the 1970s, Cornell University developed a new program that united science studies and policy-oriented scholars with historians and philosophers of science and technology. Each of these programs developed unique identities due to variation in the components that were drawn together, as well as their location within the various universities. For example, the University of Virginia's STS program united scholars drawn from a variety of fields (with particular strength in the history of technology); however, the program's teaching responsibilities -- it is located within an engineering school and teaches ethics to undergraduate engineering students -- mean that all of its faculty share a strong interest in engineering ethics.

The "turn to technology"

A decisive moment in the development of STS was the mid-1980s addition of technology studies to the range of interests reflected in science studies programs. During that decade, two works appeared in quick succession that signaled what Steve Woolgar was to call the “turn to technology”: Social Shaping of Technology (MacKenzie and Wajcman, 1985) and The Social Construction of Technological Systems (Bijker, Hughes et al., 1987). MacKenzie and Wajcman primed the pump by assembling a highly readable collection of articles attesting to the influence of society on technological design. In a seminal article, Trevor Pinch and Wiebe Bijker attached all the legitimacy of the Sociology of Scientific Knowledge to this development by showing how the sociology of technology could proceed along precisely the theoretical and methodological lines established by the sociology of scientific knowledge. This was the intellectual foundation of the field they called the social construction of technology.

The "turn to technology" helped to cement an already growing awareness of underlying unity among the various emerging STS programs. More recently, there has been an assocaited turn to materiality, whereby the socio-technical and material co-produce each other. This is especially evident in work in STS analyses of biomedicine (such as Carl May, Nelly Oudshoorn, and Andrew Webster).

Professional associations

The subject has several professional associations:

Founded in 1975, the Society for Social Studies of Science initially provided scholarly communication facilities -- including a journal (Science, Technology, and Human Values) and annual meetings -- that were mainly attended by science studies scholars, but the society has since grown into the most important professional association of science and technology studies scholars worldwide. Its members also include government and industry officials concerned with research and development as well as science and technology policy; scientists and engineers who wish to better understand the social embeddedness of their professional practice; and citizens concerned about the impact of science and technology in their lives. Proposals have been made to add the word "technology" to the association's name, thereby reflecting its stature as the leading STS professional society, but there seems to be widespread sentiment that the name is long enough as it is.

In Europe, the European Society for the Study of Science and Technology (EASST) was founded in 1981 to stimulate communication, exchange and collaboration in the field of studies of science and technology.

Founded in 1958, the Society for the History of Technology initially attracted members from the history profession who had interests in the contextual history of technology. After the "turn to technology" in the mid-1980s, the society's well-regarded journal (Technology and Culture) and its annual meetings began to attract considerable interest from non-historians with technology studies interests.

Less identified with STS, but also of importance to many STS scholars in the US, are the History of Science Society, the Philosophy of Science Association, and the American Association for the History of Medicine. In addition, there are significant STS-oriented special interest groups within major disciplinary associations, including the American Anthropological Association, the American Political Science Association, and the American Sociological Association.

from: wikipedia.org

Scientific enterprise

Scientific enterprise refers to science-based projects developed by, or in cooperation with, private entrepreneurs. For example, in the Age of Exploration, leaders like Henry the Navigator founded schools of navigation, from which stemmed voyages of exploration.

Examples of enterprising scientific organizations

Organizations of this kind have the ability to conduct scientific research on a sustained basis, involving multiple researchers over an extended time. Generally, the research is funded not only for the science itself, but for some application which shows promise for the enterprise. But the researchers, if left to their own choices, will tend to follow their own research interests, which is essential for the long-term health of their chosen field. Note that a successful scientific enterprise is not equivalent to a successful high-tech enterprise or to a successful business enterprise, but that they form an ecology, a food chain.

from: wikipedia.org

Computational science

Computational science (or scientific computing) is the field of study concerned with constructing mathematical models and numerical solution techniques and using computers to analyse and solve scientific, social scientific and engineering problems. In practical use, it is typically the application of computer simulation and other forms of computation to problems in various scientific disciplines.

The field is distinct from computer science (the mathematical study of computation, computers and information processing). It is also different from theory and experiment, which are the traditional forms of science and engineering. The scientific computing approach is to gain understanding, mainly through the analysis of mathematical models implemented on computers.

Scientists and engineers develop computer programs, application software, that model systems being studied and run these programs with various sets of input parameters. Typically, these models require massive amounts of calculations (usually floating-point) and are often executed on supercomputers or distributed computing platforms.

Numerical analysis is an important underpinning for techniques used in computational science.
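
As a small illustration of that underpinning, the sketch below applies Newton's method, a classic numerical-analysis technique, to approximate a square root in Python. The function, starting guess, and iteration count are illustrative choices only, not drawn from the text above.

    # Newton's method applied to f(x) = x^2 - 2, whose positive root is sqrt(2).
    def f(x):
        return x * x - 2.0

    def f_prime(x):
        return 2.0 * x

    x = 1.0  # initial guess (an illustrative choice)
    for _ in range(6):
        x = x - f(x) / f_prime(x)  # Newton update: x <- x - f(x)/f'(x)

    print(f"sqrt(2) is approximately {x:.12f}")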

Applications of computational science

Problem domains for computational science/scientific computing include:

Numerical simulations

Numerical simulations have different objectives depending on the nature of the task being simulated:

  • Reconstruct and understand known events (e.g., earthquakes, tsunamis, and other natural disasters).
  • Predict future or unobserved situations (e.g., weather, sub-atomic particle behaviour).

Model fitting and data analysis

  • Appropriately tune models or solve equations to reflect observations, subject to model constraints (e.g., oil exploration geophysics, computational linguistics); a minimal fitting sketch follows this list.
  • Use graph theory to model networks, especially those connecting individuals, organizations, and websites.
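
To make the model-fitting idea concrete, here is a minimal least-squares sketch in Python: it tunes the two parameters of a straight-line model so that the model best reflects a set of observations. The data points are invented for illustration and merely stand in for the kinds of measurements mentioned above.

    # Least-squares fit of the model y = a*x + b to observed data (pure Python).
    xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
    ys = [0.1, 1.9, 4.2, 5.8, 8.1, 9.9]  # invented observations, roughly y = 2x

    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n

    # Closed-form parameters that minimize the sum of squared residuals.
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    a = sxy / sxx              # slope
    b = mean_y - a * mean_x    # intercept

    residual = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    print(f"fit: y = {a:.3f}*x + {b:.3f} (sum of squared residuals {residual:.3f})")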

Optimization

  • Optimize known scenarios (e.g., technical and manufacturing processes, front end engineering); a minimal sketch follows below.
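
A minimal optimization sketch, again with invented numbers: gradient descent on a simple quadratic cost function. Real process-optimization problems involve far richer objectives and constraints, but the loop below shows the basic idea of iteratively stepping toward a minimum.

    # Gradient descent on a toy cost function with a known minimum at (3, -1).
    def cost(x, y):
        return (x - 3.0) ** 2 + (y + 1.0) ** 2

    def gradient(x, y):
        # Analytic partial derivatives of the cost function.
        return 2.0 * (x - 3.0), 2.0 * (y + 1.0)

    x, y = 0.0, 0.0  # arbitrary starting point
    step = 0.1       # fixed step size, chosen for illustration

    for _ in range(100):
        gx, gy = gradient(x, y)
        x -= step * gx
        y -= step * gy

    print(f"minimum found near ({x:.4f}, {y:.4f}), cost = {cost(x, y):.6f}")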

Methods and algorithms

Algorithms and mathematical methods used in computational science are varied; numerical analysis, the numerical solution of differential equations, and Monte Carlo methods are among the most commonly applied.

Programming languages commonly used for the more mathematical aspects of scientific computing applications include Mathematica, MATLAB, SciLab, GNU Octave, COMSOL Multiphysics, and PDL. The more computationally intensive aspects of scientific computing will often utilize some variation of C or Fortran.

Computational science application programs often model real-world changing conditions, such as weather, air flow around a plane, automobile body distortions in a crash, the motion of stars in a galaxy, an explosive device, etc. Such programs might create a 'logical mesh' in computer memory where each item corresponds to an area in space and contains information about that space relevant to the model. For example, in weather models, each item might represent a square kilometer, with land elevation, current wind direction, humidity, temperature, pressure, and so on. The program would calculate the likely next state based on the current state, in simulated time steps, solving equations that describe how the system operates, and then repeat the process to calculate the next state.
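
A toy version of that scheme, in Python, might look like the sketch below: a one-dimensional 'mesh' of cells, each holding a single temperature value, is advanced step by step with a simple diffusion update. The mesh size, coefficient, and initial hot spot are all invented for illustration; a real weather model would store several quantities per cell and solve far more elaborate equations.

    # Toy mesh-based simulation: 1-D heat diffusion in explicit time steps.
    # Each mesh cell holds one value; every step computes the next state of
    # the whole mesh from the current state, as described above.
    n_cells = 50
    alpha = 0.2                  # diffusion coefficient (illustrative value)
    mesh = [0.0] * n_cells
    mesh[n_cells // 2] = 100.0   # a hot spot in the middle of the domain

    for step in range(200):
        next_mesh = mesh[:]      # next state, derived from the current one
        for i in range(1, n_cells - 1):
            # Each interior cell relaxes toward the average of its neighbours.
            next_mesh[i] = mesh[i] + alpha * (mesh[i - 1] - 2 * mesh[i] + mesh[i + 1])
        mesh = next_mesh         # advance simulated time by one step

    print(f"peak temperature after 200 steps: {max(mesh):.2f}")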

The term computational scientist is used to describe someone skilled in scientific computing. This person is usually a scientist, an engineer or an applied mathematician who applies high-performance computers in different ways to advance the state-of-the-art in their respective applied disciplines in physics, chemistry or engineering. Scientific computing has increasingly also impacted on other areas including economics, biology and medicine.

Computational science is now commonly considered a third mode of science, complementing and adding to experimentation/observation and theory.[1]

Education

Scientific computation is most often studied through an applied mathematics or computer science program, or within a standard mathematics, sciences, or engineering program. At some institutions a specialization in scientific computation can be earned as a "minor" within another program (which may be at varying levels). However, there are increasingly many bachelor's and master's programs in computational science. Some schools also offer the Ph.D. in computational science, computational engineering, computational science and engineering, or scientific computation.

from: wikipedia.org

Military funding of science

The military funding of science has had a powerful transformative effect on the practice and products of scientific research since the early 20th century. Particularly since World War I, advanced science-based technologies have been viewed as essential elements of a successful military.

World War I is often called "the chemists’ war", both for the extensive use of poison gas and the importance of nitrates and advanced high explosives. Poison gas, beginning in 1915 with chlorine from the powerful German dye industry, was used extensively by the Germans and the British; over the course of the war, scientists on both sides raced to develop more and more potent chemicals and devise countermeasures against the newest enemy gases. Physicists also contributed to the war effort, developing wireless communication technologies and sound-based methods of detecting U-boats, resulting in the first tenuous long-term connections between academic science and the military.

World War II marked a massive increase in the military funding of science, particularly physics. In addition to the Manhattan Project and the resulting atomic bomb, British and American work on radar was widespread and ultimately highly influential in the course of the war; radar enabled detection of enemy ships and aircraft, as well as the radar-based proximity fuze. Mathematical cryptography, meteorology, and rocket science were also central to the war effort, with military-funded wartime advances having a significant long-term effect on each discipline. The technologies employed at the end—jet aircraft, radar and proximity fuzes, and the atomic bomb—were radically different from pre-war technology; military leaders came to view continued advances in technology as the critical element for success in future wars. The advent of the Cold War solidified the links between military institutions and academic science, particularly in the United States and the Soviet Union, so that even during a period of nominal peace military funding continued to expand. Funding spread to the social sciences as well as the natural sciences, and whole new fields, such as digital computing, were born of military patronage. Following the end of the Cold War and the collapse of the Soviet Union, military funding of science has decreased substantially, but much of the American military-scientific complex remains in place.

The sheer scale of military funding for science since World War II has instigated a large body of historical literature analyzing the effects of that funding, especially for American science. Since Paul Forman’s 1987 article “Behind quantum electronics: National security as a basis for physical research in the United States, 1940-1960,” there has been an ongoing historical debate over precisely how and to what extent military funding affected the course of scientific research and discovery. Forman and others have argued that military funding fundamentally redirected science—particularly physics—toward applied research, and that military technologies predominantly formed the basis for subsequent research even in areas of basic science; ultimately the very culture and ideals of science were colored by extensive collaboration between scientists and military planners. An alternate view has been presented by Daniel Kevles: while military funding provided many new opportunities for scientists and dramatically expanded the scope of physical research, scientists by and large retained their intellectual autonomy.


Science and military technology before the modern era

Replica catapult at Château des Baux, France

While there were numerous instances of military support for scientific work before the 20th century, these were typically isolated instances; knowledge gained from technology was generally far more important for the development of science than scientific knowledge was to technological innovation. Thermodynamics, for example, is a science partly born from military technology: one of the many sources of the first law of thermodynamics was Count Rumford’s observation of the heat produced by boring cannon barrels.[5] Mathematics was important in the development of the Greek catapult and other weapons,[6] but analysis of ballistics was also important for the development of mathematics. Galileo, for his part, tried to promote the telescope as a military instrument to the military-minded Republic of Venice before turning it to the skies while seeking the patronage of the Medici court in Florence. In general, craft-based innovation, disconnected from the formal systems of science, was the key to military technology well into the 19th century.

Interchangeable gun parts, illustrated in the 1832 Edinburgh Encyclopaedia

Even craft-based military technologies were not generally produced by military funding. Instead, craftsmen and inventors developed weapons and military tools independently and actively sought the interest of military patrons afterward. Following the rise of engineering as a profession in the 18th century, governments and military leaders did try to harness the methods of both science and engineering for more specific ends, but frequently without success. In the decades leading up to the French Revolution, French artillery officers were often trained as engineers, and military leaders from this mathematical tradition attempted to transform the process of weapons manufacture from a craft-based enterprise to an organized and standardized system based on engineering principles and interchangeable parts (pre-dating the work of Eli Whitney in the U.S.). During the Revolution, even natural scientists participated directly, attempting to create “weapons more powerful than any we possess” to aid the cause of the new French Republic, though there were no means for the revolutionary army to fund such work.[9] Each of these efforts, however, was ultimately unsuccessful in producing militarily useful results. A slightly different outcome came from the longitude prize of the 18th century, offered by the British government for an accurate method of determining a ship’s longitude at sea (essential for the safe navigation of the powerful British navy): intended to promote—and financially reward—a scientific solution, it was instead won by a scientific outsider, the clockmaker John Harrison.[10] However, the naval utility of astronomy did help increase the number of capable astronomers and focus research on developing more powerful and versatile instruments.

Through the 19th century, science and technology grew closer together, particularly through electrical and acoustic inventions and the corresponding mathematical theories. The late 19th and early 20th centuries witnessed a trend toward military mechanization, with the advent of repeating rifles with smokeless powder, long-range artillery, high explosives, machine guns, and mechanized transport along with telegraphic and later wireless battlefield communication. Still, independent inventors, scientists and engineers were largely responsible for these drastic changes in military technology (with the exception of the development of battleships, which could only have been created through organized large-scale effort).[11]

World War I and the interwar years

See also: Technology during World War I

World War I marked the first large-scale mobilization of science for military purposes. Prior to the war, the American military ran a few small laboratories as well as the Bureau of Standards, but independent inventors and industrial firms predominated.[12] Similarly in Europe, military-directed scientific research and development was minimal. The powerful new technologies that led to trench warfare, however, reversed the traditional advantage of fast-moving offensive tactics; fortified positions supported by machine guns and artillery resulted in high attrition but strategic stalemate. Militaries turned to scientists and engineers for even newer technologies, but the introduction of tanks and aircraft had only a marginal impact; the use of poison gas made a tremendous psychological impact, but decisively favored neither side. The war ultimately turned on maintaining adequate supplies of materials, a problem also addressed by military-funded science—and, through the international chemical industry, closely related to the advent of chemical warfare.

The Germans introduced gas as a weapon in part because naval blockades limited their supply of nitrate for explosives, while the massive German dye industry could easily produce chlorine and organic chemicals in large amounts. Industrial capacity was completely mobilized for war, and Fritz Haber and other industrial scientists were eager to contribute to the German cause; soon they were closely integrated into the military hierarchy as they tested the most effective ways of producing and delivering weaponized chemicals. Though the initial impetus for gas warfare came from outside the military, further developments in chemical weapon technology might be considered military-funded, given the blurring of lines between industry and nation in Germany.[13]

Tear gas casualties from the Battle of Estaires, April 10, 1918

Following the first chlorine attack by the Germans in May 1915, the British quickly moved to recruit scientists for developing their own gas weapons. Gas research escalated on both sides, with chlorine followed by phosgene, a variety of tear gases, and mustard gas. A wide array of research was conducted on the physiological effects of other gases, such as hydrogen cyanide, arsenic compounds, and a host of complex organic chemicals. The British built from scratch what became an expansive research facility at Porton Down, which remains a significant military research institution into the 21st century. Unlike many earlier military-funded scientific ventures, the research at Porton Down did not stop when the war ended or an immediate goal was achieved. In fact, every effort was made to create an attractive research environment for top scientists, and chemical weapons development continued apace—though in secret—through the interwar years and into World War II. German military-backed gas warfare research did not resume until the Nazi era, following the 1936 discovery of tabun, the first nerve agent, through industrial insecticide research.

In the United States, the established tradition of engineering was explicitly competing with the rising discipline of physics for WWI military largess. A host of inventors, led by Thomas Edison and his newly created Naval Consulting Board, cranked out thousands of inventions to solve military problems and aid the war effort, while academic scientists worked through the National Research Council (NRC) led by Robert Millikan. Submarine detection was the most important problem that both the physicists and inventors hoped to solve, as German U-boats were decimating the crucial naval supply lines from the U.S. to England. Edison’s Board produced very few useful innovations, but NRC research resulted in moderately successful sound-based methods for locating submarines and hidden ground-based artillery, as well as useful navigational and photographic equipment for aircraft. Because of the success of academic science in solving specific military problems, the NRC was retained after the war’s end, though it gradually decoupled from the military.

Many industrial and academic chemists and physicists came under military control during the Great War, but post-war research by the Royal Engineers Experimental Station at Porton Down and the continued operation of the National Research Council were exceptions to the overall pattern; wartime chemistry funding was a temporary redirection of a field largely driven by industry and later medicine, while physics grew closer to industry than to the military. The discipline of modern meteorology, however, was largely built from military funding. During World War I, the French civilian meteorological infrastructure was largely absorbed into the military. The introduction of military aircraft during the war as well as the role of wind and weather in the success or failure of gas attacks meant meteorological advice was in high demand. The French army (among others) created its own supplementary meteorological service as well, retraining scientists from other fields to staff it. At war's end, the military continued to control French meteorology, sending weathermen to French colonial interests and integrating weather service with the growing air corps; most of the early-twentieth century growth in European meteorology was the direct result of military funding. World War II would result in a similar transformation of American meteorology, initiating a transition from an apprenticeship system for training weathermen (based on intimate knowledge of local trends and geography) to the university-based, science-intensive system that has predominated since.

World War II

See also: Technology during World War II

If World War I was the chemists’ war, World War II was the physicists’ war. As with other total wars, it is difficult to draw a line between military funding and more spontaneous military-scientific collaboration during World War II. Well before the Invasion of Poland, nationalism was a powerful force in the German physics community (see Deutsche Physik); the military mobilization of physicists was all but irresistible after the rise of National Socialism. German and Allied investigations of the possibility of a nuclear bomb began in 1939 at the initiative of civilian scientists, but by 1942 the respective militaries were heavily involved. The German nuclear energy project had two independent teams, a civilian-controlled team under Werner Heisenberg and a military-controlled team led by Kurt Diebner; the latter was more explicitly aimed at producing a bomb (as opposed to a power reactor) and received much more funding from the Nazis, though neither was ultimately successful.[16]

In the U.S., the Manhattan Project and other projects of the Office of Scientific Research and Development resulted in a much more extensive military-scientific venture, the scale of which dwarfed previous military-funded research projects. Theoretical work by a number of British and American scientists resulted in significant optimism about the possibility of a nuclear chain reaction. As the physicists convinced military leaders of the potential of nuclear weapons, funding for actual development was ratcheted up rapidly. A number of large laboratories were created across the United States for work on different aspects of the bomb, while many existing facilities were reoriented to bomb-related work; some were university-managed while others were government-run, but all were ultimately funded and directed by the military.[17] The May 1945 surrender of Germany, the original intended target for the bomb, did virtually nothing to slow the project’s momentum. After Japan’s surrender immediately following the atomic bombings of Hiroshima and Nagasaki, many scientists returned to academia or industry, but the Manhattan Project infrastructure was too large—and too effective—to be dismantled wholesale; it became the model for future military-scientific work, in the U.S. and elsewhere.[18]

Other wartime physics research, particularly in rocketry and radar technology, was less significant in popular culture but much more significant for the outcome of the war. German rocketry was driven by the pursuit of Wunderwaffen, resulting in the V-2 ballistic missile; the technology as well as the personal expertise of the German rocketry community was absorbed by the U.S. and the U.S.S.R. rocket programs after the war, forming the basis of long-term military-funded rocketry, ballistic missile, and later space research. Rocket science was only beginning to make an impact by the final years of the war. German rockets created fear and destruction in London, but had only modest military significance, while air-to-ground rockets enhanced the power of American air strikes; jet aircraft also went into service by the end of the war.[19] Radar work before and during the war provided even more of an advantage for the Allies. British physicists pioneered long-wave radar, developing an effective system for detecting incoming German air forces. Work on potentially more precise short-wave radar was turned over to the U.S.; several thousand academic physicists and engineers not participating in the Manhattan Project did radar work, particularly at MIT and Stanford, resulting in microwave radar systems that could resolve more detail in incoming flight formations. Further refinement of microwave technology led to proximity fuzes, which greatly enhanced the ability of the U.S. Navy to defend against Japanese bombers. Microwave production, detection and manipulation also formed the technical foundation to complement the institutional foundation of the Manhattan Project in much post-war defense research.

American Cold War science

In the years immediately following World War II, the military was by far the most significant patron of university science research in the U.S., and the national labs also continued to flourish. After two years in political limbo (but with work on nuclear power and bomb manufacture continuing apace), the Manhattan Project became a permanent arm of the government as the Atomic Energy Commission. The Navy—inspired by the success of military-directed wartime research—created its own R&D organization, the Office of Naval Research, which would preside over an expanded long-term research program at the Naval Research Laboratory as well as fund a variety of university-based research. Military money following up the wartime radar research led to explosive growth in both electronics research and electronics manufacturing. The Air Force became an independent service branch from the Army and established its own research and development system, and the Army followed suit (though it was less invested in academic science than the Navy or Air Force). Meanwhile, the perceived communist menace of the Soviet Union caused tensions—and military budgets—to escalate rapidly.

The Department of Defense primarily funded what has been broadly described as “physical research,” but to reduce this to merely chemistry and physics is misleading. Military patronage benefited a large number of fields, and in fact helped create a number of the modern scientific disciplines. At Stanford and MIT, for example, electronics, aerospace engineering, nuclear physics, and materials science—all physics, broadly speaking—each developed in different directions, becoming increasingly independent of parent disciplines as they grew and pursued defense-related research agendas. What began as interdepartmental laboratories became the centers for graduate teaching and research innovation thanks to the broad scope of defense funding. The need to keep up with corporate technology research (which was receiving the lion’s share of defense contracts) also prompted many science labs to establish close relationships with industry.

Computing

The complex histories of computer science and computer engineering were shaped, in the first decades of digital computing, almost entirely by military funding. Most of the basic component technologies for digital computing were developed through the course of the long-running Whirlwind-SAGE program to develop an automated radar shield. Virtually unlimited funds enabled two decades of research that only began producing useful technologies by the end of the 1950s; even the final version of the SAGE command and control system had only marginal military utility. More so than with previously established disciplines receiving military funding, the culture of computer science was permeated with a Cold War military perspective. Indirectly, the ideas of computer science also had a profound effect on psychology, cognitive science and neuroscience through the mind-computer analogy.[23]

Geosciences and astrophysics

The history of earth science and the history of astrophysics were also closely tied to military purposes and funding throughout the Cold War. American geodesy, oceanography, and seismology grew from small sub-disciplines into full-fledged independent disciplines, as for several decades virtually all funding in these fields came from the Department of Defense. A central goal that tied these disciplines together (even while providing the means for intellectual independence) was the figure of the Earth, the model of the earth’s geography and gravitation that was essential for accurate ballistic missiles. In the 1960s, geodesy was the superficial goal of the satellite program CORONA, while military reconnaissance was in fact a driving force. Even for geodetic data, new secrecy guidelines worked to restrict collaboration in a field that had formerly been fundamentally international; the figure of the Earth had geopolitical significance beyond questions of pure geoscience. Still, geodesists were able to retain enough autonomy and subvert secrecy limitations enough to make use of the findings of their military research to overturn some of the fundamental theories of geodesy. Like geodesy and satellite photography research, the advent of radio astronomy had a military purpose hidden beneath an official astrophysical research agenda. Quantum electronics permitted both revolutionary new methods of analyzing the universe and—using the same equipment and technology—the monitoring of Soviet electronic signals.

Military interest in (and funding of) seismology, meteorology and oceanography was in some ways a result of the defense-related payoffs of physics and geodesy. The immediate goal of funding in these fields was to detect clandestine nuclear testing and track fallout radiation, a necessary precondition for treaties to limit the nuclear weapon technology earlier military research had created. In particular, the feasibility of monitoring underground nuclear explosions was crucial to the possibility of a comprehensive rather than partial nuclear test ban treaty.[26] But the military-funded growth of these disciplines continued even when no pressing military goals were driving them; as with other natural sciences, the military also found value in having ‘scientists on tap’ for unforeseen future R&D needs.

Biological sciences

The biological sciences were also affected by military funding, but, with the exception of nuclear physics-related medical and genetic research, largely indirectly. The most significant funding sources for basic research before the rise of the military-industrial-academic complex were philanthropic organizations such as the Rockefeller Foundation. After World War II (and to some extent before), the influx of new industrial and military funding opportunities for the physical sciences prompted philanthropies to divest from physics research—most early work in high-energy physics and biophysics had been the product of foundation grants—and refocus on biological and medical research.

Social sciences

The social sciences also found limited military support from the 1940s to the 1960s, but much defense-minded social science research could be—and was—pursued without extensive military funding. In the 1950s, social scientists tried to emulate the interdisciplinary organizational success of the physical sciences’ Manhattan Project with the synthetic behavioral science movement. Social scientists actively sought to promote their usefulness to the military, researching topics related to propaganda (put to use in Korea), decision making, the psychological and sociological causes and effects of communism, and a broad constellation of other topics of Cold War significance. By the 1960s, economists and political scientists offered up modernization theory for the cause of Cold War nation-building; modernization theory found a home in the military in the form of Project Camelot, a study of the process of revolution, as well as in the Kennedy administration’s approach to the Vietnam War. Project Camelot was ultimately canceled because of the concerns it raised about scientific objectivity in the context of such a politicized research agenda; though the natural sciences were not yet susceptible to implications of the corrupting influence of military and political factors, the social sciences were.

Historical debate

Historian Paul Forman, in his seminal 1987 article, proposed that not only had military funding of science greatly expanded the scope and significance of American physics, it also initiated "a qualitative change in its purposes and character." Historians of science were beginning to turn to the Cold War relationship between science and the military for detailed study, and Forman’s “distortionist critique” (as Roger Geiger has described it) served to focus the ensuing debates. Forman and others (e.g., Robert Seidel, Stuart Leslie, and, for the history of the social sciences, Ron Robin) view the influx of military money and the focus on applied rather than basic research as having had, at least partially, a negative impact on the course of subsequent research. In turn, critics of the distortionist thesis, beginning with Daniel Kevles, deny that the military "seduced American physicists from, so to speak, a 'true basic physics'." Kevles, as well as Geiger, instead weigh the effects of military funding against the alternative of such funding simply being absent, rather than being put to alternate scientific use. Most recent scholarship has moved toward a tempered version of Forman's thesis, in which scientists retained significant autonomy despite the radical changes brought about by military funding.