Social function of scientific knowledge

The sociology of scientific knowledge is the study of science as a social activity, especially dealing with the social conditions and effects of science, and with the social structures and processes of scientific activity. The sociology of scientific ignorance is complementary to the sociology of scientific knowledge. For comparison, the sociology of knowledge studies the impact of human knowledge and prevailing ideas on societies, and the relations between knowledge and the social context within which it arises.


Sociologists of scientific knowledge study the development of a scientific field and attempt to identify points of contingency or interpretative flexibility where ambiguities are present. Such variations may be linked to a variety of political, historical, cultural or economic factors. Crucially, the field does not set out to promote relativism or to attack the scientific project; the aim of the researcher is to explain why one interpretation rather than another succeeds due to external social and historical circumstances.

The field emerged in the late 1960s and early 1970s and at first was an almost exclusively British practice. Other early centers for the development of the field were in France, Germany, and the United States. Major theorists include Barry Barnes, David Bloor, Sal Restivo, Randall Collins, Gaston Bachelard, Harry Collins, Paul Feyerabend, Steve Fuller, Thomas Kuhn, Martin Kusch, Bruno Latour, Mike Mulkay, Derek J. de Solla Price, Lucy Suchman and Anselm Strauss.

The sociology of scientific knowledge in its Anglophone versions emerged in the 1970s in self-conscious opposition to the sociology of science associated with the American Robert K. Merton, generally considered one of the seminal authors in the sociology of science. Merton's was a kind of "sociology of scientists", which left the cognitive content of science out of the sociological account; SSK by contrast aimed at providing sociological explanations of scientific ideas themselves, taking its lead from aspects of the work of Thomas S. Kuhn, but especially from established traditions in cultural anthropology as well as the later Wittgenstein. David Bloor, one of SSK's early champions, has contrasted the so-called 'weak programme', which merely gives social explanations for erroneous beliefs, with what he called the 'strong programme', which considers sociological factors as influencing all beliefs.

The weak programme is more of a description of an approach than an organised movement. The term is applied to historians, sociologists and philosophers of science who merely cite sociological factors as being responsible for those beliefs that went wrong. Imre Lakatos and Thomas Kuhn might be said to adhere to it. The strong programme is particularly associated with the work of two groups: the 'Edinburgh School' in the 1970s and '80s and the 'Bath School' in the same period. "Edinburgh sociologists" and "Bath sociologists" promoted, respectively, the Strong Programme and the Empirical Programme of Relativism. Also associated with SSK in the 1980s was discourse analysis as applied to science, as well as a concern with issues of reflexivity arising from paradoxes relating to SSK's relativist stance towards science and the status of its own knowledge-claims.

The sociology of scientific knowledge has major international networks through its principal associations, 4S and EASST, with recently established groups in Japan, South Korea, Taiwan and Latin America. It has made major contributions in recent years to a critical analysis of the biosciences and informatics.

Studies of mathematical practice and quasi-empiricism in mathematics are also rightly part of the sociology of knowledge, since they focus on the community of those who practice mathematics and their common assumptions. Since Eugene Wigner raised the issue in 1960 and Hilary Putnam made it more rigorous in 1975, the question of why fields such as physics and mathematics should agree so well has been debated. Proposed solutions point out that the fundamental constituents of mathematical space, form-structure, and number-proportion are also the fundamental constituents of physics. It is also worth noting that physics is essentially a modeling of reality, identifying causal relationships governing repeatable observed phenomena, and that much of mathematics, especially in relation to the growth of the calculus, has been developed precisely for the goal of developing such models in a rigorous fashion. The division of human scientific thinking through words such as 'mathematics' and 'physics' is useful only in its practical everyday function of categorizing and distinguishing.

Fundamental contributions to the sociology of mathematical knowledge have been made by Sal Restivo and David Bloor. Restivo draws upon the work of scholars such as Oswald Spengler, Raymond L. Wilder and Leslie A. White, as well as contemporary sociologists of knowledge and science studies scholars. David Bloor draws upon Ludwig Wittgenstein and other contemporary thinkers. They both claim that mathematical knowledge is socially constructed and has irreducible contingent and historical factors woven into it. More recently Paul Ernest has proposed a social constructivist account of mathematical knowledge, drawing on the works of both of these sociologists.

SSK has received criticism from theorists of the Actor-network theory school of science and technology studies. These theorists criticise SSK for sociological reductionism and a human-centered universe. SSK, they say, relies too heavily on human actors and social rules and conventions settling scientific controversies.


7.1.19. Modernization theory

Modernization theory is used to explain the process of modernization within societies. Modernization refers to a model of a progressive transition from a 'pre-modern' or 'traditional' to a 'modern' society. Modernization theory originated from the ideas of German sociologist Max Weber, which provided the basis for the modernization paradigm developed by Harvard sociologist Talcott Parsons. The theory looks at the internal factors of a country while assuming that, with assistance, "traditional" countries can be brought to development in the same manner more developed countries have been. Modernization theory was a dominant paradigm in the social sciences in the 1950s and 1960s, then went into a deep eclipse. It made a comeback after 1990 but remains a controversial model.

Modernization theory both attempts to identify the social variables that contribute to social progress and development of societies and seeks to explain the process of social evolution. Modernization theory is subject to criticism originating among socialist and free-market ideologies, world-systems theorists, globalization theorists and dependency theorists, among others. Modernization theory stresses not only the process of change but also the responses to that change. It also looks at internal dynamics while referring to social and cultural structures and the adaptation of new technologies. Modernization theory maintains that traditional societies will develop as they adopt more modern practices. Proponents of modernization theory claim that modern states are wealthier and more powerful and that their citizens are freer to enjoy a higher standard of living.

Developments such as new data technology and the need to update traditional methods in transport, communication and production, it is argued, make modernization necessary or at least preferable to the status quo. That view makes critique of modernization difficult, since it implies that such developments control the limits of human interaction, not vice versa. It also implies that human agency controls the speed and severity of modernization. Supposedly, instead of being dominated by tradition, societies undergoing the process of modernization typically arrive at forms of governance dictated by abstract principles. Traditional religious beliefs and cultural traits, according to the theory, usually become less important as modernization takes hold.

Historians link modernization to the processes of urbanization and industrialization and the spread of education. Globalization can be defined as the integration of economic, political and social cultures and, it is argued, is related to the spread of modernization across borders.

Global trade has grown continuously since the European discovery of new continents in the early modern period; it increased particularly as a result of the Industrial Revolution and the mid-20th century adoption of the shipping container. Annual trans-border tourist arrivals rose to 456 million by 1990 and were expected to double again, to 937 million per annum, by 2010. Communication is another major area that has grown due to modernization. Communication industries have enabled capitalism to spread throughout the world. Telephony, television broadcasts, news services and online service providers have played a crucial part in globalization.

Alongside the many apparent positive attributes of globalization there are also negative consequences. The dominant, neoliberal model of globalization often increases disparities between a society's rich and its poor. In major cities of developing countries there exist pockets where technologies of the modernised world, computers, cell phones and satellite television, exist alongside stark poverty. Globalists are globalization modernization theorists who argue that globalization is positive for everyone, as its benefits must eventually extend to all members of society, including vulnerable groups such as women and children.

New technology is a major source of social change. Since modernization entails the social transformation from agrarian societies to industrial ones, it is important to look at the technological viewpoint; however, new technologies do not change societies by themselves. Rather, it is the response to technology that causes change. Frequently, a technology is recognized but not put to use for a very long time, such as the ability to extract metal from rock. Although that ability initially went unused, it later had profound implications for the developmental course of societies. Technology makes possible a more innovative society and broad social change. That dramatic social, industrial and economic change through the centuries can be summed up by the term modernization. Cell phones, for example, have changed the lives of millions throughout the world.

That is especially true in Africa and other parts of the Middle East, where there is a low-cost communication infrastructure. With cell-phone technology, widely dispersed populations are connected, which facilitates business-to-business communication and provides internet access to remoter areas, with a consequent rise in literacy.

Development, like modernization, has become the orienting principle of modern times. Countries that are seen as modern are also seen as developed, which means that they are generally more respected by institutions such as the United Nations and even viewed as possible trade partners by other countries. The extent to which a country has modernized or developed dictates its power and importance on the international level.

Modernization of the health sector of developing nations recognizes that transitioning from 'traditional' to 'modern' is not merely the advancement of technology and the introduction of Western practices; implementing modern healthcare requires the reorganization of the political agenda and, in turn, an increase in funding by funders and resources towards public health. However, rather than replicating the stages of developed nations, whose roots of modernization are found within the context of industrialization or colonialism, underdeveloped nations should apply proximal interventions to target rural communities and focus on prevention strategies rather than curative solutions.

That has been successfully exhibited by the Christian Medical Commission and in China through 'barefoot doctors'. Additionally, a strong advocate of the de-emphasis of medical institutions was Halfdan T. Mahler, the WHO Director-General from 1973 to 1988. Overall, however, this is not to say that the nations of the Global South can function independently from Western states; significant funding is received from well-intentioned programs, foundations, and charities that target epidemics such as HIV/AIDS, malaria and tuberculosis, which have substantially improved the lives of millions of people while arguably impeding future development.

Modernization theorists often saw traditions as obstacles to economic growth. According to Seymour Martin Lipset, economic conditions are heavily determined by the cultural and social values present in a given society. Furthermore, while modernization might deliver violent, radical change for traditional societies, it was thought worth the price. Critics insist that traditional societies were often destroyed without ever gaining the promised advantages, as, among other things, the economic gap between advanced societies and such societies actually increased. The net effect of modernization for some societies was therefore the replacement of traditional poverty by a more modern form of misery, according to these critics. Others point to improvements in living standards, physical infrastructure, education and economic opportunity to refute such criticisms.

Since the 1960s, modernization theory has been criticized by numerous scholars, including Andre Gunder Frank and Immanuel Wallerstein. In their account, the modernization of a society required the destruction of the indigenous culture and its replacement by a more Westernized one. By one definition, modern simply refers to the present, and any society still in existence is therefore modern. Proponents of modernization typically view only Western society as being truly modern and argue that others are primitive or unevolved by comparison. That view sees unmodernized societies as inferior even if they have the same standard of living as Western societies. Opponents argue that modernity is independent of culture and can be adapted to any society. Japan is cited as an example by both sides. Some see it as proof that a thoroughly modern way of life can exist in a non-Western society. Others argue that Japan has become distinctly more Western as a result of its modernization.

As Tipps has argued, by conflating modernization with other processes that theorists use interchangeably with it, the term becomes imprecise and therefore difficult to disprove. The theory has also been criticised empirically, as modernization theorists ignore external sources of change in societies. The binary between traditional and modern is unhelpful, as the two are linked and often interdependent, and 'modernization' does not come as a whole. Modernization theory has also been accused of being Eurocentric, as modernization began in Europe, with the Industrial Revolution, the French Revolution and the Revolutions of 1848, and has long been regarded as reaching its most advanced stage in Europe. Anthropologists typically take this criticism one step further and say that the view is ethnocentric and is specific to Western culture.

The creative industries

The creative industries refers to a range of economic activities which are concerned with the generation or exploitation of knowledge and information. They may variously also be referred to as the cultural industries or the creative economy, and most recently they have been denominated as the Orange Economy in Latin America and the Caribbean.

Howkins' creative economy comprises advertising, architecture, art, crafts, design, fashion, film, music, performing arts, publishing, R&D, software, toys and games, TV and radio, and video games. Some scholars consider that the education industry, including public and private services, forms part of the creative industries. There remain, therefore, different definitions of the sector.

The creative industries have been seen to become increasingly important to economic well-being, proponents suggesting that "human creativity is the ultimate economic resource". Various commentators have provided varying suggestions on what activities to include in the concept of "creative industries", and the name itself has become a contested issue - with significant differences and overlap between the terms "creative industries", "cultural industries" and "creative economy". Lash and Urry suggest that each of the creative industries has an "irreducible core" concerned with "the exchange of finance for rights in intellectual property". This echoes the UK Government Department for Culture, Media and Sport definition, which describes the creative industries as those industries which have their origin in individual creativity, skill and talent and which have a potential for wealth and job creation through the generation and exploitation of intellectual property.

The various fields of engineering do not appear on this list, which emerged from the DCMS reports. This is probably because engineers occupy relevant positions in "non-cultural" corporations, performing project, management, operation, maintenance, risk-analysis and supervision activities, among others. However, historically and presently, several tasks of engineers can be regarded as highly creative, inventive and innovative. The contribution of engineering is represented by new products, processes and services.

Hesmondhalgh reduces the list to what he terms "the core cultural industries" of advertising and marketing, broadcasting, film, internet and music industries, print and electronic publishing, and video and computer games. His definition only includes those industries that create "texts" or "cultural artefacts" and which engage in some form of industrial reproduction. The DCMS list has proven influential, and many other nations have formally adopted it. It has also been criticised. It has been argued that the division into sectors obscures a divide between lifestyle business, non-profits, and larger businesses, and between those who receive state subsidies and those who do not. The inclusion of the antiques trade often comes into question, since it does not generally involve production. The inclusion of all computer services has also been questioned. Some areas, such as Hong Kong, have preferred to shape their policy around a tighter focus on copyright ownership in the value chain. They adopt the WIPO's classifications, which divide up the creative industries according to who owns the copyrights at various stages during the production and distribution of creative content.

The Inter-American Development Bank has denominated them for Latin America and the Caribbean as the Orange Economy, which is defined as the group of linked activities through which ideas are transformed into cultural goods and services whose value is determined by intellectual property.

Others have suggested a distinction between those industries that are open to mass production and distribution, and those that are primarily craft-based and are meant to be consumed in a particular place and moment.

The DCMS classifies enterprises and occupations as creative according to what the enterprise primarily produces, and what the worker primarily does. Thus, a company which produces records would be classified as belonging to the music industry, and a worker who plays piano would be classified as a musician. The primary purpose of this is to quantify - for example it can be used to count the number of firms, and the number of workers, creatively employed in any given location, and hence to identify places with particularly high concentrations of creative activities. It leads to some complications which are not immediately obvious. For example, a security guard working for a music company would be classified as a creative employee, although not as creatively occupied. The total number of creative employees (as sketched in the example after this list) is then calculated as the sum of:

· All workers employed in creative industries, whether or not creatively occupied

· All workers that are creatively occupied, and are not employed in creative industries. This includes people whose second job is creative, for example somebody who does weekend gigs, writes books, or produces artwork in their spare time
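A minimal Python sketch of this counting rule, using invented worker records (the field names and data are hypothetical, purely for illustration):

    # Hypothetical illustration of the DCMS-style counting rule described above:
    # total creative employment = everyone employed in a creative industry
    # (whatever their occupation) + everyone creatively occupied outside those industries.

    workers = [
        {"name": "A", "creative_industry": True,  "creative_occupation": True},   # e.g. a musician at a record label
        {"name": "B", "creative_industry": True,  "creative_occupation": False},  # e.g. a security guard at a record label
        {"name": "C", "creative_industry": False, "creative_occupation": True},   # e.g. an accountant who gigs at weekends
        {"name": "D", "creative_industry": False, "creative_occupation": False},
    ]

    employed_in_creative_industries = sum(1 for w in workers if w["creative_industry"])
    creatively_occupied_elsewhere = sum(
        1 for w in workers if w["creative_occupation"] and not w["creative_industry"]
    )

    total_creative_employment = employed_in_creative_industries + creatively_occupied_elsewhere
    print(total_creative_employment)  # 3 - workers A, B and C count; D does not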

According to Caves, creative industries are characterized by seven economic properties:

1. Nobody knows principle: Demand uncertainty exists because consumers' reactions to a product are neither known beforehand nor easily understood afterward.

2. Art for art's sake: Workers care about originality, technical professional skill, harmony, etc. of creative goods and are willing to settle for lower wages than offered by 'humdrum' jobs.


3. Motley crew principle: For relatively complex creative products, the production requires diversely skilled inputs. Each skilled input must be present and perform at some minimum level to produce a valuable outcome.

4. Infinite variety: Products are differentiated by quality and uniqueness; each product is a distinct combination of inputs, leading to an infinite variety of options.

5. A list/B list: Skills are vertically differentiated. Artists are ranked on their skills, originality, and proficiency in creative processes and/or products. Small differences in skills and talent may yield huge differences in success.

6. Time flies: When coordinating complex projects with diversely skilled inputs, time is of the essence.

7. Ars longa: Some creative products have durability aspects that invoke copyright protection, allowing a creator or performer to collect rents.

The properties described by Caves have been criticized for being too rigid. Not all creative workers are purely driven by 'art for art's sake'. The 'ars longa' property also holds for certain noncreative products. The 'time flies' property also holds for large construction projects. Creative industries are therefore not unique, but they score generally higher on these properties relative to non-creative industries.

There is often a question about the boundaries between creative industries and the similar term of cultural industries. Cultural industries are best described as an adjunct sector of the creative industries. Cultural industries include industries that focus on cultural tourism and heritage, museums and libraries, sports and outdoor activities, and a variety of 'way of life' activities that arguably range from local pet shows to a host of hobbyist concerns. Thus cultural industries are more concerned with delivering other kinds of value – including cultural wealth and social wealth – rather than primarily providing monetary value. Some authors, such as the economist Richard Florida, argue for a wider focus on the products of knowledge workers, and judge the 'creative class' to include nearly all those offering professional knowledge-based services. Here the term creative industries begins to blur into the knowledge economy and questions of intellectual property ownership in general. Florida's focus leads him to pay particular attention to the nature of the creative workforce. In a study of why particular US cities such as San Francisco seem to attract creative producers, Florida argues that a high proportion of workers from the 'creative class' provide a key input to creative production, which enterprises seek out. He seeks to quantitatively establish the importance of diversity and multiculturalism in the cities concerned, for example the existence of a significant public gay community, ethnic and religious variety, and tolerance.

Globally, the creative industries, excluding software and general scientific research and development, are said to have accounted for around 4% of the world's economic output in 1999, which is the last year for which comprehensive figures are currently available. Estimates of the output corresponding to scientific research and development suggest that an additional 4-9% might be attributable to the sector if its definition is extended to include such activities, though the figures vary significantly between different countries.

Taking the UK as an example, in the context of other sectors, the creative industries make a far more significant contribution to output than hospitality or utilities and deliver four times the output of agriculture, fisheries and forestry. In terms of employment, and depending on the definition of activities included, the sector is a major employer, accounting for between 4% and 6% of the UK's working population, though this is still significantly less than employment in traditional areas of work such as retail and manufacturing.

Within the creative industries sector, and again taking the UK as an example, the three largest sub-sectors are design, publishing, and television and radio. Together these account for around 75% of revenues and 50% of employment. The complex supply chains in the creative industries sometimes make it challenging to calculate accurate figures for the gross value added by each sub-sector. This is particularly the case for the service-focused sub-sectors such as advertising, whereas it is more straightforward in product-focused sub-sectors such as crafts. Not surprisingly, perhaps, competition in product-focused areas tends to be more intense, with a tendency to drive the production end of the supply chain to become a commodity business.

There may be a tendency for publicly funded creative-industries development services to inaccurately estimate the number of creative businesses during the mapping process. There is also imprecision in nearly all tax code systems that determine a person's profession, since many creative people operate simultaneously in multiple roles and jobs. Both these factors mean that official statistics relating to the creative industries should be treated with caution.


The creative industries in Europe make a significant contribution to the EU economy, creating about 3% of EU GDP - corresponding to an annual market value of €500 billion - and employing about 6 million people. In addition, the sector plays a crucial role in fostering innovation, in particular for devices and networks. The EU records the second-highest TV viewing figures globally and produces more films than any other region in the world. In that respect, the newly proposed 'Creative Europe' programme will help preserve cultural heritage while increasing the circulation of creative works inside and outside the EU. The programme will play a consequential role in stimulating cross-border cooperation, promoting peer learning and making these sectors more professional. The Commission will then propose a financial instrument run by the European Investment Bank to provide debt and equity finance for cultural and creative industries. The role of non-state actors in media governance will no longer be neglected.

Building a new approach that stresses the importance of a European-level playing field for industry may therefore boost the adoption of policies aimed at developing a conducive environment, enabling European companies as well as citizens to use their imagination and creativity – both sources of innovation, and therefore of competitiveness and sustainability. This presupposes tailoring regulatory and institutional frameworks to support public-private collaboration, in particular in the media sector. The EU therefore plans to develop clusters, financing instruments and foresight activities to support this sector. The European Commission wishes to assist European creators and audiovisual enterprises to develop new markets through the use of digital technology, and asks how policy-making can best help achieve this. A more entrepreneurial culture will have to take hold, with a more positive attitude towards risk-taking and a capacity to innovate by anticipating future trends. Creativity plays an important role in human resource management, as artists and creative professionals can think laterally. Moreover, new jobs requiring new skills created in the post-crisis economy should be supported by labour mobility to ensure that people are employed wherever their skills are needed.

In the introduction to a 2013 special issue of Work and Occupations on artists in the US workforce, the guest editors argue that by examining the work lives of artists, one can identify characteristics and actions that help both individual workers and policy makers adapt to changing economic conditions. Elizabeth Lingo and Steven Tepper cite multiple sources to suggest artists' skill sets allow them to work beyond existing markets and create entirely new opportunities for themselves and others. Specifically, Lingo and Tepper suggest artistic workers are "catalysts of change and innovation because they face special challenges managing ambiguity, developing and sustaining a relative identity, and forming community in the context of an individually based enterprise economy". Because of these adaptive skills, the suggestion is that studying how artists cope with uncertainty, and the factors that influence their success, should be relevant for understanding the broader social and economic trends facing today's workforce.

This view of artist-as-change-agent changes the questions researchers ask of creative economies. Old research questions would focus on topics like skills, work practices, contracts, wage differentials, employment incentives, formal credentials, employment pipelines, and labor flows of differentiated occupational categories. Examples of new questions include:

1. How do artists create changes both in the labor market itself and in the way cultural work is done?

2. What is their process of innovation and enterprise?

3. What is the nature of their work and the resources they draw upon?

4. How do different network structures produce different opportunity spaces?

5. How do artistic workers create and manage planned serendipity – the spaces and exchanges that produce unexpected collaborations and opportunities?

6. How do creative workers broker and synthesize across occupational, genre, geographic, and industry boundaries to create new possibilities?

As some first-world countries struggle to compete in traditional markets such as manufacturing, many now see the creative industries as a key component in a new knowledge economy, capable perhaps of delivering urban regeneration, often through initiatives linked to exploitation of cultural heritage that leads to increased tourism. It is often argued that, in future, the ideas and imagination of countries like the United Kingdom will be their greatest asset; in support of this argument, a number of universities in the UK have started to offer creative entrepreneurship as a specific area for study and research. Indeed, UK government figures reveal that the UK's creative industries account for over a million jobs and brought in £112.5 billion to the UK economy, although the data sets underlying these figures are open to question.

In recent years, creative industries have become 'increasingly attractive to governments outside the developed world'. In 2005, the United Nations Conference on Trade and Development XI High Level Panel on Creative Industries and Development commissioned several studies to identify challenges and opportunities facing the growth and development of creative industries in developing countries. As Cunningham et al. put it, 'the harnessing of creativity brings with it the potential of new wealth creation, the cultivation of local talent and the generation of creative capital, the development of new export markets, significant multiplier effects throughout the broader economy, the utilisation of information communication technologies and enhanced competitiveness in an increasingly global economy'. A key driver of interest in creative industries and development is the acknowledgement that the value of creative production resides in ideas and individual creativity, and developing countries have rich cultural traditions and pools of creative talent which lay a basic foundation for creative enterprises. Reflecting the growing interest in the potential of creative industries in developing countries, in October 2011 a Ministry of Tourism and Creative Economy was created within the Indonesian government, with well-known economist Dr Mari Pangestu appointed as the first minister to hold the position.

The method of science

The scientific method is a body of techniques for investigating phenomena, acquiring new knowledge, or correcting and integrating previous knowledge. To be termed scientific, a method of inquiry is commonly based on empirical or measurable evidence subject to specific principles of reasoning. The scientific method is a method or procedure that has characterized natural science since the 17th century, consisting in systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses. Experiments need to be designed to test hypotheses. The most important part of the scientific method is the experiment.

The scientific method is a continuous process, which usually begins with observations about the natural world. Human beings are naturally inquisitive, so they often come up with questions about things they see or hear and often develop ideas about why things are the way they are. The best hypotheses lead to predictions that can be tested in various ways, including making further observations about nature. In general, the strongest tests of hypotheses come from carefully controlled and replicated experiments that gather empirical data.

Depending on how well the tests match the predictions, the original hypothesis may require refinement, alteration, expansion or even rejection. If a particular hypothesis becomes very well supported, a general theory may be developed. Although procedures vary from one field of inquiry to another, identifiable features are frequently shared in common between them. The overall process of the scientific method involves making conjectures, deriving predictions from them as logical consequences, and then carrying out experiments based on those predictions. A hypothesis is a conjecture, based on knowledge obtained while formulating the question. The hypothesis might be very specific or it might be broad. Scientists then test hypotheses by conducting experiments. Under modern interpretations, a scientific hypothesis must be falsifiable, implying that it is possible to identify a possible outcome of an experiment that conflicts with predictions deduced from the hypothesis; otherwise, the hypothesis cannot be meaningfully tested.

The purpose of an experiment is to determine whether observations agree with or conflict with the predictions derived from a hypothesis. Experiments can take place anywhere from a college lab to CERN's Large Hadron Collider. There are difficulties in a formulaic statement of method, however. Though the scientific method is often presented as a fixed sequence of steps, it represents rather a set of general principles. Not all steps take place in every scientific inquiry, and they are not always in the same order. Some philosophers and scientists, such as Lee Smolin and Paul Feyerabend, have argued that there is no scientific method.

The scientific method is the process by which science is carried out. As in other areas of inquiry, science can build on previous knowledge and develop a more sophisticated understanding of its topics of study over time. This model can be seen to underlie the scientific revolution. One thousand years ago, Alhazen argued the importance of forming questions and subsequently testing them, an approach which was advocated by Galileo in 1638 with the publication of Two New Sciences. The current method is based on a hypothetico-deductive model formulated in the 20th century, although it has undergone significant revision since first proposed.

The overall process involves making conjectures, deriving predictions from them as logical consequences, and then carrying out experiments based on those predictions to determine whether the original conjecture was correct. There are difficulties in a formulaic statement of method, however. Though the scientific method is often presented as a fixed sequence of steps, these are better considered as general principles. Not all steps take place in every scientific inquiry, and they are not always in the same order. As noted by William Whewell, "invention, sagacity, genius" are required at every step.

The question can refer to the explanation of a specific observation, as in "Why is the sky blue?", but can also be open-ended, as in "How can I design a drug to cure this particular disease?" This stage frequently involves finding and evaluating evidence from previous experiments, personal scientific observations or assertions, and/or the work of other scientists. If the answer is already known, a different question that builds on the previous evidence can be posed. When applying the scientific method to scientific research, determining a good question can be very difficult and affects the final outcome of the investigation.

A hypothesis is a conjecture, based on knowledge obtained while formulating the question, that may explain the observed behavior of a part of our universe. The hypothesis might be very specific, e.g., Einstein's equivalence principle or Francis Crick's "DNA makes RNA makes protein", or it might be broad, e.g., unknown species of life dwell in the unexplored depths of the oceans. A statistical hypothesis is a conjecture about some population. For example, the population might be people with a particular disease. The conjecture might be that a new drug will cure the disease in some of those people. Terms commonly associated with statistical hypotheses are null hypothesis and alternative hypothesis. A null hypothesis is the conjecture that the statistical hypothesis is false, e.g., that the new drug does nothing and that any cures are due to chance effects. Researchers normally want to show that the null hypothesis is false. The alternative hypothesis is the desired outcome, e.g., that the drug does better than chance. A final point: a scientific hypothesis must be falsifiable, meaning that one can identify a possible outcome of an experiment that conflicts with predictions deduced from the hypothesis; otherwise, it cannot be meaningfully tested.
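To make the null/alternative framing concrete, here is a minimal Python sketch of an exact one-sided binomial test with invented numbers (the trial data, the 30% spontaneous-recovery rate and the function name are hypothetical, for illustration only):

    from math import comb

    def binomial_p_value(cures: int, patients: int, chance_rate: float) -> float:
        """P(X >= cures) under the null hypothesis that each patient recovers
        with probability chance_rate regardless of the drug."""
        return sum(
            comb(patients, k) * chance_rate**k * (1 - chance_rate) ** (patients - k)
            for k in range(cures, patients + 1)
        )

    # Hypothetical trial: 14 of 20 patients recover; assume a 30% chance recovery rate.
    p = binomial_p_value(cures=14, patients=20, chance_rate=0.30)
    print(f"p-value = {p:.5f}")  # a small p-value counts as evidence against the null hypothesis

A small p-value here would favour the alternative hypothesis that the drug does better than chance; a large one would leave the null hypothesis standing.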

This step involves determining the logical consequences of the hypothesis. One or more predictions are then selected for further testing. The more unlikely that a prediction would be correct simply by coincidence, the more convincing it would be if the prediction were fulfilled; evidence is also stronger if the answer to the prediction is not already known, due to the effects of hindsight bias. Ideally, the prediction must also distinguish the hypothesis from likely alternatives; if two hypotheses make the same prediction, observing the prediction to be correct is not evidence for either one over the other. These statements about the relative strength of evidence can be mathematically derived using Bayes' Theorem.
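In odds form, that Bayesian point can be stated as follows: after observing evidence E, the odds of hypothesis H1 against an alternative H2 are the prior odds multiplied by the likelihood ratio,

    \frac{P(H_1 \mid E)}{P(H_2 \mid E)} \;=\; \frac{P(E \mid H_1)}{P(E \mid H_2)} \cdot \frac{P(H_1)}{P(H_2)} .

For example (illustrative numbers only), if a prediction has probability 0.9 under H1 but only 0.1 under H2, observing it multiplies the odds in favour of H1 by a factor of 9; this is why predictions that are improbable under the alternatives provide the strongest evidence.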

This is an investigation of whether the real world behaves as predicted by the hypothesis. Scientists test hypotheses by conducting experiments. The purpose of an experiment is to determine whether observations of the real world agree with or conflict with the predictions derived from a hypothesis. If they agree, confidence in the hypothesis increases; otherwise, it decreases. Agreement does not assure that the hypothesis is true; future experiments may reveal problems. Karl Popper advised scientists to try to falsify hypotheses, i.e., to search for and test those experiments that seem most doubtful. Large numbers of successful confirmations are not convincing if they arise from experiments that avoid risk. Experiments should be designed to minimize possible errors, especially through the use of appropriate scientific controls. For example, tests of medical treatments are commonly run as double-blind tests.

Test personnel, who might unwittingly reveal to test subjects which samples are the desired test drugs and which are placebos, are kept ignorant of which are which. Such hints can bias the responses of the test subjects. Furthermore, failure of an experiment does not necessarily mean the hypothesis is false. Experiments always depend on several hypotheses, e.g., that the test equipment is working properly, and a failure may be a failure of one of the auxiliary hypotheses. Experiments can be conducted in a college lab, on a kitchen table, at CERN's Large Hadron Collider, at the bottom of an ocean, on Mars, and so on. Astronomers do experiments, searching for planets around distant stars. Finally, most individual experiments address highly specific topics for reasons of practicality. As a result, evidence about broader topics is usually accumulated gradually.

This involves determining what the results of the experiment show and deciding on the next actions to take. The predictions of the hypothesis are compared to those of the null hypothesis, to determine which is better able to explain the data. In cases where an experiment is repeated many times, a statistical analysis such as a chi-squared test may be required. If the evidence has falsified the hypothesis, a new hypothesis is required; if the experiment supports the hypothesis but the evidence is not strong enough for high confidence, other predictions from the hypothesis must be tested. Once a hypothesis is strongly supported by evidence, a new question can be asked to provide further insight on the same topic. Evidence from other scientists and experience are frequently incorporated at any stage in the process. Depending on the complexity of the experiment, many iterations may be required to gather sufficient evidence to answer a question with confidence, or to build up many answers to highly specific questions in order to answer a single broader question. The scientific method also includes other components required even when all the iterations of the steps above have been completed.
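As a small illustration of such a chi-squared check (the counts below are invented, and the example assumes the scipy library is available):

    from scipy.stats import chisquare

    # Hypothetical observed counts of faces 1-6 over 120 rolls of a die;
    # the null hypothesis of a fair die expects 20 rolls per face.
    observed = [25, 18, 22, 15, 21, 19]
    expected = [20, 20, 20, 20, 20, 20]

    result = chisquare(f_obs=observed, f_exp=expected)
    print(f"chi-squared = {result.statistic:.2f}, p-value = {result.pvalue:.3f}")
    # A large p-value means the counts are consistent with a fair die;
    # a small one would count as evidence against that hypothesis.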

If an experiment cannot be repeated to produce the same results, this implies that the original results might have been in error. As a result, it is common for a single experiment to be performed multiple times, especially when there are uncontrolled variables or other indications of experimental error. For significant or surprising results, other scientists may also attempt to replicate the results for themselves, especially if those results would be important to their own work.

The process of peer review involves evaluation of the experiment by experts, who typically give their opinions anonymously. Some journals request that the experimenter provide lists of possible peer reviewers, especially if the field is highly specialized. Peer review does not certify the correctness of the results, only that, in the opinion of the reviewers, the experiments themselves were sound. If the work passes peer review, which occasionally may require new experiments requested by the reviewers, it will be published in a peer-reviewed scientific journal. The specific journal that publishes the results indicates the perceived quality of the work.

Scientists typically are careful in recording their data, a requirement promoted by Ludwik Fleck and others. Though not typically required, they might be requested to supply this data to other scientists who wish to replicate their original results, extending to the sharing of any experimental samples that may be difficult to obtain.

Scientific inquiry generally aims to obtain knowledge in the form of testable explanations that scientists can use to predict the results of future experiments. This allows scientists to gain a better understanding of the topic under study, and later to use that understanding to intervene in its causal mechanisms (such as to cure disease). The better an explanation is at making predictions, the more useful it frequently can be, and the more likely it will continue to explain a body of evidence better than its alternatives. The most successful explanations – those which explain and make accurate predictions in a wide range of circumstances – are often called scientific theories.

Most experimental results do not produce large changes in human understanding; improvements in theoretical scientific understanding typically result from a gradual process of development over time, sometimes across different domains of science. Scientific models vary in the extent to which they have been experimentally tested and for how long, and in their acceptance in the scientific community. In general, explanations become accepted over time as evidence accumulates on a given topic, and the explanation in question proves more powerful than its alternatives at explaining the evidence. Often subsequent researchers reformulate the explanations over time, or combine explanations to produce new ones.

Tow sees the scientific method in terms of an evolutionary algorithm applied to science and technology. Scientific knowledge is closely tied to empirical findings, and can remain subject to falsification if new experimental observation incompatible with it is found. That is, no theory can ever be considered final, since new problematic evidence might be discovered. If such evidence is found, a new theory may be proposed, or it is found that modifications to the previous theory are sufficient to explain the new evidence. The strength of a theory can be argued to relate to how long it has persisted without major alteration to its core principles.

Theories can also become subsumed by other theories. For example, Newton's laws explained thousands of years of scientific observations of the planets almost perfectly. However, these laws were then determined to be special cases of a more general theory, which explained both the exceptions to Newton's laws and predicted and explained other observations such as the deflection of light by gravity. Thus, in certain cases independent, unconnected scientific observations can be connected to each other, unified by principles of increasing explanatory power.

Since new theories might be more comprehensive than what preceded them, and thus be able to explain more than previous ones, successor theories might be able to meet a higher standard by explaining a larger body of observations than their predecessors. For example, the theory of evolution explains the diversity of life on Earth, how species adapt to their environments, and many other patterns observed in the natural world; its most recent major modification was unification with genetics to form the modern evolutionary synthesis. In subsequent modifications, it has also subsumed aspects of many other fields such as biochemistry and molecular biology.

Scientific methodology often directs that hypotheses be tested in controlled conditions wherever possible. This is frequently possible in certain areas, such as in the biological sciences, and more difficult in other areas, such as in astronomy. The practice of experimental control and reproducibility can have the effect of diminishing the potentially harmful effects of circumstance and, to a degree, personal bias. For example, pre-existing beliefs can alter the interpretation of results, as in confirmation bias; this is a heuristic that leads a person with a particular belief to see things as reinforcing their belief, even if another observer might disagree.

A historical example is the belief that the legs of a galloping horse are splayed at the point when none of the horse's legs touches the ground, to the point of this image being included in paintings by its supporters. However, the first stop-action pictures of a horse's gallop by Eadweard Muybridge showed this to be false, and that the legs are instead gathered together. Another important human bias that plays a role is a preference for new, surprising statements, which can result in a search for evidence that the new is true. In contrast to this standard in the scientific method, poorly attested beliefs can be believed and acted upon via a less rigorous heuristic, sometimes taking advantage of the narrative fallacy that when a narrative is constructed its elements become easier to believe. Sometimes, these beliefs have their elements assumed a priori, or contain some other logical or methodological flaw in the process that ultimately produced them.

There are different ways of outlining the basic method used for scientific inquiry. The scientific community and philosophers of science generally agree on the following classification of method components. These methodological elements and organization of procedures tend to be more characteristic of natural sciences than social sciences. Nonetheless, the cycle of formulating hypotheses, testing and analyzing the results, and formulating new hypotheses will resemble the cycle described below.

Four essential elements of the scientific method are iterations, recursions, interleavings, or orderings of the following:

· Characterizations (observations, definitions, and measurements of the subject of inquiry)


· Hypotheses (theoretical, hypothetical explanations of observations and measurements of the subject)

· Predictions (reasoning including deductive reasoning from the hypothesis or theory)

· Experiments (tests of all of the above)

Each element of the scientific method is subject to peer review for possible mistakes. These activities do not describe all that scientists do but apply mostly to experimental sciences. The elements above are often taught in the educational system as the scientific method.

The scientific method is not a single recipe: it requires intelligence, imagination, and creativity. In this sense, it is not a mindless set of standards and procedures to follow, but is rather an ongoing cycle, constantly developing more useful, accurate and comprehensive models and methods. For example, when Einstein developed the Special and General Theories of Relativity, he did not in any way refute or discount Newton's Principia. On the contrary, if the astronomically large, the vanishingly small, and the extremely fast are removed from Einstein's theories – all phenomena Newton could not have observed – Newton's equations are what remain. Einstein's theories are expansions and refinements of Newton's theories and, thus, increase confidence in Newton's work. A linearized, pragmatic scheme of the four points above is sometimes offered as a guideline for proceeding.

1. Define a question

2. Gather information and resources

3. Form an explanatory hypothesis

4. Test the hypothesis by performing an experiment and collecting data in a reproducible manner

5. Analyze the data

6. Interpret the data and draw conclusions that serve as a starting point for a new hypothesis

7. Publish results

8. Retest

The iterative cycle inherent in this step-by-step method goes from point 3 through point 6 and back to 3 again. While this schema outlines a typical hypothesis/testing method, a number of philosophers, historians, and sociologists of science, including Paul Feyerabend, claim that such descriptions of scientific method have little relation to the ways that science is actually practiced.


The scientific method depends upon increasingly sophisticated characterizations of the subjects of investigation. For example, Benjamin Franklin conjectured, correctly, that St. Elmo's fire was electrical in nature, but it has taken a long series of experiments and theoretical changes to establish this. While seeking the pertinent properties of the subjects, careful thought may also entail some definitions and observations; the observations often demand careful measurements and/or counting.

The systematic, careful collection of measurements or counts of relevant quantities is often the critical difference between pseudo-sciences, such as alchemy, and sciences, such as chemistry or biology. Scientific measurements are usually tabulated, graphed, or mapped, and statistical manipulations, such as correlation and regression, are performed on them. The measurements might be made in a controlled setting, such as a laboratory, or made on more or less inaccessible or unmanipulatable objects such as stars or human populations. The measurements often require specialized scientific instruments such as thermometers, spectroscopes, particle accelerators, or voltmeters, and the progress of a scientific field is usually intimately tied to their invention and improvement; as one classic caution has it, "I am not accustomed to saying anything with certainty after only one or two observations."
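A minimal sketch (with invented paired measurements) of the kind of correlation-and-regression summary mentioned above, using only the Python standard library (Python 3.10 or later):

    from statistics import correlation, linear_regression

    # Hypothetical paired measurements, e.g. temperature vs. reaction rate.
    x = [10.0, 15.0, 20.0, 25.0, 30.0]
    y = [2.1, 3.0, 3.9, 5.2, 5.8]

    r = correlation(x, y)                        # Pearson correlation coefficient
    slope, intercept = linear_regression(x, y)   # least-squares fit: y is approximately slope*x + intercept
    print(f"r = {r:.3f}, slope = {slope:.3f}, intercept = {intercept:.3f}")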

Measurements in scientific work are also usually accompanied by estimates of their uncertainty. The uncertainty is often estimated by making repeated measurements of the desired quantity. Uncertainties may also be calculated by consideration of the uncertainties of the individual underlying quantities used. Counts of things, such as the number of people in a nation at a particular time, may also have an uncertainty due to data collection limitations. Or counts may represent a sample of desired quantities, with an uncertainty that depends upon the sampling method used and the number of samples taken.
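A small sketch (with invented readings) of estimating a quantity and its uncertainty from repeated measurements, taking the standard error of the mean as the uncertainty estimate:

    from statistics import mean, stdev
    from math import sqrt

    # Five hypothetical repeated readings of the same length, in millimetres.
    readings = [12.1, 12.3, 11.9, 12.2, 12.0]

    best_estimate = mean(readings)
    standard_error = stdev(readings) / sqrt(len(readings))  # sample std. dev. / sqrt(n)
    print(f"length = {best_estimate:.2f} +/- {standard_error:.2f} mm")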

Measurements demand the use of operational definitions of relevant quantities. That is, a scientific quantity is described or defined by how it is measured, as opposed to some more vague, inexact or "idealized" definition. For example, electric current, measured in amperes, may be operationally defined in terms of the mass of silver deposited in a certain time on an electrode in an electrochemical device that is described in some detail. The operational definition of a thing often relies on comparisons with standards: the operational definition of "mass" ultimately relies on the use of an artifact, such as a particular kilogram of platinum-iridium kept in a laboratory in France.
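To make the silver-deposition example concrete (a textbook relation, not something stated above), Faraday's law of electrolysis links the deposited mass to the current, so the current can be defined operationally by weighing the deposit over a timed interval:

    m = \frac{I\,t}{F}\cdot\frac{M}{z} \quad\Longrightarrow\quad I = \frac{m\,z\,F}{M\,t},

where m is the mass of silver deposited in time t, M ≈ 107.87 g/mol is the molar mass of silver, z = 1 is its charge number, and F ≈ 96485 C/mol is the Faraday constant.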


The scientific definition of a term sometimes differs substantially from its natural-language usage. For example, mass and weight overlap in meaning in common discourse, but have distinct meanings in mechanics. Scientific quantities are often characterized by their units of measure, which can later be described in terms of conventional physical units when communicating the work.

New theories are sometimes developed after realizing certain terms have not previously been sufficiently clearly defined. For example, Albert Einstein's first paper on relativity begins by defining simultaneity and the means for determining length. These ideas were skipped over by Isaac Newton with "I do not define time, space, place and motion, as being well known to all." Einstein's paper then demonstrates that they (viz., absolute time and length independent of motion) were approximations. Francis Crick cautions us that when characterizing a subject, however, it can be premature to define something when it remains ill-understood. In Crick's study of consciousness, he actually found it easier to study awareness in the visual system, rather than to study free will, for example. His cautionary example was the gene; the gene was much more poorly understood before Watson and Crick's pioneering discovery of the structure of DNA, and it would have been counterproductive to spend much time on the definition of the gene before then.

The history of the discovery of the structure of DNA is a classic example of the elements of the scientific method: in 1950 it was known that genetic inheritance had a mathematical description, starting with the studies of Gregor Mendel, and that DNA contained genetic information. But the mechanism of storing genetic information in DNA was unclear. Researchers in Bragg's laboratory at Cambridge University made X-ray diffraction pictures of various molecules, starting with crystals of salt, and proceeding to more complicated substances. Using clues painstakingly assembled over decades, beginning with its chemical composition, it was determined that it should be possible to characterize the physical structure of DNA, and the X-ray images would be the vehicle.

The characterization element can require extended and extensive study, even centuries. It took thousands of years of measurements, from the Chaldean, Indian, Persian, Greek, Arabic and European astronomers, to fully record the motion of planet Earth. Newton was able to incorporate those measurements into the consequences of his laws of motion. But the perihelion of the planet Mercury's orbit exhibits a precession that cannot be fully explained by Newton's laws of motion, as Leverrier pointed out in 1859. The observed difference for Mercury's precession between Newtonian theory and observation was one of the things that occurred to Einstein as a possible early test of his theory of General Relativity. His relativistic calculations matched observation much more closely than did Newtonian theory. The difference is approximately 43 arc-seconds per century.
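As a rough check of the figure quoted above (rounded values, offered only as an illustration), the relativistic perihelion advance per orbit is

\[ \Delta\phi = \frac{6\pi G M_\odot}{c^{2} a (1 - e^{2})} \approx \frac{6\pi \times 1.33\times10^{20}\ \mathrm{m^3\,s^{-2}}}{(3.0\times10^{8}\ \mathrm{m\,s^{-1}})^{2} \times 5.79\times10^{10}\ \mathrm{m} \times (1 - 0.206^{2})} \approx 5.0\times10^{-7}\ \mathrm{rad}, \]

and over the roughly 415 orbits Mercury completes in a century this accumulates to about \(2.1\times10^{-4}\) rad, i.e. approximately 43 arc-seconds per century.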

A hypothesis is a suggested explanation of a phenomenon, or alternately a reasoned proposal suggesting a possible correlation between or among a set of phenomena. Normally hypotheses have the form of a mathematical model. Sometimes, but not always, they can also be formulated as existential statements, stating that some particular instance of the phenomenon being studied has some characteristic, or as causal explanations, which have the general form of universal statements, stating that every instance of the phenomenon has a particular characteristic.
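In the notation of first-order logic (used here purely as an illustration, not taken from the source), an existential statement has the form

\[ \exists x\, P(x) \]

("some particular instance x has the characteristic P"), while a causal explanation takes the universal form

\[ \forall x\, \big(P(x) \rightarrow Q(x)\big) \]

("every instance with characteristic P also exhibits Q").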

Scientists are free to use whatever resources they have – their own creativity, ideas from other fields, inductive reasoning, Bayesian inference, and so on – to imagine possible explanations for a phenomenon under study. Charles Sanders Peirce, borrowing a page from Aristotle, described the incipient stages of inquiry, instigated by the "irritation of doubt" to venture a plausible guess, as abductive reasoning. The history of science is filled with stories of scientists claiming a "flash of inspiration", or a hunch, which then motivated them to look for evidence to support or refute their idea. Michael Polanyi made such creativity the centerpiece of his discussion of methodology.

William Glen observes that the success of a hypothesis, or its service to science, lies not simply in its perceived "truth", or its power to displace, subsume or reduce a predecessor idea, but perhaps more in its ability to stimulate the research that will illuminate bald suppositions and areas of vagueness.

In general, scientists tend to look for theories that are "elegant" or "beautiful". In contrast to the usual English use of these terms, they here refer to a theory in accordance with the known facts which is nevertheless relatively simple and easy to handle. Occam's razor serves as a rule of thumb for choosing the most desirable amongst a group of equally explanatory hypotheses.

To minimize the confirmation bias which results from entertaining a single hypothesis, strong inference emphasizes the need for entertaining multiple alternative hypotheses.


Linus Pauling proposed that DNA might be a triple helix. This hy- pothesis was also considered by Francis Crick and James D. Watson but discarded. When Watson and Crick learned of Pauling's hypothesis, they understood from existing data that Pauling was wrong and that Pauling would soon admit his difficulties with that structure. So, the race was on to figure out the correct structure.

Any useful hypothesis will enable predictions, by reasoning including deductive reasoning. It might predict the outcome of an experiment in a laboratory setting or the observation of a phenomenon in nature. The prediction can also be statistical and deal only with probabilities.

It is essential that the outcome of testing such a prediction be currently unknown. Only in this case does a successful outcome increase the probability that the hypothesis is true. If the outcome is already known, it is called a consequence and should have already been considered while formulating the hypothesis.

If the predictions are not accessible by observation or experience, the hypothesis is not yet testable and so will remain to that extent unscientific in a strict sense. A new technology or theory might make the necessary experiments feasible. Thus, much scientifically based speculation might convince one that the hypothesis that other intelligent species exist is true. But since there is no experiment now known which can test this hypothesis, science itself can have little to say about the possibility. In the future, some new technique might lead to an experimental test and the speculation would then become part of accepted science.

Einstein's theory of General Relativity makes several specific predictions about the observable structure of space-time, such as that light bends in a gravitational field, and that the amount of bending depends in a precise way on the strength of that gravitational field. Arthur Eddington's observations made during a 1919 solar eclipse supported General Relativity rather than Newtonian gravitation.
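As a rough restatement of the prediction Eddington tested (rounded textbook values, given only for illustration), general relativity gives the deflection of a light ray grazing the Sun as

\[ \theta = \frac{4\,G M_\odot}{c^{2} R_\odot} \approx \frac{4 \times 1.33\times10^{20}\ \mathrm{m^3\,s^{-2}}}{(3.0\times10^{8}\ \mathrm{m\,s^{-1}})^{2} \times 6.96\times10^{8}\ \mathrm{m}} \approx 8.5\times10^{-6}\ \mathrm{rad} \approx 1.75'', \]

about twice the value obtained by treating light as a stream of Newtonian particles.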

 












Experiments

Once predictions are made, they can be sought by experiments. If the test results contradict the predictions, the hypotheses which entailed them are called into question and become less tenable. Sometimes the experiments are conducted incorrectly or are not very well designed when compared to a crucial experiment. If the experimental results confirm the predictions, then the hypotheses are considered more likely to be correct, but might still be wrong and continue to be subject to further testing. The experimental control is a technique for dealing with observational error. This technique uses the contrast between multiple samples under differing conditions to see what varies. We vary the conditions for each measurement, to help isolate what has changed. Mill's canons can then help us figure out what the important factor is. Factor analysis is one technique for discovering the important factor in an effect.
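The contrast between samples under differing conditions can be illustrated with a small sketch (Python; the data are invented and the grouping is hypothetical), comparing a control condition with a condition in which one factor has been varied:

import numpy as np

control = np.array([5.1, 4.9, 5.3, 5.0, 5.2, 4.8])    # condition A: nothing varied
treatment = np.array([5.9, 6.1, 5.8, 6.3, 6.0, 5.7])  # condition B: one factor changed

diff = treatment.mean() - control.mean()
# Standard error of the difference of means (Welch-style, unequal variances allowed)
se = np.sqrt(control.var(ddof=1) / len(control) + treatment.var(ddof=1) / len(treatment))
t_stat = diff / se

print(f"difference of means = {diff:.2f}, t = {t_stat:.1f}")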

Depending on the predictions, the experiments can have different shapes. It could be a classical experiment in a laboratory setting, a double-blind study or an archaeological excavation. Even taking a plane from New York to Paris is an experiment which tests the aerodynamical hypotheses used for constructing the plane. Scientists assume an attitude of openness and accountability on the part of those conducting an experiment. Detailed record keeping is essential, to aid in recording and reporting on the experimental results, and supports the effectiveness and integrity of the procedure. It will also assist in reproducing the experimental results, likely by others. Traces of this approach can be seen in the work of Hipparchus, when determining a value for the precession of the Earth, while controlled experiments can be seen in the works of Jābir ibn Hayyān, al-Battani and Alhazen.

The scientific method is iterative. At any stage it is possible to refine its accuracy and precision, so that some consideration will lead the scientist to repeat an earlier part of the process. Failure to develop an interesting hypothesis may lead a scientist to re-define the subject under consideration. Failure of a hypothesis to produce interesting and testable predictions may lead to reconsideration of the hypothesis or of the definition of the subject. Failure of an experiment to produce interesting results may lead a scientist to reconsider the experimental method, the hypothesis, or the definition of the subject.

Other scientists may start their own research and enter the process at any stage. They might adopt the characterization and formulate their own hypothesis, or they might adopt the hypothesis and deduce their own predictions. Often the experiment is not done by the person who made the prediction, and the characterization is based on experiments done by someone else. Published results of experiments can also serve as a hypothesis predicting their own reproducibility.

Science is a social enterprise, and scientific work tends to be accepted by the scientific community when it has been confirmed. Crucially, experimental and theoretical results must be reproduced by others within the scientific community. Researchers have given their lives for this vision; Georg Wilhelm Richmann was killed by ball lightning when attempting to replicate the 1752 kite-flying experiment of Benjamin Franklin. To protect against bad science and fraudulent data, government research-granting agencies such as the National Science Foundation, and science journals, including Nature and Science, have a policy that researchers must archive their data and methods so that other researchers can test the data and methods and build on the research that has gone before. Scientific data archiving can be done at a number of national archives in the U.S. or in the World Data Center.

 



Scientific theory

A scientific theory is an explanation of some aspect of the natural world that can, in accordance with the scientific method, be repeatedly tested, using a predefined protocol of observations and experiments. Established scientific theories have withstood rigorous scrutiny and are a comprehensive form of scientific knowledge. It is important to note that the definition of a "scientific theory" as used in the disciplines of science is significantly different from the common vernacular usage of the word "theory". In everyday non-scientific speech, "theory" can imply that something is an unsubstantiated and speculative guess, conjecture, idea, or hypothesis; such a usage is the opposite of the word "theory" in science. These different usages are comparable to the differing, and often opposing, usages of the term "prediction" in science versus "prediction" in vernacular speech, denoting a mere hope.

The strength of a scientific theory is related to the diversity of phenomena it can explain. As additional scientific evidence is gathered, a scientific theory may be rejected or modified if it does not fit the new empirical findings; in such circumstances, a more accurate theory is then desired. In certain cases, the less accurate unmodified scientific theory can still be treated as a theory if it is useful as an approximation under specific conditions.

Scientific theories are testable and make falsifiable predictions. They describe the causal elements responsible for a particular natural phenomenon, and are used to explain and predict aspects of the physical universe or specific areas of inquiry. Scientists use theories as a foundation to gain further scientific knowledge, as well as to accomplish goals such as inventing technology or curing disease.


As with other forms of scientific knowledge, scientific theories are both deductive and inductive in nature and aim for predictive power and explanatory capability.

Albert Einstein described two types of scientific theories: "constructive theories" and "principle theories". Constructive theories are constructive models for a phenomenon, for example the kinetic theory of gases. Typically, for a theory to be accepted within most of academia there is one simple criterion: its predictions must be observable and its tests repeatable. This criterion is essential to prevent fraud and to perpetuate science itself.

The defining characteristic of all scientific knowledge, including theories, is the ability to make falsifiable or testable predictions. The relevance and specificity of those predictions determine how potentially useful the theory is. A would-be theory that makes no observable predictions is not a scientific theory at all. Predictions not sufficiently specific to be tested are similarly not useful. In both cases, the term "theory" is not applicable.

A body of descriptions of knowledge can be called a theory if it ful- fills the following criteria:

· It makes falsifiable predictions with consistent accuracy across a broad area of scientific inquiry.

· It is well-supported by many independent strands of evidence, rather than a single foundation.

· It is consistent with preexisting experimental results and at least as accurate in its predictions as are any preexisting theories.

These qualities are certainly true of such established theories as special and general relativity, quantum mechanics, plate tectonics, the modern evolutionary synthesis, etc.

In addition, scientists prefer to work with a theory that meets the fol- lowing qualities:

· It can be subjected to minor adaptations to account for new data that do not fit it perfectly, as they are discovered, thus increasing its predictive capability over time.

· It is among the most parsimonious explanations, economical in the use of proposed entities or explanatory steps as per Occam's razor.

The formal scientific definition of theory is quite different from the everyday meaning of the word. It refers to a comprehensive explanation of some aspect of nature that is supported by a vast body of evidence. Many scientific theories are so well established that no new evidence is likely to alter them substantially. For example, no new evidence will demonstrate that the Earth does not orbit around the sun, or that living things are not made of cells, that matter is not composed of atoms, or that the surface of the Earth is not divided into solid plates that have moved over geological timescales. One of the most useful properties of scientific theories is that they can be used to make predictions about natural events or phenomena that have not yet been observed.

A scientific theory is a well-substantiated explanation of some aspect of the natural world, based on a body of facts that have been repeatedly confirmed through observation and experiment. Such fact-supported theories are not "guesses" but reliable accounts of the real world. The theory of biological evolution is more than "just a theory". It is as factu- al an explanation of the universe as the atomic theory of matter or the germ theory of disease. Our understanding of gravity is still a work in progress. But the phenomenon of gravity, like evolution, is an accepted fact. Note that the term theory would not be appropriate for describing untested but intricate hypotheses or even scientific models.

The scientific method involves the proposal and testing of hypothe- ses, by deriving predictions from the hypotheses about the results of fu- ture experiments, then performing those experiments to see whether the predictions are valid. This provides evidence either for or against the hypothesis. When enough experimental results have been gathered in a particular area of inquiry, scientists may propose an explanatory frame- work that accounts for as many of these as possible. This explanation is also tested, and if it fulfills the necessary criteria, then the explanation becomes a theory. This can take many years, as it can be difficult or complicated to gather sufficient evidence.

Once all of the criteria have been met, the explanation will be widely accepted by scientists as the best available account of at least some phenomena. It will have made predictions of phenomena that previous theories could not explain or could not predict accurately, and it will have resisted attempts at falsification. The strength of the evidence is evaluated by the scientific community, and the most important experiments will have been replicated by multiple independent groups.

Theories do not have to be perfectly accurate to be scientifically useful. For example, the predictions made by classical mechanics are known to be inaccurate in the relativistic realm, but they are almost exactly correct at the comparatively low velocities of common human experience. In chemistry, there are many acid-base theories providing highly divergent explanations of the underlying nature of acidic and basic compounds, but they are very useful for predicting their chemical behavior. Like all knowledge in science, no theory can ever be completely certain, since it is possible that future experiments might conflict with the theory's predictions. However, theories supported by the scientific consensus have the highest level of certainty of any scientific knowledge; for example, that all objects are subject to gravity or that life on Earth evolved from a common ancestor.

Acceptance of a theory does not require that all of its major predic- tions be tested, if it is already supported by sufficiently strong evidence. For example, certain tests may be unfeasible or technically difficult. As a result, theories may make predictions that have not yet been confirmed or proven incorrect; in this case, the predicted results may be described informally with the term "theoretical". These predictions can be tested at a later time, and if they are incorrect, this may lead to the revision or rejection of the theory.

If experimental results contrary to a theory's predictions are ob- served, scientists first evaluate whether the experimental design was sound, and if so they confirm the results by independent replication. A search for potential improvements to the theory then begins. Solutions may require minor or major changes to the theory, or none at all if a sat- isfactory explanation is found within the theory's existing framework. Over time, as successive modifications build on top of each other, theo- ries consistently improve and greater predictive accuracy is achieved. Since each new version of a theory must have more predictive and ex- planatory power than the last, scientific knowledge consistently be- comes more accurate over time.

If modifications to the theory or other explanations seem to be insufficient to account for the new results, then a new theory may be required. Since scientific knowledge is usually durable, this occurs much less commonly than modification. Furthermore, until such a theory is proposed and accepted, the previous theory will be retained. This is because it is still the best available explanation for many other phenomena, as verified by its predictive power in other contexts. For example, it was known in 1859 that the observed perihelion precession of Mercury violated Newtonian mechanics, but the theory remained the best explanation available until relativity was supported by sufficient evidence. Also, while new theories may be proposed by a single person or by many, the cycle of modifications eventually incorporates contributions from many different scientists.

After the changes, the accepted theory will explain more phenomena and have greater predictive power; this new explanation will then be open to further replacement or modification. If a theory does not require modification despite repeated tests, this implies that the theory is very accurate. This also means that accepted theories continue to accumulate evidence over time, and the length of time that a theory remains accept- ed often indicates the strength of its supporting evidence.

In some cases, two or more theories may be replaced by a single theory that explains the previous theories as approximations or special cases, analogous to the way a theory is a unifying explanation for many confirmed hypotheses; this is referred to as unification of theories. For example, electricity and magnetism are now known to be two aspects of the same phenomenon, referred to as electromagnetism.

When the predictions of different theories appear to contradict each other, this is also resolved by either further evidence or unification. For example, physical theories in the 19th century implied that the Sun could not have been burning long enough to allow certain geological changes as well as the evolution of life. This was resolved by the dis- covery of nuclear fusion, the main energy source of the Sun. Contradic- tions can also be explained as the result of theories approximating more fundamental phenomena. For example, atomic theory is an approxima- tion of quantum mechanics. Current theories describe three separate fundamental phenomena of which all other theories are approximations; the potential unification of these is sometimes called the Theory of Eve- rything.

In 1905, Albert Einstein published the principle of special relativity, which soon became a theory. Special relativity predictively aligned the Newtonian principle of Galilean invariance, also termed Galilean relativity, with the electromagnetic field. By omitting the luminiferous aether from special relativity, Einstein stated that time dilation and length contraction are to be expected by an observer measuring an object in relative inertial motion – that is, an object exhibiting constant velocity, which is speed with direction, when measured by its observer – and thereby duplicated the Lorentz transformation and the Lorentz contraction that had been inserted into electrodynamic theory as dynamical consequences of the aether's properties, hypothesized to resolve experimental riddles. Elegant, special relativity yielded its own consequences, such as the equivalence of mass and energy transforming into one another and the resolution of the paradox that an excitation of the electromagnetic field could be viewed in one reference frame as electricity, but in another as magnetism.
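For reference, the quantities named above can be summarized with the standard textbook formulas (stated here only as a summary, not as part of the original text): with the Lorentz factor

\[ \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, \]

a moving clock is dilated, \(\Delta t = \gamma\,\Delta t_{0}\); a moving rod is contracted, \(L = L_{0}/\gamma\); the coordinates of two inertial observers are related by the Lorentz transformation \(x' = \gamma\,(x - vt)\), \(t' = \gamma\,(t - vx/c^{2})\); and mass-energy equivalence takes the familiar form \(E = mc^{2}\).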

Einstein sought to generalize the invariance principle to all reference frames, whether inertial or accelerating. Rejecting Newtonian gravitation – a central force acting instantly at a distance – Einstein presumed a gravitational field. In 1907, Einstein's equivalence principle inferred that a free fall within a uniform gravitational field is equivalent to inertial motion. By extending special relativity's effects into three dimensions, length contraction became space contraction in general relativity, whose 4D spacetime is the gravitational field that locally alters geometrically and sets all local objects' pathways. Even massless energy exerts gravitational motion on local objects by "curving" the geometrical "surface" of 4D spacetime. Yet unless the energy is vast, its relativistic effects, whether from speed or from mass in the vicinity – where space is contracted and time is slowed – are negligible when merely predicting motion. Although general relativity is embraced as the more explanatory theory via scientific realism, Newton's theory remains successful as merely a predictive theory via instrumentalism. To calculate trajectories, engineers and NASA still use Newton's equations, which are simpler to operate.

Both scientific laws and scientific theories are produced from the scientific method through the formation and testing of hypotheses, and can predict the behavior of the natural world. Both are typically well supported by observations and/or experimental evidence. However, scientific laws are descriptive accounts of how nature will behave under certain conditions. Scientific theories are broader in scope, and give overarching explanations of how nature works and why it exhibits certain characteristics. Theories are supported by evidence from many different sources, and may contain one or several laws.

A common misconception is that scientific theories are rudimentary ideas that will eventually graduate into scientific laws when enough data and evidence have been accumulated. A theory does not change into a scientific law with the accumulation of new or better evidence. A theory will always remain a theory; a law will always remain a law. Both theo- ries and laws could potentially be falsified by countervailing evidence.

Theories and laws are also distinct from hypotheses. Unlike hypotheses, theories and laws may be simply referred to as scientific fact. However, in science, theories are different from facts even when they are well supported. For example, evolution is both a theory and a fact.

The logical positivists thought of scientific theories as statements in a formal language. First-order logic is an example of a formal language. The logical positivists envisaged a similar scientific language. In addi- tion to scientific theories, the language also included observation sen- tences, definitions, and mathematical statements. The phenomena ex- plained by the theories, if they could not be directly observed by the senses, were treated as theoretical concepts. In this view, theories func- tion as axioms: predicted observations are derived from the theories much like theorems are derived in Euclidean geometry. However, the predictions are then tested against reality to verify the theories, and the "axioms" can be revised as a direct result.

The phrase "the received view of theories" is used to describe this approach. Terms commonly associated with it are "linguistic" and "syntactic". Problems in defining this kind of language precisely, e.g., are objects seen in microscopes observed or are they theoretical objects, led to the effective demise of logical positivism in the 1970s.

The semantic view of theories, which identifies scientific theories with models rather than propositions, has replaced the received view as the dominant position in theory formulation in the philosophy of sci- ence. A model is a logical framework intended to represent reality, simi- lar to the way that a map is a graphical model that represents the territo- ry of a city or country.

In this approach, theories are a specific category of models that fulfill the necessary criteria. One can use language to describe a model; however, the theory is the model (or a collection of similar models), and not the description of the model. A model of the solar system, for example, might consist of abstract objects that represent the sun and the planets. These objects have associated properties, e.g., positions, velocities, and masses. The model parameters, e.g., Newton's Law of Gravitation, determine how the positions and velocities change with time. This model can then be tested to see whether it accurately predicts future observations; astronomers can verify that the positions of the model's objects over time match the actual positions of the planets. For most planets, the Newtonian model's predictions are accurate; for Mercury, it is slightly inaccurate and the model of general relativity must be used instead.
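A minimal sketch of such a model, assuming only rounded orbital figures for a Mercury-like planet around a fixed Sun (Python; the values and step size are illustrative, not prescriptive), steps the object's position and velocity forward under Newton's law of gravitation so that its predicted positions could, in principle, be compared with observation:

import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # mass of the Sun, kg

# Initial state: roughly Mercury's perihelion distance and perihelion speed
x, y = 4.60e10, 0.0      # position, m
vx, vy = 0.0, 5.90e4     # velocity, m/s
dt = 60.0                # time step, s

def accel(px, py):
    """Acceleration toward the Sun from Newton's inverse-square law."""
    r = math.hypot(px, py)
    a = -G * M_SUN / r**2
    return a * px / r, a * py / r

# Semi-implicit Euler integration over five days of model time
for _ in range(int(5 * 24 * 3600 / dt)):
    ax, ay = accel(x, y)
    vx += ax * dt
    vy += ay * dt
    x += vx * dt
    y += vy * dt

print(f"predicted position after 5 days: ({x:.3e}, {y:.3e}) m")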

The word "semantic" refers to the way that a model represents the real world. The representation describes particular aspects of a phenom-


enon or the manner of interaction among a set of phenomena. For in- stance, a scale model of a house or of a solar system is clearly not an actual house or an actual solar system; the aspects of an actual house or an actual solar system represented in a scale model are, only in certain limited ways, representative of the actual entity. A scale model of a house is not a house; but to someone who wants to learn about houses, analogous to a scientist who wants to understand reality, a sufficiently detailed scale model may suffice.

Several commentators have stated that the distinguishing characteristic of theories is that they are explanatory as well as descriptive, while models are only descriptive. Philosopher Stephen Pepper also distinguished between theories and models, and said in 1948 that general models and theories are predicated on a "root" metaphor that constrains how scientists theorize and model a phenomenon and thus arrive at testable hypotheses.

Engineering practice makes a distinction between "mathematical models" and "physical models"; the cost of fabricating a physical model can be minimized by first creating a mathematical model using a computer software package, such as a computer-aided design tool. The component parts are each themselves modelled, and the fabrication tolerances are specified. An exploded-view drawing is used to lay out the fabrication sequence. Simulation packages for displaying each of the subassemblies allow the parts to be rotated and magnified in realistic detail.

Certain assumptions are necessary for all empirical claims. However, theories do not generally make assumptions in the conventional sense. While assumptions are often incorporated during the formation of new theories, these are either supported by evidence or the evidence is produced in the course of validating the theory. This may be as simple as observing that the theory makes accurate predictions, which is evidence that any assumptions made at the outset are correct or approximately correct under the conditions tested.

Conventional assumptions, without evidence, may be used if the the- ory is only intended to apply when the assumption is valid. For example, the special theory of relativity assumes an inertial frame of reference. The theory makes accurate predictions when the assumption is valid, and does not make accurate predictions when the assumption is not val- id. Such assumptions are often the point with which older theories are succeeded by new ones.


The term "assumption" is actually broader than its standard use, ety- mologically speaking. The Oxford English Dictionary and online Wik- tionary indicate its Latin source as assumere , which is a conjunction of ad- and sumere. The root survives, with shifted meanings, in the Italian sumere and Spanish sumir. The first sense of "assume" in the OED is "to take unto, receive, accept, adopt". The term was originally employed in religious contexts as in "to receive up into heaven", especially "the re- ception of the Virgin Mary into heaven, with body preserved from cor- ruption", but it was also simply used to refer to "receive into associa- tion" or "adopt into partnership". Moreover, other senses of assumere included (i) "investing oneself with (an attribute)", (ii) "to undertake" (especially in Law), (iii) "to take to oneself in appearance only, to pre- tend to possess", and (iv) "to suppose a thing to be" (all senses from OED entry on "assume"; the OED entry for "assumption" is almost per- fectly symmetrical in senses). Thus, "assumption" connotes other asso- ciations than the contemporary standard sense of "that which is assumed or taken for granted; a supposition, postulate". Karl Popper described the characteristics of a scientific theory as follows:

1. It is easy to obtain confirmations, or verifications, for nearly every theory – if we look for confirmations.

2. Confirmations should count only if they are the result of risky predictions; that is to say, if, unenlightened by the theory in question, we should have expected an event which was incompatible with the the- ory – an event which would have refuted the theory.

3. Every "good" scientific theory is a prohibition: it forbids certain things to happen. The more a theory forbids, the better it is.

4. A theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory but a vice.

5. Every genuine test of a theory is an attempt to falsify it, or to re- fute it. Testability is falsifiability; but there are degrees of testability: some theories are more testable, more exposed to refutation, than others; they take, as it were, greater risks.

6. Confirming evidence should not count except when it is the re- sult of a genuine test of the theory; and this means that it can be present- ed as a serious but unsuccessful attempt to falsify the theory.

7. Some genuinely testable theories, when found to be false, might still be upheld by their admirers – for example by introducing post hoc some auxiliary hypothesis or assumption, or by reinterpreting the theory post hoc in such a way that it escapes refutation. Such a procedure is always possible, but it rescues the theory from refutation only at the price of destroying, or at least lowering, its scientific status, by tampering with evidence. The temptation to tamper can be minimized by first taking the time to write down the testing protocol before embarking on the scientific work.

Several philosophers and historians of science have, however, argued that Popper's definition of theory as a set of falsifiable statements is wrong because, as Philip Kitcher has pointed out, if one took a strictly Popperian view of "theory", observations of Uranus when first discov- ered in 1781 would have "falsified" Newton's celestial mechanics. Ra- ther, people suggested that another planet influenced Uranus' orbit – and this prediction was indeed eventually confirmed.

According to Kitcher, good scientific theories must have three fea- tures:

1. Unity: "A science should be unified. Good theories consist of just one problem-solving strategy, or a small family of problem-solving strategies, that can be applied to a wide range of problems."

2. Fecundity: "A great scientific theory, like Newton's, opens up new areas of research. Because a theory presents a new way of looking at the world, it can lead us to ask new questions, and so to embark on new and fruitful lines of inquiry. Typically, a flourishing science is in- complete. At any time, it raises more questions than it can currently an- swer. But incompleteness is not vice. On the contrary, incompleteness is the mother of fecundity. A good theory should be productive; it should raise new questions and presume those questions can be answered with- out giving up its problem-solving strategies."

3. Auxiliary hypotheses that are independently testable: "An auxil- iary hypothesis ought to be testable independently of the particular prob- lem it is introduced to solve, independently of the theory it is designed to save."

Like other definitions of theories, including Popper's, Kitcher makes it clear that a theory must include statements that have observational consequences. But, like the observation of irregularities in the orbit of Uranus, falsification is only one possible consequence of observation. The production of new hypotheses is another possible and equally im- portant result.

The concept of a scientific theory has also been described using analogies and metaphors. For instance, the logical empiricist Carl Gustav Hempel likened the structure of a scientific theory to a "complex spatial network":

Its terms are represented by the knots, while the threads connecting the latter correspond, in part, to the definitions and, in part, to the fun- damental and derivative hypotheses included in the theory. The whole system floats, as it were, above the plane of observation and is anchored to it by the rules of interpretation. These might be viewed as strings which are not part of the network but link certain points of the latter with specific places in the plane of observation. By virtue of these inter- pretive connections, the network can function as a scientific theory: From certain observational data, we may ascend, via an interpretive string, to some point in the theoretical network, thence proceed, via def- initions and hypotheses, to other points, from which another interpretive string permits a descent to the plane of observation.

Michael Polanyi made an analogy between a theory and a map: A theory is something other than myself. It may be set out on paper as a system of rules, and it is the more truly a theory the more completely it can be put down in such terms. Mathematical theory reaches the highest perfection in this respect. But even a geographical map fully embodies in itself a set of strict rules for finding one's way through a region of otherwise uncharted experience. Indeed, all theory may be regarded as a kind of map extended over space and time.

A scientific theory can also be thought of as a book that captures the fundamental information about the world.

 










